# ---------------------------------------------------------------
# File: mysignal/filter_graph.py
# Repo: wy2136/wython @ 0eaa9db335d57052806ae956afe6a34705407628 (license: MIT)
# ---------------------------------------------------------------
# -*- coding: utf-8 -*-
"""
Created on Thu Jan 15, 2015
@author: Wenchang Yang
"""
from __future__ import print_function
from .filter import \
_lowpass_ba, _highpass_ba, _bandpass_ba, \
_lowpass_ba_lanczos, _highpass_ba_lanczos
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import freqz
# ---- general filter
def show_response(b=None, a=1.0, fbfilt=False, label=None, xtick_labels_T=False, return_data=False):
    '''Show the filter response function in frequency space given the coefficients b and a.'''
    # parameters
    if b is None:
        b = [1./4,]*4
    b = np.array(b)
    a = np.array(a)
    print('b =', b)
    print('a =', a)
    print('fbfilt is', fbfilt)
    # data: frequency response on the normalized axis f/fs in [0, 0.5]
    w, h = freqz(b, a)
    xx = w/w[-1]/2
    if fbfilt:
        # forward-backward filtering applies the filter twice, squaring the magnitude response
        yy = abs(h)**2
    else:
        yy = abs(h)
# plot
plt.plot(xx, yy, label=label)
plt.xlim(0,0.5)
plt.axhline(1,color='gray', ls='--')
plt.title('b = ' + str(b.round(2)) + '; a = ' + str(a.round(2)), y=1.02)
# xtick labels as periods
if xtick_labels_T:
T = np.array(
[2, 4, 10, 100]
)
f = 1./T
xtick_labels = [str(i) for i in T]
plt.xticks(f, xtick_labels)
plt.xlabel('$T/T_s$')
plt.ylabel('$|R(T/T_s)|$')
else:
plt.xlabel('$f/f_s$')
plt.ylabel('$|R(f/f_s)|$')
# return data
if return_data:
return xx, yy
#
# ---- Butterworth filter
def show_response_lp(lowcut=0.25, fs=1.0, order=2, fbfilt=True, label=None, xtick_labels_T=False, return_data=False):
    '''Show the lowpass (Butterworth) response function.'''
    # params
    print('lowcut =', lowcut)
    print('fs =', fs)
    print('order =', order)
# data
b,a = _lowpass_ba(lowcut=lowcut,fs=fs,order=order)
# plot
xx, yy = show_response(b, a, fbfilt=fbfilt, label=label, xtick_labels_T=xtick_labels_T, return_data=True)
plt.title('$f_{low}/f_s = $ $%.2g$' % (lowcut/fs), y=1)
plt.axvline(lowcut,color='gray', ls='--')
# return data
if return_data:
return xx, yy
def show_response_hp(highcut=0.25, fs=1.0, order=2, fbfilt=True, label=None, xtick_labels_T=False, return_data=False):
    '''Show the highpass (Butterworth) response function.'''
    # params
    print('highcut =', highcut)
    print('fs =', fs)
    print('order =', order)
    print('fbfilt is', fbfilt)
# data
b,a = _highpass_ba(highcut=highcut,fs=fs,order=order)
# plot
xx, yy = show_response(b, a, fbfilt=fbfilt, label=label, xtick_labels_T=xtick_labels_T, return_data=True)
plt.title('$f_{high}/f_s = $ $%.2g$' % (highcut/fs), y=1)
plt.axvline(highcut,color='gray', ls='--')
# return data
if return_data:
return xx, yy
def show_response_bp(lowcut=0.125, highcut=0.375, fs=1.0, order=2, fbfilt=True, label=None, xtick_labels_T=False, return_data=False):
    '''Show the bandpass (Butterworth) response function.'''
    # params
    print('lowcut =', lowcut)
    print('highcut =', highcut)
    print('fs =', fs)
    print('order =', order)
    print('fbfilt is', fbfilt)
    # data
    b, a = _bandpass_ba(lowcut=lowcut, highcut=highcut, fs=fs, order=order)
    # plot
    xx, yy = show_response(b, a, fbfilt=fbfilt, label=label, xtick_labels_T=xtick_labels_T, return_data=True)
    plt.title('$f_{low}/f_s = $ $%.2g$, $f_{high}/f_s = $ $%.2g$' % (lowcut/fs, highcut/fs), y=1)
plt.axvline(highcut,color='gray', ls='--')
plt.axvline(lowcut,color='gray', ls='--')
plt.xlabel('$f/f_s$')
plt.ylabel('$|R(f/f_s)|$')
# return data
if return_data:
return xx, yy
#
# ---- Lanczos filter
def show_response_lp_lanczos(lowcut=0.25, fs=1., M=10, fbfilt=True, label=None, xtick_labels_T=False, return_data=False):
    '''Show the lowpass (Lanczos) response function.'''
    # params
    print('lowcut =', lowcut)
    print('fs =', fs)
    print('M =', M)
    print('fbfilt is', fbfilt)
# data
b,a = _lowpass_ba_lanczos(lowcut,fs=fs,M=M)
# plot
xx, yy = show_response(b, a, fbfilt=fbfilt, label=label, xtick_labels_T=xtick_labels_T, return_data=True)
plt.title('$f_{low}/f_s = $ $%.2g$' % (lowcut/fs), y=1)
plt.axvline(lowcut,color='gray', ls='--')
# return data
if return_data:
return xx, yy
def show_response_hp_lanczos(highcut=0.25, fs=1., M=10, fbfilt=True, label=None, xtick_labels_T=False, return_data=False):
    '''Show the highpass (Lanczos) response function.'''
    # params
    print('highcut =', highcut)
    print('fs =', fs)
    print('M =', M)
    print('fbfilt is', fbfilt)
# data
b,a = _highpass_ba_lanczos(highcut,fs=fs,M=M)
# plot
xx, yy = show_response(b, a, fbfilt=fbfilt, label=label, xtick_labels_T=xtick_labels_T, return_data=True)
plt.title('$f_{high}/f_s = $ $%.2g$' % (highcut/fs), y=1)
plt.axvline(highcut,color='gray', ls='--')
# return data
if return_data:
return xx, yy
def show_response_bp_lanczos(lowcut=0.125, highcut=0.375, fs=1.0, M=10, fbfilt=True, label=None, xtick_labels_T=False, return_data=False):
    '''Show the bandpass (Lanczos) response function.'''
    # params
    print('lowcut =', lowcut)
    print('highcut =', highcut)
    print('fs =', fs)
    print('M =', M)
    print('fbfilt is', fbfilt)
    # data: cascade a lowpass at highcut with a highpass at lowcut to form the bandpass
    b, a = _lowpass_ba_lanczos(highcut, fs=fs, M=M)
    w, hlow = freqz(b, a)
    b, a = _highpass_ba_lanczos(lowcut, fs=fs, M=M)
    w, hhigh = freqz(b, a)
    xx = w/w[-1]/2
    if fbfilt:
        yy = abs(hlow)**2 * abs(hhigh)**2
    else:
        yy = abs(hlow)*abs(hhigh)
# plot
plt.plot(xx, yy, label=label)
    plt.title('$f_{low}/f_s = $ $%.2g$, $f_{high}/f_s = $ $%.2g$' % (lowcut/fs, highcut/fs), y=1)
plt.axvline(highcut,color='gray', ls='--')
plt.axvline(lowcut,color='gray', ls='--')
plt.axhline(1,color='gray', ls='--')
# xtick labels as periods
if xtick_labels_T:
T = np.array(
[2, 4, 10, 100]
)
f = 1./T
xtick_labels = [str(i) for i in T]
plt.xticks(f, xtick_labels)
plt.xlabel('$T/T_s$')
plt.ylabel('$|R(T/T_s)|$')
else:
plt.xlabel('$f/f_s$')
plt.ylabel('$|R(f/f_s)|$')
# return data
if return_data:
return xx, yy
#
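All of the plots above share one convention: frequency is normalized to f/fs on [0, 0.5], with `w/w[-1]/2` mapping freqz's grid onto that axis. As a stdlib-only sanity check of that convention (a standalone sketch, independent of the plotting code above), the magnitude response of the 4-point moving average — the default `b` in `show_response` — can be evaluated directly:

```python
import cmath

def moving_average_response(n_taps, f):
    """|H(f)| of an n-tap moving average at normalized frequency f = f/fs."""
    h = sum(cmath.exp(-2j * cmath.pi * f * k) for k in range(n_taps)) / n_taps
    return abs(h)

# DC passes unchanged; f/fs = 0.25 is an exact null of the 4-tap average
print(moving_average_response(4, 0.0))   # -> 1.0
print(moving_average_response(4, 0.25))  # -> ~0.0
```

An n-tap average has nulls at multiples of f/fs = 1/n, which is why the default 4-tap filter fully rejects the 4-sample period.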
# ---------------------------------------------------------------
# File: app/storage/__init__.py
# Repo: dolfinus/cryptonite @ b90c24d008f4af78f8ed00fc9a30fea4628a443a (license: MIT)
# ---------------------------------------------------------------
from storage.common.test import TestSingletone, TestItemSingletone, TestResultSingletone, TestAnswerSingletone
from storage.common.user import UserSingletone
from storage.common.article import ArticleSingletone
from storage.db.user import User
# ---------------------------------------------------------------
# File: cogs/kits.py
# Repo: jeraldlyh/horizon @ 8898019d79ed168b991a649c1f84b7d1ec4c433d (license: Apache-2.0)
# ---------------------------------------------------------------
import pymongo
import discord
import random
import datetime
import time
import pytz
import os
from discord.ext import commands
from cogs.utils.misc import level_up
from cogs.utils.checks import is_donator, has_registered, is_economy_channel
from cogs.utils.embed import (passembed, errorembed)
class Kits(commands.Cog):
def __init__(self, bot):
self.bot = bot
self.client = pymongo.MongoClient(os.getenv("MONGO_DB"))
self.db = self.client.get_database('Users')
self.records = self.db.horizon_database
self.classDict = {
'Soldier':['Sash Sergeant', 'Shock Trooper', 'Commando', 'Special Forces', 'Bullet Storm'],
'Constructor':['BASE', 'Heavy BASE', 'MEGABASE', 'Riot Control', 'Warden'],
'Ninja':['Assassin', 'Deadly Blade', 'Dim Mak', 'Harvester', 'Shuriken Master'],
'Outlander':['Pathfinder', 'Reclaimer', 'Recon Scout', 'T.E.D.D Shot', 'Trailblazer']
}
def kitEmbed(self, ctx, amount, exp):
embed = discord.Embed(color=discord.Color.from_hsv(random.random(), 1, 1))
embed.set_author(name=f'{ctx.command.name.capitalize()} Kit', icon_url=ctx.author.avatar_url)
embed.add_field(name='Rewards', value=f'Wood: **+{round(amount)}**<:Wood:585780105696116736>\n Experience: **+{round(exp)}**<:BattlePass:585742444092456960>')
embed.set_footer(text='Type `.cd` to check your kit cooldowns')
return embed
@commands.command(aliases=['dl'])
@has_registered()
@is_economy_channel()
async def daily(self, ctx):
amount = 150
exp = 200
try:
userData = self.records.find({'userID':str(ctx.author.id)})
for x in userData:
timeData = str(x['Kits']['Daily'])
woodData = float(x['Currencies']['Wood'])
expData = float(x['Profile']['Experience'])
jobData = str(x['RPG']['Job'])
classData = str(x['RPG']['Class'])
# Converts date from database to compare
availableTime = datetime.datetime.strptime(timeData, '%Y-%m-%d %H:%M:%S.%f%z')
# Current Time
currentTime = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
# Current Time in seconds
if currentTime > availableTime:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
                jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Daily':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
else:
eembed = errorembed(description=f'{ctx.author.mention} You are currently on cooldown. Type ``.cd`` to check your cooldowns.')
return await ctx.send(embed=eembed)
        except Exception as e:
            # no valid timestamp stored yet (first claim): fall through and grant the kit
            print(e)
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Daily':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
@commands.command(aliases=['wk'])
@has_registered()
@is_economy_channel()
async def weekly(self, ctx):
amount = 2000
exp = 3500
try:
userData = self.records.find({'userID':str(ctx.author.id)})
for x in userData:
timeData = str(x['Kits']['Weekly'])
woodData = float(x['Currencies']['Wood'])
expData = float(x['Profile']['Experience'])
jobData = str(x['RPG']['Job'])
classData = str(x['RPG']['Class'])
# Converts date from database to compare
availableTime = datetime.datetime.strptime(timeData, '%Y-%m-%d %H:%M:%S.%f%z')
# Current Time
currentTime = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
# Current Time in seconds
if currentTime > availableTime:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=7)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Weekly':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
else:
eembed = errorembed(description=f'{ctx.author.mention} You are currently on cooldown. Type ``.cd`` to check your cooldowns.')
return await ctx.send(embed=eembed)
except:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=7)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Weekly':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
@commands.command(aliases=['sp'])
@has_registered()
@is_economy_channel()
@commands.has_any_role('Titan Donator', 'Mystic Donator', 'Immortal Donator')
async def supporter(self, ctx):
amount = 350
exp = 500
try:
userData = self.records.find({'userID':str(ctx.author.id)})
for x in userData:
timeData = str(x['Kits']['Supporter'])
woodData = float(x['Currencies']['Wood'])
expData = float(x['Profile']['Experience'])
jobData = str(x['RPG']['Job'])
classData = str(x['RPG']['Class'])
# Converts date from database to compare
availableTime = datetime.datetime.strptime(timeData, '%Y-%m-%d %H:%M:%S.%f%z')
# Current Time
currentTime = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
# Current Time in seconds
if currentTime > availableTime:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Supporter':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
else:
eembed = errorembed(description=f'{ctx.author.mention} You are currently on cooldown. Type ``.cd`` to check your cooldowns.')
return await ctx.send(embed=eembed)
except:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Supporter':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
@supporter.error
async def supporter_error(self, ctx, error):
if isinstance(error, commands.MissingAnyRole):
supporterRole = discord.utils.get(ctx.message.guild.roles, name='Titan Donator').mention
eembed = errorembed(description=f'{ctx.author.mention} Want to claim this **Supporter** kit? You have to minimally be a {supporterRole}')
return await ctx.send(embed=eembed)
@commands.command(aliases=['nt'])
@has_registered()
@is_economy_channel()
@commands.has_any_role('Nitro Booster')
async def nitro(self, ctx):
amount = 250
exp = 400
try:
userData = self.records.find({'userID':str(ctx.author.id)})
for x in userData:
timeData = str(x['Kits']['Nitro'])
woodData = float(x['Currencies']['Wood'])
expData = float(x['Profile']['Experience'])
jobData = str(x['RPG']['Job'])
classData = str(x['RPG']['Class'])
# Converts date from database to compare
availableTime = datetime.datetime.strptime(timeData, '%Y-%m-%d %H:%M:%S.%f%z')
# Current Time
currentTime = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
# Current Time in seconds
if currentTime > availableTime:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Nitro':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
else:
eembed = errorembed(description=f'{ctx.author.mention} You are currently on cooldown. Type ``.cd`` to check your cooldowns.')
return await ctx.send(embed=eembed)
except:
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
jobAdvancementBonus = self.classDict[classData].index(jobData) + 1 if jobData != 'None' else 1
kitAmount = amount*jobAdvancementBonus if classData else amount
kitExp = exp*jobAdvancementBonus if classData else exp
woodData += kitAmount
expData += kitExp
dataUpdate = {
'Kits.Nitro':formatTime,
'Currencies.Wood':woodData,
'Profile.Experience':expData
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
await level_up(ctx)
await ctx.send(ctx.author.mention)
embed = self.kitEmbed(ctx, kitAmount, kitExp)
return await ctx.send(embed=embed)
@nitro.error
async def nitro_error(self, ctx, error):
if isinstance(error, commands.MissingAnyRole):
nitroRole = discord.utils.get(ctx.message.guild.roles, name='Nitro Booster').mention
eembed = errorembed(description=f'{ctx.author.mention} Want to claim this **Nitro** kit? You have to be a {nitroRole}')
return await ctx.send(embed=eembed)
@commands.command(aliases=['v'])
@has_registered()
@is_economy_channel()
async def vote(self, ctx, user:discord.User):
if user == ctx.author:
eembed = errorembed(description=f"{ctx.author.mention} You can't upvote yourself. Good try though! <:PepeHugs:541252355518365718>")
return await ctx.send(embed=eembed)
# Checks if User is inside Database
try:
userList = [x['userID'] for x in self.records.find({})]
if str(ctx.author.id) not in userList:
eembed = errorembed(description=f'{ctx.author.mention} You are currently not registered yet. Kindly type ``.register`` to be registered.')
return await ctx.send(embed=eembed)
elif str(user.id) not in userList:
eembed = errorembed(description=f'{ctx.author.mention} {user.mention} has not registered yet.')
return await ctx.send(embed=eembed)
except:
pass
try:
userData = self.records.find({'userID':str(ctx.author.id)})
for x in userData:
voteData = str(x['Kits']['Votes'])
# Converts date from database to compare
availableTime = datetime.datetime.strptime(voteData, '%Y-%m-%d %H:%M:%S.%f%z')
# Current Time
currentTime = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
# Current Time in seconds
if currentTime > availableTime:
userData = self.records.find({'userID':str(user.id)})
for x in userData:
repData = int(x['Profile']['Rep'])
repData += 1
dataUpdate = {
'Profile.Rep':repData
}
update = self.records.update_one({'userID':str(user.id)}, {'$set':dataUpdate})
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
dataUpdate = {
'Kits.Votes':formatTime
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
pembed = passembed(description=f'{ctx.author.mention} You have successfully added reputation for {user.mention}.')
return await ctx.send(embed=pembed)
else:
eembed = errorembed(description=f'{ctx.author.mention} You are currently on cooldown. Type ``.cd`` to check your cooldowns.')
return await ctx.send(embed=eembed)
except Exception:
userData = self.records.find({'userID':str(user.id)})
for x in userData:
repData = int(x['Profile']['Rep'])
repData += 1
dataUpdate = {
'Profile.Rep':repData
}
update = self.records.update_one({'userID':str(user.id)}, {'$set':dataUpdate})
# Use this format to update database
formatTime = (datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')) + datetime.timedelta(days=1)).strftime('%Y-%m-%d %H:%M:%S.%f%z')
dataUpdate = {
'Kits.Votes':formatTime
}
update = self.records.update_one({'userID':str(ctx.author.id)}, {'$set':dataUpdate})
pembed = passembed(description=f'{ctx.author.mention} You have successfully added reputation for {user.mention}.')
return await ctx.send(embed=pembed)
@vote.error
async def vote_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
eembed = errorembed(description='Kindly indicate the User that you wish to upvote.')
return await ctx.send(embed=eembed)
@commands.command(aliases=['cd'])
@has_registered()
@is_economy_channel()
async def cooldown(self, ctx):
userData = self.records.find({'userID':str(ctx.author.id)})
for x in userData:
dailyData = str(x['Kits']['Daily'])
weeklyData = str(x['Kits']['Weekly'])
supporterData = str(x['Kits']['Supporter'])
nitroData = str(x['Kits']['Nitro'])
voteData = str(x['Kits']['Votes'])
# Daily Cooldown
try:
# Usable Time
timeFormat = datetime.datetime.strptime(dailyData, '%Y-%m-%d %H:%M:%S.%f%z')
timeInSeconds = time.mktime(timeFormat.timetuple())
# Current Time
timeNow = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
timeNowInSeconds = time.mktime(timeNow.timetuple())
# Before rounding off
cooldownHours = ((timeInSeconds - timeNowInSeconds)/60)/60
coolDownMins = float('.' + str(round(((timeInSeconds - timeNowInSeconds)/60)/60, 3)).split('.')[1])*60
if int(cooldownHours) < 0:
dailyCooldown = '• You have not claimed your **Daily** kit yet.'
else:
# After rounding off
coolDownMins = round(coolDownMins)
coolDownHours = str(cooldownHours).split('.')[0]
dailyCooldown = '• ' + str(coolDownHours) + 'H ' + str(coolDownMins) + 'M'
except Exception:
dailyCooldown = '• You have not claimed your **Daily** kit yet.'
# Weekly Cooldown
try:
# Usable Time
timeFormat = datetime.datetime.strptime(weeklyData, '%Y-%m-%d %H:%M:%S.%f%z')
timeInSeconds = time.mktime(timeFormat.timetuple())
# Current Time
timeNow = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
timeNowInSeconds = time.mktime(timeNow.timetuple())
# Before rounding off
coolDownHours = float('.' + str(round((((timeInSeconds - timeNowInSeconds)/60)/60)/24, 4)).split('.')[1])*24
coolDownDays = (((timeInSeconds - timeNowInSeconds)/60)/60)/24
if int(coolDownDays) < 0:
weeklyCooldown = '• You have not claimed your **Weekly** kit yet.'
else:
# After rounding off
coolDownHours = round(coolDownHours)
coolDownDays = str(coolDownDays).split('.')[0]
weeklyCooldown = '• ' + str(coolDownDays) + 'D ' + str(coolDownHours) + 'H'
except Exception:
weeklyCooldown = '• You have not claimed your **Weekly** kit yet.'
# Supporter Cooldown
try:
# Usable Time
timeFormat = datetime.datetime.strptime(supporterData, '%Y-%m-%d %H:%M:%S.%f%z')
timeInSeconds = time.mktime(timeFormat.timetuple())
# Current Time
timeNow = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
timeNowInSeconds = time.mktime(timeNow.timetuple())
# Before rounding off
coolDownMins = float('.' + str(round(((timeInSeconds - timeNowInSeconds)/60)/60, 3)).split('.')[1])*60
cooldownHours = ((timeInSeconds - timeNowInSeconds)/60)/60
if int(cooldownHours) < 0:
supporterCooldown = '• You have not claimed your **Supporter** kit yet.'
else:
# After rounding off
coolDownMins = round(coolDownMins)
coolDownHours = str(cooldownHours).split('.')[0]
supporterCooldown = '• ' + str(coolDownHours) + 'H ' + str(coolDownMins) + 'M'
except Exception:
supporterCooldown = '• You have not claimed your **Supporter** kit yet.'
# Nitro Cooldown
try:
# Usable Time
timeFormat = datetime.datetime.strptime(nitroData, '%Y-%m-%d %H:%M:%S.%f%z')
timeInSeconds = time.mktime(timeFormat.timetuple())
# Current Time
timeNow = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
timeNowInSeconds = time.mktime(timeNow.timetuple())
# Before rounding off
coolDownMins = float('.' + str(round(((timeInSeconds - timeNowInSeconds)/60)/60, 3)).split('.')[1])*60
cooldownHours = ((timeInSeconds - timeNowInSeconds)/60)/60
if int(cooldownHours) < 0:
nitroCooldown = '• You have not claimed your **Nitro** kit yet.'
else:
# After rounding off
coolDownMins = round(coolDownMins)
coolDownHours = str(cooldownHours).split('.')[0]
nitroCooldown = '• ' + str(coolDownHours) + 'H ' + str(coolDownMins) + 'M'
except Exception:
nitroCooldown = '• You have not claimed your **Nitro** kit yet.'
        # Vote Cooldown
        try:
            # Usable Time
            timeFormat = datetime.datetime.strptime(voteData, '%Y-%m-%d %H:%M:%S.%f%z')
            timeInSeconds = time.mktime(timeFormat.timetuple())
            # Current Time
            timeNow = datetime.datetime.now(tz=pytz.timezone('Asia/Singapore'))
            timeNowInSeconds = time.mktime(timeNow.timetuple())
            # Remaining cooldown in seconds, split into whole hours and minutes
            remainingSeconds = timeInSeconds - timeNowInSeconds
            if remainingSeconds <= 0:
                voteCooldown = '• You have not **voted** anyone today yet.'
            else:
                coolDownHours, remainingSeconds = divmod(int(remainingSeconds), 3600)
                coolDownMins = remainingSeconds // 60
                voteCooldown = '• ' + str(coolDownHours) + 'H ' + str(coolDownMins) + 'M'
        except Exception:
            voteCooldown = '• You have not **voted** anyone today yet.'
        # Embed Cooldown Message
        embed = discord.Embed(title='Kit Cooldowns', color=discord.Color.from_hsv(random.random(), 1, 1), timestamp=datetime.datetime.now(tz=pytz.timezone('Asia/Singapore')))
        embed.set_author(name=ctx.author.name, icon_url=ctx.author.avatar_url)
        embed.add_field(name='⏰ Daily', value=dailyCooldown)
        embed.add_field(name='📅 Weekly', value=weeklyCooldown)
        embed.add_field(name='💎 Supporter', value=supporterCooldown)
        embed.add_field(name='⚡ Nitro', value=nitroCooldown)
        embed.add_field(name='🌟 Votes', value=voteCooldown)
        embed.set_footer(text=ctx.guild.name, icon_url=ctx.guild.icon_url)
        await ctx.send(embed=embed)
# Adding the cog to main script
def setup(bot):
    bot.add_cog(Kits(bot))

# File: cm_lex/Frames.py (UKPLab/acl2021-metaphor-generation-conceptual, Apache-2.0)
"""
Set of frame names
"""
frames = {'Emergency_fire', 'Being_in_effect', 'Continued_state_of_affairs', 'Taking', 'Plants', 'Traversing', 'Representative', 'Size', 'Stinginess', 'Becoming_a_member', 'Run_risk', 'Trendiness', 'Certainty', 'Motion', 'Measure_mass', 'Emotions_by_stimulus', 'Fleeing', 'Being_wet', 'Sign', 'Rope_manipulation', 'Repel', 'Departing', 'Volubility', 'Contingency', 'Surviving', 'Left_to_do', 'Protest', 'Medical_professionals', 'Exporting', 'Firefighting', 'Scarcity', 'Emotions_of_mental_activity', 'Cognitive_connection', 'Relational_natural_features', 'Interior_profile_relation', 'Type', 'Becoming_aware', 'Part_ordered_segments', 'First_rank', 'Conferring_benefit', 'Being_in_control', 'Recovery', 'Color', 'Identity', 'Telling', 'Strictness', 'Limitation', 'Coming_up_with', 'Partitive', 'Commutative_statement', 'Activity_ready_state', 'Vocalizations', 'Evoking', 'Surrounding', 'Quantity', 'Activity_resume', 'Dead_or_alive', 'Subjective_influence', 'Becoming_attached', 'Contrition', 'Institutions', 'Duplication', 'System_complexity', 'Enforcing', 'Businesses', 'Reveal_secret', 'Endangering', 'Waking_up', 'Jury_deliberation', 'Sending', 'Text_creation', 'Operate_vehicle', 'Change_post-state', 'Optical_image', 'Social_event', 'Installing', 'Fire_burning', 'Connectors', 'Economy', 'Personal_relationship', 'Filling', 'Amounting_to', 'Possibility', 'Possession', 'Thriving', 'Storing', 'Version_sequence', 'Accoutrements', 'Execute_plan', 'Completeness', 'Documents', 'Cause_to_end', 'Destroying', 'Rejuvenation', 'Manipulate_into_doing', 'Food_gathering', 'Origin', 'Disgraceful_situation', 'State_continue', 'Summarizing', 'Remainder', 'Terrorism', 'Weather', 'Suitability', 'Ineffability', 'Manufacturing', 'Commerce_scenario', 'Trust', 'Residence', 'Absorb_heat', 'Replacing', 'Soaking_up', 'Dynamism', 'Rashness', 'Hunting', 'Electricity', 'Gradable_proximity', 'Change_of_leadership', 'Sharing', 'Inspecting', 'Process_end', 'Bringing', 'Expend_resource', 'Feigning', 
'Foreign_or_domestic_country', 'Abundance', 'Candidness', 'Measure_volume', 'Law_enforcement_agency', 'Being_obligatory', 'Communication_manner', 'Ranked_expectation', 'Preference', 'Noise_makers', 'Body_movement', 'Measure_by_action', 'People_by_residence', 'Perception_active', 'Lending', 'Gusto', 'Just_found_out', 'Leadership', 'Remembering_experience', 'Social_interaction_evaluation', 'Simple_name', 'Judgment', 'Destiny', 'Emotion_directed', 'Aiming', 'Performing_arts', 'Locative_relation', 'Inhibit_movement', 'Obscurity', 'Prohibiting_or_licensing', 'Likelihood', 'Reading_perception', 'Cause_to_make_progress', 'Grinding', 'Difficulty', 'Motion_directional', 'People_by_age', 'Memory', 'Self_control', 'Conduct', 'Body_description_holistic', 'Activity_stop', 'Abounding_with', 'Sex', 'Color_qualities', 'People_by_origin', 'Path_traveled', 'Compliance', 'Physical_artworks', 'Occupy_rank', 'Being_in_category', 'Diversity', 'Coming_to_believe', 'Rewards_and_punishments', 'Execution', 'Supporting', 'Triggering', 'Communication_means', 'Process_continue', 'Aggregate', 'Taking_time', 'Typicality', 'Sacrificing_for', 'Publishing', 'Having_or_lacking_access', 'Idiosyncrasy', 'Reasoning', 'Evaluative_comparison', 'Commercial_transaction', 'Lively_place', 'Weapon', 'Temperature', 'Releasing', 'Breaking_out_captive', 'Chemical_potency', 'Isolated_places', 'Legal_rulings', 'Word_relations', 'Success_or_failure', 'Quantified_mass', 'Medical_specialties', 'Locale', 'Change_of_consistency', 'Intoxicants', 'Sent_items', 'Earnings_and_losses', 'Cause_emotion', 'Wagering', 'Judicial_body', 'Desiring', 'Scrutiny', 'Cause_to_perceive', 'Pattern', 'Craft', 'Communication_response', 'Arriving', 'Distinctiveness', 'Desirable_event', 'Chatting', 'Alternatives', 'Separating', 'Experiencer_obj', 'Expressing_publicly', 'Level_of_force_exertion', 'Frequency', 'Non-commutative_statement', 'Rotting', 'Being_in_captivity', 'Cause_change', 'Arranging', 'Activity_ongoing', 
'Level_of_force_resistance', 'Sociability', 'Losing_someone', 'Undergo_transformation', 'Event', 'Patrolling', 'Increment', 'Correctness', 'Daring', 'Evidence', 'Chemical-sense_description', 'Recording', 'Being_up_to_it', 'Political_locales', 'Undergoing', 'Change_posture', 'Text', 'Existence', 'Killing', 'Improvement_or_decline', 'Questioning', 'Control', 'Front_for', 'Connecting_architecture', 'Custom', 'Coincidence', 'Renunciation', 'Shopping', 'Affirm_or_deny', 'Ground_up', 'Negation', 'Means', 'Misdeed', 'Putting_out_fire', 'Part_whole', 'Practice', 'Natural_features', 'Fields', 'Removing', 'Colonization', 'Age', 'Secrecy_status', 'Translating', 'Beat_opponent', 'Vehicle', 'Health_response', 'Placing', 'Sentencing', 'Being_attached', 'Becoming_dry', 'Deny_or_grant_permission', 'Preventing_or_letting', 'Familiarity', 'Part_piece', 'Part_inner_outer', 'Emptying', 'Omen', 'Agriculture', 'Dressing', 'Infecting', 'Armor', 'Commerce_sell', 'Apply_heat', 'Distributed_position', 'Notification_of_charges', 'Locale_by_use', 'Hunting_success_or_failure', 'Sign_agreement', 'Statement', 'Finish_competition', 'Body_mark', 'Concessive', 'Imitating', 'Extreme_value', 'Mathematical_relationship', 'Revenge', 'Timetable', 'Probability', 'Unattributed_information', 'Linguistic_meaning', 'Purpose', 'Commitment', 'Heralding', 'Hedging', 'Planting', 'Activity_pause', 'Attack', 'Temporal_subregion', 'Cause_to_start', 'Building', 'Heat_potential', 'Preliminaries', 'Estimating', 'Speak_on_topic', '_', 'Biological_urge', 'Law', 'Substance', 'Make_noise', 'Terms_of_agreement', 'Choosing', 'Artificiality', 'Satisfying', 'Measure_linear_extent', 'Locating', 'Reforming_a_system', 'Sounds', 'Ordinal_numbers', 'Make_agreement_on_action', 'Locale_by_event', 'Besieging', 'Shoot_projectiles', 'Frugality', 'Delivery', 'Morality_evaluation', 'Membership', 'Cause_to_resume', 'Capability', 'Part_orientational', 'Alliance', 'Dimension', 'Project', 'State_of_entity', 'Board_vehicle', 
'Explaining_the_facts', 'Becoming', 'Being_questionable', 'Fall_asleep', 'Achieving_first', 'Network', 'Cause_to_fragment', 'Experience_bodily_harm', 'Passing', 'Response', 'Manipulate_into_shape', 'Warning', 'Being_born', 'Cause_to_experience', 'Obviousness', 'Attempt_suasion', 'Toxic_substance', 'Emotion_active', 'Sufficiency', 'Commonality', 'Ammunition', 'Change_event_duration', 'Giving', 'Make_cognitive_connection', 'Appellations', 'Fairness_evaluation', 'Wealthiness', 'Deception_success', 'Cure', 'Damaging', 'Experimentation', 'Buildings', 'Expensiveness', 'Arraignment', 'Time_vector', 'Biological_classification', 'Operational_testing', 'Rebellion', 'Desirability', 'Theft', 'Attempt', 'Fame', 'Animals', 'Causation', 'Aesthetics', 'Predicting', 'Opinion', 'Cogitation', 'Come_together', 'Performers', 'Employing', 'Ingredients', 'Needing', 'Cause_change_of_consistency', 'Renting', 'Excreting', 'Guilt_or_innocence', 'Taking_captive', 'Gesture', 'Verification', 'Protecting', 'Shapes', 'Punctual_perception', 'Accomplishment', 'Commerce_buy', 'Amalgamation', 'Feeling', 'Labeling', 'Create_physical_artwork', 'Degree', 'Impression', 'Becoming_silent', 'Give_impression', 'Process_start', 'Objective_influence', 'Being_necessary', 'Resolve_problem', 'Sharpness', 'Offering', 'Adopt_selection', 'Partiality', 'Activity_finish', 'Cause_change_of_position_on_a_scale', 'Forging', 'Indigenous_origin', 'Rite', 'Competition', 'Body_parts', 'Boundary', 'Social_desirability', 'Imprisonment', 'Expertise', 'Suasion', 'Judgment_direct_address', 'Meet_specifications', 'Manipulation', 'Mental_stimulus_stimulus_focus', 'Luck', 'Identicality', 'Used_up', 'Processing_materials', 'Participation', 'Active_substance', 'Quitting_a_place', 'Product_line', 'Being_located', 'Hostile_encounter', 'Public_services', 'Precipitation', 'Military', 'Intentionally_create', 'Catastrophe', 'Building_subparts', 'Presence', 'Change_event_time', 'Range', 'Roadways', 'Commerce_pay', 'Judgment_communication', 
'Exchange', 'Go_into_shape', 'Actually_occurring_entity', 'Architectural_part', 'Position_on_a_scale', 'Taking_sides', 'Duration_relation', 'Path_shape', 'Similarity', 'Conquering', 'Fighting_activity', 'Travel', 'Kinship', 'Attending', 'People', 'Store', 'Exemplar', 'Direction', 'Hiding_objects', 'Prison', 'Vehicle_subpart', 'People_by_morality', 'Relative_time', 'Required_event', 'Addiction', 'Moving_in_place', 'Subordinates_and_superiors', 'Ingestion', 'Committing_crime', 'Preserving', 'Transfer', 'Money', 'Emphasizing', 'People_by_jurisdiction', 'Gathering_up', 'Cause_motion', 'Negative_conditional', 'Gizmo', 'Precariousness', 'Bearing_arms', 'Using', 'Collaboration', 'Perception_experience', 'Sole_instance', 'Submitting_documents', 'Have_associated', 'Posing_as', 'Be_in_agreement_on_action', 'Scope', 'Mass_motion', 'People_by_religion', 'Clothing_parts', 'Surrendering_possession', 'Cause_harm', 'Ingest_substance', 'Activity_start', 'Operating_a_system', 'Entity', 'Forming_relationships', 'Motion_noise', 'Eclipse', 'Smuggling', 'Cause_expansion', 'Progression', 'Willingness', 'Margin_of_resolution', 'Light_movement', 'Interrupt_process', 'Adjacency', 'Cotheme', 'Activity_done_state', 'Cardinal_numbers', 'Judgment_of_intensity', 'Sleep', 'Wearing', 'Corporal_punishment', 'Attitude_description', 'Activity_prepare', 'Dominate_situation', 'Goal', 'Complaining', 'Measure_area', 'Cause_change_of_strength', 'Instance', 'Information', 'Erasing', 'Deserving', 'Discussion', 'Research', 'Mining', 'Undergo_change', 'Being_named', 'Getting', 'Records', 'Being_employed', 'Clothing', 'System', 'Attaching', 'Legality', 'Artifact', 'Visiting', 'Thermodynamic_phase', 'Temporal_collocation', 'Awareness', 'Thwarting', 'Successful_action', 'Name_conferral', 'Trap', 'Catching_fire', 'Death', 'Defending', 'Adducing', 'Categorization', 'Reliance', 'Piracy', 'Importing', 'Being_obligated', 'Experiencer_focus', 'Fluidic_motion', 'Expectation', 'Prevent_or_allow_possession', 'Relation', 
'Endeavor_failure', 'Criminal_investigation', 'Kidnapping', 'Education_teaching', 'Differentiation', 'Individual_history', 'Request', 'Hindering', 'Regard', 'Medical_instruments', 'Hospitality', 'Responsibility', 'Stimulus_focus', 'Degree_of_processing', 'Containers', 'Temporal_pattern', 'Measurable_attributes', 'Importance', 'Giving_in', 'Reason', 'Impact', 'Avoiding', 'Grasp', 'Leaving_traces', 'Self_motion', 'Food', 'Agree_or_refuse_to_act', 'Change_resistance', 'Imposing_obligation', 'Topic', 'People_by_vocation', 'Deciding', 'Surpassing', 'Duration_description', 'Intentionally_act', 'Openness', 'Be_on_alert', 'Waiting', 'Sound_level', 'Cause_to_move_in_place', 'Forgiveness', 'Quarreling', 'Proportional_quantity', 'Rescuing', 'Capacity', 'Mental_property', 'Simultaneity', 'Member_of_military', 'Dispersal', 'Reporting', 'Firing', 'Calendric_unit', 'Appointing', 'Usefulness', 'Setting_fire', 'Event_instance', 'Assessing', 'Organization', 'Proper_reference', 'Reserving', 'Communicate_categorization', 'Predicament', 'Risky_situation', 'Assistance', 'Sound_movement', 'Guest_and_host', 'Posture', 'Performers_and_roles', 'Prominence', 'Change_position_on_a_scale', 'Tolerating', 'Breaking_off', 'Being_at_risk', 'Inclusion', 'Biological_area', 'Hair_configuration', 'Change_of_phase', 'Presentation_of_mitigation', 'Opportunity', 'Proportion', 'Invading', 'Receiving', 'Being_operational', 'Rate_description', 'Medical_conditions', 'Cause_to_make_noise', 'Location_of_light', 'Encoding', 'Breathing', 'Claim_ownership', 'Contacting', 'Spatial_contact', 'Rest', 'Change_of_temperature', 'Coming_to_be', 'Respond_to_proposal', 'Attention', 'Process', 'Non-gradable_proximity', 'Supply', 'Accompaniment', 'Facial_expression', 'Sensation', 'Seeking_to_achieve', 'Directional_locative_relation', 'Team', 'Source_of_getting', 'Verdict', 'Stage_of_progress', 'Intentional_traversing'}
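
The `frames` set above is typically used for simple membership lookups. A minimal sketch of that usage, written against a tiny illustrative subset (only the `frames` name comes from this module; the helper function is hypothetical):

```python
# Illustrative subset of the full frame-name inventory defined above.
frames = {'Motion', 'Killing', 'Travel', 'Emotion_directed'}

def is_known_frame(name):
    """Case-sensitive membership check against the frame inventory."""
    return name in frames

print(is_known_frame('Motion'))  # True
print(is_known_frame('motion'))  # False: frame names are capitalized
```

Because frame names are stored exactly as spelled (capitalized, underscore-separated), callers need to normalize candidate strings before the lookup.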

# File: dotnet/private/stdlib/2.1.502-runtime.bzl (samhowes/rules_dotnet, Apache-2.0)
"Define stdlibs"
load("@io_bazel_rules_dotnet//dotnet/private:rules/stdlib.bzl", "core_stdlib_internal")
load("@io_bazel_rules_dotnet//dotnet/private:rules/libraryset.bzl", "core_libraryset")
def define_runtime():
"Declares stdlibs"
core_stdlib_internal(
name = "microsoft.win32.registry.dll",
version = "4.1.1.0",
ref = ":core/shared/Microsoft.NETCore.App/2.1.6/Microsoft.Win32.Registry.dll",
stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/Microsoft.Win32.Registry.dll",
deps = [
":system.runtime.dll",
":system.resources.resourcemanager.dll",
":system.runtime.interopservices.dll",
":system.runtime.extensions.dll",
":system.security.accesscontrol.dll",
":system.security.principal.windows.dll",
":system.collections.dll",
":system.buffers.dll",
":system.memory.dll",
],
)
core_stdlib_internal(
name = "sos.netcore.dll",
version = "1.0.0.0",
ref = ":core/shared/Microsoft.NETCore.App/2.1.6/SOS.NETCore.dll",
stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/SOS.NETCore.dll",
deps = [
":system.runtime.dll",
":system.runtime.interopservices.dll",
":system.reflection.metadata.dll",
":system.collections.dll",
":system.io.dll",
":system.collections.immutable.dll",
":system.diagnostics.debug.dll",
":system.runtime.extensions.dll",
":system.io.filesystem.dll",
],
)
core_stdlib_internal(
name = "system.io.filesystem.accesscontrol.dll",
version = "4.0.3.0",
ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.FileSystem.AccessControl.dll",
stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.FileSystem.AccessControl.dll",
deps = [
":system.runtime.dll",
":system.resources.resourcemanager.dll",
":system.runtime.extensions.dll",
":system.io.filesystem.dll",
":system.security.accesscontrol.dll",
":system.security.principal.windows.dll",
":system.collections.nongeneric.dll",
],
)
core_stdlib_internal(
name = "system.io.pipes.accesscontrol.dll",
version = "4.0.3.0",
ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.Pipes.AccessControl.dll",
stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.Pipes.AccessControl.dll",
deps = [
":system.runtime.dll",
":system.resources.resourcemanager.dll",
":system.io.pipes.dll",
],
)
core_stdlib_internal(
name = "system.private.corelib.dll",
version = "4.0.0.0",
ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.CoreLib.dll",
stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.CoreLib.dll",
deps = [
],
)
core_stdlib_internal(
    core_stdlib_internal(
        name = "system.private.datacontractserialization.dll",
        version = "4.1.4.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.DataContractSerialization.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.DataContractSerialization.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.runtime.extensions.dll",
            ":system.xml.readerwriter.dll",
            ":system.text.encoding.extensions.dll",
            ":system.threading.tasks.dll",
            ":system.diagnostics.debug.dll",
            ":system.collections.dll",
            ":system.reflection.emit.lightweight.dll",
            ":system.reflection.emit.ilgeneration.dll",
            ":system.reflection.primitives.dll",
            ":system.runtime.serialization.primitives.dll",
            ":system.xml.xmlserializer.dll",
            ":system.collections.nongeneric.dll",
            ":system.runtime.serialization.formatters.dll",
            ":system.threading.dll",
            ":system.linq.dll",
            ":system.text.regularexpressions.dll",
            ":system.xml.xdocument.dll",
            ":system.collections.specialized.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.private.uri.dll",
        version = "4.0.5.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.Uri.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.Uri.dll",
        deps = [
            ":system.private.corelib.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.private.xml.dll",
        version = "4.0.1.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.Xml.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.Xml.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.runtime.interopservices.dll",
            ":system.runtime.extensions.dll",
            ":system.collections.dll",
            ":system.diagnostics.debug.dll",
            ":system.security.cryptography.algorithms.dll",
            ":system.threading.tasks.dll",
            ":system.diagnostics.tracesource.dll",
            ":system.text.regularexpressions.dll",
            ":system.net.primitives.dll",
            ":system.net.requests.dll",
            ":system.text.encoding.extensions.dll",
            ":system.reflection.emit.dll",
            ":system.reflection.emit.ilgeneration.dll",
            ":system.collections.nongeneric.dll",
            ":system.reflection.primitives.dll",
            ":system.collections.specialized.dll",
            ":system.collections.concurrent.dll",
            ":system.linq.expressions.dll",
            ":system.threading.dll",
            ":system.diagnostics.tools.dll",
            ":system.reflection.emit.lightweight.dll",
            ":system.objectmodel.dll",
            ":system.memory.dll",
            ":system.threading.thread.dll",
            ":system.linq.dll",
            ":system.io.filesystem.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.private.xml.linq.dll",
        version = "4.0.1.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.Xml.Linq.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Private.Xml.Linq.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.private.xml.dll",
            ":system.diagnostics.debug.dll",
            ":system.collections.dll",
            ":system.runtime.extensions.dll",
            ":system.diagnostics.tools.dll",
            ":system.threading.dll",
            ":system.threading.tasks.dll",
            ":system.linq.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.security.accesscontrol.dll",
        version = "4.1.1.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.AccessControl.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.AccessControl.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.runtime.interopservices.dll",
            ":system.runtime.extensions.dll",
            ":system.security.principal.windows.dll",
            ":system.collections.dll",
            ":system.threading.dll",
            ":system.threading.thread.dll",
            ":system.collections.nongeneric.dll",
            ":microsoft.win32.primitives.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.security.cryptography.cng.dll",
        version = "4.3.1.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Cng.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Cng.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.runtime.interopservices.dll",
            ":system.runtime.extensions.dll",
            ":system.security.cryptography.encoding.dll",
            ":system.security.cryptography.primitives.dll",
            ":system.security.cryptography.algorithms.dll",
            ":system.collections.concurrent.dll",
            ":system.buffers.dll",
            ":system.memory.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.security.cryptography.openssl.dll",
        version = "4.1.1.0",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.OpenSsl.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.OpenSsl.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.runtime.interopservices.dll",
            ":system.runtime.extensions.dll",
            ":system.security.cryptography.algorithms.dll",
            ":system.security.cryptography.primitives.dll",
        ],
    )
    core_stdlib_internal(
        name = "system.security.principal.windows.dll",
        version = "4.1.1.1",
        ref = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Principal.Windows.dll",
        stdlib_path = ":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Principal.Windows.dll",
        deps = [
            ":system.runtime.dll",
            ":system.resources.resourcemanager.dll",
            ":system.runtime.interopservices.dll",
            ":system.runtime.extensions.dll",
            ":system.collections.dll",
            ":system.security.claims.dll",
            ":system.security.principal.dll",
            ":system.threading.dll",
            ":system.diagnostics.debug.dll",
            ":microsoft.win32.primitives.dll",
        ],
    )
    core_libraryset(
        name = "runtime",
        deps = select({
            "@bazel_tools//src/conditions:windows": [
                ":microsoft.csharp.dll",
                ":microsoft.visualbasic.dll",
                ":microsoft.win32.primitives.dll",
                ":microsoft.win32.registry.dll",
                ":mscorlib.dll",
                ":netstandard.dll",
                ":sos.netcore.dll",
                ":system.appcontext.dll",
                ":system.buffers.dll",
                ":system.collections.concurrent.dll",
                ":system.collections.dll",
                ":system.collections.immutable.dll",
                ":system.collections.nongeneric.dll",
                ":system.collections.specialized.dll",
                ":system.componentmodel.annotations.dll",
                ":system.componentmodel.dataannotations.dll",
                ":system.componentmodel.dll",
                ":system.componentmodel.eventbasedasync.dll",
                ":system.componentmodel.primitives.dll",
                ":system.componentmodel.typeconverter.dll",
                ":system.configuration.dll",
                ":system.console.dll",
                ":system.core.dll",
                ":system.data.common.dll",
                ":system.data.dll",
                ":system.diagnostics.contracts.dll",
                ":system.diagnostics.debug.dll",
                ":system.diagnostics.diagnosticsource.dll",
                ":system.diagnostics.fileversioninfo.dll",
                ":system.diagnostics.process.dll",
                ":system.diagnostics.stacktrace.dll",
                ":system.diagnostics.textwritertracelistener.dll",
                ":system.diagnostics.tools.dll",
                ":system.diagnostics.tracesource.dll",
                ":system.diagnostics.tracing.dll",
                ":system.dll",
                ":system.drawing.dll",
                ":system.drawing.primitives.dll",
                ":system.dynamic.runtime.dll",
                ":system.globalization.calendars.dll",
                ":system.globalization.dll",
                ":system.globalization.extensions.dll",
                ":system.io.compression.brotli.dll",
                ":system.io.compression.dll",
                ":system.io.compression.filesystem.dll",
                ":system.io.compression.zipfile.dll",
                ":system.io.dll",
                ":system.io.filesystem.accesscontrol.dll",
                ":system.io.filesystem.dll",
                ":system.io.filesystem.driveinfo.dll",
                ":system.io.filesystem.primitives.dll",
                ":system.io.filesystem.watcher.dll",
                ":system.io.isolatedstorage.dll",
                ":system.io.memorymappedfiles.dll",
                ":system.io.pipes.accesscontrol.dll",
                ":system.io.pipes.dll",
                ":system.io.unmanagedmemorystream.dll",
                ":system.linq.dll",
                ":system.linq.expressions.dll",
                ":system.linq.parallel.dll",
                ":system.linq.queryable.dll",
                ":system.memory.dll",
                ":system.net.dll",
                ":system.net.http.dll",
                ":system.net.httplistener.dll",
                ":system.net.mail.dll",
                ":system.net.nameresolution.dll",
                ":system.net.networkinformation.dll",
                ":system.net.ping.dll",
                ":system.net.primitives.dll",
                ":system.net.requests.dll",
                ":system.net.security.dll",
                ":system.net.servicepoint.dll",
                ":system.net.sockets.dll",
                ":system.net.webclient.dll",
                ":system.net.webheadercollection.dll",
                ":system.net.webproxy.dll",
                ":system.net.websockets.client.dll",
                ":system.net.websockets.dll",
                ":system.numerics.dll",
                ":system.numerics.vectors.dll",
                ":system.objectmodel.dll",
                ":system.private.corelib.dll",
                ":system.private.datacontractserialization.dll",
                ":system.private.uri.dll",
                ":system.private.xml.dll",
                ":system.private.xml.linq.dll",
                ":system.reflection.dispatchproxy.dll",
                ":system.reflection.dll",
                ":system.reflection.emit.dll",
                ":system.reflection.emit.ilgeneration.dll",
                ":system.reflection.emit.lightweight.dll",
                ":system.reflection.extensions.dll",
                ":system.reflection.metadata.dll",
                ":system.reflection.primitives.dll",
                ":system.reflection.typeextensions.dll",
                ":system.resources.reader.dll",
                ":system.resources.resourcemanager.dll",
                ":system.resources.writer.dll",
                ":system.runtime.compilerservices.visualc.dll",
                ":system.runtime.dll",
                ":system.runtime.extensions.dll",
                ":system.runtime.handles.dll",
                ":system.runtime.interopservices.dll",
                ":system.runtime.interopservices.runtimeinformation.dll",
                ":system.runtime.interopservices.windowsruntime.dll",
                ":system.runtime.loader.dll",
                ":system.runtime.numerics.dll",
                ":system.runtime.serialization.dll",
                ":system.runtime.serialization.formatters.dll",
                ":system.runtime.serialization.json.dll",
                ":system.runtime.serialization.primitives.dll",
                ":system.runtime.serialization.xml.dll",
                ":system.security.accesscontrol.dll",
                ":system.security.claims.dll",
                ":system.security.cryptography.algorithms.dll",
                ":system.security.cryptography.cng.dll",
                ":system.security.cryptography.csp.dll",
                ":system.security.cryptography.encoding.dll",
                ":system.security.cryptography.openssl.dll",
                ":system.security.cryptography.primitives.dll",
                ":system.security.cryptography.x509certificates.dll",
                ":system.security.dll",
                ":system.security.principal.dll",
                ":system.security.principal.windows.dll",
                ":system.security.securestring.dll",
                ":system.servicemodel.web.dll",
                ":system.serviceprocess.dll",
                ":system.text.encoding.dll",
                ":system.text.encoding.extensions.dll",
                ":system.text.regularexpressions.dll",
                ":system.threading.dll",
                ":system.threading.overlapped.dll",
                ":system.threading.tasks.dataflow.dll",
                ":system.threading.tasks.dll",
                ":system.threading.tasks.extensions.dll",
                ":system.threading.tasks.parallel.dll",
                ":system.threading.thread.dll",
                ":system.threading.threadpool.dll",
                ":system.threading.timer.dll",
                ":system.transactions.dll",
                ":system.transactions.local.dll",
                ":system.valuetuple.dll",
                ":system.web.dll",
                ":system.web.httputility.dll",
                ":system.windows.dll",
                ":system.xml.dll",
                ":system.xml.linq.dll",
                ":system.xml.readerwriter.dll",
                ":system.xml.serialization.dll",
                ":system.xml.xdocument.dll",
                ":system.xml.xmldocument.dll",
                ":system.xml.xmlserializer.dll",
                ":system.xml.xpath.dll",
                ":system.xml.xpath.xdocument.dll",
                ":windowsbase.dll",
            ],
            "@bazel_tools//src/conditions:darwin": [
                ":microsoft.csharp.dll",
                ":microsoft.visualbasic.dll",
                ":microsoft.win32.primitives.dll",
                ":microsoft.win32.registry.dll",
                ":mscorlib.dll",
                ":netstandard.dll",
                ":sos.netcore.dll",
                ":system.appcontext.dll",
                ":system.buffers.dll",
                ":system.collections.concurrent.dll",
                ":system.collections.dll",
                ":system.collections.immutable.dll",
                ":system.collections.nongeneric.dll",
                ":system.collections.specialized.dll",
                ":system.componentmodel.annotations.dll",
                ":system.componentmodel.dataannotations.dll",
                ":system.componentmodel.dll",
                ":system.componentmodel.eventbasedasync.dll",
                ":system.componentmodel.primitives.dll",
                ":system.componentmodel.typeconverter.dll",
                ":system.configuration.dll",
                ":system.console.dll",
                ":system.core.dll",
                ":system.data.common.dll",
                ":system.data.dll",
                ":system.diagnostics.contracts.dll",
                ":system.diagnostics.debug.dll",
                ":system.diagnostics.diagnosticsource.dll",
                ":system.diagnostics.fileversioninfo.dll",
                ":system.diagnostics.process.dll",
                ":system.diagnostics.stacktrace.dll",
                ":system.diagnostics.textwritertracelistener.dll",
                ":system.diagnostics.tools.dll",
                ":system.diagnostics.tracesource.dll",
                ":system.diagnostics.tracing.dll",
                ":system.dll",
                ":system.drawing.dll",
                ":system.drawing.primitives.dll",
                ":system.dynamic.runtime.dll",
                ":system.globalization.calendars.dll",
                ":system.globalization.dll",
                ":system.globalization.extensions.dll",
                ":system.io.compression.brotli.dll",
                ":system.io.compression.dll",
                ":system.io.compression.filesystem.dll",
                ":system.io.compression.zipfile.dll",
                ":system.io.dll",
                ":system.io.filesystem.accesscontrol.dll",
                ":system.io.filesystem.dll",
                ":system.io.filesystem.driveinfo.dll",
                ":system.io.filesystem.primitives.dll",
                ":system.io.filesystem.watcher.dll",
                ":system.io.isolatedstorage.dll",
                ":system.io.memorymappedfiles.dll",
                ":system.io.pipes.accesscontrol.dll",
                ":system.io.pipes.dll",
                ":system.io.unmanagedmemorystream.dll",
                ":system.linq.dll",
                ":system.linq.expressions.dll",
                ":system.linq.parallel.dll",
                ":system.linq.queryable.dll",
                ":system.memory.dll",
                ":system.net.dll",
                ":system.net.http.dll",
                ":system.net.httplistener.dll",
                ":system.net.mail.dll",
                ":system.net.nameresolution.dll",
                ":system.net.networkinformation.dll",
                ":system.net.ping.dll",
                ":system.net.primitives.dll",
                ":system.net.requests.dll",
                ":system.net.security.dll",
                ":system.net.servicepoint.dll",
                ":system.net.sockets.dll",
                ":system.net.webclient.dll",
                ":system.net.webheadercollection.dll",
                ":system.net.webproxy.dll",
                ":system.net.websockets.client.dll",
                ":system.net.websockets.dll",
                ":system.numerics.dll",
                ":system.numerics.vectors.dll",
                ":system.objectmodel.dll",
                ":system.private.corelib.dll",
                ":system.private.datacontractserialization.dll",
                ":system.private.uri.dll",
                ":system.private.xml.dll",
                ":system.private.xml.linq.dll",
                ":system.reflection.dispatchproxy.dll",
                ":system.reflection.dll",
                ":system.reflection.emit.dll",
                ":system.reflection.emit.ilgeneration.dll",
                ":system.reflection.emit.lightweight.dll",
                ":system.reflection.extensions.dll",
                ":system.reflection.metadata.dll",
                ":system.reflection.primitives.dll",
                ":system.reflection.typeextensions.dll",
                ":system.resources.reader.dll",
                ":system.resources.resourcemanager.dll",
                ":system.resources.writer.dll",
                ":system.runtime.compilerservices.visualc.dll",
                ":system.runtime.dll",
                ":system.runtime.extensions.dll",
                ":system.runtime.handles.dll",
                ":system.runtime.interopservices.dll",
                ":system.runtime.interopservices.runtimeinformation.dll",
                ":system.runtime.interopservices.windowsruntime.dll",
                ":system.runtime.loader.dll",
                ":system.runtime.numerics.dll",
                ":system.runtime.serialization.dll",
                ":system.runtime.serialization.formatters.dll",
                ":system.runtime.serialization.json.dll",
                ":system.runtime.serialization.primitives.dll",
                ":system.runtime.serialization.xml.dll",
                ":system.security.accesscontrol.dll",
                ":system.security.claims.dll",
                ":system.security.cryptography.algorithms.dll",
                ":system.security.cryptography.cng.dll",
                ":system.security.cryptography.csp.dll",
                ":system.security.cryptography.encoding.dll",
                ":system.security.cryptography.openssl.dll",
                ":system.security.cryptography.primitives.dll",
                ":system.security.cryptography.x509certificates.dll",
                ":system.security.dll",
                ":system.security.principal.dll",
                ":system.security.principal.windows.dll",
                ":system.security.securestring.dll",
                ":system.servicemodel.web.dll",
                ":system.serviceprocess.dll",
                ":system.text.encoding.dll",
                ":system.text.encoding.extensions.dll",
                ":system.text.regularexpressions.dll",
                ":system.threading.dll",
                ":system.threading.overlapped.dll",
                ":system.threading.tasks.dataflow.dll",
                ":system.threading.tasks.dll",
                ":system.threading.tasks.extensions.dll",
                ":system.threading.tasks.parallel.dll",
                ":system.threading.thread.dll",
                ":system.threading.threadpool.dll",
                ":system.threading.timer.dll",
                ":system.transactions.dll",
                ":system.transactions.local.dll",
                ":system.valuetuple.dll",
                ":system.web.dll",
                ":system.web.httputility.dll",
                ":system.windows.dll",
                ":system.xml.dll",
                ":system.xml.linq.dll",
                ":system.xml.readerwriter.dll",
                ":system.xml.serialization.dll",
                ":system.xml.xdocument.dll",
                ":system.xml.xmldocument.dll",
                ":system.xml.xmlserializer.dll",
                ":system.xml.xpath.dll",
                ":system.xml.xpath.xdocument.dll",
                ":windowsbase.dll",
            ],
"//conditions:default": [
":microsoft.csharp.dll",
":microsoft.visualbasic.dll",
":microsoft.win32.primitives.dll",
":microsoft.win32.registry.dll",
":mscorlib.dll",
":netstandard.dll",
":sos.netcore.dll",
":system.appcontext.dll",
":system.buffers.dll",
":system.collections.concurrent.dll",
":system.collections.dll",
":system.collections.immutable.dll",
":system.collections.nongeneric.dll",
":system.collections.specialized.dll",
":system.componentmodel.annotations.dll",
":system.componentmodel.dataannotations.dll",
":system.componentmodel.dll",
":system.componentmodel.eventbasedasync.dll",
":system.componentmodel.primitives.dll",
":system.componentmodel.typeconverter.dll",
":system.configuration.dll",
":system.console.dll",
":system.core.dll",
":system.data.common.dll",
":system.data.dll",
":system.diagnostics.contracts.dll",
":system.diagnostics.debug.dll",
":system.diagnostics.diagnosticsource.dll",
":system.diagnostics.fileversioninfo.dll",
":system.diagnostics.process.dll",
":system.diagnostics.stacktrace.dll",
":system.diagnostics.textwritertracelistener.dll",
":system.diagnostics.tools.dll",
":system.diagnostics.tracesource.dll",
":system.diagnostics.tracing.dll",
":system.dll",
":system.drawing.dll",
":system.drawing.primitives.dll",
":system.dynamic.runtime.dll",
":system.globalization.calendars.dll",
":system.globalization.dll",
":system.globalization.extensions.dll",
":system.io.compression.brotli.dll",
":system.io.compression.dll",
":system.io.compression.filesystem.dll",
":system.io.compression.zipfile.dll",
":system.io.dll",
":system.io.filesystem.accesscontrol.dll",
":system.io.filesystem.dll",
":system.io.filesystem.driveinfo.dll",
":system.io.filesystem.primitives.dll",
":system.io.filesystem.watcher.dll",
":system.io.isolatedstorage.dll",
":system.io.memorymappedfiles.dll",
":system.io.pipes.accesscontrol.dll",
":system.io.pipes.dll",
":system.io.unmanagedmemorystream.dll",
":system.linq.dll",
":system.linq.expressions.dll",
":system.linq.parallel.dll",
":system.linq.queryable.dll",
":system.memory.dll",
":system.net.dll",
":system.net.http.dll",
":system.net.httplistener.dll",
":system.net.mail.dll",
":system.net.nameresolution.dll",
":system.net.networkinformation.dll",
":system.net.ping.dll",
":system.net.primitives.dll",
":system.net.requests.dll",
":system.net.security.dll",
":system.net.servicepoint.dll",
":system.net.sockets.dll",
":system.net.webclient.dll",
":system.net.webheadercollection.dll",
":system.net.webproxy.dll",
":system.net.websockets.client.dll",
":system.net.websockets.dll",
":system.numerics.dll",
":system.numerics.vectors.dll",
":system.objectmodel.dll",
":system.private.corelib.dll",
":system.private.datacontractserialization.dll",
":system.private.uri.dll",
":system.private.xml.dll",
":system.private.xml.linq.dll",
":system.reflection.dispatchproxy.dll",
":system.reflection.dll",
":system.reflection.emit.dll",
":system.reflection.emit.ilgeneration.dll",
":system.reflection.emit.lightweight.dll",
":system.reflection.extensions.dll",
":system.reflection.metadata.dll",
":system.reflection.primitives.dll",
":system.reflection.typeextensions.dll",
":system.resources.reader.dll",
":system.resources.resourcemanager.dll",
":system.resources.writer.dll",
":system.runtime.compilerservices.visualc.dll",
":system.runtime.dll",
":system.runtime.extensions.dll",
":system.runtime.handles.dll",
":system.runtime.interopservices.dll",
":system.runtime.interopservices.runtimeinformation.dll",
":system.runtime.interopservices.windowsruntime.dll",
":system.runtime.loader.dll",
":system.runtime.numerics.dll",
":system.runtime.serialization.dll",
":system.runtime.serialization.formatters.dll",
":system.runtime.serialization.json.dll",
":system.runtime.serialization.primitives.dll",
":system.runtime.serialization.xml.dll",
":system.security.accesscontrol.dll",
":system.security.claims.dll",
":system.security.cryptography.algorithms.dll",
":system.security.cryptography.cng.dll",
":system.security.cryptography.csp.dll",
":system.security.cryptography.encoding.dll",
":system.security.cryptography.openssl.dll",
":system.security.cryptography.primitives.dll",
":system.security.cryptography.x509certificates.dll",
":system.security.dll",
":system.security.principal.dll",
":system.security.principal.windows.dll",
":system.security.securestring.dll",
":system.servicemodel.web.dll",
":system.serviceprocess.dll",
":system.text.encoding.dll",
":system.text.encoding.extensions.dll",
":system.text.regularexpressions.dll",
":system.threading.dll",
":system.threading.overlapped.dll",
":system.threading.tasks.dataflow.dll",
":system.threading.tasks.dll",
":system.threading.tasks.extensions.dll",
":system.threading.tasks.parallel.dll",
":system.threading.thread.dll",
":system.threading.threadpool.dll",
":system.threading.timer.dll",
":system.transactions.dll",
":system.transactions.local.dll",
":system.valuetuple.dll",
":system.web.dll",
":system.web.httputility.dll",
":system.windows.dll",
":system.xml.dll",
":system.xml.linq.dll",
":system.xml.readerwriter.dll",
":system.xml.serialization.dll",
":system.xml.xdocument.dll",
":system.xml.xmldocument.dll",
":system.xml.xmlserializer.dll",
":system.xml.xpath.dll",
":system.xml.xpath.xdocument.dll",
":windowsbase.dll",
],
}),
data = select({
"@bazel_tools//src/conditions:windows": [
":core/shared/Microsoft.NETCore.App/2.1.6/.version",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-console-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-datetime-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-debug-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-errorhandling-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-file-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-file-l1-2-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-file-l2-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-handle-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-heap-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-interlocked-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-libraryloader-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-localization-l1-2-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-memory-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-namedpipe-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-processenvironment-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-processthreads-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-processthreads-l1-1-1.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-profile-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-rtlsupport-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-string-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-synch-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-synch-l1-2-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-sysinfo-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-timezone-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-core-util-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-conio-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-convert-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-environment-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-filesystem-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-heap-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-locale-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-math-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-multibyte-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-private-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-process-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-runtime-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-stdio-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-string-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-time-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/api-ms-win-crt-utility-l1-1-0.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/clrcompression.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/clretwrc.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/clrjit.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/coreclr.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/dbgshim.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/hostpolicy.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/Microsoft.DiaSymReader.Native.amd64.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/Microsoft.NETCore.App.deps.json",
":core/shared/Microsoft.NETCore.App/2.1.6/mscordaccore.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/mscordaccore_amd64_amd64_4.6.27019.06.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/mscordbi.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/mscorrc.debug.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/mscorrc.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/sos.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/sos_amd64_amd64_4.6.27019.06.dll",
":core/shared/Microsoft.NETCore.App/2.1.6/ucrtbase.dll",
":core/host/fxr/2.1.6/hostfxr.dll",
],
"@bazel_tools//src/conditions:darwin": [
":core/shared/Microsoft.NETCore.App/2.1.6/.version",
":core/shared/Microsoft.NETCore.App/2.1.6/libclrjit.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/libcoreclr.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/libdbgshim.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/libhostpolicy.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/libmscordaccore.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/libmscordbi.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/libsos.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/Microsoft.NETCore.App.deps.json",
":core/shared/Microsoft.NETCore.App/2.1.6/sosdocsunix.txt",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Globalization.Native.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.Compression.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.Compression.Native.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Native.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Http.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Http.Native.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Security.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Security.Native.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Native.Apple.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Native.Apple.dylib",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Native.OpenSsl.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Native.OpenSsl.dylib",
":core/host/fxr/2.1.6/libhostfxr.dylib",
],
"//conditions:default": [
":core/shared/Microsoft.NETCore.App/2.1.6/.version",
":core/shared/Microsoft.NETCore.App/2.1.6/createdump",
":core/shared/Microsoft.NETCore.App/2.1.6/libclrjit.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libcoreclr.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libcoreclrtraceptprovider.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libdbgshim.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libhostpolicy.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libmscordaccore.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libmscordbi.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libsos.so",
":core/shared/Microsoft.NETCore.App/2.1.6/libsosplugin.so",
":core/shared/Microsoft.NETCore.App/2.1.6/Microsoft.NETCore.App.deps.json",
":core/shared/Microsoft.NETCore.App/2.1.6/sosdocsunix.txt",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Globalization.Native.so",
":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.Compression.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.IO.Compression.Native.so",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Native.so",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Http.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Http.Native.so",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Security.Native.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Net.Security.Native.so",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Native.OpenSsl.a",
":core/shared/Microsoft.NETCore.App/2.1.6/System.Security.Cryptography.Native.OpenSsl.so",
":core/host/fxr/2.1.6/libhostfxr.so",
],
}),
)
# tests/test_04_automark.py (aaronsewall/pytest-dependency, Apache-2.0)
"""Test the automark_dependency option.
"""
import pytest
def test_not_set(ctestdir):
"""No pytest.ini file, e.g. automark_dependency is not set.
Since automark_dependency defaults to false and test_a is not
marked, the outcome of test_a will not be recorded. As a result,
test_b will be skipped due to a missing dependency.
"""
ctestdir.makepyfile("""
import pytest
def test_a():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
""")
result = ctestdir.runpytest("--verbose", "-rs")
result.assert_outcomes(passed=1, skipped=1, failed=0)
result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b SKIPPED
""")
def test_set_false(ctestdir):
"""A pytest.ini is present, automark_dependency is set to false.
Since automark_dependency is set to false and test_a is not
marked, the outcome of test_a will not be recorded. As a result,
test_b will be skipped due to a missing dependency.
"""
ctestdir.makefile('.ini', pytest="""
[pytest]
automark_dependency = false
console_output_style = classic
""")
ctestdir.makepyfile("""
import pytest
def test_a():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
""")
result = ctestdir.runpytest("--verbose", "-rs")
result.assert_outcomes(passed=1, skipped=1, failed=0)
result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b SKIPPED
""")
def test_set_true(ctestdir):
"""A pytest.ini is present, automark_dependency is set to false.
Since automark_dependency is set to true, the outcome of test_a
will be recorded, even though it is not marked. As a result,
test_b will be skipped due to a missing dependency.
"""
ctestdir.makefile('.ini', pytest="""
[pytest]
automark_dependency = true
console_output_style = classic
""")
ctestdir.makepyfile("""
import pytest
def test_a():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
""")
result = ctestdir.runpytest("--verbose", "-rs")
result.assert_outcomes(passed=2, skipped=0, failed=0)
result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b PASSED
""")
# test_data/parse_retree/expected/character_set/common_escaping/source.py (aas-core-works/aas-core-codegen, MIT)
"[\\^\\xab\\uc0de\\U0010ffff\\-\\[\\]\\\\]"
4c77b14f6a9fca902052cdf6b329407a84bec5b7 | 3,049 | py | Python | authnzerver/actions/email_templates.py | waqasbhatti/authnzerver | d40fa38601f4f11e966fc52e11ad6fe1116bb145 | [
"MIT"
] | 3 | 2019-06-02T12:57:08.000Z | 2020-04-01T14:00:12.000Z | authnzerver/actions/email_templates.py | waqasbhatti/authnzerver | d40fa38601f4f11e966fc52e11ad6fe1116bb145 | [
"MIT"
] | 7 | 2020-03-17T21:55:41.000Z | 2020-07-07T22:58:48.000Z | authnzerver/actions/email_templates.py | waqasbhatti/authnzerver | d40fa38601f4f11e966fc52e11ad6fe1116bb145 | [
"MIT"
] | 2 | 2020-03-04T06:56:27.000Z | 2020-03-24T08:39:11.000Z | # -*- coding: utf-8 -*-
# email_templates.py - Waqas Bhatti (waqas.afzal.bhatti@gmail.com) - Jul 2020
# License: MIT - see the LICENSE file for the full text.
"""
This contains simple default verification email templates.
"""
SIGNUP_VERIFICATION_EMAIL_SUBJECT = (
'[{server_name}] Please verify your account sign up request'
)
SIGNUP_VERIFICATION_EMAIL_TEMPLATE = '''\
Hello,
This is an automated message from the {server_name} at: {server_baseurl}.
We received an account sign up request for: {user_email}. This request
was made using the browser:
{browser_identifier}
from the IP address: {ip_address}.
Please enter this code:
{verification_code}
into the account verification form at: {server_baseurl}{account_verify_url}
to verify that you made this request. This code will expire on
{verification_expiry}
You will also need to enter your email address and password
to log in.
If you do not recognize the browser and IP address above or did not
initiate this request, someone else may have used your email address
in error. Feel free to ignore this email.
You can see your IP address here: https://www.google.com/search?q=my+ip+address
Thanks,
{server_name} admins
{server_baseurl}
'''
FORGOTPASS_VERIFICATION_EMAIL_SUBJECT = (
'[{server_name}] Please verify your password reset request'
)
FORGOTPASS_VERIFICATION_EMAIL_TEMPLATE = '''\
Hello,
This is an automated message from the {server_name} at: {server_baseurl}.
We received a password reset request for: {user_email}. This request
was initiated using the browser:
{browser_identifier}
from the IP address: {ip_address}.
Please enter this code:
{verification_code}
into the password reset form at: {server_baseurl}{password_forgot_url}
to verify that you made this request. This code will expire on
{verification_expiry}
If you do not recognize the browser and IP address above or did not
initiate this request, someone else may have used your email address
in error. Feel free to ignore this email.
You can see your IP address here: https://www.google.com/search?q=my+ip+address
Thanks,
{server_name} admins
{server_baseurl}
'''
CHANGEPASS_VERIFICATION_EMAIL_SUBJECT = (
'[{server_name}] Please verify your password change request'
)
CHANGEPASS_VERIFICATION_EMAIL_TEMPLATE = '''\
Hello,
This is an automated message from the {server_name} at: {server_baseurl}.
We received a password change request for: {user_email}. This request
was initiated using the browser:
{browser_identifier}
from the IP address: {ip_address}.
Please enter this code:
{verification_code}
into the account verification form at: {server_baseurl}{password_change_url}
to verify that you made this request. This code will expire on
{verification_expiry}
If you do not recognize the browser and IP address above or did not
initiate this request, someone else may have used your email address
in error. Feel free to ignore this email.
You can see your IP address here: https://www.google.com/search?q=my+ip+address
Thanks,
{server_name} admins
{server_baseurl}
'''
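The templates in this module are plain `str.format` strings, so rendering an email is a single `format()` call over a context mapping. A minimal sketch (the shortened template and every context value below are hypothetical, not part of this module):

```python
# Shortened stand-in for the module's templates; same str.format placeholders.
TEMPLATE = (
    "Hello,\n"
    "Please enter this code:\n\n"
    "{verification_code}\n\n"
    "into the account verification form at: "
    "{server_baseurl}{account_verify_url}\n"
)

# Hypothetical context values for illustration only.
context = {
    "verification_code": "ABCD-1234",
    "server_baseurl": "https://example.org",
    "account_verify_url": "/users/verify",
}
rendered = TEMPLATE.format(**context)
```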
d5b1d3e274548693fce2945f177390f5dcc11aa2 | 37,476 | py | Python | rnnlayer.py | wentaozhu/recurrent-attention-for-QA-SQUAD-based-on-keras | 40a895e55eb8571120ee70c5b7e32319be1d5627 | [
"MIT"
] | 28 | 2017-03-12T17:28:28.000Z | 2020-06-12T11:17:04.000Z | rnnlayer.py | wentaozhu/recurrent-attention-for-QA-SQUAD-based-on-keras | 40a895e55eb8571120ee70c5b7e32319be1d5627 | [
"MIT"
] | 2 | 2017-06-01T19:18:27.000Z | 2017-09-07T01:05:23.000Z | rnnlayer.py | wentaozhu/recurrent-attention-for-QA-SQUAD-based-on-keras | 40a895e55eb8571120ee70c5b7e32319be1d5627 | [
"MIT"
] | 18 | 2017-04-29T06:32:17.000Z | 2021-03-03T07:17:06.000Z | # -*- coding: utf-8 -*-
from __future__ import absolute_import
import numpy as np
from keras import backend as K
from keras import initializers
from keras.engine import InputSpec
from keras.regularizers import l2
from keras.callbacks import *
# from visualizer import *
from keras.models import *
from keras.optimizers import *
from keras.utils.np_utils import to_categorical#, accuracy
from keras.layers.core import *
from keras.layers import Input, Embedding, LSTM, Dense, merge, TimeDistributed, Recurrent
def time_distributed_dense(x, w, b=None, dropout=None,
input_dim=None, units=None, timesteps=None):
"""Apply `y . w + b` for every temporal slice y of x.
# Arguments
x: input tensor.
w: weight matrix.
b: optional bias vector.
        dropout: whether to apply dropout (same dropout mask
            for every temporal slice of the input).
input_dim: integer; optional dimensionality of the input.
units: integer; optional dimensionality of the output.
timesteps: integer; optional number of timesteps.
# Returns
Output tensor.
"""
if not input_dim:
input_dim = K.shape(x)[2]
if not timesteps:
timesteps = K.shape(x)[1]
if not units:
units = K.shape(w)[1]
if dropout is not None and 0. < dropout < 1.:
# apply the same dropout pattern at every timestep
ones = K.ones_like(K.reshape(x[:, 0, :], (-1, input_dim)))
dropout_matrix = K.dropout(ones, dropout)
expanded_dropout_matrix = K.repeat(dropout_matrix, timesteps)
x = K.in_train_phase(x * expanded_dropout_matrix, x)
# collapse time dimension and batch dimension together
x = K.reshape(x, (-1, input_dim))
x = K.dot(x, w)
if b:
x += b
# reshape to 3D tensor
if K.backend() == 'tensorflow':
x = K.reshape(x, K.stack([-1, timesteps, units]))
x.set_shape([None, None, units])
else:
x = K.reshape(x, (-1, timesteps, units))
return x
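As a sanity check, the collapse-and-reshape trick used above can be sketched in plain NumPy (the shapes and random data below are illustrative assumptions, independent of any Keras backend): fold the batch and time axes together, do one matmul, then restore the 3D shape.

```python
import numpy as np

batch, timesteps, input_dim, units = 2, 4, 3, 5
rng = np.random.default_rng(42)
x = rng.random((batch, timesteps, input_dim))
w = rng.random((input_dim, units))
b = rng.random(units)

# collapse batch and time, apply one matmul, then restore the 3D shape
y = (x.reshape(-1, input_dim) @ w + b).reshape(batch, timesteps, units)

# reference: apply the same dense map one timestep at a time
y_ref = np.stack([x[:, t, :] @ w + b for t in range(timesteps)], axis=1)
```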
class Attention(Recurrent):
"""Attention Recurrent Unit - Bengio et al. ICLR 2015.
# Arguments
units: dimension of the internal projections and the final output.
h: we use it as attention to process the input
init: weight initialization function.
Can be the name of an existing function (str),
or a Theano function (see: [initializations](../initializations.md)).
inner_init: initialization function of the inner cells.
activation: activation function.
Can be the name of an existing function (str),
or a Theano function (see: [activations](../activations.md)).
inner_activation: activation function for the inner cells.
W_regularizer: instance of [WeightRegularizer](../regularizers.md)
(eg. L1 or L2 regularization), applied to the input weights matrices.
U_regularizer: instance of [WeightRegularizer](../regularizers.md)
(eg. L1 or L2 regularization), applied to the recurrent weights matrices.
b_regularizer: instance of [WeightRegularizer](../regularizers.md),
applied to the bias.
dropout_W: float between 0 and 1. Fraction of the input units to drop for input gates.
dropout_U: float between 0 and 1. Fraction of the input units to drop for recurrent connections.
"""
def __init__(self, units, h, h_dim,
kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal',
#activation='tanh', inner_activation='hard_sigmoid',
#W_regularizer=None, U_regularizer=None, b_regularizer=None,
#dropout_W=0., dropout_U=0.,
**kwargs):
self.units = units
self.h = h[:,-1,:]
self.h_dim = h_dim
self.kernel_initializer = initializers.get(kernel_initializer)
self.recurrent_initializer = initializers.get(recurrent_initializer)
#self.activation = activations.get(activation)
#self.inner_activation = activations.get(inner_activation)
#self.W_regularizer = regularizers.get(W_regularizer)
#self.U_regularizer = regularizers.get(U_regularizer)
#self.b_regularizer = regularizers.get(b_regularizer)
#self.dropout_W = dropout_W
#self.dropout_U = dropout_U
#if self.dropout_W or self.dropout_U:
# self.uses_learning_phase = True
super(Attention, self).__init__(**kwargs)
def build(self, input_shape):
self.input_spec = [InputSpec(shape=input_shape)]
self.input_dim = input_shape[2]
if self.stateful:
self.reset_states()
else:
# initial states: all-zero tensor of shape (units)
self.states = [None]
self.Wa = self.add_weight((self.units, self.units),
initializer=self.kernel_initializer,
name='{}_Wa'.format(self.name))
self.Ua = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_Ua'.format(self.name))
self.Va = self.add_weight((self.units,1),
initializer=self.kernel_initializer,
name='{}_Va'.format(self.name))
self.Wzr = self.add_weight((self.input_dim, 2 * self.units),
initializer=self.kernel_initializer,
name='{}_Wzr'.format(self.name))
self.Uzr = self.add_weight((self.units, 2 * self.units),
initializer=self.recurrent_initializer,
name='{}_Wzr'.format(self.name))
self.Czr = self.add_weight((self.h_dim, 2 * self.units),
initializer=self.recurrent_initializer,
name='{}_Czr'.format(self.name))
self.W = self.add_weight((self.input_dim, self.units),
initializer=self.kernel_initializer,
name='{}_W'.format(self.name))
self.U = self.add_weight((self.units, self.units),
initializer=self.recurrent_initializer,
name='{}_U'.format(self.name))
self.C = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_C'.format(self.name))
#if self.initial_weights is not None:
# self.set_weights(self.initial_weights)
# del self.initial_weights
self.built = True
def reset_states(self):
assert self.stateful, 'Layer must be stateful.'
input_shape = self.input_spec[0].shape
if not input_shape[0]:
raise ValueError('If a RNN is stateful, a complete '
'input_shape must be provided '
'(including batch size).')
if hasattr(self, 'states'):
K.set_value(self.states[0],
np.zeros((input_shape[0], self.units)))
else:
self.states = [K.zeros((input_shape[0], self.units))]
def preprocess_input(self, inputs, training=None):
#if self.consume_less == 'cpu':
# input_shape = K.int_shape(x)
# input_dim = input_shape[2]
# timesteps = input_shape[1]
# x_z = time_distributed_dense(x, self.W_z, self.b_z, self.dropout_W,
# input_dim, self.units, timesteps)
# x_r = time_distributed_dense(x, self.W_r, self.b_r, self.dropout_W,
# input_dim, self.units, timesteps)
# x_h = time_distributed_dense(x, self.W_h, self.b_h, self.dropout_W,
# input_dim, self.units, timesteps)
# return K.concatenate([x_z, x_r, x_h], axis=2)
#else:
# return x
self.ha = time_distributed_dense(self.h, self.Ua)
return inputs
def step(self, inputs, states):
h_tm1 = states[0] # previous memory
#B_U = states[1] # dropout matrices for recurrent units
#B_W = states[2]
h_tm1a = K.dot(h_tm1, self.Wa)
eij = K.dot(K.tanh(K.repeat(h_tm1a, K.shape(self.h)[1]) + self.ha), self.Va)
eijs = K.squeeze(eij, -1)
        alphaij = K.softmax(eijs)  # (batch, len_h) attention weights over self.h
ci = K.permute_dimensions(K.permute_dimensions(self.h, [2,0,1]) * alphaij, [1,2,0])
cisum = K.sum(ci, axis=1)
#print(K.shape(cisum), cisum.shape, ci.shape, self.h.shape, alphaij.shape, x.shape)
zr = K.sigmoid(K.dot(inputs, self.Wzr) + K.dot(h_tm1, self.Uzr) + K.dot(cisum, self.Czr))
zi = zr[:, :self.units]
ri = zr[:, self.units: 2 * self.units]
si_ = K.tanh(K.dot(inputs, self.W) + K.dot(ri*h_tm1, self.U) + K.dot(cisum, self.C))
si = (1-zi) * h_tm1 + zi * si_
return si, [si] #h_tm1, [h_tm1]
'''if self.consume_less == 'gpu':
matrix_x = K.dot(x * B_W[0], self.W) + self.b
matrix_inner = K.dot(h_tm1 * B_U[0], self.U[:, :2 * self.units])
x_z = matrix_x[:, :self.units]
x_r = matrix_x[:, self.units: 2 * self.units]
inner_z = matrix_inner[:, :self.units]
inner_r = matrix_inner[:, self.units: 2 * self.units]
z = self.inner_activation(x_z + inner_z)
r = self.inner_activation(x_r + inner_r)
x_h = matrix_x[:, 2 * self.units:]
inner_h = K.dot(r * h_tm1 * B_U[0], self.U[:, 2 * self.units:])
hh = self.activation(x_h + inner_h)
else:
if self.consume_less == 'cpu':
x_z = x[:, :self.units]
x_r = x[:, self.units: 2 * self.units]
x_h = x[:, 2 * self.units:]
elif self.consume_less == 'mem':
x_z = K.dot(x * B_W[0], self.W_z) + self.b_z
x_r = K.dot(x * B_W[1], self.W_r) + self.b_r
x_h = K.dot(x * B_W[2], self.W_h) + self.b_h
else:
raise ValueError('Unknown `consume_less` mode.')
z = self.inner_activation(x_z + K.dot(h_tm1 * B_U[0], self.U_z))
r = self.inner_activation(x_r + K.dot(h_tm1 * B_U[1], self.U_r))
hh = self.activation(x_h + K.dot(r * h_tm1 * B_U[2], self.U_h))
h = z * h_tm1 + (1 - z) * hh
return h, [h]'''
def get_constants(self, inputs, training=None):
constants = []
'''if 0 < self.dropout_U < 1:
ones = K.ones_like(K.reshape(x[:, 0, 0], (-1, 1)))
ones = K.tile(ones, (1, self.units))
B_U = [K.in_train_phase(K.dropout(ones, self.dropout_U), ones) for _ in range(3)]
constants.append(B_U)
else:
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
if 0 < self.dropout_W < 1:
input_shape = K.int_shape(x)
input_dim = input_shape[-1]
ones = K.ones_like(K.reshape(x[:, 0, 0], (-1, 1)))
ones = K.tile(ones, (1, int(input_dim)))
B_W = [K.in_train_phase(K.dropout(ones, self.dropout_W), ones) for _ in range(3)]
constants.append(B_W)
else:'''
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
return constants
def get_config(self):
config = {'units': self.units,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'recurrent_initializer': initializers.serialize(self.recurrent_initializer)}
base_config = super(SimpleAttention, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
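The `step` method above scores each encoder position with Bahdanau-style additive attention, e_j = v_aᵀ·tanh(W_a·h_prev + U_a·h_j), then softmax-normalizes the scores and takes a weighted sum of the encoder states. A minimal NumPy sketch of that computation (the array shapes and names here are illustrative, not part of this layer's API):

```python
import numpy as np

def additive_attention(h_prev, H, Wa, Ua, va):
    """h_prev: (units,) previous decoder state; H: (len_h, h_dim) encoder states."""
    scores = np.tanh(h_prev @ Wa + H @ Ua) @ va   # e_j, shape (len_h,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                      # softmax over encoder steps
    context = weights @ H                         # weighted sum, shape (h_dim,)
    return weights, context

rng = np.random.default_rng(0)
units, h_dim, len_h = 4, 3, 5
w, ctx = additive_attention(
    rng.normal(size=units),
    rng.normal(size=(len_h, h_dim)),
    rng.normal(size=(units, units)),   # Wa
    rng.normal(size=(h_dim, units)),   # Ua
    rng.normal(size=units),            # va
)
```

The context vector plays the role of `cisum` in `step`, feeding the gate and candidate computations alongside the input and the previous state.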
class SimpleAttention(Recurrent):
"""Attention Recurrent Unit - Bengio et al. ICLR 2015.
# Arguments
units: dimension of the internal projections and the final output.
h: we use it as attention to process the input
init: weight initialization function.
Can be the name of an existing function (str),
or a Theano function (see: [initializations](../initializations.md)).
inner_init: initialization function of the inner cells.
activation: activation function.
Can be the name of an existing function (str),
or a Theano function (see: [activations](../activations.md)).
inner_activation: activation function for the inner cells.
W_regularizer: instance of [WeightRegularizer](../regularizers.md)
(eg. L1 or L2 regularization), applied to the input weights matrices.
U_regularizer: instance of [WeightRegularizer](../regularizers.md)
(eg. L1 or L2 regularization), applied to the recurrent weights matrices.
b_regularizer: instance of [WeightRegularizer](../regularizers.md),
applied to the bias.
dropout_W: float between 0 and 1. Fraction of the input units to drop for input gates.
dropout_U: float between 0 and 1. Fraction of the input units to drop for recurrent connections.
"""
def __init__(self, units, h, h_dim,
kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal',
#activation='tanh', inner_activation='hard_sigmoid',
#W_regularizer=None, U_regularizer=None, b_regularizer=None,
#dropout_W=0., dropout_U=0.,
**kwargs):
self.units = units
self.h = h
self.h_dim = h_dim
self.kernel_initializer = initializers.get(kernel_initializer)
self.recurrent_initializer = initializers.get(recurrent_initializer)
#self.activation = activations.get(activation)
#self.inner_activation = activations.get(inner_activation)
#self.W_regularizer = regularizers.get(W_regularizer)
#self.U_regularizer = regularizers.get(U_regularizer)
#self.b_regularizer = regularizers.get(b_regularizer)
#self.dropout_W = dropout_W
#self.dropout_U = dropout_U
#if self.dropout_W or self.dropout_U:
# self.uses_learning_phase = True
super(SimpleAttention, self).__init__(**kwargs)
def build(self, input_shape):
self.input_spec = [InputSpec(shape=input_shape)]
self.input_dim = input_shape[2]
if self.stateful:
self.reset_states()
else:
# initial states: all-zero tensor of shape (units)
self.states = [None]
self.Wa = self.add_weight((self.units, self.units),
initializer=self.kernel_initializer,
name='{}_Wa'.format(self.name))
self.Ua = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_Ua'.format(self.name))
self.Va = self.add_weight((self.units,1),
initializer=self.kernel_initializer,
name='{}_Va'.format(self.name))
self.Wzr = self.add_weight((self.input_dim, 2 * self.units),
initializer=self.kernel_initializer,
name='{}_Wzr'.format(self.name))
self.Uzr = self.add_weight((self.units, 2 * self.units),
initializer=self.recurrent_initializer,
                                  name='{}_Uzr'.format(self.name))
self.Czr = self.add_weight((self.h_dim, 2 * self.units),
initializer=self.recurrent_initializer,
name='{}_Czr'.format(self.name))
self.W = self.add_weight((self.input_dim, self.units),
initializer=self.kernel_initializer,
name='{}_W'.format(self.name))
self.U = self.add_weight((self.units, self.units),
initializer=self.recurrent_initializer,
name='{}_U'.format(self.name))
self.C = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_C'.format(self.name))
#if self.initial_weights is not None:
# self.set_weights(self.initial_weights)
# del self.initial_weights
self.built = True
def reset_states(self):
assert self.stateful, 'Layer must be stateful.'
input_shape = self.input_spec[0].shape
if not input_shape[0]:
raise ValueError('If a RNN is stateful, a complete '
'input_shape must be provided '
'(including batch size).')
if hasattr(self, 'states'):
K.set_value(self.states[0],
np.zeros((input_shape[0], self.units)))
else:
self.states = [K.zeros((input_shape[0], self.units))]
def preprocess_input(self, inputs, training=None):
#if self.consume_less == 'cpu':
# input_shape = K.int_shape(x)
# input_dim = input_shape[2]
# timesteps = input_shape[1]
# x_z = time_distributed_dense(x, self.W_z, self.b_z, self.dropout_W,
# input_dim, self.units, timesteps)
# x_r = time_distributed_dense(x, self.W_r, self.b_r, self.dropout_W,
# input_dim, self.units, timesteps)
# x_h = time_distributed_dense(x, self.W_h, self.b_h, self.dropout_W,
# input_dim, self.units, timesteps)
# return K.concatenate([x_z, x_r, x_h], axis=2)
#else:
# return x
self.ha = K.dot(self.h, self.Ua) #time_distributed_dense(self.h, self.Ua)
return inputs
def step(self, inputs, states):
h_tm1 = states[0] # previous memory
#B_U = states[1] # dropout matrices for recurrent units
#B_W = states[2]
h_tm1a = K.dot(h_tm1, self.Wa)
eij = K.dot(K.tanh(h_tm1a + self.ha), self.Va)
eijs = K.repeat_elements(eij, self.h_dim, axis=1)
#alphaij = K.softmax(eijs) # batchsize * lenh h batchsize * lenh * ndim
#ci = K.permute_dimensions(K.permute_dimensions(self.h, [2,0,1]) * alphaij, [1,2,0])
#cisum = K.sum(ci, axis=1)
cisum = eijs*self.h
#print(K.shape(cisum), cisum.shape, ci.shape, self.h.shape, alphaij.shape, x.shape)
zr = K.sigmoid(K.dot(inputs, self.Wzr) + K.dot(h_tm1, self.Uzr) + K.dot(cisum, self.Czr))
zi = zr[:, :self.units]
ri = zr[:, self.units: 2 * self.units]
si_ = K.tanh(K.dot(inputs, self.W) + K.dot(ri*h_tm1, self.U) + K.dot(cisum, self.C))
si = (1-zi) * h_tm1 + zi * si_
return si, [si] #h_tm1, [h_tm1]
'''if self.consume_less == 'gpu':
matrix_x = K.dot(x * B_W[0], self.W) + self.b
matrix_inner = K.dot(h_tm1 * B_U[0], self.U[:, :2 * self.units])
x_z = matrix_x[:, :self.units]
x_r = matrix_x[:, self.units: 2 * self.units]
inner_z = matrix_inner[:, :self.units]
inner_r = matrix_inner[:, self.units: 2 * self.units]
z = self.inner_activation(x_z + inner_z)
r = self.inner_activation(x_r + inner_r)
x_h = matrix_x[:, 2 * self.units:]
inner_h = K.dot(r * h_tm1 * B_U[0], self.U[:, 2 * self.units:])
hh = self.activation(x_h + inner_h)
else:
if self.consume_less == 'cpu':
x_z = x[:, :self.units]
x_r = x[:, self.units: 2 * self.units]
x_h = x[:, 2 * self.units:]
elif self.consume_less == 'mem':
x_z = K.dot(x * B_W[0], self.W_z) + self.b_z
x_r = K.dot(x * B_W[1], self.W_r) + self.b_r
x_h = K.dot(x * B_W[2], self.W_h) + self.b_h
else:
raise ValueError('Unknown `consume_less` mode.')
z = self.inner_activation(x_z + K.dot(h_tm1 * B_U[0], self.U_z))
r = self.inner_activation(x_r + K.dot(h_tm1 * B_U[1], self.U_r))
hh = self.activation(x_h + K.dot(r * h_tm1 * B_U[2], self.U_h))
h = z * h_tm1 + (1 - z) * hh
return h, [h]'''
def get_constants(self, inputs, training=None):
constants = []
'''if 0 < self.dropout_U < 1:
ones = K.ones_like(K.reshape(x[:, 0, 0], (-1, 1)))
ones = K.tile(ones, (1, self.units))
B_U = [K.in_train_phase(K.dropout(ones, self.dropout_U), ones) for _ in range(3)]
constants.append(B_U)
else:
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
if 0 < self.dropout_W < 1:
input_shape = K.int_shape(x)
input_dim = input_shape[-1]
ones = K.ones_like(K.reshape(x[:, 0, 0], (-1, 1)))
ones = K.tile(ones, (1, int(input_dim)))
B_W = [K.in_train_phase(K.dropout(ones, self.dropout_W), ones) for _ in range(3)]
constants.append(B_W)
else:'''
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
return constants
def get_config(self):
config = {'units': self.units,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'recurrent_initializer': initializers.serialize(self.recurrent_initializer)}
base_config = super(SimpleAttention, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
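Each `get_config` above merges the subclass's serializable arguments into the base layer's config; the `dict(list(...) + list(...))` idiom is simply a Python-2-compatible dict merge where the subclass's keys win on collision. A small standalone check of that idiom (the key names are illustrative):

```python
# Stand-ins for the base layer config and this layer's own config.
base_config = {'name': 'attn_1', 'trainable': True}
config = {'units': 8, 'kernel_initializer': 'glorot_uniform'}

# Later items override earlier ones, so `config` entries take precedence.
merged = dict(list(base_config.items()) + list(config.items()))
```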
class SSimpleAttention(Recurrent):
"""Attention Recurrent Unit - Bengio et al. ICLR 2015.
# Arguments
units: dimension of the internal projections and the final output.
h: we use it as attention to process the input
init: weight initialization function.
Can be the name of an existing function (str),
or a Theano function (see: [initializations](../initializations.md)).
inner_init: initialization function of the inner cells.
activation: activation function.
Can be the name of an existing function (str),
or a Theano function (see: [activations](../activations.md)).
inner_activation: activation function for the inner cells.
W_regularizer: instance of [WeightRegularizer](../regularizers.md)
(eg. L1 or L2 regularization), applied to the input weights matrices.
U_regularizer: instance of [WeightRegularizer](../regularizers.md)
(eg. L1 or L2 regularization), applied to the recurrent weights matrices.
b_regularizer: instance of [WeightRegularizer](../regularizers.md),
applied to the bias.
dropout_W: float between 0 and 1. Fraction of the input units to drop for input gates.
dropout_U: float between 0 and 1. Fraction of the input units to drop for recurrent connections.
"""
def __init__(self, units, h, h_dim,
kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal',
#activation='tanh', inner_activation='hard_sigmoid',
#W_regularizer=None, U_regularizer=None, b_regularizer=None,
#dropout_W=0., dropout_U=0.,
**kwargs):
self.units = units
self.h = h[:,-1,:]
self.h_dim = h_dim
self.kernel_initializer = initializers.get(kernel_initializer)
self.recurrent_initializer = initializers.get(recurrent_initializer)
#self.activation = activations.get(activation)
#self.inner_activation = activations.get(inner_activation)
#self.W_regularizer = regularizers.get(W_regularizer)
#self.U_regularizer = regularizers.get(U_regularizer)
#self.b_regularizer = regularizers.get(b_regularizer)
#self.dropout_W = dropout_W
#self.dropout_U = dropout_U
#if self.dropout_W or self.dropout_U:
# self.uses_learning_phase = True
super(SSimpleAttention, self).__init__(**kwargs)
def build(self, input_shape):
self.input_spec = [InputSpec(shape=input_shape)]
self.input_dim = input_shape[2]
if self.stateful:
self.reset_states()
else:
# initial states: all-zero tensor of shape (units)
self.states = [None]
self.Wa = self.add_weight((self.units, self.units),
initializer=self.kernel_initializer,
name='{}_Wa'.format(self.name))
self.Ua = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_Ua'.format(self.name))
self.Va = self.add_weight((self.units,1),
initializer=self.kernel_initializer,
name='{}_Va'.format(self.name))
self.Wzr = self.add_weight((self.input_dim, 2 * self.units),
initializer=self.kernel_initializer,
name='{}_Wzr'.format(self.name))
self.Uzr = self.add_weight((self.units, 2 * self.units),
initializer=self.recurrent_initializer,
                                  name='{}_Uzr'.format(self.name))
self.Czr = self.add_weight((self.h_dim, 2 * self.units),
initializer=self.recurrent_initializer,
name='{}_Czr'.format(self.name))
self.W = self.add_weight((self.input_dim, self.units),
initializer=self.kernel_initializer,
name='{}_W'.format(self.name))
self.U = self.add_weight((self.units, self.units),
initializer=self.recurrent_initializer,
name='{}_U'.format(self.name))
self.C = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_C'.format(self.name))
#if self.initial_weights is not None:
# self.set_weights(self.initial_weights)
# del self.initial_weights
self.built = True
def reset_states(self):
assert self.stateful, 'Layer must be stateful.'
input_shape = self.input_spec[0].shape
if not input_shape[0]:
raise ValueError('If a RNN is stateful, a complete '
'input_shape must be provided '
'(including batch size).')
if hasattr(self, 'states'):
K.set_value(self.states[0],
np.zeros((input_shape[0], self.units)))
else:
self.states = [K.zeros((input_shape[0], self.units))]
def preprocess_input(self, inputs, training=None):
#if self.consume_less == 'cpu':
# input_shape = K.int_shape(x)
# input_dim = input_shape[2]
# timesteps = input_shape[1]
# x_z = time_distributed_dense(x, self.W_z, self.b_z, self.dropout_W,
# input_dim, self.units, timesteps)
# x_r = time_distributed_dense(x, self.W_r, self.b_r, self.dropout_W,
# input_dim, self.units, timesteps)
# x_h = time_distributed_dense(x, self.W_h, self.b_h, self.dropout_W,
# input_dim, self.units, timesteps)
# return K.concatenate([x_z, x_r, x_h], axis=2)
#else:
# return x
self.ha = K.dot(self.h, self.Ua) #time_distributed_dense(self.h, self.Ua)
return inputs
def step(self, inputs, states):
h_tm1 = states[0] # previous memory
#B_U = states[1] # dropout matrices for recurrent units
#B_W = states[2]
h_tm1a = K.dot(h_tm1, self.Wa)
eij = K.dot(K.tanh(h_tm1a + self.ha), self.Va)
eijs = K.repeat_elements(eij, self.h_dim, axis=1)
#alphaij = K.softmax(eijs) # batchsize * lenh h batchsize * lenh * ndim
#ci = K.permute_dimensions(K.permute_dimensions(self.h, [2,0,1]) * alphaij, [1,2,0])
#cisum = K.sum(ci, axis=1)
cisum = eijs*self.h
#print(K.shape(cisum), cisum.shape, ci.shape, self.h.shape, alphaij.shape, x.shape)
zr = K.sigmoid(K.dot(inputs, self.Wzr) + K.dot(h_tm1, self.Uzr) + K.dot(cisum, self.Czr))
zi = zr[:, :self.units]
ri = zr[:, self.units: 2 * self.units]
si_ = K.tanh(K.dot(inputs, self.W) + K.dot(ri*h_tm1, self.U) + K.dot(cisum, self.C))
si = (1-zi) * h_tm1 + zi * si_
return si, [si] #h_tm1, [h_tm1]
'''if self.consume_less == 'gpu':
matrix_x = K.dot(x * B_W[0], self.W) + self.b
matrix_inner = K.dot(h_tm1 * B_U[0], self.U[:, :2 * self.units])
x_z = matrix_x[:, :self.units]
x_r = matrix_x[:, self.units: 2 * self.units]
inner_z = matrix_inner[:, :self.units]
inner_r = matrix_inner[:, self.units: 2 * self.units]
z = self.inner_activation(x_z + inner_z)
r = self.inner_activation(x_r + inner_r)
x_h = matrix_x[:, 2 * self.units:]
inner_h = K.dot(r * h_tm1 * B_U[0], self.U[:, 2 * self.units:])
hh = self.activation(x_h + inner_h)
else:
if self.consume_less == 'cpu':
x_z = x[:, :self.units]
x_r = x[:, self.units: 2 * self.units]
x_h = x[:, 2 * self.units:]
elif self.consume_less == 'mem':
x_z = K.dot(x * B_W[0], self.W_z) + self.b_z
x_r = K.dot(x * B_W[1], self.W_r) + self.b_r
x_h = K.dot(x * B_W[2], self.W_h) + self.b_h
else:
raise ValueError('Unknown `consume_less` mode.')
z = self.inner_activation(x_z + K.dot(h_tm1 * B_U[0], self.U_z))
r = self.inner_activation(x_r + K.dot(h_tm1 * B_U[1], self.U_r))
hh = self.activation(x_h + K.dot(r * h_tm1 * B_U[2], self.U_h))
h = z * h_tm1 + (1 - z) * hh
return h, [h]'''
def get_constants(self, inputs, training=None):
constants = []
'''if 0 < self.dropout_U < 1:
ones = K.ones_like(K.reshape(x[:, 0, 0], (-1, 1)))
ones = K.tile(ones, (1, self.units))
B_U = [K.in_train_phase(K.dropout(ones, self.dropout_U), ones) for _ in range(3)]
constants.append(B_U)
else:
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
if 0 < self.dropout_W < 1:
input_shape = K.int_shape(x)
input_dim = input_shape[-1]
ones = K.ones_like(K.reshape(x[:, 0, 0], (-1, 1)))
ones = K.tile(ones, (1, int(input_dim)))
B_W = [K.in_train_phase(K.dropout(ones, self.dropout_W), ones) for _ in range(3)]
constants.append(B_W)
else:'''
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
return constants
def get_config(self):
config = {'units': self.units,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'recurrent_initializer': initializers.serialize(self.recurrent_initializer)}
base_config = super(SSimpleAttention, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
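The recurrence in `step` blends the previous state and the candidate through the update gate, s_i = (1 − z)·h_prev + z·s̃, so with a gate in (0, 1) each output component stays between the corresponding old and candidate values. A toy NumPy check of that update (the values are arbitrary):

```python
import numpy as np

h_prev = np.array([0.0, 1.0])          # previous hidden state
s_tilde = np.array([1.0, 0.0])         # candidate state from tanh(...)
z = np.array([0.25, 0.75])             # update gate, elementwise in (0, 1)

# Same form as `si = (1 - zi) * h_tm1 + zi * si_` in step().
s = (1 - z) * h_prev + z * s_tilde
```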
class SimpleAttention2(Recurrent):
def __init__(self, units, h_dim,
kernel_initializer='glorot_uniform', recurrent_initializer='orthogonal',
**kwargs):
self.units = units
self.h_dim = h_dim
self.kernel_initializer = initializers.get(kernel_initializer)
self.recurrent_initializer = initializers.get(recurrent_initializer)
super(SimpleAttention2, self).__init__(**kwargs)
def build(self, input_shape):
self.input_spec = [InputSpec(shape=input_shape)]
self.input_dim = input_shape[2] - self.h_dim
if self.stateful:
self.reset_states()
else:
self.states = [None]
self.Wa = self.add_weight((self.units, self.units),
initializer=self.recurrent_initializer,
name='{}_Wa'.format(self.name))
self.Ua = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_Ua'.format(self.name))
self.Va = self.add_weight((self.units,1),
initializer=self.recurrent_initializer,
name='{}_Va'.format(self.name))
self.Wzr = self.add_weight((self.input_dim, 2 * self.units),
initializer=self.recurrent_initializer,
name='{}_Wzr'.format(self.name))
self.Uzr = self.add_weight((self.units, 2 * self.units),
initializer=self.recurrent_initializer,
                                  name='{}_Uzr'.format(self.name))
self.Czr = self.add_weight((self.h_dim, 2 * self.units),
initializer=self.recurrent_initializer,
name='{}_Czr'.format(self.name))
self.W = self.add_weight((self.input_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_W'.format(self.name))
self.U = self.add_weight((self.units, self.units),
initializer=self.recurrent_initializer,
name='{}_U'.format(self.name))
self.C = self.add_weight((self.h_dim, self.units),
initializer=self.recurrent_initializer,
name='{}_C'.format(self.name))
self.built = True
def reset_states(self):
assert self.stateful, 'Layer must be stateful.'
input_shape = self.input_spec[0].shape
if not input_shape[0]:
raise ValueError('If a RNN is stateful, a complete '
'input_shape must be provided '
'(including batch size).')
if hasattr(self, 'states'):
K.set_value(self.states[0],
np.zeros((input_shape[0], self.units)))
else:
self.states = [K.zeros((input_shape[0], self.units))]
def preprocess_input(self, inputs, training=None):
#self.ha = K.dot(self.h, self.Ua) #time_distributed_dense(self.h, self.Ua)
return inputs
def step(self, inputs, states):
h_tm1 = states[0] # previous memory
#B_U = states[1] # dropout matrices for recurrent units
#B_W = states[2]
h_tm1a = K.dot(h_tm1, self.Wa)
eij = K.dot(K.tanh(h_tm1a + K.dot(inputs[:, :self.h_dim], self.Ua)), self.Va)
eijs = K.repeat_elements(eij, self.h_dim, axis=1)
#alphaij = K.softmax(eijs) # batchsize * lenh h batchsize * lenh * ndim
#ci = K.permute_dimensions(K.permute_dimensions(self.h, [2,0,1]) * alphaij, [1,2,0])
#cisum = K.sum(ci, axis=1)
cisum = eijs*inputs[:, :self.h_dim]
#print(K.shape(cisum), cisum.shape, ci.shape, self.h.shape, alphaij.shape, x.shape)
zr = K.sigmoid(K.dot(inputs[:, self.h_dim:], self.Wzr) + K.dot(h_tm1, self.Uzr) + K.dot(cisum, self.Czr))
zi = zr[:, :self.units]
ri = zr[:, self.units: 2 * self.units]
si_ = K.tanh(K.dot(inputs[:, self.h_dim:], self.W) + K.dot(ri*h_tm1, self.U) + K.dot(cisum, self.C))
si = (1-zi) * h_tm1 + zi * si_
return si, [si] #h_tm1, [h_tm1]
def get_constants(self, inputs, training=None):
constants = []
constants.append([K.cast_to_floatx(1.) for _ in range(3)])
return constants
def get_config(self):
config = {'units': self.units,
'kernel_initializer': initializers.serialize(self.kernel_initializer),
'recurrent_initializer': initializers.serialize(self.recurrent_initializer)}
base_config = super(SimpleAttention2, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
# --- restraintlib/lib/PO4_terminal_C3.py (repo: mkowiel/restraintlib, license: BSD-3-Clause) ---
PO4_3_TERMINAL_PDB_CODES = ['A', 'C', 'G', 'T', 'U', 'DA', 'DC', 'DG', 'DT', 'DU', 'IC', 'IG']
PO4_3_TERMINAL_ATOM_NAMES = {
'P': 'P',
'OP1': 'OP1',
'O1P': 'OP1',
'OP2': 'OP2',
'O2P': 'OP2',
"OP3": "OP3",
"O3P": "OP3",
"O3'": "O3'",
"O3*": "O3'",
"C3'": "C3'",
"C3*": "C3'",
}
PO4_3_TERMINAL_ATOM_RES = {
'P': 0,
'OP1': 0,
'OP2': 0,
"OP3": 0,
"O3'": -1,
"C3'": -1,
}
PO4_3_TERMINAL_REQUIRED_CONDITION = {
("P", "O3'", 2.0, 0, -1),
("P", "OP1", 2.0, 0, 0),
("P", "OP2", 2.0, 0, 0),
("P", "OP3", 2.0, 0, 0),
("C3'", "O3'", 2.0, -1, -1),
}
PO4_3_TERMINAL_DISTANCE_MEASURE = {
'measure': 'euclidean_angles',
'restraint_names': ['aO1O2', 'aO1O3', 'aO1O5', 'aO2O3', 'aO2O5', 'aO3O5']
}
PO4_3_TERMINAL_CONDITION_DISTANCE_MEASURE = {
'measure': 'euclidean_angles',
'restraint_names': ['tC3O3P4O5', 'tC5O5P4O3']
}
PO4_3_TERMINAL_RESTRAINTS = [
{
"conditions": [],
"name": "PO4_Terminal==C3_0",
"restraints": [
["angle", "aO1O2", ["OP1", "P", "OP2"], 111.7, 1.0],
["angle", "aO1O3", ["OP1", "P", "OP3"], 114.0, 0.7],
["angle", "aO1O5", ["OP1", "P", "O3'"], 107.5, 0.7],
["angle", "aO2O3", ["OP2", "P", "OP3"], 112.8, 1.0],
["angle", "aO2O5", ["OP2", "P", "O3'"], 107.2, 0.8],
["angle", "aO3O5", ["OP3", "P", "O3'"], 102.8, 1.2],
["angle", "aP4O5C5", ["P", "O3'", "C3'"], 119.0, 2.2],
["dist", "dO1P4", ["OP1", "P"], 1.514, 0.009],
["dist", "dO2P4", ["OP2", "P"], 1.52, 0.009],
["dist", "dO3P4", ["OP3", "P"], 1.514, 0.01],
["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009]
]
},
{
"conditions": [],
"name": "PO4_Terminal==C3_1",
"restraints": [
["angle", "aO1O2", ["OP1", "P", "OP2"], 114.0, 0.7],
["angle", "aO1O3", ["OP1", "P", "OP3"], 111.7, 1.0],
["angle", "aO1O5", ["OP1", "P", "O3'"], 107.5, 0.7],
["angle", "aO2O3", ["OP2", "P", "OP3"], 112.8, 1.0],
["angle", "aO2O5", ["OP2", "P", "O3'"], 102.8, 1.2],
["angle", "aO3O5", ["OP3", "P", "O3'"], 107.2, 0.8],
["angle", "aP4O5C5", ["P", "O3'", "C3'"], 119.0, 2.2],
["dist", "dO1P4", ["OP1", "P"], 1.514, 0.009],
["dist", "dO2P4", ["OP2", "P"], 1.514, 0.01],
["dist", "dO3P4", ["OP3", "P"], 1.52, 0.009],
["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009]
]
},
{
"conditions": [],
"name": "PO4_Terminal==C3_2",
"restraints": [
["angle", "aO1O2", ["OP1", "P", "OP2"], 111.7, 1.0],
["angle", "aO1O3", ["OP1", "P", "OP3"], 112.8, 1.0],
["angle", "aO1O5", ["OP1", "P", "O3'"], 107.2, 0.8],
["angle", "aO2O3", ["OP2", "P", "OP3"], 114.0, 0.7],
["angle", "aO2O5", ["OP2", "P", "O3'"], 107.5, 0.7],
["angle", "aO3O5", ["OP3", "P", "O3'"], 102.8, 1.2],
["angle", "aP4O5C5", ["P", "O3'", "C3'"], 119.0, 2.2],
["dist", "dO1P4", ["OP1", "P"], 1.52, 0.009],
["dist", "dO2P4", ["OP2", "P"], 1.514, 0.009],
["dist", "dO3P4", ["OP3", "P"], 1.514, 0.01],
["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009]
]
},
{
"conditions": [],
"name": "PO4_Terminal==C3_3",
"restraints": [
["angle", "aO1O2", ["OP1", "P", "OP2"], 114.0, 0.7],
["angle", "aO1O3", ["OP1", "P", "OP3"], 112.8, 1.0],
["angle", "aO1O5", ["OP1", "P", "O3'"], 102.8, 1.2],
["angle", "aO2O3", ["OP2", "P", "OP3"], 111.7, 1.0],
["angle", "aO2O5", ["OP2", "P", "O3'"], 107.5, 0.7],
["angle", "aO3O5", ["OP3", "P", "O3'"], 107.2, 0.8],
["angle", "aP4O5C5", ["P", "O3'", "C3'"], 119.0, 2.2],
["dist", "dO1P4", ["OP1", "P"], 1.514, 0.01],
["dist", "dO2P4", ["OP2", "P"], 1.514, 0.009],
["dist", "dO3P4", ["OP3", "P"], 1.52, 0.009],
["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009]
]
},
{
"conditions": [],
"name": "PO4_Terminal==C3_4",
"restraints": [
["angle", "aO1O2", ["OP1", "P", "OP2"], 112.8, 1.0],
["angle", "aO1O3", ["OP1", "P", "OP3"], 111.7, 1.0],
["angle", "aO1O5", ["OP1", "P", "O3'"], 107.2, 0.8],
["angle", "aO2O3", ["OP2", "P", "OP3"], 114.0, 0.7],
["angle", "aO2O5", ["OP2", "P", "O3'"], 102.8, 1.2],
["angle", "aO3O5", ["OP3", "P", "O3'"], 107.5, 0.7],
["angle", "aP4O5C5", ["P", "O3'", "C3'"], 119.0, 2.2],
["dist", "dO1P4", ["OP1", "P"], 1.52, 0.009],
["dist", "dO2P4", ["OP2", "P"], 1.514, 0.01],
["dist", "dO3P4", ["OP3", "P"], 1.514, 0.009],
["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009]
]
},
{
"conditions": [],
"name": "PO4_Terminal=C3_5",
"restraints": [
["angle", "aO1O2", ["OP1", "P", "OP2"], 112.8, 1.0],
["angle", "aO1O3", ["OP1", "P", "OP3"], 114.0, 0.7],
["angle", "aO1O5", ["OP1", "P", "O3'"], 102.8, 1.2],
["angle", "aO2O3", ["OP2", "P", "OP3"], 111.7, 1.0],
["angle", "aO2O5", ["OP2", "P", "O3'"], 107.2, 0.8],
["angle", "aO3O5", ["OP3", "P", "O3'"], 107.5, 0.7],
["angle", "aP4O5C5", ["P", "O3'", "C3'"], 119.0, 2.2],
["dist", "dO1P4", ["OP1", "P"], 1.514, 0.01],
["dist", "dO2P4", ["OP2", "P"], 1.52, 0.009],
["dist", "dO3P4", ["OP3", "P"], 1.514, 0.009],
["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009]
]
}
]
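These tables are meant to be consumed together: raw PDB atom names (including legacy spellings such as `O3*` and `O1P`) are first normalized through `PO4_3_TERMINAL_ATOM_NAMES`, and the canonical pair is then matched against the `restraints` entries. A small standalone sketch of that lookup; it re-declares a two-entry subset of the tables above purely so the example is self-contained:

```python
# Subset of the mappings above, repeated here for a self-contained example.
ATOM_NAMES = {"O3*": "O3'", "O3'": "O3'", "O1P": "OP1", "OP1": "OP1", "P": "P"}
RESTRAINTS = [
    ["dist", "dO1P4", ["OP1", "P"], 1.514, 0.009],
    ["dist", "dO5P4", ["O3'", "P"], 1.622, 0.009],
]

def find_distance(atom_a, atom_b):
    """Return (target distance, sigma) for a normalized atom pair, or None."""
    pair = {ATOM_NAMES[atom_a], ATOM_NAMES[atom_b]}
    for kind, name, atoms, value, sigma in RESTRAINTS:
        if kind == "dist" and set(atoms) == pair:
            return value, sigma
    return None

dist = find_distance("O3*", "P")   # legacy name O3* resolves to O3'
```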
# --- hermes_fix/message_lib/FIX_4_1/fix_messages.py (repo: yabov/hermes_fix, license: Apache-2.0) ---
from ... import fix_message
from . import fields
from . import field_types
BEGINSTRING = 'FIX.4.1'
MESSAGE_TYPES = {}
class Header(fix_message.MessageBase):
def __init__(self):
super().__init__()
register_StandardHeader_component(self)
class Trailer(fix_message.MessageBase):
def __init__(self):
super().__init__()
register_StandardTrailer_component(self)
##############Begin Repeating Groups###############
class NoIOIQualifiersGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.IOIQualifier, False)
class NoRelatedSymGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.RelatdSym, False)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
class LinesOfTextGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.Text, True)
class NoOrdersGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.ClOrdID, False)
self.register_field(fields.OrderID, False)
self.register_field(fields.SecondaryOrderID, False)
self.register_field(fields.ListID, False)
self.register_field(fields.WaveNo, False)
class NoExecsGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.LastShares, False)
self.register_field(fields.ExecID, False)
self.register_field(fields.LastPx, False)
self.register_field(fields.LastCapacity, False)
class NoAllocsGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.AllocAccount, False)
self.register_field(fields.AllocShares, True)
self.register_field(fields.ProcessCode, False)
self.register_field(fields.BrokerOfCredit, False)
self.register_field(fields.NotifyBrokerOfCredit, False)
self.register_field(fields.AllocHandlInst, False)
self.register_field(fields.AllocText, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.ClientID, False)
self.register_field(fields.Commission, False)
self.register_field(fields.CommType, False)
self.register_field(fields.AllocAvgPx, False)
self.register_field(fields.AllocNetMoney, False)
self.register_field(fields.SettlCurrAmt, False)
self.register_field(fields.SettlCurrency, False)
self.register_field(fields.SettlCurrFxRate, False)
self.register_field(fields.SettlCurrFxRateCalc, False)
self.register_field(fields.AccruedInterestAmt, False)
self.register_field(fields.SettlInstMode, False)
self.register_group(fields.NoMiscFees, NoMiscFeesGroup, False)
class NoMiscFeesGroup(fix_message.FIXGroup):
def __init__(self, value = None):
super().__init__(value)
self.register_field(fields.MiscFeeAmt, False)
self.register_field(fields.MiscFeeCurr, False)
self.register_field(fields.MiscFeeType, False)
##############End Repeating Groups###############
##############Begin Componenets###############
def register_StandardHeader_component(self):
self.register_field(fields.BeginString, True)
self.register_field(fields.BodyLength, True)
self.register_field(fields.MsgType, True)
self.register_field(fields.SenderCompID, True)
self.register_field(fields.TargetCompID, True)
self.register_field(fields.OnBehalfOfCompID, False)
self.register_field(fields.DeliverToCompID, False)
self.register_field(fields.SecureDataLen, False)
self.register_field(fields.SecureData, False)
self.register_field(fields.MsgSeqNum, True)
self.register_field(fields.SenderSubID, False)
self.register_field(fields.SenderLocationID, False)
self.register_field(fields.TargetSubID, False)
self.register_field(fields.TargetLocationID, False)
self.register_field(fields.OnBehalfOfSubID, False)
self.register_field(fields.OnBehalfOfLocationID, False)
self.register_field(fields.DeliverToSubID, False)
self.register_field(fields.DeliverToLocationID, False)
self.register_field(fields.PossDupFlag, False)
self.register_field(fields.PossResend, False)
self.register_field(fields.SendingTime, True)
self.register_field(fields.OrigSendingTime, False)
def register_StandardTrailer_component(self):
self.register_field(fields.SignatureLength, False)
self.register_field(fields.Signature, False)
self.register_field(fields.CheckSum, True)
##############End Componenets###############
class Heartbeat(fix_message.MessageBase):
_msgtype = '0'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.TestReqID, False)
MESSAGE_TYPES['0'] = Heartbeat
class TestRequest(fix_message.MessageBase):
_msgtype = '1'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.TestReqID, True)
MESSAGE_TYPES['1'] = TestRequest
class ResendRequest(fix_message.MessageBase):
_msgtype = '2'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.BeginSeqNo, True)
self.register_field(fields.EndSeqNo, True)
MESSAGE_TYPES['2'] = ResendRequest
class Reject(fix_message.MessageBase):
_msgtype = '3'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.RefSeqNum, True)
self.register_field(fields.Text, False)
MESSAGE_TYPES['3'] = Reject
class SequenceReset(fix_message.MessageBase):
_msgtype = '4'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.GapFillFlag, False)
self.register_field(fields.NewSeqNo, True)
MESSAGE_TYPES['4'] = SequenceReset
class Logout(fix_message.MessageBase):
_msgtype = '5'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.Text, False)
MESSAGE_TYPES['5'] = Logout
class IOI(fix_message.MessageBase):
_msgtype = '6'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.IOIid, True)
self.register_field(fields.IOITransType, True)
self.register_field(fields.IOIRefID, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Side, True)
self.register_field(fields.IOIShares, True)
self.register_field(fields.Price, False)
self.register_field(fields.Currency, False)
self.register_field(fields.ValidUntilTime, False)
self.register_field(fields.IOIQltyInd, False)
self.register_field(fields.IOIOthSvc, False)
self.register_field(fields.IOINaturalFlag, False)
self.register_group(fields.NoIOIQualifiers, NoIOIQualifiersGroup, False)
self.register_field(fields.Text, False)
self.register_field(fields.TransactTime, False)
self.register_field(fields.URLLink, False)
MESSAGE_TYPES['6'] = IOI
class Advertisement(fix_message.MessageBase):
_msgtype = '7'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.AdvId, True)
self.register_field(fields.AdvTransType, True)
self.register_field(fields.AdvRefID, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.AdvSide, True)
self.register_field(fields.Shares, True)
self.register_field(fields.Price, False)
self.register_field(fields.Currency, False)
self.register_field(fields.TradeDate, False)
self.register_field(fields.TransactTime, False)
self.register_field(fields.Text, False)
self.register_field(fields.URLLink, False)
self.register_field(fields.LastMkt, False)
MESSAGE_TYPES['7'] = Advertisement
class ExecutionReport(fix_message.MessageBase):
_msgtype = '8'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrderID, True)
self.register_field(fields.SecondaryOrderID, False)
self.register_field(fields.ClOrdID, False)
self.register_field(fields.OrigClOrdID, False)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.ListID, False)
self.register_field(fields.ExecID, True)
self.register_field(fields.ExecTransType, True)
self.register_field(fields.ExecRefID, False)
self.register_field(fields.ExecType, True)
self.register_field(fields.OrdStatus, True)
self.register_field(fields.OrdRejReason, False)
self.register_field(fields.Account, False)
self.register_field(fields.SettlmntTyp, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Side, True)
self.register_field(fields.OrderQty, True)
self.register_field(fields.OrdType, False)
self.register_field(fields.Price, False)
self.register_field(fields.StopPx, False)
self.register_field(fields.PegDifference, False)
self.register_field(fields.Currency, False)
self.register_field(fields.TimeInForce, False)
self.register_field(fields.ExpireTime, False)
self.register_field(fields.ExecInst, False)
self.register_field(fields.Rule80A, False)
self.register_field(fields.LastShares, True)
self.register_field(fields.LastPx, True)
self.register_field(fields.LastSpotRate, False)
self.register_field(fields.LastForwardPoints, False)
self.register_field(fields.LastMkt, False)
self.register_field(fields.LastCapacity, False)
self.register_field(fields.LeavesQty, True)
self.register_field(fields.CumQty, True)
self.register_field(fields.AvgPx, True)
self.register_field(fields.TradeDate, False)
self.register_field(fields.TransactTime, False)
self.register_field(fields.ReportToExch, False)
self.register_field(fields.Commission, False)
self.register_field(fields.CommType, False)
self.register_field(fields.SettlCurrAmt, False)
self.register_field(fields.SettlCurrency, False)
self.register_field(fields.SettlCurrFxRate, False)
self.register_field(fields.SettlCurrFxRateCalc, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['8'] = ExecutionReport
class OrderCancelReject(fix_message.MessageBase):
_msgtype = '9'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrderID, True)
self.register_field(fields.SecondaryOrderID, False)
self.register_field(fields.ClOrdID, True)
self.register_field(fields.OrigClOrdID, True)
self.register_field(fields.OrdStatus, True)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.ListID, False)
self.register_field(fields.CxlRejReason, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['9'] = OrderCancelReject
class Logon(fix_message.MessageBase):
_msgtype = 'A'
_msgcat = 'admin'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.EncryptMethod, True)
self.register_field(fields.HeartBtInt, True)
self.register_field(fields.RawDataLength, False)
self.register_field(fields.RawData, False)
self.register_field(fields.ResetSeqNumFlag, False)
MESSAGE_TYPES['A'] = Logon
class News(fix_message.MessageBase):
_msgtype = 'B'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrigTime, False)
self.register_field(fields.Urgency, False)
self.register_field(fields.Headline, True)
self.register_group(fields.NoRelatedSym, NoRelatedSymGroup, False)
self.register_group(fields.LinesOfText, LinesOfTextGroup, True)
self.register_field(fields.URLLink, False)
self.register_field(fields.RawDataLength, False)
self.register_field(fields.RawData, False)
MESSAGE_TYPES['B'] = News
class Email(fix_message.MessageBase):
_msgtype = 'C'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.EmailThreadID, True)
self.register_field(fields.EmailType, True)
self.register_field(fields.OrigTime, False)
self.register_field(fields.Subject, True)
self.register_group(fields.NoRelatedSym, NoRelatedSymGroup, False)
self.register_field(fields.OrderID, False)
self.register_field(fields.ClOrdID, False)
self.register_group(fields.LinesOfText, LinesOfTextGroup, True)
self.register_field(fields.RawDataLength, False)
self.register_field(fields.RawData, False)
MESSAGE_TYPES['C'] = Email
class OrderSingle(fix_message.MessageBase):
_msgtype = 'D'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ClOrdID, True)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.Account, False)
self.register_field(fields.SettlmntTyp, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.HandlInst, True)
self.register_field(fields.ExecInst, False)
self.register_field(fields.MinQty, False)
self.register_field(fields.MaxFloor, False)
self.register_field(fields.ExDestination, False)
self.register_field(fields.ProcessCode, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.PrevClosePx, False)
self.register_field(fields.Side, True)
self.register_field(fields.LocateReqd, False)
self.register_field(fields.OrderQty, False)
self.register_field(fields.CashOrderQty, False)
self.register_field(fields.OrdType, True)
self.register_field(fields.Price, False)
self.register_field(fields.StopPx, False)
self.register_field(fields.Currency, False)
self.register_field(fields.IOIid, False)
self.register_field(fields.QuoteID, False)
self.register_field(fields.TimeInForce, False)
self.register_field(fields.ExpireTime, False)
self.register_field(fields.Commission, False)
self.register_field(fields.CommType, False)
self.register_field(fields.Rule80A, False)
self.register_field(fields.ForexReq, False)
self.register_field(fields.SettlCurrency, False)
self.register_field(fields.Text, False)
self.register_field(fields.FutSettDate2, False)
self.register_field(fields.OrderQty2, False)
self.register_field(fields.OpenClose, False)
self.register_field(fields.CoveredOrUncovered, False)
self.register_field(fields.CustomerOrFirm, False)
self.register_field(fields.MaxShow, False)
self.register_field(fields.PegDifference, False)
MESSAGE_TYPES['D'] = OrderSingle
class OrderList(fix_message.MessageBase):
_msgtype = 'E'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ListID, True)
self.register_field(fields.WaveNo, False)
self.register_field(fields.ListSeqNo, True)
self.register_field(fields.ListNoOrds, True)
self.register_field(fields.ListExecInst, False)
self.register_field(fields.ClOrdID, True)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.Account, False)
self.register_field(fields.SettlmntTyp, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.HandlInst, True)
self.register_field(fields.ExecInst, False)
self.register_field(fields.MinQty, False)
self.register_field(fields.MaxFloor, False)
self.register_field(fields.ExDestination, False)
self.register_field(fields.ProcessCode, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.PrevClosePx, False)
self.register_field(fields.Side, True)
self.register_field(fields.LocateReqd, False)
self.register_field(fields.OrderQty, True)
self.register_field(fields.OrdType, True)
self.register_field(fields.Price, False)
self.register_field(fields.StopPx, False)
self.register_field(fields.PegDifference, False)
self.register_field(fields.Currency, False)
self.register_field(fields.TimeInForce, False)
self.register_field(fields.ExpireTime, False)
self.register_field(fields.Commission, False)
self.register_field(fields.CommType, False)
self.register_field(fields.Rule80A, False)
self.register_field(fields.ForexReq, False)
self.register_field(fields.SettlCurrency, False)
self.register_field(fields.Text, False)
self.register_field(fields.FutSettDate2, False)
self.register_field(fields.OrderQty2, False)
self.register_field(fields.OpenClose, False)
self.register_field(fields.CoveredOrUncovered, False)
self.register_field(fields.CustomerOrFirm, False)
self.register_field(fields.MaxShow, False)
MESSAGE_TYPES['E'] = OrderList
class OrderCancelRequest(fix_message.MessageBase):
_msgtype = 'F'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrigClOrdID, True)
self.register_field(fields.OrderID, False)
self.register_field(fields.ClOrdID, True)
self.register_field(fields.ListID, False)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Side, True)
self.register_field(fields.OrderQty, False)
self.register_field(fields.CashOrderQty, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['F'] = OrderCancelRequest
class OrderCancelReplaceRequest(fix_message.MessageBase):
_msgtype = 'G'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrderID, False)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.OrigClOrdID, True)
self.register_field(fields.ClOrdID, True)
self.register_field(fields.ListID, False)
self.register_field(fields.Account, False)
self.register_field(fields.SettlmntTyp, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.HandlInst, True)
self.register_field(fields.ExecInst, False)
self.register_field(fields.MinQty, False)
self.register_field(fields.MaxFloor, False)
self.register_field(fields.ExDestination, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Side, True)
self.register_field(fields.OrderQty, False)
self.register_field(fields.CashOrderQty, False)
self.register_field(fields.OrdType, True)
self.register_field(fields.Price, False)
self.register_field(fields.StopPx, False)
self.register_field(fields.PegDifference, False)
self.register_field(fields.Currency, False)
self.register_field(fields.TimeInForce, False)
self.register_field(fields.ExpireTime, False)
self.register_field(fields.Commission, False)
self.register_field(fields.CommType, False)
self.register_field(fields.Rule80A, False)
self.register_field(fields.ForexReq, False)
self.register_field(fields.SettlCurrency, False)
self.register_field(fields.Text, False)
self.register_field(fields.FutSettDate2, False)
self.register_field(fields.OrderQty2, False)
self.register_field(fields.OpenClose, False)
self.register_field(fields.CoveredOrUncovered, False)
self.register_field(fields.CustomerOrFirm, False)
self.register_field(fields.MaxShow, False)
self.register_field(fields.LocateReqd, False)
MESSAGE_TYPES['G'] = OrderCancelReplaceRequest
class OrderStatusRequest(fix_message.MessageBase):
_msgtype = 'H'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrderID, False)
self.register_field(fields.ClOrdID, True)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Side, True)
MESSAGE_TYPES['H'] = OrderStatusRequest
class Allocation(fix_message.MessageBase):
_msgtype = 'J'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.AllocID, True)
self.register_field(fields.AllocTransType, True)
self.register_field(fields.RefAllocID, False)
self.register_field(fields.AllocLinkID, False)
self.register_field(fields.AllocLinkType, False)
self.register_group(fields.NoOrders, NoOrdersGroup, False)
self.register_group(fields.NoExecs, NoExecsGroup, False)
self.register_field(fields.Side, True)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Shares, True)
self.register_field(fields.LastMkt, False)
self.register_field(fields.AvgPx, True)
self.register_field(fields.Currency, False)
self.register_field(fields.AvgPrxPrecision, False)
self.register_field(fields.TradeDate, True)
self.register_field(fields.TransactTime, False)
self.register_field(fields.SettlmntTyp, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.NetMoney, False)
self.register_field(fields.OpenClose, False)
self.register_field(fields.Text, False)
self.register_field(fields.NumDaysInterest, False)
self.register_field(fields.AccruedInterestRate, False)
self.register_group(fields.NoAllocs, NoAllocsGroup, False)
MESSAGE_TYPES['J'] = Allocation
class ListCancelRequest(fix_message.MessageBase):
_msgtype = 'K'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ListID, True)
self.register_field(fields.WaveNo, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['K'] = ListCancelRequest
class ListExecute(fix_message.MessageBase):
_msgtype = 'L'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ListID, True)
self.register_field(fields.WaveNo, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['L'] = ListExecute
class ListStatusRequest(fix_message.MessageBase):
_msgtype = 'M'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ListID, True)
self.register_field(fields.WaveNo, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['M'] = ListStatusRequest
class ListStatus(fix_message.MessageBase):
_msgtype = 'N'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ListID, True)
self.register_field(fields.WaveNo, False)
self.register_field(fields.NoRpts, True)
self.register_field(fields.RptSeq, True)
self.register_group(fields.NoOrders, NoOrdersGroup, True)
MESSAGE_TYPES['N'] = ListStatus
class AllocationInstructionAck(fix_message.MessageBase):
_msgtype = 'P'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.AllocID, True)
self.register_field(fields.TradeDate, True)
self.register_field(fields.TransactTime, False)
self.register_field(fields.AllocStatus, True)
self.register_field(fields.AllocRejCode, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['P'] = AllocationInstructionAck
class DontKnowTrade(fix_message.MessageBase):
_msgtype = 'Q'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.OrderID, False)
self.register_field(fields.ExecID, False)
self.register_field(fields.DKReason, True)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.Side, True)
self.register_field(fields.OrderQty, False)
self.register_field(fields.CashOrderQty, False)
self.register_field(fields.LastShares, False)
self.register_field(fields.LastPx, False)
self.register_field(fields.Text, False)
MESSAGE_TYPES['Q'] = DontKnowTrade
class QuoteRequest(fix_message.MessageBase):
_msgtype = 'R'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.QuoteReqID, True)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.PrevClosePx, False)
self.register_field(fields.Side, False)
self.register_field(fields.OrderQty, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.OrdType, False)
self.register_field(fields.FutSettDate2, False)
self.register_field(fields.OrderQty2, False)
MESSAGE_TYPES['R'] = QuoteRequest
class Quote(fix_message.MessageBase):
_msgtype = 'S'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.QuoteReqID, False)
self.register_field(fields.QuoteID, True)
self.register_field(fields.Symbol, True)
self.register_field(fields.SymbolSfx, False)
self.register_field(fields.SecurityID, False)
self.register_field(fields.IDSource, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.MaturityMonthYear, False)
self.register_field(fields.MaturityDay, False)
self.register_field(fields.PutOrCall, False)
self.register_field(fields.StrikePrice, False)
self.register_field(fields.OptAttribute, False)
self.register_field(fields.SecurityExchange, False)
self.register_field(fields.Issuer, False)
self.register_field(fields.SecurityDesc, False)
self.register_field(fields.BidPx, False)
self.register_field(fields.OfferPx, False)
self.register_field(fields.BidSize, False)
self.register_field(fields.OfferSize, False)
self.register_field(fields.ValidUntilTime, False)
self.register_field(fields.BidSpotRate, False)
self.register_field(fields.OfferSpotRate, False)
self.register_field(fields.BidForwardPoints, False)
self.register_field(fields.OfferForwardPoints, False)
self.register_field(fields.TransactTime, False)
self.register_field(fields.FutSettDate, False)
self.register_field(fields.OrdType, False)
self.register_field(fields.FutSettDate2, False)
self.register_field(fields.OrderQty2, False)
MESSAGE_TYPES['S'] = Quote
class SettlementInstructions(fix_message.MessageBase):
_msgtype = 'T'
_msgcat = 'app'
def __init__(self):
self.Header = Header()
self.Trailer = Trailer()
super().__init__()
self.register_field(fields.SettlInstID, True)
self.register_field(fields.SettlInstTransType, True)
self.register_field(fields.SettlInstMode, True)
self.register_field(fields.SettlInstSource, True)
self.register_field(fields.AllocAccount, True)
self.register_field(fields.SettlLocation, False)
self.register_field(fields.TradeDate, False)
self.register_field(fields.AllocID, False)
self.register_field(fields.LastMkt, False)
self.register_field(fields.Side, False)
self.register_field(fields.SecurityType, False)
self.register_field(fields.EffectiveTime, False)
self.register_field(fields.TransactTime, True)
self.register_field(fields.ClientID, False)
self.register_field(fields.ExecBroker, False)
self.register_field(fields.StandInstDbType, False)
self.register_field(fields.StandInstDbName, False)
self.register_field(fields.StandInstDbID, False)
self.register_field(fields.SettlDeliveryType, False)
self.register_field(fields.SettlDepositoryCode, False)
self.register_field(fields.SettlBrkrCode, False)
self.register_field(fields.SettlInstCode, False)
self.register_field(fields.SecuritySettlAgentName, False)
self.register_field(fields.SecuritySettlAgentCode, False)
self.register_field(fields.SecuritySettlAgentAcctNum, False)
self.register_field(fields.SecuritySettlAgentAcctName, False)
self.register_field(fields.SecuritySettlAgentContactName, False)
self.register_field(fields.SecuritySettlAgentContactPhone, False)
self.register_field(fields.CashSettlAgentName, False)
self.register_field(fields.CashSettlAgentCode, False)
self.register_field(fields.CashSettlAgentAcctNum, False)
self.register_field(fields.CashSettlAgentAcctName, False)
self.register_field(fields.CashSettlAgentContactName, False)
self.register_field(fields.CashSettlAgentContactPhone, False)
MESSAGE_TYPES['T'] = SettlementInstructions
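The `MESSAGE_TYPES` assignments above build a dispatch table from FIX `MsgType` (tag 35) codes to message classes. A self-contained sketch of that registry pattern — using stand-in classes rather than the real `fix_message.MessageBase` subclasses, whose base class is defined elsewhere — looks like this:

```python
# Sketch of the MsgType -> class registry pattern used by this module.
# The classes here are stand-ins; the generated module registers its
# fix_message.MessageBase subclasses in exactly the same way.
MESSAGE_TYPES = {}

class Heartbeat:
    _msgtype = '0'
    _msgcat = 'admin'

class TestRequest:
    _msgtype = '1'
    _msgcat = 'admin'

MESSAGE_TYPES['0'] = Heartbeat
MESSAGE_TYPES['1'] = TestRequest

def message_for(msgtype):
    """Instantiate the registered class for a MsgType code."""
    cls = MESSAGE_TYPES.get(msgtype)
    if cls is None:
        raise ValueError("unknown MsgType: %r" % (msgtype,))
    return cls()

print(type(message_for('0')).__name__)  # Heartbeat
```

A parser can read tag 35 from an incoming message, call a lookup like this, and then populate the instance's registered fields.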
# === DEMONSTRATIONS ===
# All calls in this file draw inverted, filled squares in random colors, are drawn instantly, and use 20 iterations (unless noted otherwise).
# Modular patterns use a divider of 3.
from turtlepatterns import *
import math
import random
t = TurtlePatterns(numSides=4, color=None, bgColor="#ffffff")
# === LINEAR ===
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: x, instant=True) # Linear w/ slope of 1
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: 2*x, instant=True) # Linear w/ slope of 2
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: 0.5*x, instant=True) # Linear w/ slope of 1/2
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: x, instant=True, divider=3) # Linear w/ slope of 1
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: 2*x, instant=True, divider=3) # Linear w/ slope of 2
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: 0.5*x, instant=True, divider=3) # Linear w/ slope of 1/2
# === POLYNOMIAL ===
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: x**2, instant=True) # Squared
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: x**3, instant=True) # Cubed
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: x**4, instant=True) # Fourth Power
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: x**2, instant=True, divider=3) # Squared
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: x**3, instant=True, divider=3) # Cubed
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: x**4, instant=True, divider=3) # Fourth Power
# === EXPONENTIAL ===
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: 2**x, instant=True) # 2^x
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: 3**x, instant=True) # 3^x
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: math.exp(x), instant=True) # e^x
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: 2**x, instant=True, divider=3) # 2^x
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: 3**x, instant=True, divider=3) # 3^x
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: math.exp(x), instant=True, divider=3) # e^x
# === TRIGONOMETRIC ===
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: 50*math.sin(x), instant=True) # Sine
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: 50*math.cos(x), instant=True) # Cosine
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: 50*math.sin(x), instant=True, divider=3) # Sine
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: 50*math.cos(x), instant=True, divider=3) # Cosine
# === MODULAR ===
# t.shapePattern(numIter=200, invert=True, fill=True, func=lambda x: x*(x%2), instant=True) # x mod 2 (divisible by 2)
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: x*(x%6), instant=True) # x mod 6 (divisible by 6)
# t.modularPattern(numIter=200, invert=True, fill=True, func=lambda x: x*(x%2), instant=True, divider=3) # x mod 2 (divisible by 2)
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: x*(x%6), instant=True, divider=3) # x mod 6 (divisible by 6)
# === RANDOM ===
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: random.randint(1,100), instant=True) # Random size from 1 to 100
# t.shapePattern(numIter=20, invert=True, fill=True, func=lambda x: random.randint(1,500), instant=True) # Random size from 1 to 500
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: random.randint(1,100), instant=True, divider=3) # Random size from 1 to 100
# t.modularPattern(numIter=20, invert=True, fill=True, func=lambda x: random.randint(1,500), instant=True, divider=3) # Random size from 1 to 500
turtle.done() | 66.423729 | 145 | 0.713447 | 657 | 3,919 | 4.255708 | 0.118721 | 0.107296 | 0.150215 | 0.193133 | 0.852289 | 0.840844 | 0.822604 | 0.786123 | 0.775393 | 0.752504 | 0 | 0.04664 | 0.119163 | 3,919 | 59 | 146 | 66.423729 | 0.763326 | 0.935188 | 0 | 0 | 0 | 0 | 0.037838 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
e673ef5103ecff4865bca3e0cc33c6c81c72946e | 435,122 | py | Python | doc/eagle.py | lambdafu/schematic-file-converter | db84220de12da07f5fd577020feb5fa20f707e68 | [
"Apache-2.0"
] | 1 | 2019-11-24T15:03:56.000Z | 2019-11-24T15:03:56.000Z | doc/eagle.py | lambdafu/schematic-file-converter | db84220de12da07f5fd577020feb5fa20f707e68 | [
"Apache-2.0"
] | null | null | null | doc/eagle.py | lambdafu/schematic-file-converter | db84220de12da07f5fd577020feb5fa20f707e68 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Generated Wed Nov 9 18:31:45 2011 by generateDS.py version 2.7a.
#
import sys
import getopt
import re as re_
etree_ = None
Verbose_import_ = False
( XMLParser_import_none, XMLParser_import_lxml,
XMLParser_import_elementtree
) = range(3)
XMLParser_import_library = None
try:
# lxml
from lxml import etree as etree_
XMLParser_import_library = XMLParser_import_lxml
if Verbose_import_:
print("running with lxml.etree")
except ImportError:
try:
# cElementTree from Python 2.5+
import xml.etree.cElementTree as etree_
XMLParser_import_library = XMLParser_import_elementtree
if Verbose_import_:
print("running with cElementTree on Python 2.5+")
except ImportError:
try:
# ElementTree from Python 2.5+
import xml.etree.ElementTree as etree_
XMLParser_import_library = XMLParser_import_elementtree
if Verbose_import_:
print("running with ElementTree on Python 2.5+")
except ImportError:
try:
# normal cElementTree install
import cElementTree as etree_
XMLParser_import_library = XMLParser_import_elementtree
if Verbose_import_:
print("running with cElementTree")
except ImportError:
try:
# normal ElementTree install
import elementtree.ElementTree as etree_
XMLParser_import_library = XMLParser_import_elementtree
if Verbose_import_:
print("running with ElementTree")
except ImportError:
raise ImportError("Failed to import ElementTree from any known place")
def parsexml_(*args, **kwargs):
if (XMLParser_import_library == XMLParser_import_lxml and
'parser' not in kwargs):
# Use the lxml ElementTree compatible parser so that, e.g.,
# we ignore comments.
kwargs['parser'] = etree_.ETCompatXMLParser()
doc = etree_.parse(*args, **kwargs)
return doc
#
# User methods
#
# Calls to the methods in these classes are generated by generateDS.py.
# You can replace these methods by re-implementing the following class
# in a module named generatedssuper.py.
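# For instance, a minimal generatedssuper.py override might look like the
# sketch below (hypothetical; any class defined there must supply all of the
# gds_* methods in the fallback class that follows):
#
#     # generatedssuper.py
#     class GeneratedsSuper(object):
#         def gds_format_string(self, input_data, input_name=''):
#             # e.g. normalize internal whitespace before export
#             return ' '.join(input_data.split())
#         # ... remaining gds_format_*/gds_validate_* methods as below ...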
try:
from generatedssuper import GeneratedsSuper
except ImportError, exp:
class GeneratedsSuper(object):
def gds_format_string(self, input_data, input_name=''):
return input_data
def gds_validate_string(self, input_data, node, input_name=''):
return input_data
def gds_format_integer(self, input_data, input_name=''):
return '%d' % input_data
def gds_validate_integer(self, input_data, node, input_name=''):
return input_data
def gds_format_integer_list(self, input_data, input_name=''):
return '%s' % input_data
def gds_validate_integer_list(self, input_data, node, input_name=''):
values = input_data.split()
for value in values:
try:
fvalue = float(value)
except (TypeError, ValueError), exp:
raise_parse_error(node, 'Requires sequence of integers')
return input_data
def gds_format_float(self, input_data, input_name=''):
return '%f' % input_data
def gds_validate_float(self, input_data, node, input_name=''):
return input_data
def gds_format_float_list(self, input_data, input_name=''):
return '%s' % input_data
def gds_validate_float_list(self, input_data, node, input_name=''):
values = input_data.split()
for value in values:
try:
fvalue = float(value)
except (TypeError, ValueError), exp:
raise_parse_error(node, 'Requires sequence of floats')
return input_data
def gds_format_double(self, input_data, input_name=''):
return '%e' % input_data
def gds_validate_double(self, input_data, node, input_name=''):
return input_data
def gds_format_double_list(self, input_data, input_name=''):
return '%s' % input_data
def gds_validate_double_list(self, input_data, node, input_name=''):
values = input_data.split()
for value in values:
try:
fvalue = float(value)
except (TypeError, ValueError), exp:
raise_parse_error(node, 'Requires sequence of doubles')
return input_data
def gds_format_boolean(self, input_data, input_name=''):
return '%s' % input_data
def gds_validate_boolean(self, input_data, node, input_name=''):
return input_data
def gds_format_boolean_list(self, input_data, input_name=''):
return '%s' % input_data
def gds_validate_boolean_list(self, input_data, node, input_name=''):
values = input_data.split()
for value in values:
if value not in ('true', '1', 'false', '0', ):
raise_parse_error(node, 'Requires sequence of booleans ("true", "1", "false", "0")')
return input_data
def gds_str_lower(self, instring):
return instring.lower()
def get_path_(self, node):
path_list = []
self.get_path_list_(node, path_list)
path_list.reverse()
path = '/'.join(path_list)
return path
Tag_strip_pattern_ = re_.compile(r'\{.*\}')
def get_path_list_(self, node, path_list):
if node is None:
return
tag = GeneratedsSuper.Tag_strip_pattern_.sub('', node.tag)
if tag:
path_list.append(tag)
self.get_path_list_(node.getparent(), path_list)
def get_class_obj_(self, node, default_class=None):
class_obj1 = default_class
if 'xsi' in node.nsmap:
classname = node.get('{%s}type' % node.nsmap['xsi'])
if classname is not None:
names = classname.split(':')
if len(names) == 2:
classname = names[1]
class_obj2 = globals().get(classname)
if class_obj2 is not None:
class_obj1 = class_obj2
return class_obj1
def gds_build_any(self, node, type_name=None):
return None
#
# If you have installed IPython you can uncomment and use the following.
# IPython is available from http://ipython.scipy.org/.
#
## from IPython.Shell import IPShellEmbed
## args = ''
## ipshell = IPShellEmbed(args,
## banner = 'Dropping into IPython',
## exit_msg = 'Leaving Interpreter, back to program.')
# Then use the following line where and when you want to drop into the
# IPython shell:
# ipshell('<some message> -- Entering ipshell.\nHit Ctrl-D to exit')
#
# Globals
#
ExternalEncoding = 'ascii'
Tag_pattern_ = re_.compile(r'({.*})?(.*)')
String_cleanup_pat_ = re_.compile(r"[\n\r\s]+")
Namespace_extract_pat_ = re_.compile(r'{(.*)}(.*)')
#
# Support/utility functions.
#
def showIndent(outfile, level):
for idx in range(level):
        outfile.write('    ')
def quote_xml(inStr):
if not inStr:
return ''
s1 = (isinstance(inStr, basestring) and inStr or
'%s' % inStr)
    s1 = s1.replace('&', '&amp;')
    s1 = s1.replace('<', '&lt;')
    s1 = s1.replace('>', '&gt;')
return s1
def quote_attrib(inStr):
s1 = (isinstance(inStr, basestring) and inStr or
'%s' % inStr)
    s1 = s1.replace('&', '&amp;')
    s1 = s1.replace('<', '&lt;')
    s1 = s1.replace('>', '&gt;')
    if '"' in s1:
        if "'" in s1:
            s1 = '"%s"' % s1.replace('"', "&quot;")
else:
s1 = "'%s'" % s1
else:
s1 = '"%s"' % s1
return s1
def quote_python(inStr):
s1 = inStr
if s1.find("'") == -1:
if s1.find('\n') == -1:
return "'%s'" % s1
else:
return "'''%s'''" % s1
else:
if s1.find('"') != -1:
s1 = s1.replace('"', '\\"')
if s1.find('\n') == -1:
return '"%s"' % s1
else:
return '"""%s"""' % s1
def get_all_text_(node):
if node.text is not None:
text = node.text
else:
text = ''
for child in node:
if child.tail is not None:
text += child.tail
return text
def find_attr_value_(attr_name, node):
attrs = node.attrib
attr_parts = attr_name.split(':')
value = None
if len(attr_parts) == 1:
value = attrs.get(attr_name)
elif len(attr_parts) == 2:
prefix, name = attr_parts
namespace = node.nsmap.get(prefix)
if namespace is not None:
value = attrs.get('{%s}%s' % (namespace, name, ))
return value
class GDSParseError(Exception):
pass
def raise_parse_error(node, msg):
if XMLParser_import_library == XMLParser_import_lxml:
msg = '%s (element %s/line %d)' % (msg, node.tag, node.sourceline, )
else:
msg = '%s (element %s)' % (msg, node.tag, )
raise GDSParseError(msg)
class MixedContainer:
# Constants for category:
CategoryNone = 0
CategoryText = 1
CategorySimple = 2
CategoryComplex = 3
# Constants for content_type:
TypeNone = 0
TypeText = 1
TypeString = 2
TypeInteger = 3
TypeFloat = 4
TypeDecimal = 5
TypeDouble = 6
TypeBoolean = 7
def __init__(self, category, content_type, name, value):
self.category = category
self.content_type = content_type
self.name = name
self.value = value
def getCategory(self):
return self.category
def getContenttype(self, content_type):
return self.content_type
def getValue(self):
return self.value
def getName(self):
return self.name
def export(self, outfile, level, name, namespace):
if self.category == MixedContainer.CategoryText:
# Prevent exporting empty content as empty lines.
if self.value.strip():
outfile.write(self.value)
elif self.category == MixedContainer.CategorySimple:
self.exportSimple(outfile, level, name)
else: # category == MixedContainer.CategoryComplex
self.value.export(outfile, level, namespace,name)
def exportSimple(self, outfile, level, name):
if self.content_type == MixedContainer.TypeString:
outfile.write('<%s>%s</%s>' % (self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeInteger or \
self.content_type == MixedContainer.TypeBoolean:
outfile.write('<%s>%d</%s>' % (self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeFloat or \
self.content_type == MixedContainer.TypeDecimal:
outfile.write('<%s>%f</%s>' % (self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeDouble:
outfile.write('<%s>%g</%s>' % (self.name, self.value, self.name))
def exportLiteral(self, outfile, level, name):
if self.category == MixedContainer.CategoryText:
showIndent(outfile, level)
outfile.write('model_.MixedContainer(%d, %d, "%s", "%s"),\n' % \
(self.category, self.content_type, self.name, self.value))
elif self.category == MixedContainer.CategorySimple:
showIndent(outfile, level)
outfile.write('model_.MixedContainer(%d, %d, "%s", "%s"),\n' % \
(self.category, self.content_type, self.name, self.value))
else: # category == MixedContainer.CategoryComplex
showIndent(outfile, level)
outfile.write('model_.MixedContainer(%d, %d, "%s",\n' % \
(self.category, self.content_type, self.name,))
self.value.exportLiteral(outfile, level + 1)
showIndent(outfile, level)
outfile.write(')\n')
class MemberSpec_(object):
def __init__(self, name='', data_type='', container=0):
self.name = name
self.data_type = data_type
self.container = container
def set_name(self, name): self.name = name
def get_name(self): return self.name
def set_data_type(self, data_type): self.data_type = data_type
def get_data_type_chain(self): return self.data_type
def get_data_type(self):
if isinstance(self.data_type, list):
if len(self.data_type) > 0:
return self.data_type[-1]
else:
return 'xs:string'
else:
return self.data_type
def set_container(self, container): self.container = container
def get_container(self): return self.container
def _cast(typ, value):
if typ is None or value is None:
return value
return typ(value)
#
# Data representation classes.
#
class eagle(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, version=None, compatibility=None, drawing=None):
self.version = _cast(None, version)
self.compatibility = compatibility
self.drawing = drawing
def factory(*args_, **kwargs_):
if eagle.subclass:
return eagle.subclass(*args_, **kwargs_)
else:
return eagle(*args_, **kwargs_)
factory = staticmethod(factory)
def get_compatibility(self): return self.compatibility
def set_compatibility(self, compatibility): self.compatibility = compatibility
def get_drawing(self): return self.drawing
def set_drawing(self, drawing): self.drawing = drawing
def get_version(self): return self.version
def set_version(self, version): self.version = version
def export(self, outfile, level, namespace_='t:', name_='eagle', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='eagle')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='eagle'):
if self.version is not None and 'version' not in already_processed:
already_processed.append('version')
outfile.write(' version=%s' % (self.gds_format_string(quote_attrib(self.version).encode(ExternalEncoding), input_name='version'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='eagle', fromsubclass_=False):
if self.compatibility is not None:
self.compatibility.export(outfile, level, namespace_, name_='compatibility')
if self.drawing is not None:
self.drawing.export(outfile, level, namespace_, name_='drawing', )
def hasContent_(self):
if (
self.compatibility is not None or
self.drawing is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='eagle'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.version is not None and 'version' not in already_processed:
already_processed.append('version')
showIndent(outfile, level)
outfile.write('version = "%s",\n' % (self.version,))
def exportLiteralChildren(self, outfile, level, name_):
if self.compatibility is not None:
showIndent(outfile, level)
outfile.write('compatibility=model_.compatibility(\n')
self.compatibility.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.drawing is not None:
showIndent(outfile, level)
outfile.write('drawing=model_.drawing(\n')
self.drawing.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('version', node)
if value is not None and 'version' not in already_processed:
already_processed.append('version')
self.version = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'compatibility':
obj_ = compatibility.factory()
obj_.build(child_)
self.set_compatibility(obj_)
elif nodeName_ == 'drawing':
obj_ = drawing.factory()
obj_.build(child_)
self.set_drawing(obj_)
# end class eagle
class compatibility(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, note=None):
self.note = note
def factory(*args_, **kwargs_):
if compatibility.subclass:
return compatibility.subclass(*args_, **kwargs_)
else:
return compatibility(*args_, **kwargs_)
factory = staticmethod(factory)
def get_note(self): return self.note
def set_note(self, note): self.note = note
def export(self, outfile, level, namespace_='t:', name_='compatibility', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='compatibility')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='compatibility'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='compatibility', fromsubclass_=False):
if self.note is not None:
self.note.export(outfile, level, namespace_, name_='note', )
def hasContent_(self):
if (
self.note is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='compatibility'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.note is not None:
showIndent(outfile, level)
outfile.write('note=model_.note(\n')
self.note.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'note':
obj_ = note.factory()
obj_.build(child_)
self.set_note(obj_)
# end class compatibility
class note(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, version=None, severity=None, valueOf_=None, mixedclass_=None, content_=None):
self.version = _cast(None, version)
self.severity = _cast(None, severity)
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if note.subclass:
return note.subclass(*args_, **kwargs_)
else:
return note(*args_, **kwargs_)
factory = staticmethod(factory)
def get_version(self): return self.version
def set_version(self, version): self.version = version
def get_severity(self): return self.severity
def set_severity(self, severity): self.severity = severity
def get_valueOf_(self): return self.valueOf_
def set_valueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='t:', name_='note', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='note')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='note'):
if self.version is not None and 'version' not in already_processed:
already_processed.append('version')
outfile.write(' version=%s' % (self.gds_format_string(quote_attrib(self.version).encode(ExternalEncoding), input_name='version'), ))
if self.severity is not None and 'severity' not in already_processed:
already_processed.append('severity')
outfile.write(' severity=%s' % (self.gds_format_string(quote_attrib(self.severity).encode(ExternalEncoding), input_name='severity'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='note', fromsubclass_=False):
pass
def hasContent_(self):
if (
self.valueOf_
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='note'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
showIndent(outfile, level)
outfile.write('valueOf_ = """%s""",\n' % (self.valueOf_,))
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.version is not None and 'version' not in already_processed:
already_processed.append('version')
showIndent(outfile, level)
outfile.write('version = "%s",\n' % (self.version,))
if self.severity is not None and 'severity' not in already_processed:
already_processed.append('severity')
showIndent(outfile, level)
outfile.write('severity = "%s",\n' % (self.severity,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
self.valueOf_ = get_all_text_(node)
if node.text is not None:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', node.text)
self.content_.append(obj_)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('version', node)
if value is not None and 'version' not in already_processed:
already_processed.append('version')
self.version = value
value = find_attr_value_('severity', node)
if value is not None and 'severity' not in already_processed:
already_processed.append('severity')
self.severity = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if not fromsubclass_ and child_.tail is not None:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.tail)
self.content_.append(obj_)
pass
# end class note
class drawing(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, settings=None, grid=None, layers=None, library=None, schematic=None, board=None):
self.settings = settings
self.grid = grid
self.layers = layers
self.library = library
self.schematic = schematic
self.board = board
def factory(*args_, **kwargs_):
if drawing.subclass:
return drawing.subclass(*args_, **kwargs_)
else:
return drawing(*args_, **kwargs_)
factory = staticmethod(factory)
def get_settings(self): return self.settings
def set_settings(self, settings): self.settings = settings
def get_grid(self): return self.grid
def set_grid(self, grid): self.grid = grid
def get_layers(self): return self.layers
def set_layers(self, layers): self.layers = layers
def get_library(self): return self.library
def set_library(self, library): self.library = library
def get_schematic(self): return self.schematic
def set_schematic(self, schematic): self.schematic = schematic
def get_board(self): return self.board
def set_board(self, board): self.board = board
def export(self, outfile, level, namespace_='t:', name_='drawing', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='drawing')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='drawing'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='drawing', fromsubclass_=False):
if self.settings is not None:
self.settings.export(outfile, level, namespace_, name_='settings')
if self.grid is not None:
self.grid.export(outfile, level, namespace_, name_='grid')
if self.layers is not None:
self.layers.export(outfile, level, namespace_, name_='layers', )
if self.library is not None:
self.library.export(outfile, level, namespace_, name_='library', )
if self.schematic is not None:
self.schematic.export(outfile, level, namespace_, name_='schematic', )
if self.board is not None:
self.board.export(outfile, level, namespace_, name_='board', )
def hasContent_(self):
if (
self.settings is not None or
self.grid is not None or
self.layers is not None or
self.library is not None or
self.schematic is not None or
self.board is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='drawing'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.settings is not None:
showIndent(outfile, level)
outfile.write('settings=model_.settings(\n')
self.settings.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.grid is not None:
showIndent(outfile, level)
outfile.write('grid=model_.grid(\n')
self.grid.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.layers is not None:
showIndent(outfile, level)
outfile.write('layers=model_.layers(\n')
self.layers.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.library is not None:
showIndent(outfile, level)
outfile.write('library=model_.library(\n')
self.library.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.schematic is not None:
showIndent(outfile, level)
outfile.write('schematic=model_.schematic(\n')
self.schematic.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.board is not None:
showIndent(outfile, level)
outfile.write('board=model_.board(\n')
self.board.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'settings':
obj_ = settings.factory()
obj_.build(child_)
self.set_settings(obj_)
elif nodeName_ == 'grid':
obj_ = grid.factory()
obj_.build(child_)
self.set_grid(obj_)
elif nodeName_ == 'layers':
obj_ = layers.factory()
obj_.build(child_)
self.set_layers(obj_)
elif nodeName_ == 'library':
obj_ = library.factory()
obj_.build(child_)
self.set_library(obj_)
elif nodeName_ == 'schematic':
obj_ = schematic.factory()
obj_.build(child_)
self.set_schematic(obj_)
elif nodeName_ == 'board':
obj_ = board.factory()
obj_.build(child_)
self.set_board(obj_)
# end class drawing
class library(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, description=None, packages=None, symbols=None, devicesets=None):
self.name = _cast(None, name)
self.description = description
self.packages = packages
self.symbols = symbols
self.devicesets = devicesets
def factory(*args_, **kwargs_):
if library.subclass:
return library.subclass(*args_, **kwargs_)
else:
return library(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_packages(self): return self.packages
def set_packages(self, packages): self.packages = packages
def get_symbols(self): return self.symbols
def set_symbols(self, symbols): self.symbols = symbols
def get_devicesets(self): return self.devicesets
def set_devicesets(self, devicesets): self.devicesets = devicesets
def get_name(self): return self.name
def set_name(self, name): self.name = name
def export(self, outfile, level, namespace_='t:', name_='library', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='library')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='library'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='library', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
if self.packages is not None:
self.packages.export(outfile, level, namespace_, name_='packages')
if self.symbols is not None:
self.symbols.export(outfile, level, namespace_, name_='symbols')
if self.devicesets is not None:
self.devicesets.export(outfile, level, namespace_, name_='devicesets')
def hasContent_(self):
if (
self.description is not None or
self.packages is not None or
self.symbols is not None or
self.devicesets is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='library'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.packages is not None:
showIndent(outfile, level)
outfile.write('packages=model_.packages(\n')
self.packages.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.symbols is not None:
showIndent(outfile, level)
outfile.write('symbols=model_.symbols(\n')
self.symbols.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.devicesets is not None:
showIndent(outfile, level)
outfile.write('devicesets=model_.devicesets(\n')
self.devicesets.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'packages':
obj_ = packages.factory()
obj_.build(child_)
self.set_packages(obj_)
elif nodeName_ == 'symbols':
obj_ = symbols.factory()
obj_.build(child_)
self.set_symbols(obj_)
elif nodeName_ == 'devicesets':
obj_ = devicesets.factory()
obj_.build(child_)
self.set_devicesets(obj_)
# end class library
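# The `subclass = None` class attribute plus the `factory()` staticmethod above
# is the standard generateDS extension hook: registering a subclass on the
# generated class makes every internal `factory()` call construct the subclass
# instead. A minimal self-contained sketch of the pattern (the `Base` and
# `CustomLibrary` names are illustrative, not part of this module):
#
#     class Base(object):
#         subclass = None  # extension hook: set to a subclass to override factory()
#
#         def factory(*args_, **kwargs_):
#             # dispatch to the registered subclass when one is installed
#             if Base.subclass:
#                 return Base.subclass(*args_, **kwargs_)
#             else:
#                 return Base(*args_, **kwargs_)
#         factory = staticmethod(factory)
#
#     class CustomLibrary(Base):
#         def greeting(self):
#             return 'custom'
#
#     # before registration, factory() builds the base class
#     assert type(Base.factory()) is Base
#     # after registration, every factory() call builds the subclass
#     Base.subclass = CustomLibrary
#     assert isinstance(Base.factory(), CustomLibrary)

```python
class Base(object):
    subclass = None  # extension hook: set to a subclass to override factory()

    def factory(*args_, **kwargs_):
        # dispatch to the registered subclass when one is installed
        if Base.subclass:
            return Base.subclass(*args_, **kwargs_)
        else:
            return Base(*args_, **kwargs_)
    factory = staticmethod(factory)

class CustomLibrary(Base):
    def greeting(self):
        return 'custom'

# before registration, factory() builds the base class
assert type(Base.factory()) is Base
# after registration, every factory() call builds the subclass
Base.subclass = CustomLibrary
assert isinstance(Base.factory(), CustomLibrary)
```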
class schematic(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, xrefpart=None, xreflabel=None, description=None, libraries=None, attributes=None, variantdefs=None, classes=None, parts=None, sheets=None, errors=None):
self.xrefpart = _cast(None, xrefpart)
self.xreflabel = _cast(None, xreflabel)
self.description = description
self.libraries = libraries
self.attributes = attributes
self.variantdefs = variantdefs
self.classes = classes
self.parts = parts
self.sheets = sheets
self.errors = errors
def factory(*args_, **kwargs_):
if schematic.subclass:
return schematic.subclass(*args_, **kwargs_)
else:
return schematic(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_libraries(self): return self.libraries
def set_libraries(self, libraries): self.libraries = libraries
def get_attributes(self): return self.attributes
def set_attributes(self, attributes): self.attributes = attributes
def get_variantdefs(self): return self.variantdefs
def set_variantdefs(self, variantdefs): self.variantdefs = variantdefs
def get_classes(self): return self.classes
def set_classes(self, classes): self.classes = classes
def get_parts(self): return self.parts
def set_parts(self, parts): self.parts = parts
def get_sheets(self): return self.sheets
def set_sheets(self, sheets): self.sheets = sheets
def get_errors(self): return self.errors
def set_errors(self, errors): self.errors = errors
def get_xrefpart(self): return self.xrefpart
def set_xrefpart(self, xrefpart): self.xrefpart = xrefpart
def get_xreflabel(self): return self.xreflabel
def set_xreflabel(self, xreflabel): self.xreflabel = xreflabel
def export(self, outfile, level, namespace_='t:', name_='schematic', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='schematic')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='schematic'):
if self.xrefpart is not None and 'xrefpart' not in already_processed:
already_processed.append('xrefpart')
outfile.write(' xrefpart=%s' % (self.gds_format_string(quote_attrib(self.xrefpart).encode(ExternalEncoding), input_name='xrefpart'), ))
if self.xreflabel is not None and 'xreflabel' not in already_processed:
already_processed.append('xreflabel')
outfile.write(' xreflabel=%s' % (self.gds_format_string(quote_attrib(self.xreflabel).encode(ExternalEncoding), input_name='xreflabel'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='schematic', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
if self.libraries is not None:
self.libraries.export(outfile, level, namespace_, name_='libraries')
if self.attributes is not None:
self.attributes.export(outfile, level, namespace_, name_='attributes')
if self.variantdefs is not None:
self.variantdefs.export(outfile, level, namespace_, name_='variantdefs')
if self.classes is not None:
self.classes.export(outfile, level, namespace_, name_='classes')
if self.parts is not None:
self.parts.export(outfile, level, namespace_, name_='parts')
if self.sheets is not None:
self.sheets.export(outfile, level, namespace_, name_='sheets')
if self.errors is not None:
self.errors.export(outfile, level, namespace_, name_='errors')
    def hasContent_(self):
        return (
            self.description is not None or
            self.libraries is not None or
            self.attributes is not None or
            self.variantdefs is not None or
            self.classes is not None or
            self.parts is not None or
            self.sheets is not None or
            self.errors is not None
        )
def exportLiteral(self, outfile, level, name_='schematic'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.xrefpart is not None and 'xrefpart' not in already_processed:
already_processed.append('xrefpart')
showIndent(outfile, level)
outfile.write('xrefpart = "%s",\n' % (self.xrefpart,))
if self.xreflabel is not None and 'xreflabel' not in already_processed:
already_processed.append('xreflabel')
showIndent(outfile, level)
outfile.write('xreflabel = "%s",\n' % (self.xreflabel,))
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.libraries is not None:
showIndent(outfile, level)
outfile.write('libraries=model_.libraries(\n')
self.libraries.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.attributes is not None:
showIndent(outfile, level)
outfile.write('attributes=model_.attributes(\n')
self.attributes.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.variantdefs is not None:
showIndent(outfile, level)
outfile.write('variantdefs=model_.variantdefs(\n')
self.variantdefs.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.classes is not None:
showIndent(outfile, level)
outfile.write('classes=model_.classes(\n')
self.classes.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.parts is not None:
showIndent(outfile, level)
outfile.write('parts=model_.parts(\n')
self.parts.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.sheets is not None:
showIndent(outfile, level)
outfile.write('sheets=model_.sheets(\n')
self.sheets.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.errors is not None:
showIndent(outfile, level)
outfile.write('errors=model_.errors(\n')
self.errors.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('xrefpart', node)
if value is not None and 'xrefpart' not in already_processed:
already_processed.append('xrefpart')
self.xrefpart = value
value = find_attr_value_('xreflabel', node)
if value is not None and 'xreflabel' not in already_processed:
already_processed.append('xreflabel')
self.xreflabel = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'libraries':
obj_ = libraries.factory()
obj_.build(child_)
self.set_libraries(obj_)
elif nodeName_ == 'attributes':
obj_ = attributes.factory()
obj_.build(child_)
self.set_attributes(obj_)
elif nodeName_ == 'variantdefs':
obj_ = variantdefs.factory()
obj_.build(child_)
self.set_variantdefs(obj_)
elif nodeName_ == 'classes':
obj_ = classes.factory()
obj_.build(child_)
self.set_classes(obj_)
elif nodeName_ == 'parts':
obj_ = parts.factory()
obj_.build(child_)
self.set_parts(obj_)
elif nodeName_ == 'sheets':
obj_ = sheets.factory()
obj_.build(child_)
self.set_sheets(obj_)
elif nodeName_ == 'errors':
obj_ = errors.factory()
obj_.build(child_)
self.set_errors(obj_)
# end class schematic
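# `build()` derives each child's local element name with
# `Tag_pattern_.match(child.tag).groups()[-1]` and then dispatches on it in
# `buildChildren`. ElementTree reports namespaced tags as '{uri}localname', so
# the regex's job is to split off the '{uri}' prefix. A self-contained sketch,
# assuming `Tag_pattern_` has generateDS's usual shape:

```python
import re
import xml.etree.ElementTree as ET

# optional '{namespace-uri}' group followed by the bare local name;
# groups()[-1] yields the local name used for dispatch (an assumption
# about Tag_pattern_'s definition, which lives elsewhere in this module)
Tag_pattern_ = re.compile(r'({.*})?(.*)')

xml_doc = '<schematic xmlns="http://example.com/eagle"><parts/><sheets/></schematic>'
root = ET.fromstring(xml_doc)

seen = []
for child in root:
    nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
    seen.append(nodeName_)

assert seen == ['parts', 'sheets']
```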
class board(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, description=None, plain=None, libraries=None, attributes=None, variantdefs=None, classes=None, designrules=None, autorouter=None, elements=None, signals=None, errors=None):
self.description = description
self.plain = plain
self.libraries = libraries
self.attributes = attributes
self.variantdefs = variantdefs
self.classes = classes
self.designrules = designrules
self.autorouter = autorouter
self.elements = elements
self.signals = signals
self.errors = errors
def factory(*args_, **kwargs_):
if board.subclass:
return board.subclass(*args_, **kwargs_)
else:
return board(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_plain(self): return self.plain
def set_plain(self, plain): self.plain = plain
def get_libraries(self): return self.libraries
def set_libraries(self, libraries): self.libraries = libraries
def get_attributes(self): return self.attributes
def set_attributes(self, attributes): self.attributes = attributes
def get_variantdefs(self): return self.variantdefs
def set_variantdefs(self, variantdefs): self.variantdefs = variantdefs
def get_classes(self): return self.classes
def set_classes(self, classes): self.classes = classes
def get_designrules(self): return self.designrules
def set_designrules(self, designrules): self.designrules = designrules
def get_autorouter(self): return self.autorouter
def set_autorouter(self, autorouter): self.autorouter = autorouter
def get_elements(self): return self.elements
def set_elements(self, elements): self.elements = elements
def get_signals(self): return self.signals
def set_signals(self, signals): self.signals = signals
def get_errors(self): return self.errors
def set_errors(self, errors): self.errors = errors
def export(self, outfile, level, namespace_='t:', name_='board', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='board')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='board'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='board', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
if self.plain is not None:
self.plain.export(outfile, level, namespace_, name_='plain')
if self.libraries is not None:
self.libraries.export(outfile, level, namespace_, name_='libraries')
if self.attributes is not None:
self.attributes.export(outfile, level, namespace_, name_='attributes')
if self.variantdefs is not None:
self.variantdefs.export(outfile, level, namespace_, name_='variantdefs')
if self.classes is not None:
self.classes.export(outfile, level, namespace_, name_='classes')
if self.designrules is not None:
self.designrules.export(outfile, level, namespace_, name_='designrules')
if self.autorouter is not None:
self.autorouter.export(outfile, level, namespace_, name_='autorouter')
if self.elements is not None:
self.elements.export(outfile, level, namespace_, name_='elements')
if self.signals is not None:
self.signals.export(outfile, level, namespace_, name_='signals')
if self.errors is not None:
self.errors.export(outfile, level, namespace_, name_='errors')
    def hasContent_(self):
        return (
            self.description is not None or
            self.plain is not None or
            self.libraries is not None or
            self.attributes is not None or
            self.variantdefs is not None or
            self.classes is not None or
            self.designrules is not None or
            self.autorouter is not None or
            self.elements is not None or
            self.signals is not None or
            self.errors is not None
        )
def exportLiteral(self, outfile, level, name_='board'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.plain is not None:
showIndent(outfile, level)
outfile.write('plain=model_.plain(\n')
self.plain.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.libraries is not None:
showIndent(outfile, level)
outfile.write('libraries=model_.libraries(\n')
self.libraries.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.attributes is not None:
showIndent(outfile, level)
outfile.write('attributes=model_.attributes(\n')
self.attributes.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.variantdefs is not None:
showIndent(outfile, level)
outfile.write('variantdefs=model_.variantdefs(\n')
self.variantdefs.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.classes is not None:
showIndent(outfile, level)
outfile.write('classes=model_.classes(\n')
self.classes.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.designrules is not None:
showIndent(outfile, level)
outfile.write('designrules=model_.designrules(\n')
self.designrules.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.autorouter is not None:
showIndent(outfile, level)
outfile.write('autorouter=model_.autorouter(\n')
self.autorouter.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.elements is not None:
showIndent(outfile, level)
outfile.write('elements=model_.elements(\n')
self.elements.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.signals is not None:
showIndent(outfile, level)
outfile.write('signals=model_.signals(\n')
self.signals.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.errors is not None:
showIndent(outfile, level)
outfile.write('errors=model_.errors(\n')
self.errors.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'plain':
obj_ = plain.factory()
obj_.build(child_)
self.set_plain(obj_)
elif nodeName_ == 'libraries':
obj_ = libraries.factory()
obj_.build(child_)
self.set_libraries(obj_)
elif nodeName_ == 'attributes':
obj_ = attributes.factory()
obj_.build(child_)
self.set_attributes(obj_)
elif nodeName_ == 'variantdefs':
obj_ = variantdefs.factory()
obj_.build(child_)
self.set_variantdefs(obj_)
elif nodeName_ == 'classes':
obj_ = classes.factory()
obj_.build(child_)
self.set_classes(obj_)
elif nodeName_ == 'designrules':
obj_ = designrules.factory()
obj_.build(child_)
self.set_designrules(obj_)
elif nodeName_ == 'autorouter':
obj_ = autorouter.factory()
obj_.build(child_)
self.set_autorouter(obj_)
elif nodeName_ == 'elements':
obj_ = elements.factory()
obj_.build(child_)
self.set_elements(obj_)
elif nodeName_ == 'signals':
obj_ = signals.factory()
obj_.build(child_)
self.set_signals(obj_)
elif nodeName_ == 'errors':
obj_ = errors.factory()
obj_.build(child_)
self.set_errors(obj_)
# end class board
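# The generated `export()` writes a self-closing tag when `hasContent_()` is
# false and a full open/children/close sequence otherwise, indenting children
# one level deeper. A simplified stand-alone sketch of that shape (not the
# generated code itself; `export_element` and its argument names are
# illustrative):

```python
import io

def export_element(outfile, level, name, children):
    # mirror of the generated export(): indent, open tag, then either a
    # self-closing '/>' or nested children followed by a matching close tag
    indent = '    ' * level
    outfile.write('%s<t:%s' % (indent, name))
    if children:
        outfile.write('>\n')
        for child_name in children:
            export_element(outfile, level + 1, child_name, [])
        outfile.write('%s</t:%s>\n' % (indent, name))
    else:
        outfile.write('/>\n')

buf = io.StringIO()
export_element(buf, 0, 'board', ['elements', 'signals'])
# childless elements come out self-closed, the parent fully tagged
assert buf.getvalue() == '<t:board>\n    <t:elements/>\n    <t:signals/>\n</t:board>\n'
```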
class sheet(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, description=None, plain=None, instances=None, busses=None, nets=None):
self.description = description
self.plain = plain
self.instances = instances
self.busses = busses
self.nets = nets
def factory(*args_, **kwargs_):
if sheet.subclass:
return sheet.subclass(*args_, **kwargs_)
else:
return sheet(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_plain(self): return self.plain
def set_plain(self, plain): self.plain = plain
def get_instances(self): return self.instances
def set_instances(self, instances): self.instances = instances
def get_busses(self): return self.busses
def set_busses(self, busses): self.busses = busses
def get_nets(self): return self.nets
def set_nets(self, nets): self.nets = nets
def export(self, outfile, level, namespace_='t:', name_='sheet', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='sheet')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='sheet'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='sheet', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
if self.plain is not None:
self.plain.export(outfile, level, namespace_, name_='plain')
if self.instances is not None:
self.instances.export(outfile, level, namespace_, name_='instances')
if self.busses is not None:
self.busses.export(outfile, level, namespace_, name_='busses')
if self.nets is not None:
self.nets.export(outfile, level, namespace_, name_='nets')
    def hasContent_(self):
        return (
            self.description is not None or
            self.plain is not None or
            self.instances is not None or
            self.busses is not None or
            self.nets is not None
        )
def exportLiteral(self, outfile, level, name_='sheet'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.plain is not None:
showIndent(outfile, level)
outfile.write('plain=model_.plain(\n')
self.plain.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.instances is not None:
showIndent(outfile, level)
outfile.write('instances=model_.instances(\n')
self.instances.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.busses is not None:
showIndent(outfile, level)
outfile.write('busses=model_.busses(\n')
self.busses.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.nets is not None:
showIndent(outfile, level)
outfile.write('nets=model_.nets(\n')
self.nets.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'plain':
obj_ = plain.factory()
obj_.build(child_)
self.set_plain(obj_)
elif nodeName_ == 'instances':
obj_ = instances.factory()
obj_.build(child_)
self.set_instances(obj_)
elif nodeName_ == 'busses':
obj_ = busses.factory()
obj_.build(child_)
self.set_busses(obj_)
elif nodeName_ == 'nets':
obj_ = nets.factory()
obj_.build(child_)
self.set_nets(obj_)
# end class sheet
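# Unlike `export()`, which emits XML, the `exportLiteral*` methods emit Python
# constructor source ('child=model_.child(...)'), so a parsed tree can be
# snapshotted as code that rebuilds it. A minimal sketch of the text shape one
# child produces (`export_literal_child` is an illustrative helper, not from
# this module; `model_` is the bindings module the emitted code would import):

```python
import io

def export_literal_child(outfile, level, name):
    # emit 'name=model_.name(' ... '),' the way the generated
    # exportLiteralChildren methods do for each present child
    indent = '    ' * level
    outfile.write('%s%s=model_.%s(\n' % (indent, name, name))
    # nested attributes/children would be written here, one level deeper
    outfile.write('%s),\n' % indent)

buf = io.StringIO()
for child in ('instances', 'busses', 'nets'):
    export_literal_child(buf, 1, child)

text = buf.getvalue()
assert 'instances=model_.instances(' in text
assert text.count('),') == 3
```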
class package(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, description=None, polygon=None, wire=None, text=None, circle=None, rectangle=None, frame=None, hole=None, pad=None, smd=None):
self.name = _cast(None, name)
self.description = description
        self.polygon = [] if polygon is None else polygon
        self.wire = [] if wire is None else wire
        self.text = [] if text is None else text
        self.circle = [] if circle is None else circle
        self.rectangle = [] if rectangle is None else rectangle
        self.frame = [] if frame is None else frame
        self.hole = [] if hole is None else hole
        self.pad = [] if pad is None else pad
        self.smd = [] if smd is None else smd
def factory(*args_, **kwargs_):
if package.subclass:
return package.subclass(*args_, **kwargs_)
else:
return package(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_polygon(self): return self.polygon
def set_polygon(self, polygon): self.polygon = polygon
def add_polygon(self, value): self.polygon.append(value)
def insert_polygon(self, index, value): self.polygon[index] = value
def get_wire(self): return self.wire
def set_wire(self, wire): self.wire = wire
def add_wire(self, value): self.wire.append(value)
def insert_wire(self, index, value): self.wire[index] = value
def get_text(self): return self.text
def set_text(self, text): self.text = text
def add_text(self, value): self.text.append(value)
def insert_text(self, index, value): self.text[index] = value
def get_circle(self): return self.circle
def set_circle(self, circle): self.circle = circle
def add_circle(self, value): self.circle.append(value)
def insert_circle(self, index, value): self.circle[index] = value
def get_rectangle(self): return self.rectangle
def set_rectangle(self, rectangle): self.rectangle = rectangle
def add_rectangle(self, value): self.rectangle.append(value)
def insert_rectangle(self, index, value): self.rectangle[index] = value
def get_frame(self): return self.frame
def set_frame(self, frame): self.frame = frame
def add_frame(self, value): self.frame.append(value)
def insert_frame(self, index, value): self.frame[index] = value
def get_hole(self): return self.hole
def set_hole(self, hole): self.hole = hole
def add_hole(self, value): self.hole.append(value)
def insert_hole(self, index, value): self.hole[index] = value
def get_pad(self): return self.pad
def set_pad(self, pad): self.pad = pad
def add_pad(self, value): self.pad.append(value)
def insert_pad(self, index, value): self.pad[index] = value
def get_smd(self): return self.smd
def set_smd(self, smd): self.smd = smd
def add_smd(self, value): self.smd.append(value)
def insert_smd(self, index, value): self.smd[index] = value
def get_name(self): return self.name
def set_name(self, name): self.name = name
def export(self, outfile, level, namespace_='t:', name_='package', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='package')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='package'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='package', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
for polygon_ in self.polygon:
polygon_.export(outfile, level, namespace_, name_='polygon')
for wire_ in self.wire:
wire_.export(outfile, level, namespace_, name_='wire')
for text_ in self.text:
text_.export(outfile, level, namespace_, name_='text')
for circle_ in self.circle:
circle_.export(outfile, level, namespace_, name_='circle')
for rectangle_ in self.rectangle:
rectangle_.export(outfile, level, namespace_, name_='rectangle')
for frame_ in self.frame:
frame_.export(outfile, level, namespace_, name_='frame')
for hole_ in self.hole:
hole_.export(outfile, level, namespace_, name_='hole')
for pad_ in self.pad:
pad_.export(outfile, level, namespace_, name_='pad')
for smd_ in self.smd:
smd_.export(outfile, level, namespace_, name_='smd')
    def hasContent_(self):
        return bool(
            self.description is not None or
            self.polygon or
            self.wire or
            self.text or
            self.circle or
            self.rectangle or
            self.frame or
            self.hole or
            self.pad or
            self.smd
        )
def exportLiteral(self, outfile, level, name_='package'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
showIndent(outfile, level)
outfile.write('polygon=[\n')
level += 1
for polygon_ in self.polygon:
showIndent(outfile, level)
outfile.write('model_.polygon(\n')
polygon_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('wire=[\n')
level += 1
for wire_ in self.wire:
showIndent(outfile, level)
outfile.write('model_.wire(\n')
wire_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('text=[\n')
level += 1
for text_ in self.text:
showIndent(outfile, level)
outfile.write('model_.text(\n')
text_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('circle=[\n')
level += 1
for circle_ in self.circle:
showIndent(outfile, level)
outfile.write('model_.circle(\n')
circle_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('rectangle=[\n')
level += 1
for rectangle_ in self.rectangle:
showIndent(outfile, level)
outfile.write('model_.rectangle(\n')
rectangle_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('frame=[\n')
level += 1
for frame_ in self.frame:
showIndent(outfile, level)
outfile.write('model_.frame(\n')
frame_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('hole=[\n')
level += 1
for hole_ in self.hole:
showIndent(outfile, level)
outfile.write('model_.hole(\n')
hole_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('pad=[\n')
level += 1
for pad_ in self.pad:
showIndent(outfile, level)
outfile.write('model_.pad(\n')
pad_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('smd=[\n')
level += 1
for smd_ in self.smd:
showIndent(outfile, level)
outfile.write('model_.smd(\n')
smd_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'polygon':
obj_ = polygon.factory()
obj_.build(child_)
self.polygon.append(obj_)
elif nodeName_ == 'wire':
obj_ = wire.factory()
obj_.build(child_)
self.wire.append(obj_)
elif nodeName_ == 'text':
obj_ = text.factory()
obj_.build(child_)
self.text.append(obj_)
elif nodeName_ == 'circle':
obj_ = circle.factory()
obj_.build(child_)
self.circle.append(obj_)
elif nodeName_ == 'rectangle':
obj_ = rectangle.factory()
obj_.build(child_)
self.rectangle.append(obj_)
elif nodeName_ == 'frame':
obj_ = frame.factory()
obj_.build(child_)
self.frame.append(obj_)
elif nodeName_ == 'hole':
obj_ = hole.factory()
obj_.build(child_)
self.hole.append(obj_)
elif nodeName_ == 'pad':
obj_ = pad.factory()
obj_.build(child_)
self.pad.append(obj_)
elif nodeName_ == 'smd':
obj_ = smd.factory()
obj_.build(child_)
self.smd.append(obj_)
# end class package
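# For list-valued children the generated API offers get_/set_/add_/insert_
# accessors. Note that `insert_polygon(index, value)` assigns at an existing
# index, i.e. it REPLACES the element there rather than inserting a new one,
# despite its name. A minimal stand-in demonstrating that semantics (the
# `Package` name here is illustrative, distinct from the generated `package`):

```python
class Package(object):
    # minimal stand-in for the generated package class's list handling
    def __init__(self, polygon=None):
        self.polygon = [] if polygon is None else polygon

    def add_polygon(self, value):
        self.polygon.append(value)

    def insert_polygon(self, index, value):
        # assigns at an index: replaces the element there, does not
        # grow the list (matching the generated insert_* methods)
        self.polygon[index] = value

p = Package()
p.add_polygon('p0')
p.add_polygon('p1')
p.insert_polygon(0, 'p0-replacement')
assert p.polygon == ['p0-replacement', 'p1']
```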
class symbol(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, description=None, polygon=None, wire=None, text=None, pin=None, circle=None, rectangle=None, frame=None):
self.name = _cast(None, name)
self.description = description
        self.polygon = [] if polygon is None else polygon
        self.wire = [] if wire is None else wire
        self.text = [] if text is None else text
        self.pin = [] if pin is None else pin
        self.circle = [] if circle is None else circle
        self.rectangle = [] if rectangle is None else rectangle
        self.frame = [] if frame is None else frame
def factory(*args_, **kwargs_):
if symbol.subclass:
return symbol.subclass(*args_, **kwargs_)
else:
return symbol(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_polygon(self): return self.polygon
def set_polygon(self, polygon): self.polygon = polygon
def add_polygon(self, value): self.polygon.append(value)
def insert_polygon(self, index, value): self.polygon[index] = value
def get_wire(self): return self.wire
def set_wire(self, wire): self.wire = wire
def add_wire(self, value): self.wire.append(value)
def insert_wire(self, index, value): self.wire.insert(index, value)
def get_text(self): return self.text
def set_text(self, text): self.text = text
def add_text(self, value): self.text.append(value)
def insert_text(self, index, value): self.text.insert(index, value)
def get_pin(self): return self.pin
def set_pin(self, pin): self.pin = pin
def add_pin(self, value): self.pin.append(value)
def insert_pin(self, index, value): self.pin.insert(index, value)
def get_circle(self): return self.circle
def set_circle(self, circle): self.circle = circle
def add_circle(self, value): self.circle.append(value)
def insert_circle(self, index, value): self.circle.insert(index, value)
def get_rectangle(self): return self.rectangle
def set_rectangle(self, rectangle): self.rectangle = rectangle
def add_rectangle(self, value): self.rectangle.append(value)
def insert_rectangle(self, index, value): self.rectangle.insert(index, value)
def get_frame(self): return self.frame
def set_frame(self, frame): self.frame = frame
def add_frame(self, value): self.frame.append(value)
def insert_frame(self, index, value): self.frame.insert(index, value)
def get_name(self): return self.name
def set_name(self, name): self.name = name
def export(self, outfile, level, namespace_='t:', name_='symbol', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='symbol')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='symbol'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='symbol', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
for polygon_ in self.polygon:
polygon_.export(outfile, level, namespace_, name_='polygon')
for wire_ in self.wire:
wire_.export(outfile, level, namespace_, name_='wire')
for text_ in self.text:
text_.export(outfile, level, namespace_, name_='text')
for pin_ in self.pin:
pin_.export(outfile, level, namespace_, name_='pin')
for circle_ in self.circle:
circle_.export(outfile, level, namespace_, name_='circle')
for rectangle_ in self.rectangle:
rectangle_.export(outfile, level, namespace_, name_='rectangle')
for frame_ in self.frame:
frame_.export(outfile, level, namespace_, name_='frame')
def hasContent_(self):
if (
self.description is not None or
self.polygon or
self.wire or
self.text or
self.pin or
self.circle or
self.rectangle or
self.frame
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='symbol'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
showIndent(outfile, level)
outfile.write('polygon=[\n')
level += 1
for polygon_ in self.polygon:
showIndent(outfile, level)
outfile.write('model_.polygon(\n')
polygon_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('wire=[\n')
level += 1
for wire_ in self.wire:
showIndent(outfile, level)
outfile.write('model_.wire(\n')
wire_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('text=[\n')
level += 1
for text_ in self.text:
showIndent(outfile, level)
outfile.write('model_.text(\n')
text_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('pin=[\n')
level += 1
for pin_ in self.pin:
showIndent(outfile, level)
outfile.write('model_.pin(\n')
pin_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('circle=[\n')
level += 1
for circle_ in self.circle:
showIndent(outfile, level)
outfile.write('model_.circle(\n')
circle_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('rectangle=[\n')
level += 1
for rectangle_ in self.rectangle:
showIndent(outfile, level)
outfile.write('model_.rectangle(\n')
rectangle_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('frame=[\n')
level += 1
for frame_ in self.frame:
showIndent(outfile, level)
outfile.write('model_.frame(\n')
frame_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'polygon':
obj_ = polygon.factory()
obj_.build(child_)
self.polygon.append(obj_)
elif nodeName_ == 'wire':
obj_ = wire.factory()
obj_.build(child_)
self.wire.append(obj_)
elif nodeName_ == 'text':
obj_ = text.factory()
obj_.build(child_)
self.text.append(obj_)
elif nodeName_ == 'pin':
obj_ = pin.factory()
obj_.build(child_)
self.pin.append(obj_)
elif nodeName_ == 'circle':
obj_ = circle.factory()
obj_.build(child_)
self.circle.append(obj_)
elif nodeName_ == 'rectangle':
obj_ = rectangle.factory()
obj_.build(child_)
self.rectangle.append(obj_)
elif nodeName_ == 'frame':
obj_ = frame.factory()
obj_.build(child_)
self.frame.append(obj_)
# end class symbol
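# Usage sketch (hypothetical, standalone): build()/buildChildren() above dispatch
# on each child's tag with the XML namespace stripped, via a regex shaped like
# this module's Tag_pattern_.  `demo_local_tags` is illustration only, not part
# of the generated API.
import re
import xml.etree.ElementTree as _demo_ET

_DEMO_TAG_RE = re.compile(r'({.*})?(.*)')

def demo_local_tags(xml_text):
    # Return the namespace-free tag of each child, as buildChildren() sees it.
    root = _demo_ET.fromstring(xml_text)
    return [_DEMO_TAG_RE.match(child.tag).groups()[-1] for child in root]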
class deviceset(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, uservalue=None, prefix=None, name=None, description=None, gates=None, devices=None):
self.uservalue = _cast(None, uservalue)
self.prefix = _cast(None, prefix)
self.name = _cast(None, name)
self.description = description
self.gates = gates
self.devices = devices
def factory(*args_, **kwargs_):
if deviceset.subclass:
return deviceset.subclass(*args_, **kwargs_)
else:
return deviceset(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_gates(self): return self.gates
def set_gates(self, gates): self.gates = gates
def get_devices(self): return self.devices
def set_devices(self, devices): self.devices = devices
def get_uservalue(self): return self.uservalue
def set_uservalue(self, uservalue): self.uservalue = uservalue
def get_prefix(self): return self.prefix
def set_prefix(self, prefix): self.prefix = prefix
def get_name(self): return self.name
def set_name(self, name): self.name = name
def export(self, outfile, level, namespace_='t:', name_='deviceset', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='deviceset')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='deviceset'):
if self.uservalue is not None and 'uservalue' not in already_processed:
already_processed.append('uservalue')
outfile.write(' uservalue=%s' % (self.gds_format_string(quote_attrib(self.uservalue).encode(ExternalEncoding), input_name='uservalue'), ))
if self.prefix is not None and 'prefix' not in already_processed:
already_processed.append('prefix')
outfile.write(' prefix=%s' % (self.gds_format_string(quote_attrib(self.prefix).encode(ExternalEncoding), input_name='prefix'), ))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='deviceset', fromsubclass_=False):
if self.description is not None:
self.description.export(outfile, level, namespace_, name_='description')
if self.gates is not None:
self.gates.export(outfile, level, namespace_, name_='gates')
if self.devices is not None:
self.devices.export(outfile, level, namespace_, name_='devices')
def hasContent_(self):
if (
self.description is not None or
self.gates is not None or
self.devices is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='deviceset'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.uservalue is not None and 'uservalue' not in already_processed:
already_processed.append('uservalue')
showIndent(outfile, level)
outfile.write('uservalue = "%s",\n' % (self.uservalue,))
if self.prefix is not None and 'prefix' not in already_processed:
already_processed.append('prefix')
showIndent(outfile, level)
outfile.write('prefix = "%s",\n' % (self.prefix,))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
def exportLiteralChildren(self, outfile, level, name_):
if self.description is not None:
showIndent(outfile, level)
outfile.write('description=model_.description(\n')
self.description.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.gates is not None:
showIndent(outfile, level)
outfile.write('gates=model_.gates(\n')
self.gates.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.devices is not None:
showIndent(outfile, level)
outfile.write('devices=model_.devices(\n')
self.devices.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('uservalue', node)
if value is not None and 'uservalue' not in already_processed:
already_processed.append('uservalue')
self.uservalue = value
value = find_attr_value_('prefix', node)
if value is not None and 'prefix' not in already_processed:
already_processed.append('prefix')
self.prefix = value
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.set_description(obj_)
elif nodeName_ == 'gates':
obj_ = gates.factory()
obj_.build(child_)
self.set_gates(obj_)
elif nodeName_ == 'devices':
obj_ = devices.factory()
obj_.build(child_)
self.set_devices(obj_)
# end class deviceset
class device(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, package=None, connects=None, technologies=None):
self.name = _cast(None, name)
self.package = _cast(None, package)
self.connects = connects
self.technologies = technologies
def factory(*args_, **kwargs_):
if device.subclass:
return device.subclass(*args_, **kwargs_)
else:
return device(*args_, **kwargs_)
factory = staticmethod(factory)
def get_connects(self): return self.connects
def set_connects(self, connects): self.connects = connects
def get_technologies(self): return self.technologies
def set_technologies(self, technologies): self.technologies = technologies
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_package(self): return self.package
def set_package(self, package): self.package = package
def export(self, outfile, level, namespace_='t:', name_='device', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='device')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='device'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.package is not None and 'package' not in already_processed:
already_processed.append('package')
outfile.write(' package=%s' % (self.gds_format_string(quote_attrib(self.package).encode(ExternalEncoding), input_name='package'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='device', fromsubclass_=False):
if self.connects is not None:
self.connects.export(outfile, level, namespace_, name_='connects')
if self.technologies is not None:
self.technologies.export(outfile, level, namespace_, name_='technologies')
def hasContent_(self):
if (
self.connects is not None or
self.technologies is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='device'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
if self.package is not None and 'package' not in already_processed:
already_processed.append('package')
showIndent(outfile, level)
outfile.write('package = "%s",\n' % (self.package,))
def exportLiteralChildren(self, outfile, level, name_):
if self.connects is not None:
showIndent(outfile, level)
outfile.write('connects=model_.connects(\n')
self.connects.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
if self.technologies is not None:
showIndent(outfile, level)
outfile.write('technologies=model_.technologies(\n')
self.technologies.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
value = find_attr_value_('package', node)
if value is not None and 'package' not in already_processed:
already_processed.append('package')
self.package = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'connects':
obj_ = connects.factory()
obj_.build(child_)
self.set_connects(obj_)
elif nodeName_ == 'technologies':
obj_ = technologies.factory()
obj_.build(child_)
self.set_technologies(obj_)
# end class device
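# Usage sketch (hypothetical, standalone): exportAttributes() above depends on
# quote_attrib() to emit safely quoted XML attribute values.  The standard
# library's xml.sax.saxutils.quoteattr performs the same kind of escaping;
# `demo_attr` is illustration only, not part of the generated API.
from xml.sax.saxutils import quoteattr as _demo_quoteattr

def demo_attr(name, value):
    # Render one attribute the way exportAttributes() writes ' name="..."'.
    return ' %s=%s' % (name, _demo_quoteattr(value))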
class bus(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, segment=None):
self.name = _cast(None, name)
self.segment = segment
def factory(*args_, **kwargs_):
if bus.subclass:
return bus.subclass(*args_, **kwargs_)
else:
return bus(*args_, **kwargs_)
factory = staticmethod(factory)
def get_segment(self): return self.segment
def set_segment(self, segment): self.segment = segment
def get_name(self): return self.name
def set_name(self, name): self.name = name
def export(self, outfile, level, namespace_='t:', name_='bus', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='bus')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='bus'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='bus', fromsubclass_=False):
if self.segment is not None:
self.segment.export(outfile, level, namespace_, name_='segment')
def hasContent_(self):
if (
self.segment is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='bus'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
def exportLiteralChildren(self, outfile, level, name_):
if self.segment is not None:
showIndent(outfile, level)
outfile.write('segment=model_.segment(\n')
self.segment.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'segment':
obj_ = segment.factory()
obj_.build(child_)
self.set_segment(obj_)
# end class bus
class net(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, classxx=None, segment=None):
self.name = _cast(None, name)
self.classxx = _cast(None, classxx)
self.segment = segment
def factory(*args_, **kwargs_):
if net.subclass:
return net.subclass(*args_, **kwargs_)
else:
return net(*args_, **kwargs_)
factory = staticmethod(factory)
def get_segment(self): return self.segment
def set_segment(self, segment): self.segment = segment
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_class(self): return self.classxx
def set_class(self, classxx): self.classxx = classxx
def export(self, outfile, level, namespace_='t:', name_='net', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='net')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='net'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.classxx is not None and 'classxx' not in already_processed:
already_processed.append('classxx')
outfile.write(' class=%s' % (self.gds_format_string(quote_attrib(self.classxx).encode(ExternalEncoding), input_name='class'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='net', fromsubclass_=False):
if self.segment is not None:
self.segment.export(outfile, level, namespace_, name_='segment')
def hasContent_(self):
if (
self.segment is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='net'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
if self.classxx is not None and 'classxx' not in already_processed:
already_processed.append('classxx')
showIndent(outfile, level)
outfile.write('classxx = "%s",\n' % (self.classxx,))
def exportLiteralChildren(self, outfile, level, name_):
if self.segment is not None:
showIndent(outfile, level)
outfile.write('segment=model_.segment(\n')
self.segment.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
value = find_attr_value_('class', node)
if value is not None and 'class' not in already_processed:
already_processed.append('class')
self.classxx = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'segment':
obj_ = segment.factory()
obj_.build(child_)
self.set_segment(obj_)
# end class net
class segment(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, pinref=None, wire=None, junction=None, label=None):
if pinref is None:
self.pinref = []
else:
self.pinref = pinref
if wire is None:
self.wire = []
else:
self.wire = wire
if junction is None:
self.junction = []
else:
self.junction = junction
if label is None:
self.label = []
else:
self.label = label
def factory(*args_, **kwargs_):
if segment.subclass:
return segment.subclass(*args_, **kwargs_)
else:
return segment(*args_, **kwargs_)
factory = staticmethod(factory)
def get_pinref(self): return self.pinref
def set_pinref(self, pinref): self.pinref = pinref
def add_pinref(self, value): self.pinref.append(value)
def insert_pinref(self, index, value): self.pinref.insert(index, value)
def get_wire(self): return self.wire
def set_wire(self, wire): self.wire = wire
def add_wire(self, value): self.wire.append(value)
def insert_wire(self, index, value): self.wire.insert(index, value)
def get_junction(self): return self.junction
def set_junction(self, junction): self.junction = junction
def add_junction(self, value): self.junction.append(value)
def insert_junction(self, index, value): self.junction.insert(index, value)
def get_label(self): return self.label
def set_label(self, label): self.label = label
def add_label(self, value): self.label.append(value)
def insert_label(self, index, value): self.label.insert(index, value)
def export(self, outfile, level, namespace_='t:', name_='segment', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='segment')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='segment'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='segment', fromsubclass_=False):
for pinref_ in self.pinref:
pinref_.export(outfile, level, namespace_, name_='pinref')
for wire_ in self.wire:
wire_.export(outfile, level, namespace_, name_='wire')
for junction_ in self.junction:
junction_.export(outfile, level, namespace_, name_='junction')
for label_ in self.label:
label_.export(outfile, level, namespace_, name_='label')
def hasContent_(self):
if (
self.pinref or
self.wire or
self.junction or
self.label
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='segment'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
showIndent(outfile, level)
outfile.write('pinref=[\n')
level += 1
for pinref_ in self.pinref:
showIndent(outfile, level)
outfile.write('model_.pinref(\n')
pinref_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('wire=[\n')
level += 1
for wire_ in self.wire:
showIndent(outfile, level)
outfile.write('model_.wire(\n')
wire_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('junction=[\n')
level += 1
for junction_ in self.junction:
showIndent(outfile, level)
outfile.write('model_.junction(\n')
junction_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('label=[\n')
level += 1
for label_ in self.label:
showIndent(outfile, level)
outfile.write('model_.label(\n')
label_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'pinref':
obj_ = pinref.factory()
obj_.build(child_)
self.pinref.append(obj_)
elif nodeName_ == 'wire':
obj_ = wire.factory()
obj_.build(child_)
self.wire.append(obj_)
elif nodeName_ == 'junction':
obj_ = junction.factory()
obj_.build(child_)
self.junction.append(obj_)
elif nodeName_ == 'label':
obj_ = label.factory()
obj_.build(child_)
self.label.append(obj_)
# end class segment
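# Usage sketch (hypothetical, standalone): export() above writes a self-closing
# '/>' when hasContent_() is false, and a full open/close pair with children
# otherwise.  `demo_element` mirrors that branch with plain strings and is not
# part of the generated code.
def demo_element(tag, children=()):
    if children:  # mirrors the hasContent_() test
        inner = ''.join('<%s/>' % c for c in children)
        return '<%s>%s</%s>' % (tag, inner, tag)
    return '<%s/>' % tag  # mirrors the self-closing '/>' branch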
class signal(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, airwireshidden=None, name=None, classxx=None, contactref=None, polygon=None, wire=None, via=None):
self.airwireshidden = _cast(None, airwireshidden)
self.name = _cast(None, name)
self.classxx = _cast(None, classxx)
if contactref is None:
self.contactref = []
else:
self.contactref = contactref
if polygon is None:
self.polygon = []
else:
self.polygon = polygon
if wire is None:
self.wire = []
else:
self.wire = wire
if via is None:
self.via = []
else:
self.via = via
def factory(*args_, **kwargs_):
if signal.subclass:
return signal.subclass(*args_, **kwargs_)
else:
return signal(*args_, **kwargs_)
factory = staticmethod(factory)
def get_contactref(self): return self.contactref
def set_contactref(self, contactref): self.contactref = contactref
def add_contactref(self, value): self.contactref.append(value)
def insert_contactref(self, index, value): self.contactref.insert(index, value)
def get_polygon(self): return self.polygon
def set_polygon(self, polygon): self.polygon = polygon
def add_polygon(self, value): self.polygon.append(value)
def insert_polygon(self, index, value): self.polygon.insert(index, value)
def get_wire(self): return self.wire
def set_wire(self, wire): self.wire = wire
def add_wire(self, value): self.wire.append(value)
def insert_wire(self, index, value): self.wire.insert(index, value)
def get_via(self): return self.via
def set_via(self, via): self.via = via
def add_via(self, value): self.via.append(value)
def insert_via(self, index, value): self.via.insert(index, value)
def get_airwireshidden(self): return self.airwireshidden
def set_airwireshidden(self, airwireshidden): self.airwireshidden = airwireshidden
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_class(self): return self.classxx
def set_class(self, classxx): self.classxx = classxx
def export(self, outfile, level, namespace_='t:', name_='signal', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='signal')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='signal'):
if self.airwireshidden is not None and 'airwireshidden' not in already_processed:
already_processed.append('airwireshidden')
outfile.write(' airwireshidden=%s' % (self.gds_format_string(quote_attrib(self.airwireshidden).encode(ExternalEncoding), input_name='airwireshidden'), ))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.classxx is not None and 'classxx' not in already_processed:
already_processed.append('classxx')
outfile.write(' class=%s' % (self.gds_format_string(quote_attrib(self.classxx).encode(ExternalEncoding), input_name='class'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='signal', fromsubclass_=False):
for contactref_ in self.contactref:
contactref_.export(outfile, level, namespace_, name_='contactref')
for polygon_ in self.polygon:
polygon_.export(outfile, level, namespace_, name_='polygon')
for wire_ in self.wire:
wire_.export(outfile, level, namespace_, name_='wire')
for via_ in self.via:
via_.export(outfile, level, namespace_, name_='via')
    def hasContent_(self):
        return bool(self.contactref or self.polygon or self.wire or self.via)
def exportLiteral(self, outfile, level, name_='signal'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.airwireshidden is not None and 'airwireshidden' not in already_processed:
already_processed.append('airwireshidden')
showIndent(outfile, level)
outfile.write('airwireshidden = "%s",\n' % (self.airwireshidden,))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
if self.classxx is not None and 'classxx' not in already_processed:
already_processed.append('classxx')
showIndent(outfile, level)
outfile.write('classxx = "%s",\n' % (self.classxx,))
def exportLiteralChildren(self, outfile, level, name_):
showIndent(outfile, level)
outfile.write('contactref=[\n')
level += 1
for contactref_ in self.contactref:
showIndent(outfile, level)
outfile.write('model_.contactref(\n')
contactref_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('polygon=[\n')
level += 1
for polygon_ in self.polygon:
showIndent(outfile, level)
outfile.write('model_.polygon(\n')
polygon_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('wire=[\n')
level += 1
for wire_ in self.wire:
showIndent(outfile, level)
outfile.write('model_.wire(\n')
wire_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('via=[\n')
level += 1
for via_ in self.via:
showIndent(outfile, level)
outfile.write('model_.via(\n')
via_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('airwireshidden', node)
if value is not None and 'airwireshidden' not in already_processed:
already_processed.append('airwireshidden')
self.airwireshidden = value
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
value = find_attr_value_('class', node)
if value is not None and 'class' not in already_processed:
already_processed.append('class')
self.classxx = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'contactref':
obj_ = contactref.factory()
obj_.build(child_)
self.contactref.append(obj_)
elif nodeName_ == 'polygon':
obj_ = polygon.factory()
obj_.build(child_)
self.polygon.append(obj_)
elif nodeName_ == 'wire':
obj_ = wire.factory()
obj_.build(child_)
self.wire.append(obj_)
elif nodeName_ == 'via':
obj_ = via.factory()
obj_.build(child_)
self.via.append(obj_)
# end class signal
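# Usage sketch (illustrative only; relies on the parse helpers such as
# Tag_pattern_ and find_attr_value_ defined elsewhere in this module, and the
# XML literal below is a hypothetical example). buildChildren dispatches on the
# child tag name, so a <signal> node picks up its <wire>/<via>/<contactref>
# children automatically:
#
#     from xml.etree import ElementTree as etree
#     node = etree.fromstring(
#         '<signal name="GND">'
#         '<wire x1="0" y1="0" x2="5" y2="0" width="0.6" layer="1"/>'
#         '</signal>')
#     s = signal.factory()
#     s.build(node)
#     # s.get_name() -> 'GND'; s.wire now holds one wire instance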
class variantdef(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None):
self.name = _cast(None, name)
def factory(*args_, **kwargs_):
if variantdef.subclass:
return variantdef.subclass(*args_, **kwargs_)
else:
return variantdef(*args_, **kwargs_)
factory = staticmethod(factory)
def get_name(self): return self.name
def set_name(self, name): self.name = name
def export(self, outfile, level, namespace_='t:', name_='variantdef', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='variantdef')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='variantdef'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='variantdef', fromsubclass_=False):
pass
    def hasContent_(self):
        return False
def exportLiteral(self, outfile, level, name_='variantdef'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class variantdef
class variant(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, value=None, technology=None, name=None, populate=None):
self.value = _cast(None, value)
self.technology = _cast(None, technology)
self.name = _cast(None, name)
self.populate = _cast(None, populate)
def factory(*args_, **kwargs_):
if variant.subclass:
return variant.subclass(*args_, **kwargs_)
else:
return variant(*args_, **kwargs_)
factory = staticmethod(factory)
def get_value(self): return self.value
def set_value(self, value): self.value = value
def get_technology(self): return self.technology
def set_technology(self, technology): self.technology = technology
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_populate(self): return self.populate
def set_populate(self, populate): self.populate = populate
def export(self, outfile, level, namespace_='t:', name_='variant', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='variant')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='variant'):
if self.value is not None and 'value' not in already_processed:
already_processed.append('value')
outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
if self.technology is not None and 'technology' not in already_processed:
already_processed.append('technology')
outfile.write(' technology=%s' % (self.gds_format_string(quote_attrib(self.technology).encode(ExternalEncoding), input_name='technology'), ))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.populate is not None and 'populate' not in already_processed:
already_processed.append('populate')
outfile.write(' populate=%s' % (self.gds_format_string(quote_attrib(self.populate).encode(ExternalEncoding), input_name='populate'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='variant', fromsubclass_=False):
pass
    def hasContent_(self):
        return False
def exportLiteral(self, outfile, level, name_='variant'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.value is not None and 'value' not in already_processed:
already_processed.append('value')
showIndent(outfile, level)
outfile.write('value = "%s",\n' % (self.value,))
if self.technology is not None and 'technology' not in already_processed:
already_processed.append('technology')
showIndent(outfile, level)
outfile.write('technology = "%s",\n' % (self.technology,))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
if self.populate is not None and 'populate' not in already_processed:
already_processed.append('populate')
showIndent(outfile, level)
outfile.write('populate = "%s",\n' % (self.populate,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('value', node)
if value is not None and 'value' not in already_processed:
already_processed.append('value')
self.value = value
value = find_attr_value_('technology', node)
if value is not None and 'technology' not in already_processed:
already_processed.append('technology')
self.technology = value
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
value = find_attr_value_('populate', node)
if value is not None and 'populate' not in already_processed:
already_processed.append('populate')
self.populate = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class variant
class gate(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, symbol=None, swaplevel=None, addlevel=None, y=None, x=None):
self.name = _cast(None, name)
self.symbol = _cast(None, symbol)
self.swaplevel = _cast(None, swaplevel)
self.addlevel = _cast(None, addlevel)
self.y = _cast(None, y)
self.x = _cast(None, x)
def factory(*args_, **kwargs_):
if gate.subclass:
return gate.subclass(*args_, **kwargs_)
else:
return gate(*args_, **kwargs_)
factory = staticmethod(factory)
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_symbol(self): return self.symbol
def set_symbol(self, symbol): self.symbol = symbol
def get_swaplevel(self): return self.swaplevel
def set_swaplevel(self, swaplevel): self.swaplevel = swaplevel
def get_addlevel(self): return self.addlevel
def set_addlevel(self, addlevel): self.addlevel = addlevel
def get_y(self): return self.y
def set_y(self, y): self.y = y
def get_x(self): return self.x
def set_x(self, x): self.x = x
def export(self, outfile, level, namespace_='t:', name_='gate', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='gate')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='gate'):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.symbol is not None and 'symbol' not in already_processed:
already_processed.append('symbol')
outfile.write(' symbol=%s' % (self.gds_format_string(quote_attrib(self.symbol).encode(ExternalEncoding), input_name='symbol'), ))
if self.swaplevel is not None and 'swaplevel' not in already_processed:
already_processed.append('swaplevel')
outfile.write(' swaplevel=%s' % (self.gds_format_string(quote_attrib(self.swaplevel).encode(ExternalEncoding), input_name='swaplevel'), ))
if self.addlevel is not None and 'addlevel' not in already_processed:
already_processed.append('addlevel')
outfile.write(' addlevel=%s' % (self.gds_format_string(quote_attrib(self.addlevel).encode(ExternalEncoding), input_name='addlevel'), ))
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='gate', fromsubclass_=False):
pass
    def hasContent_(self):
        return False
def exportLiteral(self, outfile, level, name_='gate'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
if self.symbol is not None and 'symbol' not in already_processed:
already_processed.append('symbol')
showIndent(outfile, level)
outfile.write('symbol = "%s",\n' % (self.symbol,))
if self.swaplevel is not None and 'swaplevel' not in already_processed:
already_processed.append('swaplevel')
showIndent(outfile, level)
outfile.write('swaplevel = "%s",\n' % (self.swaplevel,))
if self.addlevel is not None and 'addlevel' not in already_processed:
already_processed.append('addlevel')
showIndent(outfile, level)
outfile.write('addlevel = "%s",\n' % (self.addlevel,))
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
showIndent(outfile, level)
outfile.write('y = "%s",\n' % (self.y,))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
showIndent(outfile, level)
outfile.write('x = "%s",\n' % (self.x,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
value = find_attr_value_('symbol', node)
if value is not None and 'symbol' not in already_processed:
already_processed.append('symbol')
self.symbol = value
value = find_attr_value_('swaplevel', node)
if value is not None and 'swaplevel' not in already_processed:
already_processed.append('swaplevel')
self.swaplevel = value
value = find_attr_value_('addlevel', node)
if value is not None and 'addlevel' not in already_processed:
already_processed.append('addlevel')
self.addlevel = value
value = find_attr_value_('y', node)
if value is not None and 'y' not in already_processed:
already_processed.append('y')
self.y = value
value = find_attr_value_('x', node)
if value is not None and 'x' not in already_processed:
already_processed.append('x')
self.x = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class gate
class wire(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, layer=None, y2=None, width=None, cap=None, curve=None, style=None, x2=None, extent=None, y1=None, x1=None):
self.layer = _cast(None, layer)
self.y2 = _cast(None, y2)
self.width = _cast(None, width)
self.cap = _cast(None, cap)
self.curve = _cast(None, curve)
self.style = _cast(None, style)
self.x2 = _cast(None, x2)
self.extent = _cast(None, extent)
self.y1 = _cast(None, y1)
self.x1 = _cast(None, x1)
def factory(*args_, **kwargs_):
if wire.subclass:
return wire.subclass(*args_, **kwargs_)
else:
return wire(*args_, **kwargs_)
factory = staticmethod(factory)
def get_layer(self): return self.layer
def set_layer(self, layer): self.layer = layer
def get_y2(self): return self.y2
def set_y2(self, y2): self.y2 = y2
def get_width(self): return self.width
def set_width(self, width): self.width = width
def get_cap(self): return self.cap
def set_cap(self, cap): self.cap = cap
def get_curve(self): return self.curve
def set_curve(self, curve): self.curve = curve
def get_style(self): return self.style
def set_style(self, style): self.style = style
def get_x2(self): return self.x2
def set_x2(self, x2): self.x2 = x2
def get_extent(self): return self.extent
def set_extent(self, extent): self.extent = extent
def get_y1(self): return self.y1
def set_y1(self, y1): self.y1 = y1
def get_x1(self): return self.x1
def set_x1(self, x1): self.x1 = x1
def export(self, outfile, level, namespace_='t:', name_='wire', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='wire')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='wire'):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
outfile.write(' y2=%s' % (self.gds_format_string(quote_attrib(self.y2).encode(ExternalEncoding), input_name='y2'), ))
if self.width is not None and 'width' not in already_processed:
already_processed.append('width')
outfile.write(' width=%s' % (self.gds_format_string(quote_attrib(self.width).encode(ExternalEncoding), input_name='width'), ))
if self.cap is not None and 'cap' not in already_processed:
already_processed.append('cap')
outfile.write(' cap=%s' % (self.gds_format_string(quote_attrib(self.cap).encode(ExternalEncoding), input_name='cap'), ))
if self.curve is not None and 'curve' not in already_processed:
already_processed.append('curve')
outfile.write(' curve=%s' % (self.gds_format_string(quote_attrib(self.curve).encode(ExternalEncoding), input_name='curve'), ))
if self.style is not None and 'style' not in already_processed:
already_processed.append('style')
outfile.write(' style=%s' % (self.gds_format_string(quote_attrib(self.style).encode(ExternalEncoding), input_name='style'), ))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
outfile.write(' x2=%s' % (self.gds_format_string(quote_attrib(self.x2).encode(ExternalEncoding), input_name='x2'), ))
if self.extent is not None and 'extent' not in already_processed:
already_processed.append('extent')
outfile.write(' extent=%s' % (self.gds_format_string(quote_attrib(self.extent).encode(ExternalEncoding), input_name='extent'), ))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
outfile.write(' y1=%s' % (self.gds_format_string(quote_attrib(self.y1).encode(ExternalEncoding), input_name='y1'), ))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
outfile.write(' x1=%s' % (self.gds_format_string(quote_attrib(self.x1).encode(ExternalEncoding), input_name='x1'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='wire', fromsubclass_=False):
pass
    def hasContent_(self):
        return False
def exportLiteral(self, outfile, level, name_='wire'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
showIndent(outfile, level)
outfile.write('layer = "%s",\n' % (self.layer,))
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
showIndent(outfile, level)
outfile.write('y2 = "%s",\n' % (self.y2,))
if self.width is not None and 'width' not in already_processed:
already_processed.append('width')
showIndent(outfile, level)
outfile.write('width = "%s",\n' % (self.width,))
if self.cap is not None and 'cap' not in already_processed:
already_processed.append('cap')
showIndent(outfile, level)
outfile.write('cap = "%s",\n' % (self.cap,))
if self.curve is not None and 'curve' not in already_processed:
already_processed.append('curve')
showIndent(outfile, level)
outfile.write('curve = "%s",\n' % (self.curve,))
if self.style is not None and 'style' not in already_processed:
already_processed.append('style')
showIndent(outfile, level)
outfile.write('style = "%s",\n' % (self.style,))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
showIndent(outfile, level)
outfile.write('x2 = "%s",\n' % (self.x2,))
if self.extent is not None and 'extent' not in already_processed:
already_processed.append('extent')
showIndent(outfile, level)
outfile.write('extent = "%s",\n' % (self.extent,))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
showIndent(outfile, level)
outfile.write('y1 = "%s",\n' % (self.y1,))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
showIndent(outfile, level)
outfile.write('x1 = "%s",\n' % (self.x1,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('layer', node)
if value is not None and 'layer' not in already_processed:
already_processed.append('layer')
self.layer = value
value = find_attr_value_('y2', node)
if value is not None and 'y2' not in already_processed:
already_processed.append('y2')
self.y2 = value
value = find_attr_value_('width', node)
if value is not None and 'width' not in already_processed:
already_processed.append('width')
self.width = value
value = find_attr_value_('cap', node)
if value is not None and 'cap' not in already_processed:
already_processed.append('cap')
self.cap = value
value = find_attr_value_('curve', node)
if value is not None and 'curve' not in already_processed:
already_processed.append('curve')
self.curve = value
value = find_attr_value_('style', node)
if value is not None and 'style' not in already_processed:
already_processed.append('style')
self.style = value
value = find_attr_value_('x2', node)
if value is not None and 'x2' not in already_processed:
already_processed.append('x2')
self.x2 = value
value = find_attr_value_('extent', node)
if value is not None and 'extent' not in already_processed:
already_processed.append('extent')
self.extent = value
value = find_attr_value_('y1', node)
if value is not None and 'y1' not in already_processed:
already_processed.append('y1')
self.y1 = value
value = find_attr_value_('x1', node)
if value is not None and 'x1' not in already_processed:
already_processed.append('x1')
self.x1 = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class wire
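# Usage sketch (illustrative only; assumes the module-level helpers defined
# elsewhere in this file, and the attribute values below are hypothetical).
# wire elements are attribute-only, so build() simply copies each XML
# attribute onto the instance:
#
#     from xml.etree import ElementTree as etree
#     node = etree.fromstring(
#         '<wire x1="0" y1="0" x2="10.16" y2="0" width="0.6096" layer="1"/>')
#     w = wire.factory()
#     w.build(node)
#     # w.get_layer() -> '1'; all attributes are kept as strings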
class dimension(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, layer=None, y2=None, dtype=None, x2=None, y1=None, x3=None, y3=None, x1=None):
self.layer = _cast(None, layer)
self.y2 = _cast(None, y2)
self.dtype = _cast(None, dtype)
self.x2 = _cast(None, x2)
self.y1 = _cast(None, y1)
self.x3 = _cast(None, x3)
self.y3 = _cast(None, y3)
self.x1 = _cast(None, x1)
def factory(*args_, **kwargs_):
if dimension.subclass:
return dimension.subclass(*args_, **kwargs_)
else:
return dimension(*args_, **kwargs_)
factory = staticmethod(factory)
def get_layer(self): return self.layer
def set_layer(self, layer): self.layer = layer
def get_y2(self): return self.y2
def set_y2(self, y2): self.y2 = y2
def get_dtype(self): return self.dtype
def set_dtype(self, dtype): self.dtype = dtype
def get_x2(self): return self.x2
def set_x2(self, x2): self.x2 = x2
def get_y1(self): return self.y1
def set_y1(self, y1): self.y1 = y1
def get_x3(self): return self.x3
def set_x3(self, x3): self.x3 = x3
def get_y3(self): return self.y3
def set_y3(self, y3): self.y3 = y3
def get_x1(self): return self.x1
def set_x1(self, x1): self.x1 = x1
def export(self, outfile, level, namespace_='t:', name_='dimension', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='dimension')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='dimension'):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
outfile.write(' y2=%s' % (self.gds_format_string(quote_attrib(self.y2).encode(ExternalEncoding), input_name='y2'), ))
if self.dtype is not None and 'dtype' not in already_processed:
already_processed.append('dtype')
outfile.write(' dtype=%s' % (self.gds_format_string(quote_attrib(self.dtype).encode(ExternalEncoding), input_name='dtype'), ))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
outfile.write(' x2=%s' % (self.gds_format_string(quote_attrib(self.x2).encode(ExternalEncoding), input_name='x2'), ))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
outfile.write(' y1=%s' % (self.gds_format_string(quote_attrib(self.y1).encode(ExternalEncoding), input_name='y1'), ))
if self.x3 is not None and 'x3' not in already_processed:
already_processed.append('x3')
outfile.write(' x3=%s' % (self.gds_format_string(quote_attrib(self.x3).encode(ExternalEncoding), input_name='x3'), ))
if self.y3 is not None and 'y3' not in already_processed:
already_processed.append('y3')
outfile.write(' y3=%s' % (self.gds_format_string(quote_attrib(self.y3).encode(ExternalEncoding), input_name='y3'), ))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
outfile.write(' x1=%s' % (self.gds_format_string(quote_attrib(self.x1).encode(ExternalEncoding), input_name='x1'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='dimension', fromsubclass_=False):
pass
    def hasContent_(self):
        return False
def exportLiteral(self, outfile, level, name_='dimension'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
showIndent(outfile, level)
outfile.write('layer = "%s",\n' % (self.layer,))
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
showIndent(outfile, level)
outfile.write('y2 = "%s",\n' % (self.y2,))
if self.dtype is not None and 'dtype' not in already_processed:
already_processed.append('dtype')
showIndent(outfile, level)
outfile.write('dtype = "%s",\n' % (self.dtype,))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
showIndent(outfile, level)
outfile.write('x2 = "%s",\n' % (self.x2,))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
showIndent(outfile, level)
outfile.write('y1 = "%s",\n' % (self.y1,))
if self.x3 is not None and 'x3' not in already_processed:
already_processed.append('x3')
showIndent(outfile, level)
outfile.write('x3 = "%s",\n' % (self.x3,))
if self.y3 is not None and 'y3' not in already_processed:
already_processed.append('y3')
showIndent(outfile, level)
outfile.write('y3 = "%s",\n' % (self.y3,))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
showIndent(outfile, level)
outfile.write('x1 = "%s",\n' % (self.x1,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('layer', node)
if value is not None and 'layer' not in already_processed:
already_processed.append('layer')
self.layer = value
value = find_attr_value_('y2', node)
if value is not None and 'y2' not in already_processed:
already_processed.append('y2')
self.y2 = value
value = find_attr_value_('dtype', node)
if value is not None and 'dtype' not in already_processed:
already_processed.append('dtype')
self.dtype = value
value = find_attr_value_('x2', node)
if value is not None and 'x2' not in already_processed:
already_processed.append('x2')
self.x2 = value
value = find_attr_value_('y1', node)
if value is not None and 'y1' not in already_processed:
already_processed.append('y1')
self.y1 = value
value = find_attr_value_('x3', node)
if value is not None and 'x3' not in already_processed:
already_processed.append('x3')
self.x3 = value
value = find_attr_value_('y3', node)
if value is not None and 'y3' not in already_processed:
already_processed.append('y3')
self.y3 = value
value = find_attr_value_('x1', node)
if value is not None and 'x1' not in already_processed:
already_processed.append('x1')
self.x1 = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class dimension
class text(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, layer=None, ratio=None, align=None, y=None, x=None, font=None, rot=None, size=None, valueOf_=None, mixedclass_=None, content_=None):
self.layer = _cast(None, layer)
self.ratio = _cast(None, ratio)
self.align = _cast(None, align)
self.y = _cast(None, y)
self.x = _cast(None, x)
self.font = _cast(None, font)
self.rot = _cast(None, rot)
self.size = _cast(None, size)
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if text.subclass:
return text.subclass(*args_, **kwargs_)
else:
return text(*args_, **kwargs_)
factory = staticmethod(factory)
def get_layer(self): return self.layer
def set_layer(self, layer): self.layer = layer
def get_ratio(self): return self.ratio
def set_ratio(self, ratio): self.ratio = ratio
def get_align(self): return self.align
def set_align(self, align): self.align = align
def get_y(self): return self.y
def set_y(self, y): self.y = y
def get_x(self): return self.x
def set_x(self, x): self.x = x
def get_font(self): return self.font
def set_font(self, font): self.font = font
def get_rot(self): return self.rot
def set_rot(self, rot): self.rot = rot
def get_size(self): return self.size
def set_size(self, size): self.size = size
def get_valueOf_(self): return self.valueOf_
def set_valueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='t:', name_='text', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='text')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='text'):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
if self.ratio is not None and 'ratio' not in already_processed:
already_processed.append('ratio')
outfile.write(' ratio=%s' % (self.gds_format_string(quote_attrib(self.ratio).encode(ExternalEncoding), input_name='ratio'), ))
if self.align is not None and 'align' not in already_processed:
already_processed.append('align')
outfile.write(' align=%s' % (self.gds_format_string(quote_attrib(self.align).encode(ExternalEncoding), input_name='align'), ))
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
if self.font is not None and 'font' not in already_processed:
already_processed.append('font')
outfile.write(' font=%s' % (self.gds_format_string(quote_attrib(self.font).encode(ExternalEncoding), input_name='font'), ))
if self.rot is not None and 'rot' not in already_processed:
already_processed.append('rot')
outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
if self.size is not None and 'size' not in already_processed:
already_processed.append('size')
outfile.write(' size=%s' % (self.gds_format_string(quote_attrib(self.size).encode(ExternalEncoding), input_name='size'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='text', fromsubclass_=False):
pass
    def hasContent_(self):
        return bool(self.valueOf_)
def exportLiteral(self, outfile, level, name_='text'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
showIndent(outfile, level)
outfile.write('valueOf_ = """%s""",\n' % (self.valueOf_,))
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
showIndent(outfile, level)
outfile.write('layer = "%s",\n' % (self.layer,))
if self.ratio is not None and 'ratio' not in already_processed:
already_processed.append('ratio')
showIndent(outfile, level)
outfile.write('ratio = "%s",\n' % (self.ratio,))
if self.align is not None and 'align' not in already_processed:
already_processed.append('align')
showIndent(outfile, level)
outfile.write('align = "%s",\n' % (self.align,))
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
showIndent(outfile, level)
outfile.write('y = "%s",\n' % (self.y,))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
showIndent(outfile, level)
outfile.write('x = "%s",\n' % (self.x,))
if self.font is not None and 'font' not in already_processed:
already_processed.append('font')
showIndent(outfile, level)
outfile.write('font = "%s",\n' % (self.font,))
if self.rot is not None and 'rot' not in already_processed:
already_processed.append('rot')
showIndent(outfile, level)
outfile.write('rot = "%s",\n' % (self.rot,))
if self.size is not None and 'size' not in already_processed:
already_processed.append('size')
showIndent(outfile, level)
outfile.write('size = "%s",\n' % (self.size,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
self.valueOf_ = get_all_text_(node)
if node.text is not None:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', node.text)
self.content_.append(obj_)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('layer', node)
if value is not None and 'layer' not in already_processed:
already_processed.append('layer')
self.layer = value
value = find_attr_value_('ratio', node)
if value is not None and 'ratio' not in already_processed:
already_processed.append('ratio')
self.ratio = value
value = find_attr_value_('align', node)
if value is not None and 'align' not in already_processed:
already_processed.append('align')
self.align = value
value = find_attr_value_('y', node)
if value is not None and 'y' not in already_processed:
already_processed.append('y')
self.y = value
value = find_attr_value_('x', node)
if value is not None and 'x' not in already_processed:
already_processed.append('x')
self.x = value
value = find_attr_value_('font', node)
if value is not None and 'font' not in already_processed:
already_processed.append('font')
self.font = value
value = find_attr_value_('rot', node)
if value is not None and 'rot' not in already_processed:
already_processed.append('rot')
self.rot = value
value = find_attr_value_('size', node)
if value is not None and 'size' not in already_processed:
already_processed.append('size')
self.size = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if not fromsubclass_ and child_.tail is not None:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.tail)
self.content_.append(obj_)
# end class text
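# Illustrative usage sketch (commented out; not part of the generated API).
# Element name, attributes, and the etree_ parse helper here are assumptions
# based on the setup earlier in this module:
#
#   from StringIO import StringIO
#   doc = etree_.parse(StringIO('<text x="0" y="0" size="1.27">NAME</text>'))
#   obj = text.factory()
#   obj.build(doc.getroot())
#   # obj.get_size() and obj.get_valueOf_() now return the parsed
#   # attribute and mixed text content as strings.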
class circle(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, y=None, x=None, layer=None, radius=None, width=None):
self.y = _cast(None, y)
self.x = _cast(None, x)
self.layer = _cast(None, layer)
self.radius = _cast(None, radius)
self.width = _cast(None, width)
def factory(*args_, **kwargs_):
if circle.subclass:
return circle.subclass(*args_, **kwargs_)
else:
return circle(*args_, **kwargs_)
factory = staticmethod(factory)
def get_y(self): return self.y
def set_y(self, y): self.y = y
def get_x(self): return self.x
def set_x(self, x): self.x = x
def get_layer(self): return self.layer
def set_layer(self, layer): self.layer = layer
def get_radius(self): return self.radius
def set_radius(self, radius): self.radius = radius
def get_width(self): return self.width
def set_width(self, width): self.width = width
def export(self, outfile, level, namespace_='t:', name_='circle', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='circle')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='circle'):
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
if self.radius is not None and 'radius' not in already_processed:
already_processed.append('radius')
outfile.write(' radius=%s' % (self.gds_format_string(quote_attrib(self.radius).encode(ExternalEncoding), input_name='radius'), ))
if self.width is not None and 'width' not in already_processed:
already_processed.append('width')
outfile.write(' width=%s' % (self.gds_format_string(quote_attrib(self.width).encode(ExternalEncoding), input_name='width'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='circle', fromsubclass_=False):
pass
    def hasContent_(self):
        # Generated for an element with no child content: never has content.
        return False
def exportLiteral(self, outfile, level, name_='circle'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
showIndent(outfile, level)
outfile.write('y = "%s",\n' % (self.y,))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
showIndent(outfile, level)
outfile.write('x = "%s",\n' % (self.x,))
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
showIndent(outfile, level)
outfile.write('layer = "%s",\n' % (self.layer,))
if self.radius is not None and 'radius' not in already_processed:
already_processed.append('radius')
showIndent(outfile, level)
outfile.write('radius = "%s",\n' % (self.radius,))
if self.width is not None and 'width' not in already_processed:
already_processed.append('width')
showIndent(outfile, level)
outfile.write('width = "%s",\n' % (self.width,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('y', node)
if value is not None and 'y' not in already_processed:
already_processed.append('y')
self.y = value
value = find_attr_value_('x', node)
if value is not None and 'x' not in already_processed:
already_processed.append('x')
self.x = value
value = find_attr_value_('layer', node)
if value is not None and 'layer' not in already_processed:
already_processed.append('layer')
self.layer = value
value = find_attr_value_('radius', node)
if value is not None and 'radius' not in already_processed:
already_processed.append('radius')
self.radius = value
value = find_attr_value_('width', node)
if value is not None and 'width' not in already_processed:
already_processed.append('width')
self.width = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class circle
class rectangle(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, layer=None, y2=None, x2=None, y1=None, x1=None, rot=None):
self.layer = _cast(None, layer)
self.y2 = _cast(None, y2)
self.x2 = _cast(None, x2)
self.y1 = _cast(None, y1)
self.x1 = _cast(None, x1)
self.rot = _cast(None, rot)
def factory(*args_, **kwargs_):
if rectangle.subclass:
return rectangle.subclass(*args_, **kwargs_)
else:
return rectangle(*args_, **kwargs_)
factory = staticmethod(factory)
def get_layer(self): return self.layer
def set_layer(self, layer): self.layer = layer
def get_y2(self): return self.y2
def set_y2(self, y2): self.y2 = y2
def get_x2(self): return self.x2
def set_x2(self, x2): self.x2 = x2
def get_y1(self): return self.y1
def set_y1(self, y1): self.y1 = y1
def get_x1(self): return self.x1
def set_x1(self, x1): self.x1 = x1
def get_rot(self): return self.rot
def set_rot(self, rot): self.rot = rot
def export(self, outfile, level, namespace_='t:', name_='rectangle', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='rectangle')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='rectangle'):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
outfile.write(' y2=%s' % (self.gds_format_string(quote_attrib(self.y2).encode(ExternalEncoding), input_name='y2'), ))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
outfile.write(' x2=%s' % (self.gds_format_string(quote_attrib(self.x2).encode(ExternalEncoding), input_name='x2'), ))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
outfile.write(' y1=%s' % (self.gds_format_string(quote_attrib(self.y1).encode(ExternalEncoding), input_name='y1'), ))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
outfile.write(' x1=%s' % (self.gds_format_string(quote_attrib(self.x1).encode(ExternalEncoding), input_name='x1'), ))
if self.rot is not None and 'rot' not in already_processed:
already_processed.append('rot')
outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='rectangle', fromsubclass_=False):
pass
    def hasContent_(self):
        # Generated for an element with no child content: never has content.
        return False
def exportLiteral(self, outfile, level, name_='rectangle'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
showIndent(outfile, level)
outfile.write('layer = "%s",\n' % (self.layer,))
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
showIndent(outfile, level)
outfile.write('y2 = "%s",\n' % (self.y2,))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
showIndent(outfile, level)
outfile.write('x2 = "%s",\n' % (self.x2,))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
showIndent(outfile, level)
outfile.write('y1 = "%s",\n' % (self.y1,))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
showIndent(outfile, level)
outfile.write('x1 = "%s",\n' % (self.x1,))
if self.rot is not None and 'rot' not in already_processed:
already_processed.append('rot')
showIndent(outfile, level)
outfile.write('rot = "%s",\n' % (self.rot,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('layer', node)
if value is not None and 'layer' not in already_processed:
already_processed.append('layer')
self.layer = value
value = find_attr_value_('y2', node)
if value is not None and 'y2' not in already_processed:
already_processed.append('y2')
self.y2 = value
value = find_attr_value_('x2', node)
if value is not None and 'x2' not in already_processed:
already_processed.append('x2')
self.x2 = value
value = find_attr_value_('y1', node)
if value is not None and 'y1' not in already_processed:
already_processed.append('y1')
self.y1 = value
value = find_attr_value_('x1', node)
if value is not None and 'x1' not in already_processed:
already_processed.append('x1')
self.x1 = value
value = find_attr_value_('rot', node)
if value is not None and 'rot' not in already_processed:
already_processed.append('rot')
self.rot = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class rectangle
class frame(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, y2=None, layer=None, rows=None, border_right=None, border_bottom=None, x2=None, border_top=None, border_left=None, y1=None, x1=None, columns=None):
self.y2 = _cast(None, y2)
self.layer = _cast(None, layer)
self.rows = _cast(None, rows)
self.border_right = _cast(None, border_right)
self.border_bottom = _cast(None, border_bottom)
self.x2 = _cast(None, x2)
self.border_top = _cast(None, border_top)
self.border_left = _cast(None, border_left)
self.y1 = _cast(None, y1)
self.x1 = _cast(None, x1)
self.columns = _cast(None, columns)
def factory(*args_, **kwargs_):
if frame.subclass:
return frame.subclass(*args_, **kwargs_)
else:
return frame(*args_, **kwargs_)
factory = staticmethod(factory)
def get_y2(self): return self.y2
def set_y2(self, y2): self.y2 = y2
def get_layer(self): return self.layer
def set_layer(self, layer): self.layer = layer
def get_rows(self): return self.rows
def set_rows(self, rows): self.rows = rows
def get_border_right(self): return self.border_right
def set_border_right(self, border_right): self.border_right = border_right
def get_border_bottom(self): return self.border_bottom
def set_border_bottom(self, border_bottom): self.border_bottom = border_bottom
def get_x2(self): return self.x2
def set_x2(self, x2): self.x2 = x2
def get_border_top(self): return self.border_top
def set_border_top(self, border_top): self.border_top = border_top
def get_border_left(self): return self.border_left
def set_border_left(self, border_left): self.border_left = border_left
def get_y1(self): return self.y1
def set_y1(self, y1): self.y1 = y1
def get_x1(self): return self.x1
def set_x1(self, x1): self.x1 = x1
def get_columns(self): return self.columns
def set_columns(self, columns): self.columns = columns
def export(self, outfile, level, namespace_='t:', name_='frame', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='frame')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='frame'):
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
outfile.write(' y2=%s' % (self.gds_format_string(quote_attrib(self.y2).encode(ExternalEncoding), input_name='y2'), ))
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
if self.rows is not None and 'rows' not in already_processed:
already_processed.append('rows')
outfile.write(' rows=%s' % (self.gds_format_string(quote_attrib(self.rows).encode(ExternalEncoding), input_name='rows'), ))
if self.border_right is not None and 'border_right' not in already_processed:
already_processed.append('border_right')
outfile.write(' border-right=%s' % (self.gds_format_string(quote_attrib(self.border_right).encode(ExternalEncoding), input_name='border-right'), ))
if self.border_bottom is not None and 'border_bottom' not in already_processed:
already_processed.append('border_bottom')
outfile.write(' border-bottom=%s' % (self.gds_format_string(quote_attrib(self.border_bottom).encode(ExternalEncoding), input_name='border-bottom'), ))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
outfile.write(' x2=%s' % (self.gds_format_string(quote_attrib(self.x2).encode(ExternalEncoding), input_name='x2'), ))
if self.border_top is not None and 'border_top' not in already_processed:
already_processed.append('border_top')
outfile.write(' border-top=%s' % (self.gds_format_string(quote_attrib(self.border_top).encode(ExternalEncoding), input_name='border-top'), ))
if self.border_left is not None and 'border_left' not in already_processed:
already_processed.append('border_left')
outfile.write(' border-left=%s' % (self.gds_format_string(quote_attrib(self.border_left).encode(ExternalEncoding), input_name='border-left'), ))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
outfile.write(' y1=%s' % (self.gds_format_string(quote_attrib(self.y1).encode(ExternalEncoding), input_name='y1'), ))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
outfile.write(' x1=%s' % (self.gds_format_string(quote_attrib(self.x1).encode(ExternalEncoding), input_name='x1'), ))
if self.columns is not None and 'columns' not in already_processed:
already_processed.append('columns')
outfile.write(' columns=%s' % (self.gds_format_string(quote_attrib(self.columns).encode(ExternalEncoding), input_name='columns'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='frame', fromsubclass_=False):
pass
    def hasContent_(self):
        # Generated for an element with no child content: never has content.
        return False
def exportLiteral(self, outfile, level, name_='frame'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.y2 is not None and 'y2' not in already_processed:
already_processed.append('y2')
showIndent(outfile, level)
outfile.write('y2 = "%s",\n' % (self.y2,))
if self.layer is not None and 'layer' not in already_processed:
already_processed.append('layer')
showIndent(outfile, level)
outfile.write('layer = "%s",\n' % (self.layer,))
if self.rows is not None and 'rows' not in already_processed:
already_processed.append('rows')
showIndent(outfile, level)
outfile.write('rows = "%s",\n' % (self.rows,))
if self.border_right is not None and 'border_right' not in already_processed:
already_processed.append('border_right')
showIndent(outfile, level)
outfile.write('border_right = "%s",\n' % (self.border_right,))
if self.border_bottom is not None and 'border_bottom' not in already_processed:
already_processed.append('border_bottom')
showIndent(outfile, level)
outfile.write('border_bottom = "%s",\n' % (self.border_bottom,))
if self.x2 is not None and 'x2' not in already_processed:
already_processed.append('x2')
showIndent(outfile, level)
outfile.write('x2 = "%s",\n' % (self.x2,))
if self.border_top is not None and 'border_top' not in already_processed:
already_processed.append('border_top')
showIndent(outfile, level)
outfile.write('border_top = "%s",\n' % (self.border_top,))
if self.border_left is not None and 'border_left' not in already_processed:
already_processed.append('border_left')
showIndent(outfile, level)
outfile.write('border_left = "%s",\n' % (self.border_left,))
if self.y1 is not None and 'y1' not in already_processed:
already_processed.append('y1')
showIndent(outfile, level)
outfile.write('y1 = "%s",\n' % (self.y1,))
if self.x1 is not None and 'x1' not in already_processed:
already_processed.append('x1')
showIndent(outfile, level)
outfile.write('x1 = "%s",\n' % (self.x1,))
if self.columns is not None and 'columns' not in already_processed:
already_processed.append('columns')
showIndent(outfile, level)
outfile.write('columns = "%s",\n' % (self.columns,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('y2', node)
if value is not None and 'y2' not in already_processed:
already_processed.append('y2')
self.y2 = value
value = find_attr_value_('layer', node)
if value is not None and 'layer' not in already_processed:
already_processed.append('layer')
self.layer = value
value = find_attr_value_('rows', node)
if value is not None and 'rows' not in already_processed:
already_processed.append('rows')
self.rows = value
value = find_attr_value_('border-right', node)
if value is not None and 'border-right' not in already_processed:
already_processed.append('border-right')
self.border_right = value
value = find_attr_value_('border-bottom', node)
if value is not None and 'border-bottom' not in already_processed:
already_processed.append('border-bottom')
self.border_bottom = value
value = find_attr_value_('x2', node)
if value is not None and 'x2' not in already_processed:
already_processed.append('x2')
self.x2 = value
value = find_attr_value_('border-top', node)
if value is not None and 'border-top' not in already_processed:
already_processed.append('border-top')
self.border_top = value
value = find_attr_value_('border-left', node)
if value is not None and 'border-left' not in already_processed:
already_processed.append('border-left')
self.border_left = value
value = find_attr_value_('y1', node)
if value is not None and 'y1' not in already_processed:
already_processed.append('y1')
self.y1 = value
value = find_attr_value_('x1', node)
if value is not None and 'x1' not in already_processed:
already_processed.append('x1')
self.x1 = value
value = find_attr_value_('columns', node)
if value is not None and 'columns' not in already_processed:
already_processed.append('columns')
self.columns = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class frame
class hole(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, y=None, x=None, drill=None):
self.y = _cast(None, y)
self.x = _cast(None, x)
self.drill = _cast(None, drill)
def factory(*args_, **kwargs_):
if hole.subclass:
return hole.subclass(*args_, **kwargs_)
else:
return hole(*args_, **kwargs_)
factory = staticmethod(factory)
def get_y(self): return self.y
def set_y(self, y): self.y = y
def get_x(self): return self.x
def set_x(self, x): self.x = x
def get_drill(self): return self.drill
def set_drill(self, drill): self.drill = drill
def export(self, outfile, level, namespace_='t:', name_='hole', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='hole')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='hole'):
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
if self.drill is not None and 'drill' not in already_processed:
already_processed.append('drill')
outfile.write(' drill=%s' % (self.gds_format_string(quote_attrib(self.drill).encode(ExternalEncoding), input_name='drill'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='hole', fromsubclass_=False):
pass
    def hasContent_(self):
        # Generated for an element with no child content: never has content.
        return False
def exportLiteral(self, outfile, level, name_='hole'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
showIndent(outfile, level)
outfile.write('y = "%s",\n' % (self.y,))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
showIndent(outfile, level)
outfile.write('x = "%s",\n' % (self.x,))
if self.drill is not None and 'drill' not in already_processed:
already_processed.append('drill')
showIndent(outfile, level)
outfile.write('drill = "%s",\n' % (self.drill,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('y', node)
if value is not None and 'y' not in already_processed:
already_processed.append('y')
self.y = value
value = find_attr_value_('x', node)
if value is not None and 'x' not in already_processed:
already_processed.append('x')
self.x = value
value = find_attr_value_('drill', node)
if value is not None and 'drill' not in already_processed:
already_processed.append('drill')
self.drill = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class hole
class pad(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, diameter=None, thermals=None, name=None, stop=None, shape=None, drill=None, y=None, x=None, rot=None, first=None):
self.diameter = _cast(None, diameter)
self.thermals = _cast(None, thermals)
self.name = _cast(None, name)
self.stop = _cast(None, stop)
self.shape = _cast(None, shape)
self.drill = _cast(None, drill)
self.y = _cast(None, y)
self.x = _cast(None, x)
self.rot = _cast(None, rot)
self.first = _cast(None, first)
def factory(*args_, **kwargs_):
if pad.subclass:
return pad.subclass(*args_, **kwargs_)
else:
return pad(*args_, **kwargs_)
factory = staticmethod(factory)
def get_diameter(self): return self.diameter
def set_diameter(self, diameter): self.diameter = diameter
def get_thermals(self): return self.thermals
def set_thermals(self, thermals): self.thermals = thermals
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_stop(self): return self.stop
def set_stop(self, stop): self.stop = stop
def get_shape(self): return self.shape
def set_shape(self, shape): self.shape = shape
def get_drill(self): return self.drill
def set_drill(self, drill): self.drill = drill
def get_y(self): return self.y
def set_y(self, y): self.y = y
def get_x(self): return self.x
def set_x(self, x): self.x = x
def get_rot(self): return self.rot
def set_rot(self, rot): self.rot = rot
def get_first(self): return self.first
def set_first(self, first): self.first = first
def export(self, outfile, level, namespace_='t:', name_='pad', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='pad')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='pad'):
if self.diameter is not None and 'diameter' not in already_processed:
already_processed.append('diameter')
outfile.write(' diameter=%s' % (self.gds_format_string(quote_attrib(self.diameter).encode(ExternalEncoding), input_name='diameter'), ))
if self.thermals is not None and 'thermals' not in already_processed:
already_processed.append('thermals')
outfile.write(' thermals=%s' % (self.gds_format_string(quote_attrib(self.thermals).encode(ExternalEncoding), input_name='thermals'), ))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.stop is not None and 'stop' not in already_processed:
already_processed.append('stop')
outfile.write(' stop=%s' % (self.gds_format_string(quote_attrib(self.stop).encode(ExternalEncoding), input_name='stop'), ))
if self.shape is not None and 'shape' not in already_processed:
already_processed.append('shape')
outfile.write(' shape=%s' % (self.gds_format_string(quote_attrib(self.shape).encode(ExternalEncoding), input_name='shape'), ))
if self.drill is not None and 'drill' not in already_processed:
already_processed.append('drill')
outfile.write(' drill=%s' % (self.gds_format_string(quote_attrib(self.drill).encode(ExternalEncoding), input_name='drill'), ))
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
if self.rot is not None and 'rot' not in already_processed:
already_processed.append('rot')
outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
if self.first is not None and 'first' not in already_processed:
already_processed.append('first')
outfile.write(' first=%s' % (self.gds_format_string(quote_attrib(self.first).encode(ExternalEncoding), input_name='first'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='pad', fromsubclass_=False):
pass
def hasContent_(self):
if (
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='pad'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.diameter is not None and 'diameter' not in already_processed:
already_processed.append('diameter')
showIndent(outfile, level)
outfile.write('diameter = "%s",\n' % (self.diameter,))
if self.thermals is not None and 'thermals' not in already_processed:
already_processed.append('thermals')
showIndent(outfile, level)
outfile.write('thermals = "%s",\n' % (self.thermals,))
if self.name is not None and 'name' not in already_processed:
already_processed.append('name')
showIndent(outfile, level)
outfile.write('name = "%s",\n' % (self.name,))
if self.stop is not None and 'stop' not in already_processed:
already_processed.append('stop')
showIndent(outfile, level)
outfile.write('stop = "%s",\n' % (self.stop,))
if self.shape is not None and 'shape' not in already_processed:
already_processed.append('shape')
showIndent(outfile, level)
outfile.write('shape = "%s",\n' % (self.shape,))
if self.drill is not None and 'drill' not in already_processed:
already_processed.append('drill')
showIndent(outfile, level)
outfile.write('drill = "%s",\n' % (self.drill,))
if self.y is not None and 'y' not in already_processed:
already_processed.append('y')
showIndent(outfile, level)
outfile.write('y = "%s",\n' % (self.y,))
if self.x is not None and 'x' not in already_processed:
already_processed.append('x')
showIndent(outfile, level)
outfile.write('x = "%s",\n' % (self.x,))
if self.rot is not None and 'rot' not in already_processed:
already_processed.append('rot')
showIndent(outfile, level)
outfile.write('rot = "%s",\n' % (self.rot,))
if self.first is not None and 'first' not in already_processed:
already_processed.append('first')
showIndent(outfile, level)
outfile.write('first = "%s",\n' % (self.first,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('diameter', node)
if value is not None and 'diameter' not in already_processed:
already_processed.append('diameter')
self.diameter = value
value = find_attr_value_('thermals', node)
if value is not None and 'thermals' not in already_processed:
already_processed.append('thermals')
self.thermals = value
value = find_attr_value_('name', node)
if value is not None and 'name' not in already_processed:
already_processed.append('name')
self.name = value
value = find_attr_value_('stop', node)
if value is not None and 'stop' not in already_processed:
already_processed.append('stop')
self.stop = value
value = find_attr_value_('shape', node)
if value is not None and 'shape' not in already_processed:
already_processed.append('shape')
self.shape = value
value = find_attr_value_('drill', node)
if value is not None and 'drill' not in already_processed:
already_processed.append('drill')
self.drill = value
value = find_attr_value_('y', node)
if value is not None and 'y' not in already_processed:
already_processed.append('y')
self.y = value
value = find_attr_value_('x', node)
if value is not None and 'x' not in already_processed:
already_processed.append('x')
self.x = value
value = find_attr_value_('rot', node)
if value is not None and 'rot' not in already_processed:
already_processed.append('rot')
self.rot = value
value = find_attr_value_('first', node)
if value is not None and 'first' not in already_processed:
already_processed.append('first')
self.first = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class pad
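Every generated `exportAttributes` method above repeats the same guard-and-write pattern: emit an attribute only when it is set and has not already been written. A minimal self-contained sketch of that pattern (the helper name `export_attrs` is hypothetical, not part of this generated module; it uses `quoteattr` from the stdlib where the module uses its own `quote_attrib`):

```python
from xml.sax.saxutils import quoteattr

def export_attrs(values, order):
    # Mirror the generated guard: write each attribute at most once,
    # and only when it has been set (is not None).
    parts, already_processed = [], []
    for key in order:
        if values.get(key) is not None and key not in already_processed:
            already_processed.append(key)
            parts.append(' %s=%s' % (key, quoteattr(str(values[key]))))
    return ''.join(parts)
```

For example, `export_attrs({'name': 'P1', 'x': '0'}, ('name', 'x', 'y'))` yields `' name="P1" x="0"'`; the unset `y` is skipped, just as the generated code skips `None` attributes.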
class smd(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, layer=None, thermals=None, name=None, stop=None, roundness=None, dx=None, dy=None, y=None, x=None, rot=None, cream=None):
        self.layer = _cast(None, layer)
        self.thermals = _cast(None, thermals)
        self.name = _cast(None, name)
        self.stop = _cast(None, stop)
        self.roundness = _cast(None, roundness)
        self.dx = _cast(None, dx)
        self.dy = _cast(None, dy)
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        self.rot = _cast(None, rot)
        self.cream = _cast(None, cream)
    def factory(*args_, **kwargs_):
        if smd.subclass:
            return smd.subclass(*args_, **kwargs_)
        else:
            return smd(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_layer(self): return self.layer
    def set_layer(self, layer): self.layer = layer
    def get_thermals(self): return self.thermals
    def set_thermals(self, thermals): self.thermals = thermals
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_stop(self): return self.stop
    def set_stop(self, stop): self.stop = stop
    def get_roundness(self): return self.roundness
    def set_roundness(self, roundness): self.roundness = roundness
    def get_dx(self): return self.dx
    def set_dx(self, dx): self.dx = dx
    def get_dy(self): return self.dy
    def set_dy(self, dy): self.dy = dy
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_rot(self): return self.rot
    def set_rot(self, rot): self.rot = rot
    def get_cream(self): return self.cream
    def set_cream(self, cream): self.cream = cream
    def export(self, outfile, level, namespace_='t:', name_='smd', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='smd')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='smd'):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
        if self.thermals is not None and 'thermals' not in already_processed:
            already_processed.append('thermals')
            outfile.write(' thermals=%s' % (self.gds_format_string(quote_attrib(self.thermals).encode(ExternalEncoding), input_name='thermals'), ))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.stop is not None and 'stop' not in already_processed:
            already_processed.append('stop')
            outfile.write(' stop=%s' % (self.gds_format_string(quote_attrib(self.stop).encode(ExternalEncoding), input_name='stop'), ))
        if self.roundness is not None and 'roundness' not in already_processed:
            already_processed.append('roundness')
            outfile.write(' roundness=%s' % (self.gds_format_string(quote_attrib(self.roundness).encode(ExternalEncoding), input_name='roundness'), ))
        if self.dx is not None and 'dx' not in already_processed:
            already_processed.append('dx')
            outfile.write(' dx=%s' % (self.gds_format_string(quote_attrib(self.dx).encode(ExternalEncoding), input_name='dx'), ))
        if self.dy is not None and 'dy' not in already_processed:
            already_processed.append('dy')
            outfile.write(' dy=%s' % (self.gds_format_string(quote_attrib(self.dy).encode(ExternalEncoding), input_name='dy'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
        if self.cream is not None and 'cream' not in already_processed:
            already_processed.append('cream')
            outfile.write(' cream=%s' % (self.gds_format_string(quote_attrib(self.cream).encode(ExternalEncoding), input_name='cream'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='smd', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='smd'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            showIndent(outfile, level)
            outfile.write('layer = "%s",\n' % (self.layer,))
        if self.thermals is not None and 'thermals' not in already_processed:
            already_processed.append('thermals')
            showIndent(outfile, level)
            outfile.write('thermals = "%s",\n' % (self.thermals,))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.stop is not None and 'stop' not in already_processed:
            already_processed.append('stop')
            showIndent(outfile, level)
            outfile.write('stop = "%s",\n' % (self.stop,))
        if self.roundness is not None and 'roundness' not in already_processed:
            already_processed.append('roundness')
            showIndent(outfile, level)
            outfile.write('roundness = "%s",\n' % (self.roundness,))
        if self.dx is not None and 'dx' not in already_processed:
            already_processed.append('dx')
            showIndent(outfile, level)
            outfile.write('dx = "%s",\n' % (self.dx,))
        if self.dy is not None and 'dy' not in already_processed:
            already_processed.append('dy')
            showIndent(outfile, level)
            outfile.write('dy = "%s",\n' % (self.dy,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            showIndent(outfile, level)
            outfile.write('rot = "%s",\n' % (self.rot,))
        if self.cream is not None and 'cream' not in already_processed:
            already_processed.append('cream')
            showIndent(outfile, level)
            outfile.write('cream = "%s",\n' % (self.cream,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('layer', node)
        if value is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            self.layer = value
        value = find_attr_value_('thermals', node)
        if value is not None and 'thermals' not in already_processed:
            already_processed.append('thermals')
            self.thermals = value
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('stop', node)
        if value is not None and 'stop' not in already_processed:
            already_processed.append('stop')
            self.stop = value
        value = find_attr_value_('roundness', node)
        if value is not None and 'roundness' not in already_processed:
            already_processed.append('roundness')
            self.roundness = value
        value = find_attr_value_('dx', node)
        if value is not None and 'dx' not in already_processed:
            already_processed.append('dx')
            self.dx = value
        value = find_attr_value_('dy', node)
        if value is not None and 'dy' not in already_processed:
            already_processed.append('dy')
            self.dy = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('rot', node)
        if value is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            self.rot = value
        value = find_attr_value_('cream', node)
        if value is not None and 'cream' not in already_processed:
            already_processed.append('cream')
            self.cream = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class smd
class element(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, locked=None, name=None, package=None, value=None, smashed=None, library=None, y=None, x=None, rot=None, attribute=None, variant=None):
        self.locked = _cast(None, locked)
        self.name = _cast(None, name)
        self.package = _cast(None, package)
        self.value = _cast(None, value)
        self.smashed = _cast(None, smashed)
        self.library = _cast(None, library)
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        self.rot = _cast(None, rot)
        if attribute is None:
            self.attribute = []
        else:
            self.attribute = attribute
        if variant is None:
            self.variant = []
        else:
            self.variant = variant
    def factory(*args_, **kwargs_):
        if element.subclass:
            return element.subclass(*args_, **kwargs_)
        else:
            return element(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_attribute(self): return self.attribute
    def set_attribute(self, attribute): self.attribute = attribute
    def add_attribute(self, value): self.attribute.append(value)
    def insert_attribute(self, index, value): self.attribute[index] = value
    def get_variant(self): return self.variant
    def set_variant(self, variant): self.variant = variant
    def add_variant(self, value): self.variant.append(value)
    def insert_variant(self, index, value): self.variant[index] = value
    def get_locked(self): return self.locked
    def set_locked(self, locked): self.locked = locked
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_package(self): return self.package
    def set_package(self, package): self.package = package
    def get_value(self): return self.value
    def set_value(self, value): self.value = value
    def get_smashed(self): return self.smashed
    def set_smashed(self, smashed): self.smashed = smashed
    def get_library(self): return self.library
    def set_library(self, library): self.library = library
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_rot(self): return self.rot
    def set_rot(self, rot): self.rot = rot
    def export(self, outfile, level, namespace_='t:', name_='element', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='element')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='element'):
        if self.locked is not None and 'locked' not in already_processed:
            already_processed.append('locked')
            outfile.write(' locked=%s' % (self.gds_format_string(quote_attrib(self.locked).encode(ExternalEncoding), input_name='locked'), ))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.package is not None and 'package' not in already_processed:
            already_processed.append('package')
            outfile.write(' package=%s' % (self.gds_format_string(quote_attrib(self.package).encode(ExternalEncoding), input_name='package'), ))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
        if self.smashed is not None and 'smashed' not in already_processed:
            already_processed.append('smashed')
            outfile.write(' smashed=%s' % (self.gds_format_string(quote_attrib(self.smashed).encode(ExternalEncoding), input_name='smashed'), ))
        if self.library is not None and 'library' not in already_processed:
            already_processed.append('library')
            outfile.write(' library=%s' % (self.gds_format_string(quote_attrib(self.library).encode(ExternalEncoding), input_name='library'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='element', fromsubclass_=False):
        for attribute_ in self.attribute:
            attribute_.export(outfile, level, namespace_, name_='attribute')
        for variant_ in self.variant:
            variant_.export(outfile, level, namespace_, name_='variant')
    def hasContent_(self):
        if (
            self.attribute or
            self.variant
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='element'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.locked is not None and 'locked' not in already_processed:
            already_processed.append('locked')
            showIndent(outfile, level)
            outfile.write('locked = "%s",\n' % (self.locked,))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.package is not None and 'package' not in already_processed:
            already_processed.append('package')
            showIndent(outfile, level)
            outfile.write('package = "%s",\n' % (self.package,))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            showIndent(outfile, level)
            outfile.write('value = "%s",\n' % (self.value,))
        if self.smashed is not None and 'smashed' not in already_processed:
            already_processed.append('smashed')
            showIndent(outfile, level)
            outfile.write('smashed = "%s",\n' % (self.smashed,))
        if self.library is not None and 'library' not in already_processed:
            already_processed.append('library')
            showIndent(outfile, level)
            outfile.write('library = "%s",\n' % (self.library,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            showIndent(outfile, level)
            outfile.write('rot = "%s",\n' % (self.rot,))
    def exportLiteralChildren(self, outfile, level, name_):
        showIndent(outfile, level)
        outfile.write('attribute=[\n')
        level += 1
        for attribute_ in self.attribute:
            showIndent(outfile, level)
            outfile.write('model_.attribute(\n')
            attribute_.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
        level -= 1
        showIndent(outfile, level)
        outfile.write('],\n')
        showIndent(outfile, level)
        outfile.write('variant=[\n')
        level += 1
        for variant_ in self.variant:
            showIndent(outfile, level)
            outfile.write('model_.variant(\n')
            variant_.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
        level -= 1
        showIndent(outfile, level)
        outfile.write('],\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('locked', node)
        if value is not None and 'locked' not in already_processed:
            already_processed.append('locked')
            self.locked = value
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('package', node)
        if value is not None and 'package' not in already_processed:
            already_processed.append('package')
            self.package = value
        value = find_attr_value_('value', node)
        if value is not None and 'value' not in already_processed:
            already_processed.append('value')
            self.value = value
        value = find_attr_value_('smashed', node)
        if value is not None and 'smashed' not in already_processed:
            already_processed.append('smashed')
            self.smashed = value
        value = find_attr_value_('library', node)
        if value is not None and 'library' not in already_processed:
            already_processed.append('library')
            self.library = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('rot', node)
        if value is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            self.rot = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'attribute':
            obj_ = attribute.factory()
            obj_.build(child_)
            self.attribute.append(obj_)
        elif nodeName_ == 'variant':
            obj_ = variant.factory()
            obj_.build(child_)
            self.variant.append(obj_)
# end class element
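The generated `build`/`buildAttributes` pair above follows one parse pattern in every class: copy each recognized XML attribute onto the object, skipping absent ones. A minimal self-contained sketch of that pattern (the names `build_attrs` and `_Stub` are hypothetical illustrations, not part of this module; it uses `xml.etree.ElementTree` directly where the module uses its own `find_attr_value_` helper):

```python
import xml.etree.ElementTree as ET

def build_attrs(obj, node, names):
    # Copy each recognized attribute from the node onto the object,
    # leaving attributes that are absent from the XML unset, as the
    # generated buildAttributes methods do.
    for name in names:
        value = node.attrib.get(name)
        if value is not None:
            setattr(obj, name, value)
    return obj

class _Stub(object):
    pass

node = ET.fromstring('<element name="R1" package="0805" x="10" y="20"/>')
el = build_attrs(_Stub(), node, ('name', 'package', 'value', 'x', 'y'))
```

After this runs, `el.name` is `'R1'` and `el.package` is `'0805'`, while `el.value` was never set because the XML carried no `value` attribute.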
class via(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, diameter=None, extent=None, shape=None, alwaysstop=None, drill=None, y=None, x=None):
        self.diameter = _cast(None, diameter)
        self.extent = _cast(None, extent)
        self.shape = _cast(None, shape)
        self.alwaysstop = _cast(None, alwaysstop)
        self.drill = _cast(None, drill)
        self.y = _cast(None, y)
        self.x = _cast(None, x)
    def factory(*args_, **kwargs_):
        if via.subclass:
            return via.subclass(*args_, **kwargs_)
        else:
            return via(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_diameter(self): return self.diameter
    def set_diameter(self, diameter): self.diameter = diameter
    def get_extent(self): return self.extent
    def set_extent(self, extent): self.extent = extent
    def get_shape(self): return self.shape
    def set_shape(self, shape): self.shape = shape
    def get_alwaysstop(self): return self.alwaysstop
    def set_alwaysstop(self, alwaysstop): self.alwaysstop = alwaysstop
    def get_drill(self): return self.drill
    def set_drill(self, drill): self.drill = drill
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def export(self, outfile, level, namespace_='t:', name_='via', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='via')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='via'):
        if self.diameter is not None and 'diameter' not in already_processed:
            already_processed.append('diameter')
            outfile.write(' diameter=%s' % (self.gds_format_string(quote_attrib(self.diameter).encode(ExternalEncoding), input_name='diameter'), ))
        if self.extent is not None and 'extent' not in already_processed:
            already_processed.append('extent')
            outfile.write(' extent=%s' % (self.gds_format_string(quote_attrib(self.extent).encode(ExternalEncoding), input_name='extent'), ))
        if self.shape is not None and 'shape' not in already_processed:
            already_processed.append('shape')
            outfile.write(' shape=%s' % (self.gds_format_string(quote_attrib(self.shape).encode(ExternalEncoding), input_name='shape'), ))
        if self.alwaysstop is not None and 'alwaysstop' not in already_processed:
            already_processed.append('alwaysstop')
            outfile.write(' alwaysstop=%s' % (self.gds_format_string(quote_attrib(self.alwaysstop).encode(ExternalEncoding), input_name='alwaysstop'), ))
        if self.drill is not None and 'drill' not in already_processed:
            already_processed.append('drill')
            outfile.write(' drill=%s' % (self.gds_format_string(quote_attrib(self.drill).encode(ExternalEncoding), input_name='drill'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='via', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='via'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.diameter is not None and 'diameter' not in already_processed:
            already_processed.append('diameter')
            showIndent(outfile, level)
            outfile.write('diameter = "%s",\n' % (self.diameter,))
        if self.extent is not None and 'extent' not in already_processed:
            already_processed.append('extent')
            showIndent(outfile, level)
            outfile.write('extent = "%s",\n' % (self.extent,))
        if self.shape is not None and 'shape' not in already_processed:
            already_processed.append('shape')
            showIndent(outfile, level)
            outfile.write('shape = "%s",\n' % (self.shape,))
        if self.alwaysstop is not None and 'alwaysstop' not in already_processed:
            already_processed.append('alwaysstop')
            showIndent(outfile, level)
            outfile.write('alwaysstop = "%s",\n' % (self.alwaysstop,))
        if self.drill is not None and 'drill' not in already_processed:
            already_processed.append('drill')
            showIndent(outfile, level)
            outfile.write('drill = "%s",\n' % (self.drill,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('diameter', node)
        if value is not None and 'diameter' not in already_processed:
            already_processed.append('diameter')
            self.diameter = value
        value = find_attr_value_('extent', node)
        if value is not None and 'extent' not in already_processed:
            already_processed.append('extent')
            self.extent = value
        value = find_attr_value_('shape', node)
        if value is not None and 'shape' not in already_processed:
            already_processed.append('shape')
            self.shape = value
        value = find_attr_value_('alwaysstop', node)
        if value is not None and 'alwaysstop' not in already_processed:
            already_processed.append('alwaysstop')
            self.alwaysstop = value
        value = find_attr_value_('drill', node)
        if value is not None and 'drill' not in already_processed:
            already_processed.append('drill')
            self.drill = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class via
class polygon(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, layer=None, thermals=None, spacing=None, orphans=None, isolate=None, pour=None, width=None, rank=None, vertex=None):
        self.layer = _cast(None, layer)
        self.thermals = _cast(None, thermals)
        self.spacing = _cast(None, spacing)
        self.orphans = _cast(None, orphans)
        self.isolate = _cast(None, isolate)
        self.pour = _cast(None, pour)
        self.width = _cast(None, width)
        self.rank = _cast(None, rank)
        self.vertex = vertex
    def factory(*args_, **kwargs_):
        if polygon.subclass:
            return polygon.subclass(*args_, **kwargs_)
        else:
            return polygon(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_vertex(self): return self.vertex
    def set_vertex(self, vertex): self.vertex = vertex
    def get_layer(self): return self.layer
    def set_layer(self, layer): self.layer = layer
    def get_thermals(self): return self.thermals
    def set_thermals(self, thermals): self.thermals = thermals
    def get_spacing(self): return self.spacing
    def set_spacing(self, spacing): self.spacing = spacing
    def get_orphans(self): return self.orphans
    def set_orphans(self, orphans): self.orphans = orphans
    def get_isolate(self): return self.isolate
    def set_isolate(self, isolate): self.isolate = isolate
    def get_pour(self): return self.pour
    def set_pour(self, pour): self.pour = pour
    def get_width(self): return self.width
    def set_width(self, width): self.width = width
    def get_rank(self): return self.rank
    def set_rank(self, rank): self.rank = rank
    def export(self, outfile, level, namespace_='t:', name_='polygon', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='polygon')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='polygon'):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
        if self.thermals is not None and 'thermals' not in already_processed:
            already_processed.append('thermals')
            outfile.write(' thermals=%s' % (self.gds_format_string(quote_attrib(self.thermals).encode(ExternalEncoding), input_name='thermals'), ))
        if self.spacing is not None and 'spacing' not in already_processed:
            already_processed.append('spacing')
            outfile.write(' spacing=%s' % (self.gds_format_string(quote_attrib(self.spacing).encode(ExternalEncoding), input_name='spacing'), ))
        if self.orphans is not None and 'orphans' not in already_processed:
            already_processed.append('orphans')
            outfile.write(' orphans=%s' % (self.gds_format_string(quote_attrib(self.orphans).encode(ExternalEncoding), input_name='orphans'), ))
        if self.isolate is not None and 'isolate' not in already_processed:
            already_processed.append('isolate')
            outfile.write(' isolate=%s' % (self.gds_format_string(quote_attrib(self.isolate).encode(ExternalEncoding), input_name='isolate'), ))
        if self.pour is not None and 'pour' not in already_processed:
            already_processed.append('pour')
            outfile.write(' pour=%s' % (self.gds_format_string(quote_attrib(self.pour).encode(ExternalEncoding), input_name='pour'), ))
        if self.width is not None and 'width' not in already_processed:
            already_processed.append('width')
            outfile.write(' width=%s' % (self.gds_format_string(quote_attrib(self.width).encode(ExternalEncoding), input_name='width'), ))
        if self.rank is not None and 'rank' not in already_processed:
            already_processed.append('rank')
            outfile.write(' rank=%s' % (self.gds_format_string(quote_attrib(self.rank).encode(ExternalEncoding), input_name='rank'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='polygon', fromsubclass_=False):
        if self.vertex is not None:
            self.vertex.export(outfile, level, namespace_, name_='vertex', )
    def hasContent_(self):
        if (
            self.vertex is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='polygon'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            showIndent(outfile, level)
            outfile.write('layer = "%s",\n' % (self.layer,))
        if self.thermals is not None and 'thermals' not in already_processed:
            already_processed.append('thermals')
            showIndent(outfile, level)
            outfile.write('thermals = "%s",\n' % (self.thermals,))
        if self.spacing is not None and 'spacing' not in already_processed:
            already_processed.append('spacing')
            showIndent(outfile, level)
            outfile.write('spacing = "%s",\n' % (self.spacing,))
        if self.orphans is not None and 'orphans' not in already_processed:
            already_processed.append('orphans')
            showIndent(outfile, level)
            outfile.write('orphans = "%s",\n' % (self.orphans,))
        if self.isolate is not None and 'isolate' not in already_processed:
            already_processed.append('isolate')
            showIndent(outfile, level)
            outfile.write('isolate = "%s",\n' % (self.isolate,))
        if self.pour is not None and 'pour' not in already_processed:
            already_processed.append('pour')
            showIndent(outfile, level)
            outfile.write('pour = "%s",\n' % (self.pour,))
        if self.width is not None and 'width' not in already_processed:
            already_processed.append('width')
            showIndent(outfile, level)
            outfile.write('width = "%s",\n' % (self.width,))
        if self.rank is not None and 'rank' not in already_processed:
            already_processed.append('rank')
            showIndent(outfile, level)
            outfile.write('rank = "%s",\n' % (self.rank,))
    def exportLiteralChildren(self, outfile, level, name_):
        if self.vertex is not None:
            showIndent(outfile, level)
            outfile.write('vertex=model_.vertex(\n')
            self.vertex.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('layer', node)
        if value is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            self.layer = value
        value = find_attr_value_('thermals', node)
        if value is not None and 'thermals' not in already_processed:
            already_processed.append('thermals')
            self.thermals = value
        value = find_attr_value_('spacing', node)
        if value is not None and 'spacing' not in already_processed:
            already_processed.append('spacing')
            self.spacing = value
        value = find_attr_value_('orphans', node)
        if value is not None and 'orphans' not in already_processed:
            already_processed.append('orphans')
            self.orphans = value
        value = find_attr_value_('isolate', node)
        if value is not None and 'isolate' not in already_processed:
            already_processed.append('isolate')
            self.isolate = value
        value = find_attr_value_('pour', node)
        if value is not None and 'pour' not in already_processed:
            already_processed.append('pour')
            self.pour = value
        value = find_attr_value_('width', node)
        if value is not None and 'width' not in already_processed:
            already_processed.append('width')
            self.width = value
        value = find_attr_value_('rank', node)
        if value is not None and 'rank' not in already_processed:
            already_processed.append('rank')
            self.rank = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'vertex':
            obj_ = vertex.factory()
            obj_.build(child_)
            self.set_vertex(obj_)
# end class polygon
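A minimal sketch of what `polygon.build()` does under the hood, using only `xml.etree.ElementTree`. Here `find_attr_value_` is assumed to behave like `Element.get()` for unqualified attribute names, and the sample XML is illustrative, not from the module:

```python
import xml.etree.ElementTree as ET

xml_src = ('<polygon width="0.127" layer="21" pour="solid">'
           '<vertex x="0" y="0"/><vertex x="10" y="0"/></polygon>')
node = ET.fromstring(xml_src)

# Attribute pass: mirrors buildAttributes, first occurrence wins.
attrs = {}
for name in ('layer', 'pour', 'width'):
    value = node.get(name)
    if value is not None and name not in attrs:
        attrs[name] = value

# Child pass: mirrors buildChildren dispatching on the child tag name.
vertices = [(child.get('x'), child.get('y'))
            for child in node if child.tag == 'vertex']

print(attrs['width'], len(vertices))  # → 0.127 2
```

The generated class keeps only a single `vertex` reference (the schema it was generated from apparently allows one), so repeated `<vertex>` children would overwrite each other via `set_vertex`.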
class vertex(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, y=None, x=None, curve=None):
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        self.curve = _cast(None, curve)
    def factory(*args_, **kwargs_):
        if vertex.subclass:
            return vertex.subclass(*args_, **kwargs_)
        else:
            return vertex(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_curve(self): return self.curve
    def set_curve(self, curve): self.curve = curve
    def export(self, outfile, level, namespace_='t:', name_='vertex', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='vertex')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='vertex'):
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.curve is not None and 'curve' not in already_processed:
            already_processed.append('curve')
            outfile.write(' curve=%s' % (self.gds_format_string(quote_attrib(self.curve).encode(ExternalEncoding), input_name='curve'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='vertex', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (

            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='vertex'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.curve is not None and 'curve' not in already_processed:
            already_processed.append('curve')
            showIndent(outfile, level)
            outfile.write('curve = "%s",\n' % (self.curve,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('curve', node)
        if value is not None and 'curve' not in already_processed:
            already_processed.append('curve')
            self.curve = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class vertex
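The `exportAttributes` pattern above can be sketched in isolation. `quote_attrib` in the generated code is assumed to behave like `xml.sax.saxutils.quoteattr` (value in, quoted string out); `export_vertex` is an illustrative stand-in, not a function from this module:

```python
from xml.sax.saxutils import quoteattr
import io

def export_vertex(out, x=None, y=None, curve=None):
    # Emit only the attributes that are set, in the generated code's
    # fixed order (y, x, curve), then self-close the element.
    out.write('<t:vertex')
    for name, value in (('y', y), ('x', x), ('curve', curve)):
        if value is not None:
            out.write(' %s=%s' % (name, quoteattr(str(value))))
    out.write('/>\n')

buf = io.StringIO()
export_vertex(buf, x='10.16', y='2.54')
print(buf.getvalue())  # → <t:vertex y="2.54" x="10.16"/>
```

Note that `hasContent_` on `vertex` is always false (the condition is an empty tuple), so the element is always written in the self-closing form.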
class pin(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, function=None, direction=None, name=None, visible=None, length=None, y=None, x=None, rot=None, swaplevel=None):
        self.function = _cast(None, function)
        self.direction = _cast(None, direction)
        self.name = _cast(None, name)
        self.visible = _cast(None, visible)
        self.length = _cast(None, length)
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        self.rot = _cast(None, rot)
        self.swaplevel = _cast(None, swaplevel)
    def factory(*args_, **kwargs_):
        if pin.subclass:
            return pin.subclass(*args_, **kwargs_)
        else:
            return pin(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_function(self): return self.function
    def set_function(self, function): self.function = function
    def get_direction(self): return self.direction
    def set_direction(self, direction): self.direction = direction
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_visible(self): return self.visible
    def set_visible(self, visible): self.visible = visible
    def get_length(self): return self.length
    def set_length(self, length): self.length = length
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_rot(self): return self.rot
    def set_rot(self, rot): self.rot = rot
    def get_swaplevel(self): return self.swaplevel
    def set_swaplevel(self, swaplevel): self.swaplevel = swaplevel
    def export(self, outfile, level, namespace_='t:', name_='pin', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='pin')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='pin'):
        if self.function is not None and 'function' not in already_processed:
            already_processed.append('function')
            outfile.write(' function=%s' % (self.gds_format_string(quote_attrib(self.function).encode(ExternalEncoding), input_name='function'), ))
        if self.direction is not None and 'direction' not in already_processed:
            already_processed.append('direction')
            outfile.write(' direction=%s' % (self.gds_format_string(quote_attrib(self.direction).encode(ExternalEncoding), input_name='direction'), ))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.visible is not None and 'visible' not in already_processed:
            already_processed.append('visible')
            outfile.write(' visible=%s' % (self.gds_format_string(quote_attrib(self.visible).encode(ExternalEncoding), input_name='visible'), ))
        if self.length is not None and 'length' not in already_processed:
            already_processed.append('length')
            outfile.write(' length=%s' % (self.gds_format_string(quote_attrib(self.length).encode(ExternalEncoding), input_name='length'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
        if self.swaplevel is not None and 'swaplevel' not in already_processed:
            already_processed.append('swaplevel')
            outfile.write(' swaplevel=%s' % (self.gds_format_string(quote_attrib(self.swaplevel).encode(ExternalEncoding), input_name='swaplevel'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='pin', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (

            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='pin'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.function is not None and 'function' not in already_processed:
            already_processed.append('function')
            showIndent(outfile, level)
            outfile.write('function = "%s",\n' % (self.function,))
        if self.direction is not None and 'direction' not in already_processed:
            already_processed.append('direction')
            showIndent(outfile, level)
            outfile.write('direction = "%s",\n' % (self.direction,))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.visible is not None and 'visible' not in already_processed:
            already_processed.append('visible')
            showIndent(outfile, level)
            outfile.write('visible = "%s",\n' % (self.visible,))
        if self.length is not None and 'length' not in already_processed:
            already_processed.append('length')
            showIndent(outfile, level)
            outfile.write('length = "%s",\n' % (self.length,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            showIndent(outfile, level)
            outfile.write('rot = "%s",\n' % (self.rot,))
        if self.swaplevel is not None and 'swaplevel' not in already_processed:
            already_processed.append('swaplevel')
            showIndent(outfile, level)
            outfile.write('swaplevel = "%s",\n' % (self.swaplevel,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('function', node)
        if value is not None and 'function' not in already_processed:
            already_processed.append('function')
            self.function = value
        value = find_attr_value_('direction', node)
        if value is not None and 'direction' not in already_processed:
            already_processed.append('direction')
            self.direction = value
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('visible', node)
        if value is not None and 'visible' not in already_processed:
            already_processed.append('visible')
            self.visible = value
        value = find_attr_value_('length', node)
        if value is not None and 'length' not in already_processed:
            already_processed.append('length')
            self.length = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('rot', node)
        if value is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            self.rot = value
        value = find_attr_value_('swaplevel', node)
        if value is not None and 'swaplevel' not in already_processed:
            already_processed.append('swaplevel')
            self.swaplevel = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class pin
class part(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, name=None, deviceset=None, value=None, library=None, device=None, technology=None, attribute=None, variant=None):
        self.name = _cast(None, name)
        self.deviceset = _cast(None, deviceset)
        self.value = _cast(None, value)
        self.library = _cast(None, library)
        self.device = _cast(None, device)
        self.technology = _cast(None, technology)
        if attribute is None:
            self.attribute = []
        else:
            self.attribute = attribute
        if variant is None:
            self.variant = []
        else:
            self.variant = variant
    def factory(*args_, **kwargs_):
        if part.subclass:
            return part.subclass(*args_, **kwargs_)
        else:
            return part(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_attribute(self): return self.attribute
    def set_attribute(self, attribute): self.attribute = attribute
    def add_attribute(self, value): self.attribute.append(value)
    def insert_attribute(self, index, value): self.attribute[index] = value
    def get_variant(self): return self.variant
    def set_variant(self, variant): self.variant = variant
    def add_variant(self, value): self.variant.append(value)
    def insert_variant(self, index, value): self.variant[index] = value
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_deviceset(self): return self.deviceset
    def set_deviceset(self, deviceset): self.deviceset = deviceset
    def get_value(self): return self.value
    def set_value(self, value): self.value = value
    def get_library(self): return self.library
    def set_library(self, library): self.library = library
    def get_device(self): return self.device
    def set_device(self, device): self.device = device
    def get_technology(self): return self.technology
    def set_technology(self, technology): self.technology = technology
    def export(self, outfile, level, namespace_='t:', name_='part', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='part')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='part'):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.deviceset is not None and 'deviceset' not in already_processed:
            already_processed.append('deviceset')
            outfile.write(' deviceset=%s' % (self.gds_format_string(quote_attrib(self.deviceset).encode(ExternalEncoding), input_name='deviceset'), ))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
        if self.library is not None and 'library' not in already_processed:
            already_processed.append('library')
            outfile.write(' library=%s' % (self.gds_format_string(quote_attrib(self.library).encode(ExternalEncoding), input_name='library'), ))
        if self.device is not None and 'device' not in already_processed:
            already_processed.append('device')
            outfile.write(' device=%s' % (self.gds_format_string(quote_attrib(self.device).encode(ExternalEncoding), input_name='device'), ))
        if self.technology is not None and 'technology' not in already_processed:
            already_processed.append('technology')
            outfile.write(' technology=%s' % (self.gds_format_string(quote_attrib(self.technology).encode(ExternalEncoding), input_name='technology'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='part', fromsubclass_=False):
        for attribute_ in self.attribute:
            attribute_.export(outfile, level, namespace_, name_='attribute')
        for variant_ in self.variant:
            variant_.export(outfile, level, namespace_, name_='variant')
    def hasContent_(self):
        if (
            self.attribute or
            self.variant
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='part'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.deviceset is not None and 'deviceset' not in already_processed:
            already_processed.append('deviceset')
            showIndent(outfile, level)
            outfile.write('deviceset = "%s",\n' % (self.deviceset,))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            showIndent(outfile, level)
            outfile.write('value = "%s",\n' % (self.value,))
        if self.library is not None and 'library' not in already_processed:
            already_processed.append('library')
            showIndent(outfile, level)
            outfile.write('library = "%s",\n' % (self.library,))
        if self.device is not None and 'device' not in already_processed:
            already_processed.append('device')
            showIndent(outfile, level)
            outfile.write('device = "%s",\n' % (self.device,))
        if self.technology is not None and 'technology' not in already_processed:
            already_processed.append('technology')
            showIndent(outfile, level)
            outfile.write('technology = "%s",\n' % (self.technology,))
    def exportLiteralChildren(self, outfile, level, name_):
        showIndent(outfile, level)
        outfile.write('attribute=[\n')
        level += 1
        for attribute_ in self.attribute:
            showIndent(outfile, level)
            outfile.write('model_.attribute(\n')
            attribute_.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
        level -= 1
        showIndent(outfile, level)
        outfile.write('],\n')
        showIndent(outfile, level)
        outfile.write('variant=[\n')
        level += 1
        for variant_ in self.variant:
            showIndent(outfile, level)
            outfile.write('model_.variant(\n')
            variant_.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
        level -= 1
        showIndent(outfile, level)
        outfile.write('],\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('deviceset', node)
        if value is not None and 'deviceset' not in already_processed:
            already_processed.append('deviceset')
            self.deviceset = value
        value = find_attr_value_('value', node)
        if value is not None and 'value' not in already_processed:
            already_processed.append('value')
            self.value = value
        value = find_attr_value_('library', node)
        if value is not None and 'library' not in already_processed:
            already_processed.append('library')
            self.library = value
        value = find_attr_value_('device', node)
        if value is not None and 'device' not in already_processed:
            already_processed.append('device')
            self.device = value
        value = find_attr_value_('technology', node)
        if value is not None and 'technology' not in already_processed:
            already_processed.append('technology')
            self.technology = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'attribute':
            obj_ = attribute.factory()
            obj_.build(child_)
            self.attribute.append(obj_)
        elif nodeName_ == 'variant':
            obj_ = variant.factory()
            obj_.build(child_)
            self.variant.append(obj_)
# end class part
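Every exporter above threads an `already_processed` list so that an attribute written once is never written again, which matters when a subclass exporter re-enters its parent's `exportAttributes` with the same list. A minimal sketch of the guard (the names `emit_attrs` and its arguments are illustrative, not from this module):

```python
def emit_attrs(values, order, already_processed):
    """Append each attribute name to already_processed on first emit,
    and skip anything already recorded there."""
    out = []
    for name in order:
        if values.get(name) is not None and name not in already_processed:
            already_processed.append(name)
            out.append('%s=%r' % (name, values[name]))
    return out

seen = []
first = emit_attrs({'name': 'R1', 'value': '10k'}, ('name', 'value'), seen)
# A second pass over the same list emits nothing: the guard deduplicates.
second = emit_attrs({'name': 'R1', 'value': '10k'}, ('name', 'value'), seen)
print(first, second)  # → ["name='R1'", "value='10k'"] []
```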
class instance(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, smashed=None, part=None, x=None, y=None, gate=None, rot=None, attribute=None):
        self.smashed = _cast(None, smashed)
        self.part = _cast(None, part)
        self.x = _cast(None, x)
        self.y = _cast(None, y)
        self.gate = _cast(None, gate)
        self.rot = _cast(None, rot)
        self.attribute = attribute
    def factory(*args_, **kwargs_):
        if instance.subclass:
            return instance.subclass(*args_, **kwargs_)
        else:
            return instance(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_attribute(self): return self.attribute
    def set_attribute(self, attribute): self.attribute = attribute
    def get_smashed(self): return self.smashed
    def set_smashed(self, smashed): self.smashed = smashed
    def get_part(self): return self.part
    def set_part(self, part): self.part = part
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_gate(self): return self.gate
    def set_gate(self, gate): self.gate = gate
    def get_rot(self): return self.rot
    def set_rot(self, rot): self.rot = rot
    def export(self, outfile, level, namespace_='t:', name_='instance', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='instance')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='instance'):
        if self.smashed is not None and 'smashed' not in already_processed:
            already_processed.append('smashed')
            outfile.write(' smashed=%s' % (self.gds_format_string(quote_attrib(self.smashed).encode(ExternalEncoding), input_name='smashed'), ))
        if self.part is not None and 'part' not in already_processed:
            already_processed.append('part')
            outfile.write(' part=%s' % (self.gds_format_string(quote_attrib(self.part).encode(ExternalEncoding), input_name='part'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.gate is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            outfile.write(' gate=%s' % (self.gds_format_string(quote_attrib(self.gate).encode(ExternalEncoding), input_name='gate'), ))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='instance', fromsubclass_=False):
        if self.attribute is not None:
            self.attribute.export(outfile, level, namespace_, name_='attribute', )
    def hasContent_(self):
        if (
            self.attribute is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='instance'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.smashed is not None and 'smashed' not in already_processed:
            already_processed.append('smashed')
            showIndent(outfile, level)
            outfile.write('smashed = "%s",\n' % (self.smashed,))
        if self.part is not None and 'part' not in already_processed:
            already_processed.append('part')
            showIndent(outfile, level)
            outfile.write('part = "%s",\n' % (self.part,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.gate is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            showIndent(outfile, level)
            outfile.write('gate = "%s",\n' % (self.gate,))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            showIndent(outfile, level)
            outfile.write('rot = "%s",\n' % (self.rot,))
    def exportLiteralChildren(self, outfile, level, name_):
        if self.attribute is not None:
            showIndent(outfile, level)
            outfile.write('attribute=model_.attribute(\n')
            self.attribute.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('smashed', node)
        if value is not None and 'smashed' not in already_processed:
            already_processed.append('smashed')
            self.smashed = value
        value = find_attr_value_('part', node)
        if value is not None and 'part' not in already_processed:
            already_processed.append('part')
            self.part = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('gate', node)
        if value is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            self.gate = value
        value = find_attr_value_('rot', node)
        if value is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            self.rot = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'attribute':
            obj_ = attribute.factory()
            obj_.build(child_)
            self.set_attribute(obj_)
# end class instance
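Each generated class carries a `subclass` slot and a `factory()` staticmethod so user code can swap in a custom subclass without touching any generated call site (`obj_ = instance.factory()` and the like). A standalone sketch of that hook, with illustrative class names:

```python
class Base:
    subclass = None
    def factory(*args, **kwargs):
        # Redirect construction to the registered subclass, if any.
        if Base.subclass:
            return Base.subclass(*args, **kwargs)
        return Base(*args, **kwargs)
    factory = staticmethod(factory)

class Custom(Base):
    pass

a = Base.factory()       # no hook registered: plain Base
Base.subclass = Custom   # register the override
b = Base.factory()       # same call site now yields Custom
print(type(a).__name__, type(b).__name__)  # → Base Custom
```

This is why the parser's `buildChildren` methods call `vertex.factory()`, `attribute.factory()`, etc. rather than the constructors directly.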
class label(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, layer=None, xref=None, ratio=None, y=None, x=None, font=None, rot=None, size=None):
        self.layer = _cast(None, layer)
        self.xref = _cast(None, xref)
        self.ratio = _cast(None, ratio)
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        self.font = _cast(None, font)
        self.rot = _cast(None, rot)
        self.size = _cast(None, size)
        pass
    def factory(*args_, **kwargs_):
        if label.subclass:
            return label.subclass(*args_, **kwargs_)
        else:
            return label(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_layer(self): return self.layer
    def set_layer(self, layer): self.layer = layer
    def get_xref(self): return self.xref
    def set_xref(self, xref): self.xref = xref
    def get_ratio(self): return self.ratio
    def set_ratio(self, ratio): self.ratio = ratio
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_font(self): return self.font
    def set_font(self, font): self.font = font
    def get_rot(self): return self.rot
    def set_rot(self, rot): self.rot = rot
    def get_size(self): return self.size
    def set_size(self, size): self.size = size
    def export(self, outfile, level, namespace_='t:', name_='label', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='label')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='label'):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
        if self.xref is not None and 'xref' not in already_processed:
            already_processed.append('xref')
            outfile.write(' xref=%s' % (self.gds_format_string(quote_attrib(self.xref).encode(ExternalEncoding), input_name='xref'), ))
        if self.ratio is not None and 'ratio' not in already_processed:
            already_processed.append('ratio')
            outfile.write(' ratio=%s' % (self.gds_format_string(quote_attrib(self.ratio).encode(ExternalEncoding), input_name='ratio'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.font is not None and 'font' not in already_processed:
            already_processed.append('font')
            outfile.write(' font=%s' % (self.gds_format_string(quote_attrib(self.font).encode(ExternalEncoding), input_name='font'), ))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
        if self.size is not None and 'size' not in already_processed:
            already_processed.append('size')
            outfile.write(' size=%s' % (self.gds_format_string(quote_attrib(self.size).encode(ExternalEncoding), input_name='size'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='label', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='label'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            showIndent(outfile, level)
            outfile.write('layer = "%s",\n' % (self.layer,))
        if self.xref is not None and 'xref' not in already_processed:
            already_processed.append('xref')
            showIndent(outfile, level)
            outfile.write('xref = "%s",\n' % (self.xref,))
        if self.ratio is not None and 'ratio' not in already_processed:
            already_processed.append('ratio')
            showIndent(outfile, level)
            outfile.write('ratio = "%s",\n' % (self.ratio,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.font is not None and 'font' not in already_processed:
            already_processed.append('font')
            showIndent(outfile, level)
            outfile.write('font = "%s",\n' % (self.font,))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            showIndent(outfile, level)
            outfile.write('rot = "%s",\n' % (self.rot,))
        if self.size is not None and 'size' not in already_processed:
            already_processed.append('size')
            showIndent(outfile, level)
            outfile.write('size = "%s",\n' % (self.size,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('layer', node)
        if value is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            self.layer = value
        value = find_attr_value_('xref', node)
        if value is not None and 'xref' not in already_processed:
            already_processed.append('xref')
            self.xref = value
        value = find_attr_value_('ratio', node)
        if value is not None and 'ratio' not in already_processed:
            already_processed.append('ratio')
            self.ratio = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('font', node)
        if value is not None and 'font' not in already_processed:
            already_processed.append('font')
            self.font = value
        value = find_attr_value_('rot', node)
        if value is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            self.rot = value
        value = find_attr_value_('size', node)
        if value is not None and 'size' not in already_processed:
            already_processed.append('size')
            self.size = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class label
class junction(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, y=None, x=None):
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        pass
    def factory(*args_, **kwargs_):
        if junction.subclass:
            return junction.subclass(*args_, **kwargs_)
        else:
            return junction(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def export(self, outfile, level, namespace_='t:', name_='junction', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='junction')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='junction'):
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='junction', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='junction'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class junction
class connect(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, gate=None, route=None, pad=None, pin=None):
        self.gate = _cast(None, gate)
        self.route = _cast(None, route)
        self.pad = _cast(None, pad)
        self.pin = _cast(None, pin)
        pass
    def factory(*args_, **kwargs_):
        if connect.subclass:
            return connect.subclass(*args_, **kwargs_)
        else:
            return connect(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_gate(self): return self.gate
    def set_gate(self, gate): self.gate = gate
    def get_route(self): return self.route
    def set_route(self, route): self.route = route
    def get_pad(self): return self.pad
    def set_pad(self, pad): self.pad = pad
    def get_pin(self): return self.pin
    def set_pin(self, pin): self.pin = pin
    def export(self, outfile, level, namespace_='t:', name_='connect', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='connect')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='connect'):
        if self.gate is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            outfile.write(' gate=%s' % (self.gds_format_string(quote_attrib(self.gate).encode(ExternalEncoding), input_name='gate'), ))
        if self.route is not None and 'route' not in already_processed:
            already_processed.append('route')
            outfile.write(' route=%s' % (self.gds_format_string(quote_attrib(self.route).encode(ExternalEncoding), input_name='route'), ))
        if self.pad is not None and 'pad' not in already_processed:
            already_processed.append('pad')
            outfile.write(' pad=%s' % (self.gds_format_string(quote_attrib(self.pad).encode(ExternalEncoding), input_name='pad'), ))
        if self.pin is not None and 'pin' not in already_processed:
            already_processed.append('pin')
            outfile.write(' pin=%s' % (self.gds_format_string(quote_attrib(self.pin).encode(ExternalEncoding), input_name='pin'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='connect', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='connect'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.gate is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            showIndent(outfile, level)
            outfile.write('gate = "%s",\n' % (self.gate,))
        if self.route is not None and 'route' not in already_processed:
            already_processed.append('route')
            showIndent(outfile, level)
            outfile.write('route = "%s",\n' % (self.route,))
        if self.pad is not None and 'pad' not in already_processed:
            already_processed.append('pad')
            showIndent(outfile, level)
            outfile.write('pad = "%s",\n' % (self.pad,))
        if self.pin is not None and 'pin' not in already_processed:
            already_processed.append('pin')
            showIndent(outfile, level)
            outfile.write('pin = "%s",\n' % (self.pin,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('gate', node)
        if value is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            self.gate = value
        value = find_attr_value_('route', node)
        if value is not None and 'route' not in already_processed:
            already_processed.append('route')
            self.route = value
        value = find_attr_value_('pad', node)
        if value is not None and 'pad' not in already_processed:
            already_processed.append('pad')
            self.pad = value
        value = find_attr_value_('pin', node)
        if value is not None and 'pin' not in already_processed:
            already_processed.append('pin')
            self.pin = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class connect
class technology(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, name=None, attribute=None):
        self.name = _cast(None, name)
        self.attribute = attribute
    def factory(*args_, **kwargs_):
        if technology.subclass:
            return technology.subclass(*args_, **kwargs_)
        else:
            return technology(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_attribute(self): return self.attribute
    def set_attribute(self, attribute): self.attribute = attribute
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def export(self, outfile, level, namespace_='t:', name_='technology', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='technology')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='technology'):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='technology', fromsubclass_=False):
        if self.attribute is not None:
            self.attribute.export(outfile, level, namespace_, name_='attribute', )
    def hasContent_(self):
        if (
            self.attribute is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='technology'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
    def exportLiteralChildren(self, outfile, level, name_):
        if self.attribute is not None:
            showIndent(outfile, level)
            outfile.write('attribute=model_.attribute(\n')
            self.attribute.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'attribute':
            obj_ = attribute.factory()
            obj_.build(child_)
            self.set_attribute(obj_)
# end class technology
class attribute(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, layer=None, ratio=None, name=None, value=None, y=None, x=None, constant=None, font=None, rot=None, display=None, size=None):
        self.layer = _cast(None, layer)
        self.ratio = _cast(None, ratio)
        self.name = _cast(None, name)
        self.value = _cast(None, value)
        self.y = _cast(None, y)
        self.x = _cast(None, x)
        self.constant = _cast(None, constant)
        self.font = _cast(None, font)
        self.rot = _cast(None, rot)
        self.display = _cast(None, display)
        self.size = _cast(None, size)
        pass
    def factory(*args_, **kwargs_):
        if attribute.subclass:
            return attribute.subclass(*args_, **kwargs_)
        else:
            return attribute(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_layer(self): return self.layer
    def set_layer(self, layer): self.layer = layer
    def get_ratio(self): return self.ratio
    def set_ratio(self, ratio): self.ratio = ratio
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_value(self): return self.value
    def set_value(self, value): self.value = value
    def get_y(self): return self.y
    def set_y(self, y): self.y = y
    def get_x(self): return self.x
    def set_x(self, x): self.x = x
    def get_constant(self): return self.constant
    def set_constant(self, constant): self.constant = constant
    def get_font(self): return self.font
    def set_font(self, font): self.font = font
    def get_rot(self): return self.rot
    def set_rot(self, rot): self.rot = rot
    def get_display(self): return self.display
    def set_display(self, display): self.display = display
    def get_size(self): return self.size
    def set_size(self, size): self.size = size
    def export(self, outfile, level, namespace_='t:', name_='attribute', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='attribute')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='attribute'):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            outfile.write(' layer=%s' % (self.gds_format_string(quote_attrib(self.layer).encode(ExternalEncoding), input_name='layer'), ))
        if self.ratio is not None and 'ratio' not in already_processed:
            already_processed.append('ratio')
            outfile.write(' ratio=%s' % (self.gds_format_string(quote_attrib(self.ratio).encode(ExternalEncoding), input_name='ratio'), ))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            outfile.write(' y=%s' % (self.gds_format_string(quote_attrib(self.y).encode(ExternalEncoding), input_name='y'), ))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            outfile.write(' x=%s' % (self.gds_format_string(quote_attrib(self.x).encode(ExternalEncoding), input_name='x'), ))
        if self.constant is not None and 'constant' not in already_processed:
            already_processed.append('constant')
            outfile.write(' constant=%s' % (self.gds_format_string(quote_attrib(self.constant).encode(ExternalEncoding), input_name='constant'), ))
        if self.font is not None and 'font' not in already_processed:
            already_processed.append('font')
            outfile.write(' font=%s' % (self.gds_format_string(quote_attrib(self.font).encode(ExternalEncoding), input_name='font'), ))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            outfile.write(' rot=%s' % (self.gds_format_string(quote_attrib(self.rot).encode(ExternalEncoding), input_name='rot'), ))
        if self.display is not None and 'display' not in already_processed:
            already_processed.append('display')
            outfile.write(' display=%s' % (self.gds_format_string(quote_attrib(self.display).encode(ExternalEncoding), input_name='display'), ))
        if self.size is not None and 'size' not in already_processed:
            already_processed.append('size')
            outfile.write(' size=%s' % (self.gds_format_string(quote_attrib(self.size).encode(ExternalEncoding), input_name='size'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='attribute', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='attribute'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.layer is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            showIndent(outfile, level)
            outfile.write('layer = "%s",\n' % (self.layer,))
        if self.ratio is not None and 'ratio' not in already_processed:
            already_processed.append('ratio')
            showIndent(outfile, level)
            outfile.write('ratio = "%s",\n' % (self.ratio,))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            showIndent(outfile, level)
            outfile.write('value = "%s",\n' % (self.value,))
        if self.y is not None and 'y' not in already_processed:
            already_processed.append('y')
            showIndent(outfile, level)
            outfile.write('y = "%s",\n' % (self.y,))
        if self.x is not None and 'x' not in already_processed:
            already_processed.append('x')
            showIndent(outfile, level)
            outfile.write('x = "%s",\n' % (self.x,))
        if self.constant is not None and 'constant' not in already_processed:
            already_processed.append('constant')
            showIndent(outfile, level)
            outfile.write('constant = "%s",\n' % (self.constant,))
        if self.font is not None and 'font' not in already_processed:
            already_processed.append('font')
            showIndent(outfile, level)
            outfile.write('font = "%s",\n' % (self.font,))
        if self.rot is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            showIndent(outfile, level)
            outfile.write('rot = "%s",\n' % (self.rot,))
        if self.display is not None and 'display' not in already_processed:
            already_processed.append('display')
            showIndent(outfile, level)
            outfile.write('display = "%s",\n' % (self.display,))
        if self.size is not None and 'size' not in already_processed:
            already_processed.append('size')
            showIndent(outfile, level)
            outfile.write('size = "%s",\n' % (self.size,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('layer', node)
        if value is not None and 'layer' not in already_processed:
            already_processed.append('layer')
            self.layer = value
        value = find_attr_value_('ratio', node)
        if value is not None and 'ratio' not in already_processed:
            already_processed.append('ratio')
            self.ratio = value
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('value', node)
        if value is not None and 'value' not in already_processed:
            already_processed.append('value')
            self.value = value
        value = find_attr_value_('y', node)
        if value is not None and 'y' not in already_processed:
            already_processed.append('y')
            self.y = value
        value = find_attr_value_('x', node)
        if value is not None and 'x' not in already_processed:
            already_processed.append('x')
            self.x = value
        value = find_attr_value_('constant', node)
        if value is not None and 'constant' not in already_processed:
            already_processed.append('constant')
            self.constant = value
        value = find_attr_value_('font', node)
        if value is not None and 'font' not in already_processed:
            already_processed.append('font')
            self.font = value
        value = find_attr_value_('rot', node)
        if value is not None and 'rot' not in already_processed:
            already_processed.append('rot')
            self.rot = value
        value = find_attr_value_('display', node)
        if value is not None and 'display' not in already_processed:
            already_processed.append('display')
            self.display = value
        value = find_attr_value_('size', node)
        if value is not None and 'size' not in already_processed:
            already_processed.append('size')
            self.size = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class attribute
class pinref(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, gate=None, part=None, pin=None):
        self.gate = _cast(None, gate)
        self.part = _cast(None, part)
        self.pin = _cast(None, pin)
        pass
    def factory(*args_, **kwargs_):
        if pinref.subclass:
            return pinref.subclass(*args_, **kwargs_)
        else:
            return pinref(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_gate(self): return self.gate
    def set_gate(self, gate): self.gate = gate
    def get_part(self): return self.part
    def set_part(self, part): self.part = part
    def get_pin(self): return self.pin
    def set_pin(self, pin): self.pin = pin
    def export(self, outfile, level, namespace_='t:', name_='pinref', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='pinref')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='pinref'):
        if self.gate is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            outfile.write(' gate=%s' % (self.gds_format_string(quote_attrib(self.gate).encode(ExternalEncoding), input_name='gate'), ))
        if self.part is not None and 'part' not in already_processed:
            already_processed.append('part')
            outfile.write(' part=%s' % (self.gds_format_string(quote_attrib(self.part).encode(ExternalEncoding), input_name='part'), ))
        if self.pin is not None and 'pin' not in already_processed:
            already_processed.append('pin')
            outfile.write(' pin=%s' % (self.gds_format_string(quote_attrib(self.pin).encode(ExternalEncoding), input_name='pin'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='pinref', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='pinref'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.gate is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            showIndent(outfile, level)
            outfile.write('gate = "%s",\n' % (self.gate,))
        if self.part is not None and 'part' not in already_processed:
            already_processed.append('part')
            showIndent(outfile, level)
            outfile.write('part = "%s",\n' % (self.part,))
        if self.pin is not None and 'pin' not in already_processed:
            already_processed.append('pin')
            showIndent(outfile, level)
            outfile.write('pin = "%s",\n' % (self.pin,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('gate', node)
        if value is not None and 'gate' not in already_processed:
            already_processed.append('gate')
            self.gate = value
        value = find_attr_value_('part', node)
        if value is not None and 'part' not in already_processed:
            already_processed.append('part')
            self.part = value
        value = find_attr_value_('pin', node)
        if value is not None and 'pin' not in already_processed:
            already_processed.append('pin')
            self.pin = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class pinref
class contactref(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, route=None, pad=None, element=None):
        self.route = _cast(None, route)
        self.pad = _cast(None, pad)
        self.element = _cast(None, element)
        pass
    def factory(*args_, **kwargs_):
        if contactref.subclass:
            return contactref.subclass(*args_, **kwargs_)
        else:
            return contactref(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_route(self): return self.route
    def set_route(self, route): self.route = route
    def get_pad(self): return self.pad
    def set_pad(self, pad): self.pad = pad
    def get_element(self): return self.element
    def set_element(self, element): self.element = element
    def export(self, outfile, level, namespace_='t:', name_='contactref', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='contactref')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='contactref'):
        if self.route is not None and 'route' not in already_processed:
            already_processed.append('route')
            outfile.write(' route=%s' % (self.gds_format_string(quote_attrib(self.route).encode(ExternalEncoding), input_name='route'), ))
        if self.pad is not None and 'pad' not in already_processed:
            already_processed.append('pad')
            outfile.write(' pad=%s' % (self.gds_format_string(quote_attrib(self.pad).encode(ExternalEncoding), input_name='pad'), ))
        if self.element is not None and 'element' not in already_processed:
            already_processed.append('element')
            outfile.write(' element=%s' % (self.gds_format_string(quote_attrib(self.element).encode(ExternalEncoding), input_name='element'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='contactref', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='contactref'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.route is not None and 'route' not in already_processed:
            already_processed.append('route')
            showIndent(outfile, level)
            outfile.write('route = "%s",\n' % (self.route,))
        if self.pad is not None and 'pad' not in already_processed:
            already_processed.append('pad')
            showIndent(outfile, level)
            outfile.write('pad = "%s",\n' % (self.pad,))
        if self.element is not None and 'element' not in already_processed:
            already_processed.append('element')
            showIndent(outfile, level)
            outfile.write('element = "%s",\n' % (self.element,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('route', node)
        if value is not None and 'route' not in already_processed:
            already_processed.append('route')
            self.route = value
        value = find_attr_value_('pad', node)
        if value is not None and 'pad' not in already_processed:
            already_processed.append('pad')
            self.pad = value
        value = find_attr_value_('element', node)
        if value is not None and 'element' not in already_processed:
            already_processed.append('element')
            self.element = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class contactref
class variantdefs(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, variantdef=None):
        self.variantdef = variantdef
    def factory(*args_, **kwargs_):
        if variantdefs.subclass:
            return variantdefs.subclass(*args_, **kwargs_)
        else:
            return variantdefs(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_variantdef(self): return self.variantdef
    def set_variantdef(self, variantdef): self.variantdef = variantdef
    def export(self, outfile, level, namespace_='t:', name_='variantdefs', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='variantdefs')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='variantdefs'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='variantdefs', fromsubclass_=False):
        if self.variantdef is not None:
            self.variantdef.export(outfile, level, namespace_, name_='variantdef', )
    def hasContent_(self):
        if (
            self.variantdef is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='variantdefs'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.variantdef is not None:
            showIndent(outfile, level)
            outfile.write('variantdef=model_.variantdef(\n')
            self.variantdef.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'variantdef':
            obj_ = variantdef.factory()
            obj_.build(child_)
            self.set_variantdef(obj_)
# end class variantdefs
class settings(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, setting=None):
        self.setting = setting
    def factory(*args_, **kwargs_):
        if settings.subclass:
            return settings.subclass(*args_, **kwargs_)
        else:
            return settings(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_setting(self): return self.setting
    def set_setting(self, setting): self.setting = setting
    def export(self, outfile, level, namespace_='t:', name_='settings', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='settings')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='settings'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='settings', fromsubclass_=False):
        if self.setting is not None:
            self.setting.export(outfile, level, namespace_, name_='setting', )
    def hasContent_(self):
        if (
            self.setting is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='settings'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.setting is not None:
            showIndent(outfile, level)
            outfile.write('setting=model_.setting(\n')
            self.setting.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'setting':
            obj_ = setting.factory()
            obj_.build(child_)
            self.set_setting(obj_)
# end class settings
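The `build()`/`buildChildren()` pair used by every container class dispatches on the child's local tag name: `Tag_pattern_` strips any `{namespace}` prefix that ElementTree attaches, and the bare name selects the handler. A self-contained re-creation of that dispatch step (the `collect_children` helper is hypothetical, for illustration only):

```python
import re
import xml.etree.ElementTree as ET

# Same pattern the module uses: optional '{namespace}' prefix, then the name.
Tag_pattern_ = re.compile(r'({.*})?(.*)')

def collect_children(xml_text):
    node = ET.fromstring(xml_text)
    names = []
    for child in node:
        # groups()[-1] drops the namespace prefix, as in build() above.
        names.append(Tag_pattern_.match(child.tag).groups()[-1])
    return names

# collect_children('<settings><setting/></settings>') -> ['setting']
```

This is why `buildChildren` can compare `nodeName_ == 'setting'` directly even when the document declares a default namespace.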
class sheets(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, sheet=None):
        self.sheet = sheet
    def factory(*args_, **kwargs_):
        if sheets.subclass:
            return sheets.subclass(*args_, **kwargs_)
        else:
            return sheets(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_sheet(self): return self.sheet
    def set_sheet(self, sheet): self.sheet = sheet
    def export(self, outfile, level, namespace_='t:', name_='sheets', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='sheets')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='sheets'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='sheets', fromsubclass_=False):
        if self.sheet is not None:
            self.sheet.export(outfile, level, namespace_, name_='sheet', )
    def hasContent_(self):
        if (
            self.sheet is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='sheets'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.sheet is not None:
            showIndent(outfile, level)
            outfile.write('sheet=model_.sheet(\n')
            self.sheet.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'sheet':
            obj_ = sheet.factory()
            obj_.build(child_)
            self.set_sheet(obj_)
# end class sheets
class layers(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, layer=None):
        self.layer = layer
    def factory(*args_, **kwargs_):
        if layers.subclass:
            return layers.subclass(*args_, **kwargs_)
        else:
            return layers(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_layer(self): return self.layer
    def set_layer(self, layer): self.layer = layer
    def export(self, outfile, level, namespace_='t:', name_='layers', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='layers')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='layers'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='layers', fromsubclass_=False):
        if self.layer is not None:
            self.layer.export(outfile, level, namespace_, name_='layer', )
    def hasContent_(self):
        if (
            self.layer is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='layers'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.layer is not None:
            showIndent(outfile, level)
            outfile.write('layer=model_.layer(\n')
            self.layer.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'layer':
            obj_ = layer.factory()
            obj_.build(child_)
            self.set_layer(obj_)
# end class layers
class packages(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, package=None):
        self.package = package
    def factory(*args_, **kwargs_):
        if packages.subclass:
            return packages.subclass(*args_, **kwargs_)
        else:
            return packages(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_package(self): return self.package
    def set_package(self, package): self.package = package
    def export(self, outfile, level, namespace_='t:', name_='packages', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='packages')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='packages'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='packages', fromsubclass_=False):
        if self.package is not None:
            self.package.export(outfile, level, namespace_, name_='package', )
    def hasContent_(self):
        if (
            self.package is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='packages'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.package is not None:
            showIndent(outfile, level)
            outfile.write('package=model_.package(\n')
            self.package.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'package':
            obj_ = package.factory()
            obj_.build(child_)
            self.set_package(obj_)
# end class packages
class symbols(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, symbol=None):
        self.symbol = symbol
    def factory(*args_, **kwargs_):
        if symbols.subclass:
            return symbols.subclass(*args_, **kwargs_)
        else:
            return symbols(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_symbol(self): return self.symbol
    def set_symbol(self, symbol): self.symbol = symbol
    def export(self, outfile, level, namespace_='t:', name_='symbols', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='symbols')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='symbols'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='symbols', fromsubclass_=False):
        if self.symbol is not None:
            self.symbol.export(outfile, level, namespace_, name_='symbol', )
    def hasContent_(self):
        if (
            self.symbol is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='symbols'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.symbol is not None:
            showIndent(outfile, level)
            outfile.write('symbol=model_.symbol(\n')
            self.symbol.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'symbol':
            obj_ = symbol.factory()
            obj_.build(child_)
            self.set_symbol(obj_)
# end class symbols
class devicesets(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, deviceset=None):
        self.deviceset = deviceset
    def factory(*args_, **kwargs_):
        if devicesets.subclass:
            return devicesets.subclass(*args_, **kwargs_)
        else:
            return devicesets(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_deviceset(self): return self.deviceset
    def set_deviceset(self, deviceset): self.deviceset = deviceset
    def export(self, outfile, level, namespace_='t:', name_='devicesets', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='devicesets')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='devicesets'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='devicesets', fromsubclass_=False):
        if self.deviceset is not None:
            self.deviceset.export(outfile, level, namespace_, name_='deviceset', )
    def hasContent_(self):
        if (
            self.deviceset is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='devicesets'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.deviceset is not None:
            showIndent(outfile, level)
            outfile.write('deviceset=model_.deviceset(\n')
            self.deviceset.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'deviceset':
            obj_ = deviceset.factory()
            obj_.build(child_)
            self.set_deviceset(obj_)
# end class devicesets
class gates(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, gate=None):
        self.gate = gate
    def factory(*args_, **kwargs_):
        if gates.subclass:
            return gates.subclass(*args_, **kwargs_)
        else:
            return gates(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_gate(self): return self.gate
    def set_gate(self, gate): self.gate = gate
    def export(self, outfile, level, namespace_='t:', name_='gates', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='gates')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='gates'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='gates', fromsubclass_=False):
        if self.gate is not None:
            self.gate.export(outfile, level, namespace_, name_='gate', )
    def hasContent_(self):
        if (
            self.gate is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='gates'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.gate is not None:
            showIndent(outfile, level)
            outfile.write('gate=model_.gate(\n')
            self.gate.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'gate':
            obj_ = gate.factory()
            obj_.build(child_)
            self.set_gate(obj_)
# end class gates
class devices(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, device=None):
        self.device = device
    def factory(*args_, **kwargs_):
        if devices.subclass:
            return devices.subclass(*args_, **kwargs_)
        else:
            return devices(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_device(self): return self.device
    def set_device(self, device): self.device = device
    def export(self, outfile, level, namespace_='t:', name_='devices', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='devices')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='devices'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='devices', fromsubclass_=False):
        if self.device is not None:
            self.device.export(outfile, level, namespace_, name_='device', )
    def hasContent_(self):
        if (
            self.device is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='devices'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.device is not None:
            showIndent(outfile, level)
            outfile.write('device=model_.device(\n')
            self.device.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'device':
            obj_ = device.factory()
            obj_.build(child_)
            self.set_device(obj_)
# end class devices
class libraries(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, library=None):
        self.library = library
    def factory(*args_, **kwargs_):
        if libraries.subclass:
            return libraries.subclass(*args_, **kwargs_)
        else:
            return libraries(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_library(self): return self.library
    def set_library(self, library): self.library = library
    def export(self, outfile, level, namespace_='t:', name_='libraries', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='libraries')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='libraries'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='libraries', fromsubclass_=False):
        if self.library is not None:
            self.library.export(outfile, level, namespace_, name_='library', )
    def hasContent_(self):
        if (
            self.library is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='libraries'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.library is not None:
            showIndent(outfile, level)
            outfile.write('library=model_.library(\n')
            self.library.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'library':
            obj_ = library.factory()
            obj_.build(child_)
            self.set_library(obj_)
# end class libraries
class connects(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, connect=None):
        self.connect = connect
    def factory(*args_, **kwargs_):
        if connects.subclass:
            return connects.subclass(*args_, **kwargs_)
        else:
            return connects(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_connect(self): return self.connect
    def set_connect(self, connect): self.connect = connect
    def export(self, outfile, level, namespace_='t:', name_='connects', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='connects')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='connects'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='connects', fromsubclass_=False):
        if self.connect is not None:
            self.connect.export(outfile, level, namespace_, name_='connect', )
    def hasContent_(self):
        if (
            self.connect is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='connects'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.connect is not None:
            showIndent(outfile, level)
            outfile.write('connect=model_.connect(\n')
            self.connect.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'connect':
            obj_ = connect.factory()
            obj_.build(child_)
            self.set_connect(obj_)
# end class connects
class technologies(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, technology=None):
        self.technology = technology
    def factory(*args_, **kwargs_):
        if technologies.subclass:
            return technologies.subclass(*args_, **kwargs_)
        else:
            return technologies(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_technology(self): return self.technology
    def set_technology(self, technology): self.technology = technology
    def export(self, outfile, level, namespace_='t:', name_='technologies', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='technologies')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='technologies'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='technologies', fromsubclass_=False):
        if self.technology is not None:
            self.technology.export(outfile, level, namespace_, name_='technology', )
    def hasContent_(self):
        if (
            self.technology is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='technologies'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.technology is not None:
            showIndent(outfile, level)
            outfile.write('technology=model_.technology(\n')
            self.technology.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'technology':
            obj_ = technology.factory()
            obj_.build(child_)
            self.set_technology(obj_)
# end class technologies
class attributes(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, attribute=None):
        self.attribute = attribute
    def factory(*args_, **kwargs_):
        if attributes.subclass:
            return attributes.subclass(*args_, **kwargs_)
        else:
            return attributes(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_attribute(self): return self.attribute
    def set_attribute(self, attribute): self.attribute = attribute
    def export(self, outfile, level, namespace_='t:', name_='attributes', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='attributes')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='attributes'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='attributes', fromsubclass_=False):
        if self.attribute is not None:
            self.attribute.export(outfile, level, namespace_, name_='attribute', )
    def hasContent_(self):
        if (
            self.attribute is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='attributes'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.attribute is not None:
            showIndent(outfile, level)
            outfile.write('attribute=model_.attribute(\n')
            self.attribute.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'attribute':
            obj_ = attribute.factory()
            obj_.build(child_)
            self.set_attribute(obj_)
# end class attributes
class classes(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, classxx=None):
        self.classxx = classxx
    def factory(*args_, **kwargs_):
        if classes.subclass:
            return classes.subclass(*args_, **kwargs_)
        else:
            return classes(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_class(self): return self.classxx
    def set_class(self, classxx): self.classxx = classxx
    def export(self, outfile, level, namespace_='t:', name_='classes', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='classes')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='classes'):
        pass
    def exportChildren(self, outfile, level, namespace_='t:', name_='classes', fromsubclass_=False):
        if self.classxx is not None:
            self.classxx.export(outfile, level, namespace_, name_='class', )
    def hasContent_(self):
        if (
            self.classxx is not None
            ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='classes'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        pass
    def exportLiteralChildren(self, outfile, level, name_):
        if self.classxx is not None:
            showIndent(outfile, level)
            outfile.write('classxx=model_.classxx(\n')
            self.classxx.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'class':
            obj_ = classxx.factory()
            obj_.build(child_)
            self.set_class(obj_)
# end class classes
class parts(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, part=None):
self.part = part
def factory(*args_, **kwargs_):
if parts.subclass:
return parts.subclass(*args_, **kwargs_)
else:
return parts(*args_, **kwargs_)
factory = staticmethod(factory)
def get_part(self): return self.part
def set_part(self, part): self.part = part
def export(self, outfile, level, namespace_='t:', name_='parts', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='parts')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='parts'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='parts', fromsubclass_=False):
if self.part is not None:
self.part.export(outfile, level, namespace_, name_='part', )
def hasContent_(self):
if (
self.part is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='parts'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.part is not None:
showIndent(outfile, level)
outfile.write('part=model_.part(\n')
self.part.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'part':
obj_ = part.factory()
obj_.build(child_)
self.set_part(obj_)
# end class parts
class instances(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, instance=None):
self.instance = instance
def factory(*args_, **kwargs_):
if instances.subclass:
return instances.subclass(*args_, **kwargs_)
else:
return instances(*args_, **kwargs_)
factory = staticmethod(factory)
def get_instance(self): return self.instance
def set_instance(self, instance): self.instance = instance
def export(self, outfile, level, namespace_='t:', name_='instances', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='instances')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='instances'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='instances', fromsubclass_=False):
if self.instance is not None:
self.instance.export(outfile, level, namespace_, name_='instance', )
def hasContent_(self):
if (
self.instance is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='instances'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.instance is not None:
showIndent(outfile, level)
outfile.write('instance=model_.instance(\n')
self.instance.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'instance':
obj_ = instance.factory()
obj_.build(child_)
self.set_instance(obj_)
# end class instances
class errors(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, approved=None):
self.approved = approved
def factory(*args_, **kwargs_):
if errors.subclass:
return errors.subclass(*args_, **kwargs_)
else:
return errors(*args_, **kwargs_)
factory = staticmethod(factory)
def get_approved(self): return self.approved
def set_approved(self, approved): self.approved = approved
def export(self, outfile, level, namespace_='t:', name_='errors', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='errors')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='errors'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='errors', fromsubclass_=False):
if self.approved is not None:
self.approved.export(outfile, level, namespace_, name_='approved', )
def hasContent_(self):
if (
self.approved is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='errors'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.approved is not None:
showIndent(outfile, level)
outfile.write('approved=model_.approved(\n')
self.approved.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'approved':
obj_ = approved.factory()
obj_.build(child_)
self.set_approved(obj_)
# end class errors
class plain(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, polygon=None, wire=None, text=None, circle=None, rectangle=None, frame=None, hole=None):
if polygon is None:
self.polygon = []
else:
self.polygon = polygon
if wire is None:
self.wire = []
else:
self.wire = wire
if text is None:
self.text = []
else:
self.text = text
if circle is None:
self.circle = []
else:
self.circle = circle
if rectangle is None:
self.rectangle = []
else:
self.rectangle = rectangle
if frame is None:
self.frame = []
else:
self.frame = frame
if hole is None:
self.hole = []
else:
self.hole = hole
def factory(*args_, **kwargs_):
if plain.subclass:
return plain.subclass(*args_, **kwargs_)
else:
return plain(*args_, **kwargs_)
factory = staticmethod(factory)
def get_polygon(self): return self.polygon
def set_polygon(self, polygon): self.polygon = polygon
def add_polygon(self, value): self.polygon.append(value)
def insert_polygon(self, index, value): self.polygon[index] = value
def get_wire(self): return self.wire
def set_wire(self, wire): self.wire = wire
def add_wire(self, value): self.wire.append(value)
def insert_wire(self, index, value): self.wire[index] = value
def get_text(self): return self.text
def set_text(self, text): self.text = text
def add_text(self, value): self.text.append(value)
def insert_text(self, index, value): self.text[index] = value
def get_circle(self): return self.circle
def set_circle(self, circle): self.circle = circle
def add_circle(self, value): self.circle.append(value)
def insert_circle(self, index, value): self.circle[index] = value
def get_rectangle(self): return self.rectangle
def set_rectangle(self, rectangle): self.rectangle = rectangle
def add_rectangle(self, value): self.rectangle.append(value)
def insert_rectangle(self, index, value): self.rectangle[index] = value
def get_frame(self): return self.frame
def set_frame(self, frame): self.frame = frame
def add_frame(self, value): self.frame.append(value)
def insert_frame(self, index, value): self.frame[index] = value
def get_hole(self): return self.hole
def set_hole(self, hole): self.hole = hole
def add_hole(self, value): self.hole.append(value)
def insert_hole(self, index, value): self.hole[index] = value
def export(self, outfile, level, namespace_='t:', name_='plain', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='plain')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='plain'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='plain', fromsubclass_=False):
for polygon_ in self.polygon:
polygon_.export(outfile, level, namespace_, name_='polygon')
for wire_ in self.wire:
wire_.export(outfile, level, namespace_, name_='wire')
for text_ in self.text:
text_.export(outfile, level, namespace_, name_='text')
for circle_ in self.circle:
circle_.export(outfile, level, namespace_, name_='circle')
for rectangle_ in self.rectangle:
rectangle_.export(outfile, level, namespace_, name_='rectangle')
for frame_ in self.frame:
frame_.export(outfile, level, namespace_, name_='frame')
for hole_ in self.hole:
hole_.export(outfile, level, namespace_, name_='hole')
def hasContent_(self):
if (
self.polygon or
self.wire or
self.text or
self.circle or
self.rectangle or
self.frame or
self.hole
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='plain'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
showIndent(outfile, level)
outfile.write('polygon=[\n')
level += 1
for polygon_ in self.polygon:
showIndent(outfile, level)
outfile.write('model_.polygon(\n')
polygon_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('wire=[\n')
level += 1
for wire_ in self.wire:
showIndent(outfile, level)
outfile.write('model_.wire(\n')
wire_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('text=[\n')
level += 1
for text_ in self.text:
showIndent(outfile, level)
outfile.write('model_.text(\n')
text_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('circle=[\n')
level += 1
for circle_ in self.circle:
showIndent(outfile, level)
outfile.write('model_.circle(\n')
circle_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('rectangle=[\n')
level += 1
for rectangle_ in self.rectangle:
showIndent(outfile, level)
outfile.write('model_.rectangle(\n')
rectangle_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('frame=[\n')
level += 1
for frame_ in self.frame:
showIndent(outfile, level)
outfile.write('model_.frame(\n')
frame_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('hole=[\n')
level += 1
for hole_ in self.hole:
showIndent(outfile, level)
outfile.write('model_.hole(\n')
hole_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'polygon':
obj_ = polygon.factory()
obj_.build(child_)
self.polygon.append(obj_)
elif nodeName_ == 'wire':
obj_ = wire.factory()
obj_.build(child_)
self.wire.append(obj_)
elif nodeName_ == 'text':
obj_ = text.factory()
obj_.build(child_)
self.text.append(obj_)
elif nodeName_ == 'circle':
obj_ = circle.factory()
obj_.build(child_)
self.circle.append(obj_)
elif nodeName_ == 'rectangle':
obj_ = rectangle.factory()
obj_.build(child_)
self.rectangle.append(obj_)
elif nodeName_ == 'frame':
obj_ = frame.factory()
obj_.build(child_)
self.frame.append(obj_)
elif nodeName_ == 'hole':
obj_ = hole.factory()
obj_.build(child_)
self.hole.append(obj_)
# end class plain
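# Hedged usage sketch for the container classes above, kept as a comment so
# this generated module's import-time behavior is unchanged. It assumes the
# `wire` child class and the `showIndent` helper are defined elsewhere in
# this generateDS-produced file; neither is verified against the schema here:
#
#     import sys
#     p = plain()
#     p.add_wire(wire())                      # child classes live elsewhere in this module
#     p.export(sys.stdout, 0, name_='plain')  # writes <t:plain>...</t:plain> to stdout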
class designrules(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, description=None, param=None):
if description is None:
self.description = []
else:
self.description = description
if param is None:
self.param = []
else:
self.param = param
def factory(*args_, **kwargs_):
if designrules.subclass:
return designrules.subclass(*args_, **kwargs_)
else:
return designrules(*args_, **kwargs_)
factory = staticmethod(factory)
def get_description(self): return self.description
def set_description(self, description): self.description = description
def add_description(self, value): self.description.append(value)
def insert_description(self, index, value): self.description[index] = value
def get_param(self): return self.param
def set_param(self, param): self.param = param
def add_param(self, value): self.param.append(value)
def insert_param(self, index, value): self.param[index] = value
def export(self, outfile, level, namespace_='t:', name_='designrules', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='designrules')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='designrules'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='designrules', fromsubclass_=False):
for description_ in self.description:
description_.export(outfile, level, namespace_, name_='description')
for param_ in self.param:
param_.export(outfile, level, namespace_, name_='param')
def hasContent_(self):
if (
self.description or
self.param
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='designrules'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
showIndent(outfile, level)
outfile.write('description=[\n')
level += 1
for description_ in self.description:
showIndent(outfile, level)
outfile.write('model_.description(\n')
description_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
showIndent(outfile, level)
outfile.write('param=[\n')
level += 1
for param_ in self.param:
showIndent(outfile, level)
outfile.write('model_.param(\n')
param_.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
level -= 1
showIndent(outfile, level)
outfile.write('],\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'description':
obj_ = description.factory()
obj_.build(child_)
self.description.append(obj_)
elif nodeName_ == 'param':
obj_ = param.factory()
obj_.build(child_)
self.param.append(obj_)
# end class designrules
class autorouter(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, passxx=None):
self.passxx = passxx
def factory(*args_, **kwargs_):
if autorouter.subclass:
return autorouter.subclass(*args_, **kwargs_)
else:
return autorouter(*args_, **kwargs_)
factory = staticmethod(factory)
def get_pass(self): return self.passxx
def set_pass(self, passxx): self.passxx = passxx
def export(self, outfile, level, namespace_='t:', name_='autorouter', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='autorouter')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='autorouter'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='autorouter', fromsubclass_=False):
if self.passxx is not None:
self.passxx.export(outfile, level, namespace_, name_='pass', )
def hasContent_(self):
if (
self.passxx is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='autorouter'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.passxx is not None:
showIndent(outfile, level)
outfile.write('passxx=model_.passxx(\n')
self.passxx.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'pass':
obj_ = passxx.factory()
obj_.build(child_)
self.set_pass(obj_)
# end class autorouter
class elements(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, element=None):
self.element = element
def factory(*args_, **kwargs_):
if elements.subclass:
return elements.subclass(*args_, **kwargs_)
else:
return elements(*args_, **kwargs_)
factory = staticmethod(factory)
def get_element(self): return self.element
def set_element(self, element): self.element = element
def export(self, outfile, level, namespace_='t:', name_='elements', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='elements')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='elements'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='elements', fromsubclass_=False):
if self.element is not None:
self.element.export(outfile, level, namespace_, name_='element', )
def hasContent_(self):
if (
self.element is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='elements'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.element is not None:
showIndent(outfile, level)
outfile.write('element=model_.element(\n')
self.element.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'element':
obj_ = element.factory()
obj_.build(child_)
self.set_element(obj_)
# end class elements
class signals(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, signal=None):
self.signal = signal
def factory(*args_, **kwargs_):
if signals.subclass:
return signals.subclass(*args_, **kwargs_)
else:
return signals(*args_, **kwargs_)
factory = staticmethod(factory)
def get_signal(self): return self.signal
def set_signal(self, signal): self.signal = signal
def export(self, outfile, level, namespace_='t:', name_='signals', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='signals')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='signals'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='signals', fromsubclass_=False):
if self.signal is not None:
self.signal.export(outfile, level, namespace_, name_='signal', )
def hasContent_(self):
if (
self.signal is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='signals'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.signal is not None:
showIndent(outfile, level)
outfile.write('signal=model_.signal(\n')
self.signal.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'signal':
obj_ = signal.factory()
obj_.build(child_)
self.set_signal(obj_)
# end class signals
class busses(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, bus=None):
self.bus = bus
def factory(*args_, **kwargs_):
if busses.subclass:
return busses.subclass(*args_, **kwargs_)
else:
return busses(*args_, **kwargs_)
factory = staticmethod(factory)
def get_bus(self): return self.bus
def set_bus(self, bus): self.bus = bus
def export(self, outfile, level, namespace_='t:', name_='busses', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='busses')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='busses'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='busses', fromsubclass_=False):
if self.bus is not None:
self.bus.export(outfile, level, namespace_, name_='bus', )
def hasContent_(self):
if (
self.bus is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='busses'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.bus is not None:
showIndent(outfile, level)
outfile.write('bus=model_.bus(\n')
self.bus.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'bus':
obj_ = bus.factory()
obj_.build(child_)
self.set_bus(obj_)
# end class busses
class nets(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, net=None):
self.net = net
def factory(*args_, **kwargs_):
if nets.subclass:
return nets.subclass(*args_, **kwargs_)
else:
return nets(*args_, **kwargs_)
factory = staticmethod(factory)
def get_net(self): return self.net
def set_net(self, net): self.net = net
def export(self, outfile, level, namespace_='t:', name_='nets', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='nets')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='nets'):
pass
def exportChildren(self, outfile, level, namespace_='t:', name_='nets', fromsubclass_=False):
if self.net is not None:
self.net.export(outfile, level, namespace_, name_='net', )
def hasContent_(self):
if (
self.net is not None
):
return True
else:
return False
def exportLiteral(self, outfile, level, name_='nets'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
pass
def exportLiteralChildren(self, outfile, level, name_):
if self.net is not None:
showIndent(outfile, level)
outfile.write('net=model_.net(\n')
self.net.exportLiteral(outfile, level)
showIndent(outfile, level)
outfile.write('),\n')
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
if nodeName_ == 'net':
obj_ = net.factory()
obj_.build(child_)
self.set_net(obj_)
# end class nets
class setting(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, alwaysvectorfont=None, verticaltext=None):
self.alwaysvectorfont = _cast(None, alwaysvectorfont)
self.verticaltext = _cast(None, verticaltext)
def factory(*args_, **kwargs_):
if setting.subclass:
return setting.subclass(*args_, **kwargs_)
else:
return setting(*args_, **kwargs_)
factory = staticmethod(factory)
def get_alwaysvectorfont(self): return self.alwaysvectorfont
def set_alwaysvectorfont(self, alwaysvectorfont): self.alwaysvectorfont = alwaysvectorfont
def get_verticaltext(self): return self.verticaltext
def set_verticaltext(self, verticaltext): self.verticaltext = verticaltext
def export(self, outfile, level, namespace_='t:', name_='setting', namespacedef_=''):
showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='setting')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='setting'):
        if self.alwaysvectorfont is not None and 'alwaysvectorfont' not in already_processed:
            already_processed.append('alwaysvectorfont')
            outfile.write(' alwaysvectorfont=%s' % (self.gds_format_string(quote_attrib(self.alwaysvectorfont).encode(ExternalEncoding), input_name='alwaysvectorfont'), ))
        if self.verticaltext is not None and 'verticaltext' not in already_processed:
            already_processed.append('verticaltext')
            outfile.write(' verticaltext=%s' % (self.gds_format_string(quote_attrib(self.verticaltext).encode(ExternalEncoding), input_name='verticaltext'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='setting', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='setting'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.alwaysvectorfont is not None and 'alwaysvectorfont' not in already_processed:
            already_processed.append('alwaysvectorfont')
            showIndent(outfile, level)
            outfile.write('alwaysvectorfont = "%s",\n' % (self.alwaysvectorfont,))
        if self.verticaltext is not None and 'verticaltext' not in already_processed:
            already_processed.append('verticaltext')
            showIndent(outfile, level)
            outfile.write('verticaltext = "%s",\n' % (self.verticaltext,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('alwaysvectorfont', node)
        if value is not None and 'alwaysvectorfont' not in already_processed:
            already_processed.append('alwaysvectorfont')
            self.alwaysvectorfont = value
        value = find_attr_value_('verticaltext', node)
        if value is not None and 'verticaltext' not in already_processed:
            already_processed.append('verticaltext')
            self.verticaltext = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class setting
class grid(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, distance=None, style=None, multiple=None, altdistance=None, altunit=None, unitdist=None, altunitdist=None, display=None, unit=None):
        self.distance = _cast(None, distance)
        self.style = _cast(None, style)
        self.multiple = _cast(None, multiple)
        self.altdistance = _cast(None, altdistance)
        self.altunit = _cast(None, altunit)
        self.unitdist = _cast(None, unitdist)
        self.altunitdist = _cast(None, altunitdist)
        self.display = _cast(None, display)
        self.unit = _cast(None, unit)
    def factory(*args_, **kwargs_):
        if grid.subclass:
            return grid.subclass(*args_, **kwargs_)
        else:
            return grid(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_distance(self): return self.distance
    def set_distance(self, distance): self.distance = distance
    def get_style(self): return self.style
    def set_style(self, style): self.style = style
    def get_multiple(self): return self.multiple
    def set_multiple(self, multiple): self.multiple = multiple
    def get_altdistance(self): return self.altdistance
    def set_altdistance(self, altdistance): self.altdistance = altdistance
    def get_altunit(self): return self.altunit
    def set_altunit(self, altunit): self.altunit = altunit
    def get_unitdist(self): return self.unitdist
    def set_unitdist(self, unitdist): self.unitdist = unitdist
    def get_altunitdist(self): return self.altunitdist
    def set_altunitdist(self, altunitdist): self.altunitdist = altunitdist
    def get_display(self): return self.display
    def set_display(self, display): self.display = display
    def get_unit(self): return self.unit
    def set_unit(self, unit): self.unit = unit
    def export(self, outfile, level, namespace_='t:', name_='grid', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='grid')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='grid'):
        if self.distance is not None and 'distance' not in already_processed:
            already_processed.append('distance')
            outfile.write(' distance=%s' % (self.gds_format_string(quote_attrib(self.distance).encode(ExternalEncoding), input_name='distance'), ))
        if self.style is not None and 'style' not in already_processed:
            already_processed.append('style')
            outfile.write(' style=%s' % (self.gds_format_string(quote_attrib(self.style).encode(ExternalEncoding), input_name='style'), ))
        if self.multiple is not None and 'multiple' not in already_processed:
            already_processed.append('multiple')
            outfile.write(' multiple=%s' % (self.gds_format_string(quote_attrib(self.multiple).encode(ExternalEncoding), input_name='multiple'), ))
        if self.altdistance is not None and 'altdistance' not in already_processed:
            already_processed.append('altdistance')
            outfile.write(' altdistance=%s' % (self.gds_format_string(quote_attrib(self.altdistance).encode(ExternalEncoding), input_name='altdistance'), ))
        if self.altunit is not None and 'altunit' not in already_processed:
            already_processed.append('altunit')
            outfile.write(' altunit=%s' % (self.gds_format_string(quote_attrib(self.altunit).encode(ExternalEncoding), input_name='altunit'), ))
        if self.unitdist is not None and 'unitdist' not in already_processed:
            already_processed.append('unitdist')
            outfile.write(' unitdist=%s' % (self.gds_format_string(quote_attrib(self.unitdist).encode(ExternalEncoding), input_name='unitdist'), ))
        if self.altunitdist is not None and 'altunitdist' not in already_processed:
            already_processed.append('altunitdist')
            outfile.write(' altunitdist=%s' % (self.gds_format_string(quote_attrib(self.altunitdist).encode(ExternalEncoding), input_name='altunitdist'), ))
        if self.display is not None and 'display' not in already_processed:
            already_processed.append('display')
            outfile.write(' display=%s' % (self.gds_format_string(quote_attrib(self.display).encode(ExternalEncoding), input_name='display'), ))
        if self.unit is not None and 'unit' not in already_processed:
            already_processed.append('unit')
            outfile.write(' unit=%s' % (self.gds_format_string(quote_attrib(self.unit).encode(ExternalEncoding), input_name='unit'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='grid', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='grid'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.distance is not None and 'distance' not in already_processed:
            already_processed.append('distance')
            showIndent(outfile, level)
            outfile.write('distance = "%s",\n' % (self.distance,))
        if self.style is not None and 'style' not in already_processed:
            already_processed.append('style')
            showIndent(outfile, level)
            outfile.write('style = "%s",\n' % (self.style,))
        if self.multiple is not None and 'multiple' not in already_processed:
            already_processed.append('multiple')
            showIndent(outfile, level)
            outfile.write('multiple = "%s",\n' % (self.multiple,))
        if self.altdistance is not None and 'altdistance' not in already_processed:
            already_processed.append('altdistance')
            showIndent(outfile, level)
            outfile.write('altdistance = "%s",\n' % (self.altdistance,))
        if self.altunit is not None and 'altunit' not in already_processed:
            already_processed.append('altunit')
            showIndent(outfile, level)
            outfile.write('altunit = "%s",\n' % (self.altunit,))
        if self.unitdist is not None and 'unitdist' not in already_processed:
            already_processed.append('unitdist')
            showIndent(outfile, level)
            outfile.write('unitdist = "%s",\n' % (self.unitdist,))
        if self.altunitdist is not None and 'altunitdist' not in already_processed:
            already_processed.append('altunitdist')
            showIndent(outfile, level)
            outfile.write('altunitdist = "%s",\n' % (self.altunitdist,))
        if self.display is not None and 'display' not in already_processed:
            already_processed.append('display')
            showIndent(outfile, level)
            outfile.write('display = "%s",\n' % (self.display,))
        if self.unit is not None and 'unit' not in already_processed:
            already_processed.append('unit')
            showIndent(outfile, level)
            outfile.write('unit = "%s",\n' % (self.unit,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('distance', node)
        if value is not None and 'distance' not in already_processed:
            already_processed.append('distance')
            self.distance = value
        value = find_attr_value_('style', node)
        if value is not None and 'style' not in already_processed:
            already_processed.append('style')
            self.style = value
        value = find_attr_value_('multiple', node)
        if value is not None and 'multiple' not in already_processed:
            already_processed.append('multiple')
            self.multiple = value
        value = find_attr_value_('altdistance', node)
        if value is not None and 'altdistance' not in already_processed:
            already_processed.append('altdistance')
            self.altdistance = value
        value = find_attr_value_('altunit', node)
        if value is not None and 'altunit' not in already_processed:
            already_processed.append('altunit')
            self.altunit = value
        value = find_attr_value_('unitdist', node)
        if value is not None and 'unitdist' not in already_processed:
            already_processed.append('unitdist')
            self.unitdist = value
        value = find_attr_value_('altunitdist', node)
        if value is not None and 'altunitdist' not in already_processed:
            already_processed.append('altunitdist')
            self.altunitdist = value
        value = find_attr_value_('display', node)
        if value is not None and 'display' not in already_processed:
            already_processed.append('display')
            self.display = value
        value = find_attr_value_('unit', node)
        if value is not None and 'unit' not in already_processed:
            already_processed.append('unit')
            self.unit = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class grid
class layer(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, name=None, color=None, number=None, visible=None, active=None, fill=None):
        self.name = _cast(None, name)
        self.color = _cast(None, color)
        self.number = _cast(None, number)
        self.visible = _cast(None, visible)
        self.active = _cast(None, active)
        self.fill = _cast(None, fill)
    def factory(*args_, **kwargs_):
        if layer.subclass:
            return layer.subclass(*args_, **kwargs_)
        else:
            return layer(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_color(self): return self.color
    def set_color(self, color): self.color = color
    def get_number(self): return self.number
    def set_number(self, number): self.number = number
    def get_visible(self): return self.visible
    def set_visible(self, visible): self.visible = visible
    def get_active(self): return self.active
    def set_active(self, active): self.active = active
    def get_fill(self): return self.fill
    def set_fill(self, fill): self.fill = fill
    def export(self, outfile, level, namespace_='t:', name_='layer', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='layer')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='layer'):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.color is not None and 'color' not in already_processed:
            already_processed.append('color')
            outfile.write(' color=%s' % (self.gds_format_string(quote_attrib(self.color).encode(ExternalEncoding), input_name='color'), ))
        if self.number is not None and 'number' not in already_processed:
            already_processed.append('number')
            outfile.write(' number=%s' % (self.gds_format_string(quote_attrib(self.number).encode(ExternalEncoding), input_name='number'), ))
        if self.visible is not None and 'visible' not in already_processed:
            already_processed.append('visible')
            outfile.write(' visible=%s' % (self.gds_format_string(quote_attrib(self.visible).encode(ExternalEncoding), input_name='visible'), ))
        if self.active is not None and 'active' not in already_processed:
            already_processed.append('active')
            outfile.write(' active=%s' % (self.gds_format_string(quote_attrib(self.active).encode(ExternalEncoding), input_name='active'), ))
        if self.fill is not None and 'fill' not in already_processed:
            already_processed.append('fill')
            outfile.write(' fill=%s' % (self.gds_format_string(quote_attrib(self.fill).encode(ExternalEncoding), input_name='fill'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='layer', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='layer'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.color is not None and 'color' not in already_processed:
            already_processed.append('color')
            showIndent(outfile, level)
            outfile.write('color = "%s",\n' % (self.color,))
        if self.number is not None and 'number' not in already_processed:
            already_processed.append('number')
            showIndent(outfile, level)
            outfile.write('number = "%s",\n' % (self.number,))
        if self.visible is not None and 'visible' not in already_processed:
            already_processed.append('visible')
            showIndent(outfile, level)
            outfile.write('visible = "%s",\n' % (self.visible,))
        if self.active is not None and 'active' not in already_processed:
            already_processed.append('active')
            showIndent(outfile, level)
            outfile.write('active = "%s",\n' % (self.active,))
        if self.fill is not None and 'fill' not in already_processed:
            already_processed.append('fill')
            showIndent(outfile, level)
            outfile.write('fill = "%s",\n' % (self.fill,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('color', node)
        if value is not None and 'color' not in already_processed:
            already_processed.append('color')
            self.color = value
        value = find_attr_value_('number', node)
        if value is not None and 'number' not in already_processed:
            already_processed.append('number')
            self.number = value
        value = find_attr_value_('visible', node)
        if value is not None and 'visible' not in already_processed:
            already_processed.append('visible')
            self.visible = value
        value = find_attr_value_('active', node)
        if value is not None and 'active' not in already_processed:
            already_processed.append('active')
            self.active = value
        value = find_attr_value_('fill', node)
        if value is not None and 'fill' not in already_processed:
            already_processed.append('fill')
            self.fill = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class layer
class classxx(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, width=None, number=None, drill=None, name=None, clearance=None):
        self.width = _cast(None, width)
        self.number = _cast(None, number)
        self.drill = _cast(None, drill)
        self.name = _cast(None, name)
        self.clearance = clearance
    def factory(*args_, **kwargs_):
        if classxx.subclass:
            return classxx.subclass(*args_, **kwargs_)
        else:
            return classxx(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_clearance(self): return self.clearance
    def set_clearance(self, clearance): self.clearance = clearance
    def get_width(self): return self.width
    def set_width(self, width): self.width = width
    def get_number(self): return self.number
    def set_number(self, number): self.number = number
    def get_drill(self): return self.drill
    def set_drill(self, drill): self.drill = drill
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def export(self, outfile, level, namespace_='t:', name_='class', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='class')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='class'):
        if self.width is not None and 'width' not in already_processed:
            already_processed.append('width')
            outfile.write(' width=%s' % (self.gds_format_string(quote_attrib(self.width).encode(ExternalEncoding), input_name='width'), ))
        if self.number is not None and 'number' not in already_processed:
            already_processed.append('number')
            outfile.write(' number=%s' % (self.gds_format_string(quote_attrib(self.number).encode(ExternalEncoding), input_name='number'), ))
        if self.drill is not None and 'drill' not in already_processed:
            already_processed.append('drill')
            outfile.write(' drill=%s' % (self.gds_format_string(quote_attrib(self.drill).encode(ExternalEncoding), input_name='drill'), ))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='class', fromsubclass_=False):
        if self.clearance is not None:
            self.clearance.export(outfile, level, namespace_, name_='clearance', )
    def hasContent_(self):
        if (
            self.clearance is not None
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='class'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.width is not None and 'width' not in already_processed:
            already_processed.append('width')
            showIndent(outfile, level)
            outfile.write('width = "%s",\n' % (self.width,))
        if self.number is not None and 'number' not in already_processed:
            already_processed.append('number')
            showIndent(outfile, level)
            outfile.write('number = "%s",\n' % (self.number,))
        if self.drill is not None and 'drill' not in already_processed:
            already_processed.append('drill')
            showIndent(outfile, level)
            outfile.write('drill = "%s",\n' % (self.drill,))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
    def exportLiteralChildren(self, outfile, level, name_):
        if self.clearance is not None:
            showIndent(outfile, level)
            outfile.write('clearance=model_.clearance(\n')
            self.clearance.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('width', node)
        if value is not None and 'width' not in already_processed:
            already_processed.append('width')
            self.width = value
        value = find_attr_value_('number', node)
        if value is not None and 'number' not in already_processed:
            already_processed.append('number')
            self.number = value
        value = find_attr_value_('drill', node)
        if value is not None and 'drill' not in already_processed:
            already_processed.append('drill')
            self.drill = value
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'clearance':
            obj_ = clearance.factory()
            obj_.build(child_)
            self.set_clearance(obj_)
# end class classxx
class clearance(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, classxx=None, value=None):
        self.classxx = _cast(None, classxx)
        self.value = _cast(None, value)
    def factory(*args_, **kwargs_):
        if clearance.subclass:
            return clearance.subclass(*args_, **kwargs_)
        else:
            return clearance(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_class(self): return self.classxx
    def set_class(self, classxx): self.classxx = classxx
    def get_value(self): return self.value
    def set_value(self, value): self.value = value
    def export(self, outfile, level, namespace_='t:', name_='clearance', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='clearance')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='clearance'):
        if self.classxx is not None and 'classxx' not in already_processed:
            already_processed.append('classxx')
            outfile.write(' class=%s' % (self.gds_format_string(quote_attrib(self.classxx).encode(ExternalEncoding), input_name='class'), ))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='clearance', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='clearance'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.classxx is not None and 'classxx' not in already_processed:
            already_processed.append('classxx')
            showIndent(outfile, level)
            outfile.write('classxx = "%s",\n' % (self.classxx,))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            showIndent(outfile, level)
            outfile.write('value = "%s",\n' % (self.value,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('class', node)
        if value is not None and 'class' not in already_processed:
            already_processed.append('class')
            self.classxx = value
        value = find_attr_value_('value', node)
        if value is not None and 'value' not in already_processed:
            already_processed.append('value')
            self.value = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class clearance
class description(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, language=None, valueOf_=None, mixedclass_=None, content_=None):
        self.language = _cast(None, language)
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
    def factory(*args_, **kwargs_):
        if description.subclass:
            return description.subclass(*args_, **kwargs_)
        else:
            return description(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_language(self): return self.language
    def set_language(self, language): self.language = language
    def get_valueOf_(self): return self.valueOf_
    def set_valueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='t:', name_='description', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='description')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='description'):
        if self.language is not None and 'language' not in already_processed:
            already_processed.append('language')
            outfile.write(' language=%s' % (self.gds_format_string(quote_attrib(self.language).encode(ExternalEncoding), input_name='language'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='description', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
            self.valueOf_
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='description'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
        showIndent(outfile, level)
        outfile.write('valueOf_ = """%s""",\n' % (self.valueOf_,))
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.language is not None and 'language' not in already_processed:
            already_processed.append('language')
            showIndent(outfile, level)
            outfile.write('language = "%s",\n' % (self.language,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        self.valueOf_ = get_all_text_(node)
        if node.text is not None:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', node.text)
            self.content_.append(obj_)
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('language', node)
        if value is not None and 'language' not in already_processed:
            already_processed.append('language')
            self.language = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if not fromsubclass_ and child_.tail is not None:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.tail)
            self.content_.append(obj_)
# end class description
class param(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, name=None, value=None):
        self.name = _cast(None, name)
        self.value = _cast(None, value)
    def factory(*args_, **kwargs_):
        if param.subclass:
            return param.subclass(*args_, **kwargs_)
        else:
            return param(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_value(self): return self.value
    def set_value(self, value): self.value = value
    def export(self, outfile, level, namespace_='t:', name_='param', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='param')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='param'):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='param', fromsubclass_=False):
        pass
    def hasContent_(self):
        if (
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='param'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.value is not None and 'value' not in already_processed:
            already_processed.append('value')
            showIndent(outfile, level)
            outfile.write('value = "%s",\n' % (self.value,))
    def exportLiteralChildren(self, outfile, level, name_):
        pass
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('value', node)
        if value is not None and 'value' not in already_processed:
            already_processed.append('value')
            self.value = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        pass
# end class param
class passxx(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, active=None, name=None, refer=None, param=None):
        self.active = _cast(None, active)
        self.name = _cast(None, name)
        self.refer = _cast(None, refer)
        self.param = param
    def factory(*args_, **kwargs_):
        if passxx.subclass:
            return passxx.subclass(*args_, **kwargs_)
        else:
            return passxx(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_param(self): return self.param
    def set_param(self, param): self.param = param
    def get_active(self): return self.active
    def set_active(self, active): self.active = active
    def get_name(self): return self.name
    def set_name(self, name): self.name = name
    def get_refer(self): return self.refer
    def set_refer(self, refer): self.refer = refer
    def export(self, outfile, level, namespace_='t:', name_='pass', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = []
        self.exportAttributes(outfile, level, already_processed, namespace_, name_='pass')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='pass'):
        if self.active is not None and 'active' not in already_processed:
            already_processed.append('active')
            outfile.write(' active=%s' % (self.gds_format_string(quote_attrib(self.active).encode(ExternalEncoding), input_name='active'), ))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
        if self.refer is not None and 'refer' not in already_processed:
            already_processed.append('refer')
            outfile.write(' refer=%s' % (self.gds_format_string(quote_attrib(self.refer).encode(ExternalEncoding), input_name='refer'), ))
    def exportChildren(self, outfile, level, namespace_='t:', name_='pass', fromsubclass_=False):
        if self.param is not None:
            self.param.export(outfile, level, namespace_, name_='param', )
    def hasContent_(self):
        if (
            self.param is not None
        ):
            return True
        else:
            return False
    def exportLiteral(self, outfile, level, name_='pass'):
        level += 1
        self.exportLiteralAttributes(outfile, level, [], name_)
        if self.hasContent_():
            self.exportLiteralChildren(outfile, level, name_)
    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
        if self.active is not None and 'active' not in already_processed:
            already_processed.append('active')
            showIndent(outfile, level)
            outfile.write('active = "%s",\n' % (self.active,))
        if self.name is not None and 'name' not in already_processed:
            already_processed.append('name')
            showIndent(outfile, level)
            outfile.write('name = "%s",\n' % (self.name,))
        if self.refer is not None and 'refer' not in already_processed:
            already_processed.append('refer')
            showIndent(outfile, level)
            outfile.write('refer = "%s",\n' % (self.refer,))
    def exportLiteralChildren(self, outfile, level, name_):
        if self.param is not None:
            showIndent(outfile, level)
            outfile.write('param=model_.param(\n')
            self.param.exportLiteral(outfile, level)
            showIndent(outfile, level)
            outfile.write('),\n')
    def build(self, node):
        self.buildAttributes(node, node.attrib, [])
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_)
    def buildAttributes(self, node, attrs, already_processed):
        value = find_attr_value_('active', node)
        if value is not None and 'active' not in already_processed:
            already_processed.append('active')
            self.active = value
        value = find_attr_value_('name', node)
        if value is not None and 'name' not in already_processed:
            already_processed.append('name')
            self.name = value
        value = find_attr_value_('refer', node)
        if value is not None and 'refer' not in already_processed:
            already_processed.append('refer')
            self.refer = value
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
        if nodeName_ == 'param':
            obj_ = param.factory()
            obj_.build(child_)
            self.set_param(obj_)
# end class passxx
class approved(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, hash=None):
self.hash = _cast(None, hash)
def factory(*args_, **kwargs_):
if approved.subclass:
return approved.subclass(*args_, **kwargs_)
else:
return approved(*args_, **kwargs_)
factory = staticmethod(factory)
def get_hash(self): return self.hash
def set_hash(self, hash): self.hash = hash
def export(self, outfile, level, namespace_='t:', name_='approved', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = []
self.exportAttributes(outfile, level, already_processed, namespace_, name_='approved')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write('/>\n')
def exportAttributes(self, outfile, level, already_processed, namespace_='t:', name_='approved'):
if self.hash is not None and 'hash' not in already_processed:
already_processed.append('hash')
outfile.write(' hash=%s' % (self.gds_format_string(quote_attrib(self.hash).encode(ExternalEncoding), input_name='hash'), ))
def exportChildren(self, outfile, level, namespace_='t:', name_='approved', fromsubclass_=False):
pass
    def hasContent_(self):
        return False
def exportLiteral(self, outfile, level, name_='approved'):
level += 1
self.exportLiteralAttributes(outfile, level, [], name_)
if self.hasContent_():
self.exportLiteralChildren(outfile, level, name_)
def exportLiteralAttributes(self, outfile, level, already_processed, name_):
if self.hash is not None and 'hash' not in already_processed:
already_processed.append('hash')
showIndent(outfile, level)
outfile.write('hash = "%s",\n' % (self.hash,))
def exportLiteralChildren(self, outfile, level, name_):
pass
def build(self, node):
self.buildAttributes(node, node.attrib, [])
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_)
def buildAttributes(self, node, attrs, already_processed):
value = find_attr_value_('hash', node)
if value is not None and 'hash' not in already_processed:
already_processed.append('hash')
self.hash = value
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
pass
# end class approved
USAGE_TEXT = """
Usage: python <Parser>.py [ -s ] <in_xml_file>
"""
def usage():
print USAGE_TEXT
sys.exit(1)
def get_root_tag(node):
tag = Tag_pattern_.match(node.tag).groups()[-1]
rootClass = globals().get(tag)
return tag, rootClass
def parse(inFileName):
doc = parsexml_(inFileName)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'eagle'
rootClass = eagle
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(sys.stdout, 0, name_=rootTag,
namespacedef_='')
return rootObj
def parseString(inString):
from StringIO import StringIO
doc = parsexml_(StringIO(inString))
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'eagle'
rootClass = eagle
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(sys.stdout, 0, name_="eagle",
namespacedef_='')
return rootObj
def parseLiteral(inFileName):
doc = parsexml_(inFileName)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'eagle'
rootClass = eagle
rootObj = rootClass.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
sys.stdout.write('#from eagle import *\n\n')
sys.stdout.write('import eagle as model_\n\n')
sys.stdout.write('rootObj = model_.rootTag(\n')
rootObj.exportLiteral(sys.stdout, 0, name_=rootTag)
sys.stdout.write(')\n')
return rootObj
def main():
args = sys.argv[1:]
if len(args) == 1:
parse(args[0])
else:
usage()
if __name__ == '__main__':
#import pdb; pdb.set_trace()
main()
__all__ = [
"approved",
"attribute",
"attributes",
"autorouter",
"board",
"bus",
"busses",
"circle",
"classes",
"classxx",
"clearance",
"compatibility",
"connect",
"connects",
"contactref",
"description",
"designrules",
"device",
"devices",
"deviceset",
"devicesets",
"dimension",
"drawing",
"eagle",
"element",
"elements",
"errors",
"frame",
"gate",
"gates",
"grid",
"hole",
"instance",
"instances",
"junction",
"label",
"layer",
"layers",
"libraries",
"library",
"net",
"nets",
"note",
"package",
"packages",
"pad",
"param",
"part",
"parts",
"passxx",
"pin",
"pinref",
"plain",
"polygon",
"rectangle",
"schematic",
"segment",
"setting",
"settings",
"sheet",
"sheets",
"signal",
"signals",
"smd",
"symbol",
"symbols",
"technologies",
"technology",
"text",
"variant",
"variantdef",
"variantdefs",
"vertex",
"via",
"wire"
]
e6761d169ef4563f78451feb7c9e38c9e8312ee6 | 2,646 | py | Python | livestyled/tests/test_resource_audience_device.py | Edvinas9/python-sdk | e24413b337f7d2232e28944b93ded7a430df0293 | ["MIT"] | null | null | null | livestyled/tests/test_resource_audience_device.py | Edvinas9/python-sdk | e24413b337f7d2232e28944b93ded7a430df0293 | ["MIT"] | 1 | 2021-07-09T10:59:21.000Z | 2021-07-09T10:59:21.000Z | livestyled/tests/test_resource_audience_device.py | livestyled/python-sdk | e75263e8bbf7132e4ce0e69d0ca3ad19088661b2 | ["MIT"] | 3 | 2021-02-01T10:13:36.000Z | 2022-02-11T17:47:30.000Z
from datetime import datetime
import os
from livestyled.models.audience_device import AudienceDevice
from livestyled.resource_client import LiveStyledResourceClient
from livestyled.tests.utils import configure_mock_responses
FIXTURES_DIR = os.path.join(os.path.dirname(__file__), 'fixtures')
TEST_API_DOMAIN = 'test.livestyled.com'
CONTENT_TYPE = 'application/ld+json'
def test_audience_creation_duplicate(requests_mock):
mock_responses = (
('POST', 'https://' + TEST_API_DOMAIN + '/v4/user_management/audience_devices',
'mock_responses/ls_api/user_management/create_audience_500_response.json', 500),
)
configure_mock_responses(requests_mock, mock_responses, FIXTURES_DIR, CONTENT_TYPE)
resource_client = LiveStyledResourceClient(TEST_API_DOMAIN, 'bar')
audience_device = AudienceDevice({
'id': 1, 'name': 'name', 'reality_values': [], 'updated_at': datetime.now(), 'created_at': datetime.now()
}, {
'id': 1,
'token': 'name',
'consent': {},
'push_consents': [],
'status': 'active',
'type': 'active',
'app_version': 'active',
'os_version': 'active',
'model': 'active',
'manufacturer': 'active',
'bluetooth_on': 'active',
'wifi_connected': 'active',
'updated_at': datetime.now(),
'created_at': datetime.now()
})
res = resource_client.create_audience_device(audience_device=audience_device)
assert audience_device == res
def test_audience_creation_duplicate_adverse_response(requests_mock):
mock_responses = (
('POST', 'https://' + TEST_API_DOMAIN + '/v4/user_management/audience_devices',
'mock_responses/ls_api/user_management/create_audience_500_response_2.json', 500),
)
configure_mock_responses(requests_mock, mock_responses, FIXTURES_DIR, CONTENT_TYPE)
resource_client = LiveStyledResourceClient(TEST_API_DOMAIN, 'bar')
audience_device = AudienceDevice({
'id': 1, 'name': 'name', 'reality_values': [], 'updated_at': datetime.now(), 'created_at': datetime.now()
}, {
'id': 1,
'token': 'name',
'consent': {},
'push_consents': [],
'status': 'active',
'type': 'active',
'app_version': 'active',
'os_version': 'active',
'model': 'active',
'manufacturer': 'active',
'bluetooth_on': 'active',
'wifi_connected': 'active',
'updated_at': datetime.now(),
'created_at': datetime.now()
})
res = resource_client.create_audience_device(audience_device=audience_device)
assert audience_device == res
e6960f04b19b006af92eb512a96660bce137deb7 | 147 | py | Python | commands/helpers/load_inventory.py | TiesWestendorp/MTG-Deck-Organizer | 409185c1bcfc8bf70c441d3242ed10c7c41f9d90 | ["MIT"] | null | null | null | commands/helpers/load_inventory.py | TiesWestendorp/MTG-Deck-Organizer | 409185c1bcfc8bf70c441d3242ed10c7c41f9d90 | ["MIT"] | null | null | null | commands/helpers/load_inventory.py | TiesWestendorp/MTG-Deck-Organizer | 409185c1bcfc8bf70c441d3242ed10c7c41f9d90 | ["MIT"] | null | null | null
from helpers.file_to_dict import file_to_dict
def load_inventory():
"""Load the inventory"""
return file_to_dict('../data/inventory.txt')
e69c8c4a4b7c3dff8c8723bc49cddbc98a219dbe | 1,553 | py | Python | test_maximum_product_subarray.py | jaebradley/leetcode.py | 64634cc7d0e975ddd163f35acb18cc92960b8eb5 | ["MIT"] | null | null | null | test_maximum_product_subarray.py | jaebradley/leetcode.py | 64634cc7d0e975ddd163f35acb18cc92960b8eb5 | ["MIT"] | 2 | 2019-11-13T19:55:49.000Z | 2019-11-13T19:55:57.000Z | test_maximum_product_subarray.py | jaebradley/leetcode.py | 64634cc7d0e975ddd163f35acb18cc92960b8eb5 | ["MIT"] | null | null | null
from unittest import TestCase
from maximum_product_subarray import Solution
class TestMaximumProductSubarray(TestCase):
def test_with_positive_numbers(self):
self.assertEqual(Solution().maxProduct([1, 2, 3, 4]), 24)
def test_with_positive_numbers_with_zero_at_beginning(self):
self.assertEqual(Solution().maxProduct([0, 4, 5]), 20)
def test_with_positive_numbers_with_zero_in_the_middle(self):
self.assertEqual(Solution().maxProduct([4, 0, 5]), 5)
def test_with_positive_numbers_with_zero_at_end(self):
self.assertEqual(Solution().maxProduct([4, 5, 0]), 20)
def test_with_even_number_of_negative_numbers(self):
self.assertEqual(Solution().maxProduct([-1, -2, -3, -4]), 24)
def test_with_negative_numbers_with_zero_at_beginning(self):
self.assertEqual(Solution().maxProduct([0, -4, -5]), 20)
def test_with_negative_numbers_with_zero_in_the_middle(self):
self.assertEqual(Solution().maxProduct([-4, 0, -5]), 0)
def test_with_negative_numbers_with_zero_at_end(self):
self.assertEqual(Solution().maxProduct([-4, -5, 0]), 20)
def test_with_odd_number_of_negative_numbers_with_zero_at_beginning(self):
self.assertEqual(Solution().maxProduct([0, 4, -5]), 4)
def test_with_odd_numbers_of_negative_numbers_with_zero_in_the_middle(self):
self.assertEqual(Solution().maxProduct([4, 0, -5]), 4)
def test_with_odd_numbers_of_negative_numbers_with_zero_at_end(self):
self.assertEqual(Solution().maxProduct([4, -5, 0]), 4)
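The `Solution` class these tests import from `maximum_product_subarray` is not part of this chunk. A standard O(n) sketch that satisfies the assertions above tracks both the largest and smallest running products, because multiplying by a negative number swaps the extremes; this is an illustration of the usual technique, not necessarily the repository's implementation:

```python
class Solution:
    def maxProduct(self, nums):
        # Track the max and min product of a subarray ending at each
        # index; a negative factor can turn the smallest running
        # product into the largest one, so both must be carried along.
        best = current_max = current_min = nums[0]
        for n in nums[1:]:
            candidates = (n, current_max * n, current_min * n)
            current_max = max(candidates)
            current_min = min(candidates)
            best = max(best, current_max)
        return best
```

With this sketch, `Solution().maxProduct([4, 0, 5])` returns 5, matching `test_with_positive_numbers_with_zero_in_the_middle`.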
e6bd65b7d46897b79d501cf5b0b788b3bd8b5fc9 | 184 | py | Python | dataverse/settings/__init__.py | SamiSousa/dataverse-client-python | 0730a657d44782fe8f06d72e9a14e975ac8caa50 | ["Apache-2.0"] | 29 | 2015-08-10T14:11:09.000Z | 2021-05-18T18:36:08.000Z | dataverse/settings/__init__.py | SamiSousa/dataverse-client-python | 0730a657d44782fe8f06d72e9a14e975ac8caa50 | ["Apache-2.0"] | 45 | 2015-04-14T20:56:21.000Z | 2019-06-28T18:09:08.000Z | dataverse/settings/__init__.py | SamiSousa/dataverse-client-python | 0730a657d44782fe8f06d72e9a14e975ac8caa50 | ["Apache-2.0"] | 24 | 2015-04-29T10:15:04.000Z | 2019-12-10T15:34:34.000Z
from __future__ import absolute_import
from dataverse.settings.defaults import * # noqa
try:
from dataverse.settings.local import * # noqa
except ImportError:
    pass
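The `try`/`except ImportError` in `dataverse/settings/__init__.py` above implements a common defaults-plus-optional-local-overrides pattern: star-import the defaults, then star-import an optional local module whose values shadow them, silently skipping it when absent. A minimal self-contained sketch of the same pattern; the `defaults_demo`/`local_demo` module names and their settings are hypothetical and written to a temp directory purely for illustration:

```python
import os
import sys
import tempfile

# Write two throwaway modules: baseline defaults and a partial local
# override (hypothetical names, created only for this demonstration).
pkg = tempfile.mkdtemp()
with open(os.path.join(pkg, "defaults_demo.py"), "w") as f:
    f.write("TIMEOUT = 30\nHOST = 'example.org'\n")
with open(os.path.join(pkg, "local_demo.py"), "w") as f:
    f.write("TIMEOUT = 5\n")  # overrides TIMEOUT only

sys.path.insert(0, pkg)

from defaults_demo import *  # noqa  (baseline settings)
try:
    from local_demo import *  # noqa  (overrides shadow the defaults)
except ImportError:
    pass  # no local settings module; keep the defaults

print(TIMEOUT, HOST)  # prints: 5 example.org
```

Values the local module defines win, while everything it omits falls through to the defaults, which is why the `ImportError` is swallowed rather than reported.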
fc31e2b3784eecf933b9eb968a8a4a37777fc8d8 | 149,977 | py | Python | sdk/agrifood/azure-agrifood-farming/azure/agrifood/farming/models/_models.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | ["MIT"] | 2,728 | 2015-01-09T10:19:32.000Z | 2022-03-31T14:50:33.000Z | sdk/agrifood/azure-agrifood-farming/azure/agrifood/farming/models/_models.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | ["MIT"] | 17,773 | 2015-01-05T15:57:17.000Z | 2022-03-31T23:50:25.000Z | sdk/agrifood/azure-agrifood-farming/azure/agrifood/farming/models/_models.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | ["MIT"] | 1,916 | 2015-01-19T05:05:41.000Z | 2022-03-31T19:36:44.000Z
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from azure.core.exceptions import HttpResponseError
import msrest.serialization
class ApplicationData(msrest.serialization.Model):
"""Schema of application data resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param application_product_details: Application product details.
:type application_product_details:
list[~azure.agrifood.farming.models.ApplicationProductDetail]
:param avg_material: Schema for storing measurement reading and unit.
:type avg_material: ~azure.agrifood.farming.models.Measure
:param total_material: Schema for storing measurement reading and unit.
:type total_material: ~azure.agrifood.farming.models.Measure
:param area: Schema for storing measurement reading and unit.
:type area: ~azure.agrifood.farming.models.Measure
:param source: Source of the operation data.
:type source: str
:param operation_modified_date_time: Modified date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
Note: this will be specified by the source provider itself.
:type operation_modified_date_time: ~datetime.datetime
:param operation_start_date_time: Start date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_start_date_time: ~datetime.datetime
:param operation_end_date_time: End date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_end_date_time: ~datetime.datetime
:ivar attachments_link: Link for attachments.
:vartype attachments_link: str
:param associated_boundary_id: Optional boundary ID of the field for which operation was
applied.
:type associated_boundary_id: str
:param operation_boundary_id: Optional boundary ID of the actual area for which operation was
applied inside the specified field.
:type operation_boundary_id: str
:ivar farmer_id: Farmer ID which belongs to the operation data.
:vartype farmer_id: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'source': {'max_length': 100, 'min_length': 2},
'attachments_link': {'readonly': True},
'farmer_id': {'readonly': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'application_product_details': {'key': 'applicationProductDetails', 'type': '[ApplicationProductDetail]'},
'avg_material': {'key': 'avgMaterial', 'type': 'Measure'},
'total_material': {'key': 'totalMaterial', 'type': 'Measure'},
'area': {'key': 'area', 'type': 'Measure'},
'source': {'key': 'source', 'type': 'str'},
'operation_modified_date_time': {'key': 'operationModifiedDateTime', 'type': 'iso-8601'},
'operation_start_date_time': {'key': 'operationStartDateTime', 'type': 'iso-8601'},
'operation_end_date_time': {'key': 'operationEndDateTime', 'type': 'iso-8601'},
'attachments_link': {'key': 'attachmentsLink', 'type': 'str'},
'associated_boundary_id': {'key': 'associatedBoundaryId', 'type': 'str'},
'operation_boundary_id': {'key': 'operationBoundaryId', 'type': 'str'},
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(ApplicationData, self).__init__(**kwargs)
self.application_product_details = kwargs.get('application_product_details', None)
self.avg_material = kwargs.get('avg_material', None)
self.total_material = kwargs.get('total_material', None)
self.area = kwargs.get('area', None)
self.source = kwargs.get('source', None)
self.operation_modified_date_time = kwargs.get('operation_modified_date_time', None)
self.operation_start_date_time = kwargs.get('operation_start_date_time', None)
self.operation_end_date_time = kwargs.get('operation_end_date_time', None)
self.attachments_link = None
self.associated_boundary_id = kwargs.get('associated_boundary_id', None)
self.operation_boundary_id = kwargs.get('operation_boundary_id', None)
self.farmer_id = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class ApplicationDataListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.ApplicationData]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[ApplicationData]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(ApplicationDataListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class ApplicationProductDetail(msrest.serialization.Model):
"""Schema of product used during application.
:param product_name: Name of the product applied.
:type product_name: str
:param is_carrier: A flag indicating whether product is a carrier for a tank mix.
:type is_carrier: bool
:param avg_material: Schema for storing measurement reading and unit.
:type avg_material: ~azure.agrifood.farming.models.Measure
:param total_material: Schema for storing measurement reading and unit.
:type total_material: ~azure.agrifood.farming.models.Measure
"""
_validation = {
'product_name': {'max_length': 100, 'min_length': 1},
}
_attribute_map = {
'product_name': {'key': 'productName', 'type': 'str'},
'is_carrier': {'key': 'isCarrier', 'type': 'bool'},
'avg_material': {'key': 'avgMaterial', 'type': 'Measure'},
'total_material': {'key': 'totalMaterial', 'type': 'Measure'},
}
def __init__(
self,
**kwargs
):
super(ApplicationProductDetail, self).__init__(**kwargs)
self.product_name = kwargs.get('product_name', None)
self.is_carrier = kwargs.get('is_carrier', False)
self.avg_material = kwargs.get('avg_material', None)
self.total_material = kwargs.get('total_material', None)
class Attachment(msrest.serialization.Model):
"""Schema of attachment resource.
Variables are only populated by the server, and will be ignored when sending a request.
:ivar farmer_id: Farmer id for this attachment.
:vartype farmer_id: str
:param resource_id: Associated Resource id for this attachment.
:type resource_id: str
:param resource_type: Associated Resource type for this attachment
i.e. Farmer, Farm, Field, SeasonalField, Boundary, FarmOperationApplicationData, HarvestData,
TillageData, PlantingData.
:type resource_type: str
:ivar original_file_name: Original File Name for this attachment.
:vartype original_file_name: str
:ivar id: Unique id.
:vartype id: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date when resource was created.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date when resource was last modified.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of resource.
:type description: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
"""
_validation = {
'farmer_id': {'readonly': True},
'original_file_name': {'readonly': True},
'id': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
'e_tag': {'readonly': True},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'resource_id': {'key': 'resourceId', 'type': 'str'},
'resource_type': {'key': 'resourceType', 'type': 'str'},
'original_file_name': {'key': 'originalFileName', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(Attachment, self).__init__(**kwargs)
self.farmer_id = None
self.resource_id = kwargs.get('resource_id', None)
self.resource_type = kwargs.get('resource_type', None)
self.original_file_name = None
self.id = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.e_tag = None
class AttachmentListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Attachment]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Attachment]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(AttachmentListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class Boundary(msrest.serialization.Model):
"""Schema of boundary resource.
Variables are only populated by the server, and will be ignored when sending a request.
:ivar farmer_id: Farmer ID.
:vartype farmer_id: str
:param parent_id: ID of the parent(field or seasonalField) it belongs to.
:type parent_id: str
:param geometry: GeoJSON abstract class.
:type geometry: ~azure.agrifood.farming.models.GeoJsonObject
:param is_primary: Is the boundary primary.
:type is_primary: bool
:ivar acreage: Boundary area in acres.
:vartype acreage: float
:ivar parent_type: Type of the parent it belongs to.
:vartype parent_type: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'readonly': True},
'acreage': {'readonly': True},
'parent_type': {'readonly': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'parent_id': {'key': 'parentId', 'type': 'str'},
'geometry': {'key': 'geometry', 'type': 'GeoJsonObject'},
'is_primary': {'key': 'isPrimary', 'type': 'bool'},
'acreage': {'key': 'acreage', 'type': 'float'},
'parent_type': {'key': 'parentType', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(Boundary, self).__init__(**kwargs)
self.farmer_id = None
self.parent_id = kwargs.get('parent_id', None)
self.geometry = kwargs.get('geometry', None)
self.is_primary = kwargs.get('is_primary', None)
self.acreage = None
self.parent_type = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class BoundaryListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Boundary]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Boundary]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(BoundaryListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class BoundaryOverlapResponse(msrest.serialization.Model):
"""Schema of boundary overlap response.
:param boundary_acreage: Acreage of Main boundary.
:type boundary_acreage: float
:param other_boundary_acreage: Acreage of other boundary.
:type other_boundary_acreage: float
:param intersecting_acreage: Acreage of intersecting boundary.
:type intersecting_acreage: float
"""
_attribute_map = {
'boundary_acreage': {'key': 'boundaryAcreage', 'type': 'float'},
'other_boundary_acreage': {'key': 'otherBoundaryAcreage', 'type': 'float'},
'intersecting_acreage': {'key': 'intersectingAcreage', 'type': 'float'},
}
def __init__(
self,
**kwargs
):
super(BoundaryOverlapResponse, self).__init__(**kwargs)
self.boundary_acreage = kwargs.get('boundary_acreage', None)
self.other_boundary_acreage = kwargs.get('other_boundary_acreage', None)
self.intersecting_acreage = kwargs.get('intersecting_acreage', None)
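Given the three acreage figures in `BoundaryOverlapResponse`, a caller can derive the relative overlap of the two boundaries. The helper below is purely illustrative and not part of the generated client:

```python
def overlap_percentages(boundary_acreage, other_boundary_acreage, intersecting_acreage):
    """Return the intersecting area as a percentage of each boundary.

    Mirrors the three fields of BoundaryOverlapResponse; this helper itself
    is an illustrative assumption, not an SDK API.
    """
    if boundary_acreage <= 0 or other_boundary_acreage <= 0:
        raise ValueError("acreage values must be positive")
    return (
        100.0 * intersecting_acreage / boundary_acreage,
        100.0 * intersecting_acreage / other_boundary_acreage,
    )

# 50 acres shared between a 200-acre main boundary and a 100-acre other boundary.
main_pct, other_pct = overlap_percentages(200.0, 100.0, 50.0)  # 25.0, 50.0
```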
class CascadeDeleteJob(msrest.serialization.Model):
"""Schema of cascade delete job.
Variables are only populated by the server, and will be ignored when sending a request.
All required parameters must be populated in order to send to Azure.
:param farmer_id: Required. Farmer ID.
:type farmer_id: str
:param resource_id: Required. The id of the resource.
:type resource_id: str
:param resource_type: Required. The type of the resource.
:type resource_type: str
:ivar id: Unique job id.
:vartype id: str
:ivar status: Status of the job.
Possible values: 'Waiting', 'Running', 'Succeeded', 'Failed', 'Cancelled'.
:vartype status: str
:ivar duration_in_seconds: Duration of the job in seconds.
:vartype duration_in_seconds: float
:ivar message: Status message to capture more details of the job.
:vartype message: str
:ivar created_date_time: Job created at dateTime. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar last_action_date_time: Job was last acted upon at dateTime. Sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype last_action_date_time: ~datetime.datetime
:ivar start_time: Job start time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype start_time: ~datetime.datetime
:ivar end_time: Job end time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype end_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'required': True},
'resource_id': {'required': True},
'resource_type': {'required': True},
'id': {'readonly': True},
'status': {'readonly': True},
'duration_in_seconds': {'readonly': True},
'message': {'readonly': True},
'created_date_time': {'readonly': True},
'last_action_date_time': {'readonly': True},
'start_time': {'readonly': True},
'end_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'resource_id': {'key': 'resourceId', 'type': 'str'},
'resource_type': {'key': 'resourceType', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'duration_in_seconds': {'key': 'durationInSeconds', 'type': 'float'},
'message': {'key': 'message', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'last_action_date_time': {'key': 'lastActionDateTime', 'type': 'iso-8601'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(CascadeDeleteJob, self).__init__(**kwargs)
self.farmer_id = kwargs['farmer_id']
self.resource_id = kwargs['resource_id']
self.resource_type = kwargs['resource_type']
self.id = None
self.status = None
self.duration_in_seconds = None
self.message = None
self.created_date_time = None
self.last_action_date_time = None
self.start_time = None
self.end_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
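`CascadeDeleteJob.__init__` reads required fields with `kwargs['...']` and optional ones with `kwargs.get(...)`, so omitting a required field fails immediately with `KeyError` instead of silently producing an incomplete model. A stripped-down sketch of that pattern (a plain class, not the real msrest model):

```python
class RequiredKwargsSketch:
    """Minimal stand-in for the required-vs-optional kwarg pattern used by
    CascadeDeleteJob.__init__ (not the real msrest model)."""
    def __init__(self, **kwargs):
        # Required: direct indexing raises KeyError when the key is absent.
        self.farmer_id = kwargs['farmer_id']
        self.resource_id = kwargs['resource_id']
        # Optional: .get() quietly defaults to None.
        self.name = kwargs.get('name', None)

job = RequiredKwargsSketch(farmer_id="f1", resource_id="r1")

try:
    RequiredKwargsSketch(resource_id="r1")  # farmer_id missing
    missing_raised = False
except KeyError:
    missing_raised = True
```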
class Crop(msrest.serialization.Model):
"""Schema of crop resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param phenotype: Crop phenotype.
:type phenotype: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'phenotype': {'max_length': 100, 'min_length': 0},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'phenotype': {'key': 'phenotype', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(Crop, self).__init__(**kwargs)
self.phenotype = kwargs.get('phenotype', None)
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)


class CropListResponse(msrest.serialization.Model):
"""Paged response containing a list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Crop]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Crop]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(CropListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)


class CropVariety(msrest.serialization.Model):
"""Schema of crop variety resource.
Variables are only populated by the server, and will be ignored when sending a request.
:ivar crop_id: ID of the crop it belongs to.
:vartype crop_id: str
:param brand: CropVariety Brand.
:type brand: str
:param product: CropVariety product.
:type product: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'crop_id': {'readonly': True},
'brand': {'max_length': 100, 'min_length': 0},
'product': {'max_length': 100, 'min_length': 0},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'crop_id': {'key': 'cropId', 'type': 'str'},
'brand': {'key': 'brand', 'type': 'str'},
'product': {'key': 'product', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(CropVariety, self).__init__(**kwargs)
self.crop_id = None
self.brand = kwargs.get('brand', None)
self.product = kwargs.get('product', None)
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)


class CropVarietyListResponse(msrest.serialization.Model):
"""Paged response containing a list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.CropVariety]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[CropVariety]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(CropVarietyListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)


class Error(msrest.serialization.Model):
"""An error from the Azure AgPlatform service.
:param code: Server-defined set of error codes.
:type code: str
:param message: Human-readable representation of the error.
:type message: str
:param target: Target of the error.
:type target: str
:param details: Array of details about specific errors that led to this reported error.
:type details: list[~azure.agrifood.farming.models.Error]
:param innererror: Inner error containing list of errors.
:code:`<see
href="https://github.com/Microsoft/api-guidelines/blob/vNext/Guidelines.md#innererror--object">InnerError
reference document</see>`.
:type innererror: ~azure.agrifood.farming.models.InnerError
"""
_attribute_map = {
'code': {'key': 'code', 'type': 'str'},
'message': {'key': 'message', 'type': 'str'},
'target': {'key': 'target', 'type': 'str'},
'details': {'key': 'details', 'type': '[Error]'},
'innererror': {'key': 'innererror', 'type': 'InnerError'},
}
def __init__(
self,
**kwargs
):
super(Error, self).__init__(**kwargs)
self.code = kwargs.get('code', None)
self.message = kwargs.get('message', None)
self.target = kwargs.get('target', None)
self.details = kwargs.get('details', None)
self.innererror = kwargs.get('innererror', None)


class ErrorResponse(msrest.serialization.Model):
"""An error response from the Azure AgPlatform service.
:code:`<see href="https://github.com/Microsoft/api-guidelines/blob/vNext/Guidelines.md#7102-error-condition-responses">ErrorResponse reference document.</see>`.
:param error: An error from the Azure AgPlatform service.
:type error: ~azure.agrifood.farming.models.Error
:param trace_id: Unique trace ID.
:type trace_id: str
"""
_attribute_map = {
'error': {'key': 'error', 'type': 'Error'},
'trace_id': {'key': 'traceId', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(ErrorResponse, self).__init__(**kwargs)
self.error = kwargs.get('error', None)
self.trace_id = kwargs.get('trace_id', None)


class Farm(msrest.serialization.Model):
"""Schema of farm resource.
Variables are only populated by the server, and will be ignored when sending a request.
:ivar farmer_id: Farmer ID.
:vartype farmer_id: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'readonly': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(Farm, self).__init__(**kwargs)
self.farmer_id = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)


class Farmer(msrest.serialization.Model):
"""Schema of farmer resource.
Variables are only populated by the server, and will be ignored when sending a request.
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(Farmer, self).__init__(**kwargs)
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)


class FarmerListResponse(msrest.serialization.Model):
"""Paged response containing a list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Farmer]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Farmer]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(FarmerListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)


class FarmListResponse(msrest.serialization.Model):
"""Paged response containing a list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Farm]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Farm]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(FarmListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)


class FarmOperationDataIngestionJob(msrest.serialization.Model):
"""Schema of farm operation data ingestion job.
Variables are only populated by the server, and will be ignored when sending a request.
All required parameters must be populated in order to send to Azure.
:param farmer_id: Required. Farmer ID.
:type farmer_id: str
:param auth_provider_id: Required. Authentication provider ID.
:type auth_provider_id: str
:param operations: List of operation types for which data needs to be downloaded. Available
values: AllOperations, Application, Planting, Harvest, Tillage.
:type operations: list[str]
:param start_year: Required. Start Year (Minimum = 2000, Maximum = CurrentYear).
:type start_year: int
:ivar id: Unique job id.
:vartype id: str
:ivar status: Status of the job.
Possible values: 'Waiting', 'Running', 'Succeeded', 'Failed', 'Cancelled'.
:vartype status: str
:ivar duration_in_seconds: Duration of the job in seconds.
:vartype duration_in_seconds: float
:ivar message: Status message to capture more details of the job.
:vartype message: str
:ivar created_date_time: Job created at dateTime. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar last_action_date_time: Job was last acted upon at dateTime. Sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype last_action_date_time: ~datetime.datetime
:ivar start_time: Job start time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype start_time: ~datetime.datetime
:ivar end_time: Job end time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype end_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'required': True},
'auth_provider_id': {'required': True},
'start_year': {'required': True},
'id': {'readonly': True},
'status': {'readonly': True},
'duration_in_seconds': {'readonly': True},
'message': {'readonly': True},
'created_date_time': {'readonly': True},
'last_action_date_time': {'readonly': True},
'start_time': {'readonly': True},
'end_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'auth_provider_id': {'key': 'authProviderId', 'type': 'str'},
'operations': {'key': 'operations', 'type': '[str]'},
'start_year': {'key': 'startYear', 'type': 'int'},
'id': {'key': 'id', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'duration_in_seconds': {'key': 'durationInSeconds', 'type': 'float'},
'message': {'key': 'message', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'last_action_date_time': {'key': 'lastActionDateTime', 'type': 'iso-8601'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(FarmOperationDataIngestionJob, self).__init__(**kwargs)
self.farmer_id = kwargs['farmer_id']
self.auth_provider_id = kwargs['auth_provider_id']
self.operations = kwargs.get('operations', None)
self.start_year = kwargs['start_year']
self.id = None
self.status = None
self.duration_in_seconds = None
self.message = None
self.created_date_time = None
self.last_action_date_time = None
self.start_time = None
self.end_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
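The docstring above documents `start_year` as bounded (minimum 2000, maximum the current year). The service performs the authoritative validation, but a client-side pre-check might look like this sketch:

```python
import datetime

def check_start_year(start_year):
    """Client-side pre-check mirroring the documented constraint
    (2000 <= start_year <= current year). Illustrative only; the
    service performs the authoritative validation."""
    current = datetime.date.today().year
    if not 2000 <= start_year <= current:
        raise ValueError(
            f"start_year must be between 2000 and {current}, got {start_year}"
        )
    return start_year
```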
class Field(msrest.serialization.Model):
"""Schema of field resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param farm_id: ID of the associated Farm.
:type farm_id: str
:ivar farmer_id: Farmer ID.
:vartype farmer_id: str
:ivar primary_boundary_id: Primary boundary id.
:vartype primary_boundary_id: str
:ivar boundary_ids: Boundary Ids.
:vartype boundary_ids: list[str]
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'readonly': True},
'primary_boundary_id': {'readonly': True},
'boundary_ids': {'readonly': True, 'unique': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farm_id': {'key': 'farmId', 'type': 'str'},
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'primary_boundary_id': {'key': 'primaryBoundaryId', 'type': 'str'},
'boundary_ids': {'key': 'boundaryIds', 'type': '[str]'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(Field, self).__init__(**kwargs)
self.farm_id = kwargs.get('farm_id', None)
self.farmer_id = None
self.primary_boundary_id = None
self.boundary_ids = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)


class FieldListResponse(msrest.serialization.Model):
"""Paged response containing a list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Field]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Field]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(FieldListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)


class GeoJsonObject(msrest.serialization.Model):
"""GeoJSON abstract class.
You probably want to use the sub-classes and not this class directly. Known
sub-classes are: MultiPolygon, Point, Polygon.
All required parameters must be populated in order to send to Azure.
:param type: Required. GeoJSON object type. Constant filled by server. Possible values include:
"Point", "Polygon", "MultiPolygon".
:type type: str or ~azure.agrifood.farming.models.GeoJsonObjectType
"""
_validation = {
'type': {'required': True},
}
_attribute_map = {
'type': {'key': 'type', 'type': 'str'},
}
_subtype_map = {
'type': {'MultiPolygon': 'MultiPolygon', 'Point': 'Point', 'Polygon': 'Polygon'}
}
def __init__(
self,
**kwargs
):
super(GeoJsonObject, self).__init__(**kwargs)
self.type = None # type: Optional[str]
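`_subtype_map` is how the msrest deserializer picks a concrete subclass from the `type` discriminator in the payload. A simplified sketch of that lookup, using placeholder classes instead of the real GeoJSON models:

```python
# Placeholder classes standing in for the real GeoJSON models.
class Point: pass
class Polygon: pass
class MultiPolygon: pass

# Mirrors GeoJsonObject._subtype_map: discriminator value -> concrete class.
_SUBTYPES = {"Point": Point, "Polygon": Polygon, "MultiPolygon": MultiPolygon}

def resolve_geojson_class(payload):
    """Pick the concrete class from the 'type' discriminator, as the msrest
    deserializer does with _subtype_map (simplified sketch)."""
    try:
        return _SUBTYPES[payload["type"]]
    except KeyError:
        raise ValueError(f"unknown GeoJSON type: {payload.get('type')!r}")

cls = resolve_geojson_class({"type": "Polygon", "coordinates": []})
```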
class HarvestData(msrest.serialization.Model):
"""Schema of harvest data resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param total_yield: Schema for storing measurement reading and unit.
:type total_yield: ~azure.agrifood.farming.models.Measure
:param avg_yield: Schema for storing measurement reading and unit.
:type avg_yield: ~azure.agrifood.farming.models.Measure
:param total_wet_mass: Schema for storing measurement reading and unit.
:type total_wet_mass: ~azure.agrifood.farming.models.Measure
:param avg_wet_mass: Schema for storing measurement reading and unit.
:type avg_wet_mass: ~azure.agrifood.farming.models.Measure
:param avg_moisture: Schema for storing measurement reading and unit.
:type avg_moisture: ~azure.agrifood.farming.models.Measure
:param avg_speed: Schema for storing measurement reading and unit.
:type avg_speed: ~azure.agrifood.farming.models.Measure
:param harvest_product_details: Harvest product details.
:type harvest_product_details: list[~azure.agrifood.farming.models.HarvestProductDetail]
:param area: Schema for storing measurement reading and unit.
:type area: ~azure.agrifood.farming.models.Measure
:param source: Source of the operation data.
:type source: str
:param operation_modified_date_time: Modified date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
Note: this will be specified by the source provider itself.
:type operation_modified_date_time: ~datetime.datetime
:param operation_start_date_time: Start date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_start_date_time: ~datetime.datetime
:param operation_end_date_time: End date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_end_date_time: ~datetime.datetime
:ivar attachments_link: Link for attachments.
:vartype attachments_link: str
:param associated_boundary_id: Optional boundary ID of the field for which operation was
applied.
:type associated_boundary_id: str
:param operation_boundary_id: Optional boundary ID of the actual area for which operation was
applied inside the specified field.
:type operation_boundary_id: str
:ivar farmer_id: Farmer ID which belongs to the operation data.
:vartype farmer_id: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'source': {'max_length': 100, 'min_length': 2},
'attachments_link': {'readonly': True},
'farmer_id': {'readonly': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'total_yield': {'key': 'totalYield', 'type': 'Measure'},
'avg_yield': {'key': 'avgYield', 'type': 'Measure'},
'total_wet_mass': {'key': 'totalWetMass', 'type': 'Measure'},
'avg_wet_mass': {'key': 'avgWetMass', 'type': 'Measure'},
'avg_moisture': {'key': 'avgMoisture', 'type': 'Measure'},
'avg_speed': {'key': 'avgSpeed', 'type': 'Measure'},
'harvest_product_details': {'key': 'harvestProductDetails', 'type': '[HarvestProductDetail]'},
'area': {'key': 'area', 'type': 'Measure'},
'source': {'key': 'source', 'type': 'str'},
'operation_modified_date_time': {'key': 'operationModifiedDateTime', 'type': 'iso-8601'},
'operation_start_date_time': {'key': 'operationStartDateTime', 'type': 'iso-8601'},
'operation_end_date_time': {'key': 'operationEndDateTime', 'type': 'iso-8601'},
'attachments_link': {'key': 'attachmentsLink', 'type': 'str'},
'associated_boundary_id': {'key': 'associatedBoundaryId', 'type': 'str'},
'operation_boundary_id': {'key': 'operationBoundaryId', 'type': 'str'},
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(HarvestData, self).__init__(**kwargs)
self.total_yield = kwargs.get('total_yield', None)
self.avg_yield = kwargs.get('avg_yield', None)
self.total_wet_mass = kwargs.get('total_wet_mass', None)
self.avg_wet_mass = kwargs.get('avg_wet_mass', None)
self.avg_moisture = kwargs.get('avg_moisture', None)
self.avg_speed = kwargs.get('avg_speed', None)
self.harvest_product_details = kwargs.get('harvest_product_details', None)
self.area = kwargs.get('area', None)
self.source = kwargs.get('source', None)
self.operation_modified_date_time = kwargs.get('operation_modified_date_time', None)
self.operation_start_date_time = kwargs.get('operation_start_date_time', None)
self.operation_end_date_time = kwargs.get('operation_end_date_time', None)
self.attachments_link = None
self.associated_boundary_id = kwargs.get('associated_boundary_id', None)
self.operation_boundary_id = kwargs.get('operation_boundary_id', None)
self.farmer_id = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
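The `_attribute_map` above pairs each snake_case Python attribute with its camelCase wire key. As an illustrative sketch only (not the actual msrest serializer), mapping attributes to a JSON body from such a map might look like this; `Stub` and `serialize` are hypothetical names introduced here:

```python
# Illustrative sketch (not the msrest implementation): how an
# _attribute_map like HarvestData's can drive serialization from
# snake_case Python attributes to camelCase JSON keys.

def serialize(obj, attribute_map):
    """Map attributes to wire-format keys, skipping unset values."""
    body = {}
    for attr, meta in attribute_map.items():
        value = getattr(obj, attr, None)
        if value is not None:
            body[meta['key']] = value
    return body

class Stub:
    pass

data = Stub()
data.source = 'Sensor'
data.associated_boundary_id = 'boundary-1'
data.total_yield = None  # unset fields are omitted from the payload

attribute_map = {
    'source': {'key': 'source', 'type': 'str'},
    'associated_boundary_id': {'key': 'associatedBoundaryId', 'type': 'str'},
    'total_yield': {'key': 'totalYield', 'type': 'Measure'},
}

print(serialize(data, attribute_map))
# → {'source': 'Sensor', 'associatedBoundaryId': 'boundary-1'}
```

The real serializer also applies the `'type'` entries (e.g. `'iso-8601'`, `'Measure'`) to convert values; this sketch only shows the key mapping.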
class HarvestDataListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.HarvestData]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[HarvestData]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(HarvestDataListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
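Every `*ListResponse` model follows the same paging contract: collect `value`, then follow `skipToken` until it is null. A minimal client-side loop, with `fetch_page` and `PAGES` as hypothetical stand-ins for the HTTP call (the real SDK exposes pagers that do this for you), could look like:

```python
# Hypothetical pagination loop over a *ListResponse-shaped payload.
# PAGES and fetch_page are stand-ins for the service; keys mirror the
# wire names ('value', 'skipToken').

PAGES = {
    None: {'value': ['h1', 'h2'], 'skipToken': 't1'},
    't1': {'value': ['h3'], 'skipToken': None},
}

def fetch_page(skip_token):
    return PAGES[skip_token]

def list_all():
    items, token = [], None
    while True:
        page = fetch_page(token)
        items.extend(page['value'])
        token = page.get('skipToken')
        if not token:  # a null token means there are no more pages
            return items

print(list_all())
# → ['h1', 'h2', 'h3']
```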
class HarvestProductDetail(msrest.serialization.Model):
"""Schema of product used during harvesting.
:param product_name: Name of the product.
:type product_name: str
:param area: Schema for storing measurement reading and unit.
:type area: ~azure.agrifood.farming.models.Measure
:param total_yield: Schema for storing measurement reading and unit.
:type total_yield: ~azure.agrifood.farming.models.Measure
:param avg_yield: Schema for storing measurement reading and unit.
:type avg_yield: ~azure.agrifood.farming.models.Measure
:param avg_moisture: Schema for storing measurement reading and unit.
:type avg_moisture: ~azure.agrifood.farming.models.Measure
:param total_wet_mass: Schema for storing measurement reading and unit.
:type total_wet_mass: ~azure.agrifood.farming.models.Measure
:param avg_wet_mass: Schema for storing measurement reading and unit.
:type avg_wet_mass: ~azure.agrifood.farming.models.Measure
"""
_validation = {
'product_name': {'max_length': 100, 'min_length': 1},
}
_attribute_map = {
'product_name': {'key': 'productName', 'type': 'str'},
'area': {'key': 'area', 'type': 'Measure'},
'total_yield': {'key': 'totalYield', 'type': 'Measure'},
'avg_yield': {'key': 'avgYield', 'type': 'Measure'},
'avg_moisture': {'key': 'avgMoisture', 'type': 'Measure'},
'total_wet_mass': {'key': 'totalWetMass', 'type': 'Measure'},
'avg_wet_mass': {'key': 'avgWetMass', 'type': 'Measure'},
}
def __init__(
self,
**kwargs
):
super(HarvestProductDetail, self).__init__(**kwargs)
self.product_name = kwargs.get('product_name', None)
self.area = kwargs.get('area', None)
self.total_yield = kwargs.get('total_yield', None)
self.avg_yield = kwargs.get('avg_yield', None)
self.avg_moisture = kwargs.get('avg_moisture', None)
self.total_wet_mass = kwargs.get('total_wet_mass', None)
self.avg_wet_mass = kwargs.get('avg_wet_mass', None)
class ImageFile(msrest.serialization.Model):
"""Schema of image file resource.
All required parameters must be populated in order to send to Azure.
:param file_link: Link of the image file.
:type file_link: str
:param name: Required. Name of the image file.
:type name: str
:param image_format: Supported image formats for scene resource. Possible values include:
"TIF".
:type image_format: str or ~azure.agrifood.farming.models.ImageFormat
:param resolution: Resolution of image file in meters.
:type resolution: float
"""
_validation = {
'name': {'required': True},
}
_attribute_map = {
'file_link': {'key': 'fileLink', 'type': 'str'},
'name': {'key': 'name', 'type': 'str'},
'image_format': {'key': 'imageFormat', 'type': 'str'},
'resolution': {'key': 'resolution', 'type': 'float'},
}
def __init__(
self,
**kwargs
):
super(ImageFile, self).__init__(**kwargs)
self.file_link = kwargs.get('file_link', None)
self.name = kwargs['name']
self.image_format = kwargs.get('image_format', None)
self.resolution = kwargs.get('resolution', None)
class ImageProcessingRasterizeJob(msrest.serialization.Model):
"""ImageProcessingRasterizeJob.
Variables are only populated by the server, and will be ignored when sending a request.
All required parameters must be populated in order to send to Azure.
:param farmer_id: Required. Farmer ID.
:type farmer_id: str
:param shapefile_attachment_id: Required. Shapefile attachment ID.
:type shapefile_attachment_id: str
:param shapefile_column_names: Required. List of shapefile column names to create raster
attachments.
:type shapefile_column_names: list[str]
:ivar id: Unique job id.
:vartype id: str
:ivar status: Status of the job.
Possible values: 'Waiting', 'Running', 'Succeeded', 'Failed', 'Cancelled'.
:vartype status: str
:ivar duration_in_seconds: Duration of the job in seconds.
:vartype duration_in_seconds: float
:ivar message: Status message to capture more details of the job.
:vartype message: str
:ivar created_date_time: Job created at dateTime. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar last_action_date_time: Job was last acted upon at dateTime. Sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype last_action_date_time: ~datetime.datetime
:ivar start_time: Job start time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype start_time: ~datetime.datetime
:ivar end_time: Job end time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype end_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'required': True},
'shapefile_attachment_id': {'required': True},
'shapefile_column_names': {'required': True},
'id': {'readonly': True},
'status': {'readonly': True},
'duration_in_seconds': {'readonly': True},
'message': {'readonly': True},
'created_date_time': {'readonly': True},
'last_action_date_time': {'readonly': True},
'start_time': {'readonly': True},
'end_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'shapefile_attachment_id': {'key': 'shapefileAttachmentId', 'type': 'str'},
'shapefile_column_names': {'key': 'shapefileColumnNames', 'type': '[str]'},
'id': {'key': 'id', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'duration_in_seconds': {'key': 'durationInSeconds', 'type': 'float'},
'message': {'key': 'message', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'last_action_date_time': {'key': 'lastActionDateTime', 'type': 'iso-8601'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(ImageProcessingRasterizeJob, self).__init__(**kwargs)
self.farmer_id = kwargs['farmer_id']
self.shapefile_attachment_id = kwargs['shapefile_attachment_id']
self.shapefile_column_names = kwargs['shapefile_column_names']
self.id = None
self.status = None
self.duration_in_seconds = None
self.message = None
self.created_date_time = None
self.last_action_date_time = None
self.start_time = None
self.end_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class InnerError(msrest.serialization.Model):
"""Inner error containing list of errors.
:code:`<see href="https://github.com/Microsoft/api-guidelines/blob/vNext/Guidelines.md#innererror--object">InnerError reference document</see>`.
:param additional_properties: Unmatched properties from the message are deserialized to this
collection.
:type additional_properties: dict[str, any]
:param code: Specific error code that was provided by the
containing error.
:type code: str
:param innererror: Inner error containing list of errors.
:code:`<see
href="https://github.com/Microsoft/api-guidelines/blob/vNext/Guidelines.md#innererror--object">InnerError
reference document</see>`.
:type innererror: ~azure.agrifood.farming.models.InnerError
"""
_attribute_map = {
'additional_properties': {'key': '', 'type': '{object}'},
'code': {'key': 'code', 'type': 'str'},
'innererror': {'key': 'innererror', 'type': 'InnerError'},
}
def __init__(
self,
**kwargs
):
super(InnerError, self).__init__(**kwargs)
self.additional_properties = kwargs.get('additional_properties', None)
self.code = kwargs.get('code', None)
self.innererror = kwargs.get('innererror', None)
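Because `InnerError` nests recursively through `innererror`, a common client-side pattern is walking the chain to the most specific code. A sketch over plain dicts mirroring the wire shape (the error codes shown are hypothetical examples, not values from the service):

```python
# Walking a nested innererror payload (plain dicts mirroring the wire
# shape) to find the deepest, most specific error code.

def deepest_code(inner):
    code = None
    while inner is not None:
        code = inner.get('code', code)  # keep the last code seen
        inner = inner.get('innererror')
    return code

payload = {
    'code': 'InvalidRequest',
    'innererror': {'code': 'InvalidGeometry',
                   'innererror': {'code': 'SelfIntersectingPolygon'}},
}
print(deepest_code(payload))
# prints SelfIntersectingPolygon
```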
class Location(msrest.serialization.Model):
"""Location model class.
All required parameters must be populated in order to send to Azure.
:param latitude: Required. Latitude of the location.
:type latitude: float
:param longitude: Required. Longitude of the location.
:type longitude: float
"""
_validation = {
'latitude': {'required': True, 'maximum': 90, 'minimum': -90},
'longitude': {'required': True, 'maximum': 180, 'minimum': -180},
}
_attribute_map = {
'latitude': {'key': 'latitude', 'type': 'float'},
'longitude': {'key': 'longitude', 'type': 'float'},
}
def __init__(
self,
**kwargs
):
super(Location, self).__init__(**kwargs)
self.latitude = kwargs['latitude']
self.longitude = kwargs['longitude']
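`Location` reads its required fields with `kwargs['latitude']` rather than `kwargs.get(...)`, so omitting one raises `KeyError` at construction time. A minimal stand-in replicating that pattern (`LocationSketch` is hypothetical, not the real msrest base class):

```python
# Minimal stand-in for the required-kwarg pattern used by Location:
# required fields use kwargs[...], so a missing one fails immediately.

class LocationSketch:
    def __init__(self, **kwargs):
        self.latitude = kwargs['latitude']    # required -> KeyError if absent
        self.longitude = kwargs['longitude']  # required

loc = LocationSketch(latitude=47.6, longitude=-122.3)
print(loc.latitude, loc.longitude)

try:
    LocationSketch(latitude=47.6)  # missing longitude
except KeyError as exc:
    print('missing required kwarg:', exc)
```

Note that the `_validation` ranges (latitude -90..90, longitude -180..180) are enforced by msrest during serialization, not inside `__init__`.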
class Measure(msrest.serialization.Model):
"""Schema for storing measurement reading and unit.
:param unit: Data unit.
:type unit: str
:param value: Data value.
:type value: float
"""
_validation = {
'unit': {'max_length': 50, 'min_length': 1},
}
_attribute_map = {
'unit': {'key': 'unit', 'type': 'str'},
'value': {'key': 'value', 'type': 'float'},
}
def __init__(
self,
**kwargs
):
super(Measure, self).__init__(**kwargs)
self.unit = kwargs.get('unit', None)
self.value = kwargs.get('value', None)
class MultiPolygonCoordinates(msrest.serialization.Model):
"""Schema of multi polygon coordinates.
All required parameters must be populated in order to send to Azure.
:param coordinates: Required. Gets or sets Coordinates of GeoJSON Object.
It must be an array of polygons; each polygon contains a list of linear rings.
For Polygons with more than one of these rings, the first MUST be the exterior ring,
and any others MUST be interior rings.
:type coordinates: list[list[list[list[float]]]]
"""
_validation = {
'coordinates': {'required': True},
}
_attribute_map = {
'coordinates': {'key': 'coordinates', 'type': '[[[[float]]]]'},
}
def __init__(
self,
**kwargs
):
super(MultiPolygonCoordinates, self).__init__(**kwargs)
self.coordinates = kwargs['coordinates']
class MultiPolygon(GeoJsonObject, MultiPolygonCoordinates):
"""MultiPolygon geometry.
All required parameters must be populated in order to send to Azure.
:param coordinates: Required. Gets or sets Coordinates of GeoJSON Object.
It must be an array of polygons; each polygon contains a list of linear rings.
For Polygons with more than one of these rings, the first MUST be the exterior ring,
and any others MUST be interior rings.
:type coordinates: list[list[list[list[float]]]]
:param type: Required. GeoJSON object type. Constant filled by server. Possible values include:
"Point", "Polygon", "MultiPolygon".
:type type: str or ~azure.agrifood.farming.models.GeoJsonObjectType
"""
_validation = {
'coordinates': {'required': True},
'type': {'required': True},
}
_attribute_map = {
'coordinates': {'key': 'coordinates', 'type': '[[[[float]]]]'},
'type': {'key': 'type', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(MultiPolygon, self).__init__(**kwargs)
self.coordinates = kwargs['coordinates']
self.type = 'MultiPolygon' # type: str
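The `coordinates` type `list[list[list[list[float]]]]` nests four levels deep: multipolygon, then polygon, then linear ring, then `[longitude, latitude]` position. A sketch of a well-formed value (the coordinate numbers are made up for illustration):

```python
# The MultiPolygon 'coordinates' field is nested four deep:
# multipolygon -> polygon -> linear ring -> [lon, lat] position.

multipolygon = [
    [  # polygon 1: exterior ring only (first and last position equal)
        [[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]],
    ],
    [  # polygon 2: exterior ring plus one interior ring (a hole)
        [[2.0, 2.0], [4.0, 2.0], [4.0, 4.0], [2.0, 2.0]],
        [[2.5, 2.5], [3.0, 2.5], [3.0, 3.0], [2.5, 2.5]],
    ],
]

def ring_is_closed(ring):
    """GeoJSON linear rings must start and end at the same position."""
    return ring[0] == ring[-1]

assert all(ring_is_closed(r) for poly in multipolygon for r in poly)
print(len(multipolygon), 'polygons,',
      sum(len(p) for p in multipolygon), 'rings')
# → 2 polygons, 3 rings
```

As the docstring notes, when a polygon has multiple rings the first must be the exterior ring and any others interior rings.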
class OAuthConnectRequest(msrest.serialization.Model):
"""Get OAuth config query parameters.
All required parameters must be populated in order to send to Azure.
:param farmer_id: Required. ID of the farmer.
:type farmer_id: str
:param o_auth_provider_id: Required. ID of the OAuthProvider.
:type o_auth_provider_id: str
:param user_redirect_link: Required. Link to redirect the user to, at the end of the oauth
flow.
:type user_redirect_link: str
:param user_redirect_state: State to provide back when redirecting the user, at the end of the
oauth flow.
:type user_redirect_state: str
"""
_validation = {
'farmer_id': {'required': True},
'o_auth_provider_id': {'required': True},
'user_redirect_link': {'required': True, 'max_length': 1000, 'min_length': 0},
'user_redirect_state': {'max_length': 200, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'o_auth_provider_id': {'key': 'oAuthProviderId', 'type': 'str'},
'user_redirect_link': {'key': 'userRedirectLink', 'type': 'str'},
'user_redirect_state': {'key': 'userRedirectState', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(OAuthConnectRequest, self).__init__(**kwargs)
self.farmer_id = kwargs['farmer_id']
self.o_auth_provider_id = kwargs['o_auth_provider_id']
self.user_redirect_link = kwargs['user_redirect_link']
self.user_redirect_state = kwargs.get('user_redirect_state', None)
class OAuthProvider(msrest.serialization.Model):
"""Schema of OAuth provider resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param app_id: OAuth App ID for given OAuth Provider.
:type app_id: str
:param app_secret: OAuth App secret for given Provider.
Note: Won't be sent in response.
:type app_secret: str
:param api_key: OAuth Api key for given Provider.
Note: currently Applicable to Climate provider. Won't be sent in response.
:type api_key: str
:param is_production_app: An optional flag to determine if the App is ready to be used for
Production scenarios in the provider side or not. (Default value: false)
Note: Currently applicable for JohnDeere.
:type is_production_app: bool
:ivar id: Unique OAuth provider ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'app_id': {'max_length': 200, 'min_length': 2},
'app_secret': {'max_length': 200, 'min_length': 2},
'api_key': {'max_length': 200, 'min_length': 2},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'app_id': {'key': 'appId', 'type': 'str'},
'app_secret': {'key': 'appSecret', 'type': 'str'},
'api_key': {'key': 'apiKey', 'type': 'str'},
'is_production_app': {'key': 'isProductionApp', 'type': 'bool'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(OAuthProvider, self).__init__(**kwargs)
self.app_id = kwargs.get('app_id', None)
self.app_secret = kwargs.get('app_secret', None)
self.api_key = kwargs.get('api_key', None)
self.is_production_app = kwargs.get('is_production_app', False)
self.id = None
self.e_tag = None
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class OAuthProviderListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.OAuthProvider]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[OAuthProvider]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(OAuthProviderListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class OAuthToken(msrest.serialization.Model):
"""Schema of OAuth token resource.
Variables are only populated by the server, and will be ignored when sending a request.
All required parameters must be populated in order to send to Azure.
:param farmer_id: Required. Farmer ID for this OAuth config.
:type farmer_id: str
:param auth_provider_id: Required. ID of the OAuth provider resource containing app
information.
:type auth_provider_id: str
:param is_valid: An optional flag indicating whether the token is valid or expired (Default
value: true).
:type is_valid: bool
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
"""
_validation = {
'farmer_id': {'required': True},
'auth_provider_id': {'required': True},
'e_tag': {'readonly': True},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'auth_provider_id': {'key': 'authProviderId', 'type': 'str'},
'is_valid': {'key': 'isValid', 'type': 'bool'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
}
def __init__(
self,
**kwargs
):
super(OAuthToken, self).__init__(**kwargs)
self.farmer_id = kwargs['farmer_id']
self.auth_provider_id = kwargs['auth_provider_id']
self.is_valid = kwargs.get('is_valid', True)
self.e_tag = None
self.created_date_time = None
self.modified_date_time = None
class OAuthTokenListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.OAuthToken]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[OAuthToken]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(OAuthTokenListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class Paths1LxjoxzFarmersFarmeridAttachmentsAttachmentidPatchRequestbodyContentMultipartFormDataSchema(msrest.serialization.Model):
"""Paths1LxjoxzFarmersFarmeridAttachmentsAttachmentidPatchRequestbodyContentMultipartFormDataSchema.
:param file: File to be uploaded.
:type file: IO
:param farmer_id: Farmer id for this attachment.
:type farmer_id: str
:param resource_id: Associated Resource id for this attachment.
:type resource_id: str
:param resource_type: Associated Resource type for this attachment
i.e. Farmer, Farm, Field, SeasonalField, Boundary, FarmOperationApplicationData, HarvestData,
TillageData, PlantingData.
:type resource_type: str
:param original_file_name: Original File Name for this attachment.
:type original_file_name: str
:param id: Unique id.
:type id: str
:param status: Status of the resource.
:type status: str
:param created_date_time: Date when resource was created.
:type created_date_time: str
:param modified_date_time: Date when resource was last modified.
:type modified_date_time: str
:param name: Name to identify resource.
:type name: str
:param description: Textual description of resource.
:type description: str
:param e_tag: The ETag value to implement optimistic concurrency.
:type e_tag: str
"""
_attribute_map = {
'file': {'key': 'file', 'type': 'IO'},
'farmer_id': {'key': 'FarmerId', 'type': 'str'},
'resource_id': {'key': 'ResourceId', 'type': 'str'},
'resource_type': {'key': 'ResourceType', 'type': 'str'},
'original_file_name': {'key': 'OriginalFileName', 'type': 'str'},
'id': {'key': 'Id', 'type': 'str'},
'status': {'key': 'Status', 'type': 'str'},
'created_date_time': {'key': 'CreatedDateTime', 'type': 'str'},
'modified_date_time': {'key': 'ModifiedDateTime', 'type': 'str'},
'name': {'key': 'Name', 'type': 'str'},
'description': {'key': 'Description', 'type': 'str'},
'e_tag': {'key': 'ETag', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(Paths1LxjoxzFarmersFarmeridAttachmentsAttachmentidPatchRequestbodyContentMultipartFormDataSchema, self).__init__(**kwargs)
self.file = kwargs.get('file', None)
self.farmer_id = kwargs.get('farmer_id', None)
self.resource_id = kwargs.get('resource_id', None)
self.resource_type = kwargs.get('resource_type', None)
self.original_file_name = kwargs.get('original_file_name', None)
self.id = kwargs.get('id', None)
self.status = kwargs.get('status', None)
self.created_date_time = kwargs.get('created_date_time', None)
self.modified_date_time = kwargs.get('modified_date_time', None)
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.e_tag = kwargs.get('e_tag', None)
class PlantingData(msrest.serialization.Model):
"""Schema of planting data resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param avg_planting_rate: Schema for storing measurement reading and unit.
:type avg_planting_rate: ~azure.agrifood.farming.models.Measure
:param total_material: Schema for storing measurement reading and unit.
:type total_material: ~azure.agrifood.farming.models.Measure
:param avg_material: Schema for storing measurement reading and unit.
:type avg_material: ~azure.agrifood.farming.models.Measure
:param planting_product_details: Planting product details.
:type planting_product_details: list[~azure.agrifood.farming.models.PlantingProductDetail]
:param area: Schema for storing measurement reading and unit.
:type area: ~azure.agrifood.farming.models.Measure
:param source: Source of the operation data.
:type source: str
:param operation_modified_date_time: Modified date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
Note: this will be specified by the source provider itself.
:type operation_modified_date_time: ~datetime.datetime
:param operation_start_date_time: Start date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_start_date_time: ~datetime.datetime
:param operation_end_date_time: End date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_end_date_time: ~datetime.datetime
:ivar attachments_link: Link for attachments.
:vartype attachments_link: str
:param associated_boundary_id: Optional boundary ID of the field for which operation was
applied.
:type associated_boundary_id: str
:param operation_boundary_id: Optional boundary ID of the actual area for which operation was
applied inside the specified field.
:type operation_boundary_id: str
:ivar farmer_id: Farmer ID which belongs to the operation data.
:vartype farmer_id: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'source': {'max_length': 100, 'min_length': 2},
'attachments_link': {'readonly': True},
'farmer_id': {'readonly': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'avg_planting_rate': {'key': 'avgPlantingRate', 'type': 'Measure'},
'total_material': {'key': 'totalMaterial', 'type': 'Measure'},
'avg_material': {'key': 'avgMaterial', 'type': 'Measure'},
'planting_product_details': {'key': 'plantingProductDetails', 'type': '[PlantingProductDetail]'},
'area': {'key': 'area', 'type': 'Measure'},
'source': {'key': 'source', 'type': 'str'},
'operation_modified_date_time': {'key': 'operationModifiedDateTime', 'type': 'iso-8601'},
'operation_start_date_time': {'key': 'operationStartDateTime', 'type': 'iso-8601'},
'operation_end_date_time': {'key': 'operationEndDateTime', 'type': 'iso-8601'},
'attachments_link': {'key': 'attachmentsLink', 'type': 'str'},
'associated_boundary_id': {'key': 'associatedBoundaryId', 'type': 'str'},
'operation_boundary_id': {'key': 'operationBoundaryId', 'type': 'str'},
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(PlantingData, self).__init__(**kwargs)
self.avg_planting_rate = kwargs.get('avg_planting_rate', None)
self.total_material = kwargs.get('total_material', None)
self.avg_material = kwargs.get('avg_material', None)
self.planting_product_details = kwargs.get('planting_product_details', None)
self.area = kwargs.get('area', None)
self.source = kwargs.get('source', None)
self.operation_modified_date_time = kwargs.get('operation_modified_date_time', None)
self.operation_start_date_time = kwargs.get('operation_start_date_time', None)
self.operation_end_date_time = kwargs.get('operation_end_date_time', None)
self.attachments_link = None
self.associated_boundary_id = kwargs.get('associated_boundary_id', None)
self.operation_boundary_id = kwargs.get('operation_boundary_id', None)
self.farmer_id = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class PlantingDataListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.PlantingData]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[PlantingData]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(PlantingDataListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class PlantingProductDetail(msrest.serialization.Model):
"""Schema for Planting product detail.
:param product_name: Name of the product.
:type product_name: str
:param area: Schema for storing measurement reading and unit.
:type area: ~azure.agrifood.farming.models.Measure
:param total_material: Schema for storing measurement reading and unit.
:type total_material: ~azure.agrifood.farming.models.Measure
:param avg_material: Schema for storing measurement reading and unit.
:type avg_material: ~azure.agrifood.farming.models.Measure
"""
_attribute_map = {
'product_name': {'key': 'productName', 'type': 'str'},
'area': {'key': 'area', 'type': 'Measure'},
'total_material': {'key': 'totalMaterial', 'type': 'Measure'},
'avg_material': {'key': 'avgMaterial', 'type': 'Measure'},
}
def __init__(
self,
**kwargs
):
super(PlantingProductDetail, self).__init__(**kwargs)
self.product_name = kwargs.get('product_name', None)
self.area = kwargs.get('area', None)
self.total_material = kwargs.get('total_material', None)
self.avg_material = kwargs.get('avg_material', None)
class PointCoordinates(msrest.serialization.Model):
"""Schema of the coordinates of a point.
All required parameters must be populated in order to send to Azure.
:param coordinates: Required. Gets or sets the coordinate of this point.
It must be an array of 2 or 3 elements for a 2D or 3D system.
:type coordinates: list[float]
"""
_validation = {
'coordinates': {'required': True},
}
_attribute_map = {
'coordinates': {'key': 'coordinates', 'type': '[float]'},
}
def __init__(
self,
**kwargs
):
super(PointCoordinates, self).__init__(**kwargs)
self.coordinates = kwargs['coordinates']
class Point(GeoJsonObject, PointCoordinates):
"""Point geometry.
All required parameters must be populated in order to send to Azure.
:param coordinates: Required. Gets or sets the coordinate of this point.
It must be an array of 2 or 3 elements for a 2D or 3D system.
:type coordinates: list[float]
:param type: Required. GeoJSON object type. Constant filled by server. Possible values include:
 "Point", "Polygon", "MultiPolygon".
:type type: str or ~azure.agrifood.farming.models.GeoJsonObjectType
"""
_validation = {
'coordinates': {'required': True},
'type': {'required': True},
}
_attribute_map = {
'coordinates': {'key': 'coordinates', 'type': '[float]'},
'type': {'key': 'type', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(Point, self).__init__(**kwargs)
self.coordinates = kwargs['coordinates']
self.type = 'Point'  # type: str
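The required-kwarg pattern used by ``PointCoordinates`` and ``Point`` above (``kwargs['coordinates']``) can be sketched standalone. ``make_point_coordinates`` below is a hypothetical helper, not part of this module; the 2-or-3-element rule comes from the docstring, while the real model defers such checks to msrest's validation machinery.

```python
def make_point_coordinates(**kwargs):
    """Mimic the PointCoordinates pattern: 'coordinates' is required,
    and must have 2 (2D) or 3 (3D) elements."""
    coordinates = kwargs['coordinates']  # raises KeyError if omitted
    if len(coordinates) not in (2, 3):
        raise ValueError("coordinates must have 2 or 3 elements")
    return coordinates
```

Omitting the required kwarg raises ``KeyError``, matching the direct ``kwargs['coordinates']`` access in the generated ``__init__``.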
class PolygonCoordinates(msrest.serialization.Model):
"""Schema of polygon coordinates.
All required parameters must be populated in order to send to Azure.
:param coordinates: Required. Gets or sets the coordinates of this polygon.
It must be an array of linear ring coordinate arrays.
For Polygons with more than one of these rings, the first MUST be the exterior ring,
and any others MUST be interior rings.
:type coordinates: list[list[list[float]]]
"""
_validation = {
'coordinates': {'required': True},
}
_attribute_map = {
'coordinates': {'key': 'coordinates', 'type': '[[[float]]]'},
}
def __init__(
self,
**kwargs
):
super(PolygonCoordinates, self).__init__(**kwargs)
self.coordinates = kwargs['coordinates']
class Polygon(GeoJsonObject, PolygonCoordinates):
"""Polygon geometry.
All required parameters must be populated in order to send to Azure.
:param coordinates: Required. Gets or sets the coordinates of this polygon.
It must be an array of linear ring coordinate arrays.
For Polygons with more than one of these rings, the first MUST be the exterior ring,
and any others MUST be interior rings.
:type coordinates: list[list[list[float]]]
:param type: Required. GeoJSON object type. Constant filled by server. Possible values include:
 "Point", "Polygon", "MultiPolygon".
:type type: str or ~azure.agrifood.farming.models.GeoJsonObjectType
"""
_validation = {
'coordinates': {'required': True},
'type': {'required': True},
}
_attribute_map = {
'coordinates': {'key': 'coordinates', 'type': '[[[float]]]'},
'type': {'key': 'type', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(Polygon, self).__init__(**kwargs)
self.coordinates = kwargs['coordinates']
self.type = 'Polygon'  # type: str
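The ``PolygonCoordinates`` docstring describes a list of linear ring coordinate arrays (first ring exterior, any others interior). A minimal, hypothetical ring check sketches that shape; the SDK itself does not perform this validation client-side.

```python
def is_valid_linear_ring(ring):
    """A GeoJSON linear ring needs at least 4 positions and must close on itself."""
    return len(ring) >= 4 and ring[0] == ring[-1]

# list[list[list[float]]]: exterior ring first, then interior rings (holes).
exterior = [[0.0, 0.0], [4.0, 0.0], [4.0, 4.0], [0.0, 4.0], [0.0, 0.0]]
hole = [[1.0, 1.0], [2.0, 1.0], [2.0, 2.0], [1.0, 1.0]]
polygon_coordinates = [exterior, hole]
```

``polygon_coordinates`` here is the shape expected by ``Polygon(coordinates=...)``.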
class SatelliteData(msrest.serialization.Model):
"""Data Model for SatelliteIngestionJobRequest.
:param image_names: List of ImageNames.
:type image_names: list[str]
:param image_formats: List of ImageFormats. Available value: TIF.
:type image_formats: list[str]
:param image_resolutions: List of ImageResolutions in meters. Available values: 10, 20, 60.
:type image_resolutions: list[float]
"""
_attribute_map = {
'image_names': {'key': 'imageNames', 'type': '[str]'},
'image_formats': {'key': 'imageFormats', 'type': '[str]'},
'image_resolutions': {'key': 'imageResolutions', 'type': '[float]'},
}
def __init__(
self,
**kwargs
):
super(SatelliteData, self).__init__(**kwargs)
self.image_names = kwargs.get('image_names', None)
self.image_formats = kwargs.get('image_formats', None)
self.image_resolutions = kwargs.get('image_resolutions', None)
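The ``SatelliteData`` docstring lists the accepted values (formats: TIF; resolutions in meters: 10, 20, 60). A hypothetical client-side check, not enforced by the model itself, can express that constraint:

```python
# Allowed values as stated in the SatelliteData docstring above.
ALLOWED_FORMATS = {'TIF'}
ALLOWED_RESOLUTIONS = {10.0, 20.0, 60.0}

def check_satellite_data(image_formats=None, image_resolutions=None):
    """Return True when every requested format and resolution is supported."""
    bad_formats = set(image_formats or []) - ALLOWED_FORMATS
    bad_resolutions = set(image_resolutions or []) - ALLOWED_RESOLUTIONS
    return not bad_formats and not bad_resolutions
```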
class SatelliteDataIngestionJob(msrest.serialization.Model):
"""Schema of satellite data ingestion job.
Variables are only populated by the server, and will be ignored when sending a request.
All required parameters must be populated in order to send to Azure.
:param farmer_id: Required. Farmer ID.
:type farmer_id: str
:param boundary_id: Required. The id of the boundary object for which satellite data is being
fetched.
:type boundary_id: str
:param start_date_time: Required. Start Date.
:type start_date_time: ~datetime.datetime
:param end_date_time: Required. End Date.
:type end_date_time: ~datetime.datetime
:param provider: Provider of satellite data. Possible values include: "Microsoft".
:type provider: str or ~azure.agrifood.farming.models.DataProvider
:param source: Source of satellite data. Possible values include: "Sentinel_2_L2A".
:type source: str or ~azure.agrifood.farming.models.Source
:param data: Data Model for SatelliteIngestionJobRequest.
:type data: ~azure.agrifood.farming.models.SatelliteData
:ivar id: Unique job id.
:vartype id: str
:ivar status: Status of the job.
Possible values: 'Waiting', 'Running', 'Succeeded', 'Failed', 'Cancelled'.
:vartype status: str
:ivar duration_in_seconds: Duration of the job in seconds.
:vartype duration_in_seconds: float
:ivar message: Status message to capture more details of the job.
:vartype message: str
:ivar created_date_time: Job created at dateTime. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar last_action_date_time: Job was last acted upon at dateTime. Sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype last_action_date_time: ~datetime.datetime
:ivar start_time: Job start time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype start_time: ~datetime.datetime
:ivar end_time: Job end time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
:vartype end_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'required': True},
'boundary_id': {'required': True},
'start_date_time': {'required': True},
'end_date_time': {'required': True},
'id': {'readonly': True},
'status': {'readonly': True},
'duration_in_seconds': {'readonly': True},
'message': {'readonly': True},
'created_date_time': {'readonly': True},
'last_action_date_time': {'readonly': True},
'start_time': {'readonly': True},
'end_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'boundary_id': {'key': 'boundaryId', 'type': 'str'},
'start_date_time': {'key': 'startDateTime', 'type': 'iso-8601'},
'end_date_time': {'key': 'endDateTime', 'type': 'iso-8601'},
'provider': {'key': 'provider', 'type': 'str'},
'source': {'key': 'source', 'type': 'str'},
'data': {'key': 'data', 'type': 'SatelliteData'},
'id': {'key': 'id', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'duration_in_seconds': {'key': 'durationInSeconds', 'type': 'float'},
'message': {'key': 'message', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'last_action_date_time': {'key': 'lastActionDateTime', 'type': 'iso-8601'},
'start_time': {'key': 'startTime', 'type': 'iso-8601'},
'end_time': {'key': 'endTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(SatelliteDataIngestionJob, self).__init__(**kwargs)
self.farmer_id = kwargs['farmer_id']
self.boundary_id = kwargs['boundary_id']
self.start_date_time = kwargs['start_date_time']
self.end_date_time = kwargs['end_date_time']
self.provider = kwargs.get('provider', None)
self.source = kwargs.get('source', None)
self.data = kwargs.get('data', None)
self.id = None
self.status = None
self.duration_in_seconds = None
self.message = None
self.created_date_time = None
self.last_action_date_time = None
self.start_time = None
self.end_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
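The ``_attribute_map`` dictionaries in these models drive the snake_case-to-camelCase key translation on the wire. The real work is done by ``msrest.serialization.Model``; ``to_wire`` below is only a minimal, hypothetical illustration of the mapping, using two entries copied from ``SatelliteDataIngestionJob._attribute_map``.

```python
attribute_map = {
    'farmer_id': {'key': 'farmerId', 'type': 'str'},
    'boundary_id': {'key': 'boundaryId', 'type': 'str'},
}

def to_wire(obj_dict, attr_map):
    """Rename Python attribute names to their wire (JSON) keys, dropping Nones."""
    return {
        meta['key']: obj_dict[attr]
        for attr, meta in attr_map.items()
        if obj_dict.get(attr) is not None
    }
```

Readonly fields (``id``, ``status``, ``createdDateTime``, ...) are set to ``None`` in ``__init__`` and are populated only from server responses, which is why dropping ``None`` values mirrors the request-side behavior.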
class Scene(msrest.serialization.Model):
"""Schema of scene resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param scene_date_time: Date-time of the scene, sample format: yyyy-MM-ddTHH:mm:ssZ.
:type scene_date_time: ~datetime.datetime
:param provider: Data provider of the scene.
:type provider: str
:param source: Data source of the scene.
:type source: str
:param image_files: Collection of image files.
:type image_files: list[~azure.agrifood.farming.models.ImageFile]
:param image_format: Supported image formats for scene resource. Possible values include:
"TIF".
:type image_format: str or ~azure.agrifood.farming.models.ImageFormat
:param cloud_cover_percentage: Cloud cover percentage of the scene.
:type cloud_cover_percentage: float
:param dark_pixel_percentage: Dark pixel percentage of the scene.
:type dark_pixel_percentage: float
:param ndvi_median_value: Median of NDVI of the scene.
:type ndvi_median_value: float
:param boundary_id: Boundary ID which belongs to the scene.
:type boundary_id: str
:param farmer_id: Farmer ID which belongs to the scene.
:type farmer_id: str
:param id: Unique scene resource ID.
:type id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
"""
_validation = {
'provider': {'max_length': 100, 'min_length': 2},
'source': {'max_length': 100, 'min_length': 2},
'cloud_cover_percentage': {'maximum': 100, 'minimum': 0},
'dark_pixel_percentage': {'maximum': 100, 'minimum': 0},
'ndvi_median_value': {'maximum': 1, 'minimum': 0},
'boundary_id': {'max_length': 100, 'min_length': 2},
'e_tag': {'readonly': True},
}
_attribute_map = {
'scene_date_time': {'key': 'sceneDateTime', 'type': 'iso-8601'},
'provider': {'key': 'provider', 'type': 'str'},
'source': {'key': 'source', 'type': 'str'},
'image_files': {'key': 'imageFiles', 'type': '[ImageFile]'},
'image_format': {'key': 'imageFormat', 'type': 'str'},
'cloud_cover_percentage': {'key': 'cloudCoverPercentage', 'type': 'float'},
'dark_pixel_percentage': {'key': 'darkPixelPercentage', 'type': 'float'},
'ndvi_median_value': {'key': 'ndviMedianValue', 'type': 'float'},
'boundary_id': {'key': 'boundaryId', 'type': 'str'},
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(Scene, self).__init__(**kwargs)
self.scene_date_time = kwargs.get('scene_date_time', None)
self.provider = kwargs.get('provider', None)
self.source = kwargs.get('source', None)
self.image_files = kwargs.get('image_files', None)
self.image_format = kwargs.get('image_format', None)
self.cloud_cover_percentage = kwargs.get('cloud_cover_percentage', None)
self.dark_pixel_percentage = kwargs.get('dark_pixel_percentage', None)
self.ndvi_median_value = kwargs.get('ndvi_median_value', None)
self.boundary_id = kwargs.get('boundary_id', None)
self.farmer_id = kwargs.get('farmer_id', None)
self.id = kwargs.get('id', None)
self.e_tag = None
class SceneListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Scene]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Scene]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(SceneListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
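The ``*ListResponse`` docstrings describe the paging contract: follow ``skip_token`` until it comes back null. A hypothetical sketch of that loop (``fetch_page`` stands in for a service call; in practice the SDK's pageable operations handle this for you):

```python
def iterate_pages(fetch_page):
    """Yield items across pages until skip_token comes back as None."""
    token = None
    while True:
        page = fetch_page(skip_token=token)
        for item in page['value']:
            yield item
        token = page.get('skip_token')
        if token is None:
            break
```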
class SearchBoundaryQuery(msrest.serialization.Model):
"""SearchAllBoundaries and SearchBoundaries parameters.
:param ids: Ids of the resource.
:type ids: list[str]
:param names: Names of the resource.
:type names: list[str]
:param property_filters: Filters on key-value pairs within the Properties object.
e.g. "{testKey} eq {testValue}".
:type property_filters: list[str]
:param statuses: Statuses of the resource.
:type statuses: list[str]
:param min_created_date_time: Minimum creation date of resource (inclusive).
:type min_created_date_time: ~datetime.datetime
:param max_created_date_time: Maximum creation date of resource (inclusive).
:type max_created_date_time: ~datetime.datetime
:param min_last_modified_date_time: Minimum last modified date of resource (inclusive).
:type min_last_modified_date_time: ~datetime.datetime
:param max_last_modified_date_time: Maximum last modified date of resource (inclusive).
:type max_last_modified_date_time: ~datetime.datetime
:param max_page_size: Maximum number of items needed (inclusive).
Minimum = 10, Maximum = 1000, Default value = 50.
:type max_page_size: int
:param skip_token: Skip token for getting next set of results.
:type skip_token: str
:param is_primary: Is the boundary primary.
:type is_primary: bool
:param parent_type: Type of the parent it belongs to.
:type parent_type: str
:param parent_ids: Parent Ids of the resource.
:type parent_ids: list[str]
:param min_acreage: Minimum acreage of the boundary (inclusive).
:type min_acreage: float
:param max_acreage: Maximum acreage of the boundary (inclusive).
:type max_acreage: float
:param intersects_with_geometry: GeoJSON abstract class.
:type intersects_with_geometry: ~azure.agrifood.farming.models.GeoJsonObject
"""
_validation = {
'max_page_size': {'maximum': 1000, 'minimum': 10},
}
_attribute_map = {
'ids': {'key': 'ids', 'type': '[str]'},
'names': {'key': 'names', 'type': '[str]'},
'property_filters': {'key': 'propertyFilters', 'type': '[str]'},
'statuses': {'key': 'statuses', 'type': '[str]'},
'min_created_date_time': {'key': 'minCreatedDateTime', 'type': 'iso-8601'},
'max_created_date_time': {'key': 'maxCreatedDateTime', 'type': 'iso-8601'},
'min_last_modified_date_time': {'key': 'minLastModifiedDateTime', 'type': 'iso-8601'},
'max_last_modified_date_time': {'key': 'maxLastModifiedDateTime', 'type': 'iso-8601'},
'max_page_size': {'key': '$maxPageSize', 'type': 'int'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'is_primary': {'key': 'isPrimary', 'type': 'bool'},
'parent_type': {'key': 'parentType', 'type': 'str'},
'parent_ids': {'key': 'parentIds', 'type': '[str]'},
'min_acreage': {'key': 'minAcreage', 'type': 'float'},
'max_acreage': {'key': 'maxAcreage', 'type': 'float'},
'intersects_with_geometry': {'key': 'intersectsWithGeometry', 'type': 'GeoJsonObject'},
}
def __init__(
self,
**kwargs
):
super(SearchBoundaryQuery, self).__init__(**kwargs)
self.ids = kwargs.get('ids', None)
self.names = kwargs.get('names', None)
self.property_filters = kwargs.get('property_filters', None)
self.statuses = kwargs.get('statuses', None)
self.min_created_date_time = kwargs.get('min_created_date_time', None)
self.max_created_date_time = kwargs.get('max_created_date_time', None)
self.min_last_modified_date_time = kwargs.get('min_last_modified_date_time', None)
self.max_last_modified_date_time = kwargs.get('max_last_modified_date_time', None)
self.max_page_size = kwargs.get('max_page_size', 50)
self.skip_token = kwargs.get('skip_token', None)
self.is_primary = kwargs.get('is_primary', None)
self.parent_type = kwargs.get('parent_type', None)
self.parent_ids = kwargs.get('parent_ids', None)
self.min_acreage = kwargs.get('min_acreage', None)
self.max_acreage = kwargs.get('max_acreage', None)
self.intersects_with_geometry = kwargs.get('intersects_with_geometry', None)
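``SearchBoundaryQuery`` combines a default (``max_page_size`` falls back to 50 via ``kwargs.get``) with a range constraint in ``_validation`` (minimum 10, maximum 1000). A standalone, hypothetical helper expressing the same rule:

```python
def resolve_max_page_size(requested=None):
    """Apply the SearchBoundaryQuery default and range for max_page_size."""
    if requested is None:
        return 50  # default noted in the docstring above
    if not 10 <= requested <= 1000:
        raise ValueError("max_page_size must be between 10 and 1000")
    return requested
```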
class Season(msrest.serialization.Model):
"""Schema of season resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param start_date_time: Season start datetime, sample format: yyyy-MM-ddTHH:mm:ssZ.
:type start_date_time: ~datetime.datetime
:param end_date_time: Season end datetime, sample format: yyyy-MM-ddTHH:mm:ssZ.
:type end_date_time: ~datetime.datetime
:param year: Season year.
:type year: int
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'start_date_time': {'key': 'startDateTime', 'type': 'iso-8601'},
'end_date_time': {'key': 'endDateTime', 'type': 'iso-8601'},
'year': {'key': 'year', 'type': 'int'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(Season, self).__init__(**kwargs)
self.start_date_time = kwargs.get('start_date_time', None)
self.end_date_time = kwargs.get('end_date_time', None)
self.year = kwargs.get('year', None)
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class SeasonalField(msrest.serialization.Model):
"""Schema of seasonal field resource.
Variables are only populated by the server, and will be ignored when sending a request.
:ivar farmer_id: Farmer ID.
:vartype farmer_id: str
:ivar primary_boundary_id: Primary boundary id.
:vartype primary_boundary_id: str
:ivar boundary_ids: Boundary Ids.
:vartype boundary_ids: list[str]
:param farm_id: ID of the associated Farm.
:type farm_id: str
:param field_id: ID of the associated Field.
:type field_id: str
:param season_id: ID of the season it belongs to.
:type season_id: str
:param crop_variety_ids: CropVariety ids.
:type crop_variety_ids: list[str]
:param crop_id: ID of the crop it belongs to.
:type crop_id: str
:param avg_yield_value: Average yield value of the seasonal field.
:type avg_yield_value: float
:param avg_yield_unit: Unit of the average yield value attribute.
:type avg_yield_unit: str
:param avg_seed_population_value: Average seed population value of the seasonal field.
:type avg_seed_population_value: float
:param avg_seed_population_unit: Unit of average seed population value attribute.
:type avg_seed_population_unit: str
:param planting_date_time: Planting datetime, sample format: yyyy-MM-ddTHH:mm:ssZ.
:type planting_date_time: ~datetime.datetime
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'farmer_id': {'readonly': True},
'primary_boundary_id': {'readonly': True},
'boundary_ids': {'readonly': True, 'unique': True},
'crop_variety_ids': {'unique': True},
'avg_yield_unit': {'max_length': 32, 'min_length': 2},
'avg_seed_population_unit': {'max_length': 32, 'min_length': 2},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'primary_boundary_id': {'key': 'primaryBoundaryId', 'type': 'str'},
'boundary_ids': {'key': 'boundaryIds', 'type': '[str]'},
'farm_id': {'key': 'farmId', 'type': 'str'},
'field_id': {'key': 'fieldId', 'type': 'str'},
'season_id': {'key': 'seasonId', 'type': 'str'},
'crop_variety_ids': {'key': 'cropVarietyIds', 'type': '[str]'},
'crop_id': {'key': 'cropId', 'type': 'str'},
'avg_yield_value': {'key': 'avgYieldValue', 'type': 'float'},
'avg_yield_unit': {'key': 'avgYieldUnit', 'type': 'str'},
'avg_seed_population_value': {'key': 'avgSeedPopulationValue', 'type': 'float'},
'avg_seed_population_unit': {'key': 'avgSeedPopulationUnit', 'type': 'str'},
'planting_date_time': {'key': 'plantingDateTime', 'type': 'iso-8601'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(SeasonalField, self).__init__(**kwargs)
self.farmer_id = None
self.primary_boundary_id = None
self.boundary_ids = None
self.farm_id = kwargs.get('farm_id', None)
self.field_id = kwargs.get('field_id', None)
self.season_id = kwargs.get('season_id', None)
self.crop_variety_ids = kwargs.get('crop_variety_ids', None)
self.crop_id = kwargs.get('crop_id', None)
self.avg_yield_value = kwargs.get('avg_yield_value', None)
self.avg_yield_unit = kwargs.get('avg_yield_unit', None)
self.avg_seed_population_value = kwargs.get('avg_seed_population_value', None)
self.avg_seed_population_unit = kwargs.get('avg_seed_population_unit', None)
self.planting_date_time = kwargs.get('planting_date_time', None)
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)
class SeasonalFieldListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.SeasonalField]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[SeasonalField]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(SeasonalFieldListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class SeasonListResponse(msrest.serialization.Model):
"""Paged response contains list of requested objects and a URL link to get the next set of results.
:param value: List of requested objects.
:type value: list[~azure.agrifood.farming.models.Season]
:param skip_token: Token used in retrieving the next page. If null, there are no additional
pages.
:type skip_token: str
:param next_link: Continuation link (absolute URI) to the next page of results in the list.
:type next_link: str
"""
_attribute_map = {
'value': {'key': 'value', 'type': '[Season]'},
'skip_token': {'key': '$skipToken', 'type': 'str'},
'next_link': {'key': 'nextLink', 'type': 'str'},
}
def __init__(
self,
**kwargs
):
super(SeasonListResponse, self).__init__(**kwargs)
self.value = kwargs.get('value', None)
self.skip_token = kwargs.get('skip_token', None)
self.next_link = kwargs.get('next_link', None)
class TillageData(msrest.serialization.Model):
"""Schema of tillage data resource.
Variables are only populated by the server, and will be ignored when sending a request.
:param tillage_depth: Schema for storing measurement reading and unit.
:type tillage_depth: ~azure.agrifood.farming.models.Measure
:param tillage_pressure: Schema for storing measurement reading and unit.
:type tillage_pressure: ~azure.agrifood.farming.models.Measure
:param area: Schema for storing measurement reading and unit.
:type area: ~azure.agrifood.farming.models.Measure
:param source: Source of the operation data.
:type source: str
:param operation_modified_date_time: Modified date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
Note: this will be specified by the source provider itself.
:type operation_modified_date_time: ~datetime.datetime
:param operation_start_date_time: Start date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_start_date_time: ~datetime.datetime
:param operation_end_date_time: End date-time of the operation data, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type operation_end_date_time: ~datetime.datetime
:ivar attachments_link: Link for attachments.
:vartype attachments_link: str
:param associated_boundary_id: Optional boundary ID of the field for which operation was
applied.
:type associated_boundary_id: str
:param operation_boundary_id: Optional boundary ID of the actual area for which operation was
applied inside the specified field.
:type operation_boundary_id: str
:ivar farmer_id: Farmer ID which belongs to the operation data.
:vartype farmer_id: str
:ivar id: Unique resource ID.
:vartype id: str
:ivar e_tag: The ETag value to implement optimistic concurrency.
:vartype e_tag: str
:param status: Status of the resource.
:type status: str
:ivar created_date_time: Date-time when resource was created, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype created_date_time: ~datetime.datetime
:ivar modified_date_time: Date-time when resource was last modified, sample format:
yyyy-MM-ddTHH:mm:ssZ.
:vartype modified_date_time: ~datetime.datetime
:param name: Name to identify resource.
:type name: str
:param description: Textual description of the resource.
:type description: str
:param properties: A collection of key value pairs that belongs to the resource.
Each pair must not have a key greater than 50 characters
and must not have a value greater than 150 characters.
Note: A maximum of 25 key value pairs can be provided for a resource and only string and
numeral values are supported.
:type properties: dict[str, any]
"""
_validation = {
'source': {'max_length': 100, 'min_length': 2},
'attachments_link': {'readonly': True},
'farmer_id': {'readonly': True},
'id': {'readonly': True},
'e_tag': {'readonly': True},
'status': {'max_length': 100, 'min_length': 0},
'created_date_time': {'readonly': True},
'modified_date_time': {'readonly': True},
'name': {'max_length': 100, 'min_length': 0},
'description': {'max_length': 500, 'min_length': 0},
}
_attribute_map = {
'tillage_depth': {'key': 'tillageDepth', 'type': 'Measure'},
'tillage_pressure': {'key': 'tillagePressure', 'type': 'Measure'},
'area': {'key': 'area', 'type': 'Measure'},
'source': {'key': 'source', 'type': 'str'},
'operation_modified_date_time': {'key': 'operationModifiedDateTime', 'type': 'iso-8601'},
'operation_start_date_time': {'key': 'operationStartDateTime', 'type': 'iso-8601'},
'operation_end_date_time': {'key': 'operationEndDateTime', 'type': 'iso-8601'},
'attachments_link': {'key': 'attachmentsLink', 'type': 'str'},
'associated_boundary_id': {'key': 'associatedBoundaryId', 'type': 'str'},
'operation_boundary_id': {'key': 'operationBoundaryId', 'type': 'str'},
'farmer_id': {'key': 'farmerId', 'type': 'str'},
'id': {'key': 'id', 'type': 'str'},
'e_tag': {'key': 'eTag', 'type': 'str'},
'status': {'key': 'status', 'type': 'str'},
'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
'name': {'key': 'name', 'type': 'str'},
'description': {'key': 'description', 'type': 'str'},
'properties': {'key': 'properties', 'type': '{object}'},
}
def __init__(
self,
**kwargs
):
super(TillageData, self).__init__(**kwargs)
self.tillage_depth = kwargs.get('tillage_depth', None)
self.tillage_pressure = kwargs.get('tillage_pressure', None)
self.area = kwargs.get('area', None)
self.source = kwargs.get('source', None)
self.operation_modified_date_time = kwargs.get('operation_modified_date_time', None)
self.operation_start_date_time = kwargs.get('operation_start_date_time', None)
self.operation_end_date_time = kwargs.get('operation_end_date_time', None)
self.attachments_link = None
self.associated_boundary_id = kwargs.get('associated_boundary_id', None)
self.operation_boundary_id = kwargs.get('operation_boundary_id', None)
self.farmer_id = None
self.id = None
self.e_tag = None
self.status = kwargs.get('status', None)
self.created_date_time = None
self.modified_date_time = None
self.name = kwargs.get('name', None)
self.description = kwargs.get('description', None)
self.properties = kwargs.get('properties', None)

class TillageDataListResponse(msrest.serialization.Model):
    """Paged response contains list of requested objects and a URL link to get the next set of results.

    :param value: List of requested objects.
    :type value: list[~azure.agrifood.farming.models.TillageData]
    :param skip_token: Token used in retrieving the next page. If null, there are no additional
     pages.
    :type skip_token: str
    :param next_link: Continuation link (absolute URI) to the next page of results in the list.
    :type next_link: str
    """

    _attribute_map = {
        'value': {'key': 'value', 'type': '[TillageData]'},
        'skip_token': {'key': '$skipToken', 'type': 'str'},
        'next_link': {'key': 'nextLink', 'type': 'str'},
    }

    def __init__(
        self,
        **kwargs
    ):
        super(TillageDataListResponse, self).__init__(**kwargs)
        self.value = kwargs.get('value', None)
        self.skip_token = kwargs.get('skip_token', None)
        self.next_link = kwargs.get('next_link', None)

class WeatherData(msrest.serialization.Model):
    """Schema of weather data.

    Variables are only populated by the server, and will be ignored when sending a request.

    All required parameters must be populated in order to send to Azure.

    :param farmer_id: Required. Farmer ID.
    :type farmer_id: str
    :param boundary_id: Required. Boundary ID.
    :type boundary_id: str
    :param extension_id: Required. ID of the weather extension.
    :type extension_id: str
    :param location: Required. Location model class.
    :type location: ~azure.agrifood.farming.models.Location
    :param date_time: Required. Date-time of the weather data, sample format: yyyy-MM-ddTHH:mm:ssZ.
    :type date_time: ~datetime.datetime
    :param unit_system_code: Unit System like US/SI etc.
    :type unit_system_code: str
    :param extension_version: Required. Version of the weather data extension.
    :type extension_version: str
    :param weather_data_type: Required. Type of weather data (forecast/historical).
    :type weather_data_type: str
    :param granularity: Required. Granularity of weather data (daily/hourly).
    :type granularity: str
    :param cloud_cover: Schema for storing measurement reading and unit.
    :type cloud_cover: ~azure.agrifood.farming.models.Measure
    :param dew_point: Schema for storing measurement reading and unit.
    :type dew_point: ~azure.agrifood.farming.models.Measure
    :param growing_degree_day: Schema for storing measurement reading and unit.
    :type growing_degree_day: ~azure.agrifood.farming.models.Measure
    :param precipitation: Schema for storing measurement reading and unit.
    :type precipitation: ~azure.agrifood.farming.models.Measure
    :param pressure: Schema for storing measurement reading and unit.
    :type pressure: ~azure.agrifood.farming.models.Measure
    :param relative_humidity: Schema for storing measurement reading and unit.
    :type relative_humidity: ~azure.agrifood.farming.models.Measure
    :param soil_moisture: Schema for storing measurement reading and unit.
    :type soil_moisture: ~azure.agrifood.farming.models.Measure
    :param soil_temperature: Schema for storing measurement reading and unit.
    :type soil_temperature: ~azure.agrifood.farming.models.Measure
    :param temperature: Schema for storing measurement reading and unit.
    :type temperature: ~azure.agrifood.farming.models.Measure
    :param visibility: Schema for storing measurement reading and unit.
    :type visibility: ~azure.agrifood.farming.models.Measure
    :param wet_bulb_temperature: Schema for storing measurement reading and unit.
    :type wet_bulb_temperature: ~azure.agrifood.farming.models.Measure
    :param wind_chill: Schema for storing measurement reading and unit.
    :type wind_chill: ~azure.agrifood.farming.models.Measure
    :param wind_direction: Schema for storing measurement reading and unit.
    :type wind_direction: ~azure.agrifood.farming.models.Measure
    :param wind_gust: Schema for storing measurement reading and unit.
    :type wind_gust: ~azure.agrifood.farming.models.Measure
    :param wind_speed: Schema for storing measurement reading and unit.
    :type wind_speed: ~azure.agrifood.farming.models.Measure
    :param id: Weather data ID.
    :type id: str
    :ivar e_tag: The ETag value to implement optimistic concurrency.
    :vartype e_tag: str
    :ivar created_date_time: Date-time when resource was created, sample format:
     yyyy-MM-ddTHH:mm:ssZ.
    :vartype created_date_time: ~datetime.datetime
    :ivar modified_date_time: Date-time when resource was last modified, sample format:
     yyyy-MM-ddTHH:mm:ssZ.
    :vartype modified_date_time: ~datetime.datetime
    :param properties: A collection of key value pairs that belongs to the resource.
     Each pair must not have a key greater than 50 characters
     and must not have a value greater than 150 characters.
     Note: A maximum of 25 key value pairs can be provided for a resource and only string and
     numeral values are supported.
    :type properties: dict[str, any]
    """

    _validation = {
        'farmer_id': {'required': True},
        'boundary_id': {'required': True},
        'extension_id': {'required': True},
        'location': {'required': True},
        'date_time': {'required': True},
        'extension_version': {'required': True},
        'weather_data_type': {'required': True},
        'granularity': {'required': True},
        'e_tag': {'readonly': True},
        'created_date_time': {'readonly': True},
        'modified_date_time': {'readonly': True},
    }

    _attribute_map = {
        'farmer_id': {'key': 'farmerId', 'type': 'str'},
        'boundary_id': {'key': 'boundaryId', 'type': 'str'},
        'extension_id': {'key': 'extensionId', 'type': 'str'},
        'location': {'key': 'location', 'type': 'Location'},
        'date_time': {'key': 'dateTime', 'type': 'iso-8601'},
        'unit_system_code': {'key': 'unitSystemCode', 'type': 'str'},
        'extension_version': {'key': 'extensionVersion', 'type': 'str'},
        'weather_data_type': {'key': 'weatherDataType', 'type': 'str'},
        'granularity': {'key': 'granularity', 'type': 'str'},
        'cloud_cover': {'key': 'cloudCover', 'type': 'Measure'},
        'dew_point': {'key': 'dewPoint', 'type': 'Measure'},
        'growing_degree_day': {'key': 'growingDegreeDay', 'type': 'Measure'},
        'precipitation': {'key': 'precipitation', 'type': 'Measure'},
        'pressure': {'key': 'pressure', 'type': 'Measure'},
        'relative_humidity': {'key': 'relativeHumidity', 'type': 'Measure'},
        'soil_moisture': {'key': 'soilMoisture', 'type': 'Measure'},
        'soil_temperature': {'key': 'soilTemperature', 'type': 'Measure'},
        'temperature': {'key': 'temperature', 'type': 'Measure'},
        'visibility': {'key': 'visibility', 'type': 'Measure'},
        'wet_bulb_temperature': {'key': 'wetBulbTemperature', 'type': 'Measure'},
        'wind_chill': {'key': 'windChill', 'type': 'Measure'},
        'wind_direction': {'key': 'windDirection', 'type': 'Measure'},
        'wind_gust': {'key': 'windGust', 'type': 'Measure'},
        'wind_speed': {'key': 'windSpeed', 'type': 'Measure'},
        'id': {'key': 'id', 'type': 'str'},
        'e_tag': {'key': 'eTag', 'type': 'str'},
        'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
        'modified_date_time': {'key': 'modifiedDateTime', 'type': 'iso-8601'},
        'properties': {'key': 'properties', 'type': '{object}'},
    }

    def __init__(
        self,
        **kwargs
    ):
        super(WeatherData, self).__init__(**kwargs)
        self.farmer_id = kwargs['farmer_id']
        self.boundary_id = kwargs['boundary_id']
        self.extension_id = kwargs['extension_id']
        self.location = kwargs['location']
        self.date_time = kwargs['date_time']
        self.unit_system_code = kwargs.get('unit_system_code', None)
        self.extension_version = kwargs['extension_version']
        self.weather_data_type = kwargs['weather_data_type']
        self.granularity = kwargs['granularity']
        self.cloud_cover = kwargs.get('cloud_cover', None)
        self.dew_point = kwargs.get('dew_point', None)
        self.growing_degree_day = kwargs.get('growing_degree_day', None)
        self.precipitation = kwargs.get('precipitation', None)
        self.pressure = kwargs.get('pressure', None)
        self.relative_humidity = kwargs.get('relative_humidity', None)
        self.soil_moisture = kwargs.get('soil_moisture', None)
        self.soil_temperature = kwargs.get('soil_temperature', None)
        self.temperature = kwargs.get('temperature', None)
        self.visibility = kwargs.get('visibility', None)
        self.wet_bulb_temperature = kwargs.get('wet_bulb_temperature', None)
        self.wind_chill = kwargs.get('wind_chill', None)
        self.wind_direction = kwargs.get('wind_direction', None)
        self.wind_gust = kwargs.get('wind_gust', None)
        self.wind_speed = kwargs.get('wind_speed', None)
        self.id = kwargs.get('id', None)
        self.e_tag = None
        self.created_date_time = None
        self.modified_date_time = None
        self.properties = kwargs.get('properties', None)

class WeatherDataDeleteJob(msrest.serialization.Model):
    """Schema of weather data delete job.

    Variables are only populated by the server, and will be ignored when sending a request.

    All required parameters must be populated in order to send to Azure.

    :param extension_id: Required. ID of the extension to be used for the providerInput. eg.
     DTN.ClearAg.
    :type extension_id: str
    :param farmer_id: Required. The id of the farmer object for which weather data is being
     fetched.
    :type farmer_id: str
    :param boundary_id: Required. The id of the boundary object for which weather data is being
     fetched.
    :type boundary_id: str
    :param weather_data_type: Type of weather data. Possible values include: 'forecast' ,
     'historical'.
    :type weather_data_type: str
    :param granularity: Granularity of weather data. Possible values include: 'daily' , 'hourly'.
    :type granularity: str
    :param start_date_time: Weather data start UTC date-time (inclusive), sample format:
     yyyy-MM-ddTHH:mm:ssZ.
    :type start_date_time: ~datetime.datetime
    :param end_date_time: Weather data end UTC date-time (inclusive), sample format:
     yyyy-MM-ddTHH:mm:ssZ.
    :type end_date_time: ~datetime.datetime
    :ivar id: Unique job id.
    :vartype id: str
    :ivar status: Status of the job.
     Possible values: 'Waiting', 'Running', 'Succeeded', 'Failed', 'Cancelled'.
    :vartype status: str
    :ivar duration_in_seconds: Duration of the job in seconds.
    :vartype duration_in_seconds: float
    :ivar message: Status message to capture more details of the job.
    :vartype message: str
    :ivar created_date_time: Job created at dateTime. Sample format: yyyy-MM-ddTHH:mm:ssZ.
    :vartype created_date_time: ~datetime.datetime
    :ivar last_action_date_time: Job was last acted upon at dateTime. Sample format:
     yyyy-MM-ddTHH:mm:ssZ.
    :vartype last_action_date_time: ~datetime.datetime
    :ivar start_time: Job start time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
    :vartype start_time: ~datetime.datetime
    :ivar end_time: Job end time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
    :vartype end_time: ~datetime.datetime
    :param name: Name to identify resource.
    :type name: str
    :param description: Textual description of the resource.
    :type description: str
    :param properties: A collection of key value pairs that belongs to the resource.
     Each pair must not have a key greater than 50 characters
     and must not have a value greater than 150 characters.
     Note: A maximum of 25 key value pairs can be provided for a resource and only string and
     numeral values are supported.
    :type properties: dict[str, any]
    """

    _validation = {
        'extension_id': {'required': True, 'max_length': 100, 'min_length': 2, 'pattern': r'^[A-za-z]{3,50}[.][A-za-z]{3,100}$'},
        'farmer_id': {'required': True},
        'boundary_id': {'required': True},
        'id': {'readonly': True},
        'status': {'readonly': True},
        'duration_in_seconds': {'readonly': True},
        'message': {'readonly': True},
        'created_date_time': {'readonly': True},
        'last_action_date_time': {'readonly': True},
        'start_time': {'readonly': True},
        'end_time': {'readonly': True},
        'name': {'max_length': 100, 'min_length': 0},
        'description': {'max_length': 500, 'min_length': 0},
    }

    _attribute_map = {
        'extension_id': {'key': 'extensionId', 'type': 'str'},
        'farmer_id': {'key': 'farmerId', 'type': 'str'},
        'boundary_id': {'key': 'boundaryId', 'type': 'str'},
        'weather_data_type': {'key': 'weatherDataType', 'type': 'str'},
        'granularity': {'key': 'granularity', 'type': 'str'},
        'start_date_time': {'key': 'startDateTime', 'type': 'iso-8601'},
        'end_date_time': {'key': 'endDateTime', 'type': 'iso-8601'},
        'id': {'key': 'id', 'type': 'str'},
        'status': {'key': 'status', 'type': 'str'},
        'duration_in_seconds': {'key': 'durationInSeconds', 'type': 'float'},
        'message': {'key': 'message', 'type': 'str'},
        'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
        'last_action_date_time': {'key': 'lastActionDateTime', 'type': 'iso-8601'},
        'start_time': {'key': 'startTime', 'type': 'iso-8601'},
        'end_time': {'key': 'endTime', 'type': 'iso-8601'},
        'name': {'key': 'name', 'type': 'str'},
        'description': {'key': 'description', 'type': 'str'},
        'properties': {'key': 'properties', 'type': '{object}'},
    }

    def __init__(
        self,
        **kwargs
    ):
        super(WeatherDataDeleteJob, self).__init__(**kwargs)
        self.extension_id = kwargs['extension_id']
        self.farmer_id = kwargs['farmer_id']
        self.boundary_id = kwargs['boundary_id']
        self.weather_data_type = kwargs.get('weather_data_type', None)
        self.granularity = kwargs.get('granularity', None)
        self.start_date_time = kwargs.get('start_date_time', None)
        self.end_date_time = kwargs.get('end_date_time', None)
        self.id = None
        self.status = None
        self.duration_in_seconds = None
        self.message = None
        self.created_date_time = None
        self.last_action_date_time = None
        self.start_time = None
        self.end_time = None
        self.name = kwargs.get('name', None)
        self.description = kwargs.get('description', None)
        self.properties = kwargs.get('properties', None)
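# The `_validation` tables above encode per-field constraints: `required`,
# `readonly`, length bounds, and a regex `pattern` for extension IDs. A rough,
# SDK-free illustration of applying such a table follows; `validate` is a
# hypothetical helper written for this sketch (the real checks live inside
# msrest's serializer), reusing the `extension_id` rules verbatim.

```python
import re


def validate(data, validation):
    """Return a list of human-readable violations of the validation table."""
    errors = []
    for field, rules in validation.items():
        value = data.get(field)
        if rules.get('required') and value is None:
            errors.append('%s is required' % field)
            continue
        if value is None:
            continue  # optional field left unset
        if 'max_length' in rules and len(value) > rules['max_length']:
            errors.append('%s too long' % field)
        if 'min_length' in rules and len(value) < rules['min_length']:
            errors.append('%s too short' % field)
        if 'pattern' in rules and not re.match(rules['pattern'], value):
            errors.append('%s does not match pattern' % field)
    return errors


validation = {
    'extension_id': {'required': True, 'max_length': 100, 'min_length': 2,
                     'pattern': r'^[A-za-z]{3,50}[.][A-za-z]{3,100}$'},
}
print(validate({'extension_id': 'DTN.ClearAg'}, validation))  # → []
print(validate({}, validation))  # → ['extension_id is required']
```

# Note that the generated pattern uses the character class `[A-za-z]` as-is;
# it is reproduced here unchanged rather than corrected.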

class WeatherDataIngestionJob(msrest.serialization.Model):
    """Schema of weather ingestion job.

    Variables are only populated by the server, and will be ignored when sending a request.

    All required parameters must be populated in order to send to Azure.

    :param boundary_id: Required. The id of the boundary object for which weather data is being
     fetched.
    :type boundary_id: str
    :param farmer_id: Required. The id of the farmer object for which weather data is being
     fetched.
    :type farmer_id: str
    :param extension_id: Required. ID of the extension to be used for the providerInput. eg.
     DTN.ClearAg.
    :type extension_id: str
    :param extension_api_name: Required. Extension api name to which request is to be made.
    :type extension_api_name: str
    :param extension_api_input: Required. Extension api input dictionary which would be used to
     feed request query/body/parameter information.
    :type extension_api_input: dict[str, any]
    :param extension_data_provider_app_id: App id of the weather data provider.
    :type extension_data_provider_app_id: str
    :param extension_data_provider_api_key: Api key of the weather data provider.
    :type extension_data_provider_api_key: str
    :ivar id: Unique job id.
    :vartype id: str
    :ivar status: Status of the job.
     Possible values: 'Waiting', 'Running', 'Succeeded', 'Failed', 'Cancelled'.
    :vartype status: str
    :ivar duration_in_seconds: Duration of the job in seconds.
    :vartype duration_in_seconds: float
    :ivar message: Status message to capture more details of the job.
    :vartype message: str
    :ivar created_date_time: Job created at dateTime. Sample format: yyyy-MM-ddTHH:mm:ssZ.
    :vartype created_date_time: ~datetime.datetime
    :ivar last_action_date_time: Job was last acted upon at dateTime. Sample format:
     yyyy-MM-ddTHH:mm:ssZ.
    :vartype last_action_date_time: ~datetime.datetime
    :ivar start_time: Job start time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
    :vartype start_time: ~datetime.datetime
    :ivar end_time: Job end time when available. Sample format: yyyy-MM-ddTHH:mm:ssZ.
    :vartype end_time: ~datetime.datetime
    :param name: Name to identify resource.
    :type name: str
    :param description: Textual description of the resource.
    :type description: str
    :param properties: A collection of key value pairs that belongs to the resource.
     Each pair must not have a key greater than 50 characters
     and must not have a value greater than 150 characters.
     Note: A maximum of 25 key value pairs can be provided for a resource and only string and
     numeral values are supported.
    :type properties: dict[str, any]
    """

    _validation = {
        'boundary_id': {'required': True},
        'farmer_id': {'required': True},
        'extension_id': {'required': True, 'max_length': 100, 'min_length': 2, 'pattern': r'^[A-za-z]{3,50}[.][A-za-z]{3,100}$'},
        'extension_api_name': {'required': True, 'max_length': 100, 'min_length': 2},
        'extension_api_input': {'required': True},
        'extension_data_provider_app_id': {'max_length': 200, 'min_length': 2},
        'extension_data_provider_api_key': {'max_length': 200, 'min_length': 2},
        'id': {'readonly': True},
        'status': {'readonly': True},
        'duration_in_seconds': {'readonly': True},
        'message': {'readonly': True},
        'created_date_time': {'readonly': True},
        'last_action_date_time': {'readonly': True},
        'start_time': {'readonly': True},
        'end_time': {'readonly': True},
        'name': {'max_length': 100, 'min_length': 0},
        'description': {'max_length': 500, 'min_length': 0},
    }

    _attribute_map = {
        'boundary_id': {'key': 'boundaryId', 'type': 'str'},
        'farmer_id': {'key': 'farmerId', 'type': 'str'},
        'extension_id': {'key': 'extensionId', 'type': 'str'},
        'extension_api_name': {'key': 'extensionApiName', 'type': 'str'},
        'extension_api_input': {'key': 'extensionApiInput', 'type': '{object}'},
        'extension_data_provider_app_id': {'key': 'extensionDataProviderAppId', 'type': 'str'},
        'extension_data_provider_api_key': {'key': 'extensionDataProviderApiKey', 'type': 'str'},
        'id': {'key': 'id', 'type': 'str'},
        'status': {'key': 'status', 'type': 'str'},
        'duration_in_seconds': {'key': 'durationInSeconds', 'type': 'float'},
        'message': {'key': 'message', 'type': 'str'},
        'created_date_time': {'key': 'createdDateTime', 'type': 'iso-8601'},
        'last_action_date_time': {'key': 'lastActionDateTime', 'type': 'iso-8601'},
        'start_time': {'key': 'startTime', 'type': 'iso-8601'},
        'end_time': {'key': 'endTime', 'type': 'iso-8601'},
        'name': {'key': 'name', 'type': 'str'},
        'description': {'key': 'description', 'type': 'str'},
        'properties': {'key': 'properties', 'type': '{object}'},
    }

    def __init__(
        self,
        **kwargs
    ):
        super(WeatherDataIngestionJob, self).__init__(**kwargs)
        self.boundary_id = kwargs['boundary_id']
        self.farmer_id = kwargs['farmer_id']
        self.extension_id = kwargs['extension_id']
        self.extension_api_name = kwargs['extension_api_name']
        self.extension_api_input = kwargs['extension_api_input']
        self.extension_data_provider_app_id = kwargs.get('extension_data_provider_app_id', None)
        self.extension_data_provider_api_key = kwargs.get('extension_data_provider_api_key', None)
        self.id = None
        self.status = None
        self.duration_in_seconds = None
        self.message = None
        self.created_date_time = None
        self.last_action_date_time = None
        self.start_time = None
        self.end_time = None
        self.name = kwargs.get('name', None)
        self.description = kwargs.get('description', None)
        self.properties = kwargs.get('properties', None)

class WeatherDataListResponse(msrest.serialization.Model):
    """Paged response contains list of requested objects and a URL link to get the next set of results.

    :param value: List of requested objects.
    :type value: list[~azure.agrifood.farming.models.WeatherData]
    :param skip_token: Token used in retrieving the next page. If null, there are no additional
     pages.
    :type skip_token: str
    :param next_link: Continuation link (absolute URI) to the next page of results in the list.
    :type next_link: str
    """

    _attribute_map = {
        'value': {'key': 'value', 'type': '[WeatherData]'},
        'skip_token': {'key': '$skipToken', 'type': 'str'},
        'next_link': {'key': 'nextLink', 'type': 'str'},
    }

    def __init__(
        self,
        **kwargs
    ):
        super(WeatherDataListResponse, self).__init__(**kwargs)
        self.value = kwargs.get('value', None)
        self.skip_token = kwargs.get('skip_token', None)
        self.next_link = kwargs.get('next_link', None)
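# The `*ListResponse` models all follow one paging contract: each page carries
# a `value` list plus a `skip_token`/`next_link` pointing at the next page,
# with a null token meaning there are no further pages. A schematic consumer
# of that contract follows; `fetch_page` is a stub standing in for the real
# service call, written purely for illustration.

```python
def fetch_page(skip_token):
    """Stand-in for a service call; returns (items, next skip_token)."""
    pages = {None: (['a', 'b'], 't1'), 't1': (['c'], None)}
    return pages[skip_token]


def iter_all(fetch):
    """Yield items from every page, following skip tokens until exhausted."""
    token = None
    while True:
        items, token = fetch(token)
        for item in items:
            yield item
        if token is None:
            break


print(list(iter_all(fetch_page)))  # → ['a', 'b', 'c']
```

# In practice the Azure SDK's paging helpers perform this loop for you; the
# sketch only makes the token-following logic explicit.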
# File: networkx/generators/tests/test_community.py
# Repo: theaverageguy/networkx (BSD-3-Clause)

import networkx as nx
from nose.tools import *


def test_random_partition_graph():
    G = nx.random_partition_graph([3,3,3],1,0)
    C = G.graph['partition']
    assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])])
    assert_equal(len(G),9)
    assert_equal(len(list(G.edges())),9)

    G = nx.random_partition_graph([3,3,3],0,1)
    C = G.graph['partition']
    assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])])
    assert_equal(len(G),9)
    assert_equal(len(list(G.edges())),27)

    G = nx.random_partition_graph([3,3,3],1,0,directed=True)
    C = G.graph['partition']
    assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])])
    assert_equal(len(G),9)
    assert_equal(len(list(G.edges())),18)

    G = nx.random_partition_graph([3,3,3],0,1,directed=True)
    C = G.graph['partition']
    assert_equal(C,[set([0,1,2]), set([3,4,5]), set([6,7,8])])
    assert_equal(len(G),9)
    assert_equal(len(list(G.edges())),54)

    G = nx.random_partition_graph([1,2,3,4,5], 0.5, 0.1)
    C = G.graph['partition']
    assert_equal(C,[set([0]), set([1,2]), set([3,4,5]),
                    set([6,7,8,9]), set([10,11,12,13,14])])
    assert_equal(len(G),15)

    assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],1.1,0.1)
    assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],-0.1,0.1)
    assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],0.1,1.1)
    assert_raises(nx.NetworkXError, nx.random_partition_graph,[1,2,3],0.1,-0.1)


def test_planted_partition_graph():
    G = nx.planted_partition_graph(4,3,1,0)
    C = G.graph['partition']
    assert_equal(len(C),4)
    assert_equal(len(G),12)
    assert_equal(len(list(G.edges())),12)

    G = nx.planted_partition_graph(4,3,0,1)
    C = G.graph['partition']
    assert_equal(len(C),4)
    assert_equal(len(G),12)
    assert_equal(len(list(G.edges())),54)

    G = nx.planted_partition_graph(10,4,.5,.1,seed=42)
    C = G.graph['partition']
    assert_equal(len(C),10)
    assert_equal(len(G),40)
    assert_equal(len(list(G.edges())),108)

    G = nx.planted_partition_graph(4,3,1,0,directed=True)
    C = G.graph['partition']
    assert_equal(len(C),4)
    assert_equal(len(G),12)
    assert_equal(len(list(G.edges())),24)

    G = nx.planted_partition_graph(4,3,0,1,directed=True)
    C = G.graph['partition']
    assert_equal(len(C),4)
    assert_equal(len(G),12)
    assert_equal(len(list(G.edges())),108)

    G = nx.planted_partition_graph(10,4,.5,.1,seed=42,directed=True)
    C = G.graph['partition']
    assert_equal(len(C),10)
    assert_equal(len(G),40)
    assert_equal(len(list(G.edges())),218)

    assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3, 1.1, 0.1)
    assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3,-0.1, 0.1)
    assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3, 0.1, 1.1)
    assert_raises(nx.NetworkXError, nx.planted_partition_graph, 3, 3, 0.1,-0.1)


def test_relaxed_caveman_graph():
    G = nx.relaxed_caveman_graph(4, 3, 0)
    assert_equal(len(G), 12)
    G = nx.relaxed_caveman_graph(4, 3, 1)
    assert_equal(len(G), 12)
    G = nx.relaxed_caveman_graph(4, 3, 0.5)
    assert_equal(len(G), 12)


def test_connected_caveman_graph():
    G = nx.connected_caveman_graph(4,3)
    assert_equal(len(G),12)

    G = nx.connected_caveman_graph(1,5)
    K5 = nx.complete_graph(5)
    K5.remove_edge(3,4)
    assert_true(nx.is_isomorphic(G,K5))


def test_caveman_graph():
    G = nx.caveman_graph(4,3)
    assert_equal(len(G),12)

    G = nx.caveman_graph(1,5)
    K5 = nx.complete_graph(5)
    assert_true(nx.is_isomorphic(G,K5))


def test_gaussian_random_partition_graph():
    G = nx.gaussian_random_partition_graph(100, 10, 10, 0.3, 0.01)
    assert_equal(len(G),100)
    assert_raises(nx.NetworkXError,
                  nx.gaussian_random_partition_graph, 100, 101, 10, 1, 0)


def test_ring_of_cliques():
    for i in range(2, 20):
        for j in range(2, 20):
            G = nx.ring_of_cliques(i, j)
            assert_equal(G.number_of_nodes(), i*j)
            if i != 2 or j != 1:
                expected_num_edges = i * (((j * (j - 1)) // 2) + 1)
            else:
                # the edge that already exists cannot be duplicated
                expected_num_edges = i * (((j * (j - 1)) // 2) + 1) - 1
            assert_equal(G.number_of_edges(), expected_num_edges)
    assert_raises(nx.NetworkXError, nx.ring_of_cliques, 1, 5)
    assert_raises(nx.NetworkXError, nx.ring_of_cliques, 3, 0)
# File: src/the_tale/the_tale/accounts/friends/tests/test_prototypes.py
# Repo: al-arz/the-tale (BSD-3-Clause)
import smart_imports
smart_imports.all()
class FriendshipPrototypeTests(utils_testcase.TestCase, personal_messages_helpers.Mixin):
def setUp(self):
super(FriendshipPrototypeTests, self).setUp()
game_logic.create_test_map()
self.account_1 = self.accounts_factory.create_account()
self.account_2 = self.accounts_factory.create_account()
self.account_3 = self.accounts_factory.create_account()
personal_messages_tt_services.personal_messages.cmd_debug_clear_service()
def test_request_friendship__own_request_exists(self):
with self.check_new_message(self.account_2.id, [self.account_1.id]):
own_request_1 = prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')
own_request_2 = prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 2')
self.assertEqual(own_request_1.id, own_request_2.id)
self.assertFalse(own_request_2.is_confirmed)
self.assertEqual(own_request_2.text_html, 'text 2')
self.assertEqual(models.Friendship.objects.all().count(), 1)
def test_request_friendship__his_request_exists(self):
with self.check_new_message(self.account_2.id, [accounts_logic.get_system_user_id()]):
with self.check_new_message(self.account_1.id, [self.account_2.id]):
his_request = prototypes.FriendshipPrototype.request_friendship(self.account_2, self.account_1, 'text 1')
own_request = prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 2')
self.assertEqual(his_request.id, own_request.id)
self.assertTrue(own_request.is_confirmed)
self.assertEqual(own_request.text_html, 'text 1')
self.assertEqual(models.Friendship.objects.all().count(), 1)
def test_request_friendship__new_request(self):
with self.check_new_message(self.account_2.id, [self.account_1.id]):
own_request = prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')
self.assertFalse(own_request.is_confirmed)
self.assertEqual(own_request.text_html, 'text 1')
self.assertEqual(models.Friendship.objects.all().count(), 1)
def test_remove_friendship__own_request_exists(self):
with self.check_new_message(self.account_2.id, [accounts_logic.get_system_user_id(), self.account_1.id], number=2):
prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')
prototypes.FriendshipPrototype.remove_friendship(self.account_1, self.account_2)
def test_remove_friendship__his_request_exists(self):
with self.check_new_message(self.account_2.id, [accounts_logic.get_system_user_id()]):
with self.check_new_message(self.account_1.id, [self.account_2.id]):
prototypes.FriendshipPrototype.request_friendship(self.account_2, self.account_1, 'text 1')
prototypes.FriendshipPrototype.remove_friendship(self.account_1, self.account_2)
self.assertEqual(models.Friendship.objects.all().count(), 0)
def test_remove_friendship__no_requests(self):
with self.check_no_messages(self.account_2.id):
with self.check_no_messages(self.account_1.id):
prototypes.FriendshipPrototype.remove_friendship(self.account_1, self.account_2)
self.assertEqual(models.Friendship.objects.all().count(), 0)
def test_get_friends_for__no_friendship(self):
self.assertEqual(prototypes.FriendshipPrototype.get_friends_for(self.account_1), [])
def test_get_friends_for__only_candidates(self):
prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')
prototypes.FriendshipPrototype.request_friendship(self.account_3, self.account_1, 'text 2')
self.assertEqual(prototypes.FriendshipPrototype.get_friends_for(self.account_1), [])
def test_get_friends_for__friends_exists(self):
prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')._confirm()
prototypes.FriendshipPrototype.request_friendship(self.account_3, self.account_1, 'text 2')._confirm()
self.assertEqual(set(account.id for account in prototypes.FriendshipPrototype.get_friends_for(self.account_1)), set([self.account_2.id, self.account_3.id]))
def test_get_candidates_for__no_friendship(self):
self.assertEqual(prototypes.FriendshipPrototype.get_candidates_for(self.account_1), [])
def test_get_candidates_for__only_friends(self):
prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')._confirm()
prototypes.FriendshipPrototype.request_friendship(self.account_3, self.account_1, 'text 2')._confirm()
self.assertEqual(prototypes.FriendshipPrototype.get_candidates_for(self.account_1), [])
def test_get_candidates_for__candidates_exists(self):
prototypes.FriendshipPrototype.request_friendship(self.account_1, self.account_2, 'text 1')
prototypes.FriendshipPrototype.request_friendship(self.account_3, self.account_1, 'text 2')
self.assertEqual([account.id for account in prototypes.FriendshipPrototype.get_candidates_for(self.account_1)], [self.account_3.id])
# File: TEST3D/GUI/0011003_page_femesh/log.py
# Repo: usnistgov/OOF3D (X11 license)

# -*- python -*-
# This software was produced by NIST, an agency of the U.S. government,
# and by statute is not subject to copyright in the United States.
# Recipients of this software assume all responsibilities associated
# with its operation, modification and maintenance. However, to
# facilitate maintenance we ask that before distributing modified
# versions of this software, you first contact the authors at
# oof_manager@nist.gov.
import tests
# Testing FE Mesh page dependencies with the Field&Equations page and the Solver page.
# Also testing the impact of subproblem consistency and the fact that it is shown in the other pages.
findWidget('OOF3D').resize(550, 350)
findMenu(findWidget('OOF3D:MenuBar'), 'File:Load:Data').activate()
checkpoint toplevel widget mapped Dialog-Data
findWidget('Dialog-Data').resize(190, 65)
findWidget('Dialog-Data:filename').set_text('TEST_DATA/triangle.skeleton')
findWidget('Dialog-Data:gtk-ok').clicked()
checkpoint meshable button set
checkpoint meshable button set
checkpoint microstructure page sensitized
checkpoint pixel page updated
checkpoint active area status updated
checkpoint named analysis chooser set
checkpoint microstructure page sensitized
checkpoint Field page sensitized
checkpoint meshable button set
checkpoint Materials page updated
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint pinnodes page sensitized
checkpoint boundary page updated
checkpoint skeleton selection page selection sensitized
checkpoint skeleton selection page updated
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page groups sensitized
checkpoint Solver page sensitized
checkpoint toplevel widget mapped OOF3D Activity Viewer
findWidget('OOF3D Activity Viewer').resize(400, 300)
checkpoint microstructure page sensitized
checkpoint meshable button set
checkpoint meshable button set
checkpoint microstructure page sensitized
checkpoint skeleton selection page groups sensitized
checkpoint microstructure page sensitized
checkpoint meshable button set
checkpoint meshable button set
checkpoint meshable button set
checkpoint microstructure page sensitized
checkpoint skeleton selection page groups sensitized
checkpoint meshable button set
checkpoint meshable button set
checkpoint microstructure page sensitized
checkpoint meshable button set
checkpoint meshable button set
checkpoint microstructure page sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page selection sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page updated
checkpoint named analysis chooser set
checkpoint Field page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint pinnodes page sensitized
checkpoint boundary page updated
checkpoint skeleton selection page selection sensitized
checkpoint Solver page sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page updated
checkpoint pinnodes page sensitized
checkpoint pinnodes page sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page groups sensitized
checkpoint skeleton selection page groups sensitized
checkpoint pinnodes page sensitized
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint boundary page updated
checkpoint OOF.File.Load.Data
widget_0=findWidget('OOF3D Activity Viewer')
handled_0=widget_0.event(event(gtk.gdk.DELETE,window=widget_0.window))
postpone if not handled_0: widget_0.destroy()
checkpoint OOF.ActivityViewer.File.Close
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
assert tests.FEMeshPageInfoCheck()
assert tests.FEMeshPageCheck1()
assert tests.FEMeshPageSubproblemsCheck0()
assert tests.FEMeshPageOperationsCheck0()
assert tests.chooserCheck('OOF3D:FE Mesh Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:FE Mesh Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:FE Mesh Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:FE Mesh Page:Skeleton', 'skeleton')
findWidget('OOF3D:FE Mesh Page:Pane').set_position(304)
findWidget('OOF3D').resize(559, 364)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(313)
findWidget('OOF3D').resize(566, 371)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(320)
findWidget('OOF3D').resize(580, 383)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(334)
findWidget('OOF3D').resize(600, 400)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(354)
findWidget('OOF3D').resize(611, 409)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(365)
findWidget('OOF3D').resize(623, 417)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(377)
findWidget('OOF3D').resize(635, 426)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(389)
findWidget('OOF3D').resize(650, 441)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(404)
findWidget('OOF3D').resize(664, 457)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(418)
findWidget('OOF3D').resize(688, 478)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(442)
findWidget('OOF3D').resize(701, 488)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(455)
findWidget('OOF3D').resize(718, 501)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(472)
findWidget('OOF3D').resize(727, 511)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(481)
findWidget('OOF3D').resize(734, 516)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(488)
findWidget('OOF3D').resize(740, 521)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(494)
findWidget('OOF3D').resize(747, 522)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(501)
findWidget('OOF3D').resize(752, 524)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(506)
findWidget('OOF3D').resize(756, 526)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(510)
findWidget('OOF3D').resize(761, 527)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(515)
findWidget('OOF3D').resize(764, 527)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(518)
findWidget('OOF3D').resize(766, 527)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(520)
findWidget('OOF3D').resize(767, 527)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(521)
findWidget('OOF3D').resize(770, 529)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(524)
findWidget('OOF3D').resize(773, 530)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(527)
findWidget('OOF3D').resize(778, 530)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(532)
findWidget('OOF3D').resize(779, 531)
findWidget('OOF3D:FE Mesh Page:Pane').set_position(533)
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.sensitizationCheck({"Mesh" : 0,"SubProblem" : 0},base="OOF3D:Fields & Equations Page")
findWidget('OOF3D:Fields & Equations Page:HPane').set_position(348)
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.sensitizationCheck({"Mesh" : 0},base="OOF3D:Solver Page")
findWidget('OOF3D:Solver Page:VPane').set_position(185)
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
assert tests.FEMeshPageInfoCheck()
assert tests.FEMeshPageCheck1()
assert tests.FEMeshPageSubproblemsCheck0()
assert tests.FEMeshPageOperationsCheck0()
assert tests.chooserCheck('OOF3D:FE Mesh Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:FE Mesh Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:FE Mesh Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:FE Mesh Page:Skeleton', 'skeleton')
findWidget('OOF3D:FE Mesh Page:New').clicked()
checkpoint toplevel widget mapped Dialog-Create a new mesh
findWidget('Dialog-Create a new mesh').resize(373, 229)
findWidget('Dialog-Create a new mesh:gtk-ok').clicked()
checkpoint named analysis chooser set
checkpoint Field page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint OOF.Mesh.New
assert tests.FEMeshPageCheck2()
assert tests.FEMeshPageSubproblemsCheck1()
assert tests.FEMeshPageOperationsCheck1()
assert tests.chooserCheck('OOF3D:FE Mesh Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:FE Mesh Page:Mesh', 'mesh')
assert tests.subproblemsCheck(['default'])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
assert tests.FEMeshPageCheck2()
assert tests.FEMeshPageSubproblemsCheck1()
assert tests.FEMeshPageOperationsCheck1()
assert tests.chooserCheck('OOF3D:FE Mesh Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:FE Mesh Page:Mesh', 'mesh')
assert tests.subproblemsCheck(['default'])
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:New').clicked()
checkpoint toplevel widget mapped Dialog-Create a new subproblem
findWidget('Dialog-Create a new subproblem').resize(286, 94)
findWidget('Dialog-Create a new subproblem:gtk-ok').clicked()
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.New
assert tests.subproblemsCheck(['default','subproblem'])
assert tests.subproblemsConsistencies([True, True])
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:New').clicked()
checkpoint toplevel widget mapped Dialog-Create a new subproblem
findWidget('Dialog-Create a new subproblem').resize(286, 94)
setComboBox(findWidget('Dialog-Create a new subproblem:subproblem:Chooser'), 'Complement')
findWidget('Dialog-Create a new subproblem').resize(365, 121)
setComboBox(findWidget('Dialog-Create a new subproblem:subproblem:Complement:complement_of'), 'subproblem')
findWidget('Dialog-Create a new subproblem:gtk-ok').clicked()
checkpoint mesh page subproblems sensitized
findWidget('OOF3D Messages 1').resize(543, 200)
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.New
assert tests.subproblemsCheck(['default','subproblem','subproblem<2>'])
assert tests.subproblemsConsistencies([True, True, True])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default','subproblem','subproblem<2>'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:SubproblemScroll:SubproblemList').get_selection().select_path((1,))
checkpoint mesh page subproblems sensitized
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:Edit').clicked()
checkpoint toplevel widget mapped Dialog-Edit Subproblem definition
findWidget('Dialog-Edit Subproblem definition').resize(244, 73)
setComboBox(findWidget('Dialog-Edit Subproblem definition:subproblem:Chooser'), 'Complement')
findWidget('Dialog-Edit Subproblem definition').resize(389, 100)
setComboBox(findWidget('Dialog-Edit Subproblem definition:subproblem:Complement:complement_of'), 'subproblem<2>')
findWidget('Dialog-Edit Subproblem definition:gtk-ok').clicked()
findWidget('OOF3D Messages 1').resize(573, 200)
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint Solver page sensitized
checkpoint Solver page sensitized
checkpoint named analysis chooser set
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.Edit
assert tests.subproblemsCheck(['default','subproblem<2>','subproblem'])
assert tests.subproblemsConsistencies([True, False, False])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:New').clicked()
checkpoint toplevel widget mapped Dialog-Create a new subproblem
findWidget('Dialog-Create a new subproblem').resize(389, 121)
setComboBox(findWidget('Dialog-Create a new subproblem:subproblem:Chooser'), 'EntireMesh')
findWidget('Dialog-Create a new subproblem:gtk-ok').clicked()
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.New
assert tests.subproblemsCheck(['default','subproblem<2>','subproblem','subproblem<3>'])
assert tests.subproblemsConsistencies([True, False, False, True])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default','subproblem<3>'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:SubproblemScroll:SubproblemList').get_selection().select_path((3,))
checkpoint mesh page subproblems sensitized
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:Edit').clicked()
checkpoint toplevel widget mapped Dialog-Edit Subproblem definition
findWidget('Dialog-Edit Subproblem definition').resize(244, 73)
setComboBox(findWidget('Dialog-Edit Subproblem definition:subproblem:Chooser'), 'Complement')
findWidget('Dialog-Edit Subproblem definition').resize(389, 100)
findWidget('Dialog-Edit Subproblem definition:gtk-ok').clicked()
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint Solver page sensitized
checkpoint Solver page sensitized
checkpoint named analysis chooser set
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.Edit
assert tests.subproblemsCheck(['default','subproblem<2>','subproblem','subproblem<3>'])
assert tests.subproblemsConsistencies([True, False, False, False])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:SubproblemScroll:SubproblemList').get_selection().select_path((2,))
checkpoint mesh page subproblems sensitized
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:Edit').clicked()
checkpoint toplevel widget mapped Dialog-Edit Subproblem definition
findWidget('Dialog-Edit Subproblem definition').resize(389, 100)
setComboBox(findWidget('Dialog-Edit Subproblem definition:subproblem:Chooser'), 'EntireMesh')
findWidget('Dialog-Edit Subproblem definition:gtk-ok').clicked()
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint Solver page sensitized
checkpoint Solver page sensitized
checkpoint named analysis chooser set
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint Field page sensitized
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.Edit
assert tests.subproblemsCheck(['default','subproblem<2>','subproblem<3>','subproblem'])
assert tests.subproblemsConsistencies([True, True, True, True])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default','subproblem<2>','subproblem<3>','subproblem'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
findWidget('OOF3D:Solver Page:VPane:Subproblems:SubproblemScroll').get_vadjustment().set_value( 1.4000000000000e+01)
findWidget('OOF3D:Solver Page:VPane:Subproblems:SubproblemScroll').get_vadjustment().set_value( 0.0000000000000e+00)
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:SubproblemScroll:SubproblemList').get_selection().select_path((3,))
checkpoint mesh page subproblems sensitized
findWidget('OOF3D:FE Mesh Page:Pane:Subproblems:Delete').clicked()
checkpoint toplevel widget mapped Questioner
findWidget('Questioner').resize(354, 87)
findWidget('Questioner:gtk-yes').clicked()
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint mesh page subproblems sensitized
checkpoint Solver page sensitized
checkpoint OOF.Subproblem.Delete
assert tests.subproblemsCheck(['default'])
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Fields & Equations')
checkpoint Field page sensitized
checkpoint Field page sensitized
checkpoint page installed Fields & Equations
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:Mesh', 'mesh')
assert tests.chooserCheck('OOF3D:Fields & Equations Page:SubProblem', ['default'])
assert tests.chooserStateCheck('OOF3D:Fields & Equations Page:SubProblem', 'default')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'Solver')
checkpoint Solver page sensitized
checkpoint page installed Solver
assert tests.chooserCheck('OOF3D:Solver Page:Microstructure', ['triangle'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Microstructure', 'triangle')
assert tests.chooserCheck('OOF3D:Solver Page:Skeleton', ['skeleton'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Skeleton', 'skeleton')
assert tests.chooserCheck('OOF3D:Solver Page:Mesh', ['mesh'])
assert tests.chooserStateCheck('OOF3D:Solver Page:Mesh', 'mesh')
setComboBox(findWidget('OOF3D:Navigation:PageMenu'), 'FE Mesh')
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page subproblems sensitized
checkpoint mesh page sensitized
checkpoint page installed FE Mesh
findMenu(findWidget('OOF3D:MenuBar'), 'File:Save:Python_Log').activate()
checkpoint toplevel widget mapped Dialog-Python_Log
findWidget('Dialog-Python_Log').resize(190, 92)
findWidget('Dialog-Python_Log:filename').set_text('meshpage.log')
findWidget('Dialog-Python_Log:gtk-ok').clicked()
checkpoint OOF.File.Save.Python_Log
assert tests.filediff('meshpage.log')
widget_1=findWidget('OOF3D')
handled_1=widget_1.event(event(gtk.gdk.DELETE,window=widget_1.window))
"""
Provides unit tests for GPNH regularized convex coding.
"""
# License: MIT
import numpy as np
from sklearn.utils import check_random_state
from convex_dim_red import right_stochastic_matrix
from convex_dim_red.gpnh_convex_coding import (
_gpnh_cost, _iterate_gpnh_convex_coding,
_update_gpnh_dictionary, _update_gpnh_weights)
def test_cost_returns_zero_for_perfect_reconstruction_no_regularization():
"""Test cost is zero for perfect factorization."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 5
n_components = 3
n_samples = 30
tolerance = 1e-14
lambda_W = 0
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, tolerance)
X = Z.dot(W.T)
cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
expected_cost = 0
    assert abs(cost - expected_cost) < tolerance
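`right_stochastic_matrix` is imported from `convex_dim_red` and not shown here; a minimal stand-in (an assumption, not the library's actual implementation) draws uniform entries and normalizes each row to sum to one:

```python
import numpy as np

def right_stochastic_sketch(shape, seed=None):
    """Hypothetical stand-in for convex_dim_red.right_stochastic_matrix:
    uniform draws normalized so every row sums to one."""
    rng = np.random.RandomState(seed)
    Z = rng.uniform(size=shape)
    return Z / Z.sum(axis=1, keepdims=True)

Z = right_stochastic_sketch((30, 3), seed=0)
assert np.allclose(Z.sum(axis=1), 1.0)
```

Any such matrix satisfies the row-sum assertions the tests above repeat before calling `_gpnh_cost`.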
def test_single_dictionary_update_reduces_cost_function_with_zero_lambda():
"""Test single dictionary update reduces cost function."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 7
n_components = 5
n_samples = 450
lambda_W = 0
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-14)
prefactor = (4.0 / (n_features * n_components * (n_components - 1)))
GW = prefactor * (n_components * np.eye(n_components) - 1)
ZtZ = Z.T.dot(Z)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_W = _update_gpnh_dictionary(X, Z, ZtZ, GW, lambda_W=lambda_W)
final_cost = _gpnh_cost(X, Z, updated_W, lambda_W=lambda_W)
assert final_cost <= initial_cost
def test_single_dictionary_update_reduces_cost_function_with_nonzero_lambda():
"""Test single dictionary update reduces regularized cost function."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 11
n_components = 6
n_samples = 230
lambda_W = 3.2
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-14)
prefactor = (4.0 / (n_features * n_components * (n_components - 1)))
GW = prefactor * (n_components * np.eye(n_components) - 1)
ZtZ = Z.T.dot(Z)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_W = _update_gpnh_dictionary(X, Z, ZtZ, GW, lambda_W=lambda_W)
final_cost = _gpnh_cost(X, Z, updated_W, lambda_W=lambda_W)
assert final_cost <= initial_cost
def test_exact_solution_is_dictionary_update_fixed_point():
"""Test exact solution is a fixed point of dictionary update step."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 10
n_components = 6
n_samples = 40
lambda_W = 0
tolerance = 1e-6
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, tolerance)
X = Z.dot(W.T)
prefactor = (4.0 / (n_features * n_components * (n_components - 1)))
GW = prefactor * (n_components * np.eye(n_components) - 1)
ZtZ = Z.T.dot(Z)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_W = _update_gpnh_dictionary(X, Z, ZtZ, GW, lambda_W=lambda_W)
final_cost = _gpnh_cost(X, Z, updated_W, lambda_W=lambda_W)
    assert np.allclose(updated_W, W, atol=tolerance)
assert abs(final_cost - initial_cost) < tolerance
def test_repeated_dictionary_updates_converge_with_zero_lambda():
"""Test repeated updates converge to fixed point."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 13
n_components = 3
n_samples = 50
max_iterations = 100
tolerance = 1e-6
lambda_W = 0
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-12)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z, updated_W, _, n_iter, _, _ = _iterate_gpnh_convex_coding(
X, Z, W, lambda_W=lambda_W,
update_weights=False, update_dictionary=True,
tolerance=tolerance, max_iterations=max_iterations,
require_monotonic_cost_decrease=True)
final_cost = _gpnh_cost(X, updated_Z, updated_W, lambda_W=lambda_W)
assert final_cost <= initial_cost
assert n_iter < max_iterations
def test_repeated_dictionary_updates_converge_with_nonzero_lambda():
"""Test repeated updates converge to fixed point for regularized problem."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 27
n_components = 13
n_samples = 500
max_iterations = 100
tolerance = 1e-6
lambda_W = 1.5
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-12)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z, updated_W, _, n_iter, _, _ = _iterate_gpnh_convex_coding(
X, Z, W, lambda_W=lambda_W,
update_weights=False, update_dictionary=True,
tolerance=tolerance, max_iterations=max_iterations,
require_monotonic_cost_decrease=True, verbose=True)
final_cost = _gpnh_cost(X, updated_Z, updated_W, lambda_W=lambda_W)
assert final_cost <= initial_cost
assert n_iter < max_iterations
def test_single_weights_updates_reduces_cost_function_with_zero_lambda():
"""Test single weights update reduces cost function."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 25
n_components = 6
n_samples = 300
lambda_W = 0
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-14)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z = _update_gpnh_weights(X, Z, W)
final_cost = _gpnh_cost(X, updated_Z, W, lambda_W=lambda_W)
assert final_cost <= initial_cost
def test_single_weights_updates_reduces_cost_function_with_nonzero_lambda():
"""Test single weights update reduces cost function."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 43
n_components = 10
n_samples = 320
lambda_W = 4.2
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-14)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z = _update_gpnh_weights(X, Z, W)
final_cost = _gpnh_cost(X, updated_Z, W, lambda_W=lambda_W)
assert final_cost <= initial_cost
def test_exact_solution_is_weights_update_fixed_point_with_zero_lambda():
"""Test exact solution is a fixed point of weights update step."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 14
n_components = 5
n_samples = 324
lambda_W = 0
tolerance = 1e-6
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, tolerance)
X = Z.dot(W.T)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z = _update_gpnh_weights(X, Z, W)
final_cost = _gpnh_cost(X, updated_Z, W, lambda_W=lambda_W)
    assert np.allclose(Z, updated_Z, atol=tolerance)
assert abs(final_cost - initial_cost) < tolerance
def test_exact_solution_is_weights_update_fixed_point_with_nonzero_lambda():
"""Test exact solution is a fixed point of regularized weights update step."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 24
n_components = 8
n_samples = 200
lambda_W = 3.8
tolerance = 1e-6
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, tolerance)
X = Z.dot(W.T)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z = _update_gpnh_weights(X, Z, W)
final_cost = _gpnh_cost(X, updated_Z, W, lambda_W=lambda_W)
    assert np.allclose(Z, updated_Z, atol=tolerance)
assert abs(final_cost - initial_cost) < tolerance
def test_repeated_weights_updates_converge_with_zero_lambda():
"""Test repeated weights updates converge to fixed point."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 43
n_components = 3
n_samples = 100
max_iterations = 100
tolerance = 1e-6
lambda_W = 0
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-12)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z, updated_W, _, n_iter, _, _ = _iterate_gpnh_convex_coding(
X, Z, W, lambda_W=lambda_W,
update_weights=True, update_dictionary=False,
tolerance=tolerance, max_iterations=max_iterations,
require_monotonic_cost_decrease=True)
final_cost = _gpnh_cost(X, updated_Z, updated_W, lambda_W=lambda_W)
assert final_cost <= initial_cost
assert n_iter < max_iterations
def test_repeated_weights_updates_converge_with_nonzero_lambda():
"""Test repeated weights updates converge to fixed point for regularized problem."""
random_seed = 0
random_state = check_random_state(random_seed)
n_features = 12
n_components = 6
n_samples = 500
max_iterations = 100
tolerance = 1e-6
lambda_W = 6.2
X = random_state.uniform(size=(n_samples, n_features))
W = random_state.uniform(size=(n_features, n_components))
Z = right_stochastic_matrix(
(n_samples, n_components), random_state=random_state)
assert np.allclose(Z.sum(axis=1), 1, 1e-12)
initial_cost = _gpnh_cost(X, Z, W, lambda_W=lambda_W)
updated_Z, updated_W, _, n_iter, _, _ = _iterate_gpnh_convex_coding(
X, Z, W, lambda_W=lambda_W,
update_weights=True, update_dictionary=False,
tolerance=tolerance, max_iterations=max_iterations,
require_monotonic_cost_decrease=True)
final_cost = _gpnh_cost(X, updated_Z, updated_W, lambda_W=lambda_W)
assert final_cost <= initial_cost
assert n_iter < max_iterations
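The tests above only exercise `_gpnh_cost` through its behavior. A plausible sketch of such a cost (hypothetical — the real normalization may differ) combines mean squared reconstruction error with the dictionary-spread penalty built from the same `GW` matrix the tests construct:

```python
import numpy as np

def gpnh_cost_sketch(X, Z, W, lambda_W=0.0):
    """Hypothetical GPNH-style cost: reconstruction error plus a penalty
    encouraging well-spread dictionary columns (normalization assumed)."""
    n_samples, n_features = X.shape
    n_components = W.shape[1]
    # Mean squared reconstruction error of X ~ Z W^T.
    error = np.linalg.norm(X - Z.dot(W.T)) ** 2 / (n_samples * n_features)
    # Same GW construction as in the tests above.
    prefactor = 4.0 / (n_features * n_components * (n_components - 1))
    GW = prefactor * (n_components * np.eye(n_components) - 1)
    penalty = lambda_W * np.trace(W.dot(GW).dot(W.T))
    return error + penalty

rng = np.random.RandomState(0)
W = rng.uniform(size=(5, 3))
Z = rng.uniform(size=(30, 3))
Z /= Z.sum(axis=1, keepdims=True)   # right-stochastic rows
X = Z.dot(W.T)                      # perfect factorization
# With lambda_W = 0 and exact reconstruction the cost vanishes,
# which is exactly what the first test expects of _gpnh_cost.
assert gpnh_cost_sketch(X, Z, W) < 1e-12
```

Note that `trace(W GW W^T)` equals (up to the prefactor) the sum of squared pairwise distances between dictionary columns, so the penalty is zero only when all columns coincide.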
# goldstone/core/tests_producer_view.py -- Solinea/goldstone-server (Apache-2.0)
"""Unit tests for AlertDefinition views."""
# Copyright 2016 Solinea, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
from rest_framework import status
from rest_framework.test import APITestCase
from goldstone.core.models import SavedSearch, AlertDefinition, Alert, \
EmailProducer
from goldstone.test_utils import CONTENT_NO_CREDENTIALS, \
AUTHORIZATION_PAYLOAD, BAD_TOKEN, CONTENT_BAD_TOKEN, create_and_login
PRODUCER_URL = '/core/producer/'
EMAIL_PRODUCER_URL = '/core/email_producer/'
class ProducerViewTests(APITestCase):
""" Test Producer API """
fixtures = ['core_initial_data.yaml']
def setUp(self):
self.saved_search = SavedSearch.objects.all()[0]
self.alert_def = AlertDefinition(name='alert_def',
search=self.saved_search)
self.alert_def.save()
self.alert = Alert(short_message='test',
long_message='test123',
alert_def=self.alert_def)
self.alert.save()
self.producer = EmailProducer(sender='me@localhost',
receiver='you@localhost',
alert_def=self.alert_def)
self.producer.save()
self.basic_post_body = {
'sender': 'bell@xyz.com',
'receiver': 'watson@xyz.com',
'alert_def': self.alert_def.uuid
}
def test_not_logged_in(self):
"""All operations should fail when not logged in."""
# Try getting resource with no token.
response = self.client.get(PRODUCER_URL)
self.assertContains(response,
CONTENT_NO_CREDENTIALS,
status_code=status.HTTP_401_UNAUTHORIZED)
        # Try getting resource with a bogus token.
response = self.client.get(
PRODUCER_URL,
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % BAD_TOKEN)
self.assertContains(response,
CONTENT_BAD_TOKEN,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try creating resource with no token.
response = self.client.post(PRODUCER_URL,
json.dumps(self.basic_post_body),
content_type="application/json")
self.assertContains(response,
CONTENT_NO_CREDENTIALS,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try creating resource with a bogus token.
response = self.client.post(
PRODUCER_URL,
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % BAD_TOKEN)
self.assertContains(response,
CONTENT_BAD_TOKEN,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try updating resource with no token.
response = self.client.put(
PRODUCER_URL + self.alert_def.uuid + '/',
json.dumps(self.basic_post_body),
content_type="application/json")
self.assertContains(response,
CONTENT_NO_CREDENTIALS,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try updating resource with a bogus token.
response = self.client.put(
PRODUCER_URL + self.alert_def.uuid + '/',
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % BAD_TOKEN)
self.assertContains(response,
CONTENT_BAD_TOKEN,
status_code=status.HTTP_401_UNAUTHORIZED)
def test_post_not_allowed(self):
"""POST operation tests"""
# Create a user and get the authorization token. Then do the test.
token = create_and_login()
# Try creating resource with a valid token.
response = self.client.post(
PRODUCER_URL,
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
# pylint: disable=E1101
self.assertEqual(response.status_code,
status.HTTP_405_METHOD_NOT_ALLOWED)
def test_get(self):
"""GET operation tests"""
# Create a user and get the authorization token. Then do the test.
token = create_and_login()
# We should have at least one result in our list, but could have more
response = self.client.get(
PRODUCER_URL,
accept="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code, status.HTTP_200_OK)
content = json.loads(response.content)
self.assertIn('count', content)
self.assertIn('next', content)
self.assertIn('previous', content)
self.assertIn('results', content)
self.assertIsInstance(content['results'], list)
self.assertGreater(len(content['results']), 0)
# test the structure of the one we loaded
response = self.client.get(
PRODUCER_URL + "%s/" % self.producer.uuid,
accept="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code, status.HTTP_200_OK)
content = json.loads(response.content)
self.assertIn('uuid', content)
self.assertIn('alert_def', content)
self.assertIn('created', content)
self.assertIn('updated', content)
self.assertIn('sender', content)
self.assertIn('receiver', content)
def test_delete_not_allowed(self):
"""DELETE operation tests"""
# Create a user and get the authorization token. Then do the test.
token = create_and_login()
        # Try deleting resource with a valid token.
response = self.client.delete(
PRODUCER_URL + '%s/' % self.alert_def.uuid,
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_405_METHOD_NOT_ALLOWED)
def test_put_not_allowed(self):
"""PUT operation tests"""
# Create a user and get the authorization token. Then do the test.
token = create_and_login()
        # Try updating resource with a valid token.
response = self.client.put(
PRODUCER_URL + '%s/' % self.alert_def.uuid,
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_405_METHOD_NOT_ALLOWED)
def test_patch_not_allowed(self):
"""PATCH operation tests"""
# Create a user and get the authorization token. Then do the test.
token = create_and_login()
        # Try patching resource with a valid token.
        response = self.client.patch(
PRODUCER_URL + '%s/' % self.alert_def.uuid,
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_405_METHOD_NOT_ALLOWED)
class EmailProducerViewTests(APITestCase):
""" Test Email Producer API """
fixtures = ['core_initial_data.yaml']
def setUp(self):
self.saved_search = SavedSearch.objects.all()[0]
self.alert_def = AlertDefinition(name='alert_def',
search=self.saved_search)
self.alert_def.save()
self.alert = Alert(short_message='test',
long_message='test123',
alert_def=self.alert_def)
self.alert.save()
self.producer = EmailProducer(sender='me', receiver='you',
alert_def=self.alert_def)
self.producer.save()
self.basic_post_body = {
"sender": "bell@localhost",
"receiver": "watson@localhost",
"alert_def": self.alert_def.uuid
}
def test_not_logged_in(self):
"""All operations should fail when not logged in."""
# Try getting resource with no token.
response = self.client.get(EMAIL_PRODUCER_URL)
self.assertContains(response,
CONTENT_NO_CREDENTIALS,
status_code=status.HTTP_401_UNAUTHORIZED)
        # Try getting resource with a bogus token.
response = self.client.get(
EMAIL_PRODUCER_URL,
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % BAD_TOKEN)
self.assertContains(response,
CONTENT_BAD_TOKEN,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try creating resource with no token.
response = self.client.post(EMAIL_PRODUCER_URL,
json.dumps(self.basic_post_body),
content_type="application/json")
self.assertContains(response,
CONTENT_NO_CREDENTIALS,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try creating resource with a bogus token.
response = self.client.post(
EMAIL_PRODUCER_URL,
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % BAD_TOKEN)
self.assertContains(response,
CONTENT_BAD_TOKEN,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try updating resource with no token.
response = self.client.put(
EMAIL_PRODUCER_URL + self.alert_def.uuid + '/',
json.dumps(self.basic_post_body),
content_type="application/json")
self.assertContains(response,
CONTENT_NO_CREDENTIALS,
status_code=status.HTTP_401_UNAUTHORIZED)
# Try updating resource with a bogus token.
response = self.client.put(
EMAIL_PRODUCER_URL + self.alert_def.uuid + '/',
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % BAD_TOKEN)
self.assertContains(response,
CONTENT_BAD_TOKEN,
status_code=status.HTTP_401_UNAUTHORIZED)
def test_crud(self):
"""POST operation tests"""
# Create a user and get the authorization token. Then do the test.
token = create_and_login()
# Try creating resource with a valid token.
response = self.client.post(
EMAIL_PRODUCER_URL,
json.dumps(self.basic_post_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_201_CREATED)
# Quick test of a filtered GET of the new resource
response = self.client.get(
EMAIL_PRODUCER_URL + "?sender=bell@localhost",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code, status.HTTP_200_OK)
content = json.loads(response.content)
self.assertIn('count', content)
self.assertIn('next', content)
self.assertIn('previous', content)
self.assertIn('results', content)
self.assertIsInstance(content['results'], list)
self.assertGreater(len(content['results']), 0)
self.bell_uuid = content['results'][0]['uuid']
# test the structure of the record we posted
response = self.client.get(
EMAIL_PRODUCER_URL + "%s/" % self.bell_uuid,
accept="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code, status.HTTP_200_OK)
content = json.loads(response.content)
self.assertIn('uuid', content)
self.assertIn('alert_def', content)
self.assertIn('created', content)
self.assertIn('updated', content)
self.assertIn('sender', content)
self.assertIn('receiver', content)
self.assertEqual(content['sender'], 'bell@localhost')
self.assertEqual(content['receiver'], 'watson@localhost')
self.bell_content = content
put_body = self.bell_content
put_body['receiver'] = 'howell@localhost'
# Try updating resource with a valid token.
response = self.client.put(
EMAIL_PRODUCER_URL + '%s/' % self.bell_uuid,
json.dumps(put_body),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_200_OK)
# Try patching resource with a valid token.
response = self.client.patch(
EMAIL_PRODUCER_URL + '%s/' % self.bell_uuid,
json.dumps({'receiver': 'watson@localhost'}),
content_type="application/json",
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_200_OK)
# Try deleting resource with a valid token.
response = self.client.delete(
EMAIL_PRODUCER_URL + '%s/' % self.bell_uuid,
HTTP_AUTHORIZATION=AUTHORIZATION_PAYLOAD % token)
self.assertEqual(response.status_code,
status.HTTP_204_NO_CONTENT)
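`AUTHORIZATION_PAYLOAD` and `BAD_TOKEN` come from `goldstone.test_utils` and are not shown here. Plausible stand-ins (assumptions, not the project's actual values) are a DRF `TokenAuthentication` header template and a well-formed but unregistered token:

```python
# Hypothetical equivalents of the goldstone.test_utils constants used above.
AUTHORIZATION_PAYLOAD = 'Token %s'   # value placed in the HTTP Authorization header
BAD_TOKEN = 'f' * 40                 # 40 hex chars: well-formed, but unknown to the server

header = AUTHORIZATION_PAYLOAD % BAD_TOKEN
assert header.startswith('Token ')
assert len(header.split(' ', 1)[1]) == 40
```

A syntactically valid but unknown token is what makes the "bogus token" cases return 401 with `CONTENT_BAD_TOKEN` rather than a parse error.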
# dashboard/main.py -- mkingopng/Jamaica (MIT)
"""
written by: noOne
date:
last updated: 29th august 2021
"""
import dash
from dash import dcc, html  # dash >= 2.0 moved dash_core_components / dash_html_components into dash
from dash.dependencies import Input, Output
import plotly.graph_objs as go
import pandas as pd

# FIXME: something has changed; the app no longer runs correctly.
dtypes = {'INVOICE_NUMBER': str, 'QTY_INVOICED': float, 'UNIT_NET_PRICE': float, 'SALES_ORDER_LINE_NUMBER': int,
          'SALES_VALUE': float, 'FULLY_SHIPPED?': bool, 'WEEK_ID': int, 'YEAR_ID': int,
          'ALT_PRODUCT_GROUP': str, 'STOCK_CODE': str, 'PRODUCT_DESCRIPTION': str, 'SALES_PERSON': str, 'CC': int,
          'TEAM': str, 'CUSTOMER_CODE': str, 'CUSTOMER_NAME': str, 'CUSTOMER_PHONE': str, 'CUSTOMER_ADDRESS_1': str,
          'CUSTOMER_ADDRESS_2': str, 'CITY': str, 'PROVINCE': str}
# INVOICE_DATE is the date column, so it is parsed here and omitted from dtypes
# (recent pandas raises if a column appears in both dtype and parse_dates);
# STOCK_CODE is a product code, not a date.
parse_dates = ['INVOICE_DATE']
sales_1 = pd.read_csv("/home/michaelkingston/Documents/GitHub/Jamaica/data/test_query.csv", encoding='utf-8', sep=',',
                      dtype=dtypes, parse_dates=parse_dates, on_bad_lines='skip', index_col=False)
app = dash.Dash(__name__)
app.layout = html.Div([
html.Div([
html.Br(), html.Br(),
html.H1('KK Kingston Sales Dashboard')],
style={'margin-left': '5%', 'color': '#808000', 'width': '50%', 'display': 'inline-block'
}),
html.Div([
html.Br(), html.Br(),
html.H4('Created by: noOne')],
style={'color': '#17202A', 'width': '30%', 'display': 'inline-block', 'float': 'right'
}),
# this is the key filter
html.Div([
html.Label('Select a Team:'),
dcc.Dropdown(id='w_teams',
multi=False,
clearable=True,
value='',
placeholder='Select a Team',
options=[{'label': c, 'value': c}
for c in (sales_1['TEAM'].unique())])
], style={'width': '20%', 'margin-left': '45%'}),
    # TODO: charts L1, R1, and L2 are quite similar. They all relate to sales qty and sales value,
    # and by extension price per UOM and cost per UOM. I think this could be reduced to one chart and be more
    # informative. Y1 is always some variant of qty. Y2 is always some variant of sales value.
    # X is constant: it is always alt product group (APG). I like R1 the best.
html.Div([
html.Br(),
dcc.Graph(id='bar_line_1', config={'displayModeBar': False}),
], style={'margin-left': '1.4%', 'width': '50%', 'display': 'inline-block'}),
# Chart R1: Create combination of bar chart and line chart (Compare sales to each Price of product)
html.Div([
html.Br(),
dcc.Graph(id='bar_line_2', config={'displayModeBar': False}),
], style={'width': '48.6%', 'display': 'inline-block', 'float': 'right'}),
# Chart L2: Create group bar chart (Compare sales and quantity ordered for each product)
html.Div([
html.Br(),
dcc.Graph(id='bar_bar_3', config={'displayModeBar': False}),
], style={'margin-left': '1.4%', 'width': '50%', 'display': 'inline-block'}),
# Chart R2: Create combination of bar chart and line chart (Compare each year sales and q. ordered for each product)
html.Div([
html.Br(),
dcc.Graph(id='bar_line_4', config={'displayModeBar': False}),
], style={'width': '48.6%', 'display': 'inline-block', 'float': 'right'}),
# Chart L3: Create line chart (each month sales)
html.Div([
html.Br(),
dcc.Graph(id='line_line_5', config={'displayModeBar': False}),
], style={'margin-left': '1.4%', 'width': '50%', 'display': 'inline-block', 'margin-bottom': '3%'}),
# Chart R3: Create scatter chart (Compare sales and q. ordered)
html.Div([
html.Br(),
dcc.Graph(id='scatter_6', config={'displayModeBar': False}),
], style={'width': '48.6%', 'display': 'inline-block', 'float': 'right', 'margin-bottom': '3%'}),
], style={'background-color': '#e6e6e6'})
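Every callback below repeats the same pattern: group the sales frame by APG and TEAM, aggregate one measure, then filter to the selected team. In isolation, on illustrative data (the team names here are made up):

```python
import pandas as pd

# Illustrative frame with the same columns the callbacks aggregate over.
sales = pd.DataFrame({
    'TEAM': ['Lae', 'Lae', 'POM'],
    'ALT_PRODUCT_GROUP': ['P1', 'P1', 'P2'],
    'QTY_INVOICED': [10.0, 5.0, 2.0],
})
# Aggregate per (APG, TEAM), then keep only the rows for the chosen team.
agg = sales.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['QTY_INVOICED'].sum().reset_index()
lae = agg[agg['TEAM'] == 'Lae']
assert lae['QTY_INVOICED'].tolist() == [15.0]
```

The filtered frame is what each callback feeds into the `x`, `y`, and `hovertext` arguments of its traces.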
# Chart L1: Create combination of bar chart and line chart (Compare quantity ordered to each Price of product)
@app.callback(Output('bar_line_1', 'figure'),
[Input('w_teams', 'value')])
def update_graph(w_teams):
product_sales1 = sales_1.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['QTY_INVOICED'].sum().reset_index()
product_sales2 = sales_1.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['UNIT_NET_PRICE'].mean().reset_index()
return {
        'data': [go.Bar(x=product_sales1[product_sales1['TEAM'] == w_teams]['ALT_PRODUCT_GROUP'],
y=product_sales1[product_sales1['TEAM'] == w_teams]['QTY_INVOICED'],
text=product_sales1[product_sales1['TEAM'] == w_teams]['QTY_INVOICED'],
name='∑ Qty Sold',
texttemplate='%{text:.2s}',
textposition='auto',
marker=dict(
color=product_sales1[product_sales1['TEAM'] == w_teams]['QTY_INVOICED'],
colorscale='phase',
showscale=False),
yaxis='y1',
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales1[product_sales1['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>∑ Qty Sold</b>: ' + [f'{x:,.0f}' for x in
product_sales1[product_sales1['TEAM'] == w_teams][
'QTY_INVOICED']] + '<br>' +
'<b>APG</b>: ' + product_sales1[product_sales1['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(str) + '<br> '
),
go.Scatter(
x=product_sales2[product_sales2['TEAM'] == w_teams]['ALT_PRODUCT_GROUP'],
y=product_sales2[product_sales2['TEAM'] == w_teams]['UNIT_NET_PRICE'],
name='μ Price',
text=product_sales2[product_sales2['TEAM'] == w_teams]['UNIT_NET_PRICE'],
mode='markers + lines',
marker=dict(color='#bd3786'),
yaxis='y2',
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales2[product_sales2['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>Price</b>: $' + [f'{x:,.0f}' for x in
product_sales2[product_sales2['TEAM'] == w_teams][
'UNIT_NET_PRICE']] + '<br>' +
'<b>APG</b>: ' + product_sales2[product_sales2['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(str) + '<br>'
)],
'layout': go.Layout(
width=780,
height=520,
title={
'text': '∑ Qty Sold & Price: ' + w_teams,
'y': 0.93,
'x': 0.43,
'xanchor': 'center',
'yanchor': 'top'},
titlefont={'family': 'Oswald',
'color': 'rgb(230, 34, 144)',
'size': 25},
hovermode='x',
xaxis=dict(title='<b>APG</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis=dict(title='<b>∑ Qty Sold</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis2=dict(title='<b>Price (K)</b>',
overlaying='y',
side='right',
color='rgb(230, 34, 144)',
showline=True,
showgrid=False,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
legend=dict(title='',
x=0.25,
y=1.08,
orientation='h',
bgcolor='rgba(255, 255, 255, 0)',
traceorder="normal",
font=dict(
family="sans-serif",
size=12,
color='#000000'
)
),
legend_title_font_color="green",
uniformtext_minsize=12,
uniformtext_mode='hide',
)
}
# TODO: this is my preferred hart for sales revenue and sales qty
# Chart R1: Create combination of bar chart and line chart (Compare sales to each Price of product)
@app.callback(Output('bar_line_2', 'figure'),
[Input('w_teams', 'value')])
def update_graph(w_teams):
product_sales3 = sales_1.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['SALES_VALUE'].sum().reset_index()
product_sales4 = sales_1.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['UNIT_NET_PRICE'].mean().reset_index()
return {
        'data': [go.Bar(x=product_sales3[product_sales3['TEAM'] == w_teams]['ALT_PRODUCT_GROUP'],
y=product_sales3[product_sales3['TEAM'] == w_teams]['SALES_VALUE'],
text=product_sales3[product_sales3['TEAM'] == w_teams]['SALES_VALUE'],
name='∑ Sales',
texttemplate='%{text:.2s}',
textposition='auto',
marker=dict(
color=product_sales3[product_sales3['TEAM'] == w_teams]
['SALES_VALUE'],
colorscale='portland',
showscale=False),
yaxis='y1',
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales3[product_sales3['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>Sales</b>: $' + [f'{x:,.0f}' for x in
product_sales3[product_sales3['TEAM'] == w_teams][
'SALES_VALUE']] + '<br>' +
'<b>APG</b>: ' + product_sales3[product_sales3['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(str) + '<br>'
),
go.Scatter(
x=product_sales4[product_sales4['TEAM'] == w_teams]['ALT_PRODUCT_GROUP'],
y=product_sales4[product_sales4['TEAM'] == w_teams]['UNIT_NET_PRICE'],
name='μ Price (K)',
text=product_sales4[product_sales4['TEAM'] == w_teams]['UNIT_NET_PRICE'],
mode='markers + lines',
marker=dict(color='#bd3786'),
yaxis='y2',
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales4[product_sales4['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>APG</b>: ' + product_sales4[product_sales4['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(str) + '<br>' +
'<b>Price</b>: $' + [f'{x:,.0f}' for x in
product_sales4[product_sales4['TEAM'] == w_teams][
'UNIT_NET_PRICE']] + '<br>'
)],
'layout': go.Layout(
width=780,
height=520,
title={
'text': '∑ Sales & μ Price p/uom: ' + w_teams,
'y': 0.93,
'x': 0.43,
'xanchor': 'center',
'yanchor': 'top'},
titlefont={'family': 'Oswald',
'color': 'rgb(230, 34, 144)',
'size': 25},
hovermode='x',
xaxis=dict(title='<b>APG</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis=dict(title='<b>∑ Sales</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis2=dict(title='<b>μ Price of Ea. uom (K)</b>',
overlaying='y',
side='right',
color='rgb(230, 34, 144)',
showline=True,
showgrid=False,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
legend=dict(title='',
x=0.25,
y=1.08,
orientation='h',
bgcolor='rgba(255, 255, 255, 0)',
traceorder="normal",
font=dict(
family="sans-serif",
size=12,
color='#000000')),
legend_title_font_color="green",
uniformtext_minsize=12,
uniformtext_mode='hide',
)
}
# Chart L2: Create group bar chart (Compare sales and quantity ordered for each product)
@app.callback(Output('bar_bar_3', 'figure'),
[Input('w_teams', 'value')])
def update_graph(w_teams):
product_sales5 = sales_1.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['SALES_VALUE'].sum().reset_index()
product_sales6 = sales_1.groupby(['ALT_PRODUCT_GROUP', 'TEAM'])['QTY_INVOICED'].sum().reset_index()
return {
'data': [go.Bar(x=product_sales5[product_sales5['TEAM'] == w_teams]['ALT_PRODUCT_GROUP'],
y=product_sales5[product_sales5['TEAM'] == w_teams]['SALES_VALUE'],
text=product_sales5[product_sales5['TEAM'] == w_teams]['SALES_VALUE'],
name='∑ Sales',
texttemplate='%{text:.2s}',
textposition='auto',
marker=dict(color='rgb(214, 137, 16)'),
yaxis='y1',
offsetgroup=1,
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales5[product_sales5['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>APG</b>: ' + product_sales5[product_sales5['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(str) + '<br>' +
'<b>∑ Sales</b>: $' + [f'{x:,.0f}' for x in
product_sales5[product_sales5['TEAM'] == w_teams][
'SALES_VALUE']] + '<br>'
),
go.Bar(
x=product_sales6[product_sales6['TEAM'] == w_teams]['ALT_PRODUCT_GROUP'],
y=product_sales6[product_sales6['TEAM'] == w_teams]['QTY_INVOICED'],
name='∑ Qty. Ordered',
text=product_sales6[product_sales6['TEAM'] == w_teams]['QTY_INVOICED'],
texttemplate='%{text:.2s}',
textposition='auto',
marker=dict(color='rgb(112, 123, 124)'),
yaxis='y2',
offsetgroup=2,
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales6[product_sales6['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>APG</b>: ' + product_sales6[product_sales6['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(str) + '<br>' +
'<b>∑ Qty Ordered</b>: ' + [f'{x:,.0f}' for x in
product_sales6[product_sales6['TEAM'] == w_teams][
'QTY_INVOICED']] + '<br>'
)],
'layout': go.Layout(
width=780,
height=520,
title={
'text': '∑ Sales & ∑ Qty Sold by APG: ' + w_teams,
'y': 0.93,
'x': 0.43,
'xanchor': 'center',
'yanchor': 'top'},
titlefont={'family': 'Oswald',
'color': 'rgb(230, 34, 144)',
'size': 25},
hovermode='x',
xaxis=dict(title='<b>Name of APG</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis=dict(title='<b>∑ Sales</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis2=dict(title='<b>∑ Qty Sold</b>', overlaying='y', side='right',
color='rgb(230, 34, 144)',
showline=True,
showgrid=False,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
legend=dict(title='',
x=0.25,
y=1.08,
orientation='h',
bgcolor='rgba(255, 255, 255, 0)',
traceorder="normal",
font=dict(
family="sans-serif",
size=12,
color='#000000')),
legend_title_font_color="green",
uniformtext_minsize=12,
uniformtext_mode='hide',
)
}
# Chart R2: Create combination of bar chart and line chart (Compare each year sales and q. ordered for each product)
@app.callback(Output('bar_line_4', 'figure'),
[Input('w_teams', 'value')])
def update_graph(w_teams):
product_sales7 = sales_1.groupby(['TEAM', 'YEAR_ID'])['SALES_VALUE'].sum().reset_index()
product_sales8 = sales_1.groupby(['TEAM', 'YEAR_ID'])['QTY_INVOICED'].sum().reset_index()
return {
'data': [go.Bar(x=product_sales7[product_sales7['TEAM'] == w_teams]['YEAR_ID'],
y=product_sales7[product_sales7['TEAM'] == w_teams]['SALES_VALUE'],
text=product_sales7[product_sales7['TEAM'] == w_teams]['SALES_VALUE'],
name='∑ Sales',
texttemplate='%{text:.2s}',
textposition='auto',
marker=dict(color='rgb(11, 220, 239)'),
yaxis='y1',
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales7[product_sales7['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>∑ sales</b>: $' + [f'{x:,.0f}' for x in
product_sales7[product_sales7['TEAM'] == w_teams][
'SALES_VALUE']] + '<br>' +
'<b>Year</b>: ' + product_sales7[product_sales7['TEAM'] == w_teams][
'YEAR_ID'].astype(
str) + '<br>'
),
go.Scatter(
x=product_sales8[product_sales8['TEAM'] == w_teams]['YEAR_ID'],
y=product_sales8[product_sales8['TEAM'] == w_teams]['QTY_INVOICED'],
name='∑ Qty Ordered',
text=product_sales8[product_sales8['TEAM'] == w_teams]['QTY_INVOICED'],
mode='markers+lines',
marker=dict(color='#bd3786'),
yaxis='y2',
hoverinfo='text',
hovertext='<b>Team</b>: ' + product_sales8[product_sales8['TEAM'] == w_teams][
'TEAM'].astype(str) + '<br>' +
'<b>∑ Qty Sold</b>: ' + [f'{x:,.0f}' for x in
product_sales8[product_sales8['TEAM'] == w_teams][
'QTY_INVOICED']] + '<br>' +
'<b>Year</b>: ' + product_sales8[product_sales8['TEAM'] == w_teams]['YEAR_ID'].astype(
str) + '<br>'
)],
'layout': go.Layout(
width=780,
height=520,
title={
'text': '∑ Sales & ∑ Qty Sold by APG: ' + w_teams,
'y': 0.93,
'x': 0.43,
'xanchor': 'center',
'yanchor': 'top'},
titlefont={'family': 'Oswald',
'color': 'rgb(230, 34, 144)',
'size': 25},
hovermode='x',
xaxis=dict(title='<b>APG</b>',
tick0=0,
dtick=1,
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis=dict(title='<b>∑ sales</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis2=dict(title='<b>∑ Qty Sold</b>', overlaying='y', side='right',
color='rgb(230, 34, 144)',
showline=True,
showgrid=False,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
legend=dict(title='',
x=0.25,
y=1.08,
orientation='h',
bgcolor='rgba(255, 255, 255, 0)',
traceorder="normal",
font=dict(family="sans-serif", size=12, color='#000000')),
legend_title_font_color="green",
uniformtext_minsize=12,
uniformtext_mode='hide',
)
}
# TODO: I like this one. How can I make it more readable? Why do some people disagree with the week number?
# Chart L3: Create line chart (year-on-year weekly sales)
@app.callback(Output('line_line_5', 'figure'),
[Input('w_teams', 'value')])
def update_graph(w_teams):
monthly_sales = sales_1.groupby(['TEAM', 'YEAR_ID', 'WEEK_ID'])['SALES_VALUE'].sum().reset_index()
return {
'data': [go.Scatter(
x=monthly_sales[(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales['TEAM'] == w_teams)]['WEEK_ID'],
y=monthly_sales[(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales['TEAM'] == w_teams)]['SALES_VALUE'],
text=monthly_sales[(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales['TEAM'] == w_teams)][
'SALES_VALUE'],
name='2020',
mode='markers+lines',
hoverinfo='text',
hovertext='<b>Team</b>: ' + monthly_sales[(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales[
'TEAM'] == w_teams)][
'TEAM'].astype(str) + '<br>' +
'<b>Year</b>: ' + monthly_sales[(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales[
'TEAM'] == w_teams)][
'YEAR_ID'].astype(str) + '<br>' +
'<b>Week</b>: ' + monthly_sales[(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales[
'TEAM'] == w_teams)][
'WEEK_ID'].astype(str) + '<br>' +
'<b>Sales</b>: $' + [f'{x:,.0f}' for x in monthly_sales[
(monthly_sales['YEAR_ID'] == 2020) & (monthly_sales['TEAM'] == w_teams)]['SALES_VALUE']] + '<br>'
),
go.Scatter(
x=monthly_sales[(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales['TEAM'] == w_teams)][
'WEEK_ID'],
y=monthly_sales[(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales['TEAM'] == w_teams)][
'SALES_VALUE'],
text=monthly_sales[(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales['TEAM'] == w_teams)][
'SALES_VALUE'],
name='2021',
mode='markers+lines',
hoverinfo='text',
hovertext='<b>Team</b>: ' + monthly_sales[(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales[
'TEAM'] == w_teams)][
'TEAM'].astype(str) + '<br>' +
'<b>Year</b>: ' + monthly_sales[(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales[
'TEAM'] == w_teams)][
'YEAR_ID'].astype(str) + '<br>' +
'<b>Week</b>: ' + monthly_sales[(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales[
'TEAM'] == w_teams)][
'WEEK_ID'].astype(str) + '<br>' +
'<b>Sales</b>: $' + [f'{x:,.0f}' for x in monthly_sales[
(monthly_sales['YEAR_ID'] == 2021) & (monthly_sales['TEAM'] == w_teams)][
'SALES_VALUE']] + '<br>'
)],
'layout': go.Layout(
width=780,
height=520,
title={
'text': 'Weekly Sales: ' + w_teams,
'y': 0.93,
'x': 0.43,
'xanchor': 'center',
'yanchor': 'top'},
titlefont={'family': 'Oswald',
'color': 'rgb(230, 34, 144)',
'size': 25},
hovermode='x',
xaxis=dict(title='<b>Week</b>',
tick0=0,
dtick=1,
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2, ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis=dict(title='<b>∑ sales</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2, ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
legend=dict(title='',
x=0.25,
y=1.08,
orientation='h',
bgcolor='rgba(255, 255, 255, 0)',
traceorder="normal",
font=dict(
family="sans-serif",
size=12,
color='#000000')),
legend_title_font_color="green",
uniformtext_minsize=12,
uniformtext_mode='hide',
)
}
# TODO: Chart R3: Create bubble chart (compare sales and qty. ordered). This one doesn't really work; reconsider and
# re-plot. If you were the sales manager, what information would you need to see to make the decisions required to
# achieve the results you need?
@app.callback(Output('scatter_6', 'figure'),
[Input('w_teams', 'value')])
def update_graph(w_teams):
scatter = sales_1.groupby(['TEAM', 'ALT_PRODUCT_GROUP'])[['QTY_INVOICED', 'SALES_VALUE']].sum().reset_index()
return {
'data': [go.Scatter(x=scatter[scatter['TEAM'] == w_teams]['QTY_INVOICED'],
y=scatter[scatter['TEAM'] == w_teams]['SALES_VALUE'],
text=scatter[scatter['TEAM'] == w_teams]['SALES_VALUE'],
mode='markers',
hoverinfo='text',
hovertext='<b>Team</b>: ' + scatter[scatter['TEAM'] == w_teams]['TEAM'].astype(
str) + '<br>' +
'<b>APG</b>: ' + scatter[scatter['TEAM'] == w_teams][
'ALT_PRODUCT_GROUP'].astype(
str) + '<br>' +
'<b>∑ Qty Sold</b>: ' + [f'{x:,.0f}' for x in
scatter[scatter['TEAM'] == w_teams][
'QTY_INVOICED']] + '<br>' +
'<b>Sales</b>: $' + [f'{x:,.0f}' for x in
scatter[scatter['TEAM'] == w_teams][
'SALES_VALUE']] + '<br>',
marker=dict(
size=20,
color=scatter[scatter['TEAM'] == w_teams]['QTY_INVOICED'],
colorscale='mrybm',
showscale=False
)
)],
'layout': go.Layout(
width=780,
height=520,
title={
'text': '∑ Sales vs. Sold Qty: ' + w_teams,
'y': 0.93,
'x': 0.43,
'xanchor': 'center',
'yanchor': 'top'},
titlefont={'family': 'Oswald',
'color': 'rgb(230, 34, 144)',
'size': 25},
hovermode='x',
xaxis=dict(
title='<b>∑ Qty Sold</b>',
color='rgb(230, 34, 144)',
showline=True, showgrid=True,
showticklabels=True, linecolor='rgb(104, 204, 104)',
linewidth=2,
ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
yaxis=dict(title='<b>Sales</b>',
color='rgb(230, 34, 144)',
showline=True,
showgrid=True,
showticklabels=True,
linecolor='rgb(104, 204, 104)',
linewidth=2, ticks='outside',
tickfont=dict(
family='Arial',
size=12,
color='rgb(17, 37, 239)'
)
),
)
}
if __name__ == '__main__':
app.run_server(host='0.0.0.0', port=8050)
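Every callback above rebuilds the same currency hovertext by hand with `'$' + [f'{x:,.0f}' for x in ...]`. A small helper could centralize that formatting; this is a sketch with hypothetical names (`fmt_money`, `fmt_money_series`), not part of the app:

```python
# Sketch: centralize the repeated currency formatting used in the
# hovertext strings above. Helper names are hypothetical.
def fmt_money(value):
    """Format a number as a dollar amount with thousands separators."""
    return f'${value:,.0f}'

def fmt_money_series(values):
    """Apply fmt_money to an iterable (e.g. a pandas Series) of numbers."""
    return [fmt_money(v) for v in values]

if __name__ == '__main__':
    print(fmt_money(1234567.89))        # $1,234,568
    print(fmt_money_series([5, 1000]))  # ['$5', '$1,000']
```

Each hovertext expression could then call `fmt_money_series(...)` instead of repeating the list comprehension.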
# File: bugtests/test058.py (repo: doom38/jython_v2.2.1, license: CNRI-Jython)
"""
"%#.0f", "%e" and "%+f" w/ negative numbers don't print correctly.
"""
import support
if "%#.0f" % 5 != "5.":
raise support.TestWarning("Format Error #1 %#.0f" % 5)
if "%.1f" % 5 != "5.0":
raise support.TestError("Format Error #2 %.1f" % 5)
if "%e" % -1e-6 != "-1.000000e-006":
raise support.TestError("Format Error #3 %e" % -1e-6)
if "%e" % 0 != "0.000000e+000":
raise support.TestError("Format Error #4 %e" % 0)
if "%e" % 1e-6 != "1.000000e-006":
raise support.TestError("Format Error #5 %e" % 1e-6)
if "%+f" % -5 != "-5.000000":
raise support.TestError("Format Error #6 %+f" % -5)
if "%+f" % 5 != "+5.000000":
raise support.TestError("Format Error #7 %+f" % 5)
import java
java.util.Locale.setDefault(java.util.Locale("us", ""))
if "%#.0f" % 5 != "5.":
raise support.TestError("Format Error #8")
if "%.1f" % 5 != "5.0":
raise support.TestError("Format Error #9")
if "%e" % -1e-6 != "-1.000000e-006":
raise support.TestError("Format Error #10")
if "%e" % 0 != "0.000000e+000":
raise support.TestError("Format Error #11")
if "%e" % 1e-6 != "1.000000e-006":
raise support.TestError("Format Error #12")
if "%+f" % -5 != "-5.000000":
raise support.TestError("Format Error #13")
if "%+f" % 5 != "+5.000000":
raise support.TestError("Format Error #14")
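The expected strings above (e.g. `"1.000000e-006"`) follow the old Jython/Windows convention of three-digit exponents. A quick sketch, assuming it runs under CPython, which normalizes `%e` output to a minimum two-digit exponent:

```python
# Sketch: CPython's %e formatting uses two-digit exponents, unlike the
# three-digit Jython/Windows style the tests above expect.
assert "%e" % 1e-6 == "1.000000e-06"
assert "%e" % 0 == "0.000000e+00"
assert "%#.0f" % 5 == "5."
assert "%+f" % 5 == "+5.000000"
```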
# File: Desafios/des017.py (repo: joseangelooliveira-br/Python3, license: MIT)
'''
co = float(input('Enter the length of the opposite leg: '))
ca = float(input('Enter the length of the adjacent leg: '))
hip = (ca**2 + co**2)**(1/2)
print('The hypotenuse will measure {}.'.format(hip))
'''
'''
import math
co = float(input('Enter the length of the opposite leg: '))
ca = float(input('Enter the length of the adjacent leg: '))
hip = math.hypot(ca, co)
print('The hypotenuse will measure {}.'.format(hip))
'''
'''
from math import hypot
co = float(input('Enter the length of the opposite leg: '))
ca = float(input('Enter the length of the adjacent leg: '))
hip = hypot(ca, co)
print('The hypotenuse will measure {}.'.format(hip))
'''
from math import hypot
co = float(input('Enter the length of the opposite leg: '))
ca = float(input('Enter the length of the adjacent leg: '))
print('The hypotenuse will measure {}.'.format(hypot(ca, co)))
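All the variants in this file compute the same hypotenuse; `math.hypot` and the manual `(ca**2 + co**2)**0.5` agree. A minimal sketch checking that with a 3-4-5 right triangle (the values are illustrative):

```python
# Sketch: math.hypot and the manual Pythagorean formula give the same
# result; a 3-4-5 triangle makes the check easy to eyeball.
from math import hypot

ca, co = 3.0, 4.0
manual = (ca ** 2 + co ** 2) ** 0.5
assert manual == 5.0
assert hypot(ca, co) == 5.0
print('hypotenuse =', hypot(ca, co))  # hypotenuse = 5.0
```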
# File: ansys/dpf/core/operators/mapping/__init__.py (repo: jfthuong/pydpf-core, license: MIT)
from .find_reduced_coordinates import find_reduced_coordinates
from .on_reduced_coordinates import on_reduced_coordinates
from .on_coordinates import on_coordinates
from .scoping_on_coordinates import scoping_on_coordinates
from .solid_to_skin import solid_to_skin
# File: tccli/services/mongodb/mongodb_client.py (repo: zqfan/tencentcloud-cli, license: Apache-2.0)
# -*- coding: utf-8 -*-
import os
import json
import tccli.options_define as OptionsDefine
import tccli.format_output as FormatOutput
from tccli import __version__
from tccli.utils import Utils
from tccli.exceptions import ConfigurationError
from tencentcloud.common import credential
from tencentcloud.common.profile.http_profile import HttpProfile
from tencentcloud.common.profile.client_profile import ClientProfile
from tencentcloud.mongodb.v20190725 import mongodb_client as mongodb_client_v20190725
from tencentcloud.mongodb.v20190725 import models as models_v20190725
from tencentcloud.mongodb.v20180408 import mongodb_client as mongodb_client_v20180408
from tencentcloud.mongodb.v20180408 import models as models_v20180408
def doDescribeDBInstanceDeal(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeDBInstanceDealRequest()
model.from_json_string(json.dumps(args))
rsp = client.DescribeDBInstanceDeal(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
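Every generated handler in this file repeats the same `try`/`except TypeError` dance to accept either a `str` or `bytes` payload from `to_json_string()`. A helper could fold that into one call; this is a sketch with a hypothetical name (`load_json_result`), not part of the generated tccli client:

```python
import json

# Sketch: fold the repeated str-or-bytes JSON decoding above into one
# helper. Hypothetical name; not part of the generated client code.
def load_json_result(result):
    """Decode a JSON payload that may arrive as str or utf-8 bytes."""
    if isinstance(result, bytes):
        result = result.decode('utf-8')
    return json.loads(result)

if __name__ == '__main__':
    print(load_json_result('{"a": 1}'))   # {'a': 1}
    print(load_json_result(b'{"a": 1}'))  # {'a': 1}
```

Each handler body could then end with `json_obj = load_json_result(rsp.to_json_string())` instead of the try/except block.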
def doDescribeCurrentOp(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeCurrentOpRequest()
model.from_json_string(json.dumps(args))
rsp = client.DescribeCurrentOp(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeClientConnections(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeClientConnectionsRequest()
model.from_json_string(json.dumps(args))
rsp = client.DescribeClientConnections(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doResetDBInstancePassword(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.ResetDBInstancePasswordRequest()
model.from_json_string(json.dumps(args))
rsp = client.ResetDBInstancePassword(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeSecurityGroup(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeSecurityGroupRequest()
model.from_json_string(json.dumps(args))
rsp = client.DescribeSecurityGroup(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doSetPassword(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.SetPasswordRequest()
model.from_json_string(json.dumps(args))
rsp = client.SetPassword(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doFlushInstanceRouterConfig(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.FlushInstanceRouterConfigRequest()
model.from_json_string(json.dumps(args))
rsp = client.FlushInstanceRouterConfig(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doSetAutoRenew(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.SetAutoRenewRequest()
model.from_json_string(json.dumps(args))
rsp = client.SetAutoRenew(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doDescribeDBBackups(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.DescribeDBBackupsRequest()
model.from_json_string(json.dumps(args))
rsp = client.DescribeDBBackups(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])
def doIsolateDBInstance(args, parsed_globals):
g_param = parse_global_arg(parsed_globals)
cred = credential.Credential(
g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
)
http_profile = HttpProfile(
reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
reqMethod="POST",
endpoint=g_param[OptionsDefine.Endpoint]
)
profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
client._sdkVersion += ("_CLI_" + __version__)
models = MODELS_MAP[g_param[OptionsDefine.Version]]
model = models.IsolateDBInstanceRequest()
model.from_json_string(json.dumps(args))
rsp = client.IsolateDBInstance(model)
result = rsp.to_json_string()
try:
json_obj = json.loads(result)
except TypeError as e:
json_obj = json.loads(result.decode('utf-8')) # python3.3
FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeBackupAccess(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeBackupAccessRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeBackupAccess(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doInquirePriceModifyDBInstanceSpec(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.InquirePriceModifyDBInstanceSpecRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.InquirePriceModifyDBInstanceSpec(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doDescribeAsyncRequestInfo(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeAsyncRequestInfoRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeAsyncRequestInfo(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doCreateDBInstanceHour(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateDBInstanceHourRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.CreateDBInstanceHour(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doTerminateDBInstance(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.TerminateDBInstanceRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.TerminateDBInstance(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doRenewDBInstances(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.RenewDBInstancesRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.RenewDBInstances(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doCreateBackupDownloadTask(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateBackupDownloadTaskRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.CreateBackupDownloadTask(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doUpgradeDBInstanceHour(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.UpgradeDBInstanceHourRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.UpgradeDBInstanceHour(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doDescribeDBInstances(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeDBInstancesRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeDBInstances(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doOfflineIsolatedDBInstance(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.OfflineIsolatedDBInstanceRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.OfflineIsolatedDBInstance(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doDescribeSlowLogPatterns(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeSlowLogPatternsRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeSlowLogPatterns(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doDescribeSlowLogs(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeSlowLogsRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeSlowLogs(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doCreateDBInstance(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateDBInstanceRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.CreateDBInstance(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doModifyDBInstanceSpec(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.ModifyDBInstanceSpecRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.ModifyDBInstanceSpec(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doDescribeSpecInfo(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeSpecInfoRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeSpecInfo(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doDescribeBackupDownloadTask(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeBackupDownloadTaskRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeBackupDownloadTask(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doInquirePriceCreateDBInstances(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.InquirePriceCreateDBInstancesRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.InquirePriceCreateDBInstances(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doAssignProject(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.AssignProjectRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.AssignProject(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doDescribeSlowLog(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.DescribeSlowLogRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.DescribeSlowLog(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doRenameInstance(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.RenameInstanceRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.RenameInstance(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

def doUpgradeDBInstance(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.UpgradeDBInstanceRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.UpgradeDBInstance(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doKillOps(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.KillOpsRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.KillOps(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doCreateBackupDBInstance(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.CreateBackupDBInstanceRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.CreateBackupDBInstance(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])


def doInquirePriceRenewDBInstances(args, parsed_globals):
    g_param = parse_global_arg(parsed_globals)
    cred = credential.Credential(
        g_param[OptionsDefine.SecretId], g_param[OptionsDefine.SecretKey], g_param[OptionsDefine.Token]
    )
    http_profile = HttpProfile(
        reqTimeout=60 if g_param[OptionsDefine.Timeout] is None else int(g_param[OptionsDefine.Timeout]),
        reqMethod="POST",
        endpoint=g_param[OptionsDefine.Endpoint]
    )
    profile = ClientProfile(httpProfile=http_profile, signMethod="HmacSHA256")
    mod = CLIENT_MAP[g_param[OptionsDefine.Version]]
    client = mod.MongodbClient(cred, g_param[OptionsDefine.Region], profile)
    client._sdkVersion += ("_CLI_" + __version__)
    models = MODELS_MAP[g_param[OptionsDefine.Version]]
    model = models.InquirePriceRenewDBInstancesRequest()
    model.from_json_string(json.dumps(args))
    rsp = client.InquirePriceRenewDBInstances(model)
    result = rsp.to_json_string()
    try:
        json_obj = json.loads(result)
    except TypeError as e:
        json_obj = json.loads(result.decode('utf-8'))  # python3.3
    FormatOutput.output("action", json_obj, g_param[OptionsDefine.Output], g_param[OptionsDefine.Filter])

CLIENT_MAP = {
    "v20190725": mongodb_client_v20190725,
    "v20180408": mongodb_client_v20180408,
}

MODELS_MAP = {
    "v20190725": models_v20190725,
    "v20180408": models_v20180408,
}

ACTION_MAP = {
    "DescribeDBInstanceDeal": doDescribeDBInstanceDeal,
    "DescribeCurrentOp": doDescribeCurrentOp,
    "DescribeClientConnections": doDescribeClientConnections,
    "ResetDBInstancePassword": doResetDBInstancePassword,
    "DescribeSecurityGroup": doDescribeSecurityGroup,
    "SetPassword": doSetPassword,
    "FlushInstanceRouterConfig": doFlushInstanceRouterConfig,
    "SetAutoRenew": doSetAutoRenew,
    "DescribeDBBackups": doDescribeDBBackups,
    "IsolateDBInstance": doIsolateDBInstance,
    "DescribeBackupAccess": doDescribeBackupAccess,
    "InquirePriceModifyDBInstanceSpec": doInquirePriceModifyDBInstanceSpec,
    "DescribeAsyncRequestInfo": doDescribeAsyncRequestInfo,
    "CreateDBInstanceHour": doCreateDBInstanceHour,
    "TerminateDBInstance": doTerminateDBInstance,
    "RenewDBInstances": doRenewDBInstances,
    "CreateBackupDownloadTask": doCreateBackupDownloadTask,
    "UpgradeDBInstanceHour": doUpgradeDBInstanceHour,
    "DescribeDBInstances": doDescribeDBInstances,
    "OfflineIsolatedDBInstance": doOfflineIsolatedDBInstance,
    "DescribeSlowLogPatterns": doDescribeSlowLogPatterns,
    "DescribeSlowLogs": doDescribeSlowLogs,
    "CreateDBInstance": doCreateDBInstance,
    "ModifyDBInstanceSpec": doModifyDBInstanceSpec,
    "DescribeSpecInfo": doDescribeSpecInfo,
    "DescribeBackupDownloadTask": doDescribeBackupDownloadTask,
    "InquirePriceCreateDBInstances": doInquirePriceCreateDBInstances,
    "AssignProject": doAssignProject,
    "DescribeSlowLog": doDescribeSlowLog,
    "RenameInstance": doRenameInstance,
    "UpgradeDBInstance": doUpgradeDBInstance,
    "KillOps": doKillOps,
    "CreateBackupDBInstance": doCreateBackupDBInstance,
    "InquirePriceRenewDBInstances": doInquirePriceRenewDBInstances,
}

AVAILABLE_VERSION_LIST = [
    "v20190725",
    "v20180408",
]


def action_caller():
    return ACTION_MAP

def parse_global_arg(parsed_globals):
    g_param = parsed_globals
    is_exist_profile = True
    if not parsed_globals["profile"]:
        is_exist_profile = False
        g_param["profile"] = "default"

    configure_path = os.path.join(os.path.expanduser("~"), ".tccli")
    is_conf_exist, conf_path = Utils.file_existed(configure_path, g_param["profile"] + ".configure")
    is_cred_exist, cred_path = Utils.file_existed(configure_path, g_param["profile"] + ".credential")

    conf = {}
    cred = {}
    if is_conf_exist:
        conf = Utils.load_json_msg(conf_path)
    if is_cred_exist:
        cred = Utils.load_json_msg(cred_path)

    if not (isinstance(conf, dict) and isinstance(cred, dict)):
        raise ConfigurationError(
            "file: %s or %s is not json format"
            % (g_param["profile"] + ".configure", g_param["profile"] + ".credential"))

    if OptionsDefine.Token not in cred:
        cred[OptionsDefine.Token] = None

    if not is_exist_profile:
        if os.environ.get(OptionsDefine.ENV_SECRET_ID) and os.environ.get(OptionsDefine.ENV_SECRET_KEY):
            cred[OptionsDefine.SecretId] = os.environ.get(OptionsDefine.ENV_SECRET_ID)
            cred[OptionsDefine.SecretKey] = os.environ.get(OptionsDefine.ENV_SECRET_KEY)
            cred[OptionsDefine.Token] = os.environ.get(OptionsDefine.ENV_TOKEN)
        if os.environ.get(OptionsDefine.ENV_REGION):
            conf[OptionsDefine.Region] = os.environ.get(OptionsDefine.ENV_REGION)

    for param in g_param.keys():
        if g_param[param] is None:
            if param in [OptionsDefine.SecretKey, OptionsDefine.SecretId, OptionsDefine.Token]:
                if param in cred:
                    g_param[param] = cred[param]
                else:
                    raise ConfigurationError("%s is invalid" % param)
            elif param in [OptionsDefine.Region, OptionsDefine.Output]:
                if param in conf:
                    g_param[param] = conf[param]
                else:
                    raise ConfigurationError("%s is invalid" % param)

    try:
        if g_param[OptionsDefine.ServiceVersion]:
            g_param[OptionsDefine.Version] = "v" + g_param[OptionsDefine.ServiceVersion].replace('-', '')
        else:
            version = conf["mongodb"][OptionsDefine.Version]
            g_param[OptionsDefine.Version] = "v" + version.replace('-', '')

        if g_param[OptionsDefine.Endpoint] is None:
            g_param[OptionsDefine.Endpoint] = conf["mongodb"][OptionsDefine.Endpoint]
    except Exception as err:
        raise ConfigurationError("config file:%s error, %s" % (conf_path, str(err)))

    if g_param[OptionsDefine.Version] not in AVAILABLE_VERSION_LIST:
        raise Exception("available versions: %s" % " ".join(AVAILABLE_VERSION_LIST))

    return g_param
from uuid import uuid4
def gen_booking_id():
    return str(uuid4())
"""
Python Markdown
A Python implementation of John Gruber's Markdown.
Documentation: https://python-markdown.github.io/
GitHub: https://github.com/Python-Markdown/markdown/
PyPI: https://pypi.org/project/Markdown/
Started by Manfred Stienstra (http://www.dwerg.net/).
Maintained for a few years by Yuri Takhteyev (http://www.freewisdom.org).
Currently maintained by Waylan Limberg (https://github.com/waylan),
Dmitry Shachnev (https://github.com/mitya57) and Isaac Muse (https://github.com/facelessuser).
Copyright 2007-2018 The Python Markdown Project (v. 1.7 and later)
Copyright 2004, 2005, 2006 Yuri Takhteyev (v. 0.2-1.6b)
Copyright 2004 Manfred Stienstra (the original version)
License: BSD (see LICENSE.md for details).
"""
import unittest
from markdown.test_tools import TestCase
class TestSetextHeaders(TestCase):

    def test_setext_h1(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is an H1
                =============
                """
            ),
            '<h1>This is an H1</h1>'
        )

    def test_setext_h2(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is an H2
                -------------
                """
            ),
            '<h2>This is an H2</h2>'
        )

    def test_setext_h1_mismatched_length(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is an H1
                ===
                """
            ),
            '<h1>This is an H1</h1>'
        )

    def test_setext_h2_mismatched_length(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is an H2
                ---
                """
            ),
            '<h2>This is an H2</h2>'
        )

    def test_setext_h1_followed_by_p(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is an H1
                =============
                Followed by a Paragraph with no blank line.
                """
            ),
            self.dedent(
                """
                <h1>This is an H1</h1>
                <p>Followed by a Paragraph with no blank line.</p>
                """
            )
        )

    def test_setext_h2_followed_by_p(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is an H2
                -------------
                Followed by a Paragraph with no blank line.
                """
            ),
            self.dedent(
                """
                <h2>This is an H2</h2>
                <p>Followed by a Paragraph with no blank line.</p>
                """
            )
        )

    # TODO: fix this
    # see https://johnmacfarlane.net/babelmark2/?normalize=1&text=Paragraph%0AAn+H1%0A%3D%3D%3D%3D%3D
    @unittest.skip('This is broken in Python-Markdown')
    def test_p_followed_by_setext_h1(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is a Paragraph.
                Followed by an H1 with no blank line.
                =====================================
                """
            ),
            self.dedent(
                """
                <p>This is a Paragraph.</p>
                <h1>Followed by an H1 with no blank line.</h1>
                """
            )
        )

    # TODO: fix this
    # see https://johnmacfarlane.net/babelmark2/?normalize=1&text=Paragraph%0AAn+H2%0A-----
    @unittest.skip('This is broken in Python-Markdown')
    def test_p_followed_by_setext_h2(self):
        self.assertMarkdownRenders(
            self.dedent(
                """
                This is a Paragraph.
                Followed by an H2 with no blank line.
                -------------------------------------
                """
            ),
            self.dedent(
                """
                <p>This is a Paragraph.</p>
                <h2>Followed by an H2 with no blank line.</h2>
                """
            )
        )
class TestHashHeaders(TestCase):
def test_hash_h1_open(self):
self.assertMarkdownRenders(
'# This is an H1',
'<h1>This is an H1</h1>'
)
def test_hash_h2_open(self):
self.assertMarkdownRenders(
'## This is an H2',
'<h2>This is an H2</h2>'
)
def test_hash_h3_open(self):
self.assertMarkdownRenders(
'### This is an H3',
'<h3>This is an H3</h3>'
)
def test_hash_h4_open(self):
self.assertMarkdownRenders(
'#### This is an H4',
'<h4>This is an H4</h4>'
)
def test_hash_h5_open(self):
self.assertMarkdownRenders(
'##### This is an H5',
'<h5>This is an H5</h5>'
)
def test_hash_h6_open(self):
self.assertMarkdownRenders(
'###### This is an H6',
'<h6>This is an H6</h6>'
)
def test_hash_gt6_open(self):
self.assertMarkdownRenders(
'####### This is an H6',
'<h6># This is an H6</h6>'
)
def test_hash_h1_open_missing_space(self):
self.assertMarkdownRenders(
'#This is an H1',
'<h1>This is an H1</h1>'
)
def test_hash_h2_open_missing_space(self):
self.assertMarkdownRenders(
'##This is an H2',
'<h2>This is an H2</h2>'
)
def test_hash_h3_open_missing_space(self):
self.assertMarkdownRenders(
'###This is an H3',
'<h3>This is an H3</h3>'
)
def test_hash_h4_open_missing_space(self):
self.assertMarkdownRenders(
'####This is an H4',
'<h4>This is an H4</h4>'
)
def test_hash_h5_open_missing_space(self):
self.assertMarkdownRenders(
'#####This is an H5',
'<h5>This is an H5</h5>'
)
def test_hash_h6_open_missing_space(self):
self.assertMarkdownRenders(
'######This is an H6',
'<h6>This is an H6</h6>'
)
def test_hash_gt6_open_missing_space(self):
self.assertMarkdownRenders(
'#######This is an H6',
'<h6>#This is an H6</h6>'
)
def test_hash_h1_closed(self):
self.assertMarkdownRenders(
'# This is an H1 #',
'<h1>This is an H1</h1>'
)
def test_hash_h2_closed(self):
self.assertMarkdownRenders(
'## This is an H2 ##',
'<h2>This is an H2</h2>'
)
def test_hash_h3_closed(self):
self.assertMarkdownRenders(
'### This is an H3 ###',
'<h3>This is an H3</h3>'
)
def test_hash_h4_closed(self):
self.assertMarkdownRenders(
'#### This is an H4 ####',
'<h4>This is an H4</h4>'
)
def test_hash_h5_closed(self):
self.assertMarkdownRenders(
'##### This is an H5 #####',
'<h5>This is an H5</h5>'
)
def test_hash_h6_closed(self):
self.assertMarkdownRenders(
'###### This is an H6 ######',
'<h6>This is an H6</h6>'
)
def test_hash_gt6_closed(self):
self.assertMarkdownRenders(
'####### This is an H6 #######',
'<h6># This is an H6</h6>'
)
def test_hash_h1_closed_missing_space(self):
self.assertMarkdownRenders(
'#This is an H1#',
'<h1>This is an H1</h1>'
)
def test_hash_h2_closed_missing_space(self):
self.assertMarkdownRenders(
'##This is an H2##',
'<h2>This is an H2</h2>'
)
def test_hash_h3_closed_missing_space(self):
self.assertMarkdownRenders(
'###This is an H3###',
'<h3>This is an H3</h3>'
)
def test_hash_h4_closed_missing_space(self):
self.assertMarkdownRenders(
'####This is an H4####',
'<h4>This is an H4</h4>'
)
def test_hash_h5_closed_missing_space(self):
self.assertMarkdownRenders(
'#####This is an H5#####',
'<h5>This is an H5</h5>'
)
def test_hash_h6_closed_missing_space(self):
self.assertMarkdownRenders(
'######This is an H6######',
'<h6>This is an H6</h6>'
)
def test_hash_gt6_closed_missing_space(self):
self.assertMarkdownRenders(
'#######This is an H6#######',
'<h6>#This is an H6</h6>'
)
def test_hash_h1_closed_mismatch(self):
self.assertMarkdownRenders(
'# This is an H1 ##',
'<h1>This is an H1</h1>'
)
def test_hash_h2_closed_mismatch(self):
self.assertMarkdownRenders(
'## This is an H2 #',
'<h2>This is an H2</h2>'
)
def test_hash_h3_closed_mismatch(self):
self.assertMarkdownRenders(
'### This is an H3 #',
'<h3>This is an H3</h3>'
)
def test_hash_h4_closed_mismatch(self):
self.assertMarkdownRenders(
'#### This is an H4 #',
'<h4>This is an H4</h4>'
)
def test_hash_h5_closed_mismatch(self):
self.assertMarkdownRenders(
'##### This is an H5 #',
'<h5>This is an H5</h5>'
)
def test_hash_h6_closed_mismatch(self):
self.assertMarkdownRenders(
'###### This is an H6 #',
'<h6>This is an H6</h6>'
)
def test_hash_gt6_closed_mismatch(self):
self.assertMarkdownRenders(
'####### This is an H6 ##################',
'<h6># This is an H6</h6>'
)
def test_hash_h1_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
# This is an H1
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h1>This is an H1</h1>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_hash_h2_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
## This is an H2
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h2>This is an H2</h2>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_hash_h3_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
### This is an H3
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h3>This is an H3</h3>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_hash_h4_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
#### This is an H4
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h4>This is an H4</h4>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_hash_h5_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
##### This is an H5
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h5>This is an H5</h5>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_hash_h6_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
###### This is an H6
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h6>This is an H6</h6>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_hash_h1_leading_space(self):
self.assertMarkdownRenders(
' # This is an H1',
'<p># This is an H1</p>'
)
def test_hash_h2_leading_space(self):
self.assertMarkdownRenders(
' ## This is an H2',
'<p>## This is an H2</p>'
)
def test_hash_h3_leading_space(self):
self.assertMarkdownRenders(
' ### This is an H3',
'<p>### This is an H3</p>'
)
def test_hash_h4_leading_space(self):
self.assertMarkdownRenders(
' #### This is an H4',
'<p>#### This is an H4</p>'
)
def test_hash_h5_leading_space(self):
self.assertMarkdownRenders(
' ##### This is an H5',
'<p>##### This is an H5</p>'
)
def test_hash_h6_leading_space(self):
self.assertMarkdownRenders(
' ###### This is an H6',
'<p>###### This is an H6</p>'
)
def test_hash_h1_open_trailing_space(self):
self.assertMarkdownRenders(
'# This is an H1 ',
'<h1>This is an H1</h1>'
)
def test_hash_h2_open_trailing_space(self):
self.assertMarkdownRenders(
'## This is an H2 ',
'<h2>This is an H2</h2>'
)
def test_hash_h3_open_trailing_space(self):
self.assertMarkdownRenders(
'### This is an H3 ',
'<h3>This is an H3</h3>'
)
def test_hash_h4_open_trailing_space(self):
self.assertMarkdownRenders(
'#### This is an H4 ',
'<h4>This is an H4</h4>'
)
def test_hash_h5_open_trailing_space(self):
self.assertMarkdownRenders(
'##### This is an H5 ',
'<h5>This is an H5</h5>'
)
def test_hash_h6_open_trailing_space(self):
self.assertMarkdownRenders(
'###### This is an H6 ',
'<h6>This is an H6</h6>'
)
def test_hash_gt6_open_trailing_space(self):
self.assertMarkdownRenders(
'####### This is an H6 ',
'<h6># This is an H6</h6>'
)
# TODO: Possibly change the following behavior. While this follows the behavior
    # of markdown.pl, it is rather uncommon and not necessarily intuitive.
# See: https://johnmacfarlane.net/babelmark2/?normalize=1&text=%23+This+is+an+H1+%23+
def test_hash_h1_closed_trailing_space(self):
self.assertMarkdownRenders(
'# This is an H1 # ',
'<h1>This is an H1 #</h1>'
)
def test_hash_h2_closed_trailing_space(self):
self.assertMarkdownRenders(
'## This is an H2 ## ',
'<h2>This is an H2 ##</h2>'
)
def test_hash_h3_closed_trailing_space(self):
self.assertMarkdownRenders(
'### This is an H3 ### ',
'<h3>This is an H3 ###</h3>'
)
def test_hash_h4_closed_trailing_space(self):
self.assertMarkdownRenders(
'#### This is an H4 #### ',
'<h4>This is an H4 ####</h4>'
)
def test_hash_h5_closed_trailing_space(self):
self.assertMarkdownRenders(
'##### This is an H5 ##### ',
'<h5>This is an H5 #####</h5>'
)
def test_hash_h6_closed_trailing_space(self):
self.assertMarkdownRenders(
'###### This is an H6 ###### ',
'<h6>This is an H6 ######</h6>'
)
def test_hash_gt6_closed_trailing_space(self):
self.assertMarkdownRenders(
'####### This is an H6 ####### ',
'<h6># This is an H6 #######</h6>'
)
    def test_no_blank_lines_between_hashes(self):
self.assertMarkdownRenders(
self.dedent(
"""
# This is an H1
## This is an H2
"""
),
self.dedent(
"""
<h1>This is an H1</h1>
<h2>This is an H2</h2>
"""
)
)
def test_random_hash_levels(self):
self.assertMarkdownRenders(
self.dedent(
"""
### H3
###### H6
# H1
##### H5
#### H4
## H2
### H3
"""
),
self.dedent(
"""
<h3>H3</h3>
<h6>H6</h6>
<h1>H1</h1>
<h5>H5</h5>
<h4>H4</h4>
<h2>H2</h2>
<h3>H3</h3>
"""
)
)
def test_hash_followed_by_p(self):
self.assertMarkdownRenders(
self.dedent(
"""
# This is an H1
Followed by a Paragraph with no blank line.
"""
),
self.dedent(
"""
<h1>This is an H1</h1>
<p>Followed by a Paragraph with no blank line.</p>
"""
)
)
def test_p_followed_by_hash(self):
self.assertMarkdownRenders(
self.dedent(
"""
This is a Paragraph.
# Followed by an H1 with no blank line.
"""
),
self.dedent(
"""
<p>This is a Paragraph.</p>
<h1>Followed by an H1 with no blank line.</h1>
"""
)
)
def test_escaped_hash(self):
self.assertMarkdownRenders(
"### H3 \\###",
self.dedent(
"""
<h3>H3 #</h3>
"""
)
)
def test_unescaped_hash(self):
self.assertMarkdownRenders(
"### H3 \\\\###",
self.dedent(
"""
<h3>H3 \\</h3>
"""
)
)
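The hash-header rules these tests encode (levels 1-6, optional missing space, closing runs of hashes stripped, more than six hashes capped at h6 with the extras spilling into the content) can be sketched with a small stdlib-only function. This is a simplified illustration, not Python-Markdown's actual implementation; `hash_header` is a hypothetical name, and the markdown.pl trailing-space quirk and escaped hashes are deliberately not modeled.

```python
def hash_header(line: str) -> str:
    """Render one ATX-style hash-header line to HTML (simplified).

    Mirrors the expectations in the tests above: 1-6 leading '#' map to
    h1-h6, hashes beyond six stay in the body, and a closing run of '#'
    is stripped. A leading space demotes the line to a paragraph.
    """
    if not line.startswith('#'):
        # Indented hashes are not headers; render as a paragraph.
        return '<p>{}</p>'.format(line.strip())
    n = len(line) - len(line.lstrip('#'))
    level = min(n, 6)
    body = line[level:]      # extra '#' beyond six remain in the body
    body = body.rstrip()     # drop trailing whitespace
    body = body.rstrip('#')  # drop any closing run of hashes
    body = body.strip()      # drop the optional spaces around the text
    return '<h{0}>{1}</h{0}>'.format(level, body)
```

For example, `hash_header('####### This is an H6')` yields `<h6># This is an H6</h6>`, matching the `gt6` cases above.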
| 25.502052 | 101 | 0.458374 | 1,953 | 18,642 | 4.213006 | 0.071173 | 0.108653 | 0.137093 | 0.220588 | 0.881138 | 0.849538 | 0.840544 | 0.840058 | 0.782815 | 0.7737 | 0 | 0.040423 | 0.416103 | 18,642 | 730 | 102 | 25.536986 | 0.71548 | 0.064049 | 0 | 0.537349 | 0 | 0 | 0.193464 | 0 | 0 | 0 | 0 | 0.00411 | 0.180723 | 1 | 0.180723 | false | 0 | 0.004819 | 0 | 0.190361 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ab09fa9597707214317f4d54cf6b5044feba90b2 | 16,154 | py | Python | test/test_subscriptionattr.py | telstra/EventDetectionAPI-SDK-python | 94f3bb56ebd3e7bcb8818af3b1b43d910f92bd1f | [
"Apache-2.0"
] | 3 | 2018-05-23T11:12:25.000Z | 2020-05-18T00:53:50.000Z | test/test_subscriptionattr.py | telstra/EventDetectionAPI-SDK-python | 94f3bb56ebd3e7bcb8818af3b1b43d910f92bd1f | [
"Apache-2.0"
] | null | null | null | test/test_subscriptionattr.py | telstra/EventDetectionAPI-SDK-python | 94f3bb56ebd3e7bcb8818af3b1b43d910f92bd1f | [
"Apache-2.0"
] | 1 | 2018-12-10T01:35:11.000Z | 2018-12-10T01:35:11.000Z | # coding: utf-8
"""
Telstra Event Detection API
# Introduction Telstra's Event Detection API provides the ability to subscribe to and receive mobile network events for registered mobile numbers associated with Telstra's mobile network, such as: SIM swap, port-in, port-out, new MSISDN, new mobile service and cancelled mobile service, as well as carrier-detection. ## Features Event Detection API provides these features | Feature | Description | |---|---| |`SIM swap` | Returns timestamped event data when any of the following network events occurs in connection with a registered mobile number associated with Telstra’s mobile network: SIM swap, port-in, port-out, new MSISDN, new mobile service or cancelled mobile service | |`Carrier Detection` | Find out what Australian carrier a mobile number is subscribed to | |`International Roaming` | *Coming soon.* Will indicate if a mobile number is operating in Australia or outside of Australia. | ## Getting access to the API The Event Detection API is available on our Enterprise Plans only. Please submit your [sales enquiry](https://dev.telstra.com/content/sales-enquiry-contact-form), or contact your Telstra Account Executive. We're available Monday to Friday 9am - 5pm. ## Frequently asked questions **Q: What is the Telstra Event Detection (TED) API?** A: The Telstra Event Detection (TED) API is a subscription-based service from Telstra that enables a customer to be alerted when a particular network event is detected in connection with a registered mobile number that may indicate that a fraudulent misuse of an end user’s mobility service is about to occur. **Q: What are the network events that the TED API can detect?** A: Currently the TED API is able to detect a bundle of events associated with Telstra SIM swaps. **Q: Can the TED API detect number porting between operators other than Telstra? E.g. Optus to Vodafone?** A: No, we don’t report these types of events at present.
**Q: How quickly are the network events detected?** A: This will vary depending on the event being detected, but generally we detect the event within a couple of seconds of it occurring and notify subscribers within near real time via the API. **Q: How long does Telstra store the event data for?** A: Event data is stored for 90 days from the occurrence of a network event and then securely purged. **Q: Is there a limit to the number of registered mobile numbers I can have for the Telstra Event Detection API?** A: No. You may have as many Telstra Event Detection API registered mobile numbers as you require within practical limits. **Q: Why is monitoring for SIM SWAP events important?** A: Criminals are becoming much more savvy and will often try to circumvent two-factor authentication protocols by swapping the SIM card for a particular mobile number in order to gain fraudulent access to the end user’s service. Monitoring for SIM swap events may provide early detection that this is occurring and help prevent criminals from being successful in their endeavours. **Q: If an end user is currently a customer of a Telstra Reseller that still utilises the Telstra Network, am I able to detect their Network events?** A: No. Telstra resellers such as Aldi Mobile are Mobile Virtual Network Operators (MVNOs) that operate as totally independent businesses to Telstra. The Telstra SIM swap API does not monitor MVNO network events at present. **Q: How do I purchase Telstra Event Detection API?** A: At the moment, the Telstra Event Detection API is only available through your Telstra Account Manager. If you don't have a Telstra Account Manager, or are not sure who they are, please submit a [sales enquiry](https://dev.telstra.com/content/sales-enquiry-contact-form). **Q: What support options are available for the Telstra Event Detection API?** A: We provide 24/7 telephone-based technical support (for paid plans) along with email support and an online community forum.
**Q: Do you detect network events from another carrier?** A: The Telstra Event Detection API detects network events associated with the Telstra network and Telstra mobile services. **Q: Which Telstra personnel have access to the event detection data?** A: Access to Telstra Event Detection data is restricted to only Telstra personnel that require access for the purposes of providing the service. **Q: Why should I purchase the Telstra Event Detection API from Telstra?** A: As the network events are occurring on the Telstra network, Telstra is in a position to be able to provide fast notification of an event as it is occurring, helping subscribers to prevent fraudulent activity from occurring and to minimise the resulting financial losses. **Q: If I require assistance setting up my Telstra Event Detection API, are there any Professional Services options available to me?** A: At the current time, the Telstra Event Detection API does not have any Professional Service options available. **Q: What subscription options are available for Telstra Event Detection API?** A: There is a month-by-month Pay As You Go (PAYG) plan or 12 month contract option available. **Q: Do Early Termination Charges (ETCs) apply?** A: If you have subscribed to a 12 month contract and want to terminate the plan or downgrade to a lower plan before the expiry of your existing 12 month term, we may charge you ETCs. **Q: What privacy requirements apply to my use of the Telstra Event Detection API?** A: Before registering an end user's mobile number with Telstra Event Detection API, you must: 1. prepare an 'End User Notification' for our approval, which sets out what end user information will be disclosed via the API, the purposes for which that information will be disclosed, and to which third parties that information will be disclosed; 2. provide each of your end users with the End User Notification; and 3.
obtain express, informed consent from each end user to the use and disclosure of their event data via the API for the purposes set out in the notification. **Q: What terms and conditions apply to my use of the Telstra Event Detection API?** A: Before using the Telstra Event Detection API, you must agree to the TED API ['Our Customer Terms'](https://www.telstra.com.au/customer-terms/business-government#cloud-services). # Getting Started First step is to create an `App`. After you've created an `App`, follow these steps 1. Authenticate by getting an Oauth token 2. Use the Event Detection API ## Run in Postman To get started quickly and easily with all the features of the Event Detection API, download the Postman collection here <a href=\"https://app.getpostman.com/run-collection/8ab2273e066e5c6fd653#?env%5BEvent%20Detection%20API%5D=W3sidHlwZSI6InRleHQiLCJlbmFibGVkIjp0cnVlLCJrZXkiOiJjbGllbnRfaWQiLCJ2YWx1ZSI6ImNsaWVudF9pZCJ9LHsidHlwZSI6InRleHQiLCJlbmFibGVkIjp0cnVlLCJrZXkiOiJjbGllbnRfc2VjcmV0IiwidmFsdWUiOiJjbGllbnRfc2VjcmV0In0seyJ0eXBlIjoidGV4dCIsImVuYWJsZWQiOnRydWUsImtleSI6ImFjY2Vzc190b2tlbiIsInZhbHVlIjoiaTZPdmtyelVuc3hvODhrcU9BMXg4RWtPVWxuSyJ9LHsidHlwZSI6InRleHQiLCJlbmFibGVkIjp0cnVlLCJrZXkiOiJob3N0IiwidmFsdWUiOiJ0YXBpLnRlbHN0cmEuY29tIn0seyJ0eXBlIjoidGV4dCIsImVuYWJsZWQiOnRydWUsImtleSI6Im9hdXRoLWhvc3QiLCJ2YWx1ZSI6InRhcGkudGVsc3RyYS5jb20ifV0=\"><img alt=\"Run in Postman\" src=\"https://run.pstmn.io/button.svg\" /></a> ## Authentication To get an OAuth 2.0 Authentication token, pass through your Consumer Key and Consumer Secret that you received when you registered for the Event Detection API key. The `grant_type` should be left as `client_credentials` and the scope as v1_eventdetection_simswap. The token will expire in one hour. Get your keys by creating an `App`. 
# Request ` CONSUMER_KEY=\"your consumer key\" CONSUMER_SECRET=\"your consumer secret\" curl -X POST -H 'Content-Type: application/x-www-form-urlencoded' \\ -d 'grant_type=client_credentials&client_id=$CONSUMER_KEY&client_secret=$CONSUMER_SECRET&scope=v1_eventdetection_simswap' \\ 'https://tapi.telstra.com/v2/oauth/token' ` # Response `{ \"access_token\" : \"1234567890123456788901234567\", \"token_type\" : \"Bearer\", \"expires_in\" : \"3599\" }` ## Subscribe mobile numbers Subscribing end user mobile numbers informs the API to register that mobile number so that you can poll those numbers for particular events. You can subscribe and unsubscribe numbers (opt in and opt out) against this service. Only numbers that are opted in (i.e. subscribed) can be polled for events. You must have obtained your end customer’s consent before you can opt them into the Event Detection service. # Request `curl -X POST -H 'content-type: application/json' \\ -H 'Authorization: Bearer $TOKEN' \\ -d '{ \"msisdns\": [ \"61467754783\" ], \"eventType\": \"simswap\", \"notificationUrl\": \"https://requestb.in/161r14g1\" }' \\ 'https://tapi.telstra.com/v1/eventdetection/events'` | Parameter | Description | |---|---| |`msisdns` | list of mobile numbers that have to be registered for the event | |`eventType` | event type to be subscribed to | |`notificationUrl` | URL where the event notifications have to be posted (Optional) | # Response `{ \"msisdns\": [ { \"msisdn\": \"61467754783\", \"description\": \"opt-in status updated for this MSISDN\", \"carrierName\": \"Telstra\" } ] }` | Parameter | Description | |---|---| |`msisdn` | msisdn | |`description` | status description indicating if the msisdn was opted-in | |`carrierName` | carrier name for the msisdn | ## Unsubscribe mobile numbers Unsubscribe mobile numbers against a particular event # Request `curl -X DELETE -H 'content-type: application/json' \\ -H 'Authorization: Bearer $token' \\ -d '{\"msisdns\": [\"61467754783\"]}' \\
'https://tapi.telstra.com/v1/eventdetection/events/{event-type}'` | Parameter | Description | |---|---| |`msisdns` | list of mobile numbers that have to be unsubscribed from the event | |`eventType` | event type to be unsubscribed from | |`notificationUrl` | notification URL that has to be removed (Optional) | # Response ` { \"msisdns\": [ { \"msisdn\": \"61467754783\", \"description\": \"opt-out status updated for this MSISDN\", \"carrierName\": \"Telstra\" } ] } ` | Parameter | Description | |---|---| |`msisdn` | msisdn | |`description` | status description indicating if the msisdn was opted-out | |`carrierName` | carrier name for the msisdn | ## Get event subscriptions Get the list of events subscribed for # Request `curl -X POST -H 'content-type: application/json' \\ -H 'Authorization: Bearer $TOKEN' \\ -d '{ \"msisdns\": [ \"61467754783\" ] }' \\ 'https://tapi.telstra.com/v1/eventdetection/events/subscriptions'` | Parameter | Description | |---|---| |`msisdns` | list of msisdns to get the subscription details | # Response ` { \"notificationURL\": \"https://requestb.in/161r14g1\", \"subscriptions\": [ { \"msisdn\": \"61467754783\", \"events\": [ \"SIM_SWAP\" ], \"carrierName\": \"Telstra\" } ] } ` | Parameter | Description | |---|---| |`notificationURL` | notification URL configured while registering msisdns | |`msisdn` | msisdn | |`events` | list of subscribed events for that msisdn | |`carrierName` | carrier name for the msisdn | ## Poll events Poll events for a given set of msisdns # Request `curl -X POST -H 'content-type: application/json' \\ -H 'Authorization: Bearer $token' \\ -d '{ \"msisdns\": [ \"61467754783\", \"61467984007\" ] }' \\ 'https://tapi.telstra.com/v1/eventdetection/events/{event_type}'` | Parameter | Description | |---|---| |`msisdns` | list of msisdns to be polled for events | |`eventType` | event type to be polled for | # Response ` { \"eventType\": \"simswap\", \"msisdns\": [ { \"msisdn\": \"+61467754783\", \"mobileServiceEvents\": [ {
\"eventId\": \"NEW_SIM\", \"eventDate\": \"2018-01-19T14:40:34\" } ] }, { \"msisdn\": \"+61467984007\", \"mobileServiceEvents\": [ { \"eventId\": \"PORTOUT_SVC\", \"eventDate\": \"2018-02-21T15:20:01\", \"carrierName\": \"Telstra\" } ] } ] } ` | Parameter | Description | |---|---| |`eventType` | event type requested | |`msisdn` | msisdn | |`mobileServiceEvents` | list of service events | |`eventId` | ID of the event that occurred. Event ID can be any one of the following - NEW_MSISDN, PORTIN_SVC, PORTOUT_SVC, NEW_SIM, CREATE_SVC, DELETE_SVC | |`eventDate` | timestamp indicating when the event occurred | |`carrierName` | carrier name for the msisdn. Carrier name will be returned only for port out events | ## Push notifications Push event notifications are sent to the URL configured via the `notificationUrl` parameter when subscribing mobile numbers. # Event notification format ` { \"eventId\": \"NEW_SIM\", \"msisdn\" : \"61467754783\", \"eventDate\" : \"2018-01-19T14:40:34\" } ` | Parameter | Description | |---|---| |`eventId` | event ID indicating the event that occurred. Event ID can be any one of the following - NEW_MSISDN, PORTIN_SVC, PORTOUT_SVC, NEW_SIM, CREATE_SVC, DELETE_SVC | |`msisdn` | msisdn for which the event occurred | |`eventDate` | timestamp indicating when the event occurred | ## SIMswap sub-features The following is a list of the sub-features for SIM swap and the description for that sub-feature. These will appear in the 'eventId' parameter in the API response payload for SIMswap events. | SIM swap Sub-Feature | Description | |---|---| |`NEW_MSISDN` | The MSISDN of a service changes. The SIM card is not changed.
Results in two events being created: 1) CREATE_SVC/PORT_IN_SVC for the new number, and 2) a NEW_MSISDN for the old MSISDN | |`PORTIN_SVC` | A MSISDN registered for event detection is created as a mobile service on the Telstra network (note: if the MSISDN was not already registered by at least one customer for at least one event type, this event would be interpreted as a CREATE_SVC) | |`PORTOUT_SVC` | The MSISDN is ported out from Telstra to another domestic operator | |`NEW_SIM` | An existing Telstra MSISDN is moved onto a new SIM | |`CREATE_SVC` | A new mobile service is created on the Telstra network (a new SIM and a new MSISDN) | |`DELETE_SVC` | A mobile service (MSISDN and SIM) on the Telstra network is cancelled outright (as opposed to ported out to another domestic network) | ## SDK repos * [Event Detection API - Java SDK](https://github.com/telstra/EventDetectionAPI-SDK-java) * [Event Detection API - .Net2 SDK](https://github.com/telstra/EventDetectionAPI-SDK-dotnet) * [Event Detection API - NodeJS SDK](https://github.com/telstra/EventDetectionAPI-SDK-node) * [Event Detection API - PHP SDK](https://github.com/telstra/EventDetectionAPI-SDK-php) * [Event Detection API - Python SDK](https://github.com/telstra/EventDetectionAPI-SDK-python) * [Event Detection API - Ruby SDK](https://github.com/telstra/EventDetectionAPI-SDK-ruby) # noqa: E501
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import unittest
import Telstra_EventDetection
from Telstra_EventDetection.models.subscriptionattr import Subscriptionattr # noqa: E501
from Telstra_EventDetection.rest import ApiException
class TestSubscriptionattr(unittest.TestCase):
"""Subscriptionattr unit test stubs"""
def setUp(self):
pass
def tearDown(self):
pass
def testSubscriptionattr(self):
"""Test Subscriptionattr"""
# FIXME: construct object with mandatory attributes with example values
# model = Telstra_EventDetection.models.subscriptionattr.Subscriptionattr() # noqa: E501
pass
if __name__ == '__main__':
unittest.main()
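The token and subscription requests described in the module docstring above can be assembled with the standard library alone. This is a hedged sketch of the documented curl examples, not part of the generated SDK: `oauth_body` and `subscribe_payload` are hypothetical helper names, only the request bodies are built, and no network call is made.

```python
import json
from urllib.parse import urlencode

# Endpoints quoted from the curl examples in the module docstring.
OAUTH_URL = 'https://tapi.telstra.com/v2/oauth/token'
EVENTS_URL = 'https://tapi.telstra.com/v1/eventdetection/events'


def oauth_body(consumer_key, consumer_secret):
    """Form-encoded body for the OAuth 2.0 client-credentials request."""
    return urlencode({
        'grant_type': 'client_credentials',
        'client_id': consumer_key,
        'client_secret': consumer_secret,
        'scope': 'v1_eventdetection_simswap',
    })


def subscribe_payload(msisdns, event_type='simswap', notification_url=None):
    """JSON body for subscribing mobile numbers to an event type."""
    payload = {'msisdns': list(msisdns), 'eventType': event_type}
    if notification_url is not None:
        payload['notificationUrl'] = notification_url  # optional per the docs
    return json.dumps(payload)
```

The returned strings would be POSTed with `Content-Type: application/x-www-form-urlencoded` and `application/json` respectively, as shown in the docstring's curl examples.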
| 394 | 15,278 | 0.722731 | 2,163 | 16,154 | 5.377254 | 0.246417 | 0.039721 | 0.039464 | 0.030952 | 0.245293 | 0.209956 | 0.183647 | 0.11177 | 0.11177 | 0.11177 | 0 | 0.024087 | 0.182741 | 16,154 | 40 | 15,279 | 403.85 | 0.855552 | 0.958153 | 0 | 0.214286 | 0 | 0 | 0.016807 | 0 | 0 | 0 | 0 | 0.025 | 0 | 1 | 0.214286 | false | 0.214286 | 0.357143 | 0 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
ab509a20bf19cb075e724c4f936b462ce2dc84e4 | 17,665 | py | Python | tests/test_nearest.py | harisankarh/mabwiser | 0c860253be017d1f393e18bf9d9d7e1739f93dca | [
"Apache-2.0"
] | 60 | 2020-06-10T11:20:52.000Z | 2022-03-25T02:16:47.000Z | tests/test_nearest.py | harisankarh/mabwiser | 0c860253be017d1f393e18bf9d9d7e1739f93dca | [
"Apache-2.0"
] | 24 | 2020-06-04T18:40:21.000Z | 2022-03-24T16:49:51.000Z | tests/test_nearest.py | harisankarh/mabwiser | 0c860253be017d1f393e18bf9d9d7e1739f93dca | [
"Apache-2.0"
] | 12 | 2020-11-30T10:37:05.000Z | 2022-03-25T02:16:41.000Z | # -*- coding: utf-8 -*-
import numpy as np
from mabwiser.mab import LearningPolicy, NeighborhoodPolicy
from tests.test_base import BaseTest
class NearestTest(BaseTest):
def test_greedy0_k2(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=True)
self.assertListEqual(arms, [1, 1])
def test_greedy0_k2_single_test(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5]],
seed=123456,
num_run=1,
is_predict=True)
self.assertEqual(arms, 1)
def test_greedy0_k2_single_list(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5]],
seed=123456,
num_run=1,
is_predict=True)
self.assertEqual(arms, 1)
def test_greedy0_k2_exps(self):
exps, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=False)
self.assertDictEqual(exps[0], {1: 0.0, 2: 0.0, 3: 0, 4: 0})
self.assertDictEqual(exps[1], {1: 1.0, 2: 0.0, 3: 0, 4: 0})
def test_greedy0_k5(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 0, 0, 1, 1, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
neighborhood_policy=NeighborhoodPolicy.KNearest(5),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=True)
self.assertListEqual(arms, [2, 2])
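The pattern these tests exercise — greedy arm selection restricted to the k historical observations whose contexts are nearest to the query context — can be sketched with the standard library. This is a simplified illustration of the idea, not mabwiser's implementation; `knearest_greedy` and the toy data in the usage below are hypothetical, and ties and exploration (epsilon > 0) are not modeled.

```python
import math
from collections import defaultdict


def knearest_greedy(arms, decisions, rewards, context_history, context, k=2):
    """Greedy (epsilon=0) KNearest sketch: rank history by Euclidean
    distance to `context`, keep the k nearest observations, and return
    the arm with the highest mean reward among them. Arms unseen in the
    neighborhood default to an expectation of 0."""
    order = sorted(range(len(context_history)),
                   key=lambda i: math.dist(context, context_history[i]))
    neighbors = order[:k]
    by_arm = defaultdict(list)
    for i in neighbors:
        by_arm[decisions[i]].append(rewards[i])
    expectations = {a: (sum(by_arm[a]) / len(by_arm[a])) if by_arm[a] else 0.0
                    for a in arms}
    return max(expectations, key=expectations.get)
```

For instance, with history `[[0, 0], [0, 1], [5, 5], [5, 6]]`, decisions `[1, 1, 2, 2]` and rewards `[0, 0, 1, 1]`, a query context of `[5, 5]` selects arm 2, since its two nearest neighbors both played arm 2 with reward 1.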
def test_greedy1_k2(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.EpsilonGreedy(epsilon=1.0),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=True)
self.assertListEqual(arms, [4, 1])
def test_thompson_k2(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.ThompsonSampling(),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=True)
self.assertListEqual(arms, [4, 4])
def test_ucb_k2(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.UCB1(alpha=1),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=True)
self.assertListEqual(arms, [1, 1])
def test_softmax_k2(self):
arms, mab = self.predict(arms=[1, 2, 3, 4],
decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
learning_policy=LearningPolicy.Softmax(tau=1),
neighborhood_policy=NeighborhoodPolicy.KNearest(2),
context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
[0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
[0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
[0, 2, 1, 0, 0]],
contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
seed=123456,
num_run=1,
is_predict=True)
self.assertListEqual(arms, [3, 2])

    def test_max_k(self):

        arms, mab = self.predict(arms=[1, 2, 3, 4],
                                 decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
                                 rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
                                 learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
                                 neighborhood_policy=NeighborhoodPolicy.KNearest(10),
                                 context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                                                  [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                                                  [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                                                  [0, 2, 1, 0, 0]],
                                 contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
                                 seed=123456,
                                 num_run=1,
                                 is_predict=True)

        self.assertListEqual(arms, [1, 1])

    def test_partial_fit_greedy0_r2(self):

        arms, mab = self.predict(arms=[1, 2, 3, 4],
                                 decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
                                 rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
                                 learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
                                 neighborhood_policy=NeighborhoodPolicy.KNearest(2),
                                 context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                                                  [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                                                  [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                                                  [0, 2, 1, 0, 0]],
                                 contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
                                 seed=123456,
                                 num_run=1,
                                 is_predict=True)

        self.assertListEqual(arms, [1, 1])

        self.assertEqual(len(mab._imp.decisions), 10)
        self.assertEqual(len(mab._imp.rewards), 10)
        self.assertEqual(len(mab._imp.contexts), 10)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)

        decisions2 = [1, 2, 3]
        rewards2 = [1, 1, 1]
        context_history2 = [[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0]]
        mab.partial_fit(decisions2, rewards2, context_history2)

        self.assertEqual(len(mab._imp.decisions), 13)
        self.assertEqual(len(mab._imp.rewards), 13)
        self.assertEqual(len(mab._imp.contexts), 13)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)

    def test_partial_fit_thompson_thresholds(self):

        arm_to_threshold = {1: 1, 2: 5, 3: 2, 4: 3}

        def binarize(arm, reward):
            return reward >= arm_to_threshold[arm]

        arms, mab = self.predict(arms=[1, 2, 3, 4],
                                 decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
                                 rewards=[0, 1, 7, 0, 1, 9, 0, 2, 6, 11],
                                 learning_policy=LearningPolicy.ThompsonSampling(binarize),
                                 neighborhood_policy=NeighborhoodPolicy.KNearest(2),
                                 context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                                                  [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                                                  [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                                                  [0, 2, 1, 0, 0]],
                                 contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
                                 seed=123456,
                                 num_run=1,
                                 is_predict=True)

        self.assertTrue(mab._imp.lp.is_contextual_binarized)
        self.assertListEqual(arms, [4, 4])
        self.assertEqual(len(mab._imp.decisions), 10)
        self.assertEqual(len(mab._imp.rewards), 10)
        self.assertEqual(len(mab._imp.contexts), 10)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertListEqual(list(set(mab._imp.rewards)), [0, 1])

        decisions2 = [1, 2, 3]
        rewards2 = [11, 1, 6]
        context_history2 = [[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0]]
        mab.partial_fit(decisions2, rewards2, context_history2)

        self.assertEqual(len(mab._imp.decisions), 13)
        self.assertEqual(len(mab._imp.rewards), 13)
        self.assertEqual(len(mab._imp.contexts), 13)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)

        arm = mab.predict([[0, 1, 2, 3, 5]])
        self.assertEqual(arm, 3)
        self.assertListEqual(list(set(mab._imp.rewards)), [0, 1])

    def test_fit_twice_thompson_thresholds(self):

        arm_to_threshold = {1: 1, 2: 5, 3: 2, 4: 3}

        def binarize(arm, reward):
            return reward >= arm_to_threshold[arm]

        arms, mab = self.predict(arms=[1, 2, 3, 4],
                                 decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
                                 rewards=[0, 1, 7, 0, 1, 9, 0, 2, 6, 11],
                                 learning_policy=LearningPolicy.ThompsonSampling(binarize),
                                 neighborhood_policy=NeighborhoodPolicy.KNearest(2),
                                 context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                                                  [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                                                  [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                                                  [0, 2, 1, 0, 0]],
                                 contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
                                 seed=123456,
                                 num_run=1,
                                 is_predict=True)

        self.assertTrue(mab._imp.lp.is_contextual_binarized)
        self.assertListEqual(arms, [4, 4])
        self.assertEqual(len(mab._imp.decisions), 10)
        self.assertEqual(len(mab._imp.rewards), 10)
        self.assertEqual(len(mab._imp.contexts), 10)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertListEqual(list(set(mab._imp.rewards)), [0, 1])

        decisions2 = [1, 2, 3]
        rewards2 = [11, 1, 6]
        context_history2 = [[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0]]
        mab.fit(decisions2, rewards2, context_history2)

        self.assertEqual(len(mab._imp.decisions), 3)
        self.assertEqual(len(mab._imp.rewards), 3)
        self.assertEqual(len(mab._imp.contexts), 3)
        self.assertEqual(np.ndim(mab._imp.decisions), 1)
        self.assertListEqual(list(set(mab._imp.rewards)), [0, 1])

    def test_add_arm(self):

        arms, mab = self.predict(arms=[1, 2, 3, 4],
                                 decisions=[1, 1, 1, 2, 2, 3, 3, 3, 3, 3],
                                 rewards=[0, 1, 1, 0, 0, 0, 0, 1, 1, 1],
                                 learning_policy=LearningPolicy.EpsilonGreedy(epsilon=0),
                                 neighborhood_policy=NeighborhoodPolicy.KNearest(2),
                                 context_history=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1], [0, 0, 1, 0, 0],
                                                  [0, 2, 2, 3, 5], [1, 3, 1, 1, 1], [0, 0, 0, 0, 0],
                                                  [0, 1, 4, 3, 5], [0, 1, 2, 4, 5], [1, 2, 1, 1, 3],
                                                  [0, 2, 1, 0, 0]],
                                 contexts=[[0, 1, 2, 3, 5], [1, 1, 1, 1, 1]],
                                 seed=123456,
                                 num_run=1,
                                 is_predict=True)

        mab.add_arm(5)
        self.assertTrue(5 in mab.arms)
        self.assertTrue(5 in mab._imp.arms)
        self.assertTrue(5 in mab._imp.lp.arms)
        self.assertTrue(5 in mab._imp.lp.arm_to_expectation.keys())
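Each of the tests above exercises the same neighborhood idea: restrict the learning policy to the k historical contexts closest to the query context. Below is a minimal sketch of that prediction step, assuming Euclidean distance and greedy (epsilon = 0) arm selection; MABWiser's actual `KNearest` implementation may break distance ties and handle empty neighborhoods differently, so this is only an illustration of the concept.

```python
import numpy as np


def knearest_greedy_predict(decisions, rewards, context_history, context, k=2):
    """Greedy arm choice over the k nearest historical contexts (sketch)."""
    history = np.asarray(context_history, dtype=float)
    query = np.asarray(context, dtype=float)
    # Euclidean distance from the query context to every historical context.
    dists = np.linalg.norm(history - query, axis=1)
    # Stable sort so that distance ties resolve deterministically.
    nearest = np.argsort(dists, kind="stable")[:k]
    decisions = np.asarray(decisions)[nearest]
    rewards = np.asarray(rewards)[nearest]
    # Mean reward per arm inside the neighborhood; pick the best arm.
    means = {arm: rewards[decisions == arm].mean() for arm in np.unique(decisions)}
    return max(means, key=means.get)
```

With the history used throughout these tests and the query context `[0, 1, 2, 3, 5]`, both of the two nearest contexts carry reward 0, so the sketch falls back to the lowest-numbered arm in the neighborhood, which is consistent with the expected arm 1 in `test_greedy0_k2_single_list`.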

# File: bin/azure/mgmt/consumption/operations/budgets_operations.py
# Repo: zdmc23/bash-lambda-layer (license: MIT)

# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from .. import models
class BudgetsOperations(object):
"""BudgetsOperations operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An objec model deserializer.
:ivar api_version: Version of the API to be used with the client request. The current version is 2018-01-31. Constant value: "2018-01-31".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2018-01-31"
self.config = config

    def list(
            self, custom_headers=None, raw=False, **operation_config):
        """Lists all budgets for a subscription.

        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of Budget
        :rtype:
         ~azure.mgmt.consumption.models.BudgetPaged[~azure.mgmt.consumption.models.Budget]
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        def internal_paging(next_link=None, raw=False):

            if not next_link:
                # Construct URL
                url = '/subscriptions/{subscriptionId}/providers/Microsoft.Consumption/budgets'
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Content-Type'] = 'application/json; charset=utf-8'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters)
            response = self._client.send(
                request, header_parameters, stream=False, **operation_config)

            if response.status_code not in [200]:
                raise models.ErrorResponseException(self._deserialize, response)

            return response

        # Deserialize response
        deserialized = models.BudgetPaged(internal_paging, self._deserialize.dependencies)

        if raw:
            header_dict = {}
            client_raw_response = models.BudgetPaged(internal_paging, self._deserialize.dependencies, header_dict)
            return client_raw_response

        return deserialized

    def list_by_resource_group_name(
            self, resource_group_name, custom_headers=None, raw=False, **operation_config):
        """Lists all budgets for a resource group under a subscription.

        :param resource_group_name: Azure Resource Group Name.
        :type resource_group_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of Budget
        :rtype:
         ~azure.mgmt.consumption.models.BudgetPaged[~azure.mgmt.consumption.models.Budget]
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        def internal_paging(next_link=None, raw=False):

            if not next_link:
                # Construct URL
                url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Consumption/budgets'
                path_format_arguments = {
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Content-Type'] = 'application/json; charset=utf-8'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters)
            response = self._client.send(
                request, header_parameters, stream=False, **operation_config)

            if response.status_code not in [200]:
                raise models.ErrorResponseException(self._deserialize, response)

            return response

        # Deserialize response
        deserialized = models.BudgetPaged(internal_paging, self._deserialize.dependencies)

        if raw:
            header_dict = {}
            client_raw_response = models.BudgetPaged(internal_paging, self._deserialize.dependencies, header_dict)
            return client_raw_response

        return deserialized

    def get(
            self, budget_name, custom_headers=None, raw=False, **operation_config):
        """Gets the budget for a subscription by budget name.

        :param budget_name: Budget Name.
        :type budget_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Budget or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.consumption.models.Budget or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/providers/Microsoft.Consumption/budgets/{budgetName}'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'budgetName': self._serialize.url("budget_name", budget_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Budget', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_or_update(
            self, budget_name, parameters, custom_headers=None, raw=False, **operation_config):
        """The operation to create or update a budget. Update operation requires
        latest eTag to be set in the request mandatorily. You may obtain the
        latest eTag by performing a get operation. Create operation does not
        require eTag.

        :param budget_name: Budget Name.
        :type budget_name: str
        :param parameters: Parameters supplied to the Create Budget operation.
        :type parameters: ~azure.mgmt.consumption.models.Budget
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Budget or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.consumption.models.Budget or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/providers/Microsoft.Consumption/budgets/{budgetName}'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'budgetName': self._serialize.url("budget_name", budget_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(parameters, 'Budget')

        # Construct and send request
        request = self._client.put(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200, 201]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Budget', response)
        if response.status_code == 201:
            deserialized = self._deserialize('Budget', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def delete(
            self, budget_name, custom_headers=None, raw=False, **operation_config):
        """The operation to delete a budget.

        :param budget_name: Budget Name.
        :type budget_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/providers/Microsoft.Consumption/budgets/{budgetName}'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'budgetName': self._serialize.url("budget_name", budget_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response

    def get_by_resource_group_name(
            self, resource_group_name, budget_name, custom_headers=None, raw=False, **operation_config):
        """Gets the budget for a resource group under a subscription by budget
        name.

        :param resource_group_name: Azure Resource Group Name.
        :type resource_group_name: str
        :param budget_name: Budget Name.
        :type budget_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Budget or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.consumption.models.Budget or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Consumption/budgets/{budgetName}'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'budgetName': self._serialize.url("budget_name", budget_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Budget', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def create_or_update_by_resource_group_name(
            self, resource_group_name, budget_name, parameters, custom_headers=None, raw=False, **operation_config):
        """The operation to create or update a budget. Update operation requires
        latest eTag to be set in the request mandatorily. You may obtain the
        latest eTag by performing a get operation. Create operation does not
        require eTag.

        :param resource_group_name: Azure Resource Group Name.
        :type resource_group_name: str
        :param budget_name: Budget Name.
        :type budget_name: str
        :param parameters: Parameters supplied to the Create Budget operation.
        :type parameters: ~azure.mgmt.consumption.models.Budget
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: Budget or ClientRawResponse if raw=true
        :rtype: ~azure.mgmt.consumption.models.Budget or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Consumption/budgets/{budgetName}'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'budgetName': self._serialize.url("budget_name", budget_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(parameters, 'Budget')

        # Construct and send request
        request = self._client.put(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200, 201]:
            raise models.ErrorResponseException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('Budget', response)
        if response.status_code == 201:
            deserialized = self._deserialize('Budget', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def delete_by_resource_group_name(
            self, resource_group_name, budget_name, custom_headers=None, raw=False, **operation_config):
        """The operation to delete a budget.

        :param resource_group_name: Azure Resource Group Name.
        :type resource_group_name: str
        :param budget_name: Budget Name.
        :type budget_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: None or ClientRawResponse if raw=true
        :rtype: None or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorResponseException<azure.mgmt.consumption.models.ErrorResponseException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Consumption/budgets/{budgetName}'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'budgetName': self._serialize.url("budget_name", budget_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.delete(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorResponseException(self._deserialize, response)

        if raw:
            client_raw_response = ClientRawResponse(None, response)
            return client_raw_response
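Every operation in this class builds its request URL from the same template-plus-arguments pattern before handing it to `self._client.format_url`. A minimal sketch of the substitution involved, using plain `str.format` and a made-up subscription id; the real msrest serializer also validates and URL-quotes each value, which this sketch skips.

```python
# Hypothetical stand-in for the client's format_url: simple placeholder substitution.
URL_TEMPLATE = ('/subscriptions/{subscriptionId}/providers/'
                'Microsoft.Consumption/budgets/{budgetName}')


def format_url(template, **path_format_arguments):
    """Fill the {placeholder} segments of a request URL template."""
    return template.format(**path_format_arguments)


url = format_url(URL_TEMPLATE,
                 subscriptionId='00000000-0000-0000-0000-000000000000',
                 budgetName='monthly-budget')
```

The resulting path is what the generated methods pass to `self._client.get`, `put`, or `delete` together with the `api-version` query parameter.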

# File: tests/reader/test_singlepagetiff.py
# Repo: mehta-lab/waveorder (license: Unlicense)

import numpy as np
from waveorder.io.singlepagetiff import MicromanagerSequenceReader
def test_constructor_mm2gamma(setup_mm2gamma_singlepage_tiffs):
"""
test that constructor parses metadata properly
no data extraction in this test
"""
# choose a specific folder
_, one_folder, _ = setup_mm2gamma_singlepage_tiffs
mmr = MicromanagerSequenceReader(one_folder, extract_data=False)
assert(mmr.mm_meta is not None)
assert(mmr.width is not 0)
assert(mmr.height is not 0)
assert(mmr.frames is not 0)
assert(mmr.slices is not 0)
assert(mmr.channels is not 0)
def test_output_dims_mm2gamma(setup_mm2gamma_singlepage_tiffs):
"""
test that output dimensions are always (t, c, z, y, x)
"""
# choose a random folder
_, _, rand_folder = setup_mm2gamma_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=False)
assert(mmr.get_zarr(0).shape[0] == mmr.frames)
assert(mmr.get_zarr(0).shape[1] == mmr.channels)
assert(mmr.get_zarr(0).shape[2] == mmr.slices)
assert(mmr.get_zarr(0).shape[3] == mmr.height)
assert(mmr.get_zarr(0).shape[4] == mmr.width)
def test_output_dims_mm2gamma_incomplete(setup_mm2gamma_singlepage_tiffs_incomplete):
"""
test that output dimensions are correct for interrupted data
"""
# choose a random folder
folder = setup_mm2gamma_singlepage_tiffs_incomplete
mmr = MicromanagerSequenceReader(folder, extract_data=True)
assert(mmr.get_zarr(0).shape[0] == mmr.frames)
assert(mmr.get_zarr(0).shape[1] == mmr.channels)
assert(mmr.get_zarr(0).shape[2] == mmr.slices)
assert(mmr.get_zarr(0).shape[3] == mmr.height)
assert(mmr.get_zarr(0).shape[4] == mmr.width)
assert(mmr.get_zarr(0).shape[0] == 11)
def test_get_zarr_mm2gamma(setup_mm2gamma_singlepage_tiffs):
_, _, rand_folder = setup_mm2gamma_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=True)
for i in range(mmr.get_num_positions()):
z = mmr.get_zarr(i)
assert(z.shape == mmr.shape)
assert(isinstance(z, zarr.core.Array))
def test_get_array_mm2gamma(setup_mm2gamma_singlepage_tiffs):
_, _, rand_folder = setup_mm2gamma_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=True)
for i in range(mmr.get_num_positions()):
z = mmr.get_array(i)
assert(z.shape == mmr.shape)
assert(isinstance(z, np.ndarray))
def test_get_num_positions_mm2gamma(setup_mm2gamma_singlepage_tiffs):
_, _, rand_folder = setup_mm2gamma_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=True)
assert(mmr.get_num_positions() >= 1)
# repeat of above but using mm1.4.22 data
def test_constructor_mm1422(setup_mm1422_singlepage_tiffs):
"""
test that constructor parses metadata properly
no data extraction in this test
"""
# choose a specific folder
_, one_folder, _ = setup_mm1422_singlepage_tiffs
mmr = MicromanagerSequenceReader(one_folder, extract_data=False)
assert(mmr.mm_meta is not None)
assert(mmr.width != 0)
assert(mmr.height != 0)
assert(mmr.frames != 0)
assert(mmr.slices != 0)
assert(mmr.channels != 0)
def test_output_dims_mm1422(setup_mm1422_singlepage_tiffs):
"""
test that output dimensions are always (t, c, z, y, x)
"""
# choose a random folder
_, _, rand_folder = setup_mm1422_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=False)
assert(mmr.get_zarr(0).shape[0] == mmr.frames)
assert(mmr.get_zarr(0).shape[1] == mmr.channels)
assert(mmr.get_zarr(0).shape[2] == mmr.slices)
assert(mmr.get_zarr(0).shape[3] == mmr.height)
assert(mmr.get_zarr(0).shape[4] == mmr.width)
def test_get_zarr_mm1422(setup_mm1422_singlepage_tiffs):
_, _, rand_folder = setup_mm1422_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=True)
for i in range(mmr.get_num_positions()):
z = mmr.get_zarr(i)
assert(z.shape == mmr.shape)
assert(isinstance(z, zarr.core.Array))
def test_get_array_mm1422(setup_mm1422_singlepage_tiffs):
_, _, rand_folder = setup_mm1422_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=True)
for i in range(mmr.get_num_positions()):
z = mmr.get_array(i)
assert(z.shape == mmr.shape)
assert(isinstance(z, np.ndarray))
def test_get_num_positions_mm1422(setup_mm1422_singlepage_tiffs):
_, _, rand_folder = setup_mm1422_singlepage_tiffs
mmr = MicromanagerSequenceReader(rand_folder, extract_data=True)
assert(mmr.get_num_positions() >= 1)
# uncertain whether the bottom tests are useful
# def test_read_tiff_series_mm2gamma(setup_mm2gamma_singlepage_tiffs):
# pass
#
#
# def test_extract_coord_mm2gamma(setup_mm2gamma_singlepage_tiffs):
# pass
#
#
# def test_shape_mm2gamma(setup_mm2gamma_singlepage_tiffs):
# pass
# tests/unit/test_fp16.py (ConnollyLeon/DeepSpeed, MIT license)
import torch
import deepspeed
import argparse
import pytest
import json
import os
from deepspeed.ops.adam import FusedAdam
from common import distributed_test
from simple_model import SimpleModel, SimpleOptimizer, random_dataloader, args_from_dict, create_deepspeed_args
from deepspeed.ops.op_builder import CPUAdamBuilder
try:
    from apex import amp
    _amp_available = True
except ImportError:
    _amp_available = False
amp_available = pytest.mark.skipif(not _amp_available, reason="apex/amp is not installed")
def test_lamb_fp32_grad_clip(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Lamb",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1, 2])
def _test_lamb_fp32_grad_clip(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device,
dtype=torch.float)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_lamb_fp32_grad_clip(args=args, model=model, hidden_dim=hidden_dim)
def test_lamb_fp16_basic(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Lamb",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"fp16": {
"enabled": True
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1, 2])
def _test_lamb_fp16_basic(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_lamb_fp16_basic(args=args, model=model, hidden_dim=hidden_dim)
def test_lamb_fp16_empty_grad(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Lamb",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"fp16": {
"enabled": True
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim, empty_grad=True)
@distributed_test(world_size=[2])
def _test_lamb_fp16_empty_grad(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_lamb_fp16_empty_grad(args=args, model=model, hidden_dim=hidden_dim)
def test_adam_fp32_empty_grad(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"fp16": {
"enabled": False
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim, empty_grad=True)
@distributed_test(world_size=[2])
def _test_adam_fp32_empty_grad(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device,
dtype=torch.float)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adam_fp32_empty_grad(args=args, model=model, hidden_dim=hidden_dim)
def test_adamw_fp16_basic(tmpdir):
config_dict = {
"train_batch_size": 1,
"steps_per_print": 1,
"fp16": {
"enabled": True
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1])
def _test_adamw_fp16_basic(args, model, hidden_dim):
optimizer = torch.optim.AdamW(params=model.parameters())
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
optimizer=optimizer)
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adamw_fp16_basic(args=args, model=model, hidden_dim=hidden_dim)
def test_dict_config_adamw_fp16_basic():
config_dict = {
"train_batch_size": 1,
"steps_per_print": 1,
"fp16": {
"enabled": True
}
}
args = create_deepspeed_args()
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1])
def _test_adamw_fp16_basic(args, model, hidden_dim, config_dict):
optimizer = torch.optim.AdamW(params=model.parameters())
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
optimizer=optimizer,
config_params=config_dict)
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adamw_fp16_basic(args=args,
model=model,
hidden_dim=hidden_dim,
config_dict=config_dict)
def test_adamw_fp16_empty_grad(tmpdir):
config_dict = {
"train_batch_size": 1,
"steps_per_print": 1,
"fp16": {
"enabled": True
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim, empty_grad=True)
@distributed_test(world_size=[1])
def _test_adamw_fp16_empty_grad(args, model, hidden_dim):
optimizer = torch.optim.AdamW(params=model.parameters())
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
optimizer=optimizer)
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adamw_fp16_empty_grad(args=args, model=model, hidden_dim=hidden_dim)
@pytest.mark.parametrize('zero_stage, use_cpu_offload',
[(1,
False),
(2,
False),
(2,
True),
(3,
False),
(3,
True)])
def test_adam_fp16_zero_onecycle_compatibility(tmpdir, zero_stage, use_cpu_offload):
if use_cpu_offload and not deepspeed.ops.__compatible_ops__[CPUAdamBuilder.NAME]:
pytest.skip("cpu-adam is not compatible")
config_dict = {
"train_batch_size": 1,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"scheduler": {
"type": "OneCycle",
"params": {
"cycle_first_step_size": 16000,
"cycle_first_stair_count": 8000,
"decay_step_size": 16000,
"cycle_min_lr": 1e-06,
"cycle_max_lr": 3e-05,
"decay_lr_rate": 1e-07,
"cycle_min_mom": 0.85,
"cycle_max_mom": 0.99,
"decay_mom_rate": 0.0
}
},
"fp16": {
"enabled": True
},
"zero_optimization": {
"stage": zero_stage,
"cpu_offload": use_cpu_offload
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
@distributed_test(world_size=[1])
def _test_adam_fp16_zero_onecycle_compatibility(args, zero_stage, hidden_dim):
model = SimpleModel(hidden_dim)
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adam_fp16_zero_onecycle_compatibility(args=args,
zero_stage=zero_stage,
hidden_dim=hidden_dim)
@pytest.mark.parametrize('zero_stage, use_cpu_offload',
[(1,
False),
(2,
False),
(2,
True),
(3,
False),
(3,
True)])
def test_zero_static_scale(tmpdir, zero_stage, use_cpu_offload):
if use_cpu_offload and not deepspeed.ops.__compatible_ops__[CPUAdamBuilder.NAME]:
pytest.skip("cpu-adam is not compatible")
config_dict = {
"train_batch_size": 4,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"fp16": {
"enabled": True,
"loss_scale": 138.
},
"zero_optimization": {
"stage": zero_stage,
"cpu_offload": use_cpu_offload
}
}
args = args_from_dict(tmpdir, config_dict)
@distributed_test(world_size=2)
def _test_zero_static_scale(args, zero_stage, hidden_dim):
# hidden size may not be divisible by the DP world size; this covers that scenario
model = SimpleModel(hidden_dim)
model, optim, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
# Ensure the static scaler is configured.
assert optim.dynamic_loss_scale == False
assert optim.loss_scaler.loss_scale == 138.
# Now make sure things work.
data_loader = random_dataloader(model=model,
total_samples=10,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
# test when hidden_dim is not aligned with world size
_test_zero_static_scale(args=args, zero_stage=zero_stage, hidden_dim=9)
# test when hidden_dim is aligned with world size
_test_zero_static_scale(args=args, zero_stage=zero_stage, hidden_dim=10)
def test_zero_static_scale_deprecated_format(tmpdir):
config_dict = {
"train_batch_size": 4,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"fp16": {
"enabled": True,
"loss_scale": 138.
},
"zero_optimization": {
"stage": 1
}
}
args = args_from_dict(tmpdir, config_dict)
@distributed_test(world_size=2)
def _test_zero_static_scale(args):
hidden_dim = 10
model = SimpleModel(hidden_dim)
model, optim, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
# Ensure the static scaler is configured.
assert optim.dynamic_loss_scale == False
assert optim.loss_scaler.loss_scale == 138.
# Now make sure things work.
data_loader = random_dataloader(model=model,
total_samples=10,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_zero_static_scale(args)
@pytest.mark.parametrize('zero_stage, use_cpu_offload',
[(1,
False),
(2,
False),
(2,
True),
(3,
False),
(3,
True)])
def test_zero_allow_untested_optimizer(tmpdir, zero_stage, use_cpu_offload):
if use_cpu_offload and not deepspeed.ops.__compatible_ops__[CPUAdamBuilder.NAME]:
pytest.skip("cpu-adam is not compatible")
config_dict = {
"train_batch_size": 4,
"steps_per_print": 1,
"fp16": {
"enabled": True,
},
"zero_optimization": {
"stage": zero_stage,
"cpu_offload": use_cpu_offload
},
"zero_allow_untested_optimizer": False
}
args = args_from_dict(tmpdir, config_dict)
@distributed_test(world_size=[1])
def _test_zero_allow_untested_optimizer(args, zero_stage):
hidden_dim = 10
model = SimpleModel(hidden_dim)
optimizer = SimpleOptimizer(model.parameters())
with pytest.raises(AssertionError):
model, optim, _, _ = deepspeed.initialize(args=args,
model=model,
optimizer=optimizer,
model_parameters=model.parameters())
_test_zero_allow_untested_optimizer(args, zero_stage)
@pytest.mark.parametrize('zero_stage, use_cpu_offload',
[(1,
False),
(2,
False),
(2,
True),
(3,
False),
(3,
True)])
def test_zero_empty_partition(tmpdir, zero_stage, use_cpu_offload):
if use_cpu_offload and not deepspeed.ops.__compatible_ops__[CPUAdamBuilder.NAME]:
pytest.skip("cpu-adam is not compatible")
if zero_stage == 3:
pytest.skip("skip for now")
config_dict = {
"train_micro_batch_size_per_gpu": 1,
"gradient_accumulation_steps": 1,
"fp16": {
"enabled": True,
"initial_scale_power": 8
},
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"zero_optimization": {
"stage": zero_stage,
"cpu_offload": use_cpu_offload,
"reduce_bucket_size": 100,
"allgather_bucket_size": 100
}
}
args = args_from_dict(tmpdir, config_dict)
@distributed_test(world_size=[3])
def _test_zero_empty_partition(args, zero_stage):
hidden_dim = 1
model = SimpleModel(hidden_dim)
# Ensure model has 2 parameters, to cause empty partition with DP=3
assert len(list(model.parameters())) == 2
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
# Now make sure things work.
data_loader = random_dataloader(model=model,
total_samples=1,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_zero_empty_partition(args=args, zero_stage=zero_stage)
@amp_available
def test_adam_amp_basic(tmpdir):
config_dict = {"train_batch_size": 1, "steps_per_print": 1, "amp": {"enabled": True}}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1])
def _test_adam_amp_basic(args, model, hidden_dim):
optimizer = torch.optim.Adam(params=model.parameters())
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
optimizer=optimizer)
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adam_amp_basic(args=args, model=model, hidden_dim=hidden_dim)
@amp_available
def test_lamb_amp_basic(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Lamb",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"amp": {
"enabled": True,
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1, 2])
def _test_lamb_amp_basic(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_lamb_amp_basic(args=args, model=model, hidden_dim=hidden_dim)
@amp_available
def test_adam_amp_o2(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"amp": {
"enabled": True,
"opt_level": "O2"
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1, 2])
def _test_adam_amp_o2(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adam_amp_o2(args=args, model=model, hidden_dim=hidden_dim)
@amp_available
def test_adam_amp_o2_empty_grad(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"amp": {
"enabled": True,
"opt_level": "O2"
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[2])
def _test_adam_amp_o2_empty_grad(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_adam_amp_o2_empty_grad(args=args, model=model, hidden_dim=hidden_dim)
@pytest.mark.parametrize('zero_stage, optimizer_constructor',
[(1,
FusedAdam),
(2,
torch.optim.Adam),
(2,
FusedAdam),
(3,
torch.optim.Adam),
(3,
FusedAdam)])
def test_zero_supported_client_optimizer(tmpdir, zero_stage, optimizer_constructor):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"fp16": {
"enabled": True
},
"zero_optimization": {
"stage": zero_stage
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
@distributed_test(world_size=[1])
def _test_zero_supported_client_optimizer(args, zero_stage, optimizer_constructor):
model = SimpleModel(hidden_dim)
client_optimizer = optimizer_constructor(params=model.parameters())
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
optimizer=client_optimizer)
_test_zero_supported_client_optimizer(args=args,
zero_stage=zero_stage,
optimizer_constructor=optimizer_constructor)
def test_zero2_reduce_scatter_off(tmpdir):
config_dict = {
"train_batch_size": 2,
"steps_per_print": 1,
"optimizer": {
"type": "Adam",
"params": {
"lr": 0.00015
}
},
"gradient_clipping": 1.0,
"zero_optimization": {
"stage": 2,
"contiguous_gradients": True,
"allgather_bucket_size": 2000000000,
"reduce_bucket_size": 200000000,
"overlap_comm": False,
"reduce_scatter": False
},
"fp16": {
"enabled": True
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[2])
def _helper(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=50,
hidden_dim=hidden_dim,
device=model.device)
for n, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_helper(args=args, model=model, hidden_dim=hidden_dim)
@pytest.mark.parametrize('adam_type, torch_impl',
[('Adam',
True),
('Adam',
False),
('AdamW',
True),
('AdamW',
False)])
def test_fp16_adam_types(tmpdir, adam_type, torch_impl):
config_dict = {
"train_batch_size": 1,
"steps_per_print": 1,
"fp16": {
"enabled": True,
"initial_scale_power": 10
},
"optimizer": {
"type": adam_type,
"torch_adam": torch_impl,
"params": {
"lr": 0.00015
}
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
model = SimpleModel(hidden_dim)
@distributed_test(world_size=[1])
def _test_fp16_adam_types(args, model, hidden_dim):
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=10,
hidden_dim=hidden_dim,
device=model.device)
for _, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_test_fp16_adam_types(args=args, model=model, hidden_dim=hidden_dim)
def test_zero3_lazyscatter(tmpdir):
config_dict = {
"train_batch_size": 1,
"steps_per_print": 1,
"fp16": {
"enabled": True,
"initial_scale_power": 10
},
"optimizer": {
"type": "AdamW",
"params": {
"lr": 0.00015
}
},
"zero_optimization": {
"stage": 3
}
}
args = args_from_dict(tmpdir, config_dict)
hidden_dim = 10
@distributed_test(world_size=[1])
def _go(args):
model = SimpleModel(hidden_dim)
model, _, _, _ = deepspeed.initialize(args=args,
model=model,
model_parameters=model.parameters())
data_loader = random_dataloader(model=model,
total_samples=10,
hidden_dim=hidden_dim,
device=model.device)
for _, batch in enumerate(data_loader):
loss = model(batch[0], batch[1])
model.backward(loss)
model.step()
_go(args=args)
# dist/asmbattle.app/Contents/Resources/__main__.py (dequeb/asmbattle, MIT license)
from asmbattle import ui_controller
def main():
    ui_controller.main()


ui_controller.main()
# tests/unit/modules/test_memcached.py (byteskeptical/salt, Apache-2.0 license)
# -*- coding: utf-8 -*-
'''
:codeauthor: Jayesh Kariya <jayeshk@saltstack.com>
'''
# Import Python Libs
from __future__ import absolute_import, print_function, unicode_literals
# Import Salt Testing Libs
from tests.support.unit import TestCase, skipIf
from tests.support.mock import (
MagicMock,
patch,
NO_MOCK,
NO_MOCK_REASON
)
# Import Salt Libs
import salt.modules.memcached as memcached
from salt.exceptions import CommandExecutionError, SaltInvocationError
from salt.ext.six import integer_types
@skipIf(NO_MOCK, NO_MOCK_REASON)
class MemcachedTestCase(TestCase):
'''
Test cases for salt.modules.memcached
'''
# 'status' function tests: 2
def test_status(self):
'''
Test if it gets memcached status
'''
class MockMemcache(object):
"""
Mock of memcache
"""
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertDictEqual(memcached.status(),
{'127.0.0.1:11211 (1)': {}})
def test_status_false(self):
'''
Test if it gets memcached status
'''
class MockMemcache(object):
"""
Mock of memcache
"""
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return []
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertFalse(memcached.status())
# 'get' function tests: 1
def test_get(self):
'''
Test if it retrieve value for a key
'''
class MockMemcache(object):
"""
Mock of memcache
"""
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
@staticmethod
def get(key):
"""
Mock of get method
"""
return key
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertEqual(memcached.get('salt'), 'salt')
# 'set_' function tests: 1
def test_set(self):
'''
Test if it set a key on the memcached server
'''
class MockMemcache(object):
"""
Mock of memcache
"""
def __init__(self):
self.key = None
self.value = None
self.time = None
self.min_compress_len = None
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
def set(self, key, value, time, min_compress_len):
"""
Mock of set method
"""
self.key = key
self.value = value
self.time = time
self.min_compress_len = min_compress_len
return True
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertTrue(memcached.set_('salt', '1111'))
self.assertRaises(SaltInvocationError, memcached.set_,
'salt', '1111', time='0.1')
self.assertRaises(SaltInvocationError, memcached.set_,
'salt', '1111', min_compress_len='0.1')
# 'delete' function tests: 1
def test_delete(self):
'''
Test if it delete a key from memcache server
'''
class MockMemcache(object):
"""
Mock of memcache
"""
def __init__(self):
self.key = None
self.time = None
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
def delete(self, key, time):
"""
Mock of delete method
"""
self.key = key
self.time = time
return True
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertTrue(memcached.delete('salt'))
self.assertRaises(SaltInvocationError, memcached.delete,
'salt', '1111', time='0.1')
# 'add' function tests: 1
def test_add(self):
'''
Test if it add a key from memcache server
'''
class MockMemcache(object):
"""
Mock of memcache
"""
def __init__(self):
self.key = None
self.value = None
self.time = None
self.min_compress_len = None
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
def add(self, key, value, time, min_compress_len):
"""
Mock of add method
"""
self.key = key
self.value = value
self.time = time
self.min_compress_len = min_compress_len
return True
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertTrue(memcached.add('salt', '1111'))
self.assertRaises(SaltInvocationError, memcached.add,
'salt', '1111', time='0.1')
self.assertRaises(SaltInvocationError, memcached.add,
'salt', '1111', min_compress_len='0.1')
# 'replace' function tests: 1
def test_replace(self):
'''
Test if it replace a key from memcache server
'''
class MockMemcache(object):
"""
Mock of memcache
"""
def __init__(self):
self.key = None
self.value = None
self.time = None
self.min_compress_len = None
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
def replace(self, key, value, time, min_compress_len):
"""
Mock of replace method
"""
self.key = key
self.value = value
self.time = time
self.min_compress_len = min_compress_len
return True
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertTrue(memcached.replace('salt', '1111'))
self.assertRaises(SaltInvocationError, memcached.replace,
'salt', '1111', time='0.1')
self.assertRaises(SaltInvocationError, memcached.replace,
'salt', '1111', min_compress_len='0.1')
# 'increment' function tests: 3
def test_increment(self):
'''
Test if it increment the value of a key
'''
class MockMemcache(object):
"""
Mock of memcache
"""
def __init__(self):
self.key = None
@staticmethod
def get_stats():
"""
Mock of stats method
"""
return [('127.0.0.1:11211 (1)', {})]
def get(self, key):
"""
Mock of get method
"""
self.key = key
return 1
def incr(self, key, delta):
"""
Mock of incr method
"""
self.key = key
if not isinstance(delta, integer_types):
raise SaltInvocationError('Delta value must be an integer')
return key
with patch.object(memcached, '_connect',
MagicMock(return_value=MockMemcache())):
self.assertEqual(memcached.increment('salt'), 'salt')
            self.assertRaises(SaltInvocationError, memcached.increment,
                              'salt', delta='sa')

    def test_increment_exist(self):
        '''
        Test if it increments the value of a key whose existing value
        is not an integer
        '''
        class MockMemcache(object):
            """
            Mock of memcache
            """
            def __init__(self):
                self.key = None

            @staticmethod
            def get_stats():
                """
                Mock of stats method
                """
                return [('127.0.0.1:11211 (1)', {})]

            def get(self, key):
                """
                Mock of get method
                """
                self.key = key
                return key

        with patch.object(memcached, '_connect',
                          MagicMock(return_value=MockMemcache())):
            self.assertRaises(CommandExecutionError, memcached.increment,
                              'salt')

    def test_increment_none(self):
        '''
        Test if it increments the value of a key that does not exist
        '''
        class MockMemcache(object):
            """
            Mock of memcache
            """
            def __init__(self):
                self.key = None

            @staticmethod
            def get_stats():
                """
                Mock of stats method
                """
                return [('127.0.0.1:11211 (1)', {})]

            def get(self, key):
                """
                Mock of get method
                """
                self.key = key
                return None

        with patch.object(memcached, '_connect',
                          MagicMock(return_value=MockMemcache())):
            self.assertRaises(CommandExecutionError, memcached.increment,
                              'salt')

    # 'decrement' function tests: 3

    def test_decrement(self):
        '''
        Test if it decrements the value of a key
        '''
        class MockMemcache(object):
            """
            Mock of memcache
            """
            def __init__(self):
                self.key = None

            @staticmethod
            def get_stats():
                """
                Mock of stats method
                """
                return [('127.0.0.1:11211 (1)', {})]

            def get(self, key):
                """
                Mock of get method
                """
                self.key = key
                return 1

            def decr(self, key, delta):
                """
                Mock of decr method
                """
                self.key = key
                if not isinstance(delta, integer_types):
                    raise SaltInvocationError(
                        'Delta value must be an integer')
                return key

        with patch.object(memcached, '_connect',
                          MagicMock(return_value=MockMemcache())):
            self.assertEqual(memcached.decrement('salt'), 'salt')
            self.assertRaises(SaltInvocationError, memcached.decrement,
                              'salt', delta='sa')

    def test_decrement_exist(self):
        '''
        Test if it decrements the value of a key whose existing value
        is not an integer
        '''
        class MockMemcache(object):
            """
            Mock of memcache
            """
            def __init__(self):
                self.key = None

            @staticmethod
            def get_stats():
                """
                Mock of stats method
                """
                return [('127.0.0.1:11211 (1)', {})]

            def get(self, key):
                """
                Mock of get method
                """
                self.key = key
                return key

        with patch.object(memcached, '_connect',
                          MagicMock(return_value=MockMemcache())):
            self.assertRaises(CommandExecutionError, memcached.decrement,
                              'salt')

    def test_decrement_none(self):
        '''
        Test if it decrements the value of a key that does not exist
        '''
        class MockMemcache(object):
            """
            Mock of memcache
            """
            def __init__(self):
                self.key = None

            @staticmethod
            def get_stats():
                """
                Mock of stats method
                """
                return [('127.0.0.1:11211 (1)', {})]

            def get(self, key):
                """
                Mock of get method
                """
                self.key = key
                return None

        with patch.object(memcached, '_connect',
                          MagicMock(return_value=MockMemcache())):
            self.assertRaises(CommandExecutionError, memcached.decrement,
                              'salt')
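All six tests above share one pattern: `memcached._connect` is patched with a `MagicMock` whose return value is an in-memory fake, so the assertions run against predictable data instead of a live memcached server. A standalone sketch of that pattern (the `FakeMemcache` and `increment` names here are hypothetical stand-ins for illustration, not the Salt module itself):

```python
from unittest.mock import MagicMock


class FakeMemcache(object):
    """In-memory stand-in for a live memcache client connection."""

    def __init__(self, store=None):
        self.store = dict(store or {})

    def get_stats(self):
        # Mirror the shape a real client returns: one (server, stats) pair
        return [('127.0.0.1:11211 (1)', {})]

    def get(self, key):
        return self.store.get(key)

    def incr(self, key, delta):
        self.store[key] += delta
        return self.store[key]


def increment(key, delta=1, connect=None):
    """Toy version of the function under test: bump an integer key."""
    conn = connect()
    cur = conn.get(key)
    if cur is None:
        raise KeyError('Key {0!r} does not exist'.format(key))
    if not isinstance(cur, int):
        raise TypeError('Value for {0!r} is not an integer'.format(key))
    return conn.incr(key, delta)


# The tests patch the connection factory the same way:
# patch.object(memcached, '_connect', MagicMock(return_value=MockMemcache()))
factory = MagicMock(return_value=FakeMemcache({'salt': 1}))
```

With that factory in place, `increment('salt', connect=factory)` bumps the fake store, while a missing or non-integer key raises without ever touching a real server.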
# ---- ibutsu_client/api/artifact_api.py (rsnyman/ibutsu-client-python, MIT license) ----
"""
    Ibutsu API

    A system to store and query test results  # noqa: E501

    The version of the OpenAPI document: 1.13.4
    Generated by: https://openapi-generator.tech
"""


import re  # noqa: F401
import sys  # noqa: F401

from ibutsu_client.api_client import ApiClient, Endpoint as _Endpoint
from ibutsu_client.model_utils import (  # noqa: F401
    check_allowed_values,
    check_validations,
    date,
    datetime,
    file_type,
    none_type,
    validate_and_convert_types
)
from ibutsu_client.model.artifact import Artifact
from ibutsu_client.model.artifact_list import ArtifactList


class ArtifactApi(object):
    """NOTE: This class is auto generated by OpenAPI Generator
    Ref: https://openapi-generator.tech

    Do not edit the class manually.
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
def __delete_artifact(
self,
id,
**kwargs
):
"""Delete an artifact # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_artifact(id, async_req=True)
>>> result = thread.get()
Args:
id (str): ID of artifact to delete
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['id'] = \
id
return self.call_with_http_info(**kwargs)
self.delete_artifact = _Endpoint(
settings={
'response_type': None,
'auth': [],
'endpoint_path': '/artifact/{id}',
'operation_id': 'delete_artifact',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'id',
],
'required': [
'id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'id':
(str,),
},
'attribute_map': {
'id': 'id',
},
'location_map': {
'id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__delete_artifact
)
def __download_artifact(
self,
id,
**kwargs
):
"""Download an artifact # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.download_artifact(id, async_req=True)
>>> result = thread.get()
Args:
id (str): ID of artifact to return
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
file_type
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['id'] = \
id
return self.call_with_http_info(**kwargs)
self.download_artifact = _Endpoint(
settings={
'response_type': (file_type,),
'auth': [],
'endpoint_path': '/artifact/{id}/download',
'operation_id': 'download_artifact',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'id',
],
'required': [
'id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'id':
(str,),
},
'attribute_map': {
'id': 'id',
},
'location_map': {
'id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'text/plain',
'image/jpeg',
'image/png',
'image/gif',
'application/octet-stream'
],
'content_type': [],
},
api_client=api_client,
callable=__download_artifact
)
def __get_artifact(
self,
id,
**kwargs
):
"""Get a single artifact # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_artifact(id, async_req=True)
>>> result = thread.get()
Args:
id (str): ID of artifact to return
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Artifact
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['id'] = \
id
return self.call_with_http_info(**kwargs)
self.get_artifact = _Endpoint(
settings={
'response_type': (Artifact,),
'auth': [],
'endpoint_path': '/artifact/{id}',
'operation_id': 'get_artifact',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'id',
],
'required': [
'id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'id':
(str,),
},
'attribute_map': {
'id': 'id',
},
'location_map': {
'id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_artifact
)
def __get_artifact_list(
self,
**kwargs
):
"""Get a (filtered) list of artifacts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_artifact_list(async_req=True)
>>> result = thread.get()
Keyword Args:
result_id (str): The result ID to filter by. [optional]
run_id (str): The run ID to filter by. [optional]
page (int): Set the page of items to return, defaults to 1. [optional]
page_size (int): Set the number of items per page, defaults to 25. [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
ArtifactList
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
return self.call_with_http_info(**kwargs)
self.get_artifact_list = _Endpoint(
settings={
'response_type': (ArtifactList,),
'auth': [],
'endpoint_path': '/artifact',
'operation_id': 'get_artifact_list',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'result_id',
'run_id',
'page',
'page_size',
],
'required': [],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'result_id':
(str,),
'run_id':
(str,),
'page':
(int,),
'page_size':
(int,),
},
'attribute_map': {
'result_id': 'resultId',
'run_id': 'runId',
'page': 'page',
'page_size': 'pageSize',
},
'location_map': {
'result_id': 'query',
'run_id': 'query',
'page': 'query',
'page_size': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__get_artifact_list
)
def __upload_artifact(
self,
filename,
file,
**kwargs
):
"""Uploads a test run artifact # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.upload_artifact(filename, file, async_req=True)
>>> result = thread.get()
Args:
filename (str): name to store the uploaded artifact file under
file (file_type): file to upload
Keyword Args:
result_id (str): ID of result to attach artifact to. [optional]
run_id (str): ID of run to attach artifact to. [optional]
additional_metadata ({str: (bool, date, datetime, dict, float, int, list, str, none_type)}): Additional data to pass to server. [optional]
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Artifact
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['filename'] = \
filename
kwargs['file'] = \
file
return self.call_with_http_info(**kwargs)
self.upload_artifact = _Endpoint(
settings={
'response_type': (Artifact,),
'auth': [],
'endpoint_path': '/artifact',
'operation_id': 'upload_artifact',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'filename',
'file',
'result_id',
'run_id',
'additional_metadata',
],
'required': [
'filename',
'file',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'filename':
(str,),
'file':
(file_type,),
'result_id':
(str,),
'run_id':
(str,),
'additional_metadata':
({str: (bool, date, datetime, dict, float, int, list, str, none_type)},),
},
'attribute_map': {
'filename': 'filename',
'file': 'file',
'result_id': 'resultId',
'run_id': 'runId',
'additional_metadata': 'additionalMetadata',
},
'location_map': {
'filename': 'form',
'file': 'form',
'result_id': 'form',
'run_id': 'form',
'additional_metadata': 'form',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'multipart/form-data'
]
},
api_client=api_client,
callable=__upload_artifact
)
def __view_artifact(
self,
id,
**kwargs
):
"""Stream an artifact directly to the client/browser # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.view_artifact(id, async_req=True)
>>> result = thread.get()
Args:
id (str): ID of artifact to return
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
file_type
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['id'] = \
id
return self.call_with_http_info(**kwargs)
self.view_artifact = _Endpoint(
settings={
'response_type': (file_type,),
'auth': [],
'endpoint_path': '/artifact/{id}/view',
'operation_id': 'view_artifact',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'id',
],
'required': [
'id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'id':
(str,),
},
'attribute_map': {
'id': 'id',
},
'location_map': {
'id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'text/plain',
'image/jpeg',
'image/png',
'image/gif',
'application/octet-stream'
],
'content_type': [],
},
api_client=api_client,
callable=__view_artifact
)
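Every endpoint in this generated class follows the same shape: a private closure that back-fills the transport-control kwargs, then delegates to `call_with_http_info`, with the closure wrapped in an `_Endpoint` descriptor. The kwargs-defaulting step can be sketched on its own (a simplified illustration of the idiom, not the generator's actual helper):

```python
def apply_call_defaults(kwargs):
    """Fill in the transport-control defaults each generated closure
    sets before delegating to call_with_http_info()."""
    defaults = {
        'async_req': False,
        '_return_http_data_only': True,
        '_preload_content': True,
        '_request_timeout': None,
        '_check_input_type': True,
        '_check_return_type': True,
        '_host_index': None,
    }
    for name, value in defaults.items():
        # kwargs.get(name, default) keeps any caller-supplied override
        kwargs[name] = kwargs.get(name, value)
    return kwargs
```

Callers can therefore pass only the parameters they care about (for example `async_req=True`) and rely on the closure to normalize everything else before the HTTP call is made.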
# ---- python/paddle/distributed/auto_parallel/operators/dist_matmul.py (PaddlePaddle, Apache-2.0) ----
# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License
from .common import DistributedOperator
from .common import DistributedOperatorImpl
from .common import register_distributed_operator
from .common import register_distributed_operator_impl
from ..utils import is_dim_shard
from ..utils import is_dim_replicate
from ..utils import is_valid_list_index
from ..utils import compute_compatible_dim_mapping
from ..utils import compute_compatible_dims_mapping
from ..utils import compute_compatible_and_update_dim_mapping
from paddle.fluid import core, unique_name
from paddle.fluid.framework import in_dygraph_mode
from paddle.fluid.framework import Program, Parameter, Variable, program_guard
from paddle.fluid.data_feeder import check_variable_and_dtype, check_dtype
from ..process import new_process_group
from ..utils import _get_comm_group
def _update_dims_mapping_for_matmul(op_dist_attr):
    changed = False
    op_desc = op_dist_attr.get_owner_op().desc
    x_name = op_desc.input('X')[0]
    y_name = op_desc.input('Y')[0]
    out_name = op_desc.output('Out')[0]
    x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
    y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
    out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
    x_dims_mapping_len = len(x_dims_mapping)
    y_dims_mapping_len = len(y_dims_mapping)
    out_dims_mapping_len = len(out_dims_mapping)

    # Add a dim mapping to make sure the length of dims_mapping is at least 2
    if x_dims_mapping_len == 1:
        x_dims_mapping.insert(0, -1)
    if y_dims_mapping_len == 1:
        y_dims_mapping.insert(1, -1)

    # Deal with dim > 2 and take care of broadcasting
    if out_dims_mapping_len > 2:
        broadcast_x_dims_mapping = []
        broadcast_y_dims_mapping = []
        broadcast_out_dims_mapping = []

        for i in range(out_dims_mapping_len - x_dims_mapping_len):
            broadcast_x_dims_mapping.append(out_dims_mapping[i])
        for i in range(x_dims_mapping_len - 2):
            broadcast_x_dims_mapping.append(x_dims_mapping[i])

        for i in range(out_dims_mapping_len - y_dims_mapping_len):
            broadcast_y_dims_mapping.append(out_dims_mapping[i])
        for i in range(y_dims_mapping_len - 2):
            broadcast_y_dims_mapping.append(y_dims_mapping[i])

        for i in range(out_dims_mapping_len - 2):
            broadcast_out_dims_mapping.append(out_dims_mapping[i])

        compatible_dims_mapping = compute_compatible_dims_mapping([
            broadcast_x_dims_mapping, broadcast_y_dims_mapping,
            broadcast_out_dims_mapping
        ])
        assert compatible_dims_mapping is not None, "There is no compatible dim mapping."

        for i in range(x_dims_mapping_len - 2):
            new_idx = i + (out_dims_mapping_len - x_dims_mapping_len)
            if x_dims_mapping[i] != compatible_dims_mapping[new_idx]:
                x_dims_mapping[i] = compatible_dims_mapping[new_idx]
                changed = True

        for i in range(y_dims_mapping_len - 2):
            new_idx = i + (out_dims_mapping_len - y_dims_mapping_len)
            if y_dims_mapping[i] != compatible_dims_mapping[new_idx]:
                y_dims_mapping[i] = compatible_dims_mapping[new_idx]
                changed = True

        for i in range(out_dims_mapping_len - 2):
            if out_dims_mapping[i] != compatible_dims_mapping[i]:
                out_dims_mapping[i] = compatible_dims_mapping[i]
                changed = True

    # The following, which uses negative indices, works both when
    # len(out_dims_mapping) > 2 and when len(out_dims_mapping) <= 2
    dim_changed = compute_compatible_and_update_dim_mapping(
        [x_dims_mapping, y_dims_mapping], [-1, -2])
    if dim_changed:
        changed = True

    dim_changed = compute_compatible_and_update_dim_mapping(
        [x_dims_mapping, out_dims_mapping], [-2, -2])
    if dim_changed:
        changed = True

    dim_changed = compute_compatible_and_update_dim_mapping(
        [y_dims_mapping, out_dims_mapping], [-1, -1])
    if dim_changed:
        changed = True

    # Remove the inserted dim mapping to make sure the length of
    # dims_mapping is the same as its tensor's rank
    if x_dims_mapping_len == 1:
        x_dims_mapping.pop(0)
    if y_dims_mapping_len == 1:
        y_dims_mapping.pop(1)

    assert len(x_dims_mapping) == x_dims_mapping_len
    assert len(y_dims_mapping) == y_dims_mapping_len
    assert len(out_dims_mapping) == out_dims_mapping_len

    return changed
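The helper above leans on the `compute_compatible_*` utilities to reconcile shard annotations: per dimension, `-1` (replicated) defers to any mesh axis, equal axes agree, and two different axes conflict. A self-contained sketch of that merge rule (simplified stand-ins for the real helpers in `..utils`):

```python
def compatible_dim(d1, d2):
    """Merge two per-dim shard annotations; -1 means replicated."""
    if d1 == -1:
        return d2
    if d2 == -1:
        return d1
    # Both sharded: only compatible when they name the same mesh axis
    return d1 if d1 == d2 else None


def compatible_dims_mapping(mapping1, mapping2):
    """Elementwise merge of two dims_mapping lists, or None on conflict."""
    merged = [compatible_dim(a, b) for a, b in zip(mapping1, mapping2)]
    return None if None in merged else merged
```

For example, `[-1, 0]` and `[1, -1]` merge to `[1, 0]`, while `[0, -1]` and `[1, -1]` conflict in the first dimension and yield no compatible mapping.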
class DistributedMatmul(DistributedOperator):
    def __init__(self, name):
        super(DistributedMatmul, self).__init__()
        self._name = name


register_distributed_operator("matmul", DistributedMatmul("matmul"))
# ColumnParallel
class DistributedMatmulImpl0(DistributedOperatorImpl):
def __init__(self, name):
super(DistributedMatmulImpl0, self).__init__()
self._name = name
self._forward_implemented = True
self._backward_implemented = False
def is_process_mesh_compatible(self, op_dist_attr):
""" No restriction for now. """
return True
def is_input_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
x_name = op_desc.input('X')[0]
y_name = op_desc.input('Y')[0]
x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
if is_dim_shard(x_dims_mapping[-1]):
return False
if is_dim_shard(y_dims_mapping[0]) or is_dim_replicate(y_dims_mapping[
1]):
return False
for mapping in x_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def is_output_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
out_name = op_desc.output('Out')[0]
out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
if is_dim_replicate(out_dims_mapping[-1]):
return False
for mapping in out_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def update_dims_mapping(self, op_dist_attr):
changed = False
dim_changed = _update_dims_mapping_for_matmul(op_dist_attr)
if dim_changed:
changed = True
return changed
def forward(self, serial_op):
def static_handle(dst_block,
src_op,
op_dist_attr,
input_name_mapping,
output_name_mapping,
rank_id=0):
assert len(
input_name_mapping
) == 2, "col_parallel_linear takes 2 input variables but got {}".format(
input_name_mapping)
assert len(
output_name_mapping
) == 1, "col_parallel_linear takes 1 output variable but got {}".format(
output_name_mapping)
assert len(
input_name_mapping['X']
) == 1, "col_parallel_linear input X takes 1 variable but got {}".format(
input_name_mapping['X'])
assert len(
input_name_mapping['Y']
) == 1, "col_parallel_linear input Y takes 1 variable but got {}".format(
input_name_mapping['Y'])
assert len(
output_name_mapping['Out']
) == 1, "col_parallel_linear output Out takes 1 variable but got {}".format(
output_name_mapping['Out'])
X_var = dst_block.var(input_name_mapping['X'][0])
Weight_var = dst_block.var(input_name_mapping['Y'][0])
Out_var = dst_block.var(output_name_mapping['Out'][0])
# TODO infer logic comm presentation
model_parallel_axis, process_mesh = op_dist_attr.get_owner_context(
)._get_model_parallel_info()
group_ranks = _get_comm_group(process_mesh.process_group,
process_mesh.topology,
model_parallel_axis, rank_id)
group = new_process_group(group_ranks)
intermediate_var_0 = dst_block.create_var(
name=unique_name.generate_with_ignorable_key(".".join(
["c_identity", 'tmp'])),
dtype=X_var.dtype,
shape=X_var.shape,
type=core.VarDesc.VarType.LOD_TENSOR,
persistable=False,
stop_gradient=X_var.stop_gradient)
check_variable_and_dtype(
X_var, 'tensor',
['float16', 'float32', 'float64', 'int32', 'int64'],
'_c_identity')
dst_block.append_op(
type='c_identity',
inputs={'X': [X_var]},
outputs={'Out': intermediate_var_0},
attrs={
'ring_id': group.id,
'use_calc_stream': True,
'use_model_parallel': True,
})
check_variable_and_dtype(intermediate_var_0, 'x',
['float16', 'float32', 'float64'],
'linear')
check_dtype(intermediate_var_0.dtype, 'dtype',
['float16', 'float32', 'float64'], 'linear')
attrs = {
'transpose_X': False,
'transpose_Y': False,
'alpha': 1,
}
inputs = {'X': [intermediate_var_0], 'Y': [Weight_var]}
dst_block.append_op(
type='matmul',
inputs=inputs,
outputs={'Out': Out_var},
attrs=attrs)
if in_dygraph_mode():
raise NotImplementedError(
"Dist op for [{}] with idx [{}] is NOT implemented yet.".format(
"matmul", 0))
else:
return static_handle
# RowParallel
class DistributedMatmulImpl1(DistributedOperatorImpl):
def __init__(self, name):
super(DistributedMatmulImpl1, self).__init__()
self._name = name
self._forward_implemented = True
self._backward_implemented = False
def is_process_mesh_compatible(self, op_dist_attr):
""" No restriction for now. """
return True
def is_input_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
x_name = op_desc.input('X')[0]
y_name = op_desc.input('Y')[0]
x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
if is_dim_replicate(x_dims_mapping[-1]):
return False
if is_dim_replicate(y_dims_mapping[-2]) or is_dim_shard(y_dims_mapping[
-1]):
return False
# Other dimensions must be replicate except the batch dimension
for mapping in x_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def is_output_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
out_name = op_desc.output('Out')[0]
out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
if is_dim_shard(out_dims_mapping[-1]):
return False
# Other dimensions must be replicate except the batch dimension
for mapping in out_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def update_dims_mapping(self, op_dist_attr):
changed = False
dim_changed = _update_dims_mapping_for_matmul(op_dist_attr)
if dim_changed:
changed = True
return changed
def forward(self, serial_op):
def static_handle(dst_block,
src_op,
op_dist_attr,
input_name_mapping,
output_name_mapping,
rank_id=0):
assert len(
input_name_mapping
) == 2, "row_parallel_linear takes 2 input variables but got {}".format(
input_name_mapping)
assert len(
output_name_mapping
) == 1, "row_parallel_linear takes 1 output variable but got {}".format(
output_name_mapping)
assert len(
input_name_mapping['X']
) == 1, "row_parallel_linear input X takes 1 variable but got {}".format(
input_name_mapping['X'])
assert len(
input_name_mapping['Y']
) == 1, "row_parallel_linear input Y takes 1 variable but got {}".format(
input_name_mapping['Y'])
assert len(
output_name_mapping['Out']
) == 1, "row_parallel_linear output Out takes 1 variable but got {}".format(
output_name_mapping['Out'])
X_var = dst_block.var(input_name_mapping['X'][0])
Weight_var = dst_block.var(input_name_mapping['Y'][0])
Out_var = dst_block.var(output_name_mapping['Out'][0])
# TODO infer logic comm presentation
model_parallel_axis, process_mesh = op_dist_attr.get_owner_context(
)._get_model_parallel_info()
group_ranks = _get_comm_group(process_mesh.process_group,
process_mesh.topology,
model_parallel_axis, rank_id)
group = new_process_group(group_ranks)
check_variable_and_dtype(
X_var, 'x', ['float16', 'float32', 'float64'], 'linear')
check_dtype(X_var.dtype, 'dtype',
['float16', 'float32', 'float64'], 'linear')
attrs = {
'transpose_X': False,
'transpose_Y': False,
'alpha': 1,
}
inputs = {'X': X_var, 'Y': Weight_var}
intermediate_var_0 = dst_block.create_var(
shape=Out_var.shape,
dtype=Out_var.dtype,
type=Out_var.type,
lod_level=Out_var.lod_level,
persistable=False,
is_data=False,
need_check_feed=Out_var.desc.need_check_feed())
dst_block.append_op(
type='matmul',
inputs=inputs,
outputs={'Out': intermediate_var_0},
attrs=attrs)
dst_block.append_op(
type='c_allreduce_sum',
inputs={'X': intermediate_var_0},
outputs={'Out': Out_var},
attrs={
'ring_id': group.id,
'use_calc_stream': True,
'use_model_parallel': True
})
if in_dygraph_mode():
raise NotImplementedError(
"Dist op for [{}] with idx [{}] is NOT implemented yet.".format(
"matmul", 0))
else:
return static_handle
# ReplicateParallel
class DistributedMatmulImpl2(DistributedOperatorImpl):
def __init__(self, name):
super(DistributedMatmulImpl2, self).__init__()
self._name = name
def is_process_mesh_compatible(self, op_dist_attr):
""" No restriction for now. """
return True
def is_input_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
x_name = op_desc.input('X')[0]
y_name = op_desc.input('Y')[0]
x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
if is_dim_shard(x_dims_mapping[-1]):
return False
if is_valid_list_index(x_dims_mapping,
-2) and is_dim_shard(x_dims_mapping[-2]):
return False
if is_dim_shard(y_dims_mapping[-1]):
return False
if is_valid_list_index(y_dims_mapping,
-2) and is_dim_shard(y_dims_mapping[-2]):
return False
return True
def is_output_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
out_name = op_desc.output('Out')[0]
out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
if is_dim_shard(out_dims_mapping[-1]):
return False
if is_valid_list_index(out_dims_mapping,
-2) and is_dim_shard(out_dims_mapping[-2]):
return False
return True
def update_dims_mapping(self, op_dist_attr):
changed = False
dim_changed = _update_dims_mapping_for_matmul(op_dist_attr)
if dim_changed:
changed = True
return changed
register_distributed_operator_impl("matmul",
DistributedMatmulImpl0("column_parallel"))
register_distributed_operator_impl("matmul",
DistributedMatmulImpl1("row_parallel"))
register_distributed_operator_impl("matmul",
DistributedMatmulImpl2("replicate_parallel"))
class DistributedMatmulV2(DistributedOperator):
def __init__(self, name):
super(DistributedMatmulV2, self).__init__()
self._name = name
register_distributed_operator("matmul_v2", DistributedMatmulV2("matmul_v2"))
# ColumnParallel
class DistributedMatmulV2Impl0(DistributedOperatorImpl):
def __init__(self, name):
super(DistributedMatmulV2Impl0, self).__init__()
self._name = name
self._forward_implemented = True
self._backward_implemented = False
def is_process_mesh_compatible(self, op_dist_attr):
""" No restriction for now. """
return True
def is_input_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
x_name = op_desc.input('X')[0]
y_name = op_desc.input('Y')[0]
x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
if is_dim_shard(x_dims_mapping[-1]):
return False
if is_dim_shard(y_dims_mapping[0]) or is_dim_replicate(y_dims_mapping[
1]):
return False
for mapping in x_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def is_output_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
out_name = op_desc.output('Out')[0]
out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
if is_dim_replicate(out_dims_mapping[-1]):
return False
for mapping in out_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def update_dims_mapping(self, op_dist_attr):
changed = False
dim_changed = _update_dims_mapping_for_matmul(op_dist_attr)
if dim_changed:
changed = True
return changed
def forward(self, serial_op):
def static_handle(dst_block,
src_op,
op_dist_attr,
input_name_mapping,
output_name_mapping,
rank_id=0):
assert len(
input_name_mapping
) == 2, "col_parallel_linear takes 2 input variables but got {}".format(
input_name_mapping)
assert len(
output_name_mapping
) == 1, "col_parallel_linear takes 1 output variable but got {}".format(
output_name_mapping)
assert len(
input_name_mapping['X']
) == 1, "col_parallel_linear input X takes 1 variable but got {}".format(
input_name_mapping['X'])
assert len(
input_name_mapping['Y']
) == 1, "col_parallel_linear input Y takes 1 variable but got {}".format(
input_name_mapping['Y'])
assert len(
output_name_mapping['Out']
) == 1, "col_parallel_linear output Out takes 1 variable but got {}".format(
output_name_mapping['Out'])
X_var = dst_block.var(input_name_mapping['X'][0])
Weight_var = dst_block.var(input_name_mapping['Y'][0])
Out_var = dst_block.var(output_name_mapping['Out'][0])
# TODO infer logic comm presentation
from ..process import new_process_group
from ..transpiler import _get_comm_group
model_parallel_axis, process_mesh = op_dist_attr.get_owner_context(
)._get_model_parallel_info()
group_ranks = _get_comm_group(process_mesh.topology,
model_parallel_axis,
process_mesh.process_group, rank_id)
group = new_process_group(group_ranks)
intermediate_var_0 = dst_block.create_var(
name=unique_name.generate_with_ignorable_key(".".join(
["c_identity", 'tmp'])),
dtype=X_var.dtype,
shape=X_var.shape,
type=core.VarDesc.VarType.LOD_TENSOR,
persistable=False,
stop_gradient=X_var.stop_gradient)
check_variable_and_dtype(
X_var, 'tensor',
['float16', 'float32', 'float64', 'int32', 'int64'],
'_c_identity')
dst_block.append_op(
type='c_identity',
inputs={'X': [X_var]},
outputs={'Out': intermediate_var_0},
attrs={
'ring_id': group.id,
'use_calc_stream': True,
'use_model_parallel': True,
})
check_variable_and_dtype(intermediate_var_0, 'x',
['float16', 'float32', 'float64'],
'linear')
check_dtype(intermediate_var_0.dtype, 'dtype',
['float16', 'float32', 'float64'], 'linear')
attrs = {'trans_x': False, 'trans_y': False}
inputs = {'X': [intermediate_var_0], 'Y': [Weight_var]}
dst_block.append_op(
type='matmul_v2',
inputs=inputs,
outputs={'Out': Out_var},
attrs=attrs)
if in_dygraph_mode():
raise NotImplementedError(
"Dist op for [{}] with idx [{}] is NOT implemented yet.".format(
"matmul_v2", 0))
else:
return static_handle
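This forward implements the column-parallel pattern: `c_identity` forwards the replicated input X, and each rank multiplies it by its column slice of the weight, so each rank ends up owning a column slice of Out. A plain-Python sketch with two simulated ranks (illustrative only, no Paddle involved):

```python
# Column-parallel matmul in miniature: X is replicated (the c_identity step)
# and W is split along its columns, so each "rank" produces its own column
# slice of Out.

def matmul(A, B):
    # Naive dense matmul over nested lists.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

X = [[1, 2], [3, 4]]                      # 2 x 2, same on every rank
W = [[1, 0, 2, 1], [0, 1, 1, 2]]          # 2 x 4, split into two 2 x 2 halves

shard0 = matmul(X, [row[:2] for row in W])   # rank 0's columns of Out
shard1 = matmul(X, [row[2:] for row in W])   # rank 1's columns of Out

# Concatenating the shards along columns recovers the full product; no
# allreduce is needed in the forward pass.
Out = [r0 + r1 for r0, r1 in zip(shard0, shard1)]
assert Out == matmul(X, W)
```

The communication cost therefore moves to the backward pass, which is consistent with `_backward_implemented = False` being a separate concern here.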
# RowParallel
class DistributedMatmulV2Impl1(DistributedOperatorImpl):
def __init__(self, name):
super(DistributedMatmulV2Impl1, self).__init__()
self._name = name
self._forward_implemented = True
self._backward_implemented = False
def is_process_mesh_compatible(self, op_dist_attr):
""" No restriction for now. """
return True
def is_input_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
x_name = op_desc.input('X')[0]
y_name = op_desc.input('Y')[0]
x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
if is_dim_replicate(x_dims_mapping[-1]):
return False
if is_dim_replicate(y_dims_mapping[-2]) or is_dim_shard(y_dims_mapping[
-1]):
return False
# Other dimensions must be replicate except the batch dimension
for mapping in x_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def is_output_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
out_name = op_desc.output('Out')[0]
out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
if is_dim_shard(out_dims_mapping[-1]):
return False
# Other dimensions must be replicate except the batch dimension
for mapping in out_dims_mapping[1:-1]:
if is_dim_shard(mapping):
return False
return True
def update_dims_mapping(self, op_dist_attr):
changed = False
dim_changed = _update_dims_mapping_for_matmul(op_dist_attr)
if dim_changed:
changed = True
return changed
def forward(self, serial_op):
def static_handle(dst_block,
src_op,
op_dist_attr,
input_name_mapping,
output_name_mapping,
rank_id=0):
assert len(
input_name_mapping
) == 2, "row_parallel_linear takes 2 input variables but got {}".format(
input_name_mapping)
assert len(
output_name_mapping
) == 1, "row_parallel_linear takes 1 output variable but got {}".format(
output_name_mapping)
assert len(
input_name_mapping['X']
) == 1, "row_parallel_linear input X takes 1 variable but got {}".format(
input_name_mapping['X'])
assert len(
input_name_mapping['Y']
) == 1, "row_parallel_linear input Y takes 1 variable but got {}".format(
input_name_mapping['Y'])
assert len(
output_name_mapping['Out']
) == 1, "row_parallel_linear output Out takes 1 variable but got {}".format(
output_name_mapping['Out'])
X_var = dst_block.var(input_name_mapping['X'][0])
Weight_var = dst_block.var(input_name_mapping['Y'][0])
Out_var = dst_block.var(output_name_mapping['Out'][0])
# TODO infer logic comm presentation
from ..process import new_process_group
from ..transpiler import _get_comm_group
model_parallel_axis, process_mesh = op_dist_attr.get_owner_context(
)._get_model_parallel_info()
group_ranks = _get_comm_group(process_mesh.topology,
model_parallel_axis,
process_mesh.process_group, rank_id)
group = new_process_group(group_ranks)
check_variable_and_dtype(
X_var, 'x', ['float16', 'float32', 'float64'], 'linear')
check_dtype(X_var.dtype, 'dtype',
['float16', 'float32', 'float64'], 'linear')
attrs = {'trans_x': False, 'trans_y': False}
inputs = {'X': X_var, 'Y': Weight_var}
intermediate_var_0 = dst_block.create_var(
shape=Out_var.shape,
dtype=Out_var.dtype,
type=Out_var.type,
lod_level=Out_var.lod_level,
persistable=False,
is_data=False,
need_check_feed=Out_var.desc.need_check_feed())
dst_block.append_op(
type='matmul_v2',
inputs=inputs,
outputs={'Out': intermediate_var_0},
attrs=attrs)
dst_block.append_op(
type='c_allreduce_sum',
inputs={'X': intermediate_var_0},
outputs={'Out': Out_var},
attrs={
'ring_id': group.id,
'use_calc_stream': True,
'use_model_parallel': True
})
if in_dygraph_mode():
raise NotImplementedError(
"Dist op for [{}] with idx [{}] is NOT implemented yet.".format(
"matmul_v2", 0))
else:
return static_handle
# ReplicateParallel
class DistributedMatmulV2Impl2(DistributedOperatorImpl):
def __init__(self, name):
super(DistributedMatmulV2Impl2, self).__init__()
self._name = name
def is_process_mesh_compatible(self, op_dist_attr):
""" No restriction for now. """
return True
def is_input_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
x_name = op_desc.input('X')[0]
y_name = op_desc.input('Y')[0]
x_dims_mapping = op_dist_attr.get_input_dims_mapping(x_name)
y_dims_mapping = op_dist_attr.get_input_dims_mapping(y_name)
if is_dim_shard(x_dims_mapping[-1]):
return False
if is_valid_list_index(x_dims_mapping,
-2) and is_dim_shard(x_dims_mapping[-2]):
return False
if is_dim_shard(y_dims_mapping[-1]):
return False
if is_valid_list_index(y_dims_mapping,
-2) and is_dim_shard(y_dims_mapping[-2]):
return False
return True
def is_output_compatible(self, op_dist_attr):
op_desc = op_dist_attr.get_owner_op().desc
out_name = op_desc.output('Out')[0]
out_dims_mapping = op_dist_attr.get_output_dims_mapping(out_name)
if is_dim_shard(out_dims_mapping[-1]):
return False
if is_valid_list_index(out_dims_mapping,
-2) and is_dim_shard(out_dims_mapping[-2]):
return False
return True
def update_dims_mapping(self, op_dist_attr):
changed = False
dim_changed = _update_dims_mapping_for_matmul(op_dist_attr)
if dim_changed:
changed = True
return changed
register_distributed_operator_impl("matmul_v2",
DistributedMatmulV2Impl0("column_parallel"))
register_distributed_operator_impl("matmul_v2",
DistributedMatmulV2Impl1("row_parallel"))
register_distributed_operator_impl(
"matmul_v2", DistributedMatmulV2Impl2("replicate_parallel"))
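The `register_distributed_operator_impl` calls above accumulate several candidate implementations per operator type, each exposing the `is_*_compatible` checks defined earlier. A minimal illustrative registry showing that shape (names and selection logic here are assumptions, not Paddle's actual internals):

```python
# Hypothetical per-op-type registry: each op keeps an ordered candidate list,
# and the first implementation whose compatibility checks all pass wins.
_op_impls = {}

def register_impl(op_type, impl):
    # Mirrors the spirit of register_distributed_operator_impl above.
    _op_impls.setdefault(op_type, []).append(impl)

def find_compatible_impl(op_type, op_dist_attr):
    # Return the first implementation accepting this dist attribute, or None.
    for impl in _op_impls.get(op_type, []):
        if (impl.is_process_mesh_compatible(op_dist_attr)
                and impl.is_input_compatible(op_dist_attr)
                and impl.is_output_compatible(op_dist_attr)):
            return impl
    return None
```

Registration order matters in such a scheme: the more specialized column- and row-parallel impls are registered before the permissive replicate-parallel fallback.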
37eccf2cda4c242fbb24dc3b59c6dbb75f3ad77e | 10,418 | py | Python | tests/bugs/core_0769_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | ["MIT"] | 1 | 2022-02-05T11:37:13.000Z | 2022-02-05T11:37:13.000Z | tests/bugs/core_0769_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | ["MIT"] | 1 | 2021-09-03T11:47:00.000Z | 2021-09-03T12:42:10.000Z | tests/bugs/core_0769_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | ["MIT"] | 1 | 2021-06-30T14:14:16.000Z | 2021-06-30T14:14:16.000Z
#coding:utf-8
#
# id: bugs.core_0769
# title: Wildcards/Regular Expressions in WHERE clause - SIMILAR TO predicate
#  description:
# tracker_id: CORE-769
# min_versions: ['2.5.0']
# versions: 3.0
# qmid: None
import pytest
from firebird.qa import db_factory, isql_act, Action
# version: 3.0
# resources: None
substitutions_1 = []
init_script_1 = """"""
db_1 = db_factory(page_size=4096, sql_dialect=3, init=init_script_1)
test_script_1 = """SELECT IIF('ab' SIMILAR TO 'ab|cd|efg','true','false'),'true','''ab'' SIMILAR TO ''ab|cd|efg''' FROM RDB$DATABASE;
SELECT IIF('efg' SIMILAR TO 'ab|cd|efg','true','false'),'true','''efg'' SIMILAR TO ''ab|cd|efg''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO 'ab|cd|efg','true','false'),'false','''a'' SIMILAR TO ''ab|cd|efg''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO 'a*','true','false'),'true',''''' SIMILAR TO ''a*''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO 'a*','true','false'),'true','''a'' SIMILAR TO ''a*''' FROM RDB$DATABASE;
SELECT IIF('aaa' SIMILAR TO 'a*','true','false'),'true','''aaa'' SIMILAR TO ''a*''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO 'a+','true','false'),'false',''''' SIMILAR TO ''a+''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO 'a+','true','false'),'true','''a'' SIMILAR TO ''a+''' FROM RDB$DATABASE;
SELECT IIF('aaa' SIMILAR TO 'a+','true','false'),'true','''aaa'' SIMILAR TO ''a+''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO 'a?','true','false'),'true',''''' SIMILAR TO ''a?''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO 'a?','true','false'),'true','''a'' SIMILAR TO ''a?''' FROM RDB$DATABASE;
SELECT IIF('aaa' SIMILAR TO 'a?','true','false'),'false','''aaa'' SIMILAR TO ''a?''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO 'a{2,}','true','false'),'false',''''' SIMILAR TO ''a{2,}''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO 'a{2,}','true','false'),'false','''a'' SIMILAR TO ''a{2,}''' FROM RDB$DATABASE;
SELECT IIF('aa' SIMILAR TO 'a{2,}','true','false'),'true','''aa'' SIMILAR TO ''a{2,}''' FROM RDB$DATABASE;
SELECT IIF('aaa' SIMILAR TO 'a{2,}','true','false'),'true','''aaa'' SIMILAR TO ''a{2,}''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO 'a{2,4}','true','false'),'false',''''' SIMILAR TO ''a{2,4}''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO 'a{2,4}','true','false'),'false','''a'' SIMILAR TO ''a{2,4}''' FROM RDB$DATABASE;
SELECT IIF('aa' SIMILAR TO 'a{2,4}','true','false'),'true','''aa'' SIMILAR TO ''a{2,4}''' FROM RDB$DATABASE;
SELECT IIF('aaa' SIMILAR TO 'a{2,4}','true','false'),'true','''aaa'' SIMILAR TO ''a{2,4}''' FROM RDB$DATABASE;
SELECT IIF('aaaa' SIMILAR TO 'a{2,4}','true','false'),'true','''aaaa'' SIMILAR TO ''a{2,4}''' FROM RDB$DATABASE;
SELECT IIF('aaaaa' SIMILAR TO 'a{2,4}','true','false'),'false','''aaaaa'' SIMILAR TO ''a{2,4}''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO '_','true','false'),'false',''''' SIMILAR TO ''_''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO '_','true','false'),'true','''a'' SIMILAR TO ''_''' FROM RDB$DATABASE;
SELECT IIF('1' SIMILAR TO '_','true','false'),'true','''1'' SIMILAR TO ''_''' FROM RDB$DATABASE;
SELECT IIF('a1' SIMILAR TO '_','true','false'),'false','''a1'' SIMILAR TO ''_''' FROM RDB$DATABASE;
SELECT IIF('' SIMILAR TO '%','true','false'),'true',''''' SIMILAR TO ''%''' FROM RDB$DATABASE;
SELECT IIF('az' SIMILAR TO 'a%z','true','false'),'true','''az'' SIMILAR TO ''a%z''' FROM RDB$DATABASE;
SELECT IIF('a123z' SIMILAR TO 'a%z','true','false'),'true','''a123z'' SIMILAR TO ''a%z''' FROM RDB$DATABASE;
SELECT IIF('azx' SIMILAR TO 'a%z','true','false'),'false','''azx'' SIMILAR TO ''a%z''' FROM RDB$DATABASE;
SELECT IIF('ab' SIMILAR TO '(ab){2}','true','false'),'false','''ab'' SIMILAR TO ''(ab){2}''' FROM RDB$DATABASE;
SELECT IIF('aabb' SIMILAR TO '(ab){2}','true','false'),'false','''aabb'' SIMILAR TO ''(ab){2}''' FROM RDB$DATABASE;
SELECT IIF('abab' SIMILAR TO '(ab){2}','true','false'),'true','''abab'' SIMILAR TO ''(ab){2}''' FROM RDB$DATABASE;
SELECT IIF('b' SIMILAR TO '[abc]','true','false'),'true','''b'' SIMILAR TO ''[abc]''' FROM RDB$DATABASE;
SELECT IIF('d' SIMILAR TO '[abc]','true','false'),'false','''d'' SIMILAR TO ''[abc]''' FROM RDB$DATABASE;
SELECT IIF('9' SIMILAR TO '[0-9]','true','false'),'true','''9'' SIMILAR TO ''[0-9]''' FROM RDB$DATABASE;
SELECT IIF('9' SIMILAR TO '[0-8]','true','false'),'false','''9'' SIMILAR TO ''[0-8]''' FROM RDB$DATABASE;
SELECT IIF('b' SIMILAR TO '[^abc]','true','false'),'false','''b'' SIMILAR TO ''[^abc]''' FROM RDB$DATABASE;
SELECT IIF('d' SIMILAR TO '[^abc]','true','false'),'true','''d'' SIMILAR TO ''[^abc]''' FROM RDB$DATABASE;
SELECT IIF('3' SIMILAR TO '[[:DIGIT:]^3]','true','false'),'false','''3'' SIMILAR TO ''[[:DIGIT:]^3]''' FROM RDB$DATABASE;
SELECT IIF('4' SIMILAR TO '[[:DIGIT:]^3]','true','false'),'true','''4'' SIMILAR TO ''[[:DIGIT:]^3]''' FROM RDB$DATABASE;
SELECT IIF('4' SIMILAR TO '[[:DIGIT:]]','true','false'),'true','''4'' SIMILAR TO ''[[:DIGIT:]]''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO '[[:DIGIT:]]','true','false'),'false','''a'' SIMILAR TO ''[[:DIGIT:]]''' FROM RDB$DATABASE;
SELECT IIF('4' SIMILAR TO '[^[:DIGIT:]]','true','false'),'false','''4'' SIMILAR TO ''[^[:DIGIT:]]''' FROM RDB$DATABASE;
SELECT IIF('a' SIMILAR TO '[^[:DIGIT:]]','true','false'),'true','''a'' SIMILAR TO ''[^[:DIGIT:]]''' FROM RDB$DATABASE;
"""
act_1 = isql_act('db_1', test_script_1, substitutions=substitutions_1)
expected_stdout_1 = """
CASE CONSTANT CONSTANT
====== ======== ===========================
true true 'ab' SIMILAR TO 'ab|cd|efg'
CASE CONSTANT CONSTANT
====== ======== ============================
true true 'efg' SIMILAR TO 'ab|cd|efg'
CASE CONSTANT CONSTANT
====== ======== ==========================
false false 'a' SIMILAR TO 'ab|cd|efg'
CASE CONSTANT CONSTANT
====== ======== ==================
true true '' SIMILAR TO 'a*'
CASE CONSTANT CONSTANT
====== ======== ===================
true true 'a' SIMILAR TO 'a*'
CASE CONSTANT CONSTANT
====== ======== =====================
true true 'aaa' SIMILAR TO 'a*'
CASE CONSTANT CONSTANT
====== ======== ==================
false false '' SIMILAR TO 'a+'
CASE CONSTANT CONSTANT
====== ======== ===================
true true 'a' SIMILAR TO 'a+'
CASE CONSTANT CONSTANT
====== ======== =====================
true true 'aaa' SIMILAR TO 'a+'
CASE CONSTANT CONSTANT
====== ======== ==================
true true '' SIMILAR TO 'a?'
CASE CONSTANT CONSTANT
====== ======== ===================
true true 'a' SIMILAR TO 'a?'
CASE CONSTANT CONSTANT
====== ======== =====================
false false 'aaa' SIMILAR TO 'a?'
CASE CONSTANT CONSTANT
====== ======== =====================
false false '' SIMILAR TO 'a{2,}'
CASE CONSTANT CONSTANT
====== ======== ======================
false false 'a' SIMILAR TO 'a{2,}'
CASE CONSTANT CONSTANT
====== ======== =======================
true true 'aa' SIMILAR TO 'a{2,}'
CASE CONSTANT CONSTANT
====== ======== ========================
true true 'aaa' SIMILAR TO 'a{2,}'
CASE CONSTANT CONSTANT
====== ======== ======================
false false '' SIMILAR TO 'a{2,4}'
CASE CONSTANT CONSTANT
====== ======== =======================
false false 'a' SIMILAR TO 'a{2,4}'
CASE CONSTANT CONSTANT
====== ======== ========================
true true 'aa' SIMILAR TO 'a{2,4}'
CASE CONSTANT CONSTANT
====== ======== =========================
true true 'aaa' SIMILAR TO 'a{2,4}'
CASE CONSTANT CONSTANT
====== ======== ==========================
true true 'aaaa' SIMILAR TO 'a{2,4}'
CASE CONSTANT CONSTANT
====== ======== ===========================
false false 'aaaaa' SIMILAR TO 'a{2,4}'
CASE CONSTANT CONSTANT
====== ======== =================
false false '' SIMILAR TO '_'
CASE CONSTANT CONSTANT
====== ======== ==================
true true 'a' SIMILAR TO '_'
CASE CONSTANT CONSTANT
====== ======== ==================
true true '1' SIMILAR TO '_'
CASE CONSTANT CONSTANT
====== ======== ===================
false false 'a1' SIMILAR TO '_'
CASE CONSTANT CONSTANT
====== ======== =================
true true '' SIMILAR TO '%'
CASE CONSTANT CONSTANT
====== ======== =====================
true true 'az' SIMILAR TO 'a%z'
CASE CONSTANT CONSTANT
====== ======== ========================
true true 'a123z' SIMILAR TO 'a%z'
CASE CONSTANT CONSTANT
====== ======== ======================
false false 'azx' SIMILAR TO 'a%z'
CASE CONSTANT CONSTANT
====== ======== =========================
false false 'ab' SIMILAR TO '(ab){2}'
CASE CONSTANT CONSTANT
====== ======== ===========================
false false 'aabb' SIMILAR TO '(ab){2}'
CASE CONSTANT CONSTANT
====== ======== ===========================
true true 'abab' SIMILAR TO '(ab){2}'
CASE CONSTANT CONSTANT
====== ======== ======================
true true 'b' SIMILAR TO '[abc]'
CASE CONSTANT CONSTANT
====== ======== ======================
false false 'd' SIMILAR TO '[abc]'
CASE CONSTANT CONSTANT
====== ======== ======================
true true '9' SIMILAR TO '[0-9]'
CASE CONSTANT CONSTANT
====== ======== ======================
false false '9' SIMILAR TO '[0-8]'
CASE CONSTANT CONSTANT
====== ======== =======================
false false 'b' SIMILAR TO '[^abc]'
CASE CONSTANT CONSTANT
====== ======== =======================
true true 'd' SIMILAR TO '[^abc]'
CASE CONSTANT CONSTANT
====== ======== ==============================
false false '3' SIMILAR TO '[[:DIGIT:]^3]'
CASE CONSTANT CONSTANT
====== ======== ==============================
true true '4' SIMILAR TO '[[:DIGIT:]^3]'
CASE CONSTANT CONSTANT
====== ======== ============================
true true '4' SIMILAR TO '[[:DIGIT:]]'
CASE CONSTANT CONSTANT
====== ======== ============================
false false 'a' SIMILAR TO '[[:DIGIT:]]'
CASE CONSTANT CONSTANT
====== ======== =============================
false false '4' SIMILAR TO '[^[:DIGIT:]]'
CASE CONSTANT CONSTANT
====== ======== =============================
true true 'a' SIMILAR TO '[^[:DIGIT:]]'
"""
@pytest.mark.version('>=3.0')
def test_1(act_1: Action):
act_1.expected_stdout = expected_stdout_1
act_1.execute()
assert act_1.clean_stdout == act_1.clean_expected_stdout
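The `SIMILAR TO` predicates this script exercises behave like anchored, whole-string regular-expression matches, which makes Python's `re.fullmatch` a rough analogue for the pure-regex cases. A sketch under that assumption (it ignores SQL's `%`/`_` wildcards, which would need translating to `.*`/`.`, and the `[[:DIGIT:]]` class syntax):

```python
import re

def similar_to(value, pattern):
    # SIMILAR TO matches the entire string, like re.fullmatch (unlike
    # re.match, which only anchors at the start).
    return re.fullmatch(pattern, value) is not None

# A few of the same cases the SQL script above checks:
assert similar_to('ab', 'ab|cd|efg')
assert not similar_to('a', 'ab|cd|efg')
assert similar_to('aaa', 'a{2,4}')
assert not similar_to('aaaaa', 'a{2,4}')
assert not similar_to('9', '[0-8]')
assert similar_to('abab', '(ab){2}')
```

The whole-string anchoring is the key point the test cases probe: `'a'` fails against `'ab|cd|efg'` even though it is a prefix of one alternative.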
533f271d1aafea8b6f3a2e92ff6fb443f4f5609e | 2,860 | py | Python | Square.py | Square5/Darkcyber | 0327b5fef137633b8f63c68f2d64e106a8660a3c | ["Apache-2.0"] | null | null | null | Square.py | Square5/Darkcyber | 0327b5fef137633b8f63c68f2d64e106a8660a3c | ["Apache-2.0"] | null | null | null | Square.py | Square5/Darkcyber | 0327b5fef137633b8f63c68f2d64e106a8660a3c | ["Apache-2.0"] | null | null | null
def square (a):
return a*a
x = input("enter your square :-")
y = square(int(x))
print("square of number is :-" + str(y))
print ("thank you😊😊😊😊")
72736e84597f25dbb1a297d41b610dd25d1cf79a | 10,153 | py | Python | tests/frameworks/test_starlette.py | rlopes-ki/python-sensor | 07e827f9982b2a0c482e8eab82d1a420923efd5e | ["MIT"] | 61 | 2017-09-27T02:50:17.000Z | 2022-03-22T12:13:37.000Z | tests/frameworks/test_starlette.py | rlopes-ki/python-sensor | 07e827f9982b2a0c482e8eab82d1a420923efd5e | ["MIT"] | 82 | 2017-07-11T13:47:33.000Z | 2022-03-22T10:10:38.000Z | tests/frameworks/test_starlette.py | rlopes-ki/python-sensor | 07e827f9982b2a0c482e8eab82d1a420923efd5e | ["MIT"] | 27 | 2017-09-11T16:22:32.000Z | 2022-03-11T17:21:49.000Z
# (c) Copyright IBM Corp. 2021
# (c) Copyright Instana Inc. 2020
from __future__ import absolute_import
import time
import pytest
import requests
import multiprocessing
from ..helpers import testenv
from instana.singletons import tracer
from ..helpers import get_first_span_by_filter
@pytest.fixture(scope="module")
def server():
from tests.apps.starlette_app import launch_starlette
proc = multiprocessing.Process(target=launch_starlette, args=(), daemon=True)
proc.start()
time.sleep(2)
yield
proc.kill() # Kill server after tests
def test_vanilla_get(server):
result = requests.get(testenv["starlette_server"] + '/')
assert(result)
spans = tracer.recorder.queued_spans()
# Starlette instrumentation (like all instrumentation) _always_ traces unless told otherwise
assert len(spans) == 1
assert spans[0].n == 'asgi'
assert "X-INSTANA-T" in result.headers
assert "X-INSTANA-S" in result.headers
assert "X-INSTANA-L" in result.headers
assert result.headers["X-INSTANA-L"] == '1'
assert "Server-Timing" in result.headers
def test_basic_get(server):
result = None
with tracer.start_active_span('test'):
result = requests.get(testenv["starlette_server"] + '/')
assert(result)
spans = tracer.recorder.queued_spans()
assert len(spans) == 3
span_filter = lambda span: span.n == "sdk" and span.data['sdk']['name'] == 'test'
test_span = get_first_span_by_filter(spans, span_filter)
assert(test_span)
span_filter = lambda span: span.n == "urllib3"
urllib3_span = get_first_span_by_filter(spans, span_filter)
assert(urllib3_span)
span_filter = lambda span: span.n == 'asgi'
asgi_span = get_first_span_by_filter(spans, span_filter)
assert(asgi_span)
assert(test_span.t == urllib3_span.t == asgi_span.t)
assert(asgi_span.p == urllib3_span.s)
assert(urllib3_span.p == test_span.s)
assert "X-INSTANA-T" in result.headers
assert result.headers["X-INSTANA-T"] == asgi_span.t
assert "X-INSTANA-S" in result.headers
assert result.headers["X-INSTANA-S"] == asgi_span.s
assert "X-INSTANA-L" in result.headers
assert result.headers["X-INSTANA-L"] == '1'
assert "Server-Timing" in result.headers
assert result.headers["Server-Timing"] == ("intid;desc=%s" % asgi_span.t)
assert(asgi_span.ec is None)
assert (asgi_span.data['http']['host'] == '127.0.0.1')
assert (asgi_span.data['http']['path'] == '/')
assert (asgi_span.data['http']['path_tpl'] == '/')
assert (asgi_span.data['http']['method'] == 'GET')
assert (asgi_span.data['http']['status'] == 200)
assert (asgi_span.data['http']['error'] is None)
assert (asgi_span.data['http']['params'] is None)
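The span-filter lambdas in this test rely on the `get_first_span_by_filter` helper imported from `..helpers`; a plausible implementation (the real helper may differ) is a simple first-match scan:

```python
def get_first_span_by_filter(spans, span_filter):
    # Walk the recorded spans in order and return the first one the
    # predicate accepts; None signals "no such span" to the caller,
    # which is why each test asserts on the result before using it.
    for span in spans:
        if span_filter(span):
            return span
    return None
```

Returning `None` rather than raising keeps the tests' `assert(asgi_span)` style meaningful: a missing span fails at the assertion with a clear location instead of inside the helper.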
def test_path_templates(server):
result = None
with tracer.start_active_span('test'):
result = requests.get(testenv["starlette_server"] + '/users/1')
assert(result)
spans = tracer.recorder.queued_spans()
assert len(spans) == 3
span_filter = lambda span: span.n == "sdk" and span.data['sdk']['name'] == 'test'
test_span = get_first_span_by_filter(spans, span_filter)
assert(test_span)
span_filter = lambda span: span.n == "urllib3"
urllib3_span = get_first_span_by_filter(spans, span_filter)
assert(urllib3_span)
span_filter = lambda span: span.n == 'asgi'
asgi_span = get_first_span_by_filter(spans, span_filter)
assert(asgi_span)
assert(test_span.t == urllib3_span.t == asgi_span.t)
assert(asgi_span.p == urllib3_span.s)
assert(urllib3_span.p == test_span.s)
assert "X-INSTANA-T" in result.headers
assert result.headers["X-INSTANA-T"] == asgi_span.t
assert "X-INSTANA-S" in result.headers
assert result.headers["X-INSTANA-S"] == asgi_span.s
assert "X-INSTANA-L" in result.headers
assert result.headers["X-INSTANA-L"] == '1'
assert "Server-Timing" in result.headers
assert result.headers["Server-Timing"] == ("intid;desc=%s" % asgi_span.t)
assert(asgi_span.ec is None)
assert (asgi_span.data['http']['host'] == '127.0.0.1')
assert (asgi_span.data['http']['path'] == '/users/1')
assert (asgi_span.data['http']['path_tpl'] == '/users/{user_id}')
assert (asgi_span.data['http']['method'] == 'GET')
assert (asgi_span.data['http']['status'] == 200)
assert (asgi_span.data['http']['error'] is None)
assert (asgi_span.data['http']['params'] is None)
def test_secret_scrubbing(server):
result = None
with tracer.start_active_span('test'):
result = requests.get(testenv["starlette_server"] + '/?secret=shhh')
assert(result)
spans = tracer.recorder.queued_spans()
assert len(spans) == 3
span_filter = lambda span: span.n == "sdk" and span.data['sdk']['name'] == 'test'
test_span = get_first_span_by_filter(spans, span_filter)
assert(test_span)
span_filter = lambda span: span.n == "urllib3"
urllib3_span = get_first_span_by_filter(spans, span_filter)
assert(urllib3_span)
span_filter = lambda span: span.n == 'asgi'
asgi_span = get_first_span_by_filter(spans, span_filter)
assert(asgi_span)
assert(test_span.t == urllib3_span.t == asgi_span.t)
assert(asgi_span.p == urllib3_span.s)
assert(urllib3_span.p == test_span.s)
assert "X-INSTANA-T" in result.headers
assert result.headers["X-INSTANA-T"] == asgi_span.t
assert "X-INSTANA-S" in result.headers
assert result.headers["X-INSTANA-S"] == asgi_span.s
assert "X-INSTANA-L" in result.headers
assert result.headers["X-INSTANA-L"] == '1'
assert "Server-Timing" in result.headers
assert result.headers["Server-Timing"] == ("intid;desc=%s" % asgi_span.t)
assert(asgi_span.ec is None)
assert (asgi_span.data['http']['host'] == '127.0.0.1')
assert (asgi_span.data['http']['path'] == '/')
assert (asgi_span.data['http']['path_tpl'] == '/')
assert (asgi_span.data['http']['method'] == 'GET')
assert (asgi_span.data['http']['status'] == 200)
assert (asgi_span.data['http']['error'] is None)
assert (asgi_span.data['http']['params'] == 'secret=<redacted>')
def test_synthetic_request(server):
    request_headers = {
        'X-INSTANA-SYNTHETIC': '1'
    }
    with tracer.start_active_span('test'):
        result = requests.get(testenv["starlette_server"] + '/', headers=request_headers)

    assert result

    spans = tracer.recorder.queued_spans()
    assert len(spans) == 3

    span_filter = lambda span: span.n == "sdk" and span.data['sdk']['name'] == 'test'
    test_span = get_first_span_by_filter(spans, span_filter)
    assert test_span

    span_filter = lambda span: span.n == "urllib3"
    urllib3_span = get_first_span_by_filter(spans, span_filter)
    assert urllib3_span

    span_filter = lambda span: span.n == 'asgi'
    asgi_span = get_first_span_by_filter(spans, span_filter)
    assert asgi_span

    assert test_span.t == urllib3_span.t == asgi_span.t
    assert asgi_span.p == urllib3_span.s
    assert urllib3_span.p == test_span.s

    assert "X-INSTANA-T" in result.headers
    assert result.headers["X-INSTANA-T"] == asgi_span.t
    assert "X-INSTANA-S" in result.headers
    assert result.headers["X-INSTANA-S"] == asgi_span.s
    assert "X-INSTANA-L" in result.headers
    assert result.headers["X-INSTANA-L"] == '1'
    assert "Server-Timing" in result.headers
    assert result.headers["Server-Timing"] == ("intid;desc=%s" % asgi_span.t)

    assert asgi_span.ec is None
    assert asgi_span.data['http']['host'] == '127.0.0.1'
    assert asgi_span.data['http']['path'] == '/'
    assert asgi_span.data['http']['path_tpl'] == '/'
    assert asgi_span.data['http']['method'] == 'GET'
    assert asgi_span.data['http']['status'] == 200
    assert asgi_span.data['http']['error'] is None
    assert asgi_span.data['http']['params'] is None

    assert asgi_span.sy
    assert urllib3_span.sy is None
    assert test_span.sy is None
def test_custom_header_capture(server):
    from instana.singletons import agent
    # The background Starlette server is pre-configured with custom headers to capture

    request_headers = {
        'X-Capture-This': 'this',
        'X-Capture-That': 'that'
    }
    with tracer.start_active_span('test'):
        result = requests.get(testenv["starlette_server"] + '/', headers=request_headers)

    assert result

    spans = tracer.recorder.queued_spans()
    assert len(spans) == 3

    span_filter = lambda span: span.n == "sdk" and span.data['sdk']['name'] == 'test'
    test_span = get_first_span_by_filter(spans, span_filter)
    assert test_span

    span_filter = lambda span: span.n == "urllib3"
    urllib3_span = get_first_span_by_filter(spans, span_filter)
    assert urllib3_span

    span_filter = lambda span: span.n == 'asgi'
    asgi_span = get_first_span_by_filter(spans, span_filter)
    assert asgi_span

    assert test_span.t == urllib3_span.t == asgi_span.t
    assert asgi_span.p == urllib3_span.s
    assert urllib3_span.p == test_span.s

    assert "X-INSTANA-T" in result.headers
    assert result.headers["X-INSTANA-T"] == asgi_span.t
    assert "X-INSTANA-S" in result.headers
    assert result.headers["X-INSTANA-S"] == asgi_span.s
    assert "X-INSTANA-L" in result.headers
    assert result.headers["X-INSTANA-L"] == '1'
    assert "Server-Timing" in result.headers
    assert result.headers["Server-Timing"] == ("intid;desc=%s" % asgi_span.t)

    assert asgi_span.ec is None
    assert asgi_span.data['http']['host'] == '127.0.0.1'
    assert asgi_span.data['http']['path'] == '/'
    assert asgi_span.data['http']['path_tpl'] == '/'
    assert asgi_span.data['http']['method'] == 'GET'
    assert asgi_span.data['http']['status'] == 200
    assert asgi_span.data['http']['error'] is None
    assert asgi_span.data['http']['params'] is None

    assert "X-Capture-This" in asgi_span.data["http"]["header"]
    assert asgi_span.data["http"]["header"]["X-Capture-This"] == "this"
    assert "X-Capture-That" in asgi_span.data["http"]["header"]
    assert asgi_span.data["http"]["header"]["X-Capture-That"] == "that"
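Every test above locates spans with `get_first_span_by_filter`, which this suite imports from its shared test utilities (not shown here). For readers without the harness, a minimal stand-in with the same presumed behavior — the helper name and span shape are assumptions, not the suite's actual code:

```python
from collections import namedtuple


def get_first_span_by_filter(spans, span_filter):
    """Return the first span for which the predicate is true, else None
    (hypothetical stand-in for the helper imported by this test suite)."""
    for span in spans:
        if span_filter(span):
            return span
    return None


# Tiny demo with stand-in span objects; real spans come from tracer.recorder.
Span = namedtuple('Span', 'n')
demo_spans = [Span('sdk'), Span('urllib3'), Span('asgi')]
asgi = get_first_span_by_filter(demo_spans, lambda s: s.n == 'asgi')
```

Filtering by predicate rather than position keeps the assertions robust to span-queue ordering.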
# -- imblearn/under_sampling/tests/test_instance_hardness_threshold.py
# -- repo: christophe-rannou/imbalanced-learn @ c3f3b0fd (MIT license)
"""Test the module ."""
from __future__ import print_function

import numpy as np
from numpy.testing import (assert_array_equal, assert_equal, assert_raises,
                           assert_raises_regex)
from sklearn.ensemble import GradientBoostingClassifier

from imblearn.under_sampling import InstanceHardnessThreshold

# Generate a global dataset to use
RND_SEED = 0
X = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
              [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
              [-0.03852113, 0.40910479], [-0.43877303, 1.07366684],
              [-0.85795321, 0.82980738], [-0.18430329, 0.52328473],
              [-0.30126957, -0.66268378], [-0.65571327, 0.42412021],
              [-0.28305528, 0.30284991], [0.20246714, -0.34727125],
              [1.06446472, -1.09279772], [0.30543283, -0.02589502],
              [-0.00717161, 0.00318087]])
Y = np.array([0, 1, 1, 0, 1, 1, 1, 1, 1, 0, 1, 1, 0, 0, 0])
ESTIMATOR = 'gradient-boosting'
def test_iht_wrong_estimator():
    # Resample the data
    ratio = 0.7
    est = 'rnd'
    iht = InstanceHardnessThreshold(
        estimator=est, ratio=ratio, random_state=RND_SEED)
    assert_raises(NotImplementedError, iht.fit_sample, X, Y)


def test_iht_init():
    # Define a ratio
    ratio = 'auto'
    iht = InstanceHardnessThreshold(
        ESTIMATOR, ratio=ratio, random_state=RND_SEED)

    assert_equal(iht.ratio, ratio)
    assert_equal(iht.random_state, RND_SEED)
def test_iht_fit_sample():
    # Resample the data
    iht = InstanceHardnessThreshold(ESTIMATOR, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_with_indices():
    # Resample the data
    iht = InstanceHardnessThreshold(
        ESTIMATOR, return_indices=True, random_state=RND_SEED)
    X_resampled, y_resampled, idx_under = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    idx_gt = np.array([0, 1, 2, 3, 5, 6, 7, 9, 10, 12, 13, 14])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)
    assert_array_equal(idx_under, idx_gt)


def test_iht_fit_sample_half():
    # Resample the data
    ratio = 0.7
    iht = InstanceHardnessThreshold(
        ESTIMATOR, ratio=ratio, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.03852113, 0.40910479], [-0.43877303, 1.07366684],
                     [-0.85795321, 0.82980738], [-0.18430329, 0.52328473],
                     [-0.30126957, -0.66268378], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)
def test_iht_fit_sample_knn():
    # Resample the data
    est = 'knn'
    iht = InstanceHardnessThreshold(est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.30126957, -0.66268378], [-0.65571327, 0.42412021],
                     [0.20246714, -0.34727125], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_decision_tree():
    # Resample the data
    est = 'decision-tree'
    iht = InstanceHardnessThreshold(est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_random_forest():
    # Resample the data
    est = 'random-forest'
    iht = InstanceHardnessThreshold(est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.03852113, 0.40910479], [-0.43877303, 1.07366684],
                     [-0.85795321, 0.82980738], [-0.18430329, 0.52328473],
                     [-0.65571327, 0.42412021], [-0.28305528, 0.30284991],
                     [1.06446472, -1.09279772], [0.30543283, -0.02589502],
                     [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_adaboost():
    # Resample the data
    est = 'adaboost'
    iht = InstanceHardnessThreshold(est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)
def test_iht_fit_sample_gradient_boosting():
    # Resample the data
    est = 'gradient-boosting'
    iht = InstanceHardnessThreshold(est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_linear_svm():
    # Resample the data
    est = 'linear-svm'
    iht = InstanceHardnessThreshold(est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.03852113, 0.40910479], [-0.43877303, 1.07366684],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_class_obj():
    # Resample the data
    est = GradientBoostingClassifier(random_state=RND_SEED)
    iht = InstanceHardnessThreshold(estimator=est, random_state=RND_SEED)
    X_resampled, y_resampled = iht.fit_sample(X, Y)

    X_gt = np.array([[-0.3879569, 0.6894251], [-0.09322739, 1.28177189],
                     [-0.77740357, 0.74097941], [0.91542919, -0.65453327],
                     [-0.43877303, 1.07366684], [-0.85795321, 0.82980738],
                     [-0.18430329, 0.52328473], [-0.65571327, 0.42412021],
                     [-0.28305528, 0.30284991], [1.06446472, -1.09279772],
                     [0.30543283, -0.02589502], [-0.00717161, 0.00318087]])
    y_gt = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0])
    assert_array_equal(X_resampled, X_gt)
    assert_array_equal(y_resampled, y_gt)


def test_iht_fit_sample_wrong_class_obj():
    # Resample the data
    from sklearn.cluster import KMeans
    est = KMeans()
    iht = InstanceHardnessThreshold(estimator=est, random_state=RND_SEED)
    assert_raises_regex(ValueError, "Invalid parameter `estimator`",
                        iht.fit_sample, X, Y)
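For readers unfamiliar with the sampler under test: instance-hardness under-sampling drops samples a classifier finds "hard" (low predicted probability for their true class). A toy, dependency-free sketch of that idea — the function name and fixed threshold are illustrative assumptions, not imblearn's actual selection logic (which also balances class counts against `ratio`):

```python
def keep_easy_samples(probabilities, labels, threshold=0.5):
    """Return indices of samples whose true-class probability meets the
    threshold, i.e. the 'easy' samples an instance-hardness under-sampler
    would tend to keep. Toy sketch only."""
    return [i for i, (p, y) in enumerate(zip(probabilities, labels))
            if p[y] >= threshold]


# Three samples with per-class probabilities [P(class 0), P(class 1)]:
kept = keep_easy_samples([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]], [0, 1, 1])
# → [0, 1]: the third sample is 'hard' (P of its true class is only 0.4)
```

In the real estimator the probabilities come from cross-validated predictions of the chosen `estimator`, which is why the tests above vary it from `'knn'` to `'gradient-boosting'`.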
#!/usr/bin/python
# -- quad_logger/scripts/bag_reader.py
# -- repo: robomechanics/quad-software @ 89154df1 (MIT license)
import rosbag
import rospy
import yaml
import numpy as np
from tf.transformations import euler_from_quaternion
topic_type_dict = {}
msg_types = ['nav_msgs/Odometry', 'sensor_msgs/Imu', 'geometry_msgs/PoseStamped',
'quadrotor_msgs/PositionCommand', 'quadrotor_msgs/TRPYCommand',
'quadrotor_msgs/SO3Command', 'sensor_msgs/Range',
'geometry_msgs/PoseWithCovarianceStamped']
var_types = ['x', 'y', 'z', 'vx', 'vy', 'vz',
'acc_x', 'acc_y', 'acc_z',
'roll', 'pitch', 'yaw',
'ang_vel_x', 'ang_vel_y', 'ang_vel_z']
def read_bag(bagfile):
    global inbag
    inbag = rosbag.Bag(bagfile, 'r')
    return read_topic_type()


def read_topic_type():
    # yaml.load without an explicit Loader is deprecated (and unsafe) in
    # PyYAML >= 5.1; SafeLoader is sufficient for rosbag's info output.
    info_dict = yaml.load(inbag._get_yaml_info(), Loader=yaml.SafeLoader)
    for x in info_dict['topics']:
        topic_type_dict[x['topic']] = x['type']
    return topic_type_dict
def read_msg(topics):
    data = {}
    if len(topics) > 0:
        # read_messages() yields (topic, message, timestamp) tuples; the
        # timestamp is unused here (each handler reads msg.header.stamp).
        for topic, msg, t in inbag.read_messages():
            if topics.count(topic):
                if topic_type_dict[topic] == 'nav_msgs/Odometry':
                    data = update_odometry(data, topic, msg)
                elif topic_type_dict[topic] == 'sensor_msgs/Imu':
                    data = update_imu(data, topic, msg)
                elif topic_type_dict[topic] == 'geometry_msgs/PoseStamped':
                    data = update_pose(data, topic, msg.pose, msg.header)
                elif topic_type_dict[topic] == 'quadrotor_msgs/PositionCommand':
                    data = update_pose_cmd(data, topic, msg)
                elif topic_type_dict[topic] == 'geometry_msgs/PoseWithCovarianceStamped':
                    data = update_pose(data, topic, msg.pose.pose, msg.header)
    return data
def update_odometry(data, topic, msg):
    quat = [msg.pose.pose.orientation.x, msg.pose.pose.orientation.y,
            msg.pose.pose.orientation.z, msg.pose.pose.orientation.w]
    [r, p, y] = euler_from_quaternion(quat)
    if topic in data:
        data[topic]['x'] = np.append(data[topic]['x'], msg.pose.pose.position.x)
        data[topic]['y'] = np.append(data[topic]['y'], msg.pose.pose.position.y)
        data[topic]['z'] = np.append(data[topic]['z'], msg.pose.pose.position.z)
        data[topic]['vx'] = np.append(data[topic]['vx'], msg.twist.twist.linear.x)
        data[topic]['vy'] = np.append(data[topic]['vy'], msg.twist.twist.linear.y)
        data[topic]['vz'] = np.append(data[topic]['vz'], msg.twist.twist.linear.z)
        data[topic]['roll'] = np.append(data[topic]['roll'], r)
        data[topic]['pitch'] = np.append(data[topic]['pitch'], p)
        data[topic]['yaw'] = np.append(data[topic]['yaw'], y)
        data[topic]['ang_vel_x'] = np.append(data[topic]['ang_vel_x'], msg.twist.twist.angular.x)
        data[topic]['ang_vel_y'] = np.append(data[topic]['ang_vel_y'], msg.twist.twist.angular.y)
        data[topic]['ang_vel_z'] = np.append(data[topic]['ang_vel_z'], msg.twist.twist.angular.z)
        data[topic]['t'] = np.append(data[topic]['t'], msg.header.stamp.to_sec())
    else:
        data[topic] = {}
        data[topic]['x'] = np.array([msg.pose.pose.position.x])
        data[topic]['y'] = np.array([msg.pose.pose.position.y])
        data[topic]['z'] = np.array([msg.pose.pose.position.z])
        data[topic]['vx'] = np.array([msg.twist.twist.linear.x])
        data[topic]['vy'] = np.array([msg.twist.twist.linear.y])
        data[topic]['vz'] = np.array([msg.twist.twist.linear.z])
        data[topic]['ang_vel_x'] = np.array([msg.twist.twist.angular.x])
        data[topic]['ang_vel_y'] = np.array([msg.twist.twist.angular.y])
        data[topic]['ang_vel_z'] = np.array([msg.twist.twist.angular.z])
        data[topic]['roll'] = np.array([r])
        data[topic]['pitch'] = np.array([p])
        data[topic]['yaw'] = np.array([y])
        data[topic]['t'] = np.array([msg.header.stamp.to_sec()])
    return data
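The orientation handling above relies on `tf.transformations.euler_from_quaternion`, which takes an `[x, y, z, w]` quaternion and returns roll, pitch, yaw. For readers without the tf package, a dependency-free sketch using the standard aerospace (ZYX) formulas — this matches tf's default axes for typical ROS use, but is an illustrative equivalent, not tf's code:

```python
import math


def euler_from_quaternion_rpy(quat):
    """Convert an [x, y, z, w] quaternion to (roll, pitch, yaw) using the
    common aerospace ZYX convention. The asin argument is clamped to guard
    against floating-point drift just outside [-1, 1]."""
    x, y, z, w = quat
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```

For example, a pure 90° yaw rotation `[0, 0, sin(pi/4), cos(pi/4)]` yields roll ≈ 0, pitch ≈ 0, yaw ≈ pi/2.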
def update_pose(data, topic, msg, header):
    quat = [msg.orientation.x, msg.orientation.y,
            msg.orientation.z, msg.orientation.w]
    [r, p, y] = euler_from_quaternion(quat)
    if topic in data:
        data[topic]['x'] = np.append(data[topic]['x'], msg.position.x)
        data[topic]['y'] = np.append(data[topic]['y'], msg.position.y)
        data[topic]['z'] = np.append(data[topic]['z'], msg.position.z)
        data[topic]['roll'] = np.append(data[topic]['roll'], r)
        data[topic]['pitch'] = np.append(data[topic]['pitch'], p)
        data[topic]['yaw'] = np.append(data[topic]['yaw'], y)
        data[topic]['t'] = np.append(data[topic]['t'], header.stamp.to_sec())
    else:
        data[topic] = {}
        data[topic]['x'] = np.array([msg.position.x])
        data[topic]['y'] = np.array([msg.position.y])
        data[topic]['z'] = np.array([msg.position.z])
        data[topic]['roll'] = np.array([r])
        data[topic]['pitch'] = np.array([p])
        data[topic]['yaw'] = np.array([y])
        data[topic]['t'] = np.array([header.stamp.to_sec()])
    return data
def update_imu(data, topic, msg):
    quat = [msg.orientation.x, msg.orientation.y,
            msg.orientation.z, msg.orientation.w]
    [r, p, y] = euler_from_quaternion(quat)
    if topic in data:
        data[topic]['acc_x'] = np.append(data[topic]['acc_x'], msg.linear_acceleration.x)
        data[topic]['acc_y'] = np.append(data[topic]['acc_y'], msg.linear_acceleration.y)
        data[topic]['acc_z'] = np.append(data[topic]['acc_z'], msg.linear_acceleration.z)
        data[topic]['ang_vel_x'] = np.append(data[topic]['ang_vel_x'], msg.angular_velocity.x)
        data[topic]['ang_vel_y'] = np.append(data[topic]['ang_vel_y'], msg.angular_velocity.y)
        data[topic]['ang_vel_z'] = np.append(data[topic]['ang_vel_z'], msg.angular_velocity.z)
        data[topic]['roll'] = np.append(data[topic]['roll'], r)
        data[topic]['pitch'] = np.append(data[topic]['pitch'], p)
        data[topic]['yaw'] = np.append(data[topic]['yaw'], y)
        data[topic]['t'] = np.append(data[topic]['t'], msg.header.stamp.to_sec())
    else:
        data[topic] = {}
        data[topic]['acc_x'] = np.array([msg.linear_acceleration.x])
        data[topic]['acc_y'] = np.array([msg.linear_acceleration.y])
        data[topic]['acc_z'] = np.array([msg.linear_acceleration.z])
        data[topic]['ang_vel_x'] = np.array([msg.angular_velocity.x])
        data[topic]['ang_vel_y'] = np.array([msg.angular_velocity.y])
        data[topic]['ang_vel_z'] = np.array([msg.angular_velocity.z])
        data[topic]['roll'] = np.array([r])
        data[topic]['pitch'] = np.array([p])
        data[topic]['yaw'] = np.array([y])
        data[topic]['t'] = np.array([msg.header.stamp.to_sec()])
    return data
def update_pose_cmd(data, topic, msg):
    if topic in data:
        data[topic]['x'] = np.append(data[topic]['x'], msg.position.x)
        data[topic]['y'] = np.append(data[topic]['y'], msg.position.y)
        data[topic]['z'] = np.append(data[topic]['z'], msg.position.z)
        data[topic]['vx'] = np.append(data[topic]['vx'], msg.velocity.x)
        data[topic]['vy'] = np.append(data[topic]['vy'], msg.velocity.y)
        data[topic]['vz'] = np.append(data[topic]['vz'], msg.velocity.z)
        data[topic]['acc_x'] = np.append(data[topic]['acc_x'], msg.acceleration.x)
        data[topic]['acc_y'] = np.append(data[topic]['acc_y'], msg.acceleration.y)
        data[topic]['acc_z'] = np.append(data[topic]['acc_z'], msg.acceleration.z)
        data[topic]['yaw'] = np.append(data[topic]['yaw'], msg.yaw)
        data[topic]['t'] = np.append(data[topic]['t'], msg.header.stamp.to_sec())
    else:
        data[topic] = {}
        data[topic]['x'] = np.array([msg.position.x])
        data[topic]['y'] = np.array([msg.position.y])
        data[topic]['z'] = np.array([msg.position.z])
        data[topic]['vx'] = np.array([msg.velocity.x])
        data[topic]['vy'] = np.array([msg.velocity.y])
        data[topic]['vz'] = np.array([msg.velocity.z])
        data[topic]['acc_x'] = np.array([msg.acceleration.x])
        data[topic]['acc_y'] = np.array([msg.acceleration.y])
        data[topic]['acc_z'] = np.array([msg.acceleration.z])
        data[topic]['yaw'] = np.array([msg.yaw])
        data[topic]['t'] = np.array([msg.header.stamp.to_sec()])
    return data
def update_trpy_cmd(data, topic, msg):
    if topic in data:
        data[topic]['roll'] = np.append(data[topic]['roll'], msg.roll)
        data[topic]['pitch'] = np.append(data[topic]['pitch'], msg.pitch)
        data[topic]['yaw'] = np.append(data[topic]['yaw'], msg.yaw)
        data[topic]['t'] = np.append(data[topic]['t'], msg.header.stamp.to_sec())
    else:
        data[topic] = {}
        data[topic]['roll'] = np.array([msg.roll])
        data[topic]['pitch'] = np.array([msg.pitch])
        data[topic]['yaw'] = np.array([msg.yaw])
        data[topic]['t'] = np.array([msg.header.stamp.to_sec()])
    return data
def update_so3_cmd(data, topic, msg):
    quat = [msg.orientation.x, msg.orientation.y,
            msg.orientation.z, msg.orientation.w]
    [r, p, y] = euler_from_quaternion(quat)
    if topic in data:
        data[topic]['ang_vel_x'] = np.append(data[topic]['ang_vel_x'], msg.angular_velocity.x)
        data[topic]['ang_vel_y'] = np.append(data[topic]['ang_vel_y'], msg.angular_velocity.y)
        data[topic]['ang_vel_z'] = np.append(data[topic]['ang_vel_z'], msg.angular_velocity.z)
        data[topic]['t'] = np.append(data[topic]['t'], msg.header.stamp.to_sec())
        data[topic]['roll'] = np.append(data[topic]['roll'], r)
        data[topic]['pitch'] = np.append(data[topic]['pitch'], p)
        # Append yaw exactly once per message so every array stays the same
        # length (the original appended it twice in this branch).
        data[topic]['yaw'] = np.append(data[topic]['yaw'], y)
    else:
        data[topic] = {}
        # Wrap scalars in lists so np.array yields 1-D arrays that np.append
        # can extend, matching the other update_* helpers.
        data[topic]['ang_vel_x'] = np.array([msg.angular_velocity.x])
        data[topic]['ang_vel_y'] = np.array([msg.angular_velocity.y])
        data[topic]['ang_vel_z'] = np.array([msg.angular_velocity.z])
        data[topic]['t'] = np.array([msg.header.stamp.to_sec()])
        data[topic]['roll'] = np.array([r])
        data[topic]['pitch'] = np.array([p])
        data[topic]['yaw'] = np.array([y])
    return data
def update_range(data, topic, msg):
    if topic in data:
        data[topic]['z'] = np.append(data[topic]['z'], msg.range)
        data[topic]['t'] = np.append(data[topic]['t'], msg.header.stamp.to_sec())
    else:
        data[topic] = {}
        data[topic]['z'] = np.array([msg.range])
        data[topic]['t'] = np.array([msg.header.stamp.to_sec()])
    return data
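Every `update_*` helper above repeats the same append-or-initialize branching per field. With stdlib containers the shared pattern collapses to one generic helper — a sketch with hypothetical field names, not part of the original script (which stores numpy arrays rather than lists):

```python
from collections import defaultdict


def update_generic(data, topic, fields):
    """Append each field value onto per-topic lists, creating the topic's
    containers on first sight; this is the same append-or-initialize
    pattern the update_* functions implement with np.append/np.array."""
    series = data.setdefault(topic, defaultdict(list))
    for name, value in fields.items():
        series[name].append(value)
    return data


log = {}
update_generic(log, '/odom', {'z': 1.0, 't': 0.1})
update_generic(log, '/odom', {'z': 1.5, 't': 0.2})
```

A list-based accumulator like this is also cheaper than repeated `np.append` (which copies the whole array each call); converting to arrays once at the end would serve the same plotting use case.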
if __name__ == "__main__":
    read_topic_type()
# -- benchmarks/SimResults/_bigLittle_hrrs_splash_tugberk_ml/ratio_based_results/cmp_fft/power.py
# -- repo: TugberkArkose/MLScheduler @ e493b6cb (Unlicense)
power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.122614,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.298995,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.78419,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.608634,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 1.05393,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.604461,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 2.26703,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.481382,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 7.16826,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.14815,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0220635,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.200489,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.163173,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.34864,
'Execution Unit/Register Files/Runtime Dynamic': 0.185236,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.518056,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 1.24113,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 4.39777,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00301262,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00301262,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00260934,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.0010021,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00234399,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.0109786,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0294082,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.156862,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 6.43323,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.498823,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.532774,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 8.96874,
'Instruction Fetch Unit/Runtime Dynamic': 1.22885,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.105048,
'L2/Runtime Dynamic': 0.041173,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 4.73164,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.78899,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.113056,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.113056,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 5.26769,
'Load Store Unit/Runtime Dynamic': 2.4596,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.278777,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.557553,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0989388,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.100513,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.399995,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0817832,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.729539,
'Memory Management Unit/Runtime Dynamic': 0.182297,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 26.801,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.516862,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.0373417,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.308411,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.862615,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 9.1723,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0634888,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.252556,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.40766,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.278722,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.449568,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.226927,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.955217,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.256278,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 5.00562,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0770157,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0116909,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.105674,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.086461,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.18269,
'Execution Unit/Register Files/Runtime Dynamic': 0.0981519,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.238502,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.580619,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 2.29192,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00171084,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00171084,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.0015125,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000597744,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00124202,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00617619,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0156043,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0831172,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 5.28697,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.27432,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.282303,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 7.76207,
'Instruction Fetch Unit/Runtime Dynamic': 0.661521,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0267443,
'L2/Runtime Dynamic': 0.0102821,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.11633,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.919693,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0607971,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.060797,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.40343,
'Load Store Unit/Runtime Dynamic': 1.28032,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.149915,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.29983,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0532054,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0536054,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.328724,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.044975,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.576231,
'Memory Management Unit/Runtime Dynamic': 0.0985804,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 20.3636,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.202593,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0150407,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.138811,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.356445,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.69907,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0614842,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.250981,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.391445,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.272128,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.438933,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.221558,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.932619,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.251221,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.97023,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0739523,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0114143,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.103142,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0844156,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.177095,
'Execution Unit/Register Files/Runtime Dynamic': 0.0958299,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.232667,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.565191,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 2.25,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00162206,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00162206,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00143327,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000566026,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00121264,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00589003,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0148215,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0811509,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 5.16189,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.274289,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.275625,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 7.63092,
'Instruction Fetch Unit/Runtime Dynamic': 0.651776,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0229731,
'L2/Runtime Dynamic': 0.0080628,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 3.02266,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.866568,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0577665,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0577665,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.29545,
'Load Store Unit/Runtime Dynamic': 1.20922,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.142442,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.284885,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0505533,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.050897,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.320947,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0449689,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.563898,
'Memory Management Unit/Runtime Dynamic': 0.0958659,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 20.0729,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.194534,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0146451,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.135756,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.344936,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 4.55986,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0468765,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.239507,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.295804,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.21292,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.343432,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.173353,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.729705,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.198167,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.70743,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0558838,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00893082,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0803962,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0660489,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.13628,
'Execution Unit/Register Files/Runtime Dynamic': 0.0749797,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.181094,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.443218,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.89279,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00139563,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00139563,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00123336,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000487173,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000948797,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00497342,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.0127463,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0634945,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 4.0388,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.215814,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.215656,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 6.45332,
'Instruction Fetch Unit/Runtime Dynamic': 0.512684,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0141483,
'L2/Runtime Dynamic': 0.00559648,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.71275,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.715045,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0477401,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0477402,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.93819,
'Load Store Unit/Runtime Dynamic': 0.998224,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.117719,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.235438,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0417788,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0419892,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.251117,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0353855,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.478995,
'Memory Management Unit/Runtime Dynamic': 0.0773747,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 18.1815,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.147004,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0113954,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.106111,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.264511,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.75118,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 4.29734336729898,
'Runtime Dynamic': 4.29734336729898,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.150013,
'Runtime Dynamic': 0.0931411,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 85.569,
'Peak Power': 118.681,
'Runtime Dynamic': 22.2755,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 85.419,
'Total Cores/Runtime Dynamic': 22.1824,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.150013,
'Total L3s/Runtime Dynamic': 0.0931411,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}}
f40dcc7769cac411e9b4292d9e5c8381e561536c | 142 | py | Python | discord/ext/commands/cog.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | ["MIT"]
from disnake.ext.commands.cog import *
from disnake.ext.commands.cog import __dict__ as __original_dict__
locals().update(__original_dict__)
f425c5a86246381536fc61c0082bfebbc93e4460 | 9,917 | py | Python | wilson/wcxf/converters/SMEFTsim_param_card_elements.py | jackypheno/wilson | 81c905c7c1e4cf26b5d5a1a2b7b316f1f4f9d285 | ["MIT"]
preamble_A = '''
######################################################################
## PARAM_CARD FOR SMEFTSIM SET A v2.0 - FLAVOR_GENERAL ALPHA_INPUTS #####
######################################################################
###################################
## INFORMATION FOR SMINPUTS
###################################
Block SMINPUTS
1 7.815553e-03 # aEW
2 1.166379e-05 # Gf
3 1.181000e-01 # aS
###################################
## INFORMATION FOR MASS
###################################
Block MASS
1 4.700000e-03 # MD
2 2.200000e-03 # MU
3 9.600000e-02 # MS
4 1.280000e+00 # MC
5 4.180000e+00 # MB
6 1.732000e+02 # MT
11 5.110000e-04 # Me
13 1.056600e-01 # MMU
15 1.777000e+00 # MTA
23 9.118760e+01 # MZ
25 1.250900e+02 # MH
## Not dependent paramater.
## Those values should be edited following analytical the
## analytical expression. Some generator could simply ignore
## those values and use the analytical expression
22 0.000000 # a : 0.0
24 73.187688 # W+ : dMW + MW0
21 0.000000 # g : 0.0
9000001 0.000000 # ghA : 0.0
9000003 73.187688 # ghWp : dMW + MW0
9000004 73.187688 # ghWm : dMW + MW0
82 0.000000 # ghG : 0.0
12 0.000000 # ve : 0.0
14 0.000000 # vm : 0.0
16 0.000000 # vt : 0.0
251 73.187688 # G+ : dMW + MW0
9000002 91.187600 # ghZ : MZ
250 91.187600 # G0 : MZ
###################################
## INFORMATION FOR DECAY
###################################
DECAY 6 1.508336e+00
DECAY 23 2.495200e+00
DECAY 24 2.085000e+00
DECAY 25 4.070000e-03
## Not dependent paramater.
## Those values should be edited following analytical the
## analytical expression. Some generator could simply ignore
## those values and use the analytical expression
DECAY 22 0.000000 # a : 0.0
DECAY 21 0.000000 # g : 0.0
DECAY 9000001 0.000000 # ghA : 0.0
DECAY 82 0.000000 # ghG : 0.0
DECAY 12 0.000000 # ve : 0.0
DECAY 14 0.000000 # vm : 0.0
DECAY 16 0.000000 # vt : 0.0
DECAY 11 0.000000 # e- : 0.0
DECAY 13 0.000000 # mu- : 0.0
DECAY 15 0.000000 # ta- : 0.0
DECAY 2 0.000000 # u : 0.0
DECAY 4 0.000000 # c : 0.0
DECAY 1 0.000000 # d : 0.0
DECAY 3 0.000000 # s : 0.0
DECAY 5 0.000000 # b : 0.0
DECAY 9000002 2.495200 # ghZ : WZ
DECAY 9000003 2.085000 # ghWp : WW
DECAY 9000004 2.085000 # ghWm : WW
DECAY 250 2.495200 # G0 : WZ
DECAY 251 2.085000 # G+ : WW
###################################
## INFORMATION FOR CKMBLOCK
###################################
Block CKMBLOCK
1 2.277360e-01 # cabi
2 2.250600e-01 # CKMlambda
3 8.110000e-01 # CKMA
4 1.240000e-01 # CKMrho
5 3.560000e-01 # CKMeta
'''
postamble_A = '''
###################################
## INFORMATION FOR YUKAWA
###################################
Block YUKAWA
1 4.700000e-03 # ymdo
2 2.200000e-03 # ymup
3 9.600000e-02 # yms
4 1.280000e+00 # ymc
5 4.180000e+00 # ymb
6 1.732000e+02 # ymt
11 5.110000e-04 # yme
13 1.056600e-01 # ymm
15 1.777000e+00 # ymtau
#===========================================================
# QUANTUM NUMBERS OF NEW STATE(S) (NON SM PDG CODE)
#===========================================================
Block QNUMBERS 9000001 # ghA
1 0 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 9000002 # ghZ
1 0 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 9000003 # ghWp
1 3 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 9000004 # ghWm
1 -3 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 82 # ghG
1 0 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 8 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 250 # G0
1 0 # 3 times electric charge
2 1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 0 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 251 # G+
1 3 # 3 times electric charge
2 1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
'''
preamble_B = '''
######################################################################
## PARAM_CARD FOR SMEFTSIM SET B - FLAVOR_GENERAL ALPHA_INPUTS #####
######################################################################
###################################
## INFORMATION FOR SMINPUTS
###################################
Block SMINPUTS
1 1.279000e+02 # aEWM1
2 1.166370e-05 # Gf
3 1.181000e-01 # aS
###################################
## INFORMATION FOR MASS
###################################
Block MASS
1 4.700000e-03 # MD
2 2.200000e-03 # MU
3 9.600000e-02 # MS
4 1.280000e+00 # MC
5 4.180000e+00 # MB
6 1.731000e+02 # MT
11 5.110000e-04 # Me
13 1.056600e-01 # MMU
15 1.777000e+00 # MTA
23 9.118750e+01 # MZ
25 1.250900e+02 # MH
## Not dependent paramater.
## Those values should be edited following analytical the
## analytical expression. Some generator could simply ignore
## those values and use the analytical expression
22 0.000000 # a : 0.0
24 79.824234 # W+ : dMW + MW0
21 0.000000 # g : 0.0
9000001 0.000000 # ghA : 0.0
9000003 79.824234 # ghWp : dMW + MW0
9000004 79.824234 # ghWm : dMW + MW0
82 0.000000 # ghG : 0.0
12 0.000000 # ve : 0.0
14 0.000000 # vm : 0.0
16 0.000000 # vt : 0.0
251 79.824234 # G+ : dMW + MW0
9000002 91.187500 # ghZ : MZ
250 91.187500 # G0 : MZ
###################################
## INFORMATION FOR DECAY
###################################
DECAY 6 1.508336e+00
DECAY 23 2.495200e+00
DECAY 24 2.085000e+00
DECAY 25 4.070000e-03
## Not dependent paramater.
## Those values should be edited following analytical the
## analytical expression. Some generator could simply ignore
## those values and use the analytical expression
DECAY 22 0.000000 # a : 0.0
DECAY 21 0.000000 # g : 0.0
DECAY 9000001 0.000000 # ghA : 0.0
DECAY 82 0.000000 # ghG : 0.0
DECAY 12 0.000000 # ve : 0.0
DECAY 14 0.000000 # vm : 0.0
DECAY 16 0.000000 # vt : 0.0
DECAY 11 0.000000 # e- : 0.0
DECAY 13 0.000000 # mu- : 0.0
DECAY 15 0.000000 # ta- : 0.0
DECAY 2 0.000000 # u : 0.0
DECAY 4 0.000000 # c : 0.0
DECAY 1 0.000000 # d : 0.0
DECAY 3 0.000000 # s : 0.0
DECAY 5 0.000000 # b : 0.0
DECAY 9000002 2.495200 # ghZ : WZ
DECAY 9000003 2.085000 # ghWp : WW
DECAY 9000004 2.085000 # ghWm : WW
DECAY 250 2.495200 # G0 : WZ
DECAY 251 2.085000 # G+ : WW
###################################
## INFORMATION FOR CKMBLOCK
###################################
Block CKMBLOCK
1 2.277360e-01 # cabi
2 9.743400e-01 # CKM11
3 2.250600e-01 # CKM12
4 3.570000e-03 # CKM13
5 2.249200e-01 # CKM21
6 9.735100e-01 # CKM22
7 4.110000e-02 # CKM23
8 8.750000e-03 # CKM31
9 4.030000e-02 # CKM32
10 9.991500e-01 # CKM33
'''
postamble_B = '''
###################################
## INFORMATION FOR NEWCOUPAUX
###################################
Block NEWCOUPaux
0 1.000000e+00 # WC
###################################
## INFORMATION FOR YUKAWA
###################################
Block YUKAWA
1 4.700000e-03 # ymdo
2 2.200000e-03 # ymup
3 9.600000e-02 # yms
4 1.280000e+00 # ymc
5 4.180000e+00 # ymb
6 1.731000e+02 # ymt
11 5.110000e-04 # yme
13 1.056600e-01 # ymm
15 1.777000e+00 # ymtau
#===========================================================
# QUANTUM NUMBERS OF NEW STATE(S) (NON SM PDG CODE)
#===========================================================
Block QNUMBERS 9000001 # ghA
1 0 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 9000002 # ghZ
1 0 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 9000003 # ghWp
1 3 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 9000004 # ghWm
1 -3 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 82 # ghG
1 0 # 3 times electric charge
2 -1 # number of spin states (2S+1)
3 8 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 250 # G0
1 0 # 3 times electric charge
2 1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 0 # Particle/Antiparticle distinction (0=own anti)
Block QNUMBERS 251 # G+
1 3 # 3 times electric charge
2 1 # number of spin states (2S+1)
3 1 # colour rep (1: singlet, 3: triplet, 8: octet)
4 1 # Particle/Antiparticle distinction (0=own anti)
'''
f452c960b3eea444afb1887c26af00f4f06054b2 | 17,264 | py | Python | heap.py | Wyzzard123/heap_py | aeedadf82defb4b73b69d09695d335a10267783c | ["MIT"]
""" Consolidating concepts by implementing heap creation
Originally by: Wyzzard123
Feel free to make changes."""
# Heap Array Representation Rules:
# for i starting at 1:
#   for a node at i, the parent node is at i // 2
#   for a node at i, the left child node is at i * 2
#   for a node at i, the right child node is at (i * 2) + 1
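# Quick illustrative sketch (added for illustration; not part of the original
# file): the index rules above can be written as tiny helpers and spot-checked.
def _parent(i):
    """1-based index of the parent of node i."""
    return i // 2

def _left_child(i):
    """1-based index of the left child of node i."""
    return i * 2

def _right_child(i):
    """1-based index of the right child of node i."""
    return i * 2 + 1

# e.g. node 3 has parent 1 and children 6 and 7:
# _parent(3) == 1, _left_child(3) == 6, _right_child(3) == 7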
def heap_insert(heap, value, heap_type="max"):
    """Append value to a max or min heap and sift it up to restore the heap property."""
    heap.append(value)
    # 1-based index of the new value; the parent of node i is at i // 2.
    current_index = len(heap)
    current_i = current_index - 1
    parent_index = current_index // 2
    if heap_type == "max":
        # Swap the new value upwards while it is larger than its parent.
        while heap[parent_index - 1] < heap[current_i]:
            heap[parent_index - 1], heap[current_i] = heap[current_i], heap[parent_index - 1]
            current_index //= 2
            parent_index = current_index // 2
            current_i = current_index - 1
            if current_i == 0:  # reached the root
                break
    elif heap_type == "min":
        # Swap the new value upwards while it is smaller than its parent.
        while heap[parent_index - 1] > heap[current_i]:
            heap[parent_index - 1], heap[current_i] = heap[current_i], heap[parent_index - 1]
            current_index //= 2
            parent_index = current_index // 2
            current_i = current_index - 1
            if current_i == 0:  # reached the root
                break
    return heap
def heap_create(array, type="max"):
    """Create a max or min heap from a given array in O(n log n) time by sifting up each element."""
    n = len(array)
    heap = [None] * n
    # Max Heap
    if type == "max":
        # starting from 1 to handle the index arithmetic mathematically
        for index in range(1, n + 1):
            parent_index = index // 2
            # insert the next element into the heap (-1 compensates for 0-counting)
            heap[index - 1] = array[index - 1]
            if heap[parent_index - 1] is not None:
                current_index = index
                current_i = index - 1
                # Sift up while the new element beats its parent
                while heap[parent_index - 1] < heap[current_i]:
                    heap[parent_index - 1], heap[current_i] = heap[current_i], heap[parent_index - 1]
                    current_index //= 2
                    parent_index = current_index // 2
                    current_i = current_index - 1
                    if current_i == 0:
                        break
    # Min Heap
    if type == "min":
        for index in range(1, n + 1):
            parent_index = index // 2
            heap[index - 1] = array[index - 1]
            if heap[parent_index - 1] is not None:
                current_index = index
                current_i = index - 1
                while heap[parent_index - 1] > heap[current_i]:
                    heap[parent_index - 1], heap[current_i] = heap[current_i], heap[parent_index - 1]
                    current_index //= 2
                    parent_index = current_index // 2
                    current_i = current_index - 1
                    if current_i == 0:
                        break
    return heap
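# The repeated sift-up in heap_create mirrors pushing elements one by one with the
# standard library's heapq (min heaps only); a self-contained sketch that verifies
# the resulting heap property using the same 1-based index rules:

```python
import heapq

values = [30, 123, 12, 321, 10, 442, 13]
heap = []
for v in values:
    heapq.heappush(heap, v)  # sift-up insertion, analogous to heap_insert(..., type="min")

# Min-heap property: every parent (1-based index i) is <= both of its children.
n = len(heap)
for i in range(1, n + 1):
    for child in (i * 2, i * 2 + 1):
        if child <= n:
            assert heap[i - 1] <= heap[child - 1]
print(heap[0])  # prints 10, the minimum, which sits at the root
```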
# array = [30, 123, 12, 321, 10, 442, 13, 320, 111, 310, 11, 1]
# # array = [1, 2, 3, 4, 5]
# heap = heap_create(array, "max")
# print("array is", array)
# print(heap)
# heap_insert(heap,554)
# print(heap)
def heapify(array, type="max"):
    """Create a max or min heap from a given array in O(n) time (bottom-up sift-down)."""
    # TODO
    pass
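# The TODO above could follow the classic bottom-up approach; heapq.heapify
# implements it for min heaps (a self-contained reference point, not the planned
# implementation of this function):

```python
import heapq

data = [30, 123, 12, 321, 10, 442, 13]
heapq.heapify(data)   # bottom-up sift-down build: O(n) total, not n * O(log n)
assert data[0] == 10  # minimum at the root
```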
def heap_delete(heap, type="max"):
    """Delete from the top of a max or min heap in O(log n) time. Returns the deleted value (which can be used in heap sort)."""
    deleted_element = heap[0]  # Store the root to be returned
    # Replace the root with the last element, shrink the heap, then sift down
    heap[0] = heap[-1]
    heap.pop()
    n = len(heap)
    # For a max heap the winner is the larger value; for a min heap, the smaller
    wins = (lambda a, b: a > b) if type == "max" else (lambda a, b: a < b)
    current_index = 1  # 1-based index of the node being sifted down
    # Loop while a left child exists, because this is a complete binary tree
    while current_index * 2 <= n:
        left_child = current_index * 2
        right_child = left_child + 1
        # Pick the child to compare against: the right child only when it strictly
        # beats the left one (on ties the left child is swapped by default)
        child = left_child
        if right_child <= n and wins(heap[right_child - 1], heap[left_child - 1]):
            child = right_child
        if wins(heap[child - 1], heap[current_index - 1]):
            heap[child - 1], heap[current_index - 1] = heap[current_index - 1], heap[child - 1]
            current_index = child
        else:
            # The sorting is done: the current node already beats both children
            break
    return deleted_element
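# Repeatedly removing the root, as heap_delete does, yields the elements in sorted
# order; the same idea with heapq (self-contained, min-heap variant):

```python
import heapq

data = [30, 123, 12, 321, 10]
heapq.heapify(data)
# Each pop removes the root and sifts the moved last element down: O(log n) per delete.
out = [heapq.heappop(data) for _ in range(len(data))]
print(out)  # [10, 12, 30, 123, 321]
```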
# heap_delete(heap)
# print(heap)
def heap_sort_from_heap(heap, type="max"):
    """Return a sorted array in O(n log n) time by deleting from the top of the heap
    (ascending for a max heap, descending for a min heap). The heap itself is left unchanged."""
    temp_heap = list(heap)  # Save a copy so the heap can be restored afterwards
    n = len(heap)
    sorted_heap = [None] * n
    for i in range(n):
        sorted_heap[n - 1 - i] = heap_delete(heap, type)
    # Copy the old values back into heap to ensure that the heap itself is not changed
    heap += temp_heap
    return sorted_heap
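# heap_sort_from_heap preserves its input by saving a copy and re-extending the
# emptied list; the same non-destructive pattern in miniature, with heapq standing
# in for the heap_delete loop (illustrative sketch):

```python
import heapq

heap = [1, 3, 5, 12, 10]  # already a valid min heap
work = list(heap)         # operate on a copy so the caller's heap survives
result = [heapq.heappop(work) for _ in range(len(work))]
print(result)  # [1, 3, 5, 10, 12]
print(heap)    # unchanged: [1, 3, 5, 12, 10]
```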
# # You must reassign the heap to another variable
# sorted_heap = heap_sort_from_heap(heap)
# # Heap is sorted
# print(sorted_heap)
# # Heap is replicated into place
# print(heap)
def heap_sort_from_unsorted(array, reverse="false"):
    """Return a sorted array in O(n log n) time by building a heap and repeatedly deleting from its top.
    This does not modify the original array, and the result must be assigned.
    Takes a second parameter reverse, which can be the string 'true' or 'false'."""
    if reverse in ("false", "False"):
        heap = heap_create(array, "max")
        n = len(heap)
        sorted_array = [None] * n
        for i in range(n):
            sorted_array[n - 1 - i] = heap_delete(heap, "max")
    if reverse in ("true", "reversed", "True", "Reversed"):
        heap = heap_create(array, "min")
        n = len(heap)
        sorted_array = [None] * n
        for i in range(n):
            sorted_array[n - 1 - i] = heap_delete(heap, "min")
    return sorted_array
array2 = [1, 3, 4, 51,3,53,45 ,234,5 ,32,534, 213, 21, 214124213, 1231, 1231, 123213, 5136, 1324, 634, 634, 632, 246, 4253, 632, 3, 4, 51, 231, 1251, 23, 12, 3115, 122, 1231, 232, 111, 321, 421, 231, 24, 123, 41241, 23, 12312, 123, 12,3 ,123 ,2132,1,32 ,123,213,12, 421,55, 31,5,325 ,23, 53,215,23,523,15,532,6,783,73,45 ,3,53,45 ,234,5 ,32,534, 213, 21, 214124213, 1231, 1231, 123213, 5136, 1324, 634, 634, 632, 246, 4253, 632, 3, 4, 51, 231, 1251, 23, 12, 3115, 122, 1231, 232, 111, 321, 421, 231, 24, 123, 41241, 23, 12312, 123, 12,3 ,123 ,2132,1,32 ,123,213,12, 421,55, 31,5,325 ,23, 53,215,23,523,15,532,6,783,73,45 ,3,53,45 ,234,5 ,32,534, 213, 21, 214124213, 1231, 1231, 123213, 5136, 1324, 634, 634, 632, 246, 4253, 632]
# array2 = [52, 23, 1, 52, 1, 23, 12, 21, 1]
import time
print("Sorting by max")
print("original: \n", array2)
t0 = time.time()
sorted_array = heap_sort_from_unsorted(array2, "false")
t1 = time.time()
print(f"Time for heapsort max is: {t1 - t0:f}")
print("sorted: \n", sorted_array)
print("Sorting by min")
print("original: \n", array2)
t0 = time.time()
sorted_array = heap_sort_from_unsorted(array2, "true")
t1 = time.time()
print(f"Time for heapsort min is: {t1 - t0:f}")
print("sorted: \n", sorted_array)
t0 = time.time()
python_sorted_array = sorted(array2)
t1 = time.time()
print(f"Time for python sort is: {t1 - t0:f}")
print("sorted: \n", python_sorted_array)
t0 = time.time()
python_sorted_array_reversed = sorted(array2, reverse=True)
t1 = time.time()
print(f"Time for python sort reversed is: {t1 - t0:f}")
print("sorted: \n", python_sorted_array_reversed)
# array3 = [3115, 1251, 2132, 421, 1, 534, 321, 55, 325, 1231, 783, 234, 421, 123, 232, 51, 31, 231, 215, 523, 532, 123, 73, 53, 123, 111, 231, 23, 32, 213, 12, 1, 32, 24, 5, 12, 23, 53, 123, 4, 23, 15, 23, 6, 122, 12, 45, 3, 3, 3, 45, 5]
# array3 = [10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 20, 20, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 20, 20, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 1, 20, 20]
# array3 = [140, 130, 120, 110, 100, 90, 80, 70, 60, 50, 40, 30, 20, 10, 23, 180, 170, 160, 200, 190, 150 ]
# heap3 = heap_sort_from_unsorted(array3)
# print(heap3)
f465666eeb9757f0ab1702c98389d8d870a93a32 | 49 | py | Python | app/blueprints/__init__.py | mdazharulhoque7/azureFlaskPG | 4fc3a67f2a38eb4fa8eb3484f332b96b576d3f60 | ["MIT"]
from .views import blueprint as default_blueprint
f46cae5371a3d32324c86486e2d3c80fa290f4a8 | 39,611 | py | Python | tests/test_base_utils.py | tods-doc/d3m | e25793d4aaa9a8fdb63ac33bf1c045b96d6067a6 | ["Apache-2.0"]
import unittest
from d3m import container, utils as d3m_utils
from d3m.base import utils
from d3m.metadata import base as metadata_base
class TestBaseUtils(unittest.TestCase):
def test_combine_columns_compact_metadata(self):
main = container.DataFrame({'a1': [1, 2, 3], 'b1': [4, 5, 6], 'c1': [7, 8, 9], 'd1': [10, 11, 12], 'e1': [13, 14, 15]}, {
'top_level': 'main',
}, generate_metadata=False)
main.metadata = main.metadata.generate(main, compact=True)
main.metadata = main.metadata.update_column(0, {'name': 'aaa111'})
main.metadata = main.metadata.update_column(1, {'name': 'bbb111', 'extra': 'b_column'})
main.metadata = main.metadata.update_column(2, {'name': 'ccc111'})
columns2 = container.DataFrame({'a2': [21, 22, 23], 'b2': [24, 25, 26]}, {
'top_level': 'columns2',
}, generate_metadata=False)
columns2.metadata = columns2.metadata.generate(columns2, compact=True)
columns2.metadata = columns2.metadata.update_column(0, {'name': 'aaa222'})
columns2.metadata = columns2.metadata.update_column(1, {'name': 'bbb222'})
columns3 = container.DataFrame({'a3': [31, 32, 33], 'b3': [34, 35, 36]}, {
'top_level': 'columns3',
}, generate_metadata=False)
columns3.metadata = columns3.metadata.generate(columns3, compact=True)
columns3.metadata = columns3.metadata.update_column(0, {'name': 'aaa333'})
columns3.metadata = columns3.metadata.update_column(1, {'name': 'bbb333'})
result = utils.combine_columns(main, [1, 2], [columns2, columns3], return_result='append', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[1, 4, 7, 10, 13, 21, 24, 31, 34],
[2, 5, 8, 11, 14, 22, 25, 32, 35],
[3, 6, 9, 12, 15, 23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 9,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa111',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb111',
'extra': 'b_column',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'ccc111',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'd1',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'name': 'e1',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 6],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 7],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 8],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [1, 2], [columns2, columns3], return_result='new', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[21, 24, 31, 34],
[22, 25, 32, 35],
[23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'columns2',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 4,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa222',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb222',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [1, 2], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[1, 21, 24, 31, 34, 10, 13],
[2, 22, 25, 32, 35, 11, 14],
[3, 23, 26, 33, 36, 12, 15],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 7,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa111',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'name': 'd1',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 6],
'metadata': {
'name': 'e1',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [0, 1, 2, 3, 4], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[21, 24, 31, 34],
[22, 25, 32, 35],
[23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 4,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [4], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[1, 4, 7, 10, 21, 24, 31, 34],
[2, 5, 8, 11, 22, 25, 32, 35],
[3, 6, 9, 12, 23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 8,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa111',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb111',
'extra': 'b_column',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'ccc111',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'd1',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'aaa222',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'bbb222',
},
}, {
'selector': ['__ALL_ELEMENTS__', 6],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'aaa333',
},
}, {
'selector': ['__ALL_ELEMENTS__', 7],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'bbb333',
},
}])
result = utils.combine_columns(main, [0, 2, 4], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[21, 4, 24, 10, 31, 34],
[22, 5, 25, 11, 32, 35],
[23, 6, 26, 12, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 6,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb111',
'extra': 'b_column',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'd1',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
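# The 'append' / 'replace' / 'new' semantics exercised above can be approximated
# with plain pandas (an illustrative analogy using pd.concat, not the d3m
# combine_columns implementation, and ignoring the metadata handling):

```python
import pandas as pd

main = pd.DataFrame({'a1': [1, 2, 3], 'b1': [4, 5, 6], 'c1': [7, 8, 9]})
new_cols = pd.DataFrame({'a2': [21, 22, 23]})

# 'append': keep every original column and add the new ones at the end
appended = pd.concat([main, new_cols], axis=1)

# 'new': return only the newly produced columns
new_only = new_cols.copy()

# 'replace': the new columns take the position of the replaced one (column 1 here)
replaced = pd.concat([main.iloc[:, :1], new_cols, main.iloc[:, 2:]], axis=1)

print(list(appended.columns))  # ['a1', 'b1', 'c1', 'a2']
print(list(replaced.columns))  # ['a1', 'a2', 'c1']
```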
def test_combine_columns_noncompact_metadata(self):
main = container.DataFrame({'a1': [1, 2, 3], 'b1': [4, 5, 6], 'c1': [7, 8, 9], 'd1': [10, 11, 12], 'e1': [13, 14, 15]}, {
'top_level': 'main',
}, generate_metadata=False)
main.metadata = main.metadata.generate(main, compact=False)
main.metadata = main.metadata.update_column(0, {'name': 'aaa111'})
main.metadata = main.metadata.update_column(1, {'name': 'bbb111', 'extra': 'b_column'})
main.metadata = main.metadata.update_column(2, {'name': 'ccc111'})
columns2 = container.DataFrame({'a2': [21, 22, 23], 'b2': [24, 25, 26]}, {
'top_level': 'columns2',
}, generate_metadata=False)
columns2.metadata = columns2.metadata.generate(columns2, compact=False)
columns2.metadata = columns2.metadata.update_column(0, {'name': 'aaa222'})
columns2.metadata = columns2.metadata.update_column(1, {'name': 'bbb222'})
columns3 = container.DataFrame({'a3': [31, 32, 33], 'b3': [34, 35, 36]}, {
'top_level': 'columns3',
}, generate_metadata=False)
columns3.metadata = columns3.metadata.generate(columns3, compact=False)
columns3.metadata = columns3.metadata.update_column(0, {'name': 'aaa333'})
columns3.metadata = columns3.metadata.update_column(1, {'name': 'bbb333'})
result = utils.combine_columns(main, [1, 2], [columns2, columns3], return_result='append', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[1, 4, 7, 10, 13, 21, 24, 31, 34],
[2, 5, 8, 11, 14, 22, 25, 32, 35],
[3, 6, 9, 12, 15, 23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 9,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa111',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb111',
'extra': 'b_column',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'ccc111',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'd1',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'name': 'e1',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 6],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 7],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 8],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [1, 2], [columns2, columns3], return_result='new', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[21, 24, 31, 34],
[22, 25, 32, 35],
[23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'columns2',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 4,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [1, 2], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[1, 21, 24, 31, 34, 10, 13],
[2, 22, 25, 32, 35, 11, 14],
[3, 23, 26, 33, 36, 12, 15],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 7,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa111',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'name': 'd1',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 6],
'metadata': {
'name': 'e1',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [0, 1, 2, 3, 4], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[21, 24, 31, 34],
[22, 25, 32, 35],
[23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 4,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
result = utils.combine_columns(main, [4], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[1, 4, 7, 10, 21, 24, 31, 34],
[2, 5, 8, 11, 22, 25, 32, 35],
[3, 6, 9, 12, 23, 26, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 8,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa111',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb111',
'extra': 'b_column',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'ccc111',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'd1',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'aaa222',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'bbb222',
},
}, {
'selector': ['__ALL_ELEMENTS__', 6],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'aaa333',
},
}, {
'selector': ['__ALL_ELEMENTS__', 7],
'metadata': {
'structural_type': 'numpy.int64',
'name': 'bbb333',
},
}])
result = utils.combine_columns(main, [0, 2, 4], [columns2, columns3], return_result='replace', add_index_columns=False)
self.assertEqual(result.values.tolist(), [
[21, 4, 24, 10, 31, 34],
[22, 5, 25, 11, 32, 35],
[23, 6, 26, 12, 33, 36],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'top_level': 'main',
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 6,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'aaa222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'bbb111',
'extra': 'b_column',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 2],
'metadata': {
'name': 'bbb222',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 3],
'metadata': {
'name': 'd1',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 4],
'metadata': {
'name': 'aaa333',
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 5],
'metadata': {
'name': 'bbb333',
'structural_type': 'numpy.int64',
},
}])
def test_combine_columns_new_with_index_compact_metadata(self):
main = container.DataFrame({'d3mIndex': [1, 2, 3], 'b1': [4, 5, 6], 'c1': [7, 8, 9]}, columns=['d3mIndex', 'b1', 'c1'], generate_metadata=False)
main.metadata = main.metadata.generate(main, compact=True)
main.metadata = main.metadata.update_column(0, {'name': 'd3mIndex', 'semantic_types': ['http://schema.org/Integer', 'https://metadata.datadrivendiscovery.org/types/PrimaryKey']})
main.metadata = main.metadata.update_column(1, {'name': 'b1', 'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute']})
main.metadata = main.metadata.update_column(2, {'name': 'c1', 'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute']})
columns = container.DataFrame({'d3mIndex': [1, 2, 3], 'b2': [4, 5, 6]}, columns=['d3mIndex', 'b2'], generate_metadata=False)
columns.metadata = columns.metadata.generate(columns, compact=True)
columns.metadata = columns.metadata.update_column(0, {'name': 'd3mIndex', 'semantic_types': ['http://schema.org/Integer', 'https://metadata.datadrivendiscovery.org/types/PrimaryKey']})
columns.metadata = columns.metadata.update_column(1, {'name': 'b2', 'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute']})
result = utils.combine_columns(main, [], [columns], return_result='new', add_index_columns=True)
self.assertEqual(result.values.tolist(), [
[1, 4],
[2, 5],
[3, 6],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 2,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', '__ALL_ELEMENTS__'],
'metadata': {
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'd3mIndex',
'semantic_types': ['http://schema.org/Integer', 'https://metadata.datadrivendiscovery.org/types/PrimaryKey'],
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'b2',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute'],
},
}])
def test_combine_columns_new_with_index_noncompact_metadata(self):
main = container.DataFrame({'d3mIndex': [1, 2, 3], 'b1': [4, 5, 6], 'c1': [7, 8, 9]}, columns=['d3mIndex', 'b1', 'c1'], generate_metadata=False)
main.metadata = main.metadata.generate(main, compact=False)
main.metadata = main.metadata.update_column(0, {'name': 'd3mIndex', 'semantic_types': ['http://schema.org/Integer', 'https://metadata.datadrivendiscovery.org/types/PrimaryKey']})
main.metadata = main.metadata.update_column(1, {'name': 'b1', 'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute']})
main.metadata = main.metadata.update_column(2, {'name': 'c1', 'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute']})
columns = container.DataFrame({'d3mIndex': [1, 2, 3], 'b2': [4, 5, 6]}, columns=['d3mIndex', 'b2'], generate_metadata=False)
columns.metadata = columns.metadata.generate(columns, compact=False)
columns.metadata = columns.metadata.update_column(0, {'name': 'd3mIndex', 'semantic_types': ['http://schema.org/Integer', 'https://metadata.datadrivendiscovery.org/types/PrimaryKey']})
columns.metadata = columns.metadata.update_column(1, {'name': 'b2', 'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute']})
result = utils.combine_columns(main, [], [columns], return_result='new', add_index_columns=True)
self.assertEqual(result.values.tolist(), [
[1, 4],
[2, 5],
[3, 6],
])
self.assertEqual(d3m_utils.to_json_structure(result.metadata.to_internal_simple_structure()), [{
'selector': [],
'metadata': {
'schema': metadata_base.CONTAINER_SCHEMA_VERSION,
'structural_type': 'd3m.container.pandas.DataFrame',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Table'],
'dimension': {
'name': 'rows',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularRow'],
'length': 3,
},
},
}, {
'selector': ['__ALL_ELEMENTS__'],
'metadata': {
'dimension': {
'name': 'columns',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/TabularColumn'],
'length': 2,
},
},
}, {
'selector': ['__ALL_ELEMENTS__', 0],
'metadata': {
'name': 'd3mIndex',
'semantic_types': ['http://schema.org/Integer', 'https://metadata.datadrivendiscovery.org/types/PrimaryKey'],
'structural_type': 'numpy.int64',
},
}, {
'selector': ['__ALL_ELEMENTS__', 1],
'metadata': {
'name': 'b2',
'semantic_types': ['https://metadata.datadrivendiscovery.org/types/Attribute'],
'structural_type': 'numpy.int64',
},
}])
if __name__ == '__main__':
    unittest.main()

# tests/test_default.py (soasme/dogeon, MIT)
import dson


def test_default():
    # `default` is called for objects dson cannot serialize directly,
    # so dumping `type` with default=repr equals dumping repr(type).
    assert dson.dumps(type, default=repr) == dson.dumps(repr(type))
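A minimal stdlib analogue of the behaviour this test exercises, assuming dson keeps `json`'s `default=` calling convention (an assumption, not confirmed by this file): the encoder falls back to the `default` callable for objects it cannot encode, so the class object `type` is serialized via its repr string.

```python
import json

# Hypothetical stdlib analogue of the dson test above: `type` itself is not
# JSON-serializable, so json.dumps falls back to default=repr, which yields
# the same string as serializing repr(type) directly.
assert json.dumps(type, default=repr) == json.dumps(repr(type))
```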

# EApp/remove.py (eljimenezj/Team_51_DS4A_2021, MIT)
import os
def _reset_graph(path):
    # Remove any existing graph file, then recreate it as an empty HTML page.
    # open(..., "x") is safe here because the file was just deleted.
    while os.path.exists(path):
        os.remove(path)
    plt_html = "<html><body></body></html>"
    with open(path, "x") as html_file:
        html_file.write(plt_html)


def removg():
    _reset_graph("static/graphs/graphs1.html")


def removgw():
    _reset_graph("static/graphs/graphw1.html")

# irs/irsdata/models.py (jsfenfen/irs_527, BSD-2-Clause)
from django.db import models
# Create your models here.
class irs_header(models.Model):
    record_type = models.CharField(max_length=1, help_text="Record Type")
    form_type = models.IntegerField(help_text="Form Type")
    form_id = models.CharField(max_length=38, help_text="Form ID Number")
    period_start = models.CharField(max_length=8, help_text="PERIOD Begin Date")
    period_end = models.CharField(max_length=8, help_text="PERIOD End Date")
    is_initial = models.NullBooleanField(help_text="Initial Report Indicator", default=False)
    is_amended = models.NullBooleanField(help_text="Amended Report Indicator", null=True)
    is_final = models.NullBooleanField(help_text="Final Report Indicator", null=True)
    address_change = models.NullBooleanField(help_text="Change of Address Indicator", null=True)
    org_name = models.CharField(max_length=70, help_text="ORGANIZATION NAME")
    ein = models.CharField(max_length=9, help_text="EIN")
    address1 = models.CharField(max_length=50, null=True, blank=True, help_text="MAILING ADDRESS 1")
    address2 = models.CharField(max_length=50, null=True, blank=True, help_text="MAILING ADDRESS 2")
    city = models.CharField(max_length=50, null=True, blank=True, help_text="MAILING ADDRESS CITY")
    state = models.CharField(max_length=2, null=True, blank=True, help_text="MAILING ADDRESS STATE")
    zipcode = models.CharField(max_length=5, null=True, blank=True, help_text="MAILING ADDRESS ZIP CODE")
    zip4 = models.CharField(max_length=4, null=True, blank=True, help_text="MAILING ADDRESS ZIP EXT")
    email = models.CharField(max_length=150, null=True, blank=True, help_text="E_MAIL ADDRESS")
    org_date = models.CharField(max_length=50, null=True, blank=True, help_text="ORG FORMATION DATE")
    cust_name = models.CharField(max_length=50, null=True, blank=True, help_text="CUSTODIAN NAME")
    cust_add1 = models.CharField(max_length=50, null=True, blank=True, help_text="CUSTODIAN ADDRESS 1")
    cust_add2 = models.CharField(max_length=50, null=True, blank=True, help_text="CUSTODIAN ADDRESS 2")
    cust_city = models.CharField(max_length=50, null=True, blank=True, help_text="CUSTODIAN ADDRESS CITY")
    cust_state = models.CharField(max_length=2, null=True, blank=True, help_text="CUSTODIAN ADDRESS STATE")
    cust_zip = models.CharField(max_length=50, null=True, blank=True, help_text="CUSTODIAN ADDRESS ZIP CODE")
    cust_zip4 = models.CharField(max_length=50, null=True, blank=True, help_text="CUSTODIAN ADDRESS ZIP EXT")
    contact = models.CharField(max_length=50, null=True, blank=True, help_text="CONTACT PERSON NAME")
    contact_add1 = models.CharField(max_length=50, null=True, blank=True, help_text="CONTACT ADDRESS 1")
    contact_add2 = models.CharField(max_length=50, null=True, blank=True, help_text="CONTACT ADDRESS 2")
    contact_city = models.CharField(max_length=50, null=True, blank=True, help_text="CONTACT ADDRESS CITY")
    contact_state = models.CharField(max_length=2, null=True, blank=True, help_text="CONTACT ADDRESS STATE")
    contact_zip = models.CharField(max_length=50, null=True, blank=True, help_text="CONTACT ADDRESS ZIP CODE")
    contact_zip4 = models.CharField(max_length=50, null=True, blank=True, help_text="CONTACT ADDRESS ZIP EXT")
    biz_add1 = models.CharField(max_length=50, null=True, blank=True, help_text="BUSINESS ADDRESS 1")
    biz_add2 = models.CharField(max_length=50, null=True, blank=True, help_text="BUSINESS ADDRESS 2")
    biz_city = models.CharField(max_length=50, null=True, blank=True, help_text="BUSINESS ADDRESS CITY")
    biz_state = models.CharField(max_length=2, null=True, blank=True, help_text="BUSINESS ADDRESS STATE")
    biz_zip = models.CharField(max_length=50, null=True, blank=True, help_text="BUSINESS ADDRESS ZIP CODE")
    biz_zip4 = models.CharField(max_length=50, null=True, blank=True, help_text="BUSINESS ADDRESS ZIP EXT")
    qtr_ind = models.CharField(max_length=1, null=True, blank=True, help_text="QTR INDICATOR - '1' = First Quarterly '2' = Second Quarterly '3' = Third Quarterly '4' = Year-end '5' = Mid-Year '6' = Monthly '7' = Pre-election '8' = Post-election")
    rpt_month = models.CharField(max_length=2, null=True, blank=True, help_text="MONTHLY RPT MONTH; If QTR Indicator - Monthly, Month is filled")
    pre_elect_type = models.CharField(max_length=10, null=True, blank=True, help_text="PRE ELECT TYPE - Null if this is a post election rpt")
    elect_date = models.CharField(max_length=8, null=True, blank=True, help_text="PRE or POST ELECT DATE")
    elect_state = models.CharField(max_length=2, null=True, blank=True, help_text="PRE or POST ELECT STATE")
    skeda = models.NullBooleanField(help_text="SCHED_A_IND")
    skeda_tot = models.DecimalField(max_digits=13, decimal_places=2, help_text="TOTAL_SCHED_A")
    skedb = models.NullBooleanField(help_text="SCHED_B_IND")
    skedb_tot = models.DecimalField(max_digits=13, decimal_places=2, help_text="TOTAL_SCHED_B")
    insert_time = models.CharField(max_length=19, null=True, blank=True, help_text="INSERT_DATETIME")
    dummy = models.NullBooleanField(help_text="empty", null=True)


class skeda(models.Model):
    record_type = models.CharField(max_length=1, help_text="Record Type")
    form_id = models.CharField(max_length=38, help_text="Form ID Number")
    skeda_id = models.CharField(max_length=38, help_text="sked A ID Number")
    org_name = models.CharField(max_length=70, help_text="ORGANIZATION NAME")
    ein = models.CharField(max_length=9, help_text="EIN")
    contrib_name = models.CharField(max_length=70, null=True, blank=True, help_text="CONTRIBUTOR NAME")
    contrib_add1 = models.CharField(max_length=50, null=True, blank=True, help_text="CONTRIBUTOR ADDRESS 1")
    contrib_add2 = models.CharField(max_length=50, null=True, blank=True, help_text="CONTRIBUTOR ADDRESS 2")
    contrib_city = models.CharField(max_length=50, null=True, blank=True, help_text="CONTRIBUTOR ADDRESS CITY")
    contrib_state = models.CharField(max_length=2, null=True, blank=True, help_text="CONTRIBUTOR ADDRESS STATE")
    contrib_zip = models.CharField(max_length=50, null=True, blank=True, help_text="CONTRIBUTOR ADDRESS ZIP CODE")
    contrib_zip4 = models.CharField(max_length=50, null=True, blank=True, help_text="CONTRIBUTOR ADDRESS ZIP EXT")
    contrib_employer = models.CharField(max_length=70, null=True, blank=True, help_text="CONTRIBUTOR EMPLOYER")
    contrib_amount = models.DecimalField(max_digits=13, decimal_places=2, null=True, blank=True, help_text="CONTRIBUTION AMOUNT")
    contrib_occupation = models.CharField(max_length=70, null=True, blank=True, help_text="CONTRIBUTOR OCCUPATION")
    contrib_ytd_amount = models.DecimalField(max_digits=13, decimal_places=2, null=True, blank=True, help_text="AGG CONTRIBUTION YTD")
    contrib_date = models.CharField(max_length=8, null=True, blank=True, help_text="CONTRIBUTION DATE")
    dummy = models.NullBooleanField(help_text="empty", null=True)


class skedb(models.Model):
    record_type = models.CharField(max_length=1, help_text="Record Type")
    form_id = models.CharField(max_length=38, help_text="Form ID Number")
    skedb_id = models.CharField(max_length=38, help_text="sked B ID Number")
    org_name = models.CharField(max_length=70, help_text="ORGANIZATION NAME")
    ein = models.CharField(max_length=9, help_text="EIN")
    recip_name = models.CharField(max_length=50, null=True, blank=True, help_text="RECIPIENT NAME")
    recip_add1 = models.CharField(max_length=50, null=True, blank=True, help_text="RECIPIENT ADDRESS 1")
    recip_add2 = models.CharField(max_length=50, null=True, blank=True, help_text="RECIPIENT ADDRESS 2")
    recip_city = models.CharField(max_length=50, null=True, blank=True, help_text="RECIPIENT ADDRESS CITY")
    recip_state = models.CharField(max_length=2, null=True, blank=True, help_text="RECIPIENT ADDRESS STATE")
    recip_zip = models.CharField(max_length=50, null=True, blank=True, help_text="RECIPIENT ADDRESS ZIP CODE")
    recip_zip4 = models.CharField(max_length=50, null=True, blank=True, help_text="RECIPIENT ADDRESS ZIP EXT")
    recip_employer = models.CharField(max_length=70, null=True, blank=True, help_text="RECIPIENT EMPLOYER")
    exp_amount = models.DecimalField(max_digits=13, decimal_places=2, null=True, blank=True, help_text="EXPENDITURE AMOUNT")
    recip_occupation = models.CharField(max_length=70, null=True, blank=True, help_text="RECIPIENT OCCUPATION")
    exp_date = models.CharField(max_length=8, null=True, blank=True, help_text="EXPENDITURE DATE")
    exp_purpose = models.CharField(max_length=512, null=True, blank=True, help_text="EXPENDITURE PURPOSE")
    dummy = models.NullBooleanField(help_text="empty", null=True)

#!/usr/bin/env python3
# ts_control.py (ubbu36/CMIP6_pacific_analysis, MIT)
# -*- coding: utf-8 -*-
"""
Created on Fri Mar 6 18:29:28 2020
@author: ullaheede
"""
# module import
import matplotlib.pyplot as plt
import cartopy.crs as ccrs
import numpy as np
import xarray as xr
import xesmf as xe
import pandas as pd
model_names=['ACCESS-CM2','ACCESS-ESM1-5','BCC-CSM2-MR','BCC-ESM1','CAMS-CSM1-0','CanESM5','CAS-ESM2-0','CESM2','CESM2-FV','CESM2-WACCM','CESM2-WACCM-FV2',\
'CIESM','CMCC-CM2-SR5','CNRM-CM6','CNRM-CM6-HR','CNRM-ESM2-1','E3SM','FGOALS-f3-L','FGOALS-g3','GFDL-CM4','GFDL-ESM4','GISS-E2-1-G','GISS-E2-1-H',\
'HadGEM3-GC31-LL','HadGEM3-GC3-MM','INM-CM4-8','INM-CM5-0','IPSL-CM6A','KACE-1-0-G','MCM-UA-1-0','MIROC-ES2L','MIROC6','MPI-ESM-1-2-HAM','MPI-ESM1-2-LR',\
'MRI-ESM2','NESM3','NorCPM1','SAM0-UNICORN','TaiESM1','UKESM1-0-LL']
def regrid_anomaly(control):
    # Time mean of the control-run surface temperature (the 4xCO2 anomaly
    # branch of this function was commented out in the original script).
    uas_control = control['ts']
    control_timemean = uas_control.mean("time")
    # Common 1-degree target grid so all models can be compared directly.
    ds_out = xr.Dataset({'lat': (['lat'], np.arange(-88, 90, 1.0)),
                         'lon': (['lon'], np.arange(0, 359, 1)),
                         })
    regridder = xe.Regridder(control_timemean, ds_out, 'bilinear')
    uas_regrid = regridder(control_timemean)
    return uas_regrid
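Every model below repeats the same pattern: open one or more piControl chunks, concatenate them along `time`, regrid, and append to `mylist`. A small helper (hypothetical, not part of the original script; `open_fn` and `concat_fn` stand in for `xr.open_dataset` and a `time`-axis `xr.concat`) sketches how the multi-chunk loads could be factored out:

```python
def open_and_concat(paths, open_fn, concat_fn):
    # Open every chunk in order; a single chunk is returned as-is,
    # multiple chunks are joined by the supplied concat function.
    datasets = [open_fn(p) for p in paths]
    if len(datasets) == 1:
        return datasets[0]
    return concat_fn(datasets)
```

With xarray this would be called as `open_and_concat(files, xr.open_dataset, lambda ds: xr.concat(ds, 'time'))`, replacing each hand-written `control1..controlN` block.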
### load and concatenate data ###
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_ACCESS-CM2_piControl_r1i1p1f1_gn_095001-144912.nc')
forcing = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_ACCESS-CM2_abrupt-4xCO2_r1i1p1f1_gn_095001-109912.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_ACCESS-ESM1-5_piControl_r1i1p1f1_gn_010101-060012.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_ACCESS-ESM1-5_piControl_r1i1p1f1_gn_060101-100012.nc')
control=xr.concat([control1,control2],'time')
forcing = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_ACCESS-ESM1-5_abrupt-4xCO2_r1i1p1f1_gn_010101-025012.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_BCC-CSM2-MR_piControl_r1i1p1f1_gn_185001-244912.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_BCC-CSM2-MR_abrupt-4xCO2_r1i1p1f1_gn_185001-200012.nc')
a=int(forcing.sizes['time']/12)
uas_BCC_ESM1=regrid_anomaly(control)
mylist=xr.concat([mylist,uas_BCC_ESM1], 'new_dim')
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_BCC-ESM1_piControl_r1i1p1f1_gn_185001-230012.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_BCC-ESM1_abrupt-4xCO2_r1i1p1f1_gn_185001-200012.nc')
a=int(forcing.sizes['time']/12)
uas_BCC_ESM1=regrid_anomaly(control)
mylist=xr.concat([mylist,uas_BCC_ESM1], 'new_dim')
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CAMS-CSM1-0_piControl_r1i1p1f1_gn_290001-314912.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CAMS-CSM1-0_piControl_r1i1p1f1_gn_315001-339912.nc')
control=xr.concat([control1,control2],'time')
forcing = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_ACCESS-ESM1-5_abrupt-4xCO2_r1i1p1f1_gn_010101-025012.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CanESM5_piControl_r1i1p1f1_gn_600101-620012.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CanESM5_piControl_r1i1p2f1_gn_560101-580012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CanESM5_piControl_r1i1p2f1_gn_580101-600012.nc')
control=xr.concat([control1,control2,control3],'time')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CanESM5_abrupt-4xCO2_r1i1p1f1_gn_185001-200012.nc')
a=int(forcing.sizes['time']/12)
uas_CanESM5=regrid_anomaly(control)
mylist=xr.concat([mylist,uas_CanESM5], 'new_dim')
control= xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CAS-ESM2-0_piControl_r1i1p1f1_gn_000101-054912.nc')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_piControl_r1i1p1f1_gn_000101-009912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_piControl_r1i1p1f1_gn_010001-019912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_piControl_r1i1p1f1_gn_020001-029912.nc')
control=xr.concat([control1,control2,control3],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_abrupt-4xCO2_r1i1p1f1_gn_000101-015012.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_abrupt-4xCO2_r1i1p1f1_gn_015101-019912.nc')
forcing3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_abrupt-4xCO2_r1i1p1f1_gn_020001-024912.nc')
forcing4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2_abrupt-4xCO2_r1i1p1f1_gn_025001-029912.nc')
forcing=xr.concat([forcing1,forcing2,forcing3,forcing4],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-FV2_piControl_r1i1p1f1_gn_000101-005012.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-FV2_piControl_r1i1p1f1_gn_005101-010012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-FV2_piControl_r1i1p1f1_gn_010101-015012.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-FV2_piControl_r1i1p1f1_gn_015101-020012.nc')
control=xr.concat([control1,control2,control3,control4],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM_piControl_r1i1p1f1_gn_000101-009912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM_piControl_r1i1p1f1_gn_010001-019912.nc')
control=xr.concat([control1,control2],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM_abrupt-4xCO2_r1i1p1f1_gn_000101-004912.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM_abrupt-4xCO2_r1i1p1f1_gn_005001-009912.nc')
forcing3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM_abrupt-4xCO2_r1i1p1f1_gn_010001-015012.nc')
forcing=xr.concat([forcing1,forcing2,forcing3],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM-FV2_piControl_r1i1p1f1_gn_000101-004912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM-FV2_piControl_r1i1p1f1_gn_005001-009912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM-FV2_piControl_r1i1p1f1_gn_010001-014912.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CESM2-WACCM-FV2_piControl_r1i1p1f1_gn_015001-019912.nc')
control=xr.concat([control1,control2,control3,control4],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CIESM_piControl_r1i1p1f1_gr_000101-005012.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CIESM_piControl_r1i1p1f1_gr_005101-010012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CIESM_piControl_r1i1p1f1_gr_010101-015012.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CIESM_piControl_r1i1p1f1_gr_015101-020012.nc')
control=xr.concat([control1,control2,control3,control4],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CMCC-CM2-SR5_piControl_r1i1p1f1_gn_185001-209912.nc')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CNRM-CM6-1_piControl_r1i1p1f2_gr_185001-234912.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CNRM-CM6-1_abrupt-4xCO2_r1i1p1f2_gr_185001-199912.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CNRM-CM6-1-HR_piControl_r1i1p1f2_gr_185001-214912.nc')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CNRM-ESM2-1_piControl_r1i1p1f2_gr_185001-234912.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_CNRM-ESM2-1_abrupt-4xCO2_r1i1p1f2_gr_185001-199912.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_E3SM-1-0_piControl_r1i1p1f1_gr_010101-012512.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_E3SM-1-0_piControl_r1i1p1f1_gr_012601-015012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_E3SM-1-0_piControl_r1i1p1f1_gr_015101-017512.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_E3SM-1-0_piControl_r1i1p1f1_gr_017601-020012.nc')
control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_E3SM-1-0_piControl_r1i1p1f1_gr_020101-022512.nc')
control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_E3SM-1-0_piControl_r1i1p1f1_gr_022601-025012.nc')
control=xr.concat([control1,control2,control3,control4,control5,control6],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-f3-L_piControl_r1i1p1f1_gr_060001-116012.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-f3-L_abrupt-4xCO2_r1i1p1f1_gr_185001-200912.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_020001-020912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_021001-021912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_022001-022912.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_023001-023912.nc')
control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_024001-024912.nc')
control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_025001-025912.nc')
control7=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_026001-026912.nc')
control8=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_027001-027912.nc')
control9=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_029001-029912.nc')
control10=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_030001-030912.nc')
control11=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_031001-031912.nc')
control12=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_032001-032912.nc')
control13=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_033001-033912.nc')
control14=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_034001-034912.nc')
control15=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_035001-035912.nc')
control16=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_036001-036912.nc')
control17=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_037001-037912.nc')
control18=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_038001-038912.nc')
control19=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_FGOALS-g3_piControl_r1i1p1f1_gn_039001-039912.nc')
control=xr.concat([control1,control2,control3,control4,control5,control6, control7, control8, control9, control10, control11,\
control12, control13, control14, control15, control16, control17, control18, control19],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-CM4_piControl_r1i1p1f1_gr1_055101-065012.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-CM4_piControl_r1i1p1f1_gr1_055101-065012.nc')
control=xr.concat([control1,control2],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-CM4_abrupt-4xCO2_r1i1p1f1_gr1_000101-010012.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-CM4_abrupt-4xCO2_r1i1p1f1_gr1_010101-015012.nc')
forcing=xr.concat([forcing1,forcing2],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-ESM4_piControl_r1i1p1f1_gr1_040101-050012.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-ESM4_piControl_r1i1p1f1_gr1_040101-050012.nc')
control=xr.concat([control1,control2],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-ESM4_abrupt-4xCO2_r1i1p1f1_gr1_000101-010012.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GFDL-ESM4_abrupt-4xCO2_r1i1p1f1_gr1_010101-015012.nc')
forcing=xr.concat([forcing1,forcing2],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_piControl_r1i1p1f1_gn_415001-420012.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_piControl_r1i1p1f1_gn_420101-425012.nc')
control3 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_piControl_r1i1p1f1_gn_425101-430012.nc')
control=xr.concat([control1,control2, control3],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_abrupt-4xCO2_r1i1p1f1_gn_190101-195012.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_abrupt-4xCO2_r1i1p1f1_gn_195101-200012.nc')
forcing3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_abrupt-4xCO2_r1i1p1f3_gn_290001-294912.nc')
forcing4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-G_abrupt-4xCO2_r1i1p1f3_gn_295001-299912.nc')
forcing=xr.concat([forcing1,forcing2, forcing3, forcing4],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-H_piControl_r1i1p1f1_gn_318001-323012.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-H_piControl_r1i1p1f1_gn_323101-328012.nc')
control3 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-H_piControl_r1i1p1f1_gn_328101-333012.nc')
control=xr.concat([control1,control2, control3],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-H_abrupt-4xCO2_r1i1p1f1_gn_185001-190012.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-H_abrupt-4xCO2_r1i1p1f1_gn_190101-195012.nc')
forcing3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-1-H_abrupt-4xCO2_r1i1p1f1_gn_195101-200012.nc')
forcing=xr.concat([forcing1,forcing2, forcing3],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
#control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-2-G_piControl_r1i1p1f1_gn_200001-202512.nc')
#control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-2-G_piControl_r1i1p1f1_gn_202601-205012.nc')
#control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-2-G_piControl_r1i1p1f1_gn_205101-207512.nc')
#control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-2-G_piControl_r1i1p1f1_gn_207601-210012.nc')
#control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-2-G_piControl_r1i1p1f1_gn_210101-212512.nc')
#control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_GISS-E2-2-G_piControl_r1i1p1f1_gn_212601-215012.nc')
#control=xr.concat([control1,control2,control3,control4,control5,control6],'time')
#output=regrid_anomaly(control)
#mylist=xr.concat([mylist,output],'new_dim')
#del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-LL_piControl_r1i1p1f1_gn_225001-234912.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-LL_piControl_r1i1p1f1_gn_225001-234912.nc')
control=xr.concat([control1,control2],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-LL_abrupt-4xCO2_r1i1p1f3_gn_185001-194912.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-LL_abrupt-4xCO2_r1i1p1f3_gn_195001-199912.nc')
forcing=xr.concat([forcing1,forcing2],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_185001-186912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_187001-188912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_189001-190912.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_191001-192912.nc')
control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_193001-194912.nc')
control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_195001-196912.nc')
control7=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_197001-198912.nc')
control8=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_201001-202912.nc')
control9=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_HadGEM3-GC31-MM_piControl_r1i1p1f1_gn_203001-204912.nc')
control=xr.concat([control1,control2,control3,control4,control5,control6,control7,control8,control9],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM4-8_piControl_r1i1p1f1_gr1_185001-194912.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM4-8_piControl_r1i1p1f1_gr1_195001-204912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM4-8_piControl_r1i1p1f1_gr1_205001-214912.nc')
control=xr.concat([control1,control2,control3],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM4-8_abrupt-4xCO2_r1i1p1f1_gr1_185001-194912.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM4-8_abrupt-4xCO2_r1i1p1f1_gr1_195001-199912.nc')
forcing=xr.concat([forcing1,forcing2],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM5-0_piControl_r1i1p1f1_gr1_199601-209512.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM5-0_piControl_r1i1p1f1_gr1_209601-219512.nc')
control=xr.concat([control1,control2],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM5-0_abrupt-4xCO2_r1i1p1f1_gr1_185001-194912.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_INM-CM5-0_abrupt-4xCO2_r1i1p1f1_gr1_195001-199912.nc')
forcing=xr.concat([forcing1,forcing2],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_IPSL-CM6A-LR_piControl_r1i1p1f1_gr_285001-304912.nc')
#control = xr.decode_cf(control)
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_IPSL-CM6A-LR_abrupt-4xCO2_r1i1p1f1_gr_185001-214912.nc',use_cftime=True)
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_KACE-1-0-G_piControl_r1i1p1f1_gr_200001-209912.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_KACE-1-0-G_piControl_r1i1p1f1_gr_210001-219912.nc')
control=xr.concat([control1,control2],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MCM-UA-1-0_piControl_r1i1p1f1_gn_000101-010012.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MCM-UA-1-0_piControl_r1i1p1f1_gn_010101-020012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MCM-UA-1-0_piControl_r1i1p1f1_gn_020101-030012.nc')
control=xr.concat([control1,control2,control3],'time')
control = control.rename({'longitude': 'lon', 'latitude': 'lat'})
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MCM-UA-1-0_abrupt-4xCO2_r1i1p1f1_gn_000101-010012.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MCM-UA-1-0_abrupt-4xCO2_r1i1p1f1_gn_010101-020012.nc')
forcing3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MCM-UA-1-0_abrupt-4xCO2_r1i1p1f1_gn_020101-030012.nc')
forcing=xr.concat([forcing1,forcing2,forcing3],'time')
forcing = forcing.rename({'longitude': 'lon', 'latitude': 'lat'})
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC-ES2L_piControl_r1i1p1f2_gn_225001-234912.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC-ES2L_abrupt-4xCO2_r1i1p1f2_gn_185001-199912.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC6_piControl_r1i1p1f1_gn_330001-339912.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC6_piControl_r1i1p1f1_gn_340001-349912.nc')
control3 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC6_piControl_r1i1p1f1_gn_390001-399912.nc')
control=xr.concat([control1, control2,control3],'time')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC6_abrupt-4xCO2_r1i1p1f1_gn_320001-329912.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC6_abrupt-4xCO2_r1i1p1f1_gn_330001-334912.nc')
forcing3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MIROC6_abrupt-4xCO2_r1i1p1f1_gn_335001-344912.nc')
forcing=xr.concat([forcing1,forcing2, forcing3],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_185001-186912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_187001-188912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_189001-190912.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_191001-192912.nc')
control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_193001-194912.nc')
control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_195001-196912.nc')
control7=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_197001-198912.nc')
control8=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_199001-200912.nc')
control9=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_201001-202912.nc')
control10=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM-1-2-HAM_piControl_r1i1p1f1_gn_203001-204912.nc')
control=xr.concat([control1,control2,control3,control4,control5,control6,control7,control8,control9,control10],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_185001-186912.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_187001-188912.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_189001-190912.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_191001-192912.nc')
control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_193001-194912.nc')
control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_195001-196912.nc')
control7=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_197001-198912.nc')
control8=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_199001-200912.nc')
control9=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_201001-202912.nc')
control10=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MPI-ESM1-2-LR_piControl_r1i1p1f1_gn_203001-204912.nc')
control=xr.concat([control1,control2,control3,control4,control5,control6,control7,control8,control9,control10],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
del output
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MRI-ESM2-0_piControl_r1i1p1f1_gn_185001-255012.nc')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_MRI-ESM2-0_abrupt-4xCO2_r10i1p1f1_gn_185001-200012.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NESM3_piControl_r1i1p1f1_gn_050001-059912.nc')
control2 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NESM3_piControl_r1i1p1f1_gn_060001-069912.nc')
control3 = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NESM3_piControl_r1i1p1f1_gn_070001-079912.nc')
control=xr.concat([control1, control2,control3],'time')
forcing=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NESM3_abrupt-4xCO2_r1i1p1f1_gn_185001-199912.nc')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NorCPM1_piControl_r1i1p1f1_gn_000101-010012.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NorCPM1_piControl_r1i1p1f1_gn_010101-020012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_NorCPM1_piControl_r1i1p1f1_gn_020101-030012.nc')
control=xr.concat([control1,control2,control3],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_000101-001012.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_001101-002012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_002101-003012.nc')
control4=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_003101-004012.nc')
control5=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_004101-005012.nc')
control6=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_005101-006012.nc')
control7=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_007101-008012.nc')
control8=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_008101-009012.nc')
control9=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_009101-010012.nc')
control10=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_010101-011012.nc')
control11=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_011101-012012.nc')
control12=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_012101-013012.nc')
control13=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_013101-014012.nc')
control14=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_014101-015012.nc')
control15=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_015101-016012.nc')
control16=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_016101-017012.nc')
control17=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_017101-018012.nc')
control18=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_018101-019012.nc')
control19=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_SAM0-UNICON_piControl_r1i1p1f1_gn_019101-020012.nc')
control=xr.concat([control1,control2,control3,control4,control5,control6, control7, control8, control9, control10, control11,\
control12, control13, control14, control15, control16, control17, control18, control19],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
del output
control1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_TaiESM1_piControl_r1i1p1f1_gn_020101-030012.nc')
control2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_TaiESM1_piControl_r1i1p1f1_gn_030101-040012.nc')
control3=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_TaiESM1_piControl_r1i1p1f1_gn_040101-050012.nc')
control=xr.concat([control1,control2,control3],'time')
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output],'new_dim')
control = xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_UKESM1-0-LL_piControl_r1i1p1f2_gn_255001-264912.nc')
forcing1=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_UKESM1-0-LL_abrupt-4xCO2_r1i1p1f2_gn_185001-194912.nc')
forcing2=xr.open_dataset('/Volumes/Armor_CMIP6/CMIP6_project/TS/ts_Amon_UKESM1-0-LL_abrupt-4xCO2_r1i1p1f2_gn_195001-199912.nc')
forcing=xr.concat([forcing1,forcing2],'time')
a=int(forcing.sizes['time']/12)
output=regrid_anomaly(control)
mylist=xr.concat([mylist,output], 'new_dim')
del output
mylist = xr.concat([mylist], pd.Index(list(model_names), name='new_dim'))
mylist.to_netcdf('/Volumes/Armor_CMIP6/control_timemean_ts_1deg.nc')


# File: slides/deployment/python/3_multi_server_series_deploy.py
# Repo: pxg/petegraham.co.uk (license: MIT)
from fabric.api import env  # assumed Fabric 1.x import; the original slide snippet omits it


def stage():
    env.hosts = ['54.228.188.132', '54.228.188.133']


# File: CGNet/CGNet.py
# Repo: oscar-carlsson/CGNet (license: Unlicense)
import sys
import os
CUR_DIR = os.path.dirname(os.path.realpath(__file__))
REAL_PART = 0
IMAG_PART = 1
import torch.nn as nn
import torch
import numpy as np
import ipdb
import CGNet.CG_layers as cglayers
class SphericalCNN(nn.Module):
def __init__(self, lmax, taus,
cuda=True,
norm=True,
skipconn=True,
sparse_flag=False,
weight_type="cost"):
"""
:param lmax:
        :param taus: list of lists;
            taus[i] is the # of fragments of each l for (input of) the i-th layer.
:param n_layers:
:param cuda: only supports cuda=True now
:param norm: perform "fragment normalization" or not
:param skipconn: if True, take all l=0 fragments and concatenate them to be the invariance embedding
"""
assert cuda, "Do not support these parameters yet"
# the maximum l is lmax (i.e. l in range(lmax+1))
super(SphericalCNN, self).__init__()
self.lmax = lmax
self.taus = taus
# the rest of the layers are like in CGnet (all in cuda)
self.n_layers = len(taus) - 1
self.cgs = nn.ModuleList([cglayers.CGBN_cuda(lmax, taus[layer_i], taus[layer_i + 1],
batchnorm=norm, sparse_flag=sparse_flag, weight_type=weight_type)
for layer_i in range(self.n_layers)])
# for the skip connection..
self.skipconn = skipconn
if self.skipconn:
self.output_length = 2 * sum([_taus[0] for _taus in taus])
else:
self.output_length = 2 * taus[-1][0]
if cuda:
self.cuda()
def forward(self, f_input):
embedding = []
if isinstance(f_input, list):
B = f_input[0].shape[0]
if self.skipconn: embedding = [f_input[0].view(B, -1)]
else:
B = f_input.shape[0]
if self.skipconn: embedding = [f_input[:, 0:(self.taus[0][0] * (2 * 0 + 1)), :].view(B, -1)]
fs = f_input
for i in range(self.n_layers):
fs = self.cgs[i](fs, straight_output=False)
if self.skipconn: embedding.append(fs[0].view(B, -1))
if self.skipconn:
embedding = torch.cat(embedding, 1)
else:
embedding = fs[0].view(B, -1)
return embedding
class SphericalCNN_py(nn.Module):
def __init__(self, lmax, taus,
cuda=True,
norm=True,
skipconn=True,
sparse_flag=False,
weight_type="cost"):
"""
:param lmax:
        :param taus: list of lists;
            taus[i] is the # of fragments of each l for (input of) the i-th layer.
:param n_layers:
:param cuda: only supports cuda=True now
:param norm: perform "fragment normalization" or not
:param skipconn: if True, take all l=0 fragments and concatenate them to be the invariance embedding
"""
assert cuda, "Do not support these parameters yet"
# the maximum l is lmax (i.e. l in range(lmax+1))
super(SphericalCNN_py, self).__init__()
self.lmax = lmax
self.taus = taus
# the rest of the layers are like in CGnet (all in cuda)
self.n_layers = len(taus) - 1
self.cgs = nn.ModuleList([cglayers.CGBN_base(lmax, taus[layer_i], taus[layer_i + 1],
batchnorm=norm, sparse_flag=sparse_flag, weight_type=weight_type)
for layer_i in range(self.n_layers)])
# for the skip connection..
self.skipconn = skipconn
if self.skipconn:
self.output_length = 2 * sum([_taus[0] for _taus in taus])
else:
self.output_length = 2 * taus[-1][0]
if cuda:
self.cuda()
def forward(self, f_input):
embedding = []
if isinstance(f_input, list):
B = f_input[0].shape[0]
if self.skipconn: embedding = [f_input[0].view(B, -1)]
else:
B = f_input.shape[0]
if self.skipconn: embedding = [f_input[:, 0:(self.taus[0][0] * (2 * 0 + 1)), :].view(B, -1)]
fs = f_input
for i in range(self.n_layers):
fs = self.cgs[i](fs, straight_output=False)
if self.skipconn: embedding.append(fs[0].view(B, -1))
if self.skipconn:
embedding = torch.cat(embedding, 1)
else:
embedding = fs[0].view(B, -1)
        return embedding


# File: gabriel_lego/lego_engine/tasks/task_L.py
# Repo: molguin92/gabriel-lego-py3 (license: Apache-2.0)
#!/usr/bin/env python
import numpy as np
# Labels: nothing:0, white:1, green:2, yellow:3, red:4, blue:5, black:6, unsure:7
bitmaps = [np.array([[4, 4, 4, 4, 4, 4]]),
np.array([[0, 6, 0, 0, 0, 0],
[4, 4, 4, 4, 4, 4]]),
np.array([[0, 6, 0, 6, 0, 0],
[4, 4, 4, 4, 4, 4]]),
np.array([[0, 6, 1, 6, 0, 0],
[4, 4, 4, 4, 4, 4]]),
np.array([[0, 6, 1, 6, 1, 0],
[4, 4, 4, 4, 4, 4]]),
np.array([[0, 1, 1, 1, 1, 0],
[0, 6, 1, 6, 1, 0],
[4, 4, 4, 4, 4, 4]]),
np.array([[0, 1, 1, 1, 1, 0],
[0, 6, 1, 6, 1, 0],
[4, 4, 4, 4, 4, 4],
[6, 6, 0, 0, 0, 0]]),
np.array([[0, 1, 1, 1, 1, 0],
[0, 6, 1, 6, 1, 0],
[4, 4, 4, 4, 4, 4],
[6, 6, 0, 0, 6, 6]]),
np.array([[0, 2, 2, 2, 2, 0],
[0, 1, 1, 1, 1, 0],
[0, 6, 1, 6, 1, 0],
[4, 4, 4, 4, 4, 4],
[6, 6, 0, 0, 6, 6]]),
]
| 35.818182 | 81 | 0.258037 | 194 | 1,182 | 1.572165 | 0.139175 | 0.295082 | 0.354098 | 0.354098 | 0.632787 | 0.622951 | 0.622951 | 0.622951 | 0.622951 | 0.622951 | 0 | 0.274783 | 0.513536 | 1,182 | 32 | 82 | 36.9375 | 0.255652 | 0.084602 | 0 | 0.62963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.037037 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
fea9a7e827daeaf46b4d6433ff51205221e9c1e7 | 38,663 | py | Python | Av2/atividade2.py | victorrsouzas/TeoriaDosGrafos | cbeb3fda4cf13eeb6e023383925851623a9165f0 | [
"MIT"
] | 1 | 2021-05-09T23:32:32.000Z | 2021-05-09T23:32:32.000Z | Av2/teste.py | victorrsouzas/TeoriaDosGrafos | cbeb3fda4cf13eeb6e023383925851623a9165f0 | [
"MIT"
] | null | null | null | Av2/teste.py | victorrsouzas/TeoriaDosGrafos | cbeb3fda4cf13eeb6e023383925851623a9165f0 | [
"MIT"
] | null | null | null | import time
import networkx as nx
import matplotlib.pyplot as plt
import sys
import numpy as np
import glob
buffer = 0
d = 1
e = 1
G = nx.DiGraph() # Direcionado
G2 = nx.Graph() # Não Direcionado
contaAresta = 0
def menu_Grafos():
print("""
-----------------------------
Teoria Grafos
-----------------------------
""")
time.sleep(1)
while d == 1:
print("""
-----------------------------
SISTEMA GRAFOS
-----------------------------
VOCÊ DESEJA INICIAR UM GRAFO?
- SIM
- NÃO
-----------------------------""")
a = input(" Opção: ").upper()
if a == "SIM":
grafos()
elif a == "NÃO" or a == "NAO":
sys.exit()
else:
print("\nRESPONDA APENAS SIM OU NÃO")
def menu_Tipo_Grafo():
print("""
-----------------------------
ADICIONAR GRAFO
-----------------------------
""")
print("""
Escolha a opção do grafo:
(1) Direcionado
(2) Não Direcionado
(3) Sair
""")
def menu_Tipo():
print("""
-----------------------------
TIPO GRAFO
-----------------------------
""")
print("""
(1) Valorado
(2) Não Valorado
(3) Sair
""")
def menu_Visualização(peso, buffer):
global opcao
global contaAresta
global G
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print("""
-----------------------------
VISUALIZAÇÃO
-----------------------------
""")
print("""
(1) Lista de arestas
(2) Lista de graus
(3) Lista de vertices
(4) Tamanho do grafo
(5) Matriz de adjacências
(6) Verificar vertices adjacentes
(7) Plot do Grafo
(8) Algoritmo de Dijkstra
(9) Algoritmo de Bellman-Ford
(10) Sair
""")
opcao = int(input(" Opção: "))
print("\n")
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
print("""
-----------------------------
VISUALIZAÇÃO
-----------------------------
""")
print("""
(1) Lista de arestas
(2) Lista de graus
(3) Lista de vertices
(4) Tamanho do grafo
(5) Matriz de adjacências
(6) Verificar vertices adjacentes
(7) Plot do Grafo
(8) Sair
""")
opcao = int(input(" Opção: "))
print("\n")
def menu_Opcoes(peso, buffer):
global opcao
global contaAresta
global G
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print("""
-----------------------------
OPÇÕES
-----------------------------
""")
print("""
(1) Incluir Vertices e Arestas
(2) Alterar Peso da Aresta
(3) Remover Vertices
(4) Visualizar o grafo e os dados
(5) Importar arquivo .csv
(6) Sair
""")
opcao = int(input(" Opção: "))
print("\n")
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
print("""
-----------------------------
OPÇÕES
-----------------------------
""")
print("""
(1) Incluir Vertices
(2) Remover Vertices
(3) Visualizar o grafo e os dados
(4) Importar arquivo .csv
(5) Sair
""")
opcao = int(input(" Opção: "))
print("\n")
def grafos():
while d == 1:
menu_Tipo_Grafo()
try:
buffer = int(input(" Opção: "))
# FOR DIRECIONADO
if buffer == 1:
menu_Tipo()
peso = int(input(" O seu grafo direcionado vai ser:"))
# FOR DIRECIONADO E VALORADO
if peso == 1:
opcoes(peso, buffer)
# FOR DIRECIONADO E NÃO VALORADO
elif peso == 2:
opcoes(peso, buffer)
# SAIR DA OPÇÃO DIRECIONADO: VALORADO OU NÃO VALORADO
elif peso == 3:
menu_Grafos()
# FOR NÃO DIRECIONADO
elif buffer == 2:
menu_Tipo()
peso = int(
input(" O seu grafo não direcionado vai ser: "))
# FOR NÃO DIRECIONADO E VALORADO
if peso == 1:
opcoes(peso, buffer)
# FOR NÃO DIRECIONADO E NÃO VALORADO
elif peso == 2:
opcoes(peso, buffer)
# SAIR
elif peso == 3:
menu_Grafos()
elif buffer == 3:
menu_Grafos()
        except ValueError as err:
            print(f" Erro no tipo da entrada: {err}")
break
def opcoes(peso, buffer):
global opcao
global contaAresta
global G
menu_Opcoes(peso, buffer)
try:
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
# INCLUIR VERTICE
if opcao == 1:
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
incluir_Vertice_ArestaValorado(peso, buffer)
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
incluir_Vertice_ArestaNValorado(peso, buffer)
return opcoes(peso, buffer)
#ALTERAR O PESO DA ARESTA
elif opcao == 2:
alterarPeso(peso, buffer)
return opcoes(peso,buffer)
# REMOVER VERTICE
elif opcao == 3:
removerVertice()
return opcoes(peso, buffer)
# VISUALIZAR GRAFO E OS DADOS
elif opcao == 4:
opcoes_Visualização(peso, buffer)
return opcoes(peso, buffer)
# IMPORTAR DADOS .CSV
elif opcao == 5:
import_csv(peso, buffer)
return opcoes(peso, buffer)
# SAIR
elif opcao == 6:
export_csv(peso, buffer)
return grafos()
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
# INCLUIR VERTICE
if opcao == 1:
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
incluir_Vertice_ArestaValorado(peso, buffer)
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
incluir_Vertice_ArestaNValorado(peso, buffer)
return opcoes(peso, buffer)
# REMOVER VERTICE
elif opcao == 2:
removerVertice()
return opcoes(peso, buffer)
# VISUALIZAR GRAFO E OS DADOS
elif opcao == 3:
opcoes_Visualização(peso, buffer)
return opcoes(peso, buffer)
# IMPORTAR DADOS .CSV
elif opcao == 4:
import_csv(peso, buffer)
return opcoes(peso, buffer)
# SAIR
elif opcao == 5:
export_csv(peso, buffer)
return grafos()
    except ValueError as err:
        print(f"Erro no tipo da entrada: {err}")
def export_csv(peso, buffer):
    global contaAresta
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
x = input(
" Você deseja exportar o grafo em .CSV [S/N]? ").upper()
if x == "S":
y = str(input(" Digite o nome do arquivo:"))
nx.write_weighted_edgelist(
G, # grafo
y, # nome do arquivo
delimiter=",", # separador
encoding='utf-8' # codificação
)
G.clear()
G2.clear()
contaAresta = 0
if x == "N":
G.clear()
G2.clear()
contaAresta = 0
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
x = input(
" Você deseja exportar o grafo em .CSV [S/N]? ").upper()
if x == "S":
y = str(input(" Digite o nome do arquivo:"))
nx.write_weighted_edgelist(
G2, # grafo
y, # nome do arquivo
delimiter=",", # separador
encoding='utf-8' # codificação
)
G.clear()
G2.clear()
contaAresta = 0
if x == "N":
G.clear()
G2.clear()
contaAresta = 0
def import_csv(peso, buffer):
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print("""
-----------------------------
LISTA DE ARQUIVOS
-----------------------------
""")
for f in glob.glob('*.*'):
print(f" {f}")
print("\n")
x = str(input(" Digite o nome do arquivo:"))
if x in glob.glob('*.*'):
graph = nx.read_weighted_edgelist(x, delimiter=',', create_using=G,nodetype=str,encoding='utf-8')
else:
print("\n Arquivo inexistente")
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
print("""
-----------------------------
LISTA DE ARQUIVOS
-----------------------------
""")
for f in glob.glob('*.*'):
print(f" {f}")
print("\n")
x = str(input(" Digite o nome do arquivo:"))
if x in glob.glob('*.*'):
graph = nx.read_weighted_edgelist(x, delimiter=',', create_using=G2,nodetype=str,encoding='utf-8')
else:
print("\n Arquivo inexistente")
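# The export/import pair above round-trips the graph through networkx's
# weighted-edgelist CSV format. A self-contained sketch of that round trip
# (the file path argument is illustrative; read_weighted_edgelist parses
# weights back as floats):
def _csv_roundtrip_demo(caminho):
    import networkx as nx
    g = nx.DiGraph()
    g.add_edge('A', 'B', weight=3)
    nx.write_weighted_edgelist(g, caminho, delimiter=',', encoding='utf-8')
    h = nx.read_weighted_edgelist(caminho, delimiter=',',
                                  create_using=nx.DiGraph(),
                                  nodetype=str, encoding='utf-8')
    return h['A']['B']['weight']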
def incluir_Vertice_ArestaValorado(op, buff):
global contaAresta
global verticeInput1
global verticeInput2
global opcao
global G
try:
while e == 1:
if op == 1 and buff == 1:
verticeInput1 = input(" Digite o vertice de saida: ")
verticeInput2 = input(" Digite o vertice de chegada: ")
if op == 1 and buff == 2:
verticeInput1 = input(" Digite o vertice: ")
verticeInput2 = input(" Digite o vertice: ")
if ((verticeInput1 != verticeInput2) and (verticeInput2 != None) and (verticeInput1 != None)):
if verticeInput2 != "":
valor = int(input(" Digite o peso da aresta: "))
contaAresta += 1
G.add_node(verticeInput1)
G2.add_node(verticeInput1)
else:
valor = 0
contaAresta += 1
G.add_node(verticeInput1)
G2.add_node(verticeInput1)
if verticeInput2 != "":
G.add_node(verticeInput2)
G2.add_node(verticeInput2)
if op == 1 and buff == 1:
G.add_edge(verticeInput1, verticeInput2, weight=valor)
G2.add_edge(verticeInput1, verticeInput2)
if op == 1 and buff == 2:
G.add_edge(verticeInput1, verticeInput2)
G2.add_edge(verticeInput1, verticeInput2, weight=valor)
else:
print(" Operação não válida, tente novamente")
x = input(
"\n Você deseja continuar adicionando vértices[S/N]? ").upper()
if x == "S":
continue
if x == "N":
break
    except ValueError as err:
        print(f" Erro no tipo da entrada: {err}")
def incluir_Vertice_ArestaNValorado(op, buff):
global contaAresta
global verticeInput1
global verticeInput2
try:
while e == 1:
if op == 2 and buff == 1:
verticeInput1 = input(" Digite o vertice de saida: ")
verticeInput2 = input(" Digite o vertice de chegada: ")
if op == 2 and buff == 2:
verticeInput1 = input(" Digite o vertice: ")
verticeInput2 = input(" Digite o vertice: ")
if ((verticeInput1 != verticeInput2) and (verticeInput2 != None) and (verticeInput1 != None)):
contaAresta += 1
if op == 2 and buff == 1:
if verticeInput2 != "":
G.add_node(verticeInput1)
else:
G.add_node(verticeInput1)
if verticeInput2 != "":
G.add_node(verticeInput2)
G.add_edge(verticeInput1,
verticeInput2)
if op == 2 and buff == 2:
if verticeInput2 != "":
G2.add_node(verticeInput1)
else:
G2.add_node(verticeInput1)
if verticeInput2 != "":
G2.add_node(verticeInput2)
G2.add_edge(verticeInput1,
verticeInput2)
x = input(
"\n Você deseja continuar adicionando vértices[S/N]? ").upper()
if x == "S":
continue
if x == "N":
break
    except ValueError as err:
        print(f" Erro no tipo da entrada: {err}")
def alterarPeso(op, buff):
global opcao
global contaAresta
global G
global verticeInput1
global verticeInput2
try:
tamanho = len(G.nodes)
if tamanho > 0:
while e == 1:
if op == 1 and buff == 1:
verticeInput1 = input(" Digite o vertice de saida: ")
for edge in G.edges():
if verticeInput1 == edge[0]:
verticeInput2 = input(" Digite o vertice de chegada: ")
if verticeInput2 == edge[1]:
if ((verticeInput1 != verticeInput2) and (verticeInput2 != None) and (verticeInput1 != None)):
if verticeInput2 != "":
valor = int(input(" Digite o peso novo da aresta: "))
contaAresta += 1
G.add_node(verticeInput1)
G2.add_node(verticeInput1)
else:
valor = 0
contaAresta += 1
G.add_node(verticeInput1)
G2.add_node(verticeInput1)
if verticeInput2 != "":
G.add_node(verticeInput2)
G2.add_node(verticeInput2)
if op == 1 and buff == 1:
G.add_edge(verticeInput1, verticeInput2, weight=valor)
G2.add_edge(verticeInput1, verticeInput2)
else:
print(" Operação não válida, tente novamente")
break
else:
print(" Não faz ligação com a origem")
break
else:
print(" Não é um vertice de origem")
x = input(
"\n Você deseja continuar alterando o peso das arestas[S/N]? ").upper()
if x == "S":
continue
if x == "N":
break
if op == 1 and buff == 2:
verticeInput1 = input(" Digite o vertice: ")
for edge in G.edges():
if verticeInput1 == edge[0]:
verticeInput2 = input(" Digite o vertice: ")
if verticeInput2 == edge[1]:
if ((verticeInput1 != verticeInput2) and (verticeInput2 != None) and (verticeInput1 != None)):
if verticeInput2 != "":
valor = int(input(" Digite o peso novo da aresta: "))
contaAresta += 1
G.add_node(verticeInput1)
G2.add_node(verticeInput1)
else:
valor = 0
contaAresta += 1
G.add_node(verticeInput1)
G2.add_node(verticeInput1)
if verticeInput2 != "":
G.add_node(verticeInput2)
G2.add_node(verticeInput2)
if op == 1 and buff == 2:
G.add_edge(verticeInput1, verticeInput2)
G2.add_edge(verticeInput1, verticeInput2, weight=valor)
else:
print(" Operação não válida, tente novamente")
else:
print(" Não faz ligação com a origem")
else:
print(" Não é um vertice de origem")
x = input(
"\n Você deseja continuar alterando o peso das arestas[S/N]? ").upper()
if x == "S":
continue
if x == "N":
break
else:
print(" Lista Vazia")
    except ValueError as err:
        print(f" Erro no tipo da entrada: {err}")
def removerVertice():
global opcao
global contaAresta
global G
try:
tamanho = len(G.nodes)
if tamanho > 0:
while e == 1:
print(f"\n Lista de Vértices: {G.nodes()}")
r = input(" Deseja remover qual vertice: ")
if r in G.nodes():
G.remove_node(r)
contaAresta -= 1
else:
print(" Vertice não se encontra no grafo")
x = input(
"\n Você deseja continuar adicionando vértices[S/N]? ").upper()
if x == "S":
continue
if x == "N":
break
else:
print(" Lista Vazia")
    except ValueError as err:
        print(f" Erro no tipo da entrada: {err}")
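# networkx's remove_node, used above, also discards every edge incident to the
# removed vertex, so no separate edge cleanup is required. A small sketch:
def _remove_node_demo():
    import networkx as nx
    g = nx.DiGraph()
    g.add_edge('A', 'B', weight=1)
    g.add_edge('B', 'C', weight=2)
    g.remove_node('B')
    # Both edges touched 'B', so only isolated vertices remain
    return set(g.nodes()), g.number_of_edges()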
def opcoes_Visualização(peso, buffer):
global opcao
global contaAresta
global G
menu_Visualização(peso, buffer)
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
if opcao == 1:
gerar_Grafo_Lista(peso, buffer)
return opcoes_Visualização(peso, buffer)
elif opcao == 2:
imprimir_GrauVertice(peso, buffer)
return opcoes_Visualização(peso, buffer)
elif opcao == 3:
print("""
-----------------------------
LISTA DE VERTICES
-----------------------------
""")
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print(f" Lista de Vértices: {G.nodes()}")
if peso == 2 and buffer == 1:
print(f" Lista de Vértices: {G.nodes()}")
if peso == 2 and buffer == 2:
print(f" Lista de Vértices: {G2.nodes()}")
print("---------------------------------------------------")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 4:
print("""
-----------------------------
TAMANHO DO GRAFO
-----------------------------
""")
print(f" Tamanho do Grafo: {contaAresta}")
print("---------------------------------------------------")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 5:
print("""
-----------------------------
MATRIZ DE ADJACÊNCIA
-----------------------------
""")
if peso == 1 and buffer == 1:
A = nx.adjacency_matrix(G2)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
if peso == 1 and buffer == 2:
A = nx.adjacency_matrix(G)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
if peso == 2 and buffer == 1:
A = nx.adjacency_matrix(G)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
if peso == 2 and buffer == 2:
A = nx.adjacency_matrix(G2)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 6:
print("""
-----------------------------
VERTICES ADJACENTES
-----------------------------
""")
u = input(" Vértice 1: ")
v = input(" Vértice 2: ")
            contador = 0
            # Directed, weighted graphs live in G
            if peso == 1 and buffer == 1:
                if G.has_edge(u, v):
                    contador += 1
            # Undirected, weighted graphs live in G2
            if peso == 1 and buffer == 2:
                if G2.has_edge(u, v):
                    contador += 1
if contador == 1:
print(" São adjacentes")
print("\n")
return opcoes_Visualização(peso, buffer)
elif contador <= 0:
print(" Não são adjacentes")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 7:
print("""
-----------------------------
PLOT DO GRAFO
-----------------------------
""")
fig, ax = plt.subplots(figsize=(25, 25))
# Plot do Grafo Direcionado Valorado
if peso == 1 and buffer == 1:
node_size = [2500 for node in G.nodes]
pos = nx.spring_layout(G)
labels = nx.get_edge_attributes(G, 'weight')
nx.draw_networkx_edge_labels(
G, pos, edge_labels=labels)
options = {
'width': 1.0,
'arrowstyle': '-|>',
'arrowsize': 12,
}
nx.draw_networkx(G, pos, arrows=True,
with_labels=True, node_size=node_size, **options)
plt.show()
return opcoes_Visualização(peso, buffer)
# Plot do Grafo Não Direcionado Valorado
elif peso == 1 and buffer == 2:
pos = nx.spring_layout(G2)
labels = nx.get_edge_attributes(G2, 'weight')
nx.draw_networkx_edge_labels(G2, pos, edge_labels=labels)
nx.draw(G2, pos, with_labels=True)
plt.show()
return opcoes_Visualização(peso, buffer)
# Plot do grafo
# Direcionado e Não Valorado
elif peso == 2 and buffer == 1:
pos = nx.spring_layout(G)
nx.draw(G, pos, with_labels=True)
plt.show()
return opcoes_Visualização(peso, buffer)
# Não direcionado e Não Valorado
elif peso == 2 and buffer == 2:
pos = nx.spring_layout(G2)
nx.draw(G2, pos, with_labels=True)
plt.show()
return opcoes_Visualização(peso, buffer)
elif opcao == 8:
print("""
-----------------------------
ALGORITMO DE DIJKSTRA
-----------------------------
""")
if (peso == 1 and buffer == 1):
menu_Dijkstra(peso, buffer)
else:
print(" O algoritmo é exclusivo para a opção de direcionado e valorado")
return opcoes_Visualização(peso, buffer)
elif opcao == 9:
print("""
-----------------------------
ALGORITMO DE BELLMAN-FORD
-----------------------------
""")
if (peso == 1 and buffer == 1):
menu_BellmanFord(peso, buffer)
else:
print(" O algoritmo é exclusivo para a opção de direcionado e valorado")
return opcoes_Visualização(peso, buffer)
return opcoes_Visualização(peso, buffer)
elif opcao == 10:
return opcoes(peso, buffer)
if (peso == 2 and buffer == 1) or (peso == 2 and buffer == 2):
if opcao == 1:
gerar_Grafo_Lista(peso, buffer)
return opcoes_Visualização(peso, buffer)
elif opcao == 2:
imprimir_GrauVertice(peso, buffer)
return opcoes_Visualização(peso, buffer)
elif opcao == 3:
print("""
-----------------------------
LISTA DE VERTICES
-----------------------------
""")
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print(f" Lista de Vértices: {G.nodes()}")
if peso == 2 and buffer == 1:
print(f" Lista de Vértices: {G.nodes()}")
if peso == 2 and buffer == 2:
print(f" Lista de Vértices: {G2.nodes()}")
print("---------------------------------------------------")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 4:
print("""
-----------------------------
TAMANHO DO GRAFO
-----------------------------
""")
print(f" Tamanho do Grafo: {contaAresta}")
print("---------------------------------------------------")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 5:
print("""
-----------------------------
MATRIZ DE ADJACÊNCIA
-----------------------------
""")
if peso == 1 and buffer == 1:
A = nx.adjacency_matrix(G2)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
if peso == 1 and buffer == 2:
A = nx.adjacency_matrix(G)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
if peso == 2 and buffer == 1:
A = nx.adjacency_matrix(G)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
if peso == 2 and buffer == 2:
A = nx.adjacency_matrix(G2)
print(f"{A.todense()}")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 6:
print("""
-----------------------------
VERTICES ADJACENTES
-----------------------------
""")
u = input(" Vértice 1: ")
v = input(" Vértice 2: ")
                contador = 0
                # Directed, unweighted graphs live in G
                if peso == 2 and buffer == 1:
                    if G.has_edge(u, v):
                        contador += 1
                # Undirected, unweighted graphs live in G2
                if peso == 2 and buffer == 2:
                    if G2.has_edge(u, v):
                        contador += 1
if contador == 1:
print(" São adjacentes")
print("\n")
return opcoes_Visualização(peso, buffer)
elif contador <= 0:
print(" Não são adjacentes")
print("\n")
return opcoes_Visualização(peso, buffer)
elif opcao == 7:
print("""
-----------------------------
PLOT DO GRAFO
-----------------------------
""")
fig, ax = plt.subplots(figsize=(25, 25))
# Plot do Grafo Direcionado Valorado
if peso == 1 and buffer == 1:
node_size = [2500 for node in G.nodes]
pos = nx.spring_layout(G)
labels = nx.get_edge_attributes(G, 'weight')
nx.draw_networkx_edge_labels(
G, pos, edge_labels=labels)
options = {
'width': 1.0,
'arrowstyle': '-|>',
'arrowsize': 12,
}
nx.draw_networkx(G, pos, arrows=True,
with_labels=True, node_size=node_size, **options)
plt.show()
return opcoes_Visualização(peso, buffer)
# Plot do Grafo Não Direcionado Valorado
elif peso == 1 and buffer == 2:
pos = nx.spring_layout(G2)
labels = nx.get_edge_attributes(G2, 'weight')
nx.draw_networkx_edge_labels(G2, pos, edge_labels=labels)
nx.draw(G2, pos, with_labels=True)
plt.show()
return opcoes_Visualização(peso, buffer)
# Plot do grafo
# Direcionado e Não Valorado
elif peso == 2 and buffer == 1:
pos = nx.spring_layout(G)
nx.draw(G, pos, with_labels=True)
plt.show()
return opcoes_Visualização(peso, buffer)
# Não direcionado e Não Valorado
elif peso == 2 and buffer == 2:
pos = nx.spring_layout(G2)
nx.draw(G2, pos, with_labels=True)
plt.show()
return opcoes_Visualização(peso, buffer)
elif opcao == 8:
return opcoes(peso, buffer)
def gerar_Grafo_Lista(modo, tipo):
global opcao
global contaAresta
global G
# Direcionado e Valorado
if modo == 1 and tipo == 1:
print("""
-----------------------------
LISTA DE ARESTAS
-----------------------------
""")
i = 1
for edge in G.edges():
u = edge[0]
v = edge[1]
print(
f"\n Par {i}: O peso da aresta: {edge}, vale: {G[u][v]['weight']}")
i += 1
print("---------------------------------------------------")
print("\n")
# Direcionado e Não Valorado
elif modo == 2 and tipo == 1:
print("""
-----------------------------
PARES DE VERTICES
-----------------------------
""")
i = 1
for edge in G.edges():
u = edge[0]
v = edge[1]
print(f"\n Par {i} de Vertices: {edge}")
i += 1
print("---------------------------------------------------")
print("\n")
# Não Direcionado e Valorado
elif modo == 1 and tipo == 2:
print("""
-----------------------------
LISTA DE ARESTAS
-----------------------------
""")
i = 1
for edge in G2.edges():
u = edge[0]
v = edge[1]
print(
f"\n Par {i}: O peso da aresta: {edge}, vale: {G2[u][v]['weight']}")
i += 1
print("---------------------------------------------------")
print("\n")
# Não Direcionado e Não valorado
elif modo == 2 and tipo == 2:
print("""
-----------------------------
PARES DE VERTICES
-----------------------------
""")
i = 1
        for edge in G2.edges():
u = edge[0]
v = edge[1]
print(f"\n Par {i} de Vertices: {edge}")
i += 1
print("---------------------------------------------------")
print("\n")
def imprimir_GrauVertice(modo, tipo):
global opcao
global contaAresta
global G
# Direcionado
if (modo == 1 or modo == 2) and tipo == 1:
print("""
-----------------------------
LISTA DE GRAUS
-----------------------------
""")
print(f" {G.degree}")
print("---------------------------------------------------")
print("\n")
# Não Direcionado
elif (modo == 1 or modo == 2) and tipo == 2:
print("""
-----------------------------
LISTA DE GRAUS
-----------------------------
""")
        print(f" {G2.degree}")
print("---------------------------------------------------")
print("\n")
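# G.degree, printed above, is a view of (node, degree) pairs; on a DiGraph it
# sums in- and out-edges, with in_degree/out_degree available separately:
def _degree_demo():
    import networkx as nx
    g = nx.DiGraph()
    g.add_edges_from([('A', 'B'), ('A', 'C'), ('B', 'A')])
    return dict(g.degree), dict(g.out_degree)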
def menu_Dijkstra(peso, buffer):
global opcao
global contaAresta
global G
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print("""
(1) Caminho e custo especifico
(2) Todos os caminhos e custos
(3) Sair
""")
opcao = int(input(" Opção: "))
print("\n")
if opcao == 1:
x = input(" Vertice de Origem: ")
y = input(" Vertice de Destino: ")
dijkstra_one_to_one(G, x, y)
return menu_Dijkstra(peso, buffer)
elif opcao == 2:
dijkstra_one_to_all(G)
return menu_Dijkstra(peso, buffer)
elif opcao == 3:
return opcoes_Visualização(peso, buffer)
def menu_BellmanFord(peso, buffer):
global opcao
global contaAresta
global G
if (peso == 1 and buffer == 1) or (peso == 1 and buffer == 2):
print("""
(1) Caminho e custo especifico
(2) Todos os caminhos e custos
(3) Sair
""")
opcao = int(input(" Opção: "))
print("\n")
        if opcao == 1:
            x = input(" Vertice de Origem: ")
            y = input(" Vertice de Destino: ")
            try:
                length = nx.bellman_ford_path_length(G, x, y, weight="weight")
                path = nx.bellman_ford_path(G, x, y, weight="weight")
                print(" O menor custo entre o vértice ", x, " e o ", y, " é de ", length)
                print(" O caminho de menor custo entre o vértice ", x, " e o vértice ", y, " é: ", path)
            except nx.exception.NetworkXNoPath:
                print(' Sem caminho de ', x, ' para ', y)
            return menu_BellmanFord(peso, buffer)
        elif opcao == 2:
            # all_pairs_bellman_ford_path_length yields (source, {target: cost}) pairs
            for no, dist in nx.all_pairs_bellman_ford_path_length(G, weight='weight'):
                print(" O menor custo entre o vértice ", no, " e o ", dist)
            return menu_BellmanFord(peso, buffer)
        elif opcao == 3:
            return opcoes_Visualização(peso, buffer)
def dijkstra_one_to_all(G):
try:
print("""
--------------------------------------------
MENOR CUSTO ENTRE OS VERTICES
--------------------------------------------
""")
for x in G.nodes():
length = nx.shortest_path_length(
G, source=x, weight="weight", method='dijkstra')
print(" O menor custo entre o vértice ",
x, " e o ", length)
print("\n")
print("""
--------------------------------------------
MENOR CAMINHO ENTRE OS VERTICES
--------------------------------------------
""")
for x in G.nodes():
path = nx.shortest_path(G, source=x, weight='weight')
print(" O caminho de menor custo entre o vértice ", x, ": ", path)
print("\n")
path_lengths = dict(
nx.all_pairs_dijkstra_path_length(G, weight='weight'))
# sum the lengths to individual nodes
new_dict = {node1: sum([length for length in path_lengths[node1].values()])
for node1 in path_lengths.keys()}
# print the lengths
print("""
--------------------------------------------
SOMA DOS CUSTOS DOS VERTICES
--------------------------------------------
""")
for node, length in new_dict.items():
print(' A soma dos comprimentos do vertice {} para todos os outros vertices é {}.'.format(
node, length))
except nx.exception.NetworkXNoPath:
print(' Sem caminho')
print("\n")
return
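# nx.shortest_path_length with a source but no target, as used above, returns
# a dict mapping each reachable vertex to its distance from the source:
def _one_to_all_demo():
    import networkx as nx
    g = nx.DiGraph()
    g.add_edge('A', 'B', weight=1)
    g.add_edge('B', 'C', weight=2)
    g.add_edge('A', 'C', weight=5)
    # 'C' is cheaper via 'B' (1 + 2) than via the direct edge (5)
    return nx.shortest_path_length(g, source='A', weight='weight',
                                   method='dijkstra')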
def dijkstra_one_to_one(G, origem, destino):
if destino != origem:
try:
length = nx.shortest_path_length(
G, source=origem, target=destino, weight="weight", method='dijkstra')
print(" O menor custo entre o vértice ",
origem, " e o ", destino, " é de ", length)
path = nx.shortest_path(
G, source=origem, target=destino, weight="weight")
print(" O caminho de menor custo entre o vértice ",
origem, " e o vértice ", destino, " é: ", path)
x = input(
"\n Você deseja visualizar o grafo [S/N]? ").upper()
if x == "S":
plotGraphDijkstra(G, path)
if x == "N":
return
except nx.exception.NetworkXNoPath:
print(' Sem caminho de ', origem, ' para ', destino)
else:
print("origem = destino")
return
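# For a single origin/destination pair, nx.shortest_path returns the cheapest
# vertex sequence and raises NetworkXNoPath when the target is unreachable:
def _one_to_one_demo():
    import networkx as nx
    g = nx.DiGraph()
    g.add_edge('A', 'B', weight=1)
    g.add_edge('B', 'C', weight=2)
    path = nx.shortest_path(g, source='A', target='C', weight='weight')
    try:
        # No edges point back toward 'A', so this direction has no path
        nx.shortest_path(g, source='C', target='A', weight='weight')
        alcanca = True
    except nx.exception.NetworkXNoPath:
        alcanca = False
    return path, alcanca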
def plotGraphDijkstra(G, path):
fig, ax = plt.subplots(figsize=(25, 25))
node_size = [2500 for node in G.nodes]
pos = nx.spring_layout(G)
labels = nx.get_edge_attributes(G, 'weight')
    # Collect the edges that belong to the shortest path
    arestas_vermelhas = list(zip(path, path[1:]))
    # Paint edges on the path red and every other edge black
    cor_arestas = [
        'black' if not edge in arestas_vermelhas else 'red' for edge in G.edges()]
    # Paint vertices on the path green and every other vertex yellow
    cor_vertices = [
        'yellow' if not node in path else 'green' for node in G.nodes()]
nx.draw_networkx_edges(G, pos, edge_color=cor_arestas)
nx.draw_networkx_edge_labels(G, pos, edge_labels=labels)
options = {
'width': 1.0,
'arrowstyle': '-|>',
'arrowsize': 12,
}
nx.draw_networkx(G, pos, arrows=True,
with_labels=True, node_color=cor_vertices, edge_color=cor_arestas, node_size=node_size, **options)
    # Display the graph
plt.show()
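# plotGraphDijkstra derives the highlighted edges by pairing consecutive path
# vertices with zip(path, path[1:]); in isolation:
def _path_edges_demo(path):
    return list(zip(path, path[1:]))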
menu_Grafos()
| 35.084392 | 126 | 0.417686 | 3,662 | 38,663 | 4.342709 | 0.078919 | 0.049676 | 0.052569 | 0.033453 | 0.852481 | 0.836572 | 0.808024 | 0.781802 | 0.748727 | 0.715903 | 0 | 0.022837 | 0.425782 | 38,663 | 1,101 | 127 | 35.116258 | 0.693482 | 0.033055 | 0 | 0.849896 | 0 | 0.00207 | 0.241074 | 0.06501 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021739 | false | 0 | 0.011387 | 0 | 0.087992 | 0.144928 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
228caef49c45375849f5b64b25f202f0c6aa0828 | 21,314 | py | Python | sdk/storage/azure-storage-queue/azure/storage/queue/_generated/operations/_queue_operations.py | vchske/azure-sdk-for-python | 6383ed3676b7355af7be394562b126209961ec13 | [
"MIT"
] | null | null | null | sdk/storage/azure-storage-queue/azure/storage/queue/_generated/operations/_queue_operations.py | vchske/azure-sdk-for-python | 6383ed3676b7355af7be394562b126209961ec13 | [
"MIT"
] | 1 | 2019-06-04T18:12:16.000Z | 2019-06-04T18:12:16.000Z | sdk/storage/azure-storage-queue/azure/storage/queue/_generated/operations/_queue_operations.py | vchske/azure-sdk-for-python | 6383ed3676b7355af7be394562b126209961ec13 | [
"MIT"
] | 1 | 2019-06-17T22:18:23.000Z | 2019-06-17T22:18:23.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: skip-file
from azure.core.exceptions import map_error
from .. import models
class QueueOperations(object):
"""QueueOperations operations.
    You should not instantiate this class directly; instead, create a Client instance that will create it for you and attach it as an attribute.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
def create(self, timeout=None, metadata=None, request_id=None, cls=None, **kwargs):
"""creates a new queue under the given account.
        :param timeout: The timeout parameter is expressed in seconds. For
        more information, see <a
        href="https://docs.microsoft.com/en-us/rest/api/storageservices/setting-timeouts-for-queue-service-operations">Setting
        Timeouts for Queue Service Operations.</a>
:type timeout: int
:param metadata: Optional. Include this parameter to specify that the
queue's metadata be returned as part of the response body. Note that
metadata requested with this parameter must be stored in accordance
with the naming restrictions imposed by the 2009-09-19 version of the
Queue service. Beginning with this version, all metadata names must
adhere to the naming conventions for C# identifiers.
:type metadata: str
:param request_id: Provides a client-generated, opaque value with a 1
KB character limit that is recorded in the analytics logs when storage
analytics logging is enabled.
:type request_id: str
:param callable cls: A custom type or function that will be passed the
direct response
:return: None or the result of cls(response)
:rtype: None
:raises:
:class:`StorageErrorException<queue.models.StorageErrorException>`
"""
error_map = kwargs.pop('error_map', None)
# Construct URL
url = self.create.metadata['url']
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
if timeout is not None:
query_parameters['timeout'] = self._serialize.query("timeout", timeout, 'int', minimum=0)
# Construct headers
header_parameters = {}
if metadata is not None:
header_parameters['x-ms-meta'] = self._serialize.header("metadata", metadata, 'str')
header_parameters['x-ms-version'] = self._serialize.header("self._config.version", self._config.version, 'str')
if request_id is not None:
header_parameters['x-ms-client-request-id'] = self._serialize.header("request_id", request_id, 'str')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [201, 204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise models.StorageErrorException(response, self._deserialize)
if cls:
response_headers = {
'x-ms-request-id': self._deserialize('str', response.headers.get('x-ms-request-id')),
'x-ms-version': self._deserialize('str', response.headers.get('x-ms-version')),
'Date': self._deserialize('rfc-1123', response.headers.get('Date')),
'x-ms-error-code': self._deserialize('str', response.headers.get('x-ms-error-code')),
}
return cls(response, None, response_headers)
create.metadata = {'url': '/{queueName}'}
def delete(self, timeout=None, request_id=None, cls=None, **kwargs):
"""operation permanently deletes the specified queue.
        :param timeout: The timeout parameter is expressed in seconds. For
        more information, see <a
        href="https://docs.microsoft.com/en-us/rest/api/storageservices/setting-timeouts-for-queue-service-operations">Setting
        Timeouts for Queue Service Operations.</a>
:type timeout: int
:param request_id: Provides a client-generated, opaque value with a 1
KB character limit that is recorded in the analytics logs when storage
analytics logging is enabled.
:type request_id: str
:param callable cls: A custom type or function that will be passed the
direct response
:return: None or the result of cls(response)
:rtype: None
:raises:
:class:`StorageErrorException<queue.models.StorageErrorException>`
"""
error_map = kwargs.pop('error_map', None)
# Construct URL
url = self.delete.metadata['url']
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
if timeout is not None:
query_parameters['timeout'] = self._serialize.query("timeout", timeout, 'int', minimum=0)
# Construct headers
header_parameters = {}
header_parameters['x-ms-version'] = self._serialize.header("self._config.version", self._config.version, 'str')
if request_id is not None:
header_parameters['x-ms-client-request-id'] = self._serialize.header("request_id", request_id, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise models.StorageErrorException(response, self._deserialize)
if cls:
response_headers = {
'x-ms-request-id': self._deserialize('str', response.headers.get('x-ms-request-id')),
'x-ms-version': self._deserialize('str', response.headers.get('x-ms-version')),
'Date': self._deserialize('rfc-1123', response.headers.get('Date')),
'x-ms-error-code': self._deserialize('str', response.headers.get('x-ms-error-code')),
}
return cls(response, None, response_headers)
delete.metadata = {'url': '/{queueName}'}
def get_properties(self, timeout=None, request_id=None, cls=None, **kwargs):
"""Retrieves user-defined metadata and queue properties on the specified
queue. Metadata is associated with the queue as name-values pairs.
:param timeout: The The timeout parameter is expressed in seconds. For
more information, see <a
href="https://docs.microsoft.com/en-us/rest/api/storageservices/setting-timeouts-for-queue-service-operations>Setting
Timeouts for Queue Service Operations.</a>
:type timeout: int
:param request_id: Provides a client-generated, opaque value with a 1
KB character limit that is recorded in the analytics logs when storage
analytics logging is enabled.
:type request_id: str
:param callable cls: A custom type or function that will be passed the
direct response
:return: None or the result of cls(response)
:rtype: None
:raises:
:class:`StorageErrorException<queue.models.StorageErrorException>`
"""
error_map = kwargs.pop('error_map', None)
comp = "metadata"
# Construct URL
url = self.get_properties.metadata['url']
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
if timeout is not None:
query_parameters['timeout'] = self._serialize.query("timeout", timeout, 'int', minimum=0)
query_parameters['comp'] = self._serialize.query("comp", comp, 'str')
# Construct headers
header_parameters = {}
header_parameters['x-ms-version'] = self._serialize.header("self._config.version", self._config.version, 'str')
if request_id is not None:
header_parameters['x-ms-client-request-id'] = self._serialize.header("request_id", request_id, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise models.StorageErrorException(response, self._deserialize)
if cls:
response_headers = {
'x-ms-meta': self._deserialize('{str}', response.headers.get('x-ms-meta')),
'x-ms-approximate-messages-count': self._deserialize('int', response.headers.get('x-ms-approximate-messages-count')),
'x-ms-request-id': self._deserialize('str', response.headers.get('x-ms-request-id')),
'x-ms-version': self._deserialize('str', response.headers.get('x-ms-version')),
'Date': self._deserialize('rfc-1123', response.headers.get('Date')),
'x-ms-error-code': self._deserialize('str', response.headers.get('x-ms-error-code')),
}
return cls(response, None, response_headers)
get_properties.metadata = {'url': '/{queueName}'}
def set_metadata(self, timeout=None, metadata=None, request_id=None, cls=None, **kwargs):
"""sets user-defined metadata on the specified queue. Metadata is
associated with the queue as name-value pairs.
:param timeout: The The timeout parameter is expressed in seconds. For
more information, see <a
href="https://docs.microsoft.com/en-us/rest/api/storageservices/setting-timeouts-for-queue-service-operations>Setting
Timeouts for Queue Service Operations.</a>
:type timeout: int
:param metadata: Optional. Include this parameter to specify that the
queue's metadata be returned as part of the response body. Note that
metadata requested with this parameter must be stored in accordance
with the naming restrictions imposed by the 2009-09-19 version of the
Queue service. Beginning with this version, all metadata names must
adhere to the naming conventions for C# identifiers.
:type metadata: str
:param request_id: Provides a client-generated, opaque value with a 1
KB character limit that is recorded in the analytics logs when storage
analytics logging is enabled.
:type request_id: str
:param callable cls: A custom type or function that will be passed the
direct response
:return: None or the result of cls(response)
:rtype: None
:raises:
:class:`StorageErrorException<queue.models.StorageErrorException>`
"""
error_map = kwargs.pop('error_map', None)
comp = "metadata"
# Construct URL
url = self.set_metadata.metadata['url']
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
if timeout is not None:
query_parameters['timeout'] = self._serialize.query("timeout", timeout, 'int', minimum=0)
query_parameters['comp'] = self._serialize.query("comp", comp, 'str')
# Construct headers
header_parameters = {}
if metadata is not None:
header_parameters['x-ms-meta'] = self._serialize.header("metadata", metadata, 'str')
header_parameters['x-ms-version'] = self._serialize.header("self._config.version", self._config.version, 'str')
if request_id is not None:
header_parameters['x-ms-client-request-id'] = self._serialize.header("request_id", request_id, 'str')
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise models.StorageErrorException(response, self._deserialize)
if cls:
response_headers = {
'x-ms-request-id': self._deserialize('str', response.headers.get('x-ms-request-id')),
'x-ms-version': self._deserialize('str', response.headers.get('x-ms-version')),
'Date': self._deserialize('rfc-1123', response.headers.get('Date')),
'x-ms-error-code': self._deserialize('str', response.headers.get('x-ms-error-code')),
}
return cls(response, None, response_headers)
set_metadata.metadata = {'url': '/{queueName}'}
def get_access_policy(self, timeout=None, request_id=None, cls=None, **kwargs):
"""returns details about any stored access policies specified on the queue
that may be used with Shared Access Signatures.
:param timeout: The The timeout parameter is expressed in seconds. For
more information, see <a
href="https://docs.microsoft.com/en-us/rest/api/storageservices/setting-timeouts-for-queue-service-operations>Setting
Timeouts for Queue Service Operations.</a>
:type timeout: int
:param request_id: Provides a client-generated, opaque value with a 1
KB character limit that is recorded in the analytics logs when storage
analytics logging is enabled.
:type request_id: str
:param callable cls: A custom type or function that will be passed the
direct response
:return: list or the result of cls(response)
:rtype: list[~queue.models.SignedIdentifier]
:raises:
:class:`StorageErrorException<queue.models.StorageErrorException>`
"""
error_map = kwargs.pop('error_map', None)
comp = "acl"
# Construct URL
url = self.get_access_policy.metadata['url']
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
if timeout is not None:
query_parameters['timeout'] = self._serialize.query("timeout", timeout, 'int', minimum=0)
query_parameters['comp'] = self._serialize.query("comp", comp, 'str')
# Construct headers
header_parameters = {}
header_parameters['Accept'] = 'application/xml'
header_parameters['x-ms-version'] = self._serialize.header("self._config.version", self._config.version, 'str')
if request_id is not None:
header_parameters['x-ms-client-request-id'] = self._serialize.header("request_id", request_id, 'str')
# Construct and send request
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise models.StorageErrorException(response, self._deserialize)
header_dict = {}
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('[SignedIdentifier]', response)
header_dict = {
'x-ms-request-id': self._deserialize('str', response.headers.get('x-ms-request-id')),
'x-ms-version': self._deserialize('str', response.headers.get('x-ms-version')),
'Date': self._deserialize('rfc-1123', response.headers.get('Date')),
'x-ms-error-code': self._deserialize('str', response.headers.get('x-ms-error-code')),
}
if cls:
return cls(response, deserialized, header_dict)
return deserialized
get_access_policy.metadata = {'url': '/{queueName}'}
def set_access_policy(self, queue_acl=None, timeout=None, request_id=None, cls=None, **kwargs):
"""sets stored access policies for the queue that may be used with Shared
Access Signatures.
:param queue_acl: the acls for the queue
:type queue_acl: list[~queue.models.SignedIdentifier]
:param timeout: The The timeout parameter is expressed in seconds. For
more information, see <a
href="https://docs.microsoft.com/en-us/rest/api/storageservices/setting-timeouts-for-queue-service-operations>Setting
Timeouts for Queue Service Operations.</a>
:type timeout: int
:param request_id: Provides a client-generated, opaque value with a 1
KB character limit that is recorded in the analytics logs when storage
analytics logging is enabled.
:type request_id: str
:param callable cls: A custom type or function that will be passed the
direct response
:return: None or the result of cls(response)
:rtype: None
:raises:
:class:`StorageErrorException<queue.models.StorageErrorException>`
"""
error_map = kwargs.pop('error_map', None)
comp = "acl"
# Construct URL
url = self.set_access_policy.metadata['url']
path_format_arguments = {
'url': self._serialize.url("self._config.url", self._config.url, 'str', skip_quote=True)
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
if timeout is not None:
query_parameters['timeout'] = self._serialize.query("timeout", timeout, 'int', minimum=0)
query_parameters['comp'] = self._serialize.query("comp", comp, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/xml; charset=utf-8'
header_parameters['x-ms-version'] = self._serialize.header("self._config.version", self._config.version, 'str')
if request_id is not None:
header_parameters['x-ms-client-request-id'] = self._serialize.header("request_id", request_id, 'str')
# Construct body
serialization_ctxt = {'xml': {'name': 'SignedIdentifiers', 'itemsName': 'SignedIdentifier', 'wrapped': True}}
if queue_acl is not None:
body_content = self._serialize.serialize_iter(queue_acl, 'SignedIdentifier',
serialization_ctxt=serialization_ctxt)
else:
body_content = None
# Construct and send request
request = self._client.put(url, query_parameters, header_parameters, body_content)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [204]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise models.StorageErrorException(response, self._deserialize)
if cls:
response_headers = {
'x-ms-request-id': self._deserialize('str', response.headers.get('x-ms-request-id')),
'x-ms-version': self._deserialize('str', response.headers.get('x-ms-version')),
'Date': self._deserialize('rfc-1123', response.headers.get('Date')),
'x-ms-error-code': self._deserialize('str', response.headers.get('x-ms-error-code')),
}
return cls(response, None, response_headers)
set_access_policy.metadata = {'url': '/{queueName}'}
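Every operation in this class follows the same pipeline shape: build the URL, query parameters, and headers, run the pipeline, then route any non-success status code through `error_map` before falling back to the generic `StorageErrorException`. A minimal sketch of that status-check/error-mapping step (illustrative names only; not the actual azure-core/msrest implementation):

```python
class ResourceNotFoundError(Exception):
    pass

class GenericStorageError(Exception):
    pass

def map_error(status_code, response, error_map):
    # If the caller registered a specific exception class for this status
    # code, raise it; otherwise return so the generic error is raised.
    if error_map and status_code in error_map:
        raise error_map[status_code](response)

def check_status(status_code, expected, error_map=None):
    # Mirrors the pattern used after each pipeline run above.
    if status_code not in expected:
        map_error(status_code, response=None, error_map=error_map)
        raise GenericStorageError(status_code)
    return True

assert check_status(204, [204])  # the delete/set_metadata success path
```

Passing `error_map={404: ResourceNotFoundError}` via `kwargs` would make a 404 raise the specific exception instead of the generic one.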
# File: tests/test_remove_title_page.py (repo: KBNLresearch/ochre, license: Apache-2.0)
# -*- coding: utf-8 -*-
import os
from click.testing import CliRunner
# FIXME: import correct methods for testing
from ochre.remove_title_page import remove_title_page
# Documentation about testing click commands: http://click.pocoo.org/5/testing/
def test_remove_title_page_single_line():
runner = CliRunner()
with runner.isolated_filesystem():
os.makedirs('in')
os.makedirs('out')
with open('in/test-without.txt', 'w') as f:
content_without = 'Text starts here.\n' \
'Second line.\n'
f.write(content_without)
with open('in/test-with.txt', 'w') as f:
content = 'This is the title page\n' \
'Text starts here.\n' \
'Second line.\n'
f.write(content)
result = runner.invoke(remove_title_page,
['in/test-without.txt', 'in/test-with.txt',
'--out_dir', 'out'])
assert result.exit_code == 0
assert os.path.exists('out/test-with.txt')
with open('out/test-with.txt') as f:
c = f.read()
assert c == content_without
def test_remove_title_page_multiple_lines():
runner = CliRunner()
with runner.isolated_filesystem():
os.makedirs('in')
os.makedirs('out')
with open('in/test-without.txt', 'w') as f:
content_without = 'Text starts here.\n' \
'Second line.\n'
f.write(content_without)
with open('in/test-with.txt', 'w') as f:
content = 'This is the title page 1.\n' \
'This is the title page 2.\n' \
'Text starts here.\n' \
'Second line.\n'
f.write(content)
result = runner.invoke(remove_title_page,
['in/test-without.txt', 'in/test-with.txt',
'--out_dir', 'out'])
assert result.exit_code == 0
assert os.path.exists('out/test-with.txt')
with open('out/test-with.txt') as f:
c = f.read()
assert c == content_without
def test_remove_title_page_no_lines():
runner = CliRunner()
with runner.isolated_filesystem():
os.makedirs('in')
os.makedirs('out')
with open('in/test-without.txt', 'w') as f:
content_without = 'Text starts here.\n' \
'Second line.\n'
f.write(content_without)
with open('in/test-with.txt', 'w') as f:
content = 'Text starts here.\n' \
'Second line.\n'
f.write(content)
result = runner.invoke(remove_title_page,
['in/test-without.txt', 'in/test-with.txt',
'--out_dir', 'out'])
assert result.exit_code == 0
assert os.path.exists('out/test-with.txt')
with open('out/test-with.txt') as f:
c = f.read()
assert c == content_without
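The three test cases above exercise the same behaviour with one, two, and zero title-page lines. The stripping logic they target can be sketched as a pure function (an illustrative reimplementation for clarity, not ochre's actual code):

```python
def strip_title_page(reference: str, text: str) -> str:
    # Drop leading lines from `text` until the remainder matches the
    # reference text, which is assumed to start at the real body.
    lines = text.splitlines(keepends=True)
    for start in range(len(lines) + 1):
        if ''.join(lines[start:]) == reference:
            return ''.join(lines[start:])
    return text  # no alignment found; leave the text unchanged

body = 'Text starts here.\nSecond line.\n'
assert strip_title_page(body, 'This is the title page\n' + body) == body
assert strip_title_page(body, body) == body
```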
# File: pset_challenging_ext/exercises/p43.py (repo: mottaquikarim/pydev-psets, license: MIT)
"""Question:
Write a program to generate and print another tuple whose values are even numbers in the given tuple (1,2,3,4,5,6,7,8,9,10).

Hints:
Use "for" to iterate the tuple.
Use tuple() to generate a tuple from a list.
"""
# File: tests/test_notebook.py (repo: BlackHC/notebook_setup, license: MIT)
from pyfakefs import fake_filesystem
from blackhc import project
import os
def test_get_cookiecutter_project_path_from_notebooks(fs: fake_filesystem.FakeFilesystem):
fs.CreateDirectory('/tmp/blackhc.project/notebooks')
assert (project.get_cookiecutter_project_path('/tmp/blackhc.project/notebooks') == os.path.abspath(
'/tmp/blackhc.project'))
def test_get_cookiecutter_project_path_from_scripts(fs: fake_filesystem.FakeFilesystem):
fs.CreateDirectory('/tmp/blackhc.project/scripts')
assert (project.get_cookiecutter_project_path('/tmp/blackhc.project/scripts') == os.path.abspath(
'/tmp/blackhc.project'))
def test_get_cookiecutter_project_path_with_src(fs: fake_filesystem.FakeFilesystem):
fs.CreateDirectory('/tmp/blackhc.project/src')
assert (
project.get_cookiecutter_project_path('/tmp/blackhc.project/') == os.path.abspath('/tmp/blackhc.project'))
def test_get_cookiecutter_project_path_without_src(fs: fake_filesystem.FakeFilesystem):
fs.CreateDirectory('/tmp/blackhc.project')
assert project.get_cookiecutter_project_path('/tmp/blackhc.project/') is None
# File: negative.py (repo: patlii/python-with-kids, license: Unlicense)
#!/usr/bin/env python3
for person in range(1,10):
if (3 * person + 8) == (5 * person + 2):
print('person: %d' % person)
print('apple: %d' % (3 * person + 8))
print('-----')
for person in range(1,10):
if (3 * person + 8) == (5 * person - 2):
print('person: %d' % person)
print('apple: %d' % (3 * person + 8))
| 22.928571 | 41 | 0.52648 | 51 | 321 | 3.313725 | 0.352941 | 0.16568 | 0.189349 | 0.189349 | 0.87574 | 0.87574 | 0.87574 | 0.87574 | 0.87574 | 0.87574 | 0 | 0.076305 | 0.224299 | 321 | 13 | 42 | 24.692308 | 0.60241 | 0.065421 | 0 | 0.666667 | 0 | 0 | 0.143813 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.555556 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 9 |
# File: tests/check_stack_tag_values_test.py (repo: zaro0508/sceptrelint, license: Apache-2.0)
from __future__ import annotations
from pre_commit_hooks.check_stack_tag_values import lint
from testing.util import get_resource_path
TEST_RESOURCES_DIR = 'check_stack_tag_values'
def test_stack_tags_key_exist_value_valid_single_file():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_match_tag_values1.yaml']
assert not lint(files, 'color', [f'{resource_path}/tag_values1.json'], [])
def test_stack_tags_key_exist_value_valid_multiple_files():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_match_tag_values2.yaml']
assert not lint(
files, 'color', [
f'{resource_path}/tag_values1.json',
f'{resource_path}/tag_values2.json',
],
[],
)
def test_stack_tags_non_matching_value_exclude_matching_tag():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_exclude_tags.yaml']
assert lint(
files, 'color', [
f'{resource_path}/tag_values1.json',
],
['grey'],
)
def test_stack_tags_matching_value_exclude_non_matching_tags():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_exclude_tags.yaml']
assert not lint(
files, 'color', [f'{resource_path}/tag_values1.json'],
['yellow', 'bla'],
)
def test_stack_tags_key_missing():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_missing.yaml']
assert lint(files, 'color', [f'{resource_path}/tag_values1.json'], [])
def test_stack_tags_tag_non_matching_value():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_non_matching_value.yaml']
assert lint(files, 'color', [f'{resource_path}/tag_values1.json'], [])
def test_stack_tags_tag_missing():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_missing.yaml']
assert lint(files, 'color', [f'{resource_path}/tag_values1.json'], [])
def test_stack_tags_tag_not_in_config_file():
resource_path = get_resource_path(TEST_RESOURCES_DIR)
files = [f'{resource_path}/stack_tags_tag_not_set.yaml']
assert lint(files, 'color', [f'{resource_path}/tag_values1.json'], [])
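The tests above pin down `lint`'s contract: truthy when a tag is missing or has a non-matching value, falsy when the value is allowed or explicitly excluded from checking. A minimal sketch of that contract for a single stack's tags (inferred from the tests; not sceptrelint's implementation):

```python
def lint_tag(stack_tags, key, allowed_values, exclude_values=()):
    # Return True when there is a problem, mirroring how `lint` above
    # is truthy on failure. Names and semantics here are inferred.
    if key not in stack_tags:
        return True  # tag missing entirely
    value = stack_tags[key]
    if value in exclude_values:
        return False  # excluded values are not checked
    return value not in allowed_values

assert not lint_tag({'color': 'yellow'}, 'color', {'yellow', 'red'})
assert lint_tag({'color': 'grey'}, 'color', {'yellow', 'red'})
assert not lint_tag({'color': 'grey'}, 'color', {'yellow'}, exclude_values={'grey'})
assert lint_tag({}, 'color', {'yellow'})
```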
# File: apps/backend/processor/tests/core/test_logs.py (repo: jetoslabs/event-processor, license: MIT)
import pytest
from processor.core.log import generate_trace_id
# @pytest.mark.asyncio
# async def test_generate_trace_id():
# trace_id = await generate_trace_id()
# print(f"trace_id: {trace_id}")
# assert trace_id is not None
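The disabled test above needs `pytest-asyncio` (hence the commented `@pytest.mark.asyncio` marker). Outside pytest, the same check can be driven with `asyncio.run`; the coroutine below is a stand-in, since `generate_trace_id`'s real behaviour isn't shown here:

```python
import asyncio
import uuid

async def generate_trace_id():
    # Stand-in for processor.core.log.generate_trace_id; the real
    # implementation may differ, but any awaitable returning a
    # non-None id satisfies the disabled assertion above.
    return uuid.uuid4().hex

trace_id = asyncio.run(generate_trace_id())
print(f"trace_id: {trace_id}")
assert trace_id is not None
```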
| 21.909091 | 48 | 0.73444 | 37 | 241 | 4.486486 | 0.567568 | 0.295181 | 0.271084 | 0.168675 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170124 | 241 | 10 | 49 | 24.1 | 0.83 | 0.680498 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
438da400c176b2e8e55db86377a6dcf94f588eb7 | 18,120 | py | Python | Source/testMarket.py | XiaotongAndyDing/WeTrade | 719fa3dedb15d66107c9c54bf4ef7b9001f62662 | [
"MIT"
] | 1 | 2022-03-12T10:42:04.000Z | 2022-03-12T10:42:04.000Z | Source/testMarket.py | XiaotongAndyDing/WeTrade | 719fa3dedb15d66107c9c54bf4ef7b9001f62662 | [
"MIT"
] | null | null | null | Source/testMarket.py | XiaotongAndyDing/WeTrade | 719fa3dedb15d66107c9c54bf4ef7b9001f62662 | [
"MIT"
] | 2 | 2021-12-03T02:52:59.000Z | 2021-12-10T01:07:13.000Z | from unittest import TestCase
import numpy as np
from Source.Market import Stock, Market, StockGeometricBrownianMotion, StockMeanRevertingGeometricBrownianMotion, \
Derivative, Option, EuropeanCallOption, EuropeanPutOption, StockTrendingGeometricBrownianMotion, \
MockStockGeometricBrownianMotion
class TestMarket(TestCase):
def test_check_current_value(self):
test_market = Market([Stock('stock_test_1', 100, 0, 0), Stock('stock_test_2', 101, 0, 0)])
# test creation of Stock
self.assertIn('stock_test_1', test_market._financial_product_dict)
self.assertIn('stock_test_2', test_market._financial_product_dict)
self.assertEqual(100, test_market.check_value('stock_test_1'))
self.assertEqual(101, test_market.check_value('stock_test_2'))
test_market.evolve()
self.assertEqual(100, test_market.check_value('stock_test_1'))
self.assertEqual(101, test_market.check_value('stock_test_2'))
def test_mark_current_value_to_record(self):
test_market = Market([Stock('stock_test_1', 100, 1, 0), Stock('stock_test_2', 101, 2, 0)])
test_market.mark_current_value_to_record(0)
self.assertEqual(1, len(test_market._financial_product_dict['stock_test_1'].price_record))
self.assertEqual(100, test_market._financial_product_dict['stock_test_1'].price_record[0])
self.assertEqual(1, len(test_market._financial_product_dict['stock_test_2'].price_record))
self.assertEqual(101, test_market._financial_product_dict['stock_test_2'].price_record[0])
self.assertEqual(100, test_market.check_record_value('stock_test_1', 0))
self.assertEqual(101, test_market.check_record_value('stock_test_2', 0))
test_market.evolve()
test_market.mark_current_value_to_record(1)
self.assertEqual(2, len(test_market._financial_product_dict['stock_test_1'].price_record))
self.assertEqual(100 + 1, test_market._financial_product_dict['stock_test_1'].price_record[1])
self.assertEqual(2, len(test_market._financial_product_dict['stock_test_2'].price_record))
self.assertEqual(101 + 2, test_market._financial_product_dict['stock_test_2'].price_record[1])
self.assertEqual(100, test_market.check_record_value('stock_test_1', 0))
self.assertEqual(101, test_market.check_record_value('stock_test_2', 0))
self.assertEqual(101, test_market.check_record_value('stock_test_1', 1))
self.assertEqual(103, test_market.check_record_value('stock_test_2', 1))
def test_check_initial_value(self):
test_market = Market([Stock('stock_test_1', 100, 1, 0), Stock('stock_test_2', 101, 2, 0)])
self.assertEqual(100, test_market.check_initial_value('stock_test_1'))
self.assertEqual(101, test_market.check_initial_value('stock_test_2'))
def test_check_delta(self):
stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
option_test = EuropeanCallOption('option_test', [stock_test], 100, 252)
test_market = Market([stock_test, option_test])
self.assertAlmostEqual(0.691, option_test.delta, delta=0.001)
self.assertAlmostEqual(0.691, test_market.check_delta('option_test'), delta=0.001)
def test_check_type(self):
stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
option_test = EuropeanCallOption('option_test', [stock_test], 100, 252)
test_market = Market([stock_test, option_test])
self.assertEqual('Option', test_market.check_type('option_test'))
self.assertEqual('Stock', test_market.check_type('stock_gbm_test'))
self.assertEqual('Cash', test_market.check_type('Cash'))
def test_check_underlier(self):
stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
option_test = EuropeanCallOption('option_test', [stock_test], 100, 252)
test_market = Market([stock_test, option_test])
self.assertEqual('stock_gbm_test', test_market.check_underlier('option_test'))
class TestStock(TestCase):
def test_evolve(self):
stock_test = Stock('stock_test', 100, 0, 0)
# test creation of Stock
self.assertEqual('stock_test', stock_test.name)
self.assertEqual(100, stock_test.check_value())
# test evolve of Stock
stock_test.evolve()
# we set mu = 0, sigma = 0 in stock noise, so the stock price is the same after evolution.
self.assertAlmostEqual(100, stock_test.check_value(), delta=1e-6)
stock_test = Stock('stock_test', 100, 1, 0)
stock_test.evolve()
self.assertAlmostEqual(101, stock_test.check_value(), delta=1e-6)
def test_initial_value(self):
stock_test = Stock('stock_test', 100, 0, 0)
self.assertEqual(100, stock_test.check_initial_value())
def test_mark_current_value_to_record(self):
stock_test = Stock('stock_test', 100, 1, 0)
stock_test.mark_current_value_to_record(0)
self.assertEqual(1, len(stock_test.price_record))
self.assertEqual(100, stock_test.price_record[0])
stock_test.evolve()
stock_test.mark_current_value_to_record(1)
self.assertEqual(2, len(stock_test.price_record))
self.assertEqual(100, stock_test.price_record[0])
self.assertEqual(101, stock_test.price_record[1])
def test_simulate_price_moves(self):
stock_test = Stock('stock_test', 100, 1, 0) # stock_test increases $1 every day
simulated_future_prices = stock_test.simulate_price_moves(0, 10, 1000)
self.assertEqual(100, stock_test.current_value) # simulate price method does not influence its current value
self.assertEqual(1000, len(simulated_future_prices))
self.assertEqual(110, np.max(simulated_future_prices)) # 100 + 10 * 1 = 110
self.assertEqual(110, np.min(simulated_future_prices))
stock_test = Stock('stock_test', 100, 1, 1)
simulated_future_prices = stock_test.simulate_price_moves(0, 10)
self.assertAlmostEqual(110, float(np.mean(simulated_future_prices)), delta=5)
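# The assertions in test_simulate_price_moves pin down the deterministic case:
# mu = 1, sigma = 0 over 10 days yields exactly 110 on every path. A hedged,
# stdlib-only sketch of an arithmetic random walk consistent with those
# assertions follows; the real simulate_price_moves may differ in detail, and
# the n_paths parameter name here is illustrative, not taken from the class.

```python
import random

def simulate_price_moves(start, mu, sigma, horizon, n_paths=1000):
    # Arithmetic random walk: each day adds mu plus N(0, sigma) noise.
    # With sigma = 0 every path ends at exactly start + horizon * mu.
    return [start + sum(mu + random.gauss(0.0, sigma) for _ in range(horizon))
            for _ in range(n_paths)]

prices = simulate_price_moves(100, 1, 0, 10, n_paths=1000)
```

# With sigma = 1 the path mean stays near start + horizon * mu, matching the
# delta=5 tolerance used in the final assertion above.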


class TestStockGeometricBrownianMotion(TestCase):
    def test_evolve(self):
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 0)
        stock_test.evolve()
        self.assertAlmostEqual(100, stock_test.check_value(), delta=1e-6)
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0.01, 0)
        stock_test.evolve()
        self.assertAlmostEqual(100 * np.exp(0.01), stock_test.check_value(), delta=1e-6)
        stock_test.mu = 0.05
        stock_test.evolve()
        self.assertAlmostEqual(100 * np.exp(0.01) * np.exp(0.05), stock_test.check_value(), delta=1e-6)
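# The sigma = 0 assertions above fix the per-step dynamics of the GBM stock:
# each evolve() multiplies the price by exp(mu + sigma * Z). A minimal sketch
# of that single step, independent of the class under test:

```python
import math
import random

def gbm_step(price, mu, sigma):
    # One geometric-Brownian-motion step: S <- S * exp(mu + sigma * Z), Z ~ N(0, 1).
    # With sigma = 0 this reduces to the deterministic 100 -> 100 * exp(0.01) above.
    return price * math.exp(mu + sigma * random.gauss(0.0, 1.0))

price = gbm_step(100.0, 0.01, 0.0)        # 100 * exp(0.01)
price = gbm_step(price, 0.05, 0.0)        # then * exp(0.05), as in the mu-change case
```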


class TestStockMeanRevertingGeometricBrownianMotion(TestCase):
    def test_evolve(self):
        stock_test = StockMeanRevertingGeometricBrownianMotion('stock_mr_gbm_test', 100, 0, 0,
                                                               equilibrium_price=100, mean_reversion_speed=0)
        stock_test.evolve()
        self.assertAlmostEqual(100, stock_test.check_value(), delta=1e-6)
        stock_test = StockMeanRevertingGeometricBrownianMotion('stock_mr_gbm_test', 200, 0, 0,
                                                               equilibrium_price=100, mean_reversion_speed=0.001)
        # the initial stock price is higher than the equilibrium price; after a long time the
        # stock price is very close to the equilibrium price
        for _ in range(100):
            stock_test.evolve()
        self.assertAlmostEqual(100, stock_test.check_value(), delta=0.01)
        stock_test = StockMeanRevertingGeometricBrownianMotion('stock_mr_gbm_test', 50, 0, 0,
                                                               equilibrium_price=100, mean_reversion_speed=0.001)
        # the initial stock price is lower than the equilibrium price; after a long time the
        # stock price is very close to the equilibrium price
        for _ in range(100):
            stock_test.evolve()
        self.assertAlmostEqual(100, stock_test.check_value(), delta=0.01)
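# The tests above only pin down the fixed point: with no noise, the price drifts
# to the equilibrium from either side. For intuition, here is the textbook
# mean-reverting update on log-price. This is a sketch, not the class's actual
# dynamics — the kappa value is illustrative, and the class under test evidently
# uses a faster effective reversion, since speed 0.001 reaches equilibrium
# within 100 steps in the assertions above.

```python
import math

def mean_reverting_step(price, equilibrium, kappa):
    # Pull log-price a fraction kappa of the way toward log(equilibrium);
    # the unique fixed point is the equilibrium price, from above or below.
    log_p = math.log(price) + kappa * (math.log(equilibrium) - math.log(price))
    return math.exp(log_p)

p = 200.0
for _ in range(100):
    p = mean_reverting_step(p, 100.0, 0.05)
# p has moved most of the way from 200 toward the equilibrium at 100
```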


class TestStockTrendingGeometricBrownianMotion(TestCase):
    def test_evolve(self):
        stock_test = StockTrendingGeometricBrownianMotion('stock_trending_gbm_test', 100, 0.01, 0,
                                                          trend_scale_param=0, trend_decay_param=1)
        # stock price dynamic without trend
        stock_test.mark_current_value_to_record(0)
        for time in range(1, 5):
            stock_test.evolve(time)
            stock_test.mark_current_value_to_record(time)
        self.assertAlmostEqual(100 * (np.exp(0.01) ** 4), stock_test.check_value(), delta=1e-6)
        stock_test = StockTrendingGeometricBrownianMotion('stock_trending_gbm_test', 100, 0.01, 0,
                                                          trend_scale_param=0.1, trend_decay_param=1)
        # stock price dynamic with trend
        stock_test.mark_current_value_to_record(0)
        for time in range(1, 5):
            stock_test.evolve(time)
            stock_test.mark_current_value_to_record(time)
        self.assertAlmostEqual(104.233315, stock_test.check_value(), delta=1e-6)


class TestMockStockGeometricBrownianMotion(TestCase):
    def test_evolve(self):
        stock_test = MockStockGeometricBrownianMotion('mock_stock_gbm_test', 100, 0, 0.01)
        self.assertEqual(100, stock_test.current_value)
        next_day_value = stock_test.next_period_value
        stock_test.evolve(1)
        self.assertEqual(next_day_value, stock_test.current_value)


class TestDerivative(TestCase):
    def test_init(self):
        stock_test_1 = StockGeometricBrownianMotion('stock_gbm_test_1', 100, 0, 0)
        stock_test_2 = StockGeometricBrownianMotion('stock_gbm_test_2', 101, 0, 0)
        derivative_test = Derivative('derivative_test', [stock_test_1, stock_test_2])
        self.assertEqual(2, len(derivative_test.underlyings))
        self.assertEqual(100, derivative_test.underlyings[0].current_value)
        self.assertEqual(101, derivative_test.underlyings[1].current_value)


class TestOption(TestCase):
    def test_init(self):
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 0)
        option_test = Option('option_test', [stock_test], 110, 10)
        self.assertEqual(110, option_test.strike)
        self.assertEqual(10, option_test.expiry)
        self.assertEqual(100, option_test.underlying.current_value)


class TestEuropeanCallOption(TestCase):
    def test_evolve(self):
        # Limit Case: Deep ITM Option, Stock Volatility is very small.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e-6)
        option_test = EuropeanCallOption('option_test', [stock_test], 90, 10)
        self.assertAlmostEqual(10, option_test.current_value, delta=1e-6)
        self.assertAlmostEqual(1, option_test.delta, delta=1e-6)
        self.assertAlmostEqual(0, option_test.gamma, delta=1e-6)
        self.assertAlmostEqual(0, option_test.vega, delta=1e-6)
        # Limit Case: Deep OTM Option, Stock Volatility is very small.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e-6)
        option_test = EuropeanCallOption('option_test', [stock_test], 110, 10)
        self.assertAlmostEqual(0, option_test.current_value, delta=1e-6)
        self.assertAlmostEqual(0, option_test.delta, delta=1e-6)
        self.assertAlmostEqual(0, option_test.gamma, delta=1e-6)
        self.assertAlmostEqual(0, option_test.vega, delta=1e-6)
        # Limit Case: Stock Volatility is very large.
        # The payoff is unbounded above while the loss is floored at zero, so as volatility
        # grows the call price approaches the stock price, regardless of the strike.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e6)
        option_test = EuropeanCallOption('option_test', [stock_test], 90, 10)
        self.assertAlmostEqual(100, option_test.current_value, delta=1e-6)
        self.assertAlmostEqual(1, option_test.delta, delta=1e-6)
        self.assertAlmostEqual(0, option_test.gamma, delta=1e-6)
        self.assertAlmostEqual(0, option_test.vega, delta=1e-6)
        # Limit Case: Already Expired.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e-6)
        option_test = EuropeanCallOption('option_test', [stock_test], 110, 0)
        self.assertAlmostEqual(0, option_test.current_value, delta=1e-6)
        option_test = EuropeanCallOption('option_test', [stock_test], 90, 0)
        self.assertAlmostEqual(10, option_test.current_value, delta=1e-6)
        # Normal Case: ATM.
        # Online Option Price Calculator: https://goodcalculators.com/black-scholes-calculator/
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
        option_test = EuropeanCallOption('option_test', [stock_test], 100, 252)  # 252 business days per year
        self.assertAlmostEqual(38.292, option_test.current_value, delta=0.001)
        self.assertAlmostEqual(38.292, option_test.initial_value, delta=0.001)
        self.assertAlmostEqual(0.691, option_test.delta, delta=0.001)
        self.assertAlmostEqual(0.004, option_test.gamma, delta=0.001)
        self.assertAlmostEqual(35.207, option_test.vega, delta=0.001)
        # Normal Case: ITM.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
        option_test = EuropeanCallOption('option_test', [stock_test], 90, 252)  # 252 business days per year
        self.assertAlmostEqual(41.563, option_test.current_value, delta=0.001)
        self.assertAlmostEqual(0.728, option_test.delta, delta=0.001)
        self.assertAlmostEqual(0.004, option_test.gamma, delta=0.001)
        self.assertAlmostEqual(33.215, option_test.vega, delta=0.001)
        # Normal Case: OTM.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
        option_test = EuropeanCallOption('option_test', [stock_test], 110, 252)  # 252 business days per year
        self.assertAlmostEqual(35.375, option_test.current_value, delta=0.001)
        self.assertAlmostEqual(0.657, option_test.delta, delta=0.001)
        self.assertAlmostEqual(0.004, option_test.gamma, delta=0.001)
        self.assertAlmostEqual(36.758, option_test.vega, delta=0.001)


class TestEuropeanPutOption(TestCase):
    def test_evolve(self):
        # Limit Case: Deep OTM Option, Stock Volatility is very small.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e-6)
        option_test = EuropeanPutOption('option_test', [stock_test], 90, 10)
        self.assertAlmostEqual(0, option_test.current_value, delta=1e-6)
        self.assertAlmostEqual(0, option_test.delta, delta=1e-6)
        self.assertAlmostEqual(0, option_test.gamma, delta=1e-6)
        self.assertAlmostEqual(0, option_test.vega, delta=1e-6)
        # Limit Case: Deep ITM Option, Stock Volatility is very small.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e-6)
        option_test = EuropeanPutOption('option_test', [stock_test], 110, 10)
        self.assertAlmostEqual(10, option_test.current_value, delta=1e-6)
        self.assertAlmostEqual(-1, option_test.delta, delta=1e-6)
        self.assertAlmostEqual(0, option_test.gamma, delta=1e-6)
        self.assertAlmostEqual(0, option_test.vega, delta=1e-6)
        # Limit Case: Stock Volatility is very large.
        # As volatility grows the put price approaches the discounted strike (r = 0 here, so 110).
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e6)
        option_test = EuropeanPutOption('option_test', [stock_test], 110, 10)
        self.assertAlmostEqual(110, option_test.current_value, delta=1e-6)
        self.assertAlmostEqual(0, option_test.delta, delta=1e-6)
        self.assertAlmostEqual(0, option_test.gamma, delta=1e-6)
        self.assertAlmostEqual(0, option_test.vega, delta=1e-6)
        # Limit Case: Already Expired.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1e-6)
        option_test = EuropeanPutOption('option_test', [stock_test], 110, 0)
        self.assertAlmostEqual(10, option_test.current_value, delta=1e-6)
        option_test = EuropeanPutOption('option_test', [stock_test], 90, 0)
        self.assertAlmostEqual(0, option_test.current_value, delta=1e-6)
        # Normal Case: ATM.
        # Online Option Price Calculator: https://goodcalculators.com/black-scholes-calculator/
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
        option_test = EuropeanPutOption('option_test', [stock_test], 100, 252)  # 252 business days per year
        self.assertAlmostEqual(38.292, option_test.current_value, delta=0.001)
        self.assertAlmostEqual(38.292, option_test.initial_value, delta=0.001)
        self.assertAlmostEqual(-0.309, option_test.delta, delta=0.001)
        self.assertAlmostEqual(0.004, option_test.gamma, delta=0.001)
        self.assertAlmostEqual(35.207, option_test.vega, delta=0.001)
        # Normal Case: ITM.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
        option_test = EuropeanPutOption('option_test', [stock_test], 110, 252)  # 252 business days per year
        self.assertAlmostEqual(45.375, option_test.current_value, delta=0.001)
        self.assertAlmostEqual(-0.343, option_test.delta, delta=0.001)
        self.assertAlmostEqual(0.004, option_test.gamma, delta=0.001)
        self.assertAlmostEqual(36.758, option_test.vega, delta=0.001)
        # Normal Case: OTM.
        stock_test = StockGeometricBrownianMotion('stock_gbm_test', 100, 0, 1 / np.sqrt(252))
        option_test = EuropeanPutOption('option_test', [stock_test], 90, 252)  # 252 business days per year
        self.assertAlmostEqual(31.563, option_test.current_value, delta=0.001)
        self.assertAlmostEqual(-0.272, option_test.delta, delta=0.001)
        self.assertAlmostEqual(0.003, option_test.gamma, delta=0.001)
        self.assertAlmostEqual(33.215, option_test.vega, delta=0.001)
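# The expected prices and deltas in the two option test classes above follow
# from the Black-Scholes formula with a zero interest rate: the daily
# volatility 1 / sqrt(252) annualizes to sigma = 1 and the 252-day expiry to
# T = 1 year. A stdlib-only sketch, independent of the classes under test,
# that reproduces the pinned values:

```python
import math

def _norm_cdf(x):
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, sigma, t, r=0.0):
    # Black-Scholes European call; at expiry fall back to intrinsic value.
    if t <= 0 or sigma <= 0:
        return max(spot - strike * math.exp(-r * t), 0.0)
    d1 = (math.log(spot / strike) + (r + 0.5 * sigma * sigma) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return spot * _norm_cdf(d1) - strike * math.exp(-r * t) * _norm_cdf(d2)

def bs_put(spot, strike, sigma, t, r=0.0):
    # Put-call parity: P = C - S + K * exp(-r t). At r = 0 the ATM put equals
    # the ATM call, which is why both test classes assert 38.292.
    return bs_call(spot, strike, sigma, t, r) - spot + strike * math.exp(-r * t)

atm_call = bs_call(100, 100, 1.0, 1.0)   # ~38.292, as asserted above
atm_delta = _norm_cdf(0.5)               # ~0.691
```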
# -*- encoding: utf-8 -*-
# Source: watcher/tests/decision_engine/model/notification/test_cinder_notifications.py
# Repo: ajaytikoo/watcher (Apache-2.0), commit 439ff43d8c5c8856261576db8c83a4b0f058175f
# Copyright 2017 NEC Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import datetime
import os
from unittest import mock
from oslo_serialization import jsonutils
from watcher.common import cinder_helper
from watcher.common import context
from watcher.common import exception
from watcher.common import service as watcher_service
from watcher.db.sqlalchemy import api as db_api
from watcher.decision_engine.model.notification import cinder as cnotification
from watcher.tests import base as base_test
from watcher.tests.db import utils
from watcher.tests.decision_engine.model import faker_cluster_state
from watcher.tests.decision_engine.model.notification import fake_managers


class NotificationTestCase(base_test.TestCase):

    @staticmethod
    def load_message(filename):
        cwd = os.path.abspath(os.path.dirname(__file__))
        data_folder = os.path.join(cwd, "data")
        with open(os.path.join(data_folder, filename), 'rb') as json_file:
            json_data = jsonutils.load(json_file)
        return json_data


class TestReceiveCinderNotifications(NotificationTestCase):

    FAKE_METADATA = {'message_id': None, 'timestamp': None}

    def setUp(self):
        super(TestReceiveCinderNotifications, self).setUp()

        p_from_dict = mock.patch.object(context.RequestContext, 'from_dict')
        m_from_dict = p_from_dict.start()
        m_from_dict.return_value = self.context
        self.addCleanup(p_from_dict.stop)
        p_get_service_list = mock.patch.object(
            db_api.Connection, 'get_service_list')
        p_update_service = mock.patch.object(
            db_api.Connection, 'update_service')
        m_get_service_list = p_get_service_list.start()
        m_update_service = p_update_service.start()
        fake_service = utils.get_test_service(
            created_at=datetime.datetime.utcnow())
        m_get_service_list.return_value = [fake_service]
        m_update_service.return_value = fake_service.copy()
        self.addCleanup(p_get_service_list.stop)
        self.addCleanup(p_update_service.stop)

    @mock.patch.object(cnotification.CapacityNotificationEndpoint, 'info')
    def test_cinder_receive_capacity(self, m_info):
        message = self.load_message('capacity.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'capacity.host1@backend1#pool1', 'capacity.pool',
            expected_message, self.FAKE_METADATA)

    @mock.patch.object(cnotification.VolumeCreateEnd, 'info')
    def test_cinder_receive_volume_create_end(self, m_info):
        message = self.load_message('scenario_1_volume-create.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'volume.host_0@backend_0#pool_0',
            'volume.create.end', expected_message, self.FAKE_METADATA)

    @mock.patch.object(cnotification.VolumeUpdateEnd, 'info')
    def test_cinder_receive_volume_update_end(self, m_info):
        message = self.load_message('scenario_1_volume-update.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'volume.host_0@backend_0#pool_0',
            'volume.update.end', expected_message, self.FAKE_METADATA)

    @mock.patch.object(cnotification.VolumeAttachEnd, 'info')
    def test_cinder_receive_volume_attach_end(self, m_info):
        message = self.load_message('scenario_1_volume-attach.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'volume.host_0@backend_0#pool_0',
            'volume.attach.end', expected_message, self.FAKE_METADATA)

    @mock.patch.object(cnotification.VolumeDetachEnd, 'info')
    def test_cinder_receive_volume_detach_end(self, m_info):
        message = self.load_message('scenario_1_volume-detach.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'volume.host_0@backend_0#pool_0',
            'volume.detach.end', expected_message, self.FAKE_METADATA)

    @mock.patch.object(cnotification.VolumeResizeEnd, 'info')
    def test_cinder_receive_volume_resize_end(self, m_info):
        message = self.load_message('scenario_1_volume-resize.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'volume.host_0@backend_0#pool_0',
            'volume.resize.end', expected_message, self.FAKE_METADATA)

    @mock.patch.object(cnotification.VolumeDeleteEnd, 'info')
    def test_cinder_receive_volume_delete_end(self, m_info):
        message = self.load_message('scenario_1_volume-delete.json')
        expected_message = message['payload']
        de_service = watcher_service.Service(fake_managers.FakeStorageManager)
        incoming = mock.Mock(ctxt=self.context.to_dict(), message=message)
        de_service.notification_handler.dispatcher.dispatch(incoming)
        m_info.assert_called_once_with(
            self.context, 'volume.host_0@backend_0#pool_0',
            'volume.delete.end', expected_message, self.FAKE_METADATA)


class TestCinderNotifications(NotificationTestCase):

    FAKE_METADATA = {'message_id': None, 'timestamp': None}

    def setUp(self):
        super(TestCinderNotifications, self).setUp()
        # fake cluster
        self.fake_cdmc = faker_cluster_state.FakerStorageModelCollector()

    def test_cinder_capacity(self):
        """test consuming capacity"""
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.CapacityNotificationEndpoint(self.fake_cdmc)
        pool_0_name = 'host_0@backend_0#pool_0'
        pool_0 = storage_model.get_pool_by_pool_name(pool_0_name)
        # before
        self.assertEqual(pool_0_name, pool_0.name)
        self.assertEqual(420, pool_0.free_capacity_gb)
        self.assertEqual(420, pool_0.virtual_free)
        self.assertEqual(80, pool_0.allocated_capacity_gb)
        self.assertEqual(80, pool_0.provisioned_capacity_gb)
        message = self.load_message('scenario_1_capacity.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # after
        self.assertEqual(pool_0_name, pool_0.name)
        self.assertEqual(460, pool_0.free_capacity_gb)
        self.assertEqual(460, pool_0.virtual_free)
        self.assertEqual(40, pool_0.allocated_capacity_gb)
        self.assertEqual(40, pool_0.provisioned_capacity_gb)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_capacity_pool_notfound(self, m_cinder_helper):
        """test consuming capacity, new pool in existing node"""
        # storage_pool_by_name mock
        return_mock = mock.Mock()
        return_mock.configure_mock(
            name='host_0@backend_0#pool_2',
            total_volumes='2',
            total_capacity_gb='500',
            free_capacity_gb='380',
            provisioned_capacity_gb='120',
            allocated_capacity_gb='120')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_mock)
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.CapacityNotificationEndpoint(self.fake_cdmc)
        message = self.load_message('scenario_1_capacity_pool_notfound.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # after consuming the message, pool_0 still exists
        pool_0_name = 'host_0@backend_0#pool_0'
        pool_0 = storage_model.get_pool_by_pool_name(pool_0_name)
        self.assertEqual(pool_0_name, pool_0.name)
        self.assertEqual(420, pool_0.free_capacity_gb)
        self.assertEqual(420, pool_0.virtual_free)
        self.assertEqual(80, pool_0.allocated_capacity_gb)
        self.assertEqual(80, pool_0.provisioned_capacity_gb)
        # the new pool was added
        pool_1_name = 'host_0@backend_0#pool_2'
        m_get_storage_pool_by_name.assert_called_once_with(pool_1_name)
        storage_node = storage_model.get_node_by_pool_name(pool_1_name)
        self.assertEqual('host_0@backend_0', storage_node.host)
        pool_1 = storage_model.get_pool_by_pool_name(pool_1_name)
        self.assertEqual(pool_1_name, pool_1.name)
        self.assertEqual(500, pool_1.total_capacity_gb)
        self.assertEqual(380, pool_1.free_capacity_gb)
        self.assertEqual(120, pool_1.allocated_capacity_gb)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_capacity_node_notfound(self, m_cinder_helper):
        """test consuming capacity, new pool in new node"""
        return_pool_mock = mock.Mock()
        return_pool_mock.configure_mock(
            name='host_2@backend_2#pool_0',
            total_volumes='2',
            total_capacity_gb='500',
            free_capacity_gb='460',
            provisioned_capacity_gb='40',
            allocated_capacity_gb='40')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_pool_mock)
        # storage_node_by_name mock
        return_node_mock = mock.Mock()
        return_node_mock.configure_mock(
            host='host_2@backend_2',
            zone='nova',
            state='up',
            status='enabled')
        m_get_storage_node_by_name = mock.Mock(
            side_effect=lambda name: return_node_mock)
        m_get_volume_type_by_backendname = mock.Mock(
            side_effect=lambda name: [mock.Mock('backend_2')])
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name,
            get_storage_node_by_name=m_get_storage_node_by_name,
            get_volume_type_by_backendname=m_get_volume_type_by_backendname)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.CapacityNotificationEndpoint(self.fake_cdmc)
        message = self.load_message('scenario_1_capacity_node_notfound.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # the new pool and the new node were added
        node_1_name = 'host_2@backend_2'
        pool_1_name = node_1_name + '#pool_0'
        volume_type = 'backend_2'
        m_get_storage_pool_by_name.assert_called_once_with(pool_1_name)
        m_get_storage_node_by_name.assert_called_once_with(node_1_name)
        m_get_volume_type_by_backendname.assert_called_once_with(volume_type)
        # the new node was added
        storage_node = storage_model.get_node_by_pool_name(pool_1_name)
        self.assertEqual('host_2@backend_2', storage_node.host)
        # the new pool was added
        pool_1 = storage_model.get_pool_by_pool_name(pool_1_name)
        self.assertEqual(pool_1_name, pool_1.name)
        self.assertEqual(500, pool_1.total_capacity_gb)
        self.assertEqual(460, pool_1.free_capacity_gb)
        self.assertEqual(40, pool_1.allocated_capacity_gb)
        self.assertEqual(40, pool_1.provisioned_capacity_gb)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_volume_create(self, m_cinder_helper):
        """test creating volume in existing pool and node"""
        # create storage_pool_by_name mock
        return_pool_mock = mock.Mock()
        return_pool_mock.configure_mock(
            name='host_0@backend_0#pool_0',
            total_volumes='3',
            total_capacity_gb='500',
            free_capacity_gb='380',
            provisioned_capacity_gb='120',
            allocated_capacity_gb='120')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_pool_mock)
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.VolumeCreateEnd(self.fake_cdmc)
        message = self.load_message('scenario_1_volume-create.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # check that volume00 was added to the model
        volume_00_name = '990a723f-6c19-4f83-8526-6383c9e9389f'
        volume_00 = storage_model.get_volume_by_uuid(volume_00_name)
        self.assertEqual(volume_00_name, volume_00.uuid)
        self.assertFalse(volume_00.bootable)
        # check that capacity was updated
        pool_0_name = 'host_0@backend_0#pool_0'
        m_get_storage_pool_by_name.assert_called_once_with(pool_0_name)
        pool_0 = storage_model.get_pool_by_pool_name(pool_0_name)
        self.assertEqual(pool_0.name, pool_0_name)
        self.assertEqual(3, pool_0.total_volumes)
        self.assertEqual(380, pool_0.free_capacity_gb)
        self.assertEqual(120, pool_0.allocated_capacity_gb)
        self.assertEqual(120, pool_0.provisioned_capacity_gb)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_bootable_volume_create(self, m_cinder_helper):
        """test creating bootable volume in existing pool and node"""
        # create storage_pool_by_name mock
        return_pool_mock = mock.Mock()
        return_pool_mock.configure_mock(
            name='host_0@backend_0#pool_0',
            total_volumes='3',
            total_capacity_gb='500',
            free_capacity_gb='380',
            provisioned_capacity_gb='120',
            allocated_capacity_gb='120')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_pool_mock)
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.VolumeCreateEnd(self.fake_cdmc)
        message = self.load_message('scenario_1_bootable-volume-create.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # check that volume00 was added to the model
        volume_00_name = '990a723f-6c19-4f83-8526-6383c9e9389f'
        volume_00 = storage_model.get_volume_by_uuid(volume_00_name)
        self.assertEqual(volume_00_name, volume_00.uuid)
        self.assertTrue(volume_00.bootable)
        # check that capacity was updated
        pool_0_name = 'host_0@backend_0#pool_0'
        m_get_storage_pool_by_name.assert_called_once_with(pool_0_name)
        pool_0 = storage_model.get_pool_by_pool_name(pool_0_name)
        self.assertEqual(pool_0.name, pool_0_name)
        self.assertEqual(3, pool_0.total_volumes)
        self.assertEqual(380, pool_0.free_capacity_gb)
        self.assertEqual(120, pool_0.allocated_capacity_gb)
        self.assertEqual(120, pool_0.provisioned_capacity_gb)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_volume_create_pool_notfound(self, m_cinder_helper):
        """check creating a volume in a non-existing pool and node"""
        # get_storage_pool_by_name mock
        return_pool_mock = mock.Mock()
        return_pool_mock.configure_mock(
            name='host_2@backend_2#pool_0',
            total_volumes='1',
            total_capacity_gb='500',
            free_capacity_gb='460',
            provisioned_capacity_gb='40',
            allocated_capacity_gb='40')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_pool_mock)
        # create storage_node_by_name mock
        return_node_mock = mock.Mock()
        return_node_mock.configure_mock(
            host='host_2@backend_2',
            zone='nova',
            state='up',
            status='enabled')
        m_get_storage_node_by_name = mock.Mock(
            side_effect=lambda name: return_node_mock)
        m_get_volume_type_by_backendname = mock.Mock(
            side_effect=lambda name: [mock.Mock('backend_2')])
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name,
            get_storage_node_by_name=m_get_storage_node_by_name,
            get_volume_type_by_backendname=m_get_volume_type_by_backendname)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.VolumeCreateEnd(self.fake_cdmc)
        message = self.load_message(
            'scenario_1_volume-create_pool_notfound.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # check that volume00 was added to the model
        volume_00_name = '990a723f-6c19-4f83-8526-6383c9e9389f'
        volume_00 = storage_model.get_volume_by_uuid(volume_00_name)
        self.assertEqual(volume_00_name, volume_00.uuid)
        # check that capacity was updated
        node_2_name = 'host_2@backend_2'
        pool_0_name = node_2_name + '#pool_0'
        pool_0 = storage_model.get_pool_by_pool_name(pool_0_name)
        self.assertEqual(pool_0.name, pool_0_name)
        self.assertEqual(1, pool_0.total_volumes)
        self.assertEqual(460, pool_0.free_capacity_gb)
        self.assertEqual(40, pool_0.allocated_capacity_gb)
        self.assertEqual(40, pool_0.provisioned_capacity_gb)
        # check that the node was added
        m_get_storage_node_by_name.assert_called_once_with(node_2_name)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_error_volume_unmapped(self, m_cinder_helper):
        """test creating an error volume that is not mapped to any pool"""
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=exception.PoolNotFound(name="TEST"))
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.VolumeCreateEnd(self.fake_cdmc)
        message = self.load_message('scenario_1_error-volume-create.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # we do not call get_storage_pool_by_name
        m_get_storage_pool_by_name.assert_not_called()
        # check that volume00 was added to the model
        volume_00_name = '990a723f-6c19-4f83-8526-6383c9e9389f'
        volume_00 = storage_model.get_volume_by_uuid(volume_00_name)
        self.assertEqual(volume_00_name, volume_00.uuid)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_volume_update(self, m_cinder_helper):
        """test updating volume in existing pool and node"""
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.VolumeUpdateEnd(self.fake_cdmc)
        volume_0_name = faker_cluster_state.volume_uuid_mapping['volume_0']
        volume_0 = storage_model.get_volume_by_uuid(volume_0_name)
        self.assertEqual('name_0', volume_0.name)
        # create storage_pool_by_name mock
        return_pool_mock = mock.Mock()
        return_pool_mock.configure_mock(
            name='host_0@backend_0#pool_0',
            total_volumes='2',
            total_capacity_gb='500',
            free_capacity_gb='420',
            provisioned_capacity_gb='80',
            allocated_capacity_gb='80')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_pool_mock)
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name)
        message = self.load_message('scenario_1_volume-update.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # check that the name of volume_0 was updated in the model
        volume_0 = storage_model.get_volume_by_uuid(volume_0_name)
        self.assertEqual('name_01', volume_0.name)

    @mock.patch.object(cinder_helper, 'CinderHelper')
    def test_cinder_volume_delete(self, m_cinder_helper):
        """test deleting volume"""
        # create storage_pool_by_name mock
        return_pool_mock = mock.Mock()
        return_pool_mock.configure_mock(
            name='host_0@backend_0#pool_0',
            total_volumes='1',
            total_capacity_gb='500',
            free_capacity_gb='460',
            provisioned_capacity_gb='40',
            allocated_capacity_gb='40')
        m_get_storage_pool_by_name = mock.Mock(
            side_effect=lambda name: return_pool_mock)
        m_cinder_helper.return_value = mock.Mock(
            get_storage_pool_by_name=m_get_storage_pool_by_name)
        storage_model = self.fake_cdmc.generate_scenario_1()
        self.fake_cdmc.cluster_data_model = storage_model
        handler = cnotification.VolumeDeleteEnd(self.fake_cdmc)
        # the volume exists before consuming the message
        volume_0_uuid = faker_cluster_state.volume_uuid_mapping['volume_0']
        volume_0 = storage_model.get_volume_by_uuid(volume_0_uuid)
        self.assertEqual(volume_0_uuid, volume_0.uuid)
        message = self.load_message('scenario_1_volume-delete.json')
        handler.info(
            ctxt=self.context,
            publisher_id=message['publisher_id'],
            event_type=message['event_type'],
            payload=message['payload'],
            metadata=self.FAKE_METADATA,
        )
        # the volume does not exist after consuming the message
        self.assertRaises(
            exception.VolumeNotFound,
            storage_model.get_volume_by_uuid, volume_0_uuid)
        # check that capacity was updated
        pool_0_name = 'host_0@backend_0#pool_0'
        m_get_storage_pool_by_name.assert_called_once_with(pool_0_name)
        pool_0 = storage_model.get_pool_by_pool_name(pool_0_name)
        self.assertEqual(pool_0.name, pool_0_name)
        self.assertEqual(1, pool_0.total_volumes)
        self.assertEqual(460, pool_0.free_capacity_gb)
        self.assertEqual(40, pool_0.allocated_capacity_gb)
        self.assertEqual(40, pool_0.provisioned_capacity_gb)
# _ @ Sat Apr 6 09:11:31 2019
# (o)
# (_|_) <no strint> 1.4.9 @ zvtyrdt.id
# ||| (https://github.com/zevtyardt)
def nwords(num):
    num = int(num)
    # word tables for each digit position and each power of a thousand
    units = ['', 'one', 'two', 'three', 'four',
             'five', 'six', 'seven', 'eight', 'nine']
    teens = ['', 'eleven', 'twelve', 'thirteen', 'fourteen',
             'fifteen', 'sixteen', 'seventeen', 'eighteen', 'nineteen']
    tens = ['', 'ten', 'twenty', 'thirty', 'forty',
            'fifty', 'sixty', 'seventy', 'eighty', 'ninety']
    thousands = ['', 'thousand', 'million', 'billion', 'trillion',
                 'quadrillion', 'quintillion',
((lambda:(({}<[])*(()<=[]))).func_code.co_lnotab).join(map(chr,[((((([]==[])+(()>{})+([]>=[]))+((()!=[])+({}<[])+(()>=()))+((()>{})*({}<[])))<<((([]!=())+([]==[]))+(([]!={})+([]>{}))))+(((()<=())+([]<=())+({}<={})))),(((((((({}<=())+([]<=())+({}!=[])))<<((([]>=[])+(()>={})+([]>{}))))+((({}<())*([]<=())))))<<((({}<[])+(()==()))))+((({}<=())*([]!=())))),(((((((({}!=())+(()<=[])))<<((({}<={})+(()>=()))+(({}<={})+({}>={}))))-((([]<=[])*([]>={})))))<<(((()!=[])+([]>={})+(()>=[]))))),((((([]!={})+({}<=())+({}<=()))+(([]!=())+([]<=())+(()>[]))+(({}!={})+([]<())))<<(((()>={})+(()>[]))+(({}>={})+([]!={}))))+(((([]>=())+([]!=())))<<((([]>=[])+({}<[]))))),(((((((((([]>={})+([]>{})+([]!=())))<<((({}<={})+({}<=()))))+(((()<{})+({}<[])))))))<<((({}!=[])+([]!=())+(()>={}))))+((({}!=())-({}!={})))),(((((((({}!=[])+({}<=())+([]!={}))+(([]<=[])+(()<=())+([]>{}))+(([]>())+(()>[])))<<(((()>=())+({}<={}))))-((({}!=[])+(()<=[])))))<<((([]<=[])+(()>{}))))),(((((((({}<={})+([]<=[])+({}>={}))+(([]==[])+({}<=[])+([]<=[]))+(([]<=())+([]<[])))<<((([]!={})+([]<()))))-((([]<())+([]>[])))))<<(((()!={})+(()==()))))),((((((((((()==())+(()!=[])+(()>[])))<<((({}>={})+(()>[]))))+((({}=={})-(()==[])))))))<<(((()>=())+([]>=[])+([]>=[]))))+(((()>=[])-(()<{})))),((((({}<=[])+(()>=())+([]>={}))+((()!={})+(()>=())+(()>=()))+((()<=())+(()>())))<<((([]<=())+([]!={}))+(([]!={})+(()==()))))-((({}>())+([]>=[])))),((((([]>{})+({}=={})+({}<()))+(([]>=[])+([]!={})+([]>={}))+(([]<=[])+({}<{})))<<(((()!=[])+({}>={}))+((()>=())+(()>[]))))-(((([]>[])+({}<[])))<<(((()<=())*({}!=())))))])), (lambda 
_:_(map(chr,[(((((()>=[])+({}=={})+([]>=[]))+((()!=[])+([]<())+([]==[]))+(([]!={})*([]>{})))<<(((()>=[])+(()>=()))+(({}<=[])+({}=={}))))+((({}!=())+({}!=())+(()!={})))),(((((((({}<=[])+([]<=())+({}!=[])))<<(((()>[])+({}<=[])+([]<()))))+(((()>{})+([]!=[])))))<<((({}!=[])+([]==[]))))+(((()>=())-([]>[])))),((((([]<=[])+(()!={})+([]!=()))+(([]<())+([]<=())+({}<()))+(({}<={})*([]<=[])))<<((([]>{})+(()>={}))+(([]<())+(()>=[]))))),((((([]<=())+([]!={})+([]<()))+((()>=[])+([]<())+([]<()))+((()>=[])+([]!=[])))<<((({}<[])+([]<=[]))+(([]>=[])+([]<=[]))))+(((([]!=())+(()=={})))<<(((()>[])+([]<=()))))),(((((((((([]<=[])+([]!={})+(()!=[])))<<((([]>=[])+({}<={}))))+((({}>=[])+({}!=())))))))<<(((()>[])+([]!={})+(()>={}))))+((([]<())+(()<=[])))),((((((((()!={})+({}=={})+([]>={}))+((()<=())+([]>={})+(()!=[]))+(({}>=[])+({}<[])))<<((([]!={})+(()<=()))))-((([]<=[])-({}<{})))))<<((([]>={})+([]>=[]))))),(((((((([]<())+({}<[])+({}=={}))+((()!=[])+(()>=())+({}=={}))+(({}<())*([]>=[])))<<((({}=={})+(()==()))))-((({}<=())*([]!={})))))<<((([]>=[])+(()>=[]))))),(((((((((({}=={})+(()>={})+({}>={})))<<((([]<=[])+([]>{}))))+((([]!=())+(()=={})))))))<<(((()!=[])+({}!=())+(()<=()))))+((([]>=[])*(()>=())))),(((((()<=())+(()!={})+([]<=()))+((()>=[])+(()==())+(()==()))+(([]>={})+([]>[])))<<((([]<=())+([]==[]))+(([]>{})+([]!={}))))-((({}<=[])*([]<=[])))),((((([]>=[])+({}<=[])+([]!=()))+((()>=[])+({}=={})+([]>{}))+(([]==())+({}<=())))<<((({}<=[])+({}!=[]))+((()>[])+({}!=[]))))-(((({}<={})*([]!=())))<<((({}=={})*(()!=[])))))])))(((lambda:((()!={})*([]==()))).func_code.co_lnotab).join), 
((lambda:(({}=={})*([]<={}))).func_code.co_lnotab).join(map(chr,[((((({}<[])+({}<={})+([]<()))+((()<=())+({}<={})+({}<=[]))+((()>[])-(()=={})))<<(((()<=())+(()>[]))+(({}!=())+({}<={}))))-((([]!=[])+([]<=[])))),((((((((()>={})+(()>{})+([]==[])))<<(((()<=())+({}<[])+([]<=()))))+(((()==[])+([]>=[])))))<<((({}!=())+(()!=[]))))-((({}>=())+([]<=[])))),((((({}<())+({}!=())+(()!={}))+((()!={})+({}<=())+(()!=[]))+(({}<())+(()<=[])))<<((([]==[])+(()>{}))+(([]<=())+({}<={}))))+((((()>{})*({}<())))<<((([]<())+(()!=[]))))),(((((((((({}!=[])+([]>={})+({}>={})))<<((({}<=())+([]<=()))))+(((()>{})-({}>())))))))<<(((()>{})+([]<=[])+({}<={}))))+(((()==())*({}=={})))),((((((((()>{})+({}>={})+([]>{}))+((()>[])+({}!=())+([]<()))+(([]==())+([]==[])))<<((([]<=())+({}<()))))-((({}>[])+({}!=[])))))<<(((()==())+(()!=[]))))),(((((((([]<=())+([]<())+({}<()))+((()<=())+(()!={})+([]<=[]))+(([]==())+({}!=())))<<(((()>[])+([]==[]))))-(((()>=[])*(()>=[])))))<<((({}<=[])+([]!=()))))),(((((((((({}<())+({}<())+(()>[])))<<((({}<[])+({}<={}))))+(((()<=[])+([]<=())))))))<<((([]>={})+(()<=())+(()>={}))))+((([]<=())-({}==[])))),((((({}<=())+(()!=[])+({}!=[]))+((()>=())+(()<=())+({}<={}))+((()!=[])*([]<=())))<<(((()!={})+([]!={}))+(([]>=[])+([]!={}))))-((({}>=[])+({}=={})))),((((([]<=())+([]>{})+([]!=()))+(([]<=())+(()!={})+([]<=()))+(({}!=[])+(()>())))<<(((()<=())+({}!=()))+((()==())+(()>=()))))-(((({}!=[])+([]>())))<<((({}<={})+([]!=[])))))])),\
((lambda:((()=={})+([]>=()))).func_code.co_lnotab).join(map(chr,[((((([]==[])+([]==[])+([]<=[]))+(([]>=[])+({}<=())+({}<[]))+((()!={})-([]==())))<<((({}!=())+([]==[]))+(([]>{})+(()>[]))))-((((()>={})+([]=={})))<<(((()<())+([]!={}))))),((((({}<=())+({}!=())+({}<={}))+(({}<())+([]>={})+([]<=[]))+((()>=[])-({}!={})))<<(((()<=())+({}<()))+(({}=={})+({}!=[]))))-(((()>=[])-(()<{})))),((((([]<=())+([]<())+(()!={}))+(({}!=())+({}<=())+([]>=[]))+((()!=[])+({}>=[])))<<((([]<=())+(()>=[]))+((()>={})+(()<=()))))-(((([]!=[])+({}<=[])))<<((({}!={})+({}=={}))))),(((((((((([]<=[])+([]>=[])+(()==())))<<(((()!={})+({}<=()))))+(((()>[])+(()!=())))))))<<(((()!=[])+(()>{})+({}<={}))))+(((()>())+({}!=())))),((((((((()<=())+({}<={})+([]<=[]))+(({}<={})+({}<=[])+([]>{}))+((()<{})+([]<())))<<((({}>={})+({}<()))))-(((()>{})+([]!=[])))))<<((([]!=())+({}=={}))))),(((((((({}<=[])+(()!=[])+(()>={}))+((()>[])+([]>=[])+(()!={}))+(({}<={})-([]!=[])))<<(((()==())+(()==()))))-((([]==())+([]<())))))<<((([]<=())+({}=={}))))),(((((((((({}<=())+({}<[])+(()<=())))<<((({}<())+([]<=[]))))+(((()>={})+({}>=[])))))))<<((([]<=())+([]!=())+({}<=[]))))+((([]<())-([]>())))),(((((()!=[])+({}>={})+({}!=()))+(({}<=[])+({}<=())+(()>{}))+((()>=[])*({}<={})))<<(((()>={})+({}=={}))+((()!={})+({}>={}))))-((({}>())+(()>{})))),((((({}!=[])+([]<=[])+(()>=()))+(([]>=[])+([]>{})+([]!={}))+(([]>{})*([]!=())))<<(((()==())+(()<=()))+(([]!={})+({}=={}))))-(((([]==[])-({}==())))<<((([]>[])+(()!={})))))])), 
((lambda:(([]==[])*([]=={}))).func_code.co_lnotab).join(map(chr,[(((((((({}<[])+({}!=())+(()>={})))<<((({}<=())+(()>{})+({}<={}))))+((({}<[])-([]<={})))))<<((({}<())+({}<=[]))))),(((((((({}>={})+(()>={})+({}<=[])))<<((({}<())+([]==[])+(()>=()))))+((([]!=())-([]>())))))<<((({}<[])+(()>[]))))+((({}<={})-(()<=[])))),(((((((({}!=())+([]>={})+(()!={})))<<((([]>={})+({}<=[])+(()==()))))+((([]<=())-(()!=())))))<<(((()>{})+(()<=()))))-((({}!=[])*([]>=[])))),(((((((((([]>{})+([]>=[])+({}<[])))<<((({}>={})+(()>[]))))+((({}>[])+(()>={})))))))<<((([]<=[])+([]<=[])+([]!={}))))+((([]<{})+([]>=[])))),((((((((()>=())+(()>{})+([]>={}))+(({}!=())+(()>=())+([]<()))+(({}>={})-(()!=())))<<(((()<=())+({}<[]))))-(((()!={})*(()>{})))))<<((([]!={})+({}!=[]))))),(((((((([]>={})+([]>{})+({}<[]))+(({}!=())+({}=={})+({}<={}))+(({}<={})*(()>{})))<<((({}<=())+({}<()))))-((([]>{})+({}>{})))))<<(((()>={})+(()<=()))))),(((((((((({}!=[])+({}!=[])+([]==[])))<<(((()>=[])+(()>={}))))+((([]>{})*(()>=())))))))<<(((()!={})+([]<())+({}=={}))))+((([]!={})+(()<{})))),((((({}=={})+({}=={})+([]==[]))+(({}<[])+([]>={})+([]>{}))+((()<={})+(()!={})))<<((({}<[])+([]<()))+(({}>={})+([]!=()))))-(((()!=[])+(()>())))),(((((()>{})+([]==[])+([]>=[]))+(([]!={})+([]!=())+({}!=[]))+(({}>())+(()!=[])))<<(((()>={})+({}<=()))+(([]!={})+([]>={}))))-((((()!={})+([]!=[])))<<((([]<=())*([]<())))))])), (lambda 
_:_(map(chr,[(((((((([]<[])+({}<={})))<<((({}<={})+([]!={}))+((()>{})+([]==[]))))-(((()=={})+([]==[])))))<<((({}<[])+([]>{})+([]>={}))))-((([]>={})+({}<=[])+(()<=())))),(((((()>=())+({}<())+(()>=[]))+((()<=())+({}<())+(()>{}))+((()<=())+({}>())))<<((([]<=())+([]>=[]))+(([]<())+({}>={}))))-(((([]>={})-(()>())))<<((([]<[])+([]>={}))))),(((((((({}!=())+([]<=())+([]>=[])))<<((([]!=())+({}<())+({}<()))))+(((()==())*([]<=[])))))<<(((()>{})+(()>=[]))))),(((((((({}<())+([]==[])+(()>[])))<<(((()>=())+(()==())+([]<=[]))))+(((()<())+(()!={})))))<<((([]!={})+([]>=[]))))+(((()<{})+(()>{})))),((((((((()>[])+({}<=())+(()==())))<<((({}<={})+(()>{})+(()<=()))))+((([]!={})*([]!=())))))<<((({}!=())+([]!=()))))-((([]==())+(()!=[])))),((((((((((()!=[])+({}=={})+([]==[])))<<(((()>=())+({}<=[]))))+(((()>=())*([]>{})))))))<<((({}=={})+(()!={})+(()<=()))))+((({}<=[])+([]!=[])))),(((((((({}=={})+(()>={})+([]>={}))+(([]==[])+([]>=[])+([]<()))+((()<[])+([]!=())))<<((([]<=[])+(()!={}))))-(((()!=[])*({}!=[])))))<<(((()>=[])+([]!={}))))),(((((((([]>=[])+({}=={})+([]<=[]))+(([]<())+({}<=())+(()>={}))+(({}!=[])+(()<[])))<<(((()>=[])+(()>={}))))-((([]==[])*([]!={})))))<<((({}>={})+({}<=()))))),(((((((((([]!=())+({}<=())+(()<=())))<<(((()>{})+({}=={}))))+((([]>=())+([]>={})))))))<<(((()!={})+({}!=[])+([]<()))))+(((()>={})*([]>{})))),(((((()<=())+(()!=[])+(()>[]))+((()==())+([]<())+({}!=()))+((()<=())+({}>{})))<<((({}=={})+(()==()))+(({}<[])+([]<=[]))))-((({}>={})*([]>=[])))),((((({}<())+(()>={})+([]>{}))+((()>=[])+({}!=[])+({}<[]))+(({}<[])+({}>=())))<<(((()!={})+(()==()))+(([]<())+({}<[]))))-(((([]==[])+([]<[])))<<((({}>=())+({}!=())))))])))(((lambda:((()<{})-([]>()))).func_code.co_lnotab).join), 
((lambda:((()==[])+({}>=()))).func_code.co_lnotab).join(map(chr,[(((((((({}<[])+({}!=[])+([]<=())))<<((([]<=[])+({}!=[])+([]<()))))+((([]!=())+({}>=[])))))<<(((()>[])+(()!=[]))))),(((((((({}!={})+({}<={})))<<((([]==[])+(()>[]))+((()<=())+([]==[]))))-((([]==())+([]<=[])))))<<(((()>=[])+(()>[])+(()==()))))-(((()>={})+([]<=[])+(()>=())))),((((([]!=())+(()!={})+(()!={}))+(([]>={})+(()<=())+({}<()))+(({}<=())*([]>=[])))<<(((()==())+(()>{}))+(([]!={})+({}<()))))-(((()<=())*(()!=[])))),(((((((({}>={})+({}!=())+({}!=[])))<<((([]<())+({}=={})+({}<=()))))+((([]==[])*([]<=[])))))<<((({}<())+(()!=[]))))),(((((((([]!={})+({}<=[])+({}!=())))<<(((()!={})+([]<=())+({}!=()))))+(((()=={})+(()!=[])))))<<((({}<=())+({}>={}))))+((([]<=[])*({}=={})))),(((((((({}<())+([]!=())+({}<[])))<<((([]<())+([]<=())+({}<=[]))))+((({}>=())+(()>{})))))<<((({}!=())+({}!=[]))))-((({}<())+([]>=())))),(((((((((([]>=[])+(()==())+([]>={})))<<(((()>={})+(()>[]))))+(((()>[])+(()<[])))))))<<(((()>[])+({}<={})+({}<()))))+(((()!=())+([]<())))),(((((((({}!=())+([]!=())+([]!={}))+(([]>=[])+([]<())+([]>=[]))+(({}<())+({}>())))<<(((()>=())+([]>={}))))-(((()>{})+(()=={})))))<<(((()!={})+(()==()))))),(((((((([]!=())+([]<=[])+({}>={}))+((()>[])+(()>=())+([]<=[]))+(({}=={})+(()>())))<<((({}<=[])+({}<()))))-((([]==[])+({}==[])))))<<((({}<=())+([]<=[]))))),((((((((((()!=[])+(()!=[])+([]<=())))<<((([]<=())+({}<=[]))))+((({}>[])+([]>{})))))))<<((([]>{})+(()==())+(()<=()))))+(((()==())+({}>=[])))),(((((()>=[])+(()==())+(()>={}))+(([]>=[])+([]!={})+([]<=()))+(({}==[])+(()>={})))<<(((()>={})+([]<=[]))+(([]<=[])+(()!=[]))))-((({}>={})-([]>=())))),((((({}=={})+(()>=[])+(()<=()))+((()>{})+(()>=())+([]>={}))+(([]!={})*([]<=[])))<<((([]==[])+([]<=()))+(([]!={})+({}=={}))))-((((()>{})*(()>{})))<<(((()>={})-({}>{})))))])),\
str(bytearray((((((([]<())+({}!=[])+([]>=[]))+((()>{})+(()>{})+([]<()))+(({}<=())-([]<{})))<<((([]!=())+(()>={}))+(([]<())+([]>=[]))))+(((([]>={})-({}>())))<<((([]<=())+({}<={}))))),(((((()>=())+([]!={})+({}<={}))+(({}<=())+([]!=())+([]!=()))+(([]>=[])+(()<{})))<<(((()>=())+({}!=[]))+(([]==[])+({}<=[]))))+((((()>{})*(()>=[])))<<((([]>=[])*({}!=[]))))),((((((((()!=[])+([]>{})+([]<=())))<<((([]<())+({}>={})+([]<()))))+((({}<=[])+([]!=[])))))<<((([]!=())+({}<=()))))+((({}<=())*({}<[])))),((((((((()!={})+({}<())+([]>=[])))<<((({}<[])+({}=={})+(()==()))))+(((()<=())*([]<=[])))))<<(((()>={})+([]>{}))))),(((((((({}<())+(()>={})+([]==[])))<<(((()>[])+(()!={})+(()==()))))+((({}<=())*([]>=[])))))<<((({}<())+([]<()))))+(((()!=())+([]!=())))),(((((((({}<={})+(()>{})+(()!=[])))<<((({}<=[])+({}=={})+(()<=()))))+(((()!=())+({}<=[])))))<<(((()==())+(()>=[]))))-((([]<())+([]>[])))),(((((((((([]<())+(()>={})+(()==())))<<(((()<=())+([]>{}))))+((({}=={})+({}<{})))))))<<((([]>={})+({}>={})+(()!={}))))+(((()>[])*(()!={})))),((((((((()!=[])+([]!={})+({}>={}))+(({}!=[])+(()!=[])+(()>{}))+(([]!={})-({}>())))<<((([]>{})+([]!=()))))-((([]!={})+({}>{})))))<<(((()<=())+({}<={}))))),(((((((({}<[])+(()<=())+({}<[]))+(([]>=[])+([]<=())+(()<=()))+((()<=())*({}<=[])))<<((([]>=[])+(()>={}))))-(((()>=())*(()>{})))))<<((([]==[])+(()>[]))))),(((((((((([]<())+([]<=[])+([]>=[])))<<(((()!={})+(()<=()))))+((({}<())*(()<=())))))))<<(((()>={})+([]<())+(()!=[]))))+((([]<={})+({}<={})))),(((((()<=())+({}!=[])+({}<=()))+(([]>=[])+([]>=[])+([]==[]))+((()!={})+(()!=())))<<(((()==())+([]!=()))+((()==())+([]==[]))))-((([]=={})+({}<={})))),((((([]!={})+(()!=[])+(()>{}))+(({}<[])+({}<())+(()>=[]))+(({}<())-([]!=[])))<<((([]<())+({}!=[]))+(({}<[])+([]!=()))))-(((([]>={})-(()<())))<<(((()>=[])-([]==()))))),))), 
str(bytearray((((((([]<=())+({}<=())+(()!={}))+(({}<())+([]!={})+([]<=[]))+((()>=[])-({}<{})))<<(((()>[])+(()>{}))+((()>={})+({}<=()))))+((([]>{})*({}=={})))),(((((((([]>=[])+(()=={})))<<(((()!={})+({}!=[]))+((()>=[])+([]<=[]))))-(((()!=[])-([]=={})))))<<((([]==[])+(()>[])+({}<[]))))-((([]<())+({}>={})+({}<[])))),((((([]==[])+([]<=())+({}!=[])))<<(((()>{})+([]<=[]))+((()!=[])+([]<=()))+(({}=={})*([]<=[]))))+((({}<=())-(()<{})))),((((({}<[])+(()!=[])+(()!={}))+((()<=())+([]<=[])+({}<[]))+((()>=())*([]>={})))<<((({}!=())+(()>={}))+(([]<=[])+(()!={}))))+((((()<())+({}>={})))<<((([]>{})+(()!={}))))),((((({}!=())+(()>={})+({}<=()))+(({}>={})+([]>={})+(()>[]))+(({}<=())+(()<=[])))<<((([]==[])+(()>={}))+((()>{})+({}=={}))))+((((()<={})+([]<=[])))<<((({}<=())+(()>=[]))))),((((((((()>[])-(()>())))<<((([]!={})+([]==[]))+((()>={})+(()==()))))-((([]==[])*(()>=())))))<<((({}!=[])+({}<[])+(()==()))))-((({}=={})+(()>={})+(()>={})))),((((([]<=())+(()>[])+([]<=()))+((()==())+(()>={})+(()>={}))+(({}!={})+({}<={})))<<(((()>={})+([]>=[]))+(([]<=())+([]==[]))))-((([]>())+(()!={})))),(((((()>={})+(()<=())+(()>[]))+((()!={})+(()!={})+({}<()))+(([]>={})+(()<={})))<<((({}<=[])+(()<=()))+((()!={})+([]<=[]))))+((((()==())*({}<())))<<(((()>=())+([]=={}))))),(((((((({}<={})+(()>{})+({}<={})))<<((([]>=[])+([]>={})+([]>={}))))+((({}<=())-([]>[])))))<<((({}<=[])+([]<=()))))),(((((((([]!=())+([]<=())+({}=={})))<<((([]<=[])+(()>=[])+([]!={}))))+(((()>={})-([]=={})))))<<((([]>={})+([]!={}))))+((([]<())-([]=={})))),(((((((({}<())+(()<=())+(()>=[])))<<((({}=={})+({}<[])+({}<=()))))+(((()>[])-(()<[])))))<<((([]<=[])+(()==()))))-((([]>{})-(()>())))),(((((((((([]<())+(()>=[])+([]<=[])))<<(((()!=[])+([]==[]))))+((([]>=())+([]>={})))))))<<((([]<())+({}<[])+({}!=[]))))+((([]<=[])*([]!={})))),(((((((([]==[])+(()>{})+({}<=[]))+((()>=())+([]>={})+({}<=()))+(([]<=[])-(()>())))<<((({}<=[])+({}!=()))))-((([]>[])+(()>{})))))<<((({}!=[])+(()>=()))))),((((((((()!=[])+(()>={})+({}<()))+(({}=={})+({}!=())+(()>[]))+(({}
<={})-(()<{})))<<(((()>{})+([]>=[]))))-((({}<={})+(()!=())))))<<((([]!={})+({}<={}))))),((((((((((()==())+(()!=[])+(()>=())))<<((({}<=())+({}<={}))))+((([]!={})+({}!={})))))))<<(((()!={})+([]!={})+([]>={}))))+((({}!={})+(()>{})))),((((([]!={})+(()!=[])+([]!={}))+((()!={})+({}<())+({}>={}))+(([]!=())*({}>={})))<<((([]!=())+(()>={}))+(([]==[])+(()!=[]))))-(((()>[])-([]<={})))),((((({}<=[])+(()<=())+({}<()))+(([]<=())+({}<={})+(()>[]))+((()>=[])*(()==())))<<((([]>{})+(()>={}))+(([]<=())+([]>=[]))))-(((({}<=())*([]!=())))<<((([]<={})+([]!=()))))),))), (lambda _:_(map(chr,[((((({}=={})+({}<[])+([]<=[]))+(({}!=())+(()>[])+([]>{}))+(({}!=[])-(()!=())))<<((({}<())+([]>{}))+(([]<=())+(()==()))))+((([]<())+({}<())+([]>{})))),((((((((()>{})+(()>=())+({}=={})))<<((([]<=())+(()!=[])+({}!=()))))+(((()>[])-({}<{})))))<<(((()<=())+({}!=[]))))+((([]<=())*(()>=())))),(((((((([]>{})+({}>())))<<((({}<=[])+([]>{}))+((()>[])+(()==()))))-((([]<())*(()<=())))))<<((([]<=())+(()>=[])+({}>={}))))),(((((((([]<())+({}!=())+(()>{})))<<((({}!=[])+(()>[])+(()!={}))))+((([]<=[])*({}<())))))<<((([]==[])+(()>{}))))),(((((((([]>{})+({}!=[])+([]>{})))<<((({}<=[])+({}<[])+([]<()))))+((([]>{})*({}!=())))))<<((({}>={})+(()>={}))))+((({}=={})*([]<=())))),(((((((({}<=())+({}<=())+([]>{})))<<((({}>={})+([]<=[])+({}>={}))))+(((()>[])+(()=={})))))<<((({}>={})+(()>={}))))-((([]>={})-([]>=())))),(((((((((([]>{})+({}<())+([]>={})))<<((({}<())+({}>={}))))+(((()>={})*({}!=[])))))))<<((({}<())+({}<=[])+([]!=()))))+((({}==[])+([]!=())))),(((((((([]>=[])+(()==())+({}<={}))+(({}<=())+({}<=[])+([]<=()))+(({}==())+(()>{})))<<((({}!=())+(()>=()))))-(((()!={})-(()!=())))))<<(((()>=[])+([]<=[]))))),(((((((([]<=())+([]>{})+(()!={}))+((()>=[])+([]>{})+([]>={}))+(([]<())-([]<={})))<<(((()>=())+({}!=[]))))-(((()>={})+([]<[])))))<<((([]>={})+(()!={}))))),((((((((((()>{})+({}>={})+([]<=[])))<<((({}<=[])+({}<=[]))))+((([]>=())+(()>[])))))))<<((([]<=())+([]==[])+([]<()))))+((({}<=())-([]<={})))),((((([]<=[])+({}!=[])+(()==()))+(([]
>{})+([]>=[])+(()!={}))+(({}<[])-(()<=[])))<<((([]>={})+([]>{}))+(({}<[])+({}=={}))))-((([]>())+([]!=())))),((((([]>={})+(()<=())+({}<[]))+(({}!=())+({}>={})+({}<()))+(([]<=[])+([]<{})))<<(((()>[])+([]==[]))+(([]!=())+(()!={}))))-((((()<())+({}<())))<<(((()<())+([]==[])))))])))(((lambda:(({}<=[])*(()<{}))).func_code.co_lnotab).join),\
((lambda:((()=={})+([]>[]))).func_code.co_lnotab).join(map(chr,[(((((()>=())+({}!=())+(()!=[]))+((()<=())+({}>={})+(()>{}))+((()!=[])+(()<())))<<(((()>{})+([]!={}))+((()>{})+({}<=[]))))+((({}=={})+(()!=[])+([]==[])))),(((((((({}!=())+([]>=[])+({}<={})))<<((([]<=())+([]!=())+({}<()))))+(((()>={})*([]>{})))))<<((([]<=())+(()>=[]))))+(((()>={})+(()<=[])))),((((([]>{})+({}>={})+(()==()))+(({}<=[])+([]!={})+(()>={}))+(([]=={})+({}<())))<<(((()<=())+({}<={}))+(({}!=())+(()<=()))))),((((([]<=())+([]==[])+({}>={}))+((()==())+([]<=())+([]>=[]))+(([]>=[])-(()<={})))<<(((()>{})+({}>={}))+((()>=[])+([]!={}))))+((((()!=[])*({}<=[])))<<((([]<=[])+(()>={}))))),((((((((()==())+({}!=[])+(()>{})))<<((({}>={})+({}>={})+([]<=[]))))+((([]>={})+([]<={})))))<<(((()==())+([]==[]))))+((([]>={})+(()==[])))),((((({}<=[])+({}<[])+([]>=[]))+(([]<=())+({}<=[])+(()!=[]))+(([]>=())+([]>{})))<<(((()>[])+(()<=()))+(([]>={})+({}<=[]))))-((((()==[])+([]<=[])))<<((({}=={})-(()<[]))))),(((((((([]<=())+(()>{})+([]>=[])))<<(((()>[])+({}<[])+({}<={}))))+(((()<=())+([]>=())))))<<(((()==())+([]<=()))))),(((((((({}=={})+({}<=[])+([]>=[])))<<((([]<=())+(()!=[])+([]>=[]))))+(((()!=())+([]!={})))))<<(((()>{})+({}>={}))))+(((()>={})*(()==())))),(((((((({}<())+(()<=())+({}<=())))<<((([]<())+([]<=[])+([]<=()))))+(((()<=())*([]>={})))))<<((([]>={})+(()>[]))))-((([]!=())-({}==[])))),(((((((((({}>={})+(()>=())+([]>=[])))<<((([]>=[])+({}<={}))))+((([]>={})+([]>())))))))<<(((()==())+({}<[])+({}<[]))))+((([]>=[])*([]>={})))),(((((((([]<=())+({}<=[])+([]<()))+((()>={})+([]!=())+([]<=()))+(({}!=[])*([]<())))<<(((()>=[])+({}!=[]))))-((({}!=[])-([]!=[])))))<<((([]<())+([]>={}))))),(((((((({}=={})+({}>={})+(()>{}))+(([]>={})+({}<={})+({}>={}))+(([]==[])*({}!=[])))<<((([]<=[])+(()>{}))))-(((()>=[])*(()<=())))))<<(((()!=[])+(()>[]))))),((((((((((()>=[])+({}<[])+({}=={})))<<(((()>=())+(()!={}))))+((({}<{})+(()>={})))))))<<((([]>{})+({}>={})+({}<()))))+((([]<=())*([]<=[])))),((((([]!=())+(()!=[])+([]>=[]))+(([]>{})+({}=={})+([]<=
[]))+(([]!={})-([]>())))<<(((()==())+([]<=()))+(({}!=[])+({}!=[]))))-((({}!=())-(()<={})))),(((((()>=[])+([]!=())+(()>=[]))+(({}<=[])+(()>{})+({}<=[]))+((()==[])+([]<())))<<((([]<())+([]>{}))+(({}<[])+(()!={}))))-(((([]<=())*(()>={})))<<((([]!=())-([]!=[])))))])), (lambda _:_(map(chr,[((((([]>={})+({}<())+([]==[]))+((()!={})+({}<=())+([]>=[]))+((()>={})*([]<=[])))<<((({}<={})+({}>={}))+((()>{})+(()!={}))))-((({}>{})+([]>{})))),(((((((([]>={})+([]!={})+(()>={})))<<((([]<=())+(()>=[])+([]>{}))))+((([]<=[])-({}>{})))))<<((({}!=[])+(()<=()))))-(((()<[])+({}<=[])))),((((([]>=[])+([]>={})+([]==[]))+(([]>={})+({}!=())+({}<=[]))+(({}!=())*(()!=[])))<<((({}<())+([]<=()))+(([]>={})+([]<=()))))+(((({}=={})-({}>[])))<<((([]<())+({}=={}))))),(((((()>={})+([]>=[])+(()>=()))+((()>=())+(()!={})+({}<=[]))+((()!=[])*([]<=())))<<((({}<[])+([]!={}))+((()==())+({}<=[]))))-((({}<=[])-({}>=[])))),(((((((({}>={})+(()==())+([]<=[])))<<((({}=={})+({}<=())+({}<={}))))+((({}<=())-(()<={})))))<<((([]<=[])+(()>=()))))),((((((((()!={})+([]!=())+({}=={})))<<((([]>{})+(()<=())+({}=={}))))+((([]!=())+(()==[])))))<<(((()!=[])+({}!=[]))))+(((()>=())*([]>{})))),(((((((({}<=[])+({}<=[])+([]>={})))<<((([]>=[])+(()>=[])+({}>={}))))+((({}==())+(()>={})))))<<((({}<())+({}>={}))))-((([]<=())-([]==())))),((((((((((()<=())+(()>={})+({}<[])))<<((([]>=[])+(()<=()))))+(((()>{})+(()<=[])))))))<<((({}<())+([]<=())+(()!={}))))+((([]==[])-([]==())))),(((((((({}<=[])+(()!={})+({}=={}))+((()>[])+([]>=[])+({}<[]))+((()<=())*({}<())))<<((([]==[])+({}<[]))))-((([]<())-(()<[])))))<<((({}<())+([]!={}))))),((((((((()!={})+([]<=())+([]<()))+(([]<())+({}<={})+(()<=()))+((()>{})+([]>=())))<<((({}<=[])+(()==()))))-((({}>[])+(()>[])))))<<((({}<[])+({}<()))))),((((((((((()!={})+([]==[])+(()>{})))<<((([]>={})+(()!={}))))+((([]<={})+({}<())))))))<<(((()!=[])+([]!={})+([]!={}))))+(((()>=[])*(()>=[])))),(((((()<=())+([]>{})+([]!={}))+((()>={})+({}=={})+(()>=[]))+(({}==[])+(()>={})))<<((({}!=())+(()!=[]))+(({}!=())+(()>[]))))-((({}!=()
)-([]<{})))),(((((()>{})+([]!={})+([]<=[]))+(({}>={})+([]!=())+({}<[]))+((()==())+({}>{})))<<((([]>{})+([]<()))+((()>[])+(()>={}))))-(((({}<{})+({}<={})))<<((([]>{})+({}>=())))))])))(((lambda:((()!=())+(()!=()))).func_code.co_lnotab).join), (lambda _:_(map(chr,[((((([]!=())+([]>={})+(()!={}))+((()!={})+(()==())+({}<()))+(({}<{})+({}<[])))<<((([]<())+({}<[]))+(({}<=[])+({}<=()))))-((((()!={})+(()>())))<<((([]<())-(()<=[]))))),((((({}!=())+([]>=[])+({}>={}))+(([]>=[])+([]!={})+([]!=()))+(({}<=[])-({}==[])))<<((({}<={})+([]>=[]))+(([]!=())+([]>={}))))-((([]<[])+({}!=[])))),(((((((([]==[])*({}!=[])))<<((({}<=())+({}>={}))+(({}!=())+([]>={}))))-(((()<=())-({}>=[])))))<<((([]>{})+([]<=())+({}<=()))))-(((([]<[])+(()>=())))<<((({}>[])+({}>={}))))),(((((((({}<={})+({}<())+({}=={})))<<((({}<[])+(()>[])+([]!={}))))+((([]>())+(()>=[])))))<<((({}!=())+({}!=[]))))+((([]!={})+(()<{})))),(((((((([]!=())+({}>={})+([]>{}))+((()>=())+({}=={})+([]!={}))+((()==[])+({}<[])))<<((({}<=())+({}<={}))))-((({}==[])+(()>=())))))<<((([]<=())+(()==()))))+(((()>=())*([]<())))),(((((((([]<=())+([]<())+(()<=())))<<((({}>={})+(()>=[])+(()!={}))))+((({}=={})-(()<={})))))<<((([]<())+({}!=[]))))),((((((((()<=())+(()>={})+({}!=())))<<((([]<())+([]>={})+([]>=[]))))+((([]>{})*(()>[])))))<<((({}<={})+(()>={}))))+((([]==[])+([]<[])))),((((((((()>=())+(()>={})+([]<=())))<<((({}=={})+(()>[])+({}>={}))))+(((()>={})-({}>{})))))<<((({}!=[])+([]<=()))))-((([]!=())*([]<())))),(((((((((({}<[])+({}!=[])+([]>={})))<<(((()<=())+({}!=()))))+((({}>={})-(()!=())))))))<<(((()>[])+({}!=[])+(()>{}))))+((([]>[])+([]==[])))),(((((((([]!={})+(()>=())+([]!=()))+((()!={})+(()!={})+(()<=()))+(([]>[])+(()<=())))<<((([]>{})+({}=={}))))-((({}>={})+(()!=())))))<<((([]!={})+([]!={}))))),(((((((({}!=[])+(()<=())+({}=={}))+((()>={})+(()>=[])+({}<[]))+(({}!=())-([]<{})))<<((({}<={})+(()>=[]))))-((([]==[])*({}<=())))))<<(((()!={})+({}!=[]))))),(((((((((([]==[])+(()>[])+([]==[])))<<((({}!=[])+([]<=()))))+((({}<[])-(()>())))))))<<((([]!=())+
(()>[])+([]==[]))))+(((()==())-([]>=())))),((((([]==[])+({}=={})+([]!=()))+((()>={})+([]==[])+(()==()))+(([]>())+([]>={})))<<((({}=={})+([]<=[]))+(({}<[])+([]<=()))))-((([]>{})+({}>=[])))),((((([]==[])+(()!={})+(()>{}))+(({}!=())+([]<=())+({}<={}))+((()!=[])+(()!=())))<<((([]!=())+([]<()))+(([]!={})+([]>{}))))-(((({}<={})-([]<[])))<<((({}<{})+([]<())))))])))(((lambda:((()<{})*(()!={}))).func_code.co_lnotab).join),\
(lambda _:_(map(chr,[(((((((([]<{})+([]!=())))<<((([]<=[])+({}=={}))+(([]==[])+([]==[]))))-((({}==())+({}<=[])))))<<((([]>=[])+({}<[])+([]>={}))))-(((({}!={})+({}<={})))<<((({}<{})+([]>={}))))),((((((((((()>={})+(()>{})+({}<=())))<<((([]>=[])+(()!=[]))))+((([]>=())+(()>=[])))))))<<((({}!=[])+({}<=[])+(()==()))))+((([]<=[])-({}!={})))),(((((((((({}<[])+({}<())+([]>=[])))<<((([]<=())+({}<[]))))+(((()!=[])*(()<=())))))))<<(((()>[])+([]<())+(()>{}))))-(((()>[])+({}!={})))),(((((((((([]<=())+({}<=())+({}!=[])))<<(((()>=())+(()>=[]))))+((({}!=())-({}!={})))))))<<((({}<={})+({}<=[])+(()>[]))))+((({}<[])-({}<{})))),(((((()>=())+([]>={})+(()>=()))+(([]<())+([]!=())+([]!={}))+(({}<=())*(()>[])))<<((([]!=())+([]>={}))+(({}<())+([]!={}))))-((((()<={})+([]>={})))<<(((()!={})+({}>{}))))),(((((()<=())+(()>={})+([]!={}))+(([]!=())+(()!={})+({}=={}))+((()!={})*(()==())))<<((([]>={})+(()==()))+((()>[])+([]>=[]))))+(((({}<=[])*(()!={})))<<((({}<={})+([]==[]))))),((((((((((()<=())+({}<={})+(()<=())))<<((([]<())+([]<=()))))+((([]<[])+([]>{})))))))<<((({}>={})+(()>=[])+({}<()))))+(((()!={})*([]>={})))),((((((((()!=[])+(()>{})+(()<=()))+(({}<[])+(()>={})+([]>{}))+(({}<=[])*({}=={})))<<(((()>[])+({}<()))))-((([]>{})+(()<())))))<<((({}<[])+([]<=()))))),(((((((([]==[])+([]<())+({}<[]))+(([]!={})+({}<())+(()==()))+(([]>={})*(()!={})))<<(((()>=[])+({}<[]))))-((({}!=[])*([]>=[])))))<<((({}<())+({}<()))))),((((((((((()>={})+([]!={})+({}=={})))<<(((()>{})+(()==()))))+(((()>[])+([]<={})))))))<<(((()>=[])+({}=={})+({}!=[]))))+((({}<=())*([]>{})))),((((([]<=())+({}<={})+([]>={}))+(({}<[])+({}>={})+({}!=[]))+((()!=[])-([]<[])))<<((([]<=())+({}<=[]))+(({}!=())+([]!=()))))-((([]==[])*(()>[])))),((((({}!=())+({}<=())+([]!={}))+(({}<=())+([]>=[])+(()!=[]))+(([]==())+(()>[])))<<((({}!=[])+([]!={}))+(([]>=[])+({}=={}))))-(((({}!=())*({}<())))<<((([]<())*(()>=())))))])))(((lambda:(({}>={})*(()<[]))).func_code.co_lnotab).join)]
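The expressions above are the output of a source obfuscator: every integer literal is rebuilt from comparisons between empty containers, which evaluate to `True`/`False` and behave as `1`/`0` under arithmetic, then shifted with `<<` until they hit an exact character code. A minimal sketch of that decoding arithmetic (using only `==` comparisons, which work in both Python 2 and 3; the file also leans on cross-type orderings like `{}<[]`, which are Python 2 only):

```python
# Every literal in the obfuscated source is assembled from comparisons
# between empty containers; Python evaluates them to True/False, which
# act as 1/0 under arithmetic.
true = (() == ())    # True  -> behaves as 1
false = ({} == [])   # False -> behaves as 0

# Small sums of booleans are shifted into position to reach an exact value:
# (1 + 1 + 1) << (1 + 1 + 1 + 1 + 1 + 0)  ==  3 << 5  ==  96
n = (true + true + true) << (true + true + true + true + true + false)

# 96 + (1 << 3) + 1 == 105, the code point of 'i'
code = n + (true << (true + true + true)) + true
print(chr(code))  # prints: i
```

Each bracketed monster in the list above is just many layers of this same pattern nested inside one another.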
words = []
if ((((({}>={})-([]==())))<<((([]!=())+([]>{})+([]>{}))+(([]==[])+(()>{})+({}!=()))))+(((({}!=())-(()>())))<<((({}<())-([]<[])))))-(((((()==())-({}>())))<<((({}!=[])+([]!=())+([]!=()))+(([]!={})+([]!={})+(()>{}))))+((((()!=())+([]>{})))<<((([]>[])+({}<()))))):
print (((((()<=())*([]!=())))<<((([]==[])+({}<=())+([]<=()))+(([]!=())+({}=={})+([]>{}))))+((([]<[])+({}>={}))))==(((((((((({}<=[])+(()>=())+({}!=())))<<(((()!={})+(()==()))))+((([]>=())+(()!=[])))))))<<((([]<())+({}<={}))))+((([]==[])*(()<=()))))
print ((lambda:(([]<{})*([]=={}))).func_code.co_lnotab).join(map(chr,[(((((()>={})+({}>={}))+(({}!=[])+({}<[]))+(([]>={})-({}>{})))<<((([]<=())+(()>={}))+(([]>{})+({}!=[]))))),((((({}<=())+(()>={})+([]<()))+(([]>{})+({}=={})+({}!=[]))+((()==())*(()>[])))<<((({}<())+([]<()))+(({}!=[])+(()==()))))+(((({}<=())-(()!=())))<<(((()>{})+([]<=()))))),(((((((({}<=[])*({}=={})))<<((([]!=())+(()>={}))+(({}<={})+({}>={}))))-(((()<())+(()==())))))<<((([]>=[])+([]!=())+([]>{}))))+(((([]>{})-(()<={})))<<((([]<=())-([]!=[]))))),(((((((([]==[])+(()>{}))+(([]!={})+({}<()))+((()>={})+([]!=[])))<<(((()>=())+(()>=()))))-(((()!=[])-(()=={})))))<<(((()<=())+(()>=()))))),(((((((((((({}!=())+({}==())))<<(((()>=())+(()>=[])+([]!={}))))+(((()<=())-([]=={})))))))))<<(((()>{})+([]>=[])+([]<()))))),((((([]>{})+(()>[]))+((()<=())+({}>={}))+(({}=={})+({}==[])))<<(((()!={})+({}>={}))+(({}<=())+(()==()))))-(((()!={})+(()=={}))))]))
if ((((([]>={})-([]!=[])))<<((([]>=[])+([]>={})+({}<=()))+(({}!=[])+(()>={})+([]<=[]))))-(((()<=())*({}<=()))))-((((({}>={})*([]!=())))<<((({}>={})+(()!={})+(()<=()))+(([]<=[])+([]<=())+({}<()))))-(((()>{})+(()<[])))):
print (((((((((((()>[])*(()>[])))<<((({}>={})+({}<[])+(()!=[]))))+((({}<=())+(()!=()))))))))))!=(((((((([]>={})+(()>[])+([]!={})))<<(((()!={})+({}<={})+(()!={}))))-((({}<[])-(()<{})))))<<((([]!=())+(()>{})))))
if (((((((((((([]<={})+([]<())))<<((([]<=[])+([]>={})+({}<=[]))))+((([]>=[])-({}>=[])))))))))<<(((()>=[])*([]<())))))-(((((((((((({}==())+(()>[])))<<((([]==[])+(()>{})+(()>{}))))+((({}<[])*([]>=[])))))))))<<((([]<())*(()>={}))))):
print (lambda _:_(map(chr,[(((((()>{})+({}<=[])+({}<={}))+(({}=={})+({}<[])+([]<=[]))+(([]>=[])+({}==())))<<((([]!=())+([]==[]))+((()>{})+({}<=()))))-((((()==())*([]!=())))<<(((()<=[])+(()!=[]))))),(((((((({}<=())+(()<())))<<((({}=={})+([]<=()))+((()>{})+(()>=[]))))+((({}<=[])+(()=={})))))<<((({}<())+([]>{}))))-((({}<[])+([]>[])))),(((((((([]>=[])*(()>=[])))<<(((()>{})+([]<()))+(({}!=())+(()>[]))))+((([]<=())-({}==[])))))<<((({}=={})+([]>{}))))),((((([]!={})+(()>=[])+(()>{}))+((()>=[])+([]<=[])+(()==()))+((()>[])+({}>{})))<<((([]<=())+([]!={}))+(([]>=[])+(()==()))))-(((()!={})+([]==())))),((((([]!=())+(()>[])+([]!=()))+(({}<=())+([]>=[])+([]<=[]))+(([]!=())*({}>={})))<<((([]<=())+([]>{}))+(({}!=[])+([]>=[]))))+(((({}<=[])+({}>=())))<<((({}!=[])+(()==()))))),((((([]<())+(()!={}))+(({}<())+(()>{}))+(([]!=[])+({}<={})))<<(((()>=())+([]>={}))+(([]<=[])+([]>={}))))),(((((((([]<=[])+(()<=()))+((()==())+({}<={}))+((()>())+(()<=())))<<(((()!=[])+(()==()))))+((([]<=())*([]<=[])))))<<(((()>={})+([]<=()))))+(((()!={})*({}>={})))),(((((()<=())+([]!={})+(()>=())))<<((({}<=())+({}!=[]))+(({}!=())+([]>={}))+(({}>={})-([]>=()))))+((({}>[])+(()!=[])))),((((([]<=())+([]<=[]))+((()>=())+({}<()))+((()>=[])-([]<{})))<<((([]<=())+([]<()))+(([]<=())+(()!=[]))))-(((({}!=())+(()<())))<<((([]<=())+(()==[]))))),(((((((([]<())+({}<()))+(([]<=())+({}>={}))+(([]>=())+(()>={})))<<((({}=={})+([]>={}))))-(((()==())*([]!={})))))<<(((()>[])+([]!=()))))-((([]==())+(()<=())))),((((((((()=={})+(()==())))<<(((()>{})+([]>=[]))+((()!=[])+([]!=()))))+(((()<={})+(()>={})))))<<((([]<=[])+(()!=[]))))-(((()!=[])*({}=={})))),(((((((((([]>{})+([]>={})+([]>{})))<<((({}<=[])+(()<=()))))+((([]>={})-({}==[])))))))<<((([]>=[])+({}=={})+(()>[]))))+((((()=={})+([]>{})))<<((({}<=[])+([]<={}))))),((((((((((((()!=[])*({}!=())))<<((([]!={})+({}!=())+([]!={}))))+((({}<=[])-(()<())))))))))<<(((()>={})+(()>={})+({}<[]))))+((([]>{})*(()!={})))),((((((((()>{})+([]<=[]))+(([]<=())+(()>=[]))+((()!={})-({}>())))<<(((()<=())+({}!=[]
))))-((({}>[])+({}<=[])))))<<((([]>=[])+({}<=()))))),(((((((((([]>={})+(()!={})+({}<={})))<<((([]<=())+(()==()))))-((({}<={})-(()==[])))))))<<(((()>=[])+({}!=[])+([]<=()))))-(((([]<[])+([]==[])))<<((([]>{})-({}>[])))))])))(((lambda:(({}>())*(()<=()))).func_code.co_lnotab).join)
print (((((((({}<=[])+([]!={}))+((()>=())+(()!={}))+(({}!=[])-(()<[])))<<((([]>{})+([]<=[]))))-((({}<[])-({}==[])))))<<(((()!=[])+(()>=[])))))>=(((((((((((([]>=[])-(()<=[])))<<((({}!=[])+({}<())+([]>=[]))))+((([]<())+(()<[])))))))))<<((({}=={})+([]>=[]))))+((({}<[])*({}<={}))))
if num==((()!=())-({}>[])): words.append((lambda _:_(map(chr,[(((((((([]>={})*([]<=())))<<(((()>=[])+({}!=[]))+(({}>={})+({}<=()))))-((([]==[])*({}<=[])))))<<(((()==())+({}>={})+({}>={}))))+(((({}<={})*({}<())))<<((([]!=())*(()==()))))),((((((((()>=())+([]>{})+({}!=())))<<((([]<=[])+([]>{})+(()>[]))))+((([]<())-({}>[])))))<<((({}=={})+([]==[]))))+((([]>=[])+({}>())))),((((({}<())+([]!=())+({}>={}))+((()>{})+([]<())+([]<=()))+(([]==[])+([]>=())))<<((({}>={})+([]>{}))+(([]!=())+({}=={}))))+(((({}!=[])+({}<{})))<<(((()<[])+([]!={}))))),((((([]==[])+({}<={})+(()>=()))+(([]!=())+([]!={})+(()<=()))+(([]<=())*({}<=[])))<<((([]!={})+([]<=[]))+(([]<=())+([]>=[]))))-((([]>{})+(()<()))))])))(((lambda:((()<[])*({}!=[]))).func_code.co_lnotab).join))
else:
numStr = (lambda _:_(map(chr,[((((((((((((()>={})+(()<{})))<<((([]!={})+(()>[])+(()>={}))))+(((()>=())+([]<={})))))))))<<(((()>[])+(()!={}))))+(((()<())+([]>{})))),((((((((()>{})+({}!=())+(()!=[])))<<(((()>[])+([]<())+({}=={}))))+((([]>=[])+(()!=())))))<<(((()>[])+([]<=[])))))])))(((lambda:((()<=[])*(()>()))).func_code.co_lnotab).join)%num
numStrLen = len(numStr)
if ((((((((()==())-(()!=())))<<(((()<=())+({}!=()))+(({}!=())+(()>=[]))))-(((()>=())*(()>[])))))<<((([]>={})+([]<=())))))-((((((((()>=[])+(()!=())))<<(((()!=[])+(()>{}))+((()>[])+({}!=()))))-((({}<[])*({}>={})))))<<((([]>=[])+(()>=[]))))):
print (((((((((({}=={})+(()>=[])+([]<())))<<((({}<=[])+({}=={}))))-((({}=={})*(()>=())))))))<<(((()>={})+({}!=[])))))<=(((((((([]>=[])+(()>=()))+((()<=())+([]!=()))+((()>={})*(()>{})))<<(((()!=[])+({}!=()))))+((([]<{})+([]>={})))))<<(((()==())+(()<{})))))
if (((((()>={})+({}!=())+(()>{}))+(({}<=())+(()!=[])+(()>{}))+((()==())*(()>{})))<<((([]<=[])+([]<=[]))))+(((()<={})+([]<=[]))))-((((([]>={})+({}!=())+(()>{}))+(({}<=())+({}<[])+({}=={}))+(({}<=[])*(()<=())))<<((([]<=[])+({}!=()))))+((({}>{})+(()<=())))):
print ((((((((((((()<={})+([]>={})))<<((({}!=[])+(()>{})+([]<()))))+((([]>())+({}<={})))))))))<<((({}>={})+([]>{})))))>=((((([]<=())+({}>=[])))<<((({}>={})+([]<())+([]!={})))))
if (((((()>=())+({}<=())+([]!={})))<<((({}!=())-({}>=())))))-(((((()!=[])+([]<=())+(()>=[])))<<((([]==[])-({}>()))))):
print ((lambda:(({}==())-([]<{}))).func_code.co_lnotab).join(map(chr,[(((((((({}<[])+({}=={}))+(({}>={})+([]!={}))+(([]<=[])+({}>{})))<<((([]<())+({}>={}))))-((([]>={})*({}<())))))<<(((()!=[])+({}<[]))))-(((()==())+({}>[])))),(((((()<[])+({}<[])))<<((({}=={})+({}<=[])+([]<=()))+((()>={})+([]<())+({}<=()))))+(((()<=())+({}<{})))),(((((((((([]!=())+(()!={})+({}=={})))<<(((()!={})+(()>[]))))-((([]==[])*({}>={})))))))<<((({}<[])+({}<=())+({}!=()))))+((([]>())+({}!=[])))),(((((((((({}=={})+([]<())+({}!=())))<<((([]>=[])+([]>={}))))-(((()>=())*({}!=[])))))))<<(((()==())+({}=={})+({}<={}))))-(((({}<())-(()!=())))<<((([]>={})-(()<=[]))))),(((((()!=[])+({}>())))<<((({}<[])+([]<=())+([]<()))+(({}<[])+({}<=())+(()>=[]))))+(((()>())+([]<())))),((((((((()<=())+({}==())))<<((([]>={})+({}>={}))+((()>=[])+(()>{}))))-((([]<())+({}>=[])))))<<(((()!={})+({}<={})+({}<()))))-((({}<[])-(()<={})))),(((((((((([]!=())+([]==[])+(()<=())))<<((({}=={})+({}<={}))))+((({}>={})-([]>[])))))))<<(((()<=())+(()!={})+({}<[]))))+((({}<())*([]!=())))),((((({}=={})+({}<=())+([]!=()))+(([]<=[])+({}<=[])+({}<[]))+(({}<=())-([]<{})))<<(((()==())+(()>=[]))+((()>{})+(()>=()))))),((((((((((()!={})+([]!={})+({}!=[])))<<((({}!=())+([]==[]))))-((([]=={})+([]<=())))))))<<((({}<=[])+([]<())+([]==[]))))-((([]>{})-({}>())))),(((((((((([]<=[])+([]<=[])+(()!=[])))<<((([]!=())+([]>=[]))))-((({}>={})-(()<())))))))<<(((()>{})+({}!=[])+({}<={}))))+(((([]<())+(()<())))<<((([]!={})+({}==()))))),((((((((()>={})+(()!=[])+([]>={}))+(({}!=[])+(()==())+(()>=()))+(([]>[])+(()==())))<<((({}!=())+(()>[]))))-((([]<=[])-(()<[])))))<<((({}<=[])+([]>=[]))))+(((()<[])+([]==[])))),((((((((((((()!=[])+({}>{})))<<((({}<=())+({}<=())+({}<={}))))+(((()>={})+(()<=[])))))))))<<((([]>{})+({}>={})+({}>={}))))+(((()>[])*(()>=[])))),((((((((((()>=())+(()>=[])+(()!=[])))<<((({}<[])+(()>=[]))))-((([]!=())+({}>=())))))))<<((({}<=())+({}!=())+(()<=()))))-(((([]=={})+({}<={})))<<((({}<={})+(()<{})))))]))
if (((((((({}>={})+(()!=())))<<((([]<())+([]<=()))+((()>=())+({}=={}))))-(((()<{})+({}<=())))))<<((({}>=())+(()==())))))-(((((((({}>{})+([]!={})))<<((({}!=())+({}>={}))+(([]<())+({}>={}))))-((([]!=())-({}==[])))))<<((([]==[])+([]>=()))))):
print (lambda _:_(map(chr,[(((((((([]<())+(()!=())))<<((([]<())+(()>=()))+(([]!=())+({}>={}))))-((([]==())+(()>[])))))<<(((()<=())+([]<=[])+({}<=()))))-((([]!=[])+({}<())))),((((([]==[])+({}=={}))+(({}=={})+({}=={}))+((()>[])-(()<())))<<(((()<=())+([]!=()))+(({}<=[])+({}<={}))))),((((((((((()>[])+(()>[])+([]<=[])))<<((({}!=())+([]<()))))-(((()>=[])*({}<=[])))))))<<((({}<())+({}<=())+({}>={}))))-((([]==[])-([]!=[])))),((((([]!={})+([]>=[])+([]>=[])))<<(((()>=())+({}<()))+(({}<[])+({}>={}))+(({}!=())-(()<[]))))+(((([]<{})+({}<=[])))<<((({}!={})+(()==()))))),((((((((()<=())*(()!=[])))<<(((()>{})+([]>=[]))+(({}<())+([]>{}))))-(((()!=[])*({}<())))))<<((({}<={})+(()!={})+({}!=[]))))-(((({}<=[])+(()<{})))<<((({}>{})+([]==[]))))),(((((()!=[])+([]>{})+({}!=()))+(({}>={})+({}<=())+(()>=[]))+((()!={})-(()>())))<<(((()>[])+(()>[]))+((()>=[])+([]<())))))])))(((lambda:(({}>{})*([]!=[]))).func_code.co_lnotab).join)
print (((((((([]!=())-(()<[])))<<(((()>{})+([]>{}))+(({}>={})+(()>{}))))-((([]>=())+({}!=())))))<<((({}<[])+(()!={})))))<((((((((((((()>={})*({}>={})))<<((([]<=())+({}!=())+(()==()))))+(((()<=())*([]>=[])))))))))<<((({}<())+([]>={})+({}<=[]))))-((({}<={})-({}>=[]))))
if (((((()>[])+({}!=[])+([]<=[]))+((()>=[])+([]<=[])+(()>[]))+((()>=[])-(()!=())))<<(((()!=[])-(()=={})))))-((((([]>={})+([]<=())+({}=={}))+((()>={})+([]>={})+({}=={}))+((()!={})+(()<{})))<<((([]>={})+([]>=()))))):
print str(bytearray(((((((((({}<=())+([]>={}))+(([]>={})+([]!=()))+((()>[])-({}>())))<<((({}<[])+([]>={}))))+((([]>=[])*([]>{})))))<<((([]>{})+({}<()))))+((({}!=())+({}>[])))),(((((((([]==[])+({}<={}))+((()==())+([]<()))+(({}>{})+({}>={})))<<((({}!=[])+(()<=()))))+((([]<={})+([]>{})))))<<((([]<=())+(()>[]))))),(((((((({}!=[])-(()<())))<<((([]==[])+(()>=()))+(({}!=())+([]!={}))))-((([]<())+([]<={})))))<<((([]>=[])+([]>{})+([]>={}))))),(((((((((({}<[])+(()>={})+({}<={})))<<((({}<=())+(()!={}))))+((({}>={})*([]<=())))))))<<((([]>={})+([]==[])+({}>={}))))-((([]>=())+({}<=())))),(((((((({}!=[])+({}<())+({}<=()))+((()>=[])+([]>={})+({}>={}))+(([]==[])-(()<[])))<<(((()>{})+(()==()))))-((([]!={})+([]!=[])))))<<(((()>=[])+([]>=[]))))+((({}<{})+(()>={})))),((((([]<[])+([]>={})))<<(((()>{})+(()>=())+(()!=[]))+((()==())+({}<=[])+([]==[]))))+((([]<=())-({}>())))),(((((((((([]>{})+({}<=())+(()>=())))<<(((()==())+([]>={}))))+((({}>=())+([]>={})))))))<<((([]>{})+(()>{})+([]==[]))))-((({}<={})*(()>{})))),(((((((({}<={})+([]<()))+(([]>={})+({}!=()))+(({}>=[])+([]<())))<<(((()!={})+(()!=[]))))+((([]==[])*({}>={})))))<<((({}=={})+([]<=[]))))),((((((((()>=[])+([]==[])+([]<=[]))+((()>[])+([]>=[])+({}!=()))+(([]<=())*({}<())))<<(((()>[])+([]!={}))))-((([]<=())-(()<())))))<<(((()>={})+({}!=[]))))),((((((((()!={})+([]<())+([]>{}))+((()>[])+(()!={})+(()>={}))+((()>=[])*([]<())))<<((({}<=[])+({}<=()))))-(((()>[])+([]==())))))<<(((()!=[])+({}=={}))))+((({}>{})+(()>[])))),(((((()!=[])+([]!=())+({}<()))+((()>[])+(()>[])+([]<()))+(([]==[])+({}>())))<<((({}!=())+({}!=[]))+(([]!=())+(()>{}))))+(((([]>[])+(()>={})))<<((([]<())*([]<()))))),(((((((([]!=())+(()!={})+([]!=()))+(([]>=[])+(()<=())+([]==[]))+(([]>={})-([]<{})))<<((({}>={})+(()>[]))))-((([]>[])+([]<())))))<<((([]>={})+({}>={}))))-((([]<())+(()<[])))),(((((((([]!={})-([]!=[])))<<((([]<())+({}<()))+(({}<[])+(()<=()))))+(((()<=())+({}==[])))))<<((([]<=[])+({}!=()))))),((((([]!={})+({}>={})+({}>={}))+((()>{})+([]!=())+([]<=()))+(({}<[])+([]<[])
))<<((([]>=[])+([]>=[]))+(({}!=[])+({}<()))))-(((({}!=[])-(()=={})))<<(((()!={})*(()>[]))))),(((((()==())+(()<=())+([]!={}))+(({}>={})+([]>{})+([]!={}))+((()>={})*([]!={})))<<((([]<=[])+({}<={}))+(({}!=[])+([]>=[]))))+((((()<[])+([]>{})))<<((([]<=[])+([]<=()))))),((((({}<=())+(()>=())+({}!=()))+(([]!=())+({}<={})+(()<=()))+(({}>=[])+({}<[])))<<(((()>[])+([]<=[]))+(([]>=[])+({}<[]))))+((({}!=[])+({}<=[])+({}!=())))),(((((()>{})+([]>=[]))+(({}<=[])+({}<=[]))+(({}>())+([]<=())))<<(((()!=[])+({}=={}))+(({}<[])+(()==()))))-((((()!={})-([]<[])))<<(((()==[])+(()>={}))))),((((([]>=[])+([]>{})+(()>=())))<<(((()>{})+([]!=()))+(({}!=())+({}<=[]))+(({}!=())+({}>[]))))+((([]=={})+(()==())))),(((((((([]!=())+({}==[])))<<((([]<=())+(()!={}))+((()>[])+({}!=[]))))-((({}!=[])*([]!=())))))<<(((()!=[])+([]!=())+(()>=[]))))-(((()>{})+({}!={})))),)))
groups = (numStrLen+(((((()!={})*(()==())))<<((({}!=())-(()<[]))))))/((({}=={})+({}<[])+([]!={})))
numStr = numStr.zfill(groups*((([]>{})+({}!=[])+({}!=()))))
if (((((((((([]>={})+([]>={})+({}<=[])))<<((({}<[])+({}<[]))))-((([]>=())+([]==[])))))))<<((({}<=[])+({}<[])+([]>{}))))+(((([]!=())*(()>={})))<<((({}<[])*(()>{})))))-(((((((((([]!=())+(()!=[])+({}=={})))<<((({}<={})+({}!=[]))))-(((()!={})-({}>=())))))))<<((([]!={})+([]>=[])+({}=={}))))+(((({}==[])+({}<=())))<<(((()<=[])+({}=={}))))):
print str(bytearray((((((([]<())+([]>=[]))+(([]<())+(()!=[]))+((()!={})+({}!={})))<<((([]<())+({}=={}))+(([]<())+(()>=()))))),(((((((([]<=[])+([]=={})))<<(((()!=[])+({}=={}))+((()>=[])+(()>[]))))+(((()!=[])-(()<())))))<<((([]>{})+([]>{}))))),((((([]<=())+(()!=[])+([]>{}))+(([]<=())+([]==[])+([]<=[]))+((()!={})-({}==[])))<<((({}<={})+({}<={}))+((()==())+([]<=[]))))-(((([]<=())*({}<=[])))<<((({}<[])+([]>()))))),(((((((((((({}>={})*({}<=())))<<(((()>=())+([]!={})+([]>=[]))))+((([]<=[])*({}!=())))))))))<<(((()==())+(()!={})+({}<()))))-((([]>=())+(()==())))),((((((((((()>=())+({}=={})+({}<={})))<<((([]>{})+(()>[]))))+((({}=={})+([]>())))))))<<((([]==[])+([]>={})+({}>={}))))+((([]!={})-([]=={})))),(((((((({}>={})+(()!={}))+(({}=={})+({}>={}))+((()>=[])*({}<=[])))<<((([]!=())+([]<=[]))))-((({}<=())*([]<=[])))))<<(((()==())+(()>[]))))+((([]<())-({}>{})))),)))
print ((([]>{})+({}>={})+([]!=())))>=((((((((((({}<=[])+([]!=[])))<<(((()<=())+({}<=[])+([]==[]))))+((([]!=[])+({}<()))))))))))
if ((((((((()>=[])+([]!=()))+(({}<=[])+(()<=()))+(({}=={})+({}>=[])))<<(((()>=())+([]<()))))-(((()>[])+(()<[])))))<<((([]!=())+(()<=()))))+((({}<={})*([]>={}))))-(((((((({}<=[])+([]!={}))+(({}<=())+([]<=[]))+(({}!=[])+(()<={})))<<((({}<())+([]!=()))))-((([]!={})+({}<{})))))<<(((()!=[])+({}<=[]))))+((({}==[])+([]<=())))):
print str(bytearray((((((({}>={})+(()>{}))+(({}<[])+({}<=[]))+((()>())+(()>=[])))<<(((()>=())+([]<()))+(({}<[])+([]!=()))))-((({}<[])-({}>())))),((((((((((()>{})+(()>=())+(()==())))<<(((()==())+(()>=[]))))+((({}!=())-({}<{})))))))<<((([]!=())+([]!={})+({}<=[]))))-((((()!=[])+(()<={})))<<((([]<())*(()!={}))))),(((((((({}<=())*(()>={})))<<((({}<[])+({}<={}))+(({}!=[])+([]>=[]))))+((({}!=[])+(()<=[])))))<<(((()>{})+([]>=[]))))+((({}<=[])*({}=={})))),(((((((((([]>={})+({}<())+({}=={})))<<((([]>=[])+({}>={}))))-(((()>())+(()==())))))))<<((([]==[])+(()>=[])+({}<[]))))-(((([]!={})*([]<=[])))<<((({}!=())*(()>{}))))),)))
print (((()>={})+({}<={})+(()>=[]))+(({}<=())+([]>{})+(()>[]))+(([]>{})-(()=={})))!=((((([]<=())+({}<={})+(()>={})))<<(((()!={})-({}>=[])))))
if ((((({}<[])*({}!=())))<<((({}!=[])+(()!={})+([]==[]))+(([]<())+(()>=())+(()>=()))))-((((()>=())*([]!={})))<<(((()<=())-({}<{})))))-((((([]>={})-([]<[])))<<(((()>=())+([]!=())+({}!=()))+(([]>{})+(()<=())+([]==[]))))-(((({}=={})*({}<())))<<((({}<())-({}>=()))))):
print ((((([]==[])+([]<=[])+([]<=[])))<<((({}<=())*({}>={})))))==((((([]<())+([]<())+([]!=()))+(([]>{})+([]!={})+([]>{}))+(({}<{})+([]>=[])))<<((({}<())+(()>={}))))-((([]<=())-(()<=[]))))
if (((((((()<={})+({}<[])))<<((([]!={})+(()>{}))+(({}!=[])+({}!=[]))))+((({}=={})*([]!=()))))))-((((((({}<=())+(()==[])))<<((([]<=[])+([]<=[]))+(([]==[])+({}<[]))))+(((()==())-([]!=[])))))):
print ((((([]<())+(()<=[])))<<((([]<=[])*(()!=[])))))!=(((((()<=())+(()<=())+([]<())))<<((([]<=())+([]<()))+(([]!={})+([]<=()))+(({}<={})*([]!={}))))+((((()!=[])*([]==[])))<<((([]>={})-({}>{})))))
print (((((((()>={})+(()>())))<<((([]==[])+(()>=[]))+(([]>=[])+({}<[]))))+((({}!=())-(()<[]))))))!=((((({}<())+({}<()))+(([]<=())+(()>=[]))+((()<=[])+({}>={})))<<(((()>={})+(()>=()))+(([]<=[])+({}!=()))))+(((({}<={})+([]!=[])))<<((({}!=[])*(()>={})))))
if ((((((((((()>=())+({}<[])+(()>{})))<<(((()>=[])+({}!=()))))-(((()<={})+({}!=[])))))))<<((([]<=())+(()!=[])+({}<={}))))+((({}<[])+(()<()))))-(((((((((({}!=[])+([]!={})+({}!=())))<<((([]<=[])+([]==[]))))-((([]>={})*({}=={})))))))<<((([]>=[])+(()>=())+([]>{}))))+((({}<[])+([]<{})))):
print ((((({}!=[])+(()>[]))+(({}!=[])+({}<={}))+(([]>={})*({}>={})))<<((({}<[])+([]>={}))+(({}<=())+([]!={}))))-(((()>{})-([]>=()))))<(((((()>[])-({}==())))<<(((()<=())+({}<())+(()>[]))+(({}!=())+(()<=())+([]!=()))))+(((({}<={})-(()==[])))<<((([]>{})-([]>())))))
for i in range(((()>=[])*([]>())), groups*(((()==())+(()>=[])+([]!={}))), ((([]>{})+([]<())+({}<=())))):
h, t, u = int(numStr[i]), int(numStr[i+((({}>=())+([]>={})))]), int(numStr[i+((((({}!=[])-(()<{})))<<(((()>=())-(()<{})))))])
g = groups-(i/((([]>{})+({}<())+({}!=[])))+((({}<())-(()=={}))))
if (((((((({}<=())*([]>{})))<<((({}<={})+([]<=()))+(([]>={})+([]<=[]))))-((({}<[])-(()==[])))))<<(((()>=[])+({}!=()))))-(((()>={})+({}==()))))-(((((((({}>=[])+({}<=())))<<((({}<=[])+({}>={}))+(({}<[])+([]==[]))))-((({}!=())*([]<())))))<<((([]<())+([]>=[]))))-((([]<{})+([]>={})))):
print ((((({}>{})+(()>[])))<<(((()>=[])+([]!=())+(()<=()))+((()>=())+({}<={})+(()<=()))))+((([]<=())-({}!={}))))<((((([]<=[])+(()!={})+({}=={})))<<(((()==())+([]>={})+({}<[]))))+((([]<=[])+(()<={}))))
if ((((((((()>{})+({}==())))<<((([]!={})+([]>={}))+((()!={})+([]<=()))))-(((()=={})+([]!={})))))<<(((()>={})+({}!=[]))))+((({}<={})*(()!=[]))))-(((((((({}>=())+([]>=[])))<<(((()>[])+({}<=()))+((()!={})+({}<=()))))-((({}>={})-({}==[])))))<<((([]!=())+([]>{}))))+(((()<=())+(()<=[])))):
print ((((({}!=[])+(()!=[]))+(({}<=[])+([]>{}))+(({}!=())-({}>())))<<((([]<=[])+([]==[])+({}<())))))<=((((([]!={})+({}>())))<<((({}<={})+({}!=[]))+(([]<=())+([]<()))+((()>=[])-({}==[]))))-(((()==[])+([]<=()))))
print (lambda _:_(map(chr,[((((((((()<=())+({}!=())+({}>={})))<<(((()!=[])+({}<=())+([]<=()))))+((([]<=())+([]>=())))))<<((([]>{})+(()>=[]))))+((({}!=())-([]<={})))),((((((((((((()>=[])+([]<{})))<<(((()>=[])+({}>={})+({}!=()))))+((({}>())+([]>={})))))))))<<(((()>=())+(()<=())+(()>{}))))-(((({}!=())+([]>[])))<<(((()>=())*({}>={}))))),((((((((()<{})+([]<=())))<<((({}<())+({}<()))+((()>{})+(()>={}))))-((([]<=())+({}!={})))))<<(((()>=())+([]<())+([]==[]))))-(((({}!=[])-({}!={})))<<((([]!=())+({}>[]))))),(((((()>=())+({}<=()))+(({}<={})+({}<=[]))+(([]<=[])-([]<[])))<<(((()==())+({}!=[]))+(([]!={})+({}=={}))))+((((()==())-(()<[])))<<((({}<={})-([]>()))))),(((((((([]!=())+({}<())+({}<=())))<<((([]>{})+({}<())+(()>{}))))+(((()!={})-(()=={})))))<<((({}<=())+([]<=()))))+((({}==())+(()>{})))),((((([]>={})+([]>{})+([]!={}))+(({}<())+(()>=[])+({}>={}))+(({}>{})+([]>{})))<<(((()<=())+(()!=[]))+(({}=={})+(()>{}))))+(((()<=())+({}<={})+([]<=())))),((((((((((()>{})+([]!=())+([]!=())))<<(((()>{})+(()>=()))))+(((()>=())-({}>=[])))))))<<(((()>{})+([]>={})+([]>=[]))))),(((((((({}>[])+(()>={})))<<((([]>={})+(()!={}))+(([]>=[])+(()>{}))))-((([]!=())-([]<[])))))<<((([]!=())+([]!={})+({}<=[]))))-((((()>=())*([]!=())))<<(((()>=[])+([]>()))))),(((((()>[])+([]<()))+(({}<=[])+({}<={}))+((()>=[])*({}<={})))<<(((()>[])+(()>=()))+((()>=())+(()>=()))))+(((([]<())-({}!={})))<<(((()==())*({}<=[])))))])))(((lambda:(([]<[])*(()<{}))).func_code.co_lnotab).join)
if (((((((()>=())+({}>[])))<<((([]<=[])+(()>={}))+((()>{})+([]>=[]))))-(((()>())+([]>{}))))))-((((((([]>={})-({}>())))<<(((()==())+([]<=[]))+(([]!=())+({}>={}))))-(((()<={})+([]>=[])))))):
print (((((((((((({}>{})+([]>={})))<<((([]==[])+(()!={})+({}=={}))))+(((()!=())+([]!=())))))))))<<(((()!={})+(()>=())+({}<=[])))))<=((((((((()>=[])+([]>[])))<<((({}<[])+([]<=()))+(([]>{})+({}=={}))))+(((()>[])+(()<{})))))<<(((()>[])+([]==[]))))-((([]<[])+([]!=()))))
print ((((((((()!=[])*({}<={})))<<(((()!={})+([]<=()))+(([]>={})+(()!={}))))-(((()!=[])-(()>())))))<<(((()!=[])*([]<=())))))<((((({}<())+([]<()))+(([]<=())+({}!=[]))+(([]>{})-([]>=())))<<(((()>={})+([]>=[]))+(([]>={})+({}<()))))-((((()<={})+([]!=())))<<((({}<=())-([]>())))))
if h>=((({}=={})*({}<[]))):
words.append(units[h])
words.append(str(bytearray((((((((((((()==())+({}>={})+([]!={})))<<((([]==[])+(()>={}))))+((([]<=())+(()==[])))))))<<((([]!=())+(()>=[])+([]!={}))))),(((((((({}>{})+([]==[])))<<((([]<=())+(()>=[]))+(([]!={})+(()>=[]))))-(((()==())*(()>=[])))))<<(((()!=[])+({}!=())+(()>=()))))-(((()<=())+([]!={})+({}<=[])))),((((([]!=())+({}!=())+(()>[]))+(([]>={})+([]!={})+({}!=[]))+((()==())*({}<=())))<<(((()<=())+({}=={}))+(([]<=[])+({}<=[]))))-(((({}<=())*({}<=())))<<((([]<())*({}<=[]))))),(((((((([]<=())+([]<())+(()>={})))<<((({}!=[])+([]>={})+([]!={}))))+((({}>[])+(()>=[])))))<<((([]!={})+({}<()))))),((((({}<={})+({}<={})+(()>={}))+(({}<=[])+(()>=[])+(()<=()))+(([]<())-({}>=())))<<(((()!={})+({}<=()))+((()!={})+(()>{}))))+(((([]!=())*(()>{})))<<(((()>={})-({}>()))))),((((((((()>={})+({}<=())+({}<=[])))<<((({}<())+([]!=())+([]>{}))))+((({}=={})*([]!=())))))<<((({}!=[])+([]==[]))))+((([]>=())+([]<=())))),(((((((({}=={})+({}!=())+({}<[])))<<((([]>{})+({}<=())+({}=={}))))+(((()<=())+(()==[])))))<<((([]>{})+(()<=()))))),))))
if ((((((((()<=())+([]!={}))+(([]!={})+([]<=[]))+(({}<=())+([]=={})))<<((([]!={})+({}<=[]))))-((({}!=())*(()<=())))))<<((([]==[])+([]>{}))))+((([]<())-({}>=[]))))-(((((((([]==[])+({}<=[]))+((()>={})+(()>=()))+(({}>())+(()==())))<<(((()!={})+(()<=()))))-((([]<[])+(()>={})))))<<((({}>={})+({}=={}))))+(((()<=())+([]>=())))):
print (lambda _:_(map(chr,[(((((((({}<())+({}=={})+([]>=[])))<<(((()>=[])+({}<())+({}<={}))))+((({}<[])+({}>=[])))))<<(((()>[])+(()>{}))))-((([]<=[])-([]<={})))),(((((()!=[])+(()>=())+([]<=())))<<(((()>{})+({}<[]))+((()>={})+({}!=()))+(([]>={})*([]==[]))))+((([]>=[])-({}>())))),(((((()<=())+({}<()))+(({}<=())+([]!={}))+((()==())+(()<[])))<<((({}=={})+([]>={}))+((()!=[])+({}<()))))-((((()!=[])+(()<{})))<<((({}<={})-(()<={}))))),((((({}<=())*(()>=[])))<<((([]<=[])+([]==[])+(()>[]))+(({}<={})+([]>={})+({}<=[]))))+((((()<=())-([]>=())))<<((({}<={})*(()==()))))),((((((((((()>{})+(()!={})+({}<=())))<<(((()<=())+(()>{}))))-((({}<={})+([]>())))))))<<((([]<=())+({}>={})+({}<={}))))-(((()<=())*({}!=())))),(((((((((((({}<())-([]<={})))<<((([]==[])+(()>=())+({}<={}))))+((({}>())+([]!={})))))))))<<((({}!=[])+({}<[])+(()>{}))))+(((({}<[])-({}>())))<<((({}>={})*(()<=()))))),(((((((((([]>=[])+(()>=())+(()>={})))<<((({}=={})+({}!=[]))))-(((()>=[])+([]==())))))))<<((([]<())+([]<())+(()!={}))))+((({}=={})+(()<()))))])))(((lambda:((()==())*(()==[]))).func_code.co_lnotab).join)
print (lambda _:_(map(chr,[(((((((((([]<=())+(()==())+([]>=[])))<<((({}<={})+(()>{}))))+((([]==[])*({}=={})))))))<<(((()==())+({}<[])+(()==()))))-(((()>=())-(()>())))),((((({}<())+([]==[])+({}<()))+(({}<=())+([]<())+([]<=()))+(({}>{})+({}<[])))<<(((()!={})+({}>={}))+(({}>={})+(()>=()))))+(((({}!=[])+([]==())))<<((({}==[])+({}!=[]))))),(((((((([]!=())-(()<[])))<<((({}=={})+({}<[]))+(([]>{})+(()>[]))))-(((()=={})+([]==[])))))<<(((()<=())+([]<())+(()>[]))))),(((((((({}>=())+([]>={})))<<((([]<=[])+({}<=()))+((()>=[])+({}<()))))-(((()>=[])-(()!=())))))<<((([]>=[])+([]<=())+([]<=()))))-((({}<[])+({}>={})+([]!=())))),((((([]!={})+({}<())+([]<=()))+((()>=())+([]>=[])+(()!=[]))+((()!={})*(()==())))<<((([]!={})+({}!=[]))+(([]<=())+([]>={}))))+(((()!=())+({}<=())))),((((({}!=())+([]<=())+(()>{}))+(({}<())+([]!=())+({}<()))+((()>=[])+(()!=())))<<(((()==())+(()!={}))+((()>={})+(()>=[]))))+(((({}==[])+([]>{})))<<((([]!=[])+({}<()))))),(((((((({}=={})+([]>={})+([]<())))<<(((()>=[])+(()==())+(()>=()))))+(((()<=())-(()<[])))))<<((({}<[])+({}>={}))))),(((((()>=())+([]<=[]))+(({}<={})+({}<=()))+(({}<=())*([]<())))<<((([]!={})+({}<[]))+(([]>=[])+(()>{}))))),((((({}!=[])+({}<=[])+([]<()))+((()>[])+({}=={})+([]<()))+(({}!=())+({}==[])))<<(((()>=())+({}!=[]))+(({}<=())+(()==()))))-(((()>=[])-(()<())))),(((((((((((({}=={})*(()>=[])))<<(((()>={})+({}<=())+({}<={}))))+((([]>=[])*({}>={})))))))))<<((([]<=())+(()>[])+({}!=()))))+((([]!={})-([]<[])))),((((((((((()>[])+([]==[])+(()<=())))<<(((()>[])+([]<=()))))+(((()!=[])-(()<())))))))<<((([]>=[])+([]<=())+(()<=()))))+(((([]==[])-([]<{})))<<((([]<=())+({}>{}))))),(((((((({}<=[])*(()==())))<<((([]==[])+([]>={}))+(({}=={})+({}<={}))))-((({}<())*([]>=[])))))<<((({}<=())+({}<[])+(()!=[]))))-((((()>{})*({}=={})))<<((([]<=[])+({}==()))))),(((((((({}=={})+({}<=[]))+((()!={})+({}!=[]))+((()>())+({}=={})))<<((([]!={})+(()>{}))))-((({}!=[])+(()<())))))<<((([]!={})+({}>={}))))-(((()>={})+({}>=[])))),((((((((((((()>{})+(()<())))<<((({}!=[])+(()>[])+(()==())
)))+(((()<[])+(()>{})))))))))<<((({}<=[])+([]>{})+({}!=[]))))+((((()>[])*({}<=())))<<(((()==())+({}<{}))))),((((((((()>=[])+([]<()))+(({}!=())+([]>=[]))+(([]>{})+(()<={})))<<(((()<=())+({}=={}))))-(((()=={})+([]<=[])))))<<((([]!={})+(()>=())))))])))(((lambda:(({}<{})+(()<[]))).func_code.co_lnotab).join)
if (((((((([]>[])+(()==())))<<((([]<())+(()>[]))+((()>=())+({}<=()))))-((([]>=[])-(()<=[])))))<<((([]==[])*([]!={})))))-(((((((({}=={})-({}>{})))<<((([]<=[])+({}<=()))+((()>{})+(()>={}))))-((([]<=[])*({}<=[])))))<<((({}!=())*(()<=()))))):
print (lambda _:_(map(chr,[(((((((({}=={})+(()<={})))<<((([]<=[])+(()>{}))+(({}<={})+([]!=()))))-((({}=={})-([]>[])))))<<((([]>{})+(()<=())+(()==()))))-((([]>=[])+([]<())+([]>{})))),((((([]<())+([]>={}))+((()>=[])+({}<=()))+((()>=())*({}<=[])))<<(((()>={})+([]>{}))+(({}<=())+({}<()))))),((((((((()!={})+({}<{})))<<(((()>=())+(()>{}))+(({}<())+([]>={}))))-(((()>{})+({}==[])))))<<((([]>={})+([]!={})+({}<={}))))-((({}<[])-({}==())))),(((((()<=())+([]<()))+((()>=[])+([]<()))+(({}!=[])+([]>=())))<<((([]<())+([]>=[]))+(([]>{})+({}!=[]))))+(((([]==[])+([]>[])))<<((([]!={})+({}!={}))))),(((((()!=[])+({}!=[]))+(([]>{})+({}<=()))+(({}!={})+({}!=())))<<((([]!=())+({}<=[]))+((()>{})+([]>{}))))+(((({}<={})-(()>())))<<((({}>={})-(()<())))))])))(((lambda:(([]!=())-(()>=()))).func_code.co_lnotab).join)
if t>((({}<())*([]>={}))):
words.append(tens[t])
if u >= (((()<())+([]>{}))):
words.append(units[u])
elif t == ((([]==[])-(()<()))):
if u >= ((([]!=())+({}>{}))):
words.append(teens[u])
else:
words.append(tens[t])
else:
if u >= (((()<{})+(()>=()))):
words.append(units[u])
if (g >= ((({}==[])+(()>[])))) and ((h + t + u ) > (([]!=())-(()!={}))):
words.append(thousands[g])
return str(bytearray((((((({}<())-({}>=())))<<(((()>{})+(()>{}))+(([]<=())+([]>={}))+((()!=[])*(()>[]))))),))).join(words)
if (((((((((([]<())+(()>=())+(()>[])))<<(((()!=[])+({}>={}))))-(((()==())+([]<={})))))))<<((([]!={})+({}<=())+(()>={}))))+(((({}!=[])*(()>{})))<<((({}<=())-(()!=())))))-(((((((((([]<=())+(()!=[])+([]>{})))<<((({}!=())+([]>=[]))))-(((()>=[])-({}!={})))))))<<((({}=={})+([]!={})+([]<=()))))+(((({}==())+({}<[])))<<((([]==())+(()>[]))))):
print (((((((()>{})+({}=={})+({}!=())))<<(((()>=[])+(()!={})+(()!={}))))-(((()<=[])+(()>={}))))))>=((((({}=={})+([]<=())+(()==())))<<((([]<())+([]>=[])))))
print (((((()>[])+(()<=[])))<<((({}>={})+({}<=[])))))<=((((((((()==[])+(()>=())))<<((({}!=())+([]>={}))+((()>=())+([]!={}))))+((([]==[])-(()>())))))<<((({}!=[])-({}>())))))
if ((((([]<=[])+([]!={}))+(({}<={})+(()<=()))+((()<=[])+(()<=())))<<((({}!=())+(()!=[])+(()>[]))))+(((()>=())*(()==()))))-(((((()!=[])+({}!=()))+(({}<=[])+([]<=[]))+(([]>=[])*(()!={})))<<((({}<=())+([]==[])+(()>[]))))+((({}!=[])+(()<{})))):
print ((lambda:((()=={})+([]<{}))).func_code.co_lnotab).join(map(chr,[(((((((({}!=())+(()<={})))<<((([]>=[])+([]>{}))+(({}<())+({}!=[]))))-((({}<=[])-([]>())))))<<((([]>={})+([]==[])+([]==[]))))-(((()>{})+({}>={})+({}=={})))),((((((((()!=[])+([]<=())+([]!={}))+(([]==[])+([]<())+(()>=[]))+(([]>=[])*(()!=[])))<<((([]<())+([]>=[]))))-((([]<=())+(()<{})))))<<(((()!={})+([]!=()))))+((({}!=[])+([]==())))),(((((((({}<=[])+({}!=[]))+((()<=())+(()<=()))+((()!=[])*({}<=())))<<(((()<=())+(()>[]))))+((({}<())-({}==())))))<<((([]<=())+([]>={}))))+((({}<=())+([]==())))),((((({}<={})+({}<())+(()>[]))+((()>{})+([]!=())+({}!=()))+(([]>{})-({}==())))<<((([]>={})+(()<=()))+(({}<=())+(()!=[]))))-(((({}>=())+({}<={})))<<((([]<=())*([]!={}))))),((((((((()!=[])+(()!={}))+(({}<={})+([]>=[]))+((()==[])+({}<[])))<<(((()>{})+(()!={}))))+((({}>={})+([]<={})))))<<((({}<=[])+({}!=[]))))-((([]==[])-({}!={})))),((((((((((((()>[])*([]<=[])))<<((([]<=[])+([]!={})+({}<[]))))+(((()<())+({}<={})))))))))<<((({}<=[])+([]!={})+(()>=()))))),((((((((()>=[])+({}>={}))+(([]<=[])+([]!={}))+(({}!=[])*(()>={})))<<((({}=={})+([]==[]))))-(((()>=[])*([]<=())))))<<(((()>{})+(()!={}))))+((({}<=[])+(()<[])))),(((((((([]<{})+(()>{})))<<((([]!=())+({}<=()))+(([]!={})+({}<=[]))))-(((()>=[])-([]=={})))))<<(((()>=())+({}!=[])+(()>={}))))-((({}<=[])+({}<=())+(()>=())))),(((((((({}<())+({}!=[])+([]<=[]))+(({}<={})+(()>={})+([]<=()))+(([]!=())*([]<=())))<<(((()>{})+({}<[]))))-((({}!=[])-({}>{})))))<<((({}<=[])+(()!=[]))))+((({}<={})*({}!=[])))),(((((()>=())+(()=={})))<<(((()==())+({}<=())+(()==()))+(({}<=[])+([]>{})+(()>={}))))+(((()!=())+(()>={})))),((((({}=={})+([]!={})+([]==[]))+((()>[])+({}<())+(()>{}))+(({}<=())+({}>=())))<<((({}<=())+([]!=()))+(([]>={})+({}<={}))))+((([]==[])+({}=={})+([]!={})))),(((((((([]<=[])+({}<())+({}=={}))+(({}<())+(()>{})+({}!=()))+((()!=[])*([]<=())))<<((([]>=[])+(()>{}))))-((([]<=[])-([]<{})))))<<((({}=={})+(()>=[]))))+((({}!=[])*([]>{})))),((((((((((((()==[])+(()>{})))<<((([]<=())+({}=={})+((
)<=()))))+((({}<[])*(()<=())))))))))<<((([]<=[])+(()!={})+(()!=[]))))-(((([]==[])+([]==())))<<((({}<=[])*([]<()))))),(((((((((((([]<=())-([]<{})))<<(((()>=[])+({}!=[])+([]==[]))))+((([]!=())-({}!={})))))))))<<(((()!={})+([]!={})+([]>{}))))-(((()==[])+(()>={})))),(((((((((((([]<())+(()<{})))<<((({}<=())+(()<=())+({}<=()))))+(((()>[])+([]==())))))))))<<((({}<={})+([]>{})+({}<[]))))),(((((((((((({}>={})-({}<{})))<<((([]<=())+({}<=())+(()>{}))))+((([]<=[])*({}<=())))))))))<<((({}<=[])+(()>=[])+({}!=[]))))+(((({}<={})-({}>())))<<(((()>=())+({}==()))))),(((((()>=[])+({}=={})+([]!=()))+(([]<=[])+(()!={})+([]>{}))+(({}>={})+([]==())))<<((({}<={})+([]!={}))+((()!=[])+({}<[]))))+((((()<={})+(()>=())))<<((([]>=[])+(()>={}))))),((((([]>=[])+([]>{}))+(({}=={})+({}>={}))+(({}!=[])-([]<[])))<<((([]>=[])+(()>={}))+(([]==[])+(()<=()))))-(((({}!={})+({}<={})))<<((({}<[])-(()<={})))))]))
print ((((({}<=[])+({}>={})+(()==()))+(({}<=[])+([]<=[])+({}<=()))+(([]==[])-(()<={})))<<(((()>={})+([]==[])+(()>=()))))+((({}<=[])*([]>=[]))))!=(((((()!={})+({}<={})+({}!=())))<<(((()>[])+({}=={}))+(({}<={})+(()>=[]))+(([]>{})*(()>=[]))))-((([]>=[])-([]>=()))))
if ((((([]==[])*([]<())))<<((({}<())+(()!=[])+({}<=()))+(({}<[])+([]<=[])+([]>={})))))-(((((()<=())-(()<=[])))<<(((()>[])+(()>={})+({}<[]))+(([]>={})+([]<())+(()>=()))))):
print str(bytearray(((((((((({}<())*([]>=[])))<<((({}<())+({}<()))+(([]<())+({}!=()))))+((([]!=())*([]>={})))))<<((({}<=())+(()==()))))-(((()<[])+({}<())))),((((((((((()!=[])+(()>=[])+([]>=[])))<<((({}>={})+({}=={}))))-(((()>=())*(()>=[])))))))<<((({}<={})+([]==[])+([]<=[]))))-(((({}<=[])*(()>[])))<<((([]<{})+(()!={}))))),(((((((([]==[])*({}!=())))<<((([]==[])+([]>=[]))+((()==())+(()!=[]))))+((({}>={})+([]>())))))<<(((()>=[])+({}=={}))))-((({}<[])+({}<{})))),(((((((([]>=[])+({}<())+({}<[])))<<((([]<=[])+(()>={})+(()>=[]))))+(((()<{})+(()==())))))<<((({}<={})+({}<={}))))+(((()==())*(()>={})))),((((((((((()<=())+({}=={})+([]>={})))<<((({}!=[])+(()>{}))))+((({}>[])+(()!={})))))))<<(((()>{})+([]>{})+(()==()))))+(((()!=())+({}<={})))),((((((((()<[])+({}!=())))<<((([]==[])+([]==[]))+(({}=={})+({}<=()))))-((([]>=[])+([]==())))))<<((([]==[])+(()>=[])+({}<=()))))+((((()==())-(()==[])))<<((({}==())+(()!={}))))),(((((((((([]<())+([]!=())+(()<=())))<<((({}<())+(()==()))))+(((()!=())+({}<[])))))))<<((({}<[])+(()!=[])+([]==[]))))),(((((()>=[])+([]>=[])+({}<={}))+(([]<())+(()<=())+(()>={}))+(({}<[])+(()<=[])))<<(((()!=[])+({}<={}))+(({}<())+({}=={}))))+(((()!={})+(()>={})+([]==[])))),)))
print (lambda _:_(map(chr,[(((((((((([]==[])+({}<=())+([]<=[])))<<(((()>[])+(()>=()))))+((([]>=[])*({}<[])))))))<<((([]>={})+({}!=[])+([]==[]))))+(((({}==[])+([]!={})))<<((({}=={})-(()==[]))))),(((((((({}<[])+([]<=())+(()>=[]))+(({}<[])+(()>=[])+([]>{}))+(([]<{})+([]==[])))<<(((()>{})+(()==()))))-(((()<())+([]!={})))))<<((({}=={})+({}!=[]))))+((([]>={})*([]<=())))),((((([]==())+(()!={})))<<((([]<=[])+([]!={})+({}<=()))+(([]<=())+({}!=())+([]>=[]))))+((({}>())+({}=={})))),((((((((()>{})+({}!=()))+(({}>={})+([]<=()))+(([]!={})+({}>[])))<<((({}>={})+([]<=()))))+(((()<())+({}>={})))))<<(((()!={})+([]<()))))-((({}<[])+({}>=[])))),((((((((()>[])+(()<{})))<<((({}<[])+([]<=[]))+(([]>=[])+([]>{}))))+(((()>={})-({}>=[])))))<<((([]>={})+([]!=()))))+((({}=={})-({}==())))),(((((((((({}<[])+(()>=[])+({}<[])))<<((({}<={})+([]<=[]))))-(((()<=[])+(()==())))))))<<((([]!={})+(()!=[])+(()>=()))))+((({}<())*({}!=()))))])))(((lambda:((()>())*([]>{}))).func_code.co_lnotab).join)
from sys import *
if len(argv) > ((({}<={})+({}>{}))):
print nwords(str().join(argv[(((()>=[])*(()!={}))):]))
else:
print ((lambda:((()<{})*(()>()))).func_code.co_lnotab).join(map(chr,[(((((((((((({}<={})*([]>{})))<<((({}<=())+(()>=[])+(()==()))))+((({}<={})+([]!=[])))))))))<<(((()>=())+([]!=()))))+((([]<=())+({}>{})))),(((((()>[])+([]<())+(()>{}))+(({}<=[])+({}=={})+({}<=()))+(([]>{})+({}==[])))<<((({}<[])+({}!=()))+(({}<())+([]>={}))))+((([]<())+(()==())+({}<())))),((((([]>={})-({}>())))<<((([]==[])+([]>=[]))+((()>={})+({}!=()))+(({}=={})+({}>[]))))),(((((((({}>{})+(()>=[])))<<(((()>=[])+([]>{}))+(([]>=[])+([]>={}))))-((({}>={})-([]=={})))))<<((([]>={})+({}<={}))))),(((((((((([]>={})+(()>=())+(()>={})))<<((([]!=())+([]>={}))))+((([]<())*([]>=[])))))))<<((({}<())+([]<=[])+(()<=()))))+(((()>={})+({}>=[])))),(((((()<=())+([]>=[])+([]!={}))+(({}<[])+({}<())+({}=={}))+(([]<=())-(()!=())))<<((([]>={})+(()>[]))+(({}<=())+(()>[]))))-(((([]!=())*(()>=())))<<((({}>={})*({}!=()))))),(((((()<=())+(()>=())+([]!=()))+(({}<=[])+({}<={})+(()==()))+(({}!={})+({}!=[])))<<((([]>{})+([]>=[]))+(({}>={})+(()>=[]))))+(((([]<())*([]>=[])))<<((({}<[])+(()>=[]))))),((((({}<=())+(()<[])))<<((({}<())+(()>=())+({}<()))+((()>[])+({}=={})+({}=={}))))-(((([]>=())+({}<=[])))<<((({}=={})+(()<{}))))),(((((()>={})+({}>=())))<<((({}=={})+(()==()))+((()>{})+(()>{}))+((()!=[])+(()<[]))))),(((((((([]<=[])+([]<=()))+(([]!=())+({}!=()))+(({}<=[])+({}==())))<<(((()!=[])+([]>=[]))))+((([]==[])*(()<=())))))<<((([]!=())*(()!={})))))])) % argv[(({}>[])+(()<={}))]
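The block above is a Python 2 number-to-English-words converter run through a boolean-arithmetic obfuscator: each integer is assembled from cross-type container comparisons such as `({}<[])` (True, i.e. 1, under Python 2's type-name ordering), combined with `+`, `*`, and `<<` into character codes, while the empty join separator comes from `(lambda: ...).func_code.co_lnotab`, which is typically the empty string for a one-line lambda. The recoverable control flow (split digits into groups of three, emit hundreds/tens/teens/units words, append a scale word per group) can be sketched in Python 3 as below; the word tables and exact strings are assumptions, since the encoded byte strings are not decoded here:

```python
UNITS = ['', 'one', 'two', 'three', 'four', 'five',
         'six', 'seven', 'eight', 'nine']
TEENS = ['ten', 'eleven', 'twelve', 'thirteen', 'fourteen', 'fifteen',
         'sixteen', 'seventeen', 'eighteen', 'nineteen']
TENS = ['', 'ten', 'twenty', 'thirty', 'forty', 'fifty',
        'sixty', 'seventy', 'eighty', 'ninety']
THOUSANDS = ['', 'thousand', 'million', 'billion', 'trillion']

def nwords(num):
    # Spell out a non-negative integer in English words.
    if num == 0:
        return 'zero'
    num_str = '%d' % num
    groups = (len(num_str) + 2) // 3      # number of 3-digit groups
    num_str = num_str.zfill(groups * 3)
    words = []
    for i in range(0, groups * 3, 3):
        h, t, u = (int(d) for d in num_str[i:i + 3])
        g = groups - (i // 3 + 1)         # scale index into THOUSANDS
        if h >= 1:
            words.append(UNITS[h])
            words.append('hundred')
        if t > 1:
            words.append(TENS[t])
            if u >= 1:
                words.append(UNITS[u])
        elif t == 1:
            words.append(TEENS[u])        # TEENS[0] is 'ten'
        elif u >= 1:
            words.append(UNITS[u])
        if g >= 1 and (h + t + u) > 0:
            words.append(THOUSANDS[g])
    return ' '.join(words)
```

For example, `nwords(1234)` yields `'one thousand two hundred thirty four'` — flat space-joined words, matching the append-then-join structure visible in the obfuscated original.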
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Date : 2019-11-07 17:00:48
# @Author : Zhi Liu (zhiliu.mind@gmail.com)
# @Link : http://iridescent.ink
# @Version : $1.0$
import torch.nn.functional as thf
def interpolate(input, size=None, scale_factor=None, mode='nearest', align_corners=None, recompute_scale_factor=None):
r"""Down/up samples the input to either the given :attr:`size` or the given
:attr:`scale_factor`
The algorithm used for interpolation is determined by :attr:`mode`.
Currently temporal, spatial and volumetric sampling are supported, i.e.
expected inputs are 3-D, 4-D or 5-D in shape.
The input dimensions are interpreted in the form:
`mini-batch x channels x [optional depth] x [optional height] x width`.
The modes available for resizing are: `nearest`, `linear` (3D-only),
`bilinear`, `bicubic` (4D-only), `trilinear` (5D-only), `area`
Args:
input (Tensor): the input tensor
size (int or Tuple[int] or Tuple[int, int] or Tuple[int, int, int]):
output spatial size.
scale_factor (float or Tuple[float]): multiplier for spatial size. Has to match input size if it is a tuple.
mode (str): algorithm used for upsampling:
``'nearest'`` | ``'linear'`` | ``'bilinear'`` | ``'bicubic'`` |
``'trilinear'`` | ``'area'``. Default: ``'nearest'``
align_corners (bool, optional): Geometrically, we consider the pixels of the
input and output as squares rather than points.
If set to ``True``, the input and output tensors are aligned by the
center points of their corner pixels, preserving the values at the corner pixels.
If set to ``False``, the input and output tensors are aligned by the corner
points of their corner pixels, and the interpolation uses edge value padding
for out-of-boundary values, making this operation *independent* of input size
when :attr:`scale_factor` is kept the same. This only has an effect when :attr:`mode`
is ``'linear'``, ``'bilinear'``, ``'bicubic'`` or ``'trilinear'``.
Default: ``False``
recompute_scale_factor (bool, optional): recompute the scale_factor for use in the
interpolation calculation. When `scale_factor` is passed as a parameter, it is used
to compute the `output_size`. If `recompute_scale_factor` is ``False`` or not specified,
the passed-in `scale_factor` will be used in the interpolation computation.
Otherwise, a new `scale_factor` will be computed based on the output and input sizes for
use in the interpolation computation (i.e. the computation will be identical to if the computed
`output_size` were passed-in explicitly). Note that when `scale_factor` is floating-point,
the recomputed scale_factor may differ from the one passed in due to rounding and precision
issues.
.. note::
With ``mode='bicubic'``, it's possible to cause overshoot, in other words it can produce
negative values or values greater than 255 for images.
Explicitly call ``result.clamp(min=0, max=255)`` if you want to reduce the overshoot
when displaying the image.
.. warning::
With ``align_corners = True``, the linearly interpolating modes
(`linear`, `bilinear`, and `trilinear`) don't proportionally align the
output and input pixels, and thus the output values can depend on the
input size. This was the default behavior for these modes up to version
0.3.1. Since then, the default behavior is ``align_corners = False``.
See :class:`~torch.nn.Upsample` for concrete examples of how this
affects the outputs.
.. warning::
When scale_factor is specified, if recompute_scale_factor=True,
scale_factor is used to compute the output_size which will then
be used to infer new scales for the interpolation.
The default behavior for recompute_scale_factor changed to False
in 1.6.0, and scale_factor is used in the interpolation
calculation.
Note:
When using the CUDA backend, this operation may induce nondeterministic
behaviour in its backward pass that is not easily switched off.
Please see the notes on :doc:`/notes/randomness` for background.
"""
return thf.interpolate(input, size, scale_factor, mode, align_corners, recompute_scale_factor)
def interpolatec(input, size=None, scale_factor=None, mode='nearest', align_corners=None, recompute_scale_factor=None):
r"""Down/up samples the input to either the given :attr:`size` or the given
:attr:`scale_factor`
The algorithm used for complex valued interpolation is determined by :attr:`mode`.
Currently temporal, spatial and volumetric sampling are supported, i.e.
expected inputs are 3-D, 4-D or 5-D in shape.
The input dimensions are interpreted in the form:
`mini-batch x [optional channels] x [optional height] x width x 2`.
The modes available for resizing are: `nearest`, `linear` (3D-only),
`bilinear`, `bicubic` (4D-only), `trilinear` (5D-only), `area`
Args:
input (Tensor): the input tensor
size (int or Tuple[int] or Tuple[int, int] or Tuple[int, int, int]):
output spatial size.
scale_factor (float or Tuple[float]): multiplier for spatial size. Has to match input size if it is a tuple.
mode (str): algorithm used for upsampling:
``'nearest'`` | ``'linear'`` | ``'bilinear'`` | ``'bicubic'`` |
``'trilinear'`` | ``'area'``. Default: ``'nearest'``
align_corners (bool, optional): Geometrically, we consider the pixels of the
input and output as squares rather than points.
If set to ``True``, the input and output tensors are aligned by the
center points of their corner pixels, preserving the values at the corner pixels.
If set to ``False``, the input and output tensors are aligned by the corner
points of their corner pixels, and the interpolation uses edge value padding
for out-of-boundary values, making this operation *independent* of input size
when :attr:`scale_factor` is kept the same. This only has an effect when :attr:`mode`
is ``'linear'``, ``'bilinear'``, ``'bicubic'`` or ``'trilinear'``.
Default: ``False``
recompute_scale_factor (bool, optional): recompute the scale_factor for use in the
interpolation calculation. When `scale_factor` is passed as a parameter, it is used
to compute the `output_size`. If `recompute_scale_factor` is ``False`` or not specified,
the passed-in `scale_factor` will be used in the interpolation computation.
Otherwise, a new `scale_factor` will be computed based on the output and input sizes for
use in the interpolation computation (i.e. the computation will be identical to if the computed
`output_size` were passed-in explicitly). Note that when `scale_factor` is floating-point,
the recomputed scale_factor may differ from the one passed in due to rounding and precision
issues.
.. note::
With ``mode='bicubic'``, it's possible to cause overshoot, in other words it can produce
negative values or values greater than 255 for images.
Explicitly call ``result.clamp(min=0, max=255)`` if you want to reduce the overshoot
when displaying the image.
.. warning::
With ``align_corners = True``, the linearly interpolating modes
(`linear`, `bilinear`, and `trilinear`) don't proportionally align the
output and input pixels, and thus the output values can depend on the
input size. This was the default behavior for these modes up to version
0.3.1. Since then, the default behavior is ``align_corners = False``.
See :class:`~torch.nn.Upsample` for concrete examples of how this
affects the outputs.
.. warning::
When scale_factor is specified, if recompute_scale_factor=True,
scale_factor is used to compute the output_size which will then
be used to infer new scales for the interpolation.
The default behavior for recompute_scale_factor changed to False
in 1.6.0, and scale_factor is used in the interpolation
calculation.
Note:
When using the CUDA backend, this operation may induce nondeterministic
behaviour in its backward pass that is not easily switched off.
Please see the notes on :doc:`/notes/randomness` for background.
"""
    # Move the trailing real/imag axis next to the batch axis so that
    # thf.interpolate treats it as a channel-like dimension.
    dim0 = list(range(input.dim()))
    dim = dim0.copy()
    dim.insert(1, dim[-1])
    dim.pop()
    input = input.permute(dim)
    # Build the inverse permutation that moves the real/imag axis back
    # to the last position after interpolation.
    dim0[1:-1] = dim0[2:-1]
    dim0.append(1)
    return thf.interpolate(input, size, scale_factor, mode, align_corners, recompute_scale_factor).permute(dim0)
if __name__ == "__main__":
pass
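The axis shuffling in `interpolatec` is compact but easy to misread. The sketch below reproduces the same index bookkeeping in plain Python (no torch required; the rank-4 case is a hypothetical example) and checks that the two permutations are inverses of each other:

```python
# Reproduces the permutation bookkeeping from interpolatec: the trailing
# real/imag axis is moved to position 1 before thf.interpolate is called,
# then moved back to the last position afterwards.

def complex_permutes(ndim):
    dim0 = list(range(ndim))
    fwd = dim0.copy()
    fwd.insert(1, fwd[-1])   # e.g. ndim=4: [0, 3, 1, 2, 3]
    fwd.pop()                # -> [0, 3, 1, 2]: last axis now sits at position 1
    inv = dim0.copy()
    inv[1:-1] = inv[2:-1]    # drop axis 1 from the middle ...
    inv.append(1)            # ... and re-append it last: [0, 2, 3, 1]
    return fwd, inv

fwd, inv = complex_permutes(4)
# Composing the two permutations yields the identity ordering, so
# permute(inv) undoes permute(fwd).
print([fwd[i] for i in inv])  # -> [0, 1, 2, 3]
```

The same round trip holds for any rank, which is why `interpolatec` can accept 3-D, 4-D, or 5-D complex inputs.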
| 54.931034 | 120 | 0.634965 | 1,266 | 9,558 | 4.733807 | 0.21327 | 0.073419 | 0.030369 | 0.013015 | 0.944435 | 0.937427 | 0.937427 | 0.937427 | 0.937427 | 0.937427 | 0 | 0.010036 | 0.280707 | 9,558 | 173 | 121 | 55.248555 | 0.861673 | 0.821406 | 0 | 0 | 0 | 0 | 0.030178 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.0625 | 0.0625 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
605ceefb96d2b8887d7270185f8551e4c45e2c6a | 17,460 | py | Python | OmniDB/OmniDB_app/tests.old/test_views.py | lejmr/OmniDB | 52c1c5a726a322f537a8e65f71d77ce322344d35 | [
"MIT"
] | 2,982 | 2016-04-12T13:33:50.000Z | 2022-03-31T14:16:43.000Z | OmniDB/OmniDB_app/tests.old/test_views.py | lejmr/OmniDB | 52c1c5a726a322f537a8e65f71d77ce322344d35 | [
"MIT"
] | 704 | 2016-04-30T14:44:11.000Z | 2022-03-18T09:39:41.000Z | OmniDB/OmniDB_app/tests.old/test_views.py | lejmr/OmniDB | 52c1c5a726a322f537a8e65f71d77ce322344d35 | [
"MIT"
] | 452 | 2016-04-25T23:50:25.000Z | 2022-03-28T15:03:52.000Z | from django.test import TestCase
from django.urls import reverse
from collections import OrderedDict
from OmniDB import settings
import OmniDB_app.include.OmniDatabase.SQLite
import OmniDB_app.include.Spartacus.Utils
from .utils_testing import (
build_client_ajax_request,
execute_client_login,
get_client_ajax_response_content,
get_client_omnidb_session,
get_omnidb_database_connection,
get_session_alert_message,
USERS
)
class ConnectionsNoSession(TestCase):
"""Test views from connections.py file with no user session.
"""
pass
class ConnectionsSession(TestCase):
"""Test views from connections.py file with user session.
"""
pass
class LoginNoSession(TestCase):
"""Test views from login.py file with no user session.
"""
def setUp(self):
"""Used to setup common properties between tests in this class.
"""
self.user = {
'user': USERS['ADMIN']['USER'],
'password': USERS['ADMIN']['PASSWORD']
}
self.assertIsNone(get_client_omnidb_session(p_client=self.client))
def test_get_index_user_pwd(self):
"""Test if is redirected to workspace when providing valid user and password parameters.
"""
v_response = self.client.get(
reverse('login'),
{
'user': self.user['user'],
'pwd': self.user['password']
},
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertIn('OmniDB_app/workspace.html', [v_template.name for v_template in v_response.templates])
self.assertEquals(len(v_response.redirect_chain), 1)
self.assertEquals(v_response.redirect_chain[0][0], reverse('workspace'))
self.assertEquals(v_response.redirect_chain[0][1], 302)
self.assertIn('omnidb_short_version', v_response.context)
self.assertEquals(v_response.context['omnidb_short_version'], settings.OMNIDB_SHORT_VERSION)
def test_get_index_no_user_pwd(self):
"""Test if is redirected to workspace when providing invalid user and valid password parameters.
"""
v_response = self.client.get(
reverse('login'),
{
'user': '{p_user}kkk'.format(p_user=self.user['user']),
'pwd': self.user['password']
},
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertEquals(len(v_response.redirect_chain), 0)
self.assertEquals(v_response.content, b'INVALID APP TOKEN')
def test_get_index_user_no_pwd(self):
"""Test if is redirected to workspace when providing valid user and invalid password parameters.
"""
v_response = self.client.get(
reverse('login'),
{
'user': self.user['user'],
'pwd': '{p_password}kkk'.format(p_password=self.user['password'])
},
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertEquals(len(v_response.redirect_chain), 0)
self.assertEquals(v_response.content, b'INVALID APP TOKEN')
def test_get_logout(self):
"""Test if receives expected message while trying to logout.
"""
v_response = self.client.get(
reverse('logout'),
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertIn('OmniDB_app/login.html', [v_template.name for v_template in v_response.templates])
self.assertEquals(len(v_response.redirect_chain), 1)
self.assertEquals(v_response.redirect_chain[0][0], reverse('login'))
self.assertEquals(v_response.redirect_chain[0][1], 302)
self.assertEquals(get_session_alert_message(p_client=self.client), 'Session object was already destroyed.')
    def test_sign_in_invalid_user(self):
        """Test that sign in fails with an invalid user and a valid password.
"""
v_response = self.client.post(
reverse('sign_in'),
build_client_ajax_request(
p_data={
'p_username': '{p_user}kkk'.format(p_user=self.user['user']),
'p_pwd': self.user['password']
}
)
)
self.assertEquals(v_response.status_code, 200)
v_content = get_client_ajax_response_content(p_response=v_response)
self.assertEquals(v_content['v_data'], -1)
self.assertFalse(v_content['v_error'])
self.assertIsNone(get_client_omnidb_session(p_client=self.client))
    def test_sign_in_invalid_password(self):
        """Test that sign in fails with a valid user and an invalid password.
"""
v_response = self.client.post(
reverse('sign_in'),
build_client_ajax_request(
p_data={
'p_username': self.user['user'],
'p_pwd': '{p_password}kkk'.format(p_password=self.user['password'])
}
)
)
self.assertEquals(v_response.status_code, 200)
v_content = get_client_ajax_response_content(p_response=v_response)
self.assertEquals(v_content['v_data'], -1)
self.assertFalse(v_content['v_error'])
self.assertIsNone(get_client_omnidb_session(p_client=self.client))
def test_sign_in_user_password(self):
"""Test if sign in succeeds with valid user and valid password.
"""
v_response = self.client.post(
reverse('sign_in'),
build_client_ajax_request(
p_data={
'p_username': self.user['user'],
'p_pwd': self.user['password']
}
)
)
self.assertEquals(v_response.status_code, 200)
v_content = get_client_ajax_response_content(p_response=v_response)
self.assertTrue(v_content['v_data'] >= 0)
self.assertFalse(v_content['v_error'])
v_omnidb_session = get_client_omnidb_session(p_client=self.client)
self.assertIsNotNone(v_omnidb_session)
v_omnidb_database = get_omnidb_database_connection()
v_user_table = v_omnidb_database.v_connection.Query(
p_sql='''
SELECT u.user_id,
u.password,
t.theme_id,
t.theme_name,
t.theme_type,
u.editor_font_size,
(CASE WHEN u.chat_enabled IS NULL
THEN 1
ELSE u.chat_enabled
END
) AS chat_enabled,
(CASE WHEN u.super_user IS NULL
THEN 0
ELSE u.super_user
END
) AS super_user,
u.csv_encoding,
u.csv_delimiter,
u.interface_font_size
FROM users u,
themes t
WHERE u.theme_id = t.theme_id
AND u.user_name = '{p_user}'
'''.format(
p_user=self.user['user']
)
)
self.assertEquals(len(v_user_table.Rows), 1)
v_user_row = v_user_table.Rows[0]
self.assertEquals(v_omnidb_session.v_user_id, v_user_row['user_id'])
self.assertEquals(v_omnidb_session.v_user_name, self.user['user'])
self.assertIsInstance(v_omnidb_session.v_omnidb_database, OmniDB_app.include.OmniDatabase.SQLite)
self.assertEquals(v_omnidb_session.v_editor_theme, v_user_row['theme_name'])
self.assertEquals(v_omnidb_session.v_theme_type, v_user_row['theme_type'])
self.assertEquals(v_omnidb_session.v_theme_id, v_user_row['theme_id'])
self.assertEquals(v_omnidb_session.v_editor_font_size, v_user_row['editor_font_size'])
self.assertEquals(v_omnidb_session.v_interface_font_size, v_user_row['interface_font_size'])
self.assertEquals(v_omnidb_session.v_enable_omnichat, int(v_user_row['chat_enabled']))
self.assertEquals(v_omnidb_session.v_super_user, int(v_user_row['super_user']))
self.assertIsInstance(v_omnidb_session.v_cryptor, OmniDB_app.include.Spartacus.Utils.Cryptor)
self.assertIsInstance(v_omnidb_session.v_database_index, int)
        self.assertIsInstance(v_omnidb_session.v_databases, dict)  # OrderedDict is a dict subclass
self.assertEquals(v_omnidb_session.v_user_key, self.client.session.session_key)
self.assertEquals(v_omnidb_session.v_csv_encoding, v_user_row['csv_encoding'])
self.assertEquals(v_omnidb_session.v_csv_delimiter, v_user_row['csv_delimiter'])
self.assertIsInstance(v_omnidb_session.v_tab_connections, dict)
class LoginSession(TestCase):
"""Test views from login.py file with user session.
"""
def setUp(self):
"""Used to setup common properties between tests in this class.
"""
self.user = {
'user': USERS['ADMIN']['USER'],
'password': USERS['ADMIN']['PASSWORD']
}
self.assertIsNone(get_client_omnidb_session(p_client=self.client))
        v_successful, v_response = execute_client_login(p_client=self.client, p_username=self.user['user'], p_password=self.user['password'])
        self.assertTrue(v_successful)
self.assertIsNotNone(get_client_omnidb_session(p_client=self.client))
def test_get_index_user_pwd(self):
"""Test if is redirected to workspace when providing valid user and password parameters.
"""
v_response = self.client.get(
reverse('login'),
{
'user': self.user['user'],
'pwd': self.user['password']
},
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertIn('OmniDB_app/workspace.html', [v_template.name for v_template in v_response.templates])
self.assertEquals(len(v_response.redirect_chain), 1)
self.assertEquals(v_response.redirect_chain[0][0], reverse('workspace'))
self.assertEquals(v_response.redirect_chain[0][1], 302)
self.assertIn('omnidb_short_version', v_response.context)
self.assertEquals(v_response.context['omnidb_short_version'], settings.OMNIDB_SHORT_VERSION)
def test_get_index_no_user_pwd(self):
"""Test if is redirected to workspace when providing invalid user and valid password parameters.
"""
v_response = self.client.get(
reverse('login'),
{
'user': '{p_user}kkk'.format(p_user=self.user['user']),
'pwd': self.user['password']
},
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertEquals(len(v_response.redirect_chain), 0)
self.assertEquals(v_response.content, b'INVALID APP TOKEN')
def test_get_index_user_no_pwd(self):
"""Test if is redirected to workspace when providing valid user and invalid password parameters.
"""
v_response = self.client.get(
reverse('login'),
{
'user': self.user['user'],
'pwd': '{p_password}kkk'.format(p_password=self.user['password'])
},
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertEquals(len(v_response.redirect_chain), 0)
self.assertEquals(v_response.content, b'INVALID APP TOKEN')
def test_get_logout(self):
"""Test if receives expected response while trying to logout.
"""
v_response = self.client.get(
reverse('logout'),
follow=True
)
self.assertEquals(v_response.status_code, 200)
self.assertIn('OmniDB_app/login.html', [v_template.name for v_template in v_response.templates])
self.assertEquals(len(v_response.redirect_chain), 1)
self.assertEquals(v_response.redirect_chain[0][0], reverse('login'))
self.assertEquals(v_response.redirect_chain[0][1], 302)
self.assertIsNone(get_session_alert_message(p_client=self.client))
self.assertIsNone(self.client.session['omnidb_user_key'])
self.assertIsNone(get_client_omnidb_session(p_client=self.client))
    def test_sign_in_invalid_user(self):
        """Test that sign in fails with an invalid user and a valid password.
"""
v_response = self.client.post(
reverse('sign_in'),
build_client_ajax_request(
p_data={
'p_username': '{p_user}kkk'.format(p_user=self.user['user']),
'p_pwd': self.user['password']
}
)
)
self.assertEquals(v_response.status_code, 200)
v_content = get_client_ajax_response_content(p_response=v_response)
self.assertEquals(v_content['v_data'], -1)
self.assertFalse(v_content['v_error'])
self.assertIsNone(get_client_omnidb_session(p_client=self.client))
    def test_sign_in_invalid_password(self):
        """Test that sign in fails with a valid user and an invalid password.
"""
v_response = self.client.post(
reverse('sign_in'),
build_client_ajax_request(
p_data={
'p_username': self.user['user'],
'p_pwd': '{p_password}kkk'.format(p_password=self.user['password'])
}
)
)
self.assertEquals(v_response.status_code, 200)
v_content = get_client_ajax_response_content(p_response=v_response)
self.assertEquals(v_content['v_data'], -1)
self.assertFalse(v_content['v_error'])
def test_sign_in_user_password(self):
"""Test if sign in succeeds with valid user and valid password.
"""
v_response = self.client.post(
reverse('sign_in'),
build_client_ajax_request(
p_data={
'p_username': self.user['user'],
'p_pwd': self.user['password']
}
)
)
self.assertEquals(v_response.status_code, 200)
v_content = get_client_ajax_response_content(p_response=v_response)
self.assertTrue(v_content['v_data'] >= 0)
self.assertFalse(v_content['v_error'])
v_omnidb_session = get_client_omnidb_session(p_client=self.client)
self.assertIsNotNone(v_omnidb_session)
v_omnidb_database = get_omnidb_database_connection()
v_user_table = v_omnidb_database.v_connection.Query(
p_sql='''
SELECT u.user_id,
u.password,
t.theme_id,
t.theme_name,
t.theme_type,
u.editor_font_size,
(CASE WHEN u.chat_enabled IS NULL
THEN 1
ELSE u.chat_enabled
END
) AS chat_enabled,
(CASE WHEN u.super_user IS NULL
THEN 0
ELSE u.super_user
END
) AS super_user,
u.csv_encoding,
u.csv_delimiter,
u.interface_font_size
FROM users u,
themes t
WHERE u.theme_id = t.theme_id
AND u.user_name = '{p_user}'
'''.format(
p_user=self.user['user']
)
)
self.assertEquals(len(v_user_table.Rows), 1)
v_user_row = v_user_table.Rows[0]
self.assertEquals(v_omnidb_session.v_user_id, v_user_row['user_id'])
self.assertEquals(v_omnidb_session.v_user_name, self.user['user'])
self.assertIsInstance(v_omnidb_session.v_omnidb_database, OmniDB_app.include.OmniDatabase.SQLite)
self.assertEquals(v_omnidb_session.v_editor_theme, v_user_row['theme_name'])
self.assertEquals(v_omnidb_session.v_theme_type, v_user_row['theme_type'])
self.assertEquals(v_omnidb_session.v_theme_id, v_user_row['theme_id'])
self.assertEquals(v_omnidb_session.v_editor_font_size, v_user_row['editor_font_size'])
self.assertEquals(v_omnidb_session.v_interface_font_size, v_user_row['interface_font_size'])
self.assertEquals(v_omnidb_session.v_enable_omnichat, int(v_user_row['chat_enabled']))
self.assertEquals(v_omnidb_session.v_super_user, int(v_user_row['super_user']))
self.assertIsInstance(v_omnidb_session.v_cryptor, OmniDB_app.include.Spartacus.Utils.Cryptor)
self.assertIsInstance(v_omnidb_session.v_database_index, int)
        self.assertIsInstance(v_omnidb_session.v_databases, dict)  # OrderedDict is a dict subclass
self.assertEquals(v_omnidb_session.v_user_key, self.client.session.session_key)
self.assertEquals(v_omnidb_session.v_csv_encoding, v_user_row['csv_encoding'])
self.assertEquals(v_omnidb_session.v_csv_delimiter, v_user_row['csv_delimiter'])
self.assertIsInstance(v_omnidb_session.v_tab_connections, dict)
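A note on test method naming in classes like these: in a Python class body, a second definition of the same name replaces the first, so `unittest` silently discovers only one of the two tests and the earlier one never runs. That is why the sign-in failure tests need distinct method names. A minimal illustration (the `Example` class is hypothetical, not OmniDB code):

```python
import unittest

class Example(unittest.TestCase):
    def test_case(self):  # first definition ...
        raise AssertionError('never collected')

    def test_case(self):  # ... is silently replaced by this one
        pass

# Only one test method is discovered, despite two definitions above.
names = unittest.TestLoader().getTestCaseNames(Example)
print(names)  # -> ['test_case']
```

Linters such as flake8 flag this pattern as F811 (redefinition), which is an easy way to catch it in a large test suite.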
| 38.886414 | 142 | 0.617468 | 2,103 | 17,460 | 4.810271 | 0.075606 | 0.105971 | 0.094108 | 0.056346 | 0.944346 | 0.933768 | 0.927343 | 0.927343 | 0.904211 | 0.904211 | 0 | 0.007346 | 0.282761 | 17,460 | 448 | 143 | 38.973214 | 0.800447 | 0.086884 | 0 | 0.790274 | 0 | 0 | 0.192636 | 0.008463 | 0 | 0 | 0 | 0 | 0.31307 | 1 | 0.048632 | false | 0.075988 | 0.021277 | 0 | 0.082067 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
60709397d86485b70c41b427d5072698ac8b53f0 | 1,784 | py | Python | tests/test_provider_joyent_triton.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | tests/test_provider_joyent_triton.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | tests/test_provider_joyent_triton.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # tests/test_provider_joyent_triton.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:29:19 UTC)
def test_provider_import():
import terrascript.provider.joyent.triton
def test_resource_import():
from terrascript.resource.joyent.triton import triton_fabric
from terrascript.resource.joyent.triton import triton_firewall_rule
from terrascript.resource.joyent.triton import triton_instance_template
from terrascript.resource.joyent.triton import triton_key
from terrascript.resource.joyent.triton import triton_machine
from terrascript.resource.joyent.triton import triton_service_group
from terrascript.resource.joyent.triton import triton_snapshot
from terrascript.resource.joyent.triton import triton_vlan
from terrascript.resource.joyent.triton import triton_volume
def test_datasource_import():
from terrascript.data.joyent.triton import triton_account
from terrascript.data.joyent.triton import triton_datacenter
from terrascript.data.joyent.triton import triton_fabric_network
from terrascript.data.joyent.triton import triton_fabric_vlan
from terrascript.data.joyent.triton import triton_image
from terrascript.data.joyent.triton import triton_network
from terrascript.data.joyent.triton import triton_package
from terrascript.data.joyent.triton import triton_volume
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.joyent.triton
#
# t = terrascript.provider.joyent.triton.triton()
# s = str(t)
#
# assert 'https://github.com/joyent/terraform-provider-triton' in s
# assert '0.8.2' in s
| 30.237288 | 80 | 0.789238 | 232 | 1,784 | 5.922414 | 0.323276 | 0.183406 | 0.222707 | 0.296943 | 0.644105 | 0.577147 | 0.577147 | 0.112809 | 0 | 0 | 0 | 0.009817 | 0.143498 | 1,784 | 58 | 81 | 30.758621 | 0.889398 | 0.272422 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017241 | 0 | 1 | 0.142857 | true | 0 | 1 | 0 | 1.142857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
608aa8904f51a631852ae12d95f8ea063171c097 | 19,928 | py | Python | auto_editor/resolve.py | Saya47/auto-editor | fdc55e4442d0c5eea11e30975fe0579d79f7f362 | [
"MIT"
] | 2 | 2022-01-08T06:38:23.000Z | 2022-03-10T22:05:42.000Z | auto_editor/resolve.py | Saya47/auto-editor | fdc55e4442d0c5eea11e30975fe0579d79f7f362 | [
"MIT"
] | null | null | null | auto_editor/resolve.py | Saya47/auto-editor | fdc55e4442d0c5eea11e30975fe0579d79f7f362 | [
"MIT"
] | 2 | 2021-06-26T10:59:49.000Z | 2022-01-17T02:44:22.000Z | '''resolve.py'''
"""
Export an XML file that can be imported by DaVinci Resolve.
"""
# Included functions
from usefulFunctions import conwrite, isAudioFile
# Standard libraries
import os
def exportToResolve(myInput, output, clips, duration, sampleRate, log):
pathurl = 'file://localhost' + os.path.abspath(myInput)
name = os.path.basename(myInput)
audioFile = isAudioFile(myInput)
ntsc = 'FALSE'
ana = 'FALSE' # anamorphic
depth = '16'
if(not audioFile):
try:
import cv2
conwrite('Grabbing video dimensions.')
cap = cv2.VideoCapture(myInput)
width = str(int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)))
height = str(int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
cap.release()
cv2.destroyAllWindows()
except ImportError:
width = '1920'
height = '1080'
else:
width = '1920'
height = '1080'
pixelar = 'square' # pixel aspect ratio
colordepth = '24'
sr = sampleRate
if(audioFile):
with open(output, 'w', encoding='utf-8') as outfile:
outfile.write('<?xml version="1.0" encoding="UTF-8"?>\n<!DOCTYPE xmeml>\n')
outfile.write('<xmeml version="5">\n')
outfile.write('\t<sequence>\n')
outfile.write('\t\t<name>Auto-Editor Audio Group</name>\n')
outfile.write(f'\t\t<duration>{duration}</duration>\n')
outfile.write('\t\t<rate>\n')
outfile.write('\t\t\t<timebase>30</timebase>\n')
outfile.write(f'\t\t\t<ntsc>{ntsc}</ntsc>\n')
outfile.write('\t\t</rate>\n')
outfile.write('\t\t<in>-1</in>\n')
outfile.write('\t\t<out>-1</out>\n')
outfile.write('\t\t<media>\n')
outfile.write('\t\t\t<video>\n')
outfile.write('\t\t\t\t<format>\n')
outfile.write('\t\t\t\t\t<samplecharacteristics>\n')
outfile.write(f'\t\t\t\t\t\t<width>{width}</width>\n')
outfile.write(f'\t\t\t\t\t\t<height>{height}</height>\n')
outfile.write(f'\t\t\t\t\t\t<pixelaspectratio>{pixelar}</pixelaspectratio>\n')
outfile.write('\t\t\t\t\t\t<rate>\n')
outfile.write('\t\t\t\t\t\t\t<timebase>30</timebase>\n')
outfile.write(f'\t\t\t\t\t\t\t<ntsc>{ntsc}</ntsc>\n')
outfile.write('\t\t\t\t\t\t</rate>\n')
outfile.write('\t\t\t\t\t</samplecharacteristics>\n')
outfile.write('\t\t\t\t</format>\n')
outfile.write('\t\t\t</video>\n')
outfile.write('\t\t\t<audio>\n')
outfile.write('\t\t\t\t<track>\n')
total = 0
for j, clip in enumerate(clips):
myStart = int(total)
total += (clip[1] - clip[0]) / (clip[2] / 100)
myEnd = int(total)
outfile.write(f'\t\t\t\t\t<clipitem id="clipitem-{j+1}">\n')
outfile.write('\t\t\t\t\t\t<masterclipid>masterclip-1</masterclipid>\n')
outfile.write(f'\t\t\t\t\t\t<name>{name}</name>\n')
outfile.write(f'\t\t\t\t\t\t<start>{myStart}</start>\n')
outfile.write(f'\t\t\t\t\t\t<end>{myEnd}</end>\n')
outfile.write(f'\t\t\t\t\t\t<in>{int(clip[0] / (clip[2] / 100))}</in>\n')
outfile.write(f'\t\t\t\t\t\t<out>{int(clip[1] / (clip[2] / 100))}</out>\n')
if(j == 0):
outfile.write('\t\t\t\t\t\t<file id="file-1">\n')
outfile.write(f'\t\t\t\t\t\t\t<name>{name}</name>\n')
outfile.write(f'\t\t\t\t\t\t\t<pathurl>{pathurl}</pathurl>\n')
outfile.write('\t\t\t\t\t\t\t<rate>\n')
outfile.write('\t\t\t\t\t\t\t\t<timebase>30</timebase>\n')
outfile.write(f'\t\t\t\t\t\t\t\t<ntsc>{ntsc}</ntsc>\n')
outfile.write('\t\t\t\t\t\t\t</rate>\n')
outfile.write('\t\t\t\t\t\t\t<media>\n')
outfile.write('\t\t\t\t\t\t\t\t<audio>\n')
outfile.write('\t\t\t\t\t\t\t\t\t<channelcount>1</channelcount>\n')
outfile.write('\t\t\t\t\t\t\t\t</audio>\n')
outfile.write('\t\t\t\t\t\t\t</media>\n')
outfile.write('\t\t\t\t\t\t</file>\n')
else:
                outfile.write('\t\t\t\t\t\t<file id="file-1"/>\n')
outfile.write('\t\t\t\t\t\t<sourcetrack>\n')
outfile.write('\t\t\t\t\t\t\t<mediatype>audio</mediatype>\n')
outfile.write('\t\t\t\t\t\t\t<trackindex>1</trackindex>\n')
outfile.write('\t\t\t\t\t\t</sourcetrack>\n')
outfile.write('\t\t\t\t\t</clipitem>\n')
outfile.write('\t\t\t\t</track>\n')
outfile.write('\t\t\t</audio>\n')
outfile.write('\t\t</media>\n')
outfile.write('\t</sequence>\n')
outfile.write('</xmeml>')
# Exit out of this function prematurely.
return None
# End of audio file code.
with open(output, 'w', encoding='utf-8') as outfile:
outfile.write('<?xml version="1.0" encoding="UTF-8"?>\n<!DOCTYPE xmeml>\n')
outfile.write('<xmeml version="4">\n')
outfile.write('\t<sequence id="sequence-1" TL.SQAudioVisibleBase="0" TL.SQVideoVisibleBase="0" TL.SQVisibleBaseTime="0" TL.SQAVDividerPosition="0.5" TL.SQHideShyTracks="0" TL.SQHeaderWidth="236" TL.SQTimePerPixel="0.013085939262623341" MZ.EditLine="0" MZ.Sequence.PreviewFrameSizeHeight="720" MZ.Sequence.AudioTimeDisplayFormat="200" MZ.Sequence.PreviewRenderingClassID="1297106761" MZ.Sequence.PreviewRenderingPresetCodec="1297107278" MZ.Sequence.PreviewRenderingPresetPath="EncoderPresets/SequencePreview/795454d9-d3c2-429d-9474-923ab13b7018/I-Frame Only MPEG.epr" MZ.Sequence.PreviewUseMaxRenderQuality="false" MZ.Sequence.PreviewUseMaxBitDepth="false" MZ.Sequence.EditingModeGUID="795454d9-d3c2-429d-9474-923ab13b7018" MZ.Sequence.VideoTimeDisplayFormat="104" MZ.WorkOutPoint="10770278400000" MZ.WorkInPoint="0" explodedTracks="true">\n')
outfile.write('\t\t<rate>\n')
outfile.write('\t\t\t<timebase>30</timebase>\n')
outfile.write(f'\t\t\t<ntsc>{ntsc}</ntsc>\n')
outfile.write('\t\t</rate>\n')
outfile.write('\t\t<name>Auto-Editor Video Group</name>\n')
outfile.write('\t\t<media>\n')
outfile.write('\t\t\t<video>\n')
outfile.write('\t\t\t\t<format>\n')
outfile.write('\t\t\t\t\t<samplecharacteristics>\n')
outfile.write('\t\t\t\t\t\t<rate>\n')
outfile.write('\t\t\t\t\t\t\t<timebase>30</timebase>\n')
outfile.write(f'\t\t\t\t\t\t\t<ntsc>{ntsc}</ntsc>\n')
outfile.write('\t\t\t\t\t\t</rate>\n')
outfile.write(f'\t\t\t\t\t\t<width>{width}</width>\n')
outfile.write(f'\t\t\t\t\t\t<height>{height}</height>\n')
outfile.write(f'\t\t\t\t\t\t<anamorphic>{ana}</anamorphic>\n')
outfile.write(f'\t\t\t\t\t\t<pixelaspectratio>{pixelar}</pixelaspectratio>\n')
outfile.write('\t\t\t\t\t\t<fielddominance>none</fielddominance>\n')
outfile.write(f'\t\t\t\t\t\t<colordepth>{colordepth}</colordepth>\n')
outfile.write('\t\t\t\t\t</samplecharacteristics>\n')
outfile.write('\t\t\t\t</format>\n')
outfile.write('\t\t\t\t<track>\n')
# Handle clips.
total = 0
for j, clip in enumerate(clips):
    myStart = int(total)
    total += (clip[1] - clip[0]) / (clip[2] / 100)
    myEnd = int(total)

    outfile.write(f'\t\t\t\t\t<clipitem id="clipitem-{j+7}">\n')
    outfile.write('\t\t\t\t\t\t<masterclipid>masterclip-2</masterclipid>\n')
    outfile.write(f'\t\t\t\t\t\t<name>{name}</name>\n')
    outfile.write(f'\t\t\t\t\t\t<start>{myStart}</start>\n')
    outfile.write(f'\t\t\t\t\t\t<end>{myEnd}</end>\n')
    outfile.write(f'\t\t\t\t\t\t<in>{int(clip[0] / (clip[2] / 100))}</in>\n')
    outfile.write(f'\t\t\t\t\t\t<out>{int(clip[1] / (clip[2] / 100))}</out>\n')
    if j == 0:
        outfile.write('\t\t\t\t\t\t<file id="file-2">\n')
        outfile.write(f'\t\t\t\t\t\t\t<name>{name}</name>\n')
        outfile.write(f'\t\t\t\t\t\t\t<pathurl>{pathurl}</pathurl>\n')
        outfile.write('\t\t\t\t\t\t\t<rate>\n')
        outfile.write('\t\t\t\t\t\t\t\t<timebase>30</timebase>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t<ntsc>{ntsc}</ntsc>\n')
        outfile.write('\t\t\t\t\t\t\t</rate>\n')
        outfile.write(f'\t\t\t\t\t\t\t<duration>{duration}</duration>\n')
        outfile.write('\t\t\t\t\t\t\t<media>\n')
        outfile.write('\t\t\t\t\t\t\t\t<video>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<samplecharacteristics>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t\t<rate>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t\t\t<timebase>30</timebase>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t\t<ntsc>{ntsc}</ntsc>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t\t</rate>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t<width>{width}</width>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t<height>{height}</height>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t<anamorphic>{ana}</anamorphic>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t<pixelaspectratio>{pixelar}</pixelaspectratio>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t\t<fielddominance>none</fielddominance>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t</samplecharacteristics>\n')
        outfile.write('\t\t\t\t\t\t\t\t</video>\n')
        outfile.write('\t\t\t\t\t\t\t\t<audio>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<samplecharacteristics>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t<depth>{depth}</depth>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t\t<samplerate>{sr}</samplerate>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t</samplecharacteristics>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<channelcount>2</channelcount>\n')
        outfile.write('\t\t\t\t\t\t\t\t</audio>\n')
        outfile.write('\t\t\t\t\t\t\t</media>\n')
        outfile.write('\t\t\t\t\t\t</file>\n')
    else:
        outfile.write('\t\t\t\t\t\t<file id="file-2"/>\n')
    # Add the speed effect if necessary.
    if clip[2] != 100:
        outfile.write('\t\t\t\t\t\t<filter>\n')
        outfile.write('\t\t\t\t\t\t\t<effect>\n')
        outfile.write('\t\t\t\t\t\t\t\t<name>Time Remap</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t<effectid>timeremap</effectid>\n')
        outfile.write('\t\t\t\t\t\t\t\t<effectcategory>motion</effectcategory>\n')
        outfile.write('\t\t\t\t\t\t\t\t<effecttype>motion</effecttype>\n')
        outfile.write('\t\t\t\t\t\t\t\t<mediatype>video</mediatype>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>variablespeed</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>variablespeed</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemin>0</valuemin>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemax>1</valuemax>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<value>0</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>speed</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>speed</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemin>-100000</valuemin>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemax>100000</valuemax>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t<value>{clip[2]}</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>reverse</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>reverse</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<value>FALSE</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>frameblending</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>frameblending</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<value>FALSE</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t</effect>\n')
        outfile.write('\t\t\t\t\t\t</filter>\n')
    # Linking for video blocks.
    for i in range(3):
        outfile.write('\t\t\t\t\t\t<link>\n')
        outfile.write(f'\t\t\t\t\t\t\t<linkclipref>clipitem-{(i*(len(clips)+1))+7+j}</linkclipref>\n')
        if i == 0:
            outfile.write('\t\t\t\t\t\t\t<mediatype>video</mediatype>\n')
        else:
            outfile.write('\t\t\t\t\t\t\t<mediatype>audio</mediatype>\n')
        if i == 2:
            outfile.write('\t\t\t\t\t\t\t<trackindex>2</trackindex>\n')
        else:
            outfile.write('\t\t\t\t\t\t\t<trackindex>1</trackindex>\n')
        outfile.write(f'\t\t\t\t\t\t\t<clipindex>{j+1}</clipindex>\n')
        if i == 1 or i == 2:
            outfile.write('\t\t\t\t\t\t\t<groupindex>1</groupindex>\n')
        outfile.write('\t\t\t\t\t\t</link>\n')
    outfile.write('\t\t\t\t\t</clipitem>\n')
outfile.write('\t\t\t\t</track>\n')
outfile.write('\t\t\t</video>\n')
outfile.write('\t\t\t<audio>\n')
outfile.write('\t\t\t\t<numOutputChannels>2</numOutputChannels>\n')
outfile.write('\t\t\t\t<format>\n')
outfile.write('\t\t\t\t\t<samplecharacteristics>\n')
outfile.write(f'\t\t\t\t\t\t<depth>{depth}</depth>\n')
outfile.write(f'\t\t\t\t\t\t<samplerate>{sr}</samplerate>\n')
outfile.write('\t\t\t\t\t</samplecharacteristics>\n')
outfile.write('\t\t\t\t</format>\n')
outfile.write('\t\t\t\t<track PannerIsInverted="true" PannerStartKeyframe="-91445760000000000,0.5,0,0,0,0,0,0" PannerName="Balance" currentExplodedTrackIndex="0" totalExplodedTrackCount="2" premiereTrackType="Stereo">\n')
# Audio Clips
total = 0
for j, clip in enumerate(clips):
    outfile.write(f'\t\t\t\t\t<clipitem id="clipitem-{len(clips)+8+j}" premiereChannelType="stereo">\n')
    outfile.write('\t\t\t\t\t\t<masterclipid>masterclip-2</masterclipid>\n')
    outfile.write(f'\t\t\t\t\t\t<name>{name}</name>\n')

    myStart = int(total)
    total += (clip[1] - clip[0]) / (clip[2] / 100)
    myEnd = int(total)

    outfile.write(f'\t\t\t\t\t\t<start>{myStart}</start>\n')
    outfile.write(f'\t\t\t\t\t\t<end>{myEnd}</end>\n')
    outfile.write(f'\t\t\t\t\t\t<in>{int(clip[0] / (clip[2] / 100))}</in>\n')
    outfile.write(f'\t\t\t\t\t\t<out>{int(clip[1] / (clip[2] / 100))}</out>\n')
    outfile.write('\t\t\t\t\t\t<file id="file-2"/>\n')
    outfile.write('\t\t\t\t\t\t<sourcetrack>\n')
    outfile.write('\t\t\t\t\t\t\t<mediatype>audio</mediatype>\n')
    outfile.write('\t\t\t\t\t\t\t<trackindex>1</trackindex>\n')
    outfile.write('\t\t\t\t\t\t</sourcetrack>\n')
    # Add speed effect for audio blocks.
    if clip[2] != 100:
        outfile.write('\t\t\t\t\t\t<filter>\n')
        outfile.write('\t\t\t\t\t\t\t<effect>\n')
        outfile.write('\t\t\t\t\t\t\t\t<name>Time Remap</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t<effectid>timeremap</effectid>\n')
        outfile.write('\t\t\t\t\t\t\t\t<effectcategory>motion</effectcategory>\n')
        outfile.write('\t\t\t\t\t\t\t\t<effecttype>motion</effecttype>\n')
        outfile.write('\t\t\t\t\t\t\t\t<mediatype>video</mediatype>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>variablespeed</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>variablespeed</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemin>0</valuemin>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemax>1</valuemax>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<value>0</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>speed</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>speed</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemin>-100000</valuemin>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<valuemax>100000</valuemax>\n')
        outfile.write(f'\t\t\t\t\t\t\t\t\t<value>{clip[2]}</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>reverse</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>reverse</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<value>FALSE</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t\t<parameter authoringApp="PremierePro">\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<parameterid>frameblending</parameterid>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<name>frameblending</name>\n')
        outfile.write('\t\t\t\t\t\t\t\t\t<value>FALSE</value>\n')
        outfile.write('\t\t\t\t\t\t\t\t</parameter>\n')
        outfile.write('\t\t\t\t\t\t\t</effect>\n')
        outfile.write('\t\t\t\t\t\t</filter>\n')
    if audioFile:
        startOn = 1
    else:
        startOn = 0
    for i in range(startOn, 3):
        outfile.write('\t\t\t\t\t\t<link>\n')
        outfile.write(f'\t\t\t\t\t\t\t<linkclipref>clipitem-{(i*(len(clips)+1))+7+j}</linkclipref>\n')
        if i == 0:
            outfile.write('\t\t\t\t\t\t\t<mediatype>video</mediatype>\n')
        else:
            outfile.write('\t\t\t\t\t\t\t<mediatype>audio</mediatype>\n')
        if i == 2:
            outfile.write('\t\t\t\t\t\t\t<trackindex>2</trackindex>\n')
        else:
            outfile.write('\t\t\t\t\t\t\t<trackindex>1</trackindex>\n')
        outfile.write(f'\t\t\t\t\t\t\t<clipindex>{j+1}</clipindex>\n')
        if i == 1 or i == 2:
            outfile.write('\t\t\t\t\t\t\t<groupindex>1</groupindex>\n')
        outfile.write('\t\t\t\t\t\t</link>\n')
    outfile.write('\t\t\t\t\t</clipitem>\n')
outfile.write('\t\t\t\t\t<outputchannelindex>1</outputchannelindex>\n')
outfile.write('\t\t\t\t</track>\n')
outfile.write('\t\t\t</audio>\n')
outfile.write('\t\t</media>\n')
outfile.write('\t</sequence>\n')
outfile.write('</xmeml>')
conwrite('')
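# The clip loops above convert each clip's source frame span into timeline
# frames by dividing by its speed percentage. A standalone sketch of that
# timeline arithmetic (the `clips` data below is hypothetical, not from the
# real project):

```python
# Each clip is (start_frame, end_frame, speed_percent), matching the
# clip[0]/clip[1]/clip[2] accesses in the loops above.
def timeline_positions(clips):
    positions, total = [], 0
    for start, end, speed in clips:
        my_start = int(total)
        # A clip played at 200% occupies half as many timeline frames.
        total += (end - start) / (speed / 100)
        positions.append((my_start, int(total)))
    return positions

print(timeline_positions([(0, 30, 100), (30, 90, 200)]))
```

# Note that `total` carries fractional frames across clips, so rounding error
# does not accumulate: each clip's start is the previous clip's exact end.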
# --- flask_appbuilder/fieldwidgets.py, from Flask-AppBuilder (BSD-3-Clause) ---
from flask_babel import lazy_gettext as _
from markupsafe import Markup
from wtforms import widgets
from wtforms.widgets import html_params
class DatePickerWidget(object):
    """
    Date Time picker from Eonasdan GitHub
    """

    data_template = (
        '<div class="input-group date appbuilder_date" id="datepicker">'
        '<span class="input-group-addon"><i class="fa fa-calendar cursor-hand"></i>'
        "</span>"
        '<input class="form-control" data-format="yyyy-MM-dd" %(text)s />'
        "</div>"
    )

    def __call__(self, field, **kwargs):
        kwargs.setdefault("id", field.id)
        kwargs.setdefault("name", field.name)
        if not field.data:
            field.data = ""
        template = self.data_template
        return Markup(
            template % {"text": html_params(type="text", value=field.data, **kwargs)}
        )


class DateTimePickerWidget(object):
    """
    Date Time picker from Eonasdan GitHub
    """

    data_template = (
        '<div class="input-group date appbuilder_datetime" id="datetimepicker">'
        '<span class="input-group-addon"><i class="fa fa-calendar cursor-hand"></i>'
        "</span>"
        '<input class="form-control" data-format="yyyy-MM-dd hh:mm:ss" %(text)s />'
        "</div>"
    )

    def __call__(self, field, **kwargs):
        kwargs.setdefault("id", field.id)
        kwargs.setdefault("name", field.name)
        if not field.data:
            field.data = ""
        template = self.data_template
        return Markup(
            template % {"text": html_params(type="text", value=field.data, **kwargs)}
        )


class BS3TextFieldWidget(widgets.TextInput):
    def __call__(self, field, **kwargs):
        kwargs["class"] = u"form-control"
        if field.label:
            kwargs["placeholder"] = field.label.text
        if "name_" in kwargs:
            field.name = kwargs["name_"]
        return super(BS3TextFieldWidget, self).__call__(field, **kwargs)


class BS3TextAreaFieldWidget(widgets.TextArea):
    def __call__(self, field, **kwargs):
        kwargs["class"] = u"form-control"
        kwargs["rows"] = 3
        if field.label:
            kwargs["placeholder"] = field.label.text
        return super(BS3TextAreaFieldWidget, self).__call__(field, **kwargs)


class BS3PasswordFieldWidget(widgets.PasswordInput):
    def __call__(self, field, **kwargs):
        kwargs["class"] = u"form-control"
        if field.label:
            kwargs["placeholder"] = field.label.text
        return super(BS3PasswordFieldWidget, self).__call__(field, **kwargs)


class Select2AJAXWidget(object):
    data_template = "<input %(text)s />"

    def __init__(self, endpoint, extra_classes=None, style=None):
        self.endpoint = endpoint
        self.extra_classes = extra_classes
        self.style = style or u"width:250px"

    def __call__(self, field, **kwargs):
        kwargs.setdefault("id", field.id)
        kwargs.setdefault("name", field.name)
        kwargs.setdefault("endpoint", self.endpoint)
        kwargs.setdefault("style", self.style)
        input_classes = "input-group my_select2_ajax"
        if self.extra_classes:
            input_classes = input_classes + " " + self.extra_classes
        kwargs.setdefault("class", input_classes)
        if not field.data:
            field.data = ""
        template = self.data_template
        return Markup(
            template % {"text": html_params(type="text", value=field.data, **kwargs)}
        )


class Select2SlaveAJAXWidget(object):
    data_template = '<input class="input-group my_select2_ajax_slave" %(text)s />'

    def __init__(self, master_id, endpoint, extra_classes=None, style=None):
        self.endpoint = endpoint
        self.master_id = master_id
        self.extra_classes = extra_classes
        self.style = style or u"width:250px"

    def __call__(self, field, **kwargs):
        kwargs.setdefault("id", field.id)
        kwargs.setdefault("name", field.name)
        kwargs.setdefault("endpoint", self.endpoint)
        kwargs.setdefault("master_id", self.master_id)
        kwargs.setdefault("style", self.style)
        input_classes = "input-group my_select2_ajax"
        if self.extra_classes:
            input_classes = input_classes + " " + self.extra_classes
        kwargs.setdefault("class", input_classes)
        if not field.data:
            field.data = ""
        template = self.data_template
        return Markup(
            template % {"text": html_params(type="text", value=field.data, **kwargs)}
        )


class Select2Widget(widgets.Select):
    extra_classes = None

    def __init__(self, extra_classes=None, style=None):
        self.extra_classes = extra_classes
        self.style = style or u"width:250px"
        return super(Select2Widget, self).__init__()

    def __call__(self, field, **kwargs):
        kwargs["class"] = u"my_select2 form-control"
        if self.extra_classes:
            kwargs["class"] = kwargs["class"] + " " + self.extra_classes
        kwargs["style"] = self.style
        kwargs["data-placeholder"] = _("Select Value")
        if "name_" in kwargs:
            field.name = kwargs["name_"]
        return super(Select2Widget, self).__call__(field, **kwargs)


class Select2ManyWidget(widgets.Select):
    extra_classes = None

    def __init__(self, extra_classes=None, style=None):
        self.extra_classes = extra_classes
        self.style = style or u"width:250px"
        return super(Select2ManyWidget, self).__init__()

    def __call__(self, field, **kwargs):
        kwargs["class"] = u"my_select2 form-control"
        if self.extra_classes:
            kwargs["class"] = kwargs["class"] + " " + self.extra_classes
        kwargs["style"] = self.style
        kwargs["data-placeholder"] = _("Select Value")
        kwargs["multiple"] = u"true"
        if "name_" in kwargs:
            field.name = kwargs["name_"]
        return super(Select2ManyWidget, self).__call__(field, **kwargs)
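# Every widget above funnels its keyword arguments through wtforms'
# `html_params` before interpolating them into a `data_template`. A simplified
# stand-in (deliberately NOT the real wtforms implementation, which also
# handles boolean and underscore-prefixed attributes) illustrates the kind of
# attribute string the `%(text)s` placeholder receives:

```python
def fake_html_params(**kwargs):
    # Simplified stand-in: render key="value" pairs, sorted for determinism.
    return " ".join(f'{k}="{v}"' for k, v in sorted(kwargs.items()))

# Mirrors Select2AJAXWidget.data_template being filled in __call__.
data_template = "<input %(text)s />"
html = data_template % {
    "text": fake_html_params(type="text", id="name", value="Ada")
}
print(html)
```

# This is why `kwargs.setdefault(...)` is used in the widgets: caller-supplied
# attributes win, and everything ends up as one flat attribute string.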
# --- src/net/get_semnet.py, from valeriopaolicelli/SegVPR (Apache-2.0) ---
import net.pspnet as pspnet
import net.deeplab as deeplab
def get_deeplab(encoder, encoder_dim, classes):
    return deeplab.DeepLab(encoder=encoder, encoder_dim=encoder_dim, classes=classes)


def get_pspnet(encoder, encoder_dim, classes):
    return pspnet.PSPNet(encoder=encoder, encoder_dim=encoder_dim, classes=classes)
# --- simpleAICV/detection/models/fpn.py, from zgcr/pytorch-ImageNet-CIFAR-COCO-voc-training (MIT) ---
import os
import sys
BASE_DIR = os.path.dirname(
    os.path.dirname(os.path.dirname(os.path.dirname(
        os.path.abspath(__file__)))))
sys.path.append(BASE_DIR)
import math
import torch
import torch.nn as nn
import torch.nn.functional as F
from simpleAICV.classification.backbones.darknet import ConvBnActBlock
from simpleAICV.classification.backbones.yolov5backbone import CSPBottleneck
from simpleAICV.classification.backbones.yoloxbackbone import YOLOXCSPBottleneck
class RetinaFPN(nn.Module):
    def __init__(self, inplanes, planes, use_p5=False):
        super(RetinaFPN, self).__init__()
        # inplanes:[C3_inplanes,C4_inplanes,C5_inplanes]
        self.use_p5 = use_p5
        self.P3_1 = nn.Conv2d(inplanes[0], planes, kernel_size=1, stride=1, padding=0)
        self.P3_2 = nn.Conv2d(planes, planes, kernel_size=3, stride=1, padding=1)
        self.P4_1 = nn.Conv2d(inplanes[1], planes, kernel_size=1, stride=1, padding=0)
        self.P4_2 = nn.Conv2d(planes, planes, kernel_size=3, stride=1, padding=1)
        self.P5_1 = nn.Conv2d(inplanes[2], planes, kernel_size=1, stride=1, padding=0)
        self.P5_2 = nn.Conv2d(planes, planes, kernel_size=3, stride=1, padding=1)
        self.P6 = nn.Conv2d(
            planes, planes, kernel_size=3, stride=2,
            padding=1) if self.use_p5 else nn.Conv2d(
                inplanes[2], planes, kernel_size=3, stride=2, padding=1)
        self.P7 = nn.Sequential(
            nn.ReLU(),
            nn.Conv2d(planes, planes, kernel_size=3, stride=2, padding=1))

    def forward(self, inputs):
        [C3, C4, C5] = inputs

        P5 = self.P5_1(C5)
        P4 = self.P4_1(C4)
        P4 = F.interpolate(P5, size=(P4.shape[2], P4.shape[3]),
                           mode='bilinear', align_corners=True) + P4
        P3 = self.P3_1(C3)
        P3 = F.interpolate(P4, size=(P3.shape[2], P3.shape[3]),
                           mode='bilinear', align_corners=True) + P3
        del C3, C4

        P5 = self.P5_2(P5)
        P4 = self.P4_2(P4)
        P3 = self.P3_2(P3)

        P6 = self.P6(P5) if self.use_p5 else self.P6(C5)
        del C5

        P7 = self.P7(P6)

        return [P3, P4, P5, P6, P7]

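# RetinaFPN's outputs P3-P7 sit at strides 8, 16, 32, 64 and 128 relative to
# the network input: P3-P5 inherit the backbone strides, and P6/P7 each add
# one more stride-2 conv on top of C5. A quick sanity sketch of the expected
# spatial sizes (plain arithmetic, no torch needed):

```python
import math

def fpn_level_sizes(h, w, strides=(8, 16, 32, 64, 128)):
    # ceil matches how a stride-2 conv with padding rounds odd sizes up
    return [(math.ceil(h / s), math.ceil(w / s)) for s in strides]

print(fpn_level_sizes(640, 640))
```

# With an input side divisible by 128 (e.g. 640), every level lands on an
# integer size and the top level P7 is 1/128 of the input.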
class Yolov3TinyFPNHead(nn.Module):
    def __init__(self, inplanes, per_level_num_anchors=3, num_classes=80,
                 act_type='leakyrelu'):
        super(Yolov3TinyFPNHead, self).__init__()
        # inplanes:[C4_inplanes,C5_inplanes]
        self.per_level_num_anchors = per_level_num_anchors

        self.conv1 = ConvBnActBlock(inplanes[1], 1024, kernel_size=3, stride=1,
                                    padding=1, groups=1, has_bn=True,
                                    has_act=True, act_type=act_type)
        self.conv2 = ConvBnActBlock(1024, 256, kernel_size=1, stride=1,
                                    padding=0, groups=1, has_bn=True,
                                    has_act=True, act_type=act_type)
        self.P5_conv = ConvBnActBlock(256, 512, kernel_size=3, stride=1,
                                      padding=1, groups=1, has_bn=True,
                                      has_act=True, act_type=act_type)
        self.P5_pred_conv = nn.Conv2d(512,
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      groups=1, bias=True)
        self.conv3 = ConvBnActBlock(256, 128, kernel_size=1, stride=1,
                                    padding=0, groups=1, has_bn=True,
                                    has_act=True, act_type=act_type)
        self.P4_conv = ConvBnActBlock(int(128 + inplanes[0]), 256, kernel_size=3,
                                      stride=1, padding=1, groups=1, has_bn=True,
                                      has_act=True, act_type=act_type)
        self.P4_pred_conv = nn.Conv2d(256,
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      groups=1, bias=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, inputs):
        [C4, C5] = inputs

        C5 = self.conv1(C5)
        C5 = self.conv2(C5)

        P5 = self.P5_conv(C5)
        P5 = self.P5_pred_conv(P5)

        C5_upsample = F.interpolate(self.conv3(C5),
                                    size=(C4.shape[2], C4.shape[3]),
                                    mode='bilinear', align_corners=True)
        del C5
        C4 = torch.cat([C4, C5_upsample], dim=1)

        P4 = self.P4_conv(C4)
        P4 = self.P4_pred_conv(P4)
        del C4

        # P4 shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P4 = P4.permute(0, 2, 3, 1).contiguous()
        P4 = P4.view(P4.shape[0], P4.shape[1], P4.shape[2],
                     self.per_level_num_anchors, -1)
        # P5 shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P5 = P5.permute(0, 2, 3, 1).contiguous()
        P5 = P5.view(P5.shape[0], P5.shape[1], P5.shape[2],
                     self.per_level_num_anchors, -1)

        P4[:, :, :, :, 0:3] = torch.sigmoid(P4[:, :, :, :, 0:3])
        P4[:, :, :, :, 5:] = torch.sigmoid(P4[..., 5:])
        P5[:, :, :, :, 0:3] = torch.sigmoid(P5[:, :, :, :, 0:3])
        P5[:, :, :, :, 5:] = torch.sigmoid(P5[..., 5:])

        return [P4, P5]

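# Each pred conv above emits per_level_num_anchors * (1 + 4 + num_classes)
# channels, and the view() splits those channels into an anchor axis and an
# attribute axis (objectness + 4 box terms + class scores). The channel
# bookkeeping, in plain numbers:

```python
per_level_num_anchors, num_classes = 3, 80  # COCO defaults used above
pred_channels = per_level_num_anchors * (1 + 4 + num_classes)
print(pred_channels)  # out_channels of P4_pred_conv / P5_pred_conv

# [B, 255, H, W] -> [B, H, W, 3, 85]: the reshape is lossless because the
# total element count is unchanged.
b, h, w = 2, 13, 13
flat = b * pred_channels * h * w
reshaped = b * h * w * per_level_num_anchors * (1 + 4 + num_classes)
assert flat == reshaped
```

# The same 255 = 3 * 85 split appears in every YOLO head in this file.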
class Yolov3FPNHead(nn.Module):
    def __init__(self, inplanes, per_level_num_anchors=3, num_classes=80,
                 act_type='leakyrelu'):
        super(Yolov3FPNHead, self).__init__()
        # inplanes:[C3_inplanes,C4_inplanes,C5_inplanes]
        self.per_level_num_anchors = per_level_num_anchors

        P5_1_layers = []
        for i in range(5):
            P5_1_layers.append(
                ConvBnActBlock(inplanes[2], inplanes[2] // 2, kernel_size=1,
                               stride=1, padding=0, groups=1, has_bn=True,
                               has_act=True, act_type=act_type) if i % 2 == 0
                else ConvBnActBlock(inplanes[2] // 2, inplanes[2], kernel_size=3,
                                    stride=1, padding=1, groups=1, has_bn=True,
                                    has_act=True, act_type=act_type))
        self.P5_1 = nn.Sequential(*P5_1_layers)

        self.P5_2 = ConvBnActBlock(inplanes[2] // 2, inplanes[2], kernel_size=3,
                                   stride=1, padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type)
        self.P5_pred_conv = nn.Conv2d(inplanes[2],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      groups=1, bias=True)
        self.P5_up_conv = ConvBnActBlock(inplanes[2] // 2, inplanes[1] // 2,
                                         kernel_size=1, stride=1, padding=0,
                                         groups=1, has_bn=True, has_act=True,
                                         act_type=act_type)

        P4_1_layers = []
        for i in range(5):
            if i % 2 == 0:
                P4_1_layers.append(
                    ConvBnActBlock((inplanes[1] // 2) + inplanes[1],
                                   inplanes[1] // 2, kernel_size=1, stride=1,
                                   padding=0, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type) if i == 0
                    else ConvBnActBlock(inplanes[1], inplanes[1] // 2,
                                        kernel_size=1, stride=1, padding=0,
                                        groups=1, has_bn=True, has_act=True,
                                        act_type=act_type))
            else:
                P4_1_layers.append(
                    ConvBnActBlock(inplanes[1] // 2, inplanes[1], kernel_size=3,
                                   stride=1, padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type))
        self.P4_1 = nn.Sequential(*P4_1_layers)

        self.P4_2 = ConvBnActBlock(inplanes[1] // 2, inplanes[1], kernel_size=3,
                                   stride=1, padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type)
        self.P4_pred_conv = nn.Conv2d(inplanes[1],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      groups=1, bias=True)
        self.P4_up_conv = ConvBnActBlock(inplanes[1] // 2, inplanes[0] // 2,
                                         kernel_size=1, stride=1, padding=0,
                                         groups=1, has_bn=True, has_act=True,
                                         act_type=act_type)

        P3_1_layers = []
        for i in range(5):
            if i % 2 == 0:
                P3_1_layers.append(
                    ConvBnActBlock((inplanes[0] // 2) + inplanes[0],
                                   inplanes[0] // 2, kernel_size=1, stride=1,
                                   padding=0, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type) if i == 0
                    else ConvBnActBlock(inplanes[0], inplanes[0] // 2,
                                        kernel_size=1, stride=1, padding=0,
                                        groups=1, has_bn=True, has_act=True,
                                        act_type=act_type))
            else:
                P3_1_layers.append(
                    ConvBnActBlock(inplanes[0] // 2, inplanes[0], kernel_size=3,
                                   stride=1, padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type))
        self.P3_1 = nn.Sequential(*P3_1_layers)

        self.P3_2 = ConvBnActBlock(inplanes[0] // 2, inplanes[0], kernel_size=3,
                                   stride=1, padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type)
        self.P3_pred_conv = nn.Conv2d(inplanes[0],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      groups=1, bias=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, inputs):
        [C3, C4, C5] = inputs

        P5 = self.P5_1(C5)
        del C5

        C5_upsample = F.interpolate(self.P5_up_conv(P5),
                                    size=(C4.shape[2], C4.shape[3]),
                                    mode='bilinear', align_corners=True)
        C4 = torch.cat([C4, C5_upsample], axis=1)
        del C5_upsample

        P4 = self.P4_1(C4)
        del C4

        C4_upsample = F.interpolate(self.P4_up_conv(P4),
                                    size=(C3.shape[2], C3.shape[3]),
                                    mode='bilinear', align_corners=True)
        C3 = torch.cat([C3, C4_upsample], axis=1)
        del C4_upsample

        P3 = self.P3_1(C3)
        del C3

        P5 = self.P5_2(P5)
        P5 = self.P5_pred_conv(P5)
        P4 = self.P4_2(P4)
        P4 = self.P4_pred_conv(P4)
        P3 = self.P3_2(P3)
        P3 = self.P3_pred_conv(P3)

        # P3 shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P3 = P3.permute(0, 2, 3, 1).contiguous()
        P3 = P3.view(P3.shape[0], P3.shape[1], P3.shape[2],
                     self.per_level_num_anchors, -1)
        # P4 shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P4 = P4.permute(0, 2, 3, 1).contiguous()
        P4 = P4.view(P4.shape[0], P4.shape[1], P4.shape[2],
                     self.per_level_num_anchors, -1)
        # P5 shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P5 = P5.permute(0, 2, 3, 1).contiguous()
        P5 = P5.view(P5.shape[0], P5.shape[1], P5.shape[2],
                     self.per_level_num_anchors, -1)

        P3[:, :, :, :, 0:3] = torch.sigmoid(P3[:, :, :, :, 0:3])
        P3[:, :, :, :, 5:] = torch.sigmoid(P3[..., 5:])
        P4[:, :, :, :, 0:3] = torch.sigmoid(P4[:, :, :, :, 0:3])
        P4[:, :, :, :, 5:] = torch.sigmoid(P4[..., 5:])
        P5[:, :, :, :, 0:3] = torch.sigmoid(P5[:, :, :, :, 0:3])
        P5[:, :, :, :, 5:] = torch.sigmoid(P5[..., 5:])

        return [P3, P4, P5]

class Yolov4TinyFPNHead(nn.Module):
    def __init__(self, inplanes, per_level_num_anchors=3, num_classes=80,
                 act_type='leakyrelu'):
        super(Yolov4TinyFPNHead, self).__init__()
        # inplanes:[C4_inplanes,C5_inplanes]
        self.per_level_num_anchors = per_level_num_anchors

        self.P5_1 = ConvBnActBlock(inplanes[1], inplanes[1] // 2, kernel_size=1,
                                   stride=1, padding=0, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type)
        self.P5_up_conv = ConvBnActBlock(inplanes[1] // 2, inplanes[0] // 2,
                                         kernel_size=1, stride=1, padding=0,
                                         groups=1, has_bn=True, has_act=True,
                                         act_type=act_type)
        self.P5_2 = ConvBnActBlock(inplanes[1] // 2, inplanes[1], kernel_size=3,
                                   stride=1, padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type)
        self.P5_pred_conv = nn.Conv2d(inplanes[1],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      bias=True)
        self.P4_1 = ConvBnActBlock(int(inplanes[0] + inplanes[0] // 2),
                                   inplanes[0], kernel_size=3, stride=1,
                                   padding=1, groups=1, has_bn=True,
                                   has_act=True, act_type=act_type)
        self.P4_pred_conv = nn.Conv2d(inplanes[0],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      bias=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, inputs):
        [C4, C5] = inputs

        P5 = self.P5_1(C5)
        del C5

        P5_out = self.P5_2(P5)
        P5_out = self.P5_pred_conv(P5_out)

        P5_upsample = F.interpolate(self.P5_up_conv(P5),
                                    size=(C4.shape[2], C4.shape[3]),
                                    mode='bilinear', align_corners=True)
        P4 = torch.cat([C4, P5_upsample], dim=1)
        del C4, P5, P5_upsample

        P4 = self.P4_1(P4)
        P4_out = self.P4_pred_conv(P4)
        del P4

        # P4_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P4_out = P4_out.permute(0, 2, 3, 1).contiguous()
        P4_out = P4_out.view(P4_out.shape[0], P4_out.shape[1], P4_out.shape[2],
                             self.per_level_num_anchors, -1)
        # P5_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P5_out = P5_out.permute(0, 2, 3, 1).contiguous()
        P5_out = P5_out.view(P5_out.shape[0], P5_out.shape[1], P5_out.shape[2],
                             self.per_level_num_anchors, -1)

        P4_out[:, :, :, :, 0:3] = torch.sigmoid(P4_out[:, :, :, :, 0:3])
        P4_out[:, :, :, :, 5:] = torch.sigmoid(P4_out[..., 5:])
        P5_out[:, :, :, :, 0:3] = torch.sigmoid(P5_out[:, :, :, :, 0:3])
        P5_out[:, :, :, :, 5:] = torch.sigmoid(P5_out[..., 5:])

        return P4_out, P5_out

class SPP(nn.Module):
    '''
    Spatial pyramid pooling layer used in YOLOv3-SPP
    '''

    def __init__(self, kernels=[5, 9, 13]):
        super(SPP, self).__init__()
        self.maxpool_layers = nn.ModuleList([
            nn.MaxPool2d(kernel_size=kernel, stride=1, padding=kernel // 2)
            for kernel in kernels
        ])

    def forward(self, x):
        out = torch.cat([x] + [layer(x) for layer in self.maxpool_layers],
                        dim=1)

        return out

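# With stride 1 and padding k // 2 for odd k, each max-pool branch preserves
# H and W, so the concatenation only grows the channel axis: the input plus
# one branch per kernel. The output-shape arithmetic, without torch:

```python
def spp_out_shape(c, h, w, kernels=(5, 9, 13)):
    # MaxPool2d(kernel_size=k, stride=1, padding=k // 2) with odd k keeps
    # the spatial size: out = h + 2*(k // 2) - k + 1 = h
    for k in kernels:
        assert h + 2 * (k // 2) - k + 1 == h
    return (c * (1 + len(kernels)), h, w)

print(spp_out_shape(512, 19, 19))
```

# This is why the first conv of p5_block2 in Yolov4FPNHead takes
# inplanes[2] * 2 channels: SPP turns inplanes[2] // 2 into 4x that, i.e.
# 2 * inplanes[2].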
class Yolov4FPNHead(nn.Module):
    def __init__(self, inplanes, per_level_num_anchors=3, num_classes=80,
                 act_type='leakyrelu'):
        super(Yolov4FPNHead, self).__init__()
        # inplanes:[C3_inplanes,C4_inplanes,C5_inplanes]
        self.per_level_num_anchors = per_level_num_anchors

        p5_block1 = nn.Sequential(*[
            ConvBnActBlock(inplanes[2], inplanes[2] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type) if i % 2 == 0
            else ConvBnActBlock(inplanes[2] // 2, inplanes[2], kernel_size=3,
                                stride=1, padding=1, groups=1, has_bn=True,
                                has_act=True, act_type=act_type)
            for i in range(3)
        ])
        p5_spp_block = SPP(kernels=(5, 9, 13))
        p5_block2 = nn.Sequential(
            ConvBnActBlock(inplanes[2] * 2, inplanes[2] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type),
            ConvBnActBlock(inplanes[2] // 2, inplanes[2], kernel_size=3,
                           stride=1, padding=1, groups=1, has_bn=True,
                           has_act=True, act_type=act_type),
            ConvBnActBlock(inplanes[2], inplanes[2] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type))
        self.P5_1 = nn.Sequential(p5_block1, p5_spp_block, p5_block2)

        self.P5_up_conv = ConvBnActBlock(inplanes[2] // 2, inplanes[1] // 2,
                                         kernel_size=1, stride=1, padding=0,
                                         groups=1, has_bn=True, has_act=True,
                                         act_type=act_type)
        self.P4_cat_conv = ConvBnActBlock(inplanes[1], inplanes[1] // 2,
                                          kernel_size=1, stride=1, padding=0,
                                          groups=1, has_bn=True, has_act=True,
                                          act_type=act_type)
        self.P4_1 = nn.Sequential(*[
            ConvBnActBlock(inplanes[1], inplanes[1] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type) if i % 2 == 0
            else ConvBnActBlock(inplanes[1] // 2, inplanes[1], kernel_size=3,
                                stride=1, padding=1, groups=1, has_bn=True,
                                has_act=True, act_type=act_type)
            for i in range(5)
        ])
        self.P4_up_conv = ConvBnActBlock(inplanes[1] // 2, inplanes[0] // 2,
                                         kernel_size=1, stride=1, padding=0,
                                         groups=1, has_bn=True, has_act=True,
                                         act_type=act_type)
        self.P3_cat_conv = ConvBnActBlock(inplanes[0], inplanes[0] // 2,
                                          kernel_size=1, stride=1, padding=0,
                                          groups=1, has_bn=True, has_act=True,
                                          act_type=act_type)
        self.P3_1 = nn.Sequential(*[
            ConvBnActBlock(inplanes[0], inplanes[0] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type) if i % 2 == 0
            else ConvBnActBlock(inplanes[0] // 2, inplanes[0], kernel_size=3,
                                stride=1, padding=1, groups=1, has_bn=True,
                                has_act=True, act_type=act_type)
            for i in range(5)
        ])
        self.P3_out_conv = ConvBnActBlock(inplanes[0] // 2, inplanes[0],
                                          kernel_size=3, stride=1, padding=1,
                                          groups=1, has_bn=True, has_act=True,
                                          act_type=act_type)
        self.P3_down_conv = ConvBnActBlock(inplanes[0] // 2, inplanes[1] // 2,
                                           kernel_size=3, stride=2, padding=1,
                                           groups=1, has_bn=True, has_act=True,
                                           act_type=act_type)
        self.P4_2 = nn.Sequential(*[
            ConvBnActBlock(inplanes[1], inplanes[1] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type) if i % 2 == 0
            else ConvBnActBlock(inplanes[1] // 2, inplanes[1], kernel_size=3,
                                stride=1, padding=1, groups=1, has_bn=True,
                                has_act=True, act_type=act_type)
            for i in range(5)
        ])
        self.P4_out_conv = ConvBnActBlock(inplanes[1] // 2, inplanes[1],
                                          kernel_size=3, stride=1, padding=1,
                                          groups=1, has_bn=True, has_act=True,
                                          act_type=act_type)
        self.P4_down_conv = ConvBnActBlock(inplanes[1] // 2, inplanes[2] // 2,
                                           kernel_size=3, stride=2, padding=1,
                                           groups=1, has_bn=True, has_act=True,
                                           act_type=act_type)
        self.P5_2 = nn.Sequential(*[
            ConvBnActBlock(inplanes[2], inplanes[2] // 2, kernel_size=1,
                           stride=1, padding=0, groups=1, has_bn=True,
                           has_act=True, act_type=act_type) if i % 2 == 0
            else ConvBnActBlock(inplanes[2] // 2, inplanes[2], kernel_size=3,
                                stride=1, padding=1, groups=1, has_bn=True,
                                has_act=True, act_type=act_type)
            for i in range(5)
        ])
        self.P5_out_conv = ConvBnActBlock(inplanes[2] // 2, inplanes[2],
                                          kernel_size=3, stride=1, padding=1,
                                          groups=1, has_bn=True, has_act=True,
                                          act_type=act_type)
        self.P5_pred_conv = nn.Conv2d(inplanes[2],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      bias=True)
        self.P4_pred_conv = nn.Conv2d(inplanes[1],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      bias=True)
        self.P3_pred_conv = nn.Conv2d(inplanes[0],
                                      per_level_num_anchors * (1 + 4 + num_classes),
                                      kernel_size=1, stride=1, padding=0,
                                      bias=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, inputs):
        [C3, C4, C5] = inputs

        P5 = self.P5_1(C5)
        del C5

        P5_upsample = F.interpolate(self.P5_up_conv(P5),
                                    size=(C4.shape[2], C4.shape[3]),
                                    mode='bilinear', align_corners=True)
        C4 = torch.cat([self.P4_cat_conv(C4), P5_upsample], dim=1)
        del P5_upsample

        P4 = self.P4_1(C4)
        del C4

        P4_upsample = F.interpolate(self.P4_up_conv(P4),
                                    size=(C3.shape[2], C3.shape[3]),
                                    mode='bilinear', align_corners=True)
        C3 = torch.cat([self.P3_cat_conv(C3), P4_upsample], dim=1)
        del P4_upsample

        P3 = self.P3_1(C3)
        del C3

        P3_out = self.P3_out_conv(P3)
        P3_out = self.P3_pred_conv(P3_out)

        P4 = torch.cat([P4, self.P3_down_conv(P3)], dim=1)
        del P3
        P4 = self.P4_2(P4)

        P4_out = self.P4_out_conv(P4)
        P4_out = self.P4_pred_conv(P4_out)

        P5 = torch.cat([P5, self.P4_down_conv(P4)], dim=1)
        del P4
        P5 = self.P5_2(P5)

        P5_out = self.P5_out_conv(P5)
        P5_out = self.P5_pred_conv(P5_out)
        del P5

        # P3_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
        P3_out = P3_out.permute(0, 2, 3, 1).contiguous()
        P3_out = P3_out.view(P3_out.shape[0], P3_out.shape[1], P3_out.shape[2],
                             self.per_level_num_anchors, -1)
self.per_level_num_anchors, -1)
# P4_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
P4_out = P4_out.permute(0, 2, 3, 1).contiguous()
P4_out = P4_out.view(P4_out.shape[0], P4_out.shape[1], P4_out.shape[2],
self.per_level_num_anchors, -1)
# P5_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
P5_out = P5_out.permute(0, 2, 3, 1).contiguous()
P5_out = P5_out.view(P5_out.shape[0], P5_out.shape[1], P5_out.shape[2],
self.per_level_num_anchors, -1)
P3_out[:, :, :, :, 0:3] = torch.sigmoid(P3_out[:, :, :, :, 0:3])
P3_out[:, :, :, :, 5:] = torch.sigmoid(P3_out[..., 5:])
P4_out[:, :, :, :, 0:3] = torch.sigmoid(P4_out[:, :, :, :, 0:3])
P4_out[:, :, :, :, 5:] = torch.sigmoid(P4_out[..., 5:])
P5_out[:, :, :, :, 0:3] = torch.sigmoid(P5_out[:, :, :, :, 0:3])
P5_out[:, :, :, :, 5:] = torch.sigmoid(P5_out[..., 5:])
return [P3_out, P4_out, P5_out]
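The three `permute`/`view` blocks above all unpack the same prediction-channel layout. A minimal torch-free sketch of that layout, assuming the COCO defaults used throughout this file (3 anchors per level, 80 classes):

```python
# Sketch of the head-output channel layout; 3 anchors / 80 classes are assumptions
# taken from the defaults in this file, not computed from the model itself.
per_level_num_anchors = 3
num_classes = 80

# Each anchor predicts 1 objectness score, 4 box offsets and num_classes scores,
# so the conv producing P*_out has A * (1 + 4 + num_classes) output channels.
pred_channels = per_level_num_anchors * (1 + 4 + num_classes)

# After permute(0, 2, 3, 1) + view, the last axis of a [B, H, W, A, 85] tensor
# is laid out as: index 0 -> objectness, 1:5 -> box, 5: -> class scores.
obj_slice = slice(0, 1)
box_slice = slice(1, 5)
cls_slice = slice(5, 5 + num_classes)

print(pred_channels)                     # 255, matching the [B,255,H,W] comments
print(cls_slice.stop - cls_slice.start)  # 80 class logits per anchor
```

This also explains the selective sigmoid above: `[..., 0:3]` covers objectness plus the x/y offsets, while the raw w/h terms at `3:5` are left untransformed.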
class YOLOV5FPNHead(nn.Module):
def __init__(self,
inplanes,
csp_nums=3,
csp_shortcut=False,
per_level_num_anchors=3,
num_classes=80,
act_type='silu'):
super(YOLOV5FPNHead, self).__init__()
# inplanes:[C3_inplanes,C4_inplanes,C5_inplanes]
self.per_level_num_anchors = per_level_num_anchors
self.P5_fpn_1 = CSPBottleneck(inplanes[2],
inplanes[2],
bottleneck_nums=csp_nums,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.P5_fpn_2 = ConvBnActBlock(inplanes[2],
inplanes[1],
kernel_size=1,
stride=1,
padding=0,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.P4_fpn_1 = CSPBottleneck(int(inplanes[1] * 2),
inplanes[1],
bottleneck_nums=csp_nums,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.P4_fpn_2 = ConvBnActBlock(inplanes[1],
inplanes[0],
kernel_size=1,
stride=1,
padding=0,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.P3_out = CSPBottleneck(int(inplanes[0] * 2),
inplanes[0],
bottleneck_nums=csp_nums,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.P3_pred_conv = nn.Conv2d(inplanes[0],
per_level_num_anchors *
(1 + 4 + num_classes),
kernel_size=1,
stride=1,
padding=0,
bias=True)
self.P3_pan_1 = ConvBnActBlock(inplanes[0],
inplanes[0],
kernel_size=3,
stride=2,
padding=1,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.P4_out = CSPBottleneck(inplanes[1],
inplanes[1],
bottleneck_nums=csp_nums,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.P4_pred_conv = nn.Conv2d(inplanes[1],
per_level_num_anchors *
(1 + 4 + num_classes),
kernel_size=1,
stride=1,
padding=0,
bias=True)
self.P4_pan_1 = ConvBnActBlock(inplanes[1],
inplanes[1],
kernel_size=3,
stride=2,
padding=1,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.P5_out = CSPBottleneck(inplanes[2],
inplanes[2],
bottleneck_nums=csp_nums,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.P5_pred_conv = nn.Conv2d(inplanes[2],
per_level_num_anchors *
(1 + 4 + num_classes),
kernel_size=1,
stride=1,
padding=0,
bias=True)
self.sigmoid = nn.Sigmoid()
# https://arxiv.org/abs/1708.02002 section 3.3
p5_bias = self.P5_pred_conv.bias.view(per_level_num_anchors, -1)
# init obj pred value,per image(640 resolution) has 8 objects,stride=32
p5_bias.data[:, 0] += math.log(8 / (640 / 32)**2)
# init cls pred value
p5_bias.data[:, 5:] += math.log(0.6 / (num_classes - 0.99))
self.P5_pred_conv.bias = torch.nn.Parameter(p5_bias.view(-1),
requires_grad=True)
p4_bias = self.P4_pred_conv.bias.view(per_level_num_anchors, -1)
# init obj pred value,per image(640 resolution) has 8 objects,stride=16
p4_bias.data[:, 0] += math.log(8 / (640 / 16)**2)
# init cls pred value
p4_bias.data[:, 5:] += math.log(0.6 / (num_classes - 0.99))
self.P4_pred_conv.bias = torch.nn.Parameter(p4_bias.view(-1),
requires_grad=True)
p3_bias = self.P3_pred_conv.bias.view(per_level_num_anchors, -1)
# init obj pred value,per image(640 resolution) has 8 objects,stride=8
p3_bias.data[:, 0] += math.log(8 / (640 / 8)**2)
# init cls pred value
p3_bias.data[:, 5:] += math.log(0.6 / (num_classes - 0.99))
self.P3_pred_conv.bias = torch.nn.Parameter(p3_bias.view(-1),
requires_grad=True)
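The bias initialisation above follows the RetinaNet prior-probability scheme (arXiv:1708.02002, section 3.3). A torch-free sketch of the values it computes; the 640-pixel input and ~8 objects per image are the assumptions hard-coded in the comments above:

```python
import math

num_classes = 80  # assumed COCO default, as elsewhere in this file

def obj_bias(stride, image_size=640, objects_per_image=8):
    # Objectness prior: expected objects divided by the number of grid cells
    # at this stride, so rarer-cell levels start with a more negative logit.
    return math.log(objects_per_image / (image_size / stride) ** 2)

# Class prior: every class logit starts near log(0.6 / (num_classes - 0.99)).
cls_bias = math.log(0.6 / (num_classes - 0.99))

for stride in (8, 16, 32):
    print('stride', stride, 'obj bias', round(obj_bias(stride), 3))
print('cls bias', round(cls_bias, 3))
```

Finer levels (stride 8) have more cells, hence the lower starting objectness prior.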
def forward(self, inputs):
[C3, C4, C5] = inputs
P5 = self.P5_fpn_1(C5)
P5 = self.P5_fpn_2(P5)
del C5
P5_upsample = F.interpolate(P5,
size=(C4.shape[2], C4.shape[3]),
mode='bilinear',
align_corners=True)
P4 = torch.cat([C4, P5_upsample], axis=1)
del C4, P5_upsample
P4 = self.P4_fpn_1(P4)
P4 = self.P4_fpn_2(P4)
P4_upsample = F.interpolate(P4,
size=(C3.shape[2], C3.shape[3]),
mode='bilinear',
align_corners=True)
P3 = torch.cat([C3, P4_upsample], axis=1)
del C3, P4_upsample
P3 = self.P3_out(P3)
P3_out = self.P3_pred_conv(P3)
P3 = self.P3_pan_1(P3)
P4 = torch.cat([P3, P4], axis=1)
del P3
P4 = self.P4_out(P4)
P4_out = self.P4_pred_conv(P4)
P4 = self.P4_pan_1(P4)
P5 = torch.cat([P4, P5], axis=1)
del P4
P5 = self.P5_out(P5)
P5_out = self.P5_pred_conv(P5)
del P5
# P3_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
P3_out = P3_out.permute(0, 2, 3, 1).contiguous()
P3_out = P3_out.view(P3_out.shape[0], P3_out.shape[1], P3_out.shape[2],
self.per_level_num_anchors, -1).contiguous()
# P4_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
P4_out = P4_out.permute(0, 2, 3, 1).contiguous()
P4_out = P4_out.view(P4_out.shape[0], P4_out.shape[1], P4_out.shape[2],
self.per_level_num_anchors, -1).contiguous()
# P5_out shape:[B,255,H,W]->[B,H,W,255]->[B,H,W,3,85]
P5_out = P5_out.permute(0, 2, 3, 1).contiguous()
P5_out = P5_out.view(P5_out.shape[0], P5_out.shape[1], P5_out.shape[2],
self.per_level_num_anchors, -1).contiguous()
P3_out = self.sigmoid(P3_out)
P4_out = self.sigmoid(P4_out)
P5_out = self.sigmoid(P5_out)
return [P3_out, P4_out, P5_out]
class YOLOXFPN(nn.Module):
def __init__(self,
inplanes,
csp_nums=3,
csp_shortcut=False,
block=ConvBnActBlock,
act_type='silu'):
super(YOLOXFPN, self).__init__()
# inplanes:[C3_inplanes,C4_inplanes,C5_inplanes]
self.p5_reduce_conv = ConvBnActBlock(inplanes[2],
inplanes[1],
kernel_size=1,
stride=1,
padding=0,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.p4_conv1 = YOLOXCSPBottleneck(int(inplanes[1] * 2),
inplanes[1],
bottleneck_nums=csp_nums,
bottleneck_block_type=block,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.p4_reduce_conv = ConvBnActBlock(inplanes[1],
inplanes[0],
kernel_size=1,
stride=1,
padding=0,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.p3_conv1 = YOLOXCSPBottleneck(int(inplanes[0] * 2),
inplanes[0],
bottleneck_nums=csp_nums,
bottleneck_block_type=block,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.p3_up_conv = ConvBnActBlock(inplanes[0],
inplanes[0],
kernel_size=3,
stride=2,
padding=1,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.p4_conv2 = YOLOXCSPBottleneck(int(inplanes[0] * 2),
inplanes[1],
bottleneck_nums=csp_nums,
bottleneck_block_type=block,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
self.p4_up_conv = ConvBnActBlock(inplanes[1],
inplanes[1],
kernel_size=3,
stride=2,
padding=1,
groups=1,
has_bn=True,
has_act=True,
act_type=act_type)
self.p5_conv1 = YOLOXCSPBottleneck(int(inplanes[1] * 2),
inplanes[2],
bottleneck_nums=csp_nums,
bottleneck_block_type=block,
reduction=0.5,
shortcut=csp_shortcut,
act_type=act_type)
def forward(self, inputs):
[C3, C4, C5] = inputs
P5 = self.p5_reduce_conv(C5)
del C5
P5_upsample = F.interpolate(P5,
size=(C4.shape[2], C4.shape[3]),
mode='bilinear',
align_corners=True)
P4 = torch.cat([C4, P5_upsample], axis=1)
del C4, P5_upsample
P4 = self.p4_conv1(P4)
P4 = self.p4_reduce_conv(P4)
P4_upsample = F.interpolate(P4,
size=(C3.shape[2], C3.shape[3]),
mode='bilinear',
align_corners=True)
P3 = torch.cat([C3, P4_upsample], axis=1)
del C3, P4_upsample
P3_out = self.p3_conv1(P3)
P3_up = self.p3_up_conv(P3_out)
P4 = torch.cat([P3_up, P4], axis=1)
P4_out = self.p4_conv2(P4)
del P4
P4_up = self.p4_up_conv(P4_out)
P5 = torch.cat([P4_up, P5], axis=1)
P5_out = self.p5_conv1(P5)
del P5
return [P3_out, P4_out, P5_out]
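All of the FPN/PAN forwards above rely on the same pyramid geometry. A torch-free sketch, assuming a 640x640 input with C3/C4/C5 at strides 8/16/32 (the shapes used by the `__main__` smoke tests below):

```python
# Feature-map geometry assumed by the heads above: 640x640 input, strides 8/16/32.
image_size = 640
strides = {'C3': 8, 'C4': 16, 'C5': 32}

grids = {name: image_size // s for name, s in strides.items()}
print(grids)  # {'C3': 80, 'C4': 40, 'C5': 20}

# Top-down path: each F.interpolate doubles H and W so a level can be
# concatenated with the next shallower one; the bottom-up (PAN) path halves
# them again with stride-2 3x3 convolutions.
assert grids['C4'] * 2 == grids['C3']
assert grids['C5'] * 2 == grids['C4']
```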
if __name__ == '__main__':
import os
import random
import numpy as np
import torch
seed = 0
# for hash
os.environ['PYTHONHASHSEED'] = str(seed)
# for python and numpy
random.seed(seed)
np.random.seed(seed)
# for cpu gpu
torch.manual_seed(seed)
torch.cuda.manual_seed(seed)
torch.cuda.manual_seed_all(seed)
net = RetinaFPN([512, 1024, 2048], 256, use_p5=False)
C3, C4, C5 = torch.randn(3, 512, 80, 80), torch.randn(3, 1024, 40,
40), torch.randn(
3, 2048, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C3, C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C3, C4, C5])
for out in outs:
print('2222', out.shape)
net = Yolov3TinyFPNHead([256, 512],
per_level_num_anchors=3,
num_classes=80,
act_type='leakyrelu')
C4, C5 = torch.randn(3, 256, 40, 40), torch.randn(3, 512, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C4, C5])
for out in outs:
print('2222', out.shape)
net = Yolov3FPNHead([256, 512, 1024],
per_level_num_anchors=3,
num_classes=80,
act_type='leakyrelu')
C3, C4, C5 = torch.randn(3, 256, 80, 80), torch.randn(3, 512, 40,
40), torch.randn(
3, 1024, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C3, C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C3, C4, C5])
for out in outs:
print('2222', out.shape)
net = Yolov4TinyFPNHead([256, 512],
per_level_num_anchors=3,
num_classes=80,
act_type='leakyrelu')
C4, C5 = torch.randn(3, 256, 40, 40), torch.randn(3, 512, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C4, C5])
for out in outs:
print('2222', out.shape)
net = Yolov4FPNHead([256, 512, 1024],
per_level_num_anchors=3,
num_classes=80,
act_type='leakyrelu')
C3, C4, C5 = torch.randn(3, 256, 80, 80), torch.randn(3, 512, 40,
40), torch.randn(
3, 1024, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C3, C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C3, C4, C5])
for out in outs:
print('2222', out.shape)
net = YOLOV5FPNHead([256, 512, 1024],
csp_nums=3,
csp_shortcut=False,
per_level_num_anchors=3,
num_classes=80,
act_type='silu')
C3, C4, C5 = torch.randn(3, 256, 80, 80), torch.randn(3, 512, 40,
40), torch.randn(
3, 1024, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C3, C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C3, C4, C5])
for out in outs:
print('2222', out.shape)
net = YOLOXFPN([256, 512, 1024],
csp_nums=3,
csp_shortcut=False,
block=ConvBnActBlock,

act_type='silu')
C3, C4, C5 = torch.randn(3, 256, 80, 80), torch.randn(3, 512, 40,
40), torch.randn(
3, 1024, 20, 20)
from thop import profile
from thop import clever_format
macs, params = profile(net, inputs=([C3, C4, C5], ), verbose=False)
macs, params = clever_format([macs, params], '%.3f')
print(f'1111, macs: {macs}, params: {params}')
outs = net([C3, C4, C5])
for out in outs:
print('2222', out.shape) | 42.840668 | 80 | 0.357502 | 5,181 | 56,464 | 3.688284 | 0.039568 | 0.049087 | 0.048354 | 0.044691 | 0.873096 | 0.854519 | 0.816369 | 0.800932 | 0.777435 | 0.765032 | 0 | 0.083735 | 0.552246 | 56,464 | 1,318 | 81 | 42.840668 | 0.672099 | 0.024192 | 0 | 0.803336 | 0 | 0 | 0.009336 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014047 | false | 0 | 0.023705 | 0 | 0.0518 | 0.012291 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1cbaec9b61460c67e740ceac062736fbaa998697 | 7,852 | py | Python | app/main/VR/example_form.py | 8by8-org/usvotes | e2af8b2d8b986bf36804bae1c784bc78b54dc412 | [
"MIT"
] | 10 | 2018-08-28T13:35:27.000Z | 2021-07-17T18:01:04.000Z | app/main/VR/example_form.py | 8by8-org/usvotes | e2af8b2d8b986bf36804bae1c784bc78b54dc412 | [
"MIT"
] | 253 | 2018-05-14T14:51:35.000Z | 2021-07-23T00:49:04.000Z | app/main/VR/example_form.py | lukecivantos/flvotes | ace6fbee9d6cfaa9e4e69e266e321d041ad65da4 | [
"MIT"
] | 5 | 2019-09-05T15:10:32.000Z | 2021-09-30T23:37:04.000Z | signature_img_string = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAYwAAAA4CAYAAADuFc+dAAAWpElEQVR4Xu1de5AbxZn/vp7Rau21vbaxeRhsjVaj1cg29vJKgCOHeYSQhAMScCWBSmFySR3c8UxdcuRxhbm6JKQKLiaEkFBJGVIXKglQQHEEuHDExcXnPDB+7WqkGe1Ia8BAsPFz2V2vNN9Vz2iMvF7vjlajXUnb84+rvN1ff/37Wv11f/09EMQnEBAICAQEAgIBHwigjzaiiUBAICAQEAgIBEAoDLEIBAICAYGAQMAXAkJh+IJJNBIICAQEAgIBoTDEGhAICAQEAgIBXwgIheELJtFIICAQEAgIBITCmKQ1kEyedcqgNGTnurvfnaQhxTACAYGAQCBQBITCCBTO0YmtXbuW/eKXv36XEA7lDD06CUOKIQQCAgGBQOAINIXCULXlqzgy2XT3hsARCoCgoia2AGAXQ7bTMlORAEgKEgKBqhCo999MVZMTnWuGQFMojGg8SRyhnKnX3XyWdHZ2SCT1EhAwxKctQ/9szaQpCAsEfCJQz78Zn1MQzaYAgbrbYCvFQFE1R1kgYl0qjKiaWA3IfkPEeYRv50z9O5XOcSrbq9qyvwPAg/V6e5tKbBp17Hr/zTQqrtOB74ZXGN5JqV5vGIqauBeR/QsRgSxLl2fTPS810sISJ9FGkpY/Xuv9N+NvFqLVVCAgFEaNUY/Gky8DwCVcYbD2trnW5s37gx7SvQXwN5ye54Kk7ZxEEbgpbcgy9NYgaU9nWonEmSsKdDiJDN/xcCCkk2UJ9Ex39/ZaYyMURq0Rbl76QmHUWLaKmiggMgkI3stl9RODHq6rq2vu/v6hvUHfsJR4vAtB3gLc4MfgsZyhrwma9+lIryORvINsuN9B9djPloGdZZo9W2uJjVAYtUS3uWkLhVFD+Spq8lwA2sSHYIyttYzUPUEOpyjKXAi1OsqCIR60DH1OUPQVNXkHIvyA34wklG7sNXseDYr2dKTT2bnygmEaeogIVvD3tvKPYwzgqGZEZPtkwItqqTSEwpiOKzCYOQuFEQyOo1JRYomHkbGb+H7Awixi9fTsDHI4JZ78bwT4ON9wWuXWaDq9NR8UfUVNPoMIV3F6YSkcKO3xeKyViW28cWv1d34L3HdocC9XFFxWoykM529A7wPBfIa4X0b5OsPY8dta8OQ9ejPE3ZapL6zFGIJmcyLQ8ArDs7MjIMjAzqjFySy+/MykVCgW0+ltRgXLgEXj2n4AnGUTvd6XTZ9VQd9xmyqqtgoRf+82tO/LmZmvjdupggZKLFEARAkRizlTlyvoOuGmmtalHC4evpuAHPNXPbpJVzo5fgskuXUPQ3RMUET29/PZzF3ldEo3xQ0IuJIrDQScj4gFy0iFKh1vvPaRmHY+Y7gR+Jsastt6zdSD4/URfxcIeAg0vsKIJQrImOQ8Kkt4p5XR1wUp3lWrVsl9b707yDdOy0iF/dJeEteukACfc+IvGLvZyqR+4revn3aKqg3ywz8iDuZMfYafPhW0YdF4suhgOgmnUG6uKdDw7TbY17qGGW7gZ3+1zNRJFfA8JU35AzZRcY5hbPvDaAwoqvYCAFzOpyUx+9OWYYx6a/CUBhCs5POv1QEooiaeZMiuAaICtrctqIUTxpQIQgw6KQg0vsIomU7cwDj2W8tIfTpI5BQ1sQ8A2x0zgsSuzPn0RFLi2qMIeIPjTjuzZXF2+/Y3g+LrtM7OU0MkvemYuhCfsszUtUHR5nSWLFkyTwq3vc8xRcR8LdOZxOPLugpgv+6qCffjmOWz6bpcm66H0+DHiWA5Ia0CAsXlGvcB0FZE3CAxMOUZ4Y0DBweildwCHaUhte5BhqwWB6COjo52m7Xs4SuZEB7Pm+nrg1w3glbzI1CXP8pK
YPceZ0t97JypS5X0H6ttJB5PIkmpkn2Zb86/sgz9C37oR1VtGBC5KaeQM/VATQuRjuSnmATPOw/SjN3fa6T+2Q9Pftss6uhYEpbCfe4No3bpTKLRzo+RzF7lt4qSff8tQPh23kzX5QO78xbRP8jfGUb+buxyr6fSG4UNBLZjKQUayplpX27J3tuRewCSfmcZPZf5ldt47aIx7VZg+EPeTpLk87PpHY5DhvgEAn4RaHyFEY93AUlbanGNV9TEzQD4Y09hIODAyQvmnrBp06aBsQBetGjRzJaZc/qdcydje3NGar5fgfhpF41ptwDDB7kdGmtghluiLl0mIXWXNvJjTEOnn376vIFh+yPMJoYopwFm785kNh70w7vXRlW1TxQAnkdEiWy7CAy/XK+KgvM88i3CuwmN6vHEf1WO45P7IWPP5ozU1X7wqeUBSImXzJjAPsiZqTY//Ig2AoFyBBpeYTg/5phW9K7xQbqA8tMeAF3lbAoEwwQUkiX2uWwm9ZuxltERk46bDiTwNwYllvw+Mvi6e8PAz/ca+q+DXNYdHVonSZhxNjtkAwhwO4G9kpthAHAlAM09ZjyEPQj4vwD4agsLPT2Wx5aqLo8VsWhy8jbRXiwMduTz+X1BziFoWt5bBLecEdBTiPBfMAxuskvJ7gJkXQDAk2Be6BwwbJsrCse8xhi7xzJSa/3wxONfanEAEnE1ftAXbcZDoDkUhqptQMQLSyaUBywzdcd4Ex/378uWtShDhQF0f/WHAbHFtStLT1mZnjHfDE477bQZoRmzP/BcKCOnnhTasGFDYdwxfTZQVO2XiHgdP8VKsnRRtXmeEokzzrZh6OyCDR0AsAIBlhLQYjc+4Ojj8ojDcxEAioAQcs007l95BHPO0E8ZbToLEolFs4r4BiAwBByi4YGTG0BZ+PZIU8rWoudGK8tyRTIqPwAF5cgh4mp8/rhEszERaBaFsRYR7y5t0Jtypn5+tXJXYsntgHR6yXH+XkT8Rokm95YKIWKZ0eHY87aialxBOK6UYemEOZWabMbiX1G1VxDxIj7fFhZKGsaONE9XTUCdCMy5Gbi3A5rF/yXCQ0f9H9ntRRviRLSKp10nsucepRhKWmGkucWjMVoswYd/g32IcN9oSRZVVQ0XQH4bEecRETFbUiwr2NiUauU+Wv8o90hDDPt5jFdUrXwtOstn7qzWeVu3bvV9gyp/x0D0b84ae81MXVxNLWQiaE4NAs2hMGKdVyOTnvYgPLtrufzEE0/w0++Evkhcew4JrnA2W2Z/ig3DLpLYVie2ihtpGLugN9OzcSzi0XiS2/Rn8Q2jLSzN7unpObJpT4ipsk6KmuxBhKX8v05eMHcmf1Mpj94di34pqvhIE8fTih27DI5uh88g0QuA9E6R8C2ZDmf7+/sLra1zTwQZTwHEM3gAISJdTAQb89n0J0fjQYlr2xBwRclMc6llpP6nWiz89ueuu4jSgUzm9YpzNVWS3VXx1mIpPbEfJTNyDuW3AR75nTNT8/zO83jtItz93I2raYr4lmrxEP0nhkBzKAxFU0CGnHuqRm4zvqQ30/PKRCBR4tp6IFjjmhOK9+azhnOzUNQEN0+1lja771lG6ptjnuji2m4EPIGfpPPZ9Gh5gybCntNHURNF/pSKjPXnjNQsvqEdL3q4fJDjRRlzbx5E2EQI3WhDd5EwjZJ9NwN2gTtffMgy9FsmzDAARFTtQYZ4i8sD+1rOTN1XDb1K+pbyN/0H1/+I+G+VpmipJJWGUrYWvfVYaQCi995gAx2SgP08CBNrJXOoBFvRdnoh0BQKg4ssqmr8nSFUCjZbZ5n6nZWKMqJqX0XE+0ugvJoz9Qs9GlE1+TNA+Htnw2O4PWfoK8dUGKVNnAB25k09sCp7oxVkqkRhcC0IiDzIbAOQvRWKbGs+nz4mpUh5WnbG2F8sI/WRSvH02quqdl4Rkd/IkGzakO9NX1QpLR4FzvtUkv5ETa440y4OP2LbdCTK3lGsCHmJ
STf6efsZia2fzd97g/Dm6KfPMbeMuOYkIMybaf6YXvUnFEbVEAoC5cFSjY6GEks+jAxucuaBkMsZOn/A9f1FVPUMBPl1Lx4gn00vLneO9Aoh2QAFBiBLM0LHDcaLqJ3/iMAe4oNX+uA5HsOTVZDJG6fETzXxLaGoqvUTQggBByOnnjR7Ig4A0XiSB1BSe1tL1M97QCl/0/vIH5vK8jcRwQcANJPLGZE9H5aku3R9e/fxcB9p6vOz+Zfn4eJ0/fQZT+7V/l0ojGoRFP3drbVJPi+YDYAOA2CLJIfOyurbeQTxuF8sFjuxAPIu5qQYgeFD+947Yffu3UfFFZRiKw44iWG55UtiD1iZ0b2xomryACDMJqBM3kxr4zJQQYPJKsjEbzLMZr3OIqnC7h1VtdcA8SwissMSLc5kMrsqmK7TVIlrWwkgxgBnIeIzlpH6zDg0ZCWufYCATsAkz98EgIMIcAcgthMRT6vC58WD6YqSqrRlX3hhaDSa5RstIhywDL19PP5HxFIIhTEeYOLvDYNA0ygMAAgpavJdRJjnJFaT2E96M6mbfUiCKar2DiAsBJvswwMFZdeu3jdG66eo2kuI6EXe2sjgq1ZGf8BrG+tMXl0k+FcEODMol9eRfETjyd8BwKXOhjdnZk0KMnljVnsq7VC1uwjxe453rmR/Mp/JvOhDHsc0iarJPCBEiGgIEcMywy+ZmdT649FSYonbANGVCxZvypvmTx3F47wv4DovCy/P2ugcABiulxh7ylUuH3qUEdiJYrH4Uy8IDxn6qgtyJObBZWBLzkydOZF5B9nHkyUCBuN2HiRzglbDINBMCoO/Y/wIEP+pdHosLFl04ozxzB+RmPYHxvBveCoGsqWL+3p7Shlgj5UhN9MQIA/a41EH3KjBj6l/RMI/EhWvJYDT3P/jGw/05bN6Kc9QcOshomrDCCBXc+r3y001CiMSSUQhhFmepZUAns6b+mf9jjuyXXlmXie/FWDxhus/17J27Vq+4Y/85KiqvQ+Is4nICrfKnykU6KNk0yLPjXjUwMPjMFdydnLcYysJCo2omuMV15dNO67NU/3x+BDOQz6b5sGF4hMITAiBplIYizs6z5EY+zO3dfPdnCH7Qq/R86vjIROJJb7FGPt3x8YN8N1cNv2tcVDkt5H+kimDbyLctOHkCPK8lDgtAhjqy/rLHVSp1KrZxCdzrKiaNAFBJbIHzjljxexq3Jw530osuQ4Z3H7kPYJJK3OZY8uZRmPaV4DhI44c7OI1TJKfdB7b3SJF7lfKBluOx3E9yEqy5VJtlYKtOVKpPER7gcBUI9BUCsPZWFTtALqnS+5eu9EyUheMBvKiaDTRIoV1/igKQH/JmWnfXkAjg7M8+gT0GBA8ms+m3ZQRNfgaQWFEE0tXg01O+hTC4pq8YTxWLRRuLqcwd1V2kksyiX3JGsUspajaB4g4g6fmIMQiIkjOra/sO55yGNU12T0N8DulmcvqndXOQ/QXCDQyAs2nMOKJmxDYw0R0GBFbkEnnWJnu10YKKaJqJkNUCaj/lAXzFo6XUHBkf26nJpI2MUCJAL4OhYFHJyPFxWQqjEoC1srx8Wp1MMQhy9R9ZWn18yOKxpMvA8AlY8WGKGqSvMx/ZRXu/kREL450I/ZqlgBQAQBllOQVVnrHjnJeIvGSacmsD9OSH5xEG4FArRBoOoXhpJ/A0C4g4tGxvEbyczkzdeVRm0AscRFj7BVuZrCL9qU7LWPSIo6rFeRkKoyJjFUeJxJUWgsPMz+xIZzno28QY1YjlBQ1+R53lCjdSH9uGakvVysj0V8g0KwINJ3C4IKKqtp3APGb3JWTm5yYHFrRW+ZrH1G1gwxxFhB8kMvqDZXmeSKb+EQX70TGqmWcyIexIU5SRMqZ+jER9A7PpZyJRPCnfFY/d6z5K2ryx4jgeNPVqizqRPEX/QQC9YZAcyqMaPQkkFvf5PntuLmbMfaflpH6Igd/4cJls9raiwfdrKrSn3NG6qP1
JpSx+JlM98iJKAzvFsDdxCRZujyb7nkpKHyPur3A6DmWvBsGH9NP1T7ugQUAv69FPZWg5i3oCATqBYGmVBjuLSP5C0D4YsnUUJAhpBjGtrc8H/mSHXyHZegr6kUYfviYTPfIiSiMjs6lLxPRJc6JPeA4kSO1IngKSGn0mIgyheq3JjiW0srIzpqoQUEqP3IVbQQCjYBA8yqMaGIFyGwbkc0T9UmSLP2oN91z6+JYcrnMYMdk1KtuhAUQ9G0mEtMK3DOJIe62TH1hkBj4qengPdT7uV14vNWyLGqQ8xe0BAJTjUDTKgwOrKJqLyLiJ0ogO/mQeEAZa2GWm0Qw+PKpUy3QIMev9DYTiWnnM4YbnUh7ZLf1mqkHg+XHrekQdExELcuiBjl/QUsgMNUINLXCiKqJKwnwWc8+zYDd2E9Dz8/A0F9LEcPVJNWbatnV3fgRNfEkQ3YNEBWwvW2BtXnz/iCZLMsC258z9cAiqGtVFjXIuQtaAoF6QKCpFQbPYBSNJ3k0tmOfRsQNOVO/JBpPFj3Xy/lzZrZs3rx5uB6E0cg8dHR0tNsY2sOL9BDC43kzfX2Q8zmyqfO3EZ85nSoZvxZlUSsZX7QVCDQCAs2uMLhZyi2ZCTTMs5eGpXB0qDjEg7OcaniV2LobQaBTxWM0pt0KDH/Ix5ck+fxsesemIHnx835RzXjiHaMa9ETf6YJA8ysMRVMwhE41Pre4krzepsINPDwDAA7lTH32dBF2LefJk+0hQFutkiIqqvYsIl4Z9PuFh8mIdwxqbwvP91N3o5aYCtoCgXpDoOkVBge8vKCNE8wHwDOo8ofZJy0ztbrehNKI/EzEBbeSeR55vyDYm8vq8yvp66ftyHcMRLzHMlJr/fQVbQQC0wWB6aEwYp1XA7KnnRx03IffqddNIDFpTa/RU3VivOmyWMaaZy0VRlTVLiMAJwCQyex6K516vBaYR9XkYUAIuQ4RbJ/f6n614EXQFAjUIwLTQmE4t4yYVuT5zr3HbgIqhtkJ8zKZjUdV1qtHITUCT7VUGIqqdSPiMkLau+a6zy84Th2MqmFS4toaBFxPZA8islZxy6gaUkGgyRCYPgpDTT4DQFc5E+bpqoGez5npK5pMnlM2nXKF0d4WnheU/X9x59KPyUSvunVG7K/0ZY2f1XKSR6r7lW4ZOTPFk1iKTyAgEGimmt7jSZM/agLQDz5MeS1fljO7eblT8QWAgBNhXSpM5LPutq9RlVjyHWRwEgG9nTP0UxGxrBKSLxIVNfrwlkEHGLL1ljl63faKiIrGAoEmQWD63DDi8S4gaYtbJIfezJnpxU0iw7qYRlTVBgEx7NXdZhL7h950zyPVMBfr1G6wCR91bhdEF/f1Zo5bPreacUb2VeLaVv5/eTPdFSRdQUsg0OgITBuFwQWlxBP7iZD1ZdNzAaDY6MKrJ/7L626X+LIZwjW9hv7MRPh0K+y17nFqghPtzWfTgXtGTYQv0UcgMJ0RmFYKYzoLejLmzutuA9Ltji8a8pcipzjqOsvU76xk/NWrV0uvbenmBa7+lvezD9uxnTsNqxIaoq1AQCAQPAJCYQSP6bSlWLoVvM0QW22iAUSY4SoN7GESfsNua33Vyy913nnnzXjn4ME5WJDagYpzGNAim+hcIjoPAHjRo1Jp1zEr5k1brMXEBQJTgYBQGFOBehOPyZUGhFrXIeANzu0A6BADHDNRIH+jKP+cewnyawrtypvpU5sYLjE1gUBDISAURkOJq3GYdbyNCNYRQDvf/QmhgKUKiKX4SeLJWpwwStcT4cjnPHIjHOoz0yJtS+OIXHA6DRAQCmMaCHmqpuik27DZ/yFjM8bkgaCPgLYBwBayYbvNCtveyGb5m0VNXWinChcxrkCgURH4f0sW0cCMRJAmAAAAAElFTkSuQmCC'
| 3,926 | 7,851 | 0.973255 | 202 | 7,852 | 37.821782 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.14384 | 0.000382 | 7,852 | 1 | 7,852 | 7,852 | 0.829532 | 0 | 0 | 0 | 0 | 1 | 0.996689 | 0.996689 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1cc65ed366c179288800d862eab3b5a039a232cb | 4,640 | py | Python | credolib/plotting.py | awacha/credolib | 11c0be3eea7257d3d6e13697d3e76ce538f2f1b2 | [
"BSD-3-Clause"
] | null | null | null | credolib/plotting.py | awacha/credolib | 11c0be3eea7257d3d6e13697d3e76ce538f2f1b2 | [
"BSD-3-Clause"
] | null | null | null | credolib/plotting.py | awacha/credolib | 11c0be3eea7257d3d6e13697d3e76ce538f2f1b2 | [
"BSD-3-Clause"
] | null | null | null | __all__=['plotsascurve','guinierplot','kratkyplot']
from .io import getsascurve
import matplotlib.pyplot as plt
from sastool.libconfig import qunit, dunit
def plotsascurve(samplename, *args, **kwargs):
if 'dist' not in kwargs:
kwargs['dist'] = None
data1d, dist = getsascurve(samplename, kwargs['dist'])
del kwargs['dist']
if 'factor' in kwargs:
factor=kwargs['factor']
del kwargs['factor']
else:
factor=1
if 'label' not in kwargs:
if isinstance(dist, str):
kwargs['label'] = samplename + ' ' + dist
else:
kwargs['label'] = samplename + ' %g mm' % dist
if 'errorbar' in kwargs:
errorbars = bool(kwargs['errorbar'])
del kwargs['errorbar']
else:
errorbars = False
if errorbars:
ret = (data1d*factor).errorbar(*args, **kwargs)
plt.xscale('log')
plt.yscale('log')
else:
ret = (data1d*factor).loglog(*args, **kwargs)
plt.xlabel('q (' + qunit() + ')')
plt.ylabel('$d\\Sigma/d\\Omega$ (cm$^{-1}$ sr$^{-1}$)')
plt.legend(loc='best')
plt.grid(True, which='both')
plt.axis('tight')
return ret
def guinierplot(*args, **kwargs):
"""Make a Guinier plot. This is simply a wrapper around plotsascurve()."""
ret=plotsascurve(*args, **kwargs)
plt.xscale('power',exponent=2)
plt.yscale('log')
return ret
def kratkyplot(samplename, *args, **kwargs):
    """Make a Kratky plot (q^2 * I(q) vs. q) of the scattering curve."""
if 'dist' not in kwargs:
kwargs['dist'] = None
data1d, dist = getsascurve(samplename, kwargs['dist'])
del kwargs['dist']
if 'factor' in kwargs:
factor=kwargs['factor']
del kwargs['factor']
else:
factor=1
if 'label' not in kwargs:
if isinstance(dist, str):
kwargs['label'] = samplename + ' ' + dist
else:
kwargs['label'] = samplename + ' %g mm' % dist
if 'errorbar' in kwargs:
errorbars = bool(kwargs['errorbar'])
del kwargs['errorbar']
else:
errorbars = False
data1dscaled=data1d*factor
if errorbars:
if hasattr(data1dscaled, 'dx'):
dx=data1dscaled.qError
dy=(data1dscaled.Error ** 2 * data1dscaled.q ** 4 +
data1dscaled.Intensity ** 2 * data1dscaled.qError ** 2
* data1dscaled.q ** 2 * 4) ** 0.5
else:
dx=None
dy=data1dscaled.Error
ret = plt.errorbar(data1dscaled.q,
data1dscaled.q ** 2 * data1dscaled.Intensity,
dy, dx, *args, **kwargs)
else:
ret = plt.plot(data1dscaled.q,
data1dscaled.Intensity * data1dscaled.q ** 2,
*args, **kwargs)
    plt.xlabel('q (' + qunit() + ')')
plt.ylabel('$q^2 d\\Sigma/d\\Omega$ (' +
dunit() +
'$^{-2}$ cm$^{-1}$ sr$^{-1}$)')
plt.legend(loc='best')
plt.grid(True, which='both')
plt.axis('tight')
return ret
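The `dy` expression in the errorbar branch above is first-order error propagation for y = q²·I(q): dy² = q⁴·dI² + 4·q²·I²·dq². A minimal standalone check of that formula (the sample point is made up, purely to exercise the arithmetic):

```python
import math

def kratky_error(q, I, dq, dI):
    # Same expression as in kratkyplot's errorbar branch, for scalars.
    return (dI ** 2 * q ** 4 + I ** 2 * dq ** 2 * q ** 2 * 4) ** 0.5

# Hypothetical single data point, just to exercise the formula.
q, I, dq, dI = 2.0, 10.0, 0.1, 0.5
# Propagation written directly: dy = sqrt((q^2*dI)^2 + (2*q*I*dq)^2).
expected = math.hypot(q ** 2 * dI, 2 * q * I * dq)
print(abs(kratky_error(q, I, dq, dI) - expected) < 1e-12)  # True
```

The same derivation for the Porod plot (y = q⁴·I) gives dy² = q⁸·dI² + 16·q⁶·I²·dq², i.e. a factor 16 from (4q³I)².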
def porodplot(samplename, *args, **kwargs):
    """Make a Porod plot (q^4 * I(q) vs. q) of the scattering curve."""
if 'dist' not in kwargs:
kwargs['dist'] = None
data1d, dist = getsascurve(samplename, kwargs['dist'])
del kwargs['dist']
if 'factor' in kwargs:
factor=kwargs['factor']
del kwargs['factor']
else:
factor=1
if 'label' not in kwargs:
if isinstance(dist, str):
kwargs['label'] = samplename + ' ' + dist
else:
kwargs['label'] = samplename + ' %g mm' % dist
if 'errorbar' in kwargs:
errorbars = bool(kwargs['errorbar'])
del kwargs['errorbar']
else:
errorbars = False
data1dscaled=data1d*factor
if errorbars:
if hasattr(data1dscaled, 'dx'):
dx=data1dscaled.qError
            # error of y = q**4 * I: dy**2 = q**8*dI**2 + 16*q**6*I**2*dq**2
            dy=(data1dscaled.Error ** 2 * data1dscaled.q ** 8 +
                data1dscaled.Intensity ** 2 * data1dscaled.qError ** 2
                * data1dscaled.q ** 6 * 16) ** 0.5
else:
dx=None
dy=data1dscaled.Error
ret = plt.errorbar(data1dscaled.q,
data1dscaled.q ** 4 * data1dscaled.Intensity,
dy, dx, *args, **kwargs)
else:
ret = plt.plot(data1dscaled.q,
                       data1dscaled.Intensity * data1dscaled.q ** 4,
*args, **kwargs)
    plt.xlabel('q (' + qunit() + ')')
plt.ylabel('$q^4 d\\Sigma/d\\Omega$ (' +
dunit() +
'$^{-4}$ cm$^{-1}$ sr$^{-1}$)')
plt.legend(loc='best')
plt.xscale('power',exponent=4)
plt.yscale('linear')
plt.grid(True, which='both')
plt.axis('tight')
return ret
| 32.907801 | 78 | 0.534267 | 504 | 4,640 | 4.910714 | 0.176587 | 0.038788 | 0.026667 | 0.026667 | 0.810909 | 0.788283 | 0.788283 | 0.788283 | 0.744646 | 0.734545 | 0 | 0.022934 | 0.314009 | 4,640 | 140 | 79 | 33.142857 | 0.754634 | 0.014655 | 0 | 0.796992 | 0 | 0 | 0.109505 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030075 | false | 0 | 0.022556 | 0 | 0.082707 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1cde323caf473a0755efe1621a5d390e8386c0dd | 18,165 | py | Python | tests/test_play_requests.py | davidemoro/play_requests | 1173233a5da4fea5df22b95c9224f381b4cf7bf4 | [
"Apache-2.0"
] | 7 | 2018-04-14T08:07:02.000Z | 2019-02-19T12:01:18.000Z | tests/test_play_requests.py | davidemoro/play_requests | 1173233a5da4fea5df22b95c9224f381b4cf7bf4 | [
"Apache-2.0"
] | 5 | 2019-01-10T22:22:55.000Z | 2020-01-01T05:22:46.000Z | tests/test_play_requests.py | tierratelematics/play_requests | 1173233a5da4fea5df22b95c9224f381b4cf7bf4 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Tests for `play_requests` package."""
import os
import pytest
@pytest.fixture(scope='session')
def variables():
return {'skins': {'skin1': {'base_url': 'http://', 'credentials': {}}}}
def test_post1(play):
import requests_mock
with requests_mock.mock() as m:
m.request('POST',
'http://something/1',
json={'status': 'ok'})
mock_engine = play
mock_engine.variables = {}
from play_requests import providers
provider = providers.RequestsProvider(mock_engine)
assert provider.engine is mock_engine
provider.command_POST({
'provider': 'play_requests',
'type': 'POST',
'url': 'http://something/1',
'parameters': {
'json': {
'foo': 'bar',
},
'timeout': 2.5
},
})
history = m.request_history
assert len(history) == 1
assert history[0].method == 'POST'
assert history[0].url == 'http://something/1'
assert history[0].json() == {'foo': 'bar'}
assert history[0].timeout == 2.5
def test_post_variables(play):
import requests_mock
with requests_mock.mock() as m:
m.request('POST',
'http://something/1',
json={'status': 'ok'})
mock_engine = play
mock_engine.variables = {}
from play_requests import providers
provider = providers.RequestsProvider(mock_engine)
assert provider.engine is mock_engine
provider.command_POST({
'provider': 'play_requests',
'type': 'POST',
'url': 'http://something/1',
'variable': 'myvar',
'variable_expression': 'response.json()',
'parameters': {
'json': {
'foo': 'bar',
},
'timeout': 2.5
},
})
assert 'myvar' in mock_engine.variables
assert mock_engine.variables['myvar']['status'] == 'ok'
history = m.request_history
assert len(history) == 1
assert history[0].method == 'POST'
assert history[0].url == 'http://something/1'
assert history[0].json() == {'foo': 'bar'}
assert history[0].timeout == 2.5
def test_post_variables_assert(play):
import requests_mock
with requests_mock.mock() as m:
m.request('POST',
'http://something/1',
json={'status': 'ok'})
mock_engine = play
mock_engine.variables = {}
from play_requests import providers
provider = providers.RequestsProvider(mock_engine)
assert provider.engine is mock_engine
provider.command_POST({
'provider': 'play_requests',
'type': 'POST',
'url': 'http://something/1',
'variable': 'myvar',
'variable_expression': 'response.json()',
'assertion': 'variables["myvar"]["status"] == "ok"',
'parameters': {
'json': {
'foo': 'bar',
},
'timeout': 2.5
},
})
assert 'myvar' in mock_engine.variables
assert mock_engine.variables['myvar']['status'] == 'ok'
history = m.request_history
assert len(history) == 1
assert history[0].method == 'POST'
assert history[0].url == 'http://something/1'
assert history[0].json() == {'foo': 'bar'}
assert history[0].timeout == 2.5
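The `variable_expression` commands above suggest the provider evaluates an expression string against the HTTP response and stores the result under `variable`. A minimal stand-in showing that pattern (`FakeResponse` and `extract` are hypothetical illustrations, not part of play_requests):

```python
class FakeResponse:
    """Hypothetical stand-in for a requests.Response (not part of play_requests)."""
    def __init__(self, payload):
        self._payload = payload

    def json(self):
        return self._payload

def extract(variable_expression, response):
    # Evaluate the expression string with `response` bound, mirroring
    # 'variable_expression': 'response.json()' in the commands above.
    return eval(variable_expression, {}, {'response': response})

resp = FakeResponse({'status': 'ok'})
variables = {'myvar': extract('response.json()', resp)}
assert variables['myvar']['status'] == 'ok'
print(variables)  # {'myvar': {'status': 'ok'}}
```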
def test_post_variables_assert_ko(play):
import requests_mock
with requests_mock.mock() as m:
m.request('POST',
'http://something/1',
json={'status': 'KO'})
mock_engine = play
mock_engine.variables = {}
from play_requests import providers
provider = providers.RequestsProvider(mock_engine)
assert provider.engine is mock_engine
with pytest.raises(AssertionError):
provider.command_POST({
'provider': 'play_requests',
'type': 'POST',
'url': 'http://something/1',
'variable': 'myvar',
'variable_expression': 'response.json()',
'assertion': 'variables["myvar"]["status"] == "ok"',
'parameters': {
'json': {
'foo': 'bar',
},
'timeout': 2.5
},
})
assert 'myvar' in mock_engine.variables
assert mock_engine.variables['myvar']['status'] == 'KO'
history = m.request_history
assert len(history) == 1
assert history[0].method == 'POST'
assert history[0].url == 'http://something/1'
assert history[0].json() == {'foo': 'bar'}
assert history[0].timeout == 2.5


def test_get(play):
    import requests_mock
    with requests_mock.mock() as m:
        m.request('GET',
                  'http://something/1',
                  text='OK')
        mock_engine = play
        mock_engine.variables = {}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_GET({
            'provider': 'play_requests',
            'type': 'GET',
            'url': 'http://something/1',
            'parameters': {
                'timeout': 2.5
            },
        })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'GET'
        assert history[0].url == 'http://something/1'
        # mock requests bug
        # assert history[0].text == 'OK'
        assert history[0].timeout == 2.5


def test_no_parameters(play):
    import requests_mock
    with requests_mock.mock() as m:
        m.request('GET',
                  'http://something/1',
                  text='OK')
        mock_engine = play
        mock_engine.variables = {}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_GET({
            'provider': 'play_requests',
            'type': 'GET',
            'url': 'http://something/1',
        })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'GET'
        assert history[0].url == 'http://something/1'
        # mock requests bug
        # assert history[0].text == 'OK'


def test_get_params_simple(play):
    import requests_mock
    with requests_mock.mock() as m:
        m.request('GET',
                  'http://something/1',
                  text='OK')
        mock_engine = play
        mock_engine.variables = {}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_GET({
            'provider': 'play_requests',
            'type': 'GET',
            'url': 'http://something/1',
            'parameters': {
                'params': {'foo': 'bar'},
                'timeout': 2.5
            },
        })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'GET'
        assert history[0].url == 'http://something/1?foo=bar'
        # mock requests bug
        # assert history[0].text == 'OK'
        assert history[0].timeout == 2.5


def test_get_params_multi(play):
    import requests_mock
    import re
    with requests_mock.mock() as m:
        m.request('GET',
                  'http://something/1',
                  text='OK')
        mock_engine = play
        mock_engine.variables = {}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_GET({
            'provider': 'play_requests',
            'type': 'GET',
            'url': 'http://something/1',
            'parameters': {
                'params': {'foo': ['bar', 'baz']},
                'timeout': 2.5
            },
        })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'GET'
        match = re.search(
            r'http\://something/1\?foo=(bar|baz)&foo=(bar|baz)',
            history[0].url)
        foo1 = match.group(1)
        foo2 = match.group(2)
        assert foo1 != foo2
        assert foo1 in ('bar', 'baz')
        assert foo2 in ('bar', 'baz')
        assert history[0].timeout == 2.5


def test_post_headers(play):
    import requests_mock
    with requests_mock.mock() as m:
        headers = {'user-agent': 'my-app/0.0.1'}
        m.request('POST',
                  'http://something/1',
                  request_headers=headers,
                  json={'status': 'ok'})
        mock_engine = play
        mock_engine.variables = {}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_POST({
            'provider': 'play_requests',
            'type': 'POST',
            'url': 'http://something/1',
            'parameters': {
                'headers': headers,
                'json': {
                    'foo': 'bar',
                },
                'timeout': 2.5
            },
        })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'POST'
        assert history[0].url == 'http://something/1'
        assert history[0].json() == {'foo': 'bar'}
        assert history[0].timeout == 2.5

        # no headers
        with pytest.raises(requests_mock.exceptions.NoMockAddress):
            provider.command_POST({
                'provider': 'play_requests',
                'type': 'POST',
                'url': 'http://something/1',
                'parameters': {
                    'json': {
                        'foo': 'bar',
                    },
                    'timeout': 2.5
                },
            })


@pytest.mark.parametrize('command', [
    {'provider': 'play_requests',
     'type': 'POST',
     'url': 'http://something/1',
     'parameters': {
         'files': {
             'filecsv': (
                 'report.csv',
                 'some,data',
             )
         },
     },
     },
    {'provider': 'play_requests',
     'type': 'POST',
     'url': 'http://something/1',
     'parameters': {
         'files': {
             'filecsv': (
                 'report.csv',
                 'some,data',
                 'application/csv',
                 {'Expires': '0'},
             )
         },
     },
     },
    {'provider': 'play_requests',
     'type': 'POST',
     'url': 'http://something/1',
     'parameters': {
         'files': {
             'filecsv1': (
                 'report.csv',
                 'some,data',
             ),
             'filecsv': (
                 'report.csv',
                 'some,data',
                 'application/csv',
                 {'Expires': '0'},
             )
         },
     },
     },
])
def test_post_files(command, play):
    import mock
    with mock.patch('play_requests.providers.requests') \
            as mock_requests:
        mock_engine = play
        mock_engine.variables = {}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_POST(command)
        assert mock_requests \
            .Session \
            .return_value \
            .request \
            .assert_called_once_with(
                command['type'],
                command['url'],
                files=command['parameters']['files']) is None


def test_post_files_path(play):
    file_path = os.path.join(os.path.dirname(__file__), 'file.csv')
    command = {
        'provider': 'play_requests',
        'type': 'POST',
        'url': 'http://something/1',
        'parameters': {
            'files': {
                'filecsv': (
                    'file.csv',
                    'path:{0}'.format(file_path),
                )
            },
        },
    }
    import mock
    with mock.patch('play_requests.providers.requests') \
            as mock_requests:
        with mock.patch('play_requests.providers.open') \
                as mock_open:
            file_mock = mock.MagicMock()
            mock_open.return_value = file_mock
            mock_engine = play
            mock_engine.variables = {}
            from play_requests import providers
            provider = providers.RequestsProvider(mock_engine)
            assert provider.engine is mock_engine
            provider.command_POST(command)
            assert mock_requests \
                .Session \
                .return_value \
                .request \
                .assert_called_once_with(
                    command['type'],
                    command['url'],
                    files={'filecsv': ('file.csv', file_mock)}) is None
            assert mock_open.assert_called_once_with(
                file_path, 'rb') is None


@pytest.mark.parametrize('assertion', [
    'response.status_code == 200',
    'response.status_code != 404',
    'response.status_code == 200 and response.json()["status"] == "ok"',
    '"status" in response.json()',
    'variables["foo"] == "baz"',
    '"foo" in variables',
    'len([1]) == 1',
    '[1][0] == 1',
    'len(list(response.json().items())) == 1',
    'variables["foo"].upper() == "BAZ"',
    'match(r"^([0-9]*)-data", "123-data")',
    'match(r"^([0-9]*)-data", "123-data").group(1) == "123"'
])
def test_post_assertion(assertion, play):
    import requests_mock
    with requests_mock.mock() as m:
        m.request('POST',
                  'http://something/1',
                  json={'status': 'ok'})
        mock_engine = play
        mock_engine.variables = {'foo': 'baz'}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        provider.command_POST({
            'provider': 'play_requests',
            'type': 'POST',
            'url': 'http://something/1',
            'parameters': {
                'json': {
                    'foo': 'bar',
                },
                'timeout': 2.5
            },
            'assertion': assertion
        })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'POST'
        assert history[0].url == 'http://something/1'
        assert history[0].json() == {'foo': 'bar'}
        assert history[0].timeout == 2.5


def test_post_assertion_ko(play):
    import requests_mock
    with requests_mock.mock() as m:
        m.request('POST',
                  'http://something/1',
                  json={'status': 'ok'})
        mock_engine = play
        mock_engine.variables = {'foo': 'baz'}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        with pytest.raises(AssertionError):
            provider.command_POST({
                'provider': 'play_requests',
                'type': 'POST',
                'url': 'http://something/1',
                'parameters': {
                    'json': {
                        'foo': 'bar',
                    },
                    'timeout': 2.5
                },
                'assertion': 'response.status_code == 404'
            })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'POST'
        assert history[0].url == 'http://something/1'
        assert history[0].json() == {'foo': 'bar'}
        assert history[0].timeout == 2.5


@pytest.mark.parametrize('assertion', [
    'open("/etc/passwd", "r")',
    'open',
    'import os',
    '__file__',
    '__file__',
    '__builtins__.__dict__["bytes"]',
    '__builtins__.__dict__["bytes"] = "pluto"',
    'prova = lambda: 1',
    'os = 1',
])
def test_post_assertion_bad(assertion, play):
    import requests_mock
    with requests_mock.mock() as m:
        m.request('POST',
                  'http://something/1',
                  json={'status': 'ok'})
        mock_engine = play
        mock_engine.variables = {'foo': 'baz'}
        from play_requests import providers
        provider = providers.RequestsProvider(mock_engine)
        assert provider.engine is mock_engine
        with pytest.raises(Exception):
            provider.command_POST({
                'provider': 'play_requests',
                'type': 'POST',
                'url': 'http://something/1',
                'parameters': {
                    'json': {
                        'foo': 'bar',
                    },
                    'timeout': 2.5
                },
                'assertion': assertion
            })
        history = m.request_history
        assert len(history) == 1
        assert history[0].method == 'POST'
        assert history[0].url == 'http://something/1'
        assert history[0].json() == {'foo': 'bar'}
        assert history[0].timeout == 2.5


@pytest.mark.parametrize('verb', [
    'OPTIONS',
    'HEAD',
    'PUT',
    'PATCH',
    'DELETE',
])
def test_other_verbs(verb, play):
    """ """
    import mock
    _make_request = mock.MagicMock()
    from play_requests import providers
    provider = providers.RequestsProvider(play)
    provider._make_request = _make_request
    command = {'provider': 'play_requests', 'type': verb}
    getattr(provider, 'command_{0}'.format(verb))(command, foo='bar')
    assert _make_request.assert_called_once_with(verb, command) is None
| 31.924429 | 75 | 0.498982 | 1,748 | 18,165 | 5.036041 | 0.081236 | 0.070431 | 0.071567 | 0.054072 | 0.84085 | 0.831308 | 0.822106 | 0.817789 | 0.810519 | 0.798251 | 0 | 0.017085 | 0.362015 | 18,165 | 568 | 76 | 31.980634 | 0.742514 | 0.012937 | 0 | 0.75835 | 0 | 0.001965 | 0.183148 | 0.021609 | 0 | 0 | 0 | 0 | 0.19057 | 1 | 0.031434 | false | 0.001965 | 0.066798 | 0.001965 | 0.100196 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1c66d754e1f680eb0a222d98b6ff0a80cc49728b | 7,676 | py | Python | okonomiyaki/platforms/tests/test_platform_filters.py | enthought/okonomiyaki | 51b8b4fa8d17255e13c097402691726545cf5b4c | [
"BSD-3-Clause"
] | 1 | 2021-06-01T16:35:00.000Z | 2021-06-01T16:35:00.000Z | okonomiyaki/platforms/tests/test_platform_filters.py | enthought/okonomiyaki | 51b8b4fa8d17255e13c097402691726545cf5b4c | [
"BSD-3-Clause"
] | 249 | 2015-02-24T19:06:53.000Z | 2021-07-30T09:01:53.000Z | okonomiyaki/platforms/tests/test_platform_filters.py | enthought/okonomiyaki | 51b8b4fa8d17255e13c097402691726545cf5b4c | [
"BSD-3-Clause"
] | 4 | 2015-02-19T21:29:12.000Z | 2016-01-14T21:02:39.000Z | from __future__ import absolute_import
import sys

from .._arch import Arch
from ..epd_platform import EPDPlatform
from .._platform import OSKind, NameKind, FamilyKind, Platform
from ..platform_filters import PlatformFilter, PlatformLabel, PlatformLiteral

if sys.version_info < (2, 7):
    import unittest2 as unittest
else:
    import unittest

LABEL_WINDOWS_ANY = PlatformLabel()
LABEL_WINDOWS_ANY.os_kind = OSKind.windows

LABEL_OSX_32 = PlatformLabel()
LABEL_OSX_32.os_kind = OSKind.darwin
LABEL_OSX_32.arch = Arch.from_name("x86")


def _platform_from_epd_string(s):
    return EPDPlatform.from_epd_string(s).platform


RH5_32 = _platform_from_epd_string("rh5-32")
RH5_X86_64 = _platform_from_epd_string("rh5-64")
OSX_32 = _platform_from_epd_string("osx-32")
WIN_X86_64 = _platform_from_epd_string("win-64")

UBUNTU_12_10_X32 = Platform(
    OSKind.linux, NameKind.ubuntu, FamilyKind.debian, "12.10",
    Arch.from_name("x86"), Arch.from_name("x86"),
)

UBUNTU_14_04_X32 = Platform(
    OSKind.linux, NameKind.ubuntu, FamilyKind.debian, "14.04",
    Arch.from_name("x86"), Arch.from_name("x86"),
)

UBUNTU_14_04_X64 = Platform(
    OSKind.linux, NameKind.ubuntu, FamilyKind.debian, "14.04",
    Arch.from_name("x86_64"), Arch.from_name("x86_64")
)


class TestPlatformLabel(unittest.TestCase):
    def test_bitwidth_only(self):
        # Given
        label = PlatformLabel(arch=Arch.from_name("x86"))

        # When/Then
        self.assertTrue(label.matches(RH5_32))
        self.assertFalse(label.matches(RH5_X86_64))
        self.assertTrue(label.matches(OSX_32))
        self.assertFalse(label.matches(WIN_X86_64))

    def test_os(self):
        # Given
        label = PlatformLabel(os_kind=OSKind.windows)

        # When/Then
        self.assertFalse(label.matches(RH5_32))
        self.assertFalse(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertTrue(label.matches(WIN_X86_64))

    def test_name(self):
        # Given
        label = PlatformLabel(name_kind=NameKind.centos)

        # When/Then
        self.assertFalse(label.matches(RH5_32))
        self.assertFalse(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertFalse(label.matches(WIN_X86_64))

    def test_specific(self):
        # Given
        label = PlatformLabel(
            name_kind=NameKind.ubuntu, arch=Arch.from_name("x86"),
            release="14.04"
        )

        # When/Then
        self.assertFalse(label.matches(RH5_32))
        self.assertFalse(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertFalse(label.matches(WIN_X86_64))
        self.assertFalse(label.matches(UBUNTU_12_10_X32))
        self.assertTrue(label.matches(UBUNTU_14_04_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X64))

    def test_from_legacy_string(self):
        # Given
        label = PlatformLabel._from_legacy_string("rh")

        # When/Then
        self.assertTrue(label.matches(RH5_32))
        self.assertTrue(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertFalse(label.matches(WIN_X86_64))
        self.assertFalse(label.matches(UBUNTU_12_10_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X64))

        # Given
        label = PlatformLabel._from_legacy_string("rh6")

        # When/Then
        self.assertFalse(label.matches(RH5_32))
        self.assertFalse(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertFalse(label.matches(WIN_X86_64))
        self.assertFalse(label.matches(UBUNTU_12_10_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X64))

        # Given
        label = PlatformLabel._from_legacy_string("win-64")

        # When/Then
        self.assertFalse(label.matches(RH5_32))
        self.assertFalse(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertTrue(label.matches(WIN_X86_64))
        self.assertFalse(label.matches(UBUNTU_12_10_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X64))

        # Given
        label = PlatformLabel._from_legacy_string("64")

        # When/Then
        self.assertFalse(label.matches(RH5_32))
        self.assertTrue(label.matches(RH5_X86_64))
        self.assertFalse(label.matches(OSX_32))
        self.assertTrue(label.matches(WIN_X86_64))
        self.assertFalse(label.matches(UBUNTU_12_10_X32))
        self.assertFalse(label.matches(UBUNTU_14_04_X32))
        self.assertTrue(label.matches(UBUNTU_14_04_X64))

        # Given
        label = PlatformLabel._from_legacy_string("all")

        # When/Then
        self.assertTrue(label.matches(RH5_32))
        self.assertTrue(label.matches(RH5_X86_64))
        self.assertTrue(label.matches(OSX_32))
        self.assertTrue(label.matches(WIN_X86_64))
        self.assertTrue(label.matches(UBUNTU_12_10_X32))
        self.assertTrue(label.matches(UBUNTU_14_04_X32))
        self.assertTrue(label.matches(UBUNTU_14_04_X64))


class TestPlatformFilter(unittest.TestCase):
    def test_simple(self):
        # Given
        literals = [PlatformLiteral(LABEL_WINDOWS_ANY, False),
                    PlatformLiteral(LABEL_OSX_32, False)]

        # When
        filtre = PlatformFilter(literals)

        # Then
        self.assertTrue(filtre.matches(RH5_X86_64))

    def test_from_pisi_string_simple(self):
        # Given
        legacy_string = "all"

        # When
        filtre = PlatformFilter.from_legacy_string(legacy_string)

        # Then
        self.assertTrue(filtre.matches(RH5_32))
        self.assertTrue(filtre.matches(RH5_X86_64))
        self.assertTrue(filtre.matches(OSX_32))
        self.assertTrue(filtre.matches(WIN_X86_64))
        self.assertTrue(filtre.matches(UBUNTU_12_10_X32))
        self.assertTrue(filtre.matches(UBUNTU_14_04_X32))
        self.assertTrue(filtre.matches(UBUNTU_14_04_X64))

        # Given
        legacy_string = "!all"

        # When
        filtre = PlatformFilter.from_legacy_string(legacy_string)

        # Then
        self.assertFalse(filtre.matches(RH5_32))
        self.assertFalse(filtre.matches(RH5_X86_64))
        self.assertFalse(filtre.matches(OSX_32))
        self.assertFalse(filtre.matches(WIN_X86_64))
        self.assertFalse(filtre.matches(UBUNTU_12_10_X32))
        self.assertFalse(filtre.matches(UBUNTU_14_04_X32))
        self.assertFalse(filtre.matches(UBUNTU_14_04_X64))

        # Given
        legacy_string = "win"

        # When
        filtre = PlatformFilter.from_legacy_string(legacy_string)

        # Then
        self.assertFalse(filtre.matches(RH5_32))
        self.assertFalse(filtre.matches(RH5_X86_64))
        self.assertFalse(filtre.matches(OSX_32))
        self.assertTrue(filtre.matches(WIN_X86_64))
        self.assertFalse(filtre.matches(UBUNTU_12_10_X32))
        self.assertFalse(filtre.matches(UBUNTU_14_04_X32))

    def test_from_pisi_string_composite(self):
        # Given
        legacy_string = "!win,!osx"

        # When
        filtre = PlatformFilter.from_legacy_string(legacy_string)

        # Then
        self.assertTrue(filtre.matches(RH5_32))
        self.assertTrue(filtre.matches(RH5_X86_64))
        self.assertFalse(filtre.matches(OSX_32))
        self.assertFalse(filtre.matches(WIN_X86_64))
        self.assertTrue(filtre.matches(UBUNTU_12_10_X32))
        self.assertTrue(filtre.matches(UBUNTU_14_04_X32))
        self.assertTrue(filtre.matches(UBUNTU_14_04_X64))
| 33.666667 | 77 | 0.688379 | 972 | 7,676 | 5.135802 | 0.080247 | 0.129808 | 0.148237 | 0.20012 | 0.816306 | 0.782252 | 0.759215 | 0.733574 | 0.719551 | 0.70653 | 0 | 0.074844 | 0.204534 | 7,676 | 227 | 78 | 33.814978 | 0.742712 | 0.029052 | 0 | 0.587838 | 0 | 0 | 0.015096 | 0 | 0 | 0 | 0 | 0 | 0.554054 | 1 | 0.060811 | false | 0 | 0.054054 | 0.006757 | 0.135135 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
98c1f35fdc53485a832d1c2f5f4c6630b105ecd6 | 137 | py | Python | gym_quidditch/envs/__init__.py | ac1212/gym-quidditch | ac569de414c1e71dd46e0971aea251272cb68ec2 | [
"MIT"
] | 1 | 2019-05-09T01:15:37.000Z | 2019-05-09T01:15:37.000Z | gym_quidditch/envs/__init__.py | ac1212/gym-quidditch | ac569de414c1e71dd46e0971aea251272cb68ec2 | [
"MIT"
] | 3 | 2019-05-09T03:29:50.000Z | 2019-05-09T17:09:50.000Z | gym_quidditch/envs/__init__.py | ac1212/gym-quidditch | ac569de414c1e71dd46e0971aea251272cb68ec2 | [
"MIT"
] | null | null | null | from gym_quidditch.envs.quidditchsnitch_v0 import QuidditchSnitchEnv
from gym_quidditch.envs.quidditchseeker_v0 import QuidditchSeekerEnv | 68.5 | 68 | 0.919708 | 16 | 137 | 7.625 | 0.625 | 0.114754 | 0.262295 | 0.327869 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015385 | 0.051095 | 137 | 2 | 69 | 68.5 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
98d3726f920a268e860b45603fc947c05445ca34 | 150 | py | Python | {{ cookiecutter.repo_name }}/src/utils/__init__.py | thanhtcptit/ml_template | 0c5aa288ba0d8ab9f2b4a64fc0646c3fe751d414 | [
"MIT"
] | null | null | null | {{ cookiecutter.repo_name }}/src/utils/__init__.py | thanhtcptit/ml_template | 0c5aa288ba0d8ab9f2b4a64fc0646c3fe751d414 | [
"MIT"
] | null | null | null | {{ cookiecutter.repo_name }}/src/utils/__init__.py | thanhtcptit/ml_template | 0c5aa288ba0d8ab9f2b4a64fc0646c3fe751d414 | [
"MIT"
] | null | null | null | from src.utils.common import *
from src.utils.file_utils import *
from src.utils.params import Params, Registrable
from src.utils.logger import Logger | 37.5 | 48 | 0.82 | 24 | 150 | 5.083333 | 0.375 | 0.229508 | 0.393443 | 0.295082 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106667 | 150 | 4 | 49 | 37.5 | 0.910448 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c7109c8a98c4ba92ba66c20c99e139c2f4774086 | 683 | py | Python | centernet_lightning/eval/__init__.py | gau-nernst/CenterNet | bd6a21813a63310e3b1c77cefe24077b72d1092a | [
"MIT"
] | 47 | 2021-08-10T09:30:53.000Z | 2022-03-29T07:53:43.000Z | centernet_lightning/eval/__init__.py | gau-nernst/CenterNet | bd6a21813a63310e3b1c77cefe24077b72d1092a | [
"MIT"
] | 1 | 2021-08-07T13:46:49.000Z | 2021-08-07T13:46:49.000Z | centernet_lightning/eval/__init__.py | gau-nernst/CenterNet | bd6a21813a63310e3b1c77cefe24077b72d1092a | [
"MIT"
] | 6 | 2021-08-12T02:40:43.000Z | 2022-01-31T16:12:40.000Z | # from .coco import pred_detections_to_coco_format, target_detections_to_coco_format, evaluate_coco_detection, evaluate_coco_detection_from_file
# from .mot_challenge import evaluate_mot_tracking_sequence, evaluate_mot_tracking_from_file
# from .utils import voc_to_coco_annotations, detections_to_coco_results, ground_truth_to_coco_annotations
# __all__ = [
# "pred_detections_to_coco_format", "target_detections_to_coco_format",
# "evaluate_coco_detection", "evaluate_coco_detection_from_file",
# "evaluate_mot_tracking_sequence", "evaluate_mot_tracking_from_file",
# "voc_to_coco_annotations", "detections_to_coco_results", "ground_truth_to_coco_annotations"
# ]
| 62.090909 | 144 | 0.847731 | 91 | 683 | 5.626374 | 0.241758 | 0.117188 | 0.1875 | 0.171875 | 0.894531 | 0.894531 | 0.894531 | 0.894531 | 0.894531 | 0.683594 | 0 | 0 | 0.083455 | 683 | 10 | 145 | 68.3 | 0.817891 | 0.970717 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
c732f5802e40eb70f8b7ea391a2f0099582299fd | 323 | py | Python | qutipy/Clifford/__init__.py | sumeetkhatri/QuTIPy | ca2a3344c1caa818504425496ea37278d80b1c44 | [
"Apache-2.0"
] | 19 | 2020-11-11T13:00:22.000Z | 2022-03-14T11:18:04.000Z | qutipy/Clifford/__init__.py | sumeetkhatri/QuTIPy | ca2a3344c1caa818504425496ea37278d80b1c44 | [
"Apache-2.0"
] | null | null | null | qutipy/Clifford/__init__.py | sumeetkhatri/QuTIPy | ca2a3344c1caa818504425496ea37278d80b1c44 | [
"Apache-2.0"
] | 1 | 2022-03-03T15:20:15.000Z | 2022-03-03T15:20:15.000Z | from qutipy.Clifford.Clifford_group_generators import Clifford_group_generators
from qutipy.Clifford.generate_Clifford_group import generate_Clifford_group
from qutipy.Clifford.generate_state_2design import generate_state_2design
from qutipy.Clifford.Clifford_twirl_channel_one_qubit import Clifford_twirl_channel_one_qubit | 80.75 | 93 | 0.928793 | 44 | 323 | 6.363636 | 0.295455 | 0.142857 | 0.257143 | 0.185714 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006494 | 0.04644 | 323 | 4 | 93 | 80.75 | 0.902597 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c73438dbfa3a8afd7723d02bba320814bd6d5882 | 2,608 | py | Python | http_scanner/wrapper_scanner_http.py | gryxon/ScannerHTTP | 3ee8d1d74e65a1bd1099e0ba90322e3d8ad6ecbb | [
"MIT"
] | 2 | 2018-01-22T12:22:30.000Z | 2019-07-12T12:56:21.000Z | http_scanner/wrapper_scanner_http.py | gryxon/ScannerHTTP | 3ee8d1d74e65a1bd1099e0ba90322e3d8ad6ecbb | [
"MIT"
] | null | null | null | http_scanner/wrapper_scanner_http.py | gryxon/ScannerHTTP | 3ee8d1d74e65a1bd1099e0ba90322e3d8ad6ecbb | [
"MIT"
] | 1 | 2019-07-12T12:57:18.000Z | 2019-07-12T12:57:18.000Z | import json
class WrapperBlackScannerHttp(object):
    """
    Main black scanner wrapper class.
    """

    def __init__(self, id_name="black_scanner"):
        """
        Constructor of our scanner.
        :param id_name: Optional argument. Id of scanner.
        """
        self._modules = []
        self._result = {}
        self._id_mod = id_name

    def add_module(self, mod):
        """
        Method adds a module to the wrapper.
        :param mod: Added module
        :return: None.
        """
        self._modules.append(mod)

    def scan(self, link, data=None):
        """
        Scanning method. Every added module scans the website.
        :param link: Url of the website
        :param data: Optional parameter passed to low-level modules.
        :return: None.
        """
        for module in self._modules:
            if module.get_id() == "bot" or module.get_id() == "dos":
                # print(data[1])
                module.scan(data)
            else:
                module.scan(link)
            self._result[module.get_id()] = module.get_result()

    def get_result(self):
        """
        Method which returns the results of the modules.
        :return: Dict with results.
        """
        return self._result
    def get_id(self):
        """
        Method which returns id of the module
        :return: Id of module
        """
        return self._id_mod


class WrapperWhiteScannerHttp(object):
    """
    Main white scanner wrapper class.
    """

    def __init__(self, id_name="white_scanner"):
        """
        Constructor of our scanner.
        :param id_name: Optional argument. Id of scanner.
        """
        self._modules = []
        self._result = {}
        self._id_mod = id_name

    def add_module(self, mod):
        """
        Method adds a module to the wrapper.
        :param mod: Added module
        :return: None.
        """
        self._modules.append(mod)
    def scan(self):
        """
        Scanning method. Every added module scans the website.
        :return: None.
        """
        for module in self._modules:
            module.scan()
            self._result[module.get_id()] = module.get_result()
    def get_result(self):
        """
        Method which returns the results of the modules.
        :return: Dict with results.
        """
        return self._result

    def get_id(self):
        """
        Method which returns id of the module
        :return: Id of module
        """
return self._id_mod | 24.148148 | 68 | 0.543712 | 292 | 2,608 | 4.695205 | 0.205479 | 0.087527 | 0.046681 | 0.064187 | 0.844639 | 0.844639 | 0.844639 | 0.844639 | 0.792123 | 0.792123 | 0 | 0.000596 | 0.356212 | 2,608 | 108 | 69 | 24.148148 | 0.815962 | 0.367331 | 0 | 0.617647 | 0 | 0 | 0.025357 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0.029412 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c7cb9e977cc9812c3884e10701725a4610282666 | 6,924 | py | Python | backend/session/test_permissions.py | ThreeDRadio/intranet | b8c6ab177d508816da624d5063337cbd475fee9a | [
"MIT"
] | null | null | null | backend/session/test_permissions.py | ThreeDRadio/intranet | b8c6ab177d508816da624d5063337cbd475fee9a | [
"MIT"
] | 1 | 2016-10-31T11:17:13.000Z | 2016-10-31T11:17:13.000Z | backend/session/test_permissions.py | ThreeDRadio/intranet | b8c6ab177d508816da624d5063337cbd475fee9a | [
"MIT"
] | null | null | null | from django.test import TestCase
from django.contrib.auth.models import User
from rest_framework.test import APIRequestFactory, force_authenticate
from rest_framework.viewsets import ModelViewSet
import permissions
from models import Whitelist


class IsStaffOrTargetUserTest(TestCase):
    def setUp(self):
        self.user = User.objects.create_user('user', 'password', 'fake@user.com')
        self.admin = User.objects.create_user('admin', 'password', 'fake@user.com')
        self.admin.is_staff = True
        self.admin.save()

    def test_has_permission_no_auth(self):
        """View level returns true if the request is a retrieve, otherwise false"""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        permission = permissions.IsStaffOrTargetUser()
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'list'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'create'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'update'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'partial_update'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'destroy'
        self.assertEqual(permission.has_permission(request, view), False)

    def test_has_permission_regular_user(self):
        """View level returns true if the request is a retrieve, otherwise false"""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        permission = permissions.IsStaffOrTargetUser()
        force_authenticate(request, self.user)
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'list'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'create'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'update'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'partial_update'
        self.assertEqual(permission.has_permission(request, view), False)
        view.action = 'destroy'
        self.assertEqual(permission.has_permission(request, view), False)

    def test_has_permission_admin_user(self):
        """View level returns true if the user is staff """
        factory = APIRequestFactory()
        request = factory.get('api/users')
        request.user = self.admin
        permission = permissions.IsStaffOrTargetUser()
        force_authenticate(request, self.admin)
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'list'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'create'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'update'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'partial_update'
        self.assertEqual(permission.has_permission(request, view), True)
        view.action = 'destroy'
        self.assertEqual(permission.has_permission(request, view), True)

    def test_has_object_permission_admin_on_admin(self):
        """ Makes sure an admin user has permissions to access themselves"""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        request.user = self.admin
        permission = permissions.IsStaffOrTargetUser()
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_object_permission(request, view, self.admin), True)

    def test_has_object_permission_admin_on_user(self):
        """ Makes sure an admin user has permissions to access another user"""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        request.user = self.admin
        permission = permissions.IsStaffOrTargetUser()
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_object_permission(request, view, self.user), True)

    def test_has_object_permission_user_on_user(self):
        """ Makes sure a regular user has permissions to access themselves"""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        request.user = self.user
        permission = permissions.IsStaffOrTargetUser()
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_object_permission(request, view, self.user), True)

    def test_has_object_permission_user_on_admin(self):
        """ Makes sure a regular user cannot access other users"""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        request.user = self.user
        permission = permissions.IsStaffOrTargetUser()
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_object_permission(request, view, self.admin), False)
class IsAuthenticatedOrWhitelistTest(TestCase):
def setUp(self):
self.user = User.objects.create_user('user', 'password', 'fake@user.com');
self.whitelist = Whitelist.objects.create(ip='127.0.0.1', name= 'Localhost')
def test_unauthenticated_not_whitelisted(self):
""" An unauthenticated, un-whitelisted IP address should not be granted permission"""
factory = APIRequestFactory()
request = factory.get('api/users');
request.META['REMOTE_ADDR'] = '255.255.255.0'
request.user = False
permission = permissions.IsAuthenticatedOrWhitelist()
view = ModelViewSet();
view.action = 'retrieve'
self.assertEqual(permission.has_permission(request, view), False)
def test_unauthenticated_but_whitelisted(self):
""" An unauthenticated, whitelisted IP address should be granted permission"""
factory = APIRequestFactory()
request = factory.get('api/users');
request.user = False
permission = permissions.IsAuthenticatedOrWhitelist()
view = ModelViewSet();
view.action = 'retrieve'
self.assertEqual(permission.has_permission(request, view), True)
def test_authenticated_not_whitelisted(self):
""" An authenticated, un-whitelisted IP address should be granted permission"""
factory = APIRequestFactory()
request = factory.get('api/users');
request.META['REMOTE_ADDR'] = '255.255.255.0'
request.user = self.user
permission = permissions.IsAuthenticatedOrWhitelist()
view = ModelViewSet();
view.action = 'retrieve'
self.assertEqual(permission.has_permission(request, view), True)
    def test_authenticated_whitelisted(self):
        """An authenticated, whitelisted IP address should be granted permission."""
        factory = APIRequestFactory()
        request = factory.get('api/users')
        request.user = self.user
        permission = permissions.IsAuthenticatedOrWhitelist()
        view = ModelViewSet()
        view.action = 'retrieve'
        self.assertEqual(permission.has_permission(request, view), True)
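The permission classes these tests exercise are not included in this excerpt. A minimal, framework-free sketch consistent with the tested behavior of `IsStaffOrTargetUser` (a hypothetical reconstruction for illustration, not the project's actual implementation) might look like:

```python
from types import SimpleNamespace

class IsStaffOrTargetUser:
    """Grant object access when the requester is staff or is the object itself."""

    def has_object_permission(self, request, view, obj):
        user = request.user
        return bool(getattr(user, "is_staff", False) or user == obj)

# Tiny stand-ins for request/user objects, for illustration only.
admin = SimpleNamespace(is_staff=True)
user = SimpleNamespace(is_staff=False)
perm = IsStaffOrTargetUser()

print(perm.has_object_permission(SimpleNamespace(user=admin), None, user))  # True
print(perm.has_object_permission(SimpleNamespace(user=user), None, admin))  # False
print(perm.has_object_permission(SimpleNamespace(user=user), None, user))   # True
```

In the real tests the class subclasses DRF's `BasePermission` and receives actual `User` instances, but the staff-or-self rule is the behavior the assertions above pin down.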
# Source: bitwise/storage/REG.py (repo: jamesjiang52/Bitwise, MIT license)
"""
The following classes are defined:
Register4
Register8
Register16
"""
from .. import wire
from .. import signal
from . import FLOP
Wire = wire.Wire
Bus4 = wire.Bus4
Bus8 = wire.Bus8
Bus16 = wire.Bus16
class Register4:
"""Construct a new 4-bit storage register.
Args:
data_bus: An object of type Bus4. The data input to the register.
enable: An object of type Wire. Enables the register.
clock: An object of type Wire or Clock. The clock input to the
register.
output_bus: An object of type Bus4. The output of the register. Takes
on the value of data_bus on the positive edges of clock if the
value of enable is 1.
Raises:
TypeError: If either data_bus or output_bus is not a bus of width 4.
"""
def __init__(self, data_bus, enable, clock, output_bus):
if len(data_bus) != 4:
raise TypeError(
"Expected bus of width 4, received bus of width {0}.".format(
len(data_bus)
)
)
if len(output_bus) != 4:
raise TypeError(
"Expected bus of width 4, received bus of width {0}.".format(
len(output_bus)
)
)
        mux_bus = Bus4()
        _Multiplexer2To1_4(enable, data_bus, output_bus, mux_bus)
        # One D flip-flop per bit; each flop's complementary output goes to a
        # throwaway wire since only the Q outputs are used.
        for i in range(4):
            FLOP.DFlipFlop(mux_bus[i], clock, output_bus[i], Wire())
self.data_bus = data_bus
self.enable = enable
self.clock = clock
self.output_bus = output_bus
def __str__(self):
str_ = ""
str_ += "data_bus: " + self.data_bus.__str__() + "\n"
str_ += "enable: " + str(self.enable.value) + "\n"
str_ += "clock: " + str(self.clock.value) + "\n"
str_ += "output_bus: " + self.output_bus.__str__()
return str_
def __call__(
self, *,
data_bus=None,
enable=None,
clock=None,
output_bus=None
):
if data_bus is not None:
self.data_bus.wire_values = data_bus
if enable is not None:
self.enable.value = enable
if clock is not None:
self.clock.value = clock
if output_bus is not None:
self.output_bus.wire_values = output_bus
class Register8:
"""Construct a new 8-bit storage register.
Args:
data_bus: An object of type Bus8. The data input to the register.
enable: An object of type Wire. Enables the register.
clock: An object of type Wire or Clock. The clock input to the
register.
output_bus: An object of type Bus8. The output of the register. Takes
on the value of data_bus on the positive edges of clock if the
value of enable is 1.
Raises:
TypeError: If either data_bus or output_bus is not a bus of width 8.
"""
def __init__(self, data_bus, enable, clock, output_bus):
if len(data_bus) != 8:
raise TypeError(
"Expected bus of width 8, received bus of width {0}.".format(
len(data_bus)
)
)
if len(output_bus) != 8:
raise TypeError(
"Expected bus of width 8, received bus of width {0}.".format(
len(output_bus)
)
)
        mux_bus = Bus8()
        _Multiplexer2To1_8(enable, data_bus, output_bus, mux_bus)
        # One D flip-flop per bit; complementary outputs are unused.
        for i in range(8):
            FLOP.DFlipFlop(mux_bus[i], clock, output_bus[i], Wire())
self.data_bus = data_bus
self.enable = enable
self.clock = clock
self.output_bus = output_bus
def __str__(self):
str_ = ""
str_ += "data_bus: " + self.data_bus.__str__() + "\n"
str_ += "enable: " + str(self.enable.value) + "\n"
str_ += "clock: " + str(self.clock.value) + "\n"
str_ += "output_bus: " + self.output_bus.__str__()
return str_
def __call__(
self, *,
data_bus=None,
enable=None,
clock=None,
output_bus=None
):
if data_bus is not None:
self.data_bus.wire_values = data_bus
if enable is not None:
self.enable.value = enable
if clock is not None:
self.clock.value = clock
if output_bus is not None:
self.output_bus.wire_values = output_bus
class Register16:
"""Construct a new 16-bit storage register.
Args:
data_bus: An object of type Bus16. The data input to the register.
enable: An object of type Wire. Enables the register.
clock: An object of type Wire or Clock. The clock input to the
register.
output_bus: An object of type Bus16. The output of the register. Takes
on the value of data_bus on the positive edges of clock if the
value of enable is 1.
Raises:
TypeError: If either data_bus or output_bus is not a bus of width 16.
"""
def __init__(self, data_bus, enable, clock, output_bus):
if len(data_bus) != 16:
raise TypeError(
"Expected bus of width 16, received bus of width {0}.".format(
len(data_bus)
)
)
if len(output_bus) != 16:
raise TypeError(
"Expected bus of width 16, received bus of width {0}.".format(
len(output_bus)
)
)
        mux_bus = Bus16()
        _Multiplexer2To1_16(enable, data_bus, output_bus, mux_bus)
        # One D flip-flop per bit; complementary outputs are unused.
        for i in range(16):
            FLOP.DFlipFlop(mux_bus[i], clock, output_bus[i], Wire())
self.data_bus = data_bus
self.enable = enable
self.clock = clock
self.output_bus = output_bus
def __str__(self):
str_ = ""
str_ += "data_bus: " + self.data_bus.__str__() + "\n"
str_ += "enable: " + str(self.enable.value) + "\n"
str_ += "clock: " + str(self.clock.value) + "\n"
str_ += "output_bus: " + self.output_bus.__str__()
return str_
def __call__(
self, *,
data_bus=None,
enable=None,
clock=None,
output_bus=None
):
if data_bus is not None:
self.data_bus.wire_values = data_bus
if enable is not None:
self.enable.value = enable
if clock is not None:
self.clock.value = clock
if output_bus is not None:
self.output_bus.wire_values = output_bus
class _Multiplexer2To1_4:
"""
This is an internal module for Register4. It multiplexes two 4-bit inputs
to a single 4-bit output.
"""
def __init__(
self,
select,
input_1_bus,
input_2_bus,
output_bus
):
vcc = Wire()
vcc.value = 1
        # Instantiate one 1-bit 2-to-1 multiplexer per bus line.
        for i in range(4):
            signal.Multiplexer2To1(
                vcc,
                select,
                input_1_bus[i],
                input_2_bus[i],
                output_bus[i]
            )
class _Multiplexer2To1_8:
"""
This is an internal module for Register8. It multiplexes two 8-bit inputs
to a single 8-bit output.
"""
def __init__(
self,
select,
input_1_bus,
input_2_bus,
output_bus
):
vcc = Wire()
vcc.value = 1
        # Instantiate one 1-bit 2-to-1 multiplexer per bus line.
        for i in range(8):
            signal.Multiplexer2To1(
                vcc,
                select,
                input_1_bus[i],
                input_2_bus[i],
                output_bus[i]
            )
class _Multiplexer2To1_16:
"""
This is an internal module for Register16. It multiplexes two 16-bit inputs
to a single 16-bit output.
"""
def __init__(
self,
select,
input_1_bus,
input_2_bus,
output_bus
):
vcc = Wire()
vcc.value = 1
        # Instantiate one 1-bit 2-to-1 multiplexer per bus line.
        for i in range(16):
            signal.Multiplexer2To1(
                vcc,
                select,
                input_1_bus[i],
                input_2_bus[i],
                output_bus[i]
            )
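Structurally, each register above is a 2-to-1 multiplexer in front of a bank of D flip-flops: when `enable` is 1 the mux feeds `data_bus` into the flops, otherwise it recirculates `output_bus`, so the stored value changes only on a clock edge while enable is high. A behavioral (non-gate-level) sketch of that rising-edge rule:

```python
def register_edge(stored, data, enable):
    """One rising clock edge of an enable-gated register.

    Returns the new stored bits: `data` when enable is 1, otherwise the
    previous contents are recirculated unchanged.
    """
    return list(data) if enable else list(stored)

q = [0, 0, 0, 0]
q = register_edge(q, [1, 0, 1, 1], enable=1)  # loads the new value
q = register_edge(q, [0, 0, 0, 0], enable=0)  # holds: input is ignored
print(q)  # [1, 0, 1, 1]
```

This is the same recirculation trick the `_Multiplexer2To1_*` helpers implement in hardware, avoiding any gating of the clock signal itself.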
# Source: sdk/python/pulumi_spotinst/azure/elastigroup.py (repo: timmyers/pulumi-spotinst, ECL-2.0/Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import json
import warnings
import pulumi
import pulumi.runtime
from typing import Union
from .. import utilities, tables
class Elastigroup(pulumi.CustomResource):
custom_data: pulumi.Output[str]
desired_capacity: pulumi.Output[float]
"""
The desired number of instances the group should have at any time.
"""
health_check: pulumi.Output[dict]
images: pulumi.Output[list]
integration_kubernetes: pulumi.Output[dict]
integration_multai_runtime: pulumi.Output[dict]
load_balancers: pulumi.Output[list]
login: pulumi.Output[dict]
low_priority_sizes: pulumi.Output[list]
"""
Available Low-Priority sizes.
"""
managed_service_identities: pulumi.Output[list]
max_size: pulumi.Output[float]
"""
The maximum number of instances the group should have at any time.
"""
min_size: pulumi.Output[float]
"""
The minimum number of instances the group should have at any time.
"""
name: pulumi.Output[str]
"""
The name of the managed identity.
"""
network: pulumi.Output[dict]
od_sizes: pulumi.Output[list]
"""
Available On-Demand sizes
"""
product: pulumi.Output[str]
"""
Operation system type. Valid values: `"Linux"`, `"Windows"`.
"""
region: pulumi.Output[str]
"""
The region your Azure group will be created in.
"""
resource_group_name: pulumi.Output[str]
"""
The Resource Group that the user-assigned managed identity resides in.
"""
scaling_down_policies: pulumi.Output[list]
scaling_up_policies: pulumi.Output[list]
scheduled_tasks: pulumi.Output[list]
shutdown_script: pulumi.Output[str]
"""
Shutdown script for the group. Value should be passed as a string encoded at Base64 only.
"""
strategy: pulumi.Output[dict]
"""
Describes the deployment strategy.
* `draining_timeout` (`float`) - Time (seconds) to allow the instance to be drained from incoming TCP connections and detached from MLB before terminating it during a scale-down operation.
* `lowPriorityPercentage` (`float`) - Percentage of Low Priority instances to maintain. Required if `od_count` is not specified.
* `odCount` (`float`) - Number of On-Demand instances to maintain. Required if low_priority_percentage is not specified.
"""
update_policy: pulumi.Output[dict]
user_data: pulumi.Output[str]
"""
Base64-encoded MIME user data to make available to the instances.
"""
def __init__(__self__, resource_name, opts=None, custom_data=None, desired_capacity=None, health_check=None, images=None, integration_kubernetes=None, integration_multai_runtime=None, load_balancers=None, login=None, low_priority_sizes=None, managed_service_identities=None, max_size=None, min_size=None, name=None, network=None, od_sizes=None, product=None, region=None, resource_group_name=None, scaling_down_policies=None, scaling_up_policies=None, scheduled_tasks=None, shutdown_script=None, strategy=None, update_policy=None, user_data=None, __props__=None, __name__=None, __opts__=None):
"""
Provides a Spotinst elastigroup Azure resource.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[float] desired_capacity: The desired number of instances the group should have at any time.
:param pulumi.Input[list] low_priority_sizes: Available Low-Priority sizes.
:param pulumi.Input[float] max_size: The maximum number of instances the group should have at any time.
:param pulumi.Input[float] min_size: The minimum number of instances the group should have at any time.
:param pulumi.Input[str] name: The name of the managed identity.
:param pulumi.Input[list] od_sizes: Available On-Demand sizes
:param pulumi.Input[str] product: Operation system type. Valid values: `"Linux"`, `"Windows"`.
:param pulumi.Input[str] region: The region your Azure group will be created in.
:param pulumi.Input[str] resource_group_name: The Resource Group that the user-assigned managed identity resides in.
:param pulumi.Input[str] shutdown_script: Shutdown script for the group. Value should be passed as a string encoded at Base64 only.
:param pulumi.Input[dict] strategy: Describes the deployment strategy.
:param pulumi.Input[str] user_data: Base64-encoded MIME user data to make available to the instances.
The **health_check** object supports the following:
* `autoHealing` (`pulumi.Input[bool]`)
* `gracePeriod` (`pulumi.Input[float]`)
* `health_check_type` (`pulumi.Input[str]`)
The **images** object supports the following:
* `customs` (`pulumi.Input[list]`)
* `imageName` (`pulumi.Input[str]`)
* `resource_group_name` (`pulumi.Input[str]`) - The Resource Group that the user-assigned managed identity resides in.
* `marketplaces` (`pulumi.Input[list]`)
* `offer` (`pulumi.Input[str]`)
* `publisher` (`pulumi.Input[str]`)
* `sku` (`pulumi.Input[str]`)
The **integration_kubernetes** object supports the following:
* `clusterIdentifier` (`pulumi.Input[str]`)
The **integration_multai_runtime** object supports the following:
* `deploymentId` (`pulumi.Input[str]`)
The **load_balancers** object supports the following:
* `autoWeight` (`pulumi.Input[bool]`)
* `balancerId` (`pulumi.Input[str]`)
* `targetSetId` (`pulumi.Input[str]`)
* `type` (`pulumi.Input[str]`)
The **login** object supports the following:
* `password` (`pulumi.Input[str]`)
* `sshPublicKey` (`pulumi.Input[str]`)
* `userName` (`pulumi.Input[str]`)
The **managed_service_identities** object supports the following:
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `resource_group_name` (`pulumi.Input[str]`) - The Resource Group that the user-assigned managed identity resides in.
The **network** object supports the following:
* `additionalIpConfigs` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `privateIpVersion` (`pulumi.Input[str]`)
* `assignPublicIp` (`pulumi.Input[bool]`)
* `resource_group_name` (`pulumi.Input[str]`) - The Resource Group that the user-assigned managed identity resides in.
* `subnetName` (`pulumi.Input[str]`)
* `virtualNetworkName` (`pulumi.Input[str]`)
The **scaling_down_policies** object supports the following:
* `actionType` (`pulumi.Input[str]`)
* `adjustment` (`pulumi.Input[str]`)
* `cooldown` (`pulumi.Input[float]`)
* `dimensions` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `value` (`pulumi.Input[str]`)
* `evaluationPeriods` (`pulumi.Input[float]`)
* `maxTargetCapacity` (`pulumi.Input[str]`)
* `maximum` (`pulumi.Input[str]`)
* `metricName` (`pulumi.Input[str]`)
* `minTargetCapacity` (`pulumi.Input[str]`)
* `minimum` (`pulumi.Input[str]`)
* `namespace` (`pulumi.Input[str]`)
* `operator` (`pulumi.Input[str]`)
* `period` (`pulumi.Input[float]`)
* `policyName` (`pulumi.Input[str]`)
* `statistic` (`pulumi.Input[str]`)
* `target` (`pulumi.Input[str]`)
* `threshold` (`pulumi.Input[float]`)
* `unit` (`pulumi.Input[str]`)
The **scaling_up_policies** object supports the following:
* `actionType` (`pulumi.Input[str]`)
* `adjustment` (`pulumi.Input[str]`)
* `cooldown` (`pulumi.Input[float]`)
* `dimensions` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `value` (`pulumi.Input[str]`)
* `evaluationPeriods` (`pulumi.Input[float]`)
* `maxTargetCapacity` (`pulumi.Input[str]`)
* `maximum` (`pulumi.Input[str]`)
* `metricName` (`pulumi.Input[str]`)
* `minTargetCapacity` (`pulumi.Input[str]`)
* `minimum` (`pulumi.Input[str]`)
* `namespace` (`pulumi.Input[str]`)
* `operator` (`pulumi.Input[str]`)
* `period` (`pulumi.Input[float]`)
* `policyName` (`pulumi.Input[str]`)
* `statistic` (`pulumi.Input[str]`)
* `target` (`pulumi.Input[str]`)
* `threshold` (`pulumi.Input[float]`)
* `unit` (`pulumi.Input[str]`)
The **scheduled_tasks** object supports the following:
* `adjustment` (`pulumi.Input[str]`)
* `adjustmentPercentage` (`pulumi.Input[str]`)
* `batchSizePercentage` (`pulumi.Input[str]`)
* `cronExpression` (`pulumi.Input[str]`)
* `gracePeriod` (`pulumi.Input[str]`)
* `isEnabled` (`pulumi.Input[bool]`)
* `scaleMaxCapacity` (`pulumi.Input[str]`)
* `scaleMinCapacity` (`pulumi.Input[str]`)
* `scaleTargetCapacity` (`pulumi.Input[str]`)
* `taskType` (`pulumi.Input[str]`)
The **strategy** object supports the following:
* `draining_timeout` (`pulumi.Input[float]`) - Time (seconds) to allow the instance to be drained from incoming TCP connections and detached from MLB before terminating it during a scale-down operation.
* `lowPriorityPercentage` (`pulumi.Input[float]`) - Percentage of Low Priority instances to maintain. Required if `od_count` is not specified.
* `odCount` (`pulumi.Input[float]`) - Number of On-Demand instances to maintain. Required if low_priority_percentage is not specified.
The **update_policy** object supports the following:
* `rollConfig` (`pulumi.Input[dict]`)
* `batchSizePercentage` (`pulumi.Input[float]`)
* `gracePeriod` (`pulumi.Input[float]`)
* `health_check_type` (`pulumi.Input[str]`)
* `shouldRoll` (`pulumi.Input[bool]`)
> This content is derived from https://github.com/terraform-providers/terraform-provider-spotinst/blob/master/website/docs/r/elastigroup_azure.html.markdown.
"""
if __name__ is not None:
warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
resource_name = __name__
if __opts__ is not None:
warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
opts = __opts__
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = dict()
__props__['custom_data'] = custom_data
__props__['desired_capacity'] = desired_capacity
__props__['health_check'] = health_check
__props__['images'] = images
__props__['integration_kubernetes'] = integration_kubernetes
__props__['integration_multai_runtime'] = integration_multai_runtime
__props__['load_balancers'] = load_balancers
__props__['login'] = login
if low_priority_sizes is None:
raise TypeError("Missing required property 'low_priority_sizes'")
__props__['low_priority_sizes'] = low_priority_sizes
__props__['managed_service_identities'] = managed_service_identities
__props__['max_size'] = max_size
__props__['min_size'] = min_size
__props__['name'] = name
if network is None:
raise TypeError("Missing required property 'network'")
__props__['network'] = network
if od_sizes is None:
raise TypeError("Missing required property 'od_sizes'")
__props__['od_sizes'] = od_sizes
if product is None:
raise TypeError("Missing required property 'product'")
__props__['product'] = product
if region is None:
raise TypeError("Missing required property 'region'")
__props__['region'] = region
if resource_group_name is None:
raise TypeError("Missing required property 'resource_group_name'")
__props__['resource_group_name'] = resource_group_name
__props__['scaling_down_policies'] = scaling_down_policies
__props__['scaling_up_policies'] = scaling_up_policies
__props__['scheduled_tasks'] = scheduled_tasks
__props__['shutdown_script'] = shutdown_script
if strategy is None:
raise TypeError("Missing required property 'strategy'")
__props__['strategy'] = strategy
__props__['update_policy'] = update_policy
__props__['user_data'] = user_data
super(Elastigroup, __self__).__init__(
'spotinst:azure/elastigroup:Elastigroup',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name, id, opts=None, custom_data=None, desired_capacity=None, health_check=None, images=None, integration_kubernetes=None, integration_multai_runtime=None, load_balancers=None, login=None, low_priority_sizes=None, managed_service_identities=None, max_size=None, min_size=None, name=None, network=None, od_sizes=None, product=None, region=None, resource_group_name=None, scaling_down_policies=None, scaling_up_policies=None, scheduled_tasks=None, shutdown_script=None, strategy=None, update_policy=None, user_data=None):
"""
Get an existing Elastigroup resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param str id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[float] desired_capacity: The desired number of instances the group should have at any time.
:param pulumi.Input[list] low_priority_sizes: Available Low-Priority sizes.
:param pulumi.Input[float] max_size: The maximum number of instances the group should have at any time.
:param pulumi.Input[float] min_size: The minimum number of instances the group should have at any time.
:param pulumi.Input[str] name: The name of the managed identity.
:param pulumi.Input[list] od_sizes: Available On-Demand sizes
:param pulumi.Input[str] product: Operation system type. Valid values: `"Linux"`, `"Windows"`.
:param pulumi.Input[str] region: The region your Azure group will be created in.
:param pulumi.Input[str] resource_group_name: The Resource Group that the user-assigned managed identity resides in.
:param pulumi.Input[str] shutdown_script: Shutdown script for the group. Value should be passed as a string encoded at Base64 only.
:param pulumi.Input[dict] strategy: Describes the deployment strategy.
:param pulumi.Input[str] user_data: Base64-encoded MIME user data to make available to the instances.
The **health_check** object supports the following:
* `autoHealing` (`pulumi.Input[bool]`)
* `gracePeriod` (`pulumi.Input[float]`)
* `health_check_type` (`pulumi.Input[str]`)
The **images** object supports the following:
* `customs` (`pulumi.Input[list]`)
* `imageName` (`pulumi.Input[str]`)
* `resource_group_name` (`pulumi.Input[str]`) - The Resource Group that the user-assigned managed identity resides in.
* `marketplaces` (`pulumi.Input[list]`)
* `offer` (`pulumi.Input[str]`)
* `publisher` (`pulumi.Input[str]`)
* `sku` (`pulumi.Input[str]`)
The **integration_kubernetes** object supports the following:
* `clusterIdentifier` (`pulumi.Input[str]`)
The **integration_multai_runtime** object supports the following:
* `deploymentId` (`pulumi.Input[str]`)
The **load_balancers** object supports the following:
* `autoWeight` (`pulumi.Input[bool]`)
* `balancerId` (`pulumi.Input[str]`)
* `targetSetId` (`pulumi.Input[str]`)
* `type` (`pulumi.Input[str]`)
The **login** object supports the following:
* `password` (`pulumi.Input[str]`)
* `sshPublicKey` (`pulumi.Input[str]`)
* `userName` (`pulumi.Input[str]`)
The **managed_service_identities** object supports the following:
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `resource_group_name` (`pulumi.Input[str]`) - The Resource Group that the user-assigned managed identity resides in.
The **network** object supports the following:
* `additionalIpConfigs` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `privateIpVersion` (`pulumi.Input[str]`)
* `assignPublicIp` (`pulumi.Input[bool]`)
* `resource_group_name` (`pulumi.Input[str]`) - The Resource Group that the user-assigned managed identity resides in.
* `subnetName` (`pulumi.Input[str]`)
* `virtualNetworkName` (`pulumi.Input[str]`)
The **scaling_down_policies** object supports the following:
* `actionType` (`pulumi.Input[str]`)
* `adjustment` (`pulumi.Input[str]`)
* `cooldown` (`pulumi.Input[float]`)
* `dimensions` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `value` (`pulumi.Input[str]`)
* `evaluationPeriods` (`pulumi.Input[float]`)
* `maxTargetCapacity` (`pulumi.Input[str]`)
* `maximum` (`pulumi.Input[str]`)
* `metricName` (`pulumi.Input[str]`)
* `minTargetCapacity` (`pulumi.Input[str]`)
* `minimum` (`pulumi.Input[str]`)
* `namespace` (`pulumi.Input[str]`)
* `operator` (`pulumi.Input[str]`)
* `period` (`pulumi.Input[float]`)
* `policyName` (`pulumi.Input[str]`)
* `statistic` (`pulumi.Input[str]`)
* `target` (`pulumi.Input[str]`)
* `threshold` (`pulumi.Input[float]`)
* `unit` (`pulumi.Input[str]`)
The **scaling_up_policies** object supports the following:
* `actionType` (`pulumi.Input[str]`)
* `adjustment` (`pulumi.Input[str]`)
* `cooldown` (`pulumi.Input[float]`)
* `dimensions` (`pulumi.Input[list]`)
* `name` (`pulumi.Input[str]`) - The name of the managed identity.
* `value` (`pulumi.Input[str]`)
* `evaluationPeriods` (`pulumi.Input[float]`)
* `maxTargetCapacity` (`pulumi.Input[str]`)
* `maximum` (`pulumi.Input[str]`)
* `metricName` (`pulumi.Input[str]`)
* `minTargetCapacity` (`pulumi.Input[str]`)
* `minimum` (`pulumi.Input[str]`)
* `namespace` (`pulumi.Input[str]`)
* `operator` (`pulumi.Input[str]`)
* `period` (`pulumi.Input[float]`)
* `policyName` (`pulumi.Input[str]`)
* `statistic` (`pulumi.Input[str]`)
* `target` (`pulumi.Input[str]`)
* `threshold` (`pulumi.Input[float]`)
* `unit` (`pulumi.Input[str]`)
The **scheduled_tasks** object supports the following:
* `adjustment` (`pulumi.Input[str]`)
* `adjustmentPercentage` (`pulumi.Input[str]`)
* `batchSizePercentage` (`pulumi.Input[str]`)
* `cronExpression` (`pulumi.Input[str]`)
* `gracePeriod` (`pulumi.Input[str]`)
* `isEnabled` (`pulumi.Input[bool]`)
* `scaleMaxCapacity` (`pulumi.Input[str]`)
* `scaleMinCapacity` (`pulumi.Input[str]`)
* `scaleTargetCapacity` (`pulumi.Input[str]`)
* `taskType` (`pulumi.Input[str]`)
The **strategy** object supports the following:
* `draining_timeout` (`pulumi.Input[float]`) - Time (seconds) to allow the instance to be drained from incoming TCP connections and detached from MLB before terminating it during a scale-down operation.
* `lowPriorityPercentage` (`pulumi.Input[float]`) - Percentage of Low Priority instances to maintain. Required if `od_count` is not specified.
* `odCount` (`pulumi.Input[float]`) - Number of On-Demand instances to maintain. Required if low_priority_percentage is not specified.
The **update_policy** object supports the following:
* `rollConfig` (`pulumi.Input[dict]`)
* `batchSizePercentage` (`pulumi.Input[float]`)
* `gracePeriod` (`pulumi.Input[float]`)
* `health_check_type` (`pulumi.Input[str]`)
* `shouldRoll` (`pulumi.Input[bool]`)
> This content is derived from https://github.com/terraform-providers/terraform-provider-spotinst/blob/master/website/docs/r/elastigroup_azure.html.markdown.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = dict()
__props__["custom_data"] = custom_data
__props__["desired_capacity"] = desired_capacity
__props__["health_check"] = health_check
__props__["images"] = images
__props__["integration_kubernetes"] = integration_kubernetes
__props__["integration_multai_runtime"] = integration_multai_runtime
__props__["load_balancers"] = load_balancers
__props__["login"] = login
__props__["low_priority_sizes"] = low_priority_sizes
__props__["managed_service_identities"] = managed_service_identities
__props__["max_size"] = max_size
__props__["min_size"] = min_size
__props__["name"] = name
__props__["network"] = network
__props__["od_sizes"] = od_sizes
__props__["product"] = product
__props__["region"] = region
__props__["resource_group_name"] = resource_group_name
__props__["scaling_down_policies"] = scaling_down_policies
__props__["scaling_up_policies"] = scaling_up_policies
__props__["scheduled_tasks"] = scheduled_tasks
__props__["shutdown_script"] = shutdown_script
__props__["strategy"] = strategy
__props__["update_policy"] = update_policy
__props__["user_data"] = user_data
return Elastigroup(resource_name, opts=opts, __props__=__props__)
def translate_output_property(self, prop):
return tables._CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop
def translate_input_property(self, prop):
return tables._SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
# Source: python_utils/datareader/files_based.py (repo: Fhrozen/jrm_ssl, MIT license)
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os, h5py, glob
import numpy as np
from matplotlib import pyplot as plt
class hdf5_plain(object):
def __init__(self, config_file):
self.prefix = config_file.get('database', 'prefix')
_inputs = config_file.get('data', 'labels')
self._inputs = [ x for x in _inputs.split(';')]
self.epochs_ = config_file.getint('train', 'epochs')
self.get_db_sizes()
def get_db_sizes(self):
print('Looking at {} for files:'.format(self.prefix))
list_dirs = glob.glob('{}/*'.format(self.prefix))
list_file = []
index = []
first = True
for i in range(len(list_dirs)):
_files = glob.glob('{}/*'.format(list_dirs[i]))
if first:
first = False
with h5py.File(_files[0], 'r') as f:
_dims = [ None ] * len(self._inputs)
_types = [ None ] * len(self._inputs)
for j in range(len(self._inputs)):
                        testfile = f[self._inputs[j]]
_dim = testfile.shape
_dims[j] = _dim if len(_dim) !=0 else []
_types[j] = testfile.dtype
print(' data label:{} dim:{} dtype:{}'.format(self._inputs[j], list(_dims[j]), _types[j]))
list_file += [_files]
index += [[i, x] for x in np.arange(len(_files))]
self._dims = _dims
self._type = _types
self.files = list_file
self.idxs = index
print(' Total of {} files on {} folders...'.format(len(self.idxs), len(self.files)))
return
def read_data(self, idxs, divisions):
"""
Data feeder from HDF5 Files
        idxs[:, 0]: Folder
        idxs[:, 1]: Filename
"""
data_batch = np.empty((divisions),dtype=object)
        idx_div = np.linspace(0, len(idxs), num=divisions + 1, dtype=int)  # np.int alias was removed in NumPy 1.24
for i in range(divisions):
batch_in = idx_div[i+1] - idx_div[i]
_idxs = idxs[idx_div[i]:idx_div[i+1]]
data_labels = np.empty((len(self._inputs)),dtype=object)
for j in range(batch_in):
iDb, iFL = _idxs[j]
with h5py.File(self.files[iDb][iFL], 'r') as f:
for k in range(len(self._inputs)):
if j == 0:
dims = [batch_in]
dims += [ x for x in self._dims[k]]
data_labels[k] = np.zeros(dims, dtype=self._type[k])
data_labels[k][j] = np.asarray(f[self._inputs[k]])
data_batch[i] = data_labels
return data_batch
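The `read_data` method above splits the index list into `divisions` batches via `np.linspace`. A minimal standalone sketch of that batching (variable names mirror the code; the sizes are illustrative):

```python
import numpy as np

# Ten (folder, file) index pairs, as built by get_db_sizes()
idxs = [[0, k] for k in range(10)]
divisions = 3

# Boundaries are evenly spaced and truncated to int, so batch sizes
# can differ by one: here boundaries are [0, 3, 6, 10]
idx_div = np.linspace(0, len(idxs), num=divisions + 1, dtype=int)
batches = [idxs[idx_div[i]:idx_div[i + 1]] for i in range(divisions)]
```

Note that the last batch absorbs the rounding remainder, so callers should not assume equal batch sizes.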
class hdf5_sigmoid(object):
def __init__(self, config_file):
self.prefix = config_file.get('database', 'prefix')
_inputs = config_file.get('data', 'labels')
self._inputs = [ x for x in _inputs.split(';')]
self.epochs_ = config_file.getint('train', 'epochs')
self.get_db_sizes()
def get_db_sizes(self):
print('Looking at {} for files:'.format(self.prefix))
list_dirs = glob.glob('{}/*'.format(self.prefix))
list_file = []
index = []
first = True
for i in range(len(list_dirs)):
_files = glob.glob('{}/*'.format(list_dirs[i]))
if first:
first = False
with h5py.File(_files[0], 'r') as f:
_dims = [ None ] * len(self._inputs)
_types = [ None ] * len(self._inputs)
for j in range(len(self._inputs)):
                        testfile = f[self._inputs[j]]
_dim = testfile.shape
_dims[j] = _dim if len(_dim) !=0 else []
_types[j] = testfile.dtype
print(' data label:{} dim:{} dtype:{}'.format(self._inputs[j], list(_dims[j]), _types[j]))
list_file += [_files]
index += [[i, x] for x in np.arange(len(_files))]
self._dims = _dims
self._type = _types
self.files = list_file
self.idxs = index
print(' Total of {} files on {} folders...'.format(len(self.idxs), len(self.files)))
return
def read_data(self, idxs, divisions):
"""
Data feeder from HDF5 Files
        idxs[:, 0]: Folder
        idxs[:, 1]: Filename
"""
data_batch = np.empty((divisions),dtype=object)
        idx_div = np.linspace(0, len(idxs), num=divisions + 1, dtype=int)  # np.int alias was removed in NumPy 1.24
for i in range(divisions):
batch_in = idx_div[i+1] - idx_div[i]
_idxs = idxs[idx_div[i]:idx_div[i+1]]
data_labels = np.empty((len(self._inputs)),dtype=object)
for j in range(batch_in):
iDb, iFL = _idxs[j]
with h5py.File(self.files[iDb][iFL], 'r') as f:
for k in range(len(self._inputs)):
if j == 0:
dims = [batch_in]
dims += [ x for x in self._dims[k]]
data_labels[k] = np.zeros(dims, dtype=self._type[k])
data_labels[k][j] = np.asarray(f[self._inputs[k]])
data_batch[i] = data_labels
return data_batch
class hdf5_new_segment(object):
def __init__(self, config_file):
self.prefix = config_file.get('database', 'prefix')
_inputs = config_file.get('data', 'labels')
self._inputs = [ x for x in _inputs.split(';')]
self.epochs_ = config_file.getint('train', 'epochs')
self.get_db_sizes()
def get_db_sizes(self):
print('Looking at {} for files:'.format(self.prefix))
list_dirs = glob.glob('{}/*'.format(self.prefix))
list_file = []
index = []
first = True
for i in range(len(list_dirs)):
_files = glob.glob('{}/*'.format(list_dirs[i]))
if first:
first = False
with h5py.File(_files[0], 'r') as f:
_dims = [ None ] * len(self._inputs)
_types = [ None ] * len(self._inputs)
for j in range(len(self._inputs)):
                        testfile = f[self._inputs[j]]
_dim = testfile.shape
_dims[j] = _dim if len(_dim) !=0 else []
_types[j] = testfile.dtype
print(' data label:{} dim:{} dtype:{}'.format(self._inputs[j], list(_dims[j]), _types[j]))
list_file += [_files]
index += [[i, x] for x in np.arange(len(_files))]
self._dims = _dims
self._type = _types
self.files = list_file
self.idxs = index
print(' Total of {} files on {} folders...'.format(len(self.idxs), len(self.files)))
return
def read_data(self, idxs, divisions):
"""
Data feeder from HDF5 Files
        idxs[:, 0]: Folder
        idxs[:, 1]: Filename
"""
data_batch = np.empty((divisions),dtype=object)
        idx_div = np.linspace(0, len(idxs), num=divisions + 1, dtype=int)  # np.int alias was removed in NumPy 1.24
for i in range(divisions):
batch_in = idx_div[i+1] - idx_div[i]
_idxs = idxs[idx_div[i]:idx_div[i+1]]
data_labels = np.empty((len(self._inputs)),dtype=object)
for j in range(batch_in):
iDb, iFL = _idxs[j]
with h5py.File(self.files[iDb][iFL], 'r') as f:
for k in range(len(self._inputs)):
if j == 0:
dims = [batch_in]
if k == 0:
data_labels[k] = np.zeros((batch_in,1,257,7*20), dtype=self._type[k])
else:
dims += [ x for x in self._dims[k]]
data_labels[k] = np.zeros(dims, dtype=self._type[k])
if k == 0:
a = np.zeros((1,257,7*20), dtype=np.float32)
for _fr in range(20):
for _ch in range(7):
a[0,:,_ch+_fr*7] = f[self._inputs[k]][_ch, :,_fr]
data_labels[k][j] = a
else:
data_labels[k][j] = np.asarray(f[self._inputs[k]])
data_batch[i] = data_labels
        return data_batch
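The per-channel copy loop in `hdf5_new_segment.read_data` interleaves 7 channels across 20 frames into a `(1, 257, 140)` array. The sketch below shows that the loop is equivalent to a transpose plus reshape (the shapes come from the code; the input array here is synthetic):

```python
import numpy as np

# Synthetic stand-in for the HDF5 dataset with shape (channels, features, frames)
src = np.arange(7 * 257 * 20, dtype=np.float32).reshape(7, 257, 20)

# The explicit loop from hdf5_new_segment.read_data
a = np.zeros((1, 257, 7 * 20), dtype=np.float32)
for _fr in range(20):
    for _ch in range(7):
        a[0, :, _ch + _fr * 7] = src[_ch, :, _fr]

# Equivalent vectorized form: last axis is frame-major, channel-minor
b = src.transpose(1, 2, 0).reshape(257, 20 * 7)[np.newaxis, ...]
```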
# Source: onesignalapi/utils/http_request.py (repo: alexromer0/onesignal-api, MIT license)
import requests
import json
import response
class HttpRequest(object):
__headers = None
def __init__(self, url, headers=None):
self.__url = url
self.__payload = {}
self.response = {}
if headers is not None:
if isinstance(headers, dict):
self.__headers = headers
self.response = response.Response().success_response('ok', [])
else:
self.response = response.Response().error_response('The headers attribute must be a dictionary', [])
raise ValueError('The headers attribute must be a dictionary')
def post_request(self, data):
try:
if not self.__url:
_response = response.Response()
self.response = _response.error_response('The url is empty, please provide an url', [])
else:
req = requests.post(self.__url, headers=self.__headers, data=json.dumps(data))
if req.status_code == 200:
res = req.json()
if 'errors' not in res:
_response = response.Response()
self.response = _response.success_response('Ok', res)
else:
_response = response.Response()
self.response = _response.error_response(
'Fail with a status 200, check the data for more information', res['errors'])
else:
_response = response.Response()
                    self.response = _response.error_response(
                        'Fail with a status different than 200, check the data for more info', req.json())
return self.response
except requests.exceptions.RequestException as e:
_response = response.Response()
self.response = _response.error_response(repr(e), [])
return self.response
def put_request(self, data):
try:
if not self.__url:
_response = response.Response()
self.response = _response.error_response('The url is empty, please provide an url', [])
else:
req = requests.put(self.__url, data=json.dumps(data), headers=self.__headers)
if req.status_code == 200:
_response = response.Response()
self.response = _response.success_response('success', req.json())
else:
_response = response.Response()
                    self.response = _response.error_response(
                        'Fail with a status different than 200, check the data for more info', req.json())
return self.response
except requests.exceptions.RequestException as e:
_response = response.Response()
self.response = _response.error_response(repr(e), [])
return self.response
def get_request(self, data):
try:
if not self.__url:
_response = response.Response()
self.response = _response.error_response('The url is empty, please provide an url', [])
else:
req = requests.get(self.__url, params=json.dumps(data))
if req.status_code == 200:
_response = response.Response()
self.response = _response.success_response('success', req.json())
else:
_response = response.Response()
self.response = _response.error_response(
'Fail with a status different than 200, check the data for more info', [req.text])
return self.response
except requests.exceptions.RequestException as e:
_response = response.Response()
self.response = _response.error_response(repr(e), [])
return self.response
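The constructor's header validation can be exercised in isolation; `check_headers` below is a hypothetical standalone mirror of that check, not part of the module:

```python
def check_headers(headers):
    # Mirrors HttpRequest.__init__: headers must be a dict (or None)
    if headers is not None and not isinstance(headers, dict):
        raise ValueError('The headers attribute must be a dictionary')
    return headers


ok = check_headers({'Content-Type': 'application/json'})
try:
    check_headers(['not', 'a', 'dict'])
    rejected = False
except ValueError:
    rejected = True
```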
40cfc0b95587c5e1669ebe86a286bb9b8a8a9555 | 1,696 | py | Python | tests/integration/scalar_fields/test_eigenvalues_scalar_fields.py | bernssolg/pyntcloud-master | 84cf000b7a7f69a2c1b36f9624f05f65160bf992 | [
"MIT"
] | 1,142 | 2016-10-10T08:55:30.000Z | 2022-03-30T04:46:16.000Z | tests/integration/scalar_fields/test_eigenvalues_scalar_fields.py | bernssolg/pyntcloud-master | 84cf000b7a7f69a2c1b36f9624f05f65160bf992 | [
"MIT"
] | 195 | 2016-10-10T08:30:37.000Z | 2022-02-17T12:51:17.000Z | tests/integration/scalar_fields/test_eigenvalues_scalar_fields.py | bernssolg/pyntcloud-master | 84cf000b7a7f69a2c1b36f9624f05f65160bf992 | [
"MIT"
] | 215 | 2017-02-28T00:50:29.000Z | 2022-03-22T17:01:31.000Z | import pytest
import numpy as np
@pytest.mark.parametrize("scalar_field_name", [
"anisotropy",
"planarity"
])
@pytest.mark.usefixtures("pyntcloud_and_eigenvalues")
def test_eigen_values_scalar_fields_where_coplanar_points_have_value_of_1(pyntcloud_and_eigenvalues, scalar_field_name):
cloud, ev = pyntcloud_and_eigenvalues
with np.errstate(divide='ignore', invalid='ignore'):
scalar_field = cloud.add_scalar_field(
scalar_field_name,
ev=ev)
scalar_field_values = cloud.points[scalar_field].values
assert all(scalar_field_values[:5] == 1)
assert scalar_field_values[5] < 1
@pytest.mark.parametrize("scalar_field_name", [
"curvature",
"eigenentropy",
"linearity",
"omnivariance",
"sphericity"
])
@pytest.mark.usefixtures("pyntcloud_and_eigenvalues")
def test_eigen_values_scalar_fields_where_coplanar_points_have_value_of_0(pyntcloud_and_eigenvalues, scalar_field_name):
cloud, ev = pyntcloud_and_eigenvalues
with np.errstate(divide='ignore', invalid='ignore'):
scalar_field = cloud.add_scalar_field(
scalar_field_name,
ev=ev)
scalar_field_values = cloud.points[scalar_field].values
assert all(scalar_field_values[:5] == 0)
assert scalar_field_values[5] > 0
@pytest.mark.usefixtures("pyntcloud_and_eigenvalues")
def test_eigen_sum_values(pyntcloud_and_eigenvalues):
cloud, ev = pyntcloud_and_eigenvalues
with np.errstate(divide='ignore', invalid='ignore'):
scalar_field = cloud.add_scalar_field(
"eigen_sum",
ev=ev)
scalar_field_values = cloud.points[scalar_field].values
assert all(scalar_field_values > 0)
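These tests rely on eigenvalue-based shape features of local point neighborhoods. A sketch of the underlying math, using the commonly used definitions anisotropy = (λ1 − λ3)/λ1, planarity = (λ2 − λ3)/λ1, and sphericity = λ3/λ1 (assumed from point-cloud literature, not taken from pyntcloud's source), for a set of coplanar points:

```python
import numpy as np

# Four coplanar points forming a square in the z = 0 plane
points = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], dtype=float)
centered = points - points.mean(axis=0)
cov = centered.T @ centered / len(points)

ev = np.sort(np.linalg.eigvalsh(cov))[::-1]  # lambda1 >= lambda2 >= lambda3

anisotropy = (ev[0] - ev[2]) / ev[0]
planarity = (ev[1] - ev[2]) / ev[0]
sphericity = ev[2] / ev[0]
```

For perfectly coplanar points the smallest eigenvalue vanishes, which is why the tests above expect 1 for anisotropy/planarity and 0 for sphericity.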
# Source: python/nbdb/store/tests/test_rollups.py (repo: rubrikinc/nbdb2, Apache-2.0 license)
"""
Unittest for SparseSeriesData class
"""
import os
import time
from unittest import TestCase
from unittest.mock import Mock
from nbdb.common.context import Context
from nbdb.common.data_point import DataPoint, MODE_ROLLUP
from nbdb.common.data_point import MISSING_POINT_VALUE, TOMBSTONE_VALUE
from nbdb.common.metric_parsers import CLUSTER_TAG_KEY
from nbdb.common.telemetry import Telemetry
from nbdb.config.settings import Settings
from nbdb.schema.schema import Schema, ROLLUP_LAST, ROLLUP_MAX, ROLLUP_MEAN
from nbdb.schema.schema import ROLLUP_SUM
from nbdb.store.rollups import Rollups
from nbdb.store.sparse_algo_selector import SparseAlgoSelector
from nbdb.store.sparse_series_stats import SparseSeriesStats
class TestRollups(TestCase):
"""
Test the write path by writing data and verifying that only sparse
points that conform to appropriate smoothing are written to the db
"""
def setUp(self) -> None:
"""
Setup a partitioned mocked sparse-time-series for testing
:return:
"""
Settings.load_yaml_settings(os.path.dirname(__file__) +
'/test_settings.yaml')
schema = Schema.load_from_file(
os.path.dirname(__file__) + '/test_schema.yaml')
self.context = Context(schema=schema)
Telemetry.inst = Mock()
sparse_algo_selector = SparseAlgoSelector(
self.context,
Settings.inst.sparse_store.sparse_telemetry,
Settings.inst.realtime_metric_consumer,
Mock(),
MODE_ROLLUP
)
self.mock_sparse_store = Mock()
self.rollups = Rollups(rollup_settings=Settings.inst.sparse_store.rollups,
schema=schema,
sparse_store=self.mock_sparse_store,
sparse_algo_selector=sparse_algo_selector)
self._unit_test_real_time_clock = 0
def test_append_metric_with_mean(self) -> None:
"""
Test writing a metric to sparse_series_writer
"""
# create the stat object
stat = SparseSeriesStats()
self.mock_sparse_store.reset_mock()
se = 3600
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 100)
self.rollups.add(dp1, stat, replay_mode=False)
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(se, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
self.assertEqual(0, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
dp2 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 200, int(time.time()), 200)
self.rollups.add(dp2, stat, replay_mode=False)
self.assertEqual(100*200,
stat.get_rollup_intermediate_value(3600))
self.assertEqual(100*200,
stat.get_rollup_intermediate_value(7200))
self.assertEqual(se, stat.get_check_point(3600))
self.assertEqual(0, stat.get_check_point(7200))
self.assertEqual(0, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp2.value)
stat.set_window_epoch(0, dp2.epoch)
dp3 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
3700 + se, int(time.time()), 300)
self.rollups.add(dp3, stat, replay_mode=False)
self.assertEqual(200*(3700-3600),
stat.get_rollup_intermediate_value(3600))
self.assertEqual(200*(3700 + se - 7200),
stat.get_rollup_intermediate_value(7200))
self.assertEqual((100*200 + 200*(3600-200))/3600,
stat.get_window_value(3600))
self.assertEqual((100*200 + 200*(7200-se-200))/(7200-se),
stat.get_window_value(7200))
self.assertEqual(se + 3600, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
# 3600 & 7200 window values must be written by now
self.assertEqual(2, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp3.value)
stat.set_window_epoch(0, dp3.epoch)
# dp4 valid for 1 additional hour
dp4 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
7200 + se, int(time.time()), 400)
self.rollups.add(dp4, stat, replay_mode=False)
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(200*(3700 + se - 7200) + 3500*300,
stat.get_rollup_intermediate_value(7200))
self.assertEqual((100*200 + 3500*300)/3600,
stat.get_window_value(3600))
self.assertEqual((100*200 + 200*(7200-se-200))/(7200-se),
stat.get_window_value(7200))
self.assertEqual(se + 7200, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
# One additional value will be written for 3600 window
self.assertEqual(3, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp4.value)
stat.set_window_epoch(0, dp4.epoch)
# dp5 comes in 1 hour after dp4
dp5 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
10800 + se, int(time.time()), 500)
self.rollups.add(dp5, stat, replay_mode=False)
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(400, stat.get_window_value(3600))
self.assertEqual((100*200 + 3500*300 + 3600*400)/7200,
stat.get_window_value(7200))
self.assertEqual(14400, stat.get_check_point(3600))
self.assertEqual(14400, stat.get_check_point(7200))
# Values for both 3600 & 7200 get written
self.assertEqual(5, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp5.value)
stat.set_window_epoch(0, dp5.epoch)
# We get a momentary blip ie. points go missing after dp5. And about 50
# mins after dp5 value was received, dp6 appears. SparseSeriesWriter
# will insert a MISSING_POINT_VALUE 10 mins after dp5
dp_missing = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
10800 + 600 + se,
int(time.time()), MISSING_POINT_VALUE,
is_special_value=True)
self.rollups.add(dp_missing, stat, replay_mode=False)
self.assertEqual(500*600, stat.get_rollup_intermediate_value(3600))
self.assertEqual(500*600, stat.get_rollup_intermediate_value(7200))
self.assertEqual(400, stat.get_window_value(3600))
self.assertEqual((100*200 + 3500*300 + 3600*400)/7200,
stat.get_window_value(7200))
self.assertEqual(14400, stat.get_check_point(3600))
self.assertEqual(14400, stat.get_check_point(7200))
self.assertEqual(5, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp_missing.value)
stat.set_window_epoch(0, dp_missing.epoch)
dp6 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
10800 + 3000 + se, int(time.time()), 100)
self.rollups.add(dp6, stat, replay_mode=False)
self.assertEqual(500*600, stat.get_rollup_intermediate_value(3600))
self.assertEqual(500*600, stat.get_rollup_intermediate_value(7200))
self.assertEqual(400, stat.get_window_value(3600))
self.assertEqual((100*200 + 3500*300 + 3600*400)/7200,
stat.get_window_value(7200))
self.assertEqual(14400, stat.get_check_point(3600))
self.assertEqual(14400, stat.get_check_point(7200))
self.assertEqual(5, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp6.value)
stat.set_window_epoch(0, dp6.epoch)
# Add another datapoint to complete previous window
dp7 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
14400 + se, int(time.time()), 600)
self.rollups.add(dp7, stat, replay_mode=False)
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
# dp5 was active for 10m and dp6 was active for 10m till dp7 arrived
self.assertEqual(500 * 600 + 100 * 600,
stat.get_rollup_intermediate_value(7200))
# For the 3600 window, we only had 2 non-NULL datapoints dp5 & dp6
# active for 10m each. So the avg should be 300
self.assertEqual((500+100)/2, stat.get_window_value(3600))
self.assertEqual((100*200 + 3500*300 + 3600*400)/7200,
stat.get_window_value(7200))
self.assertEqual(18000, stat.get_check_point(3600))
self.assertEqual(14400, stat.get_check_point(7200))
# One additional 3600 window value should be written
self.assertEqual(6, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp7.value)
stat.set_window_epoch(0, dp7.epoch)
# Add a tombstone value indicating the series is dead. We should
# generate extra datapoints for windows with intermediate computations,
# and generate tombstones for each window
dp_tombstone = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
14400 + 600 + se,
int(time.time()), TOMBSTONE_VALUE,
is_special_value=True)
self.rollups.add(dp_tombstone, stat, replay_mode=False)
# We should be storing TOMBSTONE_VALUE for next checkpoint + window
self.assertEqual(TOMBSTONE_VALUE, stat.get_window_value(3600))
self.assertEqual(18000 + 3600 + 3600, stat.get_window_epoch(3600))
self.assertEqual(TOMBSTONE_VALUE, stat.get_window_value(7200))
self.assertEqual(14400 + 7200 + 7200, stat.get_window_epoch(7200))
# Four more writes should be generated: 2 writes for the intermediate
# state at next_checkpoint & 2 writes for tombstone values at
# next_checkpoint + 10m
self.assertEqual(10, self.mock_sparse_store.write.call_count)
# 3600 intermediate datapoint
self.assertEqual(18000 + 3600,
self.mock_sparse_store.mock_calls[-4][1][0].epoch)
self.assertEqual((600*600)/600,
self.mock_sparse_store.mock_calls[-4][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-4][1][0].datasource.endswith(
'3600'))
# 3600 tombstone
self.assertEqual(18000 + 3600 + 3600,
self.mock_sparse_store.mock_calls[-3][1][0].epoch)
self.assertEqual(TOMBSTONE_VALUE,
self.mock_sparse_store.mock_calls[-3][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-3][1][0].datasource.endswith(
'3600'))
# 7200 intermediate datapoint
self.assertEqual(14400 + 7200,
self.mock_sparse_store.mock_calls[-2][1][0].epoch)
self.assertEqual((500*600 + 100*600 + 600*600)/1800,
self.mock_sparse_store.mock_calls[-2][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-2][1][0].datasource.endswith(
'7200'))
# 7200 tombstone
self.assertEqual(14400 + 7200 + 7200,
self.mock_sparse_store.mock_calls[-1][1][0].epoch)
self.assertEqual(TOMBSTONE_VALUE,
self.mock_sparse_store.mock_calls[-1][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-1][1][0].datasource.endswith(
'7200'))
stat.set_window_value(0, dp_tombstone.value)
stat.set_window_epoch(0, dp_tombstone.epoch)
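The mean-rollup assertions above hand-compute time-weighted averages. A small standalone helper (hypothetical, mirroring the arithmetic rather than the Rollups API) makes the weighting explicit:

```python
def time_weighted_mean(points, window_end):
    """points: list of (epoch, value); each value holds until the next epoch."""
    total = 0.0
    covered = 0
    for (e0, v), (e1, _) in zip(points, points[1:] + [(window_end, None)]):
        total += v * (e1 - e0)   # value weighted by how long it was active
        covered += e1 - e0
    return total / covered


# dp1=100 @3600 and dp2=200 @3800 inside the [3600, 7200) window matches
# the (100*200 + 200*(3600-200)) / 3600 expression asserted above
mean_3600 = time_weighted_mean([(3600, 100), (3800, 200)], 7200)
```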
def test_missing_points_spanning_mul_windows(self) -> None:
"""
Test handling of missing points spanning multiple windows
"""
for rollup_function in [ROLLUP_MEAN, ROLLUP_MAX, ROLLUP_LAST,
ROLLUP_SUM]:
self.context.schema.get_rollup_function = lambda x: rollup_function
# create the stat object
stat = SparseSeriesStats()
se = 3600
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 100)
self.rollups.add(dp1, stat, replay_mode=False)
# first time stats is not updated
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
dp2 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 200, int(time.time()), 200)
self.rollups.add(dp2, stat, replay_mode=False)
if rollup_function == ROLLUP_MEAN:
intermediate = 100*200
elif rollup_function == ROLLUP_MAX:
intermediate = 100
elif rollup_function == ROLLUP_LAST:
intermediate = 100
else:
# Rollup sum
intermediate = 100*200
self.assertEqual(intermediate,
stat.get_rollup_intermediate_value(3600))
self.assertEqual(intermediate,
stat.get_rollup_intermediate_value(7200))
self.assertEqual(se, stat.get_check_point(3600))
self.assertEqual(0, stat.get_check_point(7200))
stat.set_window_value(0, dp2.value)
stat.set_window_epoch(0, dp2.epoch)
# Simulate a blip which spans multiple windows. Insert a missing
# point marker 10m after dp2 and another datapoint dp3 multiple
# windows later
dp_missing = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 200 + 600, int(time.time()),
MISSING_POINT_VALUE, is_special_value=True)
self.rollups.add(dp_missing, stat, replay_mode=False)
stat.set_window_value(0, dp_missing.value)
stat.set_window_epoch(0, dp_missing.epoch)
            # dp3 is 4 windows away from dp2 for the 3600 rollup and 2 windows
# away for the 7200 rollup
dp3 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 14400 + 200, int(time.time()), 300)
self.rollups.add(dp3, stat, replay_mode=False)
# Verify that the last window values for both windows were the
# missing point marker
self.assertEqual(18000, stat.get_check_point(3600))
self.assertEqual(MISSING_POINT_VALUE, stat.get_window_value(3600))
self.assertEqual(14400, stat.get_check_point(7200))
self.assertEqual(MISSING_POINT_VALUE, stat.get_window_value(7200))
def test_append_metric_with_last(self) -> None:
"""
Test writing a metric to sparse_series_writer
"""
self.context.schema.get_rollup_function = lambda x: ROLLUP_LAST
# create the stat object
stat = SparseSeriesStats()
se = 3600
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 100)
self.rollups.add(dp1, stat, replay_mode=False)
# first time stats is not updated
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
dp2 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 200, int(time.time()), 200)
self.rollups.add(dp2, stat, replay_mode=False)
self.assertEqual(100, stat.get_rollup_intermediate_value(3600))
self.assertEqual(100, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600))
self.assertEqual(0, stat.get_check_point(7200))
stat.set_window_value(0, dp2.value)
stat.set_window_epoch(0, dp2.epoch)
dp3 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
3700 + se, int(time.time()), 300)
self.rollups.add(dp3, stat, replay_mode=False)
self.assertEqual(200, stat.get_rollup_intermediate_value(3600))
self.assertEqual(200, stat.get_rollup_intermediate_value(7200))
self.assertEqual(200, stat.get_window_value(3600))
self.assertEqual(200, stat.get_window_value(7200))
self.assertEqual(se + 3600, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
stat.set_window_value(0, dp3.value)
stat.set_window_epoch(0, dp3.epoch)
# dp4 valid for 1 additional hour
dp4 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
7200 + se, int(time.time()), 400)
self.rollups.add(dp4, stat, replay_mode=False)
self.assertEqual(300, stat.get_rollup_intermediate_value(3600))
self.assertEqual(300, stat.get_rollup_intermediate_value(7200))
self.assertEqual(300, stat.get_window_value(3600))
self.assertEqual(200, stat.get_window_value(7200))
self.assertEqual(se + 7200, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
stat.set_window_value(0, dp4.value)
stat.set_window_epoch(0, dp4.epoch)
# dp5 valid for 1 additional hour
dp5 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
10800 + se, int(time.time()), 500)
self.rollups.add(dp5, stat, replay_mode=False)
self.assertEqual(400, stat.get_rollup_intermediate_value(3600))
self.assertEqual(400, stat.get_rollup_intermediate_value(7200))
self.assertEqual(400, stat.get_window_value(3600))
self.assertEqual(400, stat.get_window_value(7200))
self.assertEqual(14400, stat.get_check_point(3600))
self.assertEqual(14400, stat.get_check_point(7200))
stat.set_window_value(0, dp5.value)
stat.set_window_epoch(0, dp5.epoch)
def test_append_metric_with_max(self) -> None:
"""
Test writing a metric to sparse_series_writer
"""
self.context.schema.get_rollup_function = lambda x: ROLLUP_MAX
# create the stat object
stat = SparseSeriesStats()
se = 3600
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 200)
self.rollups.add(dp1, stat, replay_mode=False)
# first time stats is not updated
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
dp2 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 200, int(time.time()), 100)
self.rollups.add(dp2, stat, replay_mode=False)
self.assertEqual(200, stat.get_rollup_intermediate_value(3600))
self.assertEqual(200, stat.get_rollup_intermediate_value(7200))
self.assertEqual(se, stat.get_check_point(3600))
self.assertEqual(0, stat.get_check_point(7200))
stat.set_window_value(0, dp2.value)
stat.set_window_epoch(0, dp2.epoch)
dp3 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
3700 + se, int(time.time()), 300)
self.rollups.add(dp3, stat, replay_mode=False)
self.assertEqual(100, stat.get_rollup_intermediate_value(3600))
self.assertEqual(100, stat.get_rollup_intermediate_value(7200))
self.assertEqual(200, stat.get_window_value(3600))
self.assertEqual(200, stat.get_window_value(7200))
self.assertEqual(se + 3600, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
stat.set_window_value(0, dp3.value)
stat.set_window_epoch(0, dp3.epoch)
# dp4 valid for 1 additional hour
dp4 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
7200 + se, int(time.time()), 400)
self.rollups.add(dp4, stat, replay_mode=False)
self.assertEqual(300, stat.get_rollup_intermediate_value(3600))
self.assertEqual(300, stat.get_rollup_intermediate_value(7200))
self.assertEqual(300, stat.get_window_value(3600))
self.assertEqual(200, stat.get_window_value(7200))
self.assertEqual(se + 7200, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
stat.set_window_value(0, dp4.value)
stat.set_window_epoch(0, dp4.epoch)
# dp5 valid for 1 additional hour
dp5 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
10800 + se, int(time.time()), 500)
self.rollups.add(dp5, stat, replay_mode=False)
self.assertEqual(400, stat.get_rollup_intermediate_value(3600))
self.assertEqual(400, stat.get_rollup_intermediate_value(7200))
self.assertEqual(400, stat.get_window_value(3600))
self.assertEqual(400, stat.get_window_value(7200))
self.assertEqual(14400, stat.get_check_point(3600))
self.assertEqual(14400, stat.get_check_point(7200))
stat.set_window_value(0, dp5.value)
stat.set_window_epoch(0, dp5.epoch)
def test_rollup_with_long_sparse_point(self) -> None:
"""
Tests rollup when a single sparse point spans multiple rollup
windows
"""
# create the stat object
stat = SparseSeriesStats()
se = 3600
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 100)
self.rollups.add(dp1, stat, replay_mode=False)
# the first datapoint does not update the stats
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
# Next data point comes a day later, so dp1 needs to generate
# multiple roll up data points
dp2 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 86400 + 20, int(time.time()), 200)
self.rollups.add(dp2, stat, replay_mode=False)
self.assertEqual(20*100,
stat.get_rollup_intermediate_value(3600))
self.assertEqual(3620*100,
stat.get_rollup_intermediate_value(7200))
self.assertEqual(se + 86400, stat.get_check_point(3600, 0))
self.assertEqual(86400, stat.get_check_point(7200, 0))
self.assertEqual(100, stat.get_window_value(3600))
self.assertEqual(100, stat.get_window_value(7200))
stat.set_window_value(0, dp2.value)
stat.set_window_epoch(0, dp2.epoch)
def test_rollups_replay_mode(self) -> None:
"""
Test rollups during replay mode
"""
# Create the stat object
stat = SparseSeriesStats()
self.mock_sparse_store.reset_mock()
# First three datapoints will be received in replay mode
se = 3600
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 200)
self.rollups.add(dp1, stat, replay_mode=True)
# the first datapoint does not update the stats
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
self.assertEqual(0, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
stat.set_replay_mode(True)
dp2 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 200, int(time.time()), 200)
self.rollups.add(dp2, stat, replay_mode=True)
self.assertEqual(200*200,
stat.get_rollup_intermediate_value(3600))
self.assertEqual(200*200,
stat.get_rollup_intermediate_value(7200))
self.assertEqual(se, stat.get_check_point(3600))
self.assertEqual(0, stat.get_check_point(7200))
self.assertEqual(0, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp2.value)
stat.set_window_epoch(0, dp2.epoch)
stat.set_replay_mode(True)
dp3 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
3700 + se, int(time.time()), 200)
self.rollups.add(dp3, stat, replay_mode=True)
self.assertEqual(200*(3700-3600),
stat.get_rollup_intermediate_value(3600))
self.assertEqual(200*(3700 + se - 7200),
stat.get_rollup_intermediate_value(7200))
self.assertEqual((200*200 + 200*(3600-200))/3600,
stat.get_window_value(3600))
self.assertEqual((200*200 + 200*(7200-se-200))/(7200-se),
stat.get_window_value(7200))
self.assertEqual(se + 3600, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
# Even though window values were generated for both 3600 and 7200, no
# writes should be done since dp3 was received in replay mode
self.assertEqual(0, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp3.value)
stat.set_window_epoch(0, dp3.epoch)
stat.set_replay_mode(True)
# dp4 comes 1 hour after dp3 and is the first non-replay message.
# Even though the 3600 rollup value generated is the same as before, it
# will be written because it's the first value after transition from
# replay mode to non-replay mode
dp4 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
7300 + se, int(time.time()), 200)
self.rollups.add(dp4, stat, replay_mode=False)
self.assertEqual(200*100,
stat.get_rollup_intermediate_value(3600))
self.assertEqual(200*100 + 200*(7300-3700),
stat.get_rollup_intermediate_value(7200))
self.assertEqual((100*200 + 200*(7200-3700))/3600,
stat.get_window_value(3600))
self.assertEqual((200*100 + 200*(7200-3700))/(7200-se),
stat.get_window_value(7200))
self.assertEqual(se + 7200, stat.get_check_point(3600))
self.assertEqual(7200, stat.get_check_point(7200))
# Even though the 3600 rollup value is the same as the last time, it
# should still be written
self.assertEqual(1, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp4.value)
stat.set_window_epoch(0, dp4.epoch)
stat.set_replay_mode(False)
def test_single_datapoint_series(self) -> None:
"""
Test writing a series to sparse_series_writer which generates a single
datapoint in its lifetime
"""
# create the stat object
stat = SparseSeriesStats()
self.mock_sparse_store.reset_mock()
se = 3800
# 1st dp
dp1 = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se, int(time.time()), 100)
self.rollups.add(dp1, stat, replay_mode=False)
self.assertEqual(0, stat.get_rollup_intermediate_value(3600))
self.assertEqual(0, stat.get_rollup_intermediate_value(7200))
self.assertEqual(3600, stat.get_check_point(3600, 0))
self.assertEqual(0, stat.get_check_point(7200, 0))
self.assertEqual(0, self.mock_sparse_store.write.call_count)
stat.set_window_value(0, dp1.value)
stat.set_window_epoch(0, dp1.epoch)
# Add a tombstone value indicating the series is dead. We should
# generate extra datapoints for windows with intermediate computations,
# and generate tombstones for each window
dp_tombstone = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
se + 600,
int(time.time()), TOMBSTONE_VALUE,
is_special_value=True)
self.rollups.add(dp_tombstone, stat, replay_mode=False)
# We should be storing TOMBSTONE_VALUE for next checkpoint + window
self.assertEqual(TOMBSTONE_VALUE, stat.get_window_value(3600))
self.assertEqual(3600 + 2*3600, stat.get_window_epoch(3600))
self.assertEqual(TOMBSTONE_VALUE, stat.get_window_value(7200))
self.assertEqual(7200 + 7200, stat.get_window_epoch(7200))
# Four writes should be generated: 2 writes for the intermediate
# state at next_checkpoint & 2 writes for tombstone values at
# next_checkpoint + 10m
self.assertEqual(4, self.mock_sparse_store.write.call_count)
# 3600 intermediate datapoint
self.assertEqual(3600 + 3600,
self.mock_sparse_store.mock_calls[-4][1][0].epoch)
self.assertEqual((100*600)/600,
self.mock_sparse_store.mock_calls[-4][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-4][1][0].datasource.endswith(
'3600'))
# 3600 tombstone
self.assertEqual(3600 + 2*3600,
self.mock_sparse_store.mock_calls[-3][1][0].epoch)
self.assertEqual(TOMBSTONE_VALUE,
self.mock_sparse_store.mock_calls[-3][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-3][1][0].datasource.endswith(
'3600'))
# 7200 intermediate datapoint
self.assertEqual(7200,
self.mock_sparse_store.mock_calls[-2][1][0].epoch)
self.assertEqual((100*600)/600,
self.mock_sparse_store.mock_calls[-2][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-2][1][0].datasource.endswith(
'7200'))
# 7200 tombstone
self.assertEqual(7200 + 7200,
self.mock_sparse_store.mock_calls[-1][1][0].epoch)
self.assertEqual(TOMBSTONE_VALUE,
self.mock_sparse_store.mock_calls[-1][1][0].value)
self.assertTrue(
self.mock_sparse_store.mock_calls[-1][1][0].datasource.endswith(
'7200'))
stat.set_window_value(0, dp_tombstone.value)
stat.set_window_epoch(0, dp_tombstone.epoch)
def test_partial_windows_on_both_ends(self) -> None:
"""
Simulate a series whose start & end time result in partial windows at
the beginning & end. We should not drop the partial windows
"""
# create the stat object
stat = SparseSeriesStats()
self.mock_sparse_store.reset_mock()
se = 3800
# Use sum() for computing rollups
self.context.schema.get_rollup_function = lambda x: ROLLUP_SUM
# Generate multiple datapoints at regular 10m frequency
for epoch in range(se, 10800 + 1800, 600):
dp = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
epoch, int(time.time()), 100)
self.rollups.add(dp, stat, replay_mode=False)
stat.set_window_value(0, dp.value)
stat.set_window_epoch(0, dp.epoch)
# Add a tombstone value indicating the series is dead. We should
# generate extra datapoints for windows with intermediate computations,
# and generate tombstones for each window
dp_tombstone = DataPoint('m', 'f', {CLUSTER_TAG_KEY: '0'},
10800 + 1800,
int(time.time()), TOMBSTONE_VALUE,
is_special_value=True)
self.rollups.add(dp_tombstone, stat, replay_mode=False)
# We should see 4 writes for the 3600 window and 3 writes for the 7200
# window
self.assertEqual(7, self.mock_sparse_store.write.call_count)
rollup_dps_3600 = [call[1][0] for call in
self.mock_sparse_store.mock_calls
if call[1][0].datasource.endswith('3600')]
rollup_dps_7200 = [call[1][0] for call in
self.mock_sparse_store.mock_calls
if call[1][0].datasource.endswith('7200')]
# Verify that we generated the following values for the 3600
# datasource:
#
# Epoch 7200: 340000
# Epoch 10800: 360000
# Epoch 14400: 180000
# Epoch 18000: TOMBSTONE
self.assertEqual(rollup_dps_3600[0].epoch, 7200)
self.assertEqual(rollup_dps_3600[0].value, 100*(7200-se))
self.assertEqual(rollup_dps_3600[1].epoch, 7200 + 3600)
self.assertEqual(rollup_dps_3600[1].value, 100*3600)
self.assertEqual(rollup_dps_3600[2].epoch, 7200 + 2*3600)
self.assertEqual(rollup_dps_3600[2].value, 100*1800)
self.assertEqual(rollup_dps_3600[3].epoch, 7200 + 3*3600)
self.assertEqual(rollup_dps_3600[3].value, TOMBSTONE_VALUE)
# Verify that we generated the following values for the 7200
# datasource:
#
# Epoch 7200: 340000
# Epoch 14400: 540000
# Epoch 18000: TOMBSTONE
self.assertEqual(rollup_dps_7200[0].epoch, 7200)
self.assertEqual(rollup_dps_7200[0].value, 100*(7200-se))
self.assertEqual(rollup_dps_7200[1].epoch, 7200 + 7200)
self.assertEqual(rollup_dps_7200[1].value, 100*(10800+1800-7200))
self.assertEqual(rollup_dps_7200[2].epoch, 7200 + 2*7200)
self.assertEqual(rollup_dps_7200[2].value, TOMBSTONE_VALUE)
# fireant/tests/widgets/test_highcharts.py (repo: RobinPapke/fireant, license: Apache-2.0)
from unittest import TestCase
import pandas as pd
from fireant import CumSum, Rollup
from fireant.tests.dataset.mocks import (
ElectionOverElection,
day,
dimx0_metricx1_df,
dimx0_metricx2_df,
dimx1_date_df,
dimx1_date_meticx1_votes_df,
dimx1_date_operation_df,
dimx1_num_df,
dimx1_str_df,
dimx1_str_totals_df,
dimx2_date_index_str_df,
dimx2_date_num_df,
dimx2_date_str_df,
dimx2_date_str_ref_delta_df,
dimx2_date_str_ref_df,
dimx2_date_str_totals_df,
dimx2_date_str_totalsx2_df,
dimx2_str_num_df,
dimx2_category_index_str_df,
mock_dataset,
year,
)
from fireant.widgets.highcharts import DEFAULT_COLORS, HighCharts
class HighChartsLineChartTransformerTests(TestCase):
maxDiff = None
chart_class = HighCharts.LineSeries
chart_type = "line"
stacking = None
def test_dimx1_metricx1(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx1_date_df, mock_dataset, [mock_dataset.fields.timestamp], [])
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
(820454400000, 15220449),
(946684800000, 16662017),
(1072915200000, 19614932),
(1199145600000, 21294215),
(1325376000000, 20572210),
(1451606400000, 18310513),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_dimx1_year(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_date_df, mock_dataset, [year(mock_dataset.fields.timestamp)], []
)
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
(820454400000, 15220449),
(946684800000, 16662017),
(1072915200000, 19614932),
(1199145600000, 21294215),
(1325376000000, 20572210),
(1451606400000, 18310513),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_dimx1_metricx1_suffix(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.turnout))
.transform(dimx1_date_df, mock_dataset, [mock_dataset.fields.timestamp], [])
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Turnout",
"yAxis": "0",
"data": [
(820454400000, 50),
(946684800000, 50),
(1072915200000, 50),
(1199145600000, 50),
(1325376000000, 50),
(1451606400000, 50),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": "%",
"valueDecimals": 2,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_dimx1_metricx1_prefix_precision(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.wins_with_style))
.transform(dimx1_date_df, mock_dataset, [mock_dataset.fields.timestamp], [])
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Wins",
"yAxis": "0",
"data": [
(820454400000, 2),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 2),
],
"tooltip": {
"valuePrefix": "$",
"valueSuffix": None,
"valueDecimals": 0,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_single_operation_line_chart(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(CumSum(mock_dataset.fields.votes)))
.transform(
dimx1_date_operation_df,
mock_dataset,
[mock_dataset.fields.timestamp],
[],
)
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "CumSum(Votes)",
"yAxis": "0",
"data": [
(820454400000, 15220449),
(946684800000, 31882466),
(1072915200000, 51497398),
(1199145600000, 72791613),
(1325376000000, 93363823),
(1451606400000, 111674336),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_single_metric_with_uni_dim_line_chart(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
result = (
HighCharts(title="Time Series with Unique Dimension and Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx2_date_str_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Single Metric"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [(820454400000, 1076384)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_multi_metrics_single_axis_line_chart(self):
result = (
HighCharts(title="Time Series with Unique Dimension and Multiple Metrics")
.axis(
self.chart_class(mock_dataset.fields.votes),
self.chart_class(mock_dataset.fields.wins),
)
.transform(
dimx2_date_str_df,
mock_dataset,
[mock_dataset.fields.timestamp, mock_dataset.fields.state],
[],
)
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Multiple Metrics"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [(820454400000, 1076384)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#7798BF",
"dashStyle": "Solid",
"data": [
(820454400000, 2),
(946684800000, 0),
(1072915200000, 0),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Wins (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#AAEEEE",
"dashStyle": "Solid",
"data": [(820454400000, 0)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Wins (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#FF0066",
"dashStyle": "Solid",
"data": [
(820454400000, 0),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 0),
(1325376000000, 0),
(1451606400000, 2),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Wins (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_multi_metrics_multi_axis_line_chart(self):
result = (
HighCharts(
title="Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
)
.axis(self.chart_class(mock_dataset.fields.votes))
.axis(self.chart_class(mock_dataset.fields.wins))
.transform(
dimx2_date_str_df,
mock_dataset,
[mock_dataset.fields.timestamp, mock_dataset.fields.state],
[],
)
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "1",
"labels": {"style": {"color": "#7798BF"}},
"title": {"text": None},
"visible": True,
},
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [(820454400000, 1076384)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#7798BF",
"dashStyle": "Solid",
"data": [
(820454400000, 2),
(946684800000, 0),
(1072915200000, 0),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 0),
],
"marker": {"fillColor": "#7798BF", "symbol": "circle"},
"name": "Wins (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#AAEEEE",
"dashStyle": "Solid",
"data": [(820454400000, 0)],
"marker": {"fillColor": "#7798BF", "symbol": "square"},
"name": "Wins (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#FF0066",
"dashStyle": "Solid",
"data": [
(820454400000, 0),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 0),
(1325376000000, 0),
(1451606400000, 2),
],
"marker": {"fillColor": "#7798BF", "symbol": "diamond"},
"name": "Wins (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_multi_dim_with_totals_line_chart_and_empty_data(self):
dataframe = (
pd.DataFrame()
.from_dict(
{
"$timestamp": ["~~totals"],
"$political_party": ["~~totals"],
"$votes": [None],
"$wins": [None],
"$wins_with_style": [None],
"$turnout": [None],
}
)
.set_index(dimx2_date_str_totals_df.index.names)
)
result = (
HighCharts(
title="Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
)
.axis(self.chart_class(mock_dataset.fields.votes))
.axis(self.chart_class(mock_dataset.fields.wins))
.transform(
dataframe,
mock_dataset,
[mock_dataset.fields.timestamp, Rollup(mock_dataset.fields.state)],
[],
)
)
self.assertEqual(
result,
{
"title": {
"text": "Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "1",
"title": {"text": None},
"labels": {"style": {"color": "#55BF3B"}},
"visible": True,
},
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": "#DDDF0D"}},
"visible": True,
},
],
"annotations": [],
"colors": (
"#DDDF0D",
"#55BF3B",
"#DF5353",
"#7798BF",
"#AAEEEE",
"#FF0066",
"#EEAAEE",
"#DF5353",
"#7798BF",
"#AAEEEE",
),
"series": [
{
"type": self.chart_type,
"name": "Votes (Totals)",
"data": [],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"yAxis": "0",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"stacking": self.stacking,
"color": "#DDDF0D",
"dashStyle": "Solid",
},
{
"type": self.chart_type,
"name": "Wins (Totals)",
"data": [],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"yAxis": "1",
"marker": {"symbol": "circle", "fillColor": "#55BF3B"},
"stacking": self.stacking,
"color": "#55BF3B",
"dashStyle": "Solid",
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
},
)
def test_multi_dim_with_totals_line_chart(self):
result = (
HighCharts(
title="Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
)
.axis(self.chart_class(mock_dataset.fields.votes))
.axis(self.chart_class(mock_dataset.fields.wins))
.transform(
dimx2_date_str_totals_df,
mock_dataset,
[mock_dataset.fields.timestamp, Rollup(mock_dataset.fields.state)],
[],
)
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "1",
"labels": {"style": {"color": "#AAEEEE"}},
"title": {"text": None},
"visible": True,
},
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [(820454400000, 1076384)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#7798BF",
"dashStyle": "Solid",
"data": [
(820454400000, 15220449),
(946684800000, 16662017),
(1072915200000, 19614932),
(1199145600000, 21294215),
(1325376000000, 20572210),
(1451606400000, 18310513),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "triangle"},
"name": "Votes (Totals)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#AAEEEE",
"dashStyle": "Solid",
"data": [
(820454400000, 2),
(946684800000, 0),
(1072915200000, 0),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 0),
],
"marker": {"fillColor": "#AAEEEE", "symbol": "circle"},
"name": "Wins (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#FF0066",
"dashStyle": "Solid",
"data": [(820454400000, 0)],
"marker": {"fillColor": "#AAEEEE", "symbol": "square"},
"name": "Wins (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#EEAAEE",
"dashStyle": "Solid",
"data": [
(820454400000, 0),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 0),
(1325376000000, 0),
(1451606400000, 2),
],
"marker": {"fillColor": "#AAEEEE", "symbol": "diamond"},
"name": "Wins (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 2),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 2),
],
"marker": {"fillColor": "#AAEEEE", "symbol": "triangle"},
"name": "Wins (Totals)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
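The x-values in these series fixtures are Unix timestamps in milliseconds, one per election year (820454400000 is 1996-01-01 UTC). A small standalone helper, not part of the suite, that decodes them:

```python
from datetime import datetime, timezone


def epoch_ms_to_date(ms: int) -> str:
    """Render a millisecond Unix timestamp as an ISO-8601 date (UTC)."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).strftime("%Y-%m-%d")


# The six timestamps used throughout these fixtures map to the election years:
ELECTION_TIMESTAMPS = [
    820454400000,   # 1996-01-01
    946684800000,   # 2000-01-01
    1072915200000,  # 2004-01-01
    1199145600000,  # 2008-01-01
    1325376000000,  # 2012-01-01
    1451606400000,  # 2016-01-01
]
```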
def test_multi_dim_with_totals_on_first_dim_line_chart(self):
result = (
HighCharts(
title="Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
)
.axis(self.chart_class(mock_dataset.fields.votes))
.axis(self.chart_class(mock_dataset.fields.wins))
.transform(
dimx2_date_str_totalsx2_df,
mock_dataset,
[
Rollup(mock_dataset.fields.timestamp),
Rollup(mock_dataset.fields.state),
],
[],
)
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Multiple Metrics, Multi-Axis"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "1",
"labels": {"style": {"color": "#AAEEEE"}},
"title": {"text": None},
"visible": True,
},
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [(820454400000, 1076384)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#7798BF",
"dashStyle": "Solid",
"data": [
(820454400000, 15220449),
(946684800000, 16662017),
(1072915200000, 19614932),
(1199145600000, 21294215),
(1325376000000, 20572210),
(1451606400000, 18310513),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "triangle"},
"name": "Votes (Totals)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#AAEEEE",
"dashStyle": "Solid",
"data": [
(820454400000, 2),
(946684800000, 0),
(1072915200000, 0),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 0),
],
"marker": {"fillColor": "#AAEEEE", "symbol": "circle"},
"name": "Wins (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#FF0066",
"dashStyle": "Solid",
"data": [(820454400000, 0)],
"marker": {"fillColor": "#AAEEEE", "symbol": "square"},
"name": "Wins (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#EEAAEE",
"dashStyle": "Solid",
"data": [
(820454400000, 0),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 0),
(1325376000000, 0),
(1451606400000, 2),
],
"marker": {"fillColor": "#AAEEEE", "symbol": "diamond"},
"name": "Wins (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 2),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 2),
],
"marker": {"fillColor": "#AAEEEE", "symbol": "triangle"},
"name": "Wins (Totals)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
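The "(Totals)" series asserted above comes from the `Rollup` dimension, and its points are the pointwise sum of the per-party series at each timestamp. A standalone sanity sketch of that relationship, with data copied from the fixture (`sum_series` is a hypothetical helper for illustration, not part of the library):

```python
from collections import defaultdict

# Per-party vote series from the fixture, keyed by timestamp (ms).
democrat = {820454400000: 7579518, 946684800000: 8294949, 1072915200000: 9578189,
            1199145600000: 11803106, 1325376000000: 12424128, 1451606400000: 4871678}
independent = {820454400000: 1076384}
republican = {820454400000: 6564547, 946684800000: 8367068, 1072915200000: 10036743,
              1199145600000: 9491109, 1325376000000: 8148082, 1451606400000: 13438835}


def sum_series(*series):
    """Pointwise sum of several {timestamp: value} series; missing points count as 0."""
    totals = defaultdict(int)
    for s in series:
        for ts, value in s.items():
            totals[ts] += value
    return dict(totals)
```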

def test_uni_dim_with_ref_line_chart(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
references = [ElectionOverElection(mock_dataset.fields.timestamp)]
result = (
HighCharts(title="Time Series with Unique Dimension and Reference")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx2_date_str_ref_df, mock_dataset, dimensions, references)
)
self.assertEqual(
{
"title": {"text": "Time Series with Unique Dimension and Reference"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DDDF0D",
"dashStyle": "Dash",
"data": [
(820454400000, 7579518.0),
(946684800000, 6564547.0),
(1072915200000, 8367068.0),
(1199145600000, 10036743.0),
(1325376000000, 9491109.0),
(1451606400000, 8148082.0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes EoE (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Dash",
"data": [
(946684800000, 1076384.0),
(1072915200000, 8294949.0),
(1199145600000, 9578189.0),
(1325376000000, 11803106.0),
(1451606400000, 12424128.0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes EoE (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_uni_dim_with_ref_delta_line_chart(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
references = [ElectionOverElection(mock_dataset.fields.timestamp, delta=True)]
result = (
HighCharts(title="Time Series with Unique Dimension and Delta Reference")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx2_date_str_ref_delta_df, mock_dataset, dimensions, references
)
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Delta Reference"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
},
{
"id": "0_eoe_delta",
"labels": {"style": {"color": None}},
"opposite": True,
"title": {"text": "EoE Δ"},
"visible": True,
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DDDF0D",
"dashStyle": "Dash",
"data": [
(820454400000, 1014971.0),
(946684800000, -1802521.0),
(1072915200000, -1669675.0),
(1199145600000, 545634.0),
(1325376000000, 1343027.0),
(1451606400000, -5290753.0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes EoE Δ (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0_eoe_delta",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Dash",
"data": [
(946684800000, -7218565.0),
(1072915200000, -1283240.0),
(1199145600000, -2224917.0),
(1325376000000, -621022.0),
(1451606400000, 7552450.0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes EoE Δ (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0_eoe_delta",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_invisible_y_axis(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes), y_axis_visible=False)
.transform(dimx1_date_df, mock_dataset, [mock_dataset.fields.timestamp], [])
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": False,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
(820454400000, 15220449),
(946684800000, 16662017),
(1072915200000, 19614932),
(1199145600000, 21294215),
(1325376000000, 20572210),
(1451606400000, 18310513),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_ref_axes_set_to_same_visibility_as_parent_axis(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
references = [ElectionOverElection(mock_dataset.fields.timestamp, delta=True)]
result = (
HighCharts(title="Time Series with Unique Dimension and Delta Reference")
.axis(self.chart_class(mock_dataset.fields.votes), y_axis_visible=False)
.transform(
dimx2_date_str_ref_delta_df, mock_dataset, dimensions, references
)
)
self.assertEqual(
{
"title": {
"text": "Time Series with Unique Dimension and Delta Reference"
},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": False,
},
{
"id": "0_eoe_delta",
"labels": {"style": {"color": None}},
"opposite": True,
"title": {"text": "EoE Δ"},
"visible": False,
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"color": "#DDDF0D",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#DDDF0D",
"dashStyle": "Dash",
"data": [
(820454400000, 1014971.0),
(946684800000, -1802521.0),
(1072915200000, -1669675.0),
(1199145600000, 545634.0),
(1325376000000, 1343027.0),
(1451606400000, -5290753.0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "circle"},
"name": "Votes EoE Δ (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0_eoe_delta",
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"color": "#55BF3B",
"dashStyle": "Dash",
"data": [
(946684800000, -7218565.0),
(1072915200000, -1283240.0),
(1199145600000, -2224917.0),
(1325376000000, -621022.0),
(1451606400000, 7552450.0),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes EoE Δ (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0_eoe_delta",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)


class HighChartsBarChartTransformerTests(TestCase):
maxDiff = None
chart_class = HighCharts.BarSeries
chart_type = "bar"
stacking = None

def test_single_metric_bar_chart(self):
result = (
HighCharts(title="All Votes")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx0_metricx1_df, mock_dataset, [], [])
)
self.assertEqual(
{
"title": {"text": "All Votes"},
"xAxis": {"type": "category", "categories": ["All"], "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [{"x": 0, "y": 111674336}],
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"marker": {},
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_multi_metric_bar_chart(self):
result = (
HighCharts(title="Votes and Wins")
.axis(
self.chart_class(mock_dataset.fields.votes),
self.chart_class(mock_dataset.fields.wins),
)
.transform(dimx0_metricx2_df, mock_dataset, [], [])
)
self.assertEqual(
{
"title": {"text": "Votes and Wins"},
"xAxis": {"type": "category", "categories": ["All"], "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": "#DDDF0D"}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [{"x": 0, "y": 111674336}],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"marker": {},
"stacking": self.stacking,
},
{
"type": self.chart_type,
"name": "Wins",
"yAxis": "0",
"data": [{"x": 0, "y": 12}],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"marker": {},
"stacking": self.stacking,
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cat_dim_single_metric_bar_chart(self):
result = (
HighCharts("Votes and Wins")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_str_df, mock_dataset, [mock_dataset.fields.political_party], []
)
)
self.assertEqual(
{
"title": {"text": "Votes and Wins"},
"xAxis": {
"type": "category",
"categories": ["Democrat", "Independent", "Republican"],
"visible": True,
},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
{"x": 0, "y": 54551568},
{"x": 1, "y": 1076384},
{"x": 2, "y": 56046384},
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"marker": {},
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cat_dim_multi_metric_bar_chart(self):
result = (
HighCharts("Votes and Wins")
.axis(
self.chart_class(mock_dataset.fields.votes),
self.chart_class(mock_dataset.fields.wins),
)
.transform(
dimx1_str_df, mock_dataset, [mock_dataset.fields.political_party], []
)
)
self.assertEqual(
{
"title": {"text": "Votes and Wins"},
"xAxis": {
"type": "category",
"categories": ["Democrat", "Independent", "Republican"],
"visible": True,
},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": "#DDDF0D"}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
{"x": 0, "y": 54551568},
{"x": 1, "y": 1076384},
{"x": 2, "y": 56046384},
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"marker": {},
"stacking": self.stacking,
},
{
"type": self.chart_type,
"name": "Wins",
"yAxis": "0",
"data": [{"x": 0, "y": 6}, {"x": 1, "y": 0}, {"x": 2, "y": 6}],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"marker": {},
"stacking": self.stacking,
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cont_uni_dims_single_metric_bar_chart(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
result = (
HighCharts("Election Votes by State")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx2_date_str_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Election Votes by State"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [(820454400000, 1076384)],
"marker": {},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cont_uni_dims_multi_metric_single_axis_bar_chart(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
result = (
HighCharts(title="Election Votes by State")
.axis(
self.chart_class(mock_dataset.fields.votes),
self.chart_class(mock_dataset.fields.wins),
)
.transform(dimx2_date_str_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Election Votes by State"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [(820454400000, 1076384)],
"marker": {},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [
(820454400000, 2),
(946684800000, 0),
(1072915200000, 0),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 0),
],
"marker": {},
"name": "Wins (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [(820454400000, 0)],
"marker": {},
"name": "Wins (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [
(820454400000, 0),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 0),
(1325376000000, 0),
(1451606400000, 2),
],
"marker": {},
"name": "Wins (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cont_uni_dims_multi_metric_multi_axis_bar_chart(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
result = (
HighCharts(title="Election Votes by State")
.axis(self.chart_class(mock_dataset.fields.votes))
.axis(self.chart_class(mock_dataset.fields.wins))
.transform(dimx2_date_str_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Election Votes by State"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "1",
"labels": {"style": {"color": "#7798BF"}},
"title": {"text": None},
"visible": True,
},
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"marker": {},
"name": "Votes (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [(820454400000, 1076384)],
"marker": {},
"name": "Votes (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {},
"name": "Votes (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [
(820454400000, 2),
(946684800000, 0),
(1072915200000, 0),
(1199145600000, 2),
(1325376000000, 2),
(1451606400000, 0),
],
"marker": {},
"name": "Wins (Democrat)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"data": [(820454400000, 0)],
"marker": {},
"name": "Wins (Independent)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
{
"data": [
(820454400000, 0),
(946684800000, 2),
(1072915200000, 2),
(1199145600000, 0),
(1325376000000, 0),
(1451606400000, 2),
],
"marker": {},
"name": "Wins (Republican)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "1",
},
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cat_dim_with_totals_chart(self):
result = (
HighCharts(title="Categorical Dimension with Totals")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_str_totals_df,
mock_dataset,
[Rollup(mock_dataset.fields.political_party)],
[],
)
)
self.assertEqual(
{
"title": {"text": "Categorical Dimension with Totals"},
"xAxis": {
"categories": ["Democrat", "Independent", "Republican", "Totals"],
"type": "category",
"visible": True,
},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"legend": {"useHTML": True},
"series": [
{
"name": "Votes",
"yAxis": "0",
"data": [
{"x": 0, "y": 54551568},
{"x": 1, "y": 1076384},
{"x": 2, "y": 56046384},
{"x": 3, "y": 111674336},
],
"marker": {},
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"stacking": self.stacking,
}
],
"tooltip": {"enabled": True, "shared": True, "useHTML": True},
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_cat_uni_dim_with_missing_values(self):
df = (
dimx2_str_num_df.drop(("Democrat", 1))
.drop(("Republican", 2))
.drop(("Republican", 10))
)
dimensions = [
mock_dataset.fields.political_party,
mock_dataset.fields["candidate-id"],
]
result = (
HighCharts(title="Categorical Dimension with Totals")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Categorical Dimension with Totals"},
"xAxis": {
"categories": ["Democrat", "Independent", "Republican"],
"type": "category",
"visible": True,
},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"legend": {"useHTML": True},
"series": [
{
"data": [{"x": 0, "y": 8294949}],
"marker": {},
"name": "Votes (5)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 0, "y": 9578189}],
"marker": {},
"name": "Votes (6)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 0, "y": 24227234}],
"marker": {},
"name": "Votes (7)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 0, "y": 4871678}],
"marker": {},
"name": "Votes (11)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 1, "y": 1076384}],
"marker": {},
"name": "Votes (3)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 2, "y": 18403811}],
"marker": {},
"name": "Votes (4)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 2, "y": 9491109}],
"marker": {},
"name": "Votes (8)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
{
"data": [{"x": 2, "y": 8148082}],
"marker": {},
"name": "Votes (9)",
"stacking": self.stacking,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": self.chart_type,
"yAxis": "0",
},
],
"tooltip": {"enabled": True, "shared": True, "useHTML": True},
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_invisible_y_axis(self):
result = (
HighCharts(title="All Votes")
.axis(self.chart_class(mock_dataset.fields.votes), y_axis_visible=False)
.transform(dimx0_metricx1_df, mock_dataset, [], [])
)
self.assertEqual(
{
"title": {"text": "All Votes"},
"xAxis": {"type": "category", "categories": ["All"], "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": False,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [{"x": 0, "y": 111674336}],
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"marker": {},
"stacking": self.stacking,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)


class HighChartsColumnChartTransformerTests(HighChartsBarChartTransformerTests):
chart_class = HighCharts.ColumnSeries
chart_type = "column"


class HighChartsStackedBarChartTransformerTests(HighChartsBarChartTransformerTests):
maxDiff = None
chart_class = HighCharts.StackedBarSeries
chart_type = "bar"
stacking = "normal"


class HighChartsStackedColumnChartTransformerTests(HighChartsBarChartTransformerTests):
chart_class = HighCharts.StackedColumnSeries
chart_type = "column"
stacking = "normal"


class HighChartsAreaChartTransformerTests(HighChartsLineChartTransformerTests):
chart_class = HighCharts.AreaSeries
chart_type = "area"


class HighChartsAreaStackedChartTransformerTests(HighChartsAreaChartTransformerTests):
chart_class = HighCharts.AreaStackedSeries
stacking = "normal"


class HighChartsAreaPercentChartTransformerTests(HighChartsAreaChartTransformerTests):
chart_class = HighCharts.AreaPercentageSeries
stacking = "percent"
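The subclasses above parameterize a shared `TestCase` purely through class attributes (`chart_class`, `chart_type`, `stacking`), so every inherited test re-runs against the new series type. A minimal standalone sketch of that pattern, with hypothetical names (not part of this suite or of fireant):

```python
import unittest


class BaseSeriesTests(unittest.TestCase):
    # Subclasses override these to re-run every inherited test for a new type.
    chart_type = "line"
    stacking = None

    def render_series(self):
        # Stand-in for the HighCharts transform: just the per-series keys
        # that the assertions in this module compare against.
        return {"type": self.chart_type, "stacking": self.stacking}

    def test_series_config(self):
        config = self.render_series()
        self.assertEqual(self.chart_type, config["type"])
        self.assertEqual(self.stacking, config["stacking"])


class StackedColumnSeriesTests(BaseSeriesTests):
    chart_type = "column"
    stacking = "normal"
```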


class HighChartsPieChartTransformerTests(TestCase):
maxDiff = None
chart_class = HighCharts.PieSeries
chart_type = "pie"

def test_pie_chart_metricx1(self):
result = (
HighCharts(title="All Votes")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx0_metricx1_df, mock_dataset, [], [])
)
self.assertEqual(
{
"title": {"text": "All Votes"},
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"name": "Votes",
"type": "pie",
"data": [{"name": "Votes", "y": 111674336}],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">●</span> '
"{series.name}: <b>{point.y} ({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
}
],
"xAxis": {"type": "category", "categories": ["All"], "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)

def test_pie_chart_metricx2(self):
result = (
HighCharts(title="Votes and Wins")
.axis(
self.chart_class(mock_dataset.fields.votes),
self.chart_class(mock_dataset.fields.wins),
)
.transform(dimx0_metricx2_df, mock_dataset, [], [])
)
self.assertEqual(
{
"title": {"text": "Votes and Wins"},
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"name": "Votes",
"type": "pie",
"data": [{"name": "Votes", "y": 111674336}],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">●</span> '
"{series.name}: <b>{point.y} ({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
},
{
"name": "Wins",
"type": "pie",
"data": [{"name": "Wins", "y": 12}],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">●</span> '
"{series.name}: <b>{point.y} ({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
},
],
"xAxis": {"type": "category", "categories": ["All"], "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": "#DDDF0D"}},
"title": {"text": None},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
def test_pie_chart_dimx1_date(self):
result = (
HighCharts("Votes and Wins By Day")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx1_date_df, mock_dataset, [mock_dataset.fields.timestamp], [])
)
self.assertEqual(
{
"annotations": [],
"colors": DEFAULT_COLORS,
"legend": {"useHTML": True},
"series": [
{
"data": [
{"name": "1996-01-01", "y": 15220449},
{"name": "2000-01-01", "y": 16662017},
{"name": "2004-01-01", "y": 19614932},
{"name": "2008-01-01", "y": 21294215},
{"name": "2012-01-01", "y": 20572210},
{"name": "2016-01-01", "y": 18310513},
],
"name": "Votes",
"tooltip": {
"pointFormat": "<span "
'style="color:{point.color}">●</span> '
"{series.name}: <b>{point.y} "
"({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": "pie",
}
],
"title": {"text": "Votes and Wins By Day"},
"tooltip": {"enabled": True, "shared": True, "useHTML": True},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
},
result,
)
def test_pie_chart_dimx1_date_year(self):
result = (
HighCharts("Votes and Wins By Day")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_date_df, mock_dataset, [year(mock_dataset.fields.timestamp)], []
)
)
self.assertEqual(
{
"annotations": [],
"colors": DEFAULT_COLORS,
"legend": {"useHTML": True},
"series": [
{
"data": [
{"name": "1996", "y": 15220449},
{"name": "2000", "y": 16662017},
{"name": "2004", "y": 19614932},
{"name": "2008", "y": 21294215},
{"name": "2012", "y": 20572210},
{"name": "2016", "y": 18310513},
],
"name": "Votes",
"tooltip": {
"pointFormat": "<span "
'style="color:{point.color}">●</span> '
"{series.name}: <b>{point.y} "
"({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": "pie",
}
],
"title": {"text": "Votes and Wins By Day"},
"tooltip": {"enabled": True, "shared": True, "useHTML": True},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
},
result,
)
def test_pie_chart_dimx1_str(self):
result = (
HighCharts("Votes and Wins By Party")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_str_df, mock_dataset, [mock_dataset.fields.political_party], []
)
)
self.assertEqual(
{
"title": {"text": "Votes and Wins By Party"},
"tooltip": {"useHTML": True, "shared": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"name": "Votes",
"type": "pie",
"data": [
{"y": 54551568, "name": "Democrat"},
{"y": 1076384, "name": "Independent"},
{"y": 56046384, "name": "Republican"},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">●</span> '
"{series.name}: <b>{point.y} ({point.percentage:.1f}%)</b><br/>",
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
}
],
"yAxis": [
{
"id": "0",
"labels": {"style": {"color": None}},
"title": {"text": None},
"visible": True,
}
],
"xAxis": {
"type": "category",
"categories": ["Democrat", "Independent", "Republican"],
"visible": True,
},
"annotations": [],
"colors": DEFAULT_COLORS,
},
result,
)
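The `{point.percentage:.1f}` placeholder in these tooltips is resolved client-side by Highcharts, but the values it would render can be sanity-checked offline. A quick sketch (plain Python, no fireant or Highcharts dependency) of the per-series percentage math for the party slices in the expected payload above:

```python
# Slice values taken from the expected series data above.
data = [("Democrat", 54551568), ("Independent", 1076384), ("Republican", 56046384)]

total = sum(y for _, y in data)  # 111674336, matching the dimx0 "All Votes" total
percentages = {name: round(100 * y / total, 1) for name, y in data}
# percentages == {"Democrat": 48.8, "Independent": 1.0, "Republican": 50.2}
```

The three slices sum back to the single-metric total asserted in `test_pie_chart_metricx1`, which is why the dimx0 and dimx1 fixtures stay mutually consistent.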
def test_pie_chart_dimx1_num(self):
result = (
HighCharts(title="Votes and Wins By Election")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_num_df, mock_dataset, [mock_dataset.fields["candidate-id"]], []
)
)
self.assertEqual(
{
"title": {"text": "Votes and Wins By Election"},
"xAxis": {
"type": "category",
"categories": [
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"10",
"11",
],
"visible": True,
},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
"series": [
{
"name": "Votes",
"type": "pie",
"data": [
{"name": "1", "y": 7579518},
{"name": "2", "y": 6564547},
{"name": "3", "y": 1076384},
{"name": "4", "y": 18403811},
{"name": "5", "y": 8294949},
{"name": "6", "y": 9578189},
{"name": "7", "y": 24227234},
{"name": "8", "y": 9491109},
{"name": "9", "y": 8148082},
{"name": "10", "y": 13438835},
{"name": "11", "y": 4871678},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {'
"series.name}: <b>{point.y} ({"
"point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
},
result,
)
def test_pie_chart_dimx2_date_str(self):
dimensions = [
mock_dataset.fields.timestamp,
mock_dataset.fields.political_party,
]
result = (
HighCharts(title="Votes by Date, Party")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx2_date_str_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Votes by Date, Party"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
"series": [
{
"name": "Votes",
"type": "pie",
"data": [
{"name": "1996-01-01, Democrat", "y": 7579518},
{"name": "1996-01-01, Independent", "y": 1076384},
{"name": "1996-01-01, Republican", "y": 6564547},
{"name": "2000-01-01, Democrat", "y": 8294949},
{"name": "2000-01-01, Republican", "y": 8367068},
{"name": "2004-01-01, Democrat", "y": 9578189},
{"name": "2004-01-01, Republican", "y": 10036743},
{"name": "2008-01-01, Democrat", "y": 11803106},
{"name": "2008-01-01, Republican", "y": 9491109},
{"name": "2012-01-01, Democrat", "y": 12424128},
{"name": "2012-01-01, Republican", "y": 8148082},
{"name": "2016-01-01, Democrat", "y": 4871678},
{"name": "2016-01-01, Republican", "y": 13438835},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {series.name}: <b>{'
"point.y} ({"
"point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
},
result,
)
def test_pie_chart_dimx2_date_num(self):
dimensions = [
day(mock_dataset.fields.timestamp),
mock_dataset.fields["candidate-id"],
]
result = (
HighCharts(title="Election Votes by Day and Candidate ID")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx2_date_num_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Election Votes by Day and Candidate ID"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
"series": [
{
"name": "Votes",
"type": "pie",
"data": [
{"name": "1996-01-01, 1", "y": 7579518},
{"name": "1996-01-01, 2", "y": 6564547},
{"name": "1996-01-01, 3", "y": 1076384},
{"name": "2000-01-01, 4", "y": 8367068},
{"name": "2000-01-01, 5", "y": 8294949},
{"name": "2004-01-01, 4", "y": 10036743},
{"name": "2004-01-01, 6", "y": 9578189},
{"name": "2008-01-01, 7", "y": 11803106},
{"name": "2008-01-01, 8", "y": 9491109},
{"name": "2012-01-01, 7", "y": 12424128},
{"name": "2012-01-01, 9", "y": 8148082},
{"name": "2016-01-01, 10", "y": 13438835},
{"name": "2016-01-01, 11", "y": 4871678},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {series.name}: '
"<b>{point.y} ({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
},
result,
)
def test_pie_chart_dimx2_yearly_date_num(self):
dimensions = [
year(mock_dataset.fields.timestamp),
mock_dataset.fields["candidate-id"],
]
result = (
HighCharts(title="Election Votes by Day and Candidate ID")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(dimx2_date_num_df, mock_dataset, dimensions, [])
)
self.assertEqual(
{
"title": {"text": "Election Votes by Day and Candidate ID"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
"series": [
{
"name": "Votes",
"type": "pie",
"data": [
{"name": "1996, 1", "y": 7579518},
{"name": "1996, 2", "y": 6564547},
{"name": "1996, 3", "y": 1076384},
{"name": "2000, 4", "y": 8367068},
{"name": "2000, 5", "y": 8294949},
{"name": "2004, 4", "y": 10036743},
{"name": "2004, 6", "y": 9578189},
{"name": "2008, 7", "y": 11803106},
{"name": "2008, 8", "y": 9491109},
{"name": "2012, 7", "y": 12424128},
{"name": "2012, 9", "y": 8148082},
{"name": "2016, 10", "y": 13438835},
{"name": "2016, 11", "y": 4871678},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {series.name}: '
"<b>{point.y} ({point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
},
result,
)
def test_pie_chart_dimx2_date_str_reference(self):
dimensions = [mock_dataset.fields.timestamp, mock_dataset.fields.state]
references = [ElectionOverElection(mock_dataset.fields.timestamp)]
result = (
HighCharts(title="Election Votes by State")
.axis(
self.chart_class(mock_dataset.fields.votes),
self.chart_class(mock_dataset.fields.wins),
)
.transform(dimx2_date_str_df, mock_dataset, dimensions, references)
)
self.assertEqual(
{
"title": {"text": "Election Votes by State"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": "#DDDF0D"}},
"visible": True,
}
],
"annotations": [],
"colors": DEFAULT_COLORS,
"series": [
{
"name": "Votes",
"type": "pie",
"data": [
{"name": "1996-01-01, Democrat", "y": 7579518},
{"name": "1996-01-01, Independent", "y": 1076384},
{"name": "1996-01-01, Republican", "y": 6564547},
{"name": "2000-01-01, Democrat", "y": 8294949},
{"name": "2000-01-01, Republican", "y": 8367068},
{"name": "2004-01-01, Democrat", "y": 9578189},
{"name": "2004-01-01, Republican", "y": 10036743},
{"name": "2008-01-01, Democrat", "y": 11803106},
{"name": "2008-01-01, Republican", "y": 9491109},
{"name": "2012-01-01, Democrat", "y": 12424128},
{"name": "2012-01-01, Republican", "y": 8148082},
{"name": "2016-01-01, Democrat", "y": 4871678},
{"name": "2016-01-01, Republican", "y": 13438835},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {'
"series.name}: <b>{point.y} ({"
"point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
},
{
"name": "Votes EoE",
"type": "pie",
"data": [
{"name": "1996-01-01, Democrat", "y": 7579518},
{"name": "1996-01-01, Independent", "y": 1076384},
{"name": "1996-01-01, Republican", "y": 6564547},
{"name": "2000-01-01, Democrat", "y": 8294949},
{"name": "2000-01-01, Republican", "y": 8367068},
{"name": "2004-01-01, Democrat", "y": 9578189},
{"name": "2004-01-01, Republican", "y": 10036743},
{"name": "2008-01-01, Democrat", "y": 11803106},
{"name": "2008-01-01, Republican", "y": 9491109},
{"name": "2012-01-01, Democrat", "y": 12424128},
{"name": "2012-01-01, Republican", "y": 8148082},
{"name": "2016-01-01, Democrat", "y": 4871678},
{"name": "2016-01-01, Republican", "y": 13438835},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {series.name}: <b>{'
"point.y} ({"
"point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
},
{
"name": "Wins",
"type": "pie",
"data": [
{"name": "1996-01-01, Democrat", "y": 2},
{"name": "1996-01-01, Independent", "y": 0},
{"name": "1996-01-01, Republican", "y": 0},
{"name": "2000-01-01, Democrat", "y": 0},
{"name": "2000-01-01, Republican", "y": 2},
{"name": "2004-01-01, Democrat", "y": 0},
{"name": "2004-01-01, Republican", "y": 2},
{"name": "2008-01-01, Democrat", "y": 2},
{"name": "2008-01-01, Republican", "y": 0},
{"name": "2012-01-01, Democrat", "y": 2},
{"name": "2012-01-01, Republican", "y": 0},
{"name": "2016-01-01, Democrat", "y": 0},
{"name": "2016-01-01, Republican", "y": 2},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {series.name}: <b>{'
"point.y} ({"
"point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
},
{
"name": "Wins EoE",
"type": "pie",
"data": [
{"name": "1996-01-01, Democrat", "y": 2},
{"name": "1996-01-01, Independent", "y": 0},
{"name": "1996-01-01, Republican", "y": 0},
{"name": "2000-01-01, Democrat", "y": 0},
{"name": "2000-01-01, Republican", "y": 2},
{"name": "2004-01-01, Democrat", "y": 0},
{"name": "2004-01-01, Republican", "y": 2},
{"name": "2008-01-01, Democrat", "y": 2},
{"name": "2008-01-01, Republican", "y": 0},
{"name": "2012-01-01, Democrat", "y": 2},
{"name": "2012-01-01, Republican", "y": 0},
{"name": "2016-01-01, Democrat", "y": 0},
{"name": "2016-01-01, Republican", "y": 2},
],
"tooltip": {
"pointFormat": '<span style="color:{point.color}">\u25cf</span> {series.name}: <b>{'
"point.y} ({"
"point.percentage:.1f}%)</b><br/>",
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
},
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
},
result,
)
class HighChartsLineChartAnnotationTransformerTests(TestCase):
maxDiff = None
chart_class = HighCharts.LineSeries
chart_type = "line"
stacking = None
def test_dimx1_timeseries_with_annotation(self):
result = (
HighCharts(title="Time Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_date_meticx1_votes_df,
mock_dataset,
[mock_dataset.fields.timestamp],
[],
dimx2_date_index_str_df,
)
)
self.assertEqual(
{
"title": {"text": "Time Series, Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
(820454400000, 15220449),
(946684800000, 16662017),
(1072915200000, 19614932),
(1199145600000, 21294215),
(1325376000000, 20572210),
(1451606400000, 18310513),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [
{
"labels": [
{
"point": {"x": 820454400000, "xAxis": 0},
"text": "Bill Clinton, Bob Dole, Ross Perot, "
"Bill Clinton, Bob Dole, Ross Perot",
},
{
"point": {"x": 946684800000, "xAxis": 0},
"text": "George Bush, Al Gore, George Bush, Al Gore",
},
{
"point": {"x": 1072915200000, "xAxis": 0},
"text": "George Bush, John Kerry, George Bush, "
"John Kerry",
},
{
"point": {"x": 1199145600000, "xAxis": 0},
"text": "Barrack Obama, John McCain, Barrack "
"Obama, John McCain",
},
{
"point": {"x": 1325376000000, "xAxis": 0},
"text": "Barrack Obama, Mitt Romney, Barrack "
"Obama, Mitt Romney",
},
{
"point": {"x": 1451606400000, "xAxis": 0},
"text": "Donald Trump, Hillary Clinton, Donald "
"Trump, Hillary Clinton",
},
]
}
],
"colors": DEFAULT_COLORS,
},
result,
)
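The x values in the line-series tuples above (820454400000, 946684800000, ...) are Unix timestamps in milliseconds, the convention Highcharts uses for its `datetime` axis. A minimal check of that encoding, assuming the mock timestamps are midnight UTC on January 1 of each election year:

```python
from datetime import datetime, timezone

def election_ms(year):
    # Milliseconds since the Unix epoch for Jan 1 of `year`, midnight UTC.
    return int(datetime(year, 1, 1, tzinfo=timezone.utc).timestamp()) * 1000

# election_ms(1996) == 820454400000 and election_ms(2016) == 1451606400000,
# matching the first and last points of the "Votes" series above.
```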
def test_dimx2_timeseries_with_annotation(self):
result = (
HighCharts(title="Time Series with Dimension and Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx2_date_str_df,
mock_dataset,
[mock_dataset.fields.timestamp, mock_dataset.fields.political_party],
[],
dimx2_date_index_str_df,
)
)
self.assertEqual(
{
"title": {"text": "Time Series with Dimension and Single Metric"},
"xAxis": {"type": "datetime", "visible": True},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes (Democrat)",
"yAxis": "0",
"data": [
(820454400000, 7579518),
(946684800000, 8294949),
(1072915200000, 9578189),
(1199145600000, 11803106),
(1325376000000, 12424128),
(1451606400000, 4871678),
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
},
{
"color": "#55BF3B",
"dashStyle": "Solid",
"data": [(820454400000, 1076384)],
"marker": {"fillColor": "#DDDF0D", "symbol": "square"},
"name": "Votes (Independent)",
"stacking": None,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": "line",
"yAxis": "0",
},
{
"color": "#DF5353",
"dashStyle": "Solid",
"data": [
(820454400000, 6564547),
(946684800000, 8367068),
(1072915200000, 10036743),
(1199145600000, 9491109),
(1325376000000, 8148082),
(1451606400000, 13438835),
],
"marker": {"fillColor": "#DDDF0D", "symbol": "diamond"},
"name": "Votes (Republican)",
"stacking": None,
"tooltip": {
"valueDecimals": None,
"valuePrefix": None,
"valueSuffix": None,
},
"type": "line",
"yAxis": "0",
},
],
"annotations": [
{
"labels": [
{
"point": {"x": 820454400000, "xAxis": 0},
"text": "Bill Clinton, Bob Dole, Ross Perot, "
"Bill Clinton, Bob Dole, Ross Perot",
},
{
"point": {"x": 946684800000, "xAxis": 0},
"text": "George Bush, Al Gore, George Bush, Al Gore",
},
{
"point": {"x": 1072915200000, "xAxis": 0},
"text": "George Bush, John Kerry, George Bush, "
"John Kerry",
},
{
"point": {"x": 1199145600000, "xAxis": 0},
"text": "Barrack Obama, John McCain, Barrack "
"Obama, John McCain",
},
{
"point": {"x": 1325376000000, "xAxis": 0},
"text": "Barrack Obama, Mitt Romney, Barrack "
"Obama, Mitt Romney",
},
{
"point": {"x": 1451606400000, "xAxis": 0},
"text": "Donald Trump, Hillary Clinton, Donald "
"Trump, Hillary Clinton",
},
]
}
],
"colors": DEFAULT_COLORS,
},
result,
)
def test_dimx1_category_with_annotation(self):
result = (
HighCharts(title="Category Series, Single Metric")
.axis(self.chart_class(mock_dataset.fields.votes))
.transform(
dimx1_str_df,
mock_dataset,
[mock_dataset.fields.political_party],
[],
dimx2_category_index_str_df,
)
)
self.assertEqual(
{
"title": {"text": "Category Series, Single Metric"},
"xAxis": {
"categories": ["Democrat", "Independent", "Republican"],
"type": "category",
"visible": True,
},
"yAxis": [
{
"id": "0",
"title": {"text": None},
"labels": {"style": {"color": None}},
"visible": True,
}
],
"tooltip": {"shared": True, "useHTML": True, "enabled": True},
"legend": {"useHTML": True},
"series": [
{
"type": self.chart_type,
"name": "Votes",
"yAxis": "0",
"data": [
{"x": 0, "y": 54551568},
{"x": 1, "y": 1076384},
{"x": 2, "y": 56046384},
],
"tooltip": {
"valuePrefix": None,
"valueSuffix": None,
"valueDecimals": None,
},
"color": "#DDDF0D",
"marker": {"symbol": "circle", "fillColor": "#DDDF0D"},
"dashStyle": "Solid",
"stacking": self.stacking,
}
],
"annotations": [
{
"labels": [
{
"point": {"x": 0, "xAxis": 0},
"text": "Bill Clinton, Al Gore, John Kerry, "
"Barrack Obama, Barrack Obama, Hillary "
"Clinton, Bill Clinton, Al Gore, John "
"Kerry, Barrack Obama, Barrack Obama, "
"Hillary Clinton",
},
{
"point": {"x": 1, "xAxis": 0},
"text": "Ross Perot, Ross Perot",
},
{
"point": {"x": 2, "xAxis": 0},
"text": "Bob Dole, George Bush, George Bush, "
"John McCain, Mitt Romney, Donald Trump, "
"Bob Dole, George Bush, George Bush, "
"John McCain, Mitt Romney, Donald "
"Trump",
},
]
}
],
"colors": DEFAULT_COLORS,
},
result,
)
# ---------------------------------------------------------------------------
# File: azure-mgmt-batchai/azure/mgmt/batchai/operations/workspaces_operations.py
# Repo: JonathanGailliez/azure-sdk-for-python (MIT license)
# ---------------------------------------------------------------------------
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from msrestazure.azure_exceptions import CloudError
from msrest.polling import LROPoller, NoPolling
from msrestazure.polling.arm_polling import ARMPolling
from .. import models
class WorkspacesOperations(object):
"""WorkspacesOperations operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
:ivar api_version: Specifies the version of API used for this request. Constant value: "2018-05-01".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2018-05-01"
self.config = config
def list(
self, workspaces_list_options=None, custom_headers=None, raw=False, **operation_config):
"""Gets a list of Workspaces associated with the given subscription.
:param workspaces_list_options: Additional parameters for the
operation
:type workspaces_list_options:
~azure.mgmt.batchai.models.WorkspacesListOptions
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of Workspace
:rtype:
~azure.mgmt.batchai.models.WorkspacePaged[~azure.mgmt.batchai.models.Workspace]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
max_results = None
if workspaces_list_options is not None:
max_results = workspaces_list_options.max_results
def internal_paging(next_link=None, raw=False):
if not next_link:
# Construct URL
url = self.list.metadata['url']
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
if max_results is not None:
query_parameters['maxresults'] = self._serialize.query("max_results", max_results, 'int', maximum=1000, minimum=1)
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(
request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
return response
# Deserialize response
deserialized = models.WorkspacePaged(internal_paging, self._deserialize.dependencies)
if raw:
header_dict = {}
client_raw_response = models.WorkspacePaged(internal_paging, self._deserialize.dependencies, header_dict)
return client_raw_response
return deserialized
list.metadata = {'url': '/subscriptions/{subscriptionId}/providers/Microsoft.BatchAI/workspaces'}
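The `internal_paging` closure above follows the usual msrest paged-call shape: the first call builds the full URL and query string, and subsequent calls simply follow `next_link` until the service stops returning one. A dependency-free sketch of that control flow (the response table here is a made-up stand-in for the HTTP layer, not a real Batch AI endpoint):

```python
def iterate_pages(fetch):
    # fetch(next_link) -> (items, next_link); next_link=None means "first page".
    next_link = None
    while True:
        items, next_link = fetch(next_link)
        yield from items
        if next_link is None:
            return

# Hypothetical responses keyed by link.
responses = {
    None: (["workspace-a", "workspace-b"], "page-2"),
    "page-2": (["workspace-c"], None),
}
# list(iterate_pages(responses.__getitem__))
#   == ["workspace-a", "workspace-b", "workspace-c"]
```

`WorkspacePaged` wraps exactly this loop, deserializing each raw page into `Workspace` models as it is consumed.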
def list_by_resource_group(
self, resource_group_name, workspaces_list_by_resource_group_options=None, custom_headers=None, raw=False, **operation_config):
"""Gets a list of Workspaces within the specified resource group.
:param resource_group_name: Name of the resource group to which the
resource belongs.
:type resource_group_name: str
:param workspaces_list_by_resource_group_options: Additional
parameters for the operation
:type workspaces_list_by_resource_group_options:
~azure.mgmt.batchai.models.WorkspacesListByResourceGroupOptions
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of Workspace
:rtype:
~azure.mgmt.batchai.models.WorkspacePaged[~azure.mgmt.batchai.models.Workspace]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
max_results = None
if workspaces_list_by_resource_group_options is not None:
max_results = workspaces_list_by_resource_group_options.max_results
def internal_paging(next_link=None, raw=False):
if not next_link:
# Construct URL
url = self.list_by_resource_group.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', pattern=r'^[-\w\._]+$'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
if max_results is not None:
query_parameters['maxresults'] = self._serialize.query("max_results", max_results, 'int', maximum=1000, minimum=1)
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(
request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
return response
# Deserialize response
deserialized = models.WorkspacePaged(internal_paging, self._deserialize.dependencies)
if raw:
header_dict = {}
client_raw_response = models.WorkspacePaged(internal_paging, self._deserialize.dependencies, header_dict)
return client_raw_response
return deserialized
list_by_resource_group.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.BatchAI/workspaces'}
def _create_initial(
self, resource_group_name, workspace_name, location, tags=None, custom_headers=None, raw=False, **operation_config):
parameters = models.WorkspaceCreateParameters(location=location, tags=tags)
# Construct URL
url = self.create.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', pattern=r'^[-\w\._]+$'),
'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str', max_length=64, min_length=1, pattern=r'^[-\w_]+$'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(parameters, 'WorkspaceCreateParameters')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, stream=False, **operation_config)
if response.status_code not in [200, 202]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Workspace', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def create(
self, resource_group_name, workspace_name, location, tags=None, custom_headers=None, raw=False, polling=True, **operation_config):
"""Creates a Workspace.
:param resource_group_name: Name of the resource group to which the
resource belongs.
:type resource_group_name: str
:param workspace_name: The name of the workspace. Workspace names can
only contain a combination of alphanumeric characters along with dash
(-) and underscore (_). The name must be from 1 through 64 characters
long.
:type workspace_name: str
:param location: The region in which to create the Workspace.
:type location: str
:param tags: The user specified tags associated with the Workspace.
:type tags: dict[str, str]
:param dict custom_headers: headers that will be added to the request
:param bool raw: The poller return type is ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for personal polling strategy
:return: An instance of LROPoller that returns Workspace or
ClientRawResponse<Workspace> if raw==True
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.batchai.models.Workspace]
or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[~azure.mgmt.batchai.models.Workspace]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._create_initial(
resource_group_name=resource_group_name,
workspace_name=workspace_name,
location=location,
tags=tags,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
deserialized = self._deserialize('Workspace', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
create.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.BatchAI/workspaces/{workspaceName}'}
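# The three-way `polling` dispatch used by create() (and delete() below) can
# be sketched in isolation. ARMPolling and NoPolling here are simplified
# stand-ins, not the real msrestazure classes:

```python
class ARMPolling:
    """Stand-in for msrestazure's default ARM long-running poller."""
    def __init__(self, timeout):
        self.timeout = timeout


class NoPolling:
    """Stand-in for a poller that just returns the initial response."""


def select_polling_method(polling, lro_delay=30):
    # Mirrors the dispatch in the generated operations: a bool picks a
    # built-in strategy, anything else is used as the poller itself.
    if polling is True:
        return ARMPolling(lro_delay)
    if polling is False:
        return NoPolling()
    return polling
```

# Passing a pre-configured poller object (the `else` branch) is how callers
# plug in a custom retry/backoff strategy without subclassing the client.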
def update(
self, resource_group_name, workspace_name, tags=None, custom_headers=None, raw=False, **operation_config):
"""Updates properties of a Workspace.
:param resource_group_name: Name of the resource group to which the
resource belongs.
:type resource_group_name: str
:param workspace_name: The name of the workspace. Workspace names can
only contain a combination of alphanumeric characters along with dash
(-) and underscore (_). The name must be from 1 through 64 characters
long.
:type workspace_name: str
:param tags: The user specified tags associated with the Workspace.
:type tags: dict[str, str]
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Workspace or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.batchai.models.Workspace or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
parameters = models.WorkspaceUpdateParameters(tags=tags)
# Construct URL
url = self.update.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', pattern=r'^[-\w\._]+$'),
'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str', max_length=64, min_length=1, pattern=r'^[-\w_]+$'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(parameters, 'WorkspaceUpdateParameters')
# Construct and send request
request = self._client.patch(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Workspace', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
update.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.BatchAI/workspaces/{workspaceName}'}
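# The header assembly in update() (fixed Content-Type, an optional generated
# x-ms-client-request-id, caller headers merged on top, then accept-language)
# follows one fixed ordering. A standalone sketch, with the config object's
# fields replaced by plain parameters (an assumption for illustration):

```python
import uuid


def build_headers(custom_headers=None, generate_request_id=True,
                  accept_language='en-US'):
    # Same ordering as the generated code: base header first, then the
    # request id, then caller-supplied headers, then accept-language
    # (which therefore cannot be overridden by custom_headers).
    headers = {'Content-Type': 'application/json; charset=utf-8'}
    if generate_request_id:
        headers['x-ms-client-request-id'] = str(uuid.uuid1())
    if custom_headers:
        headers.update(custom_headers)
    if accept_language is not None:
        headers['accept-language'] = accept_language
    return headers
```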
def _delete_initial(
self, resource_group_name, workspace_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = self.delete.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', pattern=r'^[-\w\._]+$'),
'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str', max_length=64, min_length=1, pattern=r'^[-\w_]+$'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200, 202, 204]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def delete(
self, resource_group_name, workspace_name, custom_headers=None, raw=False, polling=True, **operation_config):
"""Deletes a Workspace.
:param resource_group_name: Name of the resource group to which the
resource belongs.
:type resource_group_name: str
:param workspace_name: The name of the workspace. Workspace names can
only contain a combination of alphanumeric characters along with dash
(-) and underscore (_). The name must be from 1 through 64 characters
long.
:type workspace_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: If True, the poller returns ClientRawResponse, the
direct response alongside the deserialized response
:param polling: True for ARMPolling, False for no polling, or a
polling object for a custom polling strategy
:return: An instance of LROPoller that returns None or
ClientRawResponse<None> if raw==True
:rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
~msrestazure.azure_operation.AzureOperationPoller[~msrest.pipeline.ClientRawResponse[None]]
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._delete_initial(
resource_group_name=resource_group_name,
workspace_name=workspace_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
def get_long_running_output(response):
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
lro_delay = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
if polling is True: polling_method = ARMPolling(lro_delay, **operation_config)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
delete.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.BatchAI/workspaces/{workspaceName}'}
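# Every operation above formats its metadata['url'] template with serialized
# path arguments. A minimal analogue of that substitution plus the
# constraint checks msrest applies (pattern, min/max length) might look
# like this; it is a sketch, not the real Serializer.url implementation:

```python
import re


def serialize_path(value, pattern=None, min_length=None, max_length=None):
    # Rough analogue of msrest's URL-argument constraint checks.
    if pattern and not re.match(pattern, value):
        raise ValueError('pattern violated: %r' % value)
    if min_length is not None and len(value) < min_length:
        raise ValueError('value shorter than %d' % min_length)
    if max_length is not None and len(value) > max_length:
        raise ValueError('value longer than %d' % max_length)
    return value


def format_url(template, **path_args):
    # str.format handles the {placeholder} syntax of the metadata URLs.
    return template.format(**path_args)


url = format_url(
    '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}'
    '/providers/Microsoft.BatchAI/workspaces/{workspaceName}',
    subscriptionId='0000', resourceGroupName='rg',
    workspaceName=serialize_path('ws', pattern=r'^[-\w_]+$',
                                 min_length=1, max_length=64))
```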
def get(
self, resource_group_name, workspace_name, custom_headers=None, raw=False, **operation_config):
"""Gets information about a Workspace.
:param resource_group_name: Name of the resource group to which the
resource belongs.
:type resource_group_name: str
:param workspace_name: The name of the workspace. Workspace names can
only contain a combination of alphanumeric characters along with dash
(-) and underscore (_). The name must be from 1 through 64 characters
long.
:type workspace_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: Workspace or ClientRawResponse if raw=true
:rtype: ~azure.mgmt.batchai.models.Workspace or
~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
# Construct URL
url = self.get.metadata['url']
path_format_arguments = {
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str', pattern=r'^[-\w\._]+$'),
'workspaceName': self._serialize.url("workspace_name", workspace_name, 'str', max_length=64, min_length=1, pattern=r'^[-\w_]+$'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Workspace', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
get.metadata = {'url': '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.BatchAI/workspaces/{workspaceName}'}
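# The shared error path in these operations (raise CloudError carrying the
# x-ms-request-id header when the status code is unexpected) reduces to a
# few lines. CloudError below is a simplified stand-in, not the
# msrestazure class:

```python
class CloudError(Exception):
    """Stand-in for msrestazure.azure_exceptions.CloudError."""
    def __init__(self, status_code, request_id=None):
        super().__init__('request failed with status %s' % status_code)
        self.request_id = request_id


def check_status(status_code, headers, allowed=(200,)):
    # Mirrors the generated handling: get()/update() allow only 200,
    # while _delete_initial() passes allowed=(200, 202, 204).
    if status_code not in allowed:
        raise CloudError(status_code, headers.get('x-ms-request-id'))
```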
# ---------------------------------------------------------------------------
# File: tests/general_test.py
# Repo: mjirik/pythontemplate (license: MIT)
# ---------------------------------------------------------------------------
import pytest
import pythontemplate.moduleone
def test_hello():
pythontemplate.moduleone.print_hello("Vlkoslav")
# ---------------------------------------------------------------------------
# File: cloudify_rest_sdk/tests/test_sdk.py
# Repo: cloudify-incubator/cloudify-utilities-plugins-sdk (license: Apache-2.0)
# ---------------------------------------------------------------------------
########
# Copyright (c) 2014-2020 Cloudify Platform Ltd. All rights reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
import json
import mock
import six
from cloudify_rest_sdk import utility
from cloudify_common_sdk import exceptions
class TestSdk(unittest.TestCase):
def test_check_response(self):
parsed_json = json.loads('''{
"id": 10,
"name": "Clementina DuBuque",
"username": "Moriah.Stanton",
"email": "Rey.Padberg@karina.biz",
"address": {
"street": "Kattie Turnpike",
"suite": "Suite 198",
"city": "Lebsackbury",
"zipcode": "31428-2261",
"geo": {
"lat": "-38.2386",
"lng": "57.2232"
}
},
"phone": "024-648-3804",
"website": "ambrose.net",
"company": {
"name": "Hoeger LLC",
"catchPhrase": "Centralized empowering task-force",
"bs": "target end-to-end models"
}
}''')
# no checks defined, should be skipped
utility._check_response(parsed_json, [], True)
# correct check
utility._check_response(parsed_json, [['id', '10']], True)
# incorrect data / Recoverable: the filter does not match the data
with self.assertRaises(
exceptions.RecoverableResponseException
) as error:
utility._check_response(parsed_json, [['id', '22']], True)
self.assertEqual(
"{0}".format(error.exception),
'Trying one more time...\nResponse value:10 does not match '
'regexp: 22 from response_expectation')
# incorrect data / NonRecoverable: the filter matches the data
with self.assertRaises(
exceptions.NonRecoverableResponseException
) as error:
utility._check_response(parsed_json, [['id', '10']], False)
self.assertEqual(
"{0}".format(error.exception),
'Giving up... \nResponse value: 10 matches regexp:10 from '
'nonrecoverable_response. ')
# correct data: the filter does not match the data
utility._check_response(parsed_json, [['id', '20']], False)
# wrong data structure
error_text = 'No key or index "id" in json [{\'id\': 40}]'
with self.assertRaises(
exceptions.ExpectationException
) as error:
utility._check_response([{'id': 40}], [['id', '20']], False)
self.assertEqual("{0}".format(error.exception), error_text)
with self.assertRaises(
exceptions.ExpectationException
) as error:
utility._check_response([{'id': 40}], [['id', '20']], True)
self.assertEqual("{0}".format(error.exception), error_text)
# wrong type passed as the expectations list
with self.assertRaises(
exceptions.WrongTemplateDataException
) as error:
utility._check_response([{'id': 40}], 'AAAA', True)
if six.PY3:
# python 3
self.assertEqual(
"{0}".format(error.exception),
"Response (recoverable) had to be list. Type <class 'str'> "
"not supported. ")
else:
# python 2
self.assertEqual(
"{0}".format(error.exception),
"Response (recoverable) had to be list. Type <type 'str'> "
"not supported. ")
with self.assertRaises(
exceptions.WrongTemplateDataException
) as error:
utility._check_response([{'id': 40}], 'AAAA', False)
if six.PY3:
# python 3
self.assertEqual(
"{0}".format(error.exception),
"Response (nonrecoverable) had to be list. Type <class 'str'> "
"not supported. ")
else:
# python 2
self.assertEqual(
"{0}".format(error.exception),
"Response (nonrecoverable) had to be list. Type <type 'str'> "
"not supported. ")
# check regexp
def test_check_response_regexp(self):
# Success
parsed_json = json.loads('''{
"status": "Success"
}''')
utility._check_response(
parsed_json, [['status', '\\AWarning\\Z|\\ASuccess\\Z']], True)
# Warning
parsed_json = json.loads('''{
"status": "Warning"
}''')
utility._check_response(
parsed_json, [['status', '\\AWarning\\Z|\\ASuccess\\Z']], True)
# Incorrect suffix
parsed_json = json.loads('''{
"status": "Successful"
}''')
error_text = (
"Trying one more time...\nResponse value:Successful does not "
"match regexp: \\AWarning\\Z|\\ASuccess\\Z from "
"response_expectation"
)
with self.assertRaises(
exceptions.RecoverableResponseException
) as error:
utility._check_response(
parsed_json, [['status', '\\AWarning\\Z|\\ASuccess\\Z']], True)
self.assertEqual("{0}".format(error.exception), error_text)
# Incorrect prefix
parsed_json = json.loads('''{
"status": "Full Success"
}''')
error_text = (
"Trying one more time...\nResponse value:Full Success does not "
"match regexp: \\AWarning\\Z|\\ASuccess\\Z from "
"response_expectation"
)
with self.assertRaises(
exceptions.RecoverableResponseException
) as error:
utility._check_response(
parsed_json, [['status', '\\AWarning\\Z|\\ASuccess\\Z']], True)
self.assertEqual("{0}".format(error.exception), error_text)
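# The \A and \Z anchors in the expectation pattern are what reject
# "Successful" and "Full Success" above: each alternative must match the
# whole value, not a prefix or suffix. A quick standalone check (using
# re.search; with these anchors re.match behaves the same):

```python
import re

PATTERN = r'\AWarning\Z|\ASuccess\Z'


def status_ok(value):
    # \A pins the alternative to the start of the string and \Z to the
    # end, so only an exact "Warning" or "Success" passes.
    return re.search(PATTERN, value) is not None
```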
def test_process_response(self):
parsed_json = json.loads('''{
"id": 10,
"name": "Clementina DuBuque",
"username": "Moriah.Stanton",
"email": "Rey.Padberg@karina.biz",
"address": {
"street": "Kattie Turnpike",
"suite": "Suite 198",
"city": "Lebsackbury",
"zipcode": "31428-2261",
"geo": {
"lat": "-38.2386",
"lng": "57.2232"
}
},
"phone": "024-648-3804",
"website": "ambrose.net",
"company": {
"name": "Hoeger LLC",
"catchPhrase": "Centralized empowering task-force",
"bs": "target end-to-end models"
}
}''')
response = mock.Mock()
response.json = mock.Mock(return_value=parsed_json)
response.text = '''<object>10</object>'''
response.headers = {
'Content-Type': "application/json"
}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
# json
store_props = {}
call = {
'response_format': 'json',
'nonrecoverable_response': [['id', '10'],
['id', '11'],
['id', '12']],
'response_expectation': [['id', '20']],
'response_translation': {
"name": ["user-full-name"],
"email": ["user-email"],
"address": {
"city": ["user-city"],
"zipcode": ["user-city-zip"],
"geo": {
"lat": ["user-city-geo", "latitude"],
"lng": ["user-city-geo", "longnitude"]
}
}
}
}
with self.assertRaises(
exceptions.NonRecoverableResponseException
):
utility._process_response(response, call, {})
# json
store_props = {}
call = {
'response_format': 'json',
'nonrecoverable_response': [['id', '20']],
'response_expectation': [['id', '10']],
'response_translation': {
"name": ["user-full-name"],
"email": ["user-email"],
"address": {
"city": ["user-city"],
"zipcode": ["user-city-zip"],
"geo": {
"lat": ["user-city-geo", "latitude"],
"lng": ["user-city-geo", "longnitude"]
}
}
}
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {
'user-city': u'Lebsackbury',
'user-city-geo': {
'latitude': u'-38.2386',
'longnitude': u'57.2232'
},
'user-city-zip': u'31428-2261',
'user-email': u'Rey.Padberg@karina.biz',
'user-full-name': u'Clementina DuBuque'
})
# auto json
store_props = {}
call = {
'nonrecoverable_response': [['id', '20']],
'response_expectation': [['id', '10']],
'response_translation': {
"name": ["user-full-name"],
"email": ["user-email"],
"address": {
"city": ["user-city"],
"zipcode": ["user-city-zip"],
"geo": {
"lat": ["user-city-geo", "latitude"],
"lng": ["user-city-geo", "longnitude"]
}
}
}
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {
'user-city': u'Lebsackbury',
'user-city-geo': {
'latitude': u'-38.2386',
'longnitude': u'57.2232'
},
'user-city-zip': u'31428-2261',
'user-email': u'Rey.Padberg@karina.biz',
'user-full-name': u'Clementina DuBuque'
})
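# The response_translation mappings above pair nested response keys with
# target paths in the stored properties. A simplified translator, inferred
# from the expected store_props in these tests rather than taken from the
# library source:

```python
def translate(response, translation, store):
    # Leaves of the translation tree are lists naming the target path in
    # `store`; dict values recurse into the matching response branch.
    for key, target in translation.items():
        if isinstance(target, dict):
            translate(response[key], target, store)
        else:
            node = store
            for part in target[:-1]:
                node = node.setdefault(part, {})
            node[target[-1]] = response[key]
    return store
```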
# raw response
store_props = {}
call = {
'response_format': 'raw',
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {})
# text response
store_props = {}
call = {
'response_format': 'text',
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {'text': '<object>10</object>'})
# unknown response
store_props = {}
call = {
'response_format': 'other',
}
with self.assertRaises(
exceptions.WrongTemplateDataException
) as error:
utility._process_response(response, call, store_props)
self.assertEqual(
"{0}".format(error.exception),
"Response_format 'other' is not supported. Only json/xml or raw "
"response_format is supported")
self.assertDictEqual(store_props, {})
# xml response
store_props = {}
call = {
'response_format': 'xml',
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
}
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {'object_id': '10'})
# auto xml response
response.headers = {
'Content-Type': 'application/xml'
}
store_props = {}
call = {
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
},
'header_translation': {
"Content-Type": ["content_type"]
}
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {'object_id': '10',
'content_type': 'application/xml'})
# can't use the autodetected type, fall back to json
response.headers = {
'Content-Type': "json-alias"
}
store_props = {}
call = {
'nonrecoverable_response': [['id', '20']],
'response_expectation': [['id', '10']],
'response_translation': {
"name": ["user-full-name"],
"email": ["user-email"],
"address": {
"city": ["user-city"],
"zipcode": ["user-city-zip"],
"geo": {
"lat": ["user-city-geo", "latitude"],
"lng": ["user-city-geo", "longnitude"]
}
}
}
}
utility._process_response(response, call, store_props)
self.assertDictEqual(store_props, {
'user-city': u'Lebsackbury',
'user-city-geo': {
'latitude': u'-38.2386',
'longnitude': u'57.2232'
},
'user-city-zip': u'31428-2261',
'user-email': u'Rey.Padberg@karina.biz',
'user-full-name': u'Clementina DuBuque'
})
def test_send_request(self):
# json request
call = {
'ssl': True,
'path': "/",
'method': 'get',
'verify': False,
'host': 'localhost',
'auth': {
'user': 'someone',
'password': 'check'
},
'port': -1,
'payload': [1, 2, 3],
'headers': {"a": "b"},
'response_format': 'xml',
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
}
}
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
request = mock.Mock(return_value=response)
response.headers = {}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(utility._send_request(call), response)
request.assert_called_with('get', 'https://localhost:443/',
data=None, headers={'a': 'b'},
json=[1, 2, 3],
params={},
files=None,
auth=('someone', 'check'),
cert=None,
proxies=None,
timeout=None,
verify=False)
# raw_files with string
call = {
'ssl': True,
'path': "/xml",
'method': 'get',
'verify': False,
'host': 'localhost',
'port': -1,
'files_raw': {
'file': 'some_name'
},
'headers': {"a": "b"},
'response_format': 'xml',
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
}
}
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
response.status_code = 404
response.headers = {}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
request = mock.Mock(return_value=response)
response_callback = mock.Mock(return_value="abc")
def _fake_StringIO(a):
return a
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
with mock.patch(
"cloudify_rest_sdk.utility.StringIO", _fake_StringIO
):
self.assertEqual(
utility._send_request(call, response_callback),
response)
request.assert_called_with('get', 'https://localhost:443/xml',
data=None,
headers={'a': 'b'},
json=None,
params={},
files={'file': 'abc'},
auth=None,
cert=None,
proxies=None,
timeout=None,
verify=False)
# raw_files with list
call = {
'ssl': True,
'path': "/xml",
'method': 'get',
'verify': False,
'host': 'localhost',
'port': -1,
'files': {
'file': ['a', 'b', 'c']
},
'headers': {"a": "b"},
'response_format': 'xml',
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
}
}
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
response.status_code = 404
response.headers = {}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
request = mock.Mock(return_value=response)
response_callback = mock.Mock(return_value="abc")
def _fake_StringIO(a):
return a
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(
utility._send_request(call, response_callback),
response)
request.assert_called_with('get', 'https://localhost:443/xml',
data=None,
headers={'a': 'b'},
json=None,
params={},
files={'file': ('a', 'b', 'c')},
auth=None,
cert=None,
proxies=None,
timeout=None,
verify=False)
# raw_files with tuple
call = {
'ssl': True,
'path': "/xml",
'method': 'get',
'verify': False,
'host': 'localhost',
'port': -1,
'files': {
'file': ('a', 'b', 'c')
},
'headers': {"a": "b"},
'response_format': 'xml',
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
}
}
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
response.status_code = 404
response.headers = {}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
request = mock.Mock(return_value=response)
response_callback = mock.Mock(return_value="abc")
def _fake_StringIO(a):
return a
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(
utility._send_request(call, response_callback),
response)
request.assert_called_with('get', 'https://localhost:443/xml',
data=None,
headers={'a': 'b'},
json=None,
params={},
files={'file': ('a', 'b', 'c')},
auth=None,
cert=None,
proxies=None,
timeout=None,
verify=False)
# xml request
call = {
'ssl': True,
'path': "/xml",
'method': 'get',
'verify': False,
'host': 'localhost',
'port': -1,
'payload': '<object>11</object>',
'payload_format': 'raw',
'headers': {"a": "b"},
'response_format': 'xml',
'nonrecoverable_response': [['object', '20']],
'response_expectation': [['object', '10']],
'response_translation': {
"object": ["object_id"]
}
}
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
response.status_code = 404
response.headers = {}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
request = mock.Mock(return_value=response)
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(utility._send_request(call), response)
request.assert_called_with('get', 'https://localhost:443/xml',
data='<object>11</object>',
headers={'a': 'b'},
json=None,
params={},
files=None,
auth=None,
cert=None,
proxies=None,
timeout=None,
verify=False)
# raise error on request status
response.raise_for_status = mock.Mock(
side_effect=utility.requests.exceptions.HTTPError('Error!')
)
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
with self.assertRaises(
utility.requests.exceptions.HTTPError
) as error:
self.assertEqual(utility._send_request(call), response)
self.assertEqual("{0}".format(error.exception), 'Error!')
# expected error
call['recoverable_codes'] = [404]
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
with self.assertRaises(
exceptions.RecoverableStatusCodeCodeException
) as error:
utility._send_request(call)
self.assertEqual(
"{0}".format(error.exception),
'Response code 404 defined as recoverable')
# expected error accepted as successful
call['recoverable_codes'] = []
call['successful_codes'] = [404]
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(utility._send_request(call), response)
# can't connect
request = mock.Mock(
side_effect=utility.requests.exceptions.ConnectionError(
'check connect')
)
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
with self.assertRaises(
utility.requests.exceptions.ConnectionError
) as error:
self.assertEqual(utility._send_request(call), response)
self.assertEqual("{0}".format(error.exception), "check connect")
# ignore connection errors
call['retry_on_connection_error'] = True
request = mock.Mock(
side_effect=utility.requests.exceptions.ConnectionError(
'check connect')
)
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
with self.assertRaises(
exceptions.RecoverableResponseException
) as error:
self.assertEqual(utility._send_request(call), response)
self.assertEqual(
repr(error.exception),
"RecoverableResponseException(\"ConnectionError "
"ConnectionError('check connect',) has occurred, but flag "
"retry_on_connection_error is set. Retrying...\",)"
)
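# Every case in test_send_request follows the same fixture pattern: build a
# Mock standing in for requests.request, call through it, then assert on
# the call. The pattern in miniature (stdlib unittest.mock; the URL is a
# placeholder):

```python
from unittest import mock

# A Mock standing in for requests.request, as in the tests above.
response = mock.Mock()
response.status_code = 200
response.text = '<object>10</object>'
request = mock.Mock(return_value=response)

# Code under test would invoke this instead of the real requests.request.
result = request('get', 'https://localhost:443/xml', verify=False)

# assert_called_with verifies both positional and keyword arguments.
request.assert_called_with('get', 'https://localhost:443/xml', verify=False)
```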
def test_process_pre_render(self):
# without params
template = """
rest_calls:
- ssl: true
path: "/xml"
method: get
verify: false
host: localhost
port: -1
payload: {{ payload }}
payload_format: raw
headers:
a: b
response_format: xml
nonrecoverable_response: [['object', '20']]
response_expectation: [['object', '10']]
response_translation:
object:
- object_id"""
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
response.status_code = 404
response.headers = {
'Content-Type': "application/json"
}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
request = mock.Mock(return_value=response)
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(
utility.process({'payload': '<object>11</object>'}, template,
{}, prerender=True), {
'calls': [{
'headers': {'a': 'b'},
'host': 'localhost',
'method': 'get',
'nonrecoverable_response': [['object']],
'path': '/xml',
'payload': '<object>11</object>',
'payload_format': 'raw',
'port': -1,
'response_expectation': [['object']],
'response_format': 'xml',
'response_translation': {'object': []},
'ssl': True,
'verify': False
}],
'result_properties': {'object_id': u'10'}})
request.assert_called_with('get', 'https://localhost:443/xml',
data='<object>11</object>',
headers={'a': 'b'},
json=None,
params={},
files=None,
auth=None,
cert=None,
proxies=None,
timeout=None,
verify=False)
# check raw_payload
template = """
rest_calls:
- ssl: true
path: "/xml"
method: get
verify: false
host: localhost
port: -1
raw_payload: payload.xml
payload: {{ payload }}
payload_format: raw
headers:
a: b
response_format: xml
nonrecoverable_response: [['object', '20']]
response_expectation: [['object', '10']]
response_translation:
object:
- object_id"""
payload_callback = mock.Mock(return_value="<object>22</object>")
response = mock.Mock()
response.json = None
response.raise_for_status = mock.Mock()
response.text = '''<object>10</object>'''
response.status_code = 404
response.headers = {
'Content-Type': "application/json"
}
response.cookies = mock.Mock()
response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
request = mock.Mock(return_value=response)
with mock.patch(
"cloudify_rest_sdk.utility.requests.request", request
):
self.assertEqual(
utility.process({'payload': '<object>11</object>'}, template,
{}, prerender=True,
resource_callback=payload_callback), {
'calls': [{
'headers': {'a': 'b'},
'host': 'localhost',
'method': 'get',
'nonrecoverable_response': [['object']],
'path': '/xml',
'raw_payload': 'payload.xml',
'payload': '<object>11</object>',
'payload_format': 'raw',
'port': -1,
'response_expectation': [['object']],
'response_format': 'xml',
'response_translation': {'object': []},
'ssl': True,
'verify': False
}],
'result_properties': {'object_id': u'10'}})
request.assert_called_with('get', 'https://localhost:443/xml',
data='<object>22</object>',
headers={'a': 'b'},
json=None,
params={},
files=None,
auth=None,
cert=None,
proxies=None,
timeout=None,
verify=False)
payload_callback.assert_called_with('payload.xml')
    def test_process_empty(self):
        # no calls in template
        template = """
        rest_calls:
        """
        self.assertEqual(utility.process({}, template, {}), {})
        # empty template
        self.assertEqual(utility.process({}, "", {}), {})
    def test_process_post_render(self):
        # check server/client side cert
        template = """
        rest_calls:
        - ssl: true
          path: "/xml"
          method: get
          verify: "some_server_cert"
          cert: "some_client_cert"
          timeout: 300
          host: localhost
          port: -1
          payload: '<object>11</object>'
          payload_format: raw
          headers:
            a: b
          response_format: xml
          nonrecoverable_response: [['object', '20']]
          response_expectation: [['object', '10']]
          response_translation:
            object:
            - object_id"""
        response = mock.Mock()
        response.json = None
        response.raise_for_status = mock.Mock()
        response.text = '''<object>10</object>'''
        response.status_code = 404
        response.headers = {
            'Content-Type': "application/json"
        }
        response.cookies = mock.Mock()
        response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
        request = mock.Mock(return_value=response)
        with mock.patch(
            "cloudify_rest_sdk.utility.requests.request", request
        ):
            with mock.patch(
                "cloudify_rest_sdk.utility.tempfile.mkstemp",
                mock.Mock(return_value=['fake_fd', '/tmp/fake_tmp'])
            ):
                fake_os = mock.Mock()
                fake_os.path.isfile = mock.Mock(return_value=False)
                fake_os.remove = mock.Mock(
                    side_effect=Exception("can't remove"))
                with mock.patch("cloudify_rest_sdk.utility.os", fake_os):
                    self.assertEqual(
                        utility.process({}, template, {}), {
                            'calls': [{
                                'headers': {'a': 'b'},
                                'host': 'localhost',
                                'method': 'get',
                                'nonrecoverable_response': [['object']],
                                'path': '/xml',
                                'payload': '<object>11</object>',
                                'payload_format': 'raw',
                                'port': -1,
                                'response_expectation': [['object']],
                                'response_format': 'xml',
                                'response_translation': {'object': []},
                                'ssl': True,
                                'timeout': 300,
                                'cert': "some_client_cert",
                                'verify': "some_server_cert",
                            }],
                            'result_properties': {'object_id': u'10'}})
        request.assert_called_with('get', 'https://localhost:443/xml',
                                   data='<object>11</object>',
                                   headers={'a': 'b'},
                                   json=None,
                                   params={},
                                   files=None,
                                   auth=None,
                                   cert='/tmp/fake_tmp',
                                   proxies=None,
                                   timeout=300,
                                   verify='/tmp/fake_tmp')
        # without params
        template = """
        rest_calls:
        - ssl: true
          path: "/xml"
          method: get
          verify: false
          host: localhost
          port: -1
          payload: '<object>11</object>'
          payload_format: raw
          headers:
            a: b
          response_format: xml
          nonrecoverable_response: [['object', '20']]
          response_expectation: [['object', '10']]
          response_translation:
            object:
            - object_id"""
        response = mock.Mock()
        response.json = None
        response.raise_for_status = mock.Mock()
        response.text = '''<object>10</object>'''
        response.status_code = 404
        response.headers = {
            'Content-Type': "application/json"
        }
        response.cookies = mock.Mock()
        response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
        request = mock.Mock(return_value=response)
        with mock.patch(
            "cloudify_rest_sdk.utility.requests.request", request
        ):
            self.assertEqual(
                utility.process({}, template, {}), {
                    'calls': [{
                        'headers': {'a': 'b'},
                        'host': 'localhost',
                        'method': 'get',
                        'nonrecoverable_response': [['object']],
                        'path': '/xml',
                        'payload': '<object>11</object>',
                        'payload_format': 'raw',
                        'port': -1,
                        'response_expectation': [['object']],
                        'response_format': 'xml',
                        'response_translation': {'object': []},
                        'ssl': True,
                        'verify': False
                    }],
                    'result_properties': {'object_id': u'10'}})
        request.assert_called_with('get', 'https://localhost:443/xml',
                                   data='<object>11</object>',
                                   headers={'a': 'b'},
                                   json=None,
                                   params={},
                                   files=None,
                                   auth=None,
                                   cert=None,
                                   proxies=None,
                                   timeout=None,
                                   verify=False)
        # check post apply parameters
        template = """
        rest_calls:
        - ssl: true
          path: "/xml"
          method: get
          verify: false
          host: localhost
          port: -1
          payload: "{% if custom is not string %}{{custom}}{% endif %}"
          payload_format: raw
          headers:
            a: b
          response_format: xml
          nonrecoverable_response: [['object', '20']]
          response_expectation: [['object', '10']]
          response_translation:
            object:
            - object_id"""
        request = mock.Mock(return_value=response)
        with mock.patch(
            "cloudify_rest_sdk.utility.requests.request", request
        ):
            self.assertEqual(
                utility.process({'custom': [1, 2, 3]}, template, {}), {
                    'calls': [{
                        'headers': {'a': 'b'},
                        'host': 'localhost',
                        'method': 'get',
                        'nonrecoverable_response': [['object']],
                        'path': '/xml',
                        'payload': [1, 2, 3],
                        'payload_format': 'raw',
                        'port': -1,
                        'response_expectation': [['object']],
                        'response_format': 'xml',
                        'response_translation': {'object': []},
                        'ssl': True,
                        'verify': False
                    }],
                    'result_properties': {'object_id': u'10'}})
        request.assert_called_with('get', 'https://localhost:443/xml',
                                   data=[1, 2, 3],
                                   headers={'a': 'b'},
                                   json=None,
                                   params={},
                                   files=None,
                                   auth=None,
                                   cert=None,
                                   proxies=None,
                                   timeout=None,
                                   verify=False)
        # urlencode
        template = """
        rest_calls:
        - ssl: true
          path: "/xml"
          method: get
          verify: false
          host: localhost
          port: -1
          payload:
            object: 11
          payload_format: urlencoded
          headers:
            a: b
          response_format: xml
          nonrecoverable_response: [['object', '20']]
          response_expectation: [['object', '10']]
          cookies_translation:
            a:
            - a
          response_translation:
            object:
            - object_id"""
        response = mock.Mock()
        response.json = None
        response.raise_for_status = mock.Mock()
        response.text = '''<object>10</object>'''
        response.status_code = 404
        response.headers = {
            'Content-Type': "application/json"
        }
        response.cookies = mock.Mock()
        response.cookies.get_dict = mock.Mock(return_value={'a': 'b'})
        request = mock.Mock(return_value=response)
        with mock.patch(
            "cloudify_rest_sdk.utility.requests.request", request
        ):
            self.assertEqual(
                utility.process({}, template, {}),
                {
                    'calls': [{
                        'headers': {'a': 'b'},
                        'host': 'localhost',
                        'method': 'get',
                        'nonrecoverable_response': [['object']],
                        'path': '/xml',
                        'payload': {
                            'object': 11
                        },
                        'payload_format': 'urlencoded',
                        'port': -1,
                        'response_expectation': [['object']],
                        'response_format': 'xml',
                        'response_translation': {'object': []},
                        'cookies_translation': {'a': []},
                        'ssl': True,
                        'verify': False
                    }],
                    'result_properties': {
                        'object_id': '10',
                        'a': 'b'
                    }})
        request.assert_called_with('get', 'https://localhost:443/xml',
                                   data=None,
                                   headers={'a': 'b'},
                                   json=None,
                                   params={'object': 11},
                                   files=None,
                                   auth=None,
                                   cert=None,
                                   proxies=None,
                                   timeout=None,
                                   verify=False)
if __name__ == '__main__':
    unittest.main()
import torch
from torch import nn
# NOTE: SyncBatchNorm is referenced below but was not imported in this file;
# recent PyTorch ships one in torch.nn, used here as a stand-in for whatever
# implementation the original project imported.
from torch.nn import SyncBatchNorm
from torch.nn.utils import spectral_norm
class AdaptiveNorm2d(nn.Module):
    def __init__(self, num_features, norm_layer='in', eps=1e-4):
        super(AdaptiveNorm2d, self).__init__()
        self.num_features = num_features
        self.weight = self.bias = None
        if 'in' in norm_layer:
            self.norm_layer = nn.InstanceNorm2d(num_features, eps=eps, affine=False)
        elif 'bn' in norm_layer:
            self.norm_layer = SyncBatchNorm(num_features, momentum=1.0, eps=eps, affine=False)
        self.delete_weight_on_forward = True

    def forward(self, input):
        out = self.norm_layer(input)
        output = out * self.weight[:, :, None, None] + self.bias[:, :, None, None]
        # To save GPU memory
        if self.delete_weight_on_forward:
            self.weight = self.bias = None
        return output
class AdaptiveNorm2dTrainable(nn.Module):
    def __init__(self, num_features, norm_layer='in', eps=1e-4):
        super(AdaptiveNorm2dTrainable, self).__init__()
        self.num_features = num_features
        if 'in' in norm_layer:
            self.norm_layer = nn.InstanceNorm2d(num_features, eps=eps, affine=False)

    def forward(self, input):
        out = self.norm_layer(input)
        t = out.shape[0] // self.weight.shape[0]
        output = out * self.weight + self.bias
        return output

    def assign_params(self, weight, bias):
        self.weight = torch.nn.Parameter(weight.view(1, -1, 1, 1))
        self.bias = torch.nn.Parameter(bias.view(1, -1, 1, 1))
class ResBlock(nn.Module):
    def __init__(self, in_channels, out_channels, padding, upsample, downsample,
                 norm_layer, activation=nn.ReLU, gated=False):
        super(ResBlock, self).__init__()
        normalize = norm_layer != 'none'
        bias = not normalize
        # if norm_layer == 'bn':
        #     norm0 = SyncBatchNorm(in_channels, momentum=1.0, eps=1e-4)
        #     norm1 = SyncBatchNorm(out_channels, momentum=1.0, eps=1e-4)
        #     pass
        if norm_layer == 'in':
            norm0 = nn.InstanceNorm2d(in_channels, eps=1e-4, affine=True)
            norm1 = nn.InstanceNorm2d(out_channels, eps=1e-4, affine=True)
        elif 'ada' in norm_layer:
            norm0 = AdaptiveNorm2d(in_channels, norm_layer)
            norm1 = AdaptiveNorm2d(out_channels, norm_layer)
        elif 'tra' in norm_layer:
            norm0 = AdaptiveNorm2dTrainable(in_channels, norm_layer)
            norm1 = AdaptiveNorm2dTrainable(out_channels, norm_layer)
        elif normalize:
            raise Exception('ResBlock: Incorrect `norm_layer` parameter')

        layers = []
        if normalize:
            layers.append(norm0)
        layers.append(activation(inplace=True))
        if upsample:
            layers.append(nn.Upsample(scale_factor=2))
        layers.extend([
            nn.Sequential() if padding is nn.ZeroPad2d else padding(1),
            spectral_norm(
                nn.Conv2d(in_channels, out_channels, 3, 1, 1 if padding is nn.ZeroPad2d else 0, bias=bias),
                eps=1e-4)])
        if normalize:
            layers.append(norm1)
        layers.extend([
            activation(inplace=True),
            nn.Sequential() if padding is nn.ZeroPad2d else padding(1),
            spectral_norm(
                nn.Conv2d(out_channels, out_channels, 3, 1, 1 if padding is nn.ZeroPad2d else 0, bias=bias),
                eps=1e-4)])
        if downsample:
            layers.append(nn.AvgPool2d(2))
        self.block = nn.Sequential(*layers)

        self.skip = None
        if in_channels != out_channels or upsample or downsample:
            layers = []
            if upsample:
                layers.append(nn.Upsample(scale_factor=2))
            layers.append(spectral_norm(
                nn.Conv2d(in_channels, out_channels, 1),
                eps=1e-4))
            if downsample:
                layers.append(nn.AvgPool2d(2))
            self.skip = nn.Sequential(*layers)

    def forward(self, input):
        out = self.block(input)
        if self.skip is not None:
            output = out + self.skip(input)
        else:
            output = out + input
        return output
class channelShuffle(nn.Module):
    def __init__(self, groups):
        super(channelShuffle, self).__init__()
        self.groups = groups

    def forward(self, x):
        batchsize, num_channels, height, width = x.data.size()
        # batchsize = x.shape[0]
        # num_channels = x.shape[1]
        # height = x.shape[2]
        # width = x.shape[3]
        channels_per_group = num_channels // self.groups
        # reshape
        x = x.view(batchsize, self.groups, channels_per_group, height, width)
        # transpose
        # - contiguous() required if transpose() is used before view().
        #   See https://github.com/pytorch/pytorch/issues/764
        x = torch.transpose(x, 1, 2).contiguous()
        # flatten
        x = x.view(batchsize, -1, height, width)
        return x
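The view→transpose→view sequence in `channelShuffle.forward` is just an index permutation over channels. A minimal pure-Python sketch (illustrative only, not part of the module) makes the resulting order explicit:

```python
def channel_shuffle_order(channels, groups):
    """Return channel labels in the order produced by
    view(b, groups, c_per_g, h, w).transpose(1, 2).view(b, -1, h, w)."""
    per_group = len(channels) // groups
    grouped = [channels[g * per_group:(g + 1) * per_group] for g in range(groups)]
    # after transpose(1, 2) the flattened order iterates the within-group
    # index first, then the group index, interleaving the groups
    return [grouped[g][c] for c in range(per_group) for g in range(groups)]
```

For example, two groups of two channels are interleaved: `channel_shuffle_order(['a0', 'a1', 'b0', 'b1'], 2)` gives `['a0', 'b0', 'a1', 'b1']`.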
class shuffleConv(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0,
                 dilation=1, groups=1, bias=True, padding_mode='zeros'):
        super(shuffleConv, self).__init__()
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.stride = stride
        self.padding = padding
        groups = 4
        block = []
        if (in_channels % groups == 0) and (out_channels % groups == 0):
            block.append(spectral_norm(nn.Conv2d(in_channels=in_channels, out_channels=out_channels,
                                                 kernel_size=1, padding=0, groups=groups), eps=1e-4))
            block.append(nn.ReLU6(inplace=True))
            block.append(channelShuffle(groups=groups))
            block.append(spectral_norm(nn.Conv2d(in_channels=out_channels, out_channels=out_channels,
                                                 kernel_size=3, padding=1, groups=groups), eps=1e-4))
            block.append(nn.ReLU6(inplace=True))
            block.append(spectral_norm(nn.Conv2d(in_channels=out_channels, out_channels=out_channels,
                                                 kernel_size=1, padding=0, groups=groups), eps=1e-4))
        else:
            block.append(spectral_norm(nn.Conv2d(in_channels=in_channels, out_channels=out_channels,
                                                 kernel_size=3, padding=1), eps=1e-4))
        self.block = nn.Sequential(*block)

    def forward(self, x):
        x = self.block(x)
        return x
class ResBlockShuffle(nn.Module):
    def __init__(self, in_channels, out_channels, padding, upsample, downsample,
                 norm_layer, activation=nn.ReLU, gated=False):
        super(ResBlockShuffle, self).__init__()
        normalize = norm_layer != 'none'
        bias = not normalize
        # if norm_layer == 'bn':
        #     norm0 = SyncBatchNorm(in_channels, momentum=1.0, eps=1e-4)
        #     norm1 = SyncBatchNorm(out_channels, momentum=1.0, eps=1e-4)
        #     pass
        if norm_layer == 'in':
            norm0 = nn.InstanceNorm2d(in_channels, eps=1e-4, affine=True)
            norm1 = nn.InstanceNorm2d(out_channels, eps=1e-4, affine=True)
        elif 'ada' in norm_layer:
            norm0 = AdaptiveNorm2d(in_channels, norm_layer)
            norm1 = AdaptiveNorm2d(out_channels, norm_layer)
        elif 'tra' in norm_layer:
            norm0 = AdaptiveNorm2dTrainable(in_channels, norm_layer)
            norm1 = AdaptiveNorm2dTrainable(out_channels, norm_layer)
        elif normalize:
            raise Exception('ResBlock: Incorrect `norm_layer` parameter')

        layers = []
        if normalize:
            layers.append(norm0)
        layers.append(activation(inplace=True))
        if upsample:
            layers.append(nn.Upsample(scale_factor=2))
        layers.extend([
            # padding(1),
            # spectral_norm(
            shuffleConv(in_channels, out_channels, 3, 1, 0, bias=bias)  # ,
            # eps=1e-4)
        ])
        if normalize:
            layers.append(norm1)
        layers.extend([
            activation(inplace=True),
            # padding(1),
            # spectral_norm(
            shuffleConv(out_channels, out_channels, 3, 1, 0, bias=bias)  # ,
            # eps=1e-4)
        ])
        if downsample:
            layers.append(nn.AvgPool2d(2))
        self.block = nn.Sequential(*layers)

        self.skip = None
        if in_channels != out_channels or upsample or downsample:
            layers = []
            if upsample:
                layers.append(nn.Upsample(scale_factor=2))
            layers.append(
                # spectral_norm(
                shuffleConv(in_channels, out_channels, 1)  # ,
                # eps=1e-4)
            )
            if downsample:
                layers.append(nn.AvgPool2d(2))
            self.skip = nn.Sequential(*layers)

    def forward(self, input):
        out = self.block(input)
        if self.skip is not None:
            output = out + self.skip(input)
        else:
            output = out + input
        return output
class ResBlockV2(nn.Module):
    def __init__(self, in_channels, out_channels, stride, groups,
                 resize_layer, norm_layer, activation):
        super(ResBlockV2, self).__init__()
        upsampling_layers = {
            'nearest': lambda: nn.Upsample(scale_factor=stride, mode='nearest')
        }
        downsampling_layers = {
            'avgpool': lambda: nn.AvgPool2d(stride)
        }
        norm_layers = {
            'bn': lambda num_features: SyncBatchNorm(num_features, momentum=1.0, eps=1e-4),
            'in': lambda num_features: nn.InstanceNorm2d(num_features, eps=1e-4, affine=True),
            'adabn': lambda num_features: AdaptiveNorm2d(num_features, 'bn'),
            'adain': lambda num_features: AdaptiveNorm2d(num_features, 'in')
        }
        normalize = norm_layer != 'none'
        bias = not normalize
        upsample = resize_layer in upsampling_layers
        downsample = resize_layer in downsampling_layers
        if normalize:
            norm_layer = norm_layers[norm_layer]

        layers = []
        if normalize:
            layers.append(norm_layer(in_channels))
        layers.append(activation())
        if upsample:
            layers.append(nn.Upsample(scale_factor=2))
        layers.extend([
            spectral_norm(
                nn.Conv2d(in_channels, out_channels, 3, 1, 1, bias=bias),
                eps=1e-4)])
        if normalize:
            layers.append(norm_layer(out_channels))
        layers.extend([
            activation(),
            spectral_norm(
                nn.Conv2d(out_channels, out_channels, 3, 1, 1, bias=bias),
                eps=1e-4)])
        if downsample:
            layers.append(nn.AvgPool2d(2))
        self.block = nn.Sequential(*layers)

        self.skip = None
        if in_channels != out_channels or upsample or downsample:
            layers = []
            if upsample:
                layers.append(nn.Upsample(scale_factor=2))
            layers.append(spectral_norm(
                nn.Conv2d(in_channels, out_channels, 1),
                eps=1e-4))
            if downsample:
                layers.append(nn.AvgPool2d(2))
            self.skip = nn.Sequential(*layers)

    def forward(self, input):
        out = self.block(input)
        if self.skip is not None:
            output = out + self.skip(input)
        else:
            output = out + input
        return output
class ResBlockV2Shuffle(nn.Module):
    def __init__(self, in_channels, out_channels, stride, groups,
                 resize_layer, norm_layer, activation):
        super(ResBlockV2Shuffle, self).__init__()
        upsampling_layers = {
            'nearest': lambda: nn.Upsample(scale_factor=stride, mode='nearest')
        }
        downsampling_layers = {
            'avgpool': lambda: nn.AvgPool2d(stride)
        }
        norm_layers = {
            'bn': lambda num_features: SyncBatchNorm(num_features, momentum=1.0, eps=1e-4),
            'in': lambda num_features: nn.InstanceNorm2d(num_features, eps=1e-4, affine=True),
            'adabn': lambda num_features: AdaptiveNorm2d(num_features, 'bn'),
            'adain': lambda num_features: AdaptiveNorm2d(num_features, 'in')
        }
        normalize = norm_layer != 'none'
        bias = not normalize
        upsample = resize_layer in upsampling_layers
        downsample = resize_layer in downsampling_layers
        if normalize:
            norm_layer = norm_layers[norm_layer]

        layers = []
        if normalize:
            layers.append(norm_layer(in_channels))
        layers.append(activation())
        if upsample:
            layers.append(nn.Upsample(scale_factor=2))
        layers.extend([
            # spectral_norm(
            shuffleConv(in_channels, out_channels, 3, 1, 1, bias=bias)  # ,
            # eps=1e-4)
        ])
        if normalize:
            layers.append(norm_layer(out_channels))
        layers.extend([
            activation(),
            # spectral_norm(
            shuffleConv(out_channels, out_channels, 3, 1, 1, bias=bias)  # ,
            # eps=1e-4)
        ])
        if downsample:
            layers.append(nn.AvgPool2d(2))
        self.block = nn.Sequential(*layers)

        self.skip = None
        if in_channels != out_channels or upsample or downsample:
            layers = []
            if upsample:
                layers.append(nn.Upsample(scale_factor=2))
            layers.append(  # spectral_norm(
                shuffleConv(in_channels, out_channels, 1)  # ,
                # eps=1e-4)
            )
            if downsample:
                layers.append(nn.AvgPool2d(2))
            self.skip = nn.Sequential(*layers)

    def forward(self, input):
        out = self.block(input)
        if self.skip is not None:
            output = out + self.skip(input)
        else:
            output = out + input
        return output
class GatedBlock(nn.Module):
    def __init__(self, in_channels, out_channels, act_fun, kernel_size, stride=1, padding=0, bias=True):
        super(GatedBlock, self).__init__()
        self.conv = spectral_norm(nn.Conv2d(in_channels, out_channels, kernel_size, stride, padding, bias=bias),
                                  eps=1e-4)
        self.gate = spectral_norm(nn.Conv2d(in_channels, out_channels, kernel_size, stride, padding, bias=bias),
                                  eps=1e-4)
        self.act_fun = act_fun()
        self.gate_act_fun = nn.Sigmoid()

    def forward(self, x):
        out = self.conv(x)
        out = self.act_fun(out)
        mask = self.gate(x)
        mask = self.gate_act_fun(mask)
        out_masked = out * mask
        return out_masked
class GatedResBlock(nn.Module):
    def __init__(self, in_channels, out_channels, padding, upsample, downsample,
                 norm_layer, activation=nn.ReLU):
        super(GatedResBlock, self).__init__()
        normalize = norm_layer != 'none'
        bias = not normalize
        if norm_layer == 'in':
            norm0 = nn.InstanceNorm2d(in_channels, eps=1e-4, affine=True)
            norm1 = nn.InstanceNorm2d(out_channels, eps=1e-4, affine=True)
        elif 'ada' in norm_layer:
            norm0 = AdaptiveNorm2d(in_channels, norm_layer)
            norm1 = AdaptiveNorm2d(out_channels, norm_layer)
        elif 'tra' in norm_layer:
            norm0 = AdaptiveNorm2dTrainable(in_channels, norm_layer)
            norm1 = AdaptiveNorm2dTrainable(out_channels, norm_layer)
        elif normalize:
            raise Exception('ResBlock: Incorrect `norm_layer` parameter')

        main_layers = []
        if normalize:
            main_layers.append(norm0)
        if upsample:
            main_layers.append(nn.Upsample(scale_factor=2))
        main_layers.extend([
            padding(1),
            GatedBlock(in_channels, out_channels, activation, 3, 1, 0, bias=bias)])
        if normalize:
            main_layers.append(norm1)
        main_layers.extend([
            padding(1),
            GatedBlock(out_channels, out_channels, activation, 3, 1, 0, bias=bias)])
        if downsample:
            main_layers.append(nn.AvgPool2d(2))
        self.main_pipe = nn.Sequential(*main_layers)

        self.skip_pipe = None
        if in_channels != out_channels or upsample or downsample:
            skip_layers = []
            if upsample:
                skip_layers.append(nn.Upsample(scale_factor=2))
            skip_layers.append(GatedBlock(in_channels, out_channels, activation, 1))
            if downsample:
                skip_layers.append(nn.AvgPool2d(2))
            self.skip_pipe = nn.Sequential(*skip_layers)

    def forward(self, input):
        mp_out = self.main_pipe(input)
        if self.skip_pipe is not None:
            output = mp_out + self.skip_pipe(input)
        else:
            output = mp_out + input
        return output
class ResBlockWithoutSpectralNorms(nn.Module):
    def __init__(self, in_channels, out_channels, padding, upsample, downsample,
                 norm_layer, activation=nn.ReLU):
        super(ResBlockWithoutSpectralNorms, self).__init__()
        normalize = norm_layer != 'none'
        bias = not normalize
        # if norm_layer == 'bn':
        #     norm0 = SyncBatchNorm(in_channels, momentum=1.0, eps=1e-4)
        #     norm1 = SyncBatchNorm(out_channels, momentum=1.0, eps=1e-4)
        #     pass
        if norm_layer == 'in':
            norm0 = nn.InstanceNorm2d(in_channels, eps=1e-4, affine=True)
            norm1 = nn.InstanceNorm2d(out_channels, eps=1e-4, affine=True)
        elif 'ada' in norm_layer:
            norm0 = AdaptiveNorm2d(in_channels, norm_layer)
            norm1 = AdaptiveNorm2d(out_channels, norm_layer)
        elif 'tra' in norm_layer:
            norm0 = AdaptiveNorm2dTrainable(in_channels, norm_layer)
            norm1 = AdaptiveNorm2dTrainable(out_channels, norm_layer)
        elif normalize:
            raise Exception('ResBlock: Incorrect `norm_layer` parameter')

        layers = []
        if normalize:
            layers.append(norm0)
        layers.append(activation(inplace=True))
        if upsample:
            layers.append(nn.Upsample(scale_factor=2))
        layers.extend([
            padding(1),
            # spectral_norm(
            nn.Conv2d(in_channels, out_channels, 3, 1, 0, bias=bias)  # ,
            # eps=1e-4)
        ])
        if normalize:
            layers.append(norm1)
        layers.extend([
            activation(inplace=True),
            padding(1),
            # spectral_norm(
            nn.Conv2d(out_channels, out_channels, 3, 1, 0, bias=bias)  # ,
            # eps=1e-4)
        ])
        if downsample:
            layers.append(nn.AvgPool2d(2))
        self.block = nn.Sequential(*layers)

        self.skip = None
        if in_channels != out_channels or upsample or downsample:
            layers = []
            if upsample:
                layers.append(nn.Upsample(scale_factor=2))
            layers.append(  # spectral_norm(
                nn.Conv2d(in_channels, out_channels, 1)  # ,
                # eps=1e-4)
            )
            if downsample:
                layers.append(nn.AvgPool2d(2))
            self.skip = nn.Sequential(*layers)

    def forward(self, input):
        out = self.block(input)
        if self.skip is not None:
            output = out + self.skip(input)
        else:
            output = out + input
        return output
class MobileNetBlock(nn.Module):
    def __init__(self, in_channels, out_channels, padding, upsample, downsample,
                 norm_layer, activation=nn.ReLU6, expansion_factor=6):
        super(MobileNetBlock, self).__init__()
        normalize = norm_layer != 'none'
        bias = not normalize
        conv0 = nn.Conv2d(in_channels, int(in_channels * expansion_factor), 1)
        dwise = nn.Conv2d(int(in_channels * expansion_factor), int(in_channels * expansion_factor), 3,
                          2 if downsample else 1, 1, groups=int(in_channels * expansion_factor))
        conv1 = nn.Conv2d(int(in_channels * expansion_factor), out_channels, 1)
        if norm_layer == 'bn':
            # norm0 = SyncBatchNorm(in_channels, momentum=1.0, eps=1e-4)
            # norm1 = SyncBatchNorm(out_channels, momentum=1.0, eps=1e-4)
            pass
        if 'in' in norm_layer:
            norm0 = nn.InstanceNorm2d(int(in_channels * expansion_factor), eps=1e-4, affine=True)
            norm1 = nn.InstanceNorm2d(int(in_channels * expansion_factor), eps=1e-4, affine=True)
            norm2 = nn.InstanceNorm2d(out_channels, eps=1e-4, affine=True)
        if 'ada' in norm_layer:
            norm2 = AdaptiveNorm2d(out_channels, norm_layer)
        elif 'tra' in norm_layer:
            norm2 = AdaptiveNorm2dTrainable(out_channels, norm_layer)

        # layers = [spectral_norm(conv0, eps=1e-4)]
        layers = [conv0]
        if normalize:
            layers.append(norm0)
        layers.append(activation(inplace=True))
        if upsample:
            layers.append(nn.Upsample(scale_factor=2))
        # layers.append(spectral_norm(dwise, eps=1e-4))
        layers.append(dwise)
        if normalize:
            layers.append(norm1)
        layers.extend([
            activation(inplace=True),
            # spectral_norm(
            conv1  # ,
            # eps=1e-4)
        ])
        if normalize:
            layers.append(norm2)
        self.block = nn.Sequential(*layers)

        self.skip = None
        if in_channels != out_channels or upsample or downsample:
            layers = []
            if upsample:
                layers.append(nn.Upsample(scale_factor=2))
            layers.append(
                # spectral_norm(
                nn.Conv2d(in_channels, out_channels, 1)  # ,
                # eps=1e-4)
            )
            if downsample:
                layers.append(nn.AvgPool2d(2))
            self.skip = nn.Sequential(*layers)

    def forward(self, input):
        out = self.block(input)
        if self.skip is not None:
            output = out + self.skip(input)
        else:
            output = out + input
        return output
class SelfAttention(nn.Module):
    def __init__(self, in_channels):
        super(SelfAttention, self).__init__()
        self.in_channels = in_channels
        self.query_conv = nn.Conv2d(in_channels, in_channels // 8, 1)
        self.key_conv = nn.Conv2d(in_channels, in_channels // 8, 1)
        self.value_conv = nn.Conv2d(in_channels, in_channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))
        self.softmax = nn.Softmax(-1)

    def forward(self, input):
        b, c, h, w = input.shape
        query = self.query_conv(input).view(b, -1, h * w).permute(0, 2, 1)  # B x HW x C/8
        key = self.key_conv(input).view(b, -1, h * w)  # B x C/8 x HW
        energy = torch.bmm(query, key)  # B x HW x HW
        attention = self.softmax(energy)  # B x HW x HW
        value = self.value_conv(input).view(b, -1, h * w)  # B x C x HW
        out = torch.bmm(value, attention.permute(0, 2, 1)).view(b, c, h, w)
        output = self.gamma * out + input
        return output
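For reference, the energy/softmax step in `SelfAttention.forward` can be sketched with plain Python lists (a hypothetical helper, not used by the module): each of the HW query positions gets a probability distribution over the HW key positions, matching `torch.bmm(query, key)` followed by `Softmax(-1)`.

```python
import math


def softmax(row):
    # numerically stable softmax over one row of the energy matrix
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    total = sum(exps)
    return [e / total for e in exps]


def attention_map(query, key):
    """query, key: lists of HW vectors (each of length C//8).

    Returns the HW x HW attention matrix, one softmax per query position.
    """
    energy = [[sum(q * k for q, k in zip(qi, kj)) for kj in key] for qi in query]
    return [softmax(row) for row in energy]
```

Every row of the returned matrix sums to 1, which is what lets `torch.bmm(value, attention.permute(0, 2, 1))` act as a convex combination of value vectors.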
# cob: type=views mountpoint=/test
from cob import route
from flask import render_template
@route('/')
def index():
    return render_template('index.html')
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""
This experiment was partially created using PsychoPy2 Experiment Builder (v1.83.04), Tue Feb 23 13:01:04 2016
If you publish work using this script please cite the relevant PsychoPy publications
Peirce, JW (2007) PsychoPy - Psychophysics software in Python. Journal of Neuroscience Methods, 162(1-2), 8-13.
Peirce, JW (2009) Generating stimuli for neuroscience using PsychoPy. Frontiers in Neuroinformatics, 2:10. doi: 10.3389/neuro.11.010.2008
"""
from __future__ import division # so that 1/3=0.333 instead of 1/3=0
from psychopy import visual, core, data, event, logging, sound, gui, parallel
from psychopy.constants import * # things like STARTED, FINISHED
import numpy as np # whole numpy lib is available, prepend 'np.'
from numpy import sin, cos, tan, log, log10, pi, average, sqrt, std, deg2rad, rad2deg, linspace, asarray
from numpy.random import random, randint, normal, shuffle
import os # handy system and path functions
import sys # to get file system encoding
import random
locations = ((0, 6.2), (6.2, 0), (0, -6.2), (-6.2, 0))
number_texts = []
NUM_IMAGES = 20
MALE = 0
FEMALE = 1
NUM_BLOCKS = 3
sad_male_images = []
sad_male_images = sad_male_images*int(180/20)
neutral_male_images = []
neutral_male_images = neutral_male_images*int(180/20)
sad_female_images = []
sad_female_images = sad_female_images*int(180/20)
neutral_female_images = []
neutral_female_images = neutral_female_images*int(180/20)
corr_answer = 1*16 #equal to R1 in PyCorder
incorrect_answer = 2*16 #equal to R2 in PyCorder
no_answer = 3*16 #equal to R3 in PyCorder
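The `*16` in the response codes above shifts the value into the upper nibble of the trigger byte, so PyCorder reads R1/R2/R3 as 16, 32 and 48. A small hypothetical helper (names are illustrative, not part of this script) shows the mapping:

```python
RESPONSE_CODES = {'correct': 1, 'incorrect': 2, 'none': 3}


def response_trigger(outcome):
    # shift the response code into the upper nibble (multiply by 16),
    # leaving the lower nibble free for other event markers
    return RESPONSE_CODES[outcome] * 16
```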
def choose_male_images():
    my_faces = []
    my_faces.extend(np.random.choice(sad_male_images, 2, False))
    my_faces.extend(np.random.choice(neutral_male_images, 2, False))
    random.shuffle(my_faces)
    for i in range(len(my_faces)):
        my_faces[i].setPos(newPos=locations[i])
        thisExp.addData("Image" + str(i) + " _Position", my_faces[i].name)
    return my_faces
def choose_female_images():
    my_faces = []
    my_faces.extend(np.random.choice(sad_female_images, 2, False))
    my_faces.extend(np.random.choice(neutral_female_images, 2, False))
    random.shuffle(my_faces)
    for i in range(len(my_faces)):
        my_faces[i].setPos(newPos=locations[i])
        thisExp.addData("Image" + str(i) + " _Position", my_faces[i].name)
    return my_faces
def get_all_images(win):
    path_to_images = "stim/"
    # white male = sad male
    for i in range(NUM_IMAGES):
        file = path_to_images + "AMSA" + "%02d" % (i + 1) + ".jpg"
        name = "AMSA" + "%02d" % (i + 1) + ".jpg"
        sad_male_images.append(visual.ImageStim(win=win, name=name, units='deg',
                                                image=file, mask=None,
                                                ori=0, pos=[0, 3.2], size=[6, 6],
                                                color=[1, 1, 1], colorSpace='rgb', opacity=1,
                                                flipHoriz=False, flipVert=False,
                                                texRes=128, interpolate=True, depth=-6.0))
    # black male = neutral male
    for i in range(NUM_IMAGES):
        file = path_to_images + "AMNE" + "%02d" % (i + 1) + ".jpg"
        name = "AMNE" + "%02d" % (i + 1) + ".jpg"
        neutral_male_images.append(visual.ImageStim(win=win, name=name, units='deg',
                                                    image=file, mask=None,
                                                    ori=0, pos=[0, 3.2], size=[6, 6],
                                                    color=[1, 1, 1], colorSpace='rgb', opacity=1,
                                                    flipHoriz=False, flipVert=False,
                                                    texRes=128, interpolate=True, depth=-6.0))
    # white female = sad female
    for i in range(NUM_IMAGES):
        file = path_to_images + "AFSA" + "%02d" % (i + 1) + ".jpg"
        name = "AFSA" + "%02d" % (i + 1) + ".jpg"
        sad_female_images.append(visual.ImageStim(win=win, name=name, units='deg',
                                                  image=file, mask=None,
                                                  ori=0, pos=[0, 3.2], size=[6, 6],
                                                  color=[1, 1, 1], colorSpace='rgb', opacity=1,
                                                  flipHoriz=False, flipVert=False,
                                                  texRes=128, interpolate=True, depth=-6.0))
    # black female = neutral female
    for i in range(NUM_IMAGES):
        file = path_to_images + "AFNE" + "%02d" % (i + 1) + ".jpg"
        name = "AFNE" + "%02d" % (i + 1) + ".jpg"
        neutral_female_images.append(visual.ImageStim(win=win, name=name, units='deg',
                                                      image=file, mask=None,
                                                      ori=0, pos=[0, 3.2], size=[6, 6],
                                                      color=[1, 1, 1], colorSpace='rgb', opacity=1,
                                                      flipHoriz=False, flipVert=False,
                                                      texRes=128, interpolate=True, depth=-6.0))
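Each loop in `get_all_images` builds filenames from a four-letter code (A + M/F + SA/NE) plus a zero-padded index. A small sketch of that naming scheme (illustrative only, not part of the experiment script):

```python
def stimulus_files(prefix, n, path="stim/"):
    # e.g. prefix 'AMSA' -> stim/AMSA01.jpg ... stim/AMSA20.jpg
    return [path + prefix + "%02d" % (i + 1) + ".jpg" for i in range(n)]
```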
# Ensure that relative paths start from the same directory as this script
_thisDir = os.path.dirname(os.path.abspath(__file__)).decode(sys.getfilesystemencoding())
os.chdir(_thisDir)
# Store info about the experiment session
expName = u'FEMA4ImagesEEG' # from the Builder filename that created this script
expInfo = {u'session': u'001', u'participant': u''}
try: #look for pipe from app
expInfo['participant'] = '%s'%(sys.argv[1])
expInfo['session'] = '001'
except IndexError: #if no pipe, run normally
print ('ran without app')
dlg = gui.DlgFromDict(dictionary=expInfo, title=expName)
if dlg.OK == False:
print ('app closed')
core.quit() # user pressed cancel
expInfo['date'] = data.getDateStr() # add a simple timestamp
expInfo['expName'] = expName
print('subject: ' + expInfo['participant'])
print('exp: ' + expName)
# Data file name stem = absolute path + name; later add .psyexp, .csv, .log, etc
filename = os.path.join(_thisDir, u'data', u'%s_%s_%s' % (expInfo['participant'], expName, expInfo['date']))
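The data-file stem concatenates the script directory, a `data` subfolder, and the participant/experiment/date triple. A hedged sketch of the portable way to build it (`data_stem` is a hypothetical helper name, avoiding a hard-coded Windows `\` separator):

```python
import os

def data_stem(base_dir, participant, exp_name, date_str):
    """Build the data-file name stem with the platform's own path separator."""
    return os.path.join(base_dir, 'data', '%s_%s_%s' % (participant, exp_name, date_str))
```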
# An ExperimentHandler isn't essential but helps with data saving
thisExp = data.ExperimentHandler(name=expName, version='',
extraInfo=expInfo, runtimeInfo=None,
originPath=None,
savePickle=True, saveWideText=True,
dataFileName=filename)
# save a log file for detailed verbose info
logFile = logging.LogFile(filename+'.log', level=logging.EXP)
logging.console.setLevel(logging.WARNING) # this outputs to the screen, not a file
endExpNow = False # flag for 'escape' or other condition => quit the exp
# Start Code - component code to be run before the window creation
# Setup the Window
win = visual.Window(size=(1360, 768), fullscr=False, screen=0, allowGUI=True, allowStencil=False,
monitor='testMonitor', color=[0,0,0], colorSpace='rgb',
blendMode='avg', useFBO=True,
)
# Get all the images
get_all_images(win)
# store frame rate of monitor if we can measure it successfully
expInfo['frameRate']=win.getActualFrameRate()
if expInfo['frameRate']!=None:
frameDur = 1.0/round(expInfo['frameRate'])
else:
frameDur = 1.0/60.0 # couldn't get a reliable measure so guess
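The frame-duration logic above rounds the measured refresh rate and guesses 60 Hz when measurement fails. The same decision as a small pure function (`frame_duration` is an illustrative name, not part of the original script):

```python
def frame_duration(measured_rate, fallback_hz=60.0):
    """Seconds per frame; fall back to a 60 Hz guess when measurement fails."""
    if measured_rate is not None:
        return 1.0 / round(measured_rate)
    return 1.0 / fallback_hz
```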
# Initialize components for Routine "Instr"
InstrClock = core.Clock()
Instructions = visual.TextStim(win=win, ori=0, name='Instructions',
text='You will be presented with an array of 4 faces; afterwards you will be presented with a face cue in the center of the screen.\n\nYour task is to indicate the location of the face cue in the previous array by pressing the correct location number (1-4) on the labeled keys.\n\nPlease keep your EYES FIXATED ON THE CROSS.\nPlease use your DOMINANT hand to respond as quickly and accurately as possible.\n\nYou will start with a practice session.\nPlease press the spacebar to continue.', font='Arial',
pos=[0, 0], height=0.1, wrapWidth=1.5,
color='white', colorSpace='rgb', opacity=1,
depth=0.0)
# Create the 4 numbers
for i in range(4):
number_texts.append(visual.TextStim(win=win, ori=0, name='Number' + str(i+1), units='deg',
text=str(i+1), font='Arial',
pos=locations[i], height=2, wrapWidth=None,
color='white', colorSpace='rgb', opacity=1,
depth=0.0))
# Initialize components for Routine "Continue"
ContinueClock = core.Clock()
Ready = visual.TextStim(win=win, ori=0, name='Ready',
text='That was the end of the block.\n\nIf you need to take a break please do so now.\n\nWhen you are ready to continue please press the spacebar and remember to KEEP YOUR EYES FIXATED ON THE CROSS.', font='Arial',
pos=[0, 0], height=0.1, wrapWidth=1.5,
color='white', colorSpace='rgb', opacity=1,
depth=0.0)
# Initialize components for Routine "Trial"
TrialClock = core.Clock()
FixationPoint = visual.TextStim(win=win, ori=0, name='FixationPoint',
text='+', font='Arial',
pos=[0, 0], height=0.1, wrapWidth=None,
color=1.0, colorSpace='rgb', opacity=1,
depth=0.0)
p_port = parallel.ParallelPort(address=u'0xDFF8')
p_port_images = parallel.ParallelPort(address=u'0xDFF8')
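Both parallel ports are driven with the same pattern later on: write a trigger code on a screen flip, then clear the lines back to 0. A minimal sketch of that pulse pattern against a mock port (`MockPort` and `pulse` are illustrative stand-ins, not PsychoPy API):

```python
class MockPort(object):
    """Stand-in for parallel.ParallelPort that records written trigger values."""
    def __init__(self):
        self.writes = []
    def setData(self, value):
        self.writes.append(value)

def pulse(port, code):
    """Write a trigger code, then clear the lines to 0, as the trial loop does."""
    port.setData(int(code))
    port.setData(0)
```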
# Initialize components for Routine "Break"
BreakClock = core.Clock()
Break = visual.TextStim(win=win, ori=0, name='Break',
text='That was the end of the block.\n\nIf you need to take a break please take one now.\n\nWhen you are ready to continue please press the spacebar.', font='Arial',
pos=[0, 0], height=0.1, wrapWidth=1.5,
color='white', colorSpace='rgb', opacity=1,
depth=0.0)
# Create some handy timers
globalClock = core.Clock() # to track the time since experiment started
routineTimer = core.CountdownTimer() # to track time remaining of each (non-slip) routine
#------Prepare to start Routine "Instr"-------
t = 0
InstrClock.reset() # clock
frameN = -1
# update component parameters for each repeat
InstrKey_resp = event.BuilderKeyResponse() # create an object of type KeyResponse
InstrKey_resp.status = NOT_STARTED
# keep track of which components have finished
InstrComponents = []
InstrComponents.append(Instructions)
InstrComponents.append(InstrKey_resp)
for thisComponent in InstrComponents:
if hasattr(thisComponent, 'status'):
thisComponent.status = NOT_STARTED
#-------Start Routine "Instr"-------
continueRoutine = True
while continueRoutine:
# get current time
t = InstrClock.getTime()
frameN = frameN + 1 # number of completed frames (so 0 is the first frame)
# update/draw components on each frame
# *Instructions* updates
if t >= 0.0 and Instructions.status == NOT_STARTED:
# keep track of start time/frame for later
Instructions.tStart = t # underestimates by a little under one frame
Instructions.frameNStart = frameN # exact frame index
Instructions.setAutoDraw(True)
if Instructions.status == STARTED: # only update if being drawn
Instructions.setColor('white', colorSpace='rgb', log=False)
# *InstrKey_resp* updates
if t >= 0.0 and InstrKey_resp.status == NOT_STARTED:
# keep track of start time/frame for later
InstrKey_resp.tStart = t # underestimates by a little under one frame
InstrKey_resp.frameNStart = frameN # exact frame index
InstrKey_resp.status = STARTED
# keyboard checking is just starting
win.callOnFlip(InstrKey_resp.clock.reset) # t=0 on next screen flip
event.clearEvents(eventType='keyboard')
if InstrKey_resp.status == STARTED:
theseKeys = event.getKeys(keyList=['space'])
# check for quit:
if "escape" in theseKeys:
endExpNow = True
if len(theseKeys) > 0: # at least one key was pressed
InstrKey_resp.keys = theseKeys[-1] # just the last key pressed
InstrKey_resp.rt = InstrKey_resp.clock.getTime()
# a response ends the routine
continueRoutine = False
# check if all components have finished
if not continueRoutine: # a component has requested a forced-end of Routine
break
continueRoutine = False # will revert to True if at least one component still running
for thisComponent in InstrComponents:
if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
continueRoutine = True
break # at least one component has not yet finished
# check for quit (the Esc key)
if endExpNow or event.getKeys(keyList=["escape"]):
core.quit()
# refresh the screen
if continueRoutine: # don't flip if this routine is over or we'll get a blank screen
win.flip()
#-------Ending Routine "Instr"-------
for thisComponent in InstrComponents:
if hasattr(thisComponent, "setAutoDraw"):
thisComponent.setAutoDraw(False)
# check responses
if InstrKey_resp.keys in ['', [], None]: # No response was made
InstrKey_resp.keys=None
# store data for thisExp (ExperimentHandler)
thisExp.addData('InstrKey_resp.keys',InstrKey_resp.keys)
if InstrKey_resp.keys != None: # we had a response
thisExp.addData('InstrKey_resp.rt', InstrKey_resp.rt)
thisExp.nextEntry()
# the Routine "Instr" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
#-------Starting Practice Trials-------#
# Each trial executes this function
def execute_trial(name, gender):
endExpNow = False
if gender == MALE:
conditions_file = data.importConditions(u'MalePracticeFile.csv')
else:
conditions_file = data.importConditions(u'FemalePracticeFile.csv')
# set up handler to look after randomisation of conditions etc
Trials = data.TrialHandler(nReps=1, method='random', extraInfo=expInfo, originPath="-1", trialList=conditions_file, seed=None, name=name)
thisExp.addLoop(Trials) # add the loop to the experiment
thisTrials = Trials.trialList[0] # so we can initialise stimuli with some values
# abbreviate parameter names if possible (e.g. rgb=thisTrialsF1.rgb)
if thisTrials != None:
for paramName in thisTrials.keys():
exec(paramName + "= thisTrials." + paramName)
for thisTrials in Trials:
currentLoop = Trials
# abbreviate parameter names if possible (e.g. rgb = thisTrialsF1.rgb)
if thisTrials != None:
for paramName in thisTrials.keys():
exec(paramName + "= thisTrials." + paramName)
#------Prepare to start Routine "Trial"-------
t = 0
TrialClock.reset() # clock
frameN = -1
routineTimer.add(8.500000)
# update component parameters for each repeat
FixationPoint.setColor('white', colorSpace='rgb')
for number in number_texts:
number.setColor('white', colorSpace='rgb') #Numbers & color of numbers
Key_Resp = event.BuilderKeyResponse() # create an object of type KeyResponse
Key_Resp.status = NOT_STARTED
# Get the random images
if gender == MALE:
random_images = choose_male_images()
else:
random_images = choose_female_images()
Image1 = random_images[0]
Image2 = random_images[1]
Image3 = random_images[2]
Image4 = random_images[3]
#Choose center image & Add to file
center_image = random.choice(random_images)
thisExp.addData('Center_image', center_image.name)
#Code
print(str(center_image.name[1:3]))
if str(center_image.name[1:3]) == 'MN':
code = '1'
#white male = sad male
elif str(center_image.name[1:3]) == 'MS':
code = '2'
#black female = neutral female
elif str(center_image.name[1:3]) == 'FN':
code = '3'
#white female = sad female
else:
code = '4'
#Adding Condition Column to Data File
thisExp.addData('condition', center_image.name[1:3])
#Adding Code Column to Data File
thisExp.addData('code', code)
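The if/elif chain above maps the two-letter condition tag in characters 1-3 of the image name (`MN`, `MS`, `FN`, else) to a trigger code string. The same mapping as a lookup table (a sketch; `condition_code` is an illustrative name, and the assumption that the fall-through tag is `FS` follows the file-naming comments above):

```python
CONDITION_CODES = {'MN': '1', 'MS': '2', 'FN': '3', 'FS': '4'}

def condition_code(image_name):
    """Map the 2-letter tag in name[1:3] to a code; unknown tags fall through to '4'."""
    return CONDITION_CODES.get(image_name[1:3], '4')
```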
#Correct Answer
if center_image == Image1:
correct_answer = 'num_8'
elif center_image == Image2:
correct_answer = 'num_6'
elif center_image == Image3:
correct_answer = 'num_2'
elif center_image == Image4:
correct_answer = 'num_4'
else:
correct_answer = 'None'
#Adding Correct Answer Column to Data File
thisExp.addData('correct_answer', correct_answer)
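The correct-answer chain above pairs each array position with a numpad key (Image1 through Image4 map to `num_8`, `num_6`, `num_2`, `num_4`). A compact sketch of that lookup (`correct_key` and `POSITION_KEYS` are illustrative names, not part of the original script):

```python
POSITION_KEYS = ['num_8', 'num_6', 'num_2', 'num_4']  # positions 1-4 on the numpad

def correct_key(images, center):
    """Return the numpad key for the cued image's array position, or 'None'."""
    for idx, img in enumerate(images):
        if img == center:
            return POSITION_KEYS[idx]
    return 'None'
```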
# keep track of which components have finished
TrialComponents = []
TrialComponents.append(FixationPoint)
TrialComponents.append(Image1)
TrialComponents.append(Image2)
TrialComponents.append(Image3)
TrialComponents.append(Image4)
TrialComponents.append(center_image)
TrialComponents.append(p_port) #Adding port
TrialComponents.append(p_port_images) #Adding images port
for number in number_texts:
TrialComponents.append(number)
TrialComponents.append(Key_Resp)
for thisComponent in TrialComponents:
if hasattr(thisComponent, 'status'):
thisComponent.status = NOT_STARTED
#-------Start Routine "Trial"-------
continueRoutine = True
while continueRoutine and routineTimer.getTime() > 0:
# get current time
t = TrialClock.getTime()
frameN = frameN + 1 # number of completed frames (so 0 is the first frame)
# update/draw components on each frame
# *FixationPoint* updates
if t >= 0.0 and FixationPoint.status == NOT_STARTED:
# keep track of start time/frame for later
FixationPoint.tStart = t # underestimates by a little under one frame
FixationPoint.frameNStart = frameN # exact frame index
FixationPoint.setAutoDraw(True)
if FixationPoint.status == STARTED and t >= (0.0 + (8.5-win.monitorFramePeriod*0.75)): #most of one frame period left
#if FixationPoint.status == STARTED and t >= 0:
FixationPoint.setAutoDraw(False)
#Random Images
if t >= 0.5 and random_images[0].status == NOT_STARTED:
for image in random_images:
# keep track of start time/frame for later
image.tStart = t # underestimates by a little under one frame
image.frameNStart = frameN # exact frame index
image.setAutoDraw(True)
if random_images[0].status == STARTED and t >= 2.0 and t < 3.0: # offset window for the image array
for image in random_images:
image.setAutoDraw(False)
# *p_port_images* updates
if t >= 0.5 and p_port_images.status == NOT_STARTED:
# keep track of start time/frame for later
p_port_images.tStart = t # underestimates by a little under one frame
p_port_images.frameNStart = frameN # exact frame index
p_port_images.status = STARTED
win.callOnFlip(p_port_images.setData, int(14))
if p_port_images.status == STARTED and t >= (2.0-win.monitorFramePeriod*0.75): #most of one frame period left
p_port_images.status = STOPPED
win.callOnFlip(p_port_images.setData, int(15))
# *Numbers* updates
if t >= 3.5 and number_texts[0].status == NOT_STARTED:
for number in number_texts:
number.tStart = t # underestimates by a little under one frame
number.frameNStart = frameN # exact frame index
number.setAutoDraw(True)
if number_texts[0].status == STARTED and t >= 8.5: #most of one frame period left
for number in number_texts:
number.setAutoDraw(False)
# *Key_Resp* updates
if t >= 3.5 and Key_Resp.status == NOT_STARTED:
print(t)
center_image.setPos(newPos=(0, 0))
center_image.setAutoDraw(True)
# keep track of start time/frame for later
Key_Resp.tStart = t # underestimates by a little under one frame
Key_Resp.frameNStart = frameN # exact frame index
Key_Resp.status = STARTED
# keyboard checking is just starting
win.callOnFlip(Key_Resp.clock.reset) # t=0 on next screen flip
event.clearEvents(eventType='keyboard')
if Key_Resp.status == STARTED and t >= 8.5: #most of one frame period left
print(t)
Key_Resp.status = STOPPED
center_image.setAutoDraw(False)
if Key_Resp.status == STARTED:
theseKeys = event.getKeys(keyList=['num_8', 'num_6', 'num_2', 'num_4'])
# check for quit:
if "escape" in theseKeys:
endExpNow = True
accuracy = 0
if len(theseKeys) > 0: # at least one key was pressed
Key_Resp.keys = theseKeys[-1] # just the last key pressed
Key_Resp.rt = Key_Resp.clock.getTime()
#Check for Correct or Incorrect
if (correct_answer == str(Key_Resp.keys)) or (correct_answer == Key_Resp.keys):
accuracy = corr_answer
elif Key_Resp.keys in ['', [], None]:
accuracy = no_answer
else:
accuracy = incorrect_answer
# a response ends the routine
continueRoutine = False
# *p_port* updates
if t >= 3.5 and p_port.status == NOT_STARTED:
# keep track of start time/frame for later
p_port.tStart = t # underestimates by a little under one frame
p_port.frameNStart = frameN # exact frame index
p_port.status = STARTED
win.callOnFlip(p_port.setData, int(code))
if p_port.status == STARTED and t >= (8.5-win.monitorFramePeriod*0.75): #most of one frame period left
p_port.status = STOPPED
win.callOnFlip(p_port.setData, int(0))
# check if all components have finished
if not continueRoutine: # a component has requested a forced-end of Routine
break
continueRoutine = False # will revert to True if at least one component still running
for thisComponent in TrialComponents:
if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
continueRoutine = True
break # at least one component has not yet finished
# check for quit (the Esc key)
if endExpNow or event.getKeys(keyList=["escape"]):
core.quit()
# refresh the screen
if continueRoutine: # don't flip if this routine is over or we'll get a blank screen
win.flip()
#-------Ending Routine "Trial"-------
for thisComponent in TrialComponents:
if hasattr(thisComponent, "setAutoDraw"):
thisComponent.setAutoDraw(False)
# check responses
if Key_Resp.keys in ['', [], None]: # No response was made
Key_Resp.keys=None
#Adding Accuracy Column to Data File
if (correct_answer == str(Key_Resp.keys)) or (correct_answer == Key_Resp.keys):
accuracy = corr_answer
elif Key_Resp.keys in ['', [], None]:
accuracy = no_answer
else:
accuracy = incorrect_answer
thisExp.addData('accuracy', accuracy)
# store data for TrialsF1 (TrialHandler)
Trials.addData('Key_Resp.keys',Key_Resp.keys)
if p_port_images.status == STARTED:
win.callOnFlip(p_port_images.setData, int(0))
if Key_Resp.keys != None: # we had a response
Trials.addData('Key_Resp.rt', Key_Resp.rt)
if p_port.status == STARTED:
win.callOnFlip(p_port.setData, int(accuracy))
thisExp.nextEntry()
for b in range(NUM_BLOCKS): # 'b', not 't', so the trial-clock variable is not shadowed
execute_trial("TrialsF" + str(b+1), FEMALE)
execute_trial("TrialsM" + str(b+1), MALE)
#Change order to change Male/Female blocks
# completed 1 repeats of 'practice'
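Each pass of the block loop above runs one female block and then one male block with matching block numbers. A sketch of the resulting label order (`block_order` is an illustrative helper, not part of the original script; swapping `first` is the "change order" adjustment the comment mentions):

```python
def block_order(num_blocks, first='F'):
    """Interleaved block labels, e.g. block_order(2) -> ['F1', 'M1', 'F2', 'M2']."""
    second = 'M' if first == 'F' else 'F'
    order = []
    for b in range(1, num_blocks + 1):
        order.append('%s%d' % (first, b))
        order.append('%s%d' % (second, b))
    return order
```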
#----------Starting Actual Trials---------#
# Each trial executes this function
def execute_trial(name, gender):
endExpNow = False
if gender == MALE:
conditions_file = data.importConditions(u'EmotionMaConditionsFile.csv')
else:
conditions_file = data.importConditions(u'EmotionFeConditionsFile.csv')
# set up handler to look after randomisation of conditions etc
Trials = data.TrialHandler(nReps=6, method='random', #CHANGE NUMBER OF REPS
extraInfo=expInfo, originPath="-1",
trialList=conditions_file,
seed=None, name=name)
thisExp.addLoop(Trials) # add the loop to the experiment
thisTrials = Trials.trialList[0] # so we can initialise stimuli with some values
# abbreviate parameter names if possible (e.g. rgb=thisTrialsF1.rgb)
if thisTrials != None:
for paramName in thisTrials.keys():
exec(paramName + "= thisTrials." + paramName)
for thisTrials in Trials:
currentLoop = Trials
# abbreviate parameter names if possible (e.g. rgb = thisTrialsF1.rgb)
if thisTrials != None:
for paramName in thisTrials.keys():
exec(paramName + "= thisTrials." + paramName)
#------Prepare to start Routine "Trial"-------
t = 0
TrialClock.reset() # clock
frameN = -1
routineTimer.add(8.500000)
# update component parameters for each repeat
FixationPoint.setColor('white', colorSpace='rgb')
for number in number_texts:
number.setColor('white', colorSpace='rgb')
Key_Resp = event.BuilderKeyResponse() # create an object of type KeyResponse
Key_Resp.status = NOT_STARTED
# Get the random images
if gender == MALE:
random_images = choose_male_images()
else:
random_images = choose_female_images()
Image1 = random_images[0]
Image2 = random_images[1]
Image3 = random_images[2]
Image4 = random_images[3]
#Choose center image & Add to file
center_image = random.choice(random_images)
thisExp.addData('Center_image', center_image.name)
#Code
print(str(center_image.name[1:3]))
if str(center_image.name[1:3]) == 'MN':
code = '1'
#white male = sad male
elif str(center_image.name[1:3]) == 'MS':
code = '2'
#black female = neutral female
elif str(center_image.name[1:3]) == 'FN':
code = '3'
#white female = sad female
else:
code = '4'
#Adding Condition Column to Data File
thisExp.addData('condition', center_image.name[1:3])
#Adding Code Column to Data File
thisExp.addData('code', code)
#Correct Answer
if center_image == Image1:
correct_answer = 'num_8'
elif center_image == Image2:
correct_answer = 'num_6'
elif center_image == Image3:
correct_answer = 'num_2'
elif center_image == Image4:
correct_answer = 'num_4'
else:
correct_answer = 'None'
#Adding Correct Answer Column to Data File
thisExp.addData('correct_answer', correct_answer)
# keep track of which components have finished
TrialComponents = []
TrialComponents.append(FixationPoint)
TrialComponents.append(Image1)
TrialComponents.append(Image2)
TrialComponents.append(Image3)
TrialComponents.append(Image4)
TrialComponents.append(center_image)
TrialComponents.append(p_port)
TrialComponents.append(p_port_images)
for number in number_texts:
TrialComponents.append(number)
TrialComponents.append(Key_Resp)
for thisComponent in TrialComponents:
if hasattr(thisComponent, 'status'):
thisComponent.status = NOT_STARTED
#-------Start Routine "Trial"-------
continueRoutine = True
while continueRoutine and routineTimer.getTime() > 0:
# get current time
t = TrialClock.getTime()
frameN = frameN + 1 # number of completed frames (so 0 is the first frame)
# update/draw components on each frame
# *FixationPoint* updates
if t >= 0.0 and FixationPoint.status == NOT_STARTED:
# keep track of start time/frame for later
FixationPoint.tStart = t # underestimates by a little under one frame
FixationPoint.frameNStart = frameN # exact frame index
FixationPoint.setAutoDraw(True)
if FixationPoint.status == STARTED and t >= (0.0 + (8.5-win.monitorFramePeriod*0.75)): #most of one frame period left
#if FixationPoint.status == STARTED and t >= 0:
FixationPoint.setAutoDraw(False)
#Random Images
if t >= 0.5 and random_images[0].status == NOT_STARTED:
for image in random_images:
# keep track of start time/frame for later
image.tStart = t # underestimates by a little under one frame
image.frameNStart = frameN # exact frame index
image.setAutoDraw(True)
if random_images[0].status == STARTED and t >= 2.0 and t < 3.0: # offset window for the image array
for image in random_images:
image.setAutoDraw(False)
# *p_port_images* updates
if t >= 0.5 and p_port_images.status == NOT_STARTED:
# keep track of start time/frame for later
p_port_images.tStart = t # underestimates by a little under one frame
p_port_images.frameNStart = frameN # exact frame index
p_port_images.status = STARTED
win.callOnFlip(p_port_images.setData, int(14))
if p_port_images.status == STARTED and t >= (2.0-win.monitorFramePeriod*0.75): #most of one frame period left
p_port_images.status = STOPPED
win.callOnFlip(p_port_images.setData, int(15))
# *Numbers* updates
if t >= 3.5 and number_texts[0].status == NOT_STARTED:
for number in number_texts:
number.tStart = t # underestimates by a little under one frame
number.frameNStart = frameN # exact frame index
number.setAutoDraw(True)
if number_texts[0].status == STARTED and t >= 8.5: #most of one frame period left
for number in number_texts:
number.setAutoDraw(False)
# *Key_Resp* updates
if t >= 3.5 and Key_Resp.status == NOT_STARTED:
center_image.setPos(newPos = (0, 0))
center_image.setAutoDraw(True)
# keep track of start time/frame for later
Key_Resp.tStart = t # underestimates by a little under one frame
Key_Resp.frameNStart = frameN # exact frame index
Key_Resp.status = STARTED
# keyboard checking is just starting
win.callOnFlip(Key_Resp.clock.reset) # t=0 on next screen flip
event.clearEvents(eventType='keyboard')
if Key_Resp.status == STARTED and t >= 8.5: #most of one frame period left
Key_Resp.status = STOPPED
center_image.setAutoDraw(False)
if Key_Resp.status == STARTED:
theseKeys = event.getKeys(keyList=['num_8', 'num_6', 'num_2', 'num_4'])
# check for quit:
if "escape" in theseKeys:
endExpNow = True
if len(theseKeys) > 0: # at least one key was pressed
Key_Resp.keys = theseKeys[-1] # just the last key pressed
Key_Resp.rt = Key_Resp.clock.getTime()
#Check for Correct or Incorrect
if (correct_answer == str(Key_Resp.keys)) or (correct_answer == Key_Resp.keys):
accuracy = corr_answer
elif Key_Resp.keys in ['', [], None]:
accuracy = no_answer
else:
accuracy = incorrect_answer
# a response ends the routine
continueRoutine = False
# *p_port* updates
if t >= 3.5 and p_port.status == NOT_STARTED:
# keep track of start time/frame for later
p_port.tStart = t # underestimates by a little under one frame
p_port.frameNStart = frameN # exact frame index
p_port.status = STARTED
win.callOnFlip(p_port.setData, int(code))
if p_port.status == STARTED and t >= (8.5-win.monitorFramePeriod*0.75): #most of one frame period left
p_port.status = STOPPED
win.callOnFlip(p_port.setData, int(0))
# check if all components have finished
if not continueRoutine: # a component has requested a forced-end of Routine
break
continueRoutine = False # will revert to True if at least one component still running
for thisComponent in TrialComponents:
if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
continueRoutine = True
break # at least one component has not yet finished
# check for quit (the Esc key)
if endExpNow or event.getKeys(keyList=["escape"]):
core.quit()
# refresh the screen
if continueRoutine: # don't flip if this routine is over or we'll get a blank screen
win.flip()
#-------Ending Routine "Trial"-------
for thisComponent in TrialComponents:
if hasattr(thisComponent, "setAutoDraw"):
thisComponent.setAutoDraw(False)
# check responses
if Key_Resp.keys in ['', [], None]: # No response was made
Key_Resp.keys=None
#Adding Accuracy Column to Data File
if (correct_answer == str(Key_Resp.keys)) or (correct_answer == Key_Resp.keys):
accuracy = corr_answer
elif Key_Resp.keys in ['', [], None]:
accuracy = no_answer
else:
accuracy = incorrect_answer
thisExp.addData('accuracy', accuracy)
# store data for TrialsF1 (TrialHandler)
Trials.addData('Key_Resp.keys',Key_Resp.keys)
if p_port_images.status == STARTED:
win.callOnFlip(p_port_images.setData, int(0))
if Key_Resp.keys != None: # we had a response
Trials.addData('Key_Resp.rt', Key_Resp.rt)
if p_port.status == STARTED:
win.callOnFlip(p_port.setData, int(accuracy))
thisExp.nextEntry()
#------Prepare to start Routine "Break"-------
def execute_Break(name):
t = 0
BreakClock.reset() # clock
frameN = -1
# update component parameters for each repeat
BreakKey_resp = event.BuilderKeyResponse() # create an object of type KeyResponse
BreakKey_resp.status = NOT_STARTED
# keep track of which components have finished
BreakComponents = []
BreakComponents.append(Ready)
BreakComponents.append(BreakKey_resp)
for thisComponent in BreakComponents:
if hasattr(thisComponent, 'status'):
thisComponent.status = NOT_STARTED
#-------Start Routine "Break"-------
continueRoutine = True
while continueRoutine:
# get current time
t = BreakClock.getTime()
frameN = frameN + 1 # number of completed frames (so 0 is the first frame)
# update/draw components on each frame
# *Ready* updates
if t >= 0.0 and Ready.status == NOT_STARTED:
# keep track of start time/frame for later
Ready.tStart = t # underestimates by a little under one frame
Ready.frameNStart = frameN # exact frame index
Ready.setAutoDraw(True)
# *BreakKey_resp* updates
if t >= 0.0 and BreakKey_resp.status == NOT_STARTED:
# keep track of start time/frame for later
BreakKey_resp.tStart = t # underestimates by a little under one frame
BreakKey_resp.frameNStart = frameN # exact frame index
BreakKey_resp.status = STARTED
# keyboard checking is just starting
win.callOnFlip(BreakKey_resp.clock.reset) # t=0 on next screen flip
event.clearEvents(eventType='keyboard')
if BreakKey_resp.status == STARTED:
theseKeys = event.getKeys(keyList=['space'])
# check for quit:
if "escape" in theseKeys:
endExpNow = True
if len(theseKeys) > 0: # at least one key was pressed
BreakKey_resp.keys = theseKeys[-1] # just the last key pressed
BreakKey_resp.rt = BreakKey_resp.clock.getTime()
# a response ends the routine
continueRoutine = False
# check if all components have finished
if not continueRoutine: # a component has requested a forced-end of Routine
break
continueRoutine = False # will revert to True if at least one component still running
for thisComponent in BreakComponents:
if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
continueRoutine = True
break # at least one component has not yet finished
if endExpNow: # quit before the flag is cleared below, otherwise Esc is swallowed here
core.quit()
endExpNow = False
# check for quit (the Esc key)
if endExpNow or event.getKeys(keyList=["escape"]):
core.quit()
# refresh the screen
if continueRoutine: # don't flip if this routine is over or we'll get a blank screen
win.flip()
#-------Ending Routine "Break"-------
for thisComponent in BreakComponents:
if hasattr(thisComponent, "setAutoDraw"):
thisComponent.setAutoDraw(False)
# check responses
if BreakKey_resp.keys in ['', [], None]: # No response was made
BreakKey_resp.keys=None
# store data for thisExp (ExperimentHandler)
thisExp.addData('BreakKey_resp.keys',BreakKey_resp.keys)
if BreakKey_resp.keys != None: # we had a response
thisExp.addData('BreakKey_resp.rt', BreakKey_resp.rt)
thisExp.nextEntry()
# the Routine "Continue" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
for t in range(NUM_BLOCKS):
execute_Break("Break")
execute_trial("TrialsF" + str(t+1), FEMALE)
execute_Break("Break")
execute_trial("TrialsM" + str(t+1), MALE)
#Change order to change Male/Female blocks
# these shouldn't be strictly necessary (should auto-save)
thisExp.saveAsWideText(filename+'.csv')
thisExp.saveAsPickle(filename)
logging.flush()
# make sure everything is closed down
thisExp.abort() # or data files will save again on exit
win.close()
core.quit()
| 43.703003 | 511 | 0.616432 | 4,834 | 39,289 | 4.919942 | 0.119156 | 0.016482 | 0.011563 | 0.015473 | 0.801749 | 0.782912 | 0.764496 | 0.745112 | 0.726107 | 0.709667 | 0 | 0.018996 | 0.28718 | 39,289 | 898 | 512 | 43.75167 | 0.830215 | 0.235104 | 0 | 0.732595 | 0 | 0.004747 | 0.069304 | 0.002592 | 0.006329 | 0 | 0.000409 | 0 | 0 | 0 | null | null | 0 | 0.02057 | null | null | 0.022152 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# File: gr37/kerberos/kerberos_sigmf_decode1.py (repo: zleffke/flowgraph_sandbox, license: MIT)
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
##################################################
# GNU Radio Python Flow Graph
# Title: Kerberos Sigmf Decode1
# GNU Radio version: 3.7.13.4
##################################################
if __name__ == '__main__':
import ctypes
import sys
if sys.platform.startswith('linux'):
try:
x11 = ctypes.cdll.LoadLibrary('libX11.so')
x11.XInitThreads()
except:
print "Warning: failed to XInitThreads()"
from PyQt4 import Qt
from gnuradio import analog
from gnuradio import blocks
from gnuradio import eng_notation
from gnuradio import gr
from gnuradio import qtgui
from gnuradio.eng_option import eng_option
from gnuradio.filter import firdes
from optparse import OptionParser
import adsb
import gr_sigmf
import pyqt
import sip
import sys
class kerberos_sigmf_decode1(gr.top_block, Qt.QWidget):
def __init__(self):
gr.top_block.__init__(self, "Kerberos Sigmf Decode1")
Qt.QWidget.__init__(self)
self.setWindowTitle("Kerberos Sigmf Decode1")
qtgui.util.check_set_qss()
try:
self.setWindowIcon(Qt.QIcon.fromTheme('gnuradio-grc'))
except:
pass
self.top_scroll_layout = Qt.QVBoxLayout()
self.setLayout(self.top_scroll_layout)
self.top_scroll = Qt.QScrollArea()
self.top_scroll.setFrameStyle(Qt.QFrame.NoFrame)
self.top_scroll_layout.addWidget(self.top_scroll)
self.top_scroll.setWidgetResizable(True)
self.top_widget = Qt.QWidget()
self.top_scroll.setWidget(self.top_widget)
self.top_layout = Qt.QVBoxLayout(self.top_widget)
self.top_grid_layout = Qt.QGridLayout()
self.top_layout.addLayout(self.top_grid_layout)
self.settings = Qt.QSettings("GNU Radio", "kerberos_sigmf_decode1")
self.restoreGeometry(self.settings.value("geometry").toByteArray())
##################################################
# Variables
##################################################
self.throttle = throttle = 10
self.thresh = thresh = 10
self.samp_rate = samp_rate = 2e6
##################################################
# Blocks
##################################################
self.main_tab = Qt.QTabWidget()
self.main_tab_widget_0 = Qt.QWidget()
self.main_tab_layout_0 = Qt.QBoxLayout(Qt.QBoxLayout.TopToBottom, self.main_tab_widget_0)
self.main_tab_grid_layout_0 = Qt.QGridLayout()
self.main_tab_layout_0.addLayout(self.main_tab_grid_layout_0)
self.main_tab.addTab(self.main_tab_widget_0, 'Channel')
self.main_tab_widget_1 = Qt.QWidget()
self.main_tab_layout_1 = Qt.QBoxLayout(Qt.QBoxLayout.TopToBottom, self.main_tab_widget_1)
self.main_tab_grid_layout_1 = Qt.QGridLayout()
self.main_tab_layout_1.addLayout(self.main_tab_grid_layout_1)
self.main_tab.addTab(self.main_tab_widget_1, 'Correlate')
self.main_tab_widget_2 = Qt.QWidget()
self.main_tab_layout_2 = Qt.QBoxLayout(Qt.QBoxLayout.TopToBottom, self.main_tab_widget_2)
self.main_tab_grid_layout_2 = Qt.QGridLayout()
self.main_tab_layout_2.addLayout(self.main_tab_grid_layout_2)
self.main_tab.addTab(self.main_tab_widget_2, 'Decode')
self.top_grid_layout.addWidget(self.main_tab, 1, 0, 1, 1)
for r in range(1, 2):
self.top_grid_layout.setRowStretch(r, 1)
for c in range(0, 1):
self.top_grid_layout.setColumnStretch(c, 1)
self._throttle_tool_bar = Qt.QToolBar(self)
self._throttle_tool_bar.addWidget(Qt.QLabel('Throttle'+": "))
self._throttle_line_edit = Qt.QLineEdit(str(self.throttle))
self._throttle_tool_bar.addWidget(self._throttle_line_edit)
self._throttle_line_edit.returnPressed.connect(
lambda: self.set_throttle(eng_notation.str_to_num(str(self._throttle_line_edit.text().toAscii()))))
self.main_tab_grid_layout_0.addWidget(self._throttle_tool_bar, 9, 2, 1, 2)
for r in range(9, 10):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(2, 4):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self._thresh_tool_bar = Qt.QToolBar(self)
self._thresh_tool_bar.addWidget(Qt.QLabel('GUI Threshold'+": "))
self._thresh_line_edit = Qt.QLineEdit(str(self.thresh))
self._thresh_tool_bar.addWidget(self._thresh_line_edit)
self._thresh_line_edit.returnPressed.connect(
lambda: self.set_thresh(eng_notation.str_to_num(str(self._thresh_line_edit.text().toAscii()))))
self.top_grid_layout.addWidget(self._thresh_tool_bar, 9, 0, 1, 1)
for r in range(9, 10):
self.top_grid_layout.setRowStretch(r, 1)
for c in range(0, 1):
self.top_grid_layout.setColumnStretch(c, 1)
self.sigmf_source_3 = gr_sigmf.source('/home/zleffke/captures/kerberos/20210326/with_noise/CHAN3_2021-03-26T21:12:02Z.sigmf-data', "cf32" + ("_le" if sys.byteorder == "little" else "_be"), False)
self.sigmf_source_2 = gr_sigmf.source('/home/zleffke/captures/kerberos/20210326/with_noise/CHAN2_2021-03-26T21:12:02Z.sigmf-data', "cf32" + ("_le" if sys.byteorder == "little" else "_be"), False)
self.sigmf_source_1 = gr_sigmf.source('/home/zleffke/captures/kerberos/20210326/with_noise/CHAN1_2021-03-26T21:12:02Z.sigmf-data', "cf32" + ("_le" if sys.byteorder == "little" else "_be"), False)
self.sigmf_source_0 = gr_sigmf.source('/home/zleffke/captures/kerberos/20210330/2200/CHAN0_2021-03-30T22:00:02Z.sigmf-data', "cf32" + ("_le" if sys.byteorder == "little" else "_be"), False)
self.qtgui_waterfall_sink_x_0_0_1 = qtgui.waterfall_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_waterfall_sink_x_0_0_1.set_update_time(0.010)
self.qtgui_waterfall_sink_x_0_0_1.enable_grid(False)
self.qtgui_waterfall_sink_x_0_0_1.enable_axis_labels(True)
labels = ['', '', '', '', '',
'', '', '', '', '']
colors = [0, 0, 0, 0, 0,
0, 0, 0, 0, 0]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_waterfall_sink_x_0_0_1.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_waterfall_sink_x_0_0_1.set_line_label(i, labels[i])
self.qtgui_waterfall_sink_x_0_0_1.set_color_map(i, colors[i])
self.qtgui_waterfall_sink_x_0_0_1.set_line_alpha(i, alphas[i])
self.qtgui_waterfall_sink_x_0_0_1.set_intensity_range(-100, 10)
self._qtgui_waterfall_sink_x_0_0_1_win = sip.wrapinstance(self.qtgui_waterfall_sink_x_0_0_1.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_waterfall_sink_x_0_0_1_win, 2, 3, 2, 1)
for r in range(2, 4):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(3, 4):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_waterfall_sink_x_0_0_0 = qtgui.waterfall_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_waterfall_sink_x_0_0_0.set_update_time(0.010)
self.qtgui_waterfall_sink_x_0_0_0.enable_grid(False)
self.qtgui_waterfall_sink_x_0_0_0.enable_axis_labels(True)
labels = ['', '', '', '', '',
'', '', '', '', '']
colors = [0, 0, 0, 0, 0,
0, 0, 0, 0, 0]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_waterfall_sink_x_0_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_waterfall_sink_x_0_0_0.set_line_label(i, labels[i])
self.qtgui_waterfall_sink_x_0_0_0.set_color_map(i, colors[i])
self.qtgui_waterfall_sink_x_0_0_0.set_line_alpha(i, alphas[i])
self.qtgui_waterfall_sink_x_0_0_0.set_intensity_range(-100, 10)
self._qtgui_waterfall_sink_x_0_0_0_win = sip.wrapinstance(self.qtgui_waterfall_sink_x_0_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_waterfall_sink_x_0_0_0_win, 2, 2, 2, 1)
for r in range(2, 4):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(2, 3):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_waterfall_sink_x_0_0 = qtgui.waterfall_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_waterfall_sink_x_0_0.set_update_time(0.010)
self.qtgui_waterfall_sink_x_0_0.enable_grid(False)
self.qtgui_waterfall_sink_x_0_0.enable_axis_labels(True)
labels = ['', '', '', '', '',
'', '', '', '', '']
colors = [0, 0, 0, 0, 0,
0, 0, 0, 0, 0]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_waterfall_sink_x_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_waterfall_sink_x_0_0.set_line_label(i, labels[i])
self.qtgui_waterfall_sink_x_0_0.set_color_map(i, colors[i])
self.qtgui_waterfall_sink_x_0_0.set_line_alpha(i, alphas[i])
self.qtgui_waterfall_sink_x_0_0.set_intensity_range(-100, 10)
self._qtgui_waterfall_sink_x_0_0_win = sip.wrapinstance(self.qtgui_waterfall_sink_x_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_waterfall_sink_x_0_0_win, 2, 1, 2, 1)
for r in range(2, 4):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(1, 2):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_waterfall_sink_x_0 = qtgui.waterfall_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_waterfall_sink_x_0.set_update_time(0.010)
self.qtgui_waterfall_sink_x_0.enable_grid(False)
self.qtgui_waterfall_sink_x_0.enable_axis_labels(True)
labels = ['', '', '', '', '',
'', '', '', '', '']
colors = [0, 0, 0, 0, 0,
0, 0, 0, 0, 0]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_waterfall_sink_x_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_waterfall_sink_x_0.set_line_label(i, labels[i])
self.qtgui_waterfall_sink_x_0.set_color_map(i, colors[i])
self.qtgui_waterfall_sink_x_0.set_line_alpha(i, alphas[i])
self.qtgui_waterfall_sink_x_0.set_intensity_range(-100, 10)
self._qtgui_waterfall_sink_x_0_win = sip.wrapinstance(self.qtgui_waterfall_sink_x_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_waterfall_sink_x_0_win, 2, 0, 2, 1)
for r in range(2, 4):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(0, 1):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_1_0_0_0 = qtgui.time_sink_f(
int(samp_rate), #size
int(samp_rate*8), #samp_rate
"CHAN3", #name
2 #number of inputs
)
self.qtgui_time_sink_x_0_1_0_0_0.set_update_time(0.01)
self.qtgui_time_sink_x_0_1_0_0_0.set_y_axis(0, 1)
self.qtgui_time_sink_x_0_1_0_0_0.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_1_0_0_0.enable_tags(-1, True)
self.qtgui_time_sink_x_0_1_0_0_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, qtgui.TRIG_SLOPE_POS, 0, 1.25e-6, 0, "burst")
self.qtgui_time_sink_x_0_1_0_0_0.enable_autoscale(True)
self.qtgui_time_sink_x_0_1_0_0_0.enable_grid(True)
self.qtgui_time_sink_x_0_1_0_0_0.enable_axis_labels(True)
self.qtgui_time_sink_x_0_1_0_0_0.enable_control_panel(False)
self.qtgui_time_sink_x_0_1_0_0_0.enable_stem_plot(False)
self.qtgui_time_sink_x_0_1_0_0_0.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [0, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(2):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_1_0_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_1_0_0_0.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_1_0_0_0.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_1_0_0_0.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_1_0_0_0.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_1_0_0_0.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_1_0_0_0.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_1_0_0_0_win = sip.wrapinstance(self.qtgui_time_sink_x_0_1_0_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_2.addWidget(self._qtgui_time_sink_x_0_1_0_0_0_win, 0, 3, 1, 1)
for r in range(0, 1):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(3, 4):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_1_0_0 = qtgui.time_sink_f(
int(samp_rate), #size
int(samp_rate*8), #samp_rate
"CHAN2", #name
2 #number of inputs
)
self.qtgui_time_sink_x_0_1_0_0.set_update_time(0.01)
self.qtgui_time_sink_x_0_1_0_0.set_y_axis(0, 1)
self.qtgui_time_sink_x_0_1_0_0.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_1_0_0.enable_tags(-1, True)
self.qtgui_time_sink_x_0_1_0_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, qtgui.TRIG_SLOPE_POS, 0, 1.25e-6, 0, "burst")
self.qtgui_time_sink_x_0_1_0_0.enable_autoscale(True)
self.qtgui_time_sink_x_0_1_0_0.enable_grid(True)
self.qtgui_time_sink_x_0_1_0_0.enable_axis_labels(True)
self.qtgui_time_sink_x_0_1_0_0.enable_control_panel(False)
self.qtgui_time_sink_x_0_1_0_0.enable_stem_plot(False)
self.qtgui_time_sink_x_0_1_0_0.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [0, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(2):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_1_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_1_0_0.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_1_0_0.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_1_0_0.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_1_0_0.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_1_0_0.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_1_0_0.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_1_0_0_win = sip.wrapinstance(self.qtgui_time_sink_x_0_1_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_2.addWidget(self._qtgui_time_sink_x_0_1_0_0_win, 0, 2, 1, 1)
for r in range(0, 1):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(2, 3):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_1_0 = qtgui.time_sink_f(
int(samp_rate), #size
int(samp_rate*8), #samp_rate
"CHAN1", #name
2 #number of inputs
)
self.qtgui_time_sink_x_0_1_0.set_update_time(0.01)
self.qtgui_time_sink_x_0_1_0.set_y_axis(0, 1)
self.qtgui_time_sink_x_0_1_0.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_1_0.enable_tags(-1, True)
self.qtgui_time_sink_x_0_1_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, qtgui.TRIG_SLOPE_POS, 0, 1.25e-6, 0, "burst")
self.qtgui_time_sink_x_0_1_0.enable_autoscale(True)
self.qtgui_time_sink_x_0_1_0.enable_grid(True)
self.qtgui_time_sink_x_0_1_0.enable_axis_labels(True)
self.qtgui_time_sink_x_0_1_0.enable_control_panel(False)
self.qtgui_time_sink_x_0_1_0.enable_stem_plot(False)
self.qtgui_time_sink_x_0_1_0.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [0, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(2):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_1_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_1_0.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_1_0.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_1_0.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_1_0.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_1_0.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_1_0.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_1_0_win = sip.wrapinstance(self.qtgui_time_sink_x_0_1_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_2.addWidget(self._qtgui_time_sink_x_0_1_0_win, 0, 1, 1, 1)
for r in range(0, 1):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(1, 2):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_1 = qtgui.time_sink_f(
int(samp_rate), #size
int(samp_rate*8), #samp_rate
"CHAN0", #name
2 #number of inputs
)
self.qtgui_time_sink_x_0_1.set_update_time(0.01)
self.qtgui_time_sink_x_0_1.set_y_axis(0, 1)
self.qtgui_time_sink_x_0_1.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_1.enable_tags(-1, True)
self.qtgui_time_sink_x_0_1.set_trigger_mode(qtgui.TRIG_MODE_FREE, qtgui.TRIG_SLOPE_POS, 0, 1.25e-6, 0, "burst")
self.qtgui_time_sink_x_0_1.enable_autoscale(True)
self.qtgui_time_sink_x_0_1.enable_grid(True)
self.qtgui_time_sink_x_0_1.enable_axis_labels(True)
self.qtgui_time_sink_x_0_1.enable_control_panel(False)
self.qtgui_time_sink_x_0_1.enable_stem_plot(False)
self.qtgui_time_sink_x_0_1.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [0, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(2):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_1.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_1.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_1.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_1.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_1.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_1.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_1.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_1_win = sip.wrapinstance(self.qtgui_time_sink_x_0_1.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_2.addWidget(self._qtgui_time_sink_x_0_1_win, 0, 0, 1, 1)
for r in range(0, 1):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(0, 1):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_0_1 = qtgui.time_sink_f(
1024, #size
samp_rate, #samp_rate
"", #name
1 #number of inputs
)
self.qtgui_time_sink_x_0_0_1.set_update_time(0.010)
self.qtgui_time_sink_x_0_0_1.set_y_axis(-1, 1)
self.qtgui_time_sink_x_0_0_1.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_0_1.enable_tags(-1, True)
self.qtgui_time_sink_x_0_0_1.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, thresh, 0, 0, "")
self.qtgui_time_sink_x_0_0_1.enable_autoscale(True)
self.qtgui_time_sink_x_0_0_1.enable_grid(False)
self.qtgui_time_sink_x_0_0_1.enable_axis_labels(True)
self.qtgui_time_sink_x_0_0_1.enable_control_panel(False)
self.qtgui_time_sink_x_0_0_1.enable_stem_plot(False)
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [-1, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_0_1.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_0_1.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_0_1.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_0_1.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_0_1.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_0_1.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_0_1.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_0_1_win = sip.wrapinstance(self.qtgui_time_sink_x_0_0_1.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_time_sink_x_0_0_1_win, 4, 3, 2, 1)
for r in range(4, 6):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(3, 4):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_0_0 = qtgui.time_sink_f(
1024, #size
samp_rate, #samp_rate
"", #name
1 #number of inputs
)
self.qtgui_time_sink_x_0_0_0.set_update_time(0.010)
self.qtgui_time_sink_x_0_0_0.set_y_axis(-1, 1)
self.qtgui_time_sink_x_0_0_0.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_0_0.enable_tags(-1, True)
self.qtgui_time_sink_x_0_0_0.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, thresh, 0, 0, "")
self.qtgui_time_sink_x_0_0_0.enable_autoscale(True)
self.qtgui_time_sink_x_0_0_0.enable_grid(False)
self.qtgui_time_sink_x_0_0_0.enable_axis_labels(True)
self.qtgui_time_sink_x_0_0_0.enable_control_panel(False)
self.qtgui_time_sink_x_0_0_0.enable_stem_plot(False)
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [-1, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_0_0.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_0_0.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_0_0.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_0_0.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_0_0.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_0_0.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_0_0_win = sip.wrapinstance(self.qtgui_time_sink_x_0_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_time_sink_x_0_0_0_win, 4, 2, 2, 1)
for r in range(4, 6):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(2, 3):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0_0 = qtgui.time_sink_f(
1024, #size
samp_rate, #samp_rate
"", #name
1 #number of inputs
)
self.qtgui_time_sink_x_0_0.set_update_time(0.010)
self.qtgui_time_sink_x_0_0.set_y_axis(-1, 1)
self.qtgui_time_sink_x_0_0.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0_0.enable_tags(-1, True)
self.qtgui_time_sink_x_0_0.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, thresh, 0, 0, "")
self.qtgui_time_sink_x_0_0.enable_autoscale(True)
self.qtgui_time_sink_x_0_0.enable_grid(False)
self.qtgui_time_sink_x_0_0.enable_axis_labels(True)
self.qtgui_time_sink_x_0_0.enable_control_panel(False)
self.qtgui_time_sink_x_0_0.enable_stem_plot(False)
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [-1, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0_0.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0_0.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0_0.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0_0.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0_0.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0_0.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_0_win = sip.wrapinstance(self.qtgui_time_sink_x_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_time_sink_x_0_0_win, 4, 1, 2, 1)
for r in range(4, 6):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(1, 2):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_time_sink_x_0 = qtgui.time_sink_f(
1024, #size
samp_rate, #samp_rate
"", #name
1 #number of inputs
)
self.qtgui_time_sink_x_0.set_update_time(0.010)
self.qtgui_time_sink_x_0.set_y_axis(-1, 1)
self.qtgui_time_sink_x_0.set_y_label('Amplitude', "")
self.qtgui_time_sink_x_0.enable_tags(-1, True)
self.qtgui_time_sink_x_0.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, thresh, 0, 0, "")
self.qtgui_time_sink_x_0.enable_autoscale(True)
self.qtgui_time_sink_x_0.enable_grid(False)
self.qtgui_time_sink_x_0.enable_axis_labels(True)
self.qtgui_time_sink_x_0.enable_control_panel(False)
self.qtgui_time_sink_x_0.enable_stem_plot(False)
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "blue"]
styles = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
markers = [-1, -1, -1, -1, -1,
-1, -1, -1, -1, -1]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_time_sink_x_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_time_sink_x_0.set_line_label(i, labels[i])
self.qtgui_time_sink_x_0.set_line_width(i, widths[i])
self.qtgui_time_sink_x_0.set_line_color(i, colors[i])
self.qtgui_time_sink_x_0.set_line_style(i, styles[i])
self.qtgui_time_sink_x_0.set_line_marker(i, markers[i])
self.qtgui_time_sink_x_0.set_line_alpha(i, alphas[i])
self._qtgui_time_sink_x_0_win = sip.wrapinstance(self.qtgui_time_sink_x_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_time_sink_x_0_win, 4, 0, 2, 1)
for r in range(4, 6):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(0, 1):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_freq_sink_x_0_1_0 = qtgui.freq_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_freq_sink_x_0_1_0.set_update_time(0.010)
self.qtgui_freq_sink_x_0_1_0.set_y_axis(-140, 10)
self.qtgui_freq_sink_x_0_1_0.set_y_label('Relative Gain', 'dB')
self.qtgui_freq_sink_x_0_1_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, 0.0, 0, "")
self.qtgui_freq_sink_x_0_1_0.enable_autoscale(False)
self.qtgui_freq_sink_x_0_1_0.enable_grid(False)
self.qtgui_freq_sink_x_0_1_0.set_fft_average(1.0)
self.qtgui_freq_sink_x_0_1_0.enable_axis_labels(True)
self.qtgui_freq_sink_x_0_1_0.enable_control_panel(False)
self.qtgui_freq_sink_x_0_1_0.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "dark blue"]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_freq_sink_x_0_1_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_freq_sink_x_0_1_0.set_line_label(i, labels[i])
self.qtgui_freq_sink_x_0_1_0.set_line_width(i, widths[i])
self.qtgui_freq_sink_x_0_1_0.set_line_color(i, colors[i])
self.qtgui_freq_sink_x_0_1_0.set_line_alpha(i, alphas[i])
self._qtgui_freq_sink_x_0_1_0_win = sip.wrapinstance(self.qtgui_freq_sink_x_0_1_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_freq_sink_x_0_1_0_win, 0, 3, 2, 1)
for r in range(0, 2):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(3, 4):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_freq_sink_x_0_1 = qtgui.freq_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_freq_sink_x_0_1.set_update_time(0.010)
self.qtgui_freq_sink_x_0_1.set_y_axis(-140, 10)
self.qtgui_freq_sink_x_0_1.set_y_label('Relative Gain', 'dB')
self.qtgui_freq_sink_x_0_1.set_trigger_mode(qtgui.TRIG_MODE_FREE, 0.0, 0, "")
self.qtgui_freq_sink_x_0_1.enable_autoscale(False)
self.qtgui_freq_sink_x_0_1.enable_grid(False)
self.qtgui_freq_sink_x_0_1.set_fft_average(1.0)
self.qtgui_freq_sink_x_0_1.enable_axis_labels(True)
self.qtgui_freq_sink_x_0_1.enable_control_panel(False)
self.qtgui_freq_sink_x_0_1.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "dark blue"]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_freq_sink_x_0_1.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_freq_sink_x_0_1.set_line_label(i, labels[i])
self.qtgui_freq_sink_x_0_1.set_line_width(i, widths[i])
self.qtgui_freq_sink_x_0_1.set_line_color(i, colors[i])
self.qtgui_freq_sink_x_0_1.set_line_alpha(i, alphas[i])
self._qtgui_freq_sink_x_0_1_win = sip.wrapinstance(self.qtgui_freq_sink_x_0_1.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_freq_sink_x_0_1_win, 0, 2, 2, 1)
for r in range(0, 2):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(2, 3):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_freq_sink_x_0_0 = qtgui.freq_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_freq_sink_x_0_0.set_update_time(0.010)
self.qtgui_freq_sink_x_0_0.set_y_axis(-140, 10)
self.qtgui_freq_sink_x_0_0.set_y_label('Relative Gain', 'dB')
self.qtgui_freq_sink_x_0_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, 0.0, 0, "")
self.qtgui_freq_sink_x_0_0.enable_autoscale(False)
self.qtgui_freq_sink_x_0_0.enable_grid(False)
self.qtgui_freq_sink_x_0_0.set_fft_average(1.0)
self.qtgui_freq_sink_x_0_0.enable_axis_labels(True)
self.qtgui_freq_sink_x_0_0.enable_control_panel(False)
self.qtgui_freq_sink_x_0_0.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "dark blue"]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_freq_sink_x_0_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_freq_sink_x_0_0.set_line_label(i, labels[i])
self.qtgui_freq_sink_x_0_0.set_line_width(i, widths[i])
self.qtgui_freq_sink_x_0_0.set_line_color(i, colors[i])
self.qtgui_freq_sink_x_0_0.set_line_alpha(i, alphas[i])
self._qtgui_freq_sink_x_0_0_win = sip.wrapinstance(self.qtgui_freq_sink_x_0_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_freq_sink_x_0_0_win, 0, 1, 2, 1)
for r in range(0, 2):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(1, 2):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.qtgui_freq_sink_x_0 = qtgui.freq_sink_c(
1024, #size
firdes.WIN_BLACKMAN_hARRIS, #wintype
0, #fc
samp_rate, #bw
"", #name
1 #number of inputs
)
self.qtgui_freq_sink_x_0.set_update_time(0.010)
self.qtgui_freq_sink_x_0.set_y_axis(-140, 10)
self.qtgui_freq_sink_x_0.set_y_label('Relative Gain', 'dB')
self.qtgui_freq_sink_x_0.set_trigger_mode(qtgui.TRIG_MODE_FREE, 0.0, 0, "")
self.qtgui_freq_sink_x_0.enable_autoscale(False)
self.qtgui_freq_sink_x_0.enable_grid(False)
self.qtgui_freq_sink_x_0.set_fft_average(1.0)
self.qtgui_freq_sink_x_0.enable_axis_labels(True)
self.qtgui_freq_sink_x_0.enable_control_panel(False)
self.qtgui_freq_sink_x_0.disable_legend()
labels = ['', '', '', '', '',
'', '', '', '', '']
widths = [1, 1, 1, 1, 1,
1, 1, 1, 1, 1]
colors = ["blue", "red", "green", "black", "cyan",
"magenta", "yellow", "dark red", "dark green", "dark blue"]
alphas = [1.0, 1.0, 1.0, 1.0, 1.0,
1.0, 1.0, 1.0, 1.0, 1.0]
for i in xrange(1):
if len(labels[i]) == 0:
self.qtgui_freq_sink_x_0.set_line_label(i, "Data {0}".format(i))
else:
self.qtgui_freq_sink_x_0.set_line_label(i, labels[i])
self.qtgui_freq_sink_x_0.set_line_width(i, widths[i])
self.qtgui_freq_sink_x_0.set_line_color(i, colors[i])
self.qtgui_freq_sink_x_0.set_line_alpha(i, alphas[i])
self._qtgui_freq_sink_x_0_win = sip.wrapinstance(self.qtgui_freq_sink_x_0.pyqwidget(), Qt.QWidget)
self.main_tab_grid_layout_0.addWidget(self._qtgui_freq_sink_x_0_win, 0, 0, 2, 1)
for r in range(0, 2):
self.main_tab_grid_layout_0.setRowStretch(r, 1)
for c in range(0, 1):
self.main_tab_grid_layout_0.setColumnStretch(c, 1)
self.pyqt_meta_text_output_0_0_0_0 = pyqt.meta_text_output()
self._pyqt_meta_text_output_0_0_0_0_win = self.pyqt_meta_text_output_0_0_0_0
self.main_tab_grid_layout_2.addWidget(self._pyqt_meta_text_output_0_0_0_0_win, 1, 3, 1, 1)
for r in range(1, 2):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(3, 4):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.pyqt_meta_text_output_0_0_0 = pyqt.meta_text_output()
self._pyqt_meta_text_output_0_0_0_win = self.pyqt_meta_text_output_0_0_0
self.main_tab_grid_layout_2.addWidget(self._pyqt_meta_text_output_0_0_0_win, 1, 2, 1, 1)
for r in range(1, 2):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(2, 3):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.pyqt_meta_text_output_0_0 = pyqt.meta_text_output()
self._pyqt_meta_text_output_0_0_win = self.pyqt_meta_text_output_0_0
self.main_tab_grid_layout_2.addWidget(self._pyqt_meta_text_output_0_0_win, 1, 1, 1, 1)
for r in range(1, 2):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(1, 2):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.pyqt_meta_text_output_0 = pyqt.meta_text_output()
self._pyqt_meta_text_output_0_win = self.pyqt_meta_text_output_0
self.main_tab_grid_layout_2.addWidget(self._pyqt_meta_text_output_0_win, 1, 0, 1, 1)
for r in range(1, 2):
self.main_tab_grid_layout_2.setRowStretch(r, 1)
for c in range(0, 1):
self.main_tab_grid_layout_2.setColumnStretch(c, 1)
self.blocks_throttle_3 = blocks.throttle(gr.sizeof_gr_complex*1, samp_rate / throttle,True)
self.blocks_throttle_2 = blocks.throttle(gr.sizeof_gr_complex*1, samp_rate / throttle,True)
self.blocks_throttle_1 = blocks.throttle(gr.sizeof_gr_complex*1, samp_rate / throttle,True)
self.blocks_throttle_0 = blocks.throttle(gr.sizeof_gr_complex*1, samp_rate / throttle,True)
# Skip a fixed number of initial samples per channel (presumably to time-align the four SigMF captures)
self.blocks_skiphead_3 = blocks.skiphead(gr.sizeof_gr_complex*1, 8522)
self.blocks_skiphead_2 = blocks.skiphead(gr.sizeof_gr_complex*1, 0)
self.blocks_skiphead_1 = blocks.skiphead(gr.sizeof_gr_complex*1, 1318)
self.blocks_skiphead_0 = blocks.skiphead(gr.sizeof_gr_complex*1, 11532)
self.blocks_complex_to_mag_squared_1_0_0_0 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_1_0_0 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_1_0 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_1 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_0_1_1 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_0_1_0 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_0_1 = blocks.complex_to_mag_squared(1)
self.blocks_complex_to_mag_squared_0 = blocks.complex_to_mag_squared(1)
self.analog_const_source_x_0_0_0_0 = analog.sig_source_f(0, analog.GR_CONST_WAVE, 0, 0, thresh)
self.analog_const_source_x_0_0_0 = analog.sig_source_f(0, analog.GR_CONST_WAVE, 0, 0, thresh)
self.analog_const_source_x_0_0 = analog.sig_source_f(0, analog.GR_CONST_WAVE, 0, 0, thresh)
self.analog_const_source_x_0 = analog.sig_source_f(0, analog.GR_CONST_WAVE, 0, 0, thresh)
self.analog_agc2_xx_0_3 = analog.agc2_cc(1e-1, 1e-2, 1.0, 1.0)
self.analog_agc2_xx_0_3.set_max_gain(65536)
self.analog_agc2_xx_0_2 = analog.agc2_cc(1e-1, 1e-2, 1.0, 1.0)
self.analog_agc2_xx_0_2.set_max_gain(65536)
self.analog_agc2_xx_0_1 = analog.agc2_cc(1e-1, 1e-2, 1.0, 1.0)
self.analog_agc2_xx_0_1.set_max_gain(65536)
self.analog_agc2_xx_0 = analog.agc2_cc(1e-1, 1e-2, 1.0, 1.0)
self.analog_agc2_xx_0.set_max_gain(65536)
self.adsb_framer_1_0_0_0 = adsb.framer(samp_rate, thresh)
self.adsb_framer_1_0_0 = adsb.framer(samp_rate, thresh)
self.adsb_framer_1_0 = adsb.framer(samp_rate, thresh)
self.adsb_framer_1 = adsb.framer(samp_rate, thresh)
self.adsb_demod_0_0_0_0 = adsb.demod(samp_rate)
self.adsb_demod_0_0_0 = adsb.demod(samp_rate)
self.adsb_demod_0_0 = adsb.demod(samp_rate)
self.adsb_demod_0 = adsb.demod(samp_rate)
self.adsb_decoder_0_0_0_0 = adsb.decoder("Extended Squitter Only", "None", "Verbose")
self.adsb_decoder_0_0_0 = adsb.decoder("Extended Squitter Only", "None", "Verbose")
self.adsb_decoder_0_0 = adsb.decoder("Extended Squitter Only", "None", "Verbose")
self.adsb_decoder_0 = adsb.decoder("Extended Squitter Only", "None", "Verbose")
        ##################################################
        # Connections
        ##################################################
        self.msg_connect((self.adsb_decoder_0, 'decoded'), (self.pyqt_meta_text_output_0, 'pdus'))
        self.msg_connect((self.adsb_decoder_0_0, 'decoded'), (self.pyqt_meta_text_output_0_0, 'pdus'))
        self.msg_connect((self.adsb_decoder_0_0_0, 'decoded'), (self.pyqt_meta_text_output_0_0_0, 'pdus'))
        self.msg_connect((self.adsb_decoder_0_0_0_0, 'decoded'), (self.pyqt_meta_text_output_0_0_0_0, 'pdus'))
        self.msg_connect((self.adsb_demod_0, 'demodulated'), (self.adsb_decoder_0, 'demodulated'))
        self.msg_connect((self.adsb_demod_0_0, 'demodulated'), (self.adsb_decoder_0_0, 'demodulated'))
        self.msg_connect((self.adsb_demod_0_0_0, 'demodulated'), (self.adsb_decoder_0_0_0, 'demodulated'))
        self.msg_connect((self.adsb_demod_0_0_0_0, 'demodulated'), (self.adsb_decoder_0_0_0_0, 'demodulated'))
        self.connect((self.adsb_demod_0, 0), (self.qtgui_time_sink_x_0_1, 0))
        self.connect((self.adsb_demod_0_0, 0), (self.qtgui_time_sink_x_0_1_0, 0))
        self.connect((self.adsb_demod_0_0_0, 0), (self.qtgui_time_sink_x_0_1_0_0, 0))
        self.connect((self.adsb_demod_0_0_0_0, 0), (self.qtgui_time_sink_x_0_1_0_0_0, 0))
        self.connect((self.adsb_framer_1, 0), (self.adsb_demod_0, 0))
        self.connect((self.adsb_framer_1_0, 0), (self.adsb_demod_0_0, 0))
        self.connect((self.adsb_framer_1_0_0, 0), (self.adsb_demod_0_0_0, 0))
        self.connect((self.adsb_framer_1_0_0_0, 0), (self.adsb_demod_0_0_0_0, 0))
        self.connect((self.analog_agc2_xx_0, 0), (self.blocks_complex_to_mag_squared_0, 0))
        self.connect((self.analog_agc2_xx_0, 0), (self.blocks_complex_to_mag_squared_1, 0))
        self.connect((self.analog_agc2_xx_0, 0), (self.qtgui_freq_sink_x_0, 0))
        self.connect((self.analog_agc2_xx_0, 0), (self.qtgui_waterfall_sink_x_0, 0))
        self.connect((self.analog_agc2_xx_0_1, 0), (self.blocks_complex_to_mag_squared_0_1, 0))
        self.connect((self.analog_agc2_xx_0_1, 0), (self.blocks_complex_to_mag_squared_1_0, 0))
        self.connect((self.analog_agc2_xx_0_1, 0), (self.qtgui_freq_sink_x_0_0, 0))
        self.connect((self.analog_agc2_xx_0_1, 0), (self.qtgui_waterfall_sink_x_0_0, 0))
        self.connect((self.analog_agc2_xx_0_2, 0), (self.blocks_complex_to_mag_squared_0_1_0, 0))
        self.connect((self.analog_agc2_xx_0_2, 0), (self.blocks_complex_to_mag_squared_1_0_0, 0))
        self.connect((self.analog_agc2_xx_0_2, 0), (self.qtgui_freq_sink_x_0_1, 0))
        self.connect((self.analog_agc2_xx_0_2, 0), (self.qtgui_waterfall_sink_x_0_0_0, 0))
        self.connect((self.analog_agc2_xx_0_3, 0), (self.blocks_complex_to_mag_squared_0_1_1, 0))
        self.connect((self.analog_agc2_xx_0_3, 0), (self.blocks_complex_to_mag_squared_1_0_0_0, 0))
        self.connect((self.analog_agc2_xx_0_3, 0), (self.qtgui_freq_sink_x_0_1_0, 0))
        self.connect((self.analog_agc2_xx_0_3, 0), (self.qtgui_waterfall_sink_x_0_0_1, 0))
        self.connect((self.analog_const_source_x_0, 0), (self.qtgui_time_sink_x_0_1, 1))
        self.connect((self.analog_const_source_x_0_0, 0), (self.qtgui_time_sink_x_0_1_0, 1))
        self.connect((self.analog_const_source_x_0_0_0, 0), (self.qtgui_time_sink_x_0_1_0_0, 1))
        self.connect((self.analog_const_source_x_0_0_0_0, 0), (self.qtgui_time_sink_x_0_1_0_0_0, 1))
        self.connect((self.blocks_complex_to_mag_squared_0, 0), (self.qtgui_time_sink_x_0, 0))
        self.connect((self.blocks_complex_to_mag_squared_0_1, 0), (self.qtgui_time_sink_x_0_0, 0))
        self.connect((self.blocks_complex_to_mag_squared_0_1_0, 0), (self.qtgui_time_sink_x_0_0_0, 0))
        self.connect((self.blocks_complex_to_mag_squared_0_1_1, 0), (self.qtgui_time_sink_x_0_0_1, 0))
        self.connect((self.blocks_complex_to_mag_squared_1, 0), (self.adsb_framer_1, 0))
        self.connect((self.blocks_complex_to_mag_squared_1_0, 0), (self.adsb_framer_1_0, 0))
        self.connect((self.blocks_complex_to_mag_squared_1_0_0, 0), (self.adsb_framer_1_0_0, 0))
        self.connect((self.blocks_complex_to_mag_squared_1_0_0_0, 0), (self.adsb_framer_1_0_0_0, 0))
        self.connect((self.blocks_skiphead_0, 0), (self.blocks_throttle_0, 0))
        self.connect((self.blocks_skiphead_1, 0), (self.blocks_throttle_1, 0))
        self.connect((self.blocks_skiphead_2, 0), (self.blocks_throttle_2, 0))
        self.connect((self.blocks_skiphead_3, 0), (self.blocks_throttle_3, 0))
        self.connect((self.blocks_throttle_0, 0), (self.analog_agc2_xx_0, 0))
        self.connect((self.blocks_throttle_1, 0), (self.analog_agc2_xx_0_1, 0))
        self.connect((self.blocks_throttle_2, 0), (self.analog_agc2_xx_0_2, 0))
        self.connect((self.blocks_throttle_3, 0), (self.analog_agc2_xx_0_3, 0))
        self.connect((self.sigmf_source_0, 0), (self.blocks_skiphead_0, 0))
        self.connect((self.sigmf_source_1, 0), (self.blocks_skiphead_1, 0))
        self.connect((self.sigmf_source_2, 0), (self.blocks_skiphead_2, 0))
        self.connect((self.sigmf_source_3, 0), (self.blocks_skiphead_3, 0))

    def closeEvent(self, event):
        self.settings = Qt.QSettings("GNU Radio", "kerberos_sigmf_decode1")
        self.settings.setValue("geometry", self.saveGeometry())
        event.accept()

    def get_throttle(self):
        return self.throttle

    def set_throttle(self, throttle):
        self.throttle = throttle
        Qt.QMetaObject.invokeMethod(self._throttle_line_edit, "setText", Qt.Q_ARG("QString", eng_notation.num_to_str(self.throttle)))
        self.blocks_throttle_3.set_sample_rate(self.samp_rate / self.throttle)
        self.blocks_throttle_2.set_sample_rate(self.samp_rate / self.throttle)
        self.blocks_throttle_1.set_sample_rate(self.samp_rate / self.throttle)
        self.blocks_throttle_0.set_sample_rate(self.samp_rate / self.throttle)

    def get_thresh(self):
        return self.thresh

    def set_thresh(self, thresh):
        self.thresh = thresh
        Qt.QMetaObject.invokeMethod(self._thresh_line_edit, "setText", Qt.Q_ARG("QString", eng_notation.num_to_str(self.thresh)))
        self.qtgui_time_sink_x_0_0_1.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, self.thresh, 0, 0, "")
        self.qtgui_time_sink_x_0_0_0.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, self.thresh, 0, 0, "")
        self.qtgui_time_sink_x_0_0.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, self.thresh, 0, 0, "")
        self.qtgui_time_sink_x_0.set_trigger_mode(qtgui.TRIG_MODE_AUTO, qtgui.TRIG_SLOPE_POS, self.thresh, 0, 0, "")
        self.analog_const_source_x_0_0_0_0.set_offset(self.thresh)
        self.analog_const_source_x_0_0_0.set_offset(self.thresh)
        self.analog_const_source_x_0_0.set_offset(self.thresh)
        self.analog_const_source_x_0.set_offset(self.thresh)
        self.adsb_framer_1_0_0_0.set_threshold(self.thresh)
        self.adsb_framer_1_0_0.set_threshold(self.thresh)
        self.adsb_framer_1_0.set_threshold(self.thresh)
        self.adsb_framer_1.set_threshold(self.thresh)

    def get_samp_rate(self):
        return self.samp_rate

    def set_samp_rate(self, samp_rate):
        self.samp_rate = samp_rate
        self.qtgui_waterfall_sink_x_0_0_1.set_frequency_range(0, self.samp_rate)
        self.qtgui_waterfall_sink_x_0_0_0.set_frequency_range(0, self.samp_rate)
        self.qtgui_waterfall_sink_x_0_0.set_frequency_range(0, self.samp_rate)
        self.qtgui_waterfall_sink_x_0.set_frequency_range(0, self.samp_rate)
        self.qtgui_time_sink_x_0_1_0_0_0.set_samp_rate(int(self.samp_rate*8))
        self.qtgui_time_sink_x_0_1_0_0.set_samp_rate(int(self.samp_rate*8))
        self.qtgui_time_sink_x_0_1_0.set_samp_rate(int(self.samp_rate*8))
        self.qtgui_time_sink_x_0_1.set_samp_rate(int(self.samp_rate*8))
        self.qtgui_time_sink_x_0_0_1.set_samp_rate(self.samp_rate)
        self.qtgui_time_sink_x_0_0_0.set_samp_rate(self.samp_rate)
        self.qtgui_time_sink_x_0_0.set_samp_rate(self.samp_rate)
        self.qtgui_time_sink_x_0.set_samp_rate(self.samp_rate)
        self.qtgui_freq_sink_x_0_1_0.set_frequency_range(0, self.samp_rate)
        self.qtgui_freq_sink_x_0_1.set_frequency_range(0, self.samp_rate)
        self.qtgui_freq_sink_x_0_0.set_frequency_range(0, self.samp_rate)
        self.qtgui_freq_sink_x_0.set_frequency_range(0, self.samp_rate)
        self.blocks_throttle_3.set_sample_rate(self.samp_rate / self.throttle)
        self.blocks_throttle_2.set_sample_rate(self.samp_rate / self.throttle)
        self.blocks_throttle_1.set_sample_rate(self.samp_rate / self.throttle)
        self.blocks_throttle_0.set_sample_rate(self.samp_rate / self.throttle)

def main(top_block_cls=kerberos_sigmf_decode1, options=None):
    from distutils.version import StrictVersion
    # setGraphicsSystem() exists only in Qt 4.x (it was removed in Qt 5),
    # so bound the version check on both ends before calling it.
    if StrictVersion("4.5.0") <= StrictVersion(Qt.qVersion()) < StrictVersion("5.0.0"):
        style = gr.prefs().get_string('qtgui', 'style', 'raster')
        Qt.QApplication.setGraphicsSystem(style)

    qapp = Qt.QApplication(sys.argv)

    tb = top_block_cls()
    tb.start()
    tb.show()

    def quitting():
        tb.stop()
        tb.wait()
    qapp.connect(qapp, Qt.SIGNAL("aboutToQuit()"), quitting)
    qapp.exec_()

if __name__ == '__main__':
    main()