import numbers
from scipy.optimize import fmin_l_bfgs_b
import numpy as np
import matplotlib.pyplot as plt
llf_factor = 10**6  # factor to avoid an overly flat likelihood function
def exp_intensity(beta, sigma, gamma, n, history, sum_less_equal=True):
"""
Calculate the (exponential) intensity of a HawkesN process.
Parameters
----------
beta : float
Parameter beta of the corresponding SEIR model.
sigma : float or None
Parameter sigma of the corresponding SEIR model. If None,
`sigma`==`gamma` is assumed.
gamma : float
Parameter gamma of the corresponding SEIR model.
n : float or int
The size of the population.
history : np.array
Array containing the process' jump times.
sum_less_equal : bool, default: True
If True, we sum over all event times <= time t. Otherwise, we sum
over all event times < time t.
Returns
-------
exp_intensity_ : function
A function of the time (expecting a float or np.array as argument).
"""
def exp_intensity_(t):
"""
Parameters
----------
t : float or np.array
If `t` is a float, then it represents a point in time. If it is
a 1-dimensional array, then it represents an array of points in
time.
Returns
-------
result : float or np.array
If `t` is a float, then the intensity of the HawkesN process at
time `t` is returned. If `t` is a 1-dimensional array, then an
array is returned. Each entry of it represents the intensity of
the HawkesN process at the time that was specified by the
corresponding entry in `t`.
"""
nonlocal sigma
if sigma is None:
sigma = gamma
#if np.isnan([beta, sigma, gamma, n]).any():
# raise RuntimeError("One of the arguments to exp_intensity is nan: "
# "beta" + str(beta) + ", sigma" + str(sigma) +
# ", gamma" + str(gamma) + ", n" + str(n) +
# ", history" + str(history))
if isinstance(t, numbers.Number):
t = np.array([t])
result = np.empty(t.shape)
for index, time in enumerate(t):
if sum_less_equal:
history_until_t = history[history <= time]
else:
history_until_t = history[history < time]
if sigma != gamma:
result[index] = np.sum(
np.exp(-sigma * (time - history_until_t))
-
np.exp(-gamma * (time - history_until_t))
)
else:
result[index] = np.sum(
(time - history_until_t) *
np.exp(-gamma * (time - history_until_t))
)
result[index] *= (1 - np.count_nonzero(history <= time)/n)
# print("intensity at index", index, "is", result[index]*scale*decay)
if sigma != gamma:
result *= beta * sigma / (gamma - sigma)
else:
result *= beta * gamma
return result
return exp_intensity_
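A minimal sketch (all numbers made up) of what `exp_intensity` computes in the `sigma == gamma` branch, evaluated at a single point in time:

```python
import numpy as np

# Hypothetical parameter values and event history.
beta, gamma, n = 1.0, 0.2, 100.0
history = np.array([0.0, 0.5, 1.5])  # event times
t = 2.0

h = history[history <= t]     # events up to and including time t
unexhausted = 1 - len(h) / n  # fraction of the population still susceptible
# sigma == gamma branch of the intensity formula above
lam = beta * gamma * unexhausted * np.sum((t - h) * np.exp(-gamma * (t - h)))
```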
def plot_exp_intensity(t_max, intensity=None, beta=None, sigma=None,
gamma=None, n=None, history=None, width=5.51, height=4,
n_xticks=6, step=.01, fname=None, **kwargs):
"""
Plot (or save the plot of) the exponential intensity function from t=0
until t=t_max.
Parameters
----------
t_max : float
Define the time horizon of the plot. The time axis will contain
values from 0 to t_max.
intensity : function or None
If None, then the arguments beta, sigma, gamma, n, and history must
be provided in order to compute an intensity function. If `intensity`
is already a function (taking the time as its only argument), then
the remaining parameters are ignored.
beta : float
See corresponding argument in :func:`exp_intensity`. Ignored if
`intensity` is provided.
sigma : float
See corresponding argument in :func:`exp_intensity`. Ignored if
`intensity` is provided.
gamma : float
See corresponding argument in :func:`exp_intensity`. Ignored if
`intensity` is provided.
n : int or None
Population size. Ignored if `intensity` is provided.
history : np.array or None
One-dimensional array containing the event times, sorted in
ascending order. Ignored if `intensity` is provided.
width : float, default: 5.51
Width of the plot.
height : float, default: 4
Height of the plot.
n_xticks : int (must be non-negative)
Number of ticks on the time axis.
step : float
Step size for drawing the function graph.
fname : str or None
Name (without extension) of the file the plot is saved to. If
`None`, the plot is not saved.
"""
if intensity is None:
intensity = exp_intensity(beta, sigma, gamma, n, history, **kwargs)
t = np.arange(0, t_max, step)
plt.figure(dpi=300, figsize=(width, height))
plt.plot(t, intensity(t))
plt.xlabel("$t$")
plt.xlim(0, t_max)
plt.xticks(np.linspace(0, t_max, n_xticks))
plt.ylabel("Intensity")
plt.grid()
title = "Intensity of a SEIR-related HawkesN process"
if history is not None and beta is not None and sigma is not None \
and gamma is not None and n is not None:
title += " with\nevent history {" \
+ ",".join(str(i) for i in history[:4]) \
+ (", ..." if len(history) > 4 else "") \
+ "} and parameters: $\\beta=" + str(beta) \
+ "$, $\\sigma=" + str(sigma) + "$, $\\gamma=" + str(gamma) \
+ "$, $N=" + str(n) + "$"
title += "."
plt.title(title)
if fname is not None:
plt.savefig(fname + ".pdf")
def llf_sigma_neq_gamma(beta, sigma, gamma, n, history, sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
sigma : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
llf : numpy.float64
The log-likelihood for the parameters passed as arguments.
"""
intensity = exp_intensity(beta, sigma, gamma, n, history,
sum_less_equal=sum_less_equal)
sum_log_arg = intensity(history[history > 0])
if sum_log_arg[0] <= 0:  # avoid log(0) if the first intensity value is zero
sum_log_arg = sum_log_arg[1:]
sum_part = np.sum(np.log(sum_log_arg))
int_part = 0
for i in range(len(history) - 1):
int_part += (n - (i + 1)) / n * np.sum(
(
np.exp(-sigma * (history[i] - history[:i + 1]))
-
np.exp(-sigma * (history[i + 1] - history[:i + 1]))
) / sigma
-
(
np.exp(-gamma * (history[i] - history[:i + 1]))
-
np.exp(-gamma * (history[i + 1] - history[:i + 1]))
) / gamma
)
int_part *= beta * sigma / (gamma - sigma)
# print("sum:", sum_part)
# print("integral:", int_part)
# print("*** llf:", sum_part - int_part)
return sum_part - int_part
def llf_sigma_eq_gamma(beta, gamma, n, history, sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
llf : numpy.float64
The log-likelihood for the parameters passed as arguments.
"""
intensity = exp_intensity(beta, gamma, gamma, n, history,
sum_less_equal=sum_less_equal)
sum_log_arg = intensity(history[history > 0])
if sum_log_arg[0] <= 0:  # avoid log(0) if the first intensity value is zero
sum_log_arg = sum_log_arg[1:]
sum_part = np.sum(np.log(sum_log_arg))
int_part = 0
for i in range(len(history) - 1):
int_part += (n - (i + 1)) / n * np.sum(
np.exp(-gamma * (history[i] - history[:i + 1]))
* (gamma * (history[i] - history[:i + 1]) + 1)
-
np.exp(-gamma * (history[i + 1] - history[:i + 1]))
* (gamma * (history[i + 1] - history[:i + 1]) + 1)
) / gamma
int_part *= beta
# print("sum:", sum_part)
# print("integral:", int_part)
# print("*** llf:", sum_part - int_part)
return sum_part - int_part
def llf(beta, sigma, gamma, n, history, sum_less_equal=True):
"""Dispatch to the log-likelihood for sigma != gamma or sigma == gamma."""
if sigma is not None and sigma != gamma:
return llf_sigma_neq_gamma(beta, sigma, gamma, n, history,
sum_less_equal)
else:
return llf_sigma_eq_gamma(beta, gamma, n, history, sum_less_equal)
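The log-likelihoods above follow the generic point-process shape: a sum of log-intensities at the event times minus the integrated intensity. A toy sanity check of that shape (with made-up data), using a homogeneous Poisson process whose rate `lam` on `[0, T]` gives `k*log(lam) - lam*T`, maximized at the familiar MLE `lam = k / T`:

```python
import numpy as np

# Made-up event times observed on [0, T].
events = np.array([0.3, 1.1, 2.4, 3.9])
T = 5.0

def poisson_llf(lam):
    # sum of log-intensities (constant intensity lam) minus integrated intensity
    return len(events) * np.log(lam) - lam * T

# Grid search for the maximizer; it should land near k / T = 4 / 5 = 0.8.
grid = np.linspace(0.1, 2.0, 1000)
best = grid[np.argmax([poisson_llf(lam) for lam in grid])]
```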
def fit_sigma_neq_gamma(history, beta_start=None, sigma_start=None,
gamma_start=None, n_start=None, estimate_n_only=False):
"""
Parameters
----------
history : np.array
1-dimensional array containing the event times in ascending order.
beta_start : float
Starting value for the likelihood optimization.
sigma_start : float
Starting value for the likelihood optimization.
gamma_start : float
Starting value for the likelihood optimization.
n_start : float or None, default: None
Starting value for the likelihood optimization. If None, a value is
chosen based on the number of events contained in the `history`.
estimate_n_only : bool, default: False
If True, `beta`, `sigma` and `gamma` are considered to be fixed and
only :math:`N` is fitted. Otherwise, `beta`, `sigma` and `gamma` are
fitted together with :math:`N`.
References
----------
This method uses the L-BFGS algorithm (see [1]_).
.. [1] C. Zhu, R. H. Byrd and J. Nocedal. L-BFGS-B: Algorithm 778: L-BFGS-B, FORTRAN routines for large scale bound constrained optimization (1997), ACM Transactions on Mathematical Software, 23, 4, pp. 550 - 560.
"""
if estimate_n_only and \
(beta_start is None or sigma_start is None or gamma_start is None):
raise ValueError("If beta, sigma, and gamma are fixed, their values "
"must be provided!")
if n_start is None:
n_start = len(history) + .5
def negative_llf(beta_sigma_gamma_n):
"""
Parameters
----------
beta_sigma_gamma_n : np.array (shape (4,))
Values for the parameters beta, sigma, gamma, and N in a single
array.
Returns
-------
neg_llf : float
The negative log-likelihood.
"""
beta, sigma, gamma, n = tuple(beta_sigma_gamma_n)
if sigma == gamma:
sigma += 1e-7
return -llf_factor * llf(beta=beta, sigma=sigma, gamma=gamma, n=n,
history=history, sum_less_equal=False)
def negative_llf_separate_params(n, beta, sigma, gamma):
"""
Same as :func:`negative_llf` but taking the parameters `n`, `beta`,
`sigma`, and `gamma` as separate arguments. This makes the function
suitable for likelihood maximization in only one parameter (`n`) with
fixed values for `beta`, `sigma`, and `gamma`.
"""
if sigma == gamma:
sigma += 1e-7
return -llf_factor * llf(beta=beta, sigma=sigma, gamma=gamma, n=n,
history=history, sum_less_equal=False)
def negative_llf_gradient(beta_sigma_gamma_n):
beta, sigma, gamma, n = tuple(beta_sigma_gamma_n)
if sigma == gamma:
sigma += 1e-7
return -llf_factor * llf_gradient(beta=beta, sigma=sigma, gamma=gamma,
n=n, history=history,
sum_less_equal=False)
def negative_llf_gradient_separate_params(n, beta, sigma, gamma):
if sigma == gamma:
sigma += 1e-7
return -llf_factor * dllf_dn_sigma_neq_gamma(beta=beta, sigma=sigma,
gamma=gamma, n=n,
history=history,
sum_less_equal=False)
eps = np.finfo(float).eps
if estimate_n_only:
return fmin_l_bfgs_b(
func=negative_llf_separate_params, # minimize this
x0=np.array([n_start]), # initial guess
args=(beta_start, sigma_start, gamma_start), # additional args to func&fprime
fprime=negative_llf_gradient_separate_params,
bounds=[(len(history), None)],
iprint=1
)
else:
return fmin_l_bfgs_b(
func=negative_llf, # minimize this
x0=np.array([beta_start,
sigma_start,
gamma_start,
n_start]), # initial guess
fprime=negative_llf_gradient,
bounds=[(eps, None),
(eps, None),
(eps, None),
(len(history), None)],
factr=10,
iprint=1
)
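The `fmin_l_bfgs_b` call pattern used in the fitters, shown on a toy quadratic (values made up). It returns the minimizer, the objective value at the minimizer, and an info dict whose `warnflag` is 0 on convergence:

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

# Toy objective with its minimum at x = 3.
def f(x):
    return (x[0] - 3.0) ** 2

def grad(x):
    return np.array([2.0 * (x[0] - 3.0)])

x_opt, f_min, info = fmin_l_bfgs_b(
    func=f,                  # minimize this
    x0=np.array([0.0]),      # initial guess
    fprime=grad,             # analytic gradient
    bounds=[(0.0, None)],    # lower bound only, like (eps, None) above
)
```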
def fit_sigma_eq_gamma(history, beta_start=None, gamma_start=None,
n_start=None, estimate_n_only=False):
"""
Parameters
----------
history : np.array
1-dimensional array containing the event times in ascending order.
beta_start : float
Starting value for the likelihood optimization.
gamma_start : float
Starting value for the likelihood optimization.
n_start : float or None, default: None
Starting value for the likelihood optimization. If None, a value is
chosen based on the number of events contained in the `history`.
estimate_n_only : bool, default: False
If True, `beta` and `gamma` are considered to be fixed and only
:math:`N` is fitted. Otherwise, `beta` and `gamma` are fitted together
with :math:`N`.
References
----------
This method uses the L-BFGS algorithm (see [1]_).
.. [1] C. Zhu, R. H. Byrd and J. Nocedal. L-BFGS-B: Algorithm 778: L-BFGS-B, FORTRAN routines for large scale bound constrained optimization (1997), ACM Transactions on Mathematical Software, 23, 4, pp. 550 - 560.
"""
if estimate_n_only and (beta_start is None or gamma_start is None):
raise ValueError("If beta and gamma are fixed, their values "
"must be provided!")
if n_start is None:
n_start = len(history) + .5
def negative_llf(beta_gamma_n):
"""
Parameters
----------
beta_gamma_n : np.array (shape (3,))
Values for the parameters beta, gamma, and N in a single array.
Returns
-------
neg_llf : float
The negative log-likelihood.
"""
beta, gamma, n = tuple(beta_gamma_n)
return -llf(beta=beta, sigma=None, gamma=gamma, n=n, history=history,
sum_less_equal=False)
def negative_llf_separate_params(n, beta, gamma):
"""
Same as :func:`negative_llf` but taking the parameters `n`, `beta`,
and `gamma` as separate arguments. This makes the function suitable for
likelihood maximization in only one parameter (`n`) with fixed values
for `beta` and `gamma`.
"""
return -llf(beta=beta, sigma=None, gamma=gamma, n=n, history=history,
sum_less_equal=False)
def negative_llf_gradient(beta_gamma_n):
beta, gamma, n = tuple(beta_gamma_n)
return -llf_gradient(beta=beta, sigma=None, gamma=gamma, n=n,
history=history, sum_less_equal=False)
def negative_llf_gradient_separate_params(n, beta, gamma):
return -dllf_dn_sigma_eq_gamma(beta=beta, gamma=gamma, n=n,
history=history, sum_less_equal=False)
eps = np.finfo(float).eps
if estimate_n_only:
return fmin_l_bfgs_b(
func=negative_llf_separate_params, # minimize this
x0=np.array([n_start]), # initial guess
args=(beta_start, gamma_start), # additional args to func&fprime
fprime=negative_llf_gradient_separate_params,
bounds=[(len(history), None)],
iprint=1
)
else:
return fmin_l_bfgs_b(
func=negative_llf, # minimize this
x0=np.array([beta_start, gamma_start, n_start]), # initial guess
fprime=negative_llf_gradient,
bounds=[(eps, None), (eps, None), (len(history), None)],
iprint=1
)
def dllf_dbeta_sigma_neq_gamma(beta, sigma, gamma, n, history,
sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
sigma : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Note that this argument does not affect the derivative w.r.t. the beta
parameter. Thus, the return value does not depend on it.
Returns
-------
derivative_wrt_beta : float
The derivative (w.r.t. the beta parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `sigma`, `gamma`, and `n`.
"""
sum_part = np.count_nonzero(history) / beta
int_part = 0
for l in range(len(history) - 1):
int_part += (n - (l + 1)) / n * (
np.sum(
np.exp(-sigma * (history[l] - history[:l + 1]))
-
np.exp(-sigma * (history[l + 1] - history[:l + 1]))
) / sigma
-
np.sum(
np.exp(-gamma * (history[l] - history[:l + 1]))
-
np.exp(-gamma * (history[l + 1] - history[:l + 1]))
) / gamma
)
return sum_part - int_part * sigma / (gamma - sigma)
def dllf_dbeta_sigma_eq_gamma(beta, gamma, n, history, sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Note that this argument does not affect the derivative w.r.t. the beta
parameter. Thus, the return value does not depend on it.
Returns
-------
derivative_wrt_beta : float
The derivative (w.r.t. the beta parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `gamma`, and `n`.
"""
sum_part = np.count_nonzero(history) / beta
# print("#" * 80, "\n", "SCALE", sep="")
# print("#" * 80)
# print("addend sum:", addend_sum)
# print("#" * 80)
# print("#" * 80)
int_part = 0
for l in range(len(history) - 1):
mg_tl_minus_tj = -gamma * (history[l] - history[:l + 1])
mg_tlplus1_minus_tj = -gamma * (history[l + 1] - history[:l + 1])
int_part += (n - (l + 1)) / n * np.sum(
np.exp(mg_tlplus1_minus_tj) * (mg_tlplus1_minus_tj - 1)
-
np.exp(mg_tl_minus_tj) * (mg_tl_minus_tj - 1)
)
return sum_part - int_part / gamma
def dllf_dsigma_sigma_neq_gamma(beta, sigma, gamma, n, history,
sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
sigma : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
derivative_wrt_sigma : float
The derivative (w.r.t. the sigma parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `sigma`, `gamma`, and `n`.
"""
his_after_zero = history[history > 0]
gms = float(gamma - sigma)
addend_sum = np.count_nonzero(history) * gamma / (sigma * gms)
for index, event_time in enumerate(his_after_zero):
if sum_less_equal:
ti_minus_tj = event_time - history[history <= event_time]
else:
ti_minus_tj = event_time - history[history < event_time]
exp_sigma = np.exp(-sigma * ti_minus_tj)
exp_gamma = np.exp(-gamma * ti_minus_tj)
numerator = np.sum(-ti_minus_tj * exp_sigma)
# print("addend_sum_tmp numerator:", addend_sum_tmp)
denominator = np.sum(exp_sigma - exp_gamma)
# print("addend_sum_tmp denominator", denominator*decay)
# print("--> addend for index", index, "is:", addend_sum_tmp)
addend_sum += numerator / denominator
# print("addend_sum", addend_sum) # so far: unparallelized
int_part = 0
for l in range(len(history) - 1):
tl_minus_tj = history[l] - history[:l + 1]
# print("times", ti_minus_tj)
tlplus1_minus_tj = history[l + 1] - history[:l + 1]
# print("times+1", tiplus1_minus_tj)
exp_sigma = np.exp(-sigma * tl_minus_tj)
exp_sigma_plus1 = np.exp(-sigma * tlplus1_minus_tj)
exp_gamma = np.exp(-gamma * tl_minus_tj)
exp_gamma_plus1 = np.exp(-gamma * tlplus1_minus_tj)
int_part += (n - (l + 1)) / n * (
1 / gms**2 *
np.sum(
exp_sigma - exp_sigma_plus1
)
+
1 / gms *
np.sum(
-tl_minus_tj * exp_sigma + tlplus1_minus_tj * exp_sigma_plus1
)
-
1 / gms**2 *
np.sum(exp_gamma - exp_gamma_plus1)
)
int_part *= beta
# print("addend_int FINISH:", int_part)
return addend_sum - int_part
def dllf_dgamma_sigma_neq_gamma(beta, sigma, gamma, n, history,
sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
sigma : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
derivative_wrt_gamma : float
The derivative (w.r.t. the gamma parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `sigma`, `gamma`, and `n`.
"""
his_after_zero = history[history > 0]
gms = float(gamma - sigma)
addend_sum = -np.count_nonzero(history) / gms
for index, event_time in enumerate(his_after_zero):
if sum_less_equal:
ti_minus_tj = event_time - history[history <= event_time]
else:
ti_minus_tj = event_time - history[history < event_time]
exp_sigma = np.exp(-sigma * ti_minus_tj)
exp_gamma = np.exp(-gamma * ti_minus_tj)
numerator = np.sum(-ti_minus_tj * exp_gamma)
# print("addend_sum_tmp numerator:", addend_sum_tmp)
denominator = np.sum(exp_sigma - exp_gamma)
# print("addend_sum_tmp denominator", denominator*decay)
# print("--> addend for index", index, "is:", addend_sum_tmp)
addend_sum -= numerator / denominator
# print("addend_sum", addend_sum) # so far: unparallelized
int_part = 0
for l in range(len(history) - 1):
tl_minus_tj = history[l] - history[:l + 1]
# print("times", ti_minus_tj)
tlplus1_minus_tj = history[l + 1] - history[:l + 1]
# print("times+1", tiplus1_minus_tj)
exp_sigma = np.exp(-sigma * tl_minus_tj)
exp_sigma_plus1 = np.exp(-sigma * tlplus1_minus_tj)
exp_gamma = np.exp(-gamma * tl_minus_tj)
exp_gamma_plus1 = np.exp(-gamma * tlplus1_minus_tj)
int_part += (n - (l + 1)) / n * (
-1 / gms**2 *
np.sum(
exp_sigma - exp_sigma_plus1
)
-
sigma * (sigma - 2 * gamma) / (gamma * gms)**2 *
np.sum(exp_gamma - exp_gamma_plus1)
-
sigma / (gamma * gms) *
np.sum(
tlplus1_minus_tj * exp_gamma_plus1 - tl_minus_tj * exp_gamma
)
)
int_part *= beta
# print("addend_int FINISH:", int_part)
return addend_sum - int_part
def dllf_dgamma_sigma_eq_gamma(beta, gamma, n, history, sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
derivative_wrt_gamma : float
The derivative (w.r.t. the gamma parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `gamma`, and `n`.
"""
# print("gamma (eq)", gamma)
his_after_zero = history[history > 0]
addend_sum = np.count_nonzero(history) / gamma
for index, event_time in enumerate(his_after_zero):
if sum_less_equal:
ti_minus_tj = event_time - history[history <= event_time]
else:
ti_minus_tj = event_time - history[history < event_time]
exp_gamma = np.exp(-gamma * ti_minus_tj)
numerator = np.sum(ti_minus_tj**2 * exp_gamma)
# print("addend_sum_tmp numerator:", addend_sum_tmp)
denominator = np.sum(ti_minus_tj * exp_gamma)
# print("addend_sum_tmp denominator", denominator*decay)
# print("--> addend for index", index, "is:", addend_sum_tmp)
addend_sum -= numerator / denominator
# print("addend_sum", addend_sum) # so far: unparallelized
int_part = 0
for l in range(len(history) - 1):
tl_minus_tj = history[l] - history[:l + 1]
tlplus1_minus_tj = history[l + 1] - history[:l + 1]
g_tl_minus_tj = gamma * tl_minus_tj
g_tlplus1_minus_tj = gamma * tlplus1_minus_tj
int_part += (n - (l + 1)) / n * np.sum(
np.exp(-g_tl_minus_tj)
*
(
-g_tl_minus_tj * (1 + g_tl_minus_tj) - 1
)
-
np.exp(-g_tlplus1_minus_tj)
*
(
-g_tlplus1_minus_tj * (1 + g_tlplus1_minus_tj) - 1
)
)
int_part *= beta / gamma**2
# print("addend_int FINISH:", int_part)
return addend_sum - int_part
def dllf_dn_sigma_neq_gamma(beta, sigma, gamma, n, history,
sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
sigma : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
derivative_wrt_n : float
The derivative (w.r.t. the `n` parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `sigma`, `gamma`, and `n`.
"""
# print("we divide\n",
# np.arange(1, len(history) + 1),
# "\nby\n",
# (n * (n - np.arange(1, len(history) + 1))))
# print(np.sum(
# np.arange(1, len(history) - sum(history <= 0) + 1) /
# (n - np.arange(1, len(history) - sum(history <= 0) + 1))) / n
# )
time_indices_gt_zero = np.arange(np.count_nonzero(history <= 0) + 1, # missing in R code: +1
len(history) + 1) # added in R code: - sum(history <= 0))
addend_sum = np.sum(
time_indices_gt_zero / (n - time_indices_gt_zero)
) / n
# print("addend_sum:", addend_sum)
int_part = 0
gms = gamma - sigma
for l in range(len(history) - 1):
tl_minus_tj = history[l] - history[:l + 1]
tlplus1_minus_tj = history[l + 1] - history[:l + 1]
int_part += (l + 1) * (
np.sum(
np.exp(-sigma * tl_minus_tj)
-
np.exp(-sigma * tlplus1_minus_tj)
) / sigma
-
np.sum(
np.exp(-gamma * tl_minus_tj)
-
np.exp(-gamma * tlplus1_minus_tj)
) / gamma
)
int_part *= beta * sigma / gms / n**2
# print("addend_int FINISH:", int_part)
return addend_sum - int_part
def dllf_dn_sigma_eq_gamma(beta, gamma, n, history, sum_less_equal=True):
"""
Parameters
----------
beta : float
See corresponding argument in :func:`exp_intensity`.
gamma : float
See corresponding argument in :func:`exp_intensity`.
n : float
See corresponding argument in :func:`exp_intensity`.
history : np.array
See corresponding argument in :func:`exp_intensity`.
sum_less_equal : bool, default: True
See corresponding argument in :func:`exp_intensity`.
Returns
-------
derivative_wrt_n : float
The derivative (w.r.t. the `n` parameter) of the log-likelihood
function given the `history` and evaluated at the parameters
`beta`, `gamma`, and `n`.
"""
# print("we divide\n",
# np.arange(1, len(history) + 1),
# "\nby\n",
# (n * (n - np.arange(1, len(history) + 1))))
# print(np.sum(
# np.arange(1, len(history) - sum(history <= 0) + 1) /
# (n - np.arange(1, len(history) - sum(history <= 0) + 1))) / n
# )
time_indices_gt_zero = np.arange(np.count_nonzero(history <= 0) + 1, # missing in R code: +1
len(history) + 1) # added in R code: - sum(history <= 0))
addend_sum = np.sum(
time_indices_gt_zero / (n - time_indices_gt_zero)
) / n
# print("addend_sum:", addend_sum)
int_part = 0
for l in range(len(history) - 1):
tl_minus_tj = history[l] - history[:l + 1]
tlplus1_minus_tj = history[l + 1] - history[:l + 1]
mg_tl_minus_tj = -gamma * tl_minus_tj
mg_tlplus1_minus_tj = -gamma * tlplus1_minus_tj
int_part += (l + 1) * np.sum(
np.exp(mg_tlplus1_minus_tj) * (mg_tlplus1_minus_tj - 1)
-
np.exp(mg_tl_minus_tj) * (mg_tl_minus_tj - 1)
)
int_part *= beta / gamma / n**2
# print("addend_int FINISH:", int_part)
return addend_sum - int_part
def llf_gradient(beta, sigma, gamma, n, history, sum_less_equal=True):
"""Return the gradient of the log-likelihood w.r.t. the free parameters."""
if sigma is None or sigma == gamma:
gradient = np.empty(3)
gradient[0] = dllf_dbeta_sigma_eq_gamma(
beta=beta, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
gradient[1] = dllf_dgamma_sigma_eq_gamma(
beta=beta, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
gradient[2] = dllf_dn_sigma_eq_gamma(
beta=beta, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
else:
gradient = np.empty(4)
gradient[0] = dllf_dbeta_sigma_neq_gamma(
beta=beta, sigma=sigma, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
gradient[1] = dllf_dsigma_sigma_neq_gamma(
beta=beta, sigma=sigma, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
gradient[2] = dllf_dgamma_sigma_neq_gamma(
beta=beta, sigma=sigma, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
gradient[3] = dllf_dn_sigma_neq_gamma(
beta=beta, sigma=sigma, gamma=gamma, n=n, history=history,
sum_less_equal=sum_less_equal
)
return gradient
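Hand-derived partials like the `dllf_*` functions above are easy to get subtly wrong. A standard safeguard is to compare each analytic partial against a central finite difference of the objective itself. Below is a generic checker, shown on a toy function only; it is a sketch and is not wired to this module's log-likelihood:

```python
import numpy as np

def check_gradient(f, grad_f, x, eps=1e-6, tol=1e-5):
    """Compare an analytic gradient against central finite differences."""
    x = np.asarray(x, dtype=float)
    numeric = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        # Central difference along coordinate i.
        numeric[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return np.allclose(grad_f(x), numeric, atol=tol)

# Toy example: f(beta, gamma) = beta * exp(-gamma), whose exact gradient
# is (exp(-gamma), -beta * exp(-gamma)).
f = lambda p: p[0] * np.exp(-p[1])
grad = lambda p: np.array([np.exp(-p[1]), -p[0] * np.exp(-p[1])])
assert check_gradient(f, grad, [1.5, 0.3])
```

The same pattern applies to `llf_gradient`: wrap the log-likelihood as `f` over the parameter vector and pass `llf_gradient` as `grad_f`.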
# def to_sir(scale, decay, n):  # todo
# """
# Return the parameters of the equivalent SIR model.
#
# Parameters
# ----------
# scale : float
# See the corresponding parameters in :meth:`exp_intensity` for more
# information.
# decay : float
# See the corresponding parameters in :meth:`exp_intensity` for more
# information.
# n : int
# See the corresponding parameters in :meth:`exp_intensity` for more
# information.
#
# Returns
# -------
# sir_parameters : dict
# Dictionary containing the corresponding SIR parameters n, i_0, beta
# (infection rate), and gamma (recovery rate).
#
# Examples
# --------
# >>> obtained = HawkesN.to_sir(scale=5, decay=0.2, n=400)
# >>> desired = {"n": 400, "i_0": 1, "beta": 1.0, "gamma": 0.2}
# >>> obtained == desired
# True
# """
# i_0 = 1 # the HawkesN process is assumed to have 1 "immigrant" at t=0
# beta = scale * decay
# gamma = decay
# return {"n": n, "i_0": i_0, "beta": beta, "gamma": gamma}
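The commented-out stub above is a simple parameter relabeling (one immigrant at t = 0, `beta = scale * decay`, `gamma = decay`). As a standalone sketch it can already be exercised against the doctest values given in its docstring:

```python
def to_sir(scale, decay, n):
    """Map HawkesN (scale, decay, n) to the equivalent SIR parameters."""
    i_0 = 1  # the HawkesN process is assumed to have 1 "immigrant" at t=0
    return {"n": n, "i_0": i_0, "beta": scale * decay, "gamma": decay}

assert to_sir(scale=5, decay=0.2, n=400) == {
    "n": 400, "i_0": 1, "beta": 1.0, "gamma": 0.2
}
```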
# def size_distribution(scale=5, decay=0.2, n=400, history=None,
# transition_matrix=None):
# sir_params = to_sir(scale=scale, decay=decay, n=n)
#
# if history is None:
# return StochasticSEIR.size_distribution(
#             **sir_params, transition_matrix=transition_matrix)
# else:
# max_time = history[-2]


# ===== test/message_test.py — repo: Watchful1/RedditSubsBot (MIT) =====

import discord_logging
log = discord_logging.get_logger(init=True)
import messages
import utils
from praw_wrapper import reddit_test
import static
from classes.submission import Submission
from classes.subscription import Subscription
from classes.comment import DbComment
from classes.notification import Notification
def assert_message(message, included):
for include in included:
assert include in message
def assert_subscription(subscription, subscriber, author, subreddit, recurring, tag=None):
assert subscription.subscriber.name == subscriber
if author is None:
assert subscription.author is None
else:
assert subscription.author.name == author
assert subscription.subreddit.name == subreddit
assert subscription.recurring is recurring
assert subscription.tag == tag
def init_db(database, users=None, subreddits=None, default_subreddits=None, enable_tags=False):
if users is not None:
for user in users:
database.get_or_add_user(user)
if subreddits is not None:
for subreddit_name in subreddits:
subreddit = database.get_or_add_subreddit(subreddit_name)
subreddit.is_enabled = True
subreddit.tag_enabled = enable_tags
if default_subreddits is not None:
for subreddit_name in default_subreddits:
subreddit = database.get_or_add_subreddit(subreddit_name)
subreddit.is_enabled = True
subreddit.tag_enabled = enable_tags
subreddit.default_recurring = True
database.commit()
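The tests below construct messages from `reddit_test.RedditObject` and then inspect `message.get_first_child().body`. The real helper lives in `praw_wrapper`; the minimal stand-in below is an assumption about its behavior, for illustration only: keyword arguments become attributes, and replies are recorded as child objects so tests can inspect the bot's response.

```python
class FakeRedditObject:
    """Minimal test double: kwargs become attributes, replies become children."""
    def __init__(self, **kwargs):
        self.children = []
        self.__dict__.update(kwargs)

    def reply(self, body):
        # Record the reply so a test can fetch it via get_first_child().
        child = FakeRedditObject(body=body)
        self.children.append(child)
        return child

    def get_first_child(self):
        return self.children[0]

message = FakeRedditObject(body="UpdateMe! u/AuthorName r/SubredditName",
                           author="Watchful1")
message.reply("I will message you next time u/AuthorName posts")
assert "next time" in message.get_first_child().body
```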
def test_add_update(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name])
message = reddit_test.RedditObject(
body=f"UpdateMe! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "next time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, False)
def test_add_update_plus(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name])
message = reddit_test.RedditObject(
body=f"UpdateMe!+u/{author}+r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "next time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, False)
def test_add_subscribe(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name])
message = reddit_test.RedditObject(
body=f"SubscribeMe! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "each time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, True)
def test_add_subscribe_subreddit(database, reddit):
username = "Watchful1"
subreddit_name = "SubredditName"
init_db(database, [], [subreddit_name])
message = reddit_test.RedditObject(
body=f"SubscribeMe! r/{subreddit_name} -all",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [subreddit_name, "each post"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, None, subreddit_name, True)
def test_update_subscribe(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
False
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"SubscribeMe! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
[author, subreddit_name, "updated your subscription", "each"])
subscriptions = database.get_user_subscriptions_by_name(username, only_enabled=False)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, True)
def test_already_subscribed(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"SubscribeMe! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
[author, subreddit_name, "already asked me", "each"])
subscriptions = database.get_user_subscriptions_by_name(username, only_enabled=False)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, True)
def test_subreddit_not_enabled(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
message = reddit_test.RedditObject(
body=f"UpdateMe! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, ["is not being tracked"])
subscriptions = database.get_user_subscriptions_by_name(username, only_enabled=False)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author.lower(), subreddit_name.lower(), False)
def test_add_link(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name])
post_id = reddit_test.random_id()
reddit.add_submission(
reddit_test.RedditObject(
id=post_id,
author=author,
subreddit=subreddit_name
)
)
message = reddit_test.RedditObject(
body=f"https://www.reddit.com/r/updateme/comments/{post_id}/this_is_a_test_post/",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "next time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, False)
def test_add_link_with_tag(database, reddit):
username = "Watchful1"
author = database.get_or_add_user("AuthorName")
subreddit_name = "SubredditName"
tag = "this is a tag"
init_db(database, [author.name], [subreddit_name], enable_tags=True)
submission_id = reddit_test.random_id()
db_submission = Submission(
submission_id=submission_id,
time_created=utils.datetime_now(),
author=author,
subreddit=database.get_or_add_subreddit(subreddit_name),
permalink=f"/r/{subreddit_name}/comments/{submission_id}/",
tag=tag
)
database.add_submission(db_submission)
message = reddit_test.RedditObject(
body=f"https://www.reddit.com/r/{subreddit_name}/comments/{submission_id}/this_is_a_test_post/",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author.name, subreddit_name, "next time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author.name, subreddit_name, False, tag)
def test_add_link_default_subscribe(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], None, [subreddit_name])
post_id = reddit_test.random_id()
reddit.add_submission(
reddit_test.RedditObject(
id=post_id,
author=author,
subreddit=subreddit_name
)
)
message = reddit_test.RedditObject(
body=f"https://www.reddit.com/r/updateme/comments/{post_id}/this_is_a_test_post/",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "each time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, True)
def test_remove_subscription(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"Remove! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
[author, subreddit_name, "removed your subscription"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 0
def test_remove_subscription_with_notifications(database, reddit):
subscriber = database.get_or_add_user("User1")
subscriber2 = database.get_or_add_user("User2")
author = database.get_or_add_user("AuthorName")
subreddit = database.get_or_add_subreddit("SubredditName")
subscription = Subscription(subscriber, author, subreddit, True)
database.add_subscription(subscription)
subscription2 = Subscription(subscriber2, author, subreddit, True)
database.add_subscription(subscription2)
submission = Submission(reddit_test.random_id(), utils.datetime_now(), author, subreddit, "")
database.add_submission(submission)
notification = Notification(subscription, submission)
database.add_notification(notification)
notification = Notification(subscription2, submission)
database.add_notification(notification)
database.commit()
message = reddit_test.RedditObject(
body=f"Remove! u/{author.name} r/{subreddit.name}",
author=subscriber.name
)
messages.process_message(message, reddit, database)
assert len(database.get_pending_notifications()) == 1
def test_remove_tagged_subscription(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
tag = "this is a tag"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
True,
tag
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"Remove! u/{author} r/{subreddit_name} <{tag}>",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
[author, subreddit_name, "removed your subscription", "with tag", tag])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 0
def test_remove_all_tagged_subscriptions(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
False,
"this is a tag"
)
)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
False,
"this is another tag"
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"Remove! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
[author, subreddit_name, "removed all your tagged subscriptions"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 0
def test_remove_global_subscription(database, reddit):
username = "Watchful1"
subreddit_name = "SubredditName"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
None,
database.get_or_add_subreddit(subreddit_name),
True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"Remove! r/{subreddit_name} -all",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
[subreddit_name, "removed your subscription in"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 0
def test_remove_all_subscription(database, reddit):
username = "Watchful1"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user("Author1"),
database.get_or_add_subreddit("Subreddit1", enable_subreddit_if_new=True),
True
)
)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user("Author2"),
database.get_or_add_subreddit("Subreddit2", enable_subreddit_if_new=True),
True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"RemoveAll",
author=username
)
messages.process_message(message, reddit, database)
assert_message(
message.get_first_child().body,
["Author1", "Author2", "Subreddit1", "Subreddit2", "Removed your subscription"])
subscriptions = database.get_user_subscriptions_by_name(username, only_enabled=False)
assert len(subscriptions) == 0
def test_remove_all_subscription_with_notifications(database, reddit):
subscriber = database.get_or_add_user("User1")
subscriber2 = database.get_or_add_user("User2")
author1 = database.get_or_add_user("Author1")
author2 = database.get_or_add_user("Author2")
subreddit = database.get_or_add_subreddit("SubredditName", enable_subreddit_if_new=True)
subscription = Subscription(subscriber, author1, subreddit, True)
database.add_subscription(subscription)
subscription2 = Subscription(subscriber2, author1, subreddit, True)
database.add_subscription(subscription2)
subscription3 = Subscription(subscriber, author2, subreddit, True)
database.add_subscription(subscription3)
subscription4 = Subscription(subscriber2, author2, subreddit, True)
database.add_subscription(subscription4)
submission1 = Submission(reddit_test.random_id(), utils.datetime_now(), author1, subreddit, "")
database.add_submission(submission1)
submission2 = Submission(reddit_test.random_id(), utils.datetime_now(), author2, subreddit, "")
database.add_submission(submission2)
notification = Notification(subscription, submission1)
database.add_notification(notification)
notification = Notification(subscription2, submission1)
database.add_notification(notification)
notification = Notification(subscription3, submission2)
database.add_notification(notification)
notification = Notification(subscription4, submission2)
database.add_notification(notification)
database.commit()
message = reddit_test.RedditObject(body="RemoveAll", author=subscriber.name)
messages.process_message(message, reddit, database)
assert len(database.get_pending_notifications()) == 2
def test_delete_comment(database, reddit):
username = "Watchful1"
author = database.get_or_add_user("AuthorName")
subreddit_name = "SubredditName"
reddit_comment = reddit_test.RedditObject(
author=username,
link_id=reddit_test.random_id()
)
reddit.add_comment(reddit_comment)
db_submission = Submission(
submission_id=reddit_comment.link_id,
time_created=utils.datetime_now(),
author=author,
subreddit=database.get_or_add_subreddit(subreddit_name, enable_subreddit_if_new=True),
permalink=f"none"
)
database.add_submission(db_submission)
database.add_comment(
DbComment(
comment_id=reddit_comment.id,
submission=db_submission,
subscriber=database.get_or_add_user(username),
author=author,
subreddit=database.get_or_add_subreddit(subreddit_name),
recurring=True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"delete {reddit_comment.link_id}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, ["Comment deleted"])
def test_delete_comment_not_author(database, reddit):
username = "Watchful1"
author = database.get_or_add_user("AuthorName")
subreddit_name = "SubredditName"
reddit_comment = reddit_test.RedditObject(
author=username,
link_id=reddit_test.random_id()
)
reddit.add_comment(reddit_comment)
db_submission = Submission(
submission_id=reddit_comment.link_id,
time_created=utils.datetime_now(),
author=author,
subreddit=database.get_or_add_subreddit(subreddit_name, enable_subreddit_if_new=True),
permalink=f"none"
)
database.add_submission(db_submission)
database.add_comment(
DbComment(
comment_id=reddit_comment.id,
submission=db_submission,
subscriber=database.get_or_add_user(username),
author=author,
subreddit=database.get_or_add_subreddit(subreddit_name),
recurring=True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"delete {reddit_comment.link_id}",
author="Watchful2"
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, ["looks like the bot wasn't replying"])
def test_delete_comment_doesnt_exist(database, reddit):
message = reddit_test.RedditObject(
body=f"delete {reddit_test.random_id()}",
author="Watchful1"
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, ["comment doesn't exist or was already"])
def test_list(database, reddit):
username = "Watchful1"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user("Author1"),
database.get_or_add_subreddit("Subreddit1", enable_subreddit_if_new=True),
True
)
)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user("Author2"),
database.get_or_add_subreddit("Subreddit2", enable_subreddit_if_new=True),
False
)
)
database.add_subscription(
Subscription(
database.get_or_add_user("Watchful2"),
database.get_or_add_user("Author3"),
database.get_or_add_subreddit("Subreddit3", enable_subreddit_if_new=True),
False
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"MySubscriptions",
author=username
)
messages.process_message(message, reddit, database)
response = message.get_first_child().body
assert "Author1" in response
assert "Author2" in response
assert "Subreddit1" in response
assert "Subreddit2" in response
assert "Each" in response
assert "Next" in response
assert "Author3" not in response
assert "Subreddit3" not in response
def test_list_tagged(database, reddit):
username = "Watchful1"
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user("Author1"),
database.get_or_add_subreddit("Subreddit1", enable_subreddit_if_new=True),
True,
"this is a tag"
)
)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user("Author2"),
database.get_or_add_subreddit("Subreddit2", enable_subreddit_if_new=True),
False,
"this is also a tag"
)
)
database.add_subscription(
Subscription(
database.get_or_add_user("Watchful2"),
database.get_or_add_user("Author3"),
database.get_or_add_subreddit("Subreddit3", enable_subreddit_if_new=True),
False
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"MySubscriptions",
author=username
)
messages.process_message(message, reddit, database)
response = message.get_first_child().body
assert "Author1" in response
assert "Author2" in response
assert "Subreddit1" in response
assert "Subreddit2" in response
assert "Each" in response
assert "Next" in response
assert "Author3" not in response
assert "Subreddit3" not in response
assert "<this is a tag>" in response
assert "<this is also a tag>" in response
def test_add_sub(database, reddit):
database.get_or_add_subreddit("Subreddit1")
database.commit()
message = reddit_test.RedditObject(
body="AddSubreddit r/Subreddit1 subscribe",
author=static.OWNER
)
messages.process_message(message, reddit, database)
response = message.get_first_child().body
assert "Activated r/Subreddit1" in response
assert "as subscribe" in response
subreddit = database.get_or_add_subreddit("Subreddit1")
assert subreddit.is_enabled is True
assert subreddit.default_recurring is True
def test_add_update_tagged(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
tag = "this is a tag"
init_db(database, [author], [subreddit_name], enable_tags=True)
message = reddit_test.RedditObject(
body=f"UpdateMe! u/{author} r/{subreddit_name} <{tag}>",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "next time", tag])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, False, tag)
def test_subscribe_existing_tagged(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name], enable_tags=True)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
True,
"another tag"
)
)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
True,
"this is a tag"
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"SubscribeMe! u/{author} r/{subreddit_name}",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "each time", "This replaces"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, True, None)
def test_subscribe_tagged_existing_all(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name], enable_tags=True)
database.add_subscription(
Subscription(
database.get_or_add_user(username),
database.get_or_add_user(author),
database.get_or_add_subreddit(subreddit_name),
True
)
)
database.commit()
message = reddit_test.RedditObject(
body=f"SubscribeMe! u/{author} r/{subreddit_name} <this is a tag>",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "You're already"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
def test_add_link_tagged(database, reddit):
username = "Watchful1"
author = "AuthorName"
subreddit_name = "SubredditName"
init_db(database, [author], [subreddit_name])
post_id = reddit_test.random_id()
reddit.add_submission(
reddit_test.RedditObject(
id=post_id,
author=author,
subreddit=subreddit_name
)
)
message = reddit_test.RedditObject(
body=f"https://www.reddit.com/r/updateme/comments/{post_id}/this_is_a_test_post/",
author=username
)
messages.process_message(message, reddit, database)
assert_message(message.get_first_child().body, [author, subreddit_name, "next time"])
subscriptions = database.get_user_subscriptions_by_name(username)
assert len(subscriptions) == 1
assert_subscription(subscriptions[0], username, author, subreddit_name, False)
def test_short_notifs(database, reddit):
message = reddit_test.RedditObject(
body="Short",
author="User1"
)
messages.process_message(message, reddit, database)
response = message.get_first_child().body
assert "You'll now get shortened notifications" in response
user = database.get_or_add_user("User1")
assert user.short_notifs is True
def test_long_notifs(database, reddit):
message = reddit_test.RedditObject(
body="Long",
author="User1"
)
messages.process_message(message, reddit, database)
response = message.get_first_child().body
assert "You'll now get normal notifications" in response
user = database.get_or_add_user("User1")
assert user.short_notifs is False


# ===== tests/test_command_load.py — repo: imolloy/kestrel-lang (Apache-2.0) =====

import pytest
import os
from kestrel.session import Session
from kestrel.exceptions import MissingEntityType
def test_load_full_csv():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs.csv"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path}"
s.execute(stmt)
v = s.get_variable("newvar")
assert len(v) == 5
assert v[0]["type"] == "process"
assert v[0]["name"] == "reg.exe"
def test_load_full_json():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs.json"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path}"
s.execute(stmt)
v = s.get_variable("newvar")
assert len(v) == 5
assert v[0]["type"] == "process"
assert v[0]["name"] == "reg.exe"
def test_load_parquet_gz():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs.parquet.gz"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path}"
s.execute(stmt)
v = s.get_variable("newvar")
assert len(v) == 5
assert v[0]["type"] == "process"
assert v[0]["name"] == "reg.exe"
def test_load_notype_json_to_fail():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs_no_type.json"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path}"
        with pytest.raises(MissingEntityType):
s.execute(stmt)
def test_load_notype_json_as_type():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs_no_type.json"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path} AS process"
s.execute(stmt)
v = s.get_variable("newvar")
assert len(v) == 5
assert v[0]["type"] == "process"
assert v[0]["name"] == "reg.exe"
def test_load_string_list_to_fail():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs_list.json"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path}"
        with pytest.raises(MissingEntityType):
s.execute(stmt)
def test_load_string_list_as_type():
data_file_path = os.path.join(
os.path.dirname(__file__), "test_input_data_procs_list.json"
)
with Session() as s:
stmt = f"newvar = LOAD {data_file_path} AS process"
s.execute(stmt)
v = s.get_variable("newvar")
assert len(v) == 3
assert v[0]["type"] == "process"
assert v[0]["name"] in ["cmd.exe", "reg.exe", "explorer.exe"]
assert sorted([i["name"] for i in v]) == ["cmd.exe", "explorer.exe", "reg.exe"]


# ===== json_templates/ItemBP.py — repo: TristanPrisemut/mcpybox (MIT) =====

import json
#! Non-Eatable
#? @identifier - the identifier of the item
#? @use_duration - the use_duration of the item in game
#? @max_stack_size - the maximum stack size of the item
#* returns a JSON string targeting format version [1.16.0]
def non_eatable(identifier, use_duration, max_stack_size):
data = {
"format_version": "1.16.0",
"minecraft:item": {
"description": {
"identifier": str(identifier)
},
"components": {
"minecraft:use_duration": int(use_duration),
                "minecraft:hand_equipped": True,
                "minecraft:stacked_by_data": True,
                "minecraft:foil": False,
"minecraft:max_stack_size": int(max_stack_size)
}
}
}
    return json.dumps(data)
#! Eatable without cooldown
#? @identifier - the identifier of the item
#? @use_duration - the use_duration of the item in game
#? @nutrition - the nutrition value of the item
#? @max_stack_size - the maximum stack size of the item
#* returns json data for format version 1.16.0
def eatable(identifier, use_duration, nutrition, max_stack_size):
data = {
"format_version": "1.16.0",
"minecraft:item": {
"description": {
"identifier": str(identifier)
},
"components": {
"minecraft:use_duration": int(use_duration),
"minecraft:food": {
"nutrition": int(nutrition),
                    "can_always_eat": True
                },
                "minecraft:hand_equipped": True,
"minecraft:max_stack_size": int(max_stack_size)
}
}
}
    return json.dumps(data)
#! Eatable with cooldown
#? @identifier - the identifier of the item
#? @use_duration - the use_duration of the item in game
#? @nutrition - the nutrition value of the item
#? @cooldown_type - the cooldown category of the item
#? @cooldown_time - the cooldown duration of the item
#? @max_stack_size - the maximum stack size of the item
#* returns json data for format version 1.16.0
def eatable_with_cooldown(identifier, use_duration, nutrition, cooldown_type, cooldown_time, max_stack_size):
data = {
"format_version": "1.16.0",
"minecraft:item": {
"description": {
"identifier": str(identifier)
},
"components": {
"minecraft:use_duration": int(use_duration),
"minecraft:food": {
"nutrition": int(nutrition),
                    "can_always_eat": True
                },
                "minecraft:hand_equipped": True,
"minecraft:max_stack_size": int(max_stack_size),
"cooldown_time": int(cooldown_time),
"cooldown_type": str(cooldown_type)
}
}
}
    return json.dumps(data)
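A quick, self-contained sketch of the serialization round trip these templates rely on — note that `json.dump` writes to a file object (and returns `None`), whereas `json.dumps` returns a string; the identifier `demo:wand` is a made-up example value:

```python
import json

# Minimal item template mirroring the structure above.
# "demo:wand" is a made-up example identifier, not from the original code.
data = {
    "format_version": "1.16.0",
    "minecraft:item": {
        "description": {"identifier": "demo:wand"},
        "components": {
            "minecraft:use_duration": 32,
            "minecraft:hand_equipped": True,  # serializes to a real JSON boolean
            "minecraft:max_stack_size": 1,
        },
    },
}

text = json.dumps(data)    # returns a str; json.dump(data) would raise a TypeError
parsed = json.loads(text)  # round-trips back to an equal dict
```

Using Python booleans here matters: `True` serializes to the JSON literal `true`, while the string `'true'` would serialize to the quoted string `"true"`.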
| 33.702703 | 95 | 0.538492 | 254 | 2,494 | 5.082677 | 0.188976 | 0.127808 | 0.083656 | 0.058095 | 0.850503 | 0.831139 | 0.831139 | 0.831139 | 0.831139 | 0.831139 | 0 | 0.014625 | 0.342021 | 2,494 | 73 | 96 | 34.164384 | 0.77209 | 0.180032 | 0 | 0.603448 | 0 | 0 | 0.280512 | 0.114173 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051724 | false | 0 | 0.017241 | 0 | 0.12069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
32c23b44faa9f7c50e58f9cb1be6ec7b2b6068a0 | 5,516 | py | Python | tests/conftest.py | item4/ugoira | 3d5a1f8f82cf351b2c35349227bf5eb1129d2ae9 | [
"MIT"
] | 11 | 2018-04-21T08:04:11.000Z | 2021-02-14T17:09:33.000Z | tests/conftest.py | item4/ugoira | 3d5a1f8f82cf351b2c35349227bf5eb1129d2ae9 | [
"MIT"
] | 16 | 2018-09-18T11:10:12.000Z | 2021-02-14T16:47:29.000Z | tests/conftest.py | item4/ugoira | 3d5a1f8f82cf351b2c35349227bf5eb1129d2ae9 | [
"MIT"
] | 3 | 2018-09-18T10:50:45.000Z | 2021-01-31T18:01:33.000Z | import pathlib
import zipfile
from PIL import Image, ImageColor
import pytest
@pytest.fixture()
def fx_tmpdir(tmpdir):
"""Make :class:`pathlib.Path` instance of ```tmpdir```."""
return pathlib.Path(str(tmpdir))
@pytest.fixture(scope='session')
def ugoira_id():
return 74442143
@pytest.fixture(scope='session')
def non_ugoira_id():
return 74073488
@pytest.fixture(scope='session')
def small_zip_url():
return (
'https://i.pximg.net/img-zip-ugoira/img/'
'2019/04/29/16/09/38/74442143_ugoira600x600.zip'
)
@pytest.fixture(scope='session')
def big_zip_url():
return (
'https://i.pximg.net/img-zip-ugoira/img/'
'2019/04/29/16/09/38/74442143_ugoira1920x1080.zip'
)
@pytest.fixture(scope='session')
def meta_body():
"""Ugoira page data."""
return '{"error":false,"message":"","body":{"src":"https:\/\/i.pximg.net\/img-zip-ugoira\/img\/2019\/04\/29\/16\/09\/38\/74442143_ugoira600x600.zip","originalSrc":"https:\/\/i.pximg.net\/img-zip-ugoira\/img\/2019\/04\/29\/16\/09\/38\/74442143_ugoira1920x1080.zip","mime_type":"image\/jpeg","frames":[{"file":"000000.jpg","delay":70},{"file":"000001.jpg","delay":70},{"file":"000002.jpg","delay":70},{"file":"000003.jpg","delay":70},{"file":"000004.jpg","delay":70},{"file":"000005.jpg","delay":70},{"file":"000006.jpg","delay":70},{"file":"000007.jpg","delay":70},{"file":"000008.jpg","delay":70},{"file":"000009.jpg","delay":70},{"file":"000010.jpg","delay":70},{"file":"000011.jpg","delay":70},{"file":"000012.jpg","delay":70},{"file":"000013.jpg","delay":70},{"file":"000014.jpg","delay":70},{"file":"000015.jpg","delay":70},{"file":"000016.jpg","delay":70},{"file":"000017.jpg","delay":70},{"file":"000018.jpg","delay":70},{"file":"000019.jpg","delay":70},{"file":"000020.jpg","delay":70},{"file":"000021.jpg","delay":70},{"file":"000022.jpg","delay":70},{"file":"000023.jpg","delay":70},{"file":"000024.jpg","delay":70},{"file":"000025.jpg","delay":70},{"file":"000026.jpg","delay":70},{"file":"000027.jpg","delay":70},{"file":"000028.jpg","delay":70},{"file":"000029.jpg","delay":70},{"file":"000030.jpg","delay":70},{"file":"000031.jpg","delay":70},{"file":"000032.jpg","delay":70},{"file":"000033.jpg","delay":70},{"file":"000034.jpg","delay":70},{"file":"000035.jpg","delay":70},{"file":"000036.jpg","delay":70},{"file":"000037.jpg","delay":70},{"file":"000038.jpg","delay":70},{"file":"000039.jpg","delay":70},{"file":"000040.jpg","delay":70},{"file":"000041.jpg","delay":70},{"file":"000042.jpg","delay":70},{"file":"000043.jpg","delay":70},{"file":"000044.jpg","delay":70},{"file":"000045.jpg","delay":70},{"file":"000046.jpg","delay":70},{"file":"000047.jpg","delay":70},{"file":"000048.jpg","delay":70},{"file":"000049.jpg","delay":70},{"file":"000050.jpg","delay":70},{"file":"000051.j
pg","delay":70},{"file":"000052.jpg","delay":70},{"file":"000053.jpg","delay":70},{"file":"000054.jpg","delay":70},{"file":"000055.jpg","delay":70},{"file":"000056.jpg","delay":70},{"file":"000057.jpg","delay":70},{"file":"000058.jpg","delay":70},{"file":"000059.jpg","delay":70},{"file":"000060.jpg","delay":70},{"file":"000061.jpg","delay":70},{"file":"000062.jpg","delay":70},{"file":"000063.jpg","delay":70},{"file":"000064.jpg","delay":70},{"file":"000065.jpg","delay":70},{"file":"000066.jpg","delay":70},{"file":"000067.jpg","delay":70},{"file":"000068.jpg","delay":70},{"file":"000069.jpg","delay":70},{"file":"000070.jpg","delay":70},{"file":"000071.jpg","delay":70},{"file":"000072.jpg","delay":70},{"file":"000073.jpg","delay":70},{"file":"000074.jpg","delay":70},{"file":"000075.jpg","delay":70},{"file":"000076.jpg","delay":70}]}}' # noqa
@pytest.fixture(scope='session')
def error_meta_body():
    """Non-ugoira page data."""
return '{"error":true,"message":"\uc9c0\uc815\ud55c ID\ub294 \uc6b0\uace0\uc774\ub77c\uac00 \uc544\ub2d9\ub2c8\ub2e4","body":[]}' # noqa
@pytest.fixture()
def small_image_zip(fx_tmpdir):
"""
Generates a zip file used in testing
instead of downloading an actual ugoira.
"""
file = fx_tmpdir / '00000000_ugoira600x600.zip'
imgs = [
fx_tmpdir / '000000.jpg',
fx_tmpdir / '000001.jpg',
fx_tmpdir / '000002.jpg',
]
colors = [
ImageColor.getrgb('red'),
ImageColor.getrgb('blue'),
ImageColor.getrgb('green'),
]
for path, color in zip(imgs, colors):
Image.new('RGB', (10, 10), color).save(str(path))
with zipfile.ZipFile(str(file), 'w') as f:
for img in imgs:
f.write(str(img), img.name)
with file.open('rb') as f:
return f.read()
@pytest.fixture()
def big_image_zip(fx_tmpdir):
"""
Generates a zip file used in testing
instead of downloading an actual ugoira.
"""
file = fx_tmpdir / '00000000_ugoira1920x1080.zip'
imgs = [
fx_tmpdir / '000000.jpg',
fx_tmpdir / '000001.jpg',
fx_tmpdir / '000002.jpg',
]
colors = [
ImageColor.getrgb('red'),
ImageColor.getrgb('blue'),
ImageColor.getrgb('green'),
]
for path, color in zip(imgs, colors):
Image.new('RGB', (100, 100), color).save(str(path))
with zipfile.ZipFile(str(file), 'w') as f:
for img in imgs:
f.write(str(img), img.name)
with file.open('rb') as f:
return f.read()
@pytest.fixture()
def frames():
"""frames data."""
return {
'000000.jpg': 1000,
'000001.jpg': 2000,
'000002.jpg': 3000,
}
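The `small_image_zip`/`big_image_zip` fixtures follow a common pattern: write frame files into a zip archive on disk and hand back the raw bytes. A stdlib-only sketch of the same pattern (dummy byte payloads stand in for the PIL images):

```python
import io
import pathlib
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as tmp:
    tmpdir = pathlib.Path(tmp)
    archive = tmpdir / '00000000_demo.zip'
    frames = {'000000.jpg': b'red', '000001.jpg': b'blue'}
    with zipfile.ZipFile(str(archive), 'w') as zf:
        for name, payload in frames.items():
            zf.writestr(name, payload)  # store each frame under its file name
    data = archive.read_bytes()  # raw zip bytes, as the fixtures return

# The archive round-trips from the raw bytes alone:
names = zipfile.ZipFile(io.BytesIO(data)).namelist()
```

Returning bytes rather than a path lets the tests feed the archive to code that expects a downloaded response body.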
| 44.845528 | 2,856 | 0.607868 | 776 | 5,516 | 4.279639 | 0.240979 | 0.185486 | 0.231858 | 0.320385 | 0.384824 | 0.336043 | 0.319181 | 0.319181 | 0.319181 | 0.319181 | 0 | 0.181054 | 0.115845 | 5,516 | 122 | 2,857 | 45.213115 | 0.499897 | 0.049311 | 0 | 0.538462 | 0 | 0.025641 | 0.646026 | 0.596451 | 0 | 0 | 0 | 0 | 0 | 1 | 0.128205 | false | 0 | 0.051282 | 0.051282 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0869e0376f370c40ad50cd1103f55ff7bc8cbadf | 659 | py | Python | model_factory/layers/__init__.py | yinochaos/model_factory | d8ff8b049cef9b2c19d6dc303874aee6401dc5ef | [
"Apache-2.0"
] | null | null | null | model_factory/layers/__init__.py | yinochaos/model_factory | d8ff8b049cef9b2c19d6dc303874aee6401dc5ef | [
"Apache-2.0"
] | null | null | null | model_factory/layers/__init__.py | yinochaos/model_factory | d8ff8b049cef9b2c19d6dc303874aee6401dc5ef | [
"Apache-2.0"
] | null | null | null | from __future__ import absolute_import, division, print_function
#__all__ = ['seq2seq_cross_entropy_loss']
from model_factory.layers.attentions.multihead_attention import MultiHeadAttention, RelativeMultiHeadAttention
from model_factory.layers.embeddings.pos_embedding import TransformerEmbedding, RelativePositionalEmbedding
from model_factory.layers.recurrents.rnns import pBLSTM
from model_factory.layers.transformers.transformer import Transformer
from model_factory.layers.transformers.transformer_xl import TransformerXL
from model_factory.layers.transformers.longformer import Longformer
from model_factory.layers.transformers.bert import Bert, Albert
| 65.9 | 110 | 0.890744 | 75 | 659 | 7.52 | 0.466667 | 0.111702 | 0.198582 | 0.27305 | 0.280142 | 0.159574 | 0 | 0 | 0 | 0 | 0 | 0.001616 | 0.060698 | 659 | 9 | 111 | 73.222222 | 0.909532 | 0.060698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.125 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
088408813b0a97cfb6ba548e4257bb2a767d5d3a | 623 | py | Python | Visualization/FourierSquareWaveVisualization.py | SymmetricChaos/FiniteFields | 65258e06b7f04ce15223c1bc0c2384ef5e9cec1a | [
"MIT"
] | 1 | 2021-08-22T15:03:59.000Z | 2021-08-22T15:03:59.000Z | Visualization/FourierSquareWaveVisualization.py | SymmetricChaos/NumberTheory | 65258e06b7f04ce15223c1bc0c2384ef5e9cec1a | [
"MIT"
] | null | null | null | Visualization/FourierSquareWaveVisualization.py | SymmetricChaos/NumberTheory | 65258e06b7f04ce15223c1bc0c2384ef5e9cec1a | [
"MIT"
] | null | null | null | from Fourier import Fourier, evaluate_series
import numpy as np
import matplotlib.pyplot as plt
fig = plt.figure()
x = np.linspace(-5,5,150)
pi = np.pi
S = Fourier([0],[0],[0])
for i in [1,3,5]:
A = Fourier([4/(pi*i)],[i])
plt.plot(x,evaluate_series(A,x),color='gray',linestyle=":")
S += A
yS = evaluate_series(S,x)
plt.plot(x,yS,color='black')
fig = plt.figure()
x = np.linspace(-5,5,150)
pi = np.pi
S = Fourier([0],[0],[0])
for i in [1,2,3]:
A = Fourier([4/(pi*i)],[i])
plt.plot(x,evaluate_series(A,x),color='gray',linestyle=":")
S += A
yS = evaluate_series(S,x)
plt.plot(x,yS,color='black') | 21.482759 | 63 | 0.614767 | 119 | 623 | 3.176471 | 0.285714 | 0.185185 | 0.084656 | 0.068783 | 0.777778 | 0.777778 | 0.777778 | 0.777778 | 0.777778 | 0.777778 | 0 | 0.045541 | 0.154093 | 623 | 29 | 64 | 21.482759 | 0.671727 | 0 | 0 | 0.782609 | 0 | 0 | 0.032051 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.130435 | 0 | 0.130435 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
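The coefficients `4/(pi*i)` over odd `i` are the standard Fourier series of a unit square wave (assuming `Fourier([a], [n])` represents the term $a\sin(nx)$, which matches how the script uses it):

```latex
\operatorname{sq}(x) = \frac{4}{\pi} \sum_{n=1,3,5,\dots} \frac{\sin(nx)}{n}
```

This is why the plot over harmonics 1, 3, 5 converges toward a square wave, while the plot over 1, 2, 3 mixes in even harmonics and does not.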
089710754fb7ed7774e8fedc1025c6640e6d9081 | 256 | py | Python | replikate_lib/terminal/colors.py | rloic/replikate | 1503d2c77aef4103f3a930664d0e2a2267c93f34 | [
"MIT"
] | null | null | null | replikate_lib/terminal/colors.py | rloic/replikate | 1503d2c77aef4103f3a930664d0e2a2267c93f34 | [
"MIT"
] | null | null | null | replikate_lib/terminal/colors.py | rloic/replikate | 1503d2c77aef4103f3a930664d0e2a2267c93f34 | [
"MIT"
] | null | null | null | def info(s: str) -> str:
return '\033[34m' + s + '\033[0m'
def log(s: str) -> str:
return '\033[32m' + s + '\033[0m'
def warn(s: str) -> str:
return '\033[33m' + s + '\033[0m'
def err(s: str) -> str:
return '\033[31m' + s + '\033[0m'
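Each helper wraps the string in an ANSI SGR color sequence (`ESC[<code>m`) followed by a reset (`ESC[0m`). The generic form, for reference (`colorize` is a hypothetical name, not part of the module):

```python
def colorize(code: int, s: str) -> str:
    # Generic form of the helpers above: ESC[<code>m ... ESC[0m
    # 31=red, 32=green, 33=yellow, 34=blue (standard SGR foreground codes)
    return f'\033[{code}m' + s + '\033[0m'

blue_hello = colorize(34, 'hello')  # same output as info('hello') above
```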
| 17.066667 | 37 | 0.492188 | 44 | 256 | 2.863636 | 0.318182 | 0.126984 | 0.222222 | 0.412698 | 0.507937 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189474 | 0.257813 | 256 | 14 | 38 | 18.285714 | 0.473684 | 0 | 0 | 0 | 0 | 0 | 0.234375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
08b79e1077b46c3c9746815ef84806a8346ad9ad | 219 | py | Python | python/high-scores/high_scores.py | willdjames/exercism-praticas | 8de2fc82459c3f9e5eaff20bad4ce7a760ba786e | [
"MIT"
] | null | null | null | python/high-scores/high_scores.py | willdjames/exercism-praticas | 8de2fc82459c3f9e5eaff20bad4ce7a760ba786e | [
"MIT"
] | null | null | null | python/high-scores/high_scores.py | willdjames/exercism-praticas | 8de2fc82459c3f9e5eaff20bad4ce7a760ba786e | [
"MIT"
] | null | null | null | def latest(scores):
return personal_top_three(scores)[-1]
def personal_best(scores):
return personal_top_three(scores)[0]
def personal_top_three(scores):
scores.sort(reverse=True)
return scores[0:3]
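One pitfall worth noting here: `list.sort` reorders the caller's list in place, destroying the insertion order that a "latest score" query depends on, while `sorted` returns a new list and leaves the original untouched. A quick self-contained illustration:

```python
scores = [30, 100, 20]

top = sorted(scores, reverse=True)[:3]  # new list; the original is untouched
latest_score = scores[-1]               # insertion order preserved -> 20

mutated = list(scores)
mutated.sort(reverse=True)              # in-place; insertion order is lost
```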
| 18.25 | 41 | 0.73516 | 32 | 219 | 4.8125 | 0.4375 | 0.214286 | 0.311688 | 0.428571 | 0.441558 | 0.441558 | 0 | 0 | 0 | 0 | 0 | 0.021505 | 0.150685 | 219 | 11 | 42 | 19.909091 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0.285714 | 0.857143 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
3efa76b899ac3ced05baffaec5dad84313fe0247 | 28 | py | Python | djmease/__init__.py | florianpaquet/django-mease | f1512ad5a337f457854c11c0606fd3331e6f4ccb | [
"MIT"
] | 4 | 2015-07-09T21:06:50.000Z | 2018-04-02T21:54:10.000Z | djmease/__init__.py | florianpaquet/django-mease | f1512ad5a337f457854c11c0606fd3331e6f4ccb | [
"MIT"
] | null | null | null | djmease/__init__.py | florianpaquet/django-mease | f1512ad5a337f457854c11c0606fd3331e6f4ccb | [
"MIT"
] | null | null | null | from .registry import mease
| 14 | 27 | 0.821429 | 4 | 28 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 28 | 1 | 28 | 28 | 0.958333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
41094a8fef5daa3e76f7967d3503d412c955d553 | 12,996 | py | Python | tests/test_data_version_handler.py | JohannesBrand/ecl_ekf_analysis | 9d0f3875767f94c5a37a676161e42b2e14589e8f | [
"BSD-3-Clause"
] | null | null | null | tests/test_data_version_handler.py | JohannesBrand/ecl_ekf_analysis | 9d0f3875767f94c5a37a676161e42b2e14589e8f | [
"BSD-3-Clause"
] | null | null | null | tests/test_data_version_handler.py | JohannesBrand/ecl_ekf_analysis | 9d0f3875767f94c5a37a676161e42b2e14589e8f | [
"BSD-3-Clause"
] | null | null | null | #! /usr/bin/env python3
"""
Testing the data version handler.
"""
import os
import pytest
from pyulog import ULog
import ecl_ekf_analysis.log_processing.data_version_handler as dvh
@pytest.fixture(scope="module")
def testing_args():
"""
arguments for testing.
:return: test arguments
"""
flight_logs_path = os.path.join(os.path.dirname(__file__), 'flight_logs')
log_est_format_v1 = ULog(os.path.join(flight_logs_path, 'short_f450_log.ulg'))
log_est_format_v2 = ULog(os.path.join(flight_logs_path, 'estimator_innovations.ulg'))
return {'est_format_version_1': log_est_format_v1,
'est_format_version_2': log_est_format_v2}
def test_get_output_tracking_error_message(testing_args):
"""
Test if the right message name will be returned for different log file versions
"""
log_est_format_version_1 = testing_args['est_format_version_1']
log_est_format_version_2 = testing_args['est_format_version_2']
assert dvh.get_output_tracking_error_message(log_est_format_version_1) == "ekf2_innovations"
assert dvh.get_output_tracking_error_message(log_est_format_version_2) == "estimator_status"
def test_get_innovation_message(testing_args):
"""
Test if the right message name will be returned for different log file versions
"""
log_est_format_version_1 = testing_args['est_format_version_1']
log_est_format_version_2 = testing_args['est_format_version_2']
assert dvh.get_innovation_message(
log_est_format_version_1, 'innovation') == "ekf2_innovations"
assert dvh.get_innovation_message(
log_est_format_version_2, 'innovation') == "estimator_innovations"
assert dvh.get_innovation_message(
log_est_format_version_1, 'innovation_variance') == "ekf2_innovations"
assert dvh.get_innovation_message(
log_est_format_version_2, 'innovation_variance') == "estimator_innovation_variances"
assert dvh.get_innovation_message(
log_est_format_version_1, 'innovation_test_ratio') is None
assert dvh.get_innovation_message(
log_est_format_version_2, 'innovation_test_ratio') == "estimator_innovation_test_ratios"
test_data = [
("est_format_version_1", 'innovation', 'vel_pos', True),
("est_format_version_1", 'innovation', 'gps_vvel', False),
("est_format_version_1", 'innovation', 'gps_hvel', False),
("est_format_version_1", 'innovation', 'gps_hpos', False),
("est_format_version_1", 'innovation', 'gps_vpos', False),
("est_format_version_1", 'innovation', 'ev_hvel', False),
("est_format_version_1", 'innovation', 'ev_vvel', False),
("est_format_version_1", 'innovation', 'ev_hpos', False),
("est_format_version_1", 'innovation', 'ev_vpos', False),
("est_format_version_1", 'innovation', 'fake_hvel', False),
("est_format_version_1", 'innovation', 'fake_vvel', False),
("est_format_version_1", 'innovation', 'fake_hpos', False),
("est_format_version_1", 'innovation', 'fake_vpos', False),
("est_format_version_1", 'innovation', 'rng_vpos', False),
("est_format_version_1", 'innovation', 'baro_vpos', False),
("est_format_version_1", 'innovation', 'aux_hvel', True),
("est_format_version_1", 'innovation', 'aux_vvel', False),
("est_format_version_1", 'innovation', 'mag_field', True),
("est_format_version_1", 'innovation', 'heading', True),
("est_format_version_1", 'innovation', 'airspeed', True),
("est_format_version_1", 'innovation', 'beta', True),
("est_format_version_1", 'innovation', 'flow', True),
("est_format_version_1", 'innovation', 'hagl', True),
("est_format_version_1", 'innovation', 'drag', True),
("est_format_version_1", 'innovation_variance', 'vel_pos', True),
("est_format_version_1", 'innovation_variance', 'gps_vvel', False),
("est_format_version_1", 'innovation_variance', 'gps_hvel', False),
("est_format_version_1", 'innovation_variance', 'gps_hpos', False),
("est_format_version_1", 'innovation_variance', 'gps_vpos', False),
("est_format_version_1", 'innovation_variance', 'ev_hvel', False),
("est_format_version_1", 'innovation_variance', 'ev_vvel', False),
("est_format_version_1", 'innovation_variance', 'ev_hpos', False),
("est_format_version_1", 'innovation_variance', 'ev_vpos', False),
("est_format_version_1", 'innovation_variance', 'fake_hvel', False),
("est_format_version_1", 'innovation_variance', 'fake_vvel', False),
("est_format_version_1", 'innovation_variance', 'fake_hpos', False),
("est_format_version_1", 'innovation_variance', 'fake_vpos', False),
("est_format_version_1", 'innovation_variance', 'rng_vpos', False),
("est_format_version_1", 'innovation_variance', 'baro_vpos', False),
("est_format_version_1", 'innovation_variance', 'aux_hvel', False),
("est_format_version_1", 'innovation_variance', 'aux_vvel', False),
("est_format_version_1", 'innovation_variance', 'mag_field', True),
("est_format_version_1", 'innovation_variance', 'heading', True),
("est_format_version_1", 'innovation_variance', 'airspeed', True),
("est_format_version_1", 'innovation_variance', 'beta', True),
("est_format_version_1", 'innovation_variance', 'flow', True),
("est_format_version_1", 'innovation_variance', 'hagl', True),
("est_format_version_1", 'innovation_variance', 'drag', True),
("est_format_version_1", 'innovation_test_ratio', 'hgt', True),
("est_format_version_1", 'innovation_test_ratio', 'vel', True),
("est_format_version_1", 'innovation_test_ratio', 'pos', True),
("est_format_version_1", 'innovation_test_ratio', 'gps_vvel', False),
("est_format_version_1", 'innovation_test_ratio', 'gps_hvel', False),
("est_format_version_1", 'innovation_test_ratio', 'gps_hpos', False),
("est_format_version_1", 'innovation_test_ratio', 'gps_vpos', False),
("est_format_version_1", 'innovation_test_ratio', 'ev_hvel', False),
("est_format_version_1", 'innovation_test_ratio', 'ev_vvel', False),
("est_format_version_1", 'innovation_test_ratio', 'ev_hpos', False),
("est_format_version_1", 'innovation_test_ratio', 'ev_vpos', False),
("est_format_version_1", 'innovation_test_ratio', 'fake_hvel', False),
("est_format_version_1", 'innovation_test_ratio', 'fake_vvel', False),
("est_format_version_1", 'innovation_test_ratio', 'fake_hpos', False),
("est_format_version_1", 'innovation_test_ratio', 'fake_vpos', False),
("est_format_version_1", 'innovation_test_ratio', 'rng_vpos', False),
("est_format_version_1", 'innovation_test_ratio', 'baro_vpos', False),
("est_format_version_1", 'innovation_test_ratio', 'aux_hvel', False),
("est_format_version_1", 'innovation_test_ratio', 'aux_vvel', False),
("est_format_version_1", 'innovation_test_ratio', 'mag_field', True),
("est_format_version_1", 'innovation_test_ratio', 'heading', False),
("est_format_version_1", 'innovation_test_ratio', 'airspeed', True),
("est_format_version_1", 'innovation_test_ratio', 'beta', True),
("est_format_version_1", 'innovation_test_ratio', 'flow', False),
("est_format_version_1", 'innovation_test_ratio', 'hagl', True),
("est_format_version_1", 'innovation_test_ratio', 'drag', False),
("est_format_version_2", 'innovation', 'vel_pos', False),
("est_format_version_2", 'innovation', 'gps_vvel', True),
("est_format_version_2", 'innovation', 'gps_hvel', True),
("est_format_version_2", 'innovation', 'gps_hpos', True),
("est_format_version_2", 'innovation', 'gps_vpos', True),
("est_format_version_2", 'innovation', 'ev_hvel', True),
("est_format_version_2", 'innovation', 'ev_vvel', True),
("est_format_version_2", 'innovation', 'ev_hpos', True),
("est_format_version_2", 'innovation', 'ev_vpos', True),
("est_format_version_2", 'innovation', 'fake_hvel', True),
("est_format_version_2", 'innovation', 'fake_vvel', True),
("est_format_version_2", 'innovation', 'fake_hpos', True),
("est_format_version_2", 'innovation', 'fake_vpos', True),
("est_format_version_2", 'innovation', 'rng_vpos', True),
("est_format_version_2", 'innovation', 'baro_vpos', True),
("est_format_version_2", 'innovation', 'aux_hvel', True),
("est_format_version_2", 'innovation', 'aux_vvel', True),
("est_format_version_2", 'innovation', 'mag_field', True),
("est_format_version_2", 'innovation', 'heading', True),
("est_format_version_2", 'innovation', 'airspeed', True),
("est_format_version_2", 'innovation', 'beta', True),
("est_format_version_2", 'innovation', 'flow', True),
("est_format_version_2", 'innovation', 'hagl', True),
("est_format_version_2", 'innovation', 'drag', True),
("est_format_version_2", 'innovation_variance', 'vel_pos', False),
("est_format_version_2", 'innovation_variance', 'gps_vvel', True),
("est_format_version_2", 'innovation_variance', 'gps_hvel', True),
("est_format_version_2", 'innovation_variance', 'gps_hpos', True),
("est_format_version_2", 'innovation_variance', 'gps_vpos', True),
("est_format_version_2", 'innovation_variance', 'ev_hvel', True),
("est_format_version_2", 'innovation_variance', 'ev_vvel', True),
("est_format_version_2", 'innovation_variance', 'ev_hpos', True),
("est_format_version_2", 'innovation_variance', 'ev_vpos', True),
("est_format_version_2", 'innovation_variance', 'fake_hvel', True),
("est_format_version_2", 'innovation_variance', 'fake_vvel', True),
("est_format_version_2", 'innovation_variance', 'fake_hpos', True),
("est_format_version_2", 'innovation_variance', 'fake_vpos', True),
("est_format_version_2", 'innovation_variance', 'rng_vpos', True),
("est_format_version_2", 'innovation_variance', 'baro_vpos', True),
("est_format_version_2", 'innovation_variance', 'aux_hvel', True),
("est_format_version_2", 'innovation_variance', 'aux_vvel', True),
("est_format_version_2", 'innovation_variance', 'mag_field', True),
("est_format_version_2", 'innovation_variance', 'heading', True),
("est_format_version_2", 'innovation_variance', 'airspeed', True),
("est_format_version_2", 'innovation_variance', 'beta', True),
("est_format_version_2", 'innovation_variance', 'flow', True),
("est_format_version_2", 'innovation_variance', 'hagl', True),
("est_format_version_2", 'innovation_variance', 'drag', True),
("est_format_version_2", 'innovation_test_ratio', 'hgt', False),
("est_format_version_2", 'innovation_test_ratio', 'vel', False),
("est_format_version_2", 'innovation_test_ratio', 'pos', False),
("est_format_version_2", 'innovation_test_ratio', 'gps_vvel', True),
("est_format_version_2", 'innovation_test_ratio', 'gps_hvel', True),
("est_format_version_2", 'innovation_test_ratio', 'gps_hpos', True),
("est_format_version_2", 'innovation_test_ratio', 'gps_vpos', True),
("est_format_version_2", 'innovation_test_ratio', 'ev_hvel', True),
("est_format_version_2", 'innovation_test_ratio', 'ev_vvel', True),
("est_format_version_2", 'innovation_test_ratio', 'ev_hpos', True),
("est_format_version_2", 'innovation_test_ratio', 'ev_vpos', True),
("est_format_version_2", 'innovation_test_ratio', 'fake_hvel', True),
("est_format_version_2", 'innovation_test_ratio', 'fake_vvel', True),
("est_format_version_2", 'innovation_test_ratio', 'fake_hpos', True),
("est_format_version_2", 'innovation_test_ratio', 'fake_vpos', True),
("est_format_version_2", 'innovation_test_ratio', 'rng_vpos', True),
("est_format_version_2", 'innovation_test_ratio', 'baro_vpos', True),
("est_format_version_2", 'innovation_test_ratio', 'aux_hvel', True),
("est_format_version_2", 'innovation_test_ratio', 'aux_vvel', True),
("est_format_version_2", 'innovation_test_ratio', 'mag_field', True),
("est_format_version_2", 'innovation_test_ratio', 'heading', True),
("est_format_version_2", 'innovation_test_ratio', 'airspeed', True),
("est_format_version_2", 'innovation_test_ratio', 'beta', True),
("est_format_version_2", 'innovation_test_ratio', 'flow', True),
("est_format_version_2", 'innovation_test_ratio', 'hagl', True),
("est_format_version_2", 'innovation_test_ratio', 'drag', True),
]
@pytest.mark.parametrize(
"est_format_version,message_descriptor,field_name_req,should_exist", test_data)
def test_get_field_name_from_message_and_descriptor(
testing_args, est_format_version, message_descriptor, field_name_req, should_exist):
"""
    Test logs of different versions for the existence/absence
    of innovation and innovation_variance fields
"""
log = testing_args[est_format_version]
message_name, field_name = dvh.get_innovation_message_and_field_name(
log, field_name_req, message_descriptor)
print(message_name, field_name)
assert dvh.check_if_field_name_exists_in_message(log, message_name, field_name) == should_exist
| 58.278027 | 99 | 0.720914 | 1,693 | 12,996 | 5.004135 | 0.064973 | 0.183782 | 0.319169 | 0.217186 | 0.897781 | 0.890227 | 0.877715 | 0.750118 | 0.300637 | 0.116737 | 0 | 0.015558 | 0.124577 | 12,996 | 222 | 100 | 58.540541 | 0.729103 | 0.028393 | 0 | 0.052632 | 0 | 0 | 0.558117 | 0.104268 | 0 | 0 | 0 | 0 | 0.047368 | 1 | 0.021053 | false | 0 | 0.021053 | 0 | 0.047368 | 0.005263 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
411ddf25efd589974ec09e1e07b096a864eeffd8 | 3,288 | py | Python | deep_utils/vision/face_detection/mtcnn/tf/src/get_nets.py | dornasabet/deep_utils | be61b36ea5b3219831e9f2a364fbd4a63858abed | [
"MIT"
] | 36 | 2021-11-10T05:17:18.000Z | 2022-03-27T18:25:10.000Z | deep_utils/vision/face_detection/mtcnn/tf/src/get_nets.py | MenuaB/deep_utils | b1b936f4780ea7dc52224f53f5116288c5b0a804 | [
"MIT"
] | 15 | 2021-07-24T06:23:04.000Z | 2021-09-18T05:30:20.000Z | deep_utils/vision/face_detection/mtcnn/tf/src/get_nets.py | MenuaB/deep_utils | b1b936f4780ea7dc52224f53f5116288c5b0a804 | [
"MIT"
] | 4 | 2021-11-28T07:39:57.000Z | 2022-03-30T05:46:10.000Z | def ONet(weight_path='onet.h5'):
from tensorflow.keras.layers import Conv2D, Input, MaxPool2D, Flatten, Dense, Permute, PReLU
from tensorflow.keras.models import Model
input = Input(shape=[48, 48, 3])
x = Conv2D(32, (3, 3), strides=1, padding='valid', name='conv1')(input)
x = PReLU(shared_axes=[1, 2], name='prelu1')(x)
x = MaxPool2D(pool_size=3, strides=2, padding='same')(x)
x = Conv2D(64, (3, 3), strides=1, padding='valid', name='conv2')(x)
x = PReLU(shared_axes=[1, 2], name='prelu2')(x)
x = MaxPool2D(pool_size=3, strides=2)(x)
x = Conv2D(64, (3, 3), strides=1, padding='valid', name='conv3')(x)
x = PReLU(shared_axes=[1, 2], name='prelu3')(x)
x = MaxPool2D(pool_size=2)(x)
x = Conv2D(128, (2, 2), strides=1, padding='valid', name='conv4')(x)
x = PReLU(shared_axes=[1, 2], name='prelu4')(x)
x = Permute((3, 2, 1))(x)
x = Flatten()(x)
x = Dense(256, name='conv5')(x)
x = PReLU(name='prelu5')(x)
classifier = Dense(2, activation='softmax', name='conv6-1')(x)
bbox_regress = Dense(4, name='conv6-2')(x)
landmark_regress = Dense(10, name='conv6-3')(x)
model = Model([input], [classifier, bbox_regress, landmark_regress])
model.load_weights(weight_path, by_name=True)
return model
def RNet(weight_path='rnet.h5'):
from tensorflow.keras.layers import Conv2D, Input, MaxPool2D, Flatten, Dense, Permute, PReLU
from tensorflow.keras.models import Model
    input = Input(shape=[24, 24, 3])  # change this shape to [None, None, 3] to enable arbitrary shape input
x = Conv2D(28, (3, 3), strides=1, padding='valid', name='conv1')(input)
x = PReLU(shared_axes=[1, 2], name='prelu1')(x)
x = MaxPool2D(pool_size=3, strides=2, padding='same')(x)
x = Conv2D(48, (3, 3), strides=1, padding='valid', name='conv2')(x)
x = PReLU(shared_axes=[1, 2], name='prelu2')(x)
x = MaxPool2D(pool_size=3, strides=2)(x)
x = Conv2D(64, (2, 2), strides=1, padding='valid', name='conv3')(x)
x = PReLU(shared_axes=[1, 2], name='prelu3')(x)
x = Permute((3, 2, 1))(x)
x = Flatten()(x)
x = Dense(128, name='conv4')(x)
x = PReLU(name='prelu4')(x)
classifier = Dense(2, activation='softmax', name='conv5-1')(x)
bbox_regress = Dense(4, name='conv5-2')(x)
model = Model([input], [classifier, bbox_regress])
model.load_weights(weight_path, by_name=True)
return model
def PNet(weight_path='pnet.h5'):
from tensorflow.keras.layers import Conv2D, Input, MaxPool2D, Flatten, Dense, Permute, PReLU
from tensorflow.keras.models import Model
input = Input(shape=[None, None, 3])
x = Conv2D(10, (3, 3), strides=1, padding='valid', name='conv1')(input)
x = PReLU(shared_axes=[1, 2], name='PReLU1')(x)
x = MaxPool2D(pool_size=2)(x)
x = Conv2D(16, (3, 3), strides=1, padding='valid', name='conv2')(x)
x = PReLU(shared_axes=[1, 2], name='PReLU2')(x)
x = Conv2D(32, (3, 3), strides=1, padding='valid', name='conv3')(x)
x = PReLU(shared_axes=[1, 2], name='PReLU3')(x)
classifier = Conv2D(2, (1, 1), activation='softmax', name='conv4-1')(x)
bbox_regress = Conv2D(4, (1, 1), name='conv4-2')(x)
model = Model([input], [classifier, bbox_regress])
model.load_weights(weight_path, by_name=True)
return model
| 46.309859 | 106 | 0.631083 | 513 | 3,288 | 3.974659 | 0.1423 | 0.027464 | 0.073565 | 0.098087 | 0.833252 | 0.822952 | 0.822952 | 0.731241 | 0.719961 | 0.702795 | 0 | 0.066863 | 0.172141 | 3,288 | 70 | 107 | 46.971429 | 0.682219 | 0.020073 | 0 | 0.483871 | 0 | 0 | 0.087267 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048387 | false | 0 | 0.096774 | 0 | 0.193548 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
# ---- venv/lib/python3.8/site-packages/pip/_vendor/urllib3/util/ssltransport.py (GiulianaPola/select_repeats, MIT license) ----
/home/runner/.cache/pip/pool/17/f5/27/70e5c671cf8c81e5854c0d47e100adfd144d41745b17aa278dedd2c876
# ---- migrations/versions/2020-08-30_postgres_indexes_de48fc5206e4.py (onecrayon/api.ashes.live, 0BSD license) ----
"""postgres_indexes
Revision ID: de48fc5206e4
Revises: c58a815a71a0
Create Date: 2020-08-30 17:13:43.204983+00:00
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "de48fc5206e4"
down_revision = "c58a815a71a0"
branch_labels = None
depends_on = None
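Alembic orders migrations by chaining `down_revision` pointers rather than by filename. A sketch of how a linear chain is resolved from identifiers like the two above (the helper is illustrative, not part of Alembic's API):

```python
def order_revisions(revisions):
    """Resolve a linear Alembic chain from {revision_id: down_revision}."""
    # Invert the mapping: parent revision -> child revision (None = base).
    by_parent = {down: rev for rev, down in revisions.items()}
    chain, cur = [], None
    while cur in by_parent:
        cur = by_parent[cur]
        chain.append(cur)
    return chain

chain = order_revisions({"c58a815a71a0": None, "de48fc5206e4": "c58a815a71a0"})
print(chain)  # ['c58a815a71a0', 'de48fc5206e4']
```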
def upgrade():
# We will no longer be using the session table (Flask-login specific)
op.drop_table("sessions")
# We need to recreate a bunch of indexes to ensure automigrations work going forward
# (old ones were ported from MySQL, so they have some funky names)
op.create_index(
op.f("ix_ashes500_revision_created"),
"ashes500_revision",
["created"],
unique=False,
)
op.create_index(
op.f("ix_ashes500_revision_entity_id"),
"ashes500_revision",
["entity_id"],
unique=False,
)
op.drop_index(
"idx_16946_ix_ashes500_revision_created", table_name="ashes500_revision"
)
op.drop_index(
"idx_16946_ix_ashes500_revision_entity_id", table_name="ashes500_revision"
)
op.create_index(
op.f("ix_ashes500_value_card_id"), "ashes500_value", ["card_id"], unique=False
)
op.create_index(
op.f("ix_ashes500_value_revision_id"),
"ashes500_value",
["revision_id"],
unique=False,
)
op.drop_index("idx_16955_combo_card_id", table_name="ashes500_value")
op.drop_index("idx_16955_ix_ashes500_value_card_id", table_name="ashes500_value")
op.drop_index(
"idx_16955_ix_ashes500_value_revision_id", table_name="ashes500_value"
)
op.drop_constraint("ashes500_value_ibfk_3", "ashes500_value", type_="foreignkey")
op.drop_constraint("ashes500_value_ibfk_1", "ashes500_value", type_="foreignkey")
op.drop_constraint("ashes500_value_ibfk_2", "ashes500_value", type_="foreignkey")
op.create_foreign_key(
op.f("fk_ashes500_value_revision_id_ashes500_revision"),
"ashes500_value",
"ashes500_revision",
["revision_id"],
["id"],
)
op.create_foreign_key(
op.f("fk_ashes500_value_card_id_card"),
"ashes500_value",
"card",
["card_id"],
["id"],
)
op.create_foreign_key(
op.f("fk_ashes500_value_combo_card_id_card"),
"ashes500_value",
"card",
["combo_card_id"],
["id"],
)
op.create_index(
op.f("ix_card_alt_dice_flags"), "card", ["alt_dice_flags"], unique=False
)
op.create_index(op.f("ix_card_card_type"), "card", ["card_type"], unique=False)
op.create_index(op.f("ix_card_cost_weight"), "card", ["cost_weight"], unique=False)
op.create_index(op.f("ix_card_dice_flags"), "card", ["dice_flags"], unique=False)
op.create_index(op.f("ix_card_entity_id"), "card", ["entity_id"], unique=True)
op.create_index(op.f("ix_card_name"), "card", ["name"], unique=True)
op.create_index(op.f("ix_card_phoenixborn"), "card", ["phoenixborn"], unique=False)
op.create_index(op.f("ix_card_release_id"), "card", ["release_id"], unique=False)
op.create_index(op.f("ix_card_stub"), "card", ["stub"], unique=True)
op.create_index("ix_card_text", "card", ["name", "text"], unique=False)
op.drop_index("idx_16961_ix_card_alt_dice_flags", table_name="card")
op.drop_index("idx_16961_ix_card_card_type", table_name="card")
op.drop_index("idx_16961_ix_card_cost_weight", table_name="card")
op.drop_index("idx_16961_ix_card_dice_flags", table_name="card")
op.drop_index("idx_16961_ix_card_entity_id", table_name="card")
op.drop_index("idx_16961_ix_card_name", table_name="card")
op.drop_index("idx_16961_ix_card_phoenixborn", table_name="card")
op.drop_index("idx_16961_ix_card_release_id", table_name="card")
op.drop_index("idx_16961_ix_card_stub", table_name="card")
op.drop_index("idx_16961_ix_card_text", table_name="card")
op.drop_constraint("card_release_ibfk_1", "card", type_="foreignkey")
op.create_foreign_key(
op.f("fk_card_release_id_releases"), "card", "releases", ["release_id"], ["id"]
)
op.drop_index("idx_16973_conjuration_id", table_name="card_conjuration")
op.drop_constraint(
"card_conjuration_ibfk_2", "card_conjuration", type_="foreignkey"
)
op.drop_constraint(
"card_conjuration_ibfk_1", "card_conjuration", type_="foreignkey"
)
op.create_foreign_key(
op.f("fk_card_conjuration_conjuration_id_card"),
"card_conjuration",
"card",
["conjuration_id"],
["id"],
)
op.create_foreign_key(
op.f("fk_card_conjuration_card_id_card"),
"card_conjuration",
"card",
["card_id"],
["id"],
)
op.create_index(op.f("ix_comment_created"), "comment", ["created"], unique=False)
op.create_index(op.f("ix_comment_entity_id"), "comment", ["entity_id"], unique=True)
op.create_index(
op.f("ix_comment_is_deleted"), "comment", ["is_deleted"], unique=False
)
op.create_index(op.f("ix_comment_order"), "comment", ["order"], unique=False)
op.create_index(
op.f("ix_comment_source_entity_id"),
"comment",
["source_entity_id"],
unique=False,
)
op.drop_index("idx_16978_ix_comment_created", table_name="comment")
op.drop_index("idx_16978_ix_comment_entity_id", table_name="comment")
op.drop_index("idx_16978_ix_comment_is_deleted", table_name="comment")
op.drop_index("idx_16978_ix_comment_order", table_name="comment")
op.drop_index("idx_16978_ix_comment_source_entity_id", table_name="comment")
op.drop_index("idx_16978_user_id", table_name="comment")
op.drop_constraint("comment_ibfk_1", "comment", type_="foreignkey")
op.create_foreign_key(
op.f("fk_comment_user_id_user"), "comment", "user", ["user_id"], ["id"]
)
op.create_index(op.f("ix_deck_created"), "deck", ["created"], unique=False)
op.create_index(op.f("ix_deck_entity_id"), "deck", ["entity_id"], unique=True)
op.create_index(
op.f("ix_deck_is_preconstructed"), "deck", ["is_preconstructed"], unique=False
)
op.create_index(op.f("ix_deck_is_public"), "deck", ["is_public"], unique=False)
op.create_index(op.f("ix_deck_is_snapshot"), "deck", ["is_snapshot"], unique=False)
op.create_index(op.f("ix_deck_modified"), "deck", ["modified"], unique=False)
op.create_index(
op.f("ix_deck_phoenixborn_id"), "deck", ["phoenixborn_id"], unique=False
)
op.create_index(
op.f("ix_deck_preconstructed_release"),
"deck",
["preconstructed_release"],
unique=False,
)
op.create_index(op.f("ix_deck_source_id"), "deck", ["source_id"], unique=False)
op.create_index(op.f("ix_deck_title"), "deck", ["title"], unique=False)
op.create_index(op.f("ix_deck_user_id"), "deck", ["user_id"], unique=False)
op.drop_index("idx_16989_fk_ashes_500_revision_id", table_name="deck")
op.drop_index("idx_16989_ix_deck_created", table_name="deck")
op.drop_index("idx_16989_ix_deck_entity_id", table_name="deck")
op.drop_index("idx_16989_ix_deck_is_preconstructed", table_name="deck")
op.drop_index("idx_16989_ix_deck_is_public", table_name="deck")
op.drop_index("idx_16989_ix_deck_is_snapshot", table_name="deck")
op.drop_index("idx_16989_ix_deck_modified", table_name="deck")
op.drop_index("idx_16989_ix_deck_phoenixborn_id", table_name="deck")
op.drop_index("idx_16989_ix_deck_preconstructed_release", table_name="deck")
op.drop_index("idx_16989_ix_deck_source_id", table_name="deck")
op.drop_index("idx_16989_ix_deck_title", table_name="deck")
op.drop_index("idx_16989_ix_deck_user_id", table_name="deck")
op.drop_constraint("fk_ashes_500_revision_id", "deck", type_="foreignkey")
op.drop_constraint("deck_ibfk_3", "deck", type_="foreignkey")
op.drop_constraint("deck_ibfk_2", "deck", type_="foreignkey")
op.drop_constraint("deck_ibfk_1", "deck", type_="foreignkey")
op.create_foreign_key(
op.f("fk_deck_phoenixborn_id_card"), "deck", "card", ["phoenixborn_id"], ["id"]
)
op.create_foreign_key(
op.f("fk_deck_ashes_500_revision_id_ashes500_revision"),
"deck",
"ashes500_revision",
["ashes_500_revision_id"],
["id"],
)
op.create_foreign_key(
op.f("fk_deck_user_id_user"), "deck", "user", ["user_id"], ["id"]
)
op.create_foreign_key(
op.f("fk_deck_source_id_deck"), "deck", "deck", ["source_id"], ["id"]
)
op.drop_index("idx_16998_card_id", table_name="deck_card")
op.drop_constraint("deck_card_ibfk_1", "deck_card", type_="foreignkey")
op.drop_constraint("deck_card_ibfk_2", "deck_card", type_="foreignkey")
op.create_foreign_key(
op.f("fk_deck_card_card_id_card"), "deck_card", "card", ["card_id"], ["id"]
)
op.create_foreign_key(
op.f("fk_deck_card_deck_id_deck"), "deck_card", "deck", ["deck_id"], ["id"]
)
op.drop_constraint("deck_die_ibfk_1", "deck_die", type_="foreignkey")
op.create_foreign_key(
op.f("fk_deck_die_deck_id_deck"), "deck_die", "deck", ["deck_id"], ["id"]
)
op.drop_index(
"idx_17004_deck_selected_card_ibfk_1", table_name="deck_selected_card"
)
op.drop_constraint(
"deck_selected_card_ibfk_1", "deck_selected_card", type_="foreignkey"
)
op.drop_constraint(
"deck_selected_card_ibfk_2", "deck_selected_card", type_="foreignkey"
)
op.create_foreign_key(
op.f("fk_deck_selected_card_card_id_card"),
"deck_selected_card",
"card",
["card_id"],
["id"],
)
op.create_foreign_key(
op.f("fk_deck_selected_card_deck_id_deck"),
"deck_selected_card",
"deck",
["deck_id"],
["id"],
)
op.create_index(op.f("ix_invite_email"), "invite", ["email"], unique=True)
op.drop_index("idx_17008_ix_invite_email", table_name="invite")
op.create_index(
op.f("ix_phoenix_dice_email"), "phoenix_dice", ["email"], unique=True
)
op.create_index(
op.f("ix_phoenix_dice_only_official_icons"),
"phoenix_dice",
["only_official_icons"],
unique=False,
)
op.drop_index("idx_17013_ix_phoenix_dice_email", table_name="phoenix_dice")
op.drop_index(
"idx_17013_ix_phoenix_dice_only_official_icons", table_name="phoenix_dice"
)
op.create_index(op.f("ix_post_created"), "post", ["created"], unique=False)
op.create_index(op.f("ix_post_entity_id"), "post", ["entity_id"], unique=True)
op.create_index(op.f("ix_post_is_deleted"), "post", ["is_deleted"], unique=False)
op.create_index(op.f("ix_post_is_pinned"), "post", ["is_pinned"], unique=False)
op.create_index(op.f("ix_post_section_id"), "post", ["section_id"], unique=False)
op.drop_index("idx_17019_ix_post_created", table_name="post")
op.drop_index("idx_17019_ix_post_entity_id", table_name="post")
op.drop_index("idx_17019_ix_post_is_deleted", table_name="post")
op.drop_index("idx_17019_ix_post_is_pinned", table_name="post")
op.drop_index("idx_17019_ix_post_section_id", table_name="post")
op.drop_index("idx_17019_user_id", table_name="post")
op.drop_constraint("post_ibfk_2", "post", type_="foreignkey")
op.drop_constraint("post_ibfk_1", "post", type_="foreignkey")
op.create_foreign_key(
op.f("fk_post_user_id_user"), "post", "user", ["user_id"], ["id"]
)
op.create_foreign_key(
op.f("fk_post_section_id_section"), "post", "section", ["section_id"], ["id"]
)
op.create_unique_constraint(op.f("uq_releases_name"), "releases", ["name"])
op.drop_index("idx_17031_name", table_name="releases")
op.create_index(op.f("ix_section_entity_id"), "section", ["entity_id"], unique=True)
op.create_index(
op.f("ix_section_is_restricted"), "section", ["is_restricted"], unique=False
)
op.create_index(op.f("ix_section_stub"), "section", ["stub"], unique=True)
op.drop_index("idx_17039_ix_section_entity_id", table_name="section")
op.drop_index("idx_17039_ix_section_is_restricted", table_name="section")
op.drop_index("idx_17039_ix_section_stub", table_name="section")
op.create_index(op.f("ix_stream_entity_id"), "stream", ["entity_id"], unique=True)
op.create_index(
op.f("ix_stream_entity_type"), "stream", ["entity_type"], unique=False
)
op.create_index(op.f("ix_stream_posted"), "stream", ["posted"], unique=False)
op.create_index(
op.f("ix_stream_source_entity_id"), "stream", ["source_entity_id"], unique=False
)
op.drop_index("idx_17058_ix_stream_entity_id", table_name="stream")
op.drop_index("idx_17058_ix_stream_entity_type", table_name="stream")
op.drop_index("idx_17058_ix_stream_posted", table_name="stream")
op.drop_index("idx_17058_ix_stream_souce_entity_id", table_name="stream")
op.create_index(
op.f("ix_subscription_last_seen_entity_id"),
"subscription",
["last_seen_entity_id"],
unique=False,
)
op.drop_index(
"idx_17068_ix_subscription_last_seen_entity_id", table_name="subscription"
)
op.drop_constraint("subscription_ibfk_1", "subscription", type_="foreignkey")
op.create_foreign_key(
op.f("fk_subscription_user_id_user"),
"subscription",
"user",
["user_id"],
["id"],
)
op.create_index(op.f("ix_user_badge"), "user", ["badge"], unique=True)
op.create_index(op.f("ix_user_email"), "user", ["email"], unique=True)
op.create_index(
op.f("ix_user_email_subscriptions"),
"user",
["email_subscriptions"],
unique=False,
)
op.create_index(op.f("ix_user_reset_uuid"), "user", ["reset_uuid"], unique=True)
op.drop_index("idx_17073_ix_user_badge", table_name="user")
op.drop_index("idx_17073_ix_user_email", table_name="user")
op.drop_index("idx_17073_ix_user_email_subscriptions", table_name="user")
op.drop_index("idx_17073_ix_user_reset_uuid", table_name="user")
op.create_index(
op.f("ix_user_release_user_id"), "user_release", ["user_id"], unique=False
)
op.drop_index("idx_17083_ix_user_release_user_id", table_name="user_release")
op.drop_index("idx_17083_release_id", table_name="user_release")
op.drop_constraint("user_release_ibfk_1", "user_release", type_="foreignkey")
op.drop_constraint("user_release_ibfk_2", "user_release", type_="foreignkey")
op.create_foreign_key(
op.f("fk_user_release_release_id_releases"),
"user_release",
"releases",
["release_id"],
["id"],
)
op.create_foreign_key(
op.f("fk_user_release_user_id_user"),
"user_release",
"user",
["user_id"],
["id"],
)
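The `op.f(...)` wrappers above mark each name as already final, so Alembic does not re-apply a naming convention to it; the new names follow the standard SQLAlchemy patterns such as `ix_<table>_<column>` and `fk_<table>_<column>_<referred_table>`. A tiny helper reproducing the index pattern (hypothetical, written only to illustrate the scheme):

```python
def conventional_index_name(table, columns):
    # Mirrors the ix_<table>_<column...> pattern used throughout this migration.
    return "_".join(["ix", table] + list(columns))

print(conventional_index_name("deck", ["user_id"]))    # ix_deck_user_id
print(conventional_index_name("card", ["entity_id"]))  # ix_card_entity_id
```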
def downgrade():
op.drop_constraint(
op.f("fk_user_release_user_id_user"), "user_release", type_="foreignkey"
)
op.drop_constraint(
op.f("fk_user_release_release_id_releases"), "user_release", type_="foreignkey"
)
op.create_foreign_key(
"user_release_ibfk_2",
"user_release",
"user",
["user_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"user_release_ibfk_1",
"user_release",
"releases",
["release_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index(
"idx_17083_release_id", "user_release", ["release_id"], unique=False
)
op.create_index(
"idx_17083_ix_user_release_user_id", "user_release", ["user_id"], unique=False
)
op.drop_index(op.f("ix_user_release_user_id"), table_name="user_release")
op.create_index("idx_17073_ix_user_reset_uuid", "user", ["reset_uuid"], unique=True)
op.create_index(
"idx_17073_ix_user_email_subscriptions",
"user",
["email_subscriptions"],
unique=False,
)
op.create_index("idx_17073_ix_user_email", "user", ["email"], unique=True)
op.create_index("idx_17073_ix_user_badge", "user", ["badge"], unique=True)
op.drop_index(op.f("ix_user_reset_uuid"), table_name="user")
op.drop_index(op.f("ix_user_email_subscriptions"), table_name="user")
op.drop_index(op.f("ix_user_email"), table_name="user")
op.drop_index(op.f("ix_user_badge"), table_name="user")
op.drop_constraint(
op.f("fk_subscription_user_id_user"), "subscription", type_="foreignkey"
)
op.create_foreign_key(
"subscription_ibfk_1",
"subscription",
"user",
["user_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index(
"idx_17068_ix_subscription_last_seen_entity_id",
"subscription",
["last_seen_entity_id"],
unique=False,
)
op.drop_index(
op.f("ix_subscription_last_seen_entity_id"), table_name="subscription"
)
op.create_index(
"idx_17058_ix_stream_souce_entity_id",
"stream",
["source_entity_id"],
unique=False,
)
op.create_index("idx_17058_ix_stream_posted", "stream", ["posted"], unique=False)
op.create_index(
"idx_17058_ix_stream_entity_type", "stream", ["entity_type"], unique=False
)
op.create_index(
"idx_17058_ix_stream_entity_id", "stream", ["entity_id"], unique=True
)
op.drop_index(op.f("ix_stream_source_entity_id"), table_name="stream")
op.drop_index(op.f("ix_stream_posted"), table_name="stream")
op.drop_index(op.f("ix_stream_entity_type"), table_name="stream")
op.drop_index(op.f("ix_stream_entity_id"), table_name="stream")
op.create_index("idx_17039_ix_section_stub", "section", ["stub"], unique=True)
op.create_index(
"idx_17039_ix_section_is_restricted", "section", ["is_restricted"], unique=False
)
op.create_index(
"idx_17039_ix_section_entity_id", "section", ["entity_id"], unique=True
)
op.drop_index(op.f("ix_section_stub"), table_name="section")
op.drop_index(op.f("ix_section_is_restricted"), table_name="section")
op.drop_index(op.f("ix_section_entity_id"), table_name="section")
op.create_index("idx_17031_name", "releases", ["name"], unique=True)
op.drop_constraint(op.f("uq_releases_name"), "releases", type_="unique")
op.drop_constraint(op.f("fk_post_section_id_section"), "post", type_="foreignkey")
op.drop_constraint(op.f("fk_post_user_id_user"), "post", type_="foreignkey")
op.create_foreign_key(
"post_ibfk_1",
"post",
"section",
["section_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"post_ibfk_2",
"post",
"user",
["user_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index("idx_17019_user_id", "post", ["user_id"], unique=False)
op.create_index(
"idx_17019_ix_post_section_id", "post", ["section_id"], unique=False
)
op.create_index("idx_17019_ix_post_is_pinned", "post", ["is_pinned"], unique=False)
op.create_index(
"idx_17019_ix_post_is_deleted", "post", ["is_deleted"], unique=False
)
op.create_index("idx_17019_ix_post_entity_id", "post", ["entity_id"], unique=True)
op.create_index("idx_17019_ix_post_created", "post", ["created"], unique=False)
op.drop_index(op.f("ix_post_section_id"), table_name="post")
op.drop_index(op.f("ix_post_is_pinned"), table_name="post")
op.drop_index(op.f("ix_post_is_deleted"), table_name="post")
op.drop_index(op.f("ix_post_entity_id"), table_name="post")
op.drop_index(op.f("ix_post_created"), table_name="post")
op.create_index(
"idx_17013_ix_phoenix_dice_only_official_icons",
"phoenix_dice",
["only_official_icons"],
unique=False,
)
op.create_index(
"idx_17013_ix_phoenix_dice_email", "phoenix_dice", ["email"], unique=True
)
op.drop_index(
op.f("ix_phoenix_dice_only_official_icons"), table_name="phoenix_dice"
)
op.drop_index(op.f("ix_phoenix_dice_email"), table_name="phoenix_dice")
op.create_index("idx_17008_ix_invite_email", "invite", ["email"], unique=True)
op.drop_index(op.f("ix_invite_email"), table_name="invite")
op.drop_constraint(
op.f("fk_deck_selected_card_deck_id_deck"),
"deck_selected_card",
type_="foreignkey",
)
op.drop_constraint(
op.f("fk_deck_selected_card_card_id_card"),
"deck_selected_card",
type_="foreignkey",
)
op.create_foreign_key(
"deck_selected_card_ibfk_2",
"deck_selected_card",
"deck",
["deck_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"deck_selected_card_ibfk_1",
"deck_selected_card",
"card",
["card_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index(
"idx_17004_deck_selected_card_ibfk_1",
"deck_selected_card",
["card_id"],
unique=False,
)
op.drop_constraint(op.f("fk_deck_die_deck_id_deck"), "deck_die", type_="foreignkey")
op.create_foreign_key(
"deck_die_ibfk_1",
"deck_die",
"deck",
["deck_id"],
["id"],
onupdate="RESTRICT",
ondelete="CASCADE",
)
op.drop_constraint(
op.f("fk_deck_card_deck_id_deck"), "deck_card", type_="foreignkey"
)
op.drop_constraint(
op.f("fk_deck_card_card_id_card"), "deck_card", type_="foreignkey"
)
op.create_foreign_key(
"deck_card_ibfk_2",
"deck_card",
"deck",
["deck_id"],
["id"],
onupdate="RESTRICT",
ondelete="CASCADE",
)
op.create_foreign_key(
"deck_card_ibfk_1",
"deck_card",
"card",
["card_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index("idx_16998_card_id", "deck_card", ["card_id"], unique=False)
op.drop_constraint(op.f("fk_deck_source_id_deck"), "deck", type_="foreignkey")
op.drop_constraint(op.f("fk_deck_user_id_user"), "deck", type_="foreignkey")
op.drop_constraint(
op.f("fk_deck_ashes_500_revision_id_ashes500_revision"),
"deck",
type_="foreignkey",
)
op.drop_constraint(op.f("fk_deck_phoenixborn_id_card"), "deck", type_="foreignkey")
op.create_foreign_key(
"deck_ibfk_1",
"deck",
"card",
["phoenixborn_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"deck_ibfk_2",
"deck",
"user",
["user_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"deck_ibfk_3",
"deck",
"deck",
["source_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"fk_ashes_500_revision_id",
"deck",
"ashes500_revision",
["ashes_500_revision_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index("idx_16989_ix_deck_user_id", "deck", ["user_id"], unique=False)
op.create_index("idx_16989_ix_deck_title", "deck", ["title"], unique=False)
op.create_index("idx_16989_ix_deck_source_id", "deck", ["source_id"], unique=False)
op.create_index(
"idx_16989_ix_deck_preconstructed_release",
"deck",
["preconstructed_release"],
unique=False,
)
op.create_index(
"idx_16989_ix_deck_phoenixborn_id", "deck", ["phoenixborn_id"], unique=False
)
op.create_index("idx_16989_ix_deck_modified", "deck", ["modified"], unique=False)
op.create_index(
"idx_16989_ix_deck_is_snapshot", "deck", ["is_snapshot"], unique=False
)
op.create_index("idx_16989_ix_deck_is_public", "deck", ["is_public"], unique=False)
op.create_index(
"idx_16989_ix_deck_is_preconstructed",
"deck",
["is_preconstructed"],
unique=False,
)
op.create_index("idx_16989_ix_deck_entity_id", "deck", ["entity_id"], unique=True)
op.create_index("idx_16989_ix_deck_created", "deck", ["created"], unique=False)
op.create_index(
"idx_16989_fk_ashes_500_revision_id",
"deck",
["ashes_500_revision_id"],
unique=False,
)
op.drop_index(op.f("ix_deck_user_id"), table_name="deck")
op.drop_index(op.f("ix_deck_title"), table_name="deck")
op.drop_index(op.f("ix_deck_source_id"), table_name="deck")
op.drop_index(op.f("ix_deck_preconstructed_release"), table_name="deck")
op.drop_index(op.f("ix_deck_phoenixborn_id"), table_name="deck")
op.drop_index(op.f("ix_deck_modified"), table_name="deck")
op.drop_index(op.f("ix_deck_is_snapshot"), table_name="deck")
op.drop_index(op.f("ix_deck_is_public"), table_name="deck")
op.drop_index(op.f("ix_deck_is_preconstructed"), table_name="deck")
op.drop_index(op.f("ix_deck_entity_id"), table_name="deck")
op.drop_index(op.f("ix_deck_created"), table_name="deck")
op.drop_constraint(op.f("fk_comment_user_id_user"), "comment", type_="foreignkey")
op.create_foreign_key(
"comment_ibfk_1",
"comment",
"user",
["user_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index("idx_16978_user_id", "comment", ["user_id"], unique=False)
op.create_index(
"idx_16978_ix_comment_source_entity_id",
"comment",
["source_entity_id"],
unique=False,
)
op.create_index("idx_16978_ix_comment_order", "comment", ["order"], unique=False)
op.create_index(
"idx_16978_ix_comment_is_deleted", "comment", ["is_deleted"], unique=False
)
op.create_index(
"idx_16978_ix_comment_entity_id", "comment", ["entity_id"], unique=True
)
op.create_index(
"idx_16978_ix_comment_created", "comment", ["created"], unique=False
)
op.drop_index(op.f("ix_comment_source_entity_id"), table_name="comment")
op.drop_index(op.f("ix_comment_order"), table_name="comment")
op.drop_index(op.f("ix_comment_is_deleted"), table_name="comment")
op.drop_index(op.f("ix_comment_entity_id"), table_name="comment")
op.drop_index(op.f("ix_comment_created"), table_name="comment")
op.drop_constraint(
op.f("fk_card_conjuration_card_id_card"), "card_conjuration", type_="foreignkey"
)
op.drop_constraint(
op.f("fk_card_conjuration_conjuration_id_card"),
"card_conjuration",
type_="foreignkey",
)
op.create_foreign_key(
"card_conjuration_ibfk_1",
"card_conjuration",
"card",
["card_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"card_conjuration_ibfk_2",
"card_conjuration",
"card",
["conjuration_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index(
"idx_16973_conjuration_id", "card_conjuration", ["conjuration_id"], unique=False
)
op.drop_constraint(op.f("fk_card_release_id_releases"), "card", type_="foreignkey")
op.create_foreign_key(
"card_release_ibfk_1",
"card",
"releases",
["release_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index("idx_16961_ix_card_text", "card", ["name", "text"], unique=False)
op.create_index("idx_16961_ix_card_stub", "card", ["stub"], unique=True)
op.create_index(
"idx_16961_ix_card_release_id", "card", ["release_id"], unique=False
)
op.create_index(
"idx_16961_ix_card_phoenixborn", "card", ["phoenixborn"], unique=False
)
op.create_index("idx_16961_ix_card_name", "card", ["name"], unique=True)
op.create_index("idx_16961_ix_card_entity_id", "card", ["entity_id"], unique=True)
op.create_index(
"idx_16961_ix_card_dice_flags", "card", ["dice_flags"], unique=False
)
op.create_index(
"idx_16961_ix_card_cost_weight", "card", ["cost_weight"], unique=False
)
op.create_index("idx_16961_ix_card_card_type", "card", ["card_type"], unique=False)
op.create_index(
"idx_16961_ix_card_alt_dice_flags", "card", ["alt_dice_flags"], unique=False
)
op.drop_index("ix_card_text", table_name="card")
op.drop_index(op.f("ix_card_stub"), table_name="card")
op.drop_index(op.f("ix_card_release_id"), table_name="card")
op.drop_index(op.f("ix_card_phoenixborn"), table_name="card")
op.drop_index(op.f("ix_card_name"), table_name="card")
op.drop_index(op.f("ix_card_entity_id"), table_name="card")
op.drop_index(op.f("ix_card_dice_flags"), table_name="card")
op.drop_index(op.f("ix_card_cost_weight"), table_name="card")
op.drop_index(op.f("ix_card_card_type"), table_name="card")
op.drop_index(op.f("ix_card_alt_dice_flags"), table_name="card")
op.drop_constraint(
op.f("fk_ashes500_value_combo_card_id_card"),
"ashes500_value",
type_="foreignkey",
)
op.drop_constraint(
op.f("fk_ashes500_value_card_id_card"), "ashes500_value", type_="foreignkey"
)
op.drop_constraint(
op.f("fk_ashes500_value_revision_id_ashes500_revision"),
"ashes500_value",
type_="foreignkey",
)
op.create_foreign_key(
"ashes500_value_ibfk_2",
"ashes500_value",
"card",
["combo_card_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"ashes500_value_ibfk_1",
"ashes500_value",
"card",
["card_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_foreign_key(
"ashes500_value_ibfk_3",
"ashes500_value",
"ashes500_revision",
["revision_id"],
["id"],
onupdate="RESTRICT",
ondelete="RESTRICT",
)
op.create_index(
"idx_16955_ix_ashes500_value_revision_id",
"ashes500_value",
["revision_id"],
unique=False,
)
op.create_index(
"idx_16955_ix_ashes500_value_card_id",
"ashes500_value",
["card_id"],
unique=False,
)
op.create_index(
"idx_16955_combo_card_id", "ashes500_value", ["combo_card_id"], unique=False
)
op.drop_index(op.f("ix_ashes500_value_revision_id"), table_name="ashes500_value")
op.drop_index(op.f("ix_ashes500_value_card_id"), table_name="ashes500_value")
op.create_index(
"idx_16946_ix_ashes500_revision_entity_id",
"ashes500_revision",
["entity_id"],
unique=False,
)
op.create_index(
"idx_16946_ix_ashes500_revision_created",
"ashes500_revision",
["created"],
unique=False,
)
op.drop_index(
op.f("ix_ashes500_revision_entity_id"), table_name="ashes500_revision"
)
op.drop_index(op.f("ix_ashes500_revision_created"), table_name="ashes500_revision")
op.create_table(
"sessions",
sa.Column("id", sa.BIGINT(), autoincrement=True, nullable=False),
sa.Column(
"session_id", sa.VARCHAR(length=255), autoincrement=False, nullable=True
),
sa.Column("data", postgresql.BYTEA(), autoincrement=False, nullable=True),
sa.Column(
"expiry",
postgresql.TIMESTAMP(timezone=True),
autoincrement=False,
nullable=True,
),
sa.PrimaryKeyConstraint("id", name="idx_17049_primary"),
)
op.create_index("idx_17049_session_id", "sessions", ["session_id"], unique=True)
# ---- PythonScript/models.py (Arsennnic/OpenVHead, MIT license) ----
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Wed Feb 26 16:19:20 2020
@author: Hawk Shaw
"""
class Model():
    def __init__(self, feature_points):
        self.feature_points = feature_points

    def foo(self):
        # TODO: do something...
        pass
# ---- notebooks/lib/activation_functions/sigmoid.py (claclacla/Analyze-text-sentiment-The-machine-learning-approach, MIT license) ----
import numpy as np
def sigmoid(num):
    return 1 / (1 + np.exp(-num))
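As a quick sanity check, the same logistic function can be written for scalars with only the standard library; it satisfies σ(0) = 0.5 and σ(−x) = 1 − σ(x):

```python
import math

def sigmoid_scalar(num):
    # Standard-library equivalent of the NumPy version above, for scalars.
    return 1 / (1 + math.exp(-num))

print(sigmoid_scalar(0))  # 0.5
```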
# ---- tests/basic/with.py (MoonStarCZW/py2rb, MIT license) ----
with open("hello.txt", 'w') as f:
    f.write("Hello, world!")
with open("hello.txt", 'r') as f:
    print(f.read())
with open("hello.txt", 'r'):
    print('ok')
| 18.222222 | 33 | 0.54878 | 28 | 164 | 3.214286 | 0.464286 | 0.266667 | 0.433333 | 0.533333 | 0.377778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189024 | 164 | 8 | 34 | 20.5 | 0.676692 | 0 | 0 | 0 | 0 | 0 | 0.27439 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9d2aa63e47154382cafdd20eab87bb534ccd872b | 43 | py | Python | LiBis/test/test.py | biobai/LiBis | e334cd14de2d365943c2f4edbc36505901f01df1 | [
"MIT"
] | null | null | null | LiBis/test/test.py | biobai/LiBis | e334cd14de2d365943c2f4edbc36505901f01df1 | [
"MIT"
] | null | null | null | LiBis/test/test.py | biobai/LiBis | e334cd14de2d365943c2f4edbc36505901f01df1 | [
"MIT"
] | null | null | null | import os
print(os.path.abspath(__file__))
| 14.333333 | 32 | 0.790698 | 7 | 43 | 4.285714 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 43 | 2 | 33 | 21.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
9d3cfe2670fc3179c5b4900ed52840d18d720514 | 187 | py | Python | django/contrib/gis/gdal/base.py | Yoann-Vie/esgi-hearthstone | 115d03426c7e8e80d89883b78ac72114c29bed12 | [
"PSF-2.0",
"BSD-3-Clause"
] | null | null | null | django/contrib/gis/gdal/base.py | Yoann-Vie/esgi-hearthstone | 115d03426c7e8e80d89883b78ac72114c29bed12 | [
"PSF-2.0",
"BSD-3-Clause"
] | null | null | null | django/contrib/gis/gdal/base.py | Yoann-Vie/esgi-hearthstone | 115d03426c7e8e80d89883b78ac72114c29bed12 | [
"PSF-2.0",
"BSD-3-Clause"
] | null | null | null | from django.contrib.gis.gdal.error import GDALException
from django.contrib.gis.ptr import CPointerBase
class GDALBase(CPointerBase):
    null_ptr_exception_class = GDALException
| 26.714286 | 56 | 0.807487 | 23 | 187 | 6.434783 | 0.608696 | 0.135135 | 0.22973 | 0.27027 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13369 | 187 | 6 | 57 | 31.166667 | 0.91358 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9d5c09b937ea55139c86036d46503fa566b557d9 | 16,396 | py | Python | src/models.py | javierAraluce/ARAGAN | 5954cb8f5975b16b506ed33aaf842fc1cd715044 | [
"MIT"
] | 1 | 2022-02-02T20:06:15.000Z | 2022-02-02T20:06:15.000Z | src/models.py | javierAraluce/ARAGAN | 5954cb8f5975b16b506ed33aaf842fc1cd715044 | [
"MIT"
] | null | null | null | src/models.py | javierAraluce/ARAGAN | 5954cb8f5975b16b506ed33aaf842fc1cd715044 | [
"MIT"
] | null | null | null | import tensorflow as tf
from modules import Modules
from typing import Tuple


class Models(object):
    def __init__(self,
                 img_height: int,
                 img_width: int,
                 output_channels: int,
                 batch_size: int):
        self.blocks = Modules()
        self.IMG_HEIGHT = img_height
        self.IMG_WIDTH = img_width
        self.OUTPUT_CHANNELS = output_channels
        self.batch_size = batch_size
    def Unet(self):
        inputs = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, 3],
            batch_size=self.batch_size)
        kernel_size = 5

        down_stack = [
            self.blocks.downsample(64, kernel_size, apply_batchnorm=False,
                                   apply_dropout=True, dropout=0.5,
                                   activator='relu'),  # (bs, 128, 128, 64)
            self.blocks.downsample(128, kernel_size, apply_batchnorm=False,
                                   apply_dropout=True, dropout=0.5,
                                   activator='relu'),  # (bs, 64, 64, 128)
            self.blocks.downsample(256, kernel_size, apply_batchnorm=False,
                                   apply_dropout=True, dropout=0.5,
                                   activator='relu'),  # (bs, 32, 32, 256)
            self.blocks.downsample(512, kernel_size, apply_batchnorm=False,
                                   apply_dropout=False, dropout=0.5,
                                   activator='relu'),  # (bs, 16, 16, 512)
            self.blocks.downsample(512, kernel_size, apply_batchnorm=False,
                                   apply_dropout=False, dropout=0.5,
                                   activator='relu'),  # (bs, 8, 8, 512)
            self.blocks.downsample(512, kernel_size, apply_batchnorm=False,
                                   apply_dropout=False, dropout=0.5,
                                   activator='relu'),  # (bs, 4, 4, 512)
            self.blocks.downsample(512, kernel_size, apply_batchnorm=False,
                                   apply_dropout=False, dropout=0.5,
                                   activator='relu'),  # (bs, 2, 2, 512)
            self.blocks.downsample(512, kernel_size, apply_batchnorm=False,
                                   apply_dropout=False, dropout=0.5,
                                   activator='relu'),  # (bs, 1, 1, 512)
        ]

        up_stack = [
            self.blocks.upsample_unet(512, kernel_size, apply_batchnorm=True,
                                      apply_dropout=True, dropout=0.5,
                                      activator='relu'),  # (bs, 1, 1, 512)
            self.blocks.upsample_unet(512, kernel_size, apply_batchnorm=True,
                                      apply_dropout=True, dropout=0.5,
                                      activator='relu'),  # (bs, 2, 2, 512)
            self.blocks.upsample_unet(512, kernel_size, apply_batchnorm=True,
                                      apply_dropout=True, dropout=0.5,
                                      activator='relu'),  # (bs, 4, 4, 512)
            self.blocks.upsample_unet(512, kernel_size, apply_batchnorm=True,
                                      apply_dropout=True, dropout=0.5,
                                      activator='relu'),  # (bs, 8, 8, 512)
            self.blocks.upsample_unet(512, kernel_size, apply_batchnorm=True,
                                      apply_dropout=False, dropout=0.5,
                                      activator='relu'),  # (bs, 16, 16, 512)
            self.blocks.upsample_unet(256, kernel_size, apply_batchnorm=True,
                                      apply_dropout=False, dropout=0.5,
                                      activator='relu'),  # (bs, 32, 32, 256)
            self.blocks.upsample_unet(128, kernel_size, apply_batchnorm=True,
                                      apply_dropout=False, dropout=0.5,
                                      activator='relu'),  # (bs, 64, 64, 128)
            self.blocks.upsample_unet(64, kernel_size, apply_batchnorm=True,
                                      apply_dropout=False, dropout=0.5,
                                      activator='relu'),  # (bs, 128, 128, 64)
        ]

        initializer = tf.random_normal_initializer(0., 0.02)
        last = tf.keras.layers.Conv2DTranspose(
            self.OUTPUT_CHANNELS,
            kernel_size,
            strides=2,
            padding='same',
            kernel_initializer=initializer,
            activation='sigmoid')  # (bs, 256, 256, 3)

        x = inputs

        # Downsampling through the model
        skips = []
        for down in down_stack:
            x = down(x)
            skips.append(x)
        skips = reversed(skips[:-1])

        # Upsampling and establishing the skip connections
        for up, skip in zip(up_stack, skips):
            x = up(x)
            x = tf.keras.layers.Concatenate()([x, skip])

        x = last(x)

        generator_model = tf.keras.Model(inputs=inputs, outputs=x)
        generator_model.summary()
        return generator_model
    def Resnet(self):
        inputs = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, 3],
            batch_size=self.batch_size)
        kernel_size = 3
        # x = tf.keras.layers.ZeroPadding2D(padding=(3, 3))(inputs)
        x = inputs
        stride_steps = 14
        upsample_steps = 0

        # Downsample
        filters = 8
        for deep in range(1, stride_steps):
            if deep % 2 == 0:
                stride = 2
                upsample_steps += 1
            else:
                stride = 1
            filters = filters * stride
            x = self.blocks.down_res_block(x, stride=stride,
                                           filters=(filters, filters * 4))

        # Upsample
        filters = 128
        for deep in range(1, upsample_steps):
            x = self.blocks.upsample(input=x, filters=filters)
            filters = filters / 2

        initializer = tf.random_normal_initializer(0., 0.02)
        x = tf.keras.layers.Conv2DTranspose(
            self.OUTPUT_CHANNELS,
            kernel_size,
            strides=2,
            padding='same',
            kernel_initializer=initializer,
            activation='sigmoid')(x)  # (bs, 256, 256, 3)

        generator_model = tf.keras.Model(inputs=inputs, outputs=x)
        generator_model.summary()
        return generator_model
    def Resnet_Attention(self):
        inputs = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, 3],
            batch_size=self.batch_size)
        kernel_size = 3
        # x = tf.keras.layers.ZeroPadding2D(padding=(3, 3))(inputs)
        x = inputs
        stride_steps = 14
        upsample_steps = 0

        # Downsample
        filters = 8
        for deep in range(1, stride_steps):
            if deep % 2 == 0:
                stride = 2
                upsample_steps += 1
            else:
                stride = 1
            filters = filters * stride
            x = self.blocks.down_res_block(x, stride=stride,
                                           filters=(filters, filters * 4))

        x = self.blocks.google_attention(inputs=x, filters=filters,
                                         ratio=8, kernel_size=3)

        # Upsample
        filters = 128
        for deep in range(1, upsample_steps):
            x = self.blocks.upsample(input=x, filters=filters)
            filters = filters / 2

        initializer = tf.random_normal_initializer(0., 0.02)
        x = tf.keras.layers.Conv2DTranspose(
            self.OUTPUT_CHANNELS,
            kernel_size,
            strides=2,
            padding='same',
            kernel_initializer=initializer,
            activation='sigmoid')(x)  # (bs, 256, 256, 3)

        generator_model = tf.keras.Model(inputs=inputs, outputs=x)
        generator_model.summary()
        return generator_model
    def Resnet_Multi_Head_Attention(self):
        inputs = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, 3],
            batch_size=self.batch_size)
        kernel_size = 3
        x = inputs
        stride_steps = 8
        upsample_steps = 0

        # Downsample
        filters = 16
        for deep in range(1, stride_steps):
            if deep % 2 == 0:
                stride = 2
                upsample_steps += 1
            else:
                stride = 1
            filters = filters * stride
            x = self.blocks.down_res_block(x, stride=stride,
                                           filters=(filters, filters))

        for deep in range(1, stride_steps):
            if deep % 2 == 0:
                stride = 2
                upsample_steps += 1
            else:
                stride = 1
            filters = filters * stride
            x = self.blocks.down_res_block(x, stride=stride,
                                           filters=(filters, filters))

        x = self.blocks.MultiHead_attention_block(x, filters)

        # Upsample
        filters = 128
        for deep in range(1, upsample_steps):
            x = self.blocks.upsample(input=x, filters=filters)
            filters = filters / 2

        initializer = tf.random_normal_initializer(0., 0.02)
        x = tf.keras.layers.Conv2DTranspose(
            self.OUTPUT_CHANNELS,
            kernel_size,
            strides=2,
            padding='same',
            kernel_initializer=initializer,
            activation='sigmoid')(x)  # (bs, 256, 256, 3)

        generator_model = tf.keras.Model(inputs=inputs, outputs=x)
        generator_model.summary()
        return generator_model
    def CBAM(self):
        inputs = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, 3],
            batch_size=self.batch_size)
        kernel_size = 3
        x = inputs

        # 1st stage
        # here we perform maxpooling, see the figure above
        x = tf.keras.layers.Conv2D(256, kernel_size=(7, 7), strides=(2, 2))(x)
        x = tf.keras.layers.BatchNormalization()(x)
        x = tf.keras.layers.Activation(tf.keras.activations.relu)(x)
        x = tf.keras.layers.MaxPooling2D((3, 3), strides=(2, 2))(x)

        # Downsample
        filters = 32
        for deep in range(1, 16):
            if deep % 2 == 0:
                stride = 2
            else:
                stride = 1
            filters = filters * stride
            x = self.blocks.ResBlock_CBAM(inputs=x, filters=filters,
                                          ratio=8, kernel_size=7,
                                          stride=stride)

        # Upsample
        filters = 1024
        for deep in range(1, 8):
            x = self.blocks.upsample(input=x, filters=filters)
            filters = filters / 2

        initializer = tf.random_normal_initializer(0., 0.02)
        x = tf.keras.layers.Conv2DTranspose(
            self.OUTPUT_CHANNELS,
            kernel_size,
            strides=2,
            padding='same',
            kernel_initializer=initializer,
            activation='sigmoid')(x)  # (bs, 256, 256, 3)

        generator_model = tf.keras.Model(inputs=inputs, outputs=x)
        generator_model.summary()
        return generator_model
    def Discriminator(self):
        initializer = tf.random_normal_initializer(0., 0.02)
        kernel_size = 3
        inp = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, 3],
            name='input_image',
            batch_size=self.batch_size)
        tar = tf.keras.layers.Input(
            shape=[self.IMG_HEIGHT, self.IMG_WIDTH, self.OUTPUT_CHANNELS],
            name='target_image',
            batch_size=self.batch_size)

        x = tf.keras.layers.concatenate([inp, tar])  # (bs, 256, 256, chan*2)

        down1 = self.blocks.downsample(
            filters=64, size=kernel_size,
            apply_batchnorm=False, apply_dropout=False,
            dropout=0.5, activator='leaky_relu')(x)  # (bs, 128, 128, 64)
        down2 = self.blocks.downsample(
            filters=128, size=kernel_size,
            apply_batchnorm=True, apply_dropout=False,
            dropout=0.5, activator='leaky_relu')(down1)  # (bs, 64, 64, 128)
        down3 = self.blocks.downsample(
            filters=256, size=kernel_size,
            apply_batchnorm=True, apply_dropout=False,
            dropout=0.5, activator='leaky_relu')(down2)  # (bs, 32, 32, 256)

        # Add zeros to filter borders
        zero_pad1 = tf.keras.layers.ZeroPadding2D()(down3)  # (bs, 34, 34, 256)
        down4 = self.blocks.downsample(
            filters=512, size=kernel_size,
            apply_batchnorm=True, apply_dropout=False,
            dropout=0.5, activator='leaky_relu',
            stride=1)(zero_pad1)
        attention = self.blocks.google_attention(inputs=down4, filters=512,
                                                 ratio=8, kernel_size=3)
        zero_pad2 = tf.keras.layers.ZeroPadding2D()(attention)  # (bs, 33, 33, 512)
        last = tf.keras.layers.Conv2D(
            filters=1,
            kernel_size=4,
            strides=1,
            kernel_initializer=initializer)(zero_pad2)  # (bs, 30, 30, 1)

        discriminator_model = tf.keras.Model(inputs=[inp, tar], outputs=last)
        # discriminator_model.summary()
        return discriminator_model
| 38.669811 | 80 | 0.425714 | 1,443 | 16,396 | 4.686071 | 0.101178 | 0.053239 | 0.044218 | 0.070985 | 0.795031 | 0.743123 | 0.732919 | 0.732919 | 0.710145 | 0.699201 | 0 | 0.053138 | 0.492681 | 16,396 | 423 | 81 | 38.761229 | 0.759798 | 0.054038 | 0 | 0.758523 | 0 | 0 | 0.011769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019886 | false | 0 | 0.008523 | 0 | 0.048295 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c233e169a05d81a7a845d67c95ccd6b1fb89721b | 3,892 | py | Python | src/problems/3dknapsack/src/knapsack.py | samuelpsouza/Computer-Science | 55bb15de7927141b49d9f535e18cf552454e9ea8 | [
"Apache-2.0"
] | null | null | null | src/problems/3dknapsack/src/knapsack.py | samuelpsouza/Computer-Science | 55bb15de7927141b49d9f535e18cf552454e9ea8 | [
"Apache-2.0"
] | null | null | null | src/problems/3dknapsack/src/knapsack.py | samuelpsouza/Computer-Science | 55bb15de7927141b49d9f535e18cf552454e9ea8 | [
"Apache-2.0"
] | null | null | null | # This is a SageMath script; the import below is assumed (Sage provides it in sage.all)
from sage.all import MixedIntegerLinearProgram


def knapsack_0_1(W, v, w):
    p = MixedIntegerLinearProgram()
    x = p.new_variable(integer=True, nonnegative=True)
    p.set_objective(p.sum(x[i] * v[i] for i in range(len(v))))
    p.add_constraint(p.sum(x[i] * w[i] for i in range(len(v))) <= W)
    for i in range(len(v)):
        p.add_constraint(x[i] <= 1)
    print(round(p.solve(), 2))


def knapsack_frac(W, v, w):
    p = MixedIntegerLinearProgram()
    x = p.new_variable(real=True, nonnegative=True)
    p.set_objective(p.sum(x[i] * v[i] for i in range(len(v))))
    p.add_constraint(p.sum(x[i] * w[i] for i in range(len(v))) <= W)
    for i in range(len(v)):
        p.add_constraint(x[i] <= 1)
    print(round(p.solve(), 2))


def knapsack_3d_c_r(W1, W2, W3, x, y, z, v):
    p = MixedIntegerLinearProgram()
    s = p.new_variable(integer=True, nonnegative=True)
    p.set_objective(p.sum(s[i] * v[i] for i in range(len(v))))
    p.add_constraint(p.sum(s[i] * x[i] for i in range(len(v))) <= W1)
    p.add_constraint(p.sum(s[i] * y[i] for i in range(len(v))) <= W2)
    p.add_constraint(p.sum(s[i] * z[i] for i in range(len(v))) <= W3)
    for i in range(len(v)):
        p.add_constraint(s[i] <= 1)
    for i in range(len(v)):
        p.add_constraint(s[i] >= 0)
    print(round(p.solve(), 2))


def knapsack_3d(W1, W2, W3, d1, d2, d3, v):
    p = MixedIntegerLinearProgram()
    s = p.new_variable(integer=True, nonnegative=True)
    p.set_objective(p.sum(s[i] * v[i] for i in range(len(v))))
    p.add_constraint(p.sum(s[i] * d1[i] for i in range(len(v))) <= W1)
    p.add_constraint(p.sum(s[i] * d2[i] for i in range(len(v))) <= W2)
    p.add_constraint(p.sum(s[i] * d3[i] for i in range(len(v))) <= W3)
    for i in range(len(v)):
        p.add_constraint(s[i] >= 0)
    p.show()
    print(round(p.solve(), 2))
    values = p.get_values(s)
    print(values)


def knapsack_3d_pl_pura(W1, W2, W3, d1, d2, d3, v):
    p = MixedIntegerLinearProgram()
    s = p.new_variable(real=True, nonnegative=True)
    p.set_objective(p.sum(s[i] * v[i] for i in range(len(v))))
    p.add_constraint(p.sum(s[i] * d1[i] for i in range(len(v))) <= W1)
    p.add_constraint(p.sum(s[i] * d2[i] for i in range(len(v))) <= W2)
    p.add_constraint(p.sum(s[i] * d3[i] for i in range(len(v))) <= W3)
    for i in range(len(v)):
        p.add_constraint(s[i] <= 1)
    p.show()
    print(round(p.solve(), 2))
    values = p.get_values(s)
    print(values)


def knapsack_3d_frac(W1, W2, W3, x, y, z, v):
    p = MixedIntegerLinearProgram()
    s = p.new_variable(real=True, nonnegative=True)
    p.set_objective(p.sum(s[i] * v[i] for i in range(len(v))))
    p.add_constraint(p.sum(s[i] * x[i] for i in range(len(v))) <= W1)
    p.add_constraint(p.sum(s[i] * y[i] for i in range(len(v))) <= W2)
    p.add_constraint(p.sum(s[i] * z[i] for i in range(len(v))) <= W3)
    for i in range(len(v)):
        p.add_constraint(s[i] <= 1)
    print(round(p.solve(), 2))


def get_data(path, problem_type=1):
    file = open(path, "r")
    if problem_type == 1:
        return get_3d_data(file)


def get_d_data(file):
    line = file.readline()
    (n, W) = line.split()
    n = int(n)  # number of items
    W = int(W)  # knapsack capacity
    v = [None] * n
    w = [None] * n
    for i in range(n):
        line = file.readline()
        (v_i, w_i) = line.replace("\n", "").split(" ")
        v[i] = int(v_i)
        w[i] = int(w_i)
    return (W, v, w)


def get_3d_data(file):
    line = file.readline()
    (n, W1, W2, W3) = line.split()
    n = int(n)
    W1 = float(W1)
    W2 = float(W2)
    W3 = float(W3)
    x = [None] * n
    y = [None] * n
    z = [None] * n
    v = [None] * n
    for i in range(n):
        line = file.readline()
        (v_i, x_i, y_i, z_i) = line.split()
        v[i] = float(v_i)
        y[i] = float(y_i)
        x[i] = float(x_i)
        z[i] = float(z_i)
    return (W1, W2, W3, x, y, z, v) | 26.657534 | 70 | 0.562436 | 727 | 3,892 | 2.917469 | 0.093535 | 0.054691 | 0.082037 | 0.150401 | 0.855728 | 0.842527 | 0.818953 | 0.814239 | 0.789722 | 0.761433 | 0 | 0.0241 | 0.243063 | 3,892 | 146 | 71 | 26.657534 | 0.695859 | 0.008479 | 0 | 0.66 | 0 | 0 | 0.001037 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.09 | false | 0 | 0 | 0 | 0.12 | 0.08 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
dfa8589eabb9c8bdbd49cafd213c738e54bcfac7 | 221 | py | Python | tccli/services/faceid/__init__.py | hapsyou/tencentcloud-cli-intl-en | fa8ba71164484f9a2be4b983080a1de08606c0b0 | [
"Apache-2.0"
] | null | null | null | tccli/services/faceid/__init__.py | hapsyou/tencentcloud-cli-intl-en | fa8ba71164484f9a2be4b983080a1de08606c0b0 | [
"Apache-2.0"
] | null | null | null | tccli/services/faceid/__init__.py | hapsyou/tencentcloud-cli-intl-en | fa8ba71164484f9a2be4b983080a1de08606c0b0 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from tccli.services.faceid.faceid_client import register_arg
from tccli.services.faceid.faceid_client import get_actions_info
from tccli.services.faceid.faceid_client import AVAILABLE_VERSION_LIST
| 44.2 | 70 | 0.841629 | 32 | 221 | 5.5625 | 0.53125 | 0.151685 | 0.286517 | 0.38764 | 0.691011 | 0.691011 | 0.691011 | 0 | 0 | 0 | 0 | 0.004902 | 0.076923 | 221 | 4 | 71 | 55.25 | 0.867647 | 0.095023 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
dfd762e474c88b37001500803f28ee7114768e9f | 92 | py | Python | dsaa/rsa.py | ntdgy/python_study | c3511846a89ea72418937de4cc3edf1595a46ec5 | [
"MIT"
] | null | null | null | dsaa/rsa.py | ntdgy/python_study | c3511846a89ea72418937de4cc3edf1595a46ec5 | [
"MIT"
] | null | null | null | dsaa/rsa.py | ntdgy/python_study | c3511846a89ea72418937de4cc3edf1595a46ec5 | [
"MIT"
] | null | null | null | # for i in range(1000):
#     if 7 * i % 48 == 1:
#         print(i)
print(pow(57, 7) % 65)
| 18.4 | 25 | 0.445652 | 17 | 92 | 2.411765 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216667 | 0.347826 | 92 | 4 | 26 | 23 | 0.466667 | 0.673913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
a03c4bd55bc1ff5be909c4e186b3bf1ff9bfe018 | 46 | py | Python | VirginEurope/api/conf.py | fpeterek/VirginEuropeApp | 79418e5f6db0fd717001fff7fd89620c1e7cffc8 | [
"MIT"
] | null | null | null | VirginEurope/api/conf.py | fpeterek/VirginEuropeApp | 79418e5f6db0fd717001fff7fd89620c1e7cffc8 | [
"MIT"
] | null | null | null | VirginEurope/api/conf.py | fpeterek/VirginEuropeApp | 79418e5f6db0fd717001fff7fd89620c1e7cffc8 | [
"MIT"
] | null | null | null | class Conf:
    url = 'http://127.0.0.1:8080'
| 15.333333 | 33 | 0.565217 | 9 | 46 | 2.888889 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.27027 | 0.195652 | 46 | 2 | 34 | 23 | 0.432432 | 0 | 0 | 0 | 0 | 0 | 0.456522 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
a049c560a10df0b77b864102a6ff411e9aaab1c7 | 1,783 | py | Python | model_v3.py | vitorglemos/sentiment-analysis | 7ca3c7e3f97e503f573c35a49b3149fca67d1840 | [
"MIT"
] | 1 | 2021-05-07T16:22:18.000Z | 2021-05-07T16:22:18.000Z | model_v3.py | vitorglemos/sentiment-analysis | 7ca3c7e3f97e503f573c35a49b3149fca67d1840 | [
"MIT"
] | null | null | null | model_v3.py | vitorglemos/sentiment-analysis | 7ca3c7e3f97e503f573c35a49b3149fca67d1840 | [
"MIT"
] | null | null | null | # Keras imports assumed (missing from the original file)
from keras.models import Sequential
from keras.layers import Conv2D, BatchNormalization, MaxPooling2D, Dropout, Flatten, Dense
from keras.regularizers import l2

num_features = 64
num_labels = 7
batch_size = 64
epochs = 100
width, height = 48, 48
model = Sequential()
model.add(Conv2D(num_features, kernel_size=(3, 3), activation='relu', input_shape=(width, height, 1), data_format='channels_last', kernel_regularizer=l2(0.01)))
model.add(Conv2D(num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Dropout(0.5))
model.add(Conv2D(2*num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(Conv2D(2*num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Dropout(0.5))
model.add(Conv2D(2*2*num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(Conv2D(2*2*num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Dropout(0.5))
model.add(Conv2D(2*2*2*num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(Conv2D(2*2*2*num_features, kernel_size=(3, 3), activation='relu', padding='same'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(2*2*2*num_features, activation='relu'))
model.add(Dropout(0.4))
model.add(Dense(2*2*num_features, activation='relu'))
model.add(Dropout(0.4))
model.add(Dense(2*num_features, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_labels, activation='softmax'))
model.summary()
| 37.145833 | 160 | 0.734717 | 280 | 1,783 | 4.567857 | 0.160714 | 0.193901 | 0.084441 | 0.131353 | 0.852228 | 0.841282 | 0.841282 | 0.834246 | 0.834246 | 0.802189 | 0 | 0.055689 | 0.063376 | 1,783 | 47 | 161 | 37.93617 | 0.71018 | 0 | 0 | 0.631579 | 0 | 0 | 0.051598 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a06f95448803d86e62372ce7de9a80bb25060e9b | 78 | py | Python | cartoview/apps_handler/__init__.py | mofath/cartoview | 53f3891172686aaf4de296c3b1e2356aeba29d73 | [
"BSD-2-Clause"
] | 2 | 2021-12-29T02:18:01.000Z | 2021-12-29T02:18:02.000Z | cartoview/apps_handler/__init__.py | mofath/cartoview | 53f3891172686aaf4de296c3b1e2356aeba29d73 | [
"BSD-2-Clause"
] | null | null | null | cartoview/apps_handler/__init__.py | mofath/cartoview | 53f3891172686aaf4de296c3b1e2356aeba29d73 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from .utils import create_apps_dir
create_apps_dir()
| 15.6 | 34 | 0.705128 | 12 | 78 | 4.25 | 0.75 | 0.392157 | 0.509804 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014925 | 0.141026 | 78 | 4 | 35 | 19.5 | 0.746269 | 0.269231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
263bc955cadae8558bbf13e70bdb08b31abe0a3e | 41 | py | Python | problems/biclique/__init__.py | Benezivas/algobattle-problems | b00b85413893bd1618001a4cdaa0dd7442f4e481 | [
"MIT"
] | null | null | null | problems/biclique/__init__.py | Benezivas/algobattle-problems | b00b85413893bd1618001a4cdaa0dd7442f4e481 | [
"MIT"
] | null | null | null | problems/biclique/__init__.py | Benezivas/algobattle-problems | b00b85413893bd1618001a4cdaa0dd7442f4e481 | [
"MIT"
] | null | null | null | from .problem import Biclique as Problem
| 20.5 | 40 | 0.829268 | 6 | 41 | 5.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146341 | 41 | 1 | 41 | 41 | 0.971429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
cd638e9d5cc2ced2368475f472d34e3fcbd242f9 | 899 | py | Python | trim_model.py | kobewangSky/CenterNet | c7f001b1534064a1932b81df0c28c6197bec8375 | [
"MIT"
] | null | null | null | trim_model.py | kobewangSky/CenterNet | c7f001b1534064a1932b81df0c28c6197bec8375 | [
"MIT"
] | null | null | null | trim_model.py | kobewangSky/CenterNet | c7f001b1534064a1932b81df0c28c6197bec8375 | [
"MIT"
] | null | null | null | import os
import torch
os.environ["CUDA_VISIBLE_DEVICES"] = "1"
model = torch.load("./exp/ctdet/ctdet_coco_hg.pth")
del model['iteration']
del model['scheduler']
del model['optimizer']
del model['model']['module.roi_heads.box.predictor.cls_score.weight']
del model['model']['module.roi_heads.box.predictor.cls_score.bias']
del model['model']['module.roi_heads.box.predictor.bbox_pred.weight']
del model['model']['module.roi_heads.box.predictor.bbox_pred.bias']
del model['model']['module.roi_heads.mask.predictor.mask_fcn_logits.weight']
del model['model']['module.roi_heads.mask.predictor.mask_fcn_logits.bias']
del model['model']['module.rpn.head.cls_logits.weight']
del model['model']['module.rpn.head.cls_logits.bias']
del model['model']['module.rpn.head.bbox_pred.weight']
del model['model']['module.rpn.head.bbox_pred.bias']
torch.save(model, "./ACRV/CoCo_Virtual_30Class/model_0000000_.pth") | 44.95 | 76 | 0.769744 | 142 | 899 | 4.690141 | 0.28169 | 0.156156 | 0.195195 | 0.285285 | 0.717718 | 0.717718 | 0.717718 | 0.678679 | 0.459459 | 0.441441 | 0 | 0.011628 | 0.043382 | 899 | 20 | 77 | 44.95 | 0.762791 | 0 | 0 | 0 | 0 | 0 | 0.653333 | 0.545556 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
269cb65feabb699098d8c4ed019933e7998f85a5 | 2,275 | py | Python | openomics/genomics.py | muluayele999/OpenOmics | 29e3bbc586489c3929ac54d9886627f907aa38b1 | [
"MIT"
] | null | null | null | openomics/genomics.py | muluayele999/OpenOmics | 29e3bbc586489c3929ac54d9886627f907aa38b1 | [
"MIT"
] | 1 | 2021-12-13T20:51:32.000Z | 2021-12-13T20:51:32.000Z | openomics/genomics.py | muluayele999/OpenOmics | 29e3bbc586489c3929ac54d9886627f907aa38b1 | [
"MIT"
] | 1 | 2021-02-18T10:39:00.000Z | 2021-02-18T10:39:00.000Z | from openomics.database.annotation import Annotatable
from openomics.transcriptomics import ExpressionData
class SingleNucleotideVariants(ExpressionData, Annotatable):
    pass


class SomaticMutation(ExpressionData, Annotatable):
    def __init__(self, cohort_name, file_path, columns, genes_col_name, gene_index,
                 sample_index="sample_barcode", transposed=True,
                 log2_transform=False, npartitions=0):
        super(SomaticMutation, self).__init__(
            cohort_name, file_path, columns=columns, genes_col_name=genes_col_name,
            gene_index=gene_index, sample_index=sample_index, transposed=transposed,
            log2_transform=log2_transform, npartitions=npartitions)

    @classmethod
    def name(cls):
        return cls.__name__


class DNAMethylation(ExpressionData, Annotatable):
    def __init__(self, cohort_name, file_path, columns, genes_col_name, gene_index,
                 sample_index="sample_barcode", transposed=True,
                 log2_transform=False, npartitions=0):
        super(DNAMethylation, self).__init__(
            cohort_name, file_path, columns=columns, genes_col_name=genes_col_name,
            gene_index=gene_index, sample_index=sample_index, transposed=transposed,
            log2_transform=log2_transform, npartitions=npartitions)

    @classmethod
    def name(cls):
        return cls.__name__


class CopyNumberVariation(ExpressionData, Annotatable):
    def __init__(self, cohort_name, file_path, columns, genes_col_name, gene_index,
                 sample_index="sample_barcode", transposed=True,
                 log2_transform=False, npartitions=0):
        super(CopyNumberVariation, self).__init__(
            cohort_name, file_path, columns=columns, genes_col_name=genes_col_name,
            gene_index=gene_index, sample_index=sample_index, transposed=transposed,
            log2_transform=log2_transform, npartitions=npartitions)

    @classmethod
    def name(cls):
        return cls.__name__
| 49.456522 | 118 | 0.634725 | 217 | 2,275 | 6.211982 | 0.179724 | 0.097923 | 0.080119 | 0.080119 | 0.810831 | 0.810831 | 0.810831 | 0.810831 | 0.810831 | 0.810831 | 0 | 0.007524 | 0.298901 | 2,275 | 45 | 119 | 50.555556 | 0.837618 | 0 | 0 | 0.611111 | 0 | 0 | 0.018462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.027778 | 0.055556 | 0.083333 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
26bf4b70079fb49c077099a6252b1ae85bcabc3f | 187 | py | Python | Lesson 3/task1.py | arechesk/PythonHW | e05e8c36ac208153c3cab0c512c803723760ed2d | [
"MIT"
] | null | null | null | Lesson 3/task1.py | arechesk/PythonHW | e05e8c36ac208153c3cab0c512c803723760ed2d | [
"MIT"
] | null | null | null | Lesson 3/task1.py | arechesk/PythonHW | e05e8c36ac208153c3cab0c512c803723760ed2d | [
"MIT"
] | null | null | null | def squares(*args):
return [i**2 for i in args]
def even_items(*args):
return args[::2]
def cubes(*args):
return [i**3 for i in filter(lambda x: x % 2 == 0, args[1::2])]
| 15.583333 | 67 | 0.57754 | 35 | 187 | 3.057143 | 0.485714 | 0.280374 | 0.205607 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048951 | 0.235294 | 187 | 11 | 68 | 17 | 0.699301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
26d04e030b2173d3fa7d624da9a74c4c82a04ba9 | 4,884 | py | Python | tests/network/test_ipfilter.py | Degget1986/neo-mamba | da7312d5027f3e9b0e5421495d5c00915bdfd786 | [
"MIT"
] | 12 | 2020-08-27T19:56:02.000Z | 2022-03-08T03:23:43.000Z | tests/network/test_ipfilter.py | Degget1986/neo-mamba | da7312d5027f3e9b0e5421495d5c00915bdfd786 | [
"MIT"
] | 101 | 2020-07-24T08:23:00.000Z | 2021-11-17T13:37:45.000Z | tests/network/test_ipfilter.py | Degget1986/neo-mamba | da7312d5027f3e9b0e5421495d5c00915bdfd786 | [
"MIT"
] | 11 | 2021-02-11T22:24:13.000Z | 2021-11-18T06:45:03.000Z | import unittest

from neo3.network.ipfilter import IPFilter


class IPFilteringTestCase(unittest.TestCase):
    def test_nobody_allowed(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
                '0.0.0.0/0'
            ],
            'whitelist': [
            ]
        })
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        self.assertFalse(filter.is_allowed('10.10.10.10'))

    def test_nobody_allowed_except_one(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
                '0.0.0.0/0'
            ],
            'whitelist': [
                '10.10.10.10'
            ]
        })
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        self.assertFalse(filter.is_allowed('10.10.10.11'))
        self.assertTrue(filter.is_allowed('10.10.10.10'))

    def test_everybody_allowed(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
            ],
            'whitelist': [
            ]
        })
        self.assertTrue(filter.is_allowed('127.0.0.1'))
        self.assertTrue(filter.is_allowed('10.10.10.11'))
        self.assertTrue(filter.is_allowed('10.10.10.10'))

        filter.load_config({
            'blacklist': [
            ],
            'whitelist': [
                '0.0.0.0/0'
            ]
        })
        self.assertTrue(filter.is_allowed('127.0.0.1'))
        self.assertTrue(filter.is_allowed('10.10.10.11'))
        self.assertTrue(filter.is_allowed('10.10.10.10'))

        filter.load_config({
            'blacklist': [
                '0.0.0.0/0'
            ],
            'whitelist': [
                '0.0.0.0/0'
            ]
        })
        self.assertTrue(filter.is_allowed('127.0.0.1'))
        self.assertTrue(filter.is_allowed('10.10.10.11'))
        self.assertTrue(filter.is_allowed('10.10.10.10'))

    def test_everybody_allowed_except_one(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
                '127.0.0.1'
            ],
            'whitelist': [
            ]
        })
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        self.assertTrue(filter.is_allowed('10.10.10.11'))
        self.assertTrue(filter.is_allowed('10.10.10.10'))

    def test_disallow_ip_range(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
                '127.0.0.0/24'
            ],
            'whitelist': [
            ]
        })
        self.assertFalse(filter.is_allowed('127.0.0.0'))
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        self.assertFalse(filter.is_allowed('127.0.0.100'))
        self.assertFalse(filter.is_allowed('127.0.0.255'))
        self.assertTrue(filter.is_allowed('10.10.10.11'))
        self.assertTrue(filter.is_allowed('10.10.10.10'))

    def test_updating_blacklist(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
            ],
            'whitelist': [
            ]
        })
        self.assertTrue(filter.is_allowed('127.0.0.1'))
        filter.blacklist_add('127.0.0.0/24')
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        # should have no effect, only exact matches
        filter.blacklist_remove('127.0.0.1')
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        filter.blacklist_remove('127.0.0.0/24')
        self.assertTrue(filter.is_allowed('127.0.0.1'))

    def test_updating_whitelist(self):
        filter = IPFilter()
        filter.load_config({
            'blacklist': [
                '0.0.0.0/0'
            ],
            'whitelist': [
            ]
        })
        self.assertFalse(filter.is_allowed('127.0.0.1'))
        filter.whitelist_add('127.0.0.0/24')
        self.assertTrue(filter.is_allowed('127.0.0.1'))
        filter.whitelist_remove('127.0.0.1')
        # should have no effect, only exact matches
        self.assertTrue(filter.is_allowed('127.0.0.1'))
        filter.whitelist_remove('127.0.0.0/24')
        self.assertFalse(filter.is_allowed('127.0.0.1'))

    def test_invalid_config_loading(self):
        filter = IPFilter()
        # mandatory keys not present
        with self.assertRaises(ValueError) as ctx:
            filter.load_config({'blacklist': []})
        self.assertEqual("whitelist key not found", str(ctx.exception))
        # mandatory key blacklist not present
        with self.assertRaises(ValueError) as ctx:
            filter.load_config({'whitelist': []})
        self.assertEqual("blacklist key not found", str(ctx.exception))

    def test_config_reset(self):
        filter = IPFilter()
        filter.blacklist_add('127.0.0.1')
        self.assertEqual(1, len(filter._config['blacklist']))
        filter.reset()
        self.assertEqual(filter.default_config, filter._config)
| 29.96319 | 71 | 0.54484 | 578 | 4,884 | 4.472318 | 0.112457 | 0.044101 | 0.179884 | 0.044101 | 0.829014 | 0.825919 | 0.788781 | 0.784139 | 0.767892 | 0.731528 | 0 | 0.094395 | 0.305897 | 4,884 | 162 | 72 | 30.148148 | 0.668142 | 0.029894 | 0 | 0.671756 | 0 | 0 | 0.148954 | 0 | 0 | 0 | 0 | 0 | 0.282443 | 1 | 0.068702 | false | 0 | 0.015267 | 0 | 0.091603 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f80d622b73f648482d12831fd72291cfc98ee57b | 26 | py | Python | sampling_free/modeling/generalized_rcnn/rpn/paa/__init__.py | ChenJoya/sampling-free | 01dfd40cf794ee5afea4f052216483f3901ecd20 | [
"MIT"
] | 266 | 2019-09-26T07:03:48.000Z | 2022-02-23T13:26:16.000Z | sampling_free/modeling/generalized_rcnn/rpn/paa/__init__.py | chenjoya/sampling-free | 01dfd40cf794ee5afea4f052216483f3901ecd20 | [
"MIT"
] | 9 | 2019-10-24T00:41:49.000Z | 2021-12-31T01:26:08.000Z | sampling_free/modeling/generalized_rcnn/rpn/paa/__init__.py | chenjoya/sampling-free | 01dfd40cf794ee5afea4f052216483f3901ecd20 | [
"MIT"
] | 20 | 2019-11-07T10:03:18.000Z | 2021-11-13T14:03:31.000Z | from .paa import build_paa | 26 | 26 | 0.846154 | 5 | 26 | 4.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3e072ff03a5fdcdb3278080f6b44ac147e1cf612 | 347 | py | Python | sdk/keyvault/azure-keyvault-keys/azure/keyvault/keys/__init__.py | gautam714/azure-sdk-for-python | 1741c199c42e8c85a2e14bc78195fd992837ef92 | [
"MIT"
] | null | null | null | sdk/keyvault/azure-keyvault-keys/azure/keyvault/keys/__init__.py | gautam714/azure-sdk-for-python | 1741c199c42e8c85a2e14bc78195fd992837ef92 | [
"MIT"
] | null | null | null | sdk/keyvault/azure-keyvault-keys/azure/keyvault/keys/__init__.py | gautam714/azure-sdk-for-python | 1741c199c42e8c85a2e14bc78195fd992837ef92 | [
"MIT"
] | null | null | null | # ------------------------------------
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
# -------------------------------------
from .client import KeyClient
from .enums import JsonWebKeyCurveName, JsonWebKeyOperation, JsonWebKeyType
__all__ = ["JsonWebKeyCurveName", "JsonWebKeyOperation", "JsonWebKeyType", "KeyClient"]
| 38.555556 | 87 | 0.622478 | 24 | 347 | 8.833333 | 0.75 | 0.358491 | 0.490566 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092219 | 347 | 8 | 88 | 43.375 | 0.673016 | 0.412104 | 0 | 0 | 0 | 0 | 0.306533 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3e4c2ea5d22398954563b0c55718d5daab1aa317 | 31 | py | Python | mopet/__init__.py | caglorithm/mopet | f0b8a6f163ec5160b5d5e335f09eac0aae83bcba | [
"MIT"
] | 15 | 2020-03-13T03:43:51.000Z | 2021-02-25T14:58:35.000Z | mopet/__init__.py | caglorithm/mopet | f0b8a6f163ec5160b5d5e335f09eac0aae83bcba | [
"MIT"
] | 28 | 2020-04-20T19:48:12.000Z | 2021-02-16T11:31:12.000Z | mopet/__init__.py | caglorithm/mopet | f0b8a6f163ec5160b5d5e335f09eac0aae83bcba | [
"MIT"
] | 2 | 2020-05-06T19:48:41.000Z | 2020-05-28T07:00:10.000Z | from .mopet import Exploration
| 15.5 | 30 | 0.83871 | 4 | 31 | 6.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e4b6af73cc746f4479c6bc40a829a5ec17e679b3 | 146 | py | Python | upgrade.py | brighthive/data-resource-api | a012fc0743f1ce2b72ddacf348c57adf44245cfa | [
"MIT"
] | 4 | 2019-02-14T01:07:54.000Z | 2019-11-04T17:28:35.000Z | upgrade.py | brighthive/data-resource-api | a012fc0743f1ce2b72ddacf348c57adf44245cfa | [
"MIT"
] | 39 | 2019-05-30T22:08:46.000Z | 2022-02-17T02:47:00.000Z | upgrade.py | brighthive/data-resource-api | a012fc0743f1ce2b72ddacf348c57adf44245cfa | [
"MIT"
] | 1 | 2020-04-29T18:16:20.000Z | 2020-04-29T18:16:20.000Z | import sys
if "--upgrade-104-to-110" in sys.argv:
    from data_resource_api.backwards_compatibility.upgrade_104_to_110 import main
    main()
| 18.25 | 81 | 0.760274 | 23 | 146 | 4.565217 | 0.695652 | 0.190476 | 0.228571 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 0.150685 | 146 | 7 | 82 | 20.857143 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0.136986 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
904418936dc36010514bb2d1610f7d9c55a1110f | 202 | py | Python | lcp/__init__.py | thomaspingel/lcpy | 6bf2893010d1dad5ad783d7c4f0351166425a1a7 | [
"MIT"
] | null | null | null | lcp/__init__.py | thomaspingel/lcpy | 6bf2893010d1dad5ad783d7c4f0351166425a1a7 | [
"MIT"
] | null | null | null | lcp/__init__.py | thomaspingel/lcpy | 6bf2893010d1dad5ad783d7c4f0351166425a1a7 | [
"MIT"
] | null | null | null | from .lcp import get_linear_routes, get_areal_routes, cost_tobler_hiking_function, cost_rademaker, cost_pingel_exponential, direct_routes, create_raster_network, get_lists, lcp_coordinate_conversion, ve | 202 | 202 | 0.891089 | 29 | 202 | 5.655172 | 0.758621 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059406 | 202 | 1 | 202 | 202 | 0.863158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5f90f9c88e4d7dda72dd917f7a622a0bfacf1cec | 25 | py | Python | tests/__init__.py | edwar64896/pycmx | 566e6257f460a62f2838e6159a75979a2e642d78 | [
"MIT"
] | 10 | 2019-11-18T08:31:40.000Z | 2021-09-21T22:47:29.000Z | tests/__init__.py | edwar64896/pycmx | 566e6257f460a62f2838e6159a75979a2e642d78 | [
"MIT"
] | 4 | 2018-12-01T22:00:19.000Z | 2018-12-29T22:08:44.000Z | tests/__init__.py | edwar64896/pycmx | 566e6257f460a62f2838e6159a75979a2e642d78 | [
"MIT"
] | 3 | 2019-12-07T06:28:29.000Z | 2021-06-24T15:00:53.000Z | from . import test_parse
| 12.5 | 24 | 0.8 | 4 | 25 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 1 | 25 | 25 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5fafaaeb2a1c44205fb94ee799def6777517f744 | 8,752 | py | Python | main.py | avara1986/pogona-api | 3d99083a94e50e9a8cc86ef07971ffde586c7b4d | [
"MIT"
] | null | null | null | main.py | avara1986/pogona-api | 3d99083a94e50e9a8cc86ef07971ffde586c7b4d | [
"MIT"
] | null | null | null | main.py | avara1986/pogona-api | 3d99083a94e50e9a8cc86ef07971ffde586c7b4d | [
"MIT"
] | null | null | null | from google.appengine.api import mail
from google.appengine.ext import db
from webapp2_extras import json

import os

import jinja2
import webapp2

JINJA_ENVIRONMENT = jinja2.Environment(
    loader=jinja2.FileSystemLoader(os.path.dirname(__file__)),
    extensions=['jinja2.ext.autoescape'],
    autoescape=True)


class Email(db.Model):
    to = db.StringProperty()
    subject = db.StringProperty()
    content = db.TextProperty()


class MainPage(webapp2.RequestHandler):
    def get(self):
        template = JINJA_ENVIRONMENT.get_template('index.html')
        self.response.write(template.render())


class Createmail(webapp2.RequestHandler):
    def options(self):
        self.response.headers['Access-Control-Allow-Origin'] = '*'
        self.response.headers[
            'Access-Control-Allow-Headers'] = 'Origin, X-Requested-With, Content-Type, Accept'
        self.response.headers[
            'Access-Control-Allow-Methods'] = 'POST, GET, PUT, DELETE'

    def post(self):
        save = True
        msg_error = ""
        if len(self.request.get('subject')) > 0:
            subject = self.request.get('subject')
        else:
            save = False
            msg_error = "No se envio asunto"
        if save is True and len(self.request.get('to')) > 0:
            to = self.request.get('to')
        else:
            save = False
            msg_error = "No se envio destinatario"
        if save is True:
            email = Email(to=to,
                          subject=subject)
            email.put()
            obj = {
                'success': True,
                'msg': '',
                'pos': '',
                'id': str(email.key()),
            }
        else:
            obj = {
                'success': False,
                'pos': '',
                'msg': msg_error,
                'id': '',
            }
        self.response.content_type = 'application/json'
        self.response.headers.add_header('Access-Control-Allow-Origin', '*')
        self.response.write(json.encode(obj))


class Chunkmail(webapp2.RequestHandler):
    def options(self):
        self.response.headers['Access-Control-Allow-Origin'] = '*'
        self.response.headers[
            'Access-Control-Allow-Headers'] = 'Origin, X-Requested-With, Content-Type, Accept'
        self.response.headers[
            'Access-Control-Allow-Methods'] = 'POST, GET, PUT, DELETE'

    def post(self):
        q = Email.all()
        try:
            q.filter(
                "__key__ =", db.Key(self.request.get('i')))
            email = q.get()
            if email.content is None:
                email.content = self.request.get('content').rstrip()
            else:
                email.content = email.content + \
                    self.request.get('content').rstrip()
            email.put()
            obj = {
                'success': True,
                'msg': '',
                'id': str(email.key()),
                'to': email.to,
                'pos': self.request.get('pos'),
                'subject': email.subject,
                'content': email.content,
            }
        except db.datastore_errors.BadKeyError:
            obj = {
                'success': False,
                'msg': "no existe",
            }
        self.response.content_type = 'application/json'
        self.response.headers.add_header('Access-Control-Allow-Origin', '*')
        self.response.write(json.encode(obj))


class SendChunkMail(webapp2.RequestHandler):
    def options(self):
        self.response.headers['Access-Control-Allow-Origin'] = '*'
        self.response.headers[
            'Access-Control-Allow-Headers'] = 'Origin, X-Requested-With, Content-Type, Accept'
        self.response.headers[
            'Access-Control-Allow-Methods'] = 'POST, GET, PUT, DELETE'

    def get(self):
        q = Email.all()
        try:
            q.filter(
                "__key__ =", db.Key(self.request.get('i')))
            email = q.get()
            send = True
            msg_error = ""
            if len(email.subject) > 0:
                message = mail.EmailMessage(sender="Pogona Test Emailing <a.vara.1986@gmail.com>",
                                            subject=email.subject)
            else:
                send = False
                msg_error = "No se envio asunto"
            if send is True and len(email.to) > 0:
                message.to = email.to
            else:
                send = False
                msg_error = "No se envio destinatario"
            if send is True and len(email.content) > 0:
                message.html = email.content
            else:
                send = False
                msg_error = "No se envio contenido"
            if send is True:
                message.send()
                obj = {
                    'success': True,
                    'msg': 'Envio ok',
                }
            else:
                obj = {
                    'success': False,
                    'msg': msg_error,
                }
        except db.datastore_errors.BadKeyError:
            obj = {
                'success': False,
                'msg': "no existe",
            }
        self.response.content_type = 'application/json'
        self.response.headers.add_header('Access-Control-Allow-Origin', '*')
        self.response.write(json.encode(obj))

    def post(self):
        q = Email.all()
        try:
            q.filter(
                "__key__ =", db.Key(self.request.get('i')))
            email = q.get()
            send = True
            msg_error = ""
            if len(email.subject) > 0:
                message = mail.EmailMessage(sender="Pogona Test Emailing <a.vara.1986@gmail.com>",
                                            subject=email.subject)
            else:
                send = False
                msg_error = "No se envio asunto"
            if send is True and len(email.to) > 0:
                message.to = email.to
            else:
                send = False
                msg_error = "No se envio destinatario"
            if send is True and len(email.content) > 0:
                message.html = email.content
            else:
                send = False
                msg_error = "No se envio contenido"
            if send is True:
                message.send()
                obj = {
                    'success': True,
                    'msg': 'Envio ok',
                }
            else:
                obj = {
                    'success': False,
                    'msg': msg_error,
                }
        except db.datastore_errors.BadKeyError:
            obj = {
                'success': False,
                'msg': "no existe",
            }
        self.response.content_type = 'application/json'
        self.response.headers.add_header('Access-Control-Allow-Origin', '*')
        self.response.write(json.encode(obj))


class Sendmail(webapp2.RequestHandler):
    def options(self):
        self.response.headers['Access-Control-Allow-Origin'] = '*'
        self.response.headers[
            'Access-Control-Allow-Headers'] = 'Origin, X-Requested-With, Content-Type, Accept'
        self.response.headers[
            'Access-Control-Allow-Methods'] = 'POST, GET, PUT, DELETE'

    def post(self):
        self.response.content_type = 'application/json'
        send = True
        msg_error = ""
        if len(self.request.get('subject')) > 0:
            message = mail.EmailMessage(sender="Pogona Test Emailing <a.vara.1986@gmail.com>",
                                        subject=self.request.get('subject'))
        else:
            send = False
            msg_error = "No se envio asunto"
        if send is True and len(self.request.get('to')) > 0:
            message.to = self.request.get('to')
        else:
            send = False
            msg_error = "No se envio destinatario"
        if send is True and len(self.request.get('content')) > 0:
            message.html = self.request.get('content')
        else:
            send = False
            msg_error = "No se envio contenido"
        if send is True:
            message.send()
            obj = {
                'success': True,
                'msg': 'Envio ok',
            }
        else:
            obj = {
                'success': False,
                'msg': msg_error,
            }
        self.response.headers.add_header(
            'Access-Control-Allow-Origin', '*')
        self.response.write(json.encode(obj))


app = webapp2.WSGIApplication([
    ('/', MainPage),
    ('/create-mail', Createmail),
    ('/chunk-content', Chunkmail),
    ('/send-email-chunked', SendChunkMail),
    ('/send-mail', Sendmail)
], debug=True)
| 32.414815 | 98 | 0.497715 | 874 | 8,752 | 4.924485 | 0.136156 | 0.078067 | 0.075046 | 0.069703 | 0.797862 | 0.797862 | 0.758829 | 0.740706 | 0.721654 | 0.707714 | 0 | 0.006453 | 0.380256 | 8,752 | 269 | 99 | 32.535316 | 0.787058 | 0 | 0 | 0.731915 | 0 | 0 | 0.180187 | 0.063643 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042553 | false | 0 | 0.025532 | 0 | 0.106383 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
396be05b7f8cebf952d282e3c3765232e1718e9f | 218 | py | Python | modules/ranking/models/annotations_ranking.py | heolin123/funcrowd | 20167783de208394c09ed0429a5f02ec6dd79c42 | [
"MIT"
] | null | null | null | modules/ranking/models/annotations_ranking.py | heolin123/funcrowd | 20167783de208394c09ed0429a5f02ec6dd79c42 | [
"MIT"
] | 11 | 2019-11-12T23:26:45.000Z | 2021-06-10T17:37:23.000Z | modules/ranking/models/annotations_ranking.py | heolin123/funcrowd | 20167783de208394c09ed0429a5f02ec6dd79c42 | [
"MIT"
] | null | null | null | from modules.ranking.models.ranking import Ranking
from modules.ranking.query.base import ANNOTATIONS_COUNT_RANKING_BASE_QUERY
class AnnotationsRanking(Ranking):
    BASE_QUERY = ANNOTATIONS_COUNT_RANKING_BASE_QUERY
| 31.142857 | 75 | 0.866972 | 28 | 218 | 6.428571 | 0.392857 | 0.183333 | 0.266667 | 0.3 | 0.355556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087156 | 218 | 6 | 76 | 36.333333 | 0.904523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
39c7cc08c26ff4f0eabac39ed15941f62acbc26a | 35 | py | Python | airesources/ML-StarterBot-Python/tsmlstarterbot/__init__.py | xinnosuke/Halite-II | 99646c186bd6939460d451ef78d7af89fec6b1e1 | [
"MIT"
] | 232 | 2017-09-11T14:28:41.000Z | 2022-01-19T10:26:07.000Z | airesources/ML-StarterBot-Python/tsmlstarterbot/__init__.py | xinnosuke/Halite-II | 99646c186bd6939460d451ef78d7af89fec6b1e1 | [
"MIT"
] | 302 | 2017-09-13T04:46:25.000Z | 2018-09-06T22:14:06.000Z | airesources/ML-StarterBot-Python/tsmlstarterbot/__init__.py | xinnosuke/Halite-II | 99646c186bd6939460d451ef78d7af89fec6b1e1 | [
"MIT"
] | 151 | 2017-09-11T21:03:07.000Z | 2020-11-28T04:58:55.000Z | from tsmlstarterbot.bot import Bot
| 17.5 | 34 | 0.857143 | 5 | 35 | 6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
39eca4e5e6002f3be43814fdc46adb6ddc285c0b | 116 | py | Python | blog/views/api.py | zzlhr/django-blog | 5f85116c1be7eeeb85291ab882e1284ca5abe724 | [
"MIT"
] | 2 | 2018-11-02T01:37:46.000Z | 2019-04-20T02:35:28.000Z | blog/views/api.py | zzlhr/django-blog | 5f85116c1be7eeeb85291ab882e1284ca5abe724 | [
"MIT"
] | 1 | 2018-11-02T05:19:50.000Z | 2018-11-02T05:19:50.000Z | blog/views/api.py | zzlhr/django-blog | 5f85116c1be7eeeb85291ab882e1284ca5abe724 | [
"MIT"
] | null | null | null | from django.http import JsonResponse
def index(request):
    return JsonResponse({"name": "lhr", "sex": "nan"})
| 14.5 | 54 | 0.672414 | 14 | 116 | 5.571429 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163793 | 116 | 7 | 55 | 16.571429 | 0.804124 | 0 | 0 | 0 | 0 | 0 | 0.114035 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
f2ccf256d84da42a5a1b65a78da94f658fb6ea3d | 2,453 | py | Python | crfill/util/gradient_norm.py | node21challenge/rank2_node21_generation | 6c1708468b4ba48383c55bc8473ebcc5a83b8995 | [
"Apache-2.0"
] | null | null | null | crfill/util/gradient_norm.py | node21challenge/rank2_node21_generation | 6c1708468b4ba48383c55bc8473ebcc5a83b8995 | [
"Apache-2.0"
] | null | null | null | crfill/util/gradient_norm.py | node21challenge/rank2_node21_generation | 6c1708468b4ba48383c55bc8473ebcc5a83b8995 | [
"Apache-2.0"
] | 1 | 2022-02-11T12:42:21.000Z | 2022-02-11T12:42:21.000Z | from __future__ import annotations

import math
from typing import Iterable, Optional, Union  # names used in the annotations below

import torch


def weight_norm(parameters: Union[Iterable[torch.Tensor], torch.Tensor],
                norm_type: Optional[Union[float, int]] = 2) -> float:
    """Compute global norm of an iterable of parameters.

    The norm is computed over all tensors together, as if they were
    concatenated into a single vector. This code is based on
    torch.nn.utils.clip_grad_norm_().

    Args:
        parameters (Iterable[Tensor] or Tensor): an iterable of Tensors or a
            single Tensor
        norm_type (float or int, optional): type of the used p-norm. Can be
            ``'inf'`` for infinity norm.

    Returns:
        Total norm of the parameters (viewed as a single vector).
    """
    if isinstance(parameters, torch.Tensor):
        parameters = [parameters]
    norm_type = float(norm_type)
    if norm_type == math.inf:
        total_norm = max(p.data.abs().max() for p in parameters)
    else:
        total_norm = 0.
        for p in parameters:
            param_norm = p.data.norm(norm_type)
            total_norm += param_norm.item() ** norm_type
        total_norm = total_norm ** (1. / norm_type)
    return total_norm


def gradient_norm(parameters: Union[Iterable[torch.Tensor], torch.Tensor],
                  norm_type: Optional[Union[float, int]] = 2) -> float:
    """Compute global norm of an iterable of parameters.

    The norm is computed over all tensors together, as if they were
    concatenated into a single vector. This code is based on
    torch.nn.utils.clip_grad_norm_().

    Args:
        parameters (Iterable[Tensor] or Tensor): an iterable of Tensors or a
            single Tensor
        norm_type (float or int, optional): type of the used p-norm. Can be
            ``'inf'`` for infinity norm.

    Returns:
        Total norm of the parameters (viewed as a single vector).
    """
    if isinstance(parameters, torch.Tensor):
        parameters = [parameters]
    norm_type = float(norm_type)
    if norm_type == math.inf:
        total_norm = max(p.data.abs().max() for p in parameters)
    else:
        total_norm = 0
        parameters = [p for p in parameters if p.grad is not None and p.requires_grad]  # gradients can be None
        for p in parameters:
            param_norm = p.grad.detach().data.norm(2)
            total_norm += param_norm.item() ** 2
        total_norm = total_norm ** 0.5
        return total_norm
return total_norm | 40.213115 | 111 | 0.643294 | 344 | 2,453 | 4.462209 | 0.223837 | 0.087948 | 0.019544 | 0.052117 | 0.816938 | 0.788274 | 0.788274 | 0.754397 | 0.754397 | 0.754397 | 0 | 0.005006 | 0.26702 | 2,453 | 61 | 112 | 40.213115 | 0.848721 | 0.405218 | 0 | 0.575758 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0 | 0.090909 | 0 | 0.242424 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
842fcb4c0329d27ded6aaa565e0125e108d18c8c | 45 | py | Python | ipchange/moxa/__init__.py | cowanml/ipchange | 622bdfb456de1b02a604ab71c5a6c66e5a840679 | [
"BSD-3-Clause"
] | null | null | null | ipchange/moxa/__init__.py | cowanml/ipchange | 622bdfb456de1b02a604ab71c5a6c66e5a840679 | [
"BSD-3-Clause"
] | 2 | 2020-10-08T16:53:27.000Z | 2021-01-07T15:31:47.000Z | ipchange/moxa/__init__.py | cowanml/ipchange | 622bdfb456de1b02a604ab71c5a6c66e5a840679 | [
"BSD-3-Clause"
] | 2 | 2021-01-05T14:29:17.000Z | 2021-02-08T17:05:30.000Z | # flake8: noqa
from .moxa import MoxaHTTP_2_2 | 22.5 | 30 | 0.8 | 8 | 45 | 4.25 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 0.133333 | 45 | 2 | 30 | 22.5 | 0.794872 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8436993eff57bb66969ae7696ee4288195dad665 | 4,672 | py | Python | PyBank/main.py | myanaga87/python-challenge | 11575b534c4c3c9cc25225dd79bc9c8c81e7c9ff | [
"MIT"
] | null | null | null | PyBank/main.py | myanaga87/python-challenge | 11575b534c4c3c9cc25225dd79bc9c8c81e7c9ff | [
"MIT"
] | null | null | null | PyBank/main.py | myanaga87/python-challenge | 11575b534c4c3c9cc25225dd79bc9c8c81e7c9ff | [
"MIT"
] | null | null | null | #import module
import os
import csv
csvpath = os.path.join("C:/", "Users", "Yanagam", "Desktop", "DataScience", "NUCHI201801DATA4-Class-Repository-DATA", "MWS", "Homework", "03-Python", "Instructions", "PyBank", "raw_data", "budget_data_1.csv")
#set empty lists
months = []
revenue = []
#read the csv
with open(csvpath, newline='') as data1:
csvreader = csv.reader(data1, delimiter=',')
#skip the header row
next(csvreader, None)
for row in csvreader:
months.append(row[0])
revenue.append(float((row[1])))
#find the total num of months in the dataset
totalMonths = len(months)
#find the total amt of revenue gained over the entire period
totalRevenue = (sum(revenue))
#find the average change in revenue between months over the entire period
avgChange = round(totalRevenue / totalMonths,2)
#find the greatest increase in revenue (date and amt) over the entire period
greatestInc = float('-inf')
for i in revenue:
if i > greatestInc:
greatestInc = i
#find the greatest decrease in revenue (date and amt) over the entire period
greatestDec = 1195111
for i in revenue:
if i < greatestDec:
greatestDec = i
#set up template
print("Financial Analysis")
print("-------------------------")
print("Total Months: "+ str(totalMonths))
print("Total Revenue: "+ str(totalRevenue))
print("Average Revenue Change: "+ str(avgChange))
print("Greatest Increase in Revenue: "+ str(greatestInc))
print("Greatest Decrease in Revenue: "+ str(greatestDec))
#set up the output text file
output_file = os.path.join("C:/", "Users", "Yanagam", "Desktop", "DataScience", "python-challenge", "PyBank", "budget_data_Analysis.txt")
with open(output_file, 'w') as Analysis:
Analysis.write("Financial Analysis\n")
Analysis.write("-------------------------\n")
Analysis.write("Total Months: "+ str(totalMonths)+"\n")
Analysis.write("Total Revenue: "+ str(totalRevenue)+"\n")
Analysis.write("Average Revenue Change: "+ str(avgChange)+"\n")
Analysis.write("Greatest Increase in Revenue: "+ str(greatestInc)+"\n")
Analysis.write("Greatest Decrease in Revenue: "+ str(greatestDec))
#empty the lists
months = []
revenue = []
#run the same process for the next file
csvpath = os.path.join("C:/", "Users", "Yanagam", "Desktop", "DataScience", "NUCHI201801DATA4-Class-Repository-DATA", "MWS", "Homework", "03-Python", "Instructions", "PyBank", "raw_data", "budget_data_2.csv")
with open(csvpath, newline='') as data2:
csvreader = csv.reader(data2, delimiter=',')
#skip the header row
next(csvreader, None)
for row in csvreader:
months.append(row[0])
revenue.append(float((row[1])))
#find the total num of months in the dataset
totalMonths = len(months)
#find the total amt of revenue gained over the entire period
totalRevenue = (sum(revenue))
#find the average change in revenue between months over the entire period
avgChange = round(totalRevenue / totalMonths,2)
#find the greatest increase in revenue (date and amt) over the entire period
greatestInc = float('-inf')
for i in revenue:
if i > greatestInc:
greatestInc = i
#find the greatest decrease in revenue (date and amt) over the entire period
greatestDec = float(greatestInc)
for i in revenue:
if i < greatestDec:
greatestDec = i
#set up template
print("Financial Analysis")
print("-------------------------")
print("Total Months: "+ str(totalMonths))
print("Total Revenue: "+ str(totalRevenue))
print("Average Revenue Change: "+ str(avgChange))
print("Greatest Increase in Revenue: "+ str(greatestInc))
print("Greatest Decrease in Revenue: "+ str(greatestDec))
#set up the output text file
output_file = os.path.join("C:/", "Users", "Yanagam", "Desktop", "DataScience", "python-challenge", "PyBank", "budget_data_Analysis2.txt")
with open(output_file, 'w') as Analysis2:
Analysis2.write("Financial Analysis\n")
Analysis2.write("-------------------------\n")
Analysis2.write("Total Months: "+ str(totalMonths)+"\n")
Analysis2.write("Total Revenue: "+ str(totalRevenue)+"\n")
Analysis2.write("Average Revenue Change: "+ str(avgChange)+"\n")
Analysis2.write("Greatest Increase in Revenue: "+ str(greatestInc)+"\n")
Analysis2.write("Greatest Decrease in Revenue: "+ str(greatestDec))
#empty the lists
months = []
revenue = [] | 33.855072 | 209 | 0.637414 | 557 | 4,672 | 5.321364 | 0.179533 | 0.054656 | 0.035088 | 0.051282 | 0.869096 | 0.869096 | 0.807692 | 0.765857 | 0.735493 | 0.735493 | 0 | 0.012534 | 0.214469 | 4,672 | 138 | 210 | 33.855072 | 0.795095 | 0.187928 | 0 | 0.631579 | 0 | 0 | 0.298955 | 0.062981 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.026316 | 0 | 0.026316 | 0.184211 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ffcba10ca518a9ce53b2c30788893e3abafbf97a | 98 | py | Python | ABC_A/ABC196_A.py | ryosuke0825/atcoder_python | 185cdbe7db44ecca1aaf357858d16d31ce515ddb | [
"MIT"
] | null | null | null | ABC_A/ABC196_A.py | ryosuke0825/atcoder_python | 185cdbe7db44ecca1aaf357858d16d31ce515ddb | [
"MIT"
] | null | null | null | ABC_A/ABC196_A.py | ryosuke0825/atcoder_python | 185cdbe7db44ecca1aaf357858d16d31ce515ddb | [
"MIT"
] | null | null | null | a, b = map(int, input().split())
c, d = map(int, input().split())
print(max(a-c, a-d, b-c, b-d))
| 19.6 | 32 | 0.530612 | 22 | 98 | 2.363636 | 0.454545 | 0.230769 | 0.423077 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153061 | 98 | 4 | 33 | 24.5 | 0.626506 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ffdfca759f766aa951263cb79117736cf943e00c | 90 | py | Python | src/main/anovos/__main__.py | ziedbouf/anovos | 4cd149fe803f8cec7d49cf1d2ebff5abf6b362ce | [
"Apache-2.0"
] | 60 | 2021-11-15T22:30:57.000Z | 2022-03-31T08:13:27.000Z | src/main/anovos/__main__.py | ziedbouf/anovos | 4cd149fe803f8cec7d49cf1d2ebff5abf6b362ce | [
"Apache-2.0"
] | 62 | 2021-11-15T17:27:56.000Z | 2022-03-28T20:12:56.000Z | src/main/anovos/__main__.py | ziedbouf/anovos | 4cd149fe803f8cec7d49cf1d2ebff5abf6b362ce | [
"Apache-2.0"
] | 15 | 2021-11-17T19:39:47.000Z | 2022-03-30T18:20:33.000Z | import sys
from .workflow import run
run(config_path=sys.argv[1], run_type=sys.argv[2])
| 15 | 50 | 0.755556 | 17 | 90 | 3.882353 | 0.647059 | 0.212121 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.111111 | 90 | 5 | 51 | 18 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f23cb6958125b355748fe0a1905cb9bfc48bc7af | 50 | py | Python | msp430/bsl/target/__main__.py | pvrs12/python-msp430-tools | bd9b1d55b43f884368eaef9dc537330882058fd9 | [
"BSD-3-Clause"
] | 15 | 2017-10-18T01:56:40.000Z | 2022-02-28T04:33:01.000Z | msp430/bsl/target/__main__.py | pvrs12/python-msp430-tools | bd9b1d55b43f884368eaef9dc537330882058fd9 | [
"BSD-3-Clause"
] | 3 | 2017-07-24T13:41:04.000Z | 2019-11-08T19:13:54.000Z | msp430/bsl/target/__main__.py | pvrs12/python-msp430-tools | bd9b1d55b43f884368eaef9dc537330882058fd9 | [
"BSD-3-Clause"
] | 8 | 2017-10-11T14:05:29.000Z | 2022-03-22T02:13:01.000Z | import msp430.bsl.target
msp430.bsl.target.main()
| 16.666667 | 24 | 0.8 | 8 | 50 | 5 | 0.625 | 0.45 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12766 | 0.06 | 50 | 2 | 25 | 25 | 0.723404 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
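The msp430 record is the standard `__main__.py` delegation pattern: placing a module that imports and calls the package's real entry point makes the package runnable with `python -m msp430.bsl.target`. A minimal sketch of the idea with hypothetical names (the package layout in the comments is illustrative, not msp430's):

```python
# Hypothetical layout:
#   mypkg/__init__.py
#   mypkg/cli.py        <- defines the real main()
#   mypkg/__main__.py   <- this file; `python -m mypkg` executes it
def main():
    # Stand-in for mypkg.cli.main(); the msp430 file delegates the
    # same way with msp430.bsl.target.main().
    return "ran"

if __name__ == "__main__":
    main()
```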
4bba45a4c64d5014576bd2e5dce5a74f95a31a3b | 262 | py | Python | backend/code/iep/grid/routing.py | socek/iep | 793e35ca5304eef7b7dacb5dd8d486622f497759 | [
"Apache-2.0"
] | null | null | null | backend/code/iep/grid/routing.py | socek/iep | 793e35ca5304eef7b7dacb5dd8d486622f497759 | [
"Apache-2.0"
] | null | null | null | backend/code/iep/grid/routing.py | socek/iep | 793e35ca5304eef7b7dacb5dd8d486622f497759 | [
"Apache-2.0"
] | null | null | null | def panel_times_routing(routing):
routing.add("iep.grid.views.PanelTimesView", "panel_times", "/conventions/{convention_uid}/panel_times")
routing.add("iep.grid.views.PanelTimeView", "panel_time", "/conventions/{convention_uid}/panel_times/{panel_uid}")
| 65.5 | 118 | 0.770992 | 33 | 262 | 5.848485 | 0.424242 | 0.207254 | 0.176166 | 0.176166 | 0.580311 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061069 | 262 | 3 | 119 | 87.333333 | 0.784553 | 0 | 0 | 0 | 0 | 0 | 0.656489 | 0.576336 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
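The iep routing function registers two views, each keyed by a dotted view path, a route name, and a URL pattern with `{placeholder}` segments. A minimal sketch of a registry that such a function could be called against, assuming only that the `routing` object exposes an `add(view, name, pattern)` method as the iep code calls it (the `Routing` class below is mine, not from iep):

```python
class Routing:
    """Tiny stand-in for a routing registry with an add() method."""

    def __init__(self):
        self.routes = []

    def add(self, view, name, pattern):
        # Record (dotted view path, route name, URL pattern) triples.
        self.routes.append((view, name, pattern))


def panel_times_routing(routing):
    routing.add(
        "iep.grid.views.PanelTimesView",
        "panel_times",
        "/conventions/{convention_uid}/panel_times",
    )
    routing.add(
        "iep.grid.views.PanelTimeView",
        "panel_time",
        "/conventions/{convention_uid}/panel_times/{panel_uid}",
    )
```

Calling `panel_times_routing(Routing())` populates the registry with the collection route and the single-item route, mirroring a common list/detail URL pairing.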
4bbc213e96db7d4448ba2a68ba04317ae706be10 | 55,862 | py | Python | 053_BlazePose/01_float32/02_pose_landmark_upper_body_tflite2h5_weight_int_fullint_float16_quant.py | komori556/PINTO_model_zoo | 47652dfc0b066e9a835a3d7c3555c25cb4dd6004 | [
"MIT"
] | 1 | 2021-03-07T06:31:11.000Z | 2021-03-07T06:31:11.000Z | 053_BlazePose/01_float32/02_pose_landmark_upper_body_tflite2h5_weight_int_fullint_float16_quant.py | mejimaru/PINTO_model_zoo | 0703bf1837d0d0e39f1d0dc9d523b6d8fa8145bd | [
"MIT"
] | null | null | null | 053_BlazePose/01_float32/02_pose_landmark_upper_body_tflite2h5_weight_int_fullint_float16_quant.py | mejimaru/PINTO_model_zoo | 0703bf1837d0d0e39f1d0dc9d523b6d8fa8145bd | [
"MIT"
] | null | null | null | ### tensorflow==2.3.0
### https://ai.googleblog.com/2020/08/on-device-real-time-body-pose-tracking.html
### https://google.github.io/mediapipe/solutions/pose
### https://www.tensorflow.org/api_docs/python/tf/keras/Model
### https://www.tensorflow.org/lite/guide/ops_compatibility
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/Conv2D
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/DepthwiseConv2D
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/Add
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/ReLU
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/MaxPool2D
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/Reshape
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/Concatenate
### https://www.tensorflow.org/api_docs/python/tf/keras/layers/Layer
### How to initialize a convolution layer with an arbitrary kernel in Keras? https://stackoverrun.com/ja/q/12269118
### saved_model_cli show --dir saved_model_pose_landmark_upper_body/ --tag_set serve --signature_def serving_default
import tensorflow as tf
from tensorflow.keras import Model, Input
from tensorflow.keras.layers import Conv2D, DepthwiseConv2D, Add, ReLU, MaxPool2D, Reshape, Concatenate, Layer
from tensorflow.keras.initializers import Constant
from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2
import numpy as np
import sys
# tmp = np.load('weights/depthwise_conv2d_Kernel')
# print(tmp.shape)
# print(tmp)
# def init_f(shape, dtype=None):
# ker = np.load('weights/depthwise_conv2d_Kernel')
# print(shape)
# return ker
# sys.exit(0)
inputs = Input(shape=(256, 256, 3), name='input')
# Block_01
conv1_1 = Conv2D(filters=24, kernel_size=[3, 3], strides=[2, 2], padding="same", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_Bias')))(inputs)
depthconv1_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_Bias')))(conv1_1)
conv1_2 = Conv2D(filters=24, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_1_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_1_Bias')))(depthconv1_1)
add1_1 = Add()([conv1_1, conv1_2])
relu1_1 = ReLU()(add1_1)
# Block_02
depthconv2_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_1_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_1_Bias')))(relu1_1)
conv2_1 = Conv2D(filters=24, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_2_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_2_Bias')))(depthconv2_1)
add2_1 = Add()([relu1_1, conv2_1])
relu2_1 = ReLU()(add2_1)
# Block_03
depthconv3_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_2_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_2_Bias')))(relu2_1)
conv3_1 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_3_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_3_Bias')))(depthconv3_1)
maxpool3_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(relu2_1)
pad3_1 = tf.pad(maxpool3_1, paddings=tf.constant(np.load('weights2/channel_padding_Paddings')))
add3_1 = Add()([conv3_1, pad3_1])
relu3_1 = ReLU()(add3_1)
# Block_04
depthconv4_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_3_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_3_Bias')))(relu3_1)
conv4_1 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_4_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_4_Bias')))(depthconv4_1)
add4_1 = Add()([relu3_1, conv4_1])
relu4_1 = ReLU()(add4_1)
# Block_05
depthconv5_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_4_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_4_Bias')))(relu4_1)
conv5_1 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_5_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_5_Bias')))(depthconv5_1)
add5_1 = Add()([relu4_1, conv5_1])
relu5_1 = ReLU()(add5_1)
# Block_06
depthconv6_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_5_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_5_Bias')))(relu5_1)
conv6_1 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_6_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_6_Bias')))(depthconv6_1)
add6_1 = Add()([relu5_1, conv6_1])
relu6_1 = ReLU()(add6_1)
# Block_07
depthconv7_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_6_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_6_Bias')))(relu6_1)
conv7_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_7_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_7_Bias')))(depthconv7_1)
maxpool7_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(relu6_1)
pad7_1 = tf.pad(maxpool7_1, paddings=tf.constant(np.load('weights2/channel_padding_1_Paddings')))
add7_1 = Add()([conv7_1, pad7_1])
relu7_1 = ReLU()(add7_1)
# Block_08
depthconv8_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_7_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_7_Bias')))(relu7_1)
conv8_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_8_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_8_Bias')))(depthconv8_1)
add8_1 = Add()([relu7_1, conv8_1])
relu8_1 = ReLU()(add8_1)
# Block_09
depthconv9_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_8_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_8_Bias')))(relu8_1)
conv9_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_9_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_9_Bias')))(depthconv9_1)
add9_1 = Add()([relu8_1, conv9_1])
relu9_1 = ReLU()(add9_1)
# Block_10
depthconv10_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_9_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_9_Bias')))(relu9_1)
conv10_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_10_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_10_Bias')))(depthconv10_1)
add10_1 = Add()([relu9_1, conv10_1])
relu10_1 = ReLU()(add10_1)
# Block_11
depthconv11_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_10_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_10_Bias')))(relu10_1)
conv11_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_11_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_11_Bias')))(depthconv11_1)
add11_1 = Add()([relu10_1, conv11_1])
relu11_1 = ReLU()(add11_1)
# Block_12
depthconv12_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_11_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_11_Bias')))(relu11_1)
conv12_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_12_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_12_Bias')))(depthconv12_1)
maxpool12_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(relu11_1)
pad12_1 = tf.pad(maxpool12_1, paddings=tf.constant(np.load('weights2/channel_padding_2_Paddings')))
add12_1 = Add()([conv12_1, pad12_1])
relu12_1 = ReLU()(add12_1)
# Block_13
depthconv13_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_12_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_12_Bias')))(relu12_1)
conv13_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_13_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_13_Bias')))(depthconv13_1)
add13_1 = Add()([relu12_1, conv13_1])
relu13_1 = ReLU()(add13_1)
# Block_14
depthconv14_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_13_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_13_Bias')))(relu13_1)
conv14_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_14_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_14_Bias')))(depthconv14_1)
add14_1 = Add()([relu13_1, conv14_1])
relu14_1 = ReLU()(add14_1)
# Block_15
depthconv15_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_14_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_14_Bias')))(relu14_1)
conv15_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_15_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_15_Bias')))(depthconv15_1)
add15_1 = Add()([relu14_1, conv15_1])
relu15_1 = ReLU()(add15_1)
# Block_16
depthconv16_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_15_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_15_Bias')))(relu15_1)
conv16_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_16_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_16_Bias')))(depthconv16_1)
add16_1 = Add()([relu15_1, conv16_1])
relu16_1 = ReLU()(add16_1)
# Block_17
depthconv17_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_16_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_16_Bias')))(relu16_1)
conv17_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_17_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_17_Bias')))(depthconv17_1)
add17_1 = Add()([relu16_1, conv17_1])
relu17_1 = ReLU()(add17_1)
# Block_18
depthconv18_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_17_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_17_Bias')))(relu17_1)
conv18_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_18_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_18_Bias')))(depthconv18_1)
maxpool18_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(relu17_1)
pad18_1 = tf.pad(maxpool18_1, paddings=tf.constant(np.load('weights2/channel_padding_3_Paddings')))
add18_1 = Add()([conv18_1, pad18_1])
relu18_1 = ReLU()(add18_1)
# Block_19
depthconv19_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_18_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_18_Bias')))(relu18_1)
conv19_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_19_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_19_Bias')))(depthconv19_1)
add19_1 = Add()([relu18_1, conv19_1])
relu19_1 = ReLU()(add19_1)
# Block_20
depthconv20_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_19_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_19_Bias')))(relu19_1)
conv20_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_20_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_20_Bias')))(depthconv20_1)
add20_1 = Add()([relu19_1, conv20_1])
relu20_1 = ReLU()(add20_1)
# Block_21
depthconv21_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_20_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_20_Bias')))(relu20_1)
conv21_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_21_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_21_Bias')))(depthconv21_1)
add21_1 = Add()([relu20_1, conv21_1])
relu21_1 = ReLU()(add21_1)
# Block_22
depthconv22_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_21_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_21_Bias')))(relu21_1)
conv22_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_22_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_22_Bias')))(depthconv22_1)
add22_1 = Add()([relu21_1, conv22_1])
relu22_1 = ReLU()(add22_1)
# Block_23
depthconv23_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_22_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_22_Bias')))(relu22_1)
conv23_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_23_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_23_Bias')))(depthconv23_1)
add23_1 = Add()([relu22_1, conv23_1])
relu23_1 = ReLU()(add23_1)
# Block_24
depthconv24_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_23_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_23_Bias')))(relu23_1)
conv24_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_24_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_24_Bias')))(depthconv24_1)
add24_1 = Add()([relu23_1, conv24_1])
relu24_1 = ReLU()(add24_1)
# Block_25
depthconv25_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_24_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_24_Bias')))(relu24_1)
conv25_1 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_25_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_25_Bias')))(depthconv25_1)
resize25_1 = tf.image.resize(conv25_1, np.load('weights2/up_sampling2d_Size'))
depthconv25_2 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_25_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_25_Bias')))(relu17_1)
conv25_2 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_26_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_26_Bias')))(depthconv25_2)
add25_1 = Add()([resize25_1, conv25_2])
resize25_2 = tf.image.resize(add25_1, np.load('weights2/up_sampling2d_1_Size'))
depthconv25_3 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_26_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_26_Bias')))(relu11_1)
conv25_3 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_27_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_27_Bias')))(depthconv25_3)
add25_2 = Add()([resize25_2, conv25_3])
resize25_3 = tf.image.resize(add25_2, np.load('weights2/up_sampling2d_2_Size'))
depthconv25_4 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_27_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_27_Bias')))(relu6_1)
conv25_4 = Conv2D(filters=48, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_28_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_28_Bias')))(depthconv25_4)
add25_3 = Add()([resize25_3, conv25_4])
# Block_26
depthconv26_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_28_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_28_Bias')))(add25_3)
conv26_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_29_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_29_Bias')))(depthconv26_1)
maxpool26_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(add25_3)
pad26_1 = tf.pad(maxpool26_1, paddings=tf.constant(np.load('weights2/channel_padding_4_Paddings')))
add26_1 = Add()([conv26_1, pad26_1])
relu26_1 = ReLU()(add26_1)
# Block_27
depthconv27_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_29_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_29_Bias')))(relu26_1)
conv27_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_30_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_30_Bias')))(depthconv27_1)
add27_1 = Add()([relu26_1, conv27_1])
relu27_1 = ReLU()(add27_1)
# Block_28
depthconv28_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_30_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_30_Bias')))(relu27_1)
conv28_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_31_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_31_Bias')))(depthconv28_1)
add28_1 = Add()([relu27_1, conv28_1])
relu28_1 = ReLU()(add28_1)
# Block_29
depthconv29_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_31_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_31_Bias')))(relu28_1)
conv29_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_32_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_32_Bias')))(depthconv29_1)
add29_1 = Add()([relu28_1, conv29_1])
relu29_1 = ReLU()(add29_1)
# Block_30
depthconv30_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_32_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_32_Bias')))(relu29_1)
conv30_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_33_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_33_Bias')))(depthconv30_1)
add30_1 = Add()([relu29_1, conv30_1])
relu30_1 = ReLU()(add30_1)
# Block_31
depthconv31_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_33_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_33_Bias')))(relu11_1)
conv31_1 = Conv2D(filters=96, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_34_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_34_Bias')))(depthconv31_1)
add31_1 = Add()([relu30_1, conv31_1])
# Block_32
depthconv32_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_34_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_34_Bias')))(add31_1)
conv32_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_35_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_35_Bias')))(depthconv32_1)
maxpool32_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(add31_1)
pad32_1 = tf.pad(maxpool32_1, paddings=tf.constant(np.load('weights2/channel_padding_5_Paddings')))
add32_1 = Add()([conv32_1, pad32_1])
relu32_1 = ReLU()(add32_1)
# Block_33
depthconv33_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_35_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_35_Bias')))(relu32_1)
conv33_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_36_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_36_Bias')))(depthconv33_1)
add33_1 = Add()([relu32_1, conv33_1])
relu33_1 = ReLU()(add33_1)
# Block_34
depthconv34_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_36_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_36_Bias')))(relu33_1)
conv34_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_37_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_37_Bias')))(depthconv34_1)
add34_1 = Add()([relu33_1, conv34_1])
relu34_1 = ReLU()(add34_1)
# Block_35
depthconv35_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_37_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_37_Bias')))(relu34_1)
conv35_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_38_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_38_Bias')))(depthconv35_1)
add35_1 = Add()([relu34_1, conv35_1])
relu35_1 = ReLU()(add35_1)
# Block_36
depthconv36_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_38_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_38_Bias')))(relu35_1)
conv36_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_39_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_39_Bias')))(depthconv36_1)
add36_1 = Add()([relu35_1, conv36_1])
relu36_1 = ReLU()(add36_1)
# Block_37
depthconv37_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_39_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_39_Bias')))(relu36_1)
conv37_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_40_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_40_Bias')))(depthconv37_1)
add37_1 = Add()([relu36_1, conv37_1])
relu37_1 = ReLU()(add37_1)
# Block_38
depthconv38_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_40_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_40_Bias')))(relu17_1)
conv38_1 = Conv2D(filters=192, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_41_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_41_Bias')))(depthconv38_1)
add38_1 = Add()([conv38_1, relu37_1])
# Block_39
depthconv39_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_41_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_41_Bias')))(add38_1)
conv39_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_42_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_42_Bias')))(depthconv39_1)
maxpool39_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(add38_1)
pad39_1 = tf.pad(maxpool39_1, paddings=tf.constant(np.load('weights2/channel_padding_6_Paddings')))
add39_1 = Add()([conv39_1, pad39_1])
relu39_1 = ReLU()(add39_1)
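Block_39 differs from the plain residual blocks: its pooled shortcut carries fewer channels than the conv branch, so it is zero-padded along the channel axis before the Add. A numpy-only sketch of that shortcut, under assumed shapes (the real padding amounts come from 'weights2/channel_padding_6_Paddings'; 96 trailing channels is an assumption consistent with the 192 -> 288 channel change):

```python
import numpy as np

# The MaxPool branch keeps the incoming 192 channels while the conv
# branch emits 288, so the pooled tensor is padded with zeros in the
# channel (last) axis so the two branches can be added elementwise.
rng = np.random.default_rng(0)
pooled = rng.random((1, 16, 16, 192), dtype=np.float32)

paddings = [(0, 0), (0, 0), (0, 0), (0, 96)]  # pad channels only
padded = np.pad(pooled, paddings)

assert padded.shape == (1, 16, 16, 288)
assert np.array_equal(padded[..., :192], pooled)  # data untouched
assert not padded[..., 192:].any()                # new channels are zero
```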
# Block_40
depthconv40_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_42_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_42_Bias')))(relu39_1)
conv40_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_43_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_43_Bias')))(depthconv40_1)
add40_1 = Add()([relu39_1, conv40_1])
relu40_1 = ReLU()(add40_1)
# Block_41
depthconv41_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_43_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_43_Bias')))(relu40_1)
conv41_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_44_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_44_Bias')))(depthconv41_1)
add41_1 = Add()([relu40_1, conv41_1])
relu41_1 = ReLU()(add41_1)
# Block_42
depthconv42_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_44_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_44_Bias')))(relu41_1)
conv42_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_45_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_45_Bias')))(depthconv42_1)
add42_1 = Add()([relu41_1, conv42_1])
relu42_1 = ReLU()(add42_1)
# Block_43
depthconv43_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_45_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_45_Bias')))(relu42_1)
conv43_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_46_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_46_Bias')))(depthconv43_1)
add43_1 = Add()([relu42_1, conv43_1])
relu43_1 = ReLU()(add43_1)
# Block_44
depthconv44_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_46_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_46_Bias')))(relu43_1)
conv44_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_47_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_47_Bias')))(depthconv44_1)
add44_1 = Add()([relu43_1, conv44_1])
relu44_1 = ReLU()(add44_1)
# Block_45
depthconv45_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_47_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_47_Bias')))(relu44_1)
conv45_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_48_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_48_Bias')))(depthconv45_1)
add45_1 = Add()([relu44_1, conv45_1])
relu45_1 = ReLU()(add45_1)
# Block_46
depthconv46_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_48_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_48_Bias')))(relu24_1)
conv46_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_49_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_49_Bias')))(depthconv46_1)
add46_1 = Add()([conv46_1, relu45_1])
# Block_47
depthconv47_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_49_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_49_Bias')))(add46_1)
conv47_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_50_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_50_Bias')))(depthconv47_1)
maxpool47_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(add46_1)
add47_1 = Add()([conv47_1, maxpool47_1])
relu47_1 = ReLU()(add47_1)
# Block_48
depthconv48_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_50_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_50_Bias')))(relu47_1)
conv48_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_51_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_51_Bias')))(depthconv48_1)
add48_1 = Add()([conv48_1, relu47_1])
relu48_1 = ReLU()(add48_1)
# Block_49
depthconv49_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_51_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_51_Bias')))(relu48_1)
conv49_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_52_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_52_Bias')))(depthconv49_1)
add49_1 = Add()([conv49_1, relu48_1])
relu49_1 = ReLU()(add49_1)
# Block_50
depthconv50_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_52_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_52_Bias')))(relu49_1)
conv50_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_53_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_53_Bias')))(depthconv50_1)
add50_1 = Add()([conv50_1, relu49_1])
relu50_1 = ReLU()(add50_1)
# Block_51
depthconv51_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_53_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_53_Bias')))(relu50_1)
conv51_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_54_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_54_Bias')))(depthconv51_1)
add51_1 = Add()([conv51_1, relu50_1])
relu51_1 = ReLU()(add51_1)
# Block_52
depthconv52_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_54_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_54_Bias')))(relu51_1)
conv52_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_55_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_55_Bias')))(depthconv52_1)
add52_1 = Add()([conv52_1, relu51_1])
relu52_1 = ReLU()(add52_1)
# Block_53
depthconv53_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_55_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_55_Bias')))(relu52_1)
conv53_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_56_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_56_Bias')))(depthconv53_1)
add53_1 = Add()([conv53_1, relu52_1])
relu53_1 = ReLU()(add53_1)
# Block_54
depthconv54_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_56_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_56_Bias')))(relu53_1)
conv54_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_57_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_57_Bias')))(depthconv54_1)
add54_1 = Add()([conv54_1, relu53_1])
relu54_1 = ReLU()(add54_1)
# Block_55
depthconv55_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[2, 2], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_57_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_57_Bias')))(relu54_1)
conv55_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_58_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_58_Bias')))(depthconv55_1)
maxpool55_1 = MaxPool2D(pool_size=[2, 2], strides=[2, 2], padding='valid')(relu54_1)
add55_1 = Add()([conv55_1, maxpool55_1])
relu55_1 = ReLU()(add55_1)
# Block_56
depthconv56_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_58_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_58_Bias')))(relu55_1)
conv56_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_59_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_59_Bias')))(depthconv56_1)
add56_1 = Add()([conv56_1, relu55_1])
relu56_1 = ReLU()(add56_1)
# Block_57
depthconv57_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_59_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_59_Bias')))(relu56_1)
conv57_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_60_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_60_Bias')))(depthconv57_1)
add57_1 = Add()([conv57_1, relu56_1])
relu57_1 = ReLU()(add57_1)
# Block_58
depthconv58_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_60_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_60_Bias')))(relu57_1)
conv58_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_61_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_61_Bias')))(depthconv58_1)
add58_1 = Add()([conv58_1, relu57_1])
relu58_1 = ReLU()(add58_1)
# Block_59
depthconv59_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_61_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_61_Bias')))(relu58_1)
conv59_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_62_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_62_Bias')))(depthconv59_1)
add59_1 = Add()([conv59_1, relu58_1])
relu59_1 = ReLU()(add59_1)
# Block_60
depthconv60_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_62_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_62_Bias')))(relu59_1)
conv60_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_63_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_63_Bias')))(depthconv60_1)
add60_1 = Add()([conv60_1, relu59_1])
relu60_1 = ReLU()(add60_1)
# Block_61
depthconv61_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_63_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_63_Bias')))(relu60_1)
conv61_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_64_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_64_Bias')))(depthconv61_1)
add61_1 = Add()([conv61_1, relu60_1])
relu61_1 = ReLU()(add61_1)
# Block_62
depthconv62_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_64_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_64_Bias')))(relu61_1)
conv62_1 = Conv2D(filters=288, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv2d_65_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_65_Bias')))(depthconv62_1)
add62_1 = Add()([conv62_1, relu61_1])
relu62_1 = ReLU()(add62_1)
# Block_63
depthconv63_1 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_66_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_66_Bias')))(add25_3)
conv63_1 = Conv2D(filters=8, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_67_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_67_Bias')))(depthconv63_1)
resize63_1 = tf.image.resize(conv63_1, np.load('weights2/up_sampling2d_3_Size'))
depthconv63_2 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_65_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_65_Bias')))(relu2_1)
conv63_2 = Conv2D(filters=8, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_66_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_66_Bias')))(depthconv63_2)
add63_1 = Add()([resize63_1, conv63_2])
depthconv63_3 = DepthwiseConv2D(kernel_size=[3, 3], strides=[1, 1], padding="same", depth_multiplier=1, dilation_rate=[1, 1],
depthwise_initializer=Constant(np.load('weights2/depthwise_conv2d_67_Kernel')),
bias_initializer=Constant(np.load('weights2/depthwise_conv2d_67_Bias')))(add63_1)
conv63_3 = Conv2D(filters=8, kernel_size=[1, 1], strides=[1, 1], padding="valid", dilation_rate=[1, 1], activation='relu',
kernel_initializer=Constant(np.load('weights2/conv2d_68_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv2d_68_Bias')))(depthconv63_3)
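Block_63 is the decoder merge: a low-resolution feature map is resized up to the skip connection's spatial size and the two are added. A shape-level sketch with illustrative sizes (tf.image.resize uses bilinear interpolation by default; nearest-neighbour repeat is shown here only to keep the example numpy-only):

```python
import numpy as np

# Low-resolution branch and higher-resolution skip branch, both with
# the same channel count (8, as in the conv2d_67/conv2d_66 outputs).
low = np.ones((1, 32, 32, 8), dtype=np.float32)
skip = np.ones((1, 128, 128, 8), dtype=np.float32)

scale = skip.shape[1] // low.shape[1]            # 4x upsampling
up = low.repeat(scale, axis=1).repeat(scale, axis=2)

merged = up + skip                               # elementwise Add
assert merged.shape == skip.shape
```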
# Final Block_99
conv99_1 = Conv2D(filters=1, kernel_size=[3, 3], strides=[1, 1], padding="same", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/output_segmentation_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/output_segmentation_Bias')), name='output_segmentation')(conv63_3)
conv99_2 = Conv2D(filters=1, kernel_size=[2, 2], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/conv_poseflag_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/conv_poseflag_Bias')))(relu62_1)
sigm99_1 = tf.math.sigmoid(conv99_2, name='output_poseflag')
# reshape99_1 = tf.reshape(sigm99_1, (1, 1), name='output_poseflag')
conv99_3 = Conv2D(filters=124, kernel_size=[2, 2], strides=[1, 1], padding="valid", dilation_rate=[1, 1],
kernel_initializer=Constant(np.load('weights2/convld_3d_Kernel').transpose(1,2,3,0)),
bias_initializer=Constant(np.load('weights2/convld_3d_Bias')))(relu62_1)
reshape99_2 = tf.reshape(conv99_3, (1, 124), name='ld_3d')
model = Model(inputs=inputs, outputs=[conv99_1, sigm99_1, reshape99_2])
model.summary()
tf.saved_model.save(model, 'saved_model_pose_landmark_upper_body')
model.save('pose_landmark_upper_body.h5')
full_model = tf.function(lambda inputs: model(inputs))
full_model = full_model.get_concrete_function(inputs=tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))
frozen_func = convert_variables_to_constants_v2(full_model, lower_control_flow=False)
frozen_func.graph.as_graph_def()
tf.io.write_graph(graph_or_graph_def=frozen_func.graph,
logdir=".",
name="pose_landmark_upper_body_256x256_float32.pb",
as_text=False)
# No Quantization - Input/Output=float32
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open('pose_landmark_upper_body_256x256_float32.tflite', 'wb') as w:
w.write(tflite_model)
print("tflite convert complete! - pose_landmark_upper_body_256x256_float32.tflite")
# Weight Quantization - Input/Output=float32
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # OPTIMIZE_FOR_SIZE is a deprecated alias of DEFAULT
tflite_model = converter.convert()
with open('pose_landmark_upper_body_256x256_weight_quant.tflite', 'wb') as w:
w.write(tflite_model)
print("Weight Quantization complete! - pose_landmark_upper_body_256x256_weight_quant.tflite")
raw_test_data = np.load('calibration_data_img_person.npy', allow_pickle=True)
def representative_dataset_gen():
    for image in raw_test_data:
        image = tf.image.resize(image, (256, 256))
        image = image[np.newaxis, :, :, :]
        yield [image]
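The calibration generator's only contract is to yield lists holding one float32 batch matching the model's input shape. A minimal numpy-only sketch of the same contract (random noise stands in for the real calibration images, which should of course be representative data):

```python
import numpy as np

# Dummy stand-in for representative_dataset_gen: yields [batch] items
# of shape (1, 256, 256, 3), dtype float32, as the converter expects.
def dummy_representative_dataset(num_samples=8):
    rng = np.random.default_rng(0)
    for _ in range(num_samples):
        image = rng.random((256, 256, 3), dtype=np.float32)
        yield [image[np.newaxis, :, :, :]]

samples = list(dummy_representative_dataset())
assert len(samples) == 8
assert samples[0][0].shape == (1, 256, 256, 3)
assert samples[0][0].dtype == np.float32
```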
# Integer Quantization - Input/Output=float32
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset_gen
tflite_quant_model = converter.convert()
with open('pose_landmark_upper_body_256x256_integer_quant.tflite', 'wb') as w:
w.write(tflite_quant_model)
print("Integer Quantization complete! - pose_landmark_upper_body_256x256_integer_quant.tflite")
# Full Integer Quantization - Input/Output=uint8
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
converter.representative_dataset = representative_dataset_gen
tflite_quant_model = converter.convert()
with open('pose_landmark_upper_body_256x256_full_integer_quant.tflite', 'wb') as w:
w.write(tflite_quant_model)
print("Full Integer Quantization complete! - pose_landmark_upper_body_256x256_full_integer_quant.tflite")
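With uint8 I/O, callers must quantize inputs and dequantize outputs using the scale/zero-point pair reported by the interpreter. A numpy sketch of that round-trip; the scale and zero_point values here are made up (real ones come from interpreter.get_input_details()[0]['quantization']):

```python
import numpy as np

# Hypothetical quantization parameters: scale ~ 1/255, zero_point 0.
scale, zero_point = 0.0039215687, 0
real = np.array([0.0, 0.5, 1.0], dtype=np.float32)

# real -> uint8 (quantize), then uint8 -> float (dequantize).
q = np.clip(np.round(real / scale) + zero_point, 0, 255).astype(np.uint8)
dq = (q.astype(np.float32) - zero_point) * scale

assert q.dtype == np.uint8
assert np.allclose(dq, real, atol=scale)  # error bounded by one step
```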
# Float16 Quantization - Input/Output=float32
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('pose_landmark_upper_body_256x256_float16_quant.tflite', 'wb') as w:
w.write(tflite_quant_model)
print("Float16 Quantization complete! - pose_landmark_upper_body_256x256_float16_quant.tflite")
# EdgeTPU
import subprocess
result = subprocess.check_output(["edgetpu_compiler", "-s", "pose_landmark_upper_body_256x256_full_integer_quant.tflite"])
print(result.decode('utf-8'))  # check_output returns bytes; decode for readable compiler log
| 67.222623 | 127 | 0.70821 | 7,819 | 55,862 | 4.77107 | 0.070725 | 0.018228 | 0.109208 | 0.169253 | 0.781933 | 0.77893 | 0.773489 | 0.760193 | 0.747889 | 0.625975 | 0 | 0.098626 | 0.137661 | 55,862 | 830 | 128 | 67.303614 | 0.675787 | 0.037807 | 0 | 0.031797 | 0 | 0 | 0.188214 | 0.170033 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00159 | false | 0 | 0.012719 | 0 | 0.014308 | 0.009539 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
29d9c7e131b115d2fbfad30e830ab74096c8f7e3 | 30 | py | Python | models/__init__.py | Wangxy2180/Spike-FlowNet | 1531c2422fcc9286b07dee11c7fd7995e6696800 | [
"MIT"
] | 53 | 2020-07-21T01:13:08.000Z | 2022-02-22T08:55:41.000Z | models/__init__.py | Wangxy2180/Spike-FlowNet | 1531c2422fcc9286b07dee11c7fd7995e6696800 | [
"MIT"
] | 7 | 2020-08-12T15:36:36.000Z | 2022-02-22T21:21:34.000Z | models/__init__.py | Wangxy2180/Spike-FlowNet | 1531c2422fcc9286b07dee11c7fd7995e6696800 | [
"MIT"
] | 16 | 2020-09-18T06:30:41.000Z | 2022-02-15T18:07:18.000Z | from .FlowNetS_spike import *
| 15 | 29 | 0.8 | 4 | 30 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
29e2c407cc9a0daed73d24bcadff9e6e8746a60d | 108 | py | Python | main.py | KooTQ/TitanicKaggle | 09cb13547498a90d1607d851291a9edbfe91633c | [
"MIT"
] | null | null | null | main.py | KooTQ/TitanicKaggle | 09cb13547498a90d1607d851291a9edbfe91633c | [
"MIT"
] | null | null | null | main.py | KooTQ/TitanicKaggle | 09cb13547498a90d1607d851291a9edbfe91633c | [
"MIT"
] | null | null | null |
def titanic_data_present():
print('Lets go!')
if __name__ == '__main__':
titanic_data_present()
| 12 | 27 | 0.666667 | 13 | 108 | 4.615385 | 0.769231 | 0.366667 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194444 | 108 | 8 | 28 | 13.5 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0.150943 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8a6890ab027c49e84e9f8f457fa80fcd79e8cf19 | 3,546 | py | Python | custom_components/ble_monitor/test/test_xiaogui_parser.py | avidit/home-assistant-config | c4ea82e7ea75f93ebfb1700b39df5ff9787843b7 | [
"MIT"
] | 383 | 2019-11-25T17:13:04.000Z | 2020-11-12T08:47:35.000Z | custom_components/ble_monitor/test/test_xiaogui_parser.py | scrambledleek/ha_hacs_ble_monitor | abafc7a34312e695667325dbe97c309f20dd1527 | [
"MIT"
] | 142 | 2019-11-25T21:29:27.000Z | 2020-11-12T16:58:01.000Z | custom_components/ble_monitor/test/test_xiaogui_parser.py | scrambledleek/ha_hacs_ble_monitor | abafc7a34312e695667325dbe97c309f20dd1527 | [
"MIT"
] | 65 | 2019-11-26T11:53:44.000Z | 2020-11-11T16:37:52.000Z | """The tests for the Xiaogui ble_parser."""
from ble_monitor.ble_parser import BleParser
class TestXiaogui:
"""Tests for the Xiaogui parser"""
def test_xiaogui_tzc4_stab(self):
"""Test Xiaogui parser for Xiaogui TZC4 (stabilized weight)."""
data_string = "043e1d0201030094e0e5295a5f1110ffc0a30276138b0002215f5a29e5e094bd"
data = bytes(bytearray.fromhex(data_string))
# pylint: disable=unused-variable
ble_parser = BleParser()
sensor_msg, tracker_msg = ble_parser.parse_data(data)
assert sensor_msg["firmware"] == "Xiaogui"
assert sensor_msg["type"] == "TZC4"
assert sensor_msg["mac"] == "5F5A29E5E094"
assert sensor_msg["packet"] == 41761
assert sensor_msg["data"]
assert sensor_msg["non-stabilized weight"] == 63.0
assert sensor_msg["weight"] == 63.0
assert sensor_msg["impedance"] == 500.3
assert sensor_msg["stabilized"] == 1
assert sensor_msg["rssi"] == -67
def test_xiaogui_tzc4_non_stab(self):
"""Test Xiaogui parser for Xiaogui TZC4 (not stabilized weight)."""
data_string = "043e1d0201030094e0e5295a5f1110ffc05d008c00000002205f5a29e5e094bf"
data = bytes(bytearray.fromhex(data_string))
# pylint: disable=unused-variable
ble_parser = BleParser()
sensor_msg, tracker_msg = ble_parser.parse_data(data)
assert sensor_msg["firmware"] == "Xiaogui"
assert sensor_msg["type"] == "TZC4"
assert sensor_msg["mac"] == "5F5A29E5E094"
assert sensor_msg["packet"] == 23840
assert sensor_msg["data"]
assert sensor_msg["non-stabilized weight"] == 14.0
assert "weight" not in sensor_msg
assert "impedance" not in sensor_msg
assert sensor_msg["stabilized"] == 0
assert sensor_msg["rssi"] == -65
def test_xiaogui_maxxmee_qjj_stab(self):
"""Test Xiaogui parser for MaxxMee Mod QJ-J (stabilized weight)."""
data_string = "043e1d0201030094e0e5295a5f1110ffc07d2c4700000a01255f5a29e5e094bd"
data = bytes(bytearray.fromhex(data_string))
# pylint: disable=unused-variable
ble_parser = BleParser()
sensor_msg, tracker_msg = ble_parser.parse_data(data)
assert sensor_msg["firmware"] == "Xiaogui"
assert sensor_msg["type"] == "QJ-J"
assert sensor_msg["mac"] == "5F5A29E5E094"
assert sensor_msg["packet"] == 32037
assert sensor_msg["data"]
assert sensor_msg["non-stabilized weight"] == 113.35
assert sensor_msg["weight"] == 113.35
assert sensor_msg["stabilized"] == 1
assert sensor_msg["rssi"] == -67
def test_xiaogui_maxxmee_qjj_non_stab(self):
"""Test Xiaogui parser for MaxxMee Mod QJ-J (not stabilized weight)."""
data_string = "043e1d0201030094e0e5295a5f1110ffc024000000000a01245f5a29e5e094bd"
data = bytes(bytearray.fromhex(data_string))
# pylint: disable=unused-variable
ble_parser = BleParser()
sensor_msg, tracker_msg = ble_parser.parse_data(data)
assert sensor_msg["firmware"] == "Xiaogui"
assert sensor_msg["type"] == "QJ-J"
assert sensor_msg["mac"] == "5F5A29E5E094"
assert sensor_msg["packet"] == 9252
assert sensor_msg["data"]
assert sensor_msg["non-stabilized weight"] == 0.0
assert "weight" not in sensor_msg
assert "impedance" not in sensor_msg
assert sensor_msg["stabilized"] == 0
assert sensor_msg["rssi"] == -67
| 41.717647 | 88 | 0.656514 | 399 | 3,546 | 5.619048 | 0.165414 | 0.172614 | 0.234166 | 0.067797 | 0.788136 | 0.746209 | 0.713649 | 0.710972 | 0.676182 | 0.676182 | 0 | 0.103372 | 0.230682 | 3,546 | 84 | 89 | 42.214286 | 0.718475 | 0.12493 | 0 | 0.639344 | 0 | 0 | 0.208211 | 0.083415 | 0 | 0 | 0 | 0 | 0.639344 | 1 | 0.065574 | false | 0 | 0.016393 | 0 | 0.098361 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8aaede5b971cb26479ca586d8b03c6568449e36e | 16 | py | Python | webrepl_cfg.py | HaBaLeS/MqttPixel | 26a6245a5ff787a3165b04d8bde89c39043de25d | [
"Beerware"
] | null | null | null | webrepl_cfg.py | HaBaLeS/MqttPixel | 26a6245a5ff787a3165b04d8bde89c39043de25d | [
"Beerware"
] | null | null | null | webrepl_cfg.py | HaBaLeS/MqttPixel | 26a6245a5ff787a3165b04d8bde89c39043de25d | [
"Beerware"
] | null | null | null | PASS = '1234'
| 5.333333 | 14 | 0.5 | 2 | 16 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.363636 | 0.3125 | 16 | 2 | 15 | 8 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
8ac67d99c7103d403eb5726c83136772892d56ad | 34 | py | Python | UponorJnap/__init__.py | asev/py-uponor-jnap | a3fbc842bcfc4833bb4eb13945f3088d9d470317 | [
"MIT"
] | 1 | 2020-11-26T16:12:16.000Z | 2020-11-26T16:12:16.000Z | UponorJnap/__init__.py | asev/py-uponor-jnap | a3fbc842bcfc4833bb4eb13945f3088d9d470317 | [
"MIT"
] | 1 | 2021-09-17T17:50:15.000Z | 2021-09-17T17:50:15.000Z | UponorJnap/__init__.py | asev/py-uponor-jnap | a3fbc842bcfc4833bb4eb13945f3088d9d470317 | [
"MIT"
] | 2 | 2020-12-12T20:21:54.000Z | 2021-02-07T20:54:24.000Z | from .UponorJnap import UponorJnap | 34 | 34 | 0.882353 | 4 | 34 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 34 | 1 | 34 | 34 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
76d8d833feb92392f5ccd140d930e2e075df0a8c | 5,105 | py | Python | oxyrhopus/datasets.py | bhoeckendorf/oxyrhopus | 0abc383fd2b3fcbe38a797e28198696178a19f8e | [
"Apache-2.0"
] | null | null | null | oxyrhopus/datasets.py | bhoeckendorf/oxyrhopus | 0abc383fd2b3fcbe38a797e28198696178a19f8e | [
"Apache-2.0"
] | null | null | null | oxyrhopus/datasets.py | bhoeckendorf/oxyrhopus | 0abc383fd2b3fcbe38a797e28198696178a19f8e | [
"Apache-2.0"
] | null | null | null | import os
from typing import Tuple, Optional
from dataclasses import dataclass
import torchvision
import torchvision.transforms as transforms
class CIFAR10:
def __init__(
self,
name: str = "CIFAR-10",
normalize_mean: Tuple[float,float,float] = (0.49139968, 0.48215841, 0.44653091),
normalize_std: Tuple[float,float,float] = (0.24703223, 0.24348513, 0.26158784),
normalize_inplace: bool = False,
resize_to: Optional[Tuple[int,int]] = None,
data_root: Optional[str] = None
):
if (data_root is None or len(data_root) == 0) and "MYDIR" in os.environ and os.path.exists(os.environ["MYDIR"]):
data_root = f"{os.environ['MYDIR']}/data/standard/pytorch"
else:
data_root = None
transform = [transforms.Resize(resize_to)] if resize_to is not None else []
transform.extend([
transforms.ToTensor(),
transforms.Normalize(normalize_mean, normalize_std, normalize_inplace)
])
transform = transforms.Compose(transform)
self.trnset = torchvision.datasets.CIFAR10(root=data_root, train=True, download=True, transform=transform)
self.tstset = torchvision.datasets.CIFAR10(root=data_root, train=False, download=True, transform=transform)
self.classes = tuple(self.trnset.classes)
class CIFAR100:
def __init__(
self,
name: str = "CIFAR-100",
normalize_mean: Tuple[float,float,float] = (0.50707516, 0.48654887, 0.44091784),
normalize_std: Tuple[float,float,float] = (0.26733429, 0.25643846, 0.27615047),
normalize_inplace: bool = False,
resize_to: Optional[Tuple[int,int]] = None,
data_root: Optional[str] = None
):
        if data_root is None or len(data_root) == 0:
            # Fall back to $MYDIR only when no explicit root was given; a
            # caller-supplied data_root is kept as-is.
            if "MYDIR" in os.environ and os.path.exists(os.environ["MYDIR"]):
                data_root = f"{os.environ['MYDIR']}/data/standard/pytorch"
            else:
                data_root = None
transform = [transforms.Resize(resize_to)] if resize_to is not None else []
transform.extend([
transforms.ToTensor(),
transforms.Normalize(normalize_mean, normalize_std, normalize_inplace)
])
transform = transforms.Compose(transform)
self.trnset = torchvision.datasets.CIFAR100(root=data_root, train=True, download=True, transform=transform)
self.tstset = torchvision.datasets.CIFAR100(root=data_root, train=False, download=True, transform=transform)
self.classes = tuple(self.trnset.classes)
class MNIST:
def __init__(
self,
name: str = "MNIST",
normalize_mean: Tuple[float] = (0.13066048, ),
normalize_std: Tuple[float] = (0.30810781, ),
normalize_inplace: bool = False,
resize_to: Optional[Tuple[int,int]] = None,
data_root: Optional[str] = None
):
        if data_root is None or len(data_root) == 0:
            # Fall back to $MYDIR only when no explicit root was given; a
            # caller-supplied data_root is kept as-is.
            if "MYDIR" in os.environ and os.path.exists(os.environ["MYDIR"]):
                data_root = f"{os.environ['MYDIR']}/data/standard/pytorch"
            else:
                data_root = None
transform = [transforms.Resize(resize_to)] if resize_to is not None else []
transform.extend([
transforms.ToTensor(),
transforms.Normalize(normalize_mean, normalize_std, normalize_inplace)
])
transform = transforms.Compose(transform)
self.trnset = torchvision.datasets.MNIST(root=data_root, train=True, download=True, transform=transform)
self.tstset = torchvision.datasets.MNIST(root=data_root, train=False, download=True, transform=transform)
self.classes = tuple(self.trnset.classes)
@dataclass
class CIFAR10Config:
_target_: str = "oxyrhopus.CIFAR10"
name: str = "CIFAR-10"
normalize_mean: Tuple[float,float,float] = (0.49139968, 0.48215841, 0.44653091)
normalize_std: Tuple[float,float,float] = (0.24703223, 0.24348513, 0.26158784)
normalize_inplace: bool = False
resize_to: Optional[Tuple[int,int]] = None
data_root: Optional[str] = None
@dataclass
class CIFAR100Config:
_target_: str = "oxyrhopus.CIFAR100"
name: str = "CIFAR-100"
normalize_mean: Tuple[float,float,float] = (0.50707516, 0.48654887, 0.44091784)
normalize_std: Tuple[float,float,float] = (0.26733429, 0.25643846, 0.27615047)
normalize_inplace: bool = False
resize_to: Optional[Tuple[int, int]] = None
data_root: Optional[str] = None
@dataclass
class MNISTConfig:
_target_: str = "oxyrhopus.MNIST"
name: str = "MNIST"
normalize_mean: Tuple[float] = (0.13066048, )
normalize_std: Tuple[float] = (0.30810781, )
normalize_inplace: bool = False
resize_to: Optional[Tuple[int,int]] = None
data_root: Optional[str] = None
def get_datasets():
return {
"cifar-10": CIFAR10Config,
"cifar-100": CIFAR100Config,
"mnist": MNISTConfig
}
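`get_datasets()` returns Hydra-style config dataclasses whose `_target_` field names the class to build. A minimal standalone sketch of how such a registry entry is typically consumed — the config and target here are made up, and `instantiate` is a simplified stand-in for Hydra's own:

```python
import importlib
from dataclasses import asdict, dataclass


@dataclass
class GreeterConfig:
    _target_: str = "builtins.str"  # any importable "module.attr" path
    object: str = "hello"


def instantiate(cfg):
    """Resolve cfg._target_ to a callable and call it with the other fields."""
    module_name, _, attr = cfg._target_.rpartition(".")
    target = getattr(importlib.import_module(module_name), attr)
    kwargs = {k: v for k, v in asdict(cfg).items() if k != "_target_"}
    return target(**kwargs)


print(instantiate(GreeterConfig()))  # -> hello
```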
| 38.969466 | 120 | 0.635651 | 595 | 5,105 | 5.315966 | 0.141176 | 0.060702 | 0.037939 | 0.050585 | 0.888713 | 0.885236 | 0.87828 | 0.865634 | 0.865634 | 0.865634 | 0 | 0.078247 | 0.248972 | 5,105 | 130 | 121 | 39.269231 | 0.74674 | 0 | 0 | 0.588785 | 0 | 0 | 0.053869 | 0.025269 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037383 | false | 0 | 0.046729 | 0.009346 | 0.345794 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
76e8ea51a1dbbc617498f000fe93e0bd04dcb2ac | 17,483 | py | Python | oauth2_provider/tests/test_scopes.py | sfx/django-oauth-toolkit | 08ce1d63bfad16905a7d9f758591c0e8c65706ec | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2016-10-17T15:52:39.000Z | 2016-10-17T15:52:39.000Z | oauth2_provider/tests/test_scopes.py | sfx/django-oauth-toolkit | 08ce1d63bfad16905a7d9f758591c0e8c65706ec | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | oauth2_provider/tests/test_scopes.py | sfx/django-oauth-toolkit | 08ce1d63bfad16905a7d9f758591c0e8c65706ec | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2016-01-20T18:11:05.000Z | 2016-01-20T18:11:05.000Z | from __future__ import unicode_literals
import json
from django.test import TestCase, RequestFactory
from django.core.exceptions import ImproperlyConfigured
from django.core.urlresolvers import reverse
from .test_utils import TestCaseUtils
from ..compat import urlparse, parse_qs, get_user_model, urlencode
from ..models import get_application_model, Grant, AccessToken
from ..settings import oauth2_settings
from ..views import ScopedProtectedResourceView, ReadWriteScopedResourceView
Application = get_application_model()
UserModel = get_user_model()
# mocking a protected resource view
class ScopeResourceView(ScopedProtectedResourceView):
required_scopes = ['scope1']
def get(self, request, *args, **kwargs):
return "This is a protected resource"
class MultiScopeResourceView(ScopedProtectedResourceView):
required_scopes = ['scope1', 'scope2']
def get(self, request, *args, **kwargs):
return "This is a protected resource"
class ReadWriteResourceView(ReadWriteScopedResourceView):
def get(self, request, *args, **kwargs):
return "This is a read protected resource"
def post(self, request, *args, **kwargs):
return "This is a write protected resource"
class BaseTest(TestCaseUtils, TestCase):
def setUp(self):
self.factory = RequestFactory()
self.test_user = UserModel.objects.create_user("test_user", "test@user.com", "123456")
self.dev_user = UserModel.objects.create_user("dev_user", "dev@user.com", "123456")
self.application = Application(
name="Test Application",
redirect_uris="http://localhost http://example.com http://example.it",
user=self.dev_user,
client_type=Application.CLIENT_CONFIDENTIAL,
authorization_grant_type=Application.GRANT_AUTHORIZATION_CODE,
)
self.application.save()
oauth2_settings._SCOPES = ['read', 'write', 'scope1', 'scope2', 'scope3']
oauth2_settings.READ_SCOPE = 'read'
oauth2_settings.WRITE_SCOPE = 'write'
def tearDown(self):
self.application.delete()
self.test_user.delete()
self.dev_user.delete()
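`get_basic_auth_header` comes from `TestCaseUtils` in `.test_utils` and is not shown here, but a plausible standalone equivalent — building the RFC 7617 Basic auth header in the form Django's request factory expects — looks like:

```python
import base64


def basic_auth_header(client_id, client_secret):
    # HTTP Basic auth: base64("id:secret"), passed to Django's test
    # client/factory via the HTTP_AUTHORIZATION kwarg.
    credentials = f"{client_id}:{client_secret}".encode("utf-8")
    token = base64.b64encode(credentials).decode("utf-8")
    return {"HTTP_AUTHORIZATION": "Basic " + token}


print(basic_auth_header("my-id", "my-secret"))
```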
class TestScopesQueryParameterBackwardsCompatibility(BaseTest):
def setUp(self):
super(TestScopesQueryParameterBackwardsCompatibility, self).setUp()
oauth2_settings._SCOPES = ['read', 'write']
def test_scopes_query_parameter_is_supported_on_post(self):
"""
Tests support for plural `scopes` query parameter on POST requests.
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scopes': 'read write', # using plural `scopes`
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
grant = Grant.objects.get(code=authorization_code)
self.assertEqual(grant.scope, "read write")
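The `parse_qs(urlparse(...).query)` idiom used throughout these tests to pull the authorization code out of the redirect `Location` header can be seen standalone (the URL below is illustrative):

```python
from urllib.parse import parse_qs, urlparse

location = "http://example.it?code=abc123&state=random_state_string"
query_dict = parse_qs(urlparse(location).query)
# parse_qs maps each key to a *list* of values, hence the .pop()
authorization_code = query_dict["code"].pop()
print(authorization_code)  # -> abc123
```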
def test_scopes_query_parameter_is_supported_on_get(self):
"""
Tests support for plural `scopes` query parameter on GET requests.
"""
self.client.login(username="test_user", password="123456")
query_string = urlencode({
'client_id': self.application.client_id,
'state': 'random_state_string',
'scopes': 'read write', # using plural `scopes`
'redirect_uri': 'http://example.it',
'response_type': 'code',
})
url = "{url}?{qs}".format(url=reverse('oauth2_provider:authorize'), qs=query_string)
response = self.client.get(url)
self.assertEqual(response.status_code, 200)
# check form is in context
self.assertIn("form", response.context)
form = response.context["form"]
self.assertEqual(form['scope'].value(), "read write")
class TestScopesSave(BaseTest):
def test_scopes_saved_in_grant(self):
"""
Test scopes are properly saved in grant
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': 'scope1 scope2',
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
grant = Grant.objects.get(code=authorization_code)
self.assertEqual(grant.scope, "scope1 scope2")
def test_scopes_save_in_access_token(self):
"""
Test scopes are properly saved in access token
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': 'scope1 scope2',
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
# exchange authorization code for a valid access token
token_request_data = {
'grant_type': 'authorization_code',
'code': authorization_code,
'redirect_uri': 'http://example.it'
}
auth_headers = self.get_basic_auth_header(self.application.client_id, self.application.client_secret)
response = self.client.post(reverse('oauth2_provider:token'), data=token_request_data, **auth_headers)
content = json.loads(response.content.decode("utf-8"))
access_token = content['access_token']
at = AccessToken.objects.get(token=access_token)
self.assertEqual(at.scope, "scope1 scope2")
class TestScopesProtection(BaseTest):
def test_scopes_protection_valid(self):
"""
Test access to a scope protected resource with correct scopes provided
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': 'scope1 scope2',
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
# exchange authorization code for a valid access token
token_request_data = {
'grant_type': 'authorization_code',
'code': authorization_code,
'redirect_uri': 'http://example.it'
}
auth_headers = self.get_basic_auth_header(self.application.client_id, self.application.client_secret)
response = self.client.post(reverse('oauth2_provider:token'), data=token_request_data, **auth_headers)
content = json.loads(response.content.decode("utf-8"))
access_token = content['access_token']
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.get("/fake-resource", **auth_headers)
request.user = self.test_user
view = ScopeResourceView.as_view()
response = view(request)
self.assertEqual(response, "This is a protected resource")
def test_scopes_protection_fail(self):
"""
Test access to a scope protected resource with wrong scopes provided
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': 'scope2',
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
# exchange authorization code for a valid access token
token_request_data = {
'grant_type': 'authorization_code',
'code': authorization_code,
'redirect_uri': 'http://example.it'
}
auth_headers = self.get_basic_auth_header(self.application.client_id, self.application.client_secret)
response = self.client.post(reverse('oauth2_provider:token'), data=token_request_data, **auth_headers)
content = json.loads(response.content.decode("utf-8"))
access_token = content['access_token']
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.get("/fake-resource", **auth_headers)
request.user = self.test_user
view = ScopeResourceView.as_view()
response = view(request)
self.assertEqual(response.status_code, 403)
def test_multi_scope_fail(self):
"""
Test access to a multi-scope protected resource with wrong scopes provided
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': 'scope1 scope3',
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
# exchange authorization code for a valid access token
token_request_data = {
'grant_type': 'authorization_code',
'code': authorization_code,
'redirect_uri': 'http://example.it'
}
auth_headers = self.get_basic_auth_header(self.application.client_id, self.application.client_secret)
response = self.client.post(reverse('oauth2_provider:token'), data=token_request_data, **auth_headers)
content = json.loads(response.content.decode("utf-8"))
access_token = content['access_token']
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.get("/fake-resource", **auth_headers)
request.user = self.test_user
view = MultiScopeResourceView.as_view()
response = view(request)
self.assertEqual(response.status_code, 403)
def test_multi_scope_valid(self):
"""
Test access to a multi-scope protected resource with correct scopes provided
"""
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': 'scope1 scope2',
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
# exchange authorization code for a valid access token
token_request_data = {
'grant_type': 'authorization_code',
'code': authorization_code,
'redirect_uri': 'http://example.it'
}
auth_headers = self.get_basic_auth_header(self.application.client_id, self.application.client_secret)
response = self.client.post(reverse('oauth2_provider:token'), data=token_request_data, **auth_headers)
content = json.loads(response.content.decode("utf-8"))
access_token = content['access_token']
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.get("/fake-resource", **auth_headers)
request.user = self.test_user
view = MultiScopeResourceView.as_view()
response = view(request)
self.assertEqual(response, "This is a protected resource")
class TestReadWriteScope(BaseTest):
def get_access_token(self, scopes):
self.client.login(username="test_user", password="123456")
# retrieve a valid authorization code
authcode_data = {
'client_id': self.application.client_id,
'state': 'random_state_string',
'scope': scopes,
'redirect_uri': 'http://example.it',
'response_type': 'code',
'allow': True,
}
response = self.client.post(reverse('oauth2_provider:authorize'), data=authcode_data)
query_dict = parse_qs(urlparse(response['Location']).query)
authorization_code = query_dict['code'].pop()
# exchange authorization code for a valid access token
token_request_data = {
'grant_type': 'authorization_code',
'code': authorization_code,
'redirect_uri': 'http://example.it'
}
auth_headers = self.get_basic_auth_header(self.application.client_id, self.application.client_secret)
response = self.client.post(reverse('oauth2_provider:token'), data=token_request_data, **auth_headers)
content = json.loads(response.content.decode("utf-8"))
return content['access_token']
def test_improperly_configured(self):
oauth2_settings._SCOPES = ['scope1']
request = self.factory.get("/fake")
view = ReadWriteResourceView.as_view()
self.assertRaises(ImproperlyConfigured, view, request)
oauth2_settings._SCOPES = ['read', 'write']
oauth2_settings.READ_SCOPE = 'ciccia'
view = ReadWriteResourceView.as_view()
self.assertRaises(ImproperlyConfigured, view, request)
    def test_properly_configured(self):
        oauth2_settings._SCOPES = ['read', 'write']
        oauth2_settings.READ_SCOPE = 'read'
        oauth2_settings.WRITE_SCOPE = 'write'
        access_token = self.get_access_token('read')
        # with read/write scopes properly configured, a read-scoped token
        # is accepted on GET instead of raising ImproperlyConfigured
        auth_headers = {
            'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
        }
        request = self.factory.get("/fake-resource", **auth_headers)
        request.user = self.test_user
        view = ReadWriteResourceView.as_view()
        response = view(request)
        self.assertEqual(response, "This is a read protected resource")
def test_has_read_scope(self):
access_token = self.get_access_token('read')
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.get("/fake-resource", **auth_headers)
request.user = self.test_user
view = ReadWriteResourceView.as_view()
response = view(request)
self.assertEqual(response, "This is a read protected resource")
def test_no_read_scope(self):
access_token = self.get_access_token('scope1')
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.get("/fake-resource", **auth_headers)
request.user = self.test_user
view = ReadWriteResourceView.as_view()
response = view(request)
self.assertEqual(response.status_code, 403)
def test_has_write_scope(self):
access_token = self.get_access_token('write')
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.post("/fake-resource", **auth_headers)
request.user = self.test_user
view = ReadWriteResourceView.as_view()
response = view(request)
self.assertEqual(response, "This is a write protected resource")
def test_no_write_scope(self):
access_token = self.get_access_token('scope1')
# use token to access the resource
auth_headers = {
'HTTP_AUTHORIZATION': 'Bearer ' + access_token,
}
request = self.factory.post("/fake-resource", **auth_headers)
request.user = self.test_user
view = ReadWriteResourceView.as_view()
response = view(request)
self.assertEqual(response.status_code, 403)
| 38.089325 | 110 | 0.642224 | 1,880 | 17,483 | 5.755319 | 0.08883 | 0.058133 | 0.040758 | 0.031885 | 0.82098 | 0.80647 | 0.800277 | 0.789926 | 0.786784 | 0.760628 | 0 | 0.010466 | 0.245782 | 17,483 | 458 | 111 | 38.172489 | 0.810102 | 0.085054 | 0 | 0.685358 | 0 | 0 | 0.178979 | 0.022238 | 0 | 0 | 0 | 0 | 0.056075 | 1 | 0.068536 | false | 0.028037 | 0.031153 | 0.012461 | 0.146417 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
76ede8e9b9b3456ead3fc1339497c5a668428c20 | 1,266 | py | Python | aetcd/__init__.py | mk-fg/aetcd | ad133dba914ed12e1c35ce4318de80b13626f4f6 | [
"Apache-2.0"
] | 2 | 2022-02-02T15:18:04.000Z | 2022-03-16T06:39:26.000Z | aetcd/__init__.py | mk-fg/aetcd | ad133dba914ed12e1c35ce4318de80b13626f4f6 | [
"Apache-2.0"
] | 6 | 2022-01-12T05:29:15.000Z | 2022-01-18T00:29:57.000Z | aetcd/__init__.py | mk-fg/aetcd | ad133dba914ed12e1c35ce4318de80b13626f4f6 | [
"Apache-2.0"
] | 2 | 2022-01-11T14:34:18.000Z | 2022-01-28T15:53:20.000Z | from .client import Alarm # noqa: F401
from .client import Client # noqa: F401
from .client import Status # noqa: F401
from .client import Transactions # noqa: F401
from .exceptions import ClientError # noqa: F401
from .exceptions import ConnectionFailedError # noqa: F401
from .exceptions import ConnectionTimeoutError # noqa: F401
from .exceptions import InternalError # noqa: F401
from .exceptions import InvalidArgumentError # noqa: F401
from .exceptions import PreconditionFailedError # noqa: F401
from .exceptions import RevisionCompactedError # noqa: F401
from .exceptions import WatchTimeoutError # noqa: F401
from .leases import Lease # noqa: F401
from .locks import Lock # noqa: F401
from .members import Member # noqa: F401
from .rtypes import Delete # noqa: F401
from .rtypes import DeleteRange # noqa: F401
from .rtypes import Event # noqa: F401
from .rtypes import EventKind # noqa: F401
from .rtypes import Get # noqa: F401
from .rtypes import GetRange # noqa: F401
from .rtypes import KeyValue # noqa: F401
from .rtypes import Put # noqa: F401
from .rtypes import ResponseHeader # noqa: F401
from .rtypes import Watch # noqa: F401
from .watcher import Watcher # noqa: F401
from .watcher import WatcherCallback # noqa: F401
| 45.214286 | 61 | 0.765403 | 162 | 1,266 | 5.981481 | 0.222222 | 0.22291 | 0.321981 | 0.185759 | 0.604747 | 0 | 0 | 0 | 0 | 0 | 0 | 0.077143 | 0.170616 | 1,266 | 27 | 62 | 46.888889 | 0.845714 | 0.233807 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0a0695db7309eb82a535c64861a8ef0618fc3eaf | 166 | py | Python | quark_core_api/context/__init__.py | arcticle/Quark | 17aa5b5869a9e9c7a04c1a371fef5998f33dc319 | [
"MIT"
] | null | null | null | quark_core_api/context/__init__.py | arcticle/Quark | 17aa5b5869a9e9c7a04c1a371fef5998f33dc319 | [
"MIT"
] | null | null | null | quark_core_api/context/__init__.py | arcticle/Quark | 17aa5b5869a9e9c7a04c1a371fef5998f33dc319 | [
"MIT"
] | null | null | null | from quark_core_api.context.core_context import CoreContext
from quark_core_api.context.context_objects import ApplicationContext, WorkspaceContext, ExperimentContext | 83 | 106 | 0.909639 | 20 | 166 | 7.25 | 0.55 | 0.124138 | 0.17931 | 0.22069 | 0.317241 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054217 | 166 | 2 | 106 | 83 | 0.923567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0a6ec33364afbd7b43ebbd59066cc66c75b66ab2 | 5,885 | py | Python | tests/test_e2e_ac.py | brennerm/uptimerobot-operator | 25cdca5ec06e15d05ddd5ed4b13b20082fbb8889 | [
"MIT"
] | 47 | 2021-02-03T06:59:17.000Z | 2022-03-30T03:44:43.000Z | tests/test_e2e_ac.py | brennerm/uptimerobot-operator | 25cdca5ec06e15d05ddd5ed4b13b20082fbb8889 | [
"MIT"
] | 19 | 2021-02-02T23:26:56.000Z | 2021-11-09T12:13:42.000Z | tests/test_e2e_ac.py | brennerm/uptimerobot-operator | 25cdca5ec06e15d05ddd5ed4b13b20082fbb8889 | [
"MIT"
] | 2 | 2021-06-09T02:24:46.000Z | 2021-08-15T06:45:20.000Z | import pytest
import kubernetes.client as k8s_client
import kubernetes.config as k8s_config
import sys
from .utils import namespace_handling, kopf_runner, NAMESPACE, DEFAULT_WAIT_TIME
import os
import time
sys.path.insert(0, os.path.abspath(os.path.join(
os.path.dirname(__file__), '../ur_operator')))
import ur_operator.uptimerobot as uptimerobot
from ur_operator.crds.alert_contact import AlertContactV1Beta1, AlertContactType
from ur_operator.k8s import K8s
k8s = K8s()
k8s_config.load_kube_config()
core_api = k8s_client.CoreV1Api()
uptime_robot = uptimerobot.create_uptimerobot_api()
def create_k8s_ur_ac(namespace, name, wait_for_seconds=DEFAULT_WAIT_TIME, **spec):
k8s.create_k8s_crd_obj(AlertContactV1Beta1, namespace, name, **spec)
time.sleep(wait_for_seconds)
def update_k8s_ur_ac(namespace, name, wait_for_seconds=DEFAULT_WAIT_TIME, **spec):
k8s.update_k8s_crd_obj(AlertContactV1Beta1, namespace, name, **spec)
time.sleep(wait_for_seconds)
def delete_k8s_ur_ac(namespace, name, wait_for_seconds=DEFAULT_WAIT_TIME):
k8s.delete_k8s_crd_obj(AlertContactV1Beta1, namespace, name)
time.sleep(wait_for_seconds)
class TestDefaultOperator:
def test_create_email_ac(self, kopf_runner, namespace_handling):
name = 'foo'
ac_type = AlertContactType.EMAIL
value = 'foo@bar.com'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['friendly_name'] == name
assert acs[1]['type'] == ac_type.value
assert acs[1]['value'] == value
@pytest.mark.skip(reason='needs fixing')
def test_create_twitter_ac(self, kopf_runner, namespace_handling):
name = 'foo'
ac_type = AlertContactType.TWITTER_DM
value = '__brennerm'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['friendly_name'] == name
assert acs[1]['type'] == ac_type.value
assert acs[1]['value'] == value
def test_create_webhook_ac(self, kopf_runner, namespace_handling):
name = 'foo'
ac_type = AlertContactType.WEB_HOOK
value = 'https://brennerm.github.io?'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['friendly_name'] == name
assert acs[1]['type'] == ac_type.value
assert acs[1]['value'] == value
def test_create_mw_with_friendly_name(self, kopf_runner, namespace_handling):
name = 'foo'
friendly_name = 'bar'
ac_type = AlertContactType.EMAIL
value = 'foo@bar.com'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value, friendlyName=friendly_name)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['friendly_name'] == friendly_name
assert acs[1]['type'] == ac_type.value
assert acs[1]['value'] == value
def test_update_ac_change_mail(self, kopf_runner, namespace_handling):
name = 'foo'
new_name = 'bar'
ac_type = AlertContactType.EMAIL
value = 'foo@bar.com'
new_value = 'bar@foo.com'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['friendly_name'] == name
assert acs[1]['value'] == value
update_k8s_ur_ac(NAMESPACE, name, friendlyName=new_name, value=new_value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['friendly_name'] == new_name
assert acs[1]['value'] == new_value
def test_update_ac_change_webhook_url(self, kopf_runner, namespace_handling):
name = 'foo'
ac_type = AlertContactType.WEB_HOOK
value = 'https://brennerm.github.io?'
new_value = 'https://brennerm.github.com?'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['value'] == value
update_k8s_ur_ac(NAMESPACE, name, value=new_value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['value'] == new_value
def test_update_ac_change_type(self, kopf_runner, namespace_handling):
name = 'foo'
ac_type = AlertContactType.EMAIL
new_ac_type = AlertContactType.WEB_HOOK
value = 'foo@bar.com'
new_value = 'https://brennerm.github.com?'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['type'] == ac_type.value
assert acs[1]['value'] == value
update_k8s_ur_ac(NAMESPACE, name, type=new_ac_type.name, value=new_value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
assert acs[1]['type'] == new_ac_type.value
assert acs[1]['value'] == new_value
def test_delete_ac(self, kopf_runner, namespace_handling):
name = 'foo'
ac_type = AlertContactType.EMAIL
value = 'foo@bar.com'
create_k8s_ur_ac(NAMESPACE, name, type=ac_type.name, value=value)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 2
delete_k8s_ur_ac(NAMESPACE, name)
acs = uptime_robot.get_alert_contacts()['alert_contacts']
assert len(acs) == 1
| 34.617647 | 101 | 0.673917 | 791 | 5,885 | 4.707965 | 0.115044 | 0.038668 | 0.059076 | 0.064447 | 0.802632 | 0.795381 | 0.755102 | 0.73174 | 0.723684 | 0.723684 | 0 | 0.015702 | 0.210025 | 5,885 | 169 | 102 | 34.822485 | 0.78533 | 0 | 0 | 0.617886 | 0 | 0 | 0.095497 | 0 | 0 | 0 | 0 | 0 | 0.276423 | 1 | 0.089431 | false | 0 | 0.081301 | 0 | 0.178862 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6a646b7e3688fcf78e09d90b399a145cac2b7e13 | 14,471 | py | Python | tests/unit/game_pkchess/controller/damage.py | RaenonX/Jelly-Bot-API | c7da1e91783dce3a2b71b955b3a22b68db9056cf | [
"MIT"
] | 5 | 2020-08-26T20:12:00.000Z | 2020-12-11T16:39:22.000Z | tests/unit/game_pkchess/controller/damage.py | RaenonX/Jelly-Bot | c7da1e91783dce3a2b71b955b3a22b68db9056cf | [
"MIT"
] | 234 | 2019-12-14T03:45:19.000Z | 2020-08-26T18:55:19.000Z | tests/unit/game_pkchess/controller/damage.py | RaenonX/Jelly-Bot-API | c7da1e91783dce3a2b71b955b3a22b68db9056cf | [
"MIT"
] | 2 | 2019-10-23T15:21:15.000Z | 2020-05-22T09:35:55.000Z | from game.pkchess.character import Character, CharacterTemplate
from game.pkchess.controller import Damage, DamageCalculator
from game.pkchess.exception import (
DamageFalseNegativeError, DamageFalsePositiveError, DamageTypeInvalidError, DamageValueNegativeError,
SkillPowerNegativeError
)
from game.pkchess.flags import DamageType
from tests.base import TestCase
__all__ = ["TestDamage", "TestDamageCalculator"]
class TestDamage(TestCase):
SOURCE = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, 5000, 0, 9999, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
TARGET = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, 5000, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
def test_damage_normal_attack(self):
dmg = Damage(self.SOURCE, self.TARGET, 1, 5)
self.assertEqual(dmg.damage_type_code, 1)
self.assertEqual(dmg.damage_dealt, 5)
for dmg_type in DamageType:
self.assertEqual(dmg.is_damage_type(dmg_type), dmg_type == DamageType.DEALT)
self.assertTrue(dmg.is_dealt)
self.assertFalse(dmg.is_critical)
self.assertFalse(dmg.is_skill)
self.assertFalse(dmg.is_missed)
self.assertFalse(dmg.is_blocked)
def test_damage_crit_attack(self):
dmg = Damage(self.SOURCE, self.TARGET, 3, 5)
self.assertEqual(dmg.damage_type_code, 3)
self.assertEqual(dmg.damage_dealt, 5)
for dmg_type in DamageType:
with self.subTest(dmg_type=dmg_type):
self.assertEqual(dmg.is_damage_type(dmg_type),
dmg_type in (DamageType.DEALT, DamageType.CRITICAL))
self.assertTrue(dmg.is_dealt)
self.assertTrue(dmg.is_critical)
self.assertFalse(dmg.is_skill)
self.assertFalse(dmg.is_missed)
self.assertFalse(dmg.is_blocked)
def test_damage_normal_skill(self):
dmg = Damage(self.SOURCE, self.TARGET, 5, 5)
self.assertEqual(dmg.damage_type_code, 5)
self.assertEqual(dmg.damage_dealt, 5)
for dmg_type in DamageType:
with self.subTest(dmg_type=dmg_type):
self.assertEqual(dmg.is_damage_type(dmg_type),
dmg_type in (DamageType.DEALT, DamageType.SKILL))
self.assertTrue(dmg.is_dealt)
self.assertFalse(dmg.is_critical)
self.assertTrue(dmg.is_skill)
self.assertFalse(dmg.is_missed)
self.assertFalse(dmg.is_blocked)
def test_damage_crit_skill(self):
dmg = Damage(self.SOURCE, self.TARGET, 7, 5)
self.assertEqual(dmg.damage_type_code, 7)
self.assertEqual(dmg.damage_dealt, 5)
for dmg_type in DamageType:
with self.subTest(dmg_type=dmg_type):
self.assertEqual(dmg.is_damage_type(dmg_type),
dmg_type in (DamageType.DEALT, DamageType.SKILL, DamageType.CRITICAL))
self.assertTrue(dmg.is_dealt)
self.assertTrue(dmg.is_critical)
self.assertTrue(dmg.is_skill)
self.assertFalse(dmg.is_missed)
self.assertFalse(dmg.is_blocked)
def test_damage_invalid(self):
data = [
(DamageFalsePositiveError, (self.SOURCE, self.TARGET, DamageType.MISSED.code, 1)),
(DamageFalsePositiveError, (self.SOURCE, self.TARGET, DamageType.BLOCKED.code, 1)),
(DamageValueNegativeError, (self.SOURCE, self.TARGET, 1, -1)),
(DamageTypeInvalidError, (self.SOURCE, self.TARGET, 0, 0)),
(DamageTypeInvalidError, (self.SOURCE, self.TARGET, 0, 1)),
(DamageFalseNegativeError, (self.SOURCE, self.TARGET, 2, 1)),
(DamageFalseNegativeError, (self.SOURCE, self.TARGET, 4, 1)),
(DamageFalseNegativeError, (self.SOURCE, self.TARGET, 6, 1))
]
for error, args in data:
with self.subTest(expected_error=error, args=args):
with self.assertRaises(error):
Damage(*args)
def test_damage_missed(self):
dmg = Damage(self.SOURCE, self.TARGET, DamageType.MISSED.code, 0)
self.assertEqual(dmg.damage_type_code, DamageType.MISSED.code)
self.assertEqual(dmg.damage_dealt, 0)
for dmg_type in DamageType:
self.assertEqual(dmg.is_damage_type(dmg_type), dmg_type == DamageType.MISSED)
self.assertFalse(dmg.is_dealt)
self.assertFalse(dmg.is_critical)
self.assertFalse(dmg.is_skill)
self.assertTrue(dmg.is_missed)
self.assertFalse(dmg.is_blocked)
def test_damage_blocked(self):
dmg = Damage(self.SOURCE, self.TARGET, DamageType.BLOCKED.code, 0)
self.assertEqual(dmg.damage_type_code, DamageType.BLOCKED.code)
self.assertEqual(dmg.damage_dealt, 0)
for dmg_type in DamageType:
self.assertEqual(dmg.is_damage_type(dmg_type), dmg_type == DamageType.BLOCKED)
self.assertFalse(dmg.is_dealt)
self.assertFalse(dmg.is_critical)
self.assertFalse(dmg.is_skill)
self.assertFalse(dmg.is_missed)
self.assertTrue(dmg.is_blocked)
class TestDamageCalculator(TestCase):
def setUpTestCase(self) -> None:
DamageCalculator.RANDOM.seed(87)
def test_deal_damage(self):
SRC_ATK = 500
SRC_CRT = 0.15
TGT_DEF = 100
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, SRC_ATK, 0, 9999, 0, SRC_CRT, 0, 0.8, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, TGT_DEF, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
dmgs = [DamageCalculator.deal_damage(chara_src, chara_tgt) for _ in range(200)]
expected_content = [
("Damage dealt?", lambda dmg: dmg.is_dealt, 168),
("Skill damage?", lambda dmg: dmg.is_skill, 0),
("Is critical damage?", lambda dmg: dmg.is_critical, 21),
("Any missed?", lambda dmg: dmg.is_missed, 32),
("Any blocked?", lambda dmg: dmg.is_blocked, 0),
]
for description, filter_cond, expected_count in expected_content:
with self.subTest(description=description):
self.assertEqual(sum(map(filter_cond, dmgs)), expected_count)
lower_bound = (SRC_ATK - TGT_DEF) * DamageCalculator.DMG_LOWER_BOUNCE
upper_bound = (SRC_ATK - TGT_DEF) * DamageCalculator.DMG_UPPER_BOUNCE
for dmg_dealt in filter(lambda dmg: dmg.is_dealt and not dmg.is_critical, dmgs):
self.assertTrue(lower_bound <= dmg_dealt.damage_dealt <= upper_bound,
f"Damage {dmg_dealt.damage_dealt} not in expected bound ({lower_bound}~{upper_bound}).")
lower_bound *= DamageCalculator.CRT_DMG_RATE
upper_bound *= DamageCalculator.CRT_DMG_RATE
for dmg_dealt in filter(lambda dmg: dmg.is_dealt and dmg.is_critical, dmgs):
self.assertTrue(lower_bound <= dmg_dealt.damage_dealt <= upper_bound,
f"Damage {dmg_dealt.damage_dealt} not in expected bound ({lower_bound}~{upper_bound}).")
def test_deal_skill_damage(self):
SRC_ATK = 500
SRC_CRT = 0.15
TGT_DEF = 100
SKILL_PWR = 350
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, SRC_ATK, 0, 9999, 0, SRC_CRT, 0, 0.8, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, TGT_DEF, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
dmgs = [DamageCalculator.deal_damage(chara_src, chara_tgt, SKILL_PWR) for _ in range(200)]
expected_content = [
("Damage dealt?", lambda dmg: dmg.is_dealt, 168),
("Skill damage?", lambda dmg: dmg.is_skill, 168),
("Is critical damage?", lambda dmg: dmg.is_critical, 21),
("Any missed?", lambda dmg: dmg.is_missed, 32),
("Any blocked?", lambda dmg: dmg.is_blocked, 0),
]
for description, filter_cond, expected_count in expected_content:
with self.subTest(description=description):
self.assertEqual(sum(map(filter_cond, dmgs)), expected_count)
lower_bound = (SRC_ATK - TGT_DEF) * (SKILL_PWR / 100) * DamageCalculator.DMG_LOWER_BOUNCE
upper_bound = (SRC_ATK - TGT_DEF) * (SKILL_PWR / 100) * DamageCalculator.DMG_UPPER_BOUNCE
for dmg_dealt in filter(lambda dmg: dmg.is_dealt and not dmg.is_critical, dmgs):
self.assertTrue(lower_bound <= dmg_dealt.damage_dealt <= upper_bound,
f"Damage {dmg_dealt.damage_dealt} not in expected bound ({lower_bound}~{upper_bound}).")
lower_bound *= DamageCalculator.CRT_DMG_RATE
upper_bound *= DamageCalculator.CRT_DMG_RATE
for dmg_dealt in filter(lambda dmg: dmg.is_dealt and dmg.is_critical, dmgs):
self.assertTrue(lower_bound <= dmg_dealt.damage_dealt <= upper_bound,
f"Damage {dmg_dealt.damage_dealt} not in expected bound ({lower_bound}~{upper_bound}).")
def test_deal_to_death(self):
SRC_ATK = 500
SRC_CRT = 0.15
TGT_DEF = 100
SKILL_PWR = 350
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, SRC_ATK, 0, 9999, 0, SRC_CRT, 0, 1, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 1, 0, 9999, 0, 9999, 0, TGT_DEF, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
DamageCalculator.deal_damage(chara_src, chara_tgt, SKILL_PWR)
self.assertEqual(chara_tgt.HP, 0)
def test_deal_blocked(self):
SRC_ATK = 100
SRC_CRT = 0.15
TGT_DEF = 500
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, SRC_ATK, 0, 9999, 0, SRC_CRT, 0, 0.8, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, TGT_DEF, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
dmgs = [DamageCalculator.deal_damage(chara_src, chara_tgt) for _ in range(200)]
expected_content = [
("Damage dealt?", lambda dmg: dmg.is_dealt, 0),
("Skill damage?", lambda dmg: dmg.is_skill, 0),
("Is critical damage?", lambda dmg: dmg.is_critical, 0),
("Any missed?", lambda dmg: dmg.is_missed, 0),
("Any blocked?", lambda dmg: dmg.is_blocked, 200),
]
for description, filter_cond, expected_count in expected_content:
with self.subTest(description=description):
self.assertEqual(sum(map(filter_cond, dmgs)), expected_count)
def test_deal_blocked_skill(self):
SRC_ATK = 100
SRC_CRT = 0.15
TGT_DEF = 500
SKILL_PWR = 350
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, SRC_ATK, 0, 9999, 0, SRC_CRT, 0, 0.8, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, TGT_DEF, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
dmgs = [DamageCalculator.deal_damage(chara_src, chara_tgt, SKILL_PWR) for _ in range(200)]
expected_content = [
("Damage dealt?", lambda dmg: dmg.is_dealt, 0),
("Skill damage?", lambda dmg: dmg.is_skill, 0),
("Is critical damage?", lambda dmg: dmg.is_critical, 0),
("Any missed?", lambda dmg: dmg.is_missed, 0),
("Any blocked?", lambda dmg: dmg.is_blocked, 200),
]
for description, filter_cond, expected_count in expected_content:
with self.subTest(description=description):
self.assertEqual(sum(map(filter_cond, dmgs)), expected_count)
def test_negative_skill_power(self):
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, 9999, 0, 9999, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, 5, 0, 9999, 0, 1, 0, 0, 0, 1, 0, []
))
with self.assertRaises(SkillPowerNegativeError):
DamageCalculator.deal_damage(chara_src, chara_tgt, -5)
def test_random_hit(self):
self.assertEqual([DamageCalculator.random_hit(0.87) for _ in range(200)].count(True), 176)
self.assertEqual([DamageCalculator.random_hit(0.5) for _ in range(200)].count(True), 97)
def test_hit_rate_acc_gt_evd(self):
SRC_ACC = 1.5
TGT_EVD = 0.7
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, 9999, 0, 9999, 0, 9999, 0, SRC_ACC, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, 5, 0, 9999, 0, 0, 0, TGT_EVD, 0, 1, 0, []
))
self.assertEqual(DamageCalculator.hit_rate(chara_src, chara_tgt), 0.8)
self.assertEqual(
[DamageCalculator.deal_damage(chara_src, chara_tgt).is_dealt for _ in range(200)].count(True),
168
)
def test_hit_rate_diff_gt_1(self):
SRC_ACC = 1.5
TGT_EVD = 0.5
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, 9999, 0, 9999, 0, 9999, 0, SRC_ACC, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, 5, 0, 9999, 0, 0, 0, TGT_EVD, 0, 1, 0, []
))
self.assertEqual(DamageCalculator.hit_rate(chara_src, chara_tgt), 1)
self.assertEqual(
[DamageCalculator.deal_damage(chara_src, chara_tgt).is_dealt for _ in range(200)].count(True),
200
)
def test_hit_rate_diff_neg(self):
SRC_ACC = 1.4
TGT_EVD = 1.5
chara_src = Character(CharacterTemplate(
"Test Source", 9999, 0, 9999, 0, 9999, 0, 9999, 0, 9999, 0, SRC_ACC, 0, 0, 0, 1, 0, []
))
chara_tgt = Character(CharacterTemplate(
"Test Target", 9999, 0, 9999, 0, 9999, 0, 5, 0, 9999, 0, 0, 0, TGT_EVD, 0, 1, 0, []
))
self.assertEqual(DamageCalculator.hit_rate(chara_src, chara_tgt), 0)
self.assertEqual(
[DamageCalculator.deal_damage(chara_src, chara_tgt).is_dealt for _ in range(200)].count(True),
0
)
# --- File: jesse/strategies/Test09/__init__.py (repo: noenfugler/jesse, license: MIT) ---
from jesse.strategies import Strategy
# test_stats_for_a_strategy_without_any_trades
class Test09(Strategy):
def should_long(self):
return False
def should_short(self):
return False
def go_long(self):
pass
def go_short(self):
pass
def should_cancel(self):
return False
def filters(self):
return []
# --- File: colour/appearance/tests/test_hke.py (repo: rift-labs-developer/colour, license: BSD-3-Clause) ---
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Defines the unit tests for the :mod:`colour.appearance.hke` module.
"""
import numpy as np
import unittest
from itertools import permutations
from colour.appearance.hke import (
HelmholtzKohlrausch_effect_object_Nayatani1997,
HelmholtzKohlrausch_effect_luminous_Nayatani1997,
coefficient_K_Br_Nayatani1997, coefficient_q_Nayatani1997)
from colour.utilities import ignore_numpy_errors
__author__ = 'Ilia Sibiryakov'
__copyright__ = 'Copyright (C) 2013-2021 - Colour Developers'
__license__ = 'New BSD License - https://opensource.org/licenses/BSD-3-Clause'
__maintainer__ = 'Colour Developers'
__email__ = 'colour-developers@colour-science.org'
__status__ = 'Production'
__all__ = [
'TestHelmholtzKohlrauschEffectObjectNayatani1997',
'TestHelmholtzKohlrauschEffectLuminousNayatani1997',
'TestCoefficient_K_Br_Nayatani1997', 'TestCoefficient_q_Nayatani1997'
]
class TestHelmholtzKohlrauschEffectObjectNayatani1997(unittest.TestCase):
"""
Defines :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_object_Nayatani1997` definition unit tests methods.
"""
def test_HelmholtzKohlrausch_effect_object_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_object_Nayatani1997` definition.
"""
self.assertAlmostEqual(
HelmholtzKohlrausch_effect_object_Nayatani1997(
np.array([0.40351010, 0.53933673]),
np.array([0.19783001, 0.46831999]),
63.66,
method='VCC'),
1.344152435497761,
places=7)
self.assertAlmostEqual(
HelmholtzKohlrausch_effect_object_Nayatani1997(
np.array([0.40351010, 0.53933673]),
np.array([0.19783001, 0.46831999]),
63.66,
method='VAC'),
1.261777232837009,
places=7)
def test_n_dimensional_HelmholtzKohlrausch_effect_object_Nayatani1997(
self):
"""
Tests :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_object_Nayatani1997` definition n_dimensional
arrays support.
"""
uv_d65 = np.array([0.19783001, 0.46831999])
uv = np.array([0.40351010, 0.53933673])
L_a = 63.66
result_vcc = HelmholtzKohlrausch_effect_object_Nayatani1997(
uv, uv_d65, L_a, method='VCC')
result_vac = HelmholtzKohlrausch_effect_object_Nayatani1997(
uv, uv_d65, L_a, method='VAC')
uv_d65 = np.tile(uv_d65, (6, 1))
uv = np.tile(uv, (6, 1))
result_vcc = np.tile(result_vcc, 6)
result_vac = np.tile(result_vac, 6)
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_object_Nayatani1997(
uv, uv_d65, L_a, method='VCC'),
result_vcc,
decimal=7)
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_object_Nayatani1997(
uv, uv_d65, L_a, method='VAC'),
result_vac,
decimal=7)
uv_d65 = np.reshape(uv_d65, (2, 3, 2))
uv = np.reshape(uv, (2, 3, 2))
result_vcc = np.reshape(result_vcc, (2, 3))
result_vac = np.reshape(result_vac, (2, 3))
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_object_Nayatani1997(
uv, uv_d65, L_a, method='VCC'),
result_vcc,
decimal=7)
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_object_Nayatani1997(
uv, uv_d65, L_a, method='VAC'),
result_vac,
decimal=7)
@ignore_numpy_errors
def test_nan_HelmholtzKohlrausch_effect_object_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_object_Nayatani1997` definition nan support.
"""
cases = [-1.0, 0.0, 1.0, -np.inf, np.inf, np.nan]
cases = set(permutations(cases * 2, r=2))
for case in cases:
HelmholtzKohlrausch_effect_object_Nayatani1997(case, case, case[0])
class TestHelmholtzKohlrauschEffectLuminousNayatani1997(unittest.TestCase):
"""
Defines :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_luminous_Nayatani1997` definition unit tests
methods.
"""
def test_HelmholtzKohlrausch_effect_luminous_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_luminous_Nayatani1997` definition.
"""
self.assertAlmostEqual(
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
np.array([0.40351010, 0.53933673]),
np.array([0.19783001, 0.46831999]),
63.66,
method='VCC'),
2.014433723774654,
places=7)
self.assertAlmostEqual(
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
np.array([0.40351010, 0.53933673]),
np.array([0.19783001, 0.46831999]),
63.66,
method='VAC'),
1.727991241148628,
places=7)
def test_n_dimensional_HelmholtzKohlrausch_effect_luminous_Nayatani1997(
self):
"""
Tests :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_luminous_Nayatani1997` definition n_dimensional
arrays support.
"""
uv_d65 = np.array([0.19783001, 0.46831999])
uv = np.array([0.40351010, 0.53933673])
L_a = 63.66
result_vcc = HelmholtzKohlrausch_effect_luminous_Nayatani1997(
uv, uv_d65, L_a, method='VCC')
result_vac = HelmholtzKohlrausch_effect_luminous_Nayatani1997(
uv, uv_d65, L_a, method='VAC')
uv_d65 = np.tile(uv_d65, (6, 1))
uv = np.tile(uv, (6, 1))
result_vcc = np.tile(result_vcc, 6)
result_vac = np.tile(result_vac, 6)
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
uv, uv_d65, L_a, method='VCC'),
result_vcc,
decimal=7)
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
uv, uv_d65, L_a, method='VAC'),
result_vac,
decimal=7)
uv_d65 = np.reshape(uv_d65, (2, 3, 2))
uv = np.reshape(uv, (2, 3, 2))
result_vcc = np.reshape(result_vcc, (2, 3))
result_vac = np.reshape(result_vac, (2, 3))
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
uv, uv_d65, L_a, method='VCC'),
result_vcc,
decimal=7)
np.testing.assert_almost_equal(
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
uv, uv_d65, L_a, method='VAC'),
result_vac,
decimal=7)
@ignore_numpy_errors
def test_nan_HelmholtzKohlrausch_effect_luminous_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.\
HelmholtzKohlrausch_effect_luminous_Nayatani1997` definition nan support.
"""
cases = [-1.0, 0.0, 1.0, -np.inf, np.inf, np.nan]
cases = set(permutations(cases * 2, r=2))
for case in cases:
HelmholtzKohlrausch_effect_luminous_Nayatani1997(
case, case, case[0])
class TestCoefficient_K_Br_Nayatani1997(unittest.TestCase):
"""
Defines :func:`colour.appearance.hke.coefficient_K_Br_Nayatani1997`
definition unit tests methods.
"""
def test_coefficient_K_Br_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.coefficient_K_Br_Nayatani1997`
definition.
"""
self.assertAlmostEqual(
coefficient_K_Br_Nayatani1997(10.00000000),
0.71344817765758839,
places=7)
self.assertAlmostEqual(
coefficient_K_Br_Nayatani1997(63.66000000),
1.000128455584031,
places=7)
self.assertAlmostEqual(
coefficient_K_Br_Nayatani1997(1000.00000000),
1.401080840298197,
places=7)
self.assertAlmostEqual(
coefficient_K_Br_Nayatani1997(10000.00000000),
1.592511806930447,
places=7)
def test_n_dimensional_coefficient_K_Br_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.coefficient_K_Br_Nayatani1997`
definition n_dimensional arrays support.
"""
L_a = 63.66
K_Br = coefficient_K_Br_Nayatani1997(L_a)
L_a = np.tile(L_a, 6)
K_Br = np.tile(K_Br, 6)
np.testing.assert_almost_equal(
coefficient_K_Br_Nayatani1997(L_a), K_Br, decimal=7)
L_a = np.reshape(L_a, (2, 3))
K_Br = np.reshape(K_Br, (2, 3))
np.testing.assert_almost_equal(
coefficient_K_Br_Nayatani1997(L_a), K_Br, decimal=7)
L_a = np.reshape(L_a, (2, 3, 1))
K_Br = np.reshape(K_Br, (2, 3, 1))
np.testing.assert_almost_equal(
coefficient_K_Br_Nayatani1997(L_a), K_Br, decimal=7)
@ignore_numpy_errors
def test_nan_coefficient_K_Br_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.coefficient_K_Br_Nayatani1997`
definition nan support.
"""
cases = [-1.0, 0.0, 1.0, -np.inf, np.inf, np.nan]
for case in cases:
coefficient_K_Br_Nayatani1997(case)
class TestCoefficient_q_Nayatani1997(unittest.TestCase):
"""
Defines :func:`colour.appearance.hke.coefficient_q_Nayatani1997`
definition unit tests methods.
"""
def test_coefficient_q_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.coefficient_q_Nayatani1997`
definition.
"""
self.assertAlmostEqual(
coefficient_q_Nayatani1997(0.00000000),
-0.121200000000000,
places=7)
self.assertAlmostEqual(
coefficient_q_Nayatani1997(0.78539816),
0.125211117768464,
places=7)
self.assertAlmostEqual(
coefficient_q_Nayatani1997(1.57079633),
0.191679999416415,
places=7)
self.assertAlmostEqual(
coefficient_q_Nayatani1997(2.35619449),
0.028480866426611,
places=7)
def test_n_dimensional_coefficient_q_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.coefficient_q_Nayatani1997`
definition n_dimensional arrays support.
"""
L_a = 63.66
q = coefficient_q_Nayatani1997(L_a)
L_a = np.tile(L_a, 6)
q = np.tile(q, 6)
np.testing.assert_almost_equal(
coefficient_q_Nayatani1997(L_a), q, decimal=7)
L_a = np.reshape(L_a, (2, 3))
q = np.reshape(q, (2, 3))
np.testing.assert_almost_equal(
coefficient_q_Nayatani1997(L_a), q, decimal=7)
L_a = np.reshape(L_a, (2, 3, 1))
q = np.reshape(q, (2, 3, 1))
np.testing.assert_almost_equal(
coefficient_q_Nayatani1997(L_a), q, decimal=7)
@ignore_numpy_errors
def test_nan_coefficient_q_Nayatani1997(self):
"""
Tests :func:`colour.appearance.hke.coefficient_q_Nayatani1997`
definition nan support.
"""
cases = [-1.0, 0.0, 1.0, -np.inf, np.inf, np.nan]
for case in cases:
coefficient_q_Nayatani1997(case)
| 32.507042 | 79 | 0.625997 | 1,285 | 11,540 | 5.320623 | 0.101946 | 0.010531 | 0.041685 | 0.106918 | 0.826386 | 0.819804 | 0.789966 | 0.755887 | 0.702209 | 0.648238 | 0 | 0.118581 | 0.27435 | 11,540 | 354 | 80 | 32.59887 | 0.697874 | 0.1513 | 0 | 0.674312 | 0 | 0 | 0.04168 | 0.02084 | 0 | 0 | 0 | 0 | 0.119266 | 1 | 0.055046 | false | 0 | 0.022936 | 0 | 0.09633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0a7ff0d295413f771fd632178656b07d5f5bdabf | 35 | py | Python | framework/avalanche/benchmarks/generators/__init__.py | Mattdl/RehearsalRevealed | f9cd2548f6c6d3ff119b40fecdb0df6fcd1525f6 | [
"MIT"
] | 12 | 2021-04-16T15:49:59.000Z | 2022-02-27T18:04:58.000Z | framework/avalanche/benchmarks/generators/__init__.py | Mattdl/RehearsalRevealed | f9cd2548f6c6d3ff119b40fecdb0df6fcd1525f6 | [
"MIT"
] | null | null | null | framework/avalanche/benchmarks/generators/__init__.py | Mattdl/RehearsalRevealed | f9cd2548f6c6d3ff119b40fecdb0df6fcd1525f6 | [
"MIT"
] | 2 | 2021-06-22T04:11:52.000Z | 2021-11-12T03:27:18.000Z | from .scenario_generators import *
| 17.5 | 34 | 0.828571 | 4 | 35 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.903226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
0aad40c7017a7691e592697f7d68a1e8a4a32f56 | 26,981 | py | Python | network/base_networks.py | aioz-ai/LDR_ALDK | 717f365e5d8ed36e4801ae66277c64501c48224b | [
"MIT"
] | 7 | 2021-04-18T16:11:28.000Z | 2022-03-07T10:19:51.000Z | network/base_networks.py | aioz-ai/LDR_ALDK | 717f365e5d8ed36e4801ae66277c64501c48224b | [
"MIT"
] | null | null | null | network/base_networks.py | aioz-ai/LDR_ALDK | 717f365e5d8ed36e4801ae66277c64501c48224b | [
"MIT"
] | 1 | 2022-03-31T15:54:27.000Z | 2022-03-31T15:54:27.000Z | """
Based networks for predicting displacements
Modified from
https://github.com/microsoft/Recursive-Cascaded-Networks/blob/master/network/base_networks.py
"""
import tensorflow as tf
import tflearn
from tensorflow.python.keras.layers.convolutional import UpSampling3D
from tflearn.initializations import normal
from .utils import Network, ReLU, LeakyReLU
def convolve(opName, inputLayer, outputChannel, kernelSize,
stride, stddev=1e-2, reuse=False, weights_init='uniform_scaling'):
return tflearn.layers.conv_3d(
inputLayer, outputChannel, kernelSize,
strides=stride, padding='same', activation='linear',
bias=True, scope=opName, reuse=reuse, weights_init=weights_init)
def convolveReLU(opName, inputLayer, outputChannel, kernelSize,
stride, stddev=1e-2, reuse=False):
return ReLU(convolve(opName, inputLayer,
outputChannel,
kernelSize, stride, stddev=stddev, reuse=reuse),
opName+'_rectified')
def convolveLeakyReLU(opName, inputLayer, outputChannel, kernelSize,
stride, alpha=0.1, stddev=1e-2, reuse=False):
return LeakyReLU(convolve(opName, inputLayer,
outputChannel,
kernelSize, stride, stddev, reuse),
alpha, opName+'_leakilyrectified')
def upconvolve(opName, inputLayer, outputChannel, kernelSize,
stride, targetShape, stddev=1e-2, reuse=False, weights_init='uniform_scaling'):
return tflearn.layers.conv.conv_3d_transpose(
inputLayer, outputChannel, kernelSize, targetShape, strides=stride,
padding='same', activation='linear', bias=False, scope=opName,
reuse=reuse, weights_init=weights_init)
def upconvolveReLU(opName, inputLayer, outputChannel, kernelSize,
stride, targetShape, stddev=1e-2, reuse=False):
return ReLU(upconvolve(opName, inputLayer,
outputChannel,
kernelSize, stride,
targetShape, stddev, reuse),
opName+'_rectified')
def upconvolveLeakyReLU(opName, inputLayer, outputChannel, kernelSize,
stride, targetShape, alpha=0.1, stddev=1e-2, reuse=tf.AUTO_REUSE):
return LeakyReLU(upconvolve(opName, inputLayer,
outputChannel,
kernelSize, stride,
targetShape, stddev, reuse),
alpha, opName+'_rectified')
class basicDiscriminator(Network):
    def __init__(self, name='d_', reuse=False, **kwargs):
        super().__init__(name, reuse, **kwargs)
        # Per-level discriminator channel widths (assumed; inferred from the
        # shape comments in build(): 16 -> 32 -> 64 -> 128 -> 256).
        self.dis_c = [16, 32, 64, 128, 256]
def build(self, data):
tf_utils.print_activations(data)
# from (N, 128, 128, 128, 3) to (N, 64, 64, 64, 16)
h0_conv = tf_utils.conv3d(data, self.dis_c[0],
k_h=5, k_w=5, k_d=5, name='h0_conv2d')
h0_lrelu = tf_utils.lrelu(h0_conv, name='h0_lrelu')
# from (N, 64, 64, 64, 16) to (N, 32, 32, 32, 32)
h1_conv = tf_utils.conv3d(h0_lrelu, self.dis_c[1],
k_h=5, k_w=5, k_d=5, name='h1_conv2d')
h1_lrelu = tf_utils.lrelu(h1_conv, name='h1_lrelu')
# from (N, 32, 32, 32, 32) to (N, 16, 16, 16, 64)
h2_conv = tf_utils.conv3d(h1_lrelu, self.dis_c[2],
k_h=5, k_w=5, k_d=5, name='h2_conv2d')
h2_lrelu = tf_utils.lrelu(h2_conv, name='h2_lrelu')
# from (N, 16, 16, 16, 64) to (N, 8, 8, 8, 128)
h3_conv = tf_utils.conv3d(h2_lrelu, self.dis_c[3],
k_h=5, k_w=5, k_d=5, name='h3_conv2d')
        h3_lrelu = tf_utils.lrelu(h3_conv, name='h3_lrelu')
# from (N, 8, 8, 8, 128) to (N, 4, 4, 4, 256)
h4_conv = tf_utils.conv3d(h3_lrelu, self.dis_c[4],
k_h=5, k_w=5, k_d=5, name='h4_conv2d')
        h4_lrelu = tf_utils.lrelu(h4_conv, name='h4_lrelu')
# from (N, 4, 4, 4, 256) to (N, 4096) and to (N, 1)
h4_flatten = tf.reshape(h4_lrelu,
(-1, h4_lrelu.shape[1] * h4_lrelu.shape[2] \
* h4_lrelu.shape[3] * h4_lrelu.shape[4]))
        h5_linear = tf_utils.linear(h4_flatten, 1, name='h5_linear')
return tf.nn.sigmoid(h5_linear), h5_linear
class VTN(Network):
def __init__(self, name, flow_multiplier=1., channels=16, **kwargs):
super().__init__(name, **kwargs)
self.flow_multiplier = flow_multiplier
self.channels = channels
def build(self, img1, img2):
'''
img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
'''
concatImgs = tf.concat([img1, img2], 4, 'concatImgs')
dims = 3
c = self.channels
conv1 = convolveLeakyReLU(
'conv1', concatImgs, c, 3, 2) # 64 * 64 * 64
conv2 = convolveLeakyReLU(
'conv2', conv1, c*2, 3, 2) # 32 * 32 * 32
conv3 = convolveLeakyReLU('conv3', conv2, c*4, 3, 2)
conv3_1 = convolveLeakyReLU('conv3_1', conv3, c*4, 3, 1)
conv4 = convolveLeakyReLU(
'conv4', conv3_1, c*8, 3, 2) # 16 * 16 * 16
conv4_1 = convolveLeakyReLU('conv4_1', conv4, c*8, 3, 1)
conv5 = convolveLeakyReLU(
'conv5', conv4_1, c*16, 3, 2) # 8 * 8 * 8
conv5_1 = convolveLeakyReLU('conv5_1', conv5, c*16, 3, 1)
conv6 = convolveLeakyReLU(
'conv6', conv5_1, c*32, 3, 2) # 4 * 4 * 4
conv6_1 = convolveLeakyReLU('conv6_1', conv6, c*32, 3, 1)
# 16 * 32 = 512 channels
shape0 = concatImgs.shape.as_list()
shape1 = conv1.shape.as_list()
shape2 = conv2.shape.as_list()
shape3 = conv3.shape.as_list()
shape4 = conv4.shape.as_list()
shape5 = conv5.shape.as_list()
shape6 = conv6.shape.as_list()
pred6 = convolve('pred6', conv6_1, dims, 3, 1)
upsamp6to5 = upconvolve('upsamp6to5', pred6, dims, 4, 2, shape5[1:4])
deconv5 = upconvolveLeakyReLU(
'deconv5', conv6_1, shape5[4], 4, 2, shape5[1:4])
concat5 = tf.concat([conv5_1, deconv5, upsamp6to5], 4, 'concat5')
pred5 = convolve('pred5', concat5, dims, 3, 1)
upsamp5to4 = upconvolve('upsamp5to4', pred5, dims, 4, 2, shape4[1:4])
deconv4 = upconvolveLeakyReLU(
'deconv4', concat5, shape4[4], 4, 2, shape4[1:4])
concat4 = tf.concat([conv4_1, deconv4, upsamp5to4],
4, 'concat4') # channel = 512+256+2
pred4 = convolve('pred4', concat4, dims, 3, 1)
upsamp4to3 = upconvolve('upsamp4to3', pred4, dims, 4, 2, shape3[1:4])
deconv3 = upconvolveLeakyReLU(
'deconv3', concat4, shape3[4], 4, 2, shape3[1:4])
concat3 = tf.concat([conv3_1, deconv3, upsamp4to3],
4, 'concat3') # channel = 256+128+2
pred3 = convolve('pred3', concat3, dims, 3, 1)
upsamp3to2 = upconvolve('upsamp3to2', pred3, dims, 4, 2, shape2[1:4])
deconv2 = upconvolveLeakyReLU(
'deconv2', concat3, shape2[4], 4, 2, shape2[1:4])
concat2 = tf.concat([conv2, deconv2, upsamp3to2],
4, 'concat2') # channel = 128+64+2
pred2 = convolve('pred2', concat2, dims, 3, 1)
upsamp2to1 = upconvolve('upsamp2to1', pred2, dims, 4, 2, shape1[1:4])
deconv1 = upconvolveLeakyReLU(
'deconv1', concat2, shape1[4], 4, 2, shape1[1:4])
concat1 = tf.concat([conv1, deconv1, upsamp2to1], 4, 'concat1')
pred0 = upconvolve('upsamp1to0', concat1, dims, 4, 2, shape0[1:4])
return {'flow': pred0 * 20 * self.flow_multiplier}
class LDR(Network):
def __init__(self, name, flow_multiplier=1., channels=16, **kwargs):
super().__init__(name, **kwargs)
self.flow_multiplier = flow_multiplier
self.encoders = [m * channels for m in [2, 2, 2, 1]]
self.decoders = [m * channels for m in [2, 2, 2, 2, 1, 1, 1]] + [3]
def build(self, img1, img2):
'''
img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
'''
concatImgs = tf.concat([img1, img2], 4, 'concatImgs')
conv1 = convolveLeakyReLU(
'conv1', concatImgs, self.encoders[0], 3, 2) # 64 * 64 * 64
conv2 = convolveLeakyReLU(
'conv2', conv1, self.encoders[1], 3, 2) # 32 * 32 * 32
conv3 = convolveLeakyReLU(
'conv3', conv2, self.encoders[2], 3, 2) # 16 * 16 * 16
conv4 = convolveLeakyReLU(
'conv4', conv3, self.encoders[3], 3, 2) # 8 * 8 * 8
net = convolveLeakyReLU('decode4', conv4, self.decoders[0], 3, 1)
net = tf.concat([UpSampling3D()(net), conv3], axis=-1)
net = convolveLeakyReLU('decode3', net, self.decoders[1], 3, 1)
net = tf.concat([UpSampling3D()(net), conv2], axis=-1)
net = convolveLeakyReLU('decode2', net, self.decoders[2], 3, 1)
net = tf.concat([UpSampling3D()(net), conv1], axis=-1)
net = convolveLeakyReLU('decode1', net, self.decoders[3], 3, 1)
net = convolveLeakyReLU('decode1_1', net, self.decoders[4], 3, 1)
net = tf.concat([UpSampling3D()(net), concatImgs], axis=-1)
net = convolveLeakyReLU('decode0', net, self.decoders[5], 3, 1)
net = convolveLeakyReLU('decode0_1', net, self.decoders[6], 3, 1)
net = convolve(
'flow', net, self.decoders[-1], 3, 1, weights_init=normal(stddev=1e-5))
return {
'flow': net * self.flow_multiplier
}


class VoxelMorph(Network):
    def __init__(self, name, flow_multiplier=1., channels=16, **kwargs):
        super().__init__(name, **kwargs)
        self.flow_multiplier = flow_multiplier
        self.encoders = [m * channels for m in [1, 2, 2, 2]]
        self.decoders = [m * channels for m in [2, 2, 2, 2, 2, 1, 1]] + [3]

    def build(self, img1, img2):
        '''
        img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
        '''
        concatImgs = tf.concat([img1, img2], 4, 'concatImgs')

        conv1 = convolveLeakyReLU(
            'conv1', concatImgs, self.encoders[0], 3, 2)  # 64 * 64 * 64
        conv2 = convolveLeakyReLU(
            'conv2', conv1, self.encoders[1], 3, 2)  # 32 * 32 * 32
        conv3 = convolveLeakyReLU(
            'conv3', conv2, self.encoders[2], 3, 2)  # 16 * 16 * 16
        if img1.shape[1] != 8:
            conv4 = convolveLeakyReLU(
                'conv4', conv3, self.encoders[3], 3, 2)  # 8 * 8 * 8
            net = convolveLeakyReLU('decode4', conv4, self.decoders[0], 3, 1)
            net = tf.concat([UpSampling3D()(net), conv3], axis=-1)
            net = convolveLeakyReLU('decode3', net, self.decoders[1], 3, 1)
        else:
            net = convolveLeakyReLU('decode3', conv3, self.decoders[1], 3, 1)
        net = tf.concat([UpSampling3D()(net), conv2], axis=-1)
        net = convolveLeakyReLU('decode2', net, self.decoders[2], 3, 1)
        net = tf.concat([UpSampling3D()(net), conv1], axis=-1)
        net = convolveLeakyReLU('decode1', net, self.decoders[3], 3, 1)
        net = convolveLeakyReLU('decode1_1', net, self.decoders[4], 3, 1)
        net = tf.concat([UpSampling3D()(net), concatImgs], axis=-1)
        net = convolveLeakyReLU('decode0', net, self.decoders[5], 3, 1)
        if len(self.decoders) == 8:
            net = convolveLeakyReLU('decode0_1', net, self.decoders[6], 3, 1)
        net = convolve(
            'flow', net, self.decoders[-1], 3, 1, weights_init=normal(stddev=1e-5))
        return {
            'flow': net * self.flow_multiplier
        }


def affine_flow(W, b, len1, len2, len3):
    b = tf.reshape(b, [-1, 1, 1, 1, 3])
    xr = tf.range(-(len1 - 1) / 2.0, len1 / 2.0, 1.0, tf.float32)
    xr = tf.reshape(xr, [1, -1, 1, 1, 1])
    yr = tf.range(-(len2 - 1) / 2.0, len2 / 2.0, 1.0, tf.float32)
    yr = tf.reshape(yr, [1, 1, -1, 1, 1])
    zr = tf.range(-(len3 - 1) / 2.0, len3 / 2.0, 1.0, tf.float32)
    zr = tf.reshape(zr, [1, 1, 1, -1, 1])
    wx = W[:, :, 0]
    wx = tf.reshape(wx, [-1, 1, 1, 1, 3])
    wy = W[:, :, 1]
    wy = tf.reshape(wy, [-1, 1, 1, 1, 3])
    wz = W[:, :, 2]
    wz = tf.reshape(wz, [-1, 1, 1, 1, 3])
    return (xr * wx + yr * wy) + (zr * wz + b)
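# The broadcast trick in affine_flow can be sanity-checked with a small NumPy
# sketch (hypothetical helper, not part of this repo; NumPy broadcasting stands
# in for TensorFlow's): each voxel at centered coordinate (x, y, z) receives
# the displacement W @ [x, y, z] + b, built from the three columns of W.

```python
import numpy as np

def affine_flow_np(W, b, len1, len2, len3):
    # Centered coordinate ranges, matching tf.range(-(n - 1) / 2, n / 2, 1).
    xr = np.arange(-(len1 - 1) / 2.0, len1 / 2.0, 1.0).reshape(1, -1, 1, 1, 1)
    yr = np.arange(-(len2 - 1) / 2.0, len2 / 2.0, 1.0).reshape(1, 1, -1, 1, 1)
    zr = np.arange(-(len3 - 1) / 2.0, len3 / 2.0, 1.0).reshape(1, 1, 1, -1, 1)
    # Each column of W scales one coordinate axis; broadcasting sums them.
    wx = W[:, :, 0].reshape(-1, 1, 1, 1, 3)
    wy = W[:, :, 1].reshape(-1, 1, 1, 1, 3)
    wz = W[:, :, 2].reshape(-1, 1, 1, 1, 3)
    b = b.reshape(-1, 1, 1, 1, 3)
    return xr * wx + yr * wy + zr * wz + b

# With W = 0 the field is the constant translation b everywhere.
flow = affine_flow_np(np.zeros((1, 3, 3)), np.array([[1.0, 2.0, 3.0]]), 4, 4, 4)

# With W = 0.1 * I, voxel (0, 0, 0) sits at centered coordinate (-1.5, -1.5, -1.5),
# so its displacement is 0.1 * (-1.5) + b = [0.85, 1.85, 2.85].
flow2 = affine_flow_np(0.1 * np.eye(3)[None], np.array([[1.0, 2.0, 3.0]]), 4, 4, 4)
```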


def det3x3(M):
    M = [[M[:, i, j] for j in range(3)] for i in range(3)]
    return tf.add_n([
        M[0][0] * M[1][1] * M[2][2],
        M[0][1] * M[1][2] * M[2][0],
        M[0][2] * M[1][0] * M[2][1]
    ]) - tf.add_n([
        M[0][0] * M[1][2] * M[2][1],
        M[0][1] * M[1][0] * M[2][2],
        M[0][2] * M[1][1] * M[2][0]
    ])
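# det3x3 is the rule-of-Sarrus expansion of a batched 3x3 determinant. A
# NumPy sketch of the same expansion (hypothetical helper, for illustration)
# can be checked against np.linalg.det:

```python
import numpy as np

def det3x3_np(M):
    # Same six-term Sarrus expansion as the TF version, over a batch axis.
    m = [[M[:, i, j] for j in range(3)] for i in range(3)]
    return (m[0][0] * m[1][1] * m[2][2]
            + m[0][1] * m[1][2] * m[2][0]
            + m[0][2] * m[1][0] * m[2][1]
            - m[0][0] * m[1][2] * m[2][1]
            - m[0][1] * m[1][0] * m[2][2]
            - m[0][2] * m[1][1] * m[2][0])

rng = np.random.default_rng(0)
batch = rng.normal(size=(5, 3, 3))
# det3x3_np(batch) agrees with np.linalg.det(batch) elementwise.
```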


class LDRUtilStem(Network):
    def __init__(self, name, flow_multiplier=1., **kwargs):
        super().__init__(name, **kwargs)
        self.flow_multiplier = flow_multiplier

    def build(self, img1, img2):
        '''
        img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
        '''
        concatImgs = tf.concat([img1, img2], 4, 'coloncatImgs')

        dims = 3
        conv1 = convolveLeakyReLU(
            'conv1', concatImgs, 16, 3, 2)  # 64 * 64 * 64
        conv2 = convolveLeakyReLU(
            'conv2', conv1, 32, 3, 2)  # 32 * 32 * 32
        conv3 = convolveLeakyReLU('conv3', conv2, 64, 3, 2)
        conv6 = convolveLeakyReLU(
            'conv4', conv3, 128, 3, 2)  # 16 * 16 * 16

        ks = conv6.shape.as_list()[1:4]
        conv6 = tflearn.layers.avg_pool_3d(
            conv6, 3, 2, padding='valid')
        ks = conv6.shape.as_list()[1:4]
        conv7_W = tflearn.layers.conv_3d(
            conv6, 9, ks, strides=1, padding='valid', activation='linear',
            bias=False, scope='conv7_W')
        conv7_b = tflearn.layers.conv_3d(
            conv6, 3, ks, strides=1, padding='valid', activation='linear',
            bias=False, scope='conv7_b')
        print(conv7_b.shape)

        I = [[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]]
        W = tf.reshape(conv7_W, [-1, 3, 3]) * self.flow_multiplier
        b = tf.reshape(conv7_b, [-1, 3]) * self.flow_multiplier
        A = W + I

        sx, sy, sz = img1.shape.as_list()[1:4]
        flow = affine_flow(W, b, sx, sy, sz)

        det = det3x3(A)
        det_loss = tf.nn.l2_loss(det - 1.0)

        eps = 1e-5
        epsI = [[[eps * elem for elem in row] for row in Mat] for Mat in I]
        C = tf.matmul(A, A, True) + epsI

        def elem_sym_polys_of_eigen_values(M):
            M = [[M[:, i, j] for j in range(3)] for i in range(3)]
            sigma1 = tf.add_n([M[0][0], M[1][1], M[2][2]])
            sigma2 = tf.add_n([
                M[0][0] * M[1][1],
                M[1][1] * M[2][2],
                M[2][2] * M[0][0]
            ]) - tf.add_n([
                M[0][1] * M[1][0],
                M[1][2] * M[2][1],
                M[2][0] * M[0][2]
            ])
            sigma3 = tf.add_n([
                M[0][0] * M[1][1] * M[2][2],
                M[0][1] * M[1][2] * M[2][0],
                M[0][2] * M[1][0] * M[2][1]
            ]) - tf.add_n([
                M[0][0] * M[1][2] * M[2][1],
                M[0][1] * M[1][0] * M[2][2],
                M[0][2] * M[1][1] * M[2][0]
            ])
            return sigma1, sigma2, sigma3

        s1, s2, s3 = elem_sym_polys_of_eigen_values(C)
        ortho_loss = s1 + (1 + eps) * (1 + eps) * s2 / s3 - 3 * 2 * (1 + eps)
        ortho_loss = tf.reduce_sum(ortho_loss)

        return {
            'flow': flow,
            'W': W,
            'b': b,
            'det_loss': det_loss,
            'ortho_loss': ortho_loss
        }


class VTNAffineStem(Network):
    def __init__(self, name, flow_multiplier=1., **kwargs):
        super().__init__(name, **kwargs)
        self.flow_multiplier = flow_multiplier

    def build(self, img1, img2):
        '''
        img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
        '''
        concatImgs = tf.concat([img1, img2], 4, 'coloncatImgs')

        dims = 3
        conv1 = convolveLeakyReLU(
            'conv1', concatImgs, 16, 3, 2)  # 64 * 64 * 64
        conv2 = convolveLeakyReLU(
            'conv2', conv1, 32, 3, 2)  # 32 * 32 * 32
        conv3 = convolveLeakyReLU('conv3', conv2, 64, 3, 2)
        conv3_1 = convolveLeakyReLU(
            'conv3_1', conv3, 64, 3, 1)
        conv4 = convolveLeakyReLU(
            'conv4', conv3_1, 128, 3, 2)  # 16 * 16 * 16
        conv4_1 = convolveLeakyReLU(
            'conv4_1', conv4, 128, 3, 1)
        conv5 = convolveLeakyReLU(
            'conv5', conv4_1, 256, 3, 2)  # 8 * 8 * 8
        conv5_1 = convolveLeakyReLU(
            'conv5_1', conv5, 256, 3, 1)
        conv6 = convolveLeakyReLU(
            'conv6', conv5_1, 512, 3, 2)  # 4 * 4 * 4
        conv6_1 = convolveLeakyReLU(
            'conv6_1', conv6, 512, 3, 1)

        ks = conv6_1.shape.as_list()[1:4]
        conv7_W = tflearn.layers.conv_3d(
            conv6_1, 9, ks, strides=1, padding='valid', activation='linear',
            bias=False, scope='conv7_W')
        conv7_b = tflearn.layers.conv_3d(
            conv6_1, 3, ks, strides=1, padding='valid', activation='linear',
            bias=False, scope='conv7_b')

        I = [[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]]
        W = tf.reshape(conv7_W, [-1, 3, 3]) * self.flow_multiplier
        b = tf.reshape(conv7_b, [-1, 3]) * self.flow_multiplier
        A = W + I
        # the flow is displacement(x) = place(x) - x = (Ax + b) - x
        # the model learns W = A - I.

        sx, sy, sz = img1.shape.as_list()[1:4]
        flow = affine_flow(W, b, sx, sy, sz)

        # determinant should be close to 1
        det = det3x3(A)
        det_loss = tf.nn.l2_loss(det - 1.0)

        # A should be close to being orthogonal.
        # C = A'A is a positive semi-definite matrix
        # that should be close to I. For this, we require C
        # to have eigenvalues close to 1 by minimizing
        # k1 + 1/k1 + k2 + 1/k2 + k3 + 1/k3.
        # To prevent NaN, minimize
        # k1 + eps + (1 + eps)^2 / (k1 + eps) + ...
        eps = 1e-5
        epsI = [[[eps * elem for elem in row] for row in Mat] for Mat in I]
        C = tf.matmul(A, A, True) + epsI

        def elem_sym_polys_of_eigen_values(M):
            M = [[M[:, i, j] for j in range(3)] for i in range(3)]
            sigma1 = tf.add_n([M[0][0], M[1][1], M[2][2]])
            sigma2 = tf.add_n([
                M[0][0] * M[1][1],
                M[1][1] * M[2][2],
                M[2][2] * M[0][0]
            ]) - tf.add_n([
                M[0][1] * M[1][0],
                M[1][2] * M[2][1],
                M[2][0] * M[0][2]
            ])
            sigma3 = tf.add_n([
                M[0][0] * M[1][1] * M[2][2],
                M[0][1] * M[1][2] * M[2][0],
                M[0][2] * M[1][0] * M[2][1]
            ]) - tf.add_n([
                M[0][0] * M[1][2] * M[2][1],
                M[0][1] * M[1][0] * M[2][2],
                M[0][2] * M[1][1] * M[2][0]
            ])
            return sigma1, sigma2, sigma3

        s1, s2, s3 = elem_sym_polys_of_eigen_values(C)
        ortho_loss = s1 + (1 + eps) * (1 + eps) * s2 / s3 - 3 * 2 * (1 + eps)
        ortho_loss = tf.reduce_sum(ortho_loss)

        return {
            'flow': flow,
            'W': W,
            'b': b,
            'det_loss': det_loss,
            'ortho_loss': ortho_loss
        }
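# The orthogonality loss above rests on the identity that sigma1..sigma3 of
# C = A'A are the trace, the sum of principal 2x2 minors, and the determinant
# -- i.e. the three elementary symmetric polynomials of C's eigenvalues -- so
# for an orthogonal A (C = I, eigenvalues all 1) the eps-free loss
# s1 + s2 / s3 - 6 vanishes. A NumPy sketch (hypothetical helper, not the
# repo's code) demonstrating this for a rotation matrix:

```python
import numpy as np

def elem_sym_polys_np(C):
    # Trace, sum of principal 2x2 minors, and determinant of a 3x3 matrix:
    # the elementary symmetric polynomials of C's eigenvalues.
    s1 = np.trace(C)
    s2 = 0.5 * (np.trace(C) ** 2 - np.trace(C @ C))
    s3 = np.linalg.det(C)
    return s1, s2, s3

theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta), np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
C = A.T @ A  # identity up to float error, since A is a rotation
s1, s2, s3 = elem_sym_polys_np(C)
# eps-free form of the loss; the eps terms only guard against division by zero.
loss = s1 + s2 / s3 - 6.0
```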


class UNet(Network):
    def __init__(self, name, flow_multiplier=1., channels=16, **kwargs):
        super().__init__(name, **kwargs)
        self.flow_multiplier = flow_multiplier
        self.channels = channels

    def build(self, img1, img2):
        '''
        img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
        '''
        concatImgs = tf.concat([img1, img2], 4, 'concatImgs')

        dims = 3
        c = self.channels
        conv1 = convolveLeakyReLU(
            'conv1', concatImgs, c, 3, 2)  # 64 * 64 * 64
        conv2 = convolveLeakyReLU('conv2', conv1, c * 2, 3, 2)  # 32 * 32 * 32
        conv2_1 = convolveLeakyReLU('conv2_1', conv2, c * 2, 3, 1)  # 32 * 32 * 32
        conv3 = convolveLeakyReLU('conv3', conv2_1, c * 4, 3, 2)
        conv3_1 = convolveLeakyReLU('conv3_1', conv3, c * 4, 3, 1)
        conv4 = convolveLeakyReLU(
            'conv4', conv3_1, c * 8, 3, 2)  # 16 * 16 * 16
        conv4_1 = convolveLeakyReLU('conv4_1', conv4, c * 8, 3, 1)
        conv5 = convolveLeakyReLU(
            'conv5', conv4_1, c * 16, 3, 2)  # 8 * 8 * 8
        conv5_1 = convolveLeakyReLU('conv5_1', conv5, c * 16, 3, 1)
        conv6 = convolveLeakyReLU(
            'conv6', conv5_1, c * 32, 3, 2)  # 4 * 4 * 4
        conv6_1 = convolveLeakyReLU('conv6_1', conv6, c * 32, 3, 1)
        # 16 * 32 = 512 channels

        shape0 = concatImgs.shape.as_list()
        shape1 = conv1.shape.as_list()
        shape2 = conv2.shape.as_list()
        shape3 = conv3.shape.as_list()
        shape4 = conv4.shape.as_list()
        shape5 = conv5.shape.as_list()
        shape6 = conv6.shape.as_list()

        pred6 = convolve('pred6', conv6_1, dims, 3, 1)
        upsamp6to5 = upconvolve('upsamp6to5', pred6, dims, 4, 2, shape5[1:4])
        deconv5 = upconvolveLeakyReLU(
            'deconv5', conv6_1, shape5[4], 4, 2, shape5[1:4])
        concat5 = tf.concat([conv5_1, deconv5, upsamp6to5], 4, 'concat5')

        pred5 = convolve('pred5', concat5, dims, 3, 1)
        upsamp5to4 = upconvolve('upsamp5to4', pred5, dims, 4, 2, shape4[1:4])
        deconv4 = upconvolveLeakyReLU(
            'deconv4', concat5, shape4[4], 4, 2, shape4[1:4])
        concat4 = tf.concat([conv4_1, deconv4, upsamp5to4], 4, 'concat4')

        pred4 = convolve('pred4', concat4, dims, 3, 1)
        upsamp4to3 = upconvolve('upsamp4to3', pred4, dims, 4, 2, shape3[1:4])
        deconv3 = upconvolveLeakyReLU(
            'deconv3', concat4, shape3[4], 4, 2, shape3[1:4])
        concat3 = tf.concat([conv3_1, deconv3, upsamp4to3], 4, 'concat3')

        pred3 = convolve('pred3', concat3, dims, 3, 1)
        upsamp3to2 = upconvolve('upsamp3to2', pred3, dims, 4, 2, shape2[1:4])
        deconv2 = upconvolveLeakyReLU(
            'deconv2', concat3, shape2[4], 4, 2, shape2[1:4])
        concat2 = tf.concat([conv2_1, deconv2, upsamp3to2], 4, 'concat2')

        pred2 = convolve('pred2', concat2, dims, 3, 1)
        upsamp2to1 = upconvolve('upsamp2to1', pred2, dims, 4, 2, shape1[1:4])
        deconv1 = upconvolveLeakyReLU(
            'deconv1', concat2, shape1[4], 4, 2, shape1[1:4])
        concat1 = tf.concat([conv1, deconv1, upsamp2to1], 4, 'concat1')

        pred0 = upconvolve('upsamp1to0', concat1, dims, 4, 2, shape0[1:4])

        return {'flow': pred0 * 20 * self.flow_multiplier}


class MobileNetAffine(Network):
    def __init__(self, name, flow_multiplier=1., **kwargs):
        super().__init__(name, **kwargs)
        self.flow_multiplier = flow_multiplier

    def build(self, img1, img2):
        '''
        img1, img2, flow : tensor of shape [batch, X, Y, Z, C]
        '''
        concatImgs = tf.concat([img1, img2], 4, 'coloncatImgs')

        dims = 3
        conv1 = convolveReLU(
            'conv1', concatImgs, 32, 3, 2)  # 64 * 64 * 64
        # Depth-wise
        conv2 = convolveReLU(
            'conv2', conv1, 64, 3, 3)
        # Point-wise
        conv2_1 = convolveReLU(
            'conv2_1', conv2, 64, 1, 1)
        # Depth-wise
        conv3 = convolveReLU(
            'conv3', conv2_1, 128, 3, 3)
        # Point-wise
        conv3_1 = convolveReLU(
            'conv3_1', conv3, 128, 1, 1)
        # Depth-wise
        conv4 = convolveReLU(
            'conv4', conv3_1, 256, 3, 3)
        # Point-wise
        conv6 = convolveReLU(
            'conv4_1', conv4, 256, 1, 1)
        # # Depth-wise
        # conv5 = convolveReLU(
        #     'conv5', conv4_1, 512, 3, 3)
        # # Point-wise
        # conv5_1 = convolveReLU(
        #     'conv5_1', conv5, 512, 1, 1)
        # # Depth-wise
        # conv6 = convolveReLU(
        #     'conv6', conv5_1, 1024, 3, 3)
        # # Point-wise
        # conv6 = convolveReLU(
        #     'conv6_1', conv6, 1024, 1, 1)

        ks = conv6.shape.as_list()[1:4]
        conv6 = tflearn.layers.avg_pool_3d(
            conv6, 3, 2, padding='valid')
        ks = conv6.shape.as_list()[1:4]
        conv7_W = tflearn.layers.conv_3d(
            conv6, 9, ks, strides=1, padding='valid', activation='linear',
            bias=False, scope='conv7_W')
        conv7_b = tflearn.layers.conv_3d(
            conv6, 3, ks, strides=1, padding='valid', activation='linear',
            bias=False, scope='conv7_b')

        I = [[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]]
        W = tf.reshape(conv7_W, [-1, 3, 3]) * self.flow_multiplier
        b = tf.reshape(conv7_b, [-1, 3]) * self.flow_multiplier
        A = W + I
        # the flow is displacement(x) = place(x) - x = (Ax + b) - x
        # the model learns W = A - I.

        sx, sy, sz = img1.shape.as_list()[1:4]
        flow = affine_flow(W, b, sx, sy, sz)

        # determinant should be close to 1
        det = det3x3(A)
        det_loss = tf.nn.l2_loss(det - 1.0)

        # A should be close to being orthogonal.
        # C = A'A is a positive semi-definite matrix
        # that should be close to I. For this, we require C
        # to have eigenvalues close to 1 by minimizing
        # k1 + 1/k1 + k2 + 1/k2 + k3 + 1/k3.
        # To prevent NaN, minimize
        # k1 + eps + (1 + eps)^2 / (k1 + eps) + ...
        eps = 1e-5
        epsI = [[[eps * elem for elem in row] for row in Mat] for Mat in I]
        C = tf.matmul(A, A, True) + epsI

        def elem_sym_polys_of_eigen_values(M):
            M = [[M[:, i, j] for j in range(3)] for i in range(3)]
            sigma1 = tf.add_n([M[0][0], M[1][1], M[2][2]])
            sigma2 = tf.add_n([
                M[0][0] * M[1][1],
                M[1][1] * M[2][2],
                M[2][2] * M[0][0]
            ]) - tf.add_n([
                M[0][1] * M[1][0],
                M[1][2] * M[2][1],
                M[2][0] * M[0][2]
            ])
            sigma3 = tf.add_n([
                M[0][0] * M[1][1] * M[2][2],
                M[0][1] * M[1][2] * M[2][0],
                M[0][2] * M[1][0] * M[2][1]
            ]) - tf.add_n([
                M[0][0] * M[1][2] * M[2][1],
                M[0][1] * M[1][0] * M[2][2],
                M[0][2] * M[1][1] * M[2][0]
            ])
            return sigma1, sigma2, sigma3

        s1, s2, s3 = elem_sym_polys_of_eigen_values(C)
        ortho_loss = s1 + (1 + eps) * (1 + eps) * s2 / s3 - 3 * 2 * (1 + eps)
        ortho_loss = tf.reduce_sum(ortho_loss)

        return {
            'flow': flow,
            'W': W,
            'b': b,
            'det_loss': det_loss,
            'ortho_loss': ortho_loss
        }
"""Test different accessory types: Lights."""
from pyhap.const import HAP_REPR_AID, HAP_REPR_CHARS, HAP_REPR_IID, HAP_REPR_VALUE

from homeassistant.components.homekit.const import ATTR_VALUE
from homeassistant.components.homekit.type_lights import Light
from homeassistant.components.light import (
    ATTR_BRIGHTNESS,
    ATTR_BRIGHTNESS_PCT,
    ATTR_COLOR_TEMP,
    ATTR_HS_COLOR,
    DOMAIN,
    SUPPORT_BRIGHTNESS,
    SUPPORT_COLOR,
    SUPPORT_COLOR_TEMP,
)
from homeassistant.const import (
    ATTR_ENTITY_ID,
    ATTR_SUPPORTED_FEATURES,
    EVENT_HOMEASSISTANT_START,
    PERCENTAGE,
    STATE_OFF,
    STATE_ON,
    STATE_UNKNOWN,
)
from homeassistant.core import CoreState
from homeassistant.helpers import entity_registry

from tests.common import async_mock_service


async def test_light_basic(hass, hk_driver, events):
    """Test light with char state."""
    entity_id = "light.demo"

    hass.states.async_set(entity_id, STATE_ON, {ATTR_SUPPORTED_FEATURES: 0})
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 1, None)
    hk_driver.add_accessory(acc)

    assert acc.aid == 1
    assert acc.category == 5  # Lightbulb
    assert acc.char_on.value

    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_on.value == 1

    hass.states.async_set(entity_id, STATE_OFF, {ATTR_SUPPORTED_FEATURES: 0})
    await hass.async_block_till_done()
    assert acc.char_on.value == 0

    hass.states.async_set(entity_id, STATE_UNKNOWN)
    await hass.async_block_till_done()
    assert acc.char_on.value == 0

    hass.states.async_remove(entity_id)
    await hass.async_block_till_done()
    assert acc.char_on.value == 0

    # Set from HomeKit
    call_turn_on = async_mock_service(hass, DOMAIN, "turn_on")
    call_turn_off = async_mock_service(hass, DOMAIN, "turn_off")

    char_on_iid = acc.char_on.to_HAP()[HAP_REPR_IID]

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 1}
            ]
        },
        "mock_addr",
    )
    await hass.async_add_executor_job(acc.char_on.client_update_value, 1)
    await hass.async_block_till_done()
    assert call_turn_on
    assert call_turn_on[0].data[ATTR_ENTITY_ID] == entity_id
    assert len(events) == 1
    assert events[-1].data[ATTR_VALUE] == "Set state to 1"

    hass.states.async_set(entity_id, STATE_ON)
    await hass.async_block_till_done()

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 0}
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_off
    assert call_turn_off[0].data[ATTR_ENTITY_ID] == entity_id
    assert len(events) == 2
    assert events[-1].data[ATTR_VALUE] == "Set state to 0"


async def test_light_brightness(hass, hk_driver, events):
    """Test light with brightness."""
    entity_id = "light.demo"

    hass.states.async_set(
        entity_id,
        STATE_ON,
        {ATTR_SUPPORTED_FEATURES: SUPPORT_BRIGHTNESS, ATTR_BRIGHTNESS: 255},
    )
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 1, None)
    hk_driver.add_accessory(acc)

    # Initial value can be anything but 0. If it is 0, it might cause HomeKit to set
    # the brightness to 100 when turning on a light on a freshly booted up server.
    assert acc.char_brightness.value != 0
    char_on_iid = acc.char_on.to_HAP()[HAP_REPR_IID]
    char_brightness_iid = acc.char_brightness.to_HAP()[HAP_REPR_IID]

    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 100

    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 102})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 40

    # Set from HomeKit
    call_turn_on = async_mock_service(hass, DOMAIN, "turn_on")
    call_turn_off = async_mock_service(hass, DOMAIN, "turn_off")

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 1},
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_brightness_iid,
                    HAP_REPR_VALUE: 20,
                },
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_on[0]
    assert call_turn_on[0].data[ATTR_ENTITY_ID] == entity_id
    assert call_turn_on[0].data[ATTR_BRIGHTNESS_PCT] == 20
    assert len(events) == 1
    assert (
        events[-1].data[ATTR_VALUE] == f"Set state to 1, brightness at 20{PERCENTAGE}"
    )

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 1},
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_brightness_iid,
                    HAP_REPR_VALUE: 40,
                },
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_on[1]
    assert call_turn_on[1].data[ATTR_ENTITY_ID] == entity_id
    assert call_turn_on[1].data[ATTR_BRIGHTNESS_PCT] == 40
    assert len(events) == 2
    assert (
        events[-1].data[ATTR_VALUE] == f"Set state to 1, brightness at 40{PERCENTAGE}"
    )

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 1},
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_brightness_iid,
                    HAP_REPR_VALUE: 0,
                },
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_off
    assert call_turn_off[0].data[ATTR_ENTITY_ID] == entity_id
    assert len(events) == 3
    assert events[-1].data[ATTR_VALUE] == f"Set state to 0, brightness at 0{PERCENTAGE}"

    # 0 is a special case for homekit, see "Handle Brightness"
    # in update_state
    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 0})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 1
    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 255})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 100
    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 0})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 1

    # Ensure floats are handled
    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 55.66})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 22
    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 108.4})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 43
    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 0.0})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 1
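# The brightness values asserted in test_light_brightness follow from a scale
# conversion between Home Assistant's 0-255 brightness and HomeKit's 0-100
# percent characteristic. A minimal sketch of the rounding behavior those
# assertions imply (hypothetical helper, not a copy of the component code):

```python
def hk_brightness(ha_brightness):
    # Scale HA's 0-255 brightness onto HomeKit's 0-100 percent scale.
    value = round(ha_brightness * 100 / 255)
    # A lit bulb reporting brightness 0 is clamped to 1 so HomeKit still
    # shows it as on (the "Handle Brightness" special case).
    return max(value, 1)

# Values exercised by the test:
# 255 -> 100, 102 -> 40, 55.66 -> 22, 108.4 -> 43, 0 -> 1
```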


async def test_light_color_temperature(hass, hk_driver, events):
    """Test light with color temperature."""
    entity_id = "light.demo"

    hass.states.async_set(
        entity_id,
        STATE_ON,
        {ATTR_SUPPORTED_FEATURES: SUPPORT_COLOR_TEMP, ATTR_COLOR_TEMP: 190},
    )
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 1, None)
    hk_driver.add_accessory(acc)

    assert acc.char_color_temperature.value == 190

    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_color_temperature.value == 190

    # Set from HomeKit
    call_turn_on = async_mock_service(hass, DOMAIN, "turn_on")

    char_color_temperature_iid = acc.char_color_temperature.to_HAP()[HAP_REPR_IID]

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_color_temperature_iid,
                    HAP_REPR_VALUE: 250,
                }
            ]
        },
        "mock_addr",
    )
    await hass.async_add_executor_job(
        acc.char_color_temperature.client_update_value, 250
    )
    await hass.async_block_till_done()
    assert call_turn_on
    assert call_turn_on[0].data[ATTR_ENTITY_ID] == entity_id
    assert call_turn_on[0].data[ATTR_COLOR_TEMP] == 250
    assert len(events) == 1
    assert events[-1].data[ATTR_VALUE] == "color temperature at 250"


async def test_light_color_temperature_and_rgb_color(hass, hk_driver, events):
    """Test light with color temperature and rgb color not exposing temperature."""
    entity_id = "light.demo"

    hass.states.async_set(
        entity_id,
        STATE_ON,
        {
            ATTR_SUPPORTED_FEATURES: SUPPORT_COLOR_TEMP | SUPPORT_COLOR,
            ATTR_COLOR_TEMP: 190,
            ATTR_HS_COLOR: (260, 90),
        },
    )
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 2, None)

    assert acc.char_hue.value == 260
    assert acc.char_saturation.value == 90
    assert not hasattr(acc, "char_color_temperature")

    hass.states.async_set(entity_id, STATE_ON, {ATTR_COLOR_TEMP: 224})
    await hass.async_block_till_done()
    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_hue.value == 27
    assert acc.char_saturation.value == 27

    hass.states.async_set(entity_id, STATE_ON, {ATTR_COLOR_TEMP: 352})
    await hass.async_block_till_done()
    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_hue.value == 28
    assert acc.char_saturation.value == 61


async def test_light_rgb_color(hass, hk_driver, events):
    """Test light with rgb_color."""
    entity_id = "light.demo"

    hass.states.async_set(
        entity_id,
        STATE_ON,
        {ATTR_SUPPORTED_FEATURES: SUPPORT_COLOR, ATTR_HS_COLOR: (260, 90)},
    )
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 1, None)
    hk_driver.add_accessory(acc)

    assert acc.char_hue.value == 260
    assert acc.char_saturation.value == 90

    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_hue.value == 260
    assert acc.char_saturation.value == 90

    # Set from HomeKit
    call_turn_on = async_mock_service(hass, DOMAIN, "turn_on")

    char_hue_iid = acc.char_hue.to_HAP()[HAP_REPR_IID]
    char_saturation_iid = acc.char_saturation.to_HAP()[HAP_REPR_IID]

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_hue_iid,
                    HAP_REPR_VALUE: 145,
                },
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_saturation_iid,
                    HAP_REPR_VALUE: 75,
                },
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_on
    assert call_turn_on[0].data[ATTR_ENTITY_ID] == entity_id
    assert call_turn_on[0].data[ATTR_HS_COLOR] == (145, 75)
    assert len(events) == 1
    assert events[-1].data[ATTR_VALUE] == "set color at (145, 75)"


async def test_light_restore(hass, hk_driver, events):
    """Test setting up an entity from state in the event registry."""
    hass.state = CoreState.not_running

    registry = await entity_registry.async_get_registry(hass)

    registry.async_get_or_create("light", "hue", "1234", suggested_object_id="simple")
    registry.async_get_or_create(
        "light",
        "hue",
        "9012",
        suggested_object_id="all_info_set",
        capabilities={"max": 100},
        supported_features=5,
        device_class="mock-device-class",
    )

    hass.bus.async_fire(EVENT_HOMEASSISTANT_START, {})
    await hass.async_block_till_done()

    acc = Light(hass, hk_driver, "Light", "light.simple", 1, None)
    hk_driver.add_accessory(acc)

    assert acc.category == 5  # Lightbulb
    assert acc.chars == []
    assert acc.char_on.value == 0

    acc = Light(hass, hk_driver, "Light", "light.all_info_set", 2, None)
    assert acc.category == 5  # Lightbulb
    assert acc.chars == ["Brightness"]
    assert acc.char_on.value == 0


async def test_light_set_brightness_and_color(hass, hk_driver, events):
    """Test light with all chars in one go."""
    entity_id = "light.demo"

    hass.states.async_set(
        entity_id,
        STATE_ON,
        {
            ATTR_SUPPORTED_FEATURES: SUPPORT_BRIGHTNESS | SUPPORT_COLOR,
            ATTR_BRIGHTNESS: 255,
        },
    )
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 1, None)
    hk_driver.add_accessory(acc)

    # Initial value can be anything but 0. If it is 0, it might cause HomeKit to set
    # the brightness to 100 when turning on a light on a freshly booted up server.
    assert acc.char_brightness.value != 0
    char_on_iid = acc.char_on.to_HAP()[HAP_REPR_IID]
    char_brightness_iid = acc.char_brightness.to_HAP()[HAP_REPR_IID]
    char_hue_iid = acc.char_hue.to_HAP()[HAP_REPR_IID]
    char_saturation_iid = acc.char_saturation.to_HAP()[HAP_REPR_IID]

    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 100

    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 102})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 40

    hass.states.async_set(entity_id, STATE_ON, {ATTR_HS_COLOR: (4.5, 9.2)})
    await hass.async_block_till_done()
    assert acc.char_hue.value == 4
    assert acc.char_saturation.value == 9

    # Set from HomeKit
    call_turn_on = async_mock_service(hass, DOMAIN, "turn_on")

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 1},
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_brightness_iid,
                    HAP_REPR_VALUE: 20,
                },
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_hue_iid,
                    HAP_REPR_VALUE: 145,
                },
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_saturation_iid,
                    HAP_REPR_VALUE: 75,
                },
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_on[0]
    assert call_turn_on[0].data[ATTR_ENTITY_ID] == entity_id
    assert call_turn_on[0].data[ATTR_BRIGHTNESS_PCT] == 20
    assert call_turn_on[0].data[ATTR_HS_COLOR] == (145, 75)
    assert len(events) == 1
    assert (
        events[-1].data[ATTR_VALUE]
        == f"Set state to 1, brightness at 20{PERCENTAGE}, set color at (145, 75)"
    )


async def test_light_set_brightness_and_color_temp(hass, hk_driver, events):
    """Test light with all chars in one go."""
    entity_id = "light.demo"

    hass.states.async_set(
        entity_id,
        STATE_ON,
        {
            ATTR_SUPPORTED_FEATURES: SUPPORT_BRIGHTNESS | SUPPORT_COLOR_TEMP,
            ATTR_BRIGHTNESS: 255,
        },
    )
    await hass.async_block_till_done()
    acc = Light(hass, hk_driver, "Light", entity_id, 1, None)
    hk_driver.add_accessory(acc)

    # Initial value can be anything but 0. If it is 0, it might cause HomeKit to set
    # the brightness to 100 when turning on a light on a freshly booted up server.
    assert acc.char_brightness.value != 0
    char_on_iid = acc.char_on.to_HAP()[HAP_REPR_IID]
    char_brightness_iid = acc.char_brightness.to_HAP()[HAP_REPR_IID]
    char_color_temperature_iid = acc.char_color_temperature.to_HAP()[HAP_REPR_IID]

    await acc.run()
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 100

    hass.states.async_set(entity_id, STATE_ON, {ATTR_BRIGHTNESS: 102})
    await hass.async_block_till_done()
    assert acc.char_brightness.value == 40

    hass.states.async_set(entity_id, STATE_ON, {ATTR_COLOR_TEMP: (224.14)})
    await hass.async_block_till_done()
    assert acc.char_color_temperature.value == 224

    # Set from HomeKit
    call_turn_on = async_mock_service(hass, DOMAIN, "turn_on")

    hk_driver.set_characteristics(
        {
            HAP_REPR_CHARS: [
                {HAP_REPR_AID: acc.aid, HAP_REPR_IID: char_on_iid, HAP_REPR_VALUE: 1},
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_brightness_iid,
                    HAP_REPR_VALUE: 20,
                },
                {
                    HAP_REPR_AID: acc.aid,
                    HAP_REPR_IID: char_color_temperature_iid,
                    HAP_REPR_VALUE: 250,
                },
            ]
        },
        "mock_addr",
    )
    await hass.async_block_till_done()
    assert call_turn_on[0]
    assert call_turn_on[0].data[ATTR_ENTITY_ID] == entity_id
    assert call_turn_on[0].data[ATTR_BRIGHTNESS_PCT] == 20
    assert call_turn_on[0].data[ATTR_COLOR_TEMP] == 250
    assert len(events) == 1
    assert (
        events[-1].data[ATTR_VALUE]
        == f"Set state to 1, brightness at 20{PERCENTAGE}, color temperature at 250"
    )
import pytest

from deepctr.models import FNN
from ..utils import check_model, get_test_data


@pytest.mark.parametrize(
    'sparse_feature_num,dense_feature_num',
    [(1, 1), (3, 3)]
)
def test_FNN(sparse_feature_num, dense_feature_num):
    model_name = "FNN"
    sample_size = 64
    x, y, feature_dim_dict = get_test_data(
        sample_size, sparse_feature_num, dense_feature_num)
    model = FNN(feature_dim_dict, hidden_size=[32, 32], keep_prob=0.5)
    check_model(model, model_name, x, y)


@pytest.mark.parametrize(
    'sparse_feature_num,dense_feature_num',
    [(0, 1), (1, 0)]
)
def test_FNN_without_seq(sparse_feature_num, dense_feature_num):
    model_name = "FNN"
    sample_size = 64
    x, y, feature_dim_dict = get_test_data(
        sample_size, sparse_feature_num, dense_feature_num, sequence_feature=())
    model = FNN(feature_dim_dict, hidden_size=[32, 32], keep_prob=0.5)
    check_model(model, model_name, x, y)


if __name__ == "__main__":
    test_FNN(2, 2)
from unittest import TestCase

from tests import get_data
from pytezos.michelson.micheline import michelson_to_micheline
from pytezos.michelson.formatter import micheline_to_michelson
class MichelsonCodingTestKT19VK(TestCase):

    def setUp(self):
        self.maxDiff = None

    def test_michelson_parse_code_KT19VK(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/code_KT19VK.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/code_KT19VK.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_code_KT19VK(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/code_KT19VK.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/code_KT19VK.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_code_KT19VK(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/code_KT19VK.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_storage_KT19VK(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/storage_KT19VK.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/storage_KT19VK.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_storage_KT19VK(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/storage_KT19VK.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/storage_KT19VK.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_storage_KT19VK(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/storage_KT19VK.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_onr6zv(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onr6zv.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onr6zv.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_onr6zv(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onr6zv.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onr6zv.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_onr6zv(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onr6zv.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_onzZ9j(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onzZ9j.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onzZ9j.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_onzZ9j(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onzZ9j.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onzZ9j.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_onzZ9j(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_onzZ9j.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_ooCm5y(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_ooCm5y.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_ooCm5y.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_ooCm5y(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_ooCm5y.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_ooCm5y.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_ooCm5y(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_ooCm5y.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_oozEnM(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oozEnM.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oozEnM.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_oozEnM(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oozEnM.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oozEnM.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_oozEnM(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oozEnM.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_oo9VWb(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo9VWb.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo9VWb.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_oo9VWb(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo9VWb.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo9VWb.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_oo9VWb(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo9VWb.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_opP9oc(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_opP9oc.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_opP9oc.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_opP9oc(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_opP9oc.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_opP9oc.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_opP9oc(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_opP9oc.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)

    def test_michelson_parse_parameter_oo73cY(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo73cY.json')
        actual = michelson_to_micheline(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo73cY.tz'))
        self.assertEqual(expected, actual)

    def test_michelson_format_parameter_oo73cY(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo73cY.tz')
        actual = micheline_to_michelson(get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo73cY.json'),
            inline=True)
        self.assertEqual(expected, actual)

    def test_michelson_inverse_parameter_oo73cY(self):
        expected = get_data(
            path='contracts/KT19VKNKqKDDunvwi4y2r6ugDhCcdhT2zG3c/parameter_oo73cY.json')
        actual = michelson_to_micheline(micheline_to_michelson(expected))
        self.assertEqual(expected, actual)
from test_print import PrintTestCase
from test_200s import PageResponseTests
from ward import test
from users.tests.api import internalapi_graphql_client
from users.tests.factories import user_factory
from users.tests.session import db
@test("search users")
async def _(
    internalapi_graphql_client=internalapi_graphql_client,
    db=db,
    user_factory=user_factory,
):
    internalapi_graphql_client.force_service_login(issuer="pycon-backend")

    user_1 = await user_factory(
        email="testuser@user.it", fullname="Name", is_staff=False
    )
    await user_factory(email="testuser2@user.it", fullname="Another", is_staff=False)
    user_3 = await user_factory(
        email="testuser3@user.it", fullname="Name", is_staff=False
    )

    query = """query($query: String!) {
        searchUsers(query: $query) {
            id
        }
    }"""

    response = await internalapi_graphql_client.query(
        query, variables={"query": "name"}
    )

    assert not response.errors
    assert len(response.data["searchUsers"]) == 2
    assert {"id": str(user_1.id)} in response.data["searchUsers"]
    assert {"id": str(user_3.id)} in response.data["searchUsers"]


@test("cannot call without token")
async def _(
    internalapi_graphql_client=internalapi_graphql_client,
    db=db,
    user_factory=user_factory,
):
    await user_factory(email="testuser@user.it", fullname="Name", is_staff=False)
    await user_factory(email="testuser2@user.it", fullname="Another", is_staff=False)
    await user_factory(email="testuser3@user.it", fullname="Name", is_staff=False)

    query = """query($query: String!) {
        searchUsers(query: $query) {
            id
        }
    }"""

    response = await internalapi_graphql_client.query(
        query, variables={"query": "name"}
    )

    assert response.errors[0]["message"] == "Forbidden"
    assert not response.data


@test("cannot call token of not allowed service")
async def _(
    internalapi_graphql_client=internalapi_graphql_client,
    db=db,
    user_factory=user_factory,
):
    internalapi_graphql_client.force_service_login(issuer="not-allowed-service")

    await user_factory(email="testuser@user.it", fullname="Name", is_staff=False)
    await user_factory(email="testuser2@user.it", fullname="Another", is_staff=False)
    await user_factory(email="testuser3@user.it", fullname="Name", is_staff=False)

    query = """query($query: String!) {
        searchUsers(query: $query) {
            id
        }
    }"""

    response = await internalapi_graphql_client.query(
        query, variables={"query": "name"}
    )

    assert response.errors[0]["message"] == "Forbidden"
    assert not response.data
from fft import fft, ifft
import numpy as np
def conv(y, z, n):
    """Direct O(n^2) circular convolution of length-n sequences y and z."""
    result = np.zeros(n)
    for i in range(n):
        k = i
        for j in range(n):
            result[i] += y[j] * z[k]  # negative k wraps around (circular indexing)
            k -= 1
        result[i] /= n
    return result


def corr(y, z, n):
    """Direct O(n^2) circular cross-correlation of length-n sequences y and z."""
    result = np.zeros(n)
    for i in range(n):
        k = -i
        for j in range(n):
            result[i] += y[k] * z[j]
            k += 1
        result[i] /= n
    return result


def corr_fft(y, z, n):
    """Circular cross-correlation via FFT: IFFT(conj(FFT(y)) * FFT(z)) / n."""
    y_value = fft(y)
    z_value = fft(z)
    result = [y_value[i].conj() * z_value[i] / n for i in range(n)]
    return [i.real for i in ifft(result)]


def conv_fft(y, z, n):
    """Circular convolution via FFT: IFFT(FFT(y) * FFT(z)) / n."""
    y_value = fft(y)
    z_value = fft(z)
    result = [y_value[i] * z_value[i] / n for i in range(n)]
    return [i.real for i in ifft(result)]
import server.bm_server as bm_server
bm_server.initServer()
from __future__ import print_function
from tqdm import tqdm
import numpy as np
from torch.autograd import Variable
import torch.optim as optim
from sklearn.metrics import recall_score
from models.backbones.resnet import *
from utils.data_utils import save_csv, CountMeter
from sklearn.metrics import classification_report, precision_recall_fscore_support, accuracy_score
import os
import mmcv
def _pgd_whitebox_basic(
        model,
        device,
        X,
        y,
        random,
        epsilon,
        num_steps,
        step_size,
        eval_steps=(5, 10),
        by_class=False,
        num_classes=10,
        **kwargs
):
    model.eval()
    out = model(X)
    outputs = dict(CLEAN=out.clone().detach().cpu().numpy())

    # FGSM attack
    X_fgsm = Variable(X.data, requires_grad=True)
    opt = optim.SGD([X_fgsm], lr=1e-3)
    opt.zero_grad()
    with torch.enable_grad():
        loss = nn.CrossEntropyLoss()(model(X_fgsm), y)
    loss.backward()
    outputs.update({'zero_grad': (100 * torch.sum(X_fgsm.grad == 0) / torch.sum(torch.ones_like(X_fgsm))).detach().cpu().numpy()})
    eta = epsilon * X_fgsm.grad.data.sign()
    X_fgsm = Variable(X.data + eta, requires_grad=True)
    X_fgsm = Variable(torch.clamp(X_fgsm, 0, 1.0), requires_grad=True)
    outputs.update({'FGSM': model(X_fgsm).detach().cpu().numpy()})

    # PGD attack
    X_pgd = Variable(X.data, requires_grad=True)
    if random:
        random_noise = torch.FloatTensor(*X_pgd.shape).uniform_(-epsilon, epsilon).to(device)
        X_pgd = Variable(X_pgd.data + random_noise, requires_grad=True)
    avg_grad_nat = 0
    avg_grad_rob = 0
    for i_step in range(num_steps):
        opt = optim.SGD([X_pgd], lr=1e-3)
        opt.zero_grad()
        with torch.enable_grad():
            pgd_out = model(X_pgd)  # after i_step's attack
            if i_step in eval_steps:
                outputs.update({'PGD-{}'.format(i_step): pgd_out.detach().cpu().numpy()})
            loss = nn.CrossEntropyLoss()(model(X_pgd), y)
        loss.backward()
        eta = step_size * X_pgd.grad.data.sign()
        if i_step == 0:
            avg_grad_nat += torch.sum(torch.abs(X_pgd.grad.data)) / torch.sum(torch.ones_like(X_pgd))
        elif i_step == num_steps - 1:
            avg_grad_rob += torch.sum(torch.abs(X_pgd.grad.data)) / torch.sum(torch.ones_like(X_pgd))
        X_pgd = Variable(X_pgd.data + eta, requires_grad=True)
        eta = torch.clamp(X_pgd.data - X.data, -epsilon, epsilon)
        X_pgd = Variable(X.data + eta, requires_grad=True)
        X_pgd = Variable(torch.clamp(X_pgd, 0, 1.0), requires_grad=True)
    outputs.update({'PGD-{}'.format(num_steps): model(X_pgd).detach().cpu().numpy()})
    return outputs
def _pgd_whitebox_extension(
        model,
        device,
        X,
        y,
        random,
        epsilon,
        num_steps,
        step_size,
        step_counters=None,
        targeted=False,
        fix_target=None,
        eval_steps=(5, 10),
        save_features=False,
        eval_fine=False,
        by_class=False,
        num_classes=10,
        **kwargs
):
    model.eval()
    out = model(X)
    outputs = dict(CLEAN=out.clone().detach().cpu().numpy())
    if fix_target is not None:
        new_y = fix_target
    else:
        tmp1 = torch.argsort(out, dim=1)[:, -2:]
        new_y = torch.where(tmp1[:, -1] == y, tmp1[:, -2], tmp1[:, -1])

    # FGSM attack
    X_fgsm = Variable(X.data, requires_grad=True)
    opt = optim.SGD([X_fgsm], lr=1e-3)
    opt.zero_grad()
    with torch.enable_grad():
        loss = nn.CrossEntropyLoss()(model(X_fgsm), y)
    loss.backward()
    outputs.update({'zero_grad': (100 * torch.sum(X_fgsm.grad == 0) / torch.sum(torch.ones_like(X_fgsm))).detach().cpu().numpy()})
    eta = epsilon * X_fgsm.grad.data.sign()
    X_fgsm = Variable(X.data + eta, requires_grad=True)
    X_fgsm = Variable(torch.clamp(X_fgsm, 0, 1.0), requires_grad=True)
    outputs.update({'FGSM': model(X_fgsm).detach().cpu().numpy()})

    # PGD attack
    X_pgd = Variable(X.data, requires_grad=True)
    success_steps = np.zeros(X.shape[0])
    if random:
        random_noise = torch.FloatTensor(*X_pgd.shape).uniform_(-epsilon, epsilon).to(device)
        X_pgd = Variable(X_pgd.data + random_noise, requires_grad=True)
    avg_grad_nat = 0
    avg_grad_rob = 0
    for i_step in range(num_steps):
        opt = optim.SGD([X_pgd], lr=1e-3)
        opt.zero_grad()
        with torch.enable_grad():
            pgd_out = model(X_pgd)  # after i_step's attack
            if i_step in eval_steps:
                outputs.update({'PGD-{}'.format(i_step): pgd_out.detach().cpu().numpy()})
            if step_counters is not None:
                if targeted:
                    loss = - nn.CrossEntropyLoss(reduction='none')(model(X_pgd), new_y)
                else:
                    loss = nn.CrossEntropyLoss(reduction='none')(model(X_pgd), y)
                step_counters[i_step].update(loss.clone().detach().cpu().numpy(), y.cpu().numpy())
                loss = loss.mean()
            elif eval_fine:
                if targeted:
                    logits = model(X_pgd)
                    success = (logits.data.max(1)[1] == new_y.data).cpu().numpy()
                    success_steps[success * (success_steps == 0)] = i_step + 1
                    loss = - nn.CrossEntropyLoss()(logits, new_y)
                else:
                    logits = model(X_pgd)
                    error = (logits.data.max(1)[1] != y.data).cpu().numpy()
                    success_steps[error * (success_steps == 0)] = i_step + 1
                    loss = nn.CrossEntropyLoss()(logits, y)
            else:
                if targeted:
                    loss = - nn.CrossEntropyLoss()(model(X_pgd), new_y)
                else:
                    loss = nn.CrossEntropyLoss()(model(X_pgd), y)
        loss.backward()
        eta = step_size * X_pgd.grad.data.sign()
        if i_step == 0:
            avg_grad_nat += torch.sum(torch.abs(X_pgd.grad.data)) / torch.sum(torch.ones_like(X_pgd))
        elif i_step == num_steps - 1:
            avg_grad_rob += torch.sum(torch.abs(X_pgd.grad.data)) / torch.sum(torch.ones_like(X_pgd))
        X_pgd = Variable(X_pgd.data + eta, requires_grad=True)
        eta = torch.clamp(X_pgd.data - X.data, -epsilon, epsilon)
        X_pgd = Variable(X.data + eta, requires_grad=True)
        X_pgd = Variable(torch.clamp(X_pgd, 0, 1.0), requires_grad=True)
    outputs.update({'PGD-{}'.format(num_steps): model(X_pgd).detach().cpu().numpy()})
    if save_features:
        outputs.update(CLEAN_features=model.backbone(X).detach().cpu().numpy())
        outputs.update(PGD_features=model.backbone(X_pgd).detach().cpu().numpy())
    if eval_fine:
        outputs.update(success_steps=success_steps,
                       avg_grad_nat=avg_grad_nat.detach().cpu().numpy(),
                       avg_grad_rob=avg_grad_rob.detach().cpu().numpy())
    return outputs
def eval_metrics(y_out, y_true, num_classes=10, samples_per_cls=None):
    y_pred = np.argmax(y_out, 1)
    y_true = np.asarray(y_true)
    precision, recall, f_score, true_sum = \
        precision_recall_fscore_support(y_true=y_true, y_pred=y_pred)
    accuracy = accuracy_score(y_true=y_true, y_pred=y_pred)
    return accuracy, precision, recall, f_score, true_sum
def eval_clean_only(model, device, cfgs, logger, test_loader):
    model.eval()
    all_labels = []
    all_outputs = []
    for i, (data, target) in enumerate(test_loader):
        data, target = data.to(device), target.to(device)
        X, y = Variable(data, requires_grad=True), Variable(target)
        outputs = model(X).detach().clone().cpu().numpy()
        all_outputs.append(outputs)
        all_labels.extend(target.cpu().detach().numpy().tolist())
    all_outputs = np.vstack(all_outputs)
    accuracy, precision, recall, f_score, true_sum = eval_metrics(all_outputs, all_labels)
    logger.info('>>> clean accuracy: {}'.format(accuracy))
    for r in recall:
        print('{:.4f}'.format(r), end=' ')
    print()
    return
def eval_adv_test_whitebox_pgd(model, device, cfgs, logger, test_loader, num_classes=10, print_freq=60, by_class=True,
                               targeted=False, step_eval=False, save_features=False, mode='test', eval_batches=10000,
                               eval_pair_mode=None, **kwargs):
    """
    Evaluate model by white-box attack.
    """
    model.eval()
    total_batches = len(test_loader)
    all_labels = []
    all_outputs = dict()
    eval_steps = (5, 10, 20)
    if step_eval:
        step_counters = [CountMeter(num_classes) for _ in range(cfgs.test_num_steps)]
    else:
        step_counters = None
    total = 0
    for i, (data, target) in enumerate(test_loader):
        data, target = data.to(device), target.to(device)
        total += len(data)
        # pgd attack
        X, y = Variable(data, requires_grad=True), Variable(target)
        outputs = _pgd_whitebox_basic(
            model, device, X, y, random=cfgs.random, epsilon=cfgs.test_epsilon,
            num_steps=cfgs.test_num_steps, step_size=cfgs.test_step_size, eval_steps=eval_steps,
            by_class=by_class, num_classes=num_classes, step_counters=step_counters,
            targeted=targeted, save_features=save_features
        )
        all_labels.extend(target.cpu().detach().numpy().tolist())
        for key in outputs.keys():
            if key not in all_outputs.keys():
                all_outputs.update({key: []})
                all_outputs[key].append(outputs[key])
            else:
                all_outputs[key].append(outputs[key])
        if i == 20 or (i + 1) % print_freq == 0:
            logger.info("[{}/{}] batches finished".format(i + 1, total_batches))
            for key, data in all_outputs.items():
                if 'zero_grad' in key:
                    # print('zero_grad_rate: {} '.format(np.mean(data)))
                    pass
                if 'features' in key or 'success' in key or 'grad' in key:
                    continue
                data = np.vstack(data)
                accuracy, precision, recall, f_score, true_sum = eval_metrics(data, all_labels)
                logger.info("metric: {} | accuracy: {:.4f} | precision: {:.4f} | recall: {:.4f} | f_score: {:.4f} | ".format(
                    key, accuracy, np.mean(precision), np.mean(recall), np.mean(f_score)))
        if i == eval_batches:
            # sometimes we only evaluate part of the test set during training for efficiency
            logger.info(">> Only {}/{} batches are evaluated, stop here to save time.".format(i + 1, total_batches))
            return

    # save metrics into .csv
    csv_name = '{}_all_results.csv'.format(cfgs.dataset)
    save_csv(csv_name, [[cfgs.model_path], [cfgs.remark]], devide=False)
    for key, data in all_outputs.items():
        save_data = [[' '], [key]]
        if 'success' in key:
            data = np.hstack(data)
        elif 'zero_grad' in key:
            data = np.mean(data)
            print(' * * * zero_grad * * * {}'.format(data))
        else:
            data = np.vstack(data)
        all_outputs[key] = data
        if 'features' in key or 'success' in key or 'grad' in key:
            continue
        accuracy, precision, recall, f_score, true_sum = eval_metrics(data, all_labels)
        # use group evaluation for CIFAR100 and ImageNet
        if recall.shape[0] >= 100:
            g_recall = np.reshape(recall, (10, -1)).sum(-1)
            g_precision = np.reshape(precision, (10, -1)).sum(-1)
            g_f_score = np.reshape(f_score, (10, -1)).sum(-1)
            save_data.extend([
                ['accuracy', accuracy],
                ['g_recall'], g_recall.tolist(),
                ['g_precision'], g_precision.tolist(),
                ['g_f_score'], g_f_score.tolist()
            ])
        # save original per-class metric
        save_data.extend([
            ['accuracy', accuracy],
            ['recall'], recall.tolist(),
            ['precision'], precision.tolist(),
            ['f_score'], f_score.tolist()
        ])
        save_csv(csv_name, save_data, devide=False)
        logger.info("Finished PGD Evaluation.")
        logger.info("metric: {} | accuracy: {:.4f} | precision: {:.4f} | recall: {:.4f} | f_score: {:.4f} | ".format(
            key, accuracy, np.mean(precision), np.mean(recall), np.mean(f_score)))
        # for name, value in zip(['precision', 'recall', 'f_score'], [precision, recall, f_score]):
        #     print(name, end=' | ')
        #     for v in value:
        #         print("{:.2f}".format(v * 100), end=' ')
        #     print()
    logger.info("[Remarks] {} | End of evaluation, model path {}".format(cfgs.remark, cfgs.model_path))
    for name, param in model.classifier.state_dict().items():
        if 'weight' in name:
            print('save classifier weight from module <{}> !'.format(name))
            all_outputs.update(CLASSIFIER_weight=model.classifier.state_dict()[name].clone().detach().cpu().numpy())
    all_outputs.update(LABLES=all_labels)
    if eval_pair_mode is not None:
        mode += '.{}'.format(eval_pair_mode)
    mmcv.dump(all_outputs, cfgs.model_path + '.{}.pkl'.format(mode))
    print('Data saved at {}'.format(cfgs.model_path + '.{}.pkl'.format(mode)))
    # save step-wise results during PGD attacking into .csv
    if step_eval:
        for i in range(cfgs.test_num_steps):
            save_csv('./{}_all_results.csv'.format(cfgs.dataset), [[' ', ' ', ' '], step_counters[i].avg_values.tolist()], msg=False)
    return
def eval_adv_test_whitebox_with_restart(model, device, cfgs, logger, test_loader, num_classes=10, print_freq=60, by_class=True,
                                        targeted=False, step_eval=False, save_features=True, mode='test', num_restart=2, free_bn=False):
    """
    Evaluate model by white-box attack with multiple random restarts,
    keeping the worst-case (highest-loss) adversarial outputs per sample.
    """
    model.eval()
    total_batches = len(test_loader)
    all_labels = []
    all_features = dict(CLEAN_features=[], PGD_features=[])
    all_outputs = dict(CLEAN=[], FGSM=[])
    best_adv_features = dict(CLEAN_features=[], PGD_features=[])
    best_adv_outputs = dict(CLEAN=[], FGSM=[])
    best_adv_loss = dict(FGSM=[])
    if mode == 'test':
        eval_steps = (5, 10, 20, 100)
        cfgs.test_num_steps = 100
    else:
        eval_steps = (5, 10, 20)
    for i_step in eval_steps:
        all_outputs.update({'PGD-{}'.format(i_step): []})
        best_adv_outputs.update({'PGD-{}'.format(i_step): []})
        best_adv_loss.update({'PGD-{}'.format(i_step): []})
    if step_eval:
        step_counters = [CountMeter(num_classes) for _ in range(cfgs.test_num_steps)]
    else:
        step_counters = None
    for i_re in range(num_restart):
        logger.info('[White box evaluation] Restart: {}'.format(i_re + 1))
        all_outputs = {key: [] for key in all_outputs.keys()}
        all_labels = []
        total = 0
        for i, (data, target) in enumerate(test_loader):
            data, target = data.to(device), target.to(device)
            total += len(data)
            # pgd attack
            X, y = Variable(data, requires_grad=True), Variable(target)
            outputs = _pgd_whitebox_basic(
                model, device, X, y, random=cfgs.random, epsilon=cfgs.test_epsilon,
                num_steps=cfgs.test_num_steps, step_size=cfgs.test_step_size, eval_steps=eval_steps,
                by_class=by_class, num_classes=num_classes, step_counters=step_counters,
                targeted=targeted, save_features=save_features, free_bn=free_bn,
            )
            for key in all_outputs.keys():
                all_outputs[key].append(outputs[key])
            all_labels.extend(target.cpu().detach().numpy().tolist())
            if save_features:
                for key in all_features.keys():
                    all_features[key].append(outputs[key])
            if i == 20 or (i + 1) % print_freq == 0:
                logger.info("[{}/{}] batches finished".format(i + 1, total_batches))
                for key, data in all_outputs.items():
                    data = np.vstack(data)
                    accuracy, precision, recall, f_score, true_sum = eval_metrics(data, all_labels)
                    logger.info("metric: {} | accuracy: {:.4f} | precision: {:.4f} | recall: {:.4f} | f_score: {:.4f} | ".format(
                        key, accuracy, np.mean(precision), np.mean(recall), np.mean(f_score)))
        if mode == 'train':
            return
        for k, v in all_outputs.items():
            print(k)
            v = np.vstack(v)
            loss = nn.CrossEntropyLoss(reduction='none')(torch.tensor(v), torch.tensor(all_labels, dtype=torch.int64))
            if i_re == 0:
                best_adv_outputs[k] = v
                best_adv_loss[k] = loss
            else:
                # bb = best_adv_loss[k] >= loss
                bw = best_adv_loss[k] < loss
                best_adv_outputs[k][bw, :] = v[bw, :]
                best_adv_loss[k][bw] = loss[bw]
    all_outputs.update(best_adv_outputs)

    # save metrics into .csv
    csv_name = '{}_all_results.csv'.format(cfgs.dataset)
    save_csv(csv_name, [[cfgs.model_path], [cfgs.remark]], devide=False)
    for key, data in all_outputs.items():
        save_data = [[' '], [key]]
        data = np.vstack(data)
        all_outputs[key] = data
        accuracy, precision, recall, f_score, true_sum = eval_metrics(data, all_labels)
        # use group evaluation for CIFAR100 and ImageNet
        if recall.shape[0] >= 100:
            g_recall = np.reshape(recall, (10, -1)).sum(-1)
            g_precision = np.reshape(precision, (10, -1)).sum(-1)
            g_f_score = np.reshape(f_score, (10, -1)).sum(-1)
            save_data.extend([
                ['accuracy', accuracy],
                ['g_recall'], g_recall.tolist(),
                ['g_precision'], g_precision.tolist(),
                ['g_f_score'], g_f_score.tolist()
            ])
        # save original per-class metric
        save_data.extend([
            ['recall'], recall.tolist(),
            ['precision'], precision.tolist(),
            ['f_score'], f_score.tolist()
        ])
        save_csv(csv_name, save_data, devide=False)
        logger.info("metric: {} | accuracy: {:.4f} | precision: {:.4f} | recall: {:.4f} | f_score: {:.4f} | ".format(
            key, accuracy, np.mean(precision), np.mean(recall), np.mean(f_score)))
        for name, value in zip(['precision', 'recall', 'f_score'], [precision, recall, f_score]):
            print(name, end=' | ')
            for v in value:
                print("{:.2f}".format(v * 100), end=' ')
            print()
    logger.info("[Remarks] {} | End of evaluation, model path {}".format(cfgs.remark, cfgs.model_path))
    if save_features:
        for key, features in all_features.items():
            all_features[key] = np.vstack(features)
    # save output and features into .pkl
    all_outputs.update(all_features)
    if hasattr(model.classifier, 'fc'):
        all_outputs.update(CLASSIFIER_weight=model.classifier.fc.weight.clone().detach().cpu().numpy())
    all_outputs.update(LABLES=all_labels)
    file_name = os.path.join(cfgs.model_dir, cfgs.model_path.split('/')[-1] + '.{}.pkl'.format(mode))
    mmcv.dump(all_outputs, file_name)
    print('Data saved at {}'.format(file_name))
    # save step-wise results during PGD attacking into .csv
    if step_eval:
        for i in range(cfgs.test_num_steps):
            save_csv('./{}_all_results.csv'.format(cfgs.dataset), [[' ', ' ', ' '], step_counters[i].avg_values.tolist()], msg=False)
    return
def pairing_attack(eval_pair_mode, num_classes):
    if 'neighbour' == eval_pair_mode:
        pairing = [(i + 1) % num_classes for i in range(num_classes)]
    elif 'head_tail' == eval_pair_mode:
        pairing = [num_classes - i - 1 for i in range(num_classes)]
    elif 'head_middle' == eval_pair_mode:
        middle = num_classes // 2
        pairing = [(i + middle) % num_classes for i in range(num_classes)]
    else:
        raise NameError
    return pairing
def _pgd_whitebox_pair(model,
logger,
device,
X,
y,
random,
epsilon,
num_steps,
step_size,
by_class=False,
num_classes=10,
step_counters=None,
targeted=False,
fix_target=None,
eval_steps = (5,10),
save_features = False,
**kwargs
):
    model.eval()
    out = model(X)
    outputs = dict(CLEAN=out.clone().detach().cpu().numpy())
    if fix_target is not None:
        new_y = fix_target
    else:
        # target the runner-up class: the most likely class that is not y
        tmp1 = torch.argsort(out, dim=1)[:, -2:]
        new_y = torch.where(tmp1[:, -1] == y, tmp1[:, -2], tmp1[:, -1])
    # PGD attack
    X_pgd = Variable(X.data, requires_grad=True)
    if random:
        random_noise = torch.FloatTensor(*X_pgd.shape).uniform_(-epsilon, epsilon).to(device)
        X_pgd = Variable(X_pgd.data + random_noise, requires_grad=True)
    for i_step in range(num_steps):
        opt = optim.SGD([X_pgd], lr=1e-3)
        opt.zero_grad()
        with torch.enable_grad():
            pgd_out = model(X_pgd)  # logits after i_step attack steps
            if i_step in eval_steps:
                outputs.update({'PGD-{}'.format(i_step): pgd_out.detach().cpu().numpy()})
            if step_counters is not None:
                if targeted:
                    loss = -nn.CrossEntropyLoss(reduction='none')(pgd_out, new_y)
                else:
                    loss = nn.CrossEntropyLoss(reduction='none')(pgd_out, y)
                step_counters[i_step].update(loss.clone().detach().cpu().numpy(), y.cpu().numpy())
                loss = loss.mean()
            else:
                if targeted:
                    loss = -nn.CrossEntropyLoss()(pgd_out, new_y)
                else:
                    loss = nn.CrossEntropyLoss()(pgd_out, y)
        loss.backward()
        # signed-gradient ascent step, then project back into the
        # L-inf epsilon-ball around X and clip to the valid pixel range
        eta = step_size * X_pgd.grad.data.sign()
        X_pgd = Variable(X_pgd.data + eta, requires_grad=True)
        eta = torch.clamp(X_pgd.data - X.data, -epsilon, epsilon)
        X_pgd = Variable(X.data + eta, requires_grad=True)
        X_pgd = Variable(torch.clamp(X_pgd, 0, 1.0), requires_grad=True)
    if save_features:
        outputs.update(CLEAN_features=model.backbone(X).detach().cpu().numpy())
        outputs.update(PGD_features=model.backbone(X_pgd).detach().cpu().numpy())
    return outputs
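The loop body above implements the standard PGD update: a signed-gradient ascent step followed by projection back into the L-inf epsilon-ball and clipping to the valid pixel range. A standalone numpy sketch of that update (illustrative only; the function name and toy values are assumptions, not part of the original code):

```python
import numpy as np

def pgd_project(x_adv, x_clean, grad, step_size, epsilon):
    """One PGD step: signed-gradient ascent, then L-inf ball projection
    and clipping to the valid [0, 1] pixel range."""
    x_adv = x_adv + step_size * np.sign(grad)
    eta = np.clip(x_adv - x_clean, -epsilon, epsilon)  # stay inside the eps-ball
    return np.clip(x_clean + eta, 0.0, 1.0)            # stay a valid image

x = np.array([0.5, 0.5])
x_new = pgd_project(x, x, np.array([3.0, -3.0]), step_size=0.1, epsilon=0.031)
# the raw step of 0.1 per coordinate is projected back to at most epsilon
assert np.all(np.abs(x_new - x) <= 0.031 + 1e-9)
```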
| 39.372852 | 132 | 0.579577 | 2,979 | 22,915 | 4.24572 | 0.085935 | 0.019924 | 0.030361 | 0.012334 | 0.816967 | 0.78716 | 0.746047 | 0.726755 | 0.690544 | 0.683903 | 0 | 0.012606 | 0.279948 | 22,915 | 581 | 133 | 39.44062 | 0.753939 | 0.039974 | 0 | 0.714286 | 0 | 0.008658 | 0.0551 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017316 | false | 0.002165 | 0.02381 | 0 | 0.062771 | 0.034632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7cf65431b69b056e5179c10b7de515ca7e2c62a3 | 245 | py | Python | tests/test_postprocessors/test_plugins.py | kanderso-nrel/pytest-notebook | 40381be0963a8866d1a46994595844bbfb4e66c1 | [
"BSD-3-Clause"
] | 37 | 2019-08-10T20:09:38.000Z | 2022-03-01T22:17:59.000Z | tests/test_postprocessors/test_plugins.py | kanderso-nrel/pytest-notebook | 40381be0963a8866d1a46994595844bbfb4e66c1 | [
"BSD-3-Clause"
] | 15 | 2019-12-09T14:34:14.000Z | 2022-01-18T13:32:50.000Z | tests/test_postprocessors/test_plugins.py | kanderso-nrel/pytest-notebook | 40381be0963a8866d1a46994595844bbfb4e66c1 | [
"BSD-3-Clause"
] | 7 | 2020-04-14T10:57:15.000Z | 2022-02-10T09:18:30.000Z | from pytest_notebook.post_processors import document_processors


def test_documentation(file_regression):
    """Test all the plugins are loading, by generating combined documentation."""
    file_regression.check(document_processors() + "\n")
| 35 | 81 | 0.795918 | 29 | 245 | 6.482759 | 0.758621 | 0.191489 | 0.287234 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118367 | 245 | 6 | 82 | 40.833333 | 0.87037 | 0.289796 | 0 | 0 | 0 | 0 | 0.011905 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6b491988f4573f40ec70d957e47975bfda5a91dc | 41 | py | Python | samples/src/main/resources/datasets/python/132.py | sritchie/kotlingrad | 8165ed1cd77220a5347c58cded4c6f2bcf22ee30 | [
"Apache-2.0"
] | 11 | 2020-12-19T01:19:44.000Z | 2021-12-25T20:43:33.000Z | src/main/resources/datasets/python/132.py | breandan/katholic | 081c39f3acc73ff41f5865563debe78a36e1038f | [
"Apache-2.0"
] | null | null | null | src/main/resources/datasets/python/132.py | breandan/katholic | 081c39f3acc73ff41f5865563debe78a36e1038f | [
"Apache-2.0"
] | 2 | 2021-01-25T07:59:20.000Z | 2021-08-07T07:13:49.000Z | def unaryOp6(a, b):
    return ~a and -b
| 13.666667 | 20 | 0.585366 | 8 | 41 | 3 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 0.268293 | 41 | 2 | 21 | 20.5 | 0.766667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
8656d92688d7f81e3845d9dd6345010db10ceede | 47,944 | py | Python | augur/metrics/issue/routes.py | computationalmystic/sengfs19-group8 | ef0c2c052aef5b936eba4eb83aeff36d6bf81db5 | [
"MIT"
] | 3 | 2019-10-31T19:07:48.000Z | 2019-11-20T23:14:15.000Z | augur/metrics/issue/routes.py | computationalmystic/sengfs19-group8 | ef0c2c052aef5b936eba4eb83aeff36d6bf81db5 | [
"MIT"
] | null | null | null | augur/metrics/issue/routes.py | computationalmystic/sengfs19-group8 | ef0c2c052aef5b936eba4eb83aeff36d6bf81db5 | [
"MIT"
] | null | null | null | from flask import Response
def create_issue_routes(server):
metrics = server._augur.metrics
"""
@api {get} /repo-groups/:repo_group_id/issues_top_ten_number_of_assignees Issues Top Ten Number of Assignees (Repo Group)
@apiName issues-top-ten-number-of-assignees
@apiGroup Evolution
@apiDescription Top ten assignees ranked by the number of issues assigned to them.
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"rg_name": "Netflix",
"open_count": 1,
"date": "2017-09-11T00:00:00.000Z"
},
{
"rg_name": "Netflix",
"open_count": 4,
"date": "2019-06-10T00:00:00.000Z"
}
]
"""
server.addRepoGroupMetric(metrics.issues_top_ten_number_of_assignees, 'issues-top-ten-number-of-assignees')
"""
@api {get} /repo-groups/:repo_group_id/issues-new Issues New (Repo Group)
@apiName issues-new-repo-group
@apiGroup Evolution
@apiDescription Time series of number of new issues opened during a certain period.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/Issues_New.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21000,
"repo_name": "rails",
"date": "2019-01-01T00:00:00.000Z",
"issues": 318
},
{
"repo_id": 21002,
"repo_name": "acts_as_list",
"date": "2009-01-01T00:00:00.000Z",
"issues": 1
},
{
"repo_id": 21002,
"repo_name": "acts_as_list",
"date": "2010-01-01T00:00:00.000Z",
"issues": 7
}
]
"""
server.addRepoGroupMetric(metrics.issues_new, 'issues-new')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-new Issues New (Repo)
@apiName issues-new-repo
@apiGroup Evolution
@apiDescription Time series of number of new issues opened during a certain period.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/Issues_New.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "rails",
"date": "2015-01-01T00:00:00.000Z",
"issues": 116
},
{
"repo_name": "rails",
"date": "2016-01-01T00:00:00.000Z",
"issues": 196
},
{
"repo_name": "rails",
"date": "2017-01-01T00:00:00.000Z",
"issues": 180
}
]
"""
server.addRepoMetric(metrics.issues_new, 'issues-new')
"""
@api {get} /repo-groups/:repo_group_id/issues-active Issues Active (Repo Group)
@apiName issues-active-repo-group
@apiGroup Evolution
@apiDescription Time series of number of issues that showed some activity during a certain period.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/Issues_Active.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21039,
"repo_name": "rails_xss",
"date": "2019-01-01T00:00:00.000Z",
"issues": 18
},
{
"repo_id": 21041,
"repo_name": "prototype-rails",
"date": "2019-01-01T00:00:00.000Z",
"issues": 20
},
{
"repo_id": 21043,
"repo_name": "sprockets-rails",
"date": "2015-01-01T00:00:00.000Z",
"issues": 102
}
]
"""
server.addRepoGroupMetric(metrics.issues_active, 'issues-active')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-active Issues Active (Repo)
@apiName issues-active-repo
@apiGroup Evolution
@apiDescription Time series of number of issues that showed some activity during a certain period.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/Issues_Active.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "rails",
"date": "2011-01-01T00:00:00.000Z",
"issues": 30
},
{
"repo_name": "rails",
"date": "2012-01-01T00:00:00.000Z",
"issues": 116
},
{
"repo_name": "rails",
"date": "2013-01-01T00:00:00.000Z",
"issues": 479
}
]
"""
server.addRepoMetric(metrics.issues_active, 'issues-active')
"""
@api {get} /repo-groups/:repo_group_id/issues-closed Issues Closed (Repo Group)
@apiName issues-closed-repo-group
@apiGroup Evolution
@apiDescription Time series of number of issues closed during a certain period.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/Issues_Closed.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21681,
"repo_name": "incubator-zipkin",
"date": "2019-01-01T00:00:00.000Z",
"issues": 425
},
{
"repo_id": 21682,
"repo_name": "incubator-dubbo",
"date": "2013-01-01T00:00:00.000Z",
"issues": 7
},
{
"repo_id": 21682,
"repo_name": "incubator-dubbo",
"date": "2014-01-01T00:00:00.000Z",
"issues": 47
}
]
"""
server.addRepoGroupMetric(metrics.issues_closed, 'issues-closed')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-closed Issues Closed (Repo)
@apiName issues-closed-repo
@apiGroup Evolution
@apiDescription Time series of number of issues closed during a certain period.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/Issues_Closed.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "incubator-pagespeed-ngx",
"date": "2012-01-01T00:00:00.000Z",
"issues": 97
},
{
"repo_name": "incubator-pagespeed-ngx",
"date": "2013-01-01T00:00:00.000Z",
"issues": 395
},
{
"repo_name": "incubator-pagespeed-ngx",
"date": "2014-01-01T00:00:00.000Z",
"issues": 265
}
]
"""
server.addRepoMetric(metrics.issues_closed, 'issues-closed')
"""
@api {get} /repo-groups/:repo_group_id/issue-duration Issue Duration (Repo Group)
@apiName issue-duration-repo-group
@apiGroup Evolution
@apiDescription Time since an issue is proposed until it is closed.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21017,
"repo_name": "ssl_requirement",
"issue_id": 50320,
"created_at": "2011-05-06T20:20:05.000Z",
"closed_at": "2011-05-06T20:21:47.000Z",
"duration": "0 days 00:01:42.000000000"
},
{
"repo_id": 21027,
"repo_name": "rails-contributors",
"issue_id": 50328,
"created_at": "2019-06-20T22:56:38.000Z",
"closed_at": "2019-06-21T20:17:28.000Z",
"duration": "0 days 21:20:50.000000000"
},
{
"repo_id": 21027,
"repo_name": "rails-contributors",
"issue_id": 50329,
"created_at": "2019-06-20T22:01:52.000Z",
"closed_at": "2019-06-22T02:29:03.000Z",
"duration": "1 days 04:27:11.000000000"
}
]
"""
server.addRepoGroupMetric(metrics.issue_duration, 'issue-duration')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issue-duration Issue Duration (Repo)
@apiName issue-duration-repo
@apiGroup Evolution
@apiDescription Time since an issue is proposed until it is closed.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "exception_notification",
"issue_id": 50306,
"created_at": "2011-02-13T03:46:06.000Z",
"closed_at": "2011-04-14T23:27:33.000Z",
"duration": "60 days 19:41:27.000000000"
},
{
"repo_name": "exception_notification",
"issue_id": 50308,
"created_at": "2011-01-19T18:47:41.000Z",
"closed_at": "2013-12-09T13:51:03.000Z",
"duration": "1054 days 19:03:22.000000000"
}
]
"""
server.addRepoMetric(metrics.issue_duration, 'issue-duration')
"""
@api {get} /repo-groups/:repo_group_id/issue-participants Issue Participants (Repo Group)
@apiName issue-participants-repo-group
@apiGroup Evolution
@apiDescription How many persons participated in the discussion of issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21027,
"repo_name": "rails-contributors",
"issue_id": 50328,
"created_at": "2019-06-20T22:56:38.000Z",
"participants": 1
},
{
"repo_id": 21030,
"repo_name": "arel",
"issue_id": 50796,
"created_at": "2017-03-02T21:14:46.000Z",
"participants": 1
},
{
"repo_id": 21030,
"repo_name": "arel",
"issue_id": 50795,
"created_at": "2017-03-24T15:39:08.000Z",
"participants": 2
}
]
"""
server.addRepoGroupMetric(metrics.issue_participants, 'issue-participants')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issue-participants Issue Participants (Repo)
@apiName issue-participants-repo
@apiGroup Evolution
@apiDescription How many persons participated in the discussion of issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "arel",
"issue_id": 50796,
"created_at": "2017-03-02T21:14:46.000Z",
"participants": 1
},
{
"repo_name": "arel",
"issue_id": 50795,
"created_at": "2017-03-24T15:39:08.000Z",
"participants": 2
}
]
"""
server.addRepoMetric(metrics.issue_participants, 'issue-participants')
"""
@api {get} /repo-groups/:repo_group_id/issue-backlog Issue Backlog (Repo Group)
@apiName issue-backlog-repo-group
@apiGroup Evolution
@apiDescription Number of issues currently open.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21055,
"repo_name": "cache_digests",
"issue_backlog": 21
},
{
"repo_id": 21056,
"repo_name": "rails-dev-box",
"issue_backlog": 1
},
{
"repo_id": 21058,
"repo_name": "activerecord-session_store",
"issue_backlog": 24
}
]
"""
server.addRepoGroupMetric(metrics.issue_backlog, 'issue-backlog')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issue-backlog Issue Backlog (Repo)
@apiName issue-backlog-repo
@apiGroup Evolution
@apiDescription Number of issues currently open.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_name":"render_component",
"issue_backlog": 3
}
]
"""
server.addRepoMetric(metrics.issue_backlog, 'issue-backlog')
"""
@api {get} /repo-groups/:repo_group_id/issue-throughput Issue Throughput (Repo Group)
@apiName issue-throughput-repo-group
@apiGroup Evolution
@apiDescription Ratio of issues closed to total issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21681,
"repo_name": "incubator-zipkin",
"throughput": 0.819125
},
{
"repo_id": 21682,
"repo_name": "incubator-dubbo",
"throughput": 0.861896
}
]
"""
server.addRepoGroupMetric(metrics.issue_throughput, 'issue-throughput')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issue-throughput Issue Throughput (Repo)
@apiName issue-throughput-repo
@apiGroup Evolution
@apiDescription Ratio of issues closed to total issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/focus_areas/code_development.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "rails-contributors",
"throughput": 0.997531
}
]
"""
server.addRepoMetric(metrics.issue_throughput, 'issue-throughput')
"""
@api {get} /repo-groups/:repo_group_id/issues-first-time-opened New Contributors of Issues (Repo Group)
@apiName New Contributors of Issues(Repo Group)
@apiGroup Evolution
@apiDescription Number of persons opening an issue for the first time.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-first-time-opened.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"issue_date": "2018-05-20T00:00:00.000Z",
"count": 3,
"repo_name": "rails",
"repo_id": 21000
},
{
"issue_date": "2019-06-03T00:00:00.000Z",
"count": 23,
"repo_name": "rails",
"repo_id": 21000
}
]
"""
server.addRepoGroupMetric(
metrics.issues_first_time_opened, 'issues-first-time-opened')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-first-time-opened New Contributors of Issues (Repo)
@apiName New Contributors of Issues(Repo)
@apiGroup Evolution
@apiDescription Number of persons opening an issue for the first time.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-first-time-opened.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"issue_date": "2018-05-20T00:00:00.000Z",
"count": 3,
"repo_name": "rails"
},
{
"issue_date": "2019-06-03T00:00:00.000Z",
"count": 23,
"repo_name": "rails"
}
]
"""
server.addRepoMetric(
metrics.issues_first_time_opened, 'issues-first-time-opened')
"""
@api {get} /repo-groups/:repo_group_id/issues-first-time-closed Closed Issues New Contributors (Repo Group)
@apiName Closed Issues New Contributors(Repo Group)
@apiGroup Evolution
@apiDescription Number of persons closing an issue for the first time.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-first-time-closed.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"issue_date": "2018-05-20T00:00:00.000Z",
"count": 3,
"repo_name": "rails",
"repo_id": 21000
},
{
"issue_date": "2019-06-03T00:00:00.000Z",
"count": 23,
"repo_name": "rails",
"repo_id": 21000
}
]
"""
server.addRepoGroupMetric(
metrics.issues_first_time_closed, 'issues-first-time-closed')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-first-time-closed Closed Issues New Contributors (Repo)
@apiName Closed Issues New Contributors(Repo)
@apiGroup Evolution
@apiDescription Number of persons closing an issue for the first time.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-first-time-closed.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiParam {string=day, week, month, year} [period="day"] Periodicity specification.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"issue_date": "2018-05-20T00:00:00.000Z",
"count": 3,
"repo_name": "rails"
},
{
"issue_date": "2019-06-03T00:00:00.000Z",
"count": 23,
"repo_name": "rails"
}
]
"""
server.addRepoMetric(
metrics.issues_first_time_closed, 'issues-first-time-closed')
"""
@api {get} /repo-groups/:repo_group_id/open-issues-count Open Issues Count (Repo Group)
@apiName open-issues-count-repo-group
@apiGroup Evolution
@apiDescription Count of open issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/contributors-new.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"rg_name": "Netflix",
"open_count": 1,
"date": "2017-09-11T00:00:00.000Z"
},
{
"rg_name": "Netflix",
"open_count": 4,
"date": "2019-06-10T00:00:00.000Z"
}
]
"""
server.addRepoGroupMetric(metrics.open_issues_count, 'open-issues-count')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/open-issues-count Open Issues Count (Repo)
@apiName open-issues-count-repo
@apiGroup Evolution
@apiDescription Count of open issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/contributors-new.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21681,
"open_count": 18,
"date": "2019-04-15T00:00:00.000Z"
},
{
"repo_id": 21681,
"open_count": 16,
"date": "2019-04-22T00:00:00.000Z"
}
]
"""
server.addRepoMetric(metrics.open_issues_count, 'open-issues-count')
"""
@api {get} /repo-groups/:repo_group_id/closed-issues-count Closed Issues Count (Repo Group)
@apiName closed-issues-count-repo-group
@apiGroup Evolution
@apiDescription Count of closed issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/contributors-new.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"rg_name": "Apache",
"closed_count": 4,
"date": "2014-06-02T00:00:00.000Z"
},
{
"rg_name": "Apache",
"closed_count": 6,
"date": "2014-06-09T00:00:00.000Z"
}
]
"""
server.addRepoGroupMetric(metrics.closed_issues_count, 'closed-issues-count')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/closed-issues-count Closed Issues Count (Repo)
@apiName closed-issues-count-repo
@apiGroup Evolution
@apiDescription Count of closed issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/contributors-new.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21681,
"closed_count": 26,
"date": "2018-11-26T00:00:00.000Z"
},
{
"repo_id": 21681,
"closed_count": 14,
"date": "2018-12-03T00:00:00.000Z"
}
]
"""
server.addRepoMetric(metrics.closed_issues_count, 'closed-issues-count')
"""
@api {get} /repo-groups/:repo_group_id/issues-open-age Open Issue Age (Repo Group)
@apiName Open Issue Age(Repo Group)
@apiGroup Evolution
@apiDescription Age of open issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-open-age.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21000,
"repo_name": "rails",
"issue_id": 38318,
"date": "2009-05-15T19:48:43.000Z",
"open_date": 3696
},
{
"repo_id": 21000,
"repo_name": "rails",
"issue_id": 38317,
"date": "2009-05-16T14:35:40.000Z",
"open_date": 3695
}
]
"""
server.addRepoGroupMetric(
metrics.issues_open_age, 'issues-open-age')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-open-age Open Issue Age (Repo)
@apiName Open Issue Age(Repo)
@apiGroup Evolution
@apiDescription Age of open issues.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-open-age.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21000,
"repo_name": "rails",
"issue_id": 38318,
"date": "2009-05-15T19:48:43.000Z",
"open_date": 3696
},
{
"repo_id": 21000,
"repo_name": "rails",
"issue_id": 38317,
"date": "2009-05-16T14:35:40.000Z",
"open_date": 3695
}
]
"""
server.addRepoMetric(
metrics.issues_open_age, 'issues-open-age')
"""
@api {get} /repo-groups/:repo_group_id/issues-closed-resolution-duration Closed Issue Resolution Duration (Repo Group)
@apiName Closed Issue Resolution Duration(Repo Group)
@apiGroup Evolution
@apiDescription Duration of time for issues to be resolved.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-closed-resolution-duration.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"repo_name":"incubator-dubbo",
"gh_issue_number":4110,
"issue_title":"rm incubating word",
"created_at":"2019-05-22T03:18:13.000Z",
"closed_at":"2019-05-22T05:27:29.000Z",
"diffdate":0.0
},
{
"repo_name":"incubator-dubbo",
"gh_issue_number":4111,
"issue_title":"nacos registry serviceName may conflict",
"created_at":"2019-05-22T03:30:23.000Z",
"closed_at":"2019-05-23T14:36:17.000Z",
"diffdate":1.0
}
]
"""
server.addRepoGroupMetric(
metrics.issues_closed_resolution_duration, 'issues-closed-resolution-duration')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-closed-resolution-duration Closed Issue Resolution Duration (Repo)
@apiName Closed Issue Resolution Duration(Repo)
@apiGroup Evolution
@apiDescription Duration of time for issues to be resolved.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-closed-resolution-duration.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21682,
"repo_name":"incubator-dubbo",
"gh_issue_number":4223,
"issue_title":"Cloud Native PR",
"created_at":"2019-05-31T07:55:44.000Z",
"closed_at":"2019-06-17T03:12:48.000Z",
"diffdate":16.0
},
{
"repo_id": 21682,
"repo_name":"incubator-dubbo",
"gh_issue_number":4131,
"issue_title":"Reduce context switching cost by optimizing thread model on consumer side.",
"created_at":"2019-05-23T06:18:21.000Z",
"closed_at":"2019-06-03T08:07:27.000Z",
"diffdate":11.0
}
]
"""
server.addRepoMetric(
metrics.issues_closed_resolution_duration, 'issues-closed-resolution-duration')
"""
@api {get} /repo-groups/:repo_group_id/issues-maintainer-response-duration Issue Response Time (Repo Group)
@apiName Issue Response Time(Repo Group)
@apiGroup Evolution
@apiDescription Average time, in days, until the first maintainer comment on an issue.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-maintainer-response-duration.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21987,
"repo_name": "qpid-proton",
"average_days_comment": 27.1111111111
},
{
"repo_id": 22252,
"repo_name": "cordova-create",
"average_days_comment": 0.8
}
]
"""
server.addRepoGroupMetric(metrics.issues_maintainer_response_duration, 'issues-maintainer-response-duration')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issues-maintainer-response-duration Issue Response Time (Repo)
@apiName Issue Response Time(Repo)
@apiGroup Evolution
@apiDescription Average time, in days, until the first maintainer comment on an issue.
<a href="https://github.com/chaoss/wg-evolution/blob/master/metrics/issues-maintainer-response-duration.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiParam {string} repo_id Repository ID.
@apiParam {string} [begin_date="1970-1-1 0:0:0"] Beginning date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiParam {string} [end_date="current date"] Ending date specification. E.g. values: `2018`, `2018-05`, `2019-05-01`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21987,
"repo_name": "qpid-proton",
"average_days_comment": 27.1111111111
}
]
"""
server.addRepoMetric(metrics.issues_maintainer_response_duration, 'issues-maintainer-response-duration')
"""
@api {get} /repo-groups/:repo_group_id/average-issue-resolution-time Average Issue Resolution Time (Repo Group)
@apiName average-issue-resolution-time-repo-group
@apiGroup Risk
@apiDescription The average issue resolution time.
<a href="https://github.com/chaoss/wg-risk/blob/master/focus-areas/business-risk.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21353,
"repo_name": "open_id_authentication",
"avg_issue_resolution_time": "1413 days 15:39:48"
},
{
"repo_id": 21362,
"repo_name": "country_select",
"avg_issue_resolution_time": "140 days 09:37:58.2"
}
]
"""
server.addRepoGroupMetric(metrics.average_issue_resolution_time, 'average-issue-resolution-time')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/average-issue-resolution-time Average Issue Resolution Time (Repo)
@apiName average-issue-resolution-time-repo
@apiGroup Risk
@apiDescription The average issue resolution time.
<a href="https://github.com/chaoss/wg-risk/blob/master/focus-areas/business-risk.md">CHAOSS Metric Definition</a>
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_name": "maven-release",
"avg_issue_resolution_time": "276 days 13:54:13.2"
}
]
"""
server.addRepoMetric(metrics.average_issue_resolution_time, 'average-issue-resolution-time')
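The `avg_issue_resolution_time` values in the examples above are interval strings of the form `<days> days HH:MM:SS[.f]`. A client might parse them into `datetime.timedelta` objects with a sketch like this (the helper name and the assumption that every value includes a days component are illustrative, not part of Augur):

```python
# Hypothetical helper (not part of Augur): parse the interval strings shown in
# the avg_issue_resolution_time examples, e.g. "1413 days 15:39:48" or
# "140 days 09:37:58.2", into datetime.timedelta objects.
from datetime import timedelta

def parse_interval(text):
    days_part, time_part = text.split(" days ")
    hours, minutes, seconds = time_part.split(":")
    return timedelta(
        days=int(days_part),
        hours=int(hours),
        minutes=int(minutes),
        seconds=float(seconds),
    )
```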
"""
@api {get} /repo-groups/:repo_group_id/issue-comments-mean Issue Comments Mean (Repo Group)
@apiName issue-comments-mean-repo-group
@apiGroup Experimental
@apiDescription Mean (average) of issue comments per day.
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} [group_by="week"] Allows for results to be grouped by day, week, month, or year. E.g. values: `year`, `day`, `month`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21326,
"date": "2018-01-01T00:00:00.000Z",
"mean":0.6191780822
},
{
"repo_id": 21326,
"date": "2019-01-01T00:00:00.000Z",
"mean": 0.7671232877
},
{
"repo_id": 21327,
"date": "2015-01-01T00:00:00.000Z",
"mean": 0.0602739726
}
]
"""
server.addRepoGroupMetric(metrics.issue_comments_mean, 'issue-comments-mean')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issue-comments-mean Issue Comments Mean (Repo)
@apiName issue-comments-mean-repo
@apiGroup Experimental
@apiDescription Mean (average) of issue comments per day.
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21326,
"date": "2018-01-01T00:00:00.000Z",
"mean":0.6191780822
},
{
"repo_id": 21326,
"date": "2019-01-01T00:00:00.000Z",
"mean": 0.7671232877
}
]
"""
server.addRepoMetric(metrics.issue_comments_mean, 'issue-comments-mean')
"""
@api {get} /repo-groups/:repo_group_id/issue-comments-mean-std Issue Comments Mean Std (Repo Group)
@apiName issue-comments-mean-std-repo-group
@apiGroup Experimental
@apiDescription Mean (average) and standard deviation of issue comments per day.
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} [group_by="week"] Allows for results to be grouped by day, week, month, or year. E.g. values: `year`, `day`, `month`
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21326,
"date": "2018-01-01T00:00:00.000Z",
"mean":0.6191780822
},
{
"repo_id": 21326,
"date": "2019-01-01T00:00:00.000Z",
"mean": 0.7671232877
},
{
"repo_id": 21327,
"date": "2015-01-01T00:00:00.000Z",
"mean": 0.0602739726
}
]
"""
server.addRepoGroupMetric(metrics.issue_comments_mean_std, 'issue-comments-mean-std')
"""
@api {get} /repo-groups/:repo_group_id/repos/:repo_id/issue-comments-mean-std Issue Comments Mean Std (Repo)
@apiName issue-comments-mean-std-repo
@apiGroup Experimental
@apiDescription Mean (average) and standard deviation of issue comments per day.
@apiParam {string} repo_group_id Repository Group ID.
@apiParam {string} repo_id Repository ID.
@apiSuccessExample {json} Success-Response:
[
{
"repo_id": 21000,
"date": "2011-01-01T00:00:00.000Z",
"average": 2.5,
"standard_deviation":1.7159383568
},
{
"repo_id": 21000,
"date": "2012-01-01T00:00:00.000Z",
"average": 1.9666666667,
"standard_deviation": 1.3767361036
}
]
"""
server.addRepoMetric(metrics.issue_comments_mean_std, 'issue-comments-mean-std')
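For reference, the mean and standard deviation these endpoints report can be sketched with the standard library. This is an illustrative analogue over hypothetical daily comment counts, not Augur's actual SQL, and it uses the population standard deviation (Augur may use the sample version):

```python
# Illustrative analogue (not Augur's implementation) of a per-period mean and
# standard deviation over daily issue-comment counts.
import statistics

def comments_mean_std(daily_counts):
    return statistics.mean(daily_counts), statistics.pstdev(daily_counts)
```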
"""
@api {get} /repo-groups/:repo_group_id/abandoned_issues Abandoned Issues (Repo Group)
@apiName Abandoned Issues
@apiGroup Experimental
@apiDescription List of abandoned issues (last updated >= 1 year ago).
@apiParam {string} repo_group_id Repository Group ID.
@apiSuccessExample {json} Success-Response:
[
{
"updated_at": "2017-10-30T06:52:19.000Z",
"issue_id": 125071,
"repo_id": 22004
},
{
"updated_at": "2018-01-10T06:02:16.000Z",
"issue_id": 125070,
"repo_id": 22003
}
]
"""
server.addRepoGroupMetric(metrics.abandoned_issues, 'abandoned_issues')
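The endpoints documented above share the same URL shape. As a hedged sketch of client-side usage: the host, port, and `/api/unstable` prefix below are assumptions about a default deployment, not taken from these docstrings; only the path segments and the `begin_date`/`end_date` query parameters come from the apidoc comments.

```python
# Hypothetical client-side helper: build a request URL for one of the
# repo-group metric endpoints documented above. The base URL is an assumed
# deployment default, not part of the apidoc comments.
from urllib.parse import urlencode

def metric_url(repo_group_id, metric, params=None):
    base = "http://localhost:5000/api/unstable"  # assumed default
    url = f"{base}/repo-groups/{repo_group_id}/{metric}"
    if params:
        url += "?" + urlencode(params)
    return url
```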
# lang/py/rfc/08/Graphics/Primitive/fill.py (from ch1huizong/learning, MIT license)
def draw():
print('Fill draw')
# tests/v2/test_0118-numba-cpointers.py (from douglasdavis/awkward-1.0, BSD-3-Clause license)
] | null | null | null | # BSD 3-Clause License; see https://github.com/scikit-hep/awkward-1.0/blob/main/LICENSE
import sys
import operator
import pytest # noqa: F401
import numpy as np # noqa: F401
import awkward as ak # noqa: F401
numba = pytest.importorskip("numba")
ak_numba_arrayview = pytest.importorskip("awkward._v2._connect.numba.arrayview")
ak._v2.numba.register_and_check()
def test_views():
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array([1.1, 2.2, 3.3, 4.4, 5.5])
).toarray()
) == [1.1, 2.2, 3.3, 4.4, 5.5]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array(np.array([[1.1, 2.2, 3.3], [4.4, 5.5, 6.6]]))
).toarray()
) == [[1.1, 2.2, 3.3], [4.4, 5.5, 6.6]]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array([[1.1, 2.2, 3.3], [], [4.4, 5.5]])
).toarray()
) == [[1.1, 2.2, 3.3], [], [4.4, 5.5]]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array([1.1, 2.2, None, 3.3, None, 4.4, 5.5])
).toarray()
) == [1.1, 2.2, None, 3.3, None, 4.4, 5.5]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array(
[
{"x": 0.0, "y": []},
{"x": 1.1, "y": [1, 1]},
{"x": 2.2, "y": [2, 2, 2]},
],
check_valid=True,
)
).toarray()
) == [{"x": 0.0, "y": []}, {"x": 1.1, "y": [1, 1]}, {"x": 2.2, "y": [2, 2, 2]}]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array([(0.0, []), (1.1, [1, 1]), (2.2, [2, 2, 2])])
).toarray()
) == [(0.0, []), (1.1, [1, 1]), (2.2, [2, 2, 2])]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.ArrayView.fromarray(
ak._v2.highlevel.Array([1.1, 2.2, 3.3, [], [1], [2, 2]])
).toarray()
) == [1.1, 2.2, 3.3, [], [1], [2, 2]]
def test_unbox():
array = ak._v2.highlevel.Array([[1.1, 2.2, 3.3], [], [4.4, 5.5]])
@numba.njit
def f1(x):
return 3.14
assert f1(array) == 3.14
def test_box():
array = ak._v2.highlevel.Array([[1.1, 2.2, 3.3], [], [4.4, 5.5]])
@numba.njit
def f1(x):
return x
assert ak._v2.operations.convert.to_list(f1(array)) == [
[1.1, 2.2, 3.3],
[],
[4.4, 5.5],
]
def test_refcount():
array = ak._v2.highlevel.Array([[1.1, 2.2, 3.3], [], [4.4, 5.5]])
array.numba_type
assert [
sys.getrefcount(x) == 2
for x in (
array._numbaview,
array._numbaview.lookup,
array._numbaview.lookup.positions,
array._numbaview.lookup.arrayptrs,
)
]
for _ in range(3):
@numba.njit
def f1(x):
return 3.14
for _ in range(10):
f1(array)
assert [
sys.getrefcount(x) == 2
for x in (
array._numbaview,
array._numbaview.lookup,
array._numbaview.lookup.positions,
array._numbaview.lookup.arrayptrs,
)
]
for _ in range(3):
@numba.njit
def f2(x):
return x
for _ in range(10):
y = f2(array)
assert [
sys.getrefcount(x) == 2
for x in (
array._numbaview,
array._numbaview.lookup,
array._numbaview.lookup.positions,
array._numbaview.lookup.arrayptrs,
)
]
for _ in range(3):
@numba.njit
def f3(x):
return x, x
for _ in range(10):
y = f3(array) # noqa: F841 (checking reference count)
assert [
sys.getrefcount(x) == 2
for x in (
array._numbaview,
array._numbaview.lookup,
array._numbaview.lookup.positions,
array._numbaview.lookup.arrayptrs,
)
]
def test_len():
array = ak._v2.highlevel.Array([1.1, 2.2, 3.3, 4.4, 5.5])
@numba.njit
def f1(x):
return len(x)
assert f1(array) == 5
def test_NumpyArray_getitem():
array = ak._v2.highlevel.Array([1.1, 2.2, 3.3, 4.4, 5.5])
@numba.njit
def f1(x, i):
return x[i]
assert f1(array, 0) == 1.1
assert f1(array, 1) == 2.2
assert f1(array, 2) == 3.3
assert f1(array, 3) == 4.4
assert f1(array, 4) == 5.5
assert f1(array, -1) == 5.5
assert f1(array, -2) == 4.4
assert f1(array, -3) == 3.3
assert f1(array, -4) == 2.2
assert f1(array, -5) == 1.1
with pytest.raises(ValueError) as err:
assert f1(array, 5)
assert str(err.value) == "slice index out of bounds"
with pytest.raises(ValueError) as err:
assert f1(array, -6)
assert str(err.value) == "slice index out of bounds"
@numba.njit
def f2(x, i1, i2):
return x[i1:i2]
assert ak._v2.operations.convert.to_list(f2(array, 0, 5)) == [
1.1,
2.2,
3.3,
4.4,
5.5,
]
assert ak._v2.operations.convert.to_list(f2(array, 1, 5)) == [2.2, 3.3, 4.4, 5.5]
assert ak._v2.operations.convert.to_list(f2(array, 2, 5)) == [3.3, 4.4, 5.5]
assert ak._v2.operations.convert.to_list(f2(array, 3, 5)) == [4.4, 5.5]
assert ak._v2.operations.convert.to_list(f2(array, 4, 5)) == [5.5]
assert ak._v2.operations.convert.to_list(f2(array, 5, 5)) == []
assert ak._v2.operations.convert.to_list(f2(array, 6, 5)) == []
assert ak._v2.operations.convert.to_list(f2(array, -1, 5)) == [5.5]
assert ak._v2.operations.convert.to_list(f2(array, -2, 5)) == [4.4, 5.5]
assert ak._v2.operations.convert.to_list(f2(array, -3, 5)) == [3.3, 4.4, 5.5]
assert ak._v2.operations.convert.to_list(f2(array, -4, 5)) == [2.2, 3.3, 4.4, 5.5]
assert ak._v2.operations.convert.to_list(f2(array, -5, 5)) == [
1.1,
2.2,
3.3,
4.4,
5.5,
]
assert ak._v2.operations.convert.to_list(f2(array, -6, 5)) == [
1.1,
2.2,
3.3,
4.4,
5.5,
]
assert ak._v2.operations.convert.to_list(f2(array, 0, -6)) == []
assert ak._v2.operations.convert.to_list(f2(array, 0, -5)) == []
assert ak._v2.operations.convert.to_list(f2(array, 0, -4)) == [1.1]
assert ak._v2.operations.convert.to_list(f2(array, 0, -3)) == [1.1, 2.2]
assert ak._v2.operations.convert.to_list(f2(array, 0, -2)) == [1.1, 2.2, 3.3]
assert ak._v2.operations.convert.to_list(f2(array, 0, -1)) == [1.1, 2.2, 3.3, 4.4]
assert ak._v2.operations.convert.to_list(f2(array, 0, 6)) == [
1.1,
2.2,
3.3,
4.4,
5.5,
]
assert ak._v2.operations.convert.to_list(f2(array, 0, 5)) == [
1.1,
2.2,
3.3,
4.4,
5.5,
]
assert ak._v2.operations.convert.to_list(f2(array, 0, 4)) == [1.1, 2.2, 3.3, 4.4]
assert ak._v2.operations.convert.to_list(f2(array, 0, 3)) == [1.1, 2.2, 3.3]
assert ak._v2.operations.convert.to_list(f2(array, 0, 2)) == [1.1, 2.2]
assert ak._v2.operations.convert.to_list(f2(array, 0, 1)) == [1.1]
assert ak._v2.operations.convert.to_list(f2(array, 0, 0)) == []
aslist = [1.1, 2.2, 3.3, 4.4, 5.5]
for i1 in range(-6, 7):
for i2 in range(-6, 7):
assert ak._v2.operations.convert.to_list(f2(array, i1, i2)) == aslist[i1:i2]
@numba.njit
def f3(x, i1, i2):
return x[1:4][i1:i2]
assert ak._v2.operations.convert.to_list(f3(array, -1, 3)) == [4.4]
assert ak._v2.operations.convert.to_list(f3(array, -2, 3)) == [3.3, 4.4]
assert ak._v2.operations.convert.to_list(f3(array, -3, 3)) == [2.2, 3.3, 4.4]
assert ak._v2.operations.convert.to_list(f3(array, 0, 3)) == [2.2, 3.3, 4.4]
assert ak._v2.operations.convert.to_list(f3(array, 1, 3)) == [3.3, 4.4]
assert ak._v2.operations.convert.to_list(f3(array, 2, 3)) == [4.4]
assert ak._v2.operations.convert.to_list(f3(array, 3, 3)) == []
assert ak._v2.operations.convert.to_list(f3(array, 0, -4)) == []
assert ak._v2.operations.convert.to_list(f3(array, 0, -3)) == []
assert ak._v2.operations.convert.to_list(f3(array, 0, -2)) == [2.2]
assert ak._v2.operations.convert.to_list(f3(array, 0, -1)) == [2.2, 3.3]
assert ak._v2.operations.convert.to_list(f3(array, 0, 3)) == [2.2, 3.3, 4.4]
assert ak._v2.operations.convert.to_list(f3(array, 0, 2)) == [2.2, 3.3]
assert ak._v2.operations.convert.to_list(f3(array, 0, 1)) == [2.2]
assert ak._v2.operations.convert.to_list(f3(array, 0, 0)) == []
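The exhaustive `i1`/`i2` loop above pins the compiled slices to Python's list-slicing semantics. Those semantics come from CPython's `slice.indices`, which can be inspected directly:

```python
# slice.indices(length) shows the normalization that the sliced views tested
# above are expected to match: out-of-range endpoints are clamped and
# negative indices are wrapped.
assert slice(-6, 7).indices(5) == (0, 5, 1)  # both endpoints clamped
assert slice(2, -1).indices(5) == (2, 4, 1)  # negative stop wrapped
assert slice(6, 5).indices(5) == (5, 5, 1)   # empty result
```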
def test_RegularArray_getitem():
array = ak._v2.highlevel.Array(np.array([[1.1, 2.2, 3.3], [4.4, 5.5, 6.6]]))
@numba.njit
def f1(x, i):
return x[i]
assert f1(array, -2).tolist() == [1.1, 2.2, 3.3]
assert f1(array, 0).tolist() == [1.1, 2.2, 3.3]
assert f1(array, 1).tolist() == [4.4, 5.5, 6.6]
assert f1(array, -1).tolist() == [4.4, 5.5, 6.6]
@numba.njit
def f2(x, i, j):
return x[i][j]
assert f2(array, 1, 0) == 4.4
assert f2(array, 1, 1) == 5.5
assert f2(array, 1, 2) == 6.6
assert f2(array, -1, 0) == 4.4
assert f2(array, -1, 1) == 5.5
assert f2(array, -1, 2) == 6.6
assert f2(array, 1, -3) == 4.4
assert f2(array, 1, -2) == 5.5
assert f2(array, 1, -1) == 6.6
array = ak._v2.highlevel.Array(np.array([[1.1, 2.2], [3.3, 4.4], [5.5, 6.6]]))
@numba.njit
def f3(x, i1, i2):
return x[i1:i2]
assert ak._v2.operations.convert.to_list(f3(array, -1, 3)) == [[5.5, 6.6]]
assert ak._v2.operations.convert.to_list(f3(array, -2, 3)) == [
[3.3, 4.4],
[5.5, 6.6],
]
assert ak._v2.operations.convert.to_list(f3(array, -3, 3)) == [
[1.1, 2.2],
[3.3, 4.4],
[5.5, 6.6],
]
assert ak._v2.operations.convert.to_list(f3(array, 0, 3)) == [
[1.1, 2.2],
[3.3, 4.4],
[5.5, 6.6],
]
assert ak._v2.operations.convert.to_list(f3(array, 1, 3)) == [
[3.3, 4.4],
[5.5, 6.6],
]
assert ak._v2.operations.convert.to_list(f3(array, 2, 3)) == [[5.5, 6.6]]
assert ak._v2.operations.convert.to_list(f3(array, 3, 3)) == []
assert ak._v2.operations.convert.to_list(f3(array, 0, 0)) == []
assert ak._v2.operations.convert.to_list(f3(array, 0, 1)) == [[1.1, 2.2]]
assert ak._v2.operations.convert.to_list(f3(array, 0, 2)) == [
[1.1, 2.2],
[3.3, 4.4],
]
assert ak._v2.operations.convert.to_list(f3(array, 0, 3)) == [
[1.1, 2.2],
[3.3, 4.4],
[5.5, 6.6],
]
assert ak._v2.operations.convert.to_list(f3(array, 0, -1)) == [
[1.1, 2.2],
[3.3, 4.4],
]
assert ak._v2.operations.convert.to_list(f3(array, 0, -2)) == [[1.1, 2.2]]
assert ak._v2.operations.convert.to_list(f3(array, 0, -3)) == []
def test_ListArray_getitem():
array = ak._v2.highlevel.Array(
[[0.0, 1.1, 2.2], [], [3.3, 4.4], [5.5], [6.6, 7.7, 8.8, 9.9]]
)
@numba.njit
def f1(x, i):
return x[i]
assert ak._v2.operations.convert.to_list(f1(array, 0)) == [0.0, 1.1, 2.2]
assert ak._v2.operations.convert.to_list(f1(array, 1)) == []
assert ak._v2.operations.convert.to_list(f1(array, 2)) == [3.3, 4.4]
assert ak._v2.operations.convert.to_list(f1(array, 3)) == [5.5]
assert ak._v2.operations.convert.to_list(f1(array, 4)) == [6.6, 7.7, 8.8, 9.9]
@numba.njit
def f2(x, i1, i2):
return x[i1:i2]
assert ak._v2.operations.convert.to_list(f2(array, 1, 4)) == [[], [3.3, 4.4], [5.5]]
def test_IndexedArray_getitem():
content = ak.from_iter(
[0.0, 1.1, 2.2, 3.3, 4.4, 5.5, 6.6, 7.7, 8.8, 9.9], highlevel=False
)
index = ak.layout.Index64(np.array([3, 2, 2, 5, 0, 7], dtype=np.int64))
array = ak._v2.highlevel.Array(ak.layout.IndexedArray64(index, content))
@numba.njit
def f1(x, i):
return x[i]
assert [f1(array, 0), f1(array, 1), f1(array, 2), f1(array, 3)] == [
3.3,
2.2,
2.2,
5.5,
]
@numba.njit
def f2(x, i1, i2):
return x[i1:i2]
assert ak._v2.operations.convert.to_list(f2(array, 1, 5)) == [2.2, 2.2, 5.5, 0]
def test_IndexedOptionArray_getitem():
array = ak._v2.highlevel.Array([1.1, 2.2, None, 3.3, None, None, 4.4, 5.5])
@numba.njit
def f1(x, i):
return x[i]
assert [f1(array, 0), f1(array, 1), f1(array, 2), f1(array, 3)] == [
1.1,
2.2,
None,
3.3,
]
@numba.njit
def f2(x, i1, i2):
return x[i1:i2]
assert ak._v2.operations.convert.to_list(f2(array, 1, 5)) == [2.2, None, 3.3, None]
def test_RecordView_unbox_box():
record = ak._v2.highlevel.Array(
[
{"x": 0.0, "y": []},
{"x": 1.1, "y": [1]},
{"x": 2.2, "y": [2, 2]},
{"x": 3.3, "y": [3, 3, 3]},
{"x": 4.4, "y": [4, 4, 4, 4]},
],
check_valid=True,
)[3]
assert ak._v2.operations.convert.to_list(
ak_numba_arrayview.RecordView.fromrecord(record).torecord()
) == {
"x": 3.3,
"y": [3, 3, 3],
}
@numba.njit
def f1(x):
return 3.14
assert f1(record) == 3.14
@numba.njit
def f2(x):
return x
assert ak._v2.operations.convert.to_list(f2(record)) == {"x": 3.3, "y": [3, 3, 3]}
def test_RecordView_refcount():
record = ak._v2.highlevel.Array(
[
{"x": 0.0, "y": []},
{"x": 1.1, "y": [1]},
{"x": 2.2, "y": [2, 2]},
{"x": 3.3, "y": [3, 3, 3]},
{"x": 4.4, "y": [4, 4, 4, 4]},
],
check_valid=True,
)[3]
record.numba_type
assert [
sys.getrefcount(x) == 2
for x in (
record._numbaview,
record._numbaview.arrayview,
record._numbaview.arrayview.lookup,
record._numbaview.arrayview.lookup.positions,
record._numbaview.arrayview.lookup.arrayptrs,
)
]
for _ in range(3):
@numba.njit
def f1(x):
return 3.14
for _ in range(10):
f1(record)
assert [
sys.getrefcount(x) == 2
for x in (
record._numbaview,
record._numbaview.arrayview,
record._numbaview.arrayview.lookup,
record._numbaview.arrayview.lookup.positions,
record._numbaview.arrayview.lookup.arrayptrs,
)
]
for _ in range(3):
@numba.njit
def f2(x):
return x
for _ in range(10):
y = f2(record)
assert [
sys.getrefcount(x) == 2
for x in (
record._numbaview,
record._numbaview.arrayview,
record._numbaview.arrayview.lookup,
record._numbaview.arrayview.lookup.positions,
record._numbaview.arrayview.lookup.arrayptrs,
)
]
for _ in range(3):
@numba.njit
def f3(x):
return x, x
for _ in range(10):
y = f3(record) # noqa: F841 (checking reference count)
assert [
sys.getrefcount(x) == 2
for x in (
record._numbaview,
record._numbaview.arrayview,
record._numbaview.arrayview.lookup,
record._numbaview.arrayview.lookup.positions,
record._numbaview.arrayview.lookup.arrayptrs,
)
]
def test_Record_getitem():
record = ak._v2.highlevel.Array(
[
{"x": 0.0, "y": []},
{"x": 1.1, "y": [1]},
{"x": 2.2, "y": [2, 2]},
{"x": 3.3, "y": [3, 3, 3]},
{"x": 4.4, "y": [4, 4, 4, 4]},
],
check_valid=True,
)[3]
@numba.njit
def f1(x):
return x["x"]
assert f1(record) == 3.3
@numba.njit
def f2(x):
return x["y"]
assert ak._v2.operations.convert.to_list(f2(record)) == [3, 3, 3]
@numba.njit
def f3(x):
return x.x
assert f3(record) == 3.3
@numba.njit
def f4(x):
return x.y
assert ak._v2.operations.convert.to_list(f4(record)) == [3, 3, 3]
def test_RecordArray_getitem():
array = ak._v2.highlevel.Array(
[
{"x": 0.0, "y": []},
{"x": 1.1, "y": [1]},
{"x": 2.2, "y": [2, 2]},
{"x": 3.3, "y": [3, 3, 3]},
{"x": 4.4, "y": [4, 4, 4, 4]},
],
check_valid=True,
)
@numba.njit
def f1(x, i):
return x[i]
assert ak._v2.operations.convert.to_list(f1(array, 3)) == {"x": 3.3, "y": [3, 3, 3]}
assert ak._v2.operations.convert.to_list(f1(array, 2)) == {"x": 2.2, "y": [2, 2]}
assert ak._v2.operations.convert.to_list(f1(array, 1)) == {"x": 1.1, "y": [1]}
@numba.njit
def f2(x, i1, i2):
return x[i1:i2]
assert ak._v2.operations.convert.to_list(f2(array, 1, 4)) == [
{"x": 1.1, "y": [1]},
{"x": 2.2, "y": [2, 2]},
{"x": 3.3, "y": [3, 3, 3]},
]
array = ak._v2.highlevel.Array(
[
[{"x": 0.0, "y": []}, {"x": 1.1, "y": [1]}, {"x": 2.2, "y": [2, 2]}],
[],
[{"x": 3.3, "y": [3, 3, 3]}, {"x": 4.4, "y": [4, 4, 4, 4]}],
],
check_valid=True,
)
@numba.njit
def f3(x, i, j):
return x[i][j]
assert ak._v2.operations.convert.to_list(f3(array, 2, -2)) == {
"x": 3.3,
"y": [3, 3, 3],
}
def test_RecordArray_getitem_field():
array = ak._v2.highlevel.Array(
[
{"x": 0.0, "y": []},
{"x": 1.1, "y": [1]},
{"x": 2.2, "y": [2, 2]},
{"x": 3.3, "y": [3, 3, 3]},
{"x": 4.4, "y": [4, 4, 4, 4]},
],
check_valid=True,
)
@numba.njit
def f1(x):
return x[1:4]["x"]
assert ak._v2.operations.convert.to_list(f1(array)) == [1.1, 2.2, 3.3]
@numba.njit
def f2(x):
return x[1:4]["y"]
assert ak._v2.operations.convert.to_list(f2(array)) == [[1], [2, 2], [3, 3, 3]]
@numba.njit
def f3(x):
return x[1:4].x
assert ak._v2.operations.convert.to_list(f3(array)) == [1.1, 2.2, 3.3]
@numba.njit
def f4(x):
return x[1:4].y
assert ak._v2.operations.convert.to_list(f4(array)) == [[1], [2, 2], [3, 3, 3]]
def test_ListArray_getitem_field():
array = ak._v2.highlevel.Array(
[
[{"x": 0.0, "y": []}, {"x": 1.1, "y": [1]}, {"x": 2.2, "y": [2, 2]}],
[],
[{"x": 3.3, "y": [3, 3, 3]}, {"x": 4.4, "y": [4, 4, 4, 4]}],
[{"x": 5.5, "y": [5, 5, 5, 5, 5]}],
[
{"x": 6.6, "y": [6, 6, 6, 6, 6, 6]},
{"x": 7.7, "y": [7, 7, 7, 7, 7, 7, 7]},
{"x": 8.8, "y": [8, 8, 8, 8, 8, 8, 8, 8]},
{"x": 9.9, "y": [9, 9, 9, 9, 9, 9, 9, 9, 9]},
],
],
check_valid=True,
)
@numba.njit
def f1(x):
return x["x"]
assert ak._v2.operations.convert.to_list(f1(array)) == [
[0.0, 1.1, 2.2],
[],
[3.3, 4.4],
[5.5],
[6.6, 7.7, 8.8, 9.9],
]
@numba.njit
def f2(x):
return x.y
assert ak._v2.operations.convert.to_list(f2(array)) == [
[[], [1], [2, 2]],
[],
[[3, 3, 3], [4, 4, 4, 4]],
[[5, 5, 5, 5, 5]],
[
[6, 6, 6, 6, 6, 6],
[7, 7, 7, 7, 7, 7, 7],
[8, 8, 8, 8, 8, 8, 8, 8],
[9, 9, 9, 9, 9, 9, 9, 9, 9],
],
]
@numba.njit
def f3(x):
return x[1:4].x
assert ak._v2.operations.convert.to_list(f3(array)) == [[], [3.3, 4.4], [5.5]]
@numba.njit
def f4(x):
return x[1:4]["y"]
assert ak._v2.operations.convert.to_list(f4(array)) == [
[],
[[3, 3, 3], [4, 4, 4, 4]],
[[5, 5, 5, 5, 5]],
]
@numba.njit
def f5(x):
return x["x"][1:4]
assert ak._v2.operations.convert.to_list(f5(array)) == [[], [3.3, 4.4], [5.5]]
@numba.njit
def f6(x):
return x.y[1:4]
assert ak._v2.operations.convert.to_list(f6(array)) == [
[],
[[3, 3, 3], [4, 4, 4, 4]],
[[5, 5, 5, 5, 5]],
]
@numba.njit
def f7(x):
return x[4]["x"]
assert ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f7(array))
) == [6.6, 7.7, 8.8, 9.9]
@numba.njit
def f8(x):
return x[4].y
assert ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f8(array))
) == [
[6, 6, 6, 6, 6, 6],
[7, 7, 7, 7, 7, 7, 7],
[8, 8, 8, 8, 8, 8, 8, 8],
[9, 9, 9, 9, 9, 9, 9, 9, 9],
]
@numba.njit
def f9(x):
return x.x[4]
assert ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f9(array))
) == [6.6, 7.7, 8.8, 9.9]
@numba.njit
def f10(x):
return x["y"][4]
assert ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f10(array))
) == [
[6, 6, 6, 6, 6, 6],
[7, 7, 7, 7, 7, 7, 7],
[8, 8, 8, 8, 8, 8, 8, 8],
[9, 9, 9, 9, 9, 9, 9, 9, 9],
]
@numba.njit
def f11(x):
return x[4]["x"][1]
assert f11(array) == 7.7
@numba.njit
def f12(x):
return x[4].y[1]
assert ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f12(array))
) == [7, 7, 7, 7, 7, 7, 7]
@numba.njit
def f12b(x):
return x[4].y[1][6]
assert (
ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f12b(array))
)
== 7
)
@numba.njit
def f13(x):
return x.x[4][1]
assert f13(array) == 7.7
@numba.njit
def f14(x):
return x["y"][4][1]
assert ak._v2.operations.convert.to_list(
ak._v2.operations.convert.to_list(f14(array))
) == [7, 7, 7, 7, 7, 7, 7]
@numba.njit
def f14b(x):
return x["y"][4][1][6]
assert ak._v2.operations.convert.to_list(f14b(array)) == 7
def test_RecordArray_deep_field():
array = ak._v2.highlevel.Array(
[{"x": {"y": {"z": 1.1}}}, {"x": {"y": {"z": 2.2}}}, {"x": {"y": {"z": 3.3}}}],
check_valid=True,
)
@numba.njit
def f1(x):
return x[1]["x"].y["z"]
assert f1(array) == 2.2
@numba.njit
def f2(x):
return x["x"][1].y["z"]
assert f2(array) == 2.2
@numba.njit
def f3(x):
return x["x"].y[1]["z"]
assert f3(array) == 2.2
@numba.njit
def f4(x):
return x["x"].y["z"][1]
assert f4(array) == 2.2
@numba.njit
def f5(x):
return x["x"].y["z"]
assert ak._v2.operations.convert.to_list(f5(array)) == [1.1, 2.2, 3.3]
@numba.njit
def f6(x):
return x.x["y"].z
assert ak._v2.operations.convert.to_list(f6(array)) == [1.1, 2.2, 3.3]
@numba.njit
def f7(x):
return x.x["y"]
assert ak._v2.operations.convert.to_list(f7(array)) == [
{"z": 1.1},
{"z": 2.2},
{"z": 3.3},
]
@numba.njit
def f8(x):
return x.x
assert ak._v2.operations.convert.to_list(f8(array)) == [
{"y": {"z": 1.1}},
{"y": {"z": 2.2}},
{"y": {"z": 3.3}},
]
def test_ListArray_deep_at():
content = ak.layout.NumpyArray(
np.array(
[
1.1,
2.2,
3.3,
4.4,
5.5,
6.6,
7.7,
8.8,
9.9,
10.0,
11.1,
12.2,
13.3,
14.4,
15.5,
16.6,
17.7,
18.8,
19.9,
]
)
)
offsets1 = ak.layout.Index32(
np.array([0, 2, 4, 6, 8, 10, 12, 14, 16, 18], dtype=np.int32)
)
listarray1 = ak.layout.ListOffsetArray32(offsets1, content)
offsets2 = ak.layout.Index64(np.array([0, 2, 4, 6, 8], dtype=np.int64))
listarray2 = ak.layout.ListOffsetArray64(offsets2, listarray1)
offsets3 = ak.layout.Index32(np.array([0, 2, 4], dtype=np.int32))
listarray3 = ak.layout.ListOffsetArray32(offsets3, listarray2)
array = ak._v2.highlevel.Array(listarray3)
@numba.njit
def f1(x):
return x[1][1][1][1]
assert f1(array) == 16.6
@numba.njit
def f2(x):
return x[1][1][1]
assert ak._v2.operations.convert.to_list(f2(array)) == [15.5, 16.6]
@numba.njit
def f3(x):
return x[1][1]
assert ak._v2.operations.convert.to_list(f3(array)) == [[13.3, 14.4], [15.5, 16.6]]
@numba.njit
def f4(x):
return x[1]
assert ak._v2.operations.convert.to_list(f4(array)) == [
[[9.9, 10.0], [11.1, 12.2]],
[[13.3, 14.4], [15.5, 16.6]],
]
def test_IndexedArray_deep_at():
content = ak.layout.NumpyArray(np.array([1.1, 2.2, 3.3, 4.4, 5.5]))
index1 = ak.layout.Index32(np.array([1, 2, 3, 4], dtype=np.int32))
indexedarray1 = ak.layout.IndexedArray32(index1, content)
index2 = ak.layout.Index64(np.array([1, 2, 3], dtype=np.int64))
indexedarray2 = ak.layout.IndexedArray64(index2, indexedarray1)
index3 = ak.layout.Index32(np.array([1, 2], dtype=np.int32))
indexedarray3 = ak.layout.IndexedArray32(index3, indexedarray2)
# This is not a valid array (IndexedArray inside IndexedArray), but instructive! :)
array = ak._v2.highlevel.Array(indexedarray3, check_valid=False)
@numba.njit
def f1(x):
return x[1]
assert f1(array) == 5.5
def test_iterator():
array = ak._v2.highlevel.Array(
[[0.0, 1.1, 2.2], [], [3.3, 4.4], [5.5], [6.6, 7.7, 8.8, 9.9]]
)
@numba.njit
def f1(a):
out = 0.0
for b in a:
for c in b:
out += c
return out
assert f1(array) == 49.5
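test_iterator above exercises the jitted nested-iteration protocol; the same reduction over plain lists can be written as a pure-Python analogue (a hypothetical helper, not part of awkward):

```python
# Pure-Python analogue (hypothetical, not part of awkward) of the nested
# iteration that test_iterator compiles with numba.
def nested_sum(nested):
    total = 0.0
    for sublist in nested:
        for value in sublist:
            total += value
    return total
```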
def test_ArrayBuilder_refcount():
builder = ak._v2.highlevel.ArrayBuilder()
assert (sys.getrefcount(builder), sys.getrefcount(builder._layout)) == (2, 2)
@numba.njit
def f1(x):
return 3.14
y = f1(builder)
assert (sys.getrefcount(builder), sys.getrefcount(builder._layout)) == (2, 2)
@numba.njit
def f2(x):
return x
y = f2(builder)
assert (sys.getrefcount(builder), sys.getrefcount(builder._layout)) == (2, 3)
del y
assert (sys.getrefcount(builder), sys.getrefcount(builder._layout)) == (2, 2)
@numba.njit
def f3(x):
return x, x
y = f3(builder)
assert (sys.getrefcount(builder), sys.getrefcount(builder._layout)) == (2, 4)
del y
assert (sys.getrefcount(builder), sys.getrefcount(builder._layout)) == (2, 2)
def test_ArrayBuilder_len():
builder = ak._v2.highlevel.ArrayBuilder()
builder.real(1.1)
builder.real(2.2)
builder.real(3.3)
builder.real(4.4)
builder.real(5.5)
@numba.njit
def f1(x):
return len(x)
assert f1(builder) == 5
def test_ArrayBuilder_simple():
@numba.njit
def f1(x):
x.clear()
return 3.14
a = ak._v2.highlevel.ArrayBuilder()
f1(a)
def test_ArrayBuilder_boolean():
@numba.njit
def f1(x):
x.boolean(True)
x.boolean(False)
x.boolean(False)
return x
a = ak._v2.highlevel.ArrayBuilder()
b = f1(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [True, False, False]
assert ak._v2.operations.convert.to_list(b.snapshot()) == [True, False, False]
def test_ArrayBuilder_integer():
@numba.njit
def f1(x):
x.integer(1)
x.integer(2)
x.integer(3)
return x
a = ak._v2.highlevel.ArrayBuilder()
b = f1(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [1, 2, 3]
assert ak._v2.operations.convert.to_list(b.snapshot()) == [1, 2, 3]
def test_ArrayBuilder_real():
@numba.njit
def f1(x, z):
x.real(1)
x.real(2.2)
x.real(z)
return x
a = ak._v2.highlevel.ArrayBuilder()
b = f1(a, np.array([3.5], dtype=np.float32)[0])
assert ak._v2.operations.convert.to_list(a.snapshot()) == [1, 2.2, 3.5]
assert ak._v2.operations.convert.to_list(b.snapshot()) == [1, 2.2, 3.5]
def test_ArrayBuilder_list():
@numba.njit
def f1(x):
x.begin_list()
x.real(1.1)
x.real(2.2)
x.real(3.3)
x.end_list()
x.begin_list()
x.end_list()
x.begin_list()
x.real(4.4)
x.real(5.5)
x.end_list()
return x
a = ak._v2.highlevel.ArrayBuilder()
b = f1(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [
[1.1, 2.2, 3.3],
[],
[4.4, 5.5],
]
assert ak._v2.operations.convert.to_list(b.snapshot()) == [
[1.1, 2.2, 3.3],
[],
[4.4, 5.5],
]
@numba.njit
def f2(x):
return len(x)
assert f2(a) == 3
assert f2(b) == 3
@numba.njit
def f3(x):
x.clear()
return x
c = f3(b)
assert ak._v2.operations.convert.to_list(a.snapshot()) == []
assert ak._v2.operations.convert.to_list(b.snapshot()) == []
assert ak._v2.operations.convert.to_list(c.snapshot()) == []
def test_ArrayBuilder_tuple():
@numba.njit
def f1(x):
x.begin_tuple(2)
x.index(0).append(1)
x.index(1).append(1.1)
x.end_tuple()
x.begin_tuple(2)
x.index(0).append(2)
x.index(1).append(2.2)
x.end_tuple()
return x
a = ak._v2.highlevel.ArrayBuilder()
b = f1(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [(1, 1.1), (2, 2.2)]
assert ak._v2.operations.convert.to_list(b.snapshot()) == [(1, 1.1), (2, 2.2)]
c = f1.py_func(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [
(1, 1.1),
(2, 2.2),
(1, 1.1),
(2, 2.2),
]
assert ak._v2.operations.convert.to_list(c.snapshot()) == [
(1, 1.1),
(2, 2.2),
(1, 1.1),
(2, 2.2),
]
def test_ArrayBuilder_record():
@numba.njit
def f1(x):
x.begin_record()
x.field("x").append(1)
x.field("y").append(1.1)
x.end_record()
x.begin_record()
x.field("x").append(2)
x.field("y").append(2.2)
x.end_record()
return x
a = ak._v2.highlevel.ArrayBuilder()
b = f1(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [
{"x": 1, "y": 1.1},
{"x": 2, "y": 2.2},
]
assert ak._v2.operations.convert.to_list(b.snapshot()) == [
{"x": 1, "y": 1.1},
{"x": 2, "y": 2.2},
]
c = f1.py_func(a)
assert ak._v2.operations.convert.to_list(a.snapshot()) == [
{"x": 1, "y": 1.1},
{"x": 2, "y": 2.2},
{"x": 1, "y": 1.1},
{"x": 2, "y": 2.2},
]
assert ak._v2.operations.convert.to_list(c.snapshot()) == [
{"x": 1, "y": 1.1},
{"x": 2, "y": 2.2},
{"x": 1, "y": 1.1},
{"x": 2, "y": 2.2},
]
def dummy_typer(viewtype):
    return numba.float64


def dummy_lower(context, builder, sig, args):
    def convert(rec):
        return rec.x + rec.y

    return context.compile_internal(builder, convert, sig, args)


def test_custom_record():
    behavior = {}
    behavior["__numba_typer__", "Dummy"] = dummy_typer
    behavior["__numba_lower__", "Dummy"] = dummy_lower
    array = ak._v2.highlevel.Array(
        [{"x": 1.1, "y": 100}, {"x": 2.2, "y": 200}, {"x": 3.3, "y": 300}],
        behavior=behavior,
        check_valid=True,
    )
    array.layout.parameters["__record__"] = "Dummy"

    @numba.njit
    def f1(x, i):
        return x[i]

    assert f1(array, 1) == 202.2
    assert f1(array, 2) == 303.3
def dummy_typer2(viewtype):
    return numba.float64


def dummy_lower2(context, builder, sig, args):
    def compute(rec):
        return rec.x + rec.y

    return context.compile_internal(builder, compute, sig, args)


def test_custom_record2():
    behavior = {}
    behavior["__numba_typer__", "Dummy", "stuff"] = dummy_typer2
    behavior["__numba_lower__", "Dummy", "stuff"] = dummy_lower2
    array = ak._v2.highlevel.Array(
        [{"x": 1.1, "y": 100}, {"x": 2.2, "y": 200}, {"x": 3.3, "y": 300}],
        behavior=behavior,
        check_valid=True,
    )
    array.layout.parameters["__record__"] = "Dummy"

    @numba.njit
    def f1(x, i):
        return x[i].stuff

    assert f1(array, 1) == 202.2
    assert f1(array, 2) == 303.3


def dummy_typer3(viewtype, args):
    return numba.float64(*args)


def dummy_lower3(context, builder, sig, args):
    def compute(rec, j):
        return rec.x + rec.y + j

    return context.compile_internal(builder, compute, sig, args)


def test_custom_record3():
    behavior = {}
    behavior["__numba_typer__", "Dummy", "stuff", ()] = dummy_typer3
    behavior["__numba_lower__", "Dummy", "stuff", ()] = dummy_lower3
    array = ak._v2.highlevel.Array(
        [{"x": 1.1, "y": 100}, {"x": 2.2, "y": 200}, {"x": 3.3, "y": 300}],
        behavior=behavior,
        check_valid=True,
    )
    array.layout.parameters["__record__"] = "Dummy"

    @numba.njit
    def f1(x, i, j):
        return x[i].stuff(j)

    assert f1(array, 1, 1000) == 1202.2
    assert f1(array, 2, 1000) == 1303.3


def dummy_typer4(binop, left, right):
    return numba.float64(left, right)


def dummy_lower4(context, builder, sig, args):
    def compute(left, right):
        return left.x + right.x

    return context.compile_internal(builder, compute, sig, args)


def test_custom_record4():
    behavior = {}
    behavior["__numba_typer__", "Dummy", operator.add, "Dummy"] = dummy_typer4
    behavior["__numba_lower__", "Dummy", operator.add, "Dummy"] = dummy_lower4
    behavior["__numba_typer__", "Dummy", operator.eq, "Dummy"] = dummy_typer4
    behavior["__numba_lower__", "Dummy", operator.eq, "Dummy"] = dummy_lower4
    array = ak._v2.highlevel.Array(
        [{"x": 1.1, "y": 100}, {"x": 2.2, "y": 200}, {"x": 3.3, "y": 300}],
        behavior=behavior,
        check_valid=True,
    )
    array.layout.parameters["__record__"] = "Dummy"

    @numba.njit
    def f1(x, i, j):
        return x[i] + x[j]

    assert f1(array, 1, 2) == 5.5

    @numba.njit
    def f2(x, i, j):
        return x[i] == x[j]

    assert f2(array, 1, 2) == 5.5
def dummy_typer5(unaryop, left):
    return numba.float64(left)


def dummy_lower5(context, builder, sig, args):
    def compute(left):
        return abs(left.x)

    return context.compile_internal(builder, compute, sig, args)


def test_custom_record5():
    behavior = {}
    behavior["__numba_typer__", abs, "Dummy"] = dummy_typer5
    behavior["__numba_lower__", abs, "Dummy"] = dummy_lower5
    behavior["__numba_typer__", operator.neg, "Dummy"] = dummy_typer5
    behavior["__numba_lower__", operator.neg, "Dummy"] = dummy_lower5
    array = ak._v2.highlevel.Array(
        [{"x": 1.1, "y": 100}, {"x": -2.2, "y": 200}, {"x": 3.3, "y": 300}],
        behavior=behavior,
        check_valid=True,
    )
    array.layout.parameters["__record__"] = "Dummy"

    @numba.njit
    def f1(x, i):
        return abs(x[i])

    assert f1(array, 1) == 2.2
    assert f1(array, 2) == 3.3

    @numba.njit
    def f2(x, i):
        return -x[i]

    assert f2(array, 1) == 2.2
    assert f2(array, 2) == 3.3


def test_ArrayBuilder_append_numba5():
    @numba.njit
    def f1(builder, x):
        builder.append(x)

    @numba.njit
    def f2(builder, i):
        if i % 2 == 0:
            return 3
        else:
            return None

    @numba.njit
    def f3(builder, i):
        builder.append(f2(builder, i))

    builder = ak._v2.highlevel.ArrayBuilder()
    f1(builder, True)
    f1(builder, 1)
    f1(builder, 2.2)
    f3(builder, 0)
    f3(builder, 1)
    f1(builder, None)

    assert ak._v2.operations.convert.to_list(builder.snapshot()) == [
        True,
        1,
        2.2,
        3,
        None,
        None,
    ]
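The append tests above rely on ArrayBuilder accepting mixed types and None; a stdlib-only stand-in showing the same call pattern (PyListBuilder is illustrative — the real ak.ArrayBuilder grows a typed union array, not a Python list):

```python
class PyListBuilder:
    """Illustrative stand-in for the ArrayBuilder append/snapshot protocol."""

    def __init__(self):
        self._data = []

    def append(self, x):
        # ak.ArrayBuilder widens its type here; the stand-in just stores x.
        self._data.append(x)

    def snapshot(self):
        # Return a copy so later appends do not mutate earlier snapshots.
        return list(self._data)
```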
| 26.120653 | 88 | 0.505326 | 5,587 | 36,804 | 3.219617 | 0.041883 | 0.03936 | 0.103513 | 0.15527 | 0.840727 | 0.793084 | 0.766177 | 0.726373 | 0.701968 | 0.654492 | 0 | 0.098611 | 0.301516 | 36,804 | 1,408 | 89 | 26.139205 | 0.60112 | 0.007526 | 0 | 0.559353 | 0 | 0 | 0.019059 | 0.000986 | 0 | 0 | 0 | 0 | 0.178058 | 1 | 0.123201 | false | 0 | 0.006295 | 0.076439 | 0.220324 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8111d6fc3141fbccfa787685b995048a1b649a5b | 224 | py | Python | venv/lib/python3.6/site-packages/imutils/face_utils/__init__.py | Seifar/ChitAnalysis | 0580802c78b11766ba21e3929b8fa96f8f6a17f0 | [
"MIT"
] | 5 | 2017-11-15T10:33:42.000Z | 2021-11-16T02:21:31.000Z | venv/lib/python3.6/site-packages/imutils/face_utils/__init__.py | Seifar/ChitAnalysis | 0580802c78b11766ba21e3929b8fa96f8f6a17f0 | [
"MIT"
] | 13 | 2020-01-28T22:20:14.000Z | 2022-03-11T23:20:14.000Z | venv/lib/python3.6/site-packages/imutils/face_utils/__init__.py | Seifar/ChitAnalysis | 0580802c78b11766ba21e3929b8fa96f8f6a17f0 | [
"MIT"
] | 6 | 2017-11-30T00:34:20.000Z | 2021-05-20T02:58:02.000Z | # import the necessary packages
from .helpers import FACIAL_LANDMARKS_IDXS
from .helpers import rect_to_bb
from .helpers import shape_to_np
from .helpers import visualize_facial_landmarks
from .facealigner import FaceAligner | 37.333333 | 47 | 0.866071 | 32 | 224 | 5.8125 | 0.5 | 0.236559 | 0.365591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 224 | 6 | 48 | 37.333333 | 0.93 | 0.129464 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
d4be6382ef486ae9fbcc3263e3233876122d70a7 | 151 | py | Python | microsim/risk_model_repository.py | jburke5/microsim | 4a484b9fe4fd0a4faf05150238fd79be4d8827f6 | [
"MIT"
] | null | null | null | microsim/risk_model_repository.py | jburke5/microsim | 4a484b9fe4fd0a4faf05150238fd79be4d8827f6 | [
"MIT"
] | 2 | 2021-05-17T21:19:09.000Z | 2021-06-02T20:20:13.000Z | microsim/risk_model_repository.py | jburke5/microsim | 4a484b9fe4fd0a4faf05150238fd79be4d8827f6 | [
"MIT"
] | 1 | 2020-05-18T02:12:58.000Z | 2020-05-18T02:12:58.000Z | class RiskModelRepository:
    def __init__(self):
        self._repository = {}

    def get_model(self, name):
        return self._repository[name]
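A usage sketch of the repository pattern above (DemoRepository mirrors RiskModelRepository; register_model is an illustrative addition — the real class may populate _repository by other means):

```python
class DemoRepository:
    """Mirrors RiskModelRepository, plus an illustrative registration method."""

    def __init__(self):
        self._repository = {}

    def register_model(self, name, model):
        # Store any model object under a string key.
        self._repository[name] = model

    def get_model(self, name):
        # Raises KeyError for unknown names, like a plain dict lookup.
        return self._repository[name]
```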
| 21.571429 | 37 | 0.655629 | 16 | 151 | 5.75 | 0.625 | 0.304348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.245033 | 151 | 6 | 38 | 25.166667 | 0.807018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
d4c2444a1328f386f063242d74bdd95a98f73f88 | 3,207 | py | Python | lemur/tests/test_destinations.py | bunjiboys/lemur | b5fd8020055d8af07bd6f82f4dd38246dca8d0c5 | [
"Apache-2.0"
] | null | null | null | lemur/tests/test_destinations.py | bunjiboys/lemur | b5fd8020055d8af07bd6f82f4dd38246dca8d0c5 | [
"Apache-2.0"
] | 2 | 2020-04-03T09:28:20.000Z | 2020-04-04T04:56:35.000Z | lemur/tests/test_destinations.py | scriptsrc/lemur | 914de78576baf66d8f4c0365d8cedb27c6f70663 | [
"Apache-2.0"
] | null | null | null | import pytest
from lemur.destinations.views import *  # noqa

from .vectors import VALID_ADMIN_HEADER_TOKEN, VALID_USER_HEADER_TOKEN


def test_destination_input_schema(client, destination_plugin, destination):
    from lemur.destinations.schemas import DestinationInputSchema

    input_data = {
        'label': 'destination1',
        'options': {},
        'description': 'my destination',
        'active': True,
        'plugin': {
            'slug': 'test-destination'
        }
    }

    data, errors = DestinationInputSchema().load(input_data)

    assert not errors


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 404),
    (VALID_ADMIN_HEADER_TOKEN, 404),
    ('', 401)
])
def test_destination_get(client, token, status):
    assert client.get(api.url_for(Destinations, destination_id=1), headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 405),
    (VALID_ADMIN_HEADER_TOKEN, 405),
    ('', 405)
])
def test_destination_post_(client, token, status):
    assert client.post(api.url_for(Destinations, destination_id=1), data={}, headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 403),
    (VALID_ADMIN_HEADER_TOKEN, 400),
    ('', 401)
])
def test_destination_put(client, token, status):
    assert client.put(api.url_for(Destinations, destination_id=1), data={}, headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 403),
    (VALID_ADMIN_HEADER_TOKEN, 200),
    ('', 401)
])
def test_destination_delete(client, token, status):
    assert client.delete(api.url_for(Destinations, destination_id=1), headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 405),
    (VALID_ADMIN_HEADER_TOKEN, 405),
    ('', 405)
])
def test_destination_patch(client, token, status):
    assert client.patch(api.url_for(Destinations, destination_id=1), data={}, headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 403),
    (VALID_ADMIN_HEADER_TOKEN, 400),
    ('', 401)
])
def test_destination_list_post_(client, token, status):
    assert client.post(api.url_for(DestinationsList), data={}, headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 200),
    (VALID_ADMIN_HEADER_TOKEN, 200),
    ('', 401)
])
def test_destination_list_get(client, token, status):
    assert client.get(api.url_for(DestinationsList), headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 405),
    (VALID_ADMIN_HEADER_TOKEN, 405),
    ('', 405)
])
def test_destination_list_delete(client, token, status):
    assert client.delete(api.url_for(DestinationsList), headers=token).status_code == status


@pytest.mark.parametrize("token,status", [
    (VALID_USER_HEADER_TOKEN, 405),
    (VALID_ADMIN_HEADER_TOKEN, 405),
    ('', 405)
])
def test_destination_list_patch(client, token, status):
    assert client.patch(api.url_for(DestinationsList), data={}, headers=token).status_code == status
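pytest expands each (token, status) pair in @pytest.mark.parametrize into an independent test case; conceptually that behaves like this loop (check is a stand-in for the HTTP client call, which is not available here):

```python
def expand_parametrize(cases, check):
    """Run check(token) once per case, as parametrize would, and compare."""
    return [check(token) == status for token, status in cases]
```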
| 30.254717 | 114 | 0.714687 | 396 | 3,207 | 5.502525 | 0.146465 | 0.136301 | 0.073428 | 0.096374 | 0.800826 | 0.783846 | 0.783846 | 0.783846 | 0.783846 | 0.728316 | 0 | 0.031775 | 0.146243 | 3,207 | 105 | 115 | 30.542857 | 0.764061 | 0.001247 | 0 | 0.531646 | 0 | 0 | 0.059044 | 0 | 0 | 0 | 0 | 0 | 0.126582 | 1 | 0.126582 | false | 0 | 0.050633 | 0 | 0.177215 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d4f7c84aadfb497f6816378c9d09b431d603b388 | 147 | py | Python | puzzle/main.py | bozzzzo/bedlam-solver | 148fc43f818818831c27a9e736ab0107bdbe595d | [
"Apache-2.0"
] | null | null | null | puzzle/main.py | bozzzzo/bedlam-solver | 148fc43f818818831c27a9e736ab0107bdbe595d | [
"Apache-2.0"
] | null | null | null | puzzle/main.py | bozzzzo/bedlam-solver | 148fc43f818818831c27a9e736ab0107bdbe595d | [
"Apache-2.0"
] | null | null | null | import box
from ipdb import launch_ipdb_on_exception
def main():
    with launch_ipdb_on_exception():
        b = box.Box()
        box.find(b)
| 14.7 | 41 | 0.659864 | 22 | 147 | 4.136364 | 0.545455 | 0.21978 | 0.263736 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.251701 | 147 | 9 | 42 | 16.333333 | 0.827273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
be00baa4788f4faddbdbbcbbe3de0a18cba4a17b | 2,019 | py | Python | ec2_compare/internal/instance_type/g3s.py | weldpua2008/aws.ec2.compare | 5149fc4c7cb42f4d7df1930ed8a06750155fe578 | [
"Apache-2.0"
] | null | null | null | ec2_compare/internal/instance_type/g3s.py | weldpua2008/aws.ec2.compare | 5149fc4c7cb42f4d7df1930ed8a06750155fe578 | [
"Apache-2.0"
] | null | null | null | ec2_compare/internal/instance_type/g3s.py | weldpua2008/aws.ec2.compare | 5149fc4c7cb42f4d7df1930ed8a06750155fe578 | [
"Apache-2.0"
] | 1 | 2021-12-15T11:58:22.000Z | 2021-12-15T11:58:22.000Z |
# Automatically generated
# pylint: disable=all
get = [{'SupportedArchitectures': ['x86_64'], 'SustainedClockSpeedInGhz': 2.7, 'DefaultVCpus': 4, 'DefaultCores': 2, 'DefaultThreadsPerCore': 2, 'ValidCores': [1, 2], 'ValidThreadsPerCore': [1, 2], 'SizeInMiB': 31232, 'EbsOptimizedSupport': 'default', 'EncryptionSupport': 'supported', 'NetworkPerformance': 'Up to 10 Gigabit', 'MaximumNetworkInterfaces': 4, 'Ipv4AddressesPerInterface': 15, 'Ipv6AddressesPerInterface': 15, 'Ipv6Supported': True, 'EnaSupport': 'required', 'Gpus': [{'Name': 'M60', 'Manufacturer': 'NVIDIA', 'Count': 1, 'MemoryInfo': {'SizeInMiB': 8192}}], 'TotalGpuMemoryInMiB': 8192, 'SupportedStrategies': ['cluster', 'partition', 'spread'], 'InstanceType': 'g3s.xlarge', 'CurrentGeneration': True, 'FreeTierEligible': False, 'SupportedUsageClasses': ['on-demand', 'spot'], 'SupportedRootDeviceTypes': ['ebs'], 'BareMetal': False, 'Hypervisor': 'xen', 'ProcessorInfo': {'SupportedArchitectures': ['x86_64'], 'SustainedClockSpeedInGhz': 2.7}, 'VCpuInfo': {'DefaultVCpus': 4, 'DefaultCores': 2, 'DefaultThreadsPerCore': 2, 'ValidCores': [1, 2], 'ValidThreadsPerCore': [1, 2]}, 'MemoryInfo': {'SizeInMiB': 31232}, 'InstanceStorageSupported': False, 'EbsInfo': {'EbsOptimizedSupport': 'default', 'EncryptionSupport': 'supported'}, 'NetworkInfo': {'NetworkPerformance': 'Up to 10 Gigabit', 'MaximumNetworkInterfaces': 4, 'Ipv4AddressesPerInterface': 15, 'Ipv6AddressesPerInterface': 15, 'Ipv6Supported': True, 'EnaSupport': 'required'}, 'GpuInfo': {'Gpus': [{'Name': 'M60', 'Manufacturer': 'NVIDIA', 'Count': 1, 'MemoryInfo': {'SizeInMiB': 8192}}], 'TotalGpuMemoryInMiB': 8192}, 'PlacementGroupInfo': {'SupportedStrategies': ['cluster', 'partition', 'spread']}, 'HibernationSupported': False, 'BurstablePerformanceSupported': False, 'DedicatedHostsSupported': False, 'AutoRecoverySupported': False}] # noqa: E501
def get_instances_list() -> list:
    '''Returns the list of EC2 instances with InstanceType = g3s.'''
    # pylint: disable=all
    return get
| 168.25 | 1,828 | 0.71471 | 169 | 2,019 | 8.514793 | 0.508876 | 0.005559 | 0.022238 | 0.070883 | 0.500347 | 0.500347 | 0.426685 | 0.426685 | 0.426685 | 0.426685 | 0 | 0.045777 | 0.091134 | 2,019 | 11 | 1,829 | 183.545455 | 0.73842 | 0.063398 | 0 | 0 | 1 | 0 | 0.64168 | 0.225412 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
be0cc2abdf74c73e6851ec47b274cb5fd1624a8c | 38,357 | py | Python | pyscf/symm/test/test_geom.py | januseriksen/pyscf | d40f5b728b6e26766343388b2240ef023b71766d | [
"Apache-2.0"
] | 1 | 2021-01-07T15:26:49.000Z | 2021-01-07T15:26:49.000Z | pyscf/symm/test/test_geom.py | ghb24/pyscf | 5cd81a47fb67cdf81ad2837a1a857effd48c472f | [
"Apache-2.0"
] | null | null | null | pyscf/symm/test/test_geom.py | ghb24/pyscf | 5cd81a47fb67cdf81ad2837a1a857effd48c472f | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# Copyright 2014-2020 The PySCF Developers. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Author: Qiming Sun <osirpt.sun@gmail.com>
#
import unittest
import numpy
from functools import reduce
from pyscf import lib
from pyscf import gto
from pyscf import symm
from pyscf.symm import geom
numpy.random.seed(12)
u = numpy.random.random((3,3))
u = numpy.linalg.svd(u)[0]
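The tests below build symmetric geometries from helpers such as ring(n, st) and make12(b), which are defined elsewhere in this file; a plausible stdlib sketch of ring — n evenly spaced points on a unit circle with phase offset st (the real helper may differ in scaling and orientation):

```python
import math


def ring_sketch(n, st=0.0):
    """n evenly spaced points on the unit circle in the z=0 plane."""
    return [(math.cos(st + 2 * math.pi * i / n),
             math.sin(st + 2 * math.pi * i / n),
             0.0) for i in range(n)]
```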
class KnownValues(unittest.TestCase):
    def test_d5h(self):
        atoms = ringhat(5, u)
        atoms = atoms[5:]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'D5h')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'C2v')
        self.assertTrue(geom.check_given_symm('C2v', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0], [1, 4], [2, 3], [5, 6]])

        atoms = ringhat(5, u)
        atoms = atoms[5:]
        atoms[1][0] = 'C1'
        gpname, orig, axes = geom.detect_symm(atoms, {'C':'ccpvdz','C1':'sto3g','N':'631g'})
        self.assertEqual(gpname, 'C2v')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'C2v')
        self.assertTrue(geom.check_given_symm('C2v', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0,2],[1],[3,4],[5,6]])
    def test_d6h(self):
        atoms = ringhat(6, u)
        atoms = atoms[6:]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'D6h')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0,3],[1,2,4,5],[6,7]])
        self.assertTrue(geom.check_given_symm('D2h', atoms))

    def test_d6h_1(self):
        atoms = ringhat(6, u)
        atoms = atoms[6:]
        atoms[1][0] = 'C1'
        atoms[2][0] = 'C1'
        basis = {'C': 'sto3g', 'N':'sto3g', 'C1':'sto3g'}
        gpname, orig, axes = geom.detect_symm(atoms, basis)
        self.assertEqual(gpname, 'D6h')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0,3],[1,2,4,5],[6,7]])
        self.assertTrue(geom.check_given_symm('D2h', atoms, basis))

    def test_c2v(self):
        atoms = ringhat(6, u)
        atoms = atoms[6:]
        atoms[1][0] = 'C1'
        atoms[2][0] = 'C1'
        basis = {'C': 'sto3g', 'N':'sto3g', 'C1':'ccpvdz'}
        gpname, orig, axes = geom.detect_symm(atoms, basis)
        self.assertEqual(gpname, 'C2v')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 3], [1, 2], [4, 5], [6, 7]])
        self.assertTrue(geom.check_given_symm('C2', atoms, basis))
    def test_s4(self):
        atoms = [['C', ( 0.5, 0  ,  1)],
                 ['O', ( 0.4, 0.2,  1)],
                 ['C', (-0.5, 0  ,  1)],
                 ['O', (-0.4,-0.2,  1)],
                 ['C', ( 0  , 0.5, -1)],
                 ['O', (-0.2, 0.4, -1)],
                 ['C', ( 0  ,-0.5, -1)],
                 ['O', ( 0.2,-0.4, -1)]]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'S4')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 2], [1, 3], [4, 6], [5, 7]])
        self.assertTrue(geom.check_given_symm('C2', atoms))

    def test_s6(self):
        rotz = random_rotz()
        atoms = ringhat(3, 1)[3:6] + ringhat(3, rotz)[:3]
        rotz[2,2] = -1
        rotz[:2,:2] = numpy.array(((.5, numpy.sqrt(3)/2),(-numpy.sqrt(3)/2, .5)))
        r = numpy.dot([x[1] for x in atoms], rotz) - numpy.array((0,0,3.5))
        atoms += list(zip([x[0] for x in atoms], r))
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'S6')
        gpname, axes = geom.subgroup(gpname, axes)
        self.assertEqual(gpname, 'C3')
    def test_c5h(self):
        atoms = ringhat(5, u)
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C5h')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Cs')
        self.assertTrue(geom.check_given_symm('Cs', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0], [1], [2], [3], [4], [5], [6], [7], [8], [9], [10,11]])

    def test_c5(self):
        atoms = ringhat(5, u)[:-1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C5')
        gpname, axes = geom.subgroup(gpname, axes)
        self.assertEqual(gpname, 'C1')

    def test_c5v(self):
        atoms = ringhat(5, u)[5:-1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C5v')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Cs')
        self.assertTrue(geom.check_given_symm('Cs', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 4], [1, 3], [2], [5]])

    def test_ih1(self):
        coords = numpy.dot(make60(1.5, 1), u)
        atoms = [['C', c] for c in coords]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Ih')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Ci')
        self.assertTrue(geom.check_given_symm('Ci', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 55], [1, 56], [2, 57], [3, 58], [4, 59],
                          [5, 30], [6, 31], [7, 32], [8, 33], [9, 34],
                          [10, 35], [11, 36], [12, 37], [13, 38], [14, 39],
                          [15, 40], [16, 41], [17, 42], [18, 43], [19, 44],
                          [20, 45], [21, 46], [22, 47], [23, 48], [24, 49],
                          [25, 50], [26, 51], [27, 52], [28, 53], [29, 54]])
    def test_ih2(self):
        coords1 = numpy.dot(make60(1.5, 3), u)
        coords2 = numpy.dot(make12(1.1), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Ih')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Ci')
        self.assertTrue(geom.check_given_symm('Ci', atoms))

    def test_ih3(self):
        coords1 = numpy.dot(make20(1.5), u)
        atoms = [['C', c] for c in coords1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Ih')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Ci')
        self.assertTrue(geom.check_given_symm('Ci', atoms))

    def test_ih4(self):
        coords1 = make12(1.5)
        atoms = [['C', c] for c in coords1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Ih')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Ci')
        self.assertTrue(geom.check_given_symm('Ci', atoms))

    def test_d5d_1(self):
        coords1 = numpy.dot(make20(2.0), u)
        coords2 = numpy.dot(make12(1.1), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'D5d')

    def test_s10(self):
        numpy.random.seed(19)
        rotz = numpy.eye(3)
        rotz[:2,:2] = numpy.linalg.svd(numpy.random.random((2,2)))[0]
        coords1 = numpy.dot(make60(1.5, 3.0), u)
        coords2 = reduce(numpy.dot, (make20(1.1), rotz, u))
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'S10')
    def test_oh1(self):
        coords1 = numpy.dot(make6(1.5), u)
        atoms = [['C', c] for c in coords1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Oh')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2h')
        self.assertTrue(geom.check_given_symm('D2h', atoms))

    def test_oh2(self):
        coords1 = numpy.dot(make8(1.5), u)
        atoms = [['C', c] for c in coords1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Oh')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2h')
        self.assertTrue(geom.check_given_symm('D2h', atoms))

    def test_oh3(self):
        coords1 = numpy.dot(make8(1.5), u)
        coords2 = numpy.dot(make6(1.5), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Oh')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2h')
        self.assertTrue(geom.check_given_symm('D2h', atoms))

    def test_c4h(self):
        coords1 = numpy.dot(make8(1.5), u)
        coords2 = numpy.dot(make6(1.5).dot(random_rotz()), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C4h')

    def test_c2h(self):
        coords1 = numpy.dot(make8(2.5), u)
        coords2 = numpy.dot(make20(1.2), u)
        atoms = [['C', c] for c in numpy.vstack((coords1,coords2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C2h')

    def test_c1(self):
        coords1 = numpy.dot(make4(2.5), u)
        coords2 = make20(1.2)
        atoms = [['C', c] for c in numpy.vstack((coords1,coords2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C1')

    def test_c2(self):
        coords1 = numpy.dot(make4(2.5), 1)
        coords2 = make12(1.2)
        axes = geom._make_axes(coords2[1]-coords2[0], coords2[2])
        coords2 = numpy.dot(coords2, axes.T)
        atoms = [['C', c] for c in numpy.vstack((coords1,coords2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C2')

    def test_ci(self):
        coords1 = numpy.dot(make8(2.5), u)
        coords2 = numpy.dot(numpy.dot(make20(1.2), random_rotz()), u)
        atoms = [['C', c] for c in numpy.vstack((coords1,coords2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Ci')

    def test_cs(self):
        coords1 = make4(2.5)
        axes = geom._make_axes(coords1[1]-coords1[0], coords1[2])
        coords1 = numpy.dot(coords1, axes.T)
        coords2 = make12(1.2)
        axes = geom._make_axes(coords2[1]-coords2[0], coords2[2])
        coords2 = numpy.dot(coords2, axes.T)
        atoms = [['C', c] for c in numpy.vstack((coords1,coords2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Cs')

        numpy.random.seed(1)
        c0 = numpy.random.random((4,3))
        c0[:,1] *= .5
        c1 = c0.copy()
        c1[:,1] *= -1
        atoms = [['C', c] for c in numpy.vstack((c0,c1))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Cs')
    def test_td1(self):
        coords1 = numpy.dot(make4(1.5), u)
        atoms = [['C', c] for c in coords1]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Td')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2')
        self.assertTrue(geom.check_given_symm('D2', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 1, 2, 3]])

    def test_td2(self):
        coords1 = numpy.dot(make4(1.5), u)
        coords2 = numpy.dot(make4(1.9), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Td')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2')
        self.assertTrue(geom.check_given_symm('D2', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 1, 2, 3], [4, 5, 6, 7]])

    def test_c3v(self):
        coords1 = numpy.dot(make4(1.5), u)
        coords2 = numpy.dot(make4(1.9), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        atoms[2][0] = 'C1'
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'C3v')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Cs')
        self.assertTrue(geom.check_given_symm('Cs', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 1], [2], [3], [4, 5], [6], [7]])

    def test_c3v_1(self):
        mol = gto.M(atom='''
            C   0.948065  -0.081406  -0.007893
            C   0.462608  -0.144439   1.364854
            N   0.077738  -0.194439   2.453356
            H   0.591046   0.830035  -0.495369
            H   0.591062  -0.944369  -0.576807
            H   2.041481  -0.080642  -0.024174''')
        gpname, orig, axes = geom.detect_symm(mol._atom)
        self.assertEqual(gpname, 'C1')

        with lib.temporary_env(geom, TOLERANCE=1e-3):
            gpname, orig, axes = geom.detect_symm(mol._atom)
        self.assertEqual(gpname, 'C3v')
    def test_t(self):
        atoms = [['C', ( 1.0   ,-1.0   , 1.0   )],
                 ['O', ( 1.0-.1,-1.0+.2, 1.0   )],
                 ['O', ( 1.0   ,-1.0+.1, 1.0-.2)],
                 ['O', ( 1.0-.2,-1.0   , 1.0-.1)],
                 ['C', (-1.0   , 1.0   , 1.0   )],
                 ['O', (-1.0+.1, 1.0-.2, 1.0   )],
                 ['O', (-1.0   , 1.0-.1, 1.0-.2)],
                 ['O', (-1.0+.2, 1.0   , 1.0-.1)],
                 ['C', ( 1.0   , 1.0   ,-1.0   )],
                 ['O', ( 1.0-.2, 1.0   ,-1.0+.1)],
                 ['O', ( 1.0   , 1.0-.1,-1.0+.2)],
                 ['O', ( 1.0-.1, 1.0-.2,-1.0   )],
                 ['C', (-1.0   ,-1.0   ,-1.0   )],
                 ['O', (-1.0   ,-1.0+.1,-1.0+.2)],
                 ['O', (-1.0+.2,-1.0   ,-1.0+.1)],
                 ['O', (-1.0+.1,-1.0+.2,-1.0   )]]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'T')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2')
        self.assertTrue(geom.check_given_symm('D2', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 4, 8, 12], [1, 5, 11, 15],
                          [2, 6, 10, 13], [3, 7, 9, 14]])

    def test_th(self):
        atoms = [['C', ( 1.0   ,-1.0   , 1.0   )],
                 ['O', ( 1.0-.1,-1.0+.2, 1.0   )],
                 ['O', ( 1.0   ,-1.0+.1, 1.0-.2)],
                 ['O', ( 1.0-.2,-1.0   , 1.0-.1)],
                 ['C', ( 1.0   , 1.0   , 1.0   )],
                 ['O', ( 1.0-.1, 1.0-.2, 1.0   )],
                 ['O', ( 1.0   , 1.0-.1, 1.0-.2)],
                 ['O', ( 1.0-.2, 1.0   , 1.0-.1)],
                 ['C', (-1.0   , 1.0   , 1.0   )],
                 ['O', (-1.0+.1, 1.0-.2, 1.0   )],
                 ['O', (-1.0   , 1.0-.1, 1.0-.2)],
                 ['O', (-1.0+.2, 1.0   , 1.0-.1)],
                 ['C', (-1.0   ,-1.0   , 1.0   )],
                 ['O', (-1.0+.1,-1.0+.2, 1.0   )],
                 ['O', (-1.0   ,-1.0+.1, 1.0-.2)],
                 ['O', (-1.0+.2,-1.0   , 1.0-.1)],
                 ['C', ( 1.0   ,-1.0   ,-1.0   )],
                 ['O', ( 1.0-.2,-1.0   ,-1.0+.1)],
                 ['O', ( 1.0   ,-1.0+.1,-1.0+.2)],
                 ['O', ( 1.0-.1,-1.0+.2,-1.0   )],
                 ['C', ( 1.0   , 1.0   ,-1.0   )],
                 ['O', ( 1.0-.2, 1.0   ,-1.0+.1)],
                 ['O', ( 1.0   , 1.0-.1,-1.0+.2)],
                 ['O', ( 1.0-.1, 1.0-.2,-1.0   )],
                 ['C', (-1.0   , 1.0   ,-1.0   )],
                 ['O', (-1.0+.2, 1.0   ,-1.0+.1)],
                 ['O', (-1.0   , 1.0-.1,-1.0+.2)],
                 ['O', (-1.0+.1, 1.0-.2,-1.0   )],
                 ['C', (-1.0   ,-1.0   ,-1.0   )],
                 ['O', (-1.0   ,-1.0+.1,-1.0+.2)],
                 ['O', (-1.0+.2,-1.0   ,-1.0+.1)],
                 ['O', (-1.0+.1,-1.0+.2,-1.0   )]]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'Th')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'D2')
        self.assertTrue(geom.check_given_symm('D2', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 8, 20, 28], [1, 9, 23, 31],
                          [2, 10, 22, 29], [3, 11, 21, 30],
                          [4, 12, 16, 24], [5, 13, 19, 27],
                          [6, 14, 18, 26], [7, 15, 17, 25]])
    def test_s4_1(self):
        coords1 = numpy.dot(make4(1.5), u)
        coords2 = numpy.dot(numpy.dot(make4(2.4), random_rotz()), u)
        atoms = [['C', c] for c in coords1] + [['C', c] for c in coords2]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'S4')

    def test_Dooh(self):
        atoms = [['H', (0,0,0)], ['H', (0,0,-1)], ['H1', (0,0,1)]]
        basis = {'H':'sto3g'}
        gpname, orig, axes = geom.detect_symm(atoms, basis)
        self.assertEqual(gpname, 'Dooh')
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0], [1,2]])
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Dooh')
        self.assertTrue(geom.check_given_symm('Dooh', atoms, basis))
        self.assertTrue(geom.check_given_symm('D2h', atoms, basis))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0], [1,2]])

    def test_Coov(self):
        atoms = [['H', (0,0,0)], ['H', (0,0,-1)], ['H1', (0,0,1)]]
        basis = {'H':'sto3g', 'H1':'6-31g'}
        gpname, orig, axes = geom.detect_symm(atoms, basis)
        self.assertEqual(gpname, 'Coov')
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0], [1], [2]])
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'Coov')
        self.assertTrue(geom.check_given_symm('Coov', atoms, basis))
        self.assertTrue(geom.check_given_symm('C2v', atoms, basis))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0], [1], [2]])
    def test_d5(self):
        coord1 = ring(5)
        coord2 = ring(5, .1)
        coord1[:,2] = 1
        coord2[:,2] = -1
        atoms = [['H', c] for c in numpy.vstack((coord1,coord2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'D5')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'C2')
        self.assertTrue(geom.check_given_symm('C2', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 5], [1, 9], [2, 8], [3, 7], [4, 6]])

    def test_d5d(self):
        coord1 = ring(5)
        coord2 = ring(5, numpy.pi/5)
        coord1[:,2] = 1
        coord2[:,2] = -1
        atoms = [['H', c] for c in numpy.vstack((coord1,coord2))]
        gpname, orig, axes = geom.detect_symm(atoms)
        self.assertEqual(gpname, 'D5d')
        gpname, axes = geom.subgroup(gpname, axes)
        atoms = geom.shift_atom(atoms, orig, axes)
        self.assertEqual(gpname, 'C2h')
        self.assertTrue(geom.check_given_symm('C2h', atoms))
        self.assertEqual(geom.symm_identical_atoms(gpname, atoms),
                         [[0, 3, 5, 7], [1, 2, 8, 9], [4, 6]])
def test_detect_symm_c2v(self):
atoms = [['H' , (1., 0., 2.)],
['He', (0., 1., 0.)],
['H' , (-2.,0.,-1.)],
['He', (0.,-1., 0.)]]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'C2v')
atoms = geom.shift_atom(atoms, orig, axes)
self.assertTrue(geom.check_given_symm('C2v', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms), [[0,2],[1,3]])
def test_detect_symm_d2h_a(self):
atoms = [['He', (0., 1., 0.)],
['H' , (1., 0., 0.)],
['H' , (-1.,0., 0.)],
['He', (0.,-1., 0.)]]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'D2h')
atoms = geom.shift_atom(atoms, orig, axes)
self.assertTrue(geom.check_given_symm('D2h', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0, 3], [1, 2]])
def test_detect_symm_d2h_b(self):
atoms = [['H' , (1., 0., 2.)],
['He', (0., 1., 0.)],
['H' , (-1.,0.,-2.)],
['He', (0.,-1., 0.)]]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'D2h')
atoms = geom.shift_atom(atoms, orig, axes)
self.assertTrue(geom.check_given_symm('D2h', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms), [[0,2],[1,3]])
def test_detect_symm_c2h_a(self):
atoms = [['H' , (1., 0., 2.)],
['He', (0., 1.,-1.)],
['H' , (-1.,0.,-2.)],
['He', (0.,-1., 1.)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'C2h')
self.assertEqual(geom.symm_identical_atoms(l,atoms), [[0,2],[1,3]])
self.assertTrue(geom.check_given_symm('C2h', atoms))
def test_detect_symm_c2h(self):
atoms = [['H' , (1., 0., 2.)],
['He', (0., 1., 0.)],
['H' , (1., 0., 0.)],
['H' , (-1.,0., 0.)],
['H' , (-1.,0.,-2.)],
['He', (0.,-1., 0.)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'C2h')
self.assertEqual(geom.symm_identical_atoms(l,atoms), [[0,4],[1,5],[2,3]])
self.assertTrue(geom.check_given_symm('C2h', atoms))
atoms = [['H' , (1., 0., 1.)],
['H' , (1., 0.,-1.)],
['He', (0., 0., 2.)],
['He', (2., 0.,-2.)],
['Li', (1., 1., 0.)],
['Li', (1.,-1., 0.)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'C2h')
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0, 1], [2, 3], [4, 5]])
self.assertTrue(geom.check_given_symm('C2h', atoms))
def test_detect_symm_d2_a(self):
atoms = [['H' , (1., 0., 1.)],
['H' , (1., 0.,-1.)],
['He', (0., 0., 2.)],
['He', (2., 0., 2.)],
['He', (1., 1.,-2.)],
['He', (1.,-1.,-2.)]]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'D2d')
atoms = geom.shift_atom(atoms, orig, axes)
self.assertTrue(geom.check_given_symm('D2', atoms))
self.assertEqual(geom.symm_identical_atoms('D2', atoms),
[[0, 1], [2, 3, 4, 5]])
def test_detect_symm_d2_b(self):
s2 = numpy.sqrt(.5)
atoms = [['C', (0., 0., 1.)],
['C', (0., 0.,-1.)],
['H', ( 1, 0., 2.)],
['H', (-1, 0., 2.)],
['H', ( s2, s2,-2.)],
['H', (-s2,-s2,-2.)]]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'D2')
atoms = geom.shift_atom(atoms, orig, axes)
self.assertTrue(geom.check_given_symm('D2', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0, 1], [2, 3, 4, 5]])
def test_detect_symm_s4(self):
atoms = [['H', (-1,-1.,-2.)],
['H', ( 1, 1.,-2.)],
['C', (-.9,-1.,-2.)],
['C', (.9, 1.,-2.)],
['H', ( 1,-1., 2.)],
['H', (-1, 1., 2.)],
['C', ( 1,-.9, 2.)],
['C', (-1, .9, 2.)],]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'S4')
atoms = geom.shift_atom(atoms, orig, axes)
self.assertTrue(geom.check_given_symm('C2', atoms))
self.assertEqual(geom.symm_identical_atoms('C2',atoms),
[[0, 1], [2, 3], [4, 5], [6, 7]])
def test_detect_symm_ci(self):
atoms = [['H' , ( 1., 0., 0.)],
['He', ( 0., 1., 0.)],
['Li', ( 0., 0., 1.)],
['Be', ( .5, .5, .5)],
['H' , (-1., 0., 0.)],
['He', ( 0.,-1., 0.)],
['Li', ( 0., 0.,-1.)],
['Be', (-.5,-.5,-.5)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'Ci')
self.assertTrue(geom.check_given_symm('Ci', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0, 4], [1, 5], [2, 6], [3, 7]])
def test_detect_symm_cs1(self):
atoms = [['H' , (1., 2., 0.)],
['He', (1., 0., 0.)],
['Li', (2.,-1., 0.)],
['Be', (0., 1., 0.)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'Cs')
self.assertTrue(geom.check_given_symm('Cs', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0], [1], [2], [3]])
def test_detect_symm_cs2(self):
atoms = [['H' , (0., 1., 2.)],
['He', (0., 1., 0.)],
['Li', (0., 2.,-1.)],
['Be', (0., 0., 1.)],
['S' , (-3, 1., .5)],
['S' , ( 3, 1., .5)]]
coord = numpy.dot([a[1] for a in atoms], u)
atoms = [[atoms[i][0], c] for i,c in enumerate(coord)]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'Cs')
self.assertTrue(geom.check_given_symm('Cs', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0], [1], [2], [3], [4, 5]])
def test_detect_symm_cs3(self):
atoms = [['H' , ( 2.,1., 0.)],
['He', ( 0.,1., 0.)],
['Li', (-1.,2., 0.)],
['Be', ( 1.,0., 0.)],
['S' , ( .5,1., -3)],
['S' , ( .5,1., 3)]]
coord = numpy.dot([a[1] for a in atoms], u)
atoms = [[atoms[i][0], c] for i,c in enumerate(coord)]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'Cs')
self.assertTrue(geom.check_given_symm('Cs', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0], [1], [2], [3], [4, 5]])
def test_detect_symm_c1(self):
atoms = [['H' , ( 1., 0., 0.)],
['He', ( 0., 1., 0.)],
['Li', ( 0., 0., 1.)],
['Be', ( .5, .5, .5)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'C1')
self.assertTrue(geom.check_given_symm('C1', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms),
[[0], [1], [2], [3]])
def test_detect_symm_c2(self):
atoms = [['H' , ( 1., 0., 1.)],
['H' , ( 1., 0.,-1.)],
['He', ( 0.,-3., 2.)],
['He', ( 0., 3.,-2.)]]
l, orig, axes = geom.detect_symm(atoms)
atoms = geom.shift_atom(atoms, orig, axes)
self.assertEqual(l, 'C2')
self.assertTrue(geom.check_given_symm('C2', atoms))
self.assertEqual(geom.symm_identical_atoms(l,atoms), [[0,1],[2,3]])
def test_detect_symm_d3d(self):
atoms = [
['C', ( 1.25740, -0.72596, -0.25666)],
['C', ( 1.25740, 0.72596, 0.25666)],
['C', ( 0.00000, 1.45192, -0.25666)],
['C', (-1.25740, 0.72596, 0.25666)],
['C', (-1.25740, -0.72596, -0.25666)],
['C', ( 0.00000, -1.45192, 0.25666)],
['H', ( 2.04168, -1.17876, 0.05942)],
['H', ( 1.24249, -0.71735, -1.20798)],
['H', ( 2.04168, 1.17876, -0.05942)],
['H', ( 1.24249, 0.71735, 1.20798)],
['H', ( 0.00000, 1.43470, -1.20798)],
['H', ( 0.00000, 2.35753, 0.05942)],
['H', (-2.04168, 1.17876, -0.05942)],
['H', (-1.24249, 0.71735, 1.20798)],
['H', (-1.24249, -0.71735, -1.20798)],
['H', (-2.04168, -1.17876, 0.05942)],
['H', ( 0.00000, -1.43470, 1.20798)],
['H', ( 0.00000, -2.35753, -0.05942)], ]
with lib.temporary_env(geom, TOLERANCE=1e-4):
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'D3d')
def test_quasi_c2v(self):
atoms = [
['Fe', ( 0.0000000000, 0.0055197721, 0.0055197721)],
['O' , (-1.3265475500, 0.0000000000, -0.9445024777)],
['O' , ( 1.3265475500, 0.0000000000, -0.9445024777)],
['O' , ( 0.0000000000, -1.3265374484, 0.9444796669)],
['O' , ( 0.0000000000, 1.3265374484, 0.9444796669)],]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'Cs')
with lib.temporary_env(geom, TOLERANCE=1e-2):
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'C2v')
with lib.temporary_env(geom, TOLERANCE=1e-1):
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'Td')
def test_as_subgroup(self):
axes = numpy.eye(3)
g, ax = symm.as_subgroup('D2d', axes)
self.assertEqual(g, 'D2')
self.assertAlmostEqual(abs(ax-numpy.eye(3)).max(), 0, 12)
g, ax = symm.as_subgroup('D2d', axes, 'D2')
self.assertEqual(g, 'D2')
self.assertAlmostEqual(abs(ax-numpy.eye(3)).max(), 0, 12)
g, ax = symm.as_subgroup('D2d', axes, 'C2v')
self.assertEqual(g, 'C2v')
self.assertAlmostEqual(ax[0,1], numpy.sqrt(.5), 9)
self.assertAlmostEqual(ax[1,0],-numpy.sqrt(.5), 9)
g, ax = symm.as_subgroup('C2v', axes, 'Cs')
self.assertEqual(g, 'Cs')
self.assertAlmostEqual(ax[2,0], 1, 9)
g, ax = symm.as_subgroup('D6', axes)
self.assertEqual(g, 'D2')
g, ax = symm.as_subgroup('C6h', axes)
self.assertEqual(g, 'C2h')
g, ax = symm.as_subgroup('C6v', axes)
self.assertEqual(g, 'C2v')
g, ax = symm.as_subgroup('C6', axes)
self.assertEqual(g, 'C2')
def test_ghost(self):
atoms = [
['Fe' , ( 0.0, 0.0, 0.0)],
['O' , (-1.3, 0.0, 0.0)],
['GHOST-O' , ( 1.3, 0.0, 0.0)],
['GHOST-O' , ( 0.0, -1.3, 0.0)],
['O' , ( 0.0, 1.3, 0.0)],]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'C2v')
self.assertAlmostEqual(axes[2,0]*axes[2,1], -.5)
atoms = [
['Fe' , ( 0.0, 0.0, 0.0)],
['O' , (-1.3, 0.0, 0.0)],
['XO' , ( 1.3, 0.0, 0.0)],
['GHOSTO' , ( 0.0, -1.3, 0.0)],
['O' , ( 0.0, 1.3, 0.0)],]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'C2v')
self.assertAlmostEqual(axes[2,0]*axes[2,1], -.5)
atoms = [
['Fe' , ( 0.0, 0.0, 0.0)],
['O' , (-1.3, 0.0, 0.0)],
['X' , ( 1.3, 0.0, 0.0)],
['X' , ( 0.0, -1.3, 0.0)],
['O' , ( 0.0, 1.3, 0.0)],]
l, orig, axes = geom.detect_symm(atoms)
self.assertEqual(l, 'C2v')
self.assertAlmostEqual(axes[2,0]*axes[2,1], -.5)
def test_sort_coords(self):
c = numpy.random.random((5,3))
c0 = symm.sort_coords(c)
idx = symm.argsort_coords(c)
self.assertAlmostEqual(abs(c[idx] - c0).max(), 0, 9)
def ring(n, start=0):
r = 1. / numpy.sin(numpy.pi/n)
coord = []
for i in range(n):
theta = i * (2*numpy.pi/n)
coord.append([r*numpy.cos(theta+start), r*numpy.sin(theta+start), 0])
return numpy.array(coord)
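A quick numerical check of the construction in `ring`: with radius `1/sin(pi/n)`, neighbouring vertices sit exactly 2 units apart, which is why the H-ring test geometries come out regular. A standard-library-only sketch of the same construction (rewritten without numpy purely for illustration):

```python
import math

def ring_stdlib(n, start=0.0):
    # same construction as ring() above: radius 1/sin(pi/n) makes the
    # chord between neighbouring vertices exactly 2
    r = 1.0 / math.sin(math.pi / n)
    return [(r * math.cos(start + i * 2 * math.pi / n),
             r * math.sin(start + i * 2 * math.pi / n), 0.0)
            for i in range(n)]

pts = ring_stdlib(5)
edges = [math.dist(pts[i], pts[(i + 1) % len(pts)]) for i in range(len(pts))]
print([round(e, 6) for e in edges])  # [2.0, 2.0, 2.0, 2.0, 2.0]
```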
def ringhat(n, u):
atoms = [['H', c] for c in ring(n)] \
+ [['C', c] for c in ring(n, .1)] \
+ [['N', [0,0, 1.3]],
['N', [0,0,-1.3]]]
c = numpy.dot([a[1] for a in atoms], u)
return [[atoms[i][0], c[i]] for i in range(len(atoms))]
def rotmatz(ang):
c = numpy.cos(ang)
s = numpy.sin(ang)
return numpy.array((( c, s, 0),
(-s, c, 0),
( 0, 0, 1),))
def rotmaty(ang):
c = numpy.cos(ang)
s = numpy.sin(ang)
return numpy.array((( c, 0, s),
( 0, 1, 0),
(-s, 0, c),))
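`rotmatz`/`rotmaty` build rotation matrices in the row-vector convention used throughout these helpers (`numpy.dot(p, rot)`): a 90-degree rotation about z carries the x unit vector onto y. A minimal stdlib check of that convention:

```python
import math

def rotmatz(ang):
    # same layout as the numpy version above
    c, s = math.cos(ang), math.sin(ang)
    return [( c,   s, 0.0),
            (-s,   c, 0.0),
            (0.0, 0.0, 1.0)]

def apply_row(p, m):
    # row-vector convention, i.e. equivalent to numpy.dot(p, m)
    return tuple(sum(p[i] * m[i][j] for i in range(3)) for j in range(3))

x_axis = (1.0, 0.0, 0.0)
rotated = apply_row(x_axis, rotmatz(math.pi / 2))
print([round(v, 6) for v in rotated])  # [0.0, 1.0, 0.0]
```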
def r2edge(ang, r):
return 2*r*numpy.sin(ang/2)
def make60(b5, b6):
theta1 = numpy.arccos(1/numpy.sqrt(5))
theta2 = (numpy.pi - theta1) * .5
r = (b5*2+b6)/2/numpy.sin(theta1/2)
rot72 = rotmatz(numpy.pi*2/5)
s1 = numpy.sin(theta1)
c1 = numpy.cos(theta1)
s2 = numpy.sin(theta2)
c2 = numpy.cos(theta2)
p1 = numpy.array(( s2*b5, 0, r-c2*b5))
p9 = numpy.array((-s2*b5, 0,-r+c2*b5))
p2 = numpy.array(( s2*(b5+b6), 0, r-c2*(b5+b6)))
rot1 = reduce(numpy.dot, (rotmaty(theta1), rot72, rotmaty(-theta1)))
p2s = []
for i in range(5):
p2s.append(p2)
p2 = numpy.dot(p2, rot1)
coord = []
for i in range(5):
coord.append(p1)
p1 = numpy.dot(p1, rot72)
for pj in p2s:
pi = pj
for i in range(5):
coord.append(pi)
pi = numpy.dot(pi, rot72)
for pj in p2s:
pi = pj
for i in range(5):
coord.append(-pi)
pi = numpy.dot(pi, rot72)
for i in range(5):
coord.append(p9)
p9 = numpy.dot(p9, rot72)
return numpy.array(coord)
def make12(b):
theta1 = numpy.arccos(1/numpy.sqrt(5))
theta2 = (numpy.pi - theta1) * .5
r = b/2./numpy.sin(theta1/2)
rot72 = rotmatz(numpy.pi*2/5)
s1 = numpy.sin(theta1)
c1 = numpy.cos(theta1)
p1 = numpy.array(( s1*r, 0, c1*r))
p2 = numpy.array((-s1*r, 0, -c1*r))
coord = [( 0, 0, r)]
for i in range(5):
coord.append(p1)
p1 = numpy.dot(p1, rot72)
for i in range(5):
coord.append(p2)
p2 = numpy.dot(p2, rot72)
coord.append(( 0, 0, -r))
return numpy.array(coord)
def make20(b):
theta1 = numpy.arccos(numpy.sqrt(5)/3)
theta2 = numpy.arcsin(r2edge(theta1,1)/2/numpy.sin(numpy.pi/5))
r = b/2./numpy.sin(theta1/2)
rot72 = rotmatz(numpy.pi*2/5)
s2 = numpy.sin(theta2)
c2 = numpy.cos(theta2)
s3 = numpy.sin(theta1+theta2)
c3 = numpy.cos(theta1+theta2)
p1 = numpy.array(( s2*r, 0, c2*r))
p2 = numpy.array(( s3*r, 0, c3*r))
p3 = numpy.array((-s3*r, 0, -c3*r))
p4 = numpy.array((-s2*r, 0, -c2*r))
coord = []
for i in range(5):
coord.append(p1)
p1 = numpy.dot(p1, rot72)
for i in range(5):
coord.append(p2)
p2 = numpy.dot(p2, rot72)
for i in range(5):
coord.append(p3)
p3 = numpy.dot(p3, rot72)
for i in range(5):
coord.append(p4)
p4 = numpy.dot(p4, rot72)
return numpy.array(coord)
def make4(b):
coord = numpy.ones((4,3)) * b*.5
coord[1,0] = coord[1,1] = -b*.5
coord[2,2] = coord[2,1] = -b * .5
coord[3,0] = coord[3,2] = -b * .5
return coord
def make6(b):
coord = numpy.zeros((6,3))
coord[0,0] = coord[1,1] = coord[2,2] = b * .5
coord[3,0] = coord[4,1] = coord[5,2] =-b * .5
return coord
def make8(b):
coord = numpy.ones((8,3)) * b*.5
n = 0
for i in range(2):
for j in range(2):
for k in range(2):
coord[n,0] = (-1) ** i * b*.5
coord[n,1] = (-1) ** j * b*.5
coord[n,2] = (-1) ** k * b*.5
n += 1
return coord
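`make8` enumerates the eight sign combinations `(+-b/2, +-b/2, +-b/2)`, i.e. the vertices of a cube with edge length `b`; the three nested loops are equivalent to `itertools.product` over the signs. An illustrative rewrite:

```python
import itertools

def make8_product(b):
    # cube vertices: every sign combination of (b/2, b/2, b/2)
    h = b * 0.5
    return [(sx * h, sy * h, sz * h)
            for sx, sy, sz in itertools.product((1, -1), repeat=3)]

cube = make8_product(2.0)
print(len(cube), cube[0], cube[-1])  # 8 (1.0, 1.0, 1.0) (-1.0, -1.0, -1.0)
```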
def random_rotz(seed=19):
numpy.random.seed(seed)
rotz = numpy.eye(3)
rotz[:2,:2] = numpy.linalg.svd(numpy.random.random((2,2)))[0]
return rotz
if __name__ == "__main__":
print("Full Tests geom")
unittest.main()
# cyphercat/datadefs/__init__.py
from .cyphercat_dataset import CCATDataset
from .voices_dataset import *
# crm/admin.py
from django.contrib import admin
from .models import Client, Contract, Event, Status
admin.site.register(Client)
admin.site.register(Contract)
admin.site.register(Event)
admin.site.register(Status)
# noaa/__init__.py
import noaa.climate as climate
# spatial_autocorrelation/__init__.py
from .moransI import *
# illuminate/__init__.py
from .illuminate import *
# __init__.py (toni-heittola/pelican-bdates)
from .bdates import *
# flask-skeleton/views/second_bp/second.py
# coding: utf-8
from flask_skeleton.views import second_bp
@second_bp.route('/test')
def test_s():
return 'test'
# je_auto_control/utils/image/__init__.py
from je_auto_control.utils.image import *
# breathe/parser/compoundsuper.py
#
# Generated Thu Jun 11 18:44:25 2009 by generateDS.py.
#
import sys
import getopt
from xml.dom import minidom
from xml.dom import Node
#
# User methods
#
# Calls to the methods in these classes are generated by generateDS.py.
# You can replace these methods by re-implementing the following class
# in a module named generatedssuper.py.
try:
from generatedssuper import GeneratedsSuper
except ImportError as exp:
class GeneratedsSuper:
def format_string(self, input_data, input_name=''):
return input_data
def format_integer(self, input_data, input_name=''):
return '%d' % input_data
def format_float(self, input_data, input_name=''):
return '%f' % input_data
def format_double(self, input_data, input_name=''):
return '%e' % input_data
def format_boolean(self, input_data, input_name=''):
return '%s' % input_data
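The fallback `GeneratedsSuper` above just routes values through `%`-style formatting: `format_float` yields six decimal places and `format_double` scientific notation. A quick check of those conversions on a minimal copy of the shims:

```python
class GeneratedsSuperDemo:
    # minimal copy of the fallback shims above, for illustration only
    def format_integer(self, input_data, input_name=''):
        return '%d' % input_data
    def format_float(self, input_data, input_name=''):
        return '%f' % input_data
    def format_double(self, input_data, input_name=''):
        return '%e' % input_data

g = GeneratedsSuperDemo()
print(g.format_integer(7))   # 7
print(g.format_float(0.5))   # 0.500000
print(g.format_double(0.5))  # 5.000000e-01
```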
#
# If you have installed IPython you can uncomment and use the following.
# IPython is available from http://ipython.scipy.org/.
#
## from IPython.Shell import IPShellEmbed
## args = ''
## ipshell = IPShellEmbed(args,
## banner = 'Dropping into IPython',
## exit_msg = 'Leaving Interpreter, back to program.')
# Then use the following line where and when you want to drop into the
# IPython shell:
# ipshell('<some message> -- Entering ipshell.\nHit Ctrl-D to exit')
#
# Globals
#
ExternalEncoding = 'ascii'
#
# Support/utility functions.
#
def showIndent(outfile, level):
for idx in range(level):
outfile.write(' ')
def quote_xml(inStr):
    s1 = (isinstance(inStr, str) and inStr or
          '%s' % inStr)
s1 = s1.replace('&', '&')
s1 = s1.replace('<', '<')
s1 = s1.replace('>', '>')
return s1
def quote_attrib(inStr):
    s1 = (isinstance(inStr, str) and inStr or
          '%s' % inStr)
s1 = s1.replace('&', '&')
s1 = s1.replace('<', '<')
s1 = s1.replace('>', '>')
if '"' in s1:
if "'" in s1:
s1 = '"%s"' % s1.replace('"', """)
else:
s1 = "'%s'" % s1
else:
s1 = '"%s"' % s1
return s1
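The two helpers above implement standard XML escaping: `quote_xml` escapes element text, while `quote_attrib` additionally wraps the value in whichever quote style avoids clashes with embedded quotes. A self-contained illustration (with the Python 2 `basestring` check dropped):

```python
def quote_xml_demo(in_str):
    # escape the three characters that are special in XML character data
    s1 = str(in_str)
    s1 = s1.replace('&', '&amp;')
    s1 = s1.replace('<', '&lt;')
    s1 = s1.replace('>', '&gt;')
    return s1

def quote_attrib_demo(in_str):
    s1 = quote_xml_demo(in_str)
    if '"' in s1:
        if "'" in s1:
            # both quote kinds present: use double quotes and escape
            # the embedded ones as &quot;
            return '"%s"' % s1.replace('"', '&quot;')
        return "'%s'" % s1
    return '"%s"' % s1

print(quote_xml_demo('a < b & c'))    # a &lt; b &amp; c
print(quote_attrib_demo('say "hi"'))  # 'say "hi"'
```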
def quote_python(inStr):
s1 = inStr
if s1.find("'") == -1:
if s1.find('\n') == -1:
return "'%s'" % s1
else:
return "'''%s'''" % s1
else:
if s1.find('"') != -1:
s1 = s1.replace('"', '\\"')
if s1.find('\n') == -1:
return '"%s"' % s1
else:
return '"""%s"""' % s1
class MixedContainer:
node_type = "mixedcontainer"
# Constants for category:
CategoryNone = 0
CategoryText = 1
CategorySimple = 2
CategoryComplex = 3
# Constants for content_type:
TypeNone = 0
TypeText = 1
TypeString = 2
TypeInteger = 3
TypeFloat = 4
TypeDecimal = 5
TypeDouble = 6
TypeBoolean = 7
def __init__(self, category, content_type, name, value):
self.category = category
self.content_type = content_type
self.name = name
self.value = value
def getCategory(self):
return self.category
def getContenttype(self, content_type):
return self.content_type
def getValue(self):
return self.value
def getName(self):
return self.name
class _MemberSpec(object):
def __init__(self, name='', data_type='', container=0):
self.name = name
self.data_type = data_type
self.container = container
def set_name(self, name): self.name = name
def get_name(self): return self.name
def set_data_type(self, data_type): self.data_type = data_type
def get_data_type(self): return self.data_type
def set_container(self, container): self.container = container
def get_container(self): return self.container
#
# Data representation classes.
#
class DoxygenType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, version=None, compounddef=None):
self.version = version
self.compounddef = compounddef
def factory(*args_, **kwargs_):
if DoxygenType.subclass:
return DoxygenType.subclass(*args_, **kwargs_)
else:
return DoxygenType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_compounddef(self): return self.compounddef
def set_compounddef(self, compounddef): self.compounddef = compounddef
def get_version(self): return self.version
def set_version(self, version): self.version = version
def hasContent_(self):
if (
self.compounddef is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('version'):
self.version = attrs.get('version').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'compounddef':
obj_ = compounddefType.factory()
obj_.build(child_)
self.set_compounddef(obj_)
# end class DoxygenType
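The `build`/`buildAttributes`/`buildChildren` pattern above is plain DOM walking: parse with `minidom`, read attributes off the node, then dispatch on each element child's `nodeName`. A stripped-down sketch of the same flow on a toy document (the attribute and tag values here are illustrative, not taken from a real doxygen index):

```python
from xml.dom import Node, minidom

xml = '<doxygen version="1.9.1"><compounddef id="classFoo"/></doxygen>'
root = minidom.parseString(xml).documentElement

# buildAttributes: pull attributes off the node
version = root.attributes.get('version').value

# buildChildren: dispatch on the tag name of each element child,
# stripping any namespace prefix as build() does above
children = [child.nodeName.split(':')[-1]
            for child in root.childNodes
            if child.nodeType == Node.ELEMENT_NODE]

print(version, children)  # 1.9.1 ['compounddef']
```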
class compounddefType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, kind=None, prot=None, id=None, compoundname=None, title=None, basecompoundref=None, derivedcompoundref=None, includes=None, includedby=None, incdepgraph=None, invincdepgraph=None, innerdir=None, innerfile=None, innerclass=None, innernamespace=None, innerpage=None, innergroup=None, templateparamlist=None, sectiondef=None, briefdescription=None, detaileddescription=None, inheritancegraph=None, collaborationgraph=None, programlisting=None, location=None, listofallmembers=None):
self.kind = kind
self.prot = prot
self.id = id
self.compoundname = compoundname
self.title = title
if basecompoundref is None:
self.basecompoundref = []
else:
self.basecompoundref = basecompoundref
if derivedcompoundref is None:
self.derivedcompoundref = []
else:
self.derivedcompoundref = derivedcompoundref
if includes is None:
self.includes = []
else:
self.includes = includes
if includedby is None:
self.includedby = []
else:
self.includedby = includedby
self.incdepgraph = incdepgraph
self.invincdepgraph = invincdepgraph
if innerdir is None:
self.innerdir = []
else:
self.innerdir = innerdir
if innerfile is None:
self.innerfile = []
else:
self.innerfile = innerfile
if innerclass is None:
self.innerclass = []
else:
self.innerclass = innerclass
if innernamespace is None:
self.innernamespace = []
else:
self.innernamespace = innernamespace
if innerpage is None:
self.innerpage = []
else:
self.innerpage = innerpage
if innergroup is None:
self.innergroup = []
else:
self.innergroup = innergroup
self.templateparamlist = templateparamlist
if sectiondef is None:
self.sectiondef = []
else:
self.sectiondef = sectiondef
self.briefdescription = briefdescription
self.detaileddescription = detaileddescription
self.inheritancegraph = inheritancegraph
self.collaborationgraph = collaborationgraph
self.programlisting = programlisting
self.location = location
self.listofallmembers = listofallmembers
self.namespaces = []
def factory(*args_, **kwargs_):
if compounddefType.subclass:
return compounddefType.subclass(*args_, **kwargs_)
else:
return compounddefType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_compoundname(self): return self.compoundname
def set_compoundname(self, compoundname): self.compoundname = compoundname
def get_title(self): return self.title
def set_title(self, title): self.title = title
def get_basecompoundref(self): return self.basecompoundref
def set_basecompoundref(self, basecompoundref): self.basecompoundref = basecompoundref
def add_basecompoundref(self, value): self.basecompoundref.append(value)
def insert_basecompoundref(self, index, value): self.basecompoundref[index] = value
def get_derivedcompoundref(self): return self.derivedcompoundref
def set_derivedcompoundref(self, derivedcompoundref): self.derivedcompoundref = derivedcompoundref
def add_derivedcompoundref(self, value): self.derivedcompoundref.append(value)
def insert_derivedcompoundref(self, index, value): self.derivedcompoundref[index] = value
def get_includes(self): return self.includes
def set_includes(self, includes): self.includes = includes
def add_includes(self, value): self.includes.append(value)
def insert_includes(self, index, value): self.includes[index] = value
def get_includedby(self): return self.includedby
def set_includedby(self, includedby): self.includedby = includedby
def add_includedby(self, value): self.includedby.append(value)
def insert_includedby(self, index, value): self.includedby[index] = value
def get_incdepgraph(self): return self.incdepgraph
def set_incdepgraph(self, incdepgraph): self.incdepgraph = incdepgraph
def get_invincdepgraph(self): return self.invincdepgraph
def set_invincdepgraph(self, invincdepgraph): self.invincdepgraph = invincdepgraph
def get_innerdir(self): return self.innerdir
def set_innerdir(self, innerdir): self.innerdir = innerdir
def add_innerdir(self, value): self.innerdir.append(value)
def insert_innerdir(self, index, value): self.innerdir[index] = value
def get_innerfile(self): return self.innerfile
def set_innerfile(self, innerfile): self.innerfile = innerfile
def add_innerfile(self, value): self.innerfile.append(value)
def insert_innerfile(self, index, value): self.innerfile[index] = value
def get_innerclass(self): return self.innerclass
def set_innerclass(self, innerclass): self.innerclass = innerclass
def add_innerclass(self, value): self.innerclass.append(value)
def insert_innerclass(self, index, value): self.innerclass[index] = value
def get_innernamespace(self): return self.innernamespace
def set_innernamespace(self, innernamespace): self.innernamespace = innernamespace
def add_innernamespace(self, value): self.innernamespace.append(value)
def insert_innernamespace(self, index, value): self.innernamespace[index] = value
def get_innerpage(self): return self.innerpage
def set_innerpage(self, innerpage): self.innerpage = innerpage
def add_innerpage(self, value): self.innerpage.append(value)
def insert_innerpage(self, index, value): self.innerpage[index] = value
def get_innergroup(self): return self.innergroup
def set_innergroup(self, innergroup): self.innergroup = innergroup
def add_innergroup(self, value): self.innergroup.append(value)
def insert_innergroup(self, index, value): self.innergroup[index] = value
def get_templateparamlist(self): return self.templateparamlist
def set_templateparamlist(self, templateparamlist): self.templateparamlist = templateparamlist
def get_sectiondef(self): return self.sectiondef
def set_sectiondef(self, sectiondef): self.sectiondef = sectiondef
def add_sectiondef(self, value): self.sectiondef.append(value)
def insert_sectiondef(self, index, value): self.sectiondef[index] = value
def get_briefdescription(self): return self.briefdescription
def set_briefdescription(self, briefdescription): self.briefdescription = briefdescription
def get_detaileddescription(self): return self.detaileddescription
def set_detaileddescription(self, detaileddescription): self.detaileddescription = detaileddescription
def get_inheritancegraph(self): return self.inheritancegraph
def set_inheritancegraph(self, inheritancegraph): self.inheritancegraph = inheritancegraph
def get_collaborationgraph(self): return self.collaborationgraph
def set_collaborationgraph(self, collaborationgraph): self.collaborationgraph = collaborationgraph
def get_programlisting(self): return self.programlisting
def set_programlisting(self, programlisting): self.programlisting = programlisting
def get_location(self): return self.location
def set_location(self, location): self.location = location
def get_listofallmembers(self): return self.listofallmembers
def set_listofallmembers(self, listofallmembers): self.listofallmembers = listofallmembers
def get_kind(self): return self.kind
def set_kind(self, kind): self.kind = kind
def get_prot(self): return self.prot
def set_prot(self, prot): self.prot = prot
def get_id(self): return self.id
def set_id(self, id): self.id = id
def hasContent_(self):
if (
self.compoundname is not None or
self.title is not None or
self.basecompoundref is not None or
self.derivedcompoundref is not None or
self.includes is not None or
self.includedby is not None or
self.incdepgraph is not None or
self.invincdepgraph is not None or
self.innerdir is not None or
self.innerfile is not None or
self.innerclass is not None or
self.innernamespace is not None or
self.innerpage is not None or
self.innergroup is not None or
self.templateparamlist is not None or
self.sectiondef is not None or
self.briefdescription is not None or
self.detaileddescription is not None or
self.inheritancegraph is not None or
self.collaborationgraph is not None or
self.programlisting is not None or
self.location is not None or
self.listofallmembers is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('kind'):
self.kind = attrs.get('kind').value
if attrs.get('prot'):
self.prot = attrs.get('prot').value
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'compoundname':
compoundname_ = ''
for text__content_ in child_.childNodes:
compoundname_ += text__content_.nodeValue
self.compoundname = compoundname_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'title':
obj_ = docTitleType.factory()
obj_.build(child_)
self.set_title(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'basecompoundref':
obj_ = compoundRefType.factory()
obj_.build(child_)
self.basecompoundref.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'derivedcompoundref':
obj_ = compoundRefType.factory()
obj_.build(child_)
self.derivedcompoundref.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'includes':
obj_ = incType.factory()
obj_.build(child_)
self.includes.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'includedby':
obj_ = incType.factory()
obj_.build(child_)
self.includedby.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'incdepgraph':
obj_ = graphType.factory()
obj_.build(child_)
self.set_incdepgraph(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'invincdepgraph':
obj_ = graphType.factory()
obj_.build(child_)
self.set_invincdepgraph(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'innerdir':
obj_ = refType.factory(nodeName_)
obj_.build(child_)
self.innerdir.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'innerfile':
obj_ = refType.factory(nodeName_)
obj_.build(child_)
self.innerfile.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'innerclass':
obj_ = refType.factory(nodeName_)
obj_.build(child_)
self.innerclass.append(obj_)
self.namespaces.append(obj_.content_[0].getValue())
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'innernamespace':
obj_ = refType.factory(nodeName_)
obj_.build(child_)
self.innernamespace.append(obj_)
self.namespaces.append(obj_.content_[0].getValue())
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'innerpage':
obj_ = refType.factory(nodeName_)
obj_.build(child_)
self.innerpage.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'innergroup':
obj_ = refType.factory(nodeName_)
obj_.build(child_)
self.innergroup.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'templateparamlist':
obj_ = templateparamlistType.factory()
obj_.build(child_)
self.set_templateparamlist(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'sectiondef':
obj_ = sectiondefType.factory()
obj_.build(child_)
self.sectiondef.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'briefdescription':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_briefdescription(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'detaileddescription':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_detaileddescription(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'inheritancegraph':
obj_ = graphType.factory()
obj_.build(child_)
self.set_inheritancegraph(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'collaborationgraph':
obj_ = graphType.factory()
obj_.build(child_)
self.set_collaborationgraph(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'programlisting':
obj_ = listingType.factory()
obj_.build(child_)
self.set_programlisting(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'location':
obj_ = locationType.factory()
obj_.build(child_)
self.set_location(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'listofallmembers':
obj_ = listofallmembersType.factory()
obj_.build(child_)
self.set_listofallmembers(obj_)
# end class compounddefType
class listofallmembersType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, member=None):
if member is None:
self.member = []
else:
self.member = member
def factory(*args_, **kwargs_):
if listofallmembersType.subclass:
return listofallmembersType.subclass(*args_, **kwargs_)
else:
return listofallmembersType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_member(self): return self.member
def set_member(self, member): self.member = member
def add_member(self, value): self.member.append(value)
def insert_member(self, index, value): self.member.insert(index, value)
def hasContent_(self):
if (
self.member is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'member':
obj_ = memberRefType.factory()
obj_.build(child_)
self.member.append(obj_)
# end class listofallmembersType
class memberRefType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, virt=None, prot=None, refid=None, ambiguityscope=None, scope=None, name=None):
self.virt = virt
self.prot = prot
self.refid = refid
self.ambiguityscope = ambiguityscope
self.scope = scope
self.name = name
def factory(*args_, **kwargs_):
if memberRefType.subclass:
return memberRefType.subclass(*args_, **kwargs_)
else:
return memberRefType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_scope(self): return self.scope
def set_scope(self, scope): self.scope = scope
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_virt(self): return self.virt
def set_virt(self, virt): self.virt = virt
def get_prot(self): return self.prot
def set_prot(self, prot): self.prot = prot
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def get_ambiguityscope(self): return self.ambiguityscope
def set_ambiguityscope(self, ambiguityscope): self.ambiguityscope = ambiguityscope
def hasContent_(self):
if (
self.scope is not None or
self.name is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('virt'):
self.virt = attrs.get('virt').value
if attrs.get('prot'):
self.prot = attrs.get('prot').value
if attrs.get('refid'):
self.refid = attrs.get('refid').value
if attrs.get('ambiguityscope'):
self.ambiguityscope = attrs.get('ambiguityscope').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'scope':
scope_ = ''
for text__content_ in child_.childNodes:
scope_ += text__content_.nodeValue
self.scope = scope_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'name':
name_ = ''
for text__content_ in child_.childNodes:
name_ += text__content_.nodeValue
self.name = name_
# end class memberRefType
class scope(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if scope.subclass:
return scope.subclass(*args_, **kwargs_)
else:
return scope(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class scope
class name(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if name.subclass:
return name.subclass(*args_, **kwargs_)
else:
return name(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class name
class compoundRefType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, virt=None, prot=None, refid=None, valueOf_='', mixedclass_=None, content_=None):
self.virt = virt
self.prot = prot
self.refid = refid
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if compoundRefType.subclass:
return compoundRefType.subclass(*args_, **kwargs_)
else:
return compoundRefType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_virt(self): return self.virt
def set_virt(self, virt): self.virt = virt
def get_prot(self): return self.prot
def set_prot(self, prot): self.prot = prot
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('virt'):
self.virt = attrs.get('virt').value
if attrs.get('prot'):
self.prot = attrs.get('prot').value
if attrs.get('refid'):
self.refid = attrs.get('refid').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class compoundRefType
class reimplementType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, refid=None, valueOf_='', mixedclass_=None, content_=None):
self.refid = refid
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if reimplementType.subclass:
return reimplementType.subclass(*args_, **kwargs_)
else:
return reimplementType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('refid'):
self.refid = attrs.get('refid').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class reimplementType
class incType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, local=None, refid=None, valueOf_='', mixedclass_=None, content_=None):
self.local = local
self.refid = refid
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if incType.subclass:
return incType.subclass(*args_, **kwargs_)
else:
return incType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_local(self): return self.local
def set_local(self, local): self.local = local
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('local'):
self.local = attrs.get('local').value
if attrs.get('refid'):
self.refid = attrs.get('refid').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class incType
class refType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, prot=None, refid=None, valueOf_='', mixedclass_=None, content_=None):
self.prot = prot
self.refid = refid
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if refType.subclass:
return refType.subclass(*args_, **kwargs_)
else:
return refType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_prot(self): return self.prot
def set_prot(self, prot): self.prot = prot
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('prot'):
self.prot = attrs.get('prot').value
if attrs.get('refid'):
self.refid = attrs.get('refid').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class refType
class refTextType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, refid=None, kindref=None, external=None, valueOf_='', mixedclass_=None, content_=None):
self.refid = refid
self.kindref = kindref
self.external = external
self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if refTextType.subclass:
return refTextType.subclass(*args_, **kwargs_)
else:
return refTextType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def get_kindref(self): return self.kindref
def set_kindref(self, kindref): self.kindref = kindref
def get_external(self): return self.external
def set_external(self, external): self.external = external
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('refid'):
self.refid = attrs.get('refid').value
if attrs.get('kindref'):
self.kindref = attrs.get('kindref').value
if attrs.get('external'):
self.external = attrs.get('external').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '<![CDATA[' + child_.nodeValue + ']]>'
# end class refTextType
class sectiondefType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, kind=None, header=None, description=None, memberdef=None):
self.kind = kind
self.header = header
self.description = description
if memberdef is None:
self.memberdef = []
else:
self.memberdef = memberdef
def factory(*args_, **kwargs_):
if sectiondefType.subclass:
return sectiondefType.subclass(*args_, **kwargs_)
else:
return sectiondefType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_header(self): return self.header
def set_header(self, header): self.header = header
def get_description(self): return self.description
def set_description(self, description): self.description = description
def get_memberdef(self): return self.memberdef
def set_memberdef(self, memberdef): self.memberdef = memberdef
def add_memberdef(self, value): self.memberdef.append(value)
def insert_memberdef(self, index, value): self.memberdef.insert(index, value)
def get_kind(self): return self.kind
def set_kind(self, kind): self.kind = kind
def hasContent_(self):
if (
self.header is not None or
self.description is not None or
self.memberdef is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('kind'):
self.kind = attrs.get('kind').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'header':
header_ = ''
for text__content_ in child_.childNodes:
header_ += text__content_.nodeValue
self.header = header_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'description':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_description(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'memberdef':
obj_ = memberdefType.factory()
obj_.build(child_)
self.memberdef.append(obj_)
# end class sectiondefType
class memberdefType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, initonly=None, kind=None, volatile=None, const=None, raisexx=None, virt=None, readable=None, prot=None, explicit=None, new=None, final=None, writable=None, add=None, static=None, strong=None, remove=None, sealed=None, mutable=None, gettable=None, inline=None, settable=None, id=None, templateparamlist=None, type_=None, definition=None, argsstring=None, name=None, read=None, write=None, bitfield=None, reimplements=None, reimplementedby=None, param=None, enumvalue=None, initializer=None, exceptions=None, briefdescription=None, detaileddescription=None, inbodydescription=None, location=None, references=None, referencedby=None, refqual=None):
self.initonly = initonly
self.kind = kind
self.volatile = volatile
self.const = const
self.raisexx = raisexx
self.virt = virt
self.readable = readable
self.prot = prot
self.explicit = explicit
self.new = new
self.final = final
self.writable = writable
self.add = add
self.static = static
self.strong = strong
self.remove = remove
self.sealed = sealed
self.mutable = mutable
self.gettable = gettable
self.inline = inline
self.settable = settable
self.id = id
self.templateparamlist = templateparamlist
self.type_ = type_
self.definition = definition
self.argsstring = argsstring
self.name = name
self.read = read
self.write = write
self.bitfield = bitfield
if reimplements is None:
self.reimplements = []
else:
self.reimplements = reimplements
if reimplementedby is None:
self.reimplementedby = []
else:
self.reimplementedby = reimplementedby
if param is None:
self.param = []
else:
self.param = param
if enumvalue is None:
self.enumvalue = []
else:
self.enumvalue = enumvalue
self.initializer = initializer
self.exceptions = exceptions
self.briefdescription = briefdescription
self.detaileddescription = detaileddescription
self.inbodydescription = inbodydescription
self.location = location
if references is None:
self.references = []
else:
self.references = references
if referencedby is None:
self.referencedby = []
else:
self.referencedby = referencedby
self.refqual = refqual
def factory(*args_, **kwargs_):
if memberdefType.subclass:
return memberdefType.subclass(*args_, **kwargs_)
else:
return memberdefType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_templateparamlist(self): return self.templateparamlist
def set_templateparamlist(self, templateparamlist): self.templateparamlist = templateparamlist
def get_type(self): return self.type_
def set_type(self, type_): self.type_ = type_
def get_definition(self): return self.definition
def set_definition(self, definition): self.definition = definition
def get_argsstring(self): return self.argsstring
def set_argsstring(self, argsstring): self.argsstring = argsstring
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_read(self): return self.read
def set_read(self, read): self.read = read
def get_write(self): return self.write
def set_write(self, write): self.write = write
def get_bitfield(self): return self.bitfield
def set_bitfield(self, bitfield): self.bitfield = bitfield
def get_reimplements(self): return self.reimplements
def set_reimplements(self, reimplements): self.reimplements = reimplements
def add_reimplements(self, value): self.reimplements.append(value)
def insert_reimplements(self, index, value): self.reimplements.insert(index, value)
def get_reimplementedby(self): return self.reimplementedby
def set_reimplementedby(self, reimplementedby): self.reimplementedby = reimplementedby
def add_reimplementedby(self, value): self.reimplementedby.append(value)
def insert_reimplementedby(self, index, value): self.reimplementedby.insert(index, value)
def get_param(self): return self.param
def set_param(self, param): self.param = param
def add_param(self, value): self.param.append(value)
def insert_param(self, index, value): self.param.insert(index, value)
def get_enumvalue(self): return self.enumvalue
def set_enumvalue(self, enumvalue): self.enumvalue = enumvalue
def add_enumvalue(self, value): self.enumvalue.append(value)
def insert_enumvalue(self, index, value): self.enumvalue.insert(index, value)
def get_initializer(self): return self.initializer
def set_initializer(self, initializer): self.initializer = initializer
def get_exceptions(self): return self.exceptions
def set_exceptions(self, exceptions): self.exceptions = exceptions
def get_briefdescription(self): return self.briefdescription
def set_briefdescription(self, briefdescription): self.briefdescription = briefdescription
def get_detaileddescription(self): return self.detaileddescription
def set_detaileddescription(self, detaileddescription): self.detaileddescription = detaileddescription
def get_inbodydescription(self): return self.inbodydescription
def set_inbodydescription(self, inbodydescription): self.inbodydescription = inbodydescription
def get_location(self): return self.location
def set_location(self, location): self.location = location
def get_references(self): return self.references
def set_references(self, references): self.references = references
def add_references(self, value): self.references.append(value)
def insert_references(self, index, value): self.references.insert(index, value)
def get_referencedby(self): return self.referencedby
def set_referencedby(self, referencedby): self.referencedby = referencedby
def add_referencedby(self, value): self.referencedby.append(value)
def insert_referencedby(self, index, value): self.referencedby.insert(index, value)
def get_initonly(self): return self.initonly
def set_initonly(self, initonly): self.initonly = initonly
def get_kind(self): return self.kind
def set_kind(self, kind): self.kind = kind
def get_volatile(self): return self.volatile
def set_volatile(self, volatile): self.volatile = volatile
def get_const(self): return self.const
def set_const(self, const): self.const = const
def get_raise(self): return self.raisexx
def set_raise(self, raisexx): self.raisexx = raisexx
def get_virt(self): return self.virt
def set_virt(self, virt): self.virt = virt
def get_readable(self): return self.readable
def set_readable(self, readable): self.readable = readable
def get_prot(self): return self.prot
def set_prot(self, prot): self.prot = prot
def get_explicit(self): return self.explicit
def set_explicit(self, explicit): self.explicit = explicit
def get_new(self): return self.new
def set_new(self, new): self.new = new
def get_final(self): return self.final
def set_final(self, final): self.final = final
def get_writable(self): return self.writable
def set_writable(self, writable): self.writable = writable
def get_add(self): return self.add
def set_add(self, add): self.add = add
def get_static(self): return self.static
def set_static(self, static): self.static = static
def get_strong(self): return self.strong
def set_strong(self, strong): self.strong = strong
def get_remove(self): return self.remove
def set_remove(self, remove): self.remove = remove
def get_sealed(self): return self.sealed
def set_sealed(self, sealed): self.sealed = sealed
def get_mutable(self): return self.mutable
def set_mutable(self, mutable): self.mutable = mutable
def get_gettable(self): return self.gettable
def set_gettable(self, gettable): self.gettable = gettable
def get_inline(self): return self.inline
def set_inline(self, inline): self.inline = inline
def get_settable(self): return self.settable
def set_settable(self, settable): self.settable = settable
def get_id(self): return self.id
def set_id(self, id): self.id = id
def get_refqual(self): return self.refqual
def set_refqual(self, refqual): self.refqual = refqual
def hasContent_(self):
if (
self.templateparamlist is not None or
self.type_ is not None or
self.definition is not None or
self.argsstring is not None or
self.name is not None or
self.read is not None or
self.write is not None or
self.bitfield is not None or
self.reimplements is not None or
self.reimplementedby is not None or
self.param is not None or
self.enumvalue is not None or
self.initializer is not None or
self.exceptions is not None or
self.briefdescription is not None or
self.detaileddescription is not None or
self.inbodydescription is not None or
self.location is not None or
self.references is not None or
self.referencedby is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('initonly'):
self.initonly = attrs.get('initonly').value
if attrs.get('kind'):
self.kind = attrs.get('kind').value
if attrs.get('volatile'):
self.volatile = attrs.get('volatile').value
if attrs.get('const'):
self.const = attrs.get('const').value
if attrs.get('raise'):
self.raisexx = attrs.get('raise').value
if attrs.get('virt'):
self.virt = attrs.get('virt').value
if attrs.get('readable'):
self.readable = attrs.get('readable').value
if attrs.get('prot'):
self.prot = attrs.get('prot').value
if attrs.get('explicit'):
self.explicit = attrs.get('explicit').value
if attrs.get('new'):
self.new = attrs.get('new').value
if attrs.get('final'):
self.final = attrs.get('final').value
if attrs.get('writable'):
self.writable = attrs.get('writable').value
if attrs.get('add'):
self.add = attrs.get('add').value
if attrs.get('static'):
self.static = attrs.get('static').value
if attrs.get('strong'):
self.strong = attrs.get('strong').value
if attrs.get('remove'):
self.remove = attrs.get('remove').value
if attrs.get('sealed'):
self.sealed = attrs.get('sealed').value
if attrs.get('mutable'):
self.mutable = attrs.get('mutable').value
if attrs.get('gettable'):
self.gettable = attrs.get('gettable').value
if attrs.get('inline'):
self.inline = attrs.get('inline').value
if attrs.get('settable'):
self.settable = attrs.get('settable').value
if attrs.get('id'):
self.id = attrs.get('id').value
if attrs.get('refqual'):
self.refqual = attrs.get('refqual').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'templateparamlist':
obj_ = templateparamlistType.factory()
obj_.build(child_)
self.set_templateparamlist(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'type':
obj_ = linkedTextType.factory()
obj_.build(child_)
self.set_type(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'definition':
definition_ = ''
for text__content_ in child_.childNodes:
definition_ += text__content_.nodeValue
self.definition = definition_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'argsstring':
argsstring_ = ''
for text__content_ in child_.childNodes:
argsstring_ += text__content_.nodeValue
self.argsstring = argsstring_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'name':
name_ = ''
for text__content_ in child_.childNodes:
name_ += text__content_.nodeValue
self.name = name_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'read':
read_ = ''
for text__content_ in child_.childNodes:
read_ += text__content_.nodeValue
self.read = read_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'write':
write_ = ''
for text__content_ in child_.childNodes:
write_ += text__content_.nodeValue
self.write = write_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'bitfield':
bitfield_ = ''
for text__content_ in child_.childNodes:
bitfield_ += text__content_.nodeValue
self.bitfield = bitfield_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'reimplements':
obj_ = reimplementType.factory()
obj_.build(child_)
self.reimplements.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'reimplementedby':
obj_ = reimplementType.factory()
obj_.build(child_)
self.reimplementedby.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'param':
obj_ = paramType.factory()
obj_.build(child_)
self.param.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'enumvalue':
obj_ = enumvalueType.factory()
obj_.build(child_)
self.enumvalue.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'initializer':
obj_ = linkedTextType.factory()
obj_.build(child_)
self.set_initializer(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'exceptions':
obj_ = linkedTextType.factory()
obj_.build(child_)
self.set_exceptions(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'briefdescription':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_briefdescription(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'detaileddescription':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_detaileddescription(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'inbodydescription':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_inbodydescription(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'location':
obj_ = locationType.factory()
obj_.build(child_)
self.set_location(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'references':
obj_ = referenceType.factory()
obj_.build(child_)
self.references.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'referencedby':
obj_ = referenceType.factory()
obj_.build(child_)
self.referencedby.append(obj_)
# end class memberdefType
class definition(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if definition.subclass:
return definition.subclass(*args_, **kwargs_)
else:
return definition(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class definition
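# Usage sketch (hypothetical example, not part of the generated API): the
# simple-content classes such as `definition` collect the text children of a
# minidom element into valueOf_:
#
#     from xml.dom import minidom
#     dom = minidom.parseString('<definition>int x</definition>')
#     d = definition.factory()
#     d.build(dom.documentElement)
#     # d.getValueOf_() == 'int x'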
class argsstring(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if argsstring.subclass:
return argsstring.subclass(*args_, **kwargs_)
else:
return argsstring(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='argsstring', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='argsstring')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
    # export() above calls exportAttributes()/exportChildren(); define them
    # here (mirroring the CDATA placeholder handling used by edgelabel and
    # linkType) so exporting an argsstring does not raise AttributeError.
    def exportAttributes(self, outfile, level, namespace_='', name_='argsstring'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='argsstring'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class argsstring
class read(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if read.subclass:
return read.subclass(*args_, **kwargs_)
else:
return read(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class read
class write(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if write.subclass:
return write.subclass(*args_, **kwargs_)
else:
return write(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class write
class bitfield(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if bitfield.subclass:
return bitfield.subclass(*args_, **kwargs_)
else:
return bitfield(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class bitfield
class descriptionType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, title=None, para=None, sect1=None, internal=None, mixedclass_=None, content_=None):
        # Assign the element fields up front; hasContent_() and the get_*
        # accessors below read them, so leaving them unset would raise
        # AttributeError on a freshly constructed instance.
        self.title = title
        self.para = para
        self.sect1 = sect1
        self.internal = internal
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if descriptionType.subclass:
return descriptionType.subclass(*args_, **kwargs_)
else:
return descriptionType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_title(self): return self.title
def set_title(self, title): self.title = title
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_sect1(self): return self.sect1
def set_sect1(self, sect1): self.sect1 = sect1
def add_sect1(self, value): self.sect1.append(value)
def insert_sect1(self, index, value): self.sect1[index] = value
def get_internal(self): return self.internal
def set_internal(self, internal): self.internal = internal
def hasContent_(self):
if (
self.title is not None or
self.para is not None or
self.sect1 is not None or
self.internal is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'title':
childobj_ = docTitleType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'title', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'sect1':
childobj_ = docSect1Type.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'sect1', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'internal':
childobj_ = docInternalType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'internal', childobj_)
self.content_.append(obj_)
# end class descriptionType
class enumvalueType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, prot=None, id=None, name=None, initializer=None, briefdescription=None, detaileddescription=None, mixedclass_=None, content_=None):
        self.prot = prot
        self.id = id
        # Assign the element fields; hasContent_() and the get_* accessors
        # read them, so leaving them unset would raise AttributeError.
        self.name = name
        self.initializer = initializer
        self.briefdescription = briefdescription
        self.detaileddescription = detaileddescription
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if enumvalueType.subclass:
return enumvalueType.subclass(*args_, **kwargs_)
else:
return enumvalueType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_initializer(self): return self.initializer
def set_initializer(self, initializer): self.initializer = initializer
def get_briefdescription(self): return self.briefdescription
def set_briefdescription(self, briefdescription): self.briefdescription = briefdescription
def get_detaileddescription(self): return self.detaileddescription
def set_detaileddescription(self, detaileddescription): self.detaileddescription = detaileddescription
def get_prot(self): return self.prot
def set_prot(self, prot): self.prot = prot
def get_id(self): return self.id
def set_id(self, id): self.id = id
def hasContent_(self):
if (
self.name is not None or
self.initializer is not None or
self.briefdescription is not None or
self.detaileddescription is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('prot'):
self.prot = attrs.get('prot').value
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'name':
value_ = []
for text_ in child_.childNodes:
value_.append(text_.nodeValue)
valuestr_ = ''.join(value_)
obj_ = self.mixedclass_(MixedContainer.CategorySimple,
MixedContainer.TypeString, 'name', valuestr_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'initializer':
childobj_ = linkedTextType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'initializer', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'briefdescription':
childobj_ = descriptionType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'briefdescription', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'detaileddescription':
childobj_ = descriptionType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'detaileddescription', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class enumvalueType
class templateparamlistType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, param=None):
if param is None:
self.param = []
else:
self.param = param
def factory(*args_, **kwargs_):
if templateparamlistType.subclass:
return templateparamlistType.subclass(*args_, **kwargs_)
else:
return templateparamlistType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_param(self): return self.param
def set_param(self, param): self.param = param
def add_param(self, value): self.param.append(value)
def insert_param(self, index, value): self.param[index] = value
def hasContent_(self):
if (
self.param is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'param':
obj_ = paramType.factory()
obj_.build(child_)
self.param.append(obj_)
# end class templateparamlistType
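# Usage sketch (hypothetical example): templateparamlistType builds one
# paramType per <param> child of a minidom element:
#
#     from xml.dom import minidom
#     dom = minidom.parseString(
#         '<templateparamlist><param><declname>T</declname></param>'
#         '</templateparamlist>')
#     tpl = templateparamlistType.factory()
#     tpl.build(dom.documentElement)
#     # tpl.get_param()[0].get_declname() == 'T'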
class paramType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, type_=None, declname=None, defname=None, array=None, defval=None, briefdescription=None):
self.type_ = type_
self.declname = declname
self.defname = defname
self.array = array
self.defval = defval
self.briefdescription = briefdescription
def factory(*args_, **kwargs_):
if paramType.subclass:
return paramType.subclass(*args_, **kwargs_)
else:
return paramType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_type(self): return self.type_
def set_type(self, type_): self.type_ = type_
def get_declname(self): return self.declname
def set_declname(self, declname): self.declname = declname
def get_defname(self): return self.defname
def set_defname(self, defname): self.defname = defname
def get_array(self): return self.array
def set_array(self, array): self.array = array
def get_defval(self): return self.defval
def set_defval(self, defval): self.defval = defval
def get_briefdescription(self): return self.briefdescription
def set_briefdescription(self, briefdescription): self.briefdescription = briefdescription
def hasContent_(self):
if (
self.type_ is not None or
self.declname is not None or
self.defname is not None or
self.array is not None or
self.defval is not None or
self.briefdescription is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'type':
obj_ = linkedTextType.factory()
obj_.build(child_)
self.set_type(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'declname':
declname_ = ''
for text__content_ in child_.childNodes:
declname_ += text__content_.nodeValue
self.declname = declname_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'defname':
defname_ = ''
for text__content_ in child_.childNodes:
defname_ += text__content_.nodeValue
self.defname = defname_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'array':
array_ = ''
for text__content_ in child_.childNodes:
array_ += text__content_.nodeValue
self.array = array_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'defval':
obj_ = linkedTextType.factory()
obj_.build(child_)
self.set_defval(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'briefdescription':
obj_ = descriptionType.factory()
obj_.build(child_)
self.set_briefdescription(obj_)
# end class paramType
class declname(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if declname.subclass:
return declname.subclass(*args_, **kwargs_)
else:
return declname(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class declname
class defname(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if defname.subclass:
return defname.subclass(*args_, **kwargs_)
else:
return defname(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class defname
class array(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if array.subclass:
return array.subclass(*args_, **kwargs_)
else:
return array(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class array
class linkedTextType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, ref=None, mixedclass_=None, content_=None):
        # Assign ref (list-valued: see add_ref/insert_ref below); leaving it
        # unset would make hasContent_() raise AttributeError.
        if ref is None:
            self.ref = []
        else:
            self.ref = ref
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if linkedTextType.subclass:
return linkedTextType.subclass(*args_, **kwargs_)
else:
return linkedTextType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ref(self): return self.ref
def set_ref(self, ref): self.ref = ref
def add_ref(self, value): self.ref.append(value)
def insert_ref(self, index, value): self.ref[index] = value
def hasContent_(self):
if (
self.ref is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'ref':
childobj_ = docRefTextType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'ref', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class linkedTextType
class graphType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, node=None):
if node is None:
self.node = []
else:
self.node = node
def factory(*args_, **kwargs_):
if graphType.subclass:
return graphType.subclass(*args_, **kwargs_)
else:
return graphType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_node(self): return self.node
def set_node(self, node): self.node = node
def add_node(self, value): self.node.append(value)
def insert_node(self, index, value): self.node[index] = value
def hasContent_(self):
if (
self.node is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'node':
obj_ = nodeType.factory()
obj_.build(child_)
self.node.append(obj_)
# end class graphType
class nodeType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, id=None, label=None, link=None, childnode=None):
self.id = id
self.label = label
self.link = link
if childnode is None:
self.childnode = []
else:
self.childnode = childnode
def factory(*args_, **kwargs_):
if nodeType.subclass:
return nodeType.subclass(*args_, **kwargs_)
else:
return nodeType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_label(self): return self.label
def set_label(self, label): self.label = label
def get_link(self): return self.link
def set_link(self, link): self.link = link
def get_childnode(self): return self.childnode
def set_childnode(self, childnode): self.childnode = childnode
def add_childnode(self, value): self.childnode.append(value)
def insert_childnode(self, index, value): self.childnode[index] = value
def get_id(self): return self.id
def set_id(self, id): self.id = id
def hasContent_(self):
if (
self.label is not None or
self.link is not None or
self.childnode is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'label':
label_ = ''
for text__content_ in child_.childNodes:
label_ += text__content_.nodeValue
self.label = label_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'link':
obj_ = linkType.factory()
obj_.build(child_)
self.set_link(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'childnode':
obj_ = childnodeType.factory()
obj_.build(child_)
self.childnode.append(obj_)
# end class nodeType
class label(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if label.subclass:
return label.subclass(*args_, **kwargs_)
else:
return label(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class label
class childnodeType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, relation=None, refid=None, edgelabel=None):
self.relation = relation
self.refid = refid
if edgelabel is None:
self.edgelabel = []
else:
self.edgelabel = edgelabel
def factory(*args_, **kwargs_):
if childnodeType.subclass:
return childnodeType.subclass(*args_, **kwargs_)
else:
return childnodeType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_edgelabel(self): return self.edgelabel
def set_edgelabel(self, edgelabel): self.edgelabel = edgelabel
def add_edgelabel(self, value): self.edgelabel.append(value)
def insert_edgelabel(self, index, value): self.edgelabel[index] = value
def get_relation(self): return self.relation
def set_relation(self, relation): self.relation = relation
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def export(self, outfile, level, namespace_='', name_='childnodeType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='childnodeType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='childnodeType'):
if self.relation is not None:
outfile.write(' relation=%s' % (quote_attrib(self.relation), ))
if self.refid is not None:
outfile.write(' refid=%s' % (self.format_string(quote_attrib(self.refid).encode(ExternalEncoding), input_name='refid'), ))
def exportChildren(self, outfile, level, namespace_='', name_='childnodeType'):
for edgelabel_ in self.edgelabel:
showIndent(outfile, level)
outfile.write('<%sedgelabel>%s</%sedgelabel>\n' % (namespace_, self.format_string(quote_xml(edgelabel_).encode(ExternalEncoding), input_name='edgelabel'), namespace_))
def hasContent_(self):
if (
self.edgelabel is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('relation'):
self.relation = attrs.get('relation').value
if attrs.get('refid'):
self.refid = attrs.get('refid').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'edgelabel':
edgelabel_ = ''
for text__content_ in child_.childNodes:
edgelabel_ += text__content_.nodeValue
self.edgelabel.append(edgelabel_)
# end class childnodeType
class edgelabel(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if edgelabel.subclass:
return edgelabel.subclass(*args_, **kwargs_)
else:
return edgelabel(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='edgelabel', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='edgelabel')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='edgelabel'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='edgelabel'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class edgelabel
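# CDATA round-trip sketch (hypothetical example): buildChildren() stores a
# CDATA section under the '![CDATA[...]]' placeholder, which exportChildren()
# rewrites back to real '<![CDATA[...]]>' markup on output:
#
#     from xml.dom import minidom
#     dom = minidom.parseString('<edgelabel><![CDATA[a < b]]></edgelabel>')
#     lbl = edgelabel.factory()
#     lbl.build(dom.documentElement)
#     # lbl.getValueOf_() == '![CDATA[a < b]]'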
class linkType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, refid=None, external=None, valueOf_=''):
        self.refid = refid
        self.external = external
        self.valueOf_ = valueOf_
    def factory(*args_, **kwargs_):
        if linkType.subclass:
            return linkType.subclass(*args_, **kwargs_)
        else:
            return linkType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_refid(self): return self.refid
    def set_refid(self, refid): self.refid = refid
    def get_external(self): return self.external
    def set_external(self, external): self.external = external
    def getValueOf_(self): return self.valueOf_
    def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='', name_='linkType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='linkType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='linkType'):
        if self.refid is not None:
            outfile.write(' refid=%s' % (self.format_string(quote_attrib(self.refid).encode(ExternalEncoding), input_name='refid'), ))
        if self.external is not None:
            outfile.write(' external=%s' % (self.format_string(quote_attrib(self.external).encode(ExternalEncoding), input_name='external'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='linkType'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        if (
            self.valueOf_ is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('refid'):
            self.refid = attrs.get('refid').value
        if attrs.get('external'):
            self.external = attrs.get('external').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class linkType
class listingType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, codeline=None):
        if codeline is None:
            self.codeline = []
        else:
            self.codeline = codeline
    def factory(*args_, **kwargs_):
        if listingType.subclass:
            return listingType.subclass(*args_, **kwargs_)
        else:
            return listingType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_codeline(self): return self.codeline
    def set_codeline(self, codeline): self.codeline = codeline
    def add_codeline(self, value): self.codeline.append(value)
    def insert_codeline(self, index, value): self.codeline[index] = value
    def export(self, outfile, level, namespace_='', name_='listingType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='listingType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='listingType'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='listingType'):
        for codeline_ in self.codeline:
            codeline_.export(outfile, level, namespace_, name_='codeline')
    def hasContent_(self):
        if (
            self.codeline is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        pass
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'codeline':
            obj_ = codelineType.factory()
            obj_.build(child_)
            self.codeline.append(obj_)
# end class listingType
class codelineType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, external=None, lineno=None, refkind=None, refid=None, highlight=None):
        self.external = external
        self.lineno = lineno
        self.refkind = refkind
        self.refid = refid
        if highlight is None:
            self.highlight = []
        else:
            self.highlight = highlight
    def factory(*args_, **kwargs_):
        if codelineType.subclass:
            return codelineType.subclass(*args_, **kwargs_)
        else:
            return codelineType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_highlight(self): return self.highlight
    def set_highlight(self, highlight): self.highlight = highlight
    def add_highlight(self, value): self.highlight.append(value)
    def insert_highlight(self, index, value): self.highlight[index] = value
    def get_external(self): return self.external
    def set_external(self, external): self.external = external
    def get_lineno(self): return self.lineno
    def set_lineno(self, lineno): self.lineno = lineno
    def get_refkind(self): return self.refkind
    def set_refkind(self, refkind): self.refkind = refkind
    def get_refid(self): return self.refid
    def set_refid(self, refid): self.refid = refid
    def export(self, outfile, level, namespace_='', name_='codelineType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='codelineType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='codelineType'):
        if self.external is not None:
            outfile.write(' external=%s' % (quote_attrib(self.external), ))
        if self.lineno is not None:
            outfile.write(' lineno="%s"' % self.format_integer(self.lineno, input_name='lineno'))
        if self.refkind is not None:
            outfile.write(' refkind=%s' % (quote_attrib(self.refkind), ))
        if self.refid is not None:
            outfile.write(' refid=%s' % (self.format_string(quote_attrib(self.refid).encode(ExternalEncoding), input_name='refid'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='codelineType'):
        for highlight_ in self.highlight:
            highlight_.export(outfile, level, namespace_, name_='highlight')
    def hasContent_(self):
        if (
            self.highlight is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('external'):
            self.external = attrs.get('external').value
        if attrs.get('lineno'):
            try:
                self.lineno = int(attrs.get('lineno').value)
            except ValueError as exp:
                raise ValueError('Bad integer attribute (lineno): %s' % exp)
        if attrs.get('refkind'):
            self.refkind = attrs.get('refkind').value
        if attrs.get('refid'):
            self.refid = attrs.get('refid').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'highlight':
            obj_ = highlightType.factory()
            obj_.build(child_)
            self.highlight.append(obj_)
# end class codelineType
class highlightType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, classxx=None, sp=None, ref=None, mixedclass_=None, content_=None):
        self.classxx = classxx
        # Default sp and ref to empty lists so the add_/insert_ helpers and
        # hasContent_ below work on instances that were never build()-populated.
        if sp is None:
            self.sp = []
        else:
            self.sp = sp
        if ref is None:
            self.ref = []
        else:
            self.ref = ref
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
    def factory(*args_, **kwargs_):
        if highlightType.subclass:
            return highlightType.subclass(*args_, **kwargs_)
        else:
            return highlightType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_sp(self): return self.sp
    def set_sp(self, sp): self.sp = sp
    def add_sp(self, value): self.sp.append(value)
    def insert_sp(self, index, value): self.sp[index] = value
    def get_ref(self): return self.ref
    def set_ref(self, ref): self.ref = ref
    def add_ref(self, value): self.ref.append(value)
    def insert_ref(self, index, value): self.ref[index] = value
    def get_class(self): return self.classxx
    def set_class(self, classxx): self.classxx = classxx
    def export(self, outfile, level, namespace_='', name_='highlightType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='highlightType')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, namespace_='', name_='highlightType'):
        if self.classxx is not None:
            outfile.write(' class=%s' % (quote_attrib(self.classxx), ))
    def exportChildren(self, outfile, level, namespace_='', name_='highlightType'):
        for item_ in self.content_:
            item_.export(outfile, level, item_.name, namespace_)
    def hasContent_(self):
        if (
            self.sp is not None or
            self.ref is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('class'):
            self.classxx = attrs.get('class').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'sp':
            value_ = []
            for text_ in child_.childNodes:
                value_.append(text_.nodeValue)
            # We make this unicode so that our unicode renderer catch-all picks it up
            # otherwise it would go through as 'str' and we'd have to pick it up too
            valuestr_ = u' '
            obj_ = self.mixedclass_(MixedContainer.CategorySimple,
                MixedContainer.TypeString, 'sp', valuestr_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'ref':
            childobj_ = docRefTextType.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'ref', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
# end class highlightType
class sp(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, valueOf_=''):
        self.valueOf_ = valueOf_
    def factory(*args_, **kwargs_):
        if sp.subclass:
            return sp.subclass(*args_, **kwargs_)
        else:
            return sp(*args_, **kwargs_)
    factory = staticmethod(factory)
    def getValueOf_(self): return self.valueOf_
    def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='', name_='sp', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='sp')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='sp'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='sp'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        if (
            self.valueOf_ is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        pass
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class sp
class referenceType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, endline=None, startline=None, refid=None, compoundref=None, valueOf_='', mixedclass_=None, content_=None):
        self.endline = endline
        self.startline = startline
        self.refid = refid
        self.compoundref = compoundref
        # Store valueOf_; exportChildren and hasContent_ read it even when
        # build() has not been called on this instance.
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
    def factory(*args_, **kwargs_):
        if referenceType.subclass:
            return referenceType.subclass(*args_, **kwargs_)
        else:
            return referenceType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_endline(self): return self.endline
    def set_endline(self, endline): self.endline = endline
    def get_startline(self): return self.startline
    def set_startline(self, startline): self.startline = startline
    def get_refid(self): return self.refid
    def set_refid(self, refid): self.refid = refid
    def get_compoundref(self): return self.compoundref
    def set_compoundref(self, compoundref): self.compoundref = compoundref
    def getValueOf_(self): return self.valueOf_
    def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='', name_='referenceType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='referenceType')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, namespace_='', name_='referenceType'):
        if self.endline is not None:
            outfile.write(' endline="%s"' % self.format_integer(self.endline, input_name='endline'))
        if self.startline is not None:
            outfile.write(' startline="%s"' % self.format_integer(self.startline, input_name='startline'))
        if self.refid is not None:
            outfile.write(' refid=%s' % (self.format_string(quote_attrib(self.refid).encode(ExternalEncoding), input_name='refid'), ))
        if self.compoundref is not None:
            outfile.write(' compoundref=%s' % (self.format_string(quote_attrib(self.compoundref).encode(ExternalEncoding), input_name='compoundref'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='referenceType'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        if (
            self.valueOf_ is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('endline'):
            try:
                self.endline = int(attrs.get('endline').value)
            except ValueError as exp:
                raise ValueError('Bad integer attribute (endline): %s' % exp)
        if attrs.get('startline'):
            try:
                self.startline = int(attrs.get('startline').value)
            except ValueError as exp:
                raise ValueError('Bad integer attribute (startline): %s' % exp)
        if attrs.get('refid'):
            self.refid = attrs.get('refid').value
        if attrs.get('compoundref'):
            self.compoundref = attrs.get('compoundref').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
        if child_.nodeType == Node.TEXT_NODE:
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class referenceType
class locationType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, bodystart=None, line=None, bodyend=None, bodyfile=None, file=None, valueOf_=''):
        self.bodystart = bodystart
        self.line = line
        self.bodyend = bodyend
        self.bodyfile = bodyfile
        self.file = file
        self.valueOf_ = valueOf_
    def factory(*args_, **kwargs_):
        if locationType.subclass:
            return locationType.subclass(*args_, **kwargs_)
        else:
            return locationType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_bodystart(self): return self.bodystart
    def set_bodystart(self, bodystart): self.bodystart = bodystart
    def get_line(self): return self.line
    def set_line(self, line): self.line = line
    def get_bodyend(self): return self.bodyend
    def set_bodyend(self, bodyend): self.bodyend = bodyend
    def get_bodyfile(self): return self.bodyfile
    def set_bodyfile(self, bodyfile): self.bodyfile = bodyfile
    def get_file(self): return self.file
    def set_file(self, file): self.file = file
    def getValueOf_(self): return self.valueOf_
    def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='', name_='locationType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='locationType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='locationType'):
        if self.bodystart is not None:
            outfile.write(' bodystart="%s"' % self.format_integer(self.bodystart, input_name='bodystart'))
        if self.line is not None:
            outfile.write(' line="%s"' % self.format_integer(self.line, input_name='line'))
        if self.bodyend is not None:
            outfile.write(' bodyend="%s"' % self.format_integer(self.bodyend, input_name='bodyend'))
        if self.bodyfile is not None:
            outfile.write(' bodyfile=%s' % (self.format_string(quote_attrib(self.bodyfile).encode(ExternalEncoding), input_name='bodyfile'), ))
        if self.file is not None:
            outfile.write(' file=%s' % (self.format_string(quote_attrib(self.file).encode(ExternalEncoding), input_name='file'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='locationType'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        if (
            self.valueOf_ is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('bodystart'):
            try:
                self.bodystart = int(attrs.get('bodystart').value)
            except ValueError as exp:
                raise ValueError('Bad integer attribute (bodystart): %s' % exp)
        if attrs.get('line'):
            try:
                self.line = int(attrs.get('line').value)
            except ValueError as exp:
                raise ValueError('Bad integer attribute (line): %s' % exp)
        if attrs.get('bodyend'):
            try:
                self.bodyend = int(attrs.get('bodyend').value)
            except ValueError as exp:
                raise ValueError('Bad integer attribute (bodyend): %s' % exp)
        if attrs.get('bodyfile'):
            self.bodyfile = attrs.get('bodyfile').value
        if attrs.get('file'):
            self.file = attrs.get('file').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class locationType
class docSect1Type(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, id=None, title=None, para=None, sect2=None, internal=None, mixedclass_=None, content_=None):
        self.id = id
        # Store para, sect2 and internal so the accessors and hasContent_
        # below do not raise AttributeError.
        if para is None:
            self.para = []
        else:
            self.para = para
        if sect2 is None:
            self.sect2 = []
        else:
            self.sect2 = sect2
        self.internal = internal
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
        if title is None:
            self.title = ""
        else:
            self.title = title
    def factory(*args_, **kwargs_):
        if docSect1Type.subclass:
            return docSect1Type.subclass(*args_, **kwargs_)
        else:
            return docSect1Type(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_title(self): return self.title
    def set_title(self, title): self.title = title
    def get_para(self): return self.para
    def set_para(self, para): self.para = para
    def add_para(self, value): self.para.append(value)
    def insert_para(self, index, value): self.para[index] = value
    def get_sect2(self): return self.sect2
    def set_sect2(self, sect2): self.sect2 = sect2
    def add_sect2(self, value): self.sect2.append(value)
    def insert_sect2(self, index, value): self.sect2[index] = value
    def get_internal(self): return self.internal
    def set_internal(self, internal): self.internal = internal
    def get_id(self): return self.id
    def set_id(self, id): self.id = id
    def export(self, outfile, level, namespace_='', name_='docSect1Type', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docSect1Type')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, namespace_='', name_='docSect1Type'):
        if self.id is not None:
            outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docSect1Type'):
        for item_ in self.content_:
            item_.export(outfile, level, item_.name, namespace_)
    def hasContent_(self):
        if (
            self.title is not None or
            self.para is not None or
            self.sect2 is not None or
            self.internal is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('id'):
            self.id = attrs.get('id').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'title':
            self.title = child_.childNodes[0].nodeValue
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'para':
            childobj_ = docParaType.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'para', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'sect2':
            childobj_ = docSect2Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'sect2', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'internal':
            childobj_ = docInternalS1Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'internal', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
# end class docSect1Type
class docSect2Type(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, id=None, title=None, para=None, sect3=None, internal=None, mixedclass_=None, content_=None):
        self.id = id
        # Store para, sect3 and internal so the accessors and hasContent_
        # below do not raise AttributeError.
        if para is None:
            self.para = []
        else:
            self.para = para
        if sect3 is None:
            self.sect3 = []
        else:
            self.sect3 = sect3
        self.internal = internal
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
        if title is None:
            self.title = ""
        else:
            self.title = title
    def factory(*args_, **kwargs_):
        if docSect2Type.subclass:
            return docSect2Type.subclass(*args_, **kwargs_)
        else:
            return docSect2Type(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_title(self): return self.title
    def set_title(self, title): self.title = title
    def get_para(self): return self.para
    def set_para(self, para): self.para = para
    def add_para(self, value): self.para.append(value)
    def insert_para(self, index, value): self.para[index] = value
    def get_sect3(self): return self.sect3
    def set_sect3(self, sect3): self.sect3 = sect3
    def add_sect3(self, value): self.sect3.append(value)
    def insert_sect3(self, index, value): self.sect3[index] = value
    def get_internal(self): return self.internal
    def set_internal(self, internal): self.internal = internal
    def get_id(self): return self.id
    def set_id(self, id): self.id = id
    def export(self, outfile, level, namespace_='', name_='docSect2Type', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docSect2Type')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, namespace_='', name_='docSect2Type'):
        if self.id is not None:
            outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docSect2Type'):
        for item_ in self.content_:
            item_.export(outfile, level, item_.name, namespace_)
    def hasContent_(self):
        if (
            self.title is not None or
            self.para is not None or
            self.sect3 is not None or
            self.internal is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('id'):
            self.id = attrs.get('id').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'title':
            self.title = child_.childNodes[0].nodeValue
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'para':
            childobj_ = docParaType.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'para', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'sect3':
            childobj_ = docSect3Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'sect3', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'internal':
            childobj_ = docInternalS2Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'internal', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
# end class docSect2Type
class docSect3Type(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, id=None, title=None, para=None, sect4=None, internal=None, mixedclass_=None, content_=None):
        self.id = id
        # Store para, sect4 and internal so the accessors and hasContent_
        # below do not raise AttributeError.
        if para is None:
            self.para = []
        else:
            self.para = para
        if sect4 is None:
            self.sect4 = []
        else:
            self.sect4 = sect4
        self.internal = internal
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
        if title is None:
            self.title = ""
        else:
            self.title = title
    def factory(*args_, **kwargs_):
        if docSect3Type.subclass:
            return docSect3Type.subclass(*args_, **kwargs_)
        else:
            return docSect3Type(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_title(self): return self.title
    def set_title(self, title): self.title = title
    def get_para(self): return self.para
    def set_para(self, para): self.para = para
    def add_para(self, value): self.para.append(value)
    def insert_para(self, index, value): self.para[index] = value
    def get_sect4(self): return self.sect4
    def set_sect4(self, sect4): self.sect4 = sect4
    def add_sect4(self, value): self.sect4.append(value)
    def insert_sect4(self, index, value): self.sect4[index] = value
    def get_internal(self): return self.internal
    def set_internal(self, internal): self.internal = internal
    def get_id(self): return self.id
    def set_id(self, id): self.id = id
    def export(self, outfile, level, namespace_='', name_='docSect3Type', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docSect3Type')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, namespace_='', name_='docSect3Type'):
        if self.id is not None:
            outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docSect3Type'):
        for item_ in self.content_:
            item_.export(outfile, level, item_.name, namespace_)
    def hasContent_(self):
        if (
            self.title is not None or
            self.para is not None or
            self.sect4 is not None or
            self.internal is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('id'):
            self.id = attrs.get('id').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'title':
            self.title = child_.childNodes[0].nodeValue
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'para':
            childobj_ = docParaType.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'para', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'sect4':
            childobj_ = docSect4Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'sect4', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'internal':
            childobj_ = docInternalS3Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'internal', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
# end class docSect3Type
class docSect4Type(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, id=None, title=None, para=None, internal=None, mixedclass_=None, content_=None):
self.id = id
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docSect4Type.subclass:
return docSect4Type.subclass(*args_, **kwargs_)
else:
return docSect4Type(*args_, **kwargs_)
factory = staticmethod(factory)
def get_title(self): return self.title
def set_title(self, title): self.title = title
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_internal(self): return self.internal
def set_internal(self, internal): self.internal = internal
def get_id(self): return self.id
def set_id(self, id): self.id = id
def export(self, outfile, level, namespace_='', name_='docSect4Type', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docSect4Type')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docSect4Type'):
if self.id is not None:
outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docSect4Type'):
for item_ in self.content_:
item_.export(outfile, level, item_.name, namespace_)
def hasContent_(self):
if (
self.title is not None or
self.para is not None or
self.internal is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'title':
childobj_ = docTitleType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'title', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'internal':
childobj_ = docInternalS4Type.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'internal', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class docSect4Type
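# Every generated class above exposes the same factory()/subclass hook:
# an application can install its own subclass and every factory() call
# made by the parser will instantiate it instead of the generated base.
# A minimal self-contained sketch of that pattern follows; Widget and
# FancyWidget are hypothetical stand-ins, not part of this module.

```python
class Widget(object):
    # Applications may point this at a subclass to override behavior.
    subclass = None

    def __init__(self, name=''):
        self.name = name

    @staticmethod
    def factory(*args_, **kwargs_):
        # Instantiate the registered subclass if one is installed,
        # otherwise fall back to the generated base class itself.
        if Widget.subclass:
            return Widget.subclass(*args_, **kwargs_)
        return Widget(*args_, **kwargs_)


class FancyWidget(Widget):
    def label(self):
        return 'fancy:' + self.name


# Register the subclass; every subsequent factory() call yields it.
Widget.subclass = FancyWidget
w = Widget.factory(name='x')
```

Because build() only ever constructs children through factory(), installing a subclass this way customizes an entire parse tree without editing the generated code.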
class docInternalType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, para=None, sect1=None, mixedclass_=None, content_=None):
        self.para = para
        self.sect1 = sect1
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docInternalType.subclass:
return docInternalType.subclass(*args_, **kwargs_)
else:
return docInternalType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_sect1(self): return self.sect1
def set_sect1(self, sect1): self.sect1 = sect1
def add_sect1(self, value): self.sect1.append(value)
def insert_sect1(self, index, value): self.sect1[index] = value
def export(self, outfile, level, namespace_='', name_='docInternalType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docInternalType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docInternalType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docInternalType'):
for item_ in self.content_:
item_.export(outfile, level, item_.name, namespace_)
def hasContent_(self):
if (
self.para is not None or
self.sect1 is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'sect1':
childobj_ = docSect1Type.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'sect1', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class docInternalType
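# The buildChildren() methods above implement a mixed-content walk:
# element children and the text nodes interleaved between them are
# collected in document order. A self-contained sketch of the same walk
# using xml.dom.minidom directly, with (kind, value) tuples standing in
# for the MixedContainer objects used here:

```python
from xml.dom.minidom import parseString, Node

doc = parseString('<internal>intro <para>body</para> outro</internal>')
root = doc.documentElement

content = []
for child in root.childNodes:
    # Strip any namespace prefix, as the generated code does.
    name = child.nodeName.split(':')[-1]
    if child.nodeType == Node.ELEMENT_NODE and name == 'para':
        # Gather the element's own text content.
        text = ''.join(t.nodeValue for t in child.childNodes
                       if t.nodeType == Node.TEXT_NODE)
        content.append(('para', text))
    elif child.nodeType == Node.TEXT_NODE:
        content.append(('text', child.nodeValue))
# content is now:
# [('text', 'intro '), ('para', 'body'), ('text', ' outro')]
```

Preserving the text/element interleaving in content_ is what lets export() reproduce mixed content in the original document order.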
class docInternalS1Type(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, para=None, sect2=None, mixedclass_=None, content_=None):
        self.para = para
        self.sect2 = sect2
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docInternalS1Type.subclass:
return docInternalS1Type.subclass(*args_, **kwargs_)
else:
return docInternalS1Type(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_sect2(self): return self.sect2
def set_sect2(self, sect2): self.sect2 = sect2
def add_sect2(self, value): self.sect2.append(value)
def insert_sect2(self, index, value): self.sect2[index] = value
def export(self, outfile, level, namespace_='', name_='docInternalS1Type', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docInternalS1Type')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docInternalS1Type'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docInternalS1Type'):
for item_ in self.content_:
item_.export(outfile, level, item_.name, namespace_)
def hasContent_(self):
if (
self.para is not None or
self.sect2 is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'sect2':
childobj_ = docSect2Type.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'sect2', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class docInternalS1Type
class docInternalS2Type(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, para=None, sect3=None, mixedclass_=None, content_=None):
        self.para = para
        self.sect3 = sect3
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docInternalS2Type.subclass:
return docInternalS2Type.subclass(*args_, **kwargs_)
else:
return docInternalS2Type(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_sect3(self): return self.sect3
def set_sect3(self, sect3): self.sect3 = sect3
def add_sect3(self, value): self.sect3.append(value)
def insert_sect3(self, index, value): self.sect3[index] = value
def export(self, outfile, level, namespace_='', name_='docInternalS2Type', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docInternalS2Type')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docInternalS2Type'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docInternalS2Type'):
for item_ in self.content_:
item_.export(outfile, level, item_.name, namespace_)
def hasContent_(self):
if (
self.para is not None or
self.sect3 is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'sect3':
childobj_ = docSect3Type.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'sect3', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class docInternalS2Type
class docInternalS3Type(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, para=None, sect4=None, mixedclass_=None, content_=None):
        self.para = para
        self.sect4 = sect4
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docInternalS3Type.subclass:
return docInternalS3Type.subclass(*args_, **kwargs_)
else:
return docInternalS3Type(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
    def get_sect4(self): return self.sect4
    def set_sect4(self, sect4): self.sect4 = sect4
    def add_sect4(self, value): self.sect4.append(value)
    def insert_sect4(self, index, value): self.sect4[index] = value
def export(self, outfile, level, namespace_='', name_='docInternalS3Type', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docInternalS3Type')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docInternalS3Type'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docInternalS3Type'):
for item_ in self.content_:
item_.export(outfile, level, item_.name, namespace_)
def hasContent_(self):
if (
self.para is not None or
            self.sect4 is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'sect4':
            childobj_ = docSect4Type.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'sect4', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class docInternalS3Type
class docInternalS4Type(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, para=None, mixedclass_=None, content_=None):
        self.para = para
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docInternalS4Type.subclass:
return docInternalS4Type.subclass(*args_, **kwargs_)
else:
return docInternalS4Type(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def export(self, outfile, level, namespace_='', name_='docInternalS4Type', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docInternalS4Type')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docInternalS4Type'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docInternalS4Type'):
for item_ in self.content_:
item_.export(outfile, level, item_.name, namespace_)
def hasContent_(self):
if (
self.para is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
childobj_ = docParaType.factory()
childobj_.build(child_)
obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
MixedContainer.TypeNone, 'para', childobj_)
self.content_.append(obj_)
elif child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
# end class docInternalS4Type
class docTitleType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, valueOf_='', mixedclass_=None, content_=None):
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docTitleType.subclass:
return docTitleType.subclass(*args_, **kwargs_)
else:
return docTitleType(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docTitleType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docTitleType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docTitleType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docTitleType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docTitleType
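# docTitleType (and the similar classes below) round-trip CDATA sections
# through a bare "![CDATA[...]]" marker: build() stores the marker in
# valueOf_, and exportChildren() re-expands it into real CDATA syntax
# after quoting the string. A self-contained sketch of that round trip,
# with xml.sax.saxutils.escape standing in for this module's quote_xml:

```python
from xml.sax.saxutils import escape


def store_cdata(node_value):
    # What build() appends to valueOf_ for a CDATA section node.
    return '![CDATA[' + node_value + ']]'


def export_value(value_of):
    # What exportChildren() writes for valueOf_.
    if value_of.find('![CDATA') > -1:
        value = escape(value_of)
        value = value.replace('![CDATA', '<![CDATA')
        value = value.replace(']]', ']]>')
        return value
    return escape(value_of)


stored = store_cdata('a < b')
```

Note that, as in the generated code, the whole string is XML-escaped before the marker is re-expanded, so characters inside the CDATA body are escaped too.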
class docParaType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, valueOf_='', mixedclass_=None, content_=None):
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docParaType.subclass:
return docParaType.subclass(*args_, **kwargs_)
else:
return docParaType(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docParaType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docParaType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docParaType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docParaType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docParaType
class docMarkupType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, valueOf_='', mixedclass_=None, content_=None):
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docMarkupType.subclass:
return docMarkupType.subclass(*args_, **kwargs_)
else:
return docMarkupType(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docMarkupType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docMarkupType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docMarkupType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docMarkupType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docMarkupType
class docURLLink(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, url=None, valueOf_='', mixedclass_=None, content_=None):
        self.url = url
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docURLLink.subclass:
return docURLLink.subclass(*args_, **kwargs_)
else:
return docURLLink(*args_, **kwargs_)
factory = staticmethod(factory)
def get_url(self): return self.url
def set_url(self, url): self.url = url
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docURLLink', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docURLLink')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docURLLink'):
if self.url is not None:
outfile.write(' url=%s' % (self.format_string(quote_attrib(self.url).encode(ExternalEncoding), input_name='url'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docURLLink'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('url'):
self.url = attrs.get('url').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docURLLink
class docAnchorType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, id=None, valueOf_='', mixedclass_=None, content_=None):
        self.id = id
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docAnchorType.subclass:
return docAnchorType.subclass(*args_, **kwargs_)
else:
return docAnchorType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_id(self): return self.id
def set_id(self, id): self.id = id
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docAnchorType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docAnchorType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docAnchorType'):
if self.id is not None:
outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docAnchorType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docAnchorType
class docFormulaType(GeneratedsSuper):
subclass = None
superclass = None
    def __init__(self, id=None, valueOf_='', mixedclass_=None, content_=None):
        self.id = id
        self.valueOf_ = valueOf_
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
def factory(*args_, **kwargs_):
if docFormulaType.subclass:
return docFormulaType.subclass(*args_, **kwargs_)
else:
return docFormulaType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_id(self): return self.id
def set_id(self, id): self.id = id
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docFormulaType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docFormulaType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docFormulaType'):
if self.id is not None:
outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docFormulaType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
def hasContent_(self):
if (
self.valueOf_ is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
obj_ = self.mixedclass_(MixedContainer.CategoryText,
MixedContainer.TypeNone, '', child_.nodeValue)
self.content_.append(obj_)
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docFormulaType
class docIndexEntryType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, primaryie=None, secondaryie=None):
self.primaryie = primaryie
self.secondaryie = secondaryie
def factory(*args_, **kwargs_):
if docIndexEntryType.subclass:
return docIndexEntryType.subclass(*args_, **kwargs_)
else:
return docIndexEntryType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_primaryie(self): return self.primaryie
def set_primaryie(self, primaryie): self.primaryie = primaryie
def get_secondaryie(self): return self.secondaryie
def set_secondaryie(self, secondaryie): self.secondaryie = secondaryie
def export(self, outfile, level, namespace_='', name_='docIndexEntryType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docIndexEntryType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docIndexEntryType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docIndexEntryType'):
if self.primaryie is not None:
showIndent(outfile, level)
outfile.write('<%sprimaryie>%s</%sprimaryie>\n' % (namespace_, self.format_string(quote_xml(self.primaryie).encode(ExternalEncoding), input_name='primaryie'), namespace_))
if self.secondaryie is not None:
showIndent(outfile, level)
outfile.write('<%ssecondaryie>%s</%ssecondaryie>\n' % (namespace_, self.format_string(quote_xml(self.secondaryie).encode(ExternalEncoding), input_name='secondaryie'), namespace_))
def hasContent_(self):
if (
self.primaryie is not None or
self.secondaryie is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'primaryie':
primaryie_ = ''
for text__content_ in child_.childNodes:
primaryie_ += text__content_.nodeValue
self.primaryie = primaryie_
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'secondaryie':
secondaryie_ = ''
for text__content_ in child_.childNodes:
secondaryie_ += text__content_.nodeValue
self.secondaryie = secondaryie_
# end class docIndexEntryType
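# Unlike the mixed-content classes, docIndexEntryType.buildChildren()
# simply concatenates the text content of its simple child elements
# (<primaryie>, <secondaryie>). A self-contained sketch of that
# extraction with xml.dom.minidom:

```python
from xml.dom.minidom import parseString, Node

doc = parseString(
    '<indexentry><primaryie>alpha</primaryie>'
    '<secondaryie>beta</secondaryie></indexentry>')

fields = {}
for child in doc.documentElement.childNodes:
    name = child.nodeName.split(':')[-1]
    if child.nodeType == Node.ELEMENT_NODE and \
            name in ('primaryie', 'secondaryie'):
        # Concatenate every text node under the element, as the
        # generated code does with text__content_.nodeValue.
        fields[name] = ''.join(t.nodeValue for t in child.childNodes)
# fields is now {'primaryie': 'alpha', 'secondaryie': 'beta'}
```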
class docListType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, listitem=None):
if listitem is None:
self.listitem = []
else:
self.listitem = listitem
def factory(*args_, **kwargs_):
if docListType.subclass:
return docListType.subclass(*args_, **kwargs_)
else:
return docListType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_listitem(self): return self.listitem
def set_listitem(self, listitem): self.listitem = listitem
def add_listitem(self, value): self.listitem.append(value)
def insert_listitem(self, index, value): self.listitem[index] = value
def export(self, outfile, level, namespace_='', name_='docListType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docListType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docListType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docListType'):
for listitem_ in self.listitem:
listitem_.export(outfile, level, namespace_, name_='listitem')
def hasContent_(self):
if (
self.listitem is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'listitem':
obj_ = docListItemType.factory()
obj_.build(child_)
self.listitem.append(obj_)
# end class docListType
class docListItemType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, para=None):
if para is None:
self.para = []
else:
self.para = para
def factory(*args_, **kwargs_):
if docListItemType.subclass:
return docListItemType.subclass(*args_, **kwargs_)
else:
return docListItemType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def export(self, outfile, level, namespace_='', name_='docListItemType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docListItemType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docListItemType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docListItemType'):
for para_ in self.para:
para_.export(outfile, level, namespace_, name_='para')
    def hasContent_(self):
        # para is a list; an empty list means no content to export.
        return bool(self.para)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
obj_ = docParaType.factory()
obj_.build(child_)
self.para.append(obj_)
# end class docListItemType
class docSimpleSectType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, kind=None, title=None, para=None):
self.kind = kind
self.title = title
if para is None:
self.para = []
else:
self.para = para
def factory(*args_, **kwargs_):
if docSimpleSectType.subclass:
return docSimpleSectType.subclass(*args_, **kwargs_)
else:
return docSimpleSectType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_title(self): return self.title
def set_title(self, title): self.title = title
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_kind(self): return self.kind
def set_kind(self, kind): self.kind = kind
def export(self, outfile, level, namespace_='', name_='docSimpleSectType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docSimpleSectType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docSimpleSectType'):
if self.kind is not None:
outfile.write(' kind=%s' % (quote_attrib(self.kind), ))
def exportChildren(self, outfile, level, namespace_='', name_='docSimpleSectType'):
if self.title:
self.title.export(outfile, level, namespace_, name_='title')
for para_ in self.para:
para_.export(outfile, level, namespace_, name_='para')
    def hasContent_(self):
        # para is a list (possibly empty); title is an object or None.
        return bool(self.title or self.para)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('kind'):
self.kind = attrs.get('kind').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'title':
obj_ = docTitleType.factory()
obj_.build(child_)
self.set_title(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
obj_ = docParaType.factory()
obj_.build(child_)
self.para.append(obj_)
# end class docSimpleSectType
class docVarListEntryType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, term=None):
self.term = term
def factory(*args_, **kwargs_):
if docVarListEntryType.subclass:
return docVarListEntryType.subclass(*args_, **kwargs_)
else:
return docVarListEntryType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_term(self): return self.term
def set_term(self, term): self.term = term
def export(self, outfile, level, namespace_='', name_='docVarListEntryType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docVarListEntryType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docVarListEntryType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docVarListEntryType'):
if self.term:
self.term.export(outfile, level, namespace_, name_='term', )
def hasContent_(self):
if (
self.term is not None
):
return True
else:
return False
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'term':
obj_ = docTitleType.factory()
obj_.build(child_)
self.set_term(obj_)
# end class docVarListEntryType
class docVariableListType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_=''):
self.valueOf_ = valueOf_
def factory(*args_, **kwargs_):
if docVariableListType.subclass:
return docVariableListType.subclass(*args_, **kwargs_)
else:
return docVariableListType(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docVariableListType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docVariableListType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docVariableListType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docVariableListType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.TEXT_NODE:
self.valueOf_ += child_.nodeValue
elif child_.nodeType == Node.CDATA_SECTION_NODE:
self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docVariableListType
class docRefTextType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, refid=None, kindref=None, external=None, valueOf_='', mixedclass_=None, content_=None):
self.refid = refid
self.kindref = kindref
self.external = external
        self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docRefTextType.subclass:
return docRefTextType.subclass(*args_, **kwargs_)
else:
return docRefTextType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_refid(self): return self.refid
def set_refid(self, refid): self.refid = refid
def get_kindref(self): return self.kindref
def set_kindref(self, kindref): self.kindref = kindref
def get_external(self): return self.external
def set_external(self, external): self.external = external
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docRefTextType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docRefTextType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docRefTextType'):
if self.refid is not None:
outfile.write(' refid=%s' % (self.format_string(quote_attrib(self.refid).encode(ExternalEncoding), input_name='refid'), ))
if self.kindref is not None:
outfile.write(' kindref=%s' % (quote_attrib(self.kindref), ))
if self.external is not None:
outfile.write(' external=%s' % (self.format_string(quote_attrib(self.external).encode(ExternalEncoding), input_name='external'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docRefTextType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('refid'):
self.refid = attrs.get('refid').value
if attrs.get('kindref'):
self.kindref = attrs.get('kindref').value
if attrs.get('external'):
self.external = attrs.get('external').value
def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docRefTextType
class docTableType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, rows=None, cols=None, row=None, caption=None):
self.rows = rows
self.cols = cols
if row is None:
self.row = []
else:
self.row = row
self.caption = caption
def factory(*args_, **kwargs_):
if docTableType.subclass:
return docTableType.subclass(*args_, **kwargs_)
else:
return docTableType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_row(self): return self.row
def set_row(self, row): self.row = row
def add_row(self, value): self.row.append(value)
def insert_row(self, index, value): self.row[index] = value
def get_caption(self): return self.caption
def set_caption(self, caption): self.caption = caption
def get_rows(self): return self.rows
def set_rows(self, rows): self.rows = rows
def get_cols(self): return self.cols
def set_cols(self, cols): self.cols = cols
def export(self, outfile, level, namespace_='', name_='docTableType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docTableType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docTableType'):
if self.rows is not None:
outfile.write(' rows="%s"' % self.format_integer(self.rows, input_name='rows'))
if self.cols is not None:
outfile.write(' cols="%s"' % self.format_integer(self.cols, input_name='cols'))
def exportChildren(self, outfile, level, namespace_='', name_='docTableType'):
for row_ in self.row:
row_.export(outfile, level, namespace_, name_='row')
if self.caption:
self.caption.export(outfile, level, namespace_, name_='caption')
    def hasContent_(self):
        # row is a list (possibly empty); caption is an object or None.
        return bool(self.row or self.caption)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('rows'):
try:
self.rows = int(attrs.get('rows').value)
except ValueError as exp:
raise ValueError('Bad integer attribute (rows): %s' % exp)
if attrs.get('cols'):
try:
self.cols = int(attrs.get('cols').value)
except ValueError as exp:
raise ValueError('Bad integer attribute (cols): %s' % exp)
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'row':
obj_ = docRowType.factory()
obj_.build(child_)
self.row.append(obj_)
elif child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'caption':
obj_ = docCaptionType.factory()
obj_.build(child_)
self.set_caption(obj_)
# end class docTableType
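# docTableType.buildAttributes() above converts the 'rows'/'cols' attributes
# to int and re-raises a descriptive ValueError on bad input. A self-contained
# sketch of that conversion (parse_int_attr is a hypothetical helper, not part
# of this module):

```python
from xml.dom import minidom

def parse_int_attr(attrs, name):
    """Return attribute `name` as int, None if absent; raise on bad values."""
    node = attrs.get(name)
    if node is None:
        return None
    try:
        return int(node.value)
    except ValueError as exp:
        raise ValueError('Bad integer attribute (%s): %s' % (name, exp))

table = minidom.parseString('<table rows="2" cols="3"/>').documentElement
```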
class docRowType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, entry=None):
if entry is None:
self.entry = []
else:
self.entry = entry
def factory(*args_, **kwargs_):
if docRowType.subclass:
return docRowType.subclass(*args_, **kwargs_)
else:
return docRowType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_entry(self): return self.entry
def set_entry(self, entry): self.entry = entry
def add_entry(self, value): self.entry.append(value)
def insert_entry(self, index, value): self.entry[index] = value
def export(self, outfile, level, namespace_='', name_='docRowType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docRowType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docRowType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docRowType'):
for entry_ in self.entry:
entry_.export(outfile, level, namespace_, name_='entry')
    def hasContent_(self):
        # entry is a list; an empty list means no content to export.
        return bool(self.entry)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'entry':
obj_ = docEntryType.factory()
obj_.build(child_)
self.entry.append(obj_)
# end class docRowType
class docEntryType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, thead=None, align=None, rowspan=None, colspan=None, para=None):
self.thead = thead
self.align = align
self.rowspan = rowspan
self.colspan = colspan
if para is None:
self.para = []
else:
self.para = para
def factory(*args_, **kwargs_):
if docEntryType.subclass:
return docEntryType.subclass(*args_, **kwargs_)
else:
return docEntryType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_para(self): return self.para
def set_para(self, para): self.para = para
def add_para(self, value): self.para.append(value)
def insert_para(self, index, value): self.para[index] = value
def get_thead(self): return self.thead
def set_thead(self, thead): self.thead = thead
def get_align(self): return self.align
def set_align(self, align): self.align = align
def get_rowspan(self): return self.rowspan
def set_rowspan(self, rowspan): self.rowspan = rowspan
def get_colspan(self): return self.colspan
def set_colspan(self, colspan): self.colspan = colspan
def export(self, outfile, level, namespace_='', name_='docEntryType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docEntryType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docEntryType'):
if self.thead is not None:
outfile.write(' thead=%s' % (quote_attrib(self.thead), ))
if self.align is not None:
outfile.write(' align=%s' % (quote_attrib(self.align), ))
if self.rowspan is not None:
outfile.write(' rowspan=%s' % (quote_attrib(self.rowspan), ))
if self.colspan is not None:
outfile.write(' colspan=%s' % (quote_attrib(self.colspan), ))
def exportChildren(self, outfile, level, namespace_='', name_='docEntryType'):
for para_ in self.para:
para_.export(outfile, level, namespace_, name_='para')
    def hasContent_(self):
        # para is a list; an empty list means no content to export.
        return bool(self.para)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('thead'):
self.thead = attrs.get('thead').value
if attrs.get('align'):
self.align = attrs.get('align').value
if attrs.get('rowspan'):
self.rowspan = attrs.get('rowspan').value
if attrs.get('colspan'):
self.colspan = attrs.get('colspan').value
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'para':
obj_ = docParaType.factory()
obj_.build(child_)
self.para.append(obj_)
# end class docEntryType
class docCaptionType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, valueOf_='', mixedclass_=None, content_=None):
        self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docCaptionType.subclass:
return docCaptionType.subclass(*args_, **kwargs_)
else:
return docCaptionType(*args_, **kwargs_)
factory = staticmethod(factory)
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docCaptionType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docCaptionType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docCaptionType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docCaptionType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docCaptionType
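# The mixed-content classes above stash CDATA sections with bare
# '![CDATA[...]]' markers so quote_xml() leaves the brackets unescaped, and
# exportChildren() then restores the real delimiters. A sketch of that
# restore step (restore_cdata is a hypothetical stand-in for the inline
# logic in exportChildren()):

```python
def restore_cdata(value):
    """Turn stored '![CDATA[...]]' markers back into real CDATA delimiters."""
    if value.find('![CDATA') > -1:
        value = value.replace('![CDATA', '<![CDATA')
        value = value.replace(']]', ']]>')
    return value
```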
class docHeadingType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, level=None, valueOf_='', mixedclass_=None, content_=None):
self.level = level
        self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docHeadingType.subclass:
return docHeadingType.subclass(*args_, **kwargs_)
else:
return docHeadingType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_level(self): return self.level
def set_level(self, level): self.level = level
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docHeadingType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docHeadingType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docHeadingType'):
if self.level is not None:
outfile.write(' level="%s"' % self.format_integer(self.level, input_name='level'))
def exportChildren(self, outfile, level, namespace_='', name_='docHeadingType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('level'):
try:
self.level = int(attrs.get('level').value)
except ValueError as exp:
raise ValueError('Bad integer attribute (level): %s' % exp)
def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docHeadingType
class docImageType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, width=None, type_=None, name=None, height=None, valueOf_='', mixedclass_=None, content_=None):
self.width = width
self.type_ = type_
self.name = name
self.height = height
        self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docImageType.subclass:
return docImageType.subclass(*args_, **kwargs_)
else:
return docImageType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_width(self): return self.width
def set_width(self, width): self.width = width
def get_type(self): return self.type_
def set_type(self, type_): self.type_ = type_
def get_name(self): return self.name
def set_name(self, name): self.name = name
def get_height(self): return self.height
def set_height(self, height): self.height = height
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docImageType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docImageType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docImageType'):
if self.width is not None:
outfile.write(' width=%s' % (self.format_string(quote_attrib(self.width).encode(ExternalEncoding), input_name='width'), ))
if self.type_ is not None:
outfile.write(' type=%s' % (quote_attrib(self.type_), ))
if self.name is not None:
outfile.write(' name=%s' % (self.format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
if self.height is not None:
outfile.write(' height=%s' % (self.format_string(quote_attrib(self.height).encode(ExternalEncoding), input_name='height'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docImageType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('width'):
self.width = attrs.get('width').value
if attrs.get('type'):
self.type_ = attrs.get('type').value
if attrs.get('name'):
self.name = attrs.get('name').value
if attrs.get('height'):
self.height = attrs.get('height').value
def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docImageType
class docDotFileType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, name=None, valueOf_='', mixedclass_=None, content_=None):
self.name = name
        self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docDotFileType.subclass:
return docDotFileType.subclass(*args_, **kwargs_)
else:
return docDotFileType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_name(self): return self.name
def set_name(self, name): self.name = name
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docDotFileType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docDotFileType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docDotFileType'):
if self.name is not None:
outfile.write(' name=%s' % (self.format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docDotFileType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('name'):
self.name = attrs.get('name').value
def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docDotFileType
class docTocItemType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, id=None, valueOf_='', mixedclass_=None, content_=None):
self.id = id
        self.valueOf_ = valueOf_
if mixedclass_ is None:
self.mixedclass_ = MixedContainer
else:
self.mixedclass_ = mixedclass_
if content_ is None:
self.content_ = []
else:
self.content_ = content_
def factory(*args_, **kwargs_):
if docTocItemType.subclass:
return docTocItemType.subclass(*args_, **kwargs_)
else:
return docTocItemType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_id(self): return self.id
def set_id(self, id): self.id = id
def getValueOf_(self): return self.valueOf_
def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
def export(self, outfile, level, namespace_='', name_='docTocItemType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docTocItemType')
outfile.write('>')
self.exportChildren(outfile, level + 1, namespace_, name_)
outfile.write('</%s%s>\n' % (namespace_, name_))
def exportAttributes(self, outfile, level, namespace_='', name_='docTocItemType'):
if self.id is not None:
outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
def exportChildren(self, outfile, level, namespace_='', name_='docTocItemType'):
if self.valueOf_.find('![CDATA')>-1:
value=quote_xml('%s' % self.valueOf_)
value=value.replace('![CDATA','<![CDATA')
value=value.replace(']]',']]>')
outfile.write(value)
else:
outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        # valueOf_ is a string; the empty string means no content.
        return bool(self.valueOf_)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
self.valueOf_ = ''
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
if attrs.get('id'):
self.id = attrs.get('id').value
def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docTocItemType
class docTocListType(GeneratedsSuper):
subclass = None
superclass = None
def __init__(self, tocitem=None):
if tocitem is None:
self.tocitem = []
else:
self.tocitem = tocitem
def factory(*args_, **kwargs_):
if docTocListType.subclass:
return docTocListType.subclass(*args_, **kwargs_)
else:
return docTocListType(*args_, **kwargs_)
factory = staticmethod(factory)
def get_tocitem(self): return self.tocitem
def set_tocitem(self, tocitem): self.tocitem = tocitem
def add_tocitem(self, value): self.tocitem.append(value)
def insert_tocitem(self, index, value): self.tocitem[index] = value
def export(self, outfile, level, namespace_='', name_='docTocListType', namespacedef_=''):
showIndent(outfile, level)
outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
self.exportAttributes(outfile, level, namespace_, name_='docTocListType')
if self.hasContent_():
outfile.write('>\n')
self.exportChildren(outfile, level + 1, namespace_, name_)
showIndent(outfile, level)
outfile.write('</%s%s>\n' % (namespace_, name_))
else:
outfile.write(' />\n')
def exportAttributes(self, outfile, level, namespace_='', name_='docTocListType'):
pass
def exportChildren(self, outfile, level, namespace_='', name_='docTocListType'):
for tocitem_ in self.tocitem:
tocitem_.export(outfile, level, namespace_, name_='tocitem')
    def hasContent_(self):
        # tocitem is a list; an empty list means no content to export.
        return bool(self.tocitem)
def build(self, node_):
attrs = node_.attributes
self.buildAttributes(attrs)
for child_ in node_.childNodes:
nodeName_ = child_.nodeName.split(':')[-1]
self.buildChildren(child_, nodeName_)
def buildAttributes(self, attrs):
pass
def buildChildren(self, child_, nodeName_):
if child_.nodeType == Node.ELEMENT_NODE and \
nodeName_ == 'tocitem':
obj_ = docTocItemType.factory()
obj_.build(child_)
self.tocitem.append(obj_)
# end class docTocListType
class docLanguageType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, langid=None, para=None):
        self.langid = langid
        if para is None:
            self.para = []
        else:
            self.para = para
    def factory(*args_, **kwargs_):
        if docLanguageType.subclass:
            return docLanguageType.subclass(*args_, **kwargs_)
        else:
            return docLanguageType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_para(self): return self.para
    def set_para(self, para): self.para = para
    def add_para(self, value): self.para.append(value)
    def insert_para(self, index, value): self.para[index] = value
    def get_langid(self): return self.langid
    def set_langid(self, langid): self.langid = langid
    def export(self, outfile, level, namespace_='', name_='docLanguageType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docLanguageType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docLanguageType'):
        if self.langid is not None:
            outfile.write(' langid=%s' % (self.format_string(quote_attrib(self.langid).encode(ExternalEncoding), input_name='langid'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docLanguageType'):
        for para_ in self.para:
            para_.export(outfile, level, namespace_, name_='para')
    def hasContent_(self):
        if (
            self.para is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('langid'):
            self.langid = attrs.get('langid').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'para':
            obj_ = docParaType.factory()
            obj_.build(child_)
            self.para.append(obj_)
# end class docLanguageType
class docParamListType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, kind=None, parameteritem=None):
        self.kind = kind
        if parameteritem is None:
            self.parameteritem = []
        else:
            self.parameteritem = parameteritem
    def factory(*args_, **kwargs_):
        if docParamListType.subclass:
            return docParamListType.subclass(*args_, **kwargs_)
        else:
            return docParamListType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_parameteritem(self): return self.parameteritem
    def set_parameteritem(self, parameteritem): self.parameteritem = parameteritem
    def add_parameteritem(self, value): self.parameteritem.append(value)
    def insert_parameteritem(self, index, value): self.parameteritem[index] = value
    def get_kind(self): return self.kind
    def set_kind(self, kind): self.kind = kind
    def export(self, outfile, level, namespace_='', name_='docParamListType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docParamListType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docParamListType'):
        if self.kind is not None:
            outfile.write(' kind=%s' % (quote_attrib(self.kind), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docParamListType'):
        for parameteritem_ in self.parameteritem:
            parameteritem_.export(outfile, level, namespace_, name_='parameteritem')
    def hasContent_(self):
        if (
            self.parameteritem is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('kind'):
            self.kind = attrs.get('kind').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'parameteritem':
            obj_ = docParamListItem.factory()
            obj_.build(child_)
            self.parameteritem.append(obj_)
# end class docParamListType
class docParamListItem(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, parameternamelist=None, parameterdescription=None):
        if parameternamelist is None:
            self.parameternamelist = []
        else:
            self.parameternamelist = parameternamelist
        self.parameterdescription = parameterdescription
    def factory(*args_, **kwargs_):
        if docParamListItem.subclass:
            return docParamListItem.subclass(*args_, **kwargs_)
        else:
            return docParamListItem(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_parameternamelist(self): return self.parameternamelist
    def set_parameternamelist(self, parameternamelist): self.parameternamelist = parameternamelist
    def add_parameternamelist(self, value): self.parameternamelist.append(value)
    def insert_parameternamelist(self, index, value): self.parameternamelist[index] = value
    def get_parameterdescription(self): return self.parameterdescription
    def set_parameterdescription(self, parameterdescription): self.parameterdescription = parameterdescription
    def export(self, outfile, level, namespace_='', name_='docParamListItem', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docParamListItem')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docParamListItem'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='docParamListItem'):
        for parameternamelist_ in self.parameternamelist:
            parameternamelist_.export(outfile, level, namespace_, name_='parameternamelist')
        if self.parameterdescription:
            self.parameterdescription.export(outfile, level, namespace_, name_='parameterdescription', )
    def hasContent_(self):
        if (
            self.parameternamelist is not None or
            self.parameterdescription is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        pass
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'parameternamelist':
            obj_ = docParamNameList.factory()
            obj_.build(child_)
            self.parameternamelist.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'parameterdescription':
            obj_ = descriptionType.factory()
            obj_.build(child_)
            self.set_parameterdescription(obj_)
# end class docParamListItem
class docParamNameList(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, parametername=None):
        if parametername is None:
            self.parametername = []
        else:
            self.parametername = parametername
    def factory(*args_, **kwargs_):
        if docParamNameList.subclass:
            return docParamNameList.subclass(*args_, **kwargs_)
        else:
            return docParamNameList(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_parametername(self): return self.parametername
    def set_parametername(self, parametername): self.parametername = parametername
    def add_parametername(self, value): self.parametername.append(value)
    def insert_parametername(self, index, value): self.parametername[index] = value
    def export(self, outfile, level, namespace_='', name_='docParamNameList', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docParamNameList')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docParamNameList'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='docParamNameList'):
        for parametername_ in self.parametername:
            parametername_.export(outfile, level, namespace_, name_='parametername')
    def hasContent_(self):
        if (
            self.parametername is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        pass
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'parametername':
            obj_ = docParamName.factory()
            obj_.build(child_)
            self.parametername.append(obj_)
# end class docParamNameList
class docParamName(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, direction=None, ref=None, mixedclass_=None, content_=None):
        self.direction = direction
        self.ref = ref
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if content_ is None:
            self.content_ = []
        else:
            self.content_ = content_
    def factory(*args_, **kwargs_):
        if docParamName.subclass:
            return docParamName.subclass(*args_, **kwargs_)
        else:
            return docParamName(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_ref(self): return self.ref
    def set_ref(self, ref): self.ref = ref
    def get_direction(self): return self.direction
    def set_direction(self, direction): self.direction = direction
    def export(self, outfile, level, namespace_='', name_='docParamName', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docParamName')
        outfile.write('>')
        self.exportChildren(outfile, level + 1, namespace_, name_)
        outfile.write('</%s%s>\n' % (namespace_, name_))
    def exportAttributes(self, outfile, level, namespace_='', name_='docParamName'):
        if self.direction is not None:
            outfile.write(' direction=%s' % (quote_attrib(self.direction), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docParamName'):
        for item_ in self.content_:
            item_.export(outfile, level, item_.name, namespace_)
    def hasContent_(self):
        if (
            self.ref is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('direction'):
            self.direction = attrs.get('direction').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'ref':
            childobj_ = docRefTextType.factory()
            childobj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'ref', childobj_)
            self.content_.append(obj_)
        elif child_.nodeType == Node.TEXT_NODE:
            obj_ = self.mixedclass_(MixedContainer.CategoryText,
                MixedContainer.TypeNone, '', child_.nodeValue)
            self.content_.append(obj_)
            d = child_.parentNode.attributes.get('direction')
            if d is not None:
                self.content_.insert(0, self.mixedclass_(MixedContainer.CategoryText,
                    MixedContainer.TypeNone,
                    '', '[{}] '.format(d.value)))
# end class docParamName
class docXRefSectType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, id=None, xreftitle=None, xrefdescription=None):
        self.id = id
        if xreftitle is None:
            self.xreftitle = []
        else:
            self.xreftitle = xreftitle
        self.xrefdescription = xrefdescription
    def factory(*args_, **kwargs_):
        if docXRefSectType.subclass:
            return docXRefSectType.subclass(*args_, **kwargs_)
        else:
            return docXRefSectType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_xreftitle(self): return self.xreftitle
    def set_xreftitle(self, xreftitle): self.xreftitle = xreftitle
    def add_xreftitle(self, value): self.xreftitle.append(value)
    def insert_xreftitle(self, index, value): self.xreftitle[index] = value
    def get_xrefdescription(self): return self.xrefdescription
    def set_xrefdescription(self, xrefdescription): self.xrefdescription = xrefdescription
    def get_id(self): return self.id
    def set_id(self, id): self.id = id
    def export(self, outfile, level, namespace_='', name_='docXRefSectType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docXRefSectType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docXRefSectType'):
        if self.id is not None:
            outfile.write(' id=%s' % (self.format_string(quote_attrib(self.id).encode(ExternalEncoding), input_name='id'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docXRefSectType'):
        for xreftitle_ in self.xreftitle:
            showIndent(outfile, level)
            outfile.write('<%sxreftitle>%s</%sxreftitle>\n' % (namespace_, self.format_string(quote_xml(xreftitle_).encode(ExternalEncoding), input_name='xreftitle'), namespace_))
        if self.xrefdescription:
            self.xrefdescription.export(outfile, level, namespace_, name_='xrefdescription', )
    def hasContent_(self):
        if (
            self.xreftitle is not None or
            self.xrefdescription is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('id'):
            self.id = attrs.get('id').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'xreftitle':
            xreftitle_ = ''
            for text__content_ in child_.childNodes:
                xreftitle_ += text__content_.nodeValue
            self.xreftitle.append(xreftitle_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'xrefdescription':
            obj_ = descriptionType.factory()
            obj_.build(child_)
            self.set_xrefdescription(obj_)
# end class docXRefSectType
class docCopyType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, link=None, para=None, sect1=None, internal=None):
        self.link = link
        if para is None:
            self.para = []
        else:
            self.para = para
        if sect1 is None:
            self.sect1 = []
        else:
            self.sect1 = sect1
        self.internal = internal
    def factory(*args_, **kwargs_):
        if docCopyType.subclass:
            return docCopyType.subclass(*args_, **kwargs_)
        else:
            return docCopyType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_para(self): return self.para
    def set_para(self, para): self.para = para
    def add_para(self, value): self.para.append(value)
    def insert_para(self, index, value): self.para[index] = value
    def get_sect1(self): return self.sect1
    def set_sect1(self, sect1): self.sect1 = sect1
    def add_sect1(self, value): self.sect1.append(value)
    def insert_sect1(self, index, value): self.sect1[index] = value
    def get_internal(self): return self.internal
    def set_internal(self, internal): self.internal = internal
    def get_link(self): return self.link
    def set_link(self, link): self.link = link
    def export(self, outfile, level, namespace_='', name_='docCopyType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docCopyType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docCopyType'):
        if self.link is not None:
            outfile.write(' link=%s' % (self.format_string(quote_attrib(self.link).encode(ExternalEncoding), input_name='link'), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docCopyType'):
        for para_ in self.para:
            para_.export(outfile, level, namespace_, name_='para')
        for sect1_ in self.sect1:
            sect1_.export(outfile, level, namespace_, name_='sect1')
        if self.internal:
            self.internal.export(outfile, level, namespace_, name_='internal')
    def hasContent_(self):
        if (
            self.para is not None or
            self.sect1 is not None or
            self.internal is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('link'):
            self.link = attrs.get('link').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'para':
            obj_ = docParaType.factory()
            obj_.build(child_)
            self.para.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'sect1':
            obj_ = docSect1Type.factory()
            obj_.build(child_)
            self.sect1.append(obj_)
        elif child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'internal':
            obj_ = docInternalType.factory()
            obj_.build(child_)
            self.set_internal(obj_)
# end class docCopyType
class docCharType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, char=None, valueOf_=''):
        self.char = char
        self.valueOf_ = valueOf_
    def factory(*args_, **kwargs_):
        if docCharType.subclass:
            return docCharType.subclass(*args_, **kwargs_)
        else:
            return docCharType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_char(self): return self.char
    def set_char(self, char): self.char = char
    def getValueOf_(self): return self.valueOf_
    def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='', name_='docCharType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docCharType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docCharType'):
        if self.char is not None:
            outfile.write(' char=%s' % (quote_attrib(self.char), ))
    def exportChildren(self, outfile, level, namespace_='', name_='docCharType'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        if (
            self.valueOf_ is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        if attrs.get('char'):
            self.char = attrs.get('char').value
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docCharType
class docParBlockType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, mixedclass_=None, para=None):
        if mixedclass_ is None:
            self.mixedclass_ = MixedContainer
        else:
            self.mixedclass_ = mixedclass_
        if para is None:
            self.para = []
        else:
            self.para = para
    def factory(*args_, **kwargs_):
        if docParBlockType.subclass:
            return docParBlockType.subclass(*args_, **kwargs_)
        else:
            return docParBlockType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_para(self): return self.para
    def set_para(self, para): self.para = para
    def add_para(self, value): self.para.append(value)
    def insert_para(self, index, value): self.para[index] = value
    def export(self, outfile, level, namespace_='', name_='docParBlockType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        self.exportAttributes(outfile, level, namespace_, name_='docParBlockType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write('/>\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docParBlockType'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='docParBlockType'):
        for para_ in self.para:
            para_.export(outfile, level, namespace_, name_='para')
    def hasContent_(self):
        if (
            self.para
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        pass
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.ELEMENT_NODE and \
            nodeName_ == 'para':
            obj_ = docParaType.factory()
            obj_.build(child_)
            obj_ = self.mixedclass_(MixedContainer.CategoryComplex,
                MixedContainer.TypeNone, 'para', obj_)
            self.para.append(obj_)
# end class docParBlockType
class docEmptyType(GeneratedsSuper):
    subclass = None
    superclass = None
    def __init__(self, valueOf_=''):
        self.valueOf_ = valueOf_
    def factory(*args_, **kwargs_):
        if docEmptyType.subclass:
            return docEmptyType.subclass(*args_, **kwargs_)
        else:
            return docEmptyType(*args_, **kwargs_)
    factory = staticmethod(factory)
    def getValueOf_(self): return self.valueOf_
    def setValueOf_(self, valueOf_): self.valueOf_ = valueOf_
    def export(self, outfile, level, namespace_='', name_='docEmptyType', namespacedef_=''):
        showIndent(outfile, level)
        outfile.write('<%s%s %s' % (namespace_, name_, namespacedef_, ))
        self.exportAttributes(outfile, level, namespace_, name_='docEmptyType')
        if self.hasContent_():
            outfile.write('>\n')
            self.exportChildren(outfile, level + 1, namespace_, name_)
            showIndent(outfile, level)
            outfile.write('</%s%s>\n' % (namespace_, name_))
        else:
            outfile.write(' />\n')
    def exportAttributes(self, outfile, level, namespace_='', name_='docEmptyType'):
        pass
    def exportChildren(self, outfile, level, namespace_='', name_='docEmptyType'):
        if self.valueOf_.find('![CDATA')>-1:
            value=quote_xml('%s' % self.valueOf_)
            value=value.replace('![CDATA','<![CDATA')
            value=value.replace(']]',']]>')
            outfile.write(value)
        else:
            outfile.write(quote_xml('%s' % self.valueOf_))
    def hasContent_(self):
        if (
            self.valueOf_ is not None
            ):
            return True
        else:
            return False
    def build(self, node_):
        attrs = node_.attributes
        self.buildAttributes(attrs)
        self.valueOf_ = ''
        for child_ in node_.childNodes:
            nodeName_ = child_.nodeName.split(':')[-1]
            self.buildChildren(child_, nodeName_)
    def buildAttributes(self, attrs):
        pass
    def buildChildren(self, child_, nodeName_):
        if child_.nodeType == Node.TEXT_NODE:
            self.valueOf_ += child_.nodeValue
        elif child_.nodeType == Node.CDATA_SECTION_NODE:
            self.valueOf_ += '![CDATA['+child_.nodeValue+']]'
# end class docEmptyType
USAGE_TEXT = """
Usage: python <Parser>.py [ -s ] <in_xml_file>
Options:
-s Use the SAX parser, not the minidom parser.
"""
def usage():
print(USAGE_TEXT)
sys.exit(1)
def parse(inFileName):
doc = minidom.parse(inFileName)
rootNode = doc.documentElement
rootObj = DoxygenType.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(sys.stdout, 0, name_="doxygen",
namespacedef_='')
return rootObj
def parseString(inString):
doc = minidom.parseString(inString)
rootNode = doc.documentElement
rootObj = DoxygenType.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(sys.stdout, 0, name_="doxygen",
namespacedef_='')
return rootObj
def parseLiteral(inFileName):
doc = minidom.parse(inFileName)
rootNode = doc.documentElement
rootObj = DoxygenType.factory()
rootObj.build(rootNode)
# Enable Python to collect the space used by the DOM.
doc = None
sys.stdout.write('from compound import *\n\n')
sys.stdout.write('rootObj = doxygen(\n')
rootObj.exportLiteral(sys.stdout, 0, name_="doxygen")
sys.stdout.write(')\n')
return rootObj
def main():
args = sys.argv[1:]
if len(args) == 1:
parse(args[0])
else:
usage()
if __name__ == '__main__':
main()
#import pdb
#pdb.run('main()')
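The generated `build()`/`buildChildren()` methods above all rely on the same minidom pattern: parse the document, take the root element, then dispatch each child element by its local name (namespace prefix stripped with `split(':')[-1]`). The following self-contained sketch illustrates that pattern in isolation; the `d:compound` element and `urn:x` namespace are made-up examples, not part of the Doxygen schema.

```python
from xml.dom import minidom
from xml.dom.minidom import Node

# Parse a tiny namespaced document (illustrative content only).
doc = minidom.parseString(
    '<d:doxygen xmlns:d="urn:x">'
    '<d:compound kind="class">Example</d:compound>'
    '</d:doxygen>')
root = doc.documentElement

# Dispatch children by local name, exactly as buildChildren() does:
# nodeName may carry a prefix ("d:compound"), so split(':')[-1] yields "compound".
names = [child.nodeName.split(':')[-1]
         for child in root.childNodes
         if child.nodeType == Node.ELEMENT_NODE]
print(names)  # → ['compound']
```

This is why the generated parsers work regardless of whether the input XML uses a namespace prefix: the dispatch key is always the local element name.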
from django.urls import path
from . import views
from django.conf import settings
from django.conf.urls.static import static

urlpatterns = [
    # path('hello/', views.HelloView.as_view(), name='hello'),
    path('api/manyusers/', views.ManyCreativeUsers.as_view(), name='manyusers'),
    path('api/singleusers/', views.SingleCreativeUsers.as_view(), name='singleusers'),
    path('api/profile/', views.CreativeProfile.as_view(), name='profile'),
    path('api/booking/', views.CreateBooking.as_view(), name='booking'),
    path('api/review/', views.CreateReview.as_view(), name='review'),
    path('api/signup/', views.RegistrationAPIView.as_view()),
    path('api/login/', views.LoginAPIView.as_view()),
    path('api/updateuser/', views.UserRetrieveUpdateAPIView.as_view()),

    # Authentication routes
    path('api/user/sign-up', views.RegistrationAPIView.as_view(), name='user_register'),
    path('api/user/sign-in', views.LoginAPIView.as_view(), name='user_login'),

    # Client profile
    path('api/client-user/profile/', views.CreativeProfile.as_view(), name='multi_profile_crud'),
    path('api/client-user/profile/<int:profile_id>', views.CreativeProfile.as_view(), name='profile_crud'),

    # Booking
    path('api/client-user/create-booking/', views.CreateBooking.as_view(), name='make_booking'),

    # Reviews
    path('api/client-user/add-review/<int:studio_id>', views.CreateReview.as_view(), name='make_reviews'),

    # Path examples:
    # http://127.0.0.1:8000/api/services/ --> to view all items
    # http://127.0.0.1:8000/api/services/update/1/ --> to update a specific item
    # http://127.0.0.1:8000/api/services/delete/1/ --> to delete a specific item

    # StudioUser
    path('api/studio-user/', views.StudioUserList.as_view(), name='StudioUser'),
    path('api/studio-user/<int:pk>/', views.IndividualStudioUser.as_view()),
    path('api/studio-user/update/<int:pk>/', views.IndividualStudioUser.as_view()),
    path('api/studio-user/delete/<int:pk>/', views.IndividualStudioUser.as_view()),

    # AdvertPost
    path('api/advert-post/', views.AdvertPostList.as_view(), name='AdvertPost'),
    path('api/advert-post/<studio_id>', views.AdvertPostList.as_view(), name='AdvertPost'),
    path('api/advert-post/<int:pk>/', views.IndividualAdvertPost.as_view()),
    path('api/advert-post/update/<int:pk>/', views.IndividualAdvertPost.as_view()),
    path('api/advert-post/delete/<int:pk>/', views.IndividualAdvertPost.as_view()),

    # Services
    path('api/services/', views.ServicesList.as_view(), name='Services'),
    path('api/services/<int:pk>/', views.IndividualServices.as_view()),
    path('api/services/update/<int:pk>/', views.IndividualServices.as_view()),
    path('api/services/delete/<int:pk>/', views.IndividualServices.as_view()),

    # StudioProfile
    path('api/studio-user/profile/', views.StudioProfileList.as_view()),
    path('api/studio-user/profile/<int:pk>/', views.IndividualStudioProfile.as_view()),
    path('api/studio-user/profile/update/<int:pk>/', views.IndividualStudioProfile.as_view()),
    path('api/studio-user/profile/delete/<int:pk>/', views.IndividualStudioProfile.as_view()),
]

if settings.DEBUG:
    urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
from .provider import WEKOProvider # noqa
| 21.5 | 42 | 0.790698 | 5 | 43 | 6.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 43 | 1 | 43 | 43 | 0.944444 | 0.093023 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ed12e487ceb798d9200e94e28b6303d365cd20f3 | 195 | py | Python | tonic/explorations/__init__.py | Eyalcohenx/tonic | afc15c6fa23fed4f696f68f0acf961964b0172dc | [
"MIT"
] | 350 | 2020-08-06T13:49:11.000Z | 2022-03-24T08:53:59.000Z | tonic/explorations/__init__.py | Eyalcohenx/tonic | afc15c6fa23fed4f696f68f0acf961964b0172dc | [
"MIT"
] | 12 | 2020-08-07T02:21:58.000Z | 2021-05-20T11:50:44.000Z | tonic/explorations/__init__.py | Eyalcohenx/tonic | afc15c6fa23fed4f696f68f0acf961964b0172dc | [
"MIT"
] | 35 | 2020-08-06T16:53:40.000Z | 2021-12-17T06:01:09.000Z | from .noisy import NoActionNoise
from .noisy import NormalActionNoise
from .noisy import OrnsteinUhlenbeckActionNoise
__all__ = ['NoActionNoise', 'NormalActionNoise', 'OrnsteinUhlenbeckActionNoise']
| 27.857143 | 74 | 0.861538 | 16 | 195 | 10.25 | 0.4375 | 0.164634 | 0.27439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097436 | 195 | 6 | 75 | 32.5 | 0.931818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 1 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ed4689be0c01d21787cfa3eab459a50edb5c6817 | 168 | py | Python | benchmarking/benchmark_01_timeit.py | h-mayorquin/attractor_sequences | 885271f30d73a58a7aad83b55949e4e32ba0b45a | [
"MIT"
] | 1 | 2016-08-19T18:58:51.000Z | 2016-08-19T18:58:51.000Z | benchmarking/benchmark_01_timeit.py | h-mayorquin/attractor_sequences | 885271f30d73a58a7aad83b55949e4e32ba0b45a | [
"MIT"
] | null | null | null | benchmarking/benchmark_01_timeit.py | h-mayorquin/attractor_sequences | 885271f30d73a58a7aad83b55949e4e32ba0b45a | [
"MIT"
] | null | null | null | import pprint
import timeit
result = timeit.repeat('benchmark_running()', setup='from benchmark_01 import benchmark_running', number=1, repeat=1)
pprint.pprint(result) | 33.6 | 117 | 0.803571 | 23 | 168 | 5.73913 | 0.521739 | 0.242424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025974 | 0.083333 | 168 | 5 | 118 | 33.6 | 0.831169 | 0 | 0 | 0 | 0 | 0 | 0.360947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0.5 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
ed61af285c57fd7f82281d2d17b30870101420b1 | 5,756 | py | Python | scripts/commit_validation/commit_validation/tests/validators/test_copyright_header_validator.py | aaarsene/o3de | 37e3b0226958974defd14dd6d808e8557dcd7345 | [
"Apache-2.0",
"MIT"
] | 1 | 2021-09-13T00:01:12.000Z | 2021-09-13T00:01:12.000Z | scripts/commit_validation/commit_validation/tests/validators/test_copyright_header_validator.py | aaarsene/o3de | 37e3b0226958974defd14dd6d808e8557dcd7345 | [
"Apache-2.0",
"MIT"
] | null | null | null | scripts/commit_validation/commit_validation/tests/validators/test_copyright_header_validator.py | aaarsene/o3de | 37e3b0226958974defd14dd6d808e8557dcd7345 | [
"Apache-2.0",
"MIT"
] | 1 | 2021-07-20T11:07:25.000Z | 2021-07-20T11:07:25.000Z | #
# Copyright (c) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution.
#
# SPDX-License-Identifier: Apache-2.0 OR MIT
#
import unittest
from unittest.mock import patch, mock_open
from commit_validation.tests.mocks.mock_commit import MockCommit
from commit_validation.validators.copyright_header_validator import CopyrightHeaderValidator
class CopyrightHeaderValidatorTests(unittest.TestCase):
@patch('builtins.open', mock_open(read_data='This file does contain\n'
'Copyright (c) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution., so it should pass\n'))
def test_fileWithCopyrightHeader_passes(self):
commit = MockCommit(files=['/someCppFile.cpp'])
files = [
'This file does contain\n'
'Copyright (c) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution., so it should pass\n',
'This file does contain\n'
'Copyright(c) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution., so it should pass\n'
'and there\'s no space between "Copyright" and "(c)"',
'This file has a upper-case C between the parenthesis\n'
'// Copyright (C) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution.',
]
for file in files:
with patch('builtins.open', mock_open(read_data=file)):
error_list = []
self.assertTrue(CopyrightHeaderValidator().run(commit, error_list))
self.assertEqual(len(error_list), 0, f"Unexpected errors: {error_list}")
def test_fileWithCopyrightHeaderExternalDir_passes(self):
commit = MockCommit(files=['/External/someCppFile.cpp'])
error_list = []
self.assertTrue(CopyrightHeaderValidator().run(commit, error_list))
self.assertEqual(len(error_list), 0, f"Unexpected errors: {error_list}")
@patch('builtins.open', mock_open(read_data='This file does not contain\n'
'the copyright header\n'))
def test_fileWithNoCopyrightHeader_fails(self):
commit = MockCommit(files=['/someCppFile.cpp'])
error_list = []
self.assertFalse(CopyrightHeaderValidator().run(commit, error_list))
self.assertNotEqual(len(error_list), 0, f"Errors were expected but none were returned.")
@patch('builtins.open', mock_open(read_data='This file contains a legacy header\n'
'this{0}file{0}Copyright{0}(c){0}Amazon.com\n'.format(' ')))
def test_fileWithLegacyAmazonCopyrightHeader_fails(self):
commit = MockCommit(files=['/someCppFile.cpp'])
error_list = []
self.assertFalse(CopyrightHeaderValidator().run(commit, error_list))
self.assertNotEqual(len(error_list), 0, f"Errors were expected but none were returned.")
@patch('builtins.open', mock_open(read_data='This file contains a legacy header\n'
'Modifications{0}copyright{0}Amazon.com,{0}Inc.{0}or{0}its affiliates\n'.format(' ')))
    def test_fileWithAmazonModificationCopyrightHeader_fails(self):
commit = MockCommit(files=['/someCppFile.cpp'])
error_list = []
self.assertFalse(CopyrightHeaderValidator().run(commit, error_list))
self.assertNotEqual(len(error_list), 0, f"Errors were expected but none were returned.")
@patch('builtins.open', mock_open(read_data='Copyright (c) Contributors to the Open 3D Engine Project.\n'
                                               '******************\n'))
def test_fileWithStaleO3DECopyrightHeader_fails(self):
commit = MockCommit(files=['/someCppFile.cpp'])
error_list = []
self.assertFalse(CopyrightHeaderValidator().run(commit, error_list))
self.assertNotEqual(len(error_list), 0, f"Errors were expected but none were returned.")
@patch('builtins.open', mock_open(read_data='This file contains a legacy header\n'
'Copyright{0}Crytek\n'.format(' ')))
def test_fileWithCrytekCopyrightHeader_fails(self):
commit = MockCommit(files=['/someCppFile.cpp'])
error_list = []
self.assertFalse(CopyrightHeaderValidator().run(commit, error_list))
self.assertNotEqual(len(error_list), 0, f"Errors were expected but none were returned.")
def test_fileExtensionIgnored_passes(self):
commit = MockCommit(files=['/someCppFile.waf_files'])
error_list = []
self.assertTrue(CopyrightHeaderValidator().run(commit, error_list))
self.assertEqual(len(error_list), 0, f"Unexpected errors: {error_list}")
def test_fileWith3rdPartyPath_passes(self):
commit = MockCommit(files=['/3rdParty/someCppFile.cpp'])
error_list = []
self.assertTrue(CopyrightHeaderValidator().run(commit, error_list))
self.assertEqual(len(error_list), 0, f"Unexpected errors: {error_list}")
def test_fileWithExternal_passes(self):
commit = MockCommit(files=['/External/someCppFile.cpp'])
error_list = []
self.assertTrue(CopyrightHeaderValidator().run(commit, error_list))
self.assertEqual(len(error_list), 0, f"Unexpected errors: {error_list}")
if __name__ == '__main__':
unittest.main()
| 55.346154 | 228 | 0.6713 | 669 | 5,756 | 5.650224 | 0.177877 | 0.083333 | 0.068783 | 0.066138 | 0.76164 | 0.753439 | 0.742328 | 0.720106 | 0.720106 | 0.707937 | 0 | 0.006904 | 0.219944 | 5,756 | 103 | 229 | 55.883495 | 0.834967 | 0.034399 | 0 | 0.5375 | 0 | 0.0375 | 0.339459 | 0.035676 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.125 | false | 0.1125 | 0.05 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
ed71084b2fc2a2381b606e2562eaa740bff9f061 | 187 | py | Python | src/sage/rings/finite_rings/constructor.py | fredstro/sage | c936d2cda81ec7ec3552a3bdb29c994b40d1bb24 | [
"BSL-1.0"
] | null | null | null | src/sage/rings/finite_rings/constructor.py | fredstro/sage | c936d2cda81ec7ec3552a3bdb29c994b40d1bb24 | [
"BSL-1.0"
] | null | null | null | src/sage/rings/finite_rings/constructor.py | fredstro/sage | c936d2cda81ec7ec3552a3bdb29c994b40d1bb24 | [
"BSL-1.0"
] | null | null | null | from sage.misc.superseded import deprecation
deprecation(19941, "This module has been renamed to sage.rings.finite_rings.finite_field_constructor")
from .finite_field_constructor import *
| 37.4 | 101 | 0.855615 | 26 | 187 | 5.961538 | 0.653846 | 0.141935 | 0.283871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02924 | 0.085562 | 187 | 4 | 102 | 46.75 | 0.877193 | 0 | 0 | 0 | 0 | 0 | 0.427807 | 0.256684 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
71f51b2121bba21479fcfd7ba691bd55a3729e60 | 27 | py | Python | flTile/test/printRank.py | rpwagner/tiled-display | 52d135bc163360fe55ce5521784b0ef48a8c82c9 | [
"Apache-2.0"
] | 1 | 2020-12-11T17:11:45.000Z | 2020-12-11T17:11:45.000Z | flTile/test/printRank.py | rpwagner/tiled-display | 52d135bc163360fe55ce5521784b0ef48a8c82c9 | [
"Apache-2.0"
] | null | null | null | flTile/test/printRank.py | rpwagner/tiled-display | 52d135bc163360fe55ce5521784b0ef48a8c82c9 | [
"Apache-2.0"
] | null | null | null | import mpi
print(mpi.rank)
| 6.75 | 14 | 0.777778 | 5 | 27 | 4.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185185 | 27 | 3 | 15 | 9 | 0.954545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.5 | null | null | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
9c697d099b31d53468ae6e5a604cd05b1f15f7e9 | 53 | py | Python | tests/test_adls_feedexporter_storage.py | hanbei/scrapy-feedexporter-azure | 66046b76c87ee98f9477fe75063e7cf5763d6ff3 | [
"Apache-2.0"
] | null | null | null | tests/test_adls_feedexporter_storage.py | hanbei/scrapy-feedexporter-azure | 66046b76c87ee98f9477fe75063e7cf5763d6ff3 | [
"Apache-2.0"
] | null | null | null | tests/test_adls_feedexporter_storage.py | hanbei/scrapy-feedexporter-azure | 66046b76c87ee98f9477fe75063e7cf5763d6ff3 | [
"Apache-2.0"
] | 2 | 2021-10-07T07:18:28.000Z | 2021-10-13T12:07:46.000Z | import pytest
def test_answer():
assert 3 == 3
| 8.833333 | 18 | 0.641509 | 8 | 53 | 4.125 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 0.264151 | 53 | 5 | 19 | 10.6 | 0.794872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |