hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
cae3a3e72a135d992fe362e05fbe91439eb7881a | 3,895 | py | Python | blog/blog/migrations/0009_auto_20170609_1503.py | singularitytech/Apostolic-Building-Company- | d808136e36f5a4b60498848b4b1ab667e4bb5c7e | [
"CC-BY-3.0"
] | null | null | null | blog/blog/migrations/0009_auto_20170609_1503.py | singularitytech/Apostolic-Building-Company- | d808136e36f5a4b60498848b4b1ab667e4bb5c7e | [
"CC-BY-3.0"
] | 3 | 2017-08-15T16:31:51.000Z | 2017-12-01T15:02:33.000Z | blog/blog/migrations/0009_auto_20170609_1503.py | singularitytech/Apostolic-Building-Company- | d808136e36f5a4b60498848b4b1ab667e4bb5c7e | [
"CC-BY-3.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.7 on 2017-06-09 15:03
from __future__ import unicode_literals
import datetime
from django.db import migrations, models
from django.utils.timezone import utc
import django.utils.timezone
class Migration(migrations.Migration):
dependencies = [
('blog', '0008_delete_site'),
]
operations = [
migrations.RemoveField(
model_name='post',
name='article_taxt',
),
migrations.RemoveField(
model_name='post',
name='article_text',
),
migrations.AddField(
model_name='post',
name='image_eight',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_five',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_four',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_one',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_seven',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_six',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_three',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='image_two',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AddField(
model_name='post',
name='text_five',
field=models.TextField(default=datetime.datetime(2017, 6, 9, 15, 2, 42, 979017, tzinfo=utc)),
preserve_default=False,
),
migrations.AddField(
model_name='post',
name='text_four',
field=models.TextField(default=datetime.datetime(2017, 6, 9, 15, 2, 59, 886721, tzinfo=utc)),
preserve_default=False,
),
migrations.AddField(
model_name='post',
name='text_one',
field=models.TextField(default=datetime.datetime(2017, 6, 9, 15, 3, 28, 602454, tzinfo=utc)),
preserve_default=False,
),
migrations.AddField(
model_name='post',
name='text_three',
field=models.TextField(default=django.utils.timezone.now),
preserve_default=False,
),
migrations.AddField(
model_name='post',
name='text_two',
field=models.TextField(default=datetime.datetime(2017, 6, 9, 15, 3, 50, 863145, tzinfo=utc)),
preserve_default=False,
),
migrations.AlterField(
model_name='post',
name='model_pic',
field=models.ImageField(default='pic_folder/None/no-img.jpg', upload_to='pic_folder/', verbose_name='blogpic'),
),
migrations.AlterModelTable(
name='post',
table=None,
),
]
| 36.745283 | 123 | 0.581772 | 418 | 3,895 | 5.229665 | 0.203349 | 0.074108 | 0.095151 | 0.124428 | 0.789113 | 0.789113 | 0.771272 | 0.709973 | 0.709973 | 0.709973 | 0 | 0.03198 | 0.285494 | 3,895 | 105 | 124 | 37.095238 | 0.753503 | 0.017458 | 0 | 0.632653 | 1 | 0 | 0.167626 | 0.061192 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05102 | 0 | 0.081633 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1b3688835e661eb4ac560454ff8d26a9d825a258 | 29 | py | Python | google_calendar/insert_event.py | greyhub/calendar_manager | 837a0d938fdd58684279fb32f8a50805bb306fe1 | [
"MIT"
] | 6 | 2021-11-23T19:52:55.000Z | 2022-03-30T13:45:05.000Z | google_calendar/insert_event.py | greyhub/calendar_manager | 837a0d938fdd58684279fb32f8a50805bb306fe1 | [
"MIT"
] | null | null | null | google_calendar/insert_event.py | greyhub/calendar_manager | 837a0d938fdd58684279fb32f8a50805bb306fe1 | [
"MIT"
] | 1 | 2021-11-23T10:14:54.000Z | 2021-11-23T10:14:54.000Z | def insert_event():
pass
| 9.666667 | 19 | 0.655172 | 4 | 29 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241379 | 29 | 2 | 20 | 14.5 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
1b622e7d73a0a85489f5c30207ec4b0c052e89ba | 52 | py | Python | tests/python/blender_mqo_test/__init__.py | nutti/bl-mqo | f651736cde2e46014c2f011349567284c3740fac | [
"MIT"
] | 24 | 2020-03-24T11:12:00.000Z | 2022-02-19T15:04:26.000Z | tests/python/blender_mqo_test/__init__.py | nutti/bl-mqo | f651736cde2e46014c2f011349567284c3740fac | [
"MIT"
] | 12 | 2019-10-10T01:35:22.000Z | 2022-03-31T06:36:05.000Z | tests/python/blender_mqo_test/__init__.py | nutti/bl-mqo | f651736cde2e46014c2f011349567284c3740fac | [
"MIT"
] | 4 | 2019-05-12T16:00:57.000Z | 2021-08-30T04:42:24.000Z | from . import import_test
from . import export_test
| 17.333333 | 25 | 0.807692 | 8 | 52 | 5 | 0.5 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 52 | 2 | 26 | 26 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1b951f9e63b3e5cdd9e6ad222050d82d02d7299e | 114 | py | Python | delira/models/segmentation/__init__.py | NKPmedia/delira | a10227e30c14c6507a1790813e53572e0d841c21 | [
"BSD-2-Clause"
] | null | null | null | delira/models/segmentation/__init__.py | NKPmedia/delira | a10227e30c14c6507a1790813e53572e0d841c21 | [
"BSD-2-Clause"
] | null | null | null | delira/models/segmentation/__init__.py | NKPmedia/delira | a10227e30c14c6507a1790813e53572e0d841c21 | [
"BSD-2-Clause"
] | null | null | null | from delira import get_backends
if "TORCH" in get_backends():
from .unet import UNet2dPyTorch, UNet3dPyTorch
| 22.8 | 50 | 0.780702 | 15 | 114 | 5.8 | 0.733333 | 0.252874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020833 | 0.157895 | 114 | 4 | 51 | 28.5 | 0.885417 | 0 | 0 | 0 | 0 | 0 | 0.04386 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1bae3ea3a76f0e196b96b25b23e3f130ca8d1518 | 96 | py | Python | main/__init__.py | download-pdf/downloadpdf | dd30036fbd910db9dd37e4fd801744f52fe340bf | [
"MIT"
] | 1 | 2021-09-24T11:28:42.000Z | 2021-09-24T11:28:42.000Z | main/__init__.py | PythonCheatsheet/downloadpdf | dd30036fbd910db9dd37e4fd801744f52fe340bf | [
"MIT"
] | 9 | 2021-09-13T06:54:56.000Z | 2021-10-14T06:42:37.000Z | main/__init__.py | download-pdf/downloadpdf | dd30036fbd910db9dd37e4fd801744f52fe340bf | [
"MIT"
] | 1 | 2021-02-23T00:53:24.000Z | 2021-02-23T00:53:24.000Z | #!/usr/bin/env python3
from main.scrape import scrapeHREF
from main.download import downloadPDF
| 24 | 37 | 0.822917 | 14 | 96 | 5.642857 | 0.785714 | 0.202532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011628 | 0.104167 | 96 | 3 | 38 | 32 | 0.906977 | 0.21875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
94140d6e4b19a26e77ac4bbbd01045690c08b1f3 | 262 | py | Python | twittcher/__init__.py | Zulko/twittcher | 3ec1a0f9b558e9595cf0eb59d4fce638cd00009c | [
"MIT"
] | 37 | 2015-01-16T21:34:07.000Z | 2021-07-30T16:33:46.000Z | twittcher/__init__.py | Zulko/twittcher | 3ec1a0f9b558e9595cf0eb59d4fce638cd00009c | [
"MIT"
] | 1 | 2017-12-23T22:28:33.000Z | 2018-10-06T15:49:07.000Z | twittcher/__init__.py | Zulko/twittcher | 3ec1a0f9b558e9595cf0eb59d4fce638cd00009c | [
"MIT"
] | 8 | 2015-03-09T11:08:47.000Z | 2021-09-21T02:50:27.000Z | """ twittcher/__init__.py """
__all__ = ["PageWatcher", "UserWatcher", "SearchWatcher",
"Tweet", "TweetSender"]
from .twittcher import (PageWatcher, UserWatcher, SearchWatcher,
Tweet, TweetSender)
from .version import __version__
| 26.2 | 64 | 0.667939 | 21 | 262 | 7.761905 | 0.571429 | 0.269939 | 0.429448 | 0.490798 | 0.674847 | 0.674847 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206107 | 262 | 9 | 65 | 29.111111 | 0.783654 | 0.080153 | 0 | 0 | 0 | 0 | 0.218884 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
94354d60aacd59849b4e72d64fc390f0ac8aace7 | 6,731 | py | Python | actionplan/permissions.py | hisham2k9/IMS-and-CAPA | 9f70988a6411c72ab4f0cbc818b84db58a28076f | [
"MIT"
] | null | null | null | actionplan/permissions.py | hisham2k9/IMS-and-CAPA | 9f70988a6411c72ab4f0cbc818b84db58a28076f | [
"MIT"
] | 15 | 2021-03-19T03:43:56.000Z | 2022-03-12T00:30:55.000Z | actionplan/permissions.py | hisham2k9/IMS-and-CAPA | 9f70988a6411c72ab4f0cbc818b84db58a28076f | [
"MIT"
] | null | null | null | from django.contrib.auth.models import User, Group
from .models import actionplanmodel
from django.shortcuts import render, HttpResponseRedirect,get_object_or_404, redirect
class permissions():
def edit_perm(self,request, pk=0):
obj=get_object_or_404(actionplanmodel,pk=pk)
switch1=getattr(obj, 'task_assign_switch')
switch2=getattr(obj, 'status_change_switch')
switch3=getattr(obj, 'completion_switch')
switch4=getattr(obj, 'verified_switch')
taskactor=getattr(obj, 'task_assigned_to')
taskmaker=getattr(obj, 'task_assign_username')
if request.user==taskmaker:
if switch1==False and switch2==False:
return True
if request.user==taskactor:
if switch2==False and switch3==False:
return True
elif switch3==False and switch4==False:
return True
else:
return False
if request.user.groups.filter(name='Super_Validators').exists():
if switch1==False and switch2==False:
return True
elif switch3==True and switch4==False:
return True
else:
return False
if request.user.groups.filter(name='admins').exists():
return True
def delete_perm(self,request, pk=0):
obj=get_object_or_404(actionplanmodel,pk=pk)
switch1=getattr(obj, 'task_assign_switch')
switch2=getattr(obj, 'status_change_switch')
switch3=getattr(obj, 'completion_switch')
switch4=getattr(obj, 'verified_switch')
taskactor=getattr(obj, 'task_assigned_to')
taskmaker=getattr(obj, 'task_assign_username')
if request.user.groups.filter(name='Super_Validators').exists():
return True
if request.user.groups.filter(name='admins').exists():
return True
def create_perm(self,request):
if request.user.groups.filter(name='Super_Validators').exists():
return True
if request.user.groups.filter(name='admins').exists():
return True
#edit option
    #if task assigned switch is true and user = investigator return true
    #if completion is true and user = supervalidator return true
#
# obj=get_object_or_404(actionplanmodel,pk=pk)
# switch1=getattr(obj, 'submit_confirm_switch')
# switch2=getattr(obj, 'assign_qa_comments_confirm_switch')
# switch3=getattr(obj, 'investigation_confirm_switch')
# switch4=getattr(obj, 'validation_confirm_switch')
# switch5=getattr(obj, 'closure_confirm_switch')
# investigator=getattr(obj, 'qa_assign_to')
# submitter=getattr(obj, 'submission_confirm_user')
# def post(self, request):
# #assigns request.post to imssubmission form
# ##this logic verifies if saved form has come for edit or new form.
# ptform,formset=resources.editornew(request)
# #submit or save
# if 'submission' in request.POST:
# if ptform.is_valid() and formset.is_valid():
# #dealing with pt form
# pt_instance=ptform.save()
# pk=pt_instance.id
# formsetsave=resources.validateformset(formset,imssubmissionfiles , pt_instance)
# else:
# context={}
# detailform=imssubmissiondetailsform(request.POST)
# context['ptform']=ptform
# context['detailform']=detailform
# context['formset']=formset
# template='ims/imssubmission.html'
# return render(request, template, context)
# instance=get_object_or_404(imsmodel, pk=pk)
# detailform=imssubmissiondetailsform(request.POST, instance=instance)
# if detailform.is_valid():
# detail_instance=detailform.save(commit=False)
# detail_instance.submit_confirm_switch=True
# detail_instance.report_to='quality'
# detail_instance.submission_confirm_user=request.user
# detail_instance.submission_update_timestamp=datetime.datetime.now()
# detail_instance.save()
# messages.info(request, 'Your form is successfully submitted')
# else:
# context={}
# detailform=imssubmissiondetailsform(request.POST)
# context['ptform']=ptform
# context['detailform']=detailform
# context['formset']=formset
# template='ims/imssubmission.html'
# return render(request, template, context)
# # context=resources.imslistcontext(request)
# return HttpResponseRedirect('imsview')
# elif 'save' in request.POST:
# if ptform.is_valid() and formset.is_valid():
# pt_instance=ptform.save()
# pk=pt_instance.id
# formsetsave=resources.validateformset(formset,imssubmissionfiles , pt_instance)
# else:
# context={}
# detailform=imssubmissiondetailsform(request.POST)
# context['ptform']=ptform
# context['detailform']=detailform
# context['formset']=formset
# template='ims/imssubmission.html'
# return render(request, template, context)
# instance=get_object_or_404(imsmodel, pk=pk)
# detailform=imssubmissiondetailsform(request.POST, instance=instance)
# if detailform.is_valid():
# detail_instance=detailform.save(commit=False)
# detail_instance.submit_confirm_switch=False
# detail_instance.report_to='quality'
# detail_instance.submission_confirm_user=request.user
# detail_instance.submission_update_timestamp=datetime.datetime.now()
# detail_instance.save()
# messages.info(request,'Your form is saved')
# else:
# context={}
# detailform=imssubmissiondetailsform(request.POST)
# context['ptform']=ptform
# context['detailform']=detailform
# context['formset']=formset
# template='ims/imssubmission.html'
# return render(request, template, context)
# return HttpResponseRedirect('imsview') | 42.06875 | 97 | 0.584014 | 634 | 6,731 | 6.045741 | 0.197161 | 0.04957 | 0.027133 | 0.021915 | 0.752674 | 0.73963 | 0.73963 | 0.73963 | 0.71928 | 0.713801 | 0 | 0.009407 | 0.320903 | 6,731 | 160 | 98 | 42.06875 | 0.82914 | 0.570049 | 0 | 0.75 | 0 | 0 | 0.099073 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.0625 | 0 | 0.395833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
94471689b8d33fe2180b67231dbf498354f6d42e | 127 | py | Python | project_euler/solutions/problem_76.py | cryvate/project-euler | 6ed13880d7916c34554559f5f71662a863735eda | [
"MIT"
] | null | null | null | project_euler/solutions/problem_76.py | cryvate/project-euler | 6ed13880d7916c34554559f5f71662a863735eda | [
"MIT"
] | 9 | 2017-02-20T23:41:40.000Z | 2017-04-16T15:36:54.000Z | project_euler/solutions/problem_76.py | cryvate/project-euler | 6ed13880d7916c34554559f5f71662a863735eda | [
"MIT"
] | null | null | null | from ..library.combinatorics.partitions import partitions
def solve(index: int=100) -> int:
return partitions(index) - 1
| 21.166667 | 57 | 0.740157 | 16 | 127 | 5.875 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.149606 | 127 | 5 | 58 | 25.4 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
846cf325b9970f58dd748c9b5f33c63a2326ce7b | 41 | py | Python | wef/items/api/serializers/__init__.py | deadlylaid/study_alone | a024363ed1ab06fbb21a9b5da6a04eda9d7dfb35 | [
"MIT"
] | 6 | 2016-08-15T07:23:47.000Z | 2018-08-11T12:38:47.000Z | wef/items/api/serializers/__init__.py | deadlylaid/book_connect | a024363ed1ab06fbb21a9b5da6a04eda9d7dfb35 | [
"MIT"
] | 24 | 2016-08-05T06:30:11.000Z | 2022-03-11T23:20:18.000Z | wef/items/api/serializers/__init__.py | deadlylaid/study_alone | a024363ed1ab06fbb21a9b5da6a04eda9d7dfb35 | [
"MIT"
] | null | null | null | from .booklist import BookListSerializer
| 20.5 | 40 | 0.878049 | 4 | 41 | 9 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 41 | 1 | 41 | 41 | 0.972973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
849daef4bf1366cf4e69a81cf1c289d244244c61 | 26 | py | Python | cadence/constraints/__init__.py | tomaarsen/cadence | c6878c6224aac6c26efc9dbd59a1de551df85e8f | [
"MIT"
] | 12 | 2021-03-29T17:41:26.000Z | 2022-02-03T07:30:05.000Z | cadence/constraints/__init__.py | tomaarsen/cadence | c6878c6224aac6c26efc9dbd59a1de551df85e8f | [
"MIT"
] | 1 | 2021-03-30T11:12:44.000Z | 2021-03-30T14:56:50.000Z | cadence/constraints/__init__.py | tomaarsen/cadence | c6878c6224aac6c26efc9dbd59a1de551df85e8f | [
"MIT"
] | 3 | 2021-03-29T18:52:27.000Z | 2022-01-15T06:50:21.000Z | from .constraints import * | 26 | 26 | 0.807692 | 3 | 26 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.913043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
84b56c7d51814f9819f301abfdf833da191afe0d | 156 | py | Python | vehicle/vehicle/utils/__init__.py | not-decided-yet/parky-server | 5a2807d1cfeb7cd9d1602b429407589c847d3608 | [
"MIT"
] | null | null | null | vehicle/vehicle/utils/__init__.py | not-decided-yet/parky-server | 5a2807d1cfeb7cd9d1602b429407589c847d3608 | [
"MIT"
] | null | null | null | vehicle/vehicle/utils/__init__.py | not-decided-yet/parky-server | 5a2807d1cfeb7cd9d1602b429407589c847d3608 | [
"MIT"
] | null | null | null | from .common import Singleton, get_logger
from .crypto import rsa_decrypt, rsa_encrypt
__all__ = ["get_logger", "Singleton", "rsa_decrypt", "rsa_encrypt"]
| 31.2 | 67 | 0.775641 | 21 | 156 | 5.285714 | 0.52381 | 0.162162 | 0.234234 | 0.36036 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108974 | 156 | 4 | 68 | 39 | 0.798561 | 0 | 0 | 0 | 0 | 0 | 0.262821 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
84bfb61caf7645e6fbb7c19644c7c04cd9c1163f | 3,326 | py | Python | tests/test_images.py | kylape/ocdeployer | 0c4af12345a0c3eeb3e5c90d7962e77f9def4d10 | [
"MIT"
] | 13 | 2018-10-16T12:47:26.000Z | 2021-01-27T19:31:28.000Z | tests/test_images.py | kylape/ocdeployer | 0c4af12345a0c3eeb3e5c90d7962e77f9def4d10 | [
"MIT"
] | 14 | 2018-11-09T09:40:05.000Z | 2021-04-22T19:56:21.000Z | tests/test_images.py | kylape/ocdeployer | 0c4af12345a0c3eeb3e5c90d7962e77f9def4d10 | [
"MIT"
] | 16 | 2018-11-08T18:08:49.000Z | 2022-03-27T03:46:17.000Z | import pytest
from ocdeployer.images import ImageImporter, import_images
@pytest.fixture
def mock_oc(mocker):
_mock_oc = mocker.patch("ocdeployer.images.oc")
mocker.patch("ocdeployer.images.get_json", return_value={})
yield _mock_oc
def _check_oc_calls(mocker, mock_oc):
assert mock_oc.call_count == 2
calls = [
mocker.call(
"import-image",
"image1:tag",
"--from=docker.url/image1:sometag",
"--confirm",
"--scheduled=True",
_reraise=True,
),
mocker.call(
"import-image",
"image2:tag",
"--from=docker.url/image2:sometag",
"--confirm",
"--scheduled=True",
_reraise=True,
),
]
mock_oc.assert_has_calls(calls)
def test_images_short_style_syntax(mocker, mock_oc):
config_content = {
"images": [
{"image1:tag": "docker.url/image1:sometag"},
{"image2:tag": "docker.url/image2:sometag"},
]
}
ImageImporter.imported_istags = []
import_images(config_content, [])
_check_oc_calls(mocker, mock_oc)
def test_images_long_style_syntax(mocker, mock_oc):
config_content = {
"images": [
{"istag": "image1:tag", "from": "docker.url/image1:sometag"},
{"istag": "image2:tag", "from": "docker.url/image2:sometag"},
]
}
ImageImporter.imported_istags = []
import_images(config_content, [])
_check_oc_calls(mocker, mock_oc)
def test_images_old_style_syntax(mocker, mock_oc):
config_content = {
"images": {
"image1:tag": "docker.url/image1:sometag",
"image2:tag": "docker.url/image2:sometag",
}
}
ImageImporter.imported_istags = []
import_images(config_content, [])
_check_oc_calls(mocker, mock_oc)
def test_images_mixed_style_syntax(mocker, mock_oc):
config_content = {
"images": [
{"image1:tag": "docker.url/image1:sometag"},
{"istag": "image2:tag", "from": "docker.url/image2:sometag"},
]
}
ImageImporter.imported_istags = []
import_images(config_content, [])
_check_oc_calls(mocker, mock_oc)
def test_images_conditional_images(mocker, mock_oc):
config_content = {
"images": [
{"istag": "image1:tag", "from": "docker.url/image1:sometag", "envs": ["qa", "prod"]},
{"istag": "image2:tag", "from": "docker.url/image2:sometag"},
]
}
ImageImporter.imported_istags = []
import_images(config_content, ["prod"])
_check_oc_calls(mocker, mock_oc)
def test_images_conditional_ignore_image(mocker, mock_oc):
config_content = {
"images": [
{"istag": "image1:tag", "from": "docker.url/image1:sometag", "envs": ["qa", "prod"]},
{"istag": "image2:tag", "from": "docker.url/image2:sometag"},
]
}
ImageImporter.imported_istags = []
import_images(config_content, ["foo"])
assert mock_oc.call_count == 1
calls = [
mocker.call(
"import-image",
"image2:tag",
"--from=docker.url/image2:sometag",
"--confirm",
"--scheduled=True",
_reraise=True,
)
]
mock_oc.assert_has_calls(calls)
| 25.984375 | 97 | 0.580277 | 351 | 3,326 | 5.219373 | 0.153846 | 0.062227 | 0.085153 | 0.087336 | 0.900109 | 0.828603 | 0.798581 | 0.779476 | 0.773472 | 0.773472 | 0 | 0.01318 | 0.269994 | 3,326 | 127 | 98 | 26.188976 | 0.741351 | 0 | 0 | 0.59596 | 0 | 0 | 0.249248 | 0.126879 | 0 | 0 | 0 | 0 | 0.040404 | 1 | 0.080808 | false | 0 | 0.171717 | 0 | 0.252525 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ca0df06c08e56c7fb874fd1254cd06770ab9a219 | 315 | py | Python | novice/python-unit-testing/code/test_something.py | Southampton-RSG/swc-template | cf2d60db13a94d45c73fe0437f0c02770390e052 | [
"CC-BY-4.0"
] | null | null | null | novice/python-unit-testing/code/test_something.py | Southampton-RSG/swc-template | cf2d60db13a94d45c73fe0437f0c02770390e052 | [
"CC-BY-4.0"
] | null | null | null | novice/python-unit-testing/code/test_something.py | Southampton-RSG/swc-template | cf2d60db13a94d45c73fe0437f0c02770390e052 | [
"CC-BY-4.0"
] | null | null | null | from something import something
def test_empty():
assert something([]) == []
def test_single_value():
assert something(['a']) == []
def test_two_values():
assert something(['a', 'b']) == [('a', 'b')]
def test_three_values():
assert something(['a', 'b', 'c']) == [('a', 'b'), ('a', 'c'), ('b', 'c')]
| 22.5 | 76 | 0.549206 | 41 | 315 | 4.04878 | 0.365854 | 0.168675 | 0.289157 | 0.26506 | 0.277108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168254 | 315 | 13 | 77 | 24.230769 | 0.633588 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 0 | 0 | 0 | 0 | 0 | 0.444444 | 1 | 0.444444 | true | 0 | 0.111111 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
04acded83b759a4e35b60d1aae8e3c5e9d1f62f2 | 2,680 | py | Python | tests/test_datadict.py | edd313/ndicts | c464ae7d78c03f08d989fb5cecfbbe3d9b41da8a | [
"MIT"
] | 3 | 2022-03-08T12:40:09.000Z | 2022-03-09T06:47:11.000Z | tests/test_datadict.py | edd313/ndicts | c464ae7d78c03f08d989fb5cecfbbe3d9b41da8a | [
"MIT"
] | null | null | null | tests/test_datadict.py | edd313/ndicts | c464ae7d78c03f08d989fb5cecfbbe3d9b41da8a | [
"MIT"
] | null | null | null | """Tests for the DataDict class"""
from ndicts.ndicts import Arithmetics, DataDict, NestedDict
def test_inheritance():
assert isinstance(DataDict(), (NestedDict, Arithmetics))
def test_arithmetics():
iterables = ["ab", "ab"]
v1, v2 = 1, 2
dd1 = DataDict.from_product(*iterables, value=v1)
dd2 = DataDict.from_product(*iterables, value=v2)
assert dd1 + dd2 == DataDict.from_product(*iterables, value=v1 + v2)
assert dd2 + dd1 == DataDict.from_product(*iterables, value=v1 + v2)
assert dd1 - dd2 == DataDict.from_product(*iterables, value=v1 - v2)
assert dd2 - dd1 == DataDict.from_product(*iterables, value=v2 - v1)
assert dd1 * dd2 == DataDict.from_product(*iterables, value=v1 * v2)
assert dd2 * dd1 == DataDict.from_product(*iterables, value=v1 * v2)
assert dd1 / dd2 == DataDict.from_product(*iterables, value=v1 / v2)
assert dd2 / dd1 == DataDict.from_product(*iterables, value=v2 / v1)
assert dd1**dd2 == DataDict.from_product(*iterables, value=v1**v2)
assert dd2**dd1 == DataDict.from_product(*iterables, value=v2**v1)
assert dd1 // dd2 == DataDict.from_product(*iterables, value=v1 // v2)
assert dd2 // dd1 == DataDict.from_product(*iterables, value=v2 // v1)
assert dd1 % dd2 == DataDict.from_product(*iterables, value=v1 % v2)
assert dd2 % dd1 == DataDict.from_product(*iterables, value=v2 % v1)
def test_arithmetics_extract():
"""Extract a DataDict, and perform an operation back with the original one"""
dd = DataDict.from_product("ab", "ab", value=2)
dd_extract = dd.extract["", "b"]
assert dd - dd_extract == DataDict({"a": {"a": 2, "b": 0}, "b": {"a": 2, "b": 0}})
dd = DataDict.from_product("ab", "ab", value=2)
dd_extract = dd.extract["a"]
assert dd * dd_extract == DataDict({"a": {"a": 4, "b": 4}, "b": {"a": 2, "b": 2}})
def test_apply():
dd = DataDict.from_product("ab", "ab", value=1)
assert dd.apply(lambda x: 2 * x + 1) == DataDict.from_product("ab", "ab", value=3)
dd.apply(lambda x: 2 * x + 1, inplace=True)
assert dd == DataDict.from_product("ab", "ab", value=3)
def test_reduce():
dd = DataDict.from_product("ab", "ab", value=1)
assert dd.reduce(lambda x, y: x + y) == sum(dd.values())
assert dd.reduce(lambda x, y: x + y, 3) == sum(dd.values()) + 3
def test_total():
dd = DataDict.from_product("ab", "ab", value=1)
assert dd.total() == 4
def test_mean():
dd = DataDict.from_product("ab", "ab", value=1)
assert dd.mean() == 1
def test_std():
dd = DataDict.from_product("ab", "ab", value=1)
assert dd.std() == 0
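# The `from_product` constructor used throughout these tests builds a nested
# dict keyed by the Cartesian product of the given iterables. A minimal
# stand-alone sketch of that behavior (not the ndicts implementation; the
# helper name is reused only for illustration):

```python
from itertools import product

def from_product(*iterables, value):
    """Build a nested dict whose key paths are the Cartesian product
    of the given iterables, with every leaf set to the same value."""
    nd = {}
    for keys in product(*iterables):
        node = nd
        for k in keys[:-1]:
            node = node.setdefault(k, {})
        node[keys[-1]] = value
    return nd
```

# With this sketch, from_product("ab", "ab", value=1) yields
# {"a": {"a": 1, "b": 1}, "b": {"a": 1, "b": 1}}, matching the shape
# the assertions above compare against.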
| 38.84058 | 87 | 0.623134 | 380 | 2,680 | 4.294737 | 0.136842 | 0.183824 | 0.291054 | 0.27451 | 0.760417 | 0.760417 | 0.738358 | 0.625613 | 0.601103 | 0.601103 | 0 | 0.042293 | 0.20597 | 2,680 | 68 | 88 | 39.411765 | 0.724624 | 0.037313 | 0 | 0.148936 | 0 | 0 | 0.021591 | 0 | 0 | 0 | 0 | 0 | 0.510638 | 1 | 0.170213 | false | 0 | 0.021277 | 0 | 0.191489 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b6d85305f3c1739d07197cb289602a98f44893b1 | 9,427 | py | Python | blender_models/generate_animation_frames.py | ccai1/VisualTango | 798390f78a9e2d4e1009425a17253849ebd65e3d | [
"MIT"
] | null | null | null | blender_models/generate_animation_frames.py | ccai1/VisualTango | 798390f78a9e2d4e1009425a17253849ebd65e3d | [
"MIT"
] | null | null | null | blender_models/generate_animation_frames.py | ccai1/VisualTango | 798390f78a9e2d4e1009425a17253849ebd65e3d | [
"MIT"
] | null | null | null | ##########################################
# This is a Blender script.
# To use it, copy this text into Blender's text editor and run it there.
# It generates all the animation frames by applying the stored
# pose-library poses.
##########################################
import bpy
scene = bpy.context.scene
# get current frame using scene.frame_current
# choose whole body bones
bpy.ops.pose.select_all(action='SELECT')
index = 0
# standing
scene.frame_set(index)
bpy.ops.poselib.apply_pose(pose_index=1)
bpy.ops.anim.keyframe_insert_menu(type='WholeCharacter')
# direction
for direction in ["north", "northwest", "northeast"]:
# height
for height in ["high", "low"]:
# weighted foot
for weight in ["left", "right"]:
# unweighted foot
for unweighted in ["collected", "crossed-forward", "forward", "backward", "in-air-forward", "in-air-backward", "slide-out-side", "wrapped-around-front"]:
# leaning
for leaning in ["neutral", "forward", "backward", "toward-weighted", "toward-unweighted"]:
index += 1
bpy.ops.pose.select_all(action='SELECT')
scene.frame_set(index)
# apply the initial standing pose to whole body
bpy.ops.poselib.apply_pose(pose_index=1)
# apply hand pose
bpy.ops.poselib.apply_pose(pose_index=11)
# apply direction
if direction == "north":
bpy.ops.poselib.apply_pose(pose_index=2)
elif direction == "northwest":
bpy.ops.poselib.apply_pose(pose_index=3)
else:
bpy.ops.poselib.apply_pose(pose_index=4)
# apply weighted foot with height feature
if weight == "left" and height == "high":
bpy.ops.poselib.apply_pose(pose_index=25)
elif weight == "right" and height == "high":
bpy.ops.poselib.apply_pose(pose_index=24)
elif weight == "left" and height == "low":
bpy.ops.poselib.apply_pose(pose_index=6)
else:
bpy.ops.poselib.apply_pose(pose_index=5)
# apply unweighted foot with height feature
if height == "high" and weight == "left":
if unweighted == "collected":
bpy.ops.poselib.apply_pose(pose_index=26)
elif unweighted == "crossed-forward":
bpy.ops.poselib.apply_pose(pose_index=28)
elif unweighted == "forward":
bpy.ops.poselib.apply_pose(pose_index=30)
elif unweighted == "backward":
bpy.ops.poselib.apply_pose(pose_index=32)
elif unweighted == "in-air-forward":
bpy.ops.poselib.apply_pose(pose_index=34)
elif unweighted == "in-air-backward":
bpy.ops.poselib.apply_pose(pose_index=36)
elif unweighted == "slide-out-side":
bpy.ops.poselib.apply_pose(pose_index=38)
else: # "wrapped-around-front"
bpy.ops.poselib.apply_pose(pose_index=40)
elif height == "high" and weight == "right":
if unweighted == "collected":
bpy.ops.poselib.apply_pose(pose_index=27)
elif unweighted == "crossed-forward":
bpy.ops.poselib.apply_pose(pose_index=29)
elif unweighted == "forward":
bpy.ops.poselib.apply_pose(pose_index=31)
elif unweighted == "backward":
bpy.ops.poselib.apply_pose(pose_index=33)
elif unweighted == "in-air-forward":
bpy.ops.poselib.apply_pose(pose_index=35)
elif unweighted == "in-air-backward":
bpy.ops.poselib.apply_pose(pose_index=37)
elif unweighted == "slide-out-side":
bpy.ops.poselib.apply_pose(pose_index=39)
else: # "wrapped-around-front"
bpy.ops.poselib.apply_pose(pose_index=41)
elif height == "low" and weight == "left":
if unweighted == "collected":
bpy.ops.poselib.apply_pose(pose_index=7)
elif unweighted == "crossed-forward":
bpy.ops.poselib.apply_pose(pose_index=9)
elif unweighted == "forward":
bpy.ops.poselib.apply_pose(pose_index=13)
elif unweighted == "backward":
bpy.ops.poselib.apply_pose(pose_index=14)
elif unweighted == "in-air-forward":
bpy.ops.poselib.apply_pose(pose_index=16)
elif unweighted == "in-air-backward":
bpy.ops.poselib.apply_pose(pose_index=18)
elif unweighted == "slide-out-side":
bpy.ops.poselib.apply_pose(pose_index=20)
else: # "wrapped-around-front"
bpy.ops.poselib.apply_pose(pose_index=22)
else: # height == "low" and weight == "right"
if unweighted == "collected":
bpy.ops.poselib.apply_pose(pose_index=8)
elif unweighted == "crossed-forward":
bpy.ops.poselib.apply_pose(pose_index=10)
elif unweighted == "forward":
bpy.ops.poselib.apply_pose(pose_index=12)
elif unweighted == "backward":
bpy.ops.poselib.apply_pose(pose_index=15)
elif unweighted == "in-air-forward":
bpy.ops.poselib.apply_pose(pose_index=17)
elif unweighted == "in-air-backward":
bpy.ops.poselib.apply_pose(pose_index=19)
elif unweighted == "slide-out-side":
bpy.ops.poselib.apply_pose(pose_index=21)
else: # "wrapped-around-front"
bpy.ops.poselib.apply_pose(pose_index=23)
# leaning
if height == "high":
if leaning == "neutral":
pass
elif leaning == "forward":
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=47)
else:
bpy.ops.poselib.apply_pose(pose_index=46)
elif leaning == "backward":
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=49)
else:
bpy.ops.poselib.apply_pose(pose_index=48)
elif leaning == "toward-weighted":
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=57)
else: # right
bpy.ops.poselib.apply_pose(pose_index=55)
else: # "toward-unweighted"
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=56)
else: # right
bpy.ops.poselib.apply_pose(pose_index=54)
else: # low
if leaning == "neutral":
pass
elif leaning == "forward":
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=43)
else: # right
bpy.ops.poselib.apply_pose(pose_index=42)
elif leaning == "backward":
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=45)
else: # right
bpy.ops.poselib.apply_pose(pose_index=44)
elif leaning == "toward-weighted":
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=53)
else: # right
bpy.ops.poselib.apply_pose(pose_index=51)
else: # "toward-unweighted"
if weight == "left":
bpy.ops.poselib.apply_pose(pose_index=52)
else: # right
bpy.ops.poselib.apply_pose(pose_index=50)
# insert into a new frame
bpy.ops.anim.keyframe_insert_menu(type='WholeCharacter')
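# The nested if/elif ladders above are a hand-coded lookup from
# (height, weighted foot, unweighted-foot position) to a pose index.
# The same mapping can be expressed as a table; a sketch of the
# unweighted-foot portion, with indices copied from the ladder above
# (the helper function is hypothetical, not part of the script):

```python
UNWEIGHTED_KINDS = ["collected", "crossed-forward", "forward", "backward",
                    "in-air-forward", "in-air-backward", "slide-out-side",
                    "wrapped-around-front"]

# Pose indices copied from the if/elif ladder, keyed by (height, weight)
UNWEIGHTED_POSE = {
    ("high", "left"):  [26, 28, 30, 32, 34, 36, 38, 40],
    ("high", "right"): [27, 29, 31, 33, 35, 37, 39, 41],
    ("low", "left"):   [7, 9, 13, 14, 16, 18, 20, 22],
    ("low", "right"):  [8, 10, 12, 15, 17, 19, 21, 23],
}

def unweighted_pose_index(height, weight, unweighted):
    """Look up the pose-library index for an unweighted-foot position."""
    return UNWEIGHTED_POSE[(height, weight)][UNWEIGHTED_KINDS.index(unweighted)]
```

# Inside the loop this would replace the whole ladder with a single call:
# bpy.ops.poselib.apply_pose(pose_index=unweighted_pose_index(height, weight, unweighted))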
| 51.513661 | 165 | 0.452 | 884 | 9,427 | 4.678733 | 0.159502 | 0.089942 | 0.182302 | 0.252418 | 0.779014 | 0.767892 | 0.767892 | 0.722921 | 0.651354 | 0.593327 | 0 | 0.02069 | 0.446271 | 9,427 | 182 | 166 | 51.796703 | 0.771648 | 0.069057 | 0 | 0.493151 | 1 | 0 | 0.084912 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.013699 | 0.006849 | 0 | 0.006849 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8e34588677450b05797aa70a8161041bb1a37f18 | 9,219 | py | Python | src/maximopyclient/OSResource.py | ibmmaximorestjsonapis/maximopyclient | 693b671e0c6ab6ba7098e5f27b0117461f02d00f | [
"Apache-2.0"
] | null | null | null | src/maximopyclient/OSResource.py | ibmmaximorestjsonapis/maximopyclient | 693b671e0c6ab6ba7098e5f27b0117461f02d00f | [
"Apache-2.0"
] | null | null | null | src/maximopyclient/OSResource.py | ibmmaximorestjsonapis/maximopyclient | 693b671e0c6ab6ba7098e5f27b0117461f02d00f | [
"Apache-2.0"
] | null | null | null | '''
Created on Jul 28, 2020
@author: AnamitraBhattacharyy
'''
class OSResource:
def __init__(self, name, conn):
self.name = name
self.conn = conn
self.ctx = "/os/"
self.next_page_uri = None
    def fetch_resource(self, uri, select_clause, stream=False):
        if stream:
            self.ctx = "/stream/"
        else:
            self.ctx = "/os/"
        params = {}
        if select_clause is not None:
            params.update(select_clause.params())
        if uri is None:
            uri = self.conn.url+self.ctx+self.name.lower()
        resp = self.conn.do_get(uri, params=params)
        if resp.status_code >= 400:
            raise Exception(resp.json()["Error"]["message"])
        return resp.json()
def fetch_all(self, where_clause=None, select_clause=None, uri=None, params=None, data_format="JSON", stream=False):
if stream:
self.ctx = "/stream/"
else:
self.ctx = "/os/"
if params is None:
params = {}
params["ignorecollectionref"] = "1"
if data_format== "CSV" or data_format == "XML":
params["_format"] = data_format
if where_clause is not None:
params.update(where_clause.params())
if select_clause is not None:
params.update(select_clause.params())
resp = None
if uri is not None:
resp = self.conn.do_get(uri, params=params)
else:
uri = self.conn.url+self.ctx+self.name.lower()
resp = self.conn.do_get(uri, params=params)
if resp.status_code>=400:
raise Exception(resp.json()["Error"]["message"])
if data_format.lower() == "json":
return resp.json()
else:
            return resp.text
def fetch_first_page(self, where_clause=None, uri=None, select_clause=None, page_size=1000, stable=True, stream=False):
if stream:
self.ctx = "/stream/"
else:
self.ctx = "/os/"
if stable == True and self.conn.session == False:
raise Exception("stable paging not supported without session")
params = {'oslc.pageSize': page_size,'ignorecollectionref': '1'}
if where_clause is not None:
params.update(where_clause.params())
if select_clause is not None:
params.update(select_clause.params())
if stable:
params['stablepaging'] = '1'
if uri is not None:
resp = self.conn.do_get(uri, params=params)
else:
uri = self.conn.url+self.ctx+self.name.lower()
resp = self.conn.do_get(uri, params=params)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
self.conn.session = resp.cookies['JSESSIONID']
resp_data = resp.json()
if 'nextPage' in resp_data['responseInfo']:
self.next_page_uri = resp_data['responseInfo']['nextPage']['href']
return resp_data
def has_next_page(self):
return self.next_page_uri is not None
def fetch_next_page(self):
if self.next_page_uri is not None:
resp = self.conn.do_get(self.next_page_uri)
if resp.status_code>=400:
raise Exception(resp.json()["Error"]["message"])
resp_data = resp.json()
self.next_page_uri = None
if 'nextPage' in resp_data['responseInfo']:
self.next_page_uri = resp_data['responseInfo']['nextPage']['href']
return resp_data
else:
raise Exception("no next page")
def create(self, json_data, uri=None, select_clause=None):
if uri is None:
uri = self.conn.url+self.ctx+self.name.lower()
headers = {}
if select_clause is not None:
headers["properties"] = select_clause.to_string()
resp = self.conn.do_post(uri, params={}, headers=headers, data=json_data)
if resp.status_code>=400:
raise Exception(resp.json()["Error"]["message"])
if select_clause is not None:
return resp.json()
else:
return resp.headers['location']
def replace(self, json_data, uri, select_clause=None):
if uri is None:
uri = json_data['href']
if uri is None:
raise Exception("missing uri")
headers={}
if select_clause is not None:
headers['properties'] = select_clause.to_string()
headers['x-method-override'] = 'PATCH'
resp = self.conn.do_post(uri, params={}, headers=headers, data=json_data)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
return resp.json()
def merge(self, json_data, uri=None, select_clause=None):
if uri is None:
uri = json_data['href']
if uri is None:
raise Exception("missing uri")
headers={}
if select_clause is not None:
headers['properties'] = select_clause.to_string()
headers['x-method-override']='PATCH'
headers['patchtype'] = 'MERGE'
headers['content-type'] = 'application/json'
resp = self.conn.do_post(uri, params={}, headers=headers, data=json_data)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
elif resp.status_code == 200:
return resp.json()
def sync(self, json_data, uri=None, select_clause=None):
if uri is None:
uri = self.conn.url+"/os/"+self.name.lower()
headers = {'x-method-override': 'SYNC'}
if select_clause is not None:
headers['properties'] = select_clause.to_string()
resp = self.conn.do_post(uri, params={}, headers=headers, data=json_data)
if resp.status_code>=400:
raise Exception(resp.json()["Error"]["message"])
return resp.json()
def delete(self, uri):
if uri is None:
raise Exception("missing uri")
headers = {'x-method-override': 'DELETE'}
resp = self.conn.do_post(uri, params={}, headers=headers)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
return resp
def bulk(self, json_data, uri=None, select_clause=None):
if uri is None:
raise Exception("missing uri")
headers = {}
if select_clause is not None:
headers['properties'] = select_clause.to_string()
headers['x-method-override'] = 'BULK'
resp = self.conn.do_post(uri, params={}, headers=headers, data=json_data)
if resp.status_code>=400:
raise Exception(resp.json()["Error"]["message"])
return resp.json()
def sys_get_action(self, action_name, uri=None, params=None):
if uri is None:
uri = self.conn.url+"/os/"+self.name.lower()
        headers = {}
        if params is None:
            params = {}
        params['action'] = action_name
resp = self.conn.do_get(uri, params=params, headers=headers)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
        return resp.text
def sys_post_action(self, action_name, data=None, uri=None, params=None):
if uri is None:
uri = self.conn.url+"/os/"+self.name.lower()
        headers = {}
        if params is None:
            params = {}
        params['action'] = action_name
resp = self.conn.do_post(uri, params=params, headers=headers, data=data)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
        return resp.text
def invoke_method(self, action_name, data=None, uri=None, params=None):
if uri is None:
# uri = self.conn.url+"/os/"+self.name.lower()
uri = data['href']
headers = {'x-method-override': 'PATCH', 'patchtype': 'MERGE', 'content-type': 'application/json'}
        action = 'wsmethod:'+str(action_name)
        if params is None:
            params = {}
        params['action'] = action
resp = self.conn.do_post(uri, params=params, headers=headers, data=data)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
        try:
            return resp.json()
        except ValueError:
            return resp
def new(self, uri=None, select_clause=None):
if uri is None:
uri = self.conn.url+"/os/"+self.name.lower()
headers = {}
if select_clause is not None:
headers['properties'] = select_clause.to_string()
params = {'action': "new"}
resp = self.conn.do_get(uri, params=params, headers=headers)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
return resp.json()
def duplicate(self, uri, select_clause=None):
if uri is None:
raise Exception("missing uri")
headers = {}
if select_clause is not None:
headers['properties'] = select_clause.to_string()
params = {'action': "duplicate"}
resp = self.conn.do_get(uri, params=params, headers=headers)
if resp.status_code >= 400:
raise Exception(resp.json()["Error"]["message"])
return resp.json()
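# Every method above repeats the same guard: raise with the server's error
# message when the HTTP status is 400 or higher, otherwise decode the body.
# That pattern, isolated with a stand-in response object (both names are
# hypothetical and for illustration only):

```python
class FakeResponse:
    """Minimal stand-in for a requests.Response, for illustration."""
    def __init__(self, status_code, payload):
        self.status_code = status_code
        self._payload = payload

    def json(self):
        return self._payload

def check(resp):
    # Mirrors the error guard repeated throughout OSResource
    if resp.status_code >= 400:
        raise Exception(resp.json()["Error"]["message"])
    return resp.json()
```

# check(FakeResponse(200, {"member": []})) returns the decoded payload,
# while a 4xx/5xx response raises with the Maximo error message.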
| 37.024096 | 123 | 0.57913 | 1,143 | 9,219 | 4.545057 | 0.094488 | 0.071607 | 0.032916 | 0.043118 | 0.779981 | 0.756304 | 0.737055 | 0.724543 | 0.723388 | 0.704331 | 0 | 0.008856 | 0.289619 | 9,219 | 248 | 124 | 37.173387 | 0.784395 | 0.010847 | 0 | 0.745192 | 0 | 0 | 0.096038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081731 | false | 0 | 0 | 0.004808 | 0.173077 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6d1a083d6dcf5cbda6cf99f3ecc11ebc3e6ed90e | 17,684 | py | Python | slrfield/cpf/cpf_download.py | lcx366/SLRfield | 7eb70efa30043406b105b78e2612689326c77f7c | [
"MIT"
] | 2 | 2020-04-21T08:57:49.000Z | 2020-08-19T11:34:24.000Z | slrfield/cpf/cpf_download.py | lcx366/SLRfield | 7eb70efa30043406b105b78e2612689326c77f7c | [
"MIT"
] | 1 | 2021-03-14T00:56:53.000Z | 2021-06-03T02:03:10.000Z | slrfield/cpf/cpf_download.py | lcx366/SLRfield | 7eb70efa30043406b105b78e2612689326c77f7c | [
"MIT"
] | 2 | 2020-08-19T11:34:27.000Z | 2022-03-11T02:42:06.000Z | from os import system,path,makedirs
from pathlib import Path
from ftplib import FTP
import requests
from bs4 import BeautifulSoup
from astropy.time import Time
from ..utils.try_download import tqdm_ftp,tqdm_request
def download_bycurrent(source,satnames,append=False):
"""
Download the latest CPF ephemeris files at the current moment.
Usage:
server,dir_cpf_from, dir_cpf_to,cpf_files = download_bycurrent('CDDIS')
server,dir_cpf_from, dir_cpf_to,cpf_files = download_bycurrent('EDC')
server,dir_cpf_from, dir_cpf_to,cpf_files = download_bycurrent('CDDIS','lageos1')
server,dir_cpf_from, dir_cpf_to,cpf_files = download_bycurrent('EDC',['ajisai','lageos1','hy2a'])
Inputs:
source -> [str] source for CPF ephemeris files. Currently, only 'CDDIS' and 'EDC' are available.
satnames -> [str, str list, or None] target name or list of target names. If None, then all feasible targets at the current moment will be downloaded.
Parameters:
append -> [Bool, default = False] If False, clear the data storage directory ahead of requesting CPF files. If True, then keep the data storage directory.
Outputs:
server -> [str] server for downloading CPF ephemeris files. Currently, only 'cddis.nasa.gov' and 'edc.dgfi.tum.de' are available.
dir_cpf_from -> [str] directory for storing CPF ephemeris files in remote server.
dir_cpf_to -> [str] user's local directory for storing CPF ephemeris files
cpf_files -> [str list] list of CPF ephemeris files
"""
cpf_files = []
date = Time.now().iso
dir_cpf_to = 'CPF/'+source+'/'+date[:10] + '/'
if not path.exists(dir_cpf_to):
makedirs(dir_cpf_to)
else:
if not append: system('rm -rf %s/*' % dir_cpf_to)
if source == 'CDDIS':
server = 'https://cddis.nasa.gov'
dir_cpf_from = '/archive/slr/cpf_predicts/current/'
cpf_files_dict,cpf_files_list = get_cpf_filelist(server,dir_cpf_from,'bycurrent')
elif source == 'EDC':
server = 'edc.dgfi.tum.de'
dir_cpf_from = '~/slr/cpf_predicts//current/'
ftp = FTP(server,timeout=200)
ftp.login()
ftp.cwd(dir_cpf_from)
cpf_files_list = ftp.nlst('-t','*cpf*') # list files containing 'cpf' from newest to oldest
ftp.quit()
ftp.close()
else:
raise Exception("Currently, for CPF prediction centers, only 'CDDIS' and 'EDC' are available.")
if satnames is None:
reduplicates = set([cpf_file.split('_')[0] for cpf_file in cpf_files_list]) # remove duplicates
elif type(satnames) is str:
reduplicates = [satnames]
elif type(satnames) is list:
reduplicates = satnames
else:
raise Exception('Type of satnames should be str or list.')
for cpf_file in cpf_files_list:
satname = cpf_file.split('_')[0]
if satname in reduplicates:
cpf_files.append(cpf_file)
reduplicates.remove(satname)
if reduplicates == []: break
return server,dir_cpf_from, dir_cpf_to,cpf_files
def download_bydate(source,date,satnames,append=False):
"""
Download the latest CPF ephemeris files before a specific time.
Usage:
server,dirs_cpf_from, dir_cpf_to,cpf_files = download_bydate('CDDIS','2007-06-01 11:30:00',['ajisai','lageos1','hy2a'])
server,dirs_cpf_from, dir_cpf_to,cpf_files = download_bydate('EDC','2017-12-20 05:30:00','hy2a')
Inputs:
source -> [str] source for CPF ephemeris files. Currently, only 'CDDIS' and 'EDC' are available.
date -> [str] 'iso-formatted' time, such as '2017-12-20 05:30:00'. It specifies a moment before which the latest CPF ephemeris files are downloaded.
satnames -> [str, str list] target name or list of target names.
Parameters:
append -> [Bool, default = False] If False, clear the data storage directory ahead of requesting CPF files. If True, then keep the data storage directory.
Outputs:
server -> [str] server for downloading CPF ephemeris files. Currently, only 'cddis.nasa.gov' and 'edc.dgfi.tum.de' are available.
dir_cpf_from -> [str] directory for storing CPF ephemeris files in remote server.
dir_cpf_to -> [str] user's local directory for storing CPF ephemeris files
cpf_files -> [str list] list of CPF ephemeris files
"""
if satnames is None:
raise Exception("satnames must be provided.")
elif type(satnames) is str:
reduplicates = [satnames]
elif type(satnames) is list:
reduplicates = satnames
else:
raise Exception('Type of satname should be str or list.')
dirs_cpf_from,cpf_files = [],[]
date_dir = date[:4]
date_str1 = Time(date).strftime('%y%m%d')
date_str2 = (Time(date)-7).strftime('%y%m%d') # ephemeris updates for some high-orbit satellites may take several days
date_str = Time(date).strftime('%Y%m%d%H%M%S')
dir_cpf_to = 'CPF/'+source+'/'+ date[:10] + '/'
if not path.exists(dir_cpf_to):
makedirs(dir_cpf_to)
else:
if not append: system('rm -rf %s/*' % dir_cpf_to)
if source == 'CDDIS':
server = 'https://cddis.nasa.gov'
for satname in reduplicates:
cpf_files_list_reduced = []
find_flag = False
dir_cpf_from = '/archive/slr/cpf_predicts/' + date_dir + '/' + satname + '/'
dirs_cpf_from.append(dir_cpf_from)
cpf_files_dict,cpf_files_list = get_cpf_filelist(server,dir_cpf_from,'bydate')
for cpf_file in cpf_files_list:
cpf_file_date = cpf_file.split('_')[2]
if date_str2 <= cpf_file_date <= date_str1: cpf_files_list_reduced.append(cpf_file)
if cpf_files_list_reduced:
for cpf_file in cpf_files_list_reduced:
# get the latest modification time for cpf files
modified_time = cpf_files_dict[cpf_file]
# modify the time format from '2020:09:20 05:30:11' to '20200920053011'
for chara in [':',' ']:
modified_time = modified_time.replace(chara,'')
if modified_time < date_str:
find_flag = True
cpf_files.append(cpf_file)
break
if not find_flag and date_str2[:2]>='05':
cpf_files_list_reduced = []
dirs_cpf_from.remove(dir_cpf_from)
dir_cpf_from = '/archive/slr/cpf_predicts/' + '20'+date_str2[:2] + '/' + satname + '/'
dirs_cpf_from.append(dir_cpf_from)
cpf_files_dict,cpf_files_list = get_cpf_filelist(server,dir_cpf_from,'bydate')
for cpf_file in cpf_files_list:
cpf_file_date = cpf_file.split('_')[2]
if date_str2 <= cpf_file_date <= date_str1: cpf_files_list_reduced.append(cpf_file)
for cpf_file in cpf_files_list_reduced:
# get the latest modification time for cpf files
modified_time = cpf_files_dict[cpf_file]
# modify the time format from '2020:09:20 05:30:11' to '20200920053011'
for chara in [':',' ']:
modified_time = modified_time.replace(chara,'')
if modified_time <= date_str:
cpf_files.append(cpf_file)
break
elif source == 'EDC':
server = 'edc.dgfi.tum.de'
ftp = FTP(server,timeout=200)
ftp.login()
for satname in reduplicates:
cpf_files_list_reduced = []
find_flag = False
dir_cpf_from = '~/slr/cpf_predicts//' + date_dir + '/' + satname + '/'
dirs_cpf_from.append(dir_cpf_from)
ftp.cwd(dir_cpf_from)
cpf_files_list = ftp.nlst('-t','*cpf*') # list files containing 'cpf' from newest to oldest
for cpf_file in cpf_files_list:
cpf_file_date = cpf_file.split('_')[2]
if date_str2 <= cpf_file_date <= date_str1: cpf_files_list_reduced.append(cpf_file)
if cpf_files_list_reduced:
for cpf_file in cpf_files_list_reduced:
# get the latest modification time for cpf files
modified_time = ftp.voidcmd('MDTM ' + cpf_file).split()[1]
if modified_time < date_str:
find_flag = True
cpf_files.append(cpf_file)
break
if not find_flag and date_str2[:2]>='05':
cpf_files_list_reduced = []
dirs_cpf_from.remove(dir_cpf_from)
dir_cpf_from = '~/slr/cpf_predicts//' + '20'+date_str2[:2] + '/' + satname + '/'
dirs_cpf_from.append(dir_cpf_from)
ftp.cwd(dir_cpf_from)
cpf_files_list = ftp.nlst('-t','*cpf*') # list files containing 'cpf' from newest to oldest
for cpf_file in cpf_files_list:
cpf_file_date = cpf_file.split('_')[2]
if date_str2 <= cpf_file_date <= date_str1: cpf_files_list_reduced.append(cpf_file)
for cpf_file in cpf_files_list_reduced:
# get the latest modification time for cpf files
modified_time = ftp.voidcmd('MDTM ' + cpf_file).split()[1]
if modified_time <= date_str:
cpf_files.append(cpf_file)
break
ftp.quit()
ftp.close()
else:
raise Exception("Currently, CPF predictions only from 'CDDIS' and 'EDC' are available.")
return server,dirs_cpf_from, dir_cpf_to,cpf_files
def cpf_download_prior(satnames = None,date = None,source = 'CDDIS',append=False):
"""
Download the latest CPF ephemeris files.
Usage:
dir_cpf_files = cpf_download()
dir_cpf_files = cpf_download(source = 'EDC')
dir_cpf_files = cpf_download('lageos1')
dir_cpf_files = cpf_download('ajisai','2007-06-01 11:30:00')
dir_cpf_files = cpf_download(['ajisai','lageos1','hy2a'],'2007-06-01 11:30:00','EDC')
Parameters:
satnames -> [str, str list, or None, default = None] target name or list of target names. If None, then all feasible targets at the current moment will be downloaded. In this case, 'date' must also be None.
date -> [str or None, default = None] 'iso-formatted' time, such as '2017-12-20 05:30:00'. It specifies a moment before which the latest CPF ephemeris files are downloaded. If None, then all feasible targets or targets in list at the current moment will be downloaded.
source -> [str, default = 'CDDIS'] source for CPF ephemeris files. Currently, only 'CDDIS' and 'EDC' are available.
append -> [Bool, default = False] If False, clear the data storage directory ahead of requesting CPF files. If True, then keep the data storage directory.
Outputs:
dir_cpf_files -> [str list] list of paths for CPF ephemeris files in user's local directory
missing_cpf_files -> [str list or None] if not None, it lists files that are not responsed from the server
Note: if 'date' is provided (i.e., not None), then 'satnames' must also be provided.
"""
if source == 'CDDIS': # Need to create an Earthdata login account at https://urs.earthdata.nasa.gov/
# Create a .netrc file in the home directory
home = str(Path.home())
if not path.exists(home+'/.netrc'):
            uid = input('Please input the Username for your EARTHDATA login account (which can be created at https://urs.earthdata.nasa.gov/): ')
passwd = input('Please input the Password: ')
netrc_file = open(home+'/.netrc','w')
netrc_file.write('machine urs.earthdata.nasa.gov login '+uid+' password '+passwd)
netrc_file.close()
dir_cpf_files,missing_cpf_files = [],[]
if date is None:
server,dir_cpf_from, dir_cpf_to,cpf_files = download_bycurrent(source,satnames,append)
if source == 'CDDIS':
for cpf_file in cpf_files:
url = server+dir_cpf_from+cpf_file
desc = 'Downloading {:s}'.format(cpf_file)
missing_cpf_file = tqdm_request(url,dir_cpf_to,cpf_file,desc)
if missing_cpf_file is not None: missing_cpf_files.append(missing_cpf_file)
dir_cpf_files.append(dir_cpf_to+cpf_file)
if source == 'EDC':
ftp = FTP(server,timeout=200)
ftp.login()
ftp.cwd(dir_cpf_from)
for cpf_file in cpf_files:
desc = 'Downloading {:s}'.format(cpf_file)
missing_cpf_file = tqdm_ftp(ftp,dir_cpf_to,cpf_file,desc)
if missing_cpf_file is not None: missing_cpf_files.append(missing_cpf_file)
dir_cpf_files.append(dir_cpf_to+cpf_file)
ftp.quit()
ftp.close()
else:
server,dirs_cpf_from, dir_cpf_to,cpf_files = download_bydate(source,date,satnames,append)
if source == 'CDDIS':
for dir_cpf_from,cpf_file in zip(dirs_cpf_from,cpf_files):
url = server+dir_cpf_from+cpf_file
desc = 'Downloading {:s}'.format(cpf_file)
missing_cpf_file = tqdm_request(url,dir_cpf_to,cpf_file,desc)
if missing_cpf_file is not None: missing_cpf_files.append(missing_cpf_file)
dir_cpf_files.append(dir_cpf_to+cpf_file)
if source == 'EDC':
ftp = FTP(server,timeout=200)
ftp.login()
for dir_cpf_from,cpf_file in zip(dirs_cpf_from,cpf_files):
ftp.cwd(dir_cpf_from)
desc = 'Downloading {:s}'.format(cpf_file)
missing_cpf_file = tqdm_ftp(ftp,dir_cpf_to,cpf_file,desc)
if missing_cpf_file is not None: missing_cpf_files.append(missing_cpf_file)
dir_cpf_files.append(dir_cpf_to+cpf_file)
ftp.quit()
ftp.close()
return dir_cpf_files,missing_cpf_files
def cpf_download(satnames = None,date = None,source = 'CDDIS',append=False):
"""
Download the latest CPF ephemeris files.
Usage:
dir_cpf_files = cpf_download()
dir_cpf_files = cpf_download(source = 'EDC')
dir_cpf_files = cpf_download('lageos1')
dir_cpf_files = cpf_download('ajisai','2007-06-01 11:30:00')
dir_cpf_files = cpf_download(['ajisai','lageos1','hy2a'],'2007-06-01 11:30:00','EDC')
Parameters:
satnames -> [str, str list, or None, default = None] target name or list of target names. If None, then all feasible targets at the current moment will be downloaded. In this case, 'date' must also be None.
date -> [str or None, default = None] 'iso-formatted' time, such as '2017-12-20 05:30:00'. It specifies a moment before which the latest CPF ephemeris files are downloaded. If None, then all feasible targets or targets in list at the current moment will be downloaded.
source -> [str, default = 'CDDIS'] source for CPF ephemeris files. Currently, only 'CDDIS' and 'EDC' are available.
append -> [Bool, default = False] If False, clear the data storage directory ahead of requesting CPF files. If True, then keep the data storage directory.
Outputs:
dir_cpf_files -> [str list] list of paths for CPF ephemeris files in user's local directory
Note: if 'date' is provided (i.e., not None), then 'satnames' must also be provided.
"""
cpf_files_downloaded, cpf_files_missed = cpf_download_prior(satnames,date,source,append)
if cpf_files_missed:
cpf_files_downloaded2, cpf_files_missed2 = cpf_download_prior(cpf_files_missed,append=True)
cpf_files_downloaded += cpf_files_downloaded2
return cpf_files_downloaded
def get_cpf_filelist(server,dir_cpf_from,mode):
"""
    Generate the CDDIS CPF file list sorted by release time from newest to oldest.
"""
res = requests.get(server + dir_cpf_from)
soup = BeautifulSoup(res.text, 'html.parser')
    # extract time information
time_info = soup.find_all('span')
if mode == 'bycurrent':
time_list = [ele.get_text().split(' ')[0] for ele in time_info][2:] # Remove two extra items
elif mode == 'bydate':
time_list = [ele.get_text().split(' ')[0] for ele in time_info]
n_time_list = len(time_list)
    # extract filename information
filename_info = soup.find_all('a')
cpf_files_list_unsort = [ele.get_text() for ele in filename_info if '_cpf_' in ele.get_text()]
n_cpf_files_list_unsort = len(cpf_files_list_unsort)
if n_time_list != n_cpf_files_list_unsort:
raise Exception('Timestamp and CPF files are not matched!')
cpf_files_dict = dict(zip(cpf_files_list_unsort,time_list))
    cpf_files_tuples = sorted(cpf_files_dict.items(), key=lambda x: x[1],reverse=True) # sort by release time, newest first
    cpf_files_list = [ele[0] for ele in cpf_files_tuples]
res.close()
return cpf_files_dict, cpf_files_list
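# The final sort in get_cpf_filelist orders files from newest to oldest by
# exploiting the fact that the 'YYYY:MM:DD hh:mm:ss' timestamps compare
# correctly as plain strings. The core of that step, isolated with made-up
# sample data:

```python
# Hypothetical filename -> release-timestamp mapping, newest entry last
cpf_files_dict = {
    "lageos1_cpf_200918_5001.sgf": "2020:09:18 04:00:00",
    "lageos1_cpf_200920_5001.sgf": "2020:09:20 05:30:11",
    "lageos1_cpf_200919_5001.sgf": "2020:09:19 05:00:00",
}

# Zero-padded fields make lexicographic order match chronological order,
# so reverse=True yields newest-first.
newest_first = [name for name, _ in
                sorted(cpf_files_dict.items(), key=lambda kv: kv[1], reverse=True)]
```

# newest_first starts with the 2020-09-20 file and ends with the 2020-09-18 one.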
| 47.924119 | 285 | 0.617451 | 2,382 | 17,684 | 4.351805 | 0.109152 | 0.084893 | 0.0328 | 0.021223 | 0.815165 | 0.787189 | 0.750048 | 0.735385 | 0.717828 | 0.701331 | 0 | 0.020222 | 0.286926 | 17,684 | 368 | 286 | 48.054348 | 0.801824 | 0.335558 | 0 | 0.663594 | 0 | 0.004608 | 0.088431 | 0.011919 | 0.009217 | 0 | 0 | 0 | 0 | 1 | 0.023041 | false | 0.009217 | 0.032258 | 0 | 0.078341 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6d285fb31d15b7c8feacb3f868aa927b5dc68f33 | 151 | py | Python | code/models/__init__.py | Bhaskers-Blu-Org1/knowledge-enabled-textual-entailment | be0741c94d734780536e549700509cf1aabcab66 | [
"Apache-2.0"
] | 42 | 2018-08-28T09:05:41.000Z | 2021-08-02T16:42:14.000Z | code/models/__init__.py | SocioProphet/knowledge-enabled-textual-entailment | be0741c94d734780536e549700509cf1aabcab66 | [
"Apache-2.0"
] | 6 | 2019-01-17T02:34:04.000Z | 2021-06-01T22:29:15.000Z | code/models/__init__.py | SocioProphet/knowledge-enabled-textual-entailment | be0741c94d734780536e549700509cf1aabcab66 | [
"Apache-2.0"
] | 13 | 2018-09-16T06:14:09.000Z | 2021-06-08T19:56:31.000Z | from .decomp_attn import *
from .simple_graph import *
from .graph_n_text import *
from .match_lstm import *
from .deiste import *
from .hbmp import *
| 21.571429 | 27 | 0.761589 | 23 | 151 | 4.782609 | 0.521739 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15894 | 151 | 6 | 28 | 25.166667 | 0.866142 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6d52a8525977285c73e0f9d7e5d31c1466902912 | 45 | py | Python | utils4ymc/Tools/__init__.py | Interesting6/utils4ymc | 1fdac1acbcb12dafd374906fc80f9eb518670448 | [
"MIT"
] | null | null | null | utils4ymc/Tools/__init__.py | Interesting6/utils4ymc | 1fdac1acbcb12dafd374906fc80f9eb518670448 | [
"MIT"
] | null | null | null | utils4ymc/Tools/__init__.py | Interesting6/utils4ymc | 1fdac1acbcb12dafd374906fc80f9eb518670448 | [
"MIT"
] | null | null | null | from .calculate import *
from .file import *
| 15 | 24 | 0.733333 | 6 | 45 | 5.5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177778 | 45 | 2 | 25 | 22.5 | 0.891892 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ede8d77213ff1de7ce0a843a46d75ba15fefc178 | 18 | py | Python | part23/mylib/general/__init__.py | ThomasAlbin/SpaceScienceTutorial | 9ca5c1340480a29a112dec91e075b7ee2eff38ed | [
"MIT"
] | 167 | 2020-04-21T21:04:14.000Z | 2022-03-29T15:07:52.000Z | part23/mylib/general/__init__.py | wellyington/SpaceScienceTutorial | 9ca5c1340480a29a112dec91e075b7ee2eff38ed | [
"MIT"
] | 11 | 2020-05-19T18:49:24.000Z | 2021-06-08T01:51:29.000Z | part23/mylib/general/__init__.py | wellyington/SpaceScienceTutorial | 9ca5c1340480a29a112dec91e075b7ee2eff38ed | [
"MIT"
] | 41 | 2020-05-03T06:13:17.000Z | 2022-02-12T17:32:51.000Z | from . import vec
| 9 | 17 | 0.722222 | 3 | 18 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 18 | 1 | 18 | 18 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6127d79168b51f49af9dd1a5debb9cfb4228fad4 | 32 | py | Python | wf_rdbms/__init__.py | WildflowerSchools/wf-rdbms-python | eb94019234dc1b4bb7d0675a5572f6aa3c9db945 | [
"MIT"
] | null | null | null | wf_rdbms/__init__.py | WildflowerSchools/wf-rdbms-python | eb94019234dc1b4bb7d0675a5572f6aa3c9db945 | [
"MIT"
] | null | null | null | wf_rdbms/__init__.py | WildflowerSchools/wf-rdbms-python | eb94019234dc1b4bb7d0675a5572f6aa3c9db945 | [
"MIT"
] | null | null | null | from wf_rdbms.database import *
| 16 | 31 | 0.8125 | 5 | 32 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b6402ce61f4365f9dcf090abf581cc2f0a3c7564 | 6,497 | py | Python | cropduster/migrations.borked/0001_initial.py | pbs/django-cropduster | de4bd375421c29bb80653a01aaf263f1a9e6e626 | [
"BSD-2-Clause"
] | null | null | null | cropduster/migrations.borked/0001_initial.py | pbs/django-cropduster | de4bd375421c29bb80653a01aaf263f1a9e6e626 | [
"BSD-2-Clause"
] | null | null | null | cropduster/migrations.borked/0001_initial.py | pbs/django-cropduster | de4bd375421c29bb80653a01aaf263f1a9e6e626 | [
"BSD-2-Clause"
] | null | null | null | # encoding: utf-8
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models


class Migration(SchemaMigration):

    def forwards(self, orm):
        # Adding model 'SizeSet'
        db.create_table('cropduster_sizeset', (
            ('slug', self.gf('django.db.models.fields.SlugField')(max_length=50, db_index=True)),
            ('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('name', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
        ))
        db.send_create_signal('cropduster', ['SizeSet'])

        # Adding model 'Size'
        db.create_table('cropduster_size', (
            ('name', self.gf('django.db.models.fields.CharField')(max_length=255, db_index=True)),
            ('aspect_ratio', self.gf('django.db.models.fields.FloatField')(default=1)),
            ('height', self.gf('django.db.models.fields.PositiveIntegerField')(null=True, blank=True)),
            ('width', self.gf('django.db.models.fields.PositiveIntegerField')(null=True, blank=True)),
            ('size_set', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['cropduster.SizeSet'])),
            ('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('auto_size', self.gf('django.db.models.fields.BooleanField')(default=False, blank=True)),
            ('slug', self.gf('django.db.models.fields.SlugField')(max_length=50, db_index=True)),
        ))
        db.send_create_signal('cropduster', ['Size'])

        # Adding model 'Crop'
        db.create_table('cropduster_crop', (
            ('image', self.gf('django.db.models.fields.related.ForeignKey')(related_name='images', to=orm['cropduster.Image'])),
            ('crop_h', self.gf('django.db.models.fields.PositiveIntegerField')(default=0, null=True, blank=True)),
            ('crop_x', self.gf('django.db.models.fields.PositiveIntegerField')(default=0, null=True, blank=True)),
            ('crop_w', self.gf('django.db.models.fields.PositiveIntegerField')(default=0, null=True, blank=True)),
            ('crop_y', self.gf('django.db.models.fields.PositiveIntegerField')(default=0, null=True, blank=True)),
            ('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('size', self.gf('django.db.models.fields.related.ForeignKey')(related_name='size', to=orm['cropduster.Size'])),
        ))
        db.send_create_signal('cropduster', ['Crop'])

        # Adding model 'Image'
        db.create_table('cropduster_image', (
            ('caption', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
            ('image', self.gf('django.db.models.fields.files.ImageField')(max_length=255, db_index=True)),
            ('attribution', self.gf('django.db.models.fields.CharField')(max_length=255, null=True, blank=True)),
            ('id', self.gf('django.db.models.fields.AutoField')(primary_key=True)),
            ('size_set', self.gf('django.db.models.fields.related.ForeignKey')(to=orm['cropduster.SizeSet'])),
        ))
        db.send_create_signal('cropduster', ['Image'])

    def backwards(self, orm):
        # Deleting model 'SizeSet'
        db.delete_table('cropduster_sizeset')

        # Deleting model 'Size'
        db.delete_table('cropduster_size')

        # Deleting model 'Crop'
        db.delete_table('cropduster_crop')

        # Deleting model 'Image'
        db.delete_table('cropduster_image')

    models = {
        'cropduster.crop': {
            'Meta': {'object_name': 'Crop'},
            'crop_h': ('django.db.models.fields.PositiveIntegerField', [], {'default': '0', 'null': 'True', 'blank': 'True'}),
            'crop_w': ('django.db.models.fields.PositiveIntegerField', [], {'default': '0', 'null': 'True', 'blank': 'True'}),
            'crop_x': ('django.db.models.fields.PositiveIntegerField', [], {'default': '0', 'null': 'True', 'blank': 'True'}),
            'crop_y': ('django.db.models.fields.PositiveIntegerField', [], {'default': '0', 'null': 'True', 'blank': 'True'}),
            'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'image': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'images'", 'to': "orm['cropduster.Image']"}),
            'size': ('django.db.models.fields.related.ForeignKey', [], {'related_name': "'size'", 'to': "orm['cropduster.Size']"})
        },
        'cropduster.image': {
            'Meta': {'object_name': 'Image'},
            'attribution': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
            'caption': ('django.db.models.fields.CharField', [], {'max_length': '255', 'null': 'True', 'blank': 'True'}),
            'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'image': ('django.db.models.fields.files.ImageField', [], {'max_length': '255', 'db_index': 'True'}),
            'size_set': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['cropduster.SizeSet']"})
        },
        'cropduster.size': {
            'Meta': {'object_name': 'Size'},
            'aspect_ratio': ('django.db.models.fields.FloatField', [], {'default': '1'}),
            'auto_size': ('django.db.models.fields.BooleanField', [], {'default': 'False', 'blank': 'True'}),
            'height': ('django.db.models.fields.PositiveIntegerField', [], {'null': 'True', 'blank': 'True'}),
            'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'name': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
            'size_set': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['cropduster.SizeSet']"}),
            'slug': ('django.db.models.fields.SlugField', [], {'max_length': '50', 'db_index': 'True'}),
            'width': ('django.db.models.fields.PositiveIntegerField', [], {'null': 'True', 'blank': 'True'})
        },
        'cropduster.sizeset': {
            'Meta': {'object_name': 'SizeSet'},
            'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
            'name': ('django.db.models.fields.CharField', [], {'max_length': '255', 'db_index': 'True'}),
            'slug': ('django.db.models.fields.SlugField', [], {'max_length': '50', 'db_index': 'True'})
        }
    }

    complete_apps = ['cropduster']
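South's `self.gf` helper resolves a dotted path string such as `'django.db.models.fields.CharField'` to the class it names. A rough, stdlib-only sketch of that kind of dotted-path resolution (an illustration of the pattern, not South's actual implementation), demonstrated on a standard-library class so it runs without Django:

```python
from importlib import import_module


def resolve_dotted(path):
    """Resolve 'pkg.module.Attr' to the attribute object it names,
    the way a migration framework turns field strings into classes."""
    module_path, _, attr = path.rpartition('.')
    return getattr(import_module(module_path), attr)


# Demonstrated with a stdlib class instead of a Django field class:
OrderedDict = resolve_dotted('collections.OrderedDict')
print(OrderedDict.__name__)  # OrderedDict
```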
| 59.605505 | 135 | 0.59135 | 727 | 6,497 | 5.173315 | 0.108666 | 0.099973 | 0.171231 | 0.244616 | 0.784898 | 0.767881 | 0.766286 | 0.739165 | 0.724275 | 0.6703 | 0 | 0.009566 | 0.195475 | 6,497 | 108 | 136 | 60.157407 | 0.709967 | 0.029398 | 0 | 0.238095 | 0 | 0 | 0.471093 | 0.29209 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02381 | false | 0 | 0.047619 | 0 | 0.107143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b64485e4c97ec3e95f38471d03995014a5ecd2ca | 24 | py | Python | plugins/plugin_vvc/__init__.py | AlysH/experiment-notebook | c6a40b1dd518814ccac50f83b3a09d59202b138e | [
"MIT"
] | null | null | null | plugins/plugin_vvc/__init__.py | AlysH/experiment-notebook | c6a40b1dd518814ccac50f83b3a09d59202b138e | [
"MIT"
] | null | null | null | plugins/plugin_vvc/__init__.py | AlysH/experiment-notebook | c6a40b1dd518814ccac50f83b3a09d59202b138e | [
"MIT"
] | null | null | null | from . import vvc_codec
| 12 | 23 | 0.791667 | 4 | 24 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fcb3349435ab346c11d0c7ef62d4c5a6dfe8d179 | 302 | py | Python | src/python/pants/rules/core/register.py | fhaque/pants | cf74d6b788e28305fe342234a07ec5359ccbf2ce | [
"Apache-2.0"
] | null | null | null | src/python/pants/rules/core/register.py | fhaque/pants | cf74d6b788e28305fe342234a07ec5359ccbf2ce | [
"Apache-2.0"
] | null | null | null | src/python/pants/rules/core/register.py | fhaque/pants | cf74d6b788e28305fe342234a07ec5359ccbf2ce | [
"Apache-2.0"
] | null | null | null | # Copyright 2018 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
from pants.rules.core import filedeps, list_roots, list_targets, test
def rules():
    return list_roots.rules() + list_targets.rules() + filedeps.rules() + test.rules()
| 33.555556 | 84 | 0.754967 | 42 | 302 | 5.333333 | 0.619048 | 0.080357 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022901 | 0.13245 | 302 | 8 | 85 | 37.75 | 0.832061 | 0.417219 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
fcc1616524977dd610478b52dabec01ee683e416 | 11,449 | py | Python | scico/test/linop/test_convolve.py | lukepfister/scico | c849c4fa6089b99d9a4dec520c9a04cca426d2d7 | [
"BSD-3-Clause"
] | null | null | null | scico/test/linop/test_convolve.py | lukepfister/scico | c849c4fa6089b99d9a4dec520c9a04cca426d2d7 | [
"BSD-3-Clause"
] | null | null | null | scico/test/linop/test_convolve.py | lukepfister/scico | c849c4fa6089b99d9a4dec520c9a04cca426d2d7 | [
"BSD-3-Clause"
] | null | null | null | import operator as op
import numpy as np

import jax
import jax.scipy.signal as signal

import pytest

from scico.linop import Convolve, ConvolveByX, LinearOperator
from scico.random import randn
from scico.test.linop.test_linop import AbsMatOp, adjoint_AAt_test, adjoint_AtA_test


class TestConvolve:
    def setup_method(self, method):
        self.key = jax.random.PRNGKey(12345)

    @pytest.mark.parametrize("input_dtype", [np.float32, np.complex64])
    @pytest.mark.parametrize("input_shape", [(32,), (32, 48)])
    @pytest.mark.parametrize("mode", ["full", "valid", "same"])
    @pytest.mark.parametrize("jit", [False, True])
    def test_eval(self, input_shape, input_dtype, mode, jit):
        ndim = len(input_shape)
        filter_shape = (3, 4)[:ndim]
        x, key = randn(input_shape, dtype=input_dtype, key=self.key)
        psf, key = randn(filter_shape, dtype=input_dtype, key=key)
        A = Convolve(h=psf, input_shape=input_shape, input_dtype=input_dtype,
                     mode=mode, jit=jit)
        Ax = A @ x
        y = signal.convolve(x, psf, mode=mode)
        np.testing.assert_allclose(Ax.ravel(), y.ravel(), rtol=1e-4)

    @pytest.mark.parametrize("input_dtype", [np.float32, np.complex64])
    @pytest.mark.parametrize("input_shape", [(32,), (32, 48)])
    @pytest.mark.parametrize("mode", ["full", "valid", "same"])
    @pytest.mark.parametrize("jit", [False, True])
    def test_adjoint(self, input_shape, mode, jit, input_dtype):
        ndim = len(input_shape)
        filter_shape = (3, 4)[:ndim]
        x, key = randn(input_shape, dtype=input_dtype, key=self.key)
        psf, key = randn(filter_shape, dtype=input_dtype, key=key)
        A = Convolve(h=psf, input_shape=input_shape, input_dtype=input_dtype,
                     mode=mode, jit=jit)
        adjoint_AtA_test(A, self.key)
        adjoint_AAt_test(A, self.key)


class ConvolveTestObj:
    def __init__(self):
        dtype = np.float32
        key = jax.random.PRNGKey(12345)

        self.psf_A, key = randn((3,), dtype=dtype, key=key)
        self.psf_B, key = randn((3,), dtype=dtype, key=key)
        self.psf_C, key = randn((5,), dtype=dtype, key=key)
        self.A = Convolve(input_shape=(32,), h=self.psf_A)
        self.B = Convolve(input_shape=(32,), h=self.psf_B)
        self.C = Convolve(input_shape=(32,), h=self.psf_C)

        # Matrix for a 'generic linop'
        m = self.A.output_shape[0]
        n = self.A.input_shape[0]
        G_mat, key = randn((m, n), dtype=dtype, key=key)
        self.G = AbsMatOp(G_mat)

        self.x, key = randn((32,), dtype=dtype, key=key)
        self.scalar = 3.141


@pytest.fixture
def testobj(request):
    yield ConvolveTestObj()


@pytest.mark.parametrize("operator", [op.mul, op.truediv])
def test_scalar_left(testobj, operator):
    A = operator(testobj.A, testobj.scalar)
    x = testobj.x
    B = Convolve(input_shape=(32,), h=operator(testobj.psf_A, testobj.scalar))
    np.testing.assert_allclose(A @ x, B @ x, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.mul, op.truediv])
def test_scalar_right(testobj, operator):
    if operator == op.truediv:
        pytest.xfail("scalar / LinearOperator is not supported")
    A = operator(testobj.scalar, testobj.A)
    x = testobj.x
    B = Convolve(input_shape=(32,), h=operator(testobj.scalar, testobj.psf_A))
    np.testing.assert_allclose(A @ x, B @ x, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_convolve_add_sub(testobj, operator):
    A = testobj.A
    B = testobj.B
    C = testobj.C
    x = testobj.x

    # Two operators of the same size
    AB = operator(A, B)
    ABx = AB @ x
    AxBx = operator(A @ x, B @ x)
    np.testing.assert_allclose(ABx, AxBx, rtol=5e-5)

    # Two operators of different sizes
    with pytest.raises(ValueError):
        operator(A, C)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_add_sub_different_mode(testobj, operator):
    # These tests get caught inside of the _wrap_add_sub input/output shape
    # checks, not the explicit mode check inside of the wrapped __add__ method
    B_same = Convolve(input_shape=(32,), h=testobj.psf_B, mode="same")
    with pytest.raises(ValueError):
        operator(testobj.A, B_same)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_add_sum_generic_linop(testobj, operator):
    # Combine an AbsMatOp and a Convolve; get a generic LinearOperator
    AG = operator(testobj.A, testobj.G)
    assert isinstance(AG, LinearOperator)

    # Check evaluation
    a = AG @ testobj.x
    b = operator(testobj.A @ testobj.x, testobj.G @ testobj.x)
    np.testing.assert_allclose(a, b, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_add_sum_conv(testobj, operator):
    # Combine two Convolve operators; the result is still a Convolve
    AA = operator(testobj.A, testobj.A)
    assert isinstance(AA, Convolve)

    # Check evaluation
    a = AA @ testobj.x
    b = operator(testobj.A @ testobj.x, testobj.A @ testobj.x)
    np.testing.assert_allclose(a, b, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.mul, op.truediv])
def test_mul_div_generic_linop(testobj, operator):
    # Not defined between Convolve and AbsMatOp
    with pytest.raises(TypeError):
        operator(testobj.A, testobj.G)


def test_invalid_mode(testobj):
    # Mode that doesn't exist
    with pytest.raises(ValueError):
        Convolve(input_shape=(32,), h=testobj.psf_A, mode="foo")


def test_dimension_mismatch(testobj):
    with pytest.raises(ValueError):
        # 2-dim input shape, 1-dim filter
        Convolve(input_shape=(32, 32), h=testobj.psf_A)


def test_ndarray_h():
    h = np.random.randn(3, 3).astype(np.float32)
    A = Convolve(input_shape=(32, 32), h=h)
    assert isinstance(A.h, jax.interpreters.xla.DeviceArray)


class TestConvolveByX:
    def setup_method(self, method):
        self.key = jax.random.PRNGKey(12345)

    @pytest.mark.parametrize("input_dtype", [np.float32, np.complex64])
    @pytest.mark.parametrize("input_shape", [(32,), (32, 48)])
    @pytest.mark.parametrize("mode", ["full", "valid", "same"])
    @pytest.mark.parametrize("jit", [False, True])
    def test_eval(self, input_shape, input_dtype, mode, jit):
        ndim = len(input_shape)
        x_shape = (3, 4)[:ndim]
        h, key = randn(input_shape, dtype=input_dtype, key=self.key)
        x, key = randn(x_shape, dtype=input_dtype, key=key)
        A = ConvolveByX(x=x, input_shape=input_shape, input_dtype=input_dtype,
                        mode=mode, jit=jit)
        Ax = A @ h
        y = signal.convolve(x, h, mode=mode)
        np.testing.assert_allclose(Ax.ravel(), y.ravel(), rtol=1e-4)

    @pytest.mark.parametrize("input_dtype", [np.float32, np.complex64])
    @pytest.mark.parametrize("input_shape", [(32,), (32, 48)])
    @pytest.mark.parametrize("mode", ["full", "valid", "same"])
    @pytest.mark.parametrize("jit", [False, True])
    def test_adjoint(self, input_shape, mode, jit, input_dtype):
        ndim = len(input_shape)
        x_shape = (3, 4)[:ndim]
        x, key = randn(input_shape, dtype=input_dtype, key=self.key)
        x, key = randn(x_shape, dtype=input_dtype, key=key)
        A = ConvolveByX(x=x, input_shape=input_shape, input_dtype=input_dtype,
                        mode=mode, jit=jit)
        adjoint_AtA_test(A, self.key)
        adjoint_AAt_test(A, self.key)


class ConvolveByXTestObj:
    def __init__(self):
        dtype = np.float32
        key = jax.random.PRNGKey(12345)

        self.x_A, key = randn((3,), dtype=dtype, key=key)
        self.x_B, key = randn((3,), dtype=dtype, key=key)
        self.x_C, key = randn((5,), dtype=dtype, key=key)
        self.A = ConvolveByX(input_shape=(32,), x=self.x_A)
        self.B = ConvolveByX(input_shape=(32,), x=self.x_B)
        self.C = ConvolveByX(input_shape=(32,), x=self.x_C)

        # Matrix for a 'generic linop'
        m = self.A.output_shape[0]
        n = self.A.input_shape[0]
        G_mat, key = randn((m, n), dtype=dtype, key=key)
        self.G = AbsMatOp(G_mat)

        self.h, key = randn((32,), dtype=dtype, key=key)
        self.scalar = 3.141


@pytest.fixture
def cbx_testobj(request):
    yield ConvolveByXTestObj()


@pytest.mark.parametrize("operator", [op.mul, op.truediv])
def test_cbx_scalar_left(cbx_testobj, operator):
    A = operator(cbx_testobj.A, cbx_testobj.scalar)
    h = cbx_testobj.h
    B = ConvolveByX(input_shape=(32,),
                    x=operator(cbx_testobj.x_A, cbx_testobj.scalar))
    np.testing.assert_allclose(A @ h, B @ h, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.mul, op.truediv])
def test_cbx_scalar_right(cbx_testobj, operator):
    if operator == op.truediv:
        pytest.xfail("scalar / LinearOperator is not supported")
    A = operator(cbx_testobj.scalar, cbx_testobj.A)
    h = cbx_testobj.h
    B = ConvolveByX(input_shape=(32,),
                    x=operator(cbx_testobj.scalar, cbx_testobj.x_A))
    np.testing.assert_allclose(A @ h, B @ h, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_cbx_convolve_add_sub(cbx_testobj, operator):
    A = cbx_testobj.A
    B = cbx_testobj.B
    C = cbx_testobj.C
    h = cbx_testobj.h

    # Two operators of the same size
    AB = operator(A, B)
    ABh = AB @ h
    AfiltBh = operator(A @ h, B @ h)
    np.testing.assert_allclose(ABh, AfiltBh, rtol=5e-5)

    # Two operators of different sizes
    with pytest.raises(ValueError):
        operator(A, C)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_cbx_add_sub_different_mode(cbx_testobj, operator):
    # These tests get caught inside of the _wrap_add_sub input/output shape
    # checks, not the explicit mode check inside of the wrapped __add__ method
    B_same = ConvolveByX(input_shape=(32,), x=cbx_testobj.x_B, mode="same")
    with pytest.raises(ValueError):
        operator(cbx_testobj.A, B_same)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_cbx_add_sum_generic_linop(cbx_testobj, operator):
    # Combine an AbsMatOp and a ConvolveByX; get a generic LinearOperator
    AG = operator(cbx_testobj.A, cbx_testobj.G)
    assert isinstance(AG, LinearOperator)

    # Check evaluation
    a = AG @ cbx_testobj.h
    b = operator(cbx_testobj.A @ cbx_testobj.h, cbx_testobj.G @ cbx_testobj.h)
    np.testing.assert_allclose(a, b, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.add, op.sub])
def test_cbx_add_sum_conv(cbx_testobj, operator):
    # Combine two ConvolveByX operators; the result is still a ConvolveByX
    AA = operator(cbx_testobj.A, cbx_testobj.A)
    assert isinstance(AA, ConvolveByX)

    # Check evaluation
    a = AA @ cbx_testobj.h
    b = operator(cbx_testobj.A @ cbx_testobj.h, cbx_testobj.A @ cbx_testobj.h)
    np.testing.assert_allclose(a, b, rtol=5e-5)


@pytest.mark.parametrize("operator", [op.mul, op.truediv])
def test_cbx_mul_div_generic_linop(cbx_testobj, operator):
    # Not defined between ConvolveByX and AbsMatOp
    with pytest.raises(TypeError):
        operator(cbx_testobj.A, cbx_testobj.G)


def test_cbx_invalid_mode(cbx_testobj):
    # Mode that doesn't exist
    with pytest.raises(ValueError):
        ConvolveByX(input_shape=(32,), x=cbx_testobj.x_A, mode="foo")


def test_cbx_dimension_mismatch(cbx_testobj):
    with pytest.raises(ValueError):
        # 2-dim input shape, 1-dim x
        ConvolveByX(input_shape=(32, 32), x=cbx_testobj.x_A)


def test_ndarray_x():
    x = np.random.randn(3, 3).astype(np.float32)
    A = ConvolveByX(input_shape=(32, 32), x=x)
    assert isinstance(A.x, jax.interpreters.xla.DeviceArray)
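The `adjoint_AtA_test` and `adjoint_AAt_test` helpers verify that an operator's adjoint satisfies the adjoint identity. A dependency-free sketch of the property being checked, using plain Python lists instead of scico operators:

```python
def matvec(A, v):
    # Matrix-vector product for a list-of-rows matrix
    return [sum(a * b for a, b in zip(row, v)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # stand-in 3x2 linear operator
x = [0.5, -1.5]
y = [2.0, 0.0, 1.0]

# Adjoint identity for real operators: <A x, y> == <x, A^T y>
lhs = dot(matvec(A, x), y)
rhs = dot(x, matvec(transpose(A), y))
assert abs(lhs - rhs) < 1e-12
```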
| 34.176119 | 98 | 0.672024 | 1,694 | 11,449 | 4.386659 | 0.090909 | 0.061903 | 0.08478 | 0.054636 | 0.866371 | 0.841206 | 0.803122 | 0.741892 | 0.720899 | 0.673664 | 0 | 0.01876 | 0.189886 | 11,449 | 334 | 99 | 34.278443 | 0.782426 | 0.08551 | 0 | 0.522727 | 0 | 0 | 0.035807 | 0 | 0 | 0 | 0 | 0 | 0.081818 | 1 | 0.136364 | false | 0 | 0.036364 | 0 | 0.190909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fcea3490809be4f9eaacd1e68c9e8ae90e80a7ce | 4,115 | py | Python | unifier/sutil.py | jtorreshnl/unifier | dba0de9af7ec0f749b1be67cea200622f66beadc | [
"MIT"
] | null | null | null | unifier/sutil.py | jtorreshnl/unifier | dba0de9af7ec0f749b1be67cea200622f66beadc | [
"MIT"
] | null | null | null | unifier/sutil.py | jtorreshnl/unifier | dba0de9af7ec0f749b1be67cea200622f66beadc | [
"MIT"
] | null | null | null | from selenium.webdriver.common.by import (
    By
)
from selenium.webdriver.support import (
    expected_conditions as EC
)
from selenium.webdriver.support.ui import (
    WebDriverWait
)


def get_element_by_css_selector(driver, tag, value):
    ''' Get a specified element by tag and attribute value.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, f'{tag}.{value}'))
    )
    if is_located:
        return driver.find_element(By.CSS_SELECTOR, f'{tag}.{value}')


def get_element_by_id(driver, id):
    ''' Get a specified element by id.
        Wait until the element is visible.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.visibility_of_element_located((By.ID, id))
    )
    if is_located:
        return driver.find_element(By.ID, id)


def get_element_by_id_clickable(driver, id):
    ''' Get a specified element by id.
        Wait until the element is clickable.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.element_to_be_clickable((By.ID, id))
    )
    if is_located:
        return driver.find_element(By.ID, id)


def get_element_by_id_ext(driver, id):
    ''' Get a specified element by id with an extended (10 s) timeout.
        Wait until the element is visible.
    '''
    is_located = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.ID, id))
    )
    if is_located:
        return driver.find_element(By.ID, id)


def get_element_by_tag(driver, tag):
    ''' Get a specified element by tag.
        Wait until the element is visible.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.visibility_of_element_located((By.TAG_NAME, tag))
    )
    if is_located:
        return driver.find_element(By.TAG_NAME, tag)


def get_element_by_tag_and_text(driver, tag, text):
    ''' Get a specified element by tag and text.
        Wait until the element is visible.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.text_to_be_present_in_element((By.TAG_NAME, tag), text)
    )
    if is_located:
        return driver.find_element(By.TAG_NAME, tag)


def get_element_by_xpath(driver, xpath):
    ''' Get an element by xpath.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.presence_of_element_located((By.XPATH, f'{xpath}'))
    )
    if is_located:
        return driver.find_element(By.XPATH, f'{xpath}')


def get_element_by_xpath_clickable(driver, xpath):
    ''' Get an element by xpath.
        Wait until the element is clickable.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.element_to_be_clickable((By.XPATH, f'{xpath}'))
    )
    if is_located:
        return driver.find_element(By.XPATH, f'{xpath}')


def get_element_by_xpath_visible(driver, xpath):
    ''' Get an element by xpath.
        Wait until the element is visible.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.visibility_of_element_located((By.XPATH, f'{xpath}'))
    )
    if is_located:
        return driver.find_element(By.XPATH, f'{xpath}')


def get_elements_by_css_selector(driver, selector):
    ''' Get elements by css selector.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.presence_of_all_elements_located((By.CSS_SELECTOR, f'{selector}'))
    )
    if is_located:
        return driver.find_elements(By.CSS_SELECTOR, f'{selector}')


def get_elements_by_tag(driver, tag):
    ''' Get all elements by tag.
        Wait until the elements are visible.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.visibility_of_element_located((By.TAG_NAME, tag))
    )
    if is_located:
        return driver.find_elements(By.TAG_NAME, tag)


def get_elements_by_tag_ext(driver, tag):
    ''' Get all elements by tag with an extended (10 s) timeout.
        Wait until the elements are visible.
    '''
    is_located = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, tag))
    )
    if is_located:
        return driver.find_elements(By.TAG_NAME, tag)


def get_elements_by_xpath(driver, xpath):
    ''' Get elements by xpath.
    '''
    is_located = WebDriverWait(driver, 4).until(
        EC.presence_of_all_elements_located((By.XPATH, f'{xpath}'))
    )
    if is_located:
        return driver.find_elements(By.XPATH, f'{xpath}')
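Every helper above delegates the waiting to `WebDriverWait(...).until(...)`, which polls a condition until it returns a truthy value or the timeout elapses. A browser-free sketch of that poll-until-truthy loop (a simplification of the pattern, not Selenium's actual code):

```python
import time


def wait_until(condition, timeout=4.0, poll=0.1):
    """Poll ``condition`` until it returns a truthy value, or raise
    TimeoutError after ``timeout`` seconds -- the loop WebDriverWait.until
    runs against a live browser."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll)
    raise TimeoutError('condition not met within timeout')


# Toy condition that becomes true on the third poll:
state = {'calls': 0}

def ready():
    state['calls'] += 1
    return state['calls'] >= 3

print(wait_until(ready, timeout=1.0, poll=0.01))  # True
```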
| 27.804054 | 77 | 0.669016 | 574 | 4,115 | 4.547038 | 0.092334 | 0.093103 | 0.109579 | 0.139464 | 0.878544 | 0.804598 | 0.779693 | 0.758238 | 0.726437 | 0.726437 | 0 | 0.00464 | 0.214338 | 4,115 | 147 | 78 | 27.993197 | 0.80266 | 0.153341 | 0 | 0.471264 | 0 | 0 | 0.030493 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.149425 | false | 0 | 0.034483 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fcef7bfe5a1e708cfcd8985c0ae02e4fd2f6eec0 | 37,242 | py | Python | pysat/tests/test_orbits.py | pysat/pysat | 4d12a09ea585b88d54560413e03cae9289113718 | [
"BSD-3-Clause"
] | 68 | 2019-09-18T19:08:07.000Z | 2022-03-28T23:22:04.000Z | pysat/tests/test_orbits.py | iamaSam/pysat | 4d12a09ea585b88d54560413e03cae9289113718 | [
"BSD-3-Clause"
] | 603 | 2019-09-18T15:24:37.000Z | 2022-03-30T20:13:43.000Z | pysat/tests/test_orbits.py | iamaSam/pysat | 4d12a09ea585b88d54560413e03cae9289113718 | [
"BSD-3-Clause"
] | 24 | 2015-04-08T09:33:51.000Z | 2019-09-06T22:01:34.000Z | import datetime as dt
from dateutil.relativedelta import relativedelta
import numpy as np
import pandas as pds
# Orbits period is a pandas.Timedelta kwarg, and the pandas repr
# does not include a module name. Import required to run eval
# on Orbit representation
from pandas import Timedelta # noqa: F401
import pytest
import pysat
class TestOrbitsUserInterface():
def setup(self):
""" Set up User Interface unit tests
"""
self.in_args = ['pysat', 'testing']
self.in_kwargs = {'clean_level': 'clean', 'update_files': True}
self.testInst = None
self.stime = dt.datetime(2009, 1, 1)
def teardown(self):
""" Tear down user interface tests
"""
del self.in_args, self.in_kwargs, self.testInst, self.stime
def test_orbit_w_bad_kind(self):
""" Test orbit failure with bad 'kind' input
"""
self.in_kwargs['orbit_info'] = {'index': 'mlt', 'kind': 'cats'}
with pytest.raises(ValueError):
self.testInst = pysat.Instrument(*self.in_args, **self.in_kwargs)
@pytest.mark.parametrize("info", [({'index': 'magnetic local time',
'kind': 'longitude'}),
(None),
({'index': 'magnetic local time',
'kind': 'lt'}),
({'index': 'magnetic local time',
'kind': 'polar'}),
({'index': 'magnetic local time',
'kind': 'orbit'})])
def test_orbit_w_bad_orbit_info(self, info):
""" Test orbit failure on iteration with orbit initialization
"""
self.in_kwargs['orbit_info'] = info
self.testInst = pysat.Instrument(*self.in_args, **self.in_kwargs)
self.testInst.load(date=self.stime)
with pytest.raises(ValueError):
self.testInst.orbits.next()
@pytest.mark.parametrize("info", [({'index': 'magnetic local time',
'kind': 'polar'}),
({'index': 'magnetic local time',
'kind': 'orbit'}),
({'index': 'magnetic local time',
'kind': 'longitude'}),
({'index': 'magnetic local time',
'kind': 'lt'})])
def test_orbit_polar_w_missing_orbit_index(self, info):
""" Test orbit failure oon iteration with missing orbit index
"""
self.in_kwargs['orbit_info'] = info
self.testInst = pysat.Instrument(*self.in_args, **self.in_kwargs)
# Force index to None beforee loading and iterating
self.testInst.orbits.orbit_index = None
self.testInst.load(date=self.stime)
with pytest.raises(ValueError):
self.testInst.orbits.next()
def test_orbit_repr(self):
""" Test the Orbit representation
"""
self.in_kwargs['orbit_info'] = {'index': 'mlt'}
self.testInst = pysat.Instrument(*self.in_args, **self.in_kwargs)
out_str = self.testInst.orbits.__repr__()
assert out_str.find("Orbits(") >= 0
def test_orbit_str(self):
""" Test the Orbit string representation with data
"""
self.in_kwargs['orbit_info'] = {'index': 'mlt'}
self.testInst = pysat.Instrument(*self.in_args, **self.in_kwargs)
self.testInst.load(date=self.stime)
out_str = self.testInst.orbits.__str__()
assert out_str.find("Orbit Settings") >= 0
assert out_str.find("Orbit Lind: local time") < 0
class TestSpecificUTOrbits():
def setup(self):
"""Runs before every method to create a clean testing setup
"""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'mlt'},
update_files=True)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
self.inc_min = 97
self.etime = None
def teardown(self):
"""Runs after every method to clean up previous testing
"""
del self.testInst, self.stime, self.inc_min, self.etime
@pytest.mark.parametrize('orbit_inc', [(0), (1), (-1), (-2), (14)])
def test_single_orbit_call_by_index(self, orbit_inc):
"""Test successful orbit call by index
"""
# Load the data
self.testInst.load(date=self.stime)
self.testInst.orbits[orbit_inc]
# Increment the time
if orbit_inc >= 0:
self.stime += dt.timedelta(minutes=orbit_inc * self.inc_min)
else:
self.stime += dt.timedelta(minutes=self.inc_min
* (np.ceil(1440.0 / self.inc_min)
+ orbit_inc))
self.etime = self.stime + dt.timedelta(seconds=(self.inc_min * 60 - 1))
# Test the time
assert (self.testInst.index[0] == self.stime)
assert (self.testInst.index[-1] == self.etime)
@pytest.mark.parametrize("orbit_ind,raise_err", [(17, Exception),
(None, TypeError)])
def test_single_orbit_call_bad_index(self, orbit_ind, raise_err):
""" Test orbit failure with bad index
"""
self.testInst.load(date=self.stime)
with pytest.raises(raise_err):
self.testInst.orbits[orbit_ind]
    def test_orbit_number_via_current_multiple_orbit_calls_in_day(self):
        """ Test orbit number with multiple orbit calls in a day
"""
self.testInst.load(date=self.stime)
self.testInst.bounds = (self.stime, None)
true_vals = np.arange(15)
true_vals[-1] = 0
test_vals = []
for i, inst in enumerate(self.testInst.orbits):
if i > 14:
break
test_vals.append(inst.orbits.current)
assert inst.orbits.current == self.testInst.orbits.current
assert np.all(test_vals == true_vals)
def test_all_single_orbit_calls_in_day(self):
""" Test all single orbit calls in a day
"""
self.testInst.load(date=self.stime)
self.testInst.bounds = (self.stime, None)
for i, inst in enumerate(self.testInst.orbits):
if i > 14:
break
# Test the start index
self.etime = self.stime + i * relativedelta(minutes=self.inc_min)
assert inst.index[0] == self.etime
assert self.testInst.index[0] == self.etime
# Test the end index
self.etime += relativedelta(seconds=((self.inc_min * 60) - 1))
assert inst.index[-1] == self.etime
assert self.testInst.index[-1] == self.etime
def test_orbit_next_call_no_loaded_data(self):
""" Test orbit next call without loading data
"""
self.testInst.orbits.next()
assert (self.testInst.index[0] == dt.datetime(2008, 1, 1))
assert (self.testInst.index[-1] == dt.datetime(2008, 1, 1, 0, 38, 59))
def test_orbit_prev_call_no_loaded_data(self):
""" Test orbit previous call without loading data
"""
self.testInst.orbits.prev()
# this isn't a full orbit
assert (self.testInst.index[-1]
== dt.datetime(2010, 12, 31, 23, 59, 59))
assert (self.testInst.index[0] == dt.datetime(2010, 12, 31, 23, 49))
def test_single_orbit_call_orbit_starts_0_UT_using_next(self):
""" Test orbit next call with data
"""
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
self.etime = self.stime + dt.timedelta(seconds=(self.inc_min * 60 - 1))
assert (self.testInst.index[0] == self.stime)
assert (self.testInst.index[-1] == self.etime)
def test_single_orbit_call_orbit_starts_0_UT_using_prev(self):
""" Test orbit prev call with data
"""
self.testInst.load(date=self.stime)
self.testInst.orbits.prev()
self.stime += 14 * relativedelta(minutes=self.inc_min)
self.etime = self.stime + dt.timedelta(seconds=((self.inc_min * 60)
- 1))
assert self.testInst.index[0] == self.stime
assert self.testInst.index[-1] == self.etime
def test_single_orbit_call_orbit_starts_off_0_UT_using_next(self):
""" Test orbit next call with data for previous day
"""
self.stime -= dt.timedelta(days=1)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
assert (self.testInst.index[0] == dt.datetime(2008, 12, 30, 23, 45))
assert (self.testInst.index[-1]
== (dt.datetime(2008, 12, 30, 23, 45)
+ relativedelta(seconds=(self.inc_min * 60 - 1))))
    def test_single_orbit_call_orbit_starts_off_0_UT_using_prev(self):
        """ Test orbit prev call with data for previous day
        """
self.stime -= dt.timedelta(days=1)
self.testInst.load(date=self.stime)
self.testInst.orbits.prev()
assert (self.testInst.index[0]
== (dt.datetime(2009, 1, 1)
- relativedelta(minutes=self.inc_min)))
assert (self.testInst.index[-1]
== (dt.datetime(2009, 1, 1) - relativedelta(seconds=1)))
class TestGeneralOrbitsMLT():
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'mlt'},
update_files=True)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
return
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
return
def test_equality_with_copy(self):
"""Test that copy is the same as original"""
self.out = self.testInst.orbits.copy()
assert self.out == self.testInst.orbits
return
def test_equality_with_data_with_copy(self):
"""Test that copy is the same as original"""
# Load data
self.testInst.load(date=self.stime)
# Load up an orbit
self.testInst.orbits[0]
self.out = self.testInst.orbits.copy()
assert self.out == self.testInst.orbits
return
def test_inequality_different_data(self):
"""Test that equality is false if different data"""
# Load data
self.testInst.load(date=self.stime)
# Load up an orbit
self.testInst.orbits[0]
# Make copy
self.out = self.testInst.orbits.copy()
# Modify data
self.out._full_day_data = self.testInst._null_data
assert self.out != self.testInst.orbits
return
def test_inequality_modified_object(self):
"""Test that equality is false if other missing attributes"""
self.out = self.testInst.orbits.copy()
# Remove attribute
del self.out.orbit_index
assert self.testInst.orbits != self.out
return
def test_inequality_reduced_object(self):
"""Test that equality is false if self missing attributes"""
self.out = self.testInst.orbits.copy()
self.out.hi_there = 'hi'
assert self.testInst.orbits != self.out
return
def test_inequality_different_type(self):
"""Test that equality is false if different type"""
assert self.testInst.orbits != self.testInst
return
def test_eval_repr(self):
"""Test eval of repr recreates object"""
# eval and repr don't play nice for custom functions
if len(self.testInst.custom_functions) != 0:
self.testInst.custom_clear()
self.out = eval(self.testInst.orbits.__repr__())
assert self.out == self.testInst.orbits
return
def test_repr_and_copy(self):
"""Test repr consistent with object copy"""
# Not tested with eval due to issues with datetime
self.out = self.testInst.orbits.__repr__()
second_out = self.testInst.orbits.copy().__repr__()
assert self.out == second_out
return
def test_load_orbits_w_empty_data(self):
""" Test orbit loading outside of the instrument data range
"""
self.stime -= dt.timedelta(days=365 * 100)
self.testInst.load(date=self.stime)
self.testInst.orbits[0]
with pytest.raises(StopIteration):
self.testInst.orbits.next()
def test_less_than_one_orbit_of_data(self):
"""Test successful load with less than one orbit of data
"""
def filter_data(inst):
""" Local helper function to reduce available data
"""
inst.data = inst[0:20]
self.testInst.custom_attach(filter_data)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
        # a recursion issue has been observed in this area
# checking for date to limit reintroduction potential
assert self.testInst.date == self.stime
    def test_less_than_one_orbit_of_data_two_ways(self):
        """ Test loading less than one orbit of data by iteration and by index
        """
        def filter_data(inst):
            """ Local helper function to reduce available data
            """
            inst.data = inst[0:5]
self.testInst.custom_attach(filter_data)
self.testInst.load(date=self.stime)
# starting from no orbit calls next loads first orbit
self.testInst.orbits.next()
# store comparison data
saved_data = self.testInst.copy()
self.testInst.load(date=self.stime)
self.testInst.orbits[0]
assert all(self.testInst.data == saved_data.data)
        # a recursion issue has been observed in this area
# checking for date to limit reintroduction potential
d1check = self.testInst.date == saved_data.date
assert d1check
def test_less_than_one_orbit_of_data_four_ways_two_days(self):
        """ Test successful loading of different partial orbits
"""
# create situation where the < 1 orbit split across two days
def filter_data(inst):
"""Local function for breaking up orbits
"""
if inst.date == dt.datetime(2009, 1, 5):
inst.data = inst[0:20]
elif inst.date == dt.datetime(2009, 1, 4):
inst.data = inst[-20:]
return
self.testInst.custom_attach(filter_data)
self.stime += dt.timedelta(days=3)
self.testInst.load(date=self.stime)
# starting from no orbit calls next loads first orbit
self.testInst.orbits.next()
# store comparison data
saved_data = self.testInst.copy()
self.testInst.load(date=self.stime + dt.timedelta(days=1))
self.testInst.orbits[0]
if self.testInst.orbits.num == 1:
# equivalence only when only one orbit
# some test settings can violate this assumption
assert all(self.testInst.data == saved_data.data)
self.testInst.load(date=self.stime)
self.testInst.orbits[0]
assert all(self.testInst.data == saved_data.data)
self.testInst.load(date=self.stime + dt.timedelta(days=1))
self.testInst.orbits.prev()
if self.testInst.orbits.num == 1:
assert all(self.testInst.data == saved_data.data)
        # a recursion issue has been observed in this area
# checking for date to limit reintroduction potential
d1check = self.testInst.date == saved_data.date
assert d1check
    def test_repeated_orbit_calls_symmetric_single_day_start_with_last(self):
        """ Test symmetric orbit calls starting from the last orbit of a day
        """
self.testInst.load(date=self.stime)
# start on last orbit of last day
self.testInst.orbits[0]
self.testInst.orbits.prev()
control = self.testInst.copy()
for j in range(10):
self.testInst.orbits.next()
for j in range(10):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
    def test_repeated_orbit_calls_symmetric_single_day_0_UT(self):
        """ Test symmetric orbit calls for a single day starting at 00:00 UT
        """
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(10):
self.testInst.orbits.next()
for j in range(10):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
    def test_repeated_orbit_calls_symmetric_multi_day_0_UT(self):
        """ Test symmetric orbit calls over multiple days starting at 00:00 UT
        """
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(20):
self.testInst.orbits.next()
for j in range(20):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
def test_repeated_orbit_calls_symmetric_single_day_off_0_UT(self):
""" Test successful orbit calls for a day about a time off 00:00 UT
"""
self.stime -= dt.timedelta(days=1)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(10):
self.testInst.orbits.next()
for j in range(10):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
def test_repeated_orbit_calls_symmetric_multi_day_off_0_UT(self):
""" Test successful orbit calls for days about a time off 00:00 UT
"""
self.stime -= dt.timedelta(days=1)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(20):
self.testInst.orbits.next()
for j in range(20):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
def test_repeated_orbit_calls_antisymmetric_multi_day_off_0_UT(self):
""" Test successful orbit calls for different days about a time off 0 UT
"""
self.stime -= dt.timedelta(days=1)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(10):
self.testInst.orbits.next()
for j in range(20):
self.testInst.orbits.prev()
for j in range(10):
self.testInst.orbits.next()
assert all(control.data == self.testInst.data)
def test_repeated_orbit_calls_antisymmetric_multi_multi_day_off_0_UT(self):
""" Test successful orbit calls for more days about a time off 0 UT
"""
self.stime -= dt.timedelta(days=1)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(20):
self.testInst.orbits.next()
for j in range(40):
self.testInst.orbits.prev()
for j in range(20):
self.testInst.orbits.next()
assert all(control.data == self.testInst.data)
    def test_repeated_orbit_calls_antisymmetric_multi_day_0_UT(self):
        """ Test antisymmetric orbit calls over multiple days from 00:00 UT
        """
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(10):
self.testInst.orbits.next()
for j in range(20):
self.testInst.orbits.prev()
for j in range(10):
self.testInst.orbits.next()
assert all(control.data == self.testInst.data)
    def test_repeated_orbit_calls_antisymmetric_multi_multi_day_0_UT(self):
        """ Test antisymmetric orbit calls over more days from 00:00 UT
        """
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(20):
self.testInst.orbits.next()
for j in range(40):
self.testInst.orbits.prev()
for j in range(20):
self.testInst.orbits.next()
assert all(control.data == self.testInst.data)
def test_repeat_orbit_calls_asym_multi_day_0_UT_long_time_gap(self):
"""Test successful orbit calls for many different days with a long gap
"""
self.stime += dt.timedelta(days=334)
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(20):
self.testInst.orbits.next()
for j in range(20):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
    def test_repeat_orbit_calls_asym_multi_day_0_UT_really_long_time_gap(self):
        """ Test asymmetric orbit calls over many days with a very long gap
        """
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
for j in range(400):
self.testInst.orbits.next()
for j in range(400):
self.testInst.orbits.prev()
assert all(control.data == self.testInst.data)
    def test_repeat_orbit_calls_asym_multi_day_0_UT_multiple_time_gaps(self):
        """ Test asymmetric orbit calls over many days with multiple gaps
        """
self.testInst.load(date=self.stime)
self.testInst.orbits.next()
control = self.testInst.copy()
n_time = []
p_time = []
for j in range(40):
n_time.append(self.testInst.index[0])
self.testInst.orbits.next()
for j in range(40):
self.testInst.orbits.prev()
p_time.append(self.testInst.index[0])
check = np.all(p_time == n_time[::-1])
assert all(control.data == self.testInst.data) & check
class TestGeneralOrbitsMLTxarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'mlt'},
update_files=True)
self.stime = pysat.instruments.pysat_testing_xarray._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestGeneralOrbitsNonStandardIteration():
    """Create an iteration window that is larger than step size.

    Ensure the overlapping data doesn't end up in the orbit iteration.
    """
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'mlt'},
update_files=True)
self.testInst.bounds = (self.testInst.files.files.index[0],
self.testInst.files.files.index[11],
'2D', dt.timedelta(days=3))
self.orbit_starts = []
self.orbit_stops = []
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.orbit_starts, self.orbit_stops
def test_no_orbit_overlap_with_overlapping_iteration(self):
"""Ensure error when overlap in iteration data."""
with pytest.raises(ValueError):
self.testInst.orbits.next()
return
@pytest.mark.parametrize("bounds_type", ['by_date', 'by_file'])
def test_no_orbit_overlap_with_nonoverlapping_iteration(self, bounds_type):
        """Test no orbit data overlap with non-overlapping iteration data"""
if bounds_type == 'by_date':
bounds = (self.testInst.files.files.index[0],
self.testInst.files.files.index[11],
'2D', dt.timedelta(days=2))
elif bounds_type == 'by_file':
bounds = (self.testInst.files[0], self.testInst.files[11], 2, 2)
self.testInst.bounds = bounds
for inst in self.testInst.orbits:
self.orbit_starts.append(inst.index[0])
self.orbit_stops.append(inst.index[-1])
self.orbit_starts = pds.Series(self.orbit_starts)
self.orbit_stops = pds.Series(self.orbit_stops)
assert self.orbit_starts.is_monotonic_increasing
assert self.orbit_stops.is_monotonic_increasing
return
class TestGeneralOrbitsLong(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'longitude',
'kind': 'longitude'},
update_files=True)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestGeneralOrbitsLongxarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'longitude',
'kind': 'longitude'},
update_files=True)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestGeneralOrbitsOrbitNumber(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'orbit_num',
'kind': 'orbit'},
update_files=True)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestGeneralOrbitsOrbitNumberXarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'orbit_num',
'kind': 'orbit'},
update_files=True)
self.stime = pysat.instruments.pysat_testing_xarray._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestGeneralOrbitsLatitude(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'latitude',
'kind': 'polar'},
update_files=True)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestGeneralOrbitsLatitudeXarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'latitude',
'kind': 'polar'},
update_files=True)
self.stime = pysat.instruments.pysat_testing_xarray._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
def filter_data(inst):
"""Remove data from instrument, simulating gaps"""
times = [[dt.datetime(2009, 1, 1, 1, 37), dt.datetime(2009, 1, 1, 3, 14)],
[dt.datetime(2009, 1, 1, 10), dt.datetime(2009, 1, 1, 12)],
[dt.datetime(2009, 1, 1, 22), dt.datetime(2009, 1, 2, 2)],
[dt.datetime(2009, 1, 13), dt.datetime(2009, 1, 15)],
[dt.datetime(2009, 1, 20, 1), dt.datetime(2009, 1, 25, 23)],
[dt.datetime(2009, 1, 25, 23, 30), dt.datetime(2009, 1, 26, 3)]
]
for time in times:
idx, = np.where((inst.index > time[1]) | (inst.index < time[0]))
inst.data = inst[idx]
def filter_data2(inst, times=None):
"""Remove data from instrument, simulating gaps"""
for time in times:
idx, = np.where((inst.index > time[1]) | (inst.index < time[0]))
inst.data = inst[idx]
class TestOrbitsGappyData(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'mlt'},
update_files=True)
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyDataXarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'mlt'},
update_files=True)
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyData2(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'mlt'})
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
times = [[dt.datetime(2008, 12, 31, 4),
dt.datetime(2008, 12, 31, 5, 37)],
[dt.datetime(2009, 1, 1),
dt.datetime(2009, 1, 1, 1, 37)]]
for seconds in np.arange(38):
day = (dt.datetime(2009, 1, 2)
+ dt.timedelta(days=int(seconds)))
times.append([day, day
+ dt.timedelta(hours=1, minutes=37,
seconds=int(seconds))
- dt.timedelta(seconds=20)])
self.testInst.custom_attach(filter_data2, kwargs={'times': times})
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyData2Xarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'mlt'})
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
times = [[dt.datetime(2008, 12, 31, 4),
dt.datetime(2008, 12, 31, 5, 37)],
[dt.datetime(2009, 1, 1),
dt.datetime(2009, 1, 1, 1, 37)]]
for seconds in np.arange(38):
day = (dt.datetime(2009, 1, 2)
+ dt.timedelta(days=int(seconds)))
times.append([day, day
+ dt.timedelta(hours=1, minutes=37,
seconds=int(seconds))
- dt.timedelta(seconds=20)])
self.testInst.custom_attach(filter_data2, kwargs={'times': times})
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyLongData(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'longitude',
'kind': 'longitude'})
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyLongDataXarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'longitude',
'kind': 'longitude'})
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyOrbitNumData(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'orbit_num',
'kind': 'orbit'})
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyOrbitNumDataXarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'orbit_num',
'kind': 'orbit'})
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyOrbitLatData(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing',
clean_level='clean',
orbit_info={'index': 'latitude',
'kind': 'polar'})
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
class TestOrbitsGappyOrbitLatDataXarray(TestGeneralOrbitsMLT):
def setup(self):
"""Runs before every method to create a clean testing setup."""
self.testInst = pysat.Instrument('pysat', 'testing_xarray',
clean_level='clean',
orbit_info={'index': 'latitude',
'kind': 'polar'})
self.testInst.custom_attach(filter_data)
self.stime = pysat.instruments.pysat_testing._test_dates['']['']
def teardown(self):
"""Runs after every method to clean up previous testing."""
del self.testInst, self.stime
| 40.568627 | 80 | 0.571425 | 4,259 | 37,242 | 4.864992 | 0.081944 | 0.141892 | 0.075579 | 0.039286 | 0.813369 | 0.770512 | 0.727751 | 0.702751 | 0.674083 | 0.664189 | 0 | 0.020193 | 0.31784 | 37,242 | 917 | 81 | 40.612868 | 0.795395 | 0.148676 | 0 | 0.710611 | 0 | 0 | 0.040864 | 0 | 0 | 0 | 0 | 0 | 0.083601 | 1 | 0.141479 | false | 0 | 0.011254 | 0 | 0.207396 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1e4d25f2a6405b891bfd64d7aab977e5f1458c5a | 63 | py | Python | pythosf/client/provider.py | felliott/pythosf | 961d200bdc8e7d35129405ee289d46cb0527ed23 | [
"MIT"
] | 1 | 2017-11-21T18:43:01.000Z | 2017-11-21T18:43:01.000Z | pythosf/client/provider.py | felliott/pythosf | 961d200bdc8e7d35129405ee289d46cb0527ed23 | [
"MIT"
] | null | null | null | pythosf/client/provider.py | felliott/pythosf | 961d200bdc8e7d35129405ee289d46cb0527ed23 | [
"MIT"
] | 4 | 2018-01-31T20:00:38.000Z | 2021-10-14T14:04:32.000Z | from .folder import Folder
class Provider(Folder):
pass
| 9 | 26 | 0.714286 | 8 | 63 | 5.625 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 63 | 6 | 27 | 10.5 | 0.918367 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
1e6e79058b7496d5511d65f04c5d4a46afdbc49a | 4,270 | py | Python | tests/users/test_change_email.py | carbonariy/dvhb-hybrid | adbb250767ea255addc607fb6f6755c9add447db | [
"MIT"
] | 27 | 2018-05-08T16:03:24.000Z | 2020-02-20T06:39:19.000Z | tests/users/test_change_email.py | carbonariy/dvhb-hybrid | adbb250767ea255addc607fb6f6755c9add447db | [
"MIT"
] | 7 | 2018-10-20T16:03:36.000Z | 2021-11-03T11:09:22.000Z | tests/users/test_change_email.py | carbonariy/dvhb-hybrid | adbb250767ea255addc607fb6f6755c9add447db | [
"MIT"
] | 16 | 2018-12-11T15:34:22.000Z | 2022-01-25T00:20:55.000Z | import pytest
@pytest.fixture
def send_email_change_request(make_request):
async def wrapper(new_email_address, expected_status=None, client=None):
return await make_request(
'dvhb_hybrid.user:change_email',
method='put',
json=dict(new_email_address=new_email_address),
client=client,
expected_status=expected_status)
return wrapper
@pytest.fixture
def approve_email_change_request(make_request):
async def wrapper(confirmation_code, expected_status=None):
return await make_request(
'dvhb_hybrid.user:change_email',
method='post',
json=dict(confirmation_code=confirmation_code),
expected_status=expected_status)
return wrapper
@pytest.fixture
def get_email_change_requests(app):
async def wrapper(new_email_address):
return (
await app.m.user_change_email_original_address_request.get_by_new_email(new_email_address),
await app.m.user_change_email_new_address_request.get_by_new_email(new_email_address),
)
return wrapper
@pytest.mark.django_db
async def test_send_email_change_request_unauthorized(send_email_change_request):
await send_email_change_request(new_email_address='xxx@xxx.xx', expected_status=401)
@pytest.mark.django_db
async def test_send_email_change_request_same_email(app, test_client, create_new_user, send_email_change_request):
client = await test_client(app)
user = await create_new_user()
await client.authorize(email=user['email'], password=user['password'])
await send_email_change_request(new_email_address=user['email'], client=client, expected_status=400)
@pytest.mark.django_db
async def test_send_email_change_request_successful(
app, test_client, create_new_user, send_email_change_request, get_email_change_requests):
client = await test_client(app)
user = await create_new_user()
new_email_address = 'xxx@xxx.xx'
await client.authorize(email=user['email'], password=user['password'])
await send_email_change_request(new_email_address=new_email_address, client=client, expected_status=200)
orig_address_request, new_address_request = await get_email_change_requests(new_email_address)
assert orig_address_request is not None
assert new_address_request is not None
@pytest.mark.django_db
async def test_approve_email_change_request_invalid_code(approve_email_change_request):
await approve_email_change_request(confirmation_code='xxx', expected_status=400)
@pytest.mark.django_db
async def test_approve_email_change_request_unknown_code(approve_email_change_request):
await approve_email_change_request(confirmation_code='F' * 32, expected_status=404)
@pytest.mark.django_db
async def test_approve_email_change_request_successful(
app, test_client, create_new_user, send_email_change_request, approve_email_change_request,
get_email_change_requests, get_user):
client = await test_client(app)
user = await create_new_user()
new_email_address = 'xxx@xxx.xx'
await client.authorize(email=user['email'], password=user['password'])
# Request email changing
await send_email_change_request(new_email_address=new_email_address, client=client, expected_status=200)
# Confirm both original and new address codes
orig_address_request, new_address_request = await get_email_change_requests(new_email_address)
await approve_email_change_request(confirmation_code=orig_address_request.code, expected_status=200)
await approve_email_change_request(confirmation_code=new_address_request.code, expected_status=200)
# Requests should be confirmed
orig_address_request, new_address_request = await get_email_change_requests(new_email_address)
assert orig_address_request.is_confirmed()
assert new_address_request.is_confirmed()
# User email should be changed
user = await get_user(user_id=user['id'])
assert user.email == new_email_address
# Attempt to confirm same codes again
await approve_email_change_request(confirmation_code=orig_address_request.code, expected_status=409)
await approve_email_change_request(confirmation_code=new_address_request.code, expected_status=409)
| 41.862745 | 114 | 0.786417 | 585 | 4,270 | 5.312821 | 0.126496 | 0.109717 | 0.144788 | 0.104569 | 0.824003 | 0.803411 | 0.767053 | 0.747104 | 0.705277 | 0.646075 | 0 | 0.008734 | 0.14192 | 4,270 | 101 | 115 | 42.277228 | 0.83952 | 0.037471 | 0 | 0.465753 | 0 | 0 | 0.035331 | 0.014133 | 0 | 0 | 0 | 0 | 0.068493 | 1 | 0.041096 | false | 0.041096 | 0.013699 | 0 | 0.136986 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1e9e9f3ac604ee2049ab42a6dc4d64cc64fd000a | 114 | py | Python | ado/apps/passengers/admin.py | edderleonardo/ado | fa3478e3aa55b71e7627a9b5017fa2fbff196c31 | [
"BSD-3-Clause"
] | null | null | null | ado/apps/passengers/admin.py | edderleonardo/ado | fa3478e3aa55b71e7627a9b5017fa2fbff196c31 | [
"BSD-3-Clause"
] | null | null | null | ado/apps/passengers/admin.py | edderleonardo/ado | fa3478e3aa55b71e7627a9b5017fa2fbff196c31 | [
"BSD-3-Clause"
] | null | null | null | from django.contrib import admin
from ado.apps.passengers.models import Passenger
admin.site.register(Passenger)
# tests/helpers/examples/cart/serializers.py (hyperleex/dependencies, BSD-2-Clause)
# -*- coding: utf-8 -*-
from rest_framework import serializers
class ItemSerializer(serializers.Serializer):
pass
class UserSerializer(serializers.Serializer):
pass
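DRF serializers like the ones above collect their declared `Field` attributes through a metaclass. A stdlib-only sketch of that declarative-collection pattern (heavily simplified; not DRF's real implementation, which also handles inheritance and field ordering):

```python
# Sketch of the declarative field-collection pattern used by
# serializers.Serializer; simplified for illustration.
class Field:
    pass


class SerializerMeta(type):
    def __new__(mcs, name, bases, attrs):
        # Gather every Field-valued class attribute at class-creation time.
        fields = {k: v for k, v in attrs.items() if isinstance(v, Field)}
        cls = super().__new__(mcs, name, bases, attrs)
        cls._declared_fields = fields
        return cls


class Serializer(metaclass=SerializerMeta):
    pass


class ItemSerializer(Serializer):
    name = Field()
    price = Field()
```

This is why an empty `class ItemSerializer(serializers.Serializer): pass` is still a valid serializer: the metaclass simply collects zero fields.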
# samples/src/main/resources/datasets/python/32.py (sritchie/kotlingrad, Apache-2.0)
def boolop4(a, b, c):
return a or (b and c)
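A note on `boolop4`: Python's `or` and `and` short-circuit and return one of their operands, not a coerced `bool`, so the function can return any of `a`, `b`, or `c` unchanged:

```python
def boolop4(a, b, c):
    return a or (b and c)

# `or` returns its first truthy operand; `and` returns its first falsy
# operand (or the last one if all are truthy).
print(boolop4(1, 0, 3))  # a truthy -> 1
print(boolop4(0, 2, 3))  # a falsy, b truthy -> c -> 3
print(boolop4(0, 0, 3))  # a falsy, b falsy -> b -> 0
```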
# tracky/__init__.py (sparrowml/tracky, MIT)
from .tracker import Tracker
# tests/wicked/test_basic.py (fevangelista/pyWicked, MIT)
def test_import():
import wicked
def test_orbital_space():
import wicked
wicked.reset_space()
assert wicked.num_spaces() == 0
if __name__ == "__main__":
test_import()
test_orbital_space()
| 14.466667 | 35 | 0.672811 | 27 | 217 | 4.814815 | 0.518519 | 0.107692 | 0.246154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005917 | 0.221198 | 217 | 14 | 36 | 15.5 | 0.763314 | 0 | 0 | 0.222222 | 0 | 0 | 0.036866 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.222222 | true | 0 | 0.444444 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
94f33a1947f604337df43a9b0ec594b452d97844 | 46,017 | py | Python | koku/api/report/test/aws/openshift/test_views.py | rubik-ai/koku | 3255d1c217b7b6685cb2e130bf4e025946e76fac | [
"Apache-2.0"
] | null | null | null | koku/api/report/test/aws/openshift/test_views.py | rubik-ai/koku | 3255d1c217b7b6685cb2e130bf4e025946e76fac | [
"Apache-2.0"
] | 3 | 2020-04-20T21:42:22.000Z | 2022-03-18T19:34:12.000Z | koku/api/report/test/aws/openshift/test_views.py | rubik-ai/koku | 3255d1c217b7b6685cb2e130bf4e025946e76fac | [
"Apache-2.0"
] | null | null | null | #
# Copyright 2021 Red Hat Inc.
# SPDX-License-Identifier: Apache-2.0
#
"""Test the OCP on AWS Report views."""
import datetime
import random
from urllib.parse import quote_plus
from urllib.parse import urlencode
from dateutil import relativedelta
from django.db.models import Count
from django.db.models import F
from django.db.models import Sum
from django.urls import reverse
from rest_framework import status
from rest_framework.test import APIClient
from rest_framework_csv.renderers import CSVRenderer
from tenant_schemas.utils import tenant_context
from api.iam.test.iam_test_case import IamTestCase
from api.query_handler import TruncDayString
from api.utils import DateHelper
from reporting.models import OCPAWSCostLineItemDailySummaryP
URLS = [
reverse("reports-openshift-aws-costs"),
reverse("reports-openshift-aws-storage"),
reverse("reports-openshift-aws-instance-type"),
]
GROUP_BYS = ["project", "cluster", "node", "account", "region", "instance_type", "service", "product_family"]
class OCPAWSReportViewTest(IamTestCase):
"""Tests the report view."""
@classmethod
def setUpClass(cls):
"""Set up the test class."""
super().setUpClass()
cls.dh = DateHelper()
cls.ten_days_ago = cls.dh.n_days_ago(cls.dh._now, 9)
def test_execute_query_ocp_aws_storage(self):
"""Test that OCP on AWS Storage endpoint works."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
response = client.get(url, **self.headers)
expected_end_date = self.dh.today.date().strftime("%Y-%m-%d")
expected_start_date = self.ten_days_ago.strftime("%Y-%m-%d")
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_start_date)
self.assertEqual(dates[-1], expected_end_date)
for item in data.get("data"):
if item.get("values"):
values = item.get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
def test_execute_query_ocp_aws_storage_last_thirty_days(self):
"""Test that OCP CPU endpoint works."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"filter[time_scope_value]": "-30", "filter[time_scope_units]": "day", "filter[resolution]": "daily"}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
expected_end_date = self.dh.today
expected_start_date = self.dh.n_days_ago(expected_end_date, 29)
expected_end_date = str(expected_end_date.date())
expected_start_date = str(expected_start_date.date())
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_start_date)
self.assertEqual(dates[-1], expected_end_date)
for item in data.get("data"):
if item.get("values"):
values = item.get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
def test_execute_query_ocp_aws_storage_this_month(self):
"""Test that data is returned for the full month."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
expected_date = self.dh.today.strftime("%Y-%m")
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_date)
self.assertNotEqual(data.get("data")[0].get("values", []), [])
values = data.get("data")[0].get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
def test_execute_query_ocp_aws_storage_this_month_daily(self):
"""Test that data is returned for the full month."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"filter[resolution]": "daily", "filter[time_scope_value]": "-1", "filter[time_scope_units]": "month"}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
expected_start_date = self.dh.this_month_start.strftime("%Y-%m-%d")
expected_end_date = self.dh.today.strftime("%Y-%m-%d")
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_start_date)
self.assertEqual(dates[-1], expected_end_date)
for item in data.get("data"):
if item.get("values"):
values = item.get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
def test_execute_query_ocp_aws_storage_last_month(self):
"""Test that data is returned for the last month."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-2",
"filter[time_scope_units]": "month",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
expected_date = self.dh.last_month_start.strftime("%Y-%m")
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_date)
self.assertIsNotNone(data.get("data")[0].get("values"))
self.assertNotEqual(data.get("data")[0].get("values"), [])
values = data.get("data")[0].get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
def test_execute_query_ocp_aws_storage_last_month_daily(self):
"""Test that data is returned for the full month."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"filter[resolution]": "daily", "filter[time_scope_value]": "-2", "filter[time_scope_units]": "month"}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
expected_start_date = self.dh.last_month_start.strftime("%Y-%m-%d")
expected_end_date = self.dh.last_month_end.strftime("%Y-%m-%d")
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_start_date)
self.assertEqual(dates[-1], expected_end_date)
for item in data.get("data"):
if item.get("values"):
values = item.get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
def test_execute_query_ocp_aws_storage_group_by_limit(self):
"""Test that OCP Mem endpoint works with limits."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"group_by[node]": "*",
"filter[limit]": "1",
"filter[time_scope_units]": "day",
"filter[time_scope_value]": "-10",
"filter[resolution]": "daily",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
data = response.data
with tenant_context(self.tenant):
totals = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(product_family__contains="Storage")
.values(*["usage_start"])
.annotate(usage=Sum("usage_amount"))
)
totals = {total.get("usage_start").strftime("%Y-%m-%d"): total.get("usage") for total in totals}
self.assertIn("nodes", data.get("data")[0])
# assert the others count is correct
meta = data.get("meta")
self.assertEqual(meta.get("others"), 2)
# Check if limit returns the correct number of results, and
# that the totals add up properly
for item in data.get("data"):
if item.get("nodes"):
date = item.get("date")
projects = item.get("nodes")
self.assertTrue(len(projects) <= 2)
if len(projects) == 2:
self.assertEqual(projects[1].get("node"), "Others")
usage_total = projects[0].get("values")[0].get("usage", {}).get("value") + projects[1].get(
"values"
)[0].get("usage", {}).get("value")
self.assertAlmostEqual(usage_total, totals.get(date))
def test_others_count_large_limit(self):
"""Test that OCP Mem endpoint works with limits."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"group_by[node]": "*",
"filter[limit]": "100",
"filter[time_scope_units]": "day",
"filter[time_scope_value]": "-10",
"filter[resolution]": "daily",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
data = response.data
# assert the others count is correct
meta = data.get("meta")
self.assertEqual(meta.get("others"), 0)
def test_execute_query_ocp_aws_storage_with_delta(self):
"""Test that deltas work for OpenShift on AWS storage."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"delta": "usage",
"filter[resolution]": "daily",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.data
this_month_start = self.dh.this_month_start
last_month_start = self.dh.last_month_start
date_delta = relativedelta.relativedelta(months=1)
def date_to_string(dt):
return dt.strftime("%Y-%m-%d")
def string_to_date(dt):
return datetime.datetime.strptime(dt, "%Y-%m-%d").date()
with tenant_context(self.tenant):
current_total = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=this_month_start)
.filter(product_family__contains="Storage")
.aggregate(usage=Sum(F("usage_amount")))
.get("usage")
)
current_total = current_total if current_total is not None else 0
current_totals = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=this_month_start)
.filter(product_family__contains="Storage")
.annotate(**{"date": TruncDayString("usage_start")})
.values(*["date"])
.annotate(usage=Sum(F("usage_amount")))
)
prev_totals = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=last_month_start)
.filter(usage_start__lt=this_month_start)
.filter(product_family__contains="Storage")
.annotate(**{"date": TruncDayString("usage_start")})
.values(*["date"])
.annotate(usage=Sum(F("usage_amount")))
)
current_totals = {total.get("date"): total.get("usage") for total in current_totals}
prev_total_dates = [
total.get("date")
for total in prev_totals
if date_to_string(string_to_date(total.get("date")) + date_delta) in current_totals
]
prev_totals = {
date_to_string(string_to_date(total.get("date")) + date_delta): total.get("usage")
for total in prev_totals
if date_to_string(string_to_date(total.get("date")) + date_delta) in current_totals
}
prev_total = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__in=prev_total_dates)
.filter(product_family__contains="Storage")
.aggregate(usage=Sum(F("usage_amount")))
.get("usage")
)
prev_total = prev_total if prev_total is not None else 0
expected_delta = current_total - prev_total
delta = data.get("meta", {}).get("delta", {}).get("value")
self.assertEqual(delta, expected_delta)
for item in data.get("data"):
date = item.get("date")
expected_delta = current_totals.get(date, 0) - prev_totals.get(date, 0)
values = item.get("values", [])
delta_value = 0
if values:
delta_value = values[0].get("delta_value")
self.assertAlmostEqual(delta_value, expected_delta)
def test_execute_query_ocp_aws_storage_group_by_project(self):
"""Test that grouping by project filters data."""
with tenant_context(self.tenant):
# Force Django to do GROUP BY to get nodes
projects = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(product_family__contains="Storage")
.values(*["namespace"])
.annotate(project_count=Count("namespace"))
.all()
)
self.assertNotEqual(len(projects), 0)
project_of_interest = projects[0].get("namespace")
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"group_by[project]": project_of_interest}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
for entry in data.get("data", []):
for project in entry.get("projects", []):
self.assertEqual(project.get("project"), project_of_interest)
def test_execute_query_ocp_aws_storage_group_by_cluster(self):
"""Test that grouping by cluster filters data."""
with tenant_context(self.tenant):
# Force Django to do GROUP BY to get nodes
clusters = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(product_family__contains="Storage")
.values(*["cluster_id"])
.annotate(cluster_count=Count("cluster_id"))
.all()
)
self.assertNotEqual(len(clusters), 0)
cluster_of_interest = clusters[0].get("cluster_id")
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"group_by[cluster]": cluster_of_interest}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
for entry in data.get("data", []):
for cluster in entry.get("clusters", []):
self.assertEqual(cluster.get("cluster"), cluster_of_interest)
def test_execute_query_group_by_pod_fails(self):
"""Test that grouping by pod filters data."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"group_by[pod]": "*"}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_execute_query_ocp_aws_storage_group_by_node(self):
"""Test that grouping by node filters data."""
with tenant_context(self.tenant):
# Force Django to do GROUP BY to get nodes
            nodes = (
                OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
                .filter(product_family__contains="Storage")
                .values(*["node"])
                .annotate(node_count=Count("node"))
                .all()
            )
self.assertNotEqual(len(nodes), 0)
node_of_interest = nodes[0].get("node")
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {"group_by[node]": node_of_interest}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
for entry in data.get("data", []):
for node in entry.get("nodes", []):
self.assertEqual(node.get("node"), node_of_interest)
def test_execute_query_ocp_aws_storage_with_tag_filter(self):
"""Test that data is filtered by tag key."""
with tenant_context(self.tenant):
labels = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(product_family__contains="Storage")
.values(*["tags"])
.first()
)
self.assertIsNotNone(labels)
tags = labels.get("tags")
filter_key = list(tags.keys())[0]
filter_value = tags.get(filter_key)
totals = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(**{f"tags__{filter_key}": filter_value})
.filter(product_family__contains="Storage")
.aggregate(**{"usage": Sum("usage_amount"), "cost": Sum(F("unblended_cost") + F("markup_cost"))})
)
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {f"filter[tag:{filter_key}]": filter_value}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
data_totals = data.get("meta", {}).get("total", {})
for key in totals:
expected = float(totals[key])
if key == "cost":
result = data_totals.get(key, {}).get("total").get("value")
else:
result = data_totals.get(key, {}).get("value")
self.assertEqual(result, expected)
def test_execute_query_ocp_aws_storage_with_wildcard_tag_filter(self):
"""Test that data is filtered to include entries with tag key."""
with tenant_context(self.tenant):
labels = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(product_family__contains="Storage")
.values(*["tags"])
.first()
)
self.assertIsNotNone(labels)
tags = labels.get("tags")
filter_key = list(tags.keys())[0]
totals = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(**{"tags__has_key": filter_key})
.filter(product_family__contains="Storage")
.aggregate(**{"usage": Sum("usage_amount"), "cost": Sum(F("unblended_cost") + F("markup_cost"))})
)
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {f"filter[tag:{filter_key}]": "*"}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
data_totals = data.get("meta", {}).get("total", {})
for key in totals:
expected = float(totals[key])
if key == "cost":
result = data_totals.get(key, {}).get("total").get("value")
else:
result = data_totals.get(key, {}).get("value")
self.assertEqual(result, expected)
def test_execute_query_ocp_aws_storage_with_tag_group_by(self):
"""Test that data is grouped by tag key."""
with tenant_context(self.tenant):
labels = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.filter(product_family__contains="Storage")
.values(*["tags"])
.first()
)
self.assertIsNotNone(labels)
self.assertNotEqual(len(labels), 0)
tags = labels.get("tags")
group_by_key = list(tags.keys())[0]
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {f"group_by[tag:{group_by_key}]": "*"}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
data = data.get("data", [])
expected_keys = ["date", group_by_key + "s"]
for entry in data:
self.assertEqual(list(entry.keys()), expected_keys)
def test_execute_query_ocp_aws_storage_with_group_by_tag_and_limit(self):
"""Test that data is grouped by tag key and limited."""
with tenant_context(self.tenant):
labels = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.dh.last_month_start)
.filter(usage_start__lte=self.dh.last_month_end)
.filter(product_family__contains="Storage")
.values(*["tags"])
.first()
)
self.assertIsNotNone(labels)
self.assertNotEqual(len(labels), 0)
tags = labels.get("tags")
group_by_key = list(tags.keys())[0]
plural_key = group_by_key + "s"
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-2",
"filter[time_scope_units]": "month",
f"group_by[tag:{group_by_key}]": "*",
"filter[limit]": 2,
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
data = data.get("data", [])
# default ordered by usage
previous_tag_usage = data[0].get(plural_key, [])[0].get("values", [{}])[0].get("usage", {}).get("value")
for entry in data[0].get(plural_key, []):
current_tag_usage = entry.get("values", [{}])[0].get("usage", {}).get("value")
if "Other" not in entry.get(group_by_key):
self.assertTrue(current_tag_usage <= previous_tag_usage)
previous_tag_usage = current_tag_usage
def test_execute_query_ocp_aws_storage_with_group_by_and_limit(self):
"""Test that data is grouped by and limited."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"group_by[node]": "*",
"filter[limit]": 1,
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-2",
"filter[time_scope_units]": "month",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
data = data.get("data", [])
for entry in data:
other = entry.get("nodes", [])[-1:]
self.assertNotEqual(other, [])
self.assertIn("Other", other[0].get("node"))
def test_execute_query_ocp_aws_storage_with_group_by_order_by_and_limit(self):
"""Test that data is grouped by and limited on order by."""
url = reverse("reports-openshift-aws-storage")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-2",
"filter[time_scope_units]": "month",
"group_by[node]": "*",
"order_by[usage]": "desc",
"filter[limit]": 1,
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
data = data.get("data", [])
self.assertNotEqual(data, [])
self.assertNotEqual(data[0].get("nodes", []), [])
self.assertNotEqual(data[0].get("nodes", [])[0].get("values", []), [])
previous_usage = data[0].get("nodes", [])[0].get("values", [])[0].get("usage", {}).get("value")
self.assertIsNotNone(previous_usage)
for entry in data[0].get("nodes", []):
current_usage = entry.get("values", [])[0].get("usage", {}).get("value")
self.assertTrue(current_usage <= previous_usage)
previous_usage = current_usage
def test_get_costs(self):
"""Test costs reports runs with a customer owner."""
url = reverse("reports-openshift-aws-costs")
client = APIClient()
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
json_result = response.json()
self.assertIsNotNone(json_result.get("data"))
self.assertIsInstance(json_result.get("data"), list)
self.assertTrue(len(json_result.get("data")) > 0)
def test_get_costs_invalid_query_param(self):
"""Test costs reports runs with an invalid query param."""
qs = "group_by%5Binvalid%5D=account1&filter%5Bresolution%5D=daily"
url = reverse("reports-openshift-aws-costs") + "?" + qs
client = APIClient()
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_get_costs_csv(self):
"""Test CSV output of costs reports."""
url = reverse("reports-openshift-aws-costs")
client = APIClient(HTTP_ACCEPT="text/csv")
response = client.get(url, **self.headers)
response.render()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.accepted_media_type, "text/csv")
self.assertIsInstance(response.accepted_renderer, CSVRenderer)
def test_execute_query_ocp_aws_costs_group_by_project(self):
"""Test that grouping by project filters data."""
with tenant_context(self.tenant):
# Force Django to do GROUP BY to get nodes
projects = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.values(*["namespace"])
.annotate(project_count=Count("namespace"))
.all()
)
project_of_interest = projects[0].get("namespace")
url = reverse("reports-openshift-aws-costs")
client = APIClient()
params = {"group_by[project]": project_of_interest}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
for entry in data.get("data", []):
for project in entry.get("projects", []):
self.assertEqual(project.get("project"), project_of_interest)
def test_execute_query_ocp_aws_instance_type(self):
"""Test that the instance type API runs."""
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
response = client.get(url, **self.headers)
expected_end_date = self.dh.today.date().strftime("%Y-%m-%d")
expected_start_date = self.ten_days_ago.strftime("%Y-%m-%d")
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
dates = sorted([item.get("date") for item in data.get("data")])
self.assertEqual(dates[0], expected_start_date)
self.assertEqual(dates[-1], expected_end_date)
for item in data.get("data"):
if item.get("values"):
values = item.get("values")[0]
self.assertTrue("usage" in values)
self.assertTrue("cost" in values)
self.assertTrue("count" in values)
def test_execute_query_ocp_aws_instance_type_by_project(self):
"""Test that the instance type API runs when grouped by project."""
with tenant_context(self.tenant):
# Force Django to do GROUP BY to get nodes
projects = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.values(*["namespace"])
.annotate(project_count=Count("namespace"))
.all()
)
self.assertNotEqual(len(projects), 0)
project_of_interest = projects[0].get("namespace")
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params = {"group_by[project]": project_of_interest}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
data = response.json()
for entry in data.get("data", []):
for project in entry.get("projects", []):
self.assertEqual(project.get("project"), project_of_interest)
def test_execute_query_default_pagination(self):
"""Test that the default pagination works."""
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_data = response.json()
data = response_data.get("data", [])
meta = response_data.get("meta", {})
count = meta.get("count", 0)
self.assertIn("total", meta)
self.assertIn("filter", meta)
self.assertIn("count", meta)
self.assertEqual(len(data), count)
def test_execute_query_limit_pagination(self):
"""Test that the default pagination works with a limit."""
limit = 2
start_date = self.ten_days_ago.date().strftime("%Y-%m-%d")
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params = {
"filter[resolution]": "daily",
"filter[time_scope_value]": "-10",
"filter[time_scope_units]": "day",
"limit": limit,
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_data = response.json()
data = response_data.get("data", [])
meta = response_data.get("meta", {})
count = meta.get("count", 0)
self.assertIn("total", meta)
self.assertIn("count", meta)
self.assertNotEqual(len(data), count)
if limit > count:
self.assertEqual(len(data), count)
else:
self.assertEqual(len(data), limit)
self.assertEqual(data[0].get("date"), start_date)
def test_execute_query_limit_offset_pagination(self):
"""Test that the default pagination works with an offset."""
limit = 1
offset = 1
start_date = (self.ten_days_ago + datetime.timedelta(days=offset)).date().strftime("%Y-%m-%d")
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params = {
"filter[resolution]": "daily",
"filter[time_scope_value]": "-10",
"filter[time_scope_units]": "day",
"limit": limit,
"offset": offset,
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_data = response.json()
data = response_data.get("data", [])
meta = response_data.get("meta", {})
count = meta.get("count", 0)
self.assertIn("total", meta)
self.assertIn("count", meta)
self.assertNotEqual(len(data), count)
if limit + offset > count:
self.assertEqual(len(data), max((count - offset), 0))
else:
self.assertEqual(len(data), limit)
self.assertEqual(data[0].get("date"), start_date)
def test_execute_query_filter_limit_offset_pagination(self):
"""Test that the ranked group pagination works."""
limit = 1
offset = 0
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
"group_by[project]": "*",
"filter[limit]": limit,
"filter[offset]": offset,
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_data = response.json()
data = response_data.get("data", [])
meta = response_data.get("meta", {})
count = meta.get("count", 0)
self.assertIn("total", meta)
self.assertIn("filter", meta)
self.assertIn("count", meta)
for entry in data:
projects = entry.get("projects", [])
if limit + offset > count:
self.assertEqual(len(projects), max((count - offset), 0))
else:
self.assertEqual(len(projects), limit)
def test_execute_query_filter_limit_high_offset_pagination(self):
"""Test that the default pagination works."""
limit = 1
offset = 10
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
"group_by[project]": "*",
"filter[limit]": limit,
"filter[offset]": offset,
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_data = response.json()
data = response_data.get("data", [])
meta = response_data.get("meta", {})
count = meta.get("count", 0)
self.assertIn("total", meta)
self.assertIn("filter", meta)
self.assertIn("count", meta)
for entry in data:
projects = entry.get("projects", [])
if limit + offset > count:
self.assertEqual(len(projects), max((count - offset), 0))
else:
self.assertEqual(len(projects), limit)
def test_execute_query_with_order_by(self):
"""Test that the possible order by options work."""
order_by_numeric = ["cost", "supplementary", "infrastructure", "usage", "delta"]
order_by_non_numeric = ["project", "cluster", "node", "account_alias", "region", "service", "product_family"]
baseurl = reverse("reports-openshift-aws-instance-type")
client = APIClient()
for option in order_by_numeric:
order_by_dict_key = f"order_by[{option}]"
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
order_by_dict_key: "desc",
}
if option == "delta":
params.update({"delta": "usage"})
url = baseurl + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
for option in order_by_non_numeric:
order_by_dict_key = f"order_by[{option}]"
group_by = option
if option == "account_alias":
group_by = "account"
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
order_by_dict_key: "desc",
f"group_by[{group_by}]": "*",
}
url = baseurl + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_order_by_tag_wo_group(self):
"""Test that order by tags without a group-by fails."""
baseurl = reverse("reports-openshift-aws-instance-type")
client = APIClient()
tag_url = reverse("openshift-aws-tags")
tag_url = tag_url + "?filter[time_scope_value]=-1&key_only=True"
response = client.get(tag_url, **self.headers)
tag_keys = response.data.get("data", [])
for key in tag_keys:
order_by_dict_key = f"order_by[tag:{key}]"
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
order_by_dict_key: random.choice(["asc", "desc"]),
}
url = baseurl + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_order_by_tag_w_wrong_group(self):
"""Test that order by tags with a non-matching group-by fails."""
baseurl = reverse("reports-openshift-aws-instance-type")
client = APIClient()
tag_url = reverse("openshift-aws-tags")
tag_url = tag_url + "?filter[time_scope_value]=-1&key_only=True"
response = client.get(tag_url, **self.headers)
tag_keys = response.data.get("data", [])
for key in tag_keys:
order_by_dict_key = f"order_by[tag:{key}]"
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
order_by_dict_key: random.choice(["asc", "desc"]),
"group_by[usage]": random.choice(["asc", "desc"]),
}
url = baseurl + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_order_by_tag_w_tag_group(self):
"""Test that order by tags with a matching group-by tag works."""
baseurl = reverse("reports-openshift-aws-instance-type")
client = APIClient()
tag_url = reverse("openshift-aws-tags")
tag_url = tag_url + "?filter[time_scope_value]=-1&key_only=True"
response = client.get(tag_url, **self.headers)
tag_keys = response.data.get("data", [])
for key in tag_keys:
order_by_dict_key = f"order_by[tag:{key}]"
group_by_dict_key = f"group_by[tag:{key}]"
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-1",
"filter[time_scope_units]": "month",
order_by_dict_key: random.choice(["asc", "desc"]),
group_by_dict_key: random.choice(["asc", "desc"]),
}
url = baseurl + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_process_multiple_tag_query_params(self):
"""Test that grouping by multiple tag keys returns a valid response."""
with tenant_context(self.tenant):
labels = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.ten_days_ago)
.values(*["tags"])
.first()
)
self.assertIsNotNone(labels)
tags = labels.get("tags")
qstr = "filter[limit]=2"
# pick a random subset of tags
kval = len(tags.keys())
if kval > 2:
kval = random.randint(2, len(tags.keys()))
selected_tags = random.sample(list(tags.keys()), k=kval)
for tag in selected_tags:
qstr += f"&group_by[tag:{tag}]=*"
url = reverse("reports-openshift-aws-costs") + "?" + qstr
client = APIClient()
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_group_bys_with_second_group_by_tag(self):
"""Test that a group by project followed by a group by tag does not error."""
with tenant_context(self.tenant):
labels = (
OCPAWSCostLineItemDailySummaryP.objects.filter(usage_start__gte=self.dh.last_month_start)
.filter(usage_start__lte=self.dh.last_month_end)
.values(*["tags"])
.first()
)
tags = labels.get("tags")
group_by_key = list(tags.keys())[0]
client = APIClient()
for url in URLS:
for group_by in GROUP_BYS:
params = {
"filter[resolution]": "monthly",
"filter[time_scope_value]": "-2",
"filter[time_scope_units]": "month",
f"group_by[{group_by}]": "*",
f"group_by[tag:{group_by_key}]": "*",
}
url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_order_by_delta(self):
"""Test that the order_by delta with pagination does not error."""
limit = 5
offset = 0
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params_list = [
{"filter[limit]": limit, "filter[offset]": offset, "order_by[delta]": "asc", "delta": "usage"},
{"order_by[delta]": "asc", "delta": "usage"},
]
for params in params_list:
query_url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(query_url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_200_OK)
response_data = response.json()
data = response_data.get("data", [])
meta = response_data.get("meta", {})
self.assertIn("total", meta)
self.assertIn("filter", meta)
self.assertIn("count", meta)
compared_deltas = False
for day in data:
previous_delta = None
for instance_type in day.get("instance_types", []):
values = instance_type.get("values", [])
if values:
current_delta = values[0].get("delta_value")
if previous_delta:
self.assertLessEqual(previous_delta, current_delta)
compared_deltas = True
previous_delta = current_delta
else:
previous_delta = current_delta
self.assertTrue(compared_deltas)
def test_order_by_delta_no_delta(self):
"""Test that the order_by delta with no delta passed in triggers 400."""
limit = 5
offset = 0
url = reverse("reports-openshift-aws-instance-type")
client = APIClient()
params_list = [
{"filter[limit]": limit, "filter[offset]": offset, "order_by[delta]": "asc"},
{"order_by[delta]": "asc"},
]
for params in params_list:
query_url = url + "?" + urlencode(params, quote_via=quote_plus)
response = client.get(query_url, **self.headers)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
| 42.063071 | 119 | 0.587978 | 5,238 | 46,017 | 4.952654 | 0.059756 | 0.04221 | 0.027176 | 0.040089 | 0.818518 | 0.792344 | 0.764744 | 0.754529 | 0.735371 | 0.699561 | 0 | 0.007959 | 0.279158 | 46,017 | 1,093 | 120 | 42.101555 | 0.77411 | 0.052568 | 0 | 0.691441 | 0 | 0 | 0.136475 | 0.060561 | 0 | 0 | 0 | 0 | 0.155405 | 1 | 0.046171 | false | 0 | 0.019144 | 0.002252 | 0.068694 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
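The limit/offset branching repeated in the pagination tests above boils down to one rule: a page holds `min(limit, max(count - offset, 0))` items. A standalone sketch of that rule (the helper name is hypothetical, not part of the suite):

```python
def expected_page_length(count, limit, offset=0):
    """Number of items a limit/offset page should return.

    Mirrors the branching in the pagination tests: when limit + offset
    exceeds the total count, only the remainder (never negative) comes
    back; otherwise a full page of `limit` items.
    """
    if limit + offset > count:
        return max(count - offset, 0)
    return limit

# The branches exercised by the tests above:
assert expected_page_length(count=10, limit=2, offset=0) == 2   # full page
assert expected_page_length(count=3, limit=5, offset=1) == 2    # remainder
assert expected_page_length(count=3, limit=1, offset=10) == 0   # past the end
```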
94fd0f250fb8d9b97333c5278803fb806414af59 | 39 | py | Python | pybimodal/__init__.py | machism0/bimodal-qd-micropillars | f4339473ed5ff57ad6dcdb1b875c9407f4de1ee6 | [
"Apache-2.0"
] | 1 | 2021-12-09T01:33:54.000Z | 2021-12-09T01:33:54.000Z | pybimodal/__init__.py | machism0/bimodal-qd-micropillars | f4339473ed5ff57ad6dcdb1b875c9407f4de1ee6 | [
"Apache-2.0"
] | null | null | null | pybimodal/__init__.py | machism0/bimodal-qd-micropillars | f4339473ed5ff57ad6dcdb1b875c9407f4de1ee6 | [
"Apache-2.0"
] | null | null | null | from .branch import param_names, Branch | 39 | 39 | 0.846154 | 6 | 39 | 5.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 39 | 1 | 39 | 39 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bfb80e3d8f37afd8b84841cfe598bda2f134fd25 | 194 | py | Python | AC/IntroPython/python-if/b4.py | samirsaravia/ubiquitous-octo-fortnigh | c197945e7e849ddfece02e34e0c6b8f03e50c7dd | [
"MIT"
] | null | null | null | AC/IntroPython/python-if/b4.py | samirsaravia/ubiquitous-octo-fortnigh | c197945e7e849ddfece02e34e0c6b8f03e50c7dd | [
"MIT"
] | 1 | 2021-03-04T22:03:05.000Z | 2021-03-04T22:03:05.000Z | AC/IntroPython/python-if/b4.py | samirsaravia/ubiquitous-octo-fortnight | c197945e7e849ddfece02e34e0c6b8f03e50c7dd | [
"MIT"
] | null | null | null | # value types
print(type('Ola mundo'))
print(type(7))
print(type(True)) # bool literals always start with a capital letter
print(type(False))
print(type('True')) # gotcha: this is a string, not a bool
print(type('False'))
| 19.4 | 59 | 0.706186 | 29 | 194 | 4.724138 | 0.586207 | 0.394161 | 0.189781 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005882 | 0.123711 | 194 | 9 | 60 | 21.555556 | 0.8 | 0.335052 | 0 | 0 | 0 | 0 | 0.144 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
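The point of the snippet above, that `True`/`False` are of type `bool` only when written unquoted, can be checked directly (a minimal sketch):

```python
# Unquoted literals are bools; quoted ones are plain strings.
assert type(True) is bool
assert type('True') is str

# The string 'False' is truthy because any non-empty string is truthy:
assert bool('False') is True
```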
781a053753e617faf4749dffea21a713d646b819 | 47 | py | Python | allencv/nn/__init__.py | sethah/allencv | 1bdc27359f81290e96b290ccda11f7a9905ebf14 | [
"Apache-2.0"
] | 8 | 2019-05-09T02:48:54.000Z | 2022-02-14T03:58:54.000Z | allencv/nn/__init__.py | sethah/allencv | 1bdc27359f81290e96b290ccda11f7a9905ebf14 | [
"Apache-2.0"
] | null | null | null | allencv/nn/__init__.py | sethah/allencv | 1bdc27359f81290e96b290ccda11f7a9905ebf14 | [
"Apache-2.0"
] | null | null | null | from allencv.nn.common import StdConv, Upsample | 47 | 47 | 0.851064 | 7 | 47 | 5.714286 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 47 | 1 | 47 | 47 | 0.930233 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
781d23efd5ae05645a178b161eb376c26a46b58d | 64 | py | Python | newspaper/utils/__init__.py | saraivaufc/jornalEletronico | d53da33c4684e1403f7e4b8943d5306e053afb02 | [
"MIT"
] | null | null | null | newspaper/utils/__init__.py | saraivaufc/jornalEletronico | d53da33c4684e1403f7e4b8943d5306e053afb02 | [
"MIT"
] | null | null | null | newspaper/utils/__init__.py | saraivaufc/jornalEletronico | d53da33c4684e1403f7e4b8943d5306e053afb02 | [
"MIT"
] | null | null | null | from .news import *
from .comment import *
from .tools import *
| 16 | 22 | 0.71875 | 9 | 64 | 5.111111 | 0.555556 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 64 | 3 | 23 | 21.333333 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7856938011901321a38ff5e82058c1cbb5daa465 | 74 | py | Python | python/src/test/resources/pyfunc/numpy_random4_test.py | maropu/lljvm-translator | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 70 | 2017-12-12T10:54:00.000Z | 2022-03-22T07:45:19.000Z | python/src/test/resources/pyfunc/numpy_random4_test.py | maropu/lljvm-as | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 14 | 2018-02-28T01:29:46.000Z | 2019-12-10T01:42:22.000Z | python/src/test/resources/pyfunc/numpy_random4_test.py | maropu/lljvm-as | 322fbe24a27976948c8e8081a9552152dda58b4b | [
"Apache-2.0"
] | 4 | 2019-07-21T07:58:25.000Z | 2021-02-01T09:46:59.000Z | import numpy as np
def numpy_random4_test(s):
return np.random.rand(s)
| 14.8 | 26 | 0.756757 | 14 | 74 | 3.857143 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0.148649 | 74 | 4 | 27 | 18.5 | 0.84127 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
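For context on the one-liner above: `numpy.random.rand(s)` returns a 1-D float64 array of length `s` with samples drawn uniformly from [0, 1). A quick self-contained check:

```python
import numpy as np

def numpy_random4_test(s):
    return np.random.rand(s)

out = numpy_random4_test(5)
assert out.shape == (5,)                # 1-D array of the requested length
assert out.dtype == np.float64          # rand always yields float64
assert ((out >= 0) & (out < 1)).all()   # samples lie in [0, 1)
```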
7856f635e2d95e80e68c315b4a937f3fb2aaa2dd | 5,613 | py | Python | tools/annotate_coco.py | mengli11235/fruit | 568833d36a5cae0b345775c818223ff00a12f496 | [
"Apache-2.0"
] | null | null | null | tools/annotate_coco.py | mengli11235/fruit | 568833d36a5cae0b345775c818223ff00a12f496 | [
"Apache-2.0"
] | null | null | null | tools/annotate_coco.py | mengli11235/fruit | 568833d36a5cae0b345775c818223ff00a12f496 | [
"Apache-2.0"
] | null | null | null | ## Convert Pascal VOC-style XML annotations to COCO-format JSON
import json
import os
import cv2
from xml.dom.minidom import parse
import xml.dom.minidom
if __name__ == '__main__':
path = '../dataset_AgrilFruit_forCounting/exp1/'
# fruit classes
folders = ['lemon', 'custardapple', 'apple', 'pear', 'persimmon']
train_dataset = {'categories':[], 'images':[], 'annotations':[]}
test_dataset = {'categories':[], 'images':[], 'annotations':[]}
# create coco categories
for i, j in enumerate(folders, 0):
train_dataset['categories'].append({'id': i, 'name': j, 'supercategory': 'mark'})
test_dataset['categories'].append({'id': i, 'name': j, 'supercategory': 'mark'})
# train annotations
for i, f in enumerate(os.listdir(os.path.join(path, 'train_xml'))):
# read through xml tree
DOMTree = xml.dom.minidom.parse(os.path.join(path, 'train_xml', f))
collection = DOMTree.documentElement
filename = collection.getElementsByTagName("filename")[0]
folder = collection.getElementsByTagName("folder")[0]
size = collection.getElementsByTagName("size")[0]
width = size.getElementsByTagName("width")[0]
height = size.getElementsByTagName("height")[0]
objects = collection.getElementsByTagName("object")
train_dataset['images'].append({'file_name': filename.childNodes[0].data,
'id': i,
'width': int(width.childNodes[0].data),
'height': int(height.childNodes[0].data)})
for k,object in enumerate(objects):
bndbox = object.getElementsByTagName('bndbox')[0]
name = object.getElementsByTagName('name')[0]
xmin = bndbox.getElementsByTagName('xmin')[0]
ymin = bndbox.getElementsByTagName('ymin')[0]
xmax = bndbox.getElementsByTagName('xmax')[0]
ymax = bndbox.getElementsByTagName('ymax')[0]
if name.childNodes[0].data.lower() == folder.childNodes[0].data.lower():
x1 = float(xmin.childNodes[0].data)
y1 = float(ymin.childNodes[0].data)
x2 = float(xmax.childNodes[0].data)
y2 = float(ymax.childNodes[0].data)
width_object = max(0, x2 - x1)
height_object = max(0, y2 - y1)
train_dataset['annotations'].append({'area': width_object * height_object,
'bbox': [x1, y1, width_object, height_object],
'category_id': folders.index(folder.childNodes[0].data.lower()),
'id': i*100+k,
'image_id': i,
'iscrowd': 0,
'segmentation': [[x1, y1, x2, y1, x2, y2, x1, y2]]})
# test annotations
for i, f in enumerate(os.listdir(os.path.join(path, 'test_xml'))):
DOMTree = xml.dom.minidom.parse(os.path.join(path, 'test_xml', f))
collection = DOMTree.documentElement
filename = collection.getElementsByTagName("filename")[0]
folder = collection.getElementsByTagName("folder")[0]
size = collection.getElementsByTagName("size")[0]
width = size.getElementsByTagName("width")[0]
height = size.getElementsByTagName("height")[0]
objects = collection.getElementsByTagName("object")
test_dataset['images'].append({'file_name': filename.childNodes[0].data,
'id': i,
'width': int(width.childNodes[0].data),
'height': int(height.childNodes[0].data)})
for k,object in enumerate(objects):
bndbox = object.getElementsByTagName('bndbox')[0]
name = object.getElementsByTagName('name')[0]
xmin = bndbox.getElementsByTagName('xmin')[0]
ymin = bndbox.getElementsByTagName('ymin')[0]
xmax = bndbox.getElementsByTagName('xmax')[0]
ymax = bndbox.getElementsByTagName('ymax')[0]
if name.childNodes[0].data.lower() == folder.childNodes[0].data.lower():
x1 = float(xmin.childNodes[0].data)
y1 = float(ymin.childNodes[0].data)
x2 = float(xmax.childNodes[0].data)
y2 = float(ymax.childNodes[0].data)
width_object = max(0, x2 - x1)
height_object = max(0, y2 - y1)
test_dataset['annotations'].append({'area': width_object * height_object,
'bbox': [x1, y1, width_object, height_object],
'category_id': folders.index(folder.childNodes[0].data.lower()),
'id': i*100+k,
'image_id': i,
'iscrowd': 0,
'segmentation': [[x1, y1, x2, y1, x2, y2, x1, y2]]})
# save the json file
save_folder = os.path.join(path, 'annotations')
if not os.path.exists(save_folder):
os.makedirs(save_folder)
json_name = os.path.join(save_folder, 'annotations_train.json')
with open(json_name, 'w') as f:
json.dump(train_dataset, f)
json_name = os.path.join(save_folder, 'annotations_test.json')
with open(json_name, 'w') as f:
json.dump(test_dataset, f) | 56.69697 | 110 | 0.540709 | 571 | 5,613 | 5.222417 | 0.176883 | 0.073776 | 0.100604 | 0.040241 | 0.824279 | 0.824279 | 0.816231 | 0.816231 | 0.790074 | 0.731724 | 0 | 0.024499 | 0.323713 | 5,613 | 99 | 111 | 56.69697 | 0.761064 | 0.029574 | 0 | 0.688889 | 0 | 0 | 0.108949 | 0.015194 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
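The per-object math repeated in both loops above (VOC corner coordinates turned into a COCO `bbox`, `area`, and rectangle `segmentation`) can be isolated into one small function. A sketch under a hypothetical name, not part of the script:

```python
def voc_box_to_coco(x1, y1, x2, y2):
    """Convert a VOC-style corner box into COCO bbox, area and polygon."""
    w = max(0, x2 - x1)   # clamp, as the script does, so degenerate boxes give area 0
    h = max(0, y2 - y1)
    return {
        'bbox': [x1, y1, w, h],  # COCO stores [x, y, width, height]
        'area': w * h,
        # Rectangle polygon: the four corners in clockwise order.
        'segmentation': [[x1, y1, x2, y1, x2, y2, x1, y2]],
    }

ann = voc_box_to_coco(10.0, 20.0, 30.0, 60.0)
assert ann['bbox'] == [10.0, 20.0, 20.0, 40.0]
assert ann['area'] == 800.0
```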
788bef865bbb34a4127a7e19a613622d87cec5b5 | 480 | py | Python | bitmovin_api_sdk/encoding/encodings/muxings/fmp4/__init__.py | jaythecaesarean/bitmovin-api-sdk-python | 48166511fcb9082041c552ace55a9b66cc59b794 | [
"MIT"
] | 11 | 2019-07-03T10:41:16.000Z | 2022-02-25T21:48:06.000Z | bitmovin_api_sdk/encoding/encodings/muxings/fmp4/__init__.py | jaythecaesarean/bitmovin-api-sdk-python | 48166511fcb9082041c552ace55a9b66cc59b794 | [
"MIT"
] | 8 | 2019-11-23T00:01:25.000Z | 2021-04-29T12:30:31.000Z | bitmovin_api_sdk/encoding/encodings/muxings/fmp4/__init__.py | jaythecaesarean/bitmovin-api-sdk-python | 48166511fcb9082041c552ace55a9b66cc59b794 | [
"MIT"
] | 13 | 2020-01-02T14:58:18.000Z | 2022-03-26T12:10:30.000Z | from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.fmp4_api import Fmp4Api
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.customdata.customdata_api import CustomdataApi
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.information.information_api import InformationApi
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.drm.drm_api import DrmApi
from bitmovin_api_sdk.encoding.encodings.muxings.fmp4.fmp4_muxing_list_query_params import Fmp4MuxingListQueryParams
| 80 | 116 | 0.9 | 66 | 480 | 6.272727 | 0.318182 | 0.144928 | 0.181159 | 0.217391 | 0.574879 | 0.574879 | 0.574879 | 0.574879 | 0.241546 | 0 | 0 | 0.019565 | 0.041667 | 480 | 5 | 117 | 96 | 0.880435 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
78b64d325b104a1e054346b7b3d3b268afae1f3a | 86 | py | Python | xylophone/project/rooms/room1.py | Turysaz/pyframework | da44b8127aa6b89d6cdb3bdb564c386520b37e22 | [
"MIT"
] | null | null | null | xylophone/project/rooms/room1.py | Turysaz/pyframework | da44b8127aa6b89d6cdb3bdb564c386520b37e22 | [
"MIT"
] | 6 | 2018-04-09T20:57:14.000Z | 2018-04-09T21:18:12.000Z | xylophone/project/rooms/room1.py | Turysaz/xylophone | da44b8127aa6b89d6cdb3bdb564c386520b37e22 | [
"MIT"
] | null | null | null |
def pre_step(self):
print("TestPRE")
def post_step(self):
print("TestPOST")
| 12.285714 | 21 | 0.651163 | 12 | 86 | 4.5 | 0.666667 | 0.296296 | 0.481481 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 86 | 6 | 22 | 14.333333 | 0.771429 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
78d34f22e6e63dfcc82c8b4da42665ab9d3e915d | 148 | py | Python | DeBERTa/apps/models/__init__.py | vilhub/DeBERTa | 87580930689ec9f75ef8dbebba367953ed3dfe63 | [
"MIT"
] | 916 | 2020-06-09T01:32:41.000Z | 2022-03-31T10:14:33.000Z | DeBERTa/apps/models/__init__.py | vilhub/DeBERTa | 87580930689ec9f75ef8dbebba367953ed3dfe63 | [
"MIT"
] | 77 | 2020-06-27T15:48:25.000Z | 2022-03-30T20:52:58.000Z | DeBERTa/apps/models/__init__.py | vilhub/DeBERTa | 87580930689ec9f75ef8dbebba367953ed3dfe63 | [
"MIT"
] | 123 | 2020-06-09T01:54:53.000Z | 2022-03-30T13:22:41.000Z | from .ner import *
from .multi_choice import *
from .sequence_classification import *
from .record_qa import *
from .masked_language_model import *
| 24.666667 | 38 | 0.797297 | 20 | 148 | 5.65 | 0.6 | 0.353982 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 148 | 5 | 39 | 29.6 | 0.882813 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1584b07aebce815cbb1c25624e543f38df332dbb | 1,810 | py | Python | Graph-Algorithm/SIFT/hessian.py | ZhongHouyu/CVCode | e9576dfbdd1ae2ff986dadde3183eb6bc0380f76 | [
"Apache-2.0"
] | 30 | 2018-09-20T14:37:35.000Z | 2020-10-21T05:17:07.000Z | Graph-Algorithm/SIFT/hessian.py | ZhongHouyu/CVCode | e9576dfbdd1ae2ff986dadde3183eb6bc0380f76 | [
"Apache-2.0"
] | 1 | 2019-11-29T07:48:58.000Z | 2019-12-20T06:01:53.000Z | Graph-Algorithm/SIFT/hessian.py | ZhongHouyu/CVCode | e9576dfbdd1ae2ff986dadde3183eb6bc0380f76 | [
"Apache-2.0"
] | 9 | 2018-10-10T04:37:46.000Z | 2022-03-31T02:03:44.000Z | from PIL import Image
import numpy
import scipy
from scipy.ndimage import filters
class hessian(object):
def __init__(self):
self.eps = 0.000001
self.patternEdgeThreshold = 4.1
self.sourceEdgeThreshold = 4.1
def patEdgeDetect(self,arr):
"""
takes an image array as input and returns the pixels on the edges of the
pattern image
"""
imx = numpy.zeros(arr.shape)
filters.gaussian_filter(arr, (3,3), (0,1), imx)
imy = numpy.zeros(arr.shape)
filters.gaussian_filter(arr, (3,3), (1,0), imy)
Wxx = filters.gaussian_filter(imx*imx,3)
Wxy = filters.gaussian_filter(imx*imy,3)
Wyy = filters.gaussian_filter(imy*imy,3)
# compute the determinant and trace of the array
Wdet = Wxx*Wyy - Wxy**2
Wtr = Wxx + Wyy
# This threshold value is set by (r+1)**2/r and experiments
Thres = self.patternEdgeThreshold
coor = []
Hess = Wtr**2/(Wdet+self.eps)
re = numpy.where(Hess>Thres)
Num = len(re[0])
for i in range(Num):
coor.append((re[0][i],re[1][i]))
return tuple(coor)
def Srcedgedetect(self,arr):
"""
takes an image array as input and returns the pixels on the edges of the
source image
"""
imx = numpy.zeros(arr.shape)
filters.gaussian_filter(arr, (3,3), (0,1), imx)
imy = numpy.zeros(arr.shape)
filters.gaussian_filter(arr, (3,3), (1,0), imy)
Wxx = filters.gaussian_filter(imx*imx,3)
Wxy = filters.gaussian_filter(imx*imy,3)
Wyy = filters.gaussian_filter(imy*imy,3)
# compute the determinant and trace of the array
Wdet = Wxx*Wyy - Wxy**2
Wtr = Wxx + Wyy
# This threshold value is set by (r+1)**2/r and experiments
Thres = self.sourceEdgeThreshold
coor = []
Hess = Wtr**2/(Wdet+self.eps)
re = numpy.where(Hess>Thres)
Num = len(re[0])
for i in range(Num):
coor.append((re[0][i],re[1][i]))
return tuple(coor)
| 23.506494 | 71 | 0.669061 | 295 | 1,810 | 4.057627 | 0.254237 | 0.125313 | 0.175439 | 0.06015 | 0.793651 | 0.793651 | 0.793651 | 0.793651 | 0.793651 | 0.793651 | 0 | 0.032038 | 0.189503 | 1,810 | 76 | 72 | 23.815789 | 0.783913 | 0.20663 | 0 | 0.711111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.088889 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
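The edge test shared by both methods above is the curvature-ratio criterion: keep a pixel when tr(H)^2 / (det(H) + eps) exceeds a threshold (the file's comments relate the threshold to (r+1)^2/r for a principal-curvature ratio r). A minimal numpy sketch of just that criterion, applied to precomputed structure-tensor entries:

```python
import numpy as np

def edge_mask(Wxx, Wxy, Wyy, threshold=4.1, eps=1e-6):
    """Boolean mask of pixels whose trace^2/det ratio exceeds the threshold."""
    det = Wxx * Wyy - Wxy ** 2   # determinant of the 2x2 tensor
    tr = Wxx + Wyy               # trace of the 2x2 tensor
    return tr ** 2 / (det + eps) > threshold

# Equal curvatures give a ratio of ~4.0, just under the 4.1 threshold:
iso = edge_mask(np.array([[2.0]]), np.array([[0.0]]), np.array([[2.0]]))
assert not iso[0, 0]

# A strongly anisotropic, edge-like response passes easily (ratio ~102):
edge = edge_mask(np.array([[10.0]]), np.array([[0.0]]), np.array([[0.1]]))
assert edge[0, 0]
```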
ecc5b394b148c00ac07c439e2accfb694ebb58e5 | 156 | py | Python | test_twarc_timeline_archive.py | DocNow/twarc-timeline-archive | e19179b4a1afe59fd64d98792bcf90addf22c33c | [
"MIT"
] | 2 | 2021-05-14T19:32:03.000Z | 2021-05-24T02:01:36.000Z | test_twarc_timeline_archive.py | DocNow/twarc-timeline-archive | e19179b4a1afe59fd64d98792bcf90addf22c33c | [
"MIT"
] | 4 | 2022-02-08T18:03:48.000Z | 2022-03-30T18:29:07.000Z | test_twarc_timeline_archive.py | DocNow/twarc-timeline-archive | e19179b4a1afe59fd64d98792bcf90addf22c33c | [
"MIT"
] | 1 | 2021-04-30T09:55:44.000Z | 2021-04-30T09:55:44.000Z | from click.testing import CliRunner
from twarc_timeline_archive import timeline_archive
runner = CliRunner()
def test_timeline_archive():
pass # todo
| 19.5 | 51 | 0.807692 | 20 | 156 | 6.05 | 0.65 | 0.371901 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141026 | 156 | 7 | 52 | 22.285714 | 0.902985 | 0.025641 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 0 | 1 | 0.2 | false | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
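The stub above creates a `CliRunner` but never invokes anything; the usual pattern with `click.testing.CliRunner` is to call `invoke` on a command and assert on `exit_code` and `output`. A sketch against a toy command (not the real `timeline_archive` CLI, whose options are not shown here):

```python
import click
from click.testing import CliRunner

@click.command()
@click.option('--name', default='world')
def greet(name):
    """Toy command standing in for the real CLI under test."""
    click.echo(f'hello {name}')

runner = CliRunner()
result = runner.invoke(greet, ['--name', 'twarc'])
assert result.exit_code == 0
assert result.output == 'hello twarc\n'   # click.echo appends a newline
```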
019f4d45dde0781d4905fcf3cec3819325c8612f | 25,719 | py | Python | assets/tests/workers/managed_feeds/test_managed_feeds_postgres_manager.py | ellerbrock/quickstart-47lining-industrial-data-connector | ffd98ea68366f4a24bb30b1dcdce5a76d603ba3b | [
"Apache-2.0"
] | 1 | 2019-04-15T16:27:50.000Z | 2019-04-15T16:27:50.000Z | assets/tests/workers/managed_feeds/test_managed_feeds_postgres_manager.py | DalavanCloud/quickstart-47lining-industrial-data-connector | 6ad5530a793cfc22fe71677f14cc5775b2971697 | [
"Apache-2.0"
] | null | null | null | assets/tests/workers/managed_feeds/test_managed_feeds_postgres_manager.py | DalavanCloud/quickstart-47lining-industrial-data-connector | 6ad5530a793cfc22fe71677f14cc5775b2971697 | [
"Apache-2.0"
] | null | null | null | import datetime
import json
from io import BytesIO
from operator import itemgetter
from freezegun import freeze_time
from mock import call
from model.models import SyncAfEvent, EventStatus, PiPoint, SubscriptionStatus, Event, InterpolateEvent, \
SyncPiPointsEvent, BackfillEvent, SubscribeEvent, UnsubscribeEvent, Settings
from tests.fixtures import *
def test_get_recent_events(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
SyncAfEvent(
id='1',
error_message='Fail',
status=EventStatus.failure,
update_timestamp=datetime.datetime(2017, 1, 4),
s3_bucket='bucket',
s3_key='key',
database='database'
),
SyncPiPointsEvent(
id='2',
status=EventStatus.success,
update_timestamp=datetime.datetime(2017, 1, 3),
s3_bucket='bucket',
s3_key='key'
),
InterpolateEvent(
id='3',
status=EventStatus.success,
update_timestamp=datetime.datetime(2017, 1, 2),
pi_points=['point1', 'point2'],
name='name'
),
BackfillEvent(
id='4',
status=EventStatus.success,
update_timestamp=datetime.datetime(2017, 1, 1),
pi_points=['point1', 'point2'],
)
])
events = managed_feeds_postgres_manager.get_recent_events(3)
assert events == [
{
'database': 'database',
'error_message': 'Fail',
'id': '1',
's3_bucket': 'bucket',
's3_key': 'key',
'status': EventStatus.failure,
'event_type': 'sync_af',
'update_timestamp': datetime.datetime(2017, 1, 4)
},
{
'error_message': None,
'id': '2',
's3_bucket': 'bucket',
's3_key': 'key',
'status': EventStatus.success,
'event_type': 'sync_pi_points',
'update_timestamp': datetime.datetime(2017, 1, 3)
},
{
'error_message': None,
'name': 'name',
'id': '3',
'pi_points': ['point1', 'point2'],
'status': EventStatus.success,
'event_type': 'interpolate',
'update_timestamp': datetime.datetime(2017, 1, 2)
}
]
def test_get_settings(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
Settings(
name='Setting1',
value='value1'
),
Settings(
name='Setting2',
value='value2'
)
])
settings = managed_feeds_postgres_manager.get_settings()
assert settings == {
'Setting1': 'value1',
'Setting2': 'value2'
}
def test_save_settings(managed_feeds_postgres_manager, postgres_session):
settings = {
'Setting1': 'value1',
'Setting2': 'value2'
}
managed_feeds_postgres_manager.save_settings(settings)
settings_from_db = managed_feeds_postgres_manager.get_settings()
assert settings_from_db == settings
def test_get_pi_points(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
PiPoint(
pi_point='point1',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 1)
),
PiPoint(
pi_point='point2',
subscription_status=SubscriptionStatus.pending,
update_timestamp=datetime.datetime(2017, 1, 2)
),
PiPoint(
pi_point='point3',
subscription_status=SubscriptionStatus.unsubscribed,
update_timestamp=datetime.datetime(2017, 1, 3)
)
])
points_data = managed_feeds_postgres_manager.get_pi_points()
points_data['pi_points'] = sorted(points_data['pi_points'], key=itemgetter('pi_point'))
assert points_data == {
'pi_points': [
{
'pi_point': 'point1',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 1, 0, 0)
},
{
'pi_point': 'point2',
'subscription_status': SubscriptionStatus.pending,
'update_timestamp': datetime.datetime(2017, 1, 2, 0, 0)
},
{
'pi_point': 'point3',
'subscription_status': SubscriptionStatus.unsubscribed,
'update_timestamp': datetime.datetime(2017, 1, 3, 0, 0)
}
],
'total_count': 3
}
def test_get_pi_points_with_pagination(managed_feeds_postgres_manager, postgres_session):
for i in range(10):
pi_point = PiPoint(
pi_point='point{}'.format(i),
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 1)
)
postgres_session.add(pi_point)
points_data = managed_feeds_postgres_manager.get_pi_points(page=2, page_size=2)
points_data['pi_points'] = sorted(points_data['pi_points'], key=itemgetter('pi_point'))
assert points_data == {
'pi_points': [
{
'pi_point': 'point4',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 1, 0, 0)
},
{
'pi_point': 'point5',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 1, 0, 0)
}
],
'total_count': 10
}
def test_search_pi_points_with_query(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
PiPoint(
pi_point='name-test-name',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 1)
),
PiPoint(
pi_point='test-test-test',
subscription_status=SubscriptionStatus.pending,
update_timestamp=datetime.datetime(2017, 1, 2)
)
])
points_data = managed_feeds_postgres_manager.search_pi_points(pattern='name-*-name')
assert points_data == {
'pi_points': [
{
'pi_point': 'name-test-name',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 1, 0, 0)
}
],
'total_count': 1
}
def test_search_pi_points_using_regex(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
PiPoint(
pi_point='test1',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 1)
),
PiPoint(
pi_point='Test12',
subscription_status=SubscriptionStatus.pending,
update_timestamp=datetime.datetime(2017, 1, 2)
)
])
points_data = managed_feeds_postgres_manager.search_pi_points(
pattern='[A-Z]{1}[a-z]?[0-9]?',
use_regex=True
)
assert points_data == {
'pi_points': [
{
'pi_point': 'Test12',
'subscription_status': SubscriptionStatus.pending,
'update_timestamp': datetime.datetime(2017, 1, 2, 0, 0)
}
],
'total_count': 1
}
def test_search_pi_points_with_pi_points(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
PiPoint(
pi_point='name1',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 1)
),
PiPoint(
pi_point='name2',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 2)
),
PiPoint(
pi_point='name3',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 3)
)
])
points_data = managed_feeds_postgres_manager.search_pi_points(pi_points=['name1', 'name3'])
points_data['pi_points'] = sorted(points_data['pi_points'], key=itemgetter('pi_point'))
assert points_data == {
'pi_points': [
{
'pi_point': 'name1',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 1, 0, 0)
},
{
'pi_point': 'name3',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 3, 0, 0)
}
],
'total_count': 2
}
def test_search_pi_points_with_status(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([
PiPoint(
pi_point='name1',
subscription_status=SubscriptionStatus.subscribed,
update_timestamp=datetime.datetime(2017, 1, 1)
),
PiPoint(
pi_point='name2',
subscription_status=SubscriptionStatus.pending,
update_timestamp=datetime.datetime(2017, 1, 2)
),
PiPoint(
pi_point='name3',
subscription_status=SubscriptionStatus.unsubscribed,
update_timestamp=datetime.datetime(2017, 1, 3)
)
])
points_data = managed_feeds_postgres_manager.search_pi_points(status='subscribed')
assert points_data == {
'pi_points': [
{
'pi_point': 'name1',
'subscription_status': SubscriptionStatus.subscribed,
'update_timestamp': datetime.datetime(2017, 1, 1, 0, 0)
}
],
'total_count': 1
}
@freeze_time('2016-01-02 11:12:13')
def test_send_subscribe_request(managed_feeds_postgres_manager, incoming_queue, sqs_uuid4, postgres_session):
postgres_session.add_all([PiPoint(pi_point='point1'), PiPoint(pi_point='point2')])
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_subscribe_request(['point1', 'point2'])
points = postgres_session.query(PiPoint).all()
event = postgres_session.query(Event).get('1')
assert all(point.subscription_status == SubscriptionStatus.pending for point in points)
assert event.pi_points == ['point1', 'point2']
assert event.event_type == 'subscribe'
assert event.status == EventStatus.pending
assert incoming_queue.messages == [
{
'id': '1',
'action': 'subscribe',
'created_at': '2016-01-02T11:12:13',
'payload': {'points': ['point1', 'point2']}
}
]
@freeze_time('2016-01-02 11:12:13')
def test_handle_subscribe_request(managed_feeds_postgres_manager, postgres_session, iot_service):
postgres_session.add_all([PiPoint(pi_point='point1'), PiPoint(pi_point='point2')])
postgres_session.add(SubscribeEvent(id='1', status=EventStatus.pending, pi_points=['point1', 'point2']))
payload = {'points': ['point1', 'point2']}
managed_feeds_postgres_manager.handle_subscribe_request('1', payload)
points = postgres_session.query(PiPoint).all()
event = postgres_session.query(Event).get('1')
assert all(point.subscription_status == SubscriptionStatus.subscribed for point in points)
assert event.status == EventStatus.success
assert event.pi_points == ['point1', 'point2']
iot_service.iot_client.create_thing.assert_has_calls([call(thingName='point1'), call(thingName='point2')])
@freeze_time('2016-01-02 11:12:13')
def test_handle_failed_subscribe_request(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([PiPoint(pi_point='point1'), PiPoint(pi_point='point2')])
postgres_session.add(SubscribeEvent(id='1', status=EventStatus.pending, pi_points=['point1', 'point2']))
payload = {'points': ['point1'], 'error_message': 'point2 failed'}
managed_feeds_postgres_manager.handle_subscribe_request('1', payload)
point1 = postgres_session.query(PiPoint).get('point1')
point2 = postgres_session.query(PiPoint).get('point2')
event = postgres_session.query(Event).get('1')
assert point1.subscription_status == SubscriptionStatus.subscribed
assert point2.subscription_status == SubscriptionStatus.unsubscribed
assert event.status == EventStatus.failure
@freeze_time('2016-01-02 11:12:13')
def test_send_unsubscribe_request(managed_feeds_postgres_manager, incoming_queue, sqs_uuid4, postgres_session):
postgres_session.add_all([PiPoint(pi_point='point1'), PiPoint(pi_point='point2')])
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_unsubscribe_request(['point1', 'point2'])
points = postgres_session.query(PiPoint).all()
event = postgres_session.query(Event).get('1')
assert all(point.subscription_status == SubscriptionStatus.pending for point in points)
assert event.pi_points == ['point1', 'point2']
assert event.event_type == 'unsubscribe'
assert event.status == EventStatus.pending
assert incoming_queue.messages == [
{
'id': '1',
'action': 'unsubscribe',
'created_at': '2016-01-02T11:12:13',
'payload': {'points': ['point1', 'point2']}
}
]
@freeze_time('2016-01-02 11:12:13')
def test_handle_unsubscribe_request(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([PiPoint(pi_point='point1'), PiPoint(pi_point='point2')])
postgres_session.add(UnsubscribeEvent(id='1', status=EventStatus.pending, pi_points=['point1', 'point2']))
payload = {'points': ['point1', 'point2']}
managed_feeds_postgres_manager.handle_unsubscribe_request('1', payload)
points = postgres_session.query(PiPoint).all()
event = postgres_session.query(Event).get('1')
assert all(point.subscription_status == SubscriptionStatus.unsubscribed for point in points)
assert event.status == EventStatus.success
assert event.pi_points == ['point1', 'point2']
@freeze_time('2016-01-02 11:12:13')
def test_handle_failed_unsubscribe_request(managed_feeds_postgres_manager, postgres_session):
postgres_session.add_all([PiPoint(pi_point='point1'), PiPoint(pi_point='point2')])
postgres_session.add(UnsubscribeEvent(id='1', status=EventStatus.pending, pi_points=['point1', 'point2']))
payload = {'points': ['point1'], 'error_message': 'point2 failed'}
managed_feeds_postgres_manager.handle_unsubscribe_request('1', payload)
point1 = postgres_session.query(PiPoint).get('point1')
point2 = postgres_session.query(PiPoint).get('point2')
event = postgres_session.query(Event).get('1')
assert point1.subscription_status == SubscriptionStatus.unsubscribed
assert point2.subscription_status == SubscriptionStatus.subscribed
assert event.status == EventStatus.failure
@freeze_time('2016-01-02 11:12:13')
def test_send_sync_pi_points_request(managed_feeds_postgres_manager, incoming_queue, sqs_uuid4, postgres_session):
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_sync_pi_points_request('bucket')
event = postgres_session.query(Event).get('1')
assert incoming_queue.messages == [
{
'id': '1',
'action': 'sync_pi_points',
'created_at': '2016-01-02T11:12:13',
'payload': {
's3_bucket': 'bucket',
's3_key': 'pi_points_sync/20160102_111213/pi_points.json'
}
}
]
assert event.event_type == 'sync_pi_points'
assert event.status == EventStatus.pending
assert event.s3_bucket == 'bucket'
assert event.s3_key == 'pi_points_sync/20160102_111213/pi_points.json'
@freeze_time('2017-01-02 11:12:13')
def test_handle_sync_pi_points(managed_feeds_postgres_manager, postgres_session, s3_resource):
postgres_session.add_all([
PiPoint(pi_point='point1', subscription_status=SubscriptionStatus.pending),
PiPoint(pi_point='point2', subscription_status=SubscriptionStatus.pending),
PiPoint(pi_point='point3', subscription_status=SubscriptionStatus.subscribed),
PiPoint(pi_point='point4', subscription_status=SubscriptionStatus.subscribed),
])
postgres_session.add(
SyncPiPointsEvent(id='1', status=EventStatus.pending, s3_bucket='bucket', s3_key='pi_points.json')
)
s3_resource.Bucket('bucket').upload_fileobj(
BytesIO(b'["point1","point3","point5"]'),
'pi_points.json'
)
payload = {}
managed_feeds_postgres_manager.handle_sync_pi_points('1', payload)
points = postgres_session.query(PiPoint).all()
point1 = postgres_session.query(PiPoint).get('point1')
point3 = postgres_session.query(PiPoint).get('point3')
point5 = postgres_session.query(PiPoint).get('point5')
event = postgres_session.query(Event).get('1')
assert len(points) == 3
assert point1.subscription_status == SubscriptionStatus.pending
assert point3.subscription_status == SubscriptionStatus.subscribed
assert point5.subscription_status == SubscriptionStatus.unsubscribed
assert event.status == EventStatus.success
@freeze_time('2016-01-02 11:12:13')
def test_send_sync_af_request(managed_feeds_postgres_manager, incoming_queue, postgres_session, sqs_uuid4):
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_sync_af_request('bucket', 'database')
event = postgres_session.query(Event).get('1')
assert incoming_queue.messages == [
{
'id': '1',
'action': 'sync_af',
'created_at': '2016-01-02T11:12:13',
'payload': {
'database': 'database',
's3_bucket': 'bucket',
's3_key': 'af_structure_sync/database/20160102_111213/af_structure.json'
}
}
]
assert event.event_type == 'sync_af'
assert event.status == EventStatus.pending
assert event.s3_bucket == 'bucket'
assert event.s3_key == 'af_structure_sync/database/20160102_111213/af_structure.json'
assert event.database == 'database'
@freeze_time('2016-01-02 11:12:13')
def test_handle_sync_af(managed_feeds_postgres_manager, postgres_session, s3_resource):
managed_feeds_postgres_manager._make_unique_s3_key = lambda *args, **kwargs: 'af_structure.json'
msg_id = managed_feeds_postgres_manager.send_sync_af_request('bucket', 'NuGreen')
af_structure = {
"name": "NuGreen",
"path": "\\\\EC2AMAZ-0EE3VGR\\NuGreen",
"description": None,
"template": None,
"categories": None,
"attributes": None,
"assets": []
}
s3_resource.Bucket('bucket').upload_fileobj(
BytesIO(json.dumps(af_structure).encode()),
'af_structure.json'
)
payload = {}
managed_feeds_postgres_manager.handle_sync_af(msg_id, payload)
event = postgres_session.query(Event).get(msg_id)
assert event.status == EventStatus.success
@freeze_time('2016-01-02 11:12:13')
def test_send_backfill_request(managed_feeds_postgres_manager, postgres_session, incoming_queue, sqs_uuid4):
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_backfill_request(
query_syntax=False,
feeds=['point1', 'point2'],
request_from='2016-01-02T11:12:13',
request_to='2016-01-02T11:12:13',
name='name'
)
event = postgres_session.query(Event).get('1')
assert incoming_queue.messages == [
{
'id': '1',
'action': 'backfill',
'created_at': '2016-01-02T11:12:13',
'payload': {
'points': ['point1', 'point2'],
'from': '2016-01-02T11:12:13',
'to': '2016-01-02T11:12:13',
'use_query_syntax': False,
'backfill_name': 'name'
}
}
]
assert event.pi_points == ['point1', 'point2']
assert event.status == EventStatus.pending
assert event.name == 'name'
@freeze_time('2016-01-02 11:12:13')
def test_send_backfill_request_with_query(managed_feeds_postgres_manager, incoming_queue, postgres_session, sqs_uuid4):
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_backfill_request(
query_syntax=True,
feeds=['point1', 'point2'],
query='-1d',
name='name'
)
event = postgres_session.query(Event).get('1')
assert incoming_queue.messages == [
{
'id': '1',
'action': 'backfill',
'created_at': '2016-01-02T11:12:13',
'payload': {
'points': ['point1', 'point2'],
'query': '-1d',
'use_query_syntax': True,
'backfill_name': 'name'
}
}
]
assert event.pi_points == ['point1', 'point2']
assert event.status == EventStatus.pending
assert event.name == 'name'
@freeze_time('2016-01-02 11:12:13')
def test_handle_backfill(managed_feeds_postgres_manager, postgres_session):
postgres_session.add(
BackfillEvent(id='1', pi_points=['point1'], status=EventStatus.pending, name='name')
)
managed_feeds_postgres_manager.handle_backfill_status('1', {})
event = postgres_session.query(Event).get('1')
assert event.pi_points == ['point1']
assert event.status == EventStatus.success
assert event.name == 'name'
@freeze_time('2016-01-02 11:12:13')
def test_handle_backfill_failed(managed_feeds_postgres_manager, postgres_session):
postgres_session.add(
BackfillEvent(id='1', pi_points=['point1'], status=EventStatus.pending, name='name')
)
payload = {
'failed_points': [{
'point': 'point1',
'error_message': 'fail'
}]
}
managed_feeds_postgres_manager.handle_backfill_status('1', payload)
event = postgres_session.query(Event).get('1')
assert event.pi_points == ['point1']
assert event.status == EventStatus.failure
assert event.name == 'name'
assert event.error_message == "{'point1': 'fail'}"
@freeze_time('2016-01-02 11:12:13')
def test_send_interpolate_request(managed_feeds_postgres_manager, incoming_queue, postgres_session, sqs_uuid4):
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_interpolate_request(
query_syntax=False,
feeds=['point1', 'point2'],
interval=1,
interval_unit='seconds',
request_from='2016-01-02T11:12:13',
request_to='2016-01-02T11:12:13',
name='name'
)
event = postgres_session.query(Event).get('1')
assert incoming_queue.messages == [
{
"id": "1",
"action": 'interpolate',
'created_at': '2016-01-02T11:12:13',
"payload": {
"points": ['point1', 'point2'],
'from': '2016-01-02T11:12:13',
'to': '2016-01-02T11:12:13',
'use_date_query_syntax': False,
'interval_seconds': 1,
'interpolation_name': 'name'
}
}
]
assert event.pi_points == ['point1', 'point2']
assert event.status == EventStatus.pending
assert event.name == 'name'
@freeze_time('2016-01-02 11:12:13')
def test_send_interpolate_request_with_query(managed_feeds_postgres_manager, incoming_queue, postgres_session, sqs_uuid4):
sqs_uuid4.return_value = '1'
managed_feeds_postgres_manager.send_interpolate_request(
query_syntax=True,
feeds=['point1', 'point2'],
interval=1,
interval_unit='seconds',
query='-1d',
name='name'
)
event = postgres_session.query(Event).get('1')
assert incoming_queue.messages == [
{
"id": "1",
"action": 'interpolate',
'created_at': '2016-01-02T11:12:13',
"payload": {
"points": ['point1', 'point2'],
'date_query': '-1d',
'interval_seconds': 1,
'use_date_query_syntax': True,
'interpolation_name': 'name'
}
}
]
assert event.pi_points == ['point1', 'point2']
assert event.status == EventStatus.pending
assert event.name == 'name'
@freeze_time('2016-01-02 11:12:13')
def test_handle_interpolation(managed_feeds_postgres_manager, postgres_session):
postgres_session.add(
InterpolateEvent(id='1', pi_points=['point1'], status=EventStatus.pending, name='name')
)
managed_feeds_postgres_manager.handle_interpolation_status('1', {})
event = postgres_session.query(Event).get('1')
assert event.pi_points == ['point1']
assert event.status == EventStatus.success
assert event.name == 'name'
@freeze_time('2016-01-02 11:12:13')
def test_handle_interpolation_with_failure(managed_feeds_postgres_manager, postgres_session):
postgres_session.add(
BackfillEvent(id='1', pi_points=['point1'], status=EventStatus.pending, name='name')
)
payload = {
"failed_points": [
{
"point": "point1",
"error_message": "fail"
}
]
}
managed_feeds_postgres_manager.handle_interpolation_status('1', payload)
event = postgres_session.query(Event).get('1')
assert event.pi_points == ['point1']
assert event.status == EventStatus.failure
assert event.name == 'name'
assert event.error_message == "{'point1': 'fail'}"
from challenge import solution
def test_challenge():
# Use 0 or 1
# Use upper and lower limits
assert solution(10, 85, 30) == 3
#!/usr/bin/env python
# _*_ coding: utf-8 _*_
class cp2k_motion_free_energy_alchemical_change:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t&ALCHEMICAL_CHANGE\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t&END ALCHEMICAL_CHANGE\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 3:
self.params[item.split("-")[-1]] = params[item]
else:
pass
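Every section class in this module follows the same two-method pattern: `to_input` serializes all non-None entries of `self.params` at a fixed indentation depth, and `set_params` keeps only keys whose hyphen-separated path has the expected length, storing the last path component as the parameter name. A minimal self-contained sketch of that convention (the `DemoAlchemicalChange` class and the `EPS_CONV` key below are illustrative, not part of pymatflow):

```python
import io

# Illustrative stand-in (not part of pymatflow): mirrors the
# to_input/set_params pattern used by the section classes in this module.
class DemoAlchemicalChange:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        # serialize all non-None parameters at this section's indentation depth
        fout.write("\t\t&ALCHEMICAL_CHANGE\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t&END ALCHEMICAL_CHANGE\n")

    def set_params(self, params):
        # keep only keys whose hyphen path is exactly three levels deep,
        # e.g. "FREE_ENERGY-ALCHEMICAL_CHANGE-EPS_CONV"; store the leaf name
        for item in params:
            if len(item.split("-")) == 3:
                self.params[item.split("-")[-1]] = params[item]

section = DemoAlchemicalChange()
section.set_params({"FREE_ENERGY-ALCHEMICAL_CHANGE-EPS_CONV": 1.0e-2})
buf = io.StringIO()
section.to_input(buf)
print(buf.getvalue())
```

Keys with a different path depth are silently ignored here, matching the `else: pass` branches in the real classes.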
class cp2k_motion_free_energy_free_energy_info_each:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&EACH\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t&END EACH\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 3:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_free_energy_info:
def __init__(self):
self.params = {}
self.status = False
self.each = cp2k_motion_free_energy_free_energy_info_each()
# basic setting
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t&FREE_ENERGY_INFO\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t%s %s\n" % (item, str(self.params[item])))
if self.each.status == True:
self.each.to_input(fout)
fout.write("\t\t&END FREE_ENERGY_INFO\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 3:
self.params[item.split("-")[-1]] = params[item]
elif item.split("-")[2] == "EACH":
self.each.set_params({item: params[item]})
else:
pass
class cp2k_motion_free_energy_metadyn_ext_lagrange_fs:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&EXT_LAGRANGE_FS\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t&END EXT_LAGRANGE_FS\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 4:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_ext_lagrange_ss:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&EXT_LAGRANGE_SS\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t&END EXT_LAGRANGE_SS\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 4:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_ext_lagrange_ss0:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&EXT_LAGRANGE_SS0\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t&END EXT_LAGRANGE_SS0\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 4:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_ext_lagrange_vvp:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&EXT_LAGRANGE_VVP\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t&END EXT_LAGRANGE_VVP\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 4:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_metavar_wall_gaussian:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t\t&GAUSSIAN\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t\t&END GAUSSIAN\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 6:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_metavar_wall_quadratic:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t\t&QUADRATIC\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t\t&END QUADRATIC\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 6:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_metavar_wall_quartic:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t\t&QUARTIC\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t\t&END QUARTIC\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 6:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_metavar_wall_reflective:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t\t&REFLECTIVE\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t\t&END REFLECTIVE\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 6:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_metavar_wall:
def __init__(self):
self.params = {}
self.status = False
self.gaussian = cp2k_motion_free_energy_metadyn_metavar_wall_gaussian()
self.quadratic = cp2k_motion_free_energy_metadyn_metavar_wall_quadratic()
self.quartic = cp2k_motion_free_energy_metadyn_metavar_wall_quartic()
self.reflective = cp2k_motion_free_energy_metadyn_metavar_wall_reflective()
# basic setting
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t&WALL\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
if self.gaussian.status == True:
self.gaussian.to_input(fout)
if self.quadratic.status == True:
self.quadratic.to_input(fout)
if self.quartic.status == True:
self.quartic.to_input(fout)
if self.reflective.status == True:
self.reflective.to_input(fout)
fout.write("\t\t\t\t&END WALL\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 5:
self.params[item.split("-")[-1]] = params[item]
elif item.split("-")[4] == "GAUSSIAN":
self.gaussian.set_params({item: params[item]})
elif item.split("-")[4] == "QUADRATIC":
self.quadratic.set_params({item: params[item]})
elif item.split("-")[4] == "QUARTIC":
self.quartic.set_params({item: params[item]})
elif item.split("-")[4] == "REFLECTIVE":
self.reflective.set_params({item: params[item]})
else:
pass
class cp2k_motion_free_energy_metadyn_metavar:
def __init__(self):
self.params = {}
self.status = False
self.wall = cp2k_motion_free_energy_metadyn_metavar_wall()
# basic setting
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&METAVAR\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
if self.wall.status == True:
self.wall.to_input(fout)
fout.write("\t\t\t&END METAVAR\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 4:
self.params[item.split("-")[-1]] = params[item]
elif item.split("-")[3] == "WALL":
self.wall.set_params({item: params[item]})
else:
pass
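The nested sections above also share a dispatch rule: keys whose path length matches this section's depth are stored locally, while deeper keys are forwarded to the child subsection named by the next path component. A hypothetical sketch of that rule (class and key names are illustrative):

```python
# Illustrative stand-in (not part of pymatflow) for the nested dispatch
# used by set_params in the METAVAR/WALL classes above.
class DemoWall:
    def __init__(self):
        self.params = {}

    def set_params(self, params):
        # WALL sits one level deeper, so it stores five-component keys
        for item in params:
            if len(item.split("-")) == 5:
                self.params[item.split("-")[-1]] = params[item]

class DemoMetavar:
    def __init__(self):
        self.params = {}
        self.wall = DemoWall()

    def set_params(self, params):
        for item in params:
            parts = item.split("-")
            if len(parts) == 4:
                self.params[parts[-1]] = params[item]       # leaf at this depth
            elif parts[3] == "WALL":
                self.wall.set_params({item: params[item]})  # delegate deeper key

metavar = DemoMetavar()
metavar.set_params({
    "FREE_ENERGY-METADYN-METAVAR-LAMBDA": 0.5,
    "FREE_ENERGY-METADYN-METAVAR-WALL-TYPE": "QUADRATIC",
})
print(metavar.params, metavar.wall.params)
```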
class cp2k_motion_free_energy_metadyn_multiple_walkers_walkers_file_name:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t&WALKERS_FILE_NAME\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t&END WALKERS_FILE_NAME\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 5:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_multiple_walkers:
def __init__(self):
self.params = {}
self.status = False
self.walkers_file_name = cp2k_motion_free_energy_metadyn_multiple_walkers_walkers_file_name()
# basic setting
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t&MULTIPLE_WALKERS\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
if self.walkers_file_name.status == True:
self.walkers_file_name.to_input(fout)
fout.write("\t\t\t&END MULTIPLE_WALKERS\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 4:
self.params[item.split("-")[-1]] = params[item]
elif item.split("-")[3] == "WALKERS_FILE_NAME":
self.walkers_file_name.set_params({item: params[item]})
else:
pass
class cp2k_motion_free_energy_metadyn_print_colvar_each:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t\t&EACH\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t\t&END EACH\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 6:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_print_colvar:
def __init__(self):
self.params = {}
self.status = False
self.each = cp2k_motion_free_energy_metadyn_print_colvar_each()
# basic setting
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t&COLVAR\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
if self.each.status == True:
self.each.to_input(fout)
fout.write("\t\t\t\t&END COLVAR\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 5:
self.params[item.split("-")[-1]] = params[item]
elif item.split("-")[4] == "EACH":
self.each.set_params({item: params[item]})
else:
pass
class cp2k_motion_free_energy_metadyn_print_hills_each:
def __init__(self):
self.params = {}
self.status = False
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t\t&EACH\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
fout.write("\t\t\t\t\t&END EACH\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 6:
self.params[item.split("-")[-1]] = params[item]
else:
pass
class cp2k_motion_free_energy_metadyn_print_hills:
def __init__(self):
self.params = {}
self.status = False
self.each = cp2k_motion_free_energy_metadyn_print_hills_each()
# basic setting
def to_input(self, fout):
"""
fout: a file stream for writing
"""
fout.write("\t\t\t\t&HILLS\n")
for item in self.params:
if self.params[item] is not None:
fout.write("\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
if self.each.status == True:
self.each.to_input(fout)
fout.write("\t\t\t\t&END HILLS\n")
def set_params(self, params):
for item in params:
if len(item.split("-")) == 5:
self.params[item.split("-")[-1]] = params[item]
elif item.split("-")[4] == "EACH":
self.each.set_params({item: params[item]})
else:
pass
class cp2k_motion_free_energy_metadyn_print_program_run_info_each:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t\t\t&EACH\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t\t\t&END EACH\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 6:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_metadyn_print_program_run_info:
    def __init__(self):
        self.params = {}
        self.status = False
        self.each = cp2k_motion_free_energy_metadyn_print_program_run_info_each()

    # basic setting
    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t\t&PROGRAM_RUN_INFO\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        if self.each.status:
            self.each.to_input(fout)
        fout.write("\t\t\t\t&END PROGRAM_RUN_INFO\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 5:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[4] == "EACH":
                self.each.set_params({item: params[item]})
            else:
                pass
class cp2k_motion_free_energy_metadyn_print_temperature_colvar_each:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t\t\t&EACH\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t\t\t&END EACH\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 6:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_metadyn_print_temperature_colvar:
    def __init__(self):
        self.params = {}
        self.status = False
        self.each = cp2k_motion_free_energy_metadyn_print_temperature_colvar_each()

    # basic setting
    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t\t&TEMPERATURE_COLVAR\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        if self.each.status:
            self.each.to_input(fout)
        fout.write("\t\t\t\t&END TEMPERATURE_COLVAR\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 5:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[4] == "EACH":
                self.each.set_params({item: params[item]})
            else:
                pass
class cp2k_motion_free_energy_metadyn_print:
    def __init__(self):
        self.params = {}
        self.status = False
        self.colvar = cp2k_motion_free_energy_metadyn_print_colvar()
        self.hills = cp2k_motion_free_energy_metadyn_print_hills()
        self.program_run_info = cp2k_motion_free_energy_metadyn_print_program_run_info()
        self.temperature_colvar = cp2k_motion_free_energy_metadyn_print_temperature_colvar()

    # basic setting
    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&PRINT\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        if self.colvar.status:
            self.colvar.to_input(fout)
        if self.hills.status:
            self.hills.to_input(fout)
        if self.program_run_info.status:
            self.program_run_info.to_input(fout)
        if self.temperature_colvar.status:
            self.temperature_colvar.to_input(fout)
        fout.write("\t\t\t&END PRINT\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[3] == "COLVAR":
                self.colvar.set_params({item: params[item]})
            elif item.split("-")[3] == "HILLS":
                self.hills.set_params({item: params[item]})
            elif item.split("-")[3] == "PROGRAM_RUN_INFO":
                self.program_run_info.set_params({item: params[item]})
            elif item.split("-")[3] == "TEMPERATURE_COLVAR":
                self.temperature_colvar.set_params({item: params[item]})
            else:
                pass
class cp2k_motion_free_energy_metadyn_spawned_hills_height:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&SPAWNED_HILLS_HEIGHT\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t&END SPAWNED_HILLS_HEIGHT\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_metadyn_spawned_hills_invdt:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&SPAWNED_HILLS_INVDT\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t&END SPAWNED_HILLS_INVDT\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_metadyn_spawned_hills_pos:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&SPAWNED_HILLS_POS\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t&END SPAWNED_HILLS_POS\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_metadyn_spawned_hills_scale:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&SPAWNED_HILLS_SCALE\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t&END SPAWNED_HILLS_SCALE\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_metadyn:
    def __init__(self):
        self.params = {}
        self.status = False
        self.ext_lagrange_fs = cp2k_motion_free_energy_metadyn_ext_lagrange_fs()
        self.ext_lagrange_ss = cp2k_motion_free_energy_metadyn_ext_lagrange_ss()
        self.ext_lagrange_ss0 = cp2k_motion_free_energy_metadyn_ext_lagrange_ss0()
        self.ext_lagrange_vvp = cp2k_motion_free_energy_metadyn_ext_lagrange_vvp()
        self.metavar = cp2k_motion_free_energy_metadyn_metavar()
        self.multiple_walkers = cp2k_motion_free_energy_metadyn_multiple_walkers()
        self.printout = cp2k_motion_free_energy_metadyn_print()
        self.spawned_hills_height = cp2k_motion_free_energy_metadyn_spawned_hills_height()
        self.spawned_hills_invdt = cp2k_motion_free_energy_metadyn_spawned_hills_invdt()
        self.spawned_hills_pos = cp2k_motion_free_energy_metadyn_spawned_hills_pos()
        self.spawned_hills_scale = cp2k_motion_free_energy_metadyn_spawned_hills_scale()

    # basic setting
    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t&METADYN\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t%s %s\n" % (item, str(self.params[item])))
        if self.ext_lagrange_fs.status:
            self.ext_lagrange_fs.to_input(fout)
        if self.ext_lagrange_ss.status:
            self.ext_lagrange_ss.to_input(fout)
        if self.ext_lagrange_ss0.status:
            self.ext_lagrange_ss0.to_input(fout)
        if self.ext_lagrange_vvp.status:
            self.ext_lagrange_vvp.to_input(fout)
        if self.metavar.status:
            self.metavar.to_input(fout)
        if self.multiple_walkers.status:
            self.multiple_walkers.to_input(fout)
        if self.printout.status:
            self.printout.to_input(fout)
        if self.spawned_hills_height.status:
            self.spawned_hills_height.to_input(fout)
        if self.spawned_hills_invdt.status:
            self.spawned_hills_invdt.to_input(fout)
        if self.spawned_hills_pos.status:
            self.spawned_hills_pos.to_input(fout)
        if self.spawned_hills_scale.status:
            self.spawned_hills_scale.to_input(fout)
        fout.write("\t\t&END METADYN\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 3:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[2] == "EXT_LAGRANGE_FS":
                self.ext_lagrange_fs.set_params({item: params[item]})
            elif item.split("-")[2] == "EXT_LAGRANGE_SS":
                self.ext_lagrange_ss.set_params({item: params[item]})
            elif item.split("-")[2] == "EXT_LAGRANGE_SS0":
                self.ext_lagrange_ss0.set_params({item: params[item]})
            elif item.split("-")[2] == "EXT_LAGRANGE_VVP":
                self.ext_lagrange_vvp.set_params({item: params[item]})
            elif item.split("-")[2] == "METAVAR":
                self.metavar.set_params({item: params[item]})
            elif item.split("-")[2] == "MULTIPLE_WALKERS":
                self.multiple_walkers.set_params({item: params[item]})
            elif item.split("-")[2] == "PRINT":
                self.printout.set_params({item: params[item]})
            elif item.split("-")[2] == "SPAWNED_HILLS_HEIGHT":
                self.spawned_hills_height.set_params({item: params[item]})
            elif item.split("-")[2] == "SPAWNED_HILLS_INVDT":
                self.spawned_hills_invdt.set_params({item: params[item]})
            elif item.split("-")[2] == "SPAWNED_HILLS_POS":
                self.spawned_hills_pos.set_params({item: params[item]})
            elif item.split("-")[2] == "SPAWNED_HILLS_SCALE":
                self.spawned_hills_scale.set_params({item: params[item]})
            else:
                pass
class cp2k_motion_free_energy_umbrella_integration_convergence_control:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&CONVERGENCE_CONTROL\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t&END CONVERGENCE_CONTROL\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_umbrella_integration_uvar:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t\t&UVAR\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t\t%s %s\n" % (item, str(self.params[item])))
        fout.write("\t\t\t&END UVAR\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 4:
                self.params[item.split("-")[-1]] = params[item]
            else:
                pass
class cp2k_motion_free_energy_umbrella_integration:
    def __init__(self):
        self.params = {}
        self.status = False
        self.convergence_control = cp2k_motion_free_energy_umbrella_integration_convergence_control()
        self.uvar = cp2k_motion_free_energy_umbrella_integration_uvar()

    # basic setting
    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t\t&UMBRELLA_INTEGRATION\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t\t%s %s\n" % (item, str(self.params[item])))
        if self.convergence_control.status:
            self.convergence_control.to_input(fout)
        if self.uvar.status:
            self.uvar.to_input(fout)
        fout.write("\t\t&END UMBRELLA_INTEGRATION\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 3:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[2] == "CONVERGENCE_CONTROL":
                self.convergence_control.set_params({item: params[item]})
            elif item.split("-")[2] == "UVAR":
                self.uvar.set_params({item: params[item]})
            else:
                pass
class cp2k_motion_free_energy:
    def __init__(self):
        self.params = {}
        self.status = False
        self.alchemical_change = cp2k_motion_free_energy_alchemical_change()
        self.free_energy_info = cp2k_motion_free_energy_free_energy_info()
        self.metadyn = cp2k_motion_free_energy_metadyn()
        self.umbrella_integration = cp2k_motion_free_energy_umbrella_integration()

    # basic setting
    def to_input(self, fout):
        """
        fout: a file stream for writing
        """
        fout.write("\t&FREE_ENERGY\n")
        for item in self.params:
            if self.params[item] is not None:
                fout.write("\t\t%s %s\n" % (item, str(self.params[item])))
        if self.alchemical_change.status:
            self.alchemical_change.to_input(fout)
        if self.free_energy_info.status:
            self.free_energy_info.to_input(fout)
        if self.metadyn.status:
            self.metadyn.to_input(fout)
        if self.umbrella_integration.status:
            self.umbrella_integration.to_input(fout)
        fout.write("\t&END FREE_ENERGY\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 2:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[1] == "ALCHEMICAL_CHANGE":
                self.alchemical_change.set_params({item: params[item]})
            elif item.split("-")[1] == "FREE_ENERGY_INFO":
                self.free_energy_info.set_params({item: params[item]})
            elif item.split("-")[1] == "METADYN":
                self.metadyn.set_params({item: params[item]})
            elif item.split("-")[1] == "UMBRELLA_INTEGRATION":
                self.umbrella_integration.set_params({item: params[item]})
            else:
                pass
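A minimal, self-contained sketch of the dash-delimited key routing that every `set_params` above uses. These stand-in classes are not the real module; the key names (`FREE_ENERGY-METHOD`, `FREE_ENERGY-METADYN-NT_HILLS`) are illustrative assumptions about the key convention: a key with N components is stored directly by the class N-1 levels deep, otherwise it is dispatched to the subsection named by the next component.

```python
import io

# Hypothetical stand-ins mimicking the routing pattern of the real classes.
class MetadynSketch:
    def __init__(self):
        self.params = {}
        self.status = False

    def to_input(self, fout):
        fout.write("\t\t&METADYN\n")
        for item, value in self.params.items():
            if value is not None:
                fout.write("\t\t\t%s %s\n" % (item, value))
        fout.write("\t\t&END METADYN\n")

    def set_params(self, params):
        for item in params:
            # three components, e.g. "FREE_ENERGY-METADYN-NT_HILLS",
            # means a direct parameter of this section
            if len(item.split("-")) == 3:
                self.params[item.split("-")[-1]] = params[item]


class FreeEnergySketch:
    def __init__(self):
        self.params = {}
        self.metadyn = MetadynSketch()

    def to_input(self, fout):
        fout.write("\t&FREE_ENERGY\n")
        for item, value in self.params.items():
            if value is not None:
                fout.write("\t\t%s %s\n" % (item, value))
        if self.metadyn.status:
            self.metadyn.to_input(fout)
        fout.write("\t&END FREE_ENERGY\n")

    def set_params(self, params):
        for item in params:
            if len(item.split("-")) == 2:
                self.params[item.split("-")[-1]] = params[item]
            elif item.split("-")[1] == "METADYN":
                # longer keys are dispatched to the matching subsection
                self.metadyn.set_params({item: params[item]})


fe = FreeEnergySketch()
fe.metadyn.status = True
fe.set_params({
    "FREE_ENERGY-METHOD": "METADYN",
    "FREE_ENERGY-METADYN-NT_HILLS": 20,
})
buf = io.StringIO()
fe.to_input(buf)
```

Writing to a `StringIO` buffer here stands in for the file stream `fout`; the nested `&METADYN` block appears inside `&FREE_ENERGY` only because its `status` flag was set, mirroring the real classes.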
from os import path
import numpy as np
import pytest
from astropy import units
from astropy.modeling import functional_models
from astropy.coordinates import Angle
import autoarray as aa
from autoarray import exc
test_data_dir = path.join("{}".format(path.dirname(path.realpath(__file__))), "files")
class TestAPI:
    def test__manual__input_kernel__all_attributes_correct_including_data_inheritance(
        self,
    ):
        kernel = aa.Kernel2D.ones(
            shape_native=(3, 3), pixel_scales=1.0, normalize=False
        )

        assert kernel.shape_native == (3, 3)
        assert (kernel.native == np.ones((3, 3))).all()
        assert kernel.pixel_scales == (1.0, 1.0)
        assert kernel.origin == (0.0, 0.0)

        kernel = aa.Kernel2D.ones(
            shape_native=(4, 3), pixel_scales=1.0, normalize=False
        )

        assert kernel.shape_native == (4, 3)
        assert (kernel.native == np.ones((4, 3))).all()
        assert kernel.pixel_scales == (1.0, 1.0)
        assert kernel.origin == (0.0, 0.0)

    def test__full_kernel_is_set_of_full_values(self):
        kernel = aa.Kernel2D.full(fill_value=3.0, shape_native=(3, 3), pixel_scales=1.0)

        assert kernel.shape_native == (3, 3)
        assert (kernel.native == 3.0 * np.ones((3, 3))).all()
        assert kernel.pixel_scales == (1.0, 1.0)
        assert kernel.origin == (0.0, 0.0)

    def test__ones_zeros__kernel_is_set_of_full_values(self):
        kernel = aa.Kernel2D.ones(shape_native=(3, 3), pixel_scales=1.0)

        assert kernel.shape_native == (3, 3)
        assert (kernel.native == np.ones((3, 3))).all()
        assert kernel.pixel_scales == (1.0, 1.0)
        assert kernel.origin == (0.0, 0.0)

        kernel = aa.Kernel2D.zeros(shape_native=(3, 3), pixel_scales=1.0)

        assert kernel.shape_native == (3, 3)
        assert (kernel.native == np.zeros((3, 3))).all()
        assert kernel.pixel_scales == (1.0, 1.0)
        assert kernel.origin == (0.0, 0.0)

    def test__from_fits__input_kernel_3x3__all_attributes_correct_including_data_inheritance(
        self,
    ):
        kernel = aa.Kernel2D.from_fits(
            file_path=path.join(test_data_dir, "3x2_ones.fits"), hdu=0, pixel_scales=1.0
        )

        assert (kernel.native == np.ones((3, 2))).all()

        kernel = aa.Kernel2D.from_fits(
            file_path=path.join(test_data_dir, "3x2_twos.fits"), hdu=0, pixel_scales=1.0
        )

        assert (kernel.native == 2.0 * np.ones((3, 2))).all()

    def test__no_blur__correct_kernel(self):
        kernel = aa.Kernel2D.no_blur(pixel_scales=1.0)

        assert (kernel.native == np.array([[1.0]])).all()
        assert kernel.pixel_scales == (1.0, 1.0)

        kernel = aa.Kernel2D.no_blur(pixel_scales=2.0)

        assert (kernel.native == np.array([[1.0]])).all()
        assert kernel.pixel_scales == (2.0, 2.0)
class TestNormalize:
    def test__input_is_already_normalized__no_change(self):
        kernel_data = np.ones((3, 3)) / 9.0

        kernel = aa.Kernel2D.manual_native(
            array=kernel_data, pixel_scales=1.0, normalize=True
        )

        assert kernel.native == pytest.approx(kernel_data, 1e-3)

    def test__input_is_above_normalization_so_is_normalized(self):
        kernel_data = np.ones((3, 3))

        kernel = aa.Kernel2D.manual_native(
            array=kernel_data, pixel_scales=1.0, normalize=True
        )

        assert kernel.native == pytest.approx(np.ones((3, 3)) / 9.0, 1e-3)

        kernel = aa.Kernel2D.manual_native(
            array=kernel_data, pixel_scales=1.0, normalize=False
        )

        kernel = kernel.normalized

        assert kernel.native == pytest.approx(np.ones((3, 3)) / 9.0, 1e-3)

    def test__same_as_above__renormalized_false_does_not_normalize(self):
        kernel_data = np.ones((3, 3))

        kernel = aa.Kernel2D.manual_native(
            array=kernel_data, pixel_scales=1.0, normalize=False
        )

        assert kernel.native == pytest.approx(np.ones((3, 3)), 1e-3)
class TestBinnedUp:
    def test__kernel_is_even_x_even__rescaled_to_odd_x_odd__no_use_of_dimension_trimming(
        self,
    ):
        array_2d = np.ones((6, 6))

        kernel = aa.Kernel2D.manual_native(
            array=array_2d, pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.5, normalize=True
        )

        assert kernel.pixel_scales == (2.0, 2.0)
        assert (kernel.native == (1.0 / 9.0) * np.ones((3, 3))).all()

        array_2d = np.ones((9, 9))

        kernel = aa.Kernel2D.manual_native(
            array=array_2d, pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.333333333333333, normalize=True
        )

        assert kernel.pixel_scales == (3.0, 3.0)
        assert (kernel.native == (1.0 / 9.0) * np.ones((3, 3))).all()

        array_2d = np.ones((18, 6))

        kernel = aa.Kernel2D.manual_native(
            array=array_2d, pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.5, normalize=True
        )

        assert kernel.pixel_scales == (2.0, 2.0)
        assert (kernel.native == (1.0 / 27.0) * np.ones((9, 3))).all()

        array_2d = np.ones((6, 18))

        kernel = aa.Kernel2D.manual_native(
            array=array_2d, pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.5, normalize=True
        )

        assert kernel.pixel_scales == (2.0, 2.0)
        assert (kernel.native == (1.0 / 27.0) * np.ones((3, 9))).all()

    def test__kernel_is_even_x_even_after_binning_up__resized_to_odd_x_odd_with_shape_plus_one(
        self,
    ):
        kernel = aa.Kernel2D.ones(
            shape_native=(2, 2), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=2.0, normalize=True
        )

        assert kernel.pixel_scales == (0.4, 0.4)
        assert (kernel.native == (1.0 / 25.0) * np.ones((5, 5))).all()

        kernel = aa.Kernel2D.ones(
            shape_native=(40, 40), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.1, normalize=True
        )

        assert kernel.pixel_scales == (8.0, 8.0)
        assert (kernel.native == (1.0 / 25.0) * np.ones((5, 5))).all()

        kernel = aa.Kernel2D.ones(
            shape_native=(2, 4), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=2.0, normalize=True
        )

        assert kernel.pixel_scales[0] == pytest.approx(0.4, 1.0e-4)
        assert kernel.pixel_scales[1] == pytest.approx(0.4444444, 1.0e-4)
        assert (kernel.native == (1.0 / 45.0) * np.ones((5, 9))).all()

        kernel = aa.Kernel2D.ones(
            shape_native=(4, 2), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=2.0, normalize=True
        )

        assert kernel.pixel_scales[0] == pytest.approx(0.4444444, 1.0e-4)
        assert kernel.pixel_scales[1] == pytest.approx(0.4, 1.0e-4)
        assert (kernel.native == (1.0 / 45.0) * np.ones((9, 5))).all()

    def test__kernel_is_odd_and_even_after_binning_up__resized_to_odd_and_odd_with_shape_plus_one(
        self,
    ):
        kernel = aa.Kernel2D.ones(
            shape_native=(6, 4), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.5, normalize=True
        )

        assert kernel.pixel_scales == pytest.approx((2.0, 1.3333333333), 1.0e-4)
        assert (kernel.native == (1.0 / 9.0) * np.ones((3, 3))).all()

        kernel = aa.Kernel2D.ones(
            shape_native=(9, 12), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.33333333333, normalize=True
        )

        assert kernel.pixel_scales == pytest.approx((3.0, 2.4), 1.0e-4)
        assert (kernel.native == (1.0 / 15.0) * np.ones((3, 5))).all()

        kernel = aa.Kernel2D.ones(
            shape_native=(4, 6), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.5, normalize=True
        )

        assert kernel.pixel_scales == pytest.approx((1.33333333333, 2.0), 1.0e-4)
        assert (kernel.native == (1.0 / 9.0) * np.ones((3, 3))).all()

        kernel = aa.Kernel2D.ones(
            shape_native=(12, 9), pixel_scales=1.0, normalize=False
        )

        kernel = kernel.rescaled_with_odd_dimensions_from_rescale_factor(
            rescale_factor=0.33333333333, normalize=True
        )

        assert kernel.pixel_scales == pytest.approx((2.4, 3.0), 1.0e-4)
        assert (kernel.native == (1.0 / 15.0) * np.ones((5, 3))).all()
class TestConvolve:
    def test__kernel_is_not_odd_x_odd__raises_error(self):
        kernel = aa.Kernel2D.manual_native(
            array=[[0.0, 1.0], [1.0, 2.0]], pixel_scales=1.0
        )

        with pytest.raises(exc.KernelException):
            kernel.convolved_array_from_array(np.ones((5, 5)))

    def test__image_is_3x3_central_value_of_one__kernel_is_cross__blurred_image_becomes_cross(
        self,
    ):
        image = aa.Array2D.manual_native(
            array=[[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]], pixel_scales=1.0
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(image)

        assert (blurred_image == kernel).all()

    def test__image_is_4x4_central_value_of_one__kernel_is_cross__blurred_image_becomes_cross(
        self,
    ):
        image = aa.Array2D.manual_native(
            array=[
                [0.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, 0.0, 0.0],
            ],
            pixel_scales=1.0,
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(array=image)

        assert (
            blurred_image.native
            == np.array(
                [
                    [0.0, 1.0, 0.0, 0.0],
                    [1.0, 2.0, 1.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 0.0, 0.0],
                ]
            )
        ).all()

    def test__image_is_4x3_central_value_of_one__kernel_is_cross__blurred_image_becomes_cross(
        self,
    ):
        image = aa.Array2D.manual_native(
            array=[[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]],
            pixel_scales=1.0,
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(image)

        assert (
            blurred_image.native
            == np.array(
                [[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]
            )
        ).all()

    def test__image_is_3x4_central_value_of_one__kernel_is_cross__blurred_image_becomes_cross(
        self,
    ):
        image = aa.Array2D.manual_native(
            array=[[0.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0]],
            pixel_scales=1.0,
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[0.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 0.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(image)

        assert (
            blurred_image.native
            == np.array(
                [[0.0, 1.0, 0.0, 0.0], [1.0, 2.0, 1.0, 0.0], [0.0, 1.0, 0.0, 0.0]]
            )
        ).all()

    def test__image_is_4x4_has_two_central_values__kernel_is_asymmetric__blurred_image_follows_convolution(
        self,
    ):
        image = aa.Array2D.manual_native(
            array=[
                [0.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 0.0],
            ],
            pixel_scales=1.0,
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[1.0, 1.0, 1.0], [2.0, 2.0, 1.0], [1.0, 3.0, 3.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(image)

        assert (
            blurred_image.native
            == np.array(
                [
                    [1.0, 1.0, 1.0, 0.0],
                    [2.0, 3.0, 2.0, 1.0],
                    [1.0, 5.0, 5.0, 1.0],
                    [0.0, 1.0, 3.0, 3.0],
                ]
            )
        ).all()

    def test__image_is_4x4_values_are_on_edge__kernel_is_asymmetric__blurring_does_not_account_for_edge_effects(
        self,
    ):
        image = aa.Array2D.manual_native(
            [
                [0.0, 0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, 0.0, 1.0],
                [0.0, 0.0, 0.0, 0.0],
            ],
            pixel_scales=1.0,
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[1.0, 1.0, 1.0], [2.0, 2.0, 1.0], [1.0, 3.0, 3.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(image)

        assert (
            blurred_image.native
            == np.array(
                [
                    [1.0, 1.0, 0.0, 0.0],
                    [2.0, 1.0, 1.0, 1.0],
                    [3.0, 3.0, 2.0, 2.0],
                    [0.0, 0.0, 1.0, 3.0],
                ]
            )
        ).all()

    def test__image_is_4x4_values_are_on_corner__kernel_is_asymmetric__blurring_does_not_account_for_edge_effects(
        self,
    ):
        image = aa.Array2D.manual_native(
            array=[
                [1.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, 0.0, 1.0],
            ],
            pixel_scales=1.0,
        )

        kernel = aa.Kernel2D.manual_native(
            array=[[1.0, 1.0, 1.0], [2.0, 2.0, 1.0], [1.0, 3.0, 3.0]], pixel_scales=1.0
        )

        blurred_image = kernel.convolved_array_from_array(image)

        assert (
            blurred_image.native
            == np.array(
                [
                    [2.0, 1.0, 0.0, 0.0],
                    [3.0, 3.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 1.0],
                    [0.0, 0.0, 2.0, 2.0],
                ]
            )
        ).all()
class TestFromGaussian:
    def test__identical_to_gaussian_light_profile(self):
        kernel = aa.Kernel2D.from_gaussian(
            shape_native=(3, 3),
            pixel_scales=1.0,
            centre=(0.1, 0.1),
            axis_ratio=0.9,
            angle=45.0,
            sigma=1.0,
            normalize=True,
        )

        assert kernel.native == pytest.approx(
            np.array(
                [
                    [0.06281, 0.13647, 0.0970],
                    [0.11173, 0.21589, 0.136477],
                    [0.065026, 0.11173, 0.06281],
                ]
            ),
            1.0e-3,
        )
class TestFromAlmaGaussian:
    def test__identical_to_astropy_gaussian_model__circular_no_rotation(self):
        pixel_scales = 0.1

        x_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )
        y_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )

        gaussian_astropy = functional_models.Gaussian2D(
            amplitude=1.0,
            x_mean=2.0,
            y_mean=2.0,
            x_stddev=x_stddev,
            y_stddev=y_stddev,
            theta=0.0,
        )

        shape = (5, 5)
        y, x = np.mgrid[0 : shape[1], 0 : shape[0]]
        kernel_astropy = gaussian_astropy(x, y)
        kernel_astropy /= np.sum(kernel_astropy)

        kernel = aa.Kernel2D.from_as_gaussian_via_alma_fits_header_parameters(
            shape_native=shape,
            pixel_scales=pixel_scales,
            y_stddev=2.0e-5,
            x_stddev=2.0e-5,
            theta=0.0,
            normalize=True,
        )

        assert kernel_astropy == pytest.approx(kernel.native, 1e-4)

    def test__identical_to_astropy_gaussian_model__circular_no_rotation_different_pixel_scale(
        self,
    ):
        pixel_scales = 0.02

        x_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )
        y_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )

        gaussian_astropy = functional_models.Gaussian2D(
            amplitude=1.0,
            x_mean=2.0,
            y_mean=2.0,
            x_stddev=x_stddev,
            y_stddev=y_stddev,
            theta=0.0,
        )

        shape = (5, 5)
        y, x = np.mgrid[0 : shape[1], 0 : shape[0]]
        kernel_astropy = gaussian_astropy(x, y)
        kernel_astropy /= np.sum(kernel_astropy)

        kernel = aa.Kernel2D.from_as_gaussian_via_alma_fits_header_parameters(
            shape_native=shape,
            pixel_scales=pixel_scales,
            y_stddev=2.0e-5,
            x_stddev=2.0e-5,
            theta=0.0,
            normalize=True,
        )

        assert kernel_astropy == pytest.approx(kernel.native, 1e-4)

    def test__identical_to_astropy_gaussian_model__include_ellipticity_from_x_and_y_stddev(
        self,
    ):
        pixel_scales = 0.1

        x_stddev = (
            1.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )
        y_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )

        theta_deg = 0.0
        theta = Angle(theta_deg, "deg").radian

        gaussian_astropy = functional_models.Gaussian2D(
            amplitude=1.0,
            x_mean=2.0,
            y_mean=2.0,
            x_stddev=x_stddev,
            y_stddev=y_stddev,
            theta=theta,
        )

        shape = (5, 5)
        y, x = np.mgrid[0 : shape[1], 0 : shape[0]]
        kernel_astropy = gaussian_astropy(x, y)
        kernel_astropy /= np.sum(kernel_astropy)

        kernel = aa.Kernel2D.from_as_gaussian_via_alma_fits_header_parameters(
            shape_native=shape,
            pixel_scales=pixel_scales,
            y_stddev=2.0e-5,
            x_stddev=1.0e-5,
            theta=theta_deg,
            normalize=True,
        )

        assert kernel_astropy == pytest.approx(kernel.native, 1e-4)

    def test__identical_to_astropy_gaussian_model__include_different_ellipticity_from_x_and_y_stddev(
        self,
    ):
        pixel_scales = 0.1

        x_stddev = (
            3.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )
        y_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )

        theta_deg = 0.0
        theta = Angle(theta_deg, "deg").radian

        gaussian_astropy = functional_models.Gaussian2D(
            amplitude=1.0,
            x_mean=2.0,
            y_mean=2.0,
            x_stddev=x_stddev,
            y_stddev=y_stddev,
            theta=theta,
        )

        shape = (5, 5)
        y, x = np.mgrid[0 : shape[1], 0 : shape[0]]
        kernel_astropy = gaussian_astropy(x, y)
        kernel_astropy /= np.sum(kernel_astropy)

        kernel = aa.Kernel2D.from_as_gaussian_via_alma_fits_header_parameters(
            shape_native=shape,
            pixel_scales=pixel_scales,
            y_stddev=2.0e-5,
            x_stddev=3.0e-5,
            theta=theta_deg,
            normalize=True,
        )

        assert kernel_astropy == pytest.approx(kernel.native, 1e-4)

    def test__identical_to_astropy_gaussian_model__include_rotation_angle_30(self):
        pixel_scales = 0.1

        x_stddev = (
            1.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )
        y_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )

        theta_deg = 30.0
        theta = Angle(theta_deg, "deg").radian

        gaussian_astropy = functional_models.Gaussian2D(
            amplitude=1.0,
            x_mean=1.0,
            y_mean=1.0,
            x_stddev=x_stddev,
            y_stddev=y_stddev,
            theta=theta,
        )

        shape = (3, 3)
        y, x = np.mgrid[0 : shape[1], 0 : shape[0]]
        kernel_astropy = gaussian_astropy(x, y)
        kernel_astropy /= np.sum(kernel_astropy)

        kernel = aa.Kernel2D.from_as_gaussian_via_alma_fits_header_parameters(
            shape_native=shape,
            pixel_scales=pixel_scales,
            y_stddev=2.0e-5,
            x_stddev=1.0e-5,
            theta=theta_deg,
            normalize=True,
        )

        assert kernel_astropy == pytest.approx(kernel.native, 1e-4)

    def test__identical_to_astropy_gaussian_model__include_rotation_angle_230(self):
        pixel_scales = 0.1

        x_stddev = (
            1.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )
        y_stddev = (
            2.0e-5
            * (units.deg).to(units.arcsec)
            / pixel_scales
            / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        )

        theta_deg = 230.0
        theta = Angle(theta_deg, "deg").radian

        gaussian_astropy = functional_models.Gaussian2D(
            amplitude=1.0,
            x_mean=1.0,
            y_mean=1.0,
            x_stddev=x_stddev,
            y_stddev=y_stddev,
            theta=theta,
        )

        shape = (3, 3)
        y, x = np.mgrid[0 : shape[1], 0 : shape[0]]
        kernel_astropy = gaussian_astropy(x, y)
        kernel_astropy /= np.sum(kernel_astropy)

        kernel = aa.Kernel2D.from_as_gaussian_via_alma_fits_header_parameters(
            shape_native=shape,
            pixel_scales=pixel_scales,
            y_stddev=2.0e-5,
            x_stddev=1.0e-5,
            theta=theta_deg,
            normalize=True,
        )

        assert kernel_astropy == pytest.approx(kernel.native, 1e-4)
from hallo.inc.input_parser import InputParser
def test_no_args():
p = InputParser("blah blah")
assert p.remaining_text == "blah blah"
assert len(p.args_dict) == 0
def test_multiple_simple_args():
p = InputParser("blah blah arg1=val1 arg2=val2 arg3=val3")
assert p.remaining_text == "blah blah"
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
assert p.args_dict["arg3"] == "val3"
def test_quoted_args_quoted_values():
p = InputParser("yo 'base unit'=\"hello world\"")
assert p.remaining_text == "yo"
assert p.args_dict["base unit"] == "hello world"
def test_quoted_args_unquoted_values():
p = InputParser("yo 'base unit'=hello world")
assert p.remaining_text == "yo world"
assert p.args_dict["base unit"] == "hello"
def test_unquoted_args_quoted_values():
p = InputParser('yo base unit="hello world"')
assert p.remaining_text == "yo base"
assert p.args_dict["unit"] == "hello world"
def test_unquoted_args_unquoted_values():
p = InputParser("yo base unit=hello world")
assert p.remaining_text == "yo base world"
assert p.args_dict["unit"] == "hello"
def test_mismatched_quotes():
p = InputParser('yo \'base unit"="hello world"')
assert p.remaining_text == "yo 'base"
assert p.args_dict['unit"'] == "hello world"
p = InputParser("yo 'base unit'=\"hello's world\"")
assert p.remaining_text == "yo"
assert p.args_dict["base unit"] == "hello's world"
def test_all_types():
p = InputParser(
"yo 'base unit'=\"hello world\" arg1='value 1' 'arg 2'=val2 arg3=val3"
)
assert p.remaining_text == "yo"
assert p.args_dict["base unit"] == "hello world"
assert p.args_dict["arg1"] == "value 1"
assert p.args_dict["arg 2"] == "val2"
assert p.args_dict["arg3"] == "val3"
def test_remaining_text_start_and_end():
p = InputParser("blah blah arg1=val1 arg2=val2 hey")
assert p.remaining_text == "blah blah hey"
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
def test_unstripped_input():
p = InputParser(" blah blah ")
assert p.remaining_text == "blah blah"
def test_get_arg_by_names():
p = InputParser("blah blah arg1=val1 arg2=val2 arg3=val3")
assert p.remaining_text == "blah blah"
assert p.get_arg_by_names(["arg2"]) == "val2"
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
assert p.args_dict["arg3"] == "val3"
def test_get_arg_by_names_no_match():
p = InputParser("blah blah arg1=val1 arg2=val2 arg3=val3")
assert p.remaining_text == "blah blah"
assert p.get_arg_by_names(["arg4"]) is None
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
assert p.args_dict["arg3"] == "val3"
def test_get_arg_by_names_one_match():
p = InputParser("blah blah arg1=val1 arg2=val2 arg3=val3")
assert p.remaining_text == "blah blah"
assert p.get_arg_by_names(["arg4", "arg5", "arg3"]) == "val3"
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
assert p.args_dict["arg3"] == "val3"
def test_get_arg_by_names_first_match():
p = InputParser("blah blah arg1=val1 arg2=val2 arg3=val3")
assert p.remaining_text == "blah blah"
assert p.get_arg_by_names(["arg1", "arg2"]) == "val1"
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
assert p.args_dict["arg3"] == "val3"
def test_parse_string_no_numbers():
p = InputParser("blah bloo blee")
assert p.remaining_text == "blah bloo blee"
assert len(p.args_dict) == 0
assert len(p.string_words) == 3
assert len(p.number_words) == 0
assert p.string_words == ["blah", "bloo", "blee"]
def test_parse_string_all_numbers():
p = InputParser("5 421 8916 34.5 -3")
assert p.remaining_text == "5 421 8916 34.5 -3"
assert len(p.args_dict) == 0
assert len(p.string_words) == 0
assert len(p.number_words) == 5
assert p.number_words == [5, 421, 8916, 34.5, -3]
def test_parse_string_mix_of_numbers_and_args():
p = InputParser("blah blah arg1=val1 arg2=val2 5")
assert p.remaining_text == "blah blah 5"
assert p.args_dict["arg1"] == "val1"
assert p.args_dict["arg2"] == "val2"
assert p.string_words == ["blah", "blah"]
assert p.number_words == [5]
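The tests above pin down the InputParser contract: `key=value` pairs are pulled into `args_dict`, the leftover text is kept, and leftover words are classified as numbers or strings. A toy sketch of those semantics (hypothetical, ignoring the quoting rules the quoted-arg tests cover):

```python
class SimpleInputParser:
    """Toy parser: splits `key=value` tokens out of the input, keeps the rest.

    Handles only unquoted keys and values; the real parser also supports
    single- and double-quoted keys and values.
    """

    def __init__(self, text):
        self.args_dict = {}
        remaining = []
        for word in text.split():
            if "=" in word:
                key, _, value = word.partition("=")
                self.args_dict[key] = value
            else:
                remaining.append(word)
        self.remaining_text = " ".join(remaining)
        # Classify leftover words as numbers or plain strings.
        self.number_words = []
        self.string_words = []
        for word in remaining:
            try:
                num = float(word)
                self.number_words.append(int(num) if num == int(num) else num)
            except ValueError:
                self.string_words.append(word)

p = SimpleInputParser("blah blah arg1=val1 arg2=val2 5")
```

With the mixed input above this reproduces the behaviour of `test_parse_string_mix_of_numbers_and_args`.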

# ---- rangefinder/bosch/requests/__init__.py (tothcs1105/BOSCH-GLM-rangefinder, MIT) ----
from .bosch_request_192 import BoschRequest192

# ---- sdk/core/azure-core/tests/test_messaging_cloud_event.py (ormichae/azure-sdk-for-python, MIT) ----
# ------------------------------------
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
# ------------------------------------
import pytest
import json
import datetime
from azure.core.messaging import CloudEvent
from azure.core._utils import _convert_to_isoformat
from azure.core.serialization import NULL
# Cloud Event tests
def test_cloud_event_constructor():
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
data='cloudevent'
)
assert event.specversion == '1.0'
assert event.time.__class__ == datetime.datetime
assert event.id is not None
assert event.source == 'Azure.Core.Sample'
assert event.data == 'cloudevent'
def test_cloud_event_constructor_unexpected_keyword():
with pytest.raises(ValueError) as e:
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
data='cloudevent',
unexpected_keyword="not allowed",
another_bad_kwarg="not allowed either"
)
    assert "unexpected_keyword" in str(e.value)
    assert "another_bad_kwarg" in str(e.value)
def test_cloud_event_constructor_blank_data():
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
data=''
)
assert event.specversion == '1.0'
assert event.time.__class__ == datetime.datetime
assert event.id is not None
assert event.source == 'Azure.Core.Sample'
assert event.data == ''
def test_cloud_event_constructor_NULL_data():
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
data=NULL
)
assert event.data == NULL
assert event.data is NULL
def test_cloud_event_constructor_none_data():
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
data=None
)
    assert event.data is None
def test_cloud_event_constructor_missing_data():
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
)
    assert event.data is None
    assert event.datacontenttype is None
    assert event.dataschema is None
    assert event.subject is None
def test_cloud_storage_dict():
cloud_storage_dict = {
"id":"a0517898-9fa4-4e70-b4a3-afda1dd68672",
"source":"/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Storage/storageAccounts/{storage-account}",
"data":{
"api":"PutBlockList",
"client_request_id":"6d79dbfb-0e37-4fc4-981f-442c9ca65760",
"request_id":"831e1650-001e-001b-66ab-eeb76e000000",
"e_tag":"0x8D4BCC2E4835CD0",
"content_type":"application/octet-stream",
"content_length":524288,
"blob_type":"BlockBlob",
"url":"https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob",
"sequencer":"00000000000004420000000000028963",
"storage_diagnostics":{"batchId":"b68529f3-68cd-4744-baa4-3c0498ec19f0"}
},
"type":"Microsoft.Storage.BlobCreated",
"time":"2021-02-18T20:18:10.581147898Z",
"specversion":"1.0"
}
event = CloudEvent.from_dict(cloud_storage_dict)
assert event.data == {
"api":"PutBlockList",
"client_request_id":"6d79dbfb-0e37-4fc4-981f-442c9ca65760",
"request_id":"831e1650-001e-001b-66ab-eeb76e000000",
"e_tag":"0x8D4BCC2E4835CD0",
"content_type":"application/octet-stream",
"content_length":524288,
"blob_type":"BlockBlob",
"url":"https://oc2d2817345i60006.blob.core.windows.net/oc2d2817345i200097container/oc2d2817345i20002296blob",
"sequencer":"00000000000004420000000000028963",
"storage_diagnostics":{"batchId":"b68529f3-68cd-4744-baa4-3c0498ec19f0"}
}
assert event.specversion == "1.0"
assert event.time.__class__ == datetime.datetime
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 581147
assert event.__class__ == CloudEvent
assert "id" in cloud_storage_dict
assert "data" in cloud_storage_dict
def test_cloud_custom_dict_with_extensions():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.539861122+00:00",
"specversion":"1.0",
"ext1": "example",
"ext2": "example2"
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 539861
assert event.extensions == {"ext1": "example", "ext2": "example2"}
def test_cloud_custom_dict_ms_precision_is_gt_six():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.539861122+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 539861
def test_cloud_custom_dict_ms_precision_is_lt_six():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.123+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 123000
def test_cloud_custom_dict_ms_precision_is_eq_six():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.123456+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 123456
def test_cloud_custom_dict_ms_precision_is_gt_six_z_not():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.539861122Z",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 539861
def test_cloud_custom_dict_ms_precision_is_lt_six_z_not():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.123Z",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 123000
def test_cloud_custom_dict_ms_precision_is_eq_six_z_not():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e034",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10.123456Z",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == {"team": "event grid squad"}
assert event.__class__ == CloudEvent
assert event.time.month == 2
assert event.time.day == 18
assert event.time.hour == 20
assert event.time.microsecond == 123456
def test_cloud_custom_dict_blank_data():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":'',
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
assert event.data == ''
assert event.__class__ == CloudEvent
def test_cloud_custom_dict_no_data():
cloud_custom_dict_with_missing_data = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_missing_data)
assert event.__class__ == CloudEvent
assert event.data is None
def test_cloud_custom_dict_null_data():
cloud_custom_dict_with_none_data = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"type":"Azure.Sdk.Sample",
"data":None,
"dataschema":None,
"time":"2021-02-18T20:18:10+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_none_data)
assert event.__class__ == CloudEvent
assert event.data == NULL
assert event.dataschema is NULL
def test_cloud_custom_dict_valid_optional_attrs():
cloud_custom_dict_with_none_data = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"type":"Azure.Sdk.Sample",
"data":None,
"dataschema":"exists",
"time":"2021-02-18T20:18:10+00:00",
"specversion":"1.0",
}
event = CloudEvent.from_dict(cloud_custom_dict_with_none_data)
assert event.__class__ == CloudEvent
assert event.data is NULL
assert event.dataschema == "exists"
def test_cloud_custom_dict_both_data_and_base64():
cloud_custom_dict_with_data_and_base64 = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":"abc",
"data_base64":"Y2Wa==",
"type":"Azure.Sdk.Sample",
"time":"2021-02-18T20:18:10+00:00",
"specversion":"1.0",
}
with pytest.raises(ValueError):
event = CloudEvent.from_dict(cloud_custom_dict_with_data_and_base64)
def test_cloud_custom_dict_base64():
cloud_custom_dict_base64 = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data_base64":'Y2xvdWRldmVudA==',
"type":"Azure.Sdk.Sample",
"time":"2021-02-23T17:11:13.308772-08:00",
"specversion":"1.0"
}
event = CloudEvent.from_dict(cloud_custom_dict_base64)
assert event.data == b'cloudevent'
assert event.specversion == "1.0"
assert event.time.hour == 17
assert event.time.minute == 11
assert event.time.day == 23
assert event.time.tzinfo is not None
assert event.__class__ == CloudEvent
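The `data_base64` decoding exercised above is plain standard base64; a quick stdlib check of the payload used in the test:

```python
import base64

# "cloudevent" encodes to the data_base64 payload used in the test above.
encoded = base64.b64encode(b"cloudevent").decode("ascii")
decoded = base64.b64decode("Y2xvdWRldmVudA==")
```

The round trip confirms why `event.data` comes back as the bytes `b'cloudevent'`.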
def test_data_and_base64_both_exist_raises():
with pytest.raises(ValueError):
CloudEvent.from_dict(
{"source":'sample',
"type":'type',
"data":'data',
"data_base64":'Y2kQ=='
}
)
def test_cloud_event_repr():
event = CloudEvent(
source='Azure.Core.Sample',
type='SampleType',
data='cloudevent'
)
assert repr(event).startswith("CloudEvent(source=Azure.Core.Sample, type=SampleType, specversion=1.0,")
def test_extensions_upper_case_value_error():
with pytest.raises(ValueError):
event = CloudEvent(
source='sample',
type='type',
data='data',
extensions={"lowercase123": "accepted", "NOTlower123": "not allowed"}
)
def test_extensions_not_alphanumeric_value_error():
with pytest.raises(ValueError):
event = CloudEvent(
source='sample',
type='type',
data='data',
extensions={"lowercase123": "accepted", "not@lph@nu^^3ic": "not allowed"}
)
def test_cloud_from_dict_with_invalid_extensions():
cloud_custom_dict_with_extensions = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"source":"https://egtest.dev/cloudcustomevent",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2020-08-07T02:06:08.11969Z",
"specversion":"1.0",
"ext1": "example",
"BADext2": "example2"
}
with pytest.raises(ValueError):
event = CloudEvent.from_dict(cloud_custom_dict_with_extensions)
def test_convert_to_isoformat_ms_precision_is_gt_six():
    time = "2021-02-18T20:18:10.539861122+00:00"
date_obj = _convert_to_isoformat(time)
assert date_obj.month == 2
assert date_obj.day == 18
assert date_obj.hour == 20
assert date_obj.microsecond == 539861
def test_convert_to_isoformat_ms_precision_is_lt_six():
    time = "2021-02-18T20:18:10.123+00:00"
date_obj = _convert_to_isoformat(time)
assert date_obj.month == 2
assert date_obj.day == 18
assert date_obj.hour == 20
assert date_obj.microsecond == 123000
def test_convert_to_isoformat_ms_precision_is_eq_six():
    time = "2021-02-18T20:18:10.123456+00:00"
date_obj = _convert_to_isoformat(time)
assert date_obj.month == 2
assert date_obj.day == 18
assert date_obj.hour == 20
assert date_obj.microsecond == 123456
def test_convert_to_isoformat_ms_precision_is_gt_six_z_not():
    time = "2021-02-18T20:18:10.539861122Z"
date_obj = _convert_to_isoformat(time)
assert date_obj.month == 2
assert date_obj.day == 18
assert date_obj.hour == 20
assert date_obj.microsecond == 539861
def test_convert_to_isoformat_ms_precision_is_lt_six_z_not():
    time = "2021-02-18T20:18:10.123Z"
date_obj = _convert_to_isoformat(time)
assert date_obj.month == 2
assert date_obj.day == 18
assert date_obj.hour == 20
assert date_obj.microsecond == 123000
def test_convert_to_isoformat_ms_precision_is_eq_six_z_not():
    time = "2021-02-18T20:18:10.123456Z"
date_obj = _convert_to_isoformat(time)
assert date_obj.month == 2
assert date_obj.day == 18
assert date_obj.hour == 20
assert date_obj.microsecond == 123456
def test_eventgrid_event_schema_raises():
cloud_custom_dict = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"data":{"team": "event grid squad"},
"dataVersion": "1.0",
"subject":"Azure.Sdk.Sample",
"eventTime":"2020-08-07T02:06:08.11969Z",
"eventType":"pull request",
}
with pytest.raises(ValueError, match="The event you are trying to parse follows the Eventgrid Schema. You can parse EventGrid events using EventGridEvent.from_dict method in the azure-eventgrid library."):
CloudEvent.from_dict(cloud_custom_dict)
def test_wrong_schema_raises_no_source():
cloud_custom_dict = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"data":{"team": "event grid squad"},
"type":"Azure.Sdk.Sample",
"time":"2020-08-07T02:06:08.11969Z",
"specversion":"1.0",
}
with pytest.raises(ValueError, match="The event does not conform to the cloud event spec https://github.com/cloudevents/spec. The `source` and `type` params are required."):
CloudEvent.from_dict(cloud_custom_dict)
def test_wrong_schema_raises_no_type():
cloud_custom_dict = {
"id":"de0fd76c-4ef4-4dfb-ab3a-8f24a307e033",
"data":{"team": "event grid squad"},
"source":"Azure/Sdk/Sample",
"time":"2020-08-07T02:06:08.11969Z",
"specversion":"1.0",
}
with pytest.raises(ValueError, match="The event does not conform to the cloud event spec https://github.com/cloudevents/spec. The `source` and `type` params are required."):
        CloudEvent.from_dict(cloud_custom_dict)
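The `_convert_to_isoformat` tests above fix two behaviours: fractional seconds beyond 6 digits are truncated, and a trailing `Z` means UTC. A stdlib-only sketch of a parser with those semantics (an illustrative reimplementation, not the azure-core helper itself):

```python
import datetime

def convert_to_isoformat_sketch(date_time):
    """Parse an RFC 3339 timestamp, truncating >6 fractional-second digits."""
    if date_time.endswith("Z"):
        date_time = date_time[:-1] + "+00:00"
    # Split off the timezone offset (always +HH:MM or -HH:MM here).
    body, sign, offset = date_time[:-6], date_time[-6], date_time[-5:]
    if "." in body:
        whole, frac = body.split(".")
        body = whole + "." + frac[:6]  # keep at most 6 digits of precision
        fmt = "%Y-%m-%dT%H:%M:%S.%f"
    else:
        fmt = "%Y-%m-%dT%H:%M:%S"
    parsed = datetime.datetime.strptime(body, fmt)
    delta = datetime.timedelta(hours=int(offset[:2]), minutes=int(offset[3:]))
    tz = datetime.timezone(-delta if sign == "-" else delta)
    return parsed.replace(tzinfo=tz)

dt = convert_to_isoformat_sketch("2021-02-18T20:18:10.539861122Z")
```

Feeding it the same timestamps as the tests reproduces the 539861 and 123000 microsecond results.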
bf36070bff2fa85316322d8702a57f9c5a396ba4 | 109 | py | Python | texterrors/__init__.py | ishine/texterrors | adabb0256bce27d6052677213556fa436638bada | [
"Apache-2.0"
] | 25 | 2020-10-20T16:47:15.000Z | 2022-02-12T21:14:17.000Z | texterrors/__init__.py | ishine/texterrors | adabb0256bce27d6052677213556fa436638bada | [
"Apache-2.0"
] | 2 | 2021-12-02T04:47:59.000Z | 2022-01-27T09:40:07.000Z | texterrors/__init__.py | ishine/texterrors | adabb0256bce27d6052677213556fa436638bada | [
"Apache-2.0"
] | 7 | 2020-10-20T16:27:46.000Z | 2021-06-25T17:02:58.000Z | from .texterrors import align_texts, process_lines, lev_distance, get_oov_cer, align_texts_ctm, seq_distance

# ---- venv/lib/python3.8/site-packages/debugpy/_vendored/pydevd/pydevd_attach_to_process/linux_and_mac/lldb_prepare.py (Retraces/UkraineBot, MIT) ----
/home/runner/.cache/pip/pool/79/29/8b/d4a2cb3ab1bcaf8449c9539ddd44948eca6559774a4820a79c6c8655da

# ---- torchvision/transforms/__init__.py (koenvandesande/vision, BSD-3-Clause) ----
from .transforms import *
from .transforms_video import *

# ---- yggdrasil/dirinode.py (ProKil/OS2018spring-projects-g10, MIT) ----
import cython
if not cython.compiled:
import z3
from disk import *
import errno
from stat import S_IFDIR
from collections import namedtuple
#from diskspec import Bitmap, DirLookup, Allocator32
from dirspec import *
#Disk = namedtuple('Disk', ['read', 'write'])
class Disk(object):
def __init__(self, dev, _txndisk):
#super(Disk, self).__init__()
self.dev = dev
self._txndisk = _txndisk
def read(self, bid):
return self._txndisk._read(self.dev, bid)
def write(self, bid, data):
self._txndisk.write_tx(self.dev, bid, data)
'''
class Orphans(object):
def __init__(self, orphandisk):
self._orphandisk = orphandisk
def size(self):
return self._orphandisk.read(0).__getitem__(0)
def index(self, idx):
orphanblock = self._orphandisk.read(0)
#n = orphanblock[0]
n = orphanblock.__getitem__(0)
assertion(0 <= n, "orphan index: n is negative")
assertion(n < 511, "orphan index: n >= 511")
np = Extract(8, 0, idx)
return orphanblock.__getitem__(np + 1)
def reset(self):
self._orphandisk.write(0, ConstBlock(0))
def clear(self, idx):
orphanblock = self._orphandisk.read(0)
np = Extract(8, 0, idx)
#orphanblock[np] = 0
orphanblock.__setitem__(np, 0)
self._orphandisk.write(0, orphanblock)
def append(self, value):
orphanblock = self._orphandisk.read(0)
#n = orphanblock[0]
n = orphanblock.__getitem__(0)
assertion(0 <= n, "orphan index: n is negative")
assertion(n < 511, "orphan index: n >= 511")
np = Extract(8, 0, n)
"""
orphanblock[np + 1] = value
orphanblock[0] = n + 1
"""
orphanblock.__setitem__(np + 1, value)
orphanblock.__setitem__(0, n + 1)
self._orphandisk.write(0, orphanblock)
'''
# this class is auto-generated from cpp code
class Orphans:
def __init__(self, orphandisk):
self._orphandisk = orphandisk
def size(self):
return self._orphandisk.read(0).__getitem__(0)
def index(self, idx):
orphanblock = self._orphandisk.read(0)
n = orphanblock.__getitem__(0)
assertion(0 <= n)
assertion(n < 511)
np = Extract(8, 0, idx)
return orphanblock.__getitem__(np + 1)
def reset(self):
self._orphandisk.write(0, ConstBlock(0))
def clear(self, idx):
orphanblock = self._orphandisk.read(0)
np = Extract(8, 0, idx)
orphanblock.__setitem__(np, 0)
self._orphandisk.write(0, orphanblock)
def append(self, value):
orphanblock = self._orphandisk.read(0)
n = orphanblock.__getitem__(0)
assertion(0 <= n)
assertion(n < 511)
np = Extract(8, 0, n)
orphanblock.__setitem__(np + 1, value)
orphanblock.__setitem__(0, n + 1)
self._orphandisk.write(0, orphanblock)
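The Orphans class packs the orphan list into a single 512-word block: word 0 is the count and words 1..511 hold inode numbers. A plain-Python model of that layout (hypothetical, with a list standing in for the disk block):

```python
class OrphanBlockModel:
    """Model of the orphan list: block[0] is the count, block[1:] the entries."""
    WORDS = 512

    def __init__(self):
        self.block = [0] * self.WORDS  # a zeroed 512-word block

    def size(self):
        return self.block[0]

    def append(self, ino):
        n = self.block[0]
        assert 0 <= n < self.WORDS - 1  # same 0 <= n < 511 bound asserted above
        self.block[n + 1] = ino
        self.block[0] = n + 1

    def index(self, idx):
        return self.block[idx + 1]

    def reset(self):
        self.block = [0] * self.WORDS

m = OrphanBlockModel()
m.append(7)
m.append(9)
```

The real class performs the same updates through `_orphandisk.read`/`write` so every change goes through the transactional disk.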
'''
class MyPIno(object):
def __init__(self, inode):
self.inode = inode
def is_mapped(self, vbn, inode = None):
if inode == None:
return self.inode.is_mapped(vbn)
return inode.is_mapped(vbn)
def mappingi(self, vbn, inode = None):
if inode == None:
return self.inode.mappingi(vbn)
return inode.mappingi(vbn)
def read(self, bid, inode = None):
if inode == None:
return self.inode.read(bid)
return inode.read(bid)
def bmap(self, bid, inode = None):
if inode == None:
return self.inode.bmap(bid)
return inode.bmap(bid)
'''
# this class is auto-generated from cpp code
class MyPIno:
def __init__(self, _inode):
self.inode = _inode
def is_mapped(self, vbn, _inode=0):
if _inode == 0:
return self.inode.is_mapped(vbn)
return _inode.is_mapped(vbn)
def mappingi(self, vbn, _inode=0):
if _inode == 0:
return self.inode.mappingi(vbn)
return _inode.mappingi(vbn)
def read(self, bid, _inode=0):
if _inode == 0:
return self.inode.read(bid)
return _inode.read(bid)
def bmap(self, bid, _inode=0):
if _inode == 0:
return self.inode.bmap(bid)
return _inode.bmap(bid)
class Tuple2(object):
def __init__(self, _a, _b):
self.a = _a
self.b = _b
self.start = 0
def __getitem__(self, key):
if key == 0:
return self.a
return self.b
def __iter__(self):
self.start = 0
return self
def next(self):
if self.start >= 2:
raise StopIteration
else:
self.start += 1
return self.__getitem__(self.start - 1)
class Tuple3(object):
def __init__(self, _block, _bid, _off):
self.block = _block
self.bid = _bid
self.off = _off
self.start = 0
def __getitem__(self, key):
if key == 0:
return self.block
if key == 1:
return self.bid
return self.off
def __iter__(self):
self.start = 0
return self
def next(self):
if self.start >= 3:
raise StopIteration
else:
self.start += 1
return self.__getitem__(self.start - 1)
def get_bid(self):
return self.bid
def get_off(self):
return self.off
def get_block(self):
return self.block
class Tuple4(object):
def __init__(self, _block, _bid, _off, _valid):
self.block = _block
self.bid = _bid
self.off = _off
self.valid = _valid
self.start = 0
def get_bid(self):
return self.bid
def get_off(self):
return self.off
def get_valid(self):
return self.valid
def get_block(self):
return self.block
def __getitem__(self, key):
if key == 0:
return self.block
if key == 1:
return self.bid
if key == 2:
return self.off
return self.valid
def __iter__(self):
self.start = 0
return self
def next(self):
if self.start >= 4:
raise StopIteration
else:
self.start += 1
return self.__getitem__(self.start - 1)
'''
class DirImpl(object):
# ============= begin ===============
NBLOCKS = None
IFREEDISK = None
ORPHANS = None
def __init__(self, txndisk, inode):
self._txndisk = txndisk
self._inode = inode
self._dirlook = DirLook(MyPIno(inode))
self._ifree = Disk(DirImpl.IFREEDISK, self._txndisk)
orphandisk = Disk(DirImpl.ORPHANS, self._txndisk)
self._iallocator = Allocator32(self._ifree, 0, 1024)
self._ibitmap = Bitmap(self._ifree)
self._orphans = Orphans(orphandisk)
def locate_dentry_ino(self, ino, name):
tuple = self._dirlook.locate_dentry_ino(ino, name)
ioff = tuple.__getitem__(0)
off = tuple.__getitem__(1)
assertion(ULT(ioff, 522))
assertion(ioff != 10)
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
valid = And(bid != 0, off % 16 == 0, Extract(31, 0, block.__getitem__(off)) != 0)
i = 0
while i < 15:
valid = And(valid, block.__getitem__(off + i + 1) == name.__getitem__(i))
i += 1
return Tuple4(block, bid, off, valid)
def locate_empty_dentry_slot_ino(self, ino):
tuple = self._dirlook.locate_empty_slot_ino(ino)
ioff = tuple.__getitem__(0)
off = tuple.__getitem__(1)
assertion(ULT(ioff, 522))
assertion(ioff != 10)
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
assertion(bid != 0)
assertion(off % 16 == 0)
assertion(block.__getitem__(off) == 0)
return Tuple3(block, bid, off)
def locate_empty_dentry_slot_err_ino(self, ino):
tuple = self._dirlook.locate_empty_slot_ino(ino)
ioff = tuple.__getitem__(0)
off = tuple.__getitem__(1)
assertion(ULT(ioff, 522))
assertion(ioff != 10)
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
return Tuple4(block, bid, off, And(bid != 0, off % 16 == 0, block.__getitem__(off) == 0))
def write_dentry(self, block, off, ino, name):
block.__setitem__(off, ino)
i = 0
while i < 15:
block.__setitem__(off + i + 1, name.__getitem__(i))
i += 1
def clear_dentry(self, block, off):
i = 0
while i < 16:
block.__setitem__(off + i, 0)
i += 1
def ialloc(self):
ino = self._iallocator.alloc()
assertion(ino != 0)
assertion(self.is_ifree(ino))
self._ibitmap.set_bit(ino)
return ino
def is_ifree(self, ino):
return Not(self._ibitmap.is_set(ino))
def is_valid(self, ino):
return And(ino != 0, self._ibitmap.is_set(ino), UGT(self.get_iattr(ino).nlink, 0))
def is_gcable(self, ino):
return And(ino != 0, self._ibitmap.is_set(ino), self.get_iattr(ino).nlink == 0)
def is_dir(self, ino):
attr = self._inode.get_iattr(ino)
return And(self.is_valid(ino), (attr.mode & S_IFDIR) != 0)
def is_regular(self, ino):
attr = self._inode.get_iattr(ino)
return And(self.is_valid(ino), (attr.mode & S_IFDIR) == 0)
def get_iattr(self, ino):
return self._inode.get_iattr(ino)
def set_iattr(self, ino, attr):
self._inode.begin_tx()
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def read(self, ino, blocknum):
attr = self.get_iattr(ino)
bsize = attr.bsize
is_mapped = self._inode.is_mapped(Concat32(ino, blocknum))
lbn = self._inode.mappingi(Concat32(ino, blocknum))
res = self._inode.read(lbn)
zeroblock = ConstBlock(0)
return If(And(is_mapped, ULT(blocknum, bsize)), res, zeroblock)
def truncate(self, ino, fsize):
target_bsize = fsize / 4096 + (fsize % 4096 != 0)
attr = self._inode.get_iattr(ino)
while attr.bsize > target_bsize:
self._inode.begin_tx()
self._inode.bunmap(Concat32(ino, attr.bsize - 1))
attr.size = Concat32(attr.bsize - 1, fsize)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
if attr.fsize > fsize:
self._inode.begin_tx()
attr.size = Concat32(attr.bsize, fsize)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def write(self, ino, blocknum, v, size=BitVecVal(4096, 32)):
assertion(ULT(blocknum, 522))
assertion(ULT(BitVecVal(0, 32), size))
assertion(ULE(size, BitVecVal(4096, 32)))
assertion(self.is_regular(ino))
self._inode.begin_tx()
bid = self._inode.bmap(Concat32(ino, blocknum))
self._inode.write(bid, v)
attr = self._inode.get_iattr(ino)
nsize = Concat32(blocknum + 1, blocknum * 4096 + size)
update = ULE(attr.fsize, blocknum * 4096 + size)
attr.size = If(update, nsize, attr.size)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
return size
def lookup(self, parent, name):
assertion(self.is_dir(parent))
self._inode.begin_tx()
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
off = tp.get_off()
valid = tp.get_valid()
self._inode.commit_tx()
return If(valid, Extract(31, 0, parent_block.__getitem__(off)), 0)
def mknod(self, parent, name, mode, mtime):
assertion(self.is_dir(parent))
assertion(name.__getitem__(0) != 0)
self._inode.begin_tx()
tp = self.locate_empty_dentry_slot_err_ino(parent)
parent_block = tp.get_block()
parent_bid = tp.get_bid()
off = tp.get_off()
valid = tp.get_valid()
if Not(valid):
self._inode.commit_tx()
return Tuple2(0, errno.ENOSPC)
ino = self.ialloc()
attr = Stat(0, mtime, mode, 2)
self._inode.set_iattr(ino, attr)
attr = self._inode.get_iattr(parent)
assertion(ULE(attr.bsize, 522))
attr.size = Concat32(BitVecVal(522, 32), BitVecVal(4096 * 522, 32))
assertion(ULT(attr.nlink, attr.nlink + 1))
attr.nlink += 1
self._inode.set_iattr(parent, attr)
self.write_dentry(parent_block, off, ino, name)
parent_block.__setitem__(off, ino)
self._inode.write(parent_bid, parent_block)
self._inode.commit_tx()
return Tuple2(ino, 0)
def unlink(self, parent, name):
assertion(self.is_dir(parent))
assertion(name.__getitem__(0) != 0)
self._inode.begin_tx()
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
parent_bid = tp.get_bid()
off = tp.get_off()
valid = tp.get_valid()
assertion(valid)
attr = self._inode.get_iattr(parent)
assertion(UGE(attr.nlink, 2))
attr.nlink -= 1
self._inode.set_iattr(parent, attr)
ino = Extract(31, 0, parent_block.__getitem__(off))
attr = self._inode.get_iattr(ino)
attr.nlink = 1
self._inode.set_iattr(ino, attr)
self.clear_dentry(parent_block, off)
self._inode.write(parent_bid, parent_block)
self._orphans.append(Extend(ino, 64))
self._inode.commit_tx()
return ino
def rmdir(self, parent, name):
assertion(self.is_dir(parent))
assertion(name.__getitem__(0) != 0)
self._inode.begin_tx()
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
parent_bid = tp.get_bid()
off = tp.get_off()
valid = tp.get_valid()
if Not(valid):
self._inode.commit_tx()
return Tuple2(0, errno.ENOENT)
assertion(valid)
ino = Extract(31, 0, parent_block.__getitem__(off))
if Not(self.is_dir(ino)):
self._inode.commit_tx()
return Tuple2(0, errno.ENOTDIR)
attr = self._inode.get_iattr(parent)
assertion(UGE(attr.nlink, 2))
attr.nlink -= 1
self._inode.set_iattr(parent, attr)
self.clear_dentry(parent_block, off)
self._inode.write(parent_bid, parent_block)
attr = self._inode.get_iattr(ino)
attr.nlink = 1
self._inode.set_iattr(ino, attr)
self._orphans.append(Extend(ino, 64))
self._inode.commit_tx()
return Tuple2(ino, 0)
def rename(self, oparent, oname, nparent, nname):
assertion(self.is_dir(oparent))
assertion(self.is_dir(nparent))
assertion(oname.__getitem__(0) != 0)
assertion(nname.__getitem__(0) != 0)
self._inode.begin_tx()
attr = self._inode.get_iattr(oparent)
assertion(UGE(attr.nlink, 2))
attr.nlink -= 1
self._inode.set_iattr(oparent, attr)
attr = self._inode.get_iattr(nparent)
assertion(ULE(attr.bsize, 522))
attr.size = Concat32(BitVecVal(522, 32), BitVecVal(4096 * 522, 32))
assertion(ULT(attr.nlink, attr.nlink + 1))
attr.nlink += 1
self._inode.set_iattr(nparent, attr)
tp = self.locate_dentry_ino(oparent, oname)
oparent_block = tp.get_block()
oparent_bid = tp.get_bid()
ooff = tp.get_off()
ovalid = tp.get_valid()
assertion(ovalid)
ino = oparent_block.__getitem__(ooff)
self.clear_dentry(oparent_block, ooff)
self._inode.write(oparent_bid, oparent_block)
tp = self.locate_dentry_ino(nparent, nname)
nparent_block = tp.get_block()
nparent_bid = tp.get_bid()
noff = tp.get_off()
nvalid = tp.get_valid()
if nvalid:
self._orphans.append(nparent_block.__getitem__(noff))
self.clear_dentry(nparent_block, noff)
tp3 = self.locate_empty_dentry_slot_ino(nparent)
nparent_block = tp3.get_block()
nparent_bid = tp3.get_bid()
noff = tp3.get_off()
self.write_dentry(nparent_block, noff, ino, nname)
self._inode.write(nparent_bid, nparent_block)
self._inode.commit_tx()
return 0
def forget(self, ino):
if Or((self.get_iattr(ino).mode & S_IFDIR) != 0, self.get_iattr(ino).nlink != 1):
return
assertion(self.is_regular(ino))
self._inode.begin_tx()
attr = self._inode.get_iattr(ino)
attr.nlink = 0
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def fsync(self):
self._txndisk.flush()
def gc1(self, orph_index, off):
ino = Extract(31, 0, self._orphans.index(orph_index))
if not self.is_gcable(ino):
return
self._inode.begin_tx()
self._inode.bunmap(Concat32(ino, off))
nsize = off
attr = self._inode.get_iattr(ino)
if attr.bsize == nsize + 1:
attr.size = Concat32(nsize, nsize * 4096)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def gc2(self, orph_index):
ino = Extract(31, 0, self._orphans.index(orph_index))
if not self.is_gcable(ino):
return
if self._inode.get_iattr(ino).size == 0:
self._inode.begin_tx()
self._orphans.clear(orph_index)
self._ibitmap.unset_bit(ino)
self._inode.commit_tx()
def gc3(self):
self._inode.begin_tx()
self._orphans.reset()
self._inode.commit_tx()
# ============= end ==============
NBLOCKS = 522
IFREEDISK = 4
ORPHANS = 5
@cython.locals(inode='IndirectInodeDisk')
def __init__(self, txndisk, inode):
self._txndisk = txndisk
self._inode = inode
"""
self._Allocator = Allocator
self._Bitmap = Bitmap
self._DirLookup = DirLookup
"""
#PIno = namedtuple('Inode', ['is_mapped', 'mappingi', 'read', 'bmap'])
"""
self._dirlook = DirLookup(PIno(
is_mapped=lambda vbn, inode=inode: inode.is_mapped(vbn),
mappingi=lambda vbn, inode=inode: inode.mappingi(vbn),
read=lambda bid, inode=inode: inode.read(bid),
bmap=lambda bid, inode=inode: inode.bmap(bid),
))
"""
self._dirlook = DirLook(MyPIno(inode))
"""
self._ifree = Disk(
write=lambda bid, data: self._txndisk.write_tx(self.IFREEDISK, bid, data),
read=lambda bid: self._txndisk._read(self.IFREEDISK, bid))
orphandisk = Disk(
write=lambda bid, data: self._txndisk.write_tx(self.ORPHANS, bid, data),
read=lambda bid: self._txndisk._read(self.ORPHANS, bid))
"""
self._ifree = Disk(self.IFREEDISK, self._txndisk)
orphandisk = Disk(self.ORPHANS, self._txndisk)
"""
self._iallocator = Allocator(
lambda n: self._ifree.read(n),
0, 1024)
"""
self._iallocator = Allocator32(self._ifree, 0, 1024)
self._ibitmap = Bitmap(self._ifree)
self._orphans = Orphans(orphandisk)
def locate_dentry_ino(self, ino, name):
ioff, off = self._dirlook.locate_dentry_ino(ino, name)
assertion(ULT(ioff, 522), "locate_dentry_ino: invalid ioff")
assertion(ioff != 10, "locate_dentry_ino: invalid ioff")
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
#valid = And(bid != 0, off % 16 == 0, Extract(31, 0, block[off]) != 0)
valid = And(bid != 0, off % 16 == 0, Extract(31, 0, block.__getitem__(off)) != 0)
for i in range(15):
#valid = And(valid, block[off + i + 1] == name[i])
valid = And(valid, block.__getitem__(off + i + 1) == name.__getitem__(i))
#return block, bid, off, valid
return Tuple4(block, bid, off, valid)
def locate_empty_dentry_slot_ino(self, ino):
ioff, off = self._dirlook.locate_empty_slot_ino(ino)
assertion(ULT(ioff, 522), "locate_empty_dentry_slot: invalid ioff")
assertion(ioff != 10, "locate_empty_dentry_slot: invalid ioff")
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
assertion(bid != 0, "locate_empty_dentry_slot: invalid bid")
assertion(off % 16 == 0, "locate_empty_dentry_slot: invalid offset")
#assertion(block[off] == 0, "locate_empty_dentry_slot: slot not empty")
assertion(block.__getitem__(off) == 0, "locate_empty_dentry_slot: slot not empty")
#return block, bid, off
return Tuple3(block, bid, off)
def locate_empty_dentry_slot_err_ino(self, ino):
ioff, off = self._dirlook.locate_empty_slot_ino(ino)
assertion(ULT(ioff, 522), "locate_empty_dentry_slot_err_ino: invalid ioff")
assertion(ioff != 10, "locate_empty_dentry_slot_err_ino: invalid ioff")
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
#return block, bid, off, And(bid != 0, off % 16 == 0, block[off] == 0)
#return block, bid, off, And(bid != 0, off % 16 == 0, block.__getitem__(off) == 0)
return Tuple4(block, bid, off, And(bid != 0, off % 16 == 0, block.__getitem__(off) == 0))
def write_dentry(self, block, off, ino, name):
#block[off] = ino
block.__setitem__(off, ino)
for i in range(15):
#block[off + i + 1] = name[i]
block.__setitem__(off + i + 1, name.__getitem__(i))
def clear_dentry(self, block, off):
for i in range(16):
#block[off + i] = 0
block.__setitem__(off + i, 0)
def ialloc(self):
# black box allocator returns a vbn
ino = self._iallocator.alloc()
# Validation
assertion(ino != 0, "ialloc: inode is 0")
assertion(self.is_ifree(ino), "ialloc: ino is not free")
self._ibitmap.set_bit(ino)
return ino
def is_ifree(self, ino):
return Not(self._ibitmap.is_set(ino))
def is_valid(self, ino):
return And(ino != 0, self._ibitmap.is_set(ino), UGT(self.get_iattr(ino).nlink, 0))
def is_gcable(self, ino):
return And(ino != 0, self._ibitmap.is_set(ino), self.get_iattr(ino).nlink == 0)
def is_dir(self, ino):
attr = self._inode.get_iattr(ino)
return And(self.is_valid(ino),
attr.mode & S_IFDIR != 0)
def is_regular(self, ino):
attr = self._inode.get_iattr(ino)
return And(self.is_valid(ino),
attr.mode & S_IFDIR == 0)
###
def get_iattr(self, ino):
return self._inode.get_iattr(ino)
def set_iattr(self, ino, attr):
self._inode.begin_tx()
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def read(self, ino, blocknum):
attr = self.get_iattr(ino)
bsize = attr.bsize
is_mapped = self._inode.is_mapped(Concat32(ino, blocknum))
lbn = self._inode.mappingi(Concat32(ino, blocknum))
res = self._inode.read(lbn)
res = If(And(is_mapped, ULT(blocknum, bsize)), res, ConstBlock(0))
return res
def truncate(self, ino, fsize):
target_bsize = fsize / 4096 + (fsize % 4096 != 0)
# Update the size
attr = self._inode.get_iattr(ino)
while attr.bsize > target_bsize:
self._inode.begin_tx()
self._inode.bunmap(Concat32(ino, attr.bsize - 1))
attr.size = Concat32(attr.bsize - 1, fsize)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
if attr.fsize > fsize:
self._inode.begin_tx()
attr.size = Concat32(attr.bsize, fsize)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def write(self, ino, blocknum, v, size=BitVecVal(4096, 32)):
# Implementation supports only a small number of blocknums.
assertion(ULT(blocknum, 522), "write: blocknum too large")
assertion(ULT(BitVecVal(0, 32), size), "write: size is 0")
assertion(ULE(size, BitVecVal(4096, 32)), "write: size too large")
assertion(self.is_regular(ino), "write: writing to a non-regular inode")
self._inode.begin_tx()
bid = self._inode.bmap(Concat32(ino, blocknum))
self._inode.write(bid, v)
attr = self._inode.get_iattr(ino)
nsize = Concat32(blocknum + 1, blocknum * 4096 + size)
update = ULE(attr.fsize, blocknum * 4096 + size)
attr.size = If(update, nsize, attr.size)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
return size
def lookup(self, parent, name):
assertion(self.is_dir(parent), "lookup: parent is not dir")
self._inode.begin_tx()
#parent_block, _, off, valid = self.locate_dentry_ino(parent, name)
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
off = tp.get_off()
valid = tp.get_valid()
self._inode.commit_tx()
#return If(valid, Extract(31, 0, parent_block[off]), 0)
return If(valid, Extract(31, 0, parent_block.__getitem__(off)), 0)
def mknod(self, parent, name, mode, mtime):
assertion(self.is_dir(parent), "mknod: parent is not a directory")
#assertion(name[0] != 0, "mknod: name is null")
assertion(name.__getitem__(0) != 0, "mknod: name is null")
self._inode.begin_tx()
parent_block, parent_bid, off, valid = self.locate_empty_dentry_slot_err_ino(parent)
if Not(valid):
self._inode.commit_tx()
#return 0, errno.ENOSPC
return Tuple2(0, errno.ENOSPC)
ino = self.ialloc()
attr = Stat(size=0, mtime=mtime, mode=mode, nlink=2)
self._inode.set_iattr(ino, attr)
attr = self._inode.get_iattr(parent)
assertion(ULE(attr.bsize, 522), "mknod: bsize is larger than 522")
attr.size = Concat32(BitVecVal(522, 32), BitVecVal(4096 * 522, 32))
assertion(ULT(attr.nlink, attr.nlink + 1), "mknod: nlink overflow")
attr.nlink += 1
self._inode.set_iattr(parent, attr)
self.write_dentry(parent_block, off, ino, name)
#parent_block[off] = ino
parent_block.__setitem__(off, ino)
self._inode.write(parent_bid, parent_block)
self._inode.commit_tx()
#return ino, 0
return Tuple2(ino, 0)
def unlink(self, parent, name):
assertion(self.is_dir(parent), "unlink: not a dir")
#assertion(name[0] != 0, "unlink: name is null")
assertion(name.__getitem__(0) != 0, "unlink: name is null")
self._inode.begin_tx()
parent_block, parent_bid, off, valid = self.locate_dentry_ino(parent, name)
assertion(valid, "unlink: not valid")
attr = self._inode.get_iattr(parent)
assertion(UGE(attr.nlink, 2), "unlink: nlink is not greater than 1: " + str(attr.nlink))
attr.nlink -= 1
self._inode.set_iattr(parent, attr)
#ino = Extract(31, 0, parent_block[off])
ino = Extract(31, 0, parent_block.__getitem__(off))
attr = self._inode.get_iattr(ino)
attr.nlink = 1
self._inode.set_iattr(ino, attr)
self.clear_dentry(parent_block, off)
self._inode.write(parent_bid, parent_block)
# append the inode to the orphan list
self._orphans.append(Extend(ino, 64))
self._inode.commit_tx()
return ino
def rmdir(self, parent, name):
assertion(self.is_dir(parent), "rmdir: parent is not a directory")
#assertion(name[0] != 0, "rmdir: name is null")
assertion(name.__getitem__(0) != 0, "rmdir: name is null")
self._inode.begin_tx()
parent_block, parent_bid, off, valid = self.locate_dentry_ino(parent, name)
if Not(valid):
self._inode.commit_tx()
#return 0, errno.ENOENT
return Tuple2(0, errno.ENOENT)
assertion(valid, "rmdir: dentry off not valid")
#ino = Extract(31, 0, parent_block[off])
ino = Extract(31, 0, parent_block.__getitem__(off))
if Not(self.is_dir(ino)):
self._inode.commit_tx()
#return 0, errno.ENOTDIR
return Tuple2(0, errno.ENOTDIR)
assertion(self.is_dir(ino), "rmdir: ino is not dir")
attr = self._inode.get_iattr(ino)
if UGT(attr.nlink, 2):
self._inode.commit_tx()
#return BitVecVal(0, 32), errno.ENOTEMPTY
return Tuple2(BitVecVal(0, 32), errno.ENOTEMPTY)
attr = self._inode.get_iattr(parent)
assertion(UGE(attr.nlink, 2), "rmdir: nlink is not greater than 1: " + str(attr.nlink))
attr.nlink -= 1
self._inode.set_iattr(parent, attr)
self.clear_dentry(parent_block, off)
self._inode.write(parent_bid, parent_block)
attr = self._inode.get_iattr(ino)
attr.nlink = 1
self._inode.set_iattr(ino, attr)
# append the inode to the orphan list
self._orphans.append(Extend(ino, 64))
self._inode.commit_tx()
#return ino, 0
return Tuple2(ino, 0)
def rename(self, oparent, oname, nparent, nname):
assertion(self.is_dir(oparent), "rename: oparent is not dir")
assertion(self.is_dir(nparent), "rename: nparent is not dir")
#assertion(oname[0] != 0, "rename: oname is null")
#assertion(nname[0] != 0, "rename: nname is null")
assertion(oname.__getitem__(0) != 0, "rename: oname is null")
assertion(nname.__getitem__(0) != 0, "rename: nname is null")
self._inode.begin_tx()
attr = self._inode.get_iattr(oparent)
assertion(UGE(attr.nlink, 2), "rename: nlink is not greater than 1: " + str(attr.nlink))
attr.nlink -= 1
self._inode.set_iattr(oparent, attr)
attr = self._inode.get_iattr(nparent)
assertion(ULE(attr.bsize, 522), "rename: bsize is larger than 522")
attr.size = Concat32(BitVecVal(522, 32), BitVecVal(4096 * 522, 32))
assertion(ULT(attr.nlink, attr.nlink + 1), "rename: nlink overflow")
attr.nlink += 1
self._inode.set_iattr(nparent, attr)
# Find target and wipe from parent block
oparent_block, oparent_bid, ooff, ovalid = self.locate_dentry_ino(oparent, oname)
assertion(ovalid, "rename: ooff is not valid")
#ino = oparent_block[ooff]
ino = oparent_block.__getitem__(ooff)
self.clear_dentry(oparent_block, ooff)
self._inode.write(oparent_bid, oparent_block)
# Check if target exists
nparent_block, nparent_bid, noff, nvalid = self.locate_dentry_ino(nparent, nname)
if nvalid:
# append the dst inode to the orphan list
#self._orphans.append(nparent_block[noff])
self._orphans.append(nparent_block.__getitem__(noff))
self.clear_dentry(nparent_block, noff)
nparent_block, nparent_bid, noff = self.locate_empty_dentry_slot_ino(nparent)
self.write_dentry(nparent_block, noff, ino, nname)
self._inode.write(nparent_bid, nparent_block)
self._inode.commit_tx()
return 0
def forget(self, ino):
if Or(self.get_iattr(ino).mode & S_IFDIR != 0, self.get_iattr(ino).nlink != 1):
return
assertion(self.is_regular(ino), "forget: ino is not regular")
self._inode.begin_tx()
attr = self._inode.get_iattr(ino)
attr.nlink = 0
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def fsync(self):
self._txndisk.flush()
def gc1(self, orph_index, off):
ino = Extract(31, 0, self._orphans.index(orph_index))
if not self.is_gcable(ino):
return
# Wipe data
self._inode.begin_tx()
self._inode.bunmap(Concat32(ino, off))
nsize = off
attr = self._inode.get_iattr(ino)
if attr.bsize == nsize + 1:
attr.size = Concat32(nsize, nsize * 4096)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
# If the inode is in the orphan list, is gc-able *and*
# its size is 0 we can safely mark it as 'free'
def gc2(self, orph_index):
ino = Extract(31, 0, self._orphans.index(orph_index))
if not self.is_gcable(ino):
return
if self._inode.get_iattr(ino).size == 0:
self._inode.begin_tx()
self._orphans.clear(orph_index)
self._ibitmap.unset_bit(ino)
self._inode.commit_tx()
def gc3(self):
self._inode.begin_tx()
self._orphans.reset()
self._inode.commit_tx()
def crash(self, mach):
#return self.__class__(self._txndisk.crash(mach), self._inode.crash(mach), self._Allocator, self._Bitmap, self._DirLookup)
return self.__class__(self._txndisk.crash(mach), self._inode.crash(mach))
DirImpl.NBLOCKS = 522
DirImpl.IFREEDISK = 4
DirImpl.ORPHANS = 5
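For reference, the on-disk dentry layout implied by `write_dentry`/`clear_dentry` above: each entry occupies 16 consecutive block words, with word 0 holding the inode number and words 1–15 a fixed-width name. A concrete sketch over a plain Python list, mirroring those loops (illustration only — the real `block` objects are symbolic):

```python
DENTRY_WORDS = 16  # 1 inode word + 15 name words, matching the loops above

def write_dentry_demo(block, off, ino, name):
    # Word 0 of the slot is the inode number.
    block[off] = ino
    # Words 1..15 hold the fixed-width name.
    for i in range(15):
        block[off + i + 1] = name[i]

def clear_dentry_demo(block, off):
    # Zero the whole 16-word slot; word 0 == 0 marks the slot empty.
    for i in range(DENTRY_WORDS):
        block[off + i] = 0

block = [0] * 64
name = [ord(c) for c in "hello"] + [0] * 10  # padded to 15 words
write_dentry_demo(block, 16, 7, name)
assert block[16] == 7 and block[17] == ord("h")
clear_dentry_demo(block, 16)
assert block[16:32] == [0] * 16
```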
'''
# this class is auto-generated from cpp code, except crash func
class DirImpl:
NBLOCKS = None
IFREEDISK = None
ORPHANS = None
def __init__(self, txndisk, inode):
self._txndisk = txndisk
self._inode = inode
self._dirlook = DirLook(MyPIno(inode))
self._ifree = Disk(DirImpl.IFREEDISK, self._txndisk)
orphandisk = Disk(DirImpl.ORPHANS, self._txndisk)
self._iallocator = Allocator32(self._ifree, 0, 1024)
self._ibitmap = Bitmap(self._ifree)
self._orphans = Orphans(orphandisk)
def locate_dentry_ino(self, ino, name):
tuple = self._dirlook.locate_dentry_ino(ino, name)
ioff = tuple.__getitem__(0)
off = tuple.__getitem__(1)
assertion(ULT(ioff, 522))
assertion(ioff != 10)
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
valid = And(bid != 0, off % 16 == 0, Extract(31, 0, block.__getitem__(off)) != 0)
i = 0
while i < 15:
valid = And(valid, block.__getitem__(off + i + 1) == name.__getitem__(i))
i += 1
return Tuple4(block, bid, off, valid)
def locate_empty_dentry_slot_ino(self, ino):
tuple = self._dirlook.locate_empty_slot_ino(ino)
ioff = tuple.__getitem__(0)
off = tuple.__getitem__(1)
assertion(ULT(ioff, 522))
assertion(ioff != 10)
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
assertion(bid != 0)
assertion(off % 16 == 0)
assertion(block.__getitem__(off) == 0)
return Tuple3(block, bid, off)
def locate_empty_dentry_slot_err_ino(self, ino):
tuple = self._dirlook.locate_empty_slot_ino(ino)
ioff = tuple.__getitem__(0)
off = tuple.__getitem__(1)
assertion(ULT(ioff, 522))
assertion(ioff != 10)
bid = self._inode.bmap(Concat32(ino, ioff))
block = self._inode.read(bid)
return Tuple4(block, bid, off, And(bid != 0, off % 16 == 0, block.__getitem__(off) == 0))
def write_dentry(self, block, off, ino, name):
block.__setitem__(off, ino)
i = 0
while i < 15:
block.__setitem__(off + i + 1, name.__getitem__(i))
i += 1
def clear_dentry(self, block, off):
i = 0
while i < 16:
block.__setitem__(off + i, 0)
i += 1
def ialloc(self):
ino = self._iallocator.alloc()
assertion(ino != 0)
assertion(self.is_ifree(ino))
self._ibitmap.set_bit(ino)
return ino
def is_ifree(self, ino):
return Not(self._ibitmap.is_set(ino))
def is_valid(self, ino):
return And(ino != 0, self._ibitmap.is_set(ino), UGT(self.get_iattr(ino).nlink, 0))
def is_gcable(self, ino):
return And(ino != 0, self._ibitmap.is_set(ino), self.get_iattr(ino).nlink == 0)
def is_dir(self, ino):
attr = self._inode.get_iattr(ino)
return And(self.is_valid(ino), (attr.mode & S_IFDIR) != 0)
def is_regular(self, ino):
attr = self._inode.get_iattr(ino)
return And(self.is_valid(ino), (attr.mode & S_IFDIR) == 0)
def get_iattr(self, ino):
return self._inode.get_iattr(ino)
def set_iattr(self, ino, attr):
self._inode.begin_tx()
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def read(self, ino, blocknum):
attr = self.get_iattr(ino)
bsize = attr.bsize
is_mapped = self._inode.is_mapped(Concat32(ino, blocknum))
lbn = self._inode.mappingi(Concat32(ino, blocknum))
res = self._inode.read(lbn)
zeroblock = ConstBlock(0)
return If(And(is_mapped, ULT(blocknum, bsize)), res, zeroblock)
def truncate(self, ino, fsize):
target_bsize = fsize / 4096 + (fsize % 4096 != 0)
attr = self._inode.get_iattr(ino)
while attr.bsize > target_bsize:
self._inode.begin_tx()
self._inode.bunmap(Concat32(ino, attr.bsize - 1))
attr.size = Concat32(attr.bsize - 1, fsize)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
if attr.fsize > fsize:
self._inode.begin_tx()
attr.size = Concat32(attr.bsize, fsize)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def write(self, ino, blocknum, v, size=BitVecVal(4096, 32)):
assertion(ULT(blocknum, 522))
assertion(ULT(BitVecVal(0, 32), size))
assertion(ULE(size, BitVecVal(4096, 32)))
assertion(self.is_regular(ino))
self._inode.begin_tx()
bid = self._inode.bmap(Concat32(ino, blocknum))
self._inode.write(bid, v)
attr = self._inode.get_iattr(ino)
nsize = Concat32(blocknum + 1, blocknum * 4096 + size)
update = ULE(attr.fsize, blocknum * 4096 + size)
attr.size = If(update, nsize, attr.size)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
return size
def lookup(self, parent, name):
assertion(self.is_dir(parent))
self._inode.begin_tx()
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
off = tp.get_off()
valid = tp.get_valid()
self._inode.commit_tx()
return If(valid, Extract(31, 0, parent_block.__getitem__(off)), 0)
def mknod(self, parent, name, mode, mtime):
assertion(self.is_dir(parent))
assertion(name.__getitem__(0) != 0)
self._inode.begin_tx()
tp = self.locate_empty_dentry_slot_err_ino(parent)
parent_block = tp.get_block()
parent_bid = tp.get_bid()
off = tp.get_off()
valid = tp.get_valid()
if Not(valid):
self._inode.commit_tx()
return Tuple2(0, errno.ENOSPC)
ino = self.ialloc()
attr = Stat(0, mtime, mode, 2)
self._inode.set_iattr(ino, attr)
attr = self._inode.get_iattr(parent)
assertion(ULE(attr.bsize, 522))
attr.size = Concat32(BitVecVal(522, 32), BitVecVal(4096 * 522, 32))
assertion(ULT(attr.nlink, attr.nlink + 1))
attr.nlink += 1
self._inode.set_iattr(parent, attr)
self.write_dentry(parent_block, off, ino, name)
parent_block.__setitem__(off, ino)
self._inode.write(parent_bid, parent_block)
self._inode.commit_tx()
return Tuple2(ino, 0)
def unlink(self, parent, name):
assertion(self.is_dir(parent))
assertion(name.__getitem__(0) != 0)
self._inode.begin_tx()
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
parent_bid = tp.get_bid()
off = tp.get_off()
valid = tp.get_valid()
assertion(valid)
attr = self._inode.get_iattr(parent)
assertion(UGE(attr.nlink, 2))
attr.nlink -= 1
self._inode.set_iattr(parent, attr)
ino = Extract(31, 0, parent_block.__getitem__(off))
attr = self._inode.get_iattr(ino)
attr.nlink = 1
self._inode.set_iattr(ino, attr)
self.clear_dentry(parent_block, off)
self._inode.write(parent_bid, parent_block)
self._orphans.append(Extend(ino, 64))
self._inode.commit_tx()
return ino
def rmdir(self, parent, name):
assertion(self.is_dir(parent))
assertion(name.__getitem__(0) != 0)
self._inode.begin_tx()
tp = self.locate_dentry_ino(parent, name)
parent_block = tp.get_block()
parent_bid = tp.get_bid()
off = tp.get_off()
valid = tp.get_valid()
if Not(valid):
self._inode.commit_tx()
return Tuple2(0, errno.ENOENT)
assertion(valid)
ino = Extract(31, 0, parent_block.__getitem__(off))
if Not(self.is_dir(ino)):
self._inode.commit_tx()
return Tuple2(0, errno.ENOTDIR)
attr = self._inode.get_iattr(parent)
assertion(UGE(attr.nlink, 2))
attr.nlink -= 1
self._inode.set_iattr(parent, attr)
self.clear_dentry(parent_block, off)
self._inode.write(parent_bid, parent_block)
attr = self._inode.get_iattr(ino)
attr.nlink = 1
self._inode.set_iattr(ino, attr)
self._orphans.append(Extend(ino, 64))
self._inode.commit_tx()
return Tuple2(ino, 0)
def rename(self, oparent, oname, nparent, nname):
assertion(self.is_dir(oparent))
assertion(self.is_dir(nparent))
assertion(oname.__getitem__(0) != 0)
assertion(nname.__getitem__(0) != 0)
self._inode.begin_tx()
attr = self._inode.get_iattr(oparent)
assertion(UGE(attr.nlink, 2))
attr.nlink -= 1
self._inode.set_iattr(oparent, attr)
attr = self._inode.get_iattr(nparent)
assertion(ULE(attr.bsize, 522))
attr.size = Concat32(BitVecVal(522, 32), BitVecVal(4096 * 522, 32))
assertion(ULT(attr.nlink, attr.nlink + 1))
attr.nlink += 1
self._inode.set_iattr(nparent, attr)
tp = self.locate_dentry_ino(oparent, oname)
oparent_block = tp.get_block()
oparent_bid = tp.get_bid()
ooff = tp.get_off()
ovalid = tp.get_valid()
assertion(ovalid)
ino = oparent_block.__getitem__(ooff)
self.clear_dentry(oparent_block, ooff)
self._inode.write(oparent_bid, oparent_block)
tp = self.locate_dentry_ino(nparent, nname)
nparent_block = tp.get_block()
nparent_bid = tp.get_bid()
noff = tp.get_off()
nvalid = tp.get_valid()
if nvalid:
self._orphans.append(nparent_block.__getitem__(noff))
self.clear_dentry(nparent_block, noff)
tp3 = self.locate_empty_dentry_slot_ino(nparent)
nparent_block = tp3.get_block()
nparent_bid = tp3.get_bid()
noff = tp3.get_off()
self.write_dentry(nparent_block, noff, ino, nname)
self._inode.write(nparent_bid, nparent_block)
self._inode.commit_tx()
return 0
def forget(self, ino):
if Or((self.get_iattr(ino).mode & S_IFDIR) != 0, self.get_iattr(ino).nlink != 1):
return
assertion(self.is_regular(ino))
self._inode.begin_tx()
attr = self._inode.get_iattr(ino)
attr.nlink = 0
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def fsync(self):
self._txndisk.flush()
def gc1(self, orph_index, off):
ino = Extract(31, 0, self._orphans.index(orph_index))
if not self.is_gcable(ino):
return
self._inode.begin_tx()
self._inode.bunmap(Concat32(ino, off))
nsize = off
attr = self._inode.get_iattr(ino)
if attr.bsize == nsize + 1:
attr.size = Concat32(nsize, nsize * 4096)
self._inode.set_iattr(ino, attr)
self._inode.commit_tx()
def gc2(self, orph_index):
ino = Extract(31, 0, self._orphans.index(orph_index))
if not self.is_gcable(ino):
return
if self._inode.get_iattr(ino).size == 0:
self._inode.begin_tx()
self._orphans.clear(orph_index)
self._ibitmap.unset_bit(ino)
self._inode.commit_tx()
def gc3(self):
self._inode.begin_tx()
self._orphans.reset()
self._inode.commit_tx()
def crash(self, mach):
#return self.__class__(self._txndisk.crash(mach), self._inode.crash(mach), self._Allocator, self._Bitmap, self._DirLookup)
return self.__class__(self._txndisk.crash(mach), self._inode.crash(mach))
DirImpl.NBLOCKS = 522
DirImpl.IFREEDISK = 4
DirImpl.ORPHANS = 5
'''
# File: jwtornadodemo/config/__init__.py (repo: jaggerwang/jw-pyserver, license: MIT)
from .env import *
from .logging import *
from .tornado import *
from .session import *
from .db import *
from .cache import *
# File: tests/test_hours.py (repo: c0yote/toolbag, license: MIT)
import unittest
from unittest.mock import MagicMock, mock_open, patch
class HoursTestCase(unittest.TestCase):
#def test_time_between_24hr_clock()
pass
# File: xman/events.py (repo: linychuo/SQLiteClient, license: MIT)
import wx.lib.newevent
ForwardMainEvent, EVT_FORWARD_MAIN_EVENT = wx.lib.newevent.NewEvent()
CreateTabEvent, EVT_CREATE_MAIN_EVENT = wx.lib.newevent.NewEvent()
| 32.2 | 69 | 0.832298 | 22 | 161 | 5.818182 | 0.5 | 0.117188 | 0.304688 | 0.21875 | 0.46875 | 0.46875 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068323 | 161 | 4 | 70 | 40.25 | 0.853333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
bddda2c09da7cc3ddae89d81def13f2eb9fd291d | 42,403 | py | Python | svi_experiments/rgp_experiments.py | zhenwendai/RGP | be679607d3457a1038a2fe39b36b816ea380ea39 | [
"BSD-3-Clause"
] | 17 | 2016-10-24T01:31:30.000Z | 2021-07-31T08:12:02.000Z | svi_experiments/rgp_experiments.py | zhenwendai/RGP | be679607d3457a1038a2fe39b36b816ea380ea39 | [
"BSD-3-Clause"
] | null | null | null | svi_experiments/rgp_experiments.py | zhenwendai/RGP | be679607d3457a1038a2fe39b36b816ea380ea39 | [
"BSD-3-Clause"
] | 11 | 2017-07-11T09:11:48.000Z | 2022-01-25T12:10:48.000Z | #!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""
Created on Mon Aug 21 11:51:21 2017
@author: grigoral
"""
from __future__ import print_function
import autoreg
import GPy
import numpy as np
from matplotlib import pyplot as plt
import scipy.io as io
from autoreg.benchmark import tasks
# Define class for normalization
class Normalize(object):
def __init__(self, data, name, norm_name):
self.data_mean = data.mean(axis=0)
self.data_std = data.std(axis=0)
self.normalization_computed = True
setattr(self, name, data)
setattr(self, norm_name, (data-self.data_mean) / self.data_std )
def normalize(self, data, name, norm_name):
if hasattr(self,norm_name):
raise ValueError("This normalization name already exist, choose another one")
setattr(self, name, data )
setattr(self, norm_name, (data-self.data_mean) / self.data_std )
def denormalize(self, data):
return data*self.data_std + self.data_mean
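# Illustration only (not part of the experiments): a minimal self-contained
# sketch of the z-score normalization that the Normalize class above
# implements. The statistics are computed once on the training data and
# reused for any later (test) data, and denormalization inverts the transform.
import numpy as np
_train = np.array([[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]])
_mean, _std = _train.mean(axis=0), _train.std(axis=0)
_train_norm = (_train - _mean) / _std
_test_norm = (np.array([[2.0, 20.0]]) - _mean) / _std  # same train statistics
assert np.allclose(_train_norm * _std + _mean, _train)  # denormalize round trip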
def prepare_data(task_name, normalize=False):
task = getattr( tasks, task_name)
task = task()
task.load_data()
print("Data OUT train shape: ", task.data_out_train.shape)
print("Data IN train shape: ", task.data_in_train.shape)
print("Data OUT test shape: ", task.data_out_test.shape)
print("Data IN test shape: ", task.data_in_test.shape)
normalize = True
in_data = Normalize(task.data_in_train,'in_train','in_train_norm' )
out_data = Normalize(task.data_out_train,'out_train','out_train_norm' )
in_data.normalize(task.data_in_test, 'in_test','in_test_norm')
out_data.normalize(task.data_out_test, 'out_test','out_test_norm')
if normalize:
out_train = out_data.out_train_norm #out_data.out_train
in_train = in_data.in_train_norm # in_data.in_train
out_test = out_data.out_test_norm #out_data.out_test
in_test = in_data.in_test_norm #in_data.in_test
else:
out_train = out_data.out_train #out_data.out_train
in_train = in_data.in_train # in_data.in_train
out_test = out_data.out_test #out_data.out_test
in_test = in_data.in_test #in_data.in_test
return out_train, in_train, task
class IE5_experiment_1( object ):
"""
Tested parameters are: initial number of optimization runs, number of hidden dims, number of inducing points.
After the first experiment the conclusion is that 1 hidden dim is the best, but also
the optimization is not very explorative.
Probably there was an error in the experiment setup because I did not change the number of hidden layers,
only the number of hidden dimensions in 1 layer.
Best values: ini_runs = 160.0, hidden dim=1., Q=50. (237.44060068)
Iteration 21
"""
def __init__( self, initial_counter, iter_nums):
self.counter = initial_counter
self.iter_nums = iter_nums
def __call__( self, *args, **kwargs ):
#import pdb; pdb.set_trace()
new_args = (self.counter,self.iter_nums,) + tuple( [ int(args[0][0,i]) for i in range(args[0].shape[1]) ] )
ret = self.IE5_experiment_1_code( *new_args, **kwargs)
self.counter += 1
return ret
@staticmethod
def IE5_experiment_1_code( bo_iter_no, p_iter_num, p_init_runs, p_hidden_dims, p_Q):
"""
Hyper parameter search for IE5 data, varying small number of parameters.
One hidden layer.
"""
# p_iter_num: how many iterations are needed to evaluate one set of hyperparameters
# task names:
# Actuator, Ballbeam, Drive, Gas_furnace, Flutter, Dryer, Tank,
# IdentificationExample1..5
#import pdb; pdb.set_trace()
out_train, in_train, task = prepare_data('IdentificationExample5', normalize=True)
p_task_name = 'IE5_1'
#p_iteration =
train_U = in_train.copy()
train_Y = out_train.copy()
p_init_runs = p_init_runs
p_max_runs = 10000
p_num_layers = 1
p_hidden_dims = [p_hidden_dims,]
p_inference_method = None
p_back_cstr = False
p_MLP_Dims = None
p_Q = p_Q
p_win_in = task.win_in
p_win_out = task.win_out
p_init = 'Y'
p_x_init_var = 0.05
result = list()
for i_no in range(0, p_iter_num): # iterations take into account model randomness e.g. initialization of inducing points
res = rgp_experiment_raw(p_task_name, bo_iter_no*10+i_no, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_MLP_Dims, p_Q, p_win_in, p_win_out, p_init, p_x_init_var)
result.append(res[0])
#import pdb; pdb.set_trace()
return np.array(((np.min(result),),))
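# Illustration only: the __call__ wrappers above all unpack the (1, d) array
# that GPyOpt passes to the objective, casting each suggested hyperparameter
# to int and prepending the wrapper's own bookkeeping arguments (counter and
# iteration count). A self-contained sketch of that unpacking:
import numpy as np
def _unpack_bo_point(bo_point, counter=0, iter_nums=4):
    # bo_point has shape (1, d); bo_point[0, i] picks out the i-th parameter
    return (counter, iter_nums) + tuple(int(bo_point[0, i])
                                        for i in range(bo_point.shape[1]))
assert _unpack_bo_point(np.array([[160.0, 1.0, 50.0]])) == (0, 4, 160, 1, 50)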
class IE5_experiment_2( object ):
"""
Tested parameters are: initial number of optimization runs, number of inducing points.
Conclusions after the experiment: The output file contains only variables as var_1, var_2 etc.
but Xavier said that the order is preserved.
The optimal values are: init_runs = 110, Q (ind. num) = 200. (run 3), 240.44817869
But the results are still the same from run to run.
Total running time was 40 hours on a GPU machine.
Maybe we can reduce the number of intrinsic iterations per evaluation.
Now the idea is to use manually designed initial values to run a more proper experiment. (Experiment 3)
"""
def __init__( self, initial_counter, iter_nums):
self.counter = initial_counter
self.iter_nums = iter_nums
def __call__( self, *args, **kwargs ):
#import pdb; pdb.set_trace()
new_args = (self.counter,self.iter_nums,) + tuple( [ int(args[0][0,i]) for i in range(args[0].shape[1]) ] )
ret = self.IE5_experiment_2_code( *new_args, **kwargs)
self.counter += 1
return ret
@staticmethod
def IE5_experiment_2_code( bo_iter_no, p_iter_num, p_init_runs, p_Q):
"""
Hyper parameter search for IE5 data, varying small number of parameters.
One hidden layer.
"""
# p_iter_num: how many iterations are needed to evaluate one set of hyperparameters
# task names:
# Actuator, Ballbeam, Drive, Gas_furnace, Flutter, Dryer, Tank,
# IdentificationExample1..5
#import pdb; pdb.set_trace()
out_train, in_train, task = prepare_data('IdentificationExample5', normalize=True)
p_task_name = 'IE5_2'
#p_iteration =
train_U = in_train.copy()
train_Y = out_train.copy()
p_init_runs = p_init_runs
p_max_runs = 10000
p_num_layers = 1
p_hidden_dims = [1,]
p_inference_method = None
p_back_cstr = False
p_MLP_Dims = None
p_Q = p_Q
p_win_in = task.win_in
p_win_out = task.win_out
p_init = 'Y'
p_x_init_var = 0.05
result = list()
for i_no in range(0, p_iter_num): # iterations take into account model randomness e.g. initialization of inducing points
res = rgp_experiment_raw(p_task_name, bo_iter_no*10+i_no, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_MLP_Dims, p_Q, p_win_in, p_win_out, p_init, p_x_init_var)
result.append(res[0])
#import pdb; pdb.set_trace()
return np.array(((np.min(result),),))
class IE5_experiment_3( object ):
"""
Tested parameters are: number of layers, number of inducing points.
Also, I do initial evaluations manually.
Best values: 1 layer, 40 inducing points, (run 7) 242.67636311
This value is larger than in the other experiments, maybe because there were only 2 internal runs
in every function evaluation.
"""
def __init__( self, initial_counter, iter_nums):
self.counter = initial_counter
self.iter_nums = iter_nums
def __call__( self, *args, **kwargs ):
#import pdb; pdb.set_trace()
new_args = (self.counter,self.iter_nums,) + tuple( [ int(args[0][0,i]) for i in range(args[0].shape[1]) ] )
ret = self.IE5_experiment_3_code( *new_args, **kwargs)
self.counter += 1
return ret
@staticmethod
def IE5_experiment_3_code( bo_iter_no, p_iter_num, p_layer_num, p_Q):
"""
Hyper parameter search for IE5 data, varying small number of parameters.
One hidden layer.
"""
# p_iter_num: how many iterations are needed to evaluate one set of hyperparameters
# task names:
# Actuator, Ballbeam, Drive, Gas_furnace, Flutter, Dryer, Tank,
# IdentificationExample1..5
#import pdb; pdb.set_trace()
out_train, in_train, task = prepare_data('IdentificationExample5', normalize=True)
p_task_name = 'IE5_2'
#p_iteration =
train_U = in_train.copy()
train_Y = out_train.copy()
p_init_runs = 130
p_max_runs = 10000
p_num_layers = p_layer_num
p_hidden_dims = [1,1,]
p_inference_method = None
p_back_cstr = False
p_MLP_Dims = None
p_Q = p_Q
p_win_in = task.win_in
p_win_out = task.win_out
p_init = 'Y'
p_x_init_var = 0.05
result = list()
for i_no in range(0, p_iter_num): # iterations take into account model randomness e.g. initialization of inducing points
res = rgp_experiment_raw(p_task_name, bo_iter_no*10+i_no, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_MLP_Dims, p_Q, p_win_in, p_win_out, p_init, p_x_init_var)
result.append(res[0])
#import pdb; pdb.set_trace()
return np.array(((np.min(result),),))
class IE5_experiment_4( object ):
"""
SVI inference
Tested parameters are: number of initial runs, number of inducing points.
First SVI experiment.
"""
def __init__( self, initial_counter, iter_nums):
self.counter = initial_counter
self.iter_nums = iter_nums
def __call__( self, *args, **kwargs ):
#import pdb; pdb.set_trace()
new_args = (self.counter,self.iter_nums,) + tuple( [ int(args[0][0,i]) for i in range(args[0].shape[1]) ] )
ret = self.IE5_experiment_4_code( *new_args, **kwargs)
self.counter += 1
return ret
@staticmethod
def IE5_experiment_4_code( bo_iter_no, p_iter_num, p_init_runs, p_Q):
"""
Hyper parameter search for IE5 data, varying small number of parameters.
One hidden layer.
"""
# p_iter_num: how many iterations are needed to evaluate one set of hyperparameters
# task names:
# Actuator, Ballbeam, Drive, Gas_furnace, Flutter, Dryer, Tank,
# IdentificationExample1..5
#import pdb; pdb.set_trace()
out_train, in_train, task = prepare_data('IdentificationExample5', normalize=True)
p_task_name = 'IE5_4'
#p_iteration =
train_U = in_train.copy()
train_Y = out_train.copy()
p_init_runs = p_init_runs
p_max_runs = 12000
p_num_layers = 1
p_hidden_dims = [1,1,]
p_inference_method = 'svi'
p_back_cstr = False
p_MLP_Dims = None
p_Q = p_Q
p_win_in = task.win_in
p_win_out = task.win_out
p_init = 'Y'
p_x_init_var = 0.05
result = list()
for i_no in range(0, p_iter_num): # iterations take into account model randomness e.g. initialization of inducing points
res = rgp_experiment_raw(p_task_name, bo_iter_no*10+i_no, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_MLP_Dims, p_Q, p_win_in, p_win_out, p_init, p_x_init_var)
result.append(res[0])
#import pdb; pdb.set_trace()
return np.array(((np.min(result),),))
class IE5_experiment_5( object ):
"""
Back constraints + SVI inference
Tested parameters are: number of initial runs, number of inducing points.
"""
def __init__( self, initial_counter, iter_nums):
self.counter = initial_counter
self.iter_nums = iter_nums
def __call__( self, *args, **kwargs ):
#import pdb; pdb.set_trace()
new_args = (self.counter,self.iter_nums,) + tuple( [ int(args[0][0,i]) for i in range(args[0].shape[1]) ] )
ret = self.IE5_experiment_5_code( *new_args, **kwargs)
self.counter += 1
return ret
@staticmethod
def IE5_experiment_5_code( bo_iter_no, p_iter_num, p_init_runs, p_Q):
"""
Hyper parameter search for IE5 data, varying small number of parameters.
One hidden layer.
"""
# p_iter_num: how many iterations are needed to evaluate one set of hyperparameters
# task names:
# Actuator, Ballbeam, Drive, Gas_furnace, Flutter, Dryer, Tank,
# IdentificationExample1..5
#import pdb; pdb.set_trace()
out_train, in_train, task = prepare_data('IdentificationExample5', normalize=True)
p_task_name = 'IE5_5'
#p_iteration =
train_U = in_train.copy()
train_Y = out_train.copy()
p_init_runs = p_init_runs
p_max_runs = 12000
p_num_layers = 1
p_hidden_dims = [1,1,]
p_inference_method = 'svi'
p_back_cstr = True
p_rnn_type='gru'
p_rnn_hidden_dims=[20,]
p_Q = p_Q
p_win_in = task.win_in
p_win_out = task.win_out
result = list()
for i_no in range(0, p_iter_num): # iterations take into account model randomness e.g. initialization of inducing points
res = rgp_experiment_bcstr_raw(p_task_name, bo_iter_no*10+i_no, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_rnn_type, p_rnn_hidden_dims, p_Q, p_win_in, p_win_out)
result.append(res[0])
#import pdb; pdb.set_trace()
return np.array(((np.min(result),),))
class IE5_experiment_6( object ):
"""
Same as experiment 5, only the model has changed: now it includes the RGP inputs at
the encoder inputs.
Back constraints + SVI inference
Tested parameters are: number of initial runs, number of inducing points.
"""
def __init__( self, initial_counter, iter_nums):
self.counter = initial_counter
self.iter_nums = iter_nums
def __call__( self, *args, **kwargs ):
#import pdb; pdb.set_trace()
new_args = (self.counter,self.iter_nums,) + tuple( [ int(args[0][0,i]) for i in range(args[0].shape[1]) ] )
ret = self.IE5_experiment_6_code( *new_args, **kwargs)
self.counter += 1
return ret
@staticmethod
def IE5_experiment_6_code( bo_iter_no, p_iter_num, p_init_runs, p_Q):
"""
Hyper parameter search for IE5 data, varying small number of parameters.
One hidden layer.
"""
# p_iter_num: how many iterations are needed to evaluate one set of hyperparameters
# task names:
# Actuator, Ballbeam, Drive, Gas_furnace, Flutter, Dryer, Tank,
# IdentificationExample1..5
#import pdb; pdb.set_trace()
out_train, in_train, task = prepare_data('IdentificationExample5', normalize=True)
p_task_name = 'IE5_6'
#p_iteration =
train_U = in_train.copy()
train_Y = out_train.copy()
p_init_runs = p_init_runs
p_max_runs = 15000
p_num_layers = 1
p_hidden_dims = [1,1,]
p_inference_method = 'svi'
p_back_cstr = True
p_rnn_type='gru'
p_rnn_hidden_dims=[20,]
p_Q = p_Q
p_win_in = task.win_in
p_win_out = task.win_out
result = list()
for i_no in range(0, p_iter_num): # iterations take into account model randomness e.g. initialization of inducing points
res = rgp_experiment_bcstr_raw(p_task_name, bo_iter_no*10+i_no, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_rnn_type, p_rnn_hidden_dims, p_Q, p_win_in, p_win_out)
result.append(res[0])
#import pdb; pdb.set_trace()
return np.array(((np.min(result),),))
def rgp_experiment_bcstr_raw(p_task_name, p_iteration, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_rnn_type, p_rnn_hidden_dims, p_Q, p_win_in, p_win_out):
"""
Experiment file for NON MINIBATCH inference.
So, DeepAutoreg is run here.
Inputs:
-------------------------------
p_task_name: string
Experiment name, used only in file name
p_iteration: int or string
Iteration of the experiment, used only in file name
p_init_runs: int
Number of initial runs when likelihood variances and covariance magnitudes are fixed
p_max_runs: int
Maximum runs of general optimization
p_num_layers: int [1,2]
Number of RGP layers
p_hidden_dims: list[ length is the number of hidden layers]
Dimensions of hidden layers
p_inference_method: string
If 'svi' then SVI inference is used.
p_back_cstr: bool
Use back constraints or not.
p_rnn_hidden_dims: int
Hidden dimension of neural network.
p_Q: int
Number of inducing points
p_win_in, p_win_out: int
Input window and hidden layer window.
"""
win_in = p_win_in # 20
win_out = p_win_out # 20
inference_method = p_inference_method if p_inference_method == 'svi' else None
#import pdb; pdb.set_trace()
if p_num_layers == 1:
# 1 layer:
wins = [0, win_out] # 0-th is output layer
nDims = [train_Y.shape[1], p_hidden_dims[0]]
kernels=[GPy.kern.RBF(win_out,ARD=True,inv_l=True),
GPy.kern.RBF(win_in + win_out,ARD=True,inv_l=True)]
elif p_num_layers == 2:
# 2 layers:
wins = [0, win_out, win_out]
nDims = [train_Y.shape[1], p_hidden_dims[0], p_hidden_dims[1]]
kernels=[GPy.kern.RBF(win_out,ARD=True,inv_l=True),
GPy.kern.RBF(win_out+win_out,ARD=True,inv_l=True),
GPy.kern.RBF(win_out+win_in,ARD=True,inv_l=True)]
else:
raise NotImplementedError()
print("Input window: ", win_in)
print("Output window: ", win_out)
#p_Q = 120  # leftover debug override (TODO); keep it commented out so the p_Q argument is actually used
m = autoreg.DeepAutoreg_rnn(wins, train_Y, U=train_U, U_win=win_in,
num_inducing=p_Q, back_cstr=p_back_cstr, nDims=nDims,
rnn_type=p_rnn_type,
rnn_hidden_dims=p_rnn_hidden_dims,
minibatch_inference=False,
inference_method=inference_method, # Inference method
kernels=kernels)
# pattern for model name: #task_name, inf_meth=?, wins=layers, Q = ?, backcstr=?,MLP_dims=?, nDims=
model_file_name = '%s_%s--inf_meth=%s--backcstr=%s--wins=%s_%s--Q=%i--nDims=%s' % (p_task_name, str(p_iteration),
'reg' if inference_method is None else inference_method,
str(p_back_cstr) if p_back_cstr==False else str(p_back_cstr) + '_' + p_rnn_type + str(p_rnn_hidden_dims[0]),
str(win_in), str(wins), p_Q, str(nDims))
print('Model file name: ', model_file_name)
print(m)
#import pdb; pdb.set_trace()
#Initialization
# Here layer numbers are different than in initialization. 0-th layer is the top one
for i in range(m.nLayers):
m.layers[i].kern.inv_l[:] = np.mean( 1./((m.layers[i].X.mean.values.max(0)-m.layers[i].X.mean.values.min(0))/np.sqrt(2.)) )
m.layers[i].likelihood.variance[:] = 0.01*train_Y.var()
m.layers[i].kern.variance.fix(warning=False)
m.layers[i].likelihood.fix(warning=False)
print(m)
#init_runs = 50 if out_train.shape[0]<1000 else 100
print("Init runs: ", p_init_runs)
m.optimize('bfgs',messages=1,max_iters=p_init_runs)
for i in range(m.nLayers):
m.layers[i].kern.variance.constrain_positive(warning=False)
m.layers[i].likelihood.constrain_positive(warning=False)
m.optimize('bfgs',messages=1,max_iters=p_max_runs)
io.savemat(model_file_name, {'params': m.param_array[:]} )
print(m)
return -float(m._log_marginal_likelihood), m
def rgp_experiment_raw(p_task_name, p_iteration, train_U, train_Y, p_init_runs, p_max_runs, p_num_layers, p_hidden_dims,
p_inference_method, p_back_cstr, p_MLP_Dims, p_Q, p_win_in, p_win_out, p_init, p_x_init_var):
"""
Experiment file for NON MINIBATCH inference.
So, DeepAutoreg is run here.
Inputs:
-------------------------------
p_task_name: string
Experiment name, used only in file name
p_iteration: int or string
Iteration of the experiment, used only in file name
p_init_runs: int
Number of initial runs when likelihood variances and covariance magnitudes are fixed
p_max_runs: int
Maximum runs of general optimization
p_num_layers: int [1,2]
Number of RGP layers
p_hidden_dims: list[ length is the number of hidden layers]
Dimensions of hidden layers
p_inference_method: string
If 'svi' then SVI inference is used.
p_back_cstr: bool
Use back constraints or not.
p_MLP_Dims: list[length is the number of MLP hidden layers, ignoring input and output layers]
Values are the number of neurons at each layer.
p_Q: int
Number of inducing points
p_win_in, p_win_out: int
Input window and hidden layer window.
p_init: string 'Y', 'rand', 'zero'
Initialization of RGP hidden layers
p_x_init_var: float
Initial variance for X, usually 0.05 for data close to normalized data.
"""
win_in = p_win_in # 20
win_out = p_win_out # 20
inference_method = p_inference_method if p_inference_method == 'svi' else None
#import pdb; pdb.set_trace()
if p_num_layers == 1:
# 1 layer:
wins = [0, win_out] # 0-th is output layer
nDims = [train_Y.shape[1], p_hidden_dims[0]]
kernels=[GPy.kern.RBF(win_out,ARD=True,inv_l=True),
GPy.kern.RBF(win_in + win_out,ARD=True,inv_l=True)]
elif p_num_layers == 2:
# 2 layers:
wins = [0, win_out, win_out]
nDims = [train_Y.shape[1], p_hidden_dims[0], p_hidden_dims[1]]
kernels=[GPy.kern.RBF(win_out,ARD=True,inv_l=True),
GPy.kern.RBF(win_out+win_out,ARD=True,inv_l=True),
GPy.kern.RBF(win_out+win_in,ARD=True,inv_l=True)]
else:
raise NotImplementedError()
print("Input window: ", win_in)
print("Output window: ", win_out)
m = autoreg.DeepAutoreg_new(wins, train_Y, U=train_U, U_win=win_in,
num_inducing=p_Q, back_cstr=p_back_cstr, MLP_dims=p_MLP_Dims, nDims=nDims,
init=p_init, # how to initialize hidden states means
X_variance=p_x_init_var, #0.05, # how to initialize hidden states variances
inference_method=inference_method, # Inference method
kernels=kernels)
# pattern for model name: #task_name, inf_meth=?, wins=layers, Q = ?, backcstr=?,MLP_dims=?, nDims=
model_file_name = '%s_%s--inf_meth=%s--backcstr=%s--wins=%s_%s--Q=%i--nDims=%s--init=%s--x_init=%s' % (p_task_name, str(p_iteration),
'reg' if inference_method is None else inference_method,
str(p_back_cstr) if p_back_cstr==False else str(p_back_cstr) + '_' + str(p_MLP_Dims),
str(win_in), str(wins), p_Q, str(nDims), p_init, str(p_x_init_var))
print('Model file name: ', model_file_name)
print(m)
#import pdb; pdb.set_trace()
#Initialization
# Here layer numbers are different than in initialization. 0-th layer is the top one
for i in range(m.nLayers):
m.layers[i].kern.inv_l[:] = np.mean( 1./((m.layers[i].X.mean.values.max(0)-m.layers[i].X.mean.values.min(0))/np.sqrt(2.)) )
m.layers[i].likelihood.variance[:] = 0.01*train_Y.var()
m.layers[i].kern.variance.fix(warning=False)
m.layers[i].likelihood.fix(warning=False)
print(m)
#init_runs = 50 if out_train.shape[0]<1000 else 100
print("Init runs: ", p_init_runs)
m.optimize('bfgs',messages=1,max_iters=p_init_runs)
for i in range(m.nLayers):
m.layers[i].kern.variance.constrain_positive(warning=False)
m.layers[i].likelihood.constrain_positive(warning=False)
m.optimize('bfgs',messages=1,max_iters=p_max_runs)
io.savemat(model_file_name, {'params': m.param_array[:]} )
print(m)
return -float(m._log_marginal_likelihood), m
def bo_run_1():
"""
Run the Bayesian optimization experiment 1.
Tested parameters are: initial number of optimization runs, number of hidden dims, number of inducing points.
After the first experiment the conclusion is that 1 hidden dim is the best, but also
the optimization is not very explorative.
Probably there was an error in the experiment setup because I did not change the number of hidden layers,
only the number of hidden dimensions in 1 layer.
Best values: ini_runs = 160.0, hidden dim=1., Q=50. (237.44060068)
Iteration 21
"""
import GPyOpt
import pickle
exper = IE5_experiment_1(0,4)
domain =[{'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(10,201,10) },
{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1,2,3,4)},
{'name': 'Q', 'type': 'discrete', 'domain': np.arange(10,201,10) } ]
Bopt = GPyOpt.methods.BayesianOptimization(f=exper, # function to optimize
domain=domain, # box-constraints of the problem
model_type = 'GP_MCMC',
initial_design_numdata = 2,# number data initial design
acquisition_type='EI_MCMC', # Expected Improvement
exact_feval = False)
#import pdb; pdb.set_trace()
# --- Stop conditions
max_time = None
max_iter = 20
tolerance = 1e-4 # distance between two consecutive observations
# Run the optimization
report_file = 'report_IE5_experiment_1(0,1)'
evaluations_file = 'eval_IE5_experiment_1(0,1)'
models_file = 'model_IE5_experiment_1(0,1)'
Bopt.run_optimization(max_iter = max_iter, max_time = max_time, eps = tolerance, verbosity=True,
report_file = report_file, evaluations_file= evaluations_file, models_file=models_file)
#acquisition_type ='LCB', # LCB acquisition
#acquisition_weight = 0.1)
ff = open('IE5_experiment_1(0,4)','wb')
pickle.dump(Bopt,ff)
ff.close()
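# Illustration only: the discrete GPyOpt domains above are plain numpy
# ranges. Because np.arange excludes its stop value, the stop is written as
# 201 so that the grid actually reaches the intended upper bound of 200:
import numpy as np
_init_runs_grid = np.arange(10, 201, 10)
assert _init_runs_grid[0] == 10 and _init_runs_grid[-1] == 200
assert len(_init_runs_grid) == 20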
def bo_run_2():
"""
Run the Bayesian optimization experiment 2.
Tested parameters are: initial number of optimization runs, number of inducing points.
Conclusions after the experiment: The output file contains only variables as var_1, var_2 etc.
but Xavier said that the order is preserved.
The optimal values are: init_runs = 110, Q (ind. num) = 200. (run 3), 240.44817869
But the results are still the same from run to run.
Total running time was 40 hours on a GPU machine.
Maybe we can reduce the number of intrinsic iterations per evaluation.
Now the idea is to use manually designed initial values to run a more proper experiment. (Experiment 3)
"""
import GPyOpt
import pickle
exper = IE5_experiment_2(0,3)
domain =[{'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(50,201,10) },
#{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1,2,3,4)},
{'name': 'Q', 'type': 'discrete', 'domain': np.arange(40,201,10) } ]
Bopt = GPyOpt.methods.BayesianOptimization(f=exper, # function to optimize
domain=domain, # box-constraints of the problem
model_type = 'GP_MCMC',
initial_design_numdata = 3,# number data initial design
acquisition_type='EI_MCMC', # Expected Improvement
exact_feval = False)
#import pdb; pdb.set_trace()
# --- Stop conditions
max_time = None
max_iter = 7
tolerance = 1e-4 # distance between two consecutive observations
# Run the optimization
report_file = 'report_IE5_experiment_2(0,3)'
evaluations_file = 'eval_IE5_experiment_2(0,3)'
models_file = 'model_IE5_experiment_2(0,3)'
Bopt.run_optimization(max_iter = max_iter, max_time = max_time, eps = tolerance, verbosity=True,
report_file = report_file, evaluations_file= evaluations_file, models_file=models_file, acquisition_par=3) # acquisition_par is
# used to make it more explorative. It seems it did not help.
#acquisition_type ='LCB', # LCB acquisition
#acquisition_weight = 0.1)
ff = open('IE5_experiment_2(0,3)','wb')
pickle.dump(Bopt,ff)
ff.close()
def bo_run_3():
"""
Run the Bayesian optimization experiment 3.
Tested parameters are: number of layers, number of inducing points.
Also, I do initial evaluations manually.
Best values: 1 layer, 40 inducing points, (run 7) 242.67636311
This value is larger than in the other experiments, maybe because there were only 2 internal runs
in every function evaluation.
"""
import GPyOpt
import pickle
exper = IE5_experiment_3(0,2)
domain =[ #{ 'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(50,201,10) },
#{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1,2,3,4)},
{ 'name': 'layer_num', 'type': 'discrete', 'domain': (1,2) },
{'name': 'Q', 'type': 'discrete', 'domain': np.arange(40,201,10) } ]
#out = exper( np.array( (( 2.0,100.0),) ) ) # input_shape: (array([[ 2., 120.]]),) ### outputshape: array([[ 413.67619157]])
input1 = np.array( (( 1.0,50.0),) ); out1 = exper( input1 )
input2 = np.array( (( 2.0,50.0),) ); out2 = exper( input2 )
input3 = np.array( (( 1.0,100.0),) ); out3 = exper( input3 )
input4 = np.array( (( 2.0,100.0),) ); out4 = exper( input4 )
input5 = np.array( (( 1.0,200.0),) ); out5 = exper( input5 )
input6 = np.array( (( 2.0,200.0),) ); out6 = exper( input6 )
# init_input = np.vstack( (input1,input2,) )
# init_out = np.vstack( (out1,out2,) )
init_input = np.vstack( (input1,input2,input3,input4,input5,input6) )
init_out = np.vstack( (out1,out2,out3,out4,out5,out6) )
#import pdb; pdb.set_trace(); #return
#exper()
#import pdb; pdb.set_trace()
Bopt = GPyOpt.methods.BayesianOptimization(f=exper, # function to optimize
domain=domain, # box-constraints of the problem
model_type = 'GP_MCMC',
X=init_input,
Y=init_out,
#initial_design_numdata = 3,# number data initial design
acquisition_type='EI_MCMC', # Expected Improvement
exact_feval = False)
#import pdb; pdb.set_trace(); #return
# --- Stop conditions
max_time = None
max_iter = 10
tolerance = 1e-4 # distance between two consecutive observations
# Run the optimization
report_file = 'report_IE5_experiment_3(0,2)'
evaluations_file = 'eval_IE5_experiment_3(0,2)'
models_file = 'model_IE5_experiment_3(0,2)'
Bopt.run_optimization(max_iter = max_iter, max_time = max_time, eps = tolerance, verbosity=True,
report_file = report_file, evaluations_file= evaluations_file, models_file=models_file, acquisition_par=3) # acquisition_par is
# used to make it more explorative. It seems it did not help.
#acquisition_type ='LCB', # LCB acquisition
#acquisition_weight = 0.1)
ff = open('IE5_experiment_3(0,2)','wb')
pickle.dump(Bopt,ff)
ff.close()
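# Illustration only: the manual initial-design pattern used in bo_run_3
# above, with a toy stand-in objective (not the real IE5_experiment_3).
# Hand-picked points are evaluated one by one, then stacked into the X/Y
# arrays that BayesianOptimization accepts instead of a random initial design.
import numpy as np
def _toy_objective(x):  # stands in for IE5_experiment_3; returns shape (1, 1)
    return np.array([[float(x[0, 0] + x[0, 1])]])
_points = [np.array([[l, q]]) for l in (1.0, 2.0) for q in (50.0, 100.0, 200.0)]
_init_input = np.vstack(_points)
_init_out = np.vstack([_toy_objective(p) for p in _points])
assert _init_input.shape == (6, 2) and _init_out.shape == (6, 1)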
def bo_run_4():
"""
Run the Bayesian optimization experiment 4.
SVI inference
Tested parameters are: number of initial runs, number of inducing points.
First SVI experiment.
"""
import GPyOpt
import pickle
exper = IE5_experiment_4(0,3)
domain =[{'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(50,501,50) },
#{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1,2,3,4)},
{'name': 'Q', 'type': 'discrete', 'domain': np.arange(40,201,10) } ]
Bopt = GPyOpt.methods.BayesianOptimization(f=exper, # function to optimize
domain=domain, # box-constraints of the problem
model_type = 'GP_MCMC',
initial_design_numdata = 5,# number data initial design
acquisition_type='EI_MCMC', # Expected Improvement
exact_feval = False)
#import pdb; pdb.set_trace()
# --- Stop conditions
max_time = None
max_iter = 2
tolerance = 1e-4 # distance between two consecutive observations
# Run the optimization
report_file = 'report_IE5_experiment_4(0,3)'
evaluations_file = 'eval_IE5_experiment_4(0,3)'
models_file = 'model_IE5_experiment_4(0,3)'
Bopt.run_optimization(max_iter = max_iter, max_time = max_time, eps = tolerance, verbosity=True,
report_file = report_file, evaluations_file= evaluations_file, models_file=models_file, acquisition_par=3) # acquisition_par is
# used to make it more explorative. It seems it did not help.
#acquisition_type ='LCB', # LCB acquisition
#acquisition_weight = 0.1)
ff = open('IE5_experiment_4(0,3)','wb')
pickle.dump(Bopt,ff)
ff.close()
def bo_run_5():
"""
Run the Bayesian optimization experiment 5.
Back constraints + SVI inference
Tested parameters are: number of initial runs, number of inducing points.
The optimal value: 340.816199019 350.0(init_runs) 120.0(ind points), iteration 9 (8 in file names)
"""
import GPyOpt
import pickle
exper = IE5_experiment_5(0,3)
domain =[{'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(50,501,50) },
#{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1,2,3,4)},
{'name': 'Q', 'type': 'discrete', 'domain': np.arange(40,201,10) } ]
Bopt = GPyOpt.methods.BayesianOptimization(f=exper, # function to optimize
domain=domain, # box-constraints of the problem
model_type = 'GP_MCMC',
initial_design_numdata = 5,# number data initial design
acquisition_type='EI_MCMC', # Expected Improvement
exact_feval = False)
#import pdb; pdb.set_trace()
# --- Stop conditions
max_time = None
max_iter = 5
tolerance = 1e-4 # distance between two consecutive observations
# Run the optimization
report_file = 'report_IE5_experiment_5(0,3)'
evaluations_file = 'eval_IE5_experiment_5(0,3)'
models_file = 'model_IE5_experiment_5(0,3)'
Bopt.run_optimization(max_iter = max_iter, max_time = max_time, eps = tolerance, verbosity=True,
report_file = report_file, evaluations_file= evaluations_file, models_file=models_file, acquisition_par=3) # acquisition_par is
# used to make it more explorative. It seems it did not help.
#acquisition_type ='LCB', # LCB acquisition
#acquisition_weight = 0.1)
ff = open('IE5_experiment_5(0,3)','wb')
pickle.dump(Bopt,ff)
ff.close()
def bo_run_6():
"""
Run the Bayesian optimization experiment 6.
Same as experiment 5, only the model has changed: now it includes the RGP inputs at
the encoder inputs.
Back constraints + SVI inference
Tested parameters are: number of initial runs, number of inducing points.
The optimal values: 361.667338238 300.0(init_runs) 80.0(ind points), inter. 4 (3 in file name)
"""
import GPyOpt
import pickle
exper = IE5_experiment_6(0,2)
domain =[{'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(50,501,50) },
#{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1,2,3,4)},
{'name': 'Q', 'type': 'discrete', 'domain': np.arange(40,201,10) } ]
Bopt = GPyOpt.methods.BayesianOptimization(f=exper, # function to optimize
domain=domain, # box-constrains of the problem
model_type = 'GP_MCMC',
initial_design_numdata = 5,# number data initial design
acquisition_type='EI_MCMC', # Expected Improvement
exact_feval = False)
#import pdb; pdb.set_trace()
# --- Stop conditions
max_time = None
max_iter = 2
tolerance = 1e-4 # distance between two consecutive observations
# Run the optimization
report_file = 'report_IE5_experiment_6(0,2)'
evaluations_file = 'eval_IE5_experiment_6(0,2)'
models_file = 'model_IE5_experiment_6(0,2)'
Bopt.run_optimization(max_iter = max_iter, max_time = max_time, eps = tolerance, verbosity=True,
report_file = report_file, evaluations_file= evaluations_file, models_file=models_file, acquisition_par=3) # acquisition_par is
# used to make it more explorative. It seems it did not help.
#acquisition_type ='LCB', # LCB acquisition
#acquisition_weight = 0.1)
ff = open('IE5_experiment_6(0,2)','w')
pickle.dump(Bopt,ff)
ff.close()
def bo_run_7():
    """
    Run the Bayesian optimization experiment 7.
    Same as experiment 6, but with an ARD kernel, a different tolerance
    and a different max_iter.
    """
    import GPy
    import GPyOpt
    import pickle

    exper = IE5_experiment_6(0, 2)
    domain = [{'name': 'init_runs', 'type': 'discrete', 'domain': np.arange(200, 501, 30)},
              #{'name': 'hidden_dims', 'type': 'discrete', 'domain': (1, 2, 3, 4)},
              {'name': 'Q', 'type': 'discrete', 'domain': np.arange(60, 201, 10)}]
    kernel = GPy.kern.RBF(len(domain), ARD=True)
    Bopt = GPyOpt.methods.BayesianOptimization(f=exper,  # function to optimize
                                               domain=domain,  # box constraints of the problem
                                               model_type='GP_MCMC',
                                               kernel=kernel,
                                               initial_design_numdata=3,  # number of points in the initial design
                                               acquisition_type='EI_MCMC',  # Expected Improvement
                                               exact_feval=False)
    #import pdb; pdb.set_trace()
    # --- Stop conditions
    max_time = None
    max_iter = 7
    tolerance = 1e-2  # distance between two consecutive observations

    # Run the optimization
    report_file = 'report_IE5_experiment_7(0,2)'
    evaluations_file = 'eval_IE5_experiment_7(0,2)'
    models_file = 'model_IE5_experiment_7(0,2)'
    Bopt.run_optimization(max_iter=max_iter, max_time=max_time, eps=tolerance, verbosity=True,
                          report_file=report_file, evaluations_file=evaluations_file,
                          models_file=models_file, acquisition_par=3)  # acquisition_par is
    # used to make it more explorative. It seems it did not help.
    #acquisition_type='LCB', # LCB acquisition
    #acquisition_weight = 0.1)
    ff = open('IE5_experiment_7(0,2)', 'wb')  # binary mode for pickle
    pickle.dump(Bopt, ff)
    ff.close()
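# The runs above all select points with the Expected Improvement acquisition.
# As a minimal, self-contained sketch (plain NumPy with hypothetical GP
# posterior means/std-devs rather than GPyOpt's GP_MCMC model), EI over a
# discrete candidate grid like the init_runs domain can be computed as:

```python
import math
import numpy as np

def expected_improvement(mu, sigma, f_best, xi=0.0):
    # EI for minimisation: expected amount by which a candidate with
    # posterior mean mu and std dev sigma improves on the best value f_best.
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    imp = f_best - mu - xi
    z = imp / np.maximum(sigma, 1e-12)
    # standard normal CDF via erf, and standard normal pdf
    Phi = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2.0)) for v in z]))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    ei = imp * Phi + sigma * phi
    # with zero predictive variance, EI degenerates to the plain improvement
    return np.where(sigma > 0.0, ei, np.maximum(imp, 0.0))

# hypothetical posterior over the discrete init_runs grid used above
grid = np.arange(50, 501, 50)
mu = np.linspace(-300.0, -360.0, len(grid))   # negated scores (we minimise)
sigma = np.full(len(grid), 5.0)
next_point = grid[np.argmax(expected_improvement(mu, sigma, f_best=-340.0))]
```

# The acquisition_par comment above refers to the exploration/exploitation
# trade-off; the xi parameter here plays the analogous role in this sketch.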
if __name__ == '__main__':
    bo_run_5()

# === utils/__init__.py (YotamLif/HLT, Apache-2.0) ===
"""
Package that contains useful utils used for HLT
"""
from .hamil_utils import Hamiltonian
from .hamil_utils import RandomKLocalHamiltonian
from .hamil_utils import RandomPauliHamiltonian
from .hamil_utils import TransverseIsingHamiltonian
from .hamil_utils import get_circ_from_gibbs_hamiltonian
from .hamil_utils import get_density_matrix_from_gibbs_hamiltonian
from .hamil_utils import get_ghz_circ
from .hamil_utils import get_random_k_local_gibbs_circ
from .np_utils import get_fidelity
from .np_utils import get_gibbs_hamiltonian
from .np_utils import is_hermitian
from .np_utils import normalized_matrix
from .pauli_decomposition import PauliDecomposition
from .pauli_decomposition import analyze_density_matrix
from .qiskit_utils import get_air_simulator
from .qiskit_utils import get_density_matrix_circuit
from .qiskit_utils import get_density_matrix_from_simulation
from .qiskit_utils import get_up_to_range_k_paulis

# === Numerial_Analysis/nas/3-1.py (RDCPP/Numerial_Analysis_Backup, MIT) ===
import numpy as np
# Natural cubic spline interpolation on 11 equally spaced nodes in [0, 2]
x = np.linspace(0., 2., 11)
a = np.array([1, 1.125, 1.039, 0.6663, -0.0650, -1.131, -2.448, -3.821, -4.944, -5.425, -4.83])
b = np.zeros(11)
c = np.zeros(11)
d = np.zeros(11)
h = np.zeros(10)
alpha = np.zeros(10)
l = np.ones(11)
mu = np.zeros(10)
zeta = np.zeros(11)

# step lengths between consecutive nodes
for i in range(10):
    h[i] = x[i+1] - x[i]

# right-hand side of the tridiagonal system for the c coefficients
for i in range(1, 10):
    alpha[i] = 3*(a[i+1] - a[i])/h[i] - 3*(a[i] - a[i-1])/h[i-1]

# forward sweep of the tridiagonal solve
for i in range(1, 10):
    l[i] = 2*(x[i+1] - x[i-1]) - h[i-1]*mu[i-1]
    mu[i] = h[i]/l[i]
    zeta[i] = (alpha[i] - h[i-1]*zeta[i-1])/l[i]

# back substitution: recover c, then b and d
for j in range(9, -1, -1):
    c[j] = zeta[j] - mu[j]*c[j+1]
    b[j] = (a[j+1] - a[j])/h[j] - h[j]*(c[j+1] + 2*c[j])/3
    d[j] = (c[j+1] - c[j])/(3*h[j])
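# The coefficients computed above define the cubic piece
# S_j(t) = a_j + b_j*(t - x_j) + c_j*(t - x_j)**2 + d_j*(t - x_j)**3
# on each interval [x_j, x_{j+1}]. A small helper (the name eval_spline is
# ours, not the original script's) makes that evaluation explicit:

```python
def eval_spline(t, j, a, b, c, d, x):
    # evaluate the j-th cubic piece S_j at the point t
    dt = t - x[j]
    return a[j] + b[j] * dt + c[j] * dt ** 2 + d[j] * dt ** 3
```

# Called with the a, b, c, d, x arrays from this script, eval_spline(0.5, 2, ...)
# reproduces the "front" evaluation carried out term by term below.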
# Compare the spline against f(x) = exp(x) * cos(2x) at x = 0.5,
# evaluating with the cubic piece on either side of the point and
# reporting the smaller error.
dt_front = 0.5 - x[2]
eval_front_S = a[2] + b[2]*dt_front + c[2]*dt_front**2 + d[2]*dt_front**3
dt_back = 0.5 - x[3]
eval_back_S = a[3] + b[3]*dt_back + c[3]*dt_back**2 + d[3]*dt_back**3
eval_exact = np.exp(0.5) * np.cos(1.0)
front_diff = np.abs(eval_exact - eval_front_S)
back_diff = np.abs(eval_exact - eval_back_S)
print("\nFunction Value")
print("x = 0.5 : ", end="")
if front_diff > back_diff:
    print(back_diff)
else:
    print(front_diff)

# Same comparison at x = 1.5
dt_front = 1.5 - x[7]
eval_front_S = a[7] + b[7]*dt_front + c[7]*dt_front**2 + d[7]*dt_front**3
dt_back = 1.5 - x[8]
eval_back_S = a[8] + b[8]*dt_back + c[8]*dt_back**2 + d[8]*dt_back**3
eval_exact = np.exp(1.5) * np.cos(3.0)
front_diff = np.abs(eval_exact - eval_front_S)
back_diff = np.abs(eval_exact - eval_back_S)
print("x = 1.5 : ", end="")
if front_diff > back_diff:
    print(back_diff)
else:
    print(front_diff)

# Derivative of the cubic piece: S'(t) = b + 2*c*(t - x_j) + 3*d*(t - x_j)**2.
# (The original computed 3*(2*dt)**2 = 12*dt**2 for the cubic term, which
# over-counts it by a factor of 4; fixed here.)
dt_front = 0.5 - x[2]
eval_front_S = b[2] + 2*c[2]*dt_front + 3*d[2]*dt_front**2
dt_back = 0.5 - x[3]
eval_back_S = b[3] + 2*c[3]*dt_back + 3*d[3]*dt_back**2
eval_exact = np.exp(0.5) * (np.cos(1.0) - 2 * np.sin(1.0))
front_diff = np.abs(eval_exact - eval_front_S)
back_diff = np.abs(eval_exact - eval_back_S)
print("\nDerivation Value")
print("x = 0.5 : ", end="")
if front_diff > back_diff:
    print(back_diff)
else:
    print(front_diff)

# Derivative comparison at x = 1.5
dt_front = 1.5 - x[7]
eval_front_S = b[7] + 2*c[7]*dt_front + 3*d[7]*dt_front**2
dt_back = 1.5 - x[8]
eval_back_S = b[8] + 2*c[8]*dt_back + 3*d[8]*dt_back**2
eval_exact = np.exp(1.5) * (np.cos(3.0) - 2 * np.sin(3.0))
front_diff = np.abs(eval_exact - eval_front_S)
back_diff = np.abs(eval_exact - eval_back_S)
print("x = 1.5 : ", end="")
if front_diff > back_diff:
    print(back_diff)
else:
    print(front_diff)

# === tests/redis_wrapper.py (jay-johnson/docker-redis-haproxy-cluster, MIT) ===
import json, os, datetime
from functools import wraps
try:
    import cPickle as pickle
except ImportError:
    import pickle
from redis import Redis
from time import sleep


class RedisWrapper(object):

    def __init__(self, name, serializer=pickle, **kwargs):
        self.m_debug = False
        self.m_name = name
        self.m_theserializer = serializer
        self.m_redis = Redis(**kwargs)
        self.m_retry_interval = 1
        self.m_retry_count = 1
        self.m_max_retries = -1
        self.m_max_sleep_secs = 30  # 30 seconds
        self.m_host = ""
        self.m_port = ""
        self.m_id = "Name(" + str(self.m_name) + ")"
        for key, value in kwargs.iteritems():
            if str(key) == "host":
                self.m_host = str(value)
            if str(key) == "port":
                self.m_port = str(value)
        if self.m_host != "" and self.m_port != "":
            self.m_id = "Name(" + str(self.m_name) + ") RedisAddress(" + str(self.m_host) + ":" + str(self.m_port) + ")"
        self.m_error_log = "/tmp/__redis_errors.log"
        self.m_state = "Disconnected"
    # end of __init__
    def retry_throttled_connection(self, ex, debug=False):
        msg = "ERROR - " + str(datetime.datetime.now().strftime("%d-%m-%Y %H:%M:%S")) + " - " + self.m_id + " - RW EX - " + str(ex) + "\n"
        # append to the error log
        with open(self.m_error_log, "a") as output_file:
            output_file.write(str(msg))
        try:
            self.m_state = "Disconnected"
            cur_sleep = (self.m_retry_interval * self.m_retry_count) + 1
            if debug:
                print "Retrying Connection"
            while self.m_state == "Disconnected":
                try:
                    msg = self.safe_get_cached_single_set("RetryingRedisConnection")
                    if "Status" in msg and str(msg["Status"]) != "EXCEPTION":
                        print "------"
                        print "SUCCESS(" + str(msg) + ")"
                        print str(msg)
                        print ""
                        self.m_state = "Connected"
                except Exception, e:
                    if debug:
                        print "Redis Failed Connection Retry(" + str(e) + ")"
                if self.m_state != "Connected":
                    cur_sleep = (self.m_retry_interval * self.m_retry_count) + 1
                    if cur_sleep > self.m_max_sleep_secs:
                        cur_sleep = self.m_max_sleep_secs
                        if debug:
                            print "---MAX SLEEP(" + str(self.m_max_sleep_secs) + ") HIT"
                    else:
                        self.m_retry_count += 1
                        self.m_retry_interval += 2
                    if debug:
                        print " - Sleeping before Retry(" + str(cur_sleep) + ")"
                    sleep(cur_sleep)
            # end of while disconnected
            if debug:
                print "Retrying Connection - SUCCESS"
            self.m_retry_count = 1
            self.m_retry_interval = 1
        except Exception, k:
            print "ERROR: Redis Retry Connection had Critical Failure(" + str(k) + ")"
        # end of try/ex
        msg = "Retry Completed - " + str(datetime.datetime.now().strftime("%d-%m-%Y %H:%M:%S")) + " - " + self.m_id + " - RW.m_state(" + str(self.m_state) + ")" + "\n"
        # append to the error log
        with open(self.m_error_log, "a") as output_file:
            output_file.write(str(msg))
        if debug:
            print msg
        return True
    # end of retry_throttled_connection

    def client_kill(self):
        try:
            self.m_redis.connection_pool.get_connection("QUIT").disconnect()
            self.m_state = "Disconnected"
        except Exception, p:
            print "ERROR: Failed to Kill: " + str(self.m_id) + " with Ex(" + str(p) + ")"
            self.m_state = "Disconnected"
        return None
    # end of client_kill
    def __rlen(self):
        # Common safety net for high-availability connectivity handling:
        # this class automatically retries connections to Redis instances
        # that are not available, via retry_throttled_connection().
        # The same retry-loop pattern is used by every method below.
        success = False
        while not success:
            try:
                return self.m_redis.llen(self.key())
            except Exception, R:
                # try to reconnect with a throttle
                self.retry_throttled_connection(R)
    # end of __rlen

    def allconsume(self, **kwargs):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                kwargs.setdefault('block', True)
                try:
                    while True:
                        msg = self.get(**kwargs)
                        if msg is None:
                            break
                        yield msg
                except KeyboardInterrupt:
                    print
                    return
                success = True
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of allconsume

    def key(self):
        return "%s" % self.m_name
    # end of key
    def get_cached_multiple_set(self, start_idx=0, end_idx=-1, queue=None):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                if queue == None:
                    return self.m_redis.lrange(self.key(), start_idx, end_idx)
                else:
                    return self.m_redis.lrange(queue, start_idx, end_idx)
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of get_cached_multiple_set

    def safe_get_cached_single_set(self, key):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                msg = {
                    "Value": None,
                    "Status": None,
                    "Exception": None
                }
                try:
                    cached_msg = self.m_redis.lrange(key, 0, 1)
                    new_msg = None
                    if cached_msg is not None and len(cached_msg) != 0 and self.m_theserializer is not None:
                        new_msg = self.m_theserializer.loads(cached_msg[0])
                    msg["Value"] = new_msg
                    msg["Status"] = "SUCCESS"
                except Exception, e:
                    msg["Status"] = "EXCEPTION"
                    msg["Exception"] = "Exception(" + str(e) + ")"
                return msg
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of safe_get_cached_single_set

    def get_cached_single_set(self, queue=None):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                if queue == None:
                    return self.m_redis.lrange(self.key(), 0, 1)
                else:
                    return self.m_redis.lrange(queue, 0, 1)
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of get_cached_single_set
    def get(self, block=False, timeout=None, queue=None):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                msg = None
                if block:
                    if timeout is None:
                        timeout = 0
                    if queue == None:
                        msg = self.m_redis.blpop(self.key(), timeout=timeout)
                    else:
                        msg = self.m_redis.blpop(queue, timeout=timeout)
                    if msg is not None:
                        msg = msg[1]
                else:
                    if queue == None:
                        msg = self.m_redis.lpop(self.key())
                    else:
                        msg = self.m_redis.lpop(queue)
                if msg is not None and self.m_theserializer is not None:
                    msg = self.m_theserializer.loads(msg)
                return msg
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of get

    def put_into_key(self, key, *msgs):
        # serialize once, outside the retry loop, so a retry does not
        # pickle already-pickled payloads
        if self.m_theserializer is not None:
            msgs = map(self.m_theserializer.dumps, msgs)
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                self.m_redis.rpush(key, *msgs)
                success = True
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of put_into_key

    def put(self, *msgs):
        # serialize once, outside the retry loop, so a retry does not
        # pickle already-pickled payloads
        if self.m_theserializer is not None:
            msgs = map(self.m_theserializer.dumps, msgs)
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                self.m_redis.rpush(self.key(), *msgs)
                success = True
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of put
    def exists(self, key):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                return self.m_redis.exists(key)
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of exists

    def delete_cache(self, queue=None):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                if queue == None:
                    self.m_redis.delete(self.key())
                else:
                    self.m_redis.delete(queue)
                return None
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of delete_cache

    def flush_all(self):
        # safety net: retry the connection with a throttle until it succeeds
        success = False
        while not success:
            try:
                self.m_redis.flushall()
                return None
            except Exception, R:
                self.retry_throttled_connection(R)
    # end of flush_all
# end of class RedisWrapper
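# The throttling arithmetic in retry_throttled_connection can be isolated and
# tested on its own. This sketch (the function name is ours) reproduces the
# schedule: sleep = interval * count + 1, with interval growing by 2 and count
# by 1 on every retry, capped at m_max_sleep_secs once the cap is reached:

```python
def backoff_delays(retries, retry_interval=1, retry_count=1, max_sleep_secs=30):
    # return the sleep durations the wrapper would use for `retries`
    # consecutive failed reconnect attempts
    delays = []
    interval, count = retry_interval, retry_count
    for _ in range(retries):
        cur_sleep = interval * count + 1
        if cur_sleep > max_sleep_secs:
            cur_sleep = max_sleep_secs  # counters stop growing at the cap
        else:
            count += 1
            interval += 2
        delays.append(cur_sleep)
    return delays
```

# With the wrapper's defaults this yields 2, 7, 16, 29 seconds and then
# plateaus at the 30-second cap, matching the "MAX SLEEP HIT" branch above.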

# === src/pytest_mock_resources/container/redshift.py (pytest-mock-resources, MIT) ===
import pytest
@pytest.fixture(scope="session")
def _redshift_container(_postgres_container):
    # Redshift is wire-compatible with postgres here, so alias the fixture
    return _postgres_container

# === tests/test_rsaprivatekey.py (FullteaR/factorizer, MIT) ===
import pytest
from factorizer import RSAPrivateKeyFactorizer as Factorizer
from factorizer import TimeOutError
def test_ok_normal_case():
n = 27686602223927069809508667129651371574022939911012945899001963424465432485857959278558567281517986710814493093153793525047594117727967922805090645353064983356364587573528439400605242106130103131384993204101341653150290978652895046371555816803503680877880178698972551411830684046183050273503256489746735122896322889851233107319826830407170825219504026080487291571593984998811061039563352881755150841596074226928497041576188756112952655828168668354006968017555707521724480003190926244967186018315461915028536777617724701916068306698854834025206955234552798984323325634496632645984709079699996404187769159304233647822817
d = 24519437769149139138866335660847545755165653484828286006043516748645401856648854182027545432645741317052553200888752554950064278086138490312456491085827725315522539371158133921466167994259596651442467698644153219537709818896410096455515199158877483530710370808678561483477316661434979292218578919768993187556392027276261818313248453442745589939088057747368431718456807618716287398645113323934099505504696161971119193577301554819446410886001798824782611510559183940960340872731200080537116249824675375515706234728334218799881290646607824216835997567209464899473615916775422544955890417233691031033339957172951014984193
e = 65537
divider = Factorizer(timeout=5)
facts = divider.factorize(n=n, d=d, e=e)
    assert facts[0] != 0 and facts[1] != 0
assert n == facts[0] * facts[1]
def test_ok_reverse_case():
n = 27686602223927069809508667129651371574022939911012945899001963424465432485857959278558567281517986710814493093153793525047594117727967922805090645353064983356364587573528439400605242106130103131384993204101341653150290978652895046371555816803503680877880178698972551411830684046183050273503256489746735122896322889851233107319826830407170825219504026080487291571593984998811061039563352881755150841596074226928497041576188756112952655828168668354006968017555707521724480003190926244967186018315461915028536777617724701916068306698854834025206955234552798984323325634496632645984709079699996404187769159304233647822817
d = 65537
e = 24519437769149139138866335660847545755165653484828286006043516748645401856648854182027545432645741317052553200888752554950064278086138490312456491085827725315522539371158133921466167994259596651442467698644153219537709818896410096455515199158877483530710370808678561483477316661434979292218578919768993187556392027276261818313248453442745589939088057747368431718456807618716287398645113323934099505504696161971119193577301554819446410886001798824782611510559183940960340872731200080537116249824675375515706234728334218799881290646607824216835997567209464899473615916775422544955890417233691031033339957172951014984193
divider = Factorizer(timeout=5)
facts = divider.factorize(n=n, d=d, e=e)
    assert facts[0] != 0 and facts[1] != 0
assert n == facts[0] * facts[1]
| 125.583333 | 625 | 0.934307 | 88 | 3,014 | 31.931818 | 0.318182 | 0.008541 | 0.014235 | 0.443416 | 0.066904 | 0.066904 | 0.066904 | 0.066904 | 0.066904 | 0.066904 | 0 | 0.864078 | 0.043132 | 3,014 | 23 | 626 | 131.043478 | 0.110264 | 0 | 0 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0.210526 | 1 | 0.105263 | false | 0 | 0.157895 | 0 | 0.263158 | 0 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e56d95d60ea2a9be0ea3292ff4f38ad247914a00 | 37 | py | Python | ivy/functional/__init__.py | saurbhc/ivy | 20b327b4fab543b26ad5a18acf4deddd6e3c804b | [
"Apache-2.0"
] | 681 | 2022-01-18T19:08:56.000Z | 2022-03-31T22:48:37.000Z | ivy/functional/__init__.py | saurbhc/ivy | 20b327b4fab543b26ad5a18acf4deddd6e3c804b | [
"Apache-2.0"
] | 637 | 2022-01-19T07:40:28.000Z | 2022-03-31T19:06:47.000Z | ivy/functional/__init__.py | saurbhc/ivy | 20b327b4fab543b26ad5a18acf4deddd6e3c804b | [
"Apache-2.0"
] | 501 | 2022-01-23T14:48:35.000Z | 2022-03-31T04:09:38.000Z | from . import ivy
from .ivy import *
| 12.333333 | 18 | 0.702703 | 6 | 37 | 4.333333 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216216 | 37 | 2 | 19 | 18.5 | 0.896552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e58bad6ee97b75bc492de59da54119e28a5efc84 | 60 | py | Python | elliot/recommender/NN/item_knn/__init__.py | swapUniba/Elliot_refactor-tesi-Ventrella | 3ddffc041696c90a6f6d3e8906c212fc4f55f842 | [
"Apache-2.0"
] | null | null | null | elliot/recommender/NN/item_knn/__init__.py | swapUniba/Elliot_refactor-tesi-Ventrella | 3ddffc041696c90a6f6d3e8906c212fc4f55f842 | [
"Apache-2.0"
] | null | null | null | elliot/recommender/NN/item_knn/__init__.py | swapUniba/Elliot_refactor-tesi-Ventrella | 3ddffc041696c90a6f6d3e8906c212fc4f55f842 | [
"Apache-2.0"
] | 1 | 2021-06-02T06:57:07.000Z | 2021-06-02T06:57:07.000Z |
from elliot.recommender.NN.item_knn.item_knn import ItemKNN | 30 | 59 | 0.866667 | 10 | 60 | 5 | 0.8 | 0.28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 60 | 2 | 59 | 30 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e59415b72b9e78598ec51131121cb9c26ef970c2 | 16,004 | py | Python | ot/test/test_ordertheory.py | cjeffers/OT | bd417af705fae49ae3b38eb1e621180b4cf802d2 | [
"BSD-3-Clause"
] | null | null | null | ot/test/test_ordertheory.py | cjeffers/OT | bd417af705fae49ae3b38eb1e621180b4cf802d2 | [
"BSD-3-Clause"
] | null | null | null | ot/test/test_ordertheory.py | cjeffers/OT | bd417af705fae49ae3b38eb1e621180b4cf802d2 | [
"BSD-3-Clause"
] | 1 | 2022-03-11T00:23:34.000Z | 2022-03-11T00:23:34.000Z | import unittest
from .. import ordertheory
class OrderTheoryTests(unittest.TestCase):
def test_powerset(self):
"""Powerset calculate ok?"""
test_pset = set([(), (1,), (2,), (3,), (4,), (1, 2), (1, 3), (1, 4),
(2, 3), (2, 4), (3, 4), (1, 2, 3), (1, 2, 4),
(1, 3, 4), (2, 3, 4), (1, 2, 3, 4)])
pset = list(ordertheory.Powerset().powerset([1,2,3,4]))
for elem in pset:
msg = "%s from generated set not in test powerset" % str(elem)
self.assertTrue(elem in test_pset, msg)
for elem in test_pset:
msg = "%s from test powerset not in generated set" % str(elem)
self.assertTrue(elem in pset, msg)
def test_functional_space(self):
"""Functional space calculated correctly?"""
test_fspace = [[(1, 6), (2, 4), (3, 5)], [(1, 6), (2, 5), (3, 4)],
[(1, 4), (2, 6), (3, 4)], [(1, 4), (2, 5), (3, 4)],
[(1, 4), (2, 4), (3, 4)], [(1, 6), (2, 5), (3, 6)],
[(1, 5), (2, 6), (3, 4)], [(1, 4), (2, 6), (3, 5)],
[(1, 5), (2, 5), (3, 4)], [(1, 5), (2, 6), (3, 6)],
[(1, 5), (2, 4), (3, 6)], [(1, 4), (2, 5), (3, 6)],
[(1, 5), (2, 6), (3, 5)], [(1, 5), (2, 5), (3, 5)],
[(1, 4), (2, 6), (3, 6)], [(1, 6), (2, 4), (3, 6)],
[(1, 6), (2, 5), (3, 5)], [(1, 6), (2, 6), (3, 4)],
[(1, 5), (2, 4), (3, 5)], [(1, 4), (2, 5), (3, 5)],
[(1, 5), (2, 5), (3, 6)], [(1, 6), (2, 6), (3, 5)],
[(1, 5), (2, 4), (3, 4)], [(1, 6), (2, 6), (3, 6)],
[(1, 6), (2, 4), (3, 4)], [(1, 4), (2, 4), (3, 5)],
[(1, 4), (2, 4), (3, 6)]]
fspace = ordertheory.FunctionalSpace().funcs([1, 2, 3], [4, 5, 6])
msg = "functional space from [1,2,3] into [4,5,6] doesn't match"
self.assertEqual(fspace, test_fspace, msg)
def test_strict_total_orders(self):
"""Strict total orders calculated ok?"""
test_orders = [frozenset([(1, 2), (1, 3), (1, 4), (2, 3), (3, 4), (2, 4)]),
frozenset([(1, 2), (1, 3), (1, 4), (2, 3), (4, 3), (2, 4)]),
frozenset([(1, 2), (3, 2), (1, 3), (1, 4), (3, 4), (2, 4)]),
frozenset([(1, 2), (3, 2), (1, 3), (1, 4), (4, 2), (3, 4)]),
frozenset([(1, 2), (1, 3), (1, 4), (2, 3), (4, 3), (4, 2)]),
frozenset([(1, 2), (3, 2), (1, 3), (1, 4), (4, 3), (4, 2)]),
frozenset([(1, 3), (2, 1), (2, 3), (1, 4), (3, 4), (2, 4)]),
frozenset([(1, 3), (2, 1), (2, 3), (1, 4), (4, 3), (2, 4)]),
frozenset([(3, 1), (2, 1), (2, 3), (1, 4), (3, 4), (2, 4)]),
frozenset([(4, 1), (3, 1), (2, 1), (2, 3), (3, 4), (2, 4)]),
frozenset([(1, 3), (2, 1), (2, 3), (4, 3), (4, 1), (2, 4)]),
frozenset([(3, 1), (2, 1), (2, 3), (4, 3), (4, 1), (2, 4)]),
frozenset([(1, 2), (3, 2), (3, 1), (1, 4), (3, 4), (2, 4)]),
frozenset([(1, 2), (3, 2), (3, 1), (1, 4), (4, 2), (3, 4)]),
frozenset([(3, 2), (3, 1), (2, 1), (1, 4), (3, 4), (2, 4)]),
frozenset([(3, 2), (3, 4), (3, 1), (2, 1), (4, 1), (2, 4)]),
frozenset([(1, 2), (3, 2), (3, 4), (3, 1), (4, 2), (4, 1)]),
frozenset([(3, 2), (3, 4), (3, 1), (2, 1), (4, 2), (4, 1)]),
frozenset([(1, 2), (1, 3), (2, 3), (4, 3), (4, 2), (4, 1)]),
frozenset([(1, 2), (3, 2), (1, 3), (4, 3), (4, 2), (4, 1)]),
frozenset([(1, 3), (2, 1), (2, 3), (4, 3), (4, 2), (4, 1)]),
frozenset([(3, 1), (2, 1), (2, 3), (4, 3), (4, 2), (4, 1)]),
frozenset([(1, 2), (3, 2), (3, 1), (4, 3), (4, 2), (4, 1)]),
frozenset([(3, 2), (3, 1), (2, 1), (4, 3), (4, 2), (4, 1)])]
orders = list(ordertheory.StrictTotalOrders().orders([1,2,3,4]))
msg = "strict total orders for [1,2,3,4] don't match"
self.assertEqual(orders, test_orders, msg)
def test_strict_orders(self):
"""Lattices generate ok? (3-constraint)"""
test_lat = {
frozenset([(1, 3), (2, 3)]): {
'down': set([frozenset([(1, 3), (2, 3)]), frozenset([(2, 3)]),
frozenset([]), frozenset([(1, 3)])]),
'max': set([frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(1, 2), (1, 3), (2, 3)])]),
'up': set([frozenset([(1, 3), (2, 3)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 3), (2, 3), (2, 1)])])},
frozenset([(2, 3)]): {
'down': set([frozenset([(2, 3)]), frozenset([])]),
'max': set([frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(3, 1), (2, 3), (2, 1)])]),
'up': set([frozenset([(1, 3), (2, 3)]), frozenset([(2, 3)]),
frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(2, 3), (2, 1)])])},
frozenset([(3, 2), (3, 1)]): {
'down': set([frozenset([(3, 2)]), frozenset([(3, 2), (3, 1)]),
frozenset([]), frozenset([(3, 1)])]),
'max': set([frozenset([(1, 2), (3, 2), (3, 1)]),
frozenset([(3, 2), (3, 1), (2, 1)])]),
'up': set([frozenset([(1, 2), (3, 2), (3, 1)]),
frozenset([(3, 2), (3, 1)]),
frozenset([(3, 2), (3, 1), (2, 1)])])},
frozenset([(1, 3)]): {
'down': set([frozenset([]), frozenset([(1, 3)])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 3), (2, 3), (2, 1)])]),
'up': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 3)]), frozenset([(1, 3), (2, 3)]),
frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (1, 3)])])},
frozenset([(1, 2)]): {
'down': set([frozenset([]), frozenset([(1, 2)])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (3, 2), (3, 1)])]),
'up': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (3, 2)]),
frozenset([(1, 2), (3, 2), (3, 1)]),
frozenset([(1, 2), (1, 3)])])},
frozenset([(1, 2), (3, 2), (1, 3)]): {
'down': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 3)]), frozenset([(1, 2)]),
frozenset([]), frozenset([(1, 2), (3, 2)]),
frozenset([(3, 2)]), frozenset([(1, 2), (1, 3)])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)])]),
'up': set([frozenset([(1, 2), (3, 2), (1, 3)])])},
frozenset([(2, 3), (2, 1)]): {
'down': set([frozenset([(2, 1)]), frozenset([(2, 3), (2, 1)]),
frozenset([(2, 3)]), frozenset([])]),
'max': set([frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)])]),
'up': set([frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(2, 3), (2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)])])},
frozenset([(3, 1), (2, 1)]): {
'down': set([frozenset([(2, 1)]), frozenset([]),
frozenset([(3, 1)]), frozenset([(3, 1), (2, 1)])]),
'max': set([frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(3, 2), (3, 1), (2, 1)])]),
'up': set([frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(3, 1), (2, 1)])])},
frozenset([]): {
'down': set([frozenset([])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (3, 2), (3, 1)])]),
'up': set([frozenset([(1, 3), (2, 3)]), frozenset([(2, 3)]),
frozenset([(3, 2), (3, 1)]), frozenset([(1, 3)]),
frozenset([(1, 2)]),
frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(2, 3), (2, 1)]),
frozenset([(3, 1), (2, 1)]), frozenset([]),
frozenset([(3, 1)]), frozenset([(2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (3, 2)]),
frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(1, 2), (3, 2), (3, 1)]),
frozenset([(3, 2)]), frozenset([(1, 2), (1, 3)]),
frozenset([(3, 2), (3, 1), (2, 1)])])},
frozenset([(3, 1)]): {
'down': set([frozenset([]), frozenset([(3, 1)])]),
'max': set([frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(1, 2), (3, 2), (3, 1)])]),
'up': set([frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(3, 1), (2, 1)]),
frozenset([(3, 2), (3, 1)]), frozenset([(3, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(1, 2), (3, 2), (3, 1)])])},
frozenset([(2, 1)]): {
'down': set([frozenset([(2, 1)]), frozenset([])]),
'max': set([frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)])]),
'up': set([frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(3, 1), (2, 1)]),
frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([(2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(2, 3), (2, 1)])])},
frozenset([(3, 1), (2, 3), (2, 1)]): {
'down': set([frozenset([(2, 3)]), frozenset([(3, 1), (2, 1)]),
frozenset([]), frozenset([(3, 1)]),
frozenset([(2, 1)]),
frozenset([(3, 1), (2, 3), (2, 1)]),
frozenset([(2, 3), (2, 1)])]),
'max': set([frozenset([(3, 1), (2, 3), (2, 1)])]),
'up': set([frozenset([(3, 1), (2, 3), (2, 1)])])},
frozenset([(1, 2), (1, 3), (2, 3)]): {
'down': set([frozenset([(1, 3), (2, 3)]), frozenset([(2, 3)]),
frozenset([(1, 3)]), frozenset([(1, 2)]),
frozenset([]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (1, 3)])]),
'max': set([frozenset([(1, 2), (1, 3), (2, 3)])]),
'up': set([frozenset([(1, 2), (1, 3), (2, 3)])])},
frozenset([(1, 2), (3, 2)]): {
'down': set([frozenset([(3, 2)]), frozenset([(1, 2), (3, 2)]),
frozenset([]), frozenset([(1, 2)])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2), (3, 2), (3, 1)])]),
'up': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2), (3, 2)]),
frozenset([(1, 2), (3, 2), (3, 1)])])},
frozenset([(1, 3), (2, 3), (2, 1)]): {
'down': set([frozenset([(1, 3), (2, 3)]),
frozenset([(2, 3)]), frozenset([(1, 3)]),
frozenset([(1, 3), (2, 3), (2, 1)]),
frozenset([]), frozenset([(2, 1)]),
frozenset([(2, 3), (2, 1)])]),
'max': set([frozenset([(1, 3), (2, 3), (2, 1)])]),
'up': set([frozenset([(1, 3), (2, 3), (2, 1)])])},
frozenset([(1, 2), (3, 2), (3, 1)]): {
'down': set([frozenset([(1, 2)]), frozenset([(3, 2), (3, 1)]),
frozenset([]), frozenset([(3, 1)]),
frozenset([(1, 2), (3, 2)]),
frozenset([(1, 2), (3, 2), (3, 1)]),
frozenset([(3, 2)])]),
'max': set([frozenset([(1, 2), (3, 2), (3, 1)])]),
'up': set([frozenset([(1, 2), (3, 2), (3, 1)])])},
frozenset([(3, 2)]): {
'down': set([frozenset([(3, 2)]), frozenset([])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(1, 2), (3, 2), (3, 1)])]),
'up': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(3, 2), (3, 1)]),
frozenset([(1, 2), (3, 2)]),
frozenset([(1, 2), (3, 2), (3, 1)]),
frozenset([(3, 2)])])},
frozenset([(1, 2), (1, 3)]): {
'down': set([frozenset([]), frozenset([(1, 3)]),
frozenset([(1, 2)]), frozenset([(1, 2), (1, 3)])]),
'max': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2), (1, 3), (2, 3)])]),
'up': set([frozenset([(1, 2), (3, 2), (1, 3)]),
frozenset([(1, 2), (1, 3), (2, 3)]),
frozenset([(1, 2), (1, 3)])])},
frozenset([(3, 2), (3, 1), (2, 1)]): {
'down': set([frozenset([(3, 2), (3, 1), (2, 1)]),
frozenset([(3, 1), (2, 1)]),
frozenset([(3, 2), (3, 1)]), frozenset([]),
frozenset([(3, 1)]), frozenset([(2, 1)]),
frozenset([(3, 2)])]),
'max': set([frozenset([(3, 2), (3, 1), (2, 1)])]),
'up': set([frozenset([(3, 2), (3, 1), (2, 1)])])}
}
lat = ordertheory.StrictOrders().get_orders([1,2,3])
for k in lat.keys():
msg = "Sets for %s in 3-constraint lattice don't match" % str(k)
self.assertEqual(lat[k], test_lat[k], msg)
if __name__ == '__main__':
unittest.main()
| 59.716418 | 83 | 0.3018 | 1,919 | 16,004 | 2.503387 | 0.033872 | 0.080766 | 0.055579 | 0.053289 | 0.82244 | 0.798085 | 0.781432 | 0.743339 | 0.680891 | 0.643838 | 0 | 0.150206 | 0.423019 | 16,004 | 267 | 84 | 59.940075 | 0.370045 | 0.00831 | 0 | 0.413655 | 0 | 0.004016 | 0.025931 | 0 | 0 | 0 | 0 | 0 | 0.02008 | 1 | 0.016064 | false | 0 | 0.008032 | 0 | 0.028112 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e5be162d4ef3b85fd7f708c56ea21f4ee4dbf251 | 236 | py | Python | fastform/index/views.py | csc5nz/FastForms | b208734b94b908fce38ca342442a3e15014f8c23 | [
"MIT"
] | 1 | 2021-07-23T19:04:24.000Z | 2021-07-23T19:04:24.000Z | fastform/index/views.py | csc5nz/FastForms | b208734b94b908fce38ca342442a3e15014f8c23 | [
"MIT"
] | 26 | 2020-01-14T04:32:38.000Z | 2022-03-12T00:54:18.000Z | fastform/index/views.py | csc5nz/FastForms | b208734b94b908fce38ca342442a3e15014f8c23 | [
"MIT"
] | 1 | 2021-12-17T19:26:57.000Z | 2021-12-17T19:26:57.000Z | from django.shortcuts import render
from django.http import HttpResponse
# Create your views here.
def home(request):
return HttpResponse("<h1>Create and Edit Documents</h1><p>Create a Document</p><p>Upload a PDF</p><p>Login</p>")
| 33.714286 | 116 | 0.745763 | 38 | 236 | 4.631579 | 0.657895 | 0.113636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009662 | 0.122881 | 236 | 6 | 117 | 39.333333 | 0.84058 | 0.097458 | 0 | 0 | 0 | 0.25 | 0.421801 | 0.208531 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
e5c6cfbd1a0f5222243bb6e9e1c5f23a1661e3a8 | 11,106 | py | Python | tests/api/test_payments_create.py | YTXIRE/able_crm_concrete_accounting_python_tests | c708bfb973a19499e0b04da5c219d924b295d74e | [
"Apache-2.0"
] | 1 | 2022-03-06T17:30:38.000Z | 2022-03-06T17:30:38.000Z | tests/api/test_payments_create.py | YTXIRE/able_crm_concrete_accounting_python_tests | c708bfb973a19499e0b04da5c219d924b295d74e | [
"Apache-2.0"
] | null | null | null | tests/api/test_payments_create.py | YTXIRE/able_crm_concrete_accounting_python_tests | c708bfb973a19499e0b04da5c219d924b295d74e | [
"Apache-2.0"
] | null | null | null | from allure import title, description, suite, parent_suite
from data import API_PAYMENTS_SUCCESS, DATA_AMOUNT_PAYMENTS_SUCCESS, DATA_PAYMENTS_TIMESTAMP, \
API_POST_METHOD_NOT_ALLOWED, API_AUTH_TOKEN_EMPTY, AUTH_DATA_FAIL_NOT_FOUND_ID, AUTH_DATA_LENGTH_TOKEN, \
API_BAD_REQUEST_LENGTH_TOKEN, API_BAD_REQUEST_BAD_ID, AUTH_DATA_FAIL_BAD_ID, API_BAD_REQUEST_AMOUNT, \
DATA_AMOUNT_PAYMENTS_BAD, API_BAD_REQUEST_AMOUT_LESS_THAT_ONE_MILLIARD, AUTH_DATA_FAIL_BAD_TOKEN, \
API_AUTH_NOT_FOUND_TOKEN, API_NOT_FOUND_VENDORS
from pages.api import asserts
@suite('Controller: Payments. Method: create')
@parent_suite('[PYTHON][API]')
class TestApiPaymentsCreate:
@title('create')
    @description('Checks that create works correctly')
def test_create(self, create_admin, payments, create_vendors_remove, create_legal_entities_remove, remove_payments):
vendor_id = create_vendors_remove
asserts(
assert_data=payments.create({
"token": create_admin['token'],
"vendor_id": vendor_id,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": create_admin['id']
}),
data=API_PAYMENTS_SUCCESS
)
remove_payments(vendor_id)
@title('create-fail-method-not-allowed')
    @description('Checks the Method Not Allowed error for create')
def test_create_method_not_allowed(self, payments):
asserts(
assert_data=payments.create_method_not_allowed(),
data=API_POST_METHOD_NOT_ALLOWED
)
    @title('create-fail-bad-request-empty-token')
    @description('Checks the Bad Request error for create with an empty token')
def test_create_bad_request_empty_token(self, payments, create_legal_entities_remove, remove_payments):
asserts(
assert_data=payments.create({
"token": '',
"vendor_id": AUTH_DATA_FAIL_NOT_FOUND_ID,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_AUTH_TOKEN_EMPTY
)
    @title('create-fail-bad-request-length-token')
    @description('Checks the Bad Request error for create with a token of invalid length')
def test_create_bad_request_length_token(self, payments, create_vendors_remove, create_legal_entities_remove):
asserts(
assert_data=payments.create({
"token": AUTH_DATA_LENGTH_TOKEN,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_LENGTH_TOKEN
)
    @title('create-fail-bad-request-bad-user-id-minus')
    @description('Checks the Bad Request bad id error for create with a negative user id')
def test_create_bad_request_bad_user_id_minus(self, payments, create_vendors_remove, create_legal_entities_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": AUTH_DATA_FAIL_BAD_ID
}),
data=API_BAD_REQUEST_BAD_ID
)
    @title('create-fail-bad-request-bad-user-id-varchar')
    @description('Checks the Bad Request bad id error for create with non-numeric characters in the user id')
def test_create_bad_request_bad_user_id_varchar(self, payments, create_vendors_remove,
create_legal_entities_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 'text'
}),
data=API_BAD_REQUEST_BAD_ID
)
    @title('create-fail-bad-request-bad-vendor-id-minus')
    @description('Checks the Bad Request bad id error for create with a negative vendor id')
def test_create_bad_request_bad_vendor_id_minus(self, payments, create_legal_entities_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": AUTH_DATA_FAIL_BAD_ID,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_BAD_ID
)
    @title('create-fail-bad-request-bad-vendor-id-varchar')
    @description('Checks the Bad Request bad id error for create with non-numeric characters in the vendor id')
def test_create_bad_request_bad_vendor_id_varchar(self, payments, create_legal_entities_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": 'text',
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_BAD_ID
)
    @title('create-fail-bad-request-bad-legal-entity-id-minus')
    @description('Checks the Bad Request bad id error for create with a negative legal entity id')
def test_create_bad_request_bad_legal_entity_id_minus(self, payments, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": AUTH_DATA_FAIL_BAD_ID,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_BAD_ID
)
    @title('create-fail-bad-request-bad-legal-entity-id-varchar')
    @description('Checks the Bad Request bad id error for create with non-numeric characters in the legal entity id')
def test_create_bad_request_bad_legal_entity_id_varchar(self, payments, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": 'text',
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_BAD_ID
)
    @title('create-fail-bad-request-amount-varchar')
    @description('Checks the Bad Request amount error for create with non-numeric characters in the payment amount')
def test_create_bad_request_amount_varchar(self, payments, create_legal_entities_remove, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": 'text',
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_AMOUNT
)
    @title('create-fail-bad-request-amount-minus')
    @description('Checks the Bad Request amount error for create with a negative payment amount')
def test_create_bad_request_amount_minus(self, payments, create_legal_entities_remove, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": AUTH_DATA_FAIL_BAD_ID,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_AMOUNT
)
    @title('create-fail-bad-request-amount-zero')
    @description('Checks the Bad Request amount error for create with a zero payment amount')
def test_create_bad_request_amount_zero(self, payments, create_legal_entities_remove, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": 0,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_AMOUNT
)
    @title('create-fail-bad-request-amount-length')
    @description('Checks the Bad Request amount length error for create')
def test_create_bad_request_amount_length(self, payments, create_legal_entities_remove, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": 1,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_BAD,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_BAD_REQUEST_AMOUT_LESS_THAT_ONE_MILLIARD
)
@title('create-fail-not-found-token')
    @description('Checks the Not Found token error for create')
def test_create_not_found_token(self, payments, create_legal_entities_remove, create_vendors_remove):
asserts(
assert_data=payments.create({
"token": AUTH_DATA_FAIL_BAD_TOKEN,
"vendor_id": create_vendors_remove,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": 1
}),
data=API_AUTH_NOT_FOUND_TOKEN
)
@title('create-fail-not-found-vendor')
    @description('Checks the Not Found vendor error for create')
def test_create_not_found_vendor(self, payments, create_admin, create_legal_entities_remove):
asserts(
assert_data=payments.create({
"token": create_admin['token'],
"vendor_id": AUTH_DATA_FAIL_NOT_FOUND_ID,
"legal_entity_id": create_legal_entities_remove,
"amount": DATA_AMOUNT_PAYMENTS_SUCCESS,
"created_at": DATA_PAYMENTS_TIMESTAMP,
"user_id": create_admin['id']
}),
data=API_NOT_FOUND_VENDORS
)
| 44.60241 | 120 | 0.637583 | 1,248 | 11,106 | 5.245994 | 0.064103 | 0.059569 | 0.075454 | 0.099282 | 0.891553 | 0.829693 | 0.778066 | 0.749504 | 0.707958 | 0.654346 | 0 | 0.002753 | 0.280389 | 11,106 | 248 | 121 | 44.782258 | 0.816441 | 0 | 0 | 0.569565 | 0 | 0 | 0.227084 | 0.050603 | 0 | 0 | 0 | 0 | 0.143478 | 1 | 0.069565 | false | 0 | 0.013043 | 0 | 0.086957 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f9002fa847935152807ca5011eec0d2b393f4e57 | 26 | py | Python | Python/Tests/TestData/Grammar/FuncDefV2.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | Python/Tests/TestData/Grammar/FuncDefV2.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | Python/Tests/TestData/Grammar/FuncDefV2.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | 1 | 2020-12-09T10:16:23.000Z | 2020-12-09T10:16:23.000Z | def f(a, (b, c), d): pass | 26 | 26 | 0.461538 | 8 | 26 | 1.625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192308 | 26 | 1 | 26 | 26 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 1 | 0 | null | null | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
00892d6f6f69361ad4e861a0d02a227aaa16356a | 33 | py | Python | fastbay/__init__.py | termlt/FastBay | 8a9240f8472ab6c59e5a34ee93839c7bdb19f9da | [
"MIT"
] | 2 | 2021-05-09T12:07:49.000Z | 2021-05-09T13:30:47.000Z | fastbay/__init__.py | termlt/FastBay | 8a9240f8472ab6c59e5a34ee93839c7bdb19f9da | [
"MIT"
] | null | null | null | fastbay/__init__.py | termlt/FastBay | 8a9240f8472ab6c59e5a34ee93839c7bdb19f9da | [
"MIT"
] | 1 | 2022-03-12T04:52:41.000Z | 2022-03-12T04:52:41.000Z | from fastbay.fastbay import fbay
| 16.5 | 32 | 0.848485 | 5 | 33 | 5.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 1 | 33 | 33 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
00d279fb0d9752fe8846aba884e7840bf52ed707 | 140 | py | Python | tests/const.py | yakumo-saki/smart_to_zabbix | 04dd1debe0c831b4ec94962884543c989ad57730 | [
"MIT"
] | 1 | 2022-02-03T22:32:10.000Z | 2022-02-03T22:32:10.000Z | tests/const.py | yakumo-saki/megacli_to_zabbix | 2e5f22dabf54b88b23d34ebf3d592efce24f1698 | [
"MIT"
] | 23 | 2021-08-30T14:59:27.000Z | 2021-11-05T16:51:08.000Z | tests/const.py | yakumo-saki/smart_to_zabbix | 04dd1debe0c831b4ec94962884543c989ad57730 | [
"MIT"
] | null | null | null | EXAMPLE_DEVICE_DIR="smart_examples/device"
EXAMPLE_DEV_SATA_DIR="{EXAMPLE_DEVICE_DIR}/sata"
EXAMPLE_DEV_NVME_DIR="{EXAMPLE_DEVICE_DIR}/nvme" | 46.666667 | 48 | 0.871429 | 22 | 140 | 4.954545 | 0.363636 | 0.357798 | 0.440367 | 0.348624 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014286 | 140 | 3 | 49 | 46.666667 | 0.789855 | 0 | 0 | 0 | 0 | 0 | 0.503546 | 0.503546 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
00d88dc5fa62349c5484d3b12cb198f7847c10e8 | 882 | py | Python | shell/database/Linux64/tcp_bindx64.py | vasco2016/shellsploit-framework | 04eb4a0449acaba0b70c40a78c61a0d5e2527406 | [
"MIT"
] | 61 | 2017-06-13T13:48:38.000Z | 2022-03-02T17:43:45.000Z | shell/database/Linux64/tcp_bindx64.py | security-geeks/shellsploit-framework | 93b66ab9361872697eafda2125b37005f49116be | [
"MIT"
] | null | null | null | shell/database/Linux64/tcp_bindx64.py | security-geeks/shellsploit-framework | 93b66ab9361872697eafda2125b37005f49116be | [
"MIT"
] | 28 | 2017-08-15T05:38:27.000Z | 2020-12-31T03:39:38.000Z | #https://www.exploit-db.com/exploits/39151/
def tcp_bindx64( PORT):
shellcode = r"\x48\x31\xc0\x48\x31\xff\x48\x31\xf6\x48\x31\xd2\x4d\x31\xc0\x6a"
shellcode += r"\x02\x5f\x6a\x01\x5e\x6a\x06\x5a\x6a\x29\x58\x0f\x05\x49\x89\xc0"
shellcode += r"\x4d\x31\xd2\x41\x52\x41\x52\xc6\x04\x24\x02\x66\xc7\x44\x24\x02"
shellcode += PORT
shellcode += r"\x48\x89\xe6\x41\x50\x5f\x6a\x10\x5a\x6a\x31\x58\x0f\x05"
shellcode += r"\x41\x50\x5f\x6a\x01\x5e\x6a\x32\x58\x0f\x05\x48\x89\xe6\x48\x31"
shellcode += r"\xc9\xb1\x10\x51\x48\x89\xe2\x41\x50\x5f\x6a\x2b\x58\x0f\x05\x59"
shellcode += r"\x4d\x31\xc9\x49\x89\xc1\x4c\x89\xcf\x48\x31\xf6\x6a\x03\x5e\x48"
shellcode += r"\xff\xce\x6a\x21\x58\x0f\x05\x75\xf6\x48\x31\xff\x57\x57\x5e\x5a"
shellcode += r"\x48\xbf\x2f\x2f\x62\x69\x6e\x2f\x73\x68\x48\xc1\xef\x08\x57\x54"
shellcode += r"\x5f\x6a\x3b\x58\x0f\x05"
return shellcode
| 44.1 | 81 | 0.706349 | 183 | 882 | 3.398907 | 0.387978 | 0.160772 | 0.086817 | 0.057878 | 0.048232 | 0 | 0 | 0 | 0 | 0 | 0 | 0.28 | 0.064626 | 882 | 19 | 82 | 46.421053 | 0.473939 | 0.047619 | 0 | 0 | 0 | 0.692308 | 0.706444 | 0.706444 | 0 | 1 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
da9996c4ad618d8720faaee1d4d488d21b32fe37 | 4,353 | py | Python | im/kibot/data/load/test/test_dataset_name_parser.py | ajmal017/amp | 8de7e3b88be87605ec3bad03c139ac64eb460e5c | [
"BSD-3-Clause"
] | null | null | null | im/kibot/data/load/test/test_dataset_name_parser.py | ajmal017/amp | 8de7e3b88be87605ec3bad03c139ac64eb460e5c | [
"BSD-3-Clause"
] | null | null | null | im/kibot/data/load/test/test_dataset_name_parser.py | ajmal017/amp | 8de7e3b88be87605ec3bad03c139ac64eb460e5c | [
"BSD-3-Clause"
] | null | null | null | import helpers.unit_test as hut
import im.common.data.types as vcdtyp
import im.kibot.data.load.dataset_name_parser as vkdlda
class TestDatasetNameParserExtractAssetClass(hut.TestCase):
def test_all_futures(self) -> None:
# Define input variables.
dataset = "all_futures_continuous_contracts_1min"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_asset_class(dataset=dataset)
# Define expected output.
exp = vcdtyp.AssetClass.Futures
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_all_stocks(self) -> None:
# Define input variables.
dataset = "all_stocks_1min"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_asset_class(dataset=dataset)
# Define expected output.
exp = vcdtyp.AssetClass.Stocks
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_all_etfs(self) -> None:
# Define input variables.
dataset = "all_etfs_daily"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_asset_class(dataset=dataset)
# Define expected output.
exp = vcdtyp.AssetClass.ETFs
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_all_forex(self) -> None:
# Define input variables.
dataset = "all_forex_pairs_1min"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_asset_class(dataset=dataset)
# Define expected output.
exp = vcdtyp.AssetClass.Forex
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_sp500(self) -> None:
# Define input variables.
dataset = "sp_500_tickbidask"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_asset_class(dataset=dataset)
# Define expected output.
exp = vcdtyp.AssetClass.SP500
# Compare actual and expected output.
self.assertEqual(act, exp)
class TestDatasetNameParserExtractFrequency(hut.TestCase):
def test_daily(self) -> None:
# Define input variables.
dataset = "all_futures_continuous_contracts_daily"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_frequency(dataset=dataset)
# Define expected output.
exp = vcdtyp.Frequency.Daily
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_minutely(self) -> None:
# Define input variables.
dataset = "all_futures_continuous_contracts_1min"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_frequency(dataset=dataset)
# Define expected output.
exp = vcdtyp.Frequency.Minutely
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_tick(self) -> None:
# Define input variables.
dataset = "all_futures_continuous_contracts_tick"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_frequency(dataset=dataset)
# Define expected output.
exp = vcdtyp.Frequency.Tick
# Compare actual and expected output.
self.assertEqual(act, exp)
class TestDatasetNameParserExtractContractType(hut.TestCase):
def test_continuous(self) -> None:
# Define input variables.
dataset = "all_futures_continuous_contracts_1min"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_contract_type(dataset=dataset)
# Define expected output.
exp = vcdtyp.ContractType.Continuous
# Compare actual and expected output.
self.assertEqual(act, exp)
def test_expiry(self) -> None:
# Define input variables.
dataset = "all_futures_contracts_1min"
# Call function to test.
cls = vkdlda.DatasetNameParser()
act = cls._extract_contract_type(dataset=dataset)
# Define expected output.
exp = vcdtyp.ContractType.Expiry
# Compare actual and expected output.
self.assertEqual(act, exp)
| 36.275 | 61 | 0.65518 | 468 | 4,353 | 5.931624 | 0.138889 | 0.100865 | 0.050432 | 0.068444 | 0.841859 | 0.841859 | 0.829251 | 0.788184 | 0.771974 | 0.753602 | 0 | 0.004699 | 0.266713 | 4,353 | 119 | 62 | 36.579832 | 0.864975 | 0.245578 | 0 | 0.5 | 0 | 0 | 0.085723 | 0.065372 | 0 | 0 | 0 | 0 | 0.151515 | 1 | 0.151515 | false | 0 | 0.045455 | 0 | 0.242424 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
dad8aaa3fea8a5aed55de849f33b36e86c9a7966 | 84 | py | Python | models/__init__.py | KIMGEONUNG/T-DGP | 7e580d29c43aad5a43680afebdd7ee980939ee01 | [
"MIT"
] | null | null | null | models/__init__.py | KIMGEONUNG/T-DGP | 7e580d29c43aad5a43680afebdd7ee980939ee01 | [
"MIT"
] | null | null | null | models/__init__.py | KIMGEONUNG/T-DGP | 7e580d29c43aad5a43680afebdd7ee980939ee01 | [
"MIT"
] | null | null | null | from .biggan import *
from .dgp import *
from .tdgp import *
from .nethook import *
| 16.8 | 22 | 0.714286 | 12 | 84 | 5 | 0.5 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 84 | 4 | 23 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
daea1feff13d64a490726962ff601715772322ce | 29 | py | Python | lib/arch/__init__.py | RUB-SysSec/tropyhunter | 7d027bf18944c641296ed5d68462f4400dbc055e | [
"MIT"
] | 1 | 2020-11-06T21:18:01.000Z | 2020-11-06T21:18:01.000Z | lib/arch/__init__.py | RUB-SysSec/tropyhunter | 7d027bf18944c641296ed5d68462f4400dbc055e | [
"MIT"
] | null | null | null | lib/arch/__init__.py | RUB-SysSec/tropyhunter | 7d027bf18944c641296ed5d68462f4400dbc055e | [
"MIT"
] | null | null | null | from .x64 import RegistersX64 | 29 | 29 | 0.862069 | 4 | 29 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 0.103448 | 29 | 1 | 29 | 29 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
973f0038c89d175351263cd9be4ab808808bac5e | 3,543 | py | Python | Paper_Specific_Versions/2018_NeuroImage/Code/experiments/3-class_imbalance/3-AIBL_CN_AD_balanced.py | adamwild/AD-ML | e4ac0b7d312ab482b9b52bb3f5c6745cc06431e9 | [
"MIT"
] | null | null | null | Paper_Specific_Versions/2018_NeuroImage/Code/experiments/3-class_imbalance/3-AIBL_CN_AD_balanced.py | adamwild/AD-ML | e4ac0b7d312ab482b9b52bb3f5c6745cc06431e9 | [
"MIT"
] | null | null | null | Paper_Specific_Versions/2018_NeuroImage/Code/experiments/3-class_imbalance/3-AIBL_CN_AD_balanced.py | adamwild/AD-ML | e4ac0b7d312ab482b9b52bb3f5c6745cc06431e9 | [
"MIT"
] | null | null | null |
import os
from os import path
import pickle
from clinica.pipelines.machine_learning.ml_workflows import VB_RepHoldOut_DualSVM, RB_RepHoldOut_DualSVM
n_iterations = 250
n_threads = 72
caps_dir = '/AIBL/CAPS'
output_dir = 'AIBL/CLASSIFICATION/OUTPUTS_BALANCED'
group_id = 'ADNIbl'
image_types = ['T1']
tasks_dir = '/AIBL/SUBJECTS/lists_by_task'
tasks = [('CN', 'AD')]
smoothing = [4]
atlases = ['AAL2']
classifier = 'linear_svm'
##### Voxel based classifications ######
for image_type in image_types:
for smooth in smoothing:
for task in tasks:
if image_type == 'T1':
classification_dir = path.join(output_dir, image_type, 'voxel_based',
'smoothing-%s' % smooth, classifier,
'%s_vs_%s' % (task[0], task[1]))
else:
classification_dir = path.join(output_dir, image_type, 'voxel_based', 'pvc-None',
'smoothing-%s' % smooth, classifier,
'%s_vs_%s' % (task[0], task[1]))
if not path.exists(classification_dir):
os.makedirs(classification_dir)
subjects_visits_tsv = path.join(tasks_dir, '%s_vs_%s_subjects_sessions_balanced.tsv' % (task[0], task[1]))
diagnoses_tsv = path.join(tasks_dir, '%s_vs_%s_diagnoses_balanced.tsv' % (task[0], task[1]))
            # Pickle files must be opened in binary mode for pickle.load.
            with open(path.join(tasks_dir, '%s_vs_%s_splits_indices_balanced.pkl' % (task[0], task[1])), 'rb') as s:
                splits_indices = pickle.load(s)
            print("Running %s" % classification_dir)
wf = VB_RepHoldOut_DualSVM(caps_dir, subjects_visits_tsv, diagnoses_tsv, group_id, image_type,
classification_dir, fwhm=smooth, pvc=None, n_iterations=n_iterations,
n_threads=n_threads, splits_indices=splits_indices)
wf.run()
# Region based
for image_type in image_types:
for atlas in atlases:
for task in tasks:
if image_type == 'T1':
classification_dir = path.join(output_dir, image_type, 'region_based',
'atlas-%s' % atlas, classifier,
'%s_vs_%s' % (task[0], task[1]))
else:
classification_dir = path.join(output_dir, image_type, 'region_based', 'pvc-None',
'atlas-%s' % atlas, classifier,
'%s_vs_%s' % (task[0], task[1]))
if not path.exists(classification_dir):
os.makedirs(classification_dir)
subjects_visits_tsv = path.join(tasks_dir, '%s_vs_%s_subjects_sessions_balanced.tsv' % (task[0], task[1]))
diagnoses_tsv = path.join(tasks_dir, '%s_vs_%s_diagnoses_balanced.tsv' % (task[0], task[1]))
            # Pickle files must be opened in binary mode for pickle.load.
            with open(path.join(tasks_dir, '%s_vs_%s_splits_indices_balanced.pkl' % (task[0], task[1])), 'rb') as s:
                splits_indices = pickle.load(s)
            print("Running %s" % classification_dir)
wf = RB_RepHoldOut_DualSVM(caps_dir, subjects_visits_tsv, diagnoses_tsv, group_id, image_type, atlas,
classification_dir, pvc=None, n_iterations=n_iterations, n_threads=n_threads,
splits_indices=splits_indices)
wf.run()
| 47.24 | 118 | 0.557437 | 409 | 3,543 | 4.523227 | 0.207824 | 0.11027 | 0.021622 | 0.054054 | 0.758919 | 0.758919 | 0.758919 | 0.72973 | 0.72973 | 0.72973 | 0 | 0.012685 | 0.332487 | 3,543 | 74 | 119 | 47.878378 | 0.769556 | 0.011572 | 0 | 0.542373 | 0 | 0 | 0.135321 | 0.079128 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.067797 | null | null | 0.033898 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
97676e9f619500d69093597f9f60be4f6fd9238d | 214,468 | py | Python | caproto/server/records/base.py | slactjohnson/caproto | e78404234abe3c31749c4f1c04d9ae4fbcd298f6 | [
"BSD-3-Clause"
] | 12 | 2019-05-25T14:26:25.000Z | 2022-01-24T09:10:18.000Z | caproto/server/records/base.py | slactjohnson/caproto | e78404234abe3c31749c4f1c04d9ae4fbcd298f6 | [
"BSD-3-Clause"
] | 333 | 2017-06-22T03:10:15.000Z | 2019-05-07T16:37:20.000Z | caproto/server/records/base.py | slactjohnson/caproto | e78404234abe3c31749c4f1c04d9ae4fbcd298f6 | [
"BSD-3-Clause"
] | 17 | 2019-07-03T18:17:22.000Z | 2022-03-22T00:24:20.000Z | """
Contains the base field representation for EPICS base records.
This file is auto-generated. Do not modify it.
If you need to add or modify fields to correct something, please use the
``reference-dbd`` project to regenerate this file.
If you need to add functionality to any record, see the module
:mod:`caproto.server.records.records`.
"""
# **NOTE** **NOTE**
# This file is auto-generated. Please see the module docstring for details.
# **NOTE** **NOTE**
from ..._data import ChannelType
from .. import menus
from ..server import PVGroup, pvproperty
from .mixins import _Limits, _LimitsLong
from .utils import copy_pvproperties, link_parent_attribute
class RecordFieldGroup(PVGroup):
_scan_rate_sec = None
_dtype = None # to be set by subclasses
has_val_field = True
alarm_acknowledge_severity = pvproperty(
name="ACKS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Alarm Ack Severity",
read_only=True,
)
alarm_acknowledge_transient = pvproperty(
name="ACKT",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Alarm Ack Transient",
read_only=True,
value="YES",
)
access_security_group = pvproperty(
name="ASG",
dtype=ChannelType.CHAR,
max_length=29,
report_as_string=True,
doc="Access Security Group",
)
description = pvproperty(
name="DESC",
dtype=ChannelType.CHAR,
max_length=41,
report_as_string=True,
doc="Descriptor",
)
disable = pvproperty(name="DISA", dtype=ChannelType.INT, doc="Disable")
disable_putfield = pvproperty(
name="DISP", dtype=ChannelType.CHAR, doc="Disable putField"
)
disable_alarm_severity = pvproperty(
name="DISS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Disable Alarm Sevrty",
)
disable_value = pvproperty(
name="DISV", dtype=ChannelType.INT, doc="Disable Value", value=1
)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_base.get_string_tuple(),
doc="Device Type",
)
event_name = pvproperty(
name="EVNT",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Event Name",
)
forward_link = pvproperty(
name="FLNK", dtype=ChannelType.STRING, doc="Forward Process Link"
)
lock_count = pvproperty(
name="LCNT", dtype=ChannelType.CHAR, doc="Lock Count", read_only=True
)
record_name = pvproperty(
name="NAME",
dtype=ChannelType.CHAR,
max_length=61,
report_as_string=True,
doc="Record Name",
read_only=True,
)
new_alarm_severity = pvproperty(
name="NSEV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="New Alarm Severity",
read_only=True,
)
new_alarm_status = pvproperty(
name="NSTA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmStat.get_string_tuple(),
doc="New Alarm Status",
read_only=True,
)
record_active = pvproperty(
name="PACT", dtype=ChannelType.CHAR, doc="Record active", read_only=True
)
scan_phase = pvproperty(
name="PHAS", dtype=ChannelType.INT, doc="Scan Phase"
)
process_at_iocinit = pvproperty(
name="PINI",
dtype=ChannelType.ENUM,
enum_strings=menus.menuPini.get_string_tuple(),
doc="Process at iocInit",
)
scheduling_priority = pvproperty(
name="PRIO",
dtype=ChannelType.ENUM,
enum_strings=menus.menuPriority.get_string_tuple(),
doc="Scheduling Priority",
)
process_record = pvproperty(
name="PROC", dtype=ChannelType.CHAR, doc="Force Processing"
)
dbputfield_process = pvproperty(
name="PUTF",
dtype=ChannelType.CHAR,
doc="dbPutField process",
read_only=True,
)
reprocess = pvproperty(
name="RPRO", dtype=ChannelType.CHAR, doc="Reprocess ", read_only=True
)
scan_rate = pvproperty(
name="SCAN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Scan Mechanism",
)
scanning_disable = pvproperty(
name="SDIS", dtype=ChannelType.STRING, doc="Scanning Disable"
)
current_alarm_severity = pvproperty(
name="SEVR",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Alarm Severity",
read_only=True,
)
trace_processing = pvproperty(
name="TPRO", dtype=ChannelType.CHAR, doc="Trace Processing"
)
time_stamp_event = pvproperty(
name="TSE", dtype=ChannelType.INT, doc="Time Stamp Event"
)
time_stamp_link = pvproperty(
name="TSEL", dtype=ChannelType.STRING, doc="Time Stamp Link"
)
undefined = pvproperty(
name="UDF", dtype=ChannelType.CHAR, doc="Undefined", value=chr(1)
)
alarm_status = pvproperty(
name="STAT",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmStat.get_string_tuple(),
doc="Alarm Status",
read_only=True,
value="NO_ALARM",
)
undefined_alarm_severity = pvproperty(
name="UDFS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Undefined Alarm Sevrty",
value="INVALID",
)
class AiFields(RecordFieldGroup, _Limits):
_record_type = "ai"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_ai.get_string_tuple(),
doc="Device Type",
)
current_raw_value = pvproperty(
name="RVAL", dtype=ChannelType.LONG, doc="Current Raw Value"
)
initialized = pvproperty(
name="INIT", dtype=ChannelType.INT, doc="Initialized?", read_only=True
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
lastbreak_point = pvproperty(
name="LBRK",
dtype=ChannelType.INT,
doc="LastBreak Point",
read_only=True,
)
previous_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="Previous Raw Value",
read_only=True,
)
raw_offset = pvproperty(
name="ROFF", dtype=ChannelType.LONG, doc="Raw Offset"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Simulation Mode",
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.DOUBLE, doc="Simulation Value"
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
adjustment_offset = pvproperty(
name="AOFF", dtype=ChannelType.DOUBLE, doc="Adjustment Offset"
)
adjustment_slope = pvproperty(
name="ASLO", dtype=ChannelType.DOUBLE, doc="Adjustment Slope", value=1
)
engineer_units_full = pvproperty(
name="EGUF", dtype=ChannelType.DOUBLE, doc="Engineer Units Full"
)
engineer_units_low = pvproperty(
name="EGUL", dtype=ChannelType.DOUBLE, doc="Engineer Units Low"
)
linearization = pvproperty(
name="LINR",
dtype=ChannelType.ENUM,
enum_strings=menus.menuConvert.get_string_tuple(),
doc="Linearization",
)
raw_to_egu_offset = pvproperty(
name="EOFF", dtype=ChannelType.DOUBLE, doc="Raw to EGU Offset"
)
raw_to_egu_slope = pvproperty(
name="ESLO", dtype=ChannelType.DOUBLE, doc="Raw to EGU Slope", value=1
)
smoothing = pvproperty(
name="SMOO", dtype=ChannelType.DOUBLE, doc="Smoothing"
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
alarm_filter_time_constant = pvproperty(
name="AFTC", dtype=ChannelType.DOUBLE, doc="Alarm Filter Time Constant"
)
alarm_filter_value = pvproperty(
name="AFVL",
dtype=ChannelType.DOUBLE,
doc="Alarm Filter Value",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
# current_egu_value = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Current EGU Value')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class AsubFields(RecordFieldGroup):
_record_type = "aSub"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_aSub.get_string_tuple(),
doc="Device Type",
)
bad_return_severity = pvproperty(
name="BRSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Bad Return Severity",
)
output_event_flag = pvproperty(
name="EFLG",
dtype=ChannelType.ENUM,
enum_strings=menus.aSubEFLG.get_string_tuple(),
doc="Output Event Flag",
value=1,
)
type_of_a = pvproperty(
name="FTA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of A",
read_only=True,
value="DOUBLE",
)
type_of_b = pvproperty(
name="FTB",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of B",
read_only=True,
value="DOUBLE",
)
type_of_c = pvproperty(
name="FTC",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of C",
read_only=True,
value="DOUBLE",
)
type_of_d = pvproperty(
name="FTD",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of D",
read_only=True,
value="DOUBLE",
)
type_of_e = pvproperty(
name="FTE",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of E",
read_only=True,
value="DOUBLE",
)
type_of_f = pvproperty(
name="FTF",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of F",
read_only=True,
value="DOUBLE",
)
type_of_g = pvproperty(
name="FTG",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of G",
read_only=True,
value="DOUBLE",
)
type_of_h = pvproperty(
name="FTH",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of H",
read_only=True,
value="DOUBLE",
)
type_of_i = pvproperty(
name="FTI",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of I",
read_only=True,
value="DOUBLE",
)
type_of_j = pvproperty(
name="FTJ",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of J",
read_only=True,
value="DOUBLE",
)
type_of_k = pvproperty(
name="FTK",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of K",
read_only=True,
value="DOUBLE",
)
type_of_l = pvproperty(
name="FTL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of L",
read_only=True,
value="DOUBLE",
)
type_of_m = pvproperty(
name="FTM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of M",
read_only=True,
value="DOUBLE",
)
type_of_n = pvproperty(
name="FTN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of N",
read_only=True,
value="DOUBLE",
)
type_of_o = pvproperty(
name="FTO",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of O",
read_only=True,
value="DOUBLE",
)
type_of_p = pvproperty(
name="FTP",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of P",
read_only=True,
value="DOUBLE",
)
type_of_q = pvproperty(
name="FTQ",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of Q",
read_only=True,
value="DOUBLE",
)
type_of_r = pvproperty(
name="FTR",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of R",
read_only=True,
value="DOUBLE",
)
type_of_s = pvproperty(
name="FTS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of S",
read_only=True,
value="DOUBLE",
)
type_of_t = pvproperty(
name="FTT",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of T",
read_only=True,
value="DOUBLE",
)
type_of_u = pvproperty(
name="FTU",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of U",
read_only=True,
value="DOUBLE",
)
type_of_vala = pvproperty(
name="FTVA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALA",
read_only=True,
value="DOUBLE",
)
type_of_valb = pvproperty(
name="FTVB",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALB",
read_only=True,
value="DOUBLE",
)
type_of_valc = pvproperty(
name="FTVC",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALC",
read_only=True,
value="DOUBLE",
)
type_of_vald = pvproperty(
name="FTVD",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALD",
read_only=True,
value="DOUBLE",
)
type_of_vale = pvproperty(
name="FTVE",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALE",
read_only=True,
value="DOUBLE",
)
type_of_valf = pvproperty(
name="FTVF",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALF",
read_only=True,
value="DOUBLE",
)
type_of_valg = pvproperty(
name="FTVG",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALG",
read_only=True,
value="DOUBLE",
)
type_of_valh = pvproperty(
name="FTVH",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALH",
read_only=True,
value="DOUBLE",
)
type_of_vali = pvproperty(
name="FTVI",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALI",
read_only=True,
value="DOUBLE",
)
type_of_valj = pvproperty(
name="FTVJ",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALJ",
read_only=True,
value="DOUBLE",
)
type_of_valk = pvproperty(
name="FTVK",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALK",
read_only=True,
value="DOUBLE",
)
type_of_vall = pvproperty(
name="FTVL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALL",
read_only=True,
value="DOUBLE",
)
type_of_valm = pvproperty(
name="FTVM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALM",
read_only=True,
value="DOUBLE",
)
type_of_valn = pvproperty(
name="FTVN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALN",
read_only=True,
value="DOUBLE",
)
type_of_valo = pvproperty(
name="FTVO",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALO",
read_only=True,
value="DOUBLE",
)
type_of_valp = pvproperty(
name="FTVP",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALP",
read_only=True,
value="DOUBLE",
)
type_of_valq = pvproperty(
name="FTVQ",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALQ",
read_only=True,
value="DOUBLE",
)
type_of_valr = pvproperty(
name="FTVR",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALR",
read_only=True,
value="DOUBLE",
)
type_of_vals = pvproperty(
name="FTVS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALS",
read_only=True,
value="DOUBLE",
)
type_of_valt = pvproperty(
name="FTVT",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALT",
read_only=True,
value="DOUBLE",
)
type_of_valu = pvproperty(
name="FTVU",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Type of VALU",
read_only=True,
value="DOUBLE",
)
initialize_subr_name = pvproperty(
name="INAM",
dtype=ChannelType.CHAR,
max_length=41,
report_as_string=True,
doc="Initialize Subr. Name",
read_only=True,
)
input_link_a = pvproperty(
name="INPA", dtype=ChannelType.STRING, doc="Input Link A"
)
input_link_b = pvproperty(
name="INPB", dtype=ChannelType.STRING, doc="Input Link B"
)
input_link_c = pvproperty(
name="INPC", dtype=ChannelType.STRING, doc="Input Link C"
)
input_link_d = pvproperty(
name="INPD", dtype=ChannelType.STRING, doc="Input Link D"
)
input_link_e = pvproperty(
name="INPE", dtype=ChannelType.STRING, doc="Input Link E"
)
input_link_f = pvproperty(
name="INPF", dtype=ChannelType.STRING, doc="Input Link F"
)
input_link_g = pvproperty(
name="INPG", dtype=ChannelType.STRING, doc="Input Link G"
)
input_link_h = pvproperty(
name="INPH", dtype=ChannelType.STRING, doc="Input Link H"
)
input_link_i = pvproperty(
name="INPI", dtype=ChannelType.STRING, doc="Input Link I"
)
input_link_j = pvproperty(
name="INPJ", dtype=ChannelType.STRING, doc="Input Link J"
)
input_link_k = pvproperty(
name="INPK", dtype=ChannelType.STRING, doc="Input Link K"
)
input_link_l = pvproperty(
name="INPL", dtype=ChannelType.STRING, doc="Input Link L"
)
input_link_m = pvproperty(
name="INPM", dtype=ChannelType.STRING, doc="Input Link M"
)
input_link_n = pvproperty(
name="INPN", dtype=ChannelType.STRING, doc="Input Link N"
)
input_link_o = pvproperty(
name="INPO", dtype=ChannelType.STRING, doc="Input Link O"
)
input_link_p = pvproperty(
name="INPP", dtype=ChannelType.STRING, doc="Input Link P"
)
input_link_q = pvproperty(
name="INPQ", dtype=ChannelType.STRING, doc="Input Link Q"
)
input_link_r = pvproperty(
name="INPR", dtype=ChannelType.STRING, doc="Input Link R"
)
input_link_s = pvproperty(
name="INPS", dtype=ChannelType.STRING, doc="Input Link S"
)
input_link_t = pvproperty(
name="INPT", dtype=ChannelType.STRING, doc="Input Link T"
)
input_link_u = pvproperty(
name="INPU", dtype=ChannelType.STRING, doc="Input Link U"
)
subr_input_enable = pvproperty(
name="LFLG",
dtype=ChannelType.ENUM,
enum_strings=menus.aSubLFLG.get_string_tuple(),
doc="Subr. Input Enable",
)
num_elements_in_a = pvproperty(
name="NEA",
dtype=ChannelType.LONG,
doc="Num. elements in A",
read_only=True,
value=1,
)
num_elements_in_b = pvproperty(
name="NEB",
dtype=ChannelType.LONG,
doc="Num. elements in B",
read_only=True,
value=1,
)
num_elements_in_c = pvproperty(
name="NEC",
dtype=ChannelType.LONG,
doc="Num. elements in C",
read_only=True,
value=1,
)
num_elements_in_d = pvproperty(
name="NED",
dtype=ChannelType.LONG,
doc="Num. elements in D",
read_only=True,
value=1,
)
num_elements_in_e = pvproperty(
name="NEE",
dtype=ChannelType.LONG,
doc="Num. elements in E",
read_only=True,
value=1,
)
num_elements_in_f = pvproperty(
name="NEF",
dtype=ChannelType.LONG,
doc="Num. elements in F",
read_only=True,
value=1,
)
num_elements_in_g = pvproperty(
name="NEG",
dtype=ChannelType.LONG,
doc="Num. elements in G",
read_only=True,
value=1,
)
num_elements_in_h = pvproperty(
name="NEH",
dtype=ChannelType.LONG,
doc="Num. elements in H",
read_only=True,
value=1,
)
num_elements_in_i = pvproperty(
name="NEI",
dtype=ChannelType.LONG,
doc="Num. elements in I",
read_only=True,
value=1,
)
num_elements_in_j = pvproperty(
name="NEJ",
dtype=ChannelType.LONG,
doc="Num. elements in J",
read_only=True,
value=1,
)
num_elements_in_k = pvproperty(
name="NEK",
dtype=ChannelType.LONG,
doc="Num. elements in K",
read_only=True,
value=1,
)
num_elements_in_l = pvproperty(
name="NEL",
dtype=ChannelType.LONG,
doc="Num. elements in L",
read_only=True,
value=1,
)
num_elements_in_m = pvproperty(
name="NEM",
dtype=ChannelType.LONG,
doc="Num. elements in M",
read_only=True,
value=1,
)
num_elements_in_n = pvproperty(
name="NEN",
dtype=ChannelType.LONG,
doc="Num. elements in N",
read_only=True,
value=1,
)
num_elements_in_o = pvproperty(
name="NEO",
dtype=ChannelType.LONG,
doc="Num. elements in O",
read_only=True,
value=1,
)
num_elements_in_p = pvproperty(
name="NEP",
dtype=ChannelType.LONG,
doc="Num. elements in P",
read_only=True,
value=1,
)
num_elements_in_q = pvproperty(
name="NEQ",
dtype=ChannelType.LONG,
doc="Num. elements in Q",
read_only=True,
value=1,
)
num_elements_in_r = pvproperty(
name="NER",
dtype=ChannelType.LONG,
doc="Num. elements in R",
read_only=True,
value=1,
)
num_elements_in_s = pvproperty(
name="NES",
dtype=ChannelType.LONG,
doc="Num. elements in S",
read_only=True,
value=1,
)
num_elements_in_t = pvproperty(
name="NET",
dtype=ChannelType.LONG,
doc="Num. elements in T",
read_only=True,
value=1,
)
num_elements_in_u = pvproperty(
name="NEU",
dtype=ChannelType.LONG,
doc="Num. elements in U",
read_only=True,
value=1,
)
num_elements_in_vala = pvproperty(
name="NEVA",
dtype=ChannelType.LONG,
doc="Num. elements in VALA",
read_only=True,
value=1,
)
num_elements_in_valb = pvproperty(
name="NEVB",
dtype=ChannelType.LONG,
doc="Num. elements in VALB",
read_only=True,
value=1,
)
num_elements_in_valc = pvproperty(
name="NEVC",
dtype=ChannelType.LONG,
doc="Num. elements in VALC",
read_only=True,
value=1,
)
num_elements_in_vald = pvproperty(
name="NEVD",
dtype=ChannelType.LONG,
doc="Num. elements in VALD",
read_only=True,
value=1,
)
num_elements_in_vale = pvproperty(
name="NEVE",
dtype=ChannelType.LONG,
doc="Num. elements in VALE",
read_only=True,
value=1,
)
num_elements_in_valf = pvproperty(
name="NEVF",
dtype=ChannelType.LONG,
doc="Num. elements in VALF",
read_only=True,
value=1,
)
num_elements_in_valg = pvproperty(
name="NEVG",
dtype=ChannelType.LONG,
doc="Num. elements in VALG",
read_only=True,
value=1,
)
num_elements_in_valh = pvproperty(
name="NEVH",
dtype=ChannelType.LONG,
doc="Num. elements in VAlH",
read_only=True,
value=1,
)
num_elements_in_vali = pvproperty(
name="NEVI",
dtype=ChannelType.LONG,
doc="Num. elements in VALI",
read_only=True,
value=1,
)
num_elements_in_valj = pvproperty(
name="NEVJ",
dtype=ChannelType.LONG,
doc="Num. elements in VALJ",
read_only=True,
value=1,
)
num_elements_in_valk = pvproperty(
name="NEVK",
dtype=ChannelType.LONG,
doc="Num. elements in VALK",
read_only=True,
value=1,
)
num_elements_in_vall = pvproperty(
name="NEVL",
dtype=ChannelType.LONG,
doc="Num. elements in VALL",
read_only=True,
value=1,
)
num_elements_in_valm = pvproperty(
name="NEVM",
dtype=ChannelType.LONG,
doc="Num. elements in VALM",
read_only=True,
value=1,
)
num_elements_in_valn = pvproperty(
name="NEVN",
dtype=ChannelType.LONG,
doc="Num. elements in VALN",
read_only=True,
value=1,
)
num_elements_in_valo = pvproperty(
name="NEVO",
dtype=ChannelType.LONG,
doc="Num. elements in VALO",
read_only=True,
value=1,
)
num_elements_in_valp = pvproperty(
name="NEVP",
dtype=ChannelType.LONG,
doc="Num. elements in VALP",
read_only=True,
value=1,
)
num_elements_in_valq = pvproperty(
name="NEVQ",
dtype=ChannelType.LONG,
doc="Num. elements in VALQ",
read_only=True,
value=1,
)
num_elements_in_valr = pvproperty(
name="NEVR",
dtype=ChannelType.LONG,
doc="Num. elements in VALR",
read_only=True,
value=1,
)
num_elements_in_vals = pvproperty(
name="NEVS",
dtype=ChannelType.LONG,
doc="Num. elements in VALS",
read_only=True,
value=1,
)
num_elements_in_valt = pvproperty(
name="NEVT",
dtype=ChannelType.LONG,
doc="Num. elements in VALT",
read_only=True,
value=1,
)
num_elements_in_valu = pvproperty(
name="NEVU",
dtype=ChannelType.LONG,
doc="Num. elements in VALU",
read_only=True,
value=1,
)
max_elements_in_a = pvproperty(
name="NOA",
dtype=ChannelType.LONG,
doc="Max. elements in A",
read_only=True,
value=1,
)
max_elements_in_b = pvproperty(
name="NOB",
dtype=ChannelType.LONG,
doc="Max. elements in B",
read_only=True,
value=1,
)
max_elements_in_c = pvproperty(
name="NOC",
dtype=ChannelType.LONG,
doc="Max. elements in C",
read_only=True,
value=1,
)
max_elements_in_d = pvproperty(
name="NOD",
dtype=ChannelType.LONG,
doc="Max. elements in D",
read_only=True,
value=1,
)
max_elements_in_e = pvproperty(
name="NOE",
dtype=ChannelType.LONG,
doc="Max. elements in E",
read_only=True,
value=1,
)
max_elements_in_f = pvproperty(
name="NOF",
dtype=ChannelType.LONG,
doc="Max. elements in F",
read_only=True,
value=1,
)
max_elements_in_g = pvproperty(
name="NOG",
dtype=ChannelType.LONG,
doc="Max. elements in G",
read_only=True,
value=1,
)
max_elements_in_h = pvproperty(
name="NOH",
dtype=ChannelType.LONG,
doc="Max. elements in H",
read_only=True,
value=1,
)
max_elements_in_i = pvproperty(
name="NOI",
dtype=ChannelType.LONG,
doc="Max. elements in I",
read_only=True,
value=1,
)
max_elements_in_j = pvproperty(
name="NOJ",
dtype=ChannelType.LONG,
doc="Max. elements in J",
read_only=True,
value=1,
)
max_elements_in_k = pvproperty(
name="NOK",
dtype=ChannelType.LONG,
doc="Max. elements in K",
read_only=True,
value=1,
)
max_elements_in_l = pvproperty(
name="NOL",
dtype=ChannelType.LONG,
doc="Max. elements in L",
read_only=True,
value=1,
)
max_elements_in_m = pvproperty(
name="NOM",
dtype=ChannelType.LONG,
doc="Max. elements in M",
read_only=True,
value=1,
)
max_elements_in_n = pvproperty(
name="NON",
dtype=ChannelType.LONG,
doc="Max. elements in N",
read_only=True,
value=1,
)
max_elements_in_o = pvproperty(
name="NOO",
dtype=ChannelType.LONG,
doc="Max. elements in O",
read_only=True,
value=1,
)
max_elements_in_p = pvproperty(
name="NOP",
dtype=ChannelType.LONG,
doc="Max. elements in P",
read_only=True,
value=1,
)
max_elements_in_q = pvproperty(
name="NOQ",
dtype=ChannelType.LONG,
doc="Max. elements in Q",
read_only=True,
value=1,
)
max_elements_in_r = pvproperty(
name="NOR",
dtype=ChannelType.LONG,
doc="Max. elements in R",
read_only=True,
value=1,
)
max_elements_in_s = pvproperty(
name="NOS",
dtype=ChannelType.LONG,
doc="Max. elements in S",
read_only=True,
value=1,
)
max_elements_in_t = pvproperty(
name="NOT",
dtype=ChannelType.LONG,
doc="Max. elements in T",
read_only=True,
value=1,
)
max_elements_in_u = pvproperty(
name="NOU",
dtype=ChannelType.LONG,
doc="Max. elements in U",
read_only=True,
value=1,
)
max_elements_in_vala = pvproperty(
name="NOVA",
dtype=ChannelType.LONG,
doc="Max. elements in VALA",
read_only=True,
value=1,
)
max_elements_in_valb = pvproperty(
name="NOVB",
dtype=ChannelType.LONG,
doc="Max. elements in VALB",
read_only=True,
value=1,
)
max_elements_in_valc = pvproperty(
name="NOVC",
dtype=ChannelType.LONG,
doc="Max. elements in VALC",
read_only=True,
value=1,
)
max_elements_in_vald = pvproperty(
name="NOVD",
dtype=ChannelType.LONG,
doc="Max. elements in VALD",
read_only=True,
value=1,
)
max_elements_in_vale = pvproperty(
name="NOVE",
dtype=ChannelType.LONG,
doc="Max. elements in VALE",
read_only=True,
value=1,
)
max_elements_in_valf = pvproperty(
name="NOVF",
dtype=ChannelType.LONG,
doc="Max. elements in VALF",
read_only=True,
value=1,
)
max_elements_in_valg = pvproperty(
name="NOVG",
dtype=ChannelType.LONG,
doc="Max. elements in VALG",
read_only=True,
value=1,
)
max_elements_in_valh = pvproperty(
name="NOVH",
dtype=ChannelType.LONG,
doc="Max. elements in VAlH",
read_only=True,
value=1,
)
max_elements_in_vali = pvproperty(
name="NOVI",
dtype=ChannelType.LONG,
doc="Max. elements in VALI",
read_only=True,
value=1,
)
max_elements_in_valj = pvproperty(
name="NOVJ",
dtype=ChannelType.LONG,
doc="Max. elements in VALJ",
read_only=True,
value=1,
)
max_elements_in_valk = pvproperty(
name="NOVK",
dtype=ChannelType.LONG,
doc="Max. elements in VALK",
read_only=True,
value=1,
)
max_elements_in_vall = pvproperty(
name="NOVL",
dtype=ChannelType.LONG,
doc="Max. elements in VALL",
read_only=True,
value=1,
)
max_elements_in_valm = pvproperty(
name="NOVM",
dtype=ChannelType.LONG,
doc="Max. elements in VALM",
read_only=True,
value=1,
)
max_elements_in_valn = pvproperty(
name="NOVN",
dtype=ChannelType.LONG,
doc="Max. elements in VALN",
read_only=True,
value=1,
)
max_elements_in_valo = pvproperty(
name="NOVO",
dtype=ChannelType.LONG,
doc="Max. elements in VALO",
read_only=True,
value=1,
)
max_elements_in_valp = pvproperty(
name="NOVP",
dtype=ChannelType.LONG,
doc="Max. elements in VALP",
read_only=True,
value=1,
)
max_elements_in_valq = pvproperty(
name="NOVQ",
dtype=ChannelType.LONG,
doc="Max. elements in VALQ",
read_only=True,
value=1,
)
max_elements_in_valr = pvproperty(
name="NOVR",
dtype=ChannelType.LONG,
doc="Max. elements in VALR",
read_only=True,
value=1,
)
max_elements_in_vals = pvproperty(
name="NOVS",
dtype=ChannelType.LONG,
doc="Max. elements in VALS",
read_only=True,
value=1,
)
max_elements_in_valt = pvproperty(
name="NOVT",
dtype=ChannelType.LONG,
doc="Max. elements in VALT",
read_only=True,
value=1,
)
max_elements_in_valu = pvproperty(
name="NOVU",
dtype=ChannelType.LONG,
doc="Max. elements in VALU",
read_only=True,
value=1,
)
old_subr_name = pvproperty(
name="ONAM",
dtype=ChannelType.CHAR,
max_length=41,
report_as_string=True,
doc="Old Subr. Name",
read_only=True,
)
num_elements_in_ovla = pvproperty(
name="ONVA",
dtype=ChannelType.LONG,
doc="Num. elements in OVLA",
read_only=True,
value=1,
)
num_elements_in_ovlb = pvproperty(
name="ONVB",
dtype=ChannelType.LONG,
doc="Num. elements in OVLB",
read_only=True,
value=1,
)
num_elements_in_ovlc = pvproperty(
name="ONVC",
dtype=ChannelType.LONG,
doc="Num. elements in OVLC",
read_only=True,
value=1,
)
num_elements_in_ovld = pvproperty(
name="ONVD",
dtype=ChannelType.LONG,
doc="Num. elements in OVLD",
read_only=True,
value=1,
)
num_elements_in_ovle = pvproperty(
name="ONVE",
dtype=ChannelType.LONG,
doc="Num. elements in OVLE",
read_only=True,
value=1,
)
num_elements_in_ovlf = pvproperty(
name="ONVF",
dtype=ChannelType.LONG,
doc="Num. elements in OVLF",
read_only=True,
value=1,
)
num_elements_in_ovlg = pvproperty(
name="ONVG",
dtype=ChannelType.LONG,
doc="Num. elements in OVLG",
read_only=True,
value=1,
)
num_elements_in_ovlh = pvproperty(
name="ONVH",
dtype=ChannelType.LONG,
doc="Num. elements in VAlH",
read_only=True,
value=1,
)
num_elements_in_ovli = pvproperty(
name="ONVI",
dtype=ChannelType.LONG,
doc="Num. elements in OVLI",
read_only=True,
value=1,
)
num_elements_in_ovlj = pvproperty(
name="ONVJ",
dtype=ChannelType.LONG,
doc="Num. elements in OVLJ",
read_only=True,
value=1,
)
num_elements_in_ovlk = pvproperty(
name="ONVK",
dtype=ChannelType.LONG,
doc="Num. elements in OVLK",
read_only=True,
value=1,
)
num_elements_in_ovll = pvproperty(
name="ONVL",
dtype=ChannelType.LONG,
doc="Num. elements in OVLL",
read_only=True,
value=1,
)
num_elements_in_ovlm = pvproperty(
name="ONVM",
dtype=ChannelType.LONG,
doc="Num. elements in OVLM",
read_only=True,
value=1,
)
num_elements_in_ovln = pvproperty(
name="ONVN",
dtype=ChannelType.LONG,
doc="Num. elements in OVLN",
read_only=True,
value=1,
)
num_elements_in_ovlo = pvproperty(
name="ONVO",
dtype=ChannelType.LONG,
doc="Num. elements in OVLO",
read_only=True,
value=1,
)
num_elements_in_ovlp = pvproperty(
name="ONVP",
dtype=ChannelType.LONG,
doc="Num. elements in OVLP",
read_only=True,
value=1,
)
num_elements_in_ovlq = pvproperty(
name="ONVQ",
dtype=ChannelType.LONG,
doc="Num. elements in OVLQ",
read_only=True,
value=1,
)
num_elements_in_ovlr = pvproperty(
name="ONVR",
dtype=ChannelType.LONG,
doc="Num. elements in OVLR",
read_only=True,
value=1,
)
num_elements_in_ovls = pvproperty(
name="ONVS",
dtype=ChannelType.LONG,
doc="Num. elements in OVLS",
read_only=True,
value=1,
)
num_elements_in_ovlt = pvproperty(
name="ONVT",
dtype=ChannelType.LONG,
doc="Num. elements in OVLT",
read_only=True,
value=1,
)
num_elements_in_ovlu = pvproperty(
name="ONVU",
dtype=ChannelType.LONG,
doc="Num. elements in OVLU",
read_only=True,
value=1,
)
output_link_a = pvproperty(
name="OUTA", dtype=ChannelType.STRING, doc="Output Link A"
)
output_link_b = pvproperty(
name="OUTB", dtype=ChannelType.STRING, doc="Output Link B"
)
output_link_c = pvproperty(
name="OUTC", dtype=ChannelType.STRING, doc="Output Link C"
)
output_link_d = pvproperty(
name="OUTD", dtype=ChannelType.STRING, doc="Output Link D"
)
output_link_e = pvproperty(
name="OUTE", dtype=ChannelType.STRING, doc="Output Link E"
)
output_link_f = pvproperty(
name="OUTF", dtype=ChannelType.STRING, doc="Output Link F"
)
output_link_g = pvproperty(
name="OUTG", dtype=ChannelType.STRING, doc="Output Link G"
)
output_link_h = pvproperty(
name="OUTH", dtype=ChannelType.STRING, doc="Output Link H"
)
output_link_i = pvproperty(
name="OUTI", dtype=ChannelType.STRING, doc="Output Link I"
)
output_link_j = pvproperty(
name="OUTJ", dtype=ChannelType.STRING, doc="Output Link J"
)
output_link_k = pvproperty(
name="OUTK", dtype=ChannelType.STRING, doc="Output Link K"
)
output_link_l = pvproperty(
name="OUTL", dtype=ChannelType.STRING, doc="Output Link L"
)
output_link_m = pvproperty(
name="OUTM", dtype=ChannelType.STRING, doc="Output Link M"
)
output_link_n = pvproperty(
name="OUTN", dtype=ChannelType.STRING, doc="Output Link N"
)
output_link_o = pvproperty(
name="OUTO", dtype=ChannelType.STRING, doc="Output Link O"
)
output_link_p = pvproperty(
name="OUTP", dtype=ChannelType.STRING, doc="Output Link P"
)
output_link_q = pvproperty(
name="OUTQ", dtype=ChannelType.STRING, doc="Output Link Q"
)
output_link_r = pvproperty(
name="OUTR", dtype=ChannelType.STRING, doc="Output Link R"
)
output_link_s = pvproperty(
name="OUTS", dtype=ChannelType.STRING, doc="Output Link S"
)
output_link_t = pvproperty(
name="OUTT", dtype=ChannelType.STRING, doc="Output Link T"
)
output_link_u = pvproperty(
name="OUTU", dtype=ChannelType.STRING, doc="Output Link U"
)
old_return_value = pvproperty(
name="OVAL",
dtype=ChannelType.LONG,
doc="Old return value",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
process_subr_name = pvproperty(
name="SNAM",
dtype=ChannelType.CHAR,
max_length=41,
report_as_string=True,
doc="Process Subr. Name",
)
subroutine_name_link = pvproperty(
name="SUBL",
dtype=ChannelType.STRING,
doc="Subroutine Name Link",
read_only=True,
)
# subr_return_value = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Subr. return value')
link_parent_attribute(
display_precision, "precision",
)
class AaiFields(RecordFieldGroup):
_record_type = "aai"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_aai.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.aaiPOST.get_string_tuple(),
doc="Post Archive Monitors",
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
field_type_of_value = pvproperty(
name="FTVL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Field Type of Value",
read_only=True,
)
hash_of_onchange_data = pvproperty(
name="HASH", dtype=ChannelType.LONG, doc="Hash of OnChange data."
)
high_operating_range = pvproperty(
name="HOPR", dtype=ChannelType.DOUBLE, doc="High Operating Range"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
low_operating_range = pvproperty(
name="LOPR", dtype=ChannelType.DOUBLE, doc="Low Operating Range"
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.aaiPOST.get_string_tuple(),
doc="Post Value Monitors",
)
number_of_elements = pvproperty(
name="NELM",
dtype=ChannelType.LONG,
doc="Number of Elements",
read_only=True,
value=1,
)
number_elements_read = pvproperty(
name="NORD",
dtype=ChannelType.LONG,
doc="Number elements read",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(
high_operating_range, "upper_ctrl_limit",
)
link_parent_attribute(
low_operating_range, "lower_ctrl_limit",
)
link_parent_attribute(
number_of_elements, "max_length", use_setattr=True, read_only=True
)
class AaoFields(RecordFieldGroup):
_record_type = "aao"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_aao.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.aaoPOST.get_string_tuple(),
doc="Post Archive Monitors",
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
field_type_of_value = pvproperty(
name="FTVL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Field Type of Value",
read_only=True,
)
hash_of_onchange_data = pvproperty(
name="HASH", dtype=ChannelType.LONG, doc="Hash of OnChange data."
)
high_operating_range = pvproperty(
name="HOPR", dtype=ChannelType.DOUBLE, doc="High Operating Range"
)
low_operating_range = pvproperty(
name="LOPR", dtype=ChannelType.DOUBLE, doc="Low Operating Range"
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.aaoPOST.get_string_tuple(),
doc="Post Value Monitors",
)
number_of_elements = pvproperty(
name="NELM",
dtype=ChannelType.LONG,
doc="Number of Elements",
read_only=True,
value=1,
)
number_elements_read = pvproperty(
name="NORD",
dtype=ChannelType.LONG,
doc="Number elements read",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(
high_operating_range, "upper_ctrl_limit",
)
link_parent_attribute(
low_operating_range, "lower_ctrl_limit",
)
link_parent_attribute(
number_of_elements, "max_length", use_setattr=True, read_only=True
)
class AoFields(RecordFieldGroup, _Limits):
_record_type = "ao"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_ao.get_string_tuple(),
doc="Device Type",
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
adjustment_offset = pvproperty(
name="AOFF", dtype=ChannelType.DOUBLE, doc="Adjustment Offset"
)
adjustment_slope = pvproperty(
name="ASLO", dtype=ChannelType.DOUBLE, doc="Adjustment Slope"
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
drive_high_limit = pvproperty(
name="DRVH", dtype=ChannelType.DOUBLE, doc="Drive High Limit"
)
drive_low_limit = pvproperty(
name="DRVL", dtype=ChannelType.DOUBLE, doc="Drive Low Limit"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
eng_units_full = pvproperty(
name="EGUF", dtype=ChannelType.DOUBLE, doc="Eng Units Full"
)
eng_units_low = pvproperty(
name="EGUL", dtype=ChannelType.DOUBLE, doc="Eng Units Low"
)
egu_to_raw_offset = pvproperty(
name="EOFF", dtype=ChannelType.DOUBLE, doc="EGU to Raw Offset"
)
egu_to_raw_slope = pvproperty(
name="ESLO", dtype=ChannelType.DOUBLE, doc="EGU to Raw Slope", value=1
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
initialized = pvproperty(
name="INIT", dtype=ChannelType.INT, doc="Initialized?", read_only=True
)
invalid_output_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID output action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.DOUBLE, doc="INVALID output value"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
lastbreak_point = pvproperty(
name="LBRK",
dtype=ChannelType.INT,
doc="LastBreak Point",
read_only=True,
)
linearization = pvproperty(
name="LINR",
dtype=ChannelType.ENUM,
enum_strings=menus.menuConvert.get_string_tuple(),
doc="Linearization",
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
out_full_incremental = pvproperty(
name="OIF",
dtype=ChannelType.ENUM,
enum_strings=menus.aoOIF.get_string_tuple(),
doc="Out Full/Incremental",
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
was_oval_modified = pvproperty(
name="OMOD",
dtype=ChannelType.CHAR,
doc="Was OVAL modified?",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
previous_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="Previous Raw Value",
read_only=True,
)
prev_readback_value = pvproperty(
name="ORBV",
dtype=ChannelType.LONG,
doc="Prev Readback Value",
read_only=True,
)
output_rate_of_change = pvproperty(
name="OROC", dtype=ChannelType.DOUBLE, doc="Output Rate of Change"
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
output_value = pvproperty(
name="OVAL", dtype=ChannelType.DOUBLE, doc="Output Value"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
previous_value = pvproperty(
name="PVAL",
dtype=ChannelType.DOUBLE,
doc="Previous value",
read_only=True,
)
readback_value = pvproperty(
name="RBV", dtype=ChannelType.LONG, doc="Readback Value", read_only=True
)
raw_offset = pvproperty(
name="ROFF", dtype=ChannelType.LONG, doc="Raw Offset"
)
current_raw_value = pvproperty(
name="RVAL", dtype=ChannelType.LONG, doc="Current Raw Value"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
# desired_output = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Desired Output')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class AsynFields(RecordFieldGroup):
_record_type = "asyn"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_asyn.get_string_tuple(),
doc="Device Type",
)
addressed_command = pvproperty(
name="ACMD",
dtype=ChannelType.ENUM,
enum_strings=menus.gpibACMD.get_string_tuple(),
doc="Addressed command",
)
asyn_address = pvproperty(
name="ADDR", dtype=ChannelType.LONG, doc="asyn address", value=0
)
input = pvproperty(
name="AINP",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Input (response) string",
read_only=True,
)
output = pvproperty(
name="AOUT",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Output (command) string",
)
abort_queuerequest = pvproperty(
name="AQR", dtype=ChannelType.CHAR, doc="Abort queueRequest"
)
autoconnect = pvproperty(
name="AUCT",
dtype=ChannelType.ENUM,
enum_strings=menus.asynAUTOCONNECT.get_string_tuple(),
doc="Autoconnect",
)
baud_rate = pvproperty(
name="BAUD",
dtype=ChannelType.ENUM,
enum_strings=menus.serialBAUD.get_string_tuple(),
doc="Baud rate",
)
input_binary_data = pvproperty(
name="BINP", dtype=ChannelType.CHAR, doc="Input binary data"
)
output_binary_data = pvproperty(
name="BOUT", dtype=ChannelType.CHAR, doc="Output binary data"
)
connect_disconnect = pvproperty(
name="CNCT",
dtype=ChannelType.ENUM,
enum_strings=menus.asynCONNECT.get_string_tuple(),
doc="Connect/Disconnect",
)
data_bits = pvproperty(
name="DBIT",
dtype=ChannelType.ENUM,
enum_strings=menus.serialDBIT.get_string_tuple(),
doc="Data bits",
)
disconnect_on_timeout = pvproperty(
name="DRTO",
dtype=ChannelType.ENUM,
enum_strings=menus.ipDRTO.get_string_tuple(),
doc="Disconnect on timeout",
)
driver_info_string = pvproperty(
name="DRVINFO",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Driver info string",
value="",
)
enable_disable = pvproperty(
name="ENBL",
dtype=ChannelType.ENUM,
enum_strings=menus.asynENABLE.get_string_tuple(),
doc="Enable/Disable",
)
eom_reason = pvproperty(
name="EOMR",
dtype=ChannelType.ENUM,
enum_strings=menus.asynEOMREASON.get_string_tuple(),
doc="EOM reason",
read_only=True,
)
asynfloat64_input = pvproperty(
name="F64INP",
dtype=ChannelType.DOUBLE,
doc="asynFloat64 input",
read_only=True,
)
asynfloat64_is_valid = pvproperty(
name="F64IV", dtype=ChannelType.LONG, doc="asynFloat64 is valid"
)
asynfloat64_output = pvproperty(
name="F64OUT", dtype=ChannelType.DOUBLE, doc="asynFloat64 output"
)
flow_control = pvproperty(
name="FCTL",
dtype=ChannelType.ENUM,
enum_strings=menus.serialFCTL.get_string_tuple(),
doc="Flow control",
)
asyngpib_is_valid = pvproperty(
name="GPIBIV", dtype=ChannelType.LONG, doc="asynGPIB is valid"
)
host_info = pvproperty(
name="HOSTINFO",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="host info",
value="",
)
asynint32_input = pvproperty(
name="I32INP",
dtype=ChannelType.LONG,
doc="asynInt32 input",
read_only=True,
)
asynint32_is_valid = pvproperty(
name="I32IV", dtype=ChannelType.LONG, doc="asynInt32 is valid"
)
asynint32_output = pvproperty(
name="I32OUT", dtype=ChannelType.LONG, doc="asynInt32 output"
)
input_delimiter = pvproperty(
name="IEOS",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Input Delimiter",
)
interface = pvproperty(
name="IFACE",
dtype=ChannelType.ENUM,
enum_strings=menus.asynINTERFACE.get_string_tuple(),
doc="Interface",
)
input_format = pvproperty(
name="IFMT",
dtype=ChannelType.ENUM,
enum_strings=menus.asynFMT.get_string_tuple(),
doc="Input format",
)
max_size_of_input_array = pvproperty(
name="IMAX",
dtype=ChannelType.LONG,
doc="Max. size of input array",
read_only=True,
value=80,
)
xon_any_character = pvproperty(
name="IXANY",
dtype=ChannelType.ENUM,
enum_strings=menus.serialIX.get_string_tuple(),
doc="XON=any character",
)
input_xon_xoff = pvproperty(
name="IXOFF",
dtype=ChannelType.ENUM,
enum_strings=menus.serialIX.get_string_tuple(),
doc="Input XON/XOFF",
)
output_xon_xoff = pvproperty(
name="IXON",
dtype=ChannelType.ENUM,
enum_strings=menus.serialIX.get_string_tuple(),
doc="Output XON/XOFF",
)
long_baud_rate = pvproperty(
name="LBAUD", dtype=ChannelType.LONG, doc="Baud rate"
)
modem_control = pvproperty(
name="MCTL",
dtype=ChannelType.ENUM,
enum_strings=menus.serialMCTL.get_string_tuple(),
doc="Modem control",
)
number_of_bytes_actually_written = pvproperty(
name="NAWT",
dtype=ChannelType.LONG,
doc="Number of bytes actually written",
)
number_of_bytes_read = pvproperty(
name="NORD",
dtype=ChannelType.LONG,
doc="Number of bytes read",
read_only=True,
)
number_of_bytes_to_write = pvproperty(
name="NOWT",
dtype=ChannelType.LONG,
doc="Number of bytes to write",
value=80,
)
number_of_bytes_to_read = pvproperty(
name="NRRD", dtype=ChannelType.LONG, doc="Number of bytes to read"
)
asynoctet_is_valid = pvproperty(
name="OCTETIV", dtype=ChannelType.LONG, doc="asynOctet is valid"
)
output_delimiter = pvproperty(
name="OEOS",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Output delimiter",
)
output_format = pvproperty(
name="OFMT",
dtype=ChannelType.ENUM,
enum_strings=menus.asynFMT.get_string_tuple(),
doc="Output format",
)
max_size_of_output_array = pvproperty(
name="OMAX",
dtype=ChannelType.LONG,
doc="Max. size of output array",
read_only=True,
value=80,
)
asynoption_is_valid = pvproperty(
name="OPTIONIV", dtype=ChannelType.LONG, doc="asynOption is valid"
)
port_connect_disconnect = pvproperty(
name="PCNCT",
dtype=ChannelType.ENUM,
enum_strings=menus.asynCONNECT.get_string_tuple(),
doc="Port Connect/Disconnect",
)
asyn_port = pvproperty(
name="PORT",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="asyn port",
value="",
)
parity = pvproperty(
name="PRTY",
dtype=ChannelType.ENUM,
enum_strings=menus.serialPRTY.get_string_tuple(),
doc="Parity",
)
asynuser_reason = pvproperty(
name="REASON", dtype=ChannelType.LONG, doc="asynUser->reason"
)
stop_bits = pvproperty(
name="SBIT",
dtype=ChannelType.ENUM,
enum_strings=menus.serialSBIT.get_string_tuple(),
doc="Stop bits",
)
serial_poll_response = pvproperty(
name="SPR",
dtype=ChannelType.CHAR,
doc="Serial poll response",
read_only=True,
)
trace_error = pvproperty(
name="TB0",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace error",
)
trace_io_device = pvproperty(
name="TB1",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace IO device",
)
trace_io_filter = pvproperty(
name="TB2",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace IO filter",
)
trace_io_driver = pvproperty(
name="TB3",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace IO driver",
)
trace_flow = pvproperty(
name="TB4",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace flow",
)
trace_warning = pvproperty(
name="TB5",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace warning",
)
trace_io_file = pvproperty(
name="TFIL",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Trace IO file",
)
trace_io_ascii = pvproperty(
name="TIB0",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace IO ASCII",
)
trace_io_escape = pvproperty(
name="TIB1",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace IO escape",
)
trace_io_hex = pvproperty(
name="TIB2",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace IO hex",
)
trace_info_time = pvproperty(
name="TINB0",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace Info Time",
)
trace_info_port = pvproperty(
name="TINB1",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace Info Port",
)
trace_info_source = pvproperty(
name="TINB2",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace Info Source",
)
trace_info_thread = pvproperty(
name="TINB3",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTRACE.get_string_tuple(),
doc="Trace Info Thread",
)
trace_info_mask = pvproperty(
name="TINM", dtype=ChannelType.LONG, doc="Trace Info mask"
)
translated_input_string = pvproperty(
name="TINP",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Translated input string",
read_only=True,
)
trace_i_o_mask = pvproperty(
name="TIOM", dtype=ChannelType.LONG, doc="Trace I/O mask"
)
transaction_mode = pvproperty(
name="TMOD",
dtype=ChannelType.ENUM,
enum_strings=menus.asynTMOD.get_string_tuple(),
doc="Transaction mode",
)
timeout = pvproperty(
name="TMOT", dtype=ChannelType.DOUBLE, doc="Timeout (sec)", value=1.0
)
trace_mask = pvproperty(
name="TMSK", dtype=ChannelType.LONG, doc="Trace mask"
)
trace_io_truncate_size = pvproperty(
name="TSIZ", dtype=ChannelType.LONG, doc="Trace IO truncate size"
)
universal_command = pvproperty(
name="UCMD",
dtype=ChannelType.ENUM,
enum_strings=menus.gpibUCMD.get_string_tuple(),
doc="Universal command",
)
asynuint32digital_input = pvproperty(
name="UI32INP",
dtype=ChannelType.LONG,
doc="asynUInt32Digital input",
read_only=True,
)
asynuint32digital_is_valid = pvproperty(
name="UI32IV", dtype=ChannelType.LONG, doc="asynUInt32Digital is valid"
)
asynuint32digital_mask = pvproperty(
name="UI32MASK",
dtype=ChannelType.LONG,
doc="asynUInt32Digital mask",
value=4294967295,
)
asynuint32digital_output = pvproperty(
name="UI32OUT", dtype=ChannelType.LONG, doc="asynUInt32Digital output"
)
# value_field = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Value field (unused)')
class BiFields(RecordFieldGroup):
_record_type = "bi"
_dtype = ChannelType.ENUM # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_bi.get_string_tuple(),
doc="Device Type",
)
change_of_state_svr = pvproperty(
name="COSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Change of State Svr",
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.INT,
doc="Last Value Alarmed",
read_only=True,
)
hardware_mask = pvproperty(
name="MASK", dtype=ChannelType.LONG, doc="Hardware Mask", read_only=True
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.INT,
doc="Last Value Monitored",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
one_name = pvproperty(
name="ONAM",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="One Name",
)
prev_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="prev Raw Value",
read_only=True,
)
one_error_severity = pvproperty(
name="OSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="One Error Severity",
)
raw_value = pvproperty(name="RVAL", dtype=ChannelType.LONG, doc="Raw Value")
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.LONG, doc="Simulation Value"
)
zero_name = pvproperty(
name="ZNAM",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Zero Name",
)
zero_error_severity = pvproperty(
name="ZSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Zero Error Severity",
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.ENUM,
# doc='Current Value')
class BoFields(RecordFieldGroup):
_record_type = "bo"
_dtype = ChannelType.ENUM # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_bo.get_string_tuple(),
doc="Device Type",
)
change_of_state_sevr = pvproperty(
name="COSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Change of State Sevr",
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
seconds_to_hold_high = pvproperty(
name="HIGH", dtype=ChannelType.DOUBLE, doc="Seconds to Hold High"
)
invalid_outpt_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID outpt action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.INT, doc="INVALID output value"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.INT,
doc="Last Value Alarmed",
read_only=True,
)
hardware_mask = pvproperty(
name="MASK", dtype=ChannelType.LONG, doc="Hardware Mask", read_only=True
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.INT,
doc="Last Value Monitored",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
one_name = pvproperty(
name="ONAM",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="One Name",
)
prev_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="prev Raw Value",
read_only=True,
)
prev_readback_value = pvproperty(
name="ORBV",
dtype=ChannelType.LONG,
doc="Prev Readback Value",
read_only=True,
)
one_error_severity = pvproperty(
name="OSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="One Error Severity",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
readback_value = pvproperty(
name="RBV", dtype=ChannelType.LONG, doc="Readback Value", read_only=True
)
raw_value = pvproperty(name="RVAL", dtype=ChannelType.LONG, doc="Raw Value")
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
zero_name = pvproperty(
name="ZNAM",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Zero Name",
)
zero_error_severity = pvproperty(
name="ZSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Zero Error Severity",
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.ENUM,
# doc='Current Value')
class CalcFields(RecordFieldGroup, _Limits):
_record_type = "calc"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_calc.get_string_tuple(),
doc="Device Type",
)
value_of_input_a = pvproperty(
name="A", dtype=ChannelType.DOUBLE, doc="Value of Input A"
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
alarm_filter_time_constant = pvproperty(
name="AFTC", dtype=ChannelType.DOUBLE, doc="Alarm Filter Time Constant"
)
alarm_filter_value = pvproperty(
name="AFVL",
dtype=ChannelType.DOUBLE,
doc="Alarm Filter Value",
read_only=True,
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
value_of_input_b = pvproperty(
name="B", dtype=ChannelType.DOUBLE, doc="Value of Input B"
)
value_of_input_c = pvproperty(
name="C", dtype=ChannelType.DOUBLE, doc="Value of Input C"
)
calculation = pvproperty(
name="CALC",
dtype=ChannelType.CHAR,
max_length=80,
report_as_string=True,
doc="Calculation",
value=chr(0),
)
value_of_input_d = pvproperty(
name="D", dtype=ChannelType.DOUBLE, doc="Value of Input D"
)
value_of_input_e = pvproperty(
name="E", dtype=ChannelType.DOUBLE, doc="Value of Input E"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
value_of_input_f = pvproperty(
name="F", dtype=ChannelType.DOUBLE, doc="Value of Input F"
)
value_of_input_g = pvproperty(
name="G", dtype=ChannelType.DOUBLE, doc="Value of Input G"
)
value_of_input_h = pvproperty(
name="H", dtype=ChannelType.DOUBLE, doc="Value of Input H"
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
value_of_input_i = pvproperty(
name="I", dtype=ChannelType.DOUBLE, doc="Value of Input I"
)
input_a = pvproperty(name="INPA", dtype=ChannelType.STRING, doc="Input A")
input_b = pvproperty(name="INPB", dtype=ChannelType.STRING, doc="Input B")
input_c = pvproperty(name="INPC", dtype=ChannelType.STRING, doc="Input C")
input_d = pvproperty(name="INPD", dtype=ChannelType.STRING, doc="Input D")
input_e = pvproperty(name="INPE", dtype=ChannelType.STRING, doc="Input E")
input_f = pvproperty(name="INPF", dtype=ChannelType.STRING, doc="Input F")
input_g = pvproperty(name="INPG", dtype=ChannelType.STRING, doc="Input G")
input_h = pvproperty(name="INPH", dtype=ChannelType.STRING, doc="Input H")
input_i = pvproperty(name="INPI", dtype=ChannelType.STRING, doc="Input I")
input_j = pvproperty(name="INPJ", dtype=ChannelType.STRING, doc="Input J")
input_k = pvproperty(name="INPK", dtype=ChannelType.STRING, doc="Input K")
input_l = pvproperty(name="INPL", dtype=ChannelType.STRING, doc="Input L")
value_of_input_j = pvproperty(
name="J", dtype=ChannelType.DOUBLE, doc="Value of Input J"
)
value_of_input_k = pvproperty(
name="K", dtype=ChannelType.DOUBLE, doc="Value of Input K"
)
value_of_input_l = pvproperty(
name="L", dtype=ChannelType.DOUBLE, doc="Value of Input L"
)
prev_value_of_a = pvproperty(
name="LA",
dtype=ChannelType.DOUBLE,
doc="Prev Value of A",
read_only=True,
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
prev_value_of_b = pvproperty(
name="LB",
dtype=ChannelType.DOUBLE,
doc="Prev Value of B",
read_only=True,
)
prev_value_of_c = pvproperty(
name="LC",
dtype=ChannelType.DOUBLE,
doc="Prev Value of C",
read_only=True,
)
prev_value_of_d = pvproperty(
name="LD",
dtype=ChannelType.DOUBLE,
doc="Prev Value of D",
read_only=True,
)
prev_value_of_e = pvproperty(
name="LE",
dtype=ChannelType.DOUBLE,
doc="Prev Value of E",
read_only=True,
)
prev_value_of_f = pvproperty(
name="LF",
dtype=ChannelType.DOUBLE,
doc="Prev Value of F",
read_only=True,
)
prev_value_of_g = pvproperty(
name="LG",
dtype=ChannelType.DOUBLE,
doc="Prev Value of G",
read_only=True,
)
prev_value_of_h = pvproperty(
name="LH",
dtype=ChannelType.DOUBLE,
doc="Prev Value of H",
read_only=True,
)
prev_value_of_i = pvproperty(
name="LI",
dtype=ChannelType.DOUBLE,
doc="Prev Value of I",
read_only=True,
)
prev_value_of_j = pvproperty(
name="LJ",
dtype=ChannelType.DOUBLE,
doc="Prev Value of J",
read_only=True,
)
prev_value_of_k = pvproperty(
name="LK",
dtype=ChannelType.DOUBLE,
doc="Prev Value of K",
read_only=True,
)
prev_value_of_l = pvproperty(
name="LL",
dtype=ChannelType.DOUBLE,
doc="Prev Value of L",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
# result = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Result')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class CalcoutFields(RecordFieldGroup, _Limits):
_record_type = "calcout"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_calcout.get_string_tuple(),
doc="Device Type",
)
value_of_input_a = pvproperty(
name="A", dtype=ChannelType.DOUBLE, doc="Value of Input A"
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
value_of_input_b = pvproperty(
name="B", dtype=ChannelType.DOUBLE, doc="Value of Input B"
)
value_of_input_c = pvproperty(
name="C", dtype=ChannelType.DOUBLE, doc="Value of Input C"
)
calculation = pvproperty(
name="CALC",
dtype=ChannelType.CHAR,
max_length=80,
report_as_string=True,
doc="Calculation",
value=chr(0),
)
calc_valid = pvproperty(
name="CLCV", dtype=ChannelType.LONG, doc="CALC Valid"
)
value_of_input_d = pvproperty(
name="D", dtype=ChannelType.DOUBLE, doc="Value of Input D"
)
output_delay_active = pvproperty(
name="DLYA",
dtype=ChannelType.INT,
doc="Output Delay Active",
read_only=True,
)
output_data_opt = pvproperty(
name="DOPT",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutDOPT.get_string_tuple(),
doc="Output Data Opt",
)
value_of_input_e = pvproperty(
name="E", dtype=ChannelType.DOUBLE, doc="Value of Input E"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
value_of_input_f = pvproperty(
name="F", dtype=ChannelType.DOUBLE, doc="Value of Input F"
)
value_of_input_g = pvproperty(
name="G", dtype=ChannelType.DOUBLE, doc="Value of Input G"
)
value_of_input_h = pvproperty(
name="H", dtype=ChannelType.DOUBLE, doc="Value of Input H"
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
value_of_input_i = pvproperty(
name="I", dtype=ChannelType.DOUBLE, doc="Value of Input I"
)
inpa_pv_status = pvproperty(
name="INAV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPA PV Status",
read_only=True,
value=1,
)
inpb_pv_status = pvproperty(
name="INBV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPB PV Status",
read_only=True,
value=1,
)
inpc_pv_status = pvproperty(
name="INCV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPC PV Status",
read_only=True,
value=1,
)
inpd_pv_status = pvproperty(
name="INDV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPD PV Status",
read_only=True,
value=1,
)
inpe_pv_status = pvproperty(
name="INEV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPE PV Status",
read_only=True,
value=1,
)
inpf_pv_status = pvproperty(
name="INFV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPF PV Status",
read_only=True,
value=1,
)
inpg_pv_status = pvproperty(
name="INGV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPG PV Status",
read_only=True,
value=1,
)
inph_pv_status = pvproperty(
name="INHV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPH PV Status",
read_only=True,
value=1,
)
inpi_pv_status = pvproperty(
name="INIV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPI PV Status",
read_only=True,
value=1,
)
inpj_pv_status = pvproperty(
name="INJV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPJ PV Status",
read_only=True,
value=1,
)
inpk_pv_status = pvproperty(
name="INKV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPK PV Status",
read_only=True,
value=1,
)
inpl_pv_status = pvproperty(
name="INLV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="INPL PV Status",
read_only=True,
value=1,
)
input_a = pvproperty(name="INPA", dtype=ChannelType.STRING, doc="Input A")
input_b = pvproperty(name="INPB", dtype=ChannelType.STRING, doc="Input B")
input_c = pvproperty(name="INPC", dtype=ChannelType.STRING, doc="Input C")
input_d = pvproperty(name="INPD", dtype=ChannelType.STRING, doc="Input D")
input_e = pvproperty(name="INPE", dtype=ChannelType.STRING, doc="Input E")
input_f = pvproperty(name="INPF", dtype=ChannelType.STRING, doc="Input F")
input_g = pvproperty(name="INPG", dtype=ChannelType.STRING, doc="Input G")
input_h = pvproperty(name="INPH", dtype=ChannelType.STRING, doc="Input H")
input_i = pvproperty(name="INPI", dtype=ChannelType.STRING, doc="Input I")
input_j = pvproperty(name="INPJ", dtype=ChannelType.STRING, doc="Input J")
input_k = pvproperty(name="INPK", dtype=ChannelType.STRING, doc="Input K")
input_l = pvproperty(name="INPL", dtype=ChannelType.STRING, doc="Input L")
invalid_output_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID output action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.DOUBLE, doc="INVALID output value"
)
value_of_input_j = pvproperty(
name="J", dtype=ChannelType.DOUBLE, doc="Value of Input J"
)
value_of_input_k = pvproperty(
name="K", dtype=ChannelType.DOUBLE, doc="Value of Input K"
)
value_of_input_l = pvproperty(
name="L", dtype=ChannelType.DOUBLE, doc="Value of Input L"
)
prev_value_of_a = pvproperty(
name="LA",
dtype=ChannelType.DOUBLE,
doc="Prev Value of A",
read_only=True,
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
prev_value_of_b = pvproperty(
name="LB",
dtype=ChannelType.DOUBLE,
doc="Prev Value of B",
read_only=True,
)
prev_value_of_c = pvproperty(
name="LC",
dtype=ChannelType.DOUBLE,
doc="Prev Value of C",
read_only=True,
)
prev_value_of_d = pvproperty(
name="LD",
dtype=ChannelType.DOUBLE,
doc="Prev Value of D",
read_only=True,
)
prev_value_of_e = pvproperty(
name="LE",
dtype=ChannelType.DOUBLE,
doc="Prev Value of E",
read_only=True,
)
prev_value_of_f = pvproperty(
name="LF",
dtype=ChannelType.DOUBLE,
doc="Prev Value of F",
read_only=True,
)
prev_value_of_g = pvproperty(
name="LG",
dtype=ChannelType.DOUBLE,
doc="Prev Value of G",
read_only=True,
)
prev_value_of_h = pvproperty(
name="LH",
dtype=ChannelType.DOUBLE,
doc="Prev Value of H",
read_only=True,
)
prev_value_of_i = pvproperty(
name="LI",
dtype=ChannelType.DOUBLE,
doc="Prev Value of I",
read_only=True,
)
prev_value_of_j = pvproperty(
name="LJ",
dtype=ChannelType.DOUBLE,
doc="Prev Value of J",
read_only=True,
)
prev_value_of_k = pvproperty(
name="LK",
dtype=ChannelType.DOUBLE,
doc="Prev Value of K",
read_only=True,
)
prev_value_of_l = pvproperty(
name="LL",
dtype=ChannelType.DOUBLE,
doc="Prev Value of L",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
output_calculation = pvproperty(
name="OCAL",
dtype=ChannelType.CHAR,
max_length=80,
report_as_string=True,
doc="Output Calculation",
value=chr(0),
)
ocal_valid = pvproperty(
name="OCLV", dtype=ChannelType.LONG, doc="OCAL Valid"
)
output_execute_delay = pvproperty(
name="ODLY", dtype=ChannelType.DOUBLE, doc="Output Execute Delay"
)
event_to_issue = pvproperty(
name="OEVT",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Event To Issue",
)
output_execute_opt = pvproperty(
name="OOPT",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutOOPT.get_string_tuple(),
doc="Output Execute Opt",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
out_pv_status = pvproperty(
name="OUTV",
dtype=ChannelType.ENUM,
enum_strings=menus.calcoutINAV.get_string_tuple(),
doc="OUT PV Status",
read_only=True,
)
output_value = pvproperty(
name="OVAL", dtype=ChannelType.DOUBLE, doc="Output Value"
)
prev_value_of_oval = pvproperty(
name="POVL", dtype=ChannelType.DOUBLE, doc="Prev Value of OVAL"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
previous_value = pvproperty(
name="PVAL", dtype=ChannelType.DOUBLE, doc="Previous Value"
)
# result = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Result')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class CompressFields(RecordFieldGroup):
_record_type = "compress"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_compress.get_string_tuple(),
doc="Device Type",
)
compression_algorithm = pvproperty(
name="ALG",
dtype=ChannelType.ENUM,
enum_strings=menus.compressALG.get_string_tuple(),
doc="Compression Algorithm",
)
buffering_algorithm = pvproperty(
name="BALG",
dtype=ChannelType.ENUM,
enum_strings=menus.bufferingALG.get_string_tuple(),
doc="Buffering Algorithm",
)
compress_value_buffer = pvproperty(
name="CVB",
dtype=ChannelType.DOUBLE,
doc="Compress Value Buffer",
read_only=True,
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
high_operating_range = pvproperty(
name="HOPR", dtype=ChannelType.DOUBLE, doc="High Operating Range"
)
init_high_interest_lim = pvproperty(
name="IHIL", dtype=ChannelType.DOUBLE, doc="Init High Interest Lim"
)
init_low_interest_lim = pvproperty(
name="ILIL", dtype=ChannelType.DOUBLE, doc="Init Low Interest Lim"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
number_of_elements_in_working_buffer = pvproperty(
name="INPN",
dtype=ChannelType.LONG,
doc="Number of elements in Working Buffer",
read_only=True,
)
compressed_array_inx = pvproperty(
name="INX",
dtype=ChannelType.LONG,
doc="Compressed Array Inx",
read_only=True,
)
low_operating_range = pvproperty(
name="LOPR", dtype=ChannelType.DOUBLE, doc="Low Operating Range"
)
n_to_1_compression = pvproperty(
name="N", dtype=ChannelType.LONG, doc="N to 1 Compression", value=1
)
number_of_values = pvproperty(
name="NSAM",
dtype=ChannelType.LONG,
doc="Number of Values",
read_only=True,
value=1,
)
number_used = pvproperty(
name="NUSE", dtype=ChannelType.LONG, doc="Number Used", read_only=True
)
offset = pvproperty(
name="OFF", dtype=ChannelType.LONG, doc="Offset", read_only=True
)
old_number_used = pvproperty(
name="OUSE",
dtype=ChannelType.LONG,
doc="Old Number Used",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
reset = pvproperty(name="RES", dtype=ChannelType.INT, doc="Reset")
link_parent_attribute(
high_operating_range, "upper_ctrl_limit",
)
link_parent_attribute(
low_operating_range, "lower_ctrl_limit",
)
link_parent_attribute(
display_precision, "precision",
)
class DfanoutFields(RecordFieldGroup, _Limits):
_record_type = "dfanout"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_dfanout.get_string_tuple(),
doc="Device Type",
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
output_spec_a = pvproperty(
name="OUTA", dtype=ChannelType.STRING, doc="Output Spec A"
)
output_spec_b = pvproperty(
name="OUTB", dtype=ChannelType.STRING, doc="Output Spec B"
)
output_spec_c = pvproperty(
name="OUTC", dtype=ChannelType.STRING, doc="Output Spec C"
)
output_spec_d = pvproperty(
name="OUTD", dtype=ChannelType.STRING, doc="Output Spec D"
)
output_spec_e = pvproperty(
name="OUTE", dtype=ChannelType.STRING, doc="Output Spec E"
)
output_spec_f = pvproperty(
name="OUTF", dtype=ChannelType.STRING, doc="Output Spec F"
)
output_spec_g = pvproperty(
name="OUTG", dtype=ChannelType.STRING, doc="Output Spec G"
)
output_spec_h = pvproperty(
name="OUTH", dtype=ChannelType.STRING, doc="Output Spec H"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
link_selection_loc = pvproperty(
name="SELL", dtype=ChannelType.STRING, doc="Link Selection Loc"
)
select_mechanism = pvproperty(
name="SELM",
dtype=ChannelType.ENUM,
enum_strings=menus.dfanoutSELM.get_string_tuple(),
doc="Select Mechanism",
)
link_selection = pvproperty(
name="SELN", dtype=ChannelType.INT, doc="Link Selection", value=1
)
# desired_output = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Desired Output')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class EventFields(RecordFieldGroup):
_record_type = "event"
_dtype = ChannelType.STRING # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_event.get_string_tuple(),
doc="Device Type",
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
sim_mode_location = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Sim Mode Location"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
sim_mode_alarm_svrty = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Sim mode Alarm Svrty",
)
sim_input_specifctn = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Sim Input Specifctn"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Simulation Value",
)
# event_name_to_post = pvproperty(name='VAL',
# dtype=ChannelType.CHAR,
# max_length=40,
# report_as_string=True,
# doc='Event Name To Post')
class FanoutFields(RecordFieldGroup):
_record_type = "fanout"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_fanout.get_string_tuple(),
doc="Device Type",
)
forward_link_0 = pvproperty(
name="LNK0", dtype=ChannelType.STRING, doc="Forward Link 0"
)
forward_link_1 = pvproperty(
name="LNK1", dtype=ChannelType.STRING, doc="Forward Link 1"
)
forward_link_2 = pvproperty(
name="LNK2", dtype=ChannelType.STRING, doc="Forward Link 2"
)
forward_link_3 = pvproperty(
name="LNK3", dtype=ChannelType.STRING, doc="Forward Link 3"
)
forward_link_4 = pvproperty(
name="LNK4", dtype=ChannelType.STRING, doc="Forward Link 4"
)
forward_link_5 = pvproperty(
name="LNK5", dtype=ChannelType.STRING, doc="Forward Link 5"
)
forward_link_6 = pvproperty(
name="LNK6", dtype=ChannelType.STRING, doc="Forward Link 6"
)
forward_link_7 = pvproperty(
name="LNK7", dtype=ChannelType.STRING, doc="Forward Link 7"
)
forward_link_8 = pvproperty(
name="LNK8", dtype=ChannelType.STRING, doc="Forward Link 8"
)
forward_link_9 = pvproperty(
name="LNK9", dtype=ChannelType.STRING, doc="Forward Link 9"
)
forward_link_10 = pvproperty(
name="LNKA", dtype=ChannelType.STRING, doc="Forward Link 10"
)
forward_link_11 = pvproperty(
name="LNKB", dtype=ChannelType.STRING, doc="Forward Link 11"
)
forward_link_12 = pvproperty(
name="LNKC", dtype=ChannelType.STRING, doc="Forward Link 12"
)
forward_link_13 = pvproperty(
name="LNKD", dtype=ChannelType.STRING, doc="Forward Link 13"
)
forward_link_14 = pvproperty(
name="LNKE", dtype=ChannelType.STRING, doc="Forward Link 14"
)
forward_link_15 = pvproperty(
name="LNKF", dtype=ChannelType.STRING, doc="Forward Link 15"
)
offset_for_specified = pvproperty(
name="OFFS", dtype=ChannelType.INT, doc="Offset for Specified", value=0
)
link_selection_loc = pvproperty(
name="SELL", dtype=ChannelType.STRING, doc="Link Selection Loc"
)
select_mechanism = pvproperty(
name="SELM",
dtype=ChannelType.ENUM,
enum_strings=menus.fanoutSELM.get_string_tuple(),
doc="Select Mechanism",
)
link_selection = pvproperty(
name="SELN", dtype=ChannelType.INT, doc="Link Selection", value=1
)
shift_for_mask_mode = pvproperty(
name="SHFT", dtype=ChannelType.INT, doc="Shift for Mask mode", value=-1
)
# used_to_trigger = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Used to trigger')
class HistogramFields(RecordFieldGroup):
_record_type = "histogram"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_histogram.get_string_tuple(),
doc="Device Type",
)
collection_control = pvproperty(
name="CMD",
dtype=ChannelType.ENUM,
enum_strings=menus.histogramCMD.get_string_tuple(),
doc="Collection Control",
)
collection_status = pvproperty(
name="CSTA",
dtype=ChannelType.INT,
doc="Collection Status",
read_only=True,
value=1,
)
high_operating_range = pvproperty(
name="HOPR", dtype=ChannelType.LONG, doc="High Operating Range"
)
lower_signal_limit = pvproperty(
name="LLIM", dtype=ChannelType.DOUBLE, doc="Lower Signal Limit"
)
low_operating_range = pvproperty(
name="LOPR", dtype=ChannelType.LONG, doc="Low Operating Range"
)
counts_since_monitor = pvproperty(
name="MCNT",
dtype=ChannelType.INT,
doc="Counts Since Monitor",
read_only=True,
)
monitor_count_deadband = pvproperty(
name="MDEL", dtype=ChannelType.INT, doc="Monitor Count Deadband"
)
num_of_array_elements = pvproperty(
name="NELM",
dtype=ChannelType.INT,
doc="Num of Array Elements",
read_only=True,
value=1,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
monitor_seconds_dband = pvproperty(
name="SDEL", dtype=ChannelType.DOUBLE, doc="Monitor Seconds Dband"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
signal_value = pvproperty(
name="SGNL", dtype=ChannelType.DOUBLE, doc="Signal Value"
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.DOUBLE, doc="Simulation Value"
)
signal_value_location = pvproperty(
name="SVL", dtype=ChannelType.STRING, doc="Signal Value Location"
)
upper_signal_limit = pvproperty(
name="ULIM", dtype=ChannelType.DOUBLE, doc="Upper Signal Limit"
)
element_width = pvproperty(
name="WDTH",
dtype=ChannelType.DOUBLE,
doc="Element Width",
read_only=True,
)
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(
high_operating_range, "upper_ctrl_limit",
)
link_parent_attribute(
low_operating_range, "lower_ctrl_limit",
)
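The histogram record's read-only WDTH (Element Width) field is derived from the signal limits and the element count. A minimal sketch of that relationship, assuming WDTH = (ULIM - LLIM) / NELM as in EPICS base:

```python
def element_width(ulim, llim, nelm):
    """Width covered by each histogram bin, derived from ULIM, LLIM and NELM."""
    if nelm <= 0:
        raise ValueError("NELM must be positive")
    return (ulim - llim) / nelm
```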
class Int64inFields(RecordFieldGroup, _LimitsLong):
_record_type = "int64in"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _LimitsLong)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_int64in.get_string_tuple(),
doc="Device Type",
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.LONG, doc="Archive Deadband"
)
alarm_filter_time_constant = pvproperty(
name="AFTC", dtype=ChannelType.DOUBLE, doc="Alarm Filter Time Constant"
)
alarm_filter_value = pvproperty(
name="AFVL",
dtype=ChannelType.DOUBLE,
doc="Alarm Filter Value",
read_only=True,
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.LONG,
doc="Last Value Archived",
read_only=True,
)
units_name = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Units name",
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.LONG, doc="Alarm Deadband"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.LONG,
doc="Last Value Alarmed",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.LONG, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.LONG,
doc="Last Val Monitored",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.LONG, doc="Simulation Value"
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Current value')
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
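The SIMM/SIML/SIOL/SVAL fields above follow the common EPICS simulation-mode pattern: when simulation is enabled, the record takes its reading from the simulation value instead of the hardware input. A simplified sketch under that assumption; `hardware_read` is a hypothetical callable standing in for device-support I/O:

```python
def read_input(simm, sval, hardware_read):
    """Return SVAL when simulation mode is active, else the hardware reading."""
    if simm == "YES":
        return sval  # simulated value, typically fetched via SIOL
    return hardware_read()
```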
class Int64outFields(RecordFieldGroup, _LimitsLong):
_record_type = "int64out"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _LimitsLong)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_int64out.get_string_tuple(),
doc="Device Type",
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.LONG, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.LONG,
doc="Last Value Archived",
read_only=True,
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
drive_high_limit = pvproperty(
name="DRVH", dtype=ChannelType.LONG, doc="Drive High Limit"
)
drive_low_limit = pvproperty(
name="DRVL", dtype=ChannelType.LONG, doc="Drive Low Limit"
)
units_name = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Units name",
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.LONG, doc="Alarm Deadband"
)
invalid_output_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID output action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.LONG, doc="INVALID output value"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.LONG,
doc="Last Value Alarmed",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.LONG, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.LONG,
doc="Last Val Monitored",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
# desired_output = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Desired Output')
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
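Output records such as int64out carry DRVH/DRVL drive limits. A minimal sketch of the usual clamping behaviour, assuming (as in EPICS base) that the limits apply only when DRVH > DRVL; this helper is illustrative and not part of this module:

```python
def clamp_to_drive_limits(val, drvl, drvh):
    """Clamp a desired output to [DRVL, DRVH] when the limits are configured."""
    if drvh > drvl:
        return max(drvl, min(val, drvh))
    return val  # DRVH <= DRVL means the drive limits are not in effect
```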
class LonginFields(RecordFieldGroup, _LimitsLong):
_record_type = "longin"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _LimitsLong)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_longin.get_string_tuple(),
doc="Device Type",
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.LONG, doc="Archive Deadband"
)
alarm_filter_time_constant = pvproperty(
name="AFTC", dtype=ChannelType.DOUBLE, doc="Alarm Filter Time Constant"
)
alarm_filter_value = pvproperty(
name="AFVL",
dtype=ChannelType.DOUBLE,
doc="Alarm Filter Value",
read_only=True,
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.LONG,
doc="Last Value Archived",
read_only=True,
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.LONG, doc="Alarm Deadband"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.LONG,
doc="Last Value Alarmed",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.LONG, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.LONG,
doc="Last Val Monitored",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
sim_mode_location = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Sim Mode Location"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
sim_mode_alarm_svrty = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Sim mode Alarm Svrty",
)
sim_input_specifctn = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Sim Input Specifctn"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.LONG, doc="Simulation Value"
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Current value')
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
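The MDEL/MLST and ADEL/ALST pairs above implement deadband-gated posting, here wired to the parent's value_atol/log_atol via link_parent_attribute. A simplified sketch of the deadband test, under the assumption that a change posts only when it exceeds the deadband:

```python
def exceeds_deadband(value, last_posted, deadband):
    """True when the change since the last posted value exceeds the deadband."""
    return abs(value - last_posted) > deadband
```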
class LongoutFields(RecordFieldGroup, _LimitsLong):
_record_type = "longout"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _LimitsLong)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_longout.get_string_tuple(),
doc="Device Type",
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.LONG, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.LONG,
doc="Last Value Archived",
read_only=True,
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
drive_high_limit = pvproperty(
name="DRVH", dtype=ChannelType.LONG, doc="Drive High Limit"
)
drive_low_limit = pvproperty(
name="DRVL", dtype=ChannelType.LONG, doc="Drive Low Limit"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.LONG, doc="Alarm Deadband"
)
invalid_output_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID output action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.LONG, doc="INVALID output value"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.LONG,
doc="Last Value Alarmed",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.LONG, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.LONG,
doc="Last Val Monitored",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
sim_mode_location = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Sim Mode Location"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
sim_mode_alarm_svrty = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Sim mode Alarm Svrty",
)
sim_output_specifctn = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Sim Output Specifctn"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
# desired_output = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Desired Output')
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class LsiFields(RecordFieldGroup):
_record_type = "lsi"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_lsi.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.menuPost.get_string_tuple(),
doc="Post Archive Monitors",
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
length_of_val = pvproperty(
name="LEN", dtype=ChannelType.LONG, doc="Length of VAL", read_only=True
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.menuPost.get_string_tuple(),
doc="Post Value Monitors",
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
length_of_oval = pvproperty(
name="OLEN",
dtype=ChannelType.LONG,
doc="Length of OVAL",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
size_of_buffers = pvproperty(
name="SIZV",
dtype=ChannelType.INT,
doc="Size of buffers",
read_only=True,
value=41,
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
class LsoFields(RecordFieldGroup):
_record_type = "lso"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_lso.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.menuPost.get_string_tuple(),
doc="Post Archive Monitors",
)
desired_output_link = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Link"
)
invalid_output_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID Output Action",
)
invalid_output_value = pvproperty(
name="IVOV",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="INVALID Output Value",
)
length_of_val = pvproperty(
name="LEN", dtype=ChannelType.LONG, doc="Length of VAL", read_only=True
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.menuPost.get_string_tuple(),
doc="Post Value Monitors",
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
length_of_oval = pvproperty(
name="OLEN",
dtype=ChannelType.LONG,
doc="Length of OVAL",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
size_of_buffers = pvproperty(
name="SIZV",
dtype=ChannelType.INT,
doc="Size of buffers",
read_only=True,
value=41,
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
class MbbiFields(RecordFieldGroup):
_record_type = "mbbi"
_dtype = ChannelType.ENUM # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_mbbi.get_string_tuple(),
doc="Device Type",
)
alarm_filter_time_constant = pvproperty(
name="AFTC", dtype=ChannelType.DOUBLE, doc="Alarm Filter Time Constant"
)
alarm_filter_value = pvproperty(
name="AFVL",
dtype=ChannelType.DOUBLE,
doc="Alarm Filter Value",
read_only=True,
)
change_of_state_svr = pvproperty(
name="COSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Change of State Svr",
)
eight_string = pvproperty(
name="EIST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Eight String",
)
state_eight_severity = pvproperty(
name="EISV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Eight Severity",
)
eight_value = pvproperty(
name="EIVL", dtype=ChannelType.LONG, doc="Eight Value"
)
eleven_string = pvproperty(
name="ELST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Eleven String",
)
state_eleven_severity = pvproperty(
name="ELSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Eleven Severity",
)
eleven_value = pvproperty(
name="ELVL", dtype=ChannelType.LONG, doc="Eleven Value"
)
fifteen_string = pvproperty(
name="FFST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Fifteen String",
)
state_fifteen_severity = pvproperty(
name="FFSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Fifteen Severity",
)
fifteen_value = pvproperty(
name="FFVL", dtype=ChannelType.LONG, doc="Fifteen Value"
)
four_string = pvproperty(
name="FRST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Four String",
)
state_four_severity = pvproperty(
name="FRSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Four Severity",
)
four_value = pvproperty(
name="FRVL", dtype=ChannelType.LONG, doc="Four Value"
)
fourteen_string = pvproperty(
name="FTST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Fourteen String",
)
state_fourteen_sevr = pvproperty(
name="FTSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Fourteen Sevr",
)
fourteen_value = pvproperty(
name="FTVL", dtype=ChannelType.LONG, doc="Fourteen Value"
)
five_string = pvproperty(
name="FVST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Five String",
)
state_five_severity = pvproperty(
name="FVSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Five Severity",
)
five_value = pvproperty(
name="FVVL", dtype=ChannelType.LONG, doc="Five Value"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.INT,
doc="Last Value Alarmed",
read_only=True,
)
hardware_mask = pvproperty(
name="MASK", dtype=ChannelType.LONG, doc="Hardware Mask", read_only=True
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.INT,
doc="Last Value Monitored",
read_only=True,
)
nine_string = pvproperty(
name="NIST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Nine String",
)
state_nine_severity = pvproperty(
name="NISV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Nine Severity",
)
nine_value = pvproperty(
name="NIVL", dtype=ChannelType.LONG, doc="Nine Value"
)
number_of_bits = pvproperty(
name="NOBT", dtype=ChannelType.INT, doc="Number of Bits", read_only=True
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
one_string = pvproperty(
name="ONST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="One String",
)
state_one_severity = pvproperty(
name="ONSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State One Severity",
)
one_value = pvproperty(name="ONVL", dtype=ChannelType.LONG, doc="One Value")
prev_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="Prev Raw Value",
read_only=True,
)
raw_value = pvproperty(name="RVAL", dtype=ChannelType.LONG, doc="Raw Value")
states_defined = pvproperty(
name="SDEF", dtype=ChannelType.INT, doc="States Defined", read_only=True
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
shift = pvproperty(name="SHFT", dtype=ChannelType.INT, doc="Shift")
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.LONG, doc="Simulation Value"
)
seven_string = pvproperty(
name="SVST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Seven String",
)
state_seven_severity = pvproperty(
name="SVSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Seven Severity",
)
seven_value = pvproperty(
name="SVVL", dtype=ChannelType.LONG, doc="Seven Value"
)
six_string = pvproperty(
name="SXST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Six String",
)
state_six_severity = pvproperty(
name="SXSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Six Severity",
)
six_value = pvproperty(name="SXVL", dtype=ChannelType.LONG, doc="Six Value")
ten_string = pvproperty(
name="TEST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Ten String",
)
state_ten_severity = pvproperty(
name="TESV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Ten Severity",
)
ten_value = pvproperty(name="TEVL", dtype=ChannelType.LONG, doc="Ten Value")
three_string = pvproperty(
name="THST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Three String",
)
state_three_severity = pvproperty(
name="THSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Three Severity",
)
three_value = pvproperty(
name="THVL", dtype=ChannelType.LONG, doc="Three Value"
)
thirteen_string = pvproperty(
name="TTST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Thirteen String",
)
state_thirteen_sevr = pvproperty(
name="TTSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Thirteen Sevr",
)
thirteen_value = pvproperty(
name="TTVL", dtype=ChannelType.LONG, doc="Thirteen Value"
)
twelve_string = pvproperty(
name="TVST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Twelve String",
)
state_twelve_severity = pvproperty(
name="TVSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Twelve Severity",
)
twelve_value = pvproperty(
name="TVVL", dtype=ChannelType.LONG, doc="Twelve Value"
)
two_string = pvproperty(
name="TWST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Two String",
)
state_two_severity = pvproperty(
name="TWSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Two Severity",
)
two_value = pvproperty(name="TWVL", dtype=ChannelType.LONG, doc="Two Value")
unknown_state_severity = pvproperty(
name="UNSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Unknown State Severity",
)
zero_string = pvproperty(
name="ZRST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Zero String",
)
state_zero_severity = pvproperty(
name="ZRSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Zero Severity",
)
zero_value = pvproperty(
name="ZRVL", dtype=ChannelType.LONG, doc="Zero Value"
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.ENUM,
# doc='Current Value')
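An mbbi record converts its shifted raw value (RVAL >> SHFT) into a state index by matching it against the sixteen state values (ZRVL..FFVL). A hedged sketch of that lookup; the unknown-state fallback mirrors the UNSV severity field defined above:

```python
def raw_to_state(rval, shft, state_values):
    """Match the shifted raw value against state values; None means no match."""
    raw = rval >> shft
    for index, sval in enumerate(state_values):
        if sval == raw:
            return index
    return None  # no defined state matched -> UNSV severity would apply
```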
class MbbidirectFields(RecordFieldGroup):
_record_type = "mbbiDirect"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_mbbiDirect.get_string_tuple(),
doc="Device Type",
)
bit_0 = pvproperty(name="B0", dtype=ChannelType.CHAR, doc="Bit 0")
bit_1 = pvproperty(name="B1", dtype=ChannelType.CHAR, doc="Bit 1")
bit_16 = pvproperty(name="B10", dtype=ChannelType.CHAR, doc="Bit 16")
bit_17 = pvproperty(name="B11", dtype=ChannelType.CHAR, doc="Bit 17")
bit_18 = pvproperty(name="B12", dtype=ChannelType.CHAR, doc="Bit 18")
bit_19 = pvproperty(name="B13", dtype=ChannelType.CHAR, doc="Bit 19")
bit_20 = pvproperty(name="B14", dtype=ChannelType.CHAR, doc="Bit 20")
bit_21 = pvproperty(name="B15", dtype=ChannelType.CHAR, doc="Bit 21")
bit_22 = pvproperty(name="B16", dtype=ChannelType.CHAR, doc="Bit 22")
bit_23 = pvproperty(name="B17", dtype=ChannelType.CHAR, doc="Bit 23")
bit_24 = pvproperty(name="B18", dtype=ChannelType.CHAR, doc="Bit 24")
bit_25 = pvproperty(name="B19", dtype=ChannelType.CHAR, doc="Bit 25")
bit_26 = pvproperty(name="B1A", dtype=ChannelType.CHAR, doc="Bit 26")
bit_27 = pvproperty(name="B1B", dtype=ChannelType.CHAR, doc="Bit 27")
bit_28 = pvproperty(name="B1C", dtype=ChannelType.CHAR, doc="Bit 28")
bit_29 = pvproperty(name="B1D", dtype=ChannelType.CHAR, doc="Bit 29")
bit_30 = pvproperty(name="B1E", dtype=ChannelType.CHAR, doc="Bit 30")
bit_31 = pvproperty(name="B1F", dtype=ChannelType.CHAR, doc="Bit 31")
bit_2 = pvproperty(name="B2", dtype=ChannelType.CHAR, doc="Bit 2")
bit_3 = pvproperty(name="B3", dtype=ChannelType.CHAR, doc="Bit 3")
bit_4 = pvproperty(name="B4", dtype=ChannelType.CHAR, doc="Bit 4")
bit_5 = pvproperty(name="B5", dtype=ChannelType.CHAR, doc="Bit 5")
bit_6 = pvproperty(name="B6", dtype=ChannelType.CHAR, doc="Bit 6")
bit_7 = pvproperty(name="B7", dtype=ChannelType.CHAR, doc="Bit 7")
bit_8 = pvproperty(name="B8", dtype=ChannelType.CHAR, doc="Bit 8")
bit_9 = pvproperty(name="B9", dtype=ChannelType.CHAR, doc="Bit 9")
bit_10 = pvproperty(name="BA", dtype=ChannelType.CHAR, doc="Bit 10")
bit_11 = pvproperty(name="BB", dtype=ChannelType.CHAR, doc="Bit 11")
bit_12 = pvproperty(name="BC", dtype=ChannelType.CHAR, doc="Bit 12")
bit_13 = pvproperty(name="BD", dtype=ChannelType.CHAR, doc="Bit 13")
bit_14 = pvproperty(name="BE", dtype=ChannelType.CHAR, doc="Bit 14")
bit_15 = pvproperty(name="BF", dtype=ChannelType.CHAR, doc="Bit 15")
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
hardware_mask = pvproperty(
name="MASK", dtype=ChannelType.LONG, doc="Hardware Mask", read_only=True
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.LONG,
doc="Last Value Monitored",
read_only=True,
)
number_of_bits = pvproperty(
name="NOBT", dtype=ChannelType.INT, doc="Number of Bits", read_only=True
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
prev_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="Prev Raw Value",
read_only=True,
)
raw_value = pvproperty(name="RVAL", dtype=ChannelType.LONG, doc="Raw Value")
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
shift = pvproperty(name="SHFT", dtype=ChannelType.INT, doc="Shift")
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL", dtype=ChannelType.LONG, doc="Simulation Value"
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Current Value')
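The mbbiDirect bit fields above follow a hexadecimal naming scheme: bits 0-15 map to B0..BF and bits 16-31 to B10..B1F. A small helper, hypothetical and for illustration only, that reproduces the mapping:

```python
def bit_field_name(bit):
    """Field name for an mbbiDirect bit: B0..BF, then B10..B1F (hexadecimal)."""
    if not 0 <= bit <= 31:
        raise ValueError("mbbiDirect exposes bits 0-31 only")
    return f"B{bit:X}"
```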
class MbboFields(RecordFieldGroup):
_record_type = "mbbo"
_dtype = ChannelType.ENUM # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_mbbo.get_string_tuple(),
doc="Device Type",
)
change_of_state_sevr = pvproperty(
name="COSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Change of State Sevr",
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
eight_string = pvproperty(
name="EIST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Eight String",
)
state_eight_severity = pvproperty(
name="EISV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Eight Severity",
)
eight_value = pvproperty(
name="EIVL", dtype=ChannelType.LONG, doc="Eight Value"
)
eleven_string = pvproperty(
name="ELST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Eleven String",
)
state_eleven_severity = pvproperty(
name="ELSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Eleven Severity",
)
eleven_value = pvproperty(
name="ELVL", dtype=ChannelType.LONG, doc="Eleven Value"
)
fifteen_string = pvproperty(
name="FFST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Fifteen String",
)
state_fifteen_sevr = pvproperty(
name="FFSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Fifteen Sevr",
)
fifteen_value = pvproperty(
name="FFVL", dtype=ChannelType.LONG, doc="Fifteen Value"
)
four_string = pvproperty(
name="FRST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Four String",
)
state_four_severity = pvproperty(
name="FRSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Four Severity",
)
four_value = pvproperty(
name="FRVL", dtype=ChannelType.LONG, doc="Four Value"
)
fourteen_string = pvproperty(
name="FTST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Fourteen String",
)
state_fourteen_sevr = pvproperty(
name="FTSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Fourteen Sevr",
)
fourteen_value = pvproperty(
name="FTVL", dtype=ChannelType.LONG, doc="Fourteen Value"
)
five_string = pvproperty(
name="FVST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Five String",
)
state_five_severity = pvproperty(
name="FVSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Five Severity",
)
five_value = pvproperty(
name="FVVL", dtype=ChannelType.LONG, doc="Five Value"
)
invalid_outpt_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID outpt action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.INT, doc="INVALID output value"
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.INT,
doc="Last Value Alarmed",
read_only=True,
)
hardware_mask = pvproperty(
name="MASK", dtype=ChannelType.LONG, doc="Hardware Mask", read_only=True
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.INT,
doc="Last Value Monitored",
read_only=True,
)
nine_string = pvproperty(
name="NIST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Nine String",
)
state_nine_severity = pvproperty(
name="NISV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Nine Severity",
)
nine_value = pvproperty(
name="NIVL", dtype=ChannelType.LONG, doc="Nine Value"
)
number_of_bits = pvproperty(
name="NOBT", dtype=ChannelType.INT, doc="Number of Bits", read_only=True
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
one_string = pvproperty(
name="ONST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="One String",
)
state_one_severity = pvproperty(
name="ONSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State One Severity",
)
one_value = pvproperty(name="ONVL", dtype=ChannelType.LONG, doc="One Value")
prev_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="Prev Raw Value",
read_only=True,
)
prev_readback_value = pvproperty(
name="ORBV",
dtype=ChannelType.LONG,
doc="Prev Readback Value",
read_only=True,
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
readback_value = pvproperty(
name="RBV", dtype=ChannelType.LONG, doc="Readback Value", read_only=True
)
raw_value = pvproperty(name="RVAL", dtype=ChannelType.LONG, doc="Raw Value")
states_defined = pvproperty(
name="SDEF", dtype=ChannelType.INT, doc="States Defined", read_only=True
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
shift = pvproperty(name="SHFT", dtype=ChannelType.INT, doc="Shift")
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
seven_string = pvproperty(
name="SVST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Seven String",
)
state_seven_severity = pvproperty(
name="SVSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Seven Severity",
)
seven_value = pvproperty(
name="SVVL", dtype=ChannelType.LONG, doc="Seven Value"
)
six_string = pvproperty(
name="SXST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Six String",
)
state_six_severity = pvproperty(
name="SXSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Six Severity",
)
six_value = pvproperty(name="SXVL", dtype=ChannelType.LONG, doc="Six Value")
ten_string = pvproperty(
name="TEST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Ten String",
)
state_ten_severity = pvproperty(
name="TESV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Ten Severity",
)
ten_value = pvproperty(name="TEVL", dtype=ChannelType.LONG, doc="Ten Value")
three_string = pvproperty(
name="THST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Three String",
)
state_three_severity = pvproperty(
name="THSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Three Severity",
)
three_value = pvproperty(
name="THVL", dtype=ChannelType.LONG, doc="Three Value"
)
thirteen_string = pvproperty(
name="TTST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Thirteen String",
)
state_thirteen_sevr = pvproperty(
name="TTSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Thirteen Sevr",
)
thirteen_value = pvproperty(
name="TTVL", dtype=ChannelType.LONG, doc="Thirteen Value"
)
twelve_string = pvproperty(
name="TVST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Twelve String",
)
state_twelve_severity = pvproperty(
name="TVSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Twelve Severity",
)
twelve_value = pvproperty(
name="TVVL", dtype=ChannelType.LONG, doc="Twelve Value"
)
two_string = pvproperty(
name="TWST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Two String",
)
state_two_severity = pvproperty(
name="TWSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Two Severity",
)
two_value = pvproperty(name="TWVL", dtype=ChannelType.LONG, doc="Two Value")
unknown_state_sevr = pvproperty(
name="UNSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Unknown State Sevr",
)
zero_string = pvproperty(
name="ZRST",
dtype=ChannelType.CHAR,
max_length=26,
report_as_string=True,
doc="Zero String",
)
state_zero_severity = pvproperty(
name="ZRSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="State Zero Severity",
)
zero_value = pvproperty(
name="ZRVL", dtype=ChannelType.LONG, doc="Zero Value"
)
# desired_value = pvproperty(name='VAL',
# dtype=ChannelType.ENUM,
# doc='Desired Value')
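# The sixteen mbbo states above follow a fixed two-letter prefix scheme
# (ZR, ON, TW, ..., FF); each prefix carries a value field (*VL), a display
# string field (*ST), and an alarm severity field (*SV).  A sketch
# reconstructing that naming (hypothetical helper for illustration only):
_MBBO_STATE_PREFIXES = (
    "ZR", "ON", "TW", "TH", "FR", "FV", "SX", "SV",
    "EI", "NI", "TE", "EL", "TV", "TT", "FT", "FF",
)
def _mbbo_state_field_names(state: int):
    """Return the (value, string, severity) field names for a state index 0-15."""
    prefix = _MBBO_STATE_PREFIXES[state]
    return prefix + "VL", prefix + "ST", prefix + "SV"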
class MbbodirectFields(RecordFieldGroup):
_record_type = "mbboDirect"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_mbboDirect.get_string_tuple(),
doc="Device Type",
)
bit_0 = pvproperty(name="B0", dtype=ChannelType.CHAR, doc="Bit 0")
bit_1 = pvproperty(name="B1", dtype=ChannelType.CHAR, doc="Bit 1")
bit_16 = pvproperty(name="B10", dtype=ChannelType.CHAR, doc="Bit 16")
bit_17 = pvproperty(name="B11", dtype=ChannelType.CHAR, doc="Bit 17")
bit_18 = pvproperty(name="B12", dtype=ChannelType.CHAR, doc="Bit 18")
bit_19 = pvproperty(name="B13", dtype=ChannelType.CHAR, doc="Bit 19")
bit_20 = pvproperty(name="B14", dtype=ChannelType.CHAR, doc="Bit 20")
bit_21 = pvproperty(name="B15", dtype=ChannelType.CHAR, doc="Bit 21")
bit_22 = pvproperty(name="B16", dtype=ChannelType.CHAR, doc="Bit 22")
bit_23 = pvproperty(name="B17", dtype=ChannelType.CHAR, doc="Bit 23")
bit_24 = pvproperty(name="B18", dtype=ChannelType.CHAR, doc="Bit 24")
bit_25 = pvproperty(name="B19", dtype=ChannelType.CHAR, doc="Bit 25")
bit_26 = pvproperty(name="B1A", dtype=ChannelType.CHAR, doc="Bit 26")
bit_27 = pvproperty(name="B1B", dtype=ChannelType.CHAR, doc="Bit 27")
bit_28 = pvproperty(name="B1C", dtype=ChannelType.CHAR, doc="Bit 28")
bit_29 = pvproperty(name="B1D", dtype=ChannelType.CHAR, doc="Bit 29")
bit_30 = pvproperty(name="B1E", dtype=ChannelType.CHAR, doc="Bit 30")
bit_31 = pvproperty(name="B1F", dtype=ChannelType.CHAR, doc="Bit 31")
bit_2 = pvproperty(name="B2", dtype=ChannelType.CHAR, doc="Bit 2")
bit_3 = pvproperty(name="B3", dtype=ChannelType.CHAR, doc="Bit 3")
bit_4 = pvproperty(name="B4", dtype=ChannelType.CHAR, doc="Bit 4")
bit_5 = pvproperty(name="B5", dtype=ChannelType.CHAR, doc="Bit 5")
bit_6 = pvproperty(name="B6", dtype=ChannelType.CHAR, doc="Bit 6")
bit_7 = pvproperty(name="B7", dtype=ChannelType.CHAR, doc="Bit 7")
bit_8 = pvproperty(name="B8", dtype=ChannelType.CHAR, doc="Bit 8")
bit_9 = pvproperty(name="B9", dtype=ChannelType.CHAR, doc="Bit 9")
bit_10 = pvproperty(name="BA", dtype=ChannelType.CHAR, doc="Bit 10")
bit_11 = pvproperty(name="BB", dtype=ChannelType.CHAR, doc="Bit 11")
bit_12 = pvproperty(name="BC", dtype=ChannelType.CHAR, doc="Bit 12")
bit_13 = pvproperty(name="BD", dtype=ChannelType.CHAR, doc="Bit 13")
bit_14 = pvproperty(name="BE", dtype=ChannelType.CHAR, doc="Bit 14")
bit_15 = pvproperty(name="BF", dtype=ChannelType.CHAR, doc="Bit 15")
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
invalid_outpt_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID outpt action",
)
invalid_output_value = pvproperty(
name="IVOV", dtype=ChannelType.LONG, doc="INVALID output value"
)
hardware_mask = pvproperty(
name="MASK", dtype=ChannelType.LONG, doc="Hardware Mask", read_only=True
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.LONG,
doc="Last Value Monitored",
read_only=True,
)
number_of_bits = pvproperty(
name="NOBT", dtype=ChannelType.INT, doc="Number of Bits", read_only=True
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
prev_raw_value = pvproperty(
name="ORAW",
dtype=ChannelType.LONG,
doc="Prev Raw Value",
read_only=True,
)
prev_readback_value = pvproperty(
name="ORBV",
dtype=ChannelType.LONG,
doc="Prev Readback Value",
read_only=True,
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
readback_value = pvproperty(
name="RBV", dtype=ChannelType.LONG, doc="Readback Value", read_only=True
)
raw_value = pvproperty(
name="RVAL", dtype=ChannelType.LONG, doc="Raw Value", read_only=True
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
shift = pvproperty(name="SHFT", dtype=ChannelType.INT, doc="Shift")
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
# word = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Word')
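# The B0..B1F bit fields of MbbodirectFields above follow EPICS'
# hexadecimal suffix convention: B0-B9 and BA-BF cover bits 0-15, then
# B10-B1F cover bits 16-31.  A pure-Python reconstruction of that mapping
# (hypothetical helper for illustration only):
def _mbbo_direct_bit_field(bit: int) -> str:
    """Return the mbboDirect field name (e.g. 'BA') for a bit index 0-31."""
    if not 0 <= bit < 32:
        raise ValueError("mbboDirect supports bits 0..31")
    return "B%X" % bit  # uppercase hex, no zero padding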
class MotorFields(RecordFieldGroup, _Limits):
_record_type = "motor"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_motor.get_string_tuple(),
doc="Device Type",
)
seconds_to_velocity = pvproperty(
name="ACCL",
dtype=ChannelType.DOUBLE,
doc="Seconds to Velocity",
value=0.2,
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
at_home = pvproperty(
name="ATHM", dtype=ChannelType.INT, doc="At HOME", read_only=True
)
bl_seconds_to_velocity = pvproperty(
name="BACC",
dtype=ChannelType.DOUBLE,
doc="BL Seconds to Velocity",
value=0.5,
)
bl_distance = pvproperty(
name="BDST", dtype=ChannelType.DOUBLE, doc="BL Distance (EGU)"
)
bl_velocity = pvproperty(
name="BVEL", dtype=ChannelType.DOUBLE, doc="BL Velocity (EGU/s)"
)
card_number = pvproperty(
name="CARD", dtype=ChannelType.INT, doc="Card Number", read_only=True
)
raw_cmnd_direction = pvproperty(
name="CDIR",
dtype=ChannelType.INT,
doc="Raw cmnd direction",
read_only=True,
)
enable_control = pvproperty(
name="CNEN",
dtype=ChannelType.ENUM,
enum_strings=menus.motorTORQ.get_string_tuple(),
doc="Enable control",
)
derivative_gain = pvproperty(
name="DCOF", dtype=ChannelType.DOUBLE, doc="Derivative Gain", value=0
)
dial_high_limit = pvproperty(
name="DHLM", dtype=ChannelType.DOUBLE, doc="Dial High Limit"
)
difference_dval_drbv = pvproperty(
name="DIFF",
dtype=ChannelType.DOUBLE,
doc="Difference dval-drbv",
read_only=True,
)
dmov_input_link = pvproperty(
name="DINP", dtype=ChannelType.STRING, doc="DMOV Input Link"
)
user_direction = pvproperty(
name="DIR",
dtype=ChannelType.ENUM,
enum_strings=menus.motorDIR.get_string_tuple(),
doc="User Direction",
)
dial_low_limit = pvproperty(
name="DLLM", dtype=ChannelType.DOUBLE, doc="Dial Low Limit"
)
readback_settle_time = pvproperty(
name="DLY", dtype=ChannelType.DOUBLE, doc="Readback settle time (s)"
)
done_moving_to_value = pvproperty(
name="DMOV",
dtype=ChannelType.INT,
doc="Done moving to value",
read_only=True,
value=1,
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
dial_readback_value = pvproperty(
name="DRBV",
dtype=ChannelType.DOUBLE,
doc="Dial Readback Value",
read_only=True,
)
dial_desired_value = pvproperty(
name="DVAL", dtype=ChannelType.DOUBLE, doc="Dial Desired Value (EGU)"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
encoder_step_size = pvproperty(
name="ERES", dtype=ChannelType.DOUBLE, doc="Encoder Step Size (EGU)"
)
freeze_offset = pvproperty(
name="FOF", dtype=ChannelType.INT, doc="Freeze Offset"
)
offset_freeze_switch = pvproperty(
name="FOFF",
dtype=ChannelType.ENUM,
enum_strings=menus.motorFOFF.get_string_tuple(),
doc="Offset-Freeze Switch",
)
move_fraction = pvproperty(
name="FRAC", dtype=ChannelType.FLOAT, doc="Move Fraction", value=1
)
user_high_limit = pvproperty(
name="HLM", dtype=ChannelType.DOUBLE, doc="User High Limit"
)
user_high_limit_switch = pvproperty(
name="HLS",
dtype=ChannelType.INT,
doc="User High Limit Switch",
read_only=True,
)
hw_limit_violation_svr = pvproperty(
name="HLSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="HW Limit Violation Svr",
)
home_forward = pvproperty(
name="HOMF", dtype=ChannelType.INT, doc="Home Forward"
)
home_reverse = pvproperty(
name="HOMR", dtype=ChannelType.INT, doc="Home Reverse"
)
home_velocity = pvproperty(
name="HVEL", dtype=ChannelType.DOUBLE, doc="Home Velocity (EGU/s)"
)
integral_gain = pvproperty(
name="ICOF", dtype=ChannelType.DOUBLE, doc="Integral Gain", value=0
)
ignore_set_field = pvproperty(
name="IGSET", dtype=ChannelType.INT, doc="Ignore SET field"
)
startup_commands = pvproperty(
name="INIT",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Startup commands",
)
jog_accel = pvproperty(
name="JAR", dtype=ChannelType.DOUBLE, doc="Jog Accel. (EGU/s^2)"
)
jog_motor_forward = pvproperty(
name="JOGF", dtype=ChannelType.INT, doc="Jog motor Forward"
)
jog_motor_reverse = pvproperty(
name="JOGR", dtype=ChannelType.INT, doc="Jog motor Reverse"
)
jog_velocity = pvproperty(
name="JVEL", dtype=ChannelType.DOUBLE, doc="Jog Velocity (EGU/s)"
)
last_dial_des_val = pvproperty(
name="LDVL",
dtype=ChannelType.DOUBLE,
doc="Last Dial Des Val (EGU)",
read_only=True,
)
user_low_limit = pvproperty(
name="LLM", dtype=ChannelType.DOUBLE, doc="User Low Limit"
)
user_low_limit_switch = pvproperty(
name="LLS",
dtype=ChannelType.INT,
doc="User Low Limit Switch",
read_only=True,
)
soft_channel_position_lock = pvproperty(
name="LOCK",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Soft Channel Position Lock",
value="NO",
)
last_rel_value = pvproperty(
name="LRLV",
dtype=ChannelType.DOUBLE,
doc="Last Rel Value (EGU)",
read_only=True,
)
last_raw_des_val = pvproperty(
name="LRVL",
dtype=ChannelType.LONG,
doc="Last Raw Des Val (steps)",
read_only=True,
)
last_spmg = pvproperty(
name="LSPG",
dtype=ChannelType.ENUM,
enum_strings=menus.motorSPMG.get_string_tuple(),
doc="Last SPMG",
read_only=True,
value=3,
)
last_user_des_val = pvproperty(
name="LVAL",
dtype=ChannelType.DOUBLE,
doc="Last User Des Val (EGU)",
read_only=True,
)
limit_violation = pvproperty(
name="LVIO",
dtype=ChannelType.INT,
doc="Limit violation",
read_only=True,
value=1,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
motion_in_progress = pvproperty(
name="MIP",
dtype=ChannelType.INT,
doc="Motion In Progress",
read_only=True,
)
ran_out_of_retries = pvproperty(
name="MISS",
dtype=ChannelType.INT,
doc="Ran out of retries",
read_only=True,
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
monitor_mask = pvproperty(
name="MMAP", dtype=ChannelType.LONG, doc="Monitor Mask", read_only=True
)
motor_is_moving = pvproperty(
name="MOVN",
dtype=ChannelType.INT,
doc="Motor is moving",
read_only=True,
)
motor_step_size = pvproperty(
name="MRES", dtype=ChannelType.DOUBLE, doc="Motor Step Size (EGU)"
)
motor_status = pvproperty(
name="MSTA", dtype=ChannelType.LONG, doc="Motor Status", read_only=True
)
monitor_mask_more = pvproperty(
name="NMAP",
dtype=ChannelType.LONG,
doc="Monitor Mask (more)",
read_only=True,
)
new_target_monitor = pvproperty(
name="NTM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="New Target Monitor",
value="YES",
)
ntm_deadband_factor = pvproperty(
name="NTMF", dtype=ChannelType.INT, doc="NTM Deadband Factor", value=2
)
user_offset = pvproperty(
name="OFF", dtype=ChannelType.DOUBLE, doc="User Offset (EGU)"
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
proportional_gain = pvproperty(
name="PCOF", dtype=ChannelType.DOUBLE, doc="Proportional Gain", value=0
)
post_move_commands = pvproperty(
name="POST",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Post-move commands",
)
post_process_command = pvproperty(
name="PP",
dtype=ChannelType.INT,
doc="Post process command",
read_only=True,
value=0,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
pre_move_commands = pvproperty(
name="PREM",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Pre-move commands",
)
user_readback_value = pvproperty(
name="RBV",
dtype=ChannelType.DOUBLE,
doc="User Readback Value",
read_only=True,
)
retry_count = pvproperty(
name="RCNT", dtype=ChannelType.INT, doc="Retry count", read_only=True
)
retry_deadband = pvproperty(
name="RDBD", dtype=ChannelType.DOUBLE, doc="Retry Deadband (EGU)"
)
readback_location = pvproperty(
name="RDBL", dtype=ChannelType.STRING, doc="Readback Location"
)
difference_rval_rrbv = pvproperty(
name="RDIF",
dtype=ChannelType.LONG,
doc="Difference rval-rrbv",
read_only=True,
)
raw_encoder_position = pvproperty(
name="REP",
dtype=ChannelType.LONG,
doc="Raw Encoder Position",
read_only=True,
)
raw_high_limit_switch = pvproperty(
name="RHLS",
dtype=ChannelType.INT,
doc="Raw High Limit Switch",
read_only=True,
)
rmp_input_link = pvproperty(
name="RINP", dtype=ChannelType.STRING, doc="RMP Input Link"
)
raw_low_limit_switch = pvproperty(
name="RLLS",
dtype=ChannelType.INT,
doc="Raw Low Limit Switch",
read_only=True,
)
readback_outlink = pvproperty(
name="RLNK", dtype=ChannelType.STRING, doc="Readback OutLink"
)
relative_value = pvproperty(
name="RLV", dtype=ChannelType.DOUBLE, doc="Relative Value (EGU)"
)
retry_mode = pvproperty(
name="RMOD",
dtype=ChannelType.ENUM,
enum_strings=menus.motorRMOD.get_string_tuple(),
doc="Retry Mode",
value="Default",
)
raw_motor_position = pvproperty(
name="RMP",
dtype=ChannelType.LONG,
doc="Raw Motor Position",
read_only=True,
)
raw_readback_value = pvproperty(
name="RRBV",
dtype=ChannelType.LONG,
doc="Raw Readback Value",
read_only=True,
)
readback_step_size = pvproperty(
name="RRES", dtype=ChannelType.DOUBLE, doc="Readback Step Size (EGU)"
)
max_retry_count = pvproperty(
name="RTRY", dtype=ChannelType.INT, doc="Max retry count", value=10
)
raw_desired_value = pvproperty(
name="RVAL", dtype=ChannelType.LONG, doc="Raw Desired Value (steps)"
)
raw_velocity = pvproperty(
name="RVEL", dtype=ChannelType.LONG, doc="Raw Velocity", read_only=True
)
speed = pvproperty(
name="S", dtype=ChannelType.DOUBLE, doc="Speed (revolutions/sec)"
)
bl_speed = pvproperty(
name="SBAK", dtype=ChannelType.DOUBLE, doc="BL Speed (RPS)"
)
base_speed = pvproperty(
name="SBAS", dtype=ChannelType.DOUBLE, doc="Base Speed (RPS)"
)
set_use_switch = pvproperty(
name="SET",
dtype=ChannelType.ENUM,
enum_strings=menus.motorSET.get_string_tuple(),
doc="Set/Use Switch",
)
max_speed = pvproperty(
name="SMAX", dtype=ChannelType.DOUBLE, doc="Max. Speed (RPS)"
)
setpoint_deadband = pvproperty(
name="SPDB", dtype=ChannelType.DOUBLE, doc="Setpoint Deadband (EGU)"
)
stop_pause_move_go = pvproperty(
name="SPMG",
dtype=ChannelType.ENUM,
enum_strings=menus.motorSPMG.get_string_tuple(),
doc="Stop/Pause/Move/Go",
value=3,
)
steps_per_revolution = pvproperty(
name="SREV",
dtype=ChannelType.LONG,
doc="Steps per Revolution",
value=200,
)
set_set_mode = pvproperty(
name="SSET", dtype=ChannelType.INT, doc="Set SET Mode"
)
stop_outlink = pvproperty(
name="STOO", dtype=ChannelType.STRING, doc="STOP OutLink"
)
stop = pvproperty(name="STOP", dtype=ChannelType.INT, doc="Stop")
status_update = pvproperty(
name="STUP",
dtype=ChannelType.ENUM,
enum_strings=menus.motorSTUP.get_string_tuple(),
doc="Status Update",
value="OFF",
)
set_use_mode = pvproperty(
name="SUSE", dtype=ChannelType.INT, doc="Set USE Mode"
)
sync_position = pvproperty(
name="SYNC", dtype=ChannelType.INT, doc="Sync position"
)
direction_of_travel = pvproperty(
name="TDIR",
dtype=ChannelType.INT,
doc="Direction of Travel",
read_only=True,
)
tweak_motor_forward = pvproperty(
name="TWF", dtype=ChannelType.INT, doc="Tweak motor Forward"
)
tweak_motor_reverse = pvproperty(
name="TWR", dtype=ChannelType.INT, doc="Tweak motor Reverse"
)
tweak_step_size = pvproperty(
name="TWV", dtype=ChannelType.DOUBLE, doc="Tweak Step Size (EGU)"
)
use_encoder_if_present = pvproperty(
name="UEIP",
dtype=ChannelType.ENUM,
enum_strings=menus.motorUEIP.get_string_tuple(),
doc="Use Encoder If Present",
)
egu_s_per_revolution = pvproperty(
name="UREV", dtype=ChannelType.DOUBLE, doc="EGU's per Revolution"
)
use_rdbl_link_if_presen = pvproperty(
name="URIP",
dtype=ChannelType.ENUM,
enum_strings=menus.motorUEIP.get_string_tuple(),
doc="Use RDBL Link If Present",
)
base_velocity = pvproperty(
name="VBAS", dtype=ChannelType.DOUBLE, doc="Base Velocity (EGU/s)"
)
velocity = pvproperty(
name="VELO", dtype=ChannelType.DOUBLE, doc="Velocity (EGU/s)"
)
code_version = pvproperty(
name="VERS",
dtype=ChannelType.FLOAT,
doc="Code Version",
read_only=True,
value=1,
)
max_velocity = pvproperty(
name="VMAX", dtype=ChannelType.DOUBLE, doc="Max. Velocity (EGU/s)"
)
variable_offset = pvproperty(
name="VOF", dtype=ChannelType.INT, doc="Variable Offset"
)
# user_desired_value = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='User Desired Value (EGU)')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
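# MotorFields above spans the motor record's three coordinate systems:
# raw steps (RVAL), dial units (DVAL = RVAL * MRES), and user units
# (VAL = DVAL * DIR + OFF, where DIR is +1 for Pos and -1 for Neg).
# A sketch of those conversions (hypothetical helpers, consistent with
# the MRES/DIR/OFF fields above):
def _dial_from_raw(rval: float, mres: float) -> float:
    """Convert raw steps to dial units via the motor step size."""
    return rval * mres
def _user_from_dial(dval: float, dir_positive: bool, off: float) -> float:
    """Convert dial units to user units via direction and user offset."""
    return dval * (1.0 if dir_positive else -1.0) + off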
class PermissiveFields(RecordFieldGroup):
_record_type = "permissive"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_permissive.get_string_tuple(),
doc="Device Type",
)
button_label = pvproperty(
name="LABL",
dtype=ChannelType.CHAR,
max_length=20,
report_as_string=True,
doc="Button Label",
)
old_flag = pvproperty(
name="OFLG", dtype=ChannelType.INT, doc="Old Flag", read_only=True
)
old_status = pvproperty(
name="OVAL", dtype=ChannelType.INT, doc="Old Status", read_only=True
)
wait_flag = pvproperty(name="WFLG", dtype=ChannelType.INT, doc="Wait Flag")
# status = pvproperty(name='VAL',
# dtype=ChannelType.INT,
# doc='Status')
class PrintfFields(RecordFieldGroup):
_record_type = "printf"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_printf.get_string_tuple(),
doc="Device Type",
)
format_string = pvproperty(
name="FMT",
dtype=ChannelType.CHAR,
max_length=81,
report_as_string=True,
doc="Format String",
)
input_0 = pvproperty(name="INP0", dtype=ChannelType.STRING, doc="Input 0")
input_1 = pvproperty(name="INP1", dtype=ChannelType.STRING, doc="Input 1")
input_2 = pvproperty(name="INP2", dtype=ChannelType.STRING, doc="Input 2")
input_3 = pvproperty(name="INP3", dtype=ChannelType.STRING, doc="Input 3")
input_4 = pvproperty(name="INP4", dtype=ChannelType.STRING, doc="Input 4")
input_5 = pvproperty(name="INP5", dtype=ChannelType.STRING, doc="Input 5")
input_6 = pvproperty(name="INP6", dtype=ChannelType.STRING, doc="Input 6")
input_7 = pvproperty(name="INP7", dtype=ChannelType.STRING, doc="Input 7")
input_8 = pvproperty(name="INP8", dtype=ChannelType.STRING, doc="Input 8")
input_9 = pvproperty(name="INP9", dtype=ChannelType.STRING, doc="Input 9")
invalid_link_string = pvproperty(
name="IVLS",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Invalid Link String",
value="LNK",
)
length_of_val = pvproperty(
name="LEN", dtype=ChannelType.LONG, doc="Length of VAL", read_only=True
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
size_of_val_buffer = pvproperty(
name="SIZV",
dtype=ChannelType.INT,
doc="Size of VAL buffer",
read_only=True,
value=41,
)
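# The printf record renders FMT using up to ten input links (INP0..INP9)
# and stores the result in VAL, bounded by the buffer size SIZV.  A rough
# sketch of that behavior using Python %-formatting (hypothetical helper;
# the real record uses C snprintf semantics):
def _printf_render(fmt: str, inputs: tuple, sizv: int = 41) -> str:
    """Format the inputs and truncate to the VAL buffer size."""
    text = fmt % tuple(inputs)
    return text[: sizv - 1]  # leave room for the terminating NUL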
class SelFields(RecordFieldGroup, _Limits):
_record_type = "sel"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_sel.get_string_tuple(),
doc="Device Type",
)
value_of_input_a = pvproperty(
name="A", dtype=ChannelType.DOUBLE, doc="Value of Input A"
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
value_of_input_b = pvproperty(
name="B", dtype=ChannelType.DOUBLE, doc="Value of Input B"
)
value_of_input_c = pvproperty(
name="C", dtype=ChannelType.DOUBLE, doc="Value of Input C"
)
value_of_input_d = pvproperty(
name="D", dtype=ChannelType.DOUBLE, doc="Value of Input D"
)
value_of_input_e = pvproperty(
name="E", dtype=ChannelType.DOUBLE, doc="Value of Input E"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
value_of_input_f = pvproperty(
name="F", dtype=ChannelType.DOUBLE, doc="Value of Input F"
)
value_of_input_g = pvproperty(
name="G", dtype=ChannelType.DOUBLE, doc="Value of Input G"
)
value_of_input_h = pvproperty(
name="H", dtype=ChannelType.DOUBLE, doc="Value of Input H"
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
value_of_input_i = pvproperty(
name="I", dtype=ChannelType.DOUBLE, doc="Value of Input I"
)
input_a = pvproperty(name="INPA", dtype=ChannelType.STRING, doc="Input A")
input_b = pvproperty(name="INPB", dtype=ChannelType.STRING, doc="Input B")
input_c = pvproperty(name="INPC", dtype=ChannelType.STRING, doc="Input C")
input_d = pvproperty(name="INPD", dtype=ChannelType.STRING, doc="Input D")
input_e = pvproperty(name="INPE", dtype=ChannelType.STRING, doc="Input E")
input_f = pvproperty(name="INPF", dtype=ChannelType.STRING, doc="Input F")
input_g = pvproperty(name="INPG", dtype=ChannelType.STRING, doc="Input G")
input_h = pvproperty(name="INPH", dtype=ChannelType.STRING, doc="Input H")
input_i = pvproperty(name="INPI", dtype=ChannelType.STRING, doc="Input I")
input_j = pvproperty(name="INPJ", dtype=ChannelType.STRING, doc="Input J")
input_k = pvproperty(name="INPK", dtype=ChannelType.STRING, doc="Input K")
input_l = pvproperty(name="INPL", dtype=ChannelType.STRING, doc="Input L")
value_of_input_j = pvproperty(
name="J", dtype=ChannelType.DOUBLE, doc="Value of Input J"
)
value_of_input_k = pvproperty(
name="K", dtype=ChannelType.DOUBLE, doc="Value of Input K"
)
value_of_input_l = pvproperty(
name="L", dtype=ChannelType.DOUBLE, doc="Value of Input L"
)
prev_value_of_a = pvproperty(
name="LA",
dtype=ChannelType.DOUBLE,
doc="Prev Value of A",
read_only=True,
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
prev_value_of_b = pvproperty(
name="LB",
dtype=ChannelType.DOUBLE,
doc="Prev Value of B",
read_only=True,
)
prev_value_of_c = pvproperty(
name="LC",
dtype=ChannelType.DOUBLE,
doc="Prev Value of C",
read_only=True,
)
prev_value_of_d = pvproperty(
name="LD",
dtype=ChannelType.DOUBLE,
doc="Prev Value of D",
read_only=True,
)
prev_value_of_e = pvproperty(
name="LE",
dtype=ChannelType.DOUBLE,
doc="Prev Value of E",
read_only=True,
)
prev_value_of_f = pvproperty(
name="LF",
dtype=ChannelType.DOUBLE,
doc="Prev Value of F",
read_only=True,
)
prev_value_of_g = pvproperty(
name="LG",
dtype=ChannelType.DOUBLE,
doc="Prev Value of G",
read_only=True,
)
prev_value_of_h = pvproperty(
name="LH",
dtype=ChannelType.DOUBLE,
doc="Prev Value of H",
read_only=True,
)
prev_value_of_i = pvproperty(
name="LI",
dtype=ChannelType.DOUBLE,
doc="Prev Value of I",
read_only=True,
)
prev_value_of_j = pvproperty(
name="LJ",
dtype=ChannelType.DOUBLE,
doc="Prev Value of J",
read_only=True,
)
prev_value_of_k = pvproperty(
name="LK",
dtype=ChannelType.DOUBLE,
doc="Prev Value of K",
read_only=True,
)
prev_value_of_l = pvproperty(
name="LL",
dtype=ChannelType.DOUBLE,
doc="Prev Value of L",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
last_val_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Val Monitored",
read_only=True,
)
last_index_monitored = pvproperty(
name="NLST",
dtype=ChannelType.INT,
doc="Last Index Monitored",
read_only=True,
)
index_value_location = pvproperty(
name="NVL", dtype=ChannelType.STRING, doc="Index Value Location"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
select_mechanism = pvproperty(
name="SELM",
dtype=ChannelType.ENUM,
enum_strings=menus.selSELM.get_string_tuple(),
doc="Select Mechanism",
)
index_value = pvproperty(
name="SELN", dtype=ChannelType.INT, doc="Index value"
)
# result = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Result',read_only=True)
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class SeqFields(RecordFieldGroup):
_record_type = "seq"
_dtype = ChannelType.LONG # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_seq.get_string_tuple(),
doc="Device Type",
)
delay_0 = pvproperty(name="DLY0", dtype=ChannelType.DOUBLE, doc="Delay 0")
delay_1 = pvproperty(name="DLY1", dtype=ChannelType.DOUBLE, doc="Delay 1")
delay_2 = pvproperty(name="DLY2", dtype=ChannelType.DOUBLE, doc="Delay 2")
delay_3 = pvproperty(name="DLY3", dtype=ChannelType.DOUBLE, doc="Delay 3")
delay_4 = pvproperty(name="DLY4", dtype=ChannelType.DOUBLE, doc="Delay 4")
delay_5 = pvproperty(name="DLY5", dtype=ChannelType.DOUBLE, doc="Delay 5")
delay_6 = pvproperty(name="DLY6", dtype=ChannelType.DOUBLE, doc="Delay 6")
delay_7 = pvproperty(name="DLY7", dtype=ChannelType.DOUBLE, doc="Delay 7")
delay_8 = pvproperty(name="DLY8", dtype=ChannelType.DOUBLE, doc="Delay 8")
delay_9 = pvproperty(name="DLY9", dtype=ChannelType.DOUBLE, doc="Delay 9")
delay_10 = pvproperty(name="DLYA", dtype=ChannelType.DOUBLE, doc="Delay 10")
delay_11 = pvproperty(name="DLYB", dtype=ChannelType.DOUBLE, doc="Delay 11")
delay_12 = pvproperty(name="DLYC", dtype=ChannelType.DOUBLE, doc="Delay 12")
delay_13 = pvproperty(name="DLYD", dtype=ChannelType.DOUBLE, doc="Delay 13")
delay_14 = pvproperty(name="DLYE", dtype=ChannelType.DOUBLE, doc="Delay 14")
delay_15 = pvproperty(name="DLYF", dtype=ChannelType.DOUBLE, doc="Delay 15")
value_0 = pvproperty(name="DO0", dtype=ChannelType.DOUBLE, doc="Value 0")
value_1 = pvproperty(name="DO1", dtype=ChannelType.DOUBLE, doc="Value 1")
value_2 = pvproperty(name="DO2", dtype=ChannelType.DOUBLE, doc="Value 2")
value_3 = pvproperty(name="DO3", dtype=ChannelType.DOUBLE, doc="Value 3")
value_4 = pvproperty(name="DO4", dtype=ChannelType.DOUBLE, doc="Value 4")
value_5 = pvproperty(name="DO5", dtype=ChannelType.DOUBLE, doc="Value 5")
value_6 = pvproperty(name="DO6", dtype=ChannelType.DOUBLE, doc="Value 6")
value_7 = pvproperty(name="DO7", dtype=ChannelType.DOUBLE, doc="Value 7")
value_8 = pvproperty(name="DO8", dtype=ChannelType.DOUBLE, doc="Value 8")
value_9 = pvproperty(name="DO9", dtype=ChannelType.DOUBLE, doc="Value 9")
value_10 = pvproperty(name="DOA", dtype=ChannelType.DOUBLE, doc="Value 10")
value_11 = pvproperty(name="DOB", dtype=ChannelType.DOUBLE, doc="Value 11")
value_12 = pvproperty(name="DOC", dtype=ChannelType.DOUBLE, doc="Value 12")
value_13 = pvproperty(name="DOD", dtype=ChannelType.DOUBLE, doc="Value 13")
value_14 = pvproperty(name="DOE", dtype=ChannelType.DOUBLE, doc="Value 14")
value_15 = pvproperty(name="DOF", dtype=ChannelType.DOUBLE, doc="Value 15")
input_link_0 = pvproperty(
name="DOL0", dtype=ChannelType.STRING, doc="Input link 0"
)
input_link_1 = pvproperty(
name="DOL1", dtype=ChannelType.STRING, doc="Input link 1"
)
input_link_2 = pvproperty(
name="DOL2", dtype=ChannelType.STRING, doc="Input link 2"
)
input_link_3 = pvproperty(
name="DOL3", dtype=ChannelType.STRING, doc="Input link 3"
)
input_link_4 = pvproperty(
name="DOL4", dtype=ChannelType.STRING, doc="Input link 4"
)
input_link_5 = pvproperty(
name="DOL5", dtype=ChannelType.STRING, doc="Input link 5"
)
input_link_6 = pvproperty(
name="DOL6", dtype=ChannelType.STRING, doc="Input link 6"
)
input_link_7 = pvproperty(
name="DOL7", dtype=ChannelType.STRING, doc="Input link 7"
)
input_link_8 = pvproperty(
name="DOL8", dtype=ChannelType.STRING, doc="Input link 8"
)
input_link_9 = pvproperty(
name="DOL9", dtype=ChannelType.STRING, doc="Input link 9"
)
input_link_10 = pvproperty(
name="DOLA", dtype=ChannelType.STRING, doc="Input link 10"
)
input_link_11 = pvproperty(
name="DOLB", dtype=ChannelType.STRING, doc="Input link 11"
)
input_link_12 = pvproperty(
name="DOLC", dtype=ChannelType.STRING, doc="Input link 12"
)
input_link_13 = pvproperty(
name="DOLD", dtype=ChannelType.STRING, doc="Input link 13"
)
input_link_14 = pvproperty(
name="DOLE", dtype=ChannelType.STRING, doc="Input link 14"
)
input_link_15 = pvproperty(
name="DOLF", dtype=ChannelType.STRING, doc="Input link 15"
)
output_link_0 = pvproperty(
name="LNK0", dtype=ChannelType.STRING, doc="Output Link 0"
)
output_link_1 = pvproperty(
name="LNK1", dtype=ChannelType.STRING, doc="Output Link 1"
)
output_link_2 = pvproperty(
name="LNK2", dtype=ChannelType.STRING, doc="Output Link 2"
)
output_link_3 = pvproperty(
name="LNK3", dtype=ChannelType.STRING, doc="Output Link 3"
)
output_link_4 = pvproperty(
name="LNK4", dtype=ChannelType.STRING, doc="Output Link 4"
)
output_link_5 = pvproperty(
name="LNK5", dtype=ChannelType.STRING, doc="Output Link 5"
)
output_link_6 = pvproperty(
name="LNK6", dtype=ChannelType.STRING, doc="Output Link 6"
)
output_link_7 = pvproperty(
name="LNK7", dtype=ChannelType.STRING, doc="Output Link 7"
)
output_link_8 = pvproperty(
name="LNK8", dtype=ChannelType.STRING, doc="Output Link 8"
)
output_link_9 = pvproperty(
name="LNK9", dtype=ChannelType.STRING, doc="Output Link 9"
)
output_link_10 = pvproperty(
name="LNKA", dtype=ChannelType.STRING, doc="Output Link 10"
)
output_link_11 = pvproperty(
name="LNKB", dtype=ChannelType.STRING, doc="Output Link 11"
)
output_link_12 = pvproperty(
name="LNKC", dtype=ChannelType.STRING, doc="Output Link 12"
)
output_link_13 = pvproperty(
name="LNKD", dtype=ChannelType.STRING, doc="Output Link 13"
)
output_link_14 = pvproperty(
name="LNKE", dtype=ChannelType.STRING, doc="Output Link 14"
)
output_link_15 = pvproperty(
name="LNKF", dtype=ChannelType.STRING, doc="Output Link 15"
)
offset_for_specified = pvproperty(
name="OFFS", dtype=ChannelType.INT, doc="Offset for Specified", value=0
)
old_selection = pvproperty(
name="OLDN", dtype=ChannelType.INT, doc="Old Selection"
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
link_selection_loc = pvproperty(
name="SELL", dtype=ChannelType.STRING, doc="Link Selection Loc"
)
select_mechanism = pvproperty(
name="SELM",
dtype=ChannelType.ENUM,
enum_strings=menus.seqSELM.get_string_tuple(),
doc="Select Mechanism",
)
link_selection = pvproperty(
name="SELN", dtype=ChannelType.INT, doc="Link Selection", value=1
)
shift_for_mask_mode = pvproperty(
name="SHFT", dtype=ChannelType.INT, doc="Shift for Mask mode", value=-1
)
# used_to_trigger = pvproperty(name='VAL',
# dtype=ChannelType.LONG,
# doc='Used to trigger')
link_parent_attribute(
display_precision, "precision",
)
class StateFields(RecordFieldGroup):
_record_type = "state"
_dtype = ChannelType.STRING # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_state.get_string_tuple(),
doc="Device Type",
)
prev_value = pvproperty(
name="OVAL",
dtype=ChannelType.CHAR,
max_length=20,
report_as_string=True,
doc="Prev Value",
read_only=True,
)
# value = pvproperty(name='VAL',
# dtype=ChannelType.CHAR,
# max_length=20,report_as_string=True,doc='Value')
class StringinFields(RecordFieldGroup):
_record_type = "stringin"
_dtype = ChannelType.STRING # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_stringin.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.stringinPOST.get_string_tuple(),
doc="Post Archive Monitors",
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.stringinPOST.get_string_tuple(),
doc="Post Value Monitors",
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
previous_value = pvproperty(
name="OVAL",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Previous Value",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
simulation_value = pvproperty(
name="SVAL",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Simulation Value",
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.CHAR,
# max_length=40,report_as_string=True,doc='Current Value')
class StringoutFields(RecordFieldGroup):
_record_type = "stringout"
_dtype = ChannelType.STRING # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_stringout.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.stringoutPOST.get_string_tuple(),
doc="Post Archive Monitors",
)
desired_output_loc = pvproperty(
name="DOL", dtype=ChannelType.STRING, doc="Desired Output Loc"
)
invalid_output_action = pvproperty(
name="IVOA",
dtype=ChannelType.ENUM,
enum_strings=menus.menuIvoa.get_string_tuple(),
doc="INVALID output action",
)
invalid_output_value = pvproperty(
name="IVOV",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="INVALID output value",
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.stringoutPOST.get_string_tuple(),
doc="Post Value Monitors",
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
output_mode_select = pvproperty(
name="OMSL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuOmsl.get_string_tuple(),
doc="Output Mode Select",
)
output_specification = pvproperty(
name="OUT", dtype=ChannelType.STRING, doc="Output Specification"
)
previous_value = pvproperty(
name="OVAL",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Previous Value",
read_only=True,
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_output_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Output Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
# current_value = pvproperty(name='VAL',
# dtype=ChannelType.CHAR,
# max_length=40,report_as_string=True,doc='Current Value')
class SubFields(RecordFieldGroup, _Limits):
_record_type = "sub"
_dtype = ChannelType.DOUBLE # DTYP of .VAL
has_val_field = True
copy_pvproperties(locals(), RecordFieldGroup, _Limits)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_sub.get_string_tuple(),
doc="Device Type",
)
value_of_input_a = pvproperty(
name="A", dtype=ChannelType.DOUBLE, doc="Value of Input A"
)
archive_deadband = pvproperty(
name="ADEL", dtype=ChannelType.DOUBLE, doc="Archive Deadband"
)
last_value_archived = pvproperty(
name="ALST",
dtype=ChannelType.DOUBLE,
doc="Last Value Archived",
read_only=True,
)
value_of_input_b = pvproperty(
name="B", dtype=ChannelType.DOUBLE, doc="Value of Input B"
)
bad_return_severity = pvproperty(
name="BRSV",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Bad Return Severity",
)
value_of_input_c = pvproperty(
name="C", dtype=ChannelType.DOUBLE, doc="Value of Input C"
)
value_of_input_d = pvproperty(
name="D", dtype=ChannelType.DOUBLE, doc="Value of Input D"
)
value_of_input_e = pvproperty(
name="E", dtype=ChannelType.DOUBLE, doc="Value of Input E"
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
value_of_input_f = pvproperty(
name="F", dtype=ChannelType.DOUBLE, doc="Value of Input F"
)
value_of_input_g = pvproperty(
name="G", dtype=ChannelType.DOUBLE, doc="Value of Input G"
)
value_of_input_h = pvproperty(
name="H", dtype=ChannelType.DOUBLE, doc="Value of Input H"
)
alarm_deadband = pvproperty(
name="HYST", dtype=ChannelType.DOUBLE, doc="Alarm Deadband"
)
value_of_input_i = pvproperty(
name="I", dtype=ChannelType.DOUBLE, doc="Value of Input I"
)
init_routine_name = pvproperty(
name="INAM",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Init Routine Name",
read_only=True,
)
input_a = pvproperty(name="INPA", dtype=ChannelType.STRING, doc="Input A")
input_b = pvproperty(name="INPB", dtype=ChannelType.STRING, doc="Input B")
input_c = pvproperty(name="INPC", dtype=ChannelType.STRING, doc="Input C")
input_d = pvproperty(name="INPD", dtype=ChannelType.STRING, doc="Input D")
input_e = pvproperty(name="INPE", dtype=ChannelType.STRING, doc="Input E")
input_f = pvproperty(name="INPF", dtype=ChannelType.STRING, doc="Input F")
input_g = pvproperty(name="INPG", dtype=ChannelType.STRING, doc="Input G")
input_h = pvproperty(name="INPH", dtype=ChannelType.STRING, doc="Input H")
input_i = pvproperty(name="INPI", dtype=ChannelType.STRING, doc="Input I")
input_j = pvproperty(name="INPJ", dtype=ChannelType.STRING, doc="Input J")
input_k = pvproperty(name="INPK", dtype=ChannelType.STRING, doc="Input K")
input_l = pvproperty(name="INPL", dtype=ChannelType.STRING, doc="Input L")
value_of_input_j = pvproperty(
name="J", dtype=ChannelType.DOUBLE, doc="Value of Input J"
)
value_of_input_k = pvproperty(
name="K", dtype=ChannelType.DOUBLE, doc="Value of Input K"
)
value_of_input_l = pvproperty(
name="L", dtype=ChannelType.DOUBLE, doc="Value of Input L"
)
prev_value_of_a = pvproperty(
name="LA",
dtype=ChannelType.DOUBLE,
doc="Prev Value of A",
read_only=True,
)
last_value_alarmed = pvproperty(
name="LALM",
dtype=ChannelType.DOUBLE,
doc="Last Value Alarmed",
read_only=True,
)
prev_value_of_b = pvproperty(
name="LB",
dtype=ChannelType.DOUBLE,
doc="Prev Value of B",
read_only=True,
)
prev_value_of_c = pvproperty(
name="LC",
dtype=ChannelType.DOUBLE,
doc="Prev Value of C",
read_only=True,
)
prev_value_of_d = pvproperty(
name="LD",
dtype=ChannelType.DOUBLE,
doc="Prev Value of D",
read_only=True,
)
prev_value_of_e = pvproperty(
name="LE",
dtype=ChannelType.DOUBLE,
doc="Prev Value of E",
read_only=True,
)
prev_value_of_f = pvproperty(
name="LF",
dtype=ChannelType.DOUBLE,
doc="Prev Value of F",
read_only=True,
)
prev_value_of_g = pvproperty(
name="LG",
dtype=ChannelType.DOUBLE,
doc="Prev Value of G",
read_only=True,
)
prev_value_of_h = pvproperty(
name="LH",
dtype=ChannelType.DOUBLE,
doc="Prev Value of H",
read_only=True,
)
prev_value_of_i = pvproperty(
name="LI",
dtype=ChannelType.DOUBLE,
doc="Prev Value of I",
read_only=True,
)
prev_value_of_j = pvproperty(
name="LJ",
dtype=ChannelType.DOUBLE,
doc="Prev Value of J",
read_only=True,
)
prev_value_of_k = pvproperty(
name="LK",
dtype=ChannelType.DOUBLE,
doc="Prev Value of K",
read_only=True,
)
prev_value_of_l = pvproperty(
name="LL",
dtype=ChannelType.DOUBLE,
doc="Prev Value of L",
read_only=True,
)
monitor_deadband = pvproperty(
name="MDEL", dtype=ChannelType.DOUBLE, doc="Monitor Deadband"
)
last_value_monitored = pvproperty(
name="MLST",
dtype=ChannelType.DOUBLE,
doc="Last Value Monitored",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
subroutine_name = pvproperty(
name="SNAM",
dtype=ChannelType.CHAR,
max_length=40,
report_as_string=True,
doc="Subroutine Name",
)
# result = pvproperty(name='VAL',
# dtype=ChannelType.DOUBLE,
# doc='Result')
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(archive_deadband, "log_atol", use_setattr=True)
link_parent_attribute(monitor_deadband, "value_atol", use_setattr=True)
class SubarrayFields(RecordFieldGroup):
_record_type = "subArray"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_subArray.get_string_tuple(),
doc="Device Type",
)
busy_indicator = pvproperty(
name="BUSY", dtype=ChannelType.INT, doc="Busy Indicator", read_only=True
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
field_type_of_value = pvproperty(
name="FTVL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Field Type of Value",
read_only=True,
)
high_operating_range = pvproperty(
name="HOPR", dtype=ChannelType.DOUBLE, doc="High Operating Range"
)
substring_index = pvproperty(
name="INDX", dtype=ChannelType.LONG, doc="Substring Index"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
low_operating_range = pvproperty(
name="LOPR", dtype=ChannelType.DOUBLE, doc="Low Operating Range"
)
maximum_elements = pvproperty(
name="MALM",
dtype=ChannelType.LONG,
doc="Maximum Elements",
read_only=True,
value=1,
)
number_of_elements = pvproperty(
name="NELM", dtype=ChannelType.LONG, doc="Number of Elements", value=1
)
number_elements_read = pvproperty(
name="NORD",
dtype=ChannelType.LONG,
doc="Number elements read",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(
high_operating_range, "upper_ctrl_limit",
)
link_parent_attribute(
low_operating_range, "lower_ctrl_limit",
)
link_parent_attribute(
maximum_elements, "length", use_setattr=True, read_only=True
)
link_parent_attribute(
number_of_elements, "max_length", use_setattr=True, read_only=True
)
class WaveformFields(RecordFieldGroup):
_record_type = "waveform"
_dtype = None # DTYP of .VAL
has_val_field = False
copy_pvproperties(locals(), RecordFieldGroup)
device_type = pvproperty(
name="DTYP",
dtype=ChannelType.ENUM,
enum_strings=menus.dtyp_waveform.get_string_tuple(),
doc="Device Type",
)
post_archive_monitors = pvproperty(
name="APST",
dtype=ChannelType.ENUM,
enum_strings=menus.waveformPOST.get_string_tuple(),
doc="Post Archive Monitors",
)
busy_indicator = pvproperty(
name="BUSY", dtype=ChannelType.INT, doc="Busy Indicator", read_only=True
)
engineering_units = pvproperty(
name="EGU",
dtype=ChannelType.CHAR,
max_length=16,
report_as_string=True,
doc="Engineering Units",
)
field_type_of_value = pvproperty(
name="FTVL",
dtype=ChannelType.ENUM,
enum_strings=menus.menuFtype.get_string_tuple(),
doc="Field Type of Value",
read_only=True,
)
hash_of_onchange_data = pvproperty(
name="HASH", dtype=ChannelType.LONG, doc="Hash of OnChange data."
)
high_operating_range = pvproperty(
name="HOPR", dtype=ChannelType.DOUBLE, doc="High Operating Range"
)
input_specification = pvproperty(
name="INP", dtype=ChannelType.STRING, doc="Input Specification"
)
low_operating_range = pvproperty(
name="LOPR", dtype=ChannelType.DOUBLE, doc="Low Operating Range"
)
post_value_monitors = pvproperty(
name="MPST",
dtype=ChannelType.ENUM,
enum_strings=menus.waveformPOST.get_string_tuple(),
doc="Post Value Monitors",
)
number_of_elements = pvproperty(
name="NELM",
dtype=ChannelType.LONG,
doc="Number of Elements",
read_only=True,
value=1,
)
number_elements_read = pvproperty(
name="NORD",
dtype=ChannelType.LONG,
doc="Number elements read",
read_only=True,
)
prev_simulation_mode = pvproperty(
name="OLDSIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuSimm.get_string_tuple(),
doc="Prev. Simulation Mode",
read_only=True,
)
display_precision = pvproperty(
name="PREC", dtype=ChannelType.INT, doc="Display Precision"
)
rearm_the_waveform = pvproperty(
name="RARM", dtype=ChannelType.INT, doc="Rearm the waveform"
)
sim_mode_async_delay = pvproperty(
name="SDLY",
dtype=ChannelType.DOUBLE,
doc="Sim. Mode Async Delay",
value=-1.0,
)
simulation_mode_link = pvproperty(
name="SIML", dtype=ChannelType.STRING, doc="Simulation Mode Link"
)
simulation_mode = pvproperty(
name="SIMM",
dtype=ChannelType.ENUM,
enum_strings=menus.menuYesNo.get_string_tuple(),
doc="Simulation Mode",
)
simulation_mode_severity = pvproperty(
name="SIMS",
dtype=ChannelType.ENUM,
enum_strings=menus.menuAlarmSevr.get_string_tuple(),
doc="Simulation Mode Severity",
)
simulation_input_link = pvproperty(
name="SIOL", dtype=ChannelType.STRING, doc="Simulation Input Link"
)
sim_mode_scan = pvproperty(
name="SSCN",
dtype=ChannelType.ENUM,
enum_strings=menus.menuScan.get_string_tuple(),
doc="Sim. Mode Scan",
value=0,
)
link_parent_attribute(
display_precision, "precision",
)
link_parent_attribute(
high_operating_range, "upper_ctrl_limit",
)
link_parent_attribute(
low_operating_range, "lower_ctrl_limit",
)
link_parent_attribute(
number_of_elements, "max_length", use_setattr=True, read_only=True
)
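The 16-element field sets declared above (DLY0-DLY9 then DLYA-DLYF, likewise DO*, DOL*, and LNK*) follow the EPICS convention of indexing fields with a single uppercase hex digit, so indices 10-15 become the letters A-F. A minimal sketch of that naming rule; the helper name `hex_suffix_fields` is hypothetical and not part of caproto:

```python
# Hypothetical helper (not part of caproto): build the field-name suffixes
# used by the seq record's 16-element field sets, where indices 10-15 are
# written as the hex digits A-F (e.g. DLY0..DLY9, then DLYA..DLYF).
def hex_suffix_fields(prefix: str, count: int = 16) -> list:
    """Return field names like ['DLY0', ..., 'DLY9', 'DLYA', ..., 'DLYF']."""
    return [f"{prefix}{index:X}" for index in range(count)]


print(hex_suffix_fields("DLY"))
```

The same rule maps each `delay_N`/`value_N`/`input_link_N`/`output_link_N` attribute in `SeqFields` to its PV field name (e.g. `delay_10` to `DLYA`).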
c10e383471033886ae7cd340be81bd2cdb0c32c4 | 962 | py | Python | vpn/vpnconf/models.py | futurice/vpn-management-server | 5418b22356b58cd9a7f3043ec21e1e728abb6b27 | [
"BSD-3-Clause"
] | 13 | 2015-11-23T05:05:16.000Z | 2021-05-30T13:00:46.000Z | vpn/vpnconf/models.py | futurice/vpn-management-server | 5418b22356b58cd9a7f3043ec21e1e728abb6b27 | [
"BSD-3-Clause"
] | null | null | null | vpn/vpnconf/models.py | futurice/vpn-management-server | 5418b22356b58cd9a7f3043ec21e1e728abb6b27 | [
"BSD-3-Clause"
] | 12 | 2015-01-09T08:07:48.000Z | 2022-02-28T05:00:10.000Z | from django.contrib.auth.models import User
from django.db import models
from django.conf import settings
class Employment(models.Model):
name = models.CharField(max_length=50)
descr = models.CharField(max_length=100)
def __unicode__(self):
return self.descr
class Computertype(models.Model):
name = models.CharField(max_length=50)
descr = models.CharField(max_length=100)
def __unicode__(self):
return self.descr
class Computerowner(models.Model):
name = models.CharField(max_length=50)
descr = models.CharField(max_length=100)
def __unicode__(self):
return self.descr
class HelpChoices(models.Model):
name = models.CharField(max_length=50)
descr = models.CharField(max_length=100)
def __unicode__(self):
return self.descr
class Log(models.Model):
cn = models.CharField(max_length=50)
timestamp = models.DateTimeField()
message = models.CharField(max_length=250)
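A note on the `__unicode__` methods above: they are only honored on Python 2 / old Django; Python 3 (and Django 2+) call `__str__` instead. A minimal plain-Python sketch of the modern equivalent (no Django dependency, the class shape is illustrative only):

```python
class Employment:
    """Plain-Python stand-in for the Django model above."""

    def __init__(self, name, descr):
        self.name = name
        self.descr = descr

    def __str__(self):
        # Python 3 / Django 2+ use __str__ instead of __unicode__.
        return self.descr


print(Employment("dev", "Developer"))  # prints: Developer
```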
| 28.294118 | 46 | 0.725572 | 124 | 962 | 5.419355 | 0.266129 | 0.223214 | 0.267857 | 0.357143 | 0.675595 | 0.636905 | 0.636905 | 0.636905 | 0.636905 | 0.636905 | 0 | 0.031526 | 0.175676 | 962 | 33 | 47 | 29.151515 | 0.815889 | 0 | 0 | 0.592593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148148 | false | 0 | 0.111111 | 0.148148 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
c14329572d8a2634623e7fd0d797edbac077c8e1 | 3,989 | py | Python | transformers/hierarchical/firstNCharCVTE.py | ucds-sg/h2oai | 7042860767dc25d1a7d7122103bbd5016d02df53 | [
"Apache-2.0"
] | null | null | null | transformers/hierarchical/firstNCharCVTE.py | ucds-sg/h2oai | 7042860767dc25d1a7d7122103bbd5016d02df53 | [
"Apache-2.0"
] | null | null | null | transformers/hierarchical/firstNCharCVTE.py | ucds-sg/h2oai | 7042860767dc25d1a7d7122103bbd5016d02df53 | [
"Apache-2.0"
] | null | null | null | """Target-encode high cardinality categorical text by their first few characters in the string """
"""The str columns must be first marked as text in Data Sets page before recipe can take effect """
from h2oaicore.transformer_utils import CustomTransformer
import datatable as dt
import numpy as np
from h2oaicore.transformers import CVTargetEncodeTransformer
from sklearn.preprocessing import LabelEncoder
class firstNChars:
    def fit_transform(self, X: dt.Frame, n):
        return self.transform(X, n)

    def transform(self, X: dt.Frame, n):
        assert X.ncols == 1
        # Keep only the first n characters of each string value.
        return dt.Frame(X.to_pandas().apply(lambda x: x[0:n], axis=1))
class frst1ChrsCVTE(CustomTransformer):
@staticmethod
def get_default_properties():
return dict(col_type="text", min_cols=1, max_cols=1, relative_importance=1)
def fit_transform(self, X: dt.Frame, y: np.array = None):
self.binner = firstNChars()
X = self.binner.fit_transform(X,1)
# Compute mean target (out of fold) per same string
self.cvte = CVTargetEncodeTransformer(cat_cols=X.names)
if self.labels is not None:
# for classification, always turn y into numeric form, even if already integer
y = dt.Frame(LabelEncoder().fit(self.labels).transform(y))
X = self.cvte.fit_transform(X, y)
return X
def transform(self, X: dt.Frame):
X = self.binner.transform(X,1)
X = self.cvte.transform(X)
return X
class frst2ChrsCVTE(CustomTransformer):
@staticmethod
def get_default_properties():
return dict(col_type="text", min_cols=1, max_cols=1, relative_importance=1)
def fit_transform(self, X: dt.Frame, y: np.array = None):
self.binner = firstNChars()
X = self.binner.fit_transform(X,2)
# Compute mean target (out of fold) per same string
self.cvte = CVTargetEncodeTransformer(cat_cols=X.names)
if self.labels is not None:
# for classification, always turn y into numeric form, even if already integer
y = dt.Frame(LabelEncoder().fit(self.labels).transform(y))
X = self.cvte.fit_transform(X, y)
return X
def transform(self, X: dt.Frame):
X = self.binner.transform(X,2)
X = self.cvte.transform(X)
return X
class frst3ChrsCVTE(CustomTransformer):
@staticmethod
def get_default_properties():
return dict(col_type="text", min_cols=1, max_cols=1, relative_importance=1)
def fit_transform(self, X: dt.Frame, y: np.array = None):
self.binner = firstNChars()
X = self.binner.fit_transform(X,3)
# Compute mean target (out of fold) per same string
self.cvte = CVTargetEncodeTransformer(cat_cols=X.names)
if self.labels is not None:
# for classification, always turn y into numeric form, even if already integer
y = dt.Frame(LabelEncoder().fit(self.labels).transform(y))
X = self.cvte.fit_transform(X, y)
return X
def transform(self, X: dt.Frame):
X = self.binner.transform(X,3)
X = self.cvte.transform(X)
return X
class frst4ChrsCVTE(CustomTransformer):
@staticmethod
def get_default_properties():
return dict(col_type="text", min_cols=1, max_cols=1, relative_importance=1)
def fit_transform(self, X: dt.Frame, y: np.array = None):
self.binner = firstNChars()
X = self.binner.fit_transform(X,4)
# Compute mean target (out of fold) per same string
self.cvte = CVTargetEncodeTransformer(cat_cols=X.names)
if self.labels is not None:
# for classification, always turn y into numeric form, even if already integer
y = dt.Frame(LabelEncoder().fit(self.labels).transform(y))
X = self.cvte.fit_transform(X, y)
return X
def transform(self, X: dt.Frame):
X = self.binner.transform(X,4)
X = self.cvte.transform(X)
return X
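The `firstNChars` binner above reduces each string to its first `n` characters before target encoding; stripped of the datatable/pandas plumbing, the core idea is just a slice (a plain-Python sketch, not the H2O API):

```python
def first_n_chars(values, n):
    """Truncate each string to its first n characters (shorter strings pass through)."""
    return [v[:n] for v in values]


cities = ["Helsinki", "Hamburg", "Oslo", "Hanoi"]
print(first_n_chars(cities, 2))  # prints: ['He', 'Ha', 'Os', 'Ha']
```

Grouping by such prefixes collapses a high-cardinality column into far fewer levels, which is what makes the subsequent cross-validated target encoding stable.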
| 34.686957 | 99 | 0.65806 | 552 | 3,989 | 4.677536 | 0.197464 | 0.06584 | 0.054222 | 0.061967 | 0.808675 | 0.808675 | 0.798606 | 0.778079 | 0.74206 | 0.74206 | 0 | 0.009571 | 0.240411 | 3,989 | 114 | 100 | 34.991228 | 0.842574 | 0.150664 | 0 | 0.693333 | 0 | 0 | 0.004884 | 0 | 0 | 0 | 0 | 0 | 0.013333 | 1 | 0.186667 | false | 0 | 0.12 | 0.066667 | 0.56 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c17290801072964d40d0717b9958f5e98d66bf71 | 3,993 | py | Python | tests/api/test_marketo_api.py | BarracudaPff/code-golf-data-pythpn | 42e8858c2ebc6a061012bcadb167d29cebb85c5e | [
"MIT"
] | null | null | null | tests/api/test_marketo_api.py | BarracudaPff/code-golf-data-pythpn | 42e8858c2ebc6a061012bcadb167d29cebb85c5e | [
"MIT"
] | null | null | null | tests/api/test_marketo_api.py | BarracudaPff/code-golf-data-pythpn | 42e8858c2ebc6a061012bcadb167d29cebb85c5e | [
"MIT"
] | null | null | null | import os
import unittest

import responses

# NOTE: the marketo_api module under test must also be importable here;
# its import path depends on the project layout.
responses.mock.assert_all_requests_are_fired = True
class MarketoApi(unittest.TestCase):
@responses.activate
def test_auth(self):
marketo_auth_url = "".join(["https://066-eov-335.mktorest.com/", "identity/oauth/token?", "grant_type=client_credentials&client_id=123", "&client_secret=321"])
marketo_auth_payload = {"access_token": "test"}
responses.add(responses.GET, marketo_auth_url, json=marketo_auth_payload, status=200)
marketo_leads_url = "".join(["https://066-eov-335.mktorest.com/", "rest/v1/leads.json?", "access_token=test&filterType=email", "&filterValues=testing@testing.com&fields=id"])
marketo_leads_payload = {"result": [{"id": "test"}]}
responses.add(responses.GET, marketo_leads_url, json=marketo_leads_payload, status=200)
os.environ["MARKETO_CLIENT_ID"] = "fake_id"
os.environ["MARKETO_CLIENT_SECRET"] = "fake_secret"
marketo = marketo_api.MarketoApi()
user = marketo.get_user("testing@testing.com")
self.assertEqual(user, {"id": "test"})
@responses.activate
def test_get_user(self):
marketo_leads_url = "".join(["https://066-eov-335.mktorest.com/", "rest/v1/leads.json?", "access_token=test&filterType=email", "&filterValues=testing@testing.com&fields=id"])
marketo_leads_payload = {"result": [{"id": "test"}]}
responses.add(responses.GET, marketo_leads_url, json=marketo_leads_payload, status=200)
marketo = marketo_api.MarketoApi()
marketo.token = "test"
user = marketo.get_user("testing@testing.com")
self.assertEqual(user, {"id": "test"})
@responses.activate
def test_get_newsletter_subscription(self):
marketo_lead_url = "".join(["https://066-eov-335.mktorest.com/", "rest/v1/lead/test.json?", "access_token=test&fields=id,email,snapcraftnewsletter"])
marketo_lead_payload = {"result": [{"snapcraftnewsletter": True}]}
responses.add(responses.GET, marketo_lead_url, json=marketo_lead_payload, status=200)
marketo = marketo_api.MarketoApi()
marketo.token = "test"
subscription = marketo.get_newsletter_subscription("test")
self.assertEqual(subscription, {"snapcraftnewsletter": True})
@responses.activate
def test_get_newsletter_subscription_bad_response(self):
marketo_lead_url = "".join(["https://066-eov-335.mktorest.com/", "rest/v1/lead/test.json?", "access_token=test&fields=id,email,snapcraftnewsletter"])
marketo_lead_payload = {"badkey": "bad"}
responses.add(responses.GET, marketo_lead_url, json=marketo_lead_payload, status=200)
marketo = marketo_api.MarketoApi()
marketo.token = "test"
subscription = marketo.get_newsletter_subscription("test")
self.assertEqual(subscription, {})
@responses.activate
def test_set_newsletter_subscription(self):
marketo_set_subscription_url = "".join(["https://066-eov-335.mktorest.com/", "rest/v1/leads.json?", "access_token=test&filterType=email", "&filterValues=testing@testing.com&fields=id"])
responses.add(responses.POST, marketo_set_subscription_url, json={}, status=200)
marketo = marketo_api.MarketoApi()
marketo.token = "test"
response = marketo.set_newsletter_subscription("test", True)
self.assertEqual(response, {})
@responses.activate
def test_token_refresh(self):
marketo_leads_url = "".join(["https://066-eov-335.mktorest.com/", "rest/v1/leads.json?", "access_token=test&filterType=email", "&filterValues=testing@testing.com&fields=id"])
marketo_leads_payload = {"result": [{"id": "test"}]}
responses.add(responses.GET, marketo_leads_url, status=602)
marketo_auth_url = "".join(["https://066-eov-335.mktorest.com/", "identity/oauth/token?", "grant_type=client_credentials&client_id=123", "&client_secret=321"])
marketo_auth_payload = {"access_token": "refreshed_token"}
responses.add(responses.GET, marketo_auth_url, json=marketo_auth_payload, status=200)
responses.add(responses.GET, marketo_leads_url, json=marketo_leads_payload, status=200)
marketo = marketo_api.MarketoApi()
marketo.token = "expired_token"
marketo.get_user("testing@testing.com")
self.assertEqual(marketo.token, "refreshed_token") | 63.380952 | 187 | 0.758077 | 522 | 3,993 | 5.561303 | 0.137931 | 0.053738 | 0.065105 | 0.041337 | 0.802618 | 0.802618 | 0.80124 | 0.776783 | 0.760937 | 0.743024 | 0 | 0.025292 | 0.079138 | 3,993 | 63 | 188 | 63.380952 | 0.76421 | 0 | 0 | 0.619048 | 0 | 0 | 0.31973 | 0.152479 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.095238 | false | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c196d45726ef57b5367bbf32020738503b9b1c6b | 251 | py | Python | boo/test/test_year.py | AirVetra/boo | bb8404c48a6f17402a98a88454fdd876184c35c1 | [
"MIT"
] | 1 | 2019-12-06T21:13:22.000Z | 2019-12-06T21:13:22.000Z | boo/test/test_year.py | AirVetra/boo | bb8404c48a6f17402a98a88454fdd876184c35c1 | [
"MIT"
] | null | null | null | boo/test/test_year.py | AirVetra/boo | bb8404c48a6f17402a98a88454fdd876184c35c1 | [
"MIT"
] | null | null | null | import pytest
from boo.year import make_url
def test_make_url_on_0():
assert make_url(0)
def test_make_url_on_good_year():
assert make_url(2012)
def test_make_url_on_bad_year():
with pytest.raises(ValueError):
make_url(1990)
| 15.6875 | 35 | 0.741036 | 43 | 251 | 3.906977 | 0.44186 | 0.291667 | 0.196429 | 0.25 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048544 | 0.179283 | 251 | 15 | 36 | 16.733333 | 0.76699 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.333333 | true | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
c19ae499db4982713bb565ec68b3ab18348bdb49 | 353 | py | Python | misp_stix_converter/stix_import/stix2_import.py | MISP/misp-stix | fbbce1fa63235b22373063d61b1f97ee408a4aad | [
"BSD-2-Clause"
] | 7 | 2021-03-24T05:36:36.000Z | 2022-02-12T08:32:10.000Z | misp_stix_converter/stix_import/stix2_import.py | MISP/misp-stix | fbbce1fa63235b22373063d61b1f97ee408a4aad | [
"BSD-2-Clause"
] | 11 | 2021-05-26T04:04:25.000Z | 2021-12-25T05:25:56.000Z | misp_stix_converter/stix_import/stix2_import.py | chrisr3d/MISP-STIX-Converter | ddc78acb18e33e6dfea8e96dfd48e472bcbd4b26 | [
"BSD-2-Clause"
] | 3 | 2021-07-04T22:56:10.000Z | 2022-03-25T21:07:59.000Z | # -*- coding: utf-8 -*-
#!/usr/bin/env python3
import stix2
class Stix2ImportParser(ImportParser):
def __init__(self):
super().__init__()
class Stix2FromMISPImportParser(ImportParser):
def __init__(self):
super().__init__()
class ExternalStix2ImportParser(ImportParser):
def __init__(self):
super().__init__()
| 17.65 | 46 | 0.677054 | 33 | 353 | 6.515152 | 0.545455 | 0.209302 | 0.265116 | 0.32093 | 0.493023 | 0.493023 | 0.344186 | 0 | 0 | 0 | 0 | 0.020979 | 0.189802 | 353 | 19 | 47 | 18.578947 | 0.730769 | 0.11898 | 0 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.4 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
de0454179243a398e4435af9b401a525f4c2ba37 | 161 | py | Python | app/modules/handlers/errors.py | willypuzzle/flask-foundation | 3d4a09470d8303905969f19a990baa0945fdf1d0 | [
"BSD-2-Clause"
] | null | null | null | app/modules/handlers/errors.py | willypuzzle/flask-foundation | 3d4a09470d8303905969f19a990baa0945fdf1d0 | [
"BSD-2-Clause"
] | null | null | null | app/modules/handlers/errors.py | willypuzzle/flask-foundation | 3d4a09470d8303905969f19a990baa0945fdf1d0 | [
"BSD-2-Clause"
] | null | null | null | from flask import render_template
def init_app(app):
@app.errorhandler(404)
def not_found(error):
return render_template('errors/404.html'), 404 | 26.833333 | 54 | 0.720497 | 23 | 161 | 4.869565 | 0.695652 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067669 | 0.173913 | 161 | 6 | 54 | 26.833333 | 0.774436 | 0 | 0 | 0 | 0 | 0 | 0.092593 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
a9b49e10cdcf2514a83b5a43111171783676b88d | 7,030 | py | Python | tests/test_checks.py | euroargodev/argonrtqcpy | 28f72d43e964d951521fd6d35be79e6e6a2fa880 | [
"MIT"
] | 4 | 2021-06-07T14:39:55.000Z | 2022-01-29T06:57:28.000Z | tests/test_checks.py | euroargodev/argonrtqcpy | 28f72d43e964d951521fd6d35be79e6e6a2fa880 | [
"MIT"
] | 5 | 2021-06-17T07:42:08.000Z | 2021-06-21T12:01:48.000Z | tests/test_checks.py | euroargodev/argortqcpy | 28f72d43e964d951521fd6d35be79e6e6a2fa880 | [
"MIT"
] | null | null | null | """Tests for Argo checks."""
import numpy as np
from numpy import ma
import pytest
import argortqcpy.profile
from argortqcpy.checks import ArgoQcFlag, CheckOutput, PressureIncreasingCheck
def test_check_is_required(fake_check):
"""Check that the base check is required."""
assert fake_check.is_required()
def test_output_ensure_output_for_property(profile_from_dataset):
"""Test ensuring a property is given an output array."""
output = CheckOutput(profile=profile_from_dataset)
output.ensure_output_for_property("PRES")
flags = output.get_output_flags_for_property("PRES")
assert flags is not None
assert isinstance(flags, ma.MaskedArray)
assert np.all(flags == ArgoQcFlag.GOOD.value)
def test_output_set_output_flag_for_property(profile_from_dataset):
"""Test setting a flag for a given property."""
output = CheckOutput(profile=profile_from_dataset)
output.ensure_output_for_property("PRES")
output.set_output_flag_for_property("PRES", ArgoQcFlag.GOOD)
flags = output.get_output_flags_for_property("PRES")
assert flags is not None
assert isinstance(flags, ma.MaskedArray)
assert np.all(flags == ArgoQcFlag.GOOD.value)
def test_output_set_output_flag_for_property_where(profile_from_dataset):
"""Test setting a flag for a given property for a limited set of indices."""
output = CheckOutput(profile=profile_from_dataset)
output.ensure_output_for_property("PRES")
output.set_output_flag_for_property("PRES", ArgoQcFlag.PROBABLY_GOOD, where=slice(None, 2))
flags = output.get_output_flags_for_property("PRES")
assert flags is not None
assert isinstance(flags, ma.MaskedArray)
assert np.all(flags[:2] == ArgoQcFlag.PROBABLY_GOOD.value)
assert np.all(flags[2:] == ArgoQcFlag.GOOD.value)
def test_output_set_output_flag_for_property_where_array(profile_from_dataset):
"""Test setting a flag for a given property for indices limited by array."""
output = CheckOutput(profile=profile_from_dataset)
where = np.full_like(profile_from_dataset.get_property_data("PRES"), False, dtype=bool)
where[0] = True
where[-1] = True
output.ensure_output_for_property("PRES")
output.set_output_flag_for_property("PRES", ArgoQcFlag.PROBABLY_GOOD, where=where)
flags = output.get_output_flags_for_property("PRES")
assert flags is not None
assert isinstance(flags, ma.MaskedArray)
assert np.all(flags[0] == ArgoQcFlag.PROBABLY_GOOD.value)
assert np.all(flags[1:-1] == ArgoQcFlag.GOOD.value)
assert np.all(flags[-1] == ArgoQcFlag.PROBABLY_GOOD.value)
@pytest.mark.parametrize(
"lower,higher",
(
(ArgoQcFlag.PROBABLY_GOOD, ArgoQcFlag.BAD),
(ArgoQcFlag.PROBABLY_GOOD, ArgoQcFlag.PROBABLY_BAD),
(ArgoQcFlag.PROBABLY_BAD, ArgoQcFlag.BAD),
),
)
def test_output_set_output_flag_for_property_with_precedence(profile_from_dataset, lower, higher):
    """Test that a lower-precedence flag cannot overwrite a higher-precedence one."""
output = CheckOutput(profile=profile_from_dataset)
output.ensure_output_for_property("PRES")
output.set_output_flag_for_property("PRES", lower, where=slice(None, 2))
output.set_output_flag_for_property("PRES", higher, where=slice(None, 1))
output.set_output_flag_for_property("PRES", lower, where=slice(None, 2))
flags = output.get_output_flags_for_property("PRES")
assert flags is not None
assert isinstance(flags, ma.MaskedArray)
assert np.all(flags[:1] == higher.value)
assert np.all(flags[1:2] == lower.value)
assert np.all(flags[2:] == ArgoQcFlag.GOOD.value)
@pytest.mark.parametrize(
"pressure_values",
(
range(10),
[1, 3, 5, 10, 100],
[0, 2, 2.5, 6.85],
),
)
def test_pressure_increasing_check_all_pass(mocker, pressure_values):
"""Test that the pressure increasing test succeeds."""
profile = mocker.patch.object(argortqcpy.profile, "Profile")
profile.get_property_data = mocker.Mock(return_value=ma.masked_array(pressure_values))
pic = PressureIncreasingCheck(profile, None)
output = pic.run()
assert np.all(output.get_output_flags_for_property("PRES").data == ArgoQcFlag.GOOD.value)
@pytest.mark.parametrize(
"pressure_values,expected",
(
(
[0, 2, 1, 5],
[ArgoQcFlag.GOOD.value, ArgoQcFlag.GOOD.value, ArgoQcFlag.BAD.value, ArgoQcFlag.GOOD.value],
),
),
)
def test_pressure_increasing_check_some_bad(mocker, pressure_values, expected):
"""Test that the pressure increasing works when some values are bad."""
profile = mocker.patch.object(argortqcpy.profile, "Profile")
profile.get_property_data = mocker.Mock(return_value=ma.masked_array(pressure_values))
pic = PressureIncreasingCheck(profile, None)
output = pic.run()
assert np.all(output.get_output_flags_for_property("PRES").data == expected)
@pytest.mark.parametrize(
"pressure_values,expected",
(
(
[0] * 4,
[ArgoQcFlag.GOOD.value, ArgoQcFlag.BAD.value, ArgoQcFlag.BAD.value, ArgoQcFlag.BAD.value],
),
(
[0, 1, 1, 2],
[ArgoQcFlag.GOOD.value, ArgoQcFlag.GOOD.value, ArgoQcFlag.BAD.value, ArgoQcFlag.GOOD.value],
),
),
)
def test_pressure_increasing_check_some_constants(mocker, pressure_values, expected):
"""Test that the pressure increasing works when some values are constant."""
profile = mocker.patch.object(argortqcpy.profile, "Profile")
profile.get_property_data = mocker.Mock(return_value=ma.masked_array(pressure_values))
pic = PressureIncreasingCheck(profile, None)
output = pic.run()
assert np.all(output.get_output_flags_for_property("PRES").data == expected)
@pytest.mark.parametrize(
"pressure_values,expected",
(
(
[0, 1, 2, 1, 1.5, 3, 5],
[
ArgoQcFlag.GOOD.value,
ArgoQcFlag.GOOD.value,
ArgoQcFlag.GOOD.value,
ArgoQcFlag.BAD.value,
ArgoQcFlag.BAD.value,
ArgoQcFlag.GOOD.value,
ArgoQcFlag.GOOD.value,
],
),
(
[
[0, 1, 2, 3],
[0, 1, 0, 1],
],
[
[ArgoQcFlag.GOOD.value, ArgoQcFlag.GOOD.value, ArgoQcFlag.GOOD.value, ArgoQcFlag.GOOD.value],
[ArgoQcFlag.GOOD.value, ArgoQcFlag.GOOD.value, ArgoQcFlag.BAD.value, ArgoQcFlag.BAD.value],
],
),
),
)
def test_pressure_increasing_check_some_decreasing(mocker, pressure_values, expected):
"""Test that the pressure increasing works when some values are decreasing."""
profile = mocker.patch.object(argortqcpy.profile, "Profile")
profile.get_property_data = mocker.Mock(return_value=ma.masked_array(pressure_values))
pic = PressureIncreasingCheck(profile, None)
output = pic.run()
assert np.all(output.get_output_flags_for_property("PRES").data == expected)
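The precedence tests above rely on flags only ever being upgraded, never downgraded. That rule can be sketched in isolation (the numeric ordering below is an assumption mirroring the Argo flags exercised in the tests):

```python
# Assumed ranking: worse flags outrank better ones.
PRECEDENCE = {"GOOD": 0, "PROBABLY_GOOD": 1, "PROBABLY_BAD": 2, "BAD": 3}


def apply_flag(current, new):
    """Return the new flag only if it outranks the current one."""
    return new if PRECEDENCE[new] > PRECEDENCE[current] else current


print(apply_flag("PROBABLY_GOOD", "BAD"))  # prints: BAD
print(apply_flag("BAD", "PROBABLY_GOOD"))  # prints: BAD
```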
| 35.505051 | 109 | 0.696159 | 889 | 7,030 | 5.283465 | 0.113611 | 0.051735 | 0.097083 | 0.092612 | 0.839898 | 0.816904 | 0.804982 | 0.764105 | 0.723015 | 0.709602 | 0 | 0.010384 | 0.19175 | 7,030 | 197 | 110 | 35.685279 | 0.816262 | 0.089047 | 0 | 0.551724 | 0 | 0 | 0.033239 | 0.011342 | 0 | 0 | 0 | 0 | 0.172414 | 1 | 0.068966 | false | 0.006897 | 0.034483 | 0 | 0.103448 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a9b535e5c72c0679f170c3a642c6e16c02b64e67 | 3,481 | py | Python | cryptography.py | hackmeehan/Cryptography | 6ff2847fba286d3a3f22ee1a8c234f1b8ea4559f | [
"MIT"
] | null | null | null | cryptography.py | hackmeehan/Cryptography | 6ff2847fba286d3a3f22ee1a8c234f1b8ea4559f | [
"MIT"
] | null | null | null | cryptography.py | hackmeehan/Cryptography | 6ff2847fba286d3a3f22ee1a8c234f1b8ea4559f | [
"MIT"
] | null | null | null | """
cryptography.py
Author: Jack Meehan
Credit: None, just help from Eric and Mr. Dennison
Assignment:
Write and submit a program that encrypts and decrypts user data.
See the detailed requirements at https://github.com/HHS-IntroProgramming/Cryptography/blob/master/README.md
"""
associations = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 .,:;'\"/\\<>(){}[]-=_+?!"
import math

A = input('Enter e to encrypt, d to decrypt, or q to quit: ')
while A != 'q':
    if A == 'e' or A == 'd':
        message = input('Message: ')
        key = input('Key: ')
        # Repeat the key until it covers the whole message; zip() below
        # simply ignores any excess key characters.
        if len(key) < len(message):
            key = key * math.ceil(len(message) / len(key))
        for m, k in zip(message, key):
            b = associations.find(m)
            c = associations.find(k)
            # Shift forward to encrypt, backward to decrypt.  The repeated
            # alphabet in associations (plus Python's negative indexing)
            # makes the lookup wrap correctly without an explicit modulo.
            d = b + c if A == 'e' else b - c
            print(associations[d], end="")
    else:
        print('Did not understand command, try again.')
    print(' ')
    A = input('Enter e to encrypt, d to decrypt, or q to quit: ')
else:
    print('Goodbye!')
# +KF;B(CH=NIZ}m;R\Dt
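The script above is a Vigenère-style shift over the long `associations` alphabet; repeating the alphabet twelve times lets `b + c` (and Python's negative indexing for `b - c`) land on the right character without an explicit modulo. The same idea over a plain 26-letter alphabet, with the modulo written out (a standalone sketch, not the assignment's character set):

```python
ALPHABET = "abcdefghijklmnopqrstuvwxyz"


def shift(text, key, sign):
    """Vigenere shift: sign=+1 encrypts, sign=-1 decrypts."""
    # Repeat the key to cover the text; the slice trims any excess.
    key = (key * len(text))[:len(text)]
    return "".join(
        ALPHABET[(ALPHABET.index(t) + sign * ALPHABET.index(k)) % len(ALPHABET)]
        for t, k in zip(text, key)
    )


cipher = shift("hello", "key", +1)
print(cipher)                    # prints: rijvs
print(shift(cipher, "key", -1))  # prints: hello
```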
| 38.677778 | 974 | 0.509624 | 327 | 3,481 | 5.391437 | 0.272171 | 0.108905 | 0.115712 | 1.125355 | 0.710153 | 0.710153 | 0.710153 | 0.710153 | 0.710153 | 0.710153 | 0 | 0.056218 | 0.325481 | 3,481 | 89 | 975 | 39.11236 | 0.694634 | 0.084746 | 0 | 0.676056 | 0 | 0 | 0.244179 | 0.170548 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.028169 | 0 | 0.028169 | 0.126761 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
a9bc9ff7c65d7623a88a42c5c5d93cf8c968090a | 95 | py | Python | server/service_application/models/__init__.py | yseiren87/jellicleSpace | 10d693bbc04e6b89a7ce15d2dc9797cec2a553b7 | [
"Apache-2.0"
] | null | null | null | server/service_application/models/__init__.py | yseiren87/jellicleSpace | 10d693bbc04e6b89a7ce15d2dc9797cec2a553b7 | [
"Apache-2.0"
] | 7 | 2021-03-19T04:47:00.000Z | 2021-09-22T19:10:46.000Z | server/service_application/models/__init__.py | yseiren87/jellicleSpace | 10d693bbc04e6b89a7ce15d2dc9797cec2a553b7 | [
"Apache-2.0"
] | null | null | null | from .application import ApplicationModel
from .using_application import UsingApplicationModel
| 31.666667 | 52 | 0.894737 | 9 | 95 | 9.333333 | 0.666667 | 0.404762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084211 | 95 | 2 | 53 | 47.5 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e77aa77009a13b8dcffb3af1a1bf0eebade3e9fe | 34 | py | Python | aoc/d22/__init__.py | klittlepage/aoc2020 | 7135ac08263480a8cc9d6536d7caeb26bf85ae4f | [
"MIT"
] | null | null | null | aoc/d22/__init__.py | klittlepage/aoc2020 | 7135ac08263480a8cc9d6536d7caeb26bf85ae4f | [
"MIT"
] | null | null | null | aoc/d22/__init__.py | klittlepage/aoc2020 | 7135ac08263480a8cc9d6536d7caeb26bf85ae4f | [
"MIT"
] | null | null | null | from aoc.d22.main import p_1, p_2
| 17 | 33 | 0.764706 | 9 | 34 | 2.666667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 0.147059 | 34 | 1 | 34 | 34 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e79f5b2bf90e278e5c49f926cb0666b087b75003 | 280 | py | Python | ex3/database.py | AbdManian/pythonclass-exercises | 8159b675aa71b4b8580430077d831914715ee409 | [
"MIT"
] | null | null | null | ex3/database.py | AbdManian/pythonclass-exercises | 8159b675aa71b4b8580430077d831914715ee409 | [
"MIT"
] | null | null | null | ex3/database.py | AbdManian/pythonclass-exercises | 8159b675aa71b4b8580430077d831914715ee409 | [
"MIT"
] | null | null | null | db = []
def db_add(value):
pass
def db_remove_first():
pass
def db_remove_last():
pass
def db_find(value):
return False
def db_add_multiple(value_list):
pass
def db_export_to_file(file_name):
pass
def db_import_from_file(file_name):
pass
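One way the stubs above could be filled in, keeping the module-level list as the store (the one-value-per-line file format is an assumption, not part of the exercise text):

```python
db = []


def db_add(value):
    db.append(value)


def db_remove_first():
    if db:
        db.pop(0)


def db_remove_last():
    if db:
        db.pop()


def db_find(value):
    return value in db


def db_add_multiple(value_list):
    db.extend(value_list)


def db_export_to_file(file_name):
    # Assumed format: one value per line.
    with open(file_name, "w") as f:
        f.write("\n".join(str(v) for v in db))


def db_import_from_file(file_name):
    with open(file_name) as f:
        db.extend(line.rstrip("\n") for line in f)


db_add("a")
db_add_multiple(["b", "c"])
db_remove_first()
print(db)            # prints: ['b', 'c']
print(db_find("b"))  # prints: True
```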
| 9.333333 | 35 | 0.675 | 45 | 280 | 3.822222 | 0.422222 | 0.203488 | 0.261628 | 0.174419 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.239286 | 280 | 29 | 36 | 9.655172 | 0.807512 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.466667 | false | 0.4 | 0.066667 | 0.066667 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
99d66bc9caa0b51b20f45a56d55f7e83d7540476 | 118 | py | Python | handtracking/src/app/mediapipeadapter/__init__.py | rafaeltoyo/unity-vr-server | f07edb64a15673cd79287668280742267eb38418 | [
"MIT"
] | 3 | 2020-09-04T01:00:30.000Z | 2021-06-21T23:13:08.000Z | handtracking/src/app/mediapipeadapter/__init__.py | rafaeltoyo/unity-vr-server | f07edb64a15673cd79287668280742267eb38418 | [
"MIT"
] | null | null | null | handtracking/src/app/mediapipeadapter/__init__.py | rafaeltoyo/unity-vr-server | f07edb64a15673cd79287668280742267eb38418 | [
"MIT"
] | 3 | 2021-01-12T02:35:53.000Z | 2022-02-15T03:17:25.000Z | from .mp_body_pose_handler import MediaPipeBodyPoseHandler
from .mp_hand_pose_handler import MediaPipeHandPoseHandler
| 39.333333 | 58 | 0.915254 | 14 | 118 | 7.285714 | 0.642857 | 0.117647 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067797 | 118 | 2 | 59 | 59 | 0.927273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
99dba6e170c2f8a937d7f450ce7193df93c10f03 | 598 | py | Python | aquami3D/prototyping/test skel nodes.py | JStuckner/Aquami3D | 72dd59f2b62b008b48d3c6c25db76aa0c7607020 | [
"MIT"
] | null | null | null | aquami3D/prototyping/test skel nodes.py | JStuckner/Aquami3D | 72dd59f2b62b008b48d3c6c25db76aa0c7607020 | [
"MIT"
] | null | null | null | aquami3D/prototyping/test skel nodes.py | JStuckner/Aquami3D | 72dd59f2b62b008b48d3c6c25db76aa0c7607020 | [
"MIT"
] | null | null | null | import numpy as np
from skimage import morphology
import matplotlib.pyplot as plt
a = np.array(
[[0,1,1,1,0,0,0,1,1,1,0],
[0,1,1,1,0,0,0,1,1,1,0],
[0,0,1,1,1,0,1,1,1,0,0],
[0,0,0,1,1,1,1,1,0,0,0],
[0,0,0,0,1,1,1,0,0,0,0],
[0,0,0,0,1,1,1,0,0,0,0],
[0,0,0,0,1,1,1,0,0,0,0],
[0,0,0,1,1,1,1,1,0,0,0],
[0,0,1,1,1,0,1,1,1,0,0],
[0,1,1,1,0,0,0,1,1,1,0],
[0,1,1,1,0,0,0,1,1,1,0]])
skel = morphology.skeletonize(a).astype('bool')
b = np.zeros(a.shape)
b[a==1] = 125
b[skel==1] = 255
plt.imshow(b, cmap=plt.cm.gray, interpolation='none')
plt.show()
| 20.62069 | 51 | 0.518395 | 167 | 598 | 1.856287 | 0.179641 | 0.309677 | 0.329032 | 0.283871 | 0.390323 | 0.390323 | 0.390323 | 0.390323 | 0.390323 | 0.390323 | 0 | 0.260606 | 0.172241 | 598 | 28 | 52 | 21.357143 | 0.365657 | 0 | 0 | 0.428571 | 0 | 0 | 0.006711 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
99ddb5ccdeaebf3f4d00270c2bca078077db90c9 | 2,330 | py | Python | tests/test_benchmarks/test_kar2019.py | dmayo/brain-score | 3ab4258152c9e3f8c7d29afb10158b184dbcebbe | [
"MIT"
] | 52 | 2019-12-13T06:43:44.000Z | 2022-02-21T07:47:39.000Z | tests/test_benchmarks/test_kar2019.py | dmayo/brain-score | 3ab4258152c9e3f8c7d29afb10158b184dbcebbe | [
"MIT"
] | 104 | 2019-12-06T18:08:54.000Z | 2022-03-31T23:57:51.000Z | tests/test_benchmarks/test_kar2019.py | dmayo/brain-score | 3ab4258152c9e3f8c7d29afb10158b184dbcebbe | [
"MIT"
] | 32 | 2019-12-05T14:31:14.000Z | 2022-03-10T02:04:45.000Z | import numpy as np
import pytest
from numpy.random.mtrand import RandomState
from pytest import approx
from brainio.assemblies import DataAssembly
from brainscore.benchmarks.kar2019 import DicarloKar2019OST
from tests.test_benchmarks import PrecomputedFeatures
@pytest.mark.memory_intense
@pytest.mark.private_access
def test_no_time():
benchmark = DicarloKar2019OST()
rnd = RandomState(0)
stimuli = benchmark._assembly.stimulus_set
source = DataAssembly(rnd.rand(len(stimuli), 5, 1), coords={
'image_id': ('presentation', stimuli['image_id']),
'image_label': ('presentation', stimuli['image_label']),
'truth': ('presentation', stimuli['truth']),
'neuroid_id': ('neuroid', list(range(5))),
'layer': ('neuroid', ['test'] * 5),
'time_bin_start': ('time_bin', [70]),
'time_bin_end': ('time_bin', [170]),
}, dims=['presentation', 'neuroid', 'time_bin'])
source.name = __name__ + ".test_notime"
score = benchmark(PrecomputedFeatures(source, visual_degrees=8))
assert np.isnan(score.sel(aggregation='center')) # not a temporal model
assert np.isnan(score.raw.sel(aggregation='center')) # not a temporal model
assert score.attrs['ceiling'].sel(aggregation='center') == approx(.79)
@pytest.mark.memory_intense
@pytest.mark.private_access
def test_random_time():
benchmark = DicarloKar2019OST()
rnd = RandomState(0)
stimuli = benchmark._assembly.stimulus_set
source = DataAssembly(rnd.rand(len(stimuli), 5, 5), coords={
'image_id': ('presentation', stimuli['image_id']),
'image_label': ('presentation', stimuli['image_label']),
'truth': ('presentation', stimuli['truth']),
'neuroid_id': ('neuroid', list(range(5))),
'layer': ('neuroid', ['test'] * 5),
'time_bin_start': ('time_bin', [70, 90, 110, 130, 150]),
'time_bin_end': ('time_bin', [90, 110, 130, 150, 170]),
}, dims=['presentation', 'neuroid', 'time_bin'])
source.name = __name__ + ".test_notime"
score = benchmark(PrecomputedFeatures(source, visual_degrees=8))
assert np.isnan(score.sel(aggregation='center')) # not a temporal model
assert np.isnan(score.raw.sel(aggregation='center')) # not a temporal model
assert score.attrs['ceiling'].sel(aggregation='center') == approx(.79)
| 43.962264 | 80 | 0.674249 | 277 | 2,330 | 5.494585 | 0.285199 | 0.045992 | 0.078844 | 0.047306 | 0.829172 | 0.806833 | 0.806833 | 0.806833 | 0.806833 | 0.806833 | 0 | 0.032837 | 0.163519 | 2,330 | 52 | 81 | 44.807692 | 0.748076 | 0.035622 | 0 | 0.680851 | 0 | 0 | 0.19893 | 0 | 0 | 0 | 0 | 0 | 0.12766 | 1 | 0.042553 | false | 0 | 0.148936 | 0 | 0.191489 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
82176aca41231ea6bfd0eca97290d6d2bbdc28a9 | 48 | py | Python | python/databases/MySQL.py | wjiec/packages | 4ccaf8f717265a1f8a9af533f9a998b935efb32a | [
"MIT"
] | null | null | null | python/databases/MySQL.py | wjiec/packages | 4ccaf8f717265a1f8a9af533f9a998b935efb32a | [
"MIT"
] | 1 | 2016-09-15T07:06:15.000Z | 2016-09-15T07:06:15.000Z | python/databases/MySQL.py | wjiec/packages | 4ccaf8f717265a1f8a9af533f9a998b935efb32a | [
"MIT"
] | null | null | null | #!/usr/bin.env python3
import mysql.connector | 16 | 23 | 0.75 | 7 | 48 | 5.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 0.125 | 48 | 3 | 24 | 16 | 0.833333 | 0.4375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
416d3ee536bfc188ba9ad730b2806df23fd4dde8 | 5,037 | py | Python | src/unit_test.py | TheEnterpriseProgrammer/file-logger | da249e0b5a111c57b2da33eb5a09ed539a110a95 | [
"MIT"
] | null | null | null | src/unit_test.py | TheEnterpriseProgrammer/file-logger | da249e0b5a111c57b2da33eb5a09ed539a110a95 | [
"MIT"
] | null | null | null | src/unit_test.py | TheEnterpriseProgrammer/file-logger | da249e0b5a111c57b2da33eb5a09ed539a110a95 | [
"MIT"
] | null | null | null | import unittest
import logging
from logger_error import LoggerError
from logger import Logger
class Test_LoggerTest(unittest.TestCase):
# Test writing debug
def test_logger_will_log_debug(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first debug line"
expected_message = f"{application_name} DEBUG {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
self._logger.log("debug", log_message)
with open(file_name, "r") as file:
first_line = file.readline()
for file_line in file:
pass
log_file_message = " ".join(file_line.split()[1:])
self.assertEqual(log_file_message, expected_message)
# Test writing info
def test_logger_will_log_info(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first info line"
expected_message = f"{application_name} INFO {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
self._logger.log("info", log_message)
with open(file_name, "r") as file:
first_line = file.readline()
for file_line in file:
pass
log_file_message = " ".join(file_line.split()[1:])
self.assertEqual(log_file_message, expected_message)
# Test writing warning
def test_logger_will_log_warning(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first warning line"
expected_message = f"{application_name} WARNING {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
self._logger.log("warning", log_message)
with open(file_name, "r") as file:
first_line = file.readline()
for file_line in file:
pass
log_file_message = " ".join(file_line.split()[1:])
self.assertEqual(log_file_message, expected_message)
# Test writing error
def test_logger_will_log_error(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first error line"
expected_message = f"{application_name} ERROR {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
self._logger.log("error", log_message)
with open(file_name, "r") as file:
first_line = file.readline()
for file_line in file:
pass
log_file_message = " ".join(file_line.split()[1:])
self.assertEqual(log_file_message, expected_message)
# Test writing critical
def test_logger_will_log_critical(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first critical line"
expected_message = f"{application_name} CRITICAL {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
self._logger.log("critical", log_message)
with open(file_name, "r") as file:
first_line = file.readline()
for file_line in file:
pass
log_file_message = " ".join(file_line.split()[1:])
self.assertEqual(log_file_message, expected_message)
# Test exception for missing application name
def test_logger_will_throw_exception_without_application_name(self):
try:
self._logger = Logger(None)
except LoggerError as e:
error = e.args[0]
self.assertEqual(error, "Application name is required.")
# Test exception for missing logging level
def test_logger_will_throw_exception_without_logging_level(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first debug line"
expected_message = f"{application_name} DEBUG {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
try:
self._logger.log(None, log_message)
except LoggerError as e:
error = e.args[0]
self.assertEqual(error, "Logging Level is required.")
# Test exception for no logging message
def test_logger_will_throw_exception_without_logging_message(self):
application_name = "Unit Test Application"
file_name = "unit-test.log"
log_message = "We have written our first debug line"
expected_message = f"{application_name} DEBUG {log_message}"
self._logger = Logger(application_name, logging.DEBUG, file_name)
try:
self._logger.log("debug", None)
except LoggerError as e:
error = e.args[0]
self.assertEqual(error, "Logging Message is required.")
if __name__ == '__main__':
unittest.main() | 43.051282 | 73 | 0.656145 | 626 | 5,037 | 5.003195 | 0.097444 | 0.114943 | 0.05364 | 0.043423 | 0.853448 | 0.804917 | 0.760217 | 0.748084 | 0.719349 | 0.719349 | 0 | 0.002134 | 0.255708 | 5,037 | 117 | 74 | 43.051282 | 0.833289 | 0.043875 | 0 | 0.66 | 0 | 0 | 0.18698 | 0 | 0 | 0 | 0 | 0 | 0.08 | 1 | 0.08 | false | 0.05 | 0.04 | 0 | 0.13 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
417a4b90b9f0841b36f058e64f64aeea7f07a4cf | 314 | py | Python | irisreader/preprocessing/__init__.py | chuwyler/IRISreader | aee39321751ba273c5f0d172b3b653656872f605 | [
"MIT"
] | null | null | null | irisreader/preprocessing/__init__.py | chuwyler/IRISreader | aee39321751ba273c5f0d172b3b653656872f605 | [
"MIT"
] | 1 | 2019-07-31T14:35:28.000Z | 2019-12-06T10:54:49.000Z | irisreader/preprocessing/__init__.py | chuwyler/IRISreader | aee39321751ba273c5f0d172b3b653656872f605 | [
"MIT"
] | 1 | 2019-02-13T13:49:13.000Z | 2019-02-13T13:49:13.000Z | from irisreader.preprocessing.image_cropper import image_cropper
from irisreader.preprocessing.image_cropper import CorruptImageException, NullImageException
from irisreader.preprocessing.image_cube_cropper import image_cube_cropper
from irisreader.preprocessing.spectrum_interpolator import spectrum_interpolator
| 62.8 | 92 | 0.917197 | 34 | 314 | 8.205882 | 0.323529 | 0.200717 | 0.387097 | 0.344086 | 0.322581 | 0.322581 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05414 | 314 | 4 | 93 | 78.5 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |