| body (string, 26–98.2k chars) | body_hash (int64) | docstring (string, 1–16.8k chars) | path (string, 5–230 chars) | name (string, 1–96 chars) | repository_name (string, 7–89 chars) | lang (1 value: `python`) | body_without_docstring (string, 20–98.2k chars) |
|---|---|---|---|---|---|---|---|
def unstack(df, level=(- 1), reset_index=True):
'pd.DataFrame.unstack adapter.\n\n Call the `df.unstack` method using the indicated level and afterwards\n join the column names using an underscore.\n\n Args:\n df (pandas.DataFrame): DataFrame to unstack.\n level (str, int or list): Level(s) o... | 6,275,348,741,341,324,000 | pd.DataFrame.unstack adapter.
Call the `df.unstack` method using the indicated level and afterwards
join the column names using an underscore.
Args:
df (pandas.DataFrame): DataFrame to unstack.
level (str, int or list): Level(s) of index to unstack, can pass level name
reset_index (bool): Whether to reset... | mlprimitives/adapters/pandas.py | unstack | AlexanderGeiger/MLPrimitives | python | def unstack(df, level=(- 1), reset_index=True):
'pd.DataFrame.unstack adapter.\n\n Call the `df.unstack` method using the indicated level and afterwards\n join the column names using an underscore.\n\n Args:\n df (pandas.DataFrame): DataFrame to unstack.\n level (str, int or list): Level(s) o... |
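The `unstack` adapter's body is truncated above; based only on its docstring (unstack the given level, then join the resulting column names with an underscore), a minimal sketch might look like the following. The function name and the exact column-joining rule are assumptions, not the MLPrimitives implementation.

```python
import pandas as pd

def unstack_sketch(df, level=-1, reset_index=True):
    """Unstack `level` of a DataFrame and join MultiIndex column names with '_'.

    A sketch of the mlprimitives `unstack` adapter; details are assumed
    from its docstring, not copied from the real body.
    """
    df = df.unstack(level=level)
    if reset_index:
        df = df.reset_index()
    # Flatten MultiIndex columns: ('v', 'a') -> 'v_a'; empty levels are dropped,
    # so the column pandas inserts for the index, ('i', ''), becomes just 'i'.
    df.columns = [
        '_'.join(str(part) for part in col if str(part) != '')
        if isinstance(col, tuple) else col
        for col in df.columns
    ]
    return df
```

For example, a frame indexed by `(i, j)` with one column `v` comes back with columns `i`, `v_a`, `v_b` after unstacking `j`.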
@pytest.mark.usefixtures('data_config', 'nepc_connect')
def test_states_table_has_species_metadata(data_config, nepc_connect):
'\n check that the states table has a species_id column\n '
NEPC_DATA = data_config[0]
number_of_states = (util.wc_fxn((NEPC_DATA + 'states.tsv')) - 1)
df_states = nepc.ta... | -812,163,596,748,494,200 | check that the states table has a species_id column | tests/test_mysql_build.py | test_states_table_has_species_metadata | USNavalResearchLaboratory/nepc | python | @pytest.mark.usefixtures('data_config', 'nepc_connect')
def test_states_table_has_species_metadata(data_config, nepc_connect):
'\n \n '
NEPC_DATA = data_config[0]
number_of_states = (util.wc_fxn((NEPC_DATA + 'states.tsv')) - 1)
df_states = nepc.table_as_df(nepc_connect[1], 'states')
assert (le... |
def getProperties(configFile):
'\n\tdictionary getProperties(str)\n\t\n\tThis function reads the entire config file and builds a dictionary from the config file\n\t\n\tArgs:\n\t\tconfigFile: The configuration file to read from\n\t\t\n\tReturns:\n\t\tdictionary: A list of key value pairs from the config file\n\t\n\t... | -7,746,971,963,087,934,000 | dictionary getProperties(str)
This function reads the entire config file and builds a dictionary from the config file
Args:
configFile: The configuration file to read from
Returns:
dictionary: A list of key value pairs from the config file | HackPSUconfig.py | getProperties | hackpsu-tech/hackPSUS2018-rfid | python | def getProperties(configFile):
'\n\tdictionary getProperties(str)\n\t\n\tThis function reads the entire config file and builds a dictionary from the config file\n\t\n\tArgs:\n\t\tconfigFile: The configuration file to read from\n\t\t\n\tReturns:\n\t\tdictionary: A list of key value pairs from the config file\n\t\n\t... |
def setProperties(configFile, dict):
'\n\tvoid setProperties (str, dictionary)\n\t\n\tThis function iterates over the entire dictionary and saves each dictionary entry to the specified config file\n\t\n\tArgs:\n\t\tconfigFile: The file to overwrite with the new configuration\n\t\tdict: The dictionary to write\n\t'
... | -8,388,814,040,620,741,000 | void setProperties (str, dictionary)
This function iterates over the entire dictionary and saves each dictionary entry to the specified config file
Args:
configFile: The file to overwrite with the new configuration
dict: The dictionary to write | HackPSUconfig.py | setProperties | hackpsu-tech/hackPSUS2018-rfid | python | def setProperties(configFile, dict):
'\n\tvoid setProperties (str, dictionary)\n\t\n\tThis function iterates over the entire dictionary and saves each dictionary entry to the specified config file\n\t\n\tArgs:\n\t\tconfigFile: The file to overwrite with the new configuration\n\t\tdict: The dictionary to write\n\t'
... |
def getProperty(configFile, prop):
'\n\tstr getProperty(str, str)\n\t\n\tThis function searches a configFile for a specific property and returns its value\n\t\n\tArgs:\n\t\tconfigFile: The configuration file to open\n\t\tprop: The property to search for\n\t\t\n\tReturns:\n\t\tstring: The property value if found or ... | 478,470,915,018,306,560 | str getProperty(str, str)
This function searches a configFile for a specific property and returns its value
Args:
configFile: The configuration file to open
prop: The property to search for
Returns:
string: The property value if found or None for no value found | HackPSUconfig.py | getProperty | hackpsu-tech/hackPSUS2018-rfid | python | def getProperty(configFile, prop):
'\n\tstr getProperty(str, str)\n\t\n\tThis function searches a configFile for a specific property and returns its value\n\t\n\tArgs:\n\t\tconfigFile: The configuration file to open\n\t\tprop: The property to search for\n\t\t\n\tReturns:\n\t\tstring: The property value if found or ... |
def setProperty(configFile, prop, value):
'\n\tvoid setProperty(str, str, str)\n\t\n\tThis function searches a config file for the specified property and updates its value if found.\n\tIf the specified property is not found, then a new line for the property will be created\n\t\n\tArgs:\n\t\tconfigFile: The configura... | -7,819,968,205,918,925,000 | void setProperty(str, str, str)
This function searches a config file for the specified property and updates its value if found.
If the specified property is not found, then a new line for the property will be created
Args:
configFile: The configuration file to open and update
prop: The property key to ... | HackPSUconfig.py | setProperty | hackpsu-tech/hackPSUS2018-rfid | python | def setProperty(configFile, prop, value):
'\n\tvoid setProperty(str, str, str)\n\t\n\tThis function searches a config file for the specified property and updates its value if found.\n\tIf the specified property is not found, then a new line for the property will be created\n\t\n\tArgs:\n\t\tconfigFile: The configura... |
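The four HackPSUconfig helpers above all operate on the same plain-text config file. Their bodies are truncated, but the docstrings imply a simple `key=value` line format. Under that assumption, a self-contained sketch of the lookup/update pair could be (the snake_case names and exact whitespace handling are mine, not the repo's):

```python
def get_property(config_file, prop):
    """Return the value for `prop` in a key=value config file, or None."""
    with open(config_file) as f:
        for line in f:
            if '=' in line:
                key, _, value = line.partition('=')
                if key.strip() == prop:
                    return value.strip()
    return None

def set_property(config_file, prop, value):
    """Update `prop` in place; append a new line if it is missing."""
    try:
        with open(config_file) as f:
            lines = f.readlines()
    except FileNotFoundError:
        lines = []
    found = False
    for i, line in enumerate(lines):
        if line.partition('=')[0].strip() == prop:
            lines[i] = f'{prop}={value}\n'
            found = True
    if not found:
        lines.append(f'{prop}={value}\n')
    with open(config_file, 'w') as f:
        f.writelines(lines)
```

`getProperties`/`setProperties` from the same module would then just be loops over these two primitives.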
def sample_user(email='example@example.com', password='testpass'):
'Creating sample user'
return get_user_model().objects.create_user(email, password) | 4,007,906,150,354,790,000 | Creating sample user | app/core/tests/test_models.py | sample_user | Rish1711/recipe-app-api | python | def sample_user(email='example@example.com', password='testpass'):
return get_user_model().objects.create_user(email, password) |
def test_create_user_with_email_successful(self):
'Test creating a new user with an email is successful'
email = 'example@example.com'
password = 'Password123'
user = get_user_model().objects.create_user(email=email, password=password)
self.assertEqual(user.email, email)
self.assertTrue(user.che... | -1,354,818,704,170,135,600 | Test creating a new user with an email is successful | app/core/tests/test_models.py | test_create_user_with_email_successful | Rish1711/recipe-app-api | python | def test_create_user_with_email_successful(self):
email = 'example@example.com'
password = 'Password123'
user = get_user_model().objects.create_user(email=email, password=password)
self.assertEqual(user.email, email)
self.assertTrue(user.check_password(password)) |
def test_email_normalize(self):
'Testing whether email is in normalized form or not'
email = 'example@example.com'
user = get_user_model().objects.create_user(email, 'test123')
self.assertEqual(user.email, email.lower()) | -3,077,353,306,868,135,000 | Testing whether email is in normalized form or not | app/core/tests/test_models.py | test_email_normalize | Rish1711/recipe-app-api | python | def test_email_normalize(self):
email = 'example@example.com'
user = get_user_model().objects.create_user(email, 'test123')
self.assertEqual(user.email, email.lower()) |
def test_create_superuser(self):
'Test for creating super user'
email = 'example@example.com'
password = 'Password123'
user = get_user_model().objects.create_superuser(email=email, password=password)
self.assertTrue(user.is_staff)
self.assertTrue(user.is_superuser) | 4,411,465,961,249,509,400 | Test for creating super user | app/core/tests/test_models.py | test_create_superuser | Rish1711/recipe-app-api | python | def test_create_superuser(self):
email = 'example@example.com'
password = 'Password123'
user = get_user_model().objects.create_superuser(email=email, password=password)
self.assertTrue(user.is_staff)
self.assertTrue(user.is_superuser) |
def write_transposed_dataset(reader: Reader, outfname: Union[(Path, str)], start: datetime.datetime=None, end: datetime.datetime=None, chunks: dict=None, memory: float=2, n_threads: int=4, zlib: bool=True, complevel: int=4, distributed: Union[(bool, Client)]=False, use_dask: bool=True):
'\n Creates a stacked and... | 8,170,015,279,336,511,000 | Creates a stacked and transposed netCDF file from a given reader.
WARNING: very experimental!
Parameters
----------
reader : XarrayImageReaderBase
Reader for the dataset.
outfname : str or Path
Output filename. Must end with ".nc" for netCDF output or with ".zarr"
for zarr output.
start : datetime.datetim... | src/qa4sm_preprocessing/nc_image_reader/transpose.py | write_transposed_dataset | awst-austria/qa4sm-preprocessing | python | def write_transposed_dataset(reader: Reader, outfname: Union[(Path, str)], start: datetime.datetime=None, end: datetime.datetime=None, chunks: dict=None, memory: float=2, n_threads: int=4, zlib: bool=True, complevel: int=4, distributed: Union[(bool, Client)]=False, use_dask: bool=True):
'\n Creates a stacked and... |
def _get_intermediate_chunks(array, chunks, new_last_dim, zarr_output, memory):
'\n Calculates chunk sizes for the given array for the intermediate output\n files.\n\n Parameters\n ----------\n array : xr.DataArray\n Array to rechunk and transpose\n chunks : dict or None\n Chunks pas... | 3,076,908,307,733,543,000 | Calculates chunk sizes for the given array for the intermediate output
files.
Parameters
----------
array : xr.DataArray
Array to rechunk and transpose
chunks : dict or None
Chunks passed to write_transposed_dataset, None if none were given.
new_last_dim : str
Name of the new last dimension, normally "time... | src/qa4sm_preprocessing/nc_image_reader/transpose.py | _get_intermediate_chunks | awst-austria/qa4sm-preprocessing | python | def _get_intermediate_chunks(array, chunks, new_last_dim, zarr_output, memory):
'\n Calculates chunk sizes for the given array for the intermediate output\n files.\n\n Parameters\n ----------\n array : xr.DataArray\n Array to rechunk and transpose\n chunks : dict or None\n Chunks pas... |
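`_get_intermediate_chunks` is truncated, but the underlying technique is common: choose the largest chunk length along one axis such that a single chunk stays under a memory budget, keeping the other axes whole. A pure-Python sketch of that calculation (the real function works on xarray/dask objects and its exact heuristics are not visible here):

```python
def chunk_len_for_budget(shape, itemsize, axis, memory_mb):
    """Largest chunk length along `axis` so one chunk fits in `memory_mb` MB.

    `shape` is the full array shape; every axis except `axis` is kept whole.
    `itemsize` is the dtype size in bytes (e.g. 8 for float64).
    """
    bytes_per_slice = itemsize
    for i, n in enumerate(shape):
        if i != axis:
            bytes_per_slice *= n
    budget = int(memory_mb * 1024 * 1024)
    # At least one element along the axis, at most the full axis length.
    return max(1, min(shape[axis], budget // bytes_per_slice))
```

With a 2 MB budget and float64 data, a (1000, 1000, 500) array forces a time-chunk of 1, while a (10, 10, 500) array fits entirely.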
def __init__(self, key_id=None, key_state=None):
'KeyStatusInfo - a model defined in huaweicloud sdk'
self._key_id = None
self._key_state = None
self.discriminator = None
if (key_id is not None):
self.key_id = key_id
if (key_state is not None):
self.key_state = key_state | -7,973,678,530,902,859,000 | KeyStatusInfo - a model defined in huaweicloud sdk | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | __init__ | Adek06/huaweicloud-sdk-python-v3 | python | def __init__(self, key_id=None, key_state=None):
self._key_id = None
self._key_state = None
self.discriminator = None
if (key_id is not None):
self.key_id = key_id
if (key_state is not None):
self.key_state = key_state |
@property
def key_id(self):
'Gets the key_id of this KeyStatusInfo.\n\n 密钥ID\n\n :return: The key_id of this KeyStatusInfo.\n :rtype: str\n '
return self._key_id | 2,992,302,185,481,682,000 | Gets the key_id of this KeyStatusInfo.
密钥ID
:return: The key_id of this KeyStatusInfo.
:rtype: str | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | key_id | Adek06/huaweicloud-sdk-python-v3 | python | @property
def key_id(self):
'Gets the key_id of this KeyStatusInfo.\n\n 密钥ID\n\n :return: The key_id of this KeyStatusInfo.\n :rtype: str\n '
return self._key_id |
@key_id.setter
def key_id(self, key_id):
'Sets the key_id of this KeyStatusInfo.\n\n 密钥ID\n\n :param key_id: The key_id of this KeyStatusInfo.\n :type: str\n '
self._key_id = key_id | -7,281,734,985,210,797,000 | Sets the key_id of this KeyStatusInfo.
密钥ID
:param key_id: The key_id of this KeyStatusInfo.
:type: str | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | key_id | Adek06/huaweicloud-sdk-python-v3 | python | @key_id.setter
def key_id(self, key_id):
'Sets the key_id of this KeyStatusInfo.\n\n 密钥ID\n\n :param key_id: The key_id of this KeyStatusInfo.\n :type: str\n '
self._key_id = key_id |
@property
def key_state(self):
'Gets the key_state of this KeyStatusInfo.\n\n 密钥状态: - 2为启用状态 - 3为禁用状态 - 4为计划删除状态 - 5为等待导入状态 - 7为冻结状态\n\n :return: The key_state of this KeyStatusInfo.\n :rtype: str\n '
return self._key_state | -3,301,752,105,416,907,300 | Gets the key_state of this KeyStatusInfo.
密钥状态: - 2为启用状态 - 3为禁用状态 - 4为计划删除状态 - 5为等待导入状态 - 7为冻结状态
:return: The key_state of this KeyStatusInfo.
:rtype: str | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | key_state | Adek06/huaweicloud-sdk-python-v3 | python | @property
def key_state(self):
'Gets the key_state of this KeyStatusInfo.\n\n 密钥状态: - 2为启用状态 - 3为禁用状态 - 4为计划删除状态 - 5为等待导入状态 - 7为冻结状态\n\n :return: The key_state of this KeyStatusInfo.\n :rtype: str\n '
return self._key_state |
@key_state.setter
def key_state(self, key_state):
'Sets the key_state of this KeyStatusInfo.\n\n 密钥状态: - 2为启用状态 - 3为禁用状态 - 4为计划删除状态 - 5为等待导入状态 - 7为冻结状态\n\n :param key_state: The key_state of this KeyStatusInfo.\n :type: str\n '
self._key_state = key_state | 348,206,808,607,359,740 | Sets the key_state of this KeyStatusInfo.
密钥状态: - 2为启用状态 - 3为禁用状态 - 4为计划删除状态 - 5为等待导入状态 - 7为冻结状态
:param key_state: The key_state of this KeyStatusInfo.
:type: str | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | key_state | Adek06/huaweicloud-sdk-python-v3 | python | @key_state.setter
def key_state(self, key_state):
'Sets the key_state of this KeyStatusInfo.\n\n 密钥状态: - 2为启用状态 - 3为禁用状态 - 4为计划删除状态 - 5为等待导入状态 - 7为冻结状态\n\n :param key_state: The key_state of this KeyStatusInfo.\n :type: str\n '
self._key_state = key_state |
def to_dict(self):
'Returns the model properties as a dict'
result = {}
for (attr, _) in six.iteritems(self.openapi_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map((lambda x: (x.to_dict() if hasattr(x, 'to_dict') else x)), value))
e... | 2,594,216,033,120,720,000 | Returns the model properties as a dict | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | to_dict | Adek06/huaweicloud-sdk-python-v3 | python | def to_dict(self):
result = {}
for (attr, _) in six.iteritems(self.openapi_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map((lambda x: (x.to_dict() if hasattr(x, 'to_dict') else x)), value))
elif hasattr(value, 'to_dict'):
... |
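`to_dict` in `KeyStatusInfo` follows the standard generated-SDK pattern: iterate the `openapi_types` map, recurse into any value that itself has a `to_dict`, and map over lists. A minimal self-contained version of the same pattern (plain dict iteration instead of `six.iteritems`; the two-field `Model` class is illustrative, not from the SDK):

```python
class Model:
    # Maps attribute name -> declared OpenAPI type, as in generated SDK models.
    openapi_types = {'key_id': 'str', 'key_state': 'str'}

    def __init__(self, key_id=None, key_state=None):
        self.key_id = key_id
        self.key_state = key_state

    def to_dict(self):
        result = {}
        for attr in self.openapi_types:
            value = getattr(self, attr)
            if isinstance(value, list):
                # Recurse into model elements, pass primitives through.
                result[attr] = [x.to_dict() if hasattr(x, 'to_dict') else x
                                for x in value]
            elif hasattr(value, 'to_dict'):
                result[attr] = value.to_dict()
            else:
                result[attr] = value
        return result
```

`to_str`, `__repr__`, `__eq__`, and `__ne__` below are all thin wrappers over this one method.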
def to_str(self):
'Returns the string representation of the model'
return pprint.pformat(self.to_dict()) | 5,849,158,643,760,736,000 | Returns the string representation of the model | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | to_str | Adek06/huaweicloud-sdk-python-v3 | python | def to_str(self):
return pprint.pformat(self.to_dict()) |
def __repr__(self):
'For `print` and `pprint`'
return self.to_str() | -8,960,031,694,814,905,000 | For `print` and `pprint` | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | __repr__ | Adek06/huaweicloud-sdk-python-v3 | python | def __repr__(self):
return self.to_str() |
def __eq__(self, other):
'Returns true if both objects are equal'
if (not isinstance(other, KeyStatusInfo)):
return False
return (self.__dict__ == other.__dict__) | -8,891,837,020,552,251,000 | Returns true if both objects are equal | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | __eq__ | Adek06/huaweicloud-sdk-python-v3 | python | def __eq__(self, other):
if (not isinstance(other, KeyStatusInfo)):
return False
return (self.__dict__ == other.__dict__) |
def __ne__(self, other):
'Returns true if both objects are not equal'
return (not (self == other)) | 7,764,124,047,908,058,000 | Returns true if both objects are not equal | huaweicloud-sdk-kms/huaweicloudsdkkms/v1/model/key_status_info.py | __ne__ | Adek06/huaweicloud-sdk-python-v3 | python | def __ne__(self, other):
return (not (self == other)) |
def sample(self, size, random_state=None):
'Generate random samples from the model.\n Returns\n -------\n X : array_like, shape (n_samples, n_features)\n List of samples\n '
if (random_state is None):
random_state = numpy.random
mins = self.lims[:, 0]
maxes... | 4,209,959,704,689,310,700 | Generate random samples from the model.
Returns
-------
X : array_like, shape (n_samples, n_features)
List of samples | bananas/model.py | sample | bccp/bananaplots | python | def sample(self, size, random_state=None):
'Generate random samples from the model.\n Returns\n -------\n X : array_like, shape (n_samples, n_features)\n List of samples\n '
if (random_state is None):
random_state = numpy.random
mins = self.lims[:, 0]
maxes... |
def nhwc_tensorcore_cuda(cfg, Input, Filter, stride, padding, dilation, out_dtype):
'Compute declaration for tensorcore'
assert (isinstance(stride, int) or (len(stride) == 2))
assert (isinstance(dilation, int) or (len(dilation) == 2))
if isinstance(stride, int):
stride_h = stride_w = stride
... | 8,498,808,538,734,013,000 | Compute declaration for tensorcore | topi/python/topi/cuda/conv2d_nhwc_tensorcore.py | nhwc_tensorcore_cuda | HatsuneMiku4/incubator-tvm | python | def nhwc_tensorcore_cuda(cfg, Input, Filter, stride, padding, dilation, out_dtype):
assert (isinstance(stride, int) or (len(stride) == 2))
assert (isinstance(dilation, int) or (len(dilation) == 2))
if isinstance(stride, int):
stride_h = stride_w = stride
else:
(stride_h, stride_w) =... |
def schedule_nhwc_tensorcore_cuda(cfg, s, Conv):
'Schedule tensorcore template'
(kh, kw, ic) = s[Conv].op.reduce_axis
out_dtype = Conv.dtype
(trans_paddata, kernel) = s[Conv].op.input_tensors
in_dtype = trans_paddata.dtype
(batch, _, _, _) = get_const_tuple(Conv.shape)
(_, _, _, out_channels... | -3,248,361,691,220,659,700 | Schedule tensorcore template | topi/python/topi/cuda/conv2d_nhwc_tensorcore.py | schedule_nhwc_tensorcore_cuda | HatsuneMiku4/incubator-tvm | python | def schedule_nhwc_tensorcore_cuda(cfg, s, Conv):
(kh, kw, ic) = s[Conv].op.reduce_axis
out_dtype = Conv.dtype
(trans_paddata, kernel) = s[Conv].op.input_tensors
in_dtype = trans_paddata.dtype
(batch, _, _, _) = get_const_tuple(Conv.shape)
(_, _, _, out_channels) = get_const_tuple(kernel.sha... |
@autotvm.register_topi_compute('conv2d_nhwc_tensorcore.cuda')
def conv2d_nhwc_tensorcore(cfg, data, kernel, strides, padding, dilation, out_dtype):
'Compute conv2d with tensorcore for NHWC layout'
return nhwc_tensorcore_cuda(cfg, data, kernel, strides, padding, dilation, out_dtype) | -5,249,468,009,470,257,000 | Compute conv2d with tensorcore for NHWC layout | topi/python/topi/cuda/conv2d_nhwc_tensorcore.py | conv2d_nhwc_tensorcore | HatsuneMiku4/incubator-tvm | python | @autotvm.register_topi_compute('conv2d_nhwc_tensorcore.cuda')
def conv2d_nhwc_tensorcore(cfg, data, kernel, strides, padding, dilation, out_dtype):
return nhwc_tensorcore_cuda(cfg, data, kernel, strides, padding, dilation, out_dtype) |
@autotvm.register_topi_schedule('conv2d_nhwc_tensorcore.cuda')
def schedule_conv2d_nhwc_tensorcore(cfg, outs):
'TOPI schedule callback'
s = te.create_schedule([x.op for x in outs])
def _callback(op):
if ('conv2d_nhwc_tensorcore' in op.tag):
schedule_nhwc_tensorcore_cuda(cfg, s, op.outpu... | -8,333,864,018,856,409,000 | TOPI schedule callback | topi/python/topi/cuda/conv2d_nhwc_tensorcore.py | schedule_conv2d_nhwc_tensorcore | HatsuneMiku4/incubator-tvm | python | @autotvm.register_topi_schedule('conv2d_nhwc_tensorcore.cuda')
def schedule_conv2d_nhwc_tensorcore(cfg, outs):
s = te.create_schedule([x.op for x in outs])
def _callback(op):
if ('conv2d_nhwc_tensorcore' in op.tag):
schedule_nhwc_tensorcore_cuda(cfg, s, op.output(0))
traverse_inlin... |
@property
def path(self):
'str: Path added to client base.'
return 'cgi-bin/browse-edgar' | -5,470,262,742,582,340,000 | str: Path added to client base. | secedgar/core/company.py | path | Ahrvo-Trading-Systems/sec-edgar | python | @property
def path(self):
return 'cgi-bin/browse-edgar' |
@property
def params(self):
':obj:`dict`: Parameters to include in requests.'
return self._params | -2,608,281,908,461,058,000 | :obj:`dict`: Parameters to include in requests. | secedgar/core/company.py | params | Ahrvo-Trading-Systems/sec-edgar | python | @property
def params(self):
return self._params |
@property
def client(self):
'``secedgar.client._base``: Client to use to make requests.'
return self._client | 3,123,311,791,598,018,000 | ``secedgar.client._base``: Client to use to make requests. | secedgar/core/company.py | client | Ahrvo-Trading-Systems/sec-edgar | python | @property
def client(self):
return self._client |
@property
def start_date(self):
'Union([datetime.date, datetime.datetime, str]): Date before which no filings fetched.'
return self._start_date | 4,825,291,131,827,527,000 | Union([datetime.date, datetime.datetime, str]): Date before which no filings fetched. | secedgar/core/company.py | start_date | Ahrvo-Trading-Systems/sec-edgar | python | @property
def start_date(self):
return self._start_date |
@property
def match_format(self):
'The match format to use when searching for filings.'
return self._match_format | -6,210,562,446,347,363,000 | The match format to use when searching for filings. | secedgar/core/company.py | match_format | Ahrvo-Trading-Systems/sec-edgar | python | @property
def match_format(self):
return self._match_format |
@property
def end_date(self):
'Union([datetime.date, datetime.datetime, str]): Date after which no filings fetched.'
return self._end_date | -4,085,546,527,455,213,000 | Union([datetime.date, datetime.datetime, str]): Date after which no filings fetched. | secedgar/core/company.py | end_date | Ahrvo-Trading-Systems/sec-edgar | python | @property
def end_date(self):
return self._end_date |
@property
def filing_type(self):
'``secedgar.core.FilingType``: FilingType enum of filing.'
return self._filing_type | -7,117,107,396,546,987,000 | ``secedgar.core.FilingType``: FilingType enum of filing. | secedgar/core/company.py | filing_type | Ahrvo-Trading-Systems/sec-edgar | python | @property
def filing_type(self):
return self._filing_type |
@property
def count(self):
'Number of filings to fetch.'
return self._count | 1,136,969,923,357,941,900 | Number of filings to fetch. | secedgar/core/company.py | count | Ahrvo-Trading-Systems/sec-edgar | python | @property
def count(self):
return self._count |
@property
def cik_lookup(self):
'``secedgar.cik_lookup.CIKLookup``: CIKLookup object.'
return self._cik_lookup | -6,005,614,127,776,856,000 | ``secedgar.cik_lookup.CIKLookup``: CIKLookup object. | secedgar/core/company.py | cik_lookup | Ahrvo-Trading-Systems/sec-edgar | python | @property
def cik_lookup(self):
return self._cik_lookup |
def get_urls(self, **kwargs):
'Get urls for all CIKs given to Filing object.\n\n Args:\n **kwargs: Anything to be passed to requests when making get request.\n See keyword arguments accepted for\n ``secedgar.client._base.AbstractClient.get_soup``.\n\n Returns:\... | 5,007,671,513,216,153,000 | Get urls for all CIKs given to Filing object.
Args:
**kwargs: Anything to be passed to requests when making get request.
See keyword arguments accepted for
``secedgar.client._base.AbstractClient.get_soup``.
Returns:
urls (list): List of urls for txt files to download. | secedgar/core/company.py | get_urls | Ahrvo-Trading-Systems/sec-edgar | python | def get_urls(self, **kwargs):
'Get urls for all CIKs given to Filing object.\n\n Args:\n **kwargs: Anything to be passed to requests when making get request.\n See keyword arguments accepted for\n ``secedgar.client._base.AbstractClient.get_soup``.\n\n Returns:\... |
def _get_urls_for_cik(self, cik, **kwargs):
'Get all urls for specific company according to CIK.\n\n Must match start date, end date, filing_type, and count parameters.\n\n Args:\n cik (str): CIK for company.\n **kwargs: Anything to be passed to requests when making get request.\... | -2,237,234,858,518,616,000 | Get all urls for specific company according to CIK.
Must match start date, end date, filing_type, and count parameters.
Args:
cik (str): CIK for company.
**kwargs: Anything to be passed to requests when making get request.
See keyword arguments accepted for
``secedgar.client._base.AbstractClie... | secedgar/core/company.py | _get_urls_for_cik | Ahrvo-Trading-Systems/sec-edgar | python | def _get_urls_for_cik(self, cik, **kwargs):
'Get all urls for specific company according to CIK.\n\n Must match start date, end date, filing_type, and count parameters.\n\n Args:\n cik (str): CIK for company.\n **kwargs: Anything to be passed to requests when making get request.\... |
def save(self, directory, dir_pattern=None, file_pattern=None):
'Save files in specified directory.\n\n Each txt url looks something like:\n https://www.sec.gov/Archives/edgar/data/1018724/000101872419000043/0001018724-19-000043.txt\n\n Args:\n directory (str): Path to directory wher... | -3,339,035,536,558,945,300 | Save files in specified directory.
Each txt url looks something like:
https://www.sec.gov/Archives/edgar/data/1018724/000101872419000043/0001018724-19-000043.txt
Args:
directory (str): Path to directory where files should be saved.
dir_pattern (str): Format string for subdirectories. Default is "{cik}/{type}"... | secedgar/core/company.py | save | Ahrvo-Trading-Systems/sec-edgar | python | def save(self, directory, dir_pattern=None, file_pattern=None):
'Save files in specified directory.\n\n Each txt url looks something like:\n https://www.sec.gov/Archives/edgar/data/1018724/000101872419000043/0001018724-19-000043.txt\n\n Args:\n directory (str): Path to directory wher... |
def __init__(self, host='localhost', port=8125, prefix=None, maxudpsize=512, ipv6=False):
'Create a new client.'
fam = (socket.AF_INET6 if ipv6 else socket.AF_INET)
(family, _, _, _, addr) = socket.getaddrinfo(host, port, fam, socket.SOCK_DGRAM)[0]
self._addr = addr
self._sock = socket.socket(family... | 3,399,727,971,886,863,400 | Create a new client. | statsd/client/udp.py | __init__ | alanhamlett/pystatsd | python | def __init__(self, host='localhost', port=8125, prefix=None, maxudpsize=512, ipv6=False):
fam = (socket.AF_INET6 if ipv6 else socket.AF_INET)
(family, _, _, _, addr) = socket.getaddrinfo(host, port, fam, socket.SOCK_DGRAM)[0]
self._addr = addr
self._sock = socket.socket(family, socket.SOCK_DGRAM)
... |
def _send(self, data):
'Send data to statsd.'
try:
self._sock.sendto(data.encode('ascii'), self._addr)
except (socket.error, RuntimeError):
pass | -785,161,261,134,684,800 | Send data to statsd. | statsd/client/udp.py | _send | alanhamlett/pystatsd | python | def _send(self, data):
try:
self._sock.sendto(data.encode('ascii'), self._addr)
except (socket.error, RuntimeError):
pass |
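The statsd `_send` above is deliberately fire-and-forget: encode the metric string as ASCII, push one UDP datagram, and swallow socket errors so instrumentation can never crash the application. A self-contained round-trip, with a throwaway local receiver standing in for a statsd daemon (`name:value|type` is the standard statsd wire format):

```python
import socket

def send_metric(addr, data):
    """Send one statsd-style datagram to `addr`, ignoring delivery failures."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(data.encode('ascii'), addr)
    except (socket.error, RuntimeError):
        pass  # statsd clients never raise on send failure
    finally:
        sock.close()
```

Because UDP is connectionless, the call succeeds even when nothing is listening, which is exactly why the client can be left on unconditionally in production.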
def _grpc_launch_server(self) -> Optional[int]:
'Launch grpc server and return port.'
kwargs: Dict[(str, Any)] = dict(close_fds=True)
pid = os.getpid()
with tempfile.TemporaryDirectory() as tmpdir:
fname = os.path.join(tmpdir, f'port-{pid}.txt')
pid_str = str(os.getpid())
exec_cm... | 100,848,241,870,260,340 | Launch grpc server and return port. | wandb/sdk/service/service.py | _grpc_launch_server | KnightZhang625/client | python | def _grpc_launch_server(self) -> Optional[int]:
kwargs: Dict[(str, Any)] = dict(close_fds=True)
pid = os.getpid()
with tempfile.TemporaryDirectory() as tmpdir:
fname = os.path.join(tmpdir, f'port-{pid}.txt')
pid_str = str(os.getpid())
exec_cmd_list = [sys.executable, '-m']
... |
def get_default_monitors(loss_op=None, summary_op=None, save_summary_steps=100, output_dir=None, summary_writer=None):
'Returns a default set of typically-used monitors.\n\n Args:\n loss_op: `Tensor`, the loss tensor. This will be printed using `PrintTensor`\n at the default interval.\n summary_op: Se... | 6,604,171,130,763,241,000 | Returns a default set of typically-used monitors.
Args:
loss_op: `Tensor`, the loss tensor. This will be printed using `PrintTensor`
at the default interval.
summary_op: See `SummarySaver`.
save_summary_steps: See `SummarySaver`.
output_dir: See `SummarySaver`.
summary_writer: See `SummarySaver`.
Ret... | tensorflow/contrib/learn/python/learn/monitors.py | get_default_monitors | Najah-lshanableh/tensorflow | python | def get_default_monitors(loss_op=None, summary_op=None, save_summary_steps=100, output_dir=None, summary_writer=None):
'Returns a default set of typically-used monitors.\n\n Args:\n loss_op: `Tensor`, the loss tensor. This will be printed using `PrintTensor`\n at the default interval.\n summary_op: Se... |
def _as_graph_element(obj):
'Retrieves Graph element.'
graph = ops.get_default_graph()
if (not isinstance(obj, six.string_types)):
if ((not hasattr(obj, 'graph')) or (obj.graph != graph)):
raise ValueError(('Passed %s should have graph attribute that is equal to current graph %s.' % (obj... | -4,531,516,043,276,649,000 | Retrieves Graph element. | tensorflow/contrib/learn/python/learn/monitors.py | _as_graph_element | Najah-lshanableh/tensorflow | python | def _as_graph_element(obj):
graph = ops.get_default_graph()
if (not isinstance(obj, six.string_types)):
if ((not hasattr(obj, 'graph')) or (obj.graph != graph)):
raise ValueError(('Passed %s should have graph attribute that is equal to current graph %s.' % (obj, graph)))
return ... |
def set_estimator(self, estimator):
'A setter called automatically by the target estimator.\n\n If the estimator is locked, this method does nothing.\n\n Args:\n estimator: the estimator that this monitor monitors.\n\n Raises:\n ValueError: if the estimator is None.\n '
if self._estimator_... | -7,733,641,930,113,615,000 | A setter called automatically by the target estimator.
If the estimator is locked, this method does nothing.
Args:
estimator: the estimator that this monitor monitors.
Raises:
ValueError: if the estimator is None. | tensorflow/contrib/learn/python/learn/monitors.py | set_estimator | Najah-lshanableh/tensorflow | python | def set_estimator(self, estimator):
'A setter called automatically by the target estimator.\n\n If the estimator is locked, this method does nothing.\n\n Args:\n estimator: the estimator that this monitor monitors.\n\n Raises:\n ValueError: if the estimator is None.\n '
if self._estimator_... |
def _lock_estimator(self):
'Locks the estimator until _unlock_estimator is called.'
self._estimator_locked = True | 3,023,368,246,751,459,000 | Locks the estimator until _unlock_estimator is called. | tensorflow/contrib/learn/python/learn/monitors.py | _lock_estimator | Najah-lshanableh/tensorflow | python | def _lock_estimator(self):
self._estimator_locked = True |
def _unlock_estimator(self):
'Unlocks the estimator.'
self._estimator_locked = False | 8,796,999,282,623,188,000 | Unlocks the estimator. | tensorflow/contrib/learn/python/learn/monitors.py | _unlock_estimator | Najah-lshanableh/tensorflow | python | def _unlock_estimator(self):
self._estimator_locked = False |
def begin(self, max_steps=None):
"Called at the beginning of training.\n\n When called, the default graph is the one we are executing.\n\n Args:\n max_steps: `int`, the maximum global step this training will run until.\n\n Raises:\n ValueError: if we've already begun a run.\n "
if self._be... | -249,357,529,644,142,240 | Called at the beginning of training.
When called, the default graph is the one we are executing.
Args:
max_steps: `int`, the maximum global step this training will run until.
Raises:
ValueError: if we've already begun a run. | tensorflow/contrib/learn/python/learn/monitors.py | begin | Najah-lshanableh/tensorflow | python | def begin(self, max_steps=None):
"Called at the beginning of training.\n\n When called, the default graph is the one we are executing.\n\n Args:\n max_steps: `int`, the maximum global step this training will run until.\n\n Raises:\n ValueError: if we've already begun a run.\n "
if self._be... |
def end(self, session=None):
"Callback at the end of training/evaluation.\n\n Args:\n session: A `tf.Session` object that can be used to run ops.\n\n Raises:\n ValueError: if we've not begun a run.\n "
_ = session
if (not self._begun):
raise ValueError('end called without begin.')... | 3,358,963,026,610,282,500 | Callback at the end of training/evaluation.
Args:
session: A `tf.Session` object that can be used to run ops.
Raises:
ValueError: if we've not begun a run. | tensorflow/contrib/learn/python/learn/monitors.py | end | Najah-lshanableh/tensorflow | python | def end(self, session=None):
"Callback at the end of training/evaluation.\n\n Args:\n session: A `tf.Session` object that can be used to run ops.\n\n Raises:\n ValueError: if we've not begun a run.\n "
_ = session
if (not self._begun):
raise ValueError('end called without begin.')... |
def epoch_begin(self, epoch):
"Begin epoch.\n\n Args:\n epoch: `int`, the epoch number.\n\n Raises:\n ValueError: if we've already begun an epoch, or `epoch` < 0.\n "
if (self._current_epoch is not None):
raise ValueError('epoch_begin called twice without epoch_end.')
if (epoch < ... | -6,977,125,567,667,057,000 | Begin epoch.
Args:
epoch: `int`, the epoch number.
Raises:
ValueError: if we've already begun an epoch, or `epoch` < 0. | tensorflow/contrib/learn/python/learn/monitors.py | epoch_begin | Najah-lshanableh/tensorflow | python | def epoch_begin(self, epoch):
"Begin epoch.\n\n Args:\n epoch: `int`, the epoch number.\n\n Raises:\n ValueError: if we've already begun an epoch, or `epoch` < 0.\n "
if (self._current_epoch is not None):
raise ValueError('epoch_begin called twice without epoch_end.')
if (epoch < ... |
def epoch_end(self, epoch):
"End epoch.\n\n Args:\n epoch: `int`, the epoch number.\n\n Raises:\n ValueError: if we've not begun an epoch, or `epoch` number does not match.\n "
if (self._current_epoch != epoch):
raise ValueError('epoch_end expected %s but got %s.', self._current_epoch... | 7,613,804,702,632,359,000 | End epoch.
Args:
epoch: `int`, the epoch number.
Raises:
ValueError: if we've not begun an epoch, or `epoch` number does not match. | tensorflow/contrib/learn/python/learn/monitors.py | epoch_end | Najah-lshanableh/tensorflow | python | def epoch_end(self, epoch):
"End epoch.\n\n Args:\n epoch: `int`, the epoch number.\n\n Raises:\n ValueError: if we've not begun an epoch, or `epoch` number does not match.\n "
if (self._current_epoch != epoch):
raise ValueError('epoch_end expected %s but got %s.', self._current_epoch... |
def step_begin(self, step):
"Callback before training step begins.\n\n You may use this callback to request evaluation of additional tensors\n in the graph.\n\n Args:\n step: `int`, the current value of the global step.\n\n Returns:\n List of `Tensor` objects or string tensor names to be run.\... | -5,978,711,741,628,458,000 | Callback before training step begins.
You may use this callback to request evaluation of additional tensors
in the graph.
Args:
step: `int`, the current value of the global step.
Returns:
List of `Tensor` objects or string tensor names to be run.
Raises:
ValueError: if we've already begun a step, or `step` < ... | tensorflow/contrib/learn/python/learn/monitors.py | step_begin | Najah-lshanableh/tensorflow | python | def step_begin(self, step):
"Callback before training step begins.\n\n You may use this callback to request evaluation of additional tensors\n in the graph.\n\n Args:\n step: `int`, the current value of the global step.\n\n Returns:\n List of `Tensor` objects or string tensor names to be run.\... |
def step_end(self, step, output):
"Callback after training step finished.\n\n This callback provides access to the tensors/ops evaluated at this step,\n including the additional tensors for which evaluation was requested in\n `step_begin`.\n\n In addition, the callback has the opportunity to stop traini... | 8,606,716,016,347,808,000 | Callback after training step finished.
This callback provides access to the tensors/ops evaluated at this step,
including the additional tensors for which evaluation was requested in
`step_begin`.
In addition, the callback has the opportunity to stop training by returning
`True`. This is useful for early stopping, fo... | tensorflow/contrib/learn/python/learn/monitors.py | step_end | Najah-lshanableh/tensorflow | python | def step_end(self, step, output):
"Callback after training step finished.\n\n This callback provides access to the tensors/ops evaluated at this step,\n including the additional tensors for which evaluation was requested in\n `step_begin`.\n\n In addition, the callback has the opportunity to stop traini... |
def post_step(self, step, session):
'Callback after the step is finished.\n\n Called after step_end and receives session to perform extra session.run\n calls. If failure occurred in the process, will be called as well.\n\n Args:\n step: `int`, global step of the model.\n session: `Session` object... | -6,104,330,041,348,578,000 | Callback after the step is finished.
Called after step_end and receives session to perform extra session.run
calls. If a failure occurred in the process, this callback will still be called.
Args:
step: `int`, global step of the model.
session: `Session` object. | tensorflow/contrib/learn/python/learn/monitors.py | post_step | Najah-lshanableh/tensorflow | python | def post_step(self, step, session):
'Callback after the step is finished.\n\n Called after step_end and receives session to perform extra session.run\n calls. If failure occurred in the process, will be called as well.\n\n Args:\n step: `int`, global step of the model.\n session: `Session` object... |
def __init__(self, every_n_steps=100, first_n_steps=1):
'Initializes an `EveryN` monitor.\n\n Args:\n every_n_steps: `int`, the number of steps to allow between callbacks.\n first_n_steps: `int`, specifying the number of initial steps during\n which the callbacks will always be executed, regardl... | 4,045,912,460,634,086,400 | Initializes an `EveryN` monitor.
Args:
every_n_steps: `int`, the number of steps to allow between callbacks.
first_n_steps: `int`, specifying the number of initial steps during
which the callbacks will always be executed, regardless of the value
of `every_n_steps`. Note that this value is relative to the g... | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, every_n_steps=100, first_n_steps=1):
'Initializes an `EveryN` monitor.\n\n Args:\n every_n_steps: `int`, the number of steps to allow between callbacks.\n first_n_steps: `int`, specifying the number of initial steps during\n which the callbacks will always be executed, regardl... |
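The `every_n_steps` / `first_n_steps` semantics described in this record can be sketched as a small standalone gate. `EveryNGate` is a hypothetical name for illustration, not the TensorFlow class:

```python
class EveryNGate:
    """Illustrative sketch of the every-N step gating described above.

    Callbacks always fire during the first `first_n_steps` global steps,
    and afterwards at most once per `every_n_steps` steps.
    """

    def __init__(self, every_n_steps=100, first_n_steps=1):
        self._every_n_steps = every_n_steps
        self._first_n_steps = first_n_steps
        self._last_active_step = 0

    def should_fire(self, step):
        # Always fire for the first N global steps.
        if step <= self._first_n_steps:
            self._last_active_step = step
            return True
        # Otherwise fire only when enough steps have elapsed.
        if step - self._last_active_step >= self._every_n_steps:
            self._last_active_step = step
            return True
        return False

gate = EveryNGate(every_n_steps=10, first_n_steps=2)
fired = [s for s in range(1, 31) if gate.should_fire(s)]  # [1, 2, 12, 22]
```

Note that the interval is measured relative to the last step at which the gate fired, so the cadence self-corrects if steps are skipped.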
def every_n_step_begin(self, step):
"Callback before every n'th step begins.\n\n Args:\n step: `int`, the current value of the global step.\n\n Returns:\n A `list` of tensors that will be evaluated at this step.\n "
return [] | 864,840,899,116,364,400 | Callback before every n'th step begins.
Args:
step: `int`, the current value of the global step.
Returns:
A `list` of tensors that will be evaluated at this step. | tensorflow/contrib/learn/python/learn/monitors.py | every_n_step_begin | Najah-lshanableh/tensorflow | python | def every_n_step_begin(self, step):
"Callback before every n'th step begins.\n\n Args:\n step: `int`, the current value of the global step.\n\n Returns:\n A `list` of tensors that will be evaluated at this step.\n "
return [] |
def every_n_step_end(self, step, outputs):
"Callback after every n'th step finished.\n\n This callback provides access to the tensors/ops evaluated at this step,\n including the additional tensors for which evaluation was requested in\n `step_begin`.\n\n In addition, the callback has the opportunity to ... | 3,292,307,751,707,634,000 | Callback after every n'th step finished.
This callback provides access to the tensors/ops evaluated at this step,
including the additional tensors for which evaluation was requested in
`step_begin`.
In addition, the callback has the opportunity to stop training by returning
`True`. This is useful for early stopping, ... | tensorflow/contrib/learn/python/learn/monitors.py | every_n_step_end | Najah-lshanableh/tensorflow | python | def every_n_step_end(self, step, outputs):
"Callback after every n'th step finished.\n\n This callback provides access to the tensors/ops evaluated at this step,\n including the additional tensors for which evaluation was requested in\n `step_begin`.\n\n In addition, the callback has the opportunity to ... |
def every_n_post_step(self, step, session):
'Callback after a step is finished or `end()` is called.\n\n Args:\n step: `int`, the current value of the global step.\n session: `Session` object.\n '
pass | -5,656,285,919,200,097,000 | Callback after a step is finished or `end()` is called.
Args:
step: `int`, the current value of the global step.
session: `Session` object. | tensorflow/contrib/learn/python/learn/monitors.py | every_n_post_step | Najah-lshanableh/tensorflow | python | def every_n_post_step(self, step, session):
'Callback after a step is finished or `end()` is called.\n\n Args:\n step: `int`, the current value of the global step.\n session: `Session` object.\n '
pass |
def step_begin(self, step):
'Overrides `BaseMonitor.step_begin`.\n\n When overriding this method, you must call the super implementation.\n\n Args:\n step: `int`, the current value of the global step.\n Returns:\n A `list`, the result of every_n_step_begin, if that was called this step,\n or... | -3,994,016,857,027,051,000 | Overrides `BaseMonitor.step_begin`.
When overriding this method, you must call the super implementation.
Args:
step: `int`, the current value of the global step.
Returns:
A `list`, the result of every_n_step_begin, if that was called this step,
or an empty list otherwise.
Raises:
ValueError: if called more t... | tensorflow/contrib/learn/python/learn/monitors.py | step_begin | Najah-lshanableh/tensorflow | python | def step_begin(self, step):
'Overrides `BaseMonitor.step_begin`.\n\n When overriding this method, you must call the super implementation.\n\n Args:\n step: `int`, the current value of the global step.\n Returns:\n A `list`, the result of every_n_step_begin, if that was called this step,\n or... |
def step_end(self, step, output):
'Overrides `BaseMonitor.step_end`.\n\n When overriding this method, you must call the super implementation.\n\n Args:\n step: `int`, the current value of the global step.\n output: `dict` mapping `string` values representing tensor names to\n the value result... | 7,474,241,267,920,073,000 | Overrides `BaseMonitor.step_end`.
When overriding this method, you must call the super implementation.
Args:
step: `int`, the current value of the global step.
output: `dict` mapping `string` values representing tensor names to
the value resulted from running these tensors. Values may be either
scalars, f... | tensorflow/contrib/learn/python/learn/monitors.py | step_end | Najah-lshanableh/tensorflow | python | def step_end(self, step, output):
'Overrides `BaseMonitor.step_end`.\n\n When overriding this method, you must call the super implementation.\n\n Args:\n step: `int`, the current value of the global step.\n output: `dict` mapping `string` values representing tensor names to\n the value result... |
def __init__(self, num_steps=None, last_step=None):
'Create a StopAtStep monitor.\n\n This monitor requests stop after either a number of steps have been\n executed or a last step has been reached. Only of the two options can be\n specified.\n\n if `num_steps` is specified, it indicates the number of s... | 8,602,064,052,097,256,000 | Create a StopAtStep monitor.
This monitor requests stop after either a number of steps have been
executed or a last step has been reached. Only one of the two options can be
specified.
If `num_steps` is specified, it indicates the number of steps to execute
after `begin()` is called. If instead `last_step` is specified... | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, num_steps=None, last_step=None):
'Create a StopAtStep monitor.\n\n This monitor requests stop after either a number of steps have been\n executed or a last step has been reached. Only of the two options can be\n specified.\n\n if `num_steps` is specified, it indicates the number of s... |
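A minimal sketch of the stop-at-step behaviour this docstring describes, assuming a plain Python gate rather than the real monitor interface:

```python
class StopAtStepGate:
    """Hypothetical sketch of StopAtStep: exactly one of num_steps
    (relative to begin) or last_step (absolute) selects the stop point."""

    def __init__(self, num_steps=None, last_step=None):
        if (num_steps is None) == (last_step is None):
            raise ValueError('Exactly one of num_steps or last_step must be given.')
        self._num_steps = num_steps
        self._last_step = last_step

    def begin(self, start_step):
        # With num_steps, the stopping point is relative to where we begin.
        if self._num_steps is not None:
            self._last_step = start_step + self._num_steps

    def step_end(self, step):
        # Returning True requests that training stop.
        return step >= self._last_step

gate = StopAtStepGate(num_steps=5)
gate.begin(start_step=10)
stops = [gate.step_end(s) for s in (14, 15, 16)]  # [False, True, True]
```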
def __init__(self, tensor_names, every_n=100, first_n=1):
'Initializes a PrintTensor monitor.\n\n Args:\n tensor_names: `dict` of tag to tensor names or\n `iterable` of tensor names (strings).\n every_n: `int`, print every N steps. See `PrintN.`\n first_n: `int`, also print the first N st... | 8,793,686,181,907,383,000 | Initializes a PrintTensor monitor.
Args:
tensor_names: `dict` of tag to tensor names or
`iterable` of tensor names (strings).
every_n: `int`, print every N steps. See `PrintN.`
first_n: `int`, also print the first N steps. See `PrintN.` | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, tensor_names, every_n=100, first_n=1):
'Initializes a PrintTensor monitor.\n\n Args:\n tensor_names: `dict` of tag to tensor names or\n `iterable` of tensor names (strings).\n every_n: `int`, print every N steps. See `PrintN.`\n first_n: `int`, also print the first N st... |
def __init__(self, scope=None, every_n=100, first_n=1):
'Initializes LoggingTrainable monitor.\n\n Args:\n scope: An optional string to match variable names using re.match.\n every_n: Print every N steps.\n first_n: Print first N steps.\n '
super(LoggingTrainable, self).__init__(every_n, fi... | -239,949,726,753,383,100 | Initializes LoggingTrainable monitor.
Args:
scope: An optional string to match variable names using re.match.
every_n: Print every N steps.
first_n: Print first N steps. | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, scope=None, every_n=100, first_n=1):
'Initializes LoggingTrainable monitor.\n\n Args:\n scope: An optional string to match variable names using re.match.\n every_n: Print every N steps.\n first_n: Print first N steps.\n '
super(LoggingTrainable, self).__init__(every_n, fi... |
def __init__(self, summary_op, save_steps=100, output_dir=None, summary_writer=None, scaffold=None):
"Initializes a `SummarySaver` monitor.\n\n Args:\n summary_op: `Tensor` of type `string`. A serialized `Summary` protocol\n buffer, as output by TF summary methods like `scalar_summary` or\n ... | 7,110,141,303,646,149,000 | Initializes a `SummarySaver` monitor.
Args:
summary_op: `Tensor` of type `string`. A serialized `Summary` protocol
buffer, as output by TF summary methods like `scalar_summary` or
`merge_all_summaries`.
save_steps: `int`, save summaries every N steps. See `EveryN`.
output_dir: `string`, the directory... | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, summary_op, save_steps=100, output_dir=None, summary_writer=None, scaffold=None):
"Initializes a `SummarySaver` monitor.\n\n Args:\n summary_op: `Tensor` of type `string`. A serialized `Summary` protocol\n buffer, as output by TF summary methods like `scalar_summary` or\n ... |
def __init__(self, x=None, y=None, input_fn=None, batch_size=None, eval_steps=None, every_n_steps=100, metrics=None, early_stopping_rounds=None, early_stopping_metric='loss', early_stopping_metric_minimize=True, name=None):
'Initializes a ValidationMonitor.\n\n Args:\n x: See `BaseEstimator.evaluate`.\n ... | 6,542,023,680,113,299,000 | Initializes a ValidationMonitor.
Args:
x: See `BaseEstimator.evaluate`.
y: See `BaseEstimator.evaluate`.
input_fn: See `BaseEstimator.evaluate`.
batch_size: See `BaseEstimator.evaluate`.
eval_steps: See `BaseEstimator.evaluate`.
every_n_steps: Check for new checkpoints to evaluate every N steps. If a
... | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, x=None, y=None, input_fn=None, batch_size=None, eval_steps=None, every_n_steps=100, metrics=None, early_stopping_rounds=None, early_stopping_metric='loss', early_stopping_metric_minimize=True, name=None):
'Initializes a ValidationMonitor.\n\n Args:\n x: See `BaseEstimator.evaluate`.\n ... |
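The early-stopping bookkeeping hinted at by `early_stopping_rounds`, `early_stopping_metric_minimize`, `best_value` and `best_step` can be sketched as follows; the class name and interface here are assumptions, not the ValidationMonitor API:

```python
class EarlyStoppingTracker:
    """Illustrative sketch of early-stopping bookkeeping."""

    def __init__(self, rounds, minimize=True):
        self._rounds = rounds
        self._minimize = minimize
        self.best_value = None
        self.best_step = None

    def update(self, step, value):
        """Record a metric value; return True if training should stop."""
        improved = (
            self.best_value is None
            or (self._minimize and value < self.best_value)
            or (not self._minimize and value > self.best_value)
        )
        if improved:
            self.best_value = value
            self.best_step = step
        # Stop when no improvement for `rounds` steps since the best one.
        return step - self.best_step >= self._rounds

tracker = EarlyStoppingTracker(rounds=2)
decisions = [tracker.update(s, v)
             for s, v in [(1, 0.9), (2, 0.7), (3, 0.8), (4, 0.75)]]
```

Here the loss improves at step 2 and then stalls, so the tracker requests a stop at step 4, two rounds after the best step.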
@property
def early_stopped(self):
'Returns True if this monitor caused an early stop.'
return self._early_stopped | -4,954,369,659,552,044,000 | Returns True if this monitor caused an early stop. | tensorflow/contrib/learn/python/learn/monitors.py | early_stopped | Najah-lshanableh/tensorflow | python | @property
def early_stopped(self):
return self._early_stopped |
@property
def best_step(self):
'Returns the step at which the best early stopping metric was found.'
return self._best_value_step | -5,269,908,394,140,271,000 | Returns the step at which the best early stopping metric was found. | tensorflow/contrib/learn/python/learn/monitors.py | best_step | Najah-lshanableh/tensorflow | python | @property
def best_step(self):
return self._best_value_step |
@property
def best_value(self):
'Returns the best early stopping metric value found so far.'
return self._best_value | 1,251,436,279,331,486,200 | Returns the best early stopping metric value found so far. | tensorflow/contrib/learn/python/learn/monitors.py | best_value | Najah-lshanableh/tensorflow | python | @property
def best_value(self):
return self._best_value |
def __init__(self, var_name, every_n=100, first_n=1):
'Initializes a CaptureVariable monitor.\n\n Args:\n var_name: `string`. The variable name, including suffix (typically ":0").\n every_n: `int`, print every N steps. See `PrintN.`\n first_n: `int`, also print the first N steps. See `PrintN.`\n ... | -715,672,332,563,627,600 | Initializes a CaptureVariable monitor.
Args:
var_name: `string`. The variable name, including suffix (typically ":0").
every_n: `int`, print every N steps. See `PrintN.`
first_n: `int`, also print the first N steps. See `PrintN.` | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, var_name, every_n=100, first_n=1):
'Initializes a CaptureVariable monitor.\n\n Args:\n var_name: `string`. The variable name, including suffix (typically ":0").\n every_n: `int`, print every N steps. See `PrintN.`\n first_n: `int`, also print the first N steps. See `PrintN.`\n ... |
@property
def values(self):
'Returns the values captured so far.\n\n Returns:\n `dict` mapping `int` step numbers to that values of the variable at the\n respective step.\n '
return self._var_values | -2,056,683,639,811,959,000 | Returns the values captured so far.
Returns:
`dict` mapping `int` step numbers to the values of the variable at the
respective step. | tensorflow/contrib/learn/python/learn/monitors.py | values | Najah-lshanableh/tensorflow | python | @property
def values(self):
'Returns the values captured so far.\n\n Returns:\n `dict` mapping `int` step numbers to that values of the variable at the\n respective step.\n '
return self._var_values |
def __init__(self, ignore_ops=None):
'Initializes GraphDump monitor.\n\n Args:\n ignore_ops: `list` of `string`. Names of ops to ignore.\n If None, `GraphDump.IGNORE_OPS` is used.\n '
super(GraphDump, self).__init__()
self._ignore_ops = (ignore_ops or GraphDump.IGNORE_OPS)
self._data... | -7,470,051,855,634,178,000 | Initializes GraphDump monitor.
Args:
ignore_ops: `list` of `string`. Names of ops to ignore.
If None, `GraphDump.IGNORE_OPS` is used. | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, ignore_ops=None):
'Initializes GraphDump monitor.\n\n Args:\n ignore_ops: `list` of `string`. Names of ops to ignore.\n If None, `GraphDump.IGNORE_OPS` is used.\n '
super(GraphDump, self).__init__()
self._ignore_ops = (ignore_ops or GraphDump.IGNORE_OPS)
self._data... |
def compare(self, other_dump, step, atol=1e-06):
'Compares two `GraphDump` monitors and returns differences.\n\n Args:\n other_dump: Another `GraphDump` monitor.\n step: `int`, step to compare on.\n atol: `float`, absolute tolerance in comparison of floating arrays.\n\n Returns:\n Returns ... | 3,847,063,021,059,542,500 | Compares two `GraphDump` monitors and returns differences.
Args:
other_dump: Another `GraphDump` monitor.
step: `int`, step to compare on.
atol: `float`, absolute tolerance in comparison of floating arrays.
Returns:
Returns tuple:
matched: `list` of keys that matched.
non_matched: `dict` of keys to tu... | tensorflow/contrib/learn/python/learn/monitors.py | compare | Najah-lshanableh/tensorflow | python | def compare(self, other_dump, step, atol=1e-06):
'Compares two `GraphDump` monitors and returns differences.\n\n Args:\n other_dump: Another `GraphDump` monitor.\n step: `int`, step to compare on.\n atol: `float`, absolute tolerance in comparison of floating arrays.\n\n Returns:\n Returns ... |
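The matched/non-matched comparison this docstring describes can be sketched over plain dicts of tensor name to list-of-floats (a simplifying assumption; the real monitor compares dumped tensor data at a given step):

```python
from math import isclose

def compare_dumps(this_data, other_data, atol=1e-06):
    """Sketch: split shared keys into matched and non-matched by atol."""
    matched, non_matched = [], {}
    for key, value in this_data.items():
        if key not in other_data:
            continue
        other = other_data[key]
        # Two value lists match when every element agrees within atol.
        same = len(value) == len(other) and all(
            isclose(a, b, rel_tol=0.0, abs_tol=atol)
            for a, b in zip(value, other))
        if same:
            matched.append(key)
        else:
            non_matched[key] = (value, other)
    return matched, non_matched

matched, non_matched = compare_dumps(
    {'a:0': [1.0, 2.0], 'b:0': [3.0]},
    {'a:0': [1.0, 2.0000001], 'b:0': [4.0]})
```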
@deprecated_arg_values('2016-09-23', "The signature of the input_fn accepted by export is changing to be consistent with what's used by tf.Learn Estimator's train/evaluate. input_fn (and in most cases, input_feature_key) will both become required args.", input_fn=None)
def __init__(self, every_n_steps, export_dir, inpu... | 2,032,589,773,287,838,200 | Initializes ExportMonitor.
Args:
every_n_steps: Run monitor every N steps.
export_dir: str, folder to export.
input_fn: A function that takes no argument and returns a tuple of
(features, targets), where features is a dict of string key to `Tensor`
and targets is a `Tensor` that's currently not used (and... | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | @deprecated_arg_values('2016-09-23', "The signature of the input_fn accepted by export is changing to be consistent with what's used by tf.Learn Estimator's train/evaluate. input_fn (and in most cases, input_feature_key) will both become required args.", input_fn=None)
def __init__(self, every_n_steps, export_dir, inpu... |
@property
def last_export_dir(self):
'Returns the directory containing the last completed export.\n\n Returns:\n The string path to the exported directory. NB: this functionality was\n added on 2016/09/25; clients that depend on the return value may need\n to handle the case where this function re... | -7,370,270,321,409,989,000 | Returns the directory containing the last completed export.
Returns:
The string path to the exported directory. NB: this functionality was
added on 2016/09/25; clients that depend on the return value may need
to handle the case where this function returns None because the
estimator being fitted does not yet re... | tensorflow/contrib/learn/python/learn/monitors.py | last_export_dir | Najah-lshanableh/tensorflow | python | @property
def last_export_dir(self):
'Returns the directory containing the last completed export.\n\n Returns:\n The string path to the exported directory. NB: this functionality was\n added on 2016/09/25; clients that depend on the return value may need\n to handle the case where this function re... |
def __init__(self, checkpoint_dir, save_secs=None, save_steps=None, saver=None, checkpoint_basename='model.ckpt', scaffold=None):
'Initialize CheckpointSaver monitor.\n\n Args:\n checkpoint_dir: `str`, base directory for the checkpoint files.\n save_secs: `int`, save every N secs.\n save_steps: `i... | -8,126,788,662,230,020,000 | Initialize CheckpointSaver monitor.
Args:
checkpoint_dir: `str`, base directory for the checkpoint files.
save_secs: `int`, save every N secs.
save_steps: `int`, save every N steps.
saver: `Saver` object, used for saving.
checkpoint_basename: `str`, base name for the checkpoint files.
scaffold: `Scaffold`,... | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, checkpoint_dir, save_secs=None, save_steps=None, saver=None, checkpoint_basename='model.ckpt', scaffold=None):
'Initialize CheckpointSaver monitor.\n\n Args:\n checkpoint_dir: `str`, base directory for the checkpoint files.\n save_secs: `int`, save every N secs.\n save_steps: `i... |
def _save(self, step, session):
'Saves the latest checkpoint.'
if (step == self._last_saved_step):
return
logging.info('Saving checkpoints for %d into %s.', step, self._save_path)
self._last_saved_time = time.time()
self._last_saved_step = step
if (self._saver is None):
self._sca... | 9,139,546,512,531,420,000 | Saves the latest checkpoint. | tensorflow/contrib/learn/python/learn/monitors.py | _save | Najah-lshanableh/tensorflow | python | def _save(self, step, session):
if (step == self._last_saved_step):
return
logging.info('Saving checkpoints for %d into %s.', step, self._save_path)
self._last_saved_time = time.time()
self._last_saved_step = step
if (self._saver is None):
self._scaffold.saver.save(session, self... |
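The save-every-N-seconds-or-steps decision implied by `save_secs` / `save_steps` and the `_last_saved_*` state can be sketched as a pure function; the name and signature here are assumptions for illustration:

```python
def should_save(step, now, last_saved_step, last_saved_time,
                save_secs=None, save_steps=None):
    """Sketch of the checkpoint-timing decision described above."""
    if step == last_saved_step:
        # Never save the same step twice.
        return False
    if save_steps is not None and step - last_saved_step >= save_steps:
        return True
    if save_secs is not None and now - last_saved_time >= save_secs:
        return True
    return False

decisions = [
    should_save(step=100, now=5.0, last_saved_step=100,
                last_saved_time=0.0, save_secs=1),   # same step: skip
    should_save(step=101, now=0.5, last_saved_step=100,
                last_saved_time=0.0, save_secs=1),   # too soon: skip
    should_save(step=105, now=0.5, last_saved_step=100,
                last_saved_time=0.0, save_steps=5),  # 5 steps elapsed: save
]
```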
def __init__(self, loss_tensor, every_n_steps=100, fail_on_nan_loss=True):
'Initializes NanLoss monitor.\n\n Args:\n loss_tensor: `Tensor`, the loss tensor.\n every_n_steps: `int`, run check every this many steps.\n fail_on_nan_loss: `bool`, whether to raise exception when loss is NaN.\n '
... | -1,714,058,577,219,217,000 | Initializes NanLoss monitor.
Args:
loss_tensor: `Tensor`, the loss tensor.
every_n_steps: `int`, run check every this many steps.
fail_on_nan_loss: `bool`, whether to raise exception when loss is NaN. | tensorflow/contrib/learn/python/learn/monitors.py | __init__ | Najah-lshanableh/tensorflow | python | def __init__(self, loss_tensor, every_n_steps=100, fail_on_nan_loss=True):
'Initializes NanLoss monitor.\n\n Args:\n loss_tensor: `Tensor`, the loss tensor.\n every_n_steps: `int`, run check every this many steps.\n fail_on_nan_loss: `bool`, whether to raise exception when loss is NaN.\n '
... |
def reload(self):
"Reload image from disk. This facilitates re-loading of\n images from disk in case the image content changes.\n\n .. versionadded:: 1.3.0\n\n Usage::\n\n im = Image(source = '1.jpg')\n # -- do something --\n im.reload()\n # image wil... | 4,258,685,529,131,743,700 | Reload image from disk. This facilitates re-loading of
images from disk in case the image content changes.
.. versionadded:: 1.3.0
Usage::
im = Image(source = '1.jpg')
# -- do something --
im.reload()
# image will be re-loaded from disk | kivy/uix/image.py | reload | eman1can/kivy | python | def reload(self):
"Reload image from disk. This facilitates re-loading of\n images from disk in case the image content changes.\n\n .. versionadded:: 1.3.0\n\n Usage::\n\n im = Image(source = '1.jpg')\n # -- do something --\n im.reload()\n # image wil... |
def remove_from_cache(self):
'Remove image from cache.\n\n .. versionadded:: 2.0.0\n '
if self._coreimage:
self._coreimage.remove_from_cache() | 5,514,899,281,433,177,000 | Remove image from cache.
.. versionadded:: 2.0.0 | kivy/uix/image.py | remove_from_cache | eman1can/kivy | python | def remove_from_cache(self):
'Remove image from cache.\n\n .. versionadded:: 2.0.0\n '
if self._coreimage:
self._coreimage.remove_from_cache() |
def load_collada(file_obj, resolver=None, **kwargs):
'\n Load a COLLADA (.dae) file into a list of trimesh kwargs.\n\n Parameters\n ----------\n file_obj : file object\n Containing a COLLADA file\n resolver : trimesh.visual.Resolver or None\n For loading referenced files, like texture image... | 6,059,635,877,470,790,000 | Load a COLLADA (.dae) file into a list of trimesh kwargs.
Parameters
----------
file_obj : file object
Containing a COLLADA file
resolver : trimesh.visual.Resolver or None
For loading referenced files, like texture images
kwargs : **
Passed to trimesh.Trimesh.__init__
Returns
-------
loaded : list of dict
kwa... | trimesh/exchange/dae.py | load_collada | BerkeleyAutomation/trimesh | python | def load_collada(file_obj, resolver=None, **kwargs):
'\n Load a COLLADA (.dae) file into a list of trimesh kwargs.\n\n Parameters\n ----------\n file_obj : file object\n Containing a COLLADA file\n resolver : trimesh.visual.Resolver or None\n For loading referenced files, like texture image... |
def export_collada(mesh, **kwargs):
'\n Export a mesh or a list of meshes as a COLLADA .dae file.\n\n Parameters\n -----------\n mesh: Trimesh object or list of Trimesh objects\n The mesh(es) to export.\n\n Returns\n -----------\n export: str, string of COLLADA format output\n '
m... | -1,312,581,392,268,734,700 | Export a mesh or a list of meshes as a COLLADA .dae file.
Parameters
-----------
mesh: Trimesh object or list of Trimesh objects
The mesh(es) to export.
Returns
-----------
export: str, string of COLLADA format output | trimesh/exchange/dae.py | export_collada | BerkeleyAutomation/trimesh | python | def export_collada(mesh, **kwargs):
'\n Export a mesh or a list of meshes as a COLLADA .dae file.\n\n Parameters\n -----------\n mesh: Trimesh object or list of Trimesh objects\n The mesh(es) to export.\n\n Returns\n -----------\n export: str, string of COLLADA format output\n '
m... |
def _parse_node(node, parent_matrix, material_map, meshes, graph, resolver=None):
'\n Recursively parse COLLADA scene nodes.\n '
if isinstance(node, collada.scene.GeometryNode):
geometry = node.geometry
local_material_map = {}
for mn in node.materials:
symbol = mn.symbo... | -3,186,675,801,806,256,000 | Recursively parse COLLADA scene nodes. | trimesh/exchange/dae.py | _parse_node | BerkeleyAutomation/trimesh | python | def _parse_node(node, parent_matrix, material_map, meshes, graph, resolver=None):
'\n \n '
if isinstance(node, collada.scene.GeometryNode):
geometry = node.geometry
local_material_map = {}
for mn in node.materials:
symbol = mn.symbol
m = mn.target
... |
def _load_texture(file_name, resolver):
'\n Load a texture from a file into a PIL image.\n '
file_data = resolver.get(file_name)
image = PIL.Image.open(util.wrap_as_stream(file_data))
return image | 5,463,406,226,342,628,000 | Load a texture from a file into a PIL image. | trimesh/exchange/dae.py | _load_texture | BerkeleyAutomation/trimesh | python | def _load_texture(file_name, resolver):
'\n \n '
file_data = resolver.get(file_name)
image = PIL.Image.open(util.wrap_as_stream(file_data))
return image |
def _parse_material(effect, resolver):
'\n Turn a COLLADA effect into a trimesh material.\n '
baseColorFactor = np.ones(4)
baseColorTexture = None
if isinstance(effect.diffuse, collada.material.Map):
try:
baseColorTexture = _load_texture(effect.diffuse.sampler.surface.image.pat... | -8,106,719,459,313,488,000 | Turn a COLLADA effect into a trimesh material. | trimesh/exchange/dae.py | _parse_material | BerkeleyAutomation/trimesh | python | def _parse_material(effect, resolver):
'\n \n '
baseColorFactor = np.ones(4)
baseColorTexture = None
if isinstance(effect.diffuse, collada.material.Map):
try:
baseColorTexture = _load_texture(effect.diffuse.sampler.surface.image.path, resolver)
except BaseException:
... |
def _unparse_material(material):
'\n Turn a trimesh material into a COLLADA material.\n '
if isinstance(material, visual.material.PBRMaterial):
diffuse = material.baseColorFactor
if (diffuse is not None):
diffuse = list(diffuse)
emission = material.emissiveFactor
... | -5,805,063,635,426,141,000 | Turn a trimesh material into a COLLADA material. | trimesh/exchange/dae.py | _unparse_material | BerkeleyAutomation/trimesh | python | def _unparse_material(material):
'\n \n '
if isinstance(material, visual.material.PBRMaterial):
diffuse = material.baseColorFactor
if (diffuse is not None):
diffuse = list(diffuse)
emission = material.emissiveFactor
if (emission is not None):
emissio... |
def load_zae(file_obj, resolver=None, **kwargs):
'\n Load a ZAE file, which is just a zipped DAE file.\n\n Parameters\n -------------\n file_obj : file object\n Contains ZAE data\n resolver : trimesh.visual.Resolver\n Resolver to load additional assets\n kwargs : dict\n Passed to lo... | -1,790,349,105,444,850,700 | Load a ZAE file, which is just a zipped DAE file.
Parameters
-------------
file_obj : file object
Contains ZAE data
resolver : trimesh.visual.Resolver
Resolver to load additional assets
kwargs : dict
Passed to load_collada
Returns
------------
loaded : dict
Results of loading | trimesh/exchange/dae.py | load_zae | BerkeleyAutomation/trimesh | python | def load_zae(file_obj, resolver=None, **kwargs):
'\n Load a ZAE file, which is just a zipped DAE file.\n\n Parameters\n -------------\n file_obj : file object\n Contains ZAE data\n resolver : trimesh.visual.Resolver\n Resolver to load additional assets\n kwargs : dict\n Passed to lo... |
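The record above describes a ZAE file as simply a zipped DAE file. The body is truncated here, but the core unzip step can be sketched as follows — a minimal sketch, not trimesh's actual implementation; the helper name `extract_dae` and the single-scene assumption are mine, and the real `load_zae` also resolves the archive's other members as texture assets:

```python
import io
import zipfile

def extract_dae(zae_bytes):
    """Return the raw bytes of the first .dae member of a ZAE (ZIP) archive.

    Hypothetical helper: assumes the archive contains exactly one scene.
    """
    with zipfile.ZipFile(io.BytesIO(zae_bytes)) as archive:
        for name in archive.namelist():
            # COLLADA scenes conventionally carry the .dae extension
            if name.lower().endswith('.dae'):
                return archive.read(name)
    raise ValueError('no .dae scene found in archive')
```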
def tamper(payload, **kwargs):
"\n Unicode-escapes non-encoded characters in a given payload (not processing already encoded) (e.g. SELECT -> SELECT)\n\n Notes:\n * Useful to bypass weak filtering and/or WAFs in JSON contexes\n\n >>> tamper('SELECT FIELD FROM TABLE')\n '\\\\u0053\\\\u0045\\\\u004... | -7,932,019,435,247,317,000 | Unicode-escapes non-encoded characters in a given payload (not processing already encoded) (e.g. SELECT -> SELECT)
Notes:
* Useful to bypass weak filtering and/or WAFs in JSON contexes
>>> tamper('SELECT FIELD FROM TABLE')
'\\u0053\\u0045\\u004C\\u0045\\u0043\\u0054\\u0020\\u0046\\u0049\\u0045\\u004C\\u0044\\u002... | Toolz/sqlmap/tamper/charunicodeescape.py | tamper | 6un9-h0-Dan/CTF-Heaven | python | def tamper(payload, **kwargs):
"\n Unicode-escapes non-encoded characters in a given payload (not processing already encoded) (e.g. SELECT -> SELECT)\n\n Notes:\n * Useful to bypass weak filtering and/or WAFs in JSON contexes\n\n >>> tamper('SELECT FIELD FROM TABLE')\n '\\\\u0053\\\\u0045\\\\u004... |
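The doctest in the record shows the expected mapping: every character becomes a `\uXXXX` escape with uppercase hex digits. A minimal sketch of that transformation — deliberately ignoring the "skip already-encoded sequences" detail the real sqlmap tamper script handles:

```python
def unicode_escape(payload):
    # Escape each character as \uXXXX with uppercase hex, matching the
    # record's doctest ('S' -> \u0053). Simplification: assumes the
    # payload contains no pre-encoded \uXXXX sequences.
    return ''.join('\\u%04X' % ord(ch) for ch in payload)

# unicode_escape('SELECT') -> '\\u0053\\u0045\\u004C\\u0045\\u0043\\u0054'
```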
def split_on_numbers(s):
'\n Splits the string into a list where the numbers and the characters between numbers are each element\n Copied from spt3g_software to fix dependencies (sorry)\n '
prevDig = False
outList = []
for char in s:
if char.isdigit():
if prevDig:
... | -8,418,773,352,372,961,000 | Splits the string into a list where the numbers and the characters between numbers are each element
Copied from spt3g_software to fix dependencies (sorry) | bin/kookaburra.py | split_on_numbers | simonsobs/lyrebird | python | def split_on_numbers(s):
'\n Splits the string into a list where the numbers and the characters between numbers are each element\n Copied from spt3g_software to fix dependencies (sorry)\n '
prevDig = False
outList = []
for char in s:
if char.isdigit():
if prevDig:
... |
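The loop in the record accumulates digit runs and non-digit runs into separate list elements. One compact equivalent, using `itertools.groupby` rather than the original's manual state flag (behavior assumed equivalent from the docstring):

```python
import itertools

def split_on_numbers(s):
    # Group consecutive characters by digit-ness, so digit runs and
    # non-digit runs become separate list elements.
    return [''.join(g) for _, g in itertools.groupby(s, key=str.isdigit)]

# split_on_numbers('Sq12foo8') -> ['Sq', '12', 'foo', '8']
```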
def str_cmp_with_numbers_sorted(str1, str2):
'\n Compares two strings where numbers are sorted according to value, so Sq12 ends up after Sq8, use in sorted function\n Copied from spt3g_software to fix dependencies (sorry)\n '
if (str1 == str2):
return 0
split1 = split_on_numbers(str1)
... | 2,616,566,904,823,767,600 | Compares two strings where numbers are sorted according to value, so Sq12 ends up after Sq8, use in sorted function
Copied from spt3g_software to fix dependencies (sorry) | bin/kookaburra.py | str_cmp_with_numbers_sorted | simonsobs/lyrebird | python | def str_cmp_with_numbers_sorted(str1, str2):
'\n Compares two strings where numbers are sorted according to value, so Sq12 ends up after Sq8, use in sorted function\n Copied from spt3g_software to fix dependencies (sorry)\n '
if (str1 == str2):
return 0
split1 = split_on_numbers(str1)
... |
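The comparator above implements a natural ordering ("Sq12 ends up after Sq8"). In modern Python a key function is more idiomatic than a cmp-style comparator; a sketch of the same ordering (the name `natural_key` is mine):

```python
import re

def natural_key(s):
    # Split into alternating non-digit/digit runs; digit runs compare
    # numerically, so 'Sq8' sorts before 'Sq12'.
    return [int(part) if part.isdigit() else part
            for part in re.split(r'(\d+)', s)]

# sorted(['Sq12', 'Sq8', 'Sq1'], key=natural_key) -> ['Sq1', 'Sq8', 'Sq12']
```

Because `re.split` with a capturing group always alternates non-digit and digit runs, positions in the key lists compare str-to-str or int-to-int, so mixed-type comparison errors cannot occur.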
def _set_permissions(self):
'\n Make sure all xml files are readable by the world so that anyone can grab them\n '
for (remote, _) in self.artifacts:
self.transport.open_session().exec_command('sudo chmod -R +r {}'.format(remote)) | 5,895,479,352,179,789,000 | Make sure all xml files are readable by the world so that anyone can grab them | tests/support/copyartifacts.py | _set_permissions | 0x416e746f6e/salt | python | def _set_permissions(self):
'\n \n '
for (remote, _) in self.artifacts:
self.transport.open_session().exec_command('sudo chmod -R +r {}'.format(remote)) |
def run(filepath):
'Create a wallpaper image from a PNG file.'
src = Image.open(filepath)
target = swap_quadrants(src)
paste_with_alpha(target, src, (0, 0), 16)
return target | 817,709,833,370,899,200 | Create a wallpaper image from a PNG file. | source/_sample/pillow/pattern.py | run | showa-yojyo/note | python | def run(filepath):
src = Image.open(filepath)
target = swap_quadrants(src)
paste_with_alpha(target, src, (0, 0), 16)
return target |
def swap_quadrants(img):
'Quarter the image and swap two diagonal quadrant pairs.'
boxes = quarter_bbox(img)
regions = [img.crop(box) for box in boxes]
target = img.copy()
paste_with_alpha(target, regions[3], (0, 0), 128)
paste_with_alpha(target, regions[2], (regions[3].size[0], 0), 128)
pas... | -6,387,083,641,273,382,000 | Quarter the image and swap two diagonal quadrant pairs. | source/_sample/pillow/pattern.py | swap_quadrants | showa-yojyo/note | python | def swap_quadrants(img):
boxes = quarter_bbox(img)
regions = [img.crop(box) for box in boxes]
target = img.copy()
paste_with_alpha(target, regions[3], (0, 0), 128)
paste_with_alpha(target, regions[2], (regions[3].size[0], 0), 128)
paste_with_alpha(target, regions[1], (0, regions[3].size[1])... |
def paste_with_alpha(target, source, left_upper, opacity):
'An alpha_composite-like operation.'
mask = Image.new('L', source.size, opacity)
target.paste(source, left_upper, mask=mask) | -1,079,140,637,357,208,300 | An alpha_composite-like operation. | source/_sample/pillow/pattern.py | paste_with_alpha | showa-yojyo/note | python | def paste_with_alpha(target, source, left_upper, opacity):
mask = Image.new('L', source.size, opacity)
target.paste(source, left_upper, mask=mask) |
def quarter_bbox(img):
'Quarter the bounding box of an image.'
(left, upper, right, bottom) = img.getbbox()
xmid = (((left + right) - 1) // 2)
ymid = (((upper + bottom) - 1) // 2)
return [(left, upper, xmid, ymid), ((xmid + 1), upper, right, ymid), (left, (ymid + 1), xmid, bottom), ((xmid + 1), (ymi... | 4,406,968,220,061,805,000 | Quarter the bounding box of an image. | source/_sample/pillow/pattern.py | quarter_bbox | showa-yojyo/note | python | def quarter_bbox(img):
(left, upper, right, bottom) = img.getbbox()
xmid = (((left + right) - 1) // 2)
ymid = (((upper + bottom) - 1) // 2)
return [(left, upper, xmid, ymid), ((xmid + 1), upper, right, ymid), (left, (ymid + 1), xmid, bottom), ((xmid + 1), (ymid + 1), right, bottom)] |
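The quartering arithmetic above is self-contained enough to check without Pillow. A standalone restatement of the same computation, taking the bbox as four integers instead of an image:

```python
def quarter_bbox(left, upper, right, bottom):
    # Same arithmetic as the record's quarter_bbox, minus the PIL dependency.
    xmid = (left + right - 1) // 2
    ymid = (upper + bottom - 1) // 2
    return [(left, upper, xmid, ymid),
            ((xmid + 1), upper, right, ymid),
            (left, (ymid + 1), xmid, bottom),
            ((xmid + 1), (ymid + 1), right, bottom)]

# quarter_bbox(0, 0, 100, 100)
# -> [(0, 0, 49, 49), (50, 0, 100, 49), (0, 50, 49, 100), (50, 50, 100, 100)]
```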
@step('ActiveDocsDetailView')
def detail(self, active_doc):
'Navigate to active doc detail/preview page'
self.active_docs_table.row(name=active_doc['name']).name.click() | -694,965,796,845,580,200 | Navigate to active doc detail/preview page | testsuite/ui/views/admin/product/active_docs.py | detail | 3scale-qe/3scale-tests | python | @step('ActiveDocsDetailView')
def detail(self, active_doc):
self.active_docs_table.row(name=active_doc['name']).name.click() |
def make_request(self, endpoint):
'\n Make request on preview page\n :param endpoint: string of endpoint which should be tried\n :return:\n '
self.expand_operations_link.click()
self.active_docs_section.try_it_out(endpoint) | 8,992,228,005,639,734,000 | Make request on preview page
:param endpoint: string of endpoint which should be tried
:return: | testsuite/ui/views/admin/product/active_docs.py | make_request | 3scale-qe/3scale-tests | python | def make_request(self, endpoint):
'\n Make request on preview page\n :param endpoint: string of endpoint which should be tried\n :return:\n '
self.expand_operations_link.click()
self.active_docs_section.try_it_out(endpoint) |
def make_request(self, method, path, key):
'\n Make request on preview page\n :param path string eg. /post, /get\n :param method string eg. GET, POST\n :param key string name of application\n :return:\n '
self.active_docs_section.try_it_out(metho... | -849,955,223,380,817,300 | Make request on preview page
:param path string eg. /post, /get
:param method string eg. GET, POST
:param key string name of application
:return: | testsuite/ui/views/admin/product/active_docs.py | make_request | 3scale-qe/3scale-tests | python | def make_request(self, method, path, key):
'\n Make request on preview page\n :param path string eg. /post, /get\n :param method string eg. GET, POST\n :param key string name of application\n :return:\n '
self.active_docs_section.try_it_out(metho... |
def get_olfa_config(config_filename=''):
'\n Find and parse olfactometer configuration JSON.\n\n :param config_filename: string with path to configuration.\n :return: returns a tuple with (config_fn, config_dict)\n :rtype: tuple\n '
if (not config_filename):
logging.info('No olfa config f... | 5,484,870,733,860,152,000 | Find and parse olfactometer configuration JSON.
:param config_filename: string with path to configuration.
:return: returns a tuple with (config_fn, config_dict)
:rtype: tuple | olfactometry/utils.py | get_olfa_config | mohamedelgohary1/PyBpodGUI | python | def get_olfa_config(config_filename=''):
'\n Find and parse olfactometer configuration JSON.\n\n :param config_filename: string with path to configuration.\n :return: returns a tuple with (config_fn, config_dict)\n :rtype: tuple\n '
if (not config_filename):
logging.info('No olfa config fil... |
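The record's body is truncated, but the docstring pins down the contract: find a JSON config and return `(config_fn, config_dict)`. A minimal sketch under that contract — the default-path fallback below is my assumption; the real function presumably searches its own default locations:

```python
import json
import logging
import os

def get_olfa_config(config_filename=''):
    """Load an olfactometer JSON config; return (config_fn, config_dict).

    Sketch: when no path is given, falls back to a hypothetical default
    location in the user's home directory.
    """
    if not config_filename:
        logging.info('No olfa config file specified; using default path.')
        config_filename = os.path.join(os.path.expanduser('~'),
                                       'olfa_config.json')
    with open(config_filename) as f:
        config_dict = json.load(f)
    return config_filename, config_dict
```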
def flatten_dictionary(dictionary, separator=':', flattened_dict=None, parent_string=''):
"\n Flattens nested dictionary into a single dictionary:\n {'hello': {'world': 1,\n 'moon': 2}}\n becomes:\n {'hello:world': 1,\n 'hello:moon': 2}\n\n Uses recursion to flatten ... | -1,371,423,723,073,061,000 | Flattens nested dictionary into a single dictionary:
{'hello': {'world': 1,
'moon': 2}}
becomes:
{'hello:world': 1,
'hello:moon': 2}
Uses recursion to flatten as many layers as exist in your dictionary.
:param dictionary: nested dictionary you wish to flatten.
:param flattened_dict: (used ... | olfactometry/utils.py | flatten_dictionary | mohamedelgohary1/PyBpodGUI | python | def flatten_dictionary(dictionary, separator=':', flattened_dict=None, parent_string=''):
"\n Flattens nested dictionary into a single dictionary:\n {'hello': {'world': 1,\n 'moon': 2}}\n becomes:\n {'hello:world': 1,\n 'hello:moon': 2}\n\n Uses recursion to flatten as... |
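The docstring gives the exact behavior, including a worked example, so the recursion can be reconstructed as a sketch (the body here is mine, written to match the docstring, not the repository's truncated original):

```python
def flatten_dictionary(dictionary, separator=':', flattened_dict=None,
                       parent_string=''):
    """Flatten a nested dict, joining key paths with `separator`."""
    if flattened_dict is None:
        flattened_dict = {}
    for key, value in dictionary.items():
        # Build the joined key path for this level.
        full_key = parent_string + separator + str(key) if parent_string else str(key)
        if isinstance(value, dict):
            # Recurse into nested dictionaries, carrying the key path down.
            flatten_dictionary(value, separator, flattened_dict, full_key)
        else:
            flattened_dict[full_key] = value
    return flattened_dict

# flatten_dictionary({'hello': {'world': 1, 'moon': 2}})
# -> {'hello:world': 1, 'hello:moon': 2}
```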
def connect_serial(port, baudrate=115200, timeout=1, writeTimeout=1):
'\n Return Serial object after making sure that the port is accessible and that the port is expressed as a string.\n\n :param port: str or int (ie "COM4" or 4 for Windows).\n :param baudrate: baudrate.\n :param timeout: read timeout i... | 7,971,361,087,577,091,000 | Return Serial object after making sure that the port is accessible and that the port is expressed as a string.
:param port: str or int (ie "COM4" or 4 for Windows).
:param baudrate: baudrate.
:param timeout: read timeout in seconds, default 1 sec.
:param writeTimeout: write timeout in seconds, default 1 sec.
:return: ... | olfactometry/utils.py | connect_serial | mohamedelgohary1/PyBpodGUI | python | def connect_serial(port, baudrate=115200, timeout=1, writeTimeout=1):
'\n Return Serial object after making sure that the port is accessible and that the port is expressed as a string.\n\n :param port: str or int (ie "COM4" or 4 for Windows).\n :param baudrate: baudrate.\n :param timeout: read timeout i... |
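The docstring notes the port may arrive as an int ('"COM4" or 4 for Windows'), so the function must normalize it to a device-name string before handing it to pyserial. That normalization step can be sketched as a pure helper (the name `normalize_port` is mine; the rest of the record's body, which actually opens the port, is truncated):

```python
def normalize_port(port):
    # pyserial expects a device-name string; on Windows an integer N
    # conventionally maps to 'COMN'.
    if isinstance(port, int):
        return 'COM{}'.format(port)
    return port

# normalize_port(4) -> 'COM4'; strings pass through unchanged.
```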