Column schema:

repo              stringlengths   7 – 55
path              stringlengths   4 – 127
func_name         stringlengths   1 – 88
original_string   stringlengths   75 – 19.8k
language          stringclasses   1 value
code              stringlengths   75 – 19.8k
code_tokens       listlengths     20 – 707
docstring         stringlengths   3 – 17.3k
docstring_tokens  listlengths     3 – 222
sha               stringlengths   40 – 40
url               stringlengths   87 – 242
partition         stringclasses   1 value
idx               int64           0 – 252k
bastibe/SoundFile
soundfile.py
read
def read(file, frames=-1, start=0, stop=None, dtype='float64', always_2d=False,
         fill_value=None, out=None, samplerate=None, channels=None,
         format=None, subtype=None, endian=None, closefd=True):
    """Provide audio data from a sound file as NumPy array.

    By default, the whole file is read from the beginning, but the
    position to start reading can be specified with `start` and the
    number of frames to read can be specified with `frames`.
    Alternatively, a range can be specified with `start` and `stop`.

    If there is less data left in the file than requested, the rest of
    the frames are filled with `fill_value`.  If no `fill_value` is
    specified, a smaller array is returned.

    Parameters
    ----------
    file : str or int or file-like object
        The file to read from.  See :class:`SoundFile` for details.
    frames : int, optional
        The number of frames to read.  If `frames` is negative, the
        whole rest of the file is read.  Not allowed if `stop` is given.
    start : int, optional
        Where to start reading.  A negative value counts from the end.
    stop : int, optional
        The index after the last frame to be read.  A negative value
        counts from the end.  Not allowed if `frames` is given.
    dtype : {'float64', 'float32', 'int32', 'int16'}, optional
        Data type of the returned array, by default ``'float64'``.
        Floating point audio data is typically in the range from
        ``-1.0`` to ``1.0``.  Integer data is in the range from
        ``-2**15`` to ``2**15-1`` for ``'int16'`` and from ``-2**31``
        to ``2**31-1`` for ``'int32'``.

        .. note:: Reading int values from a float file will *not*
            scale the data to [-1.0, 1.0). If the file contains
            ``np.array([42.6], dtype='float32')``, you will read
            ``np.array([43], dtype='int32')`` for ``dtype='int32'``.

    Returns
    -------
    audiodata : numpy.ndarray or type(out)
        A two-dimensional (frames x channels) NumPy array is returned.
        If the sound file has only one channel, a one-dimensional array
        is returned.  Use ``always_2d=True`` to return a
        two-dimensional array anyway.

        If `out` was specified, it is returned.  If `out` has more
        frames than available in the file (or if `frames` is smaller
        than the length of `out`) and no `fill_value` is given, then
        only a part of `out` is overwritten and a view containing all
        valid frames is returned.
    samplerate : int
        The sample rate of the audio file.

    Other Parameters
    ----------------
    always_2d : bool, optional
        By default, reading a mono sound file will return a
        one-dimensional array.  With ``always_2d=True``, audio data is
        always returned as a two-dimensional array, even if the audio
        file has only one channel.
    fill_value : float, optional
        If more frames are requested than available in the file, the
        rest of the output is filled with `fill_value`.  If
        `fill_value` is not specified, a smaller array is returned.
    out : numpy.ndarray or subclass, optional
        If `out` is specified, the data is written into the given array
        instead of creating a new array.  In this case, the arguments
        `dtype` and `always_2d` are silently ignored!  If `frames` is
        not given, it is obtained from the length of `out`.
    samplerate, channels, format, subtype, endian, closefd
        See :class:`SoundFile`.

    Examples
    --------
    >>> import soundfile as sf
    >>> data, samplerate = sf.read('stereo_file.wav')
    >>> data
    array([[ 0.71329652,  0.06294799],
           [-0.26450912, -0.38874483],
           ...
           [ 0.67398441, -0.11516333]])
    >>> samplerate
    44100

    """
    with SoundFile(file, 'r', samplerate, channels,
                   subtype, endian, format, closefd) as f:
        frames = f._prepare_read(start, stop, frames)
        data = f.read(frames, dtype, always_2d, fill_value, out)
    return data, f.samplerate
python
[ "def", "read", "(", "file", ",", "frames", "=", "-", "1", ",", "start", "=", "0", ",", "stop", "=", "None", ",", "dtype", "=", "'float64'", ",", "always_2d", "=", "False", ",", "fill_value", "=", "None", ",", "out", "=", "None", ",", "samplerate", ...
Provide audio data from a sound file as NumPy array.

By default, the whole file is read from the beginning, but the position
to start reading can be specified with `start` and the number of frames
to read can be specified with `frames`.  Alternatively, a range can be
specified with `start` and `stop`.

If there is less data left in the file than requested, the rest of the
frames are filled with `fill_value`.  If no `fill_value` is specified,
a smaller array is returned.

Parameters
----------
file : str or int or file-like object
    The file to read from.  See :class:`SoundFile` for details.
frames : int, optional
    The number of frames to read.  If `frames` is negative, the whole
    rest of the file is read.  Not allowed if `stop` is given.
start : int, optional
    Where to start reading.  A negative value counts from the end.
stop : int, optional
    The index after the last frame to be read.  A negative value counts
    from the end.  Not allowed if `frames` is given.
dtype : {'float64', 'float32', 'int32', 'int16'}, optional
    Data type of the returned array, by default ``'float64'``.
    Floating point audio data is typically in the range from ``-1.0``
    to ``1.0``.  Integer data is in the range from ``-2**15`` to
    ``2**15-1`` for ``'int16'`` and from ``-2**31`` to ``2**31-1`` for
    ``'int32'``.

    .. note:: Reading int values from a float file will *not* scale
        the data to [-1.0, 1.0). If the file contains
        ``np.array([42.6], dtype='float32')``, you will read
        ``np.array([43], dtype='int32')`` for ``dtype='int32'``.

Returns
-------
audiodata : numpy.ndarray or type(out)
    A two-dimensional (frames x channels) NumPy array is returned.  If
    the sound file has only one channel, a one-dimensional array is
    returned.  Use ``always_2d=True`` to return a two-dimensional array
    anyway.

    If `out` was specified, it is returned.  If `out` has more frames
    than available in the file (or if `frames` is smaller than the
    length of `out`) and no `fill_value` is given, then only a part of
    `out` is overwritten and a view containing all valid frames is
    returned.
samplerate : int
    The sample rate of the audio file.

Other Parameters
----------------
always_2d : bool, optional
    By default, reading a mono sound file will return a one-dimensional
    array.  With ``always_2d=True``, audio data is always returned as a
    two-dimensional array, even if the audio file has only one channel.
fill_value : float, optional
    If more frames are requested than available in the file, the rest
    of the output is filled with `fill_value`.  If `fill_value` is not
    specified, a smaller array is returned.
out : numpy.ndarray or subclass, optional
    If `out` is specified, the data is written into the given array
    instead of creating a new array.  In this case, the arguments
    `dtype` and `always_2d` are silently ignored!  If `frames` is not
    given, it is obtained from the length of `out`.
samplerate, channels, format, subtype, endian, closefd
    See :class:`SoundFile`.

Examples
--------
>>> import soundfile as sf
>>> data, samplerate = sf.read('stereo_file.wav')
>>> data
array([[ 0.71329652,  0.06294799],
       [-0.26450912, -0.38874483],
       ...
       [ 0.67398441, -0.11516333]])
>>> samplerate
44100
[ "Provide", "audio", "data", "from", "a", "sound", "file", "as", "NumPy", "array", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L170-L260
train
41,500
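The short-read behavior documented above (a smaller array when `fill_value` is omitted, a padded array when it is given) can be illustrated with a NumPy-only sketch. This is not soundfile's implementation; `simulate_short_read` and its zero-filled stand-in data are hypothetical, introduced only to show the documented contract.

```python
import numpy as np

def simulate_short_read(available, requested, channels=2, fill_value=None):
    # Stand-in for the frames remaining in the file (all zeros here).
    data = np.zeros((available, channels))
    if requested <= available:
        return data[:requested]
    if fill_value is None:
        # Fewer frames than requested and no fill_value:
        # a smaller array is returned.
        return data
    # Otherwise the missing tail is filled with fill_value.
    out = np.full((requested, channels), fill_value, dtype=data.dtype)
    out[:available] = data
    return out

print(simulate_short_read(5, 10).shape)                  # (5, 2)
print(simulate_short_read(5, 10, fill_value=0.0).shape)  # (10, 2)
```

The same contract applies when an `out` array is passed: without `fill_value`, only a view of the valid frames comes back.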
bastibe/SoundFile
soundfile.py
write
def write(file, data, samplerate, subtype=None, endian=None, format=None,
          closefd=True):
    """Write data to a sound file.

    .. note:: If `file` exists, it will be truncated and overwritten!

    Parameters
    ----------
    file : str or int or file-like object
        The file to write to.  See :class:`SoundFile` for details.
    data : array_like
        The data to write.  Usually two-dimensional (frames x
        channels), but one-dimensional `data` can be used for mono
        files.  Only the data types ``'float64'``, ``'float32'``,
        ``'int32'`` and ``'int16'`` are supported.

        .. note:: The data type of `data` does **not** select the data
            type of the written file. Audio data will be converted to
            the given `subtype`. Writing int values to a float file
            will *not* scale the values to [-1.0, 1.0). If you write
            the value ``np.array([42], dtype='int32')`` to a
            ``subtype='FLOAT'`` file, the file will then contain
            ``np.array([42.], dtype='float32')``.

    samplerate : int
        The sample rate of the audio data.
    subtype : str, optional
        See :func:`default_subtype` for the default value and
        :func:`available_subtypes` for all possible values.

    Other Parameters
    ----------------
    format, endian, closefd
        See :class:`SoundFile`.

    Examples
    --------
    Write 10 frames of random data to a new file:

    >>> import numpy as np
    >>> import soundfile as sf
    >>> sf.write('stereo_file.wav', np.random.randn(10, 2), 44100, 'PCM_24')

    """
    import numpy as np
    data = np.asarray(data)
    if data.ndim == 1:
        channels = 1
    else:
        channels = data.shape[1]
    with SoundFile(file, 'w', samplerate, channels,
                   subtype, endian, format, closefd) as f:
        f.write(data)
python
[ "def", "write", "(", "file", ",", "data", ",", "samplerate", ",", "subtype", "=", "None", ",", "endian", "=", "None", ",", "format", "=", "None", ",", "closefd", "=", "True", ")", ":", "import", "numpy", "as", "np", "data", "=", "np", ".", "asarray...
Write data to a sound file.

.. note:: If `file` exists, it will be truncated and overwritten!

Parameters
----------
file : str or int or file-like object
    The file to write to.  See :class:`SoundFile` for details.
data : array_like
    The data to write.  Usually two-dimensional (frames x channels),
    but one-dimensional `data` can be used for mono files.  Only the
    data types ``'float64'``, ``'float32'``, ``'int32'`` and
    ``'int16'`` are supported.

    .. note:: The data type of `data` does **not** select the data
        type of the written file. Audio data will be converted to the
        given `subtype`. Writing int values to a float file will *not*
        scale the values to [-1.0, 1.0). If you write the value
        ``np.array([42], dtype='int32')`` to a ``subtype='FLOAT'``
        file, the file will then contain ``np.array([42.],
        dtype='float32')``.

samplerate : int
    The sample rate of the audio data.
subtype : str, optional
    See :func:`default_subtype` for the default value and
    :func:`available_subtypes` for all possible values.

Other Parameters
----------------
format, endian, closefd
    See :class:`SoundFile`.

Examples
--------
Write 10 frames of random data to a new file:

>>> import numpy as np
>>> import soundfile as sf
>>> sf.write('stereo_file.wav', np.random.randn(10, 2), 44100, 'PCM_24')
[ "Write", "data", "to", "a", "sound", "file", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L263-L316
train
41,501
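The channel inference at the end of `write` (1-D data means mono, otherwise the channel count is the second dimension) can be sketched in isolation. `infer_channels` is a hypothetical helper written for illustration, not part of the soundfile API.

```python
import numpy as np

def infer_channels(data):
    # Mirrors the shape handling in write(): one-dimensional data is
    # treated as a single (mono) channel; otherwise axis 1 counts channels.
    data = np.asarray(data)
    return 1 if data.ndim == 1 else data.shape[1]

print(infer_channels(np.zeros(10)))       # 1
print(infer_channels(np.zeros((10, 2))))  # 2
```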
bastibe/SoundFile
soundfile.py
available_subtypes
def available_subtypes(format=None):
    """Return a dictionary of available subtypes.

    Parameters
    ----------
    format : str
        If given, only compatible subtypes are returned.

    Examples
    --------
    >>> import soundfile as sf
    >>> sf.available_subtypes('FLAC')
    {'PCM_24': 'Signed 24 bit PCM',
     'PCM_16': 'Signed 16 bit PCM',
     'PCM_S8': 'Signed 8 bit PCM'}

    """
    subtypes = _available_formats_helper(_snd.SFC_GET_FORMAT_SUBTYPE_COUNT,
                                         _snd.SFC_GET_FORMAT_SUBTYPE)
    return dict((subtype, name) for subtype, name in subtypes
                if format is None or check_format(format, subtype))
python
[ "def", "available_subtypes", "(", "format", "=", "None", ")", ":", "subtypes", "=", "_available_formats_helper", "(", "_snd", ".", "SFC_GET_FORMAT_SUBTYPE_COUNT", ",", "_snd", ".", "SFC_GET_FORMAT_SUBTYPE", ")", "return", "dict", "(", "(", "subtype", ",", "name", ...
Return a dictionary of available subtypes. Parameters ---------- format : str If given, only compatible subtypes are returned. Examples -------- >>> import soundfile as sf >>> sf.available_subtypes('FLAC') {'PCM_24': 'Signed 24 bit PCM', 'PCM_16': 'Signed 16 bit PCM', 'PCM_S8': 'Signed 8 bit PCM'}
[ "Return", "a", "dictionary", "of", "available", "subtypes", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L462-L482
train
41,502
bastibe/SoundFile
soundfile.py
_error_check
def _error_check(err, prefix=""):
    """Pretty-print a numerical error code if there is an error."""
    if err != 0:
        err_str = _snd.sf_error_number(err)
        raise RuntimeError(
            prefix + _ffi.string(err_str).decode('utf-8', 'replace'))
python
[ "def", "_error_check", "(", "err", ",", "prefix", "=", "\"\"", ")", ":", "if", "err", "!=", "0", ":", "err_str", "=", "_snd", ".", "sf_error_number", "(", "err", ")", "raise", "RuntimeError", "(", "prefix", "+", "_ffi", ".", "string", "(", "err_str", ...
Pretty-print a numerical error code if there is an error.
[ "Pretty", "-", "print", "a", "numerical", "error", "code", "if", "there", "is", "an", "error", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1353-L1357
train
41,503
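The error-check pattern above (translate a non-zero C error code into a Python exception with a contextual prefix) can be sketched without the FFI. The `_ERROR_MESSAGES` table here is a hypothetical stand-in for libsndfile's `sf_error_number`; the real messages come from the C library.

```python
# Hypothetical stand-in for _snd.sf_error_number: maps numeric codes to
# messages. Real codes and strings are defined by libsndfile.
_ERROR_MESSAGES = {0: "No Error.", 1: "Format not recognised."}

def error_check(err, prefix=""):
    """Raise RuntimeError for any non-zero error code."""
    if err != 0:
        raise RuntimeError(prefix + _ERROR_MESSAGES.get(err, "Unknown error"))

error_check(0)  # code 0 means success: returns silently
try:
    error_check(1, prefix="Error opening 'x.wav': ")
except RuntimeError as e:
    print(e)
```

Keeping the prefix as a caller-supplied string lets one generic checker serve every call site while still naming the failing operation.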
bastibe/SoundFile
soundfile.py
_format_int
def _format_int(format, subtype, endian):
    """Return numeric ID for given format|subtype|endian combo."""
    result = _check_format(format)
    if subtype is None:
        subtype = default_subtype(format)
        if subtype is None:
            raise TypeError(
                "No default subtype for major format {0!r}".format(format))
    elif not isinstance(subtype, (_unicode, str)):
        raise TypeError("Invalid subtype: {0!r}".format(subtype))
    try:
        result |= _subtypes[subtype.upper()]
    except KeyError:
        raise ValueError("Unknown subtype: {0!r}".format(subtype))
    if endian is None:
        endian = 'FILE'
    elif not isinstance(endian, (_unicode, str)):
        raise TypeError("Invalid endian-ness: {0!r}".format(endian))
    try:
        result |= _endians[endian.upper()]
    except KeyError:
        raise ValueError("Unknown endian-ness: {0!r}".format(endian))
    info = _ffi.new("SF_INFO*")
    info.format = result
    info.channels = 1
    if _snd.sf_format_check(info) == _snd.SF_FALSE:
        raise ValueError(
            "Invalid combination of format, subtype and endian")
    return result
python
[ "def", "_format_int", "(", "format", ",", "subtype", ",", "endian", ")", ":", "result", "=", "_check_format", "(", "format", ")", "if", "subtype", "is", "None", ":", "subtype", "=", "default_subtype", "(", "format", ")", "if", "subtype", "is", "None", ":...
Return numeric ID for given format|subtype|endian combo.
[ "Return", "numeric", "ID", "for", "given", "format|subtype|endian", "combo", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1360-L1389
train
41,504
bastibe/SoundFile
soundfile.py
_check_mode
def _check_mode(mode):
    """Check if mode is valid and return its integer representation."""
    if not isinstance(mode, (_unicode, str)):
        raise TypeError("Invalid mode: {0!r}".format(mode))
    mode_set = set(mode)
    if mode_set.difference('xrwb+') or len(mode) > len(mode_set):
        raise ValueError("Invalid mode: {0!r}".format(mode))
    if len(mode_set.intersection('xrw')) != 1:
        raise ValueError("mode must contain exactly one of 'xrw'")
    if '+' in mode_set:
        mode_int = _snd.SFM_RDWR
    elif 'r' in mode_set:
        mode_int = _snd.SFM_READ
    else:
        mode_int = _snd.SFM_WRITE
    return mode_int
python
[ "def", "_check_mode", "(", "mode", ")", ":", "if", "not", "isinstance", "(", "mode", ",", "(", "_unicode", ",", "str", ")", ")", ":", "raise", "TypeError", "(", "\"Invalid mode: {0!r}\"", ".", "format", "(", "mode", ")", ")", "mode_set", "=", "set", "(...
Check if mode is valid and return its integer representation.
[ "Check", "if", "mode", "is", "valid", "and", "return", "its", "integer", "representation", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1392-L1408
train
41,505
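The mode validation in `_check_mode` works on the character set of the mode string: duplicates and unknown characters are rejected, exactly one of `'x'`, `'r'`, `'w'` must appear, and `'+'` selects read/write access. A self-contained sketch with the `_snd.SFM_*` constants stubbed as strings (the real values are libsndfile enums):

```python
# Stubbed mode constants; in soundfile these are integer enums from libsndfile.
SFM_READ, SFM_WRITE, SFM_RDWR = 'SFM_READ', 'SFM_WRITE', 'SFM_RDWR'

def check_mode(mode):
    if not isinstance(mode, str):
        raise TypeError("Invalid mode: {0!r}".format(mode))
    mode_set = set(mode)
    # Unknown characters, or repeated characters (len > len of set): invalid.
    if mode_set.difference('xrwb+') or len(mode) > len(mode_set):
        raise ValueError("Invalid mode: {0!r}".format(mode))
    if len(mode_set.intersection('xrw')) != 1:
        raise ValueError("mode must contain exactly one of 'xrw'")
    if '+' in mode_set:
        return SFM_RDWR
    elif 'r' in mode_set:
        return SFM_READ
    return SFM_WRITE

print(check_mode('r'))   # SFM_READ
print(check_mode('w+'))  # SFM_RDWR
```

Note that `'rw'` is rejected (two of `'xrw'`), while `'r+'`, `'w+'`, and `'x+'` all map to read/write mode.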
bastibe/SoundFile
soundfile.py
_create_info_struct
def _create_info_struct(file, mode, samplerate, channels,
                        format, subtype, endian):
    """Check arguments and create SF_INFO struct."""
    original_format = format
    if format is None:
        format = _get_format_from_filename(file, mode)
        assert isinstance(format, (_unicode, str))
    else:
        _check_format(format)
    info = _ffi.new("SF_INFO*")
    if 'r' not in mode or format.upper() == 'RAW':
        if samplerate is None:
            raise TypeError("samplerate must be specified")
        info.samplerate = samplerate
        if channels is None:
            raise TypeError("channels must be specified")
        info.channels = channels
        info.format = _format_int(format, subtype, endian)
    else:
        if any(arg is not None for arg in (
                samplerate, channels, original_format, subtype, endian)):
            raise TypeError("Not allowed for existing files (except 'RAW'): "
                            "samplerate, channels, format, subtype, endian")
    return info
python
[ "def", "_create_info_struct", "(", "file", ",", "mode", ",", "samplerate", ",", "channels", ",", "format", ",", "subtype", ",", "endian", ")", ":", "original_format", "=", "format", "if", "format", "is", "None", ":", "format", "=", "_get_format_from_filename",...
Check arguments and create SF_INFO struct.
[ "Check", "arguments", "and", "create", "SF_INFO", "struct", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1411-L1435
train
41,506
bastibe/SoundFile
soundfile.py
_format_str
def _format_str(format_int):
    """Return the string representation of a given numeric format."""
    for dictionary in _formats, _subtypes, _endians:
        for k, v in dictionary.items():
            if v == format_int:
                return k
    else:
        return 'n/a'
python
[ "def", "_format_str", "(", "format_int", ")", ":", "for", "dictionary", "in", "_formats", ",", "_subtypes", ",", "_endians", ":", "for", "k", ",", "v", "in", "dictionary", ".", "items", "(", ")", ":", "if", "v", "==", "format_int", ":", "return", "k", ...
Return the string representation of a given numeric format.
[ "Return", "the", "string", "representation", "of", "a", "given", "numeric", "format", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1462-L1469
train
41,507
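`_format_str` is a reverse lookup: it scans several name-to-ID dictionaries for a matching numeric value and returns the name, falling back to `'n/a'`. A standalone sketch with tiny stand-in tables (the entries below mimic libsndfile `SF_FORMAT` values, but the real `_formats`/`_subtypes`/`_endians` dictionaries are built at import time from the library):

```python
# Hypothetical stand-ins for the module-level lookup tables.
FORMATS = {'WAV': 0x010000, 'FLAC': 0x170000}
SUBTYPES = {'PCM_16': 0x0002}

def format_str(format_int):
    # Search each table in turn; first matching value wins.
    for dictionary in (FORMATS, SUBTYPES):
        for k, v in dictionary.items():
            if v == format_int:
                return k
    return 'n/a'

print(format_str(0x170000))  # FLAC
print(format_str(0xBEEF))    # n/a
```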
bastibe/SoundFile
soundfile.py
_format_info
def _format_info(format_int, format_flag=_snd.SFC_GET_FORMAT_INFO):
    """Return the ID and short description of a given format."""
    format_info = _ffi.new("SF_FORMAT_INFO*")
    format_info.format = format_int
    _snd.sf_command(_ffi.NULL, format_flag, format_info,
                    _ffi.sizeof("SF_FORMAT_INFO"))
    name = format_info.name
    return (_format_str(format_info.format),
            _ffi.string(name).decode('utf-8', 'replace') if name else "")
python
[ "def", "_format_info", "(", "format_int", ",", "format_flag", "=", "_snd", ".", "SFC_GET_FORMAT_INFO", ")", ":", "format_info", "=", "_ffi", ".", "new", "(", "\"SF_FORMAT_INFO*\"", ")", "format_info", ".", "format", "=", "format_int", "_snd", ".", "sf_command", ...
Return the ID and short description of a given format.
[ "Return", "the", "ID", "and", "short", "description", "of", "a", "given", "format", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1472-L1480
train
41,508
bastibe/SoundFile
soundfile.py
_check_format
def _check_format(format_str):
    """Check if `format_str` is valid and return format ID."""
    if not isinstance(format_str, (_unicode, str)):
        raise TypeError("Invalid format: {0!r}".format(format_str))
    try:
        format_int = _formats[format_str.upper()]
    except KeyError:
        raise ValueError("Unknown format: {0!r}".format(format_str))
    return format_int
python
[ "def", "_check_format", "(", "format_str", ")", ":", "if", "not", "isinstance", "(", "format_str", ",", "(", "_unicode", ",", "str", ")", ")", ":", "raise", "TypeError", "(", "\"Invalid format: {0!r}\"", ".", "format", "(", "format_str", ")", ")", "try", "...
Check if `format_str` is valid and return format ID.
[ "Check", "if", "format_str", "is", "valid", "and", "return", "format", "ID", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1491-L1499
train
41,509
bastibe/SoundFile
soundfile.py
_has_virtual_io_attrs
def _has_virtual_io_attrs(file, mode_int):
    """Check if file has all the necessary attributes for virtual IO."""
    readonly = mode_int == _snd.SFM_READ
    writeonly = mode_int == _snd.SFM_WRITE
    return all([
        hasattr(file, 'seek'),
        hasattr(file, 'tell'),
        hasattr(file, 'write') or readonly,
        hasattr(file, 'read') or hasattr(file, 'readinto') or writeonly,
    ])
python
[ "def", "_has_virtual_io_attrs", "(", "file", ",", "mode_int", ")", ":", "readonly", "=", "mode_int", "==", "_snd", ".", "SFM_READ", "writeonly", "=", "mode_int", "==", "_snd", ".", "SFM_WRITE", "return", "all", "(", "[", "hasattr", "(", "file", ",", "'seek...
Check if file has all the necessary attributes for virtual IO.
[ "Check", "if", "file", "has", "all", "the", "necessary", "attributes", "for", "virtual", "IO", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1502-L1511
train
41,510
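The duck-typing check in `_has_virtual_io_attrs` only demands the attributes the chosen mode actually needs: a read-only file never calls `write`, a write-only file never calls `read`/`readinto`, but everything needs `seek` and `tell`. A self-contained sketch with the mode enums stubbed as `'r'`/`'w'` strings:

```python
import io

# Mode constants stubbed as strings; soundfile compares against the
# _snd.SFM_* integer enums instead.
def has_virtual_io_attrs(file, mode='r'):
    readonly = mode == 'r'
    writeonly = mode == 'w'
    return all([
        hasattr(file, 'seek'),
        hasattr(file, 'tell'),
        hasattr(file, 'write') or readonly,
        hasattr(file, 'read') or hasattr(file, 'readinto') or writeonly,
    ])

print(has_virtual_io_attrs(io.BytesIO(), 'r'))  # True
print(has_virtual_io_attrs(object(), 'r'))      # False
```

Any in-memory stream such as `io.BytesIO` passes, which is what makes virtual I/O on file-like objects possible.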
bastibe/SoundFile
soundfile.py
SoundFile.extra_info
def extra_info(self):
    """Retrieve the log string generated when opening the file."""
    info = _ffi.new("char[]", 2**14)
    _snd.sf_command(self._file, _snd.SFC_GET_LOG_INFO,
                    info, _ffi.sizeof(info))
    return _ffi.string(info).decode('utf-8', 'replace')
python
[ "def", "extra_info", "(", "self", ")", ":", "info", "=", "_ffi", ".", "new", "(", "\"char[]\"", ",", "2", "**", "14", ")", "_snd", ".", "sf_command", "(", "self", ".", "_file", ",", "_snd", ".", "SFC_GET_LOG_INFO", ",", "info", ",", "_ffi", ".", "s...
Retrieve the log string generated when opening the file.
[ "Retrieve", "the", "log", "string", "generated", "when", "opening", "the", "file", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L671-L676
train
41,511
bastibe/SoundFile
soundfile.py
SoundFile.read
def read(self, frames=-1, dtype='float64', always_2d=False,
         fill_value=None, out=None):
    """Read from the file and return data as NumPy array.

    Reads the given number of frames in the given data format
    starting at the current read/write position.  This advances the
    read/write position by the same number of frames.
    By default, all frames from the current read/write position to
    the end of the file are returned.
    Use :meth:`.seek` to move the current read/write position.

    Parameters
    ----------
    frames : int, optional
        The number of frames to read. If ``frames < 0``, the whole
        rest of the file is read.
    dtype : {'float64', 'float32', 'int32', 'int16'}, optional
        Data type of the returned array, by default ``'float64'``.
        Floating point audio data is typically in the range from
        ``-1.0`` to ``1.0``.  Integer data is in the range from
        ``-2**15`` to ``2**15-1`` for ``'int16'`` and from
        ``-2**31`` to ``2**31-1`` for ``'int32'``.

        .. note:: Reading int values from a float file will *not*
            scale the data to [-1.0, 1.0). If the file contains
            ``np.array([42.6], dtype='float32')``, you will read
            ``np.array([43], dtype='int32')`` for ``dtype='int32'``.

    Returns
    -------
    audiodata : numpy.ndarray or type(out)
        A two-dimensional NumPy (frames x channels) array is
        returned. If the sound file has only one channel, a
        one-dimensional array is returned. Use ``always_2d=True``
        to return a two-dimensional array anyway.

        If `out` was specified, it is returned. If `out` has more
        frames than available in the file (or if `frames` is
        smaller than the length of `out`) and no `fill_value` is
        given, then only a part of `out` is overwritten and a view
        containing all valid frames is returned.

    Other Parameters
    ----------------
    always_2d : bool, optional
        By default, reading a mono sound file will return a
        one-dimensional array.  With ``always_2d=True``, audio data
        is always returned as a two-dimensional array, even if the
        audio file has only one channel.
    fill_value : float, optional
        If more frames are requested than available in the file,
        the rest of the output is filled with `fill_value`.  If
        `fill_value` is not specified, a smaller array is returned.
    out : numpy.ndarray or subclass, optional
        If `out` is specified, the data is written into the given
        array instead of creating a new array.  In this case, the
        arguments `dtype` and `always_2d` are silently ignored!  If
        `frames` is not given, it is obtained from the length of
        `out`.

    Examples
    --------
    >>> from soundfile import SoundFile
    >>> myfile = SoundFile('stereo_file.wav')

    Reading 3 frames from a stereo file:

    >>> myfile.read(3)
    array([[ 0.71329652,  0.06294799],
           [-0.26450912, -0.38874483],
           [ 0.67398441, -0.11516333]])
    >>> myfile.close()

    See Also
    --------
    buffer_read, .write

    """
    if out is None:
        frames = self._check_frames(frames, fill_value)
        out = self._create_empty_array(frames, always_2d, dtype)
    else:
        if frames < 0 or frames > len(out):
            frames = len(out)
    frames = self._array_io('read', out, frames)
    if len(out) > frames:
        if fill_value is None:
            out = out[:frames]
        else:
            out[frames:] = fill_value
    return out
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L780-L873
train
41,512
bastibe/SoundFile
soundfile.py
SoundFile.buffer_read
def buffer_read(self, frames=-1, dtype=None):
    """Read from the file and return data as buffer object.

    Reads the given number of `frames` in the given data format
    starting at the current read/write position.  This advances the
    read/write position by the same number of frames.
    By default, all frames from the current read/write position to
    the end of the file are returned.
    Use :meth:`.seek` to move the current read/write position.

    Parameters
    ----------
    frames : int, optional
        The number of frames to read. If `frames < 0`, the whole
        rest of the file is read.
    dtype : {'float64', 'float32', 'int32', 'int16'}
        Audio data will be converted to the given data type.

    Returns
    -------
    buffer
        A buffer containing the read data.

    See Also
    --------
    buffer_read_into, .read, buffer_write

    """
    frames = self._check_frames(frames, fill_value=None)
    ctype = self._check_dtype(dtype)
    cdata = _ffi.new(ctype + '[]', frames * self.channels)
    read_frames = self._cdata_io('read', cdata, ctype, frames)
    assert read_frames == frames
    return _ffi.buffer(cdata)
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L875-L908
train
41,513
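The buffer returned by `buffer_read` holds `frames * channels` elements of the requested C type, so its length in bytes is fixed by three numbers. A stdlib-only sketch of that size bookkeeping, with hypothetical `frames` and `channels` values (no libsndfile involved):

```python
import array

frames, channels = 3, 2                # hypothetical request on a stereo file
itemsize = array.array('d').itemsize   # 'float64' maps to a C double (8 bytes)

# buffer_read allocates frames * channels elements of the chosen ctype,
# so the buffer it returns is this many bytes long:
nbytes = frames * channels * itemsize
print(nbytes)  # 48
```

This is the same arithmetic that `buffer_read_into` inverts when it derives a frame count from a caller-supplied buffer.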
bastibe/SoundFile
soundfile.py
SoundFile.buffer_read_into
def buffer_read_into(self, buffer, dtype):
    """Read from the file into a given buffer object.

    Fills the given `buffer` with frames in the given data format
    starting at the current read/write position (which can be
    changed with :meth:`.seek`) until the buffer is full or the end
    of the file is reached.  This advances the read/write position
    by the number of frames that were read.

    Parameters
    ----------
    buffer : writable buffer
        Audio frames from the file are written to this buffer.
    dtype : {'float64', 'float32', 'int32', 'int16'}
        The data type of `buffer`.

    Returns
    -------
    int
        The number of frames that were read from the file.
        This can be less than the size of `buffer`.
        The rest of the buffer is not filled with meaningful data.

    See Also
    --------
    buffer_read, .read

    """
    ctype = self._check_dtype(dtype)
    cdata, frames = self._check_buffer(buffer, ctype)
    frames = self._cdata_io('read', cdata, ctype, frames)
    return frames
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L910-L941
train
41,514
bastibe/SoundFile
soundfile.py
SoundFile.write
def write(self, data):
    """Write audio data from a NumPy array to the file.

    Writes a number of frames at the read/write position to the
    file. This also advances the read/write position by the same
    number of frames and enlarges the file if necessary.

    Note that writing int values to a float file will *not* scale
    the values to [-1.0, 1.0). If you write the value
    ``np.array([42], dtype='int32')`` to a ``subtype='FLOAT'``
    file, the file will then contain
    ``np.array([42.], dtype='float32')``.

    Parameters
    ----------
    data : array_like
        The data to write. Usually two-dimensional (frames x
        channels), but one-dimensional `data` can be used for mono
        files. Only the data types ``'float64'``, ``'float32'``,
        ``'int32'`` and ``'int16'`` are supported.

        .. note:: The data type of `data` does **not** select the
              data type of the written file. Audio data will be
              converted to the given `subtype`. Writing int values
              to a float file will *not* scale the values to
              [-1.0, 1.0). If you write the value
              ``np.array([42], dtype='int32')`` to a
              ``subtype='FLOAT'`` file, the file will then contain
              ``np.array([42.], dtype='float32')``.

    Examples
    --------
    >>> import numpy as np
    >>> from soundfile import SoundFile
    >>> myfile = SoundFile('stereo_file.wav')

    Write 10 frames of random data to a new file:

    >>> with SoundFile('stereo_file.wav', 'w', 44100, 2, 'PCM_24') as f:
    ...     f.write(np.random.randn(10, 2))

    See Also
    --------
    buffer_write, .read

    """
    import numpy as np
    # no copy is made if data has already the correct memory layout:
    data = np.ascontiguousarray(data)
    written = self._array_io('write', data, len(data))
    assert written == len(data)
    self._update_frames(written)
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L943-L994
train
41,515
bastibe/SoundFile
soundfile.py
SoundFile.truncate
def truncate(self, frames=None):
    """Truncate the file to a given number of frames.

    After this command, the read/write position will be at the new
    end of the file.

    Parameters
    ----------
    frames : int, optional
        Only the data before `frames` is kept, the rest is deleted.
        If not specified, the current read/write position is used.

    """
    if frames is None:
        frames = self.tell()
    err = _snd.sf_command(self._file, _snd.SFC_FILE_TRUNCATE,
                          _ffi.new("sf_count_t*", frames),
                          _ffi.sizeof("sf_count_t"))
    if err:
        raise RuntimeError("Error truncating the file")
    self._info.frames = frames
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1115-L1135
train
41,516
bastibe/SoundFile
soundfile.py
SoundFile.close
def close(self):
    """Close the file.  Can be called multiple times."""
    if not self.closed:
        # be sure to flush data to disk before closing the file
        self.flush()
        err = _snd.sf_close(self._file)
        self._file = None
        _error_check(err)
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1151-L1158
train
41,517
bastibe/SoundFile
soundfile.py
SoundFile._check_frames
def _check_frames(self, frames, fill_value):
    """Reduce frames to no more than are available in the file."""
    if self.seekable():
        remaining_frames = self.frames - self.tell()
        if frames < 0 or (frames > remaining_frames and
                          fill_value is None):
            frames = remaining_frames
    elif frames < 0:
        raise ValueError("frames must be specified for non-seekable files")
    return frames
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1262-L1271
train
41,518
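The clamping rule in `_check_frames` for a seekable file can be reproduced standalone. A minimal sketch, with `remaining_frames` standing in for `self.frames - self.tell()`:

```python
def clamp_frames(frames, remaining_frames, fill_value):
    # Mirror of _check_frames for a seekable file: a negative request,
    # or an over-long request without a fill_value, is reduced to what
    # is actually left in the file.
    if frames < 0 or (frames > remaining_frames and fill_value is None):
        frames = remaining_frames
    return frames


print(clamp_frames(-1, 100, None))    # 100: read the whole rest of the file
print(clamp_frames(200, 100, None))   # 100: clamped, a smaller array results
print(clamp_frames(200, 100, 0.0))    # 200: kept, the tail becomes fill_value
```

The third case is what lets `read(..., fill_value=...)` return a full-length array padded past the end of the file.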
bastibe/SoundFile
soundfile.py
SoundFile._check_buffer
def _check_buffer(self, data, ctype):
    """Convert buffer to cdata and check for valid size."""
    assert ctype in _ffi_types.values()
    if not isinstance(data, bytes):
        data = _ffi.from_buffer(data)
    frames, remainder = divmod(len(data),
                               self.channels * _ffi.sizeof(ctype))
    if remainder:
        raise ValueError("Data size must be a multiple of frame size")
    return data, frames
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1273-L1282
train
41,519
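The `divmod` check in `_check_buffer` is the inverse of the buffer-size arithmetic: a frame is `channels * itemsize` bytes, and the buffer must contain a whole number of frames. A standalone sketch of just that check:

```python
def buffer_frames(nbytes, channels, itemsize):
    # Mirror of _check_buffer's size validation: derive the frame count
    # from the byte length and reject buffers with a partial trailing frame.
    frames, remainder = divmod(nbytes, channels * itemsize)
    if remainder:
        raise ValueError("Data size must be a multiple of frame size")
    return frames


print(buffer_frames(48, 2, 8))   # 3: three stereo float64 frames
# buffer_frames(47, 2, 8) would raise ValueError (47 is not a multiple of 16)
```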
bastibe/SoundFile
soundfile.py
SoundFile._create_empty_array
def _create_empty_array(self, frames, always_2d, dtype):
    """Create an empty array with appropriate shape."""
    import numpy as np
    if always_2d or self.channels > 1:
        shape = frames, self.channels
    else:
        shape = frames,
    return np.empty(shape, dtype, order='C')
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1284-L1291
train
41,520
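The shape logic above is what makes mono files come back one-dimensional unless `always_2d` is set. A numpy-free sketch of just that decision:

```python
def result_shape(frames, channels, always_2d):
    # Mirror of _create_empty_array's shape choice: mono files yield a
    # one-dimensional array unless always_2d forces (frames, channels).
    if always_2d or channels > 1:
        return (frames, channels)
    return (frames,)


print(result_shape(10, 1, False))  # (10,): mono, 1-D by default
print(result_shape(10, 1, True))   # (10, 1): mono, forced 2-D
print(result_shape(10, 2, False))  # (10, 2): stereo is always 2-D
```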
bastibe/SoundFile
soundfile.py
SoundFile._check_dtype
def _check_dtype(self, dtype):
    """Check if dtype string is valid and return ctype string."""
    try:
        return _ffi_types[dtype]
    except KeyError:
        raise ValueError("dtype must be one of {0!r} and not {1!r}".format(
            sorted(_ffi_types.keys()), dtype))
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1293-L1299
train
41,521
bastibe/SoundFile
soundfile.py
SoundFile._array_io
def _array_io(self, action, array, frames):
    """Check array and call low-level IO function."""
    if (array.ndim not in (1, 2) or
            array.ndim == 1 and self.channels != 1 or
            array.ndim == 2 and array.shape[1] != self.channels):
        raise ValueError("Invalid shape: {0!r}".format(array.shape))
    if not array.flags.c_contiguous:
        raise ValueError("Data must be C-contiguous")
    ctype = self._check_dtype(array.dtype.name)
    assert array.dtype.itemsize == _ffi.sizeof(ctype)
    cdata = _ffi.cast(ctype + '*', array.__array_interface__['data'][0])
    return self._cdata_io(action, cdata, ctype, frames)
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1301-L1312
train
41,522
bastibe/SoundFile
soundfile.py
SoundFile._update_frames
def _update_frames(self, written):
    """Update self.frames after writing."""
    if self.seekable():
        curr = self.tell()
        self._info.frames = self.seek(0, SEEK_END)
        self.seek(curr, SEEK_SET)
    else:
        self._info.frames += written
python
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1327-L1334
train
41,523
bastibe/SoundFile
soundfile.py
SoundFile._prepare_read
def _prepare_read(self, start, stop, frames):
    """Seek to start frame and calculate length."""
    if start != 0 and not self.seekable():
        raise ValueError("start is only allowed for seekable files")
    if frames >= 0 and stop is not None:
        raise TypeError("Only one of {frames, stop} may be used")
    start, stop, _ = slice(start, stop).indices(self.frames)
    if stop < start:
        stop = start
    if frames < 0:
        frames = stop - start
    if self.seekable():
        self.seek(start, SEEK_SET)
    return frames
python
def _prepare_read(self, start, stop, frames): """Seek to start frame and calculate length.""" if start != 0 and not self.seekable(): raise ValueError("start is only allowed for seekable files") if frames >= 0 and stop is not None: raise TypeError("Only one of {frames, stop} may be used") start, stop, _ = slice(start, stop).indices(self.frames) if stop < start: stop = start if frames < 0: frames = stop - start if self.seekable(): self.seek(start, SEEK_SET) return frames
[ "def", "_prepare_read", "(", "self", ",", "start", ",", "stop", ",", "frames", ")", ":", "if", "start", "!=", "0", "and", "not", "self", ".", "seekable", "(", ")", ":", "raise", "ValueError", "(", "\"start is only allowed for seekable files\"", ")", "if", ...
Seek to start frame and calculate length.
[ "Seek", "to", "start", "frame", "and", "calculate", "length", "." ]
161e930da9c9ea76579b6ee18a131e10bca8a605
https://github.com/bastibe/SoundFile/blob/161e930da9c9ea76579b6ee18a131e10bca8a605/soundfile.py#L1336-L1350
train
41,524
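The `_prepare_read` record above clamps `start`/`stop` against the total frame count via `slice().indices()`. A minimal standalone sketch of just that clamping logic (the seeking and the `SoundFile` state are omitted; the function name `prepare_read` is illustrative):

```python
def prepare_read(total_frames, start=0, stop=None, frames=-1):
    """Return the number of frames to read after clamping start/stop.

    Sketch of the normalization used by SoundFile._prepare_read:
    slice().indices() resolves negative indices and clamps both
    bounds to [0, total_frames].
    """
    if frames >= 0 and stop is not None:
        raise TypeError("Only one of {frames, stop} may be used")
    start, stop, _ = slice(start, stop).indices(total_frames)
    if stop < start:
        stop = start
    if frames < 0:
        frames = stop - start
    return frames
```

For example, `prepare_read(100, start=-10)` resolves the negative start against the end of the file and yields 10 remaining frames.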
bioidiap/bob.bio.spear
bob/bio/spear/utils/__init__.py
probes_used_generate_vector
def probes_used_generate_vector(probe_files_full, probe_files_model): """Generates boolean matrices indicating which are the probes for each model""" import numpy as np C_probesUsed = np.ndarray((len(probe_files_full),), 'bool') C_probesUsed.fill(False) c=0 for k in sorted(probe_files_full.keys()): if k in probe_files_model: C_probesUsed[c] = True c+=1 return C_probesUsed
python
def probes_used_generate_vector(probe_files_full, probe_files_model): """Generates boolean matrices indicating which are the probes for each model""" import numpy as np C_probesUsed = np.ndarray((len(probe_files_full),), 'bool') C_probesUsed.fill(False) c=0 for k in sorted(probe_files_full.keys()): if k in probe_files_model: C_probesUsed[c] = True c+=1 return C_probesUsed
[ "def", "probes_used_generate_vector", "(", "probe_files_full", ",", "probe_files_model", ")", ":", "import", "numpy", "as", "np", "C_probesUsed", "=", "np", ".", "ndarray", "(", "(", "len", "(", "probe_files_full", ")", ",", ")", ",", "'bool'", ")", "C_probesU...
Generates boolean matrices indicating which are the probes for each model
[ "Generates", "boolean", "matrices", "indicating", "which", "are", "the", "probes", "for", "each", "model" ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/__init__.py#L66-L75
train
41,525
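The same membership test that `probes_used_generate_vector` performs can be expressed as a one-line NumPy boolean vector. A hedged sketch (the function name `probes_used_vector` is illustrative, not from the original repo):

```python
import numpy as np

def probes_used_vector(probe_files_full, probe_files_model):
    """Mark which probe keys from the full set also appear in the
    model's probe set, in sorted key order."""
    keys = sorted(probe_files_full)
    return np.array([k in probe_files_model for k in keys], dtype=bool)
```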
bioidiap/bob.bio.spear
bob/bio/spear/utils/__init__.py
probes_used_extract_scores
def probes_used_extract_scores(full_scores, same_probes): """Extracts a matrix of scores for a model, given a probes_used row vector of boolean""" if full_scores.shape[1] != same_probes.shape[0]: raise ValueError("Size mismatch") import numpy as np model_scores = np.ndarray((full_scores.shape[0],np.sum(same_probes)), 'float64') c=0 for i in range(0,full_scores.shape[1]): if same_probes[i]: for j in range(0,full_scores.shape[0]): model_scores[j,c] = full_scores[j,i] c+=1 return model_scores
python
def probes_used_extract_scores(full_scores, same_probes): """Extracts a matrix of scores for a model, given a probes_used row vector of boolean""" if full_scores.shape[1] != same_probes.shape[0]: raise ValueError("Size mismatch") import numpy as np model_scores = np.ndarray((full_scores.shape[0],np.sum(same_probes)), 'float64') c=0 for i in range(0,full_scores.shape[1]): if same_probes[i]: for j in range(0,full_scores.shape[0]): model_scores[j,c] = full_scores[j,i] c+=1 return model_scores
[ "def", "probes_used_extract_scores", "(", "full_scores", ",", "same_probes", ")", ":", "if", "full_scores", ".", "shape", "[", "1", "]", "!=", "same_probes", ".", "shape", "[", "0", "]", ":", "raise", "\"Size mismatch\"", "import", "numpy", "as", "np", "mode...
Extracts a matrix of scores for a model, given a probes_used row vector of boolean
[ "Extracts", "a", "matrix", "of", "scores", "for", "a", "model", "given", "a", "probes_used", "row", "vector", "of", "boolean" ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/__init__.py#L77-L88
train
41,526
bioidiap/bob.bio.spear
bob/bio/spear/utils/__init__.py
read
def read(filename): """Read audio file""" # Deprecated: use load() function from bob.bio.spear.database.AudioBioFile #TODO: update xbob.sox first. This will enable the use of formats like NIST sphere and other #import xbob.sox #audio = xbob.sox.reader(filename) #(rate, data) = audio.load() # We consider there is only 1 channel in the audio file => data[0] #data= numpy.cast['float'](data[0]*pow(2,15)) # pow(2,15) is used to get the same native format as for scipy.io.wavfile.read import scipy.io.wavfile rate, audio = scipy.io.wavfile.read(filename) # We consider there is only 1 channel in the audio file => data[0] data = audio.astype('float64') return rate, data
python
def read(filename): """Read audio file""" # Deprecated: use load() function from bob.bio.spear.database.AudioBioFile #TODO: update xbob.sox first. This will enable the use of formats like NIST sphere and other #import xbob.sox #audio = xbob.sox.reader(filename) #(rate, data) = audio.load() # We consider there is only 1 channel in the audio file => data[0] #data= numpy.cast['float'](data[0]*pow(2,15)) # pow(2,15) is used to get the same native format as for scipy.io.wavfile.read import scipy.io.wavfile rate, audio = scipy.io.wavfile.read(filename) # We consider there is only 1 channel in the audio file => data[0] data = audio.astype('float64') return rate, data
[ "def", "read", "(", "filename", ")", ":", "# Depricated: use load() function from bob.bio.spear.database.AudioBioFile", "#TODO: update xbob.sox first. This will enable the use of formats like NIST sphere and other", "#import xbob.sox", "#audio = xbob.sox.reader(filename)", "#(rate, data) = audio...
Read audio file
[ "Read", "audio", "file" ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/__init__.py#L91-L105
train
41,527
bioidiap/bob.bio.spear
bob/bio/spear/utils/__init__.py
normalize_std_array
def normalize_std_array(vector): """Applies a unit mean and variance normalization to an arrayset""" # Initializes variables length = 1 n_samples = len(vector) mean = numpy.ndarray((length,), 'float64') std = numpy.ndarray((length,), 'float64') mean.fill(0) std.fill(0) # Computes mean and variance for array in vector: x = array.astype('float64') mean += x std += (x ** 2) mean /= n_samples std /= n_samples std -= (mean ** 2) std = std ** 0.5 arrayset = numpy.ndarray(shape=(n_samples,mean.shape[0]), dtype=numpy.float64) for i in range (0, n_samples): arrayset[i,:] = (vector[i]-mean) / std return arrayset
python
def normalize_std_array(vector): """Applies a unit mean and variance normalization to an arrayset""" # Initializes variables length = 1 n_samples = len(vector) mean = numpy.ndarray((length,), 'float64') std = numpy.ndarray((length,), 'float64') mean.fill(0) std.fill(0) # Computes mean and variance for array in vector: x = array.astype('float64') mean += x std += (x ** 2) mean /= n_samples std /= n_samples std -= (mean ** 2) std = std ** 0.5 arrayset = numpy.ndarray(shape=(n_samples,mean.shape[0]), dtype=numpy.float64) for i in range (0, n_samples): arrayset[i,:] = (vector[i]-mean) / std return arrayset
[ "def", "normalize_std_array", "(", "vector", ")", ":", "# Initializes variables", "length", "=", "1", "n_samples", "=", "len", "(", "vector", ")", "mean", "=", "numpy", ".", "ndarray", "(", "(", "length", ",", ")", ",", "'float64'", ")", "std", "=", "num...
Applies a unit mean and variance normalization to an arrayset
[ "Applies", "a", "unit", "mean", "and", "variance", "normalization", "to", "an", "arrayset" ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/__init__.py#L109-L134
train
41,528
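The accumulation loop in `normalize_std_array` above can be written as a few vectorized NumPy operations. A hedged sketch (the zero-std guard is an added assumption to keep constant dimensions finite; it is not in the original function):

```python
import numpy as np

def normalize_std(vectors):
    """Unit mean/variance normalization over a set of sample vectors,
    expressed with vectorized numpy instead of an accumulation loop."""
    data = np.asarray(vectors, dtype='float64')
    mean = data.mean(axis=0)
    std = data.std(axis=0)
    # Assumption: avoid division by zero for constant dimensions.
    std[std == 0] = 1.0
    return (data - mean) / std
```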
bioidiap/bob.bio.spear
bob/bio/spear/preprocessor/External.py
External._conversion
def _conversion(self, input_signal, vad_file): """ Converts an external VAD to follow the Spear convention. Energy is used in order to avoid out-of-bound array indexes. """ e = bob.ap.Energy(input_signal[0], self.win_length_ms, self.win_shift_ms) energy_array = e(input_signal[1]) labels = self.use_existing_vad(energy_array, vad_file) return labels
python
def _conversion(self, input_signal, vad_file): """ Converts an external VAD to follow the Spear convention. Energy is used in order to avoid out-of-bound array indexes. """ e = bob.ap.Energy(input_signal[0], self.win_length_ms, self.win_shift_ms) energy_array = e(input_signal[1]) labels = self.use_existing_vad(energy_array, vad_file) return labels
[ "def", "_conversion", "(", "self", ",", "input_signal", ",", "vad_file", ")", ":", "e", "=", "bob", ".", "ap", ".", "Energy", "(", "rate_wavsample", "[", "0", "]", ",", "self", ".", "win_length_ms", ",", "self", ".", "win_shift_ms", ")", "energy_array", ...
Converts an external VAD to follow the Spear convention. Energy is used in order to avoid out-of-bound array indexes.
[ "Converts", "an", "external", "VAD", "to", "follow", "the", "Spear", "convention", ".", "Energy", "is", "used", "in", "order", "to", "avoind", "out", "-", "of", "-", "bound", "array", "indexes", "." ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/preprocessor/External.py#L65-L75
train
41,529
bioidiap/bob.bio.spear
bob/bio/spear/preprocessor/Mod_4Hz.py
Mod_4Hz.mod_4hz
def mod_4hz(self, rate_wavsample): """Computes and returns the 4Hz modulation energy features for the given input wave file""" # Set parameters wl = self.win_length_ms ws = self.win_shift_ms nf = self.n_filters f_min = self.f_min f_max = self.f_max pre = self.pre_emphasis_coef c = bob.ap.Spectrogram(rate_wavsample[0], wl, ws, nf, f_min, f_max, pre) c.energy_filter=True c.log_filter=False c.energy_bands=True sig = rate_wavsample[1] energy_bands = c(sig) filtering_res = self.pass_band_filtering(energy_bands, rate_wavsample[0]) mod_4hz = self.modulation_4hz(filtering_res, rate_wavsample) mod_4hz = self.averaging(mod_4hz) e = bob.ap.Energy(rate_wavsample[0], wl, ws) energy_array = e(rate_wavsample[1]) labels = self._voice_activity_detection(energy_array, mod_4hz) labels = utils.smoothing(labels,self.smoothing_window) # discard isolated speech less than 100ms logger.info("After Mod-4Hz based VAD there are %d frames remaining over %d", numpy.sum(labels), len(labels)) return labels, energy_array, mod_4hz
python
def mod_4hz(self, rate_wavsample): """Computes and returns the 4Hz modulation energy features for the given input wave file""" # Set parameters wl = self.win_length_ms ws = self.win_shift_ms nf = self.n_filters f_min = self.f_min f_max = self.f_max pre = self.pre_emphasis_coef c = bob.ap.Spectrogram(rate_wavsample[0], wl, ws, nf, f_min, f_max, pre) c.energy_filter=True c.log_filter=False c.energy_bands=True sig = rate_wavsample[1] energy_bands = c(sig) filtering_res = self.pass_band_filtering(energy_bands, rate_wavsample[0]) mod_4hz = self.modulation_4hz(filtering_res, rate_wavsample) mod_4hz = self.averaging(mod_4hz) e = bob.ap.Energy(rate_wavsample[0], wl, ws) energy_array = e(rate_wavsample[1]) labels = self._voice_activity_detection(energy_array, mod_4hz) labels = utils.smoothing(labels,self.smoothing_window) # discard isolated speech less than 100ms logger.info("After Mod-4Hz based VAD there are %d frames remaining over %d", numpy.sum(labels), len(labels)) return labels, energy_array, mod_4hz
[ "def", "mod_4hz", "(", "self", ",", "rate_wavsample", ")", ":", "# Set parameters", "wl", "=", "self", ".", "win_length_ms", "ws", "=", "self", ".", "win_shift_ms", "nf", "=", "self", ".", "n_filters", "f_min", "=", "self", ".", "f_min", "f_max", "=", "s...
Computes and returns the 4Hz modulation energy features for the given input wave file
[ "Computes", "and", "returns", "the", "4Hz", "modulation", "energy", "features", "for", "the", "given", "input", "wave", "file" ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/preprocessor/Mod_4Hz.py#L168-L194
train
41,530
bioidiap/bob.bio.spear
bob/bio/spear/extractor/CQCCFeatures.py
CQCCFeatures.read_matlab_files
def read_matlab_files(self, biofile, directory, extension): """ Read pre-computed CQCC Matlab features here """ import bob.io.matlab # return the numpy array read from the data_file data_path = biofile.make_path(directory, extension) return bob.io.base.load(data_path)
python
def read_matlab_files(self, biofile, directory, extension): """ Read pre-computed CQCC Matlab features here """ import bob.io.matlab # return the numpy array read from the data_file data_path = biofile.make_path(directory, extension) return bob.io.base.load(data_path)
[ "def", "read_matlab_files", "(", "self", ",", "biofile", ",", "directory", ",", "extension", ")", ":", "import", "bob", ".", "io", ".", "matlab", "# return the numpy array read from the data_file", "data_path", "=", "biofile", ".", "make_path", "(", "directory", "...
Read pre-computed CQCC Matlab features here
[ "Read", "pre", "-", "computed", "CQCC", "Matlab", "features", "here" ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/extractor/CQCCFeatures.py#L45-L52
train
41,531
bioidiap/bob.bio.spear
bob/bio/spear/script/baselines.py
command_line_arguments
def command_line_arguments(command_line_parameters): """Defines the command line parameters that are accepted.""" # create parser parser = argparse.ArgumentParser(description='Execute baseline algorithms with default parameters', formatter_class=argparse.ArgumentDefaultsHelpFormatter) # add parameters # - the algorithm to execute parser.add_argument('-a', '--algorithms', choices = all_algorithms, default = ('gmm-voxforge',), nargs = '+', help = 'Select one (or more) algorithms that you want to execute.') parser.add_argument('--all', action = 'store_true', help = 'Select all algorithms.') # - the database to choose parser.add_argument('-d', '--database', choices = available_databases, default = 'voxforge', help = 'The database on which the baseline algorithm is executed.') # - the database to choose parser.add_argument('-b', '--baseline-directory', default = 'baselines', help = 'The sub-directory, where the baseline results are stored.') # - the directory to write parser.add_argument('-f', '--directory', help = 'The directory to write the data of the experiment into. If not specified, the default directories of the verify.py script are used (see verify.py --help).') # - use the Idiap grid -- option is only useful if you are at Idiap parser.add_argument('-g', '--grid', action = 'store_true', help = 'Execute the algorithm in the SGE grid.') # - run in parallel on the local machine parser.add_argument('-l', '--parallel', type=int, help = 'Run the algorithms in parallel on the local machine, using the given number of parallel threads') # - perform ZT-normalization parser.add_argument('-z', '--zt-norm', action = 'store_false', help = 'Compute the ZT norm for the files (might not be availabe for all databases).') # - just print? 
parser.add_argument('-q', '--dry-run', action = 'store_true', help = 'Just print the commands, but do not execute them.') # - evaluate the algorithm (after it has finished) parser.add_argument('-e', '--evaluate', nargs='+', choices = ('EER', 'HTER', 'ROC', 'DET', 'CMC', 'RR'), help = 'Evaluate the results of the algorithms (instead of running them) using the given evaluation techniques.') # TODO: add MIN-DCT measure # - other parameters that are passed to the underlying script parser.add_argument('parameters', nargs = argparse.REMAINDER, help = 'Parameters directly passed to the verify.py script.') bob.core.log.add_command_line_option(parser) args = parser.parse_args(command_line_parameters) if args.all: args.algorithms = all_algorithms bob.core.log.set_verbosity_level(logger, args.verbose) return args
python
def command_line_arguments(command_line_parameters): """Defines the command line parameters that are accepted.""" # create parser parser = argparse.ArgumentParser(description='Execute baseline algorithms with default parameters', formatter_class=argparse.ArgumentDefaultsHelpFormatter) # add parameters # - the algorithm to execute parser.add_argument('-a', '--algorithms', choices = all_algorithms, default = ('gmm-voxforge',), nargs = '+', help = 'Select one (or more) algorithms that you want to execute.') parser.add_argument('--all', action = 'store_true', help = 'Select all algorithms.') # - the database to choose parser.add_argument('-d', '--database', choices = available_databases, default = 'voxforge', help = 'The database on which the baseline algorithm is executed.') # - the database to choose parser.add_argument('-b', '--baseline-directory', default = 'baselines', help = 'The sub-directory, where the baseline results are stored.') # - the directory to write parser.add_argument('-f', '--directory', help = 'The directory to write the data of the experiment into. If not specified, the default directories of the verify.py script are used (see verify.py --help).') # - use the Idiap grid -- option is only useful if you are at Idiap parser.add_argument('-g', '--grid', action = 'store_true', help = 'Execute the algorithm in the SGE grid.') # - run in parallel on the local machine parser.add_argument('-l', '--parallel', type=int, help = 'Run the algorithms in parallel on the local machine, using the given number of parallel threads') # - perform ZT-normalization parser.add_argument('-z', '--zt-norm', action = 'store_false', help = 'Compute the ZT norm for the files (might not be availabe for all databases).') # - just print? 
parser.add_argument('-q', '--dry-run', action = 'store_true', help = 'Just print the commands, but do not execute them.') # - evaluate the algorithm (after it has finished) parser.add_argument('-e', '--evaluate', nargs='+', choices = ('EER', 'HTER', 'ROC', 'DET', 'CMC', 'RR'), help = 'Evaluate the results of the algorithms (instead of running them) using the given evaluation techniques.') # TODO: add MIN-DCT measure # - other parameters that are passed to the underlying script parser.add_argument('parameters', nargs = argparse.REMAINDER, help = 'Parameters directly passed to the verify.py script.') bob.core.log.add_command_line_option(parser) args = parser.parse_args(command_line_parameters) if args.all: args.algorithms = all_algorithms bob.core.log.set_verbosity_level(logger, args.verbose) return args
[ "def", "command_line_arguments", "(", "command_line_parameters", ")", ":", "# create parser", "parser", "=", "argparse", ".", "ArgumentParser", "(", "description", "=", "'Execute baseline algorithms with default parameters'", ",", "formatter_class", "=", "argparse", ".", "A...
Defines the command line parameters that are accepted.
[ "Defines", "the", "command", "line", "parameters", "that", "are", "accepted", "." ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/script/baselines.py#L38-L78
train
41,532
bioidiap/bob.bio.spear
bob/bio/spear/utils/extraction.py
calc_mean
def calc_mean(c0, c1=[]): """ Calculates the mean of the data.""" if c1 != []: return (numpy.mean(c0, 0) + numpy.mean(c1, 0)) / 2. else: return numpy.mean(c0, 0)
python
def calc_mean(c0, c1=[]): """ Calculates the mean of the data.""" if c1 != []: return (numpy.mean(c0, 0) + numpy.mean(c1, 0)) / 2. else: return numpy.mean(c0, 0)
[ "def", "calc_mean", "(", "c0", ",", "c1", "=", "[", "]", ")", ":", "if", "c1", "!=", "[", "]", ":", "return", "(", "numpy", ".", "mean", "(", "c0", ",", "0", ")", "+", "numpy", ".", "mean", "(", "c1", ",", "0", ")", ")", "/", "2.", "else"...
Calculates the mean of the data.
[ "Calculates", "the", "mean", "of", "the", "data", "." ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/extraction.py#L32-L37
train
41,533
bioidiap/bob.bio.spear
bob/bio/spear/utils/extraction.py
calc_std
def calc_std(c0, c1=[]): """ Calculates the variance of the data.""" if c1 == []: return numpy.std(c0, 0) prop = float(len(c0)) / float(len(c1)) if prop < 1: p0 = int(math.ceil(1 / prop)) p1 = 1 else: p0 = 1 p1 = int(math.ceil(prop)) return numpy.std(numpy.vstack(p0 * [c0] + p1 * [c1]), 0)
python
def calc_std(c0, c1=[]): """ Calculates the variance of the data.""" if c1 == []: return numpy.std(c0, 0) prop = float(len(c0)) / float(len(c1)) if prop < 1: p0 = int(math.ceil(1 / prop)) p1 = 1 else: p0 = 1 p1 = int(math.ceil(prop)) return numpy.std(numpy.vstack(p0 * [c0] + p1 * [c1]), 0)
[ "def", "calc_std", "(", "c0", ",", "c1", "=", "[", "]", ")", ":", "if", "c1", "==", "[", "]", ":", "return", "numpy", ".", "std", "(", "c0", ",", "0", ")", "prop", "=", "float", "(", "len", "(", "c0", ")", ")", "/", "float", "(", "len", "...
Calculates the variance of the data.
[ "Calculates", "the", "variance", "of", "the", "data", "." ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/extraction.py#L40-L51
train
41,534
bioidiap/bob.bio.spear
bob/bio/spear/utils/extraction.py
calc_mean_std
def calc_mean_std(c0, c1=[], nonStdZero=False): """ Calculates both the mean and the standard deviation of the data. """ mi = calc_mean(c0, c1) std = calc_std(c0, c1) if (nonStdZero): std[std == 0] = 1 return mi, std
python
def calc_mean_std(c0, c1=[], nonStdZero=False): """ Calculates both the mean and the standard deviation of the data. """ mi = calc_mean(c0, c1) std = calc_std(c0, c1) if (nonStdZero): std[std == 0] = 1 return mi, std
[ "def", "calc_mean_std", "(", "c0", ",", "c1", "=", "[", "]", ",", "nonStdZero", "=", "False", ")", ":", "mi", "=", "calc_mean", "(", "c0", ",", "c1", ")", "std", "=", "calc_std", "(", "c0", ",", "c1", ")", "if", "(", "nonStdZero", ")", ":", "st...
Calculates both the mean and the standard deviation of the data.
[ "Calculates", "both", "the", "mean", "of", "the", "data", "." ]
9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd
https://github.com/bioidiap/bob.bio.spear/blob/9f5d13d2e52d3b0c818f4abaa07cda15f62a34cd/bob/bio/spear/utils/extraction.py#L59-L66
train
41,535
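The `calc_mean_std` record above combines `calc_mean` and `calc_std` with an optional zero-std guard. A hedged single-class sketch without the two-class balancing (the name `mean_std` and the `np.where` formulation are illustrative):

```python
import numpy as np

def mean_std(data, non_std_zero=False):
    """Per-dimension mean and population standard deviation, with an
    optional guard that maps zero std to 1 (as calc_mean_std does)."""
    data = np.asarray(data, dtype='float64')
    mean = data.mean(axis=0)
    std = data.std(axis=0)
    if non_std_zero:
        std = np.where(std == 0, 1.0, std)
    return mean, std
```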
jtambasco/modesolverpy
modesolverpy/mode_solver.py
_ModeSolver.solve_sweep_structure
def solve_sweep_structure( self, structures, sweep_param_list, filename="structure_n_effs.dat", plot=True, x_label="Structure number", fraction_mode_list=[], ): """ Find the modes of many structures. Args: structures (list): A list of `Structures` to find the modes of. sweep_param_list (list): A list of the parameter-sweep sweep that was used. This is for plotting purposes only. filename (str): The nominal filename to use when saving the effective indices. Defaults to 'structure_n_effs.dat'. plot (bool): `True` if plots should be generates, otherwise `False`. Default is `True`. x_label (str): x-axis text to display in the plot. fraction_mode_list (list): A list of mode indices of the modes that should be included in the TE/TM mode fraction plot. If the list is empty, all modes will be included. The list is empty by default. Returns: list: A list of the effective indices found for each structure. """ n_effs = [] mode_types = [] fractions_te = [] fractions_tm = [] for s in tqdm.tqdm(structures, ncols=70): self.solve(s) n_effs.append(np.real(self.n_effs)) mode_types.append(self._get_mode_types()) fractions_te.append(self.fraction_te) fractions_tm.append(self.fraction_tm) if filename: self._write_n_effs_to_file( n_effs, self._modes_directory + filename, sweep_param_list ) with open(self._modes_directory + "mode_types.dat", "w") as fs: header = ",".join( "Mode%i" % i for i, _ in enumerate(mode_types[0]) ) fs.write("# " + header + "\n") for mt in mode_types: txt = ",".join("%s %.2f" % pair for pair in mt) fs.write(txt + "\n") with open(self._modes_directory + "fraction_te.dat", "w") as fs: header = "fraction te" fs.write("# param sweep," + header + "\n") for param, fte in zip(sweep_param_list, fractions_te): txt = "%.6f," % param txt += ",".join("%.2f" % f for f in fte) fs.write(txt + "\n") with open(self._modes_directory + "fraction_tm.dat", "w") as fs: header = "fraction tm" fs.write("# param sweep," + header + "\n") for param, ftm in zip(sweep_param_list, 
fractions_tm): txt = "%.6f," % param txt += ",".join("%.2f" % f for f in ftm) fs.write(txt + "\n") if plot: if MPL: title = "$n_{eff}$ vs %s" % x_label y_label = "$n_{eff}$" else: title = "n_{effs} vs %s" % x_label y_label = "n_{eff}" self._plot_n_effs( self._modes_directory + filename, self._modes_directory + "fraction_te.dat", x_label, y_label, title ) title = "TE Fraction vs %s" % x_label self._plot_fraction( self._modes_directory + "fraction_te.dat", x_label, "TE Fraction [%]", title, fraction_mode_list, ) title = "TM Fraction vs %s" % x_label self._plot_fraction( self._modes_directory + "fraction_tm.dat", x_label, "TM Fraction [%]", title, fraction_mode_list, ) return n_effs
python
def solve_sweep_structure( self, structures, sweep_param_list, filename="structure_n_effs.dat", plot=True, x_label="Structure number", fraction_mode_list=[], ): """ Find the modes of many structures. Args: structures (list): A list of `Structures` to find the modes of. sweep_param_list (list): A list of the parameter-sweep sweep that was used. This is for plotting purposes only. filename (str): The nominal filename to use when saving the effective indices. Defaults to 'structure_n_effs.dat'. plot (bool): `True` if plots should be generates, otherwise `False`. Default is `True`. x_label (str): x-axis text to display in the plot. fraction_mode_list (list): A list of mode indices of the modes that should be included in the TE/TM mode fraction plot. If the list is empty, all modes will be included. The list is empty by default. Returns: list: A list of the effective indices found for each structure. """ n_effs = [] mode_types = [] fractions_te = [] fractions_tm = [] for s in tqdm.tqdm(structures, ncols=70): self.solve(s) n_effs.append(np.real(self.n_effs)) mode_types.append(self._get_mode_types()) fractions_te.append(self.fraction_te) fractions_tm.append(self.fraction_tm) if filename: self._write_n_effs_to_file( n_effs, self._modes_directory + filename, sweep_param_list ) with open(self._modes_directory + "mode_types.dat", "w") as fs: header = ",".join( "Mode%i" % i for i, _ in enumerate(mode_types[0]) ) fs.write("# " + header + "\n") for mt in mode_types: txt = ",".join("%s %.2f" % pair for pair in mt) fs.write(txt + "\n") with open(self._modes_directory + "fraction_te.dat", "w") as fs: header = "fraction te" fs.write("# param sweep," + header + "\n") for param, fte in zip(sweep_param_list, fractions_te): txt = "%.6f," % param txt += ",".join("%.2f" % f for f in fte) fs.write(txt + "\n") with open(self._modes_directory + "fraction_tm.dat", "w") as fs: header = "fraction tm" fs.write("# param sweep," + header + "\n") for param, ftm in zip(sweep_param_list, 
fractions_tm): txt = "%.6f," % param txt += ",".join("%.2f" % f for f in ftm) fs.write(txt + "\n") if plot: if MPL: title = "$n_{eff}$ vs %s" % x_label y_label = "$n_{eff}$" else: title = "n_{effs} vs %s" % x_label y_label = "n_{eff}" self._plot_n_effs( self._modes_directory + filename, self._modes_directory + "fraction_te.dat", x_label, y_label, title ) title = "TE Fraction vs %s" % x_label self._plot_fraction( self._modes_directory + "fraction_te.dat", x_label, "TE Fraction [%]", title, fraction_mode_list, ) title = "TM Fraction vs %s" % x_label self._plot_fraction( self._modes_directory + "fraction_tm.dat", x_label, "TM Fraction [%]", title, fraction_mode_list, ) return n_effs
[ "def", "solve_sweep_structure", "(", "self", ",", "structures", ",", "sweep_param_list", ",", "filename", "=", "\"structure_n_effs.dat\"", ",", "plot", "=", "True", ",", "x_label", "=", "\"Structure number\"", ",", "fraction_mode_list", "=", "[", "]", ",", ")", ...
Find the modes of many structures. Args: structures (list): A list of `Structures` to find the modes of. sweep_param_list (list): A list of the parameter-sweep values that were used. This is for plotting purposes only. filename (str): The nominal filename to use when saving the effective indices. Defaults to 'structure_n_effs.dat'. plot (bool): `True` if plots should be generated, otherwise `False`. Default is `True`. x_label (str): x-axis text to display in the plot. fraction_mode_list (list): A list of mode indices of the modes that should be included in the TE/TM mode fraction plot. If the list is empty, all modes will be included. The list is empty by default. Returns: list: A list of the effective indices found for each structure.
[ "Find", "the", "modes", "of", "many", "structures", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/mode_solver.py#L92-L192
train
41,536
jtambasco/modesolverpy
modesolverpy/mode_solver.py
_ModeSolver.solve_sweep_wavelength
def solve_sweep_wavelength( self, structure, wavelengths, filename="wavelength_n_effs.dat", plot=True, ): """ Solve for the effective indices of a fixed structure at different wavelengths. Args: structure (Slabs): The target structure to solve for modes. wavelengths (list): A list of wavelengths to sweep over. filename (str): The nominal filename to use when saving the effective indices. Defaults to 'wavelength_n_effs.dat'. plot (bool): `True` if plots should be generated, otherwise `False`. Default is `True`. Returns: list: A list of the effective indices found for each wavelength. """ n_effs = [] for w in tqdm.tqdm(wavelengths, ncols=70): structure.change_wavelength(w) self.solve(structure) n_effs.append(np.real(self.n_effs)) if filename: self._write_n_effs_to_file( n_effs, self._modes_directory + filename, wavelengths ) if plot: if MPL: title = "$n_{eff}$ vs Wavelength" y_label = "$n_{eff}$" else: title = "n_{effs} vs Wavelength" y_label = "n_{eff}" self._plot_n_effs( self._modes_directory + filename, self._modes_directory + "fraction_te.dat", "Wavelength", "n_{eff}", title, ) return n_effs
python
def solve_sweep_wavelength( self, structure, wavelengths, filename="wavelength_n_effs.dat", plot=True, ): """ Solve for the effective indices of a fixed structure at different wavelengths. Args: structure (Slabs): The target structure to solve for modes. wavelengths (list): A list of wavelengths to sweep over. filename (str): The nominal filename to use when saving the effective indices. Defaults to 'wavelength_n_effs.dat'. plot (bool): `True` if plots should be generated, otherwise `False`. Default is `True`. Returns: list: A list of the effective indices found for each wavelength. """ n_effs = [] for w in tqdm.tqdm(wavelengths, ncols=70): structure.change_wavelength(w) self.solve(structure) n_effs.append(np.real(self.n_effs)) if filename: self._write_n_effs_to_file( n_effs, self._modes_directory + filename, wavelengths ) if plot: if MPL: title = "$n_{eff}$ vs Wavelength" y_label = "$n_{eff}$" else: title = "n_{effs} vs Wavelength" y_label = "n_{eff}" self._plot_n_effs( self._modes_directory + filename, self._modes_directory + "fraction_te.dat", "Wavelength", "n_{eff}", title, ) return n_effs
[ "def", "solve_sweep_wavelength", "(", "self", ",", "structure", ",", "wavelengths", ",", "filename", "=", "\"wavelength_n_effs.dat\"", ",", "plot", "=", "True", ",", ")", ":", "n_effs", "=", "[", "]", "for", "w", "in", "tqdm", ".", "tqdm", "(", "wavelength...
Solve for the effective indices of a fixed structure at different wavelengths. Args: structure (Slabs): The target structure to solve for modes. wavelengths (list): A list of wavelengths to sweep over. filename (str): The nominal filename to use when saving the effective indices. Defaults to 'wavelength_n_effs.dat'. plot (bool): `True` if plots should be generated, otherwise `False`. Default is `True`. Returns: list: A list of the effective indices found for each wavelength.
[ "Solve", "for", "the", "effective", "indices", "of", "a", "fixed", "structure", "at", "different", "wavelengths", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/mode_solver.py#L194-L243
train
41,537
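The sweep pattern in the record above (change the wavelength, re-solve, record the effective index) can be sketched standalone. `ToySolver` and its linear dispersion model below are hypothetical stand-ins for a real mode solver, not part of modesolverpy:

```python
# Sketch of the wavelength-sweep pattern used by solve_sweep_wavelength:
# change the wavelength, re-solve, and record the effective index.
# ToySolver and its linear dispersion model are hypothetical stand-ins.

class ToySolver:
    """Stand-in for a mode solver with a linear n_eff(wavelength) model."""

    def __init__(self, n0=2.4, slope=-0.5):
        self.n0 = n0
        self.slope = slope
        self.n_eff = None

    def solve(self, wavelength_um):
        # n_eff decreases roughly linearly with wavelength near 1.55 um.
        self.n_eff = self.n0 + self.slope * (wavelength_um - 1.55)
        return self.n_eff

def sweep_wavelength(solver, wavelengths):
    """Return one n_eff per wavelength, mirroring the sweep loop."""
    n_effs = []
    for w in wavelengths:
        solver.solve(w)
        n_effs.append(solver.n_eff)
    return n_effs

wavelengths = [1.50, 1.55, 1.60]
n_effs = sweep_wavelength(ToySolver(), wavelengths)
```

The real method additionally writes the results to file and plots them; the core loop is the same.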
jtambasco/modesolverpy
modesolverpy/structure_base.py
_AbstractStructure._add_material
def _add_material(self, x_bot_left, y_bot_left, x_top_right, y_top_right,
                  n_material, angle=0):
    '''
    A low-level function that allows writing a rectangle refractive
    index profile to a `Structure`.

    Args:
        x_bot_left (float): The bottom-left x-coordinate of the rectangle.
        y_bot_left (float): The bottom-left y-coordinate of the rectangle.
        x_top_right (float): The top-right x-coordinate of the rectangle.
        y_top_right (float): The top-right y-coordinate of the rectangle.
        n_material (float): The refractive index of the points encompassed
            by the defined rectangle.
        angle (float): The angle in degrees of the sidewalls of the defined
            rectangle.  Default is 0.  This is useful for creating a ridge
            with angled sidewalls.
    '''
    x_mask = np.logical_and(x_bot_left <= self.x, self.x <= x_top_right)
    y_mask = np.logical_and(y_bot_left <= self.y, self.y <= y_top_right)

    xy_mask = np.kron(y_mask, x_mask).reshape((y_mask.size, x_mask.size))
    self.n[xy_mask] = n_material

    if angle:
        self._add_triangular_sides(xy_mask, angle, y_top_right, y_bot_left,
                                   x_top_right, x_bot_left, n_material)

    return self.n
python
def _add_material(self, x_bot_left, y_bot_left, x_top_right, y_top_right,
                  n_material, angle=0):
    '''
    A low-level function that allows writing a rectangle refractive
    index profile to a `Structure`.

    Args:
        x_bot_left (float): The bottom-left x-coordinate of the rectangle.
        y_bot_left (float): The bottom-left y-coordinate of the rectangle.
        x_top_right (float): The top-right x-coordinate of the rectangle.
        y_top_right (float): The top-right y-coordinate of the rectangle.
        n_material (float): The refractive index of the points encompassed
            by the defined rectangle.
        angle (float): The angle in degrees of the sidewalls of the defined
            rectangle.  Default is 0.  This is useful for creating a ridge
            with angled sidewalls.
    '''
    x_mask = np.logical_and(x_bot_left <= self.x, self.x <= x_top_right)
    y_mask = np.logical_and(y_bot_left <= self.y, self.y <= y_top_right)

    xy_mask = np.kron(y_mask, x_mask).reshape((y_mask.size, x_mask.size))
    self.n[xy_mask] = n_material

    if angle:
        self._add_triangular_sides(xy_mask, angle, y_top_right, y_bot_left,
                                   x_top_right, x_bot_left, n_material)

    return self.n
[ "def", "_add_material", "(", "self", ",", "x_bot_left", ",", "y_bot_left", ",", "x_top_right", ",", "y_top_right", ",", "n_material", ",", "angle", "=", "0", ")", ":", "x_mask", "=", "np", ".", "logical_and", "(", "x_bot_left", "<=", "self", ".", "x", ",...
A low-level function that allows writing a rectangle refractive index profile to a `Structure`. Args: x_bot_left (float): The bottom-left x-coordinate of the rectangle. y_bot_left (float): The bottom-left y-coordinate of the rectangle. x_top_right (float): The top-right x-coordinate of the rectangle. y_top_right (float): The top-right y-coordinate of the rectangle. n_material (float): The refractive index of the points encompassed by the defined rectangle. angle (float): The angle in degrees of the sidewalls of the defined rectangle. Default is 0. This is useful for creating a ridge with angled sidewalls.
[ "A", "low", "-", "level", "function", "that", "allows", "writing", "a", "rectangle", "refractive", "index", "profile", "to", "a", "Structure", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/structure_base.py#L207-L239
train
41,538
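The `np.kron(y_mask, x_mask)` call above builds a 2D rectangle mask as the outer product of two 1D interval masks. A pure-Python sketch of the same idea (grid values below are illustrative, not from the library):

```python
# A 2D rectangle mask as the outer product of two 1D interval masks,
# mirroring the np.kron(y_mask, x_mask) trick in _add_material.

def rect_mask(xs, ys, x0, x1, y0, y1):
    """2D boolean mask that is True inside the rectangle [x0,x1] x [y0,y1]."""
    x_mask = [x0 <= x <= x1 for x in xs]
    y_mask = [y0 <= y <= y1 for y in ys]
    # Outer product: row i equals x_mask wherever y_mask[i] holds, else all False.
    return [[ym and xm for xm in x_mask] for ym in y_mask]

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [0.0, 0.5, 1.0]
mask = rect_mask(xs, ys, 0.5, 1.5, 0.0, 0.5)
```

Assigning `self.n[xy_mask] = n_material` then sets the refractive index only where the mask is True.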
jtambasco/modesolverpy
modesolverpy/structure_base.py
Slab.add_material
def add_material(self, x_min, x_max, n, angle=0):
    '''
    Add a refractive index between two x-points.

    Args:
        x_min (float): The start x-point.
        x_max (float): The stop x-point.
        n (float, function): Refractive index between `x_min` and `x_max`.
            Either a constant (`float`), or a function that accepts one
            parameter, the wavelength, and returns a float of the refractive
            index.  This is useful when doing wavelength sweeps and solving
            for the group velocity.  The function provided could be a
            Sellmeier equation.
        angle (float): Angle in degrees of the slope of the sidewalls at
            `x_min` and `x_max`.  This is useful for defining a ridge with
            angled sidewalls.
    '''
    self._mat_params.append([x_min, x_max, n, angle])

    if not callable(n):
        n_mat = lambda wl: n
    else:
        n_mat = n

    Structure._add_material(self, x_min, self.y_min, x_max, self.y_max,
                            n_mat(self._wl), angle)
    return self.n
python
def add_material(self, x_min, x_max, n, angle=0):
    '''
    Add a refractive index between two x-points.

    Args:
        x_min (float): The start x-point.
        x_max (float): The stop x-point.
        n (float, function): Refractive index between `x_min` and `x_max`.
            Either a constant (`float`), or a function that accepts one
            parameter, the wavelength, and returns a float of the refractive
            index.  This is useful when doing wavelength sweeps and solving
            for the group velocity.  The function provided could be a
            Sellmeier equation.
        angle (float): Angle in degrees of the slope of the sidewalls at
            `x_min` and `x_max`.  This is useful for defining a ridge with
            angled sidewalls.
    '''
    self._mat_params.append([x_min, x_max, n, angle])

    if not callable(n):
        n_mat = lambda wl: n
    else:
        n_mat = n

    Structure._add_material(self, x_min, self.y_min, x_max, self.y_max,
                            n_mat(self._wl), angle)
    return self.n
[ "def", "add_material", "(", "self", ",", "x_min", ",", "x_max", ",", "n", ",", "angle", "=", "0", ")", ":", "self", ".", "_mat_params", ".", "append", "(", "[", "x_min", ",", "x_max", ",", "n", ",", "angle", "]", ")", "if", "not", "callable", "("...
Add a refractive index between two x-points. Args: x_min (float): The start x-point. x_max (float): The stop x-point. n (float, function): Refractive index between `x_min` and `x_max`. Either a constant (`float`), or a function that accepts one parameter, the wavelength, and returns a float of the refractive index. This is useful when doing wavelength sweeps and solving for the group velocity. The function provided could be a Sellmeier equation. angle (float): Angle in degrees of the slope of the sidewalls at `x_min` and `x_max`. This is useful for defining a ridge with angled sidewalls.
[ "Add", "a", "refractive", "index", "between", "two", "x", "-", "points", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/structure_base.py#L479-L505
train
41,539
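`add_material` accepts either a constant index or a dispersion function `n(wl)`. The normalisation trick (wrap a constant in a lambda so the rest of the code can always call `n_mat(wavelength)`) can be shown in isolation; the index values below are illustrative:

```python
# Normalise a constant-or-callable refractive index into a callable n(wl),
# as add_material does before evaluating it at the current wavelength.

def normalise_index(n):
    """Return a callable n(wl) whether n is a constant or already callable."""
    return n if callable(n) else (lambda wl: n)

n_const = normalise_index(3.476)                                  # constant model
n_disp = normalise_index(lambda wl: 3.476 - 0.07 * (wl - 1.55))   # toy dispersion

n1 = n_const(1.55)   # constant: same value at any wavelength
n2 = n_disp(1.60)    # dispersive: evaluated at 1.60 um
```

This is what lets wavelength sweeps re-evaluate material indices at each step without special-casing constants.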
jtambasco/modesolverpy
modesolverpy/_mode_solver_lib.py
trapz2
def trapz2(f, x=None, y=None, dx=1.0, dy=1.0):
    """Double integrate."""
    return numpy.trapz(numpy.trapz(f, x=y, dx=dy), x=x, dx=dx)
python
def trapz2(f, x=None, y=None, dx=1.0, dy=1.0):
    """Double integrate."""
    return numpy.trapz(numpy.trapz(f, x=y, dx=dy), x=x, dx=dx)
[ "def", "trapz2", "(", "f", ",", "x", "=", "None", ",", "y", "=", "None", ",", "dx", "=", "1.0", ",", "dy", "=", "1.0", ")", ":", "return", "numpy", ".", "trapz", "(", "numpy", ".", "trapz", "(", "f", ",", "x", "=", "y", ",", "dx", "=", "d...
Double integrate.
[ "Double", "integrate", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/_mode_solver_lib.py#L22-L24
train
41,540
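`trapz2` composes two 1D trapezoidal rules: integrate along y for each fixed x, then integrate the results along x. A pure-Python sketch of the same composition, checked against f(x, y) = x·y on [0, 1] × [0, 1] (exact value 1/4):

```python
# Double integration by composing two 1D trapezoidal rules,
# mirroring trapz2's numpy.trapz(numpy.trapz(...)) structure.

def trapz1(values, xs):
    """1D trapezoidal rule over samples `values` at points `xs`."""
    total = 0.0
    for i in range(len(xs) - 1):
        total += 0.5 * (values[i] + values[i + 1]) * (xs[i + 1] - xs[i])
    return total

def trapz2(f, xs, ys):
    """Integrate along y for each fixed x, then integrate those results along x."""
    inner = [trapz1([f(x, y) for y in ys], ys) for x in xs]
    return trapz1(inner, xs)

n = 101
grid = [i / (n - 1) for i in range(n)]
result = trapz2(lambda x, y: x * y, grid, grid)
```

Because x·y is linear in each variable, the trapezoidal rule is exact here and `result` matches 0.25 up to rounding.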
jtambasco/modesolverpy
modesolverpy/_mode_solver_lib.py
_ModeSolverVectorial.solve
def solve(self, neigs=4, tol=0, guess=None, mode_profiles=True,
          initial_mode_guess=None):
    """
    This function finds the eigenmodes.

    Parameters
    ----------
    neigs : int
        number of eigenmodes to find
    tol : float
        Relative accuracy for eigenvalues.  The default value of 0 implies
        machine precision.
    guess : float
        a guess for the refractive index.  Only finds eigenvectors with an
        effective refractive index higher than this value.

    Returns
    -------
    self : an instance of the VFDModeSolver class
        obtain the fields of interest for specific modes using, for example:
        solver = EMpy.modesolvers.FD.VFDModeSolver(wavelength, x, y,
                                                   epsf, boundary).solve()
        Ex = solver.modes[0].Ex
        Ey = solver.modes[0].Ey
        Ez = solver.modes[0].Ez
    """
    from scipy.sparse.linalg import eigen

    self.nmodes = neigs
    self.tol = tol

    A = self.build_matrix()

    if guess is not None:
        # calculate shift for eigs function
        k = 2 * numpy.pi / self.wl
        shift = (guess * k) ** 2
    else:
        shift = None

    [eigvals, eigvecs] = eigen.eigs(A,
                                    k=neigs,
                                    which='LR',
                                    tol=0.001,
                                    ncv=None,
                                    v0=initial_mode_guess,
                                    return_eigenvectors=mode_profiles,
                                    sigma=shift)

    neffs = self.wl * scipy.sqrt(eigvals) / (2 * numpy.pi)
    if mode_profiles:
        Hxs = []
        Hys = []
        nx = self.nx
        ny = self.ny
        for ieig in range(neigs):
            Hxs.append(eigvecs[:nx * ny, ieig].reshape(nx, ny))
            Hys.append(eigvecs[nx * ny:, ieig].reshape(nx, ny))

    # sort the modes
    idx = numpy.flipud(numpy.argsort(neffs))
    neffs = neffs[idx]
    self.neff = neffs
    if mode_profiles:
        tmpx = []
        tmpy = []
        for i in idx:
            tmpx.append(Hxs[i])
            tmpy.append(Hys[i])
        Hxs = tmpx
        Hys = tmpy

        [Hzs, Exs, Eys, Ezs] = self.compute_other_fields(neffs, Hxs, Hys)

        self.modes = []
        for (neff, Hx, Hy, Hz, Ex, Ey, Ez) in zip(neffs, Hxs, Hys,
                                                  Hzs, Exs, Eys, Ezs):
            self.modes.append(
                FDMode(self.wl, self.x, self.y, neff,
                       Ey, Ex, Ez, Hy, Hx, Hz).normalize())

    return self
python
def solve(self, neigs=4, tol=0, guess=None, mode_profiles=True,
          initial_mode_guess=None):
    """
    This function finds the eigenmodes.

    Parameters
    ----------
    neigs : int
        number of eigenmodes to find
    tol : float
        Relative accuracy for eigenvalues.  The default value of 0 implies
        machine precision.
    guess : float
        a guess for the refractive index.  Only finds eigenvectors with an
        effective refractive index higher than this value.

    Returns
    -------
    self : an instance of the VFDModeSolver class
        obtain the fields of interest for specific modes using, for example:
        solver = EMpy.modesolvers.FD.VFDModeSolver(wavelength, x, y,
                                                   epsf, boundary).solve()
        Ex = solver.modes[0].Ex
        Ey = solver.modes[0].Ey
        Ez = solver.modes[0].Ez
    """
    from scipy.sparse.linalg import eigen

    self.nmodes = neigs
    self.tol = tol

    A = self.build_matrix()

    if guess is not None:
        # calculate shift for eigs function
        k = 2 * numpy.pi / self.wl
        shift = (guess * k) ** 2
    else:
        shift = None

    [eigvals, eigvecs] = eigen.eigs(A,
                                    k=neigs,
                                    which='LR',
                                    tol=0.001,
                                    ncv=None,
                                    v0=initial_mode_guess,
                                    return_eigenvectors=mode_profiles,
                                    sigma=shift)

    neffs = self.wl * scipy.sqrt(eigvals) / (2 * numpy.pi)
    if mode_profiles:
        Hxs = []
        Hys = []
        nx = self.nx
        ny = self.ny
        for ieig in range(neigs):
            Hxs.append(eigvecs[:nx * ny, ieig].reshape(nx, ny))
            Hys.append(eigvecs[nx * ny:, ieig].reshape(nx, ny))

    # sort the modes
    idx = numpy.flipud(numpy.argsort(neffs))
    neffs = neffs[idx]
    self.neff = neffs
    if mode_profiles:
        tmpx = []
        tmpy = []
        for i in idx:
            tmpx.append(Hxs[i])
            tmpy.append(Hys[i])
        Hxs = tmpx
        Hys = tmpy

        [Hzs, Exs, Eys, Ezs] = self.compute_other_fields(neffs, Hxs, Hys)

        self.modes = []
        for (neff, Hx, Hy, Hz, Ex, Ey, Ez) in zip(neffs, Hxs, Hys,
                                                  Hzs, Exs, Eys, Ezs):
            self.modes.append(
                FDMode(self.wl, self.x, self.y, neff,
                       Ey, Ex, Ez, Hy, Hx, Hz).normalize())

    return self
[ "def", "solve", "(", "self", ",", "neigs", "=", "4", ",", "tol", "=", "0", ",", "guess", "=", "None", ",", "mode_profiles", "=", "True", ",", "initial_mode_guess", "=", "None", ")", ":", "from", "scipy", ".", "sparse", ".", "linalg", "import", "eigen...
This function finds the eigenmodes. Parameters ---------- neigs : int number of eigenmodes to find tol : float Relative accuracy for eigenvalues. The default value of 0 implies machine precision. guess : float a guess for the refractive index. Only finds eigenvectors with an effective refractive index higher than this value. Returns ------- self : an instance of the VFDModeSolver class obtain the fields of interest for specific modes using, for example: solver = EMpy.modesolvers.FD.VFDModeSolver(wavelength, x, y, epsf, boundary).solve() Ex = solver.modes[0].Ex Ey = solver.modes[0].Ey Ez = solver.modes[0].Ez
[ "This", "function", "finds", "the", "eigenmodes", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/_mode_solver_lib.py#L926-L1003
train
41,541
jtambasco/modesolverpy
modesolverpy/design.py
grating_coupler_period
def grating_coupler_period(wavelength,
                           n_eff,
                           n_clad,
                           incidence_angle_deg,
                           diffration_order=1):
    '''
    Calculate the period needed for a grating coupler.

    Args:
        wavelength (float): The target wavelength for the grating coupler.
        n_eff (float): The effective index of the mode of a waveguide with
            the width of the grating coupler.
        n_clad (float): The refractive index of the cladding.
        incidence_angle_deg (float): The incidence angle the grating coupler
            should operate at [degrees].
        diffration_order (int): The grating order the coupler should work
            at.  Default is 1st order (1).

    Returns:
        float: The period needed for the grating coupler in the same units
        as the wavelength was given at.
    '''
    k0 = 2. * np.pi / wavelength
    beta = n_eff.real * k0
    n_inc = n_clad

    grating_period = (2.*np.pi*diffration_order) \
        / (beta - k0*n_inc*np.sin(np.radians(incidence_angle_deg)))

    return grating_period
python
def grating_coupler_period(wavelength,
                           n_eff,
                           n_clad,
                           incidence_angle_deg,
                           diffration_order=1):
    '''
    Calculate the period needed for a grating coupler.

    Args:
        wavelength (float): The target wavelength for the grating coupler.
        n_eff (float): The effective index of the mode of a waveguide with
            the width of the grating coupler.
        n_clad (float): The refractive index of the cladding.
        incidence_angle_deg (float): The incidence angle the grating coupler
            should operate at [degrees].
        diffration_order (int): The grating order the coupler should work
            at.  Default is 1st order (1).

    Returns:
        float: The period needed for the grating coupler in the same units
        as the wavelength was given at.
    '''
    k0 = 2. * np.pi / wavelength
    beta = n_eff.real * k0
    n_inc = n_clad

    grating_period = (2.*np.pi*diffration_order) \
        / (beta - k0*n_inc*np.sin(np.radians(incidence_angle_deg)))

    return grating_period
[ "def", "grating_coupler_period", "(", "wavelength", ",", "n_eff", ",", "n_clad", ",", "incidence_angle_deg", ",", "diffration_order", "=", "1", ")", ":", "k0", "=", "2.", "*", "np", ".", "pi", "/", "wavelength", "beta", "=", "n_eff", ".", "real", "*", "k...
Calculate the period needed for a grating coupler. Args: wavelength (float): The target wavelength for the grating coupler. n_eff (float): The effective index of the mode of a waveguide with the width of the grating coupler. n_clad (float): The refractive index of the cladding. incidence_angle_deg (float): The incidence angle the grating coupler should operate at [degrees]. diffration_order (int): The grating order the coupler should work at. Default is 1st order (1). Returns: float: The period needed for the grating coupler in the same units as the wavelength was given at.
[ "Calculate", "the", "period", "needed", "for", "a", "grating", "coupler", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/design.py#L29-L60
train
41,542
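The grating equation used above, Λ = 2π·m / (β − k0·n_clad·sin θ) with β = n_eff·k0, can be checked standalone with only the standard library. The index and angle values below are illustrative, not taken from the record:

```python
# Standalone check of the grating-coupler period equation.
# At normal incidence (theta = 0) it reduces to wavelength / n_eff.
import math

def grating_coupler_period(wavelength, n_eff, n_clad, incidence_angle_deg,
                           diffraction_order=1):
    k0 = 2.0 * math.pi / wavelength
    beta = n_eff * k0
    denom = beta - k0 * n_clad * math.sin(math.radians(incidence_angle_deg))
    return 2.0 * math.pi * diffraction_order / denom

# Illustrative values: 1.55 um wavelength, n_eff = 2.85, silica cladding, 8 deg.
period = grating_coupler_period(1.55, 2.85, 1.44, 8.0)
```

Tilting the fibre (θ > 0) shrinks the denominator and so lengthens the required period relative to λ/n_eff.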
toomore/goristock
grs/timeser.py
oop
def oop(aa):
    """ For cmd output. """
    return ('%s %s %s %.2f %+.2f %s %s %s %s %+.2f %s %s %.2f %.4f %.4f' % (
        aa.stock_no, aa.stock_name, aa.data_date[-1], aa.raw_data[-1],
        aa.range_per, aa.MAC(3), aa.MAC(6), aa.MAC(18), aa.MAO(3, 6)[1],
        aa.MAO(3, 6)[0][1][-1], aa.MAO(3, 6)[0][0], aa.RABC,
        aa.stock_vol[-1] / 1000, aa.SD, aa.CV)).encode('utf-8')
python
def oop(aa):
    """ For cmd output. """
    return ('%s %s %s %.2f %+.2f %s %s %s %s %+.2f %s %s %.2f %.4f %.4f' % (
        aa.stock_no, aa.stock_name, aa.data_date[-1], aa.raw_data[-1],
        aa.range_per, aa.MAC(3), aa.MAC(6), aa.MAC(18), aa.MAO(3, 6)[1],
        aa.MAO(3, 6)[0][1][-1], aa.MAO(3, 6)[0][0], aa.RABC,
        aa.stock_vol[-1] / 1000, aa.SD, aa.CV)).encode('utf-8')
[ "def", "oop", "(", "aa", ")", ":", "return", "(", "'%s %s %s %.2f %+.2f %s %s %s %s %+.2f %s %s %.2f %.4f %.4f'", "%", "(", "aa", ".", "stock_no", ",", "aa", ".", "stock_name", ",", "aa", ".", "data_date", "[", "-", "1", "]", ",", "aa", ".", "raw_data", "[...
For cmd output.
[ "For", "cmd", "output", "." ]
e61f57f11a626cfbc4afbf66337fd9d1c51e3e71
https://github.com/toomore/goristock/blob/e61f57f11a626cfbc4afbf66337fd9d1c51e3e71/grs/timeser.py#L25-L27
train
41,543
jtambasco/modesolverpy
modesolverpy/coupling_efficiency.py
reflection
def reflection(n1, n2):
    '''
    Calculate the power reflection at the interface of two refractive
    index materials.

    Args:
        n1 (float): Refractive index of material 1.
        n2 (float): Refractive index of material 2.

    Returns:
        float: The percentage of reflected power.
    '''
    r = abs((n1-n2) / (n1+n2))**2
    return r
python
def reflection(n1, n2):
    '''
    Calculate the power reflection at the interface of two refractive
    index materials.

    Args:
        n1 (float): Refractive index of material 1.
        n2 (float): Refractive index of material 2.

    Returns:
        float: The percentage of reflected power.
    '''
    r = abs((n1-n2) / (n1+n2))**2
    return r
[ "def", "reflection", "(", "n1", ",", "n2", ")", ":", "r", "=", "abs", "(", "(", "n1", "-", "n2", ")", "/", "(", "n1", "+", "n2", ")", ")", "**", "2", "return", "r" ]
Calculate the power reflection at the interface of two refractive index materials. Args: n1 (float): Refractive index of material 1. n2 (float): Refractive index of material 2. Returns: float: The percentage of reflected power.
[ "Calculate", "the", "power", "reflection", "at", "the", "interface", "of", "two", "refractive", "index", "materials", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/coupling_efficiency.py#L25-L38
train
41,544
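The formula in the `reflection` record is the Fresnel power reflection at normal incidence. A standalone check: for an air (n = 1.0) to glass (n = 1.5) interface the familiar result is 4% of the power reflected, and the formula is symmetric in the two indices:

```python
# Fresnel normal-incidence power reflection, as in the reflection record.

def reflection(n1, n2):
    """Fraction of power reflected at a planar interface, normal incidence."""
    return abs((n1 - n2) / (n1 + n2)) ** 2

r_air_glass = reflection(1.0, 1.5)   # the classic 4% air/glass reflection
r_symmetric = reflection(1.5, 1.0)   # identical: the formula is symmetric
```

Note the function returns a fraction in [0, 1), so "4%" appears as 0.04.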
jtambasco/modesolverpy
modesolverpy/coupling_efficiency.py
coupling_efficiency
def coupling_efficiency(mode_solver, fibre_mfd,
                        fibre_offset_x=0, fibre_offset_y=0,
                        n_eff_fibre=1.441):
    '''
    Finds the coupling efficiency between a solved fundamental mode and
    a fibre of given MFD.

    Args:
        mode_solver (_ModeSolver): Mode solver that has found a fundamental
            mode.
        fibre_mfd (float): The mode-field diameter (MFD) of the fibre.
        fibre_offset_x (float): Offset the fibre from the centre position
            of the window in x.  Default is 0 (no offset).
        fibre_offset_y (float): Offset the fibre from the centre position
            of the window in y.  Default is 0 (no offset).
        n_eff_fibre (float): The effective index of the fibre mode.
            Default is 1.441.

    Returns:
        float: The power coupling efficiency.
    '''
    etas = []

    gaus = _make_gaussian(mode_solver._structure.xc,
                          mode_solver._structure.yc,
                          fibre_mfd, fibre_offset_x, fibre_offset_y)

    for mode, n_eff in zip(mode_solver.modes, mode_solver.n_effs):
        o = abs(_overlap(mode, gaus))
        t = abs(transmission(n_eff, n_eff_fibre))
        eta = o * t
        etas.append(eta)

    return etas
python
def coupling_efficiency(mode_solver, fibre_mfd,
                        fibre_offset_x=0, fibre_offset_y=0,
                        n_eff_fibre=1.441):
    '''
    Finds the coupling efficiency between a solved fundamental mode and
    a fibre of given MFD.

    Args:
        mode_solver (_ModeSolver): Mode solver that has found a fundamental
            mode.
        fibre_mfd (float): The mode-field diameter (MFD) of the fibre.
        fibre_offset_x (float): Offset the fibre from the centre position
            of the window in x.  Default is 0 (no offset).
        fibre_offset_y (float): Offset the fibre from the centre position
            of the window in y.  Default is 0 (no offset).
        n_eff_fibre (float): The effective index of the fibre mode.
            Default is 1.441.

    Returns:
        float: The power coupling efficiency.
    '''
    etas = []

    gaus = _make_gaussian(mode_solver._structure.xc,
                          mode_solver._structure.yc,
                          fibre_mfd, fibre_offset_x, fibre_offset_y)

    for mode, n_eff in zip(mode_solver.modes, mode_solver.n_effs):
        o = abs(_overlap(mode, gaus))
        t = abs(transmission(n_eff, n_eff_fibre))
        eta = o * t
        etas.append(eta)

    return etas
[ "def", "coupling_efficiency", "(", "mode_solver", ",", "fibre_mfd", ",", "fibre_offset_x", "=", "0", ",", "fibre_offset_y", "=", "0", ",", "n_eff_fibre", "=", "1.441", ")", ":", "etas", "=", "[", "]", "gaus", "=", "_make_gaussian", "(", "mode_solver", ".", ...
Finds the coupling efficiency between a solved fundamental mode and a fibre of given MFD. Args: mode_solver (_ModeSolver): Mode solver that has found a fundamental mode. fibre_mfd (float): The mode-field diameter (MFD) of the fibre. fibre_offset_x (float): Offset the fibre from the centre position of the window in x. Default is 0 (no offset). fibre_offset_y (float): Offset the fibre from the centre position of the window in y. Default is 0 (no offset). n_eff_fibre (float): The effective index of the fibre mode. Default is 1.441. Returns: float: The power coupling efficiency.
[ "Finds", "the", "coupling", "efficiency", "between", "a", "solved", "fundamental", "mode", "and", "a", "fibre", "of", "given", "MFD", "." ]
85254a13b5aed2404187c52ac93b9b3ce99ee3a3
https://github.com/jtambasco/modesolverpy/blob/85254a13b5aed2404187c52ac93b9b3ce99ee3a3/modesolverpy/coupling_efficiency.py#L54-L89
train
41,545
wdecoster/nanolyse
nanolyse/NanoLyse.py
getIndex
def getIndex(reference):
    '''
    Find the reference folder using the location of the script file
    Create the index, test if successful
    '''
    if reference:
        reffas = reference
    else:
        parent_directory = path.dirname(path.abspath(path.dirname(__file__)))
        reffas = path.join(parent_directory, "reference/DNA_CS.fasta")
    if not path.isfile(reffas):
        logging.error("Could not find reference fasta for lambda genome.")
        sys.exit("Could not find reference fasta for lambda genome.")
    aligner = mp.Aligner(reffas, preset="map-ont")  # build index
    if not aligner:
        logging.error("Failed to load/build index")
        raise Exception("ERROR: failed to load/build index")
    return aligner
python
def getIndex(reference):
    '''
    Find the reference folder using the location of the script file
    Create the index, test if successful
    '''
    if reference:
        reffas = reference
    else:
        parent_directory = path.dirname(path.abspath(path.dirname(__file__)))
        reffas = path.join(parent_directory, "reference/DNA_CS.fasta")
    if not path.isfile(reffas):
        logging.error("Could not find reference fasta for lambda genome.")
        sys.exit("Could not find reference fasta for lambda genome.")
    aligner = mp.Aligner(reffas, preset="map-ont")  # build index
    if not aligner:
        logging.error("Failed to load/build index")
        raise Exception("ERROR: failed to load/build index")
    return aligner
[ "def", "getIndex", "(", "reference", ")", ":", "if", "reference", ":", "reffas", "=", "reference", "else", ":", "parent_directory", "=", "path", ".", "dirname", "(", "path", ".", "abspath", "(", "path", ".", "dirname", "(", "__file__", ")", ")", ")", "...
Find the reference folder using the location of the script file Create the index, test if successful
[ "Find", "the", "reference", "folder", "using", "the", "location", "of", "the", "script", "file", "Create", "the", "index", "test", "if", "successful" ]
026631b3a88097c91d84070f1cfc035c825d0878
https://github.com/wdecoster/nanolyse/blob/026631b3a88097c91d84070f1cfc035c825d0878/nanolyse/NanoLyse.py#L81-L98
train
41,546
openstack/proliantutils
proliantutils/ilo/ipmi.py
_exec_ipmitool
def _exec_ipmitool(driver_info, command):
    """Execute the ipmitool command.

    This uses the lanplus interface to communicate with the BMC device
    driver.

    :param driver_info: the ipmitool parameters for accessing a node.
    :param command: the ipmitool command to be executed.
    """
    ipmi_cmd = ("ipmitool -H %(address)s"
                " -I lanplus -U %(user)s -P %(passwd)s %(cmd)s"
                % {'address': driver_info['address'],
                   'user': driver_info['username'],
                   'passwd': driver_info['password'],
                   'cmd': command})
    out = None
    try:
        out = subprocess.check_output(ipmi_cmd, shell=True)
    except Exception:
        pass
    return out
python
def _exec_ipmitool(driver_info, command):
    """Execute the ipmitool command.

    This uses the lanplus interface to communicate with the BMC device
    driver.

    :param driver_info: the ipmitool parameters for accessing a node.
    :param command: the ipmitool command to be executed.
    """
    ipmi_cmd = ("ipmitool -H %(address)s"
                " -I lanplus -U %(user)s -P %(passwd)s %(cmd)s"
                % {'address': driver_info['address'],
                   'user': driver_info['username'],
                   'passwd': driver_info['password'],
                   'cmd': command})
    out = None
    try:
        out = subprocess.check_output(ipmi_cmd, shell=True)
    except Exception:
        pass
    return out
[ "def", "_exec_ipmitool", "(", "driver_info", ",", "command", ")", ":", "ipmi_cmd", "=", "(", "\"ipmitool -H %(address)s\"", "\" -I lanplus -U %(user)s -P %(passwd)s %(cmd)s\"", "%", "{", "'address'", ":", "driver_info", "[", "'address'", "]", ",", "'user'", ":", "driv...
Execute the ipmitool command. This uses the lanplus interface to communicate with the BMC device driver. :param driver_info: the ipmitool parameters for accessing a node. :param command: the ipmitool command to be executed.
[ "Execute", "the", "ipmitool", "command", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ipmi.py#L32-L53
train
41,547
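The `_exec_ipmitool` record interpolates credentials into a shell string and runs it with `shell=True`. A safer pattern, sketched below, builds an argument list instead, which sidesteps shell quoting and injection entirely. This is a hypothetical alternative, not the proliantutils implementation, and the command is only built here, never executed:

```python
# Build an argv list for ipmitool instead of a shell string.
# Credentials below are made up; nothing is executed.

def build_ipmitool_cmd(driver_info, command):
    """Return an argv list suitable for subprocess.run(argv, shell=False)."""
    return [
        "ipmitool",
        "-H", driver_info["address"],
        "-I", "lanplus",
        "-U", driver_info["username"],
        "-P", driver_info["password"],
    ] + command.split()

argv = build_ipmitool_cmd(
    {"address": "10.0.0.5", "username": "admin", "password": "s3cret"},
    "fru print",
)
```

With an argv list, a password containing spaces or shell metacharacters is passed through verbatim rather than re-parsed by a shell.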
openstack/proliantutils
proliantutils/ilo/ipmi.py
get_nic_capacity
def get_nic_capacity(driver_info, ilo_fw):
    """Gets the FRU data to see if it is NIC data

    Gets the FRU data in loop from 0-255 FRU Ids and check if the
    returned data is NIC data. Couldn't find any easy way to detect if
    it is NIC data. We shouldn't be hardcoding the FRU Id.

    :param driver_info: Contains the access credentials to access the BMC.
    :param ilo_fw: a tuple containing major and minor versions of firmware
    :returns: the max capacity supported by the NIC adapter.
    """
    i = 0x0
    value = None
    ilo_fw_rev = get_ilo_version(ilo_fw) or DEFAULT_FW_REV
    # Note(vmud213): iLO firmware versions >= 2.3 support reading the FRU
    # information in a single call instead of iterating over each FRU id.
    if ilo_fw_rev < MIN_SUGGESTED_FW_REV:
        for i in range(0xff):
            # Note(vmud213): We can discard FRU ID's between 0x6e and 0xee
            # as they don't contain any NIC related information
            if (i < 0x6e) or (i > 0xee):
                cmd = "fru print %s" % hex(i)
                out = _exec_ipmitool(driver_info, cmd)
                if out and 'port' in out and 'Adapter' in out:
                    value = _parse_ipmi_nic_capacity(out)
                    if value is not None:
                        break
            else:
                continue
    else:
        cmd = "fru print"
        out = _exec_ipmitool(driver_info, cmd)
        if out:
            for line in out.split('\n'):
                if line and 'port' in line and 'Adapter' in line:
                    value = _parse_ipmi_nic_capacity(line)
                    if value is not None:
                        break
    return value
python
def get_nic_capacity(driver_info, ilo_fw):
    """Gets the FRU data to see if it is NIC data

    Gets the FRU data in loop from 0-255 FRU Ids and check if the
    returned data is NIC data. Couldn't find any easy way to detect if
    it is NIC data. We shouldn't be hardcoding the FRU Id.

    :param driver_info: Contains the access credentials to access the BMC.
    :param ilo_fw: a tuple containing major and minor versions of firmware
    :returns: the max capacity supported by the NIC adapter.
    """
    i = 0x0
    value = None
    ilo_fw_rev = get_ilo_version(ilo_fw) or DEFAULT_FW_REV
    # Note(vmud213): iLO firmware versions >= 2.3 support reading the FRU
    # information in a single call instead of iterating over each FRU id.
    if ilo_fw_rev < MIN_SUGGESTED_FW_REV:
        for i in range(0xff):
            # Note(vmud213): We can discard FRU ID's between 0x6e and 0xee
            # as they don't contain any NIC related information
            if (i < 0x6e) or (i > 0xee):
                cmd = "fru print %s" % hex(i)
                out = _exec_ipmitool(driver_info, cmd)
                if out and 'port' in out and 'Adapter' in out:
                    value = _parse_ipmi_nic_capacity(out)
                    if value is not None:
                        break
            else:
                continue
    else:
        cmd = "fru print"
        out = _exec_ipmitool(driver_info, cmd)
        if out:
            for line in out.split('\n'):
                if line and 'port' in line and 'Adapter' in line:
                    value = _parse_ipmi_nic_capacity(line)
                    if value is not None:
                        break
    return value
[ "def", "get_nic_capacity", "(", "driver_info", ",", "ilo_fw", ")", ":", "i", "=", "0x0", "value", "=", "None", "ilo_fw_rev", "=", "get_ilo_version", "(", "ilo_fw", ")", "or", "DEFAULT_FW_REV", "# Note(vmud213): iLO firmware versions >= 2.3 support reading the FRU", "# i...
Gets the FRU data to see if it is NIC data Gets the FRU data in loop from 0-255 FRU Ids and check if the returned data is NIC data. Couldn't find any easy way to detect if it is NIC data. We shouldn't be hardcoding the FRU Id. :param driver_info: Contains the access credentials to access the BMC. :param ilo_fw: a tuple containing major and minor versions of firmware :returns: the max capacity supported by the NIC adapter.
[ "Gets", "the", "FRU", "data", "to", "see", "if", "it", "is", "NIC", "data" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ipmi.py#L76-L117
train
41,548
openstack/proliantutils
proliantutils/ilo/ipmi.py
_parse_ipmi_nic_capacity
def _parse_ipmi_nic_capacity(nic_out):
    """Parse the FRU output for NIC capacity

    Parses the FRU output. Searches for the key "Product Name" in FRU
    output and greps for maximum speed supported by the NIC adapter.

    :param nic_out: the FRU output for NIC adapter.
    :returns: the max capacity supported by the NIC adapter.
    """
    if (("Device not present" in nic_out)
            or ("Unknown FRU header" in nic_out) or not nic_out):
        return None

    capacity = None
    product_name = None
    data = nic_out.split('\n')
    for item in data:
        fields = item.split(':')
        if len(fields) > 1:
            first_field = fields[0].strip()
            if first_field == "Product Name":
                # Join the string back if the Product Name had some
                # ':' by any chance
                product_name = ':'.join(fields[1:])
                break

    if product_name:
        product_name_array = product_name.split(' ')
        for item in product_name_array:
            if 'Gb' in item:
                capacity_int = item.strip('Gb')
                if capacity_int.isdigit():
                    capacity = item
    return capacity
python
def _parse_ipmi_nic_capacity(nic_out):
    """Parse the FRU output for NIC capacity

    Parses the FRU output. Searches for the key "Product Name" in FRU
    output and greps for maximum speed supported by the NIC adapter.

    :param nic_out: the FRU output for NIC adapter.
    :returns: the max capacity supported by the NIC adapter.
    """
    if (("Device not present" in nic_out)
            or ("Unknown FRU header" in nic_out) or not nic_out):
        return None

    capacity = None
    product_name = None
    data = nic_out.split('\n')
    for item in data:
        fields = item.split(':')
        if len(fields) > 1:
            first_field = fields[0].strip()
            if first_field == "Product Name":
                # Join the string back if the Product Name had some
                # ':' by any chance
                product_name = ':'.join(fields[1:])
                break

    if product_name:
        product_name_array = product_name.split(' ')
        for item in product_name_array:
            if 'Gb' in item:
                capacity_int = item.strip('Gb')
                if capacity_int.isdigit():
                    capacity = item
    return capacity
[ "def", "_parse_ipmi_nic_capacity", "(", "nic_out", ")", ":", "if", "(", "(", "\"Device not present\"", "in", "nic_out", ")", "or", "(", "\"Unknown FRU header\"", "in", "nic_out", ")", "or", "not", "nic_out", ")", ":", "return", "None", "capacity", "=", "None",...
Parse the FRU output for NIC capacity Parses the FRU output. Searches for the key "Product Name" in FRU output and greps for maximum speed supported by the NIC adapter. :param nic_out: the FRU output for NIC adapter. :returns: the max capacity supported by the NIC adapter.
[ "Parse", "the", "FRU", "output", "for", "NIC", "capacity" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ipmi.py#L120-L156
train
41,549
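The capacity parser above scans FRU output for a "Product Name" field and pulls out a token like "10Gb". A minimal standalone sketch of the same two-stage parse (find the field, then find the capacity token); the FRU text below is made up, and the token test uses `endswith` rather than the original `strip('Gb')` check:

```python
# Two-stage FRU parse: locate the "Product Name" field, then pick out the
# capacity token such as "1Gb" or "10Gb".  Sample FRU text is fabricated.

def parse_nic_capacity(fru_out):
    """Return the 'NGb' token from the Product Name line, or None."""
    for line in fru_out.split("\n"):
        fields = line.split(":")
        if len(fields) > 1 and fields[0].strip() == "Product Name":
            # Re-join in case the product name itself contained ':'.
            product_name = ":".join(fields[1:])
            for token in product_name.split():
                if token.endswith("Gb") and token[:-2].isdigit():
                    return token
    return None

fru_text = (
    "Board Mfg           : HP\n"
    "Product Name        : HP Ethernet 1Gb 4-port 331i Adapter\n"
)
capacity = parse_nic_capacity(fru_text)
```

Splitting on ':' and re-joining the tail is what keeps a colon inside the product name from truncating it.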
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2016_05_31/table/_encryption.py
_extract_encryption_metadata
def _extract_encryption_metadata(entity, require_encryption, key_encryption_key, key_resolver):
    '''
    Extracts the encryption metadata from the given entity, setting them to be utf-8 strings.
    If no encryption metadata is present, will return None for all return values unless
    require_encryption is true, in which case the method will throw.

    :param entity:
        The entity being retrieved and decrypted. Could be a dict or an entity object.
    :param bool require_encryption:
        If set, will enforce that the retrieved entity is encrypted and decrypt it.
    :param object key_encryption_key:
        The user-provided key-encryption-key. Must implement the following methods:
        unwrap_key(key, algorithm)--returns the unwrapped form of the specified symmetric key
        using the string-specified algorithm.
        get_kid()--returns a string key id for this key-encryption-key.
    :param function key_resolver(kid):
        The user-provided key resolver. Uses the kid string to return a key-encryption-key
        implementing the interface defined above.
    :returns: a tuple containing the entity iv, the list of encrypted properties,
        the entity cek, and whether the entity was encrypted using JavaV1.
    :rtype: tuple (bytes[], list, bytes[], bool)
    '''
    _validate_not_none('entity', entity)

    try:
        encrypted_properties_list = _decode_base64_to_bytes(entity['_ClientEncryptionMetadata2'])
        encryption_data = entity['_ClientEncryptionMetadata1']
        encryption_data = _dict_to_encryption_data(loads(encryption_data))
    except Exception:
        # Message did not have properly formatted encryption metadata.
        if require_encryption:
            raise ValueError(_ERROR_ENTITY_NOT_ENCRYPTED)
        else:
            return (None, None, None, None)

    if not (encryption_data.encryption_agent.encryption_algorithm ==
            _EncryptionAlgorithm.AES_CBC_256):
        raise ValueError(_ERROR_UNSUPPORTED_ENCRYPTION_ALGORITHM)

    content_encryption_key = _validate_and_unwrap_cek(encryption_data, key_encryption_key,
                                                      key_resolver)

    # Special check for compatibility with Java V1 encryption protocol.
    isJavaV1 = (encryption_data.key_wrapping_metadata is None) or \
        ((encryption_data.encryption_agent.protocol == _ENCRYPTION_PROTOCOL_V1) and
         'EncryptionLibrary' in encryption_data.key_wrapping_metadata and
         'Java' in encryption_data.key_wrapping_metadata['EncryptionLibrary'])

    metadataIV = _generate_property_iv(encryption_data.content_encryption_IV,
                                       entity['PartitionKey'], entity['RowKey'],
                                       '_ClientEncryptionMetadata2', isJavaV1)

    cipher = _generate_AES_CBC_cipher(content_encryption_key, metadataIV)

    # Decrypt the data.
    decryptor = cipher.decryptor()
    encrypted_properties_list = decryptor.update(encrypted_properties_list) + decryptor.finalize()

    # Unpad the data.
    unpadder = PKCS7(128).unpadder()
    encrypted_properties_list = unpadder.update(encrypted_properties_list) + unpadder.finalize()

    encrypted_properties_list = encrypted_properties_list.decode('utf-8')

    if isJavaV1:
        # Strip the square braces from the ends and split string into list.
        encrypted_properties_list = encrypted_properties_list[1:-1]
        encrypted_properties_list = encrypted_properties_list.split(', ')
    else:
        encrypted_properties_list = loads(encrypted_properties_list)

    return (encryption_data.content_encryption_IV, encrypted_properties_list,
            content_encryption_key, isJavaV1)
python
def _extract_encryption_metadata(entity, require_encryption, key_encryption_key, key_resolver):
    '''
    Extracts the encryption metadata from the given entity, setting them to be utf-8 strings.
    If no encryption metadata is present, will return None for all return values unless
    require_encryption is true, in which case the method will throw.

    :param entity:
        The entity being retrieved and decrypted. Could be a dict or an entity object.
    :param bool require_encryption:
        If set, will enforce that the retrieved entity is encrypted and decrypt it.
    :param object key_encryption_key:
        The user-provided key-encryption-key. Must implement the following methods:
        unwrap_key(key, algorithm)--returns the unwrapped form of the specified symmetric key
        using the string-specified algorithm.
        get_kid()--returns a string key id for this key-encryption-key.
    :param function key_resolver(kid):
        The user-provided key resolver. Uses the kid string to return a key-encryption-key
        implementing the interface defined above.
    :returns: a tuple containing the entity iv, the list of encrypted properties, the entity cek,
        and whether the entity was encrypted using JavaV1.
    :rtype: tuple (bytes[], list, bytes[], bool)
    '''
    _validate_not_none('entity', entity)

    try:
        encrypted_properties_list = _decode_base64_to_bytes(entity['_ClientEncryptionMetadata2'])
        encryption_data = entity['_ClientEncryptionMetadata1']
        encryption_data = _dict_to_encryption_data(loads(encryption_data))
    except Exception as e:
        # Message did not have properly formatted encryption metadata.
        if require_encryption:
            raise ValueError(_ERROR_ENTITY_NOT_ENCRYPTED)
        else:
            return (None, None, None, None)

    if not (encryption_data.encryption_agent.encryption_algorithm ==
            _EncryptionAlgorithm.AES_CBC_256):
        raise ValueError(_ERROR_UNSUPPORTED_ENCRYPTION_ALGORITHM)

    content_encryption_key = _validate_and_unwrap_cek(encryption_data, key_encryption_key,
                                                      key_resolver)

    # Special check for compatibility with Java V1 encryption protocol.
    isJavaV1 = (encryption_data.key_wrapping_metadata is None) or \
               ((encryption_data.encryption_agent.protocol == _ENCRYPTION_PROTOCOL_V1) and
                'EncryptionLibrary' in encryption_data.key_wrapping_metadata and
                'Java' in encryption_data.key_wrapping_metadata['EncryptionLibrary'])

    metadataIV = _generate_property_iv(encryption_data.content_encryption_IV,
                                       entity['PartitionKey'], entity['RowKey'],
                                       '_ClientEncryptionMetadata2', isJavaV1)

    cipher = _generate_AES_CBC_cipher(content_encryption_key, metadataIV)

    # Decrypt the data.
    decryptor = cipher.decryptor()
    encrypted_properties_list = decryptor.update(encrypted_properties_list) + decryptor.finalize()

    # Unpad the data.
    unpadder = PKCS7(128).unpadder()
    encrypted_properties_list = unpadder.update(encrypted_properties_list) + unpadder.finalize()

    encrypted_properties_list = encrypted_properties_list.decode('utf-8')

    if isJavaV1:
        # Strip the square braces from the ends and split string into list.
        encrypted_properties_list = encrypted_properties_list[1:-1]
        encrypted_properties_list = encrypted_properties_list.split(', ')
    else:
        encrypted_properties_list = loads(encrypted_properties_list)

    return (encryption_data.content_encryption_IV, encrypted_properties_list,
            content_encryption_key, isJavaV1)
[ "def", "_extract_encryption_metadata", "(", "entity", ",", "require_encryption", ",", "key_encryption_key", ",", "key_resolver", ")", ":", "_validate_not_none", "(", "'entity'", ",", "entity", ")", "try", ":", "encrypted_properties_list", "=", "_decode_base64_to_bytes", ...
Extracts the encryption metadata from the given entity, setting them to be utf-8 strings. If no encryption metadata is present, will return None for all return values unless require_encryption is true, in which case the method will throw. :param entity: The entity being retrieved and decrypted. Could be a dict or an entity object. :param bool require_encryption: If set, will enforce that the retrieved entity is encrypted and decrypt it. :param object key_encryption_key: The user-provided key-encryption-key. Must implement the following methods: unwrap_key(key, algorithm)--returns the unwrapped form of the specified symmetric key using the string-specified algorithm. get_kid()--returns a string key id for this key-encryption-key. :param function key_resolver(kid): The user-provided key resolver. Uses the kid string to return a key-encryption-key implementing the interface defined above. :returns: a tuple containing the entity iv, the list of encrypted properties, the entity cek, and whether the entity was encrypted using JavaV1. :rtype: tuple (bytes[], list, bytes[], bool)
[ "Extracts", "the", "encryption", "metadata", "from", "the", "given", "entity", "setting", "them", "to", "be", "utf", "-", "8", "strings", ".", "If", "no", "encryption", "metadata", "is", "present", "will", "return", "None", "for", "all", "return", "values", ...
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2016_05_31/table/_encryption.py#L215-L284
train
41,550
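The record above branches on `isJavaV1` when parsing the decrypted property list: Java V1 clients wrote the list as a plain `"[a, b]"` string, while later clients wrote JSON. A pure-stdlib sketch of that branch, plus a hand-rolled PKCS7 unpad equivalent to the `PKCS7(128).unpadder()` step (the real code uses the `cryptography` package; `parse_property_list` and `pkcs7_unpad` are hypothetical names):

```python
import json


def parse_property_list(decoded_text, is_java_v1):
    """Mirror the two encodings handled by _extract_encryption_metadata."""
    if is_java_v1:
        # Strip the square braces from the ends and split on ", ".
        return decoded_text[1:-1].split(', ')
    return json.loads(decoded_text)


def pkcs7_unpad(data: bytes, block_size: int = 16) -> bytes:
    """Stdlib stand-in for cryptography's PKCS7(128).unpadder()."""
    pad = data[-1]
    if pad < 1 or pad > block_size or data[-pad:] != bytes([pad]) * pad:
        raise ValueError('invalid PKCS7 padding')
    return data[:-pad]


# Both encodings decode to the same list of encrypted property names.
print(parse_property_list('[RowKey, Prop1]', True))      # ['RowKey', 'Prop1']
print(parse_property_list('["RowKey", "Prop1"]', False))  # ['RowKey', 'Prop1']
```

Note the Java V1 split relies on the exact `", "` separator, which is why the original code applies it only when the `EncryptionLibrary` metadata identifies a Java client.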
openstack/proliantutils
proliantutils/redfish/resources/system/storage/array_controller.py
HPEArrayController.logical_drives
def logical_drives(self):
    """Gets the resource HPELogicalDriveCollection of ArrayControllers"""
    return logical_drive.HPELogicalDriveCollection(
        self._conn, utils.get_subresource_path_by(
            self, ['Links', 'LogicalDrives']),
        redfish_version=self.redfish_version)
python
def logical_drives(self):
    """Gets the resource HPELogicalDriveCollection of ArrayControllers"""
    return logical_drive.HPELogicalDriveCollection(
        self._conn, utils.get_subresource_path_by(
            self, ['Links', 'LogicalDrives']),
        redfish_version=self.redfish_version)
[ "def", "logical_drives", "(", "self", ")", ":", "return", "logical_drive", ".", "HPELogicalDriveCollection", "(", "self", ".", "_conn", ",", "utils", ".", "get_subresource_path_by", "(", "self", ",", "[", "'Links'", ",", "'LogicalDrives'", "]", ")", ",", "redf...
Gets the resource HPELogicalDriveCollection of ArrayControllers
[ "Gets", "the", "resource", "HPELogicalDriveCollection", "of", "ArrayControllers" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/resources/system/storage/array_controller.py#L47-L53
train
41,551
openstack/proliantutils
proliantutils/redfish/resources/system/storage/array_controller.py
HPEArrayController.physical_drives
def physical_drives(self):
    """Gets the resource HPEPhysicalDriveCollection of ArrayControllers"""
    return physical_drive.HPEPhysicalDriveCollection(
        self._conn, utils.get_subresource_path_by(
            self, ['Links', 'PhysicalDrives']),
        redfish_version=self.redfish_version)
python
def physical_drives(self):
    """Gets the resource HPEPhysicalDriveCollection of ArrayControllers"""
    return physical_drive.HPEPhysicalDriveCollection(
        self._conn, utils.get_subresource_path_by(
            self, ['Links', 'PhysicalDrives']),
        redfish_version=self.redfish_version)
[ "def", "physical_drives", "(", "self", ")", ":", "return", "physical_drive", ".", "HPEPhysicalDriveCollection", "(", "self", ".", "_conn", ",", "utils", ".", "get_subresource_path_by", "(", "self", ",", "[", "'Links'", ",", "'PhysicalDrives'", "]", ")", ",", "...
Gets the resource HPEPhysicalDriveCollection of ArrayControllers
[ "Gets", "the", "resource", "HPEPhysicalDriveCollection", "of", "ArrayControllers" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/resources/system/storage/array_controller.py#L57-L62
train
41,552
openstack/proliantutils
proliantutils/redfish/resources/system/storage/array_controller.py
HPEArrayControllerCollection.physical_drives_maximum_size_mib
def physical_drives_maximum_size_mib(self):
    """Gets the biggest disk

    :returns the size in MiB.
    """
    return utils.max_safe([member.physical_drives.maximum_size_mib
                           for member in self.get_members()])
python
def physical_drives_maximum_size_mib(self):
    """Gets the biggest disk

    :returns the size in MiB.
    """
    return utils.max_safe([member.physical_drives.maximum_size_mib
                           for member in self.get_members()])
[ "def", "physical_drives_maximum_size_mib", "(", "self", ")", ":", "return", "utils", ".", "max_safe", "(", "[", "member", ".", "physical_drives", ".", "maximum_size_mib", "for", "member", "in", "self", ".", "get_members", "(", ")", "]", ")" ]
Gets the biggest disk :returns the size in MiB.
[ "Gets", "the", "biggest", "disk" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/resources/system/storage/array_controller.py#L84-L90
train
41,553
openstack/proliantutils
proliantutils/redfish/resources/system/storage/array_controller.py
HPEArrayControllerCollection.array_controller_by_location
def array_controller_by_location(self, location):
    """Returns array controller instance by location

    :returns Instance of array controller
    """
    for member in self.get_members():
        if member.location == location:
            return member
python
def array_controller_by_location(self, location):
    """Returns array controller instance by location

    :returns Instance of array controller
    """
    for member in self.get_members():
        if member.location == location:
            return member
[ "def", "array_controller_by_location", "(", "self", ",", "location", ")", ":", "for", "member", "in", "self", ".", "get_members", "(", ")", ":", "if", "member", ".", "location", "==", "location", ":", "return", "member" ]
Returns array controller instance by location :returns Instance of array controller
[ "Returns", "array", "controller", "instance", "by", "location" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/resources/system/storage/array_controller.py#L142-L149
train
41,554
openstack/proliantutils
proliantutils/redfish/resources/system/storage/array_controller.py
HPEArrayControllerCollection.array_controller_by_model
def array_controller_by_model(self, model):
    """Returns array controller instance by model

    :returns Instance of array controller
    """
    for member in self.get_members():
        if member.model == model:
            return member
python
def array_controller_by_model(self, model):
    """Returns array controller instance by model

    :returns Instance of array controller
    """
    for member in self.get_members():
        if member.model == model:
            return member
[ "def", "array_controller_by_model", "(", "self", ",", "model", ")", ":", "for", "member", "in", "self", ".", "get_members", "(", ")", ":", "if", "member", ".", "model", "==", "model", ":", "return", "member" ]
Returns array controller instance by model :returns Instance of array controller
[ "Returns", "array", "controller", "instance", "by", "model" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/resources/system/storage/array_controller.py#L151-L158
train
41,555
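The two lookup methods above (`array_controller_by_location`, `array_controller_by_model`) follow the same pattern: linear scan over `get_members()`, returning the first attribute match, or `None` implicitly when nothing matches. A minimal self-contained sketch of that pattern (the `_Member`/`_Collection` classes and sample values are hypothetical, not proliantutils APIs):

```python
class _Member:
    """Stand-in for an HPEArrayController resource."""
    def __init__(self, location, model):
        self.location = location
        self.model = model


class _Collection:
    """Stand-in for HPEArrayControllerCollection."""
    def __init__(self, members):
        self._members = members

    def get_members(self):
        return list(self._members)

    def member_by(self, attr, value):
        # Generic form of array_controller_by_location / _by_model:
        # first member whose attribute matches; falls through to None.
        for member in self.get_members():
            if getattr(member, attr) == value:
                return member


controllers = _Collection([_Member('Slot 0', 'P408i-a'),
                           _Member('Slot 1', 'E208i-p')])
print(controllers.member_by('model', 'E208i-p').location)  # Slot 1
print(controllers.member_by('location', 'Slot 9'))         # None
```

Returning `None` implicitly (rather than raising) matches the original methods, so callers must guard against a missing controller.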
openstack/proliantutils
proliantutils/redfish/utils.py
get_subresource_path_by
def get_subresource_path_by(resource, subresource_path):
    """Helper function to find the resource path

    :param resource: ResourceBase instance from which the path is loaded.
    :param subresource_path: JSON field to fetch the value from.
        Either a string, or a list of strings in case of a nested field.
        It should also include the '@odata.id'
    :raises: MissingAttributeError, if required path is missing.
    :raises: ValueError, if path is empty.
    :raises: AttributeError, if json attr not found in resource
    """
    if isinstance(subresource_path, six.string_types):
        subresource_path = [subresource_path]
    elif not subresource_path:
        raise ValueError('"subresource_path" cannot be empty')

    body = resource.json
    for path_item in subresource_path:
        body = body.get(path_item, {})

    if not body:
        raise exception.MissingAttributeError(
            attribute='/'.join(subresource_path), resource=resource.path)
    if '@odata.id' not in body:
        raise exception.MissingAttributeError(
            attribute='/'.join(subresource_path) + '/@odata.id',
            resource=resource.path)
    return body['@odata.id']
python
def get_subresource_path_by(resource, subresource_path):
    """Helper function to find the resource path

    :param resource: ResourceBase instance from which the path is loaded.
    :param subresource_path: JSON field to fetch the value from.
        Either a string, or a list of strings in case of a nested field.
        It should also include the '@odata.id'
    :raises: MissingAttributeError, if required path is missing.
    :raises: ValueError, if path is empty.
    :raises: AttributeError, if json attr not found in resource
    """
    if isinstance(subresource_path, six.string_types):
        subresource_path = [subresource_path]
    elif not subresource_path:
        raise ValueError('"subresource_path" cannot be empty')

    body = resource.json
    for path_item in subresource_path:
        body = body.get(path_item, {})

    if not body:
        raise exception.MissingAttributeError(
            attribute='/'.join(subresource_path), resource=resource.path)
    if '@odata.id' not in body:
        raise exception.MissingAttributeError(
            attribute='/'.join(subresource_path)+'/@odata.id',
            resource=resource.path)
    return body['@odata.id']
[ "def", "get_subresource_path_by", "(", "resource", ",", "subresource_path", ")", ":", "if", "isinstance", "(", "subresource_path", ",", "six", ".", "string_types", ")", ":", "subresource_path", "=", "[", "subresource_path", "]", "elif", "not", "subresource_path", ...
Helper function to find the resource path :param resource: ResourceBase instance from which the path is loaded. :param subresource_path: JSON field to fetch the value from. Either a string, or a list of strings in case of a nested field. It should also include the '@odata.id' :raises: MissingAttributeError, if required path is missing. :raises: ValueError, if path is empty. :raises: AttributeError, if json attr not found in resource
[ "Helper", "function", "to", "find", "the", "resource", "path" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/utils.py#L30-L59
train
41,556
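The core of `get_subresource_path_by` is a nested-dict walk that ends by requiring an `@odata.id` member. A pure-dict sketch of that traversal, with stdlib exceptions standing in for proliantutils' `MissingAttributeError` (the function name, sample document, and URL are hypothetical):

```python
def get_odata_path(body, subresource_path):
    """Walk nested keys, then require '@odata.id' at the end of the walk."""
    if isinstance(subresource_path, str):
        subresource_path = [subresource_path]
    elif not subresource_path:
        raise ValueError('"subresource_path" cannot be empty')

    for path_item in subresource_path:
        # Missing intermediate keys collapse to {}, caught below.
        body = body.get(path_item, {})

    if not body:
        raise KeyError('/'.join(subresource_path))
    if '@odata.id' not in body:
        raise KeyError('/'.join(subresource_path) + '/@odata.id')
    return body['@odata.id']


doc = {'Links': {'LogicalDrives':
                 {'@odata.id': '/redfish/v1/Systems/1/LogicalDrives'}}}
print(get_odata_path(doc, ['Links', 'LogicalDrives']))
# /redfish/v1/Systems/1/LogicalDrives
```

Using `body.get(path_item, {})` instead of indexing means a missing intermediate key surfaces as a single "attribute missing" error at the end, rather than a `KeyError` mid-walk; that is the design choice the original makes.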
openstack/proliantutils
proliantutils/redfish/utils.py
get_supported_boot_mode
def get_supported_boot_mode(supported_boot_mode):
    """Return bios and uefi support.

    :param supported_boot_mode: Supported boot modes
    :return: A tuple of 'true'/'false' based on bios and uefi support
        respectively.
    """
    boot_mode_bios = 'false'
    boot_mode_uefi = 'false'
    if (supported_boot_mode == sys_cons.SUPPORTED_LEGACY_BIOS_ONLY):
        boot_mode_bios = 'true'
    elif (supported_boot_mode == sys_cons.SUPPORTED_UEFI_ONLY):
        boot_mode_uefi = 'true'
    elif (supported_boot_mode == sys_cons.SUPPORTED_LEGACY_BIOS_AND_UEFI):
        boot_mode_bios = 'true'
        boot_mode_uefi = 'true'

    return SupportedBootModes(boot_mode_bios=boot_mode_bios,
                              boot_mode_uefi=boot_mode_uefi)
python
def get_supported_boot_mode(supported_boot_mode):
    """Return bios and uefi support.

    :param supported_boot_mode: Supported boot modes
    :return: A tuple of 'true'/'false' based on bios and uefi support
        respectively.
    """
    boot_mode_bios = 'false'
    boot_mode_uefi = 'false'
    if (supported_boot_mode == sys_cons.SUPPORTED_LEGACY_BIOS_ONLY):
        boot_mode_bios = 'true'
    elif (supported_boot_mode == sys_cons.SUPPORTED_UEFI_ONLY):
        boot_mode_uefi = 'true'
    elif (supported_boot_mode == sys_cons.SUPPORTED_LEGACY_BIOS_AND_UEFI):
        boot_mode_bios = 'true'
        boot_mode_uefi = 'true'

    return SupportedBootModes(boot_mode_bios=boot_mode_bios,
                              boot_mode_uefi=boot_mode_uefi)
[ "def", "get_supported_boot_mode", "(", "supported_boot_mode", ")", ":", "boot_mode_bios", "=", "'false'", "boot_mode_uefi", "=", "'false'", "if", "(", "supported_boot_mode", "==", "sys_cons", ".", "SUPPORTED_LEGACY_BIOS_ONLY", ")", ":", "boot_mode_bios", "=", "'true'", ...
Return bios and uefi support. :param supported_boot_mode: Supported boot modes :return: A tuple of 'true'/'false' based on bios and uefi support respectively.
[ "Return", "bios", "and", "uefi", "support", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/utils.py#L62-L83
train
41,557
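`get_supported_boot_mode` maps a single enum-like constant onto a pair of string flags. A self-contained sketch with a `namedtuple` standing in for the real `SupportedBootModes` and hypothetical stand-ins for the `sys_cons` constants (the actual constant values in proliantutils may differ):

```python
from collections import namedtuple

SupportedBootModes = namedtuple('SupportedBootModes',
                                ['boot_mode_bios', 'boot_mode_uefi'])

# Hypothetical stand-ins for sys_cons.SUPPORTED_* constants.
LEGACY_ONLY, UEFI_ONLY, LEGACY_AND_UEFI = 0, 1, 2


def get_supported_boot_mode(supported_boot_mode):
    """Map one boot-mode constant to ('true'/'false', 'true'/'false')."""
    boot_mode_bios = 'false'
    boot_mode_uefi = 'false'
    if supported_boot_mode == LEGACY_ONLY:
        boot_mode_bios = 'true'
    elif supported_boot_mode == UEFI_ONLY:
        boot_mode_uefi = 'true'
    elif supported_boot_mode == LEGACY_AND_UEFI:
        boot_mode_bios = 'true'
        boot_mode_uefi = 'true'
    return SupportedBootModes(boot_mode_bios=boot_mode_bios,
                              boot_mode_uefi=boot_mode_uefi)


print(get_supported_boot_mode(LEGACY_AND_UEFI))
# SupportedBootModes(boot_mode_bios='true', boot_mode_uefi='true')
```

The string flags (rather than booleans) match the original, which feeds them into capability dictionaries that expect `'true'`/`'false'` text.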
openstack/proliantutils
proliantutils/redfish/utils.py
get_allowed_operations
def get_allowed_operations(resource, subresouce_path):
    """Helper function to get the HTTP allowed methods.

    :param resource: ResourceBase instance from which the path is loaded.
    :param subresource_path: JSON field to fetch the value from.
        Either a string, or a list of strings in case of a nested field.
    :returns: A list of allowed HTTP methods.
    """
    uri = get_subresource_path_by(resource, subresouce_path)
    response = resource._conn.get(path=uri)
    return response.headers['Allow']
python
def get_allowed_operations(resource, subresouce_path):
    """Helper function to get the HTTP allowed methods.

    :param resource: ResourceBase instance from which the path is loaded.
    :param subresource_path: JSON field to fetch the value from.
        Either a string, or a list of strings in case of a nested field.
    :returns: A list of allowed HTTP methods.
    """
    uri = get_subresource_path_by(resource, subresouce_path)
    response = resource._conn.get(path=uri)
    return response.headers['Allow']
[ "def", "get_allowed_operations", "(", "resource", ",", "subresouce_path", ")", ":", "uri", "=", "get_subresource_path_by", "(", "resource", ",", "subresouce_path", ")", "response", "=", "resource", ".", "_conn", ".", "get", "(", "path", "=", "uri", ")", "retur...
Helper function to get the HTTP allowed methods. :param resource: ResourceBase instance from which the path is loaded. :param subresource_path: JSON field to fetch the value from. Either a string, or a list of strings in case of a nested field. :returns: A list of allowed HTTP methods.
[ "Helper", "function", "to", "get", "the", "HTTP", "allowed", "methods", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/utils.py#L86-L96
train
41,558
openstack/proliantutils
proliantutils/redfish/connector.py
HPEConnector._op
def _op(self, method, path='', data=None, headers=None):
    """Overrides the base method to support retrying the operation.

    :param method: The HTTP method to be used, e.g: GET, POST, PUT,
        PATCH, etc...
    :param path: The sub-URI path to the resource.
    :param data: Optional JSON data.
    :param headers: Optional dictionary of headers.
    :returns: The response from the connector.Connector's _op method.
    """
    resp = super(HPEConnector, self)._op(method, path, data, headers,
                                         allow_redirects=False)
    # With IPv6, Gen10 server gives redirection response with new path with
    # a prefix of '/' so this check is required
    if resp.status_code == 308:
        path = urlparse(resp.headers['Location']).path
        resp = super(HPEConnector, self)._op(method, path, data, headers)
    return resp
python
def _op(self, method, path='', data=None, headers=None):
    """Overrides the base method to support retrying the operation.

    :param method: The HTTP method to be used, e.g: GET, POST, PUT,
        PATCH, etc...
    :param path: The sub-URI path to the resource.
    :param data: Optional JSON data.
    :param headers: Optional dictionary of headers.
    :returns: The response from the connector.Connector's _op method.
    """
    resp = super(HPEConnector, self)._op(method, path, data, headers,
                                         allow_redirects=False)
    # With IPv6, Gen10 server gives redirection response with new path with
    # a prefix of '/' so this check is required
    if resp.status_code == 308:
        path = urlparse(resp.headers['Location']).path
        resp = super(HPEConnector, self)._op(method, path, data, headers)
    return resp
[ "def", "_op", "(", "self", ",", "method", ",", "path", "=", "''", ",", "data", "=", "None", ",", "headers", "=", "None", ")", ":", "resp", "=", "super", "(", "HPEConnector", ",", "self", ")", ".", "_op", "(", "method", ",", "path", ",", "data", ...
Overrides the base method to support retrying the operation. :param method: The HTTP method to be used, e.g: GET, POST, PUT, PATCH, etc... :param path: The sub-URI path to the resource. :param data: Optional JSON data. :param headers: Optional dictionary of headers. :returns: The response from the connector.Connector's _op method.
[ "Overrides", "the", "base", "method", "to", "support", "retrying", "the", "operation", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/connector.py#L38-L55
train
41,559
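`HPEConnector._op` disables automatic redirects, then on a 308 retries once against only the *path* component of the `Location` header, which is how it copes with IPv6 Gen10 servers redirecting to an absolute URL. A stdlib sketch of that shape (the `follow_308_once` helper and the dict-based fake transport are hypothetical; the real code delegates to sushy's `Connector._op`):

```python
from urllib.parse import urlparse


def follow_308_once(do_request, method, path, data=None, headers=None):
    """Issue a request without auto-redirects; on 308, retry once
    against the path component of the Location header."""
    resp = do_request(method, path, data, headers, allow_redirects=False)
    if resp['status_code'] == 308:
        path = urlparse(resp['headers']['Location']).path
        resp = do_request(method, path, data, headers, allow_redirects=True)
    return resp


def fake_transport(method, path, data, headers, allow_redirects):
    """Fake transport: the old path permanently redirects to an
    IPv6-host absolute URL, mimicking the Gen10 behaviour."""
    if path == '/redfish/v1/old':
        return {'status_code': 308,
                'headers': {'Location': 'https://[fe80::1]/redfish/v1/new'}}
    return {'status_code': 200, 'headers': {}, 'path': path}


resp = follow_308_once(fake_transport, 'GET', '/redfish/v1/old')
print(resp['status_code'], resp['path'])  # 200 /redfish/v1/new
```

Taking only `urlparse(...).path` keeps the retry on the already-open connection instead of re-resolving the literal IPv6 host from the redirect target.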
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.generate_file_shared_access_signature
def generate_file_shared_access_signature(self, share_name,
                                          directory_name=None,
                                          file_name=None,
                                          permission=None,
                                          expiry=None,
                                          start=None,
                                          id=None,
                                          ip=None,
                                          protocol=None,
                                          cache_control=None,
                                          content_disposition=None,
                                          content_encoding=None,
                                          content_language=None,
                                          content_type=None):
    '''
    Generates a shared access signature for the file.
    Use the returned signature with the sas_token parameter of FileService.

    :param str share_name:
        Name of share.
    :param str directory_name:
        Name of directory. SAS tokens cannot be created for directories, so
        this parameter should only be present if file_name is provided.
    :param str file_name:
        Name of file.
    :param FilePermissions permission:
        The permissions associated with the shared access signature. The
        user is restricted to operations allowed by the permissions.
        Permissions must be ordered read, create, write, delete, list.
        Required unless an id is given referencing a stored access policy
        which contains this field. This field must be omitted if it has been
        specified in an associated stored access policy.
    :param expiry:
        The time at which the shared access signature becomes invalid.
        Required unless an id is given referencing a stored access policy
        which contains this field. This field must be omitted if it has
        been specified in an associated stored access policy. Azure will
        always convert values to UTC. If a date is passed in without timezone
        info, it is assumed to be UTC.
    :type expiry: date or str
    :param start:
        The time at which the shared access signature becomes valid. If
        omitted, start time for this call is assumed to be the time when the
        storage service receives the request. Azure will always convert
        values to UTC. If a date is passed in without timezone info, it is
        assumed to be UTC.
    :type start: date or str
    :param str id:
        A unique value up to 64 characters in length that correlates to a
        stored access policy. To create a stored access policy, use
        set_file_service_properties.
    :param str ip:
        Specifies an IP address or a range of IP addresses from which to
        accept requests. If the IP address from which the request originates
        does not match the IP address or address range specified on the SAS
        token, the request is not authenticated. For example, specifying
        sip=168.1.5.65 or sip=168.1.5.60-168.1.5.70 on the SAS restricts the
        request to those IP addresses.
    :param str protocol:
        Specifies the protocol permitted for a request made. Possible values
        are both HTTPS and HTTP (https,http) or HTTPS only (https). The
        default value is https,http. Note that HTTP only is not a permitted
        value.
    :param str cache_control:
        Response header value for Cache-Control when resource is accessed
        using this shared access signature.
    :param str content_disposition:
        Response header value for Content-Disposition when resource is
        accessed using this shared access signature.
    :param str content_encoding:
        Response header value for Content-Encoding when resource is accessed
        using this shared access signature.
    :param str content_language:
        Response header value for Content-Language when resource is accessed
        using this shared access signature.
    :param str content_type:
        Response header value for Content-Type when resource is accessed
        using this shared access signature.
    :return: A Shared Access Signature (sas) token.
    :rtype: str
    '''
    _validate_not_none('share_name', share_name)
    _validate_not_none('file_name', file_name)
    _validate_not_none('self.account_name', self.account_name)
    _validate_not_none('self.account_key', self.account_key)

    sas = SharedAccessSignature(self.account_name, self.account_key)
    return sas.generate_file(
        share_name,
        directory_name,
        file_name,
        permission,
        expiry,
        start=start,
        id=id,
        ip=ip,
        protocol=protocol,
        cache_control=cache_control,
        content_disposition=content_disposition,
        content_encoding=content_encoding,
        content_language=content_language,
        content_type=content_type,
    )
python
def generate_file_shared_access_signature(self, share_name,
                                          directory_name=None,
                                          file_name=None,
                                          permission=None,
                                          expiry=None,
                                          start=None,
                                          id=None,
                                          ip=None,
                                          protocol=None,
                                          cache_control=None,
                                          content_disposition=None,
                                          content_encoding=None,
                                          content_language=None,
                                          content_type=None):
    '''
    Generates a shared access signature for the file.
    Use the returned signature with the sas_token parameter of FileService.

    :param str share_name:
        Name of share.
    :param str directory_name:
        Name of directory. SAS tokens cannot be created for directories, so
        this parameter should only be present if file_name is provided.
    :param str file_name:
        Name of file.
    :param FilePermissions permission:
        The permissions associated with the shared access signature. The
        user is restricted to operations allowed by the permissions.
        Permissions must be ordered read, create, write, delete, list.
        Required unless an id is given referencing a stored access policy
        which contains this field. This field must be omitted if it has been
        specified in an associated stored access policy.
    :param expiry:
        The time at which the shared access signature becomes invalid.
        Required unless an id is given referencing a stored access policy
        which contains this field. This field must be omitted if it has
        been specified in an associated stored access policy. Azure will
        always convert values to UTC. If a date is passed in without timezone
        info, it is assumed to be UTC.
    :type expiry: date or str
    :param start:
        The time at which the shared access signature becomes valid. If
        omitted, start time for this call is assumed to be the time when the
        storage service receives the request. Azure will always convert
        values to UTC. If a date is passed in without timezone info, it is
        assumed to be UTC.
    :type start: date or str
    :param str id:
        A unique value up to 64 characters in length that correlates to a
        stored access policy. To create a stored access policy, use
        set_file_service_properties.
    :param str ip:
        Specifies an IP address or a range of IP addresses from which to
        accept requests. If the IP address from which the request originates
        does not match the IP address or address range specified on the SAS
        token, the request is not authenticated. For example, specifying
        sip=168.1.5.65 or sip=168.1.5.60-168.1.5.70 on the SAS restricts the
        request to those IP addresses.
    :param str protocol:
        Specifies the protocol permitted for a request made. Possible values
        are both HTTPS and HTTP (https,http) or HTTPS only (https). The
        default value is https,http. Note that HTTP only is not a permitted
        value.
    :param str cache_control:
        Response header value for Cache-Control when resource is accessed
        using this shared access signature.
    :param str content_disposition:
        Response header value for Content-Disposition when resource is
        accessed using this shared access signature.
    :param str content_encoding:
        Response header value for Content-Encoding when resource is accessed
        using this shared access signature.
    :param str content_language:
        Response header value for Content-Language when resource is accessed
        using this shared access signature.
    :param str content_type:
        Response header value for Content-Type when resource is accessed
        using this shared access signature.
    :return: A Shared Access Signature (sas) token.
    :rtype: str
    '''
    _validate_not_none('share_name', share_name)
    _validate_not_none('file_name', file_name)
    _validate_not_none('self.account_name', self.account_name)
    _validate_not_none('self.account_key', self.account_key)

    sas = SharedAccessSignature(self.account_name, self.account_key)
    return sas.generate_file(
        share_name,
        directory_name,
        file_name,
        permission,
        expiry,
        start=start,
        id=id,
        ip=ip,
        protocol=protocol,
        cache_control=cache_control,
        content_disposition=content_disposition,
        content_encoding=content_encoding,
        content_language=content_language,
        content_type=content_type,
    )
[ "def", "generate_file_shared_access_signature", "(", "self", ",", "share_name", ",", "directory_name", "=", "None", ",", "file_name", "=", "None", ",", "permission", "=", "None", ",", "expiry", "=", "None", ",", "start", "=", "None", ",", "id", "=", "None", ...
Generates a shared access signature for the file. Use the returned signature with the sas_token parameter of FileService. :param str share_name: Name of share. :param str directory_name: Name of directory. SAS tokens cannot be created for directories, so this parameter should only be present if file_name is provided. :param str file_name: Name of file. :param FilePermissions permission: The permissions associated with the shared access signature. The user is restricted to operations allowed by the permissions. Permissions must be ordered read, create, write, delete, list. Required unless an id is given referencing a stored access policy which contains this field. This field must be omitted if it has been specified in an associated stored access policy. :param expiry: The time at which the shared access signature becomes invalid. Required unless an id is given referencing a stored access policy which contains this field. This field must be omitted if it has been specified in an associated stored access policy. Azure will always convert values to UTC. If a date is passed in without timezone info, it is assumed to be UTC. :type expiry: date or str :param start: The time at which the shared access signature becomes valid. If omitted, start time for this call is assumed to be the time when the storage service receives the request. Azure will always convert values to UTC. If a date is passed in without timezone info, it is assumed to be UTC. :type start: date or str :param str id: A unique value up to 64 characters in length that correlates to a stored access policy. To create a stored access policy, use set_file_service_properties. :param str ip: Specifies an IP address or a range of IP addresses from which to accept requests. If the IP address from which the request originates does not match the IP address or address range specified on the SAS token, the request is not authenticated. 
For example, specifying sip=168.1.5.65 or sip=168.1.5.60-168.1.5.70 on the SAS restricts the request to those IP addresses. :param str protocol: Specifies the protocol permitted for a request made. Possible values are both HTTPS and HTTP (https,http) or HTTPS only (https). The default value is https,http. Note that HTTP only is not a permitted value. :param str cache_control: Response header value for Cache-Control when resource is accessed using this shared access signature. :param str content_disposition: Response header value for Content-Disposition when resource is accessed using this shared access signature. :param str content_encoding: Response header value for Content-Encoding when resource is accessed using this shared access signature. :param str content_language: Response header value for Content-Language when resource is accessed using this shared access signature. :param str content_type: Response header value for Content-Type when resource is accessed using this shared access signature. :return: A Shared Access Signature (sas) token. :rtype: str
[ "Generates", "a", "shared", "access", "signature", "for", "the", "file", ".", "Use", "the", "returned", "signature", "with", "the", "sas_token", "parameter", "of", "FileService", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L342-L442
train
41,560
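The record above documents the parameters of a shared-access-signature generator but not the signing itself. As a rough sketch (the exact newline-joined field order and canonicalization depend on the storage service version, so the `string_to_sign` below is illustrative only), a SAS signature is an HMAC-SHA256 over a canonical string, keyed with the base64-decoded account key:

```python
import base64
import hmac
from hashlib import sha256

def sign_sas(account_key_b64, string_to_sign):
    # SAS tokens are authenticated with an HMAC-SHA256 over a canonical
    # newline-joined "string to sign", keyed by the account key.
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode('utf-8'), sha256).digest()
    return base64.b64encode(digest).decode('utf-8')

# Illustrative field order only: permissions, start, expiry, canonical
# resource path, identifier, IP range. Not the authoritative layout.
string_to_sign = '\n'.join(['r', '', '2015-04-06T00:00:00Z',
                            '/file/myaccount/myshare/myfile', '', ''])
token_sig = sign_sas(base64.b64encode(b'secret').decode(), string_to_sign)
```

The signature is deterministic for a given key and string, which is what lets the service recompute and verify it on each request.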
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.get_share_properties
def get_share_properties(self, share_name, timeout=None): ''' Returns all user-defined metadata and system properties for the specified share. The data returned does not include the share's list of files or directories. :param str share_name: Name of existing share. :param int timeout: The timeout parameter is expressed in seconds. :return: A Share that exposes properties and metadata. :rtype: :class:`.Share` ''' _validate_not_none('share_name', share_name) request = HTTPRequest() request.method = 'GET' request.host = self._get_host() request.path = _get_path(share_name) request.query = [ ('restype', 'share'), ('timeout', _int_to_str(timeout)), ] response = self._perform_request(request) return _parse_share(share_name, response)
python
def get_share_properties(self, share_name, timeout=None): ''' Returns all user-defined metadata and system properties for the specified share. The data returned does not include the share's list of files or directories. :param str share_name: Name of existing share. :param int timeout: The timeout parameter is expressed in seconds. :return: A Share that exposes properties and metadata. :rtype: :class:`.Share` ''' _validate_not_none('share_name', share_name) request = HTTPRequest() request.method = 'GET' request.host = self._get_host() request.path = _get_path(share_name) request.query = [ ('restype', 'share'), ('timeout', _int_to_str(timeout)), ] response = self._perform_request(request) return _parse_share(share_name, response)
[ "def", "get_share_properties", "(", "self", ",", "share_name", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'share_name'", ",", "share_name", ")", "request", "=", "HTTPRequest", "(", ")", "request", ".", "method", "=", "'GET'", "request",...
Returns all user-defined metadata and system properties for the specified share. The data returned does not include the share's list of files or directories. :param str share_name: Name of existing share. :param int timeout: The timeout parameter is expressed in seconds. :return: A Share that exposes properties and metadata. :rtype: :class:`.Share`
[ "Returns", "all", "user", "-", "defined", "metadata", "and", "system", "properties", "for", "the", "specified", "share", ".", "The", "data", "returned", "does", "not", "include", "the", "shares", "s", "list", "of", "files", "or", "directories", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L630-L654
train
41,561
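The request construction in `get_share_properties` builds the query as a list of `(name, value)` pairs, with `_int_to_str(timeout)` serializing the optional timeout. A minimal stand-in for that pattern (assuming, as the internal serializer presumably does, that `None`-valued parameters are dropped before encoding) looks like:

```python
from urllib.parse import urlencode

def build_query(params):
    # Serialize values to strings and drop parameters whose value is
    # None (e.g. an unset timeout), so they never reach the wire.
    filtered = [(k, str(v)) for k, v in params if v is not None]
    return urlencode(filtered)

no_timeout = build_query([('restype', 'share'), ('timeout', None)])
with_timeout = build_query([('restype', 'share'), ('timeout', 30)])
```

`no_timeout` contains only `restype=share`, while `with_timeout` also carries `timeout=30`.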
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.get_share_metadata
def get_share_metadata(self, share_name, timeout=None): ''' Returns all user-defined metadata for the specified share. :param str share_name: Name of existing share. :param int timeout: The timeout parameter is expressed in seconds. :return: A dictionary representing the share metadata name, value pairs. :rtype: a dict mapping str to str ''' _validate_not_none('share_name', share_name) request = HTTPRequest() request.method = 'GET' request.host = self._get_host() request.path = _get_path(share_name) request.query = [ ('restype', 'share'), ('comp', 'metadata'), ('timeout', _int_to_str(timeout)), ] response = self._perform_request(request) return _parse_metadata(response)
python
def get_share_metadata(self, share_name, timeout=None): ''' Returns all user-defined metadata for the specified share. :param str share_name: Name of existing share. :param int timeout: The timeout parameter is expressed in seconds. :return: A dictionary representing the share metadata name, value pairs. :rtype: a dict mapping str to str ''' _validate_not_none('share_name', share_name) request = HTTPRequest() request.method = 'GET' request.host = self._get_host() request.path = _get_path(share_name) request.query = [ ('restype', 'share'), ('comp', 'metadata'), ('timeout', _int_to_str(timeout)), ] response = self._perform_request(request) return _parse_metadata(response)
[ "def", "get_share_metadata", "(", "self", ",", "share_name", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'share_name'", ",", "share_name", ")", "request", "=", "HTTPRequest", "(", ")", "request", ".", "method", "=", "'GET'", "request", ...
Returns all user-defined metadata for the specified share. :param str share_name: Name of existing share. :param int timeout: The timeout parameter is expressed in seconds. :return: A dictionary representing the share metadata name, value pairs. :rtype: a dict mapping str to str
[ "Returns", "all", "user", "-", "defined", "metadata", "for", "the", "specified", "share", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L683-L707
train
41,562
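`get_share_metadata` hands the response to `_parse_metadata`, whose body is not shown in this record. Azure surfaces user-defined metadata as `x-ms-meta-<name>` response headers, so a plausible (hypothetical) reconstruction of that parsing step is:

```python
def parse_metadata(headers):
    # Hypothetical stand-in for _parse_metadata: keep only the
    # 'x-ms-meta-' response headers and strip the prefix to recover
    # the user-defined metadata names.
    prefix = 'x-ms-meta-'
    return {k[len(prefix):]: v for k, v in headers.items()
            if k.lower().startswith(prefix)}

meta = parse_metadata({'x-ms-meta-category': 'images',
                       'Last-Modified': 'Wed, 01 Jan 2020 00:00:00 GMT'})
```

Only the metadata pair survives; system headers such as `Last-Modified` are filtered out.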
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.set_file_properties
def set_file_properties(self, share_name, directory_name, file_name, content_settings, timeout=None): ''' Sets system properties on the file. If one property is set for the content_settings, all properties will be overridden. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param ~azure.storage.file.models.ContentSettings content_settings: ContentSettings object used to set the file properties. :param int timeout: The timeout parameter is expressed in seconds. ''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) _validate_not_none('content_settings', content_settings) request = HTTPRequest() request.method = 'PUT' request.host = self._get_host() request.path = _get_path(share_name, directory_name, file_name) request.query = [ ('comp', 'properties'), ('timeout', _int_to_str(timeout)), ] request.headers = None request.headers = content_settings._to_headers() self._perform_request(request)
python
def set_file_properties(self, share_name, directory_name, file_name, content_settings, timeout=None): ''' Sets system properties on the file. If one property is set for the content_settings, all properties will be overridden. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param ~azure.storage.file.models.ContentSettings content_settings: ContentSettings object used to set the file properties. :param int timeout: The timeout parameter is expressed in seconds. ''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) _validate_not_none('content_settings', content_settings) request = HTTPRequest() request.method = 'PUT' request.host = self._get_host() request.path = _get_path(share_name, directory_name, file_name) request.query = [ ('comp', 'properties'), ('timeout', _int_to_str(timeout)), ] request.headers = None request.headers = content_settings._to_headers() self._perform_request(request)
[ "def", "set_file_properties", "(", "self", ",", "share_name", ",", "directory_name", ",", "file_name", ",", "content_settings", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'share_name'", ",", "share_name", ")", "_validate_not_none", "(", "'f...
Sets system properties on the file. If one property is set for the content_settings, all properties will be overridden. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param ~azure.storage.file.models.ContentSettings content_settings: ContentSettings object used to set the file properties. :param int timeout: The timeout parameter is expressed in seconds.
[ "Sets", "system", "properties", "on", "the", "file", ".", "If", "one", "property", "is", "set", "for", "the", "content_settings", "all", "properties", "will", "be", "overriden", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L1218-L1249
train
41,563
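Because `set_file_properties` overrides every property once any one of them is set, a safe partial update reads the current settings first, changes one field, and writes the full set back. A sketch of that read-modify-write pattern, using a hypothetical `ContentSettingsish` dataclass standing in for `azure.storage.file.models.ContentSettings`:

```python
from dataclasses import dataclass, replace

@dataclass
class ContentSettingsish:
    # Hypothetical stand-in for ContentSettings; real instances carry
    # more fields (content_encoding, content_language, ...).
    content_type: str = None
    cache_control: str = None

# Read the current settings (e.g. via get_file_properties), then copy
# them with only the field you mean to change replaced.
current = ContentSettingsish(content_type='text/plain',
                             cache_control='max-age=3600')
updated = replace(current, content_type='application/json')
```

Writing `updated` back preserves `cache_control`, which a naive `ContentSettingsish(content_type='application/json')` would have cleared.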
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.copy_file
def copy_file(self, share_name, directory_name, file_name, copy_source, metadata=None, timeout=None): ''' Copies a blob or file to a destination file within the storage account. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param str copy_source: Specifies the URL of the source blob or file, up to 2 KB in length. A source file in the same account can be private, but a file in another account must be public or accept credentials included in this URL, such as a Shared Access Signature. Examples: https://myaccount.file.core.windows.net/myshare/mydirectory/myfile :param metadata: Dict containing name, value pairs. :type metadata: A dict mapping str to str. :param int timeout: The timeout parameter is expressed in seconds. :return: Copy operation properties such as status, source, and ID. :rtype: :class:`~azure.storage.file.models.CopyProperties` ''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) _validate_not_none('copy_source', copy_source) request = HTTPRequest() request.method = 'PUT' request.host = self._get_host() request.path = _get_path(share_name, directory_name, file_name) request.query = [('timeout', _int_to_str(timeout))] request.headers = [ ('x-ms-copy-source', _to_str(copy_source)), ('x-ms-meta-name-values', metadata), ] response = self._perform_request(request) props = _parse_properties(response, FileProperties) return props.copy
python
def copy_file(self, share_name, directory_name, file_name, copy_source, metadata=None, timeout=None): ''' Copies a blob or file to a destination file within the storage account. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param str copy_source: Specifies the URL of the source blob or file, up to 2 KB in length. A source file in the same account can be private, but a file in another account must be public or accept credentials included in this URL, such as a Shared Access Signature. Examples: https://myaccount.file.core.windows.net/myshare/mydirectory/myfile :param metadata: Dict containing name, value pairs. :type metadata: A dict mapping str to str. :param int timeout: The timeout parameter is expressed in seconds. :return: Copy operation properties such as status, source, and ID. :rtype: :class:`~azure.storage.file.models.CopyProperties` ''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) _validate_not_none('copy_source', copy_source) request = HTTPRequest() request.method = 'PUT' request.host = self._get_host() request.path = _get_path(share_name, directory_name, file_name) request.query = [('timeout', _int_to_str(timeout))] request.headers = [ ('x-ms-copy-source', _to_str(copy_source)), ('x-ms-meta-name-values', metadata), ] response = self._perform_request(request) props = _parse_properties(response, FileProperties) return props.copy
[ "def", "copy_file", "(", "self", ",", "share_name", ",", "directory_name", ",", "file_name", ",", "copy_source", ",", "metadata", "=", "None", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'share_name'", ",", "share_name", ")", "_validate_...
Copies a blob or file to a destination file within the storage account. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param str copy_source: Specifies the URL of the source blob or file, up to 2 KB in length. A source file in the same account can be private, but a file in another account must be public or accept credentials included in this URL, such as a Shared Access Signature. Examples: https://myaccount.file.core.windows.net/myshare/mydirectory/myfile :param metadata: Dict containing name, value pairs. :type metadata: A dict mapping str to str. :param int timeout: The timeout parameter is expressed in seconds. :return: Copy operation properties such as status, source, and ID. :rtype: :class:`~azure.storage.file.models.CopyProperties`
[ "Copies", "a", "blob", "or", "file", "to", "a", "destination", "file", "within", "the", "storage", "account", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L1315-L1356
train
41,564
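`copy_file` returns `CopyProperties` as soon as the copy is accepted; completion has to be observed by re-reading the file's properties until the copy status leaves `'pending'`. A hypothetical polling helper (the status getter and status strings are assumptions, not part of this record's code):

```python
import time

def wait_for_copy(get_status, interval=1.0, attempts=10):
    # Poll a caller-supplied status function until the copy is no
    # longer pending, sleeping between attempts.
    for _ in range(attempts):
        status = get_status()
        if status != 'pending':
            return status
        time.sleep(interval)
    raise TimeoutError('copy still pending after %d attempts' % attempts)

# Simulated status sequence standing in for repeated property reads.
statuses = iter(['pending', 'pending', 'success'])
result = wait_for_copy(lambda: next(statuses), interval=0)
```

In real use `get_status` would wrap something like `get_file_properties(...).properties.copy.status`.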
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.create_file_from_bytes
def create_file_from_bytes( self, share_name, directory_name, file_name, file, index=0, count=None, content_settings=None, metadata=None, progress_callback=None, max_connections=1, max_retries=5, retry_wait=1.0, timeout=None): ''' Creates a new file from an array of bytes, or updates the content of an existing file, with automatic chunking and progress notifications. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of file to create or update. :param str file: Content of file as an array of bytes. :param int index: Start index in the array of bytes. :param int count: Number of bytes to upload. Set to None or negative value to upload all bytes starting from index. :param ~azure.storage.file.models.ContentSettings content_settings: ContentSettings object used to set file properties. :param metadata: Name-value pairs associated with the file as metadata. :type metadata: a dict mapping str to str :param progress_callback: Callback for progress with signature function(current, total) where current is the number of bytes transferred so far and total is the size of the file, or None if the total size is unknown. :type progress_callback: callback function in format of func(current, total) :param int max_connections: Maximum number of parallel connections to use when the file size exceeds 64MB. Set to 1 to upload the file chunks sequentially. Set to 2 or more to upload the file chunks in parallel. This uses more system resources but will upload faster. :param int max_retries: Number of times to retry upload of file chunk if an error occurs. :param int retry_wait: Sleep time in secs between retries. :param int timeout: The timeout parameter is expressed in seconds. This method may make multiple calls to the Azure service and the timeout will apply to each call individually.
''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) _validate_not_none('file', file) _validate_type_bytes('file', file) if index < 0: raise TypeError(_ERROR_VALUE_NEGATIVE.format('index')) if count is None or count < 0: count = len(file) - index stream = BytesIO(file) stream.seek(index) self.create_file_from_stream( share_name, directory_name, file_name, stream, count, content_settings, metadata, progress_callback, max_connections, max_retries, retry_wait, timeout)
python
def create_file_from_bytes( self, share_name, directory_name, file_name, file, index=0, count=None, content_settings=None, metadata=None, progress_callback=None, max_connections=1, max_retries=5, retry_wait=1.0, timeout=None): ''' Creates a new file from an array of bytes, or updates the content of an existing file, with automatic chunking and progress notifications. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of file to create or update. :param str file: Content of file as an array of bytes. :param int index: Start index in the array of bytes. :param int count: Number of bytes to upload. Set to None or negative value to upload all bytes starting from index. :param ~azure.storage.file.models.ContentSettings content_settings: ContentSettings object used to set file properties. :param metadata: Name-value pairs associated with the file as metadata. :type metadata: a dict mapping str to str :param progress_callback: Callback for progress with signature function(current, total) where current is the number of bytes transferred so far and total is the size of the file, or None if the total size is unknown. :type progress_callback: callback function in format of func(current, total) :param int max_connections: Maximum number of parallel connections to use when the file size exceeds 64MB. Set to 1 to upload the file chunks sequentially. Set to 2 or more to upload the file chunks in parallel. This uses more system resources but will upload faster. :param int max_retries: Number of times to retry upload of file chunk if an error occurs. :param int retry_wait: Sleep time in secs between retries. :param int timeout: The timeout parameter is expressed in seconds. This method may make multiple calls to the Azure service and the timeout will apply to each call individually.
''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) _validate_not_none('file', file) _validate_type_bytes('file', file) if index < 0: raise TypeError(_ERROR_VALUE_NEGATIVE.format('index')) if count is None or count < 0: count = len(file) - index stream = BytesIO(file) stream.seek(index) self.create_file_from_stream( share_name, directory_name, file_name, stream, count, content_settings, metadata, progress_callback, max_connections, max_retries, retry_wait, timeout)
[ "def", "create_file_from_bytes", "(", "self", ",", "share_name", ",", "directory_name", ",", "file_name", ",", "file", ",", "index", "=", "0", ",", "count", "=", "None", ",", "content_settings", "=", "None", ",", "metadata", "=", "None", ",", "progress_callb...
Creates a new file from an array of bytes, or updates the content of an existing file, with automatic chunking and progress notifications. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of file to create or update. :param str file: Content of file as an array of bytes. :param int index: Start index in the array of bytes. :param int count: Number of bytes to upload. Set to None or negative value to upload all bytes starting from index. :param ~azure.storage.file.models.ContentSettings content_settings: ContentSettings object used to set file properties. :param metadata: Name-value pairs associated with the file as metadata. :type metadata: a dict mapping str to str :param progress_callback: Callback for progress with signature function(current, total) where current is the number of bytes transferred so far and total is the size of the file, or None if the total size is unknown. :type progress_callback: callback function in format of func(current, total) :param int max_connections: Maximum number of parallel connections to use when the file size exceeds 64MB. Set to 1 to upload the file chunks sequentially. Set to 2 or more to upload the file chunks in parallel. This uses more system resources but will upload faster. :param int max_retries: Number of times to retry upload of file chunk if an error occurs. :param int retry_wait: Sleep time in secs between retries. :param int timeout: The timeout parameter is expressed in seconds. This method may make multiple calls to the Azure service and the timeout will apply to each call individually.
[ "Creates", "a", "new", "file", "from", "an", "array", "of", "bytes", "or", "updates", "the", "content", "of", "an", "existing", "file", "with", "automatic", "chunking", "and", "progress", "notifications", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L1552-L1617
train
41,565
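The `index`/`count` handling in `create_file_from_bytes` is easy to get wrong, so here is the same slicing rule isolated into a standalone sketch: a `None` or negative `count` means "everything from `index` to the end", and the byte array is wrapped in a `BytesIO` seeked to `index`, exactly as in the method body above.

```python
from io import BytesIO

def bytes_window(data, index=0, count=None):
    # Same normalization as create_file_from_bytes: negative index is
    # rejected; None or negative count means "to the end of the data".
    if index < 0:
        raise TypeError('index may not be negative')
    if count is None or count < 0:
        count = len(data) - index
    stream = BytesIO(data)
    stream.seek(index)
    return stream.read(count)
```

For example, `bytes_window(b'abcdef', index=2)` yields `b'cdef'`, which is the window the stream-based upload would then chunk.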
Azure/azure-multiapi-storage-python
azure/multiapi/storage/v2015_04_05/file/fileservice.py
FileService.list_ranges
def list_ranges(self, share_name, directory_name, file_name, start_range=None, end_range=None, timeout=None): ''' Retrieves the valid ranges for a file. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param int start_range: Specifies the start offset of bytes over which to list ranges. The start_range and end_range params are inclusive. Ex: start_range=0, end_range=511 will download first 512 bytes of file. :param int end_range: Specifies the end offset of bytes over which to list ranges. The start_range and end_range params are inclusive. Ex: start_range=0, end_range=511 will download first 512 bytes of file. :param int timeout: The timeout parameter is expressed in seconds. :returns: a list of valid ranges :rtype: a list of :class:`.FileRange` ''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) request = HTTPRequest() request.method = 'GET' request.host = self._get_host() request.path = _get_path(share_name, directory_name, file_name) request.query = [ ('comp', 'rangelist'), ('timeout', _int_to_str(timeout)), ] if start_range is not None: _validate_and_format_range_headers( request, start_range, end_range, start_range_required=False, end_range_required=False) response = self._perform_request(request) return _convert_xml_to_ranges(response)
python
def list_ranges(self, share_name, directory_name, file_name, start_range=None, end_range=None, timeout=None): ''' Retrieves the valid ranges for a file. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param int start_range: Specifies the start offset of bytes over which to list ranges. The start_range and end_range params are inclusive. Ex: start_range=0, end_range=511 will download first 512 bytes of file. :param int end_range: Specifies the end offset of bytes over which to list ranges. The start_range and end_range params are inclusive. Ex: start_range=0, end_range=511 will download first 512 bytes of file. :param int timeout: The timeout parameter is expressed in seconds. :returns: a list of valid ranges :rtype: a list of :class:`.FileRange` ''' _validate_not_none('share_name', share_name) _validate_not_none('file_name', file_name) request = HTTPRequest() request.method = 'GET' request.host = self._get_host() request.path = _get_path(share_name, directory_name, file_name) request.query = [ ('comp', 'rangelist'), ('timeout', _int_to_str(timeout)), ] if start_range is not None: _validate_and_format_range_headers( request, start_range, end_range, start_range_required=False, end_range_required=False) response = self._perform_request(request) return _convert_xml_to_ranges(response)
[ "def", "list_ranges", "(", "self", ",", "share_name", ",", "directory_name", ",", "file_name", ",", "start_range", "=", "None", ",", "end_range", "=", "None", ",", "timeout", "=", "None", ")", ":", "_validate_not_none", "(", "'share_name'", ",", "share_name", ...
Retrieves the valid ranges for a file. :param str share_name: Name of existing share. :param str directory_name: The path to the directory. :param str file_name: Name of existing file. :param int start_range: Specifies the start offset of bytes over which to list ranges. The start_range and end_range params are inclusive. Ex: start_range=0, end_range=511 will download first 512 bytes of file. :param int end_range: Specifies the end offset of bytes over which to list ranges. The start_range and end_range params are inclusive. Ex: start_range=0, end_range=511 will download first 512 bytes of file. :param int timeout: The timeout parameter is expressed in seconds. :returns: a list of valid ranges :rtype: a list of :class:`.FileRange`
[ "Retrieves", "the", "valid", "ranges", "for", "a", "file", "." ]
bd5482547f993c6eb56fd09070e15c2e9616e440
https://github.com/Azure/azure-multiapi-storage-python/blob/bd5482547f993c6eb56fd09070e15c2e9616e440/azure/multiapi/storage/v2015_04_05/file/fileservice.py#L2167-L2210
train
41,566
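The docstring for `list_ranges` stresses that `start_range` and `end_range` are inclusive (`start_range=0, end_range=511` covers the first 512 bytes). Two tiny helpers make that convention explicit; the `bytes=` header format mirrors the usual HTTP range syntax and is an assumption about what `_validate_and_format_range_headers` emits:

```python
def range_length(start_range, end_range):
    # Both offsets are inclusive, so the covered size is end - start + 1.
    return end_range - start_range + 1

def range_header(start_range, end_range):
    # Assumed on-the-wire form of an inclusive byte range header.
    return 'bytes=%d-%d' % (start_range, end_range)
```

An off-by-one here (treating `end_range` as exclusive) would silently drop or duplicate a byte at every chunk boundary.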
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations._get_sushy_system
def _get_sushy_system(self, system_id): """Get the sushy system for system_id :param system_id: The identity of the System resource :returns: the Sushy system instance :raises: IloError """ system_url = parse.urljoin(self._sushy.get_system_collection_path(), system_id) try: return self._sushy.get_system(system_url) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish System "%(system)s" was not found. ' 'Error %(error)s') % {'system': system_id, 'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
python
def _get_sushy_system(self, system_id): """Get the sushy system for system_id :param system_id: The identity of the System resource :returns: the Sushy system instance :raises: IloError """ system_url = parse.urljoin(self._sushy.get_system_collection_path(), system_id) try: return self._sushy.get_system(system_url) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish System "%(system)s" was not found. ' 'Error %(error)s') % {'system': system_id, 'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
[ "def", "_get_sushy_system", "(", "self", ",", "system_id", ")", ":", "system_url", "=", "parse", ".", "urljoin", "(", "self", ".", "_sushy", ".", "get_system_collection_path", "(", ")", ",", "system_id", ")", "try", ":", "return", "self", ".", "_sushy", "....
Get the sushy system for system_id :param system_id: The identity of the System resource :returns: the Sushy system instance :raises: IloError
[ "Get", "the", "sushy", "system", "for", "system_id" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L167-L183
train
41,567
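`_get_sushy_system` builds the member URL with `parse.urljoin(collection_path, system_id)`. One subtlety worth noting: `urljoin` only appends the identity when the collection path ends with a trailing slash; without it, the last path segment is replaced. The snippet below demonstrates this with a plausible Redfish-style path (the concrete path is illustrative, not taken from the record):

```python
from urllib.parse import urljoin

# With a trailing slash the member id is appended...
with_slash = urljoin('/redfish/v1/Systems/', '1')
# ...without one, urljoin replaces the final path segment instead.
without_slash = urljoin('/redfish/v1/Systems', '1')
```

So the correctness of this helper implicitly depends on `get_system_collection_path()` returning a slash-terminated path.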
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations._get_sushy_manager
def _get_sushy_manager(self, manager_id): """Get the sushy Manager for manager_id :param manager_id: The identity of the Manager resource :returns: the Sushy Manager instance :raises: IloError """ manager_url = parse.urljoin(self._sushy.get_manager_collection_path(), manager_id) try: return self._sushy.get_manager(manager_url) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish Manager "%(manager)s" was not found. ' 'Error %(error)s') % {'manager': manager_id, 'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
python
def _get_sushy_manager(self, manager_id): """Get the sushy Manager for manager_id :param manager_id: The identity of the Manager resource :returns: the Sushy Manager instance :raises: IloError """ manager_url = parse.urljoin(self._sushy.get_manager_collection_path(), manager_id) try: return self._sushy.get_manager(manager_url) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish Manager "%(manager)s" was not found. ' 'Error %(error)s') % {'manager': manager_id, 'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
[ "def", "_get_sushy_manager", "(", "self", ",", "manager_id", ")", ":", "manager_url", "=", "parse", ".", "urljoin", "(", "self", ".", "_sushy", ".", "get_manager_collection_path", "(", ")", ",", "manager_id", ")", "try", ":", "return", "self", ".", "_sushy",...
Get the sushy Manager for manager_id :param manager_id: The identity of the Manager resource :returns: the Sushy Manager instance :raises: IloError
[ "Get", "the", "sushy", "Manager", "for", "manager_id" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L185-L201
train
41,568
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.get_host_power_status
def get_host_power_status(self): """Request the power state of the server. :returns: Power State of the server, 'ON' or 'OFF' :raises: IloError, on an error from iLO. """ sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID) return GET_POWER_STATE_MAP.get(sushy_system.power_state)
python
def get_host_power_status(self): """Request the power state of the server. :returns: Power State of the server, 'ON' or 'OFF' :raises: IloError, on an error from iLO. """ sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID) return GET_POWER_STATE_MAP.get(sushy_system.power_state)
[ "def", "get_host_power_status", "(", "self", ")", ":", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "return", "GET_POWER_STATE_MAP", ".", "get", "(", "sushy_system", ".", "power_state", ")" ]
Request the power state of the server. :returns: Power State of the server, 'ON' or 'OFF' :raises: IloError, on an error from iLO.
[ "Request", "the", "power", "state", "of", "the", "server", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L212-L219
train
41,569
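`get_host_power_status` translates the Redfish `power_state` through `GET_POWER_STATE_MAP`, which is not shown in this record. A plausible shape for it, mapping Redfish `PowerState` values onto the `'ON'`/`'OFF'` strings the docstring promises (the exact contents are an assumption):

```python
# Hypothetical reconstruction of GET_POWER_STATE_MAP; the real module
# may map additional Redfish states.
GET_POWER_STATE_MAP = {
    'On': 'ON',
    'Off': 'OFF',
}

status = GET_POWER_STATE_MAP.get('On')
```

Because the lookup uses `.get()`, an unrecognized Redfish state yields `None` rather than raising `KeyError`.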
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.reset_server
def reset_server(self): """Resets the server. :raises: IloError, on an error from iLO. """ sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID) try: sushy_system.reset_system(sushy.RESET_FORCE_RESTART) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish controller failed to reset server. ' 'Error %(error)s') % {'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
python
def reset_server(self): """Resets the server. :raises: IloError, on an error from iLO. """ sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID) try: sushy_system.reset_system(sushy.RESET_FORCE_RESTART) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish controller failed to reset server. ' 'Error %(error)s') % {'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
[ "def", "reset_server", "(", "self", ")", ":", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "try", ":", "sushy_system", ".", "reset_system", "(", "sushy", ".", "RESET_FORCE_RESTART", ")", "except", "sushy", ".", "exceptio...
Resets the server. :raises: IloError, on an error from iLO.
[ "Resets", "the", "server", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L221-L234
train
41,570
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.set_host_power
def set_host_power(self, target_value): """Sets the power state of the system. :param target_value: The target value to be set. Value can be: 'ON' or 'OFF'. :raises: IloError, on an error from iLO. :raises: InvalidInputError, if the target value is not allowed. """ if target_value not in POWER_RESET_MAP: msg = ('The parameter "%(parameter)s" value "%(target_value)s" is ' 'invalid. Valid values are: %(valid_power_values)s' % {'parameter': 'target_value', 'target_value': target_value, 'valid_power_values': POWER_RESET_MAP.keys()}) raise exception.InvalidInputError(msg) # Check current power status, do not act if it's in requested state. current_power_status = self.get_host_power_status() if current_power_status == target_value: LOG.debug(self._("Node is already in '%(target_value)s' power " "state."), {'target_value': target_value}) return sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID) try: sushy_system.reset_system(POWER_RESET_MAP[target_value]) except sushy.exceptions.SushyError as e: msg = (self._('The Redfish controller failed to set power state ' 'of server to %(target_value)s. Error %(error)s') % {'target_value': target_value, 'error': str(e)}) LOG.debug(msg) raise exception.IloError(msg)
python
[ "def", "set_host_power", "(", "self", ",", "target_value", ")", ":", "if", "target_value", "not", "in", "POWER_RESET_MAP", ":", "msg", "=", "(", "'The parameter \"%(parameter)s\" value \"%(target_value)s\" is '", "'invalid. Valid values are: %(valid_power_values)s'", "%", "{"...
Sets the power state of the system. :param target_value: The target value to be set. Value can be: 'ON' or 'OFF'. :raises: IloError, on an error from iLO. :raises: InvalidInputError, if the target value is not allowed.
[ "Sets", "the", "power", "state", "of", "the", "system", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L236-L267
train
41,571
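The validate-then-act flow in set_host_power above (reject unknown values, skip the reset when the node is already in the requested state) can be sketched as a standalone function. Everything below is a hypothetical stand-in, not the real proliantutils/sushy objects:

```python
# Hypothetical sketch of set_host_power's control flow. POWER_RESET_MAP
# here is a stand-in for the real proliantutils constant.
POWER_RESET_MAP = {'ON': 'ForceOn', 'OFF': 'ForceOff'}


def set_power(target_value, current_value, reset_fn):
    """Validate target_value and call reset_fn only when needed."""
    if target_value not in POWER_RESET_MAP:
        raise ValueError('invalid target: %s' % target_value)
    if current_value == target_value:
        # already in the requested state; do nothing
        return 'noop'
    reset_fn(POWER_RESET_MAP[target_value])
    return 'reset'
```

The early return is what makes repeated calls with the same target idempotent.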
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations._validate_virtual_media
def _validate_virtual_media(self, device):
    """Check if the device is a valid device.

    :param device: virtual media device
    :raises: IloInvalidInputError, if the device is not valid.
    """
    if device not in VIRTUAL_MEDIA_MAP:
        msg = (self._("Invalid device '%s'. Valid devices: FLOPPY or "
                      "CDROM.") % device)
        LOG.debug(msg)
        raise exception.IloInvalidInputError(msg)
python
[ "def", "_validate_virtual_media", "(", "self", ",", "device", ")", ":", "if", "device", "not", "in", "VIRTUAL_MEDIA_MAP", ":", "msg", "=", "(", "self", ".", "_", "(", "\"Invalid device '%s'. Valid devices: FLOPPY or \"", "\"CDROM.\"", ")", "%", "device", ")", "L...
Check if the device is a valid device. :param device: virtual media device :raises: IloInvalidInputError, if the device is not valid.
[ "Check", "if", "the", "device", "is", "valid", "device", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L363-L374
train
41,572
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations._is_boot_mode_uefi
def _is_boot_mode_uefi(self):
    """Checks if the system is in uefi boot mode.

    :return: 'True' if the boot mode is uefi else 'False'
    """
    boot_mode = self.get_current_boot_mode()
    return boot_mode == BOOT_MODE_MAP.get(sys_cons.BIOS_BOOT_MODE_UEFI)
python
[ "def", "_is_boot_mode_uefi", "(", "self", ")", ":", "boot_mode", "=", "self", ".", "get_current_boot_mode", "(", ")", "return", "(", "boot_mode", "==", "BOOT_MODE_MAP", ".", "get", "(", "sys_cons", ".", "BIOS_BOOT_MODE_UEFI", ")", ")" ]
Checks if the system is in uefi boot mode. :return: 'True' if the boot mode is uefi else 'False'
[ "Checks", "if", "the", "system", "is", "in", "uefi", "boot", "mode", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L493-L499
train
41,573
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.get_supported_boot_mode
def get_supported_boot_mode(self):
    """Get the system supported boot modes.

    :return: any one of the following proliantutils.ilo.constants:
        SUPPORTED_BOOT_MODE_LEGACY_BIOS_ONLY,
        SUPPORTED_BOOT_MODE_UEFI_ONLY,
        SUPPORTED_BOOT_MODE_LEGACY_BIOS_AND_UEFI
    :raises: IloError, if account not found or on an error from iLO.
    """
    sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID)
    try:
        return SUPPORTED_BOOT_MODE_MAP.get(
            sushy_system.supported_boot_mode)
    except sushy.exceptions.SushyError as e:
        msg = (self._('The Redfish controller failed to get the '
                      'supported boot modes. Error: %s') % e)
        LOG.debug(msg)
        raise exception.IloError(msg)
python
[ "def", "get_supported_boot_mode", "(", "self", ")", ":", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "try", ":", "return", "SUPPORTED_BOOT_MODE_MAP", ".", "get", "(", "sushy_system", ".", "supported_boot_mode", ")", "except...
Get the system supported boot modes. :return: any one of the following proliantutils.ilo.constants: SUPPORTED_BOOT_MODE_LEGACY_BIOS_ONLY, SUPPORTED_BOOT_MODE_UEFI_ONLY, SUPPORTED_BOOT_MODE_LEGACY_BIOS_AND_UEFI :raises: IloError, if account not found or on an error from iLO.
[ "Get", "the", "system", "supported", "boot", "modes", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L629-L647
train
41,574
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.get_server_capabilities
def get_server_capabilities(self):
    """Returns the server capabilities.

    :raises: IloError, on an error from iLO.
    """
    capabilities = {}
    sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID)
    sushy_manager = self._get_sushy_manager(PROLIANT_MANAGER_ID)
    try:
        count = len(sushy_system.pci_devices.gpu_devices)
        boot_mode = rf_utils.get_supported_boot_mode(
            sushy_system.supported_boot_mode)
        capabilities.update(
            {'pci_gpu_devices': count,
             'ilo_firmware_version': sushy_manager.firmware_version,
             'rom_firmware_version': sushy_system.rom_version,
             'server_model': sushy_system.model,
             'nic_capacity': sushy_system.pci_devices.max_nic_capacity,
             'boot_mode_bios': boot_mode.boot_mode_bios,
             'boot_mode_uefi': boot_mode.boot_mode_uefi})

        tpm_state = sushy_system.bios_settings.tpm_state
        all_key_to_value_expression_tuples = [
            ('sriov_enabled',
             sushy_system.bios_settings.sriov == sys_cons.SRIOV_ENABLED),
            ('cpu_vt',
             sushy_system.bios_settings.cpu_vt == sys_cons.CPUVT_ENABLED),
            ('trusted_boot',
             (tpm_state == sys_cons.TPM_PRESENT_ENABLED
              or tpm_state == sys_cons.TPM_PRESENT_DISABLED)),
            ('secure_boot', self._has_secure_boot()),
            ('iscsi_boot',
             (sushy_system.bios_settings.iscsi_resource.
              is_iscsi_boot_supported())),
            ('hardware_supports_raid',
             len(sushy_system.smart_storage.array_controllers.
                 members_identities) > 0),
            ('has_ssd', common_storage.has_ssd(sushy_system)),
            ('has_rotational', common_storage.has_rotational(sushy_system)),
            ('has_nvme_ssd', common_storage.has_nvme_ssd(sushy_system))
        ]

        all_key_to_value_expression_tuples += (
            [('logical_raid_level_' + x, True)
             for x in sushy_system.smart_storage.logical_raid_levels])

        all_key_to_value_expression_tuples += (
            [('drive_rotational_' + str(x) + '_rpm', True)
             for x in common_storage.get_drive_rotational_speed_rpm(
                 sushy_system)])

        capabilities.update(
            {key: 'true'
             for (key, value) in all_key_to_value_expression_tuples
             if value})

        memory_data = sushy_system.memory.details()
        if memory_data.has_nvdimm_n:
            capabilities.update(
                {'persistent_memory': json.dumps(
                    memory_data.has_persistent_memory),
                 'nvdimm_n': json.dumps(memory_data.has_nvdimm_n),
                 'logical_nvdimm_n': json.dumps(
                     memory_data.has_logical_nvdimm_n)})
    except sushy.exceptions.SushyError as e:
        msg = (self._("The Redfish controller is unable to get "
                      "resource or its members. Error "
                      "%(error)s") % {'error': str(e)})
        LOG.debug(msg)
        raise exception.IloError(msg)
    return capabilities
python
[ "def", "get_server_capabilities", "(", "self", ")", ":", "capabilities", "=", "{", "}", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "sushy_manager", "=", "self", ".", "_get_sushy_manager", "(", "PROLIANT_MANAGER_ID", ")", ...
Returns the server capabilities. :raises: IloError, on an error from iLO.
[ "Returns", "the", "server", "capabilities" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L649-L727
train
41,575
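The capability-flag step in get_server_capabilities above collapses (name, boolean) pairs into string-valued flags, keeping only the truthy ones. A minimal self-contained sketch with made-up sample data:

```python
# Made-up sample pairs standing in for the BIOS/storage checks that
# get_server_capabilities performs against a live system.
pairs = [('sriov_enabled', True),
         ('cpu_vt', False),
         ('has_ssd', True)]

# dynamic keys are appended the same way as logical_raid_level_* above
pairs += [('logical_raid_level_' + level, True) for level in ('0', '1')]

# only truthy pairs survive, and the value is the string 'true'
capabilities = {key: 'true' for (key, value) in pairs if value}
```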
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.get_essential_properties
def get_essential_properties(self):
    """Constructs the dictionary of essential properties

    Constructs the dictionary of essential properties, named
    cpu, cpu_arch, local_gb, memory_mb. The MACs are also returned
    as part of this method.
    """
    sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID)
    try:
        # TODO(nisha): Add local_gb here and return after
        # local_gb changes are merged.
        # local_gb = sushy_system.storage_summary
        prop = {'memory_mb': (sushy_system.memory_summary.size_gib * 1024),
                'cpus': sushy_system.processors.summary.count,
                'cpu_arch': sushy_map.PROCESSOR_ARCH_VALUE_MAP_REV.get(
                    sushy_system.processors.summary.architecture),
                'local_gb': common_storage.get_local_gb(sushy_system)}
        return {'properties': prop,
                'macs': sushy_system.ethernet_interfaces.summary}
    except sushy.exceptions.SushyError as e:
        msg = (self._('The Redfish controller failed to get the '
                      'resource data. Error %(error)s') % {'error': str(e)})
        LOG.debug(msg)
        raise exception.IloError(msg)
python
[ "def", "get_essential_properties", "(", "self", ")", ":", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "try", ":", "# TODO(nisha): Add local_gb here and return after", "# local_gb changes are merged.", "# local_gb = sushy_system.storage_s...
Constructs the dictionary of essential properties Constructs the dictionary of essential properties, named cpu, cpu_arch, local_gb, memory_mb. The MACs are also returned as part of this method.
[ "Constructs", "the", "dictionary", "of", "essential", "properties" ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L854-L878
train
41,576
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations._change_iscsi_target_settings
def _change_iscsi_target_settings(self, iscsi_info):
    """Change iSCSI target settings.

    :param iscsi_info: A dictionary that contains information of iSCSI
        target like target_name, lun, ip_address, port etc.
    :raises: IloError, on an error from iLO.
    """
    sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID)
    try:
        pci_settings_map = (
            sushy_system.bios_settings.bios_mappings.pci_settings_mappings)
        nics = []
        for mapping in pci_settings_map:
            for subinstance in mapping['Subinstances']:
                for association in subinstance['Associations']:
                    if 'NicBoot' in association:
                        nics.append(association)
    except sushy.exceptions.SushyError as e:
        msg = (self._('The Redfish controller failed to get the '
                      'bios mappings. Error %(error)s') % {'error': str(e)})
        LOG.debug(msg)
        raise exception.IloError(msg)

    if not nics:
        msg = 'No nics were found on the system'
        raise exception.IloError(msg)

    # Set iSCSI info to all nics
    iscsi_infos = []
    for nic in nics:
        data = iscsi_info.copy()
        data['iSCSIAttemptName'] = nic
        data['iSCSINicSource'] = nic
        data['iSCSIAttemptInstance'] = nics.index(nic) + 1
        iscsi_infos.append(data)

    iscsi_data = {'iSCSISources': iscsi_infos}

    try:
        (sushy_system.bios_settings.iscsi_resource.
         iscsi_settings.update_iscsi_settings(iscsi_data))
    except sushy.exceptions.SushyError as e:
        msg = (self._("The Redfish controller failed to update iSCSI "
                      "settings. Error %(error)s") % {'error': str(e)})
        LOG.debug(msg)
        raise exception.IloError(msg)
python
[ "def", "_change_iscsi_target_settings", "(", "self", ",", "iscsi_info", ")", ":", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "try", ":", "pci_settings_map", "=", "(", "sushy_system", ".", "bios_settings", ".", "bios_mappin...
Change iSCSI target settings. :param iscsi_info: A dictionary that contains information of iSCSI target like target_name, lun, ip_address, port etc. :raises: IloError, on an error from iLO.
[ "Change", "iSCSI", "target", "settings", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L880-L926
train
41,577
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.set_iscsi_info
def set_iscsi_info(self, target_name, lun, ip_address, port='3260',
                   auth_method=None, username=None, password=None):
    """Set iSCSI details of the system in UEFI boot mode.

    The initiator system is set with the target details like
    IQN, LUN, IP, Port etc.

    :param target_name: Target Name for iSCSI.
    :param lun: logical unit number.
    :param ip_address: IP address of the target.
    :param port: port of the target.
    :param auth_method: either None or CHAP.
    :param username: CHAP Username for authentication.
    :param password: CHAP secret.
    :raises: IloCommandNotSupportedInBiosError, if the system is
        in the bios boot mode.
    """
    if self._is_boot_mode_uefi():
        iscsi_info = {}
        iscsi_info['iSCSITargetName'] = target_name
        iscsi_info['iSCSILUN'] = lun
        iscsi_info['iSCSITargetIpAddress'] = ip_address
        iscsi_info['iSCSITargetTcpPort'] = int(port)
        iscsi_info['iSCSITargetInfoViaDHCP'] = False
        iscsi_info['iSCSIConnection'] = 'Enabled'
        if auth_method == 'CHAP':
            iscsi_info['iSCSIAuthenticationMethod'] = 'Chap'
            iscsi_info['iSCSIChapUsername'] = username
            iscsi_info['iSCSIChapSecret'] = password
        self._change_iscsi_target_settings(iscsi_info)
    else:
        msg = 'iSCSI boot is not supported in the BIOS boot mode'
        raise exception.IloCommandNotSupportedInBiosError(msg)
python
[ "def", "set_iscsi_info", "(", "self", ",", "target_name", ",", "lun", ",", "ip_address", ",", "port", "=", "'3260'", ",", "auth_method", "=", "None", ",", "username", "=", "None", ",", "password", "=", "None", ")", ":", "if", "(", "self", ".", "_is_boo...
Set iSCSI details of the system in UEFI boot mode. The initiator system is set with the target details like IQN, LUN, IP, Port etc. :param target_name: Target Name for iSCSI. :param lun: logical unit number. :param ip_address: IP address of the target. :param port: port of the target. :param auth_method : either None or CHAP. :param username: CHAP Username for authentication. :param password: CHAP secret. :raises: IloCommandNotSupportedInBiosError, if the system is in the bios boot mode.
[ "Set", "iSCSI", "details", "of", "the", "system", "in", "UEFI", "boot", "mode", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L928-L960
train
41,578
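set_iscsi_info above is mostly payload construction. The sketch below reproduces just that dictionary-building step; build_iscsi_info is a hypothetical helper, and only the key names are taken from the function above:

```python
# Hypothetical helper mirroring the payload that set_iscsi_info builds.
def build_iscsi_info(target_name, lun, ip_address, port='3260',
                     auth_method=None, username=None, password=None):
    info = {'iSCSITargetName': target_name,
            'iSCSILUN': lun,
            'iSCSITargetIpAddress': ip_address,
            'iSCSITargetTcpPort': int(port),   # port arrives as a string
            'iSCSITargetInfoViaDHCP': False,
            'iSCSIConnection': 'Enabled'}
    if auth_method == 'CHAP':
        # CHAP credentials are only added when requested
        info['iSCSIAuthenticationMethod'] = 'Chap'
        info['iSCSIChapUsername'] = username
        info['iSCSIChapSecret'] = password
    return info
```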
openstack/proliantutils
proliantutils/redfish/redfish.py
RedfishOperations.get_host_post_state
def get_host_post_state(self):
    """Get the current state of system POST.

    Retrieves current state of system POST.

    :returns: POST state of the server. The valid states are:
        null, Unknown, Reset, PowerOff, InPost,
        InPostDiscoveryComplete and FinishedPost.
    :raises: IloError, on an error from iLO.
    """
    sushy_system = self._get_sushy_system(PROLIANT_SYSTEM_ID)
    return GET_POST_STATE_MAP.get(sushy_system.post_state)
python
[ "def", "get_host_post_state", "(", "self", ")", ":", "sushy_system", "=", "self", ".", "_get_sushy_system", "(", "PROLIANT_SYSTEM_ID", ")", "return", "GET_POST_STATE_MAP", ".", "get", "(", "sushy_system", ".", "post_state", ")" ]
Get the current state of system POST. Retrieves current state of system POST. :returns: POST state of the server. The valid states are: null, Unknown, Reset, PowerOff, InPost, InPostDiscoveryComplete and FinishedPost. :raises: IloError, on an error from iLO
[ "Get", "the", "current", "state", "of", "system", "POST", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/redfish/redfish.py#L1042-L1053
train
41,579
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_collection
def _get_collection(self, collection_uri, request_headers=None):
    """Generator function that returns collection members."""
    # get the collection
    status, headers, thecollection = self._rest_get(collection_uri)
    if status != 200:
        msg = self._get_extended_error(thecollection)
        raise exception.IloError(msg)

    while status < 300:
        # verify expected type
        # Don't limit to version 0 here as we will rev to 1.0 at some
        # point hopefully with minimal changes
        ctype = self._get_type(thecollection)
        if ctype not in ['Collection.0', 'Collection.1']:
            raise exception.IloError("collection not found")

        # if this collection has inline items, return those
        # NOTE: Collections are very flexible in how they represent
        # members. They can be inline in the collection as members
        # of the 'Items' array, or they may be href links in the
        # links/Members array. They could actually be both. Typically,
        # iLO implements the inline (Items) form only when the collection
        # is read only. We have to render it with the href links when an
        # array contains PATCHable items because it's complex to PATCH
        # inline collection members.
        if 'Items' in thecollection:
            # iterate items
            for item in thecollection['Items']:
                # if the item has a self uri pointer,
                # supply that for convenience.
                memberuri = None
                if 'links' in item and 'self' in item['links']:
                    memberuri = item['links']['self']['href']
                yield 200, None, item, memberuri
        # else walk the member links
        elif ('links' in thecollection
              and 'Member' in thecollection['links']):
            # iterate members
            for memberuri in thecollection['links']['Member']:
                # for each member return the resource indicated by the
                # member link
                status, headers, member = self._rest_get(memberuri['href'])
                yield status, headers, member, memberuri['href']

        # page forward if there are more pages in the collection
        if ('links' in thecollection
                and 'NextPage' in thecollection['links']):
            next_link_uri = (collection_uri + '?page=' + str(
                thecollection['links']['NextPage']['page']))
            status, headers, thecollection = self._rest_get(next_link_uri)
        # else we are finished iterating the collection
        else:
            break
python
[ "def", "_get_collection", "(", "self", ",", "collection_uri", ",", "request_headers", "=", "None", ")", ":", "# get the collection", "status", ",", "headers", ",", "thecollection", "=", "self", ".", "_rest_get", "(", "collection_uri", ")", "if", "status", "!=", ...
Generator function that returns collection members.
[ "Generator", "function", "that", "returns", "collection", "members", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L77-L134
train
41,580
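The pagination in _get_collection can be modeled without an iLO by faking pages as plain dicts. walk_collection below is a simplified sketch covering only the inline 'Items' case; the real generator also follows links/Member href links:

```python
def walk_collection(pages):
    """Yield inline Items across NextPage-linked pages."""
    page = pages[0]
    while True:
        for item in page.get('Items', []):
            yield item
        links = page.get('links', {})
        if 'NextPage' in links:
            # page forward, like the '?page=N' re-fetch above
            page = pages[links['NextPage']['page']]
        else:
            break


# two fake pages linked by a NextPage pointer
pages = [{'Items': [1, 2], 'links': {'NextPage': {'page': 1}}},
         {'Items': [3], 'links': {}}]
```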
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_type
def _get_type(self, obj):
    """Return the type of an object."""
    typever = obj['Type']
    typesplit = typever.split('.')
    return typesplit[0] + '.' + typesplit[1]
python
[ "def", "_get_type", "(", "self", ",", "obj", ")", ":", "typever", "=", "obj", "[", "'Type'", "]", "typesplit", "=", "typever", ".", "split", "(", "'.'", ")", "return", "typesplit", "[", "0", "]", "+", "'.'", "+", "typesplit", "[", "1", "]" ]
Return the type of an object.
[ "Return", "the", "type", "of", "an", "object", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L136-L140
train
41,581
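_get_type above keeps only the first two dotted components of the 'Type' field, so a fully versioned type such as 'ComputerSystem.1.0.1' collapses to 'ComputerSystem.1'. A standalone equivalent:

```python
def get_type(obj):
    # keep '<Name>.<major>' and drop any remaining version components
    parts = obj['Type'].split('.')
    return parts[0] + '.' + parts[1]
```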
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._render_extended_error_message_list
def _render_extended_error_message_list(self, extended_error):
    """Parse the ExtendedError object and return the messages.

    Build a list of decoded messages from the extended_error using the
    message registries. An ExtendedError JSON object is a response
    with its own schema. This function knows how to parse the
    ExtendedError object and, using any loaded message registries,
    render an array of plain language strings that represent the
    response.
    """
    messages = []
    if isinstance(extended_error, dict):
        if ('Type' in extended_error and
                extended_error['Type'].startswith('ExtendedError.')):
            for msg in extended_error['Messages']:
                message_id = msg['MessageID']
                x = message_id.split('.')
                registry = x[0]
                msgkey = x[len(x) - 1]

                # if the correct message registry is loaded,
                # do string resolution
                if (registry in self.message_registries and msgkey in
                        self.message_registries[registry]['Messages']):
                    rmsgs = self.message_registries[registry]['Messages']
                    msg_dict = rmsgs[msgkey]
                    msg_str = message_id + ': ' + msg_dict['Message']

                    for argn in range(0, msg_dict['NumberOfArgs']):
                        subst = '%' + str(argn + 1)
                        m = str(msg['MessageArgs'][argn])
                        msg_str = msg_str.replace(subst, m)

                    if ('Resolution' in msg_dict and
                            msg_dict['Resolution'] != 'None'):
                        msg_str += ' ' + msg_dict['Resolution']

                    messages.append(msg_str)
                else:
                    # no message registry, simply return the msg object
                    # in string form
                    messages.append(str(message_id))
    return messages
python
[ "def", "_render_extended_error_message_list", "(", "self", ",", "extended_error", ")", ":", "messages", "=", "[", "]", "if", "isinstance", "(", "extended_error", ",", "dict", ")", ":", "if", "(", "'Type'", "in", "extended_error", "and", "extended_error", "[", ...
Parse the ExtendedError object and return the messages. Build a list of decoded messages from the extended_error using the message registries. An ExtendedError JSON object is a response from the iLO with its own schema. This function knows how to parse the ExtendedError object and, using any loaded message registries, render an array of plain language strings that represent the response.
[ "Parse", "the", "ExtendedError", "object", "and", "return", "the", "messages", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L150-L193
train
41,582
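The registry-based string resolution in `_render_extended_error_message_list` above boils down to substituting `%1`..`%N` placeholders in a registry template with the message arguments, then appending any `Resolution` text. A minimal standalone sketch of that substitution (the registry entry below is illustrative, not real iLO data):

```python
def render_message(msg_dict, args):
    """Substitute %1..%N in a registry template and append any Resolution."""
    msg_str = msg_dict['Message']
    for argn in range(msg_dict['NumberOfArgs']):
        # Placeholders are 1-based: %1 is the first message argument.
        msg_str = msg_str.replace('%' + str(argn + 1), str(args[argn]))
    if msg_dict.get('Resolution', 'None') != 'None':
        msg_str += ' ' + msg_dict['Resolution']
    return msg_str

entry = {'Message': 'Property %1 exceeds the maximum value %2.',
         'NumberOfArgs': 2,
         'Resolution': 'Use a smaller value.'}
print(render_message(entry, ['DriveCount', 8]))
# Property DriveCount exceeds the maximum value 8. Use a smaller value.
```

Note that plain `str.replace` is what the method uses; with ten or more arguments, `%1` would also match the prefix of `%10`, a limitation the sketch shares.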
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_host_details
def _get_host_details(self): """Get the system details.""" # Assuming only one system present as part of collection, # as we are dealing with iLO's here. status, headers, system = self._rest_get('/rest/v1/Systems/1') if status < 300: stype = self._get_type(system) if stype not in ['ComputerSystem.0', 'ComputerSystem.1']: msg = "%s is not a valid system type " % stype raise exception.IloError(msg) else: msg = self._get_extended_error(system) raise exception.IloError(msg) return system
python
Get the system details.
[ "Get", "the", "system", "details", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L199-L213
train
41,583
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._check_bios_resource
def _check_bios_resource(self, properties=[]): """Check if the bios resource exists.""" system = self._get_host_details() if ('links' in system['Oem']['Hp'] and 'BIOS' in system['Oem']['Hp']['links']): # Get the BIOS URI and Settings bios_uri = system['Oem']['Hp']['links']['BIOS']['href'] status, headers, bios_settings = self._rest_get(bios_uri) if status >= 300: msg = self._get_extended_error(bios_settings) raise exception.IloError(msg) # If property is not None, check if the bios_property is supported for property in properties: if property not in bios_settings: # not supported on this platform msg = ('BIOS Property "' + property + '" is not' ' supported on this system.') raise exception.IloCommandNotSupportedError(msg) return headers, bios_uri, bios_settings else: msg = ('"links/BIOS" section in ComputerSystem/Oem/Hp' ' does not exist') raise exception.IloCommandNotSupportedError(msg)
python
Check if the bios resource exists.
[ "Check", "if", "the", "bios", "resource", "exists", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L215-L242
train
41,584
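The resource-discovery methods in this module all repeat the same guard: check `Oem`/`Hp`/`links` for a named entry before following its `href`. A small helper (hypothetical, not part of proliantutils) makes that walk explicit:

```python
def get_oem_link_href(resource, name):
    """Return Oem/Hp/links/<name>/href from a resource dict, or None."""
    link = resource.get('Oem', {}).get('Hp', {}).get('links', {}).get(name)
    return link.get('href') if isinstance(link, dict) else None

system = {'Oem': {'Hp': {'links': {'BIOS': {'href': '/rest/v1/systems/1/bios'}}}}}
print(get_oem_link_href(system, 'BIOS'))        # /rest/v1/systems/1/bios
print(get_oem_link_href(system, 'PCIDevices'))  # None
```

Returning `None` instead of raising lets the caller decide between `IloCommandNotSupportedError` (section missing) and `IloError` (bad response), mirroring the split the real methods make.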
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_pci_devices
def _get_pci_devices(self): """Gets the PCI devices. :returns: PCI devices list if the pci resource exists. :raises: IloCommandNotSupportedError if the PCI resource doesn't exist. :raises: IloError, on an error from iLO. """ system = self._get_host_details() if ('links' in system['Oem']['Hp'] and 'PCIDevices' in system['Oem']['Hp']['links']): # Get the PCI URI and Settings pci_uri = system['Oem']['Hp']['links']['PCIDevices']['href'] status, headers, pci_device_list = self._rest_get(pci_uri) if status >= 300: msg = self._get_extended_error(pci_device_list) raise exception.IloError(msg) return pci_device_list else: msg = ('links/PCIDevices section in ComputerSystem/Oem/Hp' ' does not exist') raise exception.IloCommandNotSupportedError(msg)
python
Gets the PCI devices. :returns: PCI devices list if the pci resource exists. :raises: IloCommandNotSupportedError if the PCI resource doesn't exist. :raises: IloError, on an error from iLO.
[ "Gets", "the", "PCI", "devices", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L244-L269
train
41,585
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_gpu_pci_devices
def _get_gpu_pci_devices(self): """Returns the list of gpu devices.""" pci_device_list = self._get_pci_devices() gpu_list = [] items = pci_device_list['Items'] for item in items: if item['ClassCode'] in CLASSCODE_FOR_GPU_DEVICES: if item['SubclassCode'] in SUBCLASSCODE_FOR_GPU_DEVICES: gpu_list.append(item) return gpu_list
python
Returns the list of gpu devices.
[ "Returns", "the", "list", "of", "gpu", "devices", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L271-L281
train
41,586
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_storage_resource
def _get_storage_resource(self): """Gets the SmartStorage resource if exists. :raises: IloCommandNotSupportedError if the resource SmartStorage doesn't exist. :returns the tuple of SmartStorage URI, Headers and settings. """ system = self._get_host_details() if ('links' in system['Oem']['Hp'] and 'SmartStorage' in system['Oem']['Hp']['links']): # Get the SmartStorage URI and Settings storage_uri = system['Oem']['Hp']['links']['SmartStorage']['href'] status, headers, storage_settings = self._rest_get(storage_uri) if status >= 300: msg = self._get_extended_error(storage_settings) raise exception.IloError(msg) return headers, storage_uri, storage_settings else: msg = ('"links/SmartStorage" section in ComputerSystem/Oem/Hp' ' does not exist') raise exception.IloCommandNotSupportedError(msg)
python
Gets the SmartStorage resource if exists. :raises: IloCommandNotSupportedError if the resource SmartStorage doesn't exist. :returns the tuple of SmartStorage URI, Headers and settings.
[ "Gets", "the", "SmartStorage", "resource", "if", "exists", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L283-L305
train
41,587
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_array_controller_resource
def _get_array_controller_resource(self): """Gets the ArrayController resource if exists. :raises: IloCommandNotSupportedError if the resource ArrayController doesn't exist. :returns the tuple of ArrayControllers URI, Headers and settings. """ headers, storage_uri, storage_settings = self._get_storage_resource() if ('links' in storage_settings and 'ArrayControllers' in storage_settings['links']): # Get the ArrayControllers URI and Settings array_uri = storage_settings['links']['ArrayControllers']['href'] status, headers, array_settings = self._rest_get(array_uri) if status >= 300: msg = self._get_extended_error(array_settings) raise exception.IloError(msg) return headers, array_uri, array_settings else: msg = ('"links/ArrayControllers" section in SmartStorage' ' does not exist') raise exception.IloCommandNotSupportedError(msg)
python
Gets the ArrayController resource if exists. :raises: IloCommandNotSupportedError if the resource ArrayController doesn't exist. :returns the tuple of ArrayControllers URI, Headers and settings.
[ "Gets", "the", "ArrayController", "resource", "if", "exists", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L307-L329
train
41,588
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._create_list_of_array_controllers
def _create_list_of_array_controllers(self): """Creates the list of Array Controller URIs. :raises: IloCommandNotSupportedError if the ArrayControllers doesn't have member "Member". :returns list of ArrayControllers. """ headers, array_uri, array_settings = ( self._get_array_controller_resource()) array_uri_links = [] if ('links' in array_settings and 'Member' in array_settings['links']): array_uri_links = array_settings['links']['Member'] else: msg = ('"links/Member" section in ArrayControllers' ' does not exist') raise exception.IloCommandNotSupportedError(msg) return array_uri_links
python
Creates the list of Array Controller URIs. :raises: IloCommandNotSupportedError if the ArrayControllers doesn't have member "Member". :returns list of ArrayControllers.
[ "Creates", "the", "list", "of", "Array", "Controller", "URIs", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L331-L348
train
41,589
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_drive_type_and_speed
def _get_drive_type_and_speed(self): """Gets the disk drive type. :returns: A dictionary with the following keys: - has_rotational: True/False. It is True if at least one rotational disk is attached. - has_ssd: True/False. It is True if at least one SSD disk is attached. - rotational_drive_<speed>_rpm: These are set to true as per the speed of the rotational disks. :raises: IloCommandNotSupportedError if the PhysicalDrives resource doesn't exist. :raises: IloError, on an error from iLO. """ disk_details = self._get_physical_drive_resource() drive_hdd = False drive_ssd = False drive_details = {} speed_const_list = [4800, 5400, 7200, 10000, 15000] if disk_details: for item in disk_details: value = item['MediaType'] if value == "HDD": drive_hdd = True speed = item['RotationalSpeedRpm'] if speed in speed_const_list: var = 'rotational_drive_' + str(speed) + '_rpm' drive_details.update({var: 'true'}) # Note: RIS returns value as 'SDD' for SSD drives. else: drive_ssd = True if drive_hdd: drive_details.update({'has_rotational': 'true'}) if drive_ssd: drive_details.update({'has_ssd': 'true'}) return drive_details if len(drive_details.keys()) > 0 else None
python
Gets the disk drive type. :returns: A dictionary with the following keys: - has_rotational: True/False. It is True if at least one rotational disk is attached. - has_ssd: True/False. It is True if at least one SSD disk is attached. - rotational_drive_<speed>_rpm: These are set to true as per the speed of the rotational disks. :raises: IloCommandNotSupportedError if the PhysicalDrives resource doesn't exist. :raises: IloError, on an error from iLO.
[ "Gets", "the", "disk", "drive", "type", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L350-L385
train
41,590
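The classification loop in `_get_drive_type_and_speed` can be exercised standalone; the sketch below mirrors its logic on hand-written disk records (sample data, not real RIS output):

```python
SPEEDS = (4800, 5400, 7200, 10000, 15000)

def drive_capabilities(disks):
    """Mirror of the HDD/SSD classification loop, on plain dicts."""
    caps = {}
    for d in disks:
        if d['MediaType'] == 'HDD':
            caps['has_rotational'] = 'true'
            # Only well-known rotational speeds produce a per-speed key.
            if d.get('RotationalSpeedRpm') in SPEEDS:
                caps['rotational_drive_%d_rpm' % d['RotationalSpeedRpm']] = 'true'
        else:
            # Any non-'HDD' media type is treated as an SSD.
            caps['has_ssd'] = 'true'
    return caps or None

disks = [{'MediaType': 'HDD', 'RotationalSpeedRpm': 10000},
         {'MediaType': 'SSD'}]
print(drive_capabilities(disks))
```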
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_drive_resource
def _get_drive_resource(self, drive_name): """Gets the DiskDrive resource if exists. :param drive_name: can be either "PhysicalDrives" or "LogicalDrives". :returns the list of drives. :raises: IloCommandNotSupportedError if the given drive resource doesn't exist. :raises: IloError, on an error from iLO. """ disk_details_list = [] array_uri_links = self._create_list_of_array_controllers() for array_link in array_uri_links: _, _, member_settings = ( self._rest_get(array_link['href'])) if ('links' in member_settings and drive_name in member_settings['links']): disk_uri = member_settings['links'][drive_name]['href'] headers, disk_member_uri, disk_mem = ( self._rest_get(disk_uri)) if ('links' in disk_mem and 'Member' in disk_mem['links']): for disk_link in disk_mem['links']['Member']: diskdrive_uri = disk_link['href'] _, _, disk_details = ( self._rest_get(diskdrive_uri)) disk_details_list.append(disk_details) else: msg = ('"links/Member" section in %s' ' does not exist' % drive_name) raise exception.IloCommandNotSupportedError(msg) else: msg = ('"links/%s" section in' ' ArrayController/links/Member does not exist' % drive_name) raise exception.IloCommandNotSupportedError(msg) if disk_details_list: return disk_details_list
python
Gets the DiskDrive resource if exists. :param drive_name: can be either "PhysicalDrives" or "LogicalDrives". :returns the list of drives. :raises: IloCommandNotSupportedError if the given drive resource doesn't exist. :raises: IloError, on an error from iLO.
[ "Gets", "the", "DiskDrive", "resource", "if", "exists", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L387-L425
train
41,591
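One Python pitfall worth noting around the error messages in `_get_drive_resource`: adjacent string literals inside parentheses concatenate, but adding a comma before the argument silently builds a `(format, args)` tuple instead of an interpolated string. A minimal demonstration:

```python
drive_name = 'PhysicalDrives'

# The comma makes this a (format, args) tuple, not a message string.
as_tuple = ('"links/%s" section does not exist', drive_name)

# %-interpolation inside the parentheses yields the intended string;
# the two literals concatenate first, then % applies.
as_string = ('"links/%s" section'
             ' does not exist' % drive_name)

print(type(as_tuple).__name__)  # tuple
print(as_string)                # "links/PhysicalDrives" section does not exist
```

Raising an exception with the tuple form still "works" but produces an unreadable message, which is why the `%`-interpolated form is the one to use.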
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_logical_raid_levels
def _get_logical_raid_levels(self): """Gets the different raid levels configured on a server. :returns a dictionary of logical_raid_levels set to true. For example, if RAID levels 1+0 and 6 are configured, it returns {'logical_raid_level_10': 'true', 'logical_raid_level_6': 'true'} """ logical_drive_details = self._get_logical_drive_resource() raid_level = {} if logical_drive_details: for item in logical_drive_details: if 'Raid' in item: raid_level_var = "logical_raid_level_" + item['Raid'] raid_level.update({raid_level_var: 'true'}) return raid_level if len(raid_level.keys()) > 0 else None
python
Gets the different raid levels configured on a server. :returns a dictionary of logical_raid_levels set to true. For example, if RAID levels 1+0 and 6 are configured, it returns {'logical_raid_level_10': 'true', 'logical_raid_level_6': 'true'}
[ "Gets", "the", "different", "raid", "levels", "configured", "on", "a", "server", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L435-L450
train
41,592
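The RAID-level aggregation above is easy to check in isolation; this sketch applies the same transformation to hand-written logical-drive dicts (sample data only):

```python
def logical_raid_levels(logical_drives):
    """Build the logical_raid_level_<n> capability dict, as above."""
    levels = {}
    for item in logical_drives:
        if 'Raid' in item:
            # The RIS 'Raid' value is already a string like '10' or '6'.
            levels['logical_raid_level_' + item['Raid']] = 'true'
    return levels or None

print(logical_raid_levels([{'Raid': '10'}, {'Raid': '6'}, {'Name': 'ld3'}]))
# {'logical_raid_level_10': 'true', 'logical_raid_level_6': 'true'}
```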
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._is_raid_supported
def _is_raid_supported(self): """Get the RAID support on the server. This method returns the raid support on the physical server. It checks for the list of array controllers configured to the Smart Storage. If one or more array controllers are available, then raid is supported by the server. If none, raid is not supported. :return: True if RAID is supported, False otherwise. """ header, uri, array_resource = self._get_array_controller_resource() return True if array_resource['Total'] > 0 else False
python
Get the RAID support on the server. This method returns the raid support on the physical server. It checks for the list of array controllers configured to the Smart Storage. If one or more array controllers are available, then raid is supported by the server. If none, raid is not supported. :return: True if RAID is supported, False otherwise.
[ "Get", "the", "RAID", "support", "on", "the", "server", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L452-L464
train
41,593
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_bios_settings_resource
def _get_bios_settings_resource(self, data): """Get the BIOS settings resource.""" try: bios_settings_uri = data['links']['Settings']['href'] except KeyError: msg = ('BIOS Settings resource not found.') raise exception.IloError(msg) status, headers, bios_settings = self._rest_get(bios_settings_uri) if status != 200: msg = self._get_extended_error(bios_settings) raise exception.IloError(msg) return headers, bios_settings_uri, bios_settings
python
Get the BIOS settings resource.
[ "Get", "the", "BIOS", "settings", "resource", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L466-L479
train
41,594
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._validate_if_patch_supported
def _validate_if_patch_supported(self, headers, uri): """Check if the PATCH Operation is allowed on the resource.""" if not self._operation_allowed(headers, 'PATCH'): msg = ('PATCH Operation not supported on the resource ' '"%s"' % uri) raise exception.IloError(msg)
python
Check if the PATCH Operation is allowed on the resource.
[ "Check", "if", "the", "PATCH", "Operation", "is", "allowed", "on", "the", "resource", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L481-L486
train
41,595
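`_validate_if_patch_supported` defers to an `_operation_allowed` helper that is not shown in this excerpt. A plausible sketch checks the HTTP `Allow` header for the verb; the header key and parsing below are assumptions, not the proliantutils implementation:

```python
def operation_allowed(headers, verb):
    """Check a response's Allow header for an HTTP verb (hypothetical sketch)."""
    allow = headers.get('allow', headers.get('Allow', ''))
    # The Allow header is a comma-separated list of verbs.
    return verb.upper() in (v.strip().upper() for v in allow.split(','))

hdrs = {'allow': 'GET, HEAD, PATCH'}
print(operation_allowed(hdrs, 'PATCH'))   # True
print(operation_allowed(hdrs, 'DELETE'))  # False
```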
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_bios_setting
def _get_bios_setting(self, bios_property): """Retrieves bios settings of the server.""" headers, bios_uri, bios_settings = self._check_bios_resource([ bios_property]) return bios_settings[bios_property]
python
Retrieves bios settings of the server.
[ "Retrieves", "bios", "settings", "of", "the", "server", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L488-L492
train
41,596
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_bios_hash_password
def _get_bios_hash_password(self, bios_password): """Get the hashed BIOS password.""" request_headers = {} if bios_password: bios_password_hash = hashlib.sha256( bios_password.encode()).hexdigest().upper() request_headers['X-HPRESTFULAPI-AuthToken'] = bios_password_hash return request_headers
python
Get the hashed BIOS password.
[ "Get", "the", "hashed", "BIOS", "password", "." ]
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L494-L501
train
41,597
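The hashing chain in `_get_bios_hash_password` is standard library only: SHA-256 over the UTF-8 bytes of the password, hex-encoded and upper-cased. Standalone (the example passwords are arbitrary):

```python
import hashlib

def bios_auth_token(bios_password):
    """SHA-256 the password bytes, hex-encode, then upper-case."""
    return hashlib.sha256(bios_password.encode()).hexdigest().upper()

print(bios_auth_token('abc'))
# BA7816BF8F01CFEA414140DE5DAE2223B00361A396177A9CB410FF61F20015AD
```

The call order matters: `hexdigest()` exists on the hash object, not on the input bytes, so `sha256()` must be applied before hex-encoding.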
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._change_bios_setting
def _change_bios_setting(self, properties): """Change the bios settings to specified values.""" keys = properties.keys() # Check if the BIOS resource/property exists. headers, bios_uri, settings = self._check_bios_resource(keys) if not self._operation_allowed(headers, 'PATCH'): headers, bios_uri, _ = self._get_bios_settings_resource(settings) self._validate_if_patch_supported(headers, bios_uri) request_headers = self._get_bios_hash_password(self.bios_password) status, headers, response = self._rest_patch(bios_uri, request_headers, properties) if status >= 300: msg = self._get_extended_error(response) raise exception.IloError(msg)
python
def _change_bios_setting(self, properties): """Change the bios settings to specified values.""" keys = properties.keys() # Check if the BIOS resource/property exists. headers, bios_uri, settings = self._check_bios_resource(keys) if not self._operation_allowed(headers, 'PATCH'): headers, bios_uri, _ = self._get_bios_settings_resource(settings) self._validate_if_patch_supported(headers, bios_uri) request_headers = self._get_bios_hash_password(self.bios_password) status, headers, response = self._rest_patch(bios_uri, request_headers, properties) if status >= 300: msg = self._get_extended_error(response) raise exception.IloError(msg)
Change the bios settings to specified values.
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L503-L517
train
41,598
openstack/proliantutils
proliantutils/ilo/ris.py
RISOperations._get_iscsi_settings_resource
def _get_iscsi_settings_resource(self, data):
    """Get the iscsi settings resource.

    :param data: Existing iscsi settings of the server.
    :returns: headers, iscsi_settings url and iscsi settings as
        a dictionary.
    :raises: IloCommandNotSupportedError, if resource is not found.
    :raises: IloError, on an error from iLO.
    """
    try:
        iscsi_settings_uri = data['links']['Settings']['href']
    except KeyError:
        msg = 'iscsi settings resource not found.'
        raise exception.IloCommandNotSupportedError(msg)

    status, headers, iscsi_settings = self._rest_get(iscsi_settings_uri)
    if status != 200:
        msg = self._get_extended_error(iscsi_settings)
        raise exception.IloError(msg)

    return headers, iscsi_settings_uri, iscsi_settings
python
Get the iscsi settings resource. :param data: Existing iscsi settings of the server. :returns: headers, iscsi_settings url and iscsi settings as a dictionary. :raises: IloCommandNotSupportedError, if resource is not found. :raises: IloError, on an error from iLO.
86ef3b47b4eca97c221577e3570b0240d6a25f22
https://github.com/openstack/proliantutils/blob/86ef3b47b4eca97c221577e3570b0240d6a25f22/proliantutils/ilo/ris.py#L519-L540
train
41,599
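The URI lookup in `_get_iscsi_settings_resource` is a nested-dict access where a `KeyError` at any level means the resource is unsupported. A small sketch of that pattern (the `settings_uri` helper and the sample payload shape are illustrative assumptions):

```python
def settings_uri(data):
    """Return the Settings href from a RIS-style payload, or None if absent."""
    try:
        # Any missing level (links, Settings, href) means the
        # writable settings resource is not exposed by this server.
        return data['links']['Settings']['href']
    except (KeyError, TypeError):
        return None

sample = {'links': {'Settings': {'href': '/rest/v1/Systems/1/bios/iScsi/Settings'}}}
print(settings_uri(sample))        # the href string
print(settings_uri({'links': {}})) # None -> would raise IloCommandNotSupportedError
```

Catching only `KeyError`, as the original does, is sufficient when the payload is always a dict tree; `TypeError` is added here defensively for non-dict intermediate values.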