===============================================
Rule ``php_unit_test_case_static_method_calls``
===============================================
Calls to ``PHPUnit\Framework\TestCase`` static methods must all be of the same
type, either ``$this->``, ``self::`` or ``static::``.
.. warning:: Using this rule is risky.

   Risky when PHPUnit methods are overridden or not accessible, or when project
   has PHPUnit incompatibilities.
Configuration
-------------
``call_type``
~~~~~~~~~~~~~
The call type to use for referring to PHPUnit methods.
Allowed values: ``'self'``, ``'static'``, ``'this'``
Default value: ``'static'``
``methods``
~~~~~~~~~~~
Dictionary of ``method`` => ``call_type`` values that differ from the default
strategy.
Allowed types: ``array``
Default value: ``[]``
Examples
--------
Example #1
~~~~~~~~~~
*Default* configuration.
.. code-block:: diff

   --- Original
   +++ New
   @@ -3,8 +3,8 @@
    {
        public function testMe()
        {
   -        $this->assertSame(1, 2);
   -        self::assertSame(1, 2);
   +        static::assertSame(1, 2);
   +        static::assertSame(1, 2);
            static::assertSame(1, 2);
        }
    }
Example #2
~~~~~~~~~~
With configuration: ``['call_type' => 'this']``.
.. code-block:: diff

   --- Original
   +++ New
   @@ -4,7 +4,7 @@
        public function testMe()
        {
            $this->assertSame(1, 2);
   -        self::assertSame(1, 2);
   -        static::assertSame(1, 2);
   +        $this->assertSame(1, 2);
   +        $this->assertSame(1, 2);
        }
    }
Rule sets
---------
The rule is part of the following rule set:
@PhpCsFixer:risky
  Using the ``@PhpCsFixer:risky`` rule set will enable the ``php_unit_test_case_static_method_calls`` rule with the default config.
Changelog
==========
2.1.2 (2016-05-18)
-------------------
- Check pywin32 is installed before running
2.1.1 (not released)
--------------------
2.1.0 (not released)
--------------------
2.0.0 (2015-06-26)
-------------------
- Make `image` and `figure` directives support visio images
- Rename `name` option of `visio-image` and `visio-figure` directives (Usually, `name` is used for linking)
- Fix bugs
- Fix invalid path was generated when .rst is in subdir
1.1.0 (2014-09-24)
-------------------
- Add `visio-image` and `visio-figure` directives
- Change license to Apache 2.0
- Update docs
- Fix bugs
1.0.2 (2014-09-15)
-------------------
- Add python2.7 support
- Support .vsdx format
- Refactor whole of script
- Add test scripts
1.0.1
------
1.0.0
------
- initial release
.. _UserGuide:GUI:
Tools with GUI
##############
By default, tools with Graphical User Interface (GUI) cannot be used in containers, because there is no graphical
server.
However, there are multiple alternatives for making an `X11 <https://en.wikipedia.org/wiki/X_Window_System>`__ or
`Wayland <https://en.wikipedia.org/wiki/Wayland_(display_server_protocol)>`__ server visible to the container.
`gh:mviereck/x11docker <https://github.com/mviereck/x11docker>`__ and `gh:mviereck/runx <https://github.com/mviereck/runx>`__
are full-featured helper scripts for setting up the environment and running GUI applications and desktop environments in
OCI containers.
GNU/Linux and Windows hosts are supported, and security-related options are provided (such as cookie authentication).
Users of GTKWave, KLayout, nextpnr and other tools will likely want to try x11docker (and runx).
.. figure:: ../_static/img/x11docker_klayout.gif
   :alt: Execution of KLayout in a container on Windows 10
:width: 100%
:align: center
Execution of KLayout in a container on Windows 10 (MSYS2/MINGW64) with
`mviereck/x11docker <https://github.com/mviereck/x11docker>`__,
`mviereck/runx <https://github.com/mviereck/runx>`__
and `VcxSrv <https://sourceforge.net/projects/vcxsrv/>`__.
* `x11docker: Run GUI applications in Docker containers; Journal of Open Source Hardware <https://joss.theoj.org/papers/10.21105/joss.01349>`__.
.. -*- coding: utf-8 -*-
.. :Project: pglast -- DO NOT EDIT: generated automatically
.. :Author: Lele Gaifax <lele@metapensiero.it>
.. :License: GNU General Public License version 3 or later
.. :Copyright: © 2017-2020 Lele Gaifax
..
======================================================
:mod:`pglast.printers.ddl` --- DDL printer functions
======================================================
.. module:: pglast.printers.ddl
:synopsis: DDL printer functions
.. index:: AccessPriv
.. function:: access_priv(node, output)
Pretty print a `node` of type `AccessPriv <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1938>`__ to the `output` stream.
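All of the printers documented on this page share the same ``(node, output)`` signature. The sketch below illustrates, in plain Python, how a registry of such node-specific printer functions can be dispatched by node tag; it is a simplified illustration of the pattern only, not pglast's actual API (``node_printer``, ``print_node`` and the dict-based node are hypothetical names):

```python
# Simplified sketch of a printer registry: each node tag maps to a
# function with the (node, output) signature used throughout this module.
registry = {}

def node_printer(tag):
    """Decorator registering a printer function for a given node tag."""
    def register(func):
        registry[tag] = func
        return func
    return register

@node_printer('RenameStmt')
def rename_stmt(node, output):
    # Emit a minimal ALTER TABLE ... RENAME TO ... statement.
    output.append('ALTER TABLE %s RENAME TO %s'
                  % (node['relation'], node['newname']))

def print_node(node, output):
    # Dispatch on the node's tag to the registered printer.
    registry[node['tag']](node, output)

out = []
print_node({'tag': 'RenameStmt', 'relation': 'old_tbl',
            'newname': 'new_tbl'}, out)
print(out[0])  # -> ALTER TABLE old_tbl RENAME TO new_tbl
```

The real printers receive richer node objects, but the dispatch-by-tag idea is the same.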
.. index:: AlterDatabaseStmt
.. function:: alter_database_stmt(node, output)
Pretty print a `node` of type `AlterDatabaseStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3113>`__ to the `output` stream.
.. index:: AlterDatabaseSetStmt
.. function:: alter_database_set_stmt(node, output)
Pretty print a `node` of type `AlterDatabaseSetStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3120>`__ to the `output` stream.
.. index:: AlterEnumStmt
.. function:: alter_enum_stmt(node, output)
Pretty print a `node` of type `AlterEnumStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3055>`__ to the `output` stream.
.. index:: AlterDefaultPrivilegesStmt
.. function:: alter_default_privileges_stmt(node, output)
Pretty print a `node` of type `AlterDefaultPrivilegesStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1969>`__ to the `output` stream.
.. index:: AlterFunctionStmt
.. function:: alter_function_stmt(node, output)
Pretty print a `node` of type `AlterFunctionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2826>`__ to the `output` stream.
.. index:: AlterObjectSchemaStmt
.. function:: alter_object_schema_stmt(node, output)
Pretty print a `node` of type `AlterObjectSchemaStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2907>`__ to the `output` stream.
.. index:: AlterOwnerStmt
.. function:: alter_owner_stmt(node, output)
Pretty print a `node` of type `AlterOwnerStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2921>`__ to the `output` stream.
.. index:: AlterRoleStmt
.. function:: alter_role_stmt(node, output)
Pretty print a `node` of type `AlterRoleStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2491>`__ to the `output` stream.
.. index:: AlterSeqStmt
.. function:: alter_seq_stmt(node, output)
Pretty print a `node` of type `AlterSeqStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2529>`__ to the `output` stream.
.. index:: AlterTableStmt
.. function:: alter_table_stmt(node, output)
Pretty print a `node` of type `AlterTableStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1750>`__ to the `output` stream.
.. index:: AlterTableCmd
.. function:: alter_table_cmd(node, output)
Pretty print a `node` of type `AlterTableCmd <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1837>`__ to the `output` stream.
.. index:: ClusterStmt
.. function:: cluster_stmt(node, output)
Pretty print a `node` of type `ClusterStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3158>`__ to the `output` stream.
.. index:: ColumnDef
.. function:: column_def(node, output)
Pretty print a `node` of type `ColumnDef <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L643>`__ to the `output` stream.
.. index:: CommentStmt
.. function:: comment_stmt(node, output)
Pretty print a `node` of type `CommentStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2654>`__ to the `output` stream.
.. index:: CompositeTypeStmt
.. function:: composite_type_stmt(node, output)
Pretty print a `node` of type `CompositeTypeStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3022>`__ to the `output` stream.
.. index:: Constraint
.. function:: constraint(node, output)
Pretty print a `node` of type `Constraint <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2126>`__ to the `output` stream.
.. index:: CreateAmStmt
.. function:: create_am_stmt(node, output)
Pretty print a `node` of type `CreateAmStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2392>`__ to the `output` stream.
.. index:: CreatedbStmt
.. function:: create_db_stmt(node, output)
Pretty print a `node` of type `CreatedbStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3102>`__ to the `output` stream.
.. index::
pair: CreatedbStmt;DefElem
.. function:: create_db_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `CreatedbStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3102>`__, to the `output` stream.
.. index:: CreateCastStmt
.. function:: create_cast_stmt(node, output)
Pretty print a `node` of type `CreateCastStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3344>`__ to the `output` stream.
.. index:: CreateConversionStmt
.. function:: create_conversion_stmt(node, output)
Pretty print a `node` of type `CreateConversionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3330>`__ to the `output` stream.
.. index:: CreateDomainStmt
.. function:: create_domain_stmt(node, output)
Pretty print a `node` of type `CreateDomainStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2558>`__ to the `output` stream.
.. index:: CreateEnumStmt
.. function:: create_enum_stmt(node, output)
Pretty print a `node` of type `CreateEnumStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3033>`__ to the `output` stream.
.. index:: CreateEventTrigStmt
.. function:: create_event_trig_stmt(node, output)
Pretty print a `node` of type `CreateEventTrigStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2431>`__ to the `output` stream.
.. index::
pair: CreateEventTrigStmt;DefElem
.. function:: create_event_trig_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `CreateEventTrigStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2431>`__, to the `output` stream.
.. index:: CreateExtensionStmt
.. function:: create_extension_stmt(node, output)
Pretty print a `node` of type `CreateExtensionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2222>`__ to the `output` stream.
.. index::
pair: CreateExtensionStmt;DefElem
.. function:: create_extension_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `CreateExtensionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2222>`__, to the `output` stream.
.. index:: CreateFdwStmt
.. function:: create_fdw_stmt(node, output)
Pretty print a `node` of type `CreateFdwStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2252>`__ to the `output` stream.
.. index::
pair: CreateFdwStmt;DefElem
.. function:: create_fdw_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `CreateFdwStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2252>`__, to the `output` stream.
.. index:: CreateForeignTableStmt
.. function:: create_foreign_table_stmt(node, output)
Pretty print a `node` of type `CreateForeignTableStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2298>`__ to the `output` stream.
.. index::
pair: CreateForeignTableStmt;DefElem
.. function:: create_foreign_table_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `CreateForeignTableStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2298>`__, to the `output` stream.
.. index:: CreateFunctionStmt
.. function:: create_function_stmt(node, output)
Pretty print a `node` of type `CreateFunctionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2796>`__ to the `output` stream.
.. index::
pair: AlterFunctionStmt;DefElem
.. index::
pair: CreateFunctionStmt;DefElem
.. index::
pair: DoStmt;DefElem
.. function:: create_function_option(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `AlterFunctionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2826>`__ or a `CreateFunctionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2796>`__ or a `DoStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2840>`__, to the `output` stream.
.. index:: CreateOpClassStmt
.. function:: create_opclass_stmt(node, output)
Pretty print a `node` of type `CreateOpClassStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2571>`__ to the `output` stream.
.. index:: CreateOpClassItem
.. function:: create_opclass_item(node, output)
Pretty print a `node` of type `CreateOpClassItem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2586>`__ to the `output` stream.
.. index:: CreatePLangStmt
.. function:: create_plang_stmt(node, output)
Pretty print a `node` of type `CreatePLangStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2456>`__ to the `output` stream.
.. index:: CreatePolicyStmt
.. function:: create_policy_stmt(node, output)
Pretty print a `node` of type `CreatePolicyStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2362>`__ to the `output` stream.
.. index:: AlterPolicyStmt
.. function:: create_policy_stmt(node, output)
Pretty print a `node` of type `AlterPolicyStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2378>`__ to the `output` stream.
.. index:: CreateSchemaStmt
.. function:: create_schema_stmt(node, output)
Pretty print a `node` of type `CreateSchemaStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1731>`__ to the `output` stream.
.. index:: CreateSeqStmt
.. function:: create_seq_stmt(node, output)
Pretty print a `node` of type `CreateSeqStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2519>`__ to the `output` stream.
.. index::
pair: CreateSeqStmt;DefElem
.. function:: create_seq_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `CreateSeqStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2519>`__, to the `output` stream.
.. index::
pair: AlterSeqStmt;DefElem
.. function:: create_seq_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `AlterSeqStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2529>`__, to the `output` stream.
.. index:: CreateStmt
.. function:: create_stmt(node, output)
Pretty print a `node` of type `CreateStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2046>`__ to the `output` stream.
.. index:: CreateTableAsStmt
.. function:: create_table_as_stmt(node, output)
Pretty print a `node` of type `CreateTableAsStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3224>`__ to the `output` stream.
.. index:: CreateTrigStmt
.. function:: create_trig_stmt(node, output)
Pretty print a `node` of type `CreateTrigStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2404>`__ to the `output` stream.
.. index:: DefineStmt
.. function:: define_stmt(node, output)
Pretty print a `node` of type `DefineStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2542>`__ to the `output` stream.
.. index:: DefElem
.. function:: def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__ to the `output` stream.
.. index::
pair: DefineStmt;DefElem
.. function:: define_stmt_def_elem(node, output)
Pretty print a `node` of type `DefElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L726>`__, when it is inside a `DefineStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2542>`__, to the `output` stream.
.. index:: DiscardStmt
.. function:: discard_stmt(node, output)
Pretty print a `node` of type `DiscardStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3268>`__ to the `output` stream.
.. index:: DoStmt
.. function:: do_stmt(node, output)
Pretty print a `node` of type `DoStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2840>`__ to the `output` stream.
.. index:: DropdbStmt
.. function:: drop_db_stmt(node, output)
Pretty print a `node` of type `DropdbStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3131>`__ to the `output` stream.
.. index:: DropOwnedStmt
.. function:: drop_owned_stmt(node, output)
Pretty print a `node` of type `DropOwnedStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3408>`__ to the `output` stream.
.. index:: DropRoleStmt
.. function:: drop_role_stmt(node, output)
Pretty print a `node` of type `DropRoleStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2507>`__ to the `output` stream.
.. index:: DropStmt
.. function:: drop_stmt(node, output)
Pretty print a `node` of type `DropStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2628>`__ to the `output` stream.
.. index:: DropSubscriptionStmt
.. function:: drop_subscription_stmt(node, output)
Pretty print a `node` of type `DropSubscriptionStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3516>`__ to the `output` stream.
.. index:: DropTableSpaceStmt
.. function:: drop_table_space_stmt(node, output)
Pretty print a `node` of type `DropTableSpaceStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2192>`__ to the `output` stream.
.. index:: DropUserMappingStmt
.. function:: drop_user_mapping_stmt(node, output)
Pretty print a `node` of type `DropUserMappingStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2327>`__ to the `output` stream.
.. index:: FunctionParameter
.. function:: function_parameter(node, output)
Pretty print a `node` of type `FunctionParameter <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2817>`__ to the `output` stream.
.. index:: GrantStmt
.. function:: grant_stmt(node, output)
Pretty print a `node` of type `GrantStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1901>`__ to the `output` stream.
.. index:: GrantRoleStmt
.. function:: grant_role_stmt(node, output)
Pretty print a `node` of type `GrantRoleStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1954>`__ to the `output` stream.
.. index:: IndexStmt
.. function:: index_stmt(node, output)
Pretty print a `node` of type `IndexStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2749>`__ to the `output` stream.
.. index:: LockStmt
.. function:: lock_stmt(node, output)
Pretty print a `node` of type `LockStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3278>`__ to the `output` stream.
.. index:: NotifyStmt
.. function:: notify_stmt(node, output)
Pretty print a `node` of type `NotifyStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2963>`__ to the `output` stream.
.. index:: ObjectWithArgs
.. function:: object_with_args(node, output)
Pretty print a `node` of type `ObjectWithArgs <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1921>`__ to the `output` stream.
.. index:: PartitionBoundSpec
.. function:: partition_bound_spec(node, output)
Pretty print a `node` of type `PartitionBoundSpec <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L808>`__ to the `output` stream.
.. index:: PartitionElem
.. function:: partition_elem(node, output)
Pretty print a `node` of type `PartitionElem <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L773>`__ to the `output` stream.
.. index:: PartitionRangeDatum
.. function:: partition_range_datum(node, output)
Pretty print a `node` of type `PartitionRangeDatum <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L841>`__ to the `output` stream.
.. index:: PartitionSpec
.. function:: partition_spec(node, output)
Pretty print a `node` of type `PartitionSpec <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L788>`__ to the `output` stream.
.. index:: RenameStmt
.. function:: rename_stmt(node, output)
Pretty print a `node` of type `RenameStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2876>`__ to the `output` stream.
.. index:: RoleSpec
.. function:: role_spec(node, output)
Pretty print a `node` of type `RoleSpec <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L325>`__ to the `output` stream.
.. index:: RuleStmt
.. function:: rule_stmt_printer(node, output)
Pretty print a `node` of type `RuleStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L2947>`__ to the `output` stream.
.. index:: TriggerTransition
.. function:: trigger_transition(node, output)
Pretty print a `node` of type `TriggerTransition <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L1458>`__ to the `output` stream.
.. index:: VacuumStmt
.. function:: vacuum_stmt(node, output)
Pretty print a `node` of type `VacuumStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3173>`__ to the `output` stream.
.. index:: VacuumRelation
.. function:: vacuum_relation(node, output)
Pretty print a `node` of type `VacuumRelation <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3188>`__ to the `output` stream.
.. index:: ViewStmt
.. function:: view_stmt(node, output)
Pretty print a `node` of type `ViewStmt <https://github.com/rdunklau/libpg_query/blob/90bc162/tmp/postgres/src/include/nodes/parsenodes.h#L3077>`__ to the `output` stream.
Contributing
============
Thanks for your interest in contributing to the project!
.. toctree::
:maxdepth: 3
documentation
local
util Package
============
:mod:`util` Package
-------------------
.. automodule:: pymatgen.util
:members:
:undoc-members:
:show-inheritance:
:mod:`coord_utils` Module
-------------------------
.. automodule:: pymatgen.util.coord_utils
:members:
:undoc-members:
:show-inheritance:
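As a flavor of what this module provides, the sketch below reimplements a typical coordinate helper: the shortest periodic difference between two sets of fractional coordinates. This is a plain-Python illustration only; pymatgen's actual functions (and their exact names and signatures) may differ:

```python
def pbc_diff(frac1, frac2):
    """Return the shortest vector from ``frac2`` to ``frac1`` under
    periodic boundary conditions, each component mapped into [-0.5, 0.5)."""
    return [((a - b + 0.5) % 1.0) - 0.5 for a, b in zip(frac1, frac2)]

# Two points near opposite corners of the unit cell are actually close:
delta = pbc_diff([0.1, 0.1, 0.1], [0.9, 0.9, 0.9])
print([round(d, 6) for d in delta])  # -> [0.2, 0.2, 0.2]
```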
:mod:`decorators` Module
------------------------
.. automodule:: pymatgen.util.decorators
:members:
:undoc-members:
:show-inheritance:
:mod:`io_utils` Module
----------------------
.. automodule:: pymatgen.util.io_utils
:members:
:undoc-members:
:show-inheritance:
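The kind of convenience this module offers can be pictured with the small generator below, which strips lines and optionally skips empty ones. It is a hypothetical reimplementation for illustration; consult the API listing above for pymatgen's real helpers:

```python
def clean_lines(lines, remove_empty=True):
    """Yield stripped lines, optionally skipping the empty ones --
    the kind of convenience generator an io_utils module provides."""
    for line in lines:
        stripped = line.strip()
        if stripped or not remove_empty:
            yield stripped

raw = ["  foo  \n", "\n", "bar\n"]
print(list(clean_lines(raw)))  # -> ['foo', 'bar']
```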
:mod:`plotting_utils` Module
----------------------------
.. automodule:: pymatgen.util.plotting_utils
:members:
:undoc-members:
:show-inheritance:
:mod:`string_utils` Module
--------------------------
.. automodule:: pymatgen.util.string_utils
:members:
:undoc-members:
:show-inheritance:
Full Model
==========
.. autoclass:: pybamm.electrolyte_conductivity.Full
:members:
:inherited-members:
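For orientation, the full conductivity submodel solves a MacInnes-type relation for the electrolyte potential. Written dimensionally, it is commonly quoted as below (this is the textbook form; PyBaMM's implementation may scale or rearrange it):

```latex
i_\mathrm{e} = \kappa(c_\mathrm{e})
  \left( -\nabla \phi_\mathrm{e}
         + \frac{2RT}{F} \left( 1 - t^{+} \right) \nabla \ln c_\mathrm{e} \right)
```

where :math:`\phi_\mathrm{e}` is the electrolyte potential, :math:`c_\mathrm{e}` the salt concentration, :math:`\kappa` the conductivity and :math:`t^{+}` the cation transference number.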
*****************
TidyPy Change Log
*****************
.. contents:: Releases
0.7.0 (2018-10-24)
==================
**Enhancements**
* Upgraded the ``pycodestyle``, ``pydocstyle``, ``vulture``, and ``pyflakes``
tools.
* Added ability to distinguish and disable specific codes from the ``secrets``
tool.
0.6.0 (2018-09-30)
==================
**Enhancements**
* Added the ``secrets`` tool.
* Enabled the ``pydiatra`` tool on windows (thanks @jwilk).
* Upgraded the ``pylint`` and ``vulture`` tools.
* Upgraded the ``pep8-naming`` plugin of the ``pycodestyle`` tool.
**Fixes**
* Fixed an issue with ``rstlint`` crashing due to recent updates to Sphinx.
0.5.0 (2018-05-05)
==================
**Enhancements**
* Added ``manifest`` and ``pydiatra`` tools.
* Upgraded the ``pylint`` tool.
* Upgraded the ``pep8-naming`` plugin of the ``pycodestyle`` tool.
* Added some convenience handling of the ``License`` vs ``Licence`` and
``LicenceClassifier`` vs ``LicenseClassifier`` codes reported by ``pyroma``.
* Added the first draft of the project documentation.
* Added an ``extensions`` command that will output a listing of all the
available tools, reports, and extenders that are available.
**Fixes**
* Fixed the character location reported in ``pylint`` issues being off-by-one.
* Fixed various issues with the ``pyroma`` tool leaking problems to stderr.
0.4.0 (2017-12-02)
==================
**Enhancements**
* Added a ``sphinx-extensions`` option to the ``rstlint`` tool to enable the
automatic recognition of Sphinx-specific extensions to ReST (Sphinx must be
installed in the same environment as TidyPy for it to work).
* Added a ``ignore-roles`` option to the ``rstlint`` tool to help deal with
non-standard ReST text roles.
* Changed tool execution from a multithreaded model to multiprocess. Larger
projects should see an improvement in execution speed.
**Changes**
* The ``--threads`` option to the ``check`` command has been changed to
``--workers``.
**Fixes**
* Fixed an issue that caused the ``pylint`` tool to crash when it encountered
``duplicate-code`` issues on files that are being excluded from analysis.
0.3.0 (2017-11-18)
==================
**Enhancements**
* Added ``ignore-directives`` and ``load-directives`` options to the
``rstlint`` tool to help deal with non-standard ReST directives.
* Added support for the ``extension-pkg-whitelist`` option to the ``pylint``
tool.
* Added ``install-vcs`` and ``remove-vcs`` commands to install/remove
pre-commit hooks into the VCS of a project that will execute TidyPy.
Currently supports both Git and Mercurial.
**Changes**
* Changed the ``merge_issues`` and ``ignore_missing_extends`` options to
``merge-issues`` and ``ignore-missing-extends`` for naming consistency.
* Replaced the ``radon`` tool with the traditional ``mccabe`` tool.
**Fixes**
* Fixed issue that caused TidyPy to spin out of control if you used CTRL-C to
kill it while it was executing tools.
* Fixed issue where ``pylint``'s ``duplicate-code`` issue was reported only
against one file, and it was usually the wrong file. TidyPy will now report
an issue against each file identified with the duplicate code.
* Numerous fixes to support running TidyPy on Windows.
0.2.0 (2017-11-04)
==================
**Enhancements**
* Added a ``2to3`` tool.
* All tools that report issues against Python source files can now use the
``# noqa`` comment to ignore issues for that specific line.
* Added support for the ``ignore-nosec`` option in the ``bandit`` tool.
* Added the ability for TidyPy configurations to extend from other
configuration files via the ``extends`` property.
* Upgraded the ``vulture`` tool.
* Upgraded the ``pyflakes`` tool.
**Changes**
* Changed the ``--no-merge`` and ``--no-progress`` options to the ``check``
command to ``--disable-merge`` and ``--disable-progress``.
* The ``check`` command will now return ``1`` to the shell if TidyPy finds
issues.
* No longer overriding ``pycodestyle``'s default max-line-length.
**Fixes**
* If any tools output directly to stdout or stderr, TidyPy will now capture it
and report it as a ``tidypy:tool`` issue.
* Fixed crash/hang that occurred when using ``--disable-progress``.
0.1.0 (2017-10-15)
==================
* Initial public release.
`1.0.2`
-------
- **Fix:** Active category was not highlighted, which could also lead to a bad preview in the collapsed state
`1.0.1`
-------
- **FIX:** corrections to provide compatibility with version 10.0
`1.0.0`
-------
- Init version
Installation
----------------
Dependencies
^^^^^^^^^^^^^
- Specify LLVM path:
.. code-block:: sh
npm config set cmake_LLVM_DIR $(path-to-llvm/bin/llvm-config --cmakedir)
sudo npm config set cmake_LLVM_DIR $(path-to-llvm/bin/llvm-config --cmakedir) --global
Global installation
^^^^^^^^^^^^^^^^^^^^^
- Install Typescriptllvm globally:
.. code-block:: sh
sudo -E npm install -g @lungchen/typescriptllvm
Remember to add ``-E`` after ``sudo`` in the above command to ensure
the custom LLVM setting is copied to the root environment.
After this step, it can be run globally by the command ``typescriptllvm``.
Local installation
^^^^^^^^^^^^^^^^^^^^
- Install Typescriptllvm locally:
.. code-block:: sh
cd typescriptllvm
npm install @lungchen/typescriptllvm
After this step, it can be run locally with the command ``npx typescriptllvm``.
Detection Per IP Address
------------------------
.. _attempt-invalid-logins-2:
Attempt invalid logins
~~~~~~~~~~~~~~~~~~~~~~
In the browser address bar change to the login page URL at **http://hackzazon.f5demo.com/user/login**.
Try to log in with various usernames and passwords.
.. IMPORTANT::
Do NOT use the same username twice.
After **at least** 20 failed login attempts you should get the CAPTCHA page.
.. TIP::
Why does it take so many failed logins to detect when you are using different usernames?
Complete the CAPTCHA. You should be returned to the login screen.
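The behaviour observed above can be sketched abstractly: the WAF keys its failed-login counter on the source IP rather than on the username, so rotating usernames does not reset the count. The toy counter below illustrates the idea only; the threshold and class names are invented and this is not BIG-IP ASM's actual implementation:

```python
from collections import Counter

FAILED_LOGIN_THRESHOLD = 20  # invented value matching the lab behaviour


class BruteForceDetector:
    """Toy per-source-IP failed-login counter (not ASM's real logic)."""

    def __init__(self, threshold=FAILED_LOGIN_THRESHOLD):
        self.threshold = threshold
        self.failures_by_ip = Counter()

    def record_failure(self, source_ip, username):
        # The username is ignored on purpose: detection is keyed on the
        # source IP, which is why cycling usernames does not evade it.
        self.failures_by_ip[source_ip] += 1
        return self.failures_by_ip[source_ip] >= self.threshold


detector = BruteForceDetector()
triggered = [detector.record_failure('203.0.113.7', f'user{i}') for i in range(25)]
print(triggered.index(True))  # prints 19: the 20th failure trips the CAPTCHA
```

Note that a detector keyed on the username instead would never reach the threshold in this lab, since each attempt uses a fresh username.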
.. _review-asm-request-log-2:
Review ASM Request log
~~~~~~~~~~~~~~~~~~~~~~
In the BIG-IP browse to the ASM Request log at **Security >> Event Logs >> Application >> Requests**.
Look through the request log for the **most recent** illegal request to /user/login.
|image16|
.. NOTE::
What **Violation** was detected for this request?
What other details about this request are visible when you select the “occurrence”?
What indicator is there that this Brute Force violation was detected by IP address instead of by username?
.. |image12| image:: /_static/class8/credstuff/image12.png
.. |image13| image:: /_static/class8/credstuff/image13.png
.. |image14| image:: /_static/class8/credstuff/image14.png
.. |image15| image:: /_static/class8/credstuff/image15.png
.. |image16| image:: /_static/class8/credstuff/image16.png
.. |image17| image:: /_static/class8/credstuff/image17.png
.. |image18| image:: /_static/class8/credstuff/image18.png
.. |image19| image:: /_static/class8/credstuff/image19.png
.. |image20| image:: /_static/class8/credstuff/image20.png
.. |image21| image:: /_static/class8/credstuff/image21.png
.. |image22| image:: /_static/class8/credstuff/image22.png
.. |image23| image:: /_static/class8/credstuff/image23.png
.. |image24| image:: /_static/class8/credstuff/image24.png
.. |image25| image:: /_static/class8/credstuff/image25.png
.. |image26| image:: /_static/class8/credstuff/image26.png
.. |image27| image:: /_static/class8/credstuff/image27.png
.. |image28| image:: /_static/class8/credstuff/image28.png
.. |image29| image:: /_static/class8/credstuff/image29.png
.. |image30| image:: /_static/class8/credstuff/image30.png
.. |image31| image:: /_static/class8/credstuff/image31.png
.. |image32| image:: /_static/class8/credstuff/image32.png
.. |image33| image:: /_static/class8/credstuff/image33.png
.. |image34| image:: /_static/class8/credstuff/image34.png
References
----------
.. toctree::
:maxdepth: 1
templates
templatecontext
config
commands
playbook
troubleshooting
glossary
Indices and tables
++++++++++++++++++
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
Version 0.1
===========
First light!
New features
------------
* Bayesian optimization via `gp_minimize`.
* Tree-based sequential model-based optimization via `forest_minimize` and `gbrt_minimize`, with support for multi-threading.
* Support of LCB, EI and PI as acquisition functions.
* Plotting functions for inspecting convergence, evaluations and the objective function.
* API for specifying and sampling from a parameter space.
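The sequential model-based loop behind these functions can be sketched in plain Python. The surrogate below is a deliberately crude nearest-neighbour stand-in (not scikit-optimize's Gaussian process or tree models, and not its API); it only shows the propose, evaluate, update cycle:

```python
import random


def smbo_minimize(objective, bounds, n_calls=30, n_candidates=50, seed=0):
    """Minimal sequential model-based optimization sketch.

    Proposes random candidates, ranks them with a nearest-neighbour
    surrogate built from past evaluations, and evaluates the most
    promising one: a toy stand-in for gp_minimize/forest_minimize.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    xs, ys = [], []

    def surrogate(x):
        # predicted value = value of the closest evaluated point
        i = min(range(len(xs)), key=lambda j: abs(xs[j] - x))
        return ys[i]

    for call in range(n_calls):
        if call < 5:                       # a few random warm-up points
            x = rng.uniform(lo, hi)
        else:
            cands = [rng.uniform(lo, hi) for _ in range(n_candidates)]
            x = min(cands, key=surrogate)  # pick the most promising candidate
        xs.append(x)
        ys.append(objective(x))

    best = min(range(len(xs)), key=lambda i: ys[i])
    return xs[best], ys[best]


x_best, y_best = smbo_minimize(lambda x: (x - 2.0) ** 2, (-5.0, 5.0))
print(x_best, y_best)
```

The real library replaces the surrogate with a probabilistic model and the candidate ranking with an acquisition function such as LCB, EI or PI.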
.. _ref_multi_field_solver_convergence_controls_commands_api:
***************************************
Multi-Field Solver Convergence Controls
***************************************
.. currentmodule:: ansys.mapdl.core
These SOLUTION commands are used to define convergence controls for an
ANSYS Multi-field solver analysis.
.. autosummary::
:toctree: _autosummary/
Mapdl.mfconv
Mapdl.mfiter
Mapdl.mfrelax
Quickstart
==========
To illustrate neurtu usage, we will benchmark array sorting in numpy. First, we define a
generator of cases,
.. code:: python
import numpy as np
import neurtu
    def cases():
rng = np.random.RandomState(42)
for N in [1000, 10000, 100000]:
X = rng.rand(N)
tags = {'N' : N}
yield neurtu.delayed(X, tags=tags).sort()
that yields a sequence of delayed calculations, each tagged with the parameters defining individual runs.
We can evaluate the run time with,
.. code:: python
>>> df = neurtu.timeit(cases())
>>> print(df)
wall_time
N
1000 0.000014
10000 0.000134
100000 0.001474
which will internally use the ``timeit`` module with a sufficient number of evaluations to work around the timer precision
limitations (similarly to IPython's ``%timeit``). It will also display a progress bar for long running benchmarks,
and return the results as a ``pandas.DataFrame`` (if pandas is installed).
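For reference, a comparable measurement can be made by hand with the standard library's ``timeit`` module. The sketch below is plain Python, not neurtu's API, and only mimics the autoranging behaviour described above:

```python
import timeit


def manual_benchmark(sizes=(1000, 10000)):
    """Time list sorting by hand, roughly the way neurtu does internally."""
    results = []
    for n in sizes:
        data = list(range(n))[::-1]              # reverse-sorted input
        timer = timeit.Timer(lambda: sorted(data))
        # autorange() raises the number of loops until the total run time
        # is long enough for the timer's precision, similarly to IPython's
        # %timeit
        number, total = timer.autorange()
        results.append({'N': n, 'wall_time': total / number})
    return results


for row in manual_benchmark():
    print(row['N'], row['wall_time'])
```

neurtu adds the tagging, progress reporting and DataFrame packaging on top of this basic loop.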
By default, all evaluations are run with ``repeat=1``. If more statistical confidence is required, this value can
be increased,
.. code:: python
>>> neurtu.timeit(cases(), repeat=3)
wall_time
mean max std
N
1000 0.000012 0.000014 0.000002
10000 0.000116 0.000149 0.000029
100000 0.001323 0.001714 0.000339
In this case we will get a frame with a
`pandas.MultiIndex <https://pandas.pydata.org/pandas-docs/stable/advanced.html#multiindex-advanced-indexing>`_ for
columns, where the first level represents the metric name (``wall_time``) and the second -- the aggregation method.
By default ``neurtu.timeit`` is called with ``aggregate=['mean', 'max', 'std']`` methods, as supported
by the `pandas aggregation API <https://pandas.pydata.org/pandas-docs/version/0.22.0/groupby.html#aggregation>`_. To disable
aggregation and obtain timings for individual runs, use ``aggregate=False``.
See `neurtu.timeit documentation <https://neurtu.readthedocs.io/generated/neurtu.timeit.html>`_ for more details.
To evaluate the peak memory usage, one can use the ``neurtu.memit`` function with the same API,
.. code:: python
>>> neurtu.memit(cases(), repeat=3)
peak_memory
mean max std
N
10000 0.0 0.0 0.0
100000 0.0 0.0 0.0
1000000 0.0 0.0 0.0
More generally, ``neurtu.Benchmark`` supports a wide range of evaluation metrics,
.. code:: python
>>> bench = neurtu.Benchmark(wall_time=True, cpu_time=True, peak_memory=True)
>>> bench(cases)
cpu_time peak_memory wall_time
N
10000 0.000100 0.0 0.000142
100000 0.001149 0.0 0.001680
1000000 0.013677 0.0 0.018347
including `psutil process metrics <https://psutil.readthedocs.io/en/latest/#psutil.Process>`_.
For more information see the :ref:`examples`.
.. include:: _static/headings.txt
.. module:: dabo.dEvents
.. _dabo.dEvents.MouseMiddleDoubleClick:
=======================================================
|doc_title| **dEvents.MouseMiddleDoubleClick** - class
=======================================================
Occurs when the mouse's middle button is double-clicked
on the control.
|hierarchy| Inheritance Diagram
===============================
Inheritance diagram for: **MouseMiddleDoubleClick**
.. inheritance-diagram:: MouseMiddleDoubleClick
|supclasses| Known Superclasses
===============================
* :ref:`dabo.dEvents.MouseEvent`
|API| Class API
===============
.. autoclass:: dabo.dEvents.MouseMiddleDoubleClick
|
###################
CI_frame
###################
Authorization with cookies and sessions, news, a news admin panel, and other small features on CodeIgniter 3.
.. _install:
Installation
============
The base installation is straightforward: ``salt-sproxy`` is
installable using ``pip``. See
https://packaging.python.org/tutorials/installing-packages/ for a comprehensive
guide to installing Python packages.
Either when installing in a virtual environment, or directly on the base
system, execute the following:
.. code-block:: bash
$ pip install salt-sproxy
If you would like to install a specific Salt version, you will first need to
install Salt (via pip) pinned to the desired version, e.g.,
.. code-block:: bash
$ pip install salt==2018.3.4
$ pip install salt-sproxy
Easy installation
-----------------
We also provide a script to install the system requirements:
https://raw.githubusercontent.com/mirceaulinic/salt-sproxy/master/install.sh
Usage example:
- Using curl
.. code-block:: bash
    $ curl -o sproxy-install.sh -L https://raw.githubusercontent.com/mirceaulinic/salt-sproxy/master/install.sh
# check the contents of sproxy-install.sh
$ sudo sh sproxy-install.sh
- Using wget
.. code-block:: bash
$ wget -O sproxy-install.sh https://raw.githubusercontent.com/mirceaulinic/salt-sproxy/master/install.sh
# check the contents of sproxy-install.sh
$ sudo sh sproxy-install.sh
- Using fetch (on FreeBSD)
.. code-block:: bash
$ fetch -o sproxy-install.sh https://raw.githubusercontent.com/mirceaulinic/salt-sproxy/master/install.sh
# check the contents of sproxy-install.sh
$ sudo sh sproxy-install.sh
One liner:
.. warning::
This method can be dangerous and it is not recommended on production systems.
.. code-block:: bash
$ curl -L https://raw.githubusercontent.com/mirceaulinic/salt-sproxy/master/install.sh | sudo sh
See https://gist.github.com/mirceaulinic/bdbbbcfbc3588b1c8b1ec7ef63931ac6 for
a sample one-line installation on a fresh Fedora server.
The script ensures Python 3 is installed on your system, together with the
virtualenv package, and others required for Salt, in a virtual
environment under the ``$HOME/venvs/salt-sproxy`` path. In fact, when
executing, you will see that the script tells you where it's going to try to
install, e.g.,
.. code-block:: bash
$ sudo sh install.sh
Installing salt-sproxy under /home/mircea/venvs/salt-sproxy
Reading package lists... Done
~~~ snip ~~~
Installation complete, now you can start using by executing the following command:
. /home/mircea/venvs/salt-sproxy/bin/activate
After that, you can start using it:
.. code-block:: bash
$ . /home/mircea/venvs/salt-sproxy/bin/activate
(salt-sproxy) $
(salt-sproxy) $ salt-sproxy -V
Salt Version:
Salt: 2019.2.0
Salt SProxy: 2019.6.0b1
Dependency Versions:
Ansible: Not Installed
cffi: 1.12.3
dateutil: Not Installed
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
Jinja2: 2.10.1
junos-eznc: 2.2.1
jxmlease: 1.0.1
libgit2: Not Installed
M2Crypto: Not Installed
Mako: Not Installed
msgpack-pure: Not Installed
msgpack-python: 0.6.1
NAPALM: 2.4.0
ncclient: 0.6.4
Netmiko: 2.3.3
paramiko: 2.4.2
pycparser: 2.19
pycrypto: 2.6.1
pycryptodome: Not Installed
pyeapi: 0.8.2
pygit2: Not Installed
PyNetBox: 4.0.6
PyNSO: Not Installed
Python: 3.6.7 (default, Oct 22 2018, 11:32:17)
python-gnupg: Not Installed
PyYAML: 5.1
PyZMQ: 18.0.1
scp: 0.13.2
smmap: Not Installed
textfsm: 0.4.1
timelib: Not Installed
Tornado: 4.5.3
ZMQ: 4.3.1
System Versions:
dist: Ubuntu 18.04 bionic
locale: UTF-8
machine: x86_64
release: 4.18.0-20-generic
system: Linux
version: Ubuntu 18.04 bionic
Upgrading
---------
To install a newer version, you can execute ``pip install -U salt-sproxy``;
however, this is also going to upgrade your Salt installation. So in case you
would like to use a specific Salt version, it might be a better idea to install
the specific salt-sproxy version you want. You can check at
https://pypi.org/project/salt-sproxy/#history the list of available salt-sproxy
versions.
Example:
.. code-block:: bash
$ pip install salt-sproxy==2019.6.0
How AlphaD3M works
====================
Inspired by AlphaZero, AlphaD3M frames the problem of pipeline synthesis for model discovery as a single-player game
where the player iteratively builds a pipeline by selecting actions (insertion, deletion and replacement of pipeline
components). We solve the meta-learning problem using a deep neural network and a Monte Carlo tree search (MCTS).
The neural network receives as input an entire pipeline, data meta-features, and the problem, and outputs
action probabilities and estimates for the pipeline performance. The MCTS uses the network probabilities to run
simulations which terminate at actual pipeline evaluations.
To reduce the search space, we define a pipeline grammar where the rules of the grammar constitute the actions. The
grammar rules grow linearly with the number of primitives and hence address the issue of scalability. Finally, AlphaD3M
performs hyperparameter optimization of the best pipelines using SMAC.
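To make the framing concrete, the sketch below encodes a toy grammar and searches it with plain random rollouts in place of the neural-network-guided MCTS. The grammar, primitive names and scoring function are all invented for illustration and are not AlphaD3M's:

```python
import random

# Invented toy grammar: a pipeline is imputer -> (optional scaler) -> estimator
NEXT_SYMBOL = {
    'start': ['imputer'],
    'imputer': ['scaler', 'estimator'],
    'scaler': ['estimator'],
    'estimator': [],                 # terminal: the pipeline is complete
}
PRIMITIVES = {
    'imputer': ['mean_imputer', 'median_imputer'],
    'scaler': ['standard_scaler'],
    'estimator': ['logistic_regression', 'random_forest'],
}


def evaluate(pipeline):
    """Stand-in for actually training and scoring the pipeline on data."""
    score = 0.5
    score += 0.05 * ('median_imputer' in pipeline)
    score += 0.10 * ('standard_scaler' in pipeline)
    score += 0.20 * ('random_forest' in pipeline)
    return score


def rollout(rng):
    """Follow the grammar, picking one action (a primitive) per step."""
    pipeline, symbol = [], 'start'
    while NEXT_SYMBOL[symbol]:
        symbol = rng.choice(NEXT_SYMBOL[symbol])
        pipeline.append(rng.choice(PRIMITIVES[symbol]))
    return tuple(pipeline)


def search(n_rollouts=200, seed=0):
    rng = random.Random(seed)
    best, best_score = None, float('-inf')
    for _ in range(n_rollouts):
        pipe = rollout(rng)
        score = evaluate(pipe)       # simulations end at real evaluations
        if score > best_score:
            best, best_score = pipe, score
    return best, best_score


best, best_score = search()
print(best, best_score)
```

AlphaD3M replaces the uniform random choices with action probabilities predicted by the neural network, and the flat loop with an MCTS that reuses statistics across simulations.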
For more information about how AlphaD3M works, see our papers:
- `AlphaD3M: Machine Learning Pipeline Synthesis <https://arxiv.org/abs/2111.02508>`__
- `Automatic Machine Learning by Pipeline Synthesis using Model-Based Reinforcement Learning and a Grammar
<https://arxiv.org/abs/1905.10345>`__
Support for Many ML Problems
-----------------------------
AlphaD3M uses a comprehensive collection of primitives developed under the D3M program as well as primitives provided
in open-source libraries, such as scikit-learn, to derive pipelines for a wide range of machine learning tasks. These
pipelines can be applied to different data types and derive standard performance metrics.
- *Learning Tasks*: classification, regression, clustering, time series forecasting, time series classification, object
detection, LUPI, community detection, link prediction, graph matching, vertex classification, collaborative filtering,
and semi-supervised classification.
- *Data Types*: tabular, text, images, audio, video, and graph.
- *Data Formats*: CSV, D3M, raw text files, OpenML, and scikit-learn datasets.
- *Metrics*: accuracy, F1, macro F1, micro F1, mean squared error, mean absolute error, root mean squared error, object
detection AP, hamming loss, ROC-AUC, ROC-AUC macro, ROC-AUC micro, jaccard similarity score, normalized mutual
information, hit at K, R2, recall, mean reciprocal rank, precision, and precision at top K.
Usability, Model Exploration and Explanation
---------------------------------------------
AlphaD3M greatly simplifies the process to create predictive models. Users can interact with the system from a
Jupyter notebook, and derive models using a few lines of Python code.
Users can leverage Python-based libraries and tools to clean, transform and visualize data, as well as standard methods
to explain machine learning models. They can also be combined to build customized solutions for specific problems that
can be deployed to end users.
The AlphaD3M environment includes tools that we developed to enable users to explore the pipelines and their predictions:
- *PipelineProfiler*, an interactive visual analytics tool that empowers data scientists to explore the pipelines derived
by AlphaD3M within a Jupyter notebook, and gain insights to improve them as well as make an informed decision while
selecting models for a given application.
- *Visual Text Explorer*, a tool that helps users to understand models for text classification, by allowing to explore
model predictions and their association with words and entities present in the classified documents.
Classification
===========================
Data Preparation
---------------------------
Clone the `ocnn-pytorch` repository, and enter the subdirectory `projects`.
.. code-block:: none
python tools/cls_modelnet.py
Experiments
---------------------------
#. Train the LeNet used in our paper `O-CNN <https://wang-ps.github.io/O-CNN>`_.
The classification accuracy on the testing set without voting is **91.7%**.
And the training log and weights can be downloaded `here
<https://1drv.ms/u/s!Ago-xIr0OR2-b2WkgDqYEh6EDRw?e=jOcVlJ>`_.
.. code-block:: none
python classification.py --config configs/cls_m40.yaml SOLVER.alias time
#. Train the HRNet used in our paper on `3D Unsupervised Pretraining
<https://wang-ps.github.io/pretrain>`_. The classification accuracy on the
testing set without voting is **93.0%**. And the training log and weights can
be downloaded `here <https://1drv.ms/u/s!Ago-xIr0OR2-aiT3IUrezwcW7aY>`_.
.. code-block:: none
python classification.py --config configs/cls_m40_hrnet.yaml SOLVER.alias time
| 29.888889 | 86 | 0.671004 |
a8198e9baa7e543760a6338f82b88662a72a383e | 3,265 | rst | reStructuredText | doc/theory/power_flow/fast_decoupled.rst | mzy2240/GridCal | 0352f0e9ce09a9c037722bf2f2afc0a31ccd2880 | [
"BSD-3-Clause"
] | 284 | 2016-01-31T03:20:44.000Z | 2022-03-17T21:16:52.000Z | doc/theory/power_flow/fast_decoupled.rst | mzy2240/GridCal | 0352f0e9ce09a9c037722bf2f2afc0a31ccd2880 | [
"BSD-3-Clause"
] | 94 | 2016-01-14T13:37:40.000Z | 2022-03-28T03:13:56.000Z | doc/theory/power_flow/fast_decoupled.rst | mzy2240/GridCal | 0352f0e9ce09a9c037722bf2f2afc0a31ccd2880 | [
"BSD-3-Clause"
] | 84 | 2016-03-29T10:43:04.000Z | 2022-02-22T16:26:55.000Z | .. _fast_decoupled:
Fast Decoupled
===================
The fast decoupled method is a fantastic power flow algorithm developed by Stott and Alsac [FD]_.
The method builds on a series of very clever simplifications and the decoupling of the Jacobian matrix of the
canonical Newton-Raphson algorithm to yield the fast-decoupled method.
The method consists in the building of two admittance-based matrices :math:`B'` and :math:`B''` which are
independently factorized (using the LU method or any other) which then serve to find the increments of angle
and voltage magnitude separately until convergence.
Finding :math:`B'` and :math:`B''`
----------------------------------------
To find :math:`B'` we perform the following operations:
.. math::
bs' = \frac{1}{X}
bs'_{ff} = diag(bs')
bs'_{ft} = diag(-bs')
bs'_{tf} = diag(-bs')
bs'_{tt} = diag(bs')
B'_f = bs'_{ff} \times Cf + bs'_{ft} \times Ct
B'_t = bs'_{tf} \times Cf + bs'_{tt} \times Ct
B' = Cf^\top \times B'_f + Ct^\top \times B'_t
To find :math:`B''` we perform the following operations:
.. math::
bs_{ff}^{''} = -Re \left\{\frac{bs' + B}{tap \cdot tap^*} \right\}
bs_{ft}^{''} = -Re \left\{ \frac{bs'}{tap^*} \right\}
bs_{tf}^{''} = -Re \left\{ \frac{bs'}{tap} \right\}
bs_{tt}^{''} = - bs''
B''_f = bs''_{ff} \times Cf + bs''_{ft} \times Ct
B''_t = bs''_{tf} \times Cf + bs''_{tt} \times Ct
B'' = Cf^\top \times B''_f + Ct^\top \times B''_t
The fast-decoupled algorithm
-------------------------------
- Factorize :math:`B'`
.. math::
J1 = factorization(B')
- Factorize :math:`B''`
.. math::
J2 = factorization(B'')
- Compute the voltage module :math:`V_m = |V|`
- Compute the voltage angle :math:`V_a= atan \left ( \frac{V_i}{V_r} \right )`
- Compute the error
.. math::
S_{calc} = V \cdot \left( Ybus \times V - I_{bus} \right)^*
.. math::
\Delta S = \frac{S_{calc} - S_{bus}}{V_m}
.. math::
\Delta P = Re \left\{\Delta S[pqpv] \right\}
.. math::
\Delta Q = Im \left\{ \Delta S[pq] \right\}
- Check the convergence
.. math::
converged = |\Delta P|_{\infty} < tol \quad \& \quad|\Delta Q|_{\infty} < tol
- Iterate; While convergence is false and the number of iterations is less than the maximum:
- Solve voltage angles (P-iteration)
.. math::
\Delta V_a = J1.solve( \Delta P)
- Update voltage
.. math::
V_a[pqpv] = V_a[pqpv] - \Delta V_a
V = V_m \cdot e^{j \cdot V_a}
- Compute the error (follow the previous steps)
- Check the convergence (follow the previous steps)
- If the convergence is still false:
- Solve voltage modules (Q-iteration)
.. math::
\Delta V_m = J2.solve( \Delta Q)
- Update voltage
.. math::
V_m[pq] = V_m[pq] - \Delta V_m
V = V_m \cdot e^{j \cdot V_a}
- Compute the error (follow the previous steps)
- Check the convergence (follow the previous steps)
- Increase the iteration counter.
- End
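As a deliberately tiny illustration, the loop above can be specialized to a two-bus system (one slack bus, one PQ bus) in plain Python: with a single PQ bus, :math:`B'` and :math:`B''` collapse to scalars and the LU solves become simple divisions. The line reactance, load and tolerances below are invented for the sketch; this is not GridCal's implementation:

```python
import cmath


def fdpf_two_bus(x=0.1, s_load=0.5 + 0.2j, tol=1e-8, max_iter=50):
    """Fast-decoupled power flow for a slack bus feeding one PQ bus.

    Toy sketch: bus 1 is the slack with V = 1+0j, bus 2 carries the load
    `s_load` over a lossless line of reactance `x` (all in per unit).
    """
    v1 = 1.0 + 0.0j
    y = 1.0 / (1j * x)           # series admittance of the line
    y21, y22 = -y, y             # bus-2 row of the admittance matrix
    bp = 1.0 / x                 # scalar B'  (P-theta sub-problem)
    bpp = -y22.imag              # scalar B'' (Q-V sub-problem)
    s_bus = -s_load              # injected power at bus 2 (load is negative)
    vm, va = 1.0, 0.0            # flat start

    for it in range(max_iter):
        v2 = vm * cmath.exp(1j * va)
        s_calc = v2 * (y21 * v1 + y22 * v2).conjugate()
        ds = (s_calc - s_bus) / vm
        dp, dq = ds.real, ds.imag
        if abs(dp) < tol and abs(dq) < tol:
            return vm, va, it, True
        va -= dp / bp            # P-iteration: J1.solve reduces to a division
        v2 = vm * cmath.exp(1j * va)
        s_calc = v2 * (y21 * v1 + y22 * v2).conjugate()
        dq = ((s_calc - s_bus) / vm).imag
        vm -= dq / bpp           # Q-iteration: J2.solve reduces to a division
    return vm, va, max_iter, False


vm, va, iters, ok = fdpf_two_bus()
print(f"|V2|={vm:.4f}  angle={va:.4f} rad  converged={ok} in {iters} iterations")
```

In the full multi-bus algorithm the two divisions become triangular solves against the factorized :math:`B'` and :math:`B''`, but the alternating P- and Q-iteration structure is exactly the one listed above.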
.. [FD] B. Stott and O. Alsac, 1974, Fast Decoupled Power Flow, IEEE Transactions PAS-93, 859-869.
**To search products with administrator privileges**
The following ``search-products-as-admin`` example searches for products with admin privileges, using a portfolio ID as a filter. ::
aws servicecatalog search-products-as-admin \
--portfolio-id port-5abcd3e5st4ei
Output::
{
"ProductViewDetails": [
{
"ProductViewSummary": {
"Name": "my product",
"Owner": "owner name",
"Type": "CLOUD_FORMATION_TEMPLATE",
"ProductId": "prod-abcdfz3syn2rg",
"HasDefaultPath": false,
"Id": "prodview-abcdmyuzv2dlu",
"ShortDescription": "description"
},
"ProductARN": "arn:aws:catalog:us-west-2:123456789012:product/prod-abcdfz3syn2rg",
"CreatedTime": 1562097906.0,
"Status": "CREATED"
}
]
}
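For scripted use, the response shown above can be post-processed; the following Python fragment (illustrative, not part of the AWS CLI) extracts the product IDs from a captured response:

```python
import json

def product_ids(response_text):
    """Return every ProductId found in a search-products-as-admin response."""
    data = json.loads(response_text)
    return [detail["ProductViewSummary"]["ProductId"]
            for detail in data.get("ProductViewDetails", [])]

# Trimmed-down copy of the sample output above.
sample = '''{"ProductViewDetails": [
    {"ProductViewSummary": {"ProductId": "prod-abcdfz3syn2rg"},
     "Status": "CREATED"}]}'''
print(product_ids(sample))  # ['prod-abcdfz3syn2rg']
```

A similar extraction can be done directly with the CLI's global ``--query`` option (JMESPath), e.g. ``--query 'ProductViewDetails[].ProductViewSummary.ProductId'``.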
Delete
======
*A simple, cross-platform, command line move-to-trash.*
.. image:: https://img.shields.io/badge/license-Apache-blue.svg?style=flat
:target: https://www.apache.org/licenses/LICENSE-2.0
:alt: License
.. image:: https://img.shields.io/pypi/v/delete-cli.svg
:target: https://pypi.org/project/delete-cli
:alt: PyPI Version
.. image:: https://img.shields.io/pypi/pyversions/delete-cli.svg?logo=python&logoColor=white&style=flat
:target: https://pypi.org/project/delete-cli
:alt: Python Versions
.. image:: https://readthedocs.org/projects/delete-cli/badge/?version=latest&style=flat
:target: https://delete-cli.readthedocs.io
:alt: Documentation
.. image:: https://pepy.tech/badge/delete-cli
:target: https://pepy.tech/badge/delete-cli
:alt: Downloads
But why?
--------
The ``delete`` command is a simple alternative to using the standard ``rm`` command.
Using ``rm`` as a matter of course can be dangerous and prone to mistakes. Once a file is
unlinked with ``rm`` it cannot be recovered (without having backups).
All major graphical environments offer a "move to trash" option. This does a clean move
operation to a "trash" folder. Once a file as been put in the trash it can be recovered
easily. Periodically, the trash can be emptied if desired.
``delete`` is a command line implementation of this metaphor. It maintains a basic
``sqlite3`` database of files and folders put in the trash. Using the ``--list`` option
will list the contents. Using ``--restore`` will restore a file or folder from the trash.
Using ``--empty`` will purge anything put in the trash by ``delete``.
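The mechanism described above — a clean move into a trash folder plus a small ``sqlite3`` ledger for restores — can be modelled in a few lines of Python. This is an illustrative sketch, not the actual ``delete`` implementation; the table layout and function name are invented, and it omits the numeric-suffix handling for basename collisions described later in this README.

```python
import os
import shutil
import sqlite3

def move_to_trash(path, trash_dir, db):
    """Move ``path`` into ``trash_dir`` and record where it came from."""
    os.makedirs(trash_dir, exist_ok=True)
    original = os.path.abspath(path)
    stored = os.path.join(trash_dir, os.path.basename(path))
    shutil.move(path, stored)
    # The ledger is what makes --list, --restore and --empty possible.
    db.execute("CREATE TABLE IF NOT EXISTS trash (original TEXT, stored TEXT)")
    db.execute("INSERT INTO trash VALUES (?, ?)", (original, stored))
    db.commit()
    return stored
```

Restoring is then just the reverse lookup: read ``original`` from the ledger and move the stored file back.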
Installation
------------
If you already have Python 3.7 on your system, you can install ``delete`` using Pip.
.. code-block:: bash
pip install delete-cli
Basic Usage
-----------
Calling ``delete`` with no arguments or with the ``--help`` flag yields typical Unix-style
behavior, printing a usage or help statement, respectively. For detailed usage and
examples you can read the manual page, ``man delete``.
Deleting files and folders is as simple as:
.. code-block:: bash
delete file1.txt file2.txt folderA
Files or folders that get deleted with the same basename will have a suffix added before
the extension (e.g., ``file1.1.txt``, ``file1.2.txt``, ...).
Restore files using their basename (in the trash), their full path (in the trash) or
their original full path.
Documentation
-------------
Documentation is available at `delete-cli.readthedocs.io <https://delete-cli.readthedocs.io>`_.
For basic usage information on the command line use: ``delete --help``. For a more comprehensive
usage guide on the command line you can view the manual page with ``man delete``.
Contributions
-------------
Contributions are welcome in the form of suggestions for additional features, pull requests with
new features or bug fixes, etc. If you find bugs or have questions, open an *Issue* here. If and
when the project grows, a code of conduct will be provided along side a more comprehensive set of
guidelines for contributing; until then, just be nice.
| 34.966292 | 103 | 0.724936 |
Test unexpected section titles.
Title
=====
Paragraph.
-----
Title
-----
Paragraph.
Glossary
========
Sources:
* Land Use, Land-Use Change, and Forestry, IPCC, 2000 - Robert T. Watson, Ian R. Noble, Bert Bolin, N. H. Ravindranath, David J. Verardo and David J. Dokken (Eds.) Cambridge University Press, UK. pp 375
You might also try the more extensive `EPA Glossary of Climate Change Terms <https://19january2017snapshot.epa.gov/climatechange/glossary-climate-change-terms_.html>`_
Some Terms
----------
* Accuracy
The degree to which the mean of a sample approaches the true mean of the population; lack of bias.
* Activity
A practice or ensemble of practices that take place on a delineated area over a given period of time.
* Baseline
A reference scenario against which a change in greenhouse gas emissions or removals is measured.
* Bias
Systematic over- or under-estimation of a quantity.
* Biosphere
That component of the Earth system that contains life in its var- ious forms, which includes its living organisms and derived organic matter (e.g., litter, detritus, soil).
* Carbon Flux
Transfer of carbon from one carbon pool to another in units of measurement of mass per unit area and time (e.g., t C ha-1 y-1).
* Carbon Pool
A reservoir. A system which has the capacity to accumulate or release carbon. Examples of carbon pools are forest biomass, wood products, soils, and atmosphere. The units are mass (e.g., t C).
* Carbon Stock
The absolute quantity of carbon held within a pool at a specified time.
* Flux
See “Carbon Flux.”
* Forest Estate
A forested landscape consisting of multiple stands of trees.
* Forest Stand
A community of trees, including aboveground and below- ground biomass and soils, sufficiently uniform in species composition, age, arrangement, and condition to be managed as a unit.
* Heterotrophic Respiration
The release of carbon dioxide from decomposition of organic matter.
* Land Cover
The observed physical and biological cover of the Earth’s land as vegetation or man-made features.
* Land Use
  The total of arrangements, activities, and inputs undertaken in a certain land cover type (a set of human actions). The social and economic purposes for which land is managed (e.g., grazing, timber extraction, conservation).
* Permanence
The longevity of a carbon pool and the stability of its stocks, given the management and disturbance environment in which it occurs.
* Pool
See Carbon Pool.
* Practice
An action or set of actions that affect the land, the stocks of pools associated with it or otherwise affect the exchange of greenhouse gases with the atmosphere.
* Precision
The repeatability of a measurement (e.g., the standard error of the sample mean).
* Regeneration
The renewal of a stand of trees through either natural means (seeded on-site or adjacent stands or deposited by wind, birds, or animals) or artificial means (by planting seedlings or direct seeding).
* Reservoir
A pool.
* Sequestration
The process of increasing the carbon content of a carbon pool other than the atmosphere.
* Shifting Agriculture
A form of forest use common in tropic forests where an area of forest is cleared, or partially cleared, and used for cropping for a few years until the forest regenerates. Also known as slash and burn agriculture, moving agriculture, or swidden agriculture.
* Sink
Any process or mechanism which removes a greenhouse gas, an aerosol, or a precursor of a greenhouse gas from the atmos- phere. A given pool (reservoir) can be a sink for atmospheric
* Source
Opposite of sink. A carbon pool (reservoir) can be a source of carbon to the atmosphere if less carbon is flowing into it than is flowing out of it.
* Stand
See Forest Stand.
* Stock
See Carbon Stock.
* Soil Carbon Pool
  Used here to refer to the relevant carbon in the soil. It includes various forms of soil organic carbon (humus) and inorganic soil carbon and charcoal. It excludes soil biomass (e.g., roots, bulbs, etc.) as well as the soil fauna (animals).
* Uptake
The addition of carbon to a pool. A similar term is sequestration.
* Wood Products
Products derived from the harvested wood from a forest, including fuelwood and logs and the products derived from them such as sawn timber, plywood, wood pulp, paper, etc.
| 41.574257 | 257 | 0.776137 |
Features
--------
"Here comes the Hotstepper"
-- Ini Kamoze
* Uses acceleration and deceleration ramps.
* Fairly tight timing up to approx. 1500 steps per second (on Raspberry Pi 4) [#]_.
* Complete API for relative and absolute moves, rotations and continuous running.
* Runs in the background. Motor movements can be blocking or non-blocking.
* Support for microstepping (depending on the driver).
* Support for any unipolar stepper motors, like:
- 28BYJ-48 (very cheap geared stepper)
* {TODO} Support for Bipolar stepper drivers / dual H-Bridges like the
- L293(D)
- DRV8833
* {TODO} Support for Step/Direction controllers like
- A4988
- DRV8825
- STSPIN220 / 820
* Other drivers should be easy to implement
* Licensed under the very permissive MIT license.
* 100% Python, no dependencies except pigpio.
.. [#] At high step rates occasional stutters may occur when some
Python / Linux background tasks run.
Uses
....
AdvPiStepper is suitable for
* Python projects that need to accurately control a single stepper motor at reasonable speeds.
* Stepper motor experiments and prototyping.
It is not suitable for
* multi-axis stepper motor projects
* high speeds (> 1500 steps per second)
Caveats
.......
* Currently no support for multiple motors. Single motor only.
* 100% Python, therefore no realtime performance - jitter and stutters may occur.
| 29.0625 | 93 | 0.738351 |
=====
Setup
=====
Installing
==========
The steps required to install the Seed Stage-based Messaging Service are:
#. Get the code from the `Github Project`_ with git:
.. code-block:: console
$ git clone https://github.com/praekelt/seed-stage-based-messaging.git
This will create a directory ``seed-stage-based-messaging`` in your current directory.
.. _Github Project: https://github.com/praekelt/seed-stage-based-messaging
#. Install the Python requirements with pip:
.. code-block:: console
$ pip install -r requirements.txt
This will download and install all the Python packages required to run the
project.
#. Setup the database:
.. code-block:: console
      $ python manage.py migrate
This will create all the database tables required.
.. note::
The PostgreSQL database for the Seed Stage-based Messaging Store needs
to exist before running this command.
See :envvar:`STAGE_BASED_MESSAGING_DATABASE` for details.
#. Run the development server:
.. code-block:: console
$ python manage.py runserver
.. note::
This will run a development HTTP server. This is only suitable for
testing and development, for production usage please
see :ref:`running-in-production`
.. _configuration-options:
Configuration Options
=====================
The main configuration file is ``seed_stage_based_messaging/settings.py``.
The following environmental variables can be used to override some default settings:
.. envvar:: SECRET_KEY
This overrides the Django :django:setting:`SECRET_KEY` setting.
.. envvar:: DEBUG
This overrides the Django :django:setting:`DEBUG` setting.
.. envvar:: USE_SSL
Whether to use SSL when build absolute URLs. Defaults to False.
.. envvar:: STAGE_BASED_MESSAGING_DATABASE
The database parameters to use as a URL in the format specified by the
`DJ-Database-URL`_ format.
.. _DJ-Database-URL: https://github.com/kennethreitz/dj-database-url
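As an illustration of that format, a URL for a local PostgreSQL database might look like the following (the user, password, and database name here are placeholders):

```
postgres://seed:supersecret@localhost:5432/seed_stage_based_messaging
```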
.. envvar:: STAGE_BASED_MESSAGING_SENTRY_DSN
The DSN to the Sentry instance you would like to log errors to.
.. envvar:: BROKER_URL
The Broker URL to use with Celery.
.. envvar:: STAGE_BASED_MESSAGING_URL
The URL of the instance of the Seed Stage-based Messaging API that will be
used when creating POST-back hooks to this service from other Seed services.
.. envvar:: SCHEDULER_URL
The URL to the `Seed Scheduler API`_ instance.
.. envvar:: SCHEDULER_API_TOKEN
The `auth token` to use to connect to the `Seed Scheduler API`_ instance
above.
.. envvar:: SCHEDULER_INBOUND_API_TOKEN
The `auth token` to use to connect to this Seed Stage-based Messaging API
from POST-backs from the `Seed Scheduler API`_ instance.
.. _Seed Scheduler API: https://github.com/praekelt/seed-scheduler
.. envvar:: IDENTITY_STORE_URL
The URL to the `Seed Identity Store API`_ instance.
.. envvar:: IDENTITY_STORE_TOKEN
The `auth token` to use to connect to the `Seed Identity Store API`_ instance
above.
.. _Seed Identity Store API: https://github.com/praekelt/seed-identity-store
.. envvar:: MESSAGE_SENDER_URL
The URL to the `Seed Message Sender API`_ instance.
.. envvar:: MESSAGE_SENDER_TOKEN
The `auth token` to use to connect to the `Seed Message Sender API`_ instance
above.
.. _Seed Message Sender API: https://github.com/praekelt/seed-message-sender
.. envvar:: METRICS_URL
The URL to the `Go Metrics API`_ instance to push metrics to.
.. envvar:: METRICS_AUTH_TOKEN
The `auth token` to use to connect to the `Go Metrics API`_ above.
.. _Go Metrics API: https://github.com/praekelt/go-metrics-api
.. _write_a_backend:
Write a backend
---------------
If you are interested in writing your own custom backend, then you could learn
how to do it with those simple examples presented here. Currently you have two
options, you can use the low level tools directly from
:ref:`backend <coherence.backend>` or you could use the tools from
:ref:`models <coherence.backends.models (package)>` which will simplify a
little the building process and you will end up with less lines of code, but by
reading the first method you will get a better idea of what is happening behind
the second method...so...you will have a better understanding of the Cohen3
tools by reading both:
- :ref:`Using backend directly - The old way <example_backend_the_old_way>`
- :ref:`Using the models module - The new way <example_backend_the_new_way>`
.. toctree::
:hidden:
example_backend_the_old_way
   example_backend_the_new_way
=======
Testing
=======
This directory contains all necessary files for running the tests
Setting up Test Environment
===========================
To setup your testing environment, you'll need at least one device to run
tests on. Below are the devices supported out of the box that include resources
for testing (if you want to run tests on a device not listed below, you'll need
to provide similar resources for that device. See `tests/resources/cc1310 <resources/cc1310>`_ for an
example)
- `cc1310/cc1350 <resources/cc1310/README.rst>`_
Steps
-----
1. Edit the file `tests/env.cfg <env.cfg>`_
**Minimum Requirements:**
1. ``prefix``: full path to parent directory of CCS installations (e.g. /opt/ti)
2. ``versions``: comma-separated list of CCS version numbers (e.g. 9.0.1.00004, 8.1.0.00011)
a. for each version listed you'll need to provide the full path to that
CCS installation
(e.g. 9.0.1.00004 = /opt/ti/ccs901/ccs)
3. Enter the required device information (see `tests/resources/cc1310/README.rst <resources/cc1310/README.rst>`_
for what's required)
|
| An example:
::
# env.cfg
[ccs]
prefix = /opt/ti
versions = 9.0.1.00004, 8.1.0.00011
9.0.1.00004: /opt/ti/ccs901/ccs
8.1.0.00011: /opt/ti/ccsv8
[devices]
cc1310
[cc1310]
serno = L20000CE
2. Configure the test setup
::
# From the top level directory
make configure
3. Run tests
::
# From the top level directory
make test
.. include:: meta-hierarchy.rst
The category hierarchy tables have been replaced by *m_hierarchy*.
This model defines named hierarchies of resources (pages).
If the categories are changed then the system needs to update the
*pivot_category_nr* field of all resources. With the introduction
of *m_hierarchy* this renumbering is much more efficient and will
only affect a minimal number of resources.
The *m_hierarchy* module is also used for the content- and user group
hierarchies, as used by the new *mod_acl_user_groups* module.
| 38.357143 | 69 | 0.798883 |
=======================
paddle.optimizer
=======================
.. toctree::
:maxdepth: 1
optimizer_cn/Adadelta_cn.rst
optimizer_cn/AdadeltaOptimizer_cn.rst
optimizer_cn/Adagrad_cn.rst
optimizer_cn/AdagradOptimizer_cn.rst
optimizer_cn/Adam_cn.rst
optimizer_cn/Adamax_cn.rst
optimizer_cn/AdamW_cn.rst
optimizer_cn/DecayedAdagrad_cn.rst
optimizer_cn/DecayedAdagradOptimizer_cn.rst
optimizer_cn/DGCMomentumOptimizer_cn.rst
optimizer_cn/Dpsgd_cn.rst
optimizer_cn/DpsgdOptimizer_cn.rst
optimizer_cn/ExponentialMovingAverage_cn.rst
optimizer_cn/Ftrl_cn.rst
optimizer_cn/FtrlOptimizer_cn.rst
optimizer_cn/LambOptimizer_cn.rst
optimizer_cn/LarsMomentum_cn.rst
optimizer_cn/LarsMomentumOptimizer_cn.rst
optimizer_cn/LookaheadOptimizer_cn.rst
optimizer_cn/ModelAverage_cn.rst
optimizer_cn/Momentum_cn.rst
optimizer_cn/MomentumOptimizer_cn.rst
optimizer_cn/RecomputeOptimizer_cn.rst
optimizer_cn/RMSProp_cn.rst
optimizer_cn/SGD_cn.rst
optimizer_cn/SGDOptimizer_cn.rst
optimizer_cn/Optimizer_cn.rst
optimizer_cn/lr_scheduler_cn/CosineAnnealingLR_cn.rst
optimizer_cn/lr_scheduler_cn/ExponentialLR_cn.rst
optimizer_cn/lr_scheduler_cn/InverseTimeLR_cn.rst
optimizer_cn/lr_scheduler_cn/LambdaLR_cn.rst
optimizer_cn/lr_scheduler_cn/MultiStepLR_cn.rst
optimizer_cn/lr_scheduler_cn/NaturalExpLR_cn.rst
optimizer_cn/lr_scheduler_cn/NoamLR_cn.rst
optimizer_cn/lr_scheduler_cn/PiecewiseLR_cn.rst
optimizer_cn/lr_scheduler_cn/PolynomiaLR_cn.rst
optimizer_cn/lr_scheduler_cn/ReduceLROnPlateauLR_cn.rst
optimizer_cn/lr_scheduler_cn/StepLR_cn.rst
optimizer_cn/lr_scheduler_cn/LinearLrWarmup_cn.rst
| 34.78 | 59 | 0.79586 |
Cheetah package
===============
.. automodule:: Cheetah
:members:
:undoc-members:
:show-inheritance:
Subpackages
-----------
.. toctree::
Cheetah.Macros
Cheetah.Templates
Cheetah.Tests
Cheetah.Tools
Cheetah.Utils
Submodules
----------
.. toctree::
Cheetah.CacheRegion
Cheetah.CacheStore
Cheetah.CheetahWrapper
Cheetah.Compiler
Cheetah.DirectiveAnalyzer
Cheetah.Django
Cheetah.DummyTransaction
Cheetah.ErrorCatchers
Cheetah.FileUtils
Cheetah.Filters
Cheetah.ImportHooks
Cheetah.ImportManager
Cheetah.NameMapper
Cheetah.Parser
Cheetah.Servlet
Cheetah.SettingsManager
Cheetah.SourceReader
Cheetah.Template
Cheetah.TemplateCmdLineIface
Cheetah.Unspecified
Cheetah.Version
Cheetah.compat
Cheetah.convertTmplPathToModuleName
| 16.897959 | 38 | 0.71256 |
osd_cl_scm class
----------------
API to access the functionality of the Subnet Control Module (SCM).
Usage
^^^^^
.. code-block:: c
#include <osd/osd.h>
#include <osd/cl_scm.h>
Public Interface
^^^^^^^^^^^^^^^^
.. doxygenfile:: libosd/include/osd/cl_scm.h
| 14.777778 | 67 | 0.616541 |
====
Keys
====
There is a template `apiKeys.py` file that can be populated
with various keys to improve the functionality of the tool
and the granularity of the report.
Shodan
------
The tool will check for the existence of an environment variable
containing the value of the Shodan key. For example:
.. code-block:: bash
    export SHODAN_KEY="abc123keystringValue"
If this is not found, the tool will check for a value in the
`apiKeys.py` file. Failing this, the tool will log an exception
and move on. No Shodan report will be generated for hosts
within the domain.
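The lookup order described above (environment variable first, then ``apiKeys.py``, then log and move on) can be sketched like this; the function name and the dictionary-style fallback are illustrative, not the tool's actual code:

```python
import logging
import os

def get_key(env_name, file_keys, file_attr):
    """Return an API key: env var first, then the apiKeys.py value, else None."""
    key = os.environ.get(env_name)
    if key:
        return key
    key = file_keys.get(file_attr)
    if key:
        return key
    # Failing both sources, log and let the caller skip the related report.
    logging.warning("No %s found; skipping the related report", env_name)
    return None
```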
Google Maps
-----------
The tool will check for the existence of an environment variable
containing the value of the Google Maps key. For example:
.. code-block:: bash
    export GOOGLE_MAPS_KEY="abc123keystringValue"
If this is not found, the tool will check for a value in the
`apiKeys.py` file. Failing this, the tool will log an exception
and move on. No Google map will be added to the top of the detail
report.
.. -*- coding: utf-8 -*-
.. https://docs.docker.com/machine/drivers/virtualbox/
.. SOURCE: https://github.com/docker/machine/blob/master/docs/drivers/virtualbox.md
doc version: 1.11
https://github.com/docker/machine/commits/master/docs/drivers/virtualbox.md
.. check date: 2016/04/28
.. Commits on Mar 16, 2016 ab559c542f2a3a4534b14b4c16300344412a93a3
.. -----------------------------------------------------------------------------
.. Oracle VirtualBox
.. _driver-oracle-virtualbox:
=======================================
Oracle VirtualBox
=======================================
.. sidebar:: Contents
.. contents::
:depth: 3
:local:
Create machines locally using `VirtualBox <https://www.virtualbox.org/>`_. This driver requires VirtualBox 5+ to be installed on your host. Using VirtualBox 4+ should work but will give you a warning. Older versions will refuse to work.
.. code-block:: bash
$ docker-machine create --driver=virtualbox vbox-test
You can create an entirely new machine, or you can convert a Boot2Docker VM into a machine by importing the VM. To convert a Boot2Docker VM, you would use the following command:
.. code-block:: bash
$ docker-machine create -d virtualbox --virtualbox-import-boot2docker-vm boot2docker-vm b2d
Options:
* ``--virtualbox-memory`` : size of memory for the host in MB.
* ``--virtualbox-cpu-count`` : number of CPUs to use to create the VM. Defaults to a single CPU.
* ``--virtualbox-disk-size`` : size of disk for the host in MB.
* ``--virtualbox-host-dns-resolver`` : use the host DNS resolver (boolean value, defaults to false).
* ``--virtualbox-boot2docker-url`` : the URL of the boot2docker image. Defaults to the latest available version.
* ``--virtualbox-import-boot2docker-vm`` : the name of a Boot2Docker VM to import.
* ``--virtualbox-hostonly-cidr`` : the CIDR of the host-only adapter.
* ``--virtualbox-hostonly-nictype`` : host-only network adapter type. Possible values are ``82540EM`` (Intel PRO/1000), ``Am79C973`` (PCnet-FAST III) and ``virtio-net`` (paravirtualized network adapter).
* ``--virtualbox-hostonly-nicpromisc`` : host-only network adapter promiscuous mode. Possible options are deny, allow-vms and allow-all.
* ``--virtualbox-no-share`` : disable the mount of your home directory.
* ``--virtualbox-no-dns-proxy`` : do not proxy all DNS requests to the host (boolean value, defaults to false).
* ``--virtualbox-no-vtx-check`` : disable the check that hardware virtualization is available before starting the VM.
The ``--virtualbox-boot2docker-url`` flag takes a few different forms. By default, if no value is specified for this flag, Machine checks locally for a boot2docker ISO. If one is found, it is used as the ISO for the created machine. If one is not found, the latest ISO release available on `boot2docker/boot2docker <https://github.com/boot2docker/boot2docker>`_ is downloaded and stored locally for future use. Note that this means you must deliberately run ``docker-machine upgrade`` on a machine if you wish to update the "cached" boot2docker ISO.
This is the default behavior (when ``--virtualbox-boot2docker-url=""``), but the option also supports specifying ISOs by the ``http://`` and ``file://`` protocols. ``file://`` looks at the specified local path to locate the ISO: for instance, you could specify ``--virtualbox-boot2docker-url file://$HOME/Downloads/rc.iso`` to test out a release-candidate ISO that you have already downloaded. You can also get an ISO straight from the Internet using the ``http://`` form.
.. To customize the host only adapter, you can use the --virtualbox-hostonly-cidr flag. This will specify the host IP and Machine will calculate the VirtualBox DHCP server address (a random IP on the subnet between .1 and .25) so it does not clash with the specified host IP. Machine will also specify the DHCP lower bound to .100 and the upper bound to .254. For example, a specified CIDR of 192.168.24.1/24 would have a DHCP server between 192.168.24.2-25, a lower bound of 192.168.24.100 and upper bound of 192.168.24.254.
ホスト・オンリー・アダプタをカスタマイズするには、 ``--virtualbox-hostonly-cidr`` フラグを使えます。ここでホスト IP を指定すると、Machine は VirtualBox DHCP サーバ・アドレスを計算( ``.1`` ~ ``.25`` までのサブネット上の、ランダムな IP )するので、指定したホスト IP と衝突しないようにします。また、Machine は自動的に最小 ``.100`` ~最大 ``.254`` までの間で DHCP を指定します。たとえば、CIDR ``192.168.24.1/24`` を指定すると、DHCP サーバは ``192.168.24.2-25`` になり、IP アドレスの範囲は最小 ``192.168.24.100`` から最大 ``192.168.24.254`` となります。
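The address arithmetic above is easy to re-derive with Python's standard ``ipaddress`` module. This is an illustrative re-computation, not Machine's actual code; the DHCP server here is drawn from .2 to .25 so that it can never collide with the usual .1 host address.

```python
import ipaddress
import random

def hostonly_dhcp_plan(cidr):
    """Derive host-only network addresses from a --virtualbox-hostonly-cidr value."""
    host_ip = cidr.split("/")[0]
    net = ipaddress.ip_network(cidr, strict=False)
    base = int(net.network_address)
    # DHCP server: a random subnet address between .2 and .25,
    # skipping the host IP itself so the two never clash.
    candidates = [str(ipaddress.ip_address(base + i)) for i in range(2, 26)]
    return {
        "host_ip": host_ip,
        "dhcp_server": random.choice([c for c in candidates if c != host_ip]),
        "dhcp_lower": str(ipaddress.ip_address(base + 100)),  # start of lease range
        "dhcp_upper": str(ipaddress.ip_address(base + 254)),  # end of lease range
    }
```

For ``hostonly_dhcp_plan("192.168.24.1/24")`` this yields a DHCP server in 192.168.24.2-25 and a lease range of 192.168.24.100-254, matching the example in the text.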
.. Environment variables and default values:
利用可能な環境変数とデフォルト値は以下の通りです。
.. list-table::
   :header-rows: 1

   * - コマンドライン・オプション
     - 環境変数
     - デフォルト値
   * - ``--virtualbox-memory``
     - ``VIRTUALBOX_MEMORY_SIZE``
     - ``1024``
   * - ``--virtualbox-cpu-count``
     - ``VIRTUALBOX_CPU_COUNT``
     - ``1``
   * - ``--virtualbox-disk-size``
     - ``VIRTUALBOX_DISK_SIZE``
     - ``20000``
   * - ``--virtualbox-host-dns-resolver``
     - ``VIRTUALBOX_HOST_DNS_RESOLVER``
     - ``false``
   * - ``--virtualbox-boot2docker-url``
     - ``VIRTUALBOX_BOOT2DOCKER_URL``
     - 最新の boot2docker URL
   * - ``--virtualbox-import-boot2docker-vm``
     - ``VIRTUALBOX_BOOT2DOCKER_IMPORT_VM``
     - ``boot2docker-vm``
   * - ``--virtualbox-hostonly-cidr``
     - ``VIRTUALBOX_HOSTONLY_CIDR``
     - ``192.168.99.1/24``
   * - ``--virtualbox-hostonly-nictype``
     - ``VIRTUALBOX_HOSTONLY_NIC_TYPE``
     - ``82540EM``
   * - ``--virtualbox-hostonly-nicpromisc``
     - ``VIRTUALBOX_HOSTONLY_NIC_PROMISC``
     - ``deny``
   * - ``--virtualbox-no-share``
     - ``VIRTUALBOX_NO_SHARE``
     - ``false``
   * - ``--virtualbox-no-dns-proxy``
     - ``VIRTUALBOX_NO_DNS_PROXY``
     - ``false``
   * - ``--virtualbox-no-vtx-check``
     - ``VIRTUALBOX_NO_VTX_CHECK``
     - ``false``
.. Known Issues
既知の問題
==========
.. Vboxfs suffers from a longstanding bug causing sendfile(2) to serve cached file contents.
Vboxfs には `長年のバグ <https://www.virtualbox.org/ticket/9069>`_ があり、 `sendfile(2) <http://linux.die.net/man/2/sendfile>`_ がキャッシュされた古いファイル内容を返してしまいます。
.. This will often cause problems when using a web server such as nginx to serve static files from a shared volume. For development environments, a good workaround is to disable sendfile in your server configuration.
これにより、nginx のようなウェブ・サーバが共有ボリュームから静的ファイルを読み込むとき、問題を引き起こしがちです。開発環境では、サーバの設定で sendfile を無効化するのが良いでしょう。
.. seealso::
Docker Machine リファレンス
https://docs.docker.com/machine/reference/
..
This file is part of Invenio.
Copyright (C) 2018 CERN.
Invenio is free software; you can redistribute it and/or modify it
under the terms of the MIT License; see LICENSE file for more details.
.. _bundles:
Bundles
=======
Invenio is a highly modular framework with many modules providing a wide range
of functionality. Related modules are packed into bundles, which are released
together at the same time.
Each module has its own documentation, which you can find linked below.
Base bundle
-----------
The base bundle contains all modules related to the generic web application.
This includes the Flask/Celery application factories, configuration management,
I18N, logging, database management, assets/theme management, mail handling and
administration interface.
Included modules:
- `invenio-admin <https://invenio-admin.readthedocs.io>`_
- Administration interface for Invenio based on Flask-Admin.
- `invenio-app <https://invenio-app.readthedocs.io>`_
- Flask, WSGI, Celery and CLI applications for Invenio including
security-related headers and rate limiting.
- `invenio-assets <https://invenio-assets.readthedocs.io>`_
- Static files management and Webpack integration for Invenio.
- `invenio-base <https://invenio-base.readthedocs.io>`_
- Flask application factories implementing the application loading patterns
with entry points in Invenio.
- `invenio-cache <https://invenio-cache.readthedocs.io>`_
- Caching module for Invenio, supporting Redis and Memcached as backends.
- `invenio-celery <https://invenio-celery.readthedocs.io>`_
- Task discovery and default configuration of Celery for Invenio.
- `invenio-config <https://invenio-config.readthedocs.io>`_
- Configuration loading pattern responsible for loading configuration from
Python modules, instance folder and environment variables.
- `invenio-db <https://invenio-db.readthedocs.io>`_
- Database connection management for Invenio.
- `invenio-formatter <https://invenio-formatter.readthedocs.io>`_
- Jinja template engine utilities for Invenio.
- `invenio-i18n <https://invenio-i18n.readthedocs.io>`_
- I18N utilities like user locale detection, message catalog merging and
views for language change.
- `invenio-logging <https://invenio-logging.readthedocs.io>`_
- Configuration of logging to console, files and log aggregation
engines like `sentry.io <https://sentry.io/>`_
- `invenio-mail <https://invenio-mail.readthedocs.io>`_
- Mail sending for Invenio using Flask-Mail.
- `invenio-rest <https://invenio-rest.readthedocs.io>`_
- REST API utilities including Cross Origin Resource Sharing (CORS) and
Content Negotiation versioning support.
- `invenio-theme <https://invenio-theme.readthedocs.io>`_
- Jinja templates implementing a basic theme for Invenio as well as menus
and breadcrumbs support.
- `docker-invenio <https://docker-invenio.readthedocs.io>`_
- Docker base images based on CentOS 7 for Invenio.
- `pytest-invenio <https://pytest-invenio.readthedocs.io>`_
- Testing utilities for Invenio modules and applications.
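To make the "application factory with pluggable extensions" idea of Invenio-Base concrete, here is a minimal generic sketch. The names are invented for illustration; this is not the actual Invenio-Base API, which discovers extensions through setuptools entry points rather than an explicit list.

```python
class App:
    """Stand-in for a Flask application object."""
    def __init__(self, name):
        self.name = name
        self.config = {}
        self.extensions = {}

def create_app_factory(name, extension_factories):
    """Return a factory that builds an app and initializes every
    registered extension against it, mimicking entry-point loading."""
    def create_app(**config):
        app = App(name)
        app.config.update(config)
        for factory in extension_factories:
            ext = factory(app)
            app.extensions[ext.__class__.__name__] = ext
        return app
    return create_app
```

The payoff of the pattern is that new functionality is added by registering another factory (in Invenio, another entry point), not by editing the application code.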
Auth bundle
-----------
The auth bundle contains all modules related to account and access management,
user profiles, session management and OAuth (provider and client)
Included modules:
- `invenio-access <https://invenio-access.readthedocs.io>`_
- Role Based Access Control (RBAC) with object level permissions.
- `invenio-accounts <https://invenio-accounts.readthedocs.io>`_
- User/role management, registration, password recovery, email
verification, session theft protection, strong cryptographic hashing of
passwords, hash migration, session activity tracking and CSRF protection
of REST API via JSON Web Tokens.
- `invenio-oauth2server <https://invenio-oauth2server.readthedocs.io>`_
- OAuth 2.0 Provider for REST API authentication via access tokens.
- `invenio-oauthclient <https://invenio-oauthclient.readthedocs.io>`_
- User identity management and support for login via ORCID, GitHub, Google
or other OAuth providers.
- `invenio-userprofiles <https://invenio-userprofiles.readthedocs.io>`_
- User profiles for integration into registration forms.
These modules rely heavily on a suite of open source community projects:
- `flask-security <https://pythonhosted.org/Flask-Security/>`_
- `flask-login <https://flask-login.readthedocs.io/>`_
- `flask-principal <https://pythonhosted.org/Flask-Principal/>`_
- `flask-oauthlib <https://flask-oauthlib.readthedocs.io/>`_
- `passlib <https://passlib.readthedocs.io/en/stable/>`_
Metadata bundle
---------------
The metadata bundle contains all modules related to records and metadata
management including e.g. records storage, persistent identifier management,
search engine indexing, an OAI-PMH server and REST APIs for records.
Included modules:
- `invenio-indexer <https://invenio-indexer.readthedocs.io>`_
- Highly scalable record bulk indexing.
- `invenio-jsonschemas <https://invenio-jsonschemas.readthedocs.io>`_
- JSONSchema registry for Invenio.
- `invenio-oaiserver <https://invenio-oaiserver.readthedocs.io>`_
- OAI-PMH server implementation for Invenio.
- `invenio-pidstore <https://invenio-pidstore.readthedocs.io>`_
- Management, registration and resolution of persistent identifiers
including e.g. DOIs.
- `invenio-records <https://invenio-records.readthedocs.io>`_
- JSON document storage with revision history and JSONSchema validation.
- `invenio-records-rest <https://invenio-records-rest.readthedocs.io>`_
- REST APIs for search and CRUD operations on records and persistent
identifiers.
- `invenio-records-ui <https://invenio-records-ui.readthedocs.io>`_
- User interface for displaying records.
- `invenio-search <https://invenio-search.readthedocs.io>`_
- Elasticsearch integration module for Invenio.
- `invenio-search-js <https://inveniosoftware.github.io/invenio-search-js/>`_
- AngularJS search application for displaying records from the REST API.
- `invenio-search-ui <https://invenio-search-ui.readthedocs.io>`_
- User interface for searching records.
Files bundle (beta)
-------------------
.. note::
This bundle is in beta. The modules are being used in production systems
but are still missing some minor changes as well as documentation.
The files bundle contains all modules related to management of files in
Invenio, including an object storage REST API, multiple supported storage
backends, file previewers, an IIIF image server, and an integration layer between files and records.
Included modules:
- `invenio-files-rest <https://invenio-files-rest.readthedocs.io>`_
- Object storage REST API for Invenio with many supported backend storage
protocols and file integrity checking.
- `invenio-iiif <https://invenio-iiif.readthedocs.io>`_
- International Image Interoperability Framework (IIIF) server for making
thumbnails and zooming images.
- `invenio-previewer <https://invenio-previewer.readthedocs.io>`_
- Previewer for Markdown, JSON/XML, CSV, PDF, JPEG, PNG, TIFF, GIF and ZIP
files.
- `invenio-records-files <https://invenio-records-files.readthedocs.io>`_
- Integration layer between object storage and records.
- `invenio-xrootd <https://invenio-xrootd.readthedocs.io>`_
- Support for the storage protocol XRootD in Invenio.
Statistics bundle (beta)
------------------------
.. note::
This bundle is in beta. The modules are being used in production systems
but are still missing some minor changes as well as documentation.
The statistics bundle contains all modules related to counting statistics such
as file downloads, record views or any other type of events. It supports the
COUNTER Code of Practice as well as Making Data Count Code of Practice
including e.g. double-click detection.
Included modules:
- `invenio-stats <https://invenio-stats.readthedocs.io>`_
- Event collection, processing and aggregation in time-based indices in
Elasticsearch.
- `invenio-queues <https://invenio-queues.readthedocs.io>`_
- Event queue management module.
- `counter-robots <https://counter-robots.readthedocs.io>`_
- Module providing the list of robots according to the COUNTER Code of
Practice.
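For intuition, the double-click detection mentioned above reduces to a small filter. The sketch below assumes the 30-second window used by the COUNTER Code of Practice; the real Invenio-Stats event processors are more involved, and this is not their actual API.

```python
def drop_double_clicks(events, window=30.0):
    """Filter (visitor_id, unix_timestamp) pairs sorted by timestamp:
    a repeat event from the same visitor arriving less than `window`
    seconds after the previous one is treated as a double-click."""
    last_seen = {}
    kept = []
    for visitor, ts in events:
        previous = last_seen.get(visitor)
        last_seen[visitor] = ts
        if previous is not None and ts - previous < window:
            continue  # double-click: do not count it again
        kept.append((visitor, ts))
    return kept
```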
Deposit bundle (alpha)
----------------------
.. note::
This bundle is in alpha. The modules are being used in production systems
but are very likely subject to change and are missing documentation.
Included modules:
- `invenio-deposit <https://invenio-deposit.readthedocs.io>`_
- REST API for managing deposit of records into Invenio with support for
in progress editing of records.
- `invenio-files-js <https://invenio-xrootd.readthedocs.io>`_
- AngularJS application for uploading files to Invenio via streaming the
binary files in an HTTP request.
- `invenio-records-js <https://invenio-records-js.readthedocs.io>`_
- AngularJS application for interacting with the deposit REST API and
rendering forms based on angular schema forms.
- `invenio-sipstore <https://invenio-sipstore.readthedocs.io>`_
- Submission Information Package (SIP) store with bagit support.
Invenio modules (alpha)
-----------------------
.. note::
These modules are in alpha. The modules are being used in production
systems but are most likely subject to changes and are missing
documentation.
In addition to above bundles, we have a number of other individual modules
which are all being used in production systems, but which are likely subject
to change prior to final release and in most cases are missing documentation.
- `invenio-accounts-rest <https://invenio-accounts-rest.readthedocs.io>`_
- REST APIs for account management.
- `invenio-charts-js <https://invenio-charts-js.readthedocs.io>`_
- AngularJS application for producing charts.
- `invenio-csl-js <https://invenio-csl-js.readthedocs.io>`_
- AngularJS application for rendering citation strings via the records
REST API and the CSL REST API.
- `invenio-csl-rest <https://invenio-csl-rest.readthedocs.io>`_
- REST API for retrieving Citation Style Language (CSL) style files.
- `invenio-github <https://invenio-github.readthedocs.io>`_
- GitHub integration with automatic archiving of new releases in Invenio.
- `invenio-openaire <https://invenio-openaire.readthedocs.io>`_
- Integration with OpenAIRE, including support for harvesting the Open Funder
Registry and the OpenAIRE grants database, as well as REST APIs for
funders and grants.
- `invenio-opendefinition <https://invenio-opendefinition.readthedocs.io>`_
- REST API for licenses from OpenDefinition and SPDX.
- `invenio-pages <https://invenio-pages.readthedocs.io>`_
- Static pages module for Invenio.
- `invenio-pidrelations <https://invenio-pidrelations.readthedocs.io>`_
- Persistent identifier relations management to support e.g. DOI
versioning.
- `invenio-previewer-ispy <https://invenio-previewer-ispy.readthedocs.io>`_
- ISPY previewer.
- `invenio-query-parser <https://invenio-query-parser.readthedocs.io>`_
- Invenio v1 compatible query parser for Invenio v3. Note the module is GPL
licensed due to a GPL-licensed dependency.
- `invenio-records-editor <https://invenio-records-editor.readthedocs.io>`_
- JSON record editor.
- `invenio-records-editor-js <https://invenio-records-editor-js.readthedocs.io>`_
- Angular 4 application for editing JSON records.
- `invenio-s3 <https://invenio-s3.readthedocs.io>`_
- Support for the S3 storage protocol in Invenio.
- `invenio-sequencegenerator <https://invenio-sequencegenerator.readthedocs.io>`_
- Module for minting and tracking multiple sequences for e.g. report
numbers, journals etc.
- `invenio-sse <https://invenio-sse.readthedocs.io>`_
- Server-Sent Events (SSE) integration in Invenio.
- `invenio-webhooks <https://invenio-webhooks.readthedocs.io>`_
- REST API for receiving and processing webhook calls from third-party
services.
- `react-searchkit <https://invenio-react-searchkit.readthedocs.io>`_
- Modular React library for implementing search interfaces on top of
Invenio, Elasticsearch or other search APIs. Replacement for
Invenio-Search-JS.
Core libraries
--------------
The Invenio modules above depend on a number of smaller core libraries we have
developed to take care of e.g. identifier normalization, DataCite/Dublin Core
metadata generation, testing and citation formatting.
- `citeproc-py-styles <https://citeproc-py-styles.readthedocs.io>`_
- Citation Style Language (CSL) style files packaged as a Python module
- `datacite <https://datacite.readthedocs.io>`_
- Python library for generating DataCite XML from Python dictionaries and
registering DOIs with the DataCite DOI registration service.
- `dcxml <https://dcxml.readthedocs.io>`_
- Python library for generating Dublin Core XML from Python dictionaries.
- `dictdiffer <https://dictdiffer.readthedocs.io>`_
- Python library for diffing/patching/merging JSON documents.
- `dojson <https://dojson.readthedocs.io>`_
- JSON to JSON rule-based transformation library.
- `flask-breadcrumbs <https://flask-breadcrumbs.readthedocs.io>`_
- Flask extension for managing breadcrumbs in web applications.
- `flask-celeryext <https://flask-celeryext.readthedocs.io>`_
- Celery integration for Flask.
- `flask-iiif <https://flask-iiif.readthedocs.io>`_
- IIIF server for Flask.
- `flask-menu <https://flask-menu.readthedocs.io>`_
- Menu generation support for Flask.
- `flask-sitemap <https://flask-sitemap.readthedocs.io>`_
- Sitemaps XML generation for Flask.
- `flask-webpack <https://flask-webpack.readthedocs.io>`_
- Webpack integration for Flask.
- `idutils <https://idutils.readthedocs.io>`_
- Persistent identifier validation, identification and normalization.
- `jsonresolver <https://jsonresolver.readthedocs.io>`_
- JSONRef resolver with support for local plugins.
- `pynpm <https://pynpm.readthedocs.io>`_
- NPM integration for Python.
- `pywebpack <https://pywebpack.readthedocs.io>`_
- Webpack integration library for Python.
- `requirements-builder <https://requirements-builder.readthedocs.io>`_
- Python CLI tool for testing multiple versions of different Python
libraries in your continuous integration system.
- `xrootdpyfs <https://xrootdpyfs.readthedocs.io>`_
- PyFilesystem plugin adding XRootD support.
Scaffolding
-----------
Following modules provide templates for getting started with Invenio:
- `cookiecutter-invenio-instance <https://github.com/inveniosoftware/cookiecutter-invenio-instance>`_
- Template for new Invenio instances.
- `cookiecutter-invenio-datamodel <https://github.com/inveniosoftware/cookiecutter-invenio-datamodel>`_
- Template for new data models.
- `cookiecutter-invenio-module <https://github.com/inveniosoftware/cookiecutter-invenio-module>`_
- Template for a reusable Invenio module.
Notes on license
----------------
Invenio is undergoing a change of license from GPLv2 to the MIT License in most
cases. Thus, especially for alpha and beta modules, you may see that the license
is still GPLv2 in the source code. This will be changed to the MIT License for
all repositories before the final release. The only module we are
currently aware of that can not be converted is Invenio-Query-Parser, which
has a dependency on a GPL-licensed library. Invenio-Query-Parser is however not
needed by most installations, as it only provides an Invenio v1.x compatible
query parser.
Welcome to Textformer's documentation!
========================================
Did you ever want to transform text? Are you tired of re-implementing and defining state-of-the-art architectures? If yes, Textformer is the way-to-go! This package provides a straightforward implementation of sequence-to-sequence and transformer-based architectures, fostering all research related to text generation and translation.
Use Textformer if you need a library or wish to:
* Create your network;
* Design or use pre-loaded state-of-the-art architectures;
* Mix-and-match encoder and decoders to solve your problem;
* Because it is fun to transform text.
Textformer is compatible with: **Python 3.6+**.
.. toctree::
:maxdepth: 2
:caption: Package Reference
api/textformer.core
api/textformer.datasets
api/textformer.models
api/textformer.utils
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
Reproducing "An Introduction to State Space Time Series Analysis" using Stan
============================================================================
Trying to reproduce the examples introduced in "An Introduction to State Space Time Series Analysis" using Stan.
Example data:
,,,,,,,,,,,,,
From http://www.ssfpack.com/CKbook.html:
- logUKpetrolprice.txt
- NorwayFinland.txt
- UKdriversKSI.txt
- UKinflation.txt
- UKfrontrearseatKSI.txt
Models:
,,,,,,,
1. Introduction
- fig01_01.R: Linear regression
2. The local level model
- fig02_01.R: Deterministic level
- fig02_03.R: Stochastic level
- fig02_05.R: The local level model and Norwegian fatalities
3. The local linear trend model
- fig03_01.R: Stochastic level and slope
- fig03_04.R: Stochastic level and deterministic slope
- fig03_05.R: The local linear trend model and Finnish fatalities
4. The local level model with seasonal
- fig04_02.R: Deterministic level and seasonal
- fig04_06.R: Stochastic level and seasonal
- fig04_10.R: The local level and seasonal model and UK inflation
5. The local level model with explanatory variable
- fig05_01.R: Deterministic level and explanatory variable
- fig05_04.R: Stochastic level and explanatory variable
6. The local level model with intervention variable
- fig06_01.R: Deterministic level and intervention variable
- fig06_04.R: Stochastic level and intervention variable
7. The UK seat belt and inflation models
- fig07_01.R: Deterministic level and seasonal
- fig07_02.R: Stochastic level and seasonal
- fig07_04.R: The UK inflation model
8. General treatment of univariate state space models
9. Multivariate time series analysis
10. State space and Box–Jenkins methods for time series analysis
**IMPORTANT** Some models produce results that differ from the textbook and from R's `{dlm}` package.
Japanese
--------
Stan で "状態空間時系列分析入門" を再現する
サンプルデータ:
,,,,,,,,,,,,,,,
http://www.ssfpack.com/CKbook.html から:
- logUKpetrolprice.txt
- NorwayFinland.txt
- UKdriversKSI.txt
- UKinflation.txt
- UKfrontrearseatKSI.txt
モデル:
,,,,,,,
1. はじめに
- fig01_01.R: `線形回帰 <https://rpubs.com/sinhrks/sstsa_01_01>`_
2. ローカル・レベル・モデル
- fig02_01.R: `確定的レベル <https://rpubs.com/sinhrks/sstsa_02_01>`_
- fig02_03.R: `確率的レベル <https://rpubs.com/sinhrks/sstsa_02_03>`_
- fig02_05.R: `ローカル・レベル・モデルとノルウェイの事故 <https://rpubs.com/sinhrks/sstsa_02_05>`_
3. ローカル線形トレンド・モデル
- fig03_01.R: `確率的レベルと確率的傾き <https://rpubs.com/sinhrks/sstsa_03_01>`_
- fig03_04.R: `確率的レベルと確定的傾き <https://rpubs.com/sinhrks/sstsa_03_04>`_
- fig03_05.R: `ローカル線形トレンド・モデルとフィンランドの事故 <https://rpubs.com/sinhrks/sstsa_03_05>`_
4. 季節要素のあるローカル・レベル・モデル
- fig04_02.R: `確定的レベルと確定的季節要素 <https://rpubs.com/sinhrks/sstsa_04_02>`_
- fig04_06.R: `確率的レベルと確率的季節要素 <https://rpubs.com/sinhrks/sstsa_04_06>`_
- fig04_10.R: `ローカル・レベルと季節モデルと英国インフレーション <https://rpubs.com/sinhrks/sstsa_04_10>`_
5. 説明変数のあるローカル・レベル・モデル
- fig05_01.R: `確定的レベルと(確定的)説明変数 <https://rpubs.com/sinhrks/sstsa_05_01>`_
- fig05_04.R: `確率的レベルと(確定的)説明変数 <https://rpubs.com/sinhrks/sstsa_05_04>`_
6. 干渉変数のあるローカル・レベル・モデル
- fig06_01.R: `確定的レベルと(確定的)干渉変数 <https://rpubs.com/sinhrks/sstsa_06_01>`_
- fig06_04.R: `確率的レベルと(確定的)干渉変数 <https://rpubs.com/sinhrks/sstsa_06_04>`_
7. 英国シートベルト法とインフレーション・モデル
- fig07_01.R: `確定的レベルと確定的季節要素 <https://rpubs.com/sinhrks/sstsa_07_01>`_
- fig07_02.R: `確率的レベルと確率的季節要素 <https://rpubs.com/sinhrks/sstsa_07_02>`_
- fig07_04.R: `英国インフレーション・モデル <https://rpubs.com/sinhrks/sstsa_07_04>`_
8. 単変量状態空間モデルの一般的な取り扱い
9. 多変量時系列分析
10. 時系列分析に対する状態空間法とボックス・ジェンキンス法
**重要** いくつかのモデルはテキスト、ならびに Rの `{dlm}` パッケージとは異なる値となっている
:orphan:
.. _quick-start:
===========
Quick Start
===========
Here, we will extract the relative energies from some pentane conformer optimizations.
You can find the code in ``test/test_pentane.py``.
Import some libraries::
import numpy as np
import cctk
import glob as glob
import pandas as pd
from pandas import DataFrame
Gather the relevant filenames with ``glob``::
path = "test/static/pentane_conformation*.out"
filenames = sorted(glob.glob(path))
Now, we read these files. The molecules and their properties are stored in a
``ConformationalEnsemble``::
conformational_ensemble = cctk.ConformationalEnsemble()
for filename in filenames:
gaussian_file = cctk.GaussianFile.read_file(filename)
ensemble = gaussian_file.ensemble
molecule = ensemble.molecules[-1]
property_dict = ensemble.get_property_dict(molecule)
conformational_ensemble.add_molecule(molecule,property_dict)
Because these are geometry optimization jobs, each ``GaussianFile``
contains an ``Ensemble`` containing the geometries at each step. Calling
``ensemble.molecules[-1]`` provides the last geometry.
Each ``Molecule`` is associated with a property dictionary::
property_dict = {
'energy': -0.0552410743198,
'scf_iterations': 2,
'link1_idx': 0,
'filename': 'test/static/pentane_conformation_1.out',
'rms_force': 4.4e-05,
'rms_displacement': 0.000319,
'enthalpy': 0.106416,
'gibbs_free_energy': 0.068028,
'frequencies': [101.5041, 117.3291, 192.5335, 201.8222, 231.7895, 463.1763, 465.449, 717.7345, 778.6405, 876.373, 915.2653, 972.8192, 974.4666, 1071.7653, 1118.4824, 1118.5532, 1118.7997, 1121.9397, 1138.5283, 1145.0836, 1154.1222, 1224.0252, 1280.9892, 1286.3355, 1293.7174, 1304.3843, 1304.4249, 1307.1626, 1307.7894, 1333.8135, 1352.5493, 1402.936, 1463.1459, 2886.2576, 2897.014, 2897.5548, 2898.0773, 2904.9758, 2906.6594, 3022.3193, 3022.3517, 3029.3245, 3029.3492, 3037.506, 3037.5529],
'mulliken_charges': OneIndexedArray([-0.271682, 0.090648, 0.090012, 0.090649, -0.18851, 0.095355, 0.09536, -0.200782, 0.098551, 0.098567, -0.18851, 0.095364, 0.095351, -0.271682, 0.090649, 0.090012, 0.090649])
}
Overall, we are taking the last geometry and molecular properties from each file
and combining them into a ``ConformationalEnsemble``.
To extract the filenames and energies::
property_names = ["filename", "energy"]
conformational_energies = conformational_ensemble[:,property_names]
We can then determine the lowest energy and display the results in a ``pandas`` dataframe::
df = DataFrame(conformational_energies, columns=property_names)
df["rel_energy"] = (df.energy - df.energy.min()) * 627.509469
print(df)
The output is::
filename energy rel_energy
0 test/static/pentane_conformation_1.out -0.055241 0.000000
1 test/static/pentane_conformation_2.out -0.054881 0.226124
2 test/static/pentane_conformation_3.out -0.054171 0.671446
3 test/static/pentane_conformation_4.out -0.053083 1.354009
switch_(name)_inactive
======================
*MPF Event*
Posted on MPF-MC only (i.e. not in MPF) when the MC receives
a BCP "switch" inactive command. Useful for video modes and graphical
menu navigation. Note that this is not posted for every switch all
the time; rather, only for switches that have been configured to
send events to BCP.
*This event does not have any keyword arguments*
Python packages support
=======================
Anchor supports Python packages (sdist and wheel)
and provides an API compatible with pip, twine and others.
TL;DR
-----
::
# Upload:
$ poetry config repositories.ci http://localhost/py/upload/
$ poetry publish --repository ci
# Search:
$ pip3 search -i http://localhost/py/ foo
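If you publish with twine instead of poetry, the same endpoint can go into ``~/.pypirc``; the repository name ``anchor`` below is arbitrary, and the host and path should match your deployment.

```ini
[distutils]
index-servers =
    anchor

[anchor]
repository = http://localhost/py/upload/
```

Then publish with ``twine upload --repository anchor dist/*``.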
.. TODO write more!
.. _advanced_features/converting_into_localization:
.. include:: /partials/common.rst
.. meta::
:description: This is a how-to guide to help you get the most out of Localizations.
Converting Into Localization
============================
If you don’t know what a Localization is inside Talkable, :ref:`read this article <campaigns/localization>`.
Few benefits you get out of using Localizations:
1. It is extremely easy to set up an AB test if your copy is coded up as a Localization.
2. For non-technical people it is easier to change the copy inside Localization Editor because they are afraid to code.
1. Visit Campaign Editor:
.. image:: /_static/img/advanced_features/campaign_navigation.png
2. Swith to HTML & CSS editor:
.. image:: /_static/img/advanced_features/campaign_navigation_html_css_editor.png
3. Find a View you would like to convert a static copy at:
.. image:: /_static/img/advanced_features/campaign_view_navigation.png
4. Find a copy that you’d like to convert into a Localization. You can hit `Cmd+F` (`Ctrl+F` on Windows) to search within HTML area.
Static copy is basically a piece of text that sits inside HTML & CSS Editor and usually looks like this:
.. code-block:: html
<h1>
Get {{ advocate_incentive.description }}.
</h1>
The piece we are going to extract into a Localization is just the copy, without HTML tags. To do that, simply wrap the text in the variable notation like so:
.. code-block:: html
<h1>
{{ "advocate_share_page_headline" | localize: "Get [[ advocate_incentive.description ]]." }}
</h1>
5. Go back to the Editor to see the newly created Localization:
.. image:: /_static/img/advanced_features/campaign_localization_sidebar.png
.. raw:: html
<h2>Few important things to remember</h2>
1. Don’t forget to change `{{` into `[[` inside interpolation variables, otherwise the variables will lose their function and become plain text.
2. Keep in mind that the `identifier` ("advocate_share_page_headline" in the example above) is in fact a campaign-level Localization; always provide unique names for identifiers, otherwise you will override the value of another Localization.
.. warning::
Talkable does not allow coding up Localizations within CSS area. If you want to move some CSS property into localizations use inline <style> tag inside HTML area.
Moving Subject Line Into Localization
-------------------------------------
It is highly unlikely that your campaign is not equipped with the Subject line as a Localization; by default, all newly created campaigns at Talkable already have it. In case your campaign does not, please keep reading this section.
The Subject line is unique because its default value is set on the Advocate Share Page along with the other email sharing fields (email body, reminder checkbox value), not on the Friend Share email as might seem logical. Here is the plan:
1. Create the Subject line as a Localization on the Advocate Share Page and provide its default value; it will be used inside the `value` attribute of the "subject" form field:
.. image:: /_static/img/advanced_features/subject_line_setup_inside_share_page.png
.. code-block:: html
<input name="email_subject" type="text" class="textfield" value="{{ 'friend_share_email_subject' | localize: 'Your friend [[ advocate_info.first_name ]] [[ advocate_info.last_name ]] gave you [[ friend_incentive.description ]]' | replace: ' ', ' ' | replace: ' ', ' ' }}" />
This code creates new Localization named "Friend share email subject" that you are able to change on the Advocate Share Page.
2. Navigate to Friend Share email → Extra fields to see Email Subject field:
.. image:: /_static/img/advanced_features/campaign_editor_subject.png
3. Put the following code in there:
.. code-block:: liquid
{% if custom_message_subject == blank %}
{{ 'friend_share_email_subject' | localize }}
{% else %}
{{ custom_message_subject }}
{% endif %}
The code snippet above checks whether the Advocate provided any Subject at all. If not, the default Subject copy is used so the Friend Share email does not go out with a blank subject.
Color As Localization
---------------------
Another example would be localizing the font color of a headline (or of all copy at once), or the background color of a button. You can use the `color` trait of a Localization for that.
1. Navigate to HTML & CSS editor of the View you want to add a color Localization on:
.. image:: /_static/img/advanced_features/campaign_navigation_html_css_editor.png
2. At the very bottom of the HTML area add ``<style></style>`` tag with CSS that will override default styling of the element you want to localize:
.. code-block:: text
.button {
background-color: {{ "advocate_share_page_button_color" | localize: "#f94d08", trait: "color" }};
border-color: {{ "advocate_share_page_button_color" | localize: "#f94d08", trait: "color" }};
}
In the code example above we created a new Color Localization with the default HEX color `#f94d08`, which is used for the `background-color` and `border-color` CSS properties of the `.button` selector. Whenever you set a new color inside the Campaign Editor, both properties change because they share the same Localization identifier.
3. New Color Localization appears under "Color" tab inside Campaign Editor:
.. image:: /_static/img/advanced_features/editor_colors_tab.png
Image As Localization
---------------------
Localizing Image asset can be handy if you want to AB test it. Here is how to do that:
1. Navigate to the HTML & CSS editor of the View you want to add an Image Localization on:
.. image:: /_static/img/advanced_features/campaign_navigation_html_css_editor.png
2. Inside HTML area find an image you want to localize. An image can be either within CSS or within HTML area (`<img />`, inline styles, etc.). If the image is set within CSS you need to extract it into HTML area using inline styles:
.. code-block:: text
<div class="container" style="background-image: url('{{ "share_page_background" | localize: "share-page-background.jpg", trait: "asset" }}');">
...
</div>
In the example above, `share_page_background` is the name of an Image Localization and `share-page-background.jpg` is the name of an Asset (Files tab within the HTML & CSS Editor).
3. Now we can see newly created Image Localization under "Images" tab:
.. image:: /_static/img/advanced_features/editor_images_tab.png
Custom Option (Configuration) Localization
------------------------------------------
In addition to localizing Images, Colors, and static copy Talkable allows you to build really advanced Localizations where you can AB test or switch between different visual layouts of campaign Views.
An example would be an AB test of Equal Emphasis (all 3 sharing channels look visually equal) vs. Email Emphasis, where the email sharing form stands out:
.. image:: /_static/img/advanced_features/share_page_equal_emphasis.png
:height: 250 px
.. image:: /_static/img/advanced_features/share_page_email_emphasis.png
:height: 250 px
In order to achieve this AB test we need to:
1. Build two separate layouts using CSS cascades to style all nested children within a container block that holds all the content:
.. code-block:: text
{% assign share_page_layout = "share_page_layout" | localize: "Equal Emphasis", "Email Emphasis" %}
<div class="container is-{{ share_page_layout | downcase | split: " " | join: "-" }}">
...
</div>
The code above creates a local Liquid variable named `share_page_layout` and assigns `share_page_layout` Configuration Localization to it.
Then we normalize the variable value for use in an HTML `class` attribute (downcase, replace spaces with hyphens) and append it to the element's `class` attribute.
2. Now inside Campaign Editor we can see newly created Configuration Localization:
.. image:: /_static/img/advanced_features/editor_configuration_tab.png
3. Let’s switch back to HTML & CSS editor and start applying CSS styling to both layouts. Knowing their final classes inside HTML: `class="container is-equal-emphasis"` and `class="container is-email-emphasis"` we can easily style both layouts inside CSS area like so (SCSS is also allowed and is shown as an example for code simplicity):
.. code-block:: scss
.container {
&.is-equal-emphasis {
h1 {
font-size: 48px;
}
}
&.is-email-emphasis {
h1 {
font-size: 32px;
}
}
}
All other nested children can be styled following this pattern.
4. Once you’re done with styling, setting up an AB test is easy: go back to the Campaign Editor and click the "Add A/B test variant" link. Once a Campaign goes Live, it will rotate both variants following the AB test distribution rules (50:50 by default).
.. container:: hidden
.. toctree::
.. image:: https://pyup.io/static/images/logo.png
:target: https://pyup.io
|
.. image:: https://pyup.io/repos/github/pyupio/pyup/shield.svg
:target: https://pyup.io/repos/github/pyupio/pyup/
:alt: Updates
.. image:: https://img.shields.io/pypi/v/pyupio.svg
:target: https://pypi.python.org/pypi/pyupio
.. image:: https://travis-ci.org/pyupio/pyup.svg?branch=master
:target: https://travis-ci.org/pyupio/pyup
.. image:: https://readthedocs.org/projects/pyup/badge/?version=latest
:target: https://readthedocs.org/projects/pyup/?badge=latest
:alt: Documentation Status
.. image:: https://codecov.io/github/pyupio/pyup/coverage.svg?branch=master
:target: https://codecov.io/github/pyupio/pyup?branch=master
A tool to update all your project's requirement files with a single command directly on github.
.. image:: https://github.com/pyupio/pyup/blob/master/demo.gif
About
-----
Pyup is the open source version of the online service that is running behind pyup.io. The online
service comes with a user interface to manage all your project dependencies at a single place and a
lot of additional features. It's currently in closed beta. If you are interested to try it out,
make sure to request an invite at https://pyup.io
Installation
------------
To install pyup, run::
$ pip install pyupio
Obtain Token
------------
In order to communicate with the github API, you need to create an oauth token for your account:
* Log in to your github account
* Click on settings -> Personal access tokens
* Click on Generate new token
* Make sure to check 'repo' and click on Generate token
Run your first Update
---------------------
Run::
$ pyup --repo=username/repo --user-token=<YOUR_TOKEN> --initial
This will check all your requirement files and search for new package versions. If there are
updates available, pyup will create a new branch on your repository and create a new commit for
every single update. Once all files are up to date, pyup will create a single pull request containing
all commits.
Once your repository is up to date and the initial update is merged in, remove the `--initial`
flag and run::
$ pyup --repo=username/repo --user-token=<YOUR_TOKEN>
This will create a new branch and a pull request for every single update. Run a cronjob or a scheduled task somewhere
that auto-updates your repository once in a while (e.g. every day) to stay on latest.
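For example, a crontab entry that runs the update check every night at 3 AM could look like this (the schedule and log path are only an illustration; adjust the repository and token to your setup):

```shell
# m h dom mon dow  command
0 3 * * * pyup --repo=username/repo --user-token=<YOUR_TOKEN> >> /var/log/pyup.log 2>&1
```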
Filtering
---------
You may not want to update all your requirements to the latest version, may want to ignore
some of them completely, or exclude whole files. That's what filters are for.
To exclude a whole file, add this to the first line::
# pyup: ignore file
To ignore a package, append the `# pyup: ignore` filter::
flask # pyup: ignore
If you want to use e.g. the long term support version of Django, which is 1.8 currently, without
updating to the latest version 1.9, just add this filter::
Django # pyup: >=1.8,<1.9
This tells pyup to use a version that is greater or equal to `1.8` but smaller than `1.9`.
If you are a user of requires.io and you are using the `rq.filter` directive in your files: Pyup
supports that, too.
Text utils
=================
Collection of helper function that facilitate processing text.
.. autofunction:: pytorch_widedeep.utils.text_utils.simple_preprocess
:noindex:
.. autofunction:: pytorch_widedeep.utils.text_utils.get_texts
:noindex:
.. autofunction:: pytorch_widedeep.utils.text_utils.pad_sequences
:noindex:
.. autofunction:: pytorch_widedeep.utils.text_utils.build_embeddings_matrix
   :noindex:
.. _language:
Multilingual support
====================
pygeoapi is language-aware and can handle multiple languages if these have been defined in pygeoapi's configuration (see `maintainer guide`_).
Providers can also handle multiple languages if configured. These may even be different from the languages that pygeoapi
supports. Out-of-the-box, pygeoapi "speaks" English.
The following sections provide more information how to use and set up languages in pygeoapi.
End user guide
--------------
There are 2 ways to affect the language of the results returned by pygeoapi, both for the HTML and JSON(-LD) formats:
1. After the requested pygeoapi URL, append a ``lang=<code>`` query parameter, where ``<code>`` should be replaced by a well-known language code.
This can be an ISO 639-1 code (e.g. `de` for German), optionally accompanied by an ISO 3166-1 alpha-2 country code (e.g. `de-CH` for Swiss-German).
Please refer to this `W3C article <https://www.w3.org/International/articles/language-tags/>`_ for more information or
this `list of language codes <http://www.lingoes.net/en/translator/langcode.htm>`_ for more examples.
Another option is to send a complex definition with quality weights (e.g. `de-CH, de;q=0.9, en;q=0.8, fr;q=0.7, \*;q=0.5`).
pygeoapi will then figure out the best match for the requested language.
For example, to view the pygeoapi landing page in Canadian-French, you could use this URL:
https://demo.pygeoapi.io/master?lang=fr-CA
2. Alternatively, you can set an ``Accept-Language`` HTTP header for the requested pygeoapi URL. Language tags that are valid for
the ``lang`` query parameter are also valid for this header value.
Please note that if your client application (e.g. browser) is configured for a certain language, it will likely set this
header by default, so the returned response should be translated to the language of your client app. If you don't want this,
you can either change the language of your client app or append the ``lang`` parameter to the URL, which will override
any language defined in the ``Accept-Language`` header.
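The weighted-tag matching described above can be sketched in plain Python. This is an illustrative stand-alone sketch, not pygeoapi's actual implementation (which negotiates Babel locales internally):

```python
def parse_accept_language(header):
    """Parse an Accept-Language value into (tag, quality) pairs, best first."""
    tags = []
    for part in header.split(','):
        part = part.strip()
        if not part:
            continue
        if ';q=' in part:
            tag, quality = part.split(';q=')
            tags.append((tag.strip(), float(quality)))
        else:
            tags.append((part, 1.0))
    return sorted(tags, key=lambda pair: pair[1], reverse=True)


def best_match(header, supported):
    """Return the best supported language tag, falling back to the default.

    The first entry in ``supported`` is treated as the default language,
    mirroring how pygeoapi treats the first configured language.
    """
    by_primary = {tag.split('-')[0]: tag for tag in supported}
    for tag, _quality in parse_accept_language(header):
        if tag == '*':
            continue
        if tag in supported:          # exact match, e.g. fr-CA
            return tag
        primary = tag.split('-')[0]   # fr-CH falls back to fr
        if primary in by_primary:
            return by_primary[primary]
    return supported[0]


print(best_match('de-CH, de;q=0.9, en;q=0.8, fr;q=0.7, *;q=0.5',
                 ['en-US', 'fr-CA']))
# prints en-US: neither German tag is supported, so the generic "en" wins
```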
Notes
^^^^^
- If pygeoapi cannot find a good match to the requested language, the response is returned in the default language (usually English).
- Even if pygeoapi *itself* supports the requested language, provider plugins might not support that language or perhaps don't even
support (multiple) languages at all. In that case the provider will reply in its own default language, which may not be the same language
as the default pygeoapi server language.
- If pygeoapi found a match to the requested language, the response will include a ``Content-Language`` HTTP header,
set to the best-matching server language code. This is the default behavior for all pygeoapi requests.
- For HTML results returned from a **provider**, the ``Content-Language`` HTTP header will be set to the best match for the
provider language or the best-matching pygeoapi server language if the provider does not support languages.
- For JSON(-LD) results returned from a **provider**, the ``Content-Language`` header will be **removed** if the provider
does not support any language. Otherwise, the header will be set to the best-matching provider language.
Maintainer guide
----------------
Every pygeoapi instance needs to support at least 1 language. In the server configuration, there must be a ``language``
or a ``languages`` (note the `s`) property. The property can be set to a single language tag or a list of tags respectively.
If you wish to set up a multilingual pygeoapi instance, you will have to add more than 1 language to the
server configuration YAML file (i.e. ``pygeoapi-config.yml``). First, you will have to add the supported language tags/codes
as a list. For example, if you wish to support American English and Canadian French, you could do:
.. code-block:: yaml
server:
bind: ...
url: ...
mimetype: ...
encoding: ...
languages:
- en-US
- fr-CA
Next, you will have to provide translations for the configured languages. This involves 3 steps:
1. `Add translations for configurable text values`_ in the server YAML file;
2. Verify if there are any Jinja2 HTML template translations for the configured language(s);
3. Make sure that the provider plugins you need can handle this language as well, if you have the ability to do so.
See the `developer guide`_ for more details.
Notes
^^^^^
- The **first** language you define in the configuration determines the default language, i.e. the language that pygeoapi will
use if no other language was requested or no best match for the requested language could be found.
- It is not possible to **disable** language support in pygeoapi. The functionality is always on. If results should always
be shown in a single language, you'd have to set that language only in the pygeoapi configuration.
- Results returned from a provider may be in a different language than pygeoapi's own server language. The requested language
is always passed on to the provider, even if pygeoapi itself does not support it. For more information, see the `end user guide`_
and the `developer guide`_.
Add translations for configurable text values
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
For most of the text values in pygeoapi's server configuration where it makes sense, you can add translations.
Consider the ``metadata`` section for example. The English-only version looks similar to this:
.. code-block:: yaml
metadata:
identification:
title: pygeoapi default instance
description: pygeoapi provides an API to geospatial data
keywords:
- geospatial
- data
- api
If you wish to make these text values available in English and French, you could use the following language struct:
.. code-block:: yaml
metadata:
identification:
title:
en: pygeoapi default instance
fr: instance par défaut de pygeoapi
description:
en: pygeoapi provides an API to geospatial data
fr: pygeoapi fournit une API aux données géospatiales
keywords:
en:
- geospatial
- data
- api
fr:
- géospatiale
- données
- api
In other words: each plain text value should be replaced by a dictionary, where the language code is the key and the translated text represents the matching value.
For lists, this can be applied as well (see ``keywords`` example above), as long as you nest the entire list under a language key instead of each list item.
Note that the example above uses generic language tags, but you can also supply more localized tags (with a country code) if required.
pygeoapi should always be able find the best match to the requested language, i.e. if the user wants Swiss-French (`fr-CH`) but pygeoapi can only find `fr` tags,
those values will be returned. However, if a `fr-CH` tag can also be found, that value will be returned and not the `fr` value.
.. todo:: Add docs on HTML templating.
Developer guide
---------------
If you are a developer who wishes to create a pygeoapi provider plugin that "speaks" a certain language,
you will have to fully implement this yourself. Needless to say, if your provider depends on some backend, it will only make sense to
implement language support if the backend can be queried in another language as well.
You are free to set up the language support anyway you like, but there are a couple of steps you'll have to walk through:
1. You will have to define the supported languages in the provider configuration YAML. This can be done in a similar fashion
as the ``languages`` configuration for pygeoapi itself, as described in the `maintainer guide`_ section above.
For example, a TinyDB records provider that supports English and French could be set up like:
.. code-block:: yaml
my-records:
type: collection
..
providers:
- type: record
name: TinyDBCatalogue
data: ..
languages:
- en
- fr
2. If your provider implements any of the ``query``, ``get``, ``get_metadata``, ``get_coverage_domainset`` and ``get_coverage_rangetype``
methods of the base class and you wish to make them language-aware, either add a ``**kwargs`` parameter or a
``language=None`` parameter to the method signature.
An example Python code block for a custom provider with a language-aware ``query`` method could look like this:
.. code-block:: python
class MyCoolVectorDataProvider(BaseProvider):
"""My cool vector data provider"""
def __init__(self, provider_def):
super().__init__(provider_def)
def query(self, startindex=0, limit=10, resulttype='results', bbox=[],
datetime_=None, properties=[], sortby=[], select_properties=[],
skip_geometry=False, q=None, language=None):
LOGGER.debug(f'Provider queried in {language.english_name} language')
# Implement your logic here, returning JSON in the requested language
Alternatively, you could also use ``**kwargs`` in the ``query`` method and get the ``language`` value:
.. code-block:: python
def query(self, **kwargs):
LOGGER.debug(f"Provider locale set to: {kwargs.get('language')}")
# Implement your logic here, returning JSON in the requested language
This is all that is required. The pygeoapi API class will make sure that the correct HTTP ``Content-Language`` headers are set on the response object.
Notes
^^^^^
- If your provider implements any of the aforementioned ``query``, ``get``, ``get_metadata``, ``get_coverage_domainset`` and ``get_coverage_rangetype``
methods, it **must** add a ``**kwargs`` or ``language=None`` parameter, even if it does not need to use the language parameter.
- Contrary to the pygeoapi server configuration, adding a ``language`` or ``languages`` (both are supported) property to the
provider definition is **not** required and may be omitted. In that case, the passed-in ``language`` parameter language-aware provider methods
(``query``, ``get``, etc.) will be set to ``None``. This results in the following behavior:
- HTML responses returned from the providers will have the ``Content-Language`` header set to the best-matching pygeoapi server language.
- JSON(-LD) responses returned from providers will **not** have a ``Content-Language`` header if ``language`` is ``None``.
- If the provider supports a requested language, the passed-in ``language`` will be set to the best matching
`Babel Locale instance <http://babel.pocoo.org/en/latest/api/core.html#babel.core.Locale>`_.
Note that this may be the provider default language if no proper match was found.
  No matter the output format, API responses returned from providers will always contain a best-matching ``Content-Language``
  header if one or more supported provider languages were defined.
- For general information about building plugins, please visit the :ref:`plugins` page.
.. _aiomysql-cursors:
Cursor
======
.. class:: Cursor
A cursor for connection.
Allows Python code to execute :term:`MySQL` command in a database
session. Cursors are created by the :meth:`Connection.cursor`
:ref:`coroutine <coroutine>`: they are bound to the connection for
the entire lifetime and all the commands are executed in the context
of the database session wrapped by the connection.
Cursors that are created from the same connection are not isolated,
i.e., any changes done to the database by a cursor are immediately
visible by the other cursors. Cursors created from different
connections can or can not be isolated, depending on the
connections’ isolation level.
.. code:: python
import asyncio
import aiomysql
loop = asyncio.get_event_loop()
@asyncio.coroutine
def test_example():
conn = yield from aiomysql.connect(host='127.0.0.1', port=3306,
user='root', password='',
db='mysql', loop=loop)
# create default cursor
cursor = yield from conn.cursor()
# execute sql query
yield from cursor.execute("SELECT Host, User FROM user")
# fetch all results
r = yield from cursor.fetchall()
# detach cursor from connection
yield from cursor.close()
# close connection
conn.close()
loop.run_until_complete(test_example())
Use :meth:`Connection.cursor()` for getting cursor for connection.
.. attribute:: connection
      This read-only attribute returns a reference to the :class:`Connection`
      object on which the cursor was created.
.. attribute:: echo
Return echo mode status.
.. attribute:: description
This read-only attribute is a sequence of 7-item sequences.
Each of these sequences is a collections.namedtuple containing
information describing one result column:
0. name: the name of the column returned.
1. type_code: the type of the column.
2. display_size: the actual length of the column in bytes.
3. internal_size: the size in bytes of the column associated to
this column on the server.
4. precision: total number of significant digits in columns of
type ``NUMERIC``. None for other types.
5. scale: count of decimal digits in the fractional part in
columns of type ``NUMERIC``. None for other types.
6. null_ok: always None.
This attribute will be None for operations that do not
return rows or if the cursor has not had an operation invoked
via the :meth:`Cursor.execute()` method yet.
.. attribute:: rowcount
      Returns the number of rows that have been produced or affected.
This read-only attribute specifies the number of rows that the
last :meth:`Cursor.execute()` produced (for Data Query Language
statements like SELECT) or affected (for Data Manipulation
Language statements like ``UPDATE`` or ``INSERT``).
The attribute is -1 in case no :meth:`Cursor.execute()` has been
performed on the cursor or the row count of the last operation if it
can't be determined by the interface.
.. attribute:: rownumber
Row index. This read-only attribute provides the current 0-based index
of the cursor in the result set or ``None`` if the index cannot be
determined.
.. attribute:: arraysize
How many rows will be returned by :meth:`Cursor.fetchmany()` call.
This read/write attribute specifies the number of rows to
fetch at a time with :meth:`Cursor.fetchmany()`. It defaults to
1 meaning to fetch a single row at a time.
.. attribute:: lastrowid
This read-only property returns the value generated for an
`AUTO_INCREMENT` column by the previous `INSERT` or `UPDATE` statement
or None when there is no such value available. For example,
if you perform an `INSERT` into a table that contains an
`AUTO_INCREMENT` column, :attr:`Cursor.lastrowid` returns the
`AUTO_INCREMENT` value for the new row.
.. attribute:: closed
      A read-only property that returns ``True`` if the connection was
      detached from the current cursor.
.. method:: close()
:ref:`Coroutine <coroutine>` to close the cursor now (rather than
whenever ``del`` is executed). The cursor will be unusable from this
point forward; closing a cursor just exhausts all remaining data.
.. method:: execute(query, args=None)
:ref:`Coroutine <coroutine>`, executes the given operation substituting
any markers with the given parameters.
For example, getting all rows where id is 5::
yield from cursor.execute("SELECT * FROM t1 WHERE id=%s", (5,))
:param str query: sql statement
:param list args: tuple or list of arguments for sql query
      :returns int: number of rows that have been produced or affected
.. method:: executemany(query, args)
      The `executemany()` :ref:`coroutine <coroutine>` will execute the
      operation, iterating over the sequence of parameters in ``args``.
Example: Inserting 3 new employees and their phone number::
         data = [
             ('Jane', '555-001'),
             ('Joe', '555-002'),
             ('John', '555-003')
         ]
         stmt = "INSERT INTO employees (name, phone) VALUES ('%s','%s')"
         yield from cursor.executemany(stmt, data)
`INSERT` statements are optimized by batching the data, that is
using the MySQL multiple rows syntax.
:param str query: sql statement
:param list args: tuple or list of arguments for sql query
.. method:: callproc(procname, args)
      Execute stored procedure ``procname`` with ``args``; this method is a
      :ref:`coroutine <coroutine>`.
Compatibility warning: PEP-249 specifies that any modified
parameters must be returned. This is currently impossible
as they are only available by storing them in a server
variable and then retrieved by a query. Since stored
procedures return zero or more result sets, there is no
reliable way to get at OUT or INOUT parameters via `callproc`.
The server variables are named `@_procname_n`, where `procname`
is the parameter above and n is the position of the parameter
(from zero). Once all result sets generated by the procedure
have been fetched, you can issue a `SELECT @_procname_0`, ...
query using :meth:`Cursor.execute()` to get any OUT or INOUT values.
Basic usage example::
conn = yield from aiomysql.connect(host='127.0.0.1', port=3306,
user='root', password='',
db='mysql', loop=self.loop)
cur = yield from conn.cursor()
yield from cur.execute("""CREATE PROCEDURE myinc(p1 INT)
BEGIN
SELECT p1 + 1;
END
""")
yield from cur.callproc('myinc', [1])
(ret, ) = yield from cur.fetchone()
assert 2, ret
yield from cur.close()
conn.close()
Compatibility warning: The act of calling a stored procedure
itself creates an empty result set. This appears after any
result sets generated by the procedure. This is non-standard
behavior with respect to the DB-API. Be sure to use
:meth:`Cursor.nextset()` to advance through all result sets; otherwise
you may get disconnected.
:param str procname: name of procedure to execute on server
:param args: sequence of parameters to use with procedure
:returns: the original args.
.. method:: fetchone()
      Fetch the next row. This method is a :ref:`coroutine <coroutine>`.
.. method:: fetchmany(size=None)
      :ref:`Coroutine <coroutine>` that fetches the next set of rows of a
      query result, returning a list of tuples. When no more rows are
      available, it returns an empty list.
The number of rows to fetch per call is specified by the parameter.
If it is not given, the cursor's :attr:`Cursor.arraysize` determines
the number of rows to be fetched. The method should try to fetch as
many rows as indicated by the size parameter. If this is not possible
due to the specified number of rows not being available, fewer rows
may be returned ::
cursor = yield from connection.cursor()
yield from cursor.execute("SELECT * FROM test;")
         r = yield from cursor.fetchmany(2)
print(r)
# [(1, 100, "abc'def"), (2, None, 'dada')]
r = yield from cursor.fetchmany(2)
print(r)
# [(3, 42, 'bar')]
r = yield from cursor.fetchmany(2)
print(r)
# []
:param int size: number of rows to return
:returns list: of fetched rows
.. method:: fetchall()
      :ref:`Coroutine <coroutine>` that returns all rows of a query result set::
yield from cursor.execute("SELECT * FROM test;")
r = yield from cursor.fetchall()
print(r)
# [(1, 100, "abc'def"), (2, None, 'dada'), (3, 42, 'bar')]
:returns list: list of fetched rows
.. method:: scroll(value, mode='relative')
Scroll the cursor in the result set to a new position according
to mode. This method is :ref:`coroutine <coroutine>`.
If mode is ``relative`` (default), value is taken as offset to the
current position in the result set, if set to ``absolute``, value
states an absolute target position. An IndexError should be raised in
case a scroll operation would leave the result set. In this case,
the cursor position is left undefined (ideal would be to
not move the cursor at all).
.. note::
According to the :term:`DBAPI`, the exception raised for a cursor out
of bound should have been :exc:`IndexError`. The best option is
probably to catch both exceptions in your code::
            try:
                yield from cur.scroll(1000 * 1000)
            except (ProgrammingError, IndexError) as exc:
                deal_with_it(exc)
:param int value: move cursor to next position according to mode.
:param str mode: scroll mode, possible modes: `relative` and `absolute`
.. class:: DictCursor
   A cursor which returns results as a dictionary. All methods and arguments
   are the same as for :class:`Cursor`; see the example::
import asyncio
import aiomysql
loop = asyncio.get_event_loop()
@asyncio.coroutine
def test_example():
conn = yield from aiomysql.connect(host='127.0.0.1', port=3306,
user='root', password='',
db='mysql', loop=loop)
# create dict cursor
cursor = yield from conn.cursor(aiomysql.DictCursor)
# execute sql query
yield from cursor.execute(
"SELECT * from people where name='bob'")
# fetch all results
r = yield from cursor.fetchone()
print(r)
# {'age': 20, 'DOB': datetime.datetime(1990, 2, 6, 23, 4, 56),
# 'name': 'bob'}
loop.run_until_complete(test_example())
.. class:: SSCursor
Unbuffered Cursor, mainly useful for queries that return a lot of
data, or for connections to remote servers over a slow network.
Instead of copying every row of data into a buffer, this will fetch
   rows as needed. The upside of this is that the client uses much less
   memory, and rows are returned much faster when traveling over a slow
   network, or when the result set is very big.
There are limitations, though. The MySQL protocol doesn't support
returning the total number of rows, so the only way to tell how many rows
there are is to iterate over every row returned. Also, it currently isn't
possible to scroll backwards, as only the current row is held in memory.
All methods are the same as in :class:`Cursor` but with different
behaviour.
.. method:: fetchall()
Same as the :meth:`Cursor.fetchall` :ref:`coroutine <coroutine>`, but
not recommended for large queries, as all rows are fetched one by one.
.. method:: fetchmany(size=None)
Same as :meth:`Cursor.fetchmany`, but each row is fetched one by one.
.. method:: scroll(value, mode='relative')
Same as :meth:`Cursor.scroll`, but moves the cursor on the server side one
row at a time. If you want to move 20 rows forward, scroll will make 20
queries to move the cursor. Currently only forward scrolling is supported.
.. class:: SSDictCursor
An unbuffered cursor, which returns results as a dictionary.
.. Source: docs/source/2pc.rst (tarasvaskiv/openprocurement.tender.cfaselectionua, Apache-2.0)

.. _2pc:
2 Phase Commit
==============
.. _tender-2pc:
Mechanism of the 2-phase commit
--------------------------------
The 2-phase commit provides a mechanism for CDB to publish only the tenders that clients are able to control and duplicates of which they have rights to cancel.
The reason for duplicated tenders can be cases when the requester did not receive a response from the server about tender creation and, therefore, repeated the request. Removing such tenders requires administrative intervention.
Creating tender with single-phase commit
----------------------------------------
Sending a single-phase request for a tender creation (POST /tenders) according to the "old" mechanism, that creates a tender already in the ``active.enquiries`` status:
.. include:: tutorial/tender-post-attempt-json-data.http
:code:
Creating tender with 2-phase commit
-----------------------------------
Tender becomes available after the successful completion of the following requests:
1. Creation of the tender in the ``draft`` status.
2. Transfer of the tender to ``active.enquiries`` status through a separate request (publication).
Creation of a tender
~~~~~~~~~~~~~~~~~~~~
A request `POST /tenders` creates a tender in the ``draft`` status. As a result, an ``acc_token`` is returned for further tender management.
.. include:: tutorial/tender-post-2pc.http
:code:
Tender with the ``draft`` status is "invisible" in the `GET /tenders` list. Chronograph does not "see" it, therefore, does not switch statuses.
Publication of a tender
~~~~~~~~~~~~~~~~~~~~~~~
The request `PATCH /tenders/{id}?acc_token=...` with the body ``{"data":{"status":"active.enquiries"}}`` changes the status of the tender (according to the request) and, therefore, publishes it ("visualizes" it in the `GET /tenders` list).
.. include:: tutorial/tender-patch-2pc.http
:code:
All tenders created in the CDB but not yet published will not be displayed on the web platform and, therefore, will not lead to their announcement.
Repeating the publication request, in case of a problem receiving a response from the server, will not cause errors.
The new mechanism is available along with the "old" one. The "old" one is likely to be turned off in one of the later releases.
Work with errors
----------------
In case of an unsuccessful request and/or 5xx errors you should check the modified object data (tender, bid, award, etc.), since a 5xx error response does not necessarily guarantee that the request has not been performed. You should repeat the request at some interval until it succeeds.
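The retry loop described above can be sketched like this (illustrative only; the helper and the fake request function are made-up names, not part of the API):

```python
import time

def retry_until_success(request, attempts=5, interval=1.0):
    """Repeat `request` until it returns a response without a 5xx
    status; `request` is any callable returning (status, body)."""
    last = None
    for _ in range(attempts):
        status, body = request()
        if status < 500:
            return status, body
        last = (status, body)
        time.sleep(interval)
    return last

# Fake request: fails twice with 502, then succeeds.
calls = {'n': 0}
def fake_publish():
    calls['n'] += 1
    return (502, None) if calls['n'] < 3 else (200, {'status': 'active.enquiries'})

print(retry_until_success(fake_publish, interval=0))
# (200, {'status': 'active.enquiries'})
```

Remember to also check the state of the modified object between attempts, since the earlier request may in fact have been performed.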
You can view more detailed error description :ref:`here <errors>`.
Here is an example of incorrectly formed request. This error indicates that the data is not found in the body of JSON.
.. include:: tutorial/tender-post-attempt-json.http
:code:
.. Source: docs/api/tasks.rst (carlmanaster/python-lokalise-api, BSD-3-Clause)

Tasks endpoint
==============
`Tasks documentation <https://app.lokalise.com/api2docs/curl/#resource-tasks>`_
Fetch all tasks
---------------
.. py:function:: tasks(project_id, [params = None])
:param str project_id: ID of the project
:param dict params: (optional) Request parameters
:return: Collection of tasks
Example:
.. code-block:: python
tasks = client.tasks('123.abc', {
"page": 2,
"limit": 3,
"filter_statuses": "completed"
})
tasks.items[0].task_id # => 89334
Fetch a task
------------
.. py:function:: task(project_id, task_id)
:param str project_id: ID of the project
:param task_id: ID of the task to fetch
:type task_id: int or str
:return: Task model
Example:
.. code-block:: python
task = client.task('123.abc', 89334)
task.task_id # => 89334
task.title # => "Demo task"
Create a task
-------------
.. py:function:: create_task(project_id, params)
:param str project_id: ID of the project
:param dict params: Task parameters
:return: Task model
Example:
.. code-block:: python
task = client.create_task('123.abc', {
"title": "Python task",
"languages": [{
"language_iso": "en",
"users": [203]
}],
"keys": [340891],
"auto_close_task": True
})
task.project_id # => '123.abc'
task.title # => "Python task"
task.languages[0]['language_iso'] # => "en"
task.auto_close_task # => True
Update a task
-------------
.. py:function:: update_task(project_id, task_id, [params = None])
:param str project_id: ID of the project
:param task_id: ID of the task to update
:type task_id: int or str
:param dict params: Task parameters
:return: Task model
Example:
.. code-block:: python
task = client.update_task('123.abc', 34567, {
"title": "Python updated task",
"due_date": "2020-08-24 23:59:59"
})
task.title # => "Python updated task"
task.due_date # => "2020-08-24 21:59:59 (Etc/UTC)"
Delete a task
-------------
.. py:function:: delete_task(project_id, task_id)
:param str project_id: ID of the project
:param task_id: ID of the task to delete
:type task_id: int or str
:return: Dictionary with the project ID and "task_deleted": True
Example:
.. code-block:: python
client.delete_task('123.abc', 34567)
.. Source: docs/index.rst (infosec-garage/maltools, MIT)

*************************************
Maltools
*************************************
Welcome to the Maltools documentation.
.. toctree::
:maxdepth: 3
:hidden:
self
API <api/maltools>
GitHub project <https://github.com/infosec-garage/maltools>
.. Feel free to change github link to your own repo url.
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
.. Source: docs/bases/component.rst (ChenlingJ/MCycle, Apache-2.0)

Component Abstract Base Classes
===============================
.. autosummary::
:toctree:
mcycle.bases.component.Component
mcycle.bases.component.Component11
mcycle.bases.component.Component22
.. automodule:: mcycle.bases.component
:members:
:inherited-members:
:show-inheritance:
.. Source: gen/pb-protodoc/flyteidl/plugins/sagemaker/index.rst (EngHabu/flyteidl, Apache-2.0)

sagemaker
=========
.. toctree::
:maxdepth: 1
:caption: sagemaker
:name: sagemakertoc
hyperparameter_tuning_job.proto
parameter_ranges.proto
training_job.proto
.. Source: docs/usage.rst (GenomeTrakrUnofficial/sra-quick-submit, Unlicense)

=====
Usage
=====
To use SRA Quick-Submit in a project::
import sra_quick_submit_package
.. Source: docs/index.rst (johnnoone/aioconsul, BSD-3-Clause)

AIOConsul
=========
AIOConsul is a Python >= 3.5 library for working with the Consul_ API,
built on top of asyncio_ and aiohttp_.
Currently, this library aims at full compatibility with Consul 0.7.
Sources are available at https://lab.errorist.xyz/aio/aioconsul.
Build status
------------
.. image:: https://lab.errorist.xyz/aio/aioconsul/badges/master/build.svg
:target: https://lab.errorist.xyz/aio/aioconsul/commits/master
.. image:: https://lab.errorist.xyz/aio/aioconsul/badges/master/coverage.svg
:target: https://lab.errorist.xyz/aio/aioconsul/commits/master
Installation
------------
::
pip install aioconsul
Tutorial
--------
In this example I will show you how to join my cluster with another::
from aioconsul import Consul
client = Consul('my.node.ip')
# do I have members?
members = await client.members.items()
assert len(members) == 1, "I am alone in my cluster"
# let's join another cluster
joined = await client.members.join('other.node.ip')
if joined:
members = await client.members.items()
assert len(members) > 1, "I'm not alone anymore"
And display the catalog::
datacenters = await client.catalog.datacenters()
for dc in datacenters:
print(dc)
services, _ = await client.catalog.services()
for service, tags in services.items():
print(service, tags)
nodes, _ = await client.catalog.nodes()
for node in nodes:
print(node.name, node.address)
Important
---------
Version 0.7 breaks compatibility with previous versions:
* It is closer to what HTTP API returns
* It does not add consul property anymore
* Response with metadata are now a 2 items length tuple
(:class:`~aioconsul.typing.CollectionMeta` or
:class:`~aioconsul.typing.ObjectMeta`)
Focus
-----
.. toctree::
:maxdepth: 2
client
api
endpoints
misc
contributing
.. _Consul: https://www.consul.io
.. _asyncio: http://asyncio.org
.. _aiohttp: http://aiohttp.readthedocs.org
.. Source: docs/expression.rst (orekyuu/doma, Apache-2.0)

===================
Expression language
===================
.. contents:: Contents
:depth: 3
You can write simple expressions in directives of :doc:`sql`.
The grammar is almost the same as Java.
However, not everything is possible that Java can do.
.. note::
Especially, the big difference is how to use optional types like ``java.util.Optional``.
In the expression, a value of ``Optional`` type is always converted
to a value of the element type automatically.
For example a value of the ``Optional<String>`` type is treated as a value of ``String`` type.
Therefore, we can't call methods of ``Optional`` type,
nor do we call methods which have an ``Optional`` type in the parameters.
When you want to check existence of a value, use ``/*%if optional != null */``
instead of ``/*%if optional.isPresent() */``.
The same is true for ``java.util.OptionalInt``, ``java.util.OptionalDouble``,
and ``java.util.OptionalLong``.
Literals
========
You can use the following literals:
+----------+----------------------+
| Literal | Type |
+==========+======================+
| null | void |
+----------+----------------------+
| true | boolean |
+----------+----------------------+
| false | boolean |
+----------+----------------------+
| 10 | int |
+----------+----------------------+
| 10L | long |
+----------+----------------------+
| 0.123F | float |
+----------+----------------------+
| 0.123D | double |
+----------+----------------------+
| 0.123B | java.math.BigDecimal |
+----------+----------------------+
| 'a' | char |
+----------+----------------------+
| "a" | java.lang.String |
+----------+----------------------+
The numeral types are distinguished by suffix letters such as ``L`` or ``F``
at the end of the literals. The suffixes must be capital letters.
.. code-block:: sql
select * from employee where
/*%if employeeName != null && employeeName.length() > 10 */
employee_name = /* employeeName */'smith'
/*%end*/
Comparison operators
====================
You can use the following comparison operators:
+-----------+-------------------------------------+
| Operator | Description |
+===========+=====================================+
| == | Equal to operator |
+-----------+-------------------------------------+
| != | Not equal to operator |
+-----------+-------------------------------------+
| < | Less than operator |
+-----------+-------------------------------------+
| <= | Less than or equal to operator |
+-----------+-------------------------------------+
| > | Greater than operator |
+-----------+-------------------------------------+
| >= | Greater than or equal to operator |
+-----------+-------------------------------------+
To use comparison operators, operands must implement ``java.lang.Comparable``.
The operands for ``<``, ``<=``, ``>`` and ``>=`` must not be ``null``.
.. code-block:: sql
select * from employee where
/*%if employeeName.indexOf("s") > -1 */
employee_name = /* employeeName */'smith'
/*%end*/
Logical operators
=================
You can use the following logical operators:
========= ===========================
Operator Description
========= ===========================
! Logical complement operator
&& Conditional-AND operator
|| Conditional-OR operator
========= ===========================
With parentheses, you can override the precedence of operators.
.. code-block:: sql
select * from employee where
/*%if (departmentId == null || managerId == null) && employeeName != null */
employee_name = /* employeeName */'smith'
/*%end*/
Arithmetic operators
====================
You can use the following arithmetic operators:
+----------+----------------------------+
| Operator | Description |
+==========+============================+
| \+ | Additive operator |
+----------+----------------------------+
| \- | Subtraction operator |
+----------+----------------------------+
| \* | Multiplication operator |
+----------+----------------------------+
| / | Division operator |
+----------+----------------------------+
| % | Remainder operator |
+----------+----------------------------+
Operands must be of a numeric type.
.. code-block:: sql
select * from employee where
salary = /* salary + 1000 */0
String concatenation operator
=============================
You can concatenate characters using a concatenation operator ``+``.
The operand must be one of the following types:
* java.lang.String
* java.lang.Character
* char
.. code-block:: sql
select * from employee where
employee_name like /* employeeName + "_" */'smith'
Calling instance methods
========================
You can call instance methods with the method names separated by dots ``.``.
The method visibility must be public.
.. code-block:: sql
select * from employee where
/*%if employeeName.startsWith("s") */
employee_name = /* employeeName */'smith'
/*%end*/
If the method has no argument, specify ``()`` after the method name.
.. code-block:: sql
select * from employee where
/*%if employeeName.length() > 10 */
employee_name = /* employeeName */'smith'
/*%end*/
Accessing to instance fields
============================
You can access instance fields with the field names separated by dots ``.``.
Even if the visibility is private, you can access it.
.. code-block:: sql
select * from employee where
employee_name = /* employee.employeeName */'smith'
Calling static methods
======================
You can call static methods by continuing the method names
with the fully qualified class names enclosed in ``@``.
The method visibility must be public.
.. code-block:: sql
select * from employee where
/*%if @java.util.regex.Pattern@matches("^[a-z]*$", employeeName) */
employee_name = /* employeeName */'smith'
/*%end*/
Accessing to static fields
==========================
You can access static fields by continuing the field name
with the fully qualified class name enclosed in ``@``.
Even if the visibility is private, you can access it.
.. code-block:: sql
select * from employee where
/*%if employeeName.length() < @java.lang.Byte@MAX_VALUE */
employee_name = /* employeeName */'smith'
/*%end*/
Using built-in functions
========================
Built-in functions are utilities mainly for changing values of binding variables
before binding them to SQL.
For example, when you run a prefix search with a LIKE clause,
you can write like this:
.. code-block:: sql
select * from employee where
employee_name like /* @prefix(employee.employeeName) */'smith' escape '$'
``@prefix(employee.employeeName)`` means that we pass ``employee.employeeName``
to the ``@prefix`` function.
The ``@prefix`` function converts the character sequence which is received by the parameter
to a string for forward match search.
It also escapes special characters.
For example, if the value of ``employee.employeeName`` is ``ABC``, it's converted to ``ABC%``.
If the value of ``employee.employeeName`` contains ``%`` such as ``AB%C``,
the ``%`` is escaped with a default escape sequence ``$``,
therefore the value is converted to ``AB$%C%``.
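The conversion performed by ``@prefix`` can be pictured with a small stand-alone sketch (illustrative only, written in Python rather than Doma's Java; the real logic lives in ``ExpressionFunctions``):

```python
def prefix(text, escape_char='$'):
    """Sketch of the @prefix built-in: escape LIKE wildcards with
    the escape character, then append a trailing wildcard."""
    if text is None:
        return None
    out = []
    for ch in text:
        # Escape the LIKE wildcards and the escape character itself.
        if ch in ('%', '_', escape_char):
            out.append(escape_char)
        out.append(ch)
    return ''.join(out) + '%'

print(prefix('ABC'))   # prints: ABC%
print(prefix('AB%C'))  # prints: AB$%C%
```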
You can use the following function signatures:
String @escape(CharSequence text, char escapeChar = '$')
Escapes the character sequence for LIKE operation.
The return value is a string which is a result of escaping the character sequence.
If ``escapeChar`` isn't specified, ``$`` is used as a default escape sequence.
It returns ``null`` if you pass ``null`` as a parameter.
String @prefix(CharSequence prefix, char escapeChar = '$')
Converts the character sequence for prefix search.
The return value is a string which is a result of escaping the character sequence
and adding a wild card character at the end.
If ``escapeChar`` isn't specified, ``$`` is used as a default escape sequence.
It returns ``null`` if you pass ``null`` as a parameter.
String @infix(CharSequence infix, char escapeChar = '$')
Converts the character sequence for infix search.
The return value is a string which is a result of escaping the character sequence
and adding wild card characters at the beginning and the end.
If ``escapeChar`` isn't specified, ``$`` is used as a default escape sequence.
It returns ``null`` if you pass ``null`` as a parameter.
String @suffix(CharSequence suffix, char escapeChar = '$')
Converts the character sequence for suffix search.
The return value is a string which is a result of escaping the character sequence
and adding a wild card character at the beginning.
If ``escapeChar`` isn't specified, ``$`` is used as a default escape sequence.
It returns ``null`` if you pass ``null`` as a parameter.
java.util.Date @roundDownTimePart(java.util.Date date)
Rounds down the time part.
The return value is a new Date which is rounded down the time part.
It returns ``null`` if you pass ``null`` as a parameter.
java.sql.Date @roundDownTimePart(java.sql.Date date)
Rounds down the time part.
The return value is a new Date which is rounded down the time part.
It returns ``null`` if you pass ``null`` as a parameter.
java.sql.Timestamp @roundDownTimePart(java.sql.Timestamp timestamp)
Rounds down the time part.
The return value is a new Timestamp which is rounded down the time part.
It returns ``null`` if you pass ``null`` as a parameter.
java.util.Date @roundUpTimePart(java.util.Date date)
Rounds up the time part.
The return value is a new Date which is rounded up the time part.
It returns ``null`` if you pass ``null`` as a parameter.
java.sql.Date @roundUpTimePart(java.sql.Date date)
Rounds up the time part.
The return value is a new Date which is rounded up the time part.
It returns ``null`` if you pass ``null`` as a parameter.
java.sql.Timestamp @roundUpTimePart(java.sql.Timestamp timestamp)
Rounds up the time part.
The return value is a new Timestamp which is rounded up the time part.
It returns ``null`` if you pass ``null`` as a parameter.
boolean @isEmpty(CharSequence charSequence)
Returns ``true`` if the character sequence is ``null`` or the length is ``0``.
boolean @isNotEmpty(CharSequence charSequence)
Returns ``true`` if the character sequence isn't ``null`` and the length isn't ``0``.
boolean @isBlank(CharSequence charSequence)
Returns ``true`` only if the character sequence is ``null``, the length is ``0``,
or the sequence is formed with whitespaces only.
boolean @isNotBlank(CharSequence charSequence)
Returns ``true`` if the character sequence isn't ``null``, the length isn't ``0``,
and the sequence isn't formed with whitespaces only.
These functions are correspond to the methods of ``org.seasar.doma.expr.ExpressionFunctions``.
Using custom functions
======================
You can define and use your own functions.
You need to follow these settings when you use custom functions which you define by yourself:
* The function is defined as a method of a class which implements
``org.seasar.doma.expr.ExpressionFunctions``.
* The method is a public instance method.
* The class is registered as an option in :doc:`annotation-processing`.
The key of the option is ``doma.expr.functions``.
* The instance of the class you create is used in an RDBMS dialect in your configuration class
(The implementations of RDBMS dialect provided by Doma can receive
``ExpressionFunctions`` in the constructor).
To call a custom function, add ``@`` at the beginning of the function name like built-in functions.
For example, you can call ``myfunc`` function like this:
.. code-block:: sql
select * from employee where
employee_name = /* @myfunc(employee.employeeName) */'smith'
.. Source: docs/vcd.rst (christian-krieg/pyvcd, MIT)

=======
``vcd``
=======
.. automodule:: vcd
.. Source: Cellar/cmake/3.2.3/share/cmake/Help/variable/MSVC_IDE.rst (Infornia/BrewStoryBro, BSD-2-Clause)

MSVC_IDE
--------
True when using the Microsoft Visual C IDE
Set to true when the target platform is the Microsoft Visual C IDE, as
opposed to the command line compiler.
.. Source: elements/extjs/README.rst (UoA-eResearch/sahara-image-elements, Apache-2.0)

=====
extjs
=====
This element downloads extjs from its website, caching it so it is
not downloaded every time, and optionally unpacking it.
Environment Variables
---------------------
The element can be configured by exporting variables using a
`environment.d` script.
EXTJS_DESTINATION_DIR
:Required: Yes
:Description: The directory where to extract (or copy) extjs; must be
an absolute directory within the image. The directory is created if not
existing already.
:Example: ``EXTJS_DESTINATION_DIR=/usr/share/someapp``
EXTJS_DOWNLOAD_URL
:Required: No
:Default: ``https://tarballs.openstack.org/sahara-extra/dist/common-artifacts/ext-2.2.zip``
:Description: The URL from where to download extjs.
EXTJS_NO_UNPACK
:Required: No
:Default: *unset*
:Description: If set to 1, then the extjs tarball is simply copied to the
location specified by ``EXTJS_DESTINATION_DIR``.
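A minimal ``environment.d`` script using these variables might look like this (the destination path is just an example):

```shell
#!/bin/bash
# Example environment.d script for the extjs element.
# EXTJS_DESTINATION_DIR is required; the other variables are optional.
export EXTJS_DESTINATION_DIR=/usr/share/someapp
export EXTJS_DOWNLOAD_URL=https://tarballs.openstack.org/sahara-extra/dist/common-artifacts/ext-2.2.zip
# Uncomment to copy the archive instead of unpacking it:
# export EXTJS_NO_UNPACK=1
echo "$EXTJS_DESTINATION_DIR"
```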
.. Source: README.rst (zopefoundation/zc.recipe.filestorage, ZPL-2.1)

===================================
Recipe for setting up a filestorage
===================================
This recipe can be used to define a file-storage. It creates a ZConfig
file-storage database specification that can be used by other recipes to
generate ZConfig configuration files.
This recipe takes an optional path option. If none is given, it creates and
uses a subdirectory of the buildout parts directory with the same name as the
part.
The recipe records a zconfig option for use by other recipes.
We'll show a couple of examples, using a dictionary as a simulated buildout
object:
>>> import zc.recipe.filestorage
>>> buildout = dict(
... buildout = {
... 'directory': '/buildout',
... },
... db = {
... 'path': 'foo/Main.fs',
... },
... )
>>> recipe = zc.recipe.filestorage.Recipe(
... buildout, 'db', buildout['db'])
>>> print(buildout['db']['path'])
/buildout/foo/Main.fs
>>> from six import print_
>>> print_(buildout['db']['zconfig'], end='')
<zodb>
<filestorage>
path /buildout/foo/Main.fs
</filestorage>
</zodb>
>>> recipe.install()
()
>>> import tempfile
>>> d = tempfile.mkdtemp()
>>> buildout = dict(
... buildout = {
... 'parts-directory': d,
... },
... db = {},
... )
>>> recipe = zc.recipe.filestorage.Recipe(
... buildout, 'db', buildout['db'])
>>> print(buildout['db']['path'])
/tmp/tmpQo0DTB/db/Data.fs
>>> print_(buildout['db']['zconfig'], end='')
<zodb>
<filestorage>
path /tmp/tmpQo0DTB/db/Data.fs
</filestorage>
</zodb>
>>> recipe.install()
()
>>> import os
>>> os.listdir(d)
['db']
The update method doesn't do much, as the database part's directory
already exists, but the method is present, so buildout doesn't complain and
doesn't accidentally run install() again:
>>> recipe.update()
If the storage's directory is removed, it is re-added by the update method:
>>> os.rmdir(os.path.join(d, 'db'))
>>> os.listdir(d)
[]
>>> recipe.update()
>>> os.listdir(d)
['db']
This is useful in development when the directory containing the database is
removed in order to start the database from scratch.
.. Source: README.rst (dnmellen/dj-warning-forms, BSD-3-Clause)

=============================
Django Warning Forms
=============================
|PyPI| |build| |coverage|
.. |PyPI| image:: https://img.shields.io/pypi/v/dj-warning-forms
.. |build| image:: https://github.com/dnmellen/dj-warning-forms/actions/workflows/python-package.yml/badge.svg?branch=master
.. |coverage| image:: https://img.shields.io/codecov/c/gh/dnmellen/dj-warning-forms
Add warnings to your Django Forms easily
.. image:: https://github.com/dnmellen/dj-warning-forms/blob/master/docs/demo.gif?raw=true
Documentation
-------------
The full documentation is at https://dj-warning-forms.readthedocs.io.
Quickstart
----------
Install Django Warning Forms::
pip install dj-warning-forms
Use the form mixin in any form
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. code-block:: python
from typing import List

from django import forms
from dj_warning_forms.forms import WarningFormMixin, FormFieldWarning
from .models import Poll
class PollForm(WarningFormMixin, forms.ModelForm):
question = forms.CharField(
max_length=200, widget=forms.TextInput(attrs={"autocomplete": "off"})
)
class Meta:
model = Poll
fields = "__all__"
def warning_question(self) -> List[FormFieldWarning]:
if not self.cleaned_data["question"].endswith("?"):
return [
FormFieldWarning(
message="Weird question",
description="This question does not end with a question mark. Are you sure you want to publish this question?", # noqa
)
]
return []
Adding a warning is as simple as adding a method with the ``warning_`` prefix. This method must return a
list of FormFieldWarning objects.
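Under the hood, a mixin like this only has to scan the form for ``warning_``-prefixed methods and aggregate their results. The sketch below is a framework-free illustration of that idea; the ``WarningMixinSketch`` and ``QuestionForm`` names are made up for the example, and this is not the actual ``dj-warning-forms`` implementation:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class FormFieldWarning:
    message: str
    description: str


class WarningMixinSketch:
    """Illustrative stand-in for WarningFormMixin: collects the results
    of every ``warning_<field>`` method defined on the instance."""

    @property
    def warnings(self) -> List[FormFieldWarning]:
        collected = []
        for name in dir(self):
            if name.startswith("warning_"):
                collected.extend(getattr(self, name)())
        return collected


class QuestionForm(WarningMixinSketch):
    def __init__(self, question: str):
        # Stand-in for Django's cleaned_data populated by form validation.
        self.cleaned_data = {"question": question}

    def warning_question(self) -> List[FormFieldWarning]:
        if not self.cleaned_data["question"].endswith("?"):
            return [FormFieldWarning("Weird question", "No question mark.")]
        return []


form = QuestionForm("Is this fine")
print([w.message for w in form.warnings])  # ['Weird question']
```

The discovery-by-prefix principle is all the example aims to show; the real library hooks the same idea into Django's form validation cycle.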
Showing warnings in the template
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can find the list of warnings in ``form.warnings``.

.. code-block:: html

    {% block content %}
    <form action="." method="post">
        {% csrf_token %}
        {{ form }}
        <input type="submit" value="Submit">

        <!-- Customize your form warnings as you wish -->
        {% if form.warnings %}
        <div class="rounded p-2 mt-2 bg-warning">
            <ul>
            {% for warning in form.warnings %}
                <li><b>{{ warning.message }}:</b> {{ warning.description }}</li>
            {% endfor %}
            </ul>
        </div>
        {% endif %}
        <!-- End of form warnings -->
    </form>
    {% endblock %}
Features
--------
- No external dependencies
- Minimal changes needed in your forms
- Easy to customize
Running Tests
-------------
Does the code actually work?
::

    source <YOURVIRTUALENV>/bin/activate
    (myenv) $ pip install tox
    (myenv) $ tox
Development commands
---------------------
::

    pip install -r requirements_dev.txt
    invoke -l
Credits
-------
Tools used in rendering this package:
* Cookiecutter_
* `cookiecutter-djangopackage`_
.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`cookiecutter-djangopackage`: https://github.com/pydanny/cookiecutter-djangopackage
Credits
=======
Development Lead
----------------
* Mark Bell <mcbell@illinois.edu>
Contributors
------------
* Nathan M Dunfield <nmd@illinois.edu> - Code for constructing taut structures on bundles.
================================================
Feature: #59646 - Add TSFE property $requestedId
================================================
Description
===========
A new property called ``$requestedId`` has been added to the frontend's main ``TypoScriptFrontendController``. It stores the page ID as originally requested, before any page ID processing and resolving takes place.
It is accessible via ``$TSFE->getRequestedId()``. Also see the ``$TSFE->fetch_the_id()`` method.
Complete scenario
=================
To explore CQELight as thoroughly as possible while keeping things simple, we are going to build, in a single-application desktop console context, a test program that demonstrates the concepts we covered previously.

The idea is to develop a small application for managing a family tree, in a highly simplified way. We will simply list the members of a family, with their birth information and their date of death.

For the birth information, we only store the date and place of birth. The date of death may be left empty. It is of course impossible for a date of death to be earlier than the date of birth. For this purpose, two people are considered to be born "the same way" if they were born in the same place on the same day.

A family is uniquely identified by its name: there cannot be two families with the same name. For people, in addition to the birth information, we only store the first name. Several people in the same family may share a first name as long as their birth information differs; otherwise it is a duplicate, which is not allowed.

In the following pages, we will explore this subject step by step in order to model it and use CQELight to reach our goal. We are going to build an event-sourced system, separated along CQRS lines and highly extensible.

.. warning:: This scenario is not production ready; it is here to show how to use the tool and the good practices that go with it.

You will find all of the code in `our GitHub repository <https://github.com/cdie/CQELight/tree/master/samples/documentation/2.Geneao>`_.
:github_url: https://github.com/shan18/TensorNet
TensorNet Documentation
=======================
TensorNet is a high-level deep learning library built on top of PyTorch.
Installation
------------
To install and use TensorNet all you have to do is:

.. code-block:: bash

    pip install torch-tensornet
If you want to get the latest version of the code before it is released on PyPI you can install the library from GitHub

.. code-block:: bash

    pip install git+https://github.com/shan18/TensorNet.git#egg=torch-tensornet

.. toctree::
   :maxdepth: 2
   :caption: Contents

   source/data
   source/models
   source/models.loss
   source/models.optimizer
   source/engine
   source/engine.ops
   source/engine.ops.regularizer
   source/gradcam
   source/utils
Contact/Getting Help
--------------------
If you need any help or want to report a bug, raise an `issue <https://github.com/shan18/TensorNet/issues>`_ in the repo.
| 22.780488 | 121 | 0.702355 |
cccb86ff4ce57b0ab2b46ca756251726e2c39cf6 | 14,961 | rst | reStructuredText | advanced_source/torch_script_custom_classes.rst | Ismail-Mustapha/tutorials | 0ccfbf0047db855e93e2aadb43c89c92e89f52b8 | [
"BSD-3-Clause"
] | 6,424 | 2017-01-18T17:57:30.000Z | 2022-03-31T11:43:48.000Z | advanced_source/torch_script_custom_classes.rst | Ismail-Mustapha/tutorials | 0ccfbf0047db855e93e2aadb43c89c92e89f52b8 | [
"BSD-3-Clause"
] | 1,713 | 2017-01-18T18:50:08.000Z | 2022-03-31T14:57:25.000Z | advanced_source/torch_script_custom_classes.rst | Ismail-Mustapha/tutorials | 0ccfbf0047db855e93e2aadb43c89c92e89f52b8 | [
"BSD-3-Clause"
] | 3,932 | 2017-01-18T21:11:46.000Z | 2022-03-31T10:24:24.000Z | Extending TorchScript with Custom C++ Classes
===============================================
This tutorial is a follow-on to the
:doc:`custom operator <torch_script_custom_ops>`
tutorial, and introduces the API we've built for binding C++ classes into TorchScript
and Python simultaneously. The API is very similar to
`pybind11 <https://github.com/pybind/pybind11>`_, and most of the concepts will transfer
over if you're familiar with that system.
Implementing and Binding the Class in C++
-----------------------------------------
For this tutorial, we are going to define a simple C++ class that maintains persistent
state in a member variable.

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/class.cpp
   :language: cpp
   :start-after: BEGIN class
   :end-before: END class
There are several things to note:

- ``torch/custom_class.h`` is the header you need to include to extend TorchScript
  with your custom class.
- Notice that whenever we are working with instances of the custom
  class, we do it via instances of ``c10::intrusive_ptr<>``. Think of ``intrusive_ptr``
  as a smart pointer like ``std::shared_ptr``, but the reference count is stored
  directly in the object, as opposed to a separate metadata block (as is done in
  ``std::shared_ptr``). ``torch::Tensor`` internally uses the same pointer type,
  and custom classes have to also use this pointer type so that we can
  consistently manage different object types.
- The second thing to notice is that the user-defined class must inherit from
  ``torch::CustomClassHolder``. This ensures that the custom class has space to
  store the reference count.
Now let's take a look at how we will make this class visible to TorchScript, a process called
*binding* the class:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/class.cpp
   :language: cpp
   :start-after: BEGIN binding
   :end-before: END binding
   :append:
       ;
       }
Building the Example as a C++ Project With CMake
------------------------------------------------
Now, we're going to build the above C++ code with the `CMake
<https://cmake.org>`_ build system. First, take all the C++ code
we've covered so far and place it in a file called ``class.cpp``.
Then, write a simple ``CMakeLists.txt`` file and place it in the
same directory. Here is what ``CMakeLists.txt`` should look like:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/CMakeLists.txt
   :language: cmake
Also, create a ``build`` directory. Your file tree should look like this::

  custom_class_project/
    class.cpp
    CMakeLists.txt
    build/
We assume you've set up your environment in the same way as described in
the :doc:`previous tutorial <torch_script_custom_ops>`.
Go ahead and invoke cmake and then make to build the project:

.. code-block:: shell

  $ cd build
  $ cmake -DCMAKE_PREFIX_PATH="$(python -c 'import torch.utils; print(torch.utils.cmake_prefix_path)')" ..
  -- The C compiler identification is GNU 7.3.1
  -- The CXX compiler identification is GNU 7.3.1
  -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc
  -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++
  -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Looking for pthread.h
  -- Looking for pthread.h - found
  -- Looking for pthread_create
  -- Looking for pthread_create - not found
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - found
  -- Found Threads: TRUE
  -- Found torch: /torchbind_tutorial/libtorch/lib/libtorch.so
  -- Configuring done
  -- Generating done
  -- Build files have been written to: /torchbind_tutorial/build

  $ make -j
  Scanning dependencies of target custom_class
  [ 50%] Building CXX object CMakeFiles/custom_class.dir/class.cpp.o
  [100%] Linking CXX shared library libcustom_class.so
  [100%] Built target custom_class
What you'll find is there is now (among other things) a dynamic library
file present in the build directory. On Linux, this is probably named
``libcustom_class.so``. So the file tree should look like::

  custom_class_project/
    class.cpp
    CMakeLists.txt
    build/
      libcustom_class.so
Using the C++ Class from Python and TorchScript
-----------------------------------------------
Now that we have our class and its registration compiled into an ``.so`` file,
we can load that `.so` into Python and try it out. Here's a script that
demonstrates that:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/custom_test.py
   :language: python
Saving, Loading, and Running TorchScript Code Using Custom Classes
------------------------------------------------------------------
We can also use custom-registered C++ classes in a C++ process using
libtorch. As an example, let's define a simple ``nn.Module`` that
instantiates and calls a method on our MyStackClass class:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/save.py
   :language: python
``foo.pt`` in our filesystem now contains the serialized TorchScript
program we've just defined.
Now, we're going to define a new CMake project to show how you can load
this model and its required .so file. For a full treatment of how to do this,
please have a look at the `Loading a TorchScript Model in C++ Tutorial <https://pytorch.org/tutorials/advanced/cpp_export.html>`_.
Similarly to before, let's create a file structure containing the following::

  cpp_inference_example/
    infer.cpp
    CMakeLists.txt
    foo.pt
    build/
    custom_class_project/
      class.cpp
      CMakeLists.txt
      build/
Notice we've copied over the serialized ``foo.pt`` file, as well as the source
tree from the ``custom_class_project`` above. We will be adding the
``custom_class_project`` as a dependency to this C++ project so that we can
build the custom class into the binary.
Let's populate ``infer.cpp`` with the following:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/infer.cpp
   :language: cpp
And similarly let's define our CMakeLists.txt file:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/CMakeLists.txt
   :language: cmake
You know the drill: ``cd build``, ``cmake``, and ``make``:

.. code-block:: shell

  $ cd build
  $ cmake -DCMAKE_PREFIX_PATH="$(python -c 'import torch.utils; print(torch.utils.cmake_prefix_path)')" ..
  -- The C compiler identification is GNU 7.3.1
  -- The CXX compiler identification is GNU 7.3.1
  -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc
  -- Check for working C compiler: /opt/rh/devtoolset-7/root/usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++
  -- Check for working CXX compiler: /opt/rh/devtoolset-7/root/usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Looking for pthread.h
  -- Looking for pthread.h - found
  -- Looking for pthread_create
  -- Looking for pthread_create - not found
  -- Looking for pthread_create in pthreads
  -- Looking for pthread_create in pthreads - not found
  -- Looking for pthread_create in pthread
  -- Looking for pthread_create in pthread - found
  -- Found Threads: TRUE
  -- Found torch: /local/miniconda3/lib/python3.7/site-packages/torch/lib/libtorch.so
  -- Configuring done
  -- Generating done
  -- Build files have been written to: /cpp_inference_example/build

  $ make -j
  Scanning dependencies of target custom_class
  [ 25%] Building CXX object custom_class_project/CMakeFiles/custom_class.dir/class.cpp.o
  [ 50%] Linking CXX shared library libcustom_class.so
  [ 50%] Built target custom_class
  Scanning dependencies of target infer
  [ 75%] Building CXX object CMakeFiles/infer.dir/infer.cpp.o
  [100%] Linking CXX executable infer
  [100%] Built target infer
And now we can run our exciting C++ binary:

.. code-block:: shell

  $ ./infer
  momfoobarbaz
Incredible!
Moving Custom Classes To/From IValues
-------------------------------------
It's also possible that you may need to move custom classes into or out of
``IValue`` instances, such as when you take or return ``IValue`` instances from
TorchScript methods or you want to instantiate a custom class attribute in C++.
For creating an ``IValue`` from a custom C++ class instance:

- ``torch::make_custom_class<T>()`` provides an API similar to ``c10::intrusive_ptr<T>``
  in that it will take whatever set of arguments you provide to it, call the constructor
  of ``T`` that matches that set of arguments, and wrap that instance up and return it.
  However, instead of returning just a pointer to a custom class object, it returns
  an ``IValue`` wrapping the object. You can then pass this ``IValue`` directly to
  TorchScript.
- In the event that you already have an ``intrusive_ptr`` pointing to your class, you
  can directly construct an IValue from it using the constructor ``IValue(intrusive_ptr<T>)``.

For converting ``IValue`` back to custom classes:

- ``IValue::toCustomClass<T>()`` will return an ``intrusive_ptr<T>`` pointing to the
  custom class that the ``IValue`` contains. Internally, this function is checking
  that ``T`` is registered as a custom class and that the ``IValue`` does in fact contain
  a custom class. You can check whether the ``IValue`` contains a custom class manually by
  calling ``isCustomClass()``.
Defining Serialization/Deserialization Methods for Custom C++ Classes
---------------------------------------------------------------------
If you try to save a ``ScriptModule`` with a custom-bound C++ class as
an attribute, you'll get the following error:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/export_attr.py
   :language: python

.. code-block:: shell

  $ python export_attr.py
  RuntimeError: Cannot serialize custom bound C++ class __torch__.torch.classes.my_classes.MyStackClass. Please define serialization methods via def_pickle for this class. (pushIValueImpl at ../torch/csrc/jit/pickler.cpp:128)
This is because TorchScript cannot automatically figure out what information to
save from your C++ class. You must specify that manually. The way to do that
is to define ``__getstate__`` and ``__setstate__`` methods on the class using
the special ``def_pickle`` method on ``class_``.

.. note::

  The semantics of ``__getstate__`` and ``__setstate__`` in TorchScript are
  equivalent to that of the Python pickle module. You can
  `read more <https://github.com/pytorch/pytorch/blob/master/torch/csrc/jit/docs/serialization.md#getstate-and-setstate>`_
  about how we use these methods.
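Since the semantics mirror Python's pickle protocol, the behaviour can be illustrated with a plain-Python sketch (``MyStackSketch`` is a made-up stand-in, not TorchScript code): ``__getstate__`` produces the serializable state, and ``__setstate__`` rebuilds an instance from it, which is exactly the pair of hooks ``def_pickle`` wires up.

```python
class MyStackSketch:
    """Plain-Python analogue of a custom class with pickle hooks."""

    def __init__(self, items):
        self.items = list(items)

    def __getstate__(self):
        # Called on save: return only the serializable state.
        return list(self.items)

    def __setstate__(self, state):
        # Called on load: rebuild the instance from the saved state.
        self.items = list(state)


original = MyStackSketch(["foo", "bar"])
state = original.__getstate__()                  # what serialization stores
restored = MyStackSketch.__new__(MyStackSketch)  # allocate without __init__
restored.__setstate__(state)                     # what deserialization replays
print(restored.items)  # ['foo', 'bar']
```

In TorchScript, the lambdas passed to ``def_pickle`` play the roles of these two methods.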
Here is an example of the ``def_pickle`` call we can add to the registration of
``MyStackClass`` to include serialization methods:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/class.cpp
   :language: cpp
   :start-after: BEGIN def_pickle
   :end-before: END def_pickle

.. note::

  We take a different approach from pybind11 in the pickle API. Whereas pybind11
  has a special function ``pybind11::pickle()`` which you pass into ``class_::def()``,
  we have a separate method ``def_pickle`` for this purpose. This is because the
  name ``torch::jit::pickle`` was already taken, and we didn't want to cause confusion.
Once we have defined the (de)serialization behavior in this way, our script can
now run successfully:

.. code-block:: shell

  $ python ../export_attr.py
  testing
Defining Custom Operators that Take or Return Bound C++ Classes
---------------------------------------------------------------
Once you've defined a custom C++ class, you can also use that class
as an argument or return from a custom operator (i.e. free functions). Suppose
you have the following free function:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/class.cpp
   :language: cpp
   :start-after: BEGIN free_function
   :end-before: END free_function
You can register it running the following code inside your ``TORCH_LIBRARY``
block:

.. literalinclude:: ../advanced_source/torch_script_custom_classes/custom_class_project/class.cpp
   :language: cpp
   :start-after: BEGIN def_free
   :end-before: END def_free
Refer to the `custom op tutorial <https://pytorch.org/tutorials/advanced/torch_script_custom_ops.html>`_
for more details on the registration API.
Once this is done, you can use the op like the following example:

.. code-block:: python

  class TryCustomOp(torch.nn.Module):
      def __init__(self):
          super(TryCustomOp, self).__init__()
          self.f = torch.classes.my_classes.MyStackClass(["foo", "bar"])

      def forward(self):
          return torch.ops.my_classes.manipulate_instance(self.f)

.. note::

  Registration of an operator that takes a C++ class as an argument requires that
  the custom class has already been registered. You can enforce this by
  making sure the custom class registration and your free function definitions
  are in the same ``TORCH_LIBRARY`` block, and that the custom class
  registration comes first. In the future, we may relax this requirement,
  so that these can be registered in any order.
Conclusion
----------
This tutorial walked you through how to expose a C++ class to TorchScript
(and by extension Python), how to register its methods, how to use that
class from Python and TorchScript, and how to save and load code using
the class and run that code in a standalone C++ process. You are now ready
to extend your TorchScript models with C++ classes that interface with
third party C++ libraries or implement any other use case that requires the
lines between Python, TorchScript and C++ to blend smoothly.
As always, if you run into any problems or have questions, you can use our
`forum <https://discuss.pytorch.org/>`_ or `GitHub issues
<https://github.com/pytorch/pytorch/issues>`_ to get in touch. Also, our
`frequently asked questions (FAQ) page
<https://pytorch.org/cppdocs/notes/faq.html>`_ may have helpful information.
========================================
Clang 12.0.0 (In-Progress) Release Notes
========================================
.. contents::
   :local:
   :depth: 2
Written by the `LLVM Team <https://llvm.org/>`_

.. warning::

   These are in-progress notes for the upcoming Clang 12 release.
   Release notes for previous releases can be found on
   `the Download Page <https://releases.llvm.org/download.html>`_.
Introduction
============
This document contains the release notes for the Clang C/C++/Objective-C
frontend, part of the LLVM Compiler Infrastructure, release 12.0.0. Here we
describe the status of Clang in some detail, including major
improvements from the previous release and new feature work. For the
general LLVM release notes, see `the LLVM
documentation <https://llvm.org/docs/ReleaseNotes.html>`_. All LLVM
releases may be downloaded from the `LLVM releases web
site <https://llvm.org/releases/>`_.
For more information about Clang or LLVM, including information about the
latest release, please see the `Clang Web Site <https://clang.llvm.org>`_ or the
`LLVM Web Site <https://llvm.org>`_.
Note that if you are reading this file from a Git checkout or the
main Clang web page, this document applies to the *next* release, not
the current one. To see the release notes for a specific release, please
see the `releases page <https://llvm.org/releases/>`_.
What's New in Clang 12.0.0?
===========================
Some of the major new features and improvements to Clang are listed
here. Generic improvements to Clang as a whole or to its underlying
infrastructure are described first, followed by language-specific
sections with improvements to Clang's support for those languages.
Major New Features
------------------
- ...
Improvements to Clang's diagnostics
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- ...
Non-comprehensive list of changes in this release
-------------------------------------------------
- The builtin intrinsics ``__builtin_bitreverse8``, ``__builtin_bitreverse16``,
``__builtin_bitreverse32`` and ``__builtin_bitreverse64`` may now be used
within constant expressions.
- The builtin intrinsics ``__builtin_rotateleft8``, ``__builtin_rotateleft16``,
``__builtin_rotateleft32`` and ``__builtin_rotateleft64`` may now be used
within constant expressions.
- The builtin intrinsics ``__builtin_rotateright8``, ``__builtin_rotateright16``,
``__builtin_rotateright32`` and ``__builtin_rotateright64`` may now be used
within constant expressions.
New Compiler Flags
------------------
- ...
- AArch64 options ``-moutline-atomics``, ``-mno-outline-atomics`` to enable
and disable calls to helper functions implementing atomic operations. These
out-of-line helpers like '__aarch64_cas8_relax' will detect at runtime
AArch64 Large System Extensions (LSE) availability and either use their
atomic instructions, or falls back to LL/SC loop. These options do not apply
if the compilation target supports LSE. Atomic instructions are used directly
in that case. The option's behaviour mirrors GCC, the helpers are implemented
both in compiler-rt and libgcc.
- New option ``-fbinutils-version=`` specifies the targeted binutils version.
For example, ``-fbinutils-version=2.35`` means compatibility with GNU as/ld
before 2.35 is not needed: new features can be used and there is no need to
work around old GNU as/ld bugs.
Deprecated Compiler Flags
-------------------------
The following options are deprecated and ignored. They will be removed in
future versions of Clang.
- The clang-cl ``/fallback`` flag, which made clang-cl invoke Microsoft Visual
C++ on files it couldn't compile itself, has been deprecated. It will be
removed in Clang 13.
- ...
Modified Compiler Flags
-----------------------
- On ELF, ``-gz`` now defaults to ``-gz=zlib`` with the integrated assembler.
It produces ``SHF_COMPRESSED`` style compression of debug information. GNU
binutils 2.26 or newer, or lld is required to link produced object files. Use
``-gz=zlib-gnu`` to get the old behavior.
- Now that ``this`` pointers are tagged with ``nonnull`` and ``dereferenceable(N)``,
  ``-fno-delete-null-pointer-checks`` has gained the power to remove the
  ``nonnull`` attribute on ``this`` for configurations that need it to be nullable.
- ``-gsplit-dwarf`` no longer implies ``-g2``.
- ``-fasynchronous-unwind-tables`` is now the default on Linux AArch64/PowerPC.
This behavior matches newer GCC.
(`D91760 <https://reviews.llvm.org/D91760>`_)
(`D92054 <https://reviews.llvm.org/D92054>`_)
- Support has been added for the following processors (command-line identifiers
in parentheses):
- Arm Cortex-A78C (cortex-a78c).
- Arm Cortex-R82 (cortex-r82).
- Arm Neoverse V1 (neoverse-v1).
- Arm Neoverse N2 (neoverse-n2).
- Fujitsu A64FX (a64fx).
For example, to select architecture support and tuning for Neoverse-V1 based
systems, use ``-mcpu=neoverse-v1``.
Removed Compiler Flags
-------------------------
The following options no longer exist.
- clang-cl's ``/Zd`` flag no longer exists. But ``-gline-tables-only`` still
  exists and does the same thing.
New Pragmas in Clang
--------------------
- ...
Modified Pragmas in Clang
-------------------------
- The "#pragma clang loop vectorize_width" has been extended to support an
optional 'fixed|scalable' argument, which can be used to indicate that the
compiler should use fixed-width or scalable vectorization. Fixed-width is
assumed by default.
Scalable or vector length agnostic vectorization is an experimental feature
for targets that support scalable vectors. For more information please refer
to the Clang Language Extensions documentation.
Attribute Changes in Clang
--------------------------
- Added support for the C++20 likelihood attributes ``[[likely]]`` and
``[[unlikely]]``. As an extension they can be used in C++11 and newer.
This extension is enabled by default.
Windows Support
---------------
- Implicitly add ``.exe`` suffix for MinGW targets, even when cross compiling.
(This matches a change from GCC 8.)
- Windows on Arm64: programs using the C standard library's setjmp and longjmp
functions may crash with a "Security check failure or stack buffer overrun"
exception. To workaround (with reduced security), compile with
/guard:cf,nolongjmp.
- Windows on Arm64: LLVM 12 adds official binary release hosted on
Windows on Arm64. The binary is built and tested by Linaro alongside
AArch64 and ARM 32-bit Linux binary releases. This first WoA release
includes Clang compiler, LLD Linker, and compiler-rt runtime libraries.
Work on LLDB, sanitizer support, OpenMP, and other features is in progress
and will be included in future Windows on Arm64 LLVM releases.
C Language Changes in Clang
---------------------------
- ...
C++ Language Changes in Clang
-----------------------------
- ...
C++1z Feature Support
^^^^^^^^^^^^^^^^^^^^^
...
Objective-C Language Changes in Clang
-------------------------------------
OpenCL Kernel Language Changes in Clang
---------------------------------------
- Improved online documentation: :doc:`UsersManual` and :doc:`OpenCLSupport`
pages.
- Added ``-cl-std=CL3.0`` and predefined version macro for OpenCL 3.0.
- Added ``-cl-std=CL1.0`` and mapped to the existing OpenCL 1.0 functionality.
- Improved OpenCL extension handling per target.
- Added clang extension for function pointers ``__cl_clang_function_pointers``
and variadic functions ``__cl_clang_variadic_functions``, more details can be
found in :doc:`LanguageExtensions`.
- Removed extensions without kernel language changes:
``cl_khr_select_fprounding_mode``, ``cl_khr_gl_sharing``, ``cl_khr_icd``,
``cl_khr_gl_event``, ``cl_khr_d3d10_sharing``, ``cl_khr_context_abort``,
``cl_khr_d3d11_sharing``, ``cl_khr_dx9_media_sharing``,
``cl_khr_image2d_from_buffer``, ``cl_khr_initialize_memory``,
``cl_khr_gl_depth_images``, ``cl_khr_spir``, ``cl_khr_egl_event``,
``cl_khr_egl_image``, ``cl_khr_terminate_context``.
- Improved diagnostics for unevaluated ``vec_step`` expression.
- Allow nested pointers (e.g. pointer-to-pointer) kernel arguments beyond OpenCL
1.2.
- Added ``global_device`` and ``global_host`` address spaces for USM
allocations.
Miscellaneous improvements in C++ for OpenCL support:
- Added diagnostics for pointers to member functions and references to
functions.
- Added support of ``vec_step`` builtin.
- Fixed ICE on address spaces with forwarding references and templated copy
constructors.
- Removed warning for variadic macro use.
ABI Changes in Clang
--------------------
OpenMP Support in Clang
-----------------------
- ...
CUDA Support in Clang
---------------------
- ...
X86 Support in Clang
--------------------
- The x86 intrinsics ``_mm_popcnt_u32``, ``_mm_popcnt_u64``, ``_popcnt32``,
``_popcnt64``, ``__popcntd`` and ``__popcntq`` may now be used within
constant expressions.
- The x86 intrinsics ``_bit_scan_forward``, ``__bsfd`` and ``__bsfq`` may now
be used within constant expressions.
- The x86 intrinsics ``_bit_scan_reverse``, ``__bsrd`` and ``__bsrq`` may now
be used within constant expressions.
- The x86 intrinsics ``__bswap``, ``__bswapd``, ``__bswap64`` and ``__bswapq``
may now be used within constant expressions.
- The x86 intrinsics ``_castf32_u32``, ``_castf64_u64``, ``_castu32_f32`` and
``_castu64_f64`` may now be used within constant expressions.
- The x86 intrinsics ``__rolb``, ``__rolw``, ``__rold``, ``__rolq`, ``_rotl``,
``_rotwl`` and ``_lrotl`` may now be used within constant expressions.
- The x86 intrinsics ``__rorb``, ``__rorw``, ``__rord``, ``__rorq``, ``_rotr``,
``_rotwr`` and ``_lrotr`` may now be used within constant expressions.
- Support for ``-march=alderlake``, ``-march=sapphirerapids`` and
``-march=znver3`` was added.
- Support for ``-march=x86-64-v[234]`` has been added.
See :doc:`UsersManual` for details about these micro-architecture levels.
- The ``-mtune`` command line option is no longer ignored for X86. It can be used
  to request microarchitectural optimizations independently of ``-march``. ``-march=<cpu>``
  implies ``-mtune=<cpu>``. ``-mtune=generic`` is the default with no ``-march`` or ``-mtune``
  specified.
- Support for ``HRESET`` instructions has been added.
- Support for ``UINTR`` instructions has been added.
- Support for ``AVXVNNI`` instructions has been added.
Internal API Changes
--------------------
These are major API changes that have happened since the 11.0.0 release of
Clang. If upgrading an external codebase that uses Clang as a library,
this section should help get you past the largest hurdles of upgrading.
- ...
Build System Changes
--------------------
These are major changes to the build system that have happened since the 11.0.0
release of Clang. Users of the build system should adjust accordingly.
- ...
AST Matchers
------------
- The ``mapAnyOf()`` matcher was added. This allows convenient matching of
different AST nodes which have a compatible matcher API. For example,
``mapAnyOf(ifStmt, forStmt).with(hasCondition(integerLiteral()))``
matches any ``IfStmt`` or ``ForStmt`` with an integer literal as the
condition.
- The ``binaryOperation()`` matcher allows matching expressions which
appear like binary operators in the code, even if they are really
``CXXOperatorCallExpr`` for example. It is based on the ``mapAnyOf()``
matcher functionality. The matcher API for the latter node has been
extended with ``hasLHS()`` etc to facilitate the abstraction.
- Matcher API for ``CXXRewrittenBinaryOperator`` has been added. In addition
to explicit matching with the ``cxxRewrittenBinaryOperator()`` matcher, the
``binaryOperation()`` matches on nodes of this type.
- The behavior of ``TK_IgnoreUnlessSpelledInSource`` with the ``traverse()``
matcher has been changed to no longer match on template instantiations or on
implicit nodes which are not spelled in the source.
- The ``TK_IgnoreImplicitCastsAndParentheses`` traversal kind was removed. It
is recommended to use ``TK_IgnoreUnlessSpelledInSource`` instead.
- The behavior of the ``forEach()`` matcher was changed to not internally
ignore implicit and parenthesis nodes. This makes it consistent with
the ``has()`` matcher. Uses of ``forEach()`` relying on the old behavior
can now use the ``traverse()`` matcher or ``ignoringParenCasts()``.
- Several AST Matchers have been changed to match based on the active
traversal mode. For example, ``argumentCountIs()`` matches the number of
arguments written in the source, ignoring default arguments represented
by ``CXXDefaultArgExpr`` nodes.
- Improvements in AST Matchers allow more matching of template declarations,
independent of their template instantiations.
clang-format
------------
- Option ``BitFieldColonSpacing`` has been added that decides how
space should be added around identifier, colon and bit-width in
bitfield definitions.
.. code-block:: c++

  // Both (default)
  struct F {
    unsigned dscp : 6;
    unsigned ecn : 2; // AlignConsecutiveBitFields=true
  };

  // None
  struct F {
    unsigned dscp:6;
    unsigned ecn :2;
  };

  // Before
  struct F {
    unsigned dscp :6;
    unsigned ecn :2;
  };

  // After
  struct F {
    unsigned dscp: 6;
    unsigned ecn : 2;
  };
- Experimental support in clang-format for concepts has been improved. To
  aid this, the following options have been added:
- Option ``IndentRequires`` has been added to indent the ``requires`` keyword
in templates.
- Option ``BreakBeforeConceptDeclarations`` has been added to aid the formatting of concepts.
- Option ``IndentPragmas`` has been added to allow ``#pragma`` directives to be indented with
  the current scope level. This is especially useful when using ``#pragma`` to mark OpenMP sections of code.
- Option ``SpaceBeforeCaseColon`` has been added to add a space before the
colon in a case or default statement.
- Option ``StatementAttributeLikeMacros`` has been added to declare
macros which are not parsed as a type in front of a statement. See
the documentation for an example.
- Options ``AlignConsecutiveAssignments``, ``AlignConsecutiveBitFields``,
``AlignConsecutiveDeclarations`` and ``AlignConsecutiveMacros`` have been modified to allow
alignment across empty lines and/or comments.
libclang
--------
- ...
Static Analyzer
---------------
.. 3ff220de9009 [analyzer][StdLibraryFunctionsChecker] Add POSIX networking functions
.. ...And a million other patches.
- Improve the analyzer's understanding of several POSIX functions.
.. https://reviews.llvm.org/D86533#2238207
- Greatly improved the analyzer’s constraint solver by better understanding
when constraints are imposed on multiple symbolic values that are known to be
equal or known to be non-equal. It will now also efficiently reject impossible
if-branches between known comparison expressions. (Incorrectly stated as an
11.0.0 feature in the previous release notes)
.. 820e8d8656ec [Analyzer][WebKit] UncountedLambdaCaptureChecker
- New checker: :ref:`webkit.UncountedLambdaCapturesChecker<webkit-UncountedLambdaCapturesChecker>`
is a WebKit coding convention checker that flags raw pointers to
reference-counted objects captured by lambdas and suggests using intrusive
reference-counting smart pointers instead.
.. 8a64689e264c [Analyzer][WebKit] UncountedLocalVarsChecker
- New checker: :ref:`alpha.webkit.UncountedLocalVarsChecker<alpha-webkit-UncountedLocalVarsChecker>`
is a WebKit coding convention checker that intends to make sure that any
uncounted local variable is backed by a ref-counted object with lifetime that
is strictly larger than the scope of the uncounted local variable.
.. i914f6c4ff8a4 [StaticAnalyzer] Support struct annotations in FuchsiaHandleChecker
- ``fuchsia.HandleChecker`` now recognizes handles in structs; all the handles
  referenced by the structure (direct value or pointer) are treated as
  carrying the release/use/acquire annotations directly.
.. 8deaec122ec6 [analyzer] Update Fuchsia checker to catch releasing unowned handles.
- Fuchsia checkers can detect the release of an unowned handle.
- Numerous fixes and improvements to bug report generation.
.. _release-notes-ubsan:
Undefined Behavior Sanitizer (UBSan)
------------------------------------
Core Analysis Improvements
==========================
- ...
New Issues Found
================
- ...
Python Binding Changes
----------------------
The following methods have been added:
- ...
Significant Known Problems
==========================
Additional Information
======================
A wide variety of additional information is available on the `Clang web
page <https://clang.llvm.org/>`_. The web page contains versions of the
API documentation which are up-to-date with the Git version of
the source code. You can access versions of these documents specific to
this release by going into the "``clang/docs/``" directory in the Clang
tree.
If you have any questions or comments about Clang, please feel free to
contact us via the `mailing
list <https://lists.llvm.org/mailman/listinfo/cfe-dev>`_.
.. index:: introduction to operating systems
operating systems
COMP 310 (formerly 374): Introduction to Operating Systems
==========================================================
This course introduces principles of operating systems and how they are designed. Various important parts of operating systems such as memory addressing, file structures, processes, and threads are covered.
Credit Hours
-------------------
3
Prerequisites
--------------------
:doc:`../courses/comp264` or :doc:`../courses/comp271`
Description
--------------------
This is an introductory course in Operating Systems discussing both standalone
and distributed environments. The focus of the course is to understand the
underlying technologies that make contemporary operating systems work
efficiently. We will discuss processes, threads, synchronization, I/O, file
systems, memory management, transactions and system coordination techniques.
Through this course we will discover how these technologies are integrated
into the systems we use today and then utilize these technologies and apply
them to practical applications. This is NOT a programming intensive course,
however, students will be expected to complete some programming in C with
plenty of examples and assistance along the way. You certainly don't need to
know how to program in C today. In addition, the completion of a technical
paper on an OS related subject will also be expected.
Everyone currently in, or planning to enter, the IT field should have a grasp
of these components, as they affect every area of the day-to-day operation of
IT technology. Reference systems will include both Linux and Windows.
Outcome
----------
Students will learn the different parts of an operating system at a functional level and how they interact with each other.
Syllabi
---------------------
See :doc:`../syllabi/syllabi`.
.. _packaging_android:
Create a package for Android
============================
You can create a package for android using the `python-for-android
<https://github.com/kivy/python-for-android>`_ project. This page explains how to
download and use it directly on your own machine (see
:ref:`Packaging your application into APK`), use the prebuilt :ref:`testdrive` virtual
machine image, or use the :ref:`buildozer` tool to automate the entire
process. You can also see :ref:`Packaging your application for Kivy Launcher` to run kivy
programs without compiling them.
.. _Packaging your application into APK:
Packaging your application into an APK
--------------------------------------
This section describes how to download and use python-for-android directly.
You'll need:
- A linux computer or virtual machine
- Java
- Python 2.7 (not 2.6)
- Jinja2 (python module)
- Apache ant
- Android SDK
Setup Python for Android
~~~~~~~~~~~~~~~~~~~~~~~~
First, install the prerequisites needed for the project:
http://python-for-android.readthedocs.org/en/latest/prerequisites/
Then open a console and type::
git clone git://github.com/kivy/python-for-android
Build your distribution
~~~~~~~~~~~~~~~~~~~~~~~
The distribution is a "directory" containing a specialized python compiled for
Android, including only the modules you asked for. You can, from the same
python-for-android, compile multiple distributions. For example:
- One containing a minimal support without audio / video
- Another containing audio, openssl etc.
To do that, you must use the script named `distribute.sh`::
./distribute.sh -m "kivy"
The result of the compilation will be saved into `dist/default`. Here are other
examples of building distributions::
./distribute.sh -m "openssl kivy"
./distribute.sh -m "pil ffmpeg kivy"
.. note::
The order of modules provided are important, as a general rule put
dependencies first and then the dependent modules, C libs come first
then python modules.
To see the available options for distribute.sh, type::
./distribute.sh -h
.. note::
    To use the latest Kivy development version to build your distribution, set
    the ``P4A_kivy_DIR`` environment variable to the location of your kivy
    folder. On linux you would use the export command, like this::
export P4A_kivy_DIR=/path/to/cloned/kivy/
Package your application
~~~~~~~~~~~~~~~~~~~~~~~~
Inside the distribution (`dist/default` by default), you have a tool named
`build.py`. This is the script that will create the APK for you::
./build.py --dir <path to your app>
--name "<title>"
--package <org.of.your.app>
--version <human version>
--icon <path to an icon to use>
--orientation <landscape|portrait>
--permission <android permission like VIBRATE> (multiple allowed)
<debug|release> <installd|installr|...>
An example of using multiple permissions::
--permission INTERNET --permission WRITE_EXTERNAL_STORAGE
Full list of available permissions are documented here:
http://developer.android.com/reference/android/Manifest.permission.html
For example, if we imagine that the touchtracer demo of Kivy is in the directory
~/kivy/examples/demo/touchtracer, you can do::
./build.py --dir ~/kivy/examples/demo/touchtracer \
--package org.demo.touchtracer \
--name "Kivy Touchtracer" --version 1.1.0 debug installd
You need to be aware that the default target Android SDK version for the build
will be SDK v.8, which is the minimum required SDK version for kivy. You should
either install this API version, or change the AndroidManifest.xml file (under
dist/.../) to match your own target SDK requirements.
The debug binary will be generated in bin/KivyTouchtracer-1.1.0-debug.apk. The
`debug` and `installd` parameters are commands from the Android project itself.
They instruct `build.py` to compile the APK in debug mode and install on the
first connected device.
You can then install the APK directly to your Android device as follows::
adb install -r bin/KivyTouchtracer-1.1.0-debug.apk
Release on the market
~~~~~~~~~~~~~~~~~~~~~
Launch the build.py script again, with the `release` parameter. After building it,
you must sign and zipalign the APK. Read the android documentation at:
http://developer.android.com/guide/publishing/app-signing.html
The release binary will be generated in
bin/KivyTouchtracer-1.1.0-release-unsigned.apk (for the previous touchtracer example.)
.. _testdrive:
TestDrive
---------
We provide a VirtualBox Image with python-for-android along with
the Android SDK and NDK preinstalled to ease your installation woes. You can
download it from `here <http://kivy.org/#download>`_.
Once the VM is loaded, you can follow the instructions from
:ref:`Packaging your application into APK`. You don't need to download
with `git clone` though, as python-for-android is already installed
and set up in the virtual machine home directory.
.. _Buildozer:
Buildozer
---------
Buildozer is a tool that automates the entire build process. It
downloads and sets up all the prerequisites for python-for-android,
including the android SDK and NDK, then builds an apk that can be
automatically pushed to the device.
Buildozer currently works only on Linux, and is an alpha
release, but it already works well and can significantly simplify the
apk build.
You can get buildozer at `<https://github.com/kivy/buildozer>`_::
git clone https://github.com/kivy/buildozer.git
cd buildozer
sudo python2.7 setup.py install
This will install buildozer in your system. Afterwards, navigate to
your project directory and run::
buildozer init
This creates a `buildozer.spec` file controlling your build
configuration. You should edit it appropriately with your app name
etc. You can set variables to control most or all of the parameters
passed to python-for-android.
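For illustration, a minimal ``buildozer.spec`` might look like the sketch below. The keys follow the template that ``buildozer init`` generates; the values here are placeholders for the touchtracer example, not a tested configuration:

```ini
[app]
title = Kivy Touchtracer
package.name = touchtracer
package.domain = org.demo
source.dir = .
version = 1.1.0
requirements = kivy
orientation = landscape

[buildozer]
log_level = 2
```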
Afterwards, plug in your android device and run::
buildozer android debug deploy run
to build, push and automatically run the apk on your device.
You can check the buildozer README at
`<https://github.com/kivy/buildozer>`_ for more documentation of
buildozer's capabilities.
.. _Packaging your application for Kivy Launcher:
Packaging your application for the Kivy Launcher
------------------------------------------------
The `Kivy launcher <https://play.google.com/store/apps/details?id=org.kivy.pygame&hl=en>`_
is an Android application that runs any Kivy examples stored on your
SD Card. See :ref:`androidinstall`.
Your application must be saved into::
/sdcard/kivy/<yourapplication>
Your application directory must contain::
# Your main application file:
main.py
# Some info Kivy requires about your app on android:
android.txt
The file `android.txt` must contain::
title=<Application Title>
author=<Your Name>
orientation=<portrait|landscape>
Welcome to salt-nornir's documentation!
=======================================
.. toctree::
:maxdepth: 2
:caption: Contents:
Overview
Installation
Getting started
Nornir Proxy Module
Nornir Execution Module
Nornir Runner Module
Nornir State Module
FAQ
Examples
Crypto Prediction Bot
========================
This is a simple project that tries to predict stock prices using machine learning.
Setup
----------
Run ``setup.py`` to install the necessary requirements.
Run
----------
The application can be started by running ``bot/core.py``.
Installation
------------
With ``pip``
~~~~~~~~~~~~
Requires Python 3.8+.
.. code:: sh
pip install VWS-CLI
With Homebrew (macOS, Linux, WSL)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Requires `Homebrew`_.
.. code:: sh
brew tap VWS-Python/vws
brew install vws-cli
.. _Homebrew: https://docs.brew.sh/Installation
Pre-built Linux binaries
~~~~~~~~~~~~~~~~~~~~~~~~
.. prompt:: bash
:substitutions:
curl --fail -L https://github.com/|github-owner|/|github-repository|/releases/download/|release|/vws -o /usr/local/bin/vws && \
chmod +x /usr/local/bin/vws
curl --fail -L https://github.com/|github-owner|/|github-repository|/releases/download/|release|/vuforia-cloud-reco -o /usr/local/bin/vuforia-cloud-reco && \
chmod +x /usr/local/bin/vuforia-cloud-reco
Lab 2.1: Configuring DoS Device Profile
---------------------------------------
The Device DoS profile reports and mitigates based on aggregated data across the entire BIG-IP. All packets that are not explicitly whitelisted
count for the Device DoS vectors. When using more specific profiles on Virtual Servers, the Device DoS profile should be set using values large
enough that they provide protection for the device without conflicting with Virtual Server profiles. For example individual virtual servers may be
configured with UDP flood values to detect and mitigate values of 10000 PPS, however device DoS is set to 50000 PPS.
.. note:: Most DoS Vector values are *per TMM*, one exception being Single Endpoint Sweep and Flood vectors which aggregates multiple packet types and applies the configured limit across all TMMs.
The default Device DoS profile's manual detect and mitigate thresholds are set high for all vectors. Reducing these values allows for easier demonstrations.
Additionally, a lab VE license has a 10 Mb/s throughput cap, which restricts how many PPS TMM can process. If demonstrating DoS with larger values, use at least a 1 Gb/s license.
For this set of labs we will be utilizing Device DoS to detect and mitigate bad packet and flood types, while using a DoS Profile to detect and mitigate
specific DNS vectors only. This allows us to layer a fine grained DNS policy while letting Device DoS catch bad packet types across all Virtuals.
1. Under *Configuration* > *Security* > *Shared Security* > *Device DoS Configurations*, click on *BOS-vBIGIP01*
2. Expand the *Flood* category of attack types
3. Select *UDP Flood* and modify the settings as shown below: be sure the UDP port list *includes all ports* (a whitelisted port will not be counted, and DNS is in the default port list)
.. image:: ../pictures/module2/udp-flood-settings.png
:align: center
:scale: 50%
4. Click *OK* to save changes, and expand the *DNS* category of attacks
5. Select *DNS Malformed* and modify the settings as shown below then click *OK*
.. image:: ../pictures/module2/dns-malformed-settings.png
:align: center
:scale: 80%
6. Click *Save & Close* to save the edits
7. To deploy the changes, create and deploy using *Shared Security* or *Network Security* for *BOS-vBIGIP01* and *02*
======================
Data Accessor Variants
======================
Basic data accessors
++++++++++++++++++++
.. module:: tkp.accessors.dataaccessor
.. autoclass:: tkp.accessors.dataaccessor.DataAccessor
:members:
The following accessors are derived from the basic :class:`DataAccessor` class:
.. module:: tkp.accessors.fitsimage
.. class:: tkp.accessors.fitsimage.FitsImage
Generic FITS data access.
.. module:: tkp.accessors.casaimage
.. class:: tkp.accessors.casaimage.CasaImage
Generic CASA image access.
.. module:: tkp.accessors.kat7casaimage
.. class:: tkp.accessors.kat7casaimage.Kat7CasaImage
KAT-7 specific CASA image access.
LOFAR-specific data accessors
++++++++++++++++++++++++++++++++++
.. module:: tkp.accessors.lofaraccessor
.. autoclass:: tkp.accessors.lofaraccessor.LofarAccessor
:members:
The following accessors are derived from the generic :class:`LofarAccessor`:
.. module:: tkp.accessors.lofarfitsimage
.. class:: tkp.accessors.lofarfitsimage.LofarFitsImage
LOFAR FITS access.
.. module:: tkp.accessors.lofarcasaimage
.. class:: tkp.accessors.lofarcasaimage.LofarCasaImage
LOFAR CASA image access.
========
L'Etuvee
========
Try cooking vegetables "a l'etuvee."
It retains tons of flavor. Any non-green vegetable is a good candidate
-- turnips, carrots, parsnips, cauliflower, squash, etc.
Cut them into pieces, put them flat in a pot. Fill the pot with enough
water to cover 1/4 of the vegetables. Add a hunk of butter and salt and
pepper. Cook on medium heat until the water is evaporated. Optionally,
after the water is evaporated, turn the flame up to high for a minute.
Let the bottom of the pan start to brown well. Once it's a nice dark
brown, but not black, add a tablespoon of water, and swirl everything
around the pan. The water will deglaze all the browned sugars, and make
a nice sauce that will coat your vegetables.
Python library of reusable code
===============================
Currently this library contains a single module, ``actions``. This can be used for writing reusable, confirmable actions.
Exopy Pulses
============
.. image:: https://travis-ci.org/Exopy/exopy_pulses.svg?branch=master
:target: https://travis-ci.org/Exopy/exopy_pulses
:alt: Build Status
.. image:: https://codecov.io/gh/Exopy/exopy_pulses/branch/master/graph/badge.svg
:target: https://codecov.io/gh/Exopy/exopy_pulses
:alt: Coverage
.. image:: https://readthedocs.org/projects/exopy-pulses/badge/?version=latest
:target: http://exopy-pulses.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status
.. image:: https://api.codacy.com/project/badge/Grade/700a9aea186b40aeba07bab363ff3544
:target: https://www.codacy.com/app/Exopy/exopy_pulses?utm_source=github.com&utm_medium=referral&utm_content=Exopy/exopy_pulses&utm_campaign=Badge_Grade
:alt: Code quality (Codacy)
.. image:: https://anaconda.org/exopy/exopy_pulses/badges/version.svg
:target: https://anaconda.org/exopy/exopy_pulses
:alt: Conda package
Plugin providing tools to synthesize pulse sequences in Exopy.
Installing
-----------
The easiest way to install exopy_pulses is through conda:
.. code:: shell
conda install exopy_pulses -c exopy
General Algorithm
=================
Compute distance matrix between data points
--------------------------------------------
The first step in the algorithm is the computation of the distances between the data points. This distance matrix is based on a proximity measure between two instances i and j, that represents the frequency with which those instances occur in the same terminal nodes of a tree in the Random Forest (RF) model. Intuitively, this defines how close those instances are in the RF model. We define the proximity matrix as
:math:`M^{proximity}_{ij} = \frac{m_{i,j}}{N}`
where :math:`m_{i,j}` is the number of trees where the data-points i,j end up in the same terminal node and N is the total number of trees in the RF model. According to *Breiman et al., 2003* the values :math:`1-M^{proximity}_{ij}` are square distances in a euclidean space and can therefore be used as distance measures: :math:`M^{distance}_{ij} = 1-M^{proximity}_{ij}`
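As a sketch of this computation, assume the per-tree terminal-node ("leaf") indices of every sample are already available, e.g. from scikit-learn's ``RandomForestClassifier.apply``; the toy ``leaves`` matrix below stands in for a real forest:

```python
import numpy as np

# Toy leaf matrix: 4 samples x 3 trees; entry (i, t) is the terminal
# node that sample i reaches in tree t (e.g. from rf.apply(X)).
leaves = np.array([
    [2, 5, 1],   # sample 0
    [2, 5, 1],   # sample 1: same leaf as sample 0 in all 3 trees
    [2, 7, 4],   # sample 2: shares a leaf with 0/1 only in tree 0
    [9, 7, 4],   # sample 3
])

# M^proximity_ij = fraction of trees where i and j share a terminal node
same_leaf = leaves[:, None, :] == leaves[None, :, :]
proximity = same_leaf.mean(axis=2)

# 1 - M^proximity is a squared Euclidean distance (Breiman et al., 2003)
distance = 1.0 - proximity

print(proximity[0, 1])  # 1.0 -> samples 0 and 1 are never separated
print(proximity[0, 2])  # 0.333... -> together in 1 of 3 trees
```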
Optimize number of clusters
---------------------------
Having a distance matrix :math:`M^{distance}_{ij}` we can use `k-medoids clustering <https://en.wikipedia.org/wiki/K-medoids>`_ to find subgroups for which the data points follow similar decision paths in the RF model. We use k-medoids, as, in contrast to k-means, it does not require an embedding of the data points in a metric space but can be applied directly to a distance matrix.
Similar to k-means clustering, k-medoids clustering requires setting the number of clusters :math:`k` into which we want to divide our dataset in advance. We want the resulting clusters to be both stable and predictive of the target. We developed a scoring system to choose the optimal number of clusters :math:`k`, which minimizes the model bias while restricting the model complexity. The model bias measures how well the clustering (FGC with a certain value of :math:`k`) approximates the expected model, while the variance is related to the model complexity, since complex models usually have a high variance and poor generalization capability.
Model bias
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
For **classification models**, we define the model bias by the impurity score of the clustering. Analogous to the regression case the model bias is computed for each value of :math:`k` separately. The impurity score is a balanced Gini coefficient of the classes within each cluster. The class sizes are balanced by rescaling the class size with the inverse size of the class in the overall dataset. Given a classification problem with :math:`G` classes, we define the impurity score as:
:math:`IS_k = \sum_i^k \left( 1- \sum_{g=1}^G b^2_{i,g} \right)`
where the balanced per cluster frequency :math:`b_{i,g} = \frac{1}{\sum_{g=1}^G \frac{p_{i,g}}{q_g}} \frac{p_{i,g}}{q_g}` of class :math:`g` in cluster :math:`i` is the normalized frequency :math:`p_{i,g}` of class :math:`g` in cluster :math:`i`, weighted by the total frequency :math:`q_g` of class :math:`g` in the data set.
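A minimal sketch of this balanced impurity score in code (``labels`` holds the true classes, ``clusters`` is a list of index arrays, one per cluster; the function name is ours, not from any package):

```python
import numpy as np

def impurity_score(labels, clusters):
    # q_g: frequency of each class in the whole data set
    classes, counts = np.unique(labels, return_counts=True)
    q = counts / counts.sum()
    score = 0.0
    for idx in clusters:
        # p_{i,g}: frequency of each class within cluster i
        vals, c = np.unique(labels[idx], return_counts=True)
        p = np.zeros_like(q)
        p[np.searchsorted(classes, vals)] = c / c.sum()
        # b_{i,g}: class frequencies rebalanced by the global frequencies q_g
        b = (p / q) / (p / q).sum()
        score += 1.0 - np.sum(b ** 2)   # balanced Gini of cluster i
    return score

labels = np.array([0, 0, 1, 1])
print(impurity_score(labels, [np.array([0, 1]), np.array([2, 3])]))  # 0.0 (pure clusters)
print(impurity_score(labels, [np.array([0, 2]), np.array([1, 3])]))  # 1.0 (maximally mixed)
```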
For **regression models**, the mean target value within each cluster is treated as the prediction for every data point in that cluster. The model bias is then defined as the total squared error of this prediction compared to the ground truth. We compute the model bias separately for each value of :math:`k` and score each clustering by its total squared error:
:math:`TSE_k = \sum_i^k \sum_{y_i \in C_j} \left( y_i - \mu_j \right)^2`
where :math:`y_i` is the target value of data point :math:`i` and :math:`\mu_j = \frac{1}{|C_j|}\sum_{y_i \in C_j} y_i` is the mean of the target values within cluster :math:`C_j`. It measures the compactness (i.e., goodness) of the clustering with respect to the target and should be as small as possible.
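As a sketch, :math:`TSE_k` can be computed in a few lines of plain Python, assuming cluster labels and target values are given as parallel lists:

```python
def total_squared_error(labels, y):
    """TSE_k: squared error of predicting each point by its cluster mean."""
    tse = 0.0
    for cluster in set(labels):
        vals = [v for l, v in zip(labels, y) if l == cluster]
        mu = sum(vals) / len(vals)  # cluster mean, the predicted target value
        tse += sum((v - mu) ** 2 for v in vals)
    return tse

print(total_squared_error([0, 0, 1, 1], [1.0, 1.0, 5.0, 5.0]))  # 0.0 (perfectly compact)
print(total_squared_error([0, 0, 1, 1], [1.0, 3.0, 5.0, 5.0]))  # 2.0
```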
Model variance
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
We limit the model variance by discarding overly complex models. We define the complexity of the clustering for each value of :math:`k` via its stability. The
stability of each cluster :math:`i` in the clustering is measured by the average Jaccard similarity between the original cluster :math:`A` and :math:`n` bootstrapped clusters :math:`B_b`:
:math:`JS_i(A|B) = \frac{\sum_{b=1}^n \frac{|A \cap B_b|}{|A \cup B_b|}}{n}`
Jaccard similarity values > 0.6 are usually indicative of stable patterns in the data (*Hennig, 2008*). Only stable clusterings, i.e., clusterings with low variance,
are considered as clustering candidates. Hence, the optimal number of clusters :math:`k` is the one yielding the minimum model bias while having a stable clustering.
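For illustration, the average Jaccard similarity can be sketched in plain Python with clusters represented as sets of data-point indices (the example clusters are hypothetical):

```python
def jaccard_stability(original, bootstraps):
    """Average Jaccard similarity JS_i between a cluster and its bootstrapped versions."""
    sims = [len(original & b) / len(original | b) for b in bootstraps]
    return sum(sims) / len(sims)

A = {1, 2, 3, 4}                                 # indices of the original cluster
print(jaccard_stability(A, [A, A]))              # 1.0 -- identical bootstraps
print(jaccard_stability(A, [{1, 2, 3, 5}]))      # 0.6 -- intersection 3, union 5
```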
===============================
Deploy MinIO in Standalone Mode
===============================
.. default-domain:: minio
.. contents:: Table of Contents
:local:
:depth: 1
The procedures on this page cover deploying MinIO in
:guilabel:`Standalone Mode`. A standalone MinIO deployment consists of a single
MinIO server process with a single drive or storage volume ("filesystem mode").
The MinIO server provides an S3 access layer to the drive or volume and stores
objects as-is without any :ref:`erasure coding <minio-erasure-coding>`.
For extended development or production environments, *or* to access
:ref:`advanced MinIO functionality <minio-installation-comparison>` deploy MinIO
in :guilabel:`Distributed Mode`. See :ref:`deploy-minio-distributed` for more
information.
.. _deploy-minio-standalone:
Deploy Standalone MinIO on Baremetal
------------------------------------
The following procedure deploys MinIO in :guilabel:`Standalone Mode` consisting
of a single MinIO server and a single drive or storage volume. Standalone
deployments are best suited for evaluation and initial development environments.
.. admonition:: Network File System Volumes Break Consistency Guarantees
:class: note
MinIO's strict **read-after-write** and **list-after-write** consistency
model requires local disk filesystems (``xfs``, ``ext4``, etc.).
MinIO cannot provide consistency guarantees if the underlying storage
volumes are NFS or a similar network-attached storage volume.
For deployments that *require* using network-attached storage, use
NFSv4 for best results.
1) Download and Run MinIO Server
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Visit `https://min.io/download <https://min.io/download?ref=docs>`__ and select
the tab most relevant to your use case. Follow the displayed instructions to
download the :mc:`minio` binary to your local machine. The example instructions
use the ``/data`` folder by default. You can create or change this folder
as necessary for your deployment. The :mc:`minio` process must have
full access to the specified folder *and* all of its subfolders.
The :mc:`minio server` process prints its output to the system console, similar
to the following:
.. code-block:: shell
API: http://192.0.2.10:9000 http://127.0.0.1:9000
RootUser: minioadmin
RootPass: minioadmin
Console: http://192.0.2.10:9001 http://127.0.0.1:9001
RootUser: minioadmin
RootPass: minioadmin
Command-line: https://docs.min.io/docs/minio-client-quickstart-guide
$ mc alias set myminio http://192.0.2.10:9000 minioadmin minioadmin
Documentation: https://docs.min.io
WARNING: Detected default credentials 'minioadmin:minioadmin', we recommend that you change these values with 'MINIO_ROOT_USER' and 'MINIO_ROOT_PASSWORD' environment variables
Open your browser to any of the listed :guilabel:`Console` addresses to open the
:ref:`MinIO Console <minio-console>` and log in with the :guilabel:`RootUser`
and :guilabel:`RootPass`. You can use the MinIO Console for performing
administration on the MinIO server.
For applications, use the :guilabel:`API` addresses to access the MinIO
server and perform S3 operations.
The following steps are optional but recommended for further securing the
MinIO deployment.
2) Add TLS Certificates
~~~~~~~~~~~~~~~~~~~~~~~
MinIO supports enabling :ref:`Transport Layer Security (TLS) <minio-TLS>` 1.2+
automatically upon detecting a x.509 private key (``private.key``) and public
certificate (``public.crt``) in the MinIO ``certs`` directory:
- For Linux/MacOS: ``${HOME}/.minio/certs``
- For Windows: ``%USERPROFILE%\.minio\certs``
You can override the certificate directory using the
:mc-cmd-option:`minio server certs-dir` commandline argument.
3) Run the MinIO Server with Non-Default Credentials
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Issue the following command to start the :mc:`minio server` with non-default
credentials. The table following this command breaks down each portion of the
command:
.. code-block:: shell
:class: copyable
export MINIO_ROOT_USER=minio-admin
export MINIO_ROOT_PASSWORD=minio-secret-key-CHANGE-ME
#export MINIO_SERVER_URL=https://minio.example.net
minio server /data --console-address ":9001"
The example command breaks down as follows:
.. list-table::
:widths: 40 60
:width: 100%
* - :envvar:`MINIO_ROOT_USER`
- The access key for the :ref:`root <minio-users-root>` user.
Replace this value with a unique, random, and long string.
* - :envvar:`MINIO_ROOT_PASSWORD`
- The corresponding secret key to use for the
:ref:`root <minio-users-root>` user.
Replace this value with a unique, random, and long string.
* - :envvar:`MINIO_SERVER_URL`
- The URL hostname the MinIO Console uses for connecting to the MinIO
server. This variable is *required* if specifying TLS certificates
which **do not** contain the IP address of the MinIO Server host
as a :rfc:`Subject Alternative Name <5280#section-4.2.1.6>`.
Specify a hostname covered by one of the TLS certificate SAN entries.
* - ``/data``
- The path to the drive or volume on the host machine.
See :mc-cmd:`minio server DIRECTORIES` for more information on
configuring the backing storage for the :mc:`minio server` process.
MinIO writes objects to the specified directory as is and without
:ref:`minio-erasure-coding`. Any other application accessing that
directory can read and modify stored objects.
* - ``--console-address ":9001"``
- The static port on which the embedded MinIO Console listens for incoming
connections.
Omit to allow MinIO to select a dynamic port for the MinIO Console.
With dynamic port selection, browsers opening the root node hostname
``https://minio1.example.com:9000`` are automatically redirected to the
Console.
You may specify other :ref:`environment variables
<minio-server-environment-variables>` as required by your deployment.
4) Open the MinIO Console
~~~~~~~~~~~~~~~~~~~~~~~~~
Open your browser to the DNS name or IP address of the MinIO server host and the
:ref:`MinIO Console <minio-console>` port. For example,
``https://127.0.0.1:9001``.
Log in with the :guilabel:`MINIO_ROOT_USER` and :guilabel:`MINIO_ROOT_PASSWORD`
from the previous step.
.. image:: /images/minio-console-dashboard.png
:width: 600px
:alt: MinIO Console Dashboard displaying Monitoring Data
:align: center
You can use the MinIO Console for general administration tasks like
Identity and Access Management, Metrics and Log Monitoring, or
Server Configuration. Each MinIO server includes its own embedded MinIO
Console.
Applications should use the ``https://HOST-ADDRESS:9000`` address to perform S3
operations against the MinIO server.
.. _deploy-minio-standalone-container:
Deploy Standalone MinIO in a Container
--------------------------------------
The following procedure deploys a single MinIO container with a single drive.
Standalone deployments are best suited for evaluation and initial development
environments.
The procedure uses `Podman <https://podman.io/>`__ for running the MinIO
container in rootfull mode. Configuring for rootless mode is out of scope for
this procedure.
.. admonition:: Network File System Volumes Break Consistency Guarantees
:class: note
MinIO's strict **read-after-write** and **list-after-write** consistency
model requires local disk filesystems (``xfs``, ``ext4``, etc.).
MinIO cannot provide consistency guarantees if the underlying storage
volumes are NFS or a similar network-attached storage volume.
For deployments that *require* using network-attached storage, use
NFSv4 for best results.
1) Create a Configuration File to store Environment Variables
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
MinIO reads configuration values from environment variables. MinIO supports
reading these environment variables from ``/run/secrets/config.env``. Save
the ``config.env`` file as a :podman-docs:`Podman secret <secret.html>` and
specify it as part of running the container.
Create a file ``config.env`` using your preferred text editor and enter the
following environment variables:
.. code-block:: shell
:class: copyable
export MINIO_ROOT_USER=minio-admin
export MINIO_ROOT_PASSWORD=minio-secret-key-CHANGE-ME
#export MINIO_SERVER_URL=https://minio.example.net
Create the Podman secret using the ``config.env`` file:
.. code-block:: shell
:class: copyable
sudo podman secret create config.env config.env
The following table details each environment variable set in ``config.env``:
.. list-table::
:widths: 40 60
:width: 100%
* - :envvar:`MINIO_ROOT_USER`
- The access key for the :ref:`root <minio-users-root>` user.
Replace this value with a unique, random, and long string.
* - :envvar:`MINIO_ROOT_PASSWORD`
- The corresponding secret key to use for the
:ref:`root <minio-users-root>` user.
Replace this value with a unique, random, and long string.
* - :envvar:`MINIO_SERVER_URL`
- The URL hostname the MinIO Console uses for connecting to the MinIO
server. This variable is *required* if specifying TLS certificates
which **do not** contain the IP address of the MinIO Server host
as a :rfc:`Subject Alternative Name <5280#section-4.2.1.6>`.
Specify a hostname covered by one of the TLS certificate SAN entries.
You may specify other :ref:`environment variables
<minio-server-environment-variables>` as required by your deployment.
2) Add TLS Certificates
~~~~~~~~~~~~~~~~~~~~~~~
MinIO supports enabling :ref:`Transport Layer Security (TLS) <minio-TLS>` 1.2+
automatically upon detecting an x.509 private key (``private.key``) and public
certificate (``public.crt``) in the MinIO ``certs`` directory.
Create a Podman secret pointing to the x.509
``private.key`` and ``public.crt`` to use for the container.
.. code-block:: shell
:class: copyable
sudo podman secret create private.key /path/to/private.key
sudo podman secret create public.crt /path/to/public.crt
You can optionally skip this step to deploy without TLS enabled. MinIO
strongly recommends *against* non-TLS deployments outside of early development.
3) Run the MinIO Container
~~~~~~~~~~~~~~~~~~~~~~~~~~
Issue the following command to start the MinIO server in a container:
.. code-block:: shell
:class: copyable
sudo podman run -p 9000:9000 -p 9001:9001 \
-v /data:/data \
--secret private.key \
--secret public.crt \
--secret config.env \
minio/minio server /data \
--console-address ":9001" \
--certs-dir "/run/secrets/"
The example command breaks down as follows:
.. list-table::
:widths: 40 60
:width: 100%
* - ``-p 9000:9000, -p 9001:9001``
- Exposes the container internal port ``9000`` and ``9001`` through
the node port ``9000`` and ``9001`` respectively.
Port ``9000`` is the default MinIO server listen port.
Port ``9001`` is the :ref:`MinIO Console <minio-console>` listen port
specified by the ``--console-address`` argument.
* - ``-v /data:/data``
- Mounts a local volume to the container at the specified path.
* - ``--secret ...``
- Mounts a secret to the container. The specified secrets correspond to
the following:
- The x.509 private and public key the MinIO server process uses for
enabling TLS.
- The ``config.env`` file from which MinIO looks for configuration
environment variables.
* - ``/data``
- The path to the container volume in which the ``minio`` server stores
all information related to the deployment.
See :mc-cmd:`minio server DIRECTORIES` for more information on
configuring the backing storage for the :mc:`minio server` process.
* - ``--console-address ":9001"``
- The static port on which the embedded MinIO Console listens for incoming
connections.
Omit to allow MinIO to select a dynamic port for the MinIO Console.
With dynamic port selection, browsers opening the root node hostname
``https://minio1.example.com:9000`` are automatically redirected to the
Console.
* - ``--certs-dir "/run/secrets/"``
- Directs the MinIO server to use the ``/run/secrets/`` folder for
retrieving x.509 certificates to use for enabling TLS.
4) Open the MinIO Console
~~~~~~~~~~~~~~~~~~~~~~~~~
Open your browser to the DNS name or IP address corresponding to the
container and the :ref:`MinIO Console <minio-console>` port. For example,
``https://127.0.0.1:9001``.
Log in with the :guilabel:`MINIO_ROOT_USER` and :guilabel:`MINIO_ROOT_PASSWORD`
from the previous step.
.. image:: /images/minio-console-dashboard.png
:width: 600px
:alt: MinIO Console Dashboard displaying Monitoring Data
:align: center
You can use the MinIO Console for general administration tasks like
Identity and Access Management, Metrics and Log Monitoring, or
Server Configuration. Each MinIO server includes its own embedded MinIO
Console.
Applications should use the ``https://HOST-ADDRESS:9000`` address to perform S3
operations against the MinIO server.
.. _diskMoveType: ../../vim/vm/RelocateSpec/DiskLocator.rst#diskMoveType
.. _vim.fault.MigrationFault: ../../vim/fault/MigrationFault.rst
vim.fault.DiskMoveTypeNotSupported
==================================
:extends:
`vim.fault.MigrationFault`_
Specifying non-standard disk movement types is not supported. See `diskMoveType`_.
Attributes:
.. include:: ./vars.rst
VoiceChannel
============
**extends** ServerChannel_
A voice channel of a server. Currently, the voice channel class has no differences to the ServerChannel class.
--------
Attributes
----------
members
~~~~~~~~
A Cache_ of Users_ that are connected to the voice channel
userLimit
~~~~~~~~~
The maximum number of users that can connect to the voice channel. If it is 0, there is no limit.
bitrate
~~~~~~~~
The bitrate of the voice channel (in kb/s).
Functions
---------
setUserLimit(limit, `callback`)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| **Shortcut of** ``client.setChannelUserLimit(channel, limit, callback)``
| **See** client.setChannelUserLimit_
OpenMP in LLVM --- Offloading Design
====================================
.. toctree::
:glob:
:hidden:
:maxdepth: 1
GPUSPMDMode
GPUGenericMode
Backends
********
.. _Celery settings in the repo: https://github.com/danidee10/django-notifs/blob/master/notifs/settings.py
.. _django-rq: https://github.com/rq/django-rq
.. _django-rq documentation: https://github.com/rq/django-rq
.. _Serverless documentation for AWS: https://www.serverless.com/framework/docs/providers/aws
.. _lambda worker repository: https://github.com/danidee10/django-notifs-lambda-worker
The primary function of **a delivery backend** is to execute the code of the delivery channels and providers.
*Unlike notification channels, you can only use one delivery backend at a time.*
Celery
------
Install the optional Celery dependency with::
pip install django-notifs[celery]
Enable it by setting ``NOTIFICATIONS_DELIVERY_BACKEND`` to ``notifications.backends.Celery``
Run celery with the command::
celery -A yourapp worker -l info -Q django-notifs
Whenever a notification is created, it's automatically sent to celery and processed.
Make sure you see the queue and task (``notifications.backends.celery.consume``) in the terminal.
.. image:: _static/images/django-notifs-celery.png
If you have issues registering the task, you can import it manually or checkout the `Celery settings in the repo`_.
Channels
--------
Install the channels dependency with::
pip install django-notifs[channels]
*This also installs channels_redis as an extra dependency*
Declare the notifications consumer in ``asgi.py``::
from notifications import consumers
application = ProtocolTypeRouter({
...,
'channel': ChannelNameRouter({
'django_notifs': consumers.DjangoNotifsConsumer.as_asgi(),
})
})
*This example assumes that you're running Django 3x Which has native support for asgi. Check the channels documentation for Django 2.2*
Next add the `django_notifs` channel layer to ``settings.CHANNEL_LAYERS``::
CHANNEL_LAYERS = {
...,
'django_notifs': {
'BACKEND': 'channels_redis.core.RedisChannelLayer',
'CONFIG': {
"hosts": [('127.0.0.1', 6379)],
},
},
}
Finally, run the worker with::
python manage.py runworker django_notifs
.. image:: _static/images/channels.png
RQ
--
RQ is a lightweight alternative to Celery. To use the RQ Backend, install the optional dependency with::
pip install django-notifs[rq]
*django-notifs uses django-rq under the hood*
Enable it by setting ``NOTIFICATIONS_DELIVERY_BACKEND`` to ``notifications.backends.RQ``
Configure the ``django_notifs`` in ``settings.py``::
RQ_QUEUES = {
...,
'django_notifs': {
'HOST': 'localhost',
'PORT': 6379,
'DB': 0,
'PASSWORD': '',
'DEFAULT_TIMEOUT': 360,
}
}
Finally start the rq worker with::
python manage.py rqworker django_notifs --with-scheduler
.. image:: _static/images/rq-worker.png
See the `django-rq documentation`_ for more details
AwsLambda (with SQS)
--------------------
The setup for this backend is more involved, but it is probably the cheapest and most
scalable backend to use in production because the heavy lifting and execution
environment are handled by AWS.
Set ``NOTIFICATIONS_DELIVERY_BACKEND`` to ``notifications.backends.AwsSqsLambda``.
This backend uses ``boto3`` under the hood; so make sure your AWS credentials are configured e.g::
export AWS_ACCESS_KEY_ID=xxxx
export AWS_SECRET_ACCESS_KEY=xxxx
export AWS_DEFAULT_REGION=xxxx
Clone the `lambda worker repository`_ and run::
npm install
The ``sqs-lambda-worker`` folder includes four files that are of interest:
``.env.example``
You can use this file (after renaming it to ``.env``) to configure the environment variables for the autogenerated Lambda function.
You can replace this step by:
* Configuring the environment variables in your CI/CD environment **(Recommended)**
* Exporting them in the current shell.
This is useful if you want to test the serverless deployment locally before moving it to your CI/CD
``requirements.txt``
In order to keep the lambda function as lean as possible,
you have to explicitly declare the requirements that are necessary
for the lambda function. New providers (and their dependencies) are continuously added to django-notifs
so it is not advisable to install dependencies for providers that you don't need, because this could impact
the startup time of your Lambda function.
``serverless.yml``
The Serverless file. It contains a blueprint that deploys the simplest configuration possible but the configuration options
are endless. see the `Serverless documentation for AWS`_ for more information.
``settings.py``
Declare the Django settings for the lambda function.
After setting these variables, deploy the serverless stack to AWS::
serverless deploy --stage <your-stage>
Then update your settings with the generated SQS queue URL::
settings.NOTIFICATIONS_SQS_QUEUE_URL = 'xxxxxx' # autogenerated SQS url
Synchronous
-----------
This is the default backend that sends notifications synchronously.
You can enable it explicitly by setting ``NOTIFICATIONS_DELIVERY_BACKEND`` to ``notifications.backends.Synchronous``
.. _citing-oggm:
***********
Citing OGGM
***********
Publication
-----------
If you want to refer to OGGM in your publications or presentations, please
refer to the paper in `Geoscientific Model Development`_.
BibTeX entry::
@Article{gmd-12-909-2019,
AUTHOR = {Maussion, F. and Butenko, A. and Champollion, N. and
Dusch, M. and Eis, J. and Fourteau, K. and Gregor, P. and
Jarosch, A. H. and Landmann, J. and Oesterle, F. and
Recinos, B. and Rothenpieler, T. and Vlug, A. and Wild, C. T. and
Marzeion, B.},
TITLE = {The Open Global Glacier Model (OGGM) v1.1},
JOURNAL = {Geoscientific Model Development},
VOLUME = {2019},
YEAR = {2019},
PAGES = {909--931},
URL = {https://www.geosci-model-dev.net/12/909/2019/},
DOI = {10.5194/gmd-12-909-2019}
}
.. _Geoscientific Model Development: https://www.geosci-model-dev.net/12/909/2019/
Software DOI
------------
If you want to refer to a specific version of the software you can use
the `Zenodo citation`_ for this purpose. An example BibTeX entry::
@misc{OGGM_v1.1,
author = {Fabien Maussion and Timo Rothenpieler and
Matthias Dusch and Beatriz Recinos and Anouk Vlug and
Ben Marzeion and Johannes Landmann and Julia Eis and
Sadie Bartholomew and Nicolas Champollion and
Philipp Gregor and Anton Butenko and Schmitty Smith and
Moritz Oberrauch},
title = {OGGM/oggm: v1.1},
month = feb,
year = 2019,
doi = {10.5281/zenodo.2580277},
url = {https://doi.org/10.5281/zenodo.2580277}
}
.. _Zenodo citation: https://zenodo.org/badge/latestdoi/43965645
Logo
----
The OGGM logos are free to use!
.. image:: _static/logo.png
**full logo**:
`pdf <_static/logos/oggm.pdf>`_,
`large (2004x856) <_static/logos/oggm_l.png>`_,
`large (2004x856) with alpha channel <_static/logos/oggm_l_alpha.png>`_,
`small (502x214) <_static/logos/oggm_s.png>`_,
`small (502x214) with alpha channel <_static/logos/oggm_s_alpha.png>`_
.. image:: _static/logos/oggm_icon.png
**icon logo**:
`small (135x157) <_static/logos/oggm_icon.png>`_,
`small (135x157) with alpha channel <_static/logos/oggm_icon_alpha.png>`_
::
# Full Configuration Example
environments: # List of CI/CD environments
production: # An environment called "production"
production_jenkins_1: # A single system called "production_jenkins_1" belongs to "production" environment
system_type: jenkins # The type of the system (jenkins or zuul)
sources: # List of sources belong to "production_jenkins" system
jenkins1_api: # The name of the source which belongs to "production_jenkins_1" system
driver: jenkins # The driver the source will be using
url: https://... # The URL of the system
username: user # The username to use for the authentication
token: xyz # The token to use for the authentication
cert: False # Disable/Enable certificates to use for the authentication
job_definitions: # Another source that belongs to the same system called "production_jenkins_1"
driver: jjb
repos:
- url: https://job_definitions_repo.git
production_jenkins_2: # Another system belongs to the "production" environment
system_type: jenkins
sources:
jenkins2_api:
driver: jenkins
url: https://...
username: user
token: xyz
cert: False
        production_zuul:
system_type: zuul
sources:
zuul_api:
driver: zuul
url: https://...
staging: # Another environment called "staging"
staging_jenkins:
system_type: jenkins
sources:
staging_jenkins_api:
driver: jenkins
url: https://...
username: user
token: xyz
colour.plotting.plot\_cvd\_simulation\_Machado2009
==================================================
.. currentmodule:: colour.plotting
.. autofunction:: plot_cvd_simulation_Machado2009
You know nothing, Anthony Snow
##############################
:date: 2009-09-18 16:06
:author: Anthony Scopatz
:category: missives
:tags: argentiniancowboys, iheartvaginas, kdlang, lollipop, northofthewall, notmybrother, oldladyhat, onedoesnotsimplyjumpingstilitsintomordor, paris, peoplechangenawtheydont, ssp2
:slug: you-know-nothing-anthony-snow
Well it seems like I can't please everyone. Some people think my posts
are too long. Others feel they are too cryptic. Still others think that
my 'musings on women' and my sexuality are 'annoying.'
|image0|
**Exciting News!** So given the recent criticisms, I am going to tell
you about it in a short, encoded, and peevish fashion.
So while I was in Paris, my adviser and I had a meeting where we decided
that it would be good for me to spend some time north of the wall. It is
known that it is quite a magical land up there. Which is good since
previous attempts to befriend the majestic unicorn have been successful.
So it looks like this expedition is going to take place. It took a lot
of work to get this off the ground. After all, one does not simply
jumping stilts into Mordor.
However, in the interim Super Secret Project 2 has come to a close. How
this affects the northern climes (not to mention where everyone loves
vagina) is unclear. It has also come to light that Duncan Idaho of House
Atreides has a made a similar passage to the one I am about to embark
upon. I expect an upset among expatriate Argentine cowboys.
Unlike K. D. Lang (not my brother), Super Secret Project 2 is not +5.
Maybe it is -5. Maybe it is 0. Maybe it is +2.5. Still, I now have no
desire for lollipops nor hats worn by old ladies. Despite what you
believe, people change.
**Sisyphus-like, we build. "*Things fall apart; it's scientific.*\ "**
.. |image0| image:: http://lh4.ggpht.com/_KFdIKJVlj1w/Sq0sqgSYt-I/AAAAAAAAC_s/LqkmrvhhZWU/s400/p9090023.jpg
.. _contributing-guide:
Contributing
==================
There are plenty of ways to contribute to this project. If you think you've found
a bug, please submit an issue. If there is a feature you'd like to see, then please
open a ticket proposal for it. If you've come up with some helpful examples, then
you can add them to our example project.
Getting the Source
--------------------------------------
The source code is hosted on `Github <https://github.com/mlavin/django-selectable>`_.
You can download the full source by cloning the git repo::
git clone git://github.com/mlavin/django-selectable.git
Feel free to fork the project and make your own changes. If you think that it would
be helpful for other then please submit a pull request to have it merged in.
Submit an Issue
--------------------------------------
The issues are also managed on `Github issue page <https://github.com/mlavin/django-selectable/issues>`_.
If you think you've found a bug, it's helpful if you indicate the version of
django-selectable you are using via the ticket's version flag. If you think your
bug is JavaScript related, it is also helpful to know the version of jQuery,
jQuery UI, and the browser you are using.
Issues are also used to track new features. If you have a feature you would like to see
you can submit a proposal ticket. You can also see features which are planned here.
Submit a Translation
--------------------------------------
We are working towards translating django-selectable into different languages. There
are not many strings to be translated so it is a reasonably easy task and a great way
to be involved with the project. The translations are managed through
`Transifex <https://www.transifex.com/projects/p/django-selectable/>`_.
Running the Test Suite
--------------------------------------
There are a number of tests in place to test the server side code for this
project. To run the tests you need Django and `mock <http://www.voidspace.org.uk/python/mock/>`_
installed and run::
python runtests.py
`tox <http://tox.readthedocs.org/en/latest/index.html>`_ is used to test django-selectable
against multiple versions of Django/Python. With tox installed you can run::
tox
to run all the version combinations. You can also run tox against a subset of supported
environments::
tox -e py27-django15
For more information on running/installing tox please see the
tox documentation: http://tox.readthedocs.org/en/latest/index.html
Client-side tests are written using `QUnit <http://docs.jquery.com/QUnit>`_. They
can be found in ``selectable/tests/qunit/index.html``. The test suite also uses
`PhantomJS <http://phantomjs.org/>`_ to
run the tests. You can install PhantomJS from NPM::
# Install requirements
npm install -g phantomjs jshint
make test-js
Building the Documentation
--------------------------------------
The documentation is built using `Sphinx <http://sphinx.pocoo.org/>`_
and available on `Read the Docs <http://django-selectable.readthedocs.io/>`_. With
Sphinx installed you can build the documentation by running::
make html
inside the docs directory. Documentation fixes and improvements are always welcome.
.. role:: raw-html-m2r(raw)
:format: html
Authors
=======
This list is sorted by the number of commits per contributor in *descending* order.
.. list-table::
:header-rows: 1
* - Avatar
- Contributor
- Contributions
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/10231489?v=4' width='36' height='36' alt='@myii'>`
- `@myii <https://github.com/myii>`_
- 89
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1920805?v=4' width='36' height='36' alt='@alxwr'>`
- `@alxwr <https://github.com/alxwr>`_
- 38
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1233212?v=4' width='36' height='36' alt='@baby-gnu'>`
- `@baby-gnu <https://github.com/baby-gnu>`_
- 31
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/1396878?v=4' width='36' height='36' alt='@gravyboat'>`
- `@gravyboat <https://github.com/gravyboat>`_
- 28
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1800660?v=4' width='36' height='36' alt='@aboe76'>`
- `@aboe76 <https://github.com/aboe76>`_
- 25
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/3374962?v=4' width='36' height='36' alt='@nmadhok'>`
- `@nmadhok <https://github.com/nmadhok>`_
- 15
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/91293?v=4' width='36' height='36' alt='@whiteinge'>`
- `@whiteinge <https://github.com/whiteinge>`_
- 9
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/8029478?v=4' width='36' height='36' alt='@rfairburn'>`
- `@rfairburn <https://github.com/rfairburn>`_
- 8
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/6018668?v=4' width='36' height='36' alt='@amendlik'>`
- `@amendlik <https://github.com/amendlik>`_
- 8
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/941928?v=4' width='36' height='36' alt='@amontalban'>`
- `@amontalban <https://github.com/amontalban>`_
- 7
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/242396?v=4' width='36' height='36' alt='@javierbertoli'>`
- `@javierbertoli <https://github.com/javierbertoli>`_
- 7
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/897349?v=4' width='36' height='36' alt='@kennydo'>`
- `@kennydo <https://github.com/kennydo>`_
- 7
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/17393048?v=4' width='36' height='36' alt='@ek9'>`
- `@ek9 <https://github.com/ek9>`_
- 7
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/6215293?v=4' width='36' height='36' alt='@0xf10e'>`
- `@0xf10e <https://github.com/0xf10e>`_
- 7
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/642259?v=4' width='36' height='36' alt='@pepoluan'>`
- `@pepoluan <https://github.com/pepoluan>`_
- 5
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/10227523?v=4' width='36' height='36' alt='@llua'>`
- `@llua <https://github.com/llua>`_
- 5
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/528061?v=4' width='36' height='36' alt='@puneetk'>`
- `@puneetk <https://github.com/puneetk>`_
- 5
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/3375654?v=4' width='36' height='36' alt='@nterupt'>`
- `@nterupt <https://github.com/nterupt>`_
- 4
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/10141454?v=4' width='36' height='36' alt='@mathieupotier'>`
- `@mathieupotier <https://github.com/mathieupotier>`_
- 4
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/1079875?v=4' width='36' height='36' alt='@bogdanr'>`
- `@bogdanr <https://github.com/bogdanr>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/287147?v=4' width='36' height='36' alt='@techhat'>`
- `@techhat <https://github.com/techhat>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/13550?v=4' width='36' height='36' alt='@mikemol'>`
- `@mikemol <https://github.com/mikemol>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/358074?v=4' width='36' height='36' alt='@pcdummy'>`
- `@pcdummy <https://github.com/pcdummy>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/507599?v=4' width='36' height='36' alt='@thatch45'>`
- `@thatch45 <https://github.com/thatch45>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/117961?v=4' width='36' height='36' alt='@babilen5'>`
- `@babilen5 <https://github.com/babilen5>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/2061751?v=4' width='36' height='36' alt='@matthew-parlette'>`
- `@matthew-parlette <https://github.com/matthew-parlette>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/1013915?v=4' width='36' height='36' alt='@rhertzog'>`
- `@rhertzog <https://github.com/rhertzog>`_
- 3
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/36720?v=4' width='36' height='36' alt='@brot'>`
- `@brot <https://github.com/brot>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/776662?v=4' width='36' height='36' alt='@carlosperello'>`
- `@carlosperello <https://github.com/carlosperello>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/4195158?v=4' width='36' height='36' alt='@dafyddj'>`
- `@dafyddj <https://github.com/dafyddj>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/114159?v=4' width='36' height='36' alt='@fpletz'>`
- `@fpletz <https://github.com/fpletz>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/5255388?v=4' width='36' height='36' alt='@ingben'>`
- `@ingben <https://github.com/ingben>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/675056?v=4' width='36' height='36' alt='@OrangeDog'>`
- `@OrangeDog <https://github.com/OrangeDog>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/2285387?v=4' width='36' height='36' alt='@kyrias'>`
- `@kyrias <https://github.com/kyrias>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/924183?v=4' width='36' height='36' alt='@mschiff'>`
- `@mschiff <https://github.com/mschiff>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/3768412?v=4' width='36' height='36' alt='@stp-ip'>`
- `@stp-ip <https://github.com/stp-ip>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/13322818?v=4' width='36' height='36' alt='@noelmcloughlin'>`
- `@noelmcloughlin <https://github.com/noelmcloughlin>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/299386?v=4' width='36' height='36' alt='@excavador'>`
- `@excavador <https://github.com/excavador>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/4510160?v=4' width='36' height='36' alt='@hudecof'>`
- `@hudecof <https://github.com/hudecof>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1004111?v=4' width='36' height='36' alt='@freach'>`
- `@freach <https://github.com/freach>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/50891?v=4' width='36' height='36' alt='@westurner'>`
- `@westurner <https://github.com/westurner>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/228723?v=4' width='36' height='36' alt='@abednarik'>`
- `@abednarik <https://github.com/abednarik>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/26563851?v=4' width='36' height='36' alt='@chenmen'>`
- `@chenmen <https://github.com/chenmen>`_
- 2
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/850317?v=4' width='36' height='36' alt='@alanpearce'>`
- `@alanpearce <https://github.com/alanpearce>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/445200?v=4' width='36' height='36' alt='@arthurlogilab'>`
- `@arthurlogilab <https://github.com/arthurlogilab>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/1566437?v=4' width='36' height='36' alt='@bkmit'>`
- `@bkmit <https://github.com/bkmit>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/20098965?v=4' width='36' height='36' alt='@brianholland99'>`
- `@brianholland99 <https://github.com/brianholland99>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/20441?v=4' width='36' height='36' alt='@iggy'>`
- `@iggy <https://github.com/iggy>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/13131979?v=4' width='36' height='36' alt='@BT-dschleich'>`
- `@BT-dschleich <https://github.com/BT-dschleich>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/3012076?v=4' width='36' height='36' alt='@fzipi'>`
- `@fzipi <https://github.com/fzipi>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/94157?v=4' width='36' height='36' alt='@imran1008'>`
- `@imran1008 <https://github.com/imran1008>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/637504?v=4' width='36' height='36' alt='@jasperla'>`
- `@jasperla <https://github.com/jasperla>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/350294?v=4' width='36' height='36' alt='@anderbubble'>`
- `@anderbubble <https://github.com/anderbubble>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/7613500?v=4' width='36' height='36' alt='@levlozhkin'>`
- `@levlozhkin <https://github.com/levlozhkin>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/25535310?v=4' width='36' height='36' alt='@polymeter'>`
- `@polymeter <https://github.com/polymeter>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/16899663?v=4' width='36' height='36' alt='@Mario-F'>`
- `@Mario-F <https://github.com/Mario-F>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/2869?v=4' width='36' height='36' alt='@nigelsim'>`
- `@nigelsim <https://github.com/nigelsim>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/25389335?v=4' width='36' height='36' alt='@antifob'>`
- `@antifob <https://github.com/antifob>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1610802?v=4' width='36' height='36' alt='@robinelfrink'>`
- `@robinelfrink <https://github.com/robinelfrink>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/2377054?v=4' width='36' height='36' alt='@smlloyd'>`
- `@smlloyd <https://github.com/smlloyd>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/4156131?v=4' width='36' height='36' alt='@skylerberg'>`
- `@skylerberg <https://github.com/skylerberg>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/48949?v=4' width='36' height='36' alt='@tampakrap'>`
- `@tampakrap <https://github.com/tampakrap>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/566830?v=4' width='36' height='36' alt='@TJuberg'>`
- `@TJuberg <https://github.com/TJuberg>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars2.githubusercontent.com/u/1974659?v=4' width='36' height='36' alt='@tibold'>`
- `@tibold <https://github.com/tibold>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/1277162?v=4' width='36' height='36' alt='@brandonparsons'>`
- `@brandonparsons <https://github.com/brandonparsons>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1406670?v=4' width='36' height='36' alt='@elfixit'>`
- `@elfixit <https://github.com/elfixit>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/10122937?v=4' width='36' height='36' alt='@ketzacoatl'>`
- `@ketzacoatl <https://github.com/ketzacoatl>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars3.githubusercontent.com/u/15609251?v=4' width='36' height='36' alt='@omltorg'>`
- `@omltorg <https://github.com/omltorg>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/1721508?v=4' width='36' height='36' alt='@reschl'>`
- `@reschl <https://github.com/reschl>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars0.githubusercontent.com/u/991850?v=4' width='36' height='36' alt='@scub'>`
- `@scub <https://github.com/scub>`_
- 1
* - :raw-html-m2r:`<img class='float-left rounded-1' src='https://avatars1.githubusercontent.com/u/8021992?v=4' width='36' height='36' alt='@tmeneau'>`
- `@tmeneau <https://github.com/tmeneau>`_
- 1
----
Auto-generated by a `forked version <https://github.com/myii/maintainer>`_ of `gaocegege/maintainer <https://github.com/gaocegege/maintainer>`_ on 2021-01-12.
Models
======
.. automodule:: experimentor.models
:members:
.. toctree::
:maxdepth: 1
action
feature
properties
models
exceptions
meta
devices/index
experiments/index
# Obtain a token
POST http://127.0.0.1:8000/api/sanctum/token HTTP/1.1
Content-Type: application/json
Accept: application/json
{
"email": "administrador@dominio.com",
"password": "123456789",
"device_name": "nuevo"
}
# Example of using the token
GET http://127.0.0.1:8000/api/user HTTP/1.1
Content-Type: application/json
Accept: application/json
Authorization: Bearer 1|vviVWE5TMd4Vn1rwjNoVPLLO44PJlDPLjQlwco5G
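The same two requests can also be built from Python. This is a sketch that only constructs the requests (it does not send them), reusing the hypothetical host, endpoints and token shown above:

```python
import json
import urllib.request

API = "http://127.0.0.1:8000/api"

def build_token_request(email, password, device_name):
    """Build the POST /api/sanctum/token request with a JSON body."""
    body = json.dumps({"email": email, "password": password,
                       "device_name": device_name}).encode()
    return urllib.request.Request(
        API + "/sanctum/token", data=body, method="POST",
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"})

def build_user_request(token):
    """Build the GET /api/user request authorized with a bearer token."""
    return urllib.request.Request(
        API + "/user", method="GET",
        headers={"Accept": "application/json",
                 "Authorization": "Bearer " + token})

req = build_user_request("1|vviVWE5TMd4Vn1rwjNoVPLLO44PJlDPLjQlwco5G")
print(req.get_header("Authorization"))
```

Sending the built request with ``urllib.request.urlopen(req)`` would perform the same calls as the requests above.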
Apache CouchDB + SpotMe patchset
================================
A SpotMe fork of Apache CouchDB. All patches reside in separate `spotme/` branches.
Patches (not a full list; excludes patches merged into upstream)
----------------------------------------------------------------
### 1. Support for `validate_doc_read` functions (JS and native Erlang ones)
CouchDB already has support for [validate_doc_update](http://docs.couchdb.org/en/master/ddocs/ddocs.html#validate-document-update-functions) functions which, if defined in the design doc, can be used to prevent invalid or unauthorised document update requests from being stored; in other words, document-level write security. This patch further enhances security by adding fine-grained document-level read security: it implements support for `validate_doc_read` functions.
`validate_doc_read` has slightly different semantics when used together with `_bulk_get`. Normally, for a single GET request that violates VDR rules the response will be `403 Forbidden`. But when a forbidden document is requested as part of an `{"id", "rev"}` batch in `_bulk_get`, instead of throwing an error for the whole batch CouchDB returns a so-called "stub" of the forbidden document, which always has a `validate_doc_read_error` field whose value is `"forbidden"`, and optionally a detailed description and the VDR error code.
```
{
  "_id": "id",
  "_rev": "rev",
  "validate_doc_read_error": "forbidden",
  "forbidden": "not allowed (error code 104), DocId: 032d9670-ee8f-427c-9a44-e0a77bb725fc Rev: 1-7a48478d435872b71dd2929f3439b213"
}
```
This was done because VDR is a custom SpotMe patch which has not been accepted into upstream Apache CouchDB, while the `_bulk_get` patch has been accepted. For obvious reasons, changing the `_bulk_get` handler logic to filter out those stubs is not acceptable. When the VDR patch was designed, the core idea was to avoid intrusiveness at all costs. That is why dropping the concept of document stubs and throwing an exception early internally is also not possible: it breaks a lot of different internal code paths and would require massive, unrelated code changes in CouchDB's storage engine, which was not designed with the assumption that internal lookup functions may have side effects. On the other hand, changing the VDR response for a single GET request to also return a "stub" document would break the [CouchDB API contracts](https://docs.couchdb.org/en/stable/query-server/protocol.html#forbidden) for validate functions. Hence having these stubs was the only viable trade-off.
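Client code consuming `_bulk_get` therefore has to be prepared to find these stubs mixed into an otherwise successful batch. A minimal Python sketch of that client-side split (the field names are the ones from the stub above; the row shape is simplified):

```python
def split_bulk_get_docs(docs):
    """Separate readable documents from VDR 'forbidden' stubs.

    A stub is recognised purely by the presence of the
    'validate_doc_read_error' field with the value 'forbidden'.
    """
    readable, forbidden = [], []
    for doc in docs:
        if doc.get("validate_doc_read_error") == "forbidden":
            forbidden.append(doc)
        else:
            readable.append(doc)
    return readable, forbidden


docs = [
    {"_id": "a", "_rev": "1-x", "value": 42},
    {"_id": "b", "_rev": "1-y",
     "validate_doc_read_error": "forbidden",
     "forbidden": "not allowed (error code 104)"},
]
readable, forbidden = split_bulk_get_docs(docs)
print(len(readable), len(forbidden))  # 1 1
```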
### Performance notes
These functions have to be implemented in Erlang so that they are executed inside `couch_native_process`; implemented in JS they will lay waste to overall performance. Currently CouchDB sends all JS map functions in design documents to the `couch_query_server`, which spawns a pool of external `couch_js` OS processes (by default `proc_manager` spawns one OS process per design doc). This involves serialising/deserialising strings and sending them back and forth over slow standard I/O. And we know the biggest complaint about standard I/O is the performance impact of the double copy (standard I/O buffer -> kernel -> I/O buffer). In contrast, Erlang implementations of the VDR functions have almost zero overhead, because the map function goes directly to the `couch_native_process`, which tokenises (`erl_scan`), parses (`erl_parse`), calls `erl_eval` and runs the function.
Optimisations for Erlang validate_doc_update and validate_doc_read to leverage caching for evaluated BEAM code have been implemented as a separate SpotMe patch.
### 2. Support for an indexed view-filtered changes feed.
Changes feed with a [_view filter](http://docs.couchdb.org/en/2.2.0/api/database/changes.html#view) allows using an existing map function as the filter, but unfortunately it doesn't query the view index files. Fortunately enough, CouchDB's `mrview` has an unused internal implementation of two separate B+ tree indexes, namely `Seq` and `KeyBySeq`, which are responsible for indexing the changes feed with view keys. This functionality, marked as `fast_view`, was never exposed to the `fabric` cluster layer and the `chttpd` API because it has its own problems. Long story short, it appears to work in some simple cases, but quickly goes south once keys in views start to be removed and then reappear. For instance, when the same keys appear in a view, are then removed by doc updates or deletions, and then reappear with new updates, the `seq`-index breaks pretty quickly. Another problem is that, unlike DB b-tree keys, which are always strings, view keys in the seq b-tree can be anything, so sorting might potentially yield unexpected results. The `Seq` index currently looks like this: `{UpdSeq, ViewKey} -> {DocID, EmittedValue, DocRev}`, where the `UpdSeq` number is the sum of the individual update sequences encoded in the second part of the update sequence as base64, and `ViewKey` can be any data type mapped from JSON: binary, integer, float or boolean. Currently `mrview` uses Erlang native collation to find the first key to start from, so a consistent lower-limit key cannot be constructed.
According to discussions on the dev mailing list, support for the hidden `seq` and `keyseq` indexes will be completely removed in the future and reimplemented from scratch. Nevertheless, the `seq`/`keyseq`-indexed changes feed has proved to work for simple, controlled scenarios. As a result, the aforementioned functionality has been extended in this fork and some bugs have been fixed (compaction of `seq`-indexes, b-tree reduce, some replication scenarios), and new functionality has been implemented, namely: view-filtered replication with rcouch, support for custom `/_last_seq` and `_view/_info` endpoints, and a `fabric` interface to the underlying `mrview` `seq`/`keyseq`-indexes.
The additional `KeyBySeq` index allows applying familiar view query params to the changes feed, such as `key`, `start_key`, `end_key`, `limit` and so on, which enables a somewhat limited [channel-like](https://developer.couchbase.com/documentation/mobile/current/guides/sync-gateway/channels/index.html#introduction-to-channels) functionality in vanilla Apache CouchDB.
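To illustrate the idea (this is a rough Python model, not the actual Erlang implementation), key-range parameters applied to key-indexed changes rows behave roughly like this; note the model assumes mutually comparable keys, which real mixed-type view keys need not be:

```python
def filter_changes_by_key(rows, start_key=None, end_key=None, limit=None):
    """Model of a KeyBySeq-style filtered changes feed.

    Each row is (view_key, seq, doc_id). Rows are ordered by
    (key, seq) and then filtered by the familiar view query params.
    """
    out = []
    for key, seq, doc_id in sorted(rows, key=lambda r: (r[0], r[1])):
        if start_key is not None and key < start_key:
            continue
        if end_key is not None and key > end_key:
            continue
        out.append((key, seq, doc_id))
        if limit is not None and len(out) >= limit:
            break
    return out


rows = [("news", 7, "d1"), ("sports", 9, "d2"), ("news", 12, "d3")]
print(filter_changes_by_key(rows, start_key="news", end_key="news"))
# [('news', 7, 'd1'), ('news', 12, 'd3')]
```

Here the view key plays the role of a "channel" name: a consumer asking for `key="news"` only sees changes whose map function emitted that key.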
### 3. Support for bulk get with multipart response
**UPD** This functionality has been submitted to the upstream repo in this [PR](https://github.com/apache/couchdb/pull/1195) and will hopefully be merged soon. The PR also has a detailed description of and motivation for the feature.
### 4. Recon patch
Integrates Fred Hebert's recon patch into our CouchDB build to troubleshoot production nodes.
### 5. CouchDB replicator-watchdog
Fixes CouchDB scheduled continuous replications, making them more reliable and closer to the real-time requirements imposed by some of SpotMe's critical business features. Namely, it monitors for slow or "stuck" replication jobs within the `couch_replicator` supervision tree and re-forces them, and it handles crashing replication jobs which were suspended by the scheduler.
https://github.com/Spotme/couchdb/pull/2
### 6. _bulk-get with `multipart/mixed` response (merged into upstream)
This also was a SpotMe patch, but it has been accepted into upstream. Soon afterwards an issue was found with `_bulk_get`: the `atts_since` param was causing a malformed `multipart/mixed` response. This was fixed on the SpotMe side, and a PR to upstream should be sent. As this is a bug there is a very high chance it will be accepted, but for now it is a separate `0009` patch.
Patches containing small fixes, and those which have already been merged into upstream, are not mentioned here:
https://github.com/apache/couchdb/pull/1401
https://github.com/apache/couchdb/pull/1165
https://github.com/apache/couchdb/pull/1164
Verifying your Installation
---------------------------
Run a basic test suite for CouchDB by browsing here:
http://127.0.0.1:5984/_utils/#verifyinstall
Getting started with developing
-------------------------------
For more detail, read the README-DEV.rst file in this directory.
Basically you just have to install the needed dependencies which are
documented in the install docs and then run ``./configure && make``.
You don't need to run ``make install`` after compiling, just use
``./dev/run`` to spin up three nodes. You can add haproxy as a caching
layer in front of this cluster by running ``./dev/run --with-haproxy
--haproxy=/path/to/haproxy``. You will now have a local cluster
listening on port 5984.
For Fauxton developers fixing the admin-party does not work via the button in
Fauxton. To fix the admin party you have to run ``./dev/run`` with the ``admin``
flag, e.g. ``./dev/run --admin=username:password``. If you want to have an
admin-party, just omit the flag.
Contributing to CouchDB
-----------------------
You can learn more about our contributing process here:
https://github.com/apache/couchdb/blob/master/CONTRIBUTING.md
Cryptographic Software Notice
-----------------------------
This distribution includes cryptographic software. The country in which you
currently reside may have restrictions on the import, possession, use, and/or
re-export to another country, of encryption software. BEFORE using any
encryption software, please check your country's laws, regulations and policies
concerning the import, possession, or use, and re-export of encryption software,
to see if this is permitted. See <http://www.wassenaar.org/> for more
information.
The U.S. Government Department of Commerce, Bureau of Industry and Security
(BIS), has classified this software as Export Commodity Control Number (ECCN)
5D002.C.1, which includes information security software using or performing
cryptographic functions with asymmetric algorithms. The form and manner of this
Apache Software Foundation distribution makes it eligible for export under the
License Exception ENC Technology Software Unrestricted (TSU) exception (see the
BIS Export Administration Regulations, Section 740.13) for both object code and
source code.
The following provides more details on the included cryptographic software:
CouchDB includes a HTTP client (ibrowse) with SSL functionality.
FAQ
=======================
.. _no-serial-device-found:
No serial devices found
--------------------------------
Here are a few things you can try:
1. The first thing you should do is simply click **CLOSE** and try again.
2. Make sure you have turned on the Sense Node C3 and that the D2 LED is on.
3. Make sure you connected the Sense Node C3 to the device running Home Assistant, not to a remote machine.
4. Make sure the Type-C cable supports data transmission (USB 2.0 and above); some cables are power-only.
5. Try forcing the device into flash mode. Sometimes, for example with bad firmware, the device cannot enter flash mode on its own, so you need to do it manually: hold down the IO9 button, then press and release the RST button. Remember to press RST again after flashing is done, as manual flash mode will not reboot automatically.
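If none of that helps, it can be useful to check whether the board enumerates as a USB serial device at all. A small stdlib-only Python check for Linux; the `/dev/ttyUSB*` and `/dev/ttyACM*` patterns are the usual names for USB-serial devices on Linux (other operating systems use different names):

```python
import glob

def find_usb_serial_ports():
    """List USB serial device nodes commonly created on Linux."""
    ports = []
    for pattern in ("/dev/ttyUSB*", "/dev/ttyACM*"):
        ports.extend(sorted(glob.glob(pattern)))
    return ports

if __name__ == "__main__":
    ports = find_usb_serial_ports()
    if ports:
        print("Serial devices:", ", ".join(ports))
    else:
        print("No USB serial devices found; re-check cable and power.")
```

If the list stays empty while the board is plugged in and powered, the problem is most likely the cable, the port, or the board itself rather than the Home Assistant side.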
.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. http://creativecommons.org/licenses/by/4.0
.. (c) OPNFV, Intel Corporation, AT&T, Tieto and others.
.. _integration-tests:
Integration tests
=================
VSPERF includes a set of integration tests defined in conf/integration.
These tests can be run by specifying --integration as a parameter to vsperf.
Current tests in conf/integration include switch functionality and Overlay
tests.
Tests in the conf/integration can be used to test scaling of different switch
configurations by adding steps into the test case.
For the overlay tests VSPERF supports VXLAN, GRE and GENEVE tunneling protocols.
Testing of these protocols is limited to unidirectional traffic and
P2P (Physical to Physical scenarios).
NOTE: The configuration for overlay tests provided in this guide is for
unidirectional traffic only.
NOTE: The overlay tests require an IxNet traffic generator. The tunneled traffic
is configured by ``ixnetrfc2544v2.tcl`` script. This script can be used
with all supported deployment scenarios for generation of frames with VXLAN, GRE
or GENEVE protocols. In that case, the options "Tunnel Operation" and
"TRAFFICGEN_IXNET_TCL_SCRIPT" must be properly configured in the testcase definition.
Executing Integration Tests
---------------------------
To execute integration tests, VSPERF is run with the ``--integration`` parameter.
To view the current test list, simply execute the following command:
.. code-block:: console
./vsperf --integration --list
The standard tests included are defined inside the
``conf/integration/01_testcases.conf`` file.
Executing Tunnel encapsulation tests
------------------------------------
The VXLAN OVS DPDK encapsulation tests require IPs, MAC addresses,
bridge names and WHITELIST_NICS for DPDK.
NOTE: Only Ixia traffic generators currently support the execution of the tunnel
encapsulation tests. Support for other traffic generators may come in a future
release.
Default values are already provided. To customize for your environment, override
the following variables in your user_settings.py file:
.. code-block:: python
# Variables defined in conf/integration/02_vswitch.conf
# Tunnel endpoint for Overlay P2P deployment scenario
# used for br0
VTEP_IP1 = '192.168.0.1/24'
# Used as remote_ip in adding OVS tunnel port and
    # to set ARP entry in OVS (e.g. tnl/arp/set br-ext 192.168.240.10 02:00:00:00:00:02)
VTEP_IP2 = '192.168.240.10'
# Network to use when adding a route for inner frame data
VTEP_IP2_SUBNET = '192.168.240.0/24'
# Bridge names
TUNNEL_INTEGRATION_BRIDGE = 'vsperf-br0'
TUNNEL_EXTERNAL_BRIDGE = 'vsperf-br-ext'
# IP of br-ext
TUNNEL_EXTERNAL_BRIDGE_IP = '192.168.240.1/24'
# vxlan|gre|geneve
TUNNEL_TYPE = 'vxlan'
# Variables defined conf/integration/03_traffic.conf
# For OP2P deployment scenario
TRAFFICGEN_PORT1_MAC = '02:00:00:00:00:01'
TRAFFICGEN_PORT2_MAC = '02:00:00:00:00:02'
TRAFFICGEN_PORT1_IP = '1.1.1.1'
TRAFFICGEN_PORT2_IP = '192.168.240.10'
To run VXLAN encapsulation tests:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'TUNNEL_TYPE=vxlan' overlay_p2p_tput
To run GRE encapsulation tests:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'TUNNEL_TYPE=gre' overlay_p2p_tput
To run GENEVE encapsulation tests:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'TUNNEL_TYPE=geneve' overlay_p2p_tput
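All three tunnel types can also be swept in one go with a small shell loop. This is a convenience sketch only; it assumes ``vsperf`` and ``user_settings.py`` live in the current directory, and the ``echo`` can be removed to actually execute each run:

```shell
# Build (and print) the command line for each supported tunnel type.
# Remove the 'echo' to execute the runs for real.
for tunnel in vxlan gre geneve; do
    cmd="./vsperf --conf-file user_settings.py --integration --test-params TUNNEL_TYPE=${tunnel} overlay_p2p_tput"
    echo "${cmd}"
done
```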
To run OVS NATIVE tunnel tests (VXLAN/GRE/GENEVE):
1. Install the OVS kernel modules
.. code:: console
cd src/ovs/ovs
sudo -E make modules_install
2. Set the following variables:
.. code-block:: python
VSWITCH = 'OvsVanilla'
# Specify vport_* kernel module to test.
PATHS['vswitch']['OvsVanilla']['src']['modules'] = [
'vport_vxlan',
'vport_gre',
'vport_geneve',
'datapath/linux/openvswitch.ko',
]
**NOTE:** If Vanilla OVS is installed from a binary package, please set
``PATHS['vswitch']['OvsVanilla']['bin']['modules']`` instead.
3. Run tests:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'TUNNEL_TYPE=vxlan' overlay_p2p_tput
Executing VXLAN decapsulation tests
------------------------------------
To run VXLAN decapsulation tests:
1. Set the variables used in "Executing Tunnel encapsulation tests"
2. Run test:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration overlay_p2p_decap_cont
If you want to use different values for your VXLAN frame, you may set:
.. code-block:: python
VXLAN_FRAME_L3 = {'proto': 'udp',
'packetsize': 64,
'srcip': TRAFFICGEN_PORT1_IP,
'dstip': '192.168.240.1',
}
VXLAN_FRAME_L4 = {'srcport': 4789,
'dstport': 4789,
'vni': VXLAN_VNI,
'inner_srcmac': '01:02:03:04:05:06',
'inner_dstmac': '06:05:04:03:02:01',
'inner_srcip': '192.168.0.10',
'inner_dstip': '192.168.240.9',
'inner_proto': 'udp',
'inner_srcport': 3000,
'inner_dstport': 3001,
}
Executing GRE decapsulation tests
---------------------------------
To run GRE decapsulation tests:
1. Set the variables used in "Executing Tunnel encapsulation tests"
2. Run test:
.. code-block:: console
./vsperf --conf-file user_settings.py --test-params 'TUNNEL_TYPE=gre' \
--integration overlay_p2p_decap_cont
If you want to use different values for your GRE frame, you may set:
.. code-block:: python
GRE_FRAME_L3 = {'proto': 'gre',
'packetsize': 64,
'srcip': TRAFFICGEN_PORT1_IP,
'dstip': '192.168.240.1',
}
GRE_FRAME_L4 = {'srcport': 0,
                    'dstport': 0,
'inner_srcmac': '01:02:03:04:05:06',
'inner_dstmac': '06:05:04:03:02:01',
'inner_srcip': '192.168.0.10',
'inner_dstip': '192.168.240.9',
'inner_proto': 'udp',
'inner_srcport': 3000,
'inner_dstport': 3001,
}
Executing GENEVE decapsulation tests
------------------------------------
IxNet 7.3X does not have native support for the GENEVE protocol. The
template, GeneveIxNetTemplate.xml_ClearText.xml, should be imported
into IxNetwork for this testcase to work.
To import the template do:
1. Run the IxNetwork TCL Server
2. Click on the Traffic menu
3. Click on the Traffic actions and click Edit Packet Templates
4. In the Template editor window, click Import. Select the template
   located at ``3rd_party/ixia/GeneveIxNetTemplate.xml_ClearText.xml``
   and click Import.
5. Restart the TCL Server.
To run GENEVE decapsulation tests:
1. Set the variables used in "Executing Tunnel encapsulation tests"
2. Run test:
.. code-block:: console
./vsperf --conf-file user_settings.py --test-params 'tunnel_type=geneve' \
--integration overlay_p2p_decap_cont
If you want to use different values for your GENEVE frame, you may set:
.. code-block:: python
GENEVE_FRAME_L3 = {'proto': 'udp',
'packetsize': 64,
'srcip': TRAFFICGEN_PORT1_IP,
'dstip': '192.168.240.1',
}
GENEVE_FRAME_L4 = {'srcport': 6081,
'dstport': 6081,
'geneve_vni': 0,
'inner_srcmac': '01:02:03:04:05:06',
'inner_dstmac': '06:05:04:03:02:01',
'inner_srcip': '192.168.0.10',
'inner_dstip': '192.168.240.9',
'inner_proto': 'udp',
'inner_srcport': 3000,
'inner_dstport': 3001,
}
Executing Native/Vanilla OVS VXLAN decapsulation tests
------------------------------------------------------
To run VXLAN decapsulation tests:
1. Set the following variables in your user_settings.py file:
.. code-block:: python
PATHS['vswitch']['OvsVanilla']['src']['modules'] = [
'vport_vxlan',
'datapath/linux/openvswitch.ko',
]
TRAFFICGEN_PORT1_IP = '172.16.1.2'
TRAFFICGEN_PORT2_IP = '192.168.1.11'
VTEP_IP1 = '172.16.1.2/24'
VTEP_IP2 = '192.168.1.1'
VTEP_IP2_SUBNET = '192.168.1.0/24'
TUNNEL_EXTERNAL_BRIDGE_IP = '172.16.1.1/24'
TUNNEL_INT_BRIDGE_IP = '192.168.1.1'
VXLAN_FRAME_L2 = {'srcmac':
'01:02:03:04:05:06',
'dstmac':
'06:05:04:03:02:01',
}
VXLAN_FRAME_L3 = {'proto': 'udp',
'packetsize': 64,
'srcip': TRAFFICGEN_PORT1_IP,
'dstip': '172.16.1.1',
}
VXLAN_FRAME_L4 = {
'srcport': 4789,
'dstport': 4789,
'protocolpad': 'true',
'vni': 99,
'inner_srcmac': '01:02:03:04:05:06',
'inner_dstmac': '06:05:04:03:02:01',
'inner_srcip': '192.168.1.2',
'inner_dstip': TRAFFICGEN_PORT2_IP,
'inner_proto': 'udp',
'inner_srcport': 3000,
'inner_dstport': 3001,
}
**NOTE:** If Vanilla OVS is installed from a binary package, please set
``PATHS['vswitch']['OvsVanilla']['bin']['modules']`` instead.
2. Run test:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'tunnel_type=vxlan' overlay_p2p_decap_cont
Executing Native/Vanilla OVS GRE decapsulation tests
----------------------------------------------------
To run GRE decapsulation tests:
1. Set the following variables in your user_settings.py file:
.. code-block:: python
PATHS['vswitch']['OvsVanilla']['src']['modules'] = [
'vport_gre',
'datapath/linux/openvswitch.ko',
]
TRAFFICGEN_PORT1_IP = '172.16.1.2'
TRAFFICGEN_PORT2_IP = '192.168.1.11'
VTEP_IP1 = '172.16.1.2/24'
VTEP_IP2 = '192.168.1.1'
VTEP_IP2_SUBNET = '192.168.1.0/24'
TUNNEL_EXTERNAL_BRIDGE_IP = '172.16.1.1/24'
TUNNEL_INT_BRIDGE_IP = '192.168.1.1'
GRE_FRAME_L2 = {'srcmac':
'01:02:03:04:05:06',
'dstmac':
'06:05:04:03:02:01',
}
GRE_FRAME_L3 = {'proto': 'udp',
'packetsize': 64,
'srcip': TRAFFICGEN_PORT1_IP,
'dstip': '172.16.1.1',
}
GRE_FRAME_L4 = {
'srcport': 4789,
'dstport': 4789,
'protocolpad': 'true',
'inner_srcmac': '01:02:03:04:05:06',
'inner_dstmac': '06:05:04:03:02:01',
'inner_srcip': '192.168.1.2',
'inner_dstip': TRAFFICGEN_PORT2_IP,
'inner_proto': 'udp',
'inner_srcport': 3000,
'inner_dstport': 3001,
}
**NOTE:** If Vanilla OVS is installed from a binary package, please set
``PATHS['vswitch']['OvsVanilla']['bin']['modules']`` instead.
2. Run test:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'tunnel_type=gre' overlay_p2p_decap_cont
Executing Native/Vanilla OVS GENEVE decapsulation tests
-------------------------------------------------------
To run GENEVE decapsulation tests:
1. Set the following variables in your user_settings.py file:
.. code-block:: python
PATHS['vswitch']['OvsVanilla']['src']['modules'] = [
'vport_geneve',
'datapath/linux/openvswitch.ko',
]
TRAFFICGEN_PORT1_IP = '172.16.1.2'
TRAFFICGEN_PORT2_IP = '192.168.1.11'
VTEP_IP1 = '172.16.1.2/24'
VTEP_IP2 = '192.168.1.1'
VTEP_IP2_SUBNET = '192.168.1.0/24'
TUNNEL_EXTERNAL_BRIDGE_IP = '172.16.1.1/24'
TUNNEL_INT_BRIDGE_IP = '192.168.1.1'
GENEVE_FRAME_L2 = {'srcmac':
'01:02:03:04:05:06',
'dstmac':
'06:05:04:03:02:01',
}
GENEVE_FRAME_L3 = {'proto': 'udp',
'packetsize': 64,
'srcip': TRAFFICGEN_PORT1_IP,
'dstip': '172.16.1.1',
}
GENEVE_FRAME_L4 = {'srcport': 6081,
'dstport': 6081,
'protocolpad': 'true',
'geneve_vni': 0,
'inner_srcmac': '01:02:03:04:05:06',
'inner_dstmac': '06:05:04:03:02:01',
'inner_srcip': '192.168.1.2',
'inner_dstip': TRAFFICGEN_PORT2_IP,
'inner_proto': 'udp',
'inner_srcport': 3000,
'inner_dstport': 3001,
}
**NOTE:** If Vanilla OVS is installed from a binary package, please set
``PATHS['vswitch']['OvsVanilla']['bin']['modules']`` instead.
2. Run test:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
--test-params 'tunnel_type=geneve' overlay_p2p_decap_cont
Executing Tunnel encapsulation+decapsulation tests
--------------------------------------------------
The OVS DPDK encapsulation/decapsulation tests require IPs, MAC addresses,
bridge names and WHITELIST_NICS for DPDK.
Unlike the test cases above, these test cases exercise tunnel encapsulation and
decapsulation without using any ingress overlay traffic. To achieve this, OVS
is configured to perform encapsulation and decapsulation in series on the same
traffic stream, as given below.
TRAFFIC-IN --> [ENCAP] --> [MOD-PKT] --> [DECAP] --> TRAFFIC-OUT
Default values are already provided. To customize for your environment, override
the following variables in your user_settings.py file:
.. code-block:: python
# Variables defined in conf/integration/02_vswitch.conf
# Bridge names
TUNNEL_EXTERNAL_BRIDGE1 = 'br-phy1'
TUNNEL_EXTERNAL_BRIDGE2 = 'br-phy2'
TUNNEL_MODIFY_BRIDGE1 = 'br-mod1'
TUNNEL_MODIFY_BRIDGE2 = 'br-mod2'
# IP of br-mod1
TUNNEL_MODIFY_BRIDGE_IP1 = '10.0.0.1/24'
# Mac of br-mod1
TUNNEL_MODIFY_BRIDGE_MAC1 = '00:00:10:00:00:01'
# IP of br-mod2
TUNNEL_MODIFY_BRIDGE_IP2 = '20.0.0.1/24'
#Mac of br-mod2
TUNNEL_MODIFY_BRIDGE_MAC2 = '00:00:20:00:00:01'
    # vxlan|gre|geneve; only VXLAN is supported for now.
TUNNEL_TYPE = 'vxlan'
To run VXLAN encapsulation+decapsulation tests:
.. code-block:: console
./vsperf --conf-file user_settings.py --integration \
overlay_p2p_mod_tput
************************
Controlling Franka & ROS
************************
Starting ROS to control Franka
==============================
.. TODO decide if we are using this commented section.
.. The image below describes how we plan to control the Arm using Python. To be able to write a successful Python program, we must first understand how ROS works: how to publish and listen on topics.
..
.. .. figure:: _static/franka_programming_interface.png
.. :align: center
.. :figclass: align-center
..
.. Interfacing Python with FRANKA.
The recommended setup for controlling Franka can be seen in the tree below. The top level shows the devices required and their IP addresses; the second tier shows the commands needed to run the files::
.
├── Franka Control Box (192.168.0.88)
├── Host workstation computer (192.168.0.77)
│ ├── roscore
│ └── ./franka_controller_sub 192.168.0.88
└── Project workstation computer
├── roslaunch openni2_launch openni2.launch
    ├── camera_subscriber.py (this does not need to be run separately)
└── main.py
.. tip:: To test the image feed out without using it in ``main.py`` you can use the ``test_camera.py`` file in the ``tests`` folder.
.. Using a single workstation for roscore
.. --------------------------------------
..
.. .. warning:: **This is not recommended**. In order to maintain the realtime loop required by the ``franka_controller_sub`` it should run on its own computer.
..
.. To use ROS to control the Franka arm from one workstation, you need to have the master node running for ROS. To do this, open a new terminal window and run:
..
.. .. code-block:: bash
..
.. roscore
..
.. From this point, you can now initialise the subscriber node.
Networking with other workstations
----------------------------------
Instead of running the master node and subscriber nodes on your own workstation, these can be running on the main workstation in the lab instead. This means that libfranka won't need to be installed on your specific workstation.
To communicate over the lab network you need to change two main ROS variables. Firstly you need to find the IP address of your computer when connected to the lab network (via ethernet). To do this you can use ``ifconfig`` in a terminal window to give you your ``<ip_address_of_pc>``.
You then need to run the following two commands in your terminal window (substitute in your IP address):
.. code-block:: bash
export ROS_MASTER_URI=http://192.168.0.77:11311
export ROS_IP=<ip_address_of_pc>
As you will see, this is connecting you to the static IP address of the main Franka workstation, ``192.168.0.77``. In order for you to continue with running a Python publisher, you need to ensure that roscore and the subscriber are running on the main workstation.
.. note:: This configuration of assigning IP addresses to ROS_MASTER_URI and ROS_IP is non-permanent, and is only active for the terminal window you are working in. It has to be repeated for every window you use to run rospy. Alternatively, you can add these commands to your bashrc.
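These exports can be made permanent by appending them to your ``~/.bashrc``. The sketch below is a minimal example; ``192.168.0.42`` stands in for your own workstation address, as reported by ``ifconfig``:

```shell
# Persist the ROS networking variables for future terminal sessions.
# Replace 192.168.0.42 with your own workstation's lab-network address.
echo 'export ROS_MASTER_URI=http://192.168.0.77:11311' >> ~/.bashrc
echo 'export ROS_IP=192.168.0.42' >> ~/.bashrc

# New terminals pick these up automatically; for the current shell,
# export them once by hand:
export ROS_MASTER_URI=http://192.168.0.77:11311
export ROS_IP=192.168.0.42
```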
Running the subscriber
======================
Once ``roscore`` is running, the subscriber has to run in the background to read the messages from our Python publisher and execute them with libfranka. To run the subscriber (from the project folder), run:
.. code-block:: bash
cd franka/
./franka_controller_sub 192.168.0.88
Sometimes this executable throws an "**error with no active exception**". This can often be solved by simply moving the arm a little by hand using the buttons, then rerunning the command above.
.. warning:: This subscriber is compiled for ``libfranka 0.1.0``. You can check your current ``libfranka`` version with ``rosversion libfranka`` command.
Using the publisher
===================
First, make sure you are running ``roscore`` and the subscriber, ``franka_controller_sub`` on the main lab workstation.
**Example**
Assuming you are writing a script (``script.py``) that wants to control Franka, the files should be stored as::
.
├── README.md
├── franka
│ ├── __init__.py
│ ├── franka_control_ros.py
│ ├── franka_controller_sub
│ ├── franka_controller_sub.cpp
│ ├── print_joint_positions
│ └── print_joint_positions.cpp
└── script.py
To use the ``FrankaRos`` class in your own Python script would look something like this:
.. code-block:: python
from franka.franka_control_ros import FrankaRos
franka = FrankaRos(debug_flag=True)
# we set the flag true to get prints to the console about what FrankaRos is doing
while True:
        data = franka.get_position()
print("End effector position:")
print("X: ", data[0])
print("Y: ", data[1])
print("Z: ", data[2])
.. automodule:: franka.franka_control_ros
:members:
:undoc-members:
Using Franka without ROS
========================
.. note:: **This method is deprecated**. It is now recommended you use ROS to control the Arm using a Python publisher, such as the way described above.
Setting Permissions
-------------------
To control the Franka Arm, Fabian and Petar have written a small collection of C++ files which can be compiled to run as executables and control the Franka Arm using libfranka.
You need to ensure you have set the correct permissions for libfranka. You can check that in :ref:`setting-permissions`.
Downloading the C++ Executables and Python Class
------------------------------------------------
Now that you have libfranka set up properly you can use the C++ files provided. These resources can be found in the ``/franka`` `directory`_ of the repository. Firstly, go to your project directory in the terminal by using ``cd <project_dir>``. If you have already downloaded the files before and are replacing them with an up-to-date version, run ``rm -rf franka/`` first. To download the necessary folder, run::
svn export https://github.com/nebbles/DE3-ROB1-CHESS/trunk/franka
.. _`directory`: https://github.com/nebbles/DE3-ROB1-CHESS/tree/master/franka
Once this directory is downloaded into your project directory, you need to change directory and then make the binaries executable::
cd franka/
chmod a+x franka*
chmod a-x *.cpp
.. warning::
This next command will move the FRANKA. **Make sure you have someone in charge of the external activation device (push button)**.
These binaries can now be used from the command line to control the Arm::
./franka_move_to_relative <ip_address> <delta_X> <delta_Y> <delta_Z>
Alternatively, you can control the Arm using the easy custom Python class ``Caller`` (see below).
Python-Franka API with ``franka_control.py``
--------------------------------------------
The Python-FRANKA module (``franka_control.py``) is designed to allow easy access to the C++ controller programs provided by Petar. The provided Python module is structured as follows.
.. TODO
.. .. automodule:: franka.archive.franka_control
.. :members:
.. :undoc-members:
**Example**
To use the ``FrankaControl`` class in your own Python script would look something like this:
.. code-block:: python
from franka.franka_control import FrankaControl
arm = FrankaControl(debug_flag=True)
# we set the flag true to get prints to the console about what Caller is doing
arm.move_relative(dx=0.1, dy=0.0, dz=-0.3)
    # we tell the arm to move down by 30cm and along x away from base by 10cm
.. note::
This example code assumes you are following the general project structure guide. See below for more information. The code above would be called from a main script such as ``run.py``.
General Structure of Project
----------------------------
The structure of the project is important to ensure that importing between modules works properly and also seperates externally maintained code from your own. An example of a project tree is::
.
├── LICENSE.txt
├── README.md
├── run.py
├── __init__.py
├── docs
│ ├── Makefile
│ ├── build
│ ├── make.bat
│ └── source
├── franka
│ ├── __init__.py
│ ├── franka_control.py
│ ├── franka_get_current_position
│ ├── franka_get_current_position.cpp
│ ├── franka_move_to_absolute
│ ├── franka_move_to_absolute.cpp
│ ├── franka_move_to_relative
│ └── franka_move_to_relative.cpp
├── my_modules
│ ├── module1.py
│ ├── module2.py
│ ├── module3.py
│ ├── __init__.py
└── test_script.py
Additional Resources
====================
https://frankaemika.github.io/docs/getting_started.html#operating-the-robot
========
Overview
========
Requirements
============
#. PHP 7.1
.. _installation:
Installation
============
The recommended way to install DotArray is with
`Composer <https://getcomposer.org>`_. Composer is a dependency management tool
for PHP that allows you to declare the dependencies your project needs and
installs them into your project.
.. code-block:: bash
# Install Composer
curl -sS https://getcomposer.org/installer | php
You can add DotArray as a dependency using the composer.phar CLI:
.. code-block:: bash
php composer.phar require binary-cube/dot-array
Alternatively, you can specify DotArray as a dependency in your project's
existing composer.json file:
.. code-block:: js
{
"require": {
"binary-cube/dot-array": "*"
}
}
After installing, you need to require Composer's autoloader:
.. code-block:: php
require 'vendor/autoload.php';
You can find out more on how to install Composer, configure autoloading, and
other best-practices for defining dependencies at `getcomposer.org <https://getcomposer.org>`_.
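After installing, a quick way to sanity-check the package is a small dot-notation access sketch. Note that this is illustrative only: the class and method names (``DotArray::create``, ``get``) follow the library's README and may differ between versions.

```php
<?php

require 'vendor/autoload.php';

use BinaryCube\DotArray\DotArray;

// Wrap a nested array and read deep values with dot-separated paths.
$config = DotArray::create([
    'database' => [
        'host' => 'localhost',
        'port' => 3306,
    ],
]);

// Dot-notation lookup instead of $config['database']['host'].
echo $config->get('database.host'), PHP_EOL;

// A default value can typically be supplied for missing keys.
echo $config->get('database.user', 'root'), PHP_EOL;
```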
License
=======
Licensed using the `MIT license <http://opensource.org/licenses/MIT>`_.
Copyright (c) 2018 Binary Cube
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Bugs and feature requests
=========================
Have a bug or a feature request?
Please first read the issue guidelines and search for existing and closed issues.
If your problem or idea is not addressed yet, `please open a new issue <https://github.com/binary-cube/dot-array/issues/new>`_.
Contributing
============
All contributions are more than welcomed.
Contributions may close an issue, fix a bug (reported or not reported), add new design blocks,
improve the existing code, add a new feature, and so on.
In the interest of fostering an open and welcoming environment,
we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone,
regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality,
personal appearance, race, religion, or sexual identity and orientation.
`Read the full Code of Conduct <https://github.com/binary-cube/dot-array/blob/master/code-of-conduct.md>`_.
Versioning
===========
Through the development of new versions, we're going use the `Semantic Versioning <https://semver.org>`_.
Example: `1.0.0`.
- Major release: increment the first digit and reset middle and last digits to zero. Introduces major changes that might break backward compatibility. E.g. 2.0.0
- Minor release: increment the middle digit and reset last digit to zero. It would fix bugs and also add new features without breaking backward compatibility. E.g. 1.1.0
- Patch release: increment the third digit. It would fix bugs and keep backward compatibility. E.g. 1.0.1
Quick Start
======================
Hello World Configuration
-------------------------
Elliot's entry point is the function ``run_experiment``, which accepts a
configuration file that drives the whole experiment. In the following, a
sample configuration file is shown to demonstrate how a sample and
explicit structure can generate a rigorous experiment.
.. code:: python
from elliot.run import run_experiment
run_experiment("configuration/file/path")
The following file is a simple configuration for an experimental setup.
It contains all the instructions to get the MovieLens-1M catalog from a
specific path and perform a random-subsampling train/test split with a
test ratio of 20%.
This experiment provides a hyperparameter optimization with a grid
search strategy for an Item-KNN model. Indeed, the possible values of
neighbors are enclosed in square brackets, indicating that two different
models, equipped with two different neighbors values, will be trained and
compared to select the best configuration. Moreover, this configuration
obliges Elliot to save the recommendation lists with at most 10 items per
user, as suggested by the top\_k property.
In this basic experiment, only one simple metric is considered in the
final evaluation study. The candidate metric is nDCG, with a cutoff equal
to top\_k unless otherwise noted.
.. code:: yaml
experiment:
dataset: movielens_1m
data_config:
strategy: dataset
dataset_path: ../data/movielens_1m/dataset.tsv
splitting:
test_splitting:
strategy: random_subsampling
test_ratio: 0.2
models:
ItemKNN:
meta:
hyper_opt_alg: grid
save_recs: True
neighbors: [50, 100]
similarity: cosine
evaluation:
simple_metrics: [nDCG]
top_k: 10
Basic Configuration
------------------------
In the first scenario, the experiments require comparing a group of RSs whose parameters are optimized via a grid-search.
The configuration specifies the data loading information, i.e., semantic features source files, in addition to the filtering and splitting strategies.
In particular, the latter supplies an entirely automated way of preprocessing the dataset, which is often a time-consuming
and non-easily-reproducible phase.
The simple_metrics field allows computing accuracy and beyond-accuracy metrics, with two top-k cut-off values (5 and 10)
by merely inserting the list of desired measures, e.g., [Precision, nDCG, ...].
The knowledge-aware recommendation model, AttributeItemKNN, is compared against two baselines: Random and ItemKNN,
along with a user-implemented external model, external.MostPop.
The configuration makes use of elliot's feature of conducting a grid search-based hyperparameter optimization strategy
by merely passing a list of possible hyperparameter values, e.g., neighbors: [50, 70, 100].
The reported models are selected according to nDCG@10.
**To see the full configuration file please visit the following** `link_basic <https://github.com/sisinflab/elliot/blob/master/config_files/basic_configuration.yml>`_
**To run the experiment use the following** `script_basic <https://github.com/sisinflab/elliot/blob/master/sample_basic.py>`_
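As a sketch, the models and evaluation sections of such a configuration might look like the fragment below. This is illustrative only: the ``cutoffs`` field and the meta flags are assumptions based on the description above, so refer to the linked full configuration for the authoritative version.

```yaml
experiment:
  models:
    Random:
      meta:
        save_recs: True
    ItemKNN:
      meta:
        hyper_opt_alg: grid
      neighbors: [50, 70, 100]
      similarity: cosine
    AttributeItemKNN:
      meta:
        hyper_opt_alg: grid
      neighbors: [50, 70, 100]
      similarity: cosine
    external.MostPop:
      meta:
        save_recs: True
  evaluation:
    cutoffs: [5, 10]
    simple_metrics: [Precision, nDCG]
```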
Advanced Configuration
------------------------
The second scenario depicts a more complex experimental setting.
In the configuration, the user specifies an elaborate data splitting strategy, i.e., random_subsampling (for test splitting)
and random_cross_validation (for model selection), by setting few splitting configuration fields.
The configuration does not provide a cut-off value, and thus a top-k field value of 50 is assumed as the cut-off.
Moreover, the evaluation section includes the UserMADrating metric.
Elliot considers it a complex metric since it requires additional arguments.
The user also wants to implement a more advanced hyperparameter tuning optimization. For instance, regarding NeuMF,
Bayesian optimization using Tree of Parzen Estimators is required (i.e., hyper_opt_alg: tpe) with a logarithmic uniform
sampling for the learning rate search space.
Moreover, Elliot allows considering complex neural architecture search spaces by inserting lists of tuples. For instance,
(32, 16, 8) indicates that the neural network consists of three hidden layers with 32, 16, and 8 units, respectively.
**To see the full configuration file please visit the following** `link_advanced <https://github.com/sisinflab/elliot/blob/master/config_files/advanced_configuration.yml>`_
**To run the experiment use the following** `script_advanced <https://github.com/sisinflab/elliot/blob/master/sample_advanced.py>`_
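As a sketch, the NeuMF entry of such a configuration might look like the fragment below. This is illustrative only: the search-space notation and fields such as ``hyper_max_evals`` are assumptions based on Elliot's conventions, so refer to the linked full configuration for the authoritative version.

```yaml
NeuMF:
  meta:
    hyper_opt_alg: tpe
    hyper_max_evals: 20
    save_recs: True
  # Logarithmic uniform sampling for the learning rate search space
  lr: [loguniform, -10, -1]
  mf_factors: [8, 16, 32]
  # Each tuple is a candidate hidden-layer architecture (units per layer)
  mlp_hidden_size: [(32, 16, 8), (64, 32, 16)]
```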
.. _lpbe:
lpbe
====
Specifies that the linearized Poisson-Boltzmann equation should be solved.
.. note::
   The options :ref:`lpbe`, :ref:`npbe`, :ref:`lrpbe`, :ref:`nrpbe` are mutually exclusive.

Tutorials
#########

PySTAC Introduction
-------------------

:ref:`This notebook </tutorials/pystac-introduction.ipynb>` gives an introduction to PySTAC concepts through code examples.

PySTAC SpaceNet tutorial
------------------------

:ref:`This notebook </tutorials/pystac-spacenet-tutorial.ipynb>` shows how to create and manipulate a STAC of `SpaceNet <https://spacenetchallenge.github.io/>`_ data.
.. currentmodule:: PyQt5.QtNetwork

QDnsDomainNameRecord
--------------------

.. class:: QDnsDomainNameRecord

    `C++ documentation <http://doc.qt.io/qt-5/qdnsdomainnamerecord.html>`_
sbo-selenium Changelog
======================

0.4.3 (2014-06-01)
------------------

* More robust SeleniumTestCase.click() implementation (retry until success or timeout)
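The retry-until-success-or-timeout pattern referenced in this entry can be sketched as follows (a minimal plain-Python illustration; ``retry_until`` and its defaults are hypothetical and not the actual ``SeleniumTestCase.click()`` implementation):

```python
import time


def retry_until(action, timeout=10.0, interval=0.5):
    """Call ``action`` repeatedly until it succeeds or ``timeout`` seconds pass;
    re-raise the last exception once the deadline is reached."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            return action()
        except Exception:
            if time.monotonic() + interval >= deadline:
                raise  # out of time: surface the most recent failure
            time.sleep(interval)


# Demo: a "click" that fails twice before succeeding, as a stale element might.
attempts = {"count": 0}

def flaky_click():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise RuntimeError("element not clickable yet")
    return "clicked"

result = retry_until(flaky_click, timeout=5.0, interval=0.01)
```

Retrying instead of failing on the first exception makes UI tests far less flaky, since elements often become clickable only after animations or async renders finish.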
0.4.2 (2014-05-20)
------------------

* Page load timeout support (default is 10 seconds, override via SELENIUM_PAGE_LOAD_TIMEOUT)
* Support for Internet Explorer Sauce OnDemand sessions

0.4.1 (2014-04-18)
------------------

* Added support for Sauce Connect tunnel identifiers
* Added the SELENIUM_SAUCE_VERSION setting to tell Sauce Labs which Selenium
  version to use
* More reliable output of Sauce OnDemand session IDs for integration with
  the Jenkins plugin
* Better redirection of error messages to configured logging (the
  SELENIUM_LOG_FILE setting is no longer needed and has been removed)

0.4.0 (2014-03-29)
------------------

* Initial public release
Changelog
############

`GitHub Releases <https://github.com/cleoold/types-linq/releases>`_

v0.1.2
********

- Add to_tuple()
- Add an overload to sequence_equal() that accepts a comparison function
- https://github.com/cleoold/types-linq/commit/f70bd510492a915776f6cac26854111650541b22

v0.1.1
********

- Change zip() to support multiple iterables
- Add as_cached() method to memoize results
- Fix an OrderedEnumerable bug where using the [] operator on it returned an incorrect result
- Add dunder prefixes to some parameter names in the .pyi stubs to prevent them from being passed as named arguments
- https://github.com/cleoold/types-linq/commit/b1b70b9d489cfe06ab1a69c4a0e4a5d195f5f5d7
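The memoization idea behind ``as_cached()`` can be sketched in plain Python (an illustrative toy, not the library's actual implementation; the name ``CachedIterable`` is hypothetical):

```python
class CachedIterable:
    """Wrap an iterable so the source is consumed at most once;
    previously seen values are replayed from a cache."""

    def __init__(self, iterable):
        self._source = iter(iterable)  # underlying iterator, consumed lazily
        self._cache = []               # values produced so far

    def __iter__(self):
        i = 0
        while True:
            if i == len(self._cache):
                try:
                    self._cache.append(next(self._source))
                except StopIteration:
                    return
            yield self._cache[i]
            i += 1


# The generator below is consumed only once even though we iterate twice.
calls = []

def gen():
    for x in range(3):
        calls.append(x)
        yield x * x

cached = CachedIterable(gen())
first = list(cached)
second = list(cached)
```

This matters for one-shot sources (generators, network cursors): without caching, a second enumeration would silently yield nothing.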
v0.1.0
********

- Initial releases under the BSD-2-Clause License