=============================
Use of Differential Operators
=============================
To learn a solution of a differential equation, one needs to compute different
derivatives of the neural network.
To make the implementation of a given ODE/PDE easier, different operators are already
implemented. They can be found under ``torchphysics.utils.differentialoperators``.
Under the hood, all operators use the ``autograd`` functionality of PyTorch.
For example, the following operators are implemented:

- For scalar outputs:

  - ``grad``, to compute the gradient :math:`\nabla u`
  - ``laplacian``, to compute the Laplace operator :math:`\Delta u`
  - ``partial``, to compute a partial derivative :math:`\partial_x u`
  - ``normalderivatives``, to compute the normal derivative :math:`\vec{n} \cdot \nabla u`

- For vector outputs:

  - ``div``, to compute the divergence :math:`\text{div}(u)` or :math:`\nabla \cdot u`
  - ``rot``, to compute the rotation/curl of a vector field :math:`\nabla \times u`
  - ``jac``, to compute the Jacobian matrix

All operators can handle the computation on a whole batch of data.
Of course, the operators for scalar outputs can also be used for vector-valued outputs, if one
output entry is specified. E.g. for :math:`u: \mathbb{R}^3 \to \mathbb{R}^3`, use
``laplacian(u[:, 0], x)`` to get the Laplacian of the first entry.
The newest version of all implemented operators can be found in the docs_.

.. _docs: https://torchphysics.readthedocs.io/en/latest/api/torchphysics.utils.html
Since ``autograd`` is used, the differential operators work with neural networks and any functions
that use PyTorch tensors. A short example of the usage follows:

.. code-block:: python

    import torch

    # Define some example function:
    def f(x, t):
        return torch.sin(t * x[:, :1]) + x[:, 1:]**2

    # Define some points where to evaluate the function:
    x = torch.tensor([[1.0, 1.0], [0, 1], [1, 0]], requires_grad=True)
    t = torch.tensor([[1], [0], [2.0]], requires_grad=True)
    # requires_grad=True is needed, so PyTorch knows to create a backwards graph.
    # These tensors can be seen as a batch of three data points.

For the implemented operators, and for ``autograd`` in general, what matters is the output
of the function evaluated at the points, not the function itself. This has the advantage
that the function only has to be evaluated once; arbitrary derivatives can then be created from that output.

.. code-block:: python

    # Therefore, compute the outputs now:
    out = f(x, t)

Let us compute the gradient and laplacian:

.. code-block:: python

    import torchphysics as tp

    # gradient and laplacian w.r.t. x:
    grad_x = tp.utils.grad(out, x)
    laplace_x = tp.utils.laplacian(out, x)
    # gradient and laplacian w.r.t. t:
    grad_t = tp.utils.grad(out, t)          # equal to the first derivative
    laplace_t = tp.utils.laplacian(out, t)  # equal to the second derivative

It is also possible to compute derivatives w.r.t. several variables at once. For this,
one just has to pass all variables, for which the derivative has to be computed, to the method.
E.g. for :math:`\partial_{x_1}^2 f + \partial_{x_2}^2 f + \partial_t^2 f` one can use:

.. code-block:: python

    laplace_xt = tp.utils.laplacian(out, x, t)  # <- here both variables
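
To see what this combined call computes, here is a small, self-contained finite-difference sketch. It is pure Python and independent of TorchPhysics; the helper names are made up for illustration. For the example function :math:`f(x_1, x_2, t) = \sin(t x_1) + x_2^2` from above, the combined Laplacian is :math:`\partial_{x_1}^2 f + \partial_{x_2}^2 f + \partial_t^2 f`:

```python
import math

def f(x1, x2, t):
    # Same example function as above: f = sin(t * x1) + x2 ** 2
    return math.sin(t * x1) + x2 ** 2

def second_derivative(g, v, h=1e-4):
    # Central finite difference for the second derivative at point v.
    return (g(v + h) - 2.0 * g(v) + g(v - h)) / h ** 2

def laplacian_x_t(x1, x2, t):
    # Sum of the second derivatives w.r.t. x1, x2 and t, i.e. the quantity
    # that laplacian(out, x, t) computes for this scalar function.
    d2_x1 = second_derivative(lambda v: f(v, x2, t), x1)
    d2_x2 = second_derivative(lambda v: f(x1, v, t), x2)
    d2_t = second_derivative(lambda v: f(x1, x2, v), t)
    return d2_x1 + d2_x2 + d2_t

def laplacian_exact(x1, x2, t):
    # Analytic reference: d2/dx1^2 = -t^2 sin(t x1), d2/dx2^2 = 2,
    # d2/dt^2 = -x1^2 sin(t x1)
    return 2.0 - (t ** 2 + x1 ** 2) * math.sin(t * x1)

print(laplacian_x_t(1.0, 1.0, 1.0))  # close to 2 - 2*sin(1)
```

The finite-difference value agrees with the analytic sum of second derivatives up to discretization error, which is exactly the per-point quantity the combined Laplacian call returns.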
All other operators work in the same way. Here_ you can go back to the main tutorial page.

.. _Here: tutorial_start.html

Installation
=======================
Channels Demultiplexer is easy to install from the PyPI index:
.. code-block:: bash

    $ pip install channels-demultiplexer

This will install ``channels-demultiplexer`` along with its dependencies:

* channels 3
* django-appconf

After installing the package, the project settings need to be configured.
Add ``channels_demultiplexer`` to your ``INSTALLED_APPS``::

    INSTALLED_APPS = [
        'django.contrib.admin',
        'django.contrib.auth',
        'django.contrib.contenttypes',
        'django.contrib.sessions',
        'django.contrib.messages',
        'django.contrib.staticfiles',
        # The Channels Demultiplexer app can be in any position in the INSTALLED_APPS list.
        'channels_demultiplexer',
    ]

Images
======
.. toctree::
    :maxdepth: 2

    using_images_outside_wagtail
    feature_detection

.. _auth:

.. currentmodule:: lyricsgenius.auth
.. toctree::
    :maxdepth: 2
    :hidden:
    :caption: Auth

Auth
====
OAuth2
------
You can use this class to authenticate yourself, or to
get URLs to redirect your users to, so that they grant
your Genius app the permissions you need.
To find out more about how to use this class,
visit the :ref:`snippets`.

.. autoclass:: OAuth2
    :members:
    :member-order: bysource
    :no-show-inheritance:

.. # This source file is part of the open source project
# ExpressionEngine User Guide (https://github.com/ExpressionEngine/ExpressionEngine-User-Guide)
#
# @link https://expressionengine.com/
# @copyright Copyright (c) 2003-2018, EllisLab, Inc. (https://ellislab.com)
# @license https://expressionengine.com/license Licensed under Apache License, Version 2.0
Text Input
==========
Text Input is a single-line, free-form writing space where you can enter text or HTML.
Field Options
-------------
Maximum Characters
~~~~~~~~~~~~~~~~~~
The maximum number of characters this field should allow.
Text Formatting
~~~~~~~~~~~~~~~
Specifies how the entered text will be formatted when rendered on the front-end. Choices include replacing each line break with a ``BR`` tag, automatically surrounding paragraphs with ``P`` tags, or Markdown processing. :doc:`Additional plugins </development/plugins>` may be installed to provide more text formatting options.
Allow Override?
~~~~~~~~~~~~~~~
When set to yes, authors can override the default text formatting for this field from the publish form to set it to a format other than what was selected as the default above.
Text Direction
~~~~~~~~~~~~~~
Either left-to-right, or right-to-left.
Allowed Content
~~~~~~~~~~~~~~~
Restricts the text field to certain data types, while also informing how the data should be sorted on the front-end (numbers represented as text will sort differently than regular numbers).
Field Tools
~~~~~~~~~~~
Show a smiley chooser, or show a file chooser button to easily insert images or links to files.

Reading output files, plotting results and reporting simulation output
======================================================================
.. automodule:: buildingspy.io
File reader
-----------
.. autoclass:: buildingspy.io.outputfile.Reader
:members:
Plotter
-------
.. autoclass:: buildingspy.io.postprocess.Plotter
:members:
Reporter
--------
.. autoclass:: buildingspy.io.reporter.Reporter
:members:
***************
Getting Started
***************
Core Concepts
=============
Autoscale is an API based tool that enables you to scale your application by adding or removing servers based on monitoring events, a schedule, or arbitrary webhooks.
Autoscale functions by linking three services:
- Monitoring (such as Monitoring as a Service)
- Autoscale API
- Servers and Load Balancers
Basic Workflow
--------------
An autoscaling group is monitored by Rackspace Cloud Monitoring. When Monitoring triggers an alarm for high utilization within the autoscaling group, a webhook is fired. The webhook calls the Autoscale service, which consults the policy associated with that webhook. The policy determines how many cloud servers should be added or removed in response to the alarm.
Alarms may trigger scaling up or scaling down. Currently scale down events always remove the oldest server in the group.
Cooldowns allow you to ensure that you don't scale up or down too fast. When a scaling policy is hit, both the scaling policy cooldown and the group cooldown start. Any additional requests to the group are discarded while the group cooldown is active. Any additional requests to the specific policy are discarded when the policy cooldown is active.
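
The gating behaviour described above can be sketched in a few lines. This is a hypothetical illustration, not Autoscale's actual implementation; the class and method names are invented:

```python
class CooldownGate:
    """Discards policy executions while a group or policy cooldown is active."""

    def __init__(self, group_cooldown, policy_cooldowns):
        self.group_cooldown = group_cooldown      # seconds
        self.policy_cooldowns = policy_cooldowns  # policy name -> seconds
        self.group_last_run = None
        self.policy_last_run = {}

    def try_execute(self, policy, now):
        # Discard while the group-wide cooldown is active.
        if self.group_last_run is not None and now - self.group_last_run < self.group_cooldown:
            return False
        # Discard while this specific policy's cooldown is active.
        last = self.policy_last_run.get(policy)
        if last is not None and now - last < self.policy_cooldowns[policy]:
            return False
        # Accepted: both the group and the policy cooldown timers start now.
        self.group_last_run = now
        self.policy_last_run[policy] = now
        return True

gate = CooldownGate(group_cooldown=60, policy_cooldowns={"scale_up": 150})
print(gate.try_execute("scale_up", now=0))    # True  (first request)
print(gate.try_execute("scale_up", now=30))   # False (group cooldown active)
print(gate.try_execute("scale_up", now=100))  # False (policy cooldown still active)
print(gate.try_execute("scale_up", now=151))  # True  (both cooldowns expired)
```

Note that a discarded request does not restart either timer; only a successful execution does.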
**Autoscale does not configure anything within a server.** This means that all images should be self provisioning. It is up to you to make sure that your services are configured to function properly when the server is started. We recommend using something like Chef, Salt, or Puppet.
Example Use Case
----------------
Five servers are in an autoscaling group, with Rackspace Cloud Monitoring monitoring their CPU usage. Monitoring will trigger an alarm when CPU is at 90%. That alarm will trigger a webhook that Autoscale created previously. When that webhook is hit, autoscale receives the alert, and carries out a policy specific to that webhook. This policy says "When my webhook is hit, create five servers according to the launch configuration, and add them to the load balancer."
Autoscale can also work in the opposite direction. A policy can say "When my webhook is hit, scale down by five servers."
The Scaling Group
=================
There are three components to Autoscale:
- The Scaling Group Configuration
- The Scaling Group's Launch Configuration
- The Scaling Group's Policies
Autoscale Groups at a minimum require the Group Configuration, and a Launch Configuration. Policies are only required to make the group change.
The Group Configuration
-----------------------
This specifies the basic elements of the group.
The Group Configuration contains:
- Group Name
- Group Cooldown (how long a group has to wait before you can scale again in seconds)
- Minimum and Maximum number of entities
The Launch Configuration
------------------------
This configuration specifies what to do when we want to create a new server. What image to boot, on what flavor, and which load balancer to connect it to.
The Launch Configuration Contains:
- Launch Configuration Type (Only type currently supported is "launch_server")
- Arguments:
- Server
- name
- flavor
- imageRef (This is the ID of the Cloud Server image you will boot)
- Load Balancer
- loadBalancerId
- port
Scaling Policies
----------------
Scaling policies specify how to change the Autoscaling Group. There can be multiple scaling policies per group.
Scaling Policies Contain:
- Scaling Policy Name
- Change Value (incremental, or by percent)
- Policy Cooldown (in seconds)
- Execute Webhook (auto-generated)
Walking Through the Autoscale API
=================================
This will give you the basic steps to create an Autoscaling group. We recommend using http://docs.autoscale.apiary.io/ to generate CURL commands if you want to follow along in your environment.
Authentication
--------------
You will need to generate an auth token and then send it as the ``X-Auth-Token`` header along with all requests to authenticate yourself.
Authentication Endpoint: ``https://identity.api.rackspacecloud.com/v2.0/tokens``
You can request a token by providing your username and your API key.
.. code-block:: bash
curl --request POST -H "Content-type: application/json" \
--data-binary '{
"auth":{
"RAX-KSKEY:apiKeyCredentials":{
"username":"theUserName",
"apiKey":"00a00000a000a0000000a000a00aaa0a"
}
}
}' \
https://identity.api.rackspacecloud.com/v2.0/tokens | python -mjson.tool
You can request a token by providing your username and your password.
.. code-block:: bash
curl --request POST -H "Content-type: application/json" \
--data-binary '{
"auth":{
"passwordCredentials":{
"username":"username",
"password":"password"}
}
}' \
https://identity.api.rackspacecloud.com/v2.0/tokens | python -mjson.tool
The response will be huge (sorry!). We've snipped the ``serviceCatalog`` section for clarity.
.. code-block:: bash
{
"access": {
"serviceCatalog": [
...
],
"token": {
"expires": "2012-04-13T13:15:00.000-05:00",
"id": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
"tenant": {
"id": "123456",
"name": "123456"
}
},
"user": {
"RAX-AUTH:defaultRegion": "DFW",
"id": "161418",
"name": "demoauthor",
"roles": [
{
"description": "User Admin Role.",
"id": "3",
"name": "identity:user-admin"
}
]
}
}
}
Note your ``token.id`` and your ``token.tenant.id``. The ``token.tenant.id`` is your "tenantId", and you will need it to make requests to Autoscale.
If the auth token received is "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee" and your tenantId is 123456, then this example request will list all groups you've created:
.. code-block:: bash

    $ curl -X GET -H "Content-Type: application/json" -H "X-Auth-token: {auth-token}" "https://{region}.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/" | python -mjson.tool

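
Extracting the token and tenant ID from the authentication response is easy to script. Below is a minimal sketch using only the Python standard library; the sample payload is a trimmed, hard-coded copy of the response shown above, and the function name is invented for illustration:

```python
import json

# A trimmed sample of the authentication response from above.
response_body = """
{
    "access": {
        "token": {
            "expires": "2012-04-13T13:15:00.000-05:00",
            "id": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
            "tenant": {"id": "123456", "name": "123456"}
        }
    }
}
"""

def parse_auth(body):
    # Pull out the auth token and the tenant ID for later requests.
    token = json.loads(body)["access"]["token"]
    return token["id"], token["tenant"]["id"]

auth_token, tenant_id = parse_auth(response_body)
headers = {"Content-Type": "application/json", "X-Auth-Token": auth_token}
groups_url = (
    "https://ord.autoscale.api.rackspacecloud.com/v1.0/%s/groups/" % tenant_id
)
print(groups_url)  # ends with /v1.0/123456/groups/
```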
Step One - Save an Image
------------------------
First, boot a Rackspace Cloud Server, and customize it so that it can process requests. For example, if you're building a webhead autoscaling group, configure Apache2 to start on launch, and serve the files you need.
When that is complete, save your image, and record the imageID.
.. code-block:: bash
$ curl --request GET --header "Content-Type: application/json" \
--header "X-Auth-token: {auth-token}" \
https://ord.servers.api.rackspacecloud.com/v2/{Tenant-id}/images?type=SNAPSHOT \
| python -mjson.tool
Step Two - Create the Group
---------------------------
Create a Scaling Group by submitting a POST request containing an edited version of these data.
.. code-block:: bash
POST https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/
.. code-block:: bash
curl --include --header "Accept: application/json" \
--header "X-Auth-token: {auth-token}" \
--request POST \
--data-binary '{
"groupConfiguration": {
"name": "workers",
"cooldown": 60,
"minEntities": 5,
"maxEntities": 100,
"metadata": {
"firstkey": "this is a string",
"secondkey": "1"
}
},
"launchConfiguration": {
"type": "launch_server",
"args": {
"server": {
"flavorRef": 3,
"name": "webhead",
"imageRef": "0d589460-f177-4b0f-81c1-8ab8903ac7d8",
"OS-DCF:diskConfig": "AUTO",
"metadata": {
"mykey": "myvalue"
},
"personality": [
{
"path": '/root/.ssh/authorized_keys',
"contents": "ssh-rsa AAAAB3Nza...LiPk== user@example.net"
}
],
"networks": [
{
"uuid": "11111111-1111-1111-1111-111111111111"
}
],
},
"loadBalancers": [
{
"loadBalancerId": 2200,
"port": 8081
}
]
}
},
"scalingPolicies": [
]
}' \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/"
This will create your scaling group, spin up the minimum number of servers, and then attach them to the load balancer you specified. To modify the group, you will need to create policies.
Step Three - Policies
---------------------
Create scaling policies by sending POST requests
.. code-block:: bash
POST https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}/policies/
.. code-block:: bash
curl --include --header "Accepts: application/json" \
--header "X-Auth-token: {auth-token}" \
--request POST \
--data-binary '[
{
"name": "scale up by one server",
"change": 1,
"cooldown": 150,
"type": "webhook"
},
{
"name": "scale down by 5.5 percent",
"changePercent": -5.5,
"cooldown": 6,
"type": "webhook"
}
]' \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}/policies"
Step Four - Webhooks
--------------------
Now that you've created the policy, let's create a few webhooks. Webhooks are URLs that can activate the policy without authentication. Webhooks are used with third party services that may trigger Autoscale policies, such as Nagios.
An execution call will always return ``202, Accepted``, even if it fails to scale because of an invalid configuration. This is done to prevent `information leakage <https://www.owasp.org/index.php/Information_Leakage>`_.
.. code-block:: bash
POST https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}/policies/{policyId}/webhooks
.. code-block:: bash
curl --include --header "Accepts: application/json" \
--header "X-Auth-token: {auth-token}" \
--request POST \
--data-binary '[
{
"name": "alice",
"metadata": {
"notes": "this is for Alice"
}
},
{
"name": "bob"
}
]' \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}/policies/{policyId}/webhooks"
Will reply with:
.. code-block:: bash
{
"webhooks": [
{
"id":"{webhookId1}",
"alice",
"metadata": {
"notes": "this is for Alice"
},
"links": [
{
"href": ".../{groupId1}/policies/{policyId1}/webhooks/{webhookId1}/",
"rel": "self"
},
{
"href": ".../execute/1/{capabilityHash1}/",
"rel": "capability"
}
]
},
{
"id":"{webhookId2}",
"name": "bob",
"metadata": {},
"links": [
{
"href": ".../{groupId1}/policies/{policyId1}/webhooks/{webhookId2}/",
"rel": "self"
},
{
"href": ".../execute/1/{capabilityHash2}/",
"rel": "capability"
}
]
}
]
}
Step Five - Executing a Scaling Policy
--------------------------------------
You can execute a scaling policy in two ways:
**Authenticated Scaling Policy Path**
Identify the path to the desired scaling policy, and append 'execute' to the path. To activate the policy POST against it.
.. code-block:: bash
curl --include \
--header "X-Auth-token: {auth-token}" \
--request POST \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}/policies/{policyId}/execute"
**Execute Capability URL**
Find the capability URL in your Scaling Policy Webhook. If you want to activate that policy, POST against it.
An execution call will always return ``202, Accepted``, even if it fails to scale because of an invalid configuration. This is done to prevent `information leakage <https://www.owasp.org/index.php/Information_Leakage>`_.
.. code-block:: bash
curl --include \
--request POST \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/execute/1/be624bfb20f07baddc278cd978c1ddca56bdb29a1c7b70bbeb229fe0b862c134" -v
Note how authentication is not needed.
The policy will execute, and your group will transform.
Step Six - Tearing it all down
------------------------------
Autoscaling groups cannot be deleted while they have active servers. Upload a new configuration with a minimum and maximum of zero so the servers are removed and the group can be deleted.
.. code-block:: bash
PUT /{tenantId}/groups/{groupId}/config
.. code-block:: bash
curl --include --header "Accept: application/json" \
--header "X-Auth-token: {auth-token}" \
--request PUT \
--data-binary '{
"name": "workers",
"cooldown": 60,
"minEntities": 0,
"maxEntities": 0,
"metadata": {
"firstkey": "this is a string",
"secondkey": "1",
}
}' \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}/config"
The autoscale group will start destroying all your servers. Now you can fire a DELETE command to the Group ID. Take care that all your servers are deleted before deleting the group.
.. code-block:: bash
curl --include \
--header "X-Auth-token: {auth-token}" \
--request DELETE \
"https://ord.autoscale.api.rackspacecloud.com/v1.0/{tenantId}/groups/{groupId}" | 33.706161 | 467 | 0.601659 |
b05ac6253efd2bb339f643bdd0fa637a3624e471 | 3,362 | rst | reStructuredText | docs/how-to/fuzzy-matching.rst | ajhynes7/datatest | 78742e98de992807286655f5685a2dc33a7b452e | [
"Apache-2.0"
] | 277 | 2016-05-12T13:22:49.000Z | 2022-03-11T00:18:32.000Z | docs/how-to/fuzzy-matching.rst | ajhynes7/datatest | 78742e98de992807286655f5685a2dc33a7b452e | [
"Apache-2.0"
] | 57 | 2016-05-18T01:03:32.000Z | 2022-02-17T13:48:43.000Z | docs/how-to/fuzzy-matching.rst | ajhynes7/datatest | 78742e98de992807286655f5685a2dc33a7b452e | [
"Apache-2.0"
] | 16 | 2016-05-22T11:35:19.000Z | 2021-12-01T19:41:42.000Z |
.. currentmodule:: datatest
.. meta::
:description: How to assert fuzzy matches.
:keywords: approximate string, fuzzy matching, testing, datatest
#############################
How to Validate Fuzzy Matches
#############################
When comparing strings of text, it can sometimes be useful
to check that values are similar instead of asserting that
they are exactly the same. Datatest provides options for
*approximate string matching* (also called "fuzzy
matching").
When checking mappings or sequences of values, you can accept
approximate matches with the :meth:`accepted.fuzzy` acceptance:
.. tabs::
.. tab:: Using Acceptance
.. code-block:: python
:emphasize-lines: 19
:linenos:
from datatest import validate, accepted
linked_record = {
'id165': 'Saint Louis',
'id382': 'Raliegh',
'id592': 'Austin',
'id720': 'Cincinatti',
'id826': 'Philadelphia',
}
master_record = {
'id165': 'St. Louis',
'id382': 'Raleigh',
'id592': 'Austin',
'id720': 'Cincinnati',
'id826': 'Philadelphia',
}
with accepted.fuzzy(cutoff=0.6):
validate(linked_record, master_record)
.. tab:: No Acceptance
.. code-block:: python
:linenos:
from datatest import validate
linked_record = {
'id165': 'Saint Louis',
'id382': 'Raliegh',
'id592': 'Austin',
'id720': 'Cincinatti',
'id826': 'Philadelphia',
}
master_record = {
'id165': 'St. Louis',
'id382': 'Raleigh',
'id592': 'Austin',
'id720': 'Cincinnati',
'id826': 'Philadelphia',
}
validate(linked_record, master_record)
.. code-block:: none
:emphasize-lines: 5-7
Traceback (most recent call last):
File "example.py", line 19, in <module>
validate(linked_record, master_record)
datatest.ValidationError: does not satisfy mapping requirements (3 differences): {
'id165': Invalid('Saint Louis', expected='St. Louis'),
'id382': Invalid('Raliegh', expected='Raleigh'),
'id720': Invalid('Cincinatti', expected='Cincinnati'),
}
If variation is an inherent, natural feature of the data and
does not necessarily represent a defect, it may be appropriate
to use :meth:`validate.fuzzy` instead of the acceptance shown
previously:
.. code-block:: python
:emphasize-lines: 19
:linenos:
from datatest import validate
linked_record = {
'id165': 'Saint Louis',
'id382': 'Raliegh',
'id592': 'Austin',
'id720': 'Cincinatti',
'id826': 'Philadelphia',
}
master_record = {
'id165': 'St. Louis',
'id382': 'Raleigh',
'id592': 'Austin',
'id720': 'Cincinnati',
'id826': 'Philadelphia',
}
validate.fuzzy(linked_record, master_record, cutoff=0.6)
That said, it's probably more appropriate to use an acceptance
for this specific example.
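
To build intuition for the ``cutoff`` value, you can inspect similarity ratios with ``difflib.SequenceMatcher`` from the standard library. Whether datatest uses exactly this measure internally is an assumption here, but the ratios illustrate why ``cutoff=0.6`` accepts the variants above:

```python
from difflib import SequenceMatcher

pairs = [
    ("Saint Louis", "St. Louis"),
    ("Raliegh", "Raleigh"),
    ("Cincinatti", "Cincinnati"),
]

for linked, master in pairs:
    # ratio() is 2*M / (len(a) + len(b)), where M counts matching characters.
    ratio = SequenceMatcher(None, linked, master).ratio()
    print("%-11s vs %-10s -> %.2f" % (linked, master, ratio))
```

All three pairs score above 0.6, while unrelated city names score far below it, so the cutoff separates plausible variants from genuine mismatches.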

=====
Usage
=====
To use SALT Data Quality in a project::

    import sdq

A Police Application, with Polici, Polisi, and Poliki.
By Nyinyaa, Michel, et al.

.. title: Python for Data Analysis
.. slug: python-for-data-analysis
.. date: 2015-08-28 09:42:13 UTC-07:00
.. tags:
.. category: notes
.. link:
.. description:
.. type: text
Python for Data Analysis
========================
Read the excerpt of the book Python for `Data Analysis`_.

It gave an introduction to using IPython, pandas, NumPy, and SciPy for data analysis. My confidence
improved a little, as I realized I can use Python for my machine learning course.

I will set up the environment and the data sets, and continue with the lectures.
.. _Data Analysis: http://cdn.oreilly.com/oreilly/booksamplers/9781449319793_sampler.pdf

.. GADGET documentation master file, created by
sphinx-quickstart on Wed Jul 13 12:53:44 2016.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to GADGET's documentation!
==================================
**G**\ ridded **A**\ tmospheric **D**\ ata Downscalin\ **G** and **E**\ vapotranspiration **T**\ ools for High-resolution
Distributed Reference ET in Complex Terrain
Contents:
.. toctree::
:maxdepth: 2
quickstart
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

Inventory
=========
Description
-----------
This resource is used for managing inventory resources in Tower.
Fields Table
------------
.. <table goes here>
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|name |type |help_text |read_only |unique |filterable |required |
+====================+======================+====================================================+==========+=======+===========+=========+
|name |String |The name field. |False |True |True |True |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|description |String |The description field. |False |False |True |False |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|organization |Resource organization |The organization field. |False |False |True |True |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|variables |variables |Inventory variables, use "@" to get from file. |False |False |True |False |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|kind |Choices: ,smart |The kind field. Cannot be modified after created. |False |False |True |False |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|host_filter |String |The host_filter field. Only useful when kind=smart. |False |False |True |False |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
|insights_credential |Resource credential |The insights_credential field. |False |False |True |False |
+--------------------+----------------------+----------------------------------------------------+----------+-------+-----------+---------+
.. <table goes here>
API Specification
-----------------
.. autoclass:: tower_cli.resources.inventory.Resource
:members: copy, create, delete, get, list, modify, associate_ig, disassociate_ig, batch_update
MadMiner
========
*Johann Brehmer, Felix Kling, Irina Espejo, and Kyle Cranmer*
**Machine learning–based inference for particle physics**
.. toctree::
:maxdepth: 1
:caption: Sites
introduction
installation
usage
troubleshooting
references
.. toctree::
:maxdepth: 1
:caption: API
madminer.analysis
madminer.core
madminer.delphes
madminer.fisherinformation
madminer.lhe
madminer.likelihood
madminer.limits
madminer.ml
madminer.plotting
madminer.sampling
Indices and tables
------------------
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
***********************************
Applications
***********************************
.. toctree::
section4/section4_1
.. toctree::
:maxdepth: 2
:caption: Contents:
module_docs/epythet
module_docs/epythet/autogen
module_docs/epythet/call_make
module_docs/epythet/cli
module_docs/epythet/config_parser
module_docs/epythet/docs_gen
module_docs/epythet/setup_docsrc
module_docs/epythet/templates
module_docs/epythet/tools
---
title: About Me
tags:
- about
categories: about
copyright: true
top: 10
date: 2018-02-07 20:48:41
---
My Work Experience
--------------------------------------------------------------------
My Education
--------------------------------------------------------------------
My Hobbies
--------------------------------------------------------------------
PyPAC: Proxy auto-config for Python
] | 4 | 2017-05-13T06:39:20.000Z | 2020-11-06T11:00:50.000Z | PyPAC: Proxy auto-config for Python
===================================
.. image:: https://img.shields.io/pypi/v/pypac.svg?maxAge=2592000
:target: https://pypi.python.org/pypi/pypac
.. image:: https://img.shields.io/travis/rbcarson/pypac.svg?maxAge=2592000
:target: https://travis-ci.org/rbcarson/pypac
.. image:: https://ci.appveyor.com/api/projects/status/y7nxvu2feu87i39t/branch/master?svg=true
:target: https://ci.appveyor.com/project/rbcarson/pypac/branch/master
.. image:: https://img.shields.io/coveralls/rbcarson/pypac/HEAD.svg?maxAge=2592000
:target: https://coveralls.io/github/rbcarson/pypac
.. image:: https://img.shields.io/codacy/grade/71ac103b491d44efb94976ca5ea5d89c.svg?maxAge=2592000
:target: https://www.codacy.com/app/carsonyl/pypac
PyPAC is a pure-Python library for finding `proxy auto-config (PAC)`_ files and making HTTP requests
that respect them. PAC files are often used in organizations that need fine-grained and centralized control
of proxy settings.
.. _proxy auto-config (PAC): https://en.wikipedia.org/wiki/Proxy_auto-config
PyPAC provides a subclass of a `Requests <http://docs.python-requests.org/en/master/>`_ ``Session``,
so you can start using it immediately, with any PAC file transparently discovered and honoured:
.. code-block:: python
>>> from pypac import PACSession
>>> session = PACSession()
>>> session.get('http://example.org')
...
If a PAC file isn't found, then ``PACSession`` acts exactly like a regular ``Session``.
PyPAC can find PAC files according to the DNS portion of the `Web Proxy Auto-Discovery (WPAD)`_ protocol.
On Windows, PyPAC can also obtain the PAC file URL from the Internet Options dialog, via the registry.
.. _Web Proxy Auto-Discovery (WPAD): https://en.wikipedia.org/wiki/Web_Proxy_Autodiscovery_Protocol
Features
--------
* The same Requests API that you already know and love
* Honour PAC setting from Windows Internet Options
* Follow DNS Web Proxy Auto-Discovery protocol
* Proxy authentication pass-through
* Proxy failover and load balancing
PyPAC supports Python 2.7 and 3.3+.
Installation
------------
Install PyPAC using `pip <https://pip.pypa.io>`_::
$ pip install pypac
Documentation
-------------
PyPAC's documentation is available at http://pypac.readthedocs.io/.
====================
Import data workflow
====================
This document describes import data workflow, with hooks that enable
customization of import process.
``import_data`` method arguments
--------------------------------
The ``import_data`` method of the :class:`import_export.resources.Resource` class is
responsible for importing data from the given `dataset`.
``import_data`` expects the following arguments:
:attr:`dataset`
REQUIRED.
should be a Tablib `Dataset`_ object with a header row.
:attr:`dry_run`
If ``True``, the import should not change the database. Default is ``False``.
:attr:`raise_errors`
If ``True``, the import should raise errors. Default is ``False``, which
means that any errors and tracebacks will be saved in the ``Result``
instance.
``import_data`` method workflow
-------------------------------
#. ``import_data`` initializes a new :class:`import_export.results.Result`
instance. The ``Result`` instance holds errors and other information
gathered during the import.
#. The ``InstanceLoader`` responsible for loading existing instances
is initialized.
A different ``InstanceLoader`` class
can be specified with the ``instance_loader_class``
option of :class:`import_export.resources.ResourceOptions`.
:class:`import_export.instance_loaders.CachedInstanceLoader` can be used to
reduce the number of database queries.
See :mod:`import_export.instance_loaders` for available implementations.
#. Process each `row` in ``dataset``
#. The ``get_or_init_instance`` method is called with the current ``InstanceLoader``
and the current `row`, returning an object `instance` and a `Boolean` variable
that indicates whether the object instance is new.
``get_or_init_instance`` tries to load the instance for the current `row`, or
calls ``init_instance`` to initialize an object if it does not exist yet.
The default ``ModelResource.init_instance`` initializes a Django model without
arguments. You can override the ``init_instance`` method to control how
new objects are initialized (e.g. to set default values).
#. The ``for_delete`` method is called to determine whether the current `instance`
should be deleted:
#. current `instance` is deleted
OR
#. The ``import_obj`` method is called with the current object `instance` and
the current `row`.
``import_obj`` loops through all `Resource` `fields`, skipping
many-to-many fields, and calls ``import_field`` for each. (Many-to-many
fields require that the instance have a primary key; this is why assigning
them is postponed until after the object is saved.)
``import_field`` calls the ``field.save`` method if ``field`` has
an `attribute` set and the field's `column_name` exists in the given row.
#. The ``save_instance`` method is called.
``save_instance`` receives the ``dry_run`` argument and actually saves the
instance only when ``dry_run`` is ``False``.
``save_instance`` calls two hook methods that by default do
nothing, but can be overridden to customize the import process:
* ``before_save_instance``
* ``after_save_instance``
Both methods receive the ``instance`` and ``dry_run`` arguments.
#. The ``save_m2m`` method is called to save many-to-many fields.
#. A ``RowResult`` is assigned with the diff between the original and imported
object fields, as well as the import type (new, updated).
If an exception is raised during row processing, and ``raise_errors`` is
``False`` (the default), the traceback is appended to the ``RowResult``.
#. The ``result`` is returned.
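The per-row loop described above can be sketched, stripped of Django and of
the real ``import_export`` API (all names below are illustrative only, not
the library's actual classes or signatures), as:

.. code-block:: python

    def import_rows(dataset, existing, dry_run=False):
        # dataset: list of dicts keyed by column name, with an 'id' column.
        # existing: dict mapping id -> instance (plays the InstanceLoader role).
        saved = []
        for row in dataset:
            # get_or_init_instance: load an existing instance or init a new one
            instance = dict(existing.get(row['id']) or {'id': row['id']})
            # import_obj / import_field: copy each column into the instance
            for column, value in row.items():
                instance[column] = value
            # save_instance: honours dry_run by not persisting anything
            if not dry_run:
                saved.append(instance)
        return saved

The real workflow adds hooks (``before_save_instance``, ``after_save_instance``)
and error collection around each of these steps.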
Transaction support
-------------------
If transaction support is enabled, the whole import process is wrapped inside a
transaction and rolled back or committed as appropriate.
All methods called from inside ``import_data`` (create / delete / update)
receive ``False`` for the ``dry_run`` argument.
.. _Dataset: http://docs.python-tablib.org/en/latest/api/#dataset-object
.. Images
.. |asset| image:: images/asset.jpg
Asset Filter
============
The *fledge-filter-asset* is a filter that allows for assets to be included, excluded or renamed in a stream. It may be used either in *South* services or *North* tasks and is driven by a set of rules that define for each named asset what action should be taken.
Asset filters are added in the same way as any other filters.
- Click on the Applications add icon for your service or task.
- Select the *asset* plugin from the list of available plugins.
- Name your asset filter.
- Click *Next* and you will be presented with the following configuration page
+---------+
| |asset| |
+---------+
- Enter the *Asset rules*
- Enable the plugin and click *Done* to activate it
Asset Rules
-----------
The asset rules are an array of JSON objects which define the asset name to which the rule is applied and an action. Actions can be one of
- **include**: The asset should be forwarded to the output of the filter
- **exclude**: The asset should not be forwarded to the output of the filter
- **rename**: Change the name of the asset. In this case a third property is included in the rule object, "new_asset_name"
In addition, a *defaultAction* may be included; however, this is limited to *include* and *exclude*. Any asset that does not match a specific rule will have this default action applied to it. If the default action is not given, it is treated as if a default action of *include* had been set.
A typical set of rules might be
.. code-block:: JSON
{
"rules": [
{
"asset_name": "Random1",
"action": "include"
},
{
"asset_name": "Random2",
"action": "rename",
"new_asset_name": "Random92"
},
{
"asset_name": "Random3",
"action": "exclude"
},
{
"asset_name": "Random4",
"action": "rename",
"new_asset_name": "Random94"
},
{
"asset_name": "Random5",
"action": "exclude"
},
{
"asset_name": "Random6",
"action": "rename",
"new_asset_name": "Random96"
},
{
"asset_name": "Random7",
"action": "include"
}
],
"defaultAction": "include"
}
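The semantics of these rules can be sketched in a few lines of Python (this is
an illustration of the rule semantics only, not the plugin's actual C++
implementation):

.. code-block:: python

    def apply_asset_rules(readings, rules, default_action="include"):
        # readings: list of dicts, each with an "asset" key naming the asset
        by_name = {rule["asset_name"]: rule for rule in rules}
        out = []
        for reading in readings:
            rule = by_name.get(reading["asset"])
            action = rule["action"] if rule else default_action
            if action == "exclude":
                continue  # drop the reading from the output stream
            if action == "rename":
                reading = dict(reading, asset=rule["new_asset_name"])
            out.append(reading)
        return out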
RemoteSignOutContext Class
==========================
Namespace
:dn:ns:`Microsoft.AspNetCore.Authentication.OpenIdConnect`
Assemblies
* Microsoft.AspNetCore.Authentication.OpenIdConnect
----
.. contents::
:local:
Inheritance Hierarchy
---------------------
* :dn:cls:`System.Object`
* :dn:cls:`Microsoft.AspNetCore.Authentication.BaseContext`
* :dn:cls:`Microsoft.AspNetCore.Authentication.BaseControlContext`
* :dn:cls:`Microsoft.AspNetCore.Authentication.OpenIdConnect.BaseOpenIdConnectContext`
* :dn:cls:`Microsoft.AspNetCore.Authentication.OpenIdConnect.RemoteSignOutContext`
Syntax
------
.. code-block:: csharp
public class RemoteSignOutContext : BaseOpenIdConnectContext
.. dn:class:: Microsoft.AspNetCore.Authentication.OpenIdConnect.RemoteSignOutContext
:hidden:
.. dn:class:: Microsoft.AspNetCore.Authentication.OpenIdConnect.RemoteSignOutContext
Constructors
------------
.. dn:class:: Microsoft.AspNetCore.Authentication.OpenIdConnect.RemoteSignOutContext
:noindex:
:hidden:
.. dn:constructor:: Microsoft.AspNetCore.Authentication.OpenIdConnect.RemoteSignOutContext.RemoteSignOutContext(Microsoft.AspNetCore.Http.HttpContext, Microsoft.AspNetCore.Builder.OpenIdConnectOptions, Microsoft.IdentityModel.Protocols.OpenIdConnect.OpenIdConnectMessage)
:type context: Microsoft.AspNetCore.Http.HttpContext
:type options: Microsoft.AspNetCore.Builder.OpenIdConnectOptions
:type message: Microsoft.IdentityModel.Protocols.OpenIdConnect.OpenIdConnectMessage
.. code-block:: csharp
public RemoteSignOutContext(HttpContext context, OpenIdConnectOptions options, OpenIdConnectMessage message)
Debugging
=========
Deep neural networks are going deeper and deeper every year, requiring
more components in the networks. Such complexity often misleads us to
mal-configure the networks that can turn out be critical. Even if we
correctly configure a neural network as desired, we may still want to
find out its performance bottleneck, e.g., from which layer(s) the
computational bottleneck comes from.
In this debugging tutorial, we introduce three techniques to deal with
such cases:
1. ``visit`` method of a variable
2. simple graph viewer
3. profiling utils
We will go over each technique, but first prepare the following
reference model.
.. code:: python
import numpy as np
import nnabla as nn
import nnabla.logger as logger
import nnabla.functions as F
import nnabla.parametric_functions as PF
import nnabla.solvers as S
def block(x, maps, test=False, name="block"):
h = x
with nn.parameter_scope(name):
with nn.parameter_scope("in-block-1"):
h = PF.convolution(h, maps, kernel=(3, 3), pad=(1, 1), with_bias=False)
h = PF.batch_normalization(h, batch_stat=not test)
h = F.relu(h)
with nn.parameter_scope("in-block-2"):
h = PF.convolution(h, maps // 2, kernel=(3, 3), pad=(1, 1), with_bias=False)
h = PF.batch_normalization(h, batch_stat=not test)
h = F.relu(h)
with nn.parameter_scope("in-block-3"):
h = PF.convolution(h, maps, kernel=(3, 3), pad=(1, 1), with_bias=False)
h = PF.batch_normalization(h, batch_stat=not test)
if h.shape[1] != x.shape[1]:
with nn.parameter_scope("skip"):
s = PF.convolution(x, maps, kernel=(3, 3), pad=(1, 1), with_bias=False)
s = PF.batch_normalization(s, batch_stat=not test)
return F.relu(h + s)
def network(x, maps=16, test=False):
h = x
h = PF.convolution(h, maps, kernel=(3, 3), pad=(1, 1), name="first-conv", with_bias=False)
h = PF.batch_normalization(h, batch_stat=not test, name="first-bn")
h = F.relu(h)
for l in range(4):
h = block(h, maps * 2 ** (l + 1), name="block-{}".format(l))
h = F.max_pooling(h, (2, 2))
h = F.average_pooling(h, h.shape[2:])
pred = PF.affine(h, 100, name="pred")
return pred
Visit Method
------------
The visit method of a variable takes a lambda, function, or callable
object as an argument and calls it on all NNabla functions that the
variable can traverse, in forward order. It is easier to see the usage
than to explain it.
First of all, define the callable class.
.. code:: python
class PrintFunc(object):
def __call__(self, nnabla_func):
print("==========")
print(nnabla_func.info.type_name)
print(nnabla_func.inputs)
print(nnabla_func.outputs)
print(nnabla_func.info.args)
This callable object takes an NNabla function, e.g., convolution, relu,
etc., so a user can get information about that function.
.. code:: python
    nn.clear_parameters()  # call this in case you want to run the following code again
x = nn.Variable([4, 3, 128, 128])
pred = network(x)
pred.visit(PrintFunc())
Simple Graph Viewer
-------------------
The visit method is very useful for getting information about each function
used in a graph, but it is hard to see the details of the whole network
structure, e.g., which variable is connected to which variable. So we
have a graph viewer that visually shows the whole structure of the network,
enabling us to debug more efficiently. Using this graph viewer is
straightforward, as shown in the following code:
.. code:: python
# Create graph again just in case
    nn.clear_parameters()  # call this in case you want to run the following code again
x = nn.Variable([4, 3, 128, 128])
pred = network(x)
.. code:: python
import nnabla.experimental.viewers as V
graph = V.SimpleGraph(verbose=False)
graph.view(pred)
If one would like to see more detailed information, as in the ``visit``
method case, change the verbose option to ``True``.
.. code:: python
graph = V.SimpleGraph(verbose=True)
graph.view(pred)
Now one can see detailed information!
Note that this viewer is mainly for NNabla users who want to write code
in Python, so for those who would like to see a more polished network
visualization and play with it, please use Neural Network Console and visit
https://dl.sony.com/.
Profiling utils
---------------
Basically, this feature is **for developers** who want to know the overall
speed statistics and which functions could be bottlenecks. NNabla provides
a simple profiling tool. Once a network is prepared, one also needs the other
components used to train the network, such as a loss function and a solver.
First, to create the profile and see the results, run the following
code.
# Create graph again just in case
    nn.clear_parameters()  # call this in case you want to run the following code again
# Context
from nnabla.ext_utils import get_extension_context
device = "cudnn"
ctx = get_extension_context(device)
nn.set_default_context(ctx)
# Network
x = nn.Variable([4, 3, 128, 128])
t = nn.Variable([4, 1])
pred = network(x)
loss = F.mean(F.softmax_cross_entropy(pred, t))
# Solver
solver = S.Momentum()
solver.set_parameters(nn.get_parameters())
# Profiler
from nnabla.utils.profiler import GraphProfiler
B = GraphProfiler(loss, solver=solver, device_id=0, ext_name=device, n_run=100)
B.run()
print("Profile finished.")
# Report
from nnabla.utils.profiler import GraphProfilerCsvWriter
with open("./profile.csv", "w") as f:
writer = GraphProfilerCsvWriter(B, file=f)
writer.write()
print("Report is prepared.")
::
] | null | null | null | ::
ZIP: 244
Title: Transaction Identifier Non-Malleability
Owners: Kris Nuttycombe <kris@electriccoin.co>
Daira Hopwood <daira@electriccoin.co>
Jack Grigg <str4d@electriccoin.co>
Status: Proposed
Category: Consensus
Created: 2021-01-06
License: MIT
Discussions-To: <https://github.com/zcash/zips/issues/411>
===========
Terminology
===========
The key words "MUST" and "MUST NOT" in this document are to be interpreted as described in RFC 2119. [#RFC2119]_
The terms "consensus branch", "epoch", and "network upgrade" in this document are to be interpreted as
described in ZIP 200. [#zip-0200]_
The term "field encoding" refers to the binary serialized form of a Zcash transaction
field, as specified in section 7.1 of the Zcash protocol specification
[#protocol-txnencoding]_.
========
Abstract
========
This proposal defines a new transaction digest algorithm for the NU5 network upgrade
onward, in order to introduce non-malleable transaction identifiers that commit to
all transaction data except for attestations to transaction validity.
This proposal also defines a new transaction digest algorithm for signature validation,
which shares all available structure produced during the construction of transaction
identifiers, in order to minimize redundant data hashing in validation.
This proposal also defines a new name and semantics for the ``hashLightClientRoot`` field of the
block header, to enable additional commitments to be represented in this hash and to
provide a mechanism for future extensibility of the set of commitments represented.
==========
Motivation
==========
In all cases, but particularly in order to support the use of transactions in
higher-level protocols, any modification of the transaction that has not been
explicitly permitted (such as via anyone-can-spend inputs) should invalidate
attestations to spend authority or to the included outputs. Following the activation
of this proposed change, transaction identifiers will be stable irrespective of
any possible malleation of "witness data" such as proofs and transaction
signatures.
In addition, by specifying a transaction identifier and signature algorithm
that is decoupled from the serialized format of the transaction as a whole,
this change makes it so that the wire format of transactions is no longer
consensus-critical.
============
Requirements
============
- Continue to support existing functionality of the protocol (multisig,
signing modes for transparent inputs).
- Allow the use of transaction ids, and pairs of the form (transaction id,
output index) as stable identifiers.
- A sender must be able to recognize their own transaction, even given allowed
forms of malleability such as recomputation of transaction signatures.
- In the case of transparent inputs, it should be possible to create a
transaction (B) that spends the outputs from a previous transaction (A) even
before (A) has been mined. This should also be possible in the case that the
creator of (B) does not wait for confirmations of (A). That is, (B) should remain
valid so long as any variant of (A) is eventually mined.
- It should not be possible for an attacker to malleate a transaction in a
fashion that would result in the transaction being interpreted as a
double-spend.
- It should be possible in the future to upgrade the protocol in such a fashion
that only non-malleable transactions are accepted.
- It should be possible to use the transaction id unmodified as the value that
is used to produce a signature hash in the case that the transaction contains
no transparent inputs.
================
Non-requirements
================
In order to support backwards-compatibility with parts of the ecosystem that
have not yet upgraded to the non-malleable transaction format, it is not an
initial requirement that all transactions be non-malleable.
It is not required that legacy (Sapling V4 and earlier) transaction formats
support construction of non-malleable transaction identifiers, even though
they may continue to be accepted by the network after the NU5 upgrade.
=============
Specification
=============
-------
Digests
-------
All digests are personalized BLAKE2b-256 hashes. In cases where no elements are available
for hashing (for example, if there are no transparent transaction inputs or no Orchard
actions), a personalized hash of the empty byte array will be used. The personalization
string therefore provides domain separation for the hashes of even empty data fields.
The notation ``BLAKE2b-256(personalization_string, [])`` is used to refer to hashes
constructed in this manner.
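As a non-normative sketch, such an empty-input personalized digest can be
computed with Python's ``hashlib``, whose BLAKE2b implementation supports
16-byte personalization strings directly:

.. code-block:: python

    import hashlib

    def personalized_empty_digest(personalization: bytes) -> bytes:
        # BLAKE2b-256 of the empty byte array; the personalization string
        # provides the domain separation described above.
        assert len(personalization) == 16  # BLAKE2b allows at most 16 bytes
        return hashlib.blake2b(b'', digest_size=32,
                               person=personalization).digest()

    # e.g. the digest used when a transaction has no transparent component:
    empty_transparent = personalized_empty_digest(b'ZTxIdTranspaHash')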
TxId Digest
===========
A new transaction digest algorithm is defined that constructs the identifier for
a transaction from a tree of hashes. Each branch of the tree of hashes will
correspond to a specific subset of transaction data. The overall structure of
the hash is as follows; each name referenced here will be described in detail
below::
txid_digest
├── header_digest
├── transparent_digest
│ ├── prevouts_digest
│ ├── sequence_digest
│ └── outputs_digest
├── sapling_digest
│ ├── sapling_spends_digest
│ │ ├── sapling_spends_compact_digest
│ │ └── sapling_spends_noncompact_digest
│ ├── sapling_outputs_digest
│ │ ├── sapling_outputs_compact_digest
│ │ ├── sapling_outputs_memos_digest
│ │ └── sapling_outputs_noncompact_digest
│ └── valueBalance
└── orchard_digest
├── orchard_actions_compact_digest
├── orchard_actions_memos_digest
├── orchard_actions_noncompact_digest
├── flagsOrchard
├── valueBalanceOrchard
└── anchorOrchard
Each node written as ``snake_case`` in this tree is a BLAKE2b-256 hash of its
children, initialized with a personalization string specific to that branch
of the tree. Nodes that are not themselves digests are written in ``camelCase``.
In the specification below, nodes of the tree are presented in depth-first order.
txid_digest
-----------
A BLAKE2b-256 hash of the following values ::
T.1: header_digest (32-byte hash output)
T.2: transparent_digest (32-byte hash output)
T.3: sapling_digest (32-byte hash output)
T.4: orchard_digest (32-byte hash output)
The personalization field of this hash is set to::
"ZcashTxHash_" || CONSENSUS_BRANCH_ID
``ZcashTxHash_`` has 1 underscore character.
As in ZIP 143 [#zip-0143]_, CONSENSUS_BRANCH_ID is the 4-byte little-endian encoding of
the consensus branch ID for the epoch of the block containing the transaction. Domain
separation of the transaction id hash across parallel consensus branches provides replay
protection: transactions targeted for one consensus branch will not have the same
transaction identifier on other consensus branches.
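For illustration only (using the NU5 consensus branch ID value ``0xC2D6D0B4``
as an example), the 16-byte personalization string can be assembled as:

.. code-block:: python

    import struct

    def txid_personalization(consensus_branch_id: int) -> bytes:
        # "ZcashTxHash_" (12 bytes) || 4-byte little-endian branch ID = 16 bytes
        return b'ZcashTxHash_' + struct.pack('<I', consensus_branch_id)

    person = txid_personalization(0xC2D6D0B4)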
This signature hash personalization prefix has been changed to reflect the new role of
this hash (relative to ``ZcashSigHash`` as specified in ZIP 143) as a transaction
identifier rather than a commitment that is exclusively used for signature purposes.
The previous computation of the transaction identifier was a SHA256d hash of the
serialized transaction contents, and was not personalized.
T.1: header_digest
``````````````````
A BLAKE2b-256 hash of the following values ::
T.1a: version (4-byte little-endian version identifier including overwinter flag)
T.1b: version_group_id (4-byte little-endian version group identifier)
T.1c: consensus_branch_id (4-byte little-endian consensus branch id)
T.1d: lock_time (4-byte little-endian nLockTime value)
T.1e: expiry_height (4-byte little-endian block height)
The personalization field of this hash is set to::
"ZTxIdHeadersHash"
T.2: transparent_digest
```````````````````````
In the case that transparent inputs or outputs are present, the transparent digest
consists of a BLAKE2b-256 hash of the following values ::
T.2a: prevouts_digest (32-byte hash)
T.2b: sequence_digest (32-byte hash)
T.2c: outputs_digest (32-byte hash)
The personalization field of this hash is set to::
"ZTxIdTranspaHash"
In the case that the transaction has no transparent components, ``transparent_digest`` is ::
BLAKE2b-256("ZTxIdTranspaHash", [])
T.2a: prevouts_digest
'''''''''''''''''''''
A BLAKE2b-256 hash of the field encoding of all ``outpoint``
field values of transparent inputs to the transaction.
The personalization field of this hash is set to::
"ZTxIdPrevoutHash"
In the case that the transaction has transparent outputs but no transparent inputs,
``prevouts_digest`` is ::
BLAKE2b-256("ZTxIdPrevoutHash", [])
T.2b: sequence_digest
'''''''''''''''''''''
A BLAKE2b-256 hash of the 32-bit little-endian representation of all ``nSequence``
field values of transparent inputs to the transaction.
The personalization field of this hash is set to::
"ZTxIdSequencHash"
In the case that the transaction has transparent outputs but no transparent inputs,
``sequence_digest`` is ::
BLAKE2b-256("ZTxIdSequencHash", [])
T.2c: outputs_digest
''''''''''''''''''''
A BLAKE2b-256 hash of the concatenated field encodings of all transparent
output values of the transaction. The field encoding of such an output consists
of the encoded output ``amount`` (8-byte little endian) followed by
the ``scriptPubKey`` byte array (serialized as Bitcoin script).
The personalization field of this hash is set to::
"ZTxIdOutputsHash"
In the case that the transaction has transparent inputs but no transparent outputs,
``outputs_digest`` is ::
BLAKE2b-256("ZTxIdOutputsHash", [])
T.3: sapling_digest
```````````````````
In the case that Sapling spends or outputs are present, the digest of Sapling components
is composed of two subtrees which are organized to permit easy interoperability with the
``CompactBlock`` representation of Sapling data specified by the ZIP 307 Light Client
Protocol [#zip-0307]_.
This digest is a BLAKE2b-256 hash of the following values ::
T.3a: sapling_spends_digest (32-byte hash)
T.3b: sapling_outputs_digest (32-byte hash)
T.3c: valueBalance (64-bit signed little-endian)
The personalization field of this hash is set to::
"ZTxIdSaplingHash"
In the case that the transaction has no Sapling spends or outputs, ``sapling_digest`` is ::
BLAKE2b-256("ZTxIdSaplingHash", [])
T.3a: sapling_spends_digest
'''''''''''''''''''''''''''
In the case that Sapling spends are present, this digest is a BLAKE2b-256 hash of the
following values ::
T.3a.i: sapling_spends_compact_digest (32-byte hash)
T.3a.ii: sapling_spends_noncompact_digest (32-byte hash)
The personalization field of this hash is set to::
"ZTxIdSSpendsHash"
In the case that the transaction has Sapling outputs but no Sapling spends,
``sapling_spends_digest`` is ::
BLAKE2b-256("ZTxIdSSpendsHash", [])
T.3a.i: sapling_spends_compact_digest
.....................................
A BLAKE2b-256 hash of the field encoding of all ``nullifier`` field
values of Sapling shielded spends belonging to the transaction.
The personalization field of this hash is set to::
"ZTxIdSSpendCHash"
T.3a.ii: sapling_spends_noncompact_digest
.........................................
A BLAKE2b-256 hash of the non-nullifier information for all Sapling shielded spends
belonging to the transaction, excluding both zkproof data and spend authorization
signature(s). For each spend, the following elements are included in the hash::
T.3a.ii.1: cv (field encoding bytes)
T.3a.ii.2: anchor (field encoding bytes)
T.3a.ii.3: rk (field encoding bytes)
In version 5 transactions, Sapling Spends have a shared anchor, which is hashed
into ``sapling_spends_noncompact_digest`` for *each* Spend.
The personalization field of this hash is set to::
"ZTxIdSSpendNHash"
T.3b: sapling_outputs_digest
''''''''''''''''''''''''''''
In the case that Sapling outputs are present, this digest is a BLAKE2b-256 hash of the
following values ::
T.3b.i: sapling_outputs_compact_digest (32-byte hash)
T.3b.ii: sapling_outputs_memos_digest (32-byte hash)
T.3b.iii: sapling_outputs_noncompact_digest (32-byte hash)
The personalization field of this hash is set to::
"ZTxIdSOutputHash"
In the case that the transaction has Sapling spends but no Sapling outputs,
``sapling_outputs_digest`` is ::
BLAKE2b-256("ZTxIdSOutputHash", [])
T.3b.i: sapling_outputs_compact_digest
......................................
A BLAKE2b-256 hash of the subset of Sapling output information included in the
ZIP-307 [#zip-0307]_ ``CompactBlock`` format for all Sapling shielded outputs
belonging to the transaction. For each output, the following elements are included
in the hash::
T.3b.i.1: cmu (field encoding bytes)
T.3b.i.2: ephemeral_key (field encoding bytes)
T.3b.i.3: enc_ciphertext[..52] (First 52 bytes of field encoding)
The personalization field of this hash is set to::
"ZTxIdSOutC__Hash" (2 underscore characters)
T.3b.ii: sapling_outputs_memos_digest
.....................................
A BLAKE2b-256 hash of the subset of Sapling shielded memo field data for all Sapling
shielded outputs belonging to the transaction. For each output, the following elements
are included in the hash::
T.3b.ii.1: enc_ciphertext[52..564] (contents of the encrypted memo field)
The personalization field of this hash is set to::
"ZTxIdSOutM__Hash" (2 underscore characters)
T.3b.iii: sapling_outputs_noncompact_digest
...........................................
A BLAKE2b-256 hash of the remaining subset of Sapling output information **not** included
in the ZIP 307 [#zip-0307]_ ``CompactBlock`` format, excluding zkproof data, for all
Sapling shielded outputs belonging to the transaction. For each output, the following
elements are included in the hash::
T.3b.iii.1: cv (field encoding bytes)
T.3b.iii.2: enc_ciphertext[564..] (post-memo Poly1305 AEAD tag of field encoding)
T.3b.iii.3: out_ciphertext (field encoding bytes)
The personalization field of this hash is set to::
"ZTxIdSOutN__Hash" (2 underscore characters)
T.4: orchard_digest
```````````````````
In the case that Orchard actions are present in the transaction, this digest is
a BLAKE2b-256 hash of the following values ::
T.4a: orchard_actions_compact_digest (32-byte hash output)
T.4b: orchard_actions_memos_digest (32-byte hash output)
T.4c: orchard_actions_noncompact_digest (32-byte hash output)
T.4d: flagsOrchard (1 byte)
T.4e: valueBalanceOrchard (64-bit signed little-endian)
T.4f: anchorOrchard (32 bytes)
The personalization field of this hash is set to::
"ZTxIdOrchardHash"
In the case that the transaction has no Orchard actions, ``orchard_digest`` is ::
BLAKE2b-256("ZTxIdOrchardHash", [])
T.4a: orchard_actions_compact_digest
''''''''''''''''''''''''''''''''''''
A BLAKE2b-256 hash of the subset of Orchard Action information intended to be included in
an updated version of the ZIP-307 [#zip-0307]_ ``CompactBlock`` format for all Orchard
Actions belonging to the transaction. For each Action, the following elements are included
in the hash::
T.4a.i : nullifier (field encoding bytes)
T.4a.ii : cmx (field encoding bytes)
T.4a.iii: ephemeralKey (field encoding bytes)
T.4a.iv : encCiphertext[..52] (First 52 bytes of field encoding)
The personalization field of this hash is set to::
"ZTxIdOrcActCHash"
T.4b: orchard_actions_memos_digest
''''''''''''''''''''''''''''''''''
A BLAKE2b-256 hash of the subset of Orchard shielded memo field data for all Orchard
Actions belonging to the transaction. For each Action, the following elements are included
in the hash::
T.4b.i: encCiphertext[52..564] (contents of the encrypted memo field)
The personalization field of this hash is set to::
"ZTxIdOrcActMHash"
T.4c: orchard_actions_noncompact_digest
'''''''''''''''''''''''''''''''''''''''
A BLAKE2b-256 hash of the remaining subset of Orchard Action information **not** intended
for inclusion in an updated version of the ZIP 307 [#zip-0307]_ ``CompactBlock``
format, for all Orchard Actions belonging to the transaction. For each Action,
the following elements are included in the hash::
T.4c.i : cv (field encoding bytes)
T.4c.ii : rk (field encoding bytes)
T.4c.iii: encCiphertext[564..] (post-memo suffix of field encoding)
T.4c.iv : outCiphertext (field encoding bytes)
The personalization field of this hash is set to::
"ZTxIdOrcActNHash"
Signature Digest
================
A new per-input transaction digest algorithm is defined that constructs a hash that may be
signed by a transaction creator to commit to the effects of the transaction. A signature
digest is produced for each transparent input, each Sapling input, and each Orchard
action. For transparent inputs, this follows closely the algorithms from ZIP 143 [#zip-0143]_
and ZIP 243 [#zip-0243]_. For shielded inputs, this algorithm has the exact same output
as the transaction digest algorithm, thus the txid may be signed directly.
The overall structure of the hash is as follows; each name referenced here will be
described in detail below::
signature_digest
├── header_digest
├── transparent_sig_digest
├── sapling_digest
└── orchard_digest
signature_digest
----------------
A BLAKE2b-256 hash of the following values ::
S.1: header_digest (32-byte hash output)
S.2: transparent_sig_digest (32-byte hash output)
S.3: sapling_digest (32-byte hash output)
S.4: orchard_digest (32-byte hash output)
The personalization field of this hash is set to::
"ZcashTxHash_" || CONSENSUS_BRANCH_ID
``ZcashTxHash_`` has 1 underscore character.
This value has the same personalization as the top hash of the transaction
identifier digest tree, so that what is being signed in the case that there are
no transparent inputs is just the transaction id.
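As a non-normative sketch, the personalization can be assembled as follows, assuming the 4-byte little-endian encoding of the consensus branch ID used elsewhere in this specification:

```python
import struct

def sig_hash_personalization(consensus_branch_id):
    # "ZcashTxHash_" (12 bytes, 1 underscore) plus the 4-byte
    # little-endian branch ID gives exactly the 16 bytes that
    # BLAKE2b allows for its personalization field.
    return b"ZcashTxHash_" + struct.pack("<I", consensus_branch_id)
```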
S.1: header_digest
``````````````````
Identical to that specified for the transaction identifier.
S.2: transparent_sig_digest
```````````````````````````
If we are producing a hash for either a coinbase transaction, or a non-coinbase
transaction that has no transparent inputs, the value of ``transparent_sig_digest``
is identical to the value specified in section `T.2 <#t-2-transparent-digest>`_.
If we are producing a hash for a non-coinbase transaction that has transparent
inputs, the value of ``transparent_sig_digest`` depends upon the value of a
``hash_type`` flag, as follows.
The construction of each component below depends upon the values of the
``hash_type`` flag bits. Each component will be described separately.
This digest is a BLAKE2b-256 hash of the following values ::
S.2a: hash_type (1 byte)
S.2b: prevouts_sig_digest (32-byte hash)
S.2c: amounts_sig_digest (32-byte hash)
S.2d: scriptpubkeys_sig_digest (32-byte hash)
S.2e: sequence_sig_digest (32-byte hash)
S.2f: outputs_sig_digest (32-byte hash)
S.2g: txin_sig_digest (32-byte hash)
The personalization field of this hash is set to::
"ZTxIdTranspaHash"
S.2a: hash_type
'''''''''''''''
This is an 8-bit unsigned value. The ``SIGHASH`` encodings from the legacy
script system are reused: one of ``SIGHASH_ALL`` (0x01), ``SIGHASH_NONE`` (0x02),
and ``SIGHASH_SINGLE`` (0x03), with or without the ``SIGHASH_ANYONECANPAY`` flag
(0x80). The following restrictions apply, which cause validation failure if
violated:
- Using any undefined ``hash_type`` (not 0x01, 0x02, 0x03, 0x81, 0x82, or 0x83).
- Using ``SIGHASH_SINGLE`` without a "corresponding output" (an output with the
same index as the input being verified).
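A non-normative sketch of these validity checks (the helper name and parameters are illustrative):

```python
SIGHASH_ANYONECANPAY = 0x80
VALID_HASH_TYPES = {0x01, 0x02, 0x03, 0x81, 0x82, 0x83}

def check_hash_type(hash_type, input_index, n_transparent_outputs):
    # Returns True when hash_type is acceptable for the transparent
    # input at input_index.
    if hash_type not in VALID_HASH_TYPES:
        return False                  # undefined hash_type encodings
    if (hash_type & 0x7f) == 0x03:    # SIGHASH_SINGLE
        # requires a corresponding output at the same index
        return input_index < n_transparent_outputs
    return True
```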
If we are producing a hash for the signature over a transparent input, the value
of ``hash_type`` is obtained from the input's ``scriptSig`` as encoded in the
transaction. If we are producing a hash for the signature over a Sapling Spend
or an Orchard Action, ``hash_type`` is set to ``SIGHASH_ALL``.
S.2b: prevouts_sig_digest
'''''''''''''''''''''''''
This is a BLAKE2b-256 hash initialized with the personalization field value
``ZTxIdPrevoutHash``.
If the ``SIGHASH_ANYONECANPAY`` flag is not set::
identical to the value of ``prevouts_digest`` as specified for the
transaction identifier in section T.2a.
otherwise::
BLAKE2b-256(``ZTxIdPrevoutHash``, [])
S.2c: amounts_sig_digest
''''''''''''''''''''''''
If the ``SIGHASH_ANYONECANPAY`` flag is not set, the value of
``amounts_sig_digest`` is a BLAKE2b-256 hash of the concatenation of the 8-byte
signed little-endian representations of all ``value`` fields [#bdr-txout]_ for
the coins spent by the transparent inputs to the transaction.
The personalization field of this hash is set to::
"ZTxTrAmountsHash"
If the ``SIGHASH_ANYONECANPAY`` flag is set, ``amounts_sig_digest`` is::
BLAKE2b-256("ZTxTrAmountsHash", [])
S.2d: scriptpubkeys_sig_digest
''''''''''''''''''''''''''''''
If the ``SIGHASH_ANYONECANPAY`` flag is not set, the value of
``scriptpubkeys_sig_digest`` is a BLAKE2b-256 hash of the concatenation of the
field encodings (each including a leading ``CompactSize``) of all ``pk_script``
fields [#bdr-txout]_ for the coins spent by the transparent inputs to the
transaction.
The personalization field of this hash is set to::
"ZTxTrScriptsHash"
If the ``SIGHASH_ANYONECANPAY`` flag is set, ``scriptpubkeys_sig_digest`` is::
BLAKE2b-256("ZTxTrScriptsHash", [])
S.2e: sequence_sig_digest
'''''''''''''''''''''''''
This is a BLAKE2b-256 hash initialized with the personalization field value
``ZTxIdSequencHash``.
If the ``SIGHASH_ANYONECANPAY`` flag is not set::
identical to the value of ``sequence_digest`` as specified for the
transaction identifier in section T.2b.
otherwise::
BLAKE2b-256(``ZTxIdSequencHash``, [])
S.2f: outputs_sig_digest
''''''''''''''''''''''''
This is a BLAKE2b-256 hash initialized with the personalization field value
``ZTxIdOutputsHash``.
If the sighash type is neither ``SIGHASH_SINGLE`` nor ``SIGHASH_NONE``::
identical to the value of ``outputs_digest`` as specified for the
transaction identifier in section T.2c.
If the sighash type is ``SIGHASH_SINGLE`` and the signature hash is being computed for
the transparent input at a particular index, and a transparent output appears in
the transaction at that index::
the hash is over the transaction serialized form of the transparent output at that
index
otherwise::
BLAKE2b-256(``ZTxIdOutputsHash``, [])
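A non-normative sketch of this three-way case analysis, under the simplifying assumption that each transparent output has already been serialized to bytes:

```python
import hashlib

SIGHASH_NONE = 0x02
SIGHASH_SINGLE = 0x03

def outputs_sig_digest(hash_type, serialized_outputs, input_index):
    # serialized_outputs: list of pre-serialized transparent outputs
    # (amount || scriptPubKey), a simplifying assumption.
    h = hashlib.blake2b(digest_size=32, person=b"ZTxIdOutputsHash")
    base = hash_type & 0x7f  # mask off SIGHASH_ANYONECANPAY
    if base not in (SIGHASH_SINGLE, SIGHASH_NONE):
        for out in serialized_outputs:             # same data as T.2c
            h.update(out)
    elif base == SIGHASH_SINGLE and input_index < len(serialized_outputs):
        h.update(serialized_outputs[input_index])  # corresponding output only
    # SIGHASH_NONE, or SIGHASH_SINGLE with no corresponding output:
    # nothing is hashed, matching the empty-input convention.
    return h.digest()
```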
S.2g: txin_sig_digest
'''''''''''''''''''''
If we are producing a hash for the signature over a transparent input, the value
of ``txin_sig_digest`` is a BLAKE2b-256 hash of the following properties of the
transparent input being signed, initialized with the personalization field value
``Zcash___TxInHash`` (3 underscores)::
S.2g.i: prevout (field encoding)
S.2g.ii: value (8-byte signed little-endian)
S.2g.iii: scriptPubKey (field encoding)
S.2g.iv: nSequence (4-byte unsigned little-endian)
Notes:
- ``value`` is defined in the consensus rules to be a nonnegative value <=
``MAX_MONEY``, but all existing implementations parse this value as signed and
enforce the nonnegative constraint as a consensus check. It is defined as
signed here for consistency with those existing implementations.
- ``scriptPubKey`` is the field encoding (including a leading ``CompactSize``)
of the ``pk_script`` field [#bdr-txout]_ for the coin spent by the transparent
input. For P2SH coins, this differs from the ``redeemScript`` committed to in
ZIP 243 [#zip-0243]_.
If we are producing a hash for the signature over a Sapling Spend or an Orchard
Action, ``txin_sig_digest`` is::
BLAKE2b-256("Zcash___TxInHash", [])
S.3: sapling_digest
```````````````````
Identical to that specified for the transaction identifier.
S.4: orchard_digest
```````````````````
Identical to that specified for the transaction identifier.
Authorizing Data Commitment
===========================
A new transaction digest algorithm is defined that constructs a digest which commits
to the authorizing data of a transaction from a tree of BLAKE2b-256 hashes.
For v5 transactions, the overall structure of the hash is as follows::
auth_digest
├── transparent_scripts_digest
├── sapling_auth_digest
└── orchard_auth_digest
Each node written as ``snake_case`` in this tree is a BLAKE2b-256 hash of authorizing
data of the transaction.
For transaction versions before v5, a placeholder value consisting of 32 bytes of
``0xFF`` is used in place of the authorizing data commitment. This is only used in
the tree committed to by ``hashAuthDataRoot``, as defined in `Block Header Changes`_.
The pair (Transaction Identifier, Auth Commitment) constitutes a commitment to all the
data of a serialized transaction that may be included in a block.
auth_digest
-----------
A BLAKE2b-256 hash of the following values ::
A.1: transparent_scripts_digest (32-byte hash output)
A.2: sapling_auth_digest (32-byte hash output)
A.3: orchard_auth_digest (32-byte hash output)
The personalization field of this hash is set to::
"ZTxAuthHash_" || CONSENSUS_BRANCH_ID
``ZTxAuthHash_`` has 1 underscore character.
A.1: transparent_scripts_digest
```````````````````````````````
In the case that the transaction contains transparent inputs, this is a BLAKE2b-256 hash
of the field encoding of the concatenated values of the Bitcoin script values associated
with each transparent input belonging to the transaction.
The personalization field of this hash is set to::
"ZTxAuthTransHash"
In the case that the transaction has no transparent inputs, ``transparent_scripts_digest`` is ::
BLAKE2b-256("ZTxAuthTransHash", [])
A.2: sapling_auth_digest
````````````````````````
In the case that Sapling Spends or Sapling Outputs are present, this is a BLAKE2b-256 hash
of the field encoding of the Sapling ``zkproof`` value of each Sapling Spend Description,
followed by the field encoding of the ``spend_auth_sig`` value of each Sapling Spend
Description belonging to the transaction, followed by the field encoding of the
``zkproof`` field of each Sapling Output Description belonging to the transaction,
followed by the field encoding of the binding signature::
A.2a: spend_zkproofs (field encoding bytes)
A.2b: spend_auth_sigs (field encoding bytes)
A.2c: output_zkproofs (field encoding bytes)
A.2d: binding_sig (field encoding bytes)
The personalization field of this hash is set to::
"ZTxAuthSapliHash"
In the case that the transaction has no Sapling Spends or Sapling Outputs,
``sapling_auth_digest`` is ::
BLAKE2b-256("ZTxAuthSapliHash", [])
A.3: orchard_auth_digest
````````````````````````
In the case that Orchard Actions are present, this is a BLAKE2b-256 hash of the field
encoding of the ``zkProofsOrchard``, ``spendAuthSigsOrchard``, and ``bindingSigOrchard``
fields of the transaction::
A.3a: proofsOrchard (field encoding bytes)
A.3b: vSpendAuthSigsOrchard (field encoding bytes)
A.3c: bindingSigOrchard (field encoding bytes)
The personalization field of this hash is set to::
"ZTxAuthOrchaHash"
In the case that the transaction has no Orchard Actions, ``orchard_auth_digest`` is ::
BLAKE2b-256("ZTxAuthOrchaHash", [])
--------------------
Block Header Changes
--------------------
The nonmalleable transaction identifier specified by this ZIP will be used
in the place of the current malleable transaction identifier within the
Merkle tree committed to by the ``hashMerkleRoot`` value. However, this
change now means that ``hashMerkleRoot`` is not sufficient to fully commit
to the transaction data, including witnesses, that appear within the block.
As a consequence, we now need to add a new commitment to the block header.
This commitment will be the root of a Merkle tree having leaves that are
transaction authorizing data commitments, produced according to the
`Authorizing Data Commitment`_ part of this specification. The insertion
order for this Merkle tree MUST be identical to the insertion order of
transaction identifiers into the Merkle tree that is used to construct
``hashMerkleRoot``, such that a path through this Merkle tree to a
transaction identifies the same transaction as that path reaches in the tree
rooted at ``hashMerkleRoot``.
This new commitment is named ``hashAuthDataRoot`` and is the root of a
binary Merkle tree of transaction authorizing data commitments having height
:math:`\mathsf{ceil(log_2(tx\_count))}`, padded with leaves having the "null"
hash value ``[0u8; 32]``. Note that :math:`\mathsf{log_2(tx\_count)}` is
well-defined because :math:`\mathsf{tx\_count} > 0`, due to the coinbase
transaction in each block. Non-leaf hashes in this tree are BLAKE2b-256
hashes personalized by the string ``"ZcashAuthDatHash"``.
Changing the block header format to allow space for an additional
commitment is somewhat invasive. Instead, the name and meaning of the
``hashLightClientRoot`` field, described in ZIP 221 [#zip-0221]_, are changed.
``hashLightClientRoot`` is renamed to ``hashBlockCommitments``. The value
of this hash is the BLAKE2b-256 hash personalized by the string ``"ZcashBlockCommit"``
of the following elements::
hashLightClientRoot (as described in ZIP 221)
hashAuthDataRoot (as described below)
terminator [0u8;32]
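A non-normative sketch of this computation:

```python
import hashlib

def hash_block_commitments(hash_light_client_root, hash_auth_data_root,
                           terminator=b"\x00" * 32):
    # Concatenate the two 32-byte commitments and the terminator,
    # then hash with the "ZcashBlockCommit" personalization.
    assert len(hash_light_client_root) == 32
    assert len(hash_auth_data_root) == 32
    data = hash_light_client_root + hash_auth_data_root + terminator
    return hashlib.blake2b(data, digest_size=32,
                           person=b"ZcashBlockCommit").digest()
```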
This representation treats the ``hashBlockCommitments`` value as a linked
list of hashes terminated by arbitrary data. In the case of protocol upgrades
where additional commitments need to be included in the block header, it is
possible to replace this terminator with the hash of a newly defined structure
which ends in a similar terminator. Fully validating nodes MUST always use the
entire structure defined by the latest activated protocol version that they
support.
The linked structure of this hash is intended to provide extensibility for
use by light clients which may be connected to a third-party server that supports
a later protocol version. Such a third party SHOULD provide a value that can
be used instead of the all-zeros terminator to permit the light client to
perform validation of the parts of the structure it needs.
Unlike the ``hashLightClientRoot`` change, the change to ``hashBlockCommitments``
happens in the block that activates this ZIP.
The block header byte format and version are not altered by this ZIP.
=========
Rationale
=========
In S.2, we use the same personalization strings for fields that have matching
fields in T.2, in order to facilitate reuse of their digests. In particular, the
"no transparent inputs or outputs" case of S.2 is identical to the equivalent
case in T.2; thus for fully shielded transactions, ``signature_digest`` is
equal to ``txid_digest``.
Several changes in this ZIP (relative to ZIP 243 [#zip-0243]_) were made to
align with BIP 341 [#bip-0341]_:
- The ``hash_type`` field is now restricted via a new consensus rule to be one
of a specific set of sighash type encodings. The rationale for this change is
inherited from BIP 341 [#bip-0341-hash_type]_.
- Note however that we do not define ``SIGHASH_DEFAULT``, as it is equivalent
to ``SIGHASH_ALL``, and we prefer the encodings to be canonical.
- Two new commitments (``amounts_sig_digest`` and ``scriptpubkeys_sig_digest``)
were added, to address difficulties in the case of a hardware wallet signing
transparent inputs. ``scriptpubkeys_sig_digest`` helps the hardware wallet to
determine the subset of inputs belonging to it [#bip-0341-scriptPubKey]_.
``amounts_sig_digest`` prevents the transaction creator from lying to the
hardware wallet about the transaction fee [#bip-0341-amount]_. Without these
commitments, the hardware wallet would need to be sent every transaction
containing an outpoint referenced in the transaction being signed.
- The semantics of ``sequence_sig_digest`` were changed, to commit to ``nSequence``
even if ``SIGHASH_SINGLE`` or ``SIGHASH_NONE`` is set. The rationale for this
change is inherited from BIP 341 [#bip-0341-nSequence]_.
- The semantics of ``outputs_sig_digest`` were changed, via a new consensus rule
that rejects transparent inputs for which ``SIGHASH_SINGLE`` is set without a
corresponding transparent output at the same index. BIP 341 does not give a
rationale for this change, but without it these inputs were effectively using
``SIGHASH_NONE``, which is silently misleading.
- The semantics of ``txin_sig_digest`` were changed, to always commit to the
``scriptPubKey`` field of the transparent coin being spent, instead of the
script actually being executed at the time ``signature_digest`` is calculated.
- This ensures that the signature commits to the entire committed script. In
Taproot, this makes it possible to prove to a hardware wallet what (unused)
execution paths exist [#bip-0341-scriptPubKey]_. Alternate execution paths
don't exist for P2PKH (where the executed script is ``scriptPubKey``) or
P2SH (where ``scriptPubKey`` is fully executed prior to ``redeemScript``).
- For P2SH, this means we commit to the Hash160 digest of ``redeemScript``
instead of the actual script. Note that the Bitcoin P2SH design depends
entirely on Hash160 being preimage-resistant, because otherwise anyone would
be able to spend someone else's P2SH UTXO using a preimage. We do need to
ensure that there is no collision attack; this holds because even if an
adversary could find a Hash160 collision, it would only enable them to
alter the input's ``scriptSig`` field. Doing so doesn't alter the effecting
data of the transaction, which by definition means the transaction has the
same effect under consensus (spends the same inputs and produces the same
outputs).
Signatures over Sapling Spends or Orchard Actions, in transactions containing
transparent inputs, commit to the same data that the transparent inputs do,
including all of the transparent input values. Without this commitment, there
would be a similar difficulty for a hardware wallet in the case where it is
only signing shielded inputs, when the transaction also contains transparent
inputs from a malicious other party, because that party could lie about their
coins' values.
By contrast, binding signatures for shielded coinbase transactions continue to
be over the transaction ID, as for non-coinbase transactions without transparent
inputs. This is necessary because coinbase transactions have a single "dummy"
transparent input element that has no corresponding previous output to commit
to. It is also sufficient because the data in that transparent input either is
already bound elsewhere (namely the block height, placed in ``expiry_height``
from NU5 activation), or does not need to be bound to the shielded outputs
(e.g. miner-identifying information).
========================
Reference implementation
========================
- https://github.com/zcash/librustzcash/pull/319/files
==========
References
==========
.. [#RFC2119] `RFC 2119: Key words for use in RFCs to Indicate Requirement Levels <https://www.rfc-editor.org/rfc/rfc2119.html>`_
.. [#protocol-txnencoding] `Zcash Protocol Specification, Version 2021.2.16 [NU5 proposal]. Section 7.1: Transaction Encoding and Consensus <protocol/protocol.pdf#txnencoding>`_
.. [#zip-0200] `ZIP 200: Network Upgrade Mechanism <zip-0200.rst>`_
.. [#zip-0221] `ZIP 221: FlyClient - Consensus Layer Changes <zip-0221.rst>`_
.. [#zip-0076] `ZIP 76: Transaction Signature Validation before Overwinter <zip-0076.rst>`_
.. [#zip-0143] `ZIP 143: Transaction Signature Validation for Overwinter <zip-0143.rst>`_
.. [#zip-0243] `ZIP 243: Transaction Signature Validation for Sapling <zip-0243.rst>`_
.. [#zip-0307] `ZIP 307: Light Client Protocol for Payment Detection <zip-0307.rst>`_
.. [#bip-0341] `BIP 341: Taproot: SegWit version 1 spending rules <https://github.com/bitcoin/bips/blob/master/bip-0341.mediawiki>`_
.. [#bip-0341-hash_type] `Why reject unknown hash_type values? <https://github.com/bitcoin/bips/blob/master/bip-0341.mediawiki#cite_note-13>`_
.. [#bip-0341-scriptPubKey] `Why does the signature message commit to the scriptPubKey? <https://github.com/bitcoin/bips/blob/master/bip-0341.mediawiki#cite_note-17>`_
.. [#bip-0341-amount] `Why does the signature message commit to the amounts of all transaction inputs? <https://github.com/bitcoin/bips/blob/master/bip-0341.mediawiki#cite_note-18>`_
.. [#bip-0341-nSequence] `Why does the signature message commit to all input nSequence if SIGHASH_SINGLE or SIGHASH_NONE are set? <https://github.com/bitcoin/bips/blob/master/bip-0341.mediawiki#cite_note-19>`_
.. [#bdr-txout] `Bitcoin Developer Reference. TxOut: A Transaction Output <https://developer.bitcoin.org/reference/transactions.html#txout-a-transaction-output>`_
.. Data
Gene annotation data
*********************
.. _data_sources:
Data sources
------------
We currently obtain the gene annotation data from several public data resources and keep them up-to-date, so that you don't have to do it:
============ ======================= =================================
Source Update frequency Notes
============ ======================= =================================
NCBI Entrez weekly snapshot
Ensembl whenever a new | Ensembl Pre! and EnsemblGenomes
release is available | are not included at the moment
Uniprot whenever a new
release is available
NetAffy whenever a new
release is available
PharmGKB whenever a new
release is available
UCSC whenever a new For "exons" field
release is available
CPDB whenever a new For "pathway" field
release is available
============ ======================= =================================
The most up-to-date data source information can be accessed `here <http://mygene.info/v3/metadata>`_.
.. _gene_object:
Gene object
------------
Gene annotation data are both stored and returned as a gene object, which is essentially a collection of fields (attributes) and their values:
.. code-block :: json
    {
      "_id": "1017",
      "taxid": 9606,
      "symbol": "CDK2",
      "entrezgene": 1017,
      "name": "cyclin-dependent kinase 2",
      "genomic_pos": {
        "start": 56360553,
        "chr": "12",
        "end": 56366568,
        "strand": 1
      }
    }
The example above omits most of available fields. For a full example, you can just check out a few gene examples: `CDK2 <http://mygene.info/v3/gene/1017>`_, `ADA <http://mygene.info/v3/gene/100>`_. Or, did you try our `interactive API page <http://mygene.info/v3/api>`_ yet?
.. _species:
Species
------------
We support **ALL** species annotated by NCBI and Ensembl. All of our services allow you to pass a "**species**" parameter to limit the query results. The "species" parameter accepts taxonomy ids as input. You can look up the taxonomy ids for your favorite species at `NCBI Taxonomy <http://www.ncbi.nlm.nih.gov/taxonomy>`_.
For convenience, we allow you to pass these *common names* for commonly used species (e.g. "species=human,mouse,rat"):
.. container:: species-table
=========== ======================= ===========
Common name Genus name Taxonomy id
=========== ======================= ===========
human Homo sapiens 9606
mouse Mus musculus 10090
rat Rattus norvegicus 10116
fruitfly Drosophila melanogaster 7227
nematode Caenorhabditis elegans 6239
zebrafish Danio rerio 7955
thale-cress Arabidopsis thaliana 3702
frog Xenopus tropicalis 8364
pig Sus scrofa 9823
=========== ======================= ===========
If needed, you can pass "species=all" to query against all available species, although we recommend passing only the specific species you need, for faster responses.
.. _genome_assemblies:
Genome assemblies
----------------------------
Our `gene query service <query_service.html>`_ supports `genome interval queries <query_service.html#genome-interval-query>`_. We import genomic location data from Ensembl, so all species available there are supported. You can find their reference genome assembly information `here <http://www.ensembl.org/info/about/species.html>`_.
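For example, the parameters for an interval query against the query service can be assembled like this (the helper and its defaults are illustrative, not part of the API):

```python
def interval_query_params(chrom, start, end, species="human"):
    # Build the chrN:start-end interval syntax accepted by the
    # query service, plus the species filter.
    return {"q": "chr{0}:{1}-{2}".format(chrom, start, end),
            "species": species}

params = interval_query_params("12", 56350553, 56370568)
# params["q"] is "chr12:56350553-56370568"
```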
This table lists the genome assemblies for commonly used species:
.. container:: species-table
=========== ======================= =======================
Common name Genus name Genome assembly
=========== ======================= =======================
human Homo sapiens GRCh38 (hg38), also support hg19
mouse Mus musculus GRCm38 (mm10), also support mm9
rat Rattus norvegicus Rnor_5.0 (rn4)
fruitfly Drosophila melanogaster BDGP5 (dm3)
nematode Caenorhabditis elegans WBcel235 (ce10)
zebrafish Danio rerio Zv9 (danRer6)
frog Xenopus tropicalis JGI_4.2 (xenTro2)
pig Sus scrofa Sscrofa10.2 (susScr2)
=========== ======================= =======================
Available fields
----------------
The table below lists of all of the possible fields that could be in a gene object.
.. raw:: html
<table class='indexed-field-table stripe'>
<thead>
<tr>
<th>Field</th>
<th>Indexed</th>
<th>Type</th>
<th>Notes</th>
</tr>
</thead>
<tbody>
</tbody>
</table>
<div id="spacer" style="height:300px"></div>
===============================
Blinker Herald
===============================
.. image:: docs/The_Herald.jpg
:alt: The Herald
.. image:: https://img.shields.io/pypi/v/blinker_herald.svg
:target: https://pypi.python.org/pypi/blinker_herald
.. image:: https://travis-ci.org/SatelliteQE/blinker_herald.svg?branch=master
:target: https://travis-ci.org/SatelliteQE/blinker_herald
.. image:: https://readthedocs.org/projects/blinker-herald/badge/?version=latest
:target: https://readthedocs.org/projects/blinker-herald/?badge=latest
:alt: Documentation Status
.. image:: https://coveralls.io/repos/github/SatelliteQE/blinker_herald/badge.svg?branch=master
:target: https://coveralls.io/github/SatelliteQE/blinker_herald?branch=master
:alt: Coverage
The Blinker Herald includes helpers to easily emit signals using the excellent
`blinker`_ library.
Decorate a function or method with :code:`@blinker_herald.emit()`
and **pre** and **post** signals will be automatically emitted to
all connected handlers.
* Free software: ISC license
* Documentation: https://blinker_herald.readthedocs.org.
Features
--------
* All the features provided by `blinker`_
* `+` an easy decorator :code:`@emit()` that magically emits signals when your functions are called and when they return a result.
* A :code:`signals` namespace proxy to discover the signals in your project
* Customizable for your needs
Usage
-----
Let's say you have a class and want to emit a signal for a specific method::

    from blinker_herald import emit

    class SomeClass(object):

        @emit()
        def do_something(self, arg1):
            # here is where the 'pre' signal is magically sent
            return 'something done'
            # here is where the 'post' signal is magically sent

Using the :code:`@emit` decorator makes Blinker Herald emit signals for that
method, and you can now connect handlers to capture those signals.

You can capture the **pre** signal to manipulate the object::

    @SomeClass.do_something.pre.connect
    def handle_pre(sender, signal_emitter, **kwargs):
        signal_emitter.foo = 'bar'
        signal_emitter.do_another_thing()

And you can also capture the **post** signal to log the results::

    @SomeClass.do_something.post.connect
    def handle_post(sender, signal_emitter, result, **kwargs):
        logger.info("The method {0} returned {1}".format(sender, result))

.. note::

    Post-signals are only called if there were no exceptions
    raised during the processing of their related function.
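The behavior described in the note above can be illustrated with a stdlib-only sketch. This is not Blinker Herald's implementation (the library uses blinker signals, not plain lists); it is just a minimal decorator showing why the post notification is skipped when the wrapped function raises::

```python
import functools

# Hypothetical stand-in for @emit(): notify handlers around a method call.
# Handler registration is a plain list here, purely for illustration.
def emit(pre_handlers, post_handlers):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            for handler in pre_handlers:
                handler(sender=self)
            result = func(self, *args, **kwargs)  # an exception here skips the post loop
            for handler in post_handlers:
                handler(sender=self, result=result)
            return result
        return wrapper
    return decorator

log = []

class SomeClass(object):
    @emit(pre_handlers=[lambda sender: log.append("pre")],
          post_handlers=[lambda sender, result: log.append("post:" + result)])
    def do_something(self):
        return "something done"

SomeClass().do_something()
print(log)  # ['pre', 'post:something done']
```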
You can also use the namespace proxy :code:`blinker_herald.signals` to connect
handlers to signals, the signal name is the prefix **pre** or **post**
followed by **_** and the method name::

    from blinker_herald import signals

    @signals.pre_do_something.connect
    def handle_pre(sender, signal_emitter, **kwargs):
        ...

If you have a lot of subclasses emitting signals with the same name and you
need to capture only specific signals, you can specify that you want to listen
to only one type of sender::

    from blinker_herald import emit, signals, SENDER_CLASS

    class BaseModel(object):
        ...

        @emit(sender=SENDER_CLASS)
        def create(self, **kwargs):
            new_instance = my_project.new(self, **kwargs)
            return new_instance

    class One(BaseModel):
        pass

    class Two(BaseModel):
        pass

.. note::

    By default the sender is always the instance, but you can use :code:`SENDER_CLASS`
    to force the sender to be the **class**. Other options are **SENDER_CLASS_NAME**,
    **SENDER_MODULE** and **SENDER_NAME**, and you can also pass a string, an object,
    or a lambda receiving the **sender** instance, e.g. :code:`@emit(sender=lambda self: self.get_sender())`.
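As an illustration of the sender options above, a sender specification might be resolved as in the sketch below. The constant names mirror the ones documented here, but the resolution logic itself is an assumption for illustration, not blinker_herald's actual code::

```python
# Hedged sketch: these constants mirror the documented names, but the
# resolution logic below is an illustrative guess, not the library's code.
SENDER_CLASS = "sender_class"
SENDER_CLASS_NAME = "sender_class_name"
SENDER_MODULE = "sender_module"

def resolve_sender(spec, instance):
    if callable(spec):                      # e.g. lambda self: self.get_sender()
        return spec(instance)
    if spec == SENDER_CLASS:
        return instance.__class__           # the class itself (cls)
    if spec == SENDER_CLASS_NAME:
        return instance.__class__.__name__  # e.g. "One"
    if spec == SENDER_MODULE:
        return instance.__class__.__module__
    return spec                             # a plain string or object is used as-is

class One(object):
    pass

print(resolve_sender(SENDER_CLASS, One()) is One)  # True
print(resolve_sender(SENDER_CLASS_NAME, One()))    # One
```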
Using :code:`SENDER_CLASS` you can now connect to a specific signal::

    from blinker_herald import signals

    @signals.post_create.connect_via(One)
    def handle_post_only_for_one(sender, signal_emitter, result, **kwargs):
        # sender is the class One (cls)
        # signal_emitter is the instance of the class One (self)
        # result is the return value of the method create
        ...

The above will handle the :code:`create` method signal for the class **One**, but not for the class **Two**.
You can also be more specific about the signal you want to connect to, using the
**__** double underscore to provide the full method path::

    from blinker_herald import signals

    @signals.module_name__ClassName__post_method_name.connect
    def handle_post(sender, signal_emitter, result, **kwargs):
        ...

The above will connect to the **post** signal emitted by :code:`module_name.ClassName.method_name`
.. note::

    You don't have to use the pattern above if your project does not have many
    method name collisions; using only the method name will be just fine for most cases.
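The naming pattern above can be summarized with a small helper that composes the qualified signal name; the function itself is hypothetical, not part of blinker_herald's API::

```python
# Hypothetical helper composing the documented signal name pattern
# "<module>__<Class>__<pre|post>_<method>"; not part of blinker_herald's API.
def signal_name(prefix, method, cls=None, module=None):
    name = "{0}_{1}".format(prefix, method)
    if cls is not None:
        name = "{0}__{1}".format(cls, name)
    if module is not None:
        name = "{0}__{1}".format(module, name)
    return name

print(signal_name("post", "create"))  # post_create
print(signal_name("post", "method_name", cls="ClassName", module="module_name"))
# module_name__ClassName__post_method_name
```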
Credits
-------
This software was first created by the SatelliteQE team to provide signals to
Robottelo and Nailgun.
.. _blinker: http://pypi.python.org/pypi/blinker
.. _ver-202101-0:
****************************************************************************************************
`Ver. 202101.0 (January 5, 2021) <https://github.com/freee/a11y-guidelines/releases/202101.0>`_
****************************************************************************************************

Reference: `freee Accessibility Guidelines Ver. 202101.0 released (Japanese) <https://developers.freee.co.jp/entry/a11y-guidelines-202101.0>`_

* Updated reference information

  - Added a way of hiding the mouse pointer to :ref:`exp-tab-order-check`, as a method for confirming that operations are possible using only the keyboard

* Fixed typos
.. Generated automatically by doc/tools/makerst.py in Godot's source tree.
.. DO NOT EDIT THIS FILE, but the SpatialMaterial.xml source instead.
.. The source is found in doc/classes or modules/<name>/doc_classes.
.. _class_SpatialMaterial:
SpatialMaterial
===============
**Inherits:** :ref:`Material<class_Material>` **<** :ref:`Resource<class_Resource>` **<** :ref:`Reference<class_Reference>` **<** :ref:`Object<class_Object>`
**Category:** Core
Brief Description
-----------------
Default 3D rendering material.
Properties
----------
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Color<class_Color>` | :ref:`albedo_color<class_SpatialMaterial_albedo_color>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`albedo_texture<class_SpatialMaterial_albedo_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`anisotropy<class_SpatialMaterial_anisotropy>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`anisotropy_enabled<class_SpatialMaterial_anisotropy_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`anisotropy_flowmap<class_SpatialMaterial_anisotropy_flowmap>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`ao_enabled<class_SpatialMaterial_ao_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`ao_light_affect<class_SpatialMaterial_ao_light_affect>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`ao_on_uv2<class_SpatialMaterial_ao_on_uv2>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`ao_texture<class_SpatialMaterial_ao_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` | :ref:`ao_texture_channel<class_SpatialMaterial_ao_texture_channel>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`clearcoat<class_SpatialMaterial_clearcoat>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`clearcoat_enabled<class_SpatialMaterial_clearcoat_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`clearcoat_gloss<class_SpatialMaterial_clearcoat_gloss>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`clearcoat_texture<class_SpatialMaterial_clearcoat_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`depth_deep_parallax<class_SpatialMaterial_depth_deep_parallax>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`depth_enabled<class_SpatialMaterial_depth_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`int<class_int>` | :ref:`depth_max_layers<class_SpatialMaterial_depth_max_layers>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`int<class_int>` | :ref:`depth_min_layers<class_SpatialMaterial_depth_min_layers>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`depth_scale<class_SpatialMaterial_depth_scale>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`depth_texture<class_SpatialMaterial_depth_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`detail_albedo<class_SpatialMaterial_detail_albedo>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`BlendMode<enum_SpatialMaterial_BlendMode>` | :ref:`detail_blend_mode<class_SpatialMaterial_detail_blend_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`detail_enabled<class_SpatialMaterial_detail_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`detail_mask<class_SpatialMaterial_detail_mask>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`detail_normal<class_SpatialMaterial_detail_normal>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`DetailUV<enum_SpatialMaterial_DetailUV>` | :ref:`detail_uv_layer<class_SpatialMaterial_detail_uv_layer>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`distance_fade_max_distance<class_SpatialMaterial_distance_fade_max_distance>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`distance_fade_min_distance<class_SpatialMaterial_distance_fade_min_distance>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`DistanceFadeMode<enum_SpatialMaterial_DistanceFadeMode>` | :ref:`distance_fade_mode<class_SpatialMaterial_distance_fade_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Color<class_Color>` | :ref:`emission<class_SpatialMaterial_emission>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`emission_enabled<class_SpatialMaterial_emission_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`emission_energy<class_SpatialMaterial_emission_energy>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`emission_on_uv2<class_SpatialMaterial_emission_on_uv2>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`EmissionOperator<enum_SpatialMaterial_EmissionOperator>` | :ref:`emission_operator<class_SpatialMaterial_emission_operator>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`emission_texture<class_SpatialMaterial_emission_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_albedo_tex_force_srgb<class_SpatialMaterial_flags_albedo_tex_force_srgb>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_disable_ambient_light<class_SpatialMaterial_flags_disable_ambient_light>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_do_not_receive_shadows<class_SpatialMaterial_flags_do_not_receive_shadows>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_ensure_correct_normals<class_SpatialMaterial_flags_ensure_correct_normals>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_fixed_size<class_SpatialMaterial_flags_fixed_size>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_no_depth_test<class_SpatialMaterial_flags_no_depth_test>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_transparent<class_SpatialMaterial_flags_transparent>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_unshaded<class_SpatialMaterial_flags_unshaded>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_use_point_size<class_SpatialMaterial_flags_use_point_size>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_vertex_lighting<class_SpatialMaterial_flags_vertex_lighting>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`flags_world_triplanar<class_SpatialMaterial_flags_world_triplanar>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`metallic<class_SpatialMaterial_metallic>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`metallic_specular<class_SpatialMaterial_metallic_specular>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`metallic_texture<class_SpatialMaterial_metallic_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` | :ref:`metallic_texture_channel<class_SpatialMaterial_metallic_texture_channel>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`normal_enabled<class_SpatialMaterial_normal_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`normal_scale<class_SpatialMaterial_normal_scale>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`normal_texture<class_SpatialMaterial_normal_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`params_alpha_scissor_threshold<class_SpatialMaterial_params_alpha_scissor_threshold>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`params_billboard_keep_scale<class_SpatialMaterial_params_billboard_keep_scale>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`BillboardMode<enum_SpatialMaterial_BillboardMode>` | :ref:`params_billboard_mode<class_SpatialMaterial_params_billboard_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`BlendMode<enum_SpatialMaterial_BlendMode>` | :ref:`params_blend_mode<class_SpatialMaterial_params_blend_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`CullMode<enum_SpatialMaterial_CullMode>` | :ref:`params_cull_mode<class_SpatialMaterial_params_cull_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`DepthDrawMode<enum_SpatialMaterial_DepthDrawMode>` | :ref:`params_depth_draw_mode<class_SpatialMaterial_params_depth_draw_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`DiffuseMode<enum_SpatialMaterial_DiffuseMode>` | :ref:`params_diffuse_mode<class_SpatialMaterial_params_diffuse_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`params_grow<class_SpatialMaterial_params_grow>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`params_grow_amount<class_SpatialMaterial_params_grow_amount>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`params_line_width<class_SpatialMaterial_params_line_width>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`params_point_size<class_SpatialMaterial_params_point_size>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`SpecularMode<enum_SpatialMaterial_SpecularMode>` | :ref:`params_specular_mode<class_SpatialMaterial_params_specular_mode>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`params_use_alpha_scissor<class_SpatialMaterial_params_use_alpha_scissor>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`int<class_int>` | :ref:`particles_anim_h_frames<class_SpatialMaterial_particles_anim_h_frames>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`int<class_int>` | :ref:`particles_anim_loop<class_SpatialMaterial_particles_anim_loop>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`int<class_int>` | :ref:`particles_anim_v_frames<class_SpatialMaterial_particles_anim_v_frames>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`proximity_fade_distance<class_SpatialMaterial_proximity_fade_distance>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`proximity_fade_enable<class_SpatialMaterial_proximity_fade_enable>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`refraction_enabled<class_SpatialMaterial_refraction_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`refraction_scale<class_SpatialMaterial_refraction_scale>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`refraction_texture<class_SpatialMaterial_refraction_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` | :ref:`refraction_texture_channel<class_SpatialMaterial_refraction_texture_channel>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`rim<class_SpatialMaterial_rim>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`rim_enabled<class_SpatialMaterial_rim_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`rim_texture<class_SpatialMaterial_rim_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`rim_tint<class_SpatialMaterial_rim_tint>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`roughness<class_SpatialMaterial_roughness>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`roughness_texture<class_SpatialMaterial_roughness_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` | :ref:`roughness_texture_channel<class_SpatialMaterial_roughness_texture_channel>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`subsurf_scatter_enabled<class_SpatialMaterial_subsurf_scatter_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`subsurf_scatter_strength<class_SpatialMaterial_subsurf_scatter_strength>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`subsurf_scatter_texture<class_SpatialMaterial_subsurf_scatter_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Color<class_Color>` | :ref:`transmission<class_SpatialMaterial_transmission>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`transmission_enabled<class_SpatialMaterial_transmission_enabled>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Texture<class_Texture>` | :ref:`transmission_texture<class_SpatialMaterial_transmission_texture>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Vector3<class_Vector3>` | :ref:`uv1_offset<class_SpatialMaterial_uv1_offset>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Vector3<class_Vector3>` | :ref:`uv1_scale<class_SpatialMaterial_uv1_scale>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`uv1_triplanar<class_SpatialMaterial_uv1_triplanar>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`uv1_triplanar_sharpness<class_SpatialMaterial_uv1_triplanar_sharpness>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Vector3<class_Vector3>` | :ref:`uv2_offset<class_SpatialMaterial_uv2_offset>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`Vector3<class_Vector3>` | :ref:`uv2_scale<class_SpatialMaterial_uv2_scale>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`uv2_triplanar<class_SpatialMaterial_uv2_triplanar>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`uv2_triplanar_sharpness<class_SpatialMaterial_uv2_triplanar_sharpness>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`vertex_color_is_srgb<class_SpatialMaterial_vertex_color_is_srgb>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
| :ref:`bool<class_bool>` | :ref:`vertex_color_use_as_albedo<class_SpatialMaterial_vertex_color_use_as_albedo>` |
+----------------------------------------------------------------+---------------------------------------------------------------------------------------------+
Enumerations
------------
.. _enum_SpatialMaterial_DetailUV:
enum **DetailUV**:
- **DETAIL_UV_1** = **0**
- **DETAIL_UV_2** = **1**
.. _enum_SpatialMaterial_TextureParam:
enum **TextureParam**:
- **TEXTURE_ALBEDO** = **0**
- **TEXTURE_METALLIC** = **1**
- **TEXTURE_ROUGHNESS** = **2**
- **TEXTURE_EMISSION** = **3**
- **TEXTURE_NORMAL** = **4**
- **TEXTURE_RIM** = **5**
- **TEXTURE_CLEARCOAT** = **6**
- **TEXTURE_FLOWMAP** = **7**
- **TEXTURE_AMBIENT_OCCLUSION** = **8**
- **TEXTURE_DEPTH** = **9**
- **TEXTURE_SUBSURFACE_SCATTERING** = **10**
- **TEXTURE_TRANSMISSION** = **11**
- **TEXTURE_REFRACTION** = **12**
- **TEXTURE_DETAIL_MASK** = **13**
- **TEXTURE_DETAIL_ALBEDO** = **14**
- **TEXTURE_DETAIL_NORMAL** = **15**
- **TEXTURE_MAX** = **16**
.. _enum_SpatialMaterial_DistanceFadeMode:
enum **DistanceFadeMode**:
- **DISTANCE_FADE_DISABLED** = **0**
- **DISTANCE_FADE_PIXEL_ALPHA** = **1**
- **DISTANCE_FADE_PIXEL_DITHER** = **2**
- **DISTANCE_FADE_OBJECT_DITHER** = **3**
.. _enum_SpatialMaterial_DepthDrawMode:
enum **DepthDrawMode**:
- **DEPTH_DRAW_OPAQUE_ONLY** = **0** --- Default depth draw mode. Depth is drawn only for opaque objects.
- **DEPTH_DRAW_ALWAYS** = **1** --- Depth draw is calculated for both opaque and transparent objects.
- **DEPTH_DRAW_DISABLED** = **2** --- No depth draw.
- **DEPTH_DRAW_ALPHA_OPAQUE_PREPASS** = **3** --- For transparent objects, an opaque pass is made first with the opaque parts, then transparency is drawn.
.. _enum_SpatialMaterial_DiffuseMode:
enum **DiffuseMode**:
- **DIFFUSE_BURLEY** = **0** --- Default diffuse scattering algorithm.
- **DIFFUSE_LAMBERT** = **1** --- Diffuse scattering ignores roughness.
- **DIFFUSE_LAMBERT_WRAP** = **2** --- Extends Lambert to cover more than 90 degrees when roughness increases.
- **DIFFUSE_OREN_NAYAR** = **3** --- Attempts to use roughness to emulate microsurfacing.
- **DIFFUSE_TOON** = **4** --- Uses a hard cut for lighting, with smoothing affected by roughness.
.. _enum_SpatialMaterial_CullMode:
enum **CullMode**:
- **CULL_BACK** = **0** --- Default cull mode. The back of the object is culled when not visible.
- **CULL_FRONT** = **1** --- The front of the object is culled when not visible.
- **CULL_DISABLED** = **2** --- No culling is performed.
.. _enum_SpatialMaterial_Feature:
enum **Feature**:
- **FEATURE_TRANSPARENT** = **0**
- **FEATURE_EMISSION** = **1**
- **FEATURE_NORMAL_MAPPING** = **2**
- **FEATURE_RIM** = **3**
- **FEATURE_CLEARCOAT** = **4**
- **FEATURE_ANISOTROPY** = **5**
- **FEATURE_AMBIENT_OCCLUSION** = **6**
- **FEATURE_DEPTH_MAPPING** = **7**
- **FEATURE_SUBSURACE_SCATTERING** = **8**
- **FEATURE_TRANSMISSION** = **9**
- **FEATURE_REFRACTION** = **10**
- **FEATURE_DETAIL** = **11**
- **FEATURE_MAX** = **12**
.. _enum_SpatialMaterial_Flags:
enum **Flags**:
- **FLAG_UNSHADED** = **0**
- **FLAG_USE_VERTEX_LIGHTING** = **1**
- **FLAG_DISABLE_DEPTH_TEST** = **2**
- **FLAG_ALBEDO_FROM_VERTEX_COLOR** = **3**
- **FLAG_SRGB_VERTEX_COLOR** = **4**
- **FLAG_USE_POINT_SIZE** = **5**
- **FLAG_FIXED_SIZE** = **6**
- **FLAG_BILLBOARD_KEEP_SCALE** = **7**
- **FLAG_UV1_USE_TRIPLANAR** = **8**
- **FLAG_UV2_USE_TRIPLANAR** = **9**
- **FLAG_TRIPLANAR_USE_WORLD** = **10**
- **FLAG_AO_ON_UV2** = **11**
- **FLAG_EMISSION_ON_UV2** = **12**
- **FLAG_USE_ALPHA_SCISSOR** = **13**
- **FLAG_ALBEDO_TEXTURE_FORCE_SRGB** = **14**
- **FLAG_DONT_RECEIVE_SHADOWS** = **15**
- **FLAG_ENSURE_CORRECT_NORMALS** = **16**
- **FLAG_DISABLE_AMBIENT_LIGHT** = **17**
- **FLAG_MAX** = **18**
.. _enum_SpatialMaterial_BlendMode:
enum **BlendMode**:
- **BLEND_MODE_MIX** = **0** --- Default blend mode.
- **BLEND_MODE_ADD** = **1**
- **BLEND_MODE_SUB** = **2**
- **BLEND_MODE_MUL** = **3**
.. _enum_SpatialMaterial_SpecularMode:
enum **SpecularMode**:
- **SPECULAR_SCHLICK_GGX** = **0** --- Default specular blob.
- **SPECULAR_BLINN** = **1** --- Older specular algorithm, included for compatibility.
- **SPECULAR_PHONG** = **2** --- Older specular algorithm, included for compatibility.
- **SPECULAR_TOON** = **3** --- Toon blob which changes size based on roughness.
- **SPECULAR_DISABLED** = **4** --- No specular blob.
.. _enum_SpatialMaterial_TextureChannel:
enum **TextureChannel**:
- **TEXTURE_CHANNEL_RED** = **0**
- **TEXTURE_CHANNEL_GREEN** = **1**
- **TEXTURE_CHANNEL_BLUE** = **2**
- **TEXTURE_CHANNEL_ALPHA** = **3**
- **TEXTURE_CHANNEL_GRAYSCALE** = **4**
.. _enum_SpatialMaterial_BillboardMode:
enum **BillboardMode**:
- **BILLBOARD_DISABLED** = **0** --- Default value.
- **BILLBOARD_ENABLED** = **1** --- The object's z-axis will always face the camera.
- **BILLBOARD_FIXED_Y** = **2** --- The object's x-axis will always face the camera.
- **BILLBOARD_PARTICLES** = **3** --- Used for particle systems. Enables particle animation options.
.. _enum_SpatialMaterial_EmissionOperator:
enum **EmissionOperator**:
- **EMISSION_OP_ADD** = **0**
- **EMISSION_OP_MULTIPLY** = **1**
Description
-----------
``SpatialMaterial`` provides a default 3D material with a wide variety of rendering features and properties without the need to write shader code. See the tutorial below for details.
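A minimal GDScript sketch of typical use (the ``MeshInstance`` node and its ``material_override`` property are outside this page and assumed here):

::

    var mat = SpatialMaterial.new()
    mat.albedo_color = Color(0.8, 0.1, 0.1)  # base color, see albedo_color below
    mat.metallic = 0.7                        # mostly metallic surface
    mat.roughness = 0.2                       # fairly sharp reflections
    $MeshInstance.material_override = mat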
Tutorials
---------
- :doc:`../tutorials/3d/spatial_material`
Property Descriptions
---------------------
.. _class_SpatialMaterial_albedo_color:
- :ref:`Color<class_Color>` **albedo_color**
+----------+-------------------+
| *Setter* | set_albedo(value) |
+----------+-------------------+
| *Getter* | get_albedo() |
+----------+-------------------+
The material's base color.
.. _class_SpatialMaterial_albedo_texture:
- :ref:`Texture<class_Texture>` **albedo_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_anisotropy:
- :ref:`float<class_float>` **anisotropy**
+----------+-----------------------+
| *Setter* | set_anisotropy(value) |
+----------+-----------------------+
| *Getter* | get_anisotropy() |
+----------+-----------------------+
The strength of the anisotropy effect.
.. _class_SpatialMaterial_anisotropy_enabled:
- :ref:`bool<class_bool>` **anisotropy_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, anisotropy is enabled. Changes the shape of the specular blob and aligns it to tangent space. Default value: ``false``.
.. _class_SpatialMaterial_anisotropy_flowmap:
- :ref:`Texture<class_Texture>` **anisotropy_flowmap**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_ao_enabled:
- :ref:`bool<class_bool>` **ao_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, ambient occlusion is enabled.
.. _class_SpatialMaterial_ao_light_affect:
- :ref:`float<class_float>` **ao_light_affect**
+----------+----------------------------+
| *Setter* | set_ao_light_affect(value) |
+----------+----------------------------+
| *Getter* | get_ao_light_affect() |
+----------+----------------------------+
.. _class_SpatialMaterial_ao_on_uv2:
- :ref:`bool<class_bool>` **ao_on_uv2**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_ao_texture:
- :ref:`Texture<class_Texture>` **ao_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_ao_texture_channel:
- :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` **ao_texture_channel**
+----------+-------------------------------+
| *Setter* | set_ao_texture_channel(value) |
+----------+-------------------------------+
| *Getter* | get_ao_texture_channel() |
+----------+-------------------------------+
.. _class_SpatialMaterial_clearcoat:
- :ref:`float<class_float>` **clearcoat**
+----------+----------------------+
| *Setter* | set_clearcoat(value) |
+----------+----------------------+
| *Getter* | get_clearcoat() |
+----------+----------------------+
.. _class_SpatialMaterial_clearcoat_enabled:
- :ref:`bool<class_bool>` **clearcoat_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, clearcoat rendering is enabled. Adds a secondary transparent pass to the material. Default value: ``false``.
.. _class_SpatialMaterial_clearcoat_gloss:
- :ref:`float<class_float>` **clearcoat_gloss**
+----------+----------------------------+
| *Setter* | set_clearcoat_gloss(value) |
+----------+----------------------------+
| *Getter* | get_clearcoat_gloss() |
+----------+----------------------------+
.. _class_SpatialMaterial_clearcoat_texture:
- :ref:`Texture<class_Texture>` **clearcoat_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_depth_deep_parallax:
- :ref:`bool<class_bool>` **depth_deep_parallax**
+----------+----------------------------------+
| *Setter* | set_depth_deep_parallax(value) |
+----------+----------------------------------+
| *Getter* | is_depth_deep_parallax_enabled() |
+----------+----------------------------------+
.. _class_SpatialMaterial_depth_enabled:
- :ref:`bool<class_bool>` **depth_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, depth mapping is enabled. See also :ref:`normal_enabled<class_SpatialMaterial_normal_enabled>`.
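Depth (parallax) mapping can be enabled as sketched below; the texture path is hypothetical:

::

    var mat = SpatialMaterial.new()
    mat.depth_enabled = true
    mat.depth_texture = load("res://heightmap.png")  # hypothetical height map
    mat.depth_scale = 0.05
    mat.depth_deep_parallax = true  # layered parallax for higher quality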
.. _class_SpatialMaterial_depth_max_layers:
- :ref:`int<class_int>` **depth_max_layers**
+----------+-------------------------------------------+
| *Setter* | set_depth_deep_parallax_max_layers(value) |
+----------+-------------------------------------------+
| *Getter* | get_depth_deep_parallax_max_layers() |
+----------+-------------------------------------------+
.. _class_SpatialMaterial_depth_min_layers:
- :ref:`int<class_int>` **depth_min_layers**
+----------+-------------------------------------------+
| *Setter* | set_depth_deep_parallax_min_layers(value) |
+----------+-------------------------------------------+
| *Getter* | get_depth_deep_parallax_min_layers() |
+----------+-------------------------------------------+
.. _class_SpatialMaterial_depth_scale:
- :ref:`float<class_float>` **depth_scale**
+----------+------------------------+
| *Setter* | set_depth_scale(value) |
+----------+------------------------+
| *Getter* | get_depth_scale() |
+----------+------------------------+
.. _class_SpatialMaterial_depth_texture:
- :ref:`Texture<class_Texture>` **depth_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_detail_albedo:
- :ref:`Texture<class_Texture>` **detail_albedo**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_detail_blend_mode:
- :ref:`BlendMode<enum_SpatialMaterial_BlendMode>` **detail_blend_mode**
+----------+------------------------------+
| *Setter* | set_detail_blend_mode(value) |
+----------+------------------------------+
| *Getter* | get_detail_blend_mode() |
+----------+------------------------------+
.. _class_SpatialMaterial_detail_enabled:
- :ref:`bool<class_bool>` **detail_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
.. _class_SpatialMaterial_detail_mask:
- :ref:`Texture<class_Texture>` **detail_mask**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_detail_normal:
- :ref:`Texture<class_Texture>` **detail_normal**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_detail_uv_layer:
- :ref:`DetailUV<enum_SpatialMaterial_DetailUV>` **detail_uv_layer**
+----------+----------------------+
| *Setter* | set_detail_uv(value) |
+----------+----------------------+
| *Getter* | get_detail_uv() |
+----------+----------------------+
.. _class_SpatialMaterial_distance_fade_max_distance:
- :ref:`float<class_float>` **distance_fade_max_distance**
+----------+---------------------------------------+
| *Setter* | set_distance_fade_max_distance(value) |
+----------+---------------------------------------+
| *Getter* | get_distance_fade_max_distance() |
+----------+---------------------------------------+
.. _class_SpatialMaterial_distance_fade_min_distance:
- :ref:`float<class_float>` **distance_fade_min_distance**
+----------+---------------------------------------+
| *Setter* | set_distance_fade_min_distance(value) |
+----------+---------------------------------------+
| *Getter* | get_distance_fade_min_distance() |
+----------+---------------------------------------+
.. _class_SpatialMaterial_distance_fade_mode:
- :ref:`DistanceFadeMode<enum_SpatialMaterial_DistanceFadeMode>` **distance_fade_mode**
+----------+--------------------------+
| *Setter* | set_distance_fade(value) |
+----------+--------------------------+
| *Getter* | get_distance_fade() |
+----------+--------------------------+
.. _class_SpatialMaterial_emission:
- :ref:`Color<class_Color>` **emission**
+----------+---------------------+
| *Setter* | set_emission(value) |
+----------+---------------------+
| *Getter* | get_emission() |
+----------+---------------------+
The emitted light's color. See :ref:`emission_enabled<class_SpatialMaterial_emission_enabled>`.
.. _class_SpatialMaterial_emission_enabled:
- :ref:`bool<class_bool>` **emission_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, the body emits light.
.. _class_SpatialMaterial_emission_energy:
- :ref:`float<class_float>` **emission_energy**
+----------+----------------------------+
| *Setter* | set_emission_energy(value) |
+----------+----------------------------+
| *Getter* | get_emission_energy() |
+----------+----------------------------+
The emitted light's strength. See :ref:`emission_enabled<class_SpatialMaterial_emission_enabled>`.
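The emission properties combine as in this sketch:

::

    var mat = SpatialMaterial.new()
    mat.emission_enabled = true
    mat.emission = Color(0.2, 0.8, 1.0)  # emitted light color
    mat.emission_energy = 2.0            # brightness multiplier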
.. _class_SpatialMaterial_emission_on_uv2:
- :ref:`bool<class_bool>` **emission_on_uv2**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_emission_operator:
- :ref:`EmissionOperator<enum_SpatialMaterial_EmissionOperator>` **emission_operator**
+----------+------------------------------+
| *Setter* | set_emission_operator(value) |
+----------+------------------------------+
| *Getter* | get_emission_operator() |
+----------+------------------------------+
.. _class_SpatialMaterial_emission_texture:
- :ref:`Texture<class_Texture>` **emission_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_flags_albedo_tex_force_srgb:
- :ref:`bool<class_bool>` **flags_albedo_tex_force_srgb**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_flags_disable_ambient_light:
- :ref:`bool<class_bool>` **flags_disable_ambient_light**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, the object receives no ambient light. Default value: ``false``.
.. _class_SpatialMaterial_flags_do_not_receive_shadows:
- :ref:`bool<class_bool>` **flags_do_not_receive_shadows**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, the object receives no shadow that would otherwise be cast onto it. Default value: ``false``.
.. _class_SpatialMaterial_flags_ensure_correct_normals:
- :ref:`bool<class_bool>` **flags_ensure_correct_normals**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_flags_fixed_size:
- :ref:`bool<class_bool>` **flags_fixed_size**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, the object is rendered at the same size regardless of distance. Default value: ``false``.
.. _class_SpatialMaterial_flags_no_depth_test:
- :ref:`bool<class_bool>` **flags_no_depth_test**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, depth testing is disabled and the object will be drawn in render order.
.. _class_SpatialMaterial_flags_transparent:
- :ref:`bool<class_bool>` **flags_transparent**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, transparency is enabled on the body. Default value: ``false``. See also :ref:`params_blend_mode<class_SpatialMaterial_params_blend_mode>`.
.. _class_SpatialMaterial_flags_unshaded:
- :ref:`bool<class_bool>` **flags_unshaded**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, the object is unaffected by lighting. Default value: ``false``.
.. _class_SpatialMaterial_flags_use_point_size:
- :ref:`bool<class_bool>` **flags_use_point_size**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, render point size can be changed. Note: this is only effective for objects whose geometry is point-based rather than triangle-based. See also :ref:`params_point_size<class_SpatialMaterial_params_point_size>`.
.. _class_SpatialMaterial_flags_vertex_lighting:
- :ref:`bool<class_bool>` **flags_vertex_lighting**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, lighting is calculated per vertex rather than per pixel. This may increase performance on low-end devices. Default value: ``false``.
.. _class_SpatialMaterial_flags_world_triplanar:
- :ref:`bool<class_bool>` **flags_world_triplanar**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, triplanar mapping is calculated in world space rather than object local space. See also :ref:`uv1_triplanar<class_SpatialMaterial_uv1_triplanar>`. Default value: ``false``.
.. _class_SpatialMaterial_metallic:
- :ref:`float<class_float>` **metallic**
+----------+---------------------+
| *Setter* | set_metallic(value) |
+----------+---------------------+
| *Getter* | get_metallic() |
+----------+---------------------+
The reflectivity of the object's surface. The higher the value, the more light is reflected.
.. _class_SpatialMaterial_metallic_specular:
- :ref:`float<class_float>` **metallic_specular**
+----------+---------------------+
| *Setter* | set_specular(value) |
+----------+---------------------+
| *Getter* | get_specular() |
+----------+---------------------+
General reflectivity amount. Note: unlike :ref:`metallic<class_SpatialMaterial_metallic>`, this is not energy-conserving, so it should be left at ``0.5`` in most cases. See also :ref:`roughness<class_SpatialMaterial_roughness>`.
.. _class_SpatialMaterial_metallic_texture:
- :ref:`Texture<class_Texture>` **metallic_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_metallic_texture_channel:
- :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` **metallic_texture_channel**
+----------+-------------------------------------+
| *Setter* | set_metallic_texture_channel(value) |
+----------+-------------------------------------+
| *Getter* | get_metallic_texture_channel() |
+----------+-------------------------------------+
.. _class_SpatialMaterial_normal_enabled:
- :ref:`bool<class_bool>` **normal_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, normal mapping is enabled.
.. _class_SpatialMaterial_normal_scale:
- :ref:`float<class_float>` **normal_scale**
+----------+-------------------------+
| *Setter* | set_normal_scale(value) |
+----------+-------------------------+
| *Getter* | get_normal_scale() |
+----------+-------------------------+
The strength of the normal map's effect.
.. _class_SpatialMaterial_normal_texture:
- :ref:`Texture<class_Texture>` **normal_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_params_alpha_scissor_threshold:
- :ref:`float<class_float>` **params_alpha_scissor_threshold**
+----------+------------------------------------+
| *Setter* | set_alpha_scissor_threshold(value) |
+----------+------------------------------------+
| *Getter* | get_alpha_scissor_threshold() |
+----------+------------------------------------+
.. _class_SpatialMaterial_params_billboard_keep_scale:
- :ref:`bool<class_bool>` **params_billboard_keep_scale**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_params_billboard_mode:
- :ref:`BillboardMode<enum_SpatialMaterial_BillboardMode>` **params_billboard_mode**
+----------+---------------------------+
| *Setter* | set_billboard_mode(value) |
+----------+---------------------------+
| *Getter* | get_billboard_mode() |
+----------+---------------------------+
Controls how the object faces the camera. See :ref:`BillboardMode<enum_SpatialMaterial_BillboardMode>`.
.. _class_SpatialMaterial_params_blend_mode:
- :ref:`BlendMode<enum_SpatialMaterial_BlendMode>` **params_blend_mode**
+----------+-----------------------+
| *Setter* | set_blend_mode(value) |
+----------+-----------------------+
| *Getter* | get_blend_mode() |
+----------+-----------------------+
The material's blend mode. Note that values other than ``Mix`` force the object into the transparent pipeline. See :ref:`BlendMode<enum_SpatialMaterial_BlendMode>`.
.. _class_SpatialMaterial_params_cull_mode:
- :ref:`CullMode<enum_SpatialMaterial_CullMode>` **params_cull_mode**
+----------+----------------------+
| *Setter* | set_cull_mode(value) |
+----------+----------------------+
| *Getter* | get_cull_mode() |
+----------+----------------------+
Determines which side of the object's faces is culled (not drawn). See :ref:`CullMode<enum_SpatialMaterial_CullMode>`.
.. _class_SpatialMaterial_params_depth_draw_mode:
- :ref:`DepthDrawMode<enum_SpatialMaterial_DepthDrawMode>` **params_depth_draw_mode**
+----------+----------------------------+
| *Setter* | set_depth_draw_mode(value) |
+----------+----------------------------+
| *Getter* | get_depth_draw_mode() |
+----------+----------------------------+
Determines when depth rendering takes place. See :ref:`DepthDrawMode<enum_SpatialMaterial_DepthDrawMode>`. See also :ref:`flags_transparent<class_SpatialMaterial_flags_transparent>`.
.. _class_SpatialMaterial_params_diffuse_mode:
- :ref:`DiffuseMode<enum_SpatialMaterial_DiffuseMode>` **params_diffuse_mode**
+----------+-------------------------+
| *Setter* | set_diffuse_mode(value) |
+----------+-------------------------+
| *Getter* | get_diffuse_mode() |
+----------+-------------------------+
The algorithm used for diffuse light scattering. See :ref:`DiffuseMode<enum_SpatialMaterial_DiffuseMode>`.
.. _class_SpatialMaterial_params_grow:
- :ref:`bool<class_bool>` **params_grow**
+----------+-------------------------+
| *Setter* | set_grow_enabled(value) |
+----------+-------------------------+
| *Getter* | is_grow_enabled() |
+----------+-------------------------+
If ``true``, enables the vertex grow setting. See :ref:`params_grow_amount<class_SpatialMaterial_params_grow_amount>`.
.. _class_SpatialMaterial_params_grow_amount:
- :ref:`float<class_float>` **params_grow_amount**
+----------+-----------------+
| *Setter* | set_grow(value) |
+----------+-----------------+
| *Getter* | get_grow() |
+----------+-----------------+
Grows object vertices in the direction of their normals.
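One common use of grow, sketched here, is a cheap outline: front faces are culled so only the slightly grown back faces remain visible. Applying the material (e.g. as a second pass or on a duplicated mesh) is outside this page:

::

    var outline = SpatialMaterial.new()
    outline.flags_unshaded = true                          # outline ignores lighting
    outline.albedo_color = Color(0, 0, 0)                  # outline color
    outline.params_cull_mode = SpatialMaterial.CULL_FRONT  # draw back faces only
    outline.params_grow = true
    outline.params_grow_amount = 0.02                      # outline thickness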
.. _class_SpatialMaterial_params_line_width:
- :ref:`float<class_float>` **params_line_width**
+----------+-----------------------+
| *Setter* | set_line_width(value) |
+----------+-----------------------+
| *Getter* | get_line_width() |
+----------+-----------------------+
.. _class_SpatialMaterial_params_point_size:
- :ref:`float<class_float>` **params_point_size**
+----------+-----------------------+
| *Setter* | set_point_size(value) |
+----------+-----------------------+
| *Getter* | get_point_size() |
+----------+-----------------------+
The point size in pixels. See :ref:`flags_use_point_size<class_SpatialMaterial_flags_use_point_size>`.
.. _class_SpatialMaterial_params_specular_mode:
- :ref:`SpecularMode<enum_SpatialMaterial_SpecularMode>` **params_specular_mode**
+----------+--------------------------+
| *Setter* | set_specular_mode(value) |
+----------+--------------------------+
| *Getter* | get_specular_mode() |
+----------+--------------------------+
The method for rendering the specular blob. See :ref:`SpecularMode<enum_SpatialMaterial_SpecularMode>`.
.. _class_SpatialMaterial_params_use_alpha_scissor:
- :ref:`bool<class_bool>` **params_use_alpha_scissor**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_particles_anim_h_frames:
- :ref:`int<class_int>` **particles_anim_h_frames**
+----------+------------------------------------+
| *Setter* | set_particles_anim_h_frames(value) |
+----------+------------------------------------+
| *Getter* | get_particles_anim_h_frames() |
+----------+------------------------------------+
The number of horizontal frames in the particle spritesheet. Only enabled when using ``BillboardMode.BILLBOARD_PARTICLES``. See :ref:`params_billboard_mode<class_SpatialMaterial_params_billboard_mode>`.
.. _class_SpatialMaterial_particles_anim_loop:
- :ref:`int<class_int>` **particles_anim_loop**
+----------+--------------------------------+
| *Setter* | set_particles_anim_loop(value) |
+----------+--------------------------------+
| *Getter* | get_particles_anim_loop() |
+----------+--------------------------------+
If ``true``, particle animations are looped. Only enabled when using ``BillboardMode.BILLBOARD_PARTICLES``. See :ref:`params_billboard_mode<class_SpatialMaterial_params_billboard_mode>`.
.. _class_SpatialMaterial_particles_anim_v_frames:
- :ref:`int<class_int>` **particles_anim_v_frames**
+----------+------------------------------------+
| *Setter* | set_particles_anim_v_frames(value) |
+----------+------------------------------------+
| *Getter* | get_particles_anim_v_frames() |
+----------+------------------------------------+
The number of vertical frames in the particle spritesheet. Only enabled when using ``BillboardMode.BILLBOARD_PARTICLES``. See :ref:`params_billboard_mode<class_SpatialMaterial_params_billboard_mode>`.
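The particle animation properties above work together as sketched below, assuming the material is used by a particle system with ``BILLBOARD_PARTICLES`` set:

::

    var mat = SpatialMaterial.new()
    mat.params_billboard_mode = SpatialMaterial.BILLBOARD_PARTICLES
    mat.particles_anim_h_frames = 4  # 4 columns in the spritesheet
    mat.particles_anim_v_frames = 4  # 4 rows in the spritesheet
    mat.particles_anim_loop = true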
.. _class_SpatialMaterial_proximity_fade_distance:
- :ref:`float<class_float>` **proximity_fade_distance**
+----------+------------------------------------+
| *Setter* | set_proximity_fade_distance(value) |
+----------+------------------------------------+
| *Getter* | get_proximity_fade_distance() |
+----------+------------------------------------+
.. _class_SpatialMaterial_proximity_fade_enable:
- :ref:`bool<class_bool>` **proximity_fade_enable**
+----------+-----------------------------+
| *Setter* | set_proximity_fade(value) |
+----------+-----------------------------+
| *Getter* | is_proximity_fade_enabled() |
+----------+-----------------------------+
If ``true``, the proximity and distance fade effect is enabled. Default value: ``false``.
.. _class_SpatialMaterial_refraction_enabled:
- :ref:`bool<class_bool>` **refraction_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, the refraction effect is enabled. Distorts transparency based on light from behind the object. Default value: ``false``.
.. _class_SpatialMaterial_refraction_scale:
- :ref:`float<class_float>` **refraction_scale**
+----------+-----------------------+
| *Setter* | set_refraction(value) |
+----------+-----------------------+
| *Getter* | get_refraction() |
+----------+-----------------------+
The strength of the refraction effect.
.. _class_SpatialMaterial_refraction_texture:
- :ref:`Texture<class_Texture>` **refraction_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_refraction_texture_channel:
- :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` **refraction_texture_channel**
+----------+---------------------------------------+
| *Setter* | set_refraction_texture_channel(value) |
+----------+---------------------------------------+
| *Getter* | get_refraction_texture_channel() |
+----------+---------------------------------------+
.. _class_SpatialMaterial_rim:
- :ref:`float<class_float>` **rim**
+----------+----------------+
| *Setter* | set_rim(value) |
+----------+----------------+
| *Getter* | get_rim() |
+----------+----------------+
.. _class_SpatialMaterial_rim_enabled:
- :ref:`bool<class_bool>` **rim_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, the rim effect is enabled. Default value: ``false``.
.. _class_SpatialMaterial_rim_texture:
- :ref:`Texture<class_Texture>` **rim_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_rim_tint:
- :ref:`float<class_float>` **rim_tint**
+----------+---------------------+
| *Setter* | set_rim_tint(value) |
+----------+---------------------+
| *Getter* | get_rim_tint() |
+----------+---------------------+
The amount to blend light and albedo color when rendering the rim effect. If ``0``, the light color is used, while ``1`` means the albedo color is used. An intermediate value generally works best.
.. _class_SpatialMaterial_roughness:
- :ref:`float<class_float>` **roughness**
+----------+----------------------+
| *Setter* | set_roughness(value) |
+----------+----------------------+
| *Getter* | get_roughness() |
+----------+----------------------+
Surface reflection. A value of ``0`` represents a perfect mirror while a value of ``1`` completely blurs the reflection. See also :ref:`metallic<class_SpatialMaterial_metallic>`.
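Since both metallic and roughness are plain floats, a surface can also be configured from a script. A brief sketch (untested; the ``$MeshInstance`` node path is an assumption about your scene tree):

```gdscript
# Set up a near-mirror surface at runtime.
var mat = SpatialMaterial.new()
mat.metallic = 1.0
mat.roughness = 0.05  # low roughness -> sharp reflections
$MeshInstance.material_override = mat
```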
.. _class_SpatialMaterial_roughness_texture:
- :ref:`Texture<class_Texture>` **roughness_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_roughness_texture_channel:
- :ref:`TextureChannel<enum_SpatialMaterial_TextureChannel>` **roughness_texture_channel**
+----------+--------------------------------------+
| *Setter* | set_roughness_texture_channel(value) |
+----------+--------------------------------------+
| *Getter* | get_roughness_texture_channel() |
+----------+--------------------------------------+
.. _class_SpatialMaterial_subsurf_scatter_enabled:
- :ref:`bool<class_bool>` **subsurf_scatter_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, subsurface scattering is enabled. Emulates light that penetrates an object's surface, is scattered, and then emerges.
.. _class_SpatialMaterial_subsurf_scatter_strength:
- :ref:`float<class_float>` **subsurf_scatter_strength**
+----------+-------------------------------------------+
| *Setter* | set_subsurface_scattering_strength(value) |
+----------+-------------------------------------------+
| *Getter* | get_subsurface_scattering_strength() |
+----------+-------------------------------------------+
The strength of the subsurface scattering effect.
.. _class_SpatialMaterial_subsurf_scatter_texture:
- :ref:`Texture<class_Texture>` **subsurf_scatter_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_transmission:
- :ref:`Color<class_Color>` **transmission**
+----------+-------------------------+
| *Setter* | set_transmission(value) |
+----------+-------------------------+
| *Getter* | get_transmission() |
+----------+-------------------------+
The color used by the transmission effect. Represents the light passing through an object.
.. _class_SpatialMaterial_transmission_enabled:
- :ref:`bool<class_bool>` **transmission_enabled**
+----------+--------------------+
| *Setter* | set_feature(value) |
+----------+--------------------+
| *Getter* | get_feature() |
+----------+--------------------+
If ``true``, the transmission effect is enabled. Default value: ``false``.
.. _class_SpatialMaterial_transmission_texture:
- :ref:`Texture<class_Texture>` **transmission_texture**
+----------+--------------------+
| *Setter* | set_texture(value) |
+----------+--------------------+
| *Getter* | get_texture() |
+----------+--------------------+
.. _class_SpatialMaterial_uv1_offset:
- :ref:`Vector3<class_Vector3>` **uv1_offset**
+----------+-----------------------+
| *Setter* | set_uv1_offset(value) |
+----------+-----------------------+
| *Getter* | get_uv1_offset() |
+----------+-----------------------+
.. _class_SpatialMaterial_uv1_scale:
- :ref:`Vector3<class_Vector3>` **uv1_scale**
+----------+----------------------+
| *Setter* | set_uv1_scale(value) |
+----------+----------------------+
| *Getter* | get_uv1_scale() |
+----------+----------------------+
.. _class_SpatialMaterial_uv1_triplanar:
- :ref:`bool<class_bool>` **uv1_triplanar**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_uv1_triplanar_sharpness:
- :ref:`float<class_float>` **uv1_triplanar_sharpness**
+----------+------------------------------------------+
| *Setter* | set_uv1_triplanar_blend_sharpness(value) |
+----------+------------------------------------------+
| *Getter* | get_uv1_triplanar_blend_sharpness() |
+----------+------------------------------------------+
.. _class_SpatialMaterial_uv2_offset:
- :ref:`Vector3<class_Vector3>` **uv2_offset**
+----------+-----------------------+
| *Setter* | set_uv2_offset(value) |
+----------+-----------------------+
| *Getter* | get_uv2_offset() |
+----------+-----------------------+
.. _class_SpatialMaterial_uv2_scale:
- :ref:`Vector3<class_Vector3>` **uv2_scale**
+----------+----------------------+
| *Setter* | set_uv2_scale(value) |
+----------+----------------------+
| *Getter* | get_uv2_scale() |
+----------+----------------------+
.. _class_SpatialMaterial_uv2_triplanar:
- :ref:`bool<class_bool>` **uv2_triplanar**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
.. _class_SpatialMaterial_uv2_triplanar_sharpness:
- :ref:`float<class_float>` **uv2_triplanar_sharpness**
+----------+------------------------------------------+
| *Setter* | set_uv2_triplanar_blend_sharpness(value) |
+----------+------------------------------------------+
| *Getter* | get_uv2_triplanar_blend_sharpness() |
+----------+------------------------------------------+
.. _class_SpatialMaterial_vertex_color_is_srgb:
- :ref:`bool<class_bool>` **vertex_color_is_srgb**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, the model's vertex colors are processed in sRGB mode. Default value: ``false``.
.. _class_SpatialMaterial_vertex_color_use_as_albedo:
- :ref:`bool<class_bool>` **vertex_color_use_as_albedo**
+----------+-----------------+
| *Setter* | set_flag(value) |
+----------+-----------------+
| *Getter* | get_flag() |
+----------+-----------------+
If ``true``, the vertex color is used as albedo color. Default value: ``false``.
.. header:: COMP 331: Mathematical Foundations of Cryptography
.. footer:: COMP 331: Mathematical Foundations of Cryptography
.. index::
Mathematical Foundations of Cryptography
Mathematical Foundations
Cryptography
Mathematical
Foundations
COMP 331
##################################################
COMP 331: Mathematical Foundations of Cryptography
##################################################
******************
Course Information
******************
.. sidebar:: General Information
**Alias**
* |math331|
**Credit Hours**
* 3
**Prerequisites**
* |math201| or :doc:`comp363`
* One or more of the following:
* :doc:`comp125`
* :doc:`comp150`
* :doc:`comp170`
* :doc:`comp215`
About
=====
This course introduces the formal foundations of cryptography and also investigates some well-known standards and protocols, including private and public-key cryptosystems, hashing, digital signatures, RSA, DSS, PGP, and related topics.
Description
===========
This course introduces the formal foundations of cryptography and also investigates some well-known standards and protocols. The intended audience is senior undergraduate and beginning graduate students. The course will include topics selected from the following: information-theoretic security, private key encryption, DES, public key encryption, background on modular arithmetic, RSA, hashing, and message authentication codes (MACs), digital signatures, DSS, key distribution and management, PGP, network security, and Fiat-Shamir protocol.
Outcome
=======
Students will gain an understanding of cryptosystems widely used to protect data security on the internet, and be able to apply the ideas in new situations as needed.
*******
Syllabi
*******
|see-syllabi|
.. NOTE: Do not edit this file, it is a generated file and will be overwritten.
See the master template file _templates/autosummary/module.rst
===================
salt.modules.shadow
===================
.. automodule:: salt.modules.shadow
.. rubric:: Members
info
=========================================================
.. autofunction:: info
set_inactdays
=========================================================
.. autofunction:: set_inactdays
set_maxdays
=========================================================
.. autofunction:: set_maxdays
set_mindays
=========================================================
.. autofunction:: set_mindays
set_password
=========================================================
.. autofunction:: set_password
set_warndays
=========================================================
.. autofunction:: set_warndays
************
Installation
************
Prerequisites
=============
Quantities has a few dependencies:
* Python_ (>=2.7)
* NumPy_ (>=1.8.2)
Source Code Installation
========================
To install Quantities, download the Quantities source code from PyPi_
and run ``python setup.py install`` in the quantities source directory,
or run ``pip install quantities``.
Development
===========
You can follow and contribute to Quantities' development using git::
git clone git@github.com:python-quantities/python-quantities.git
Bugs, feature requests, and questions can be directed to the github_
website.
.. _Python: http://www.python.org/
.. _NumPy: http://www.scipy.org
.. _PyPi: http://pypi.python.org/pypi/quantities
.. _github: http://github.com/python-quantities/python-quantities
django-authtools
================
.. image:: https://travis-ci.org/fusionbox/django-authtools.png
:target: http://travis-ci.org/fusionbox/django-authtools
:alt: Build Status
A custom user model app for Django 2.2+ that features email as username and
other things. It tries to stay true to the built-in user model for the most
part.
The main differences between authtools's User and django.contrib.auth's are
email as username and class-based auth views.
Read the `django-authtools documentation
<https://django-authtools.readthedocs.org/en/latest/>`_.
Quickstart
==========
Before you use this, you should probably read the documentation about `custom
User models
<https://docs.djangoproject.com/en/dev/topics/auth/customizing/#substituting-a-custom-user-model>`_.
1. Install the package:
.. code-block:: bash
$ pip install django-authtools
2. Add ``authtools`` to your ``INSTALLED_APPS``.
3. Add the following to your settings.py:
.. code-block:: python
AUTH_USER_MODEL = 'authtools.User'
4. Add ``authtools.urls`` to your URL patterns:
.. code-block:: python
from django.urls import include, path

urlpatterns = [
    # ...
    path('accounts/', include('authtools.urls')),
    # ...
]
5. Enjoy.
.. _DEPLOYNOTES:
Deploy and Upgrade notes
========================
1.2
---
* Provision and configure a **ZOTERO_API_KEY** and **ZOTERO_LIBRARY_ID**
before running zotero export script.
* Run data export scripts to generate dataset for deposit and reuse::
python manage.py reference_data
python manage.py intervention_data
python manage.py export_zotero
* This update includes changes to the Solr indexing (but not to the Solr
schema). You should refresh the Solr index::
python manage.py update_index
1.0
----
* Solr XML configuration files ``schema.xml`` and ``solrconfig.xml``
need to be generated and deployed to production Solr server. Use
``python manage.py build_solr_schema`` to generate the schema. Reload
the Solr core or restart Solr after updating the configuration.
* After Solr configurations are in place, run ``python
manage.py rebuild_index --noinput`` to update the index based on
content in the Derrida database.
0.9
---
* Migration from Plum to Figgy requires that a new auth token be added
to local settings under **DJIFFY_AUTH_TOKENS** for loading restricted
IIIF Manifests.
* Solr XML configuration files ``schema.xml`` and ``solrconfig.xml``
need to be generated and deployed to production Solr server. Use
``python manage.py build_solr_schema`` to generate the schema. Reload
the Solr core or restart Solr after updating the configuration.
* After Solr configurations are in place, run ``python
manage.py rebuild_index --noinput`` to update the index based on
content in the Derrida database.
* Production ``local_settings.py`` should have updated settings to use the
extended signal processor for Haystack managed models::
HAYSTACK_SIGNAL_PROCESSOR = 'derrida.books.signals.RelationSafeRTSP'
* This update includes a migration to update cached Plum IIIF Manifest
and Canvas data to the new equivalent Figgy URLs. This migration
will update records, but to update IIIF content (i.e. after additional
labeling work), run::
python manage.py import_digitaleds PUL --update
0.8 Interventions Phase I
-------------------------
* This release adds `django.contrib.sites` to **INSTALLED_APPS**, so you
will need to manually configure your site domain in Django admin.
* An auth token must be configured using **DJIFFY_AUTH_TOKENS** in
``local_settings.py`` in order to import restricted IIIF manifests.
* To load IIIF digitized content for documenting interventions, you should use
the **import_digitaleds** manage.py script. Use **PUL** to load the
entire collection::
python manage.py import_digitaleds PUL
Ansible
~~~~~~~
We include sample deploy scripts in the form of a short `Ansible <http://docs.ansible.com/>`__ playbook
and associated configuration files. In the current usage, assuming Ansible
is installed and the appropriate server key is loaded via `ssh-add`::
cd derrida-django/deploy/
ansible-playbook prod_derrida-django_.yml <-e github reference>
Any valid Github tag type is accepted, but the script defaults to ``master``. ``ansible.cfg`` and ``hosts`` set up the host group and configuration used in the commands.
The production deploy does not involve itself with Apache configuration, because
that is handled server side and now handles running a database backup, migrations,
and resetting symlinks to make the deployment go live.
| 38.83908 | 169 | 0.754365 |
dfad91724d004ec69df354d75dd16ae4c51991fc | 486 | rst | reStructuredText | docs/source/sciope.rst | rmjiang7/sciope | 5122107dedcee9c39458e83d853ec35f91268780 | [
"Apache-2.0"
] | 5 | 2019-05-21T18:56:04.000Z | 2020-08-02T20:09:43.000Z | docs/source/sciope.rst | rmjiang7/sciope | 5122107dedcee9c39458e83d853ec35f91268780 | [
"Apache-2.0"
] | 6 | 2020-10-16T08:11:10.000Z | 2022-03-16T09:35:46.000Z | docs/source/sciope.rst | rmjiang7/sciope | 5122107dedcee9c39458e83d853ec35f91268780 | [
"Apache-2.0"
] | 6 | 2019-05-23T09:09:00.000Z | 2020-08-02T20:09:45.000Z | sciope package
==============
Subpackages
-----------
.. toctree::
sciope.data
sciope.designs
sciope.features
sciope.inference
sciope.models
sciope.sampling
sciope.utilities
sciope.visualize
Submodules
----------
sciope.sciope module
--------------
.. automodule:: sciope.sciope
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: sciope
:members:
:undoc-members:
:show-inheritance:
| 13.135135 | 29 | 0.590535 |
807ec4f6f2fd4cfbda7cd2fe77d2bd49c81d857c | 5,434 | rst | reStructuredText | docs/quickstart.rst | creativechain/crea-python-lib | b0a61e947586e2d19001824259efad51722e43a8 | [
"MIT"
] | null | null | null | docs/quickstart.rst | creativechain/crea-python-lib | b0a61e947586e2d19001824259efad51722e43a8 | [
"MIT"
] | 1 | 2020-03-02T12:18:48.000Z | 2020-03-02T12:18:48.000Z | docs/quickstart.rst | creativechain/crea-python-lib | b0a61e947586e2d19001824259efad51722e43a8 | [
"MIT"
] | null | null | null | Quickstart
==========
Crea
-----
The crea object is the connection to the Crea blockchain.
By creating this object different options can be set.
.. note:: All init methods of crea classes can be given
the ``crea_instance=`` parameter to assure that
all objects use the same crea object. When the
``crea_instance=`` parameter is not used, the
crea object is taken from get_shared_crea_instance().
:func:`crea.instance.shared_crea_instance` returns a global instance of crea.
It can be set by :func:`crea.instance.set_shared_crea_instance`; otherwise it is created
on the first call.
.. code-block:: python
from crea import Crea
from crea.account import Account
stm = Crea()
account = Account("test", crea_instance=stm)
.. code-block:: python
from crea import Crea
from crea.account import Account
from crea.instance import set_shared_crea_instance
stm = Crea()
set_shared_crea_instance(stm)
account = Account("test")
Wallet and Keys
---------------
Each account has the following keys:
* Posting key (allows accounts to post, vote, edit, recrea and follow/mute)
* Active key (allows accounts to transfer, power up/down, voting for witness, ...)
* Memo key (Can be used to encrypt/decrypt memos)
* Owner key (The most important key, should not be used with crea)
Outgoing operations, which will be stored in the Crea blockchain, have to be
signed by a private key. E.g. a comment or vote operation needs to be signed by the posting key
of the author or upvoter. Private keys can be provided to crea temporarily or can be
stored encrypted in an SQL database (wallet).
.. note:: Before using the wallet for the first time, it has to be created and a password has
to be set. The wallet content is available to creapy and to all Python scripts which have
access to the SQL database file.
Creating a wallet
~~~~~~~~~~~~~~~~~
``crea.wallet.wipe(True)`` is only necessary when a wallet has already been created.
.. code-block:: python
from crea import Crea
crea = Crea()
crea.wallet.wipe(True)
crea.wallet.unlock("wallet-passphrase")
Adding keys to the wallet
~~~~~~~~~~~~~~~~~~~~~~~~~
.. code-block:: python
from crea import Crea
crea = Crea()
crea.wallet.unlock("wallet-passphrase")
crea.wallet.addPrivateKey("xxxxxxx")
crea.wallet.addPrivateKey("xxxxxxx")
Using the keys in the wallet
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. code-block:: python
from crea import Crea
crea = Crea()
crea.wallet.unlock("wallet-passphrase")
account = Account("test", crea_instance=crea)
account.transfer("<to>", "<amount>", "<asset>", "<memo>")
Private keys can also set temporary
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. code-block:: python
from crea import Crea
crea = Crea(keys=["xxxxxxxxx"])
account = Account("test", crea_instance=crea)
account.transfer("<to>", "<amount>", "<asset>", "<memo>")
Receiving information about blocks, accounts, votes, comments, market and witness
---------------------------------------------------------------------------------
Receive all Blocks from the Blockchain
.. code-block:: python
from crea.blockchain import Blockchain
blockchain = Blockchain()
for op in blockchain.stream():
    print(op)
Access one Block
.. code-block:: python
from crea.block import Block
print(Block(1))
Access an account
.. code-block:: python
from crea.account import Account
account = Account("test")
print(account.balances)
for h in account.history():
    print(h)
A single vote
.. code-block:: python
from crea.vote import Vote
vote = Vote(u"@gtg/ffdhu-gtg-witness-log|gandalf")
print(vote.json())
All votes from an account
.. code-block:: python
from crea.vote import AccountVotes
allVotes = AccountVotes("gtg")
Access a post
.. code-block:: python
from crea.comment import Comment
comment = Comment("@gtg/ffdhu-gtg-witness-log")
print(comment["active_votes"])
Access the market
.. code-block:: python
from crea.market import Market
market = Market("CBD:CREA")
print(market.ticker())
Access a witness
.. code-block:: python
from crea.witness import Witness
witness = Witness("gtg")
print(witness.is_active)
Sending transaction to the blockchain
-------------------------------------
Sending a Transfer
.. code-block:: python
from crea import Crea
crea = Crea()
crea.wallet.unlock("wallet-passphrase")
account = Account("test", crea_instance=crea)
account.transfer("null", 1, "CBD", "test")
Upvote a post
.. code-block:: python
from crea.comment import Comment
from crea import Crea
crea = Crea()
crea.wallet.unlock("wallet-passphrase")
comment = Comment("@gtg/ffdhu-gtg-witness-log", crea_instance=crea)
comment.upvote(weight=10, voter="test")
Publish a post to the blockchain
.. code-block:: python
from crea import Crea
crea = Crea()
crea.wallet.unlock("wallet-passphrase")
crea.post("title", "body", author="test", tags=["a", "b", "c", "d", "e"], self_vote=True)
Sell CREA on the market
.. code-block:: python
from crea.market import Market
from crea import Crea
crea = Crea()
crea.wallet.unlock("wallet-passphrase")
market = Market("CBD:CREA", crea_instance=crea)
print(market.ticker())
market.crea.wallet.unlock("wallet-passphrase")
print(market.sell(300, 100)) # sell 100 CREA for 300 CREA/CBD
| 26.251208 | 97 | 0.669672 |
ccd24d89180ad523be032de59c45e68913e6ee88 | 6,883 | rst | reStructuredText | README.rst | anergictcell/pyhpo | 94bcde46754f2a7fd8bbb5cb19dbcaa103b9e935 | [
"MIT"
] | 2 | 2021-06-01T11:42:16.000Z | 2022-01-26T11:40:43.000Z | README.rst | anergictcell/pyhpo | 94bcde46754f2a7fd8bbb5cb19dbcaa103b9e935 | [
"MIT"
] | 2 | 2019-12-18T05:47:04.000Z | 2019-12-18T05:48:05.000Z | README.rst | anergictcell/pyhpo | 94bcde46754f2a7fd8bbb5cb19dbcaa103b9e935 | [
"MIT"
] | 1 | 2021-09-03T11:47:44.000Z | 2021-09-03T11:47:44.000Z | *****
PyHPO
*****
A Python library to work with, analyze, filter and inspect the `Human Phenotype Ontology`_
Visit the `PyHPO Documentation`_ for a more detailed overview of all the functionality.
New maintainer - Centogene
==========================
As of version 2.7, the library is maintained by CENTOGENE. The new `Git repository`_ is at https://github.com/Centogene/pyhpo.
The documentation is now also hosted by Github at https://centogene.github.io/pyhpo/.
I will continue to be the lead maintainer of PyHPO.
Main features
=============
It allows working on individual terms ``HPOTerm``, a set of terms ``HPOSet`` and the full ``Ontology``.
Internally, the ontology is represented as a branched linked list: every term contains pointers to its parent and child terms. This allows fast tree traversal.
The library is helpful for the discovery of novel gene-disease associations and for GWAS data analysis studies. At the same time, it can be used to organize clinical information of patients in research or diagnostic settings.
It provides an interface to create a ``pandas.DataFrame`` from its data, allowing integration into existing data analysis tools.
HPOTerm
-------
An individual ``HPOTerm`` contains all info about itself as well as pointers to its parents and its children. You can access its information content, calculate similarity scores to other terms, find the shortest or longest connection between two terms, and list all associated genes or diseases.
HPOSet
------
An ``HPOSet`` can be used to represent e.g. a patient's clinical information. It allows some basic filtering and comparisons to other ``HPOSet`` s.
Ontology
--------
The ``Ontology`` represents all HPO terms and their connections and associations. It also contains pointers to associated genes and disease.
Installation / Setup
====================
The easiest way to install PyHPO is via pip
.. code:: bash
pip install pyhpo
.. note::
Some features of PyHPO require ``pandas``. The standard installation via pip will not include pandas and PyHPO will work just fine. (You will get a warning on the initial import though). As long as you don't try to create a ``pandas.DataFrame``, everything should work without pandas. If you want to use all features, install ``pandas`` yourself:
.. code:: bash
pip install pandas
Usage
=====
For a detailed description of how to use PyHPO, visit the `PyHPO Documentation`_.
Getting started
---------------
.. code:: python
from pyhpo.ontology import Ontology
# initilize the Ontology (you can specify config parameters if needed here)
ontology = Ontology()
# Iterate through all HPO terms
for term in ontology:
    # do something, e.g.
    print(term.name)
There are multiple ways to retrieve a single term out of an ontology:
.. code:: python
# Retrieve a term via its HPO-ID
term = ontology.get_hpo_object('HP:0002650')
# ...or via the Integer representation of the ID
term = ontology.get_hpo_object(2650)
# ...or via shortcut
term = ontology[2650]
# ...or by term name
term = ontology.get_hpo_object('Scoliosis')
You can also do substring search on term names and synonyms:
.. code:: python
# ontology.search returns an Iterator over all matches
for term in ontology.search('Abn'):
    print(term.name)
Find the shortest path between two terms:
.. code:: python
ontology.path(
'Abnormality of the nervous system',
'HP:0002650'
)
Working with terms
------------------
.. code-block:: python
# check the relationship of two terms
term.path_to_other(ontology[11])
# get the information content for OMIM diseases
term.information_content['omim']
# ...or for genes
term.information_content['genes']
# compare two terms
term.similarity_score(term2, method='resnik', kind='gene')
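Resnik similarity is the information content (IC) of the most informative common ancestor of the two terms. A minimal stdlib-only sketch with hypothetical IC values and ancestor sets (this is an illustration of the idea, not PyHPO's internal implementation):

```python
# Toy ontology: made-up information-content values and ancestor sets.
ic = {"root": 0.0, "A": 1.2, "B": 2.3, "C": 3.1}
ancestors = {
    "B": {"B", "A", "root"},
    "C": {"C", "A", "root"},
}

def resnik(t1, t2):
    """IC of the most informative ancestor shared by both terms."""
    common = ancestors[t1] & ancestors[t2]
    return max(ic[t] for t in common)

print(resnik("B", "C"))  # 1.2 -- the shared ancestor "A" is most informative
```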
Working with sets
-----------------
.. code-block:: python
# Create a clinical information set of HPO Terms
clinical_info = pyhpo.HPOSet([
ontology[12],
ontology[14],
ontology.get_hpo_object(2650)
])
# Extract only child nodes and leave out all parent terms
children = clinical_info.child_nodes()
# Remove HPO modifier terms
new_ci = clinical_info.remove_modifier()
# Calculate the similarity of two Sets
sim_score = clinical_info.similarity(other_set)
Statistics
-----------------
``PyHPO`` includes some basic statics method for gene, disease and HPO-Term enrichment analysis.
.. code-block:: python
# Let's say you have a patient with a couple of symptoms and
# you want to find out the most likely affected genes
# or most likely diseases
from pyhpo import stats
from pyhpo.ontology import Ontology
from pyhpo.set import HPOSet, BasicHPOSet
_ = Ontology()
hpo_terms = [
'Decreased circulating antibody level',
'Abnormal immunoglobulin level',
'Abnormality of B cell physiology',
'Abnormal lymphocyte physiology',
'Abnormality of humoral immunity',
'Lymphoma',
'Lymphopenia',
'Autoimmunity',
'Increased circulating IgG level',
'Abnormal lymphocyte count'
]
# you can either use a HPOSet for this
hposet = HPOSet.from_queries(hpo_terms)
# or just a plain list of HPO Terms
hposet = [Ontology.match(q) for q in hpo_terms]
# Initialize an Enrichment model for genes
gene_model = stats.EnrichmentModel('gene')
# You can also do enrichment for diseases
disease_model = stats.EnrichmentModel('omim')
# Calculate the Hypergeometric distribution test enrichment
gene_results = gene_model.enrichment(
'hypergeom',
hposet
)
disease_results = disease_model.enrichment(
'hypergeom',
hposet
)
# and print the Top-10 results
for x in gene_results[0:10]:
    print(x)
for x in disease_results[0:10]:
    print(x)
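The ``'hypergeom'`` option above is a standard hypergeometric tail test: given a population of N genes of which K are annotated to the query terms, how surprising is it to hit k of them in a sample of n? A stdlib-only sketch with made-up numbers (not PyHPO's implementation):

```python
from math import comb

def hypergeom_sf(k, N, K, n):
    """P(X >= k) when drawing n items from a population of N containing K 'successes'."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# e.g. 17000 genes overall, 120 annotated to the query term set,
# and 4 of a patient's 10 candidate genes fall among them
p = hypergeom_sf(4, 17000, 120, 10)
print(p)  # a very small p-value -> strong enrichment
```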
and many more examples in the `PyHPO Documentation`_
Contributing
============
Yes, please do so. I would appreciate any help, suggestions for improvement or other feedback. Just create a pull-request or open an issue.
License
=======
PyHPO is released under the `MIT license`_.
PyHPO is using the Human Phenotype Ontology. Find out more at http://www.human-phenotype-ontology.org
Sebastian Köhler, Leigh Carmody, Nicole Vasilevsky, Julius O B Jacobsen, et al. Expansion of the Human Phenotype Ontology (HPO) knowledge base and resources. Nucleic Acids Research. (2018) doi: 10.1093/nar/gky1105
.. _PyHPO Documentation: https://centogene.github.io/pyhpo/
.. _MIT license: http://www.opensource.org/licenses/mit-license.php
.. _Human Phenotype Ontology: https://hpo.jax.org/
.. _Git repository: https://github.com/Centogene/pyhpo
| 29.668103 | 350 | 0.696934 |
dc802d6d485f94176d5a26a18f8c5d33373793ba | 1,221 | rst | reStructuredText | docs/usage.rst | MechanisM/django-confy | 53818db22d1f05623d257aac2abdc625f5972d88 | [
"MIT"
] | 28 | 2015-01-12T11:27:19.000Z | 2018-05-05T21:37:58.000Z | docs/usage.rst | MechanisM/django-confy | 53818db22d1f05623d257aac2abdc625f5972d88 | [
"MIT"
] | null | null | null | docs/usage.rst | MechanisM/django-confy | 53818db22d1f05623d257aac2abdc625f5972d88 | [
"MIT"
] | 2 | 2015-04-26T07:28:32.000Z | 2018-02-27T14:34:40.000Z | Usage
~~~~~
Import from confy needed modules and use them.
Example for settings.py:
.. code-block:: py
from confy import env, database, cache
DEBUG = env('DEV')
SECRET_KEY = env('SECRET_KEY')
DATABASES = {'default': database.config()}
CACHES = {'default': cache.config()}
Create a ``.env`` file and place it in the project's root directory (where ``manage.py`` is located),
and add environment variables to it like these:
.. code-block:: sh
DJANGO_SETTINGS_MODULE=project_name.settings
DEV=True
DATABASE_URL=sqlite:////server/apps/project_name/project_name.sqlite3
CACHE_URL=uwsgi://
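For contexts outside Django (e.g. a quick shell session), the effect of reading a ``.env`` file can be sketched in plain shell — the file contents below are example values only, and this is only a rough analogue of what ``confy.read_environment_file`` does in Python:

```shell
# Write an example .env file (sample values only)
cat > .env <<'EOF'
DEV=True
SECRET_KEY=s3cr3t
EOF

# Export every assignment while sourcing the file
set -a
. ./.env
set +a

echo "$SECRET_KEY"
```

After this, child processes such as ``./manage.py runserver`` inherit the variables.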
Modify your ``manage.py`` file to read the environment variables (unless you already read them some other way, e.g. with honcho or uWSGI):
.. code-block:: py
#!/usr/bin/env python
import sys
import confy
confy.read_environment_file()
if __name__ == "__main__":
from django.core.management import execute_from_command_line
execute_from_command_line(sys.argv)
Since the environment variables now exist, you don't need to use ``os.environ.setdefault`` in ``wsgi.py`` and ``manage.py``:
.. code-block:: py
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()
.. _citydb_docker_chapter:
###############################################################################
3D City Database using Docker
###############################################################################
.. image:: ../media/citydb_docker_logo.png
:width: 80 px
:align: right
:alt: 3D City Database on Docker
The 3DCityDB Docker images are available for *PostgreSQL/PostGIS* and *Oracle*.
The PostgreSQL/PostGIS version is based on the official
`PostgreSQL <postgres_hub_>`_ and
`PostGIS <postgis_hub_>`_ Docker images.
The Oracle version is based on the
*Oracle Database Enterprise Edition* images available from the
`Oracle Container registry <https://container-registry.oracle.com>`_.
The images described here are available for 3DCityDB version ``v4.1.0`` and newer.
Images for older 3DCityDB versions are available from
`TUM-GIS 3DCityDB Docker images <https://github.com/tum-gis/
3dcitydb-docker-postgis>`_.
When designing the images we tried to stay as close as possible to the behavior of
the base images and the :ref:`3DCityDB Shell scripts <3dcitydb_shell_scripts>`.
Thus, all configuration options you may be used to from the base images are
available for the 3DCityDB Docker images as well.
.. rubric:: Synopsis
.. code-block:: bash
:name: citydb_docker_code_synopsis_psql
:caption: Synopsis 3DCityDB Docker PostgreSQL/PostGIS
   docker run --name 3dcitydb -p 5432:5432 -d \
-e POSTGRES_PASSWORD=<theSecretPassword> \
-e SRID=<EPSG code> \
[-e HEIGHT_EPSG=<EPSG code>] \
[-e GMLSRSNAME=<mySrsName>] \
[-e POSTGRES_DB=<database name>] \
[-e POSTGRES_USER=<username>] \
[-e POSTGIS_SFCGAL=<true|false|yes|no>] \
3dcitydb/3dcitydb-pg
.. code-block:: bash
:name: citydb_docker_code_synopsis_oracle
:caption: Synopsis 3DCityDB Oracle
   docker run --name 3dcitydb -p 1521:1521 -d \
-e ORACLE_USER=<theUserName> \
-e ORACLE_PASSWORD=<theSecretPassword> \
-e SRID=<EPSG code> \
[-e HEIGHT_EPSG=<EPSG code>] \
[-e GMLSRSNAME=<mySrsName>] \
[-e ORACLE_PDB=<pluggable database name>] \
[-e DBVERSION=<oracle license option>] \
[-e VERSIONING=<version-enabled>] \
3dcitydb/3dcitydb-oracle
.. _citydb_docker_image_variants:
*******************************************************************************
Image variants and versions
*******************************************************************************
The images are available in various *variants* and *versions*. The
PostgreSQL/PostGIS images are available based on *Debian* and *Alpine Linux*,
the Oracle images are based on *Oracle Linux*.
:numref:`citydb_docker_tbl_images` gives an overview of the available image
versions.
.. list-table:: 3DCityDB Docker image variants and versions
:widths: auto
:header-rows: 1
:stub-columns: 1
:align: center
:name: citydb_docker_tbl_images
* - Tag
- PostGIS (Debian)
- PostGIS (Alpine)
- Oracle
* - edge
- |psql-deb-build-edge| |psql-deb-size-edge|
- |psql-alp-build-edge| |psql-alp-size-edge|
- |ora-build-edge| |ora-size-edge|
* - latest
- |psql-deb-size-latest|
- |psql-alp-size-latest|
- |ora-size-edge|
* - 4.1.0
- |psql-deb-size-v4.1.0|
- |psql-alp-size-v4.1.0|
- |ora-size-edge|
* - 4.2.0
- |psql-deb-size-v4.2.0|
- |psql-alp-size-v4.2.0|
- |ora-size-edge|
The **edge** images are automatically built and published on every push to the
*master* branch of the `3DCityDB Github repository <https://github.com/3dcitydb/
3dcitydb>`_
using the latest stable version of the base images.
The **latest** and **release** image versions are only built
when a new release is published on Github. The **latest** tag will point to
the most recent release version using the latest base image version.
.. _citydb_docker_image_pg:
PostgreSQL/PostGIS images
===============================================================================
The PostgreSQL/PostGIS images are available from
`3DCityDB DockerHub <https://hub.docker.com/r/3dcitydb/3dcitydb-pg>`_ and
can be pulled like this:
.. code-block:: Shell
docker pull 3dcitydb/3dcitydb-pg:TAG
The image tags are composed of the *base image version*, the
*3DCityDB version* and the *image variant*,
``<base image version>-<3DCityDB version>-<image variant>``.
The base image version is inherited
from the `PostGIS Docker images <https://hub.docker.com/r/postgis/postgis/tags>`_.
Debian is the default image variant, where no image variant is appended to the
tag. For the Alpine Linux images ``-alpine`` is appended. Currently supported
base image versions are listed in :numref:`citydb_docker_tbl_pgversions`.
.. list-table:: Overview of supported PostgreSQL/PostGIS versions
:widths: auto
:header-rows: 1
:stub-columns: 1
:align: center
:name: citydb_docker_tbl_pgversions
* - PostgreSQL/PostGIS version
- 2.5
- 3.0
- 3.1
* - 9.5
- 9.5-2.5
- 9.5-3.0
-
* - 9.6
- 9.6-2.5
- 9.6-3.0
- 9.6-3.1
* - 10
- 10-2.5
- 10-3.0
- 10-3.1
* - 11
- 11-2.5
- 11-3.0
- 11-3.1
* - 12
- 12-2.5
- 12-3.0
- 12-3.1
* - 13
-
- 13-3.0
- 13-3.1
* - 14
-
-
- 14-3.1
The full list of available tags can be found on `DockerHub <https://hub.
docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated>`_.
Here are some examples for full image tags:
.. code-block:: shell
docker pull 3dcitydb/3dcitydb-pg:9.5-2.5-4.2.0
docker pull 3dcitydb/3dcitydb-pg:13-3.1-4.2.0
docker pull 3dcitydb/3dcitydb-pg:13-3.1-4.2.0-alpine
.. _citydb_docker_image_oracle:
Oracle images
===============================================================================
Due to Oracle licensing conditions we cannot offer Oracle images
in a public repository like DockerHub at the
moment. However, you can easily build the images yourself. A detailed description
of how to do that is available in :numref:`citydb_docker_oracle_build`.
.. _citydb_docker_config:
*******************************************************************************
Usage and configuration
*******************************************************************************
A 3DCityDB container is configured by setting environment variables inside
the container. For instance, this can be done using the ``-e VARIABLE=VALUE``
flag of `docker run <https://docs.docker.com/engine/reference/run/#env-
environment-variables>`_. The 3DCityDB Docker images introduce the variables
:option:`SRID`, :option:`HEIGHT_EPSG` and :option:`GMLSRSNAME`. Their behavior
is described here.
Furthermore, some variables inherited from the base images offer important
configuration options, they are described separately for the
:ref:`PostgreSQL/PostGIS <citydb_docker_config_psql>` and
:ref:`Oracle <citydb_docker_config_oracle>` image variants.
.. tip:: All variables besides :option:`POSTGRES_PASSWORD` and
:option:`ORACLE_PWD` are optional.
.. option:: SRID=<EPSG code>
EPSG code for the 3DCityDB instance. If :option:`SRID` is not set,
the 3DCityDB schema will not be setup in the default database and
you will end up with a plain PostgreSQL/PostGIS or Oracle container.
.. option:: HEIGHT_EPSG=<EPSG code>
EPSG code of the height system, omit or use 0 if unknown or
:option:`SRID` is already 3D. This variable is used only for the automatic
generation of :option:`GMLSRSNAME`.
.. option:: GMLSRSNAME=<mySrsName>
If set, the automatically generated :option:`GMLSRSNAME` from :option:`SRID`
and :option:`HEIGHT_EPSG` is overwritten. If not set, the variable will
be created automatically like this:
If only :option:`SRID` is set: :option:`GMLSRSNAME` =
``urn:ogc:def:crs:EPSG::SRID``
If :option:`SRID` and :option:`HEIGHT_EPSG` are set:
:option:`GMLSRSNAME` = ``urn:ogc:def:crs,crs:EPSG::SRID,crs:EPSG::HEIGHT_EPSG``
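The generation rule above can be sketched in shell — the EPSG codes below are arbitrary example values, not defaults used by the image:

```shell
SRID=25832        # example horizontal CRS (ETRS89 / UTM 32N)
HEIGHT_EPSG=7837  # example height CRS; leave empty or 0 if unknown

if [ -z "$HEIGHT_EPSG" ] || [ "$HEIGHT_EPSG" = "0" ]; then
    GMLSRSNAME="urn:ogc:def:crs:EPSG::${SRID}"
else
    GMLSRSNAME="urn:ogc:def:crs,crs:EPSG::${SRID},crs:EPSG::${HEIGHT_EPSG}"
fi

echo "$GMLSRSNAME"   # → urn:ogc:def:crs,crs:EPSG::25832,crs:EPSG::7837
```

Setting :option:`GMLSRSNAME` explicitly always takes precedence over this derivation.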
.. _citydb_docker_config_psql:
PostgreSQL/PostGIS environment variables
===============================================================================
The 3DCityDB PostgreSQL/PostGIS Docker images make use of the following
environment variables inherited from the official
`PostgreSQL <https://hub.docker.com/_/postgres>`_ and
`PostGIS <https://hub.docker.com/r/postgis/postgis>`_ Docker images. Refer to
the documentations of both images for much more configuration options.
.. option:: POSTGRES_DB=<database name>
   Sets the name of the default database. If not set, the default database is named
   after :option:`POSTGRES_USER`.
.. option:: POSTGRES_USER=<username>
   Sets the name of the database user; defaults to ``postgres``.
.. option:: POSTGRES_PASSWORD=<password>
Sets the password for the database connection. This variable is **mandatory**.
.. option:: POSTGIS_SFCGAL=<true|false|yes|no>
If set, `PostGIS SFCGAL <http://www.sfcgal.org/>`_ support is
enabled. **Note:** SFCGAL is currently only available in the Debian image variant.
Setting the variable on Alpine images will have no effect.
.. _citydb_docker_config_oracle:
Oracle environment variables
===============================================================================
.. option:: DBUSER=<username>
The database user name of the 3DCityDB instance to be created. The default value
is 'citydb'.
.. option:: ORACLE_PWD=<password>
The database password of the 3DCityDB instance to be created. This variable is
mandatory.
.. option:: ORACLE_PDB=<pluggable database name>
   Sets the name of the pluggable database (PDB) that should be used (default:
   'ORCLPDB1'). Requires Oracle 12c or higher.
.. option:: DBVERSION=<oracle license option>
'S' (default value) or 'L' to choose the Oracle Spatial or Locator license
option for the 3DCityDB instance to be created.
.. option:: VERSIONING=<version-enabled>
'yes' or 'no' (default value) to specify whether the 3DCityDB instance should be
versioned-enabled based on the Oracle's Workspace Manager.
.. _citydb_docker_build:
*******************************************************************************
How to build images
*******************************************************************************
This section describes how to build 3DCityDB Docker images on your own. Both
the PostgreSQL/PostGIS and Oracle versions offer one build argument that can
be used to set the tag of the base image.
.. option:: BASEIMAGE_TAG=<tag of the base image>
Tag of the base image that is used for the build. Available tags can be
found on DockerHub for the `PostgreSQL/PostGIS images <https://registry.hub.
docker.com/r/postgis/postgis/tags?page=1&ordering=last_updated>`_ and in
the `Oracle container registry <https://container-registry.oracle.com>`_.
.. _citydb_docker_psql_build:
PostgreSQL/PostGIS
===============================================================================
The PostgreSQL/PostGIS images are build by cloning the 3DCityDB Github repository
and running `docker build <https://docs.docker.com/engine/reference/commandline
/build/>`_:
1. Clone the 3DCityDB Github repository and navigate to the ``postgresql`` folder in
   the repo:
.. code-block:: bash
git clone https://github.com/3dcitydb/3dcitydb.git
cd 3dcitydb/postgresql/
2. Build the PostgreSQL/PostGIS image using `docker build <https://docs.docker.com
   /engine/reference/commandline/build/>`_:
.. code-block:: bash
      docker build -t 3dcitydb/3dcitydb-pg .
      # or with a specific base image tag
      docker build -t 3dcitydb/3dcitydb-pg \
          --build-arg BASEIMAGE_TAG=13-3.1 \
          .
.. _citydb_docker_oracle_build:
Oracle
===============================================================================
To build Oracle 3DCityDB Docker images, you need to create an Oracle account
and accept the licensing conditions first:
1. Visit https://login.oracle.com/mysso/signon.jsp and create an account.
2. Visit https://container-registry.oracle.com and navigate to *Database*.
Click the *Continue* button in the right column of the *enterprise* repository.
Scroll to the bottom of the license agreement, which should be displayed
now and click *accept*.
3. The repository listing should now show a green hook for the enterprise
repository, as shown in the example below.
|oracle-license|
   If this is the case, you are ready to pull the required base images from
   the Oracle container registry.
4. Sign Docker in to the Oracle container registry with the account credentials
   from above using `docker login <https://docs.docker.com/engine/reference
   /commandline/login/>`_:
.. code-block:: bash
docker login container-registry.oracle.com
5. Clone the 3DCityDB repository and navigate to the ``oracle`` folder in the
repo:
.. code-block:: bash
git clone https://github.com/3dcitydb/3dcitydb.git
cd 3dcitydb/oracle/
6. Build the 3DCityDB Oracle image using `docker build <https://docs.docker.com
/engine/reference/commandline/build/>`_:
.. code-block:: bash
docker build -t 3dcitydb/3dcitydb-oracle .
# or with a specific base image tag
docker build . \
-t 3dcitydb/3dcitydb-oracle \
--build-arg BASEIMAGE_TAG=19.3.0.0
After the build process has finished, you are ready to use the image
(see :numref:`citydb_docker_config` and :numref:`citydb_docker_config_oracle`)
or push it to a **private** Docker repository.
*******************************************************************************
Performance tuning for PostgreSQL/PostGIS containers
*******************************************************************************
PostgreSQL databases offer a wide range of configuration parameters that
affect database performance and enable e.g. parallelization of queries.
Database optimization is a complex topic, but using `PGTune <https://pgtune.
leopard.in.ua/#/>`_ you can easily get a set of configuration options
that may help to increase database performance.
1. Visit the `PGTune website <https://pgtune.leopard.in.ua/#/>`_, fill in the
form and generate a set of parameters for your system. You will get
something like this:
.. code-block:: text
# DB Version: 13
# OS Type: linux
# DB Type: mixed
# Total Memory (RAM): 8 GB
# CPUs num: 8
# Connections num: 20
# Data Storage: ssd
max_connections = 20
shared_buffers = 2GB
effective_cache_size = 6GB
maintenance_work_mem = 512MB
checkpoint_completion_target = 0.9
wal_buffers = 16MB
default_statistics_target = 100
random_page_cost = 1.1
effective_io_concurrency = 200
work_mem = 13107kB
min_wal_size = 1GB
max_wal_size = 4GB
max_worker_processes = 8
max_parallel_workers_per_gather = 4
max_parallel_workers = 8
max_parallel_maintenance_workers = 4
2. Pass these configuration parameters to ``postgres`` (see emphasized line)
   using the ``-c`` option when starting your 3DCityDB container with
   `docker run <https://docs.docker.com/engine/reference/run>`_.
.. code-block:: bash
:emphasize-lines: 4
   docker run -d -i -t --name citydb -p 5432:5432 \
-e SRID=25832 \
-e POSTGRES_PASSWORD=changeMe! \
3dcitydb/3dcitydb-pg postgres \
-c max_connections=20 \
-c shared_buffers=2GB \
-c effective_cache_size=6GB \
-c maintenance_work_mem=512MB \
-c checkpoint_completion_target=0.9 \
-c wal_buffers=16MB \
-c default_statistics_target=100 \
-c random_page_cost=1.1 \
-c effective_io_concurrency=200 \
-c work_mem=13107kB \
-c min_wal_size=1GB \
-c max_wal_size=4GB \
-c max_worker_processes=8 \
-c max_parallel_workers_per_gather=4 \
-c max_parallel_workers=8 \
-c max_parallel_maintenance_workers=4
*******************************************************************************
Creating 3DCityDB Docker images including data
*******************************************************************************
In general, it is **not recommended** to store data directly inside a Docker image;
use `docker volumes <https://docs.docker.com/storage/volumes/>`_ instead.
Volumes are the preferred mechanism for persisting data generated by and used by
Docker containers.
However, for some use cases it can be very handy to create a Docker image including
data. For instance, if you have automated tests operating on the exact same
data every time, or you want to prepare a 3DCityDB image including data for a
lecture or workshop that will run out of the box, without having to import
data first.
.. warning:: The practice described here has many drawbacks and is a potential
security threat. It should not be performed with sensitive data!
Here is how to create an image with data:
1. Choose a 3DCityDB image that is suitable for your purpose. You will not be able
to change the image version later, as you could easily do when using volumes
(the default). Available versions are listed in :ref:`citydb_docker_image_variants`.
   To update an image with data, it has to be recreated from scratch using the
desired/updated base image.
2. Create a Docker network and start a 3DCityDB Docker container:
.. code-block:: bash
docker network create citydb-net
docker run -d --name citydbTemp \
--network citydb-net \
-e "PGDATA=/mydata" \
-e "POSTGRES_PASSWORD=changeMe" \
-e "SRID=25832" \
3dcitydb/3dcitydb-pg:latest-alpine
   .. warning:: The database credentials and settings provided in this step
      cannot be changed later when creating containers from this image!
      Note down the database connection credentials (db name, username, password)
      or you won't be able to access the content later.
3. Import data to the container. For this example we are using the
   :download:`LoD3 Railway dataset <https://github.com/3dcitydb/importer-exporter/raw/master/resources/samples/Railway%20Scene/Railway_Scene_LoD3.zip>` and the
   :ref:`3DCityDB Importer/Exporter Docker image<impexp_docker_chapter>`:

   .. code-block:: bash

      docker run -i -t --rm --name impexp \
          --network citydb-net \
          -v /d/temp:/data \
          3dcitydb/impexp:latest-alpine import \
          -H citydbTemp \
          -d postgres \
          -u postgres \
          -p changeMe \
          /data/Railway_Scene_LoD3.zip

4. Stop the running 3DCityDB container, remove the network, and commit it
   to an image:

   .. code-block:: bash

      docker stop citydbTemp
      docker network rm citydb-net
      docker commit citydbTemp 3dcitydb/3dcitydb-pg:4.1.0-alpine-railwayScene_LoD3

5. Remove the 3DCityDB container:

   .. code-block:: bash

      docker rm -f -v citydbTemp
We have now created a 3DCityDB image containing data, which can e.g. be pushed to a
Docker registry or exported as a TAR archive.
When creating containers from this image, it is not required to specify any configuration
parameters, as you usually would when creating a fresh 3DCityDB container.
.. code-block:: bash
docker run --name cdbWithData --rm -p 5432:5432 \
3dcitydb/3dcitydb-pg:4.1.0-alpine-railwayScene_LoD3
To connect to the database, use the credentials you set in step 2. The following example
counts the city objects in the DB running in the container using ``psql``.
.. code-block:: console
   $ export PGPASSWORD=changeMe
$ query='SELECT COUNT(*) FROM citydb.cityobject;'
$ psql -h localhost -p 5432 -U postgres -d postgres -c "$query"
count
-------
231
(1 row)
.. Links ----------------------------------------------------------------------
.. _postgres_hub: https://github.com/docker-library/postgres/
.. _postgis_hub: https://github.com/postgis/docker-postgis/
.. Images ---------------------------------------------------------------------
.. |oracle-license| image:: ../media/citydb_oracle_license.jpg
.. edge
.. |psql-deb-build-edge| image:: https://img.shields.io/github/workflow/status/
3dcitydb/3dcitydb/psql-docker-build-push-edge?label=Debian&
style=flat-square&logo=Docker&logoColor=white
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated
.. |psql-deb-size-edge| image:: https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/edge?label=image%20size&logo=Docker&logoColor=white&style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated
.. |psql-alp-build-edge| image:: https://img.shields.io/github/workflow/status/
3dcitydb/3dcitydb/psql-docker-build-push-edge?label=Alpine&
style=flat-square&logo=Docker&logoColor=white
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated
.. |psql-alp-size-edge| image:: https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/edge-alpine?label=image%20size&logo=Docker&logoColor=white&
style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated
.. |ora-build-edge| image:: https://img.shields.io/github/workflow/status/
3dcitydb/3dcitydb/oracle-docker-build-edge?label=Oracle%20Linux&
style=flat-square&logo=Docker&logoColor=white
.. |ora-size-edge| image:: https://img.shields.io/static/v1?label=image%20size&message=
%3E3%20GB&color=blue&style=flat-square&logo=Docker&logoColor=white
.. latest
.. |psql-deb-size-latest| image:: https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/latest?label=image%20size&logo=Docker&logoColor=white&style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated
.. |psql-alp-size-latest| image:: https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/latest-alpine?label=image%20size&logo=Docker&logoColor=white&
style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg/tags?page=1&ordering=last_updated
.. 4.1.0
.. |psql-deb-size-v4.1.0| image:: https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/13-3.1-4.1.0?label=image%20size&logo=Docker&logoColor=white&style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg
.. |psql-alp-size-v4.1.0| image:: https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/13-3.1-4.1.0-alpine?label=image%20size&logo=Docker&logoColor=white&
style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg
.. 4.2.0
.. |psql-deb-size-v4.2.0| image::
https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/14-3.1-4.2.0?label=image%20size&logo=Docker&logoColor=white&style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg
.. |psql-alp-size-v4.2.0| image::
https://img.shields.io/docker/image-size/
3dcitydb/3dcitydb-pg/14-3.1-4.2.0-alpine?label=image%20size&logo=Docker&logoColor=white&
style=flat-square
:target: https://hub.docker.com/r/3dcitydb/3dcitydb-pg
olm.calcite.calc\_k2
====================
.. currentmodule:: olm.calcite
.. autofunction:: calc_k2

Schema design basics
====================
Let's take a look at how to create tables using the Hasura console, a UI tool meant for doing exactly this. We'll use a
typical author/articles schema as a reference for all the following examples.
Open the console
----------------
Run the following command using the Hasura CLI tool.
.. code:: bash
hasura console
Create tables
-------------
Let's say we want to create two simple tables:
- ``author`` with columns ``id``, ``name``
- ``article`` with columns ``id``, ``title``, ``content``, ``author_id``
Head to the ``Data`` tab and click the ``Create Table`` button to open up an interface to create tables.
For example, here is the schema for the ``article`` table in this interface:
.. image:: ../../../img/graphql/manual/schema/create-table-graphql.png
As soon as a table is created, the corresponding GraphQL schema and resolvers are automatically created/updated. E.g.
the following *query* and *mutation* fields are generated for the tables we just created:
.. code-block:: none
article (
where: article_bool_exp
limit: Int
offset: Int
order_by: [article_order_by!]
): [article]
.. code-block:: none
insert_article (
objects: [article_input!]
on_conflict: conflict_clause
): article_mutation_response
Try basic GraphQL queries
-------------------------
At this point, you should be able to try out basic GraphQL queries/mutations on the newly created tables using the API
Explorer in the console (*you may want to add some test data to the tables*).
.. note::
You can either use the admin token to run them or modify the permissions for these tables to temporarily allow
anonymous access to data in the **Permissions** tab of each table.
Here are a couple of examples:
- Query all rows in the ``article`` table
.. graphiql::
:query:
query {
article {
id
title
author_id
}
}
:response:
{
"data": {
"article": [
{
"id": 1,
"title": "sit amet",
"author_id": 4
},
{
"id": 2,
"title": "a nibh",
"author_id": 2
},
{
"id": 3,
"title": "amet justo morbi",
"author_id": 4
},
{
"id": 4,
"title": "vestibulum ac est",
"author_id": 5
}
]
}
}
- Insert data in the ``author`` table
.. graphiql::
:view_only: true
:query:
mutation add_author {
insert_author(
objects: [
{id: 11, name: "Jane"}
]
) {
affected_rows
}
}
:response:
{
"data": {
"insert_author": {
"affected_rows": 1
}
}
}
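The same queries can also be sent outside the console as plain HTTP requests. A sketch that assembles the request body for the first query — the endpoint URL and path in the commented ``curl`` call are assumptions and depend on where your Hasura instance runs:

```shell
QUERY='query { article { id title author_id } }'
PAYLOAD=$(printf '{"query": "%s"}' "$QUERY")
echo "$PAYLOAD"

# Send it to a (hypothetical) local GraphQL endpoint:
# curl -s -X POST http://localhost:8080/v1alpha1/graphql \
#   -H 'Content-Type: application/json' \
#   -d "$PAYLOAD"
```

Depending on your permission setup, an admin token header may be required as well.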
===========================
Shared File Systems service
===========================
.. toctree::
shared-file-systems/overview.rst
shared-file-systems/api.rst
shared-file-systems/drivers.rst
shared-file-systems/log-files.rst
shared-file-systems/rpc.rst
shared-file-systems/misc.rst
shared-file-systems/samples/index.rst
tables/conf-changes/manila.rst
The Shared File Systems service works with many different drivers that
you can configure by using these instructions.
.. note::
The common configurations for shared service and libraries,
such as database connections and RPC messaging,
are described at :doc:`common-configurations`.
.. automodule:: strawberryfields.gbs.sample
:members:
======================
Contributing Guide
======================
.. start short_desc
**Contributing guide for projects in the python-coincidence organization.**
.. end short_desc
View online at https://contributing-to-coincidence.readthedocs.io/
paasta_tools.autoscaling.cluster_boost module
=============================================
.. automodule:: paasta_tools.autoscaling.cluster_boost
:members:
:undoc-members:
:show-inheritance:
================================
CMake utility functions
================================
Apart from the C++ and Wolfram Language APIs, LLU offers a range of CMake utility functions to automate common steps in building LibraryLink paclets with CMake.
While it is not at all required to use CMake in a project that links to LLU, it is definitely convenient, as LLU is specifically tailored to be used
by other CMake projects.
When you install LLU, a :file:`cmake` directory is created in the installation directory, with the following contents:
.. code-block:: none
:emphasize-lines: 5,7
.
└── cmake
└── LLU
├── Wolfram
│ ├── Common.cmake
│ ├── CVSUtilities.cmake
│ └── PacletUtilities.cmake
├── FindWolframLanguage.cmake
├── FindWolframLibrary.cmake
├── FindWSTP.cmake
├── LLUConfig.cmake
├── LLUConfigVersion.cmake
└── LLUTargets.cmake
Most of these files are used internally by LLU or by CMake in order to get information about LLU installation when you link to it from your project.
However, in the :file:`Wolfram` subdirectory you will find two files (highlighted) with general purpose utilities which are documented below.
.. tip::
Check out the :ref:`Demo paclet <demo-project>` to see how some of these utilities can be used in a project.
Common
================================
:file:`cmake/LLU/Wolfram/Common.cmake` contains a number of small CMake functions and macros that automate common tasks when writing cross-platform CMake code.
Not all of them will be useful in every project, so feel free to choose whatever suits your needs.
.. cmake:command:: set_machine_flags
**Syntax:**
.. code-block:: cmake
set_machine_flags(<target>)
Depending on the machine architecture and operating system, this function sets the correct "machine flag" for the given target:
- on Windows it sets ``/MACHINE:XX`` link flag
- on Linux and MacOS it sets ``-mXX`` flag for compilation and linking
Additionally, on 32-bit platforms it also defines the ``MINT_32`` macro to indicate to the Wolfram Library that it should use 32-bit machine integers.
.. cmake:command:: set_windows_static_runtime
**Syntax:**
.. code-block:: cmake
set_windows_static_runtime()
Forces static runtime on Windows and does nothing on other platforms. See https://gitlab.kitware.com/cmake/community/wikis/FAQ#dynamic-replace for details.
.. cmake:command:: set_min_windows_version
**Syntax:**
.. code-block:: cmake
set_min_windows_version(<target> <version>)
Adds compile definitions to the specified target to set the minimum supported Windows version. Does nothing on other platforms.
Supported values of ``<version>`` include: 7, 8, 8.1 and 10.
.. cmake:command:: set_default_compile_options
**Syntax:**
.. code-block:: cmake
set_default_compile_options(<target> <optimization>)
Sets default paclet compile options, including warning level and optimization. On Windows, it also sets ``/EHsc``. A call to this function may be used
as a starting point and new compile options can be added with consecutive calls to :cmake:command:`target_compile_options`.
.. cmake:command:: install_dependency_files
**Syntax:**
.. code-block:: cmake
install_dependency_files(<paclet_name> <dependency_target> [lib1, lib2, ...])
Copies dependency libraries into the paclet layout if the library type is SHARED (always copies on Windows).
Optional arguments are the libraries to copy (defaults to main target file plus its dependencies).
**Arguments:**
:cmake:variable:`<paclet_name>`
name of the paclet (i.e. name of the paclet's layout root directory)
:cmake:variable:`<dependency_target>`
CMake target corresponding to a dependency of the paclet
:cmake:variable:`lib1, lib2, ...`
*[optional]* absolute paths to dynamic libraries on which the paclet depends and which should be copied to the paclet's layout. If not provided,
this information will be deduced from the ``<dependency_target>``.
Paclet Utilities
================================
:file:`cmake/LLU/Wolfram/PacletUtilities.cmake` contains CMake functions for installing and packaging projects into proper :term:`paclet`\ s.
.. cmake:command:: install_paclet_files
**Syntax:**
.. code-block:: cmake
install_paclet_files(
TARGET <target>
[LLU_LOCATION path]
[PACLET_NAME name]
[PACLET_FILES_LOCATION path2]
[INSTALL_TO_LAYOUT])
Configures the CMake *install* target for a paclet. The only required argument is :cmake:variable:`TARGET` which should be followed by the main paclet
target (that defines the shared library). The *install* target configured with this function will copy the directory
passed as :cmake:variable:`PACLET_FILES_LOCATION` into the location stored in :cmake:variable:`CMAKE_INSTALL_PREFIX`. It will also place the
:file:`PacletInfo.wl` in the appropriate location in the paclet and put the shared library under :file:`LibraryResources/<system_id>`.
**Arguments:**
:cmake:variable:`TARGET`
name of the main target in the paclet's CMakeLists.txt
:cmake:variable:`LLU_LOCATION`
path to LLU installation. This is needed because every paclet that uses the Wolfram Language part of the LLU API needs a copy of
:file:`LibraryLinkUtilities.wl` which is stored in the :file:`share` folder of the LLU installation.
:cmake:variable:`PACLET_NAME`
*[optional]* if the name of the paclet is different than the name of the main paclet target, pass it here
:cmake:variable:`PACLET_FILES_LOCATION`
*[optional]* location of the Wolfram Language source files in the paclet; by default it is assumed to be ``${CMAKE_CURRENT_SOURCE_DIR}/PACLET_NAME``
:cmake:variable:`INSTALL_TO_LAYOUT`
*[optional]* a flag indicating whether the complete paclet layout (what the *install* target produces) should also be copied to the :file:`SystemFiles/Links`
directory of the current Wolfram Language installation (the one used for paclet configuration)
----------------------------------------
.. cmake:command:: add_paclet_target
**Syntax:**
.. code-block:: cmake
add_paclet_target(<target>
NAME name
[VERIFY]
[INSTALL]
[TEST_FILE file]
)
Create a target that produces a proper .paclet file for the project. It takes a paclet layout, produced by the *install* target, packs it into a .paclet
file, optionally verifies its contents, installs it to the user paclet directory and runs tests.
.. warning::
For this function to work, the *install* target must be built beforehand and wolframscript from Wolfram Language v12.1 or later must be available.
**Arguments:**
``<target>``
name for the new target, can be anything
:cmake:variable:`NAME`
name of the paclet, it must match the name of the paclet's layout root directory
:cmake:variable:`VERIFY`
*[optional]* verify contents of the newly created .paclet file
:cmake:variable:`INSTALL`
*[optional]* install .paclet file to the user paclet directory, see :wlref:`PacletInstall` for details
:cmake:variable:`TEST_FILE`
*[optional]* provide a path to a test file, if your paclet has one. There is no magic here, CMake will simply ask wolframscript to evaluate the file
you provided. What will actually happen fully depends on the contents of your test file.
| 39.543011 | 160 | 0.72087 |
306a057697a24a4c6a2745c5f906d47bf016a41f | 99 | rst | reStructuredText | docs/usage.rst | shannon-jia/cctv | 3a73dc28d5c3f8556cfc87f0c9d2a0d6ece23a01 | [
"MIT"
] | null | null | null | docs/usage.rst | shannon-jia/cctv | 3a73dc28d5c3f8556cfc87f0c9d2a0d6ece23a01 | [
"MIT"
] | null | null | null | docs/usage.rst | shannon-jia/cctv | 3a73dc28d5c3f8556cfc87f0c9d2a0d6ece23a01 | [
"MIT"
] | null | null | null | =====
Usage
=====
To use CCTV for SAM V1 with RabbitMQ and Docker in a project::
import cctv
| 12.375 | 62 | 0.636364 |
44075e78ccacbba88380492e22478357ccb704bd | 954 | rst | reStructuredText | docs/generating-a-cli.rst | MasonMcGill/artisan | f24932289bfe4f606b30516d429dc982df27ffdd | [
"MIT"
] | null | null | null | docs/generating-a-cli.rst | MasonMcGill/artisan | f24932289bfe4f606b30516d429dc982df27ffdd | [
"MIT"
] | 9 | 2019-06-10T11:27:56.000Z | 2022-01-20T15:53:28.000Z | docs/generating-a-cli.rst | MasonMcGill/artisan | f24932289bfe4f606b30516d429dc982df27ffdd | [
"MIT"
] | 1 | 2019-06-07T15:44:33.000Z | 2019-06-07T15:44:33.000Z | Generating a CLI
================
An example command-line interface script, using `PyYAML
<https://pyyaml.org/wiki/PyYAMLDocumentation>`_ and `clize
<https://clize.readthedocs.io/en/stable/>`_:
.. code-block:: python3
#!/usr/bin/env python3
import json
from pathlib import Path
import artisan, clize, yaml
from .my_lib import my_context
def build(spec_dict_path: Path, key: str) -> None:
'Build an artifact based on a spec in a YAML file.'
spec_dict = yaml.safe_load(spec_dict_path.read_bytes())
artifact = artisan.build(artisan.Artifact, spec_dict[key])
print(f'Built {Path(artifact)}.')
def write_schema(dst: Path) -> None:
'Generate a JSON Schema describing valid inputs to `build`.'
schema = artisan.get_spec_dict_schema()
dst.write_text(json.dumps(schema, indent=2))
print(f'Wrote a schema to {dst}.')
with artisan.using_context(my_context):
clize.run(build, write_schema)
| 31.8 | 66 | 0.69392 |
e486d62ef3c969ed45aa485d69ba2393bc6be1ea | 63 | rst | reStructuredText | README.rst | InterviewCake/he-sdk-python | 85a07b2e63b0ae9fb8579dee4f4f48731e56cf99 | [
"MIT"
] | 17 | 2015-11-18T17:33:33.000Z | 2020-12-10T19:14:27.000Z | README.rst | InterviewCake/he-sdk-python | 85a07b2e63b0ae9fb8579dee4f4f48731e56cf99 | [
"MIT"
] | 1 | 2019-02-20T15:30:35.000Z | 2019-07-24T06:47:57.000Z | README.rst | InterviewCake/he-sdk-python | 85a07b2e63b0ae9fb8579dee4f4f48731e56cf99 | [
"MIT"
] | 8 | 2016-04-18T18:22:21.000Z | 2020-02-20T23:53:16.000Z | # he-sdk-python
Python client for HackerEarth Code Checker API
| 21 | 46 | 0.809524 |
0a360b88fb28e3d77122987c1216e04fa83f0c46 | 1,904 | rst | reStructuredText | doc/operation/logging.rst | limitusus/openxpki | 92ae3f6af9830b390b06a784026fbdb3e048ac8a | [
"Apache-2.0"
] | 357 | 2015-02-19T18:23:12.000Z | 2022-03-29T04:05:25.000Z | doc/operation/logging.rst | limitusus/openxpki | 92ae3f6af9830b390b06a784026fbdb3e048ac8a | [
"Apache-2.0"
] | 604 | 2015-01-19T11:58:44.000Z | 2022-03-14T13:38:42.000Z | doc/operation/logging.rst | limitusus/openxpki | 92ae3f6af9830b390b06a784026fbdb3e048ac8a | [
"Apache-2.0"
] | 108 | 2015-03-10T19:05:20.000Z | 2022-03-29T04:32:28.000Z |
Audit Log
=========
The audit log lists all operations that are relevant for the usage of
private key material or important steps (such as approvals) that lead to a
signature using the CA key.
Categories
##########
The audit log is divided into several categories. The given items are
actions logged by the standard configuration but are not exhaustive.
The name in brackets is the name of the logger category used by the
logger.
CA Key Usage (cakey)
--------------------
* certificate issued
* crl issued
Entity Key Usage (key)
----------------------
* key generated
* key exported
* key destroyed
Certificate (entity)
----------------------
* request received
* request fully approved
* issued
* revoked
Approval (approval)
---------------------
* operator approval given via ui
* automated approval derived from backend checks
ACL (acl)
---------------------
* access to workflow
* access to api
System (system)
----------------
* start/stop of system
* import/activation of tokens
* import of certificates
Application
-----------
* Application specific logging
Parameters
##########
Each log message consists of a fixed string describing the event plus a
list of normalized parameters which are appended as key/value pairs to
the message, so it is easy to search the log for certain events or to feed
it to a log analysis program like Logstash.
* cakey/key: subject key identifier of the used key
* certid: certificate identifier
* wfid: id of the workflow
* action: name of a workflow action or called API method
* token: alias name of the token/key, e.g. "ca-signer-1"
* pki_realm: name of the pki realm
Example (line breaks are for verbosity, logfile is one line)::
certificate signed|
cakey=28:B9:6D:51:EC:EB:6D:C9:4A:71:7C:B4:C0:67:F7:E9:C1:BD:63:7A|
certid=FW2Hq52uTcthhyhrrvTjRub66M0|
key=D6:14:BB:E2:90:12:F4:FF:64:B4:0F:F3:F6:3A:FD:17:02:C9:06:C8|
pki_realm=democa
| 24.101266 | 71 | 0.693803 |
2bbee748389eaffa36890ec297a713ec84922452 | 2,658 | rst | reStructuredText | README.rst | Shihta/python-novaclient | 9465e0883e0cb9d3f8cd45373aea26070c49e0ce | [
"Apache-1.1"
] | null | null | null | README.rst | Shihta/python-novaclient | 9465e0883e0cb9d3f8cd45373aea26070c49e0ce | [
"Apache-1.1"
] | null | null | null | README.rst | Shihta/python-novaclient | 9465e0883e0cb9d3f8cd45373aea26070c49e0ce | [
"Apache-1.1"
] | null | null | null | Python bindings to the OpenStack Nova API
==================================================
This is a client for the OpenStack Nova API. There's a Python API (the
``novaclient`` module), and a command-line script (``nova``). Each
implements 100% of the OpenStack Nova API.
See the `OpenStack CLI guide`_ for information on how to use the ``nova``
command-line tool. You may also want to look at the
`OpenStack API documentation`_.
.. _OpenStack CLI Guide: http://docs.openstack.org/cli/quick-start/content/
.. _OpenStack API documentation: http://docs.openstack.org/api/
The project is hosted on `Launchpad`_, where bugs can be filed. The code is
hosted on `Github`_. Patches must be submitted using `Gerrit`_, *not* Github
pull requests.
.. _Github: https://github.com/openstack/python-novaclient
.. _Launchpad: https://launchpad.net/python-novaclient
.. _Gerrit: http://wiki.openstack.org/GerritWorkflow
python-novaclient is licensed under the Apache License like the rest of
OpenStack.
.. contents:: Contents:
:local:
Command-line API
----------------
Installing this package gets you a shell command, ``nova``, that you
can use to interact with any OpenStack cloud.
You'll need to provide your OpenStack username and password. You can do this
with the ``--os-username``, ``--os-password`` and ``--os-tenant-name``
params, but it's easier to just set them as environment variables::
export OS_USERNAME=openstack
export OS_PASSWORD=yadayada
export OS_TENANT_NAME=myproject
You will also need to define the authentication url with ``--os-auth-url``
and the version of the API with ``--os-compute-api-version``. Or set them as
environment variables as well::
export OS_AUTH_URL=http://example.com:8774/v1.1/
export OS_COMPUTE_API_VERSION=1.1
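Since all of these settings follow the ``OS_*`` naming convention, collecting them from the environment in your own scripts is straightforward. A small sketch (the helper name is hypothetical and unrelated to novaclient's own code):

```python
import os

def creds_from_env(environ=os.environ) -> dict:
    """Collect OS_* settings into keyword-style credentials."""
    keys = {
        'OS_USERNAME': 'username',
        'OS_PASSWORD': 'password',
        'OS_TENANT_NAME': 'tenant_name',
        'OS_AUTH_URL': 'auth_url',
    }
    # only include variables that are actually set
    return {name: environ[var] for var, name in keys.items() if var in environ}

env = {'OS_USERNAME': 'openstack', 'OS_PASSWORD': 'yadayada',
       'OS_TENANT_NAME': 'myproject',
       'OS_AUTH_URL': 'http://example.com:5000/v2.0/'}
creds = creds_from_env(env)
# creds['username'] == 'openstack'
```

The resulting dictionary can then be unpacked into whatever client constructor you use.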
If you are using Keystone, you need to set the OS_AUTH_URL to the keystone
endpoint::
export OS_AUTH_URL=http://example.com:5000/v2.0/
Since Keystone can return multiple regions in the Service Catalog, you
can specify the one you want with ``--os-region-name`` (or
``export OS_REGION_NAME``). It defaults to the first in the list returned.
You'll find complete documentation on the shell by running
``nova help``
Python API
----------
There's also a complete Python API, but it has not yet been documented.
To use with nova, with keystone as the authentication system::
# use v2.0 auth with http://example.com:5000/v2.0/")
>>> from novaclient.v1_1 import client
>>> nt = client.Client(USER, PASS, TENANT, AUTH_URL, service_type="compute")
>>> nt.flavors.list()
[...]
>>> nt.servers.list()
[...]
>>> nt.keypairs.list()
[...]
| 33.225 | 80 | 0.709556 |
c6532ce9925522aa5521b33362c922e19821016c | 82 | rst | reStructuredText | docs/fastf1.rst | JellybeanAsh/Fast-F1 | cf0cb20fdd3e89fdee3755097722db5ced3a23b5 | [
"MIT"
] | 690 | 2020-07-31T15:37:59.000Z | 2022-03-31T20:51:46.000Z | docs/fastf1.rst | JellybeanAsh/Fast-F1 | cf0cb20fdd3e89fdee3755097722db5ced3a23b5 | [
"MIT"
] | 90 | 2020-07-25T11:00:15.000Z | 2022-03-31T01:59:59.000Z | docs/fastf1.rst | JellybeanAsh/Fast-F1 | cf0cb20fdd3e89fdee3755097722db5ced3a23b5 | [
"MIT"
] | 68 | 2020-07-21T23:21:29.000Z | 2022-03-30T16:12:01.000Z | .. automodule:: fastf1
:members:
:undoc-members:
:show-inheritance:
| 11.714286 | 22 | 0.609756 |
58fc331acc2b3f564dc73bb8c039c17b9b4720f2 | 107 | rst | reStructuredText | doc/fluid/api/paddle/equal_all.rst | shiyutang/docs | b05612213a08daf9f225abce08fc42f924ef51ad | [
"Apache-2.0"
] | 104 | 2018-09-04T08:16:05.000Z | 2021-05-06T20:45:26.000Z | doc/fluid/api/paddle/equal_all.rst | shiyutang/docs | b05612213a08daf9f225abce08fc42f924ef51ad | [
"Apache-2.0"
] | 1,582 | 2018-06-25T06:14:11.000Z | 2021-05-14T16:00:43.000Z | doc/fluid/api/paddle/equal_all.rst | shiyutang/docs | b05612213a08daf9f225abce08fc42f924ef51ad | [
"Apache-2.0"
] | 387 | 2018-06-20T07:42:32.000Z | 2021-05-14T08:35:28.000Z | .. _api_paddle_equal_all
equal_all
-------------------------------
:doc_source: paddle.tensor.equal_all
| 13.375 | 36 | 0.570093 |
0d391f98ce029dd12796f759faad32ac37dfb14c | 7,881 | rst | reStructuredText | source/tutorials/reliable_forwarding.rst | qu0zl/rsyslog-doc | 50b67b95259edcbae4e81c1069e7bb56a89743d6 | [
"Apache-2.0"
] | 1 | 2020-01-08T08:48:59.000Z | 2020-01-08T08:48:59.000Z | source/tutorials/reliable_forwarding.rst | qu0zl/rsyslog-doc | 50b67b95259edcbae4e81c1069e7bb56a89743d6 | [
"Apache-2.0"
] | null | null | null | source/tutorials/reliable_forwarding.rst | qu0zl/rsyslog-doc | 50b67b95259edcbae4e81c1069e7bb56a89743d6 | [
"Apache-2.0"
] | null | null | null | Reliable Forwarding of syslog Messages with Rsyslog
===================================================
*Written by* `Rainer Gerhards <https://rainer.gerhards.net/>`_
*(2008-06-27)*
Abstract
--------
**In this paper, I describe how to forward**
`syslog <http://www.monitorware.com/en/topics/syslog/>`_ **messages
(quite) reliably to a central rsyslog server.** This depends on rsyslog
being installed on the client system and it is recommended to have it
installed on the server system. Please note that industry-standard
`plain TCP syslog protocol is not fully
reliable <https://rainer.gerhards.net/2008/04/on-unreliability-of-plain-tcp-syslog.html>`_
(thus the "quite reliable"). If you need a truly reliable solution, you
need to look into RELP (natively supported by rsyslog).
The Intention
-------------
Whenever two systems talk over a network, something can go wrong. For
example, the communications link may go down, or a client or server may
abort. Even in regular cases, the server may be offline for a short
period of time because of routine maintenance.
A logging system should be capable of avoiding message loss in
situations where the server is not reachable. To do so, unsent data
needs to be buffered at the client while the server is offline. Then,
once the server is up again, this data is to be sent.
This can easily be accomplished by rsyslog. In rsyslog, every action runs
on its own queue and each queue can be set to buffer data if the action
is not ready. Of course, you must be able to detect that "the action is
not ready", which means the remote server is offline. This can be
detected with plain TCP syslog and RELP, but not with UDP. So you need
to use either of the two. In this howto, we use plain TCP syslog.
Please note that we are using rsyslog-specific features. The are
required on the client, but not on the server. So the client system must
run rsyslog (at least version 3.12.0), while on the server another
syslogd may be running, as long as it supports plain tcp syslog.
**The rsyslog queueing subsystem tries to buffer to memory. So even if
the remote server goes offline, no disk file is generated.** File on
disk are created only if there is need to, for example if rsyslog runs
out of (configured) memory queue space or needs to shutdown (and thus
persist yet unsent messages). Using main memory and going to the disk
when needed is a huge performance benefit. You do not need to care about
it, because, all of it is handled automatically and transparently by
rsyslog.
How To Setup
------------
First, you need to create a working directory for rsyslog. This is where
it stores its queue files (should need arise). You may use any location
on your local system.
Next, you need to instruct rsyslog to use a disk queue and then
configure your action. There is nothing else to do. With the following
simple config file, you forward anything you receive to a remote server
and have buffering applied automatically when it goes down. This must be
done on the client machine.
.. code-block:: linux-config
$ModLoad imuxsock # local message reception
$WorkDirectory /rsyslog/work # default location for work (spool) files
$ActionQueueType LinkedList # use asynchronous processing
$ActionQueueFileName srvrfwd # set file name, also enables disk mode
$ActionResumeRetryCount -1 # infinite retries on insert failure
$ActionQueueSaveOnShutdown on # save in-memory data if rsyslog shuts down
*.* @@server:port
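On the wire, each message forwarded by the ``@@`` (TCP) action is an ordinary syslog line whose priority value is encoded as ``facility * 8 + severity``, terminated by a line feed in RFC 6587's non-transparent framing. The sketch below (hypothetical helpers, not part of rsyslog; real deployments should let rsyslog do this) shows what such a frame looks like:

```python
def syslog_pri(facility: int, severity: int) -> int:
    # e.g. facility user(1), severity notice(5) -> PRI 13
    return facility * 8 + severity

def frame_message(facility: int, severity: int, text: str) -> bytes:
    # Non-transparent framing: "<PRI>text" terminated by LF (RFC 6587).
    return ('<%d>%s\n' % (syslog_pri(facility, severity), text)).encode()

frame = frame_message(1, 5, 'myhost myapp: hello')
# frame == b'<13>myhost myapp: hello\n'
```

Seeing the raw frame helps when sniffing traffic to verify that the client really resumed sending after the server came back.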
The port given above is optional. It may not be specified, in which case
you only provide the server name. The "$ActionQueueFileName" is used to
create queue files, should need arise. This value must be unique inside
rsyslog.conf. No two rules may use the same queue file. Also, for
obvious reasons, it must only contain those characters that can be used
inside a valid file name. Rsyslog possibly adds some characters in front
and/or at the end of that name when it creates files. So that name
should not be at the file size name length limit (which should not be a
problem these days).
Please note that actual spool files are only created if the remote
server is down **and** there is no more space in the in-memory queue. By
default, a short failure of the remote server will never result in the
creation of a disk file as a couple of hundred messages can be held in
memory by default. [These parameters can be fine-tuned. However, then
you need to either fully understand how the queue works (`read elaborate
doc <http://www.rsyslog.com/doc-queues.html>`_) or use `professional
services <http://www.rsyslog.com/professional-services/>`_ to
have it done based on your specs ;) - what that means is that
fine-tuning queue parameters is far from being trivial...]
If you would like to test if your buffering scenario works, you need to
stop, wait a while and restart you central server. Do **not** watch for
files being created, as this usually does not happen and never happens
immediately.
Forwarding to More than One Server
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you have more than one server you would like to forward to, that's
quickly done. Rsyslog has no limit on the number or type of actions, so
you can define as many targets as you like. What is important to know,
however, is that the full set of directives make up an action. So you
can not simply add (just) a second forwarding rule, but need to
duplicate the rule configuration as well. Be careful that you use
different queue file names for the second action, else you will mess up
your system.
A sample for forwarding to two hosts looks like this:
.. code-block:: linux-config
$ModLoad imuxsock # local message reception
$WorkDirectory /rsyslog/work # default location for work (spool) files
# start forwarding rule 1
$ActionQueueType LinkedList # use asynchronous processing
$ActionQueueFileName srvrfwd1 # set file name, also enables disk mode
$ActionResumeRetryCount -1 # infinite retries on insert failure
$ActionQueueSaveOnShutdown on # save in-memory data if rsyslog shuts down
*.* @@server1:port
# end forwarding rule 1
# start forwarding rule 2
$ActionQueueType LinkedList # use asynchronous processing
$ActionQueueFileName srvrfwd2 # set file name, also enables disk mode
$ActionResumeRetryCount -1 # infinite retries on insert failure
$ActionQueueSaveOnShutdown on # save in-memory data if rsyslog shuts down
*.* @@server2
# end forwarding rule 2
Note the queue file names used: for the first rule it is "srvrfwd1" and for the
second it is "srvrfwd2". I have used a server without port name in the
second forwarding rule. This was just to illustrate how this can be
done. You can also specify a port there (or drop the port from server1).
When there are multiple action queues, they all work independently.
Thus, if server1 goes down, server2 still receives data in real-time.
The client will **not** block and wait for server1 to come back online.
Similarly, server1's operation will not be affected by server2's state.
Some Final Words on Reliability ...
-----------------------------------
Using plain TCP syslog provides a lot of reliability over UDP syslog.
However, plain TCP syslog is **not** a fully reliable transport. In
order to get full reliability, you need to use the RELP protocol.
Follow the next link to learn more about `the problems you may encounter
with plain tcp
syslog <https://rainer.gerhards.net/2008/04/on-unreliability-of-plain-tcp-syslog.html>`_.
Feedback requested
~~~~~~~~~~~~~~~~~~
I would appreciate feedback on this tutorial. If you have additional
ideas, comments or find bugs (I \*do\* bugs - no way... ;)), please `let
me know <mailto:rgerhards@adiscon.com>`_.
Revision History
----------------
- 2008-06-27 \* `Rainer Gerhards <https://rainer.gerhards.net/>`_ \*
Initial Version created
| 45.819767 | 90 | 0.752062 |
ede52272e3d04758e9a1cc734fc115628b3d35bd | 499 | rst | reStructuredText | docs/source/tutorial/zh/installation.rst | SimpleButNotNaive/EduData | 072056261930e110d0b9d840ea707c92e49b1d31 | [
"Apache-2.0"
] | 98 | 2019-07-05T03:27:36.000Z | 2022-03-30T08:38:09.000Z | docs/source/tutorial/zh/installation.rst | chenqiyuan1012/EduData | da9415aa7c19a53a604d46f24e4912d6334b7ed3 | [
"Apache-2.0"
] | 45 | 2020-12-25T03:49:43.000Z | 2021-11-26T09:45:42.000Z | docs/source/tutorial/zh/installation.rst | chenqiyuan1012/EduData | da9415aa7c19a53a604d46f24e4912d6334b7ed3 | [
"Apache-2.0"
] | 50 | 2019-08-17T05:11:15.000Z | 2022-03-29T07:54:13.000Z | 安装与帮助
=====================
从源码安装
----------------
从 ``github`` 上 ``clone`` 后进入文件夹
.. code-block:: console
$ pip install -e .
从 ``pypi`` 安装
----------------
.. code-block:: console
$ pip install EduData
命令行格式
-------------------
.. code-block:: console
$ edudata $subcommand $parameters1 $parameters2
.. _安装:
查看所有命令的帮助文档
----------------
.. code-block:: console
$ edudata -- --help
查看命令 ``subcommmand`` 的帮助
----------------
.. code-block:: console
$ edudata $subcommand --help
| 11.880952 | 48 | 0.50501 |
50ccbc4b03c4afaea2a5d1dc7ee8875c608cb79f | 3,518 | rst | reStructuredText | CHANGELOG.rst | yaoguai/sanzang | de562c8fa694c111125e2bc97a9b01ccbcb54437 | [
"MIT"
] | 5 | 2015-03-24T07:40:09.000Z | 2019-05-14T11:57:44.000Z | CHANGELOG.rst | yaoguai/sanzang-utils | de562c8fa694c111125e2bc97a9b01ccbcb54437 | [
"MIT"
] | null | null | null | CHANGELOG.rst | yaoguai/sanzang-utils | de562c8fa694c111125e2bc97a9b01ccbcb54437 | [
"MIT"
] | 1 | 2015-11-30T04:29:59.000Z | 2015-11-30T04:29:59.000Z | Change Log
==========
1.3.3 (2016-01-??)
------------------
* Added more helpful and verbose output to szu-ed.
* Removed table format checks from all tools except szu-ed.
1.3.2 (2015-04-01)
------------------
* Reimplemented core algorithm for szu-r (cleaner and faster).
* Added missing delimiter in buffering algorithm for szu-r.
* Added a bit of missing Unicode normalization to szu-r.
* Edited writing message for szu-ed to include the line count.
1.3.1 (2015-03-24)
------------------
* Fixed input_lines support in szu_ed.py so editing terminates normally.
* Added a "file written" message to szu-ed. File writing is important...
1.3.0 (2015-03-11)
------------------
* Converted all programs into Python modules that can be imported.
* Refactored modules for more convenient use in calling code.
* Converted Makefile install / uninstall to use pip3.
* New directory layout (but installation to the same directories).
* Updated documentation for new installation procedure.
1.2.3 (2015-02-16)
------------------
* More thorough Unicode normalization for the table editor (szu-ed).
1.2.2 (2015-02-02)
------------------
* Added basic Unicode normalization for safety and compatibility.
1.2.1 (2015-01-06)
------------------
* Fixed formatting for an example in the szu-t manual page.
* Documentation updates for README.
1.2.0 (2015-01-05)
------------------
* szu-ss: read and process one line at a time if stdin is a TTY.
* szu-t: read and process one line at a time if stdin is a TTY.
* szu-t: at EOF, do not translate the buffer if it is an empty string.
1.1.2 (2015-01-03)
------------------
* szu-ss updated with major performance improvements (~200-300%).
* szu-ss "verbose" option fixed to function correctly.
* Verbose modes now preserve the original stack trace for debugging.
1.1.1 (2014-11-22)
------------------
* szu-ss fixed to show usage if there are too few arguments.
* Tutorial updated to use new listing notation.
1.1.0 (2014-10-31)
------------------
* Added support for input files as positional arguments to commands.
* Changed szu-t list notation to be more compact to aid readability.
* Added a makefile note about setting parameters for BSD, Solaris, etc.
* Programs updated to close a table file immediately after reading it.
1.0.5 (2014-10-02)
------------------
* Makefile dist target now just builds a Python dist.
* Removed superfluous exception handling.
* Updated source code according to pep8 and pep257.
* Documentation fixes and updates.
1.0.4 (2014-09-09)
------------------
* Updated programs for proper universal newline support.
* Fixed makefile logic bug (documentation directory removal).
1.0.3 (2014-08-23)
------------------
* Translation table fields have surrounding whitespace stripped.
* All spaces and tabs will not be removed from table fields.
* Fixed bug in szu-ss so string matching works correctly.
* Minor documentation fixes.
1.0.2 (2014-08-15)
------------------
* Updated szu-ed to print to stderr for any common exceptions.
* Added missing option description for szu-ss.
* Documentation and build system updates and fixes.
1.0.1 (2014-08-11)
------------------
* Tutorial updated to HTML5.
* Documentation copyedits and formatting.
* Added MANIFEST.in to include makefile in the Python package.
* Fixed minor encoding compatibility issues with UTF-8 BOMs.
* Improved szu-ss table-loading code to be more robust.
* Overhauled szu-ss to use buffering -- much more efficient.
1.0.0 (2014-08-10)
------------------
* Initial commit into git.
| 34.15534 | 72 | 0.686754 |
4f826bd7f4fa6e5bbfde0df240ee65e0f10cad7b | 1,365 | rst | reStructuredText | doc/matlab/generic-syntax.rst | mdiazmel/keops | 52a3d2ee80a720639f52898305f85399b7b45a63 | [
"MIT"
] | 695 | 2019-04-29T10:20:55.000Z | 2022-03-31T13:07:24.000Z | doc/matlab/generic-syntax.rst | mdiazmel/keops | 52a3d2ee80a720639f52898305f85399b7b45a63 | [
"MIT"
] | 213 | 2019-04-18T09:24:39.000Z | 2022-03-31T14:27:12.000Z | doc/matlab/generic-syntax.rst | mdiazmel/keops | 52a3d2ee80a720639f52898305f85399b7b45a63 | [
"MIT"
] | 52 | 2019-04-18T09:18:08.000Z | 2022-03-27T01:48:33.000Z | Matlab API
==========
The example described below is implemented in the Matlab script `script_GenericSyntax.m <https://github.com/getkeops/keops/blob/master/keopslab/examples/script_GenericSyntax.m>`_ located in the directory ``keopslab/examples``.
The Matlab bindings provide a function `keops_kernel <https://github.com/getkeops/keops/blob/master/keopslab/generic/keops_kernel.m>`_ which can be used to define the corresponding convolution operations. Following the previous example, one may write
.. code-block:: matlab
f = keops_kernel('Square(p-a)*Exp(x+y)','p=Pm(1)','a=Vj(1)','x=Vi(3)','y=Vj(3)');
which defines a Matlab function handle ``f`` that can be used to perform a sum reduction for this formula:
.. code-block:: matlab
c = f(p,a,x,y);
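Semantically, with ``Vi``/``Vj`` variables this sum reduction computes ``c(i) = sum_j Square(p - a(j)) * Exp(x(i) + y(j))`` componentwise over the 3 dimensions. A pure-Python reference sketch of that semantics (for illustration only; KeOps evaluates the formula with compiled CUDA/C++ code, and the function name here is hypothetical):

```python
import math

def reference_reduction(p, a, x, y):
    # p: scalar parameter (Pm(1)), a: list of scalars indexed by j (Vj(1)),
    # x: list of 3-vectors indexed by i (Vi(3)),
    # y: list of 3-vectors indexed by j (Vj(3)).
    out = []
    for xi in x:
        acc = [0.0, 0.0, 0.0]
        for aj, yj in zip(a, y):
            w = (p - aj) ** 2                 # Square(p - a)
            for k in range(3):
                acc[k] += w * math.exp(xi[k] + yj[k])   # * Exp(x + y)
        out.append(acc)
    return out

c = reference_reduction(2.0, [1.0], [[0.0, 0.0, 0.0]], [[0.0, 0.0, 0.0]])
# c == [[1.0, 1.0, 1.0]]   since (2-1)^2 * exp(0) == 1
```

This quadratic double loop is exactly what KeOps avoids materializing, which is why it scales to large ``i``/``j`` ranges.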
where ``p``, ``a``, ``x``, ``y`` must be arrays with compatible dimensions as previously explained. A gradient function `keops_grad <https://github.com/getkeops/keops/blob/master/keopslab/generic/keops_grad.m>`_ is also provided. For example, to get the gradient with respect to ``y`` of the previously defined function ``f``, one needs to write:
.. code-block:: matlab
Gfy = keops_grad(f, 'y');
which returns a new function that can be used as follows:
.. code-block:: matlab
Gfy(p, a, x, y, e)
where ``e`` is the input gradient array (here of type ``Vi(3)``).
| 42.65625 | 346 | 0.70989 |
819c9054b3a4cbcc6d699ccfdad86e1fda8c33b8 | 414 | rst | reStructuredText | CHANGELOG.rst | ArthurFDLR/beancount-n26 | c259d107b05b9bf563c527accce71ea9999da146 | [
"MIT"
] | null | null | null | CHANGELOG.rst | ArthurFDLR/beancount-n26 | c259d107b05b9bf563c527accce71ea9999da146 | [
"MIT"
] | null | null | null | CHANGELOG.rst | ArthurFDLR/beancount-n26 | c259d107b05b9bf563c527accce71ea9999da146 | [
"MIT"
] | 1 | 2020-12-04T21:03:42.000Z | 2020-12-04T21:03:42.000Z | CHANGELOG
=========
v0.3.1 (2020-05-12)
-------------------
- Add optional parameter :code:`existing_entries` to :code:`extract()` (thanks `@tbm`_)
v0.3.0 (2020-05-10)
-------------------
- Add support for Python 3.8
v0.2.0 (2019-10-22)
-------------------
- Support multiple languages (starting with English and German)
- Add support for Python 3.5
v0.1.0 (2019-10-21)
-------------------
- First release

.. ecm-module:: ../../find-modules/FindGradle.cmake

strahlenschutz package
======================

Subpackages
-----------

.. toctree::
   :maxdepth: 4

   strahlenschutz.api
   strahlenschutz.apis
   strahlenschutz.model
   strahlenschutz.models

Submodules
----------

strahlenschutz.api\_client module
---------------------------------

.. automodule:: strahlenschutz.api_client
   :members:
   :undoc-members:
   :show-inheritance:

strahlenschutz.configuration module
-----------------------------------

.. automodule:: strahlenschutz.configuration
   :members:
   :undoc-members:
   :show-inheritance:

strahlenschutz.exceptions module
--------------------------------

.. automodule:: strahlenschutz.exceptions
   :members:
   :undoc-members:
   :show-inheritance:

strahlenschutz.model\_utils module
----------------------------------

.. automodule:: strahlenschutz.model_utils
   :members:
   :undoc-members:
   :show-inheritance:

strahlenschutz.rest module
--------------------------

.. automodule:: strahlenschutz.rest
   :members:
   :undoc-members:
   :show-inheritance:

Module contents
---------------

.. automodule:: strahlenschutz
   :members:
   :undoc-members:
   :show-inheritance:

Interactive
===========

About
-----

This application is a Python interpreter!

When the application starts it tries to run the script ``main.py``
from the file system. After the script ends, the `Python` interactive
interpreter is started.

The serial port baudrate is 38400.
Example script
--------------

Here is an example of how to write a script ``main.py`` using the
interpreter.

1. Start the serial monitor.

2. Create ``main.py`` and write ``print('Hello World!\n')`` to
   it. This file will be executed every time the board starts.

   .. code-block:: text

      MicroPython v1.8.3-88-gf98bb2d on 2016-09-17; Arduino Due with SAM3X8E
      Type "help()" for more information.
      >>> with open("main.py", "w") as f:
      ... f.write("print('Hello World!\n')")
      >>>
3. Restart the board and you'll see ``Hello World!`` on the screen!

   .. code-block:: text

      Hello World!
      MicroPython v1.8.3-88-gf98bb2d on 2016-09-17; Arduino Due with SAM3X8E
      Type "help()" for more information.
      >>>

4. Done!
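The write-then-reboot flow in step 2 can be reproduced from any Python prompt. Here is a small self-contained sketch of the same write-then-verify sequence (the scratch directory is only so the example does not clobber a real ``main.py``; on the board you would write directly to the file system):

```python
import os
import tempfile

# Use a scratch directory so this sketch does not clobber a real main.py.
workdir = tempfile.mkdtemp()
script_path = os.path.join(workdir, "main.py")

# The same write the transcript performs at the MicroPython prompt.
with open(script_path, "w") as f:
    f.write("print('Hello World!')\n")

# Read the script back: this is what the board executes on the next boot.
with open(script_path) as f:
    contents = f.read()

print(contents, end="")  # print('Hello World!')
```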

The example can be found on Github in the
:github-tree:`examples/interactive` folder.

Build and run
-------------

Build and run the application.

.. code-block:: text

   $ cd examples/interactive
   $ make -s BOARD=arduino_due run
   ...
   MicroPython v1.8.3-88-gf98bb2d on 2016-09-17; Arduino Due with SAM3X8E
   Type "help()" for more information.
   >>>

AC254\_150\_A
=============

.. currentmodule:: raytracing.thorlabs

.. autoclass:: AC254_150_A
   :no-members:
   :no-undoc-members:
   :show-inheritance:

.. rubric:: Methods

.. autosummary::
   :template: autoFunction.rst
   :toctree: methods/AC254_150_A

   ~AC254_150_A.__init__

.. rubric:: Attributes

.. autosummary::

   ~AC254_150_A.determinant
   ~AC254_150_A.hasPower
   ~AC254_150_A.isImaging

tatk.policy package
===================

Submodules
----------

tatk.policy.policy module
-------------------------

.. automodule:: tatk.policy.policy
   :members:
   :undoc-members:
   :show-inheritance:

=======
Credits
=======

Maintainer
----------

* Cody L. Johnson <cody.l.johnson@erdc.dren.mil>

Contributors
------------

None yet. Why not be the first? See: CONTRIBUTING.rst

================
HTTP Server mock
================

The ``Jfalque\HttpMock\Server`` class provides a fluent API to create a simple HTTP server mock:

.. code-block:: php

    <?php

    use Jfalque\HttpMock\Server;

    $server = (new Server())
        ->whenUri('http://foo')
            ->return($foo = new Response())
        ->end()
        ->whenUri('http://bar')
            ->return($bar = new Response())
        ->end()
    ;

    $response = $server->handle(new Request('http://foo')); // $foo
    $response = $server->handle(new Request('http://bar')); // $bar
    $response = $server->handle(new Request('http://baz')); // null

The server works by defining layers with predicates. Predicates are functions that return a boolean depending on whether
a request matches some criteria and are used to determine the matching response. When handling a request, a layer passes
it to its predicates and if all predicates match (return ``true``), the response will be:

1. the response returned by sublayers, if any matches;
2. the response of the current layer, if defined;
3. ``null``.

If any predicate does not match (returns ``false``), the current layer does not return a response and the matching
process continues with subsequent layers.
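To make that resolution order concrete, here is a small illustrative sketch of the same logic in Python (the library itself is PHP; the ``Layer`` class and its method names below are invented for this sketch and are not part of the library's API):

```python
class Layer:
    """One matching layer: predicates, an optional response, and sublayers."""

    def __init__(self, predicates, response=None, sublayers=None):
        self.predicates = predicates   # request -> bool functions
        self.response = response       # response of this layer, if defined
        self.sublayers = sublayers or []

    def handle(self, request):
        # If any predicate returns False, this layer yields no response.
        if not all(predicate(request) for predicate in self.predicates):
            return None
        # 1. the response returned by sublayers, if any matches;
        for sublayer in self.sublayers:
            response = sublayer.handle(request)
            if response is not None:
                return response
        # 2. the response of the current layer, if defined; 3. None.
        return self.response


server = Layer(predicates=[], sublayers=[
    Layer([lambda uri: uri == "http://foo"], response="foo response"),
    Layer([lambda uri: uri == "http://bar"], response="bar response"),
])

print(server.handle("http://foo"))  # foo response
print(server.handle("http://baz"))  # None
```

Note how a layer whose predicates all pass still defers to its sublayers first, which is what makes nested ``whenUri(...)`` scopes override their enclosing layer.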

See the `API documentation <api.rst>`_ for more details.

:github_url: hide
.. Generated automatically by doc/tools/makerst.py in Godot's source tree.
.. DO NOT EDIT THIS FILE, but the AudioStreamPlayer2D.xml source instead.
.. The source is found in doc/classes or modules/<name>/doc_classes.
.. _class_AudioStreamPlayer2D:
AudioStreamPlayer2D
===================
**Inherits:** :ref:`Node2D<class_Node2D>` **<** :ref:`CanvasItem<class_CanvasItem>` **<** :ref:`Node<class_Node>` **<** :ref:`Object<class_Object>`
Plays audio in 2D.
Description
-----------
Plays audio that dampens with distance from screen center.
Tutorials
---------
- :doc:`../tutorials/audio/audio_streams`
Properties
----------
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`int<class_int>` | :ref:`area_mask<class_AudioStreamPlayer2D_property_area_mask>` | ``1`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`float<class_float>` | :ref:`attenuation<class_AudioStreamPlayer2D_property_attenuation>` | ``1.0`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`bool<class_bool>` | :ref:`autoplay<class_AudioStreamPlayer2D_property_autoplay>` | ``false`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`String<class_String>` | :ref:`bus<class_AudioStreamPlayer2D_property_bus>` | ``"Master"`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`float<class_float>` | :ref:`max_distance<class_AudioStreamPlayer2D_property_max_distance>` | ``2000.0`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`float<class_float>` | :ref:`pitch_scale<class_AudioStreamPlayer2D_property_pitch_scale>` | ``1.0`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`bool<class_bool>` | :ref:`playing<class_AudioStreamPlayer2D_property_playing>` | ``false`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`AudioStream<class_AudioStream>` | :ref:`stream<class_AudioStreamPlayer2D_property_stream>` | |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`bool<class_bool>` | :ref:`stream_paused<class_AudioStreamPlayer2D_property_stream_paused>` | ``false`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
| :ref:`float<class_float>` | :ref:`volume_db<class_AudioStreamPlayer2D_property_volume_db>` | ``0.0`` |
+---------------------------------------+------------------------------------------------------------------------+--------------+
Methods
-------
+-------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
| :ref:`float<class_float>` | :ref:`get_playback_position<class_AudioStreamPlayer2D_method_get_playback_position>` **(** **)** |
+-------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
| :ref:`AudioStreamPlayback<class_AudioStreamPlayback>` | :ref:`get_stream_playback<class_AudioStreamPlayer2D_method_get_stream_playback>` **(** **)** |
+-------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
| void | :ref:`play<class_AudioStreamPlayer2D_method_play>` **(** :ref:`float<class_float>` from_position=0.0 **)** |
+-------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
| void | :ref:`seek<class_AudioStreamPlayer2D_method_seek>` **(** :ref:`float<class_float>` to_position **)** |
+-------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
| void | :ref:`stop<class_AudioStreamPlayer2D_method_stop>` **(** **)** |
+-------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
Signals
-------
.. _class_AudioStreamPlayer2D_signal_finished:
- **finished** **(** **)**
Emitted when the audio stops playing.
Property Descriptions
---------------------
.. _class_AudioStreamPlayer2D_property_area_mask:
- :ref:`int<class_int>` **area_mask**
+-----------+----------------------+
| *Default* | ``1`` |
+-----------+----------------------+
| *Setter* | set_area_mask(value) |
+-----------+----------------------+
| *Getter* | get_area_mask() |
+-----------+----------------------+
Areas in which this sound plays.
----
.. _class_AudioStreamPlayer2D_property_attenuation:
- :ref:`float<class_float>` **attenuation**
+-----------+------------------------+
| *Default* | ``1.0`` |
+-----------+------------------------+
| *Setter* | set_attenuation(value) |
+-----------+------------------------+
| *Getter* | get_attenuation() |
+-----------+------------------------+
Dampens audio over distance with this as an exponent.
----
.. _class_AudioStreamPlayer2D_property_autoplay:
- :ref:`bool<class_bool>` **autoplay**
+-----------+-----------------------+
| *Default* | ``false`` |
+-----------+-----------------------+
| *Setter* | set_autoplay(value) |
+-----------+-----------------------+
| *Getter* | is_autoplay_enabled() |
+-----------+-----------------------+
If ``true``, audio plays when added to scene tree.
----
.. _class_AudioStreamPlayer2D_property_bus:
- :ref:`String<class_String>` **bus**
+-----------+----------------+
| *Default* | ``"Master"`` |
+-----------+----------------+
| *Setter* | set_bus(value) |
+-----------+----------------+
| *Getter* | get_bus() |
+-----------+----------------+
Bus on which this audio is playing.
----
.. _class_AudioStreamPlayer2D_property_max_distance:
- :ref:`float<class_float>` **max_distance**
+-----------+-------------------------+
| *Default* | ``2000.0`` |
+-----------+-------------------------+
| *Setter* | set_max_distance(value) |
+-----------+-------------------------+
| *Getter* | get_max_distance() |
+-----------+-------------------------+
Maximum distance from which audio is still hearable.
----
.. _class_AudioStreamPlayer2D_property_pitch_scale:
- :ref:`float<class_float>` **pitch_scale**
+-----------+------------------------+
| *Default* | ``1.0`` |
+-----------+------------------------+
| *Setter* | set_pitch_scale(value) |
+-----------+------------------------+
| *Getter* | get_pitch_scale() |
+-----------+------------------------+
Changes the pitch and the tempo of the audio.
----
.. _class_AudioStreamPlayer2D_property_playing:
- :ref:`bool<class_bool>` **playing**
+-----------+--------------+
| *Default* | ``false`` |
+-----------+--------------+
| *Getter* | is_playing() |
+-----------+--------------+
If ``true``, audio is playing.
----
.. _class_AudioStreamPlayer2D_property_stream:
- :ref:`AudioStream<class_AudioStream>` **stream**
+----------+-------------------+
| *Setter* | set_stream(value) |
+----------+-------------------+
| *Getter* | get_stream() |
+----------+-------------------+
The :ref:`AudioStream<class_AudioStream>` object to be played.
----
.. _class_AudioStreamPlayer2D_property_stream_paused:
- :ref:`bool<class_bool>` **stream_paused**
+-----------+--------------------------+
| *Default* | ``false`` |
+-----------+--------------------------+
| *Setter* | set_stream_paused(value) |
+-----------+--------------------------+
| *Getter* | get_stream_paused() |
+-----------+--------------------------+
If ``true``, the playback is paused. You can resume it by setting ``stream_paused`` to ``false``.
----
.. _class_AudioStreamPlayer2D_property_volume_db:
- :ref:`float<class_float>` **volume_db**
+-----------+----------------------+
| *Default* | ``0.0`` |
+-----------+----------------------+
| *Setter* | set_volume_db(value) |
+-----------+----------------------+
| *Getter* | get_volume_db() |
+-----------+----------------------+
Base volume without dampening.
Method Descriptions
-------------------
.. _class_AudioStreamPlayer2D_method_get_playback_position:
- :ref:`float<class_float>` **get_playback_position** **(** **)**
Returns the position in the :ref:`AudioStream<class_AudioStream>`.
----
.. _class_AudioStreamPlayer2D_method_get_stream_playback:
- :ref:`AudioStreamPlayback<class_AudioStreamPlayback>` **get_stream_playback** **(** **)**
Returns the :ref:`AudioStreamPlayback<class_AudioStreamPlayback>` object associated with this ``AudioStreamPlayer2D``.
----
.. _class_AudioStreamPlayer2D_method_play:
- void **play** **(** :ref:`float<class_float>` from_position=0.0 **)**
Plays the audio from the given position ``from_position``, in seconds.
----
.. _class_AudioStreamPlayer2D_method_seek:
- void **seek** **(** :ref:`float<class_float>` to_position **)**
Sets the position from which audio will be played, in seconds.
----
.. _class_AudioStreamPlayer2D_method_stop:
- void **stop** **(** **)**
Stops the audio.

rpi-rf-gpiod
============

Introduction
------------

Python module for sending and receiving 433/315MHz LPD/SRD signals with generic low-cost GPIO RF modules on a Raspberry Pi.

Protocol and base logic ported from `rc-switch`_. The `libgiod-python`_ library is required to access the GPIO pins. Therefore, the GPIO character device is used instead of the old GPIO sysfs interface.

Supported hardware
------------------

Most generic 433/315MHz capable modules (cost: ~2€) connected via GPIO to a Raspberry Pi.

.. figure:: http://i.imgur.com/vG89UP9.jpg
   :alt: 433modules

Compatibility
-------------

Generic RF outlets and most 433/315MHz switches (cost: ~15€/3pcs).

.. figure:: http://i.imgur.com/WVRxvWe.jpg
   :alt: rfoutlet

Chipsets:

* SC5262 / SC5272
* HX2262 / HX2272
* PT2262 / PT2272
* EV1527 / RT1527 / FP1527 / HS1527

For a full list of compatible devices and chipsets see the `rc-switch Wiki`_

Dependencies
------------

`libgiod-python`_ (available through most package managers as :code:`python3-libgpiod`)

Installation
------------

On your Raspberry Pi, install the *rpi-rf-gpiod* module via pip.

Debian/Ubuntu::

    # apt-get install python3-pip python3-libgpiod

Fedora/CentOS::

    # dnf install python3-pip python3-libgpiod

With :code:`pip` installed::

    # pip3 install rpi-rf-gpiod

Wiring diagram (example)
------------------------

Raspberry Pi 1/2(B+)::

RPI GPIO HEADER
____________
| ____|__
| | | |
| 01| . x |02
| | . x__|________ RX
| | . x__|______ | ________
| | . . | | | | |
TX | ____|__x . | | |__|VCC |
_______ | | __|__x . | | | |
| | | | | | x____|______|____|DATA |
| GND|____|__| | | . . | | | |
| | | | | . . | | |DATA |
| VCC|____| | | . . | | | |
| | | | . . | |____|GND |
| DATA|_________| | . . | |________|
|_______| | . . |
| . . |
| . . |
| . . |
| . . |
| . . |
| . . |
39| . . |40
|_______|
TX:
GND > PIN 09 (GND)
VCC > PIN 02 (5V)
DATA > PIN 11 (GPIO17)
RX:
VCC > PIN 04 (5V)
DATA > PIN 13 (GPIO27)
GND > PIN 06 (GND)

Usage
-----

See `scripts`_ (`rpi-rf_send`_, `rpi-rf_receive`_) which are also shipped as cmdline tools.

Send:

:code:`rpi-rf_send [-h] [-g GPIO] [-p PULSELENGTH] [-t PROTOCOL] [-l LENGTH] [-r REPEAT] CODE`

Sends a decimal code via a 433/315MHz GPIO device

positional arguments:
  CODE            Decimal code to send

optional arguments:
  -h, --help      show this help message and exit
  -g GPIO         GPIO pin (Default: 17)
  -p PULSELENGTH  Pulselength (Default: 350)
  -t PROTOCOL     Protocol (Default: 1)
  -l LENGTH       Codelength (Default: 24)
  -r REPEAT       Repeat cycles (Default: 10)

Receive:

:code:`rpi-rf_receive [-h] [-g GPIO]`

Receives a decimal code via a 433/315MHz GPIO device

optional arguments:
  -h, --help  show this help message and exit
  -g GPIO     GPIO pin (Default: 27)
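As background for those defaults, the rc-switch "protocol 1" frame these tools transmit uses a 350 µs pulse unit: a ``1`` bit is 3 units high then 1 unit low, a ``0`` bit is 1 unit high then 3 units low, and each frame ends with a long 1-high/31-low sync gap. The resulting waveform can be sketched in Python — ``frame_timings`` is a made-up illustration helper, not part of this package's API:

```python
def frame_timings(code, length=24, pulselength=350):
    """Model one rc-switch protocol-1 frame as (level, microseconds) pairs."""
    one, zero, sync = (3, 1), (1, 3), (1, 31)   # (high units, low units)
    timings = []
    for bit in format(code, "0{}b".format(length)):
        high, low = one if bit == "1" else zero
        timings.append((1, high * pulselength))  # carrier on
        timings.append((0, low * pulselength))   # carrier off
    # Sync gap terminating the frame; receivers lock onto this long low.
    timings.append((1, sync[0] * pulselength))
    timings.append((0, sync[1] * pulselength))
    return timings

frame = frame_timings(42)
print(len(frame))   # 24 bits * 2 edges + 2 sync edges = 50
print(frame[-1])    # (0, 10850) -- the 31-unit sync low
```

Increasing ``-p`` stretches every one of these timings proportionally, and ``-r`` simply repeats the whole frame.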

Open Source
-----------

* The code is licensed under the `BSD Licence`_
* The project is forked from the GPIO sysfs interface implementation of milaq_
* The project source code is hosted on `GitHub`_
* Please use `GitHub issues`_ to submit bugs and report issues
.. _rc-switch: https://github.com/sui77/rc-switch
.. _rc-switch Wiki: https://github.com/sui77/rc-switch/wiki
.. _BSD Licence: http://www.linfo.org/bsdlicense.html
.. _milaq: https://github.com/milaq/rpi-rf
.. _GitHub: https://github.com/aoertel/rpi-rf-gpiod
.. _GitHub issues: https://github.com/aoertel/rpi-rf-gpiod/issues
.. _scripts: https://github.com/aoertel/rpi-rf-gpiod/blob/master/scripts
.. _rpi-rf_send: https://github.com/aoertel/rpi-rf-gpiod/blob/master/scripts/rpi-rf_send
.. _rpi-rf_receive: https://github.com/aoertel/rpi-rf-gpiod/blob/master/scripts/rpi-rf_receive
.. _libgiod-python: https://git.kernel.org/pub/scm/libs/libgpiod/libgpiod.git/

**To delete a user profile**

The following ``delete-user-profile`` example deletes the user profile for the user with the specified ARN. ::

    aws codestar delete-user-profile \
        --user-arn arn:aws:iam::123456789012:user/intern

Output::

    {
        "userArn": "arn:aws:iam::123456789012:user/intern"
    }
Writing Test Functions
======================

Runtime Discovery
-----------------

Test functions are discovered at runtime using Reflection. The
NovaProva library walks through all the functions linked into the test
executable and matches those which take no arguments, return ``void``, and
have a name matching one of the following patterns:

* ``test_foo``
* ``testFoo``
* ``TestFoo``

Here's an example of a test function.

.. highlight:: c

::

#include <np.h>
static void test_simple(void)
{
int r = myatoi("42");
NP_ASSERT_EQUAL(r, 42);
}

Note that you do not need to write any code to register this test
function with the framework. If it matches the above criteria, the
function will be found and recorded by NovaProva. Just write the
function and you're done.

The Test Tree
-------------

Most other test frameworks provide a simple, 2-level mechanism for
organizing tests; *tests* are grouped into *suites*.


By contrast, NovaProva organizes tests into a **tree of test nodes**.
All the tests built into a test executable are gathered at runtime
and are fitted into a tree, with a single common root. The root is
then pruned until the test names are as short as possible. Each test
function is a leaf node in this tree (usually).

The locations of tests in this tree are derived from the names of the
test function, the basename of the test source file containing the test
function, and the hierarchy of filesystem directories containing that
source file. These form a natural classifying scheme that you are
already controlling by choosing the names of filenames and functions.

These names are stuck together in order from least to most specific,
separated by ASCII '.' characters, and in general look like this.

.. highlight:: none

::

dir.subdir.more.subdirs.filename.function

Here's an example showing how test node names fall naturally out of
your test code organization.

.. highlight:: none

::

% cat tests/startrek/tng/federation/enterprise.c
static void test_torpedoes(void)
{
fprintf(stderr, "Testing photon torpedoes\n");
}
% cat tests/startrek/tng/klingons/neghvar.c
static void test_disruptors(void)
{
fprintf(stderr, "Testing disruptors\n");
}
% cat tests/starwars/episode4/rebels/xwing.c
static void test_lasers(void)
{
fprintf(stderr, "Testing laser cannon\n");
}
% ./testrunner --list
tests.startrek.tng.federation.enterprise.torpedoes
tests.startrek.tng.klingons.neghvar.disruptors
tests.starwars.episode4.rebels.xwing.lasers
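The name construction in the listing above can be sketched in a few lines of Python (illustrative only — this is not NovaProva's implementation, and it omits the final root-pruning step):

```python
def node_name(source_path, function):
    """Build a dotted test-node name from a source path and a function name."""
    parts = source_path.rsplit(".", 1)[0].split("/")  # drop ".c", split dirs
    leaf = function
    for prefix in ("test_", "test", "Test"):          # the discovery patterns
        if leaf.startswith(prefix):
            leaf = leaf[len(prefix):]
            break
    return ".".join(parts + [leaf])

name = node_name("tests/startrek/tng/federation/enterprise.c", "test_torpedoes")
print(name)  # tests.startrek.tng.federation.enterprise.torpedoes
```

Every directory, the source file basename, and the stripped function name each contribute one dotted component, which is why renaming a directory renames every test underneath it.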

Pass and Fail
-------------

A test passes in a very simple way: it returns without failing. A test
can fail in any number of ways, some of them obvious, all of them
indicative of a bug in the Code Under Test (or possibly the test
itself). See :ref:`assert_macros` and :doc:`failures` for full details.

Here's an example of a test which always passes.

.. highlight:: c

::

static void test_always_passes(void)
{
printf("Hi, I'm passing!\n");
}

A test can also use the ``NP_PASS`` macro, which terminates the test
immediately without recording a failure.

.. highlight:: c

::

static void test_also_always_passes(void)
{
printf("Hi, I'm passing too!\n");
NP_PASS; /* terminates the test */
printf("Now I'm celebrating passing!\n"); /* never happens */
}

Note that this does not necessarily mean the test will get a Pass
result, only that the test itself thinks it has passed. It is possible
that NovaProva will detect more subtle failures that the test itself
does not see; some of these failures are not even detectable until after
the test terminates. So, ``NP_PASS`` is really just a complicated
``return`` statement and you should probably never use it.

.. highlight:: c

::

static void test_thinks_it_passes(void)
{
void *x = malloc(24);
printf("Hi, I think I'm passing!\n");
NP_PASS; /* but it's wrong, it leaked memory */
}

A test can use the ``NP_FAIL`` macro, which terminates the test and
records a Fail result. Unlike ``NP_PASS``, if a test says it fails
then NovaProva believes it.

.. highlight:: c

::

static void test_always_fails(void)
{
printf("Hi, I'm failing\n");
NP_FAIL; /* terminates the test */
printf("Now I'm mourning my failure!\n"); /* never happens */
}

Note that NovaProva provides a number of declarative :ref:`assert_macros`
which are much more useful than using ``NP_FAIL`` inside a conditional
statement. Not only are they more concise, but if they cause a test
failure they provide a more useful error message which helps with
diagnosis. For example, this test code

.. highlight:: c

::

static void test_dont_do_it_this_way(void)
{
if (atoi("42") != 3)
NP_FAIL;
}
static void test_do_it_this_way_instead(void)
{
NP_ASSERT_EQUAL(atoi("42"), 3);
}

will generate the following error messages:

.. highlight:: none

::

% ./testrunner
np: running: "mytests.dont_do_it_this_way"
EVENT EXFAIL NP_FAIL called
FAIL mytests.dont_do_it_this_way
np: running: "mytests.do_it_this_way_instead"
EVENT ASSERT NP_ASSERT_NOT_EQUAL(atoi("42")=42, 3=3)
FAIL mytests.do_it_this_way_instead

NovaProva also supports a third test result, Not Applicable, which is
neither a Pass nor a Fail. A test which runs but decides that some
preconditions are not met can call the ``NP_NOTAPPLICABLE`` macro.
Such tests are not counted as either passes or failures; it's as if they
never existed.

.. _dependencies:

Dependencies
------------

Some unit test frameworks support a concept of test dependencies, i.e.
the framework knows that some tests should not be run until after some
other tests have been run. NovaProva does not support test
dependencies.

In the opinion of the author, test dependencies are a terrible idea.
They encourage a style of test writing where some tests are used to
generate external state (e.g. rows in a database) which is then used
as input to other tests. NovaProva is designed around a model where
each test is isolated, repeatable, and stateless. This means
that each test must trigger the same behaviour in the Code Under Test
and give the same result, regardless of which order tests were run,
or whether they were run in parallel, or whether any other tests
were run at all, or whether the test had been run before.

The philosophy here is that the purpose of tests is to find bugs
and to keep on finding bugs long after they're written.
If a test is run nightly, fails roughly once a month,
but nobody can figure out why, that test is useless.
So a good test is conceptually simple, easy to run, and easy to diagnose
when it fails. Deliberately sharing state between tests makes it
harder to achieve all these ideals.

If you find yourself writing a test and you want to save some time
by feeding the results of one test into another, please just stop and
think about what you're doing.

If the Code Under Test needs to be in a particular state before the test
can begin, you should consider it to be the job of the test to achieve
that state from an initial null state. You can use :doc:`fixtures` to
pull out common code which sets up such state so that you don't have to
repeat it in every test. You can also use coding techniques which allow
you to save and restore the state of the Code Under Test (e.g. a database
dump), and check the saved state into version control along with your
test code.

.. vim:set ft=rst:

Getting Started
===============

To install with pip:

::

    $ pip install chunkypipes

Before ChunkyPipes can function, it needs to be initialized with a call to ``chunky init``::

    $ chunky init
    > ChunkyPipes successfully initialized at /home/user

To install a pipeline, point ChunkyPipes to the python source file::

    $ chunky install /path/to/pipeline.py
    > Pipeline pipeline.py successfully installed

To configure a pipeline to run on the current platform, execute the configuration subcommand::

    $ chunky configure pipeline

To run the pipeline, execute the run subcommand::

    $ chunky run pipeline [options]
| 23.666667 | 94 | 0.72457 |
81caca31eca1fec9c77aff1a002cb0c782cc103b | 3,112 | rst | reStructuredText | docs/source/user_guide/mamba.rst | jjerphan/mamba | 4e047cca5888b1b40e0ccad3f42824305a0f4126 | [
"BSD-3-Clause"
] | 1 | 2021-08-23T02:49:07.000Z | 2021-08-23T02:49:07.000Z | docs/source/user_guide/mamba.rst | jjerphan/mamba | 4e047cca5888b1b40e0ccad3f42824305a0f4126 | [
"BSD-3-Clause"
] | 1 | 2021-08-23T02:48:56.000Z | 2021-08-23T02:48:56.000Z | docs/source/user_guide/mamba.rst | jjerphan/mamba | 4e047cca5888b1b40e0ccad3f42824305a0f4126 | [
"BSD-3-Clause"
] | null | null | null | .. _mamba:
Mamba
-----
``mamba`` is a CLI tool to manage ``conda`` environments.
If you already know ``conda``, great, you already know ``mamba``!
If you're new to this world, don't panic: you will find everything you need in this documentation. We recommend getting familiar with :ref:`concepts<concepts>` first.
Quickstart
==========
The ``mamba create`` command creates a new environment.
You can create an environment with the name ``nameofmyenv`` by calling:
.. code::
mamba create -n nameofmyenv <list of packages>
After this process has finished, you can _activate_ the virtual environment by calling ``conda activate <nameofmyenv>``.
For example, to install JupyterLab from the ``conda-forge`` channel and then run it, you could use the following commands:
.. code::
mamba create -n myjlabenv jupyterlab -c conda-forge
conda activate myjlabenv # activate our environment
jupyter lab # this will start up jupyter lab and open a browser
Once an environment is activated, ``mamba install`` can be used to install further packages into the environment.
.. code::
conda activate myjlabenv
mamba install bqplot # now you can use bqplot in myjlabenv
``mamba`` vs ``conda`` CLIs
===========================
| ``mamba`` is a drop-in replacement and uses the same commands and configuration options as ``conda``.
| You can swap almost all commands between ``conda`` & ``mamba``:
.. code::
mamba install ...
mamba create -n ... -c ... ...
mamba list
.. warning::
The only difference is that you should still use ``conda`` for :ref:`activation<activation>` and :ref:`deactivation<deactivation>`.
Repoquery
=========
``mamba`` comes with features on top of stock ``conda``.
To efficiently query repositories and package dependencies, you can use ``mamba repoquery``.
Here are some examples:
.. code::
# will show you all available xtensor packages.
$ mamba repoquery search xtensor
# you can also specify more constraints on this search query
$ mamba repoquery search "xtensor>=0.18"
# will show you a tree view of the dependencies of xtensor.
$ mamba repoquery depends xtensor
.. code::
$ mamba repoquery depends xtensor
xtensor == 0.21.5
├─ libgcc-ng [>=7.3.0]
│ ├─ _libgcc_mutex [0.1 conda_forge]
│ └─ _openmp_mutex [>=4.5]
│ ├─ _libgcc_mutex already visited
│ └─ libgomp [>=7.3.0]
│ └─ _libgcc_mutex already visited
├─ libstdcxx-ng [>=7.3.0]
└─ xtl [>=0.6.9,<0.7]
├─ libgcc-ng already visited
└─ libstdcxx-ng already visited
And you can ask for the inverse, which packages depend on some other package (e.g. ``ipython``) using ``whoneeds``.
.. code::
$ mamba repoquery whoneeds ipython
Name Version Build Channel
──────────────────────────────────────────────────
ipykernel 5.2.1 py37h43977f1_0 installed
ipywidgets 7.5.1 py_0 installed
jupyter_console 6.1.0 py_1 installed
With the ``-t,--tree`` flag, you can get the same information in a tree.
| 29.358491 | 164 | 0.65392 |
8ec080cf941f97ccbf0d083a4be976080ddcfc56 | 2,865 | rst | reStructuredText | news/garage.rst | tsimonen/scicomp-docs | 4f825e0888af15d8b8921e53a9d6a7a3f6937b65 | [
"CC0-1.0",
"CC-BY-4.0"
] | null | null | null | news/garage.rst | tsimonen/scicomp-docs | 4f825e0888af15d8b8921e53a9d6a7a3f6937b65 | [
"CC0-1.0",
"CC-BY-4.0"
] | null | null | null | news/garage.rst | tsimonen/scicomp-docs | 4f825e0888af15d8b8921e53a9d6a7a3f6937b65 | [
"CC0-1.0",
"CC-BY-4.0"
] | null | null | null | ==============
Scicomp garage
==============
The Aalto Scicomp Garage is a help session for scientific computing at
Aalto organized by the Science-IT team (Triton admins). It's the best
time to talk to the people behind scientific computing at Aalto. This
is a place to get stuff done, so bring your laptop and coffee/food,
and come hang out.
Most of the time, we are just there to answer questions.
Sometimes, there may be a short presentation on some topic, but you
can still ask questions to the other staff before, during, and after
that.
Come if you want to:
- Solve problems
- Discuss bigger problems
- Network with others who are doing similar work
- Learn something new
- Give feedback
Schedule
========
- Days: Triton garage is every week from 13:00-14:00 on Thursdays.
- Time: We promise to be there only the first 30 minutes or so, if
everyone leaves before then. Usually we are there the whole time.
- Location: See below. T4_ and A106_ (CS building), A237_ (CS
building).
.. _U121a: http://usefulaaltomap.fi/#!/select/main-U121a
.. _U121b: http://usefulaaltomap.fi/#!/select/main-U121b
.. _T4: http://usefulaaltomap.fi/#!/select/cs-A238
.. _A106: http://usefulaaltomap.fi/#!/select/r030-awing
.. _A237: http://usefulaaltomap.fi/#!/select/r030-awing
.. _F254: http://usefulaaltomap.fi/#!/select/F-F254
.. csv-table::
:header-rows: 1
:delim: |
Date (default Th) | Time (default 13:00-14:00) | Loc | Topic
9.8 | | B337 | Note different room, 3rd floor northwest corner of building
16.8 | | CS 3rd floor bridge |
23.8 | | B337 | Note different room, 3rd floor northwest corner of building
30.8 | | A106 |
6.9 | | A106 |
13.9 | | A106 |
20.9 | | A106 |
27.9 | | A106 |
4.10 | | A106 |
11.10 | | Outside T1 | Stand as part of CS research day
18.10 | | A106 |
25.10 | | A106 |
1.11 | | A106 |
8.11 | | A106 |
15.11 | | A106 |
22.11 | | A106 |
29.11 | | A106 |
6.12 | | A106 | NO garage
13.12 | | A106 |
20.12 | | A106 |
Dates after 2.8 to be decided later.
Topics
======
* `Triton intro: interactive jobs <../triton/tut/interactive>`_
* `Git <http://rkd.zgib.net/scicomp/scip2015/git.html>`_
Possible special topics
=======================
- Profiling and performance monitoring
- debugging
- open source: making software and running a project, licenses
- shell scripting and automation
- unix intro
- software testing
- building good programs
- porting python2 to python3
- R
- matlab
- GPU / deep learning computing
- molecular dynamics software
Past events
===========
Scicomp Garage has existed since Spring 2017.
| 29.84375 | 89 | 0.61466 |
cc83def9553f85d372eb6bfe6a902407224b2702 | 169 | rst | reStructuredText | doc/source/api/multi_level_providers/multi_level_provider.rst | DiMoser/PyPinT | 3cba394d0fd87055ab412d35fe6dbf4a3b0dbe73 | [
"MIT"
] | null | null | null | doc/source/api/multi_level_providers/multi_level_provider.rst | DiMoser/PyPinT | 3cba394d0fd87055ab412d35fe6dbf4a3b0dbe73 | [
"MIT"
] | null | null | null | doc/source/api/multi_level_providers/multi_level_provider.rst | DiMoser/PyPinT | 3cba394d0fd87055ab412d35fe6dbf4a3b0dbe73 | [
"MIT"
] | null | null | null | Multi-Level Provider (:mod:`multi_level_provider`)
==================================================
.. automodule:: pypint.multi_level_providers.multi_level_provider
| 33.8 | 65 | 0.579882 |
ee45d2d0698bd98e563b375d00594df046a937d0 | 7,927 | rst | reStructuredText | doc/source/user/vector_data_model.rst | unitave/gdal | 9f71e97f05c10e924b61d6d37c43cd85a42b3aff | [
"Apache-2.0"
] | 2 | 2022-03-24T00:53:48.000Z | 2022-03-26T02:52:52.000Z | doc/source/user/vector_data_model.rst | unitave/gdal | 9f71e97f05c10e924b61d6d37c43cd85a42b3aff | [
"Apache-2.0"
] | null | null | null | doc/source/user/vector_data_model.rst | unitave/gdal | 9f71e97f05c10e924b61d6d37c43cd85a42b3aff | [
"Apache-2.0"
] | null | null | null | .. _vector_data_model:
================================================================================
Vector Data Model
================================================================================

This document is intended to describe the OGR classes. The OGR classes are generic (not specific to OLE DB, COM, or Windows), but are used as a foundation for implementing OLE DB Provider support as well as client side support for SFCOM. The intent is that these same OGR classes could be used, for instance, by an implementation of SFCORBA, or directly by C++ programs wishing to use an OpenGIS simple features inspired API.

Because the OGR classes are based on the OpenGIS simple features data model, it is very helpful to review the SFCOM or other simple features interface specifications, which can be retrieved from the Open Geospatial Consortium (OGC) web site. Data types and method names are also modeled on those from the interface specifications.
Class Overview
--------------

- Geometry (:ref:`ogr_geometry.h <ogrgeometry_cpp>`):
  The geometry classes (such as :cpp:class:`OGRGeometry`) encapsulate the OpenGIS model vector data, provide some geometry operations, and translate to and from the well-known binary (WKB) and well-known text (WKT) formats. A geometry includes a spatial reference system (projection).
- Spatial Reference (:ref:`ogr_spatialref.h <ogrspatialref>`):
  The :cpp:class:`OGRSpatialReference` class encapsulates the definition of a projection and a datum.
- Feature (:ref:`ogr_feature.h <ogrfeature_cpp>`):
  The :cpp:class:`OGRFeature` class encapsulates the definition of a whole feature, that is, a geometry and a set of attributes.
- Feature Class Definition (:ref:`ogr_feature.h <ogrfeature_cpp>`):
  The :cpp:class:`OGRFeatureDefn` class captures the schema (set of field definitions) for a group of related features (normally a whole layer).
- Layer (:ref:`ogrsf_frmts.h <ogrlayer_cpp>`):
  The :cpp:class:`OGRLayer` class is an abstract base class representing a layer of features in a GDALDataset.
- Dataset (:ref:`gdal_priv.h <gdaldataset_cpp>`):
  The :cpp:class:`GDALDataset` class is an abstract base class representing a file or database containing one or more OGRLayer objects.
- Drivers (:ref:`gdal_priv.h <gdaldriver_cpp>`):
  The :cpp:class:`GDALDriver` class represents a translator for a specific format, used to open GDALDataset objects. All available drivers are managed by the GDALDriverManager.
Geometry
--------

The geometry classes represent various kinds of vector geometry. All the geometry classes derive from :cpp:class:`OGRGeometry`, which defines the services common to all geometries. The geometry types include :cpp:class:`OGRPoint`, :cpp:class:`OGRLineString`, :cpp:class:`OGRPolygon`, :cpp:class:`OGRGeometryCollection`, :cpp:class:`OGRMultiPolygon`, :cpp:class:`OGRMultiPoint`, and :cpp:class:`OGRMultiLineString`.

These geometry types are extended with non-linear geometries through the :cpp:class:`OGRCircularString`, :cpp:class:`OGRCompoundCurve`, :cpp:class:`OGRCurvePolygon`, :cpp:class:`OGRMultiCurve`, and :cpp:class:`OGRMultiSurface` classes.

Additional intermediate abstract base classes contain functionality that is eventually implemented by other geometry types. These include OGRCurve (the base class of OGRLineString) and OGRSurface (the base class of OGRPolygon). Some intermediate interfaces modeled in the simple features abstract model and in SFCOM are not currently modeled in OGR; in most cases their methods are aggregated into other classes.

The :cpp:class:`OGRGeometryFactory` class is used to convert well-known text (WKT) and well-known binary (WKB) format data into geometries. WKT and WKB are predefined ASCII and binary formats for representing all the simple features geometry types.

In a manner based on the geometry object in SFCOM, the OGRGeometry includes a reference to an :cpp:class:`OGRSpatialReference` object defining the spatial reference system of that geometry. This is normally a reference to a shared spatial reference object that aggregates a reference count for each OGRGeometry object using that spatial reference system.

Many of the spatial analysis methods on OGRGeometry (such as computing overlaps and so forth) are not implemented at this time.

While it is theoretically possible to derive many other specific geometry classes from the existing OGRGeometry classes, this is not an aspect that has been well thought out. In particular, it should be possible to create specialized classes using the OGRGeometryFactory without modifying it.
Compatibility issues with non-linear geometries
++++++++++++++++++++++++++++++++++++++++++++++++

A generic mechanism has been introduced so that creating or modifying a feature with a non-linear geometry, in a layer of a driver that does not support non-linear geometries, will transform that geometry into the closest matching linear geometry.

On the other hand, when retrieving data through the OGR C API, the :cpp:func:`OGRSetNonLinearGeometriesEnabledFlag` function can be used, if needed, so that the geometries and layer geometry types returned are converted to their closest matching linear approximations.
Spatial Reference
-----------------

The :cpp:class:`OGRSpatialReference` class stores an OpenGIS spatial reference system definition. Currently local, geographic, and projected coordinate systems are supported. Recent GDAL versions also support vertical coordinate systems, geocentric coordinate systems, and compound (horizontal + vertical) coordinate systems.

The spatial reference data model is inherited from the OpenGIS Well Known Text format. A simple form of this is defined in the Simple Features specification. A more sophisticated form is found in the Coordinate Transformation specification. The OGRSpatialReference class is built on the Coordinate Transformation specification, but is intended to be compatible with the earlier Simple Features form.

There is also an associated :cpp:class:`OGRCoordinateTransformation` class that encapsulates the use of PROJ for converting between different coordinate systems. Examples demonstrating the use of the OGRSpatialReference class are available.
Feature / Feature Definition
----------------------------

The :cpp:class:`OGRGeometry` class captures the geometry of a vector feature, that is, the spatial position/region of the feature. The :cpp:class:`OGRFeature` class contains this geometry and adds feature attributes, a feature id, and a feature class identifier. Several geometries can be associated with one OGRFeature.

The set of attributes (:cpp:class:`OGRFieldDefn`), their types, names, and so forth is represented via the :cpp:class:`OGRFeatureDefn` class. One OGRFeatureDefn normally exists for each layer of features. The same definition is shared in a reference-counted manner by all features of that type (or feature class).

The feature id (FID) of a feature is intended to uniquely identify the feature within the layer it belongs to. Freestanding features, or features not yet written to a layer, may have a null feature id (OGRNullFID). Feature ids are modeled in OGR as 64-bit integers; however, a 64-bit integer is not always sufficient to represent the native feature ids of some formats. For instance, the GML feature id is a string.

The feature class also contains an indicator of the geometry types allowed for that feature class (returned as an OGRwkbGeometryType from :cpp:func:`OGRFeatureDefn::GetGeomType`). If this indicator is wkbUnknown, any geometry type is allowed. This implies that the features in a given layer may be of different geometry types, though they will always share a common attribute schema.

Several geometry fields (:cpp:class:`OGRGeomFieldDefn`) can be associated with a feature class. Each geometry field has its own indicator of the allowed geometry types, returned by :cpp:func:`OGRGeomFieldDefn::GetType`, and its own spatial reference system, returned by :cpp:func:`OGRGeomFieldDefn::GetSpatialRef`.

The OGRFeatureDefn class also contains the feature class name (normally used as a layer name).
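The reference-counted sharing of one feature definition by many features, described above, can be illustrated with a small, GDAL-independent Python sketch. The class names below mimic the OGR ones but are plain-Python stand-ins, not the real bindings:

```python
class FeatureDefn:
    """Plain-Python stand-in for OGRFeatureDefn: one schema per layer."""
    def __init__(self, name, field_names):
        self.name = name                # feature class name, used as layer name
        self.field_names = field_names  # ordered attribute schema
        self.ref_count = 0              # how many features share this schema

class Feature:
    """Plain-Python stand-in for OGRFeature: attributes plus a FID."""
    NULL_FID = -1  # analogous to OGRNullFID for freestanding features

    def __init__(self, defn, fid=NULL_FID):
        self.defn = defn          # the definition is shared, not copied
        defn.ref_count += 1
        self.fid = fid
        self.attrs = dict.fromkeys(defn.field_names)

defn = FeatureDefn("cities", ["name", "population"])
f1 = Feature(defn, fid=1)
f2 = Feature(defn)  # not yet written to a layer: null FID

f1.attrs["name"] = "Osnabrueck"
assert f1.defn is f2.defn        # both features share one schema object
assert defn.ref_count == 2
assert f2.fid == Feature.NULL_FID
```

The point of the sketch is the `is` check: every feature of the layer points at the same definition object, so the schema is stored once no matter how many features exist.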
Layer
-----

The :cpp:class:`OGRLayer` class represents a layer of features within a data source. All features in an OGRLayer share a common schema and are of the same :cpp:class:`OGRFeatureDefn`. The OGRLayer class also contains methods for reading features from the data source. An OGRLayer can be thought of as a gateway for reading and writing features from an underlying data source, normally a file format. In SFCOM and other table-based simple features implementations, an OGRLayer is represented as a spatial table.

The OGRLayer includes methods for sequential and random reading and writing. Read access (via the :cpp:func:`OGRLayer::GetNextFeature` method) normally reads all features, one at a time, sequentially; however, installing a spatial filter on the OGRLayer (via the :cpp:func:`OGRLayer::SetSpatialFilter` method) restricts it to returning only the features intersecting a given geographic region. A filter on attributes only can be set with the :cpp:func:`OGRLayer::SetAttributeFilter` method.

Starting with GDAL 3.6, instead of retrieving features one at a time through ``GetNextFeature``, the :cpp:func:`OGRLayer::GetArrowStream` method can be used to retrieve them in batches with a columnar memory layout (see :ref:`vector_api_tut_arrow_stream`).

One flaw in the current OGR architecture is that the spatial and attribute filters are set directly on the OGRLayer, which is intended to be the only representative of a given layer in a data source. This means it is not possible to have multiple read operations active at one time, each with a different spatial filter.

.. note:: This aspect may be revised in the future by introducing an :cpp:class:`OGRLayerView` class or something similar.

Another question that might arise is why the OGRLayer and OGRFeatureDefn classes are distinct. Since there is always a one-to-one relationship between an OGRLayer and its OGRFeatureDefn, why not amalgamate the two classes? There are two reasons:

- As defined now, OGRFeature and OGRFeatureDefn do not depend on OGRLayer, so they can exist independently in memory without regard to a particular layer in a data source.
- Unlike the SFCOM and SFSQL models, the SFCORBA model does not have the concept of a layer with a single fixed schema. The fact that features belong to a feature collection that is potentially not directly related to their current feature grouping may be important to implementing SFCORBA support using OGR.

The OGRLayer class is an abstract base class. It is expected to be subclassed by each file format driver that implements it. OGRLayers are normally owned directly by their :cpp:class:`GDALDataset` and are not instantiated or destroyed directly.
Dataset
-------

The :cpp:class:`GDALDataset` class represents a set of OGRLayer objects. It usually represents a single file, a set of files, a database, or a gateway. A GDALDataset has a list of :cpp:class:`OGRLayer` objects that it owns but can return references to.

GDALDataset is an abstract base class. It is expected to be subclassed by each file format driver that implements it. GDALDataset objects are not normally instantiated directly, but rather with the assistance of a :cpp:class:`GDALDriver`. Deleting a GDALDataset closes access to the underlying persistent data source, but does not normally result in the deletion of that file.

A GDALDataset has a name (usually a filename) that can be used by a GDALDriver to reopen the data source.

The GDALDataset also generally supports executing data source specific commands, normally a form of SQL. This is accomplished via the :cpp:func:`GDALDataset::ExecuteSQL` method. While some data sources (such as PostGIS or Oracle) pass the SQL through to an underlying database, OGR also includes support for evaluating a subset of the SQL SELECT statement against any data source.
Drivers
-------

A :cpp:class:`GDALDriver` object is instantiated for each supported file format. The GDALDriver objects are generally registered through the GDALDriverManager, a singleton class that is used to open new datasets.

For each file format to be supported, a new GDALDriver object is instantiated and function pointers are defined for operations such as Identify() and Open() (along with the format-specific GDALDataset and OGRLayer classes).

On application startup, registration functions are normally called for each desired file format. These functions instantiate the appropriate GDALDriver objects and register them with the GDALDriverManager. When a dataset is opened, the driver manager will normally try each driver in turn until one succeeds, returning a GDALDataset object.
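To make the register-then-try-each-driver sequence concrete, here is a small, GDAL-independent Python sketch of the same pattern. All names here are hypothetical stand-ins; the real API is the C++ GDALDriverManager:

```python
class Driver:
    """Stand-in for GDALDriver: knows how to identify and open one format."""
    def __init__(self, name, suffix):
        self.name = name
        self.suffix = suffix

    def identify(self, filename):
        return filename.endswith(self.suffix)

    def open(self, filename):
        # Return a stand-in "dataset" on success, None on failure.
        if not self.identify(filename):
            return None
        return {"driver": self.name, "name": filename}

class DriverManager:
    """Stand-in for the GDALDriverManager singleton."""
    def __init__(self):
        self.drivers = []

    def register(self, driver):
        self.drivers.append(driver)

    def open(self, filename):
        # Try each registered driver in turn until one succeeds.
        for driver in self.drivers:
            dataset = driver.open(filename)
            if dataset is not None:
                return dataset
        raise ValueError("no driver recognises %s" % filename)

manager = DriverManager()
manager.register(Driver("ESRI Shapefile", ".shp"))
manager.register(Driver("GeoJSON", ".geojson"))

ds = manager.open("rivers.geojson")
assert ds["driver"] == "GeoJSON"
```

This mirrors the text above: registration populates the manager at startup, and opening a dataset is a loop over the registered drivers until one of them claims the file.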
| 63.927419 | 316 | 0.722594 |
bed1c4e07180178a13b7a5d54663c8d56f187c24 | 272 | rst | reStructuredText | docs/root/development/development.rst | ryanrhee/envoy-mobile | 34649625e979638dc51f2fa08336055d0cc1909d | [
"Apache-2.0"
] | 86 | 2020-11-16T23:40:04.000Z | 2022-03-15T20:36:21.000Z | docs/root/development/development.rst | ryanrhee/envoy-mobile | 34649625e979638dc51f2fa08336055d0cc1909d | [
"Apache-2.0"
] | 497 | 2020-11-14T02:53:19.000Z | 2022-03-31T20:26:11.000Z | docs/root/development/development.rst | ryanrhee/envoy-mobile | 34649625e979638dc51f2fa08336055d0cc1909d | [
"Apache-2.0"
] | 36 | 2020-11-14T00:06:50.000Z | 2022-03-03T19:27:50.000Z | .. _dev:
Development
===========
This section of the docs describes information useful to engineers actively
developing Envoy Mobile.
.. toctree::
:maxdepth: 2
performance/performance
releasing/releasing
testing/testing
tools/tools
debugging/android_local
| 16 | 75 | 0.746324 |
d864e1a28db08e169f191ca9b31569f8aa1c10d3 | 3,719 | rst | reStructuredText | docs/source/user/debugger.rst | J-Keller/jupyterlab | 2475a5355dd0811574d3372b7741e05f9067cf44 | [
"BSD-3-Clause"
] | 1 | 2022-02-22T06:18:47.000Z | 2022-02-22T06:18:47.000Z | docs/source/user/debugger.rst | J-Keller/jupyterlab | 2475a5355dd0811574d3372b7741e05f9067cf44 | [
"BSD-3-Clause"
] | 8 | 2022-01-04T19:19:07.000Z | 2022-03-03T22:11:12.000Z | docs/source/user/debugger.rst | J-Keller/jupyterlab | 2475a5355dd0811574d3372b7741e05f9067cf44 | [
"BSD-3-Clause"
] | 3 | 2022-03-08T03:52:35.000Z | 2022-03-15T06:23:01.000Z | .. _debugger:
Debugger
========
JupyterLab ships with a Debugger front-end by default.
This means that notebooks, code consoles and files can be debugged from JupyterLab directly!
Requirements
------------
For the debugger to be enabled and visible, a kernel with support for debugging is required.
Here is a list of kernels that are known to be supporting the Jupyter Debug Protocol:
- `xeus-python <https://github.com/jupyter-xeus/xeus-python>`_: Jupyter kernel for the Python programming language
- `xeus-robot <https://github.com/jupyter-xeus/xeus-robot>`_: Jupyter kernel for Robot Framework
- `ipykernel <https://github.com/ipython/ipykernel>`_: IPython Kernel for Jupyter
- `common-lisp-jupyter <https://github.com/yitzchak/common-lisp-jupyter>`_: Common Lisp Kernel for Jupyter
Other Jupyter Kernels can also support debugging and be compatible with the JupyterLab debugger
by implementing the `Jupyter Debugger Protocol <https://jupyter-client.readthedocs.io/en/latest/messaging.html#debug-request>`_.
If you know of other kernels with support for debugging, please open a PR to add them to this list.
Here is an example of how to install ``ipykernel`` and ``xeus-python`` in a new ``conda`` environment:
.. code:: bash
conda create -n jupyterlab-debugger -c conda-forge jupyterlab=3 "ipykernel>=6" xeus-python
conda activate jupyterlab-debugger
Usage
-----
Here is a screencast to enable the debugger and set up breakpoints. The various steps are described more in depth below.
.. image:: ./images/debugger/step.gif
Use a kernel supporting debugger
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
First, you will need to check that the kernel supports debugging. If so, the *bug* icon in the upper-right corner of the
notebook will be enabled.
.. image:: ../images/debugger-kernel.png
Debug code in notebook
^^^^^^^^^^^^^^^^^^^^^^
Now let's enable the debugger for this notebook. The debugger can be enabled by toggling the *bug* button on in the upper-right corner of the notebook:
.. image:: ../images/debugger-activate.png
Once debugging has been enabled, we can set breakpoints and step into the code.
Let's define a function that adds two elements:
.. code:: python
def add(a, b):
res = a + b
return res
We can call the function and print the result:
.. code:: python
result = add(1, 2)
print(result)
Now let's go back to the first code cell and click on the gutter on line number 2 to add a breakpoint:
.. image:: ../images/debugger-breakpoint.png
Then let's execute the second code cell by clicking on the *Run* button:
.. image:: ../images/debugger-run.png
The execution stops where the breakpoint is set:
.. image:: ../images/debugger-stop-on-breakpoint.png
Explore the code state
^^^^^^^^^^^^^^^^^^^^^^
Exploring the code state is done with the debugger sidebar. It shows a variable explorer,
a list of breakpoints, a source preview, and lets you navigate the call stack.
.. image:: ../images/debugger-sidebar.png
**Variables**
Variables can be explored using a tree view and a table view:
.. image:: ../images/debugger-variables.png
**Call stack**
You can step into the code, and continue the execution using the debug actions:
.. image:: ../images/debugger-callstack.png
**Breakpoints**
New breakpoints can be added and removed while the execution is stopped, and
they will be shown in the list of breakpoints:
.. image:: ../images/debugger-breakpoints.png
**Source**
The source panel shows the source of the current file being debugged:
.. image:: ../images/debugger-source.png
If the source corresponds to a cell that has been deleted, clicking on the
*Open in Main Area* button will open a read-only view of the source.
| 30.735537 | 151 | 0.733799 |
1ca4e45cd884b011d09b65d5d3310287dd504af4 | 154 | rst | reStructuredText | docs/source/generated/brightwind.analyse.analyse.dist_12x24.rst | altosphere/brightwind | 7d061505fb924529b9ea1d0adfd01d269a30a3bb | [
"MIT"
] | 34 | 2019-01-13T12:11:26.000Z | 2021-12-30T18:28:38.000Z | docs/source/generated/brightwind.analyse.analyse.dist_12x24.rst | altosphere/brightwind | 7d061505fb924529b9ea1d0adfd01d269a30a3bb | [
"MIT"
] | 208 | 2018-12-17T15:56:03.000Z | 2022-03-25T15:27:00.000Z | docs/source/generated/brightwind.analyse.analyse.dist_12x24.rst | altosphere/brightwind | 7d061505fb924529b9ea1d0adfd01d269a30a3bb | [
"MIT"
] | 9 | 2019-02-11T17:11:24.000Z | 2022-01-04T23:30:03.000Z | brightwind.analyse.analyse.dist\_12x24
======================================
.. currentmodule:: brightwind.analyse.analyse
.. autofunction:: dist_12x24 | 25.666667 | 45 | 0.603896 |
68436f470cbd38ed36f0c8ade2e9c240bcbf13c1 | 8,760 | rst | reStructuredText | src/documentation/reference-guide/1.6/dijit/form/DateTextBox.rst | Anirudradabas/dojo-website | f222da8256a48081771ba7626febe2d977563afc | [
"CC0-1.0"
] | 4 | 2016-01-22T11:13:41.000Z | 2017-03-22T11:09:30.000Z | src/documentation/reference-guide/1.6/dijit/form/DateTextBox.rst | jains1234567890/dojo-website | 371f39db94f86032bd5844add44436a5c17a97a4 | [
"CC0-1.0"
] | 34 | 2015-08-14T15:56:18.000Z | 2021-09-17T05:06:46.000Z | src/documentation/reference-guide/1.6/dijit/form/DateTextBox.rst | jains1234567890/dojo-website | 371f39db94f86032bd5844add44436a5c17a97a4 | [
"CC0-1.0"
] | 23 | 2015-08-28T13:29:38.000Z | 2021-05-18T07:28:10.000Z | .. _dijit/form/DateTextBox:
dijit.form.DateTextBox
======================
:Authors: Becky Gibson, Doug Hays, Craig Riecke, Adam Peller
:Developers: Doug Hays, Bill Keese
:Available: since V0.9
.. contents::
:depth: 2
DateTextBox widgets are easy-to-use date entry controls that allow either typing or choosing a date from any calendar widget.
============
Introduction
============
``dijit.form.DateTextBox``:
* is a :ref:`mapped form control <dijit/form>`
* validates against locale-specific :ref:`i18n <quickstart/internationalization/index>` rules
* also validates against developer-provided ``constraints`` like ``min``, ``max``, valid days of the week, etc.
:ref:`Options defined by the dojo.date package <quickstart/numbersDates>` to alter the way dates are formatted and parsed can be specified in the DateTextBox ``constraints`` object.
Standard Date Format
--------------------
One universal problem with specifying dates as text strings is that they can be written in so many different ways. In Great Britain, "5/8/2008" means August 5th, whereas in the U.S. it means May 8th. Fortunately, Dojo respects the cultural conventions so that the date will be properly parsed when interacting with the user. Routines in the :ref:`dojo.date.locale <dojo/date/locale>` package are used against the setting of djConfig.locale or the locale of the user's browser to determine the appropriate behavior.
Another problem is that your application may interact with various users in different locales, and the same server interaction is expected to work for all of them. If your widget markup specifies the attribute ``value='5/8/2008'``, how does DateTextBox know what you mean? You could write your application to assume US-English conventions, as Javascript often does, but that programming practice will not be well understood in other parts of the world and may cause problems interacting with other software. To prevent this ambiguity, DateTextBox uses ISO8601/RFC3339 format ``yyyy-MM-dd`` to specify dates when communicating outside the Javascript realm. This format is both neutral to cultural formatting conventions as well as to time zones. For example:
* 2007-12-25 means December 25, 2007.
ISO formatted date values sort properly as strings and are lighter-weight than Javascript Date objects, which make them convenient for programming.
The DateTextBox widget uses a hidden form element with the *NAME* of the original tag to submit the ISO data; the form element provided for user interaction is an additional form element instantiated only for this purpose. When you access the DateTextBox value attribute programmatically from the widget using JavaScript, you must use a native Javascript Date object, e.g. new Date(2007, 11, 25) The time portion of the Date object is ignored.
========
Examples
========
Declarative example
-------------------
.. code-example ::
.. js ::
<script type="text/javascript">
dojo.require("dijit.form.DateTextBox");
</script>
.. html ::
<input type="text" name="date1" id="date1" value="2005-12-30"
dojoType="dijit.form.DateTextBox"
required="true" />
<label for="date1">Drop down Date box. Click inside to display the calendar.</label>
Alternate Date Format to/from a Server
--------------------------------------
Ideally, your server application will send and receive dates in the ISO standard format. Dojo recommends it as a best practice, but your data may not conform. For example when Oracle database processes dates, by default it insists on dd-MMM-yyyy format in English, as in 01-APR-2006. Perhaps you do not control the database or cannot write a shim to convert the dates server side. How do you get around it?
To accept dates from the server in this format (but continue to work with dates on the client using local conventions), you can create your own widget class which overrides the postMixInProperties and serialize methods of DateTextBox. (See :ref:`Dijit <dijit/index>` for details on creating your own widgets). Here's an example:
.. code-example ::
.. js ::
<script type="text/javascript">
dojo.require("dijit.form.DateTextBox");
dojo.addOnLoad(function(){
dojo.declare("OracleDateTextBox", dijit.form.DateTextBox, {
oracleFormat: {selector: 'date', datePattern: 'dd-MMM-yyyy', locale: 'en-us'},
value: "", // prevent parser from trying to convert to Date object
postMixInProperties: function() { // change value string to Date object
this.inherited(arguments);
// convert value to Date object
this.value = dojo.date.locale.parse(this.value, this.oracleFormat);
},
// To write back to the server in Oracle format, override the serialize method:
serialize: function(dateObject, options) {
return dojo.date.locale.format(dateObject, this.oracleFormat).toUpperCase();
}
});
function showServerValue(){
dojo.byId('toServerValue').value=document.getElementsByName('oracle')[0].value;
}
new OracleDateTextBox({
value: "31-DEC-2009",
name: "oracle",
onChange: function(v){ setTimeout(showServerValue, 0)}
}, "oracle");
showServerValue();
});
</script>
.. html ::
<label for"fromServerValue">Oracle date coming from server:</label>
<input id="fromServerValue" readOnly disabled value="31-DEC-2009"/><br/>
<label for="oracle">Client date:</label>
<input id="oracle" /><br/>
<label for"toServerValue">Oracle date going back to server:</label>
<input id="toServerValue" readOnly disabled/>
Changing Constraints on the Fly
-------------------------------
The DateTextBox widget obeys the ``constraints`` you give, much like :ref:`dijit.form.NumberTextBox <dijit/form/NumberTextBox>`. Sometimes you may need to change this attribute's `min` and `max` values at runtime. To do this, you can set new ``constraints`` on the widget, but the catch is that you must use JavaScript dates. In this example, the first DateTextBox widget sets the `max` constraint of the second widget, and the second widget sets the `min` constraint of the first widget.
.. code-example ::
.. js ::
<script type="text/javascript">
dojo.require("dijit.form.DateTextBox");
</script>
.. html ::
<label for="fromDate">From:</label>
<input id="fromDate" type="text" name="fromDate" dojoType="dijit.form.DateTextBox" required="true"
onChange="dijit.byId('toDate').constraints.min = arguments[0];" />
<label for="toDate">To:</label>
<input id="toDate" type="text" name="toDate" dojoType="dijit.form.DateTextBox" required="true"
onChange="dijit.byId('fromDate').constraints.max = arguments[0];" />
Working with Two-Digit Years
----------------------------
Sometimes you may want to input and display years in a format with only 2-digit years. Note the server still needs the full 4-digit year sent on form submit so that it's not ambiguous. There is a ``constraints`` property `fullYear` (boolean) that controls the presentation of the year as 2 digits or 4. The catch is that this can only be set after the widget has been created.
.. code-example ::
.. js ::
<script type="text/javascript">
dojo.require("dijit.form.DateTextBox");
function setShortYear(){
var w = dijit.byId('shortYear');
w.constraints.fullYear = false;
w.attr('value', w.attr('value')); // reformat display to short year
}
dojo.addOnLoad(setShortYear);
</script>
.. html ::
<label for="shortYear">From:</label>
<input id="shortYear" type="text" name="shortYear" dojoType="dijit.form.DateTextBox" value="1999-12-31" required="true"/>
=============
Accessibility
=============
Version 1.6
-----------
As of 1.6, full keyboard support has been added to the Calendar dropdown used by the DateTextBox. See the Accessibility Section in :ref:`dijit.Calendar <dijit/Calendar>` for the keyboard commands to navigate the Calendar drop down. To navigate the DateTextBox with the JAWS 12 screen reader, JAWS must be in virtual cursor off mode. With focus on the DateTextBox field JAWS will announce the DateTextBox as an edit combo. The user presses the down arrow key to open the Calendar and set focus onto the date specified in the text box. Use table navigation to navigate through the Calendar.
Previous to 1.6
---------------
See the Accessibility Section in :ref:`dijit.form.ValidationTextBox <dijit/form/ValidationTextBox>`
The calendar popup associated with the DateTextBox is not yet keyboard accessible. However, the DateTextBox will still meet accessibility requirements as long as the developer provides the validation parameters promptMessage and invalidMessage when creating the DateTextBox (note that there is a default invalidMessage but not a promptMessage). These messages are implemented in a format that is accessible to all users.
| 49.772727 | 759 | 0.730023 |
eb2b3cc77ae44ad66617a3f23347288fecfdef1e | 93 | rst | reStructuredText | docs/usage.rst | lresende/eg-microservice-demo | 0ab8861c72e3fd2dd42fe7192dde392c7be2c7bf | [
"Apache-2.0"
] | 2 | 2018-04-19T01:10:10.000Z | 2018-07-21T17:53:32.000Z | docs/usage.rst | lresende/eg-microservice-demo | 0ab8861c72e3fd2dd42fe7192dde392c7be2c7bf | [
"Apache-2.0"
] | null | null | null | docs/usage.rst | lresende/eg-microservice-demo | 0ab8861c72e3fd2dd42fe7192dde392c7be2c7bf | [
"Apache-2.0"
] | null | null | null | =====
Usage
=====
To use python_microservice in a project::
import python_microservice
| 11.625 | 41 | 0.688172 |
5f7be8f584a1b7f9b2b604feb70958b5794e4733 | 3,180 | rst | reStructuredText | HISTORY.rst | nicolargo/TextBlob | 1444008a36dbb5ebcb5c00e4e9d24ee003e2d88a | [
"MIT"
] | 1 | 2018-02-06T06:44:58.000Z | 2018-02-06T06:44:58.000Z | HISTORY.rst | nicolargo/TextBlob | 1444008a36dbb5ebcb5c00e4e9d24ee003e2d88a | [
"MIT"
] | null | null | null | HISTORY.rst | nicolargo/TextBlob | 1444008a36dbb5ebcb5c00e4e9d24ee003e2d88a | [
"MIT"
] | null | null | null | Changelog
=========
0.5.2 (2013-08-14)
------------------
- `Important patch update for NLTK users`: Fix bug with importing TextBlob if local NLTK is installed.
- Fix bug with computing start and end indices of sentences.
0.5.1 (2013-08-13)
------------------
- Fix bug that disallowed display of non-ascii characters in the Python REPL.
- Backwards incompatible: Restore ``blob.json`` property for backwards compatibility with textblob<=0.3.10. Add a ``to_json()`` method that takes the same arguments as ``json.dumps``.
- Add ``WordList.append`` and ``WordList.extend`` methods that append Word objects.
0.5.0 (2013-08-10)
------------------
- Language translation and detection API!
- Add ``text.sentiments`` module. Contains the ``PatternAnalyzer`` (default implementation) as well as a ``NaiveBayesAnalyzer``.
- Part-of-speech tags can be accessed via ``TextBlob.tags`` or ``TextBlob.pos_tags``.
- Add ``polarity`` and ``subjectivity`` helper properties.
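The ``polarity`` helper added in 0.5.0 averages per-word sentiment scores from a lexicon. The sketch below is a toy illustration of that idea only — the tiny ``LEXICON`` dict and the ``polarity`` function here are invented for demonstration and are not TextBlob's actual ``PatternAnalyzer`` implementation:

```python
# Toy lexicon-based polarity scorer, in the spirit of the PatternAnalyzer.
# The real analyzer uses the full Pattern-library lexicon; this dict is
# illustrative only.
LEXICON = {
    "great": 0.8, "good": 0.7, "happy": 0.8,
    "bad": -0.7, "terrible": -1.0, "sad": -0.5,
}

def polarity(text: str) -> float:
    """Average the scores of known words; 0.0 when none are known."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(polarity("What a great, happy day"))  # positive score
print(polarity("unknown words only"))       # 0.0
```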
0.4.0 (2013-08-05)
------------------
- New ``text.tokenizers`` module with ``WordTokenizer`` and ``SentenceTokenizer``. Tokenizer instances (from either textblob itself or NLTK) can be passed to TextBlob's constructor. Tokens are accessed through the new ``tokens`` property.
- New ``Blobber`` class for creating TextBlobs that share the same tagger, tokenizer, and np_extractor.
- Add ``ngrams`` method.
- `Backwards-incompatible`: ``TextBlob.json()`` is now a method, not a property. This allows you to pass arguments (the same that you would pass to ``json.dumps()``).
- New home for documentation: https://textblob.readthedocs.org/
- Add parameter for cleaning HTML markup from text.
- Minor improvement to word tokenization.
- Updated NLTK.
- Fix bug with adding blobs to bytestrings.
0.3.10 (2013-08-02)
-------------------
- Bundled NLTK no longer overrides local installation.
- Fix sentiment analysis of text with non-ascii characters.
0.3.9 (2013-07-31)
------------------
- Updated nltk.
- ConllExtractor is now Python 3-compatible.
- Improved sentiment analysis.
- Blobs are equal (with `==`) to their string counterparts.
- Added instructions to install textblob without nltk bundled.
- Dropping official 3.1 and 3.2 support.
0.3.8 (2013-07-30)
------------------
- Importing TextBlob is now **much faster**. This is because the noun phrase parsers are trained only on the first call to ``noun_phrases`` (instead of training them every time you import TextBlob).
- Add text.taggers module which allows user to change which POS tagger implementation to use. Currently supports PatternTagger and NLTKTagger (NLTKTagger only works with Python 2).
- NPExtractor and Tagger objects can be passed to TextBlob's constructor.
- Fix bug with POS-tagger not tagging one-letter words.
- Rename text/np_extractor.py -> text/np_extractors.py
- Add run_tests.py script.
0.3.7 (2013-07-28)
------------------
- Every word in a ``Blob`` or ``Sentence`` is a ``Word`` instance which has methods for inflection, e.g. ``word.pluralize()`` and ``word.singularize()``.
- Updated the ``np_extractor`` module. Now has an new implementation, ``ConllExtractor`` that uses the Conll2000 chunking corpus. Only works on Py2. | 50.47619 | 238 | 0.713522 |
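The per-word inflection methods introduced in 0.3.7 (``word.pluralize()`` and friends) apply rule-based English morphology. A deliberately naive sketch of that idea — not TextBlob's actual rule set — looks like this:

```python
def pluralize(word: str) -> str:
    """Very naive English pluralization covering a few common patterns."""
    if word.endswith(("s", "x", "z", "ch", "sh")):
        return word + "es"          # box -> boxes, church -> churches
    if word.endswith("y") and word[-2:-1] not in "aeiou":
        return word[:-1] + "ies"    # city -> cities
    return word + "s"               # rover -> rovers

print(pluralize("box"))    # boxes
print(pluralize("city"))   # cities
print(pluralize("rover"))  # rovers
```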
a4daa31d180656a764801853267e6c3f50f32035 | 2,373 | rst | reStructuredText | doc/python/sphinx/glossary.rst | arghyadip01/grpc | 9e10bfc8a096ef91a327e22f84f10c0fabff4417 | [
"Apache-2.0"
] | 36,552 | 2015-02-26T17:30:13.000Z | 2022-03-31T22:41:33.000Z | doc/python/sphinx/glossary.rst | SanjanaSingh897/grpc | 2d858866eb95ce5de8ccc8c35189a12733d8ca79 | [
"Apache-2.0"
] | 23,536 | 2015-02-26T17:50:56.000Z | 2022-03-31T23:39:42.000Z | doc/python/sphinx/glossary.rst | SanjanaSingh897/grpc | 2d858866eb95ce5de8ccc8c35189a12733d8ca79 | [
"Apache-2.0"
] | 11,050 | 2015-02-26T17:22:10.000Z | 2022-03-31T10:12:35.000Z | Glossary
================
.. glossary::
metadatum
A key-value pair included in the HTTP header. It is a
2-tuple where the first entry is the key and the
second is the value, i.e. (key, value). The metadata key is an ASCII str,
and must be a valid HTTP header name. The metadata value can be
either a valid HTTP ASCII str, or bytes. If bytes are provided,
the key must end with '-bin', i.e.
``('binary-metadata-bin', b'\\x00\\xFF')``
metadata
A sequence of metadatum.
serializer
A callable function that encodes an object into bytes. Applications are
allowed to provide any customized serializer, so there isn't a restriction
for the input object (i.e. even ``None``). On the server-side, the
serializer is invoked with server handler's return value; on the
client-side, the serializer is invoked with outbound message objects.
deserializer
A callable function that decodes bytes into an object. Same as serializer,
the returned object doesn't have restrictions (i.e. ``None`` allowed). The
deserializer is invoked with inbound message bytes on both the server side
and the client-side.
wait_for_ready
If an RPC is issued but the channel is in the TRANSIENT_FAILURE or SHUTDOWN
states, the library cannot transmit the RPC at the moment. By default, the
gRPC library will fail such RPCs immediately. This is known as "fail fast."
RPCs will not fail as a result of the channel being in other states
(CONNECTING, READY, or IDLE).
When the wait_for_ready option is specified, the library will queue RPCs
until the channel is READY. Any submitted RPCs may still fail before the
READY state is reached for other reasons, e.g., the client channel has been
shut down or the RPC's deadline has been reached.
channel_arguments
A list of key-value pairs to configure the underlying gRPC Core channel or
server object. Channel arguments are meant for advanced usages and contain
experimental API (some may not labeled as experimental). Full list of
available channel arguments and documentation can be found under the
"grpc_arg_keys" section of "grpc_types.h" header file (|grpc_types_link|).
For example, if you want to disable TCP port reuse, you may construct
channel arguments like: ``options = (('grpc.so_reuseport', 0),)``.
| 46.529412 | 79 | 0.726507 |
e9e30320ed7edb58e752ecf536468638a5a9329b | 245 | rst | reStructuredText | docs/source/handwriting_features.features.validation.rst | AppEecaetano/handwriting-features | a27446756c3af9995ce0e7ae5a795cebea341eff | [
"MIT"
] | 1 | 2021-11-23T20:52:14.000Z | 2021-11-23T20:52:14.000Z | docs/source/handwriting_features.features.validation.rst | AppEecaetano/handwriting-features | a27446756c3af9995ce0e7ae5a795cebea341eff | [
"MIT"
] | null | null | null | docs/source/handwriting_features.features.validation.rst | AppEecaetano/handwriting-features | a27446756c3af9995ce0e7ae5a795cebea341eff | [
"MIT"
] | 1 | 2021-11-23T21:10:52.000Z | 2021-11-23T21:10:52.000Z | handwriting\_features.features.validation package
=================================================
Module contents
---------------
.. automodule:: handwriting_features.features.validation
:members:
:undoc-members:
:show-inheritance:
| 22.272727 | 56 | 0.57551 |
41e9303b1926877bee5c188f056c5e7068fc2221 | 4,981 | rst | reStructuredText | start/install.rst | DevDooly/answer-doc | 739ac0ea5314e1da35ddce0651e5ff9ec09b3c44 | [
"BSD-3-Clause"
] | 3 | 2021-06-20T02:24:10.000Z | 2022-01-26T23:55:33.000Z | docs/ko/_sources/start/install.rst.txt | bogonets/answer | 57f892a9841980bcbc35fa1e27521b34cd94bc25 | [
"MIT"
] | null | null | null | docs/ko/_sources/start/install.rst.txt | bogonets/answer | 57f892a9841980bcbc35fa1e27521b34cd94bc25 | [
"MIT"
] | null | null | null | .. meta::
:keywords: INSTALL
.. _doc-start-install:
Installation
============
This page summarizes how to install "Answer".
Before You Install
--------------------
"Answer" is distributed primarily for the `Ubuntu <https://ubuntu.com/>`_ platform using the `Docker <https://www.docker.com/>`_ tool.
.. note:: Currently, only deployment via Docker is supported.
   Per-platform distributions are planned for future releases.
Also, before installing, you should verify that your CPU supports "Virtualization Technology".
The name used for this technology differs depending on the CPU vendor.
- Intel VT
- AMD-V
You can check this using a "BIOS utility", a "CPU information" tool, or similar.
- On Windows, you can check it under ``Task Manager > Performance tab > CPU``.
- On Linux, you can check it with ``lscpu | grep Virtualization``.
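On Linux, ``lscpu`` derives its Virtualization field from the CPU flags in ``/proc/cpuinfo``: ``vmx`` indicates Intel VT and ``svm`` indicates AMD-V. A minimal sketch of that check (parsing logic assumed here, not part of Answer itself):

```python
import re

def has_virtualization(cpuinfo_text: str) -> bool:
    """Return True if a vmx (Intel VT) or svm (AMD-V) flag is present."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = set(re.split(r"\s+", line.partition(":")[2].strip()))
            if "vmx" in flags or "svm" in flags:
                return True
    return False

sample = "processor : 0\nflags : fpu vme de vmx sse2"
print(has_virtualization(sample))  # True
```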
Installing Docker
--------------------
Docker is an automation tool that isolates applications in containers.
.. note:: The installation procedure may change, so please check
   the official website for the latest information.
- `Install Docker Desktop on Mac <https://docs.docker.com/docker-for-mac/install/>`_
- `Install Docker Desktop on Windows <https://docs.docker.com/docker-for-windows/install/>`_
- `Install Docker Engine on Ubuntu <https://docs.docker.com/engine/install/ubuntu/>`_
For more details, see the `Get Docker <https://docs.docker.com/get-docker/>`_ page.
If you are using Ubuntu (x86_64/amd64), you can install it with the following commands.
.. code-block:: bash
:linenos:
## Uninstall old versions
sudo apt-get remove docker docker-engine docker.io containerd runc
## Set up the repository
sudo apt-get update
sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common
## Add Docker’s official GPG key:
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
## Verify that you now have the key with the fingerprint
sudo apt-key fingerprint 0EBFCD88
## Set up the stable repository.
sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"
## Install Docker Engine
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io
## Add docker group
sudo usermod -aG docker your-user
## Verify that Docker Engine is installed correctly by running the hello-world image.
sudo docker run hello-world
Installing Docker Compose
---------------------------
`Docker Compose <https://docs.docker.com/compose/>`_ is a tool for defining and running multi-container applications.
For detailed installation instructions, see the `Install Docker Compose <https://docs.docker.com/compose/install/>`_ page.
On Linux, you can install it simply with the following commands.
.. code-block:: bash
:linenos:
sudo curl -L "https://github.com/docker/compose/releases/download/1.26.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
(Optional) NVIDIA Graphics Driver Support
-------------------------------------------
CUDA support for GPGPU can be enabled in "Answer".
.. warning:: "NVIDIA graphics driver support" is currently available only on the Linux platform,
   because `nvidia-docker <https://nvidia.github.io/nvidia-docker/>`_ is required for Docker support.
   Please check that site for the current support status.
To enable it, first install the `NVIDIA graphics driver <https://www.nvidia.co.kr/Download/index.aspx?lang=kr>`_.
Follow the installation instructions on that site.
.. note:: Installing the `CUDA Toolkit <https://developer.nvidia.com/cuda-toolkit>`_ will
   also install the graphics driver.
After that, install `nvidia-docker <https://nvidia.github.io/nvidia-docker/>`_.
.. note:: The installation procedure may change, so please check
   the official website for the latest information.
If you are using Ubuntu, you can install it with the following commands.
.. code-block:: bash
:linenos:
distribution=$(. /etc/os-release;echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | \
sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | \
sudo tee /etc/apt/sources.list.d/nvidia-docker.list
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
sudo systemctl restart docker
(Optional) Installing nvidia-docker-compose
---------------------------------------------
When using Docker Compose, the NVIDIA graphics driver may not be attached.
In this case, there are a few options available.
- Run the full set of images manually
- Write a :download:`Bash Script </_static/answer-cli>`
- Add a ``runtimes`` entry to Docker's ``daemon.json`` file
- Install `nvidia-docker-compose <https://github.com/eywalker/nvidia-docker-compose>`_
Of these, nvidia-docker-compose can be installed as follows.
.. code-block:: bash
:linenos:
pip install nvidia-docker-compose
It can then be used as follows.
.. code-block:: bash
:linenos:
docker-compose -f docker-compose-gpu.yaml ...
## or
nvidia-docker-compose ...
.. warning:: This method is unofficial.
Downloading Answer
--------------------
Answer is distributed on the official `Docker Hub <https://hub.docker.com/>`_ site.
See the links below for each image.
- `bogonets/answer-core <https://hub.docker.com/r/bogonets/answer-core>`_
- `bogonets/answer-api <https://hub.docker.com/r/bogonets/answer-api>`_
- `bogonets/answer-web <https://hub.docker.com/r/bogonets/answer-web>`_
To get the latest version, run the following commands.
.. code-block:: bash
:linenos:
docker pull bogonets/answer-core
docker pull bogonets/answer-api
docker pull bogonets/answer-web
| 27.983146 | 149 | 0.659707 |
97996de5987008c2d397ac50782c2f753a4c2805 | 311 | rst | reStructuredText | docs/reference_utilities.rst | IRT-Open-Source/libadm | 442f51229e760db1caca4b962bfaad58a9c7d9b8 | [
"Apache-2.0"
] | 14 | 2018-07-24T12:18:05.000Z | 2020-05-11T19:14:49.000Z | docs/reference_utilities.rst | IRT-Open-Source/libadm | 442f51229e760db1caca4b962bfaad58a9c7d9b8 | [
"Apache-2.0"
] | 19 | 2018-07-30T15:02:54.000Z | 2020-05-21T10:13:19.000Z | docs/reference_utilities.rst | IRT-Open-Source/libadm | 442f51229e760db1caca4b962bfaad58a9c7d9b8 | [
"Apache-2.0"
] | 7 | 2018-07-24T12:18:12.000Z | 2020-02-14T11:18:12.000Z | .. reference_utilities:
Utilities
#########
.. doxygenstruct:: adm::SimpleObjectHolder
.. doxygenfunction:: adm::createSimpleObject
.. doxygenfunction:: adm::updateBlockFormatDurations(std::shared_ptr<Document>)
.. doxygenfunction:: adm::updateBlockFormatDurations(std::shared_ptr<Document>, const Time &)
| 23.923077 | 93 | 0.762058 |
5988536e900cd93f9e5e654b020c4c109b56ebf6 | 11,559 | rst | reStructuredText | docs/source/quickstart/root.rst | mccartnm/robx | ee1a2fc6239df170d487ab42c7c350a5272e66a8 | [
"MIT"
] | null | null | null | docs/source/quickstart/root.rst | mccartnm/robx | ee1a2fc6239df170d487ab42c7c350a5272e66a8 | [
"MIT"
] | 2 | 2019-09-08T20:15:11.000Z | 2019-10-04T00:45:58.000Z | docs/source/quickstart/root.rst | mccartnm/hivemind | ee1a2fc6239df170d487ab42c7c350a5272e66a8 | [
"MIT"
] | null | null | null | ************
Starting Off
************
What is a Hive
==============
A hivemind project, colloquially known as a "hive", is an organizational structure for all components of the hivemind universe. With a number of advantages over one-off files, the hive benefits development and production environments alike, all while keeping the handcuffs off for when programmers want to do the awesome hackery that Python is known for.
.. tip::
    Think of a city: a bustle of people and information. When walking through the streets, you don't care what most people or things are doing. You observe and relay information to the people who are listening, all while potentially listening in on others when the time is right.
If you're familiar with the Django_ project, this should feel very familiar as many of the concepts were modeled after it's project structure.
``hm`` Command
==============
The CLI interface for hivemind. This comes with utilities to help initialize the project, new nodes, and even run in development and production environments. Below is the current help info of the ``hm`` command
.. execute_code::
:hide_code:
:hide_headers:
from hivemind.util.cliparse import build_hivemind_parser
parser = build_hivemind_parser()
parser.print_help()
Create a Project
================
Let's go to wherever we normally keep our code repositories and make a new directory. To follow with our "hive as a city" concept, we'll call it ``gotham``.
.. code-block:: shell
~$> cd ~/repos
~$> mkdir gotham
~$> cd gotham
Once inside, we use the ``hm new`` command to set up a fresh hive.
.. execute_code::
:hide_code:
:hide_headers:
from hivemind.util.cliparse import build_hivemind_parser
parser = build_hivemind_parser()
parser.subparser_map['new'].print_help()
.. code-block:: shell
~$> hm new gotham
New hive at: /home/jdoe/repos/gotham
If you look around there, you'll see something akin to the following structure.
.. code-block:: text
gotham/
`- gotham/
`- config/
`- __init__.py
`- hive.py
`- nodes/
`- __init__.py
`- root/
`- __init__.py
`- static/
`- templates/
`- __init__.py
- ``config/hive.py``: The settings file for a given hive
- ``nodes/``: Where our nodes will go. More on this soon.
- ``root/__init__.py``: The controller class goes here. The controller(s) is/are the heart of the hive. By default, it's just a transparent wrapper around the ``RootController`` class.
- ``static/``: For web-side utilities, static files that we'll serve out
- ``templates/``: ``TaskYaml`` templates for plugin-style task fun! We'll get more into this later on too.
.. tip::
While not specifically required, the additional top level directory is to contain all the python parts to one location to avoid crowding the root or your new repo.
With just that one command, we have a functional web server that can be navigated to.
.. code-block:: shell
~$> cd gotham
~$> hm dev
The ``dev`` command should boot up your hive and start listening.
.. tip::
While this will be described better in the logging documentation, you should be able to find the output log of your hive wherever you're ``config/hive.py -> LOG_LOCATION`` is set.
Now, simply navigate to ``http://127.0.0.1:9476`` and you should be greeted with a simple (but noteworthy!) page.
A Quick Recap
-------------
- The ``hm new`` command created a "blank" hive with some defaults.
- The ``hm dev`` command starts the hive environment
- A basic web server is run and we were able to see the page!
.. note::
The web server you're seeing is actually your hive's ``RootController`` listening and responding to changes in your network. We'll describe this in greater detail later. This is a vital piece of the puzzle so remember the name!
Create a Node
=============
Okay, we have a network. Time to put some nodes on it! Enter the ``hm create_node`` command.
.. execute_code::
:hide_code:
:hide_headers:
from hivemind.util.cliparse import build_hivemind_parser
parser = build_hivemind_parser()
parser.subparser_map['create_node'].print_help()
.. code-block:: shell
~$> hm create_node BatSignal
Once run, you should see the following in your hive.
.. code-block:: text
nodes/
`- __init__.py
`- batsignal
`- __init__.py
`- batsignal.py
And ``batsignal.py`` should look something like:
.. code-block:: python
"""
BatSignal node for gotham
"""
from hivemind import _Node
class BatSignal(_Node):
"""
BatSignal Implementation
"""
def services(self) -> None:
"""
Register any default services
:return: None
"""
super().services()
def subscriptions(self) -> None:
"""
Register any default subscriptions
:return: None
"""
super().subscriptions()
if __name__ == '__main__': # pragma: no cover
# An initialization command.
BatSignal.exec_(
name="mynode",
logging='verbose'
)
That's a self contained node that can be run but, at the moment, it doesn't do anything.
Enable The Node
---------------
If we run ``hm dev`` right now, nothing will have changed. We need to tell the hive to load the ``BatSignal`` by default.
In the ``nodes/__init__.py`` file you should see the following:
.. code-block:: python
# from .batsignal import batsignal
Simply uncommenting that grants access to the components powering your hive. This is a pretty brute force way to (dis|en)able nodes.
Add a Service
-------------
.. tip::
**What is a Service?**
    A service is how we communicate observed data to our network. A service executes on its own thread and, philosophically, relies on nothing but itself. This doesn't mean it shouldn't interact with other data but, at all costs, we avoid synchronous waiting patterns.
With the node enabled, running ``hm dev`` will start up the controller and our node, but until we add services or subscriptions, our node doesn't serve any purpose. To get the Dark Knight to save us, we'll need to send him a message whenever a crisis occurs. A perfect situation for a service.
.. code-block:: python
class BatSignal(_Node):
# ...
def services(self) -> None:
"""
Add the bat signal service!
"""
self._bat_signal_service = self.add_service(
name='bat-signal-main',
function=self._bat_signal
)
def _bat_signal(self, service) -> int:
"""
Run the service! By default, this is run on a
loop forever!
:param service: The service instance we created above
:return: int (0 means continue, otherwise abort)
"""
# Send a signal to the big man himself. (JSON compliant payload)
service.send('Batman! We need help!')
# As lowly cityfolk, we can do nothing but wait until
# the next crisis.
service.sleep_for(5.0)
return 0
A few important things in there.
1. We defined our first ``_Service`` with the ``self.add_service`` function.
- ``name``: A name for our service (unique to the node class)
- ``function``: The callback that gets run in a loop for the rest of the _Node's life
2. Within the callback function, we sent a message through our service to alert anyone that's listening
3. To avoid spamming the controller and subsequent subscriptions to this service, we have the service sleep
- We do this with ``service.sleep_for`` as it has the ability to wake up gracefully when shutting down
.. warning::
    Avoid ``time.sleep`` in a service callback! There's no benefit over ``_Service.sleep_for``, and it can
    cause your program to stall out for a length of time.
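The reason ``sleep_for`` beats ``time.sleep`` can be seen in a minimal sketch using a ``threading.Event`` — this illustrates the interruptible-sleep pattern in general, not hivemind's actual ``_Service`` internals:

```python
import threading

class StoppableService:
    """Minimal sketch of a service whose sleep wakes up on shutdown."""

    def __init__(self):
        self._shutdown = threading.Event()

    def sleep_for(self, seconds: float) -> None:
        # Blocks at most `seconds`, but returns the moment shutdown is set.
        self._shutdown.wait(timeout=seconds)

    def stop(self) -> None:
        self._shutdown.set()

svc = StoppableService()
svc.stop()           # simulate a shutdown signal from another thread
svc.sleep_for(60.0)  # returns immediately instead of blocking for a minute
print("woke up without waiting")
```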
Logs
----
With the service looking good, we can start our development environment with ``hm dev``. At the moment, nothing new will appear to happen. Our service should be transmitting a signal every 5 seconds, but we don't have anything to really look at. To see what's happening at a log level, navigate to where your ``config/hive.py -> LOG_LOCATION`` points you. (This will be different on different platforms).
You should see two logs. One called ``root.log`` and another called ``batsignal.log`` (or whatever you called the node class). The logging utilities within hivemind route to the corresponding log file to keep them from all becoming one giant mess. In the future ``hm`` may gain the ability to merge the log based on the timestamps to help improve with time debugging.
.. tip::
**Verbose** To enable verbose output of the nodes, use the ``-v`` flag. This will be reflected in the logs, not on the standard output
.. code-block:: shell
~$> hm dev -v
Add a Subscription
------------------
The Batman is always vigilant, ready to save the day. A subscription is no different.
Now, we could add a subscription to our ``BatSignal`` node that listens for the ``"bat-signal-main"`` command and responds to the crisis. However, we can scale the number of Bat Signals and Bat...men(?) to any extent if we separate them. Herein lies one of the cornerstones of ``hivemind``.
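This scaling works because subscription names like ``"bat-signal-*"`` are glob-style patterns, so one subscription can match any number of similarly named services. A sketch of that matching using the stdlib's ``fnmatch`` (illustrative — hivemind's actual matcher may differ):

```python
from fnmatch import fnmatch

SUBSCRIPTION_PATTERN = "bat-signal-*"

# Any service whose name matches the pattern triggers the subscription.
services = ["bat-signal-main", "bat-signal-east", "weather-main"]
matched = [name for name in services if fnmatch(name, SUBSCRIPTION_PATTERN)]
print(matched)  # ['bat-signal-main', 'bat-signal-east']
```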
First, let's create the node to host our subscription.
.. code-block:: shell
~$> hm create_node batman
Now, we should have the following:
.. code-block:: text
nodes/
`- __init__.py
`- batman/
`- __init__.py
`- batman.py
`- batsignal/
` - ...
Within the node definition (``nodes/batman/batman.py``) we can set up the subscription.
.. code-block:: python
class Batman(_Node):
# ...
def subscriptions(self) -> None:
"""
Stay vigilant caped crusader
"""
self._bat_signal_subscription = self.add_subscription(
name='bat-signal-*',
function=self._go_save_the_day
)
def _go_save_the_day(self, payload) -> int:
"""
Here, we can save the day.
:param payload: The message data sent from a service who's name matches
the subscription name pattern. In this case, anything
matching "bat-signal-*"
"""
if not isinstance(payload, str):
self.log_warning(f'Unknown payload type: {type(payload)}')
return 0
if payload == "Batman! We need help!":
print ('Batman has saved the day!') # To print to our stdout
else:
print ('Oh no! Batman doesn\'t know what to do!')
return 0
With that node now ready to go, let's fire the whole system up!
.. code-block:: shell
~$> hm dev
If all goes well, you should see the text ``"Batman has saved the day!"`` every few seconds.
Next Steps
----------
We've done it! Created a ``hive``, created nodes, set up an interconnected network, and run that network in a development arena. All from the same terminal and without lifting *too* many fingers.
With this knowledge you can do plenty of powerful things but ``hivemind`` has quite a bit more to offer.
.. _Django: https://www.djangoproject.com/
| 33.311239 | 404 | 0.650921 |
42b1f3b57b3b6b6c1fc822e8b2d5fd2ffc1976d9 | 77 | rst | reStructuredText | doc/api/jetset.output.makedir.rst | AAGunya/jetset | 53cb0e3e1f308273f19fd4c9b288be12447fd43d | [
"BSD-3-Clause"
] | 16 | 2019-02-11T06:58:43.000Z | 2021-12-28T13:00:35.000Z | doc/api/jetset.output.makedir.rst | AAGunya/jetset | 53cb0e3e1f308273f19fd4c9b288be12447fd43d | [
"BSD-3-Clause"
] | 14 | 2019-04-14T14:49:55.000Z | 2021-12-27T04:18:24.000Z | doc/api/jetset.output.makedir.rst | AAGunya/jetset | 53cb0e3e1f308273f19fd4c9b288be12447fd43d | [
"BSD-3-Clause"
] | 10 | 2019-02-25T14:53:28.000Z | 2022-03-02T08:49:19.000Z | makedir
=======
.. currentmodule:: jetset.output
.. autofunction:: makedir
| 11 | 32 | 0.662338 |
4811f686af45c625df2a2430e1568f867d425141 | 7,159 | rst | reStructuredText | servers/neurohs/doc/swrs.rst | arpitgogia/mars_city | 30cacd80487a8c2354bbc15b4fad211ed1cb4f9d | [
"BSD-2-Clause-FreeBSD"
] | 25 | 2016-07-20T04:49:14.000Z | 2021-08-25T09:05:04.000Z | servers/neurohs/doc/swrs.rst | arpitgogia/mars_city | 30cacd80487a8c2354bbc15b4fad211ed1cb4f9d | [
"BSD-2-Clause-FreeBSD"
] | 16 | 2016-12-27T08:30:27.000Z | 2018-06-18T08:51:44.000Z | servers/neurohs/doc/swrs.rst | arpitgogia/mars_city | 30cacd80487a8c2354bbc15b4fad211ed1cb4f9d | [
"BSD-2-Clause-FreeBSD"
] | 49 | 2016-07-20T13:08:27.000Z | 2020-06-02T18:26:12.000Z | ================================================================
Software Requirements Specification for the Neuro Headset Server
================================================================
:Author: Ezio Melotti
Change Record
=============
.. If the changelog is saved on an external file (e.g. in servers/sname/NEWS),
it can be included here by using (dedent to make it work):
.. literalinclude:: ../../servers/servername/NEWS
Introduction
============
Purpose
-------
This document describes the software requirements specification for the neuro
headset server.
Scope
-----
Describes the scope of this requirements specification.
Applicable Documents
--------------------
Reference Documents
-------------------
Glossary
--------
.. To create a glossary use the following code (dedent it to make it work):
.. glossary::
``Term``
This is a sample term
.. Use the main :ref:`glossary` for general terms, and :term:`Term` to link
to the glossary entries.
Overview
--------
The package will initially provide low-level access to the data sent by the
neuro headset. The package will also provide training software and ways to
define an higher-level interface used to control several different devices.
General Description
===================
Problem Statement
-----------------
In a manned mission on Mars, astronauts need to control a number of devices
(e.g. a Mars rover) but often have limited mobility. Ideally, the best
approach consists of hands-free input control, such as voice commands or
brain waves. These input methods would allow astronauts to operate devices
without needing specific hardware (e.g. a joystick), and even in situations
of limited mobility (e.g. while wearing a space suit).
While these input methods clearly have advantages, they might not be as
accurate as traditional input methods.
Functional Description
----------------------
The package will offer an interface between Tango and the EPOC neuro headset.
Developers can use then use the data provided by the package to control
different kind of devices.
Environment
-----------
The neuro headset can be used in the habitat or even during EVAs while
wearing a space suit, but will require a nearby computer that will receive
and process the signal.
User objectives
---------------
User1
~~~~~
Describe all the users and their expectations for this package
Constraints
-----------
Describe any constraints that are placed on this software.
Functional Requirements
=======================
This section lists the functional requirements in ranked order. Functional
requirements describe the possible effects of a software system, in other
words, what the system must accomplish. Other kinds of requirements (such as
interface requirements, performance requirements, or reliability requirements)
describe how the system accomplishes its functional requirements.
Each functional requirement should be specified in a format similar to the
following:
Requirement
-----------
Description
~~~~~~~~~~~
Criticality
~~~~~~~~~~~
* High | Normal | Low
Dependency
~~~~~~~~~~
Indicate if this requirement is dependant on another.
Interface Requirements
======================
This section describes how the software interfaces with other software products
or users for input or output. Examples of such interfaces include library
routines, token streams, shared memory, data streams, and so forth.
User Interfaces
---------------
Describes how this product interfaces with the user.
GUI (Graphical User Interface)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CLI (Command Line Interface)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
API (Application Programming Interface)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The Tango server will provide a low-level API to access the raw data for
the neuro headset. This low level API, in combination with a training
software, will be used to create higher-level APIs, possibly as new servers.
The actual APIs are still to be determined.
Diagnostics
~~~~~~~~~~~
Describes how to obtain debugging information or other diagnostic data.
Hardware Interfaces
-------------------
A high level description (from a software point of view) of the hardware
interface if one exists. This section can refer to an ICD (Interface Control
Document) that will contain the detailed description of this interface.
Software Interfaces
-------------------
A high level description (from a software point of view) of the software
interface if one exists. This section can refer to an ICD (Interface Control
Document) that will contain the detailed description of this interface.
Communication Interfaces
------------------------
Describe any communication interfaces that will be required.
Performance Requirements
========================
Specifies speed and memory requirements.
Development and Test Factors
============================
Standards Compliance
--------------------
Mention to what standards this software must adhere to.
Hardware Limitations
--------------------
Describe any hardware limitations if any exist.
Software validation and verification
------------------------------------
Give a detailed requirements plan for how the software will be tested and
verified.
Planning
--------
Describe the planning of the whole process mentioning major milestones and
deliverables at these milestones.
Use-Case Models
===============
If UML Use-Case notation is used in capturing the requirements, these models
can be inserted and described in this section. Also providing references in
paragraphs 5, 6 and 7 where applicable.
Notes
=====
.. notes can be handled automatically by Sphinx
Appendix A: Use Case template
=============================
Use Case: Controlling a rover with the neuro headset
====================================================
The user wants to control a rover using the neuro headset.
Actors
------
User, rover.
Priority
--------
Normal
Preconditions
-------------
The user should be wearing a charged neuro headset and be within wireless
range of a computer that will receive and process the data, making them
available on Tango. The user might be required to do a training before
being able to use the neuro headset successfully.
Basic Course
------------
1. If the user didn't do the training yet, he should do it in order to
associate specific thoughts to specific movements.
2. After the training, he should be able to just think at the movements the
rover should do.
3. The server will process the inputs sent by the headset and convert them
to higher-level signals, according to the data collected during the
training.
4. The higher level signals can be accessed by other servers (e.g. the rover
server) and used to determine what actions should be taken.
Alternate Course
----------------
None
Exception Course
----------------
If any of the preconditions are not met, the user should make sure to
address the problems before continuing with the basic course.
Postconditions
--------------
At the end of the session the user should turn off and remove the neuro
headset, and recharge it if needed.
Notes
-----
None
.. source file: sphinx/source/docs/releases.rst (repo: g-parki/bokeh)

.. _releases:
.. currently, in order to limit the max toc depth to 1 in the right hand side
.. menu, there is a bit of JavaScript in the layout.html to remove the toc-h3
.. elements just on this page.
Releases
########
.. bokeh-releases::
.. source file: notes/issues/tboolean_box_fail.rst (repo: hanswenzel/opticks)

tboolean_box_fail
====================
::
O[blyth@localhost tests]$ ./tboolean_box.sh
2021-06-05 01:46:38.501 INFO [65473] [CDetector::traverse@124] [
2021-06-05 01:46:38.501 INFO [65473] [CDetector::traverse@132] ]
2021-06-05 01:46:38.501 FATAL [65473] [Opticks::setSpaceDomain@3263] changing w 60000 -> 451
OKG4Test: /home/blyth/opticks/cfg4/CMaterialBridge.cc:101: void CMaterialBridge::initMap(): Assertion `m_g4toix.size() == nmat_mlib' failed.
(gdb) bt
#3 0x00007fffe5734252 in __assert_fail () from /lib64/libc.so.6
#4 0x00007ffff4ab0320 in CMaterialBridge::initMap (this=0xa279480) at /home/blyth/opticks/cfg4/CMaterialBridge.cc:101
#5 0x00007ffff4aafc14 in CMaterialBridge::CMaterialBridge (this=0xa279480, mlib=0x8e85ef0) at /home/blyth/opticks/cfg4/CMaterialBridge.cc:41
#6 0x00007ffff4a85477 in CGeometry::postinitialize (this=0x91a8c60) at /home/blyth/opticks/cfg4/CGeometry.cc:143
#7 0x00007ffff4af5d94 in CG4::postinitialize (this=0x8fb2a40) at /home/blyth/opticks/cfg4/CG4.cc:249
#8 0x00007ffff4af5aef in CG4::initialize (this=0x8fb2a40) at /home/blyth/opticks/cfg4/CG4.cc:225
#9 0x00007ffff4af583a in CG4::init (this=0x8fb2a40) at /home/blyth/opticks/cfg4/CG4.cc:195
#10 0x00007ffff4af55e0 in CG4::CG4 (this=0x8fb2a40, hub=0x703c50) at /home/blyth/opticks/cfg4/CG4.cc:186
#11 0x00007ffff7baf89d in OKG4Mgr::OKG4Mgr (this=0x7fffffff4440, argc=33, argv=0x7fffffff4788) at /home/blyth/opticks/okg4/OKG4Mgr.cc:107
#12 0x00000000004038ba in main (argc=33, argv=0x7fffffff4788) at /home/blyth/opticks/okg4/tests/OKG4Test.cc:27
(gdb)
One material different?::
2021-06-05 02:01:44.672 INFO [89720] [CMaterialBridge::initMap@106]
nmat (G4Material::GetNumberOfMaterials) 3 nmat_mlib (GMaterialLib::getNumMaterials) materials used by geometry 4
i 0 name Rock shortname Rock abbr Rock index 2 mlib_unset 0
i 1 name Vacuum shortname Vacuum abbr Vacuum index 3 mlib_unset 0
i 2 name GlassSchottF2 shortname GlassSchottF2 abbr GlassSchottF2 index 0 mlib_unset 0
nmat 3 nmat_mlib 4 m_g4toix.size() 3 m_ixtoname.size() 3 m_ixtoabbr.size() 3
OKG4Test: /home/blyth/opticks/cfg4/CMaterialBridge.cc:112: void CMaterialBridge::initMap(): Assertion `m_g4toix.size() == nmat_mlib' failed.
Program received signal SIGABRT, Aborted.
This issue looks to be very specific to this test geometry, so it's non-urgent.
# source file: route.rest (repo: UIHacks2021-byke/byke-rest-api)

GET http://192.168.33.10:8080/users
###
GET http://192.168.33.10:8080/users/616c2c19bcb25ac2159c1596
###
POST http://127.0.0.1:8080/users
Content-Type: application/json
{
"name": "Uzumaki Naruto",
"username": "copy_ninja",
"password": "3herwkrnke",
"email": "olamideumarq@gmail.com"
}
###
DELETE http://192.168.33.10:8080/users/616c44430aca47261e8544a5
###
PATCH http://192.168.33.10:8080/users/616c4478d4de4ff04da32950
Content-Type: application/json
{
"name": "Uzumaki Boruto",
"password": "jougan"
}
###
fetch("http://192.168.33.10:8080/users", {
method: "GET",
headers: {
accept: "application/json",
},
})
.then((result) => result.json())
.then((json) => console.log(json));
.. source file: README.rst (repo: dcloud/radiotelephony)

==============
radiotelephony
==============
Translates words to callsigns using the `NATO phonetic alphabet <https://en.wikipedia.org/wiki/NATO_phonetic_alphabet>`_, aka the International Radiotelephony Spelling Alphabet.
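The translation itself is simple enough to sketch without the package. The helper below is an illustrative stand-in, not this project's actual API (whose function names are not shown here); it covers letters only:

```python
# Minimal NATO phonetic spelling, independent of the radiotelephony package's API.
NATO = {
    "a": "Alfa", "b": "Bravo", "c": "Charlie", "d": "Delta", "e": "Echo",
    "f": "Foxtrot", "g": "Golf", "h": "Hotel", "i": "India", "j": "Juliett",
    "k": "Kilo", "l": "Lima", "m": "Mike", "n": "November", "o": "Oscar",
    "p": "Papa", "q": "Quebec", "r": "Romeo", "s": "Sierra", "t": "Tango",
    "u": "Uniform", "v": "Victor", "w": "Whiskey", "x": "Xray",
    "y": "Yankee", "z": "Zulu",
}

def to_callsign(word: str) -> str:
    """Spell out each letter of *word* using the NATO phonetic alphabet."""
    return " ".join(NATO[ch] for ch in word.lower() if ch in NATO)

print(to_callsign("ACE"))  # Alfa Charlie Echo
```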
.. source file: docs/Network.rst (repo: mstepovanyy/python-training)
.. contents::
.. sectnum::
Ethernet packet frame
=====================
.. image:: img/ethernet_frame.jpg
ARP packet
==========
ARP determines the MAC address corresponding to a known IP address.
+-------------------------+--------------+
| On image | Description |
+=========================+==============+
| Sender Hardware Address | sender MAC |
+-------------------------+--------------+
| Sender IP Address | sender IP |
+-------------------------+--------------+
| Target Hardware Address | All zero |
+-------------------------+--------------+
| Target IP Address | receiver IP |
+-------------------------+--------------+
.. image:: img/arp.png
Video `arp <https://www.youtube.com/watch?v=aamG4-tH_m8>`_.
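The field layout in the table above can be made concrete by packing an ARP request by hand. This is an illustrative sketch using only Python's standard library; the MAC and IP addresses are made-up example values:

```python
import struct

def arp_request(sender_mac: bytes, sender_ip: bytes, target_ip: bytes) -> bytes:
    """Build a 28-byte Ethernet/IPv4 ARP request.

    As in the table above, the target hardware address is all zeros,
    because it is exactly what the request is asking for.
    """
    return struct.pack(
        "!HHBBH6s4s6s4s",
        1,            # hardware type: Ethernet
        0x0800,       # protocol type: IPv4
        6,            # hardware address length (MAC)
        4,            # protocol address length (IPv4)
        1,            # opcode: 1 = request, 2 = reply
        sender_mac,   # sender hardware address
        sender_ip,    # sender IP address
        b"\x00" * 6,  # target hardware address: unknown, all zeros
        target_ip,    # target IP address
    )

pkt = arp_request(
    b"\xaa\xbb\xcc\xdd\xee\xff",       # example sender MAC
    bytes([192, 168, 33, 1]),          # example sender IP
    bytes([192, 168, 33, 10]),         # example target IP
)
print(len(pkt))  # 28
```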
TCP
===
.. image:: img/tcp.gif
UDP
===
.. image:: img/udp.jpg
OSI
===
.. image:: img/osi.gif
NAS
===
Check `nas <https://www.youtube.com/watch?v=01ajHxPLxAw>`_.
.. source file: docs/source/endpoints/async_cloud.rst (repo: gmerz/MatterApi)

Cloud
-----
.. autoclass:: matterapi.endpoints.async_api.CloudApi
    :members:
    :undoc-members:
.. source file: doc/source/sdk/meta/tag/uint32.rst (repo: brycelelbach/nt2)

.. _tag_uint32_:
uint32
=======
.. index::
    single: uint32_ (tag)
    single: tag; uint32_
    single: uint32_ (meta)
    single: meta; uint32_
Description
^^^^^^^^^^^
Tag type used in |nt2| hierarchical overload resolution for discriminating
unsigned integers of 32 bits.
Header File
^^^^^^^^^^^
.. code-block:: cpp

    #include <nt2/sdk/meta/hierarchy.hpp>
Synopsis
^^^^^^^^
.. code-block:: cpp

    namespace nt2
    {
      namespace tag
      {
        struct uint32_;
      }

      namespace meta
      {
        template<typename T>
        struct uint32_;
      }
    }
.. seealso::

    :ref:`sdk_tags`
.. source file: docs/API/reporting_functions.rst (repo: MSLNZ/GTC)

.. _reporting_functions:
===================
Reporting functions
===================
This module provides functions to facilitate the reporting of information
about calculations.
The shorter name ``rp`` has been defined as an alias for :mod:`reporting`,
to resolve the names of objects defined in this module.
.. automodule:: reporting
    :members:
    :inherited-members:
.. source file: README.rst (repo: munikarmanish/spark-todo)

==========
To-do List
==========
This is a simple database-driven web app written in the `Java <//java.com>`_ programming language, using the `Spark <//sparkjava.com>`_ framework.
Requirements
------------
You need to create two databases named **todo** and **todo_test** and import the file **todo.sql** into each of them.
Other requirements:
- Gradle
- JDK/JRE >= 7
Usage
-----
To run the local development server, run::

    $ gradle run

and then visit http://0.0.0.0:4567 in the browser.

To run the unit tests, run::

    $ gradle test
Reference
---------
- Awesome Java tutorials at https://www.learnhowtoprogram.com/java
.. source file: README.rst (repo: Thundzz/python-sqs-listener)

AWS SQS Listener
----------------
.. image:: https://img.shields.io/pypi/v/pySqsListener.svg?style=popout
    :alt: PyPI
    :target: https://github.com/jegesh/python-sqs-listener

.. image:: https://img.shields.io/pypi/pyversions/pySqsListener.svg?style=popout
    :alt: PyPI - Python Version
    :target: https://pypi.org/project/pySqsListener/
This package takes care of the boilerplate involved in listening to an SQS
queue, as well as sending messages to a queue. Works with python 2.7 & 3.6+.
Installation
~~~~~~~~~~~~
``pip install pySqsListener``
Listening to a queue
~~~~~~~~~~~~~~~~~~~~
| Using the listener is very straightforward - just inherit from the
``SqsListener`` class and implement the ``handle_message()`` method.
The queue will be created at runtime if it doesn't already exist.
You can also specify an error queue to automatically push any errors to.
Here is a basic code sample:
**Standard Listener**
::
    from sqs_listener import SqsListener

    class MyListener(SqsListener):
        def handle_message(self, body, attributes, messages_attributes):
            run_my_function(body['param1'], body['param2'])

    listener = MyListener('my-message-queue', error_queue='my-error-queue', region_name='us-east-1')
    listener.listen()
**Error Listener**
::
    from sqs_listener import SqsListener

    class MyErrorListener(SqsListener):
        def handle_message(self, body, attributes, messages_attributes):
            save_to_log(body['exception_type'], body['error_message'])

    error_listener = MyErrorListener('my-error-queue')
    error_listener.listen()
| The options available as ``kwargs`` are as follows:
- error_queue (str) - name of queue to push errors.
- force_delete (boolean) - delete the message received from the queue, whether or not the handler function is successful. By default the message is deleted only if the handler function returns with no exceptions
- interval (int) - number of seconds in between polls. Set to 60 by default
- visibility_timeout (str) - Number of seconds the message will be invisible ('in flight') after being read. After this interval it reappears in the queue if it wasn't deleted in the meantime. Set to '600' (10 minutes) by default
- error_visibility_timeout (str) - Same as previous argument, for the error queue. Applicable only if the ``error_queue`` argument is set, and the queue doesn't already exist.
- wait_time (int) - number of seconds to wait for a message to arrive (for long polling). Set to 0 by default to provide short polling.
- max_number_of_messages (int) - Max number of messages to receive from the queue. Set to 1 by default, max is 10
- message_attribute_names (list) - message attributes by which to filter messages
- attribute_names (list) - attributes by which to filter messages (see boto docs for difference between these two)
- region_name (str) - AWS region name (defaults to ``us-east-1``)
- queue_url (str) - overrides ``queue`` parameter. Mostly useful for getting around `this bug <https://github.com/aws/aws-cli/issues/1715>`_ in the boto library
- deserializer (function str -> dict) - Deserialization function that will be used to parse the message body. Set to python's ``json.loads`` by default.
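As an illustration of the ``deserializer`` hook, the function below is a hypothetical example (it is not part of the library, which simply uses ``json.loads`` by default); it tolerates non-JSON bodies instead of raising:

```python
import json

def lenient_deserializer(raw_body: str) -> dict:
    """Parse a JSON message body; wrap anything unparseable instead of failing.

    Hypothetical helper for illustration only -- not shipped with pySqsListener.
    """
    try:
        parsed = json.loads(raw_body)
    except json.JSONDecodeError:
        # Keep the raw text so the handler can still inspect it.
        return {"raw": raw_body}
    # Ensure a dict even if the body was a bare JSON scalar or list.
    return parsed if isinstance(parsed, dict) else {"value": parsed}
```

It would then be passed to the listener constructor as ``deserializer=lenient_deserializer``.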
Running as a Daemon
~~~~~~~~~~~~~~~~~~~
| Typically, in a production environment, you'll want to listen to an SQS queue with a daemonized process.
The simplest way to do this is by running the listener in a detached process. On a typical Linux distribution it might look like this:
|
``nohup python my_listener.py > listener.log &``
| And saving the resulting process id for later (for stopping the listener via the ``kill`` command).
|
A more complete implementation can be achieved easily by inheriting from the package's ``Daemon`` class and overriding the ``run()`` method.
|
| The sample_daemon.py file in the source root folder provides a clear example for achieving this. Using this example,
you can run the listener as a daemon with the command ``python sample_daemon.py start``. Similarly, the command
``python sample_daemon.py stop`` will stop the process. You'll most likely need to run the start script using ``sudo``.
|
Logging
~~~~~~~
| The listener and launcher instances push all their messages to a ``logger`` instance, called 'sqs_listener'.
In order to view the messages, the logger needs to be redirected to ``stdout`` or to a log file.
|
| For instance:
::
    logger = logging.getLogger('sqs_listener')
    logger.setLevel(logging.INFO)

    sh = logging.StreamHandler(sys.stdout)
    sh.setLevel(logging.INFO)
    formatstr = '[%(asctime)s - %(name)s - %(levelname)s] %(message)s'
    formatter = logging.Formatter(formatstr)
    sh.setFormatter(formatter)
    logger.addHandler(sh)
|
| Or to a log file:
::
    logger = logging.getLogger('sqs_listener')
    logger.setLevel(logging.INFO)

    sh = logging.FileHandler('mylog.log')
    sh.setLevel(logging.INFO)
    formatstr = '[%(asctime)s - %(name)s - %(levelname)s] %(message)s'
    formatter = logging.Formatter(formatstr)
    sh.setFormatter(formatter)
    logger.addHandler(sh)
Sending messages
~~~~~~~~~~~~~~~~
| In order to send a message, instantiate an ``SqsLauncher`` with the name of the queue. By default an exception will
be raised if the queue doesn't exist, but it can be created automatically if the ``create_queue`` parameter is
set to true. In such a case, there's also an option to set the newly created queue's ``VisibilityTimeout`` via the
third parameter. It is possible to provide a ``serializer`` function if custom types need to be sent. This function
expects a dict object and should return a string. If not provided, python's ``json.dumps`` is used by default.
|
| After instantiation, use the ``launch_message()`` method to send the message. The message body should be a ``dict``,
and additional kwargs can be specified as stated in the `SQS docs
<http://boto3.readthedocs.io/en/latest/reference/services/sqs.html#SQS.Client.send_message>`_.
The method returns the response from SQS.
**Launcher Example**
::
    from sqs_launcher import SqsLauncher

    launcher = SqsLauncher('my-queue')
    response = launcher.launch_message({'param1': 'hello', 'param2': 'world'})
Important Notes
~~~~~~~~~~~~~~~
- The environment variable ``AWS_ACCOUNT_ID`` must be set, in addition
to the environment having valid AWS credentials (via environment variables
or a credentials file) or if running in an aws ec2 instance a role attached
with the required permissions.
- For both the main queue and the error queue, if the queue doesn’t
exist (in the specified region), it will be created at runtime.
- The error queue receives only two values in the message body: ``exception_type`` and ``error_message``. Both are of type ``str``
- If the function that the listener executes involves connecting to a database, you should explicitly close the connection at the end of the function. Otherwise, you're likely to get an error like this: ``OperationalError(2006, 'MySQL server has gone away')``
- Either the queue name or the queue url should be provided. When both are provided the queue url is used and the queue name is ignored.
Contributing
~~~~~~~~~~~~
Fork the repo and make a pull request.
.. source file: doc/Changelog/8.0/Deprecation-72856-RemovedRTEModesOption.rst (repo: DanielSiepmann/typo3scan)
.. include:: ../../Includes.txt
================================================
Deprecation: #72856 - Removed RTE "modes" option
================================================
See :issue:`72856`
Description
===========
The RTE "modes" option that was added to a RTE enabled TCA field in the "defaultExtras"
section has been removed.
The RTE is now loaded via the configuration from TSconfig, usually set by "modes"
or "overruleMode" (used by default), and loaded even without the RTE mode set in
the TCA field defaultExtras section.
Impact
======
Extension authors do not need to set the defaultExtras "mode=ts_css" parameter explicitly.
Migration
=========
When configuring an RTE field in a TYPO3 extension, the defaultExtras part should be
set to ``richtext:rte_transform`` instead of ``richtext:rte_transform[mode=ts_css]``
in order to render the RTE.
Flexform
--------
Example for an RTE Field, used in a Flexform with CMS 8 after migration
.. code-block:: xml
    <text>
        <TCEforms>
            <label>LLL:EXT:extension_name/Resources/Private/Language/locallang_db.xlf:flexform.text.element.labelname</label>
            <config>
                <type>text</type>
                <size>10</size>
                <rows>5</rows>
                <enableRichtext>true</enableRichtext>
            </config>
            <defaultExtras>
                <richtext>rte_transform</richtext>
            </defaultExtras>
        </TCEforms>
    </text>
.. index:: TSConfig, Backend, RTE
.. source file: py/docs/source/webdriver/selenium.webdriver.common.by.rst (repo: TamsilAmani/selenium)

selenium.webdriver.common.by
============================
.. automodule:: selenium.webdriver.common.by
.. rubric:: Classes
.. autosummary::

    By
.. source file: content/docs/users-guide/getting-help.rst (repo: pacha/vem-doc)
.. role:: key
.. default-role:: key
Getting Help
============
Unintended changes
""""""""""""""""""
It can happen —especially when learning Vem, but also later too— that you
mistakenly press a different key than the one you really intended to use. Since
every key has an associated action, you may end up in an unexpected place of
your document or modify it in an unintentional way.

If this happens, just jump back to where you were or undo the latest action with:
:`q`: undo last change
:`R`: jump to previous position
`q` undoes the last change and jumps back too, so you can use it to make sure
that you didn't modify anything by mistake; if the latest change was actually
correct, you can just redo it with `Q`.
In any case, you can also check the timestamps of the latest changes to see, at
any time, when the last change to the document happened::

    :undolist
Unexpected behavior
"""""""""""""""""""
If you find that a key, or a key combination, does something unexpected, it may
be that a plugin is overriding Vem's default behavior. Most plugins define
their custom keyboard mappings using the ``<leader>`` key (usually `\\`) as a
prefix, or allow you to define your own mappings, to avoid conflicts with
existing key mappings. However, if a plugin defines a new behavior for a key
that shadows Vem's default one, it may be difficult to detect until you try to
use that key.
In any case, if you find a key that performs a different action from what you
expect, you can check how its action was mapped with::

    :verbose map KEY
where ``KEY`` is:
* A lower or uppercase letter.
* A `Ctrl` key combination, specified as ``<C-f>``, ``<C-t>``, ...
* A special key: ``<Tab>``, ``<Space>``, ``<Enter>`` or ``<BS>`` (for
backspace).
``map`` will list mappings in normal and visual modes, but you can also query
the mappings of a specific mode with:
* ``nmap`` for normal mode
* ``xmap`` for visual mode
* ``imap`` for insert mode
The output of the command is something like::

    x f <Plug>vem_change-
        Last set from /usr/local/share/vem10/src/mappings.vim line 226
    n f <Plug>vem_change_word-
        Last set from /usr/local/share/vem10/src/mappings.vim line 221
The first column is the mode (``n``: normal, ``x``: visual, ``i``: insert), the
second the key and the third the associated action. Finally, a line is included
that specifies where the mapping was defined. This can help to identify which
plugin is redefining a key.
All actions set by Vem are prefixed with ``<Plug>vem``.
Getting help
""""""""""""
You can get help about any topic using the command line. Just type::

    :help <topic>
With `Tab` you can autocomplete the topic string.
You can use the ``help`` command to find more information about configuration
options, commands or mappings. For example, you can get more information about
the ``:sort`` ex-command with::

    :help :sort
After executing this command, the screen will be split showing your current
document and a new `window </docs/users-guide/windows.html>`__ displaying the
contents of help. This is a regular editor window and you can browse its
contents with the usual movement keys. In particular, you'll see that some terms
are highlighted. Vim help files use `tags </docs/users-guide/tags.html>`__ to
hyperlink documents together. Therefore, you can:
* jump to a topic by placing the cursor on top of a highlighted term and
pressing `Space` `o`
* jump back to the previous topic with `Space` `i`
* close the window with `x`
.. Note:: The key command information provided by ``:help`` is the one related
to the original Vim ones, not Vem's. To get a description of Vem commands use
this tutorial, visit the `User's guide </docs/users-guide/index.html>`__ or
check the `Key command cheat sheets </docs/cheat-sheets/index.html>`__.
.. container:: browsing-links
« `Vim Compatibility </docs/users-guide/vim-compatibility.html>`_
.. source file: doc/user/events/guides/mixed_taxation.rst (repo: fabm3n/pretix)

Use case: Mixed taxation
------------------------
Let's say you are a charitable organization in Germany and are allowed to charge a reduced tax rate of 7% for your educational event. However, if your event includes a significant amount of food, you might need to charge a 19% tax rate on that portion. For example, your desired tax structure might then look like this:
* Conference ticket price: € 450 (incl. € 150 for food)

  * incl. € 19.63 VAT at 7%
  * incl. € 23.95 VAT at 19%
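These included-VAT amounts follow directly from the gross portions (€ 300 of the ticket at 7%, € 150 of food at 19%); a quick sanity check:

```python
# Included VAT = gross - gross / (1 + rate); figures from the example above.
def included_vat(gross: float, rate: float) -> float:
    return round(gross - gross / (1 + rate), 2)

print(included_vat(300, 0.07))  # 19.63 (ticket portion at 7%)
print(included_vat(150, 0.19))  # 23.95 (food portion at 19%)
```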
You can implement this in pretix using product bundles. In order to do so, you should create the following two products:
* Conference ticket at € 450 with a 7% tax rule
* Conference food at € 150 with a 19% tax rule and the option "**Only sell this product as part of a bundle**" set
In addition to your normal conference quota, you need to create an unlimited quota for the food product.
Then, head to the **Bundled products** tab of the "conference ticket" and add the "conference food" as a bundled product with a **designated price** of € 150.
Once a customer tries to buy the € 450 conference ticket, a sub-product will be added and the price will automatically be split into the two components, leading to a correct computation of taxes.
You can find more use cases in these specialized guides:
.. source file: content/missives/kosovo.rst (repo: scopatz/anthony.scopatz.com)

Kosovo
:date: 2008-02-17 16:12
:author: Anthony Scopatz
:category: missives
Kosovo declared independence.