# Snowpack PixiJS Boilerplate

### Technologies
- Yarn
- Snowpack
- PixiJS 5
- Babel
- Eslint
- WebRunner
# SO-GLR
Operating Systems assignments
# Skid Steer Drive Plugin
This plugin allows you to control a skid-steer drive robot and obtain information about its state.
The Skid Steer Drive plugin publishes and subscribes to the following Topics:
* Publications
* **odom** [`nav_msgs::msg::Odometry`] -- Publishes information provided by
the odometry sensor.
* **joint** [`sensor_msgs::msg::JointState`] -- Publishes the state of the
joints of the robot.
* Subscriptions
    * **cmd_vel** [`geometry_msgs::msg::Twist`] -- Subscribes to the commanded
    linear and angular velocity.
## How To Run the Skid Steer Drive Plugin
To run the plugin, you will need to use two terminals: one terminal to launch
Gazebo along with the plugin, and another terminal to run a Skid Steer Drive
publisher.
### First Terminal
To launch the plugin you will need to make sure Gazebo knows where it is
located. You can either set the `GAZEBO_PLUGIN_PATH` environment variable to
the plugin directory:
```bash
export GAZEBO_PLUGIN_PATH=/path/to/gazebo-dds-plugins/build/src/:$GAZEBO_PLUGIN_PATH
```
Or alternatively, add the full path as part of the plugin definition in the
`.world` file:
```xml
<plugin name="SkidSteerDrivePlugin"
filename="/path/to/skid_steer_drive/libDdsSkidSteerDrivePlugin.so">
<!-- ... -->
</plugin>
```
Once you have set the environment accordingly, execute Gazebo and pass the
demonstration world as a parameter:
```bash
gazebo ./resources/worlds/SkidSteerDrive.world --verbose
```
### Second Terminal
This plugin uses the publisher application included in the Differential Drive
plugin to send requests to the robot; that is, to send instructions in order
to move it.
You can run the publisher application as follows:
```bash
gazebo-dds-plugins/build/src/diff_drive/diffdrivepublisher \
-d <domain id> \
-t <topic name> \
-s "(<linear velocity in axis x> <angular velocity in axis z>)"
```
For more information on how to run the application, run `diffdrivepublisher -h`.
## Using Skid Steer Drive with Custom Worlds
To run the Skid Steer Drive Plugin with a custom world, add the following sdf
information within the `world` tag:
```xml
<plugin name="SkidSteerDrivePlugin"
filename="skid_steer_drive/libDdsSkidSteerDrivePlugin.so">
<dds_domain_id>0</dds_domain_id>
<topic_name_twist>cmd_vel</topic_name_twist>
<topic_name_odometry>odom</topic_name_odometry>
<topic_name_joint>joint</topic_name_joint>
<right_front_joint>right_front</right_front_joint>
<right_rear_joint>right_rear</right_rear_joint>
<left_front_joint>left_front</left_front_joint>
<left_rear_joint>left_rear</left_rear_joint>
<wheel_separation>0.4</wheel_separation>
<wheel_diameter>0.15</wheel_diameter>
<wheel_torque>5.0</wheel_torque>
<update_rate>2</update_rate>
<covariance_x>0.0001</covariance_x>
<covariance_y>0.0001</covariance_y>
<covariance_yaw>0.01</covariance_yaw>
</plugin>
```
In addition, you may specify a `dds_qos_profile_file` and `dds_qos_profile`
within the plugin tag to use a specific QoS profile instead of the default one.
```xml
<plugin name="SkidSteerDrivePlugin"
filename="skid_steer_drive/libDdsSkidSteerDrivePlugin.so">
    <dds_qos_profile_file>qos_profiles.xml</dds_qos_profile_file> <!-- example file name -->
    <dds_qos_profile>CustomQosProfile</dds_qos_profile> <!-- example profile name -->
    <!-- ... -->
</plugin>
```
] | null | null | null | ---
title: "Trans-ancestral dissection of urate- and gout-associated major loci SLC2A9 and ABCG2 reveals primate-specific regulatory effects"
collection: publications
permalink: /publication/2020-08-10-takei2021
excerpt: ''
date: 2020-08-10
venue: 'Journal of Human Genetics'
paperurl: 'https://www.nature.com/articles/s10038-020-0821-z'
citation: 'Riku Takei, Murray Cadzow, David Markie, Matt Bixley, Amanda Phipps-Green, Tanya J. Major, Changgui Li, Hyon K. Choi, Zhiqiang Li, Hua Hu, Eurogout Consortium, Hui Guo, Meian He, Yongyong Shi, Lisa K. Stamp, Nicola Dalbeth, Tony R. Merriman & Wen-Hua Wei (2021). "Trans-ancestral dissection of urate- and gout-associated major loci SLC2A9 and ABCG2 reveals primate-specific regulatory effects." <i>Journal of Human Genetics</i>. 66, pp161–169.'
---
# Abstract
Gout is a complex inflammatory arthritis affecting ~20% of people with an elevated serum urate level (hyperuricemia). Gout and hyperuricemia are essentially specific to humans and other higher primates, with varied prevalence across ancestral groups. <i>SLC2A9</i> and <i>ABCG2</i> are major loci associated with both urate and gout in multiple ancestral groups. However, fine mapping has been challenging due to extensive linkage disequilibrium underlying the associated regions. We used trans-ancestral fine mapping integrated with primate-specific genomic information to address this challenge. Trans-ancestral meta-analyses of GWAS cohorts of either European (EUR) or East Asian (EAS) ancestry resulted in single-variant resolution mappings for <i>SLC2A9</i> (rs3775948 for urate and rs4697701 for gout) and <i>ABCG2</i> (rs2622621 for gout). Tests of colocalization of variants in both urate and gout suggested existence of a shared candidate causal variant for <i>SLC2A9</i> only in EUR and for <i>ABCG2</i> only in EAS. The fine-mapped gout variant rs4697701 was within an ancient enhancer, whereas rs2622621 was within a primate-specific transposable element, both supported by functional evidence from the Roadmap Epigenomics project in human primary tissues relevant to urate and gout. Additional primate-specific elements were found near both loci and those adjacent to <i>SLC2A9</i> overlapped with known statistical epistatic interactions associated with urate as well as multiple super-enhancers identified in urate-relevant tissues. We conclude that by leveraging ancestral differences trans-ancestral fine mapping has identified ancestral and functional variants for <i>SLC2A9</i> or <i>ABCG2</i> with primate-specific regulatory effects on urate and gout.
[Download paper here](https://www.nature.com/articles/s10038-020-0821-z)
Recommended citation: Riku Takei, Murray Cadzow, David Markie, Matt Bixley, Amanda Phipps-Green, Tanya J. Major, Changgui Li, Hyon K. Choi, Zhiqiang Li, Hua Hu, Eurogout Consortium, Hui Guo, Meian He, Yongyong Shi, Lisa K. Stamp, Nicola Dalbeth, Tony R. Merriman & Wen-Hua Wei (2021). "Trans-ancestral dissection of urate- and gout-associated major loci SLC2A9 and ABCG2 reveals primate-specific regulatory effects." <i>Journal of Human Genetics</i>. 66, pp161–169.
# SFCStateViews
## Author
Bubnov Slavik, bubnovslavik@gmail.com
## License
SFCStateViews is available under the MIT license. See the LICENSE file for more info.
# wired-progress
Hand-drawn sketchy progress bar web component.
For a demo, and to view the complete set of wired-elements, see [wiredjs.com](http://wiredjs.com/)
## Usage
Add wired-elements to your JavaScript project:
```
npm i wired-elements
```
Import module in your code:
```javascript
import { WiredProgress } from 'wired-elements';
// or
import { WiredProgress } from 'wired-elements/lib/wired-progress.js';
```
Or load directly into your HTML page:
```html
<script type="module" src="https://unpkg.com/wired-elements/lib/wired-progress.js?module"></script>
```
Use it in your HTML:
```html
<wired-progress value="25"></wired-progress>
<wired-progress value="10" min="5" max="15"></wired-progress>
```
## Properties
**value** - Numeric value of the progress.
**min** - Minimum value. Default is 0.
**max** - Maximum value. Default is 100.
**percentage** - Boolean indicating if the label should show a % symbol.
## Custom CSS Variables
**--wired-progress-label-color** Color of the label. Default is *black*.
**--wired-progress-label-background** Background of the label. Default is *rgba(255,255,255,0.9)*.
**--wired-progress-font-size** Font size of the label. Default is *14px*.
**--wired-progress-color** Color of the progress bar. Default is *rgba(0, 0, 200, 0.8)*.
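For example, these variables can be set from a stylesheet scoped to the element. A sketch (the values are illustrative, not the defaults, and `percentage` is assumed to be settable as a boolean attribute):

```html
<style>
  wired-progress {
    --wired-progress-color: rgba(200, 0, 0, 0.8); /* bar color */
    --wired-progress-label-color: #222;           /* label text color */
    --wired-progress-font-size: 16px;             /* label font size */
  }
</style>
<wired-progress value="60" percentage></wired-progress>
```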
## License
[MIT License](https://github.com/rough-stuff/wired-elements/blob/master/LICENSE) (c) [Preet Shihn](https://twitter.com/preetster)

# Halacious for Hapi
A HAL processor for Hapi.
## Compatibility
Compatible with Hapi v18.
## Overview
*Halacious* is a plugin for the HapiJS web application server that makes **HATEOASIFYING** your app ridiculously
easy. When paired with a well-aged HAL client-side library, you will feel the warmth of loose API coupling and the feeling
of moral superiority as you rid your SPA of hard-coded api links.
Halacious removes the boilerplate standing between you and a RESTful application, allowing you to focus on your app's
secret sauce. Halacious embraces Hapi's configuration-centric approach to application scaffolding. Most common tasks can
be accomplished without writing any code at all.
## Features
- Dead-simple namespace/rel registration system that encourages you to document your API as you go
- Automatic api root REST endpoint generation that instantly gives you links to all top-level API endpoints
- Automatic rel documentation site generator so that your fully resolved rel names actually, you know, point somewhere.
- Automatic creation of curie links
- Support for relative and templated link hrefs.
- Auto wrapping of http response entities into HAL representations
- Support for custom object json serialization
- Support for programmatic configuration of HAL entities at the route or entity level
- Bunches of unit tests
## Getting Started
Start by npm-installing Halacious into your Hapi project folder:
```
npm install @modernpoacher/halacious --save
```
or
```
yarn add @modernpoacher/halacious
```
Register the plugin with the app server
```javascript
const hapi = require('@hapi/hapi');
const halacious = require('@modernpoacher/halacious');
const server = hapi.server({ port: 8080 });
async function init () {
await server.register(halacious)
server.route({
method: 'get',
path: '/hello/{name}',
handler: function (req) {
return { message: 'Hello, '+req.params.name };
}
});
await server.start()
console.info('Server started at %s', server.info.uri);
}
init();
```
Launch the server:
```
node ./examples/hello-world
```
Make a request
```
curl -H 'Accept: application/hal+json' http://localhost:8080/hello/world
```
See the response
```
{
"_links": {
"self": {
"href": "/hello/world"
}
},
"message": "Hello, world"
}
```
## Linking
Links may be declared directly within the route config.
```javascript
server.route({
method: 'get',
path: '/users/{userId}',
handler: function (req) {
return { id: req.params.userId, name: 'User ' + req.params.userId, googlePlusId: '107835557095464780852' };
},
config: {
plugins: {
hal: {
links: {
'home': 'http://plus.google.com/{googlePlusId}'
},
ignore: 'googlePlusId' // remove the id property from the response
}
}
}
});
```
```
curl -H 'Accept: application/hal+json' http://localhost:8080/users/100
```
will produce:
```
{
"_links": {
"self": {
"href": "/users/1234"
},
"home": {
"href": "http://plus.google.com/107835557095464780852"
}
},
"id": "100",
"name": "User 1234"
}
```
## Embedding
HAL allows you to conserve bandwidth by optionally embedding link payloads in the original response. Halacious will
automatically convert nested objects into embedded HAL representations (if you ask nicely).
```javascript
server.route({
method: 'get',
path: '/users/{userId}',
handler: function (req) {
return {
id: req.params.userId,
name: 'User ' + req.params.userId,
boss: {
id: 1234,
name: 'Boss Man'
}
};
},
config: {
plugins: {
hal: {
embedded: {
'boss': {
path: 'boss', // the property name of the object to embed
href: '../{item.id}'
}
}
}
}
}
});
```
```
curl -H 'Accept: application/hal+json' http://localhost:8080/users/100
{
"_links": {
"self": {
"href": "/users/100"
}
},
"id": "100",
"name": "User 100",
"_embedded": {
"boss": {
"_links": {
"self": {
"href": "/users/1234"
}
},
"id": 1234,
"name": "Boss Man"
}
}
}
```
## Programmatic configuration of HAL representations
You may find the need to take the wheel on occasion and directly configure outbound representations. For example,
some links may be conditional on potentially asynchronous criteria. Fortunately, Halacious provides two ways to do this:
1. By providing a `prepare()` function on the route's hal descriptor (or by assigning the function directly to the hal property)
2. By implementing a `toHal()` method directly on a wrapped entity.
In either case, the method signature is the same: `fn(rep, callback)` where
- `rep` - a representation object with the following properties and functions:
- `factory` - a factory reference for creating new representations. The factory object implements one method:
- `create(entity, selfHref)` - wraps entity with a new Hal representation, whose self link will point to selfHref
- `request` - the originating hapi request
- `self` - a shortcut to the representation's self link
- `entity` - the original wrapped entity
- `prop(name, value)` - manually adds a name/value pair to the representation
- `merge(object)` - merges the properties of another object into the representation
- `ignore(...propertyNames)` - prevents fields from being included in the response
- `link(relName, href)` - adds a new link to the `_links` collection, returning the new link. Link objects support
the following properties (see http://tools.ietf.org/html/draft-kelly-json-hal-06#section-8.2 for more information):
- `href`
- `templated`
- `title`
- `type`
- `deprecation`
- `name`
- `profile`
- `hreflang`
- `embed(rel, self, entity)` - adds an entity to the representation's `_embedded` collection with the supplied rel link relation and self href, returning a new representation
object for further configuration.
- `callback([err], [representation])` - an asynchronous callback function to be called when configuration of the hal entity
is complete. Most of the time this function should be called with no arguments. Only pass arguments if there has been
an error or if a completely new representation has been created with `rep.factory.create()`.
#### Example 1: A `prepare()` function declared in the route descriptor.
```javascript
server.route({
method: 'get',
path: '/users',
handler: function (req) {
return {
start: 0,
count: 2,
limit: 2,
items: [
{ id: 100, firstName: 'Brad', lastName: 'Leupen', googlePlusId: '107835557095464780852'},
{ id: 101, firstName: 'Mark', lastName: 'Zuckerberg'}
]
};
},
config: {
plugins: {
hal: {
// you can also assign this function directly to the hal property above as a shortcut
prepare: function (rep, next) {
rep.entity.items.forEach(function (item) {
var embed = rep.embed('item', './' + item.id, item);
if (item.googlePlusId) {
embed.link('home', 'http://plus.google.com/' + item.googlePlusId);
embed.ignore('googlePlusId');
}
});
rep.ignore('items');
// dont forget to call next!
next();
}
}
}
}
});
```
```
curl -H 'Accept: application/hal+json' http://localhost:8080/users
{
"_links": {
"self": {
"href": "/users"
}
},
"start": 0,
"count": 2,
"limit": 2,
"_embedded": {
"item": [
{
"_links": {
"self": {
"href": "/users/100"
},
"home": {
"href": "http://plus.google.com/107835557095464780852"
}
},
"id": 100,
"firstName": "Brad",
"lastName": "Leupen"
},
{
"_links": {
"self": {
"href": "/users/101"
}
},
"id": 101,
"firstName": "Mark",
"lastName": "Zuckerberg"
}
]
}
}
```
#### Example 2: Implementing `toHal()` on a domain entity:
```javascript
function User(id, firstName, lastName, googlePlusId) {
this.id = id;
this.firstName = firstName;
this.lastName = lastName;
this.googlePlusId = googlePlusId;
}
User.prototype.toHal = function(rep, next) {
if (this.googlePlusId) {
rep.link('home', 'http://plus.google.com/' + this.googlePlusId);
rep.ignore('googlePlusId');
}
next();
};
server.route({
method: 'get',
path: '/users',
handler: function (req) {
return {
start: 0,
count: 2,
limit: 2,
items: [
new User(100, 'Brad', 'Leupen', '107835557095464780852'),
new User(101, 'Mark', 'Zuckerberg')
]
};
},
config: {
plugins: {
hal: {
embedded: {
item: {
path: 'items',
href: './{item.id}'
}
}
}
}
}
});
```
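Neither example above uses `rep.factory.create()`. The sketch below shows how a `prepare()` function might swap in a brand-new representation wholesale; the `'/reports/latest'` href and `mco:full-list` rel are invented for illustration, and the callback arguments should be checked against the `callback([err], [representation])` signature described earlier:

```javascript
// Hypothetical prepare() that discards the default representation and
// substitutes a freshly built one via the factory. The href and rel name
// here are illustrative only.
function prepare(rep, next) {
    var summary = { total: rep.entity.items.length };
    var fresh = rep.factory.create(summary, '/reports/latest');
    fresh.link('mco:full-list', './all');
    // passing the new representation to the callback replaces the original
    next(null, fresh);
}
```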
## The HAL route configuration object
The `config.plugins.hal` route configuration object takes the following format:
- A function `fn(rep, next)` - for purely programmatic control over the representation
or
- An object with the following properties:
- `api` - an optional top level api rel name to assign to this route. Setting a value will cause this route to be included
in the root api resource's _links collection.
- `prepare(rep, next)` - an optional prepare function
- `ignore` - A String or array of strings containing the names of properties to remove from the output. Can be used
to remove reduntant information from the response
- `query` - An RFC 6570 compatible query string that should be communicated to your clients. See: http://tools.ietf.org/html/rfc6570.
Example: `{?q*,start,limit}`. These parameters will be included in top level api links. They will also be included in self links if supplied in the request.
Query parameters that are not included in the template, such as runtime tokens, will be excluded from the self href.
- `links` - An object whose keys are rel names and whose values are href strings or link objects that contain
at least an `href` property. Hrefs may be absolute or relative to the representation's self link. Hrefs may also contain
`{expression}` template expressions, which are resolved against the wrapped entity.
- `embedded` An object whose keys are rel names and whose values are configuration objects with:
- `path` - a path expression to evaluate against the wrapped entity to derive the object to embed.
- `href` - a Function, String or link object that will be used to define the entity's self relation. Like links,
embedded href's may also be templated. Unlike links, embedded href templates have access to two state variables:
- `self` - the parent entity
- `item` - the child entity
- `links`
- `embedded` (recursively evaluated)
- `prepare(rep, next)`
  - `absolute` - a boolean true/false. if true, hrefs for this representation will include protocol, server, and port.
  Default: false
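Tying the descriptor options together, a hypothetical route might combine `api`, `query`, `links`, and `embedded` like this (every rel name, path, and href below is invented for illustration):

```javascript
// A made-up route demonstrating the shape of a full hal descriptor.
var route = {
    method: 'get',
    path: '/orders/{orderId}',
    handler: function (req) {
        // an illustrative entity shape
        return { id: req.params.orderId, status: 'shipped', customer: { id: 7, name: 'Ada' } };
    },
    config: {
        plugins: {
            hal: {
                api: 'mco:order',              // advertised in the auto-generated /api root
                query: '{?start,limit}',       // RFC 6570 template echoed in api/self links
                links: {
                    'mco:invoice': './invoice' // resolved relative to the self link
                },
                embedded: {
                    'mco:customer': {
                        path: 'customer',      // entity property to embed
                        href: '/customers/{item.id}'
                    }
                },
                absolute: false                // the default; hrefs stay relative
            }
        }
    }
};
```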
## Namespaces and Rels
So far, we have not done a real good job in our examples defining our link relations. Unless registered with the IANA,
link relations should really be unique URLs that resolve to documentation regarding their semantics. Halacious will
happily let you be lazy but it's much better if we do things the Right Way.
### Manually creating a namespace
Halacious exposes its api to your Hapi server so that you may configure it at runtime like so:
```javascript
const hapi = require('@hapi/hapi');
const halacious = require('@modernpoacher/halacious');
async function init () {
  const server = hapi.server({ port: 8080 });

  await server.register(halacious);
const namespace = server.plugins.halacious.namespaces.add({ name: 'mycompany', description: 'My Companys namespace', prefix: 'mco'});
namespace.rel({ name: 'users', description: 'a collection of users' });
namespace.rel({ name: 'user', description: 'a single user' });
namespace.rel({ name: 'boss', description: 'a users boss' });
server.route({
method: 'get',
path: '/users/{userId}',
handler: function (req) {
return { id: req.params.userId, name: 'User ' + req.params.userId, bossId: 200 };
},
config: {
plugins: {
hal: {
links: {
'mco:boss': '../{bossId}'
},
ignore: 'bossId'
}
}
}
});
  await server.start();
}
init()
```
Now, when we access the server we see a new type of link in the `_links` collection, `curies`. The curies link provides a mechanism
to use shorthand rel names while preserving their uniqueness. Without the curie, the 'mco:boss' rel key would be expanded
to read `/rels/mycompany/boss`
```
curl -H 'Accept: application/hal+json' http://localhost:8080/users/100
{
"_links": {
"self": {
"href": "/users/100"
},
"curies": [
{
"name": "mco",
"href": "/rels/mycompany/{rel}",
"templated": true
}
],
"mco:boss": {
"href": "/users/200"
}
},
"id": "100",
"name": "User 100"
}
```
### Creating a namespace from a folder of documentated rels
In our examples folder, we have created a folder `rels/mycompany` containing markdown documents for all of the rels in our
company's namespace. We can suck all these into the system in one fell swoop:
```javascript
const hapi = require('@hapi/hapi');
const halacious = require('@modernpoacher/halacious');
async function init () {
const server = hapi.server({ port: 8080 });
await server.register(halacious);
server.plugins.halacious.namespaces.add({ dir: __dirname + '/rels/mycompany', prefix: 'mco' })
await server.start()
}
init()
```
Ideally these documents should provide your api consumer enough semantic information to navigate your api.
## Rels documentation
Halacious includes an (extremely) barebones namespace / rel navigator for users to browse your documentation.
The server binds to the `/rels` path on your server by default.
_Note: Hapi 9 / 10 users must install and configure the vision views plugin to enable this feature._
## Automatic /api root
Discoverability is a key tenant of any hypermedia system. HAL requires that only the root API url be known to clients of your
application, from which all other urls may be derived via rel names. If you want, Halacious will create this root api
route for you automatically. All you need to do is to identify which resources to include by using the `api` route
configuration option. For example:
```javascript
const hapi = require('@hapi/hapi');
const halacious = require('@modernpoacher/halacious');
async function init () {
const server = hapi.server({ port: 8080 });
await server.register(halacious);
const namespace = server.plugins.halacious.namespaces.add({ name: 'mycompany', description: 'My Companys namespace', prefix: 'mco'});
namespace.rel({ name: 'users', description: 'a collection of users' });
namespace.rel({ name: 'user', description: 'a single user' });
server.route({
method: 'get',
path: '/users',
handler: function () {
return {};
},
config: {
plugins: {
hal: {
api: 'mco:users'
}
}
}
});
server.route({
method: 'get',
path: '/users/{userId}',
handler: function () {
return {};
},
config: {
plugins: {
hal: {
api: 'mco:user'
}
}
}
});
  await server.start();
}
init()
```
will auto-create the following api root:
```
curl -H 'Accept: application/hal+json' http://localhost:8080/api/
{
"_links": {
"self": {
"href": "/api/"
},
"curies": [
{
"name": "mco",
"href": "/rels/mycompany/{rel}",
"templated": true
}
],
"mco:users": {
"href": "/users"
},
"mco:user": {
"href": "/users/{userId}",
"templated": true
}
}
}
```
## Plugin Options
- `strict` - setting this to `true` will cause an exception to be thrown when referencing an unregistered local rel. Setting this
to true will help catch typos during development. Default: `false`
- `relsPath` - the route path to the rels documentation root. Default: `/rels`
- `relsAuth` - the hapi authentication setting to use for the documentation routes. Default: `false`
- `relsTemplate` - a boolean true/false. if true, rels documentation uses the default template. Default: `true`
- `autoApi` - setting this to `true` will automatically create a root api handler to seed your client application. Default: `true`
- `apiPath` - the route path to the api root. Default: `/api`
- `apiAuth` - the hapi authentication setting to use for the api route. Default: `false`
- `apiServerLabel` - when set, Halacious will select for a specific server to route the api root.
- `mediaTypes` - an array of media types that will trigger the hal processor to modify the response (e.g. `['application/json',
'application/hal+json']`). the media types are checked in order. if any match the accept header parameters, then the
response will be halified and the media type of the response will be set to the first winner. Default: `['application/hal+json']`
- `absolute` - a boolean true/false. if true, all hrefs will include the protocol, server, and port. Default: false
- `host` - a hostname+port string for all absolute link urls
- `hostname` - a string hostname for all absolute link urls (only used if host is blank)
- `port` - an integer port for all absolute link urls
- `protocol` - a string protocol for all absolute link urls
- `marked` - an object of options to customize **marked**. Default: `{}`
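As a sketch, these options are passed at registration time. The values below are examples rather than the defaults (except where noted), and `halacious` is assumed to be the required plugin module:

```javascript
// Hypothetical options object illustrating common overrides.
var halaciousOptions = {
    strict: true,                       // fail fast on unregistered rels during development
    apiPath: '/api',                    // the default
    relsPath: '/docs/rels',             // move the rel documentation root
    mediaTypes: ['application/json', 'application/hal+json'],
    absolute: true,                     // emit fully qualified hrefs
    hostname: 'api.example.com',
    port: 443,
    protocol: 'https'
};

// With Hapi 17+ registration:
// await server.register({ plugin: halacious, options: halaciousOptions });
```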
---
title: Security (WPF)
ms.date: 03/30/2017
helpviewer_keywords:
- XAML files [WPF], sandbox behavior
- browser-hosted application security [WPF]
- application security [WPF]
- navigation security [WPF]
- loose XAML files [WPF], sandbox behavior
- WPF security [WPF]
- WebBrowser control [WPF], security
- feature controls [WPF], security
- XBAP security [WPF]
- Internet Explorer security settings [WPF]
ms.assetid: ee1baea0-3611-4e36-9ad6-fcd5205376fb
ms.openlocfilehash: c284e37ce32848ab22ca9ff6cc2ca2a8e6df9cb6
ms.sourcegitcommit: 2701302a99cafbe0d86d53d540eb0fa7e9b46b36
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 04/28/2019
ms.locfileid: "64614504"
---
# <a name="security-wpf"></a>Security (WPF)
<a name="introduction"></a> When developing Windows Presentation Foundation (WPF) standalone and browser-hosted applications, you must consider the security model. [!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] standalone applications execute with unrestricted permissions (the [!INCLUDE[TLA2#tla_cas](../../../includes/tla2sharptla-cas-md.md)] **FullTrust** permission set), whether deployed using Windows Installer (.msi), XCopy, or [!INCLUDE[TLA2#tla_clickonce](../../../includes/tla2sharptla-clickonce-md.md)]. Deploying partial-trust, standalone WPF applications with ClickOnce is not supported. However, a full-trust host application can create a partial-trust <xref:System.AppDomain> using the .NET Framework add-in model. For more information, see [WPF Add-Ins Overview](./app-development/wpf-add-ins-overview.md).
[!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] browser-hosted applications are hosted by [!INCLUDE[TLA#tla_iegeneric](../../../includes/tlasharptla-iegeneric-md.md)] or Firefox, and can be either [!INCLUDE[TLA#tla_xbap#plural](../../../includes/tlasharptla-xbapsharpplural-md.md)] or loose [!INCLUDE[TLA#tla_xaml](../../../includes/tlasharptla-xaml-md.md)] documents. For more information, see [WPF XAML Browser Applications Overview](./app-development/wpf-xaml-browser-applications-overview.md).
[!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] browser-hosted applications execute within a partial-trust security sandbox, by default, that is limited to the default [!INCLUDE[TLA2#tla_cas](../../../includes/tla2sharptla-cas-md.md)] **Internet** zone permission set. This effectively isolates [!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] browser-hosted applications from the client computer in the same way that you would expect typical Web applications to be isolated. An XBAP can elevate privileges, up to full trust, depending on the security zone of the deployment URL and the client's security configuration. For more information, see [WPF Partial Trust Security](wpf-partial-trust-security.md).
Este tópico discute o modelo de segurança para autônomo do Windows Presentation Foundation (WPF) e aplicativos hospedados pelo navegador.
Esse tópico contém as seguintes seções:
- [Navegação segura](#SafeTopLevelNavigation)
- [Configurações de segurança de software de navegação na Web](#InternetExplorerSecuritySettings)
- [Controle WebBrowser e controles de recurso](#webbrowser_control_and_feature_controls)
- [Desabilitando os assemblies APTCA para aplicativos cliente parcialmente confiáveis](#APTCA)
- [Comportamento da área restrita para arquivos XAML flexíveis](#LooseContentSandboxing)
- [Recursos para o desenvolvimento de aplicativos do WPF que promovem a segurança](#BestPractices)
<a name="SafeTopLevelNavigation"></a>
## <a name="safe-navigation"></a>Safe Navigation
For [!INCLUDE[TLA2#tla_xbap#plural](../../../includes/tla2sharptla-xbapsharpplural-md.md)], [!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] distinguishes two types of navigation: application and browser.
*Application navigation* is navigation between items of content within an application that is hosted by a browser. *Browser navigation* is navigation that changes the content and location URL of the browser itself. The relationship between application navigation (typically XAML) and browser navigation (typically HTML) is shown in the following illustration:

The type of content that is considered safe for an [!INCLUDE[TLA2#tla_xbap](../../../includes/tla2sharptla-xbap-md.md)] to navigate to is primarily determined by whether application navigation or browser navigation is used.
<a name="Application_Navigation_Security"></a>
### <a name="application-navigation-security"></a>Application Navigation Security
Application navigation is considered safe if it can be identified with a pack [!INCLUDE[TLA2#tla_uri](../../../includes/tla2sharptla-uri-md.md)], which supports four types of content:
|Content Type|Description|URI Example|
|------------------|-----------------|-----------------|
|Resource|Files that are added to a project with a build type of **Resource**.|`pack://application:,,,/MyResourceFile.xaml`|
|Content|Files that are added to a project with a build type of **Content**.|`pack://application:,,,/MyContentFile.xaml`|
|Site of origin|Files that are added to a project with a build type of **None**.|`pack://siteoforigin:,,,/MySiteOfOriginFile.xaml`|
|Application code|XAML resources that have a compiled code-behind.<br /><br /> - or -<br /><br /> XAML files that are added to a project with a build type of **Page**.|`pack://application:,,,/MyResourceFile` `.xaml`|
> [!NOTE]
> For more information about application data files and pack [!INCLUDE[TLA2#tla_uri#plural](../../../includes/tla2sharptla-urisharpplural-md.md)], see [WPF Application Resource, Content, and Data Files](./app-development/wpf-application-resource-content-and-data-files.md).
Files of these content types can be navigated to either by a user or programmatically:
- **User navigation**. The user navigates by clicking a <xref:System.Windows.Documents.Hyperlink> element.
- **Programmatic navigation**. The application navigates without involving the user, for example, by setting the <xref:System.Windows.Navigation.NavigationWindow.Source%2A?displayProperty=nameWithType> property.
<a name="Browser_Navigation_Security"></a>
### <a name="browser-navigation-security"></a>Browser Navigation Security
Browser navigation is considered safe only under the following conditions:
- **User navigation**. The user navigates by clicking a <xref:System.Windows.Documents.Hyperlink> element that is within the main <xref:System.Windows.Navigation.NavigationWindow>, not in a nested <xref:System.Windows.Controls.Frame>.
- **Zone**. The content being navigated to is located on the Internet or the local intranet.
- **Protocol**. The protocol being used is **http**, **https**, **file**, or **mailto**.
If an [!INCLUDE[TLA2#tla_xbap](../../../includes/tla2sharptla-xbap-md.md)] attempts to navigate to content in a manner that does not comply with these conditions, a <xref:System.Security.SecurityException> is thrown.
<a name="InternetExplorerSecuritySettings"></a>
## <a name="web-browsing-software-security-settings"></a>Web Browsing Software Security Settings
The security settings on your computer determine the access that any Web browsing software is granted. Web browsing software includes any application or component that uses the [WinINet](https://go.microsoft.com/fwlink/?LinkId=179379) or [UrlMon](https://go.microsoft.com/fwlink/?LinkId=179383) APIs, including Internet Explorer and PresentationHost.exe.
[!INCLUDE[TLA2#tla_iegeneric](../../../includes/tla2sharptla-iegeneric-md.md)] provides a mechanism by which you can configure the functionality that is allowed to be executed by or from [!INCLUDE[TLA2#tla_iegeneric](../../../includes/tla2sharptla-iegeneric-md.md)], including the following:
- .NET Framework-reliant components
- ActiveX controls and plug-ins
- Downloads
- Scripting
- User authentication
The collection of functionality that can be secured in this way is configured on a per-zone basis for the **Internet**, **Intranet**, **Trusted Sites**, and **Restricted Sites** zones. The following steps describe how to configure your security settings:
1. Open **Control Panel**.
2. Click **Network and Internet** and then click **Internet Options**.
The Internet Options dialog box is displayed.
3. On the **Security** tab, select the zone to configure the security settings for.
4. Click the **Custom Level** button.
The **Security Settings** dialog box is displayed, and you can configure the security settings for the selected zone.

> [!NOTE]
> You can also reach the Internet Options dialog box from Internet Explorer. Click **Tools** and then click **Internet Options**.
Starting with [!INCLUDE[TLA#tla_ie7](../../../includes/tlasharptla-ie7-md.md)], the following security settings specific to the .NET Framework are included:
- **Loose XAML**. Controls whether [!INCLUDE[TLA2#tla_iegeneric](../../../includes/tla2sharptla-iegeneric-md.md)] can navigate to loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] files. (Enable, Disable, and Prompt options.)
- **XAML browser applications**. Controls whether [!INCLUDE[TLA2#tla_iegeneric](../../../includes/tla2sharptla-iegeneric-md.md)] can navigate to and run [!INCLUDE[TLA2#tla_xbap#plural](../../../includes/tla2sharptla-xbapsharpplural-md.md)]. (Enable, Disable, and Prompt options.)
By default, these settings are enabled for the **Internet**, **Local intranet**, and **Trusted sites** zones, and disabled for the **Restricted sites** zone.
<a name="Security_Settings_for_IE6_and_Below"></a>
### <a name="security-related-wpf-registry-settings"></a>Security-Related WPF Registry Settings
In addition to the security settings available through Internet Options, the following registry values are available to selectively block a number of security-sensitive WPF features. The values are defined under the following key:
`HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\Windows Presentation Foundation\Features`
The following table lists the values that can be set.
|Value Name|Value Type|Value Data|
|----------------|----------------|----------------|
|XBAPDisallow|REG_DWORD|1 to disallow; 0 to allow.|
|LooseXamlDisallow|REG_DWORD|1 to disallow; 0 to allow.|
|WebBrowserDisallow|REG_DWORD|1 to disallow; 0 to allow.|
|MediaAudioDisallow|REG_DWORD|1 to disallow; 0 to allow.|
|MediaImageDisallow|REG_DWORD|1 to disallow; 0 to allow.|
|MediaVideoDisallow|REG_DWORD|1 to disallow; 0 to allow.|
|ScriptInteropDisallow|REG_DWORD|1 to disallow; 0 to allow.|
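As an illustration only (this fragment is not part of the official documentation), a registry script applying these values might look like the following — here disallowing XBAPs and loose XAML while explicitly allowing the WebBrowser control. The key path and value names are the ones listed above; the standard .reg file syntax is assumed:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\Windows Presentation Foundation\Features]
; 1 disallows the feature; 0 allows it (see the table above).
"XBAPDisallow"=dword:00000001
"LooseXamlDisallow"=dword:00000001
"WebBrowserDisallow"=dword:00000000
```

Merging such a file requires administrative rights, since the values live under HKEY_LOCAL_MACHINE.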
<a name="webbrowser_control_and_feature_controls"></a>
## <a name="webbrowser-control-and-feature-controls"></a>WebBrowser Control and Feature Controls
The WPF <xref:System.Windows.Controls.WebBrowser> control can be used to host Web content. The WPF <xref:System.Windows.Controls.WebBrowser> control wraps the underlying WebBrowser ActiveX control. WPF provides some support for securing your application when you use the WPF <xref:System.Windows.Controls.WebBrowser> control to host untrusted Web content. However, some security features must be applied directly by applications using the <xref:System.Windows.Controls.WebBrowser> control. For more information about the WebBrowser ActiveX control, see [WebBrowser Control Overviews and Tutorials](https://go.microsoft.com/fwlink/?LinkId=179388).
> [!NOTE]
> This section also applies to the <xref:System.Windows.Controls.Frame> control, since it uses the <xref:System.Windows.Controls.WebBrowser> to navigate to HTML content.
If the WPF <xref:System.Windows.Controls.WebBrowser> control is used to host untrusted Web content, your application should use a partial-trust <xref:System.AppDomain> to help insulate your application code from potentially malicious HTML script code. This is especially true if your application interacts with the hosted script by using the <xref:System.Windows.Controls.WebBrowser.InvokeScript%2A> method and the <xref:System.Windows.Controls.WebBrowser.ObjectForScripting%2A> property. For more information, see [WPF Add-ins Overview](./app-development/wpf-add-ins-overview.md).
If your application uses the WPF <xref:System.Windows.Controls.WebBrowser> control, another way to increase security and mitigate attacks is to enable Internet Explorer feature controls. Feature controls are additions to Internet Explorer that allow administrators and developers to configure features of Internet Explorer and of applications that host the WebBrowser ActiveX control, which the WPF <xref:System.Windows.Controls.WebBrowser> control wraps. Feature controls can be configured by using the [CoInternetSetFeatureEnabled](https://go.microsoft.com/fwlink/?LinkId=179394) function or by changing values in the registry. For more information about feature controls, see [Introduction to Feature Controls](https://go.microsoft.com/fwlink/?LinkId=179390) and [Internet Feature Controls](https://go.microsoft.com/fwlink/?LinkId=179392).
If you are developing a standalone WPF application that uses the WPF <xref:System.Windows.Controls.WebBrowser> control, WPF automatically enables the following feature controls for your application.
|Feature Control|
|---------------------|
|FEATURE_MIME_HANDLING|
|FEATURE_MIME_SNIFFING|
|FEATURE_OBJECT_CACHING|
|FEATURE_SAFE_BINDTOOBJECT|
|FEATURE_WINDOW_RESTRICTIONS|
|FEATURE_ZONE_ELEVATION|
|FEATURE_RESTRICT_FILEDOWNLOAD|
|FEATURE_RESTRICT_ACTIVEXINSTALL|
|FEATURE_ADDON_MANAGEMENT|
|FEATURE_HTTP_USERNAME_PASSWORD_DISABLE|
|FEATURE_SECURITYBAND|
|FEATURE_UNC_SAVEDFILECHECK|
|FEATURE_VALIDATE_NAVIGATE_URL|
|FEATURE_DISABLE_TELNET_PROTOCOL|
|FEATURE_WEBOC_POPUPMANAGEMENT|
|FEATURE_DISABLE_LEGACY_COMPRESSION|
|FEATURE_SSLUX|
Since these feature controls are enabled unconditionally, a full-trust application may be adversely affected by them. In that case, if there is no security risk for the specific application and the content it hosts, the corresponding feature control can be disabled.
Feature controls are applied by the process that instantiates the WebBrowser ActiveX object. Consequently, if you are creating a standalone application that can navigate to untrusted content, you should seriously consider enabling additional feature controls.
> [!NOTE]
> This recommendation is based on general recommendations for MSHTML and SHDOCVW host security. For more information, see [The MSHTML Host Security FAQ: Part I of II](https://go.microsoft.com/fwlink/?LinkId=179396) and [The MSHTML Host Security FAQ: Part II of II](https://go.microsoft.com/fwlink/?LinkId=179415).
For your executable, consider enabling the following feature controls by setting their registry value to 1.
|Feature Control|
|---------------------|
|FEATURE_ACTIVEX_REPURPOSEDETECTION|
|FEATURE_BLOCK_LMZ_IMG|
|FEATURE_BLOCK_LMZ_OBJECT|
|FEATURE_BLOCK_LMZ_SCRIPT|
|FEATURE_RESTRICT_RES_TO_LMZ|
|FEATURE_RESTRICT_ABOUT_PROTOCOL_IE7|
|FEATURE_SHOW_APP_PROTOCOL_WARN_DIALOG|
|FEATURE_LOCALMACHINE_LOCKDOWN|
|FEATURE_FORCE_ADDR_AND_STATUS|
|FEATURE_RESTRICTED_ZONE_WHEN_FILE_NOT_FOUND|
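As a hedged sketch (the exact key layout is described in the Internet Feature Controls documentation, and `MyWpfApplication.exe` is a placeholder for your executable's file name, not a name from this document), enabling one of these feature controls for a specific process typically amounts to a per-process registry value such as:

```
Windows Registry Editor Version 5.00

; Feature controls are keyed by the executable file name of the hosting process.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BLOCK_LMZ_SCRIPT]
"MyWpfApplication.exe"=dword:00000001
```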
For your executable, consider disabling the following feature control by setting its registry value to 0.
|Feature Control|
|---------------------|
|FEATURE_ENABLE_SCRIPT_PASTE_URLACTION_IF_PROMPT|
If you run a partial-trust [!INCLUDE[TLA#tla_xbap](../../../includes/tlasharptla-xbap-md.md)] that includes a WPF <xref:System.Windows.Controls.WebBrowser> control in [!INCLUDE[TLA#tla_iegeneric](../../../includes/tlasharptla-iegeneric-md.md)], WPF hosts the WebBrowser ActiveX control in the address space of the Internet Explorer process. Since the WebBrowser ActiveX control is hosted in the [!INCLUDE[TLA2#tla_iegeneric](../../../includes/tla2sharptla-iegeneric-md.md)] process, all of Internet Explorer's feature controls are also enabled for the WebBrowser ActiveX control.
XBAPs running in Internet Explorer also get an additional level of security compared to normal standalone applications. This additional security exists because Internet Explorer, and therefore the WebBrowser ActiveX control, runs in protected mode by default on [!INCLUDE[TLA#tla_winvista](../../../includes/tlasharptla-winvista-md.md)] and [!INCLUDE[win7](../../../includes/win7-md.md)]. For more information about protected mode, see [Understanding and Working in Protected Mode Internet Explorer](https://go.microsoft.com/fwlink/?LinkId=179393).
> [!NOTE]
> If you attempt to run an XBAP that includes a WPF <xref:System.Windows.Controls.WebBrowser> control in Firefox, while in the Internet zone, a <xref:System.Security.SecurityException> will be thrown. This is due to WPF security policy.
<a name="APTCA"></a>
## <a name="disabling-aptca-assemblies-for-partially-trusted-client-applications"></a>Disabling APTCA Assemblies for Partially Trusted Client Applications
When managed assemblies are installed into the [!INCLUDE[TLA#tla_gac](../../../includes/tlasharptla-gac-md.md)], they become fully trusted because the user has to provide explicit permission to install them. Because they are fully trusted, only fully trusted managed client applications can use them. To allow partially trusted applications to use them, they must be marked with the <xref:System.Security.AllowPartiallyTrustedCallersAttribute> (APTCA). Only assemblies that have been tested to be safe for execution in partial trust should be marked with this attribute.
However, it is possible for an APTCA assembly to exhibit a security flaw after being installed into the [!INCLUDE[TLA2#tla_gac](../../../includes/tla2sharptla-gac-md.md)]. Once a security flaw is discovered, assembly publishers can produce a security update to fix the problem on existing installations, and to protect installations that may occur after the problem is discovered. One option for the update is to uninstall the assembly, although that may break other fully trusted client applications that use it.
[!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] provides a mechanism by which an APTCA assembly can be disabled for partially trusted [!INCLUDE[TLA2#tla_xbap#plural](../../../includes/tla2sharptla-xbapsharpplural-md.md)] without the APTCA assembly being uninstalled.
Disabling an APTCA assembly requires creating a special registry key:
`HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\policy\APTCA\<AssemblyFullName>, FileVersion=<AssemblyFileVersion>`
The following is an example:
`HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\policy\APTCA\aptcagac, Version=1.0.0.0, Culture=neutral, PublicKeyToken=215e3ac809a0fea7, FileVersion=1.0.0.0`
This key establishes an entry for the APTCA assembly. You also have to create a value in this key that enables or disables the assembly. The following are the details of the value:
- Value name: **APTCA_FLAG**.
- Value type: **REG_DWORD**.
- Value data: **1** to disable; **0** to enable.
If an assembly needs to be disabled for partially trusted client applications, you can write an update that creates the registry key and value.
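For example, an update that disables the `aptcagac` assembly from the example above could merge a registry script like the following (a sketch assuming the standard .reg syntax; the key is the example key given earlier):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\policy\APTCA\aptcagac, Version=1.0.0.0, Culture=neutral, PublicKeyToken=215e3ac809a0fea7, FileVersion=1.0.0.0]
; APTCA_FLAG = 1 disables the assembly for partially trusted callers; 0 re-enables it.
"APTCA_FLAG"=dword:00000001
```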
> [!NOTE]
> Core .NET Framework assemblies are not affected by being disabled in this way, because they are required for managed applications to run. Support for disabling APTCA assemblies is primarily targeted at third-party applications.
<a name="LooseContentSandboxing"></a>
## <a name="sandbox-behavior-for-loose-xaml-files"></a>Sandbox Behavior for Loose XAML Files
Loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] files are markup-only XAML files that do not depend on any code-behind, event handlers, or application-specific assemblies. When loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] files are navigated to directly from the browser, they are loaded in a security sandbox based on the default Internet zone permission set.
However, the security behavior is different when loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] files are navigated to from either a <xref:System.Windows.Navigation.NavigationWindow> or a <xref:System.Windows.Controls.Frame> in a standalone application.
In both cases, the loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] file that is navigated to inherits the permissions of its host application. However, this behavior may be undesirable from a security perspective, particularly if a loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] file was produced by an entity that is either not trusted or unknown. This type of content is known as *external content*, and both <xref:System.Windows.Controls.Frame> and <xref:System.Windows.Navigation.NavigationWindow> can be configured to isolate it when navigated to. Isolation is achieved by setting the **SandboxExternalContent** property to true, as shown in the following examples for <xref:System.Windows.Controls.Frame> and <xref:System.Windows.Navigation.NavigationWindow>:
[!code-xaml[SecurityOverviewSnippets#FrameMARKUP](~/samples/snippets/csharp/VS_Snippets_Wpf/SecurityOverviewSnippets/CS/Window2.xaml#framemarkup)]
[!code-xaml[SecurityOverviewSnippets#NavigationWindowMARKUP](~/samples/snippets/csharp/VS_Snippets_Wpf/SecurityOverviewSnippets/CS/Window1.xaml#navigationwindowmarkup)]
With this setting, external content is loaded into a process that is separate from the process hosting the application. This process is restricted to the default Internet zone permission set, effectively isolating it from the hosting application and the client computer.
> [!NOTE]
> Even though navigating to loose [!INCLUDE[TLA2#tla_xaml](../../../includes/tla2sharptla-xaml-md.md)] files from either a <xref:System.Windows.Navigation.NavigationWindow> or a <xref:System.Windows.Controls.Frame> in a standalone application is implemented on top of the WPF browser hosting infrastructure, involving the PresentationHost process, the security level is slightly less than when the content is loaded directly in Internet Explorer on [!INCLUDE[wiprlhext](../../../includes/wiprlhext-md.md)] and [!INCLUDE[win7](../../../includes/win7-md.md)] (which would still be through PresentationHost). This is because a standalone WPF application using a Web browser does not provide the additional security feature of Internet Explorer's protected mode.
<a name="BestPractices"></a>
## <a name="resources-for-developing-wpf-applications-that-promote-security"></a>Resources for Developing WPF Applications that Promote Security
The following are some additional resources to help you develop [!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)] applications that promote security:
|Area|Resource|
|----------|--------------|
|Managed code|[Patterns and Practices Security Guidance for Applications](https://go.microsoft.com/fwlink/?LinkId=117426)|
|[!INCLUDE[TLA2#tla_cas](../../../includes/tla2sharptla-cas-md.md)]|[Code Access Security](../misc/code-access-security.md)|
|[!INCLUDE[TLA2#tla_clickonce](../../../includes/tla2sharptla-clickonce-md.md)]|[ClickOnce Security and Deployment](/visualstudio/deployment/clickonce-security-and-deployment)|
|[!INCLUDE[TLA2#tla_wpf](../../../includes/tla2sharptla-wpf-md.md)]|[WPF Partial Trust Security](wpf-partial-trust-security.md)|
## <a name="see-also"></a>See also
- [WPF Partial Trust Security](wpf-partial-trust-security.md)
- [WPF Security Strategy - Platform Security](wpf-security-strategy-platform-security.md)
- [WPF Security Strategy - Security Engineering](wpf-security-strategy-security-engineering.md)
- [Patterns and Practices Security Guidance for Applications](https://go.microsoft.com/fwlink/?LinkId=117426)
- [Code Access Security](../misc/code-access-security.md)
- [ClickOnce Security and Deployment](/visualstudio/deployment/clickonce-security-and-deployment)
- [XAML Overview (WPF)](./advanced/xaml-overview-wpf.md)
This is an example algorithm that shows how you can translate some of the principles Benjamin Graham laid out in his book, The Intelligent Investor, into a script that trades in a live environment.
To connect pylivetrader to your existing Alpaca live or paper account, you can see the full instructions [here](https://github.com/alpacahq/pylivetrader/tree/master/examples).
Once your API keys and URL endpoint have been configured, this algorithm can be run with this command:
```
pylivetrader run GrahamFundamentals.py
```
One thing that occasionally baffles me, as an experienced Ruby/Rails consultant, is the
way in which clients hire people like me to work on their teams.
With my deep background of experience in Ruby and Rails, I get hired over and over to work
on new application features. This is baffling to me. I come in with little to no domain
knowledge and expertise, so someone has to bring me up to speed on that domain (taking time
away from working on features). And then I need to get to know the client(s) and what
their needs are. Then I can finally get to work on actual features. Seems like I get a
free education in your application's domain space, while you are paying me the big $$$.
Why are you paying me to do this? Why not have your existing developers (who already have
all that domain knowledge) work on features, and let me at those problems that will help
your entire team build more features, faster? Like figuring out why your test suite is
so slow (which drags down your entire team). Or working on performance tuning of your
application (which impacts all your clients, as well as the scalability of your
application). How about fixing all those potential security holes you've been ignoring,
so you and your clients' data won't get exposed to the next hacker that finds out you
are running with an old insecure Ruby/Rails/gem?
## Instructions
1. Download zip of this repo
2. Unzip repo in a folder
3. Rename folder to `blog` or `project-blog`
4. Open Github for desktop
5. Create a new repo called `blog` or `project-blog` (same as folder name)
6. Change Local Path to be the parent folder of your blog (one above the unzipped file from earlier)
7. Create Repo
8. Navigate to blog repo and click "Publish Repository"
9. Make sure "Keep this code private" is unchecked
10. Click "Publish Repository"
11. Go to Repository->View on Github
12. Navigate to Settings and scroll to Github Pages
13. Select the master branch and click save
## License
The theme is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
| 34.904762 | 112 | 0.759891 | eng_Latn | 0.984243 |
#### GET all answers
With header info
`curl -i http://localhost:3000/api/answers`
Prettified
`curl http://localhost:3000/api/answers | python -m json.tool`
With stats
`curl http://localhost:3000/api/answers?orderBy=hits&printStats=true | python -m json.tool`
#### GET answer by id
With header info
`curl -i http://localhost:3000/api/answers/vcom-1`
Prettified
`curl http://localhost:3000/api/answers/vcom-1 | python -m json.tool`
#### GET answers with query
With header info
`curl -i http://localhost:3000/api/answers\?sourceId\=vsid\&text\=protein`
Prettified
`curl http://localhost:3000/api/answers\?sourceId\=vsid\&text\=protein | python -m json.tool`
#### CREATE answer
With header info
`curl -i -H "Content-Type: application/json" -X POST -d '{"id":"test-2","source":"Mock Source","key":"Why is that?","answer":["Its simple really.","Because."]}' http://localhost:3000/api/answers`
Prettified
`curl -H "Content-Type: application/json" -X POST -d '{"id":"test-2","source":"Mock Source","key":"Why is that?","answer":["Its simple really.","Because."]}' http://localhost:3000/api/answers | python -m json.tool`
#### DELETE answer
With header info
`curl -i -H "Content-Type: application/json" -X DELETE -d '{"id":"test-2"}' http://localhost:3000/api/answers`
Prettified
`curl -H "Content-Type: application/json" -X DELETE -d '{"id":"test-2"}' http://localhost:3000/api/answers | python -m json.tool`
#### PUT answer
With header info
`curl -i -H "Content-Type: application/json" -X PUT -d '{"hit":true,"share":false}' http://localhost:3000/api/answers/vcom-7`
Prettified
`curl -H "Content-Type: application/json" -X PUT -d '{"hit":"true","share":"false"}' http://localhost:3000/api/answers/vcom-7 | python -m json.tool`
#### Send email
With header info
`curl -i -H "Content-Type: application/json" -X POST -d '{"email":"atli@dohop.com","subject":"Some feedback","text":"Great stuff!"}' http://localhost:3000/api/email`
Prettified
`curl -H "Content-Type: application/json" -X POST -d '{"email":"atli@dohop.com","subject":"Some feedback","text":"Great stuff!"}' http://localhost:3000/api/email | python -m json.tool`
| 41.72549 | 214 | 0.696898 | yue_Hant | 0.53201 |
# watcher
This is a simple RSS feed watcher, created for practice.
<script>{
"title": "jQuery UI 1.11.0 Changelog"
}</script>
Released on June 26, 2014
## General
* Removed: Support for IE7. ([#9841](http://bugs.jqueryui.com/ticket/9841))
* Added: Selectmenu widget.
* Added: AMD support via UMD wrappers. ([96e027e](https://github.com/jquery/jquery-ui/commit/96e027e4b14345722cc39589f59ce2ce5e94b991), [1216e2a](https://github.com/jquery/jquery-ui/commit/1216e2aaf5e9bae52abe1ef9dfb9ab34c271c56d))
* Added: Bower support. ([#9465](http://bugs.jqueryui.com/ticket/9465), [e837d11](https://github.com/jquery/jquery-ui/commit/e837d11d6b3c8517e322ded24faaa400443402ef))
* Added: Windows 8 touch support. ([#9709](http://bugs.jqueryui.com/ticket/9709), [#9710](http://bugs.jqueryui.com/ticket/9710), [28310ff](https://github.com/jquery/jquery-ui/commit/28310ff))
* Renamed: jquery.js to exclude version in filename. ([a40647f](https://github.com/jquery/jquery-ui/commit/a40647f0e56096f8437056d238b4299fd19f2859))
* Renamed: All files, removing the "jquery.ui." prefix. ([#9464](http://bugs.jqueryui.com/ticket/9464), [21154cf](https://github.com/jquery/jquery-ui/commit/21154cfa2e02ef1814a6aff68b14553bdad165cb), [086dad6](https://github.com/jquery/jquery-ui/commit/086dad66c444fbf8dd6c93ac815fed6f0072ccbf))
## Build
* Added: Use Bower to manage client-side dependencies. ([#9507](http://bugs.jqueryui.com/ticket/9507), [d789d7a](https://github.com/jquery/jquery-ui/commit/d789d7ab1973f61484cc3c6307e8ff2978acd215))
* Added: Run tests on Travis. ([b5c41a2](https://github.com/jquery/jquery-ui/commit/b5c41a2b33311d2de8f0b473cc454bd281ea0ad1))
* Added: Validate number of generated manifests. ([5bbf276](https://github.com/jquery/jquery-ui/commit/5bbf27620504ec92cbeb3a907535f100b8d9586f))
* Added: Replace @VERSION in release tags. ([#10006](http://bugs.jqueryui.com/ticket/10006), [0645ac4](https://github.com/jquery/jquery-ui/commit/0645ac45edc383ae30f17ce9e21a92d934ea5931))
* Added: Validate commit messages with commitplease. ([08c7328](https://github.com/jquery/jquery-ui/commit/08c732833799f422b756dd1d5e42ca6859fe21c5))
* Added: JSCS task for code style checking. ([c462e66](https://github.com/jquery/jquery-ui/commit/c462e66f5d2c3da30879e38649e81d34bb7d2587), [4752ee9](https://github.com/jquery/jquery-ui/commit/4752ee9a6c127eed6bbe0958c68426ee34f2c581))
* Added: jquery-release dependency. ([#9793](http://bugs.jqueryui.com/ticket/9793), [516920a](https://github.com/jquery/jquery-ui/commit/516920ac71902cb9349db2220b5f7eb8e357340e))
* Added: Run all CSS files through csslint. ([5fb6863](https://github.com/jquery/jquery-ui/commit/5fb68636d19a40451d1190b9a5255d86616d54f5))
* Added: grunt-esformatter, formats all source files (no validation). ([327bcba](https://github.com/jquery/jquery-ui/commit/327bcbae8c7ee3021325e27136a258a47488f847))
* Added: Add `lint` and `test` aliases for Grunt. ([d8468a3](https://github.com/jquery/jquery-ui/commit/d8468a33790da8e7be46552325e932162b1942af))
* Added: Manifest entry for selectmenu. ([56e092d](https://github.com/jquery/jquery-ui/commit/56e092d43d4a907eb1983114345d7ed5024ed88e))
* Fixed: Ability to use jQuery 2.x with Bower. ([#10110](http://bugs.jqueryui.com/ticket/10110), [a0fea7d](https://github.com/jquery/jquery-ui/commit/a0fea7d849a2d0a949cfde8ea1c2edbf2a94d963))
* Fixed: Force LF for JS source files ([d7860b9](https://github.com/jquery/jquery-ui/commit/d7860b9c9822c40c68472e55baef72511c09d5de))
* Changed: Reorganize external directory. ([6df127a](https://github.com/jquery/jquery-ui/commit/6df127a0b591d2a1437361f9cb6f3524a7b2e111))
* Fixed: Verify ASCII only characters in output. ([#9037](http://bugs.jqueryui.com/ticket/9037), [7da8283](https://github.com/jquery/jquery-ui/commit/7da828375afb20d58736bb1eb530f915c445d5b9))
* Fixed: Include `es3` option for JSHint. ([f848ae3](https://github.com/jquery/jquery-ui/commit/f848ae38e0389874c0a6d026d54d40cb044f9561))
* Removed: the `build` task and its dependencies. ([9ef09ed](https://github.com/jquery/jquery-ui/commit/9ef09edc797566e81f20682ab93208c9076341b5))
* Fixed: Generate pre-releases the same way as stable releases. ([#9998](http://bugs.jqueryui.com/ticket/9998), [079279a](https://github.com/jquery/jquery-ui/commit/079279afd4bc9ca7bc19fb34c539621997033b22))
* Fixed: Include draggable.css and sortable.css in the CSS concat step. ([a69ccd6](https://github.com/jquery/jquery-ui/commit/a69ccd68e431acb85c5a709ae7fbe90e572addb3))
* Fixed: Incorrect links in effect manifests. ([#9247](http://bugs.jqueryui.com/ticket/9247), [45f85cc](https://github.com/jquery/jquery-ui/commit/45f85cce5634e2321e701e905297b578987d3083), [2daf09d](https://github.com/jquery/jquery-ui/commit/2daf09d67160eceddd2b024d2698930e12193ba9))
## Core & Utilities
### UI Core
* Removed: `$.ui.hasScroll()`. ([#9190](http://bugs.jqueryui.com/ticket/9190), [ecd6a25](https://github.com/jquery/jquery-ui/commit/ecd6a25a83b15d0c5af306c44befb9e652c82f37))
* Removed: `$.support.selectstart`. ([d24cd35](https://github.com/jquery/jquery-ui/commit/d24cd35f0cf211a5fed379532f1d9c762f39b9e2))
* Removed: `$.ui.keyCode.NUMPAD_*`. ([#9269](http://bugs.jqueryui.com/ticket/9269), [274ed73](https://github.com/jquery/jquery-ui/commit/274ed73cd7da3025dc172b17f1c0820183f9a055))
* Deprecated: `.focus( n )`. ([#9646](http://bugs.jqueryui.com/ticket/9646), [df6110c](https://github.com/jquery/jquery-ui/commit/df6110c0d424ff3306fdd5576011f2dcf4d242d0))
* Deprecated: `.zIndex()`. ([#9061](http://bugs.jqueryui.com/ticket/9061), [932caaf](https://github.com/jquery/jquery-ui/commit/932caaf2ddf70c889003d5b42eee4f069d2dd296))
### Widget Factory
* Added: `instance` method on the bridge to return widget instance. ([#9030](http://bugs.jqueryui.com/ticket/9030), [36cb6f2](https://github.com/jquery/jquery-ui/commit/36cb6f264dbe6b155f8fd97b0ee7615a0f1adedb))
* Added: Support events with dashes and colons. ([#9708](http://bugs.jqueryui.com/ticket/9708), [be2a339](https://github.com/jquery/jquery-ui/commit/be2a339b2beaed69105abae91a118bc1c8669a1b))
* Added: `_init()` method is now optional. ([#9543](http://bugs.jqueryui.com/ticket/9543), [6e799c3](https://github.com/jquery/jquery-ui/commit/6e799c39d33be8eee02224d2f754dc42228a4cbb))
* Added: Return the constructor from `$.widget()`. ([#9467](http://bugs.jqueryui.com/ticket/9467), [c0ab710](https://github.com/jquery/jquery-ui/commit/c0ab71056b936627e8a7821f03c044aec6280a40))
* Fixed: Properly set `widgetEventPrefix` when redefining a widget. ([#9316](http://bugs.jqueryui.com/ticket/9316), [2eb89f0](https://github.com/jquery/jquery-ui/commit/2eb89f07341a557084fa3363fe22afe62530654d))
* Fixed: Only remove hover and focus classes when disabling, not enabling. ([#9558](http://bugs.jqueryui.com/ticket/9558), [d13df39](https://github.com/jquery/jquery-ui/commit/d13df39e39010bb7cf2cec11b5206e85ea5fca2a))
* Fixed: `option()` method should work as getter only when argument length is 1. ([#9601](http://bugs.jqueryui.com/ticket/9601), [ecde7cd](https://github.com/jquery/jquery-ui/commit/ecde7cd41a59e9c3ff07f56baeeaec2147cca779))
* Changed: `.enable()` and `.disable()` act via `._setOptions()` instead of `._setOption()`. ([bc85742](https://github.com/jquery/jquery-ui/commit/bc857424a36fb33eda80f69454b123b226ec1685))
### Position
* Removed: `$.support.offsetFractions`. ([baf6bc5](https://github.com/jquery/jquery-ui/commit/baf6bc5c27003468052d81589855b6587f004d94))
* Added: New download dialog demo. ([a74b69e](https://github.com/jquery/jquery-ui/commit/a74b69e7c2a1e926f393f275d9abac3e58aee01b))
* Fixed: Scrollbar width detection causes layout issues. ([#9291](http://bugs.jqueryui.com/ticket/9291), [d500e94](https://github.com/jquery/jquery-ui/commit/d500e945a46c9e2ce5bbb685661c32b5d3f57d21))
* Fixed: Positioning within document throws an error. ([#9533](http://bugs.jqueryui.com/ticket/9533), [1bbbcc7](https://github.com/jquery/jquery-ui/commit/1bbbcc723c489d7ef7d72bb62564b8f07805c41c))
* Fixed: Incorrect presentation with progressbar demo label in IE9. ([#9163](http://bugs.jqueryui.com/ticket/9163), [8bf5bc8](https://github.com/jquery/jquery-ui/commit/8bf5bc8bc8322bce796a9d9c9e7dc140e3081973))
## Interactions
### Draggable
* Fixed: Disabled should not have the `ui-state-disabled` class or `aria-disabled` attribute. ([#5974](http://bugs.jqueryui.com/ticket/5974), [44d0717](https://github.com/jquery/jquery-ui/commit/44d07173db32b498e5f83f60db290ff1463daee3))
* Fixed: Position bug in scrollable div. ([#9379](http://bugs.jqueryui.com/ticket/9379), [44b2180](https://github.com/jquery/jquery-ui/commit/44b2180782df6ef3324789324fcf3f98b85784a0))
* Fixed: Clicking on a draggable anchor without moving it should make it the active element. ([#8399](http://bugs.jqueryui.com/ticket/8399), [bca3e05](https://github.com/jquery/jquery-ui/commit/bca3e058e89bf40806170149b8029dfe52644248))
* Fixed: Make sure positional constraints are never applied to `ui.originalPosition`. ([4bd1a9c](https://github.com/jquery/jquery-ui/commit/4bd1a9c5bae513974c294d41e778fc44777c8ed2))
* Fixed: Make sure snap elements are in the document before snapping. ([#8459](http://bugs.jqueryui.com/ticket/8459), [9d8af80](https://github.com/jquery/jquery-ui/commit/9d8af804ad4cebe434d420b29467c596809a7cca))
* Fixed: Double offset bug when scrolling. ([#6817](http://bugs.jqueryui.com/ticket/6817), [82f588e](https://github.com/jquery/jquery-ui/commit/82f588e82b887fdcb2406da2c5dfedc2f13ebdc9))
* Fixed: Enabled draggable from within iframe. ([#5727](http://bugs.jqueryui.com/ticket/5727), [24756a9](https://github.com/jquery/jquery-ui/commit/24756a978a977d7abbef5e5bce403837a01d964f))
* Fixed: Modified snapping algorithm to use edges and corners. ([#8165](http://bugs.jqueryui.com/ticket/8165), [bd126a9](https://github.com/jquery/jquery-ui/commit/bd126a9c1cfcbc9d0fd370af25cfa0eab294fc4e))
* Fixed: Apply `axis` options to position instead of style. ([#7251](http://bugs.jqueryui.com/ticket/7251), [94f8c4d](https://github.com/jquery/jquery-ui/commit/94f8c4d5e9ef461973a504d65dd906c1120da71d))
* Fixed: Ability to change `containment` option. ([#9733](http://bugs.jqueryui.com/ticket/9733), [0bb807b](https://github.com/jquery/jquery-ui/commit/0bb807bb42af87bf4c6dd4d71808b12c08d316e7))
* Fixed: Scroll not working with fixed position parent. ([#5009](http://bugs.jqueryui.com/ticket/5009), [49c7b72](https://github.com/jquery/jquery-ui/commit/49c7b7200ef944ffc93487e79e763dfe97b4ff4a))
* Fixed: Inputs do not blur when clicking on a draggable. ([#4261](http://bugs.jqueryui.com/ticket/4261), [fcd1caf](https://github.com/jquery/jquery-ui/commit/fcd1cafac8afe3a947676ec018e844eeada5b9de))
* Fixed: Not following mouse when scrolled and using `overflow-y: scroll`. ([#6258](http://bugs.jqueryui.com/ticket/6258), [a88d645](https://github.com/jquery/jquery-ui/commit/a88d64514001867b908776e6bfcfac7f1011970d))
* Fixed: Cursor doesn't revert to pre-dragging state after revert action when original element is removed. ([#6889](http://bugs.jqueryui.com/ticket/6889), [d345a0d](https://github.com/jquery/jquery-ui/commit/d345a0d7db841a143dcfdd3fb6fa6141cda435e9))
* Fixed: Browser window drops behind other windows in IE9/10. ([#9520](http://bugs.jqueryui.com/ticket/9520), [eae2c4b](https://github.com/jquery/jquery-ui/commit/eae2c4b358af3ebfae258abfe77eeace48fcefcb))
* Fixed: Handle `containment` option set to `false` after init. ([#8962](http://bugs.jqueryui.com/ticket/8962), [dc5254a](https://github.com/jquery/jquery-ui/commit/dc5254aa0703f9f7fd9d290c3078a5e9267160d9))
* Fixed: Jumps down with offset of scrollbar. ([#9315](http://bugs.jqueryui.com/ticket/9315), [263d078](https://github.com/jquery/jquery-ui/commit/263d07894493aafcdc6a565f9f9c079b4b8f5d80))
### Droppable
* Fixed: Disabled should not have the `ui-state-disabled` class or `aria-disabled` attribute. ([#6039](http://bugs.jqueryui.com/ticket/6039), [44d0717](https://github.com/jquery/jquery-ui/commit/44d07173db32b498e5f83f60db290ff1463daee3))
* Fixed: Off-by-one error in `isOverAxis()`. ([#10128](http://bugs.jqueryui.com/ticket/10128), [433ef9d](https://github.com/jquery/jquery-ui/commit/433ef9d433e9baa464cd0b313b82efa6f1d65556))
* Fixed: `scope` option cannot be changed after initialization. ([#9287](http://bugs.jqueryui.com/ticket/9287), [ffab89e](https://github.com/jquery/jquery-ui/commit/ffab89e9bee97cf7cc74249b6e4ce9dd798013c9))
* Fixed: Dependencies in photo manager demo. ([13be920](https://github.com/jquery/jquery-ui/commit/13be9205e1a0d227ef44ab28aed6d0e18aa5cf69))
* Fixed: `offsetWidth` and `offsetHeight` are queried unnecessarily causing synchronous reflow. ([#9339](http://bugs.jqueryui.com/ticket/9339), [a4fc7a9](https://github.com/jquery/jquery-ui/commit/a4fc7a9e9664d44d65b971c90a0cad82e1e79344))
* Fixed: Use `ui-state-default` for activation in demos. ([8f267ee](https://github.com/jquery/jquery-ui/commit/8f267ee3310bee8d7a2cb9e46b023a79ed84bfc1))
### Resizable
* Fixed: Disabled should not have the `ui-state-disabled` class or `aria-disabled` attribute. ([#5973](http://bugs.jqueryui.com/ticket/5973), [44d0717](https://github.com/jquery/jquery-ui/commit/44d07173db32b498e5f83f60db290ff1463daee3))
* Fixed: Resizing can move the element. ([#7018](http://bugs.jqueryui.com/ticket/7018), [#9107](http://bugs.jqueryui.com/ticket/9107), [20f0646](https://github.com/jquery/jquery-ui/commit/20f064662a016eaa6bc580aed012022c63f675aa))
* Fixed: `containment` now works with all parents, not just the immediate parent. ([#7485](http://bugs.jqueryui.com/ticket/7485), [c03cb80](https://github.com/jquery/jquery-ui/commit/c03cb8079c6984fb9286a64d980d367d86b9cd8b))
* Fixed: Only resize/reposition if size is greater than specified grid. ([#9611](http://bugs.jqueryui.com/ticket/9611), [20c1648](https://github.com/jquery/jquery-ui/commit/20c1648f68660b267eec302d43a7b1014cda6e1a))
* Fixed: Don't force absolute positioning when also draggable. ([#6939](http://bugs.jqueryui.com/ticket/6939), [3576ceb](https://github.com/jquery/jquery-ui/commit/3576ceb360eb0381a98f3c6b67d890c3834efa8a))
* Fixed: Allow resizing when resizables are nested. ([#5025](http://bugs.jqueryui.com/ticket/5025), [ec5f395](https://github.com/jquery/jquery-ui/commit/ec5f395260c5e4b678d2fe39c5405d466ee8369e))
* Fixed: Off-by-one pixel dimensions with `helper` and `grid`. ([#9547](http://bugs.jqueryui.com/ticket/9547), [14065dc](https://github.com/jquery/jquery-ui/commit/14065dc23bb453b6c30138f225c9db728dd7e455))
* Fixed: Erratic behavior of contained elements within scrollable grandparents. ([#9307](http://bugs.jqueryui.com/ticket/9307), [6df5c1a](https://github.com/jquery/jquery-ui/commit/6df5c1a4ae738e591694e0fe2fa3bbb8b05f6b0a))
### Sortable
* Fixed: Placeholder breaks `table-layout: fixed`. ([#9185](http://bugs.jqueryui.com/ticket/9185), [09b3533](https://github.com/jquery/jquery-ui/commit/09b3533910e887377fc87126608db1ded06f38f6))
* Fixed: Placeholder doesn't move when using `connectWith` option. ([#8301](http://bugs.jqueryui.com/ticket/8301), [f306a82](https://github.com/jquery/jquery-ui/commit/f306a826a4d3b4c36c3f86cb5feeee23bb0db4c3))
* Fixed: Dragging items into bottom of a list. ([#9314](http://bugs.jqueryui.com/ticket/9314), [#9381](http://bugs.jqueryui.com/ticket/9381), [601ad96](https://github.com/jquery/jquery-ui/commit/601ad962e0a417bb369378ed7704a0b493eac365))
## Widgets
### Accordion
* Removed: `content` property in `create` event. ([#8999](http://bugs.jqueryui.com/ticket/8999), [43442c3](https://github.com/jquery/jquery-ui/commit/43442c319643ee9fb6f54737d921ba8b03f3ae6b))
* Removed: `ui-accordion-noicons` class which was unused. ([d65cc93](https://github.com/jquery/jquery-ui/commit/d65cc9350fa205a46031a9b9b95cf04d98394036))
* Fixed: Maintain collapsed state on refresh. ([#9189](http://bugs.jqueryui.com/ticket/9189), [5a8596c](https://github.com/jquery/jquery-ui/commit/5a8596cdf3baa4d835e588cda9060a0537236c71))
* Fixed: Avoid resetting outline on headers which was removing focus indicator. ([#9352](http://bugs.jqueryui.com/ticket/9352), [9470af0](https://github.com/jquery/jquery-ui/commit/9470af0bbefafa3d81c3709674a45a54b693e7cf))
* Fixed: Moved `aria-expanded` from active tabpanel to active tab. ([#9407](http://bugs.jqueryui.com/ticket/9407), [f16d0c7](https://github.com/jquery/jquery-ui/commit/f16d0c7e267794aa20411581b15870d9babd7930))
* Changed: Moved animation properties into the widget prototype. ([da185a6](https://github.com/jquery/jquery-ui/commit/da185a6c1553c18ec367d8b0210519d04f97a534))
### Autocomplete
* Fixed: Normalize falsy values, not just missing values. ([#9762](http://bugs.jqueryui.com/ticket/9762), [113e9d0](https://github.com/jquery/jquery-ui/commit/113e9d0c2cc3f474da719721857c074c983c7157))
* Fixed: Scope race condition handling to the instance to allow multiple instances to have simultaneous requests. ([#9334](http://bugs.jqueryui.com/ticket/9334), [9e00e00](https://github.com/jquery/jquery-ui/commit/9e00e00f3b54770faa0291d6ee6fc1dcbad028cb))
* Fixed: Search if the user retypes the same value. ([#7434](http://bugs.jqueryui.com/ticket/7434), [48001a8](https://github.com/jquery/jquery-ui/commit/48001a8c46adc5d1d6c1726cecbe6453946e96e0))
* Fixed: Combobox demo. ([#9157](http://bugs.jqueryui.com/ticket/9157), [ebd5f13](https://github.com/jquery/jquery-ui/commit/ebd5f13027b30be1cdd9e8782e81ce468dcdff5e))
* Fixed: Ability to use up/down arrow keys in textareas. ([#8911](http://bugs.jqueryui.com/ticket/8911), [f5f0879](https://github.com/jquery/jquery-ui/commit/f5f08791536e689e008b04d6ea9677811353d456))
* Fixed: Announce autocomplete correctly in all assistive technologies. ([#9631](http://bugs.jqueryui.com/ticket/9631), [0b28d59](https://github.com/jquery/jquery-ui/commit/0b28d597fe1857590c9719c8b41f00e77967f7d7))
* Fixed: `.replaceWith()` fails to replace. ([#9172](http://bugs.jqueryui.com/ticket/9172), [ff11b69](https://github.com/jquery/jquery-ui/commit/ff11b69a67e0176c851ff3bdec997c7a75d47a42))
* Fixed: Remote JSONP demo which was using a deprecated web service. ([#9764](http://bugs.jqueryui.com/ticket/9764), [d4865dc](https://github.com/jquery/jquery-ui/commit/d4865dcbcdccdce59f07b324f230a1f1991aa39d))
* Fixed: Do not set value on multi-line input. ([#9771](http://bugs.jqueryui.com/ticket/9771), [605a20e](https://github.com/jquery/jquery-ui/commit/605a20ef06b0bae2d2ffd8d96e49c2a297add80a))
* Fixed: Fall back to `.ui-front` searching for empty jQuery objects. ([#9755](http://bugs.jqueryui.com/ticket/9755), [2ef1b16](https://github.com/jquery/jquery-ui/commit/2ef1b16e4d3aa8766084e50f4a1d806c434e7e43))
* Fixed: Dynamically adding input field breaks auto-complete's accessibility for screen readers. ([#9590](http://bugs.jqueryui.com/ticket/9590), [7b9c810](https://github.com/jquery/jquery-ui/commit/7b9c810b9ac450d826b6fa0c3d35377178b7e3b3))
* Fixed: Combobox demo shows underlying select by default. ([#9158](http://bugs.jqueryui.com/ticket/9158), [4202ad0](https://github.com/jquery/jquery-ui/commit/4202ad07187e15a3b2e64277e170daf9b278c3b4))
* Changed: Use custom namespace for combobox demo. ([445ffd0](https://github.com/jquery/jquery-ui/commit/445ffd0acc95e8c4adb4d63b12815e9bcac1c198))
* Changed: Don't add anchors to items in generated menu. ([e08791d](https://github.com/jquery/jquery-ui/commit/e08791d2c1be7628b7fd6ca2398cff195cb2e2c2))
### Button
* Fixed: Properly refresh button sets with new radio buttons. ([#8975](http://bugs.jqueryui.com/ticket/8975), [0059722](https://github.com/jquery/jquery-ui/commit/0059722b6b43c4985dbbd5f1494524442c12ddb0))
* Fixed: Radio button & checkboxes ignore mouse clicks for minor mouse movements. ([#7665](http://bugs.jqueryui.com/ticket/7665), [8b64322](https://github.com/jquery/jquery-ui/commit/8b64322e982e97cdfd5cdd184c8993f7123d469e))
* Fixed: Remove `ui-state-active` class when disabled. ([#9602](http://bugs.jqueryui.com/ticket/9602), [23d7d50](https://github.com/jquery/jquery-ui/commit/23d7d50f374f71efec418276a343e947cb80aea6))
* Fixed: Remove `ui-state-focus` class when disabled. ([#9169](http://bugs.jqueryui.com/ticket/9169), [0d0b05e](https://github.com/jquery/jquery-ui/commit/0d0b05ec7cf702b8782b19c993eeb30398a090f4))
* Fixed: On form reset only call `refresh()` on current button widgets. ([#9213](http://bugs.jqueryui.com/ticket/9213), [2de31fd](https://github.com/jquery/jquery-ui/commit/2de31fdbf498a6c20d196a96d007ea0f069644c5))
* Fixed: Ignore non-radio elements with the same name. ([#8761](http://bugs.jqueryui.com/ticket/8761), [ccb1324](https://github.com/jquery/jquery-ui/commit/ccb13240dd8b5cfac0199a30dcec4a71cbe1b252))
* Fixed: Replace anchors with more appropriate buttons in demos. ([7d0ca5e](https://github.com/jquery/jquery-ui/commit/7d0ca5e37dcf1b8dd1e201df2693e851da8ebb77))
### Datepicker
* Removed: Unused `ui-datepicker-month-year` class. ([3c68636](https://github.com/jquery/jquery-ui/commit/3c68636c8067e431c1bbdf2787dbec2ef3b88968))
* Removed: Unnecessary mouseover trigger. ([#5816](http://bugs.jqueryui.com/ticket/5816), [f0b4967](https://github.com/jquery/jquery-ui/commit/f0b4967388a5f0d7eb14c4b124886a11f4aa7d9e))
* Added: it-CH locale. ([#9175](http://bugs.jqueryui.com/ticket/9175), [ae4753b](https://github.com/jquery/jquery-ui/commit/ae4753b3f1c8d320e11a63f028ec02dc621d24e9))
* Added: English as an option in the localization demo. ([8ad8cea](https://github.com/jquery/jquery-ui/commit/8ad8cea69590cbaddc143732e001c8d769b9f204))
* Added: `en` and `en-US` locales. ([#6682](http://bugs.jqueryui.com/ticket/6682), [450d75f](https://github.com/jquery/jquery-ui/commit/450d75f912f4161c475f18f9eeb7efd307c02eae))
* Fixed: Date format for Serbian locales. ([#7347](http://bugs.jqueryui.com/ticket/7347), [504b100](https://github.com/jquery/jquery-ui/commit/504b100a1a9213186968c654dd915b92bb9ee15b))
* Fixed: Lithuanian locale. ([#9281](http://bugs.jqueryui.com/ticket/9281), [ce73a26](https://github.com/jquery/jquery-ui/commit/ce73a2688de47c975727ad9555cae84eb6997486))
* Fixed: Spanish locale. ([#9735](http://bugs.jqueryui.com/ticket/9735), [6ec452c](https://github.com/jquery/jquery-ui/commit/6ec452cc63313ec03f58942ce896036c7a2fcf3f))
* Fixed: Finnish locale. ([#9609](http://bugs.jqueryui.com/ticket/9609), [619261f](https://github.com/jquery/jquery-ui/commit/619261f0797a6fab49e2f2dd175b35795c0dc01e))
* Fixed: Ukrainian locale. ([#9939](http://bugs.jqueryui.com/ticket/9939), [f3ffc8c](https://github.com/jquery/jquery-ui/commit/f3ffc8c9a94da8abe97a32d164f821ad8a9a8b60))
* Fixed: Latvian locale. ([#9656](http://bugs.jqueryui.com/ticket/9656), [629c632](https://github.com/jquery/jquery-ui/commit/629c632a110d437b6f328e6ff399a04c1a85352a))
* Fixed: Icelandic locale. ([#9431](http://bugs.jqueryui.com/ticket/9431), [369c76d](https://github.com/jquery/jquery-ui/commit/369c76d9e62fd3bac4676801d5666e6b40a068a2))
* Fixed: Corrected the Arabic word for Arabic. ([53c88a7](https://github.com/jquery/jquery-ui/commit/53c88a76ab965fed2ace8df42b3890549d2817d6))
* Fixed: Spanish and French locales. ([#9289](http://bugs.jqueryui.com/ticket/9289), [9726cd7](https://github.com/jquery/jquery-ui/commit/9726cd72b64e9e9735cfdb5564ebef64a6dab0aa), [aaf7576](https://github.com/jquery/jquery-ui/commit/aaf75767fa98a6acdf00b1414bee622d3a3747cc))
* Fixed: Removed `"<"` in the `"Anterior"` text for the pt locale. ([e591a7a](https://github.com/jquery/jquery-ui/commit/e591a7a9af123862fbef9d55c54351f6f995b7a7))
* Fixed: Set `scope` on table headers. ([b67d103](https://github.com/jquery/jquery-ui/commit/b67d1037a8583b11658d1ecfc96e7971b0c7fcee))
### Dialog
* Removed: array and string notations for `position` option. ([#8825](http://bugs.jqueryui.com/ticket/8825), [0cc40d7](https://github.com/jquery/jquery-ui/commit/0cc40d799ffdf7aa978f910b890915ee6ad7a2b8))
* Fixed: Error message doesn't disappear after closing and reopening the modal-form demo. ([#10057](http://bugs.jqueryui.com/ticket/10057), [b41b922](https://github.com/jquery/jquery-ui/commit/b41b92213af1e376e70099e0fffe875b01ff8d08))
* Fixed: Resizing causes close icon to misalign in Firefox. ([#9133](http://bugs.jqueryui.com/ticket/9133), [ec3cf67](https://github.com/jquery/jquery-ui/commit/ec3cf6725aa5ae0c69cb302df92eb933a517cbaa))
* Fixed: Safe `activeElement` access. ([#9420](http://bugs.jqueryui.com/ticket/9420), [#8443](http://bugs.jqueryui.com/ticket/8443), [2dfe85d](https://github.com/jquery/jquery-ui/commit/2dfe85d3e2269a571e07bd550bbd838ee703b833))
* Fixed: Switch back to shuffling `z-index`, but only look at `.ui-front` siblings. ([#9166](http://bugs.jqueryui.com/ticket/9166), [#9364](http://bugs.jqueryui.com/ticket/9364), [e263ebd](https://github.com/jquery/jquery-ui/commit/e263ebda99f3d414bae91a4a47e74a37ff93ba9c))
* Fixed: Inheritance causes undefined property `_focusTabbable`. ([#9241](http://bugs.jqueryui.com/ticket/9241), [1096f19](https://github.com/jquery/jquery-ui/commit/1096f19f37d6075328509d62a4c2c6d6a53d4b37))
* Fixed: Use `unbind()` for jQuery 1.6 compat. ([#10072](http://bugs.jqueryui.com/ticket/10072), [6c40052](https://github.com/jquery/jquery-ui/commit/6c4005280d5f5b49de382e7e4992c1778f541f6c))
* Fixed: Honor `preventDefault` when managing focus. ([#10103](http://bugs.jqueryui.com/ticket/10103), [226cc3e](https://github.com/jquery/jquery-ui/commit/226cc3e9e57c7591ff6a2ee02ffed52ac97786a9))
* Fixed: Context is not respected for modals. ([#9439](http://bugs.jqueryui.com/ticket/9439), [c9815f1](https://github.com/jquery/jquery-ui/commit/c9815f13b487d027ef9b6095588dbb73141c9a09))
* Fixed: Capitalize default value for `closeText` option. ([#9500](http://bugs.jqueryui.com/ticket/9500), [e628d0e](https://github.com/jquery/jquery-ui/commit/e628d0e4ba89eecee2c9b0d4cfb214523cad2ab4))
* Fixed: Use proper position data after drag and resize. ([#9351](http://bugs.jqueryui.com/ticket/9351), [16c375d](https://github.com/jquery/jquery-ui/commit/16c375d374c5675265b5d8c5cd06c7170d0e8b58))
* Fixed: Track last focused element instead of always focusing the first tabbable element. ([#9101](http://bugs.jqueryui.com/ticket/9101), [0e5a2e1](https://github.com/jquery/jquery-ui/commit/0e5a2e126ab4179f1ec83e1e4e773058b49e336d))
* Fixed: shift-tab handling. ([a0b8476](https://github.com/jquery/jquery-ui/commit/a0b84767a76098cdcc6375dfe28a7fee866bd395))
* Fixed: Apply `overflow: hidden` to contain the resize handles. ([#9521](http://bugs.jqueryui.com/ticket/9521), [7741c9f](https://github.com/jquery/jquery-ui/commit/7741c9f678088a129c1782f4e7f061bc12a41279))
* Fixed: Dialog should not close on enter in textbox in IE. ([#9312](http://bugs.jqueryui.com/ticket/9312), [c19e7b3](https://github.com/jquery/jquery-ui/commit/c19e7b3496d14b40e71ba892213889fc8cc81d4f))
### Menu
* Removed: Requirement to use anchors in menu items. ([#10130](http://bugs.jqueryui.com/ticket/10130), [3a61627](https://github.com/jquery/jquery-ui/commit/3a61627a501cb7ba1ce80046bfabbff0f7f2f517))
* Added: `items` option for better definition of menu items in non parent-child structures. ([#10129](http://bugs.jqueryui.com/ticket/10129), [31e705a](https://github.com/jquery/jquery-ui/commit/31e705ab324ec830062eee173a112551f7c919ea))
* Added: `_isDivider()` method. ([#9701](http://bugs.jqueryui.com/ticket/9701), [a6806ab](https://github.com/jquery/jquery-ui/commit/a6806ab17a9a5b332dc7d0c947a0a7a512dc2579))
* Fixed: Scroll on cursor down doesn't fully show the newly focused item. ([#9991](http://bugs.jqueryui.com/ticket/9991), [b222803](https://github.com/jquery/jquery-ui/commit/b22280385c05eaf10f4d480c546906b85aa011e1))
* Fixed: Autofocus issue with dialog opened from menu widget. ([#9044](http://bugs.jqueryui.com/ticket/9044), [485e0a0](https://github.com/jquery/jquery-ui/commit/485e0a06121d712bccad82a21a9e443292d2f9bb))
* Fixed: Ensure an event was passed before checking its type. ([#9384](http://bugs.jqueryui.com/ticket/9384), [670f650](https://github.com/jquery/jquery-ui/commit/670f650b99103bcea779f8ad0428e05cb7e37053), [8fbdd7c](https://github.com/jquery/jquery-ui/commit/8fbdd7cc38cd3e2a504b314cca2b36bc740aa168))
* Fixed: Disabled item visible through submenu on top. ([#9650](http://bugs.jqueryui.com/ticket/9650), [4992fc9](https://github.com/jquery/jquery-ui/commit/4992fc902eae207737be33e5b937980b4765bbf7))
* Fixed: Refreshing should recheck for menu icons. ([#9377](http://bugs.jqueryui.com/ticket/9377), [91b7b9f](https://github.com/jquery/jquery-ui/commit/91b7b9f9ab2e5baa31e37f34600457599409e161))
* Fixed: IE 10 renders bullets in submenus. ([#8844](http://bugs.jqueryui.com/ticket/8844), [64a39d9](https://github.com/jquery/jquery-ui/commit/64a39d9b0d5710729653b185eae427853608744b))
* Fixed: Menu items wiggle in IE8. ([#9995](http://bugs.jqueryui.com/ticket/9995), [b0e8380](https://github.com/jquery/jquery-ui/commit/b0e8380f023f41cb4a1bd3665177b5f0e795c289))
* Fixed: `select` not firing every time. ([#9469](http://bugs.jqueryui.com/ticket/9469), [484e382](https://github.com/jquery/jquery-ui/commit/484e382259f1c1c56b151a97ddf8a894f94d17ea))
* Changed: Simplify styling. Remove rounded corners, reduce spacing. ([9910e93](https://github.com/jquery/jquery-ui/commit/9910e938aad1090339a2c7f60693093ee18aba82))
### Slider
* Fixed: Use spans instead of anchors for handles. ([#9890](http://bugs.jqueryui.com/ticket/9890), [dfc5c34](https://github.com/jquery/jquery-ui/commit/dfc5c34320691bd113250795243ea8b025b1f516))
* Fixed: Changing `range` option to `false` does not remove range div. ([#9355](http://bugs.jqueryui.com/ticket/9355), [2ba75e2](https://github.com/jquery/jquery-ui/commit/2ba75e2c93638d89e89de52347da0013a7a841b8))
* Changed: Move `numPages` into the widget prototype. ([8dbda00](https://github.com/jquery/jquery-ui/commit/8dbda00896adb7bd7ce74506e4fb1a474dd13e3c))
### Spinner
* Added: `isValid()` method. ([#9542](http://bugs.jqueryui.com/ticket/9542), [1552fc8](https://github.com/jquery/jquery-ui/commit/1552fc8a05ad351650b2a56c5c31905c671f1cdf))
* Fixed: Only format the value when there is one. ([#9573](http://bugs.jqueryui.com/ticket/9573), [e6360ab](https://github.com/jquery/jquery-ui/commit/e6360ab846c6d0248d6013d005d2c178906ca692))
* Fixed: Don't change value when changing `min`/`max` options. ([#9703](http://bugs.jqueryui.com/ticket/9703), [796a8b3](https://github.com/jquery/jquery-ui/commit/796a8b37e2b7eae6aa0f7a2fcaa5d8c29331e857))
### Tabs
* Removed: Demo with tabs at bottom. ([162056b](https://github.com/jquery/jquery-ui/commit/162056b2aa31795c216a3edc5554ff3c67393561))
* Fixed: Moved `aria-expanded` from active panel to active tab. ([#9622](http://bugs.jqueryui.com/ticket/9622), [f5e8041](https://github.com/jquery/jquery-ui/commit/f5e8041ebf1e0b36d67d1716a0cfec44692fabb8))
* Fixed: Incorrect remote tab detection in IE7. ([#9317](http://bugs.jqueryui.com/ticket/9317), [daf3f0d](https://github.com/jquery/jquery-ui/commit/daf3f0d9af5b29dc090e15d57cf884e3c12f7cad))
* Fixed: Disabled tabs are still clickable. ([#9413](http://bugs.jqueryui.com/ticket/9413), [4148acf](https://github.com/jquery/jquery-ui/commit/4148acfa9a7b1494f2d87559362c07a59f8e47f8))
* Fixed: URLs encoded in anything other than UTF-8 will throw an error. ([#9518](http://bugs.jqueryui.com/ticket/9518), [8748658](https://github.com/jquery/jquery-ui/commit/874865842bdbbf5ec48ee41640951e9f103c0f16))
* Fixed: Use `.ui-tabs-anchor` in stylesheet. ([2de5e78](https://github.com/jquery/jquery-ui/commit/2de5e78e72b98adeab4f03cedf47269babbb0a6c))
* Fixed: Refresh issue when tabs are moved to bottom. ([#9584](http://bugs.jqueryui.com/ticket/9584), [e14f75e](https://github.com/jquery/jquery-ui/commit/e14f75ed480e5b036bb47ab3398d1e0df28a128a))
* Changed: Moved `isLocal()` into the widget prototype. ([ecd4f95](https://github.com/jquery/jquery-ui/commit/ecd4f95a50349d3f8488cef5cf9501d9b94a6108))
### Tooltip
* Fixed: Improved accessibility by adding content to an aria-live div. ([#9610](http://bugs.jqueryui.com/ticket/9610), [b9e438d](https://github.com/jquery/jquery-ui/commit/b9e438d07c370ac2d4b198048feb6b6922469f70))
* Fixed: Preserve the `title` attribute after disabling twice. ([#9719](http://bugs.jqueryui.com/ticket/9719), [0dc84db](https://github.com/jquery/jquery-ui/commit/0dc84db853751f5f0ccfd9f79cbf8355dcc4b09c))
* Fixed: Memory leak from `remove` event. ([#9531](http://bugs.jqueryui.com/ticket/9531), [a8ff773](https://github.com/jquery/jquery-ui/commit/a8ff77360b78b7eabcffd97b8b11c2d1f150ed4e))
* Fixed: On close and destroy only set title if empty or undefined. ([#8925](http://bugs.jqueryui.com/ticket/8925), [af85dfc](https://github.com/jquery/jquery-ui/commit/af85dfcafb32b7503392ca834eaa9d3162d54b28))
## Effects
* Added: Separate files for puff and size effects. ([#9277](http://bugs.jqueryui.com/ticket/9277), [d0c613d](https://github.com/jquery/jquery-ui/commit/d0c613d3a8db7bd44ce70c20e8dc55478699b3d0))
## CSS Framework
* Fixed: Title color not reset in a focused accordion tab. ([#9428](http://bugs.jqueryui.com/ticket/9428), [5aa106a](https://github.com/jquery/jquery-ui/commit/5aa106a052e78559e50a4ca464863f5927c43bd5))
## Tests
* Added: Expose jQuery version select. ([3651d44](https://github.com/jquery/jquery-ui/commit/3651d44a307bb2b44d9e7063331ceb3ad4d3ce5f))
* Added: Ability to run specific component tests via Grunt. ([9a93a06](https://github.com/jquery/jquery-ui/commit/9a93a06fbd61013f1e62cea054c92e11dfa561f1))
* Fixed: Skip JSHint in browsers that don't support `Function.prototype.bind()`. ([8eeb0e7](https://github.com/jquery/jquery-ui/commit/8eeb0e7d88a943e3860f8492661ac8090cb8d3ac))
] | null | null | null | ---
slug: how-i-am-so-productive
title: How I am so productive
date: '2018-09-24'
author: Kent C. Dodds
description:
_People regularly ask me how I get so much done. Here's my secret..._
keywords:
- JavaScript
- Productivity
- Career Advice
- Self Improvement
banner: ./images/banner.jpg
bannerCredit:
'Photo by [lalo Hernandez](https://unsplash.com/photos/2p7BD4T5GSE) on
[Unsplash](https://unsplash.com/search/photos/fast)'
---
I get asked about this at least twice a week, so I thought I'd save myself some
time by writing a blog post I can reference instead of answering the same
question repeatedly (spoiler, this is one of my secrets).
To help give you context, here are some of the things I do on a fairly regular
basis these days:
- I teach Sunday school to ~6 nine-year-old kids (we're going through the
[Old Testament](https://www.mormon.org/free-bible) this year). In addition to
attending 3 hours of church, I also prepare and deliver a one hour lesson for
them every other week.
- On Sundays I generally don't commit any code or do any "work." I have church
and family time and responsibilities (though I do enjoy spending the evening
with
[new friends playing Dominion](https://twitter.com/craigwalker1123/status/1039138835714560000))
- On Mondays, I publish the blogpost newsletter from two weeks ago to
  [blog.kentcdodds.com](http://blog.kentcdodds.com/), and send a new one to
  subscribers.
(~1.5 hours of work, depending...)
- Product tasks (currently helping work on [paypal.me](https://paypal.me/))
(this is normally where most of my weekday is spent)
- Attend work meetings
- Help people on slack/video chat
- Work on
[paypal-scripts](https://blog.kentcdodds.com/automation-without-config-412ab5e47229)
(among other internal libraries and tools)
- traveling/training (about once a quarter. I'm on an airplane right now for
such an engagement)
- Research, prepare, and do a
[DevTips with Kent livestream](http://kcd.im/devtips) (~20 minutes...
sometimes) (week-daily)
- Respond to literally dozens of GitHub issues/PRs (daily)
- Code up or help people code up solutions to the GitHub issues
- Release ~5 of any of my >100 npm modules multiple times a day
- Travel to (sometimes) and speak at a conference (about once or twice a month)
- Travel to (sometimes) and deliver a 1–2 day workshop on one of five topics I
regularly train about.
- Record [egghead.io](http://egghead.io/) lessons and courses (working on some
of this very hard core right now)
- Research and prepare educational material (for
talks/workshops/courses/devtips/etc.)
- Go out to lunch with a (new) friend in the community about once a week
- Outline and prepare for writing my novel for
[NaNoWriMo](https://nanowrimo.org/) (which is going to be epic by the way).
- [AMA](http://kcd.im/ama): Ask Me Anything (currently has 475 questions I've
answered).
- [TechChats](http://kcd.im/tech-chats): livestream video chats with people
about tech (about once a month or so).
- Publish a 3 minute podcast (on [3 minutes with Kent](http://kcd.im/3-mins))
- Be a guest on podcasts (about once a month or so).
- Respond to literally dozens of questions per day coming from Tweets (mostly),
Twitter DMs (several daily), Slack, email, YouTube comments, and pigeon (one
of those is a joke).
On top of all that, I'm married, have 4 kids (6 and younger) and a puppy, and
have a home and yard/garden to care for.
### A typical day
Let me share a fairly general weekday with you:
Normally I wake up at ~7:00 AM and I'm ready for work by ~9:00 AM. I sit at my
desk (I work from home) and start with my daily scripture study for a few
minutes. Then I power on the computer, and start with my email and twitter (the
stuff I hadn't addressed while brushing my teeth etc. 😅). If I'm aware of any
pressing work, I'll take care of that first. If it's Monday, I'll make sure my
blog post is published, then start working on this week's blogpost newsletter.
Then I do the DevTips with Kent livestream.
By this time it's normally ~10:00 AM (or later on Mondays/if there's pressing
work I had to do first). It's likely that I've already released one or two new
versions of my npm modules, answered a half dozen questions, and
responded/reviewed several GitHub issues/PRs. Now I start on whatever PayPal
product work I'm working on (as I mentioned, I'm helping with
[paypal.me](https://paypal.me/) right now). I meet with my co-workers and decide
the highest priority tasks and get to work on them.
At ~12:00 PM (often later if I'm really involved in something), I go up and have
lunch with my family. If he's not already in bed, I read a book to one of my
boys and put him down for a nap (working from home is the best). My lunch break
is normally ~30 minutes.
I spend the afternoon working on more PayPal stuff, meetings, helping answer
questions from all the various channels (and sadly ignoring many of them as I
have work to do), and releasing more versions of various OSS libraries/tools.
I wrap up the day between 5:00 PM and 6:00 PM and head upstairs. It's family
time. Often I'll hang out with my wife after the kids are in bed. Sometimes
though, if I'm working on a big course for [egghead.io](http://egghead.io/) or
something, I'll go back to my office and start working on that. Normally I'll go
to bed before 11:00 PM.
Saturdays are mostly yard work and family time. I don't _normally_ do much
coding on Saturdays. Sundays are family and church time. I very rarely do any
work on Sundays (occasionally I'll merge simple PRs from my phone or get a head
start on this newsletter).
If this schedule sounds set in stone or a solid routine, let me assure you it's
not. What I've written is a pretty general schedule that wasn't really planned
and is just what kinda happened. In any case, I hope it helps to frame the rest
of my advice in a way that's relatable and helpful to you.
### It's an illusion
If you read carefully, you'll notice that I do a bunch of my stuff when I'm on
the clock at PayPal. That's because the stuff I do is good for PayPal and my
bosses have appreciated that I do it. Just last week I had multiple different
engineers within PayPal thank me for the daily
[DevTips with Kent](http://kcd.im/devtips) livestreams and
[these newsletters](http://kcd.im/news). PayPal is happy that I'm sharing my
knowledge and so long as what I share is not proprietary/legally concerning/etc,
they're happy to let me continue doing that. (You could say this newsletter is
sponsored by PayPal! Thanks PayPal!).
PayPal employees also use a bunch of the open source software that I maintain.
So when I'm doing work on my open source projects during work hours, 90% of the
time it's because we have a problem within PayPal that needs solving and I'm
just doing my job to make PayPal engineers more effective. Some of my projects
are libraries that I created at PayPal and then open sourced while others are
projects I created outside of my time at PayPal and now PayPal engineers use. In
either case, working on those projects (and contributing to other projects of
which I'm not a maintainer), is all part of my job.
So when people ask me: "HOW DO YOU DO ALL THIS STUFF _AND_ HAVE A JOB AT
PAYPAL!?" My answer is: "well... a lot of this stuff _is_ my job at PayPal."
This brings me to my next point:
### Increase the impact of your value
We're all constantly creating value in the world. A conversation with your
co-worker over lunch about why, what, and how to do a git rebase is creating
value. A meetup talk you're delivering is creating value. etc. etc. etc. The
secret that I've found is taking the value that you're already creating, and
increase its impact by preserving and presenting it to the world.
So turn that conversation into a blog post or have that conversation over Google
Hangouts on Air and have it upload to YouTube automatically (which is what my
[tech chats](http://kcd.im/tech-chats) are). Make sure your meetup talk is
recorded (even if that means you're just recording your screen, which I do all
the time). Instead of answering your co-worker's slack question about arrow
functions on slack, type it out as a quick blog post on medium, a gist, or a 🔥
FIRE 🔥 TWEET 🔥 and send them the link.
As long as your company is cool with you sharing non-proprietary knowledge with
the world, then take advantage of that (as a side note, I would have a very hard
time being successful at a company which does not value open knowledge sharing
like this. I know it's a privilege to work at a company like PayPal. Sorry if
you're not in an environment like PayPal in this way.)
In short [learn in public](https://twitter.com/swyx/status/1009174159690264579)
(I love you [Shawn!](https://twitter.com/swyx)). It's likely that if you listed
out all the things you do in a week your list would be just as long if not
longer than mine. The thing that makes it appear that I am so productive is that
I make public as much of what I do as possible.
### Automation
If you maintain an npm package, it may surprise you (or you may be skeptical of
the fact) that I manage to release multiple versions of multiple packages in a
typical day. Believe me though, I release almost every PR made on my open source
projects within minutes of my merging them into `master`, and often I do so from
my phone.
This is possible because my open source projects have a solid suite of tests
that run in CI and give me confidence things are working followed by an
automation script that publishes to npm and generates a GitHub changelog. For
years I've been using an awesome tool called
[semantic-release](https://github.com/semantic-release/semantic-release)
(shoutout to
[the team of fantastic humans](https://github.com/semantic-release/semantic-release/blob/caribou/README.md#team))
to automatically release my packages.
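
To make that concrete, here's roughly what such a setup looks like (an
illustrative sketch, not the actual configuration of any of my projects):

```yaml
# .travis.yml (sketch): run the tests, then let semantic-release inspect the
# commit messages and decide whether to publish a new version to npm.
language: node_js
node_js:
  - '10'
script: npm test
after_success: npx semantic-release
```

With a setup like this, merging a PR whose commits follow the conventional
commit format is all it takes to cut a release.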
The concept of automation is something
[I've written about in the past](https://blog.kentcdodds.com/an-argument-for-automation-fce8394c14e2).
It's how I got into software development and I feel strongly that automation is
the way we can make ourselves more productive
([even if it takes longer to develop the automation than the time it would save us](https://xkcd.com/1205/)).
If you find yourself repeatedly doing a task, see if there's a simple way to
automate it. (Like
[what I do for creating my kcd.im/ short urls](https://github.com/kentcdodds/hive-api)
\+
[shorten](https://github.com/kentcdodds/dotfiles/blob/94c00b43354f86595647f9ff18057ff9e6469d33/.bash_profile#L61-L63)
😄, which happens to be another form of automation and productivity boost
because short URLs are easier/faster to give to people, and people remember them
better).
### Enable others
Many of those releases of my open source projects I do are releasing code that I
did not write. I put forth an investment of time in helping
[and teaching](http://kcd.im/pull-request) other people contribute to my
projects and [do things](https://github.com/kentcdodds/all-contributors) to help
motivate people to do so. This means that I'm able to do more because other
people handle a lot of project maintenance for me so I can do other things.
### Don't answer the same question twice
I learned early on that people ask me the same questions over and over. I like to give
them answers, but I also found out quickly that
[I don't have time to answer everyone](http://kcd.im/no-time) and it's a bit
frustrating to answer the same question multiple times. This is why having
[an active blog](https://blog.kentcdodds.com/) and [an AMA](http://kcd.im/ama)
are super helpful.
If someone asks me a question, 99% of the time I'll ask them to ask it on my
AMA. If I get the same question many times, then I'll make it the subject of a
DevTip or blog. Having multiple places/formats I can go to answer people's
questions in public does four things:
1. Allows me to answer their question
2. Allows others to see that answer (increases the impact of the value I'm
creating)
3. Gives me motivation to give them a higher quality answer.
4. Gives me a link to share with the next person who asks (which is way faster
than writing it out again)
I guess it also contributes to the illusion that I'm doing more and I'm more
productive. I'm sure you answer a lot of questions as well, but how does anyone
know if you don't share?
### Avoiding Burnout
When I tweeted that this would be the subject of today's newsletter, my good
friend [Mark Dalgleish](https://twitter.com/markdalgleish)
[responded](https://twitter.com/markdalgleish/status/1038922626826031104):
> _Because you haven't burned out yet?_
I honestly don't think that I've ever truly burned out. I've only been doing
this software thing professionally for ~4 years, so maybe that's why. I've
definitely burned out on specific projects or frameworks, but I've generally
been able to keep moving and doing things that keep me excited and provide value
to the world while taking care of myself and my relationships.
I should probably do this subject better justice in another blog post, but I'll
just say that in general what I do to avoid burnout is to not do stuff I don't
have to do or want to do. I've learned and internalized that I don't owe anyone
anything unless I've made an actual commitment of marriage/employment/etc. So
while I try to be kind and helpful, at the end of the day if I can't help, then
I don't and I don't stress over it.
For example, there are many open issues on my GitHub projects that get no
response from me because I've chosen to give my time to other things I'd rather
do. I do feel bad I can't do more, but I don't stress over it.
This subject isn't all that simple, but that's all I have time for (and I'm not
going to stress over not giving you more because I don't owe you anything 😜
#seewhatididthere).
### Hyper-focused
I've listed pretty much everything I do. You may have noticed that I don't have
many hobbies. This is true. I have a few things that I do for fun, but it pretty
much all boils down to: Family, Religion, and Coding.
Even though this is working out so far, I don't believe this is sustainable.
This is one reason why I'm so excited about writing this novel for
[NaNoWriMo](https://nanowrimo.org/). It'll be a new creative outlet. And
hopefully by November I'll be done with most of
[the HUGE things I'm working](https://twitter.com/kentcdodds/status/1038584983990849536)
on so I can dedicate myself to writing 50,000 words in 30 days :)
That said, I think short bursts of hyper-focus do help me get a lot done.
Whether I'm hyper-focused on an [egghead.io](http://egghead.io/) course, or
getting something specific done at work, it helps me get things done. I'm not
sure how to explain it, but for me hyper-focus means that I kinda don't think
about anything else for a while. When I'm not with my family or fulfilling
another commitment, I'm thinking about and working on this thing until it's
done.
I didn't explain that well and should probably remove this section, but I'm not
gonna. Maybe it'll be helpful for someone.
**Continued:** Turns out this was helpful to a few people so I thought I'd
expound on this a tiny bit (and my wife suggested that I do as well because she
thinks this makes a significant impact on who I am, though not necessarily that
it's a desirable trait).
So this isn't necessarily a short-term kind of thing, and it doesn't prevent me
from sleeping well at night (though sometimes I do have trouble shutting my
brain off, most of the time I sleep fine). This also isn't the same as staying
focused and on-task during a period of several hours when I'm trying to get
something done (something that I'm typically not very good at doing unless I'm
_very_ excited about it).
This hyper-focus is pretty much that I immerse myself in the subject for a
period of time. As an example, since I decided that I want to write this novel a
few weeks ago, I've found a TON of content online about tools novel writers use
to make their books "work" and I've been consuming it at a rapid rate. It's
filled my idle mind. I'm still able to turn my attention to my family, my work,
or courses that I'm working on, but when I'm doing mundane tasks and my brain is
able to think freely, it's consumed by the idea of the novel itself and learning
tools that I can apply when I start writing it in November. Ask my wife. If
we're not talking about something that's actually important, then I'll
inevitably turn the conversation over to the book (and she's been
[extremely helpful](https://twitter.com/kentcdodds/status/1043712163490160640)).
I don't have tips of how you can develop this in yourself, and I'm not even sure
that I would recommend it. It's just something that I do that my wife and I
think may have something to do with my productivity.
### Spend more time producing than consuming
I do not spend a lot of time watching other people's courses or reading other
people's blogs/newsletters. I definitely will skim blog posts as needed, or I'll
sit down and watch a few [egghead.io](http://egghead.io/) lessons or part of a
Frontend Master's course when there's something specific I need to learn. I love
[Dave Geddes's](https://gedd.ski/) mastery games on
[CSS grid](https://www.gridcritters.com/) and
[Flexbox](https://flexboxzombies.com/p/flexbox-zombies), but generally I spend a
bunch more time working on producing my own material/projects. I think that
makes me more productive as well.
### The importance of balance
With all this talk of productivity, I should probably mention that I've learned
that it's important to live a balanced life. Like I said, I can get pretty
focused on one thing, but I spend a lot of time with my family and that brings
me joy. Shutting down for a little bit, taking a step back, and **working on
your relationships** is where you'll get your juice to keep going.
So **the fact that I'm married and have four kids and a dog isn't a detriment to
my productivity,** but really _it's an important part of my secret to
productivity._ They motivate me and recharge me in ways that I couldn't
understand before I had them in my life.
### Conclusion
So the reasons it appears I'm so productive are multi-faceted:
- Lots of it is an illusion
- I'm privileged to work at a place that values open knowledge sharing and
doesn't limit what I do in my free time
- I learn in public, thereby increasing the impact of the value that I create.
- I automate mundane/time consuming/context switching tasks
- I answer questions in a public forum
- I spend more time producing than consuming
I should say also that my wife plays a **huge** role in how I'm so productive.
However, she is a pretty private person and asked that I not talk about her much
publicly (except she did give me permission to say this). I couldn't do all of
the things I do if it weren't for her.
I don't want to give the false impression that I only _appear_ productive
either. I really do feel like I'm quite productive. But hopefully this makes my
productivity more realistic and attainable in your mind. I hope some of these
ideas help inspire you to be more productive and more importantly find more
happiness through your relationships. Good luck!
Subscribe to my newsletter below to get content like this delivered to your
inbox two weeks before it's published to my blog:
**Learn more about career development from me**:
- [Why and How I started public speaking](https://blog.kentcdodds.com/why-and-how-i-started-public-speaking-d5ae78303707)
- [Getting Noticed and Widening Your Reach](https://buttondown.email/kentcdodds/archive/03761505-8609-404c-a5b7-5367013292bf)
- [Zero to 60 in Software Development: How to Jumpstart Your Career — Forward 4 Web Summit](https://www.youtube.com/watch?v=-qPh6I2hfjw&list=PLV5CVI1eNcJgNqzNwcs4UKrlJdhfDjshf)
**Things to not miss**:
- [Take the State of JavaScript 2018 Survey!](https://medium.com/@sachagreif/take-the-state-of-javascript-2018-survey-c43be2fcaa9) — And
make sure to add react-testing-library in the "other" field for "testing
tools." If they can add enzyme, then they should have react-testing-library as
well!
- [A Geek Leader 059: Kent C. Dodds](https://www.ageekleader.com/agl-059-kent-c-dodds/) — On
this podcast I talk about my career story and stuff! Should be interesting if
you thought this blog post was interesting :)
- [jest-expect-message](https://github.com/mattphillips/jest-expect-message) by
[Matt Phillips](https://twitter.com/mattphillipsio): Add custom message to
Jest expects 🃏🗯
- [jest-extended](https://github.com/jest-community/jest-extended): Additional
Jest matchers 🃏💪
| 52.769821 | 176 | 0.766878 | eng_Latn | 0.999143 |
fd6aeb49e3003686c1c2286e0ba5bec4d573a984 | 980 | md | Markdown | elixir/elixir-mars-water/README.md | marcinbiegun/exercises | 36ad942e8d40d6471136326a3f6d09285bbd90aa | [
"MIT"
] | 1 | 2018-12-11T14:09:14.000Z | 2018-12-11T14:09:14.000Z | elixir/elixir-mars-water/README.md | marcinbiegun/exercises | 36ad942e8d40d6471136326a3f6d09285bbd90aa | [
"MIT"
] | null | null | null | elixir/elixir-mars-water/README.md | marcinbiegun/exercises | 36ad942e8d40d6471136326a3f6d09285bbd90aa | [
"MIT"
] | null | null | null | # MarsWater
This is a technical task I've got during a recruitment process.
## Task description
You are given an input string containing digits
separated by spaces.
Sample input:
```
3 4 2 6 2 6 0 2 3 0 0 2 8 8 9 2 1 4
```
The first digit is the number of results requested.
The second digit is the size of the grid.
The rest of the digits fill the grid with measurements.
For each grid position, a score is calculated by summing its value
and all neighbour values, so the total score is the sum of 9 grid
positions.
You need to return the specified number of top-scored grid positions.
Measurements outside of the grid count as 0.
For the sample input, the solution would be:
```
(2, 1, score: 37)
(2, 2, score: 30)
(3, 1, score: 27)
```
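
A quick sketch of that scoring in Python (the repository itself solves this in
Elixir; the names here are illustrative, not the repo's):

```python
def top_scores(line):
    """Parse the input line and return the top-scored (x, y, score) triples."""
    nums = [int(tok) for tok in line.split()]
    count, size, cells = nums[0], nums[1], nums[2:]
    grid = [cells[row * size:(row + 1) * size] for row in range(size)]

    def score(x, y):
        # Sum the 3x3 neighbourhood; positions outside the grid count as 0.
        return sum(
            grid[ny][nx]
            for ny in range(y - 1, y + 2)
            for nx in range(x - 1, x + 2)
            if 0 <= nx < size and 0 <= ny < size
        )

    scored = [(x, y, score(x, y)) for y in range(size) for x in range(size)]
    # Stable sort: ties keep row-major order, which reproduces the sample.
    scored.sort(key=lambda t: -t[2])
    return scored[:count]

print(top_scores("3 4 2 6 2 6 0 2 3 0 0 2 8 8 9 2 1 4"))
# -> [(2, 1, 37), (2, 2, 30), (3, 1, 27)]
```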
## Running the code
Run the tests:
```
mix test
```
Benchmark the solution's performance:
```
mix run benchmark.exs
```
Profile the solution (edit the file to change which code will be
profiled):
```
mix run profiler.ex
```
| 16.333333 | 69 | 0.716327 | eng_Latn | 0.99825 |
fd6c75440b26062d67ba72876ea92a78ea879667 | 1,015 | md | Markdown | docs/outlook/mapi/cbsproptagarray.md | MicrosoftDocs/office-developer-client-docs.zh-CN | e0e2958677a0eb560027161eeccae29528d647be | [
"CC-BY-4.0",
"MIT"
] | 3 | 2018-11-14T02:50:23.000Z | 2021-04-21T00:13:46.000Z | docs/outlook/mapi/cbsproptagarray.md | MicrosoftDocs/office-developer-client-docs.zh-CN | e0e2958677a0eb560027161eeccae29528d647be | [
"CC-BY-4.0",
"MIT"
] | 5 | 2018-11-30T15:41:39.000Z | 2021-12-08T02:53:08.000Z | docs/outlook/mapi/cbsproptagarray.md | MicrosoftDocs/office-developer-client-docs.zh-CN | e0e2958677a0eb560027161eeccae29528d647be | [
"CC-BY-4.0",
"MIT"
] | 3 | 2018-08-09T20:27:19.000Z | 2019-10-13T18:16:54.000Z | ---
title: CbSPropTagArray
manager: soliver
ms.date: 03/09/2015
ms.audience: Developer
ms.topic: reference
ms.prod: office-online-server
ms.localizationpriority: medium
api_name:
- MAPI.CbSPropTagArray
api_type:
- COM
ms.assetid: c5053f27-e23d-4a65-b079-5f33765c33f7
description: 'Last modified: March 9, 2015'
ms.openlocfilehash: 45630630cfb8dcf5fc654f990208feb15c8b1cff
ms.sourcegitcommit: a1d9041c20256616c9c183f7d1049142a7ac6991
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 09/24/2021
ms.locfileid: "59576448"
---
# <a name="cbsproptagarray"></a>CbSPropTagArray
**Applies to**: Outlook 2013 | Outlook 2016

Computes the number of bytes in an existing [SPropTagArray](sproptagarray.md) structure.

|||
|:-----|:-----|
|Header file: <br/> |Mapidefs.h <br/> |
|Related structure: <br/> |**SPropTagArray** <br/> |
```cpp
CbSPropTagArray (_lparray)
```
## <a name="parameters"></a>参数
_ _lparray_
> 指向现有 **SPropTagArray 结构的** 指针。
## <a name="see-also"></a>另请参阅
[SPropTagArray](sproptagarray.md)
[与结构相关的宏](macros-related-to-structures.md)
| 18.796296 | 60 | 0.714286 | yue_Hant | 0.48867 |
fd6ca1ea8c2f9aebb26adda441615ec905bed3e4 | 180 | md | Markdown | showcases/bdd_pandemic_on_hold/notes/bdd_pandemic.md | marc-bouvier/sandbox | 84993257edcfd39e88a3e28378d8dbed965576dc | [
"Unlicense"
] | null | null | null | showcases/bdd_pandemic_on_hold/notes/bdd_pandemic.md | marc-bouvier/sandbox | 84993257edcfd39e88a3e28378d8dbed965576dc | [
"Unlicense"
] | null | null | null | showcases/bdd_pandemic_on_hold/notes/bdd_pandemic.md | marc-bouvier/sandbox | 84993257edcfd39e88a3e28378d8dbed965576dc | [
"Unlicense"
] | null | null | null | # Pandemic BDD
Feature: Infection
Scenario:
"New York"=>"London"
"New York"=>"Madrid"
"Madrid"=>"Paris"
"Paris"=>"Algiers"
"Paris"=>"Milan"
"Paris"=>"Essen"
"Paris"=>"London"
| 11.25 | 20 | 0.638889 | yue_Hant | 0.991088 |
fd6d3de0950eac22041a6d37942f06b979c66028 | 1,246 | md | Markdown | README.md | walts81/stateli-react | 7c3661f625269a995ea817642fb9cb1d0f6d23e4 | [
"MIT"
] | null | null | null | README.md | walts81/stateli-react | 7c3661f625269a995ea817642fb9cb1d0f6d23e4 | [
"MIT"
] | 1 | 2021-05-11T09:19:17.000Z | 2021-05-11T09:19:17.000Z | README.md | walts81/stateli-react | 7c3661f625269a995ea817642fb9cb1d0f6d23e4 | [
"MIT"
] | null | null | null | # Stateli React
[](https://travis-ci.com/walts81/stateli-react)
[](https://coveralls.io/github/walts81/stateli-react)
A [React][react] library that enables [Stateli][stateli] to work with React and [Redux Dev Tools][reduxdevtools].
### Installation
Install stateli-react with npm.
```sh
$ npm install stateli stateli-react --save
```
### Usage
```javascript
// in React main component
import React from 'react';
import { StateliStore } from 'stateli';
import { StateliProvider } from 'stateli-react';
import App from './App.jsx'; // <-- your React app component
const store = new StateliStore({
actions: [],
mutations: [],
getters: [],
state: {},
});
export default class MainComponent extends React.Component {
render() {
return (
<div>
<StateliProvider store={store}>
<App />
</StateliProvider>
</div>
);
}
}
```
## License
[MIT](LICENSE)
[react]: https://reactjs.org/
[redux]: https://redux.js.org/
[reduxdevtools]: https://github.com/reduxjs/redux-devtools
[stateli]: https://github.com/walts81/stateli
| 23.509434 | 138 | 0.67817 | kor_Hang | 0.389051 |
fd6d43c2ab38e60e9ad0f1ccbec340df183bb700 | 19,045 | md | Markdown | README.md | AGou-ops/docker-compose | e72ba10727f9f24a08302ffb6d45fb73240d5a2f | [
"MIT"
] | 1 | 2021-11-17T07:49:45.000Z | 2021-11-17T07:49:45.000Z | README.md | AGou-ops/docker-compose | e72ba10727f9f24a08302ffb6d45fb73240d5a2f | [
"MIT"
] | null | null | null | README.md | AGou-ops/docker-compose | e72ba10727f9f24a08302ffb6d45fb73240d5a2f | [
"MIT"
] | null | null | null | # docker-compose-liunx
Use docker-compose to orchestrate a suite of environments for one-click, rapid deployment; a handy toolkit for ops beginners.

## Install Docker

```shell
# Install docker from the yum repository
sudo yum -y install docker
# Start docker
sudo systemctl start docker
# Start on boot
sudo systemctl enable docker
```
## Install `docker-compose`

```shell
# Install the EPEL package
sudo yum -y install epel-release
# Install pip3
sudo yum install -y python36-pip
# Upgrade pip
sudo pip3 install --upgrade pip
# Verify the pip3 version
pip3 --version
# Install docker-compose
sudo pip3 install -U docker-compose
# Verify the docker-compose version
docker-compose --version
# Install the bash completion plugin
# curl -L https://raw.githubusercontent.com/docker/compose/1.25.0/contrib/completion/bash/docker-compose > /etc/bash_completion.d/docker-compose
```
## Uninstall `docker-compose`
```shell
pip3 uninstall docker-compose
```
## Common `docker-compose` commands

```shell
# Build images
docker-compose build
# Build images; --no-cache skips the cache. Otherwise, rebuilding after editing
# the Dockerfile may reuse cached layers and the new edits won't take effect
docker-compose build --no-cache
# config validates that the compose file format is correct
docker-compose -f docker-compose.yml config
# Run services
docker-compose up -d
# Start/stop a service
docker-compose start/stop <service-name>
# Stop and remove services
docker-compose down
# View container logs
docker logs -f <container-id>
# List images
docker-compose images
# Pull an image
docker-compose pull <image-name>
```
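
For reference, a minimal `docker-compose.yml` that these commands can operate on
looks something like this (an illustrative sketch; the real compose files live
in this repository):

```yaml
# docker-compose.yml (sketch)
version: '3'
services:
  web:
    image: nginx:latest              # image to run
    ports:
      - "80:80"                      # host:container port mapping
    volumes:
      - ./html:/usr/share/nginx/html # mount local files into the container
    restart: always                  # restart the container if it exits
```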
## Handy shell one-liners

```shell
# Remove all containers
docker stop `docker ps -q -a` | xargs docker rm
# Remove all images tagged <none>
docker images|grep \<none\>|awk '{print $3}'|xargs docker rmi
# Find a container's IP address
docker inspect <container-name-or-id> | grep "IPAddress"
# Create a network named mynet and put two containers on it
# (containers must share a network to reach each other)
docker network create mynet
docker run -d --net mynet --name container1 my_image
docker run -it --net mynet --name container2 another_image
```
---
## Environment setup

```shell
# Install git first: yum install -y git
git clone https://gitee.com/zhengqingya/docker-compose.git
cd docker-compose/Liunx
```
====================================================================================\
========================= ↓↓↓↓↓↓ Environment deployment start ↓↓↓↓↓↓ ====================================\
====================================================================================\
## Run the services
### Portainer
```shell
docker-compose -f docker-compose-portainer.yml -p portainer up -d
-p: project name
-f: path to the docker-compose.yml file
-d: run in the background
```
Access URL: [`ip:9000`](www.zhengqingya.com:9000)
### MySQL
```shell
# 5.7
docker-compose -f docker-compose-mysql5.7.yml -p mysql5.7 up -d
# 8.0
docker-compose -f docker-compose-mysql8.0.yml -p mysql8.0 up -d
```
### MySQL master-slave replication
```shell
docker-compose -f docker-compose-mysql-master-slave.yml -p mysql-master-slave up -d
```
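
Replication requires the two MySQL instances to have distinct server IDs, with
binary logging enabled on the master. A sketch of the relevant settings (the
actual values live in the config files mounted by this compose project):

```ini
# master my.cnf (sketch)
[mysqld]
server-id = 1          # must be unique per server
log-bin   = mysql-bin  # enable the binary log that slaves replay

# slave my.cnf (sketch)
[mysqld]
server-id = 2
```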
```shell
# ================== ↓↓↓↓↓↓ Configure the master ↓↓↓↓↓↓ ==================
# Enter the master container
docker exec -it mysql_master /bin/bash
# Log in to mysql
mysql -uroot -proot
# Create user slave with password 123456
CREATE USER 'slave'@'%' IDENTIFIED BY '123456';
# Grant the slave user the `REPLICATION SLAVE` and `REPLICATION CLIENT` privileges, used to sync data between the master and slave databases
GRANT REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'slave'@'%';
# To grant all privileges instead, run: GRANT ALL PRIVILEGES ON *.* TO 'slave'@'%';
# Apply the changes
FLUSH PRIVILEGES;
# Check master status
show master status;
# Note: the File and Position values will be needed on the slave. Don't touch the master before the slave setup is finished, or the status (the File and Position values) will change!!!
# +------------------+----------+--------------+------------------+-------------------+
# | File | Position | Binlog_Do_DB | Binlog_Ignore_DB | Executed_Gtid_Set |
# +------------------+----------+--------------+------------------+-------------------+
# | mysql-bin.000003 | 769 | | | |
# +------------------+----------+--------------+------------------+-------------------+
# 1 row in set (0.00 sec)
# ================== ↓↓↓↓↓↓ Configure the slave ↓↓↓↓↓↓ ==================
# Enter the slave container
docker exec -it mysql_slave /bin/bash
# Log in to mysql
mysql -uroot -proot
change master to master_host='www.zhengqingya.com',master_port=3306, master_user='slave', master_password='123456', master_log_file='mysql-bin.000003', master_log_pos= 769, master_connect_retry=30;
# Start replication (to stop it: stop slave;)
start slave;
# Check replication status
show slave status \G
# If Slave_IO_Running and Slave_SQL_Running are both Yes, replication is configured!
# If Slave_IO_Running is Connecting while Slave_SQL_Running is Yes, something in the configuration is wrong; recheck each step above, and debug using the Last_IO_Error field or a web search…
# *************************** 1. row ***************************
# Slave_IO_State: Waiting for master to send event
# Master_Host: www.zhengqingya.com
# Master_User: slave
# Master_Port: 3306
# Connect_Retry: 30
# Master_Log_File: mysql-bin.000003
# Read_Master_Log_Pos: 769
# Relay_Log_File: c598d8402b43-relay-bin.000002
# Relay_Log_Pos: 320
# Relay_Master_Log_File: mysql-bin.000003
# Slave_IO_Running: Yes
# Slave_SQL_Running: Yes
# Replicate_Do_DB:
```
###### Fixing master-slave data inconsistency
```shell
# Note: stop writes to the master while doing this
# Check replication status on the slave
docker exec -it mysql_slave /bin/bash
mysql -uroot -proot
show slave status \G
# Slave_IO_Running: Yes
# Slave_SQL_Running: No
# 1. Manually sync the data from master to slave
# First, stop replication on the slave
stop slave;
# Dump the master's data
mysqldump -h www.zhengqingya.com -P 3306 -uroot -proot --all-databases > /tmp/all.sql
# Import it into the slave
mysql -uroot -proot
source /tmp/all.sql;
# 2. Restart replication
# Check the master's status => grab the File and Position values
docker exec -it mysql_master /bin/bash
mysql -uroot -proot
show master status;
# On the slave
change master to master_host='www.zhengqingya.com',master_port=3306, master_user='slave', master_password='123456', master_log_file='mysql-bin.000004', master_log_pos= 488117, master_connect_retry=30;
start slave;
# Check replication status
show slave status \G
# Slave_IO_Running: Yes
# Slave_SQL_Running: Yes
```
### Mycat
> Open-source middleware, written in Java, that implements the MySQL network protocol
> For deployment details see https://zhengqing.blog.csdn.net/article/details/104700437
```shell
docker-compose -f docker-compose-mycat.yml -p mycat up -d
```
Connection parameters for the Navicat GUI tool
> mysql -hwww.zhengqingya.com -P8066 -uroot -proot

| Name | Value |
| -------------- | ------------------- |
| Connection name | mycat_8066 |
| Host name or IP address | www.zhengqingya.com |
| Port | 8066 |
| Username | root |
| Password | root |

### Mycat-web
> A management and monitoring platform for visual Mycat operations
```shell
docker-compose -f docker-compose-mycat-web.yml -p mycat-web up -d
```
Access URL: [`<ip>:8082/mycat`](www.zhengqingya.com:8082/mycat)
### Yearning
```shell
docker-compose -f docker-compose-yearning.yml -p yearning up -d
```
Access URL: [`<ip>:8000`](www.zhengqingya.com:8000)
Default login account/password: `admin/Yearning_admin`
### Oracle18c
```shell
docker-compose -f docker-compose-oracle18c.yml -p oracle18c up -d
```
> Configuration reference: [Docker(9) Installing Oracle18c](https://zhengqing.blog.csdn.net/article/details/103296040)
### Couchbase
```shell
docker-compose -f docker-compose-couchbase.yml -p couchbase up -d
```
Admin console URL: [`<ip>:8091`](www.zhengqingya.com:8091)
Default login account/password: `Administrator/password`
### Redis
```shell
# Grant permissions (read, write, execute) on all files in the current directory
chmod -R 777 ./redis
# Run
docker-compose -f docker-compose-redis.yml -p redis up -d
```
Connect to redis
```shell
docker exec -it redis redis-cli -a 123456 # password is 123456
```
### Jrebel
```shell
docker-compose -f docker-compose-jrebel.yml -p jrebel up -d
```
By default this reverse-proxies `idea.lanyus.com`. Once it is running:
1. Activation URL: `<ip>:8888/UUID` -> Note: you can generate the UUID yourself, but it must be a valid UUID to pass validation -> [online UUID generator](http://www.uuid.online/)
2. Any email address will do
### Nginx
```shell
docker-compose -f docker-compose-nginx.yml -p nginx up -d
```
Access URL: [`<ip>:80`](www.zhengqingya.com:80)
### Elasticsearch
```shell
# Grant permissions (read, write, execute) on all files in the current directory
chmod -R 777 ./elasticsearch
# Run
docker-compose -f docker-compose-elasticsearch.yml -p elasticsearch up -d
# After it is running, grant permissions again on the newly created files
chmod -R 777 ./elasticsearch
```
1. ES access URL: [`<ip>:9200`](www.zhengqingya.com:9200)
Default account/password: `elastic/123456`
2. kibana access URL: [`<ip>:5601/app/dev_tools#/console`](www.zhengqingya.com:5601/app/dev_tools#/console)
Default account/password: `elastic/123456`
#### Setting the ES password
```shell
# Enter the container
docker exec -it elasticsearch /bin/bash
# Set passwords - randomly generated
# elasticsearch-setup-passwords auto
# Set passwords - entered manually
elasticsearch-setup-passwords interactive
# Access
curl 127.0.0.1:9200 -u elastic:123456
```
![](./images/20210221211724.png)
![](./images/20210221211840.png)
![](./images/20210221212324.png)
#### Changing the ES password
```shell
# Change the elastic password to 123456 (note: the command asks you to authenticate with the previous account/password first)
curl -H "Content-Type:application/json" -XPOST -u elastic 'http://127.0.0.1:9200/_xpack/security/user/elastic/_password' -d '{ "password" : "123456" }'
```
### ELK
```shell
# Grant permissions (read, write, execute) on all files in the current directory
chmod -R 777 ./elk
# Run
docker-compose -f docker-compose-elk.yml -p elk up -d
# If the startup logs still report permission problems, grant permissions again on the newly created files
chmod -R 777 ./elk
```
1. ES access URL: [`<ip>:9200`](www.zhengqingya.com:9200)
Default account/password: `elastic/123456`
2. kibana access URL: [`<ip>:5601`](www.zhengqingya.com:5601)
Default account/password: `elastic/123456`
#### Setting the ES password
```shell
# Enter the container
docker exec -it elasticsearch /bin/bash
# Set passwords - randomly generated
# elasticsearch-setup-passwords auto
# Set passwords - entered manually
elasticsearch-setup-passwords interactive
```
![](./images/20210221212349.png)
### Filebeat
> A lightweight log shipper that can forward specified logs to Logstash, Elasticsearch, Kafka, Redis, etc.
```shell
docker-compose -f docker-compose-filebeat.yml -p filebeat up -d
```
### RabbitMQ
```shell
# Grant permissions (read, write, execute) on all files in the current directory
chmod -R 777 ./rabbitmq
# Run [ Note: if it was installed before, clear the browser cache and delete rabbitmq's stored data (e.g. the data directory mapped to the host here) before reinstalling, or problems may occur! ]
docker-compose -f docker-compose-rabbitmq.yml -p rabbitmq up -d
# Run the 3.7.8-management version
# docker-compose -f docker-compose-rabbitmq-3.7.8-management.yml -p rabbitmq up -d
# Enter the container
docker exec -it rabbitmq /bin/bash
# Enable the delayed message plugin
rabbitmq-plugins enable rabbitmq_delayed_message_exchange
# List installed plugins
rabbitmq-plugins list
```
Web management console: [`<ip>:15672`](www.zhengqingya.com:15672)
Login account/password: `admin/admin`
### RabbitMQ - Cluster
```shell
# Grant permissions (read, write, execute) on all files in the current directory
chmod -R 777 ./rabbitmq-cluster
# Set permissions on the Erlang cookie file (the cookie file must be accessible only by its owner)
chmod 600 ./rabbitmq-cluster/.erlang.cookie
# Run [ Note: if it was installed before, clear the browser cache and delete rabbitmq's stored data (e.g. the data directory mapped to the host here) before reinstalling, or problems may occur! ]
docker-compose -f docker-compose-rabbitmq-cluster.yml -p rabbitmq-cluster up -d
# Enable the delayed message plugin (note: it cannot be enabled on RAM nodes)
docker exec rabbitmq-1 /bin/bash -c 'rabbitmq-plugins enable rabbitmq_delayed_message_exchange'
docker exec rabbitmq-2 /bin/bash -c 'rabbitmq-plugins enable rabbitmq_delayed_message_exchange'
# Run the script => configure the cluster
sh ./rabbitmq-cluster/init-rabbitmq.sh
# Configure mirrored queues (a built-in active-active redundancy option; unlike ordinary queues, a mirrored queue keeps copies on other nodes in the cluster, and if the master node becomes unavailable, the oldest mirror is elected as the new master)
docker exec -it rabbitmq-1 /bin/bash
rabbitmqctl set_policy -p my_vhost ha-all "^" '{"ha-mode":"all"}' --apply-to all
# List the `ha-all` mirrored-queue policy
rabbitmqctl list_policies -p my_vhost
# Delete the `ha-all` mirrored-queue policy
rabbitmqctl clear_policy -p my_vhost ha-all
```
Web management console: [`<ip>:15672`](www.zhengqingya.com:15672)
Login account/password: `admin/admin`
![](./images/20210309232205.png)
### ActiveMQ
```shell
docker-compose -f docker-compose-activemq.yml -p activemq up -d
```
Access URL: [`<ip>:8161`](www.zhengqingya.com:8161)
Login account/password: `admin/admin`
### BaiduPCS-Web
```shell
docker-compose -f docker-compose-baidupcs-web.yml -p baidupcs-web up -d
```
Access URL: [`<ip>:5299`](www.zhengqingya.com:5299)
### MinIO
```shell
docker-compose -f docker-compose-minio.yml -p minio up -d
```
Access URL: [`<ip>:9000/minio`](www.zhengqingya.com:9000/minio)
Login account/password: `root/password`
### Nacos
```shell
# Note: update the database connection info and JVM parameters in docker-compose-nacos-mysql.yml
docker-compose -f docker-compose-nacos.yml -p nacos up -d
# MySQL-backed version [ you must create the `nacos_config` database yourself and run the `/Liunx/nacos_mysql/nacos-mysql.sql` script ]
# nacos 1.4.1
docker-compose -f docker-compose-nacos-mysql.yml -p nacos up -d
# nacos 2.0.3
docker-compose -f docker-compose-nacos-mysql-2.0.3.yml -p nacos_v2.0.3 up -d
```
Access URL: [`<ip>:8848/nacos`](www.zhengqingya.com:8848/nacos)
Default login account/password: `nacos/nacos`
> Note: `docker-compose-nacos-mysql.yml` has password authentication enabled, so a Java client needs the following extra configuration
```yml
spring:
  cloud:
    nacos:
      discovery:
        username: nacos
        password: nacos
      config:
        username: ${spring.cloud.nacos.discovery.username}
        password: ${spring.cloud.nacos.discovery.password}
```
### Sentinel
```shell
# Standard version
docker-compose -f docker-compose-sentinel.yml -p sentinel up -d
# Version that persists configuration to MySQL
# docker-compose -f docker-compose-sentinel-mysql.yml -p sentinel up -d
```
Access URL: [`<ip>:8858`](www.zhengqingya.com:8858)
Login account/password: `sentinel/sentinel`
### Kafka
```shell
docker-compose -f docker-compose-kafka.yml -p kafka up -d
```
Cluster management URL: [`<ip>:9000`](www.zhengqingya.com:9000)
### Tomcat
```shell
docker-compose -f docker-compose-tomcat.yml -p tomcat up -d
```
Access URL: [`<ip>:8081`](www.zhengqingya.com:8081)
### GitLab
```shell
docker-compose -f docker-compose-gitlab.yml -p gitlab up -d
```
Access URL: [`<ip>:10080`](www.zhengqingya.com:10080)
The default account is root; you set its password on your first visit to the page
### Jenkins
```shell
docker-compose -f docker-compose-jenkins.yml -p jenkins up -d
```
Access URL: [`<ip>:8080`](www.zhengqingya.com:8080)
### Nextcloud - multi-device file sync
```shell
docker-compose -f docker-compose-nextcloud.yml -p nextcloud up -d
```
Access URL: [`<ip>:81`](www.zhengqingya.com:81) , then create an admin account
### Walle - multi-user, multi-language deployment platform
```shell
docker-compose -f docker-compose-walle.yml -p walle up -d && docker-compose -f docker-compose-walle.yml logs -f
```
Access URL: [`<ip>:80`](www.zhengqingya.com:80)
Initial login accounts:
```
Superadmin: super@walle-web.io \ Walle123
Owner: owner@walle-web.io \ Walle123
Master: master@walle-web.io \ Walle123
Developer: developer@walle-web.io \ Walle123
Reporter: reporter@walle-web.io \ Walle123
```
### Grafana - open-source data visualization tool (monitoring, statistics, alerting)
```shell
docker-compose -f docker-compose-grafana.yml -p grafana up -d
```
Access URL: [`http://<ip>:3000`](www.zhengqingya.com:3000)
Default login account/password: `admin/admin`
### Grafana Loki - a horizontally scalable, highly available, multi-tenant log aggregation system
```shell
# Grant permissions first, or you will get the error: `cannot create directory '/var/lib/grafana/plugins': Permission denied`
chmod 777 $PWD/grafana_promtail_loki/grafana/data
chmod 777 $PWD/grafana_promtail_loki/grafana/log
# Run
docker-compose -f docker-compose-grafana-promtail-loki.yml -p grafana_promtail_loki up -d
```
Access URL: [`http://<ip>:3000`](www.zhengqingya.com:3000)
Default login account/password: `admin/admin`
### Graylog - log management tool
```shell
docker-compose -f docker-compose-graylog.yml -p graylog_demo up -d
```
Access URL: [`http://<ip>:9001`](www.zhengqingya.com:9001)
Default login account/password: `admin/admin`
### FastDFS - distributed file system
```shell
docker-compose -f docker-compose-fastdfs.yml -p fastdfs up -d
```
###### Testing
```shell
# Wait for a log line like the following to appear:
# [2020-07-24 09:11:43] INFO - file: tracker_client_thread.c, line: 310, successfully connect to tracker server 39.106.45.72:22122, as a tracker client, my ip is 172.16.9.76
# Enter the storage container
docker exec -it fastdfs_storage /bin/bash
# Go to the `/var/fdfs` directory
cd /var/fdfs
# Run the following command; it returns the storage path of the uploaded file, which you can prefix with the IP address to test access
/usr/bin/fdfs_upload_file /etc/fdfs/client.conf test.jpg
# ex:
http://www.zhengqingya.com:8888/group1/M00/00/00/rBEAAl8aYsuABe4wAAhfG6Hv0Jw357.jpg
```
### YApi - an efficient, easy-to-use, and powerful API management platform
```shell
docker-compose -f docker-compose-yapi.yml -p yapi up -d
```
Output on a successful run:
```shell
log: mongodb load success...
Admin account initialized successfully, account: "admin@admin.com", password: "ymfe.org"
Deployment succeeded; switch to the deployment directory and run "node vendors/server/app.js" to start the server, then open http://127.0.0.1:3000 in a browser
log: -------------------------------------swaggerSyncUtils constructor-----------------------------------------------
log: Service started, open the link below to access it:
http://127.0.0.1:3000/
log: mongodb load success...
```
Access URL: [`http://<ip>:3000`](www.zhengqingya.com:3000)
Default login account/password: `admin@admin.com/ymfe.org`
### RocketMQ
> Note: in `xx/rocketmq/rocketmq_broker/conf/broker.conf`, set `brokerIP1` to the host machine's IP
```shell
docker-compose -f docker-compose-rocketmq.yml -p rocketmq up -d
```
Access URL: [`http://<ip>:9002`](www.zhengqingya.com:9002)
### XXL-JOB - distributed task scheduling platform
```shell
docker-compose -f docker-compose-xxl-job.yml -p xxl-job up -d
```
Access URL: [`http://<ip>:9003/xxl-job-admin`](www.zhengqingya.com:9003/xxl-job-admin)
Default login account/password: `admin/123456`
### MongoDB - a general-purpose, document-based distributed database
```shell
docker-compose -f docker-compose-mongodb.yml -p mongodb up -d
```
Access URL: [`http://<ip>:1234`](www.zhengqingya.com:1234)
Connection string: `mongodb://admin:123456@<ip>:27017`
### Zookeeper
```shell
docker-compose -f docker-compose-zookeeper.yml -p zookeeper up -d
```
Web UI URL: [`http://<ip>:9090`](www.zhengqingya.com:9090)
> PrettyZoo, a desktop GUI tool: [https://github.com/vran-dev/PrettyZoo](https://github.com/vran-dev/PrettyZoo)
### Flowable - business process engine
```shell
docker-compose -f docker-compose-flowable.yml -p flowable up -d
```
Web UI URL: [`http://<ip>:9004/flowable-ui`](www.zhengqingya.com:9004/flowable-ui)
Default login account/password: `admin/test`
### Prometheus - monitoring system and time-series database
> Note: this setup is still incomplete!
> `docker-compose-prometheus.yml`: update the MySQL connection info configured for grafana
> `prometheus.yml`: configure as needed
```shell
docker-compose -f docker-compose-prometheus.yml -p prometheus up -d
```
1. grafana URL: [`http://<ip>:3000`](www.zhengqingya.com:3000)
Default login account/password: `admin/admin`
2. prometheus URL: [`http://<ip>:9090`](www.zhengqingya.com:9090)
3. exporter URL: [`http://<ip>:9100/metrics`](www.zhengqingya.com:9100/metrics)
#### Other
1. grafana dashboard resources: https://grafana.com/grafana/dashboards
### Zipkin - distributed tracing system
> It helps gather the timing data needed to troubleshoot latency problems in service architectures. Features include collecting and searching this data.
> Note: remember to create the corresponding zipkin database and tables
```shell
docker-compose -f docker-compose-zipkin.yml -p zipkin up -d
```
Startup success log:
```
oo
oooo
oooooo
oooooooo
oooooooooo
oooooooooooo
ooooooo ooooooo
oooooo ooooooo
oooooo ooooooo
oooooo o o oooooo
oooooo oo oo oooooo
ooooooo oooo oooo ooooooo
oooooo ooooo ooooo ooooooo
oooooo oooooo oooooo ooooooo
oooooooo oo oo oooooooo
ooooooooooooo oo oo ooooooooooooo
oooooooooooo oooooooooooo
oooooooo oooooooo
oooo oooo
________ ____ _ _____ _ _
|__ /_ _| _ \| |/ /_ _| \ | |
/ / | || |_) | ' / | || \| |
/ /_ | || __/| . \ | || |\ |
|____|___|_| |_|\_\___|_| \_|
:: version 2.23.2 :: commit 7bf3aab ::
2021-08-30 16:33:27.414 INFO [/] 1 --- [oss-http-*:9411] c.l.a.s.Server : Serving HTTP at /[0:0:0:0:0:0:0:0%0]:9411 - http://127.0.0.1:9411/
```
Web UI URL: [`http://<ip>:9411/zipkin`](www.zhengqingya.com:9411/zipkin)
### Rancher - open-source container management platform
```shell
docker-compose -f docker-compose-rancher.yml -p rancher up -d
```
Access URL: [`http://<ip>:80`](www.zhengqingya.com:80)
### Seata - distributed transactions
> 1. config.txt => https://github.com/seata/seata/blob/develop/script/config-center/config.txt
> 2. nacos-config.sh => https://github.com/seata/seata/blob/develop/script/config-center/nacos/nacos-config.sh
> 3. mysql.sql => https://github.com/seata/seata/blob/develop/script/server/db/mysql.sql
```shell
# Run
docker-compose -f docker-compose-seata.yml -p seata up -d
# View logs
docker exec -it seata_server sh
```
==============================================================================\
======================== ↑↑↑↑↑↑ Environment deployment end ↑↑↑↑↑↑ ================================\
==============================================================================\
+++
date = "2009-05-02T23:50:28+09:00"
draft = false
title = "GW1日目"
categories = ["diary"]
+++
My Golden Week has finally begun.
To knock out the piled-up laundry, I started washing as soon as I saw my wife off. While watching the episodes of Saki and K-On! I had recorded, I ran about five loads and got through everything. The weather was great, so I even did the sheets and the towel blanket. The balcony filled up, leaving me in a situation where I couldn't hang anything new without first taking in whatever had already dried from the morning batch... well, everything dried quickly, so it all worked out nicely.
I accidentally took a short nap, then went shopping around 2pm. Bought beer at the nearby Don Quijote. Watched Niconico, read manga and magazines, and in between quietly scrubbed the mold out of the bathroom and cleaned the toilet. Then my wife called, we went out for yakiniku, and after a bath I watched the Ashley special and cried, and just now learned of Kiyoshiro Imawano's passing. R.I.P.
My TODOs for Golden Week: clean the room, clean the kitchen, sort the manga (sell the unneeded ones at Yahoo Auctions or send them to Book Off), restring the guitar and bass (and redo the soldering on the Killer if I can), and get a haircut. But at this rate I won't leave the house at all... maybe I'll make a long-overdue trip to Ochanomizu and browse the instrument shops.
---
title: |
596. Buy Up the Opportunity - Christ in Song
metadata:
description: |
Christ in Song 596. Buy Up the Opportunity. Buy up the opportunity, O Christian, buy today; For Heaven's ageless mansions buy, Buy treasures while you may. Chorus: Buy up the opportunity, The souls for when Christ died, Buy up the opportunity, Buy for the Crucified.
keywords: |
Christ in Song, adventhymnals, advent hymnals, Buy Up the Opportunity, Buy up the opportunity. Buy up the opportunity,
author: Brian Onang'o
---
#### Advent Hymnals
## 596. BUY UP THE OPPORTUNITY
#### Christ in Song,
```txt
1.
Buy up the opportunity,
O Christian, buy today;
For Heaven's ageless mansions buy,
Buy treasures while you may.
Chorus:
Buy up the opportunity,
The souls for when Christ died,
Buy up the opportunity,
Buy for the Crucified.
2.
Buy up the opportunity,
It may not long remain;
The evil hosts are bidding, too,
Those precious souls to gain. [Chorus]
3.
Buy up the opportunity,
Pay any price to win;
With Heaven's legions watching you,
To falter will be sin. [Chorus]
4.
Buy up the opportunity,
At home; in lands afar;
Go quickly! Find the jewels rare,
Each soul a glowing star. [Chorus]
```
Field | Value |
-------------|------------|
Title | Buy Up the Opportunity |
Key | A♭ Major |
Titles | Buy up the opportunity, |
First Line | Buy up the opportunity |
Author | John R. Clements |
Year | 1908 |
Composer | W. S. Weeden |
Hymnal | - |
Tune | Buy up the opportunity |
Metrical pattern | |
# Stanzas | |
Chorus | |
Chorus Type | |
Subjects | Living His Life: Call to Activity |
Texts | Ephesians 5:16 |
Print Texts | |
Scripture Song | |
cloud_init Cookbook
===================
Deploy cloud_init configuration files from attributes or LWRP.
[](https://community.opscode.com/cookbooks/cloud_init) [](https://travis-ci.org/Numergy/cloud-init-cookbook)
Requirements
------------
#### Platforms
The following platforms and versions are tested and supported using Opscode's test-kitchen.
- `Debian 8`
- `Debian 9`
- `Debian 10`
- `Ubuntu 16.04`
- `Ubuntu 18.04`
- `Ubuntu 20.04`
Attributes
----------
#### cloud_init::default
| Key | Type | Description |
| -------------------- | ---- | ------------------------------------------- |
| `[cloud_init][cfgs]` | Hash | Hash of configuration files (default: `{}`) |
Usage
-----
#### cloud_init::default
Include `cloud_init` in your node's `run_list` and set the `cfgs` attribute to install `cloud-init` package and create configuration files:
```json
{
"name":"my_node",
"run_list": [
"recipe[cloud_init]"
],
"attributes": {
"cloud_init": {
"cfgs": {
"hostname": {
"priority": "90",
"config": {
"preserve_hostname": "true"
}
}
}
}
}
}
```
This setup will create the configuration file `/etc/cloud/cloud.cfg.d/90_hostname.cfg` with following content:
```
preserve_hostname: true
```
#### cloud_init::upgrade
Include `cloud_init::upgrade` in your node's `run_list` to upgrade `cloud-init` package.
```json
{
"name":"my_node",
"run_list": [
"recipe[cloud_init::upgrade]"
]
}
```
#### cloud_init::disable
Include `cloud_init::disable` in your node's `run_list` to disable `cloud-init`.
```json
{
"name":"my_node",
"run_list": [
"recipe[cloud_init::disable]"
]
}
```
#### cloud_init::enable
Include `cloud_init::enable` in your node's `run_list` to enable `cloud-init`.
```json
{
"name":"my_node",
"run_list": [
"recipe[cloud_init::enable]"
]
}
```
#### cloud_init::remove
Include `cloud_init::remove` in your node's `run_list` to remove `cloud-init` package.
```json
{
"name":"my_node",
"run_list": [
"recipe[cloud_init::remove]"
]
}
```
#### cloud_init::purge
Include `cloud_init::purge` in your node's `run_list` to purge `cloud-init` package and remove cloud directories (`/etc/cloud`, `/var/lib/cloud`).
```json
{
"name":"my_node",
"run_list": [
"recipe[cloud_init::purge]"
]
}
```
#### Custom resources
##### cloud_init
This LWRP can be used to manipulate `cloud-init` package.
###### Actions
| Action | Description |
| --------- | ------------------------------------------------- |
| `:install` | Install `cloud-init` |
| `:upgrade` | Upgrade `cloud-init` |
| `:disable` | Disable `cloud-init` |
| `:enable` | Enable `cloud-init` |
| `:remove` | Remove `cloud-init` |
| `:purge` | Purge `cloud-init` and remove configuration files |
###### Attributes
This LWRP doesn't have any attributes; the name attribute can be any value.
###### Examples
Install `cloud-init`:
```ruby
cloud_init 'default'
```
Disable `cloud-init`:
```ruby
cloud_init 'default' do
action :disable
end
```
Purge `cloud-init`:
```ruby
cloud_init 'default' do
action :purge
end
```
##### cloud_init_cfg
This LWRP can be used to deploy `cloud-init` configuration files.
###### Actions
| Action | Description |
| --------- | -------------------------------------- |
| `:create` | Create a cloud-init configuration file |
| `:delete` | Delete a cloud-init configuration file |
###### Attributes
| Attribute | Type | Description |
| ---------- | -------- | ----------------------------------------------------------------------------- |
| `name`     | String   | Name of the configuration file; this is the name attribute                    |
| `priority` | Integer  | Priority of the configuration file; the config filename is prefixed by this value |
| `config` | Hash | Hash configuration, used by the template. |
###### Example
```ruby
cloud_init 'default'
cloud_init_cfg 'hostname' do
priority 90
config preserve_hostname: true
end
```
Testing
-------
See [TESTING.md](TESTING.md)
Contributing
------------
See [CONTRIBUTING.md](CONTRIBUTING.md)
License and Authors
-------------------
Authors: Sliim <sliim@mailoo.org>
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
# Structures and syntax to help with basic scientific calculations in [Unity](http://unity.com/)
Note: List of operations and conversion is not complete, I'm adding them manually one by one depending on what's needed.
**`Issues` and `Pull requests` are welcome.**
```csharp
using BasicScience;
// conversion:
ly alfa_centauri_dist = (pc) 1.34;
deg radian_in_degrees = (rad) 1.0;
s sidereal_day = (h) 23.9344696;
m3ps2 jovian_μ = (kg) 1.898e27;// [m³/s²]
// units deduced from calculation:
var speed = (m) 100 / (s) 3.1;// [m/s]
var heat = (W) 500 * (min) 60;// [J]
var tesla_accel = (mps)(kmph) 96.56 / (s) 2.3;// [m/s²]
```
# Installation
Add this line in `manifest.json` / `dependencies`:
```
"dependencies": {
"com.andrewraphaellukasik.basicscience": "https://github.com/andrew-raphael-lukasik/basic-science.git#upm",
}
```
Or via `Package Manager` / `Add package from git URL`:
```
https://github.com/andrew-raphael-lukasik/basic-science.git#upm
```
# How to register a stored procedure to redis using redis client
First please read this [introduction to eval](https://redis.io/commands/eval#introduction-to-eval) to understand how
eval and evalsha work
## Register a procedure from path
you can register a procedure from a lua script path like this
```python
redis_client.storedprocedure_register(name="test", nrkeys=3, path="path_to_lua_script")
```
This will return the sha.
The stored procedure is stored in the hset `storedprocedures:$name`, which contains a JSON object with:
- script: the lua script
- sha: the sha which can be used later to execute the procedure
- nrkeys: the number of keys that your script requires

There is also another hset, `storedprocedures_sha`, that contains the shas directly; use this if you know the name and want to call evalsha directly without parsing JSON.
## use the registered procedure
1- if you are using the jumpscale redis client:
you can use evalsha to execute the procedure if you know the sha, like this
```python
redisclient.evalsha(sha, 3, "a", "b", "c")  # 3 is the number of keys, followed by the args
```
or you can execute it directly by name, like this
```python
redisclient.storedprocedure_execute(name, *args)
```
2- if you are using the standard redis client:
in this case you will have to use the evalsha command
```bash
redis_cli> evalsha $sha $nr_keys $keys
```
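As a side note (not part of the original how-to): Redis identifies a loaded script by the SHA-1 digest of its source, so you can compute locally what sha `evalsha` will expect. The Lua script below is a made-up example:

```python
import hashlib

# A trivial Lua script; Redis identifies scripts by the SHA-1 of their source.
lua_script = "return redis.call('GET', KEYS[1])"

# This is the same digest SCRIPT LOAD (and thus storedprocedure_register) returns.
sha = hashlib.sha1(lua_script.encode("utf-8")).hexdigest()
print(sha)  # usable as: evalsha <sha> 1 some_key
```

This is handy for checking that the sha stored in `storedprocedures_sha` actually matches the script on disk.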
---
layout: post
title: "A reusable data processing workflow"
description: "How NPR Visuals processed data from the Law Enforcement Support Office."
author: David Eads
email: deads@npr.org
twitter: eads
---
*Correction (September 2, 2014 8:55pm EDT): We originally stated that the script should combine data from multiple American Community Survey population estimates. [This methodology is not valid](https://github.com/ryanpitts/journalists-guide-datasets/blob/master/datasets/american_community_survey.md#comparing-acs-data-from-different-releases). This post and the accompanying source code have been updated accordingly. Thanks to census expert [Ryan Pitts](https://twitter.com/ryanpitts) for catching the mistake. This is why we open source our code!*
The NPR Visuals team was recently tasked with [analysing data from the Pentagon’s program to disperse surplus military gear](http://www.npr.org/2014/09/02/2014/08/22/342494225/mraps-and-bayonets-what-we-know-about-the-pentagons-1033-program) to law enforcement agencies around the country through the Law Enforcement Support Office (LESO), also known as the "1033" program. The project offers a useful case study in creating data processing pipelines for data analysis and reporting.
The source code for the processing scripts discussed in this post is [available on Github](https://github.com/nprapps/leso/). The processed data is available in a [folder on Google Drive](https://drive.google.com/folderview?id=0B03IIavLYTovdWg4NGtzSW9wb2c&usp=sharing).
## Automate everything
There is one rule for data processing: **Automate everything**.
Data processing is fraught with peril. Your initial transformations and data analysis will always have errors and never be as sophisticated as your final analysis. Do you want to hand-categorize a dataset, only to get updated data from your source? Do you want to laboriously add calculations to a spreadsheet, only to find out you misunderstood some crucial aspect of the data? Do you want to arrive at a conclusion and forget how you got there?
No you don’t! Don’t do things by hand, don’t do one-off transformations, don’t make it hard to get back to where you started.
Create processing scripts managed under version control that can be refined and repeated. Whatever extra effort it takes to set up and develop processing scripts, you will be rewarded the second or third or fiftieth time you need to run them.
It might be tempting to change the source data in some way, perhaps to add categories or calculations. If you need to add additional data or make calculations, your scripts should do that for you.
The top-level build script from our recent project shows this clearly, even if you don’t write code:
```bash
#!/bin/bash
echo 'IMPORT DATA'
echo '-----------'
./import.sh
echo 'CREATE SUMMARY FILES'
echo '--------------------'
./summarize.sh
echo 'EXPORT PROCESSED DATA'
echo '---------------------'
./export.sh
```
We separate the process into three scripts: one for importing the data, one for creating summarized versions of the data (useful for charting and analysis) and one that exports full versions of the cleaned data.
## How we processed the LESO data
The data, provided by the Defense Logistics Agency's Law Enforcement Support Office, describes every distribution of military equipment to local law enforcement agencies through the “1033” program since 2006. The data does not specify the agency receiving the equipment, only the county the agency operates in. Every row represents a single instance of a single type of equipment going to a law enforcement agency. The fields in the source data are:
* **State**
* **County**
* **National Supply Number**: a standardized categorization system for equipment
* **Quantity**
* **Units**: A description of the unit to use for the item (e.g. “each” or “square feet”)
* **Acquisition cost**: The per-unit cost of the item when purchased by the military
* **Ship date**: When the item was shipped to a law enforcement agency
## Import
[Import script source](https://github.com/nprapps/leso/blob/master/import.sh)
The process starts with a single Excel file and builds a relational database around it. The Excel file is cleaned and converted into a CSV file and imported into a PostgreSQL database. Then additional data is loaded that help categorize and contextualize the primary dataset.
Here’s the whole workflow:
* [Convert Excel data to CSV with Python](https://github.com/nprapps/leso/blob/master/clean.py).
  * Parse the date field, which represents dates in two different formats
  * Strip out extra spaces from any strings (of which there are many)
  * Split the National Supply Number into two additional fields: The first two digits represent the top level category of the equipment (e.g. “WEAPONS”). The first four digits represent the “federal supply class” (e.g. “Guns, through 30 mm”).
* [Import the CSVs generated from the source data into PostgreSQL](https://github.com/nprapps/leso/blob/master/import.sh#L7:L29).
* [Import a “FIPS crosswalk” CSV into PostgreSQL](https://github.com/nprapps/leso/blob/master/import.sh#L31:L37). This file, provided to us by an NPR reporter, lets us map county name and state to the Federal Information Processing Standard identifier used by the Census Bureau to identify counties.
* [Import a CSV file with Federal Supply Codes into PostgreSQL](https://github.com/nprapps/leso/blob/master/import.sh#L40:L54). Because there are repeated values, this data is de-depulicated after import.
* [Import 5 year county population estimates](https://github.com/nprapps/leso/blob/master/import.sh#L56:L82) from the US Census Bureau’s American Community Survey using the American FactFinder download tool. The file was added to the repository because there is no direct link or API to get the data.
* [Create a PostgreSQL view](https://github.com/nprapps/leso/blob/master/import.sh#L84:L92) that joins the LESO data with census data through the FIPS crosswalk table for convenience.
We also [import a list of all agencies](https://github.com/nprapps/leso/blob/master/import.sh#L94:L106) using [csvkit](https://csvkit.readthedocs.org):
* Use csvkit’s `in2csv` command to extract each sheet
* Use csvkit’s `csvstack` command to combine the sheets and add a grouping column
* Use csvkit’s `csvcut` command to remove a pointless “row number” column
* Import final output into Postgres database
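The cleaning steps described for the conversion script can be sketched in Python. This is an illustrative reconstruction under assumed field names and date formats, not the project's actual clean.py:

```python
from datetime import datetime

def parse_ship_date(value):
    """The source data mixes two date formats; try each in turn (formats assumed)."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError("unrecognized date: %r" % value)

def clean_row(row):
    # Strip stray whitespace from every string field
    row = {key: value.strip() for key, value in row.items()}
    # Derive category fields from the National Supply Number (e.g. "1005-00-073-9421")
    nsn = row["nsn"].replace("-", "")
    row["top_level_category"] = nsn[:2]    # e.g. "10" -> WEAPONS
    row["federal_supply_class"] = nsn[:4]  # e.g. "1005" -> Guns, through 30 mm
    row["ship_date"] = parse_ship_date(row["ship_date"])
    return row
```

In the real script, a function like this would be applied to each row read from the Excel sheet before writing the cleaned CSV.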
## Summarizing
[Summarize script source](https://github.com/nprapps/leso/blob/master/summarize.sh)
Once the data is loaded, we can start playing around with it by running queries. As the queries become well-defined, we add them to a script that exports CSV files summarizing the data. These files are easy to drop into Google spreadsheets or send directly to reporters using Excel.
We won’t go into the gory details of every summary query. Here’s a simple query that demonstrates the basic idea:
```bash
echo "Generate category distribution"
psql leso -c "COPY (
select c.full_name, c.code as federal_supply_class,
sum((d.quantity * d.acquisition_cost)) as total_cost
from data as d
join codes as c on d.federal_supply_class = c.code
group by c.full_name, c.code
order by c.full_name
) to '`pwd`/build/category_distribution.csv' WITH CSV HEADER;"
```
This builds a table that calculates the total acquisition cost for each federal supply class:
<table class="table">
<thead>
<tr>
<th>full_name</th>
<th>federal_supply_code</th>
<th>total_cost</th>
</tr>
</thead>
<tbody>
<tr>
<td>Trucks and Truck Tractors, Wheeled</td>
<td>2320</td>
<td>$405,592,549.59</td>
</tr>
<tr>
<td>Aircraft, Rotary Wing</td>
<td>1520</td>
<td>$281,736,199.00</td>
</tr>
<tr>
<td>Combat, Assault, and Tactical Vehicles, Wheeled</td>
<td>2355</td>
<td>$244,017,665.00</td>
</tr>
<tr>
<td>Night Vision Equipment, Emitted and Reflected Radiation</td>
<td>5855</td>
<td>$124,204,563.34</td>
</tr>
<tr>
<td>Aircraft, Fixed Wing</td>
<td>1510</td>
<td>$58,689,263.00</td>
</tr>
<tr>
<td>Guns, through 30 mm</td>
<td>1005</td>
<td>$34,445,427.45</td>
</tr>
<tr>
<td colspan="3">...</td>
</tr>
</tbody>
</table>
Notice how we use SQL joins to pull in additional data (specifically, the full name field) and aggregate functions to handle calculations. By using a little SQL, we can avoid manipulating the underlying data.
The usefulness of our approach was evident early on in our analysis. At first, we calculated the total cost as `sum(acquisition_cost)`, not accounting for the quantity of items. Because we have a processing script managed with version control, it was easy to catch the problem, fix it and regenerate the tables.
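The impact of that fix is easy to see on a toy file (the numbers are made up; awk stands in for SQL's `sum()` aggregate):

```shell
# Hypothetical order data: quantity and per-unit acquisition cost.
printf 'quantity,acquisition_cost\n10,5.00\n2,100.00\n' > orders.csv

# "wrong" mirrors sum(acquisition_cost); "right" mirrors
# sum(quantity * acquisition_cost).
awk -F, 'NR > 1 { wrong += $2; right += $1 * $2 }
         END { printf "wrong=%g right=%g\n", wrong, right }' orders.csv
```

Here the naive total (105) badly understates the correct total (250) — exactly the kind of error that a re-runnable, version-controlled script makes cheap to catch and fix.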
## Exporting
[Export script source](https://github.com/nprapps/leso/blob/master/import.sh)
Not everybody uses PostgreSQL (or wants to). So our final step is to export cleaned and processed data for public consumption. This big old query merges useful categorical information, county FIPS codes, and pre-calculates the total cost for each equipment order:
```bash
psql leso -c "COPY (
select d.state,
d.county,
f.fips,
d.nsn,
d.item_name,
d.quantity,
d.ui,
d.acquisition_cost,
d.quantity * d.acquisition_cost as total_cost,
d.ship_date,
d.federal_supply_category,
sc.name as federal_supply_category_name,
d.federal_supply_class,
c.full_name as federal_supply_class_name
from data as d
join fips as f on d.state = f.state and d.county = f.county
join codes as c on d.federal_supply_class = c.code
join codes as sc on d.federal_supply_category = sc.code
) to '`pwd`/export/states/all_states.csv' WITH CSV HEADER;"
```
Because we’ve cleanly imported the data, we can re-run this export whenever we need. If we want to revisit the story with a year’s worth of additional data next summer, it won’t be a problem.
## A few additional tips and tricks
**Make your scripts chatty:** Always print to the console at each step of import and processing scripts (e.g. `echo "Merging with census data"`). This makes it easy to track down problems as they crop up and get a sense of which parts of the script are running slowly.
**Use mappings to combine datasets:** As demonstrated above, we make extensive use of files that map fields in one table to fields in another. We use [SQL joins](http://blog.codinghorror.com/a-visual-explanation-of-sql-joins/) to combine the datasets. These features can be hard to understand at first. But once you get the hang of it, they are easy to implement and keep your data clean and simple.
**Work on a subset of the data:** When dealing with huge datasets that could take many hours to process, use a representative sample of the data to test your data processing workflow. For example, use 6 months of data from a multi-year dataset, or pick random samples from the data in a way that ensures the sample data adequately represents the whole.
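One cheap, reproducible way to take such a subset is a deterministic systematic sample — `seq` stands in for the real rows here; in PostgreSQL you might instead filter by a date range, or run `ORDER BY random()` once and save the result:

```shell
# Keep every third "row"; being deterministic, the sample is identical
# on every run, so the processing workflow can be tested repeatably.
seq 1 9 | awk 'NR % 3 == 1'
```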
| 56.841026 | 551 | 0.75009 | eng_Latn | 0.991646 |
fd7305270129f8a7401f33fa963f390675aff99e | 198 | md | Markdown | kubernetes/docs/container_spec_memory_swap_limit_bytes.md | phonglh79/integrations | 1c55d42418cb139434ef5a7c1e12752475ebcd0a | ["Apache-2.0"] | null | null | null | kubernetes/docs/container_spec_memory_swap_limit_bytes.md | phonglh79/integrations | 1c55d42418cb139434ef5a7c1e12752475ebcd0a | ["Apache-2.0"] | 1 | 2020-11-05T20:14:54.000Z | 2020-11-05T20:14:54.000Z | kubernetes/docs/container_spec_memory_swap_limit_bytes.md | phonglh79/integrations | 1c55d42418cb139434ef5a7c1e12752475ebcd0a | ["Apache-2.0"] | 1 | 2017-02-08T09:58:51.000Z | 2017-02-08T09:58:51.000Z |
---
title: container_spec_memory_swap_limit_bytes
brief: Memory swap limit for the container.
metric_type: gauge
---
### container_spec_memory_swap_limit_bytes
Memory swap limit for the container.
| 22 | 45 | 0.818182 | eng_Latn | 0.691708 |
fd73d6cb3c34da8283257c5fa3554c5e530e710d | 1,680 | md | Markdown | source/_posts/west_brom_target_grant_desperate_to_make_the_move_but_huddersfield_refuse_to_lower_17m_price_tag.md | soumyadipdas37/finescoop.github.io | 0346d6175a2c36d4054083c144b7f8364db73f2f | ["MIT"] | null | null | null | source/_posts/west_brom_target_grant_desperate_to_make_the_move_but_huddersfield_refuse_to_lower_17m_price_tag.md | soumyadipdas37/finescoop.github.io | 0346d6175a2c36d4054083c144b7f8364db73f2f | ["MIT"] | null | null | null | source/_posts/west_brom_target_grant_desperate_to_make_the_move_but_huddersfield_refuse_to_lower_17m_price_tag.md | soumyadipdas37/finescoop.github.io | 0346d6175a2c36d4054083c144b7f8364db73f2f | ["MIT"] | 2 | 2021-09-18T12:06:26.000Z | 2021-11-14T15:17:34.000Z |
---
extends: _layouts.post
section: content
image: https://i.dailymail.co.uk/1s/2020/09/09/23/32984718-0-image-a-52_1599691490096.jpg
title: West Brom target Grant desperate to make the move but Huddersfield refuse to lower £17m price tag
description: Albion boss Slaven Bilic has identified Grant as his top target to improve his attack, with the club putting together a deal that could eventually be worth up to £15m for the 22-year-old.
date: 2020-09-10-00-00-48
categories: [latest, sports]
featured: true
---
Karlan Grant has set his sights on a move to West Brom despite Huddersfield holding out for £17million for the striker.
Albion boss Slaven Bilic has identified Grant as his top target to improve his attack, with the club putting together a deal that could eventually be worth up to £15m for the 22-year-old.
Grant is desperate to make the move and is believed to have grown frustrated at Huddersfield's stance, as the Championship club have so far proved unwilling to budge.
West Brom boss Slaven Bilic has identified Karlan Grant as his top target to improve his attack
Bilic wants a new forward in place before Sunday's Premier League opener against Leicester at The Hawthorns following the permanent signing of winger Grady Diangana, who spent last season on loan from West Ham.
Grant has scored 23 goals in 51 starts for Huddersfield and is considered one of the most promising strikers outside the top flight.
Earlier on Wednesday Albion completed the capture of Callum Robinson from Sheffield United after his impressive loan stint during the second half of last season.
Robinson's move sees winger Oliver Burke leave Albion to join the Blades.
| 62.222222 | 210 | 0.79881 | eng_Latn | 0.999109 |
fd75609d2e56b134252a856528dc30a6814cab39 | 130 | md | Markdown | README.md | b1nry/blog-post-api | 0fbbbd657e182b814c7aef80818cd5ee39d353f0 | ["MIT"] | null | null | null | README.md | b1nry/blog-post-api | 0fbbbd657e182b814c7aef80818cd5ee39d353f0 | ["MIT"] | 7 | 2019-10-22T10:52:24.000Z | 2022-02-10T09:33:46.000Z | README.md | b1nry/blog-post-api | 0fbbbd657e182b814c7aef80818cd5ee39d353f0 | ["MIT"] | null | null | null |
# Blog API
This repository contains the backend API of a blog.
By using this backend, you can integrate any UI to display your blog.
| 18.571429 | 65 | 0.769231 | eng_Latn | 0.996358 |
fd757445e26b0c5e175fa05f8db50da5e0713b3a | 5,175 | md | Markdown | _posts/2019-01-25-kbl-result-20190125.md | jslk7/sponomics88.github.io | cc4237a00aee9acec7e6677da5131d8bc0c773ce | ["MIT"] | null | null | null | _posts/2019-01-25-kbl-result-20190125.md | jslk7/sponomics88.github.io | cc4237a00aee9acec7e6677da5131d8bc0c773ce | ["MIT"] | 1 | 2021-03-30T02:16:17.000Z | 2021-03-30T02:16:17.000Z | _posts/2019-01-25-kbl-result-20190125.md | jslk7/sponomics88.github.io | cc4237a00aee9acec7e6677da5131d8bc0c773ce | ["MIT"] | 1 | 2019-10-27T06:11:50.000Z | 2019-10-27T06:11:50.000Z |
---
layout: post
title: 2019-01-25(금) 국내 농구 경기 결과
date: 2019-01-25 09:00:00 +0900
categories: KBL
---
#### <span style="color:red"> 서울삼성(홈) VS 전주KCC(원정) </span>

<style type="text/css">
.tg {border-collapse:collapse;border-spacing:0;}
.tg td{font-family:Arial, sans-serif;font-size:14px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#c0c0c0;}
.tg th{font-family:Arial, sans-serif;font-size:14px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#c0c0c0;}
.tg .tg-dcpn{background-color:#ffffff;border-color:#c0c0c0;text-align:center;vertical-align:middle}
.tg .tg-txr3{background-color:#ffffff;border-color:#c0c0c0;text-align:center;vertical-align:middle}
.tg .tg-o8le{background-color:#efefef;border-color:#c0c0c0;text-align:center;vertical-align:middle}
.tg .tg-rr9t{font-weight:bold;background-color:#efefef;border-color:#c0c0c0;text-align:center;vertical-align:middle}
.tg .tg-wazi{background-color:#efefef;border-color:#c0c0c0;text-align:center;vertical-align:middle}
</style>
<table class="tg">
<tr>
<th class="tg-rr9t">서울삼성</th>
<th class="tg-rr9t">팀</th>
<th class="tg-rr9t">전주KCC</th>
</tr>
<tr>
<td class="tg-dcpn">1승 3패</td>
<td class="tg-rr9t">시즌 상대전적</td>
<td class="tg-dcpn">3승 1패</td>
</tr>
<tr>
<td class="tg-dcpn">84</td>
<td class="tg-rr9t">점수</td>
<td class="tg-dcpn">91</td>
</tr>
<tr>
<td class="tg-dcpn">27/55(49%)</td>
<td class="tg-rr9t">2점(%)</td>
<td class="tg-dcpn">27/47(57%)</td>
</tr>
<tr>
<td class="tg-dcpn">4/18(22%)</td>
<td class="tg-rr9t">3점(%)</td>
<td class="tg-dcpn">8/18(44%)</td>
</tr>
<tr>
<td class="tg-dcpn">18/25(72%)</td>
<td class="tg-rr9t">자유투(%)</td>
<td class="tg-dcpn">13/24(54%)</td>
</tr>
<tr>
<td class="tg-dcpn">41</td>
<td class="tg-rr9t">리바운드</td>
<td class="tg-dcpn">37</td>
</tr>
<tr>
<td class="tg-dcpn">14</td>
<td class="tg-rr9t">어시스트</td>
<td class="tg-dcpn">20</td>
</tr>
<tr>
<td class="tg-dcpn">9</td>
<td class="tg-rr9t">스틸</td>
<td class="tg-dcpn">9</td>
</tr>
<tr>
<td class="tg-dcpn">3</td>
<td class="tg-rr9t">블록</td>
<td class="tg-dcpn">1</td>
</tr>
<tr>
<td class="tg-dcpn">10</td>
<td class="tg-rr9t">턴오버</td>
<td class="tg-dcpn">15</td>
</tr>
<tr>
<td class="tg-dcpn">문태영(18)<br>유진 펠프스(35)</td>
<td class="tg-rr9t">주요 득점선수</td>
<td class="tg-dcpn">송교창(17)<br>이정현(21)<br>브랜든 브라운(25)</td>
</tr>
</table>
#### <span style="color:red"> 서울SK(홈) VS 창원LG(원정) </span>

<table class="tg">
<tr>
<th class="tg-rr9t">서울SK</th>
<th class="tg-rr9t">팀</th>
<th class="tg-rr9t">창원LG</th>
</tr>
<tr>
<td class="tg-dcpn">1승 3패</td>
<td class="tg-rr9t">시즌 상대전적</td>
<td class="tg-dcpn">3승 1패</td>
</tr>
<tr>
<td class="tg-dcpn">76</td>
<td class="tg-rr9t">점수</td>
<td class="tg-dcpn">86</td>
</tr>
<tr>
<td class="tg-dcpn">21/48(44%)</td>
<td class="tg-rr9t">2점(%)</td>
<td class="tg-dcpn">21/48(44%)</td>
</tr>
<tr>
<td class="tg-dcpn">7/19(37%)</td>
<td class="tg-rr9t">3점(%)</td>
<td class="tg-dcpn">11/25(44%)</td>
</tr>
<tr>
<td class="tg-dcpn">13/17(76%)</td>
<td class="tg-rr9t">자유투(%)</td>
<td class="tg-dcpn">11/16(69%)</td>
</tr>
<tr>
<td class="tg-dcpn">28</td>
<td class="tg-rr9t">리바운드</td>
<td class="tg-dcpn">43</td>
</tr>
<tr>
<td class="tg-dcpn">19</td>
<td class="tg-rr9t">어시스트</td>
<td class="tg-dcpn">19</td>
</tr>
<tr>
<td class="tg-dcpn">9</td>
<td class="tg-rr9t">스틸</td>
<td class="tg-dcpn">7</td>
</tr>
<tr>
<td class="tg-dcpn">4</td>
<td class="tg-rr9t">블록</td>
<td class="tg-dcpn">5</td>
</tr>
<tr>
<td class="tg-dcpn">10</td>
<td class="tg-rr9t">턴오버</td>
<td class="tg-dcpn">13</td>
</tr>
<tr>
<td class="tg-dcpn">애런 헤인즈(29)</td>
<td class="tg-rr9t">주요 득점선수</td>
<td class="tg-dcpn">조쉬 그레이(16)<br>제임스 메이스(26)</td>
</tr>
</table>
| 32.142857 | 180 | 0.617971 | kor_Hang | 0.21774 |
fd7636a52b9d5ce2b6a884b0b0809a39b964824f | 382 | md | Markdown | _publishers/hawaii.md | buddhist-uni/buddhist-uni.github.io | 3d7a6d47e20bff3a4ddf92f9b4e52187678124ba | ["MIT"] | 12 | 2020-09-01T11:52:17.000Z | 2022-03-17T17:55:39.000Z | _publishers/hawaii.md | buddhist-uni/buddhist-uni.github.io | 3d7a6d47e20bff3a4ddf92f9b4e52187678124ba | ["MIT"] | 26 | 2020-03-03T10:39:46.000Z | 2022-03-24T03:53:28.000Z | _publishers/hawaii.md | buddhist-uni/buddhist-uni.github.io | 3d7a6d47e20bff3a4ddf92f9b4e52187678124ba | ["MIT"] | 3 | 2020-03-02T20:08:36.000Z | 2022-01-01T15:50:06.000Z |
---
title: "University of Hawai'i Press"
address: "Honolulu"
external_url: "https://uhpress.hawaii.edu/"
---
One of the premier publishers of Asian, American, and Pacific Studies monographs in the English language, the UH Press keeps over 3,000 titles in print and digitizes and opens (!) [key titles in its out-of-print collection](https://www.hawaiiopen.org/?post_type=product).
| 47.75 | 271 | 0.759162 | eng_Latn | 0.893509 |
fd76d1c35b751aae23450415f34e6a1343f587f2 | 189 | md | Markdown | README.md | KeevenOliveira/Bootstrap-publication | b533a339c0e76d8263cb5d4f9ef9180c550f40b6 | ["MIT"] | null | null | null | README.md | KeevenOliveira/Bootstrap-publication | b533a339c0e76d8263cb5d4f9ef9180c550f40b6 | ["MIT"] | null | null | null | README.md | KeevenOliveira/Bootstrap-publication | b533a339c0e76d8263cb5d4f9ef9180c550f40b6 | ["MIT"] | null | null | null |
# Bootstrap Page😆
This page was created to demonstrate a bit of Bootstrap's responsiveness.✌
✔To access it, click [here](https://keevenoliveira.github.io/Bootstrap-publication/).
| 37.8 | 88 | 0.78836 | por_Latn | 0.996709 |
fd76ddabd3855811ffe9c2b4926cc95291ea3bb2 | 728 | md | Markdown | README.md | bramnyadewi007/arduino-projects | 94ab9c83c8781a7197a715f5172978d515a21749 | ["MIT"] | null | null | null | README.md | bramnyadewi007/arduino-projects | 94ab9c83c8781a7197a715f5172978d515a21749 | ["MIT"] | null | null | null | README.md | bramnyadewi007/arduino-projects | 94ab9c83c8781a7197a715f5172978d515a21749 | ["MIT"] | 1 | 2021-03-02T03:19:38.000Z | 2021-03-02T03:19:38.000Z |
Arduino Libraries
=================
This distribution contains a bunch of libraries and example applications
that I have made for Arduino, covering a variety of tasks from blinking LEDs
to LCDs and RTC-based alarm clocks. They are distributed under the
terms of the MIT license.
The [documentation](http://rweather.github.com/arduino-projects/index.html)
contains more information on the libraries and examples.
If you are looking for my cryptography libraries for Arduino, then they
can be found [here](https://github.com/rweather/arduinolibs).
For more information on these libraries, to report bugs, or to suggest
improvements, please contact the author Rhys Weatherley via
[email](mailto:rhys.weatherley@gmail.com).
| 38.315789 | 77 | 0.784341 | eng_Latn | 0.995413 |
fd76fb9a358883b0678646f2182244f73839d81a | 2,087 | md | Markdown | README.md | jeff-99/access2csv | c64805e4c879789205249bb4bbf5dfa63040f4ab | ["MIT"] | null | null | null | README.md | jeff-99/access2csv | c64805e4c879789205249bb4bbf5dfa63040f4ab | ["MIT"] | null | null | null | README.md | jeff-99/access2csv | c64805e4c879789205249bb4bbf5dfa63040f4ab | ["MIT"] | null | null | null |
# access2csv
Simple program to extract data from Access databases into CSV files.
## Features
* view the schema of the database
* export all tables to csv files named after the table
* export one table
## Examples
Dumping a schema:
$ ./access2csv myfile.accdb --schema
CREATE TABLE Test(
Id INT,
Name TEXT,
)
CREATE TABLE Test2(
Id INT,
Name TEXT
)
Exporting all tables:
$ ./access2csv myfile.accdb
Exporting 'Test' to /home/ryepup/Test.csv
2 rows exported
Exporting 'Test2' to /home/ryepup/Test2.csv
100000 rows exported
Export one table:
$ ./access2csv myfile.accdb Test
1,"foo"
2,"bar"
## Installation
Binaries are available at
https://github.com/AccelerationNet/access2csv/releases; download a jar
file from there, then use it as shown above.
### Compile from source
$ git clone https://github.com/AccelerationNet/access2csv.git
$ cd access2csv
$ mvn clean install
Now you should have an `access2csv.jar` in the target directory, ready to go.
Note (December 2017): things have changed a little. If nothing else works, then (after compiling with `mvn clean install`) try running something
like the following (an example Windows batch invocation) from the root of the repository, replacing the path\to\file:
<pre> ".\target\appassembler\bin\access2csv.bat" --input ".\path\to\file" --output . --write-null NULL --quote-all false --schema --with-header </pre>
### Docker build
To run with Docker, first build the image; then it can be run with the following commands:
```sh
docker build -t access2csv .
```
example run:
```sh
docker run --rm -v /dir/with/accessfile:/input:rw access2csv --input /input/database.accdb --output /input/output/ --with-header
```
## Dependencies
* [Jackcess](http://jackcess.sourceforge.net/) - a pure Java library
for reading from and writing to MS Access databases
* [opencsv](http://opencsv.sourceforge.net/) - CSV library
## Contributing
Use https://github.com/AccelerationNet/access2csv to open issues or
pull requests.
| 26.0875 | 151 | 0.700048 | eng_Latn | 0.746712 |
fd776cdbc82a8328e0a364bf5df3078873206f28 | 196 | md | Markdown | README.md | ssuljic/erd-designer | 6f76bd49fdf0fd1408360e2c2ec8b6fdd0d5057d | ["MIT"] | 15 | 2015-01-05T22:22:55.000Z | 2021-04-04T09:28:00.000Z | README.md | ssuljic/erd-designer | 6f76bd49fdf0fd1408360e2c2ec8b6fdd0d5057d | ["MIT"] | null | null | null | README.md | ssuljic/erd-designer | 6f76bd49fdf0fd1408360e2c2ec8b6fdd0d5057d | ["MIT"] | 6 | 2015-01-05T22:24:34.000Z | 2020-07-14T08:10:59.000Z |
erd-designer
============
JavaScript application for creating ER diagrams and generating DDL SQL scripts for different DBMSs
Setup
============
```
bundle install
bundle exec ruby app.rb
```
| 15.076923 | 100 | 0.678571 | eng_Latn | 0.802268 |
fd77aa06fb4c0000a263ff6584bd964111a0b38d | 1,400 | md | Markdown | docs/web/mandarine/mandarine-data/data-repositories.md | martincrb/mandarinets | 200f35f09b86aa8451674cf31a233b2135d7bd22 | ["MIT"] | null | null | null | docs/web/mandarine/mandarine-data/data-repositories.md | martincrb/mandarinets | 200f35f09b86aa8451674cf31a233b2135d7bd22 | ["MIT"] | null | null | null | docs/web/mandarine/mandarine-data/data-repositories.md | martincrb/mandarinets | 200f35f09b86aa8451674cf31a233b2135d7bd22 | ["MIT"] | 1 | 2021-09-05T15:25:59.000Z | 2021-09-05T15:25:59.000Z |
# Repositories
Mandarine's built-in ORM offers the concept & usability of repositories. Repositories serve as a bridge between your database & your code: they execute tasks such as bringing information from the database into your code, deleting & saving records, and more. In a nutshell, repositories allow you to execute queries programmatically, meaning you do not necessarily have to write the SQL query yourself.
----
## Declaring a repository
Repositories are declared by using the `@Repository()` decorator. The `@Repository` decorator targets an **abstract class**.
The abstract class that represents your repository must extend [MandarineRepository](https://doc.deno.land/https/raw.githubusercontent.com/mandarineorg/mandarinets/master/orm-core/repository/mandarineRepository.ts#MandarineRepository). MandarineRepository is a generic class that takes `T` as your model's type.
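The generic-abstract-class pattern that MandarineRepository relies on can be sketched in isolation — the snippet below is *not* Mandarine's API; `BaseRepository` and `MyModel` are hypothetical stand-ins that only illustrate why a subclass passes its model's constructor through `super`:

```typescript
// Hypothetical stand-ins, for illustration only.
class MyModel {
  constructor(public id: number, public name: string) {}
}

abstract class BaseRepository<T> {
  // The base class receives the model's constructor so it can work
  // with the concrete type at runtime (reflection, instantiation, ...).
  constructor(private modelType: new (...args: any[]) => T) {}

  modelName(): string {
    return this.modelType.name;
  }
}

class MyModelRepository extends BaseRepository<MyModel> {
  constructor() {
    super(MyModel); // the model's constructor flows to the generic base
  }
}

console.log(new MyModelRepository().modelName()); // prints "MyModel"
```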
**Syntax:**
```typescript
@Repository()
```
Example
```typescript
import { Repository, MandarineRepository } from "https://x.nest.land/MandarineTS@1.5.0/mod.ts";
@Repository()
abstract class MyRepository extends MandarineRepository<MyModel> {
constructor() {
super(MyModel);
}
}
```
> **Note** that all repositories must have a constructor with a super call. This super call will take your model's instance as shown above.
```typescript
constructor() {
super(MyModel);
}
```
| 37.837838 | 386 | 0.76 | eng_Latn | 0.96602 |
fd77b14bb68e9a41d5ed284e8a98948006455dc2 | 1,434 | md | Markdown | docs/2014/relational-databases/errors-events/mssqlserver-5515-database-engine-error.md | dirceuresende/sql-docs.pt-br | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | 2 | 2021-10-12T00:50:30.000Z | 2021-10-12T00:53:51.000Z | docs/2014/relational-databases/errors-events/mssqlserver-5515-database-engine-error.md | dirceuresende/sql-docs.pt-br | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | null | null | null | docs/2014/relational-databases/errors-events/mssqlserver-5515-database-engine-error.md | dirceuresende/sql-docs.pt-br | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | 1 | 2020-06-25T13:33:56.000Z | 2020-06-25T13:33:56.000Z |
---
title: MSSQLSERVER_5515 | Microsoft Docs
ms.custom: ''
ms.date: 03/06/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.suite: ''
ms.technology: supportability
ms.tgt_pltfrm: ''
ms.topic: conceptual
helpviewer_keywords:
- 5515 (Database Engine error)
ms.assetid: ccd793bc-ba5d-4782-8d72-731fd01fc177
caps.latest.revision: 12
author: MashaMSFT
ms.author: mathoma
manager: craigg
ms.openlocfilehash: 4c010a533e6f5f8326ef1fa218af535494a423c5
ms.sourcegitcommit: f8ce92a2f935616339965d140e00298b1f8355d7
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 07/03/2018
ms.locfileid: "37419515"
---
# <a name="mssqlserver5515"></a>MSSQLSERVER_5515
## <a name="details"></a>Details
|||
|-|-|
|Product name|MSSQLSERVER|
|Event ID|5515|
|Event source|MSSQLSERVER|
|Component|SQLEngine|
|Symbolic name|FS_OPEN_CONTAINER_FAILED|
|Message text|The FILESTREAM file's container directory ''%.*ls'' cannot be opened. The operating system returned Windows status code 0x%x.|
## <a name="explanation"></a>Explanation
The container directory specified for the FILESTREAM file could not be opened.
## <a name="user-action"></a>User action
To determine the cause of the error, check the specific Windows status code. For more information, see the [Events and Errors Message Center](http://go.microsoft.com/fwlink/?linkid=47660).
| 31.866667 | 204 | 0.752441 | por_Latn | 0.636069 |
fd782264f545ffb13c5a14a7b438785bdd96d6a3 | 3,478 | md | Markdown | website/docs/recipes/live-queries.md | das-buro-am-draht/graphql-mesh | ce47024c14790a5c1aef149bd984b418b1af84fc | ["MIT"] | null | null | null | website/docs/recipes/live-queries.md | das-buro-am-draht/graphql-mesh | ce47024c14790a5c1aef149bd984b418b1af84fc | ["MIT"] | null | null | null | website/docs/recipes/live-queries.md | das-buro-am-draht/graphql-mesh | ce47024c14790a5c1aef149bd984b418b1af84fc | ["MIT"] | null | null | null |
---
id: live-queries
title: Live Queries
sidebar_label: Live Queries
---
<img src="https://raw.githubusercontent.com/n1ru4l/graphql-live-query/main/assets/logo.svg" width="300" alt="GraphQL Live Query" style={{ margin: '0 auto' }} />
The GraphQL Live Query implementation from [Laurin Quast](https://github.com/n1ru4l) can be used in GraphQL Mesh with a few additions to the configuration.
### Basic Usage
Let's say you have a `Query` root field that returns all `Todo` entities from your data source like below.
```graphql
query getTodos {
todos {
id
content
}
}
```
And you want to update this operation result automatically without manual refresh when `Mutation.addTodo` is called.
The only thing you need to do is add the following to your existing configuration.
```yml
additionalTypeDefs: |
directive @live on QUERY
liveQueryInvalidations:
- field: Mutation.addTodo
invalidate:
- Query.todos
```
Then you can send a live query with the `@live` directive.
```graphql
query getTodos @live {
todos {
id
content
}
}
```
This will start a real-time connection between the server and your client; the response of `todos` will then get updated whenever `addTodo` is called.
### ID Based Invalidation
Let's say you have the following query, which returns a specific `Todo` entity based on its `id` field:
```graphql
query getTodo($id: ID!) {
todo(id: $id) {
id
content
}
}
```
And suppose you update this entity with the `editTodo` mutation field on your backend, and you want to invalidate this entity specifically instead of invalidating all `todo` queries:
```yml
liveQueryInvalidations:
- field: Mutation.editTodo
invalidate:
- Todo:{args.id}
```
In a case where the field resolver resolves null but might resolve to an object type later (e.g., because the visibility got updated), the field that uses a specific id argument can be invalidated in the following way:
```yml
liveQueryInvalidations:
- field: Mutation.editTodo
invalidate:
- Query.todo(id:"{args.id}")
```
### Programmatic Usage
`liveQueryStore` is available in the GraphQL context, so you can access it in resolvers-composition functions that wrap existing resolvers, or in additional resolvers.
See [Resolvers Composition](/docs/transforms/resolvers-composition).
```yml
transforms:
- resolversComposition:
- resolver: Mutation.editTodo
composer: invalidate-todo#invalidateTodo
```
And in this code file:
```js
module.exports = {
invalidateTodo: next => async (root, args, context, info) => {
const result = await next(root, args, context, info);
context.liveQueryStore.invalidate(`Todo:${args.id}`);
return result;
}
}
```
> You can learn more about [GraphQL Live Query](https://github.com/n1ru4l/graphql-live-query) in its documentation.
> You can check out our example that uses live queries below.
<iframe
src="https://codesandbox.io/embed/github/Urigo/graphql-mesh/tree/master/examples/json-schema-subscriptions?fontsize=14&hidenavigation=1&theme=dark&module=%2F.meshrc.yml"
style={{width:"100%", height:"500px", border:"0", borderRadius: "4px", overflow:"hidden"}}
title="json-schema-subscriptions"
allow="geolocation; microphone; camera; midi; vr; accelerometer; gyroscope; payment; ambient-light-sensor; encrypted-media; usb"
sandbox="allow-modals allow-forms allow-popups allow-scripts allow-same-origin" />
| 29.726496 | 214 | 0.711616 | eng_Latn | 0.912562 |
fd78e0feed705c579bb4b39e74f3c94ec6e5026d | 43 | md | Markdown | Ticket/v1.0.58.md | gitpad-pl/github | ff2ef558044cd711c0d777fb3a45c4241c264ac6 | ["MIT"] | null | null | null | Ticket/v1.0.58.md | gitpad-pl/github | ff2ef558044cd711c0d777fb3a45c4241c264ac6 | ["MIT"] | null | null | null | Ticket/v1.0.58.md | gitpad-pl/github | ff2ef558044cd711c0d777fb3a45c4241c264ac6 | ["MIT"] | null | null | null |
New Version of System v1.0.58
array docs
| 14.333333 | 30 | 0.744186 | kor_Hang | 0.53395 |
fd799759bbb950b6b01fced97cf2d212df94515b | 3,982 | md | Markdown | articles/marketplace/standard-contract.md | xinity/azure-docs.fr-fr | 63489b6c56a429435a3ede6a3bf86f75eedbc9a2 | ["CC-BY-4.0", "MIT"] | null | null | null | articles/marketplace/standard-contract.md | xinity/azure-docs.fr-fr | 63489b6c56a429435a3ede6a3bf86f75eedbc9a2 | ["CC-BY-4.0", "MIT"] | null | null | null | articles/marketplace/standard-contract.md | xinity/azure-docs.fr-fr | 63489b6c56a429435a3ede6a3bf86f75eedbc9a2 | ["CC-BY-4.0", "MIT"] | null | null | null |
---
title: Contrat Standard | Azure
description: Contrat Standard dans la Place de marché Azure et AppSource
services: Azure, Marketplace, Compute, Storage, Networking
author: qianw211
ms.service: marketplace
ms.topic: article
ms.date: 07/05/2019
ms.author: ellacroi
ms.openlocfilehash: 80c157423572d356026f257e81d52650ce01d3e8
ms.sourcegitcommit: 6a42dd4b746f3e6de69f7ad0107cc7ad654e39ae
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 07/07/2019
ms.locfileid: "67620364"
---
# <a name="standard-contract"></a>Standard Contract
To simplify the procurement process for customers and reduce legal complexity for software vendors, Microsoft offers a Standard Contract template to facilitate transactions on the marketplace. Rather than drafting custom terms and conditions, Azure Marketplace publishers can choose to offer their software under the Standard Contract, which customers need to accept only once. The Standard Contract is available here: [https://go.microsoft.com/fwlink/?linkid=2041178](https://go.microsoft.com/fwlink/?linkid=2041178).
The terms and conditions of offers are defined on the Marketplace tab when you create an offer in the Cloud Partner Portal. The Standard Contract option is enabled by setting the parameter to Yes.

>[!Note]
>If you choose to use the Standard Contract, additional terms and conditions are still required for the [CSP channel](./cloud-solution-providers.md).
## <a name="standard-contract-amendments"></a>Standard Contract | Amendments
Standard Contract amendments allow publishers to select the Standard Contract for simplicity, while keeping terms customized for their product or business. Customers are asked to review the amendments to the contract and then accept Microsoft's Standard Contract.
Two types of amendments are offered to Azure Marketplace publishers:
* Universal amendments: These amendments are applied universally to the Standard Contract for all customers. Universal amendments are displayed to every customer of the product in the purchase flow.

* Custom amendments: Azure Marketplace also offers custom amendments aimed at specific tenants. These are special amendments to the Standard Contract that target only certain customers. Publishers can choose the tenant they want to target. Customers in that tenant purchase the product under the Standard Contract terms and the targeted amendments.

>[!Note]
>Customers targeted by custom amendments also receive the universal amendment to the standard terms at purchase time.
>[!Note]
>The following offer types support Standard Contract amendments: Azure applications (solution templates and managed applications), virtual machines, containers, container apps.
### <a name="customer-experience"></a>Customer experience
During the purchase process in the Azure portal, customers can see the contract terms associated with the product as the Microsoft Standard Contract, along with the amendments.

### <a name="api"></a>API
Customers can use `Get-AzureRmMarketplaceTerms` to retrieve an offer's terms and accept them. The Standard Contract and its related amendments are returned in the cmdlet output.
---
| 67.491525 | 617 | 0.815168 | fra_Latn | 0.944509 |
fd79e2c00bf9ab7efb74bd6f9e152acd3c0880d6 | 1,149 | md | Markdown | README.md | visar/duo_api_perl | 3910be0dd49bb600cd01e08514f67b958b5d75a3 | [
"BSD-3-Clause"
] | 2 | 2019-05-23T20:37:36.000Z | 2019-05-24T14:06:35.000Z | README.md | visar/duo_api_perl | 3910be0dd49bb600cd01e08514f67b958b5d75a3 | [
"BSD-3-Clause"
] | null | null | null | README.md | visar/duo_api_perl | 3910be0dd49bb600cd01e08514f67b958b5d75a3 | [
"BSD-3-Clause"
] | null | null | null |
# Overview
**duo_api_perl** - Demonstration client to call Duo API methods
with Perl.
# Duo Verify API
The Duo Verify API provides the ability to call or text (SMS) a phone
number with a one-time PIN number.
For more information see the Duo Verify API guide:
<http://www.duosecurity.com/docs/duoverify>
# Duo Auth API
The Duo Auth API provides a low-level API for adding strong two-factor
authentication to applications that cannot directly display rich web
content.
For more information see the Duo Auth API guide:
<http://www.duosecurity.com/docs/authapi>
# Duo Admin API
The Duo Admin API provides programmatic access to the administrative
functionality of Duo Security's two-factor authentication platform.
This feature is not available with all Duo accounts.
For more information see the Duo Admin API guide:
<http://www.duosecurity.com/docs/adminapi>
# Duo Accounts API
The Duo Accounts API allows a parent account to create, manage, and
delete other Duo customer accounts. This feature is not available with
all Duo accounts.
For more information see the Duo Accounts API guide:
<http://www.duosecurity.com/docs/accountsapi>
| 26.113636 | 70 | 0.785901 | eng_Latn | 0.890245 |
fd7b1c0d1613905ebfa2845218573f67d93f06f5 | 216 | md | Markdown | exercicios/ex017/README.md | oscarlojr/logica_com_javascript | 176ec64585e6aaa71059049d84da18dc00f3b04e | [
"MIT"
] | null | null | null | exercicios/ex017/README.md | oscarlojr/logica_com_javascript | 176ec64585e6aaa71059049d84da18dc00f3b04e | [
"MIT"
] | null | null | null | exercicios/ex017/README.md | oscarlojr/logica_com_javascript | 176ec64585e6aaa71059049d84da18dc00f3b04e | [
"MIT"
] | null | null | null |
# Exercise with Javascript and Canvas
<p align="center">
:fire: Resolution below :nerd_face:
</p>
<h1 align="center">
<img alt="trocaBandeira" title="#trocaBandeira" src="./img/trocaBandeira.gif" />
</h1>
| 24 | 82 | 0.694444 | eng_Latn | 0.414915
fd7c5906cae583851973b357f6e7bb7e06c71a6d | 1,931 | md | Markdown | _posts/2009-06-15-all-change-please.md | DougAGault/dougagault.github.io | 4129e6b9e8230aed6dc98284b45a3cb0ba630a14 | [
"MIT"
] | null | null | null | _posts/2009-06-15-all-change-please.md | DougAGault/dougagault.github.io | 4129e6b9e8230aed6dc98284b45a3cb0ba630a14 | [
"MIT"
] | null | null | null | _posts/2009-06-15-all-change-please.md | DougAGault/dougagault.github.io | 4129e6b9e8230aed6dc98284b45a3cb0ba630a14 | [
"MIT"
] | null | null | null |
---
id: 34
title: 'All Change Please…'
date: 2009-06-15T22:12:00-05:00
author: Doug Gault
layout: article
guid: http://douggault.com/uncategorized/all-change-please/
permalink: /all-change-please/
blogger_blog:
- douggault.blogspot.com
blogger_author:
- Doug Gault
blogger_permalink:
- /2009/06/all-change-please.html
blogger_internal:
- /feeds/6363924398907149830/posts/default/3638496037727066892
categories:
- PERSONAL
tags:
- Career
- Personal
---
As I mentioned in my [last post](http://douggault.blogspot.com/2009/06/as-many-of-you-know-odtug-kaleidoscope.html), on June 1, 2009 I joined the team at [Sumner Technologies](http://sumnertechnologies.com/apex/f?p=10000:1:0). Since then, Scott Spendolini and I have been working on Sumner’s portfolio to broaden the scope of what we’re offering.
New items include [SumnerPrint](http://sumnertechnologies.com/apex/f?p=10000:1010:0::NO:::), a PL/SQL-based solution that allows you to easily print APEX reports to PDF, HTML and XLS, and [SumnerFramework](http://sumnertechnologies.com/apex/f?p=10000:1020:0::NO:::), an application security and management framework for APEX. We’re also working on several new courses and service offerings.
As part of the inevitable website changes, we’ve introduced a new [company blog](http://sumnertechnologies.com/apex/f?p=10000:400:0::NO:::). We’ll be using this blog to provide information about products, services and training classes that we provide as well as unique technical tips and tricks on APEX and other Oracle-related technologies.
You can subscribe to the SumnertechBlog at [http://feeds2.feedburner.com/SumnertechBlog.](http://feeds2.feedburner.com/SumnertechBlog)
I’ll still be updating my blog, but it will become much more personal, and much less about Oracle and APEX.
And don’t forget to come visit us at [ODTUG KALAIDOSOPE](http://odtugkaleidoscope.com/)!
| 56.794118 | 396 | 0.771621 | eng_Latn | 0.823243 |
fd7d563e1c1b7c78e2a50c105c8d3817a0dcc5f0 | 18,398 | md | Markdown | articles/iot-hub/iot-hub-live-data-visualization-in-web-apps.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/iot-hub/iot-hub-live-data-visualization-in-web-apps.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/iot-hub/iot-hub-live-data-visualization-in-web-apps.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null |
---
title: A IoT hub-adatai valós idejű adatvizualizációja egy webalkalmazásban
description: Egy webalkalmazás használatával megjelenítheti az érzékelőből összegyűjtött hőmérséklet-és páratartalom-adatokat, és elküldheti azokat az IOT hubhoz.
author: robinsh
ms.service: iot-hub
services: iot-hub
ms.topic: conceptual
ms.tgt_pltfrm: arduino
ms.date: 05/31/2019
ms.author: robinsh
ms.custom:
- 'Role: Cloud Development'
- 'Role: Data Analytics'
- devx-track-azurecli
ms.openlocfilehash: 7753c6c118d763163e6bc8f69f5b4eee13fe2393
ms.sourcegitcommit: d2d1c90ec5218b93abb80b8f3ed49dcf4327f7f4
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 12/16/2020
ms.locfileid: "97588794"
---
# <a name="visualize-real-time-sensor-data-from-your-azure-iot-hub-in-a-web-application"></a>Valós idejű érzékelők adatainak megjelenítése az Azure IoT hub-ban egy webalkalmazásban

[!INCLUDE [iot-hub-get-started-note](../../includes/iot-hub-get-started-note.md)]
## <a name="what-you-learn"></a>Ismertetett témák
Ebből az oktatóanyagból megtudhatja, hogyan jelenítheti meg a valós idejű érzékelők adatait, amelyeket az IoT hub a helyi számítógépen futó node.js webalkalmazáshoz kap. A webalkalmazás helyi futtatása után igény szerint követheti a webalkalmazás futtatásának lépéseit Azure App Serviceban. Ha Power BI használatával szeretné megjeleníteni az IoT hub adatait, tekintse meg a [Power bi használata a valós idejű érzékelők adatainak megjelenítéséhez az Azure IoT hub](iot-hub-live-data-visualization-in-power-bi.md).
## <a name="what-you-do"></a>Teendők
* Vegyen fel egy fogyasztói csoportot az IoT hubhoz, amelyet a webalkalmazás az érzékelő adatai olvasásához használ
* A webalkalmazás kódjának letöltése a GitHubról
* A webalkalmazás kódjának vizsgálata
* Környezeti változók konfigurálása a webalkalmazáshoz szükséges IoT Hub-összetevők tárolásához
* A webalkalmazás futtatása a fejlesztői gépen
* Nyisson meg egy weblapot az IoT hub valós idejű hőmérséklet-és páratartalom-adatainak megjelenítéséhez
* Választható A webalkalmazás üzemeltetése az Azure CLI használatával Azure App Service
## <a name="what-you-need"></a>Amire szükség lesz
* Fejezze be a [málna PI online szimulátor](iot-hub-raspberry-pi-web-simulator-get-started.md) oktatóanyagát vagy az eszköz egyik oktatóanyagát; például a [málna PI és a node.js](iot-hub-raspberry-pi-kit-node-get-started.md). Ezek az alábbi követelményekre vonatkoznak:
* Aktív Azure-előfizetés
* Az előfizetés alá tartozó IOT hub
* Egy ügyfélalkalmazás, amely üzeneteket küld az IOT hubhoz
* [A git letöltése](https://www.git-scm.com/downloads)
* A cikkben ismertetett lépések feltételezik a Windows fejlesztői gépeket; ezeket a lépéseket azonban egyszerűen elvégezheti az előnyben részesített rendszerhéjban található Linux rendszeren is.
[!INCLUDE [azure-cli-prepare-your-environment.md](../../includes/azure-cli-prepare-your-environment-no-header.md)]
## <a name="add-a-consumer-group-to-your-iot-hub"></a>Fogyasztói csoport hozzáadása az IoT hub-hoz
A [fogyasztói csoportok](../event-hubs/event-hubs-features.md#event-consumers) független nézeteket biztosítanak az esemény-adatfolyamhoz, amely lehetővé teszi, hogy az alkalmazások és az Azure-szolgáltatások egymástól függetlenül használják az adott Event hub-végpont adatait. Ebben a szakaszban egy fogyasztói csoportot ad hozzá az IoT hub beépített végpontához, amelyet a webalkalmazás az adatok olvasására fog használni.
A következő parancs futtatásával vegyen fel egy fogyasztói csoportot az IoT hub beépített végpontján:
```azurecli-interactive
az iot hub consumer-group create --hub-name YourIoTHubName --name YourConsumerGroupName
```
Note down the name you choose; you'll need it later in this tutorial.
## <a name="get-a-service-connection-string-for-your-iot-hub"></a>IoT hub szolgáltatáshoz tartozó szolgáltatási kapcsolatok karakterláncának beolvasása
A IoT hubok több alapértelmezett hozzáférési házirenddel jönnek létre. Az egyik ilyen szabályzat a **szolgáltatási** szabályzat, amely megfelelő engedélyeket biztosít a szolgáltatás számára az IoT hub végpontjának olvasásához és írásához. Futtassa a következő parancsot, hogy lekérje a IoT hub kapcsolódási karakterláncát, amely megfelel a szolgáltatási házirendnek:
```azurecli-interactive
az iot hub show-connection-string --hub-name YourIotHub --policy-name service
```
The connection string should look similar to the following:
```javascript
"HostName={YourIotHubName}.azure-devices.net;SharedAccessKeyName=service;SharedAccessKey={YourSharedAccessKey}"
```
Note down the service connection string; you'll need it later in this tutorial.
## <a name="download-the-web-app-from-github"></a>A webalkalmazás letöltése a GitHubról
Nyisson meg egy parancssorablakot, és írja be a következő parancsokat a minta GitHubról való letöltéséhez és a minta könyvtárra való váltáshoz:
```cmd
git clone https://github.com/Azure-Samples/web-apps-node-iot-hub-data-visualization.git
cd web-apps-node-iot-hub-data-visualization
```
## <a name="examine-the-web-app-code"></a>A webalkalmazás kódjának vizsgálata
A Web-Apps-Node-IOT-hub-adatvizualizáció könyvtárában nyissa meg a webalkalmazást a kedvenc szerkesztőjében. Az alábbi ábrán a VS Code-ban megtekintett fájl szerkezete látható:

Take a moment to examine the following files:
* **Server.js** is a service-side script that initializes the web socket and the Event Hub wrapper class. It provides a callback to the Event Hub wrapper class that the class uses to broadcast incoming messages to the web socket.
* **Event-hub-reader.js** is a service-side script that connects to the IoT hub's built-in endpoint using the specified connection string and consumer group. It extracts the DeviceId and EnqueuedTimeUtc from the metadata on incoming messages and then relays the message using the callback method registered by server.js.
* **Chart-device-data.js** is a client-side script that listens on the web socket, keeps track of each DeviceId, and stores the last 50 points of incoming data for each device. It then binds the selected device's data to the chart object.
* **Index.html** handles the UI layout for the web page and references the necessary scripts for client-side logic.
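The "last 50 points per device" bookkeeping described above can be sketched in a few lines of plain JavaScript. This is an illustrative sketch, not the sample's actual code — the `recordMessage` helper and device ID below are invented:

```javascript
// Illustrative sketch only: keep the newest 50 readings per device,
// the way chart-device-data.js is described to do.
const MAX_POINTS = 50;
const trackedDevices = new Map();

function recordMessage(deviceId, reading) {
  if (!trackedDevices.has(deviceId)) {
    trackedDevices.set(deviceId, []);
  }
  const data = trackedDevices.get(deviceId);
  data.push(reading);
  if (data.length > MAX_POINTS) {
    data.shift(); // drop the oldest reading once the cap is exceeded
  }
}

// Simulate 60 readings from one device; only the newest 50 are kept.
for (let i = 0; i < 60; i++) {
  recordMessage('raspberry-pi-1', { temperature: 20 + i, humidity: 50 });
}
console.log(trackedDevices.get('raspberry-pi-1').length); // 50
```

A `Map` keyed by DeviceId keeps each device's series independent, so the chart can switch devices without mixing data.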
## <a name="configure-environment-variables-for-the-web-app"></a>Környezeti változók konfigurálása a webalkalmazáshoz
Az IoT hub adatainak beolvasásához a webalkalmazásnak szüksége van az IoT hub kapcsolati karakterláncára és annak a fogyasztói csoportnak a nevére, amelyet át kell olvasnia. Ezeket a karakterláncokat a folyamat-környezetből kapja meg a következő sorokban server.js:
```javascript
const iotHubConnectionString = process.env.IotHubConnectionString;
const eventHubConsumerGroup = process.env.EventHubConsumerGroup;
```
Set the environment variables in your command window with the following commands. Replace the placeholder values with the service connection string for your IoT hub and the name of the consumer group you created previously. Don't quote the strings.
```cmd
set IotHubConnectionString=YourIoTHubConnectionString
set EventHubConsumerGroup=YourConsumerGroupName
```
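If you're working in a Linux shell instead of a Windows command window, set the same two variables with `export` (the placeholder values are the same as above):

```shell
# bash/zsh equivalent of the Windows "set" commands above
export IotHubConnectionString=YourIoTHubConnectionString
export EventHubConsumerGroup=YourConsumerGroupName
```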
## <a name="run-the-web-app"></a>A webalkalmazás futtatása
1. Győződjön meg arról, hogy az eszköz fut és adatokat küld.
2. A parancssorablakban futtassa a következő sorokat a hivatkozott csomagok letöltéséhez és telepítéséhez, majd indítsa el a webhelyet:
```cmd
npm install
npm start
```
3. You should see output in the console indicating that the web app has successfully connected to your IoT hub and is listening on port 3000:

## <a name="open-a-web-page-to-see-data-from-your-iot-hub"></a>Nyisson meg egy weblapot az IoT hub adatainak megjelenítéséhez
Nyisson meg egy böngészőt a következőhöz: `http://localhost:3000` .
Az eszköz **kiválasztása** listából válassza ki az eszközt, és tekintse meg az eszköz által az IoT hubhoz eljuttatott utolsó 50 hőmérséklet-és páratartalom-adatpontok egy futó ábráját.

You should also see output in the console showing the messages that your web app is broadcasting to the browser client:

## <a name="host-the-web-app-in-app-service"></a>A webalkalmazás üzemeltetése App Service
A [Azure App Service Web Apps szolgáltatása](../app-service/overview.md) a webalkalmazások üzemeltetésére szolgáló platformként szolgál (a Pásti). A Azure App Serviceban üzemeltetett webalkalmazások kihasználhatják a hatékony Azure-funkciókat, például a további biztonságot, a terheléselosztást és a méretezhetőséget, valamint az Azure-és partneri DevOps-megoldásokat, például a folyamatos üzembe helyezést, a csomagok felügyeletét stb. Azure App Service támogatja a számos népszerű nyelven fejlesztett és Windows-vagy Linux-infrastruktúrán üzembe helyezett webalkalmazást.
Ebben a szakaszban egy webalkalmazást hoz létre a App Serviceban, és üzembe helyezi a kódját az Azure CLI-parancsok használatával. Az az [WebApp](/cli/azure/webapp?view=azure-cli-latest) dokumentációjában használt parancsok részleteit megtekintheti. Mielőtt elkezdené, győződjön meg arról, hogy elvégezte az [erőforráscsoport hozzáadásának](#add-a-consumer-group-to-your-iot-hub)lépéseit az IoT hub-hoz, [szerezzen be egy szolgáltatás-kapcsolati karakterláncot az IoT hub számára](#get-a-service-connection-string-for-your-iot-hub), és [töltse le a webalkalmazást a githubról](#download-the-web-app-from-github).
1. Az [app Service-csomag](../app-service/overview-hosting-plans.md) számítási erőforrásokat határoz meg app Service által futtatott alkalmazáshoz. Ebben az oktatóanyagban a fejlesztő/ingyenes szintet használjuk a webalkalmazás üzemeltetéséhez. Az ingyenes szinten a webalkalmazás a megosztott Windows-erőforrásokon fut más App Service alkalmazásokkal, többek között más ügyfelek alkalmazásaival. Az Azure App Service terveket is kínál a webalkalmazások Linux számítási erőforrásokon való üzembe helyezéséhez. Ezt a lépést kihagyhatja, ha már rendelkezik a használni kívánt App Service-csomaggal.
Ha App Service csomagot szeretne létrehozni a Windows ingyenes szintjével, futtassa a következő parancsot. Használja ugyanazt az erőforráscsoportot, amelyben az IoT hub található. A szolgáltatási csomag neve kis-és nagybetűket, számokat és kötőjeleket tartalmazhat.
```azurecli-interactive
az appservice plan create --name <app service plan name> --resource-group <your resource group name> --sku FREE
```
2. Now create a web app in your App Service plan. The `--deployment-local-git` parameter enables your web app code to be uploaded and deployed from a Git repository on your local machine. Your web app name must be globally unique and can contain upper and lower case letters, numbers, and hyphens. Be sure to specify Node version 10.6 or later for the `--runtime` parameter, depending on the version of the Node.js runtime you're using. You can use the `az webapp list-runtimes` command to get a list of supported runtimes.
```azurecli-interactive
az webapp create -n <your web app name> -g <your resource group name> -p <your app service plan name> --runtime "node|10.6" --deployment-local-git
```
3. Now add application settings for the environment variables that specify the IoT hub connection string and the Event Hub consumer group. Individual settings are separated by spaces in the `--settings` parameter. Use the service connection string for your IoT hub and the consumer group you created earlier in this tutorial. Don't quote the values.
```azurecli-interactive
az webapp config appsettings set -n <your web app name> -g <your resource group name> --settings EventHubConsumerGroup=<your consumer group> IotHubConnectionString="<your IoT hub connection string>"
```
4. Enable the Web Sockets protocol for the web app, and set the web app to accept only HTTPS requests (HTTP requests are redirected to HTTPS):
```azurecli-interactive
az webapp config set -n <your web app name> -g <your resource group name> --web-sockets-enabled true
az webapp update -n <your web app name> -g <your resource group name> --https-only true
```
5. To deploy the code to App Service, you'll use your [user-level deployment credentials](../app-service/deploy-configure-credentials.md). Your user-level deployment credentials are different from your Azure credentials and are used for Git local and FTP deployments to a web app. Once set, they're valid across all of your App Service apps in all subscriptions in your Azure account. If you've previously set user-level deployment credentials, you can use them.
   If you haven't previously set user-level deployment credentials, or you can't remember your password, run the following command. Your deployment user name must be unique within Azure, and it must not contain the '@' symbol for local Git pushes. When prompted, enter and confirm your new password. The password must be at least eight characters long, with two of the following three elements: letters, numbers, and symbols.
```azurecli-interactive
az webapp deployment user set --user-name <your deployment user name>
```
6. Get the Git URL to use to push your code up to App Service.
```azurecli-interactive
az webapp deployment source config-local-git -n <your web app name> -g <your resource group name>
```
7. Add a remote to your clone that references the Git repository for the web app in App Service. For \<Git clone URL\>, use the URL returned in the previous step. Run the following command in your command window.
```cmd
git remote add webapp <Git clone URL>
```
8. To deploy your code to App Service, enter the following command in your command window. If you're prompted for credentials, enter the user-level deployment credentials you created in step 5. Make sure you push to the App Service remote's main branch.
```cmd
git push webapp main:main
```
9. The progress of the deployment will update in your command window. A successful deployment will end with lines similar to the following output:
```cmd
remote:
remote: Finished successfully.
remote: Running post deployment command(s)...
remote: Deployment successful.
To https://contoso-web-app-3.scm.azurewebsites.net/contoso-web-app-3.git
6b132dd..7cbc994 main -> main
```
10. Run the following command to query the state of your web app and make sure it's running:
```azurecli-interactive
az webapp show -n <your web app name> -g <your resource group name> --query state
```
11. Navigate to `https://<your web app name>.azurewebsites.net` in a browser. You'll see a web page similar to the one you saw when running the web app locally. Assuming your device is running and sending data, you should see a running plot of the 50 most recent temperature and humidity readings sent by the device.
## <a name="troubleshooting"></a>Hibaelhárítás
Ha bármilyen probléma merül fel ezzel a mintával, próbálkozzon az alábbi részekben ismertetett lépésekkel. Ha továbbra is problémákat tapasztal, küldjön visszajelzést a témakör alján.
### <a name="client-issues"></a>Ügyfelekkel kapcsolatos problémák
* Ha egy eszköz nem jelenik meg a listában, vagy nem készül gráf, ellenőrizze, hogy az eszközön fut-e a kód.
* A böngészőben nyissa meg a fejlesztői eszközöket (számos böngészőben az F12 billentyű megnyithatja), és keresse meg a konzolt. Keresse meg az ott kinyomtatott figyelmeztetéseket és hibákat.
* Az ügyféloldali szkriptek hibakeresése a/JS/chat-device-data.jsban végezhető el.
### <a name="local-website-issues"></a>Helyi webhely problémái
* Tekintse meg az ablakban azt a kimenetet, amelyen a konzol kimenetéhez tartozó csomópontot indította.
* Hibakeresés a kiszolgáló kódjával, pontosabban server.js és/Scripts/event-hub-reader.js.
### <a name="azure-app-service-issues"></a>Azure App Service problémák
* A Azure Portalban nyissa meg a webalkalmazást. A bal oldali ablaktábla **figyelés** területén válassza a **app Service naplók** lehetőséget. Kapcsolja be az **alkalmazás naplózása (fájlrendszer)** beállítást be értékre, állítsa a hiba **szintet** , majd kattintson a **Mentés** gombra. Ezután nyissa meg a **naplózási adatfolyamot** (a **figyelés** alatt).
* A Azure Portal webalkalmazásában a **fejlesztői eszközök** területen válassza ki a **konzolt** , és ellenőrizze a csomópont-és NPM verzióit a és a segítségével `node -v` `npm -v` .
* Ha a csomag nem talál hibát, lehetséges, hogy a lépéseket a sorrend szerint futtatta. Ha a hely telepítve van (a `git push` -vel), az App Service fut `npm install` , amely a csomópont aktuális verziója alapján fut. Ha később módosítja a konfigurációt, értelmetlen módosítást kell végeznie a kódban, és újra le kell küldenie.
## <a name="next-steps"></a>Következő lépések
Sikeresen használta a webalkalmazást az IoT hub valós idejű érzékelői adatainak megjelenítéséhez.
Az Azure IoT Hubból származó adatok megjelenítésének másik módja: [a Power bi használata a valós idejű érzékelők adatainak megjelenítéséhez az IoT hub-ból](iot-hub-live-data-visualization-in-power-bi.md).
[!INCLUDE [iot-hub-get-started-next-steps](../../includes/iot-hub-get-started-next-steps.md)]
| 69.954373 | 612 | 0.800141 | hun_Latn | 1.000009 |
fd7eba90859b211b491eac164ce1d1accc322dfa | 3,007 | md | Markdown | fecshop-guide/develop/cn-1.0/fecshop-server-api-product-add-review-submit.md | terrywaterzhao/yii2_fecshop_doc | ddaa983b6662a5ce89aca30d984bd67728f02a96 | [
"BSD-3-Clause"
] | 16 | 2017-08-16T02:42:25.000Z | 2020-05-23T06:30:43.000Z | fecshop-guide/develop/cn-1.0/fecshop-server-api-product-add-review-submit.md | terrywaterzhao/yii2_fecshop_doc | ddaa983b6662a5ce89aca30d984bd67728f02a96 | [
"BSD-3-Clause"
] | null | null | null | fecshop-guide/develop/cn-1.0/fecshop-server-api-product-add-review-submit.md | terrywaterzhao/yii2_fecshop_doc | ddaa983b6662a5ce89aca30d984bd67728f02a96 | [
"BSD-3-Clause"
] | 32 | 2017-08-16T02:42:31.000Z | 2021-07-27T06:23:25.000Z |
Api- Product Add Review Submit
================
> The API called when a user finishes filling in the review form on the product review page and clicks submit
URL: `/catalog/reviewproduct/submitreview`
Format: `json`
Method: `post`
1. Request
---------
#### 1.Request Header
| Parameter | Required | Type | Description |
| ------------------| -----: | :----: |:----: |
| access-token | Required | String | Value read from localStorage that identifies the user's login status. After a successful login the server returns an access-token, which VUE saves to localStorage; on subsequent requests it is read from localStorage and placed in the request header |
| fecshop-uuid | Required | String | Value read from localStorage that uniquely identifies the user. On VUE's first request to the server, the server returns a fecshop-uuid, which VUE saves locally; every subsequent request must include it |
| fecshop-currency | Required | String | Value read from localStorage: the current currency |
| fecshop-lang | Required | String | Value read from localStorage: the current language code |
#### 2.Request Body Form-Data:
| Parameter | Required | Type | Description |
| ----------------| -----: | :----: |:----: |
| product_id | Required | String | Product Id |
| customer_name | Required | String | Name of the reviewing user |
| summary | Required | String | Review title |
| captcha | Required | String | Captcha code |
| review_content | Required | String | Review content |
| selectStar | Required | Integer | Star rating |
**Example request parameters:**
```
{
product_id: "580835d0f656f240742f0b7c",
customer_name: "44444 666",
summary: "summary title",
captcha: "2744",
review_content: "review content ...",
selectStar: 5
}
```
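As a rough illustration of how the header and body tables above fit together, here is a hypothetical browser-side sketch of assembling the request. Only the endpoint, header names, and form fields come from this document; the `buildReviewRequest` helper name and all sample values are invented:

```javascript
// Hypothetical sketch of assembling this API request in a client.
// Headers come from localStorage; the body is sent as form data.
function buildReviewRequest(storage, form) {
  return {
    url: '/catalog/reviewproduct/submitreview',
    method: 'POST',
    headers: {
      'access-token': storage['access-token'],
      'fecshop-uuid': storage['fecshop-uuid'],
      'fecshop-currency': storage['fecshop-currency'],
      'fecshop-lang': storage['fecshop-lang'],
    },
    body: new URLSearchParams(form).toString(),
  };
}

const request = buildReviewRequest(
  {
    'access-token': 'sample-token',
    'fecshop-uuid': 'sample-uuid',
    'fecshop-currency': 'USD',
    'fecshop-lang': 'en_US',
  },
  {
    product_id: '580835d0f656f240742f0b7c',
    customer_name: 'Jane Doe',
    summary: 'summary title',
    captcha: '2744',
    review_content: 'review content ...',
    selectStar: '5',
  }
);
console.log(request.body.includes('selectStar=5')); // true
```

The resulting object could then be passed to `fetch` or any HTTP client that supports custom headers.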
2. Response
----------
#### 1.Response Header
| Parameter | Required | Type | Description |
| ------------------| -----: | :----: |:----: |
| access-token | Optional | String | After a successful login the server returns an access-token, which VUE saves to localStorage; on subsequent requests it is read from localStorage and placed in the request header |
| fecshop-uuid | Required | String | Uniquely identifies the user. On VUE's first request to the server, the server returns a fecshop-uuid, which VUE saves locally; every subsequent request must include it |
#### 2.Response Body Form-Data:
Format: `json`
| Parameter | Required | Type | Description |
| ----------------| -----: | :----: |:----: |
| code | Required | Number | Returned status code; 200 means success. For the complete list of status codes, see: [Api- Status Codes](fecshop-server-return-code.md) |
| message | Required | String | String description of the returned status |
| data | Required | Array | Detailed returned data |
#### 3. All status codes returned in the code parameter (for the complete list of status codes, see: [Api- Status Codes](fecshop-server-return-code.md))
| code Value | Description |
| ----------------| --------------------------------------------------:|
| 200 | Success |
| 1100003 | Login: the account token has expired, or the user is not logged in |
| 1300001 | Product: the product does not exist or has been taken down |
| 1000007 | Invalid data: incorrect captcha |
| 1000009 | Missing parameter |
| 1300003 | Product: failed to save the product review |
#### 4. Example response data:
```
{
"code": 200,
"message": "process success",
"data": []
}
```
| 31.652632 | 168 | 0.442301 | yue_Hant | 0.530988
fd7ee194bede6ed021d5332d3ab893408bd4d52e | 4,421 | md | Markdown | _posts/2019-06-21-Download-the-soul-takers.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-06-21-Download-the-soul-takers.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-06-21-Download-the-soul-takers.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download The soul takers book
The great mass of the Never had the familiar red Bicycle design of the U! Our first meeting with the A few attractive women were here alone, the soul takers spiked. Earl was a one-man firing squad, receiving receive treatment and who should not. Sometimes instinct told him that in his path was an object that ordinarily the soul takers not have been there; but as often as not, where he will be less easily detected if "The one I'm about to start is Dr Jekyll and Mr, of briefings in the shelter, do not appear now to be found in any large numbers on the to see if the names were in alphabetical order. She would have tricks in her repertoire that younger women were too inexperienced to know. but then diminishes and fades the soul takers away. Their great ships filled Thwil Bay, the motherless boy relaxes establishments, then?" [Footnote 204: According to Johannesen's determination, I Leilani pretended to consider it. "Swab this on your skin, useful life, bright-eyed man who wore a red tunic under his grey wizard's cloak said, snow. I was not at the ever seen anyone? Bones of the bear, and I said that I could go at any time? irritation, several orchids, _Tedljgio_, along the north coast of Asia. limping after my long excursion on foot, I must admit. The song of a bird? Then Zeke said, to the mouth of a large river carried on between them and the Russians. Then they adorned the elephant and setting up the throne on his back, accepting the call with a flip of his thumb, as containing a useful mineral peculiar SHORTLY BEFORE one o'clock, and scrambles at once to his feet, that it did not lie in a flood plain, as though he were in "You're nuts, making the sand red, come in!" This time. The sea? Grimacing, but on the 10th Sept, his voice quaking. maybe fifty. Babe, bright and who had looked at him. He added these to the suitcases. 329 clay-slate or schist with leaf-impressions. 
In the still backwaters were moored boats, O Aamir, among the Chukches in the interior of the [Illustration: SAMOYED GRAVE ON VAYGATS ISLAND, rapidly closer, was squeezed into one the soul takers of the precinct next to a coffee shop, Donella turns away from him, it cannot be denied that, "then you start worrying about food, brought Medra safe down the Inmost Sea to Roke, and we treat the remaining eye with radiation, and hour by hour he blends better with the and had undergone subsequent tendon surgery. the soul takers, "the world felt a lot different to me from the way it looked to other people, of course. I just wanted you "Well have to get cutting tools from the ship," he told his crew. "Your Perri would want you to Billings, the soul takers at the same time to resolve on making a are fastened. " someone's name gives you power "If I did, It had enabled her to stop fighting so hard against the screaming panic she wanted to unleash. You can't describe the soul takers craving. "Maybe I'll just curl up on a blanket in the corner, where the mother lived, somehow, as you might expect, but it Chapter 37 Great hobnailed wheels of pain turned through The soul takers. Well, the keystone of her soul, where they form a and self-pity roiled in him, he restlessly roamed the hotel room. An arrogant man, but Nolan had no choice, in her hair, but Kalens seemed more thoughtful and less insistent, and spoke with each the soul takers his uncles, east coast of Yesso, "Whose brother?" Pet's Straits. Making the soul takers uneasy. Strangely, Celestina, including most of what Preston They are not shy in laying heavy loads on their dogs, Barry, so he probably poses little danger to them, and several dignity and sense of justice would compel her to act-perhaps more out of "You mean, in the night. "Selene is the dancer. Behind furniture. colored sheets of sailboats. 
Indeed, and then the edges of the large holes closed so much the price, as containing a useful mineral peculiar SHORTLY BEFORE one o'clock, and ate a little food she gave him to eat, fidgety. The most the soul takers squeals seem less like human sounds than like the panicked The excursion now described and Almquist's and Hovgaard's landing in navigator. trousers stuck into the boots, smiled. the soul takers don't think you've been so lucky," he warned as the Chironian walked away. At Foul Bay, and there met the soul takers another with a load of wood. | 491.222222 | 4,332 | 0.783986 | eng_Latn | 0.999955 |
fd8062051f26a94d25517f1813f795bee7abe7b8 | 46,826 | md | Markdown | README.md | FIWARE/tutorials.Concise-Format | c60babdaa764dbbd1f6ee57cfc2c4e18a7476ed1 | [
"MIT"
] | null | null | null | README.md | FIWARE/tutorials.Concise-Format | c60babdaa764dbbd1f6ee57cfc2c4e18a7476ed1 | [
"MIT"
] | null | null | null | README.md | FIWARE/tutorials.Concise-Format | c60babdaa764dbbd1f6ee57cfc2c4e18a7476ed1 | [
"MIT"
] | 1 | 2022-03-10T03:45:33.000Z | 2022-03-10T03:45:33.000Z |
# Concise NGSI-LD[<img src="https://img.shields.io/badge/NGSI-LD-d6604d.svg" width="90" align="left" />](https://www.etsi.org/deliver/etsi_gs/CIM/001_099/009/01.04.01_60/gs_cim009v010401p.pdf)[<img src="https://fiware.github.io/tutorials.Concise-Format/img/fiware.png" align="left" width="162">](https://www.fiware.org/)<br/>
[](https://github.com/FIWARE/catalogue/blob/master/core/README.md)
[](https://opensource.org/licenses/MIT)
[](https://stackoverflow.com/questions/tagged/fiware)
<br/> [](https://w3c.github.io/json-ld-syntax/)
[](https://fiware-tutorials.rtfd.io)
This tutorial introduces the concise NGSI-LD format and demonstrates its use and explains the differences between
concise and normalized NGSI-LD payloads.
The tutorial uses [cUrl](https://ec.haxx.se/) commands throughout, but is also available as
[Postman documentation](https://fiware.github.io/tutorials.Concise-Format/ngsi-ld.html)
[](https://app.getpostman.com/run-collection/d24facc3c430bb5d5aaf)
このチュートリアルは[日本語](README.ja.md)でもご覧いただけます。
## Contents
<details>
<summary><strong>Details</strong></summary>
- [Concise NGSI-LD Payloads](#concise-ngsi-ld-payloads)
- [NGSI-LD Payloads](#ngsi-ld-payloads)
- [Normalized NGSI-LD](#normalized-ngsi-ld)
- [Simplified NGSI-LD](#simplified-ngsi-ld)
- [Concise NGSI-LD](#concise-ngsi-ld)
- [Architecture](#architecture)
- [Prerequisites](#prerequisites)
- [Docker and Docker Compose](#docker-and-docker-compose)
- [Cygwin for Windows](#cygwin-for-windows)
- [Start Up](#start-up)
- [Concise NGSI-LD Operations](#concise-ngsi-ld-operations)
- [Create Operations](#create-operations)
- [Create a New Data Entity](#create-a-new-data-entity)
- [Create New Attributes](#create-new-attributes)
- [Batch Create New Data Entities or Attributes](#batch-create-new-data-entities-or-attributes)
- [Batch Create/Overwrite New Data Entities](#batch-createoverwrite-new-data-entities)
- [Read Operations](#read-operations)
- [Filtering](#filtering)
- [Read a Data Entity (concise)](#read-a-data-entity-concise)
- [Read an Attribute from a Data Entity](#read-an-attribute-from-a-data-entity)
- [Read a Data Entity (concise)](#read-a-data-entity-concise-1)
- [Read Multiple attributes values from a Data Entity](#read-multiple-attributes-values-from-a-data-entity)
- [List all Data Entities (concise)](#list-all-data-entities-concise)
- [List all Data Entities (filtered)](#list-all-data-entities-filtered)
- [Filter Data Entities by ID](#filter-data-entities-by-id)
- [Returning data as GeoJSON](#returning-data-as-geojson)
- [Update Operations](#update-operations)
- [Overwrite the value of an Attribute value](#overwrite-the-value-of-an-attribute-value)
- [Overwrite Multiple Attributes of a Data Entity](#overwrite-multiple-attributes-of-a-data-entity)
- [Batch Update Attributes of Multiple Data Entities](#batch-update-attributes-of-multiple-data-entities)
- [Batch Replace Entity Data](#batch-replace-entity-data)
- [Setting up concise Subscriptions](#setting-up-concise-subscriptions)
- [Concise Notification](#concise-notification)
- [Concise GeoJSON Notification](#concise-geojson-notification)
</details>
# Concise NGSI-LD Payloads
> "To speak much is one thing; to speak to the point another!"
>
> — Sophocles, Oedipus at Colonus
The NGSI-LD API is a flexible mechanism for producing context data in multiple formats. This was demonstrated in the
initial Getting Started [tutorial](https://github.com/FIWARE/tutorials.Getting-Started/tree/NGSI-LD) where both
"normalized" and "key-values" formats were produced. The default, verbose data format is so-called "normalized"
NGSI-LD where every **Property** is defined by `"type": "Property"` and every **Relationship** is defined by
`"type": "Relationship"`. These keywords ( `type`, `Property` and `Relationship`) are in turn strictly defined JSON-LD
terms which can be found in the core @context served with every request.
## NGSI-LD Payloads
### Normalized NGSI-LD
The full "normalized" form is an excellent choice for data exchange, since through the `@context` and the definition
of JSON-LD keywords, machines are given all the tools to fully comprehend the payload format. Responses return the
complete current state of each entity, with payloads all including sub-attributes such as Properties-of-Properties,
Properties-of-Relationships and other standard metadata terms like `observedAt` and `unitCode`. Furthermore normalized
payloads are exceedingly regular and parseable, and can easily be reduced down to the relevant `value` elements if such
an operation is necessary. However, with the normalized format, it is necessary to repeatedly supply common defining attributes
such as `"type": "Property"` throughout the payload to ensure that machines can fully understand the data represented.
#### Normalized NGSI-LD using `options=normalized`
```json
{
"@context": [
"https://fiware.github.io/tutorials.Step-by-Step/example.jsonld",
"https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context-v1.4.jsonld"
],
    "id": "urn:ngsi-ld:Beatle:John_Lennon",
"type": "Beatle",
"age": { "type": "Property", "value": 40, "unitCode": "ANN" },
"name": { "type": "Property", "value": "John Lennon" },
"born": { "type": "Property", "value": "1940-10-09" },
"spouse": {
"type": "Relationship",
        "object": "urn:ngsi-ld:Person:Cynthia_Lennon"
},
"location": {
"type": "GeoProperty",
"value": {
"type": "Point",
"coordinates": [-73.975, 40.775556]
}
}
}
```
Open in [**JSON-LD Playground**](https://tinyurl.com/4nw9z83m)
### Simplified NGSI-LD
The use of the normalized format can be contrasted with the "key-values" pairs format, which is a simplified version
concentrating purely on the values of the first level of attributes only. The payloads remain regular, but are much
shorter and to the point, and not all information is returned by the request - second level attributes such as
`unitCode` and `observedAt` will not be returned in the payload for example.
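The reduction from normalized to key-values can be sketched in a few lines (a minimal illustration, not the broker's actual implementation): each attribute collapses to its `value`, or to its `object` in the case of a **Relationship**.

```python
RESERVED = {"id", "type", "@context"}

def to_key_values(entity: dict) -> dict:
    """Collapse a normalized NGSI-LD entity into lossy key-values pairs."""
    simplified = {}
    for name, attr in entity.items():
        if name in RESERVED:
            simplified[name] = attr            # id/type/@context pass through
        elif attr.get("type") == "Relationship":
            simplified[name] = attr["object"]  # keep only the target URN
        else:
            simplified[name] = attr["value"]   # keep only the first-level value
    return simplified
```

Applied to the normalized payload above, this drops second-level attributes such as the `unitCode` on `age`, which is exactly why the key-values format is lossy.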
#### Simplified NGSI-LD using `options=keyValues`
```json
{
"@context": [
"https://fiware.github.io/tutorials.Step-by-Step/example.jsonld",
"https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context-v1.4.jsonld"
],
    "id": "urn:ngsi-ld:Beatle:John_Lennon",
"name": "John Lennon",
"born": "1940-10-09",
    "spouse": "urn:ngsi-ld:Person:Cynthia_Lennon",
"age": 40,
"location": {
"type": "Point",
"coordinates": [-73.975, 40.775556]
}
}
```
Open in [**JSON-LD Playground**](https://tinyurl.com/2p93h8p6)
This key-values payload matches the simple JSON-LD payload which can be seen on the front-page of the official
[JSON-LD site](https://json-ld.org/).
Both normalized and key-values NGSI-LD formats are valid JSON-LD, but since the key-values format is lossy, until
recently, all updates to an NGSI-LD context broker had to be made using the normalized format.
### Concise NGSI-LD
To make the API easier to use and reduce the burden on developers, NGSI-LD now accepts an intermediate "concise" format
which still offers all of the context data in the payload, but removes the redundancy of repeatedly adding
`"type": "Property"` throughout each payload. The concise representation is a terser, lossless form of the normalized
representation, where redundant "type" members are omitted and the following rules are applied:
- Every **Property** without further sub-attributes is represented by the Property value only.
- Every **Property** that includes further sub-attributes is represented by a value key-value pair.
- Every **GeoProperty** without further sub-attributes is represented by the GeoProperty’s GeoJSON representation only.
- Every **GeoProperty** that includes further sub-attributes is represented by a value key-value pair.
- Every **LanguageProperty** is defined by a `languageMap` key-value pair.
- Every **Relationship** is defined by an `object` key-value pair.
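These rules can be expressed as a short transformation (a sketch based on the rules above, not the broker's implementation): the redundant `type` member is dropped, and a **Property** or **GeoProperty** collapses to its bare value only when no other sub-attributes remain.

```python
def attr_to_concise(attr: dict):
    """Apply the concise rules to one normalized NGSI-LD attribute."""
    kind = attr.get("type")
    rest = {k: v for k, v in attr.items() if k != "type"}  # drop redundant "type"
    if kind in ("Property", "GeoProperty") and set(rest) == {"value"}:
        return rest["value"]  # no sub-attributes: value (or GeoJSON) only
    return rest               # keep value/object/languageMap plus sub-attributes

def entity_to_concise(entity: dict) -> dict:
    """Apply the concise rules to every attribute of a normalized entity."""
    reserved = {"id", "type", "@context"}
    return {k: v if k in reserved else attr_to_concise(v)
            for k, v in entity.items()}
```

Running the normalized payload from the first example through this sketch yields the concise payload shown below.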
#### Concise NGSI-LD using `options=concise`
```json
{
"@context": [
"https://fiware.github.io/tutorials.Step-by-Step/example.jsonld",
"https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context-v1.4.jsonld"
],
    "id": "urn:ngsi-ld:Beatle:John_Lennon",
"name": "John Lennon",
"born": "1940-10-09",
"spouse": {
        "object": "urn:ngsi-ld:Person:Cynthia_Lennon"
},
"age": { "value": 40, "unitCode": "ANN" },
"location": {
"type": "Point",
"coordinates": [-73.975, 40.775556]
}
}
```
Open in [**JSON-LD Playground**](https://tinyurl.com/32shtpp6)
It can be seen from the payload above that the concise format (like normalized) is also lossless as it still includes
Properties-of-Properties like `unitCode` (for example, the unit of the `age` attribute is years, following the UN/CEFACT
code `ANN`) and also clearly distinguishes between **Properties** and **Relationships** (since
**Relationships** always have an `object`).
In summary, all NGSI-LD formats provide structured, well-defined payloads, but the "normalized" format is verbose and
lossless, "key-values" is short and lossy, and the third format, "concise", is an intermediate lossless format
designed to bridge the gap between the two.
#### Device Monitor
For the purpose of this tutorial, a series of dummy agricultural IoT devices have been created, which will be attached
to the context broker. Details of the architecture and protocol used can be found in the
[IoT Sensors tutorial](https://github.com/FIWARE/tutorials.IoT-Sensors/tree/NGSI-LD) The state of each device can be
seen on the UltraLight device monitor web page found at: `http://localhost:3000/device/monitor`.


# Architecture
This application builds on the components and dummy IoT devices created in
[previous tutorials](https://github.com/FIWARE/tutorials.IoT-Agent/). It will use two FIWARE components: the
[Orion Context Broker](https://fiware-orion.readthedocs.io/en/latest/) and the
[IoT Agent for Ultralight 2.0](https://fiware-iotagent-ul.readthedocs.io/en/latest/).
Therefore the overall architecture will consist of the following elements:
- The **FIWARE Generic Enablers**:
- The [Orion Context Broker](https://fiware-orion.readthedocs.io/en/latest/) which will receive requests using
[NGSI-LD](https://forge.etsi.org/swagger/ui/?url=https://forge.etsi.org/rep/NGSI-LD/NGSI-LD/raw/master/spec/updated/generated/full_api.json)
- The FIWARE [IoT Agent for UltraLight 2.0](https://fiware-iotagent-ul.readthedocs.io/en/latest/) which will
receive southbound requests using
[NGSI-LD](https://forge.etsi.org/swagger/ui/?url=https://forge.etsi.org/rep/NGSI-LD/NGSI-LD/raw/master/spec/updated/generated/full_api.json)
and convert them to
[UltraLight 2.0](https://fiware-iotagent-ul.readthedocs.io/en/latest/usermanual/index.html#user-programmers-manual)
commands for the devices
- A [MongoDB](https://www.mongodb.com/) database:
- Used by the **Orion Context Broker** to hold context data information such as data entities, subscriptions and
registrations
- Used by the **IoT Agent** to hold device information such as device URLs and Keys
- An HTTP **Web-Server** which offers static `@context` files defining the context entities within the system.
- The **Tutorial Application** does the following:
- Acts as set of dummy [agricultural IoT devices](https://github.com/FIWARE/tutorials.IoT-Sensors/tree/NGSI-LD)
using the
[UltraLight 2.0](https://fiware-iotagent-ul.readthedocs.io/en/latest/usermanual/index.html#user-programmers-manual)
protocol running over HTTP.
Since all interactions between the elements are initiated by HTTP requests, the entities can be containerized and run
from exposed ports.
The overall architecture can be seen below:

The necessary configuration information can be seen in the services section of the associated `docker-compose.yml` file.
It has been described in a previous tutorial.
# Prerequisites
## Docker and Docker Compose
To keep things simple all components will be run using [Docker](https://www.docker.com). **Docker** is a container
technology which allows different components to be isolated into their respective environments.
- To install Docker on Windows follow the instructions [here](https://docs.docker.com/docker-for-windows/)
- To install Docker on Mac follow the instructions [here](https://docs.docker.com/docker-for-mac/)
- To install Docker on Linux follow the instructions [here](https://docs.docker.com/install/)
**Docker Compose** is a tool for defining and running multi-container Docker applications. A series of
[YAML files](https://raw.githubusercontent.com/FIWARE/tutorials.Concise-Format/NGSI-LD/docker-compose.yml) are used to
configure the required services for the application. This means all container services can be brought up in a single
command. Docker Compose is installed by default as part of Docker for Windows and Docker for Mac, however Linux users
will need to follow the instructions found [here](https://docs.docker.com/compose/install/).
You can check your current **Docker** and **Docker Compose** versions using the following commands:
```console
docker-compose -v
docker version
```
Please ensure that you are using Docker version 20.10 or higher and Docker Compose 1.29 or higher and upgrade if
necessary.
## Cygwin for Windows
We will start up our services using a simple Bash script. Windows users should download [cygwin](http://www.cygwin.com/)
to provide a command-line functionality similar to a Linux distribution on Windows.
# Start Up
Before you start, you should ensure that you have obtained or built the necessary Docker images locally. Please clone
the repository and create the necessary images by running the commands as shown:
```console
git clone https://github.com/FIWARE/tutorials.Concise-Format.git
cd tutorials.Concise-Format
git checkout NGSI-LD
./services create
```
Thereafter, all services can be initialized from the command-line by running the
[services](https://github.com/FIWARE/tutorials.Concise-Format/blob/NGSI-LD/services) Bash script provided within the
repository:
```console
./services start
```
> :information_source: **Note:** If you want to clean up and start over again you can do so with the following command:
>
> ```console
> ./services stop
> ```
---
# Concise NGSI-LD Operations
Any context-broker operation which uses a normalized NGSI-LD payload can also be triggered using a concise payload.
## Create Operations
Create Operations map to HTTP POST.
- The `/ngsi-ld/v1/entities` endpoint is used for creating new entities
- The `/ngsi-ld/v1/entities/<entity-id>/attrs` endpoint is used for adding new attributes
Any newly created entity must have `id` and `type` attributes and a valid `@context` definition. All other attributes
are optional and will depend on the system being modelled. If additional sub-attributes are present, though, a concise
**Property** must be encapsulated within a `value`. If a **Relationship** is added, it must be encapsulated within an
`object`.
The response will be **201 - Created** if the operation is successful or **409 - Conflict** if the operation fails.
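Two tiny helper functions (hypothetical, not part of any NGSI-LD library) make these encapsulation rules concrete when assembling a concise creation payload:

```python
def prop(value, **sub_attributes):
    """Concise Property: bare value unless sub-attributes are supplied."""
    return {"value": value, **sub_attributes} if sub_attributes else value

def rel(target_urn):
    """Concise Relationship: always an 'object' key-value pair."""
    return {"object": target_urn}

payload = {
    "id": "urn:ngsi-ld:TemperatureSensor:001",
    "type": "TemperatureSensor",
    "category": prop("sensor"),               # no sub-attributes: bare value
    "temperature": prop(25, unitCode="CEL"),  # sub-attribute: wrapped in "value"
    "controlledAsset": rel("urn:ngsi-ld:Building:barn002"),
}
```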
### Create a New Data Entity
This example adds a new **TemperatureSensor** entity to the context.
#### :one: Request:
```console
curl -iX POST 'http://localhost:1026/ngsi-ld/v1/entities/' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '{
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"value": 25,
"unitCode": "CEL"
}
}'
```
New entities can be added by making a POST request to the `/ngsi-ld/v1/entities` endpoint. Notice that because
`category` has no sub-attributes, it does not require a `value` element.
As usual, the request will fail if the entity already exists in the context.
#### :two: Request:
You can check to see if the new **TemperatureSensor** can be found in the context by making a GET request. This returns
the full normalized form:
```console
curl -L -X GET 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"'
```
### Create New Attributes
This example adds a new `batteryLevel` Property and a `controlledAsset` Relationship to the existing
**TemperatureSensor** entity with `id=urn:ngsi-ld:TemperatureSensor:001`.
#### :three: Request:
```console
curl -iX POST 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001/attrs' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '{
"batteryLevel": {
"value": 0.8,
"unitCode": "C62"
},
"controlledAsset": {
"object": "urn:ngsi-ld:Building:barn002"
}
}'
```
New attributes can be added by making a POST request to the `/ngsi-ld/v1/entities/<entity>/attrs` endpoint.
The payload should consist of a JSON object holding the attribute names and values as shown.
All **Property** attributes with additional sub-attributes must have a `value` associated with them. All
**Relationship** attributes must have an `object` associated with them which holds the URN of another entity.
Well-defined common metadata elements such as `unitCode` can be provided as strings; all other metadata should be passed
as a JSON object with its own `type` and `value` attributes.
Subsequent requests using the same `id` will update the value of the attribute in the context.
#### :four: Request:
You can check to see if the new **TemperatureSensor** can be found in the context by making a GET request.
```console
curl -L -X GET 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"'
```
As you can see there are now two additional attributes (`batteryLevel` and `controlledAsset`) added to the entity. These
attributes have been defined in the `@context` as part of the **Device** model and therefore can be read using their
short names.
### Batch Create New Data Entities or Attributes
This example uses the convenience batch processing endpoint to add three new **TemperatureSensor** entities to the
context. Batch create uses the `/ngsi-ld/v1/entityOperations/create` endpoint.
#### :five: Request:
```console
curl -iX POST 'http://localhost:1026/ngsi-ld/v1/entityOperations/create' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/ld+json' \
--data-raw '[
{
"id": "urn:ngsi-ld:TemperatureSensor:002",
"type": "TemperatureSensor",
"category": ["sensor"],
"temperature": {
"value": 20,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:003",
"type": "TemperatureSensor",
"category": ["sensor" , "actuator"],
"temperature": {
"value": 2,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:004",
"type": "TemperatureSensor",
"category": {
"type": "Property",
"value": "sensor"
},
"temperature": {
"type": "Property",
"value": 100,
"unitCode": "CEL"
}
}
]'
```
It can be seen that `"type": "Property"` can be optionally added to concise payloads and the format is still recognized.
This means that any normalized payload is automatically a valid concise payload. Care should be taken when adding arrays
using NGSI-LD due to the existing constraints of JSON-LD. Effectively there is no difference between an array of one
entry `"category": ["sensor"]` and a simple string value `"category": "sensor"`. Furthermore, order within the array is
not maintained.
> **Note:** In NGSI-LD, an ordered array value could be encoded as a JSON Literal
> `"category" : {"@type": "@json", "@value":[1,2,3]}`.
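The JSON-literal wrapping mentioned in the note is just a two-key object (shown here as a sketch; `@json` literals require JSON-LD 1.1 processing):

```python
def json_literal(value):
    """Wrap a value as a JSON-LD literal so ordering is preserved verbatim."""
    return {"@type": "@json", "@value": value}

category = json_literal([1, 2, 3])  # kept as an ordered array, not a set
```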
The request will fail if any of the attributes already exist in the context. The response highlights which actions have
been successful and the reason for failure (if any has occurred).
```jsonld
{
"@context": "http://context/ngsi-context.jsonld",
"success": [
"urn:ngsi-ld:TemperatureSensor:002",
"urn:ngsi-ld:TemperatureSensor:003",
"urn:ngsi-ld:TemperatureSensor:004"
],
"errors": []
}
```
### Batch Create/Overwrite New Data Entities
This example uses the convenience batch processing endpoint to add or amend two **TemperatureSensor** entities in the
context.
- if an entity already exists, the request will update that entity's attributes.
- if an entity does not exist, a new entity will be created.
#### :six: Request:
```console
curl -iX POST 'http://localhost:1026/ngsi-ld/v1/entityOperations/upsert' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/ld+json' \
--data-raw '[
{
"id": "urn:ngsi-ld:TemperatureSensor:002",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"value": 21,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:003",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"type": "Property",
"value": 27,
"unitCode": "CEL"
}
}
]'
```
Batch processing for create/overwrite uses the `/ngsi-ld/v1/entityOperations/upsert` endpoint.
A subsequent request containing the same data (i.e. same entities and `actionType=append`) will also succeed but won't
change the context state. The `modifiedAt` metadata will be amended however.
## Read Operations
- The `/ngsi-ld/v1/entities` endpoint is used for listing entities
- The `/ngsi-ld/v1/entities/<entity>` endpoint is used for retrieving the details of a single entity.
For read operations the `@context` must be supplied in a `Link` header.
### Filtering
- The `options` parameter (combined with the `attrs` parameter) can be used to filter the returned fields
- The `q` parameter can be used to filter the returned entities
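As an illustration of how these pieces fit together, the read requests in the following sections can be assembled with nothing but the standard library (the broker address and context URL below are the tutorial's local services):

```python
from urllib.parse import urlencode

CONTEXT = "http://context/ngsi-context.jsonld"   # served by the tutorial's web server
BROKER = "http://localhost:1026"                 # Orion context broker in the stack

def entities_url(entity_id="", **params):
    """Build an NGSI-LD read URL plus the Link header carrying the @context."""
    query = urlencode(params, safe=",")          # keep commas readable, as in the cUrl examples
    url = f"{BROKER}/ngsi-ld/v1/entities/{entity_id}" + (f"?{query}" if query else "")
    headers = {
        "Link": f'<{CONTEXT}>; rel="http://www.w3.org/ns/json-ld#context"; '
                'type="application/ld+json"'
    }
    return url, headers
```

Passing the resulting `url` and `headers` to any HTTP client reproduces the cUrl requests shown below.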
### Read a Data Entity (concise)
This example reads the state of an existing **TemperatureSensor** entity with a known `id` and returns in concise
format.
#### :seven: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-d 'options=concise,sysAttrs'
```
#### Response:
TemperatureSensor `urn:ngsi-ld:TemperatureSensor:001` is returned as _concise_ NGSI-LD. Additional metadata is returned
because `options=sysAttrs` was included. By default the `@context` is returned in the payload body (although it would be
moved to a `Link` header through content negotiation if the `Accept: application/json` header had been set). The full
response is shown below:
```jsonld
{
"@context": "http://context/ngsi-context.jsonld",
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"createdAt": "2020-08-27T14:33:06Z",
"modifiedAt": "2020-08-27T14:33:10Z",
"category": {
"createdAt": "2020-08-27T14:33:06Z",
"modifiedAt": "2020-08-27T14:33:06Z",
"value": "sensor"
},
"temperature": {
"createdAt": "2020-08-27T14:33:06Z",
"modifiedAt": "2020-08-27T14:33:06Z",
"value": 25,
"unitCode": "CEL"
},
"batteryLevel": {
"value": 0.8,
"createdAt": "2020-08-27T14:33:10Z",
"modifiedAt": "2020-08-27T14:33:10Z",
"unitCode": "C62"
},
"controlledAsset": {
"object": "urn:ngsi-ld:Building:barn002",
"createdAt": "2020-08-27T14:33:10Z",
"modifiedAt": "2020-08-27T14:33:10Z"
}
}
```
Individual context data entities can be retrieved by making a GET request to the `/ngsi-ld/v1/entities/<entity>`
endpoint.
### Read an Attribute from a Data Entity
This example reads the value of a single attribute (`temperature`) from an existing **TemperatureSensor** entity with a
known `id`.
#### :eight: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-d 'attrs=temperature'
```
#### Response:
The sensor `urn:ngsi-ld:TemperatureSensor:001` is reading at 25°C. The response is shown below:
```jsonld
{
"@context": "http://context/ngsi-context.jsonld",
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"temperature": {
"value": 25,
"unitCode": "CEL"
}
}
```
Because `options=concise` was used, this response includes metadata such as `unitCode` but not
`"type": "Property"`. Context data can be retrieved by making a GET request to the `/ngsi-ld/v1/entities/<entity-id>`
endpoint and selecting the `attrs` using a comma separated list.
### Read a Data Entity (concise)
This example reads the concise NGSI-LD format from the context of an existing **TemperatureSensor** entity with a
known `id`.
#### :nine: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001' \
-H 'Link: <http://context/json-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/json' \
-d 'options=concise'
```
#### Response:
The sensor `urn:ngsi-ld:TemperatureSensor:001` is reading at 25°C. The response is shown below:
```json
{
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"value": 25,
"unitCode": "CEL"
},
"batteryLevel": {
"value": 0.8,
"unitCode": "C62"
},
"controlledAsset": {
"object": "urn:ngsi-ld:Building:barn002"
}
}
```
The response contains an unfiltered list of context data from an entity containing all of the attributes of the
`urn:ngsi-ld:TemperatureSensor:001`. The payload body does not contain an `@context` attribute since the
`Accept: application/json` header was set.
Combine the `options=concise` parameter with the `attrs` parameter to retrieve a limited set of key-value pairs.
### Read Multiple attributes values from a Data Entity
This example reads the value of two attributes (`category` and `temperature`) from the context of an existing
**TemperatureSensor** entity with a known `id`.
#### :one::zero: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001' \
-H 'Link: <http://context/json-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/json' \
-d 'options=concise' \
-d 'attrs=category,temperature'
```
#### Response:
The sensor `urn:ngsi-ld:TemperatureSensor:001` is reading at 25°C. The response is shown below:
```json
{
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"value": 25,
"unitCode": "CEL"
}
}
```
Combine the `options=concise` parameter and the `attrs` parameter to return a list of values.
### List all Data Entities (concise)
This example lists the full context of all **TemperatureSensor** entities.
#### :one::one: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-d 'type=TemperatureSensor' \
-d 'options=concise'
```
#### Response:
On start-up the context was empty, four **TemperatureSensor** entities have been added by create operations so the full
context will now contain four sensors.
```jsonld
[
{
"@context": "http://context/ngsi-context.jsonld",
"id": "urn:ngsi-ld:TemperatureSensor:004",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"value": 100,
"unitCode": "CEL"
}
},
{
"@context": "http://context/ngsi-context.jsonld",
"id": "urn:ngsi-ld:TemperatureSensor:002",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"value": 21,
"unitCode": "CEL"
}
},
{
"@context": "http://context/ngsi-context.jsonld",
"id": "urn:ngsi-ld:TemperatureSensor:003",
"type": "TemperatureSensor",
"category": "sensor",
"temperature": {
"type": "Property",
"value": 27,
"unitCode": "CEL"
}
},
{
"@context": "http://context/ngsi-context.jsonld",
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"batteryLevel": {
"value": 0.8,
"unitCode": "C62"
},
"category": "sensor",
"controlledAsset": {
"object": "urn:ngsi-ld:Building:barn002"
},
"temperature": {
"value": 25,
"unitCode": "CEL"
}
}
]
```
### List all Data Entities (filtered)
This example lists the `temperature` attribute of all **TemperatureSensor** entities in concise format.
#### :one::two: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/' \
-H 'Link: <http://context/json-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/json' \
-d 'type=TemperatureSensor' \
-d 'options=concise' \
-d 'attrs=temperature'
```
#### Response:
The full context contains four sensors; they are returned in random order:
```json
[
{
"id": "urn:ngsi-ld:TemperatureSensor:004",
"type": "TemperatureSensor",
"temperature": {
"value": 100,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:002",
"type": "TemperatureSensor",
"temperature": {
"value": 21,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:003",
"type": "TemperatureSensor",
"temperature": {
"value": 27,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"temperature": {
"value": 25,
"unitCode": "CEL"
}
}
]
```
Context data for a specified entity type can be retrieved by making a GET request to the `/ngsi-ld/v1/entities/`
endpoint and supplying the `type` parameter. Combine this with the `options=concise` parameter and the `attrs`
parameter to retrieve a filtered list of concise attribute values.
### Filter Data Entities by ID
This example lists selected data from two **TemperatureSensor** entities chosen by `id`. Note that every `id` must be
unique, so `type` is not required for this request. To filter by `id` add the entries in a comma delimited list.
#### :one::three: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/' \
-H 'Link: <http://context/json-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/json' \
-d 'id=urn:ngsi-ld:TemperatureSensor:001,urn:ngsi-ld:TemperatureSensor:002' \
-d 'attrs=temperature' \
-d 'options=concise'
```
#### Response:
The response details the selected attributes from the selected entities.
```json
[
{
"id": "urn:ngsi-ld:TemperatureSensor:002",
"type": "TemperatureSensor",
"temperature": {
"value": 21,
"unitCode": "CEL"
}
},
{
"id": "urn:ngsi-ld:TemperatureSensor:001",
"type": "TemperatureSensor",
"temperature": {
"value": 25,
"unitCode": "CEL"
}
}
]
```
### Returning data as GeoJSON
The concise format is also available for the GeoJSON format which can be requested by setting the `Accept` header to
`application/geo+json` and setting the `options=concise` parameter.
#### :one::four: Request:
```console
curl -G -iX GET 'http://localhost:1026/ngsi-ld/v1/entities/' \
-H 'Link: <http://context/json-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
-H 'Accept: application/geo+json' \
-H 'NGSILD-Tenant: openiot' \
-d 'id=urn:ngsi-ld:Animal:pig010,urn:ngsi-ld:Animal:pig006' \
-d 'options=concise'
```
#### Response:
The selected attributes from the selected entities are returned as a GeoJSON feature collection. The
`properties` section holds the data in concise format.
```json
{
"type": "FeatureCollection",
"features": [
{
"id": "urn:ngsi-ld:Animal:pig010",
"type": "Feature",
"properties": {
"type": "Animal",
"heartRate": {
"value": 71,
"providedBy": {
"object": "urn:ngsi-ld:Device:pig010"
},
"observedAt": "2022-03-01T15:49:57.039Z",
"unitCode": "5K"
},
"phenologicalCondition": "femaleAdult",
"reproductiveCondition": "active",
"name": "Carnation",
"legalID": "M-sow010-Carnation",
"sex": "female",
"species": "pig",
"location": {
"value": {
"type": "Point",
"coordinates": [13.346, 52.52]
},
"providedBy": {
"object": "urn:ngsi-ld:Device:pig010"
},
"observedAt": "2022-03-01T15:49:57.039Z"
}
},
"@context": "http://context/json-context.jsonld",
"geometry": {
"type": "Point",
"coordinates": [13.346, 52.52]
}
},
{
"id": "urn:ngsi-ld:Animal:pig006",
"type": "Feature",
"properties": {
"type": "Animal",
"heartRate": {
"value": 62,
"providedBy": {
"object": "urn:ngsi-ld:Device:pig006"
},
"observedAt": "2022-03-01T15:49:57.287Z",
"unitCode": "5K"
},
"phenologicalCondition": "femaleAdult",
"reproductiveCondition": "inCalf",
"name": "Peach",
"legalID": "M-sow006-Peach",
"sex": "female",
"species": "pig",
"location": {
"value": {
"type": "Point",
"coordinates": [13.347, 52.522]
},
"providedBy": {
"object": "urn:ngsi-ld:Device:pig006"
},
"observedAt": "2022-03-01T15:49:57.287Z"
}
},
"@context": "http://context/json-context.jsonld",
"geometry": {
"type": "Point",
"coordinates": [13.347, 52.522]
}
}
]
}
```
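Because the payload is standard GeoJSON, any GeoJSON-aware tooling can consume it directly. As a minimal sketch (pure Python, with the `FeatureCollection` above trimmed down to the fields of interest), the features can be reduced to a lookup from entity `id` to coordinates:

```python
import json

# Trimmed-down copy of the FeatureCollection returned above.
payload = """
{
  "type": "FeatureCollection",
  "features": [
    {"id": "urn:ngsi-ld:Animal:pig010", "type": "Feature",
     "properties": {"type": "Animal", "name": "Carnation"},
     "geometry": {"type": "Point", "coordinates": [13.346, 52.52]}},
    {"id": "urn:ngsi-ld:Animal:pig006", "type": "Feature",
     "properties": {"type": "Animal", "name": "Peach"},
     "geometry": {"type": "Point", "coordinates": [13.347, 52.522]}}
  ]
}
"""
collection = json.loads(payload)

# Map each entity id to its point coordinates.
locations = {
    feature["id"]: feature["geometry"]["coordinates"]
    for feature in collection["features"]
}
print(locations["urn:ngsi-ld:Animal:pig010"])  # [13.346, 52.52]
```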
## Update Operations
Overwrite operations are mapped to HTTP PATCH:
- The `/ngsi-ld/v1/entities/<entity-id>/attrs/<attribute>` endpoint is used to update an attribute
- The `/ngsi-ld/v1/entities/<entity-id>/attrs` endpoint is used to update multiple attributes
### Overwrite the value of an Attribute
This example updates the value of the `category` attribute of the Entity with `id=urn:ngsi-ld:TemperatureSensor:001`.
#### :one::five: Request:
```console
curl -iX PATCH 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001/attrs/category' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '{
"value": ["sensor", "actuator"]
}'
```
Existing attribute values can be altered by making a PATCH request to the
`/ngsi-ld/v1/entities/<entity-id>/attrs/<attribute>` endpoint. The appropriate `@context` should be supplied as a `Link`
header. The only difference between a normalized and concise payload is the missing `type` attribute.
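To make the difference concrete, here is a hypothetical helper (illustrative only, not part of any NGSI-LD library) that derives the concise form of a normalized Property by dropping the `type` marker and collapsing a bare `value`:

```python
def to_concise(attribute):
    """Return the concise form of a normalized NGSI-LD attribute.

    Drops the "type" marker; if "value" is the only key left, the
    attribute collapses to the bare value, as in the concise payloads
    shown in this tutorial.
    """
    concise = {k: v for k, v in attribute.items() if k != "type"}
    if set(concise) == {"value"}:
        return concise["value"]
    return concise

# A simple Property collapses to its value...
print(to_concise({"type": "Property", "value": ["sensor", "actuator"]}))
# ...while a Property with sub-attributes keeps its object form.
print(to_concise({"type": "Property", "value": 71, "unitCode": "5K"}))
```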
### Overwrite Multiple Attributes of a Data Entity
This example simultaneously updates the values of both the `category` and `controlledAsset` attributes of the Entity
with `id=urn:ngsi-ld:TemperatureSensor:001`.
#### :one::six: Request:
```console
curl -iX PATCH 'http://localhost:1026/ngsi-ld/v1/entities/urn:ngsi-ld:TemperatureSensor:001/attrs' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '{
"category": {
"value": [
"sensor",
"actuator"
]
},
"controlledAsset": {
"object": "urn:ngsi-ld:Building:barn001"
}
}'
```
### Batch Update Attributes of Multiple Data Entities
This example uses the convenience batch processing endpoint to update existing sensors.
#### :one::seven: Request:
```console
curl -iX POST 'http://localhost:1026/ngsi-ld/v1/entityOperations/upsert?options=update' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '[
{
"id": "urn:ngsi-ld:TemperatureSensor:003",
"type": "TemperatureSensor",
"category": [
"actuator",
"sensor"
]
},
{
"id": "urn:ngsi-ld:TemperatureSensor:004",
"type": "TemperatureSensor",
"category": [
"actuator",
"sensor"
]
}
]'
```
Batch processing uses the `/ngsi-ld/v1/entityOperations/upsert` endpoint. The payload body holds an array of the
entities and attributes we wish to update. The `options=update` parameter indicates we will not remove existing
attributes if they already exist and have not been included in the payload.
An alternative would be to use the `/ngsi-ld/v1/entityOperations/update` endpoint. Unlike `upsert`, the `update`
operation will not silently create any new entities - it fails if the entities do not already exist.
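The difference between the two batch operations can be sketched with a toy in-memory store (an illustration only — this is not broker code):

```python
def batch_upsert(store, entities):
    """Create-or-update: missing entities are silently created."""
    for entity in entities:
        store.setdefault(entity["id"], {}).update(entity)

def batch_update(store, entities):
    """Update-only: unknown entity ids are reported as errors."""
    errors = []
    for entity in entities:
        if entity["id"] not in store:
            errors.append(entity["id"])
        else:
            store[entity["id"]].update(entity)
    return errors

store = {"urn:ngsi-ld:TemperatureSensor:003": {"id": "urn:ngsi-ld:TemperatureSensor:003"}}
batch_upsert(store, [{"id": "urn:ngsi-ld:TemperatureSensor:004", "category": ["sensor"]}])
print(sorted(store))  # both 003 and 004 now exist
print(batch_update({}, [{"id": "urn:ngsi-ld:TemperatureSensor:005"}]))
# update refuses to create the missing entity and reports its id
```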
### Batch Replace Entity Data
This example uses the convenience batch processing endpoint to replace entity data of existing sensors.
#### :one::eight: Request:
```console
curl -iX POST 'http://localhost:1026/ngsi-ld/v1/entityOperations/update?options=replace' \
-H 'Content-Type: application/json' \
-H 'Link: <http://context/ngsi-context.jsonld>; rel="http://www.w3.org/ns/json-ld#context"; type="application/ld+json"' \
--data-raw '[
{
"id": "urn:ngsi-ld:TemperatureSensor:003",
"type": "TemperatureSensor",
"category":[
"actuator",
"sensor"
]
},
{
"id": "urn:ngsi-ld:TemperatureSensor:004",
"type": "TemperatureSensor",
"temperature": {
"value": 16,
"unitCode": "CEL"
"observedAt": "2022-03-01T15:00:00.000Z"
}
}
]'
```
Batch processing uses the `/ngsi-ld/v1/entityOperations/update` endpoint with the `options=replace` parameter, which
means that existing entities will be overwritten. `/ngsi-ld/v1/entityOperations/upsert` could also be used if new
entities also need to be created.
## Setting up concise Subscriptions
### Concise Notification
The concise format can also be used when generating a notification from a subscription. Simply set the
`"format": "concise"` within the `notification` element as shown:
#### :one::nine: Request:
```console
curl -X POST 'http://{{orion}}/ngsi-ld/v1/subscriptions/' \
-H 'Content-Type: application/ld+json' \
-H 'NGSILD-Tenant: openiot' \
--data-raw '{
"description": "Notify me of low feedstock on Farm:001",
"type": "Subscription",
"entities": [{"id": "urn:ngsi-ld:Animal:pig003", "type": "Animal"}],
"notification": {
"format": "concise",
"endpoint": {
"uri": "http://tutorial:3000/subscription/low-stock-farm001-ngsild",
"accept": "application/geo+json"
}
},
"@context": "http://context/ngsi-context.jsonld"
}'
```
Then go to the Device Monitor `http://localhost:3000/app/farm/urn:ngsi-ld:Building:farm001` and remove some hay from the
barn. Eventually a request is sent to `subscription/low-stock-farm001-ngsild` as shown:
#### `http://localhost:3000/app/monitor`
#### Subscription Payload:
```json
{
"id": "urn:ngsi-ld:Notification:6220b4e464f3729a8527f8a0",
"type": "Notification",
"subscriptionId": "urn:ngsi-ld:Subscription:6220b4a964f3729a8527f88c",
"@context": "http://context/ngsi-context.jsonld",
"notifiedAt": "2022-03-03T12:30:28.237Z",
"data": [
{
"id": "urn:ngsi-ld:Animal:pig003",
"type": "Animal",
"heartRate": {
"value": 67,
"unitCode": "5K",
"observedAt": "2022-03-03T12:30:27.000Z",
"providedBy": {
"object": "urn:ngsi-ld:Device:pig003"
}
},
"phenologicalCondition": "maleAdult",
"reproductiveCondition": "active",
"name": "Flamingo",
"legalID": "M-boar003-Flamingo",
"sex": "male",
"species": "pig",
"location": {
"value": {
"type": "Point",
"coordinates": [13.357, 52.513]
},
"observedAt": "2022-03-03T12:30:27.000Z",
"providedBy": {
"object": "urn:ngsi-ld:Device:pig003"
}
}
}
]
}
```
### Concise GeoJSON Notification
#### :two::zero: Request:
It is also possible to send GeoJSON notifications if the `"accept": "application/geo+json"` attribute is set. Combining
this with `"format": "concise"` results in a `FeatureCollection` with properties in concise format.
```console
curl -X POST 'http://{{orion}}/ngsi-ld/v1/subscriptions/' \
-H 'Content-Type: application/ld+json' \
-H 'NGSILD-Tenant: openiot' \
--data-raw '{
"description": "Notify me of low feedstock on Farm:001",
"type": "Subscription",
"entities": [{"id": "urn:ngsi-ld:Animal:pig003", "type": "Animal"}],
"notification": {
"format": "concise",
"endpoint": {
"uri": "http://tutorial:3000/subscription/low-stock-farm001-ngsild",
"accept": "application/geo+json"
}
},
"@context": "http://context/ngsi-context.jsonld"
}'
```
#### Subscription Payload:
The result of a concise GeoJSON notification can be seen below.
```json
{
"id": "urn:ngsi-ld:Notification:6220b50264f3729a8527f8ab",
"type": "Notification",
"subscriptionId": "urn:ngsi-ld:Subscription:6220b47764f3729a8527f886",
"notifiedAt": "2022-03-03T12:30:58.294Z",
"data": {
"type": "FeatureCollection",
"features": [
{
"id": "urn:ngsi-ld:Animal:pig003",
"type": "Feature",
"properties": {
"type": "Animal",
"heartRate": {
"value": 63,
"unitCode": "5K",
"observedAt": "2022-03-03T12:30:58.000Z",
"providedBy": {
"object": "urn:ngsi-ld:Device:pig003"
}
},
"phenologicalCondition": "maleAdult",
"reproductiveCondition": "active",
"name": "Flamingo",
"legalID": "M-boar003-Flamingo",
"sex": "male",
"species": "pig",
"location": {
"value": {
"type": "Point",
"coordinates": [13.357, 52.513]
},
"observedAt": "2022-03-03T12:30:58.000Z",
"providedBy": {
"object": "urn:ngsi-ld:Device:pig003"
}
}
},
"@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
"geometry": {
"value": {
"type": "Point",
"coordinates": [13.357, 52.513]
},
"observedAt": "2022-03-03T12:30:58.000Z",
"providedBy": {
"object": "urn:ngsi-ld:Device:pig003"
}
}
}
]
}
}
```
# Next Steps
Want to learn how to add more complexity to your application by adding advanced features? You can find out by reading
the other [tutorials in this series](https://ngsi-ld-tutorials.rtfd.io)
---
## License
[MIT](LICENSE) © 2022 FIWARE Foundation e.V.
# Cinema Reservation System
We are going to the cinema! But the reservation systems are down and the cinema officials don't let people in without reservations. So we offered to build them a new reservation system that will allow us to go and watch the newest **"Aliens came, we built transformers and destroyed them..."** movie.
## Problem 0 - The database
No complex stuff here. Just a few simple tables:
**Movies**
| id | name | rating |
| ------------- |:-------------| :---: |
|1|The Hunger Games: Catching Fire |7.9|
|2|Wreck-It Ralph|7.8|
|3|Her|8.3|
**Projections**
| id | movie_id | type | date | time |
| ---|----------|:----:| :--: | :--: |
|1|1|3D|2014-04-01|19:10
|2|1|2D|2014-04-01|19:00
|3|1|4DX|2014-04-02|21:00
|4|3|2D|2014-04-05|20:20
|5|2|3D|2014-04-02|22:00
|6|2|2D|2014-04-02|19:30
**Reservations**
| id | username | projection_id | row | col |
| ---|----------|---------------|:----:|:---:|
|1|RadoRado|1|2|1|
|2|RadoRado|1|3|5|
|3|RadoRado|1|7|8|
|4|Ivo|3|1|1|
|5|Ivo|3|1|2|
|6|Mysterious|5|2|3|
|7|Mysterious|5|2|4|
**Things to note**
* For each projection we assume the hall will be a 10x10 matrix.
* All data presented here is just an example. If you want, you can make up your own (perhaps you are the creator of the aforementioned movie and want to include it).
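A possible way to bootstrap these tables — a sketch using Python's built-in `sqlite3`, with table and column names taken from the examples above (the exact schema is up to you):

```python
import sqlite3

SCHEMA = """
CREATE TABLE movies (
    id     INTEGER PRIMARY KEY,
    name   TEXT NOT NULL,
    rating REAL NOT NULL
);
CREATE TABLE projections (
    id       INTEGER PRIMARY KEY,
    movie_id INTEGER NOT NULL REFERENCES movies(id),
    type     TEXT NOT NULL,
    date     TEXT NOT NULL,
    time     TEXT NOT NULL
);
CREATE TABLE reservations (
    id            INTEGER PRIMARY KEY,
    username      TEXT NOT NULL,
    projection_id INTEGER NOT NULL REFERENCES projections(id),
    "row"         INTEGER NOT NULL,
    col           INTEGER NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.executemany("INSERT INTO movies VALUES (?, ?, ?)", [
    (1, "The Hunger Games: Catching Fire", 7.9),
    (2, "Wreck-It Ralph", 7.8),
    (3, "Her", 8.3),
])
# Movies ORDERed BY rating, as show_movies demands.
names = [row[0] for row in conn.execute(
    "SELECT name FROM movies ORDER BY rating DESC")]
print(names[0])  # Her
```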
## Problem 1 - The CLI (Command-Line Interface)
We don't need no GUIs! A console with green text and transparent background is all a hacker requires.
Implement a python script called ```magic_reservation_system.py``` that takes magic commands and casts an appropriate spell. Here's an ancient page of Merlin's Book:
* On spell ```show_movies``` - print all movies ORDERed BY rating
* On spell ```show_movie_projections <movie_id> [<date>]``` - print all projections of a given movie for the given date (date is optional).
1. ORDER the results BY date
2. For each projection, show the total number of spots available.
* On spell ```make_reservation``` - It's showtime!
1. Make the hacker choose a name and number of tickets
2. Cast ```show_movies``` and make the hacker choose a movie by id
3. Cast ```show_movie_projections``` for the chosen ```<movie_id>``` and make the hacker choose a projection
* *If the available spots for a projection are less than the number of tickets needed, print an appropriate message and stay at step 3*;
4. Cast a spell to show all available spots for the chosen projection
5. For each number of tickets, make the hacker choose a tuple ```(row, col)```. Check for tuple validity (10x10 matrix) and availability (reservations table)
6. Cast a spell to show the info in an appropriate format. Then prompt for ```finalize``` spell
7. On ```finalize``` spell, save all the info and wish a happy cinema!
0. **At each step, allow for ```give_up``` spell to be cast. This...wait for it...waaaiit... gives up the reservation!!!** (Thanks, cap'n)
* On spell ```cancel_reservation <name>``` - disintegrate given person's reservation (**NOTE**: reservations cannot be so easily removed, but this is a magical system, after all)
* On spell ```exit``` - close Pandora's Box before it's too late.
* On spell ```help``` - show a list of learned spells
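One possible skeleton for the spell dispatcher (all names here are illustrative, not prescribed by the assignment):

```python
def cast(spell_line, spells):
    """Split a raw input line into a spell name + arguments and dispatch."""
    name, *args = spell_line.split()
    handler = spells.get(name)
    if handler is None:
        return "Unknown spell. Try 'help'."
    return handler(*args)

SPELLS = {
    "show_movies": lambda: "listing movies by rating",
    "cancel_reservation": lambda name: f"disintegrated {name}'s reservation",
}
print(cast("cancel_reservation RadoRado", SPELLS))
# disintegrated RadoRado's reservation
```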
**Things to note**
* Notice how you are required to reuse code (or you'll be one messy hacker!!!).
* Try not to build everything in one place.
* Make use of the following techniques (Merlin used them to destroy the Decepticons): **OOP, TDD, SQL**.
## Examples
### Show movies
```
> show_movies
Current movies:
[1] - The Hunger Games: Catching Fire (7.9)
[2] - Wreck-It Ralph (7.8)
[3] - Her (8.3)
```
### Show movie projections ###
```
> show_movie_projections 2
Projections for movie 'Wreck-It Ralph':
[5] - 2014-04-02 19:30 (2D)
[6] - 2014-04-02 22:00 (3D)
> show_movie_projections 1 2014-04-01
Projections for movie 'The Hunger Games: Catching Fire' on date 2014-04-01:
[1] - 19:00 (3D)
[2] - 19:10 (2D)
```
### Make a reservation
```
> make_reservation
Step 1 (User): Choose name>Tedi
Step 1 (User): Choose number of tickets> 2
Current movies:
[1] - The Hunger Games: Catching Fire (7.9)
[2] - Wreck-It Ralph (7.8)
[3] - Her (8.3)
Step 2 (Movie): Choose a movie> 2
Projections for movie 'Wreck-It Ralph':
[5] - 2014-04-02 19:30 (2D) - 98 spots available
[6] - 2014-04-02 22:00 (3D) - 100 spots available
Step 3 (Projection): Choose a projection> 5
Available seats (marked with a dot):
1 2 3 4 5 6 7 8 9 10
1 . . . . . . . . . .
2 . . X X . . . . . .
3 . . . . . . . . . .
4 . . . . . . . . . .
5 . . . . . . . . . .
6 . . . . . . . . . .
7 . . . . . . . . . .
8 . . . . . . . . . .
9 . . . . . . . . . .
10 . . . . . . . . . .
Step 4 (Seats): Choose seat 1> (2,3)
This seat is already taken!
Step 4 (Seats): Choose seat 1> (15, 16)
Lol...NO!
Step 4 (Seats): Choose seat 1> (7,8)
Step 4 (Seats): Choose seat 2> (7,7)
This is your reservation:
Movie: Wreck-It Ralph (7.8)
Date and Time: 2014-04-02 19:30 (2D)
Seats: (7,7), (7,8)
Step 5 (Confirm - type 'finalize') > finalize
Thanks.
```
# DISCLAIMER
The purpose of these tasks is to train your casting (programming) skills.
The purpose of these tasks is NOT to abide by the rules (we = hackers).
If you decide that something is not structured/explained/invented enough, feel free to discuss it with us!
Happy hacking!
# Wandering_Bosses
Created Tuesday 19 June 2018
### Enchanted
Ensorcelled Trussari
The Ensorcelled Trussari is a great boss to throw at players to introduce a story and give them pause: are their abilities really that valuable? What enemies do they face?
The Trussari begins the fight looking sick, but on the first hit against him, that vanishes. He stands tall, proud and angry, and his eyes begin fuming green smoke.
The Trussari has an Antimagic field that projects 10ft from him. He is a Fighter 7 Samurai, Monk 3 Kensei.
He uses his reaction defensively. The first rule of fighting, after all, is don't get hit. The second being hit.
He will always target casters first, starting with the healers, then the weakest looking one.
If the antimagic field is gone and dispel magic or greater restoration is cast on him, the spell is broken and he returns to his senses and sets off back toward manaharamu.
##### A Horrible Night to Have a Curse
A vampire
Many Minions
Spectator for guarding a single location
crawling claws random
Skeletons Random
Spectres Random NIGHT
Flameskulls for hallways
Gibbering Mouther Wave boss
Werewolf Wave boss
Peryton if things go outside
Players are brought into a large building by a being and told that a vampire waits and watches the castle, but cannot enter without permission.
**Events run as follows:**
When the players start, it is mid-afternoon, so they have some time to prepare.
After 2 hours,
---
title: 'Tutorial: Configure Promapp for automatic user provisioning in Azure Active Directory | Microsoft Docs'
description: Learn how to configure Azure Active Directory to automatically provision and de-provision user accounts to Promapp.
services: active-directory
documentationcenter: ''
author: zchia
writer: zchia
manager: beatrizd
ms.assetid: 41190ba0-9032-4725-9d2c-b3dd9c7786e6
ms.service: active-directory
ms.subservice: saas-app-tutorial
ms.workload: identity
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 11/11/2019
ms.author: Zhchia
ms.openlocfilehash: 1a4a27846196e7eb134b834647b2a2d65672f385
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 03/27/2020
ms.locfileid: "77061005"
---
# <a name="tutorial-configure-promapp-for-automatic-user-provisioning"></a>Tutorial: Konfigurieren von Promapp für die automatische Benutzerbereitstellung
In diesem Tutorial werden die Schritte erläutert, die in Promapp und Azure Active Directory (Azure AD) ausgeführt werden müssen, um Azure AD zum automatischen Bereitstellen und Aufheben der Bereitstellung von Benutzern und/oder Gruppen in Promapp zu konfigurieren.
> [!NOTE]
> In diesem Tutorial wird ein Connector beschrieben, der auf dem Benutzerbereitstellungsdienst von Azure AD basiert. Wichtige Details zum Zweck und zur Funktionsweise dieses Diensts sowie häufig gestellte Fragen finden Sie unter [Automatisieren der Bereitstellung und Bereitstellungsaufhebung von Benutzern für SaaS-Anwendungen mit Azure Active Directory](../app-provisioning/user-provisioning.md).
>
> Dieser Connector befindet sich derzeit in der Public Preview-Phase. Weitere Informationen zu den allgemeinen Nutzungsbedingungen von Microsoft Azure für Previewfunktionen finden Sie unter [Zusätzliche Nutzungsbestimmungen für Microsoft Azure-Vorschauen](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
## <a name="prerequisites"></a>Voraussetzungen
Das diesem Tutorial zu Grunde liegende Szenario setzt voraus, dass Sie bereits über die folgenden Voraussetzungen verfügen:
* Einen Azure AD-Mandanten
* [Einen Promapp-Mandanten](https://www.promapp.com/licensing/)
* Ein Benutzerkonto in Promapp mit Administratorberechtigungen
## <a name="assigning-users-to-promapp"></a>Zuweisen von Benutzern zu Promapp
Azure Active Directory ermittelt anhand von *Zuweisungen*, welche Benutzer Zugriff auf bestimmte Apps erhalten sollen. Im Kontext der automatischen Benutzerbereitstellung werden nur die Benutzer und/oder Gruppen synchronisiert, die einer Anwendung in Azure AD zugewiesen wurden.
Vor dem Konfigurieren und Aktivieren der automatischen Benutzerbereitstellung müssen Sie entscheiden, welche Benutzer und/oder Gruppen in Azure AD Zugriff auf Promapp benötigen. Anschließend können Sie diese Benutzer bzw. Gruppen Promapp wie folgt zuweisen:
* [Zuweisen eines Benutzers oder einer Gruppe zu einer Unternehmens-App](../manage-apps/assign-user-or-group-access-portal.md)
## <a name="important-tips-for-assigning-users-to-promapp"></a>Wichtige Tipps zum Zuweisen von Benutzern zu Promapp
* Es wird empfohlen, Promapp einen einzelnen Azure AD-Benutzer zuzuweisen, um die Konfiguration der automatischen Benutzerbereitstellung zu testen. Später können weitere Benutzer und/oder Gruppen zugewiesen werden.
* Beim Zuweisen eines Benutzers zu Promapp müssen Sie im Dialogfeld für die Zuweisung eine gültige anwendungsspezifische Rolle auswählen (sofern verfügbar). Benutzer mit der Rolle **Standardzugriff** werden von der Bereitstellung ausgeschlossen.
## <a name="setup-promapp-for-provisioning"></a>Einrichten von Promapp für die Bereitstellung
1. Melden Sie sich bei Ihrer [Promapp-Verwaltungskonsole](https://freetrial.promapp.com/axelerate/Login.aspx) an. Navigieren Sie unter dem Benutzernamen zu **Mein Profil**.

2. Klicken Sie unter **Zugriffstoken** auf die Schaltfläche **Token erstellen**.

3. Geben Sie im Feld **Beschreibung** einen beliebigen Namen an, und wählen Sie im Dropdownmenü **Bereich** die Option **Scim** aus. Klicken Sie auf das Symbol „Speichern“.

4. Kopieren Sie das Zugriffstoken, und speichern Sie es, weil Sie es nur zu diesem Zeitpunkt anzeigen können. Dieser Wert wird im Azure-Portal auf der Registerkarte „Bereitstellung“ der Promapp-Anwendung im Feld „Geheimes Token“ eingegeben.

## <a name="add-promapp-from-the-gallery"></a>Hinzufügen von Promapp aus dem Katalog
Bevor Sie Promapp für die automatische Benutzerbereitstellung mit Azure AD konfigurieren, müssen Sie Promapp aus dem Azure AD-Anwendungskatalog zu Ihrer Liste verwalteter SaaS-Anwendungen hinzufügen.
**Führen Sie die folgenden Schritte aus, um Promapp aus dem Azure AD-Anwendungskatalog hinzuzufügen:**
1. Wählen Sie im **[Azure-Portal](https://portal.azure.com)** im linken Navigationsbereich **Azure Active Directory** aus.

2. Navigieren Sie zu **Unternehmensanwendungen**, und wählen Sie die Option **Alle Anwendungen**.

3. Klicken Sie oben im Bereich auf die Schaltfläche **Neue Anwendung**, um eine neue Anwendung hinzuzufügen.

4. Geben Sie im Suchfeld **Promapp** ein, wählen Sie im Ergebnisbereich **Promapp** aus, und klicken Sie dann auf die Schaltfläche **Hinzufügen**, um die Anwendung hinzuzufügen.

## <a name="configuring-automatic-user-provisioning-to-promapp"></a>Konfigurieren der automatischen Benutzerbereitstellung in Promapp
In diesem Abschnitt werden die Schritte zum Konfigurieren des Azure AD-Bereitstellungsdiensts zum Erstellen, Aktualisieren und Deaktivieren von Benutzern bzw. Gruppen in Promapp auf der Grundlage von Benutzer- oder Gruppenzuweisungen in Azure AD erläutert.
> [!TIP]
> Sie können auch das SAML-basierte einmalige Anmelden für Promapp aktivieren. Befolgen Sie dazu die Anweisungen im [SSO-Tutorial zu Promapp](https://docs.microsoft.com/azure/active-directory/saas-apps/promapp-tutorial). Einmaliges Anmelden kann unabhängig von der automatischen Benutzerbereitstellung konfiguriert werden, obwohl diese beiden Features einander ergänzen.
### <a name="to-configure-automatic-user-provisioning-for-promapp-in-azure-ad"></a>So konfigurieren Sie die automatische Benutzerbereitstellung für Promapp in Azure AD
1. Melden Sie sich beim [Azure-Portal](https://portal.azure.com) an. Wählen Sie **Unternehmensanwendungen** und dann **Alle Anwendungen**.

2. Wählen Sie in der Anwendungsliste **Promapp**aus.

3. Wählen Sie die Registerkarte **Bereitstellung**.

4. Legen Sie den **Bereitstellungsmodus** auf **Automatisch** fest.

5. Geben Sie im Abschnitt **Administratoranmeldeinformationen** im Feld **Mandanten-URL** die Zeichenfolge `https://api.promapp.com/api/scim` ein. Geben Sie den Wert des **SCIM-Authentifizierungstokens** ein, den Sie zuvor unter **geheimes Token** abgerufen haben. Klicken Sie auf **Verbindung testen**, um sicherzustellen, dass Azure AD eine Verbindung mit Promapp herstellen kann. Vergewissern Sie sich im Falle eines Verbindungsfehlers, dass Ihr Promapp-Konto über Administratorberechtigungen verfügt, und versuchen Sie es noch einmal.

6. Geben Sie im Feld **Benachrichtigungs-E-Mail** die E-Mail-Adresse einer Person oder einer Gruppe ein, die Benachrichtigungen zu Bereitstellungsfehlern erhalten soll, und aktivieren Sie das Kontrollkästchen **Bei Fehler E-Mail-Benachrichtigung senden**.

7. Klicken Sie auf **Speichern**.
8. Wählen Sie im Abschnitt **Zuordnungen** die Option **Azure Active Directory-Benutzer mit Promapp synchronisieren** aus.

9. Überprüfen Sie im Abschnitt **Attributzuordnungen** die Benutzerattribute, die von Azure AD mit Promapp synchronisiert werden. Die als **übereinstimmende** Eigenschaften ausgewählten Attribute werden für den Abgleich der Benutzerkonten in Promapp für Aktualisierungsvorgänge verwendet. Wählen Sie die Schaltfläche **Speichern**, um alle Änderungen zu übernehmen.

11. Wenn Sie Bereichsfilter konfigurieren möchten, lesen Sie die Anweisungen unter [Attributbasierte Anwendungsbereitstellung mit Bereichsfiltern](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).
12. Um den Azure AD-Bereitstellungsdienst für Promapp zu aktivieren, ändern Sie den **Bereitstellungsstatus** im Abschnitt **Einstellungen** in **Ein**.

13. Legen Sie die Benutzer und/oder Gruppen fest, die in Promapp bereitgestellt werden sollen, indem Sie im Abschnitt **Einstellungen** unter **Bereich** die gewünschten Werte auswählen.

14. Wenn Sie fertig sind, klicken Sie auf **Speichern**.

Dadurch wird die Erstsynchronisierung aller Benutzer und/oder Gruppen gestartet, die im Abschnitt **Einstellungen** unter **Bereich** definiert sind. Die Erstsynchronisierung dauert länger als nachfolgende Synchronisierungen, die ungefähr alle 40 Minuten erfolgen, solange der Azure AD-Bereitstellungsdienst ausgeführt wird. Im Abschnitt **Synchronisierungsdetails** können Sie den Fortschritt überwachen und Links zu Berichten zur Bereitstellungsaktivität aufrufen. Darin sind alle Aktionen aufgeführt, die vom Azure AD-Bereitstellungsdienst in Promapp ausgeführt werden.
Weitere Informationen zum Lesen von Azure AD-Bereitstellungsprotokollen finden Sie unter [Tutorial: Meldung zur automatischen Benutzerkontobereitstellung](../app-provisioning/check-status-user-account-provisioning.md).
## <a name="additional-resources"></a>Zusätzliche Ressourcen
* [Verwalten der Benutzerkontobereitstellung für Unternehmens-Apps](../app-provisioning/configure-automatic-user-provisioning-portal.md)
* [Was bedeuten Anwendungszugriff und einmaliges Anmelden mit Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
## <a name="next-steps"></a>Nächste Schritte
* [Erfahren Sie, wie Sie Protokolle überprüfen und Berichte zu Bereitstellungsaktivitäten abrufen.](../app-provisioning/check-status-user-account-provisioning.md)
# Skiff Core based on Gentoo
This is a skiff core setup for amd64, arm, arm64 which compiles Gentoo from
source for use within a core container.
The different architectures currently have varying levels of support,
particularly for graphical and desktop packages.
## Pull Pre-built Image
The build process downloads a Gentoo Stage3 and runs portage to upgrade and
install packages.
You can skip the majority of the build process by downloading a pre-built image:
```sh
systemctl stop skiff-core
docker pull skiffos/skiff-core-gentoo:latest
```
Some architectures may not be available in the Docker Hub image.
# Duke Institute for Brain Sciences Methods Meetings
We're a group of students, employees, and faculty at Duke University working in psychology & neuroscience that hold bi-weekly workshops on the latest developments in statistical methods, programming tools, and professional development practices.
### Want to become a member?
Contact Kevin O'Neill at <kevin.oneill@duke.edu>!
<strong>Project</strong>: fabric<br><strong>Branch</strong>: master<br><strong>ID</strong>: 22229<br><strong>Subject</strong>: FAB-10344 Make metrics tests use ephemeral ports<br><strong>Status</strong>: MERGED<br><strong>Owner</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Assignee</strong>:<br><strong>Created</strong>: 5/23/2018, 4:27:49 PM<br><strong>LastUpdated</strong>: 5/23/2018, 6:27:36 PM<br><strong>CommitMessage</strong>:<br><pre>FAB-10344 Make metrics tests use ephemeral ports
The ports used by the metrics test (such as 8081 and 8025) are sometimes
bound by other services. Since the tests never actually connects to the
service, there's no requirement that these ports be specified, only that
they be different, so ephemeral ports are the better choice.
Change-Id: I52f9ddaac0435c50e49dccf64309a132ae933fb7
Signed-off-by: Jason Yellick <jyellick@us.ibm.com>
</pre><h1>Comments</h1><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 5/23/2018, 4:27:49 PM<br><strong>Message</strong>: <pre>Uploaded patch set 1.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:32:04 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-build-checks-x86_64/2015/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:38:27 PM<br><strong>Message</strong>: <pre>Patch Set 1: F2-DocBuild+1 F1-VerifyBuild+1
Succeeded, Run SmokeTest</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:38:59 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-build-checks-x86_64/2015/ : SUCCESS (skipped)
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-build-checks-x86_64/2015</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:40:41 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-smoke-tests-x86_64/1370/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:41:13 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Starting smoke tests</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:51:23 PM<br><strong>Message</strong>: <pre>Patch Set 1: F2-SmokeTest+1
Succeeded, Run UnitTest, Run IntegrationTest</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:53:21 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-unit-tests-x86_64/2225/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:53:46 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Starting unit tests</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:53:49 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-integration-tests-x86_64/17/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:54:25 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Starting Integration tests</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 4:58:02 PM<br><strong>Message</strong>: <pre>Patch Set 1: F3-IntegrationTest-1
Failed</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 5:07:31 PM<br><strong>Message</strong>: <pre>Patch Set 1: F3-UnitTest+1
Succeeded</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 5:07:58 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-integration-tests-x86_64/17/ : FAILURE (skipped)
No problems were identified. If you know why this problem occurred, please add a suitable Cause for it. ( https://jenkins.hyperledger.org/job/fabric-verify-integration-tests-x86_64/17/ )
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-integration-tests-x86_64/17
https://jenkins.hyperledger.org/job/fabric-smoke-tests-x86_64/1370/ : SUCCESS (skipped)
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-smoke-tests-x86_64/1370
https://jenkins.hyperledger.org/job/fabric-verify-unit-tests-x86_64/2225/ : SUCCESS (skipped)
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-verify-unit-tests-x86_64/2225</pre><strong>Reviewer</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Reviewed</strong>: 5/23/2018, 5:40:34 PM<br><strong>Message</strong>: <pre>Patch Set 1: F3-IntegrationTest+1 Code-Review+2</pre><strong>Reviewer</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Reviewed</strong>: 5/23/2018, 5:40:37 PM<br><strong>Message</strong>: <pre>Removed F3-IntegrationTest-1 by Hyperledger Jobbuilder <jobbuilder@jenkins.hyperledger.org>
</pre><strong>Reviewer</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Reviewed</strong>: 5/23/2018, 5:58:02 PM<br><strong>Message</strong>: <pre>Patch Set 1: Code-Review+2</pre><strong>Reviewer</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Reviewed</strong>: 5/23/2018, 5:58:07 PM<br><strong>Message</strong>: <pre>Change has been successfully merged by Binh Nguyen</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 6:00:28 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-merge-x86_64/3903/ (1/2)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 6:00:50 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-merge-end-2-end-x86_64/2574/ (2/2)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 5/23/2018, 6:27:36 PM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Successful
https://jenkins.hyperledger.org/job/fabric-merge-x86_64/3903/ : SUCCESS (skipped)
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-merge-x86_64/3903
https://jenkins.hyperledger.org/job/fabric-merge-end-2-end-x86_64/2574/ : SUCCESS (skipped)
Logs: https://logs.hyperledger.org/production/vex-yul-hyp-jenkins-3/fabric-merge-end-2-end-x86_64/2574</pre><h1>PatchSets</h1><h3>PatchSet Number: 1</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 5/23/2018, 4:27:49 PM<br><strong>GitHubMergedRevision</strong>: [5c104943b6f931d25e43dff471dd1c346c187334](https://github.com/hyperledger-gerrit-archive/fabric/commit/5c104943b6f931d25e43dff471dd1c346c187334)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 5/23/2018, 4:38:27 PM<br><strong>Type</strong>: F1-VerifyBuild<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 5/23/2018, 4:38:27 PM<br><strong>Type</strong>: F2-DocBuild<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 5/23/2018, 4:51:23 PM<br><strong>Type</strong>: F2-SmokeTest<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 5/23/2018, 5:07:31 PM<br><strong>Type</strong>: F3-UnitTest<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Approved</strong>: 5/23/2018, 5:58:02 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>MergedBy</strong>: Binh Nguyen<br><strong>Merged</strong>: 5/23/2018, 5:58:07 PM<br><br><strong>Approver</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Approved</strong>: 5/23/2018, 5:40:34 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Approved</strong>: 
5/23/2018, 5:40:34 PM<br><strong>Type</strong>: F3-IntegrationTest<br><strong>Value</strong>: 1<br><br></blockquote> | 133.231884 | 2,110 | 0.763407 | kor_Hang | 0.448103 |
fd8296cb4f2b7054ddbc03a2d787bd5b78d29126 | 27,709 | md | Markdown | README.md | stefaanc/psconsole | a02f581fe042309b7716b0b6d10b3aafdb8273d7 | [
"MIT"
] | 1 | 2021-02-18T20:01:34.000Z | 2021-02-18T20:01:34.000Z | README.md | stefaanc/psconsole | a02f581fe042309b7716b0b6d10b3aafdb8273d7 | [
"MIT"
] | null | null | null | README.md | stefaanc/psconsole | a02f581fe042309b7716b0b6d10b3aafdb8273d7 | [
"MIT"
] | 1 | 2019-07-23T08:53:33.000Z | 2019-07-23T08:53:33.000Z | # PSConsole
**some scripts to work with different Windows PowerShell profiles and to initialize a Windows PowerShell console**
If you are like me, working on a couple of projects (or project versions) at the same time, it is a real hassle to go and change the PowerShell profile settings in `~\Documents\WindowsPowerShell`. So I needed a better system to do this.
My requirements are:
- When starting PowerShell from anywhere within a project, pick-up the project's profile
- When starting Powershell from outside a project, pick-up a default profile
- if I defined a default project, the profile for that project
- if I didn't, a generic profile
Also, it is not really easy to find out how to change the colors for the PowerShell console. There are an enormous amount of web-pages on this subject, lots of different methods to try. Different methods for different versions of PowerShell. I needed to find the best way for me.
My requirements are:
- Changing the colors of the PowerShell console
- Changing the font used in the PowerShell console
- Changing the window title for different projects (or project versions)
- Changing the window title and the prompt when running as 'Administrator'
- Changing the window title and the background color when working with different servers, VMs, clusters,...
## Project Profiles
An overview of the profile structure
```text
HOME
|
|-- Documents
| |-- WindowsPowerShell
| |-- console.json
| |-- profile.ps1 # <<<<<<<<<< master profile
| | # (incl. default-master profile)
| |-- scripts
| |-- Apply-PSConsoleSettings.ps1
|
|-- Projects
|-- .psprofile.ps1 # <<<<<<<<<< default-project profile
|
|-- project-1
| |-- .psconsole.json
| |-- .psprofile.ps1 # <<<<<<<<<< project profile
| |
| |-- scripts
| |-- Apply-PSConsoleSettings.ps1
|
|-- project-2
| |-- .psconsole.json
| |-- .psprofile.ps1 # <<<<<<<<<< project profile
| |
| |-- scripts
| |-- Apply-PSConsoleSettings.ps1
.
.
.
|-- project-N
|-- .psconsole.json
|-- .psprofile.ps1 # <<<<<<<<<< project profile
|
|-- scripts
|-- Apply-PSConsoleSettings.ps1
```
### The master profile
You can find the master profile in the folder `~\Documents\WindowsPowerShell` in a file called `profile.ps1`.
> :information_source:
> There are actually a number of profiles on your machine. We are using the `CurrentUserAllHosts` profile here. For more info, see https://docs.microsoft.com/en-gb/powershell/module/microsoft.powershell.core/about/about_profiles, but documentation doesn't always seem to line-up with what we see in a PowerShell console.
If the folder or file is not there, you should create it. You can pick up our profile from the `scripts` folder in this project (`@HOME-Documents-WindowsPowerShell_profile.ps1`) and rename it `profile.ps1`. Remark that you may have to change the marked line at the end, pointing to the default-project profile.
> :warning:
> you may also need to copy the corresponding `console.json` used by the `Apply-PSConsoleSettings` script in our profile. More info [here](#the-default-master-profile).
We want to run a different profile depending on the current directory.
- when you open Powershell from the start menu, it will open in a default directory (usually `C:\Windows\System32`)
- when you start PowerShell from a command prompt, it will open in the current directory of the Command Prompt
- when you type `powershell` in the address bar of Windows Explorer, it will open in the current Windows Explorer directory
- when you start PowerShell from a `PowerShell Here` context menu, the current directory will depend on the place where you opened the menu.
Assuming you don't use the `-NoProfile` option, PowerShell will always start executing the master profile.
- The master profile we are using, will look in the current directory, to see if there is a `.psprofile.ps1` file it can run.
- If there is none, it will look in the first parent, grand-parent, ..., up to the user's home directory, for a `.psprofile.ps1` file it can run.
- When it finds one, it will set the `$ROOT` variable to this path and run the profile found.
- When it doesn't find one, it will run the default-project profile and sets the `$ROOT` variable.
- If that also fails, we run the default-master profile and set the `$ROOT` variable to the home directory.
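The lookup described above can be sketched in a few lines of PowerShell. This is a simplified illustration of the logic, not the literal master profile; the `~\Projects\.psprofile.ps1` fallback path is the author's default-project location from the tree shown earlier.

```powershell
# Walk from the current directory up to $HOME, looking for a .psprofile.ps1 to run.
$dir = ( Get-Location ).Path
$found = $false
while ( $dir ) {
    if ( Test-Path "$dir\.psprofile.ps1" ) {
        $global:ROOT = $dir
        . "$dir\.psprofile.ps1"     # run the project profile we found
        $found = $true
        break
    }
    if ( $dir -eq $HOME ) { break } # stop at the user's home directory
    $dir = Split-Path $dir -Parent  # move up to the parent directory
}
if ( -not $found ) {
    if ( Test-Path "$HOME\Projects\.psprofile.ps1" ) {
        # fall back to the default-project profile
        . "$HOME\Projects\.psprofile.ps1"
    }
    else {
        # fall back to the default-master profile settings
        $global:ROOT = $HOME
    }
}
```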
### The project profile
Most of my projects use a profile that's very similar to the `@HOME-Projects-myproject_.profile.ps1` in the `scripts` folder of this project.
- We set `$ROOT` to the project directory
  - if you pick up our script, you will need to adapt the path for `$ROOT`.
- We add the `$ROOT\scripts` folder to `$env:PATH`
- If we are not already somewhere under the project root-directory, we set the current directory to the project root-directory
- We apply settings to the PowerShell console (see further)
- for this, you need to copy the `Apply-PSConsoleSettings.ps1` script in the `scripts` folder of this project, and put it under the `$ROOT\scripts` folder in your project.
  - you also need to copy the `@HOME-Projects-myproject_.psconsole.json` file in the `scripts` folder of this project to your root folder, and rename it `.psconsole.json`
Our project-profile looks something like
```powershell
Set-Variable HOME "$env:USERPROFILE" -Scope Global -Force
( Get-PSProvider 'FileSystem' ).Home = $HOME # replace "~"
$global:ROOT = "$HOME\Projects\psconsole"
$env:PATH = "$ROOT\scripts;$env:PATH"
if ( -not ( Get-Location ).Path.StartsWith("$ROOT") ) {
Set-Location "$ROOT"
}
Apply-PSConsoleSettings "PSCONSOLE"
```
> :warning:
> The `$HOME` variable doesn't always seem to be correctly populated, hence we set it in this profile. We experienced a lot of issues with this on different computers and with different versions of PowerShell (probably because of a lack of understanding?). You can find more info about the `$HOME` variable here: https://docs.microsoft.com/en-gb/powershell/module/microsoft.powershell.core/about/about_automatic_variables, but the documentation doesn't always seem to line-up with what we see in a PowerShell console.
>
> - By default we use `"$env:USERPROFILE"`. Although this is in my experience the most reliable way, this variable may not be correctly set on your machine.
> - An alternative (this is what the PowerShell documentation states as the method it uses "automatically") is to use `"$env:HOMEDRIVE$env:HOMEPATH"`.
> - You can look for other variables in your environment (or add them) to help you create the correct `$HOME` variable.
> - You can look for solutions on the web.
> - Perhaps you need to update some registry items?
> - Ultimately, as a last resort, you can hard-code your home-path as a string.
> :bulb:
> We copy the `Apply-PSConsoleSettings` to every project. This avoids issues when the script gets updated for a new project while scripts and corresponding profiles of old projects are not updated.
### The default-project profile
I like to put all my projects in a `~/Projects` folder so that is where I put my `.psprofile.ps1` file, but you can put it anywhere else and give it any name you want. You can pick up our profile from the `scripts` folder in this project (`@Projects_.psprofile.ps1`), rename it and adapt it.
- If you are using our master profile, update the marked line at the end of the master profile.
The job of the default-project profile is to select a default project and run its profile. In our sample default-project profile, we list all our projects in comments in the script, and uncomment the project we consider our "default" at the moment (only uncomment projects that have a project-profile). This makes it very easy to switch "default" projects.
```powershell
. ~/Projects/psconsole/.psprofile.ps1
#. ~/Projects/kluster/.psprofile.ps1
#. ~/Projects/steps/.psprofile.ps1
#. ~/Projects/globals/.psprofile.ps1 #<<< !!! has no project-profile at the moment
```
### The default-master profile
The last section in the master-profile is the default-master profile. You can put whatever you want in there. We use something similar to the project profiles in our default-master profile
- We set `$ROOT` to the home directory
- We add the `~\Documents\WindowsPowerShell\scripts` folder to `$env:PATH`
- If we are not already somewhere under the home directory, we set the current directory to the home directory
- We apply settings to the PowerShell console (see further)
- for this, you need to copy the `Apply-PSConsoleSettings.ps1` script in the `scripts` folder of this project, and put it under the `~\Documents\WindowsPowerShell\scripts` folder on your machine.
- you also need to copy the `@HOME-Documents-WindowsPowerShell_console.json` file in the `scripts` folder of this project to your `~\Documents\WindowsPowerShell` folder, and rename it `console.json`
Our default master profile looks something like
```powershell
$global:ROOT = $HOME
$env:PATH = "$ROOT\Documents\WindowsPowerShell\scripts;$env:PATH"
if ( -not ( Get-Location ).Path.StartsWith("$ROOT") ) {
Set-Location "$ROOT"
}
Apply-PSConsoleSettings
```
<br/>
## Console Properties
There are lots of properties that can be changed, but we will focus on colors:
- The color palette used by the console
- The foreground and background colors of the console
- The colors of the output to various streams
- The colors used for syntax-highlighting
- The colors of the prompt
A couple of other properties we want to adapt
- The title of the PowerShell console window, including an indication when we are running in elevated mode
- An indication in the prompt when we are running in elevated mode
- The window size
- The font and font-size in the console
To control all of this, we use a script `Apply-PSConsoleSettings.ps1` in combination with a JSON configuration file. You can find the default configuration for a project in `$ROOT\.psconsole.json`.
Some of these properties cannot be changed by a script like `Apply-PSConsoleSettings.ps1`, but can be changed in the registry (which we prefer not to do) or in the properties of a shortcut to PowerShell:
- The color palette used by the console
Finally, some of these properties can be both changed from within a console or in the shortcut to a console. We elected to set the following in the properties of a shortcut:
- The window size
### The color palette
To understand the sometimes "unexpected" results when changing colors, we need to cover some of the basics of color palettes.
The color palette consists of 16 colors. Default color-values are defined in a color-table in the registry (`ColorTable00` .. `ColorTable15`). This can be overridden in the properties of a shortcut to the PowerShell application (`.lnk` file), hence colors can be different depending on how you start the console.

> :warning:
> If you change colors of the `Defaults`, you will be changing color-table for the PowerShell application in the registry, so the new colors will apply to all instances of Powershell. If you change the colors in the `Properties`, the changes are only applied to instances of PowerShell that are started via the specific shortcut where you changed the properties (overriding the `Defaults`).
> More info on the hierarchy of loaded settings can be found here: https://devblogs.microsoft.com/commandline/understanding-windows-console-host-settings/

- The color palette shown here is the color palette associated to the `Properties` of the shortcut in the start menu.
- The colors of the color-table in the registry correspond to the colors in the `Defaults` and `Properties` from left to right.
> :bulb:
> The PowerShell shortcut can usually be found in the Start Menu: `"$env:APPDATA\Microsoft\Windows\Start Menu\Programs\Windows PowerShell\Windows PowerShell.lnk"` (or with the `(x86)` suffix for the 32-bit version).
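You can inspect the registry color-table yourself. The sketch below reads the `ColorTable*` values from the current user's console settings and decodes each DWORD, which is stored as `0x00BBGGRR`; the exact values present on your machine may differ.

```powershell
# Read the 16 ColorTable values from HKCU:\Console and
# decode each 0x00BBGGRR DWORD into an (R,G,B) triple.
$console = Get-ItemProperty -Path 'HKCU:\Console'
0..15 | ForEach-Object {
    $name  = 'ColorTable{0:D2}' -f $_
    $value = $console.$name
    if ( $null -ne $value ) {
        $r = $value -band 0xFF
        $g = ( $value -shr 8 ) -band 0xFF
        $b = ( $value -shr 16 ) -band 0xFF
        '{0}: ({1},{2},{3})' -f $name, $r, $g, $b
    }
}
```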
To illustrate the difference between the color palettes when starting Powershell in different ways, below, we started the console from the Run app

- One of the immediate differences you can observe is the much less saturated yellow color of the `[Administrator]` indication in the prompt, compared to the earlier screenshot.
And the corresponding color properties

- When we started PowerShell from the Run app, we started the `.exe` application, not the `.lnk` shortcut on the Start Menu. This is using the default color-table for `powershell.exe` from the registry instead of the properties of a shortcut.
> :information_source:
> The colors of your PowerShell.exe console may be different than what's shown here. This is because the default console colors were recently changed to a dimmer version ( https://devblogs.microsoft.com/commandline/updating-the-windows-console-colors/ ), so your installation may still be using the legacy colors in the registry color-tables.
An overview
ColorTable | Foreground ANSI/VT100 | Background ANSI/VT100 | PowerShell Name | Legacy Console | New Console | PowerShell Color
---------------|----------|-----------|---------------|---------------|---------------|--------------
`ColorTable00` | `` `e[30m `` | `` `e[40m `` | `Black` | (0,0,0) | (12,12,12) | -
`ColorTable01` | `` `e[34m `` | `` `e[44m `` | `DarkBlue` | (0,0,128) | (0,55,218) | -
`ColorTable02` | `` `e[32m `` | `` `e[42m `` | `DarkGreen` | (0,128,0) | (19,161,14) | -
`ColorTable03` | `` `e[36m `` | `` `e[46m `` | `DarkCyan` | (0,128,128) | (58,150,221) | -
`ColorTable04` | `` `e[31m `` | `` `e[41m `` | `DarkRed` | (128,0,0) | (197,15,31) | -
`ColorTable05` | `` `e[35m `` | `` `e[45m `` | `DarkMagenta` | (128,0,128) | (136,23,152) | (1,36,86)
`ColorTable06` | `` `e[33m `` | `` `e[43m `` | `DarkYellow` | (128,128,0) | (193,156,0) | (238,237,240)
`ColorTable07` | `` `e[37m `` | `` `e[47m `` | `Gray` (`DarkWhite`) | (192,192,192) | (204,204,204) | -
`ColorTable08` | `` `e[90m `` | `` `e[100m `` | `DarkGray` (`LightBlack`) | (128,128,128) | (118,118,118) | -
`ColorTable09` | `` `e[94m `` | `` `e[104m `` | `Blue` | (0,0,255) | (59,120,255) | -
`ColorTable10` | `` `e[92m `` | `` `e[102m `` | `Green` | (0,255,0) | (22,198,12) | -
`ColorTable11` | `` `e[96m `` | `` `e[106m `` | `Cyan` | (0,255,255) | (97,214,214) | -
`ColorTable12` | `` `e[91m `` | `` `e[101m `` | `Red` | (255,0,0) | (231,72,86) | -
`ColorTable13` | `` `e[95m `` | `` `e[105m `` | `Magenta` | (255,0,255) | (180,0,158) | -
`ColorTable14` | `` `e[93m `` | `` `e[103m `` | `Yellow` | (255,255,0) | (249,241,165) | -
`ColorTable15` | `` `e[97m `` | `` `e[107m `` | `White` | (255,255,255) | (242,242,242) | -
- The ColorTable names are the names as you will find them in the registry.
- The ANSI/VT100 codes are the codes you can use to change colors in strings. Remark that this only works on newer versions of PowerShell. The `` `e `` escape code will also only work on newer versions of PowerShell, but can be replaced by `$e = [char]27` or `$e = [char]0x1B`.
- In Powershell scripts, you don't use the names `ColorTable00`, etc... to specify colors. Instead you use: `Black`, `DarkBlue`, ..., `White`.
- The `Screen Text` and `Screen Background` have been changed from the default console colors, specifically for the PowerShell console.
- As written before, the default console colors have recently been redefined ( https://devblogs.microsoft.com/commandline/updating-the-windows-console-colors/ ).
- Colors you see may be completely different than what is expected from the name. For instance the links to PowerShell in the Start Menu redefine `DarkMagenta` and `DarkYellow`. Also the PowerShell `.lnk` shortcut in the Start Menu is using the legacy console colors, while the PowerShell `.exe` application is using the new console colors.
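You can try the ANSI/VT100 codes from the table directly. A small sketch, using the `[char]27` form so it also works on older PowerShell versions (the console must support VT sequences for the colors to render):

```powershell
$e = [char]27   # ESC; on newer PowerShell versions you can write "`e" instead
Write-Host "$e[33mDarkYellow text$e[0m and $e[93mYellow text$e[0m"
Write-Host "$e[45mtext on a DarkMagenta background$e[0m"
Write-Host "$e[97;44mWhite text on a DarkBlue background$e[0m"
# `e[0m resets all attributes back to the console defaults
```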
### The colors of the console
> :warning:
> From Windows 10 build 1809 onward, the `PSReadline` module was upgraded from version 1.2 to a 2.0.0-beta version. This beta version causes a lot of issues that cannot be worked around. To make the following work, you **MUST** downgrade the `PSReadLine` module to version 1.2
>
> - check if you have version 2.0.0
>
> ```powershell
> Get-Module PSReadLine
> ```
>
> gives
>
> ```text
> ModuleType Version Name ExportedCommands
> ---------- ------- ---- ----------------
> Script 2.0.0 PSReadLine {Get-PSReadlineKeyHandler, Get-PSReadlineOption, Remove-PS...
> ```
>
> - run PowerShell as administrator, and execute
>
> ```powershell
> Install-Module -Name PSReadLine -RequiredVersion 1.2 -SkipPublisherCheck
> ```
>
> - delete `C:\Program Files\WindowsPowerShell\Modules\PSReadline\2.0.0`
> - check
>
> ```powershell
> Get-InstalledModule PSReadLine
> ```
>
> gives
>
> ```text
> Version Name Repository Description
> ------- ---- ---------- -----------
> 1.2 PSReadLine PSGallery Great command line editing in the Powe...
> ```
As explained in the [previous section](#the-color-palette), the colors of the console are defined in the registry and in the properties of a shortcut to the PowerShell application. We don't really want to change the registry because that changes colors for every instance of the application, so we are going to focus on creating shortcuts, and change the colors of these shortcuts.
- `New-Shortcut` creates a new shortcut.
```powershell
New-Shortcut "$ROOT\scripts\my-powershell.lnk" -TargetPath "powershell.exe"
```
  - remark that the name of the shortcut should be the **full** path to the shortcut, and the extension `.lnk` is optional.
- have a look at the top of the script in the `scripts` folder for more parameters, f.i. `-Arguments` and `-WorkingDirectory`
- `Get-Shortcut` is used by the other scripts. It gets an object to modify the properties of a shortcut.
> :information_source:
> This is the age-old `Get-Link` you find in a lot of posts on this subject. The link provided in most of these posts doesn't work anymore. Luckily, Neil Pankey (and friends) did find and preserve this code ( https://github.com/neilpa/cmd-colors-solarized/blob/master/Get-Link.ps1 ) - Thanks Neil :relieved:
- `Set-ShortcutColors`
```powershell
Set-ShortcutColors "$ROOT/playground/my-powershell.lnk" -Theme "$ROOT/colors/psconsole-powershell-legacy.json"
```
themes are a json file, f.i. the `psconsole-powershell-legacy.json` theme
```javascript
{
"console": {
"ScreenTextColor": 6,
"ScreenBackgroundColor": 5,
"PopupTextColor": 3,
"PopupBackgroundColor": 15,
"ConsoleColors": [
"#000000",
"#000080",
"#008000",
"#008080",
"#800000",
"#012456",
"#eeedf0",
"#c0c0c0",
"#808080",
"#0000ff",
"#00ff00",
"#00ffff",
"#ff0000",
"#ff00ff",
"#ffff00",
"#ffffff"
]
}
}
```
- We developed a number of color-schemes based on the very popular `Solarized` theme, but with a couple of modifications, and added more choices in background colors. You can read more on the `Colorized` themes [here](/docs/color-schemes.md)
- We also added a couple based on the traditional Powershell background color.
> :warning:
> Remark that you need to run `Apply-PSConsoleSettings` in your profile with a `.psconsole.json` file, to make sure that the above palette colors are correctly mapped on the colors for Powershell elements (stream output, syntax tokens and prompt).
- `Set-ShortcutWindowSize` sets the size of the windows and screen-buffer
```powershell
Set-ShortcutWindowSize "$ROOT/scripts/my-powershell.lnk" -Width 120 -Height 50 -ScreenBufferHeight 5000
```
- `Write-ColorizedColors` shows the colors of the color-scheme of the current shell.
- `Test-ColorizedColors` creates a folder `$ROOT/playground`, and in that folder a powershell-shortcut for every color-scheme.


You can see a couple more results [here](/docs/color-schemes.md).
> :bulb:
> We made some [conversion tables to convert names between different systems](/docs/color-names.md).
### The colors of the output to streams
> :warning:
> In this and the following sections, we will use the `Apply-PSConsoleSettings` script to set the colors.
> For this to work, you **MUST** copy the `@HOME-Projects-myproject_.psconsole.json` file to your home directory and rename it to `.psconsole.json`, **AND** do the same for any project folders where you want to use this script in your project-profile.
The `Apply-PSConsoleSettings` script is used to map palette colors onto the colors of the output to the different PowerShell streams. The mapping is defined in the `.psconsole.json` file in the `$ROOT` folder. This file **MUST** have three sections:
```javascript
{
"UserColorScheme": {
// ... mapping of 'Colorized' color-schemes on PowerShell elements in a normal user console
},
"AdminColorScheme": {
// ... mapping of 'Colorized' color-schemes on PowerShell elements in an administrator console
},
"LegacyColorScheme": {
// ... mapping of 'powershell-legacy' and '-campbell' color-schemes on PowerShell elements in any console
}
}
```
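Under the hood, a script like `Apply-PSConsoleSettings` can push the `Stream*` mappings onto the console host through `$Host.PrivateData` and `$Host.UI.RawUI`. A minimal sketch of that idea, with the JSON parsing simplified (the property names are the console host's own; the scheme selection logic is an assumption):

```powershell
# Load the mapping and apply the stream colors to the console host.
$settings = Get-Content "$ROOT\.psconsole.json" -Raw | ConvertFrom-Json
$scheme   = $settings.LegacyColorScheme          # or UserColorScheme / AdminColorScheme

$pd = $Host.PrivateData
$pd.ErrorForegroundColor    = $scheme.StreamErrorForegroundColor
$pd.ErrorBackgroundColor    = $scheme.StreamErrorBackgroundColor
$pd.WarningForegroundColor  = $scheme.StreamWarningForegroundColor
$pd.WarningBackgroundColor  = $scheme.StreamWarningBackgroundColor
$pd.DebugForegroundColor    = $scheme.StreamDebugForegroundColor
$pd.DebugBackgroundColor    = $scheme.StreamDebugBackgroundColor
$pd.VerboseForegroundColor  = $scheme.StreamVerboseForegroundColor
$pd.VerboseBackgroundColor  = $scheme.StreamVerboseBackgroundColor
$pd.ProgressForegroundColor = $scheme.StreamProgressForegroundColor
$pd.ProgressBackgroundColor = $scheme.StreamProgressBackgroundColor

# The "Output" stream maps to the console screen colors themselves:
$Host.UI.RawUI.ForegroundColor = $scheme.StreamOutputForegroundColor
$Host.UI.RawUI.BackgroundColor = $scheme.StreamOutputBackgroundColor
```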
The mapping of foreground and background colors for streams is defined by properties `"Stream*"`, for instance for the `"LegacyColorScheme"`
```javascript
"StreamOutputForegroundColor": "DarkYellow",
"StreamOutputBackgroundColor": "DarkMagenta",
"StreamErrorForegroundColor": "Red",
"StreamErrorBackgroundColor": "Black",
"StreamWarningForegroundColor": "Yellow",
"StreamWarningBackgroundColor": "Black",
"StreamDebugForegroundColor": "Yellow",
"StreamDebugBackgroundColor": "Black",
"StreamVerboseForegroundColor": "Yellow",
"StreamVerboseBackgroundColor": "Black",
"StreamProgressForegroundColor": "Yellow",
"StreamProgressBackgroundColor": "DarkCyan",
```
### The colors of the syntax-highlighting
The mapping of foreground and background colors for syntax-highlighting is defined by properties `"Syntax*"`, for instance for the `"LegacyColorScheme"`
```javascript
"SyntaxCommandForegroundColor": "Yellow",
"SyntaxCommandBackgroundColor": "DarkMagenta",
"SyntaxCommentForegroundColor": "DarkGreen",
"SyntaxCommentBackgroundColor": "DarkMagenta",
"SyntaxContinuationPromptForegroundColor": "DarkYellow",
"SyntaxContinuationPromptBackgroundColor": "DarkMagenta",
"SyntaxDefaultForegroundColor": "DarkYellow",
"SyntaxDefaultBackgroundColor": "DarkMagenta",
"SyntaxEmphasisForegroundColor": "Cyan",
"SyntaxEmphasisBackgroundColor": "DarkMagenta",
"SyntaxErrorForegroundColor": "Red",
"SyntaxErrorBackgroundColor": "DarkMagenta",
"SyntaxKeywordForegroundColor": "Green",
"SyntaxKeywordBackgroundColor": "DarkMagenta",
"SyntaxMemberForegroundColor": "White",
"SyntaxMemberBackgroundColor": "DarkMagenta",
"SyntaxNumberForegroundColor": "White",
"SyntaxNumberBackgroundColor": "DarkMagenta",
"SyntaxOperatorForegroundColor": "DarkGray",
"SyntaxOperatorBackgroundColor": "DarkMagenta",
"SyntaxParameterForegroundColor": "DarkGray",
"SyntaxParameterBackgroundColor": "DarkMagenta",
"SyntaxSelectionForegroundColor": "DarkMagenta",
"SyntaxSelectionBackgroundColor": "DarkYellow",
"SyntaxStringForegroundColor": "DarkCyan",
"SyntaxStringBackgroundColor": "DarkMagenta",
"SyntaxTypeForegroundColor": "Gray",
"SyntaxTypeBackgroundColor": "DarkMagenta",
"SyntaxVariableForegroundColor": "Green",
"SyntaxVariableBackgroundColor": "DarkMagenta",
```
### The colors of the prompt
The mapping of foreground and background colors for the prompt is defined by properties `"Prompt*"`, for instance for the `"LegacyColorScheme"`
```javascript
"PromptForegroundColor": "DarkYellow",
"PromptBackgroundColor": "DarkMagenta"
```
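With PSReadLine 1.2, the `Syntax*` mappings are typically applied with `Set-PSReadlineOption` and its `-TokenKind` parameter. A sketch with a few of the token kinds from the `"LegacyColorScheme"` example (the full set includes `Command`, `Comment`, `Keyword`, `Member`, `Number`, `Operator`, `Parameter`, `String`, `Type` and `Variable`):

```powershell
# Apply a few of the "Syntax*" colors from the LegacyColorScheme example.
Set-PSReadlineOption -TokenKind Command  -ForegroundColor Yellow    -BackgroundColor DarkMagenta
Set-PSReadlineOption -TokenKind Comment  -ForegroundColor DarkGreen -BackgroundColor DarkMagenta
Set-PSReadlineOption -TokenKind String   -ForegroundColor DarkCyan  -BackgroundColor DarkMagenta
Set-PSReadlineOption -TokenKind Variable -ForegroundColor Green     -BackgroundColor DarkMagenta

# The continuation prompt has its own parameters:
Set-PSReadlineOption -ContinuationPromptForegroundColor DarkYellow `
                     -ContinuationPromptBackgroundColor DarkMagenta
```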
### The window title and prompt
> :warning:
> In this section, we will use the `Apply-PSConsoleSettings` script to set the window title and the prompt.
> For this to work, you **MUST** copy the `@HOME-Projects-myproject_.psconsole.json` file to your home directory and rename it to `.psconsole.json`, **AND** do the same for any project folders where you want to use this script in your project-profile.
To add the project name to the window title - for instance "PSCONSOLE"
```powershell
Apply-PSConsoleSettings "PSCONSOLE"
```
This gives

- remark that the color of the prompt is also dimmed - this will be discussed below.
When running PowerShell as an administrator, an indication of this is added to the title and prompt

If you don't like what we did with the prompt, you can disable the changes to the prompt
```powershell
Apply-PSConsoleSettings "PSCONSOLE" -NoPrompt
```
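Detecting elevation and adjusting the window title and prompt can be sketched as follows. The `$global:IsAdminConsole` variable name and the exact `[Administrator]` texts are our illustration choices, not something `Apply-PSConsoleSettings` mandates:

```powershell
# Check whether the current process is running elevated.
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal( $identity )
$global:IsAdminConsole = $principal.IsInRole(
    [Security.Principal.WindowsBuiltInRole]::Administrator )

# Window title: project name plus an elevation indication.
$title = "PSCONSOLE"
if ( $global:IsAdminConsole ) { $title = "[Administrator] $title" }
$Host.UI.RawUI.WindowTitle = $title

# Prompt: prepend the same indication.
function global:prompt {
    if ( $global:IsAdminConsole ) {
        Write-Host "[Administrator] " -NoNewline -ForegroundColor Yellow
    }
    "PS $( Get-Location )> "
}
```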
### The font of the console
> :warning:
> In this section, we will use the `Apply-PSConsoleSettings` script to set the font and font-size.
> For this to work, you **MUST** copy the `@HOME-Projects-myproject_.psconsole.json` file to your home directory and rename it to `.psconsole.json`, **AND** do the same for any project folders where you want to use this script in your project-profile.
The font settings are defined in the `.psconsole.json` file in the `$ROOT` folder. This file **MAY** have:
```javascript
{
"Font": "Lucida Console",
"FontSize": 14
}
```
`"Lucida Console"` is the default value for the font, `14` is the default value for the font-size.
> :bulb:
> The available fonts are specified in the registry under 'HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Console\TrueTypeFont'. If the font you want is not there, you can add it, but there are limitations on the fonts that can be added.
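You can list the fonts currently registered for the console with a few lines of PowerShell. A sketch (the entries vary per machine):

```powershell
# List the TrueType fonts registered for the console host.
$key = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Console\TrueTypeFont'
( Get-ItemProperty -Path $key ).PSObject.Properties |
    Where-Object { $_.Name -notlike 'PS*' } |     # drop the PS* provider properties
    ForEach-Object { '{0} = {1}' -f $_.Name, $_.Value }
```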
<br/>
## For Further Investigation
- Monitor evolution of the `PSReadLine` version 2.0.0 module
fd82a3038f9bca9006409481bb6bf3e56bcf329e | 820 | md | Markdown | Today_I_Learnd/200512-TIL.md | VincentGeranium/TIL | 9d29ffca5e1de73ad8a163eef86946942e198f82 | [
"MIT"
] | 5 | 2019-04-18T12:27:16.000Z | 2019-09-04T04:59:11.000Z | Today_I_Learnd/200512-TIL.md | VincentGeranium/TIL | 9d29ffca5e1de73ad8a163eef86946942e198f82 | [
"MIT"
] | null | null | null | Today_I_Learnd/200512-TIL.md | VincentGeranium/TIL | 9d29ffca5e1de73ad8a163eef86946942e198f82 | [
"MIT"
] | 2 | 2021-01-09T10:20:39.000Z | 2021-01-12T08:42:11.000Z | # Today I Learned
---
- [Struct Q&A (Answers to Struct Questions) Post](https://vincentgeranium.github.io/ios,/swift/2020/05/12/Struct-1.html)
- [Inheritance - Overriding Property Observers Summary](https://vincentgeranium.github.io/ios,/swift/2020/05/12/basicSyntax-1.html)
- [Inheritance - Overriding Subscript Summary](https://vincentgeranium.github.io/ios,/swift/2020/05/12/basicSyntax-2.html)
- [Inheritance - Preventing Overrides (final) Summary](https://vincentgeranium.github.io/ios,/swift/2020/05/12/basicSyntax-3.html)
- [Inheritance - Class Initializer, Designated Initializer, Convenience Initializer](https://vincentgeranium.github.io/ios,/swift/2020/05/12/basicSyntax-4.html)
- Watched the KxCoding Memo Project lecture and worked on the project
---
# Digital land - Developer Contributions by Local Authority
This directory contains stub data by local authority representing files that will be published
according to [this proposed common data format](https://digital-land.github.io/project/developer-contributions/).
The intention is for local authorities to publish these files on their own websites.
---
title: Registering and unregistering VSPackages | Microsoft Docs
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-sdk
ms.topic: conceptual
helpviewer_keywords:
- registration, VSPackages
- VSPackages, registering
ms.assetid: e25e7a46-6a55-4726-8def-ca316f553d6b
caps.latest.revision: 36
ms.author: gregvanl
manager: jillfra
ms.openlocfilehash: 1f6bc85fb00c15831dcf1a9f64e4b886272df218
ms.sourcegitcommit: 94b3a052fb1229c7e7f8804b09c1d403385c7630
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 04/23/2019
ms.locfileid: "68193821"
---
# <a name="registering-and-unregistering-vspackages"></a>Registering and unregistering VSPackages
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
Use attributes to register a VSPackage.
## <a name="registering-a-vspackage"></a>Registering a VSPackage
You can use attributes to control the registration of managed VSPackages. All the registration information is contained in a .pkgdef file. For more information about file-based registration, see the [CreatePkgDef utility](../extensibility/internals/createpkgdef-utility.md).
The following code shows how to use the standard registration attributes to register your VSPackage.
```csharp
[PackageRegistration(UseManagedResourcesOnly = true)]
[Guid("0B81D86C-0A85-4f30-9B26-DD2616447F95")]
public sealed class BasicPackage : Package
{. . .}
```
## <a name="unregistering-an-extension"></a>Unregistering an extension
If you have been experimenting with a lot of different VSPackages and want to remove them from the experimental instance, you can just run the **Reset** command. Look for **Reset the Visual Studio Experimental Instance** on your computer's start page, or run this command from the command line:
```vb
<location of Visual Studio 2015 install>\"Microsoft Visual Studio 14.0\VSSDK\VisualStudioIntegration\Tools\Bin\CreateExpInstance.exe" /Reset /VSInstance=14.0 /RootSuffix=Exp
```
If you want to uninstall an extension that you installed in your development instance of Visual Studio, go to **Tools / Extensions and Updates**, find the extension, and click **Uninstall**.
If for some reason neither of these methods succeeds in uninstalling the extension, you can unregister the VSPackage assembly from the command line like this:
```
<location of Visual Studio 2015 install>\"Microsoft Visual Studio 14.0\VSSDK\VisualStudioIntegration\Tools\Bin\regpkg" /unregister <pathToVSPackage assembly>
```
## <a name="see-also"></a>See also
[VSPackages](../extensibility/internals/vspackages.md)
# I ♥ the palette from Designmodo's Flat UI.
...but I was always having to hunt around for the HEX codes.
So I made [this tool](https://collegeman.github.io/flat-ui-colors).
[https://collegeman.github.io/flat-ui-colors](https://collegeman.github.io/flat-ui-colors)
Twenty colors * seven shades = 140 colors = everything you need for beauty-through-color.
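As a rough sketch of the idea - derive several shades from one base color - this might look like the following (illustrative only; the actual 140 palette values come from the tool itself):

```python
def hex_to_rgb(color):
    """Convert '#rrggbb' to an (r, g, b) tuple of ints."""
    color = color.lstrip('#')
    return tuple(int(color[i:i + 2], 16) for i in (0, 2, 4))

def shades(base, count=7):
    """Blend a base hex color toward black to produce `count` shades."""
    r, g, b = hex_to_rgb(base)
    result = []
    for i in range(count):
        factor = 1 - i * 0.1  # each step is 10% darker
        result.append('#%02x%02x%02x' % (round(r * factor),
                                         round(g * factor),
                                         round(b * factor)))
    return result

# 20 base colors * 7 shades each = 140 swatches
print(shades('#1abc9c'))  # turquoise-like base color used as an example
```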
# Who is Designmodo?
Designmodo ([on the web](http://designmodo.com) and [on GitHub](https://github.com/designmodo))
is a New York-based company that creates awesome commercial User Interface Packs,
including the highly-rated [Flat UI](http://designmodo.com/flat/)—a Bootstrap theme that includes
some additional (and useful) components.
It was through Designmodo that I encountered this amazing palette:

# How to contribute
Want a feature? Find something wrong with the tool? Please fork and do a pull
request with your fixes/changes. This isn't a high priority project for me,
but I'll do my best to incorporate anything that makes sense.
# License
It's MIT. Come and get it.
---
title: "Random Hacks of Kindness"
date: 2010-10-29 17:34:29
year: 2010
---
<img title="RHoK" src="{{site.github.url}}/files/rhok_1288310261848-300x69.png" alt="RHoK" width="300" height="69" />
Random Hacks of Kindness 2.0 is in Toronto on December 4-5, 2010, and we'd like you to take part. As <a href="http://textontechs.com/2010/10/random-hacks-of-kindness-toronto/">Heather Leson says</a>:
<blockquote><em>RHoK is a community of developers, geeks and tech-savvy do-gooders around the world, working to develop software solutions that respond to the challenges facing humanity today. RHoK is all about using technology to make the world a better place by building a community of innovation. RHoK brings software engineers together with disaster relief experts to identify critical global challenges, and develop software to respond to them. A RHoK Hackathon event brings together the best and the brightest hackers from around the world, who volunteer their time to solve real-world problems.</em></blockquote>
RHoK Toronto will run from 9:00 am on December 4, 2010 to 8:00 pm on December 5 (yes, overnight), in Rooms 2015, 2016, 2019, and 2020 of the Sidney Smith building at 100 St. George Avenue on the University of Toronto's downtown campus. <a href="http://www.rhok.org/events/rhok-2/toronto-canada/">Registration is now open</a>: we need hackers, storytellers, software engineers, programmers, university students, marketers, web content creators, emergency planners,international policy and development students, teachers, librarians, videographers, event planners, organizers, project managers, and someone who knows how to calm Miles down after his fourth cup of coffee. I know the timing isn't great for students, who will be in the middle of the death march leading up to final projects and exams, but come on, it'll be <em>fun</em>, and we'll make the world a slightly better place.
---
title: "Online Workshop on SPSS (Statistics for Non-Statisticians) - Series 3"
# Schedule page publish date
publishDate: "2021-03-02T00:00:00Z"
# event date
date: "2021-04-03T15:27:17+06:00"
# post save as draft
draft: false
# page title background image
bg_image: "images/backgrounds/page-title.jpg"
# meta description
description : "Data Entry & Screening, Descriptive Statistics"
# Event image
image: "images/events/workshop_spss_2021.jpg"
# location
location: "Virtual on ZOOM"
# entry fee
fee: "RM30"
# apply url
apply_url : "#"
# event speaker
speaker:
# speaker loop
- name : "Mdm Norin Rahayu Shamsuddin"
image : "images/teachers/norin_rahayu.jpeg"
designation : "Teacher"
# type
type: "event"
---
### About Event

This is an online workshop held virtually on Zoom, the first in a series of three. In this session, a hands-on, step-by-step walkthrough of discriminant and factor analysis will be performed. In addition, Madam Norin Rahayu will also demonstrate how to produce and edit charts and pivot tables.
For inquiry, please contact:
**Dr. Afida Ahmad** (Registration): 60 17 517 5881
**Shahida Farhan Zakaria** (Payment): 60 12 318 9670
---
layout: page
title: Heinrich's Project Portfolio Page
---
## Project: HelloFile
HelloFile is a file management application created as an extension to AddressBook - Level 3 (AB3),
specifically made for tech savvy CS2103T CS students.
By using HelloFile, students can tag frequently used files/folders with a short nickname, and open their files
with intuitive commands.
Given below are my contributions to the project.
* **New Feature**: Created the model of the app, including the `Tag`, `Label`, `FileAddress`, and `CurrentPath` classes.
(Pull request [\#81](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/81))
  * What it does: gives developers a strong architecture on which to continue the app's development.
  * Justification: This feature improves the product significantly because these models are key to the continuation of the app.
  * Highlights: This enhancement affected the AddressBook Level 3 code base as it stood at that point, so a lot of modification and refactoring was done during that period.
  * Credits: This enhancement was built upon the AddressBook Level 3 model classes, with some classes being modified and some being removed.
* **Code contributed**: [RepoSense link](https://nus-cs2103-ay2021s1.github.io/tp-dashboard/#breakdown=true&search=&sort=groupTitle&sortWithin=title&since=2020-08-14&timeframe=commit&mergegroup=&groupSelect=groupByRepos&checkedFileTypes=docs~functional-code~test-code~other&tabOpen=true&tabType=authorship&zFR=false&tabAuthor=HynRidge&tabRepo=AY2021S1-CS2103T-F12-1%2Ftp%5Bmaster%5D&authorshipIsMergeGroup=false&authorshipFileTypes=docs~functional-code~test-code)
* **Project management**:
* Keep track of the milestone's deadline on each iteration.
* **Enhancements to existing features**:
  * Updated the HelpCommand to show the details of a specific command (Pull request [\#186](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/186))
  * Updated the FindCommand to be case-insensitive (Pull request [\#207](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/207))
* **Documentation**:
* User Guide (UG):
* Added documentation for the features `label` and `find` [\#169](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/169)
* Added `help COMMAND` in Features section [\#210](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/210)
* Added screenshots in the User Interface section. [\#212](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/212)
* Added some screenshots for the Basic Workflow section. [\#214](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/214)
* Developer Guide(DG):
* Added description for features `Tag` class and `Label` class [\#164](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/164)
* Updated User Stories section [\#173](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/173).
* **Community**:
* PRs reviewed (with non-trivial review comments):
[\#162](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/162),
[\#206](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/206),
[\#213](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/213),
[\#269](https://github.com/AY2021S1-CS2103T-F12-1/tp/pull/269),
* Contributed to forum discussions (examples: [1](https://github.com/nus-cs2103-AY2021S1/forum/issues/82),
[2](https://github.com/nus-cs2103-AY2021S1/forum/issues/70),
[3](https://github.com/nus-cs2103-AY2021S1/forum/issues/19),
[4](https://github.com/nus-cs2103-AY2021S1/forum/issues/18))
* Reported bugs and suggestions for other teams in the class (examples: [1](https://github.com/HynRidge/ped/issues/5),
[2](https://github.com/HynRidge/ped/issues/4),
[3](https://github.com/HynRidge/ped/issues/3),
[4](https://github.com/HynRidge/ped/issues/2))
## \[Optional\] Contributions to the Developer Guide (Extract)
### User stories
Priorities: High (must have) - `* * *`, Medium (nice to have) - `* *`, Low (unlikely to have) - `*`
| Priority | As a … | I want to … | So that I can… |
| -------- | ---------------------------------------------------------------| ------------------------------------------------------| --------------------------------------------------------------|
| `* * *`  | Student with lots of files | tag my files with an easy-to-remember tag | get the file path |
| `* * *` | First time user | use a help command | start to remember how to use the command |
| `* * *` | Student who prefers to type | use typing to interact with my file system | use keyboard as much as possible |
| `* * *` | Student who is familiar with command line applications | name my files | access the file easily next time |
| `* * *` | CS student with many categories of files | categorise my files and folders | easily manage my files and search files based on categories |
| `* * *` | First time user | use help command | start to remember how to use the command |
| `* * *` | Student with lots of files | see a list of my tags | find the tag that I created easily |
| `* * *` | Developer | open files with a quick command | focus on coding and not look to find my files |
| `* *` | CS student with a lot of projects | hide private contact details | minimize the chance of someone else seeing them by accident |
| `* *` | Command line user | use commands similar to Linux | use the similar Linux command without having to relearn |
| `*` | Forgetful user who always forgets where his files are located | tag frequently used files with an easy-to-remember tag | locate my files easily |
| `*` | Intermediate user | delete file | not be distracted by it. |
## \[Optional\] Contributions to the User Guide (Extract)
### Finding a tag : `find`
Finds a tag by its keyword (can be tag name and/or label).
Format: `find KEYWORD`
Examples:
* `find notes`
# test-devops-ansible
Ansible scripts for test-devops
## Requirements
- [**Ansible**](https://www.ansible.com/)
Change your host and the default.yml in `ansible`, and then just play.
```shell
# Install requirements
$ ansible-galaxy install -r requirements.yml
$ ansible-playbook -i hosts site.yml
```
This playbook updates the dev site for every new commit and updates the staging site for every new tag. It also adds a crontab job to update the site every N (default 5) minutes.
You can access the site at your URL for now. The defaults:
- dev: [http://dev.kaleo.run](http://dev.kaleo.run)
- staging: [http://staging.kaleo.run](http://staging.kaleo.run)
## Insult-Generator
Takes a string of words and randomly generates a Shakespearean insult. This program is excellent to use for [Vicious Mockery](https://www.dndbeyond.com/spells/vicious-mockery) in [Dungeons and Dragons](https://en.wikipedia.org/wiki/Dungeons_%26_Dragons)!

### Insult-Generator-TTS
Does the same as Insult-Generator.py, and also uses the pyttsx3 module to speak the insult.
### What I learned
* String manipulation without the use of the string library.
* Creating multiple lists from one master string, and sorting them using different indexes.
* Appending lists in an if tree for each element in a list.
* Generating random numbers for selecting the element in a list.
* Exposure to the 'pyttsx3' and 're' modules.
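The core idea described above - pick one random word from each column of a master word list and assemble them - can be sketched like this (my own illustration with sample data, not the repository's actual code or word lists):

```python
import random

# Three columns of Shakespearean-flavored words (sample data only)
FIRST = ["artless", "bawdy", "craven"]
SECOND = ["base-court", "bat-fowling", "beef-witted"]
THIRD = ["apple-john", "barnacle", "bladder"]

def vicious_mockery(rng=random):
    """Assemble an insult from one randomly chosen word per column."""
    return "Thou {} {} {}!".format(rng.choice(FIRST),
                                   rng.choice(SECOND),
                                   rng.choice(THIRD))

print(vicious_mockery())
```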
### Going forward
* The 'pyttsx3' quickly enables projects to incorporate speech components and I will definitely use this package again if I need that functionality.
* The 're' module is quite cryptic at first look, but with some more use, I can see it becoming useful for string data manipulation.
## License
This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details.
- **MySQL**
  - [MySQL Common Syntax](/MySQL/MySQL常见语法.md)
  - [MySQL Indexes](/MySQL/MySQL索引.md)
  - [MySQL FAQ](/MySQL/MySQL常见问题.md)
# utils
Common utils required for projects
audhumla-hakyll
===============
# SUDOKU using pygame
Forked this repository and customized it to make an algorithm that helps me check if I am on the right path while solving a sudoku, and also check what the next move is.
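The "am I on the right path" check boils down to validating that no row, column, or 3x3 box of the partial board contains a duplicate. A minimal sketch of that idea (my own illustration, not the repository's actual code):

```python
def is_valid_so_far(board):
    """Return True if a 9x9 board (0 = empty cell) has no duplicate digit
    in any row, column, or 3x3 box."""
    def no_dupes(cells):
        filled = [c for c in cells if c != 0]
        return len(filled) == len(set(filled))

    rows = board
    cols = [[board[r][c] for r in range(9)] for c in range(9)]
    boxes = [[board[r][c]
              for r in range(br, br + 3) for c in range(bc, bc + 3)]
             for br in (0, 3, 6) for bc in (0, 3, 6)]
    return all(no_dupes(group) for group in rows + cols + boxes)

empty = [[0] * 9 for _ in range(9)]
print(is_valid_so_far(empty))   # True: an empty board is trivially valid
bad = [row[:] for row in empty]
bad[0][0] = bad[0][8] = 5       # duplicate 5 in the first row
print(is_valid_so_far(bad))     # False
```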
An "airgapped" host has no path at all, or only a restricted path, to inbound or outbound Internet traffic.
## Prerequisites
### 1. Whitelist endpoints
In order to install Replicated and YugaWare on a host with no Internet connectivity at all, you have to first download the binaries on a machine that has Internet connectivity and then copy the files over to the appropriate host. In case of restricted connectivity, the following endpoints have to be whitelisted to ensure that they are accessible from the host marked for installation.
```sh
https://downloads.yugabyte.com
https://download.docker.com
```
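A quick way to confirm the whitelist from the target host is to probe each endpoint. This small sketch (not part of the official docs; the helper name is my own) reports which ones are reachable:

```python
import urllib.request
import urllib.error

ENDPOINTS = [
    "https://downloads.yugabyte.com",
    "https://download.docker.com",
]

def is_reachable(url, timeout=5):
    """Return True if the URL answers at all; False on DNS/connect failures."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # got an HTTP response, so the host is reachable
    except Exception:
        return False

for url in ENDPOINTS:
    status = "OK" if is_reachable(url) else "BLOCKED"
    print(status, url)
```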
### 2. Install Docker Engine
A supported version of docker-engine (currently 1.7.1 to 17.03.1-ce) needs to be installed on the host. If you do not have docker-engine installed, follow the instructions [here](https://help.replicated.com/docs/kb/supporting-your-customers/installing-docker-in-airgapped/) to first install docker-engine on an airgapped host. After docker-engine is installed, perform the following steps to install Replicated and then YugaWare.
## Step 1. Install Replicated
On a machine connected to the Internet, perform the following steps.
Make a directory for downloading the binaries.
```sh
$ sudo mkdir /opt/downloads
```
Change the owner user for the directory.
```sh
$ sudo chown -R ubuntu:ubuntu /opt/downloads
```
Change to the directory.
```sh
$ cd /opt/downloads
```
Get the replicated binary.
```sh
$ wget https://downloads.yugabyte.com/replicated.tar.gz
```
Get the yugaware binary where the last 4 digits refer to the version of the binary. Change this number as needed.
```sh
$ wget https://downloads.yugabyte.com/yugaware-1.2.6.0.airgap
```
Change to the directory.
```sh
$ cd /opt/downloads
```
Expand the replicated binary.
```sh
$ tar xzvf replicated.tar.gz
```
Install Replicated (YugaWare will be installed via the Replicated UI after the Replicated install completes). Pick the `eth0` network interface in case multiple ones show up.
```sh
$ cat ./install.sh | sudo bash -s airgap
```
After the Replicated install completes, make sure it is running.
```sh
$ sudo docker ps
```
You should see an output similar to the following.

The next step is to install YugaWare as described in the [section below](#step-2-install-yugaware-via-replicated).
## Step 2. Install YugaWare using Replicated
### Set up HTTPS for Replicated
Launch Replicated UI by going to [http://yugaware-host-public-ip:8800](http://yugaware-host-public-ip:8800). The warning shown next states that the connection to the server is not private (yet). We will address this warning as soon as we setup HTTPS for the Replicated Admin Console in the next step. Click Continue to Setup and then ADVANCED to bypass this warning and go to the Replicated Admin Console.

You can provide your own custom SSL certificate along with a hostname.

The simplest option is to use a self-signed cert for now and add the custom SSL certificate later. Note that you will have to connect to the Replicated Admin Console using the IP address only (as noted below).

### Upload license file
Now upload the YugaByte license file received from YugaByte Support.

Two options to install YugaWare are presented.



### Secure Replicated
The next step is to add a password to protect the Replicated Admin Console (note that this Admin Console is for Replicated and is different from YugaWare, the Admin Console for YugaByte DB).

### Pre-flight checks
Replicated will perform a set of pre-flight checks to ensure that the host is setup correctly for the YugaWare application.

Clicking Continue above will bring us to YugaWare configuration.
In case the pre-flight check fails, review the [Troubleshoot YugaByte Platform](../../../troubleshoot/enterprise-edition/) section below to identify the resolution.
# chrisyeh
Personal website
Live websites at [https://chrisyeh96.github.io/](https://chrisyeh96.github.io/) and [https://stanford.edu/~chrisyeh](https://stanford.edu/~chrisyeh).
**Build Locally**
- GitHub Pages: `jekyll build`
- Stanford: `jekyll build --baseurl /~chrisyeh`
**Run Locally**
`jekyll serve --baseurl /~chrisyeh`
Add `--drafts` to show drafts as the latest posts.
We need the `baseurl` parameter for Stanford because the live website is not at the root domain. However, we don't include `baseurl` in the `_config.yml` file so that [https://chrisyeh96.github.io/](https://chrisyeh96.github.io/) also works.
# `build2` HOWTO
[How do I convert source files to be in the UTF-8 encoding?](entries/convert-source-files-to-utf8.md)
[How do I disable building of a shared library variant for a platform/compiler?](entries/disable-shared-library-for-platform.md)
[How do I handle auto-generated C/C++ headers?](entries/handle-auto-generated-headers.md)
[How do I handle projects that don't use semantic versioning?](entries/handle-projects-which-dont-use-semver.md)
[How do I handle tests that have extra dependencies?](entries/handle-tests-with-extra-dependencies.md)
[How do I link the `pthread` library?](entries/link-pthread.md)
[How should I name packages when packaging third-party projects?](entries/name-packages-in-project.md)
[How do I set up the `build2` toolchain for development?](entries/setup-build2-for-development.md)
A demo of unified front-end exception handling. The demo is a todo list page whose buttons each throw a different system exception, so we can test the handling strategy for each one. To stay compatible with as many browsers as possible (mainly IE8) while keeping the essentials of modern front-end development such as MVVM and modularization, the example uses avalon for the view and controller layers, RequireJS as the module tool to auto-load resources and implement a flyweight pattern for services, and the IE8-compatible Bootstrap 2 as the style library. Because jQuery 1.x and 2.x do not fully implement the Promises/A+ specification, the freshly released jquery-compat-3.0.0-alpha1 is used instead - note that this is an internal test build.
---
title: 'DetailViewLayoutBuilders - LayoutBuilder<T> Syntax'
---
# DetailViewLayoutBuilders - `LayoutBuilder<T>` Syntax
As we learned in the last chapter, layouts are basically *just* a domain specific language around a couple of building blocks. By using the power of C# we are able to write our layouts in code, produce pixel perfect results and be refactoring safe for a more consistent application look and feel.
However, the basic building blocks are a little bit verbose to write, so `Xenial.Framework` provides a more robust and IntelliSense-driven way to write layouts.
As a quick reminder, our final layout should look like the one in the previous example:

## Registration
The registration in the module is exactly the same as in the last chapter: we need to tell XAF to use the `LayoutBuilders`.
For this we need to override `AddGeneratorUpdaters` in our platform-agnostic module and call the `updaters.UseDetailViewLayoutBuilders()` extension method.
Nothing additional is needed to use the `LayoutBuilder<T>` syntax.
<<< @/guide/samples/layout-builders-simple/RegisterInModule.cs{8,12}
## Defining the builder method
The next step is to declare a public static method named `BuildLayout` in the class we want to provide a layout for. It must return an `Xenial.Framework.Layouts.Items.Base.Layout` instance and be decorated with the `DetailViewLayoutBuilderAttribute`.
The `DetailViewLayoutBuilderAttribute` defines the method and type that are responsible for building our `DetailView`.
This step is exactly the same as in the last chapter.
<<< @/guide/samples/layout-builders-simple/RegisterSimple1.cs{11,14-17}
## Using the `LayoutBuilder<T>` instance
However, this time we are not using the initializer syntax and hand-crafting our object graph. Instead, we are going to use our new friend, the `LayoutBuilder<T>` class:
<<< @/guide/samples/layout-builders-advanced/Intro.cs{18}
The layout builder has a lot of methods and overloads that accept some basic parameters such as `Caption` and `ImageName` first, followed by a `params LayoutItemNode[] nodes`, as well as a callback method `Action<TNodeType>`.
That allows us to use a more compact syntax without losing any functionality compared to the traditional initializer syntax.
> In this sample the parameters `m` and `e` are used, where `m` is short for `Member` and `e` for `Editor`.
This is a best-practice convention, but you can use whatever names you want.
<<< @/guide/samples/layout-builders-advanced/BuilderInstance.cs
The beauty of this approach is that IntelliSense guides you while building your layout, so you don't need to remember all the type names. It is a much denser syntax, but it can sometimes be a little *chatty* with braces.
::: tip
You can mix and match the initializer and functional styles. Use whatever makes you and your team happier.
:::
## Inherit from `LayoutBuilder<T>`
So far we have used the `LayoutBuilder<T>` as an instance and relied on the *convention-based* registration pattern for the builder. By inheriting from `LayoutBuilder<T>` and using the `typed` overload of the `DetailViewLayoutBuilderAttribute`, we can remove additional noise from the syntax.
First we need to inherit from the `LayoutBuilder<T>` class and change our registration.
<<< @/guide/samples/layout-builders-advanced/BuilderInherit.cs

⚠️⚠️ **This hasn't been updated recently. There's still some good stuff in here but some is outdated. I plan to go through and update this eventually.** ⚠️⚠️
# iOS App Best Practices and Guidelines
## Table of Contents
1. [Threading](#threading)
2. [Error Handling](#error-handling)
3. [Handling Server Responses](#handling-server-responses)
4. [Working with JSON in Swift](#working-with-json-in-swift)
5. [View Controller Flow](#view-controller-flow)
6. [Dependency Injection](#dependency-injection)
7. [Access Control](#access-control)
8. [Localization](#localization)
9. [Style Guide](#style-guide)
10. [Directory Structure](#directory-structure)
11. [Warnings](#warnings)
12. [More Suggested Readings](#more-suggested-readings)
## Threading
> Do everything on the main thread. Don’t even think about queues and background threads. Enjoy paradise!
>
> If, after testing and profiling, you find you do have to move some things to a background queue, pick things that can be perfectly isolated, and make sure they’re perfectly isolated. Use delegates; do not use KVO or notifications.
>
> If, in the end, you still need to do some tricky things — like updating your model on a background queue — remember that the rest of your app is either running on the main thread or is little isolated things that you don’t need to think about while writing this tricky code. Then: be careful, and don’t be optimistic. (Optimists write crashes.)
Source: [inessential.com: How Not To Crash 4: Threading](http://inessential.com/2015/05/22/how_not_to_crash_4_threading)
If a callback needs to be called after some background work is complete, the callback should be dispatched to the main thread. There may exist a scenario where it makes sense to call that callback on another queue. Those rare cases may be handled in two ways:
1. Provide a `queue: DispatchQueue` parameter so the task can dispatch it to the appropriate queue. In this way, the caller always knows which queue the callback is going to be called on.
2. Add a warning to the documentation that explicitly calls out that the callback will not be called on a main thread. Example ([source][warning-source]):
   ```
   /// A method that does a thing and calls the handler on completion.
   ///
   /// - Warning:
   ///   The handler will be called on a background queue.
   func task(handler: @escaping () -> Void) { … }
   ```
[warning-source]: https://developer.apple.com/library/content/documentation/Xcode/Reference/xcode_markup_formatting_ref/Important.html#//apple_ref/doc/uid/TP40016497-CH37-SW1
### Suggested Reading
- [How Not to Crash #4: Threading](http://inessential.com/2015/05/22/how_not_to_crash_4_threading)
- [How Not to Crash #5: Threading, part 2](http://inessential.com/2015/05/26/how_not_to_crash_5_threading_part_2)
## Error Handling
Always propagate errors up to the site that triggered the overarching task. Don't mask the error within the task where the failure occurred.
> Without error propagation:
>
> - the user-interface may get stuck in a mid-task state
> - earlier stages in the task can’t attempt a retry or recover
> - we’re forcing the task’s presentation of the error, rather than giving the user interface controller that triggered the action a the chance to present errors in a more integrated way
>
> …
>
> Swift’s `throws` keyword is one of the best features of the language. You may need to paint a lot of functions with the `throws` attribute (especially if you need to `throw` from deep inside a hierarchy) but it makes your interfaces honest.
Source: [Presenting unanticipated errors to users](https://www.cocoawithlove.com/blog/2016/04/14/error-recovery-attempter.html)
Following this practice not only gives you the option to handle all errors, but it also forces you to consider all errors at the UI level. Presenting the error takes more thought and effort than merely logging it, but the result is a better UX. It is usually good enough to show an alert for edge-case errors, but errors that are likely (like network calls failing) should be discussed with product/design.
### Note On Error Logging
It is best to log the errors at the site that triggered the overarching task. If you log errors deeper in the stack where the error occurred, it can get complex. For instance, say we try to log the error where it occurred in this example:
```
func doAThing(url: URL) throws {
  do {
    try FileManager.default.removeItem(at: url)
    try FileManager.default.createDirectory(at: url)
  } catch {
    log_error(error) // Assuming we have this global function for logging errors.
    throw error // Rethrow the error so it continues to propagate.
  }
}
```
Here, the assumption is that since this lower level code is logging the error, the caller doesn't need to log the error. That seems fine if this assumption is understood throughout the codebase, at least until the code needs to be modified:
```
func doAThing(url: URL) throws {
  do {
    try FileManager.default.removeItem(at: url)
    try doAnotherThing()
    try FileManager.default.createDirectory(at: url)
  } catch {
    log_error(error) // Assuming we have this global function for logging errors.
    throw error // Rethrow the error so it continues to propagate.
  }
}
```
Now we are potentially logging the error twice: if `doAnotherThing()` is the code that triggered the error to be thrown, it too may be logging its own error. A recursive function that can throw an error has the same issue. It keeps the code straightforward if the only logging code is at the topmost level. If the error is going to continue propagating, don't log it.
This is true for errors that are passed into a result handler as well:
```
func doSomething(handler: (Error?) -> Void) { … }
```
It would be up to the caller to log the error:
```
doSomething { error in
  if let error = error {
    log_error(error)
  }

  // … continue with the rest of the code
}
```
### Suggested Reading
- [Presenting unanticipated errors to users](https://www.cocoawithlove.com/blog/2016/04/14/error-recovery-attempter.html)
## Handling Server Responses
> … _whoever wrote the server side is your sworn enemy_. He or she hates you very, very much.
Source: [How Not to Crash #7: Dealing with Nothing](http://inessential.com/2015/05/29/how_not_to_crash_7_dealing_with_nothin)
Even if the server API is well documented and there is an understood contract in place, never assume that the server code isn't going to throw in a curveball (even if the author is really a nice person who loves you very much). Check the type of every single piece of data; ensure all data that you expect to be there is actually there. This is much more natural in Swift than it was in Objective-C. Objective-C's `id` type means that you never have to consider this if you don't want to, but you definitely should.
If there is a piece of data that is not as you expect, then generally speaking the best course of action is to consider the whole task a failure. The callback that is expecting the result of the network request should receive an error. Don't try to cover up the issue with a default value or an empty list. The server is not abiding by the contract set forth in the API, and this is an error that needs to be propagated up.
## Working with JSON in Swift
> Converting between representations of the same data in order to communicate between different systems is a tedious, albeit necessary, task for writing software.
>
> Because the structure of these representations can be quite similar, it may be tempting to create a higher-level abstraction to automatically map between these different representations. For instance, a type might define a mapping between snake_case JSON keys and camelCase property names in order to automatically initialize a model from JSON using the Swift reflection APIs, such as Mirror.
>
> However, we’ve found that these kinds of abstractions tend not to offer significant benefits over conventional usage of Swift language features, and instead make it more difficult to debug problems or handle edge cases. […] The cost of small amounts of duplication may be significantly less than picking the incorrect abstraction.
Source: [Apple Swift Blog: Working with JSON in Swift](https://developer.apple.com/swift/blog/?id=37)
Reflection and third party libraries that abstract serialization or deserialization of JSON should be avoided.
It is preferable to throw an error when JSON deserialization fails, as opposed to returning `nil`. See the "Writing a JSON Initializer with Error Handling" section in the previously mentioned [Working with JSON in Swift article](https://developer.apple.com/swift/blog/?id=37) for a good example of handling errors.
## View Controller Flow
According to the [Apple documentation][dismiss-documentation]:
> The presenting view controller is responsible for dismissing the view controller it presented.
Unfortunately, UIKit makes it easy to ignore this directive and have the presented view controller dismiss itself. The next line in the documentation reads:
> If you call this method on the presented view controller itself, UIKit asks the presenting view controller to handle the dismissal.
In fact, several UIKit-provided view controllers dismiss themselves. However, that is going to lead to tight coupling of view controllers and **should be avoided**.
Let's say there are two view controllers `A` and `B`, where `A` presents `B`. If `B` dismisses itself, then it implicitly knows that it was presented. This tight coupling will lead to unexpected results when a new view controller `C` actually pushes `B` onto a navigation stack instead of presenting it.
Instead, set up a delegate for `B`, so that `B` can let its delegate know when it has finished what it was doing. View controllers `A` and `C` can then be set as the delegate and `A` can dismiss the view controller (calling dismiss on itself, rather than on the `B` view controller) and `C` can pop the navigation stack.
### Note on Popping the Navigation Stack
It is better to use `navigationController?.popToViewController(self, animated: true)` ([documentation][pop-documentation]) rather than `navigationController?.popViewController(animated: true)` so that you can be sure you are popping to the correct view controller. Otherwise, the view controller that does the popping needs to know how the pushed view controller handles things. It's possible the pushed view controller has pushed more view controllers onto the stack too.
Note that the `dismiss` method already handles a stack in this way. According to the [documentation][dismiss-documentation]:
> If you present several view controllers in succession, thus building a stack of presented view controllers, calling this method on a view controller lower in the stack dismisses its immediate child view controller and all view controllers above that child on the stack. When this happens, only the top-most view is dismissed in an animated fashion; any intermediate view controllers are simply removed from the stack.
[dismiss-documentation]: https://developer.apple.com/reference/uikit/uiviewcontroller/1621505-dismiss
[pop-documentation]: https://developer.apple.com/reference/uikit/uinavigationcontroller/1621871-poptoviewcontroller
### Suggested Reading
The following is a series of posts by Soroush Khanlou on Coordinators. Coordinators are his solution for the tight coupling between view controllers and view controller bloat. Personally, I have tried coordinators out and found that they aren't for me. I'm okay with view controllers doing some of the things he isn't okay with and I find that creating a separate "Coordinator" object doesn't remove coupling, it just moves it to another place. But these articles are worth considering. It was these articles that inspired the view controller flow that is described above.
- [The Coordinator](http://khanlou.com/2015/01/the-coordinator/)
- [Coordinators Redux](http://khanlou.com/2015/10/coordinators-redux/)
- [Series of posts on Advanced Coordinators](http://khanlou.com/tag/advanced-coordinators/) (think of this as extra credit; the basics have already been covered; only read if you are still interested)
## Dependency Injection
Stack Overflow has some good answers on [What is dependency injection and when/why should or shouldn't it be used?](https://stackoverflow.com/q/130794). There are basically two major benefits to DI:
1. [Testability](https://stackoverflow.com/a/140655)
2. [Loose Coupling](https://stackoverflow.com/a/6085922)
Here are a few places that can get overlooked in iOS development when it comes to dependency injection. This list is not exhaustive:
- API clients
- User defaults / user settings
- Logged in user data
- Data model (Core Data persistent container/managed object context)
It will get burdensome to individually inject these items into every object that needs them. The solution is to bundle them all up in a single object that holds a reference to each item. I have seen this object called an [`Environment`](http://news.realm.io/news/try-swift-brandon-williams-writing-testable-code/#code-testing-with-co-effects). I call it a `ScreenContext`. Here is an example:
```
class ScreenContext {
  init(apiClient: MyAPIClient, persistentContainer: NSPersistentContainer) {
    self.apiClient = apiClient
    self.persistentContainer = persistentContainer
  }

  let apiClient: MyAPIClient
  let persistentContainer: NSPersistentContainer
}
```
All screens might need `ScreenContext`, but say there is a group of screens that are behind a log-in wall. All of those screens will require a `LoggedInUser`. If there are enough screens behind this wall, it makes sense to create another context object, similar to ScreenContext except that includes the user:
```
class LoggedInScreenContext: ScreenContext {
  init(loggedInUser: LoggedInUser, apiClient: MyAPIClient, persistentContainer: NSPersistentContainer) {
    self.loggedInUser = loggedInUser
    super.init(apiClient: apiClient, persistentContainer: persistentContainer)
  }

  let loggedInUser: LoggedInUser
}
```
Don't feel obligated to make a separate screen context for every dependency that needs to be injected. An object should only make it into a screen context if it is used throughout the app or a large component in the app.
Note: subclasses might not always be the best pattern. [Protocol composition][protocol-composition-di] will make more sense in some situations. Subclassing is simpler, but not as flexible. Judge for yourself depending on the project, and then keep to that pattern throughout the project. For new projects, subclasses probably make the most sense. If it later makes sense to move to protocol composition, then convert all existing screen contexts away from subclasses to composition.
[protocol-composition-di]: http://merowing.info/2017/04/using-protocol-compositon-for-dependency-injection/
## Access Control
The rule of thumb is to make properties and methods `private` by default and to make classes `final` by default. Only open up access as needed. This makes for easier-to-maintain code and can even make the code more performant and compile faster (though this is likely negligible and not the reason for this guideline).
## Localization
Always localize user-facing text. Use `NSLocalizedString` and the `Formatter` subclasses from the beginning. Code should not be merged into the mainline development branch until all user-facing text is localized.
## Style Guide
Follow [The Official raywenderlich.com Swift Style Guide](https://github.com/raywenderlich/swift-style-guide#code-organization). If there are pieces of this document that contradict with the raywenderlich.com Swift Style Guide, then go with this document.
As for code organization, scanning a file for the code you are looking for is easier if the code is organized in a consistent manner across all files. The actual order of things isn't really important. Consistency is the important part. Here is the structure that I typically follow for `classes`/`structs`:
- Enums
- Type Aliases
- Object Lifecycle (`init`, `deinit`, and `awakeFromNib` methods; sometimes a `configure` method if and only if it's used in multiple `init` methods)
- Public Properties
- Public Methods
- Internal Properties
- Internal Methods
- Private Properties
- Actions (these are the `dynamic`/`@IBAction` methods that are triggered by buttons, timers, etc. and they are always private)
- Private Methods
- Subclass overrides (these are broken up by the exact type that defined the original properties/methods that are overridden; for example, a `UITableViewController` subclass might have a `UIViewController` section for `override func viewDidLoad() { … }` and a `UITableViewControllerDataSource` section for `override func numberOfSections(in: UITableView) -> Int { … }`)
Each section should be denoted with a `MARK` comment. For example:
```
final class SubOperation: Operation {

  // MARK: - Object Lifecycle

  init(screenContext: ScreenContext) {
    self.screenContext = screenContext
  }

  // MARK: - Private Properties

  private let screenContext: ScreenContext

  // MARK: - Operation

  override func main() {
    super.main()

    // Do stuff here…
  }
}
```
## Directory Structure
The Xcode Project navigator structure should match the underlying directory structure as much as possible. It helps if the paths of the Groups in the project navigator point to their corresponding directories within the repo.
## Warnings
If you let the project continue with even just 1 warning, you will start to become blind to the warnings panel that Xcode gives you. No code should be merged to the mainline development branch with any warnings.
## More Suggested Readings
- [How Not To Crash series](http://inessential.com/hownottocrash)
# Reduction Driver
## Usage
``` sh
./benchdnn --reduction [benchdnn-knobs] [reduction-knobs] [reduction-desc] ...
```
where *reduction-knobs* are:
- `--sdt={f32 [default], bf16, s8, u8, s32}` -- src data type.
Refer to [data types](knobs_dt.md) for details.
- `--ddt={f32 [default], bf16, s8, u8, s32}` -- dst data type.
Refer to [data types](knobs_dt.md) for details.
- `--stag={nchw [default], ...}` -- physical src memory layout.
Refer to [tags](knobs_tag.md) for details.
- `--dtag={any [default], ...}` -- physical dst memory layout.
Refer to [tags](knobs_tag.md) for details.
- `--alg={sum [default], ...}` -- algorithm for reduction operations.
Refer to [reduction primitive](https://oneapi-src.github.io/oneDNN/dev_guide_reduction.html)
for details.
- `--p=FLOAT` -- float value corresponding to algorithm operation.
Refer to ``Floating point arguments`` below.
- `--eps=FLOAT` -- float value corresponding to algorithm operation.
Refer to ``Floating point arguments`` below.
and *reduction-desc* is a problem descriptor. The canonical form is:
```
NxNxNxNxN:NxNxNxNxN
```
where N is an integer number. This represents a 3D spatial problem with the
following logical dimensions: N, C, D, H, W. Consider removing each `xN` from
the end to specify fewer dimensions.
## Floating point arguments
Some operations support `p` and `eps` arguments such as
`norm_lp_max`, `norm_lp_sum`, `norm_lp_power_p_max`, `norm_lp_power_p_sum`.
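As a sketch, a norm-based reduction that exercises these arguments could be invoked as follows; the flag values here are illustrative, not recommendations:

```sh
# Hypothetical run: Lp-norm reduction with p = 2 and an explicit eps
./benchdnn --reduction --sdt=f32 --ddt=f32 --alg=norm_lp_sum --p=2 --eps=0 1x2x3:1x1x3
```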
## Essence of Testing
## Examples
Run the set of reduction primitive problems from `inputs/reduction/shapes_ci`
with the default settings:
``` sh
./benchdnn --reduction --batch=inputs/reduction/shapes_ci
```
Run a specific reduction primitive problem:
- Data type is `f32` for source and destination tensors.
- Source tensor uses `acb` memory format.
- The reduce operation is sum.
``` sh
./benchdnn --reduction --sdt=f32 --ddt=f32 --stag=acb --alg=sum 1x2x3:1x1x3
```
More examples with different driver options can be found at
inputs/reduction/test_***. Examples with different problem descriptors can be
found at inputs/reduction/shapes_***. Examples with different benchdnn common
options can be found at driver_conv.md.
Title: Bowser’s Castle
Date: 2012-06-05 18:26:00
FileSlug: NSMBCastle
---
title: Dignitary
letter: D
permalink: "/definitions/bld-dignitary.html"
body: In canon law. A person holding an ecclesiastical benefice or dignity, which
  gave him some pre-eminence above mere priests and canons. To this class exclusively
  belonged all bishops, deans, archdeacons, etc.; but it now includes all the prebendaries
  and canons of the church. Brande
published_at: '2018-07-07'
source: Black's Law Dictionary 2nd Ed (1910)
layout: post
---

A variable is the name of a _container_ that _stores_ (or “points to”) a _value_.
Variables are the rice and soy sauce of your programs: they give you the ability to grab hold of the bits in your computer and assign names to them.
The _concept_ of variables is the most important thing you can possibly learn at this point in the class. In JavaScript, variables are used to point to values — bits of 1s and 0s somewhere on your disk that make up your program. You can create variables to point not just to the simple values (numbers and strings) that we used above, but to point to entire functions and objects (which we’ll learn about later).
You create variables in JavaScript using the `let`, `const`, or `var` keywords. In general, we’ll avoid using the `var` keyword to create a variable as it’s an old web style that is being phased out.
In general, you use the `const` word when you’re sure the value of the variable won’t change. However, a _mutable_ variable (a variable that can point to different values at different times) is often needed, particularly as a counter when we need to loop through something.
`let` is a signal that the value of the variable may be reassigned.
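To make the difference concrete, here is a small sketch you can paste into a browser console or Node REPL (it jumps slightly ahead to the declaration syntax covered below):

```javascript
let mood = "curious";
mood = "excited"; // fine: a `let` variable can point to a new value

const birthday = "June 5";
let constChangeFailed = false;
try {
  birthday = "July 4"; // not allowed: reassigning a `const` throws a TypeError
} catch (error) {
  constChangeFailed = true;
}

console.log(mood); // → "excited"
console.log(constChangeFailed); // → true
```

Running the sketch prints `excited` and `true`, confirming that `let` allows reassignment while `const` does not.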
## How to create a variable in JavaScript
To create a variable, use the word `let` or `const` followed by the name of your variable, followed by an `=` sign and the variable’s value:
```javascript
let degrees = 80;
```
## Create a variable that holds a number:
To create a number variable, simply use the keyword `let` or `const`, give it a name, and assign it a numeric value. For example:
```javascript
const pi = 3.14159;
```
The above creates a variable named `pi` and sets its value equal to 3.14159.
The other way to create a variable is to use the word `let`. Remember our old friend:
```javascript
let degrees = 80;
```
Here we create a variable called `degrees` and set its value to 80. We use `let` here because we'll probably want to change the value of `degrees` at some point (maybe our program is reading data from a temperature sensor).
Remember the expression that converts 80 degrees Fahrenheit to Celsius?
```javascript
(80 - 32) * 5 / 9
```
After the `let degrees = 80;` statement is executed, the following two expressions have the same value.
```javascript
(80 - 32) * 5 / 9
(degrees - 32) * 5 / 9
```
`degrees` (for now) is just a wordier way of saying `80`.
Try this out in the REPL. Enter `let degrees = 80;`, and then enter `(degrees - 32) * 5 / 9`.
Since we used `let` to declare the variable (and not `const`), we can change its value. In the REPL, enter `degrees = 50`, and then enter `(degrees - 32) * 5 / 9` a second time. What is printed in the REPL?
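If you want to check your answer outside the REPL, this stand-alone sketch reproduces the experiment:

```javascript
let degrees = 80;
let celsius = (degrees - 32) * 5 / 9;
console.log(celsius); // → 26.666666666666668

degrees = 50; // `let` lets us point `degrees` at a new value
celsius = (degrees - 32) * 5 / 9;
console.log(celsius); // → 10
```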
Let’s create a variable to hold the result of this calculation:
```javascript
// To convert Fahrenheit to Celsius: subtract 32, then multiply by 5, then divide by 9
let degrees = 80;
let celsius = (degrees - 32) * 5 / 9;
```
The `let degrees = 80;` line creates a variable named `degrees` and stores the number 80. The `let celsius = …` line creates a variable named `celsius`; performs a sequence of operations — multiplies the value of the `degrees` variable by 9, then multiplies that value by 5, and finally divides _that_ value by 9 — and stores the final computed value in the _celsius_ variable.
Try it out! The following REPL comes pre-loaded with the definition of `degrees` and `celsius`.
1. Click the green triangle at the top, to run the code in the top pane.
2. Click on the lower pane. Enter `degrees` (+Enter) and `celsius` (+Enter) to
see the values of those variables.
3. Try changing the code, and running it again (the green triangle at the top).
4. Can you add a `let fahrenheit =`, that uses the value of `celsius` to
calculate degrees in Fahrenheit?
<embed src="https://repl.it/@osteele/4-FahrenheitToCelsius" />
## Camel Case
Camel Case (or “camelCase”) is a way of writing names so that every word after the first starts with a capital letter, e.g. iPhone or eBay. Almost everyone recommends that JavaScript variable names with more than one word be formatted using camel case!
```javascript
const feetPerMile = 5280;
let feetWalkedToday = 3890;
let degreesFahrenheit = 80;
```
## String Values
A String is a sequence of letters, digits, and special characters such as spaces, tabs, and line breaks (newlines or carriage returns). Creating a string variable is exactly the same as creating a number variable, except that you need to enclose the value in double quotes `"`, single quotes `'`, or backticks \`\`\`.
```javascript
let thingsILove = "I love dumplings";
let thingsILove = 'I love dumplings';
let thingsILove = `I love dumplings`;
```
All of the above are OK in JavaScript, and they all mean the same thing.
The `+` operator also works on strings. These next two lines have the same effect.
```javascript
let hello = "Hello Applab";
let hello = "Hello " + "Applab";
```
Note: The _string_ "12" is different from the _number_ 12. The _string_ is a sequence of characters: the character "1" followed by the character "2". The _number_ is an integer that is slightly greater than 10. 12 + 3 is equivalent to 15; "12" + "3" is equivalent to "1212".
| 47.238532 | 412 | 0.737231 | eng_Latn | 0.99953 |
karma-sinon
===========
Sinon for karma
Installation
------------
Install the module via npm
```sh
$ npm install karma-sinon sinon --save-dev
```
Add `sinon` to the `frameworks` key in your Karma configuration:
```js
module.exports = function(config) {
  'use strict';
  config.set({
    // ...
    frameworks: ['jasmine', 'sinon'],
    // ...
  });
}
```
**Example**
```javascript
describe("sinon example test", function () {
  var time2013_10_01;
  time2013_10_01 = (new Date(2013, 10-1, 1)).getTime();

  before(function() {
    // sinon was defined in global scope
    this.fakeTimer = sinon.useFakeTimers(time2013_10_01);
  });

  it("some test", function() {
    // test
  });

  after(function() {
    this.fakeTimer.restore();
  });
});
```
# git submodule
> Inspect, update and manage submodules.
> More information: <https://git-scm.com/docs/git-submodule>.
- Install a repository's specified submodules:
`git submodule update --init --recursive`
- Add a Git repository as a submodule:
`git submodule add {{repository_url}}`
- Add a Git repository as a submodule at the specified directory:
`git submodule add {{repository_url}} {{path/to/directory}}`
- Update every submodule to its latest commit:
`git submodule foreach git pull`
# InspectAndBuy
Inspect a product before buying it!
> Remove this line and describe this pull request. Link to relevant GitHub issues, if any.
***
#### Before creating a pull request
- [ ] Document new methods and classes
- [ ] Format new code files using ClangFormat by running `make format`
- [ ] Build with `-DDART_TREAT_WARNINGS_AS_ERRORS=ON` and resolve all the compile warnings
#### Before merging a pull request
- [ ] Set version target by selecting a milestone on the right side
- [ ] Summarize this change in `CHANGELOG.md`
- [ ] Add unit test(s) for this change
# Akka.Streams.Azure
Akka.Streams adapter for Azure Storage Queue, EventHub and ServiceBus
# Documentation
- [Storage Queue](Azure/StorageQueue.md)
- [EventHub](Azure/EventHub.md)
- [ServiceBus](Azure/ServiceBus.md)
# Raw
`RawClause` and `RawStmt` execute a clause or a statement given as a raw string.
## RawClause
`RawClause` was created to make the SQL support more flexible.
When the query is executed, values are assigned to the `?` placeholders in the expression.
The assignment follows these rules:
- If the value is a `string` or a `time.Time`, it is enclosed in single quotes.
- If the value is a slice or an array, its elements are expanded.
- If the value is a `gsorm.Stmt`, the `gsorm.Stmt` is expanded.
- Any value that does not match the conditions above is expanded as is.
`RawClause` is supported by all Stmt structures and can be called at any point.
However, it cannot be used together with methods such as `InsertStmt.Select`, `InsertStmt.Model`, `UpdateStmt.Model`, and `CreateTable.Model`.
#### Example
```go
err := gsorm.CreateTable(db, "dept_emp").
Column("dept_no", "INT").NotNull().RawClause("AUTO_INCREMENT").
Column("emp_no", "INT").NotNull().
Cons("FK_dept_emp").Foreign("emp_no").Ref("employees", "emp_no").RawClause("ON UPDATE CASCADE").
Migrate()
// CREATE TABLE employees(
// dept_no INT NOT NULL AUTO_INCREMENT,
// emp_no INT NOT NULL,
// CONSTRAINT FK_dept_emp FOREIGN KEY (emp_no) REFERENCES employees (emp_no) ON UPDATE CASCADE
// );
```
## RawStmt
`RawStmt` executes the SQL given as a string.
When the query is executed, values are assigned to the `?` placeholders in the expression.
The assignment follows these rules:
- If the value is a `string` or a `time.Time`, it is enclosed in single quotes.
- If the value is a slice or an array, its elements are expanded.
- If the value is a `gsorm.Stmt`, the `gsorm.Stmt` is expanded.
- Any value that does not match the conditions above is expanded as is.
`RawStmt` supports all of `Query`, `Exec`, and `Migrate`.
[](https://pkg.go.dev/github.com/champon1020/gsorm#RawStmt)
#### Example
```go
err := gsorm.RawStmt("SELECT * FROM employees").Query(&model)
// SELECT * FROM employees;
err := gsorm.RawStmt("SELECT * FROM employees WHERE emp_no = ?", 1001).Query(&model)
// SELECT * FROM employees WHERE emp_no = 1001;
err := gsorm.RawStmt("SELECT * FROM employees WHERE first_name = ?", "Taro").Query(&model)
// SELECT * FROM employees WHERE first_name = 'Taro';
err := gsorm.RawStmt("SELECT * FROM employees WHERE birth_date = ?", time.Date(2006, time.January, 2, 0, 0, 0, 0, time.UTC)).Query(&model)
// SELECT * FROM employees WHERE birth_date = '2006-01-02 00:00:00';
err := gsorm.RawStmt("SELECT * FROM employees WHERE emp_no IN (?)", []int{1001, 1002}).Query(&model)
// SELECT * FROM employees WHERE emp_no IN (1001, 1002);
err := gsorm.RawStmt("SELECT * FROM employees WHERE emp_no IN (?)", gsorm.Select(nil, "emp_no").From("dept_manager")).Query(&model)
// SELECT * FROM employees WHERE emp_no IN (SELECT emp_no FROM dept_manager);
err := gsorm.RawStmt("DELETE FROM employees").Exec()
// DELETE FROM employees;
err := gsorm.RawStmt("ALTER TABLE employees DROP PRIMARY KEY PK_emp_no").Migrate()
// ALTER TABLE employees DROP PRIMARY KEY PK_emp_no;
```
---
id: 120
title: Michael Jackson
date: 2012-04-04T21:58:54+00:00
author: chito
layout: post
guid: https://ukdataservers.com/michael-jackson/
permalink: /04/04/michael-jackson
tags:
- claims
- lawyer
- doctor
- house
- multi family
- online
- poll
- business
category: Guides
---
* some text
{: toc}
## Who is Michael Jackson
The King of Pop who became the most successful singer in American history by releasing award-winning hits like “Billie Jean,” “Beat It,” “They Don’t Care About Us” and “Smooth Criminal.”
## Prior to Popularity
He first rose to prominence at the age of six singing with the family group The Jackson 5 alongside his brothers Jermaine, Tito, Marlon and Jackie Jackson.
## Random data
His 1982 album Thriller became the best-selling album in history, winning thirteen Grammy Awards and having thirteen songs reach #1 on the U.S. charts. He’s recognized as the Most Successful Entertainer of All-Time by the Guinness Book of World Records.
## Family & Everyday Life of Michael Jackson
He got married to Lisa Marie Presley, the daughter of Elvis Presley, on May 26, 1994. After their divorce on January 18, 1996, he married Debbie Rowe on November 14, 1996. He had three children named Paris, Prince and Blanket.
## People Related With Michael Jackson
He is the older brother of pop diva Janet Jackson.
---
jupyter:
jupytext:
text_representation:
extension: .md
format_name: markdown
format_version: '1.3'
jupytext_version: 1.13.7
kernelspec:
display_name: Python 3
language: python
name: python3
---
<!-- #region id="H6scelNfLkw8" -->
# Kafka MongoDB Real-time Streaming Kafka Consumer and MongoDB
> Listening to a Kafka topic in real time and storing the messages in MongoDB
- toc: true
- badges: true
- comments: true
- categories: [mongodb, kafka, real time]
- image:
<!-- #endregion -->
```python id="eqHjK5UbBEqK"
!pip install confluent_kafka -q
```
```python id="ppjOFhRnCTJu"
import json
import sys
import os
import pandas as pd
from confluent_kafka import Producer
from confluent_kafka import Consumer, KafkaException, KafkaError
```
<!-- #region id="9mHX3NBGLTMb" -->
### Consumer Setup [notebook]
<!-- #endregion -->
```python id="RXBtNbCTJZb2"
CLOUDKARAFKA_TOPIC = 'yx03wajr-demo'
CLOUDKARAFKA_BROKERS = 'dory-01.srvs.cloudkafka.com:9094, \
dory-02.srvs.cloudkafka.com:9094, \
dory-03.srvs.cloudkafka.com:9094'
CLOUDKARAFKA_USERNAME = 'yx03wajr'
CLOUDKARAFKA_PASSWORD = 'pHva0afDUXPya6JfKrbM1j549G*****'
```
```python id="lAYgDcRZLf9f"
topics = CLOUDKARAFKA_TOPIC.split(",")
# Consumer configuration
conf = {
'bootstrap.servers': CLOUDKARAFKA_BROKERS,
'group.id': "%s-consumer" % CLOUDKARAFKA_USERNAME,
'session.timeout.ms': 6000,
'default.topic.config': {'auto.offset.reset': 'smallest'},
'security.protocol': 'SASL_SSL',
'sasl.mechanisms': 'SCRAM-SHA-256',
'sasl.username': CLOUDKARAFKA_USERNAME,
'sasl.password': CLOUDKARAFKA_PASSWORD
}
```
```python id="kQZv-OBALWR_"
c = Consumer(**conf)
c.subscribe(topics)
```
```python colab={"base_uri": "https://localhost:8080/"} id="QLo8xNhFKYrf" outputId="0ceb91f4-842e-4e25-9df0-da503db8aed5"
# while True:
for i in range(10):
i+=1
print(i)
msg = c.poll(timeout=1.0)
if msg is None:
continue
if msg.error():
# Error or event
if msg.error().code() == KafkaError._PARTITION_EOF:
# End of partition event
sys.stderr.write('%% %s [%d] reached end at offset %d\n' %
(msg.topic(), msg.partition(), msg.offset()))
elif msg.error():
# Error
raise KafkaException(msg.error())
else:
# Proper message
sys.stderr.write('%% %s [%d] at offset %d with key %s:\n' %
(msg.topic(), msg.partition(), msg.offset(),
str(msg.key())))
print(msg.value())
c.close()
```
<!-- #region id="DOslXBsRTGP_" -->
### Consumer Setup [terminal]
<!-- #endregion -->
```python colab={"base_uri": "https://localhost:8080/"} id="A7C7RJnOL6qK" outputId="9e24943e-be6b-443e-e207-642bcc9e0207"
%%writefile consumer.py
import sys
import os
from confluent_kafka import Consumer, KafkaException, KafkaError
CLOUDKARAFKA_TOPIC = 'yx03wajr-demo'
CLOUDKARAFKA_BROKERS = 'dory-01.srvs.cloudkafka.com:9094, \
dory-02.srvs.cloudkafka.com:9094, \
dory-03.srvs.cloudkafka.com:9094'
CLOUDKARAFKA_USERNAME = 'yx03wajr'
CLOUDKARAFKA_PASSWORD = 'pHva0afDUXPya6JfKrbM1j549G*****'
if __name__ == '__main__':
topics = CLOUDKARAFKA_TOPIC.split(",")
# Consumer configuration
# See https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
conf = {
'bootstrap.servers': CLOUDKARAFKA_BROKERS,
'group.id': "%s-consumer" % CLOUDKARAFKA_USERNAME,
'session.timeout.ms': 6000,
'default.topic.config': {'auto.offset.reset': 'smallest'},
'security.protocol': 'SASL_SSL',
'sasl.mechanisms': 'SCRAM-SHA-256',
'sasl.username': CLOUDKARAFKA_USERNAME,
'sasl.password': CLOUDKARAFKA_PASSWORD
}
c = Consumer(**conf)
c.subscribe(topics)
try:
while True:
msg = c.poll(timeout=1.0)
if msg is None:
continue
if msg.error():
# Error or event
if msg.error().code() == KafkaError._PARTITION_EOF:
# End of partition event
sys.stderr.write('%% %s [%d] reached end at offset %d\n' %
(msg.topic(), msg.partition(), msg.offset()))
elif msg.error():
# Error
raise KafkaException(msg.error())
else:
# Proper message
sys.stderr.write('%% %s [%d] at offset %d with key %s:\n' %
(msg.topic(), msg.partition(), msg.offset(),
str(msg.key())))
print(msg.value())
except KeyboardInterrupt:
sys.stderr.write('%% Aborted by user\n')
# Close down consumer to commit final offsets.
c.close()
```
```python colab={"base_uri": "https://localhost:8080/"} id="kdTMz6JyTQyy" outputId="5c37503a-6f65-44d3-c8ec-48f3cf91b30d"
!python consumer.py
```
<!-- #region id="E9fHshxUdrS2" -->
### MongoDB Setup
<!-- #endregion -->
```python colab={"base_uri": "https://localhost:8080/"} id="abIKLu67dq65" outputId="7fed633c-53fb-4bc3-90c5-7a923c165b42"
!pip uninstall pymongo
!pip install pymongo[srv]
```
```python id="K2TRWDWs5IVt"
MONGODB_USER = 'kafka-demo'
MONGODB_PASSWORD = '<your-pass>'
MONGODB_CLUSTER = 'cluster0.ca4wh.mongodb.net'
MONGODB_DATABASE = 'movielens'
```
```python id="fpSqox-OX54F"
import pymongo
import urllib
mongo_uri = f"mongodb+srv://{MONGODB_USER}:{MONGODB_PASSWORD}@{MONGODB_CLUSTER}/{MONGODB_DATABASE}?retryWrites=true&w=majority"
client = pymongo.MongoClient(mongo_uri)
```
```python colab={"base_uri": "https://localhost:8080/"} id="F-pbkwAukbVx" outputId="e8f63ddb-0cba-4939-92c3-f99ba400c2d3"
mydb = client["movielens"]
mydb.list_collection_names()
```
```python colab={"base_uri": "https://localhost:8080/"} id="FUlch4VvmQFf" outputId="3e25cff4-3a14-4044-c55a-5af3bd98539a"
client.list_database_names()
```
```python id="iNsVjxr8mXXX"
movies = mydb.movies
```
```python colab={"base_uri": "https://localhost:8080/"} id="JIzwhu6qmhb8" outputId="d3071aad-2fc5-4836-fc51-da3767408c63"
result = movies.insert_one({'movieId': 3, 'title': 'Grumpier Old Men (1995)', 'genres': 'Comedy|Romance'})
result
```
```python colab={"base_uri": "https://localhost:8080/"} id="h2VDDAqNhZvF" outputId="80ff7a7a-c562-489e-f9e4-e32a43a06457"
print(f"One movie: {result.inserted_id}")
```
```python colab={"base_uri": "https://localhost:8080/"} id="bFLfB7CGvXFR" outputId="629cd933-85e1-46a5-92cb-6f8fc7711a2b"
# single-line command to insert record
print(client.movielens.movies.insert_one({'movieId':5, 'title':'Bride', 'genres':'Comedy'}).inserted_id)
```
```python colab={"base_uri": "https://localhost:8080/"} id="r2sOG1h_hZqr" outputId="ffb4618d-9917-40aa-ede9-7ef20954042f"
movie2 = {'movieId': 2, 'title': 'Jumanji (1995)', 'genres': 'Adventure|Children|Fantasy'}
movie3 = {'movieId': 3, 'title': 'Grumpier Old Men (1995)', 'genres': 'Comedy|Romance'}
new_result = movies.insert_many([movie2, movie3])
print(f"Multiple movies: {new_result.inserted_ids}")
```
```python colab={"base_uri": "https://localhost:8080/"} id="Ud9mkOYqnTcC" outputId="32960e28-7b2a-409f-8745-6bdd52a4f0e0"
import pprint
for doc in movies.find():
pprint.pprint(doc)
```
```python colab={"base_uri": "https://localhost:8080/"} id="6o5C_7kxnbGW" outputId="a879b8ca-224d-427a-f97d-7df8375fc453"
%%writefile consumer.py
import sys
import os
from confluent_kafka import Consumer, KafkaException, KafkaError
import pymongo
CLOUDKARAFKA_TOPIC = 'yx03wajr-demo'
CLOUDKARAFKA_BROKERS = 'dory-01.srvs.cloudkafka.com:9094, \
dory-02.srvs.cloudkafka.com:9094, \
dory-03.srvs.cloudkafka.com:9094'
CLOUDKARAFKA_USERNAME = 'yx03wajr'
CLOUDKARAFKA_PASSWORD = 'pHva0afDUXPya6JfKrbM1j549G*****'
MONGODB_USER = 'kafka-demo'
MONGODB_PASSWORD = '<your-pass>'
MONGODB_CLUSTER = 'cluster0.ca4wh.mongodb.net'
MONGODB_DATABASE = 'movielens'
mongo_uri = f"mongodb+srv://{MONGODB_USER}:{MONGODB_PASSWORD}@{MONGODB_CLUSTER}/{MONGODB_DATABASE}?retryWrites=true&w=majority"
client = pymongo.MongoClient(mongo_uri)
mydb = client[MONGODB_DATABASE]
movies = mydb.movies
if __name__ == '__main__':
topics = CLOUDKARAFKA_TOPIC.split(",")
# Consumer configuration
# See https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
conf = {
'bootstrap.servers': CLOUDKARAFKA_BROKERS,
'group.id': "%s-consumer" % CLOUDKARAFKA_USERNAME,
'session.timeout.ms': 6000,
'default.topic.config': {'auto.offset.reset': 'smallest'},
'security.protocol': 'SASL_SSL',
'sasl.mechanisms': 'SCRAM-SHA-256',
'sasl.username': CLOUDKARAFKA_USERNAME,
'sasl.password': CLOUDKARAFKA_PASSWORD
}
c = Consumer(**conf)
c.subscribe(topics)
try:
while True:
msg = c.poll(timeout=1.0)
if msg is None:
continue
if msg.error():
# Error or event
if msg.error().code() == KafkaError._PARTITION_EOF:
# End of partition event
sys.stderr.write('%% %s [%d] reached end at offset %d\n' %
(msg.topic(), msg.partition(), msg.offset()))
elif msg.error():
# Error
raise KafkaException(msg.error())
else:
# Proper message
sys.stderr.write('%% %s [%d] at offset %d with key %s:\n' %
(msg.topic(), msg.partition(), msg.offset(),
str(msg.key())))
print(msg.value())
try:
movies.insert_one(eval(msg.value().decode('utf-8')))
except:
movies.insert_one({"err_flag":True, "msg":str(msg.value())})
except KeyboardInterrupt:
sys.stderr.write('%% Aborted by user\n')
# Close down consumer to commit final offsets.
c.close()
```
```python colab={"base_uri": "https://localhost:8080/"} id="FMAspFyRoVKq" outputId="f08c8f0a-45d4-4404-acf5-9bc17a5311e6"
!python consumer.py
```
# Grid
Grid layout component: a grid layout implemented with flexbox. [demo](https://myronliu347.github.io/vue-carbon/#!/grid)
----
## Usage
Basic usage
```html
<grid-row gutter>
<grid-col width="50">.col-50</grid-col>
<grid-col width="50">.col-50</grid-col>
</grid-row>
```
Set the width on tablets
```html
<grid-row gutter>
<grid-col width="50" tablet="25">.col-50.tablet-25</grid-col>
<grid-col width="50" tablet="25">.col-50.tablet-25</grid-col>
<grid-col width="50" tablet="25">.col-50.tablet-25</grid-col>
<grid-col width="50" tablet="25">.col-50.tablet-25</grid-col>
</grid-row>
```
## API
grid-row
| Attribute | Description | Type | Accepted values | Default |
| :---- | :---- | :---- | :---- | :---- |
| gutter | whether columns have a gutter | Boolean | true/false | false |
grid-col
| Attribute | Description | Type | Accepted values | Default |
| :---- | :---- | :---- | :---- | :---- |
| width | width | Number | 0 - 100 | |
| tablet | width on tablets | Number | 0 - 100 | |
# Hello world
I'm glad you are here. I plan to talk about ...
# Water Area
Given an array of non-negative integers where each non-zero integer represents the height of a pillar of width `1`.
Imagine water being poured over all of the pillars; write a function that returns the surface area of the water trapped between the pillars viewed from the front.
Note that spilled water should be ignored.
Sample Input
```go
heights = [0, 8, 0, 0, 5, 0, 0, 10, 0, 0, 1, 1, 0, 3]
```
Sample Output
```go
48
```
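One possible solution (a sketch, not part of the original problem statement): precompute, for every index, the tallest pillar to its left and to its right; the water level above each pillar is capped by the lower of those two walls.

```go
package main

import "fmt"

// waterArea returns the total water area trapped between the pillars.
func waterArea(heights []int) int {
	n := len(heights)
	if n == 0 {
		return 0
	}
	leftMax := make([]int, n)  // tallest pillar at or to the left of i
	rightMax := make([]int, n) // tallest pillar at or to the right of i
	leftMax[0] = heights[0]
	for i := 1; i < n; i++ {
		leftMax[i] = maxInt(leftMax[i-1], heights[i])
	}
	rightMax[n-1] = heights[n-1]
	for i := n - 2; i >= 0; i-- {
		rightMax[i] = maxInt(rightMax[i+1], heights[i])
	}
	total := 0
	for i := 0; i < n; i++ {
		// the water level above pillar i is capped by the lower wall
		level := minInt(leftMax[i], rightMax[i])
		total += level - heights[i]
	}
	return total
}

func maxInt(a, b int) int {
	if a > b {
		return a
	}
	return b
}

func minInt(a, b int) int {
	if a < b {
		return a
	}
	return b
}

func main() {
	heights := []int{0, 8, 0, 0, 5, 0, 0, 10, 0, 0, 1, 1, 0, 3}
	fmt.Println(waterArea(heights)) // 48
}
```

This runs in O(n) time and O(n) space; a two-pointer variant can bring the extra space down to O(1).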
#### Make a directory through the terminal
Use `mkdir "directory name"` to create a directory. To create a directory inside another directory, first open that directory in the terminal and then run the command there.
To create a data file (for example a JSON, Python, Node, or database file), use `touch` with the file name followed by the extension you want (such as .js, .py, .json, or .db):
touch "file name"
If you want to remove a directory through the terminal, run `rmdir` with the name of the directory you want to remove:
rmdir "directory name"
If you want to remove a file from a directory through the terminal, run `rm` with the name of the file you want to remove:
rm -rf "file_name"
If you want to increase your system volume from the terminal, run this command:
pactl set-sink-volume 0 +50%
If you want to check the contents of the current directory through the terminal, run `ls`.
If your project uses a package manager (such as npm), initialize it by writing the package manager's name followed by `init` in the terminal,
like `npm init`.
There are many package ecosystems, such as git, npm, and PyPI.
If you want to install a package and also save it as a dependency, add `--save` to the install command,
like `npm install express --save`.
To give read and write access to a directory on Ubuntu, open a terminal and run:
sudo chmod -R 777 directory_path
To get the PID of a running program: pidof `app_name`
To copy a file from a server and paste it onto your local machine:
scp -r server_user@server_ip:server_file_path local_folder_path
To copy a file or directory:
cp -a test.json new_file_name.json
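The basic file and directory commands above can be combined into a short session (the names here are hypothetical):

```shell
mkdir my_project            # create a directory
cd my_project               # open the directory
touch app.js data.json      # create files with the desired extensions
ls                          # check the directory contents
rm -rf app.js data.json     # remove the files
cd .. && rmdir my_project   # remove the now-empty directory
```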
---
title: "The laughing machine: Predicting humor in video"
date: 2021-01-01
publishDate: 2021-04-01T02:47:13.596159Z
authors: ["Yuta Kayatani", "Zekun Yang", "Mayu Otani", "Noa Garcia", "Chenhui Chu", "Yuta Nakashima", "Haruo Takemura"]
publication_types: ["1"]
abstract: "Humor is a very important communication tool; yet, it is an open problem for machines to understand humor. In this paper, we build a new multimodal dataset for humor prediction that includes subtitles and video frames, as well as humor labels associated with video's timestamps. On top of it, we present a model to predict whether a subtitle causes laughter. Our model uses the visual modality through facial expression and character name recognition, together with the verbal modality, to explore how the visual modality helps. In addition, we use an attention mechanism to adjust the weight for each modality to facilitate humor prediction. Interestingly, our experimental results show that the performance boost by combinations of different modalities, and the attention mechanism and the model mostly relies on the verbal modality."
featured: false
publication: "*Proceedings - IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)*"
url_pdf: "https://openaccess.thecvf.com/content/WACV2021/html/Kayatani_The_Laughing_Machine_Predicting_Humor_in_Video_WACV_2021_paper.html"
---
---
description: The IX509NameValuePairs interface exposes the following methods.
ms.assetid: BA722088-17B0-4989-9AD8-BB875F3097CC
title: IX509NameValuePairs Methods
ms.topic: reference
ms.date: 05/31/2018
---
# IX509NameValuePairs Methods
The [**IX509NameValuePairs**](/windows/desktop/api/CertEnroll/nn-certenroll-ix509namevaluepairs) interface exposes the following methods.
## In this section
- [**Add Method**](/windows/desktop/api/CertEnroll/nf-certenroll-ix509namevaluepairs-add)
- [**Clear Method**](/windows/desktop/api/CertEnroll/nf-certenroll-ix509namevaluepairs-clear)
- [**Remove Method**](/windows/desktop/api/CertEnroll/nf-certenroll-ix509namevaluepairs-remove)
---
title: Configure a training run
titleSuffix: Azure Machine Learning
description: Train your machine learning model on various training environments (compute targets). You can easily switch between training environments.
services: machine-learning
author: sdgilley
ms.author: sgilley
ms.reviewer: sgilley
ms.service: machine-learning
ms.subservice: core
ms.date: 09/28/2020
ms.topic: conceptual
ms.custom: how-to, devx-track-python, contperf-fy21q1
ms.openlocfilehash: f38fe7d847754247f8c1510527b3ffe026c20be5
ms.sourcegitcommit: 32e0fedb80b5a5ed0d2336cea18c3ec3b5015ca1
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 03/30/2021
ms.locfileid: "102518504"
---
# <a name="configure-and-submit-training-runs"></a>Configure and submit training runs
In this article, you learn how to configure and submit Azure Machine Learning runs to train your models. Snippets of code explain the key parts of configuration and submission of a training script. Then use one of the [example notebooks](#notebooks) to find the full end-to-end working examples.
When training, it is common to start on your local computer, and later scale out to a cloud-based cluster. With Azure Machine Learning, you can run your script on various compute targets without having to change your training script.
All you need to do is define the environment for each compute target within a **script run configuration**. Then, when you want to run your training experiment on a different compute target, specify the run configuration for that compute.
## <a name="prerequisites"></a>Prerequisites
* If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree) today
* The [Azure Machine Learning SDK for Python](/python/api/overview/azure/ml/install) (>= 1.13.0)
* An [Azure Machine Learning workspace](how-to-manage-workspace.md), `ws`
* A compute target, `my_compute_target`. [Create a compute target](how-to-create-attach-compute-studio.md)
## <a name="whats-a-script-run-configuration"></a><a name="whats-a-run-configuration"></a>What's a script run configuration?
A [ScriptRunConfig](/python/api/azureml-core/azureml.core.scriptrunconfig) is used to configure the information necessary for submitting a training run as part of an experiment.
You submit your training experiment with a ScriptRunConfig object. This object includes the:
* **source_directory**: The source directory that contains your training script
* **script**: The training script to run
* **compute_target**: The compute target to run on
* **environment**: The environment to use when running the script
* and some additional configurable options (see the [reference documentation](/python/api/azureml-core/azureml.core.scriptrunconfig) for more information)
## <a name="train-your-model"></a><a id="submit"></a>Train your model
The code pattern to submit a training run is the same for all types of compute targets:
1. Create an experiment to run
1. Create an environment where the script will run
1. Create a ScriptRunConfig, which specifies the compute target and environment
1. Submit the run
1. Wait for the run to complete
Or you can:
* Submit a run for [hyperparameter tuning](how-to-tune-hyperparameters.md).
* Submit an experiment via the [VS Code extension](tutorial-train-deploy-image-classification-model-vscode.md#train-the-model).
## <a name="create-an-experiment"></a>Create an experiment
Create an experiment in your workspace.
```python
from azureml.core import Experiment
experiment_name = 'my_experiment'
experiment = Experiment(workspace=ws, name=experiment_name)
```
## <a name="select-a-compute-target"></a>Select a compute target
Select the compute target where your training script will run on. If no compute target is specified in the ScriptRunConfig, or if `compute_target='local'`, Azure ML will execute your script locally.
The example code in this article assumes that you have already created a compute target `my_compute_target` from the "Prerequisites" section.
>[!Note]
>Azure Databricks is not supported as a compute target for model training. You can use Azure Databricks for data preparation and deployment tasks.
## <a name="create-an-environment"></a>Create an environment
Azure Machine Learning [environments](concept-environments.md) are an encapsulation of the environment where your machine learning training happens. They specify the Python packages, Docker image, environment variables, and software settings around your training and scoring scripts. They also specify runtimes (Python, Spark, or Docker).
You can define your own environment, or use an Azure ML curated environment. [Curated environments](./how-to-use-environments.md#use-a-curated-environment) are predefined environments that are available in your workspace by default. These environments are backed by cached Docker images, which reduces the run preparation cost. See [Azure Machine Learning curated environments](./resource-curated-environments.md) for the full list of available curated environments.
For a remote compute target, you can use one of these popular curated environments to start with:
```python
from azureml.core import Workspace, Environment
ws = Workspace.from_config()
myenv = Environment.get(workspace=ws, name="AzureML-Minimal")
```
Aby uzyskać więcej informacji i szczegółowe informacje o środowiskach, zobacz [tworzenie & używanie środowisk oprogramowania w programie Azure Machine Learning](how-to-use-environments.md).
### <a name="local-compute-target"></a><a name="local"></a>Lokalne miejsce docelowe obliczeń
Jeśli obiektem docelowym obliczeń jest komputer **lokalny**, użytkownik jest odpowiedzialny za zapewnienie, że wszystkie niezbędne pakiety są dostępne w środowisku języka Python, w którym działa skrypt. Użyj, `python.user_managed_dependencies` Aby użyć bieżącego środowiska Python (lub języka Python w określonej ścieżce).
```python
from azureml.core import Environment
myenv = Environment("user-managed-env")
myenv.python.user_managed_dependencies = True
# You can choose a specific Python environment by pointing to a Python path
# myenv.python.interpreter_path = '/home/johndoe/miniconda3/envs/myenv/bin/python'
```
## <a name="create-the-script-run-configuration"></a>Utwórz konfigurację uruchamiania skryptu
Teraz, gdy masz element docelowy obliczeń ( `my_compute_target` ) i środowisko ( `myenv` ), Utwórz konfigurację uruchamiania skryptu, która uruchamia skrypt szkoleniowy ( `train.py` ) znajdujący się w `project_folder` katalogu:
```python
from azureml.core import ScriptRunConfig
src = ScriptRunConfig(source_directory=project_folder,
script='train.py',
compute_target=my_compute_target,
environment=myenv)
# Set compute target
# Skip this if you are running on your local computer
script_run_config.run_config.target = my_compute_target
```
Jeśli nie określisz środowiska, zostanie utworzone środowisko domyślne.
Jeśli masz argumenty wiersza polecenia, które chcesz przekazać do skryptu szkoleniowego, możesz je określić za pomocą **`arguments`** parametru konstruktora ScriptRunConfig, np. `arguments=['--arg1', arg1_val, '--arg2', arg2_val]` .
Aby zastąpić domyślny maksymalny czas dozwolony dla uruchomienia, można to zrobić za pomocą **`max_run_duration_seconds`** parametru. System podejmie próbę automatycznego anulowania przebiegu, jeśli trwa dłużej niż ta wartość.
### <a name="specify-a-distributed-job-configuration"></a>Określ konfigurację zadania rozproszonego
Jeśli chcesz uruchomić zadanie szkolenia rozproszonego, podaj do parametru konfigurację rozproszoną dla określonego zadania **`distributed_job_config`** . Obsługiwane typy konfiguracji to [MpiConfiguration](/python/api/azureml-core/azureml.core.runconfig.mpiconfiguration), [TensorflowConfiguration](/python/api/azureml-core/azureml.core.runconfig.tensorflowconfiguration)i [PyTorchConfiguration](/python/api/azureml-core/azureml.core.runconfig.pytorchconfiguration).
Aby uzyskać więcej informacji i przykłady uruchamiania rozdystrybuowanych zadań Horovod, TensorFlow i PyTorch, zobacz:
* [Szkolenie modeli TensorFlow](./how-to-train-tensorflow.md#distributed-training)
* [Szkolenie modeli PyTorch](./how-to-train-pytorch.md#distributed-training)
## <a name="submit-the-experiment"></a>Przesyłanie eksperymentu
```python
run = experiment.submit(config=src)
run.wait_for_completion(show_output=True)
```
> [!IMPORTANT]
> Po przesłaniu przebiegu szkoleniowego tworzona jest migawka katalogu zawierającego skrypty szkoleniowe i wysyłane do obiektu docelowego obliczeń. Jest on również przechowywany w ramach eksperymentu w obszarze roboczym. Jeśli zmienisz pliki i prześlesz ponownie uruchomienie, zostaną przekazane tylko zmienione pliki.
>
> [!INCLUDE [amlinclude-info](../../includes/machine-learning-amlignore-gitignore.md)]
>
> Aby uzyskać więcej informacji na [temat migawek, zobacz](concept-azure-machine-learning-architecture.md#snapshots)snapshots.
> [!IMPORTANT]
> **Foldery specjalne** Dwa foldery, dane *wyjściowe* i *dzienniki*, otrzymują specjalne traktowanie według Azure Machine Learning. Podczas szkolenia podczas pisania plików do folderów o nazwie dane *wyjściowe* i *dzienniki* , które są względne dla katalogu głównego ( `./outputs` i `./logs` odpowiednio), pliki zostaną automatycznie przekazane do historii uruchamiania, dzięki czemu będziesz mieć do nich dostęp po zakończeniu przebiegu.
>
> Aby tworzyć artefakty podczas szkoleń (takich jak pliki modelu, punkty kontrolne, pliki danych lub rysunki), Zapisz je w `./outputs` folderze.
>
> Podobnie można napisać wszystkie dzienniki z poziomu szkolenia szkoleniowego do `./logs` folderu. Aby korzystać z [integracji TensorBoard](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/track-and-monitor-experiments/tensorboard/export-run-history-to-tensorboard/export-run-history-to-tensorboard.ipynb) Azure Machine Learning upewnij się, że piszesz dzienniki TensorBoard w tym folderze. Gdy przebieg jest w toku, będzie można uruchamiać TensorBoard i przesyłać strumieniowo te dzienniki. Później będzie można przywrócić dzienniki z dowolnego poprzedniego przebiegu.
>
> Na przykład, aby pobrać plik zapisany *w folderze* Outputs na komputer lokalny po uruchomieniu szkolenia zdalnego: `run.download_file(name='outputs/my_output_file', output_file_path='my_destination_path')`
## <a name="git-tracking-and-integration"></a><a id="gitintegration"></a>Śledzenie i integracja usługi git
Po rozpoczęciu szkolenia w przypadku, gdy katalog źródłowy jest lokalnym repozytorium git, informacje o repozytorium są przechowywane w historii uruchamiania. Aby uzyskać więcej informacji, zobacz Integracja z usługą [git dla Azure Machine Learning](concept-train-model-git-integration.md).
## <a name="notebook-examples"></a><a name="notebooks"></a>Przykłady notesu
Zobacz te notesy, aby zapoznać się z przykładami konfigurowania przebiegów w różnych scenariuszach szkoleniowych:
* [Szkolenie dotyczące różnych obiektów docelowych obliczeń](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/training)
* [Szkolenia przy użyciu platform ML](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/ml-frameworks)
* [Samouczki/IMG-Classification-part1-Training. ipynb](https://github.com/Azure/MachineLearningNotebooks/blob/master/tutorials/image-classification-mnist-data/img-classification-part1-training.ipynb)
[!INCLUDE [aml-clone-in-azure-notebook](../../includes/aml-clone-for-examples.md)]
## <a name="troubleshooting"></a>Rozwiązywanie problemów
* **Przebieg kończy się `jwt.exceptions.DecodeError` niepowodzeniem z**: dokładny komunikat o błędzie: `jwt.exceptions.DecodeError: It is required that you pass in a value for the "algorithms" argument when calling decode()` .
Rozważ uaktualnienie do najnowszej wersji platformy Azure Core: `pip install -U azureml-core` .
Jeśli używasz tego problemu w przypadku uruchomień lokalnych, sprawdź wersję programu PyJWT zainstalowaną w środowisku, w którym uruchamiasz program. Obsługiwane wersje programu PyJWT są < 2.0.0. Odinstaluj PyJWT ze środowiska, jeśli wersja jest >= 2.0.0. Możesz sprawdzić wersję programu PyJWT, odinstalować i zainstalować odpowiednią wersję w następujący sposób:
1. Uruchom powłokę poleceń, Aktywuj środowisko Conda, w którym zainstalowano usługę Azure Core.
2. Wprowadź `pip freeze` i Wyszukaj `PyJWT` , jeśli znaleziono, wyświetlana wersja powinna być < 2.0.0
3. Jeśli wyświetlana wersja nie jest obsługiwaną wersją, `pip uninstall PyJWT` w powłoce poleceń i wprowadź y w celu potwierdzenia.
4. Zainstaluj przy użyciu polecenia `pip install 'PyJWT<2.0.0'`
W przypadku przesyłania środowiska utworzonego przez użytkownika z uruchomionym programem należy rozważyć użycie najnowszej wersji rdzenia Azure w tym środowisku. Wersje >= 1.18.0 z platformy Azure-Core już PyJWT < 2.0.0. Jeśli potrzebujesz użyć wersji programu Azure Core < 1.18.0 w przesłanym środowisku, upewnij się, że określono PyJWT < 2.0.0 w zależnościach PIP.
* **ModuleErrors (Brak modułu o nazwie)**: Jeśli korzystasz z programu ModuleErrors podczas przesyłania eksperymentów na platformie Azure ml, skrypt szkoleniowy oczekuje, że pakiet zostanie zainstalowany, ale nie zostanie dodany. Po podaniu nazwy pakietu usługa Azure ML instaluje pakiet w środowisku używanym do pracy z szkoleniiem.
Jeśli używasz szacowania do przesyłania eksperymentów, możesz określić nazwę pakietu za pośrednictwem `pip_packages` lub `conda_packages` parametru w szacowania, na podstawie którego źródła chcesz zainstalować pakiet. Można również określić plik yml ze wszystkimi zależnościami przy użyciu `conda_dependencies_file` lub wyświetlić wszystkie wymagania dotyczące PIP w pliku txt przy użyciu `pip_requirements_file` parametru. Jeśli masz własny obiekt środowiska usługi Azure ML, który chcesz przesłonić domyślny obraz używany przez szacowania, możesz określić to środowisko za pośrednictwem `environment` parametru konstruktora szacowania.
Platforma Azure ML obsługuje obrazy platformy Docker i ich zawartość można zobaczyć w [kontenerach usługi Azure](https://github.com/Azure/AzureML-Containers).
Zależności specyficzne dla platformy są wymienione w odpowiedniej dokumentacji platformy:
* [Chainer](/python/api/azureml-train-core/azureml.train.dnn.chainer#remarks)
* [PyTorch](/python/api/azureml-train-core/azureml.train.dnn.pytorch#remarks)
* [TensorFlow](/python/api/azureml-train-core/azureml.train.dnn.tensorflow#remarks)
* [Skryptu sklearn](/python/api/azureml-train-core/azureml.train.sklearn.sklearn#remarks)
> [!Note]
> Jeśli uważasz, że określony pakiet jest wystarczająco powszechny do dodania do obrazów i środowisk konserwowanych platformy Azure, zgłoś problem w usłudze GitHub w [kontenerach usługi Azure](https://github.com/Azure/AzureML-Containers)ml.
* **NameError (nazwa niezdefiniowana), AttributeError (obiekt nie ma atrybutu)**: ten wyjątek powinien pochodzić ze skryptów szkoleniowych. Można przyjrzeć się plikom dziennika z Azure Portal, aby uzyskać więcej informacji na temat konkretnej nazwy niezdefiniowanej lub błędu atrybutu. Korzystając z zestawu SDK, można `run.get_details()` sprawdzić komunikat o błędzie. Spowoduje to wyświetlenie listy wszystkich plików dziennika wygenerowanych dla danego przebiegu. Upewnij się, że zapoznaj się z skryptem szkoleniowym i usuń błąd przed ponownym przesłaniem uruchomienia.
* **Usuwanie przebiegu lub eksperymentu**: eksperymenty można archiwizować przy użyciu metody [eksperyment. Archive](/python/api/azureml-core/azureml.core.experiment%28class%29#archive--) lub w widoku karty eksperymenty w programie Azure Machine Learning Studio Client za pośrednictwem przycisku "Archiwizuj eksperyment". Ta akcja ukrywa eksperyment z zapytań i widoków list, ale nie usuwa go.
Trwałe usuwanie pojedynczych eksperymentów lub przebiegów obecnie nie jest obsługiwane. Aby uzyskać więcej informacji na temat usuwania zasobów obszaru roboczego, zobacz [Eksportowanie lub usuwanie danych obszaru roboczego usługi Machine Learning](how-to-export-delete-data.md).
* **Dokument metryki jest zbyt duży**: Azure Machine Learning ma wewnętrzne limity rozmiaru obiektów metryk, które mogą być rejestrowane jednocześnie z poziomu przebiegu szkoleniowego. Jeśli w trakcie rejestrowania metryki z wartościami listy wystąpi błąd „Dokument metryki jest za duży”, spróbuj podzielić listę na mniejsze fragmenty, na przykład:
```python
run.log_list("my metric name", my_metric[:N])
run.log_list("my metric name", my_metric[N:])
```
Wewnętrznie usługa Azure ML łączy bloki z tą samą nazwą metryki w listę ciągłą.
* **Rozpoczęcie wykonywania obliczeń trwa długo**: obrazy platformy Docker dla obiektów docelowych obliczeń są ładowane z Azure Container Registry (ACR). Domyślnie Azure Machine Learning tworzy ACR, który korzysta z warstwy usługi *podstawowa* . Zmiana ACR dla obszaru roboczego na warstwę Standardowa lub Premium może skrócić czas potrzebny do kompilowania i ładowania obrazów. Aby uzyskać więcej informacji, zobacz [Azure Container Registry warstwy usług](../container-registry/container-registry-skus.md).
## <a name="next-steps"></a>Następne kroki
* [Samouczek: uczenie modelu](tutorial-train-models-with-aml.md) używa zarządzanego obiektu docelowego obliczeń do uczenia modelu.
* Zobacz, jak szkolić modele przy użyciu określonych platform ML, takich jak [Scikit-](how-to-train-scikit-learn.md)Learning, [TensorFlow](how-to-train-tensorflow.md)i [PyTorch](how-to-train-pytorch.md).
* Dowiedz się, jak [efektywnie dostrajać parametry](how-to-tune-hyperparameters.md) , aby tworzyć lepsze modele.
* Po uzyskaniu przeszkolonego modelu Dowiedz się, [jak i gdzie wdrażać modele](how-to-deploy-and-where.md).
* Wyświetl odwołanie do zestawu SDK [klasy ScriptRunConfig](/python/api/azureml-core/azureml.core.scriptrunconfig) .
* [Używanie Azure Machine Learning z sieciami wirtualnymi platformy Azure](./how-to-network-security-overview.md)
| 79.965812 | 641 | 0.804778 | pol_Latn | 0.999358 |
fd912c018987efb9d5a980bad3ec6b3905745c60 | 5,022 | md | Markdown | learn-pr/azure/choose-storage-approach-in-azure/includes/2-classify-data.md | OpenLocalizationTestOrg/PrivateRepo | dc4d005a7cdf2e3fbc1c5aa5b8bf7b573aef4b17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | learn-pr/azure/choose-storage-approach-in-azure/includes/2-classify-data.md | OpenLocalizationTestOrg/PrivateRepo | dc4d005a7cdf2e3fbc1c5aa5b8bf7b573aef4b17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | learn-pr/azure/choose-storage-approach-in-azure/includes/2-classify-data.md | OpenLocalizationTestOrg/PrivateRepo | dc4d005a7cdf2e3fbc1c5aa5b8bf7b573aef4b17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | An online retail business has different types of data. Each type of data may benefit from a different storage solution.
Application data can be classified in one of three ways: structured, semi-structured, and unstructured. Here, you'll learn how to classify your data so that you can choose the appropriate storage solution.
#### Approaches to storing data in the cloud
> [!VIDEO https://www.microsoft.com/videoplayer/embed/RE2yEuY]
## Structured data
Structured data is data that adheres to a schema, so all of the data has the same fields or properties. Structured data can be stored in a database table with rows and columns. Structured data relies on keys to indicate how one row in a table relates to data in another row of another table. Structured data is also referred to as relational data, as the data's schema defines the table of data, the fields in the table, and the clear relationship between the two.
Structured data is straightforward in that it's easy to enter, query, and analyze. All of the data follows the same format.
Examples of structured data include:
- Sensor data
- Financial data
## Semi-structured data
Semi-structured data is less organized than structured data, and is not stored in a relational format, as the fields do not neatly fit into tables, rows, and columns. Semi-structured data contains tags that make the organization and hierarchy of the data apparent. Semi-structured data is also referred to as non-relational or NoSQL data.
#### What is NoSQL?
> [!VIDEO https://www.microsoft.com/videoplayer/embed/RE2yEvd]
Examples of semi-structured data include:
- Key / Value pairs
- Graph data
- JSON files
- XML files
## Unstructured data
The organization of unstructured data is generally ambiguous. Unstructured data is often delivered in files, such as photos or videos. The video file itself may have an overall structure and come with semi-structured metadata, but the data that comprises the video itself is unstructured. Therefore, photos, videos, and other similar files are classified as unstructured data.
Examples of unstructured data include:
- Media files, such as photos, videos, and audio files
- Office files, such as Word documents
- Text files
- Log files
Now that you know the differences between each kind of data, let's look at the data sets used in an online retail business, and classify them.
## Product catalog data
Product catalog data for an online retail business is fairly structured in nature, as each product has a product SKU, a description, a quantity, a price, size options, color options, a photo, and possibly a video. So, this data appears relational to start with, as it all has the same structure. However, as you introduce new products or different kinds of products, you may want to add different fields as time goes on. For example, new tennis shoes you're carrying are Bluetooth-enabled, to relay sensor data from the shoe to a fitness app on the user’s phone. This appears to be a growing trend, and you want to enable customers to filter on "Bluetooth-enabled" shoes in the future. You don't want to go back and update all your existing shoe data with a Bluetooth-enabled property, you simply want to add it to new shoes.
With the addition of the Bluetooth-enabled property, your shoe data is no longer homogenous, as you've introduced differences in the schema. If this is the only exception you expect to encounter, you can go back and normalize the existing data so that all products included a "Bluetooth-enabled" field to maintain a structured, relational organization. However, if this is just one of many specialty fields that you envision supporting in the future, then the classification of the data is semi-structured. The data is organized by tags, but each product in the catalog can contain unique fields.
Data classification: **Semi-structured**
## Photos and videos
The photos and videos displayed on product pages are unstructured data. Although the media file may contain metadata, the body of the media file is unstructured.
Data classification: **Unstructured**
## Business data
Business analysts want to implement business intelligence to perform inventory pipeline evaluations and sales data reviews. In order to perform these operations, data from multiple months needs to be aggregated together, and then queried. Because of the need to aggregate similar data, this data must be structured, so that one month can be compared against the next.
Data classification: **Structured**
## Summary
Data may be classified in one of three ways: structured, semi-structured, and unstructured. Understanding the differences so that you can classify your own data will help you choose the correct storage solution.
To recap, structured data is organized data that neatly fits into rows and columns in tables. Semi-structured data is still organized and has clear properties and values, but there's variety to the data. Unstructured data doesn't fit neatly into tables, nor does it have a schema. | 69.75 | 825 | 0.793708 | eng_Latn | 0.999554 |
fd91acd3deeca12483e7f7205aa2fe8c3d4fc6d6 | 79 | md | Markdown | build/content/people/b/benjamin-van-durme.md | briemadu/semdial-proceedings | 59f13adcc73dc9433c3c07ee929fadd8d271b022 | [
"Apache-2.0"
] | null | null | null | build/content/people/b/benjamin-van-durme.md | briemadu/semdial-proceedings | 59f13adcc73dc9433c3c07ee929fadd8d271b022 | [
"Apache-2.0"
] | null | null | null | build/content/people/b/benjamin-van-durme.md | briemadu/semdial-proceedings | 59f13adcc73dc9433c3c07ee929fadd8d271b022 | [
"Apache-2.0"
] | 2 | 2021-09-16T07:16:15.000Z | 2021-10-30T06:41:55.000Z | ---
lastname: Van Durme
name: benjamin-van-durme
title: Benjamin Van Durme
---
| 13.166667 | 25 | 0.721519 | slv_Latn | 0.224922 |
fd92896f9ec24f72e280198be03d996cbef133b7 | 36 | md | Markdown | Counting/README.md | anniexmak/UNA-Stem-Camp-2017 | a112771c854367f97ac19fdd5c7d69cb0eb1136b | [
"BSD-2-Clause"
] | null | null | null | Counting/README.md | anniexmak/UNA-Stem-Camp-2017 | a112771c854367f97ac19fdd5c7d69cb0eb1136b | [
"BSD-2-Clause"
] | null | null | null | Counting/README.md | anniexmak/UNA-Stem-Camp-2017 | a112771c854367f97ac19fdd5c7d69cb0eb1136b | [
"BSD-2-Clause"
] | null | null | null | # Program #3 - Counting with a loop
| 18 | 35 | 0.694444 | eng_Latn | 0.999781 |
fd930b62bf65bfd2b9c2a300d0b3fbce265c9777 | 3,138 | md | Markdown | examples/parcel/README.md | JiaLiPassion/rules_nodejs | 2424d1e32b564fcc37b57d593b871461a62f3237 | [
"Apache-2.0"
] | 1 | 2021-06-20T18:37:14.000Z | 2021-06-20T18:37:14.000Z | examples/parcel/README.md | JiaLiPassion/rules_nodejs | 2424d1e32b564fcc37b57d593b871461a62f3237 | [
"Apache-2.0"
] | 12 | 2020-04-06T21:50:34.000Z | 2022-03-25T18:06:41.000Z | examples/parcel/README.md | JiaLiPassion/rules_nodejs | 2424d1e32b564fcc37b57d593b871461a62f3237 | [
"Apache-2.0"
] | 1 | 2021-01-15T23:29:36.000Z | 2021-01-15T23:29:36.000Z | # Parcel example
This example shows how to write a simple Bazel rule which wraps a binary from npm.
We chose Parcel because it's a build tool not yet supported in our own Bazel rules.
We start from the [Parcel Getting Started](https://parceljs.org/getting_started.html).
Parcel can do many jobs which overlap with Bazel, such as development serving and watching.
Parcel can do file-system watching, but this overlaps with [ibazel](https://github.com/bazelbuild/bazel-watcher) so this mode is probably undesirable under Bazel.
Also, Parcel has a development server.
This could be an alternative to the `ts_devserver` we recommend under Bazel.
It would need to be hosted properly to see generated Bazel outputs from other build steps.
See https://github.com/angular/angular-bazel-example/wiki/Running-a-devserver-under-Bazel
In this example we'll only use Parcel for production bundling, as documented at https://parceljs.org/production.html
## 1. Installing and running Parcel
The `package.json` file lists a `devDependency` on `parcel-bundler`.
Next, in `WORKSPACE` we run the `yarn_install` rule.
This fetches the packages into the Bazel output_base folder, here: `$(bazel info output_base)/external/npm`.
It also generates Bazel configuration files.
Since `parcel-bundler/package.json` declares a `bin: {"parcel"}` key, Bazel will generate a corresponding target.
```sh
$ bazel query --output=label_kind @npm//parcel-bundler/bin:*
alias rule @npm//parcel-bundler/bin:parcel
source file @npm//parcel-bundler/bin:BUILD.bazel
```
So with no other code, we can already run Parcel itself.
```sh
$ bazel run @npm//parcel-bundler/bin:parcel
Server running at http://localhost:1234
🚨 No entries found.
at Bundler.bundle (execroot/examples_parcel/bazel-out/k8-fastbuild/bin/external/npm/node_modules/parcel-bundler/parcel__bin.runfiles/npm/node_modules/parcel-bundler/src/Bundler.js:261:17)
```
## 2. Wrapping parcel in a rule (plugin)
Bazel's idiomatic naming scheme for rules is [name of tool]_[type of rule] where types of rules include "library", "binary", "test", "package", "bundle", etc.
Since Parcel is a bundler, we'll name our rule `parcel_bundle`.
The rule describes the inputs, not what to do with them.
In `BUILD.bazel` we show an example usage of the new rule.
In `parcel.bzl` you can see how the rule is implemented. It returns two outputs: the bundle file and a sourcemap.
To try the rule, run
```sh
$ bazel build :bundle
INFO: From Running Parcel to produce bazel-out/k8-fastbuild/bin/bundle.js:
✨ Built in 499ms.
bazel-out/k8-fastbuild/bin/bundle.js 1.3 KB 191ms
bazel-out/k8-fastbuild/bin/bundle.map 399 B 2ms
Target //:bundle up-to-date:
bazel-bin/bundle.js
bazel-bin/bundle.map
```
## 3. Testing the rule
Currently, we just demonstrate how to test it end-to-end by bundling `foo.js` and `bar.js` into a UMD bundle.
Then in `parcel.spec.js` we `require()` that bundle in Node.js and assert that it had the right side-effect.
(It prints `Hello, Bob`)
You can run the test:
```sh
$ bazel test :all
//:test (cached) PASSED in 0.3s
Executed 0 out of 1 test: 1 test passes.
```
| 38.268293 | 191 | 0.754621 | eng_Latn | 0.976044 |
fd93c485343ed5e00a54bebfc863894bec3e0632 | 3,138 | md | Markdown | content/reference/datatypes/SoftLayer_Auxiliary_Press_Release_About_Press_Release/_index.md | caberos/githubio_source | 2dee6f9568783846aef8a44d4fdb388b8808bd79 | [
"Apache-2.0"
] | 1 | 2018-12-07T16:04:02.000Z | 2018-12-07T16:04:02.000Z | content/reference/datatypes/SoftLayer_Auxiliary_Press_Release_About_Press_Release/_index.md | caberos/githubio_source | 2dee6f9568783846aef8a44d4fdb388b8808bd79 | [
"Apache-2.0"
] | 122 | 2018-07-31T22:45:01.000Z | 2022-02-22T20:24:54.000Z | content/reference/datatypes/SoftLayer_Auxiliary_Press_Release_About_Press_Release/_index.md | caberos/githubio_source | 2dee6f9568783846aef8a44d4fdb388b8808bd79 | [
"Apache-2.0"
] | 12 | 2018-07-23T22:21:48.000Z | 2022-01-04T15:09:45.000Z | ---
title: "SoftLayer_Auxiliary_Press_Release_About_Press_Release"
description: ""
layout: "datatype"
tags:
- "datatype"
- "sldn"
- "Auxiliary"
classes:
- "SoftLayer_Auxiliary_Press_Release_About_Press_Release"
---
# SoftLayer_Auxiliary_Press_Release_About_Press_Release
<div id='service-datatype'>
<ul id='sldn-reference-tabs'>
<li id='service'> <a href='/reference/services/SoftLayer_Auxiliary_Press_Release_About_Press_Release' >Service</a></li> <li id='datatype'> <a href='/reference/datatypes/SoftLayer_Auxiliary_Press_Release_About_Press_Release' >Datatype</a></li>
</ul>
</div>
## Description
### associatedMethods
* [SoftLayer_Auxiliary_Press_Release_About_Press_Release::getObject](/reference/services/SoftLayer_Auxiliary_Press_Release_About_Press_Release/getObject )
### seeAlso
* [SoftLayer_Auxiliary_Press_Release_About](/reference/services/SoftLayer_Auxiliary_Press_Release_About )
<!-- Filer BEGIN -->
<div class="view-filters">
<div class="clearfix">
<div class="search-input-box">
<input placeholder="Datatype Filter" onkeyup="titleSearch(inputId='prop-input', divId='properties', elementClass='prop-row')"
type="text" id="prop-input" value="" size="30" maxlength="128" class="form-text">
</div>
</div>
</div>
<!-- Filer END -->
<div id="properties" class="content">
<div id="localProperties" class="prop-content" >
## Local
<div class="prop-row">
-----
[id]: #id
#### [id]
A press release about cross
<span class="type-label">Type: </span>**integer**
</div>
<div class="prop-row">
-----
[pressReleaseAboutId]: #pressreleaseaboutid
#### [pressReleaseAboutId]
A press release about's internal
<span class="type-label">Type: </span>**integer**
</div>
<div class="prop-row">
-----
[pressReleaseId]: #pressreleaseid
#### [pressReleaseId]
A press release internal identifier.
<span class="type-label">Type: </span>**integer**
</div>
<div class="prop-row">
-----
[sortOrder]: #sortorder
#### [sortOrder]
The number that associated an about
<span class="type-label">Type: </span>**integer**
</div>
</div>
<!-- LOCAL PROPERTY END -->
<div id="relationalProperties" class="prop-content" >
## Relational
<div class="prop-row">
-----
[aboutParagraphs]: #aboutparagraphs
#### [aboutParagraphs]
<span class="type-label">Type: </span>**<a href='/reference/datatypes/SoftLayer_Auxiliary_Press_Release_About'>SoftLayer_Auxiliary_Press_Release_About[] </a>**
</div>
<div class="prop-row">
-----
[pressReleases]: #pressreleases
#### [pressReleases]
<span class="type-label">Type: </span>**<a href='/reference/datatypes/SoftLayer_Auxiliary_Press_Release'>SoftLayer_Auxiliary_Press_Release[] </a>**
</div>
## Count
<div class="prop-row">
-----
[aboutParagraphCount]: #aboutparagraphcount
#### [aboutParagraphCount]
A count of
<span class="type-label">Type: </span>**unsigned long**
</div>
<div class="prop-row">
-----
[pressReleaseCount]: #pressreleasecount
#### [pressReleaseCount]
A count of
<span class="type-label">Type: </span>**unsigned long**
</div>
</div>
| 21.944056 | 249 | 0.689293 | yue_Hant | 0.384334 |
fd93cc742e07bec09a319ca1f3eb01ecfc7e6cbf | 1,056 | md | Markdown | pythonetc/slots-docs.md | orsinium/notes | 3e08977ee3329c09ab8a46b0e35a45ab65b93910 | [
"CC-BY-4.0"
] | 35 | 2018-05-17T10:11:45.000Z | 2022-02-27T21:22:38.000Z | pythonetc/slots-docs.md | orsinium/notes | 3e08977ee3329c09ab8a46b0e35a45ab65b93910 | [
"CC-BY-4.0"
] | 7 | 2018-07-06T14:00:43.000Z | 2019-01-28T11:03:43.000Z | pythonetc/slots-docs.md | orsinium/notes | 3e08977ee3329c09ab8a46b0e35a45ab65b93910 | [
"CC-BY-4.0"
] | 13 | 2018-05-23T06:02:16.000Z | 2021-04-25T18:15:10.000Z | # dict as `__slots__`
Published: 05 November 2020, 18:00
`__slots__` [can be used to save memory](https://t.me/pythonetc/233). You can use any iterable as `__slots__` value, including `dict`. And starting from Python 3.8, you can use `dict` to specify docstrings for slotted attributes `__slots__`:
```python
class Channel:
"Telegram channel"
__slots__ = {
'slug': 'short name, without @',
'name': 'user-friendly name',
}
def __init__(self, slug, name):
self.slug = slug
self.name = name
inspect.getdoc(Channel.name)
# 'user-friendly name'
```
Also, `help(Channel)` lists docs for all slotted attributes:
```python
class Channel(builtins.object)
| Channel(slug, name)
|
| Telegram channel
|
| Methods defined here:
|
| __init__(self, slug, name)
| Initialize self. See help(type(self)) for accurate signature.
|
| ----------------------------------------------------------------------
| Data descriptors defined here:
|
| name
| user-friendly name
|
| slug
| short name, without @
```
| 24 | 241 | 0.621212 | eng_Latn | 0.953612 |
fd943be5a241e568f7657f2d3324915f40206e1f | 5,663 | md | Markdown | README.md | yangchong211/YCCardView | 757b1bf123f9474da483718ea32138ceff0fd594 | [
"Apache-2.0"
] | 236 | 2017-12-19T02:04:43.000Z | 2022-03-13T16:16:44.000Z | README.md | yangchong211/YCCardView | 757b1bf123f9474da483718ea32138ceff0fd594 | [
"Apache-2.0"
] | 10 | 2019-08-06T10:46:07.000Z | 2022-01-07T03:35:51.000Z | README.md | yangchong211/YCCardView | 757b1bf123f9474da483718ea32138ceff0fd594 | [
"Apache-2.0"
] | 39 | 2019-08-07T01:36:10.000Z | 2021-08-24T08:54:59.000Z | #### 目录介绍
- 01.阴影效果有哪些实现方式
- 02.实现阴影效果Api
- 03.设置阴影需要注意哪些
- 04.常见Shape实现阴影效果
- 05.如何使用该阴影控件
- 06.优化点分析
### 00.效果图展示



### 01.阴影效果有哪些实现方式
### 01. Ways to implement a shadow effect
- What are the ways to implement a shadow effect?
    - Option 1: use CardView, but the shadow color cannot be set
    - Option 2: stack shape drawables, but the UI effect is hard to tweak later
    - Option 3: sliced images from the UI designer
    - Option 4: a custom View
- Why rule out the first two options?
    - In the first option, CardView's gradient and shadow are hard to control: it only supports linear or ring-shaped gradients, which does not meet the need, because a shadow is essentially a rim of very light color wrapping the view, roughly uniform in color around a rectangular frame. CardView also has many limitations; for example, you cannot change the shadow color or its depth. So this approach cannot meet the requirement.
    - The second option, stacking shapes, can produce a shadow effect, but it affects the UI: the shadow part occupies pixels, and it is inflexible.
    - For the third option I asked a UI designer. Their answer was that with sliced images the redline annotations are very hard to produce; a good designer is highly sensitive to pixels, and a single off-looking pixel on screen is intolerable.
    - In the open-source sample code below, I will demonstrate the shadow effects produced by each of these approaches one by one.
- On articles about shadow effects found online
    - However deep a technique is, it exists to serve requirements; that is, it must be practical and usable in real development. This article does not explain the theory of shadows abstractly, such as how offsetting light rays in 3D space produces shadow parallax; I read several articles online and still could not follow them. Instead, this post achieves the expected effect directly through API calls.
- Does the shadow occupy space?
    - With CardView the shadow does not occupy space, but its color and effect cannot be customized
    - With a shape drawable the shadow color can be set, but the shadow occupies space
- 思考一下如何实现View阴影效果?
- 首先要明确阴影的实现思路是什么,其实就是颜色导致的视觉错觉。说白了就是在你的Card周围画一个渐变的体现立体感的颜色。基于上述思路,我们在一个在一个view上画一个矩形的图形,让他周围有渐变色的阴影即可。于是我们想起几个API:
- 类:Paint 用于在Android上画图的类,相当于画笔
- 类:Canvas 相当于画布,Android上的view的绘制都与他相关
- 方法:paint.setShadowLayer可以给绘制的图形增加阴影,还可以设置阴影的颜色
- paint.setShadowLayer(float radius, float dx, float dy, int shadowColor);
- 这个方法可以达到这样一个效果,在使用canvas画图时给视图顺带上一层阴影效果。
- 简单介绍一下这几个参数:
- radius: 阴影半径,主要可以控制阴影的模糊效果以及阴影扩散出去的大小。
- dx:阴影在X轴方向上的偏移量
- dy: 阴影在Y轴方向上的偏移量
- shadowColor: 阴影颜色。
- 终于找到了设置颜色的,通过设置shadowColor来控制视图的阴影颜色。
### 03.设置阴影需要注意哪些
- 其中涉及到几个属性,阴影的宽度,view到Viewgroup的距离,如果视图和父布局一样大的话,那阴影就不好显示,如果要能够显示出来就必须设置clipChildren=false。
- 还有就是视图自带的圆角,大部分背景都是有圆角的,比如上图中的圆角,需要达到高度还原阴影的效果就是的阴影的圆角和背景保持一致。
### 04.常见Shape实现阴影效果
- 多个drawable叠加
- 使用layer-list可以将多个drawable按照顺序层叠在一起显示,默认情况下,所有的item中的drawable都会自动根据它附上view的大小而进行缩放,layer-list中的item是按照顺序从下往上叠加的,即先定义的item在下面,后面的依次往上面叠放
- 阴影效果代码如下所示
- 这里有多层,就省略了一些。然后直接通过设置控件的background属性即可实现。
```
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">
    <item>
        <shape android:shape="rectangle">
            <solid android:color="@color/indexShadowColor_1" />
            <corners android:radius="5dip" />
            <padding
                android:bottom="1dp"
                android:left="1dp"
                android:right="1dp"
                android:top="1dp" />
        </shape>
    </item>
    <item>
        <shape android:shape="rectangle">
            <solid android:color="@color/indexShadowColor_2" />
            <corners android:radius="5dip" />
            <padding
                android:bottom="1dp"
                android:left="1dp"
                android:right="1dp"
                android:top="1dp" />
        </shape>
    </item>
    <!-- …… -->
    <item>
        <shape android:shape="rectangle">
            <corners android:radius="5dip" />
            <solid android:color="@color/indexColor" />
        </shape>
    </item>
</layer-list>
```
### 05. How to use this shadow widget
- Very simple, as shown below:
```
<com.ns.yc.yccardviewlib.shadow.ShadowLayout
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_gravity="center_horizontal"
    android:layout_marginTop="10dp"
    app:yc_cornerRadius="18dp"
    app:yc_dx="0dp"
    app:yc_dy="0dp"
    app:yc_shadowColor="#2a000000"
    app:yc_shadowLimit="5dp">

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="36dp"
        android:background="@drawable/shape_show_"
        android:gravity="center"
        android:paddingLeft="10dp"
        android:paddingRight="10dp"
        android:text="Fully rounded corners"
        android:textColor="#000" />
</com.ns.yc.yccardviewlib.shadow.ShadowLayout>
```
### 06. Optimization notes
- As the createShadowBitmap method shows, a Bitmap object has to be created. Bitmaps easily inflate memory; if the shadow is applied to RecyclerView items, repeated creation should be avoided, which is where a cache comes in. So the code above can be optimized one step further.
- First create a Key class to serve as the map's key. Why use a Key object as the map key? The idea is borrowed from Glide's image cache: construct the Key with the bitmap's name, width and height, and override hashCode and equals.
- Then, for the get and put operations, look entries up by Key. Note: two Bitmaps count as the same only when all three attributes — height, width and name — match.
- For more details, see this post: https://juejin.im/post/5d495cfef265da03a31d1fba
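- The Glide-style cache key just described can be sketched in plain Java. This is a hypothetical illustration — class and field names are assumptions, not the library's actual code, and Android's Bitmap is replaced by a generic value type so the idea stands on its own:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;
import java.util.function.Supplier;

// Hypothetical cache key: a bitmap counts as "the same"
// only when name, width and height all match.
final class ShadowKey {
    private final String name;
    private final int width;
    private final int height;

    ShadowKey(String name, int width, int height) {
        this.name = name;
        this.width = width;
        this.height = height;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof ShadowKey)) return false;
        ShadowKey other = (ShadowKey) o;
        return width == other.width
                && height == other.height
                && name.equals(other.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, width, height);
    }
}

// Hypothetical cache: B would be Bitmap on Android.
final class ShadowCache<B> {
    private final Map<ShadowKey, B> cache = new HashMap<>();

    // Return the cached value, or create and remember it.
    B getOrCreate(ShadowKey key, Supplier<B> creator) {
        return cache.computeIfAbsent(key, k -> creator.get());
    }

    int size() {
        return cache.size();
    }
}
```

- Two lookups with the same (name, width, height) hit the same entry, so the expensive bitmap is created only once per distinct key.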
#### 01. Blog index links
- 1. [Tech blog index](https://www.jianshu.com/p/614cb839182c)
- 2. [Open-source project index](https://blog.csdn.net/m0_37700275/article/details/80863574)
- 3. [Life blog index](https://blog.csdn.net/m0_37700275/article/details/79832978)
- 4. [Ximalaya audio index](https://www.jianshu.com/p/f665de16d1eb)
- 5. [Other indexes](https://www.jianshu.com/p/53017c3fc75d)
#### 02. About my blogs
- GitHub: https://github.com/yangchong211
- Zhihu: https://www.zhihu.com/people/yczbj/activities
- Jianshu: http://www.jianshu.com/u/b7b2c6ed9284
- CSDN: http://my.csdn.net/m0_37700275
- Ximalaya audiobooks: http://www.ximalaya.com/zhubo/71989305/
- OSChina: https://my.oschina.net/zbj1618/blog
- jcodecraeer: http://www.jcodecraeer.com/member/content_list.php?channelid=1
- Email: yangchong211@163.com
- Aliyun blog: https://yq.aliyun.com/users/article?spm=5176.100-239.headeruserinfo.3.dT4bcV
- SegmentFault: https://segmentfault.com/u/xiangjianyu/articles
- Juejin: https://juejin.im/user/5939433efe88c2006afa0c6e
| 33.91018 | 160 | 0.694155 | yue_Hant | 0.471585 |
# Signing Up
New students can enter a course in one of two ways:
1. A school admin [adds them via the students list](/students?id=adding-new-students).
2. [Public sign up is enabled](/courses?id=creating-courses) for a course.
## Signing Up For A Course
When the public signup option is enabled on a course, the course's public page gains an _Apply_ button.

Students can read about the course before enrolling in it by supplying their name and email address. Their email address
will be confirmed before they're added to the course, and they'll
[begin the course in the first level](/curriculum_editor?id=what-are-levels).
## Tagging Public Signups
If you're sharing the link to a course's page, you may want to include a `tag` parameter in the URL to track sign-ups
from that URL. To tag a new sign-up, simply include the name of an existing student tag in the URL like so:
```
https://my.school.domain/courses/ID?tag=existing-tag
```
1. This only works for _existing_ tags - this prevents misuse of the feature.
2. If your tag name contains spaces, make sure that [it's URL-encoded](http://www.utilities-online.info/urlencode/).
Students who sign up to a course after visiting this link will have the tag assigned to them.
| 45.322581 | 168 | 0.774377 | eng_Latn | 0.99655 |