## Module P5.Structure.Structure
#### `loop`
``` purescript
loop :: P5 -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/loop)
#### `noLoop`
``` purescript
noLoop :: P5 -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/noLoop)
#### `pop`
``` purescript
pop :: P5 -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/pop)
#### `preload`
``` purescript
preload :: P5 -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/preload)
#### `push`
``` purescript
push :: P5 -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/push)
#### `redraw`
``` purescript
redraw :: P5 -> (Maybe Int) -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/redraw)
#### `remove`
``` purescript
remove :: P5 -> (Effect Unit)
```
[p5js.org documentation](https://p5js.org/reference/#/p5/remove)
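The functions above pair naturally: `noLoop` stops the continuous draw loop, and `redraw` then renders on demand. Below is a minimal, untested sketch of that pattern. It assumes the `P5` handle type is exported from `P5.Types` and that a handle `p` is available from the sketch's setup; the module and function names are otherwise taken from this page.

``` purescript
module Example where

import Prelude
import Data.Maybe (Maybe(..))
import Effect (Effect)
import P5.Types (P5)
import P5.Structure.Structure (noLoop, redraw)

-- Stop the automatic draw loop, then request a single extra
-- execution of the draw callback (Just 1 ~ redraw(1) in p5.js).
pauseAndStep :: P5 -> Effect Unit
pauseAndStep p = do
  noLoop p
  redraw p (Just 1)
```

Passing `Nothing` to `redraw` corresponds to calling `redraw()` in p5.js without an argument, i.e. a single redraw.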
# Change Log
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
## [1.6.4](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.6.3...@furo/route@1.6.4) (2021-04-01)
**Note:** Version bump only for package @furo/route
## [1.6.3](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.6.2...@furo/route@1.6.3) (2021-03-23)
**Note:** Version bump only for package @furo/route
## [1.6.2](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.6.1...@furo/route@1.6.2) (2021-03-17)
**Note:** Version bump only for package @furo/route
## [1.6.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.6.0...@furo/route@1.6.1) (2021-03-16)
**Note:** Version bump only for package @furo/route
# [1.6.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.5.1...@furo/route@1.6.0) (2021-03-12)
### Features
* added furo-location-updater ([1afdae7](https://github.com/theNorstroem/FuroBaseComponents/commit/1afdae749b0ea9ee1040e5ce49b2c23264319722))
## [1.5.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.5.0...@furo/route@1.5.1) (2021-03-03)
**Note:** Version bump only for package @furo/route
# [1.5.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.21...@furo/route@1.5.0) (2021-03-01)
### Features
* furo-location-updater ([b4b5ff7](https://github.com/theNorstroem/FuroBaseComponents/commit/b4b5ff7452c821b450c8ca370a9c1a6b76447198))
* furo-location-updater ([1c0fd4b](https://github.com/theNorstroem/FuroBaseComponents/commit/1c0fd4b084d5101501c79344f1257c90ef04ed72))
## [1.4.21](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.20...@furo/route@1.4.21) (2021-02-25)
**Note:** Version bump only for package @furo/route
## [1.4.20](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.19...@furo/route@1.4.20) (2021-02-25)
**Note:** Version bump only for package @furo/route
## [1.4.19](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.18...@furo/route@1.4.19) (2021-02-12)
### Bug Fixes
* init invisible components with display:none ([bdf3464](https://github.com/theNorstroem/FuroBaseComponents/commit/bdf3464a946130ced466bf496eed65ee7a9e1709))
## [1.4.18](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.17...@furo/route@1.4.18) (2021-02-09)
**Note:** Version bump only for package @furo/route
## [1.4.17](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.16...@furo/route@1.4.17) (2021-02-02)
**Note:** Version bump only for package @furo/route
## [1.4.16](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.15...@furo/route@1.4.16) (2021-01-31)
**Note:** Version bump only for package @furo/route
## [1.4.15](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.14...@furo/route@1.4.15) (2021-01-22)
**Note:** Version bump only for package @furo/route
## [1.4.14](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.13...@furo/route@1.4.14) (2021-01-12)
**Note:** Version bump only for package @furo/route
## [1.4.13](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.12...@furo/route@1.4.13) (2021-01-12)
**Note:** Version bump only for package @furo/route
## [1.4.12](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.11...@furo/route@1.4.12) (2021-01-05)
**Note:** Version bump only for package @furo/route
## [1.4.11](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.10...@furo/route@1.4.11) (2020-12-24)
**Note:** Version bump only for package @furo/route
## [1.4.10](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.9...@furo/route@1.4.10) (2020-12-17)
**Note:** Version bump only for package @furo/route
## [1.4.9](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.8...@furo/route@1.4.9) (2020-12-10)
**Note:** Version bump only for package @furo/route
## [1.4.8](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.7...@furo/route@1.4.8) (2020-12-01)
**Note:** Version bump only for package @furo/route
## [1.4.7](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.6...@furo/route@1.4.7) (2020-11-27)
**Note:** Version bump only for package @furo/route
## [1.4.6](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.5...@furo/route@1.4.6) (2020-11-24)
**Note:** Version bump only for package @furo/route
## [1.4.5](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.4...@furo/route@1.4.5) (2020-11-20)
**Note:** Version bump only for package @furo/route
## [1.4.4](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.3...@furo/route@1.4.4) (2020-11-14)
**Note:** Version bump only for package @furo/route
## [1.4.3](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.2...@furo/route@1.4.3) (2020-11-12)
**Note:** Version bump only for package @furo/route
## [1.4.2](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.1...@furo/route@1.4.2) (2020-11-05)
**Note:** Version bump only for package @furo/route
## [1.4.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.4.0...@furo/route@1.4.1) (2020-11-04)
**Note:** Version bump only for package @furo/route
# [1.4.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.27...@furo/route@1.4.0) (2020-10-30)
### Features
* qp-changer with clearable qps ([d86b502](https://github.com/theNorstroem/FuroBaseComponents/commit/d86b502b302333b1d1758595657eeda00bc86fd2))
* qp-changer with clearable qps ([069f1d0](https://github.com/theNorstroem/FuroBaseComponents/commit/069f1d0549275f774333f0a97537e16e644b74b5))
## [1.3.27](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.26...@furo/route@1.3.27) (2020-10-28)
**Note:** Version bump only for package @furo/route
## [1.3.26](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.25...@furo/route@1.3.26) (2020-10-27)
**Note:** Version bump only for package @furo/route
## [1.3.25](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.24...@furo/route@1.3.25) (2020-10-20)
### Bug Fixes
* furo location sent empty qp on location change with same qp, hash and path. Not sending the sub events is ok, but not sending the data is not ok ([27e0c5e](https://github.com/theNorstroem/FuroBaseComponents/commit/27e0c5e3a1c13bbf8a84d413f03830d368bdae8c))
## [1.3.24](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.23...@furo/route@1.3.24) (2020-10-08)
**Note:** Version bump only for package @furo/route
## [1.3.23](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.22...@furo/route@1.3.23) (2020-09-14)
**Note:** Version bump only for package @furo/route
## [1.3.22](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.21...@furo/route@1.3.22) (2020-09-03)
**Note:** Version bump only for package @furo/route
## [1.3.21](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.20...@furo/route@1.3.21) (2020-08-27)
### Bug Fixes
* qp-changer fires too much ([3ac8d42](https://github.com/theNorstroem/FuroBaseComponents/commit/3ac8d42d686d844af5c6eea48f0053a5cdcbdd3b))
## [1.3.20](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.19...@furo/route@1.3.20) (2020-08-11)
**Note:** Version bump only for package @furo/route
## [1.3.19](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.18...@furo/route@1.3.19) (2020-07-31)
**Note:** Version bump only for package @furo/route
## [1.3.18](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.17...@furo/route@1.3.18) (2020-06-29)
**Note:** Version bump only for package @furo/route
## [1.3.17](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.16...@furo/route@1.3.17) (2020-06-29)
**Note:** Version bump only for package @furo/route
## [1.3.16](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.15...@furo/route@1.3.16) (2020-06-22)
**Note:** Version bump only for package @furo/route
## [1.3.15](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.14...@furo/route@1.3.15) (2020-06-16)
**Note:** Version bump only for package @furo/route
## [1.3.14](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.13...@furo/route@1.3.14) (2020-06-08)
**Note:** Version bump only for package @furo/route
## [1.3.13](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.12...@furo/route@1.3.13) (2020-06-04)
**Note:** Version bump only for package @furo/route
## [1.3.12](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.11...@furo/route@1.3.12) (2020-05-15)
**Note:** Version bump only for package @furo/route
## [1.3.11](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.10...@furo/route@1.3.11) (2020-05-13)
**Note:** Version bump only for package @furo/route
## [1.3.10](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.9...@furo/route@1.3.10) (2020-05-12)
**Note:** Version bump only for package @furo/route
## [1.3.9](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.8...@furo/route@1.3.9) (2020-05-01)
### Bug Fixes
* prettier location test ([75b6f75](https://github.com/theNorstroem/FuroBaseComponents/commit/75b6f75347b9199b57803d5c8c7c252f00bb3c46))
* use litElement in furo location ([12b6199](https://github.com/theNorstroem/FuroBaseComponents/commit/12b61990dbc75c70a5b535fe6e97d4265dc05cb4))
## [1.3.8](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.7...@furo/route@1.3.8) (2020-04-29)
### Bug Fixes
* furo-page should remove `hidden` for the current page when the current page is also the last page ([5e3aa5a](https://github.com/theNorstroem/FuroBaseComponents/commit/5e3aa5a65ec5e731fbcd345489891ee6058566c3))
* prevent multiple `close-request` eventlistening on fieldnode in base-panel ([8d8dacf](https://github.com/theNorstroem/FuroBaseComponents/commit/8d8dacf5815b97fc67473661b73bc1b93e53deec))
## [1.3.7](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.6...@furo/route@1.3.7) (2020-04-29)
### Bug Fixes
* prevent multi selection in navigation tree ([65742d9](https://github.com/theNorstroem/FuroBaseComponents/commit/65742d991f230582fdd3223a2a62a8a58b295e3a))
## [1.3.6](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.5...@furo/route@1.3.6) (2020-04-24)
### Bug Fixes
* app-flow-router and furo-location with properties instead of observers ([eef5132](https://github.com/theNorstroem/FuroBaseComponents/commit/eef51322b6a5da6ae2ed70403ebdbb4bd70d87fa))
* revert wrong branch ([02aeea1](https://github.com/theNorstroem/FuroBaseComponents/commit/02aeea1f92b2a7a0f1ce3eaa6fd7a05a328259e6))
## [1.3.5](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.4...@furo/route@1.3.5) (2020-04-22)
**Note:** Version bump only for package @furo/route
## [1.3.4](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.3...@furo/route@1.3.4) (2020-04-21)
**Note:** Version bump only for package @furo/route
## [1.3.3](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.1...@furo/route@1.3.3) (2020-04-20)
**Note:** Version bump only for package @furo/route
## [1.3.2](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.1...@furo/route@1.3.2) (2020-04-20)
**Note:** Version bump only for package @furo/route
## [1.3.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@1.3.0...@furo/route@1.3.1) (2020-04-20)
### Bug Fixes
* lerna problems ([ef9d1c4](https://github.com/theNorstroem/FuroBaseComponents/commit/ef9d1c405fbf55664ef05e6f12a1e7eecfc53759))
## [0.25.5](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.25.4...@furo/route@0.25.5) (2020-04-03)
**Note:** Version bump only for package @furo/route
## [0.25.4](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.25.3...@furo/route@0.25.4) (2020-04-03)
**Note:** Version bump only for package @furo/route
## [0.25.3](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.25.2...@furo/route@0.25.3) (2020-04-03)
**Note:** Version bump only for package @furo/route
## [0.25.2](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.25.1...@furo/route@0.25.2) (2020-03-29)
**Note:** Version bump only for package @furo/route
## [0.25.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.25.0...@furo/route@0.25.1) (2020-03-27)
**Note:** Version bump only for package @furo/route
# [0.25.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.24.1...@furo/route@0.25.0) (2020-03-26)
### Features
* furo-data-context-menu tests are working again ([8448896](https://github.com/theNorstroem/FuroBaseComponents/commit/8448896))
## [0.24.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.24.0...@furo/route@0.24.1) (2020-03-11)
**Note:** Version bump only for package @furo/route
# [0.24.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.11...@furo/route@0.24.0) (2020-03-10)
### Features
* app support in sub directory ([c1c0a92](https://github.com/theNorstroem/FuroBaseComponents/commit/c1c0a92))
* app support in sub directory ([d1ea586](https://github.com/theNorstroem/FuroBaseComponents/commit/d1ea586))
## [0.23.11](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.10...@furo/route@0.23.11) (2020-03-07)
**Note:** Version bump only for package @furo/route
## [0.23.10](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.9...@furo/route@0.23.10) (2020-03-07)
**Note:** Version bump only for package @furo/route
## [0.23.9](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.8...@furo/route@0.23.9) (2020-03-04)
### Bug Fixes
* close tab fixed ([e8e0372](https://github.com/theNorstroem/FuroBaseComponents/commit/e8e0372))
## [0.23.8](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.7...@furo/route@0.23.8) (2020-02-27)
**Note:** Version bump only for package @furo/route
## [0.23.7](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.6...@furo/route@0.23.7) (2020-02-26)
**Note:** Version bump only for package @furo/route
## [0.23.6](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.5...@furo/route@0.23.6) (2020-02-19)
**Note:** Version bump only for package @furo/route
## [0.23.5](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.4...@furo/route@0.23.5) (2020-02-18)
### Bug Fixes
* change the reference of loaded panels from navigation-nodes to panel-names in panel coordinator ([a0c7676](https://github.com/theNorstroem/FuroBaseComponents/commit/a0c7676))
## [0.23.4](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.3...@furo/route@0.23.4) (2020-02-18)
**Note:** Version bump only for package @furo/route
## [0.23.3](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.2...@furo/route@0.23.3) (2020-02-17)
**Note:** Version bump only for package @furo/route
## [0.23.2](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.1...@furo/route@0.23.2) (2020-02-17)
**Note:** Version bump only for package @furo/route
## [0.23.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.23.0...@furo/route@0.23.1) (2020-02-12)
**Note:** Version bump only for package @furo/route
# [0.23.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.22.0...@furo/route@0.23.0) (2020-02-11)
### Features
* furo-app-flow-router open external links in new window ([bec5418](https://github.com/theNorstroem/FuroBaseComponents/commit/bec5418))
# [0.22.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.21.4...@furo/route@0.22.0) (2020-02-10)
### Features
* furo-app-flow-router can link external addresses ([649ffc6](https://github.com/theNorstroem/FuroBaseComponents/commit/649ffc6))
## [0.21.4](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.21.2...@furo/route@0.21.4) (2020-02-09)
**Note:** Version bump only for package @furo/route
## [0.21.3](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.21.2...@furo/route@0.21.3) (2020-02-09)
**Note:** Version bump only for package @furo/route
## [0.21.2](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.21.1...@furo/route@0.21.2) (2020-02-07)
**Note:** Version bump only for package @furo/route
## [0.21.1](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.21.0...@furo/route@0.21.1) (2020-02-04)
### Bug Fixes
* Register EventListeners only once ([a8cb4d1](https://github.com/theNorstroem/FuroBaseComponents/commit/a8cb4d1))
# [0.21.0](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.20...@furo/route@0.21.0) (2020-01-15)
### Features
* panel-coordinator with force close ([b62f604](https://github.com/theNorstroem/FuroBaseComponents/commit/b62f604))
## [0.20.20](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.19...@furo/route@0.20.20) (2020-01-14)
### Bug Fixes
* classname for css theme injector ([28d0415](https://github.com/theNorstroem/FuroBaseComponents/commit/28d0415))
## [0.20.19](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.18...@furo/route@0.20.19) (2019-12-17)
### Bug Fixes
* u33e init ([c92681a](https://github.com/theNorstroem/FuroBaseComponents/commit/c92681a))
## [0.20.18](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.17...@furo/route@0.20.18) (2019-12-12)
**Note:** Version bump only for package @furo/route
## [0.20.17](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.16...@furo/route@0.20.17) (2019-12-10)
**Note:** Version bump only for package @furo/route
## [0.20.16](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.14...@furo/route@0.20.16) (2019-12-02)
### Bug Fixes
* clear on reference-search will clear the bound data ([ffc43b7](https://github.com/theNorstroem/FuroBaseComponents/commit/ffc43b7))
## [0.20.15](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.14...@furo/route@0.20.15) (2019-11-28)
### Bug Fixes
* clear on reference-search will clear the bound data ([ffc43b7](https://github.com/theNorstroem/FuroBaseComponents/commit/ffc43b7))
## [0.20.14](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.13...@furo/route@0.20.14) (2019-11-16)
### Bug Fixes
* imports corrected to current versions ([5932c61](https://github.com/theNorstroem/FuroBaseComponents/commit/5932c61))
## [0.20.13](https://github.com/theNorstroem/FuroBaseComponents/compare/@furo/route@0.20.12...@furo/route@0.20.13) (2019-11-15)
**Note:** Version bump only for package @furo/route
## [0.20.12](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.11...@furo/route@0.20.12) (2019-11-13)
**Note:** Version bump only for package @furo/route
## [0.20.11](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.10...@furo/route@0.20.11) (2019-11-08)
### Bug Fixes
* renaming ([70e1b52](https://github.com/veith/FuroBaseComponents/commit/70e1b52))
## [0.20.10](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.9...@furo/route@0.20.10) (2019-10-31)
**Note:** Version bump only for package @furo/route
## [0.20.9](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.8...@furo/route@0.20.9) (2019-10-29)
**Note:** Version bump only for package @furo/route
## [0.20.8](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.7...@furo/route@0.20.8) (2019-10-28)
**Note:** Version bump only for package @furo/route
## [0.20.7](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.6...@furo/route@0.20.7) (2019-10-25)
**Note:** Version bump only for package @furo/route
## [0.20.6](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.5...@furo/route@0.20.6) (2019-10-23)
**Note:** Version bump only for package @furo/route
## [0.20.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.4...@furo/route@0.20.5) (2019-10-21)
**Note:** Version bump only for package @furo/route
## [0.20.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.3...@furo/route@0.20.4) (2019-10-18)
**Note:** Version bump only for package @furo/route
## [0.20.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.2...@furo/route@0.20.3) (2019-10-15)
**Note:** Version bump only for package @furo/route
## [0.20.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.1...@furo/route@0.20.2) (2019-10-14)
### Bug Fixes
* .value => ._value ([3aab3a0](https://github.com/veith/FuroBaseComponents/commit/3aab3a0))
## [0.20.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.20.0...@furo/route@0.20.1) (2019-10-10)
**Note:** Version bump only for package @furo/route
# [0.20.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.19.0...@furo/route@0.20.0) (2019-10-06)
### Features
* basePanel with wire to close and eventlistener to close ([0a397c0](https://github.com/veith/FuroBaseComponents/commit/0a397c0))
# [0.19.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.17...@furo/route@0.19.0) (2019-10-04)
### Features
* furo-app-flow-router with qp mapper and strict mapping ([3c26cb5](https://github.com/veith/FuroBaseComponents/commit/3c26cb5))
## [0.18.17](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.16...@furo/route@0.18.17) (2019-10-04)
**Note:** Version bump only for package @furo/route
## [0.18.16](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.15...@furo/route@0.18.16) (2019-10-03)
**Note:** Version bump only for package @furo/route
## [0.18.15](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.14...@furo/route@0.18.15) (2019-10-03)
**Note:** Version bump only for package @furo/route
## [0.18.14](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.13...@furo/route@0.18.14) (2019-10-03)
**Note:** Version bump only for package @furo/route
## [0.18.13](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.12...@furo/route@0.18.13) (2019-10-02)
**Note:** Version bump only for package @furo/route
## [0.18.12](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.11...@furo/route@0.18.12) (2019-10-01)
**Note:** Version bump only for package @furo/route
## [0.18.11](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.10...@furo/route@0.18.11) (2019-10-01)
**Note:** Version bump only for package @furo/route
## [0.18.10](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.9...@furo/route@0.18.10) (2019-10-01)
**Note:** Version bump only for package @furo/route
## [0.18.9](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.8...@furo/route@0.18.9) (2019-09-30)
**Note:** Version bump only for package @furo/route
## [0.18.8](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.7...@furo/route@0.18.8) (2019-09-30)
**Note:** Version bump only for package @furo/route
## [0.18.7](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.6...@furo/route@0.18.7) (2019-09-25)
**Note:** Version bump only for package @furo/route
## [0.18.6](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.5...@furo/route@0.18.6) (2019-09-19)
**Note:** Version bump only for package @furo/route
## [0.18.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.4...@furo/route@0.18.5) (2019-09-18)
**Note:** Version bump only for package @furo/route
## [0.18.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.3...@furo/route@0.18.4) (2019-09-17)
**Note:** Version bump only for package @furo/route
## [0.18.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.2...@furo/route@0.18.3) (2019-09-17)
**Note:** Version bump only for package @furo/route
## [0.18.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.1...@furo/route@0.18.2) (2019-09-15)
**Note:** Version bump only for package @furo/route
## [0.18.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.18.0...@furo/route@0.18.1) (2019-09-11)
**Note:** Version bump only for package @furo/route
# [0.18.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.17.3...@furo/route@0.18.0) (2019-09-11)
### Features
* entity-agent put ([b879c9b](https://github.com/veith/FuroBaseComponents/commit/b879c9b))
* entity-agent put ([c9ba6ec](https://github.com/veith/FuroBaseComponents/commit/c9ba6ec))
## [0.17.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.17.2...@furo/route@0.17.3) (2019-09-08)
**Note:** Version bump only for package @furo/route
## [0.17.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.17.1...@furo/route@0.17.2) (2019-09-05)
**Note:** Version bump only for package @furo/route
## [0.17.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.17.0...@furo/route@0.17.1) (2019-09-05)
**Note:** Version bump only for package @furo/route
# [0.17.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.6...@furo/route@0.17.0) (2019-08-30)
### Bug Fixes
* rename deprecated types and services ([fed08cb](https://github.com/veith/FuroBaseComponents/commit/fed08cb))
### Features
* tree with qp and qp-change-request ([5fe9e10](https://github.com/veith/FuroBaseComponents/commit/5fe9e10))
## [0.16.6](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.5...@furo/route@0.16.6) (2019-08-19)
**Note:** Version bump only for package @furo/route
## [0.16.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.4...@furo/route@0.16.5) (2019-08-19)
**Note:** Version bump only for package @furo/route
## [0.16.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.3...@furo/route@0.16.4) (2019-08-17)
**Note:** Version bump only for package @furo/route
## [0.16.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.2...@furo/route@0.16.3) (2019-08-12)
**Note:** Version bump only for package @furo/route
## [0.16.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.1...@furo/route@0.16.2) (2019-08-10)
### Bug Fixes
* furo-***-input label position ([4ee5d1d](https://github.com/veith/FuroBaseComponents/commit/4ee5d1d))
## [0.16.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.16.0...@furo/route@0.16.1) (2019-08-10)
**Note:** Version bump only for package @furo/route
# [0.16.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.15.3...@furo/route@0.16.0) (2019-08-07)
### Features
* testing ([7428896](https://github.com/veith/FuroBaseComponents/commit/7428896))
## [0.15.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.15.2...@furo/route@0.15.3) (2019-08-06)
**Note:** Version bump only for package @furo/route
## [0.15.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.15.1...@furo/route@0.15.2) (2019-08-05)
**Note:** Version bump only for package @furo/route
## [0.15.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.15.0...@furo/route@0.15.1) (2019-08-03)
**Note:** Version bump only for package @furo/route
# [0.15.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.14.0...@furo/route@0.15.0) (2019-08-02)
### Features
* furo-data-number-input with furo-number-input ([230e7cd](https://github.com/veith/FuroBaseComponents/commit/230e7cd))
# [0.14.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.13.1...@furo/route@0.14.0) (2019-08-01)
### Features
* separation of furo-input and furo-data-input ([fabf35c](https://github.com/veith/FuroBaseComponents/commit/fabf35c))
## [0.13.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.13.0...@furo/route@0.13.1) (2019-08-01)
### Bug Fixes
* make __fbpReady non private => _FBPReady ([042155c](https://github.com/veith/FuroBaseComponents/commit/042155c))
# [0.13.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.10...@furo/route@0.13.0) (2019-07-30)
### Features
* furo-component-page ([37c9078](https://github.com/veith/FuroBaseComponents/commit/37c9078))
## [0.12.10](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.9...@furo/route@0.12.10) (2019-07-30)
**Note:** Version bump only for package @furo/route
## [0.12.9](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.8...@furo/route@0.12.9) (2019-07-28)
**Note:** Version bump only for package @furo/route
## [0.12.8](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.7...@furo/route@0.12.8) (2019-07-28)
### Bug Fixes
* imports ([a51ec93](https://github.com/veith/FuroBaseComponents/commit/a51ec93))
## [0.12.7](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.6...@furo/route@0.12.7) (2019-07-28)
**Note:** Version bump only for package @furo/route
## [0.12.6](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.5...@furo/route@0.12.6) (2019-07-25)
**Note:** Version bump only for package @furo/route
## [0.12.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.4...@furo/route@0.12.5) (2019-07-22)
**Note:** Version bump only for package @furo/route
## [0.12.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.3...@furo/route@0.12.4) (2019-07-18)
**Note:** Version bump only for package @furo/route
## [0.12.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.2...@furo/route@0.12.3) (2019-07-18)
### Bug Fixes
* panel-coordinator update ([b8559f8](https://github.com/veith/FuroBaseComponents/commit/b8559f8))
## [0.12.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.1...@furo/route@0.12.2) (2019-07-18)
### Bug Fixes
* tests ([77d8141](https://github.com/veith/FuroBaseComponents/commit/77d8141))
## [0.12.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.12.0...@furo/route@0.12.1) (2019-07-16)
**Note:** Version bump only for package @furo/route
# [0.12.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.11.5...@furo/route@0.12.0) (2019-07-16)
### Features
* furo markdown with prism highlighter ([15b5774](https://github.com/veith/FuroBaseComponents/commit/15b5774))
* new spec for tree ([c0aef5a](https://github.com/veith/FuroBaseComponents/commit/c0aef5a))
* new spec panel coordinator ([1b67bc4](https://github.com/veith/FuroBaseComponents/commit/1b67bc4))
## [0.11.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.11.4...@furo/route@0.11.5) (2019-07-12)
**Note:** Version bump only for package @furo/route
## [0.11.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.11.3...@furo/route@0.11.4) (2019-07-12)
**Note:** Version bump only for package @furo/route
## [0.11.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.11.2...@furo/route@0.11.3) (2019-07-11)
### Bug Fixes
* fix e.path ([f5d8f66](https://github.com/veith/FuroBaseComponents/commit/f5d8f66))
## [0.11.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.11.1...@furo/route@0.11.2) (2019-07-11)
**Note:** Version bump only for package @furo/route
## [0.11.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.11.0...@furo/route@0.11.1) (2019-07-09)
**Note:** Version bump only for package @furo/route
# [0.11.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.10.1...@furo/route@0.11.0) (2019-07-09)
### Features
* panel-coordinator close all panels ([a2daf9c](https://github.com/veith/FuroBaseComponents/commit/a2daf9c))
## [0.10.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.10.0...@furo/route@0.10.1) (2019-07-09)
### Bug Fixes
* last tab ([dc2fdc7](https://github.com/veith/FuroBaseComponents/commit/dc2fdc7))
* last tab ([93ce40e](https://github.com/veith/FuroBaseComponents/commit/93ce40e))
# [0.10.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.9.2...@furo/route@0.10.0) (2019-07-08)
### Features
* panelInit wire ([e6d89c9](https://github.com/veith/FuroBaseComponents/commit/e6d89c9))
## [0.9.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.9.1...@furo/route@0.9.2) (2019-07-07)
**Note:** Version bump only for package @furo/route
## [0.9.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.9.0...@furo/route@0.9.1) (2019-07-06)
**Note:** Version bump only for package @furo/route
# [0.9.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.8.2...@furo/route@0.9.0) (2019-07-04)
### Features
* tree marker ([9076c1b](https://github.com/veith/FuroBaseComponents/commit/9076c1b))
## [0.8.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.8.1...@furo/route@0.8.2) (2019-07-04)
### Bug Fixes
* ignore nonexistent panels ([fe81cdd](https://github.com/veith/FuroBaseComponents/commit/fe81cdd))
## [0.8.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.8.0...@furo/route@0.8.1) (2019-07-03)
### Bug Fixes
* init empty tree ([1783f15](https://github.com/veith/FuroBaseComponents/commit/1783f15))
# [0.8.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.16...@furo/route@0.8.0) (2019-07-02)
### Features
* furo-panel-coordinator ([61c219a](https://github.com/veith/FuroBaseComponents/commit/61c219a))
## [0.7.16](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.15...@furo/route@0.7.16) (2019-07-01)
### Bug Fixes
* browser compatibility ([1179e85](https://github.com/veith/FuroBaseComponents/commit/1179e85))
## [0.7.15](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.14...@furo/route@0.7.15) (2019-07-01)
**Note:** Version bump only for package @furo/route
## [0.7.14](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.13...@furo/route@0.7.14) (2019-06-27)
**Note:** Version bump only for package @furo/route
## [0.7.13](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.12...@furo/route@0.7.13) (2019-06-27)
**Note:** Version bump only for package @furo/route
## [0.7.12](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.11...@furo/route@0.7.12) (2019-06-21)
**Note:** Version bump only for package @furo/route
## [0.7.11](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.10...@furo/route@0.7.11) (2019-06-21)
**Note:** Version bump only for package @furo/route
## [0.7.10](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.9...@furo/route@0.7.10) (2019-06-13)
**Note:** Version bump only for package @furo/route
## [0.7.9](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.8...@furo/route@0.7.9) (2019-06-12)
**Note:** Version bump only for package @furo/route
## [0.7.8](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.7...@furo/route@0.7.8) (2019-06-11)
**Note:** Version bump only for package @furo/route
## [0.7.7](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.6...@furo/route@0.7.7) (2019-06-10)
**Note:** Version bump only for package @furo/route
## [0.7.6](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.5...@furo/route@0.7.6) (2019-06-07)
### Bug Fixes
* trigger initial page-activated only if no page was activated before ([65029ef](https://github.com/veith/FuroBaseComponents/commit/65029ef))
* trigger page-activated on init with 1ms timeout ([3a2bc83](https://github.com/veith/FuroBaseComponents/commit/3a2bc83))
## [0.7.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.4...@furo/route@0.7.5) (2019-06-07)
**Note:** Version bump only for package @furo/route
## [0.7.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.3...@furo/route@0.7.4) (2019-06-06)
**Note:** Version bump only for package @furo/route
## [0.7.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.2...@furo/route@0.7.3) (2019-06-05)
**Note:** Version bump only for package @furo/route
## [0.7.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.1...@furo/route@0.7.2) (2019-06-05)
### Bug Fixes
* furo-location cuts off urlSpaceRegex part of url ([600b739](https://github.com/veith/FuroBaseComponents/commit/600b739))
* furo-pages works with location object from furo-location ([e69b6e2](https://github.com/veith/FuroBaseComponents/commit/e69b6e2))
## [0.7.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.7.0...@furo/route@0.7.1) (2019-06-04)
### Bug Fixes
* imports linted ([0d62b33](https://github.com/veith/FuroBaseComponents/commit/0d62b33))
* use default on empty name ([7b0fc4f](https://github.com/veith/FuroBaseComponents/commit/7b0fc4f))
# [0.7.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.6.0...@furo/route@0.7.0) (2019-06-04)
### Features
* furo-location, furo-pages interaction with furo-head-tail ([d14e706](https://github.com/veith/FuroBaseComponents/commit/d14e706))
# [0.6.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.5.0...@furo/route@0.6.0) (2019-05-29)
### Features
* furo-pages activatePage(name) by name ([c8bda9a](https://github.com/veith/FuroBaseComponents/commit/c8bda9a))
# [0.5.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.11...@furo/route@0.5.0) (2019-05-26)
### Features
* Documentation ([c10c969](https://github.com/veith/FuroBaseComponents/commit/c10c969))
## [0.4.11](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.10...@furo/route@0.4.11) (2019-05-24)
**Note:** Version bump only for package @furo/route
## [0.4.10](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.9...@furo/route@0.4.10) (2019-05-24)
### Bug Fixes
* _findAtagInPath ([703d2c1](https://github.com/veith/FuroBaseComponents/commit/703d2c1))
## [0.4.9](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.8...@furo/route@0.4.9) (2019-05-24)
### Bug Fixes
* furo-location check for clicks in nested elements ([86a2f54](https://github.com/veith/FuroBaseComponents/commit/86a2f54))
## [0.4.8](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.7...@furo/route@0.4.8) (2019-05-23)
### Bug Fixes
* publish error ([0e10b0b](https://github.com/veith/FuroBaseComponents/commit/0e10b0b))
## [0.4.7](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.6...@furo/route@0.4.7) (2019-05-23)
**Note:** Version bump only for package @furo/route
## [0.4.6](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.5...@furo/route@0.4.6) (2019-05-23)
**Note:** Version bump only for package @furo/route
## [0.4.5](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.4...@furo/route@0.4.5) (2019-05-03)
**Note:** Version bump only for package @furo/route
## [0.4.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.3...@furo/route@0.4.4) (2019-04-25)
**Note:** Version bump only for package @furo/route
## [0.4.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.2...@furo/route@0.4.3) (2019-04-22)
**Note:** Version bump only for package @furo/route
## [0.4.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.1...@furo/route@0.4.2) (2019-04-05)
### Bug Fixes
* pages timing ([2fb44f0](https://github.com/veith/FuroBaseComponents/commit/2fb44f0))
## [0.4.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.4.0...@furo/route@0.4.1) (2019-04-05)
### Bug Fixes
* timing for --pageActivated changed ([a522661](https://github.com/veith/FuroBaseComponents/commit/a522661))
# [0.4.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.3.1...@furo/route@0.4.0) (2019-04-02)
### Features
* Trigger app-flow without payload ([19296d2](https://github.com/veith/FuroBaseComponents/commit/19296d2))
## [0.3.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.3.0...@furo/route@0.3.1) (2019-03-28)
### Bug Fixes
* furo-app-flow-router notification ([e0d5398](https://github.com/veith/FuroBaseComponents/commit/e0d5398))
* furo-app-flow-router notification ([b8bb6dd](https://github.com/veith/FuroBaseComponents/commit/b8bb6dd))
# [0.3.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.2.4...@furo/route@0.3.0) (2019-03-28)
### Features
* furo-app-flow emitter ([4e6d369](https://github.com/veith/FuroBaseComponents/commit/4e6d369))
* furo-app-flow-router ([eed5a29](https://github.com/veith/FuroBaseComponents/commit/eed5a29))
## [0.2.4](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.2.3...@furo/route@0.2.4) (2019-03-26)
### Bug Fixes
* typos ([95a00ab](https://github.com/veith/FuroBaseComponents/commit/95a00ab))
## [0.2.3](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.2.2...@furo/route@0.2.3) (2019-03-25)
### Bug Fixes
* timeout ([d769a1a](https://github.com/veith/FuroBaseComponents/commit/d769a1a))
## [0.2.2](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.2.1...@furo/route@0.2.2) (2019-03-24)
**Note:** Version bump only for package @furo/route
## [0.2.1](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.2.0...@furo/route@0.2.1) (2019-03-24)
### Bug Fixes
* licence ([c364569](https://github.com/veith/FuroBaseComponents/commit/c364569))
# [0.2.0](https://github.com/veith/FuroBaseComponents/compare/@furo/route@0.1.0...@furo/route@0.2.0) (2019-03-24)
### Features
* furo-collection contains furo/route ([2ba26ca](https://github.com/veith/FuroBaseComponents/commit/2ba26ca))
# 0.1.0 (2019-03-24)
### Features
* furo-location initial version ([5e46459](https://github.com/veith/FuroBaseComponents/commit/5e46459))
* furo-location set query with {"key":"value"} ([94b8ef1](https://github.com/veith/FuroBaseComponents/commit/94b8ef1))
# test
test the function and usage of the organization
Project management board: https://github.com/bingduouduos/test/projects/1
Homepage (github.io): https://bingduouduos.github.io/test/
---
title: "CLDictP: A Command-Line English Dictionary Tool"
date: 2018-07-04
draft: false
tags: ["Perl"]
---
CLDictP is a command-line English dictionary tool written in Perl using the Merriam-Webster dictionary APIs.
This is my first small project written in Perl. Whenever I wanted to make flashcards on [Quizlet](https://quizlet.com/) to memorize words, doing it by hand was tedious, since I wanted some parts bolded and so on. And opening an online dictionary every time I looked up a word felt too lazy-unfriendly, so I decided to write a small tool.
It uses the following Merriam-Webster dictionary APIs:
- Merriam-Webster's Learner's Dictionary
- Merriam-Webster's Collegiate Dictionary
Each entry includes:
- Phonetic transcription
- Part of speech
- Grammar
- Definitions
- Common usages
- Example sentences
All searched words are stored in a set, saved in `searched.txt`.
New search results are saved to `quizlet.txt` in a specific format:
- Between a word and its definition: `$`
- Between cards: `---`
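As an illustration (a hypothetical snippet; the words and definitions here are made up, and only the `$` and `---` separators come from the format above), `quizlet.txt` might look like this:

```
ubiquitous $ present, appearing, or found everywhere
---
ephemeral $ lasting for a very short time
```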
## Usage
1. Get API keys: [DictionaryAPI](https://www.dictionaryapi.com/)
2. Add the obtained keys to `api_template.json` and rename the file to `api.json`
3. Install the dependency modules:
``` bash
$ cpan Term::ANSIColor Term::ReadKey LWP::UserAgent LWP::Protocol::https Readonly XML::LibXML JSON::XS Data::Dumper Set::Light
```
4. Run the script:
``` bash
$ perl dict.pl
```
5. Exit with `Ctrl+D`
## Demo

## Source Code
The source code is on [Github](https://github.com/PwzXxm/CLDictP)
# TranslateAPI
This is a simple project that uses the Google Translate API.
References
------------
* [NSA official GitHub Account](https://nationalsecurityagency.github.io)
* [USB Snooping Made Easy: Crosstalk Leakage Attacks on USB Hubs](https://www.usenix.org/conference/usenixsecurity17/technical-sessions/presentation/su)
* [WikiLeaks Publishes NSA Target List](https://wikileaks.org/nsa-201602/)
* [NSA Warns of the Dangers of Quantum Computing (futurism.com)](http://futurism.com/nsa-warns-dangers-quantum-computing/)
* [Right to privacy (wikipedia.org)](https://en.wikipedia.org/wiki/Right_to_privacy)
* [Patriot Act | Wikipedia](https://en.wikipedia.org/wiki/USA_PATRIOT_Act) + CALEA act + [FISA](https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveillance_Act)
* [Cryptome | cryptome.org](http://cryptome.info/0001/ip-tla.htm)
* [NSA's Autonomous Systems (AS),](https://www.robtex.net/?_escaped_fragment_=dns%3Dnsa.gov#!dns=nsa.gov)
* [33bits | 33bits.org](http://33bits.org/)
* [What an IP Address Can Reveal About You | priv.gc.ca](https://www.priv.gc.ca/information/research-recherche/2013/ip_201305_e.asp)
* [Randomtalker web-privacy](http://randomwalker.info/web-privacy/)
* [https://bosnadev.com/2015/04/14/facebook-chats-are-being-scanned-by-a-cia-funded-company/](Chats Are Being Scanned By A CIA Funded Company)
* [Mobile Security Wiki | mobilesecuritywiki.com](https://mobilesecuritywiki.com/)
* [Researcher at Kaspersky Labs have discovered a list of domains used by the NSA to install malware on victim's PC around the world.](https://www.hackread.com/here-is-a-list-of-urls-used-by-the-nsa-to-install-malware-on-pcs-worldwide/)
* [NSA PRISM Keywords For Domestic Spying | Business Insider](http://www.businessinsider.com/nsa-prism-keywords-for-domestic-spying-2013-6?IR=T)
* [Windows and the backdoor question from 1999 | CNN](http://edition.cnn.com/TECH/computing/9909/03/windows.nsa.02/)
* [Psssst: Wanna Buy a Used Spy Website? | Wired](http://www.wired.com/2015/03/nsa_domains/)
* [Understanding NSA Malware | Schneier on Security](https://www.schneier.com/blog/archives/2015/02/understanding_n.html)
* [Check if NSA warrantless surveillance is looking at your IP traffic | Lookingglassnews](http://www.lookingglassnews.org/viewstory.php?storyid=6861)
* [Sensitive IP addresses | Wikipedia](https://en.wikipedia.org/wiki/Wikipedia:Sensitive_IP_addresses)
* [Do Not Scan - Government IP list | PeerBlock Forums](http://forums.peerblock.com/read.php?8,14794,14794)
* [Hardened user.js for Firefox to stop data leackage | GitHub](https://github.com/pyllyukko/user.js)
* [Firefox Zero-Day Exploit used by FBI to shut down child porn hosting on the Tor Network; Tor Mail Compromised](https://thehackernews.com/2013/08/Firefox-Exploit-Tor-Network-child-pornography-Freedom-Hosting.html)
* [Entire set of 5,300+ .gov domains as .csv file | GitHub](https://gsa.github.io/data/dotgov-domains/2014-12-01-full.csv)
* [SS7 hack shown demonstrated to track anyone | 60 Minutes](http://www.9jumpin.com.au/show/60minutes/stories/2015/august/phone-hacking/)
* [SSL Blacklist](https://sslbl.abuse.ch/blacklist/)
* [MITM-Proxy](https://mitmproxy.org/doc/howmitmproxy.html) + [Lagado proxy test](http://www.lagado.com/proxy-test) + [Lagado cache test](http://www.lagado.com/tools/cache-test)
* [Detect Superfish, Komodia and Privdog | filippo](https://filippo.io/Badfish/)
* [SSL eye prism protection | digi77](https://www.digi77.com/ssl-eye-prism-protection/)
* [NSAPlaySet](http://www.nsaplayset.org/)
* [Global surveillance disclosures (2013–present) | Wikipedia](https://en.wikipedia.org/wiki/Global_surveillance_disclosures_(2013%E2%80%93present))
* [Attacking Tor: how the NSA targets users' online anonymity |TheGuardian](http://www.theguardian.com/world/2013/oct/04/tor-attacks-nsa-users-online-anonymity)
* [Using a Power Law Distribution to describe Big Data | Arxiv.org](http://arxiv.org/abs/1509.00504)
* [5 reasons you need to be tracking Big Data security analytics (monitor.us)](http://blog.monitor.us/2015/09/5-reasons-you-need-to-be-tracking-big-data-security-analytics/)
* [FBI, intel chiefs decry “deep cynicism” over cyber spying programs (arstechnica.com)](http://arstechnica.com/tech-policy/2015/09/fbi-intel-chiefs-decry-deep-cynicism-over-cyber-spying-programs/)
* [Internet-Wide Scan Data Repository (scans.io)](https://scans.io/)
* [.onion (ietf.org)](https://www.ietf.org/blog/2015/09/onion/)
* [Mail tester to check you eMail security score](http://www.mail-tester.com/) + [eMail defense](https://emailselfdefense.fsf.org/en/)
* [List of United States mobile virtual network operators](https://en.m.wikipedia.org/wiki/List_of_United_States_mobile_virtual_network_operators)
* If you are interested in switching away from Google, take a look at https://github.com/sovereign/sovereign
* [Julian Assange: Debian Is Owned by the NSA (igurublog.wordpress.com)](https://igurublog.wordpress.com/2014/04/08/julian-assange-debian-is-owned-by-the-nsa/)
* [Vodafone Australia admits hacking Fairfax journalist's phone (theguardian.com)](http://www.theguardian.com/business/2015/sep/13/vodafone-australia-admits-hacking-fairfax-journalists-phone)
* [Hacking Team, Computer Vulnerabilities, and the NSA (schneier.com)](https://www.schneier.com/blog/archives/2015/09/hacking_team_co.html)
* [Big Data and Environmental Sustainability: A Conversation Starter (in Brief) (medium.com)](https://medium.com/@AlanKeeso/big-data-and-environmental-sustainability-a-conversation-starter-in-brief-4052d0b2f0ae)
* [ISPs don’t have First Amendment right to edit Internet, FCC tells court (arstechnica.com)](http://arstechnica.com/tech-policy/2015/09/isps-dont-have-1st-amendment-right-to-edit-internet-fcc-tells-court/)
* [Homeland Security websites vulnerable to cyber attack: audit (reuters.com)](http://www.reuters.com/article/2015/09/15/us-usa-cybersecurity-idUSKCN0RF2DC20150915)
* [Now you can find out if GCHQ illegally spied on you (privacyinternational.org)](https://www.privacyinternational.org/?q=illegalspying)
* [Hacking Team, Computer Vulnerabilities, and the NSA (georgetown.edu)](http://journal.georgetown.edu/hacking-team-and-the-nsa/)
* [New Federalist platform lets [government] agencies quickly launch websites (gsa.gov)](https://18f.gsa.gov/2015/09/15/federalist-platform-launch/)
* [Government spying spooks French citizens (straitstimes.com)](http://www.straitstimes.com/world/europe/government-spying-spooks-french-citizens)
* [NSA Plans to Develop Encryption That Could Stump Quantum Computers (wired.com)](http://www.wired.com/2015/09/tricky-encryption-stump-quantum-computers/)
* [Japanese government orders closure of university social science/humanities depts (timeshighereducation.com)](https://www.timeshighereducation.com/news/social-sciences-and-humanities-faculties-close-japan-after-ministerial-decree)
* [Tallow (reqrypt.org)](https://reqrypt.org/tallow.html)
* [Chinese government firms sell products that subvert censorship (larrysalibra.com)](https://www.larrysalibra.com/hop-over-the-great-firewall-with-government-help/)
* [The Tricky Encryption That Could Stump Quantum Computers (wired.com)](http://www.wired.com/2015/09/tricky-encryption-stump-quantum-computers/)
* [GCHQ tried to track Web visits of “every visible user on Internet” (arstechnica.com)](http://arstechnica.com/security/2015/09/gchq-tried-to-track-web-visits-of-every-visible-user-on-internet/)
* [A Q&A with NSA Whistleblower Edward Snowden (fusion.net)](http://fusion.net/story/201737/edward-snowden-interview/)
* [Skype Alternatives, Part 2: Edward Snowden’s Recommendations (cointelegraph.com)](http://cointelegraph.com/news/114689/skype-alternatives-part-2-edward-snowdens-recommendations)
* [The FBI has no trouble spying on encrypted communications (boingboing.net)](http://boingboing.net/2015/09/29/the-fbi-has-no-trouble-spying.html)
* [Encryption doesn't stop the FBI (theintercept.com)](https://theintercept.com/2015/09/28/hacking/)
* [NSA runs its spying activities on Red Hat Linux](http://www.itwire.com/opinion-and-analysis/open-sauce/68567-nsa-runs-its-spying-activities-on-red-hat-linux)
* [How the NSA can break trillions of encrypted Web and VPN connections ](http://arstechnica.com/security/2015/10/how-the-nsa-can-break-trillions-of-encrypted-web-and-vpn-connections/)
* [THE NSA WORKED TO “TRACK DOWN” BITCOIN USERS, SNOWDEN DOCUMENTS REVEAL](https://theintercept.com/2018/03/20/the-nsa-worked-to-track-down-bitcoin-users-snowden-documents-reveal/)
* [THE INTERCEPT IS BROADENING ACCESS TO THE SNOWDEN ARCHIVE. HERE’S WHY (theintercept.com)](https://theintercept.com/2016/05/16/the-intercept-is-broadening-access-to-the-snowden-archive-heres-why/)
Research
------------
* https://www.google.com/#q=goldenfrog+thisisourkey [Archive](http://archive.is/qlrLK)
* http://www.gfwvpn.com/?q=node/224 [Archive](http://archive.is/EdpFV)
* https://www.vpnreactor.com/android_l2tp_ipsec.html [Archive](http://archive.is/uwJvk)
* http://unblockvpn.com/support/how-to-set-up-l2tp-on-the-android.html [Archive](http://archive.is/4To5Y)
* http://www.ibvpn.com/billing/knowledgebase/34/Set-up-the-VPN-connection-on-Android-handsets.html [Archive](http://archive.is/srptW)
* https://www.astrill.com/knowledge-base/50/L2TP-IPSec-PSK---How-to-configure-L2TP-IPSec-on-Android.html [Archive](http://archive.is/PZpRU)
* http://billing.purevpn.com/knowledgebase.php?action=displayarticle&id=33 [Archive](http://archive.is/R4JTi)
* https://www.privateinternetaccess.com/pages/client-support/ [Archive](http://archive.is/U1bkL)
* http://torguard.net/knowledgebase.php?action=displayarticle&id=58 [Archive](http://archive.is/iKJjl)
* https://www.ipvanish.com/visualguides/L2TP/Android/ [Archive](http://imgur.com/IQU1mdg)
* http://www.earthvpn.com/android-l2tp-setup-guide/ [Archive](http://archive.is/roKtf)
* https://nordvpn.com/tutorials/android/l2tpipsec/ [Archive](http://archive.is/BQumt)
* https://help.tigervpn.com/support/search/solutions?term=shared+secret+tigerVPN [Archive](http://archive.is/xZ136)
* https://www.slickvpn.com/tutorials/ipsec-for-iphone/ [Archive](http://archive.is/h4rI9)
* DoubleHop.me: [Archive](http://archive.is/G11WQ) and http://archive.is/MZgWE and http://imgur.com/Zn5HSIj
* https://en.wikipedia.org/wiki/Trusted_Computing#Loss_of_anonymity ([measure.c](https://github.com/systemd/systemd/blob/09541e49ebd17b41482e447dd8194942f39788c0/src/boot/efi/measure.c))
* [NIST Special Publication 800-63B](https://pages.nist.gov/800-63-3/sp800-63b.html)
* https://file.wikileaks.org/torrent/
Videos
------------
* [Greenwald Vs. NSA debate (youtube.com)](https://www.youtube.com/watch?t=12&v=sfPjgUgoLaQ)
* [A Conversation with Edward Snowden (Part 1) [Podcast] (startalkradio.net)](http://www.startalkradio.net/show/a-conversation-with-edward-snowden-part-1/)
Papers
------------
* [Pages From OAKSTAR Weekly 2013-03-29](https://theintercept.com/document/2018/03/20/pages-from-oakstar-weekly-2013-03-29/)
* [Pages From OAKSTAR Weekly 2013-03-15](https://theintercept.com/document/2018/03/20/pages-from-oakstar-weekly-2013-03-15/)
* [Detect IP spoofing with 97 chance of success](https://sce.carleton.ca/~abdou/CPV_TDSC.pdf)
* [ENCRYPT Act of 2016](https://assets.documentcloud.org/documents/2708079/LIEU-027-Xml-ENCRYPT-Act-of-2016.pdf)
* [Google PDF Search: “not for public release”](https://www.google.com/search?as_q=&as_epq=not+for+public+release&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&as_filetype=pdf&as_rights=&gws_rd=ssl)
* [Password guidance - simplifying your approach (GCHQ) [pdf] (gov.uk)](https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/458857/Password_guidance_-_simplifying_your_approach.pdf)
* [ISP wins 11-year battle to reveal warrantless FBI spying [pdf] (calyxinstitute.org)](https://www.calyxinstitute.org/sites/all/documents/08_28_2015_REDACTED_Decision_and_Order.pdf)
* [TheIntercept Documents](https://theintercept.com/document/2015/09/25/legalities)
* [Speicherfristen von Verkehrsdaten im Mobilfunk (spiegel.de)](http://www.spiegel.de/media/media-37689.pdf)
---
layout: post
title: "memset: Don't Make This Mistake"
tags: C++ std memset fill
feature-img: "assets/img/2019-01-14/c-introduction.jpg"
---
Initializing array elements via the `memset()` function from `string.h` is **very limited**.
First, `memset()` is defined as [follows](http://www.cplusplus.com/reference/cstring/memset/):
>`void * memset ( void * ptr, int value, size_t num );`
Fill block of memory
Sets the first num bytes of the block of memory pointed by ptr to the specified value (interpreted as an unsigned char).
* Starting at the address received as the first argument, it repeatedly writes the value received as the second argument, **byte by byte**.
* The second argument is received as an `int`, but the data written is interpreted as an `unsigned char`.
---
In other words, only values between **0 and 255 (`0xFF`)** are meaningfully applied to each byte block.
```cpp
#include <iostream>
#include <cstring>
using namespace std;
int main() {
int result = 0;
memset(&result, 300, 1);
cout << result;
return 0;
}
```
```
44
```
In the program above, the second argument to `memset` is a value that exceeds the range of an `unsigned char`.
Here, for 300, i.e. `0x12C`, the first block under [little-endian byte ordering](https://genesis8.tistory.com/37) holds 44 (`0x2C`),
and since that value is written to `result`'s memory location for exactly 1 byte, the output becomes 44.
Then, what result would we get if we modified the code as follows?
```cpp
memset(&result, 300, sizeof(int));
```
```
741092396
```
`0x2C2C2C2C`. If you are quick on the uptake, you will readily accept this result.
---
`memset` is absolutely not a function that writes the given `int` value. Take great care with this.
If what you want to initialize is an array, or the data is not an integer type, you should look at it even more carefully.
The behavior that many people, myself included, have probably misunderstood from `memset`'s reference is something like the following:
* `memset(arr, 0, sizeof(int) * arrSize);`
* `memset(arr, -1, sizeof(int) * arrSize);`
Looking at this common code, one thinks, *"Ah, so it's a function that initializes every array element to that value!"*
But with 0, the first byte block is naturally `0x0`, so no matter how many times it repeats or which block you read, the result is of course just 0.
With -1, when the value is interpreted as an `unsigned char`, the byte block assigned is the bit pattern of the two's complement of 1, which is
255 (`0xFF`). In memory where that pattern repeats, reading any block as an `int` likewise yields -1 (`0xFFFFFFFF`).
---
Conclusion: when initializing an array to some value, use the C++-style template function `std::fill`.
[Reference](http://www.cplusplus.com/reference/algorithm/fill/?kw=fill)
# side-navigation-bar
f9636fd35ddd10c9df8e760a646247f582475a08 | 6,945 | md | Markdown | README.md | MadCatMining/Espers | 81848beb8974e3e38422ae7a5a9bab2416936f7d | [
"MIT"
] | null | null | null | README.md | MadCatMining/Espers | 81848beb8974e3e38422ae7a5a9bab2416936f7d | [
"MIT"
] | 1 | 2020-08-21T01:20:06.000Z | 2020-08-21T01:20:06.000Z | README.md | MadCatMining/Espers | 81848beb8974e3e38422ae7a5a9bab2416936f7d | [
"MIT"
] | null | null | null | Espers [ESP] 2016-2021 integration/staging tree
===============================================
https://espers.io/
What is the Espers [ESP] Blockchain?
------------------------------------
*TODO: Update documentation regarding implemented tech as this section is out of date and much progress and upgrades have been made to mentioned sections...*
### Overview
Espers is a blockchain project with the goal of offering secured messaging, websites on the chain and an overall pleasing experience to the user.
### Blockchain Technology
Blockchain technology has the capability to change a variety of things in the world and can also be the backbone of a new internet. One area where blockchain technology has not yet had an impact is websites on the blockchain. However, Espers is trying to change that and is well on its way to doing so.
### Website on the Chain
Espers Blockchain will keep sites extremely safe and virtually impenetrable, creating a new standard for website safety and security. In addition to the security improvements, Espers Blockchain will ensure that your website is never taken down or removed, no matter what. You will also have complete control of your data, and sites will feature faster server times.
### Secured Messaging
We are developing the usability features: the secured messaging system, masternodes (intuitive node launching and monitoring), a smooth sync patch, a shrinking blockchain, light clients, and much more.
### Custom Difficulty Retarget Algorithm “VRX”
VRX is designed from the ground up to integrate properly with the Velocity parameter enforcement system to ensure users no longer receive orphan blocks.
### Velocity Block Constraint System
To ensure Espers stays as secure and robust as possible, the CryptoCoderz team has implemented what's known as the Velocity block constraint system. This system acts as a third and final check for both mined and peer-accepted blocks, ensuring that all parameters are strictly enforced.
### HMQ1725 Algorithm
We use a custom internal algorithm known as HMQ1725 to sign blocks and perform other functions; it takes its name from how it was designed: Highly-Modified-Quark, 17-Algorithms, 25-Scientific-Rounds.
Specifications and General info
------------------
Espers uses:
Boost1.72,
OR Boost1.58,
OpenSSL1.02u,
OR OpenSSL1.1x,
Berkeley DB 6.2.32,
Qt5.14 to compile
General Info:
Block Spacing: 5 Minutes
Stake Minimum Age: 90 Confirmations (PoS-v3) | 2 Hours (PoS-v2)
Port: 22448
RPC Port: 22442
Compiling the Espers daemon on Ubuntu 18.04 LTS Bionic
---------------------------
### Note: guide should be compatible with other Ubuntu versions from 14.04+
### Become poweruser
```
sudo -i
```
### Dependencies install
```
cd ~; sudo apt-get install ntp git build-essential libssl-dev libdb-dev libdb++-dev libboost-all-dev libqrencode-dev libcurl4-openssl-dev curl libzip-dev; apt-get update; apt-get upgrade; apt-get install git make automake build-essential libboost-all-dev; apt-get install yasm binutils libcurl4-openssl-dev openssl libssl-dev; sudo apt-get install libgmp-dev; cd ~;
```
### Dependencies build and link
```
cd ~; wget http://download.oracle.com/berkeley-db/db-6.2.32.NC.tar.gz; tar zxf db-6.2.32.NC.tar.gz; cd db-6.2.32.NC/build_unix; ../dist/configure --enable-cxx; make; sudo make install; sudo ln -s /usr/local/BerkeleyDB.6.2/lib/libdb-6.2.so /usr/lib/libdb-6.2.so; sudo ln -s /usr/local/BerkeleyDB.6.2/lib/libdb_cxx-6.2.so /usr/lib/libdb_cxx-6.2.so; export BDB_INCLUDE_PATH="/usr/local/BerkeleyDB.6.2/include"; export BDB_LIB_PATH="/usr/local/BerkeleyDB.6.2/lib"; cd ~;
```
### Personal upload EXAMPLE
```
cd ~; sudo cp -r /home/ftpuser/ftp/files/ESP-clean/. ~/Espers
```
### GitHub pull RECOMMENDED
```
cd ~; git clone https://github.com/CryptoCoderz/Espers
```
### Build Espers daemon
```
cd ~; cd ~/Espers/src; chmod a+x obj; chmod a+x leveldb/build_detect_platform; chmod a+x leveldb; chmod a+x ~/Espers/src; chmod a+x ~/Espers; make -f ~/Espers/src/makefile/makefile.unix USE_UPNP=-; cd ~; cp ~/Espers/src/Espersd /usr/local/bin;
```
### Create config file for daemon
```
cd ~; sudo ufw allow 22448/tcp; sudo ufw allow 22442/tcp; sudo mkdir ~/.ESP; cat << "CONFIG" >> ~/.ESP/Espers.conf
listen=1
server=1
daemon=1
testnet=0
rpcuser=espersuser
rpcpassword=SomeCrazyVeryVerySecurePasswordHere
rpcport=22442
port=22448
rpcconnect=127.0.0.1
rpcallowip=127.0.0.1
addnode=198.50.180.207
addnode=85.255.7.52
addnode=195.181.211.221
addnode=176.31.205.41
addnode=116.14.167.86
addnode=167.99.171.227
addnode=174.107.102.219
addnode=176.9.156.236
addnode=198.50.180.193
addnode=94.130.64.143
addnode=145.239.65.6
addnode=108.61.175.156
addnode=46.4.27.201
addnode=149.56.154.75
addnode=50.46.99.238
addnode=159.89.114.40
CONFIG
chmod 700 ~/.ESP/Espers.conf; chmod 700 ~/.ESP; ls -la ~/.ESP
```
### Run Espers daemon
```
cd ~; Espersd; Espersd getinfo
```
### Troubleshooting
### for basic troubleshooting run the following commands when compiling:
### this is for minupnpc errors compiling
```
make clean -f makefile.unix USE_UPNP=-
make -f makefile.unix USE_UPNP=-
```
License
-------
Espers [ESP] is released under the terms of the MIT license. See [COPYING](COPYING) for more
information or see https://opensource.org/licenses/MIT.
Development Process
-------------------
The `master` branch is regularly built and tested, but is not guaranteed to be
completely stable. [Tags](https://github.com/CryptoCoderz/Espers/tags) are created
regularly to indicate new official, stable release versions of Espers [ESP].
The contribution workflow is described in [CONTRIBUTING.md](CONTRIBUTING.md).
The developer [mailing list](https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev)
should be used to discuss complicated or controversial changes before working
on a patch set.
Developer Discord can be found at https://discord.gg/cn3AfPS .
Testing
-------
Testing and code review is the bottleneck for development; we get more pull
requests than we can review and test on short notice. Please be patient and help out by testing
other people's pull requests, and remember this is a security-critical project where any mistake might cost people
lots of money.
### Automated Testing
Developers are strongly encouraged to write [unit tests](/doc/unit-tests.md) for new code, and to
submit new unit tests for old code. Unit tests can be compiled and run
(assuming they weren't disabled in configure) with: `make check`
There are also [regression and integration tests](/qa) of the RPC interface, written
in Python, that are run automatically on the build server.
### Manual Quality Assurance (QA) Testing
Changes should be tested by somebody other than the developer who wrote the
code. This is especially important for large or high-risk changes. It is useful
to add a test plan to the pull request description if testing the changes is
not straightforward.
| 39.685714 | 466 | 0.750612 | eng_Latn | 0.964233 |
f96477538585e4989596cda9b8411eff2988014f | 1,823 | md | Markdown | biztalk/core/receipt-delivery-option-value-is-invalid.md | SicongLiuSimon/biztalk-docs | 85394b436d277504d9e759c655608888123785bd | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-06-14T19:45:26.000Z | 2019-06-14T19:45:26.000Z | biztalk/core/receipt-delivery-option-value-is-invalid.md | AzureMentor/biztalk-docs | 16b211f29ad233c26d5511475c7e621760908af3 | [
"CC-BY-4.0",
"MIT"
] | 7 | 2020-01-09T22:34:58.000Z | 2020-02-18T19:42:16.000Z | biztalk/core/receipt-delivery-option-value-is-invalid.md | AzureMentor/biztalk-docs | 16b211f29ad233c26d5511475c7e621760908af3 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2017-06-23T18:30:28.000Z | 2017-11-28T01:11:25.000Z | ---
title: "Receipt-Delivery-Option value is invalid | Microsoft Docs"
ms.custom: ""
ms.date: "06/08/2017"
ms.prod: "biztalk-server"
ms.reviewer: ""
ms.suite: ""
ms.tgt_pltfrm: ""
ms.topic: "article"
ms.assetid: 0eed306b-0912-4694-a55c-976128117c02
caps.latest.revision: 8
author: "MandiOhlinger"
ms.author: "mandia"
manager: "anneta"
---
# Receipt-Delivery-Option value is invalid
## Details
| | |
|-----------------|----------------------------------------------------------------------------------------|
| Product Name | [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] |
| Product Version | [!INCLUDE[btsEDIVersion](../includes/btsediversion-md.md)] |
| Event ID | - |
| Event Source | [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] EDI |
| Component | AS2 Engine |
| Symbolic Name | InvalidReceiptDeliveryOptionError |
| Message Text | Receipt-Delivery-Option value: "{0}" is invalid. {1} |
## Explanation
This Error/Warning/Information event indicates that the Receipt-Delivery-Option in the received AS2 message is not a valid URL and does not conform to the specifications in the AS2 RFC 4130.
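For reference, a well-formed header of this kind simply carries a valid return URL for the requested asynchronous MDN; the URL below is a placeholder, not a real endpoint:

```
Receipt-Delivery-Option: https://as2.example.com/mdn-receipts
```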
## User Action
To resolve this error, have the Receipt-Delivery-Option in the AS2 message changed to include a valid URL and conform to the specifications in section 7.3 of the AS2 RFC 4130, and then resend the AS2 message. | 53.617647 | 209 | 0.53209 | eng_Latn | 0.676862 |
f964cbbba67032e7ac64ee273a0f0f0613a613af | 432 | md | Markdown | OPEN_SSL_HB_FIX.md | patricia-gallardo/insecure-fuzz | 5a19a9da0de7c68a7da2b012a0f8f37012ad84b6 | [
"MIT"
] | 1 | 2019-11-15T10:16:45.000Z | 2019-11-15T10:16:45.000Z | OPEN_SSL_HB_FIX.md | patricia-gallardo/insecure-fuzz | 5a19a9da0de7c68a7da2b012a0f8f37012ad84b6 | [
"MIT"
] | null | null | null | OPEN_SSL_HB_FIX.md | patricia-gallardo/insecure-fuzz | 5a19a9da0de7c68a7da2b012a0f8f37012ad84b6 | [
"MIT"
] | 1 | 2020-06-15T14:16:45.000Z | 2020-06-15T14:16:45.000Z | # Review the actual fix for Heartbleed
Install Meld
~~~~bash
sudo apt install meld
~~~~
Get the before and after code
~~~~bash
curl -O https://raw.githubusercontent.com/patricia-gallardo/insecure-fuzz/master/fixes/heartbleed/fixed.c
curl -O https://raw.githubusercontent.com/patricia-gallardo/insecure-fuzz/master/fixes/heartbleed/vuln.c
~~~~
Diff them
~~~~bash
meld fixed.c vuln.c
~~~~
How does it compare to your solution?
| 18.782609 | 105 | 0.75 | eng_Latn | 0.437804 |
f964deb5f8ccbe9644d7fecb36369096abaae70f | 1,457 | md | Markdown | adam_optimizer.md | nurulc/machine-learning | 41dc098d962758a50bee59177c8f32e2f93d332a | [
"Apache-2.0"
] | null | null | null | adam_optimizer.md | nurulc/machine-learning | 41dc098d962758a50bee59177c8f32e2f93d332a | [
"Apache-2.0"
] | null | null | null | adam_optimizer.md | nurulc/machine-learning | 41dc098d962758a50bee59177c8f32e2f93d332a | [
"Apache-2.0"
] | null | null | null | ## JavaScript version of the Adam optimizer
```Javascript
class Adam {
constructor(nparams, beta_1=0.9, beta_2=0.999, epsilon=1e-8) {
this.nparams= nparams;
this.beta_1= beta_1;
this.beta_2= beta_2;
this.epsilon= epsilon;
this.m= [];
this.v= [];
for(var i= 0 ; i < this.nparams ; ++i) {
this.m[i]= 0; // Initialize 1st moment vector
this.v[i]= 0; // Initialize 2nd moment vector
}
this.t= 0; //Initialize timestep
this.b1t = beta_1;
this.b2t = beta_2;
}
  get_update(alpha, g) { // alpha is the learning rate, g is the gradient vector (array of length 'nparams')
     this.t += 1;
     let updates= new Array(this.nparams);
     const {m,v,beta_1,beta_2, b1t, b2t, epsilon, nparams} = this;
     for(var i= 0 ; i < nparams ; ++i) {
        m[i]= beta_1*m[i] + (1-beta_1)*g[i]; // Update biased first moment estimate
        v[i]= beta_2*v[i] + (1-beta_2)*g[i]*g[i]; // Update biased second raw moment estimate
        var mub= m[i]/(1 - b1t); // Compute bias-corrected first moment estimate
        var vub= v[i]/(1 - b2t); // Compute bias-corrected second raw moment estimate
        updates[i]= - alpha * mub / ( Math.sqrt(vub) + epsilon ); // Return parameter updates
     }
     this.b1t *= beta_1; // advance beta_1^t for the next timestep
     this.b2t *= beta_2; // advance beta_2^t for the next timestep
return updates;
};
}
```
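To see the optimizer converge, here is a self-contained usage sketch. It repeats a condensed copy of the class (renamed `AdamDemo` so the two snippets can coexist) and minimizes the toy function f(x) = (x - 3)^2, whose gradient is 2 * (x - 3):

```javascript
class AdamDemo {
  constructor(nparams, beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-8) {
    this.beta_1 = beta_1; this.beta_2 = beta_2; this.epsilon = epsilon;
    this.m = new Array(nparams).fill(0); // 1st moment vector
    this.v = new Array(nparams).fill(0); // 2nd moment vector
    this.b1t = beta_1; this.b2t = beta_2; // running beta^t accumulators
  }
  get_update(alpha, g) {
    const updates = g.map((gi, i) => {
      this.m[i] = this.beta_1 * this.m[i] + (1 - this.beta_1) * gi;
      this.v[i] = this.beta_2 * this.v[i] + (1 - this.beta_2) * gi * gi;
      const mub = this.m[i] / (1 - this.b1t); // bias-corrected 1st moment
      const vub = this.v[i] / (1 - this.b2t); // bias-corrected 2nd moment
      return -alpha * mub / (Math.sqrt(vub) + this.epsilon);
    });
    this.b1t *= this.beta_1; // advance beta_1^t for the next step
    this.b2t *= this.beta_2;
    return updates;
  }
}

// Minimize f(x) = (x - 3)^2; its gradient is 2 * (x - 3).
const opt = new AdamDemo(1);
let x = [-4];
for (let step = 0; step < 2000; step++) {
  const grad = [2 * (x[0] - 3)];
  x[0] += opt.get_update(0.1, grad)[0];
}
console.log(x[0]); // close to 3
```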
| 37.358974 | 109 | 0.553191 | eng_Latn | 0.527103 |
f9654885a66ac32ef0cb89408e481dcaec6f98e9 | 7,639 | md | Markdown | docs/cpp/header-files-cpp.md | Mdlglobal-atlassian-net/cpp-docs.it-it | c8edd4e9238d24b047d2b59a86e2a540f371bd93 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/cpp/header-files-cpp.md | Mdlglobal-atlassian-net/cpp-docs.it-it | c8edd4e9238d24b047d2b59a86e2a540f371bd93 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/cpp/header-files-cpp.md | Mdlglobal-atlassian-net/cpp-docs.it-it | c8edd4e9238d24b047d2b59a86e2a540f371bd93 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-28T15:54:57.000Z | 2020-05-28T15:54:57.000Z | ---
title: Header files (C++)
ms.date: 12/11/2019
helpviewer_keywords:
- header files [C++]
ms.openlocfilehash: 4ab6a2b2626cde94f35678bc9ec789b80d493b8f
ms.sourcegitcommit: c123cc76bb2b6c5cde6f4c425ece420ac733bf70
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 04/14/2020
ms.locfileid: "81367235"
---
# <a name="header-files-c"></a>Header files (C++)
The names of program elements such as variables, functions, classes, and so on must be declared before they can be used. For example, you can't simply write `x = 42` without first declaring 'x'.
```cpp
int x; // declaration
x = 42; // use x
```
The declaration tells the compiler whether the element is an **int**, a **double**, a **function**, a **class**, or some other thing. Furthermore, each name must be declared (directly or indirectly) in every .cpp file in which it is used. When you compile a program, each .cpp file is compiled independently into a compilation unit. The compiler has no knowledge of what names are declared in other compilation units. That means that if you define a class, function, or global variable, you must provide a declaration of it in each additional .cpp file that uses it. Each declaration of that thing must be exactly identical in all files. A slight inconsistency will cause errors, or unintended behavior, when the linker attempts to merge all the compilation units into a single program.
To minimize the potential for errors, C++ has adopted the convention of using *header files* to contain declarations. You make the declarations in a header file, then use the #include directive in every .cpp file or other header file that requires that declaration. The #include directive inserts a copy of the header file directly into the .cpp file prior to compilation.
> [!NOTE]
> In Visual Studio 2019, the C++20 *modules* feature was introduced as an improvement and eventual replacement for header files. For more information, see [Overview of modules in C++](modules-cpp.md).
## <a name="example"></a>Example
The following example shows a common way to declare a class and then use it in a different source file. We'll start with the header file, `my_class.h`. It contains a class definition, but note that the definition is incomplete; the member function `do_something` is not defined:
```cpp
// my_class.h
namespace N
{
class my_class
{
public:
void do_something();
};
}
```
Next, create an implementation file (typically with a .cpp or similar extension). We'll call the file my_class.cpp and provide a definition for the member declaration. We add an `#include` directive for the "my_class.h" file in order to have the my_class declaration inserted at this point in the .cpp file, and we include `<iostream>` to pull in the declaration for `std::cout`. Note that quotes are used for header files in the same directory as the source file, and angle brackets are used for standard library headers. Also, many standard library headers do not have .h or any other file extension.
In the implementation file, we can optionally use a **using** statement to avoid having to qualify every mention of "my_class" or "cout" with "N::" or "std::". Don't put **using** statements in your header files!
```cpp
// my_class.cpp
#include "my_class.h" // header in local directory
#include <iostream> // header in standard library
using namespace N;
using namespace std;
void my_class::do_something()
{
cout << "Doing something!" << endl;
}
```
Now we can use `my_class` in another .cpp file. We #include the header file so that the compiler pulls in the declaration. All the compiler needs to know is that my_class is a class with a public member function called `do_something()`.
```cpp
// my_program.cpp
#include "my_class.h"
using namespace N;
int main()
{
my_class mc;
mc.do_something();
return 0;
}
```
After the compiler finishes compiling each .cpp file into an .obj file, it passes the .obj files to the linker. When the linker merges the object files, it finds exactly one definition for my_class; it is in the .obj file produced for my_class.cpp, and the build succeeds.
## <a name="include-guards"></a>Include guards
Typically, header files have an *include guard* or a `#pragma once` directive to ensure that they are not inserted multiple times into a single .cpp file.
```cpp
// my_class.h
#ifndef MY_CLASS_H // include guard
#define MY_CLASS_H
namespace N
{
class my_class
{
public:
void do_something();
};
}
#endif /* MY_CLASS_H */
```
## <a name="what-to-put-in-a-header-file"></a>What to put in a header file
Because a header file might potentially be included by multiple files, it cannot contain definitions that might produce multiple definitions of the same name. The following are not allowed, or are considered very bad practice:
- built-in type definitions at namespace or global scope
- non-inline function definitions
- non-const variable definitions
- aggregate definitions
- unnamed namespaces
- using directives
The use of a **using** directive will not necessarily cause an error, but it can potentially cause a problem because it brings the namespace into scope in every .cpp file that directly or indirectly includes that header.
## <a name="sample-header-file"></a>Sample header file
The following example shows the various kinds of declarations and definitions that are allowed in a header file:
```cpp
// sample.h
#pragma once
#include <vector> // #include directive
#include <string>
namespace N // namespace declaration
{
inline namespace P
{
//...
}
enum class colors : short { red, blue, purple, azure };
const double PI = 3.14; // const and constexpr definitions
constexpr int MeaningOfLife{ 42 };
constexpr int get_meaning()
{
static_assert(MeaningOfLife == 42, "unexpected!"); // static_assert
return MeaningOfLife;
}
using vstr = std::vector<int>; // type alias
extern double d; // extern variable
#define LOG // macro definition
#ifdef LOG // conditional compilation directive
void print_to_log();
#endif
class my_class // regular class definition,
{ // but no non-inline function definitions
friend class other_class;
public:
void do_something(); // definition in my_class.cpp
inline void put_value(int i) { vals.push_back(i); } // inline OK
private:
vstr vals;
int i;
};
struct RGB
{
short r{ 0 }; // member initialization
short g{ 0 };
short b{ 0 };
};
template <typename T> // template definition
class value_store
{
public:
value_store<T>() = default;
void write_value(T val)
{
//... function definition OK in template
}
private:
std::vector<T> vals;
};
template <typename T> // template declaration
class value_widget;
}
```
| 40.632979 | 906 | 0.738055 | ita_Latn | 0.998882 |
f966c6d100a4413ab6c8f97ef0399a5b7f883827 | 111 | md | Markdown | input/corporate-sponsors/volosoft.md | kennedylabs/website | 3e171d4f6b126d8d4857edcba540498241fe21c3 | [
"MIT"
] | 2 | 2021-09-03T01:45:12.000Z | 2021-12-03T03:00:28.000Z | input/corporate-sponsors/volosoft.md | kennedylabs/website | 3e171d4f6b126d8d4857edcba540498241fe21c3 | [
"MIT"
] | 1 | 2021-10-21T12:58:58.000Z | 2021-10-21T12:58:58.000Z | input/corporate-sponsors/volosoft.md | kennedylabs/website | 3e171d4f6b126d8d4857edcba540498241fe21c3 | [
"MIT"
] | 2 | 2021-12-11T08:26:41.000Z | 2021-12-12T08:43:33.000Z | ---
name: Volosoft
logo: assets/corporate-sponsors/volosoft.png
url: https://volosoft.com/
order: 10
---
| 15.857143 | 45 | 0.684685 | nld_Latn | 0.241557 |
f967b276730ecee816eb0c6de90d416fa079b364 | 718 | md | Markdown | CHANGELOG.md | cmeister2/corrosion | 5ea01d3769c1357781b97fa1f6780500a6783049 | [
"MIT"
] | 1 | 2019-07-25T11:06:13.000Z | 2019-07-25T11:06:13.000Z | CHANGELOG.md | cmeister2/corrosion | 5ea01d3769c1357781b97fa1f6780500a6783049 | [
"MIT"
] | null | null | null | CHANGELOG.md | cmeister2/corrosion | 5ea01d3769c1357781b97fa1f6780500a6783049 | [
"MIT"
] | null | null | null | # Changelog
All notable changes to this project will be documented in this file.
This file's format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/). The
version number is tracked in the file `VERSION`.
## [Unreleased]
### Changed
- Remove the Cargo.lock file from this library
### Added
## [0.0.2] - 2018-10-21
### Changed
- Moved to a library model
- Got Travis CI working
## [0.0.1] - 2018-10-21
### Changed
- Initial commit
[Unreleased]: https://github.com/cmeister2/corrosion/compare/0.0.2...HEAD
[0.0.2]: https://github.com/cmeister2/corrosion/compare/0.0.1...0.0.2
[0.0.1]: https://github.com/cmeister2/corrosion/tree/0.0.1
| 26.592593 | 77 | 0.711699 | eng_Latn | 0.798076 |
f967ec175b208791c13b71e06fe05d230273bd14 | 64 | md | Markdown | README.md | davdtheemonk/LudoGameJava | 5b48821a29f6ec433e9bdf0b181526bd9251c22d | [
"MIT"
] | 1 | 2021-12-14T08:46:53.000Z | 2021-12-14T08:46:53.000Z | README.md | davdtheemonk/LudoGameJava | 5b48821a29f6ec433e9bdf0b181526bd9251c22d | [
"MIT"
] | null | null | null | README.md | davdtheemonk/LudoGameJava | 5b48821a29f6ec433e9bdf0b181526bd9251c22d | [
"MIT"
] | null | null | null | # LudoGameJava
I'm just learning 📌
Simple Ludo Game no GUI....
| 12.8 | 27 | 0.703125 | kor_Hang | 0.530944 |
f96824c48d97557c1cb583fae60f9563ecc6c08f | 588 | md | Markdown | docs/docs/changeLog.md | alibaba/form-driver | 8598c98b5c4238927fe0ce553fd2378589124fcb | [
"MIT"
] | 7 | 2021-11-17T06:14:38.000Z | 2022-03-24T12:02:54.000Z | docs/docs/changeLog.md | alibaba/form-driver | 8598c98b5c4238927fe0ce553fd2378589124fcb | [
"MIT"
] | 4 | 2022-01-25T06:44:30.000Z | 2022-03-04T09:36:58.000Z | docs/docs/changeLog.md | alibaba/form-driver | 8598c98b5c4238927fe0ce553fd2378589124fcb | [
"MIT"
] | 1 | 2021-11-17T07:05:00.000Z | 2021-11-17T07:05:00.000Z | ### 0.3.14 (2021.06.18)
* NPS now supports configurable decorating text
* Fixed errors thrown by some components when no props are passed
### 0.3.12 (2021.06.16)
* Added support for multi-column layout
### 0.3.10 (2021.06.12)
* Added an attachment upload component
### 0.3.9 (2021.06.12)
* Renamed: `candidates` is now `option`
* Renamed: `setOpen` and `enumOpen` are now `openOption`
* Added support for a tree selector
### 0.3.6 (2021.06.07)
* Added NPS attributes
* Fixed an issue where a `database` value not present in the enum would appear among the options after selection
* M3 local builds no longer trigger hot reload
* Fixed an error with read-only A tags
* Fixed imported less files having no effect
### 0.3.1 (2021.06.03)
* schema now accepts arrays, and `type` can be omitted
* Unified the fields that represent options
* Improved the way plugins are added
* Extended expressions for the A tag
### 0.2.4 (2021.05.21)
* M3: fixed `database` being modified externally
* Added support for simpler plugin invocation
* Eliminated display differences between rich-text edit and display modes
* Fixed M3 validation issues
### 0.1.9 (2021.01.27)
* Refactored the M3 project code architecture
| 13.066667 | 40 | 0.647959 | yue_Hant | 0.6285 |
f968e331fa9dacab039af127207156754909049d | 2,077 | md | Markdown | data-management-library/autonomous-database/dedicated/adb-datasafe/Register a Target Database.md | mrabhiram/learning-library | a638cc973a52006e08bff4ec9f41c6122a987b8a | [
"UPL-1.0"
] | 2 | 2020-09-29T20:21:15.000Z | 2020-10-20T11:45:55.000Z | data-management-library/autonomous-database/dedicated/adb-datasafe/Register a Target Database.md | mrabhiram/learning-library | a638cc973a52006e08bff4ec9f41c6122a987b8a | [
"UPL-1.0"
] | null | null | null | data-management-library/autonomous-database/dedicated/adb-datasafe/Register a Target Database.md | mrabhiram/learning-library | a638cc973a52006e08bff4ec9f41c6122a987b8a | [
"UPL-1.0"
] | null | null | null | # Register a dedicated ADB instance with Data Safe
## Introduction
Oracle Data Safe can connect to an Oracle Cloud database that has a public or private IP address on a virtual cloud network (VCN) in Oracle Cloud Infrastructure (OCI). This workshop describes the difference between public and private endpoints and explains the network connection between Oracle Data Safe and the databases. It also walks you through the steps of creating a private endpoint and registering an Autonomous Database (ADB) on dedicated Exadata Infrastructure with Oracle Data Safe.
## STEP 1: Register your ADB-D
If your DB system has a private IP address, you need to create a private endpoint for it prior to registering it with Oracle Data Safe. You can create private endpoints on the Data Safe page in OCI. Be sure to create the private endpoint in the same tenancy and VCN as your database. The private IP address does not need to be on the same subnet as your database, although it does need to be on a subnet that can communicate with the database. You can create a maximum of one private endpoint per VCN.
However, with the Autonomous Database, the entire registration is reduced to a single click operation.
- From the navigation menu, go to Autonomous Transaction Processing. Ensure you are in the right compartment and select the ADB-D instance you wish to register.

- In the bottom-right corner you will see an option to register your ADB-D database. Click it and select Confirm.

- Once registered, click on "View Console". This takes you directly to the Data Safe console in your Region.

## Acknowledgements
*Great work! You have successfully registered your ADB-D with Data Safe.*
- **Author** - Padma Priya Rajan, Navya M S & Jayshree Chatterjee
- **Last Updated By/Date** - Jayshree Chatterjee, July 2020
See an issue? Please open up a request [here](https://github.com/oracle/learning-library/issues). Please include the workshop name and lab in your request.
| 59.342857 | 502 | 0.76986 | eng_Latn | 0.995232 |
f9693016689801a50fdbe9a81dc9e6897c598d5f | 1,403 | md | Markdown | tccli/examples/essbasic/v20201222/DescribeFlowApprovers.md | HS-Gray/tencentcloud-cli | 3822fcfdfed570fb526fe49abe6793e2f9127f4a | [
"Apache-2.0"
] | 47 | 2018-05-31T11:26:25.000Z | 2022-03-08T02:12:45.000Z | tccli/examples/essbasic/v20201222/DescribeFlowApprovers.md | HS-Gray/tencentcloud-cli | 3822fcfdfed570fb526fe49abe6793e2f9127f4a | [
"Apache-2.0"
] | 23 | 2018-06-14T10:46:30.000Z | 2022-02-28T02:53:09.000Z | tccli/examples/essbasic/v20201222/DescribeFlowApprovers.md | HS-Gray/tencentcloud-cli | 3822fcfdfed570fb526fe49abe6793e2f9127f4a | [
"Apache-2.0"
] | 22 | 2018-10-22T09:49:45.000Z | 2022-03-30T08:06:04.000Z | **Example 1: Query an individual account**
Input:
```
tccli essbasic DescribeFlowApprovers --cli-unfold-argument \
--Caller.ApplicationId \
--Caller.SubOrganizationId \
--Caller.OperatorId \
--FlowId \
--UserId \
--SignId
```
Output:
```
{
"Response": {
"RequestId": "xx",
"FlowId": "",
"Approvers": [
{
"UserId": "",
"VerifyChannel": [
"WEIXINAPP",
"FACEID"
],
"ApproveStatus": 0,
"ApproveMessage": "",
"ApproveTime": 0,
"SubOrganizationId": "",
"JumpUrl": "",
"ComponentSeals": [
{
"ComponentId": "",
"SealId": ""
}
],
"IsFullText": true,
"PreReadTime": 30,
"Mobile": "",
"Deadline": 0,
"Name": "",
"IdCardType": "ID_CARD",
"IdCardNumber": "",
"CallbackUrl": "",
"CanOffLine": false,
"IsLastApprover": false,
"SignId": "",
"SmsTemplate": {
"TemplateId": "",
"Sign": ""
}
}
]
}
}
```
| 22.629032 | 61 | 0.332145 | yue_Hant | 0.658511 |
f969939dab2067baaab90efec94f91580c309b8c | 69 | md | Markdown | README.md | MetamorphPHP/mongodb-transformations | 2e119fed11ab625a92c248438c0d8abffd85659f | [
"MIT"
] | null | null | null | README.md | MetamorphPHP/mongodb-transformations | 2e119fed11ab625a92c248438c0d8abffd85659f | [
"MIT"
] | null | null | null | README.md | MetamorphPHP/mongodb-transformations | 2e119fed11ab625a92c248438c0d8abffd85659f | [
"MIT"
] | null | null | null | Metamorph MongoDB Transformations
=================================
| 17.25 | 33 | 0.449275 | oci_Latn | 0.969378 |
f9699b49ae89605187a7a16d35117bf2a3c420f2 | 1,142 | md | Markdown | _posts/2021-09-17-shower.md | ra559/testjekyll | 2843d6b30415beee96fb6980f60c312f9b44bd41 | [
"MIT"
] | null | null | null | _posts/2021-09-17-shower.md | ra559/testjekyll | 2843d6b30415beee96fb6980f60c312f9b44bd41 | [
"MIT"
] | null | null | null | _posts/2021-09-17-shower.md | ra559/testjekyll | 2843d6b30415beee96fb6980f60c312f9b44bd41 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Shower"
date: 2021-09-17 00:00:00 -0400
categories: Poems
---
<br>
She was late for work <br>
I came late from work last night <br>
My insomnia did not let me sleep <br>
I stared at her the rest of the night. <br>
When I heard the shower, I could not resist <br>
I entered slowly because the door was open <br>
She was a little surprised when I touched her elbow <br>
We have not showered together in months. <br>
The water was melting our bodies together <br>
I could feel the passion brewing while we kissed <br>
She no longer cares she was late <br>
Someone else will have to open the store today. <br>
My hands were rediscovering her body <br>
She felt unknown to me like the first day <br>
I do not know what is the deal with the shower <br>
But It made me want to love her harder than ever. <br>
I pressed her to the shower glass <br>
She let a small moan escape <br>
I loved every minute of it <br>
This shower. This moment. May it never end. <br>
By TTDLMR
| 28.55 | 141 | 0.740806 | eng_Latn | 0.999689 |
f969c432413fb9ee4397de6f3d004994471d7cdf | 2,315 | md | Markdown | CHANGELOG.md | nohli/build_context | 04518d60625f23e6a1f5c7235d534ebe1864c04d | [
"MIT"
] | 132 | 2020-04-03T13:27:36.000Z | 2022-02-20T11:30:55.000Z | CHANGELOG.md | nohli/build_context | 04518d60625f23e6a1f5c7235d534ebe1864c04d | [
"MIT"
] | 15 | 2020-04-03T14:01:37.000Z | 2021-03-10T15:15:22.000Z | CHANGELOG.md | nohli/build_context | 04518d60625f23e6a1f5c7235d534ebe1864c04d | [
"MIT"
] | 16 | 2020-04-21T12:55:02.000Z | 2022-02-11T11:06:49.000Z | ## [3.0.0] - 07/03/2021.
* Migrated to null safety
* Migrated to ScaffoldMessenger
Added support for:
* `context.platform`
* `context.headline1`
* `context.headline2`
* `context.headline3`
* `context.headline4`
* `context.headline5`
* `context.headline6`
* `context.subtitle1`
* `context.bodyText1`
* `context.bodyText2`
* `context.caption`
* `context.button`
* `context.subtitle2`
* `context.overline`
* `context.modalRoute`
* `context.routeSettings`
## [2.0.4] - 18/07/2020.
Added support for:
* `context.platform`
* `context.isAndroid`
* `context.isIOS`
* `context.isWindows`
* `context.isMacOS`
* `context.isLinux`
* `context.isFuchsia`
* `context.closeKeyboard()`
## [2.0.2] - 09/06/2020.
Added support for:
* `context.theme`
## [2.0.1] - 18/05/2020.
* `context.pop()` param value is now optional
## [2.0.0] - 06/05/2020.
This is a breaking change for users of the below functions:
* `context.pop()` now returns void as the default function (Flutter 1.17.0)
* `context.unfocus()` now receives [UnfocusDisposition] as an optional argument (Flutter 1.17.0)
## [1.1.1] - 05/05/2020.
The close reasons passed to `removeCurrentSnackBar` and `hideCurrentSnackBar` are now optional, as in the default implementation
## [1.1.0] - 23/04/2020.
* Added support for:
* `context.focusScope.hasFocus`
* `context.focusScope.isFirstFocus`
* `context.focusScope.canRequestFocus`
* `context.focusScope.hasPrimaryFocus`
* `context.focusScope.unfocus()`
* `context.focusScope.nextFocus()`
* `context.focusScope.requestFocus()`
* `context.focusScope.previousFocus()`
* `context.focusScope.setFirstFocus()`
* `context.focusScope.consumeKeyboardToken()`
## [1.0.0] - 21/04/2020.
Added support for:
* `context.mediaQueryShortestSide`
* `context.isPhone`
* `context.isTablet`
* `context.isSmallTablet`
* `context.isLargeTablet`
## [0.0.5] - 07/04/2020.
* Code format
## [0.0.4] - 04/04/2020.
Added support for:
* `context.form.validate()`
* `context.form.reset()`
* `context.form.save()`
## [0.0.3] - 04/03/2020.
Added support for:
* `context.isLandscape`
* `context.isPortrait`
* `context.mediaQueryViewInsets`
* `context.mediaQueryViewPadding`
## [0.0.2] - 04/03/2020.
* Fixed README.md format issue
## [0.0.1] - 04/03/2020.
Added support for the following classes:
* `MediaQuery`
* `Theme`
* `Navigator`
* `Scaffold`
---
path: "/accueil/"
page: 1
title: "Presentation"
date: "2019-05-19"
---
# Need a website?
### I'm here for you!
---
layout: page
title: OldExams
permalink: /oldExams/
---
<html>
<body>
<h2> EE209 Old Midterm Exams </h2>
<style type="text/css">
.tg {border-collapse:collapse;border-spacing:0;}
.tg td{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
overflow:hidden;padding:10px 5px;word-break:normal;}
.tg th{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
font-weight:normal;overflow:hidden;padding:10px 5px;word-break:normal;}
.tg .tg-wp8o{border-color:#000000;text-align:center;vertical-align:top}
</style>
<table class="tg">
<thead>
<tr>
<th class="tg-0lax">2010</th>
<th class="tg-0lax">Fall</th>
<th class="tg-0lax"><a href="../oldmidterm/fall10exam_KAIST.pdf">Exam sheet</a></th>
<th class="tg-0lax"><a href="../oldmidterm/fall10exam_KAISTans.pdf">Solution</a></th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-0lax">2011</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"><a href="../oldmidterm/fall11exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/fall11exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2012</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"><a href="../oldmidterm/fall12exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/fall12exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2013</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"><a href="../oldmidterm/fall13exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"></td>
</tr>
<tr>
<td class="tg-0lax">2015</td>
<td class="tg-0lax">Spring</td>
<td class="tg-0lax"><a href="../oldmidterm/spring15exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"></td>
</tr>
<tr>
<td class="tg-0lax">2015</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"><a href="../oldmidterm/fall15exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/fall15exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2016</td>
<td class="tg-0lax">Spring</td>
<td class="tg-0lax"><a href="../oldmidterm/spring16exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/spring16exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2016</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"><a href="../oldmidterm/fall16exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/fall16exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2017</td>
<td class="tg-0lax">Spring</td>
<td class="tg-0lax"><a href="../oldmidterm/spring17exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/spring17exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2017</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"><a href="../oldmidterm/fall17exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/fall17exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2018</td>
<td class="tg-0lax">Spring</td>
<td class="tg-0lax"><a href="../oldmidterm/spring18exam_KAIST.pdf">Exam sheet</a></td>
<td class="tg-0lax"><a href="../oldmidterm/spring18exam_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2018</td>
<td class="tg-0lax">Spring</td>
<td class="tg-0lax"><a href="../oldmidterm/spring18c++sample_KAIST.pdf">C++ Sample</a></td>
<td class="tg-0lax"><a href="../oldmidterm/spring18c++sample_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-0lax">2020</td>
<td class="tg-0lax">Fall</td>
<td class="tg-0lax"></td>
<td class="tg-0lax"><a href="../oldmidterm/fall20exam_KAISTans.pdf">Solution</a></td>
</tr>
</tbody>
</table>
</body>
<br>
<br>
<h2> EE209 Old Final Exams </h2>
<table class="tg">
<thead>
<tr>
<th class="tg-wp8o">2010</th>
<th class="tg-wp8o">Fall</th>
<th class="tg-wp8o"><a href="../oldfinal/fall10exam_final_KAIST.pdf">Exam sheet</a></th>
<th class="tg-wp8o"><a href="../oldfinal/fall10exam_final_KAISTans.pdf">Solution</a></th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-wp8o">2011</td>
<td class="tg-wp8o">Fall</td>
<td class="tg-wp8o"><a href="../oldfinal/fall11exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"><a href="../oldfinal/fall11exam_final_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-wp8o">2012</td>
<td class="tg-wp8o">Fall</td>
<td class="tg-wp8o"><a href="../oldfinal/fall12exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"><a href="../oldfinal/fall12exam_final_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-wp8o">2013</td>
<td class="tg-wp8o">Fall</td>
<td class="tg-wp8o"><a href="../oldfinal/fall13exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"></td>
</tr>
<tr>
<td class="tg-wp8o">2015</td>
<td class="tg-wp8o">Spring</td>
<td class="tg-wp8o"><a href="../oldfinal/spring15exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"></td>
</tr>
<tr>
<td class="tg-wp8o">2015</td>
<td class="tg-wp8o">Fall</td>
<td class="tg-wp8o"><a href="../oldfinal/fall15exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"></td>
</tr>
<tr>
<td class="tg-wp8o">2016</td>
<td class="tg-wp8o">Spring</td>
<td class="tg-wp8o"><a href="../oldfinal/spring16exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"><a href="../oldfinal/spring16exam_final_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-wp8o">2016</td>
<td class="tg-wp8o">Fall</td>
<td class="tg-wp8o"><a href="../oldfinal/fall16exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"><a href="../oldfinal/fall16exam_final_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-wp8o">2017</td>
<td class="tg-wp8o">Spring</td>
<td class="tg-wp8o"><a href="../oldfinal/spring17exam_final_KAIST.pdf">Exam sheet</a></td>
<td class="tg-wp8o"><a href="../oldfinal/spring17exam_final_KAISTans.pdf">Solution</a></td>
</tr>
<tr>
<td class="tg-wp8o">2020</td>
<td class="tg-wp8o">Fall</td>
<td class="tg-wp8o"></td>
<td class="tg-wp8o"><a href="../oldfinal/fall20exam_final_KAIST.pdf">Solution</a></td>
</tr>
</tbody>
</table>
<br>
<br>
</html>
---
title: Defender-IoT-micro-agent and device twins
description: Learn about the concept of Defender-IoT-micro-agent twins and how they are used in Defender for IoT.
ms.topic: conceptual
ms.date: 05/25/2021
---
# Defender-IoT-micro-agent
This article explains how Defender for IoT uses device twins and modules.
## Device twins
For IoT solutions built in Azure, device twins play a key role in both device management and process automation.
Defender for IoT offers full integration with your existing IoT device management platform, enabling you to manage your device security status and use your existing device control capabilities. Integration is achieved through the IoT Hub twin mechanism.
Learn more about the concept of [Device twins](../../iot-hub/iot-hub-devguide-device-twins.md#device-twins) in Azure IoT Hub.
## Defender-IoT-micro-agent twins
Defender for IoT maintains a Defender-IoT-micro-agent twin for each device in the service.
The Defender-IoT-micro-agent twin holds all the information relevant to device security for each specific device in your solution.
Device security properties are maintained in a dedicated Defender-IoT-micro-agent twin for safer communication and for enabling updates and maintenance that requires fewer resources.
See [Create Defender-IoT-micro-agent twin](quickstart-create-security-twin.md) and [Configure security agents](how-to-agent-configuration.md) to learn how to create, customize, and configure the twin. See [Understand and use module twins in IoT Hub](../../iot-hub/iot-hub-devguide-module-twins.md) to learn more about the concept of module twins in IoT Hub.
## See also
- [Defender for IoT overview](overview.md)
- [Deploy security agents](how-to-deploy-agent.md)
- [Security agent authentication methods](concept-security-agent-authentication-methods.md)
# Facebook PyTorch Developer Conference
## We are heading down!
In barely 2 years, PyTorch (Facebook) will be hosting its first [PyTorch Developer Conference](https://pytorch.fbreg.com) in San Francisco, USA.
I will be heading down, thanks to Soumith Chintala for the invite and arrangements. Looking forward to meeting anyone who will be there.
The PyTorch ecosystem has grown tremendously from when I first started using it. To date, I've taught more than 3000 students worldwide in 120+ countries, and every single wizard has fallen in love with it!
Cheers,
<br />[Ritchie Ng](https://www.ritchieng.com/)
# BP6A Example
This is the getting-started example project for BP6A.
It contains ECG, PPG, BIA, GPIO, and I2C example applications.
You can build the project with all supported Mbed OS build tools.
## Mbed studio
You can import the s1sbp6a example.
* step 1. Open the File menu and select Import Program
* step 2. Paste the full HTTPS URL of this example, "https://github.com/ARMmbed/mbed-os-example-s1sbp6a"
## Mbed CLI
You can import the example after cloning it manually.
* step 1. manually clone this example
```bash
$git clone https://github.com/ARMmbed/mbed-os-example-s1sbp6a
```
* step 2. set this program directory as the root of your program
```bash
$mbed new .
```
* step 3. get all missing libraries
```bash
mbed deploy
```
#### Build & Download
You can build the example and copy the resulting binary to the DAPLink drive to flash the firmware image.
```bash
$mbed compile -t <TOOLCHAIN> -m S1SBP6A
```
## Expected output
When started, the example shows a menu you can select from.
```bash
$sudo minicom /dev/ttyACM0
SIDK BP6A Example
========= Menu =========
bia example : b
ecg example : e
ppg example : p
gpio example : g
i2c example : i
stop example : q
============================
Select menu >>
```
### BIA example
This example is for body impedance analysis (BIA).
```bash
========= Menu =========
bia example : b
ecg example : e
ppg example : p
gpio example : g
i2c example : i
stop example : q
============================
Select menu >> b
Select menu >>BIA ExampleStart
Calibration Start
calibration r: 0, code: 849
calibration r: 1, code: 408
5653
215
170
150
142
139
142
148
157
166
175
187
195
202
209
215
221
```
### ECG example
This example is for ECG.
```bash
========= Menu =========
bia example : b
ecg example : e
ppg example : p
gpio example : g
i2c example : i
stop example : q
============================
Select menu >> e
Select menu >>ECG ExampleStart
2681
2620
2547
2553
2654
2634
2583
2543
2632
2676
2602
2555
2577
2675
2633
2564
2583
2663
2639
2579
```
You can also see the ECG data as a graph with bp6a_data_plot.py
```bash
$python bp6a_data_plot.py -p /dev/ttyACM0
```

### PPG example
This example is for PPG.
```bash
========= Menu =========
bia example : b
ecg example : e
ppg example : p
gpio example : g
i2c example : i
stop example : q
============================
Select menu >> p
Select menu >>PPG ExampleStart
2166293
2174849
2181769
2162016
2170146
2172667
2168917
2166199
2158735
2171078
2173429
2161364
2169346
2167306
2195398
2182067
2183647
2185844
2170946
```
You can also see the PPG data as a graph with bp6a_data_plot.py
```bash
$python bp6a_data_plot.py -p /dev/ttyACM0
```

### GPIO example
This example is for the LED, buzzer, and button.
When the button is pressed, the LED color changes and the LED flashes repeatedly or a buzzer sounds.
```bash
========= Menu =========
bia example : b
ecg example : e
ppg example : p
gpio example : g
i2c example : i
stop example : q
============================
Select menu >> g
Select menu >>start gpio_example
if you press the Test button, LED and buzzer will be changed
```
### i2c example
You can test the IMU with the I2C example.
```bash
========= Menu =========
bia example : b
ecg example : e
ppg example : p
gpio example : g
i2c example : i
stop example : q
============================
Select menu >> i
Select menu >>i2c IMU example
Who am I : 0x12
Accel Config : 0x0
Gyro Config : 0x0
Accelerometer : -172 292 16652
Gyroscope : -16 -39 43
Accelerometer : -144 192 16516
Gyroscope : -7 -62 45
Accelerometer : -176 220 16708
Gyroscope : -34 -40 28
```
### API
### Carousel
| Prop | Description | Type | Default | Options |
| ------------ | :----------------|:------- | :----- | :----- |
| animation | Transition animation | String | slide | `fade` |
| indicator | Indicator settings | {position, style} | {position: `bottom`, style: `circle`} | {position: `left/top/bottom/right`, style: `number/circle`} |
| duration | Autoplay interval, in milliseconds | Number | 2000 | —— |
| autoPlay | Whether to advance slides automatically | Boolean | false | `true` |
| className | Custom class name | String | —— | —— |
| style | Custom inline style | React.CSSProperties | —— | —— |
---
title: CustomLabels.Item method (Word)
keywords: vbawd10.chm152436736
f1_keywords:
- vbawd10.chm152436736
ms.prod: word
api_name:
- Word.CustomLabels.Item
ms.assetid: 3b0734f9-de26-3722-7267-2665fa73d9f9
ms.date: 06/08/2017
ms.localizationpriority: medium
---
# CustomLabels.Item method (Word)
Returns a **CustomLabel** object in a collection.
## Syntax
_expression_.**Item** (_Index_)
_expression_ Required. A variable that represents a '[CustomLabels](Word.customlabels.md)' collection.
## Parameters
|Name|Required/Optional|Data type|Description|
|:-----|:-----|:-----|:-----|
| _Index_|Required| **Variant**|The individual object to be returned. Can be a **Long** indicating the ordinal position or a **String** representing the name of the individual object.|
## Return value
CustomLabel
## See also
[CustomLabels Collection Object](Word.customlabels.md)
[!include[Support and feedback](~/includes/feedback-boilerplate.md)]
---
title: 'Lync Server 2013: Configure add-ins for rooms'
TOCTitle: Configure add-ins for rooms
ms:assetid: 4eeaf19e-8369-4f6f-af65-a283cf7daa1c
ms:mtpsurl: https://technet.microsoft.com/es-es/library/JJ204878(v=OCS.15)
ms:contentKeyID: 48275245
ms.date: 01/07/2017
mtps_version: v=OCS.15
ms.translationtype: HT
---
# Configure add-ins for rooms in Lync Server 2013
_**Topic last modified:** 2013-02-21_
In Lync Server 2013 Control Panel, you can use the **Add-in** section of the **Persistent Chat** page to associate URLs with Persistent Chat rooms. These URLs appear in the Lync 2013 client, in the chat room's conversation extensibility pane. An administrator must add add-ins to the list of registered add-ins, allowing rooms to be associated with one of the registered add-ins, before users can see this update in their Lync 2013 client.
Add-ins are used to extend the in-room experience. A typical add-in might include a URL that points to a Silverlight application that is triggered when a stock ticker is posted to a chat room, displaying the quote history in the extensibility pane. Other examples include embedding a OneNote 2013 URL in the chat room as an add-in to provide shared context, such as a "Priority list" or "Topic of the day."
## To configure add-ins for chat rooms
1. From a user account that is assigned to the CsPersistentChatAdministrator or CsAdministrator role, log on to any computer in the internal deployment.
2. From the **Start** menu, select Lync Server Control Panel, or open a browser window and enter the admin URL. For details about the different methods you can use to open Lync Server Control Panel, see [Open Lync Server administrative tools](lync-server-2013-open-lync-server-administrative-tools.md).
> [!IMPORTANT]
> You can also use Windows PowerShell cmdlets. For details, see <a href="configuring-persistent-chat-server-by-using-windows-powershell-cmdlets.md">Configuring Persistent Chat Server by using Windows PowerShell cmdlets</a> in the Deployment documentation.
3. In the left navigation bar, click **Persistent Chat**, and then click **Add-in**.
For multiple Persistent Chat Server pool deployments, select the appropriate pool from the drop-down list.
4. On the **Add-in** page, click **New**.
5. In **Select a service**, select the service that corresponds to the Persistent Chat Server pool where you need to create the add-in. Add-ins cannot be moved from one pool to another or shared across pools.
6. In **New Add-in**, do the following:
  - In **Name**, specify a name for the new add-in.
  - In **URL**, specify the URL to be associated with the add-in. URLs are limited to the http and https protocols.
7. Click **Commit**.
## See also
#### Tasks
[Open Lync Server administrative tools](lync-server-2013-open-lync-server-administrative-tools.md)
#### Concepts
[Configuring Persistent Chat Server by using Windows PowerShell cmdlets](configuring-persistent-chat-server-by-using-windows-powershell-cmdlets.md)
# Flask Todo Demo
## Environment and dependencies
Flask Todo Demo is a [LeanEngine][1] sample project. It runs on Python 3 and depends on [flask][2] and the [LeanCloud Python SDK][3]. You can see a live demo [here][11].
## Before you begin
Flask Todo Demo is a [LeanCloud][4] application. Some preparation is needed before deploying it.
1. Create a new app in the [LeanCloud console][5] and set up a second-level domain for it.
2. Add an environment variable named `SECRET_KEY` in the console. (For how to generate a good secret key, see [this gist][6].)
3. Install the latest [LeanCloud CLI][7]. If you cannot access GitHub, use the [mirror hosted in China](http://releases.leanapp.cn/#/leancloud/lean-cli/releases).
## Deployment
First, clone the Flask Todo Demo code locally. Open the project directory in a terminal and run `lean login`, then `lean checkout`, following the prompts to link the local project with the app you just created on LeanCloud.
Use [virtualenv][8] to create an isolated Python environment for this app. Activate the virtual environment, then install the required dependencies with `pip`.
Deploy the code to LeanCloud with the `lean deploy` command. Once the deployment finishes, enter the domain you set earlier in a browser to open the site running online.
In short:
```bash
$ git clone https://github.com/leancloud/flask-todo-demo.git && cd flask-todo-demo
$ virtualenv venv --python=python3 && source venv/bin/activate
$ (venv) pip install -r requirements.txt
$ (venv) lean login
$ (venv) lean checkout
$ (venv) lean deploy
```
## How to debug
To debug locally, use the `lean up` command, then visit the local instance at `http://localhost:3000`. If you use a virtual environment, activate it first.
## Advanced tips
### Hook functions
In `cloud.py` there is a `before_todo_save` function, preceded by the line `@engine.before_save('Todo')`, which marks it as a hook function. When a new `Todo` is about to be saved to the database, LeanCloud automatically calls this function, and you can use it to validate the data.
In the current version of this function, when a todo's `content` exceeds 240 characters, we truncate it and append "..." at the end. This is obviously not the best approach; if you are interested, fork the code and try to handle it in a better way.
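The truncation just described is easy to unit test once it is pulled out of the hook into a plain function. Below is a sketch under assumptions: the 240-character limit and the appended "..." come from the description above, while the function name `truncate_content` and the hook wiring in the comment are hypothetical, so the real `cloud.py` may differ.

```python
MAX_CONTENT_LEN = 240

def truncate_content(content, limit=MAX_CONTENT_LEN):
    """Return content unchanged if within the limit, else cut it and append '...'."""
    if len(content) <= limit:
        return content
    return content[:limit] + '...'

# Inside cloud.py, the hook would apply this before the save goes through,
# roughly like the following (needs the leancloud SDK, so shown as a comment):
#
# @engine.before_save('Todo')
# def before_todo_save(todo):
#     todo.set('content', truncate_content(todo.get('content')))

print(truncate_content('short todo'))    # unchanged: short todo
print(len(truncate_content('x' * 500)))  # 243, i.e. 240 characters plus '...'
```

Keeping the logic in a pure function like this means the hook body stays a one-liner and the truncation rule can be tested without any LeanCloud setup.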
Besides preprocessing objects before they are saved, you can hook into many other events. To learn more, see the [hook functions documentation][13].
### Scheduled tasks
Another function in `cloud.py` is `empty_trash`, which selects todos whose `status` is `-1` and whose last update was more than 30 days ago, and deletes them.
This is a cloud function, and you can call it manually in the project via `leancloud.cloudfunc.run('empty_trash')`, but automation is obviously the better option.
To automate it, we can set up a scheduled task in the LeanCloud console so that the function runs automatically at midnight every day.
After deploying the code to LeanEngine, go to the LeanCloud console and create a timer under LeanEngine > Scheduled tasks. The timer's schedule is defined with a cron expression; here we want it to run once a day at midnight, so the cron expression is `0 0 0 * * ?`.
Scheduled tasks can automate many things for you. To learn more, see the [scheduled tasks documentation][14].
## Demos in other languages
See [LeanCloud open-source demos][12]
## Miscellaneous
* License: [MIT][9]
* Author: GUAN Xiaoyu ([guanxy@me.com][10])
[1]: https://leancloud.cn/docs/leanengine_overview.html
[2]: http://flask.pocoo.org
[3]: https://github.com/leancloud/python-sdk
[4]: https://leancloud.cn/
[5]: https://leancloud.cn/dashboard/applist.html#/apps
[6]: https://gist.github.com/nervouna/cd58fb09c22826eaaff996793de72d85
[7]: https://github.com/leancloud/lean-cli/releases/latest
[8]: https://github.com/pypa/virtualenv
[9]: https://github.com/leancloud/flask-todo-demo/blob/master/LICENSE
[10]: mailto:guanxy@me.com
[11]: https://flask-todo-demo.leanapp.cn
[12]: https://leancloud.cn/docs/demo.html#/web
[13]: https://leancloud.cn/docs/leanengine_cloudfunction_guide-python.html#Hook_函数
[14]: https://leancloud.cn/docs/leanengine_cloudfunction_guide-python.html#定时任务
---
layout: post
title: "The Future of Print in Open Stacks"
categories: journal
tags: [future of print]
image:
feature:
teaser:
credit:
creditlink: ""
---
Building on a talk given by Jim O'Donnell at the 2017 Charleston Conference, Lorrie McAllister and I expanded on some of the conceptual underpinnings of the "Future of Print" project at ASU Library.
Abstract:
> Arizona State University is embracing new ways of thinking about how open stacks can make books active objects of engagement for a new generation of students, rather than risk becoming mere backdrops for study spaces. By taking a deliberate design approach to answering the question of “which books, where?,” ASU Library seeks to position print collections as an engagement mechanism. This chapter presents the transformative potential of open stacks, along with planning for access, assessment, and inclusive engagement. The authors describe how ASU Library is using a major library renovation project as a catalyst to explore these ideas, and propose a pathway to developing shared solutions for more effective use of library collections.
Full chapter:
McAllister, Lorrie, & Shari Laster. "The Future of Print in Open Stacks." In *Charleston Voices: Perspectives from the 2017 Conference*, edited by Lars Meyer. Ann Arbor, MI: Michigan Publishing, University of Michigan Library, 2018. doi:10.3998/mpub.11281794
[Preprint forthcoming]
---
title: "Testing Thunk-y Redux Actions"
date: "2019-11-20"
description: Diving into redux-thunk for testing thunk-y Redux action creators
---
## Testing Thunk-y Redux Actions
<time datetime="2019-11-20">November 20, 2019</time>
In Redux-land, thunk-y action creators are those that return a function and not an action, i.e., a plain object. Using them in a Redux store can be achieved with the redux-thunk middleware. With the help of custom middleware to track actions (developed by [Dillon Krug](https://github.com/dillonkrug)), testing thunk-y actions is a breeze.
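The tracking middleware itself is not shown in this post, but the idea is small enough to sketch. The following is my own minimal assumption of the approach, not the exact middleware referenced above: it records every plain action that flows through the store so that a test can later assert on the sequence.

```javascript
// Hypothetical action-tracking middleware (assumed shape, for illustration):
// every plain action that reaches the middleware chain is recorded, so a
// test can later assert on the exact sequence of dispatched actions.
const createActionTracker = () => {
  const actions = [];
  const middleware = store => next => action => {
    actions.push(action);
    return next(action);
  };
  return { actions, middleware };
};

// Exercising the middleware by hand, with an identity function as `next`:
const tracker = createActionTracker();
const dispatch = tracker.middleware({})(action => action);
dispatch({ type: 'SET_LOADING' });
dispatch({ type: 'SET_CITY', city: 'Oslo' });
console.log(tracker.actions.map(a => a.type)); // → [ 'SET_LOADING', 'SET_CITY' ]
```

In a real store it would sit after thunk in `applyMiddleware(thunk, tracker.middleware)`, by which point redux-thunk has already unwrapped any function actions into plain objects.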
To demo the testing of thunk-y action creators, we'll create a simple React app that first fetches the latitude and longitude of a city and then uses those coordinates to fetch the current weather of that city. The asynchronous nature of data fetching is where the thunk-y actions come into play.
The app has a directory structure as follows:
<!-- prettier-ignore -->
```javascript
src/
--App.js
--index.js
--actions/
----fetching.actions.js
----geocode.actions.js
----status.actions.js
----weather.actions.js
--components/
----CityForm.js
--reducers/
----geocode.reducer.js
----status.reducer.js
----weather.reducer.js
--store/
----store.js
```
#### Reducers and Redux store
Three reducers are used in this app:
1. `geocode.reducer` for handling state related to the city's coordinates
2. `weather.reducer` for handling state related to the city's weather
3. `status.reducer` for handling data fetching states, e.g. loading, error, and success
All the reducers and their initial states are exported from their respective reducer files and combined into a single `rootReducer`. Then we can create the Redux store that incorporates the `redux-thunk` middleware.
<!-- prettier-ignore -->
```javascript
// store/store.js
import { combineReducers, createStore, applyMiddleware } from 'redux'
import thunk from 'redux-thunk'
import { geocodeReducer } from '../reducers/geocode.reducer';
import { statusReducer } from '../reducers/status.reducer';
import { weatherReducer } from '../reducers/weather.reducer';
const rootReducer = combineReducers({
status: statusReducer,
geocode: geocodeReducer,
weather: weatherReducer
})
export const store = createStore(rootReducer, applyMiddleware(thunk))
```
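The reducers themselves are not shown in the post; a minimal `statusReducer`, consistent with the `setLoading`, `setError`, and `setSuccess` actions imported in `fetching.actions.js`, might look like the sketch below (the action type strings and state shape are my assumptions):

```javascript
// Hypothetical shape for the fetch-status slice; the real reducer in the
// post's repo may differ.
const initialStatusState = { loading: false, error: null, success: false };

const statusReducer = (state = initialStatusState, action = {}) => {
  switch (action.type) {
    case 'SET_LOADING':
      return { loading: true, error: null, success: false };
    case 'SET_ERROR':
      return { loading: false, error: action.error, success: false };
    case 'SET_SUCCESS':
      return { loading: false, error: null, success: true };
    default:
      return state;
  }
};

// Reducers are pure functions, so they can be tested without a store:
console.log(statusReducer(undefined, { type: 'SET_LOADING' }));
// → { loading: true, error: null, success: false }
```

Because reducers take `(state, action)` and return new state, their tests need no store, middleware, or mocks at all.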
#### Thunk-y actions
Most pertinent to this article are the actions used to fetch data. For that we create a single action creator called `fetchWeather` inside of `actions/fetching.actions.js`.
<!-- prettier-ignore -->
```javascript
// actions/fetching.actions.js
import axios from 'axios';
import { setLoading, setError, setSuccess } from './status.actions'
import { setCity, setCoords } from './geocode.actions';
import { setSummary, setTemperature, setPrecip } from './weather.actions';
export const fetchWeather = city => async (dispatch, getState) => {
  dispatch(setLoading());
  try {
    const url1 = `https://api.mapbox.com/geocoding/v5/mapbox.places/${city}.json?access_token=${process.env.REACT_APP_GEOCODE_ACCESS_TOKEN}&limit=1`;
    const { data: { features } } = await axios.get(url1);
    const [lon, lat] = features[0].center;
    dispatch(setCity(city));
    dispatch(setCoords(lat, lon));
    const proxy = 'https://cors-anywhere.herokuapp.com/';
    const url2 = `https://api.darksky.net/forecast/${process.env.REACT_APP_DARK_SKY_API_KEY}/${lat},${lon}`;
    const { data: { currently } } = await axios.get(proxy + url2);
    const { summary, temperature, precipProbability: precip } = currently;
    dispatch(setSummary(summary));
    dispatch(setTemperature(temperature));
    dispatch(setPrecip(precip));
    dispatch(setSuccess());
  } catch (error) {
    dispatch(setError(error));
  }
}
```
The other action creators that `fetchWeather` calls are all synchronous and simply return an action. For instance, `setCity` looks like:
<!-- prettier-ignore -->
```javascript
// actions/geocode.actions.js
export const setCity = (city) => ({
  type: "SET_CITY",
  city
})
```
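The status action creators referenced by `fetchWeather` aren't shown either. Hypothetically (the exact payload fields are assumptions), they follow the same flat shape:

```javascript
// actions/status.actions.js -- hypothetical sketch, not the app's actual code
const setLoading = () => ({ type: "SET_LOADING" });
const setSuccess = () => ({ type: "SET_SUCCESS" });
const setError = (error) => ({ type: "SET_ERROR", error });
```

Because each creator just returns a plain object, they're trivial to unit test on their own; the interesting part is verifying the order in which `fetchWeather` dispatches them.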
Returning to `fetchWeather`, we see that this thunk-y action takes on the signature of `fn = params => (dispatch, getState) => { ... }`, where the `dispatch` and `getState` parameters are made accessible to the inner function by the redux-thunk middleware. When this action creator is called, it is passed a city name as an argument and then uses the `dispatch` method to indicate that the loading process has begun and if successful, translates the city into lat/lon coordinates and then fetches weather at those coordinates, or catches an error and dispatches it back to the `statusReducer`. `getState` is not used in this creator and could be removed from the function, but I've included it to clarify that access to the store's state via `getState()` is available in thunk-y actions.
#### Testing thunk-y actions
At this point we're ready to test `fetchWeather`. Let's start by setting up our Jest testing environment. Jest comes out of the box in a React app created with `create-react-app`, but if you aren't bootstrapping your app using CRA, simply install the library as a dev dependency and set up [configuration](https://jestjs.io/docs/en/configuration) either in `package.json` or a separate `jest.config.js` file.
Since `axios` is used as a helper library for HTTP requests, we need to mock its methods (don't make real API calls in your tests!).
<!-- prettier-ignore -->
```javascript
// __tests__/fetching.actions.test.js
jest.mock("axios", () => ({
  get: jest.fn(() => {
    throw new Error("axios.get not mocked")
  }),
}))
import axios from "axios"
```
Errors are thrown on mocked functions to make sure that their implementation inside of tests is also mocked. Not necessary, but a nice sanity check to ensure that you are fully in control of any mocked method.
Next, we need to re-create the redux store in this test file, but with an extra middleware applied. The extra middleware is a function called `trackActions` that takes in a context (ctx) object and assigns it an `actions` property that will ultimately be used to collect actions passed through the store. That collection happens through the redux middleware signature of `store => next => action => { ... }`. In our case, we want the inner function to simply push the actions onto `ctx.actions` before moving down the middleware chain.
<!-- prettier-ignore -->
```javascript
// __tests__/fetching.actions.test.js
function trackActions (ctx) {
  ctx.actions = [];
  return store => next => action => {
    ctx.actions.push(action);
    return next(action);
  };
}
```
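To see the `store => next => action` chain work in isolation, here's a hand-wired demo with no Redux at all. The function is repeated so the snippet is self-contained, and `fakeStore` and the inline `next` are stand-ins for what `applyMiddleware` would normally supply:

```javascript
// Same middleware as above, minus Redux
function trackActions(ctx) {
  ctx.actions = [];
  return (store) => (next) => (action) => {
    ctx.actions.push(action);
    return next(action);
  };
}

const ctx = {};
const fakeStore = {};             // applyMiddleware would pass a real store API here
const next = (action) => action;  // the innermost "next" just returns the action
const dispatch = trackActions(ctx)(fakeStore)(next);

dispatch({ type: "SET_LOADING" });
dispatch({ type: "SET_CITY", city: "Baltimore" });
// ctx.actions now holds both actions in dispatch order
```

Because the middleware pushes before calling `next`, the tracked order always matches dispatch order, which is exactly what the tests below rely on.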
`trackActions` gets used as the final middleware when creating the redux store. After the redux store is created, we assign it some extra helper methods to extract the tracked actions off the `ctx` object.
<!-- prettier-ignore -->
```javascript
// __tests__/fetching.actions.test.js
function createTrackedStore (rootReducer, middleware = [thunk]) {
  const ctx = {};
  const store = createStore(rootReducer, applyMiddleware(...middleware, trackActions(ctx)));
  return Object.assign(store, {
    getActions: () => ctx.actions,
    getFirstAction: () => ctx.actions[0],
    getNthAction: (n) => {
      if (n > ctx.actions.length) throw new Error('out of bounds');
      return ctx.actions[n - 1];
    },
    getLastAction: () => ctx.actions[ctx.actions.length - 1],
    clearActions: () => { ctx.actions = []; },
  });
}
```
With these functions in place, we're ready to start testing the thunk-y action creator. In the setup phase of the test, we create the tracked store and dispatch `fetchWeather` with some sample data. The sample data needs to have the same structure as what the API returns, which looks like:
<!-- prettier-ignore -->
```javascript
// __tests__/fetching.actions.test.js
const testData = {
  city: 'Baltimore',
  geocode: {
    data: {
      features: [
        {
          center: [100, 100]
        }
      ]
    }
  },
  weather: {
    data: {
      currently: {
        summary: 'testSummary',
        temperature: 100,
        precipProbability: .5
      }
    }
  },
}
```
We also need to define the implementation of the mocked axios calls. The first call should return the geocode information and the second should return the weather data. Since axios is promise-based, we need to mock the resolved values of those calls.
<!-- prettier-ignore -->
```javascript
// __tests__/fetching.actions.test.js
describe('fetchWeather() action creator', () => {
  describe('Success fetching geocode and weather', () => {
    let store;
    beforeAll(() => {
      jest.spyOn(axios, 'get')
        .mockResolvedValueOnce(testData.geocode)
        .mockResolvedValueOnce(testData.weather)
      store = createTrackedStore(rootReducer);
      store.dispatch(fetchingActions.fetchWeather(testData.city));
    })
```
Looking back at `fetchWeather`, during a successful fetch, we should see the following action creators get dispatched:
1. setLoading
2. setCity
3. setCoords
4. setSummary
5. setTemperature
6. setPrecip
7. setSuccess
The `setLoading` action creator returns an object with a type `SET_LOADING`. Therefore, in our tracked store, we should see the first action match that type.
<!-- prettier-ignore -->
```javascript{13-17}
// __tests__/fetching.actions.test.js
describe('fetchWeather() action creator', () => {
  describe('Success fetching geocode and weather', () => {
    let store;
    beforeAll(() => {
      jest.spyOn(axios, 'get')
        .mockResolvedValueOnce(testData.geocode)
        .mockResolvedValueOnce(testData.weather)
      store = createTrackedStore(rootReducer);
      store.dispatch(fetchingActions.fetchWeather(testData.city));
    })
    it('dispatches SET_LOADING', () => {
      expect(store.getFirstAction()).toEqual({
        type: 'SET_LOADING'
      })
    })
```
The axios `get` method should then be called twice (once for the geocode and once for the weather).
<!-- prettier-ignore -->
```javascript{18-22}
// __tests__/fetching.actions.test.js
describe('fetchWeather() action creator', () => {
  describe('Success fetching geocode and weather', () => {
    let store;
    beforeAll(() => {
      jest.spyOn(axios, 'get')
        .mockResolvedValueOnce(testData.geocode)
        .mockResolvedValueOnce(testData.weather)
      store = createTrackedStore(rootReducer);
      store.dispatch(fetchingActions.fetchWeather(testData.city));
    })
    it('dispatches SET_LOADING', () => {
      expect(store.getFirstAction()).toEqual({
        type: 'SET_LOADING'
      })
    })
    it('calls axios.get() twice', async () => {
      expect(axios.get).toHaveBeenCalledTimes(2);
    })
```
More stringent tests could also be implemented here to assert that the axios calls were invoked with the correct arguments.
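Jest's own matchers (like `toHaveBeenCalledWith`) cover argument assertions directly. Just to illustrate the idea behind them, here's a hand-rolled spy that records call arguments, with sample URLs standing in for the real requests:

```javascript
// Minimal argument-capturing spy -- an illustration of what jest.fn tracks
// internally, not code from the article's test suite
function makeSpy(impl) {
  const spy = (...args) => {
    spy.calls.push(args);   // record every call's arguments
    return impl(...args);
  };
  spy.calls = [];
  return spy;
}

const get = makeSpy((url) => Promise.resolve({ data: {} }));
get("https://api.mapbox.com/geocoding/v5/mapbox.places/Baltimore.json");
get("https://api.darksky.net/forecast/KEY/100,100");
// get.calls[0][0] holds the first URL, get.calls[1][0] the second
```

Asserting on `get.calls` is the manual equivalent of `expect(axios.get).toHaveBeenCalledWith(...)`.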
The remaining action creators then get called, placing all of the geocode and weather data into the store. For instance, the second action the store receives comes from the `setCity` action creator, which is verified by:
<!-- prettier-ignore -->
```javascript{21-28}
// __tests__/fetching.actions.test.js
describe('fetchWeather() action creator', () => {
  describe('Success fetching geocode and weather', () => {
    let store;
    beforeAll(() => {
      jest.spyOn(axios, 'get')
        .mockResolvedValueOnce(testData.geocode)
        .mockResolvedValueOnce(testData.weather)
      store = createTrackedStore(rootReducer);
      store.dispatch(fetchingActions.fetchWeather(testData.city));
    })
    it('dispatches SET_LOADING', () => {
      expect(store.getFirstAction()).toEqual({
        type: 'SET_LOADING'
      })
    })
    it('calls axios.get() twice', async () => {
      expect(axios.get).toHaveBeenCalledTimes(2);
    })
    it('dispatches SET_CITY with correct city', () => {
      expect(store.getNthAction(2)).toEqual({
        type: 'SET_CITY',
        city: testData.city
      })
    })
```
Similar tests can be written for the remaining actions in this scenario and also for the two error scenarios (one during fetching geocode data and the other when fetching weather data). In any case, make sure that in the cleanup phase of the tests any mocked implementations are restored and, for good measure, clear any tracked actions in the store.
```javascript{28-31}
// __tests__/fetching.actions.test.js
describe('fetchWeather() action creator', () => {
  describe('Success fetching geocode and weather', () => {
    let store;
    beforeAll(() => {
      jest.spyOn(axios, 'get')
        .mockResolvedValueOnce(testData.geocode)
        .mockResolvedValueOnce(testData.weather)
      store = createTrackedStore(rootReducer);
      store.dispatch(fetchingActions.fetchWeather(testData.city));
    })
    it('dispatches SET_LOADING', () => {
      expect(store.getFirstAction()).toEqual({
        type: 'SET_LOADING'
      })
    })
    it('calls axios.get() twice', async () => {
      expect(axios.get).toHaveBeenCalledTimes(2);
    })
    it('dispatches SET_CITY with correct city', () => {
      expect(store.getNthAction(2)).toEqual({
        type: 'SET_CITY',
        city: testData.city
      })
    })
    // ... additional tests for remaining actions
    afterAll(() => {
      axios.get.mockRestore();
      store.clearActions();
    })
  })
})
```
If all goes well, you'll hopefully see a nice Jest summary of green check marks!

<figcaption>Tracking and testing actions ✅✅✅</figcaption>
Voila! Go forth and test thunk-y actions without fear!
| 36.582633 | 776 | 0.721746 | eng_Latn | 0.954332 |
f96dee1d0ae0444fa02a7ebb97f1f82b694400e9 | 741 | md | Markdown | app/backend/templates/account_deletion_report.md | lpmi-13/couchers | 4b69ed3192daede93d645b6c26f9acb3db03a2ea | [
"MIT"
] | null | null | null | app/backend/templates/account_deletion_report.md | lpmi-13/couchers | 4b69ed3192daede93d645b6c26f9acb3db03a2ea | [
"MIT"
] | 38 | 2022-02-10T16:32:53.000Z | 2022-03-28T13:43:24.000Z | app/backend/templates/account_deletion_report.md | lpmi-13/couchers | 4b69ed3192daede93d645b6c26f9acb3db03a2ea | [
"MIT"
] | null | null | null | ---
subject: "Account deletion reason #{{ reason.id|couchers_escape }}"
---
{% from "macros.html" import button, link, support_email, email_link, newline %}
Someone deleted their account and wrote a reason.
* Reason{{ newline(html)|couchers_safe }}
{{ reason.reason|couchers_escape }}
* Deleted user{{ newline(html)|couchers_safe }}
Name: {{ reason.user.name|couchers_escape }}{{ newline(html)|couchers_safe }}
Email: {{ reason.user.email|couchers_escape }}{{ newline(html)|couchers_safe }}
Username: {{ reason.user.username|couchers_escape }}{{ newline(html)|couchers_safe }}
User ID: {{ reason.user.id|couchers_escape }}{{ newline(html)|couchers_safe }}
* Time{{ newline(html)|couchers_safe }}
{{ reason.created|couchers_escape }}
| 32.217391 | 85 | 0.726046 | eng_Latn | 0.415619 |
f96e7e9e1450d9197119f2c20fb4d0877473a9cf | 52 | md | Markdown | README.md | JKalmbach/questionaire-demo | a02a7e7c9a2a695283b27b745d8a5ed985d5eb78 | [
"Apache-2.0"
] | null | null | null | README.md | JKalmbach/questionaire-demo | a02a7e7c9a2a695283b27b745d8a5ed985d5eb78 | [
"Apache-2.0"
] | null | null | null | README.md | JKalmbach/questionaire-demo | a02a7e7c9a2a695283b27b745d8a5ed985d5eb78 | [
"Apache-2.0"
] | null | null | null | # questionaire-demo
a single question questionaire
| 17.333333 | 31 | 0.826923 | fra_Latn | 0.601481 |
f96f6222c28665d297e6a1d122a6ac9d7486f274 | 1,321 | markdown | Markdown | _posts/2013-02-13-comrades-let-your-voice-be-heard-about-scholarly-publishing.markdown | emhart/website | b7a3ca237e94796b8e2e6afec1165615e76b1f50 | [
"MIT"
] | null | null | null | _posts/2013-02-13-comrades-let-your-voice-be-heard-about-scholarly-publishing.markdown | emhart/website | b7a3ca237e94796b8e2e6afec1165615e76b1f50 | [
"MIT"
] | null | null | null | _posts/2013-02-13-comrades-let-your-voice-be-heard-about-scholarly-publishing.markdown | emhart/website | b7a3ca237e94796b8e2e6afec1165615e76b1f50 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Comrades let your voice be heard about scholarly publishing"
date: 2013-02-13 23:24
comments: true
tags:
---
When I was in college I had a girlfriend whose dad was an ex-lawyer turned self-proclaimed anarchist scholar. She left me for a guy in his late 30's who worked in a scarf store, and I was stuck with a stack of anarchist theory books I'd bought to try and impress her at the local used bookstore. <img style="float:right" src="http://www.iwise.com/authorIcons/9555/Peter%20Kropotkin_128x128.png" /> While my politics have mellowed a bit in most things, my opinion on scholarly publishing is about in line with what [Kropotkin](http://en.wikipedia.org/wiki/Peter_Kropotkin) would have made of the situation if he were around today. <!-- more --> I got a chance to air my radical views thanks to the wonderful [Jarrett Byrnes](http://www.jarrettbyrnes.info/) and his NCEAS colleagues. He is working on a survey where you can speak your mind and channel Kropotkin, or Karl Rove, or anyone in between. No matter what, the most important thing is that you take the survey! So here's [the link to the survey](http://bit.ly/eebpublishing) and you can read more about [their NCEAS group here.](http://openpub.nceas.ucsb.edu/2013/01/24/surveying-opinions-on-scholarly-publishing-in-eeb/)
| 120.090909 | 1,183 | 0.766843 | eng_Latn | 0.998873 |
f96fae8e9daaf15866306695fec63ed05d19723f | 6,645 | md | Markdown | docs/reporting-services/report-server-web-service-net-framework-soap-headers/identifying-execution-state.md | L3onard80/sql-docs.it-it | f73e3d20b5b2f15f839ff784096254478c045bbb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/reporting-services/report-server-web-service-net-framework-soap-headers/identifying-execution-state.md | L3onard80/sql-docs.it-it | f73e3d20b5b2f15f839ff784096254478c045bbb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/reporting-services/report-server-web-service-net-framework-soap-headers/identifying-execution-state.md | L3onard80/sql-docs.it-it | f73e3d20b5b2f15f839ff784096254478c045bbb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Identifying execution state | Microsoft Docs
description: Learn how to use Reporting Services to identify execution state so that you can interact with the report in various ways.
ms.date: 03/03/2017
ms.prod: reporting-services
ms.prod_service: reporting-services-native
ms.technology: report-server-web-service-net-framework-soap-headers
ms.topic: reference
helpviewer_keywords:
- session states [Reporting Services]
- lifetimes [Reporting Services]
- sessions [Reporting Services]
- SessionHeader SOAP header
ms.assetid: d8143a4b-08a1-4c38-9d00-8e50818ee380
author: maggiesMSFT
ms.author: maggies
ms.openlocfilehash: 0977cc384dac80f28d6b7c5b7a0149ba200f1794
ms.sourcegitcommit: ff82f3260ff79ed860a7a58f54ff7f0594851e6b
ms.translationtype: HT
ms.contentlocale: it-IT
ms.lasthandoff: 03/29/2020
ms.locfileid: "80215600"
---
# <a name="identifying-execution-state"></a>Identifying execution state
HTTP (Hypertext Transfer Protocol) is a connectionless, stateless protocol, meaning that it does not automatically indicate whether different requests come from the same client, or whether a single browser instance is still actively viewing a page or site. Sessions create a logical connection to manage state between server and client over HTTP. The user-specific information related to a particular session is known as session state.
Session management involves correlating an HTTP request with the other, earlier requests generated by the same session. Without session management, these requests appear unrelated to the ReportServer Web service because of the connectionless, stateless nature of HTTP.
[!INCLUDE[ssRSnoversion](../../includes/ssrsnoversion-md.md)] does not expose a holistic concept of session state like the one exposed by [!INCLUDE[vstecasp](../../includes/vstecasp-md.md)]. However, when you run reports, the report server maintains state between method calls in the form of an **execution**. An execution lets the user interact with the report in several ways, such as loading the report from the report server, setting credentials and parameters for the report, and rendering the report.
While communicating with a report server, clients use the execution to manage report viewing and user navigation to other pages of a report, as well as to show or hide sections of a report. There is a single execution for each report that the client application runs.
In general, an execution begins when a user goes to a browser or client application and selects a report to view. The execution is removed after a short time-out period following the last request (the default time-out interval is 20 minutes).
From the Web service point of view, the lifetime begins when the <xref:ReportExecution2005.ReportExecutionService.LoadReport%2A>, <xref:ReportExecution2005.ReportExecutionService.LoadReportDefinition%2A>, or <xref:ReportExecution2005.ReportExecutionService.Render%2A> method of the ReportServer Web service is called. Other methods can be used to modify the active execution (for example, by setting parameters and data sources). The execution is removed after a short time-out period following the last request (the default time-out interval is 20 minutes).
An application keeps track of multiple active executions between calls to the <xref:ReportExecution2005.ReportExecutionService.Render%2A> and <xref:ReportExecution2005.ReportExecutionService.RenderStream%2A> methods of the Web service by saving the <xref:ReportExecution2005.ExecutionHeader.ExecutionID%2A> object, which is returned in the SOAP header by the <xref:ReportExecution2005.ReportExecutionService.LoadReport%2A> and <xref:ReportExecution2005.ReportExecutionService.LoadReportDefinition%2A> methods.
The following diagram illustrates the processing and rendering paths for reports.

To support the functions described earlier, the current SOAP rendering method has been divided into multiple methods that cover the execution initialization, processing, and rendering phases.
To render a report programmatically, you need to do the following:
- Load the report or the report definition by using <xref:ReportExecution2005.ReportExecutionService.LoadReport%2A> or <xref:ReportExecution2005.ReportExecutionService.LoadReportDefinition%2A>.
- Check whether the report requires credentials or parameters by checking the values of the <xref:ReportExecution2005.ExecutionInfo.CredentialsRequired%2A> and <xref:ReportExecution2005.ExecutionInfo.ParametersRequired%2A> properties of the <xref:ReportExecution2005.ExecutionInfo> object returned by <xref:ReportExecution2005.ReportExecutionService.LoadReport%2A> or <xref:ReportExecution2005.ReportExecutionService.LoadReportDefinition%2A>
- If necessary, set the credentials and/or parameters by using the <xref:ReportExecution2005.ReportExecutionService.SetExecutionCredentials%2A> and <xref:ReportExecution2005.ReportExecutionService.SetExecutionParameters%2A> methods.
- Call the <xref:ReportExecution2005.ReportExecutionService.Render%2A> method to render the report.
During a report's session, the underlying report stored in the report server database can change. The report definition and user permissions can change, or the report can be deleted or moved. If the report is in an active session, it is not affected by changes made to the underlying report (that is, the report stored in the report server database).
You can also manage a report session by using URL access commands.
## <a name="see-also"></a>See also
<xref:ReportExecution2005.ReportExecutionService.Render%2A>
[Technical reference (SSRS)](../../reporting-services/technical-reference-ssrs.md)
[Using Reporting Services SOAP headers](../../reporting-services/report-server-web-service-net-framework-soap-headers/using-reporting-services-soap-headers.md)
f970010214eaa8d3a1aa8d9dfcf5c5ee21e3410c | 5,680 | md | Markdown | articles/site-recovery/site-recovery-manage-network-interfaces-on-premises-to-azure.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/site-recovery/site-recovery-manage-network-interfaces-on-premises-to-azure.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/site-recovery/site-recovery-manage-network-interfaces-on-premises-to-azure.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Manage network adapters for on-premises disaster recovery with Azure Site Recovery
description: Describes how to manage network interfaces for on-premises disaster recovery to Azure with Azure Site Recovery
author: mayurigupta13
manager: rochakm
ms.service: site-recovery
ms.topic: conceptual
ms.date: 4/9/2019
ms.author: mayg
ms.openlocfilehash: 2a4752b501e40f9e8a4f3bc82cb2533c11f9e526
ms.sourcegitcommit: 44c2a964fb8521f9961928f6f7457ae3ed362694
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 11/12/2019
ms.locfileid: "73954604"
---
# <a name="manage-vm-network-interfaces-for-on-premises-disaster-recovery-to-azure"></a>Manage VM network interfaces for on-premises disaster recovery to Azure
A VM in Azure must have at least one network interface attached to it. It can have as many network interfaces attached as the VM size supports.
By default, the first network interface attached to an Azure VM is defined as the primary network interface. All other network interfaces in the VM are secondary network interfaces. Also by default, all outbound traffic from the VM is sent using the IP address assigned to the primary IP configuration of the primary network interface.
In an on-premises environment, VMs or servers can have multiple network interfaces for the different networks in the environment. The different networks are typically used for specific operations such as upgrades, maintenance, and internet access. When migrating or failing over to Azure from an on-premises environment, keep in mind that network interfaces on the same VM must be connected to the same virtual network.
By default, Azure Site Recovery creates as many network interfaces on the Azure VM as are attached to the on-premises server. You can avoid creating redundant network interfaces during migration or failover by editing the network interface settings in the settings for the replicated VM.
## <a name="select-the-target-network"></a>Select the target network
For VMware and physical machines, and for Hyper-V (without System Center Virtual Machine Manager), you can specify the target virtual network for individual VMs. For Hyper-V VMs managed by using Virtual Machine Manager, use [network mapping](site-recovery-network-mapping.md) to map VM networks on the source Virtual Machine Manager server and target Azure networks.
1. Under **Replicated items** in the Recovery Services vault, select any replicated item to access the settings for that replicated item.
2. Select the **Compute and Network** tab to access the network settings for the replicated item.
3. Under **Network properties**, select a virtual network from the list of available network interfaces.

Modifying the target network affects all network interfaces for that particular VM.
For Virtual Machine Manager clouds, modifying the network mapping affects all VMs and their network interfaces.
## <a name="select-the-target-interface-type"></a>Select the target interface type
In the **Network interfaces** section of the **Compute and Network** pane, you can view and edit the network interface settings. You can also specify the target network interface type.
- A **Primary** network interface is required for failover.
- All other selected network interfaces (if any) are **Secondary** network interfaces.
- Select **Do not use** to exclude a network interface from being created at failover.
By default, when you enable replication, Site Recovery selects all detected network interfaces on the on-premises server. It marks one as **Primary** and all others as **Secondary**. Any interfaces added later on the on-premises server are marked **Do not use** by default. When adding more network interfaces, make sure the correct target Azure VM size is selected to accommodate all required network interfaces.
## <a name="modify-network-interface-settings"></a>Modify network interface settings
You can modify the subnet and IP address for a replicated item's network interfaces. If no IP address is specified, Site Recovery assigns the next available IP address from the subnet to the network interface at failover.
1. Select any available network interface to open the network interface settings.
2. Select the desired subnet from the list of available subnets.
3. Enter the desired IP address (as needed).

4. Select **OK** to finish editing and return to the **Compute and Network** pane.
5. Repeat steps 1-4 for other network interfaces.
6. Select **Save** to save all changes.
## <a name="next-steps"></a>Next steps
[Learn more](../virtual-network/virtual-network-network-interface-vm.md) about network interfaces for Azure VMs.
f9720d0bdccf103c8a98dbb9aef42ff5dec1432c | 471 | md | Markdown | intro/introduction/whats-new-with-4.0.0/cachebox-2.0.0.md | mborn319/coldbox-docs | 99b249b0a9dd3855c79f84974f830d49ff18d32e | [
"Apache-2.0"
] | null | null | null | intro/introduction/whats-new-with-4.0.0/cachebox-2.0.0.md | mborn319/coldbox-docs | 99b249b0a9dd3855c79f84974f830d49ff18d32e | [
"Apache-2.0"
] | null | null | null | intro/introduction/whats-new-with-4.0.0/cachebox-2.0.0.md | mborn319/coldbox-docs | 99b249b0a9dd3855c79f84974f830d49ff18d32e | [
"Apache-2.0"
] | null | null | null | # CacheBox 2.0.0
## Introduction
CacheBox 2.0.0 is a major release, mostly aligned to support our ColdBox 4 release.
## Release Notes
You can find the release version information here: [https://ortussolutions.atlassian.net/browse/CACHEBOX/fixforversion/12303](https://ortussolutions.atlassian.net/browse/CACHEBOX/fixforversion/12303)
### Improvements
* \[[CACHEBOX-25](https://ortussolutions.atlassian.net/browse/CACHEBOX-25)\] - Deprecate Cachebox xml config support
| 31.4 | 199 | 0.787686 | eng_Latn | 0.462003 |
f972d9b377a388cb63bbf8054839b14ee5d062ad | 403 | md | Markdown | docs/Model/AplusContent/SearchContentDocumentsResponseAllOf.md | hkonnet/selling-partner-api | 7e87b835db6e1233362a36788e2387bdf78322cd | [
"BSD-3-Clause"
] | 119 | 2021-02-18T18:58:48.000Z | 2022-03-25T16:19:05.000Z | docs/Model/AplusContent/SearchContentDocumentsResponseAllOf.md | hkonnet/selling-partner-api | 7e87b835db6e1233362a36788e2387bdf78322cd | [
"BSD-3-Clause"
] | 213 | 2021-03-09T16:23:34.000Z | 2022-03-31T11:31:31.000Z | docs/Model/AplusContent/SearchContentDocumentsResponseAllOf.md | hkonnet/selling-partner-api | 7e87b835db6e1233362a36788e2387bdf78322cd | [
"BSD-3-Clause"
] | 68 | 2021-02-15T14:38:18.000Z | 2022-03-25T21:08:45.000Z | ## SearchContentDocumentsResponseAllOf
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**content_metadata_records** | [**\SellingPartnerApi\Model\AplusContent\ContentMetadataRecord[]**](ContentMetadataRecord.md) | A list of A+ Content metadata records. |
[[AplusContent Models]](../) [[API list]](../../Api) [[README]](../../../README.md)
| 40.3 | 167 | 0.602978 | yue_Hant | 0.813545 |
f9733889ec9430009a9d6c7b6dc1c53bb956b047 | 7,648 | md | Markdown | articles/azure-resource-manager/templates/view-resources.md | trrwilson/azure-docs | 6c53a4286fafd830ab24fc70ce420e832144f4fb | [
"CC-BY-4.0",
"MIT"
] | 2 | 2021-07-31T17:49:12.000Z | 2021-08-03T13:32:39.000Z | articles/azure-resource-manager/templates/view-resources.md | trrwilson/azure-docs | 6c53a4286fafd830ab24fc70ce420e832144f4fb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/azure-resource-manager/templates/view-resources.md | trrwilson/azure-docs | 6c53a4286fafd830ab24fc70ce420e832144f4fb | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-07-10T20:21:56.000Z | 2020-07-10T20:21:56.000Z | ---
title: Discover resource properties
description: Describes how to search for resource properties.
ms.topic: conceptual
ms.date: 06/10/2020
---
# Discover resource properties
Before creating Resource Manager templates, you need to understand what resource types are available, and what values to use in your template. This article shows some ways you can find the properties to include in your template.
## Find resource provider namespaces
Resources in an ARM template are defined with a resource provider namespace and resource type. For example, Microsoft.Storage/storageAccounts is the full name of the storage account resource type. Microsoft.Storage is the namespace. If you don't already know the namespaces for the resource types you want to use, see [Resource providers for Azure services](../management/azure-services-resource-providers.md).

## Export templates
The easiest way to get the template properties for your existing resources is to export the template. For more information, see [Single and multi-resource export to a template in the Azure portal](./export-template-portal.md).
## Use Resource Manager tools extension
Visual Studio Code and the Azure Resource Manager tools extension help you see exactly which properties are needed for each resource type. They provide intellisense and snippets that simplify how you define a resource in your template. For more information, see [Quickstart: Create Azure Resource Manager templates with Visual Studio Code](./quickstart-create-templates-use-visual-studio-code.md#add-an-azure-resource).
The following screenshot shows a storage account resource is added to a template:

The extension also provides a list of options for the configuration properties.

## Use template reference
The Azure Resource Manager template reference is the most comprehensive resource for template schema. You can find API versions, template format, and property information.
1. Browse to [Azure Resource Manager template reference](/azure/templates/).
1. From the left navigation, select **Storage**, and then select **All resources**. The All resources page summarizes the resource types and the versions.

If you know the resource type, you can go directly to this page with the following URL format: `https://docs.microsoft.com/azure/templates/{provider-namespace}/{resource-type}`.
1. Select the latest version. Using the latest API version is recommended.
The **Template format** section lists all the properties for storage account. **sku** is in the list:

Scroll down to see **Sku object** in the **Property values** section. The article shows the allowed values for SKU name:

At the end of the page, the **Quickstart templates** section lists some Azure Quickstart Templates that contain the resource type:

The template reference is linked from each of the Azure service documentation sites. For example, the [Key Vault documentation site](../../key-vault/general/overview.md):

## Use Resource Explorer
Resource Explorer is embedded in the Azure portal. Before using this method, you need a storage account. If you don't have one, select the following button to create one:
[](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.storage%2Fstorage-account-create%2Fazuredeploy.json)
1. Sign in to the [Azure portal](https://portal.azure.com).
1. In the search box, enter **resource explorer**, and then select **Resource Explorer**.

1. From left, expand **Subscriptions**, and then expand your Azure subscription. You can find the storage account under either **Providers** or **ResourceGroups**.

- **Providers**: expand **Providers** -> **Microsoft.Storage** -> **storageAccounts**, and then select your storage account.
    - **ResourceGroups**: select the resource group that contains the storage account, select **Resources**, and then select the storage account.
On the right, you see the SKU configuration for the existing storage account similar to:

## Use Resources.azure.com
Resources.azure.com is a public website that can be accessed by anyone with an Azure subscription. It is in preview; consider using [Resource Explorer](#use-resource-explorer) instead. This tool provides these capabilities:
- Discover the Azure Resource Management APIs.
- Get API documentation and schema information.
- Make API calls directly in your own subscriptions.
To demonstrate how to retrieve schema information by using this tool, you need a storage account. If you don't have one, select the following button to create one:
[](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.storage%2Fstorage-account-create%2Fazuredeploy.json)
1. Browse to [resources.azure.com](https://resources.azure.com/). It takes a few moments for the tool to populate the left pane.
1. Select **subscriptions**.

The node on the left matches the API call on the right. You can make the API call by selecting the **GET** button.
1. From left, expand **Subscriptions**, and then expand your Azure subscription. You can find the storage account under either **Providers** or **ResourceGroups**.
- **Providers**: expand **Providers** -> **Microsoft.Storage** -> **storageAccounts**, and then browse to the storage account.
    - **ResourceGroups**: select the resource group that contains the storage account, and then select **Resources**.

    On the right, you see the SKU configuration for the existing storage account similar to:

## Next steps
In this article, you learned how to find template schema information. To learn more about creating Resource Manager templates, see [Understand the structure and syntax of ARM templates](./syntax.md).
---
layout: wiki
title: FSM
---
Finite State Machine
For more info, see the original Akka FSM documentation: http://doc.akka.io/docs/akka/snapshot/scala/fsm.html
```csharp
public class MyFSM : FSM<int, object>
{
public MyFSM(ActorRef target)
{
Target = target;
StartWith(0, new object());
When(0, @event =>
{
if (@event.FsmEvent.Equals("tick")) return GoTo(1);
return null;
});
When(1, @event =>
{
if (@event.FsmEvent.Equals("tick")) return GoTo(0);
return null;
});
WhenUnhandled(@event =>
{
if (@event.FsmEvent.Equals("reply")) return Stay().Replying("reply");
return null;
});
Initialize();
}
public ActorRef Target { get; private set; }
protected override void PreRestart(Exception reason, object message)
{
Target.Tell("restarted");
}
}
```
# mvvm
Implements a simplified version of MVVM.
### [CVE-2015-7297](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-7297)



### Description
SQL injection vulnerability in Joomla! 3.2 before 3.4.4 allows remote attackers to execute arbitrary SQL commands via unspecified vectors, a different vulnerability than CVE-2015-7858.
### POC
#### Reference
- http://packetstormsecurity.com/files/134097/Joomla-3.44-SQL-Injection.html
- http://packetstormsecurity.com/files/134494/Joomla-Content-History-SQL-Injection-Remote-Code-Execution.html
- https://www.exploit-db.com/exploits/38797/
- https://www.trustwave.com/Resources/SpiderLabs-Blog/Joomla-SQL-Injection-Vulnerability-Exploit-Results-in-Full-Administrative-Access/
#### Github
- https://github.com/0ps/pocassistdb
- https://github.com/ARPSyndicate/kenzer-templates
- https://github.com/Elsfa7-110/kenzer-templates
- https://github.com/areaventuno/exploit-joomla
- https://github.com/jweny/pocassistdb
- https://github.com/stamparm/maltrail
# Xcode technique
# Display KML
Load and display KML files from various sources, including URLs, local files, and Portal items.

## How to use the sample
Use the UI to select a source. A KML file from that source will be loaded and displayed in the map.
## Relevant API
- KmlLayer
- KmlDataset
## Offline data
Read more about how to set up the sample's offline data [here](http://links.esri.com/ArcGISRuntimeQtSamples).
| Link | Local Location |
|------|----------------|
| [US State Capitals](https://www.arcgis.com/home/item.html?id=324e4742820e46cfbe5029ff2c32cb1f) | `<userhome>`/ArcGIS/Runtime/Data/kml/US_State_Capitals.kml |
## About the data
This sample displays three different KML files:
- From URL - this is a map of the significant weather outlook produced by NOAA/NWS. It uses KML network links to always show the latest data.
- From local file - this is a map of U.S. state capitals. It doesn't define an icon, so the default pushpin is used for the points.
- From portal item - this is a map of U.S. states.
## Tags
KML, KMZ, OGC, Keyhole
---
title: GraphQL Query Options Reference
---
## Intro
This page will walk you through a series of GraphQL queries, each designed to demonstrate a particular feature of GraphQL. You'll be querying the _real_ schema used on [graphql-reference example](https://github.com/gatsbyjs/gatsby/tree/master/examples/graphql-reference) so feel free to experiment and poke around the innards of the site! You can also open the [CodeSandbox version](https://codesandbox.io/s/github/gatsbyjs/gatsby/tree/master/examples/graphql-reference) ([Direct link to GraphiQL](<https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20site%20%7B%0A%20%20%20%20siteMetadata%20%7B%0A%20%20%20%20%20%20title%0A%20%20%20%20%20%20description%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20books%3A%20allMarkdownRemark(filter%3A%20%7Bfrontmatter%3A%20%7Btitle%3A%20%7Bne%3A%20%22%22%7D%7D%7D)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(fromNow%3A%20true)%0A%20%20%20%20%20%20%20%20%20%20author%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20id%0A%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20authors%3A%20allAuthorYaml%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20id%0A%20%20%20%20%20%20%20%20bio%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A>)) of it.
You'll be using GraphiQL, an interactive editor you can also use [while building your Gatsby site](/tutorial/part-five/#introducing-graphiql).
For more information, read about [why Gatsby uses GraphQL](/docs/why-gatsby-uses-graphql/).
## Basic query
Start with the basics, pulling up the site `title` from your `gatsby-config.js`'s `siteMetadata`. Here the query is on the left and the results are on the right.
<iframe
title="A basic query"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20site%20%7B%0A%20%20%20%20siteMetadata%20%7B%0A%20%20%20%20%20%20title%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
Try editing the query to include the `description` from `siteMetadata`. When typing in the query editor you can use `Ctrl + Space` to see autocomplete options and `Ctrl + Enter` to run the current query.
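If you would rather copy the finished query, here it is with `description` included:

```graphql
{
  site {
    siteMetadata {
      title
      description
    }
  }
}
```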
## A longer query
Gatsby structures its content as collections of `nodes`, which are connected to each other with `edges`. In this query you ask for the total count of plugins in this Gatsby site, along with specific information about each one.
<iframe
title="A longer query"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allSitePlugin%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20name%0A%20%20%20%20%20%20%20%20version%0A%20%20%20%20%20%20%20%20packageJson%20%7B%0A%20%20%20%20%20%20%20%20%20%20description%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
Try using the editor's autocomplete (`Ctrl + Space`) to get extended details from the `packageJson` nodes.
If you're using Gatsby version `2.2.0` or later, you can remove `edges` and `node` from your query and replace it with `nodes`. The query will still work and the returned object will reflect the `nodes` structure.
<iframe
title="A longer query with nodes"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allSitePlugin%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20nodes%20%7B%0A%20%20%20%20%20%20%20%20name%0A%20%20%20%20%20%20%20%20version%0A%20%20%20%20%20%20%20%20packageJson%20%7B%0A%20%20%20%20%20%20%20%20%20%20description%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D&explorerIsOpen=false"
width="600"
height="400"
></iframe>
## Limit
There are several ways to reduce the number of results from a query. Here `totalCount` tells you there are 8 results, but `limit` is used to return only the first three.
<iframe
title="Using limit"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(limit%3A%203)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
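The query above in plain text, ready to paste into GraphiQL:

```graphql
{
  allMarkdownRemark(limit: 3) {
    totalCount
    edges {
      node {
        frontmatter {
          title
        }
      }
    }
  }
}
```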
## Skip
Skip over a number of results. In this query `skip` is used to omit the first 3 results.
<iframe
title="Using skip"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(skip%3A%203)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
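In plain text, the `skip` query reads:

```graphql
{
  allMarkdownRemark(skip: 3) {
    totalCount
    edges {
      node {
        frontmatter {
          title
        }
      }
    }
  }
}
```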
## Filter
In this query `filter` and the `ne` (not equals) operator is used to show only results that have a title. You can find a good video tutorial on this [here](https://www.youtube.com/watch?v=Lg1bom99uGM).
<iframe
title="Using a filter"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7Btitle%3A%20%7Bne%3A%20%22%22%7D%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
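The filter query from the editor above, written out:

```graphql
{
  allMarkdownRemark(filter: { frontmatter: { title: { ne: "" } } }) {
    totalCount
    edges {
      node {
        frontmatter {
          title
        }
      }
    }
  }
}
```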
Gatsby relies on [Sift](https://www.npmjs.com/package/sift) to enable MongoDB-like query syntax for object filtering. This allows Gatsby to support operators like `eq`, `ne`, `in`, `regex` and querying nested fields through the `__` connector.
It is also possible to filter on multiple fields: separate the individual filters with a comma, and they are combined as a logical AND:
```js
filter: { contentType: { in: ["post", "page"] }, draft: { eq: false } }
```
In this query the fields `categories` and `title` are filtered to find the book that has `Fantastic` in its title and belongs to the `magical creatures` category.
<iframe
title="Using multiple filters"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20categories%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20in%3A%20%5B%22magical%20creatures%22%5D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20title%3A%20%7Bregex%3A%20%22%2FFantastic%2F%22%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
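Written out, the combined filter looks like this:

```graphql
{
  allMarkdownRemark(
    filter: {
      frontmatter: {
        categories: { in: ["magical creatures"] }
        title: { regex: "/Fantastic/" }
      }
    }
  ) {
    totalCount
    edges {
      node {
        frontmatter {
          title
        }
      }
    }
  }
}
```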
You can also combine the mentioned operators. This query filters on `/History/` for the `regex` operator. The result is `Hogwarts: A History` and `History of Magic`. You can filter out the latter with the `ne` operator.
<iframe
title="Using multiple operators in filters"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20regex%3A%20%22%2FHistory%2F%22%0A%20%20%20%20%20%20%20%20%20%20ne%3A%20%22History%20of%20Magic%22%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
### Complete list of possible operators
_In the playground below the list, there is an example query with a description of what the query does for each operator._
- `eq`: short for **equal**, must match the given data exactly
- `ne`: short for **not equal**, must be different from the given data
- `regex`: short for **regular expression**, must match the given pattern. Note that backslashes need to be escaped _twice_, so `/\w+/` needs to be written as `"/\\\\w+/"`.
- `glob`: short for **global**, allows to use wildcard `*` which acts as a placeholder for any non-empty string
- `in`: short for **in array**, must be an element of the array
- `nin`: short for **not in array**, must NOT be an element of the array
- `gt`: short for **greater than**, must be greater than given value
- `gte`: short for **greater than or equal**, must be greater than or equal to given value
- `lt`: short for **less than**, must be less than given value
- `lte`: short for **less than or equal**, must be less than or equal to given value
- `elemMatch`: short for **element match**, this indicates that the field you are filtering will return an array of elements, on which you can apply a filter using the previous operators
<iframe
title="Complete list of possible operators"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20%23%20eq%3A%20I%20want%20all%20the%20titles%20that%20match%20%22Fantastic%20Beasts%20and%20Where%20to%20Find%20Them%22%0A%20%20example_eq%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20eq%3A%20%22Fantastic%20Beasts%20and%20Where%20to%20Find%20Them%22%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20neq%3A%20I%20want%20all%20the%20titles%20which%20are%20NOT%20equal%20to%20the%20empty%20string%0A%20%20example_ne%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20ne%3A%22%22%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%20%20%20%20%20%20%20%20%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20regex%3A%20I%20want%20all%20the%20titles%20that%20do%20not%20start%20with%20%27T%27%20--%20this%20is%20what%20%2F%5E%5B%5ET%5D%2F%20means.%0A%20%20%23%20To%20learn%20more%20about%20regular%20expressions%3A%20https%3A%2F%2Fregexr.com%2F%0A%20%20example_regex%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20regex%3A%20%22%2F%5E%5B%5ET%5D%2F%22%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A
%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20glob%3A%20I%20want%20all%20the%20titles%20that%20contain%20the%20word%20%27History%27.%0A%20%20%23%20The%20wildcard%20*%20stands%20for%20any%20non-empty%20string.%0A%20%20example_glob%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20glob%3A%20%22*History*%22%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20in%3A%20I%20want%20all%20the%20titles%20and%20dates%20from%20%60frontmatter%60%0A%20%20%23%20where%20the%20title%20is%20either%20%0A%20%20%23%20-%20%22Children%27s%20Anthology%20of%20Monsters%22%2C%20or%0A%20%20%23%20-%20%22Hogwarts%3A%20A%20History%22.%0A%20%20example_in%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20in%3A%20%5B%0A%20%20%20%20%20%20%20%20%20%20%20%20%22Children%27s%20Anthology%20of%20Monsters%22%2C%0A%20%20%20%20%20%20%20%20%20%20%20%20%22Hogwarts%3A%20A%20History%22%0A%20%20%20%20%20%20%20%20%20%20%5D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%09date%0A%20%20%20%20%20%09%09%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20nin%3A%20I%20want%20all%20the%20titles%20and%20dates%20from%20%60fr
ontmatter%60%0A%20%20%23%20where%20the%20title%20is%20neither%20%0A%20%20%23%20-%20%22Children%27s%20Anthology%20of%20Monsters%22%2C%20nor%0A%20%20%23%20-%20%22Hogwarts%3A%20A%20History%22.%0A%20%20example_nin%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20frontmatter%3A%20%7B%0A%20%20%20%20%20%20%20%20title%3A%20%7B%0A%20%20%20%20%20%20%20%20%20%20nin%3A%5B%0A%20%20%20%20%20%20%20%20%20%20%20%20%22Children%27s%20Anthology%20of%20Monsters%22%2C%0A%20%20%20%20%20%20%20%20%20%20%20%20%22Hogwarts%3A%20A%20History%22%0A%20%20%20%20%20%20%20%20%20%20%5D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%09date%0A%20%20%20%20%20%09%09%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20lte%3A%20I%20want%20all%20the%20titles%20for%20which%20%60timeToRead%60%20is%20less%20than%20or%20equal%20to%204%20minutes.%0A%20%20example_lte%3AallMarkdownRemark(%0A%20%20%20%20filter%3A%20%7B%0A%20%20%20%20%20%20timeToRead%3A%20%7B%0A%20%20%20%20%20%20%20%20lte%3A%204%0A%20%20%20%20%20%20%7D%0A%20%20%09%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20%0A%20%20%23%20elemMatch%3A%20I%20want%20to%20know%20all%20the%20plugins%20that%20contain%20%22chokidar%22%20in%20their%20dependencies.%0A%20%20%23%20Note%3A%20the%20%60allSitePlugin%60%20query%20lists%20all%20the%20plugins%20used%20in%20our%20Gatsby%20site.%20%0A%20%20example_elemMatch%3A%20allSitePlugin(%0A%20%20%20%20filter%3A%7B%0A%20%20%20%20%20%20packageJson%3A%7B%0A%20%20%20%20%20%20%20%20dependencies%3A%7B%0A%20%20%20%20%20%20%20%20%20%20elemMatch%3A%7B%0A%20%20%20%20%20%20%20%20%20%20%20
%20name%3A%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20eq%3A%22chokidar%22%0A%20%20%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%09%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%7B%0A%20%20%20%20%20%20%20%20name%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
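As a concrete example from the playground above, this `elemMatch` query finds every plugin in this site whose `dependencies` array contains a package named `chokidar`:

```graphql
{
  allSitePlugin(
    filter: {
      packageJson: {
        dependencies: { elemMatch: { name: { eq: "chokidar" } } }
      }
    }
  ) {
    edges {
      node {
        name
      }
    }
  }
}
```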
To understand more about how these filters work, look at the corresponding [tests](https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby/src/schema/__tests__/run-query.js) in the codebase.
## Sort
The ordering of your results can be specified with `sort`. Here the results are sorted in ascending order of `frontmatter`'s `date` field.
<iframe
title="Using sort"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20sort%3A%20%7B%0A%20%20%20%20%20%20fields%3A%20%5Bfrontmatter___date%5D%0A%20%20%20%20%20%20order%3A%20ASC%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
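The sort query from the editor above:

```graphql
{
  allMarkdownRemark(sort: { fields: [frontmatter___date], order: ASC }) {
    totalCount
    edges {
      node {
        frontmatter {
          title
          date
        }
      }
    }
  }
}
```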
You can also sort on multiple fields but the `sort` keyword can only be used once. The second sort field gets evaluated when the first field (here: `date`) is identical. The results of the following query are sorted in ascending order of `date` and `title` field.
<iframe
title="Sorting on multiple fields"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20sort%3A%20%7B%0A%20%20%20%20%20%20fields%3A%20%5Bfrontmatter___date%2C%20frontmatter___title%5D%0A%20%20%20%20%20%20order%3A%20ASC%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
`Children's Anthology of Monsters` and `Break with Banshee` both have the same date (`1992-01-02`) but in the first query (only one sort field) the latter comes after the first. The additional sorting on the `title` puts `Break with Banshee` in the right order.
By default, sort `fields` will be sorted in ascending order. Optionally, you can specify a sort `order` per field by providing an array of `ASC` (for ascending) or `DESC` (for descending) values. For example, to sort by `frontmatter.date` in ascending order, and additionally by `frontmatter.title` in descending order, you would use `sort: { fields: [frontmatter___date, frontmatter___title], order: [ASC, DESC] }`. Note that if you only provide a single sort `order` value, this will affect the first sort field only, the rest will be sorted in default ascending order.
<iframe
title="Setting sorting order"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20sort%3A%20%7B%0A%20%20%20%20%20%20fields%3A%20%5Bfrontmatter___date%2C%20frontmatter___title%5D%0A%20%20%20%20%20%20order%3A%20%5BASC%2C%20DESC%5D%0A%20%20%20%20%7D%0A%20%20)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
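Written out, the query with explicit per-field sort orders is:

```graphql
{
  allMarkdownRemark(
    sort: {
      fields: [frontmatter___date, frontmatter___title]
      order: [ASC, DESC]
    }
  ) {
    totalCount
    edges {
      node {
        frontmatter {
          title
          date
        }
      }
    }
  }
}
```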
## Format
### Dates
Dates can be formatted using the `formatString` function.
<iframe
title="Formatting dates"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7Bfrontmatter%3A%20%7Bdate%3A%20%7Bne%3A%20null%7D%7D%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(formatString%3A%20%22dddd%20DD%20MMMM%20YYYY%22)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
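The formatting query from the editor above:

```graphql
{
  allMarkdownRemark(filter: { frontmatter: { date: { ne: null } } }) {
    edges {
      node {
        frontmatter {
          title
          date(formatString: "dddd DD MMMM YYYY")
        }
      }
    }
  }
}
```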
Gatsby relies on [Moment.js](https://momentjs.com/) to format the dates. This allows you to use any tokens in your string. See [moment.js documentation](https://momentjs.com/docs/#/displaying/format/) for more tokens.
You can also pass in a `locale` to adapt the output to your language. The above query gives you the English output for the weekdays, this example outputs them in German.
<iframe
title="Using locale"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7Bfrontmatter%3A%20%7Bdate%3A%20%7Bne%3A%20null%7D%7D%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(%0A%20%20%20%20%20%20%20%20%20%20%20%20formatString%3A%20%22dddd%20DD%20MMMM%20YYYY%22%0A%20%20%20%20%20%20%20%20%20%20%20%20locale%3A%20%22de-DE%22%0A%20%20%20%20%20%20%20%20%20%20)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
Example: [`anotherDate(formatString: "dddd, MMMM Do YYYY, h:mm:ss a") # Sunday, August 5th 2018, 10:56:14 am`](<https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(filter%3A%20%7Bfrontmatter%3A%20%7Bdate%3A%20%7Bne%3A%20null%7D%7D%7D)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(formatString%3A%20%22dddd%2C%20MMMM%20Do%20YYYY%2C%20h%3Amm%3Ass%20a%22)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A>)
Dates also accept the `fromNow` and `difference` functions. The former returns a string generated with Moment.js' [`fromNow`](https://momentjs.com/docs/#/displaying/fromnow/) function, the latter returns the difference between the date and the current time (using Moment.js' [`difference`](https://momentjs.com/docs/#/displaying/difference/) function).
<iframe
title="Using date functions"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20one%3A%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7Bfrontmatter%3A%20%7Bdate%3A%20%7Bne%3A%20null%7D%7D%7D%0A%20%20%20%20limit%3A%202%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(fromNow%3A%20true)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20two%3A%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7Bfrontmatter%3A%20%7Bdate%3A%20%7Bne%3A%20null%7D%7D%7D%0A%20%20%20%20limit%3A%202%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(difference%3A%20%22days%22)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
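Decoded from the embed URL, the two aliased queries in the explorer above are:

```graphql
{
  one: allMarkdownRemark(
    filter: { frontmatter: { date: { ne: null } } }
    limit: 2
  ) {
    edges {
      node {
        frontmatter {
          title
          date(fromNow: true)
        }
      }
    }
  }
  two: allMarkdownRemark(
    filter: { frontmatter: { date: { ne: null } } }
    limit: 2
  ) {
    edges {
      node {
        frontmatter {
          title
          date(difference: "days")
        }
      }
    }
  }
}
```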
### Excerpt
Excerpts accept three options: `pruneLength`, `truncate`, and `format`. `format` can be `PLAIN` or `HTML`.
<iframe
title="Using excerpts"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20filter%3A%20%7Bfrontmatter%3A%20%7Bdate%3A%20%7Bne%3A%20null%7D%7D%7D%0A%20%20%20%20limit%3A%205%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20excerpt(%0A%20%20%20%20%20%20%20%20%20%20format%3A%20PLAIN%0A%20%20%20%20%20%20%20%20%20%20pruneLength%3A%20200%0A%20%20%20%20%20%20%20%20%20%20truncate%3A%20true%0A%20%20%20%20%20%20%20%20)%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
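The excerpt query shown in the explorer above, decoded from the embed URL:

```graphql
{
  allMarkdownRemark(
    filter: { frontmatter: { date: { ne: null } } }
    limit: 5
  ) {
    edges {
      node {
        frontmatter {
          title
        }
        excerpt(
          format: PLAIN
          pruneLength: 200
          truncate: true
        )
      }
    }
  }
}
```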
## Sort, filter, limit & format together
This query combines sorting, filtering, limiting and formatting together.
<iframe
title="Combining sorting, filtering, limiting and formatting"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(%0A%20%20%20%20limit%3A%203%0A%20%20%20%20filter%3A%20%7B%20frontmatter%3A%20%7B%20date%3A%20%7B%20ne%3A%20null%20%7D%20%7D%20%7D%0A%20%20%20%20sort%3A%20%7B%20fields%3A%20%5Bfrontmatter___date%5D%2C%20order%3A%20DESC%20%7D%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(formatString%3A%20%22dddd%20DD%20MMMM%20YYYY%22)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
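For reference, the combined query above (decoded from the embed URL):

```graphql
{
  allMarkdownRemark(
    limit: 3
    filter: { frontmatter: { date: { ne: null } } }
    sort: { fields: [frontmatter___date], order: DESC }
  ) {
    edges {
      node {
        frontmatter {
          title
          date(formatString: "dddd DD MMMM YYYY")
        }
      }
    }
  }
}
```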
## Query variables
In addition to adding query arguments directly to queries, GraphQL allows you to pass in "query variables". These can be simple scalar values as well as objects.
The query below is the same one as the previous example, but with the input arguments passed in as "query variables".
To add variables to page component queries, pass these in the `context` object [when creating pages](/docs/creating-and-modifying-pages/#creating-pages-in-gatsby-nodejs).
<iframe
title="Using query variables"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=query%20GetBlogPosts(%0A%20%20%24limit%3A%20Int%2C%20%24filter%3A%20MarkdownRemarkFilterInput%2C%20%24sort%3A%20MarkdownRemarkSortInput%0A)%20%7B%0A%09allMarkdownRemark(%0A%20%20%20%20limit%3A%20%24limit%2C%0A%20%20%20%20filter%3A%20%24filter%2C%0A%20%20%20%20sort%3A%20%24sort%0A%20%20)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(formatString%3A%20%22dddd%20DD%20MMMM%20YYYY%22)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D&operationName=GetBlogPosts&variables=%7B%0A%20%20%22limit%22%3A%205%2C%0A%20%20%22filter%22%3A%20%7B%0A%20%20%20%20%22frontmatter%22%3A%20%7B%0A%20%20%20%20%20%20%22date%22%3A%20%7B%0A%20%20%20%20%20%20%20%20%22ne%22%3A%20null%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%2C%0A%20%20%22sort%22%3A%20%7B%0A%20%20%20%20%22fields%22%3A%20%22frontmatter___title%22%2C%0A%20%20%20%20%22order%22%3A%20%22DESC%22%0A%20%20%7D%0A%7D&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
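The embedded example runs this operation and these variables (decoded from the embed URL):

```graphql
query GetBlogPosts(
  $limit: Int
  $filter: MarkdownRemarkFilterInput
  $sort: MarkdownRemarkSortInput
) {
  allMarkdownRemark(limit: $limit, filter: $filter, sort: $sort) {
    edges {
      node {
        frontmatter {
          title
          date(formatString: "dddd DD MMMM YYYY")
        }
      }
    }
  }
}
```

```json
{
  "limit": 5,
  "filter": { "frontmatter": { "date": { "ne": null } } },
  "sort": { "fields": "frontmatter___title", "order": "DESC" }
}
```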
## Group
You can also group values on the basis of a field e.g. the title, date or category and get the field value, the total number of occurrences and edges.
The query below gets you all categories (`fieldValue`) applied to a book and how many books (`totalCount`) a given category is applied to. In addition you are grabbing the `title` of books in a given category. You can see for example that there are 3 books in the `magical creatures` category.
<iframe
title="Grouping values"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(filter%3A%20%7Bfrontmatter%3A%20%7Btitle%3A%20%7Bne%3A%20%22%22%7D%7D%7D)%20%7B%0A%20%20%20%20group(field%3A%20frontmatter___categories)%20%7B%0A%20%20%20%20%20%20fieldValue%0A%20%20%20%20%20%20totalCount%0A%20%20%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%20%20nodes%20%7B%0A%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20categories%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
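The grouping query from the explorer above, decoded from the embed URL:

```graphql
{
  allMarkdownRemark(filter: { frontmatter: { title: { ne: "" } } }) {
    group(field: frontmatter___categories) {
      fieldValue
      totalCount
      edges {
        node {
          frontmatter {
            title
          }
        }
      }
    }
    nodes {
      frontmatter {
        title
        categories
      }
    }
  }
}
```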
## Fragments
Fragments are a way to save frequently used queries for re-use. To create a fragment, define it in a query and export it as a named export from any file Gatsby is aware of. A fragment is available for use in any other GraphQL query, regardless of its location in the project. Fragments are globally defined in a Gatsby project, so names have to be unique.
The query below defines a fragment to get the site title, and then uses the fragment to access this information.
<iframe
title="Using fragments"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=fragment%20fragmentName%20on%20Site%20%7B%0A%20%20siteMetadata%20%7B%0A%20%20%20%20title%0A%20%20%7D%0A%7D%0A%0A%7B%0A%20%20site%20%7B%0A%20%20%20%20...fragmentName%0A%20%20%7D%0A%7D&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
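The fragment example above, decoded from the embed URL:

```graphql
fragment fragmentName on Site {
  siteMetadata {
    title
  }
}

{
  site {
    ...fragmentName
  }
}
```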
## Aliasing
Want to run two queries on the same data source? You can do this by aliasing your queries. See the query below for an example:
<iframe
title="Using aliases"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20someEntries%3A%20allMarkdownRemark(skip%3A%203%2C%20limit%3A%203)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20someMoreEntries%3A%20allMarkdownRemark(limit%3A%203)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
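The aliased query in the explorer above, decoded from the embed URL:

```graphql
{
  someEntries: allMarkdownRemark(skip: 3, limit: 3) {
    edges {
      node {
        frontmatter {
          title
        }
      }
    }
  }
  someMoreEntries: allMarkdownRemark(limit: 3) {
    edges {
      node {
        frontmatter {
          title
        }
      }
    }
  }
}
```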
When you use your data, you will be able to reference it using the alias instead of the root query name. In this example, that would be `data.someEntries` or `data.someMoreEntries` instead of `data.allMarkdownRemark`.
The same works for fields inside a query. Take this example:
<iframe
title="Using aliases for fields"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20allMarkdownRemark(skip%3A%203%2C%20limit%3A%203)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20header%3A%20title%0A%20%20%20%20%20%20%20%20%20%20date%0A%20%20%20%20%20%20%20%20%20%20relativeDate%3A%20date(fromNow%3A%20true)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
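Decoded from the embed URL, the field-alias query above is:

```graphql
{
  allMarkdownRemark(skip: 3, limit: 3) {
    edges {
      node {
        frontmatter {
          header: title
          date
          relativeDate: date(fromNow: true)
        }
      }
    }
  }
}
```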
Instead of receiving `title` you'll get `header`. This is especially useful when you want to display the same field in different ways, as `date` shows here: you get both `date` and `relativeDate` from the same source.
## Conditionals
GraphQL allows you to skip a piece of a query depending on variables. This is handy when you need to render some part of a page conditionally.
Try changing variable `withDate` in the example query below:
<iframe
title="Using conditionals"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=query%20GetBlogPosts%20(%24withDate%3ABoolean%20%3D%20false)%20%7B%0A%20%20allMarkdownRemark(limit%3A%203%2C%20skip%3A%201)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date%20%40include(if%3A%24withDate)%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A&operationName=GetBlogPosts&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
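The conditional query above, decoded from the embed URL:

```graphql
query GetBlogPosts($withDate: Boolean = false) {
  allMarkdownRemark(limit: 3, skip: 1) {
    edges {
      node {
        frontmatter {
          title
          date @include(if: $withDate)
        }
      }
    }
  }
}
```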
Use directive `@include(if: $variable)` to conditionally include a part of a query or `@skip(if: $variable)` to exclude it.
You can use those directives on any level of the query and even on fragments. Take a look at an advanced example:
<iframe
title="Using conditionals: Advanced"
src="https://711808k40x.sse.codesandbox.io/___graphql?query=query%20GetBlogPosts%20(%24preview%3ABoolean%20%3D%20true)%20%7B%0A%20%20allMarkdownRemark(limit%3A%203%2C%20skip%3A%201)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20...BlogPost%20%40skip(if%3A%24preview)%0A%20%20%20%20%20%20%20%20...BlogPostPreview%20%40include(if%3A%24preview)%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20allFile(limit%3A2)%20%40skip(if%3A%24preview)%20%7B%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20relativePath%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A%0Afragment%20BlogPost%20on%20MarkdownRemark%20%7B%0A%20%20html%0A%20%20frontmatter%20%7B%0A%20%20%20%20title%0A%20%20%20%20date%0A%20%20%7D%0A%7D%0A%0Afragment%20BlogPostPreview%20on%20MarkdownRemark%20%7B%0A%20%20excerpt%0A%20%20frontmatter%20%7B%0A%20%20%20%20title%0A%20%20%7D%0A%7D&operationName=GetBlogPosts&explorerIsOpen=false"
width="600"
height="400"
loading="lazy"
></iframe>
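The advanced example above, decoded from the embed URL:

```graphql
query GetBlogPosts($preview: Boolean = true) {
  allMarkdownRemark(limit: 3, skip: 1) {
    edges {
      node {
        ...BlogPost @skip(if: $preview)
        ...BlogPostPreview @include(if: $preview)
      }
    }
  }
  allFile(limit: 2) @skip(if: $preview) {
    edges {
      node {
        relativePath
      }
    }
  }
}

fragment BlogPost on MarkdownRemark {
  html
  frontmatter {
    title
    date
  }
}

fragment BlogPostPreview on MarkdownRemark {
  excerpt
  frontmatter {
    title
  }
}
```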
## Where next?
Try [running your own queries](<https://711808k40x.sse.codesandbox.io/___graphql?query=%7B%0A%20%20site%20%7B%0A%20%20%20%20siteMetadata%20%7B%0A%20%20%20%20%20%20title%0A%20%20%20%20%20%20description%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20books%3A%20allMarkdownRemark(filter%3A%20%7Bfrontmatter%3A%20%7Btitle%3A%20%7Bne%3A%20%22%22%7D%7D%7D)%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20frontmatter%20%7B%0A%20%20%20%20%20%20%20%20%20%20title%0A%20%20%20%20%20%20%20%20%20%20date(fromNow%3A%20true)%0A%20%20%20%20%20%20%20%20%20%20author%20%7B%0A%20%20%20%20%20%20%20%20%20%20%20%20id%0A%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%20%20authors%3A%20allAuthorYaml%20%7B%0A%20%20%20%20totalCount%0A%20%20%20%20edges%20%7B%0A%20%20%20%20%20%20node%20%7B%0A%20%20%20%20%20%20%20%20id%0A%20%20%20%20%20%20%20%20bio%0A%20%20%20%20%20%20%7D%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A>), check out the rest of [the docs](/docs/) or run through [the tutorial](/tutorial/).
---
layout: post
title: Inventory App [0] - Features and Design
---
## Introduction
As of writing this post, I’m currently going past the basics of Javascript onto learning React. I’ve already built a todo app in vanilla JS from following tutorials, then went past that and built a note taking app. But at the end of the day, even though it was much more complex than a simple todo app, it was still just basic DOM manipulation. I want to challenge myself and build something that's just slightly more complicated. My dad recommended that I build an inventory management app, something really simple that a small business owner would use for keeping track of stock and sales. This seems like it could be fun, and certainly better than a todo app which is why I started this blog. I’ll be documenting my process of building this web app, and this is the first entry in more to come.
## Features
The core benefit of this web product is that it’ll help me make money by automating bookkeeping and basic accounting. So I started brainstorming a list of essential features needed for something like this. Assuming I had a small store or maybe even just a side hustle, I’ll need a place to keep track of my inventory, manage sales, and look at my revenue for a given time period. I’ll need to be able to check out an item when it's sold and add items to my inventory when I have more products to sell. On top of all of this, I’d love to see the financial data for each product I’m selling, and how much money I’ve been making.
Features:
- Inventory display
- Checkout functions
- Analytics and weekly / monthly reports
## Low Fidelity Wireframes
After I had this clearly defined feature set established, I got to work on actually designing the app. I started with basic, low fidelity wireframes to figure out the basic structure of all the different elements to be displayed. I originally thought of having something similar to an excel spreadsheet with the added functionality needed to dynamically edit data as items get sold, but spreadsheets are boring. They connote bureaucracy and number crunching; this web app is supposed to be the opposite of that. It's meant to remove the burden of bookkeeping so that users can focus on actually selling the product. So in place of rows and columns of data, I chose to use cards to display item data.
Each card will contain only the bare minimum amount of information needed for a user to get started with selling products. Users shouldn't be overburdened with a plethora of financial information for each product they sell; someone who wants a gargantuan heap of information will probably be using excel anyway. So sticking true to the values of a minimalist and beginner-friendly user experience, each card included only the item name, quantity, price, and daily sales of that item. However, with a dropdown menu, the cost, profit margin, description, and total sales of that item will be displayed.

To reduce the number of clicks to accomplish a task, I decided to have the action menu on the right hand side of the cards. Each button obviously manipulates the data of a given card in some way. Checkout reduces the quantity, delete removes the whole card, and edit simply edits any property. I realized very quickly that having a display of the daily revenue count can be extremely motivating. When I used to sell candy at school, feeling my wallet getting thicker by the hour always forced me to continue the grind. It was motivating to see tangible change and that's what I wanted the revenue display to do. This was also a good design choice because users don’t have to constantly switch back and forth from the inventory and analytics tabs to view their daily progress.
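As a rough sketch, the data behind one of these cards might look like this in JavaScript (the field names and values are my own guesses based on the description above, not the author's actual code):

```javascript
// Hypothetical shape of a single inventory card's data.
const item = {
  name: "Chocolate bar",
  quantity: 24,        // shown on the card
  price: 1.5,          // sale price, shown on the card
  cost: 0.9,           // shown only in the expanded dropdown view
  dailySales: 7,       // shown on the card
  totalSales: 132,     // shown only in the expanded dropdown view
  description: "Milk chocolate, 45 g",
};

// Profit margin, also shown in the expanded view.
const margin = (item.price - item.cost) / item.price;
console.log(margin.toFixed(2)); // → "0.40"
```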
## Design Guidelines
I chose to come back to the analytics tab later and start working on the design guidelines straight away. Though analytics are a necessary feature, as of writing this, I have no idea how to build something like that. I don't want my designs to be limited by my technical incompetence so I'll work on the analytics wireframes later down the line. As for the design guidelines, I wanted a hyper-modern feel to the web app through my choices of typography and colors. I originally wanted a dark themed color scheme not only because it is aesthetically appealing, but also to stray away from the 'spreadsheet' look.

## High Fidelity Wireframes
The bright green seemed to be an obvious choice since the web app was designed to help users in making money. Normally, a dark theme would have a neutral dark slate gray color as the background but I felt like that would cause an eye straining contrast with the bright vibrant green. So instead, I chose to have a dark blue-green as the background. For the accent color, I chose a simple beige that contrasted against the dark background. I applied the fonts and colors to the wireframes to see what a final product would look like. It looked great for the most part, but I ended up scrapping most of the icons since they didn't add much value to the UI as a whole and were too distracting to look at alongside the text.




Overall, I’m satisfied with the wireframes. I think I have a great start on a pretty interesting side project. Once I start building this, I’ll learn how to store and edit data outside the DOM. I don’t think this project requires a database so I’ll likely store all the information on a JSON file. I'll also learn data visualization through the analytics feature once the time is right. For the next post, I’ll focus on building the UI using React and planning the component hierarchy.
# MyDiagnosis
Android application designed to provide a detailed diagnosis based on observed symptoms, any lab test results, and risk factors. Moreover, there is a Natural Language Processing feature that lets users type in a message explaining their problem, and symptoms are extracted from the text. Finally, when the user clicks Diagnose, a diagnosis is generated with possible conditions listed along with their probabilities.
## APIs and Libraries used
-> Infermedica - Handles diagnosis processing with evidences given by user (https://infermedica.com/)<br/>
-> Retrofit - Used to communicate with the API to achieve application goal<br/>
-> MockK - Used to mock external dependencies for unit tests<br/>
## Architecture Components used
-> ViewModel - Used to decouple the view from the data that the view is presenting. Allows data shown by the view to survive view state changes<br/>
-> LiveData - Used to allow views to observe data changes in the ViewModels<br/>
-> Room - Used to store data locally; very easy to use since a lot of boilerplate code is removed and it is trivial to define entities and DAOs<br/>
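As an illustration of that last point, a Room entity and DAO in an app like this might look as follows (the names are hypothetical; the real classes are not shown in this README):

```kotlin
import androidx.lifecycle.LiveData
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.Insert
import androidx.room.OnConflictStrategy
import androidx.room.PrimaryKey
import androidx.room.Query

// Hypothetical entity for locally cached symptoms.
@Entity(tableName = "symptoms")
data class Symptom(
    @PrimaryKey val id: String,
    val commonName: String
)

// Hypothetical DAO; the LiveData return type lets the view
// observe changes through the ViewModel.
@Dao
interface SymptomDao {
    @Query("SELECT * FROM symptoms")
    fun getAll(): LiveData<List<Symptom>>

    @Insert(onConflict = OnConflictStrategy.REPLACE)
    suspend fun insertAll(symptoms: List<Symptom>)
}
```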
<div align="center">
<p float="center">
<img src="https://github.com/VishwaP98/MyDiagnosis/blob/master/screenshots/screenshot2.png" width="250" height="400" hspace="50">
<img src="https://github.com/VishwaP98/MyDiagnosis/blob/master/screenshots/screenshot3.png" width="250" height="400" hspace="50">
<img src="https://github.com/VishwaP98/MyDiagnosis/blob/master/screenshots/screenshot4.png" width="250" height="400" hspace="50">
<img src="https://github.com/VishwaP98/MyDiagnosis/blob/master/screenshots/screenshot5.png" width="250" height="400" hspace="50">
<img src="https://github.com/VishwaP98/MyDiagnosis/blob/master/screenshots/screenshot6.png" width="250" height="400" hspace="50">
<img src="https://github.com/VishwaP98/MyDiagnosis/blob/master/screenshots/screenshot7.png" width="250" height="400" hspace="50">
</p>
</div>
nagios-php
==========
[](https://travis-ci.org/pulse00/nagios-php)
[](https://scrutinizer-ci.com/g/pulse00/nagios-php/?branch=master)
Simple utility to help writing nagios plugins in PHP inspired
by the [Silex microframework](https://github.com/fabpot/Silex).
Usage:
======
Example - check_hello :
```php
<?php
require_once __DIR__ . '/nagios.phar';
use Dubture\Nagios\Plugin;
$plugin = new Plugin();
$plugin->run(function($name, $foo = 'bar') use ($plugin) {
return array(Plugin::OK, array('hello' => $name, $foo));
});
```
Running the above plugin using `check_hello pulse00` will result in a
nagios service state `OK` and the multiline output:
```
hello | pulse00
bar
```
The `Dubture\Nagios\Plugin::run()` method expects a `Closure` whose
method signature determines the nagios plugin arguments. A parameter
without a default value represents a mandatory argument, a parameter with
a default value represents an optional argument.
The plugin in the above example has one mandatory argument `name` and
an optional argument `foo` with the default value `bar`.
The `Closure` should return an array with the status code as the first
element and the output as the second argument, which will be formatted
after the following rules:
- If the second array element is a string, the output is a single-line message
- If the second array element is an array, the output is a multi-line message.
- Every element of the multiline message can either be a simple message (literal value),
or a message / performance output if the array element is a key/value pair.
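A sketch that mixes both forms, modeled on the `check_hello` example above (the argument name and values here are made up for illustration):

```php
<?php
require_once __DIR__ . '/nagios.phar';

use Dubture\Nagios\Plugin;

$plugin = new Plugin();

// One mandatory argument ($host). In the returned array, the
// key/value pair becomes a "message | performance" line, while the
// literal value becomes a simple message line.
$plugin->run(function($host) use ($plugin) {
    return array(Plugin::OK, array(
        'response time' => '42ms',   // message / performance output
        'checked host: ' . $host,    // simple message
    ));
});
```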
Installation
============
Download and include the [nagios.phar](https://github.com/pulse00/nagios-php/raw/master/nagios.phar) file. That's all.
---
title: Upload a folder or files to a document library
ms.author: toresing
author: tomresing
manager: scotv
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom: ''
ms.assetid: df1ffdf0-8e08-4a56-880e-8ef162ec8431
ms.openlocfilehash: 7f954d85a6b682a5e683e7216b71e694e8013e9a6145b9c7f119d3b2a5b78965
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: lt-LT
ms.lasthandoff: 08/05/2021
ms.locfileid: "53990643"
---
# <a name="upload-a-folder-or-files-to-a-sharepoint-document-library"></a>Upload a folder or files to a SharePoint document library
To upload a folder, you must use Microsoft Edge, Google Chrome, or Mozilla Firefox. You cannot upload folders using a classic library or using Internet Explorer 10 or 11.
1. Open the document library where you want to upload the folder or files.
2. On your computer, open File Explorer and locate the folder or files you want to upload.
3. Drag the folder or files into the document library. If the uploaded items do not appear in the library, refresh the page.
If you would rather not upload by dragging items between windows, you can also use the **Upload** button in the library to select the folder or files you want to upload.
# 2D Renderer Data Asset

The __2D Renderer Data__ Asset contains the settings that affect the way __2D Lights__ are applied to lit Sprites. You can set the way Lights emulate HDR lighting with the [HDR Emulation Scale](HDREmulationScale), or customize your own [Light Blend Styles](LightBlendStyles). Refer to their respective pages for more information about their properties and options.
## Use Depth/Stencil Buffer
This option is enabled by default. Clear this option to disable the Depth/[Stencil](https://docs.unity3d.com/Manual/SL-Stencil.html) Buffer. Doing so might improve your project’s performance, especially on mobile platforms. You should clear this option if you are not using any features that require the Depth/Stencil Buffer (such as [Sprite Mask](https://docs.unity3d.com/Manual/class-SpriteMask.html)).
## Post-processing Data
Unity automatically assigns a default Asset to this property that contains the resources (such as Textures and Shaders) that the post-processing effects require. If you want to use your own post-processing Shaders or lookup Textures, replace the default Asset.
## Default Material Type

Unity assigns a Material of the selected __Default Material Type__ to Sprites when they are created. The available options have the following properties and functions.
__Lit__: Unity assigns a Material with the Lit type (default Material: Sprite-Lit-Default). 2D Lights affect Materials of this type.
__Unlit__: Unity assigns a Material with the Unlit type (default Material: Sprite-Lit-Default). 2D Lights do not affect Materials of this type.
__Custom__: Unity assigns a Material with the Custom type. When you select this option, Unity shows the __Default Custom Material__ box. Assign the desired Material to this box.

# Installation Guide
This guide will help you install OhMySH.
If you have any questions, open a new issue in the [ohmysh/ohmysh repository](https://github.com/ohmysh/ohmysh/issues) to tell us.
## Preparation
Check the system requirements first on the [System page](/getting-started/system).
OhMySH works with the [SH shell (Bourne shell)](https://en.wikipedia.org/wiki/Bourne_shell), GNU Bash, and Zsh.
So make sure `sh (>=4.x.x)` and either `bash (>=4.x.x)` or `zsh (>=5.x.x)` are installed. There are some other dependencies: `curl` and `git`.
## Installation
Run the following commands in your shell. (This source may not be reachable on the Chinese internet.)
```bash
curl https://raw.githubusercontent.com/ohmysh/ohmysh/main/install.sh > OMSInstaller.sh
bash OMSInstaller.sh
```
Or, if you are in China, you may need to get the installer from the Chinese mirror with these commands:
```bash
curl https://gitee.com/ohmysh/ohmysh-mirror/raw/main/install.sh > OMSInstaller.sh
bash OMSInstaller.sh
```
## Checking the Installation
Run the following command from **any** shell.
```sh
bash
```
## Ready to use
Please read the [Ready to use](/getting-started/ready) page in the docs. **It will be very useful!**
## Advanced Options
### Getting Installer
Run the following command in your shell.
```sh
curl https://raw.githubusercontent.com/ohmysh/ohmysh/main/install.sh > OMSInstaller.sh
```
### Run with Advanced Options
Run the installer script with some environment variables set, such as `env1_name=env1_value env2_name=env2_value sh OMSInstaller.sh`
- Change OhMySH installation path
- ENV_NAME: `OMS`
- ENV_VALUE: [PATH]
- More info: Do not set it to a path without access rights.
- Change OhMySH cache path
- ENV_NAME: `OMS_CACHE`
- ENV_VALUE: [PATH]
- More info: Do not set it to a path without access rights; it also cannot be the same as the installation path or a path inside the installation path.
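As a sketch of how these overrides work, an installer like `OMSInstaller.sh` might read them as below (the default paths here are assumptions for illustration, not the real defaults):

```shell
# Hypothetical: honor the OMS and OMS_CACHE overrides, falling back to
# assumed defaults when they are not set.
OMS="${OMS:-$HOME/.ohmysh}"
OMS_CACHE="${OMS_CACHE:-$HOME/.cache/ohmysh}"
echo "installing to: $OMS"
echo "caching in: $OMS_CACHE"
```

Invoked as `OMS=/opt/ohmysh sh OMSInstaller.sh`, the install path would then resolve to `/opt/ohmysh`.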
| 28.446154 | 152 | 0.737155 | eng_Latn | 0.962127 |
f97882a76c0d65cf90ef056f778156da9026e026 | 33,358 | md | Markdown | LICENSE.md | dekterov/Leacme.Weathery | 5223abaaa4953c00029db6a6c86ebccf059ea862 | [
"MIT"
] | null | null | null | LICENSE.md | dekterov/Leacme.Weathery | 5223abaaa4953c00029db6a6c86ebccf059ea862 | [
"MIT"
] | null | null | null | LICENSE.md | dekterov/Leacme.Weathery | 5223abaaa4953c00029db6a6c86ebccf059ea862 | [
"MIT"
] | null | null | null | # MIT License
Copyright (c) 2017 Leacme (http://leac.me)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Avalonia UI:
The MIT License (MIT)
Copyright (c) 2014 Steven Kirk
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
.NET Core:
The MIT License (MIT)
Copyright (c) .NET Foundation and Contributors
All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
LiteDB:
The MIT License (MIT)
Copyright (c) 2014-2015 Mauricio David
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
weather.codes:
All data is provided “as is” without warranty or any representation of accuracy, timeliness or completeness.
wxdata.weather.com:
WxData Terms of Use
November 29, 2017
This Agreement
This Terms of Use Agreement ("Agreement") sets forth the legally binding terms for your use of the WxData.com website. By using the WxData.com website, you agree to be bound by this Agreement, whether you are a "visitor", which means that you are simply viewing one of the WxData.com website with an internet browser or other software, or you are a "member", meaning you have registered with WxData.com for one or more of the WxData.com services. The term "user" shall refer to both visitors and members. You are only authorized to use the WxData.com website, whether such use is intentional or not, if you agree to abide by this Agreement, and all other applicable laws. Please read this agreement carefully and save it. If you do not agree with it, you should leave the WxData.com website immediately, and discontinue use of WxData.com. If you wish to become a member, communicate with other members, and make use of the WxData.com website, you must read this Agreement, and by signing up as a member during the registration process you are accepting the terms of the WxData.com service.
Modification
WxData.com may modify this Agreement from time to time and such modification shall be effective upon posting on the WxData.com website. You agree to be bound to any changes to this Agreement when you use the WxData.com website after any such modification is posted. It is important that you review this Agreement regularly to ensure you are aware of any changes.
Description of Services
WxData.com provides users with access to a rich collection of weather and environmental data in its network of properties which may be accessed through any various medium or device now known or hereafter developed (the "Service"). You also understand and agree that the WxData.com website may include certain communications from WxData.com or parent company iAlert Services LLC., such as service announcements, administrative messages and the WxData.com Newsletter, and that these communications are considered part of WxData.com membership and you will not be able to opt out of receiving them. Unless explicitly stated otherwise, any new features that augment or enhance the current Service, including the release of new WxData.com properties, shall be subject to this Agreement. You understand and agree that the WxData.com service and website is provided "AS-IS" and that WxData.com or it's parent company iAlert Services LLC assumes no responsibility for the timeliness, deletion, mis-delivery or failure to store any user communications or personalization settings. You are responsible for obtaining access to the WxData.com website, and that access may involve third-party fees (such as Internet service provider, mobile device service provider, or airtime charges, charges for incoming/outgoing text messages). You are responsible for those fees. In addition, you must provide and are responsible for all equipment necessary to access the WxData.com website or to receive WxData.com communications (such as a mobile phone or other such device to receive SMS or otherwise known as Texting communication).
WxData.com Trademarks, Copyrights, and other Intellectual Property Rights
The WxData.com website and any related software are protected by copyright, trademark, and other laws and international treaties. Except as described in other sections of this document, all intellectual property rights such as but not limited to patents, trademarks or copyrights related to the WxData.com website are the property of iAlert Services LLC. You agree that your use of the WxData.com website is through a limited nonexclusive license. iAlert Services LLC is and remains the owner of all titles, rights, and interests in the WxData.com website. You acknowledge and agree that you shall not modify, translate, reverse engineer, decompile or disassemble the Software or any part thereof or otherwise attempt to derive source code or create derivative works there from. You acknowledge and agree that you are not allowed to remove, alter or destroy any proprietary, trademark or copyright markings or notices placed upon, contained within, or used in relation to the WxData.com website. WxData.com and iAlert Services LLC are completely independent and do not directly represent any government agency. The display of government logos, or reference to any government agency, on the WxData.com website is to display active users or provide as "credit to" only and "reference back" to the authorized government agency responsible for providing datasets.
Your Registration Obligations
In consideration of your use of the WxData.com website, you represent that you are of legal age to form a binding contract and are not a person barred from receiving services under the laws of the United States or other applicable jurisdiction. You also agree to: (a) provide true, accurate, current and complete information about yourself as prompted by the WxData.com website's registration form (the "Registration Data") and (b) maintain and promptly update the Registration Data to keep it true, accurate, current and complete. If you provide any information that is untrue, inaccurate, not current or incomplete, or WxData.com has reasonable grounds to suspect that such information is untrue, inaccurate, not current or incomplete, WxData.com has the right to suspend or terminate your account and refuse any and all current or future use of the WxData.com website (or any portion thereof). By using the WxData.com website, you represent and warrant that you have the right, authority, and capacity to enter into this Agreement and will abide by the terms and conditions of this Agreement.
WxData.com Privacy Policy
Registration Data and certain other information about you is subject to our Privacy Policy. By using the WxData.com website, you are representing that you have read that Privacy Policy and that you have accepted its terms. You understand that through your use of the WxData.com website you consent to the collection and use (as set forth in the Privacy Policy) of this information, including the transfer of this information to the United States and/or other countries for storage, processing and use by WxData.com or parent company iAlert Services LLC, and its affiliates. The WxData.com Privacy Policy, as available on the WxData.com website, is incorporated into this agreement by reference.
Member Account, Password and Security
You will receive a password and account designation upon completing the WxData.com website registration process. You are responsible for maintaining the confidentiality of the password and account and are fully responsible for all activities that occur under your password or account. You agree to (a) immediately notify WxData.com of any unauthorized use of your password or account or any other breach of security, and (b) ensure that you exit from your account at the end of each session. WxData.com cannot and will not be liable for any loss or damage arising from your failure to comply with this Section. WxData.com may charge a fee for membership in or access to any of the WxData.com website as it deems appropriate. Such fee shall be non-refundable.
Bandwidth Limitations
You understand that you will use only a reasonable amount of bandwidth, that is, that you are responsible for ensuring that the amount of data transmitted between third-parties and WxData.com for the purposes of viewing content that you control will remain within a reasonable limit. WxData.com will set that limit at its sole discretion. WxData.com take any remedies to cure use of an unreasonable amount of bandwidth, including, but not limited to, removing access to content causing unreasonable bandwidth usage, either temporarily or permanently; temporarily or permanently suspending your member account; or terminating this agreement.
Special Admonitions for International Use
Recognizing the global nature of the Internet, you agree to comply with all rules regarding online conduct specific to any geographic location in which you use the WxData.com website. You also agree to comply with any applicable United States export rules that may govern the geographic location in which you use the WxData.com website.
Indemnity
You agree to indemnify and hold harmless WxData.com or parent company iAlert Services LLC and its subsidiaries, affiliates, officers, agents, employees, partners, licensors and licensees from any claim or demand, including attorney's fees and costs, made by any third party due to or arising out of your use of the WxData.com website, your connection with the WxData.com website, or your violation of this Agreement.
No Resale of Service
You agree not to reproduce, duplicate, copy, sell, trade, resell, or exploit for any commercial purpose, any portion of the WxData.com website, use of the WxData.com website, or access to the WxData.com website.
General Practices Regarding Use and Service
You agree that WxData.com may establish general practices and limits concerning use of the WxData.com website, including without limitation
Modifications to Service
WxData.com reserves the right at any time and from time to time to modify or discontinue, temporarily or permanently, the WxData.com website (or any part thereof) with or without notice. You agree that WxData.com shall not be liable to you or to any third party for any modification, suspension or discontinuance of the WxData.com Services.
Termination
You agree that WxData.com may, under certain circumstances and without prior notice and at WxData.com's sole discretion, immediately terminate your WxData.com account, any associated email address, and access to the WxData.com website. Cause for such termination shall include, but not be limited to:
- breaches or violations of the TOS or other incorporated agreements or guidelines
- requests by law enforcement or other government agencies
- a request by you (self-initiated account deletions)
- discontinuance or material modification to the WxData.com website (or any part thereof)
- unexpected technical or security issues or problems
- extended periods of inactivity
- engagement by you in fraudulent or illegal activities, or
- nonpayment of any fees owed by you in connection with the WxData.com website.
Termination of your membership includes:
- removal of access to all offerings within the WxData.com website
- deletion of your password and all related information, files and content associated with or inside your account (or any part thereof)
- barring of further use of the WxData.com website.
Further, you agree that all terminations for cause shall be made in WxData.com's sole discretion and that WxData.com shall not be liable to you or any third party for any termination of your account, any associated email address, or access to the WxData.com website.
Members can terminate their account and service manually at any time from inside their membership account.
Dealings with Advertisers
Your correspondence or business dealings with, or participation in promotions of, advertisers found on or through the WxData.com website, including payment and delivery of related goods or services, and any other terms, conditions, warranties or representations associated with such dealings, are solely between you and such advertiser. You agree that WxData.com shall not be responsible or liable for any loss or damage of any sort incurred as the result of any such dealings or as the result of the presence of such advertisers on the WxData.com website.
Links
The WxData.com website or third parties may provide links to other World Wide Web sites or resources. Because WxData.com has no control over such sites and resources, you agree that WxData.com is not responsible for the availability of such external sites or resources, and you agree that WxData.com does not endorse and is not responsible or liable for any content, advertising, products or other materials on or available from such other sites or resources. You further agree that WxData.com shall not be responsible or liable, directly or indirectly, for any damage or loss caused by or in connection with use of or reliance on any such content, goods or services associated with any such site or resource.
Disclaimer of Warranties
THIS SECTION IS EXTREMELY IMPORTANT. PLEASE READ IT CAREFULLY
You understand and expressly agree that:
- Your use of the WxData.com is at your sole risk. The WxData.com website and services is provided on an "as is" and "as available" basis. WxData.com and its subsidiaries, affiliates, officers, employees, agents, partners, and licensees disclaim all warranties of any kind, whether express of implied, including, but not limited to, an implied warranty of merchantability, any express or implied warranty of fitness for a particular purpose, and any implied or express warranty of non-infringement.
- WxData.com and its subsidiaries, affiliates, officers, employees, agents, partners, and licensors and licensees make no warranty that The WxData.com website or WxData.com services will be best effort. We make no guarantee of timely weather and environmental data access.
- Any material you download or otherwise obtain through use of the WxData.com website you access at your sole risk. You agree that you shall be solely responsible for any damage to your computer system or loss of data that results from the download of or accessing of any such material.
- No communication between WxData.com or the parent company iAlert Services LLC and you, or through or from the WxData.com website shall create any warranty not expressly created in this Agreement.
Limitation of Liability
THIS SECTION IS EXTREMELY IMPORTANT. PLEASE READ IT CAREFULLY
You expressly understand and agree that WxData.com and parent company iAlert Services LLC and its subsidiaries, affiliates, officers, directors, employees, agents, partners, and licensors shall not be liable to you for any direct, indirect, incidental, special, consequential or exemplary damages, including, but not limited to, damages for loss of profits, goodwill, loss of use, loss of data or other intangible losses, even if WxData.com or iAlert Services LLC has been advised of the possibility of such damages, resulting from:
- The use or the inability to use the WxData.com website or services; The cost of procurement of substitute goods or services resulting from any goods, data, information or services purchased or obtained or messages received or transactions entered into through or from the WxData.com website; Unauthorized access to or alteration of your transmissions or data; Any postings or transmissions made by you, the user; Statements or conduct of any third party on the WxData.com website; or Any other matter relating to the WxData.com website.
Exclusions and Limitations
THIS SECTION IS EXTREMELY IMPORTANT. PLEASE READ IT CAREFULLY
Some Jurisdictions do not allow the exclusion of certain warranties or the limitation or exclusion of liability for incidental or consequential damages.
No Third Party Beneficiaries
You agree that, except as otherwise expressly provided in this Agreement, there shall be no third-party beneficiaries to this agreement.
Notices
WxData.com or parent company iAlert Services LLC may provide you with notices, including those regarding changes to this Agreement, at WxData.com's sole discretion, in any manner WxData.com decides, including, but not limited to, posting to one of the WxData.com website, email, regular mail, or by SMS.
Entire Agreement
This Agreement constitutes the entire agreement between you and WxData.com and parent the company iAlert Services LLC and governs your use of the WxData.com website, superseding any prior agreements between you and WxData.com and the parent company iAlert Services LLC with respect to the WxData.com website. No agreement other than this shall govern your use of the WxData.com website, unless it is incorporated herein. The WxData.com privacy policy is incorporated herein in "privacy". There shall be no modification or amendment to this agreement except in writing at the sole discretion of WxData.com. You agree that posting of a modification or amendment to this agreement to the WxData.com website shall constitute acceptable notice such modification or amendment.
Choice of Law, Forum and Venue
The Agreement, and the relationship between you and WxData.com and parent company iAlert Services LLC shall be governed by the laws of Maryland without regard to its Conflict of Laws provisions, and without regard to your domicile or physical location. You and WxData.com and parent company iAlert Services LLC agree to submit to the exclusive personal jurisdiction of Rockville, Maryland either in the State or Federal courts with jurisdiction thereof. You and WxData.com and parent company iAlert Services agree that proper venue for any claims arising out of or related to this Agreement shall be county of Montgomery, Rockville Maryland.
Severability and Waiver
The failure of WxData.com to exercise or enforce any right or provision of the Agreement shall not constitute a waiver of such right or provision. If any provision of the Agreement is found by a court of competent jurisdiction to be invalid, you and WxData.com nonetheless agree that the court should endeavor to give full effect to the parties' intentions as reflected in the provision, and you and WxData.com agree that other provisions of the Agreement remain in full effect.
Non-Transferrability and No Right of Survivorship
You agree that your right to access the WxData.com website through a membership is non-transferable and any rights to your membership or contents within your account terminate upon your death. Upon receipt of a copy of a death certificate, your account may be terminated and all contents permanently deleted.
Claim Time Limitation
You agree that regardless of any law to the contrary, any claim or cause of action arising out of or related to use of the WxData.com services, or this Agreement must be filed within one year after such claim or cause of action arises or be forever barred.
Refunds
Forward paid services may be refunded at the sole discretion of WxData.com and the parent company iAlert Services LLC. There are no refunds for paid services consumed prior to refund request date, exceptions may be considered at the sole discretion of WxData.com and the parent company iAlert Services LLC.
RestSharp:
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.
2. Grant of Copyright License.
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.
3. Grant of Patent License.
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
4. Redistribution.
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
You must give any other recipients of the Work or Derivative Works a copy of this License; and
You must cause any modified files to carry prominent notices stating that You changed the files; and
You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.
You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.
5. Submission of Contributions.
Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.
6. Trademarks.
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty.
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.
8. Limitation of Liability.
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability.
While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
Newtonsoft.Json:
The MIT License (MIT)
Copyright (c) 2007 James Newton-King
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
| 109.730263 | 1,611 | 0.81126 | eng_Latn | 0.998749 |
f97895faf87b68b7d72946d9a5e965e365f2bd98 | 7,136 | md | Markdown | _next/03-01-benefits.md | huyenltnguyen/uswds-site | c0f7418324c85e49413e7ddd071336f44a00b583 | [
"CC0-1.0"
] | 102 | 2018-01-28T04:59:38.000Z | 2022-03-13T18:18:37.000Z | _next/03-01-benefits.md | huyenltnguyen/uswds-site | c0f7418324c85e49413e7ddd071336f44a00b583 | [
"CC0-1.0"
] | 553 | 2018-01-23T18:05:13.000Z | 2022-03-30T19:59:24.000Z | _next/03-01-benefits.md | huyenltnguyen/uswds-site | c0f7418324c85e49413e7ddd071336f44a00b583 | [
"CC0-1.0"
] | 92 | 2018-01-24T13:24:05.000Z | 2022-03-23T15:27:56.000Z | ---
title: Understand the value and benefits
subhead: Agency teams want to...
description: "Agencies need to be able to quickly discern the design system’s benefits and how it aligns with their goals for delivering better digital services."
meta:
og:image: /img/next/og-next-findings-understand.png
permalink: /next/findings/benefits/
slug: "understand"
navbar_color: "next-red-dark"
hero_color: "next-red-medium"
quote_color: "next-red-medium"
chapter: false
parent: Findings
conversation_starters:
- |-
**How might we** communicate the value of the design system more effectively to new customers?
- |-
**How might we** help agency champions advocate for the design system?
- |-
**How might we** clarify common misperceptions or confusion around the design system?
- |-
**How might we** better explain what 21st Century IDEA means for agencies and help them understand how they are performing?
---
<section class="next-section">
<div class="grid-container">
<div class="grid-row">
<div class="grid-col-12 tablet:grid-col-8 tablet:margin-x-auto desktop:margin-x-0 next-section-prose" markdown="1">
While the benefits of using the design system become more clear to agency teams as they use it, they often start with some concerns and misconceptions about how it works and the value it provides.
### Leverage the benefits of USWDS
Building a website or adding to an existing one can be a significant undertaking. While there’s an initial investment, agencies save time and energy in the end, allowing them to focus on their mission instead.
In addition, when agency teams commit to using the design system, they don’t have to worry about explaining every design decision to stakeholders, allowing teams to align on priorities and move forward quickly and confidently. Some teams are also looking for support and documentation to help them advocate with their leadership to use design system components and principles.
{% include next/next-quote.html quote="This has already been developed and blessed by an interagency community so you don’t have to just go on my opinion." source="Manager" variant="inline" color=page.quote_color %}
{% include next/next-quote.html quote="By using these standard components, our design is less likely to be hijacked by some external stakeholder who has some new idea. We can say ‘oh we’re using the web design system.’ While it gives us flexibility, it also gives us protection to do something testable and usable. Less likely for our designs to go off the rails." source="Designer" variant="inline" color=page.quote_color %}
### Adhere to their brand
Some agency teams fear that adopting the design system means their website will look like every other government website. Many agencies have pride in their brand and want their digital services to be recognizably their own. However, these efforts may not have the impact they’re hoping for, as most members of the public value easy-to-use experiences over beautifully branded ones.
The design system team needs to do more to show agency teams that they can still customize the look and feel of their sites using the design system, while keeping the accessibility and UX best practices it provides. This involves balancing just the right amount of guidance without being overly prescriptive.
{% include next/next-quote.html quote="The largest roadblock is that there is a very strong current of agencies wanting to have their own identity." source="Manager" variant="inline" color=page.quote_color %}
{% include next/next-quote.html quote="Very few people would be ok with a site that looks like other government sites. They all want their brand, a way to look different." source="Content manager" variant="inline" color=page.quote_color %}
### Learn how designers fit into the process
Another common misunderstanding is that USWDS is only for teams without a designer. Though the design system supports design and engineering processes, it’s not a replacement for a designer or any of the other cross-functional skills necessary to bring it all together. Instead, using the design system allows designers and engineers to focus more on problem-solving for specific user needs (such as adding multilingual content) and less time establishing the basics.
{% include next/next-quote.html quote="My general opinion is that it’s a great set of principles and tools for agencies that don’t have designers. I’ve always had excellent designers and UX experts on staff so we didn’t need the design system, per se." source="Civic tech leader" variant="inline" color=page.quote_color %}
{% include next/next-quote.html quote="I have a UX person on my team so I’m very lucky." source="Manager" variant="inline" color=page.quote_color %}
### Know what compliance means
We also heard significant confusion around the 21st Century Integrated Digital Experience Act, otherwise known as 21st Century IDEA. Though the Act was signed into law over two years ago, agency teams are still trying to figure out what compliance looks like. It’s unclear if using USWDS is required for all sites, or only new or redesigned ones.
There are also questions of customization: can components or colors be customized, or do they have to be the default styles? Though the Act mentions eight specific standards for websites, including that they be accessible, mobile-friendly, and user-centered, agencies are having trouble determining what “good enough” looks like [^5].
{% include next/next-quote.html quote="So where it says ‘the standards as issued by TTS’, is that USWDS? And no one knew. ‘Standards’ has budgetary and workload prioritization meaning." source="Civic tech leader" variant="inline" color=page.quote_color %}
{% include next/next-quote.html quote="We weren’t sure if it was a requirement by law or optional tool." source="Designer" variant="inline" color=page.quote_color %}
</div>
</div>
</div>
</section>
<div class="bg-{{ page.hero_color}} height-1"></div>
<section class="next-section next-section--shaded">
<div class="grid-container">
<div class="grid-row">
<div class="grid-col-12 tablet:grid-col-8 tablet:margin-x-auto desktop:margin-x-0 margin-top-neg-3 margin-bottom-neg-3 next-section-prose">
{% include next/next-bulleted-list.html paragraph="We should explore a few different avenues to help agencies <b>understand the value of using USWDS</b> and support them in gaining executive buy-in." icon="forum" bullets=page.conversation_starters title="Conversation starters" %}
</div>
</div>
</div>
</section>
<section class="next-section next-section--citations">
<div class="grid-container">
<div class="grid-row">
<div class="grid-col-12 tablet:grid-col-8 tablet:margin-x-auto desktop:margin-x-0" markdown="1">
#### Citations
* footnotes will be placed here. This line is necessary
{:footnotes}
[^5]: 21st Century Integrated Digital Experience Act. (2020, August 31). Retrieved January 06, 2021, from <https://digital.gov/resources/21st-century-integrated-digital-experience-act/>
</div>
</div>
</div>
</section>
---
layout: post
title: Rainy Day 🌧
---
<p style='text-align: justify;'></p>
<p></p>
<p style='text-align: justify;'>
<p>Today was the last rain of August, and I thought of you. </p>
<p>Of the day we crossed Eje 3 together on our bikes for the first time. </p>
<p>Of the day we had a coffee while it rained. </p>
<p>Of the day we strolled along Eje Central singing, because we simply couldn't hide our joy. </p>
<p>Of the day I felt your shy, soft lips for the first time. </p>
<p>All on a rainy day. ❤</p>
</p>

---
title: Overview of AMQP 1.0 in Azure Service Bus
description: Learn how Azure Service Bus supports the Advanced Message Queuing Protocol (AMQP), an open standard protocol.
services: service-bus-messaging
documentationcenter: .net
author: axisc
manager: timlt
editor: spelluru
ms.assetid: 0e8d19cc-de36-478e-84ae-e089bbc2d515
ms.service: service-bus-messaging
ms.workload: na
ms.tgt_pltfrm: na
ms.devlang: multiple
ms.topic: article
ms.date: 01/23/2019
ms.author: aschhab
ms.openlocfilehash: 50d21cfe8136b9c794eae5104bbb34e28f7c1661
ms.sourcegitcommit: 849bb1729b89d075eed579aa36395bf4d29f3bd9
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 04/28/2020
ms.locfileid: "76759320"
---
# <a name="amqp-10-support-in-service-bus"></a>AMQP 1.0 support in Service Bus
Both the Azure Service Bus cloud service and the on-premises [Service Bus for Windows Server (Service Bus 1.1)](https://msdn.microsoft.com/library/dn282144.aspx) support the Advanced Message Queuing Protocol (AMQP) 1.0. With AMQP, you can build cross-platform, hybrid applications using an open standard protocol. You can construct applications using components that are built with different languages and frameworks, and that run on different operating systems. All these components can connect to Service Bus and seamlessly exchange structured business messages efficiently and with full fidelity.
## <a name="introduction-what-is-amqp-10-and-why-is-it-important"></a>Introduction: What is AMQP 1.0 and why is it important?
Traditionally, message-oriented middleware products have used proprietary protocols for communication between client applications and brokers. This means that once you have selected a particular vendor's messaging broker, you must use that vendor's libraries to connect your client applications to that broker. This results in a degree of dependence on that vendor, because porting an application to a different product requires code changes in all the connected applications.
Furthermore, connecting messaging brokers from different vendors is difficult. This typically requires application-level bridging to move messages from one system to another and to translate between their proprietary message formats. This is a common requirement; for example, when you must provide a new unified interface to older, disparate systems, or integrate IT systems following a merger.
The software industry is a fast-moving business; new programming languages and application frameworks are introduced at a sometimes bewildering pace. Similarly, the requirements of IT systems evolve over time and developers want to take advantage of the latest platform features. However, sometimes the selected messaging vendor does not support these platforms. Because messaging protocols are proprietary, it is not possible for others to provide libraries for these new platforms. Therefore, you must use approaches such as building gateways or bridges that enable you to continue to use the messaging product.
The development of the Advanced Message Queuing Protocol (AMQP) 1.0 was motivated by these issues. It originated at JP Morgan, which, like most financial services firms, is a heavy user of message-oriented middleware. The goal was simple: to create an open standard messaging protocol that made it possible to build message-based applications using components built with different languages, frameworks, and operating systems, using best-of-breed components from a range of suppliers.
## <a name="amqp-10-technical-features"></a>AMQP 1.0 technical features
AMQP 1.0 is an efficient, reliable, wire-level messaging protocol that you can use to build robust, cross-platform messaging applications. The protocol has a simple goal: to define the mechanics of the secure, reliable, and efficient transfer of messages between two parties. The messages themselves are encoded using a portable data representation that enables heterogeneous senders and receivers to exchange structured business messages with full fidelity. The following is a summary of the most important features:
* **Efficient**: AMQP 1.0 is a connection-oriented protocol that uses a binary encoding for the protocol instructions and the business messages transferred over it. It incorporates sophisticated flow-control schemes to maximize the utilization of the network and the connected components. That said, the protocol was designed to strike a balance between efficiency, flexibility, and interoperability.
* **Reliable**: The AMQP 1.0 protocol allows messages to be exchanged with a range of reliability guarantees, from fire-and-forget to reliable, exactly-once acknowledged delivery.
* **Flexible**: AMQP 1.0 is a flexible protocol that can be used to support different topologies. The same protocol can be used for client-to-client, client-to-broker, and broker-to-broker communications.
* **Broker-model independent**: The AMQP 1.0 specification does not make any requirements on the messaging model used by a broker. This means that it is possible to easily add AMQP 1.0 support to existing messaging brokers.
## <a name="amqp-10-is-a-standard-with-a-capital-s"></a>AMQP 1.0 is a Standard (with a capital 'S')
AMQP 1.0 is an international standard, approved by ISO and IEC as ISO/IEC 19464:2014.
AMQP 1.0 has been in development since 2008 by a core group of more than 20 companies, both technology suppliers and end-user firms. During that time, user firms have contributed their real-world business requirements and the technology vendors have evolved the protocol to meet those requirements. Throughout the process, vendors have participated in workshops in which they collaborated to validate the interoperability of their implementations.
In October 2011, the development work transitioned to a technical committee within the Organization for the Advancement of Structured Information Standards (OASIS) and the OASIS AMQP 1.0 Standard was released in October 2012. The following firms participated in the technical committee during the development of the standard:
* **Technology vendors**: Axway Software, Huawei Technologies, IIT Software, INETCO Systems, Kaazing, Microsoft, Mitre Corporation, Primeton Technologies, Progress Software, Red Hat, SITA, Software AG, Solace Systems, VMware, WSO2, Zenika.
* **User firms**: Bank of America, Credit Suisse, Deutsche Boerse, Goldman Sachs, JPMorgan
Some of the commonly cited benefits of open standards include:
* Less chance of vendor lock-in
* Interoperability
* Broad availability of libraries and tooling
* Protection against obsolescence
* Availability of knowledgeable staff
* Lower and manageable risk
## <a name="amqp-10-and-service-bus"></a>AMQP 1.0 and Service Bus
AMQP 1.0 support in Azure Service Bus means that you can now leverage the Service Bus queuing and publish/subscribe brokered messaging features from a range of platforms using an efficient binary protocol. Furthermore, you can build applications comprised of components built using a mix of languages, frameworks, and operating systems.
The following figure illustrates an example deployment in which Java clients running on Linux, written using the standard Java Message Service (JMS) API, and .NET clients running on Windows, exchange messages via Service Bus using AMQP 1.0.
![][0]
**Figure 1: Example deployment scenario showing cross-platform messaging using Service Bus and AMQP 1.0**
At this time the following client libraries are known to work with Service Bus:
| Language | Library |
| --- | --- |
| Java |Apache Qpid Java Message Service (JMS) client<br/>IIT Software SwiftMQ Java client |
| C |Apache Qpid Proton-C |
| PHP |Apache Qpid Proton-PHP |
| Python |Apache Qpid Proton-Python |
| C# |AMQP .NET Lite |
**Figure 2: Table of AMQP 1.0 client libraries**
## <a name="summary"></a>Summary
* AMQP 1.0 is an open, reliable messaging protocol that you can use to build cross-platform, hybrid applications. AMQP 1.0 is an OASIS standard.
* AMQP 1.0 support is now available in Azure Service Bus as well as Service Bus for Windows Server (Service Bus 1.1). Pricing is the same as for the existing protocols.
## <a name="next-steps"></a>Next steps
Ready to learn more? Visit the following links:
* [Use Service Bus from .NET with AMQP]
* [Use Service Bus from Java with AMQP]
* [Install Apache Qpid Proton-C on an Azure Linux VM]
* [AMQP in Service Bus for Windows Server]
[0]: ./media/service-bus-amqp-overview/service-bus-amqp-1.png
[Use Service Bus from .NET with AMQP]: service-bus-amqp-dotnet.md
[Use Service Bus from Java with AMQP]: service-bus-amqp-java.md
[Install Apache Qpid Proton-C on an Azure Linux VM]: service-bus-amqp-apache.md
[AMQP in Service Bus for Windows Server]: https://msdn.microsoft.com/library/dn574799.aspx
---
templateKey: about-page
path: /about
title: About 4displays
---
Based in Auckland, 4displays have been in the display business since 1997, designing, manufacturing and supplying all kinds of display systems: from small and medium-sized items like brochure, poster and menu holders, to large, complex creations and retail displays for merchandising. So if you need some kind of signage, a custom-made unit, or something to hold your promotional material, you have come to the right place!
John, the founder and director of 4 Displays is ready to help in all areas of custom and ready made displays, big or small!
Check out our product range on the menu above! Do get in touch for any questions. We look forward to helping you with the right display solution! It is worthwhile following our [news](http://www.4displays.co.nz/news/) link for the latest updates.
[![GoDoc](https://godoc.org/github.com/flopp/go-staticmaps?status.svg)](https://godoc.org/github.com/flopp/go-staticmaps)
[](https://goreportcard.com/report/flopp/go-staticmaps)
[](https://github.com/flopp/go-staticmaps/)
# go-staticmaps
A go (golang) library and command line tool to render static map images using OpenStreetMap tiles.
## What?
go-staticmaps is a golang library that allows you to create nice static map images from OpenStreetMap tiles, along with markers of different size and color, as well as paths and colored areas.
go-staticmaps comes with a command line tool called `create-static-map` for use in shell scripts, etc.

## How?
### Installation
Installing go-staticmaps is as easy as
```bash
go get -u github.com/flopp/go-staticmaps
```
For the command line tool, use
```bash
go get -u github.com/flopp/go-staticmaps/create-static-map
```
Of course, your local Go installation must be set up properly.
### Library Usage
Create a 400x300 pixel map with a red marker:
```go
import (
"image/color"
"github.com/flopp/go-staticmaps"
"github.com/fogleman/gg"
"github.com/golang/geo/s2"
)
func main() {
ctx := sm.NewContext()
ctx.SetSize(400, 300)
  ctx.AddMarker(sm.NewMarker(s2.LatLngFromDegrees(52.514536, 13.350151), color.RGBA{0xff, 0, 0, 0xff}, 16.0))
img, err := ctx.Render()
if err != nil {
panic(err)
}
if err := gg.SavePNG("my-map.png", img); err != nil {
panic(err)
}
}
```
See [GoDoc](https://godoc.org/github.com/flopp/go-staticmaps) for complete documentation, and the source code of the [command line tool](https://github.com/flopp/go-staticmaps/blob/master/create-static-map/create-static-map.go) for an example of how to use the package.
### Command Line Usage
Usage:
create-static-map [OPTIONS]
Creates a static map
Application Options:
--width=PIXELS Width of the generated static map image (default: 512)
--height=PIXELS Height of the generated static map image (default: 512)
-o, --output=FILENAME Output file name (default: map.png)
-t, --type=MAPTYPE Select the map type; list possible map types with '--type list'
-c, --center=LATLNG Center coordinates (lat,lng) of the static map
-z, --zoom=ZOOMLEVEL Zoom factor
-b, --bbox=NW_LATLNG|SE_LATLNG
        Set the bounding box (NW_LATLNG = north-western point of the
        bounding box, SE_LATLNG = south-eastern point of the bounding
        box)
-m, --marker=MARKER Add a marker to the static map
-p, --path=PATH Add a path to the static map
-a, --area=AREA Add an area to the static map
Help Options:
-h, --help Show this help message
### General
The command line interface tries to resemble [Google's Static Maps API](https://developers.google.com/maps/documentation/static-maps/intro).
If neither `--bbox`, `--center`, nor `--zoom` are given, the map extent is determined from the specified markers, paths and areas.
### Markers
The `--marker` option defines one or more map markers of the same style. Use multiple `--marker` options to add markers of different styles.
--marker MARKER_STYLES|LATLNG|LATLNG|...
`LATLNG` is a comma separated pair of latitude and longitude, e.g. `52.5153,13.3564`.
`MARKER_STYLES` consists of a set of style descriptors separated by the pipe character `|`:
- `color:COLOR` - where `COLOR` is either of the form `0xRRGGBB`, `0xRRGGBBAA`, or one of `black`, `blue`, `brown`, `green`, `orange`, `purple`, `red`, `yellow`, `white` (default: `red`)
- `size:SIZE` - where `SIZE` is one of `mid`, `small`, `tiny`, or some number > 0 (default: `mid`)
- `label:LABEL` - where `LABEL` is an alpha numeric character, i.e. `A`-`Z`, `a`-`z`, `0`-`9`; (default: no label)
- `labelcolor:COLOR` - where `COLOR` is either of the form `0xRRGGBB`, `0xRRGGBBAA`, or one of `black`, `blue`, `brown`, `green`, `orange`, `purple`, `red`, `yellow`, `white` (default: `black` or `white`, depending on the marker color)
### Paths
The `--path` option defines a path on the map. Use multiple `--path` options to add multiple paths to the map.
--path PATH_STYLES|LATLNG|LATLNG|...
or
--path PATH_STYLES|gpx:my_gpx_file.gpx
`PATH_STYLES` consists of a set of style descriptors separated by the pipe character `|`:
- `color:COLOR` - where `COLOR` is either of the form `0xRRGGBB`, `0xRRGGBBAA`, or one of `black`, `blue`, `brown`, `green`, `orange`, `purple`, `red`, `yellow`, `white` (default: `red`)
- `weight:WEIGHT` - where `WEIGHT` is the line width in pixels (default: `5`)
### Areas
The `--area` option defines a closed area on the map. Use multiple `--area` options to add multiple areas to the map.
--area AREA_STYLES|LATLNG|LATLNG|...
`AREA_STYLES` consists of a set of style descriptors separated by the pipe character `|`:
- `color:COLOR` - where `COLOR` is either of the form `0xRRGGBB`, `0xRRGGBBAA`, or one of `black`, `blue`, `brown`, `green`, `orange`, `purple`, `red`, `yellow`, `white` (default: `red`)
- `weight:WEIGHT` - where `WEIGHT` is the line width in pixels (default: `5`)
- `fill:COLOR` - where `COLOR` is either of the form `0xRRGGBB`, `0xRRGGBBAA`, or one of `black`, `blue`, `brown`, `green`, `orange`, `purple`, `red`, `yellow`, `white` (default: none)
## Examples
### Basic Maps
Centered at "N 52.514536 E 13.350151" with zoom level 10:
```bash
$ create-static-map --width 600 --height 400 -o map1.png -c "52.514536,13.350151" -z 10
```

A map with a marker at "N 52.514536 E 13.350151" with zoom level 14 (no need to specify the map's center - it is automatically computed from the marker(s)):
```bash
$ create-static-map --width 600 --height 400 -o map2.png -z 14 -m "52.514536,13.350151"
```

A map with two markers (red and green). If there are more than two markers in the map, a *good* zoom level can be determined automatically:
```bash
$ create-static-map --width 600 --height 400 -o map3.png -m "color:red|52.514536,13.350151" -m "color:green|52.516285,13.377746"
```

### Create a map of the Berlin Marathon
create-static-map --width 800 --height 600 \
--marker "color:green|52.5153,13.3564" \
--marker "color:red|52.5160,13.3711" \
--output "berlin-marathon.png" \
--path "color:blue|weight:2|gpx:berlin-marathon.gpx"

### Create a map of the US capitals
create-static-map --width 800 --height 400 \
--output "us-capitals.png" \
--marker "color:blue|size:tiny|32.3754,-86.2996|58.3637,-134.5721|33.4483,-112.0738|34.7244,-92.2789|\
38.5737,-121.4871|39.7551,-104.9881|41.7665,-72.6732|39.1615,-75.5136|30.4382,-84.2806|33.7545,-84.3897|\
21.2920,-157.8219|43.6021,-116.2125|39.8018,-89.6533|39.7670,-86.1563|41.5888,-93.6203|39.0474,-95.6815|\
38.1894,-84.8715|30.4493,-91.1882|44.3294,-69.7323|38.9693,-76.5197|42.3589,-71.0568|42.7336,-84.5466|\
44.9446,-93.1027|32.3122,-90.1780|38.5698,-92.1941|46.5911,-112.0205|40.8136,-96.7026|39.1501,-119.7519|\
43.2314,-71.5597|40.2202,-74.7642|35.6816,-105.9381|42.6517,-73.7551|35.7797,-78.6434|46.8084,-100.7694|\
39.9622,-83.0007|35.4931,-97.4591|44.9370,-123.0272|40.2740,-76.8849|41.8270,-71.4087|34.0007,-81.0353|\
44.3776,-100.3177|36.1589,-86.7821|30.2687,-97.7452|40.7716,-111.8882|44.2627,-72.5716|37.5408,-77.4339|\
47.0449,-122.9016|38.3533,-81.6354|43.0632,-89.4007|41.1389,-104.8165"

### Create a map of Australia
...where the Northern Territory is highlighted and the capital Canberra is marked.
create-static-map --width 800 --height 600 \
--center="-26.284973,134.303764" \
--output "australia.png" \
--marker "color:blue|-35.305200,149.121574" \
--area "color:0x00FF00|fill:0x00FF007F|weight:2|-25.994024,129.013847|-25.994024,137.989677|-16.537670,138.011649|\
-14.834820,135.385917|-12.293236,137.033866|-11.174554,130.398124|-12.925791,130.167411|-14.866678,129.002860"

## Acknowledgements
Besides the go standard library, go-staticmaps uses
- [OpenStreetMap](http://openstreetmap.org/), [Thunderforest](http://www.thunderforest.com/), [OpenTopoMap](http://www.opentopomap.org/), [Stamen](http://maps.stamen.com/) and [Carto](http://carto.com) as map tile providers
- [Go Graphics](https://github.com/fogleman/gg) for 2D drawing
- [S2 geometry library](https://github.com/golang/geo) for spherical geometry calculations
- [appdirs](https://github.com/Wessie/appdirs) for platform specific system directories
- [gpxgo](github.com/tkrajina/gpxgo) for loading GPX files
- [go-coordsparser](https://github.com/flopp/go-coordsparser) for parsing geo coordinates
## Contributors
- [Kooper](https://github.com/Kooper): fixed *library usage examples*
- [felix](https://github.com/felix): added *more tile servers*
- [wiless](https://github.com/wiless): suggested to add user definable *marker label colors*
- [noki](https://github.com/Noki): suggested to add a user definable *bounding box*
- [digitocero](https://github.com/digitocero): reported and fixed *type mismatch error*
## License
Copyright 2016, 2017 Florian Pigorsch & Contributors. All rights reserved.
Use of this source code is governed by a MIT-style license that can be found in the LICENSE file.
# 2mixxx
[2mixxx](https://2mixxx.com) is a web application for requesting songs that DJs have in their [iTunes](https://www.apple.com/es/itunes/download/index.html) libraries.

## Installation and build guide
1. Install [git](https://git-scm.com/downloads)
2. Install [docker compose](https://docs.docker.com/compose/install)
3. Create an [auth0](https://auth0.com/) account.
4. Create a Single-Page Application API in your [auth0](https://auth0.com/) account.
5. Create a Machine to Machine API in your [auth0](https://auth0.com/).
6. Clone the project
```sh
$ git clone https://github.com/carlesfelix/2mixxx.git
```
7. Create the environment variables for frontend. `/frontend/.env`
```
REACT_APP_SOCKET_BASE_URI=wss://{your_host}:3002
REACT_APP_AUTH0_DOMAIN={your_auth0_domain}
REACT_APP_AUTH0_CLIENT_ID={your_auth0_SPA_client_id}
REACT_APP_API_BASE_URL=https://{your_host}:3001/api
GENERATE_SOURCEMAP=false
```
8. Create the environment variables for backend. `/backend/.env`
```
NODE_ENV=production
API_PORT=3001
MYSQL_ROOT_PASSWORD={your_mysql_root_password}
MYSQL_DATABASE={your_mysql_database}
MYSQL_USER={your_mysql_user}
MYSQL_PASSWORD={your_mysql_password}
MYSQL_HOST=mysql
MYSQL_PORT=3306
MYSQL_SSL=0 # To be upgraded to 1 in the future
SOCKET_PORT=3002
SOCKET_CORS_ORIGIN=https://{your_host}
JWT_ROOM_USERS_SECRET={your_secret}
AUTH0_DOMAIN={your_auth0_domain}
AUTH0_MACHINE_TO_MACHINE_CLIENT_ID={your_machine_to_machine_client_id}
AUTH0_MACHINE_TO_MACHINE_CLIENT_SECRET={your_machine_to_machine_client_secret}
AUTH0_DB_CONNECTION={name_of_the_auth0_database_connection_that_stores_your_users}
```
9. Place your ssl files 2mixxx.crt and 2mixxx.key in the project root directory.
10. Launch 2mixxx with docker compose.
```sh
$ docker-compose -f docker-compose.deploy-vps.yml build
$ docker-compose -f docker-compose.deploy-vps.yml up -d
```
11. Create the first registered user in your auth0 dashboard.
12. Create the first registered user (as admin) in the database in order to have full access to the dashboard.
```
INSERT INTO registered_users (id, createdAt, updatedAt, sub, email, userRole) VALUES('70b77906-ec13-4b79-8a55-f8d33429dfce', now(), now(), 'auth0|auth0_user_id', 'your@email.com', 1);
```
## Setting up the environment for development
Follow the instruccions located in [Frontend](https://github.com/carlesfelix/2mixxx/tree/main/frontend) (client side) and [Backend](https://github.com/carlesfelix/2mixxx/tree/main/backend) (server side).
| 43.847458 | 203 | 0.795129 | eng_Latn | 0.284768 |
---
title: Enable ADMX-backed policies in MDM
description: Guide to configuring ADMX-backed policies in MDM
ms.author: dansimp
ms.topic: article
ms.prod: w10
ms.technology: windows
author: manikadhiman
ms.date: 11/01/2017
ms.reviewer:
manager: dansimp
---
# Enable ADMX-backed policies in MDM
This is a step-by-step guide to configuring ADMX-backed policies in MDM.
Starting in Windows 10 version 1703, Mobile Device Management (MDM) policy configuration support was expanded to allow access to select Group Policy administrative templates (ADMX-backed policies) for Windows PCs via the [Policy configuration service provider (CSP)](policy-configuration-service-provider.md). Configuring ADMX-backed policies in Policy CSP is different from the typical way you configure a traditional MDM policy.
Summary of steps to enable a policy:
- Find the policy in the list of ADMX-backed policies.
- Find the Group Policy related information from the MDM policy description.
- Use the Group Policy Editor to determine whether there are parameters necessary to enable the policy.
- Create the data payload for the SyncML.
See [Support Tip: Ingesting Office ADMX-backed policies using Microsoft Intune](https://techcommunity.microsoft.com/t5/Intune-Customer-Success/Support-Tip-Ingesting-Office-ADMX-Backed-policies-using/ba-p/354824) for a walk-through using Intune.
>[!TIP]
>Intune has added a number of ADMX-backed administrative templates in public preview. Check if the policy settings you need are available in a template before using the SyncML method described below. [Learn more about Intune's administrative templates.](https://docs.microsoft.com/intune/administrative-templates-windows)
## Enable a policy
1. Find the policy in the list of [ADMX-backed policies](policy-configuration-service-provider.md#admx-backed-policies). You need the following information, which is listed in the policy description.
- GP English name
- GP name
- GP ADMX file name
- GP path
2. Use the Group Policy Editor to determine whether you need additional information to enable the policy. Run GPEdit.msc
1. Click **Start**, then in the text box type **gpedit**.
2. Under **Best match**, click **Edit group policy** to launch it.

3. In **Local Computer Policy** navigate to the policy you want to configure.
In this example, navigate to **Administrative Templates > System > App-V**.

4. Double-click **Enable App-V Client**.
   The **Options** section is empty, which means there are no parameters necessary to enable the policy. If the **Options** section is not empty, follow the procedure in [Enable a policy that requires parameters](#enable-a-policy-that-requires-parameters).

3. Create the SyncML to enable the policy that does not require any parameter.
In this example you configure **Enable App-V Client** to **Enabled**.
> [!NOTE]
> The \<Data> payload must be XML encoded. To avoid encoding, you can use CData if your MDM supports it. For more information, see [CDATA Sections](http://www.w3.org/TR/REC-xml/#sec-cdata-sect). If you are using Intune, select String as the data type.
```xml
<SyncML xmlns="SYNCML:SYNCML1.2">
<SyncBody>
<Replace>
<CmdID>2</CmdID>
<Item>
<Meta>
<Format>chr</Format>
<Type>text/plain</Type>
</Meta>
<Target>
<LocURI>./Device/Vendor/MSFT/Policy/Config/AppVirtualization/AllowAppVClient </LocURI>
</Target>
<Data><Enabled/></Data>
</Item>
</Replace>
<Final/>
</SyncBody>
</SyncML>
```
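As the note above says, the \<Data> payload must be XML encoded unless your MDM supports CData. If you generate SyncML programmatically, a standard XML-escaping routine is all that is needed. Here is a minimal Python sketch (the `encode_payload` helper is illustrative, not part of any MDM SDK):

```python
from xml.sax.saxutils import escape

def encode_payload(payload):
    """XML-encode a policy payload (&, < and > become entities) so it can
    sit inside a SyncML <Data> element without a CDATA wrapper."""
    return escape(payload)

print(encode_payload("<enabled/>"))  # &lt;enabled/&gt;
```

Passing `<enabled/>` through the helper yields `&lt;enabled/&gt;`, which is safe to place inside the `<Data>` element.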
## Enable a policy that requires parameters
1. Create the SyncML to enable the policy that requires parameters.
In this example, the policy is in **Administrative Templates > System > App-V > Publishing**.
1. Double-click **Publishing Server 2 Settings** to see the parameters you need to configure when you enable this policy.


2. Find the variable names of the parameters in the ADMX file.
You can find the ADMX file name in the policy description in Policy CSP. In this example, the filename appv.admx is listed in [AppVirtualization/PublishingAllowServer2](policy-configuration-service-provider.md#appvirtualization-publishingallowserver2).

3. Navigate to **C:\Windows\PolicyDefinitions** (default location of the admx files) and open appv.admx.
4. Search for GP name **Publishing_Server2_policy**.
   5. Under **policy name="Publishing_Server2_Policy"** you can see the \<elements> listed. The text id and enum id attributes represent the data ids you need to include in the SyncML data payload. They correspond to the fields you see in the GP Editor.
Here is the snippet from appv.admx:
```xml
<!-- Publishing Server 2 -->
<policy name="Publishing_Server2_Policy" class="Machine" displayName="$(string.PublishingServer2)"
explainText="$(string.Publishing_Server_Help)" presentation="$(presentation.Publishing_Server2)"
key="SOFTWARE\Policies\Microsoft\AppV\Client\Publishing\Servers\2">
<parentCategory ref="CAT_Publishing" />
<supportedOn ref="windows:SUPPORTED_Windows7" />
<elements>
<text id="Publishing_Server2_Name_Prompt" valueName="Name" required="true"/>
<text id="Publishing_Server_URL_Prompt" valueName="URL" required="true"/>
<enum id="Global_Publishing_Refresh_Options" valueName="GlobalEnabled">
<item displayName="$(string.False)">
<value>
<decimal value="0"/>
</value>
</item>
<item displayName="$(string.True)">
<value>
<decimal value="1"/>
</value>
</item>
</enum>
<enum id="Global_Refresh_OnLogon_Options" valueName="GlobalLogonRefresh">
<item displayName="$(string.False)">
<value>
<decimal value="0"/>
</value>
</item>
<item displayName="$(string.True)">
<value>
<decimal value="1"/>
</value>
</item>
</enum>
<decimal id="Global_Refresh_Interval_Prompt" valueName="GlobalPeriodicRefreshInterval" minValue="0" maxValue="31"/>
<enum id="Global_Refresh_Unit_Options" valueName="GlobalPeriodicRefreshIntervalUnit">
<item displayName="$(string.Hour)">
<value>
<decimal value="0"/>
</value>
</item>
<item displayName="$(string.Day)">
<value>
<decimal value="1"/>
</value>
</item>
</enum>
<enum id="User_Publishing_Refresh_Options" valueName="UserEnabled">
<item displayName="$(string.False)">
<value>
<decimal value="0"/>
</value>
</item>
<item displayName="$(string.True)">
<value>
<decimal value="1"/>
</value>
</item>
</enum>
<enum id="User_Refresh_OnLogon_Options" valueName="UserLogonRefresh">
<item displayName="$(string.False)">
<value>
<decimal value="0"/>
</value>
</item>
<item displayName="$(string.True)">
<value>
<decimal value="1"/>
</value>
</item>
</enum>
<decimal id="User_Refresh_Interval_Prompt" valueName="UserPeriodicRefreshInterval" minValue="0" maxValue="31"/>
<enum id="User_Refresh_Unit_Options" valueName="UserPeriodicRefreshIntervalUnit">
<item displayName="$(string.Hour)">
<value>
<decimal value="0"/>
</value>
</item>
<item displayName="$(string.Day)">
<value>
<decimal value="1"/>
</value>
</item>
</enum>
</elements>
</policy>
```
   6. From the \<elements> tag, copy each text id and enum id and create an XML fragment with data id and value fields. The value field contains the configuration settings you would enter in the GP Editor.
      Here is the example XML for Publishing_Server2_Policy:
```xml
<data id="Publishing_Server2_Name_Prompt" value="Name"/>
<data id="Publishing_Server_URL_Prompt" value="http://someuri"/>
<data id="Global_Publishing_Refresh_Options" value="1"/>
<data id="Global_Refresh_OnLogon_Options" value="0"/>
<data id="Global_Refresh_Interval_Prompt" value="15"/>
<data id="Global_Refresh_Unit_Options" value="0"/>
<data id="User_Publishing_Refresh_Options" value="0"/>
<data id="User_Refresh_OnLogon_Options" value="0"/>
<data id="User_Refresh_Interval_Prompt" value="15"/>
<data id="User_Refresh_Unit_Options" value="1"/>
```
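If you are generating this fragment programmatically, the pairs can be assembled with a few lines of code. The sketch below is an illustration (the `build_admx_payload` function is my own naming, not part of any Microsoft tooling); it produces the `<enabled/>` element plus one `<data/>` element per pair, ready to wrap in a CDATA section:

```python
from xml.sax.saxutils import quoteattr

def build_admx_payload(pairs):
    """Return the <enabled/> element followed by one <data/> element per
    (data id, value) pair, as an ADMX-backed policy payload expects."""
    parts = ["<enabled/>"]
    for data_id, value in pairs:
        # quoteattr adds the surrounding quotes and escapes special characters
        parts.append(f"<data id={quoteattr(data_id)} value={quoteattr(str(value))}/>")
    return "".join(parts)

payload = build_admx_payload([
    ("Publishing_Server2_Name_Prompt", "Name"),
    ("Global_Publishing_Refresh_Options", 1),
])
print(f"<![CDATA[{payload}]]>")
```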
7. Create the SyncML to enable the policy. Payload contains \<enabled/> and name/value pairs.
Here is the example for **AppVirtualization/PublishingAllowServer2**:
> [!NOTE]
> The \<Data> payload must be XML encoded. To avoid encoding, you can use CData if your MDM supports it. For more information, see [CDATA Sections](http://www.w3.org/TR/REC-xml/#sec-cdata-sect). If you are using Intune, select String as the data type.
```xml
<?xml version="1.0" encoding="utf-8"?>
<SyncML xmlns="SYNCML:SYNCML1.2">
<SyncBody>
<Replace>
<CmdID>2</CmdID>
<Item>
<Meta>
<Format>chr</Format>
<Type>text/plain</Type>
</Meta>
<Target>
<LocURI>./Device/Vendor/MSFT/Policy/Config/AppVirtualization/PublishingAllowServer2</LocURI>
</Target>
<Data>
<![CDATA[<enabled/><data id="Publishing_Server2_Name_Prompt" value="name prompt"/><data
id="Publishing_Server_URL_Prompt" value="URL prompt"/><data
id="Global_Publishing_Refresh_Options" value="1"/><data
id="Global_Refresh_OnLogon_Options" value="0"/><data
id="Global_Refresh_Interval_Prompt" value="15"/><data
id="Global_Refresh_Unit_Options" value="0"/><data
id="User_Publishing_Refresh_Options" value="0"/><data
id="User_Refresh_OnLogon_Options" value="0"/><data
id="User_Refresh_Interval_Prompt" value="15"/><data
id="User_Refresh_Unit_Options" value="1"/>]]>
</Data>
</Item>
</Replace>
<Final/>
</SyncBody>
</SyncML>
```
## Disable a policy
The \<Data> payload is \<disabled/>. Here is an example to disable AppVirtualization/PublishingAllowServer2.
```xml
<SyncML xmlns="SYNCML:SYNCML1.2">
<SyncBody>
<Replace>
<CmdID>2</CmdID>
<Item>
<Meta>
<Format>chr</Format>
<Type>text/plain</Type>
</Meta>
<Target>
<LocURI>./Device/Vendor/MSFT/Policy/Config/AppVirtualization/PublishingAllowServer2</LocURI>
</Target>
<Data><disabled/></Data>
</Item>
</Replace>
<Final/>
</SyncBody>
</SyncML>
```
## Setting a policy to not configured
The \<Data> payload is empty. Here an example to set AppVirtualization/PublishingAllowServer2 to "Not Configured."
```xml
<?xml version="1.0" encoding="utf-8"?>
<SyncML xmlns="SYNCML:SYNCML1.2">
<SyncBody>
<Delete>
<CmdID>1</CmdID>
<Item>
<Target>
<LocURI>./Device/Vendor/MSFT/Policy/Config/AppVirtualization/PublishingAllowServer2</LocURI>
</Target>
</Item>
</Delete>
<Final/>
</SyncBody>
</SyncML>
```
| 38.983819 | 431 | 0.645609 | eng_Latn | 0.565085 |
f97a77d120141aaaefc666413a660ca5701d0ad7 | 67 | md | Markdown | README.md | akc-code/chrome-extension-ci | 3745564a3841fb483b69fc0e156797615ff07191 | [
"MIT"
] | null | null | null | README.md | akc-code/chrome-extension-ci | 3745564a3841fb483b69fc0e156797615ff07191 | [
"MIT"
] | null | null | null | README.md | akc-code/chrome-extension-ci | 3745564a3841fb483b69fc0e156797615ff07191 | [
"MIT"
] | null | null | null | # chrome-extension-ci
Setting up CI for a sample chrome extension.
| 22.333333 | 44 | 0.791045 | eng_Latn | 0.91538 |
f97bcedf180d61cbd197d3bac5cfcc4d301b30f0 | 1,880 | md | Markdown | _posts/2020-12-27-Basic-Conquest-3321.md | hython37/hython37.github.io | db11dc22cb6947c49d3a5bba22585c1b3da972df | [
"MIT"
] | null | null | null | _posts/2020-12-27-Basic-Conquest-3321.md | hython37/hython37.github.io | db11dc22cb6947c49d3a5bba22585c1b3da972df | [
"MIT"
] | null | null | null | _posts/2020-12-27-Basic-Conquest-3321.md | hython37/hython37.github.io | db11dc22cb6947c49d3a5bba22585c1b3da972df | [
"MIT"
] | null | null | null | ---
title: "기초정복, 3321"
excerpt: "최고의 피자"
categories:
- Algorithm
tags:
- study
- greedy
- python
toc: true
toc_label: "Algorithm of Python"
---
## 3321: The Best Pizza (최고의 피자)
[Open the problem on CodeUp](https://codeup.kr/problem.php?id=3321){: target="_blank"}
**Problem description**
Mr. vega is a regular customer of the Miss Pizza shop.
Starting this month, he has begun living frugally.
So, among all the pizzas he can order at the shop, he wants to order the one whose calories per dollar are highest.
Let us call such a pizza the "best pizza."
There may be more than one "best pizza."
At Miss Pizza, you can freely pick any subset of the N kinds of toppings and have them put on the dough.
The same topping cannot be used more than once.
A pizza with no toppings at all can also be ordered.
The dough costs A dollars, and every topping costs B dollars.
The actual price of a pizza is the price of the dough plus the prices of its toppings.
That is, a pizza with k toppings (0 ≦ k ≦ N) costs A + k × B dollars.
The total calories of a pizza are the calories of the dough plus the calories of its toppings.
Given the price of the dough, the price of a topping, and the calorie values of the dough and of each topping, write a program that computes the calories per dollar of the "best pizza."
**Input**
The first line contains a single integer N (1 ≦ N ≦ 100), the number of kinds of toppings.
The second line contains two space-separated integers A and B (1 ≦ A ≦ 1000, 1 ≦ B ≦ 1000): A is the price of the dough and B is the price of a topping.
The third line contains an integer C (1 ≦ C ≦ 10000), the calories of the dough.
Line 3 + i (1 ≦ i ≦ N) contains an integer Di (1 ≦ Di ≦ 10,000), the calories of the i-th topping.
**Output**
Print the calories per dollar of the "best pizza" as an integer, discarding everything after the decimal point.
**Example input**
3
12 2
200
50
300
100
**Example output**
37
### Approach
My first idea was to focus naively on cost-effectiveness: add any topping whose own ratio beats C/A, the calories-per-dollar of the plain pizza. While working out the average, though, I realized this is flawed: the dough and topping prices differ, and the number of toppings changes, so the denominator keeps shifting.
So I changed my approach. Since A (dough price), B (topping price), and C (dough calories) are fixed, only the topping calories vary, so the toppings should be added in decreasing order of calories while recomputing the calories per dollar after each addition. Processing the toppings from most to least caloric keeps the running calories-per-dollar value at its maximum as it is updated.
If adding a topping would lower the calories per dollar, that topping is not included.
```python
N = int(input())                      # number of kinds of toppings
A, B = map(int, input().split())      # A: dough price, B: price per topping
C = int(input())                      # dough calories
D = [int(input()) for i in range(N)]  # calories of each topping
D.sort(reverse=True)                  # consider the most caloric toppings first
Cal = C/A                             # best calories-per-dollar so far (plain pizza)
calorie = C
price = A
for i in range(N):
    # skip a topping if adding it would lower the calories-per-dollar ratio
    if (calorie + D[i])/(price + B) < Cal:
        continue
    calorie += D[i]
    price += B
    Cal = calorie/price
print(int(Cal))                       # truncate to an integer
```
``` | 19.381443 | 173 | 0.616489 | kor_Hang | 1.00001 |
f97bd9ed9a8f55229f48f802da38cd14fac50899 | 1,373 | md | Markdown | AlchemyInsights/commercial-dialogues/assign-ediscovery-administrator-permissions.md | isabella232/OfficeDocs-AlchemyInsights-pr.fr-FR | b23fe97cbba1674ad1f59978ca5080bb00d217cb | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-19T19:06:33.000Z | 2020-05-19T19:06:33.000Z | AlchemyInsights/commercial-dialogues/assign-ediscovery-administrator-permissions.md | isabella232/OfficeDocs-AlchemyInsights-pr.fr-FR | b23fe97cbba1674ad1f59978ca5080bb00d217cb | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-02-09T06:56:37.000Z | 2022-02-09T06:56:51.000Z | AlchemyInsights/commercial-dialogues/assign-ediscovery-administrator-permissions.md | isabella232/OfficeDocs-AlchemyInsights-pr.fr-FR | b23fe97cbba1674ad1f59978ca5080bb00d217cb | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-10-11T18:36:10.000Z | 2021-10-09T11:34:48.000Z | ---
title: Attribution d’autorisations Administrateur eDiscovery
ms.author: v-smandalika
author: v-smandalika
manager: dansimp
ms.date: 02/19/2021
audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Priority
ms.collection: Adm_O365
ms.custom:
- "7363"
- "9000722"
ms.openlocfilehash: b43a677377e11ce92e461532e54a3e4247de6a4e1924def11a14f4956b3d8de8
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 08/05/2021
ms.locfileid: "54005844"
---
# <a name="assign-ediscovery-administrator-permissions"></a>Assign eDiscovery Administrator permissions
To assign eDiscovery Administrator permissions, follow these steps:
1. In the [Office 365 Security & Compliance Center](https://sip.protection.office.com/), select **Permissions > eDiscovery Administrator**.
2. In the **eDiscovery Manager** pane, next to **eDiscovery Administrator**, select **Edit**.
3. In the **Edit Choose eDiscovery Administrator** pane, click **Choose eDiscovery Administrator**.
4. In the **Choose eDiscovery Administrator** pane, click **+ Add**, and then select the names of the users.
5. Select **Add > Done > Save**.
f97bdfdbd32577c2644c0f5924373190e809a7fc | 3,786 | md | Markdown | articles/rest-api/bot-framework-rest-direct-line-1-1-receive-messages.md | Miguel-byte/bot-docs.es-es | 73160a62ecf9178cb46530674a28db607fef434d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/rest-api/bot-framework-rest-direct-line-1-1-receive-messages.md | Miguel-byte/bot-docs.es-es | 73160a62ecf9178cb46530674a28db607fef434d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/rest-api/bot-framework-rest-direct-line-1-1-receive-messages.md | Miguel-byte/bot-docs.es-es | 73160a62ecf9178cb46530674a28db607fef434d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Recepción de mensajes del bot | Microsoft Docs
description: Aprenda a recibir mensajes del bot mediante Direct Line API v1.1.
author: RobStand
ms.author: kamrani
manager: kamrani
ms.topic: article
ms.service: bot-service
ms.date: 12/13/2017
ms.openlocfilehash: 6f9a132b538a278b0990271864a70e77ea7dc56c
ms.sourcegitcommit: a6d02ec4738e7fc90b7108934740e9077667f3c5
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 09/04/2019
ms.locfileid: "70299624"
---
# <a name="receive-messages-from-the-bot"></a>Receive messages from the bot
> [!IMPORTANT]
> This article describes how to receive messages from the bot by using Direct Line API v1.1. If you are creating a new connection between your client application and the bot, use [Direct Line API 3.0](bot-framework-rest-direct-line-3-0-receive-activities.md) instead.
Using the Direct Line 1.1 protocol, clients must poll an `HTTP GET` interface to receive messages.
## <a name="retrieve-messages-with-http-get"></a>Retrieve messages with HTTP GET
To retrieve messages for a specific conversation, issue a `GET` request to the `api/conversations/{conversationId}/messages` endpoint, optionally specifying the `watermark` parameter to indicate the latest message seen by the client. A `watermark` value will be returned in the JSON response even if no messages are included.
The following snippets provide an example of the Get Messages request and response. The Get Messages response contains `watermark` as a property of the [MessageSet](bot-framework-rest-direct-line-1-1-api-reference.md#messageset-object) object. Clients should page through the available messages by advancing the `watermark` value until no messages are returned.
### <a name="request"></a>Request
```http
GET https://directline.botframework.com/api/conversations/abc123/messages?watermark=0001a-94
Authorization: Bearer RCurR_XV9ZA.cwA.BKA.iaJrC8xpy8qbOF5xnR2vtCX7CZj0LdjAPGfiCpg4Fv0
```
### <a name="response"></a>Response
```http
HTTP/1.1 200 OK
[other headers]
```
```json
{
"messages": [
{
"conversation": "abc123",
"id": "abc123|0000",
"text": "hello",
"from": "user1"
},
{
"conversation": "abc123",
"id": "abc123|0001",
"text": "Nice to see you, user1!",
"from": "bot1"
}
],
"watermark": "0001a-95"
}
```
## <a name="timing-considerations"></a>Timing considerations
Although Direct Line is a multi-part protocol with potential timing gaps, the protocol and service are designed to make it easy to build a reliable client. The `watermark` property sent in the Get Messages response is reliable. A client will not miss any messages as long as it replays the watermark verbatim.
Clients should choose a polling interval that matches their intended use.
- Service-to-service applications often use a polling interval of 5 or 10 seconds.
- Client-facing applications often use a polling interval of 1 second, and issue an additional request ~300 ms after every message the client sends (to rapidly pick up a bot's response). This 300 ms delay should be tuned based on transit time and the bot's speed.
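The polling loop itself is simple to sketch. The fragment below is a hedged illustration, not official SDK code: `fetch_messages` stands in for whatever HTTP client you use to call `GET api/conversations/{conversationId}/messages`, and the loop simply replays the watermark it gets back:

```python
import time

def poll_messages(fetch_messages, watermark=None, interval=1.0, max_polls=None):
    """Generic Direct Line 1.1 polling loop.

    fetch_messages(watermark) must return a dict shaped like the Get
    Messages response: {"messages": [...], "watermark": "..."}.
    """
    polls = 0
    while max_polls is None or polls < max_polls:
        response = fetch_messages(watermark)
        for message in response.get("messages", []):
            yield message
        # Replay the returned watermark verbatim so no message is missed.
        watermark = response.get("watermark", watermark)
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(interval)

# Demo with a fake transport in place of a real HTTP client (no network):
pages = iter([
    {"messages": [{"id": "abc123|0000", "text": "hello"}], "watermark": "0001a-94"},
    {"messages": [], "watermark": "0001a-94"},
])
received = list(poll_messages(lambda wm: next(pages), interval=0, max_polls=2))
print([m["id"] for m in received])  # ['abc123|0000']
```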
## <a name="additional-resources"></a>Additional resources
- [Key concepts](bot-framework-rest-direct-line-1-1-concepts.md)
- [Authentication](bot-framework-rest-direct-line-1-1-authentication.md)
- [Start a conversation](bot-framework-rest-direct-line-1-1-start-conversation.md)
- [Send a message to the bot](bot-framework-rest-direct-line-1-1-send-message.md)
f97d00b971b052019ab5c485f4155bddc1e42bab | 4,444 | md | Markdown | _posts/2014-05-02-multiple-uncertainty-notes.md | kubke/labnotebook | ebada896118f21a8d63e9b584e2d1424a5d3a6bd | [
"CC0-1.0"
] | null | null | null | _posts/2014-05-02-multiple-uncertainty-notes.md | kubke/labnotebook | ebada896118f21a8d63e9b584e2d1424a5d3a6bd | [
"CC0-1.0"
] | null | null | null | _posts/2014-05-02-multiple-uncertainty-notes.md | kubke/labnotebook | ebada896118f21a8d63e9b584e2d1424a5d3a6bd | [
"CC0-1.0"
] | null | null | null | ---
layout: post
category: multiple_uncertainty
---
### Logistic recruitment, uniform noise
(Sethi Fig 3 configuration)

### Logistic recruitment, lognormal noise

### Beverton-Holt recruitment, uniform noise

### Beverton-Holt recruitment, lognormal noise

### Logistic recruitment, uniform noise, small r

### Logistic recruitment, lognormal noise, small r

## Metadata
From [scenarios_meta.txt](https://github.com/cboettig/multiple_uncertainty/blob/25cc46841908c6ca4da92b8866b836bfeddd707f/inst/matlab/scenarios_meta.txt)
```yaml
Columns:
y_grid: The observed stock size
escapement: The optimal escapement policy. (y_grid value minus the escapement is the quota).
sigma_g: The growth noise scaling factor
sigma_m: The measurement error scaling factor (between observed stock x and measured stock y)
sigma_i: The implementation noise scaling factor (between quota q set and harvest h realized).
r: The growth rate parameter of the recruitment function
K: The carrying capacity parameter of the recruitment function
recruitment:
1: Logistic, x+r*x.*(1-x/K)
2: Ricker, (1+r)*x.*exp(-(log(1+r)/K)*x)
3: Beverton-Holt, (1+r)*x./(1+r/K*x)
noise:
1: uniform noise distribution
2: lognormal noise distribution
id: a unique id number for the scenario (for subsetting and plotting)
Constants:
delta:
value: 0.05
description: Discount rate
Tmax:
value: 10
description: number of years (iterations) over which policy is optimized
x_grid:
value: linspace(0,200,201)
description: True stock size
y_grid:
value: linspace(0,200,201)
description: Observed stock size
h_grid:
value: linspace(0,120,121)
description: Implemented harvest
q_grid:
value: linspace(0,120,121)
description: Target harvest quota
Notes:
See the scenarios.m file to confirm values of constants used.
```
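For reference outside Matlab, the three recruitment functions listed in the metadata can be sketched directly in Python (a quick translation of the formulas above, nothing more); note that all three leave the stock unchanged at the carrying capacity K:

```python
import math

def logistic(x, r, K):
    return x + r * x * (1 - x / K)

def ricker(x, r, K):
    return (1 + r) * x * math.exp(-(math.log(1 + r) / K) * x)

def beverton_holt(x, r, K):
    return (1 + r) * x / (1 + (r / K) * x)

# Each recruitment function has a fixed point at x = K:
for f in (logistic, ricker, beverton_holt):
    print(f.__name__, round(f(100.0, 0.5, 100.0), 6))  # each prints 100.0
```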
## Source code / version archives
- Data for all the plots shown appears in [scenarios.csv](https://github.com/cboettig/multiple_uncertainty/blob/4c71612448eb6f33378235e786d149abc8ce0074/inst/matlab/scenarios.csv). Because my collaborators have me using Matlab for this, and Matlab (afaik) can neither write headers to csv files, append to csv files, nor conveniently export mixed non-numeric and numeric data (I write a matrix to the csv file), you'll just have to consult [scenarios_meta.txt](https://github.com/cboettig/multiple_uncertainty/blob/25cc46841908c6ca4da92b8866b836bfeddd707f/inst/matlab/scenarios_meta.txt) for the relevant information.
- All results shown here correspond to files that can be found in [Commit 4c71612](https://github.com/cboettig/multiple_uncertainty/tree/4c71612448eb6f33378235e786d149abc8ce0074/inst/matlab), including the data file. The SVG plots are hosted through the gh-pages branch of the repo, and thus show the most recent versions of those plots. The names have been tagged with the commit hash to avoid accidentally overwriting when programmatically generating images. (Clearly that's not ideal, but I cannot link directly to a version and have it render.)
- Results generated by running [scenarios.m](https://github.com/cboettig/multiple_uncertainty/blob/4c71612448eb6f33378235e786d149abc8ce0074/inst/matlab/scenarios.m)
__Functions__
- The main routine is (still) in [multiple_uncertainty.m](https://github.com/cboettig/multiple_uncertainty/blob/4c71612448eb6f33378235e786d149abc8ce0074/inst/matlab/multiple_uncertainty.m)
- The scenario template over which we loop is defined as a separate function in [scenario.m](https://github.com/cboettig/multiple_uncertainty/blob/4c71612448eb6f33378235e786d149abc8ce0074/inst/matlab/scenario.m)
- scenarios.m also calls the script [plot_scenarios.m](https://github.com/cboettig/multiple_uncertainty/blob/4c71612448eb6f33378235e786d149abc8ce0074/inst/matlab/plot_scenarios.m), which can be run to generate the plots from scenarios.csv directly without first regenerating the results data.
| 44.888889 | 612 | 0.786229 | eng_Latn | 0.826074 |
f97d7613cf1714fe6d0a4852dc476e763003d268 | 4,565 | md | Markdown | 2021/materials/istanbul/Pre-SICSS Tasks/Introduction.md | meghaarora42/summer-institute | 396a93ee5e999c3be5c4c212953eb8ddfd8ae7cd | [
"MIT"
] | 264 | 2017-02-01T16:50:02.000Z | 2022-03-30T20:36:20.000Z | 2021/materials/istanbul/Pre-SICSS Tasks/Introduction.md | meghaarora42/summer-institute | 396a93ee5e999c3be5c4c212953eb8ddfd8ae7cd | [
"MIT"
] | 103 | 2017-03-31T19:32:05.000Z | 2022-02-26T03:24:55.000Z | 2021/materials/istanbul/Pre-SICSS Tasks/Introduction.md | meghaarora42/summer-institute | 396a93ee5e999c3be5c4c212953eb8ddfd8ae7cd | [
"MIT"
] | 199 | 2017-06-19T15:04:00.000Z | 2022-03-18T13:21:14.000Z | # SICSS-Istanbul 2021 Pre-SICSS Tasks
## INTRODUCTION
Pre-SICSS tasks require basic knowledge of R. Therefore, we assume that you have finished the [SICSS Boot Camp](https://sicss.io/boot_camp/), an online
training program designed to give you beginner-level coding skills. The videos and materials are designed for complete beginners and are best viewed as
a sequence, since each video builds upon content introduced in previous tutorials. After you complete the tutorials, you are ready for the [SICSS Lecture
Materials](https://sicss.io/curriculum), which provide state-of-the-art training in a range of areas in computational social science, from ethics to text analysis and mass collaboration. You can find videos, slides, code, and teaching exercises. These lectures also assume a basic, working knowledge of the R language.
Our Pre-SICSS tasks give you practical experience with Day 2 (Collecting Digital Trace Data) and Day 3 (Automated Text Analysis) of the
[SICSS Lecture Materials](https://sicss.io/curriculum). We extend the Day 2 and Day 3 exercises and provide more time to finish them
individually (or in a group setting), given the different learning curves in our cohort. We believe that learning to code is an individual process, but some
participants may benefit from solving the tasks in a group setting. Therefore, we leave the decision of doing the tasks individually or in a group to you.
To help you in this process, [Ahmet Kurnaz](https://sicss2021.slack.com/archives/D0248VAJCRE) will organize code walkthrough sessions for each task, so
participants will have a chance to replicate the code on
their own machines and build coding experience.
Additionally, you can request a one-to-one office hour from [Emre Tapan](https://sicss2021.slack.com/archives/D01GW4RAZ9B#) when you get stuck and need support.
We strongly suggest that when you face a bug or
difficulty, instead of immediately asking for help, you first try to solve the problem by searching [Stack Overflow](https://stackoverflow.com/questions/tagged/r)!
Here are some helpful links you may want to consult while solving our tasks:
- [R Studio Education](https://education.rstudio.com/) is a great place to start to learn R. No one is born a data scientist. Every person who works with R today was once a complete beginner. No matter how much you know about the R ecosystem already, you’ll always have more to learn.
- [SICSS Boot Camp](https://sicss.io/boot_camp/) is an online training program designed to provide you with beginner level skills in coding so that you can follow the more advanced curriculum we teach at the partner locations of the Summer Institutes in Computational Social Science. The videos and materials are designed for complete beginners and are best viewed as a sequence since each video builds upon content introduced in previous tutorials.
- [SICSS Lecture Materials](https://sicss.io/curriculum) provides state-of-the art training in a range of different areas in computational social science from ethics to text analysis and mass collaboration. You can find videos, slides, code, and teaching exercises. These lectures assume a basic, working knowledge of the R language.
- [Alternative Curriculum](https://github.com/compsocialscience/summer-institute/blob/master/_data/alternative_curriculum.md) is produced by organizers of SICSS partner sites to serve the needs of different audiences.
## Data Project Steps
We thought that completing a whole data project before SICSS would make you ready for group work. Therefore, our Pre-SICSS tasks will combine a whole data project.
Every data project generally has four critical milestones. These are:
• Data collection (using APIs or scraping)
• Data cleaning
• Data analysis
• Reporting
We created 5 tasks for you. First, you will collect data from the web. You may choose a website or a social media platform as a data source. Depending on your
choice, you may want to use APIs or screen-scraping methods. In the first week, you will get hands-on exercise with both of these methods. In the second week, you
are expected to clean, manipulate, and report the data you obtained from the web.
### NOTICE: All of the recommendations are based on our experience; therefore, they can be highly subjective. We do not argue that the packages and approaches we suggest in this document are the best ones. However, we have used some of them for a long time, and they have been handy.
| 93.163265 | 449 | 0.794524 | eng_Latn | 0.999116 |
f97ea4791355462649a3be3b40aff17f55b4d4d3 | 1,535 | md | Markdown | docs/build/reference/pdbpath.md | bobbrow/cpp-docs | 769b186399141c4ea93400863a7d8463987bf667 | [
"CC-BY-4.0",
"MIT"
] | 965 | 2017-06-25T23:57:11.000Z | 2022-03-31T14:17:32.000Z | docs/build/reference/pdbpath.md | bobbrow/cpp-docs | 769b186399141c4ea93400863a7d8463987bf667 | [
"CC-BY-4.0",
"MIT"
] | 3,272 | 2017-06-24T00:26:34.000Z | 2022-03-31T22:14:07.000Z | docs/build/reference/pdbpath.md | bobbrow/cpp-docs | 769b186399141c4ea93400863a7d8463987bf667 | [
"CC-BY-4.0",
"MIT"
] | 951 | 2017-06-25T12:36:14.000Z | 2022-03-26T22:49:06.000Z | ---
description: "Learn more about: /PDBPATH"
title: "/PDBPATH"
ms.date: "11/04/2016"
f1_keywords: ["/pdbpath"]
helpviewer_keywords: [".pdb files, path", "-PDBPATH dumpbin option", "/PDBPATH dumpbin option", "PDBPATH dumpbin option", "PDB files, path"]
ms.assetid: ccf67dcd-0b23-4250-ad47-06c48acbe82b
---
# /PDBPATH
```
/PDBPATH[:VERBOSE] filename
```
### Parameters
*filename*<br/>
The name of the .dll or .exe file for which you want to find the matching .pdb file.
**:VERBOSE**<br/>
(Optional) Reports all directories where an attempt was made to locate the .pdb file.
## Remarks
/PDBPATH will search your computer along the same paths that the debugger would search for a .pdb file and will report which, if any, .pdb files correspond to the file specified in *filename*.
When using the Visual Studio debugger, you may experience a problem due to the fact that the debugger is using a .pdb file for a different version of the file you are debugging.
/PDBPATH will search for .pdb files along the following paths:
- Check the location where the executable resides.
- Check the location of the PDB written into the executable. This is usually the location at the time the image was linked.
- Check along the search path configured in the Visual Studio IDE.
- Check along the paths in the _NT_SYMBOL_PATH and _NT_ALT_SYMBOL_PATH environment variables.
- Check in the Windows directory.
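That search order can be expressed as a small helper for scripting or testing. The sketch below is purely illustrative (the function and its parameters are my own invention, not part of DUMPBIN or any Microsoft tool); it only assembles candidate directories in the documented order:

```python
import ntpath
import os

def pdb_search_dirs(image_path, embedded_pdb_path=None, ide_search_paths=(),
                    env=None, windows_dir="C:\\Windows"):
    """Candidate directories for a matching .pdb file, in the order the
    /PDBPATH documentation describes."""
    if env is None:
        env = os.environ
    dirs = [ntpath.dirname(image_path)]                 # 1. where the executable resides
    if embedded_pdb_path:
        dirs.append(ntpath.dirname(embedded_pdb_path))  # 2. PDB path written into the image
    dirs.extend(ide_search_paths)                       # 3. IDE-configured search path
    for var in ("_NT_SYMBOL_PATH", "_NT_ALT_SYMBOL_PATH"):
        value = env.get(var)                            # 4. symbol-path environment variables
        if value:
            dirs.extend(p for p in value.split(";") if p)
    dirs.append(windows_dir)                            # 5. the Windows directory
    return dirs

print(pdb_search_dirs("C:\\app\\main.exe",
                      embedded_pdb_path="C:\\build\\main.pdb",
                      env={"_NT_SYMBOL_PATH": "D:\\symbols"}))
# ['C:\\app', 'C:\\build', 'D:\\symbols', 'C:\\Windows']
```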
## See also
[DUMPBIN Options](dumpbin-options.md)<br/>
[/PDBALTPATH (Use Alternate PDB Path)](pdbaltpath-use-alternate-pdb-path.md)
| 34.111111 | 192 | 0.753746 | eng_Latn | 0.984479 |
"MIT"
] | 11 | 2019-01-16T02:31:04.000Z | 2020-03-17T11:19:35.000Z | notes/array/matrixRepresentation/README.md | theArjun/Java_Notes_With_Codes | 40fc2ec5c3fc529000ffe8525904bdcadaf7058d | [
"MIT"
] | 1 | 2020-02-26T07:58:14.000Z | 2020-02-26T07:58:14.000Z | notes/array/matrixRepresentation/README.md | theArjun/Java_Notes_With_Codes | 40fc2ec5c3fc529000ffe8525904bdcadaf7058d | [
"MIT"
] | 18 | 2019-01-16T02:32:01.000Z | 2020-03-18T18:16:36.000Z | ---
title: Matrix Representation of Arrays
---
# Matrix Representation

This type of array declaration is represented in matrix form. Memory wastage is most probable to happen in this scenario and manually unassigned blocks of memory are automatically assigned to zero.
```java
{% include 'array/matrixRepresentation/matrixRepresentation.java' %}
```
## Conventional Representation
This method of declaring array is often referred as conventional way of declaring array.
```java
{% include 'array/conventional/conventionalWay.java' %}
```
| 34.894737 | 197 | 0.790347 | eng_Latn | 0.990826 |
d5effb4b3a882877baec7da097a55243327aff31 | 1,131 | md | Markdown | README.md | malloryfaria/jest-another-rpg | 785aeadd011a3caf4bbe4d163dbb72bc66b43b3d | [
"MIT"
] | null | null | null | README.md | malloryfaria/jest-another-rpg | 785aeadd011a3caf4bbe4d163dbb72bc66b43b3d | [
"MIT"
] | 5 | 2021-02-12T02:58:14.000Z | 2021-02-14T02:04:10.000Z | README.md | malloryfaria/jest-another-rpg | 785aeadd011a3caf4bbe4d163dbb72bc66b43b3d | [
"MIT"
] | null | null | null | # Jest Another RPG
## Description
A simple RPG game built with Node, Jest and Inquirer
## Build status
Live
## Deployed application
https://github.com/malloryfaria/jest-another-rpg
## Tech/framework used
<b>Built with</b>
- Node
- Inquirer
- Jest
## Code Example
```
class Character {
constructor(name = '') {
this.name = name;
this.health = Math.floor(Math.random() * 10 + 95);
this.strength = Math.floor(Math.random() * 5 + 7);
this.agility = Math.floor(Math.random() * 5 + 7);
}
isAlive() {
if (this.health === 0) {
return false;
}
return true;
}
getHealth() {
return `${this.name}'s health is now ${this.health}!`;
}
getAttackValue() {
const min = this.strength - 5;
const max = this.strength + 5;
return Math.floor(Math.random() * (max - min) + min);
}
reduceHealth(health) {
this.health -= health;
if (this.health < 0) {
this.health = 0;
}
}
}
module.exports = Character;
```
## License
MIT
© [Mallory](https://github.com/malloryfaria)
| 15.929577 | 60 | 0.560566 | eng_Latn | 0.658873 |
d5f0142aa8ac0285a9308748d1cdb855050220f8 | 29,595 | md | Markdown | contributing.md | Pandinosaurus/spark-website | 3ec4093344a022d15341f6f7e1d23ee2a223f01c | [
"Apache-2.0"
] | 1 | 2019-05-31T14:01:39.000Z | 2019-05-31T14:01:39.000Z | contributing.md | Pandinosaurus/spark-website | 3ec4093344a022d15341f6f7e1d23ee2a223f01c | [
"Apache-2.0"
] | null | null | null | contributing.md | Pandinosaurus/spark-website | 3ec4093344a022d15341f6f7e1d23ee2a223f01c | [
"Apache-2.0"
] | null | null | null | ---
layout: global
title: Contributing to Spark
type: "page singular"
navigation:
weight: 5
show: true
---
This guide documents the best way to make various types of contribution to Apache Spark,
including what is required before submitting a code change.
Contributing to Spark doesn't just mean writing code. Helping new users on the mailing list,
testing releases, and improving documentation are also welcome. In fact, proposing significant
code changes usually requires first gaining experience and credibility within the community by
helping in other ways. This is also a guide to becoming an effective contributor.
So, this guide organizes contributions in order that they should probably be considered by new
contributors who intend to get involved long-term. Build some track record of helping others,
rather than just open pull requests.
<h2>Contributing by helping other users</h2>
A great way to contribute to Spark is to help answer user questions on the `user@spark.apache.org`
mailing list or on StackOverflow. There are always many new Spark users; taking a few minutes to
help answer a question is a very valuable community service.
Contributors should subscribe to this list and follow it in order to keep up to date on what's
happening in Spark. Answering questions is an excellent and visible way to help the community,
which also demonstrates your expertise.
See the <a href="{{site.baseurl}}/mailing-lists.html">Mailing Lists guide</a> for guidelines
about how to effectively participate in discussions on the mailing list, as well as forums
like StackOverflow.
<h2>Contributing by testing releases</h2>
Spark's release process is community-oriented, and members of the community can vote on new
releases on the `dev@spark.apache.org` mailing list. Spark users are invited to subscribe to
this list to receive announcements, and test their workloads on newer release and provide
feedback on any performance or correctness issues found in the newer release.
<h2>Contributing by reviewing changes</h2>
Changes to Spark source code are proposed, reviewed and committed via
<a href="https://github.com/apache/spark/pulls">GitHub pull requests</a> (described later).
Anyone can view and comment on active changes here.
Reviewing others' changes is a good way to learn how the change process works and gain exposure
to activity in various parts of the code. You can help by reviewing the changes and asking
questions or pointing out issues -- as simple as typos or small issues of style.
See also <a href="https://spark-prs.appspot.com/">https://spark-prs.appspot.com/</a> for a
convenient way to view and filter open PRs.
<h2>Contributing documentation changes</h2>
To propose a change to _release_ documentation (that is, docs that appear under
<a href="https://spark.apache.org/docs/">https://spark.apache.org/docs/</a>),
edit the Markdown source files in Spark's
<a href="https://github.com/apache/spark/tree/master/docs">`docs/`</a> directory,
whose `README` file shows how to build the documentation locally to test your changes.
The process to propose a doc change is otherwise the same as the process for proposing code
changes below.
To propose a change to the rest of the documentation (that is, docs that do _not_ appear under
<a href="https://spark.apache.org/docs/">https://spark.apache.org/docs/</a>), similarly, edit the Markdown in the
<a href="https://github.com/apache/spark-website">spark-website repository</a> and open a pull request.
<h2>Contributing user libraries to Spark</h2>
Just as Java and Scala applications can access a huge selection of libraries and utilities,
none of which are part of Java or Scala themselves, Spark aims to support a rich ecosystem of
libraries. Many new useful utilities or features belong outside of Spark rather than in the core.
For example: language support probably has to be a part of core Spark, but, useful machine
learning algorithms can happily exist outside of MLlib.
To that end, large and independent new functionality is often rejected for inclusion in Spark
itself, but, can and should be hosted as a separate project and repository, and included in
the <a href="https://spark-packages.org/">spark-packages.org</a> collection.
<h2>Contributing bug reports</h2>
Ideally, bug reports are accompanied by a proposed code change to fix the bug. This isn't
always possible, as those who discover a bug may not have the experience to fix it. A bug
may be reported by creating a JIRA but without creating a pull request (see below).
Bug reports are only useful however if they include enough information to understand, isolate
and ideally reproduce the bug. Simply encountering an error does not mean a bug should be
reported; as below, search JIRA and search and inquire on the Spark user / dev mailing lists
first. Unreproducible bugs, or simple error reports, may be closed.
It's very helpful if the bug report describes how, and by which commit, the bug was introduced, so
that reviewers can easily understand it. It also helps committers decide how far the bug fix
should be backported when the pull request is merged. The pull request to fix the bug should
narrow down the problem to the root cause.
A performance regression is also a kind of bug. A pull request that fixes a performance regression
must provide a benchmark to prove the problem is indeed fixed.
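What such a benchmark looks like depends on the code being changed (Spark carries benchmark suites in its test sources), but the shape of the evidence is always the same: time the old and new code on the same input and report the difference. As a minimal, framework-free illustration (the function names and data below are invented for the example, not Spark code):

```python
import timeit

def slow_distinct(xs):
    # O(n^2): membership test scans a growing list on every element
    out = []
    for x in xs:
        if x not in out:
            out.append(x)
    return out

def fast_distinct(xs):
    # O(n): membership test on a set, preserving first-seen order
    seen = set()
    out = []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

data = list(range(2000)) * 2

# Same input, same result, before/after timing for the PR description
t_slow = timeit.timeit(lambda: slow_distinct(data), number=3)
t_fast = timeit.timeit(lambda: fast_distinct(data), number=3)
print(f"slow: {t_slow:.3f}s  fast: {t_fast:.3f}s  speedup: {t_slow / t_fast:.1f}x")
```

A table of before/after numbers like this, produced on a stated machine and input size, is the kind of evidence reviewers look for in a performance-fix PR.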
Note that, data correctness/data loss bugs are very serious. Make sure the corresponding bug
report JIRA ticket is labeled as `correctness` or `data-loss`. If the bug report doesn't get
enough attention, please send an email to `dev@spark.apache.org` to draw more attention.
It is possible to propose new features as well. These are generally not helpful unless
accompanied by detail, such as a design document and/or code change. Large new contributions
should consider <a href="https://spark-packages.org/">spark-packages.org</a> first (see above),
or be discussed on the mailing
list first. Feature requests may be rejected, or closed after a long period of inactivity.
<h2>Contributing to JIRA maintenance</h2>
Given the sheer volume of issues raised in the Apache Spark JIRA, inevitably some issues are
duplicates, become obsolete, are eventually fixed by other changes, can't be reproduced, or could
benefit from more detail, and so on. It's useful to help identify these issues and resolve them,
either by advancing the discussion or even resolving the JIRA. Most contributors are able to
directly resolve JIRAs. Use judgment in determining whether you are quite confident the issue
should be resolved, although changes can be easily undone. If in doubt, just leave a comment
on the JIRA.
When resolving JIRAs, observe a few useful conventions:
- Resolve as **Fixed** if there's a change you can point to that resolved the issue
- Set Fix Version(s), if and only if the resolution is Fixed
- Set Assignee to the person who most contributed to the resolution, which is usually the person
who opened the PR that resolved the issue.
- In case several people contributed, prefer to assign to the more 'junior', non-committer contributor
- For issues that can't be reproduced against master as reported, resolve as **Cannot Reproduce**
- Fixed is reasonable too, if it's clear what other previous pull request resolved it. Link to it.
- If the issue is the same as or a subset of another issue, resolved as **Duplicate**
- Make sure to link to the JIRA it duplicates
- Prefer to resolve the issue that has less activity or discussion as the duplicate
- If the issue seems clearly obsolete and applies to issues or components that have changed
radically since it was opened, resolve as **Not a Problem**
- If the issue doesn't make sense – that is, it is not actionable (for example, a non-Spark issue) –
resolve as **Invalid**
- If it's a coherent issue, but there is a clear indication that there is not support or interest
in acting on it, then resolve as **Won't Fix**
- Umbrellas are frequently marked **Done** if they are just container issues that don't correspond
to an actionable change of their own
<h2>Preparing to contribute code changes</h2>
<h3>Choosing what to contribute</h3>
Spark is an exceptionally busy project, with a new JIRA or pull request every few hours on average.
Review can take hours or days of committer time. Everyone benefits if contributors focus on
changes that are useful, clear, easy to evaluate, and already pass basic checks.
Sometimes, a contributor will already have a particular new change or bug in mind. If seeking
ideas, consult the list of starter tasks in JIRA, or ask the `user@spark.apache.org` mailing list.
Before proceeding, contributors should evaluate if the proposed change is likely to be relevant,
new and actionable:
- Is it clear that code must change? Proposing a JIRA and pull request is appropriate only when a
clear problem or change has been identified. If you are simply having trouble using Spark, use the
mailing lists first, rather than filing a JIRA or proposing a change. When in doubt, email
`user@spark.apache.org` first about the possible change
- Search the `user@spark.apache.org` and `dev@spark.apache.org` mailing list
<a href="{{site.baseurl}}/community.html#mailing-lists">archives</a> for
related discussions.
Often, the problem has been discussed before, with a resolution that doesn't require a code
change, or recording what kinds of changes will not be accepted as a resolution.
- Search JIRA for existing issues:
<a href="https://issues.apache.org/jira/browse/SPARK">https://issues.apache.org/jira/browse/SPARK</a>
- Type `spark [search terms]` in the top-right search box. If a logically similar issue already
exists, then contribute to the discussion on the existing JIRA and pull request first, instead of
creating a new one.
- Is the scope of the change matched to the contributor's level of experience? Anyone is qualified
to suggest a typo fix, but refactoring core scheduling logic requires much more understanding of
Spark. Some changes require building up experience first (see above).
It's worth reemphasizing that changes to the core of Spark, or to highly complex and important modules
like SQL and Catalyst, are more difficult to make correctly. They will be subjected to more scrutiny,
and held to a higher standard of review than changes to less critical code.
<h3>MLlib-specific contribution guidelines</h3>
While a rich set of algorithms is an important goal for MLlib, scaling the project requires
that maintainability, consistency, and code quality come first. New algorithms should:
- Be widely known
- Be used and accepted (academic citations and concrete use cases can help justify this)
- Be highly scalable
- Be well documented
- Have APIs consistent with other algorithms in MLlib that accomplish the same thing
- Come with a reasonable expectation of developer support
- Have the `@Since` annotation on public classes, methods, and variables
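For illustration only, here is a toy Python decorator modeled on the idea behind `@Since` (in Spark itself, `@Since` is a Scala/Java annotation, and PySpark has a similar `since` docstring helper); the function and names below are hypothetical:

```python
import functools

def since(version: str):
    """Toy stand-in for Spark's `@Since` annotation: records the version
    in which a public API was first added, so API history is visible to
    reviewers and documentation tooling."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        # Record the version both as an attribute and in the docstring
        wrapper.__since__ = version
        wrapper.__doc__ = (wrapper.__doc__ or "") + f"\n\n.. versionadded:: {version}"
        return wrapper
    return decorator

@since("3.0.0")
def cosine_similarity(a, b):
    """Compute cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)
```

The point is that every public API carries the version in which it first appeared, which reviewers check whenever an API is added or changed.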
<h3>Error message guidelines</h3>
Exceptions thrown in Spark should be associated with standardized and actionable
error messages.
Error messages should answer the following questions:
- **What** was the problem?
- **Why** did the problem happen?
- **How** can the problem be solved?
When writing error messages, you should:
- Use active voice
- Avoid time-based statements, such as promises of future support
- Use the present tense to describe the error and provide suggestions
- Provide concrete examples if the resolution is unclear
- Avoid sounding accusatory, judgmental, or insulting
- Be direct
- Do not use programming jargon in user-facing errors
See the <a href="{{site.baseurl}}/error-message-guidelines.html">error message guidelines</a> for more details.
<h3>Code review criteria</h3>
Before considering how to contribute code, it's useful to understand how code is reviewed,
and why changes may be rejected. See the
<a href="https://google.github.io/eng-practices/review/">detailed guide for code reviewers</a>
from Google's Engineering Practices documentation.
Simply put, changes that have many or large
positives, and few negative effects or risks, are much more likely to be merged, and merged quickly.
Risky and less valuable changes are very unlikely to be merged, and may be rejected outright
rather than receive iterations of review.
<h4>Positives</h4>
- Fixes the root cause of a bug in existing functionality
- Adds functionality or fixes a problem needed by a large number of users
- Simple, targeted
- Maintains or improves consistency across Python, Java, Scala
- Easily tested; has tests
- Reduces complexity and lines of code
- Change has already been discussed and is known to committers
<h4>Negatives, risks</h4>
- Band-aids a symptom of a bug only
- Introduces complex new functionality, especially an API that needs to be supported
- Adds complexity that only helps a niche use case
- Adds user-space functionality that does not need to be maintained in Spark, but could be hosted
externally and indexed by <a href="https://spark-packages.org/">spark-packages.org</a>
- Changes a public API or semantics (rarely allowed)
- Adds large dependencies
- Changes versions of existing dependencies
- Adds a large amount of code
- Makes lots of modifications in one "big bang" change
<h2>Contributing code changes</h2>
Please review the preceding section before proposing a code change. This section documents how to do so.
**When you contribute code, you affirm that the contribution is your original work and that you
license the work to the project under the project's open source license. Whether or not you state
this explicitly, by submitting any copyrighted material via pull request, email, or other means
you agree to license the material under the project's open source license and warrant that you
have the legal authority to do so.**
<h3>Cloning the Apache Spark<span class="tm">™</span> source code</h3>
If you are interested in working with the newest under-development code or contributing to Apache Spark development, you can check out the master branch from Git:
# Master development branch
git clone git://github.com/apache/spark.git
Once you've downloaded Spark, you can find instructions for installing and building it on the <a href="{{site.baseurl}}/documentation.html">documentation page</a>.
<h3>JIRA</h3>
Generally, Spark uses JIRA to track logical issues, including bugs and improvements, and uses
GitHub pull requests to manage the review and merge of specific code changes. That is, JIRAs are
used to describe _what_ should be fixed or changed, and high-level approaches, and pull requests
describe _how_ to implement that change in the project's source code. For example, major design
decisions are discussed in JIRA.
1. Find the existing Spark JIRA that the change pertains to.
1. Do not create a new JIRA if creating a change to address an existing issue in JIRA; add to
the existing discussion and work instead
1. Look for existing pull requests that are linked from the JIRA, to understand if someone is
already working on the JIRA
1. If the change is new, then it usually needs a new JIRA. However, trivial changes, where the
what should change is virtually the same as the how it should change do not require a JIRA.
Example: `Fix typos in Foo scaladoc`
1. If required, create a new JIRA:
1. Provide a descriptive Title. "Update web UI" or "Problem in scheduler" is not sufficient.
"Kafka Streaming support fails to handle empty queue in YARN cluster mode" is good.
1. Write a detailed Description. For bug reports, this should ideally include a short
reproduction of the problem. For new features, it may include a design document.
1. Set required fields:
1. **Issue Type**. Generally, Bug, Improvement and New Feature are the only types used in Spark.
1. **Priority**. Set to Major or below; higher priorities are generally reserved for
committers to set. The main exception is correctness or data-loss issues, which can be flagged as
Blockers. JIRA tends to unfortunately conflate "size" and "importance" in its
Priority field values. Their meaning is roughly:
1. Blocker: pointless to release without this change as the release would be unusable
to a large minority of users. Correctness and data loss issues should be considered Blockers for their target versions.
1. Critical: a large minority of users are missing important functionality without
this, and/or a workaround is difficult
1. Major: a small minority of users are missing important functionality without this,
and there is a workaround
1. Minor: a niche use case is missing some support, but it does not affect usage or
is easily worked around
1. Trivial: a nice-to-have change but unlikely to be any problem in practice otherwise
1. **Component**
1. **Affects Version**. For Bugs, assign at least one version that is known to exhibit the
problem or need the change
1. **Label**. Not widely used, except for the following:
- `correctness`: a correctness issue
- `data-loss`: a data loss issue
- `release-notes`: the change's effects need mention in release notes. The JIRA or pull request
should include detail suitable for inclusion in release notes -- see "Docs Text" below.
- `starter`: small, simple change suitable for new contributors
1. **Docs Text**: For issues that require an entry in the release notes, this should contain the
information that the release manager should include in Release Notes. This should include a short summary
of what behavior is impacted, and detail on what behavior changed. It can be provisionally filled out
when the JIRA is opened, but will likely need to be updated with final details when the issue is
resolved.
1. Do not set the following fields:
1. **Fix Version**. This is assigned by committers only when resolved.
1. **Target Version**. This is assigned by committers to indicate a PR has been accepted for
possible fix by the target version.
1. Do not include a patch file; pull requests are used to propose the actual change.
1. If the change is a large change, consider inviting discussion on the issue at
`dev@spark.apache.org` first before proceeding to implement the change.
<h3>Pull request</h3>
1. <a href="https://help.github.com/articles/fork-a-repo/">Fork</a> the GitHub repository at
<a href="https://github.com/apache/spark">https://github.com/apache/spark</a> if you haven't already
1. Clone your fork, create a new branch, push commits to the branch.
1. Consider whether documentation or tests need to be added or updated as part of the change,
and add them as needed.
1. When you add tests, make sure the tests are self-descriptive.
1. Also, you should consider writing a JIRA ID in the tests when your pull request targets to fix
a specific issue. In practice, usually it is added when a JIRA type is a bug or a PR adds
a couple of tests to an existing test class. See the examples below:
- Scala
```
test("SPARK-12345: a short description of the test") {
...
```
- Java
```
@Test
public void testCase() {
// SPARK-12345: a short description of the test
...
```
- Python
```
def test_case(self):
# SPARK-12345: a short description of the test
...
```
- R
```
test_that("SPARK-12345: a short description of the test", {
...
```
1. Consider whether benchmark results should be added or updated as part of the change, and add them as needed by
<a href="https://spark.apache.org/developer-tools.html#github-workflow-benchmarks">Running benchmarks in your forked repository</a>
to generate benchmark results.
1. Run all tests with `./dev/run-tests` to verify that the code still compiles, passes tests, and
passes style checks. Alternatively you can run the tests via GitHub Actions workflow by
<a href="https://spark.apache.org/developer-tools.html#github-workflow-tests">Running tests in your forked repository</a>.
If style checks fail, review the Code Style Guide below.
1. <a href="https://help.github.com/articles/using-pull-requests/">Open a pull request</a> against
the `master` branch of `apache/spark`. (Only in special cases would the PR be opened against other branches.)
1. The PR title should be of the form `[SPARK-xxxx][COMPONENT] Title`, where `SPARK-xxxx` is
the relevant JIRA number, `COMPONENT `is one of the PR categories shown at
<a href="https://spark-prs.appspot.com/">spark-prs.appspot.com</a> and
Title may be the JIRA's title or a more specific title describing the PR itself.
1. If the pull request is still a work in progress, and so is not ready to be merged,
but needs to be pushed to GitHub to facilitate review, then add `[WIP]` after the component.
1. Consider identifying committers or other contributors who have worked on the code being
changed. Find the file(s) in GitHub and click "Blame" to see a line-by-line annotation of
who changed the code last. You can add `@username` in the PR description to ping them
immediately.
1. Please state that the contribution is your original work and that you license the work
to the project under the project's open source license.
1. The related JIRA, if any, will be marked as "In Progress" and your pull request will
automatically be linked to it. There is no need to be the Assignee of the JIRA to work on it,
though you are welcome to comment that you have begun work.
1. The Jenkins automatic pull request builder will test your changes
1. If it is your first contribution, Jenkins will wait for confirmation before building
your code and post "Can one of the admins verify this patch?"
1. A committer can authorize testing with a comment like "ok to test"
1. A committer can automatically allow future pull requests from a contributor to be
tested with a comment like "Jenkins, add to whitelist"
1. After about 2 hours, Jenkins will post the results of the test to the pull request, along
with a link to the full results on Jenkins.
1. Watch for the results, and investigate and fix failures promptly
1. Fixes can simply be pushed to the same branch from which you opened your pull request
1. Jenkins will automatically re-test when new commits are pushed
1. If the tests failed for reasons unrelated to the change (e.g. Jenkins outage), then a
committer can request a re-test with "Jenkins, retest this please".
Ask if you need a test restarted. If you were added by "Jenkins, add to whitelist" from a
committer before, you can also request the re-test.
1. If there is a change related to SparkR in your pull request, AppVeyor will be triggered
automatically to test SparkR on Windows, which takes roughly an hour. Similarly to the steps
above, fix failures and push new commits which will request the re-test in AppVeyor.
<h3>The review process</h3>
- Other reviewers, including committers, may comment on the changes and suggest modifications.
Changes can be added by simply pushing more commits to the same branch.
- Lively, polite, rapid technical debate is encouraged from everyone in the community. The outcome
may be a rejection of the entire change.
- Keep in mind that changes to more critical parts of Spark, like its core and SQL components, will
be subjected to more review, and may require more testing and proof of its correctness than
other changes.
- Reviewers can indicate that a change looks suitable for merging with a comment such as: "I think
this patch looks good". Spark uses the LGTM convention for indicating the strongest level of
technical sign-off on a patch: simply comment with the word "LGTM". It specifically means: "I've
looked at this thoroughly and take as much ownership as if I wrote the patch myself". If you
comment LGTM you will be expected to help with bugs or follow-up issues on the patch. Consistent,
judicious use of LGTMs is a great way to gain credibility as a reviewer with the broader community.
- Sometimes, other changes will be merged which conflict with your pull request's changes. The
PR can't be merged until the conflict is resolved. This can be resolved by, for example, adding a remote
to keep up with upstream changes by `git remote add upstream https://github.com/apache/spark.git`,
running `git fetch upstream` followed by `git rebase upstream/master` and resolving the conflicts by hand,
then pushing the result to your branch.
- Try to be responsive to the discussion rather than let days pass between replies
<h3>Closing your pull request / JIRA</h3>
- If a change is accepted, it will be merged and the pull request will automatically be closed,
along with the associated JIRA if any
- Note that in the rare case you are asked to open a pull request against a branch besides
`master`, that you will actually have to close the pull request manually
- The JIRA will be Assigned to the primary contributor to the change as a way of giving credit.
If the JIRA isn't closed and/or Assigned promptly, comment on the JIRA.
- If your pull request is ultimately rejected, please close it promptly
- ... because committers can't close PRs directly
- Pull requests will be automatically closed by an automated process at Apache after about a
week if a committer has made a comment like "mind closing this PR?" This means that the
committer is specifically requesting that it be closed.
- If a pull request has gotten little or no attention, consider improving the description or
the change itself and ping likely reviewers again after a few days. Consider proposing a
change that's easier to include, like a smaller and/or less invasive change.
- If it has been reviewed but not taken up after weeks, after soliciting review from the
most relevant reviewers, or, has met with neutral reactions, the outcome may be considered a
"soft no". It is helpful to withdraw and close the PR in this case.
- If a pull request is closed because it is deemed not the right approach to resolve a JIRA,
then leave the JIRA open. However if the review makes it clear that the issue identified in
the JIRA is not going to be resolved by any pull request (not a problem, won't fix) then also
resolve the JIRA.
<a name="code-style-guide"></a>
<h2>Code style guide</h2>
Please follow the style of the existing codebase.
- For Python code, Apache Spark follows
<a href="http://legacy.python.org/dev/peps/pep-0008/">PEP 8</a> with one exception:
lines can be up to 100 characters in length, not 79.
- For R code, Apache Spark follows
<a href="https://google.github.io/styleguide/Rguide.xml">Google's R Style Guide</a> with three exceptions:
lines can be up to 100 characters in length, not 80, there is no limit on function name but it has a initial
lower case latter and S4 objects/methods are allowed.
- For Java code, Apache Spark follows
<a href="http://www.oracle.com/technetwork/java/codeconvtoc-136057.html">Oracle's Java code conventions</a> and
Scala guidelines below. The latter is preferred.
- For Scala code, Apache Spark follows the official
<a href="http://docs.scala-lang.org/style/">Scala style guide</a> and
<a href="https://github.com/databricks/scala-style-guide">Databricks Scala guide</a>. The latter is preferred. To format Scala code, run ./dev/scalafmt prior to submitting a PR.
<h3>If in doubt</h3>
If you're not sure about the right style for something, try to follow the style of the existing
codebase. Look at whether there are other examples in the code that use your feature. Feel free
to ask on the `dev@spark.apache.org` list as well and/or ask committers.
<h2>Code of conduct</h2>
The Apache Spark project follows the <a href="https://www.apache.org/foundation/policies/conduct.html">Apache Software Foundation Code of Conduct</a>. The <a href="https://www.apache.org/foundation/policies/conduct.html">code of conduct</a> applies to all spaces managed by the Apache Software Foundation, including IRC, all public and private mailing lists, issue trackers, wikis, blogs, Twitter, and any other communication channel used by our communities. A code of conduct which is specific to in-person events (ie., conferences) is codified in the published ASF anti-harassment policy.
We expect this code of conduct to be honored by everyone who participates in the Apache community formally or informally, or claims any affiliation with the Foundation, in any Foundation-related activities and especially when representing the ASF, in any role.
This code <u>is not exhaustive or complete</u>. It serves to distill our common understanding of a collaborative, shared environment and goals. We expect it to be followed in spirit as much as in the letter, so that it can enrich all of us and the technical communities in which we participate.
For more information and specific guidelines, refer to the <a href="https://www.apache.org/foundation/policies/conduct.html">Apache Software Foundation Code of Conduct</a>. | 61.914226 | 591 | 0.75827 | eng_Latn | 0.999414 |
d5f01ca5e2989edf9db743bc6c45c7ffc68aba3d | 1,373 | md | Markdown | docs/interfaces/_src_agent_structures_.customerparameters.md | quarties/lc-sdk-js | b511205cc9421c00e607340fc92968555b16775a | [
"Apache-2.0"
] | null | null | null | docs/interfaces/_src_agent_structures_.customerparameters.md | quarties/lc-sdk-js | b511205cc9421c00e607340fc92968555b16775a | [
"Apache-2.0"
] | null | null | null | docs/interfaces/_src_agent_structures_.customerparameters.md | quarties/lc-sdk-js | b511205cc9421c00e607340fc92968555b16775a | [
"Apache-2.0"
] | null | null | null | [@livechat/lc-sdk-js](../README.md) › [Globals](../globals.md) › ["src/agent/structures"](../modules/_src_agent_structures_.md) › [CustomerParameters](_src_agent_structures_.customerparameters.md)
# Interface: CustomerParameters
## Hierarchy
* **CustomerParameters**
## Index
### Properties
* [avatar](_src_agent_structures_.customerparameters.md#optional-avatar)
* [email](_src_agent_structures_.customerparameters.md#optional-email)
* [name](_src_agent_structures_.customerparameters.md#optional-name)
* [session_fields](_src_agent_structures_.customerparameters.md#optional-session_fields)
## Properties
### `Optional` avatar
• **avatar**? : *undefined | string*
*Defined in [src/agent/structures.ts:243](https://github.com/livechat/lc-sdk-js/blob/ce4846a/src/agent/structures.ts#L243)*
___
### `Optional` email
• **email**? : *undefined | string*
*Defined in [src/agent/structures.ts:242](https://github.com/livechat/lc-sdk-js/blob/ce4846a/src/agent/structures.ts#L242)*
___
### `Optional` name
• **name**? : *undefined | string*
*Defined in [src/agent/structures.ts:241](https://github.com/livechat/lc-sdk-js/blob/ce4846a/src/agent/structures.ts#L241)*
___
### `Optional` session_fields
• **session_fields**? : *object[]*
*Defined in [src/agent/structures.ts:244](https://github.com/livechat/lc-sdk-js/blob/ce4846a/src/agent/structures.ts#L244)*
| 28.020408 | 196 | 0.74217 | yue_Hant | 0.279898 |
d5f030c0189aa7f64895b2c259060c4a78623546 | 200 | md | Markdown | _posts/0000-01-02-Basilisnocoder.md | Basilisnocoder/github-slideshow | dfdc796e2f8585baf3622ece4604d7453e45c98d | [
"MIT"
] | null | null | null | _posts/0000-01-02-Basilisnocoder.md | Basilisnocoder/github-slideshow | dfdc796e2f8585baf3622ece4604d7453e45c98d | [
"MIT"
] | 3 | 2021-06-16T05:52:24.000Z | 2021-06-16T19:33:46.000Z | _posts/0000-01-02-Basilisnocoder.md | Basilisnocoder/github-slideshow | dfdc796e2f8585baf3622ece4604d7453e45c98d | [
"MIT"
] | null | null | null | ---
layout: slide
title: "Welcome to our second slide!"
---
Textications. Am i doing this right??
https://user-images.githubusercontent.com/85979580/122281262-4e182e80-cea7-11eb-8504-d04fdbc7d0b0.mp4
| 28.571429 | 101 | 0.775 | eng_Latn | 0.290429 |
d5f05f4063410d764130b3284b5b4e848ebb2827 | 2,138 | md | Markdown | docs/en/index.md | EvvC/platform | 57e369f45740072aeae5aad7a7a335b542ccb1cb | [
"MIT"
] | 3 | 2019-07-20T09:40:41.000Z | 2019-07-29T05:14:00.000Z | docs/en/index.md | EvvC/platform | 57e369f45740072aeae5aad7a7a335b542ccb1cb | [
"MIT"
] | null | null | null | docs/en/index.md | EvvC/platform | 57e369f45740072aeae5aad7a7a335b542ccb1cb | [
"MIT"
] | null | null | null | # Documentation
----------
## Welcome
This tutorial contains a reference and important themes about development of business-applications on the ORCHID platform.
To use the platform the knowledge of following technologies is required:
- PHP/JavaScript/HTML/CSS
- Relational databases
- Apache/nginx
- Laravel Framework
> To offer some improvements for this tutorial please add a new [issue](https://github.com/orchidsoftware/platform/issues).
In case you find some mistakes in the documentation please specify the section and related text to point it out.
Before installation and use of the platform we recomend you read about the problems it was created to solve. This little investment of time will help get you up to speed quickly.

## Introduction
ORCHID - is an instrument for [RAD](https://en.wikipedia.org/wiki/Rapid_application_development) development of websites and linear business applications.
It is shipped as a package for Laravel and can be easily integrated using Composer, it harmonizes well with other PHP components.
You may also register additional third-party components for Laravel or ones found on [Packagist](https://packagist.org/).
## How to read the documentation
If you are a newcomer I recommend you to read the Laravel documentation from start to end, at [laravel.com](https://laravel.com).
ORCHID does not explain the framework capabilities. If you are already familiar with it you may proceed directly to reading further sections.
This documentation begins with an explanation of concept and architecture of ORCHID before it dives into particular themes.
## Capabilities
ORCHID takes upon itself the routine business application development operations by adding custom control elements, fields, templates and screens, and allowing to concentrate on the application's unique business-logic.
Despite its simple appearance it allows you to solve many problems using standard tools while also allowing to add your own custom functionality if required.
Following the [tutorial](/en/docs/installation/) you can quickly install and use it.
| 47.511111 | 218 | 0.799813 | eng_Latn | 0.998511 |
d5f1beb0d63db2818eda2c53225d0cf21363f68f | 9,089 | md | Markdown | articles/virtual-machines/windows/capture-image-resource.md | diablo444/azure-docs.de-de | 168079679b8171e6c2b6957d21d581f05752689d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/windows/capture-image-resource.md | diablo444/azure-docs.de-de | 168079679b8171e6c2b6957d21d581f05752689d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/windows/capture-image-resource.md | diablo444/azure-docs.de-de | 168079679b8171e6c2b6957d21d581f05752689d | [
"CC-BY-4.0",
"MIT"
] | 1 | 2022-01-21T14:22:47.000Z | 2022-01-21T14:22:47.000Z | ---
title: Erstellen eines verwalteten Images in Azure | Microsoft-Dokumentation
description: "Erstellen Sie ein verwaltetes Image eines generalisierten virtuellen Computers oder einer VHD in Azure. Mit Images können mehrere virtuelle Computer erstellt werden, die verwaltete Datenträger verwenden."
services: virtual-machines-windows
documentationcenter:
author: cynthn
manager: timlt
editor:
tags: azure-resource-manager
ms.assetid:
ms.service: virtual-machines-windows
ms.workload: infrastructure-services
ms.tgt_pltfrm: vm-windows
ms.devlang: na
ms.topic: article
ms.date: 02/27/2017
ms.author: cynthn
ms.translationtype: Human Translation
ms.sourcegitcommit: 245ce9261332a3d36a36968f7c9dbc4611a019b2
ms.openlocfilehash: e428b755f6696bd6d4047ad77579a8e9665dfbd8
ms.contentlocale: de-de
ms.lasthandoff: 06/09/2017
---
# <a name="create-a-managed-image-of-a-generalized-vm-in-azure"></a>Erstellen eines verwalteten Images eines generalisierten virtuellen Computers in Azure
Eine verwaltete Imageressource kann aus einem generalisierten virtuellen Computer erstellt werden, der entweder als verwalteter Datenträger oder als nicht verwalteter Datenträger in einem Speicherkonto gespeichert ist. Mit diesem Image können dann mehrere virtuelle Computer erstellt werden.
## <a name="generalize-the-windows-vm-using-sysprep"></a>Generalisieren der Windows-VM mithilfe von Sysprep
Sysprep entfernt unter anderem alle persönlichen Kontoinformationen, und bereitet den Computer darauf vor, als Image verwendet zu werden. Weitere Informationen zu Sysprep finden Sie unter [How to Use Sysprep: An Introduction](http://technet.microsoft.com/library/bb457073.aspx)(in englischer Sprache).
Stellen Sie sicher, dass die auf dem Computer ausgeführten Serverrollen von Sysprep unterstützt werden. Weitere Informationen finden Sie unter [Sysprep Support for Server Roles](https://msdn.microsoft.com/windows/hardware/commercialize/manufacture/desktop/sysprep-support-for-server-roles)
> [!IMPORTANT]
> Wenn Sie Sysprep vor dem Hochladen der virtuellen Festplatte in Azure zum ersten Mal ausführen, stellen Sie sicher, dass Sie vor dem Ausführen von Sysprep [Ihren virtuellen Computer vorbereitet](prepare-for-upload-vhd-image.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json) haben.
>
>
1. Melden Sie sich bei dem virtuellen Windows-Computer an.
2. Öffnen Sie das Eingabeaufforderungsfenster als Administrator. Wechseln Sie in das Verzeichnis **%windir%\system32\sysprep**, und führen Sie anschließend `sysprep.exe` aus.
3. Wählen Sie unter **Systemvorbereitungsprogramm** die Option **Out-of-Box-Experience (OOBE) für System aktivieren**, und vergewissern Sie sich, dass das Kontrollkästchen **Verallgemeinern** aktiviert ist.
4. Wählen Sie unter **Optionen für Herunterfahren** die Option **Herunterfahren**.
5. Klicken Sie auf **OK**.

6. Nach Abschluss von Sysprep wird die virtuelle Maschine heruntergefahren. Starten Sie den virtuellen Computer nicht neu.
## <a name="create-a-managed-image-in-the-portal"></a>Erstellen eines verwalteten Images im Portal
1. Öffnen Sie das [Portal](https://portal.azure.com).
2. Klicken Sie auf das Pluszeichen, um eine neue Ressource zu erstellen.
3. Geben Sie in der Filtersuche den Begriff **Image** ein.
4. Wählen Sie **Image** aus den Ergebnissen.
5. Klicken Sie auf dem Blatt **Image** auf **Erstellen**.
6. Geben Sie unter **Name** einen Namen für das Image ein.
7. Wenn Sie über mehrere Abonnements verfügen, wählen Sie das gewünschte Abonnement in der Dropdownliste **Abonnement** aus.
7. Unter **Ressourcengruppe** wählen Sie entweder **Neu erstellen** aus und geben einen Namen ein, oder Sie wählen die Option **Aus vorhandenen** und dann aus der Dropdownliste eine Ressourcengruppe aus.
8. Wählen Sie unter **Speicherort** den Speicherort Ihrer Ressourcengruppe aus.
9. Unter **Betriebssystemtyp** wählen Sie den Typ des Betriebssystems, entweder Windows oder Linux.
11. Klicken Sie unter **Speicherblob** auf **Durchsuchen**, um die VHD im Azure-Speicher zu suchen.
12. Wählen Sie unter **Kontotyp** den Typ „Standard_LRS“ oder „Premium_LRS“. Standard verwendet Festplattenlaufwerke, Premium Solid-State Drives. Beide Typen verwenden lokal redundanten Speicher.
13. Wählen Sie unter **Datenträgercaching** die geeignete Datenträgercachingoption. Die Optionen sind **Kein**, **Schreibgeschützt** und **Lesen/Schreiben**.
14. Optional: Sie können dem Image auch einen vorhandenen Datenträger für Daten hinzufügen, indem Sie auf **+ Datenträger hinzufügen** klicken.
15. Klicken Sie nach Abschluss der Auswahl auf **Erstellen**.
16. Nachdem das Image erstellt wurde, wird es in der von Ihnen ausgewählten Ressourcengruppe in der Liste der Ressourcen als **Image** angezeigt.
## <a name="create-a-managed-image-of-a-vm-using-powershell"></a>Erstellen eines verwalteten Images eines virtuellen Computers mithilfe von PowerShell
Durch Erstellen eines Images direkt von einem virtuellen Computer lässt sich sicherstellen, dass das Image alle Datenträger umfasst, die dem virtuellen Computer zugeordnet sind, einschließlich des Betriebssystemdatenträgers und allen Datenträgern für Daten.
Stellen Sie vor Beginn sicher, dass Sie die neueste Version des AzureRM.Compute-PowerShell-Moduls verwenden. Führen Sie den folgenden Befehl aus, um es zu installieren.
```powershell
Install-Module AzureRM.Compute -RequiredVersion 2.6.0
```
Weitere Informationen finden Sie unter [Azure PowerShell-Versionsverwaltung](/powershell/azure/overview).
1. Erstellen Sie einige Variablen.
```powershell
$vmName = "myVM"
$rgName = "myResourceGroup"
$location = "EastUS"
$imageName = "myImage"
```
2. Stellen Sie sicher, dass die Zuordnung des virtuellen Computers aufgehoben wurde.
```powershell
Stop-AzureRmVM -ResourceGroupName $rgName -Name $vmName -Force
```
3. Legen Sie den Status des virtuellen Computers auf **Generalisiert**fest.
```powershell
Set-AzureRmVm -ResourceGroupName $rgName -Name $vmName -Generalized
```
4. Rufen Sie den virtuellen Computer ab.
```powershell
$vm = Get-AzureRmVM -Name $vmName -ResourceGroupName $rgName
```
5. Erstellen Sie die Imagekonfiguration.
```powershell
$image = New-AzureRmImageConfig -Location $location -SourceVirtualMachineId $vm.ID
```
6. Erstellen Sie das Image.
```powershell
New-AzureRmImage -Image $image -ImageName $imageName -ResourceGroupName $rgName
```
## <a name="create-a-managed-image-of-a-vhd-in-powershell"></a>Erstellen eines verwalteten Images einer VHD mithilfe von PowerShell
Erstellen Sie ein verwaltetes Image mithilfe Ihrer generalisierten Betriebssystem-VHD.
1. Legen Sie zunächst die allgemeinen Parameter fest:
```powershell
$rgName = "myResourceGroupName"
$vmName = "myVM"
$location = "West Central US"
$imageName = "yourImageName"
$osVhdUri = "https://storageaccount.blob.core.windows.net/vhdcontainer/osdisk.vhd"
```
2. Halten Sie den virtuellen Computer an, und heben Sie seine Zuordnung auf.
```powershell
Stop-AzureRmVM -ResourceGroupName $rgName -Name $vmName -Force
```
3. Kennzeichnen Sie den virtuellen Computer als generalisiert.
```powershell
Set-AzureRmVm -ResourceGroupName $rgName -Name $vmName -Generalized
```
4. Erstellen Sie das Image mithilfe Ihrer generalisierten Betriebssystem-VHD.
```powershell
$imageConfig = New-AzureRmImageConfig -Location $location
$imageConfig = Set-AzureRmImageOsDisk -Image $imageConfig -OsType Windows -OsState Generalized -BlobUri $osVhdUri
$image = New-AzureRmImage -ImageName $imageName -ResourceGroupName $rgName -Image $imageConfig
```
## <a name="create-a-managed-image-from-a-snapshot-using-powershell"></a>Erstellen eines verwalteten Images aus einer Momentaufnahme mithilfe von PowerShell
Sie können ein verwaltetes Image auch aus einer Momentaufnahme der VHD eines generalisierten virtuellen Computers erstellen.
1. Erstellen Sie einige Variablen.
```powershell
$rgName = "myResourceGroup"
$location = "EastUS"
$snapshotName = "mySnapshot"
$imageName = "myImage"
```
2. Rufen Sie die Momentaufnahme ab.
```powershell
$snapshot = Get-AzureRmSnapshot -ResourceGroupName $rgName -SnapshotName $snapshotName
```
3. Erstellen Sie die Imagekonfiguration.
```powershell
$imageConfig = New-AzureRmImageConfig -Location $location
$imageConfig = Set-AzureRmImageOsDisk -Image $imageConfig -OsState Generalized -OsType Windows -SnapshotId $snapshot.Id
```
4. Erstellen Sie das Image.
```powershell
New-AzureRmImage -ImageName $imageName -ResourceGroupName $rgName -Image $imageConfig
```
## <a name="next-steps"></a>Nächste Schritte
- Jetzt können Sie [einen virtuellen Computer aus dem generalisierten verwalteten Image erstellen](create-vm-generalized-managed.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json).
| 46.610256 | 301 | 0.768621 | deu_Latn | 0.974696 |
d5f2c6a1da798704d63390fd27991014a6e52b42 | 282 | md | Markdown | docs/common.awaited.md | evanw/velcro | 0980f8c73c07182f5a878b9312bb900b587a047f | [
"MIT"
] | 1 | 2020-10-11T02:41:36.000Z | 2020-10-11T02:41:36.000Z | docs/common.awaited.md | evanw/velcro | 0980f8c73c07182f5a878b9312bb900b587a047f | [
"MIT"
] | null | null | null | docs/common.awaited.md | evanw/velcro | 0980f8c73c07182f5a878b9312bb900b587a047f | [
"MIT"
] | 1 | 2021-03-11T04:27:24.000Z | 2021-03-11T04:27:24.000Z | <!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@velcro/common](./common.md) > [Awaited](./common.awaited.md)
## Awaited type
<b>Signature:</b>
```typescript
type Awaited<T> = T extends Thenable<infer U> ? U : T;
```
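A minimal sketch of how this conditional type unwraps a thenable. Note that the `Thenable` interface below is an assumed stand-in for illustration, not velcro's actual definition:

```typescript
// Assumed minimal shape of a thenable; velcro's real Thenable may differ.
interface Thenable<T> {
  then(onFulfilled: (value: T) => void): void;
}

type Awaited<T> = T extends Thenable<infer U> ? U : T;

// Awaited<Thenable<number>> resolves to number; non-thenables pass through unchanged.
const unwrapped: Awaited<Thenable<number>> = 42;
const passthrough: Awaited<string> = "plain";
console.log(unwrapped, passthrough);
```

Unlike the `Awaited` utility type later built into TypeScript, this version unwraps only a single level of thenable.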
| 23.5 | 89 | 0.648936 | eng_Latn | 0.816194 |
d5f306d26a074461357d931c90839a97a8e8d204 | 15,356 | md | Markdown | gitbook/2021/01/20210129.md | yanalhk/live | e445d081d25dda486a1527e18e2d450ce6a063f0 | [
"MIT"
] | null | null | null | gitbook/2021/01/20210129.md | yanalhk/live | e445d081d25dda486a1527e18e2d450ce6a063f0 | [
"MIT"
] | null | null | null | gitbook/2021/01/20210129.md | yanalhk/live | e445d081d25dda486a1527e18e2d450ce6a063f0 | [
"MIT"
] | 1 | 2020-11-30T17:36:19.000Z | 2020-11-30T17:36:19.000Z | #2021-01-29
09:28:39
法庭文字直播台
\#西九龍裁判法院第五庭
\#劉淑嫻裁判官 \#審訊 \[1/1\] \#1226旺角
楊(29)
控罪:遊蕩導致他人擔心
案情:2019年12月26日,在朗豪坊4樓「茶木」內上述公眾地方遊蕩,而他結伴在該處出現,導致其他不知名人士合理地擔心本身的安全或利益。
\========================
0927播錄音,滿人(細庭)
0934開庭
1057休庭至1105
不認罪
承認事實(控方證物p4):
1.20191226約2025時警員15625於朗豪坊四樓拘捕被告,罪名是在公眾地方行為不檢
2.警方於公眾媒體下載3條片段,並存入光碟,是證物p1-3
3.被告沒刑事定罪紀錄
\=======================
傳召PW1警員15625戴子猿(同音):
20191226在旺角警區反三合會行動組第一隊,由1400時開始當值,[1950-2000](tel:1950-2000)期間,主管及副主管有訓示,叫他們行動,PW1小隊會聯同旺角警區刑事總部其他隊伍,前往朗豪坊,喬裝顧客或行街人士,進行特別行動,目的是防止商場內商舖被人破壞。PW1於2030到達並喬裝途人進行觀察,當時是便裝,他到達四樓,見到有間餐廳叫茶木,有一群人聚集,所以他行去茶木作出觀察。
到茶木外,見到餐廳內有十多名男女在入面叫囂,於是繼續觀察,但因當時外面有大量記者,或者是示威人士或行街人士,他視線水平受阻,之後在餐廳旁扶手電梯上商場五樓。
上去五樓後,電梯旁有欄杆位,可以直接向茶木作觀察,角度比較好,當時餐廳內有十多名男女,在餐廳出入口附近的幾張餐桌坐下,茶木裏亦有其他相信是食客,而該十多名男女部份身穿黑衫黑褲,多數有戴帽或者用衫套頭,以及戴口罩或者使用其他物件遮擋面部。
當時十多名男女有其中一名男子身穿紅色長袖連帽衛衣,衛衣帽是套在頭上,而紅衛衣心口前有白色圓形圖案,背面有大白色圓形圖案,另外身穿藍色牛仔褲,有戴淺藍色外科口罩,有戴眼鏡,即是被告。
被告與另外十多名男女在茶木內大聲叫囂,被告曾經由一張桌走過另一張桌,與另一張桌本來坐在的人士,有交頭接耳的動作,其後被告行去原本的桌。
在他們叫囂期間,被告旁邊的桌有3名女子,因為當時聽到十幾名男女及被告叫囂期間,有說到\`\`做咩喊啊\`\`所以見到3名女子,其中2名女子約十多歲,另1名年紀較大。而年紀較小的其中一個女子,有拿着紙巾,蓋住面部,另一名也用手蓋住面部。
其後3名女子行入去茶木較入的餐桌位置,當時有另一名男子在收銀處附近,看似結賬走,他站着看着茶木出入口,顯得很徬徨,不知結賬好還是想走好,因為當時主要出入口,被一大群記者及示威人士堵塞,之後PW1沒再留意該男子。
被告與十多名男女繼續在所坐的餐桌叫囂,亦不停翻閱餐牌,期間茶木吸引很多記者去拍他們。
及後PW1到達商場四樓繼續觀察,見到被告與十多名男女行出餐廳,連同外面的記者及其他約30-40名人士,行去朗豪坊旁邊康德斯酒店連接的天橋方向。
PW1指當時沒見到他們在餐廳內進食,亦沒見到點餐。
(控方呈上P5相簿第11-12張相,顯示被告人及當日衣著)
(播證物p1片--PSHK,畫面是茶木內的情況。PW1認出哭泣旳女鬼;看到被告交頭接耳的動作;認出想結賬的男子;餐廳內有人叫囂好多位啊咁貴㗎啲嘢食啲嘢會食死人㗎要水啊唔好喊啦酒店有狗啊)
(播證物p2--ON.CC的16分鐘片段,顯示事發前商場內的背景)
(呈上證物p6被告紅色衛衣)
\=======================
抱歉直播員有事,因此無紀錄PW2供詞,感謝臨時直播員的幫忙
結案陳詞
[https://t.me/youarenotalonehk\_live/13324](https://t.me/youarenotalonehk_live/13324)
---
09:30:17
\#粉嶺裁判法院第五庭
\#陳炳宙裁判官
0929 庭內約30人,尚有少量空位。
0930 開庭。
---
10:19:08
法庭文字直播台
\#東區裁判法院第五庭
\#劉綺雲裁判官
\#0915灣仔 \#審訊 (4/1)
陳(19)
控罪:刑事損壞
案情:
被控於2019年9月15日,在駱克道港鐵灣仔站A1出口,無合法辯解而意圖損壞屬於港鐵的鐵閘。
———————————-
Day4
DW1 技術主任 事實證人 作供
證人現職香港某大機構高級技術主任超過三十八年,擁有豐富維修保養經驗,工餘亦有十年時間承接家居及辦公室維修保養及安裝工程工作,對膨脹膠使用非常熟悉,通常在工序完成房與房剩餘空隙要用膨脹膠填補防止蟲蟻及冷氣溜走,一般在五金店買到,使用前戴眼罩,雙手握罐搖動,將罐倒置,噴時用食指㩒制,另一手托住穩定。
證人指自已曾拆開過罐知道結構不同,殺蟲水及噴髮膠不用倒置使用但膨脹膠需要,噴出後通常好幾倍膨脹,要等10至20分鐘表面才硬化之後廿四小時整體才能全部硬化似發泡膠可以用刀清除。DW1以相冊所見閘門上泡沫,估計自己做清理收費數百元,如需找其他人幫手也是二三千元。
DW1 控方盤問
證人同意不倒置及不搖動也可使用膨脹膠,但效果減少及用不完物料。
控方質問所有噴劑因狀態不同,證人不會知道所有使用方法。證人回應所有工具上其實有標明指示及說明書,工程技術人員會跟着做,如殺蟲水寫明不可倒置使用。
證人也同意膨脹膠硬化是一個大約24小時過程,外在環境如溫度、濕度會有影響硬化過程但不多。
DW1同時回答膨脹膠噴出後10多20分鐘至到8小時來一般可以成塊移除,而凌晨及緊急清潔費用只是正常1.5至2倍費用。
控方提出相冊上看到噴劑有注意事項包括不當使用會使物品表面受損,DW1不完全同意說不會噴在動物及相機鏡頭上,鐵器上不會受損。
辯方案情完結前大律師將截圖冊及DW1書面陳述書以65(b)呈堂。同時希望裁判官可抽空再重溫辯方呈堂片段(D2-1大紀元 1:21:30-1:26:20,D2-2 TVB 1:18:00-1:25:00)及相冊截圖24-29。
辯方又提交3封被告品格證明信(校長、通識老師及社工)以65(b)呈堂,辯方案情完成
案件押後至2021年3月26日0930東區裁判法院第五庭作結案陳詞,法庭指示控方需於2月19日或之前提交詳細書面結案陳詞,而辯方需於3月12日或之前回應及辯方詳細結案陳詞。
✅期間批准以原有條件繼續擔保。
---
10:39:17
法庭文字直播台
\#粉嶺裁判法院第五庭
\#陳炳宙裁判官
\#裁決 \#20200119大埔
吳 (17)
控罪:
(1) 在公眾地方管有 \#攻擊性武器
2020年1月19日於大埔公路元洲仔段廣福邨行人天橋管有一支雷射筆
(2) \#襲警 襲擊警務人員
廣福邨行人天橋襲擊高級警員PC47958
❌不認罪❌
——————————————————
判詞稍後補回。
速報
法庭宣布兩項控罪罪成‼️‼️
為被告索取勞教中心、更生中心報告。
案件押後至2月19日930於粉嶺裁判法院第五庭判刑,期間被告還押‼️
---
10:57:09
法庭文字直播台
【1月29日 星期五】
以下法庭需要臨時文字直播員報料:
\#西九龍裁判法院第五庭 \#審訊
[請按此報料](https://t.me/onlycourtbot)
其他法庭暫不需要,請留意本台更新
---
11:01:51
\#東區裁判法院第十庭
\#張志偉裁判官 \#裁決
周,嚴 (16-17) \#1111鰂魚涌
\#管有工具作非法用途
控罪:2項管有適合作非法用途的工具並意圖作非法用途使用
D1被控於2019年11月11日,在鰂魚涌康怡廣場北座外管有1把可摺軍刀。而D2被控於同日同地,管有3包塑膠索帶及1把螺旋鉗。
——————————
以下是就特別事項的簡短裁決理由:
D1警誡口供的自願性:
\-法庭認為PW1並非故意不對D1作警誡,並認為他只是希望儘快完成拘捕程序。除外警方在截查D1時不知道他之前的行為,理應沒有理由視他為疑犯,PW1沒有對D1進行警誡沒有問題
\-法庭認為PW1及PW2口供互相支持,他們有向D1查詢他是否需要送院,亦確定D1沒有不適才進行搜查
\-法庭認為D1能在開始時能回答部分問題下,不會對之後的問題沒有印象,故認為D1事實上能清晰聽到問題,即使中了胡椒噴霧也沒有影響意志亦會懂得提出要求
\-法庭認為當警員問及帶萬用刀的原因時,D1沒有回應,可見他知道沉默的權利
故法庭裁定D1警誡口供是自願作出。
D2警誡口供的自願性:
\-法庭認為PW3在庭上表示不記得當時與D2錄取警誡口供時的語氣和內容是不理想
\-法庭認為PW3在沒有對D2進行警誡下對他説「我地當傾計咁傾」,有誤導D2之嫌
故法庭裁定D2警誡口供是非自願作出,不能呈堂。
————————————
以下是就一般事項的簡短裁決理由:
D1:
\-法庭裁定PW1和PW2是可靠證人,而D1是不可靠證人
\-法庭認為D1在當天沒有帶書及文具上學,可見他有整理背包,亦必然知道萬用刀在他背包。法庭認為即使D1不打算回校上課,亦不需帶望遠鏡及萬用刀出外
\-由於D1在作供曾提及即使遲到趕不上罷課,學校也會讓他進入,法庭因此認為他當時不需急於回校,D1在當時正做其他事
\-法庭認為此控罪亦包括拆除工具的目的,由於法庭已裁定D1警誡口供是自願作出,而D1已承認他有丟雜物往馬路下,法庭認為他會用萬用刀作撬開地磚及剪下索帶使物件分開等,加上案發時附近的情節和環境‼️故裁定D1**罪名成立**‼️
D2:
身分辨認:
\-PW3在作供時指為D2搜身時是以八達通確認D2的身份,法庭不解為何PW3不用身分證作確認D2身份
\-PW3在作供時表示不記得何時得悉被告的個人資料,即出生日期和住址等,亦不記得當時D2有沒有帶口罩。在前者方面,法庭認為PW3不可能忘記何時得知D2的身份證號碼;在後者方面,法庭認為口罩是辨認的重要因素,有機會影響PW3辨認D2與犯案人的身份
\-PW3稱他曾向D2發出羈留人士通知書,法庭認為為何控方不能提供及沒有紀錄感不解,是否沒有發出還是發錯了
證物處理:
\-法庭不解為何PW3會把接收證物的警員編號弄錯
\-法庭不解為何調查報告會寫錯D2的名字,是否有警員在交代犯案人的身份或資料出錯?
\-法庭認為犯案人名字與涉案物品有關連,若出錯有機會影響配對
基於法庭在上述2點的判斷,法庭裁定D2**罪名不成立**。
——————————
就法庭應否為D1索取感化報告,D1律師指出S.H.Y案例,指出本案案情較輕,萬用刀的危險性也較白電油低,也沒有證據指他有參與堵路的行為。雖然年青未必是有效求情因素,但法庭應在可行情況下着重更生的因素。
D1律師亦建議法庭為D1索取社會服務令報告,雖然D1經審訊後罪成未有完全悔意,但他同意大部分案情亦可以展示出一定悔意。
張官表示為了解完整的情況及有完整的判刑選擇,會為D1索取所有報告。
求情會在判刑當天進行。
——————————
張官把D1判刑訂在2月19日0930於東區裁判法院第10庭進行,D1在等侯判刑期間需還押看管。期間法庭亦會為D1索取感化、社會服務令、更生中心、勞教中心、教導所及青少年評議會報告。
---
11:14:39
\#高等法院第卅六庭
\#李運騰法官
\#0816上水 \#宣布判決
\#攻擊性武器
李 (21)‼️曾服刑5個月
控罪:管有攻擊性武器
第228章《簡易程序治罪條例》第17條
於2019年8月16日,在上水新康街及新祥街交界,管有攻擊性武器,即2支棒球棍、2支鐵通及2瓶辣椒噴霧。
背景:
案件經審訊後於去年7月29日遭 \#蘇文隆主任裁判官 裁定管有攻擊性武器罪成,判囚18個月。被告不服定罪及刑罰,向高等法院申請上訴。9月14日向 \#黃崇厚法官 申請保釋等候上訴遭拒,至12月29日上訴聆訊後才獲 \#李運騰法官 批准保釋。
——————————————————
定罪上訴
⏺兩支裝着辣椒油的噴樽 (P3)
20\. 本席同意答辯方所說,即使裁判官拒絕接納上訴人的說法時用了「驅蚊除蟲」,而不是「蛇蟲鼠蟻」,但明顯地他是不接納上訴人的解釋的固有可能性(inherent probability)。事實上,以辣椒油對家居周圍、汽車車身、車底噴射以「驅蚊除蟲」或驅趕「蛇蟲鼠蟻」,同樣匪夷所思,毫無說服力可言。裁判官有耳聞目睹上訴人作證的優勢,本席認為他有權拒絕接納上訴人管有辣椒噴樽的解釋,而他這樣做也是合情合理的。
⏺兩枝鐵棒球棍 (P5)
24\. 本席不能夠接受馬大律師以上陳詞,原因是上訴人的證供有歪常理,完全不能合信。上訴人說他的朋友有棒球和手套,卻沒有棒球棍;至於上訴人自己則只有棒球棍,卻沒有棒球和手套。倘若上訴人所說的是真的或可能是真的話,即是說他們只能夠一起才能夠打棒球。還有的是,本席在庭上檢視過上訴人所管有的兩支鐵製棒球棍,它們比一般的木製球棍窄身及沉重,相信它們是用來練習揮棒用的。若然用於打球的話,命中率只會較低。然而,上訴人說他從來未打過棒球,那麼他為何會購買鐵製棒球棍呢?既然他購買的棒球棍是鐵做的,為何需要多買一支同色同款的做後備呢?為何不買一支鐵的,一支木的?正如裁判官指出,上訴人這方面的證供,確是「疑點重重」。本席認為他有權拒絕接納上訴人的解釋,而他這樣做也是合情合理的。
⏺結論
34\. 本席以重審的方式檢視過本案的所有證據之後,認同裁判官就事實的裁決,即上訴人管有涉案的物品P3及P5,唯一合理的推論是為傷害他人的用途,所以P3 和P5屬於攻擊性武器。因此,本席裁定駁回上訴人針對定罪的上訴。
刑期上訴
39\. 本席注意到,本案並沒有證據顯示案發當天有任何社運活動或騷亂;沒有證據顯示上訴人於當晚曾經或將要使用P3或P5;也沒有證據顯示他有或將會公開展示涉案物品;它們在本質上亦非攻擊性武器 (offensive weapon per se)。
40\. 基於以上,本席同意裁判官的量刑沒有恰當地反映上訴人的罪責,而且和同類案件的判例相比,即使考慮到本案攻擊性武器的數量,本案的判刑仍然明顯地超出合理範圍。再者,由於控罪的最高刑罰是兩年監禁及罰款5,000元,倘若本案合適的量刑起點已經是18個月監禁的話,那麼對於比本案案情嚴重得多的同類案件,法庭便再沒有足夠的空間將刑罰上調。
41\. 然而,上訴人由被判罪成至本上訴案聆訊日為止,已經被關押了5個月,除去假期及在服刑中因勤奮及行為良好的減刑 ,相當於7個半月的刑期。本席認為這已經足夠反映上訴人的罪責。因此,本席批准他針對判刑的上訴。
總結
42\. 基於以上,本席駁回上訴人針對的定罪上訴,但允許他針對刑期的上訴,並將他的刑期減短,以至他可以即時獲得釋放。
[判案書全文](https://legalref.judiciary.hk/doc/judg/word/vetted/other/ch/2020/HCMA000213_2020.docx)
---
11:25:27
\#西九龍法院第三庭 ( 區域法院 )
\#高勁修首席區域法院法官
\#提訊 \#1006灣仔
A2: 謝 > D1,A6: 吳 > D2
A7: 李 > D3,A12: 黃 > D4
A13: 周 > D5,A15: 高 > D6
A17: X > D7,A19: 陳 > D8
A23: 曾 > D9,A25: 蘇 > D10
控罪:暴動,6項蒙面,無牌管有無線電通訊器具,管有攻擊性武器。2項管有物品意圖損壞財產
10:30 開庭
控方已將案件分拆,呈交修訂控罪書,各被告❌不認罪❌。
有被告未準備好答辯,案件押後至2022年11月4日14:30,在灣仔區域法院作審前覆核,在2022年12月1日至2023年1月16日,作30日審訊。
D10律師表示在2020年10月已經去信律政司要求攞文件,但仲未有回覆,申請另外處理,安排D10在2021年3月30日在灣仔區域法院提訊。
D1 & D4 申請縮短宵禁時間獲法庭批准,其他被告以原有條件繼續保釋。
---
12:18:15
法庭文字直播台
\#東區裁判法院第四庭
\#崔美霞裁判官
\#0907沙田 \#審訊 \[3/2\]
\#襲警
王(22)
控罪:襲擊警務人員
第232章 《警隊條例》第63條
被控於2019年9月7日,於港鐵沙田站B出口外,襲擊執行職責的警務人員,即警員鄧浩濱。
辯方以65B形式呈上一份被告的聽力測試報告,報告證實被告右耳有聽障。
被告選擇出庭作供
現時就讀教大的被告中學時已立志成為老師。被告知悉如果犯案會對他日後當老師有極大影響。
辯方呈上一此被告過往於本地及外國當義工時拍下的相片及證書、醫院病歷、調解課程證書…(詳情後補)
案件將押後至3月3日 0930 沙田裁判法院5庭裁判,期間辯方需於2月5日前提交詳結案陳詞。
---
12:37:15
法庭文字直播台
\#粉嶺裁判法院第五庭
\#陳炳宙裁判官
\#裁決 \#20200119大埔
吳 (17)
控罪:
(1) 在公眾地方管有 \#攻擊性武器
2020年1月19日於大埔公路元洲仔段廣福邨行人天橋管有一支雷射筆
(2) \#襲警 襲擊警務人員
廣福邨行人天橋襲擊高級警員PC47958
❌不認罪❌
——————————————————
裁決理由:
關於證物是否同一支雷射筆
順帶一提,即使控方未能證明P1的雷射筆便是從被告身上搜出那一支,也不會對裁決結果構成任何影響。不爭的事實是pw2在被告身上搜出雷射筆,當被告以其雷射筆作為攻擊性武器,蓄意以此傷害pw1和令他擔憂受到即時和非法的暴力對待。
雷射筆的殺傷力有多大,屬於第1類或第3B類,也不影響被告當時襲擊他人的意圖,亦不會改變被告以其作為攻擊性武器的事實。
為免誤會,本席表明,以上只屬題外話。本席重申,本席肯定P1便是pw2在被告身上搜出的那一支雷射筆。
關於事實的裁斷
本席考慮了所有證據和辯方陳詞後,及被告的良好品格後,本席滿意控方已經在毫無合理疑點的標準下,證明了以下事實:
於2020年1月19日下午06:45,被告身處大埔大埔公路元州仔段近燈柱EA8354的行人天橋上。當時被告攜有一個能發出雷射光束的裝置,即P1的雷射筆。
該雷射筆發出的光照射向距離78米外,即南運路和廣福道交界處的一輛警車中的司機(pw1高級警員47958),照射在其眼睛和臉上。
pw1駕駛右轉至大埔公路元州仔段,於行人天橋底停車。被告一直以打圈的方式把雷射光照射pw1,並聚焦在其臉上和眼睛。
案發時,pw1是一名執行職責的警務人員。被告以雷射光蓄意襲擊pw1。
pw2警員19755在天橋底下車,被告從行人天橋上乘搭升降機到達地面,pw1在升降機外截停被告。
pw1對被告作出搜查,並從他的右前褲袋內取出了p1的雷射筆。pw2在被告的鎖匙包內找到兩條可以用來啟動那支雷射筆的鎖匙P2。隨即pw2以「管有攻擊性武器」的罪名拘捕被告。在pw2口頭警誡下,被告說「呀Sir,我知錯啦,以後唔會再用雷射筆照警察。」
被告於案發時有意圖使用雷射筆來對pw1造成傷害,因此這支雷射筆是攻擊性武器。被告沒有合法權限或合理辯解而攜有攻擊性武器。
裁決
基於以上的原因,本席裁定控方已經在毫無合理疑點的標準下,證明了兩項控罪的所有罪行元素。本席裁定被告人所面對的兩項控罪全部成立,並把他定罪。
——————————————————
求情
被告案發時17歲,現修讀大學一年級,與父母居住。辯方呈上3封求情信。分別由中學校長、老師、父母、朋友撰寫。老師們表示被告品學兼優,在學界上拿到不同的獎項,希望法庭能給予機會被告;父母提及被告乖巧,經過努力入讀自己心儀的學系。
關於控罪一,辯方認為被告不是預謀的罪行,且受傷的警員傷勢不大,在同類案件中也不屬於最嚴重。被告沒有任何案底紀錄。
辯方就著還押有一個保釋申請,法庭拒絕批准(\*陳官表示根據指示,控罪一的判刑只有三個選擇,包括即時監禁、勞教中心、更生中心)
——————————————————
法庭宣布兩項控罪罪成‼️‼️
為被告索取勞教中心、更生中心報告。
證物處理,控方申請證物p1、p2充公,p5的記事冊正副本則交回警方,其他在法庭存檔,辯方不反對。
案件押後至2月19日930於粉嶺裁判法院第五庭判刑,期間被告還押‼️
(按:手足離開時向旁聽席點頭)
其他判詞詳情請參考:
[判詞書全文](https://legalref.judiciary.hk/lrs/common/search/search_result_detail_frame.jsp?DIS=133308&QS=%2B&TP=RV)
---
13:54:28
法庭文字直播台
\#西九龍法院第三庭 ( 區域法院 )
\#高勁修首席區域法院法官
\#提訊 \#0829深水埗
案件一
A1: X,A2: 戴,A3: X(26)
控罪:暴動
案件二
A1: 陳,A2: 彭,A3: 林
A4: 林,A5: 張,A6: 梁
A7: 田,A8: 顧,A9: 洗
A10: 鄺,A11: 梁,A12: 馬
A13: 姚,A14: \* (14-28)
控罪:暴動 3項管有攻擊性武器
11:37 開庭
控方依照法庭指示,在1月25日去信法庭,將兩宗案件合併,再分拆為兩組:
第一組:案件一(A1, A2 & A3),案件二(A1, A2, A3, A4, A6 & A11)
第二組:案件二(A5, A7, A8, A9, A10, A12 & A13)
各被告可以答辯,❌不認罪❌,除了案件二A14,律師表示曾去信律政司申請以其他方式處理,在1月20日有回覆,需要押後4-6星期。
案件押後:
第一組在2022年6月1日09:30,在灣仔區域法院作審前覆核,在2022年7月4日09:30開始,作35日審訊。
第二組在2022年8月1日09:30,在灣仔區域法院作審前覆核,在2022年9月5日09:30開始,作35日審訊。
至於案件二A14,安排在2021年3月16日14:30再提訊。
法庭指令控方在今日起14日內,呈交新控罪書、案情撮要,和證人列表給法庭和被告律師,各被告代表需在下次開庭前7日交回問卷。
案件二的A5, A7, A9, A11申請改報到時間和轉地址獲批,其他被告以原有條件繼續保釋。
---
14:24:31
法庭文字直播台
\#屯門裁判法院第七庭
\#王證瑜裁判官
\#1111屯門 \#判刑
據悉七庭已截龍
---
14:56:55
法庭文字直播台
\#九龍城裁判法院第一庭
\#嚴舜儀主任裁判官
\#網上言行 \#判刑
區(25)
控罪:目的在於使其本人或他人不誠實地獲益而取用電腦、披露未經資料使用者同意而取得的個人資料
上回裁決
[https://t.me/youarenotalonehk\_live/12853](https://t.me/youarenotalonehk_live/12853)
案情:被告為某一診所助理護士,在2019年8月13日中午,被告在其Instagram 賬戶發佈一張截圖,內容有一名警員(PW1)的個人資料:出生日期、姓名、身分證號碼和電話號碼,被另一名警員(PW2)發現,並通知PW1,PW1在下午去診所跟進,被告僱主(PW3)在得悉事件後,曾代表被告向PW3道歉,和承諾解僱被告。
**嚴官判刑前向控辯釐清幾條問題:**
\-涉案係診所規模幾大?單一
\-案情提及上載到Ig後,被告IG followers有幾多?少於210,大概200個
\-2019年8月13日已經知道係警員?係
\-電腦版面並沒有提及警員職業?係
嚴官指案件涉及一定公眾利益。因為任何入侵電腦,從電腦存取個人資料並係網上披露,無論係唔同嘅網上群組都會有類似嘅情況,所以需要考慮具阻嚇性刑罰。至於本案點解可以係咁短時間發現,係因為碰巧被告IG嘅追隨者見到,並認識被起底警員,立即告知先會咁短時間。
本案要考慮阻嚇性刑罰,翻睇案例後考慮刑期係9-12個月。嚴管指需要控辯雙方協助,若考慮9-12個月監禁,是否應該繼續考慮社會服務令?即使係上次庭上明言為被告索取社會服務令報告,以及參考左辯方案例嘅28日監禁緩刑24月,但案例指嘅係起底禁制令,非不誠實使用電腦。
控方能準備陳詞
唯辯方 \#駱應淦大律師 指今日並沒有考慮相對嘅監禁刑罰,因此申請押後以書面陳詞形式協助法庭
辯方呈上母親撰寫嘅信件予法庭,信件主要提及希望在公開法庭上向受害人以及法庭誠心致歉
控方反對保釋
❌被告保釋被拒❌
案件押後到2021年2月5號1430九龍城裁判法院第一庭再處理判刑,‼️期間需要還押看管‼️
---
15:06:23
\#東區裁判法院第一庭
\#鄭紀航裁判官 \#提堂
王(19) \#1102灣仔
控罪:襲擊在正當執行職務的警務人員
被控於2019年11月2日港島區流水式集會,在警方驅散示威者期間,在灣仔告士打道與盧押道交界襲擊正當執行職務的警長4393。
————————
答辯:❌不認罪❌
本案3名控方證人,全是警員,沒有警誡供詞亦沒有片段。辯方沒有證人也沒有片段。
辯方希望2天審訊,控方持中立態度。本案會以中文進行審訊。
辯方指沒有法律辯論。辯方會要求傳召非列表的證人盤問,認為可在2天內完成審訊,鄭官表示擔憂可否在2天完成審訊,以及控方會否作反對。
辯方同意鄭官看法,故希望申請押後4星期,以看畢所有警員筆記再決定會要求傳召多少非證人列表中的警員作盤問。
控方表示辯方要求7名警員的記事冊,但可能需時尋找,加上要查閱,故不認為4星期可完成。
鄭官問押後5星期可否完成上述步驟,控方表示會儘快完成。
本案押後至2021年3月2日1430在東區裁判法院第一庭進行審前覆核。
✅期間被告以原有條件保釋✅
---
15:07:21
\#九龍城裁判法院第一庭
\#嚴舜儀主任裁判官
\#20200906油麻地 \#答辯
張(36)
控罪:不小心駕駛
案情:被控於2020年9月6日在油麻地彌敦道南行、近文明里在道路上不小心駕駛一輛新巴
背景:有網民於2020年9月6日發起九龍大遊行,期間警方於彌敦道截停一部 970 號新巴, 36 歲車長被警指「無故響號」、駕駛巴士「快速行駛」貼近警員,最終遭控一項不小心駕駛罪
控方修訂控罪一,新增控罪二:
\-修訂控罪:危險駕駛
\-新增控罪:不小心駕駛
答辯意向:❌不認罪❌
沒有警誡供詞,有九名控方證人,有行車記錄儀片段
辯方指暫時得1位辯方證人,並指總共傳召4位證人便可
審期:2天
案件押後到2021年5月17-18號0930九龍城裁判法院第七庭審訊,期間維持原有擔保✅
---
15:19:19
\#觀塘裁判法院第一庭
\#梁少玲裁判官 \#判刑
翁(16) \#1105觀塘
已還押14日
控罪:管有攻擊性武器 已認罪
違反香港法例第228章《簡易程序治罪條例》第17條。翁手足被控於2019年11月5日在觀塘翠屏商場M2層的佳寶食品超級市場外,管有雷射筆和伸縮棍,意圖作非法用途使用。
翁手足早前承認上述控罪,梁少玲裁判官判刑前先為被告索取勞教中心、教導所、更生中心報告,另外應辯方律師要求索取社會服務令報告,期間還押2星期,今日判刑。
詳情如下:[https://t.me/youarenotalonehk\_live/12928](https://t.me/youarenotalonehk_live/12928)
\--------------------------
♂️**求情**♂️
辯方律師引用報告內容作進一步求情。報告指翁手足在學校雖然有遲到、缺席但沒有打過交,上課情況可接受、為人有禮貌。兼職離職純粹因為無法遷就學業時間,僱主表明願意再僱用翁手足,學校亦為他保留學席。此外翁手足沒有不良嗜好、本性不懷、得到家人支持及愛護,他們今日亦有到法庭旁聽。
翁手足的情緒問題自從爸爸過身後開始浮現,由於不希望家人擔心所以將負面情緒收收埋埋。2019年8月曾經因意外在石灘跌倒導致腦震盪,影響情緒管理,在2019年10月開始接受精神科治療。雖然報告指出翁手足有自殘傾向,但是沒有傷害他人的暴力傾向。
適合更生中心或教導所服刑
還押索取報告期間,會面官指出翁手足態度合作、有禮貌、坦白承認情況。由於翁手足的情緒問題需要專業護理,不適合在壓力下受監管,身體狀況不適宜判入勞教中心或執行社會服務令。中心主任認為更生中心更適合翁手足。
\--------------------------
⚖️**梁少玲裁判官判刑**
考慮被告審訊前認罪,過往沒有刑事定罪記錄,犯案時年僅16歲,但涉案的兩件物品(雷射筆、伸縮棍)均有一定攻擊性及危險性。被告管有的雷射筆屬於第IV級,60米內直接照射眼睛可令人眼部受傷。兩件物品一但使用會對他人造成嚴重傷害,因此必須判處禁閉式刑罰,而更生中心對被告的技能培訓較有幫助。
判入更生中心服刑
---
15:30:49
\#屯門裁判法院第七庭
\#王證瑜裁判官
\#1111屯門 \#判刑
第一被告:李 (早前已罪名不成立)
第二被告:譚
第三被告:甄
控罪:2項管有攻擊性武器
案情:三名被告人被控於2019年11月11日於屯門青賢街成發汽車有限公司的露天停車場,未經許可或批准管有攻擊性武器(鐳射筆),違反香港法例243章公安條例33條第1,2,3項
[2021年1月12日於王證瑜裁判官席前宣讀之簡單裁決理由](https://t.me/youarenotalonehk_live/12844)
\========
第二被告:
已經收到背景報告,律師已解釋予被告人知及同意。
呈上七封求情信(被告,媽媽,上司,舊同事,朋友,小學班主任,認識了十多年的朋友)
求情信顯示被告由細到大均樂於助人,能為公司帶來歡樂,已真心悔過,在案件發生後,有明顯轉變(收工後都會盡快番屋企,避免參與公眾活動),雖然成續平平,但品格優良,富有正義感
(求情信指被告曾抱著友人的小童說道「**我一定會為你守衛公義,給你自由嘅香港,一個充滿希望嘅未來**」)
第三被告:
已索取勞教中心及更生中心報告。希望採納報告內容(適合更生中心報告)
呈上數封求情信(被告,2名姐姐,教會牧師,中學校長等)
被告沒有刑事定罪記錄,21歲以下,案情非特別嚴重,案發被截停後相對合作,案件中鐳射筆搜獲時為拆開,沒有證據曾使用。
**判刑**
法庭指出控罪是嚴重的,法例要求必然是監禁式刑罰。然,法庭接納2位被告背景良好,有家人朋友支持。
第二被告:❗️監禁5個月❗️
(監禁6個月為量刑起點,考慮求情因素,酌情扣一個月)
第三被告:判入更生中心❗️
直播員按:我地會堅持落去,守衛公義,給下一代自由,充滿希望嘅香港,無論付出什麼代價
---
15:32:09
\#東區裁判法院第一庭
\#鄭紀航裁判官 \#提堂
黃之鋒,古思堯 \#1005銅鑼灣
2人因另案服刑中
控罪1: 明知而參與未經批准的集結 (D1,D2)
2人被控於2019年10月5日,於香港與其他不詳人士明知而參與未經批准的集結,即反對蒙面法大遊行
控罪2:在身處非法集結時使用蒙面物品(D1):
D1被控於同日在金鐘道身處非法集結時,無合法權限或合理辯解而使用相當可能阻止識辨身分的蒙面物品,即一個掛耳式口罩
————————
控方表示2人可答辯。
D1答辯:
控罪1:‼️認罪‼️
控罪2:‼️認罪‼️
D2答辯:
控罪1:不認罪
控方指2人角色相若,希望可以把D1 判刑及求情連同D2審訊由同一法官處理,而D1判刑及求情可以押後至D2審訊完結後處理。
D1同意控方2項控罪的案情。
鄭官裁定D1 2項控罪‼️罪名成立‼️
控方指D1有3項類似案底,指他會在2021年8月出獄。
D2有9名證人,全部警員,沒有警誡供詞,有8段共30分鐘片段。而辯方沒有片段要播放亦沒有證人。
控方預計1-2天審訊,因預期D2不會盤問證人,而D1,2亦希望有較早審期。
D2表示會自辯,並已明白找律師的權利。亦表示1-2天審訊已足夠。
由於考慮到D2身體狀況,鄭官把本案D2審訊暫訂排至2021年3月23和24日在東區裁判法院第8庭進行中文審訊,而D1求情和判刑會擇日進行。若D2要求改期,需2月5日1600前通知法庭。
鄭官詢問2人的保釋條件,控方表示以原有條件擔保。(按:但由於他們就另案服刑,故需還押看管。)
黃之鋒祝大家新年快樂!他們2人精神都好好!
---
17:45:51
法庭文字直播台
\#西九龍裁判法院第五庭
\#劉淑嫻裁判官
\#1226旺角 \#審訊 \[1/1\]
楊 (29)
控罪:遊蕩導致他人擔心
被控於2019年12月26日在旺角朗豪坊4樓的餐廳「茶木」內遊蕩,導致他人合理地擔心自身的安全。
——————————————————
審訊詳情
[https://t.me/youarenotalonehk\_live/13304](https://t.me/youarenotalonehk_live/13304)
辯方結案陳詞
被告嘅行為是否符合遊蕩罪的定義有疑點或不穩妥,被指「與他人結伴」所謂同伴亦沒有參與叫囂。
另一重點,導致他人擔心自身安全或利益,由於案情中兩位粉色衣服女士並沒有到場作供,我冇辦法就住畫面中嘅悲傷神情,去判斷兩位擔心自身安全或利益。
另外控方所指嘅叫囂,都是在兩位粉色衣服女士起身調位置後出現的,因此冇辦法判斷兩位粉色衣服女士調位置與被告或者同伴嘅行為有直接的關係。
案件押後至2021年3月12日1430在西九龍裁判法院第五庭裁決
✅期間以原有條件繼續保釋✅
感謝臨時直播員
---
| 21.78156 | 346 | 0.738799 | yue_Hant | 0.90656 |
d5f34413f344455bf90eaa8e4958362d848da980 | 550 | md | Markdown | _posts/2019-02-20-NUnit3TestAdapter-3.13.md | mikkelbu/nunit.github.io | d81f2b8093dcdf84c83b3ea047002d36b40ad3da | [
"MIT"
] | 1 | 2021-12-17T13:00:17.000Z | 2021-12-17T13:00:17.000Z | _posts/2019-02-20-NUnit3TestAdapter-3.13.md | mikkelbu/nunit.github.io | d81f2b8093dcdf84c83b3ea047002d36b40ad3da | [
"MIT"
] | 104 | 2015-12-16T15:39:50.000Z | 2022-03-20T18:46:59.000Z | _posts/2019-02-20-NUnit3TestAdapter-3.13.md | mikkelbu/nunit.github.io | d81f2b8093dcdf84c83b3ea047002d36b40ad3da | [
"MIT"
] | 11 | 2016-08-19T20:29:34.000Z | 2022-01-26T20:08:29.000Z | NUnit3TestAdapter version 3.13.0 is released.
This release focuses on enhancements: improving and fixing the generation of NUnit3 Test Results XML and removing internal properties. It also ensures that the VSIX version works with VS2019.
See [the release notes](https://github.com/nunit/docs/wiki/Adapter-Release-Notes) for details on what has changed in 3.13.
[Download NuGet package](https://www.nuget.org/packages/NUnit3TestAdapter/3.13.0)
[Download VSIX](https://marketplace.visualstudio.com/items?itemName=NUnitDevelopers.NUnit3TestAdapter) | 91.666667 | 193 | 0.810909 | eng_Latn | 0.833553 |
d5f35abe4f4ef77e32de170d4067738336484b82 | 193 | md | Markdown | fr/adopt-openjdk-getting-started/write_up_on_the_adopt_openjdk_&_adopt-a-jsr_programs.md | feng-qi/adoptopenjdk-getting-started-kit | 0ac48d8d93b462ba682b653b1afbd9314c660fd2 | [
"CC0-1.0"
] | 49 | 2015-11-17T09:47:19.000Z | 2021-11-14T20:41:55.000Z | fr/adopt-openjdk-getting-started/write_up_on_the_adopt_openjdk_&_adopt-a-jsr_programs.md | feng-qi/adoptopenjdk-getting-started-kit | 0ac48d8d93b462ba682b653b1afbd9314c660fd2 | [
"CC0-1.0"
] | 32 | 2015-11-23T00:02:42.000Z | 2020-12-05T23:48:37.000Z | fr/adopt-openjdk-getting-started/write_up_on_the_adopt_openjdk_&_adopt-a-jsr_programs.md | feng-qi/adoptopenjdk-getting-started-kit | 0ac48d8d93b462ba682b653b1afbd9314c660fd2 | [
"CC0-1.0"
] | 12 | 2015-11-16T23:36:31.000Z | 2020-04-08T10:14:13.000Z | # Présentation des programmes Adopt OpenJDK et Adopt-a-JSR
https://docs.google.com/document/d/1TjusNITbfb35PKqA3Bq5yx_j4jPeHs3dLBvACW-fAoA/edit (capture de la page d'accueil d'Adopt OpenJDK).
| 48.25 | 132 | 0.818653 | kor_Hang | 0.288398 |
d5f35e9c9b1af9a3e8b3671a423bafe259b9b924 | 10,247 | md | Markdown | src/components/Picklist/readme.md | Charlitos96/react-rainbow | 7b9d1445ba39e55cbda9a6b9f7adcd38323c987d | [
"MIT"
] | null | null | null | src/components/Picklist/readme.md | Charlitos96/react-rainbow | 7b9d1445ba39e55cbda9a6b9f7adcd38323c987d | [
"MIT"
] | null | null | null | src/components/Picklist/readme.md | Charlitos96/react-rainbow | 7b9d1445ba39e55cbda9a6b9f7adcd38323c987d | [
"MIT"
] | null | null | null | ##### Picklist base
```js
import React from 'react';
import { Picklist, Option } from 'react-rainbow-components';
const containerStyles = {
width: '200px',
};
initialState = { value: { name: 'option 3', label: 'Central Park' } };
<div className="rainbow-m-bottom_xx-large rainbow-p-bottom_xx-large">
<GlobalHeader
src="images/user/user2.jpg"
className="rainbow-p-bottom_xx-large rainbow-m-bottom_xx-large"
>
<div className="rainbow-flex rainbow-align_right">
<Picklist
id="picklist-1"
style={containerStyles}
onChange={(value) => setState({ value })}
value={state.value}
label="Select Building"
hideLabel
>
<Option name="header" label="Your Buildings" variant="header" />
<Option name="option 1" label="Experimental Building" />
<Option name="option 2" label="Empire State" />
<Option name="option 3" label="Central Park" />
</Picklist>
</div>
</GlobalHeader>
</div>;
```
##### Picklist with multiple options
```js
import React from 'react';
import { Picklist, Option } from 'react-rainbow-components';
const containerStyles = {
width: '200px',
};
initialState = { value: null };
<div className="rainbow-m-bottom_xx-large rainbow-p-bottom_xx-large">
<GlobalHeader
src="images/user/user1.jpg"
className="rainbow-p-bottom_xx-large rainbow-m-bottom_xx-large"
>
<div className="rainbow-flex rainbow-align_right">
<Picklist
id="picklist-3"
style={containerStyles}
placeholder="Choose Building"
onChange={(value) => setState({ value })}
value={state.value}
label="Select Building"
hideLabel
enableSearch
>
<Option name="option 1" label="All Buildings" icon={<DashboardIcon />} />
<Option name="option 2" label="New Building" icon={<AddFilledIcon />} />
<Option name="header" label="Your Buildings" variant="header" />
<Option name="option 3" label="Experimental" icon={<BuildingIcon />} />
<Option name="option 4" label="Bennet Towers" icon={<BuildingIcon />} />
<Option name="option 5" label="Empire State" icon={<BuildingIcon />} />
<Option name="option 6" label="Central Park" icon={<BuildingIcon />} />
<Option name="option 7" label="Chrysler" icon={<BuildingIcon />} />
<Option name="option 8" label="Plaza" icon={<BuildingIcon />} />
</Picklist>
</div>
</GlobalHeader>
</div>;
```
##### Picklist disabled
```js
import React from 'react';
import { Picklist, Option } from 'react-rainbow-components';
const containerStyles = {
width: '180px',
};
initialState = { value: { name: 'option 1', label: 'All Buildings' } };
<div className="rainbow-m-bottom_xx-large rainbow-p-bottom_xx-large">
<GlobalHeader
src="images/user/user2.jpg"
className="rainbow-p-bottom_xx-large rainbow-m-bottom_xx-large"
>
<div className="rainbow-flex rainbow-align_right">
<Picklist
disabled
value={state.value}
label="Select Building"
hideLabel
style={containerStyles}
>
<Option name="option 1" label="Experimental Building" />
<Option name="option 2" label="Empire State" />
<Option name="option 3" label="Central Park" />
</Picklist>
</div>
</GlobalHeader>
</div>;
```
##### Picklist with redux-form
```js
import React from 'react';
import { Picklist, Option, ButtonIcon } from 'react-rainbow-components';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faUpload } from '@fortawesome/free-solid-svg-icons';
import { Field, reduxForm } from 'redux-form';
const containerStyles = {
width: '140px',
};
function Form({ handleSubmit, onSubmit }) {
return (
<form
className="rainbow-flex rainbow-align_right rainbow-flex_wrap"
noValidate
onSubmit={handleSubmit(onSubmit)}
>
<Field
style={containerStyles}
component={Picklist}
name="option"
placeholder="Choose"
label="Select Building"
hideLabel
>
<Option name="option 1" label="All Buildings" />
<Option name="option 2" label="New Building" />
<Option name="header 2" label="Your Buildings" variant="header" />
<Option name="option 3" label="Experimental" />
<Option name="option 4" label="Empire State" />
</Field>
<ButtonIcon
variant="border"
icon={<FontAwesomeIcon icon={faUpload} />}
className="rainbow-m-left_small"
type="submit"
/>
</form>
);
}
const PicklistForm = reduxForm({
form: 'picklist-form',
touchOnBlur: false,
})(Form);
<div className="rainbow-m-bottom_xx-large rainbow-p-bottom_xx-large">
<GlobalHeader
src="images/user/user3.jpg"
className="rainbow-p-bottom_xx-large rainbow-m-bottom_xx-large"
>
<PicklistForm onSubmit={(values) => console.log(values)} />
</GlobalHeader>
</div>;
```
##### Picklist with Option changed dynamically
```js
import React from 'react';
import { Picklist, Option, ButtonIcon } from 'react-rainbow-components';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faDownload } from '@fortawesome/free-solid-svg-icons';
const containerStyles = {
width: '200px',
};
initialState = { value: { name: 'option 3', label: 'Central Park' } };
class PicklistExample extends React.Component {
constructor(props) {
super(props);
this.state = {
isBuildingsAdded: false,
value: initialState.value,
};
this.handleChange = this.handleChange.bind(this);
}
addNewBuildings() {
const { isBuildingsAdded } = this.state;
this.setState({ isBuildingsAdded: !isBuildingsAdded });
}
handleChange(value) {
this.setState({
value: value,
});
}
renderNewBuildings() {
const { isBuildingsAdded } = this.state;
if (isBuildingsAdded) {
return <Option name="option 4" label="One World Trade Center" />;
}
return null;
}
render() {
return (
<div className="rainbow-m-bottom_xx-large rainbow-p-bottom_xx-large">
<GlobalHeader
className="rainbow-p-bottom_xx-large rainbow-m-bottom_xx-large"
src="images/user/user3.jpg"
>
<div className="rainbow-flex rainbow-align_right">
<Picklist
id="picklist-9"
style={containerStyles}
onChange={this.handleChange}
value={this.state.value}
label="Select Building"
hideLabel
>
<Option name="option 1" label="Experimental Building" />
{this.renderNewBuildings()}
<Option name="option 2" label="Empire State" />
<Option name="option 3" label="Central Park" />
</Picklist>
</div>
<ButtonIcon
id="button-icon_add-new-buildings"
className="rainbow-m-left_small"
onClick={() => this.addNewBuildings()}
variant="border"
icon={<FontAwesomeIcon icon={faDownload} />}
/>
</GlobalHeader>
</div>
);
}
}
<PicklistExample />;
```
##### Picklist error
```js
import React from 'react';
import { Picklist, Option } from 'react-rainbow-components';
const containerStyles = {
maxWidth: 700,
};
initialState = { value: { name: 'option 3', label: 'Central Park' } };
<Picklist
id="picklist-11"
style={containerStyles}
onChange={(value) => setState({ value })}
value={state.value}
required
error="This Field is Required"
label="Select Building"
className="rainbow-m-vertical_x-large rainbow-p-horizontal_medium rainbow-m_auto"
>
<Option name="header" label="Your Buildings" variant="header" />
<Option name="option 1" label="Experimental Building" />
<Option name="option 2" label="Empire State" />
<Option name="option 3" label="Central Park" />
</Picklist>;
```
##### Picklist shaded with neutral GlobalHeader
```js
import React from 'react';
import { Picklist, Option } from 'react-rainbow-components';
const containerStyles = {
width: '200px',
};
initialState = { value: { name: 'option 3', label: 'Central Park' } };
<div className="rainbow-m-bottom_xx-large rainbow-p-bottom_xx-large">
<GlobalHeader
src="images/user/user2.jpg"
className="rainbow-p-bottom_xx-large rainbow-m-bottom_xx-large"
variant="neutral"
>
<div className="rainbow-flex rainbow-align_right">
<Picklist
id="picklist-13"
style={containerStyles}
onChange={(value) => setState({ value })}
value={state.value}
label="Select Building"
hideLabel
variant="shaded"
>
<Option name="header" label="Your Buildings" variant="header" />
<Option name="option 1" label="Experimental Building" />
<Option name="option 2" label="Empire State" />
<Option name="option 3" label="Central Park" />
</Picklist>
</div>
</GlobalHeader>
</div>;
```
| 32.021875 | 89 | 0.552454 | eng_Latn | 0.349926 |
d5f3d3d076a86f2d766ea509d6eadf3e787ade46 | 3,507 | md | Markdown | docfx_project/articles/getting_started/unity3d.md | Azyyyyyy/discord-rpc-csharp | 2b6be4749f75fce930e2835237c5d4b25905f98f | [
"MIT"
] | 451 | 2018-02-01T10:29:39.000Z | 2022-03-25T13:16:16.000Z | docfx_project/articles/getting_started/unity3d.md | Azyyyyyy/discord-rpc-csharp | 2b6be4749f75fce930e2835237c5d4b25905f98f | [
"MIT"
] | 159 | 2018-03-18T22:09:48.000Z | 2022-03-29T17:15:59.000Z | docfx_project/articles/getting_started/unity3d.md | Azyyyyyy/discord-rpc-csharp | 2b6be4749f75fce930e2835237c5d4b25905f98f | [
"MIT"
] | 125 | 2018-01-19T18:36:22.000Z | 2022-03-05T17:57:37.000Z | # Unity3D
This library has full Unity3D integration and custom editor scripts to enhance your usage of the library. Note, however, that Unity3D imposes some technical limitations:
* .NET 2.0+ is required (no subset).
* Newtonsoft.Json is required.
* Native Named Pipe Wrapper is required.
Luckily the provided Unity Package handles all this for you.
## Download
Use the automatically built `.UnityPackage` that can be found in the artifacts of the AppVeyor build. This contains the extra dependencies for the platform and full editor support to make the library easier to use.
[Download Package](https://ci.appveyor.com/project/Lachee/discord-rpc-csharp/build/artifacts) and import into your project.
[](https://ci.appveyor.com/project/Lachee/discord-rpc-csharp/build/artifacts)
## Importing
Import the Unity package normally and make sure all content is selected. Once imported, you may get the following warning. This library does not support the .NET 2.0 **Subset** and requires the full .NET 2.0 or greater. Proceeding with `Yes` will convert the project automatically to .NET 2.0.

## Creating a Manager
The Discord Manager is a wrapper class around the DiscordRpcClient. It will handle the initialization, deinitialization and event invoking for you automatically.
Create a new Discord Manager in your very first loaded scene by following `GameObject -> Discord Manager`.

### Discord Manager Inspector
Once created, a new object will appear in your scene. You can _only_ have 1 Discord Manager at a time, and any extras will automatically be deleted. The manager has some default values, but will need to be configured for your application.
| Property | Description |
|----------|-------------|
| Application ID | The Client ID of your Discord App created in the [Developer Portal](https://discordapp.com/developers/applications/). |
| Steam ID | A optional Steam App ID of your game. This will let Discord launch your game through the steam client instead of directly when using the [Join & Spectate](/join_spectate/intro.md) |
| Target Pipe | The pipe your Discord Client is running on. If you have 2 clients running for testing purposes, you can switch which client the game connects too. |
| Log Level | The level of logging to receive from the DiscordRpcClient. |
| Register Uri Scheme | Registers a custom URI scheme so Discord can launch your game. Only required if using the [Join & Spectate](/join_spectate/intro.md) feature. |
| Active | If enabled, the Discord Manager will create a connection and maintain it. |
| **State** | The current state of the connected client. These values are generally `Read Only` |

## Usage
Setting Rich Presence is done via your game code. It is up to you how you implement it, but here is an example from the Survival Shooter sample by Unity3D:
```cs
public void UpdatePresence()
{
presence.state = "Score: " + CompleteProject.ScoreManager.score;
presence.largeAsset = new DiscordAsset()
{
image = health.isDead ? "dead" : "alive",
tooltip = health.currentHealth + "HP"
};
DiscordManager.current.SetPresence(presence);
}
```
## Further Reading
If you wish to implement the Join and Spectate feature within your project (those buttons), please read [Joining & Spectating Introduction](../join_spectate/intro.md) to get started.
| 50.1 | 292 | 0.763616 | eng_Latn | 0.991523 |
d5f42933f6065fa245f6fe9d552558d53e38d7f3 | 977 | md | Markdown | r/redux-devtools-extension/readme.md | ScalablyTyped/SlinkyTyped | abb05700fe72d527728a9c735192f4c156bd9be1 | [
"MIT"
] | 14 | 2020-01-09T02:36:33.000Z | 2021-09-05T13:40:52.000Z | r/redux-devtools-extension/readme.md | oyvindberg/SlinkyTyped | abb05700fe72d527728a9c735192f4c156bd9be1 | [
"MIT"
] | 1 | 2021-07-31T20:24:00.000Z | 2021-08-01T07:43:35.000Z | r/redux-devtools-extension/readme.md | oyvindberg/SlinkyTyped | abb05700fe72d527728a9c735192f4c156bd9be1 | [
"MIT"
] | 4 | 2020-03-12T14:08:42.000Z | 2021-08-12T19:08:49.000Z |
# Scala.js typings for redux-devtools-extension
Typings are for version 2.13.8
## Library description:
Wrappers for Redux DevTools Extension.
| | |
| ------------------ | :-------------: |
| Full name | redux-devtools-extension |
| Keywords | - |
| # releases | 4 |
| # dependents | 470 |
| # downloads | 33280671 |
| # stars | 12 |
## Links
- [Homepage](https://github.com/zalmoxisus/redux-devtools-extension)
- [Bugs](https://github.com/zalmoxisus/redux-devtools-extension/issues)
- [Repository](https://github.com/zalmoxisus/redux-devtools-extension)
- [Npm](https://www.npmjs.com/package/redux-devtools-extension)
## Note
This library has been generated from typescript code from first party type definitions.
Provided with :purple_heart: from [ScalablyTyped](https://github.com/oyvindberg/ScalablyTyped)
## Usage
See [the main readme](../../readme.md) for instructions.
| 27.914286 | 94 | 0.631525 | eng_Latn | 0.432742 |
d5f52e6757d2dd3c6418ac403e94b71015edf010 | 4,805 | md | Markdown | README.md | DependableSystemsLab/ThingsJS | abd5e5b631cd73ea81f528e2e19fb189b9c89f6d | [
"MIT"
] | 13 | 2018-10-13T02:54:22.000Z | 2022-02-25T09:27:52.000Z | README.md | DependableSystemsLab/ThingsJS | abd5e5b631cd73ea81f528e2e19fb189b9c89f6d | [
"MIT"
] | 24 | 2018-09-17T17:19:48.000Z | 2021-05-28T23:32:29.000Z | README.md | karthikp-ubc/ThingsJS | abd5e5b631cd73ea81f528e2e19fb189b9c89f6d | [
"MIT"
] | 5 | 2018-01-06T04:48:23.000Z | 2018-08-06T21:41:56.000Z | # ThingsJS
ThingsJS is a framework for running JavaScript applications on IoT devices such as Raspberry PIs
* NOTE: This repository is currently under active development and its contents are subject to breaking changes.
## Directory Structure
```
/bin
/docs
/lib
/core
/util
/util
/dashboard
/samples
```
This repository is organized thus:
1. [`bin`](bin/) contains the `things-js` CLI (shell script) that is installed with the package. Also contains default config files.
2. [`docs`](docs/) contains the API documentations generated by JSDoc.
3. [`lib`](lib/) contains the core ThingsJS code
1. [`core`](lib/core/) contains the main ThingsJS objects such as `Code`, `CodeEngine`, `Pubsub`, `Dispatcher`.
2. [`util`](lib/util/) contains general-purpose utility modules used in the system.
4. [`util`](util/) contains supplementary apps and debug tools
1. [`dashboard`](lib/dashboard/) contains the ThingsJS Dashboard application; it is an `express` web-application
5. [`samples`](samples/) contains raw JavaScript sample code (non-instrumented) that can be dispatched to ThingsJS workers.
## Dependencies
* The ThingsJS framework uses MQTT Pub/Sub as its main communication mechanism and assumes the existence of an active MQTT service. Either **Mosquitto** or **Mosca** will do. ThingsJS provides a basic Mosca server that can be started with the `things-js pubsub` command. The Pub/Sub service is referenced only by its URL (e.g. `mqtt://localhost`) within the ThingsJS framework, and ThingsJS does not assume any specific implementation of the service.
```
~$ things-js pubsub
```
* For running the Dashboard web-application, **MongoDB** is required as the Dashboard uses it to store the end-user code.
```
~$ service mongod start
```
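Conceptually, the Pub/Sub dependency gives every worker a topic-based message bus. The snippet below is a tiny in-memory sketch of that pattern in plain Node.js — an illustration only, not the ThingsJS or MQTT API (the class and method names are assumptions):

```javascript
// Minimal topic-based publish/subscribe, the pattern an MQTT broker
// provides to ThingsJS workers. Illustration only -- not the real API.
class TinyPubSub {
  constructor() {
    this.topics = new Map(); // topic -> list of subscriber handlers
  }
  subscribe(topic, handler) {
    if (!this.topics.has(topic)) this.topics.set(topic, []);
    this.topics.get(topic).push(handler);
  }
  publish(topic, message) {
    (this.topics.get(topic) || []).forEach((handler) => handler(message));
  }
}

const bus = new TinyPubSub();
bus.subscribe('node_00/cmd', (msg) => console.log('node_00 got:', msg));
bus.publish('node_00/cmd', 'report-status'); // prints "node_00 got: report-status"
```

In a real deployment the broker runs as a separate process (for example `things-js pubsub`) and each worker connects to it over its configured Pub/Sub URL.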
## Getting Started
For trying out the framework, you can follow the steps below:
### Installation
#### Option 1
1. `git clone` this repository
```
~$ git clone https://github.com/DependableSystemsLab/ThingsJS.git
```
2. `npm install -g` to install the package. (download all the npm dependencies and link the CLI tool)
You may need to put `sudo` depending on your installation of NodeJS.
You need the `-g` (global installation) option for using the CLI. If you don't plan on using the CLI, you can omit the `-g` option.
```
~$ cd ThingsJS
~/ThingsJS$ npm install -g
```
#### Option 2
1. Install via npm
```
~$ sudo npm install -g things-js
```
You may omit the `sudo` depending on your NodeJS install settings.
#### Using the CLI
Along with the API provided, CLI is included for easy use.
Here are some of the commands:
* things-js pubsub
* things-js dashboard
* things-js worker {config}
* things-js instrument {code}
You can find the full list of commands [here](https://dependablesystemslab.github.io/ThingsJS/api/CLI.html).
1. To start the Dashboard Application:
```
~$ things-js dash
#OR
~$ things-js dashboard
```
By default it connects to MQTT at `localhost:1883`, MongoDB at `localhost:27017/things-js-fs`, and listens on `localhost:3000`.
To start the dashboard with a different configuration, you can use the `-c` or `--config` options with the config file path provided as an argument.
e.g.
```
~$ things-js dash -c my_config.conf
```
This will start a web-application served at the specified port.
You can watch the demo of the Dashboard here:
<a href="http://ece.ubc.ca/~kumseok/assets/ThingsMigrate_201811.mp4" target="_blank"><img src="http://ece.ubc.ca/~kumseok/assets/ThingsJS_Migration.png"
alt="Demo Screenshot" width="427" height="240" border="10" /><p>Click Image to see Demo Video</p></a>
2. Running a ThingsJS worker:
To start a ThingsJS worker, first you need to create a directory that will provide the NodeJS environment. This is because the worker needs to have a reference to the `things-js` module and any other npm modules that a ThingsJS user (developer) may require. If the worker cannot find a link to a `node_modules` directory, it will throw an error.
```
~$ mkdir hello_things
~$ cd hello_things
~$ npm link things-js
#create a config file for the worker first (e.g. node_00.conf)
~/hello_things$ things-js worker node_00.conf
```
The configuration file is a required argument for starting the worker. It should contain the following information:
```
{
"pubsub_url": "mqtt://localhost",
"id": "node_00",
"device": "raspberry-pi3"
}
```
3. To instrument raw JavaScript code into a "ThingJS-compatible" code:
```
~$ things-js inst my_code.js
#OR
~$ things-js instrument my_code.js
```
By default the output file will have the same file name, with the extension `.things.js`.
To specify the output file name, provide the optional argument with `-o` or `--output`.
e.g.
```
~$ things-js inst my_code.js -o my_code.instrumented.js
```
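The default output naming described above amounts to swapping the `.js` suffix for `.things.js`. The helper below is a hypothetical illustration of that convention (it is not part of the ThingsJS CLI):

```javascript
// Hypothetical helper mirroring the CLI's default output naming:
// my_code.js -> my_code.things.js
function defaultOutputName(inputPath) {
  return inputPath.replace(/\.js$/, '.things.js');
}

console.log(defaultOutputName('my_code.js')); // my_code.things.js
```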
## License
MIT
@zenox/rcfile
==
[](https://travis-ci.org/cuarti/zenox-env)
[](https://www.npmjs.com/package/@zenox/env)
[](https://github.com/cuarti/zenox-env/blob/master/LICENSE)
## Description
Manage environment variables and load them from a file. Part of the Zenox framework.
#### File example
```bash
# Comment
NAME=rcfile
VERSION=1
WORKS=true
```
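As a mental model, a file in this format reduces to key/value pairs once comments and blank lines are skipped. The parser below is a self-contained sketch of that idea — an illustration, not @zenox/env's actual implementation:

```javascript
// Sketch of parsing an .envrc-style file into key/value pairs.
// Comments (#) and blank lines are skipped; values keep any '=' signs.
function parseEnvFile(contents) {
  const vars = {};
  for (const line of contents.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // comment or blank
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue; // not a KEY=VALUE line
    vars[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return vars;
}

const parsed = parseEnvFile('# Comment\nNAME=rcfile\nVERSION=1\nWORKS=true\n');
console.log(parsed); // { NAME: 'rcfile', VERSION: '1', WORKS: 'true' }
```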
## Installation
The module does not depend on the Zenox framework and can be used in any NodeJS program.
```bash
npm install --save @zenox/env
```
## Documentation
*All the examples are in ES6/ES2015 JavaScript and TypeScript.
To use all the features of TypeScript, you can use the generic
parameter on the load function.*
### get
Get value from environment variables.
#### Typescript definition
```typescript
declare function get(name: string): any;
```
#### Example
```typescript
import {get} from '@zenox/env';
let data = get('EXAMPLE');
```
### getString
Get value as string from environment variables.
#### Typescript definition
```typescript
declare function getString(name: string): string;
```
#### Example
```typescript
import {getString} from '@zenox/env';
let data = getString('EXAMPLE');
```
### getInt
Get value as number (integer) from environment variables.
#### Typescript definition
```typescript
declare function getInt(name: string): number;
```
#### Example
```typescript
import {getInt} from '@zenox/env';
let data = getInt('EXAMPLE');
```
### getFloat
Get value as number (float) from environment variables.
#### Typescript definition
```typescript
declare function getFloat(name: string): number;
```
#### Example
```typescript
import {getFloat} from '@zenox/env';
let data = getFloat('EXAMPLE');
```
### getBoolean
Get value as boolean from environment variables.
#### Typescript definition
```typescript
declare function getBoolean(name: string): boolean;
```
#### Example
```typescript
import {getBoolean} from '@zenox/env';
let data = getBoolean('EXAMPLE');
```
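The typed getters above differ only in how they coerce the underlying string value. The snippet below sketches that coercion in plain JavaScript over an ordinary object — an illustration of the semantics, not the library's code (behavior for malformed values is an assumption):

```javascript
// Sketch of typed coercion over an env-like object. Assumed semantics:
// getInt/getFloat parse numbers, getBoolean matches the string 'true'.
const fakeEnv = { VERSION: '1', RATIO: '2.5', WORKS: 'true', NAME: 'rcfile' };

const getString = (name) => fakeEnv[name];
const getInt = (name) => parseInt(fakeEnv[name], 10);
const getFloat = (name) => parseFloat(fakeEnv[name]);
const getBoolean = (name) => fakeEnv[name] === 'true';

console.log(getInt('VERSION') + 1); // 2
console.log(getFloat('RATIO') * 2); // 5
console.log(getBoolean('WORKS')); // true
```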
### exists
Check whether an environment variable exists.
#### Typescript definition
```typescript
declare function exists(name: string): boolean;
```
#### Example
```typescript
import {exists} from '@zenox/env';
const found = exists('EXAMPLE');
```
### existsString
Check whether an environment variable exists and is a string.
#### Typescript definition
```typescript
declare function existsString(name: string): boolean;
```
#### Example
```typescript
import {existsString} from '@zenox/env';
let exists = existsString('EXAMPLE');
```
### existsInt
Check whether an environment variable exists and is a number (integer).
#### Typescript definition
```typescript
declare function existsInt(name: string): boolean;
```
#### Example
```typescript
import {existsInt} from '@zenox/env';
let exists = existsInt('EXAMPLE');
```
### existsFloat
Check whether an environment variable exists and is a number (float).
#### Typescript definition
```typescript
declare function existsFloat(name: string): boolean;
```
#### Example
```typescript
import {existsFloat} from '@zenox/env';
let exists = existsFloat('EXAMPLE');
```
### existsBoolean
Check whether an environment variable exists and is a boolean.
#### Typescript definition
```typescript
declare function existsBoolean(name: string): boolean;
```
#### Example
```typescript
import {existsBoolean} from '@zenox/env';
let exists = existsBoolean('EXAMPLE');
```
### add
Add value to environment variables.
#### Typescript definition
```typescript
declare function add(name: string, value: any): void;
```
#### Example
```typescript
import {add} from '@zenox/env';
add('EXAMPLE', 'FOO');
```
### remove
Remove value from environment variables.
#### Typescript definition
```typescript
declare function remove(name: string): void;
```
#### Example
```typescript
import {remove} from '@zenox/env';
remove('EXAMPLE');
```
### load
Load file and set values to environment variables.
On import, the package tries to load the .envrc file.
#### Typescript definition
```typescript
declare function load(file: string = '.envrc', encoding: string = 'utf8'): Promise<void>;
```
#### Example
```typescript
import {load} from '@zenox/env';
load('.examplerc', 'utf8').then(() => {
...
});
```
## License
The source code of this repository is released under the MIT License.
<img src="/logo-cropped.png" width="450px" title="Typed.js" />
========
[View the live demo](http://www.mattboldt.com/demos/typed-js/) | [Go to my site, mattboldt.com](http://www.mattboldt.com)
Typed.js is a library that types. Enter in any string, and watch it type at the speed you've set, backspace what it's typed, and begin a new sentence for however many strings you've set.
Looking for some custom use cases for Typed.js? [Check out the wiki](https://github.com/mattboldt/typed.js/wiki)
---
Installation
------------
This is really all you need to get going.
~~~ javascript
<script src="typed.js"></script>
<script>
document.addEventListener("DOMContentLoaded", function(){
Typed.new(".element", {
strings: ["First sentence.", "Second sentence."],
typeSpeed: 0
});
});
</script>
...
<span class="element"></span>
~~~
Or you can use jQuery:
~~~ javascript
<script src="jquery.js"></script>
<script src="typed.js"></script>
<script>
$(function(){
$(".element").typed({
strings: ["First sentence.", "Second sentence."],
typeSpeed: 0
});
});
</script>
...
<span class="element"></span>
~~~
### Install with Bower
~~~
bower install typed.js
~~~
Want the animated blinking cursor? Add this CSS.
~~~ scss
.typed-cursor{
opacity: 1;
-webkit-animation: blink 0.7s infinite;
-moz-animation: blink 0.7s infinite;
animation: blink 0.7s infinite;
}
@keyframes blink{
0% { opacity:1; }
50% { opacity:0; }
100% { opacity:1; }
}
@-webkit-keyframes blink{
0% { opacity:1; }
50% { opacity:0; }
100% { opacity:1; }
}
@-moz-keyframes blink{
0% { opacity:1; }
50% { opacity:0; }
100% { opacity:1; }
}
~~~
Wonderful sites using Typed.js
---
https://slack.com/
https://envato.com/
https://productmap.co/
https://www.typed.com/
https://git.market/
http://allison.house/404
http://www.maxcdn.com/
https://commando.io/
http://testdouble.com/agency.html
http://www.stephanemartinw.com/
http://www.trelab.fi/en/
http://jessejohnson.github.io/
---
### HTML tags
By default the content type is set to `html`, so you're good to go. Want to type out the html regularly? Set it to `text`.
~~~ javascript
Typed.new(".element", {
strings: ["Typed.js is an <strong>Awesome</strong> library."],
contentType: 'html' // or 'text'
});
~~~
### Strings from static HTML (SEO Friendly)
Rather than using the `strings` array to insert strings, you can place an HTML `div` on the page and read from it.
This allows bots and search engines, as well as users with JavaScript disabled, to see your text on the page.
~~~ javascript
<script>
document.addEventListener('DOMContentLoaded', function(){
Typed.new('#typed', {
stringsElement: document.getElementById('typed-strings')
});
});
</script>
~~~
You must wrap each string inside the `typed-strings` div in a `<p>` tag:
~~~ html
<div id="typed-strings">
<p>Typed.js is an <strong>Awesome</strong> library.</p>
<p>It <em>types</em> out sentences.</p>
</div>
<span id="typed"></span>
~~~
### Line Breaks
#### `contentType: 'html'`
~~~ javascript
Typed.new(".typed", { strings: ["Sentence with <br>line break."] });
~~~
#### `contentType: 'text'`
Use `white-space: pre` in your typed text element, and then `\n` when typing out the strings. Example:
~~~ html
<span id="typed" style="white-space:pre"></span>
...
Typed.new(".element", { strings: ["Sentence with a\nline break."] });
~~~
### Type Pausing
You can pause in the middle of a string for a given amount of time by including an escape character.
~~~ javascript
Typed.new(".element", {
// Waits 1000ms after typing "First"
strings: ["First ^1000 sentence.", "Second sentence."]
});
~~~
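Under the hood, a pause escape like `^1000` has to be split out of the string before typing begins. The function below is a self-contained sketch of that tokenizing step (the function name and token shape are assumptions, not Typed.js internals):

```javascript
// Split a Typed.js-style string into text and pause tokens.
// '^1000' becomes a 1000ms pause between the surrounding text parts.
function parsePauses(str) {
  const tokens = [];
  const re = /\^(\d+)/g;
  let last = 0;
  let m;
  while ((m = re.exec(str)) !== null) {
    if (m.index > last) tokens.push({ type: 'text', value: str.slice(last, m.index) });
    tokens.push({ type: 'pause', ms: parseInt(m[1], 10) });
    last = re.lastIndex;
  }
  if (last < str.length) tokens.push({ type: 'text', value: str.slice(last) });
  return tokens;
}

console.log(parsePauses('First ^1000 sentence.'));
// [ { type: 'text', value: 'First ' },
//   { type: 'pause', ms: 1000 },
//   { type: 'text', value: ' sentence.' } ]
```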
Customization
----
~~~ javascript
Typed.new(".element", {
strings: ["First sentence.", "Second sentence."],
// Optionally use an HTML element to grab strings from (must wrap each string in a <p>)
stringsElement: null,
// typing speed
typeSpeed: 0,
// time before typing starts
startDelay: 0,
// backspacing speed
backSpeed: 0,
// shuffle the strings
shuffle: false,
// time before backspacing
backDelay: 500,
// loop
loop: false,
// false = infinite
loopCount: false,
// show cursor
showCursor: true,
// character for cursor
cursorChar: "|",
// attribute to type (null == text)
attr: null,
// either html or text
contentType: 'html',
// call when done callback function
callback: function() {},
// starting callback function before each string
preStringTyped: function() {},
//callback for every typed string
onStringTyped: function() {},
// callback for reset
resetCallback: function() {}
});
~~~
### Get Super Custom
Want to get really custom? On my site and in the Typed.js demo I have the code type out two words, and then backspace only those two, then continue where it left off. This is done in an `if` statement in the `backspace()` function. Here's what it looks like.
~~~ javascript
...
, backspace: function(curString, curStrPos){
...
setTimeout(function() {
// check string array position
// on the first string, only delete one word
// the stopNum actually represents the amount of chars to
// keep in the current string. In my case it's 3.
if (self.arrayPos == 1){
self.stopNum = 3;
}
//every other time, delete the whole typed string
else{
self.stopNum = 0;
}
...
~~~
This checks if the `arrayPos` is `1`, which would be the second string you entered. If so, it sets `stopNum` to `3` instead of `0`, which tells it to stop when there are 3 characters left. For now you'll have to create custom `if` statements for each specific case you want. I may automate this somehow in the future.
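In other words, `stopNum` means "how many characters to keep": backspacing stops once the string is down to that length. Here is a standalone sketch of that behavior (function name assumed; this is not Typed.js code):

```javascript
// Produce the intermediate strings a backspace animation would show,
// stopping when stopNum characters remain. Illustration only.
function backspaceFrames(str, stopNum) {
  const frames = [];
  for (let len = str.length; len >= stopNum; len--) {
    frames.push(str.slice(0, len));
  }
  return frames;
}

console.log(backspaceFrames('abcde', 3)); // [ 'abcde', 'abcd', 'abc' ]
```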
## Development
`npm install`
Then, once you've made your edits:
`gulp`
This will create a minified version in `/dist`
end
---
Thanks for checking this out. If you have any questions, I'll be on [Twitter](http://www.twitter.com/atmattb).
If you're using this, let me know! I'd love to see it.
It would also be great if you mentioned me or my website somewhere. [www.mattboldt.com](http://www.mattboldt.com)
---
title: On Managed Environments
published_at: 2017-05-11T18:46:36Z
hook: TODO
---
At Stripe we distribute Puppeted laptops that let every
new developer get a running start with Ruby, Rbenv (and
its various plugins), Bundler, GPG, and many other tools
pre-installed. This has a few nice characteristics in that
it's often enough to get almost every new employee to make
a successful deployment on their first or second day.
There are similar ideas with Docker: instead of setting up
a working local environment, write a `docker-compose.yml`
and let the pieces slide together.
## Self-help not endorsed (#self-help)
In my experience, these managed systems can be fine for
days or weeks, but eventually the user will hit a problem,
and when they do, no self-help is possible. The combination
of not having had a hand in building their own environment,
and the complexity introduced by the managed configuration,
means that the only route to a solution is to get direct
help from a maintainer of the scheme. This will happen
again and again, because with the crutch to lean on, at no
point does the user come to understand it any better.
## The zen of self-assembly (#self-assembly)
Contrast this to self-assembly. It's painful for a day or
two, but the user sees every layer go in. Even if they
don't have a perfect understanding as they're doing it,
they'll have some memory that will come in handy for
troubleshooting.
That's not to say that you should go out and build your own
operating system or car. There is a point where the
complexity is so considerable, and the walls of the black
box sufficiently opaque, that it's appropriate to seal it
off and accept the cost of going to see the mechanic a few
times a year.
Development environments don't fit in this category.
They're lighter weight for one, but also so dynamic
(changes to code, runtimes, and libraries are happening on
a daily basis) that they're more prone to problems. In
their case, the extra pain is well worth the insight that
it buys.
---
title: Indexing tables
description: Recommendations and examples for indexing tables in a dedicated SQL pool.
services: synapse-analytics
author: XiaoyuMSFT
manager: craigg
ms.service: synapse-analytics
ms.topic: conceptual
ms.subservice: sql-dw
ms.date: 03/18/2019
ms.author: xiaoyul
ms.reviewer: igorstan
ms.custom: seo-lt-2019, azure-synapse
ms.openlocfilehash: fea314d595fb39a1e35dec8ab24533ad4b893f98
ms.sourcegitcommit: 6a350f39e2f04500ecb7235f5d88682eb4910ae8
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 12/01/2020
ms.locfileid: "96448068"
---
# <a name="indexing-dedicated-sql-pool-tables-in-azure-synapse-analytics"></a>Indexing dedicated SQL pool tables in Azure Synapse Analytics
Recommendations and examples for indexing tables in a dedicated SQL pool.
## <a name="index-types"></a>Index types
A dedicated SQL pool offers several indexing options, including [clustered columnstore indexes](/sql/relational-databases/indexes/columnstore-indexes-overview?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest), [clustered indexes and nonclustered indexes](/sql/relational-databases/indexes/clustered-and-nonclustered-indexes-described?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest), and a non-index option also known as [heap](/sql/relational-databases/indexes/heaps-tables-without-clustered-indexes?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest).
To create a table with an index, see the [CREATE TABLE (dedicated SQL pool)](/sql/t-sql/statements/create-table-azure-sql-data-warehouse?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest) documentation.
## <a name="clustered-columnstore-indexes"></a>Clustered columnstore indexes
By default, a dedicated SQL pool creates a clustered columnstore index when no index options are specified on a table. Clustered columnstore tables offer both the highest level of data compression and the best overall query performance. They will generally outperform clustered index or heap tables and are usually the best choice for large tables. For these reasons, clustered columnstore is the best place to start when you are unsure how to index your table.
To create a clustered columnstore table, simply specify CLUSTERED COLUMNSTORE INDEX in the WITH clause, or leave the WITH clause off:
```SQL
CREATE TABLE myTable
(
id int NOT NULL,
lastName varchar(20),
zipCode varchar(6)
)
WITH ( CLUSTERED COLUMNSTORE INDEX );
```
There are a few scenarios where clustered columnstore may not be a good option:
- Columnstore tables do not support varchar(max), nvarchar(max), and varbinary(max). Consider heap or clustered index instead.
- Columnstore tables may be less efficient for transient data. Consider heap and perhaps even temporary tables.
- Small tables with fewer than 60 million rows. Consider heap tables.
## <a name="heap-tables"></a>Heap tables
When you are temporarily landing data in a dedicated SQL pool, you may find that using a heap table makes the overall process faster. This is because loads to heaps are faster than to index tables, and in some cases the subsequent read can be done from cache. If you are loading data only to stage it before running more transformations, loading the table to a heap table is much faster than loading the data to a clustered columnstore table. In addition, loading data to a [temporary table](sql-data-warehouse-tables-temporary.md) is faster than loading a table to permanent storage. After data loading is complete, you can create indexes in the table for faster query performance.
Clustered columnstore tables begin to achieve optimal compression once there are more than 60 million rows. For small lookup tables with fewer than 60 million rows, consider using a HEAP or clustered index for faster query performance.
To create a heap table, simply specify HEAP in the WITH clause:
```SQL
CREATE TABLE myTable
(
id int NOT NULL,
lastName varchar(20),
zipCode varchar(6)
)
WITH ( HEAP );
```
## <a name="clustered-and-nonclustered-indexes"></a>Clustered and nonclustered indexes
Clustered indexes may outperform clustered columnstore tables when a single row needs to be quickly retrieved. For queries where a single row, or very few rows, must be looked up with extreme speed, consider a clustered index or a nonclustered secondary index. The disadvantage of using a clustered index is that only queries using a highly selective filter on the clustered index column benefit from it. To improve filtering on other columns, a nonclustered index can be added to those columns. However, each index added to a table adds both space and processing time to loads.
To create a clustered index table, simply specify CLUSTERED INDEX in the WITH clause:
```SQL
CREATE TABLE myTable
(
id int NOT NULL,
lastName varchar(20),
zipCode varchar(6)
)
WITH ( CLUSTERED INDEX (id) );
```
To add a nonclustered index on a table, use the following syntax:
```SQL
CREATE INDEX zipCodeIndex ON myTable (zipCode);
```
## <a name="optimizing-clustered-columnstore-indexes"></a>Optimizing clustered columnstore indexes
Clustered columnstore tables organize data into segments. Having high segment quality is critical to achieving optimal query performance on a columnstore table. Segment quality can be measured by the number of rows in a compressed row group. Segment quality is best when there are at least 100K rows per compressed row group, and performance improves as the number of rows per row group approaches 1,048,576 rows, which is the most rows a row group can contain.
The view below can be created and used on your system to compute the average rows per row group and identify any sub-optimal clustered columnstore indexes. The last column of this view generates a SQL statement that can be used to rebuild your indexes.
```sql
CREATE VIEW dbo.vColumnstoreDensity
AS
SELECT
GETDATE() AS [execution_date]
, DB_Name() AS [database_name]
, s.name AS [schema_name]
, t.name AS [table_name]
, COUNT(DISTINCT rg.[partition_number]) AS [table_partition_count]
, SUM(rg.[total_rows]) AS [row_count_total]
, SUM(rg.[total_rows])/COUNT(DISTINCT rg.[distribution_id]) AS [row_count_per_distribution_MAX]
, CEILING ((SUM(rg.[total_rows])*1.0/COUNT(DISTINCT rg.[distribution_id]))/1048576) AS [rowgroup_per_distribution_MAX]
, SUM(CASE WHEN rg.[State] = 0 THEN 1 ELSE 0 END) AS [INVISIBLE_rowgroup_count]
, SUM(CASE WHEN rg.[State] = 0 THEN rg.[total_rows] ELSE 0 END) AS [INVISIBLE_rowgroup_rows]
, MIN(CASE WHEN rg.[State] = 0 THEN rg.[total_rows] ELSE NULL END) AS [INVISIBLE_rowgroup_rows_MIN]
, MAX(CASE WHEN rg.[State] = 0 THEN rg.[total_rows] ELSE NULL END) AS [INVISIBLE_rowgroup_rows_MAX]
, AVG(CASE WHEN rg.[State] = 0 THEN rg.[total_rows] ELSE NULL END) AS [INVISIBLE_rowgroup_rows_AVG]
, SUM(CASE WHEN rg.[State] = 1 THEN 1 ELSE 0 END) AS [OPEN_rowgroup_count]
, SUM(CASE WHEN rg.[State] = 1 THEN rg.[total_rows] ELSE 0 END) AS [OPEN_rowgroup_rows]
, MIN(CASE WHEN rg.[State] = 1 THEN rg.[total_rows] ELSE NULL END) AS [OPEN_rowgroup_rows_MIN]
, MAX(CASE WHEN rg.[State] = 1 THEN rg.[total_rows] ELSE NULL END) AS [OPEN_rowgroup_rows_MAX]
, AVG(CASE WHEN rg.[State] = 1 THEN rg.[total_rows] ELSE NULL END) AS [OPEN_rowgroup_rows_AVG]
, SUM(CASE WHEN rg.[State] = 2 THEN 1 ELSE 0 END) AS [CLOSED_rowgroup_count]
, SUM(CASE WHEN rg.[State] = 2 THEN rg.[total_rows] ELSE 0 END) AS [CLOSED_rowgroup_rows]
, MIN(CASE WHEN rg.[State] = 2 THEN rg.[total_rows] ELSE NULL END) AS [CLOSED_rowgroup_rows_MIN]
, MAX(CASE WHEN rg.[State] = 2 THEN rg.[total_rows] ELSE NULL END) AS [CLOSED_rowgroup_rows_MAX]
, AVG(CASE WHEN rg.[State] = 2 THEN rg.[total_rows] ELSE NULL END) AS [CLOSED_rowgroup_rows_AVG]
, SUM(CASE WHEN rg.[State] = 3 THEN 1 ELSE 0 END) AS [COMPRESSED_rowgroup_count]
, SUM(CASE WHEN rg.[State] = 3 THEN rg.[total_rows] ELSE 0 END) AS [COMPRESSED_rowgroup_rows]
, SUM(CASE WHEN rg.[State] = 3 THEN rg.[deleted_rows] ELSE 0 END) AS [COMPRESSED_rowgroup_rows_DELETED]
, MIN(CASE WHEN rg.[State] = 3 THEN rg.[total_rows] ELSE NULL END) AS [COMPRESSED_rowgroup_rows_MIN]
, MAX(CASE WHEN rg.[State] = 3 THEN rg.[total_rows] ELSE NULL END) AS [COMPRESSED_rowgroup_rows_MAX]
, AVG(CASE WHEN rg.[State] = 3 THEN rg.[total_rows] ELSE NULL END) AS [COMPRESSED_rowgroup_rows_AVG]
, 'ALTER INDEX ALL ON ' + s.name + '.' + t.NAME + ' REBUILD;' AS [Rebuild_Index_SQL]
FROM sys.[pdw_nodes_column_store_row_groups] rg
JOIN sys.[pdw_nodes_tables] nt ON rg.[object_id] = nt.[object_id]
AND rg.[pdw_node_id] = nt.[pdw_node_id]
AND rg.[distribution_id] = nt.[distribution_id]
JOIN sys.[pdw_table_mappings] mp ON nt.[name] = mp.[physical_name]
JOIN sys.[tables] t ON mp.[object_id] = t.[object_id]
JOIN sys.[schemas] s ON t.[schema_id] = s.[schema_id]
GROUP BY
s.[name]
, t.[name]
;
```
Now that you have created the view, run this query to identify tables with row groups containing fewer than 100K rows. You may want to raise the 100K threshold if you are looking for more optimal segment quality.
```sql
SELECT *
FROM [dbo].[vColumnstoreDensity]
WHERE COMPRESSED_rowgroup_rows_AVG < 100000
OR INVISIBLE_rowgroup_rows_AVG < 100000
```
Une fois que vous avez exécuté la requête, vous pouvez commencer à examiner les données et analyser vos résultats. Cette table explique ce que vous devez rechercher dans votre analyse de groupe de lignes.
| Colonne | Utilisation de ces données |
| --- | --- |
| [table_partition_count] |Si la table est partitionnée, vous pouvez vous attendre à un nombre plus élevé de groupes de lignes ouverts. Chaque partition de la distribution peut, en théorie, avoir un groupe de lignes ouvert associé. Tenez-en compte dans votre analyse. Une petite table qui a été partitionnée peut être optimisée en supprimant complètement le partitionnement pour améliorer la compression. |
| [row_count_total] |Nombre total de lignes de la table. Par exemple, vous pouvez utiliser cette valeur pour calculer le pourcentage de lignes à l’état compressé. |
| [row_count_per_distribution_MAX] |Si toutes les lignes sont réparties uniformément, cette valeur est le nombre cible de lignes par distribution. Comparez cette valeur avec compressed_rowgroup_count. |
| [COMPRESSED_rowgroup_rows] |Nombre total de lignes au format columnstore pour la table |
| [COMPRESSED_rowgroup_rows_AVG] | If the average number of rows is significantly below the maximum number of rows for a rowgroup, use CTAS or ALTER INDEX REBUILD to recompress the data |
| [COMPRESSED_rowgroup_count] | Number of rowgroups in columnstore format. If this number is very high in relation to the table, it indicates that the columnstore density is low. |
| [COMPRESSED_rowgroup_rows_DELETED] | Rows are logically deleted in columnstore format. If this number is high relative to the table size, rebuild the partition or the index, since doing so removes the rows physically. |
| [COMPRESSED_rowgroup_rows_MIN] | Use this value together with the AVG and MAX columns to understand the range of row counts across the rowgroups in your columnstore. A low number above the load threshold (102,400 per partition-aligned distribution) suggests that optimizations are available in the data load |
| [COMPRESSED_rowgroup_rows_MAX] | Same as above |
| [OPEN_rowgroup_count] | Open rowgroups are normal. One open rowgroup per table distribution (60) can reasonably be expected. Excessive numbers suggest data loading across partitions. Double-check the partitioning strategy to make sure it is sound |
| [OPEN_rowgroup_rows] | Each rowgroup can contain a maximum of 1,048,576 rows. Use this value to see how full the open rowgroups currently are |
| [OPEN_rowgroup_rows_MIN] | Open groups indicate that data is either being trickle loaded into the table or that the previous load spilled its remaining rows over into this rowgroup. Use the MIN, MAX, and AVG columns to see how much data is sitting in open rowgroups. For small tables, this can be 100% of the data. In that case, run ALTER INDEX REBUILD to force the data into the columnstore. |
| [OPEN_rowgroup_rows_MAX] | Same as above |
| [OPEN_rowgroup_rows_AVG] | Same as above |
| [CLOSED_rowgroup_rows] | Look at the closed rowgroup row counts as a sanity check. |
| [CLOSED_rowgroup_count] | If any closed rowgroups exist, there should only be a small number of them. Closed rowgroups can be converted to compressed rowgroups with the ALTER INDEX ... REORGANIZE command. However, this is not normally required. Closed groups are converted to columnstore rowgroups automatically by the background "tuple mover" process. |
| [CLOSED_rowgroup_rows_MIN] | Closed rowgroups should have a very high fill rate. If the fill rate of a closed rowgroup is low, further analysis of the columnstore is required. |
| [CLOSED_rowgroup_rows_MAX] | Same as above |
| [CLOSED_rowgroup_rows_AVG] | Same as above |
| [Rebuild_Index_SQL] | SQL to rebuild a columnstore index for a table |
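As a sketch, the columns above can be scanned for problem tables with a query like the following. The view name `[dbo].[vColumnstoreDensity]` and the `[table_name]` column are assumptions for illustration (use whatever view or query produced these columns in your environment); the other column names come from the table above:

```sql
-- Hypothetical example: list tables whose compressed rowgroups are, on average,
-- less than half full. [vColumnstoreDensity] and [table_name] are assumed names.
SELECT [table_name],
       [COMPRESSED_rowgroup_count],
       [COMPRESSED_rowgroup_rows_AVG],
       [OPEN_rowgroup_count],
       [CLOSED_rowgroup_count]
FROM   [dbo].[vColumnstoreDensity]
WHERE  [COMPRESSED_rowgroup_rows_AVG] < 524288   -- half of the 1,048,576 maximum
ORDER BY [COMPRESSED_rowgroup_rows_AVG] ASC;
```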
## <a name="causes-of-poor-columnstore-index-quality"></a>Causes of poor columnstore index quality
If you have identified tables with poor segment quality, the next step is to determine the root cause. Below are some common causes of poor segment quality:
1. Memory pressure when the index was built
2. High volume of DML operations
3. Small or trickle load operations
4. Too many partitions
These factors can cause a columnstore index to have significantly fewer than the optimal 1 million rows per rowgroup. They can also cause rows to go to a delta rowgroup instead of a compressed rowgroup.
### <a name="memory-pressure-when-index-was-built"></a>Memory pressure when the index was built
The number of rows per compressed rowgroup is directly related to the width of the rows and the amount of memory available to process the rowgroup. When rows are written to columnstore tables under memory pressure, columnstore segment quality may suffer. Therefore, the best practice is to give the session that writes to your columnstore index tables access to as much memory as possible. Because there is a trade-off between memory and concurrency, guidance on the right memory allocation depends on the data in each row of your table, the data warehouse units assigned to your system, and the number of concurrency slots you can give to the session that writes data to your table.
### <a name="high-volume-of-dml-operations"></a>High volume of DML operations
A high volume of DML operations that update and delete rows can introduce inefficiency into the columnstore. This is especially true when the majority of the rows in a rowgroup are modified.
- Deleting a row from a compressed rowgroup only logically marks the row as deleted. The row remains in the compressed rowgroup until the partition or table is rebuilt.
- Inserting a row adds it to an internal rowgroup table called a delta rowgroup. The inserted row is not converted to columnstore until the delta rowgroup is full and marked as closed. Rowgroups are closed once they reach their maximum capacity of 1,048,576 rows.
- Updating a row in columnstore format is processed as a logical delete followed by an insert. The inserted row may be stored in the delta store.
Batched update and insert operations that exceed the threshold of 102,400 rows per partition-aligned distribution go directly to the columnstore format. However, assuming an even distribution, more than 6.144 million rows would have to be modified in a single operation for this to happen. If the number of rows for a given partition-aligned distribution is less than 102,400, the rows go to the delta store and stay there until one of three conditions is met: enough rows have been inserted, enough rows have been modified to close the rowgroup, or the index has been rebuilt.
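The 6.144 million row figure follows directly from the per-distribution bulk threshold. As a quick sanity check (plain T-SQL arithmetic, no tables required):

```sql
-- 102,400 rows per partition-aligned distribution x 60 distributions
SELECT 102400 * 60 AS rows_needed_per_operation;   -- 6,144,000 (6.144 million)
```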
### <a name="small-or-trickle-load-operations"></a>Small or trickle load operations
Small loads that flow into a dedicated SQL pool are sometimes known as trickle loads. They typically represent a near-constant stream of data that the system ingests. However, because this stream is nearly continuous, the volume of rows is not necessarily large. More often than not, the data is well below the threshold required for a direct load into columnstore format.
In these situations, it is often better to land the data in Azure Blob Storage first and let it accumulate before loading. This technique is often known as *micro-batching*.
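One way to load the accumulated files in a single large batch is a COPY statement. This is an illustrative sketch only; the storage account, container, and table names below are placeholders, and the options shown assume comma-separated files with a header row:

```sql
-- Illustrative only: load a batch of accumulated files in one operation so the
-- row count clears the columnstore bulk-load threshold.
-- The account, container, and table names are placeholders.
COPY INTO [dbo].[FactInternetSales]
FROM 'https://myaccount.blob.core.windows.net/staged-sales/'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW  = 2   -- skip the header row
);
```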
### <a name="too-many-partitions"></a>Too many partitions
Another thing to consider is the impact of partitioning on your clustered columnstore tables. Before any partitioning, a dedicated SQL pool already divides your data into 60 databases. Partitioning divides your data further. If you partition your data, keep in mind that **each** partition needs at least 1 million rows to benefit from a clustered columnstore index. If you partition your table into 100 partitions, the table needs at least 6 billion rows to benefit from a clustered columnstore index (60 distributions × 100 partitions × 1 million rows). If your 100-partition table does not have 6 billion rows, either reduce the number of partitions or consider using a heap table instead.
Once your tables have been loaded with data, follow the steps below to identify and rebuild tables with suboptimal clustered columnstore indexes.
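The 6 billion row figure above is simply the product of the three factors. As a quick check (the CAST avoids 32-bit integer overflow):

```sql
-- 60 distributions x 100 partitions x 1,000,000 rows per rowgroup target
SELECT CAST(60 AS bigint) * 100 * 1000000 AS rows_needed_for_100_partitions;
-- 6,000,000,000 (6 billion)
```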
## <a name="rebuilding-indexes-to-improve-segment-quality"></a>Rebuilding indexes to improve segment quality
### <a name="step-1-identify-or-create-user-which-uses-the-right-resource-class"></a>Step 1: Identify or create a user that uses the right resource class
A quick way to improve segment quality immediately is to rebuild the index. The SQL returned by the view above includes an ALTER INDEX REBUILD statement that can be used to rebuild your indexes. When rebuilding your indexes, be sure to allocate enough memory to the session that rebuilds the index. To do this, increase the resource class of a user that has permission to rebuild the index on the table to the recommended minimum.
Below is an example of how to allocate more memory to a user by increasing their resource class. To work with resource classes, see [Resource classes for workload management](resource-classes-for-workload-management.md).
```sql
EXEC sp_addrolemember 'xlargerc', 'LoadUser'
```
### <a name="step-2-rebuild-clustered-columnstore-indexes-with-higher-resource-class-user"></a>Step 2: Rebuild clustered columnstore indexes with the higher resource class user
Sign in as the user from step 1 (for example, LoadUser), which now uses a higher resource class, and execute the ALTER INDEX statements. Make sure this user has ALTER permission on the tables where the index is being rebuilt. These examples show how to rebuild the entire columnstore index or how to rebuild a single partition. On large tables, it is more practical to rebuild indexes one partition at a time.
Instead of rebuilding the index, you can also copy the table to a new table with [CTAS](sql-data-warehouse-develop-ctas.md). Which way is best? For large volumes of data, CTAS is usually faster than [ALTER INDEX](/sql/t-sql/statements/alter-index-transact-sql?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest). For smaller volumes of data, ALTER INDEX is easier to use and does not require you to swap out the table.
```sql
-- Rebuild the entire clustered index
ALTER INDEX ALL ON [dbo].[DimProduct] REBUILD
```
```sql
-- Rebuild a single partition
ALTER INDEX ALL ON [dbo].[FactInternetSales] REBUILD PARTITION = 5
```
```sql
-- Rebuild a single partition with archival compression
ALTER INDEX ALL ON [dbo].[FactInternetSales] REBUILD PARTITION = 5 WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE)
```
```sql
-- Rebuild a single partition with columnstore compression
ALTER INDEX ALL ON [dbo].[FactInternetSales] REBUILD PARTITION = 5 WITH (DATA_COMPRESSION = COLUMNSTORE)
```
Rebuilding an index in a dedicated SQL pool is an offline operation. For more information about rebuilding indexes, see the ALTER INDEX REBUILD section in [Columnstore Indexes Defragmentation](/sql/relational-databases/indexes/columnstore-indexes-defragmentation?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest) and [ALTER INDEX](/sql/t-sql/statements/alter-index-transact-sql?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest).
### <a name="step-3-verify-clustered-columnstore-segment-quality-has-improved"></a>Step 3: Verify that clustered columnstore segment quality has improved
Rerun the query that identified the tables with poor segment quality and verify that segment quality has improved. If segment quality has not improved, the rows in your table may be extra wide. Consider using a higher resource class or DWU when rebuilding your indexes.
## <a name="rebuilding-indexes-with-ctas-and-partition-switching"></a>Rebuilding indexes with CTAS and partition switching
This example uses the [CREATE TABLE AS SELECT (CTAS)](/sql/t-sql/statements/create-table-as-select-azure-sql-data-warehouse?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json&view=azure-sqldw-latest) statement and partition switching to rebuild a table partition.
```sql
-- Step 1: Select the partition of data and write it out to a new table using CTAS
CREATE TABLE [dbo].[FactInternetSales_20000101_20010101]
WITH ( DISTRIBUTION = HASH([ProductKey])
, CLUSTERED COLUMNSTORE INDEX
, PARTITION ( [OrderDateKey] RANGE RIGHT FOR VALUES
(20000101,20010101
)
)
)
AS
SELECT *
FROM [dbo].[FactInternetSales]
WHERE [OrderDateKey] >= 20000101
AND [OrderDateKey] < 20010101
;
-- Step 2: Switch IN the rebuilt data with TRUNCATE_TARGET option
ALTER TABLE [dbo].[FactInternetSales_20000101_20010101] SWITCH PARTITION 2 TO [dbo].[FactInternetSales] PARTITION 2 WITH (TRUNCATE_TARGET = ON);
```
For more information about re-creating partitions with CTAS, see [Using partitions in a dedicated SQL pool](sql-data-warehouse-tables-partition.md).
## <a name="next-steps"></a>Next steps
For more information about developing tables, see [Developing tables](sql-data-warehouse-tables-overview.md).
# Experience Cloud ID Service Extension
Use this reference for information about configuring the Experience Cloud ID extension, and the options available when using this extension to build a rule.
Use this extension to integrate the Experience Cloud ID Service with your property. With the Experience Cloud ID Service, you can create and store unique and persistent identifiers for your site visitors.
## Configure the Experience Cloud ID extension
This section provides a reference for the options available when configuring the Experience Cloud ID extension.
If the Experience Cloud ID extension is not yet installed, open your property, then click Extensions > Catalog, hover over the Experience Cloud ID extension, and click Install.
To configure the extension, open the Extensions tab, hover over the extension, and then click Configure.

The following configuration options are available:
### Experience Cloud Organization ID
The ID for your Experience Cloud Organization.
Your ID is a 24-character, alphanumeric string followed by @AdobeOrg. If you do not know this ID, contact Customer Care.
### Exclude specific paths
The Experience Cloud ID does not load if the URL matches any of the specified paths.
\(Optional\) Enable Regex if this is a regular expression.
Click Add to exclude another path.
### Opt In
Use the Opt In options to determine whether to require visitors to opt in Adobe services on your site, including whether to create cookies that track visitor activity.
Opt In is the centralized point of reference for all Platform solution client-side libraries to determine if cookies can be created on a user's device or browser when visiting your site. Opt In does not provide support for either gathering or storing user consent preferences.
**Enable Opt In?**
The selected option determines whether your website waits for consent to track a visitor's activities on your website.
There are three options:
* **No:** Does not wait for consent to track the visitor. This is the default behavior if you do not select an option.
* **Yes:** Waits for consent to track the visitor.
* **Determined at runtime using function:** Programmatically determine whether the value is true or false at runtime. If you select this option, the Select Data Element field becomes available. Select a data element that can determine whether to wait for consent. This data element resolves to a boolean value. For example, you can select a data element that provides consent determined on whether the visitor's country is located in the EU.
**Is Opt In Storage Enabled?**
If enabled, consent is stored in a first-party cookie on your domain. If not enabled, consent settings are kept in your CMP or a cookie you manage.
**Opt In Cookie Domain?**
Use this optional setting to specify the domain where the Opt In cookie is stored if storage is enabled. You can enter a domain, or select a data element that contains the domain.
**Opt In Storage Expiry?**
Specify when the Opt In cookie expires if storage is enabled, in seconds.
Enter a number, then select a unit of time from the dropdown list. For example, enter 2 and select Weeks. Default is 13 months.
**Permissions?**
Pass previous consent to the Opt In library. Select a data element that contains the consent. The element type must be an object or a JSON string. Overrides Pre Opt In Approvals.
Example:
`"{"aa":true,"aam":true,"ecid":true}"`
**Pre Opt In Approvals?**
Define which categories are approved or denied when no preference has been set by the visitor. Consent is assumed for the selected solutions from the time the page is loaded. The element type must be an object or a JSON string \(example: `{aam: true}`\).
### Variables
Set name-value pairs as Experience Cloud ID instance properties. Use the drop-down to select a variable, then type or select a value. For information about each variable, refer to the [Experience Cloud ID Service documentation](https://experiencecloud.adobe.com/resources/help/en_US/mcvid/mcvid-overview.html).
## Experience Cloud ID extension action types
This section describes the action types available in the Experience Cloud ID extension.
### Action Types
#### Set Customer IDs
Set one or more customer IDs.
1. Enter the integration code.
The integration code should contain the value set up as a data source in Audience Manager or Customer Attributes.
2. Select a value.
The value should be a user ID. Data elements are most suitable for dynamic values like IDs from a client-specific internal system.
3. Select an authentication state.
Available options are:
* Unknown
* Authenticated
* Logged out
4. \(Optional\) Click Add to set more customer IDs.
5. Click Keep Changes.
# [SQL](http://en.wikipedia.org/wiki/SQL)
## Notes
- High cardinality = unique. Low cardinality = less unique.
## Links
- [Learn SQL the hard way](https://learncodethehardway.org/sql/)
- [Quick SQL cheat sheet](https://github.com/enochtangg/quick-SQL-cheatsheet#readme) - Quick reminder of all SQL queries and examples on how to use them.
- [SQLCheck](https://github.com/jarulraj/sqlcheck) - Automatically identify anti-patterns in SQL queries.
- [SQL Fundamentals course](https://egghead.io/courses/sql-fundamentals)
- [q](https://github.com/harelba/q) - Run SQL directly on CSV or TSV files.
- [Learn Modern SQL with PostgreSQL](https://www.masterywithsql.com/) ([HN](https://news.ycombinator.com/item?id=20260292))
+++
title = "PyTorchJob v1beta2"
description = "Reference documentation for version v1beta2 of the PytorchJob custom resource."
weight = 10
+++
<h1 align="center">:cherry_blossom: kizuna</h1>
<h3 align="center">:large_blue_diamond: a photo sharing web app :large_blue_diamond:</h3>
<p align="center">
<img src="samples/kizuna.png" width="650px">
</p>
## Introduction
Welcome to kizuna, the photo sharing web app! This is my first venture into production level web app development, so please forgive any inconsistencies :sweat_smile:. The name kizuna was chosen because it means *bond* in Japanese. In fact, it was the kanji of the year 2011! :tada:
This is a simple web app. It allows users to sign-up via Google, Facebook or Twitter. Instagram support is also available, but has been temporarily disabled due to the lengthy app review requirements of Instagram.
Once the user signs up, they are given the ability to:
:heavy_check_mark: Create/Delete a public/private post
:heavy_check_mark: Search for friends
:heavy_check_mark: Follow/Unfollow friends
:heavy_check_mark: Like/Unlike posts
:heavy_check_mark: Comment on posts
The app follows a minimal design philosophy, by maintaining the color scheme and the material design throughout. This is a responsive website, therefore making it suitable for viewing on a variety of screen sizes. It is best viewed on mobile screens, in a full-screen mode.
In terms of user experience, small enhancements like toast messages and lazy loading of images is also included.
## Table of Contents
* [Project Structure](#project-structure)
* [Local Development](#local-development)
* [Credential Setup](#credential-setup)
* [Running the local server](#running-the-local-server)
* [Production Setup](#production-setup)
* [Progressive Web App Integration](#progressive-web-app-integration)
* [License](#license)
## Project Structure
The project structure is as shown below:
```bash
.
├── public
│ ├── css
│ │ └── styles.css
│ ├── img
│ │ ├── android-chrome-192x192.png
│ │ ├── android-chrome-512x512.png
│ │ ├── apple-touch-icon.png
│ │ ├── favicon-16x16.png
│ │ ├── favicon-32x32.png
│ │ ├── loading.gif
│ │ └── sad-pug.png
│ ├── js
│ │ ├── comments.js
│ │ ├── delete-post.js
│ │ ├── follow-unfollow.js
│ │ ├── lazy-load.js
│ │ ├── likes.js
│ │ ├── registerServiceWorker.js
│ │ ├── search.js
│ │ ├── ui.js
│ │ └── upload.js
│ ├── pages
│ │ └── fallback.html
│ ├── manifest.json
│ └── serviceWorker.js
├── samples
│ └── kizuna.png
├── src
│ ├── db
│ │ ├── aws.js
│ │ └── mongoose.js
│ ├── models
│ │ ├── comment.js
│ │ ├── follow.js
│ │ ├── like.js
│ │ ├── post.js
│ │ └── user.js
│ ├── passport
│ │ ├── facebook-strategy.js
│ │ ├── google-strategy.js
│ │ ├── instagram-strategy.js
│ │ ├── passport-setup.js
│ │ └── twitter-strategy.js
│ ├── routes
│ │ ├── auth.js
│ │ ├── post.js
│ │ ├── profile.js
│ │ ├── search.js
│ │ ├── upload.js
│ │ └── user.js
│ └── app.js
├── templates
│ ├── partials
│ │ ├── footer.ejs
│ │ ├── head.ejs
│ │ ├── header.ejs
│ │ ├── menu.ejs
│ │ ├── new-post.ejs
│ │ └── post-gallery.ejs
│ └── views
│ ├── feed.ejs
│ ├── login.ejs
│ ├── post.ejs
│ ├── privacy-policy.ejs
│ ├── profile.ejs
│ └── terms-conditions.ejs
├── .env
├── .gitignore
├── LICENSE
├── package.json
├── package-lock.json
└── README.md
```
All of the server-side code is included in the `src` directory, while the client-side code is in the `public` directory. This project makes use of templating with EJS to minimize the amount of redundant code.
## Local Development
### Credential setup
This project makes use of the dotenv npm package for managing the environment variables. Prior to running the app locally, it is necessary to obtain the private credentials for the following:
* [Google OAuth API](https://console.developers.google.com)
* [Facebook Developers](https://developers.facebook.com/)
* [Twitter Developers](https://developer.twitter.com)
* [Instagram Developers](https://www.instagram.com/developer/)
* [Amazon AWS](https://aws.amazon.com/)
* [MongoDB Atlas](https://www.mongodb.com/cloud/atlas)
Once all the credentials have been obtained, fill them in the dummy `.env` file located in the root of the project.
> **Never commit the private credentials to version control. In this project, the dummy `.env` file serves the purpose of viewing the key names of the variables**
### Running the local server
First, install all the required node modules by running the command:
`npm install`
Run the local development script:
`npm run dev`
The dev script uses [nodemon](https://www.npmjs.com/package/nodemon) to monitor the changes made to the js, ejs and css files and then reloads the server on file changes.
> You may view the dependencies of the app [here](https://github.com/tackyunicorn/kizuna/network/dependencies)
## Production Setup
The setup for production environments depends greatly on the cloud provider used. This project is hosted using [heroku hobby dynos](https://devcenter.heroku.com/articles/dyno-types) on a custom root-level domain name - [kizuna.ml](https://kizuna.ml), served by [cloudflare nameservers](https://support.cloudflare.com/hc/en-us/articles/205195708-Changing-your-domain-nameservers-to-Cloudflare). Since the primary purpose of the app is to interact with images, all forms of image upload and hosting is done via Amazon AWS [S3 Buckets.](https://aws.amazon.com/s3/) An M0 Sandbox [MongoDB Atlas](https://www.mongodb.com/cloud/atlas) cluster instance is used as the cloud database solution.
Most cloud providers provide the functionality for specifying the environment variables. Do look up the documentation for your cloud provider for the detailed process.
## Progressive Web App Integration
kizuna can be installed on supported devices as a progressive web app. This is done using a service worker that caches all the static assets and intercepts the requests to these assets. Offline support is also provided by means of a default fallback page. This feature has been tested on Android devices. Functionality on iOS browsers like Safari still remain murky due to the breaking OAuth flow. This behavior has been fixed in the [iOS 12.2 release](https://medium.com/@firt/whats-new-on-ios-12-2-for-progressive-web-apps-75c348f8e945).
## License
kizuna is licensed under the [MIT](LICENSE) license.
---
- GuiCommand:
Name:Std DrawStyle
MenuLocation:View → Draw style → ...
Workbenches:All
Shortcut:**V** **1** - **V** **7**
SeeAlso:[Std SelBoundingBox](Std_SelBoundingBox.md)
---
# Std DrawStyle
## Description
The **Std DrawStyle** command can override the effect of the **Display Mode** [property](Property_editor.md) of objects in a [3D view](3D_view.md).
## Usage
1. There are several ways to invoke the command:
- Click on the black down arrow to the right of the **<img src="images/Std_DrawStyleAsIs.svg" width=16px> [Std DrawStyle](Std_DrawStyle.md)** button and select a style from the flyout.
- In the menu go to **View → Draw style** and select a style.
- In the [3D view](3D_view.md) context menu go to **Draw style** and select a style.
    - Use one of the keyboard shortcuts: **V** then **1**, **2**, **3**, **4**, **5**, **6** or **7**.
## Available draw styles
### <img alt="" src=images/Std_DrawStyleAsIs.svg style="width:24px;"> As is
The **As is** style does not override the **Display Mode** of objects.

*4 identical objects each with a different Display Mode (from left to right: 'Points', 'Wireframe', 'Shaded' and 'Flat lines') with the 'As is' draw style applied*
### <img alt="" src=images/Std_DrawStylePoints.svg style="width:24px;"> Points
The **Points** style overrides the **Display Mode** of objects. This style matches the \'Points\' Display Mode. Vertices are displayed in solid colors. Edges and faces are not displayed.

*The same objects with the 'Points' draw style applied*
### <img alt="" src=images/Std_DrawStyleWireFrame.svg style="width:24px;"> Wireframe
The **Wireframe** style overrides the **Display Mode** of objects. This style matches the \'Wireframe\' Display Mode. Vertices and edges are displayed in solid colors. Faces are not displayed.

*The same objects with the 'Wireframe' draw style applied*
### <img alt="" src=images/Std_DrawStyleHiddenLine.svg style="width:24px;"> Hidden line
The **Hidden line** style overrides the **Display Mode** of objects. Objects are displayed as if converted to triangular meshes.

*The same objects with the 'Hidden line' draw style applied*
### <img alt="" src=images/Std_DrawStyleNoShading.svg style="width:24px;"> No shading
The **No shading** style overrides the **Display Mode** of objects. Vertices, edges and faces are displayed in solid colors.

*The same objects with the 'No shading' draw style applied*
### <img alt="" src=images/Std_DrawStyleShaded.svg style="width:24px;"> Shaded
The **Shaded** style overrides the **Display Mode** of objects. This style matches the \'Shaded\' Display Mode. Vertices and edges are not displayed. Faces are illuminated depending on their orientation.

*The same objects with the 'Shaded' draw style applied*
### <img alt="" src=images/Std_DrawStyleFlatLines.svg style="width:24px;"> Flat lines
The **Flat lines** style overrides the **Display Mode** of objects. This style matches the \'Flat lines\' Display Mode. Vertices and edges are displayed in solid colors. Faces are illuminated depending on their orientation.

*The same objects with the 'Flat lines' draw style applied*
## Notes
- Objects in a [3D view](3D_view.md) also have a **Draw Style** property. This property controls the linetype used for the edges. The Std DrawStyle command does not override this property.
- For a macro to toggle between two draw styles see: [Macro Toggle Drawstyle](Macro_Toggle_Drawstyle.md).
{{Std Base navi}}
---
[documentation index](../README.md) > Std DrawStyle
### Cursus et habitasse erat
Eleifend diam dui, platea per himenaeos fusce taciti porta, quisque facilisis curabitur. Sagittis primis congue. Imperdiet commodo, rhoncus, augue himenaeos tortor, ante massa, inceptos quisque ad maecenas
Eros lectus, viverra nulla massa, nec suspendisse erat
Rhoncus pretium elementum mauris, blandit dapibus proin feugiat, consequat eget laoreet platea praesent et
Suspendisse bibendum condimentum tempus vestibulum, posuere euismod, viverra. Gravida lectus, semper purus vel dignissim a nibh
Quisque vivamus lorem amet, ultrices mollis diam interdum, mi metus dignissim arcu, condimentum fusce. Pulvinar, luctus, arcu, mollis lacus
Nulla sodales magna
Magna vel, sem, curabitur tellus, nulla. Laoreet, proin mollis fames sem ligula, ultrices, conubia est praesent felis. Auctor, arcu mi at ac, integer
Elementum imperdiet lacinia euismod aliquam luctus malesuada mollis facilisis donec ante, euismod, cras lorem. Lectus, ligula mauris platea morbi finibus. Eros, placerat feugiat nec, finibus magna arcu eleifend, amet, est
Sagittis finibus ligula enim nisl nibh, posuere, tempor, elit fringilla mauris, imperdiet
Urna est quam malesuada mauris lacus ex
Bibendum sodales ex viverra fringilla nibh lobortis cursus auctor gravida blandit, iaculis. Luctus porta bibendum
Molestie tortor, orci
Integer leo sed nibh mi nibh, morbi elit, semper. Tellus cursus, fusce eros magna laoreet, feugiat, finibus tortor purus enim volutpat lorem
## Mission and Vision
The Mission of Freedom Cookbook is to eliminate cyber threats, misinformation, and disinformation from Wikipedia. The Vision is to shut down the Black Looter Mob Antifa Nazis and the Deep State.
## License
Copyright 2022 © [Freedom Government](https://github.com/FreedomGovernment); most rights reserved; third-party commercialization prohibited; mandatory improvement donations; licensed under the Kabuki Strong Source-available License that YOU MUST CONSENT TO at <https://github.com/FreedomGovernment/FreedomCookbook>.
# framely
> **UNSTABLE VERSION - PROJECT ON HOLD**
This is a work-in-progress project that aims to simplify communication between isolated parts of an application.

It provides a simple API for creating iframe interfaces and consuming them like normal CRUD calls using `fetch`.

Contact me at guiyep@gmail.com for more information or if you want to be part of this project.
https://github.com/guiyep/framely
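The core idea (a fetch-style request/response API over a postMessage-like channel) can be sketched as follows. Note that none of these names come from framely's actual API; this is an illustration of the underlying request/response correlation technique, with an in-memory store standing in for the iframe side:

```javascript
// Hypothetical sketch: correlating request and response messages over a
// postMessage-style channel, so callers get a fetch-like promise API.
// All names here are illustrative, not framely's real API.

function createEndpoint(send) {
  const pending = new Map(); // requestId -> resolve callback
  let nextId = 0;

  return {
    // Fetch-like call: sends a message and returns a promise that
    // resolves when the matching response arrives.
    request(method, path, body) {
      return new Promise((resolve) => {
        const id = nextId++;
        pending.set(id, resolve);
        send({ id, method, path, body });
      });
    },
    // Delivers a response message back to the awaiting caller,
    // matched by its correlation id.
    receive(msg) {
      const resolve = pending.get(msg.id);
      if (resolve) {
        pending.delete(msg.id);
        resolve(msg.response);
      }
    },
  };
}

// Simulated "iframe" side: answers GET requests from an in-memory store.
function makeServer(endpoint) {
  const store = { '/users/1': { name: 'Ada' } };
  return (msg) => {
    if (msg.method === 'GET') {
      endpoint.receive({ id: msg.id, response: store[msg.path] ?? null });
    }
  };
}

// Wire the two sides together (in real usage this would be
// window.postMessage between the page and the iframe).
let server;
const client = createEndpoint((msg) => server(msg));
server = makeServer(client);

client.request('GET', '/users/1').then((user) => {
  console.log(user); // { name: 'Ada' }
});
```

The correlation-id map is what lets many overlapping requests share one message channel, which is also what the service-worker item in the roadmap below would rely on.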
# roadmap
* GET req/response. DONE
* make service worker call iframe with request/response. IN PROGRESS
* create POST req/response
* create PUT req/response
* create DELETE req/response
# install
```
npm install --save framely
```