[](https://travis-ci.com/keithdoggett/spatial_stats)
[Docs](https://keithdoggett.github.io/spatial_stats)
# SpatialStats
SpatialStats is an ActiveRecord/Rails plugin that utilizes PostGIS to compute weights/statistics of spatial data sets in Rails Apps.
## Installation
Add this line to your application's Gemfile:
```ruby
gem 'spatial_stats'
```
And then execute:
```bash
$ bundle
```
Or install it yourself as:
```bash
$ gem install spatial_stats
```
## Usage
### Weights
Weights define the spatial relation between members of a dataset. Contiguous operations are supported for `polygons` and `multipolygons`, and distant operations are supported for `points`.
To compute weights, you need an `ActiveRecord::Relation` scope and a geometry field. From there, you can pick what type of weight operation to compute (`knn`, `queen neighbors`, etc.).
#### Compute Queen Weights
```ruby
# County table has the following fields: avg_income: float, geom: multipolygon.
scope = County.all
geom_field = :geom
weights = SpatialStats::Weights::Contiguous.queen(scope, geom_field)
# => #<SpatialStats::Weights::WeightsMatrix>
```
#### Compute KNN of Centroids
The field being queried does not have to be defined in the schema, but could be computed during the query for scope.
This example finds the inverse distance weighted, 5 nearest neighbors for the centroid of each county.
```ruby
scope = County.all.select("*, st_centroid(geom) as geom")
weights = SpatialStats::Weights::Distant.idw_knn(scope, :geom, 5)
# => #<SpatialStats::Weights::WeightsMatrix>
```
#### Define WeightsMatrix without Query
Weight matrices can be defined by a hash that describes each key's neighbor and weight.
Example: Define WeightsMatrix and get the matrix in row_standardized format.
```ruby
weights = {
1 => [{ id: 2, weight: 1 }, { id: 4, weight: 1 }],
2 => [{ id: 1, weight: 1 }],
3 => [{ id: 4, weight: 1 }],
4 => [{ id: 1, weight: 1 }, { id: 3, weight: 1 }]
}
keys = weights.keys
wm = SpatialStats::Weights::WeightsMatrix.new(weights)
# => #<SpatialStats::Weights::WeightsMatrix:0x0000561e205677c0 @keys=[1, 2, 3, 4], @weights={1=>[{:id=>2, :weight=>1}, {:id=>4, :weight=>1}], 2=>[{:id=>1, :weight=>1}], 3=>[{:id=>4, :weight=>1}], 4=>[{:id=>1, :weight=>1}, {:id=>3, :weight=>1}]}, @n=4>
wm = wm.standardize
# => #<SpatialStats::Weights::WeightsMatrix:0x0000561e205677c0 @keys=[1, 2, 3, 4], @weights={1=>[{:id=>2, :weight=>0.5}, {:id=>4, :weight=>0.5}], 2=>[{:id=>1, :weight=>1}], 3=>[{:id=>4, :weight=>1}], 4=>[{:id=>1, :weight=>0.5}, {:id=>3, :weight=>0.5}]}, @n=4>
wm.dense
# => Numo::DFloat[
# [0, 0.5, 0, 0.5],
# [1, 0, 0, 0],
# [0, 0, 0, 1],
# [0.5, 0, 0.5, 0]
# ]
wm.sparse
# => #<SpatialStats::Weights::CSRMatrix @m=4, @n=4, @nnz=6>
```
### Lagged Variables
Spatially lagged variables can be computed with a weights matrix and a 1-D vector (`Array`).
#### Compute a Lagged Variable
```ruby
weights = {
1 => [{ id: 2, weight: 1 }, { id: 4, weight: 1 }],
2 => [{ id: 1, weight: 1 }],
3 => [{ id: 4, weight: 1 }],
4 => [{ id: 1, weight: 1 }, { id: 3, weight: 1 }]
}
wm = SpatialStats::Weights::WeightsMatrix.new(weights).standardize
vec = [1, 2, 3, 4]
lagged_var = SpatialStats::Utils::Lag.neighbor_sum(wm, vec)
# => [3.0, 1.0, 4.0, 2.0]
```
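Conceptually, `neighbor_sum` computes the product of the row-standardized weights matrix with the vector. Here is a plain-Ruby sketch of the same arithmetic (this is not the gem's actual implementation, just the underlying math):

```ruby
# Row-standardized weights from the example above, as a dense matrix.
w = [
  [0.0, 0.5, 0.0, 0.5],
  [1.0, 0.0, 0.0, 0.0],
  [0.0, 0.0, 0.0, 1.0],
  [0.5, 0.0, 0.5, 0.0]
]
vec = [1, 2, 3, 4]

# Spatial lag: for each observation, the weighted sum of its neighbors' values.
lag = w.map { |row| row.zip(vec).sum { |weight, value| weight * value } }
lag
# => [3.0, 1.0, 4.0, 2.0]
```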
### Global Stats
Global stats compute a value for the dataset, like how clustered the observations are within the region.
Most `stat` classes take three parameters: `scope`, `data_field`, and `weights`. All `stat` classes have the `stat` method that will compute the target statistic. These are also aliased with the common name of the statistic, such as `i` for `Moran` or `c` for `Geary`.
#### Compute Moran's I
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Global::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Global::Moran>
moran.stat
# => 0.834
moran.i
# => 0.834
```
#### Compute Moran's I without Querying Data
To calculate the statistic using an array of data instead of querying a database field, assign the data directly. The order of the data must correspond to the order of `weights.keys`.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
field = nil
moran = SpatialStats::Global::Moran.new(scope, field, weights)
# => <SpatialStats::Global::Moran>
# data is automatically standardized on input
data = [1,2,3,4,5,6]
moran.x = data
moran.stat
# => 0.521
```
#### Compute Moran's I Z-Score
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Global::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Global::Moran>
moran.z_score
# => 3.2
```
#### Run a Permutation Test on Moran's I
All stat classes have the `mc` method, which takes `permutations` and `seed` as parameters. `mc` runs a permutation test on the class and returns the pseudo p-value.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Global::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Global::Moran>
moran.mc(999, 123_456)
# => 0.003
```
#### Get Summary of Permutation Test
All stat classes have the `summary` method, which takes `permutations` and `seed` as parameters. `summary` runs `stat` and `mc`, then combines the results into a hash.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Global::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Global::Moran>
moran.summary(999, 123_456)
# => {stat: 0.834, p: 0.003}
```
### Local Stats
Local stats compute a value for each observation in the dataset, such as how similar its neighbors are to itself. Local stats operate similarly to global stats, except that almost every operation returns an array of length `n`, where `n` is the number of observations in the dataset.
Most `stat` classes take three parameters: `scope`, `data_field`, and `weights`. All `stat` classes have the `stat` method that will compute the target statistic. These are also aliased with the common name of the statistic, such as `i` for `Moran` or `c` for `Geary`.
#### Compute Moran's I
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Local::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Local::Moran>
moran.stat
# => [0.888, 0.675, 0.2345, -0.987, -0.42, ...]
moran.i
# => [0.888, 0.675, 0.2345, -0.987, -0.42, ...]
```
#### Compute Moran's I without Querying Data
To calculate the statistic using an array of data instead of querying a database field, assign the data directly. The order of the data must correspond to the order of `weights.keys`.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
field = nil
moran = SpatialStats::Local::Moran.new(scope, field, weights)
# => <SpatialStats::Local::Moran>
# data is automatically standardized on input
data = [1,2,3,4,5,6]
moran.x = data
moran.stat
# => [0.521, 0.123, -0.432, -0.56, ...]
```
#### Compute Moran's I Z-Scores
Note: Many classes do not have a variance or expectation method implemented and this will raise a `NotImplementedError`.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Local::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Local::Moran>
moran.z_score
# => [0.65, 1.23, 0.42, 3.45, -0.34, ...]
```
#### Run a Permutation Test on Moran's I
All stat classes have the `mc` method, which takes `permutations` and `seed` as parameters. `mc` runs a permutation test on the class and returns the pseudo p-values.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Local::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Local::Moran>
moran.mc(999, 123_456)
# => [0.24, 0.13, 0.53, 0.023, 0.65, ...]
```
#### Get Summary of Permutation Test
All stat classes have the `summary` method, which takes `permutations` and `seed` as parameters. `summary` runs `stat`, `mc`, and `groups`, then combines the results into a hash array indexed by `weights.keys`.
```ruby
scope = County.all
weights = SpatialStats::Weights::Contiguous.rook(scope, :geom)
moran = SpatialStats::Local::Moran.new(scope, :avg_income, weights)
# => <SpatialStats::Local::Moran>
moran.summary(999, 123_456)
# => [{key: 1, stat: 0.521, p: 0.24, group: 'HH'}, ...]
```
## Contributing
Once cloned, run the following commands to set up the test database.
```bash
cd ./spatial_stats
bundle install
cd test/dummy
rake db:create
rake db:migrate
```
If you are getting an error, you may need to set the following environment variables.
```bash
$PGUSER # default "postgres"
$PGPASSWORD # default ""
$PGHOST # default "127.0.0.1"
$PGPORT # default "5432"
$PGDATABASE # default "spatial_stats_test"
```
If the dummy app is set up correctly, run the following:
```bash
cd ../..
rake
```
This will run the tests. If they all pass, then your environment is set up correctly.
Note: It is recommended to have GEOS installed and linked to RGeo. You can test this by running the following:
```bash
cd test/dummy
rails c
RGeo::Geos.supported?
# => true
```
## Path Forward
Summaries of milestones for v1.x and v2.0. These lists are subject to change. If you have an additional feature you want to see for either milestone, open up an issue or PR.
### v1.x
1. Global Measurements
- `Geary`'s C
- `GetisOrd`
2. Local Measurements
- `Join Count`
3. Utilities
- Add support for .gal/.swm file imports
- Add support for Rate variables
- Add support for Bayes smoothing
- ~Add support for Bonferroni Bounds and FDR~
4. General
- ~Add new stat constructors that only rely on a weights matrix and data vector~
- Point Pattern Analysis Module
- Regression Module
### v2.0
- Break gem into core `spatial_stats` that will not include queries module and `spatial_stats-activerecord`. This will remove the dependency on rails for the core gem.
- Create `spatial_stats-import/geojson/shp` gem that will allow importing files and generating a `WeightsMatrix`. Will likely rely on `RGeo` or another spatial lib.
### Other TODOs
- Update Docs to show `from_observation` when version is bumped
- Refactor `MultivariateGeary` so that it can be used without `activerecord` by adding `from_observations` and supporting methods.
## License
The gem is available as open source under the terms of the [BSD-3-Clause](https://opensource.org/licenses/BSD-3-Clause).
---
layout: slide
title: "Welcome to our second slide!"
---
This is my slide for Tech Up! Take 2
Use the left arrow to go back!
# Installation
> `npm install --save @types/js-cookie`
# Summary
This package contains type definitions for js-cookie (https://github.com/js-cookie/js-cookie).
# Details
Files were exported from https://www.github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/js-cookie
Additional Details
* Last updated: Wed, 23 Aug 2017 17:50:59 GMT
* Dependencies: none
* Global values: Cookies
# Credits
These definitions were written by Theodore Brown <https://github.com/theodorejb>, BendingBender <https://github.com/BendingBender>.
## Matt Carter
# Sidekiq::Statistic
[](https://travis-ci.org/davydovanton/sidekiq-statistic) [](https://codeclimate.com/github/davydovanton/sidekiq-history) [](https://gitter.im/davydovanton/sidekiq-history?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
Improved display of statistics for your Sidekiq workers and jobs.
## Screenshots
### Index page:

### Worker page with table (per day):

## Installation
Add this line to your application's Gemfile:
gem 'sidekiq-statistic'
And then execute:
$ bundle
## Usage
### Using Rails
Read [Sidekiq documentation](https://github.com/mperham/sidekiq/wiki/Monitoring#rails) to configure Sidekiq Web UI in your `routes.rb`.
When Sidekiq Web UI is active you're going be able to see the option `Statistic` on the menu.
### Using a standalone application
Read [Sidekiq documentation](https://github.com/mperham/sidekiq/wiki/Monitoring#standalone) to configure Sidekiq in your Rack server.
Next add `require 'sidekiq-statistic'` to your `config.ru`.
``` ruby
# config.ru
require 'sidekiq/web'
require 'sidekiq-statistic'
use Rack::Session::Cookie, secret: 'some unique secret string here'
run Sidekiq::Web
```
## Configuration
The statistic configuration is an initializer that the gem uses to configure itself. The `max_timelist_length` option exists to avoid a memory leak: in practice, whenever the cache size reaches that number, the gem removes 25% of the key values, which keeps memory from inflating.
``` ruby
Sidekiq::Statistic.configure do |config|
config.max_timelist_length = 250_000
end
```
## Supported Sidekiq versions
Statistic supports the following Sidekiq versions:
- Sidekiq 6.
- Sidekiq 5.
- Sidekiq 4.
- Sidekiq 3.5.
## JSON API
### /api/statistic.json
Returns statistics for each worker.
Params:
* `dateFrom` - Date start (format: `yyyy-mm-dd`)
* `dateTo` - Date end (format: `yyyy-mm-dd`)
Example:
```
$ curl http://example.com/sidekiq/api/statistic.json?dateFrom=2015-07-30&dateTo=2015-07-31
# =>
{
"workers": [
{
"name": "Worker",
"last_job_status": "passed",
"number_of_calls": {
"success": 1,
"failure": 0,
"total": 1
},
"runtime": {
"last": "2015-07-31 10:42:13 UTC",
"max": 4.002,
"min": 4.002,
"average": 4.002,
"total": 4.002
}
},
...
]
}
```
### /api/statistic/:worker_name.json
Returns worker statistics for each day in the range.
Params:
* `dateFrom` - Date start (format: `yyyy-mm-dd`)
* `dateTo` - Date end (format: `yyyy-mm-dd`)
Example:
```
$ curl http://example.com/sidekiq/api/statistic/Worker.json?dateFrom=2015-07-30&dateTo=2015-07-31
# =>
{
"days": [
{
"date": "2015-07-31",
"failure": 0,
"success": 1,
"total": 1,
"last_job_status": "passed",
"runtime": {
"last": null,
"max": 0,
"min": 0,
"average": 0,
"total": 0
}
},
...
]
}
```
## Update statistic inside middleware
You can update your worker statistics inside middleware. To do this, update the `sidekiq:statistic` redis hash.
This hash has the following structure:
* `sidekiq:statistic` - redis hash with all statistics
  - `yyyy-mm-dd:WorkerName:passed` - count of passed jobs for WorkerName on yyyy-mm-dd
  - `yyyy-mm-dd:WorkerName:failed` - count of failed jobs for WorkerName on yyyy-mm-dd
  - `yyyy-mm-dd:WorkerName:last_job_status` - string with the status (`passed` or `failed`) of the last job
  - `yyyy-mm-dd:WorkerName:last_time` - time the last job was performed
  - `yyyy-mm-dd:WorkerName:queue` - name of the job queue (`default` by default)
For timing information, push the runtime value to the `yyyy-mm-dd:WorkerName:timeslist` redis list.
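For illustration only, a server middleware that maintains these keys could look like the sketch below. The `FakeRedis` stand-in and all class names here are invented for the example — the real gem manages its own redis connection and middleware; this only mirrors the key layout described above.

```ruby
require 'date'

# Stand-in for a redis client; only the calls used below are stubbed.
class FakeRedis
  attr_reader :hash, :lists

  def initialize
    @hash = {}
    @lists = Hash.new { |h, k| h[k] = [] }
  end

  def hincrby(_key, field, n)
    @hash[field] = (@hash[field] || 0) + n
  end

  def hset(_key, field, value)
    @hash[field] = value
  end

  def rpush(key, value)
    @lists[key] << value
  end
end

# Sketch of a Sidekiq server middleware that records per-day worker
# statistics in the `sidekiq:statistic` hash structure described above.
class StatisticMiddleware
  STATISTIC_KEY = 'sidekiq:statistic'.freeze

  def initialize(redis)
    @redis = redis
  end

  def call(worker, _job, queue)
    prefix = "#{Date.today.strftime('%Y-%m-%d')}:#{worker.class.name}"
    started = Time.now
    yield
    record(prefix, 'passed', queue, started)
  rescue StandardError
    record(prefix, 'failed', queue, started)
    raise
  end

  private

  def record(prefix, status, queue, started)
    @redis.hincrby(STATISTIC_KEY, "#{prefix}:#{status}", 1)
    @redis.hset(STATISTIC_KEY, "#{prefix}:last_job_status", status)
    @redis.hset(STATISTIC_KEY, "#{prefix}:last_time", Time.now.utc.to_s)
    @redis.hset(STATISTIC_KEY, "#{prefix}:queue", queue)
    # Runtime goes to a per-day, per-worker list for timing statistics.
    @redis.rpush("#{prefix}:timeslist", (Time.now - started).to_s)
  end
end

class HardWorker; end

redis = FakeRedis.new
middleware = StatisticMiddleware.new(redis)
middleware.call(HardWorker.new, {}, 'default') { :done }
prefix = "#{Date.today.strftime('%Y-%m-%d')}:HardWorker"
redis.hash["#{prefix}:passed"]
# => 1
```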
## How it works
<details>
<summary>Big image 'how it works'</summary>

</details>
## Contributing
1. Fork it ( https://github.com/davydovanton/sidekiq-statistic/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
# Why is the task wrong but I didn't receive an alarm? {#concept_trr_rws_42b .concept}
An alarm is reported only when a task meets either of the following conditions.
- The task is an upstream node of a baseline for which the baseline function is enabled.
- Associated custom notification rules are set for the task.
---
layout: default
title: Debugging
nav_order: 2
parent: v0.0.2
search_exclude: true
---
**Careful:** This page was meant for an older version of Annex
{: .deprecated }
Annex offers several debugging tools that you can leverage to your discretion. All of these tools apply only on DEBUG builds and will not affect performance on RELEASE builds.
# Event Trackers
In order to make sure certain critical game-events are meeting your expectations, you can attach performance trackers to them. Trackers keep track of how many times an event gets executed in a certain interval.
```cs
long interval = 1000; // How often the interval resets.
var tracker = new EventTracker(interval);
tracker.CurrentCount; // How many times the event ran in this interval.
tracker.LastCount; // How many times the event ran last interval.
tracker.Interval; // Corresponds to interval.
tracker.CurrentInterval; // How long the current interval has been running for.
string eventID;
var gameEvent = EventManager.Singleton.GetEvent(eventID);
gameEvent.AttachTracker(tracker);
```
# Logging
You can use `Debug.Log(message)` to add an entry into the current session's log, found in the bin/logs/ folder. Each log file is generated at the start of each session with the DateTimeStamp of the start of the session as the name.
```cs
Debug.Log("We got to this point!");
```
# Assertions
You can use `Debug.Assert(bool)` in code that makes assumptions about the state of your game. If the assumption is invalid, an exception will be thrown and a log entry will be made noting the details of the failed assertion.
```cs
float Divide(float numerator, float denominator) {
Debug.Assert(denominator != 0);
return numerator / denominator;
}
```
# Overlay
You can gain access to a game-overlay, similar to a CLI, which you can use to display information, or execute commands. You can toggle this overlay for your current scene by calling its corresponding function.
***
**PLEASE NOTE THAT DUE TO THE LACK OF DEFAULT FONTS IN ANNEX, YOU CURRENTLY NEED A FONT FILE CALLED `default.ttf` IN ORDER FOR THE OVERLAY TO WORK**
***
```cs
Debug.ToggleDebugOverlay();
```
Without overlay

With overlay

## Commands
The top of this overlay is a rudimentary command-line interface. While the overlay is visible, you can type into the CLI and press Enter to execute that command.
A command is the first word in the user-input (case-insensitive) and needs its corresponding data.
The data is the text trailing the command, which is split by whitespace.
## Example
A debug command that sets the player's position might look like this.
```
SetPlayerPosition 200 300
```
You can implement this debug command by using the `Debug.AddDebugCommand` function.
```cs
Debug.AddDebugCommand("setplayerposition", (data) => {
float x = float.Parse(data[0]);
float y = float.Parse(data[1]);
player.SetPosition(x, y);
});
```
Every command you enter gets stored, and can be flipped through using the up and down arrow-keys - similar to a regular CLI. You can remove all this history by executing the "clear" command.
## Information
To display information on the overlay, you can use the `Debug.AddDebugInformation` function.
For example, you can display your game's FPS by using Debug Information along with event trackers.
```cs
var drawEvent = EventManager.Singleton.GetEvent(GameWindow.DrawGameEventID);
var fpsTracker = new EventTracker(1000);
drawEvent.AttachTracker(fpsTracker);
Debug.AddDebugInformation(() => $"FPS: {fpsTracker.LastCount}");
```
# sjmo
Learning design patterns.
A catalog of design patterns:
1. Abstract Factory: provide an interface for creating families of related or dependent objects without specifying their concrete classes.
2. Adapter: convert the interface of a class into another interface that clients expect, so that classes with otherwise incompatible interfaces can work together.
3. Bridge: decouple an abstraction from its implementation so that the two can vary independently.
4. Builder: separate the construction of a complex object from its representation so that the same construction process can create different representations.
5. Chain of Responsibility: avoid coupling the sender of a request to its receiver by giving more than one object a chance to handle the request; chain the receiving objects and pass the request along the chain until an object handles it.
6. Command: encapsulate a request as an object, letting you parameterize clients with different requests, queue or log requests, and support undoable operations.
7. Composite: compose objects into tree structures to represent part-whole hierarchies, so that clients can treat individual objects and compositions of objects uniformly.
8. Decorator: attach additional responsibilities to an object dynamically; for extending functionality, decorators are more flexible than subclassing.
9. Facade: provide a unified interface to a set of interfaces in a subsystem; Facade defines a higher-level interface that makes the subsystem easier to use.
10. Factory Method: define an interface for creating an object, but let subclasses decide which class to instantiate, deferring instantiation to subclasses.
11. Flyweight: use sharing to support large numbers of fine-grained objects efficiently.
12. Interpreter: given a language, define a representation for its grammar along with an interpreter that uses the representation to interpret sentences in the language.
13. Iterator: provide a way to access the elements of an aggregate object sequentially without exposing its underlying representation.
14. Mediator: define an object that encapsulates how a set of objects interact; Mediator promotes loose coupling by keeping objects from referring to each other explicitly, and lets their interaction vary independently.
15. Memento: without violating encapsulation, capture and externalize an object's internal state so that the object can be restored to this state later.
16. Observer: define a one-to-many dependency between objects so that when one object changes state, all its dependents are notified and updated automatically.
17. Prototype: specify the kinds of objects to create using a prototypical instance, and create new objects by copying this prototype.
18. Proxy: provide a surrogate or placeholder for another object to control access to it.
19. Singleton: ensure a class has only one instance, and provide a global point of access to it.
20. State: allow an object to alter its behavior when its internal state changes; the object will appear to change its class.
21. Strategy: define a family of algorithms, encapsulate each one, and make them interchangeable, so the algorithm can vary independently from the clients that use it.
22. Template Method: define the skeleton of an algorithm in an operation, deferring some steps to subclasses, which can redefine certain steps without changing the algorithm's structure.
23. Visitor: represent an operation to be performed on the elements of an object structure; Visitor lets you define a new operation without changing the classes of the elements on which it operates.
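As a concrete illustration of pattern 16 (Observer), here is a minimal sketch in Python; all class and method names are invented for the example:

```python
class Subject:
    """Holds state and notifies attached observers when it changes."""

    def __init__(self):
        self._observers = []
        self._state = None

    def attach(self, observer):
        self._observers.append(observer)

    @property
    def state(self):
        return self._state

    @state.setter
    def state(self, value):
        self._state = value
        # One-to-many dependency: every observer is told about the change.
        for observer in self._observers:
            observer.update(self)


class HistoryObserver:
    """Records every state the subject passes through."""

    def __init__(self):
        self.seen = []

    def update(self, subject):
        self.seen.append(subject.state)


subject = Subject()
watcher = HistoryObserver()
subject.attach(watcher)
subject.state = 1
subject.state = 2
print(watcher.seen)  # [1, 2]
```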
---
title: CDS Syntax
---
# CDS Syntax
## Cast Operations
```cds
cast(myunit as abap.unit(3)) as myunit,
key cast(spras as sylangu preserving type) as spras,
```
## Case
```cds
case when myquantity < 30 then '<' else '>' end as comp,
case myquantity when 30 then '=' else '<>' end as equ
```
## Session Variables
```cds
define view Z_ViewWithSessionVariables
as select from t000
{
$session.client as ClientField, //sy-mandt
$session.system_date as SystemDateField, //sy-datum
$session.system_language as SystemLanguageField, //sy-langu
$session.user as UserField //sy-uname
} where mandt = $session.client
```
## Join
```cds
define view Z_ViewWithLeftOuterJoin
as select from Z_ViewAsDataSourceD
left outer to many join Z_ViewAsDataSourceE
on Z_ViewAsDataSourceD.FieldD2 = Z_ViewAsDataSourceE.FieldE1
define view Z_ViewWithoutAssociationPaths
as select from ZI_SalesOrderScheduleLine as SL
left outer to one join ZI_SalesOrderItem as ITEM
on ITEM.SalesOrder = SL.SalesOrder
```
> You should always specify the maximum target cardinality (TO ONE or TO MANY) of a left outer join partner. On one hand, this specification is used to document the composition of the CDS view. On the other hand, this information can optimize the processing of a selection request in the database.
>
> However, if you maintain the cardinality information, you should make sure that it's defined correctly. Otherwise, the selection result may become incorrect.
## Aggregation Functions
```cds
@AbapCatalog.sqlViewName: 'Z21_V_AGGR'
@AbapCatalog.compiler.compareFilter: true
@AbapCatalog.preserveKey: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'Aggregations'
define view z21_c_aggr as select from Z21_C_Item {
key mykey,
avg( myquantity ) as myavg,
min( myquantity ) as mymin,
max( myquantity ) as mymax,
count(*) as mycount
} group by mykey
```
## Association
| Cardinality | Min/Max |
| ----------- | ------- |
| default | 0-1 |
| [1] | 0-1 |
| [0..1] | 0-1 |
| [1..1] | 1-1 |
| [0..*] | 0-∞ |
| [1..*] | 1-∞ |
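For illustration, an association with an explicit cardinality from the table above might be declared like this (the data sources `zorder`/`zorderitem` and all field names are invented):

```cds
define view Z_OrderWithItems
  as select from zorder
  association [0..*] to zorderitem as _Item
    on _Item.order_id = zorder.order_id
{
  key zorder.order_id,
      zorder.customer,
      // Expose the association; consumers follow it with path expressions
      _Item
}
```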
## Parameters
CDS: `Z21_C_type_param`
```cds
@AbapCatalog.sqlViewName: 'Z21_VP_TYPE'
@AbapCatalog.compiler.compareFilter: true
@AbapCatalog.preserveKey: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'CDS Type with param'
define view Z21_C_type_param
with parameters p_lang: sylangu
as select from Z21_C_Type {
key mytype,
mytext
} where spras = $parameters.p_lang
```
Selecting from the parameterized view in ABAP SQL
```abap
SELECT
Z21_C_TYPE_PARAM~MYTYPE,
Z21_C_TYPE_PARAM~MYTEXT
FROM
Z21_C_TYPE_PARAM( P_LANG = 'T' )
```
## Conversion
```cds
{
unit_conversion( quantity => OrderQuantity,
source_unit => OrderQuantityUnit,
target_unit => :P_DisplayUnit,
error_handling => 'FAIL_ON_ERROR' ) as mynewquantity
}
```
| 23.616 | 297 | 0.697154 | eng_Latn | 0.645968 |
97c2b13d4e9ead0ff448b7949b7f8a66e263f09f | 7,977 | md | Markdown | docs/sharepoint/merging-xml-in-feature-and-package-manifests.md | thoelkef/visualstudio-docs.de-de | bfc727b45dcd43a837d3f4f2b06b45d59da637bc | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/sharepoint/merging-xml-in-feature-and-package-manifests.md | thoelkef/visualstudio-docs.de-de | bfc727b45dcd43a837d3f4f2b06b45d59da637bc | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/sharepoint/merging-xml-in-feature-and-package-manifests.md | thoelkef/visualstudio-docs.de-de | bfc727b45dcd43a837d3f4f2b06b45d59da637bc | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Merge XML in feature and package manifests | Microsoft Docs
ms.date: 02/02/2017
ms.topic: conceptual
dev_langs:
- VB
- CSharp
helpviewer_keywords:
- SharePoint development in Visual Studio, packaging
author: John-Hart
ms.author: johnhart
manager: jillfra
ms.workload:
- office
ms.openlocfilehash: 1378cddbc9770af923a98f1b7083a8792874b5b3
ms.sourcegitcommit: 6cfffa72af599a9d667249caaaa411bb28ea69fd
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 09/02/2020
ms.locfileid: "90841058"
---
# <a name="merge-xml-in-feature-and-package-manifests"></a>Merge XML in feature and package manifests
Features and packages are defined by [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] manifest files. These packaged manifests represent a combination of data generated by the designers and custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] data entered by users in the manifest templates. At packaging time, [!INCLUDE[vsprvs](../sharepoint/includes/vsprvs-md.md)] merges the custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] statements with the designer-supplied [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] to form the packaged [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] manifest file. Similar elements, with the exceptions noted later under merge exceptions, are merged to avoid [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] validation errors after you deploy the files to SharePoint, and to make the manifest files smaller and more efficient.
## <a name="modify-the-manifests"></a>Modify the manifests
You cannot modify the packaged manifest files directly unless you disable the feature or package designers. However, you can manually add custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] elements to the manifest template, either through the feature and package designers or in the [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] editor. For more information, see [How to: Customize a SharePoint feature](../sharepoint/how-to-customize-a-sharepoint-feature.md) and [How to: Customize a SharePoint solution package](../sharepoint/how-to-customize-a-sharepoint-solution-package.md).
## <a name="feature-and-package-manifest-merge-process"></a>Feature and package manifest merge process
When you combine custom elements with designer-provided elements, [!INCLUDE[vsprvs](../sharepoint/includes/vsprvs-md.md)] uses the following process. [!INCLUDE[vsprvs](../sharepoint/includes/vsprvs-md.md)] checks whether each element has a unique key value. If an element does not have a unique key value, it is appended to the packaged manifest file. Likewise, elements that have multiple keys cannot be merged; they are therefore appended to the manifest file.
If an element does have a unique key, [!INCLUDE[vsprvs](../sharepoint/includes/vsprvs-md.md)] compares the designer and custom key values. If the values match, they are merged into a single value. If the values differ, the designer key value is discarded and the custom key value is used. Collections are also merged. For example, if the designer-generated [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] and the custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] both contain an Assemblies collection, the package manifest contains only one Assemblies collection.
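For illustration, consider a hypothetical assembly entry (the file name is invented; the `Assemblies/Assembly` element with its `Location` key follows the package manifest table later in this article). When the designer-generated fragment and the custom fragment use the same key value, the collections merge and the custom attribute value wins:

```xml
<!-- Designer-generated fragment -->
<Assemblies>
  <Assembly Location="MySolution.dll" DeploymentTarget="GlobalAssemblyCache" />
</Assemblies>

<!-- Custom fragment added in the manifest template -->
<Assemblies>
  <Assembly Location="MySolution.dll" DeploymentTarget="WebApplication" />
</Assemblies>

<!-- Packaged result: a single merged collection; Location is the matching
     key, and the custom DeploymentTarget value is the one that is kept -->
<Assemblies>
  <Assembly Location="MySolution.dll" DeploymentTarget="WebApplication" />
</Assemblies>
```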
## <a name="merge-exceptions"></a>Merge exceptions
[!INCLUDE[vsprvs](../sharepoint/includes/vsprvs-md.md)] merges most designer [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] elements with similar custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] elements, as long as they have a single, unique identifying attribute. However, some [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] elements lack the unique identifier required for merging. These elements are called *merge exceptions*. In these cases, [!INCLUDE[vsprvs](../sharepoint/includes/vsprvs-md.md)] does not merge the custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] elements with the designer-provided [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] elements; instead, it appends them to the packaged manifest file.
The following is a list of merge exceptions for feature and package [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] manifest files.
|Designer|XML element|
|--------------|-----------------|
|Feature Designer|ActivationDependency|
|Feature Designer|UpgradeAction|
|Package Designer|SafeControl|
|Package Designer|CodeAccessSecurity|
## <a name="feature-manifest-elements"></a>Feature manifest elements
The following table lists all feature manifest elements that can be merged, together with the unique key used to match them.
|Element name|Unique key|
|------------------|----------------|
|Feature (all attributes)|*Attribute name* (Each attribute name of the Feature element is a unique key.)|
|ElementFile|Location|
|ElementManifests/ElementManifest|Location|
|Properties/Property|Key|
|CustomUpgradeAction|Name|
|CustomUpgradeAction/Parameter|Name|
> [!NOTE]
> Because the only way to modify the CustomUpgradeAction element is in the custom [!INCLUDE[TLA2#tla_xml](../sharepoint/includes/tla2sharptla-xml-md.md)] editor, the impact of not merging it is minimal.
## <a name="package-manifest-elements"></a>Package manifest elements
The following table lists all package manifest elements that can be merged, together with the unique key used to match them.
|Element name|Unique key|
|------------------|----------------|
|Solution (all attributes)|*Attribute name* (Each attribute name of the Solution element is a unique key.)|
|ApplicationResourceFiles/ApplicationResourceFile|Location|
|Assemblies/Assembly|Location|
|ClassResources/ClassResource|Location|
|DwpFiles/DwpFile|Location|
|FeatureManifests/FeatureManifest|Location|
|Resources/Resource|Location|
|RootFiles/RootFile|Location|
|SiteDefinitionManifests/SiteDefinitionManifest|Location|
|WebTempFile|Location|
|TemplateFiles/TemplateFile|Location|
|SolutionDependency|SolutionId|
## <a name="manually-add-deployed-files"></a>Manually add deployed files
Some manifest elements, such as ApplicationResourceFile and DwpFiles, specify a location that includes a file name. However, adding a file name entry to the manifest template does not add the underlying file to the package. You must add the file to the project to include it in the package, and set its deployment type property appropriately.
## <a name="see-also"></a>See also
- [Package and deploy SharePoint solutions](../sharepoint/packaging-and-deploying-sharepoint-solutions.md)
- [Build and debug SharePoint solutions](../sharepoint/building-and-debugging-sharepoint-solutions.md)
---
title: Execute data science tasks - Team Data Science Process
description: How a data scientist can execute a data science project in a trackable, version controlled, and collaborative way.
author: marktab
manager: marktab
editor: marktab
ms.service: machine-learning
ms.subservice: team-data-science-process
ms.topic: article
ms.date: 11/17/2020
ms.author: tdsp
ms.custom: seodec18, previous-author=deguhath, previous-ms.author=deguhath
---
# Execute data science tasks: exploration, modeling, and deployment
Typical data science tasks include data exploration, modeling, and deployment. This article outlines how to complete several common data science tasks, such as interactive data exploration, data analysis, reporting, and model creation. Options for deploying a model into a production environment may include:
- [Azure Machine Learning](../index.yml)
- [SQL-Server with ML services](/sql/advanced-analytics/r/r-services)
- [Microsoft Machine Learning Server](/machine-learning-server/what-is-machine-learning-server)
## 1. <a name='DataQualityReportUtility-1'></a> Exploration
A data scientist can perform exploration and reporting in a variety of ways: by using libraries and packages available for Python (matplotlib for example) or with R (ggplot or lattice for example). Data scientists can customize such code to fit the needs of data exploration for specific scenarios. The needs for dealing with structured data are different from those for unstructured data such as text or images.
Products such as Azure Machine Learning also provide [advanced data preparation](../how-to-create-register-datasets.md) for data wrangling and exploration, including feature creation. The user should decide on the tools, libraries, and packages that best suit their needs.
The deliverable at the end of this phase is a data exploration report. The report should provide a fairly comprehensive view of the data to be used for modeling and an assessment of whether the data is suitable to proceed to the modeling step.
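As a toy illustration of the per-feature summaries such a report aggregates (the numbers below are invented, and this is only a stdlib sketch, not a prescribed tool):

```python
import statistics

def summarize(values):
    """Return a tiny numeric profile of one column, the kind of
    per-feature summary a data exploration report collects."""
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values),
        "min": min(values),
        "max": max(values),
    }

# Invented sample column (e.g. sensor readings)
profile = summarize([3.1, 2.7, 3.4, 2.9, 3.3, 3.0])
print(profile)
```

In practice a library such as pandas replaces this by hand-rolled code, but the shape of the deliverable is the same: one compact profile per feature.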
## 2. <a name='ModelingUtility-2'></a> Modeling
There are numerous toolkits and packages for training models in a variety of languages. Data scientists should feel free to use which ever ones they are comfortable with, as long as performance considerations regarding accuracy and latency are satisfied for the relevant business use cases and production scenarios.
### Model management
After multiple models have been built, you usually need to have a system for registering and managing the models. Typically you need a combination of scripts or APIs and a backend database or versioning system. A few options that you can consider for these management tasks are:
1. [Azure Machine Learning - model management service](../index.yml)
2. [ModelDB from MIT](https://people.csail.mit.edu/mvartak/papers/modeldb-hilda.pdf)
3. [SQL-server as a model management system](https://blogs.technet.microsoft.com/dataplatforminsider/2016/10/17/sql-server-as-a-machine-learning-model-management-system/)
4. [Microsoft Machine Learning Server](/sql/advanced-analytics/r/r-server-standalone)
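The "scripts or APIs plus a backend store" combination mentioned above can be sketched minimally as follows (an in-memory toy of mine, not the API of any of the listed systems; the model name and metrics are made up):

```python
import hashlib
import time

class ModelRegistry:
    """Minimal in-memory sketch of a model registry: versioned entries
    keyed by model name, which real systems persist in a database."""

    def __init__(self):
        self._models = {}

    def register(self, name, weights_blob, metrics):
        # Each registration of the same name gets the next version number.
        version = len(self._models.get(name, [])) + 1
        entry = {
            "version": version,
            "sha256": hashlib.sha256(weights_blob).hexdigest(),
            "metrics": metrics,
            "registered_at": time.time(),
        }
        self._models.setdefault(name, []).append(entry)
        return version

    def latest(self, name):
        return self._models[name][-1]

registry = ModelRegistry()
registry.register("churn-model", b"fake-weights-v1", {"auc": 0.81})
registry.register("churn-model", b"fake-weights-v2", {"auc": 0.84})
print(registry.latest("churn-model")["version"])  # → 2
```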
## 3. <a name='Deployment-3'></a> Deployment
Production deployment enables a model to play an active role in a business. Predictions from a deployed model can be used for business decisions.
### Production platforms
There are various approaches and platforms to put models into production. Here are a few options:
- [Model deployment in Azure Machine Learning](../how-to-deploy-and-where.md)
- [Deployment of a model in SQL-server](/sql/advanced-analytics/tutorials/sqldev-py6-operationalize-the-model)
- [Microsoft Machine Learning Server](/sql/advanced-analytics/r/r-server-standalone)
> [!NOTE]
> Prior to deployment, one has to ensure the latency of model scoring is low enough to use in production.
>
>
Further examples are available in walkthroughs that demonstrate all the steps in the process for **specific scenarios**. They are listed and linked with thumbnail descriptions in the [Example walkthroughs](walkthroughs.md) article. They illustrate how to combine cloud, on-premises tools, and services into a workflow or pipeline to create an intelligent application.
> [!NOTE]
> For deployment using Azure Machine Learning Studio, see [Deploy an Azure Machine Learning web service](../classic/deploy-a-machine-learning-web-service.md).
>
>
### A/B testing
When multiple models are in production, it can be useful to perform [A/B testing](https://en.wikipedia.org/wiki/A/B_testing) to compare performance of the models.
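As a sketch of what such a comparison can look like (the counts are invented, and the two-proportion z-test shown is just one common choice, not something this process prescribes):

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for comparing two models' success rates in an A/B test."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Model A converted 480 of 5000 users, model B 430 of 5000 (made-up numbers).
z = two_proportion_z(480, 5000, 430, 5000)
print(round(z, 2))  # |z| > 1.96 would indicate a difference at the 5% level
```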
## Next steps
[Track progress of data science projects](track-progress.md) shows how a data scientist can track the progress of a data science project.
[Model operation and CI/CD](ci-cd-flask.md) shows how CI/CD can be performed with developed models.
# NativeScript IQKeyboardManager Plugin
NativeScript wrapper for the popular [IQKeyboardManager](https://cocoapods.org/pods/IQKeyboardManager) iOS framework, which provides an elegant solution for preventing the iOS keyboard from covering `UITextView` controls.

## Installation
```
$ tns plugin add nativescript-iqkeyboardmanager
```
## Usage
That's it! IQKeyboardManager takes care of all initialization when your app starts up by default.
## Advanced usage
### Grouping related textfields (previous / next buttons)
If your UI layout has sibling text fields, then IQKeyboardManager is able to automatically
add previous / next buttons to the accessory bar which the user can use to jump back and forth.
See those < and > buttons in the video above.
In case those fields were not direct siblings, until version 1.3.0 of this plugin, you had no way
to force the previous / next buttons to appear. However, now you can:
#### NativeScript /w XML usage
Note in the example below that the two `<TextField>` controls are not siblings (both have parent `<StackLayout>` containers). Because of this, IQKeyboardManager will not automatically provide an optimized keyboard by default.
However, if you surround the controls with this plugin's `<PreviousNextView>` control, as the example below shows, you will continue to get an optimized keyboard as expected.
```xml
<Page xmlns="http://schemas.nativescript.org/tns.xsd" xmlns:IQKeyboardManager="nativescript-iqkeyboardmanager">
<StackLayout>
<IQKeyboardManager:PreviousNextView><!-- add this 'wrapper' to enable those previous / next buttons -->
<StackLayout>
<StackLayout>
<TextField hint="Email"/>
</StackLayout>
<StackLayout>
<TextField hint="Password"/>
</StackLayout>
</StackLayout>
</IQKeyboardManager:PreviousNextView>
    </StackLayout>
</Page>
```
#### NativeScript /w Angular usage
In the `.modules.ts` file where you want to use this feature (or the `app.module.ts`),
register the `PreviousNextView` element:
```typescript
import { registerElement } from "nativescript-angular";
registerElement("PreviousNextView", () => require("nativescript-iqkeyboardmanager").PreviousNextView);
```
Then in the view, use that element like this (again, we went nuts with the `<StackLayout>`s):
```html
<StackLayout>
<PreviousNextView><!-- add this 'wrapper' to enable those previous / next buttons -->
<StackLayout>
<StackLayout>
<TextField hint="Email"></TextField>
</StackLayout>
<StackLayout>
<TextField hint="Password"></TextField>
</StackLayout>
</StackLayout>
</PreviousNextView>
  </StackLayout>
```
#### NativeScript /w Vue usage
Vue usage is very similar to Angular usage, the only difference is in how the element is registered. Open your app's entry file, and add this:
```javascript
Vue.registerElement("PreviousNextView", () => require("nativescript-iqkeyboardmanager").PreviousNextView)
```
### Tweaking the appearance and behavior
Start by adding the following two paths into your app’s `references.d.ts` file. (See this repo’s demo app for a specific example.)
```
/// <reference path="./node_modules/tns-platform-declarations/ios/ios.d.ts" />
/// <reference path="./node_modules/nativescript-iqkeyboardmanager/index.d.ts" />
```
> **NOTE**: You might also need to `npm install --save-dev tns-platform-declarations` to bring in NativeScript’s TypeScript definitions for native iOS development.
Next, initialize an instance of `IQKeyboardManager` with the following line of code.
```typescript
const iqKeyboard = IQKeyboardManager.sharedManager();
```
You now have the full IQKeyboardManager APIs available for you to use. For example you could use the following code to switch to a dark keyboard.
```typescript
const iqKeyboard = IQKeyboardManager.sharedManager();
iqKeyboard.overrideKeyboardAppearance = true;
iqKeyboard.keyboardAppearance = UIKeyboardAppearance.Dark;
```
For more examples of what's possible, run the demo app (shown in the gif below) and check out the [app's `main-view-model.ts` file](demo/app/main-view-model.ts).
<img src="https://github.com/tjvantoll/nativescript-IQKeyboardManager/raw/master/demo.gif" width="320px"/>
### Multi-factor one-time code auto-fill
While the following is not a feature specific to IQKeyboardManager, you are here because you want the best keyboard experience for your NativeScript app and this may be helpful to know about.
iOS has a feature where a text field's QuickType search suggestion bar can suggest one-time code values for multi-factor authentication that were texted to your device.
If the field is specially-identified as a one-time code field, the suggestion will appear for about 3 minutes after being received, and the user simply has to tap the suggestion to fill in the value—no short term memorization or copy/paste gestures required. Examples of message formats are:
* 123456 is your App Name code.
* 123456 is your App Name login code.
* 123456 is your App Name verification code.
To implement this functionality in your own app, first declare `UITextContentTypeOneTimeCode` near your component imports:
```typescript
declare var UITextContentTypeOneTimeCode;
```
Then, set the field's `ios.textContentType` property:
```typescript
// This code assumes this.page exists as a reference to the current Page.
const mfaCodeField: TextField = this.page.getViewById(oneTimeCodeFieldName);
if (mfaCodeField !== null && mfaCodeField.ios) {
mfaCodeField.ios.textContentType = UITextContentTypeOneTimeCode;
}
```
There are other `textContentType` values you might want to use. You can read more about the property in [this article](https://medium.com/developerinsider/ios12-password-autofill-automatic-strong-password-and-security-code-autofill-6e7db8da1810).
## Documentation
For more details on how IQKeyboardManager works, including more detailed API documentation, refer to [the library's CocoaPod page](https://cocoapods.org/pods/IQKeyboardManager).
## Maintainers
For maintainers of this plugin's source code: when the [IQKeyboardManager Podfile](platforms/ios/Podfile) updates, you should generate new typings for this plugin to reflect those changes.
To do so, execute these commands.
```bash
cd demo
TNS_DEBUG_METADATA_PATH="$(pwd)/metadata" tns build ios
TNS_TYPESCRIPT_DECLARATIONS_PATH="$(pwd)/typings" tns build ios
```
Next, locate IQKeyboardManager’s generated typings file in the `demo/typings` folder and override the `IQKeyboardManager.d.ts` file in this repo’s root.
# @atlaskit/analytics-next-stable-react-context
Rust 天生可以有效消除数据竞争。
## 列表
- [锁同步](./threads/lock.md)
- [P.MTH.LCK.01 多线程下要注意识别锁争用的情况,避免死锁](./threads/lock/P.MTH.LCK.01.md)
- [G.MTH.LCK.01 对布尔或引用并发访问应该使用原子类型而非互斥锁](./threads/lock/G.MTH.LCK.01.md)
- [G.MTH.LCK.02 建议使用 `Arc<str> / Arc<[T]>` 来代替 `Arc<String> / Arc<Vec<T>>`](./threads/lock/G.MTH.LCK.02.md)
- [G.MTH.LCK.03 尽量避免直接使用标准库 `std::sync` 模块中的同步原语,替换为 `parking_lot`](./threads/lock/G.MTH.LCK.03.md)
- [G.MTH.LCK.04 尽量避免直接使用标准库 `std::sync::mpsc` 模块中的 `channel`,替换为 `crossbeam`](./threads/lock/G.MTH.LCK.04.md)
- [无锁](./threads/lock-free.md)
- [P.MTH.LKF.01 除非必要,否则建议使用同步锁](./threads/lock-free/P.MTH.LKF.01.md)
- [P.MTH.LKF.02 使用无锁编程时,需要合理选择内存顺序](./threads/lock-free/P.MTH.LKF.02.md) | 48.466667 | 113 | 0.651994 | yue_Hant | 0.888212 |
97c815171fe0da84181bd1fe5cc72cb3dd031008 | 517 | md | Markdown | node_modules/@atlaskit/analytics-next-stable-react-context/CHANGELOG.md | ankushk-sdm/jitsiRepo | 8b18d31fd02bc4eb7bffe398f23be2b50b33665a | [
"Apache-2.0"
] | 1 | 2021-12-13T02:17:05.000Z | 2021-12-13T02:17:05.000Z | node_modules/@atlaskit/analytics-next-stable-react-context/CHANGELOG.md | Topstar88/jitsi_meet | 40a29a714ce54a1c4e8a3bcbf33e1132103cc596 | [
"Apache-2.0"
# 3.17 Multithreading
"Apache-2.0"
] | 1 | 2020-11-04T06:01:11.000Z | 2020-11-04T06:01:11.000Z | # @atlaskit/analytics-next-stable-react-context
## 1.0.1
### Patch Changes
- [`6c525a8229`](https://bitbucket.org/atlassian/atlassian-frontend/commits/6c525a8229) - Upgraded to TypeScript 3.9.6 and tslib to 2.0.0
Since tslib is a dependency for all our packages we recommend that products also follow this tslib upgrade
to prevent duplicates of tslib being bundled.
## 1.0.0
### Major Changes
- [`8a42350632`](https://bitbucket.org/atlassian/atlassian-frontend/commits/8a42350632) - First and only version
| 30.411765 | 137 | 0.756286 | eng_Latn | 0.851601 |
97c828c2456648ce1fdd5422c04a7d6b5cf93bec | 31,444 | markdown | Markdown | _posts/2001-08-14-the-intolerable-acts.markdown | fugabi/blog | 8b19ca5b412ded3e47afdb97cefe7e3888236578 | [
"MIT"
] | null | null | null | _posts/2001-08-14-the-intolerable-acts.markdown | fugabi/blog | 8b19ca5b412ded3e47afdb97cefe7e3888236578 | [
"MIT"
] | null | null | null | _posts/2001-08-14-the-intolerable-acts.markdown | fugabi/blog | 8b19ca5b412ded3e47afdb97cefe7e3888236578 | [
"MIT"
] | null | null | null | ---
layout: post
title: The Intolerable Acts
date: '2001-08-14 08:06:00'
tags:
- philosophy
- history
- america
---
Gabriel Furmuzachi
## Preamble
The Intolerable Acts, also known as the ‘Coercive Acts’, represent a series of disciplinary measures taken by Britain’s Parliament and directed at the colonies in the New World, in the aftermath of the French and Indian War. In what follows I will try to analyze the circumstances in which these acts were brought about, then I will try to show how and why these measures, although they seemed justified in the eyes of the British government, were not accepted by the colonists, who claimed that they only obstruct their freedom and economical prosperity.
## The Circumstances
The beginning of the new policy towards the colonies is usually placed at the end of the French and Indian War[1]. Why should this be the beginning? One possible answer would be because during the seven years of war (1756 - 1763) Britain exhausted most of its resources and got to the point where it needed the support of the colonists in its attempt to regain its position as a world power. Many English leaders were dissatisfied with the colonists’ support and came to the conclusion that they needed a major reorganization and their central authority should be nowhere else but in London. Throughout the Seven Years War, the English government continually supplied the colonies with British troops so that they might be protected from the French as well as the Indians who fought on the French side during the war. The troops were kept in the New World even after the French had surrendered their territories in Canada to Great Britain. Their continued presence was to protect the colonists from Indian invasions as well as French retaliation along the borders. With this war, the English Crown accumulated about $2 million in debt. The colonies had, and still were, enjoying the benefits of being citizens of the British Empire while Great Britain was taking care of all of the costs. George Grenville, the Prime Minister of Parliament, did not appreciate the fact that England was paying the bill for the protection of the American colonists while they were gaining so much from the placement of troops there. In 1763, the time had come for the colonies to contribute as well and lighten the burden of the Crown. Following Grenville’s policy, Charles Townsend imposed another series of taxes and in 1774 Lord North, the Prime Minister at the time, passed the five infamous acts.
Let us start with the beginning - the Grenville program. Between 1763 and 1765, George Grenville was Britain’s Prime Minister. If one looks at the series of acts that he issued and supported when time came for the parliament to give its vote, one can concede they seem to be not only quite legitimate but, in some instances even required[2].
Most of the members of Parliament agreed with Lord Grenville, even if he was an ‘insufferable bore’ as King George III so eloquently mentioned once[3]. Lord Grenville was quite correct in his assessment of the situation concerning the American colonies. The debt incurred to defend them was great, and the colonists were paying very little of that bill. ‘The time had come to pay for these victories which... the American colonies had done very little to achieve… in helping to meet the expenses, Grenville considered it was only proper that at least part of the high cost of maintaining a force of ten thousand men in America… should be met by the colonists themselves’[4]. The debt had been incurred on the colonies’ behalf, and they should have to help pay for their protection. After all, Parliament reserved the right to tax any and every citizen of the British Empire, and the colonies were part of the empire. Grenville, as well as the Parliament, considered that there was no question as to whether or not the Parliament could tax the colonies. There was little opposition in the Parliament. Among those who did not agree were Grenville’s own brother-in-law, William Pitt the Earl of Chatham and the political thinker and diplomat, Edmund Burke. Pitt’s unsettling came not the ambiguity of whether or not they could tax, there was no doubt about that in any one’s mind – but whether or not they should. Pitt, like Edmund Burke, had taken into account that colonists were left alone for a very long time and that they would not appreciate a swift action from the Parliament demanding a tax.
Unfortunately, William Pitt’s fear became reality. The colonies strongly opposed the Sugar Act of 1763. A couple of decades before, in 1733, a Molasses Act was passed. It required a tariff on all sugar products that were imported into America from the West Indies. The American colonists, however, had found that it was not difficult to smuggle their sugar and avoid paying the tariffs. This sort of activity was not allowed to go on in any other part of the British Empire, and Lord Grenville saw no reason why this practice should be permitted in the colonies without being sanctioned by England. Moreover, the colonies were lightly taxed when compared to the rest of the British Empire[5]. They were doing well in America! There was enough industrial development to surprise an Englishman who had never been there before. There was absolutely no excuse for the colonists to be further exempt from taxes and do otherwise than every other British citizen. Therefore, with logic on their side, King George, Lord Grenville, and Parliament agreed that through the Revenue Act the colonists should help pay for their own protection.
The Sugar Act was a follow up of the Molasses Act and it aimed at enforcing the trade regulations combined with an additional source of revenues for the soldiers. The money resulting from the alteration of the molasses duty was to be put entirely at the disposal of the army. At the time, the money from the molasses duty did not exceed £ 700 a year, where it was expected to be around £ 200,000 a year, from a trade estimated at 8 mil. gallons. Grenville justified the decision by reminding the MP’s of Britain the vast amount of money spent on behalf of America during the war. The molasses duty will produce revenue and will maintain imperial preference (since there would still be no duty on the molasses from the British West Indies). The tax was received very badly across the Atlantic and, in the end, under much pressure from the colonies, the Parliament repealed the Sugar Act and Lord Grenville had to redesign his plans.
This is how the Stamp Act[6] came about. The idea for the Stamp Act was given to Grenville, in his quest for money for the colonies, from a London merchant (Henry McCulloh). The Stamp taxes were to be very small, around 70% of their equivalent in Britain. Corroborated with the Molasses duty, the Stamp taxes will cover one third of the army costs, the rest being taken care of by Britain itself.
The same year (1765), the Quartering Act was Grenville’s response to Thomas Gage’s discontent with regard to the difficulties over the quartering of soldiers and other problems caused by the colonial obstruction. Once again, Grenville’s action seems to be a very legitimate answer. Basically, the Quartering Act requires that the colonies are to provide housing and food for the troops sent from Britain. This, can be regarded as an indirect act of taxation but, as we have seen above, all the money went to the colonies themselves.
In 1766[7], however after continuous protests, the Parliament repelled the Stamp Act only after declaring that it had a right to bind the colonies ‘in all cases whatsoever’. The Sugar Act and the Stamp Act had failed to gain revenue from the American colonists. The bold refusal of the American colonists was a slap in the face for the British Parliament. A plan to repay the debt was not enough. A more serious approach was needed.
## The Townsend ‘Era’
In 1767, Charles Townsend took up the challenge. His legacy, inasmuch as the colonies are concerned can be summarized with what is known as the Townsend acts.
The Townsend Acts involved first the old Navigation Laws. Townsend, as well as Edmund Burke, did not expect that the colonies would protest against the Navigation Laws. They were ‘traditional commercial regulations’. They were the cornerstone of British colonial policy; they protected and promoted imperial commerce, to the benefit of mother country and colonies alike. Therefore, Burke argued that the solution of the American controversy was easy. Let Britain… ‘be content to bind America by laws of trade’ because she had ‘always done it’[8].
The most stirring act from the Townsend series, which caused much trouble in 1767, proposed taxes on glass, paper, pasteboard, painters’ supplies, and tea. When the new tariffs were boycotted in the colonies, Parliament began to feel as though ‘The colonial merchants demanded in effect free trade... or [at least] easy smuggling’[9]. Free trade was something that the mother country did not even have. All Englishmen paid their taxes. There was no one on English soil, who was exempt from any of these taxes. Realizing that they would never willingly pay their taxes to the British Crown turned out to be ‘the beginning of the end’.
‘The boycott on British goods, particularly tea, threatened the livelihood of many English merchants. More and more sympathy for America was confined to those narrow circles of forward looking people or to professional politicians in opposition’[10]. The plain truth is that colonists did not allow to be taxed. The Townsend Acts were loosing support at home because of the economic impact in England. The Parliament started to run out of ideas. The Townsend Acts were finally lifted, but the damage had already been done. It was just as Burke had feared when they were first introduced to Parliament. ‘He [had] prophesied correctly that the laws would “gain no revenue for England, but only embitter the colonists”’[11].
One law did remain intact when the Townsend Acts were repealed, and that was the Tea Act ‘for the sake of principle’[12], not for revenue. Burke had asked that this law be lifted as well because it was only causing a greater dislike of the English in America and brought absolutely no revenue. The request was denied[13]. On the evening of December 16, 1773, three companies of fifty men, masquerading as Mohawk Indians, passed through a tremendous crowd of spectators, went aboard three ships that just brought a new cargo of tea in the port of Boston, broke open the tea chests, and threw them overboard. As the electrifying news of the Boston ‘tea party’ spread, other seaports followed the example and staged similar acts of resistance of their own.
## The Coercive Acts
The news of the ‘Boston Tea Party’ reached Parliament in early 1774. In response to the constant insubordination of the colonists, King George III himself approved of measures that were going to force the colonists into submission. As a result of the king’s approval, Parliament enacted four new laws and updated an old one. These laws, the Boston Port Bill, the Administration of Justice Act, the Massachusetts Government Act, the Quebec Act, and the updating of the Quartering Act, were called “coercive” by Parliament. The new set of acts, while important in itself, was not as important as the new question that came ringing across the ocean to echo in the halls of Parliament. The colonists were questioning Parliament’s very right to tax and rule over them. The Coercive Acts were designed to be just what they came to be called by the colonists – intolerable. It was the intention of Parliament at the time of these acts to force the colonists to obey the laws and pay the taxes that they were avoiding.
The first of these laws, enacted in 1774, was meant as a direct punishment for the ‘Boston Tea Party’. Lord North, the Prime Minister at the time, presented this bill to Parliament and, with the approval of the king, Parliament closed the port of Boston and ordered that it remain closed until repayment could be made for the tea that lay at the bottom of the harbor. This act alone would be detrimental to Boston’s economy, and therefore Parliament expected compliance with its laws from that day forth.
#### The Boston Port Act (March 31, 1774)
‘An act to discontinue, in such manner, and for such time as are therein mentioned, the landing and discharging, lading or shipping, of goods, wares, and merchandise, at the town, and within the harbor, of Boston, in the province of Massachusetts’s bay, in North America.’
The text of the Act itself points to deeds that aim at subverting his Majesty’s government and destroy public peace and good order. Actions of this kind endangered the lives of his Majesty’s officials working in the Boston area. They represented a threat to society, and therefore drastic measures had to be taken. Full satisfaction was to be given not only to the officials but also to the ‘united company of merchants of England trading to the East Indies’, who had had to suffer because of the riots. Moreover, any kind of transaction involving any activity in the port of Boston was to cease immediately. There were, however, a few exceptions: anything that had to do with the king, his army or whatever was in his use; anything that had to do with fuel and supplies for the inhabitants brought from any part of America (provided the vessels would be subject to thorough search). These interdictions were to remain valid until the king said otherwise (the act took effect in June 1774).
A further step in the attempt to regain control in the colonies was the Administration of Justice Act. In the aftermath of the riots, Parliament decided that the royal officials in the American colonies needed some form of protection from the unfair legal prosecution that they might have experienced in the colonies. Therefore, the Administration of Justice Act required that any British official being tried for a crime be sent to England so that he would receive a fair trial.
#### The Administration of Justice Act (May 20, 1774)
‘An act for the impartial administration of justice in the cases of persons questioned for any acts done by them in the execution of the law, or for the suppression of riots and tumults, in the province of the Massachusetts’s Bay, in New England.’
For a period of three years, trials involving British officials were to be moved to Great Britain, and the persons involved were to appear before the King’s bench. The reason for doing so, the Act stipulates, is that there was very little chance of a fair trial in a place like Boston at the time, where ‘the execution of certain acts of the parliament hath been suffered to take place, uncontrolled and unpunished’…
It is only logical, then, that Parliament should try to limit as much as possible the power of local authorities. The next act that was issued, the Massachusetts Government Act, did exactly this. It removed the power of the assemblies and the town councils in the colonies and gave the governor complete control over them.
#### The Massachusetts Government Act (May 20, 1774)
An act for better regulating the government of the Massachusetts Bay, in New England
The act continues in the same direction as the previous ones - laws have been obstructed, the government at the time was ‘ill adapted’, etc. ‘It hath accordingly happened, that an open resistance to the execution of the laws hath actually taken place in the town of Boston, and the neighborhood thereof…’. Consequently, the counselors and assistants would be appointed by the king. His Majesty’s Governor or lieutenant-governor would be his Majesty’s direct representative. The act also called attention to the great number of unnecessary meetings, which could only imply that ‘a great abuse of power has been made on calling these meetings’. It is not difficult to realize that a smaller number of meetings meant a lower probability for riots to happen.
In June 1774, two other acts were passed – the Quebec Act and the Quartering Act.
#### The Quebec Act (June 22, 1774)
‘An act for making effectual Provision for the Government of the Province of Quebec, in North America’.
This was not one of the coercive acts, properly speaking, since it did not directly impose anything on the colonies themselves. What it did, though, was modify the borders of the English colonies (as they were settled at the Proclamation of 1763), ‘to include territory west to the Mississippi, north to the frontiers of the Hudson’s Bay territory, and the islands in the mouth of the St. Lawrence’. Thus, it took away portions of land that were meant for the North Western colonies and extended the border of Canada. It also underlined the importance of religious tolerance towards the Catholics, as well as turning a blind eye to their westward expansion.
But this was still not enough. A few days later, there was another motion passed by the Parliament with regard to the colonies.
#### The Quartering Act (June, 1774)
An act for the better providing suitable quarters for officers and soldiers in his Majesty’s service in North America.
The Quartering Act of 1765 (when Grenville was Prime Minister) was revised as a final punishment for the colonists. Previously, the colonists had been required only to supply the soldiers stationed in America with unoccupied buildings for shelter and some food provisions. The revision demanded that the hospitality offered to the soldiers be extended to the point of the colonists taking the soldiers into their own homes. There had been frictions between the colonists and the soldiers before[14]. This new act only added to their mutual dislike.
## The Intolerable Acts
At first sight, this strategy seems, again, to be very well prepared. It starts by punishing the misdeeds and thereby limits the liberties of the colonists. Then, it places the British officials under the protection of the King. Being the King’s direct representatives, they bypassed the structure of power already in place. Moreover, as if there were not already enough reasons to upset the colonists, their prospective expansion was refused. The cherry on top was the standing army, whose role, all things considered, was very difficult to establish. The Quebec Act extended the borders of the formerly French province of Quebec. Why this, when the army was supposed to defend the British colonies from any further French or Indian attacks? Was the army there to defend them at all? It is not difficult to imagine that the colonists might have asked themselves the same kind of questions.
What do we have here then? On the one hand, there is the ‘firm’ hand of England that ‘puts things in order’ in North America. On the other hand, there is the growing mistrust and even hatred among the colonists. The Coercive Acts became the Intolerable Acts. The colonists raised a new objection, this time not to the new laws, but to the very right of the Parliament to enforce any laws and taxes upon them in the first place, since they were not represented in Parliament. ‘No Taxation Without Representation’ became the colonists’ ‘next attempt at avoiding the laws of England’, and it sparked debates and reactions in the Parliament[15].
William Pitt, who had been sympathetic towards the colonists and had said many times that they should not be taxed, never said that England could not tax the colonies. That power was evident. When he asked that Parliament not tax the colonists, he reminded them that while he was opposing the taxes, he ‘at the same time, [asserted] the authority of this kingdom over the colonies to be sovereign and supreme in every circumstance’[16]. What he and the rest of the British government began to face was the question of the supremacy of Great Britain. They either ruled the colonies completely and totally, or they did not rule them at all. The colonies had challenged the authority of the Parliament at every turn, and this latest question of authority based on representation was just another excuse to avoid the laws.
William Pitt suggested that ‘the idea of a virtual representation of America in this House [was] the most contemptible that [had] ever entered into the head of man. It [did] not deserve a serious refutation’[17]. The debates went on and on, but one detail seemed to be lost in all of the arguments that were presented. If the colonies did not respect the power of Parliament, then who was actually governing America?
## Conclusion
With all of the debating that went on in Parliament over the challenge of its power in America, the question always came back to one single problem. It did not matter what laws were enacted if the colonists did not adhere to them. Edmund Burke reminded everyone in Parliament that ‘a great black book and a great many red coats [would] never be able to govern [America]’[18]. For Burke, it was true that Parliament had the right to impose taxes, but good politics has to take into account those who are governed by it; it must not be intransigent, adhering strictly to the letter of the law rather than to its spirit. It needs to adapt to circumstances, otherwise it risks bringing about ‘conflict and ruin’. Governments can only prevent wrongdoing; they are incapable of ‘producing goodness’ as such. Good legislation does not exhaust the space of all possible and permissible actions.
The number of people in Parliament who sympathized with the situation faced by people in America grew smaller by 1774. It was certainly the right of Parliament to tax the colonies for the debt incurred by the Seven Years’ War, but none of the policies worked. Parliament had attempted to retrieve the money that the colonists owed for the protection they had received during the war and the stationing of the troops there ever since. Its method for regaining the funds that had been spent on America’s behalf was always a tax of some sort or another. While William Pitt and Edmund Burke sympathized with the colonists’ refusal to pay these taxes, they maintained that Parliament did have the right to impose taxes as well as other laws upon the colonists. The right to impose a law or a tax, however, came with no guarantee that it would be followed. The colonists defied every act of Parliament and even questioned its right to be in authority over them. This forced the British government to enact even harsher laws where the colonists were concerned. Finally, when these laws were implemented, the colonists sparked a new debate as a last effort to avoid paying their taxes by saying that they were not represented in Parliament. They may not have been directly represented in Parliament, but, as it had been pointed out, no Englishman was directly represented[19].
The Parliament never asked the colonists to pay a tax that they could not afford. In reality, they were asked to pay less for the items that they had already been purchasing. Had they not been smuggling their goods into the colonies and avoiding the tax at every opportunity, they would have realized this.
In September 1774, twelve colonies, every one but Georgia, sent delegates to a ‘continental congress’ in Philadelphia to coordinate a response. The Coercive Acts secured for Massachusetts the support and sympathy of all the other colonies. The Virginia assembly called for a meeting of representatives from the 13 colonies and Canada to consider joint action against Parliament’s encroachments on colonial rights. The Congress did not seek independence from Great Britain but attempted to define America’s rights, place limits on Parliament’s power, and agree on tactics of resistance to the Coercive Acts. In October, the delegates adopted a Declaration of Rights and Grievances that denied Parliament’s right to tax or legislate for the colonies and asserted that only the colonial assemblies had that power. They did, though, concede Parliament’s authority to regulate trade. The Congress drew up the Continental Association, an agreement calling for the colonies to cease all trade with Britain until Parliament repealed the Coercive Acts. The Congress agreed upon arranging for a second meeting in May 1775. By that time, however, hostilities had begun between Britain and the colonies.
King George III, who was petitioned to intercede on behalf of the colonists, responded: ‘The colonies are in a state of rebellion’. In 1775, General Gage, the new Royal Governor, sent troops to seize colonial arms stored at the town of Concord. Soon after, a new congress in Philadelphia appointed Washington to take charge of the colonial army at Cambridge. The confrontation started.
The Coercive Acts had a short and infamous life. They did not have the power that their name suggests. Au contraire, they were considered ‘intolerable’; they represented an excess of legislation that limited the colonists’ liberty, something that could not last. This was the beginning of the American Revolution.
---
## Works Cited
Commager, Henry Steele, and Richard B. Morris, eds. The Spirit of 'Seventy-Six: The Story of the American Revolution As Told By Participants. New York: Harper & Row, 1975
Commager, Henry Steele, ed. Documents of American History. 8th ed. New York: Appleton-Croft, 1968
Cone, Carl B. Burke and the Nature of Politics: The Age of the American Revolution. Kentucky: University of Kentucky Press, 1957
Greene, Jack P. “The Origin of the American Colonial Policy 1748 – 1763”. In The Blackwell Encyclopedia of the American Revolution, edited by Jack P. Greene and J.R. Pole. Cambridge: Basil Blackwell, Inc., 1991
Hibbert, Christopher. Redcoats and Rebels: The American Revolution Through British Eyes. New York: Avon Books, 1990
Kallich, Martin, and Andrew MacLeish, eds. The American Revolution Through British Eyes. New York: Harper & Row Publishers, 1962
Lancaster, Bruce. The American Heritage Book of the American Revolution. New York, 1958
Locke, John. Two Treatises of Government. Cambridge: University Press, 1970
Miller, John. Origins of the American Revolution. Stanford: Stanford University Press, 1959
[1] Jack P. Green in his essay on ‘The Origins of the New Colonial Policies, 1746 – 1763’, writes that although it is generally agreed that the ‘new’ metropolitan colonial policy began after 1763 (i.e., after The Seven Years War), there are reasons to believe that the series of measures taken by the British officials were developed much earlier, namely around 1748. ‘The metropolitan government began to abandon its long standing posture of accommodation and conciliation towards the colonies for a policy of strict supervision and control’ (Greene, p. 95)
[2] After the war, it had been decided that more soldiers should be sent into the colonies, since there was fear that the French could strike again (there were more than 20,000 soldiers in the French West Indies). A greater number of soldiers meant more money and, as the soldiers were there for the colonists’ protection, the latter should be the ones accommodating them. It seemed only logical! The next thing that had to be done was to find a way to raise the money. Then, although the colonists’ trade was regulated by trade laws, these were not respected as they should have been, and Grenville proceeded to enforce them. For example, from a molasses trade estimated at approximately 8 million gallons, only about £700 a year in duties was collected instead of over £200,000.
[3] Hibbert, p. 1
[4] Hibbert, p. xviii
[5] The colonists had to ‘pay no more than sixpence a year against the average English taxpayer’s twenty-five shillings’ – Hibbert, p. xviii
[6] The Stamp Act required taxes on all American legal documents – newspapers, pamphlets and ‘items’ such as playing cards, dice, etc. In Massachusetts, the Stamp Act could not go into effect since there was no one to distribute the stamps.
[7] Lord Grenville lost the seat of Prime Minister in 1765
[8] Cone, p. 261-262
[9] Miller, p. 102.
[10] Lancaster, p. 24
[11] Cone, p. 193
[12] Cone, p. 195
[13] It should be mentioned here, however, that in 1773, Britain’s East India Company owned large stocks of tea that it could not sell in England. This brought it to the brink of bankruptcy. The Tea Act was an attempt to help the company. It gave the company the right to export its merchandise directly to the colonies without paying any of the regular taxes that were imposed on the colonial merchants, who had traditionally served as the middlemen in such transactions. With these privileges, it was hoped, the company could undersell American merchants and monopolize the colonial tea trade. The act backfired. First, it angered influential colonial merchants, who feared being replaced and bankrupted by a powerful monopoly. The East India Company's decision to grant franchises to certain American merchants for the sale of their tea created further resentment among those excluded from this lucrative trade. More important, however, the Tea Act revived American passions about the issue of ‘taxation without representation’, as will be shown later. The law provided no new tax on tea. Lord North assumed that most colonists would welcome the new law because it would reduce the price of tea to consumers by removing the middlemen. But the colonists responded by boycotting tea. Unlike earlier protests, this boycott mobilized large segments of the population. It also helped link the colonies together in a common experience of mass popular protest.
[14] A quick example: on the night of March 5, 1770, in what would become known as ‘the Boston Massacre’, five men were shot dead in Boston by British soldiers.
[15] From the colonies' point of view, it was impossible to consider themselves represented in Parliament unless they actually elected members to the House of Commons. But this idea conflicted with the English principle of ‘virtual representation’, according to which each Member of Parliament represented the interests of the whole country, even the empire, despite the fact that his electoral base consisted of only a tiny minority of property owners from a given district. The rest of the community was seen to be ‘represented’ on the ground that all inhabitants shared the same interests as the property owners who elected members of Parliament.
[16] Kallich & MacLeish, p. 366
[17] Kallich & MacLeish, p. 424
[18] Commager & Morris, p. 15
[19] More than half a century before, John Locke (1632-1704) wrote his ‘Two Treatises of Government’ (1689): the first treatise targets Robert Filmer’s ‘Patriarcha’ (which asserted that political authority stemmed from direct descent through the royal line from Adam’s original title). However, Locke argues, the absolute title makes the rest of humankind slaves. The second treatise draws on the social contract as the source of authority. In a pre-social state, no adult has natural authority over any other; rather, everyone is in a perfect state of freedom. Given the law of nature, each person in the state of nature has the authority to enforce it. Thus, if you violate my rights I, in turn, have the right to punish you. Included in my rights, Locke would argue, is the right to property. I have the right to the product of my own labor when I turn virgin soil into farmland. This is what the colonists were doing.
## term-list-enhanced [](https://travis-ci.org/turingou/term-list-enhanced)
An enhancement module for term-list, supporting setting or clearing labels.
### Installation
```
$ [sudo] npm install term-list-enhanced
```
### Example
Just clone this repo and run the example:
```
$ git clone https://github.com/turingou/term-list-enhanced.git
$ cd term-list-enhanced
$ node examples/menu.js
```
````javascript
var List = require('term-list-enhanced');
var menu = new List({
labelKey: 'label'
});
menu.adds([
'Welcome to Term List Enhanced',
'==============================',
'this line is focused by default',
{
label: '+ : click me to add new line',
add: true
}, {
    label: '√ : click me to update label',
update: true
}
]);
menu.on('keypress', function(key, index) {
if (key.name === 'return') {
var item = menu.item(index);
if (item.add) {
return menu.append({
label: '+++ a new line, click to remove me',
remove: true
});
}
if (item.update) {
return menu.update(index, '| wooha !!')
}
if (item.remove) {
return menu.remove(index);
}
menu.update(index, 'you\'ve choose the ' + (index + 1) + 'th');
} else if (key.name === 'q') {
return menu.exit();
}
})
menu.on('empty', function() {
menu.stop();
});
menu.start(2)
````
### API
check this file: `index.js`
### Contributing
- Fork this repo
- Clone your repo
- Install dependencies
- Checkout a feature branch
- Feel free to add your features
- Make sure your features are fully tested
- Open a pull request, and enjoy <3
### MIT license
Copyright (c) 2014 turing <o.u.turing@Gmail.com>
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
---

built upon love by [docor](https://github.com/turingou/docor.git) v0.1.3
# TelemetryDruidTimeBoundaryRequestAllOf
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**DataSource** | [**TelemetryDruidDataSource**](TelemetryDruidDataSource.md) | |
**Bound** | Pointer to **string** | Optional, set to maxTime or minTime to return only the latest or earliest timestamp. Default to returning both if not set. | [optional]
**Filter** | Pointer to [**TelemetryDruidFilter**](TelemetryDruidFilter.md) | | [optional]
**Context** | Pointer to [**TelemetryDruidQueryContext**](TelemetryDruidQueryContext.md) | | [optional]
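These fields mirror Apache Druid's native `timeBoundary` query type, which this model wraps. As a rough, hypothetical illustration (the field spellings follow Druid's native JSON query format, not anything stated in this document, and `"sample_datasource"` stands in for a real `TelemetryDruidDataSource` value), a serialized request body looks like:

```python
import json

# Hypothetical illustration of a timeBoundary query body using the
# properties documented above: dataSource is required; bound, filter,
# and context are optional.
query = {
    "queryType": "timeBoundary",
    "dataSource": "sample_datasource",
    "bound": "maxTime",  # omit to get both minTime and maxTime back
}

payload = json.dumps(query, indent=2)
print(payload)
```

Setting `bound` restricts the response to a single timestamp; omitting it returns both the earliest and latest timestamps, matching the `Bound` property description above.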
## Methods
### NewTelemetryDruidTimeBoundaryRequestAllOf
`func NewTelemetryDruidTimeBoundaryRequestAllOf(dataSource TelemetryDruidDataSource, ) *TelemetryDruidTimeBoundaryRequestAllOf`
NewTelemetryDruidTimeBoundaryRequestAllOf instantiates a new TelemetryDruidTimeBoundaryRequestAllOf object
This constructor will assign default values to properties that have it defined,
and makes sure properties required by API are set, but the set of arguments
will change when the set of required properties is changed
### NewTelemetryDruidTimeBoundaryRequestAllOfWithDefaults
`func NewTelemetryDruidTimeBoundaryRequestAllOfWithDefaults() *TelemetryDruidTimeBoundaryRequestAllOf`
NewTelemetryDruidTimeBoundaryRequestAllOfWithDefaults instantiates a new TelemetryDruidTimeBoundaryRequestAllOf object
This constructor will only assign default values to properties that have it defined,
but it doesn't guarantee that properties required by API are set
### GetDataSource
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetDataSource() TelemetryDruidDataSource`
GetDataSource returns the DataSource field if non-nil, zero value otherwise.
### GetDataSourceOk
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetDataSourceOk() (*TelemetryDruidDataSource, bool)`
GetDataSourceOk returns a tuple with the DataSource field if it's non-nil, zero value otherwise
and a boolean to check if the value has been set.
### SetDataSource
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) SetDataSource(v TelemetryDruidDataSource)`
SetDataSource sets DataSource field to given value.
### GetBound
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetBound() string`
GetBound returns the Bound field if non-nil, zero value otherwise.
### GetBoundOk
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetBoundOk() (*string, bool)`
GetBoundOk returns a tuple with the Bound field if it's non-nil, zero value otherwise
and a boolean to check if the value has been set.
### SetBound
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) SetBound(v string)`
SetBound sets Bound field to given value.
### HasBound
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) HasBound() bool`
HasBound returns a boolean if a field has been set.
### GetFilter
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetFilter() TelemetryDruidFilter`
GetFilter returns the Filter field if non-nil, zero value otherwise.
### GetFilterOk
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetFilterOk() (*TelemetryDruidFilter, bool)`
GetFilterOk returns a tuple with the Filter field if it's non-nil, zero value otherwise
and a boolean to check if the value has been set.
### SetFilter
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) SetFilter(v TelemetryDruidFilter)`
SetFilter sets Filter field to given value.
### HasFilter
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) HasFilter() bool`
HasFilter returns a boolean if a field has been set.
### GetContext
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetContext() TelemetryDruidQueryContext`
GetContext returns the Context field if non-nil, zero value otherwise.
### GetContextOk
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) GetContextOk() (*TelemetryDruidQueryContext, bool)`
GetContextOk returns a tuple with the Context field if it's non-nil, zero value otherwise
and a boolean to check if the value has been set.
### SetContext
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) SetContext(v TelemetryDruidQueryContext)`
SetContext sets Context field to given value.
### HasContext
`func (o *TelemetryDruidTimeBoundaryRequestAllOf) HasContext() bool`
HasContext returns a boolean if a field has been set.
[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
---
title: "Projects"
header:
overlay_image: "/images/website/background-projects.jpg"
overlay_color: "#000"
overlay_filter: "0.5"
caption: "Photo credit: [**Unsplash**](https://unsplash.com)"
layout: collection
permalink: /projects/
collection: projects
entries_layout: grid
classes: wide
author_profile: yes
---
# Prometheus sample app
This Prometheus sample app generates all 4 Prometheus metric types (counter, gauge, histogram, summary) and exposes them at the `/metrics` endpoint
At the same time, the generated metrics are also exposed at the `/expected_metrics` endpoint in the Prometheus remote write format
A health check endpoint also exists at `/`
The following is a list of optional environment variables for configuration:
* `LISTEN_ADDRESS`: (default = `0.0.0.0:8080`)this defines the address and port that the sample app is exposed to. This is primarily to conform with the test framework requirements.
* `INSTANCE_ID`: a unique identifier for a batch of metrics from 1 instance of a sample app. Every metric name from a sample app instance will be prefixed by `testINSTANCE_ID_` if specified, else the prefix would be `test_`
* `METRICS_LOAD`: (default=1) the amount of each type of metric to generate. The same amount of metrics is always generated per metric type.
Steps for running locally:
```bash
docker build . -t prometheus-sample-app
docker run -it -e INSTANCE_ID=test1234 \
-e LISTEN_ADDRESS=0.0.0.0:8080 \
-p 8080:8080 prometheus-sample-app
curl localhost:8080/metrics
```
Note that the port in LISTEN_ADDRESS must match the second port specified in the port-forward
More functioning examples:
```bash
docker build . -t prometheus-sample-app
docker run -it -e INSTANCE_ID=test1234 \
-e LISTEN_ADDRESS=0.0.0.0:8080 \
-p 9001:8080 prometheus-sample-app
curl localhost:9001/metrics
```
```bash
docker build . -t prometheus-sample-app
docker run -it -e LISTEN_ADDRESS=0.0.0.0:8080 \
-e METRICS_LOAD=10 \
-p 9001:8080 prometheus-sample-app
curl localhost:9001/metrics
``` | 41.268293 | 223 | 0.774823 | eng_Latn | 0.952412 |
# Solution
This exercise is about floating-point arithmetic and tricky rounding.
No special algorithm or data structure is needed.
## My approach:
1. We save each expense in an array position and calculate the sum of
all expenses.
2. We calculate the average.
3. We accumulate each student's difference from the average, keeping the sum of positive differences and the sum of negative differences separately.
We need to consider two things:
* There is a negative sum and a positive sum. In the end we need the maximum
  (in absolute value) of these two, otherwise the result may be $0.01 too low.
* As we only have a precision of 2 decimal places, we need to round after
  each addition.
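A Python sketch of this approach follows (the repo's actual solution is in C; see [10137_trip.c](10137_trip.c)). Rather than rounding after each addition, this version works in exact integer cents and compares each student against the floor and ceiling of the average, which produces the same two sums and the same final maximum:

```python
# Python sketch of the approach above (the repo's solution is in C).
# Working in integer cents sidesteps the repeated-rounding pitfall:
# students below the rounded-down average pay up to it, students above
# the rounded-up average are refunded down to it, and the answer is the
# larger of the two totals.

def min_exchange(expenses):
    cents = [round(e * 100) for e in expenses]
    avg = sum(cents) / len(cents)
    low = int(avg)                        # average rounded down to a cent
    high = low + (1 if avg > low else 0)  # average rounded up to a cent
    give = sum(low - c for c in cents if c < low)    # owed by those below
    take = sum(c - high for c in cents if c > high)  # owed to those above
    return max(give, take) / 100
```

For the classic samples from the problem statement, `[10.00, 20.00, 30.00]` yields `10.00` and `[15.00, 15.01, 3.00, 3.01]` yields `11.99`.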
## Analysis:
n is the number of students and thus also the number of expenses.
1. Takes O(n) time
2. Takes O(1) time
3. Takes O(n) time
T = O(n) + O(1) + O(n) = O(2 * n) = O(n)
Code | Result
--- | ---
[10137_trip.c](10137_trip.c) | 
| 30.777778 | 71 | 0.717208 | eng_Latn | 0.999248 |
97cae3204f2bea97520de45ae58492e20796f8ae | 197 | md | Markdown | README.md | robiningelbrecht/sudoku-solver-vanilla-js | 1778f3369c844a9db8a92c86703424d2abd21fef | [
"MIT"
] | null | null | null | README.md | robiningelbrecht/sudoku-solver-vanilla-js | 1778f3369c844a9db8a92c86703424d2abd21fef | [
"MIT"
] | null | null | null | README.md | robiningelbrecht/sudoku-solver-vanilla-js | 1778f3369c844a9db8a92c86703424d2abd21fef | [
"MIT"
] | null | null | null | # Sudoku solver
A visual representation of how this algorithm solves a sudoku.
Demo: [https://sudoku-solver-vanilla-js.robiningelbrecht.be/](https://sudoku-solver-vanilla-js.robiningelbrecht.be/)
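The repo itself is vanilla JavaScript; as a language-agnostic illustration, the Python sketch below shows the classic recursive backtracking algorithm that visualizations like this typically animate (an assumption here, since the repo's solver may differ in details):

```python
# Sketch of recursive backtracking Sudoku solving (illustrative; the
# repo's vanilla-JS solver may differ). 0 marks an empty cell.

def solve(board):
    """Solve a 9x9 Sudoku in place; return True once a solution is found."""
    def ok(r, c, v):
        if any(board[r][j] == v for j in range(9)):
            return False
        if any(board[i][c] == v for i in range(9)):
            return False
        br, bc = 3 * (r // 3), 3 * (c // 3)
        return all(board[br + i][bc + j] != v
                   for i in range(3) for j in range(3))

    for r in range(9):
        for c in range(9):
            if board[r][c] == 0:
                for v in range(1, 10):
                    if ok(r, c, v):
                        board[r][c] = v          # tentative placement
                        if solve(board):
                            return True
                        board[r][c] = 0          # backtrack
                return False                     # dead end
    return True                                  # no empty cells left
```

Each tentative placement and each backtrack step corresponds to one frame of the visualization.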
| 32.833333 | 116 | 0.786802 | hun_Latn | 0.160677 |
97cb5c4a8dc77629941c3b50ff9297f1f9d7aa2c | 2,958 | md | Markdown | _pages/research.md | sljardine/sljardine.github.io | afd1582331955687fc02109dd51143d43134ae6d | [
"MIT"
] | null | null | null | _pages/research.md | sljardine/sljardine.github.io | afd1582331955687fc02109dd51143d43134ae6d | [
"MIT"
] | null | null | null | _pages/research.md | sljardine/sljardine.github.io | afd1582331955687fc02109dd51143d43134ae6d | [
"MIT"
] | null | null | null | ---
layout: archive
title: "Current Research Topics"
permalink: /research/
author_profile: true
---
For a full list of my publications and grants see my [Curriculum Vitae](/files/Jardine CV.pdf) or [Google Scholar page](https://scholar.google.com/citations?user=h4BCQ3oAAAAJ&hl=en&oi=ao). Below I summarize projects that are currently taking up most of my research time.
# Harmful Algal Blooms
<p align="center">
<img width="500" height="500" src="/images/habs.jpg">
</p>
In the 2015/2016 fishing season, managers shut down the California Dungeness crab fishery in response to an unprecedented harmful algal bloom (HAB) event. Impacts include reduced harvest, [reduced ex-vessel prices](https://www.journals.uchicago.edu/doi/abs/10.1086/707643), [shifts in the distribution of fishing revenue towards large vessels](https://www.sciencedirect.com/science/article/abs/pii/S0921800919312935), [shifts in fishing portfolios](https://www.pnas.org/content/118/2/e2014379117), and [disruptions to community traditions](https://www.sciencedirect.com/science/article/pii/S1568988320300792). As climate change is expected to increase the occurrence of HAB events, it is important to explore policies that can mitigate the impacts from HABs.
# Investments in the Commons
<p align="center">
<img width="460" height="300" src="/images/walleyeFingerlings.jpg">
</p>
The dominant narrative of open access resources is one of a lack of stewardship incentives leading to overexploitation. While a good description of fundamental drivers in these systems, the current dialogue almost entirely ignores real-world complexities surrounding observed investments that resource users make in improving resource value and sustainability. Examples include investment in [marketing seafood products](https://onlinelibrary.wiley.com/doi/abs/10.1093/ajae/aau050) and in [stocking fish harvested under recreational open access](https://www.ecologyandsociety.org/vol26/iss2/art16/). I am interested in understanding the economic incentives for investment in the commons, and the biological and socioeconomic outcomes of these investments.
# Salmon Conservation
<p align="center">
<img width="50%" height="50%" src="/images/culvert.jpg">
</p>
Migratory anadromous salmon and steelhead native to Western Washington (*Oncorhynchus spp.*) rely on access to streams throughout the spawning and rearing phases of their complex life cycles. Barriers to fish passage, particularly poorly designed culverts at road crossings, prevent fish from accessing potential habitat, hampering recovery efforts for declining populations and violating tribal treaty rights. In the 2021-2023 biennium, roughly $750 million will be spent restoring state-owned culverts in Washington, and millions more will be spent on culverts owned by cities, counties, and other parties. I am interested in exploring opportunities to improve the cost-effectiveness of these investments.
| 70.428571 | 760 | 0.797498 | eng_Latn | 0.983812 |
97cbe9c81de05d2cdefc71a85e724379d2871033 | 289 | md | Markdown | README.md | bblacklock3/Capstone | 6591651b6b038be6a28eb76210e91532176123d2 | [
"MIT"
] | null | null | null | README.md | bblacklock3/Capstone | 6591651b6b038be6a28eb76210e91532176123d2 | [
"MIT"
] | null | null | null | README.md | bblacklock3/Capstone | 6591651b6b038be6a28eb76210e91532176123d2 | [
"MIT"
] | 2 | 2022-02-23T19:25:07.000Z | 2022-03-01T19:41:44.000Z | # Capstone
2022 Spring GT Capstone
source: code that's finalized / fully developed
kinematic_analysis: stuff used to analyze plots from research papers and outputs of video analyses from Kinovea
simulation: dynamics, simulating kicks
controller: controller...
scripts: WIPs / minor tasks
| 28.9 | 111 | 0.816609 | eng_Latn | 0.945833 |
97cc1ad87fc8000918bd1cc1b798dd267421a915 | 30 | md | Markdown | skeleton/content/fr/guide/overview/content.md | etermind/alex-demo-apress | b0014f5412139929c680ef261968247c150668f8 | [
"MIT"
] | null | null | null | skeleton/content/fr/guide/overview/content.md | etermind/alex-demo-apress | b0014f5412139929c680ef261968247c150668f8 | [
"MIT"
] | null | null | null | skeleton/content/fr/guide/overview/content.md | etermind/alex-demo-apress | b0014f5412139929c680ef261968247c150668f8 | [
"MIT"
] | null | null | null | ---
template: 'guide.htm'
---
| 7.5 | 21 | 0.533333 | yue_Hant | 0.189556 |
97cc1c09118fd6137e27f7e67b1f1c9ba47881be | 1,441 | md | Markdown | docs/notes/93f47e4f-849b-45cc-b250-e3d674c20b7f.md | handuozh/dendron-pkm | a7a35ce8c9e5ca84a288ca10181ef10ecbe3fbda | [
"MIT"
] | null | null | null | docs/notes/93f47e4f-849b-45cc-b250-e3d674c20b7f.md | handuozh/dendron-pkm | a7a35ce8c9e5ca84a288ca10181ef10ecbe3fbda | [
"MIT"
] | null | null | null | docs/notes/93f47e4f-849b-45cc-b250-e3d674c20b7f.md | handuozh/dendron-pkm | a7a35ce8c9e5ca84a288ca10181ef10ecbe3fbda | [
"MIT"
] | null | null | null | ---
id: 93f47e4f-849b-45cc-b250-e3d674c20b7f
title: Keypoint-detection-and-description
desc: ''
updated: 1606203942337
created: 1606196809674
parent: 2b4ada79-0c24-4eb3-85a9-71aee1c0ddbe
children: []
fname: pkm.Keypoint.keypoint-detection-and-description
hpath: pkm.Keypoint.keypoint-detection-and-description
---
# Keypoint Detection and Description Simultaneously
- tags: [detection-and-describe](1a60b043-5b79-4df5-a1f2-d92db7b4e8d3)
As before, a quick overview first. Unsurprisingly, nearly all current work learns the detector and descriptor jointly, and this task has made very large progress. A few works I find particularly interesting: first, CVPR'19's [pkm.D2-Net](5bc96717-b624-46ed-afff-6c08666793a2) (<https://github.com/mihaidusmanu/d2-net>), whose main point is that the detector does not really need to be learned separately: taking the depth-wise maximum and the local spatial maximum of the descriptor map is enough. Following this direction, there is not even a need to design a dedicated detector loss (that loss is genuinely hard to design), which is a very attractive property. However, D2-Net's keypoint localization is still quite inaccurate, which has a large impact on geometry computation, so it is probably not yet suitable for geometry-sensitive tasks such as SfM or SLAM. More recently, [pkm.R2D2](a57da98a-ba3b-4ff0-bd63-df3de73deb55) (<https://arxiv.org/abs/1906.06195>) is somewhat more complex than D2-Net, but judging from the experimental results its localization error is much smaller; it will be worth testing more once it is open-sourced. Note, however, that both D2-Net and R2D2 depend on dense ground-truth correspondences (D2-Net relies on MegaDepth, while R2D2 interpolates its own with EpicFlow), and both the acquisition cost and the accuracy of such data are troublesome... Hence the recent [pkm.Unsuperpoint](e33d889e-271d-4140-a9ac-ebe5c511c502), a fully [pkm.Self-supervised](f7836a8a-f790-4c84-b3b4-5f12f2f75160) method that uses only homography transformations (much like [pkm.Superpoint](52dc650e-80ec-49d3-a804-23df714f1469) in this respect, but made self-improving, which is more elegant than [pkm.Superpoint](52dc650e-80ec-49d3-a804-23df714f1469)). Its results look quite good as well; I look forward to testing it carefully once it is open-sourced.
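As a concrete illustration of D2-Net's describe-then-detect idea, the sketch below (pure Python, illustrative only, not the paper's implementation) keeps a pixel as a keypoint when it is both the depth-wise maximum at its own location and a local spatial maximum of that channel:

```python
# Pure-Python illustration of D2-Net's describe-then-detect idea (not
# the paper's implementation): a pixel is kept as a keypoint when it is
# the depth-wise maximum at its location and a local spatial maximum of
# that channel in its 3x3 neighbourhood.

def detect_keypoints(desc):
    """desc: H x W x C nested lists (a dense descriptor map)."""
    H, W = len(desc), len(desc[0])
    keypoints = []
    for y in range(H):
        for x in range(W):
            channels = desc[y][x]
            c = max(range(len(channels)), key=channels.__getitem__)
            v = channels[c]                       # depth-wise maximum
            neighbourhood = [
                desc[j][i][c]
                for j in range(max(0, y - 1), min(H, y + 2))
                for i in range(max(0, x - 1), min(W, x + 2))
            ]
            if v >= max(neighbourhood):           # local spatial maximum
                keypoints.append((y, x, c))
    return keypoints
```

On a toy map with a single strong response, only that location survives; the real method additionally soft-scores responses so the detection is trainable.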
| 80.055556 | 1,000 | 0.850104 | yue_Hant | 0.340395 |
97cc6df927b8c5b5b4eb38275ffefbe0710ef46f | 363 | md | Markdown | README.md | OhmVikrant/Document-field-detection-using-Template-Matching | 5eec308c2c3a7f971144157f5b87289acbd83eb2 | [
"MIT"
] | 1 | 2020-09-05T19:03:58.000Z | 2020-09-05T19:03:58.000Z | README.md | OhmVikrant/Document-field-detection-using-Template-Matching | 5eec308c2c3a7f971144157f5b87289acbd83eb2 | [
"MIT"
] | null | null | null | README.md | OhmVikrant/Document-field-detection-using-Template-Matching | 5eec308c2c3a7f971144157f5b87289acbd83eb2 | [
"MIT"
] | null | null | null | # Document-field-detection-using-Template-Matching
Document field detection using Template Matching incorporating the OpenCV library in Python
The link to an article on this written by me can be found here: [codespeedy.com/document-field-detection-using-template-matching](https://www.codespeedy.com/document-field-detection-using-template-matching-in-python/)
| 60.5 | 217 | 0.826446 | eng_Latn | 0.877065 |
97cd08ab89207f6e6898d94c55efe6ccfabcceea | 2,735 | md | Markdown | README.md | Plisp/cl-async | 59b1daefb4ededbd6a2d998a1b81669a96d05140 | [
"MIT"
] | null | null | null | README.md | Plisp/cl-async | 59b1daefb4ededbd6a2d998a1b81669a96d05140 | [
"MIT"
] | null | null | null | README.md | Plisp/cl-async | 59b1daefb4ededbd6a2d998a1b81669a96d05140 | [
"MIT"
] | null | null | null | cl-async - Asynchronous operations for Common Lisp
==================================================
Cl-async is a library for general purpose, non-blocking programming in Common
Lisp. Cl-async uses [libuv](http://docs.libuv.org/en/v1.x/) as the backend,
which is a fast, stable, portable library for asynchronous IO (used as the
backend library in Node.js).
The main goal is to provide an experience that makes general asynchronous
programming in lisp natural, and to also provide a number of
[drivers](http://orthecreedence.github.com/cl-async/drivers) on top of cl-async.
__NOTE:__ cl-async uses the v1.x branch of libuv, so make sure to grab that
version of it (not the v0.10.x branch).
### [Documentation](http://orthecreedence.github.com/cl-async/documentation)
Please see the [cl-async website](http://orthecreedence.github.com/cl-async) for
full documentation, examples, etc.
Quick links:
- [Documentation](http://orthecreedence.github.com/cl-async/documentation)
- [Base system](http://orthecreedence.github.com/cl-async/base)
- [Timers](http://orthecreedence.github.com/cl-async/timers)
- [Signal handling](http://orthecreedence.github.com/cl-async/signal-handling)
- [DNS](http://orthecreedence.github.com/cl-async/dns)
- [TCP](http://orthecreedence.github.com/cl-async/tcp)
- [TCP stream](http://orthecreedence.github.com/cl-async/tcp-stream)
- [TCP SSL](http://orthecreedence.github.com/cl-async/tcp-ssl)
- [Pollers](http://orthecreedence.github.com/cl-async/pollers)
- [Idlers](http://orthecreedence.github.com/cl-async/idlers)
- [Notifiers](http://orthecreedence.github.com/cl-async/notifiers)
- [Futures](http://orthecreedence.github.com/cl-async/future)
- [Threading](http://orthecreedence.github.com/cl-async/threading)
- [Stats](http://orthecreedence.github.com/cl-async/stats)
- [Event callbacks and error handling](http://orthecreedence.github.com/cl-async/event-handling)
- [Examples](http://orthecreedence.github.com/cl-async/examples)
- [Benchmarks](http://orthecreedence.github.com/cl-async/benchmarks)
- [Implementation notes](http://orthecreedence.github.com/cl-async/implementation-notes)
- [Drivers](http://orthecreedence.github.com/cl-async/drivers)
### Install
```lisp
(ql:quickload :cl-async)
```
Please be aware that until cl-async v0.6.x is in quicklisp, you might want to
git clone the master branch into `quicklisp/local-projects/`.
### Tests
There is a fairly complete suite of tests in the `cl-async-test` package:
```common-lisp
(ql:quickload :cl-async-test)
(cl-async-test:run-tests)
(cl-async-test:run-tests :ssl t)
(cl-async-test:run-tests :threading t)
```
### License
As always, my code is MIT licensed. Do whatever the hell you want with it. Enjoy!
| 44.112903 | 98 | 0.737112 | eng_Latn | 0.595195 |
97cd53889449dab8c56f25c72787a35da1094048 | 344 | md | Markdown | README.md | ceeb57f83688/torrc | c1f23286406ed10b3a6b1645f257a64ba5d527e4 | [
"CC0-1.0"
] | 4 | 2017-10-12T10:29:38.000Z | 2019-04-02T00:03:56.000Z | README.md | ceeb57f83688/torrc | c1f23286406ed10b3a6b1645f257a64ba5d527e4 | [
"CC0-1.0"
] | null | null | null | README.md | ceeb57f83688/torrc | c1f23286406ed10b3a6b1645f257a64ba5d527e4 | [
"CC0-1.0"
] | 3 | 2018-09-13T11:06:04.000Z | 2021-06-25T10:44:42.000Z | # torrc and related files
Configuration files for my Tor relay
## Install
Hardlink each file into the right place in `/usr/local/etc` (you're on your own here, for now). Git does not track permissions beyond the execute bit, so these are tracked in `set-perms.sh`.
## Author
AJ Jordan <alex@strugee.net>
## License
Creative Commons Zero
| 21.5 | 190 | 0.744186 | eng_Latn | 0.993777 |
97ce875193e6cb7c9ce7fd2f55394791fc7fb60c | 6,618 | md | Markdown | sdk/spring/azure-spring-boot-samples/azure-spring-boot-sample-active-directory-resource-server-obo/README.md | jimwrightcz/azure-sdk-for-java | c1c082029e8e7ada4fac5f8f011ca43334506ae5 | [
"MIT"
] | null | null | null | sdk/spring/azure-spring-boot-samples/azure-spring-boot-sample-active-directory-resource-server-obo/README.md | jimwrightcz/azure-sdk-for-java | c1c082029e8e7ada4fac5f8f011ca43334506ae5 | [
"MIT"
] | null | null | null | sdk/spring/azure-spring-boot-samples/azure-spring-boot-sample-active-directory-resource-server-obo/README.md | jimwrightcz/azure-sdk-for-java | c1c082029e8e7ada4fac5f8f011ca43334506ae5 | [
"MIT"
] | null | null | null | # OAuth 2.0 Sample for azure-spring-boot-sample-active-directory-resource-server-obo library for Java
## Key concepts
[Resource server access other resources usage][resource-server-access-other-resources-usage] is an extension scenario of the *azure-spring-boot-sample-active-directory-spring-security-resource-server* sample. Similarly, this sample illustrates how to protect a Java web API by restricting access to its resources to authorized accounts, and how the protected resource can in turn access other protected resources, such as the Microsoft Graph API and a custom API.
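This scenario relies on the OAuth 2.0 on-behalf-of (OBO) flow: the resource server exchanges the access token it received from the caller for a new token scoped to the downstream resource (Graph or the custom API). The sketch below shows the shape of that token request; it is illustrative only (in Python for brevity), since the starter delegates the exchange to MSAL4J, and all values are placeholders:

```python
# Illustrative shape of the OAuth 2.0 on-behalf-of (OBO) token request.
# The starter performs this exchange via MSAL4J; do not hand-roll it in
# production. All values below are placeholders.

def build_obo_request(tenant_id, client_id, client_secret,
                      incoming_token, scopes):
    """Return the token endpoint URL and form parameters for exchanging
    `incoming_token` for a downstream-resource token."""
    url = (f"https://login.microsoftonline.com/{tenant_id}"
           "/oauth2/v2.0/token")
    form = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "client_id": client_id,
        "client_secret": client_secret,
        "assertion": incoming_token,   # the token the caller presented
        "scope": " ".join(scopes),     # e.g. the custom File.read scope
        "requested_token_use": "on_behalf_of",
    }
    return url, form

url, form = build_obo_request(
    "tenant-id", "resource-server-client-id", "secret",
    "<caller-access-token>", ["api://custom-client-id/File.read"])
```

The response contains an access token that the resource server attaches to its outgoing Graph or custom API call.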
## Getting started
We will prepare two application to demonstrate the dependent calls of resources.
Another sample [spring security resource server sample][azure-spring-boot-sample-active-directory-spring-security-resource-server] will be as Custom API resource.
### Environment checklist
We need to ensure that this [environment checklist][ready-to-run-checklist] is completed before the run.
### Include the package
```xml
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-security</artifactId>
</dependency>
<dependency>
<groupId>com.azure.spring</groupId>
<artifactId>azure-spring-boot-starter-active-directory</artifactId>
<version>3.0.0-beta.2</version> <!-- {x-version-update;com.azure.spring:azure-spring-boot-starter-active-directory;current} -->
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-oauth2-resource-server</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-oauth2-client</artifactId>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>msal4j</artifactId>
<version>1.8.0</version> <!-- {x-version-update;com.microsoft.azure:msal4j;external_dependency} -->
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-oauth2-jose</artifactId>
<version>5.3.5.RELEASE</version> <!-- {x-version-update;org.springframework.security:spring-security-oauth2-jose;external_dependency} -->
</dependency>
</dependencies>
```
### Register your Web API
You can follow the *azure-spring-boot-sample-active-directory-spring-security-resource-server* sample to add the `ResourceAccessGraph.read` and `ResourceAccessGraphCustomResources.read` scopes.
By convention, the current application's Application ID URI is `api://sample-client-id`; the Application ID URI of the *azure-spring-boot-sample-active-directory-spring-security-resource-server* sample is `api://custom-client-id`.
After adding as shown below:

### Add API permissions
The current Web API will access Graph API and Custom API.
Sign in to the [Azure portal][azure-portal]. If you have access to multiple tenants, use the **Directory + subscription** filter in the top menu to select the tenant containing your client app's registration.
#### Add Graph API Permission
1. Select **Azure Active Directory** > **App registrations**, and then select your current sample application (not your web API).
2. Select **API permissions** > **Add a permission** > **Microsoft APIs** > **Microsoft Graph** > **Delegated permissions**, select **User.Read**, select **Add permission** to complete the process.
#### Add Custom API Permission
1. Select **Azure Active Directory** > **App registrations**, and then select your current sample application (not your web API).
2. Select **API permissions** > **Add a permission** > **My APIs**, select *azure-spring-boot-sample-active-directory-spring-security-resource-server* application name.
3. **Delegated permissions** is selected by default. Select the **File** > **File.Read** permission, then select **Add permission** to complete the process.
### Grant consent for your tenant
Grant admin consent to both the Graph and custom permissions. After granting consent, it appears as shown below:

## Examples
### Configure application.yaml
```yaml
azure:
activedirectory:
resource-server:
obo:
enabled: true
client-id: [resource-server-application-client-id]
client-secret: [resource-server-application-client-secret]
tenant-id: [tenant-id-registered-by-application]
app-id-uri: api://sample-client-id
authorization:
graph:
scopes:
- User.read
custom:
scopes:
- api://custom-client-id/File.read
```
### Run with Maven
```shell
cd azure-spring-boot-samples/azure-spring-boot-sample-active-directory-resource-server-obo
mvn spring-boot:run
```
### Access the Web API
First, you need to obtain an access token for the sample API.
- The API will only call the Graph resource.
```shell script
# Replace with a valid access token.
curl localhost:8081/call-graph-only -H "Authorization: Bearer <replace-the-access-token>"
```
Verify response:
```text
Graph response success.
```
- The sample API will only call the Graph resource.
```shell script
# Replace with a valid access token.
curl localhost:8081/call-graph-only -H "Authorization: Bearer <replace-the-access-token>"
# Another endpoint for accessing the Graph resource.
curl localhost:8081/call-graph-only-with-annotation -H "Authorization: Bearer <replace-the-access-token>"
```
Verify response:
```text
Graph response success.
```
- The sample API will call both the Graph and custom resources.
```shell script
# Replace with a valid access token.
curl localhost:8081/call-graph-and-custom-resources -H "Authorization: Bearer <replace-the-access-token>"
```
Verify response:
```text
Graph response success. Custom(local) response success.
```
## Troubleshooting
## Next steps
## Contributing
<!-- LINKS -->
[azure-portal]: https://portal.azure.com/
[ready-to-run-checklist]: https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/spring/azure-spring-boot-samples/README.md#ready-to-run-checklist
[resource-server-access-other-resources-usage]: https://github.com/Azure/azure-sdk-for-java/tree/master/sdk/spring/azure-spring-boot-starter-active-directory#resource-server-access-other-resources-usage
| 40.601227 | 437 | 0.739196 | eng_Latn | 0.676362 |
97ceedd94a80beadff0d3c13233019e93beea02d | 28 | md | Markdown | bower_components/captionss/CONTRIBUTING.md | simhagate/joshanderin.com | 2832db9da235e91aabdd7785cd62fd353217abcc | [
"MIT"
] | 8 | 2015-01-21T04:35:27.000Z | 2017-04-25T15:22:34.000Z | bower_components/captionss/CONTRIBUTING.md | simhagate/joshanderin.com | 2832db9da235e91aabdd7785cd62fd353217abcc | [
"MIT"
] | 1 | 2015-07-06T07:10:03.000Z | 2015-07-06T07:10:03.000Z | bower_components/captionss/CONTRIBUTING.md | simhagate/joshanderin.com | 2832db9da235e91aabdd7785cd62fd353217abcc | [
"MIT"
] | 4 | 2015-07-12T21:25:07.000Z | 2015-11-10T14:43:12.000Z | # Contributing to captionss
| 14 | 27 | 0.821429 | eng_Latn | 0.988897 |
97cf2d38e6722121ab4b718f9deb7f692857b653 | 8,334 | md | Markdown | README.md | shapesecurity/shift-template-js | 36bd2f493882106da83c3c8b099ec6491cbc6fbd | [
"Apache-2.0"
] | 16 | 2018-09-14T18:04:19.000Z | 2022-03-24T01:30:33.000Z | README.md | shapesecurity/shift-template-js | 36bd2f493882106da83c3c8b099ec6491cbc6fbd | [
"Apache-2.0"
] | 7 | 2018-10-23T18:40:52.000Z | 2021-05-07T19:53:41.000Z | README.md | shapesecurity/shift-template-js | 36bd2f493882106da83c3c8b099ec6491cbc6fbd | [
"Apache-2.0"
] | 1 | 2020-07-30T19:38:11.000Z | 2020-07-30T19:38:11.000Z | # Shift Template
## About
This module provides a templating system based on [Shift format](https://github.com/shapesecurity/shift-spec) ASTs for JavaScript programs.
## Status
[Stable](http://nodejs.org/api/documentation.html#documentation_stability_index).
## Installation
```sh
npm install shift-template
```
## Usage
This module exports three functions: `findNodes`, `applyTemplate`, and `applyStructuredTemplate`.
`findNodes` takes the output of `parse{Script,Module}WithLocation` from [shift-parser](https://github.com/shapesecurity/shift-parser-js) as run on a source text which has comments in a certain format before nodes of interest. An optional second parameter to `findNodes` allows customizing the comment format; by default it is `/*# name #*/` or `/*# name # type #*/`. It returns a list of objects of the form `{ name, node, comment }`, where the first property is a name given in a marker comment, the second is the node which follows that marker, and the third an object with metadata about the comment.
For example,
```js
let { parseScriptWithLocation } = require('shift-parser');
let src = `
a + /*# foo # IdentifierExpression #*/ b;
0 + /*# bar #*/ 1;
`;
let { tree, locations, comments } = parseScriptWithLocation(src);
let names = findNodes({ tree, locations, comments });
console.log(names);
/*
[
{
name: "foo",
node: {
type: "IdentifierExpression",
name: "b",
},
comment: { text, type, start, end },
},
{
name: "bar",
node: {
type: "LiteralNumericExpression",
value: 1,
},
comment: { text, type, start, end },
},
]
*/
```
`applyTemplate` is a convenience function built on top of `findNodes` which takes source code annotated with marking comments as above and an object giving replacing functions for each marker. It returns an AST corresponding to that of the original source with all marked nodes replaced by the corresponding supplied replacement.
For example,
```js
let Shift = require('shift-ast/checked');
let src = `
a + /*# foo # IdentifierExpression #*/ b;
0 + /*# bar #*/ 1;
`;
let replaced = applyTemplate(src, {
foo: node => new Shift.IdentifierExpression({ name: node.name + '_' }),
bar: node => new Shift.LiteralNumericExpression({ value: node.value + 1 }),
});
```
produces an AST corresponding to the script
```js
a + b_;
0 + 2;
```
The replacing functions are passed a node which has already had the template applied to its children, so you can safely replace both an inner node and its parent as long as the function generating a replacement for the parent uses its argument.
A more sophisticated example (a build-time implementation of an `autobind` class decorator) is in [example.js](example.js).
`applyStructuredTemplate` extends the above by giving special meaning to labels of the form `if foo`, `unless foo`, and `for each foo of bar`. For `if` and `unless`, you should supply a property named `foo` holding a boolean in the second parameter. For `for each`, you should supply a property named `bar` holding a list of objects of the same shape as the full parameter.
`if`-marked nodes are included as long as the condition is `true`; `unless` are included as long as it is `false`. These markers may only be included on nodes that are in an optional or list position in the AST. In the case of a list of optional nodes, omitting the node is treated as omitting the entry from the list, rather than putting in an empty value in the list.
For `for each` nodes, the node will be included multiple times. Each time, the template will be evaluated using the values supplied in the corresponding list. The values may be referenced by prefixing their name with the variable name and `::`. The `for each` marker may only be included on nodes which are in list position in the AST.
Multiple of these structural labels may be used on a single node.
For example,
```js
let Shift = require('shift-ast/checked');
let script = `
[
/*# if markerAtStart #*/
{ prop: 'marker' },
/*# for each x of xs #*/
{ prop: /*# x::prop #*/ null },
/*# unless markerAtStart #*/
{ prop: 'marker' },
]
`;
let templateValues = {
markerAtStart: false,
xs: [
{ prop: () => new Shift.LiteralNumericExpression({ value: 1 }) },
{ prop: () => new Shift.LiteralNumericExpression({ value: 2 }) },
{ prop: () => new Shift.LiteralNumericExpression({ value: 3 }) },
]
};
let replaced = applyStructuredTemplate(script, templateValues);
```
produces an AST corresponding to the script
```js
[
{ prop: 1 },
{ prop: 2 },
{ prop: 3 },
{ prop: 'marker' },
]
```
### Handling ambiguous markers
It is possible for two nodes to start at exactly the same place. Often it suffices to resolve the ambiguity by specifying the type of the node of interest. However, it is possible for two nodes of the same type to start at exactly the same place, as in `a + b + c` (which has two BinaryExpression nodes starting at the beginning of the source). In these cases the outermost is picked. If there is no unique outermost node satisfying the type constraint, an error is thrown.
### Specifying comment format
Both functions take an optional final argument which is an options bag. Currently it supports one option, `matcher`, which is a function for parsing comments. This function, if supplied, should take the text of a comment and return either `null` to indicate that the comment is not to be treated as marking a node or an object of type `{ name: string, predicate: Node => bool }` to indicate that it is to be treated as marking a node. The function supplied as `predicate` will be used to test nodes to see if they match the label: for example, the default matcher when given `/*# label # type #*/` returns `{ name: 'label', predicate: node => node.type === "type" }` and when given `/*# label #*/` returns `{ name: 'label', predicate: node => true }`.
### Nodes with multiple names, names with multiple nodes
`findNodes` permits both nodes which have multiple names and names which occur multiple times. Depending on your application, you may wish to forbid one or both of these.
If you just want to get a map from names to nodes, you can do
```js
let namePairs = findNodes(data);
let map = new Map(namePairs.map(({ name, node }) => [name, node]));
if (map.size < namePairs.length) {
throw new TypeError('duplicate name');
}
```
`applyTemplate` forbids nodes which have multiple names, but allows names which occur multiple times.
### Validating correctness
`applyTemplate` enforces that the tree it produces conforms to the types in the Shift specification. However, some such trees are not valid programs. It is the responsibility of the calling code to ensure it does not produce such a well-typed but invalid program, or to run both the `EarlyErrorChecker` from [shift-parser](https://github.com/shapesecurity/shift-parser-js/) and the Validator from [shift-validator](https://github.com/shapesecurity/shift-validator-js) to check for invalid programs.
## Contributing
* Ensure you've signed the [CLA](https://github.com/shapesecurity/CLA/).
* Open a Github issue with a description of your desired change. If one exists already, leave a message stating that you are working on it with the date you expect it to be complete.
* Fork this repo, and clone the forked repo.
* Install dependencies with `npm install`.
* Build and test in your environment with `npm run build && npm test`.
* Create a feature branch. Make your changes. Add tests.
* Build and test in your environment with `npm run build && npm test`.
* Make a commit that includes the text "fixes #*XX*" where *XX* is the Github issue.
* Open a Pull Request on Github.
## License
Copyright 2018 Shape Security, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
| 41.67 | 751 | 0.720542 | eng_Latn | 0.998572 |
97cf742c41252088deb3d7d8cb00df8059a81e8f | 42 | md | Markdown | README.md | desarrolloCoeur/draadriana | fc9816841c9a5c9e0aa336abcf68e0e2b873bf6b | [
"Unlicense"
] | null | null | null | README.md | desarrolloCoeur/draadriana | fc9816841c9a5c9e0aa336abcf68e0e2b873bf6b | [
"Unlicense"
] | null | null | null | README.md | desarrolloCoeur/draadriana | fc9816841c9a5c9e0aa336abcf68e0e2b873bf6b | [
"Unlicense"
] | null | null | null | # draadriana.com
Website for draadriana.com
| 14 | 24 | 0.809524 | nld_Latn | 0.475845 |
97cfc8fc5dbd636efdc4c195489fd9955d1e6ad8 | 202 | md | Markdown | src/reference/forge-std/std-assertions.md | devanonon/foundry-book | 459bc315346967827673bbe2060a5ed5d43cea34 | [
"Apache-2.0",
"MIT"
] | 143 | 2022-01-17T12:02:45.000Z | 2022-03-31T14:08:49.000Z | src/reference/forge-std/std-assertions.md | devanonon/foundry-book | 459bc315346967827673bbe2060a5ed5d43cea34 | [
"Apache-2.0",
"MIT"
] | 99 | 2022-01-17T11:42:48.000Z | 2022-03-31T17:16:15.000Z | src/reference/forge-std/std-assertions.md | devanonon/foundry-book | 459bc315346967827673bbe2060a5ed5d43cea34 | [
"Apache-2.0",
"MIT"
] | 36 | 2022-01-17T22:20:39.000Z | 2022-03-31T21:49:12.000Z | ## Std Assertions
- [`fail`](./fail.md)
- [`assertFalse`](./assertFalse.md)
- [`assertEq`](./assertEq.md)
- [`assertApproxEqAbs`](./assertApproxEqAbs.md)
- [`assertApproxEqRel`](./assertApproxEqRel.md) | 28.857143 | 47 | 0.678218 | kor_Hang | 0.204189 |
97d0a452e5d814deda2d1f9db58b4b23856b97a7 | 234 | md | Markdown | README.md | shaunxu/angular-jsoneditor | 3bf1bfec3c07c2a4af53718ee7f267796e7404a8 | [
"MIT"
] | 2 | 2015-01-12T15:07:09.000Z | 2015-04-06T04:14:08.000Z | README.md | shaunxu/angular-jsoneditor | 3bf1bfec3c07c2a4af53718ee7f267796e7404a8 | [
"MIT"
] | 1 | 2015-01-08T10:57:58.000Z | 2015-01-13T02:18:08.000Z | README.md | shaunxu/angular-jsoneditor | 3bf1bfec3c07c2a4af53718ee7f267796e7404a8 | [
"MIT"
] | null | null | null | angular-jsoneditor
==================
An AngularJS directive for Jos de Jong's JSONEditor. https://github.com/josdejong/jsoneditor
Example: https://gist.github.com/shaunxu/a8508bfd12a6826a820d -- Thanks Alon Lavi for asking for an example
[](https://doi.org/10.5281/zenodo.3878037)
# gitartwork on user's contribution graph
gitartwork reads a user's contribution graph, makes an SVG image of it, and finally pushes it back to your repository.
An example result:
[](https://github.com/jasineri/gitartwork)
## Usage:
### Option #1: Use gitartwork as a GitHub Action
1. Copy the workflow code into a `.github/workflows/gitartwork.yml` file in your repository.
   ```yaml
   name: gitartwork from a contribution graph

   on:
     push:
     schedule:
       - cron: '* */24 * * *'

   jobs:
     build:
       name: Make gitartwork SVG
       runs-on: ubuntu-latest
       steps:
         - uses: actions/checkout@v3
         - uses: jasineri/gitartwork@v1
           with:
             # Use this username's contribution graph
             user_name: jasineri
             # Text on contribution graph
             text: JASINERI
         - uses: jasineri/simple-push-action@v1
   ```
2. A few moments later it will generate a `gitartwork.svg` image in your repository, which you can then include in your `README.md` like ``
3. Have fun :)
### Option #2: Make gitartwork locally on your environment
Still in progress...
---
title: Advanced Features
collapsed: true
order: 3
---
# SUSE
- [SUSE](https://www.suse.com) is implementing sigstore methods in its Container and Product deliveries.
- The [SUSE Container Registry](https://registry.suse.com/) is now also signing SUSE containers using Cosign (in addition to the Notary and Atomic methods) and also uploading the signatures to the Linux Foundation's Rekor transparency log.
- Our SUSE Linux Enterprise update repositories signatures are also being uploaded to the Linux Foundation Transparency Rekor log.
- SUSE Linux Enterprise 15 SP4 will contain, and openSUSE Tumbleweed already contains Cosign and Rekor tooling.
- [Kubewarden](https://kubewarden.io/) uses sigstore in the following contexts:
  * Signature and verification of its WebAssembly policies
  * Allowing policy authors to leverage sigstore verification capabilities inside of their policies. That allows the creation of policies that can validate OCI artifacts using sigstore
# ESP32_FTPServer_SD
This code is based on ESP32_FTPServer_SD by user robo8080 (https://github.com/robo8080/ESP32_FTPServer_SD) and [Simple FTP Server for esp8266 SPIFFS](https://github.com/nailbuster/esp8266FTPServer).

This version uses an ESP32 and an SD card in SD_MMC mode (the SD_MMC lib instead of the SD lib) and reaches more than 3 MB/s SD read/write speed. It adds a fully functional passive-mode FTP server: browse directories, change directories, rename directories/files, delete directories/files, and upload and download files and directories. On an ESP32-WROOM I can get up to 1050 KB/s download speed (the esp arduino SD_MMC lib needs tuning).
# ping
well, it's not actually about ping right now but what to do when `ping` is not available
## if there's no ping, nc, telnet available
like in a container, do this (but you need bash, heh):
```
(echo >/dev/tcp/${host}/${port}) &>/dev/null && echo "open" || echo "closed"
```
example:
```
(echo >/dev/tcp/8.8.8.8/53) &>/dev/null && echo "open" || echo "closed"
```
[source](https://serverfault.com/a/923193)
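One caveat: on a filtered port the `/dev/tcp` redirect can hang until the TCP timeout kicks in. Wrapping the probe in `timeout` from coreutils keeps it quick. A sketch with placeholder `host`/`port` values:

```shell
host=127.0.0.1   # placeholder target
port=9           # placeholder port (9/discard is normally closed)
timeout 1 bash -c "echo >/dev/tcp/${host}/${port}" 2>/dev/null \
  && echo "open" || echo "closed"
```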
---
layout: post
title: Nesting Media Queries
date: '2014-05-08 10:59:54'
---
Prompted by [Dan Davies](https://twitter.com/danjdavies/status/464340647462592512), I thought I'd share how I use nested media queries in Sass. What I do is nothing new, but we all learn by sharing and discussing things.
Below is how I use nested media queries in Sass. I try to keep nesting up to 3 levels deep at most, group properties by type such as layout (display, padding), text (colour, font, weight, size) and so on.
Then I order media queries by size, so the smallest query first then the largest last. Inside the query's block, I try to keep the order of properties the same, but as they're usually quite small, I'm not precious about it.
{% highlight scss %}
.elem {
  color: red;
  border: 1px solid blue;

  .thing {
    font-weight: 600;
  }
  // And any other properties...

  @media (min-width: 601px) and (max-width: 800px) {
    color: green;
  }

  @media (min-width: 801px) and (max-width: 1000px) {
    color: blue;
    border-color: yellow;

    .thing {
      font-weight: 400;
    }
  }
} // .elem
{% endhighlight %}
The only downside I can see to this is the end result when compiled. You could have many repeated `@media` query declarations. When someone makes a good Sass plugin to combine the `@media` declarations, that'd be awesome.
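To make that concrete, this is roughly what the snippet above compiles to: each nested `@media` block gets hoisted to the top level next to its selector, so the same query text can repeat for every selector that uses it (a sketch, not byte-exact Sass output):

{% highlight css %}
.elem {
  color: red;
  border: 1px solid blue;
}
.elem .thing {
  font-weight: 600;
}
@media (min-width: 601px) and (max-width: 800px) {
  .elem {
    color: green;
  }
}
@media (min-width: 801px) and (max-width: 1000px) {
  .elem {
    color: blue;
    border-color: yellow;
  }
  .elem .thing {
    font-weight: 400;
  }
}
{% endhighlight %}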
As ever, my good friend [Stuart Robson](https://twitter.com/StuRobson) has written a few times about this sort of thing. [This](http://www.alwaystwisted.com/post.php?s=2012-05-05-everyday-im-bubbling-with-media-queries-and-less) and [this](http://alwaystwisted.com/post.php?s=2013-12-29-how-i-write-my-sass) are both worth reading, even if one is about LESS.
**Update:** 12:07pm
The very same Stuart as above shared some links related to my 'many queries' concern above. [Take a look](https://twitter.com/StuRobson/status/464360701608599552).
# Android Developer Story: Big Fish Games uses open beta testing to de-risk their game launch

Original title: Android Developer Story: Big Fish Games uses open beta testing to de-risk their game launch

Link: [https://android-developers.googleblog.com/2018/01/android-developer-story-big-fish-games.html](https://android-developers.googleblog.com/2018/01/android-developer-story-big-fish-games.html)

Author: Kacey Fahey (Google Play Developer Marketing)

Translated by: [arjinmc](https://github.com/arjinmc)

[Big Fish Games](https://play.google.com/store/apps/dev?id=8355317828905497231) is headquartered in Seattle and was founded in 2002. As a game studio, they quickly became a leading publisher and distributor of casual games. To launch the hit time-management game [Cooking Craze](https://play.google.com/store/apps/details?id=com.bigfishgames.cookingcrazegooglef2p), the team ran an open beta on Google Play.

[Video introduction](https://youtu.be/qRXkEQOtQ98)

Big Fish Games found that the open beta brought in over 10x the amount of user feedback from around the world, and also gave them access to key metrics and Android Vitals in the Play Console. Being able to monitor the game's performance metrics let the team focus on areas for improvement, reducing the crash rate by 21%. The larger sample of beta testers also provided more insight into player behavior and helped lift day-1, day-7, and day-30 retention by +7%.

You can also learn more pre-launch best practices and strategies for improving post-launch performance at [Google Developer Day](https://events.withgoogle.com/google-gdc-2018/) at GDC on Monday, March 19. [Register](https://events.withgoogle.com/google-gdc-2018/registrations/new/) to stay informed.
# DraggableTimeline
A draggable timeline
# React Redux Tutorial for Beginners: The Definitive Guide
> Companion repo
# Origin link
- https://www.valentinog.com/blog/redux/
---
title: 'How to: Pass Arguments to a Procedure (Visual Basic)'
ms.date: 07/20/2015
helpviewer_keywords:
- arguments [Visual Basic], passing to procedures
- procedures [Visual Basic], arguments
- procedures [Visual Basic], parameters
- procedure arguments
- Visual Basic code, procedures
- procedure parameters
- procedures [Visual Basic], calling
- argument passing [Visual Basic], procedures
ms.assetid: 08723588-3890-4ddc-8249-79e049e0f241
ms.openlocfilehash: f393f17f87c5920fb9bfa2a2097c09d48bebdc16
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 05/04/2018
ms.locfileid: "33650591"
---
# <a name="how-to-pass-arguments-to-a-procedure-visual-basic"></a>How to: Pass Arguments to a Procedure (Visual Basic)

When you call a procedure, you follow the procedure name with an argument list in parentheses. You supply an argument for every required parameter the procedure defines, and you can optionally supply arguments for the `Optional` parameters. If you do not supply an `Optional` parameter in the call, you must include a comma to mark its place in the argument list if you are supplying any subsequent arguments.

If you intend to pass an argument of a data type different from that of its corresponding parameter, such as `Byte` to `String`, you can set the type checking switch ([Option Strict Statement](../../../../visual-basic/language-reference/statements/option-strict-statement.md)) to `Off`. If `Option Strict` is `On`, you must use either widening conversions or explicit conversion keywords. For more information, see [Widening and Narrowing Conversions](../../../../visual-basic/programming-guide/language-features/data-types/widening-and-narrowing-conversions.md) and [Type Conversion Functions](../../../../visual-basic/language-reference/functions/type-conversion-functions.md).

For more information, see [Procedure Parameters and Arguments](./procedure-parameters-and-arguments.md).

### <a name="to-pass-one-or-more-arguments-to-a-procedure"></a>To pass one or more arguments to a procedure

1. In the calling statement, follow the procedure name with parentheses.

2. Inside the parentheses, put an argument list. Include an argument for each required parameter the procedure defines, and separate the arguments with commas.

3. Make sure each argument is a valid expression that evaluates to a data type convertible to the type the procedure defines for the corresponding parameter.

4. If a parameter is defined as [Optional](../../../../visual-basic/language-reference/modifiers/optional.md), you can either include it in the argument list or omit it. If you omit it, the procedure uses the default value defined for that parameter.

5. If you omit an argument for an `Optional` parameter and there is another parameter after it in the parameter list, you can mark the place of the omitted argument with an additional comma in the argument list.

The following example calls the Visual Basic <xref:Microsoft.VisualBasic.Interaction.MsgBox%2A> function.
[!code-vb[VbVbcnProcedures#34](./codesnippet/VisualBasic/how-to-pass-arguments-to-a-procedure_1.vb)]
The preceding example supplies the required first argument, which is the string to be displayed. It omits an argument for the optional second parameter, which specifies the buttons to display on the message box. Because the call does not supply a value, `MsgBox` uses the default value, `MsgBoxStyle.OKOnly`, which displays only an **OK** button.

The second comma in the argument list marks the place of the omitted second argument, and the last string is passed to the optional third parameter of `MsgBox`, which is the text to display in the title bar.
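The call described in the two paragraphs above has roughly this shape (a sketch with made-up strings, not the contents of the included snippet):

```vb
' First argument (required): the prompt string to display.
' Second argument omitted: the extra comma holds its place, so
' MsgBox falls back to MsgBoxStyle.OKOnly.
' Third argument: the text for the title bar.
MsgBox("File not found.", , "Drive Check")
```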
## <a name="see-also"></a>See also
[Sub Procedures](./sub-procedures.md)
[Function Procedures](./function-procedures.md)
[Property Procedures](./property-procedures.md)
[Operator Procedures](./operator-procedures.md)
[How to: Define a Parameter for a Procedure](./how-to-define-a-parameter-for-a-procedure.md)
[Passing Arguments by Value and by Reference](./passing-arguments-by-value-and-by-reference.md)
[Recursive Procedures](./recursive-procedures.md)
[Procedure Overloading](./procedure-overloading.md)
[Objects and Classes](../../../../visual-basic/programming-guide/language-features/objects-and-classes/index.md)
[Object-Oriented Programming (Visual Basic)](../../concepts/object-oriented-programming.md)
---
title: FEBRILE SEIZURES
categories:
  - Neurology
---
## 🌡 What is a febrile seizure?

- A febrile seizure is a seizure occurring in a previously healthy child aged 6 months to 5 years (most common between 12 and 18 months) with a body temperature above 38°C.
- Febrile seizures are common, occurring in about 4% of children under 5 years old.

## 🤒 What are the symptoms of a febrile seizure?

- Febrile seizures usually occur when the body temperature is above 38°C; about 25% of febrile seizures happen at temperatures above 38°C, and 75% at temperatures above 39°C.
- During a febrile seizure the child is usually unresponsive (does not react when called by name or shaken), with jerking movements of the arms, legs, or face, eyes rolled up, and possibly foaming at the mouth.
- After the seizure the child is often restless and confused and may take a short nap; after sleeping, the child is alert as normal.

## 🏡 How can you help your child during a seizure at home?

- Lay the child on their side (either side is fine). Saliva and food can drain out, making it easier for the child to breathe.
- Do not put anything into the child's mouth. A common mistake is trying to insert objects such as a spoon or a finger so the child will not bite their tongue; this only makes breathing harder and can break teeth, and teeth or other foreign objects can fall into the airway and cause choking.
- Do not squeeze lemon juice into the eyes, nose, or mouth, and do not rub oils or alcohol on the skin while the child is convulsing.
- Do not try to forcibly restrain the child's arms and legs; this does not stop the seizure and can instead injure the child's bones and joints.
- Quickly lower the fever with a paracetamol suppository, 10-15 mg/kg/dose (a 10 kg child gets a 150 mg paracetamol suppository).
- If the seizure lasts longer than 5 minutes and does not stop on its own, take the child to the nearest hospital immediately.
- After the seizure: do not give food or drink until the child is fully awake; once fully awake, start with 1-2 spoonfuls of water, and only if there is no choking continue with food, drink, and medicine as usual.

## 🚑 Does my child need to be examined? Your child should be taken to a doctor right after the seizure:

- The child may have several more seizures within the same day.
- The child needs a general examination and some tests to find the cause of the fever: pneumonia, bronchiolitis, pharyngitis, infectious diarrhea, skin infection, dengue fever...
- Doctors need to rule out dangerous conditions that also present with seizures and fever, such as meningitis, encephalitis, and brain hemorrhage...

## 🤧 How are febrile seizures treated?

Treatment of febrile seizures includes managing the seizure itself, treating the cause of the fever, treating the fever, and other supportive care.

## 🏥 Managing the seizure at a medical facility:

- Lay the child on their side and suction secretions from the nose and mouth.
- Give oxygen.
- If the seizure lasts longer than 5 minutes, doctors may have to use medication to stop it.

Treating the cause, the fever, and other supportive care:

- Fever-reducing medication.
- Antibiotics. If the cause of the fever is an infection, the child needs to be treated with oral or injectable antibiotics.
- Oral rehydration solution (ORESOL) when the child is dehydrated from repeated vomiting or diarrhea.

## 👨⚕️👩⚕️🧏♀️🙋♂️ Other questions parents often ask doctors:

Q: Will my child have febrile seizures more often than other children?

A: Possibly. The recurrence rate of febrile seizures is about 30-35%. A recurrent febrile seizure does not necessarily happen at the same body temperature as the previous one. So if your child has a history of febrile seizures, take them to a medical facility when they have a fever, and remember to tell the doctors about that history.

Q: Does a febrile seizure affect my child's intelligence?

A: Intelligence and other aspects of brain development are not affected by simple febrile seizures.

Q: What is a simple febrile seizure?

A: A simple febrile seizure is a seizure (in a child aged 6 months to 5 years) with a body temperature above 38°C that lasts less than 5 minutes, is generalized, does not recur within 24 hours, and after which the child is alert, plays, and eats or feeds well. Even with a simple febrile seizure, the child should be taken to a medical facility as soon as possible for diagnosis and treatment.

## Dr. Đỗ Thành Đông; Dr. Quách Ngọc Vinh, Specialist Level I; Dr. Nguyễn Duy Khải, MSc – Department of Neurology
# Dev Desk - FrontEnd
This project was made as a school project by Lambda School students.
It is a basic ticket tracking system for students who need help with a particular blocker in their code. Users can be students, staff, or both. On the 'ask for help' page, student accounts can post new tickets. From the dashboard, staff accounts can 'claim' unclaimed tickets. From the 'mytickets' page, staff can 'unclaim' or 'complete' tickets. In both locations, staff accounts can permanently delete tickets. Accounts can be created and accessed.
API Docs: https://documenter.getpostman.com/view/9969236/SWTEdx1A?version=latest#intro
## Team
*Lucas Greenwell*
React II
*Milo Rastgoo*
React II
*Seth Cox*
React I
## Deployed Link
Visit our app: [https://devdeskgranola.now.sh/](https://devdeskgranola.now.sh/)
---
id: react-component
title: React.Component
layout: docs
category: Reference
permalink: docs/react-component.html
redirect_from:
- "docs/component-api.html"
- "docs/component-specs.html"
- "docs/component-specs-ko-KR.html"
- "docs/component-specs-zh-CN.html"
- "tips/UNSAFE_componentWillReceiveProps-not-triggered-after-mounting.html"
- "tips/dom-event-listeners.html"
- "tips/initial-ajax.html"
- "tips/use-react-with-other-libraries.html"
---
This page contains a detailed API reference for the React component class definition. It assumes you're familiar with fundamental React concepts, such as [Components and Props](/docs/components-and-props.html), as well as [State and Lifecycle](/docs/state-and-lifecycle.html). If not, read those first.

## Overview {#overview}

In React you can define a component as a class or as a function. Components defined as classes provide more features, which are described in detail on this page. To define a React component class, you need to extend `React.Component`:

```js
class Welcome extends React.Component {
  render() {
    return <h1>Hello, {this.props.name}</h1>;
  }
}
```
The only method you *must* define in a `React.Component` is called [`render()`](#render). All the other methods described on this page are optional.

**We strongly recommend against creating your own base component classes.** In React components, [code reuse is primarily achieved through composition rather than inheritance](/docs/composition-vs-inheritance.html).

>Note:
>
>React doesn't force you to use the ES6 class syntax. If you prefer to avoid it, you may use the `create-react-class` module or a similar custom abstraction instead. Take a look at [Using React without ES6](/docs/react-without-es6.html) to learn more.

### The Component Lifecycle {#the-component-lifecycle}

Every component has several "lifecycle methods" that you can override to run code at particular points in its lifecycle. **You can use [this lifecycle diagram](https://projects.wojtekmaj.pl/react-lifecycle-methods-diagram/) as a cheat sheet.** In the list below, the most commonly used lifecycle methods are marked in **bold**. The rest exist for relatively rare use cases.
#### Mounting {#mounting}

These methods are called in the following order when a component is being created and inserted into the DOM:
- [**`constructor()`**](#constructor)
- [`static getDerivedStateFromProps()`](#static-getderivedstatefromprops)
- [**`render()`**](#render)
- [**`componentDidMount()`**](#componentdidmount)
>Note:
>
>This method is considered legacy and you should [avoid it](/blog/2018/03/27/update-on-async-rendering.html) in new code:
>
>- [`UNSAFE_componentWillMount()`](#unsafe_componentwillmount)
#### Updating {#updating}

An update can be caused by changes to props or state. These methods are called in the following order when a component is being re-rendered:
- [`static getDerivedStateFromProps()`](#static-getderivedstatefromprops)
- [`shouldComponentUpdate()`](#shouldcomponentupdate)
- [**`render()`**](#render)
- [`getSnapshotBeforeUpdate()`](#getsnapshotbeforeupdate)
- [**`componentDidUpdate()`**](#componentdidupdate)
>Note:
>
>These methods are considered legacy and you should [avoid them](/blog/2018/03/27/update-on-async-rendering.html) in new code:
>
>- [`UNSAFE_componentWillUpdate()`](#unsafe_componentwillupdate)
>- [`UNSAFE_componentWillReceiveProps()`](#unsafe_componentwillreceiveprops)
#### Unmounting {#unmounting}

This method is called when a component is being removed from the DOM:
- [**`componentWillUnmount()`**](#componentwillunmount)
#### Error Handling {#error-handling}

These methods are called when there is an error during rendering, in a lifecycle method, or in the constructor of any child component.
- [`static getDerivedStateFromError()`](#static-getderivedstatefromerror)
- [`componentDidCatch()`](#componentdidcatch)
### Other APIs {#other-apis}

Each component also provides some other APIs:
- [`setState()`](#setstate)
- [`forceUpdate()`](#forceupdate)
### Class Properties {#class-properties}
- [`defaultProps`](#defaultprops)
- [`displayName`](#displayname)
### Instance Properties {#instance-properties}
- [`props`](#props)
- [`state`](#state)
* * *
## Reference {#reference}

### Commonly Used Lifecycle Methods {#commonly-used-lifecycle-methods}

The methods in this section cover the vast majority of use cases you'll encounter when creating React components. **For a visual reference, check out [this lifecycle diagram](https://projects.wojtekmaj.pl/react-lifecycle-methods-diagram/).**
### `render()` {#render}
```javascript
render()
```
The `render()` method is the only method you must implement in a class component.

When called, it should examine `this.props` and `this.state` and return one of the following types:

- **React elements.** Typically created via [JSX](/docs/introducing-jsx.html). For example, `<div />` and `<MyComponent />` are React elements that instruct React to render a DOM node, or another user-defined component, respectively.
- **Arrays and fragments.** Let you return multiple elements from render at once. See the documentation on [fragments](/docs/fragments.html) for more details.
- **Portals**. Let you render a separate DOM subtree. See the documentation on [portals](/docs/portals.html) for more details.
- **Strings and numbers.** These are rendered as text in the DOM.
- **Booleans or `null`**. Render nothing. (Mostly exists to support the `return test && <Child />` pattern, where `test` is a boolean.)
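The last bullet leans on a plain JavaScript behavior: `&&` evaluates to its left operand when that operand is falsy, and React renders booleans as nothing. A framework-free sketch of the value such a render method would hand back to React (`renderResult` is a made-up helper):

```javascript
// What `return test && child` gives back to React:
function renderResult(test, child) {
  return test && child;
}

console.log(renderResult(true, 'child-element'));  // the child: it gets rendered
console.log(renderResult(false, 'child-element')); // false: React renders nothing
```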
The `render()` function should be pure, meaning that it does not modify component state, it returns the same result each time it's invoked, and it does not directly interact with the browser.

If you need to interact with the browser, perform your work in `componentDidMount()` or the other lifecycle methods instead. Keeping `render()` pure makes components easier to think about.
> Note
>
> `render()` will not be invoked if [`shouldComponentUpdate()`](#shouldcomponentupdate) returns false.
* * *
### `constructor()` {#constructor}
```javascript
constructor(props)
```
**If you don't initialize state and you don't bind methods, you don't need to implement a constructor for your React component.**

The constructor for a React component is called before it is mounted into the DOM. When implementing the constructor for a `React.Component` subclass, you should call `super(props)` before any other statement. Otherwise, `this.props` will be undefined in the constructor, which can lead to bugs.

Typically, React constructors are only used for two purposes:

* Initializing [local state](/docs/state-and-lifecycle.html) by assigning an object to `this.state`.
* Binding [event handler](/docs/handling-events.html) methods to a component instance.

You **should not call `setState()`** in the `constructor()`. Instead, if your component needs to use local state, **assign the initial state to `this.state`** directly in the constructor:
```js
constructor(props) {
super(props);
  // Don't call this.setState() here!
this.state = { counter: 0 };
this.handleClick = this.handleClick.bind(this);
}
```
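The reason for the `bind` call above is plain JavaScript, not React: a class method loses its `this` once it is detached from the instance, which is exactly what happens when React passes it to the DOM as an event listener. A framework-free sketch (the `Counter` class here is hypothetical):

```javascript
class Counter {
  constructor() {
    this.count = 0;
    // Without this line, calling the detached method below would
    // throw, because `this` would be undefined inside handleClick.
    this.handleClick = this.handleClick.bind(this);
  }

  handleClick() {
    this.count += 1;
    return this.count;
  }
}

const counter = new Counter();
const detached = counter.handleClick; // how an event system stores a handler
console.log(detached()); // 1 — works only because of the bind in the constructor
```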
The constructor is the only place where you should assign `this.state` directly. In all other methods, you need to use `this.setState()` instead.

Avoid introducing any side-effects or subscriptions in the constructor. For those use cases, use `componentDidMount()` instead.
>Note
>
>**Avoid copying props into state! This is a common mistake:**
>
>```js
>constructor(props) {
>  super(props);
>  // Don't do this!
>  this.state = { color: props.color };
>}
>```
>
>The problem is that it's both unnecessary (you can use `this.props.color` directly instead) and it creates bugs (updates to the `color` prop won't be reflected in the state).
>
>**Only use this pattern if you intentionally want to ignore prop updates.** In that case, it makes sense to rename the prop to `initialColor` or `defaultColor`. You can then force the component to "reset" its internal state by [changing its `key`](/blog/2018/06/07/you-probably-dont-need-derived-state.html#recommendation-fully-uncontrolled-component-with-a-key) when necessary.
>
>Read our [blog post on avoiding derived state](/blog/2018/06/07/you-probably-dont-need-derived-state.html) to learn what to do if you think you need some state to depend on the props.
* * *
### `componentDidMount()` {#componentdidmount}

```javascript
componentDidMount()
```

`componentDidMount()` is invoked immediately after a component is mounted (inserted into the tree). Initialization that requires DOM nodes should go here. If you need to load data from a remote endpoint, this is a good place to instantiate the network request.

This method is also a good place to set up any subscriptions. If you do that, don't forget to unsubscribe in `componentWillUnmount()`.

You **may call `setState()` immediately** in `componentDidMount()`. It will trigger an extra rendering, but it will happen before the browser updates the screen. This guarantees that even though the `render()` will be called twice in this case, the user won't see the intermediate state. Use this pattern with caution because it often causes performance issues. In most cases, you should be able to assign the initial state in the `constructor()` instead. It can, however, be necessary for cases like modals and tooltips that depend on the position or size of a DOM node.
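To make the subscription/cleanup pairing concrete, here is a minimal plain-JavaScript sketch that runs without React; the `createStore` helper and `ChatWidget` class are hypothetical stand-ins for illustration only:

```javascript
// Sketch: a subscription set up in componentDidMount must be torn down
// in componentWillUnmount, or the listener leaks.
function createStore() {
  const listeners = new Set();
  return {
    subscribe(fn) {
      listeners.add(fn);
      return () => listeners.delete(fn); // returns an unsubscribe function
    },
    get listenerCount() {
      return listeners.size;
    },
  };
}

class ChatWidget {
  constructor(store) {
    this.store = store;
  }

  componentDidMount() {
    // Set up the subscription once the component is in the tree.
    this.unsubscribe = this.store.subscribe(() => {});
  }

  componentWillUnmount() {
    // Tear it down so the listener does not outlive the component.
    this.unsubscribe();
  }
}

const store = createStore();
const widget = new ChatWidget(store);
widget.componentDidMount();
console.log(store.listenerCount); // 1
widget.componentWillUnmount();
console.log(store.listenerCount); // 0
```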
* * *
### `componentDidUpdate()` {#componentdidupdate}

```javascript
componentDidUpdate(prevProps, prevState, snapshot)
```

`componentDidUpdate()` is invoked immediately after updating occurs. This method is not called for the initial render.

Use this as an opportunity to operate on the DOM when the component has been updated. This is also a good place to do network requests as long as you compare the current props to the previous props (e.g. a network request may not be necessary if the props have not changed).

```js
componentDidUpdate(prevProps) {
  // Typical usage (don't forget to compare props):
  if (this.props.userID !== prevProps.userID) {
    this.fetchData(this.props.userID);
  }
}
```

You **may call `setState()` immediately** in `componentDidUpdate()` but note that **it must be wrapped in a condition** like in the example above, or you'll cause an infinite loop. It would also cause an extra re-rendering which, while not visible to the user, can affect the component's performance. If you're trying to "mirror" some state to a prop coming from above, consider using the prop directly instead. Read more about [why copying props into state causes bugs](/blog/2018/06/07/you-probably-dont-need-derived-state.html).

If your component implements the `getSnapshotBeforeUpdate()` lifecycle method (which is rare), the value it returns will be passed as a third "snapshot" parameter to `componentDidUpdate()`. Otherwise this parameter will be undefined.

> Note
>
> `componentDidUpdate()` will not be invoked if [`shouldComponentUpdate()`](#shouldcomponentupdate) returns false.
* * *
### `componentWillUnmount()` {#componentwillunmount}

```javascript
componentWillUnmount()
```

`componentWillUnmount()` is invoked immediately before a component is unmounted and destroyed. Perform any necessary cleanup in this method, such as invalidating timers, canceling network requests, or cleaning up any subscriptions that were created in `componentDidMount()`.

You **should not call `setState()`** in `componentWillUnmount()` because the component will never be re-rendered. Once a component instance is unmounted, it will never be mounted again.
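As a sketch of the timer case, assuming a hypothetical `Clock` component, the ID returned by `setInterval` on mount is exactly what `componentWillUnmount()` invalidates:

```javascript
// Hypothetical sketch: the timer started on mount is invalidated on
// unmount so no callbacks fire after the component is destroyed.
class Clock {
  componentDidMount() {
    // Store the ID so cleanup can find it later.
    this.timerID = setInterval(() => this.tick(), 1000);
  }

  tick() {
    // A real component would typically call this.setState() here.
  }

  componentWillUnmount() {
    clearInterval(this.timerID); // cancel the work scheduled on mount
  }
}

const clock = new Clock();
clock.componentDidMount();
clock.componentWillUnmount(); // without this, the interval would leak
```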
* * *
### Rarely Used Lifecycle Methods {#rarely-used-lifecycle-methods}

The methods in this section correspond to uncommon use cases. They're handy once in a while, but most of your components probably don't need any of them. **You can see most of the methods below on [this lifecycle diagram](https://projects.wojtekmaj.pl/react-lifecycle-methods-diagram/) if you click the "Show less common lifecycles" checkbox at the top of it.**
### `shouldComponentUpdate()` {#shouldcomponentupdate}

```javascript
shouldComponentUpdate(nextProps, nextState)
```

Use `shouldComponentUpdate()` to let React know if a component's output is not affected by the current change in state or props. The default behavior is to re-render on every state change, and in the vast majority of cases you should rely on the default behavior.

`shouldComponentUpdate()` is invoked before rendering when new props or state are being received. It defaults to `true`. This method is not called for the initial render or when `forceUpdate()` is used.

This method only exists as a **[performance optimization](/docs/optimizing-performance.html).** Do not rely on it to "prevent" a rendering, as this can lead to bugs. **Consider using the built-in [`PureComponent`](/docs/react-api.html#reactpurecomponent)** instead of writing `shouldComponentUpdate()` by hand. `PureComponent` performs a shallow comparison of props and state, and reduces the chance that you'll skip a necessary update.

If you are confident you want to write it by hand, you may compare `this.props` with `nextProps` and `this.state` with `nextState` and return `false` to tell React the update can be skipped. Note that returning `false` does not prevent child components from re-rendering when *their* state changes.

We do not recommend doing deep equality checks with `JSON.stringify()` in `shouldComponentUpdate()`. It is very inefficient and will harm performance.

Currently, if `shouldComponentUpdate()` returns `false`, then [`UNSAFE_componentWillUpdate()`](#unsafe_componentwillupdate), [`render()`](#render), and [`componentDidUpdate()`](#componentdidupdate) will not be invoked. In the future React may treat `shouldComponentUpdate()` as a hint rather than a strict directive, and returning `false` may still result in a re-rendering of the component.
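As a rough model of what that hand-written comparison (and `PureComponent`'s shallow check) looks like, here is a plain-JavaScript sketch; `shallowEqual` is our own helper here, not a React export, and the check is written as a free function over explicit current/next values so it runs without a class:

```javascript
// Sketch of a shallow comparison: one level deep, so nested objects
// are compared by reference, not by value.
function shallowEqual(objA, objB) {
  if (objA === objB) return true;
  const keysA = Object.keys(objA);
  const keysB = Object.keys(objB);
  if (keysA.length !== keysB.length) return false;
  return keysA.every((key) => objA[key] === objB[key]);
}

// Update only when props or state actually changed at the top level.
function shouldUpdate(props, state, nextProps, nextState) {
  return !shallowEqual(props, nextProps) || !shallowEqual(state, nextState);
}

console.log(shouldUpdate({ a: 1 }, {}, { a: 1 }, {})); // false — update can be skipped
console.log(shouldUpdate({ a: 1 }, {}, { a: 2 }, {})); // true — props changed
```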
* * *
### `static getDerivedStateFromProps()` {#static-getderivedstatefromprops}

```js
static getDerivedStateFromProps(props, state)
```

`getDerivedStateFromProps` is invoked right before calling the render method, both on the initial mount and on subsequent updates. It should return an object to update the state, or `null` to update nothing.

This method exists for [rare use cases](/blog/2018/06/07/you-probably-dont-need-derived-state.html#when-to-use-derived-state) where the state depends on changes in props over time. For example, it might be handy for implementing a `<Transition>` component that compares its previous and next children to decide which of them to animate in and out.

Deriving state leads to verbose code and makes your components difficult to think about.

[Make sure you're familiar with simpler alternatives:](/blog/2018/06/07/you-probably-dont-need-derived-state.html)

* If you need to **perform a side effect** (for example, data fetching or an animation) in response to a change in props, use the [`componentDidUpdate`](#componentdidupdate) lifecycle method instead.

* If you want to **re-compute some data only when a prop changes**, [use a memoization helper instead](/blog/2018/06/07/you-probably-dont-need-derived-state.html#what-about-memoization).

* If you want to **"reset" some state when a prop changes**, consider either making the component [fully controlled](/blog/2018/06/07/you-probably-dont-need-derived-state.html#recommendation-fully-controlled-component) or [fully uncontrolled with a `key`](/blog/2018/06/07/you-probably-dont-need-derived-state.html#recommendation-fully-uncontrolled-component-with-a-key) instead.

This method doesn't have access to the component instance. If you'd like, you can reuse some code between `getDerivedStateFromProps()` and the other class methods by extracting pure functions of the component props and state outside the class definition.

Note that this method is fired on *every* render, regardless of the cause. This is in contrast to `UNSAFE_componentWillReceiveProps`, which only fires when the parent causes a re-render and not as a result of a local `setState` call.
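A minimal sketch of the rare legitimate pattern — resetting part of the state when a prop's identity changes — follows. The `userID` prop and the `prevUserID` mirror field are assumptions for illustration, and the method is written as a free function so it runs without a class or React:

```javascript
// Hedged sketch: derive state only when the prop identity changed.
function getDerivedStateFromProps(props, state) {
  if (props.userID !== state.prevUserID) {
    // Return only the keys to change; React shallow-merges the result.
    return { prevUserID: props.userID, draft: '' };
  }
  return null; // null means "no state update"
}

console.log(getDerivedStateFromProps({ userID: 2 }, { prevUserID: 1, draft: 'hi' }));
// → { prevUserID: 2, draft: '' }
console.log(getDerivedStateFromProps({ userID: 1 }, { prevUserID: 1, draft: 'hi' }));
// → null
```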
* * *
### `getSnapshotBeforeUpdate()` {#getsnapshotbeforeupdate}

```javascript
getSnapshotBeforeUpdate(prevProps, prevState)
```

`getSnapshotBeforeUpdate()` is invoked right before the most recently rendered output is committed to e.g. the DOM. It enables your component to capture some information from the DOM (e.g. scroll position) before it is potentially changed. Any value returned by this lifecycle method will be passed as a parameter to `componentDidUpdate()`.

This use case is not common, but it may occur in UIs like a chat thread that needs to handle scroll position in a special way.

A snapshot value (or `null`) should be returned.

For example:

`embed:react-component-reference/get-snapshot-before-update.js`

In the above examples, it is important to read the `scrollHeight` property in `getSnapshotBeforeUpdate` because there may be delays between "render" phase lifecycles (like `render`) and "commit" phase lifecycles (like `getSnapshotBeforeUpdate` and `componentDidUpdate`).
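To illustrate the read-before-commit timing without a browser, here is a plain-object simulation of the chat-thread scroll pattern; the `list` object is a hypothetical stand-in for a DOM node with `scrollHeight`/`scrollTop`, and the method signatures are simplified for the simulation:

```javascript
// Simulation: read the scroll offset before the "DOM" mutates, then
// restore the user's position from the snapshot afterwards.
class ChatThread {
  constructor() {
    this.list = { scrollHeight: 100, scrollTop: 40 };
  }

  getSnapshotBeforeUpdate(prevCount, nextCount) {
    // Read before the mutation: distance from the bottom of the list.
    if (prevCount < nextCount) {
      return this.list.scrollHeight - this.list.scrollTop;
    }
    return null;
  }

  componentDidUpdate(snapshot) {
    if (snapshot !== null) {
      // New messages grew the list; keep the viewport anchored.
      this.list.scrollTop = this.list.scrollHeight - snapshot;
    }
  }
}

const thread = new ChatThread();
const snapshot = thread.getSnapshotBeforeUpdate(3, 5); // 100 - 40 = 60
thread.list.scrollHeight = 160; // the "DOM" grew during the update
thread.componentDidUpdate(snapshot);
console.log(thread.list.scrollTop); // 100 — still 60 units from the bottom
```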
* * *
### Error boundaries {#error-boundaries}

[Error boundaries](/docs/error-boundaries.html) are React components that catch JavaScript errors anywhere in their child component tree, log those errors, and display a fallback UI instead of the component tree that crashed. Error boundaries catch errors during rendering, in lifecycle methods, and in constructors of the whole tree below them.

A class component becomes an error boundary if it defines either (or both) of the lifecycle methods `static getDerivedStateFromError()` or `componentDidCatch()`. Updating state from these lifecycles lets you capture an unhandled JavaScript error in the tree below and display a fallback UI.

Only use error boundaries for recovering from unexpected exceptions; **don't try to use them for control flow.**

For more details, see [*Error Handling in React 16*](/blog/2017/07/26/error-handling-in-react-16.html).

> Note
>
> Error boundaries only catch errors in the components **below** them in the tree. An error boundary can't catch an error within itself.
### `static getDerivedStateFromError()` {#static-getderivedstatefromerror}

```javascript
static getDerivedStateFromError(error)
```

This lifecycle method is invoked after an error has been thrown by a descendant component.
It receives the error that was thrown as a parameter and should return a value to update the state.

```js{7-10,13-16}
class ErrorBoundary extends React.Component {
  constructor(props) {
    super(props);
    this.state = { hasError: false };
  }

  static getDerivedStateFromError(error) {
    // Update state so the next render will show the fallback UI.
    return { hasError: true };
  }

  render() {
    if (this.state.hasError) {
      // You can render any custom fallback UI
      return <h1>Something went wrong.</h1>;
    }

    return this.props.children;
  }
}
```

> Note
>
> `getDerivedStateFromError()` is called during the "render" phase, so side-effects are not permitted.
For those use cases, use `componentDidCatch()` instead.
* * *
### `componentDidCatch()` {#componentdidcatch}

```javascript
componentDidCatch(error, info)
```

This lifecycle method is invoked after an error has been thrown by a descendant component.
It receives two parameters:

1. `error` - The error that was thrown.
2. `info` - An object with a `componentStack` key containing [information about which component threw the error](/docs/error-boundaries.html#component-stack-traces).

`componentDidCatch()` is called during the "commit" phase, so side-effects are permitted.
It should be used for things like logging errors:

```js{12-19}
class ErrorBoundary extends React.Component {
  constructor(props) {
    super(props);
    this.state = { hasError: false };
  }

  static getDerivedStateFromError(error) {
    // Update state so the next render will show the fallback UI.
    return { hasError: true };
  }

  componentDidCatch(error, info) {
    // Example "componentStack":
    //   in ComponentThatThrows (created by App)
    //   in ErrorBoundary (created by App)
    //   in div (created by App)
    //   in App
    logComponentStackToMyService(info.componentStack);
  }

  render() {
    if (this.state.hasError) {
      // You can render any custom fallback UI
      return <h1>Something went wrong.</h1>;
    }

    return this.props.children;
  }
}
```

> Note
>
> In the event of an error, you can render a fallback UI with `componentDidCatch()` by calling `setState`, but this will be deprecated in a future release.
> Use `static getDerivedStateFromError()` to handle fallback rendering instead.
* * *
### Legacy Lifecycle Methods {#legacy-lifecycle-methods}

The lifecycle methods below are marked as "legacy". They still work, but we don't recommend using them in new code. You can learn more about migrating away from legacy lifecycle methods in [this blog post](/blog/2018/03/27/update-on-async-rendering.html).

### `UNSAFE_componentWillMount()` {#unsafe_componentwillmount}

```javascript
UNSAFE_componentWillMount()
```

> Note
>
> This lifecycle was previously named `componentWillMount`. That name will continue to work until version 17. Use the [`rename-unsafe-lifecycles` codemod](https://github.com/reactjs/react-codemod#rename-unsafe-lifecycles) to automatically update your components.

`UNSAFE_componentWillMount()` is invoked just before mounting occurs. It is called before `render()`, therefore calling `setState()` synchronously in this method will not trigger an extra rendering. Generally, we recommend using the `constructor()` instead for initializing state.

Avoid introducing any side-effects or subscriptions in this method. For those use cases, use `componentDidMount()` instead.

This is the only lifecycle method called on server rendering.
* * *
### `UNSAFE_componentWillReceiveProps()` {#unsafe_componentwillreceiveprops}

```javascript
UNSAFE_componentWillReceiveProps(nextProps)
```

> Note
>
> This lifecycle was previously named `componentWillReceiveProps`. That name will continue to work until version 17. Use the [`rename-unsafe-lifecycles` codemod](https://github.com/reactjs/react-codemod#rename-unsafe-lifecycles) to automatically update your components.

> Note:
>
> Using this lifecycle method often leads to bugs and inconsistencies
>
> * If you need to **perform a side effect** (for example, data fetching or an animation) in response to a change in props, use the [`componentDidUpdate`](#componentdidupdate) lifecycle method instead.
> * If you used `componentWillReceiveProps` for **re-computing some data only when a prop changes**, [use a memoization helper instead](/blog/2018/06/07/you-probably-dont-need-derived-state.html#what-about-memoization).
> * If you used `componentWillReceiveProps` to **"reset" some state when a prop changes**, consider either making the component [fully controlled](/blog/2018/06/07/you-probably-dont-need-derived-state.html#recommendation-fully-controlled-component) or [fully uncontrolled with a `key`](/blog/2018/06/07/you-probably-dont-need-derived-state.html#recommendation-fully-uncontrolled-component-with-a-key) instead.
>
> For other use cases, [follow the recommendations in this blog post about derived state](/blog/2018/06/07/you-probably-dont-need-derived-state.html).

`UNSAFE_componentWillReceiveProps()` is invoked before a mounted component receives new props. If you need to update the state in response to prop changes (for example, to reset it), you may compare `this.props` and `nextProps` and perform state transitions using `this.setState()` in this method.

Note that if a parent component causes your component to re-render, this method will be called even if props have not changed. Make sure to compare the current and next values if you only want to handle changes.

React doesn't call `UNSAFE_componentWillReceiveProps()` with initial props during [mounting](#mounting). It only calls this method if some of the component's props may update. Calling `this.setState()` generally doesn't trigger `UNSAFE_componentWillReceiveProps()`.
* * *
### `UNSAFE_componentWillUpdate()` {#unsafe_componentwillupdate}

```javascript
UNSAFE_componentWillUpdate(nextProps, nextState)
```

> Note
>
> This lifecycle was previously named `componentWillUpdate`. That name will continue to work until version 17. Use the [`rename-unsafe-lifecycles` codemod](https://github.com/reactjs/react-codemod#rename-unsafe-lifecycles) to automatically update your components.

`UNSAFE_componentWillUpdate()` is invoked just before rendering when new props or state are being received. Use this as an opportunity to perform preparation before an update occurs. This method is not called for the initial render.

Note that you cannot call `this.setState()` here; nor should you do anything else (e.g. dispatch a Redux action) that would trigger an update to a React component before `UNSAFE_componentWillUpdate()` returns.

Typically, this method can be replaced by `componentDidUpdate()`. If you were reading from the DOM in this method (e.g. to save a scroll position), you can move that logic to `getSnapshotBeforeUpdate()`.

> Note
>
> `UNSAFE_componentWillUpdate()` will not be invoked if [`shouldComponentUpdate()`](#shouldcomponentupdate) returns false.
* * *
## Other APIs {#other-apis-1}

Unlike the lifecycle methods above (which React calls for you), the methods below are the methods *you* can call from your components.

There are just two of them: `setState()` and `forceUpdate()`.
### `setState()` {#setstate}

```javascript
setState(updater, [callback])
```

`setState()` enqueues changes to the component state and tells React that this component and its children need to be re-rendered with the updated state. This is the primary method you use to update the user interface in response to event handlers and server responses.

Think of `setState()` as a *request* rather than an immediate command to update the component. For better perceived performance, React may delay it, and then update several components in a single pass. React does not guarantee that the state changes are applied immediately.

`setState()` does not always immediately update the component. It may batch the change with other updates or defer the update until later. This makes reading `this.state` right after calling `setState()` a potential pitfall. Instead, use `componentDidUpdate` or a `setState` callback (`setState(updater, callback)`), either of which is guaranteed to fire after the update has been applied. If you need to set the state based on the previous state, read about the `updater` argument below.

`setState()` will always lead to a re-render unless `shouldComponentUpdate()` returns `false`. If mutable objects are being used and conditional rendering logic cannot be implemented in `shouldComponentUpdate()`, calling `setState()` only when the new state differs from the previous state will avoid unnecessary re-renders.

The first argument is an `updater` function with the signature:

```javascript
(state, props) => stateChange
```

`state` is a reference to the component state at the time the change is being applied. It should not be directly mutated. Instead, changes should be represented by building a new object based on the input from `state` and `props`. For instance, suppose we wanted to increment a value in state by `props.step`:

```javascript
this.setState((state, props) => {
  return {counter: state.counter + props.step};
});
```

Both `state` and `props` received by the updater function are guaranteed to be up-to-date. The output of the updater is shallowly merged with `state`.

The second parameter to `setState()` is an optional callback function that will be executed once `setState` is completed and the component is re-rendered. Generally we recommend using `componentDidUpdate()` for such logic instead.

You may optionally pass an object as the first argument to `setState()` instead of a function:

```javascript
setState(stateChange[, callback])
```

This performs a shallow merge of `stateChange` into the new state, e.g., to adjust a shopping cart item quantity:

```javascript
this.setState({quantity: 2})
```

This form of `setState()` is also asynchronous, and multiple calls during the same cycle may be batched together. For example, if you attempt to increment an item quantity more than once in the same cycle, that will result in the equivalent of:

```javascript
Object.assign(
  previousState,
  {quantity: state.quantity + 1},
  {quantity: state.quantity + 1},
  ...
)
```

Subsequent calls will override values from previous calls in the same cycle, so the quantity will only be incremented once. If the next state depends on the current state, we recommend using the updater function form instead:

```js
this.setState((state) => {
  return {quantity: state.quantity + 1};
});
```

For more detail, see:

* [State and Lifecycle guide](/docs/state-and-lifecycle.html)
* [In depth: When and why are `setState()` calls batched?](https://stackoverflow.com/a/48610973/458193)
* [In depth: Why isn't `this.state` updated immediately?](https://github.com/facebook/react/issues/11527#issuecomment-360199710)
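The difference between the two forms can be modeled in plain JavaScript; `applyUpdates` below is a simplified simulation of how pending updates in one cycle are merged, not React's actual implementation:

```javascript
// Simulation: object updates are shallow-merged (later ones win),
// while updater functions each see the latest pending state.
function applyUpdates(initialState, updates) {
  return updates.reduce(
    (state, update) =>
      Object.assign({}, state, typeof update === 'function' ? update(state) : update),
    initialState
  );
}

// Three object updates in one cycle: each was computed from the same
// stale state, so the increments collapse into one.
console.log(applyUpdates({ quantity: 0 }, [{ quantity: 1 }, { quantity: 1 }, { quantity: 1 }]));
// → { quantity: 1 }

// Three updater functions: each builds on the previous result.
const increment = (state) => ({ quantity: state.quantity + 1 });
console.log(applyUpdates({ quantity: 0 }, [increment, increment, increment]));
// → { quantity: 3 }
```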
* * *
### `forceUpdate()` {#forceupdate}

```javascript
component.forceUpdate(callback)
```

By default, when your component's state or props change, your component will re-render. If your `render()` method depends on some other data, you can tell React that the component needs re-rendering by calling `forceUpdate()`.

Calling `forceUpdate()` will cause `render()` to be called on the component, skipping `shouldComponentUpdate()`. This will trigger the normal lifecycle methods for child components, including the `shouldComponentUpdate()` method of each child. React will still only update the DOM if the markup changes.

Normally you should try to avoid all uses of `forceUpdate()` and only read from `this.props` and `this.state` in `render()`.
* * *
## Class Properties {#class-properties-1}

### `defaultProps` {#defaultprops}

`defaultProps` can be defined as a property on the component class itself, to set the default props for the class. This is used for `undefined` props, but not for `null` props. For example:

```js
class CustomButton extends React.Component {
  // ...
}

CustomButton.defaultProps = {
  color: 'blue'
};
```

If `props.color` is not provided, it will be set by default to `'blue'`:

```js
render() {
  return <CustomButton /> ; // props.color will be set to blue
}
```

If `props.color` is set to `null`, it will remain `null`:

```js
render() {
  return <CustomButton color={null} /> ; // props.color will remain null
}
```
* * *
### `displayName` {#displayname}

The `displayName` string is used in debugging messages. Usually, you don't need to set it explicitly because it's inferred from the name of the function or class that defines the component. You might want to set it explicitly if you want to display a different name for debugging purposes or when you create a higher-order component; see [Wrap the Display Name for Easy Debugging](/docs/higher-order-components.html#convention-wrap-the-display-name-for-easy-debugging) for details.
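For example, a hypothetical higher-order component `withSubscription` might wrap the display name like this (plain functions, runnable without React):

```javascript
// Sketch: a higher-order component that sets displayName so debugging
// tools show "WithSubscription(CommentList)" instead of a generic name.
function withSubscription(WrappedComponent) {
  function WithSubscription(props) {
    return WrappedComponent(props);
  }
  const name = WrappedComponent.displayName || WrappedComponent.name || 'Component';
  WithSubscription.displayName = `WithSubscription(${name})`;
  return WithSubscription;
}

function CommentList() {
  return null; // placeholder component for the sketch
}

console.log(withSubscription(CommentList).displayName); // "WithSubscription(CommentList)"
```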
* * *
## Instance Properties {#instance-properties-1}

### `props` {#props}

`this.props` contains the props that were defined by the caller of this component. See [Components and Props](/docs/components-and-props.html) for an introduction to props.

In particular, `this.props.children` is a special prop, typically defined by the child tags in the JSX expression rather than in the tag itself.

### `state` {#state}

The state contains data specific to this component that may change over time. The state is user-defined, and it should be a plain JavaScript object.

If some value isn't used for rendering or data flow (for example, a timer ID), you don't have to put it in the state. Such values can be defined as fields on the component instance.

See [State and Lifecycle](/docs/state-and-lifecycle.html) for more information about the state.

Never mutate `this.state` directly, as calling `setState()` afterwards may replace the mutation you made. Treat `this.state` as if it were immutable.
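A quick sketch of what "treat it as immutable" means in practice: build a new object for the changed parts instead of mutating in place, which is also what `setState()`'s shallow merge expects:

```javascript
// Sketch: immutable-style update — the original object is untouched,
// and unchanged parts are shared by reference.
const state = { filters: { color: 'red' }, page: 1 };

// Wrong: state.page = 2;  // a mutation React cannot see

// Right: create a new object with only the changed key replaced.
const nextState = { ...state, page: 2 };

console.log(state.page); // 1 — original untouched
console.log(nextState.page); // 2
console.log(nextState.filters === state.filters); // true — unchanged parts are shared
```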
| 53.545879 | 639 | 0.780366 | hun_Latn | 1.000003 |
97da1e968b6dce21bfbde9d0a03a031e0838bea2 | 28,397 | md | Markdown | _articles/ro/best-practices.md | guillermo-ai/opensource.guide | 38a9e9bcd13bac4b9baf870faff7639f021d287c | [
"CC-BY-4.0"
] | 10 | 2021-09-19T17:26:34.000Z | 2021-11-14T09:29:41.000Z | _articles/ro/best-practices.md | nedmah077/opensource.guide | a89914ccc4d17b23ce38f2384ebe8d5ab29caef6 | [
"CC-BY-4.0"
] | 35 | 2021-02-26T16:44:24.000Z | 2022-03-31T08:11:33.000Z | _articles/ro/best-practices.md | Dqddyvillager1722/opensource.guide | c6cfb056d596821c060a9ecb1ab49e40e154c68f | [
"CC-BY-4.0"
] | 3 | 2021-07-23T07:48:43.000Z | 2021-10-04T20:13:55.000Z | ---
lang: ro
title: Cele mai bune practici pentru întreținători
description: Ușurarea vieții tale în calitate de întreținător open source, de la documentarea proceselor la mobilizarea comunității
class: best-practices
toc:
what-does-it-mean-to-be-a-maintainer: "Ce înseamnă să fii un întreținător?"
documenting-your-processes: "Documentarea proceselor tale"
learning-to-say-no: "Învățând să spui „nu”"
leverage-your-community: "Mobilizarea comunității tale"
bring-in-the-robots: "Cheamă roboții"
its-okay-to-hit-pause: "Este OK să apeși pauză"
order: 5
image: /assets/images/cards/best-practices.png
related:
- metrics
- leadership
---
## Ce înseamnă să fii un întreținător?
Dacă întreții un proiect cu sursă deschisă pe care mulți oameni îl folosesc, poate ai observat că programezi mai puțin și răspunzi la probleme mai mult.
În primele etape ale unui proiect, experimentezi cu idei noi și decizi bazat pe ce vrei tu. Pe măsură ce proiectul crește în popularitate, te vei afla lucrând cu utilizatorii și contributorii tăi mai mult.
Întreținerea unui proiect necesită mai mult decât cod. Aceste sarcini sunt deseori neașteptate, dar ele sunt exact la fel de importante pentru un proiect în creștere. Am adunat câteva metode pentru a-ți ușura viața, de la documentarea proceselor la mobilizarea comunității tale.
## Documentarea proceselor tale
Notarea lucrurilor este unul dintre cele mai importante lucruri pe care le poți face în calitate de întreținător.
Documentația nu doar clarifică propria ta gândire, ci și ajută alți oameni să înțeleagă de ce ai nevoie sau ce aștepți, chiar înainte ca ei să întrebe.
Notând lucruri face mai ușor să spui „nu” când ceva nu se încadrează în domeniul tău. De asemenea face mai ușor pentru oameni să pășească înăuntru și să ajute. Nu știi niciodată cine ar putea citi sau folosi proiectul tău.
Chiar dacă nu folosești paragrafe complete, notarea bulinelor este mai bună decât să nu scrii nimic.
Ține minte să-ți păstrezi documentația actualizată. Dacă nu ești capabil să faci asta mereu, șterge documentația ta expirată sau indică faptul că este expirată astfel încât contributorii să știe că actualizările sunt binevenite.
### Notează viziunea proiectului tău
Începe prin a scrie scopurile proiectului tău. Adaugă-le la README-ul tău, sau creează un fișier separat numit VISION. Dacă există alte artefacte care ar putea ajuta, cum ar fi o foaie de parcurs a proiectului, fă-le publice și pe acestea.
A avea o viziune clară, documentată te ține concentrat și te ajută să eviți „deriva obiectivelor” din cauza contribuțiilor altora.
De exemplu, @lord a descoperit că având o viziune a proiectului l-a ajutat să-și dea seama pe care cereri să-și petreacă timpul. În calitate de nou întreținător, el a regretat că nu s-a lipit de scopul proiectului lui când a primit prima lui cerere de facilitate pentru [Slate](https://github.com/lord/slate).
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/lord?s=180" class="pquote-avatar" alt="avatar">
<p>
Am fumat-o. Nu am acordat efortul pentru a veni cu o soluție completă. În schimbul unei soluții neadecvate, îmi doresc să fi spus „Nu am timp pentru aceasta chiar acum, dar o să o adaug la lista pe termen lung de lucruri drăguțe de făcut.”
</p>
<p>
<em>
I fumbled it. I didn't put in the effort to come up with a complete solution. Instead of an half-assed solution, I wish I had said "I don't have time for this right now, but I'll add it to the long term nice-to-have list."
</em>
</p>
<p markdown="1" class="pquote-credit">
— @lord, ["Tips for new open source maintainers"](https://lord.io/blog/2014/oss-tips/)
</p>
</aside>
### Comunică-ți așteptările
Regulile pot fi enervante când le scrii. Câteodată te-ai putea simți ca și cum faci politică pentru comportamentul altor oameni sau distrugi toată distracția.
Scrise și impuse corect, totuși, regulile bune împuternicesc întreținătorii. Ele te previn din a fi târât în a face lucruri pe care nu vrei să le faci.
Cei mai mulți oameni care ajung la proiectul tău nu știu nimic despre tine sau despre circumstanțele tale. Ei pot presupune că ești plătit să lucrezi pe el, în special dacă este ceva pe care ei îl folosesc în mod obișnuit și de care depind. Poate într-un punct tu depui mult timp în proiectul tău, dar acum ești ocupat cu un nou loc de muncă sau membru de familie.
Toate acestea sunt perfect OK! Doar asigură-te că ceilalți știu despre acestea.
Dacă întreținerea proiectului tău este part-time sau pur voluntară, fii sincer în legătură cu cât de mult timp ai. Acesta nu este același cu cât timp crezi tu că proiectul necesită, sau cât timp alții vor să cheltui tu.
Iată câteva reguli care merită notate:
* Cum o contribuție este analizată și acceptată (_Au nevoie de teste? Un șablon pentru probleme?_)
* Tipurile de contribuții pe care le vei accepta (_Vrei ajutor doar la o anumită parte a codului tău?_)
* Când este potrivit să răspundă (_de exemplu, „Puteți aștepta un răspuns de la un întreținător în 7 zile. Dacă nu ați auzit nimic până atunci, simțiți-vă liberi să bâzâiți firul de discuție.”_)
* Cât de mult timp cheltui pe proiect (_de exemplu, „Cheltuim doar aproximativ 5 ore pe săptămână pe acest proiect”_)
[Jekyll](https://github.com/jekyll/jekyll/tree/master/docs), [CocoaPods](https://github.com/CocoaPods/CocoaPods/wiki/Communication-&-Design-Rules), și [Homebrew](https://github.com/Homebrew/brew/blob/bbed7246bc5c5b7acb8c1d427d10b43e090dfd39/docs/Maintainers-Avoiding-Burnout.md) sunt câteva exemple de proiecte cu reguli de bază pentru întreținători și contributori.
### Keep communication public

Don't forget to document your interactions, too. Wherever you can, keep communication about your project public. If somebody tries to contact you privately to discuss a feature request or support need, politely direct them to a public communication channel, such as a mailing list or issue tracker.

If you meet with other maintainers, or make a major decision in private, document these conversations in public, even if it just means posting your notes.

That way, anybody who joins your community will have access to the same information as somebody who's been there for years.

## Learning to say no

You've written things down. Ideally, everybody would read your documentation, but in reality, you'll have to remind others that this knowledge exists.

Having everything written down, however, helps depersonalize situations when you do need to enforce your rules.

Saying no isn't fun, but _"Your contribution doesn't match this project's criteria"_ feels less personal than _"I don't like your contribution"_.

Saying no applies to many situations you'll come across as a maintainer: feature requests that don't fit the scope, someone derailing a discussion, doing unnecessary work for others.

### Keep the conversation friendly

One of the most important places you'll practice saying no is on your issue and pull request queue. As a project maintainer, you'll inevitably receive suggestions that you don't want to accept.

Maybe the contribution changes your project's scope or doesn't match your vision. Maybe the idea is good, but the implementation is poor.

Regardless of the reason, it is possible to tactfully handle contributions that don't meet your project's standards.

If you receive a contribution you don't want to accept, your first reaction might be to ignore it or pretend you didn't see it. Doing so could hurt the other person's feelings and even demotivate other potential contributors in your community.
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/krausefx?s=180" class="pquote-avatar" alt="avatar">
<p>
The key to handling support for large-scale open source projects is to keep issues moving. Try to avoid having issues stall. If you're an iOS developer you know how frustrating it can be to submit radars. You might hear back 2 years later, and are told to try again with the latest version of iOS.
</p>
<p markdown="1" class="pquote-credit">
— @KrauseFx, ["Scaling open source communities"](https://krausefx.com/blog/scaling-open-source-communities)
</p>
</aside>
Don't leave an unwanted contribution open because you feel guilty or want to be nice. Over time, your unanswered issues and PRs will make working on your project feel that much more stressful and intimidating.

It's better to immediately close the contributions you know you don't want to accept. If your project already suffers from a large backlog, @steveklabnik has suggestions for [how to triage issues efficiently](https://words.steveklabnik.com/how-to-be-an-open-source-gardener).

Secondly, ignoring contributions sends a negative signal to your community. Contributing to a project can be intimidating, especially for a first-timer. Even if you don't accept the contribution, acknowledge the person behind it and thank them for their interest. It's a big compliment!

If you don't want to accept a contribution:

* **Thank them** for their contribution
* **Explain why it doesn't fit** into the scope of the project, and offer clear suggestions for improvement, if you're able. Be kind, but firm.
* **Link to relevant documentation**, if you have it. If you notice repeated requests for things you don't want to accept, add them to your documentation to avoid repeating yourself.
* **Close the request**

You shouldn't need more than 1-2 sentences to respond. For example, when a user of [celery](https://github.com/celery/celery/) reported a Windows-related error, @berkerpeksag [responded with](https://github.com/celery/celery/issues/3383):

![Celery screenshot](/assets/images/best-practices/celery.png)

If the thought of saying no terrifies you, you're not alone. As @jessfraz [put it](https://blog.jessfraz.com/post/the-art-of-closing/):

> I've talked to maintainers from several different open source projects, Mesos, Kubernetes, Chromium, and they all agree one of the hardest parts of being a maintainer is saying "No" to patches you don't want.

Don't feel guilty about not wanting to accept someone's contribution. The first rule of open source, [according to](https://twitter.com/solomonstre/status/715277134978113536) @shykes: _"No is temporary, yes is forever."_ While empathizing with another person's enthusiasm is a good thing, rejecting a contribution is not the same as rejecting the person behind it.

Ultimately, if a contribution isn't good enough, you're under no obligation to accept it. Be kind and responsive when people contribute to your project, but only accept changes that you truly believe will make your project better. The more often you practice saying no, the easier it becomes. I promise.
### Be proactive

To reduce the volume of unwanted contributions in the first place, explain your project's process for submitting and accepting contributions in your contributing guide.

If you're receiving too many low-quality contributions, require that contributors do a bit of work beforehand, for example:

* Fill out a template or checklist on an issue or PR
* Open an issue before submitting a PR

If they don't follow your rules, close the issue immediately and point to your documentation.

While this approach may feel unkind at first, being proactive is actually good for both parties. It reduces the chance that somebody will put many wasted hours of work into a pull request you aren't going to accept. And it makes your workload easier to manage.
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/mikemcquaid?s=180" class="pquote-avatar" alt="avatar">
<p>
Ideally, explain to them and in a CONTRIBUTING.md file how they can get a better indication in the future on what would or would not be accepted before they begin the work.
</p>
<p markdown="1" class="pquote-credit">
— @MikeMcQuaid, ["Kindly Closing Pull Requests"](https://github.com/blog/2124-kindly-closing-pull-requests)
</p>
</aside>
Sometimes, when you say no, your potential contributor may get upset or criticize your decision. If their behavior becomes hostile, [take steps to defuse the situation](https://github.com/jonschlinkert/maintainers-guide-to-staying-positive#action-items) or even remove them from your community, if they're not willing to collaborate constructively.

### Embrace mentorship

Maybe someone in your community regularly submits contributions that don't meet your project's standards. It can be frustrating for both parties to repeatedly go through rejections.

If you see that someone is enthusiastic about your project, but needs a bit of polish, be patient. Explain clearly in each situation why their contribution doesn't meet the project's expectations. Try pointing them to an easier or less ambiguous task, such as an issue marked _"good first issue,"_ to get their feet wet. If you have time, consider mentoring them through their first contribution, or find someone else in your community who might be willing to mentor them.

## Leverage your community

You don't have to do everything yourself. Your project's community exists for a reason! Even if you don't yet have an active contributor community, if you have a lot of users, put them to work.

### Share the workload

If you're looking for others to pitch in, start by asking around.

When you see new contributors making repeated contributions, recognize their work by offering more responsibility. Document how others can grow into leadership roles if they wish to.

Encouraging others to [share ownership of the project](../building-community/#împarte-proprietatea-proiectului-tău) can greatly reduce your own workload, as @lmccart discovered on her project, [p5.js](https://github.com/processing/p5.js).
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/lmccart?s=180" class="pquote-avatar" alt="avatar">
<p>
I’d been saying, "Yeah, anyone can be involved, you don’t have to have a lot of coding expertise [...]." We had people sign up to come [to an event] and that’s when I was really wondering: is this true, what I’ve been saying? There are gonna be 40 people who show up, and it’s not like I can sit with each of them...But people came together, and it just sort of worked. As soon as one person got it, they could teach their neighbor.
</p>
<p markdown="1" class="pquote-credit">
— @lmccart, ["What Does "Open Source" Even Mean? p5.js Edition"](https://medium.com/@kenjagan/what-does-open-source-even-mean-p5-js-edition-98c02d354b39)
</p>
</aside>
If you need to step away from your project, either on hiatus or permanently, there's no shame in asking someone else to take over for you.

If other people are enthusiastic about its direction, give them commit access or formally hand over control to someone else. If someone has forked your project and is actively maintaining it elsewhere, consider linking to the fork from your original project. It's great that so many people want your project to live on!

@progrium [found that](https://web.archive.org/web/20151204215958/https://progrium.com/blog/2015/12/04/leadership-guilt-and-pull-requests/) documenting the vision for his project, [Dokku](https://github.com/dokku/dokku), helped those goals live on even after he stepped back from the project:

> I wrote a wiki page describing what I wanted and why I wanted it. For some reason it came as a surprise to me that the maintainers started moving the project in that direction! Did it happen exactly how I'd do it? Not always. But it still brought the project closer to what I wrote down.

### Let others build the solutions they need

If a potential contributor has a different opinion on what your project should do, you may want to gently encourage them to work on their own fork.

Forking a project doesn't have to be a bad thing. Being able to copy and modify projects is one of the best things about open source. Encouraging your community members to work on their own fork can provide the creative outlet they need, without conflicting with your project's vision.
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/geerlingguy?s=180" class="pquote-avatar" alt="avatar">
<p>
I cater to the 80% use case. If you are one of the unicorns, please fork my work. I won't get offended! My public projects are almost always meant to solve the most common problems; I try to make it easy to go deeper by either forking my work or extending it.
</p>
<p markdown="1" class="pquote-credit">
— @geerlingguy, ["Why I Close PRs"](https://www.jeffgeerling.com/blog/2016/why-i-close-prs-oss-project-maintainer-notes)
</p>
</aside>
The same applies to a user who really wants a solution that you simply don't have the bandwidth to build. Offering APIs and customization hooks can help others meet their own needs, without having to modify the source directly. @orta [found that](https://artsy.github.io/blog/2016/07/03/handling-big-projects/) encouraging plugins for CocoaPods led to "some of the most interesting ideas":

> It's almost inevitable that once a project becomes big, maintainers have to become a lot more conservative about how they introduce new code. You become good at saying "no", but a lot of people have legitimate needs. So, instead you end up converting your tool into a platform.

## Bring in the robots

Just as there are tasks that other people can help you with, there are also tasks that no human should ever have to do. Robots are your friends. Use them to make your life as a maintainer easier.

### Require tests and other checks to improve the quality of your code

One of the most important ways you can automate your project is by adding tests.

Tests help contributors feel confident that they won't break anything. They also make it easier for you to review and accept contributions quickly. The more responsive you are, the more engaged your community can be.

Set up automated tests that run on all incoming contributions, and make sure the tests can easily be run locally by contributors. Require that all contributions pass your tests before they can be submitted. You'll help set a minimum standard of quality for all submissions. [Required status checks](https://help.github.com/articles/about-required-status-checks/) on GitHub can ensure that no change is merged without your tests passing.

If you add tests, make sure to explain how they work in your CONTRIBUTING file.
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/edunham?s=180" class="pquote-avatar" alt="avatar">
<p>
I believe that tests are necessary for all code that people work on. If the code was fully and perfectly correct, it wouldn't need changes – we only write code when something is wrong, whether that's "It crashes" or "It lacks such-and-such a feature". And regardless of the changes you're making, tests are essential for catching any regressions you might accidentally introduce.
</p>
<p markdown="1" class="pquote-credit">
— @edunham, ["Rust's Community Automation"](https://edunham.net/2016/09/27/rust_s_community_automation.html)
</p>
</aside>
### Use tools to automate basic maintenance tasks

The good news about maintaining a popular project is that other maintainers have probably faced similar issues and built solutions for them.

There are a [variety of tools available](https://github.com/showcases/tools-for-open-source) to help automate some aspects of maintenance work. A few examples:

* [semantic-release](https://github.com/semantic-release/semantic-release) automates your releases
* [mention-bot](https://github.com/facebook/mention-bot) mentions potential reviewers for pull requests
* [Danger](https://github.com/danger/danger) helps automate code review

For bug reports and other common contributions, GitHub has [issue templates and pull request templates](https://github.com/blog/2111-issue-and-pull-request-templates), which you can create to streamline the information you receive. @TalAter made a [Choose Your Own Adventure guide](https://www.talater.com/open-source-templates/#/) to help you write your issue and PR templates.

To manage your email notifications, you can set up [email filters](https://github.com/blog/2203-email-updates-about-your-own-activity) to organize by priority.

If you want to get a little more advanced, style guides and linters can standardize project contributions and make them easier to review and accept.

However, if your standards are too complicated, they can raise the barriers to contribution. Make sure you're only adding enough rules to make everyone's lives easier.

If you're not sure which tools to use, look at what other popular projects do, especially those in your ecosystem. For example, what does the contribution process look like for other Node modules? Using similar tools and approaches will make your process more familiar to your target contributors.

## It's okay to hit pause

Open source work once brought you joy. Maybe now it's starting to make you feel avoidant or guilty.

Perhaps you're feeling overwhelmed or a growing sense of dread when you think about your projects. And meanwhile, the issues and pull requests pile up.

Burnout is a real and pervasive issue in open source work, especially among maintainers. As a maintainer, your happiness is a non-negotiable requirement for the survival of any open source project.

Although it should go without saying, take a break! You shouldn't have to wait until you feel burned out to take a vacation. @brettcannon, a Python core developer, decided to take [a month-long vacation](https://snarky.ca/why-i-took-october-off-from-oss-volunteering/) after 14 years of volunteer OSS work.

Just like any other type of work, taking regular breaks will keep you refreshed, happy, and excited about your work.
<aside markdown="1" class="pquote">
<img src="https://avatars.githubusercontent.com/danielbachhuber?s=180" class="pquote-avatar" alt="avatar">
<p>
In maintaining WP-CLI, I've discovered I need to make myself happy first, and set clear boundaries on my involvement. The best balance I've found is 2-5 hours per week, as a part of my normal work schedule. This keeps my involvement a passion, and from feeling too much like work. Because I prioritize the issues I'm working on, I can make regular progress on what I think is most important.
</p>
<p markdown="1" class="pquote-credit">
— @danielbachhuber, ["My condolences, you're now the maintainer of a popular open source project"](https://danielbachhuber.com/2016/06/26/my-condolences-youre-now-the-maintainer-of-a-popular-open-source-project/)
</p>
</aside>
Sometimes it can be hard to take a break from open source work when it feels like everybody needs you. People may even try to guilt-trip you for stepping away.

Do your best to find support for your users and community while you're away from a project. If you can't find the support you need, take a break anyway. Be sure to communicate when you're not available, so people aren't confused by your lack of responsiveness.

Taking breaks applies to more than just vacations, too. If you don't want to do open source work on weekends, or during work hours, communicate those expectations to others, so they know not to bother you.

## Take care of yourself first!

Maintaining a popular project requires different skills than the earlier stages of growth, but it's no less rewarding. As a maintainer, you'll practice leadership and personal skills on a level that few people get to experience. While it's not always easy to manage, setting clear boundaries and only taking on what you're comfortable with will help you stay happy, refreshed, and productive.
title: "[News] How does Virtual-reality Therapy for PTSD work?"
date: 2013-12-02 07:49:00
---
Virtual-reality therapy is increasingly used for post-traumatic stress disorder (PTSD) patients. The therapy has patients confront their trauma virtually in order to overcome it.
Even though about a quarter of patients can recover from their injury this way, the method of making patients face their trauma can also bring traumatic memories back.
This indicates that virtual-reality therapy is not always a good therapy for PTSD patients.
Read more: <http://www.scientificamerican.com/article.cfm?id=how-does-virtual-reality-therapy-fo>
title: "工单服务"
date: 2021-02-19T00:00:00+09:00
draft: false
collapsible: false
weight: 1
---
工单服务可为您有效地解决问题。您在工单系统中提交问题后,青云的工作人员致力于解决问题并反馈在工单系统中。您登陆控制台即可方便地查看记录或追溯问题。

## 查看工单
登陆控制台后,在页面右上角,点击**工单**,在下拉菜单中选择**我的工单**。您即可查看用户创建的工单、共享给您的工单和企业内的工单。
> 共享给您的工单需要您在**新邀请**中确定接受后,才能确认同意接收共享给**我的工单**。
>
> **企业内**的工单,是指该用户所属的企业的所有工单。

## 创建工单
登陆控制台后,您可以提交工单:
1. 在页面右上角,点击**工单**,在下拉菜单中选择**创建工单**。
2. 填写工单内容,包括:工单类型、标题、问题的影响类型、描述、相关产品等内容。
3. 点击**提交**。

## 回复工单
在您提交工单后,青云的后台工作人员将致力于解决您的问题,并回复在您提交的工单里。同时,您也会收到系统及时的邮件提醒,便于您查看工单。

若您对于问题的解决方案仍有疑问,您可以继续地回复工单并和后台工作人员沟通来解决问题。

These notebooks demonstrate (pre?)preprocessing for data obtained from heterogenous sources
various formats. We also have a notebook demonstrating the steps we undertake to convert
data from `grib` or `grib2` file format to the `netcdf` format used in the project.
## Directory structure
The raw data is expected to be in a directory `raw/` at the same level as
these notebooks. The notebooks output the preprocessed data into a `preprocess/`
directory that is also located at the same level as the notebooks. In our typical work-
flow, we use symbolic links to achieve this, because often times such large datasets of
raw files are present on data disks mounted elsewhere.
For example, if the working directory is `~/ai-vegetation-fuel/`, the raw data are in `/data1/raw_data/`, and we wish to store the processed
data in `/data1/processed_data/`, we would create symbolic links:
```bash
ln -s /data1/raw_data ./notebooks/preprocess/raw
ln -s /data1/processed_data ./notebooks/preprocess/preprocess
```
This should give you a directory structure like this (`ls -l ./notebooks/preprocess/`):
```bash
.
├── preprocess -> /data1/processed_data
├── preprocess_agb.ipynb
├── preprocess_burned_area.ipynb
├── preprocess_climate_regions.ipynb
├── preprocess_fire_anomalies.ipynb
├── preprocess_leaf_area_index.ipynb
├── preprocess_slopes.ipynb
├── preprocess_spi_gpcc.ipynb
├── preprocess_weather_anomalies.ipynb
├── raw -> /data1/raw_data
└── README.md
```
You can then create a symlink in the root of the repository with the name `data/` to
where you store the preprocessed data.
```bash
ln -s /data1/processed_data ./data
```
to get the following tree:
```bash
.
├── data -> /data1/preprocessed_data
├── dev-requirements.txt
├── docker
├── docs
│ └── _static
├── LICENSE
├── notebooks
│ └── preprocess
│ ├── preprocess -> /data1/preprocessed_data
│ └── raw -> /data1/raw_data
├── README.md
├── requirements.txt
└── src
├── models
├── pre-trained_models
└── utils
```
The Above Ground Biomass (AGB) dataset should be preprocessed first, since all other datasets use it as a baseline for resolution and dimensions.
---
layout: classic-docs
title: Continuous Deployment with AWS CodeDeploy
categories: [how-to]
description: Continuous Deployment with AWS CodeDeploy
last_updated: September 29, 2014
---
## Getting Started with CodeDeploy on CircleCI
AWS CodeDeploy is a deployment system that enables developers to automate the
deployment of applications to EC2 instances, and to update the applications as
required.
### AWS infrastructure
The first step to continuous deployment with CodeDeploy is setting up your EC2
instances, tagging them so you can define deployment groups, installing the
CodeDeploy agent on your hosts and setting up trust-roles so that CodeDeploy
can communicate with the CodeDeploy agents.
AWS provide a good [getting started with CodeDeploy guide][] for this part of
the process.
[getting started with CodeDeploy guide]: http://docs.aws.amazon.com/en_us/console/codedeploy/applications-user-guide
### CodeDeploy application
A CodeDeploy application is a collection of settings about where your
application can be deployed, how many instances can be deployed to at once,
what should be considered a failed deploy, and information on the trust-role to
use to allow CodeDeploy to interact with your EC2 instances.
**Note**: A CodeDeploy application does not specify what is to be deployed or what to
do during the deployment.
*What* to deploy is an archive of code/resources stored in S3 called an
`application revision`.
*How* to deploy is specified by the `AppSpec` file located inside the
application revision.
The [AppSpec][] file lives in your repo and tells CodeDeploy which files from
your application to deploy, where to deploy them, and also allows you specify
lifecyle scripts to be run at different stages during the deployment. You can
use these lifecycle scripts to stop your service before a new version is
deployed, run database migrations, install dependencies etc.
An [application revision][] is a zipfile or tarball containing the code/resources
to be deployed. It's usually created by packaging up your entire repo but can
be a sub-directory of the repo if desired. The `AppSpec` file must be stored
in the application revision as `<application-root>/appspec.yml`.
Generally speaking you'll have one application in your repo and so your
application revision can be created by packaging up your whole repo (excluding
`.git`). In this case `appspec.yml` should be placed in your repo root directory.
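As a concrete illustration, a minimal `appspec.yml` might look like the sketch below. The top-level keys (`version`, `os`, `files`, `hooks`) are part of the AppSpec format, but the destination path, script names, and timeouts here are illustrative assumptions, not values prescribed by this guide:

```yaml
version: 0.0
os: linux
files:
  # Copy the entire application revision to the web root
  - source: /
    destination: /var/www/my-app
hooks:
  # Stop the running service before the new version is installed
  ApplicationStop:
    - location: scripts/stop_server.sh
      timeout: 60
  # Install dependencies, run migrations, etc. after files are copied
  AfterInstall:
    - location: scripts/install_dependencies.sh
      timeout: 300
  # Start the service again once everything is in place
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 60
```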
Application revisions are stored in [S3][] and are identified by the
combination of a bucket name, key, eTag, and for versioned buckets, the
object's version.
The most straightforward way to configure a new application is to log on to the
[CodeDeploy console][] which can guide you through the process of [creating a new
application][].
[AppSpec]: http://docs.aws.amazon.com/codedeploy/latest/userguide/writing-app-spec.html
[application revision]: http://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-prepare-revision.html
[S3]: http://aws.amazon.com/s3/
[CodeDeploy console]: https://console.aws.amazon.com/codedeploy/home
[creating a new application]: https://console.aws.amazon.com/codedeploy/home#/applications/new
## Configuring CircleCI
CircleCI will automatically create new application revisions, upload them to
S3, and both trigger and watch deployments when you get a green build.
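For reference, these steps map onto plain AWS CLI operations. The sketch below shows a rough manual equivalent — `my-app`, `my-bucket`, and `my-deployment-group` are placeholders, and this is an illustration of the workflow rather than CircleCI's actual implementation:

```shell
# Package the repo into an application revision, upload it to S3,
# and register it with CodeDeploy (names are placeholders)
aws deploy push \
  --application-name my-app \
  --s3-location s3://my-bucket/my-app.zip \
  --source .

# Trigger a deployment of that revision to a deployment group
aws deploy create-deployment \
  --application-name my-app \
  --deployment-group-name my-deployment-group \
  --s3-location bucket=my-bucket,key=my-app.zip,bundleType=zip

# Check on the deployment's status
aws deploy get-deployment --deployment-id <deployment-id>
```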
### Step 1: Create an IAM user for CircleCI to use
You should create an [IAM user][] to use solely for builds on
CircleCI, that way you have control over exactly which of your resources can be
accessed by code running as part of your build.
Take note of the Access Key ID and Secret Access Key allocated to your new IAM
user, you'll need these later.
For deploying with CodeDeploy your IAM user needs to be able to access S3 and
CodeDeploy at a minimum.
#### S3 IAM policy
CircleCI needs to be able to upload application revisions to your S3 bucket.
Permissions can be scoped down to the common-prefix you use for application
revision keys if you don't want to give access to the entire bucket.
The following policy snippet allows us to upload to `my-bucket` as long as the
key starts with `my-app`.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-bucket/my-app*"
      ]
    }
  ]
}
```
#### CodeDeploy IAM policy
CircleCI also needs to be able to create application revisions, trigger
deployments and get deployment status. If your application is called `my-app`
and your account ID is `80398EXAMPLE` then the following policy snippet gives
us sufficient access:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "codedeploy:RegisterApplicationRevision",
        "codedeploy:GetApplicationRevision"
      ],
      "Resource": [
        "arn:aws:codedeploy:us-east-1:80398EXAMPLE:application:my-app"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "codedeploy:CreateDeployment",
        "codedeploy:GetDeployment"
      ],
      "Resource": [
        "arn:aws:codedeploy:us-east-1:80398EXAMPLE:deploymentgroup:my-app/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "codedeploy:GetDeploymentConfig"
      ],
      "Resource": [
        "arn:aws:codedeploy:us-east-1:80398EXAMPLE:deploymentconfig:CodeDeployDefault.OneAtATime",
        "arn:aws:codedeploy:us-east-1:80398EXAMPLE:deploymentconfig:CodeDeployDefault.HalfAtATime",
        "arn:aws:codedeploy:us-east-1:80398EXAMPLE:deploymentconfig:CodeDeployDefault.AllAtOnce"
      ]
    }
  ]
}
```
**Note:** This is the minimal policy necessary for creating deployments to
`my-app`. You will need to add additional `Resource` statements for each
additional CodeDeploy application. Alternatively, you can use wildcards in the
resource specification.
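For example, a single statement could cover every CodeDeploy application in the account with a wildcard `Resource` entry like the following (a hypothetical fragment; scope it as narrowly as your setup allows, since broader wildcards grant broader access):

```json
"Resource": [
  "arn:aws:codedeploy:us-east-1:80398EXAMPLE:application:*"
]
```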
If you want to use custom deployment configurations then we will also need the
`GetDeploymentConfig` permission for each of the custom deployment
configurations, check out the [CodeDeploy IAM docs][] for more information.
[IAM user]: http://docs.aws.amazon.com/general/latest/gr/root-vs-iam.html
[CodeDeploy IAM docs]: http://docs.aws.amazon.com/codedeploy/latest/userguide/access-permissions.html
### Step 2: Configure CircleCI to use your new IAM user
Go to your project's **Project Settings > AWS keys** page, enter your IAM
user's Access Key ID and Secret Access Key and hit "Save AWS keys".
Your AWS keys are stored encrypted but it's important to note that they need to
be made available to code inside your build containers; anyone who can commit
code to your repo or trigger an ssh build will be able to read the AWS keys.
This is another important reason to use IAM to limit the resources the keys
can access!
### Step 3: (Optional) Configure packaging and revision storage
CircleCI needs some additional information to be able to package up your app
and register new revisions:
1. The directory in your repo to package up. This is relative to your repo's
root, `/` means the repo's root directory, `/app` means the `app` directory
in your repo's root directory.
2. Where to store new revisions in S3. The bucket name and a pattern to use to
generate new keys within that bucket. You can use [substitution variables][]
in your key pattern to help generate a unique key for each application
revision.
3. Which AWS region your application lives in. Each application revision must
be registered in the same AWS region that the CodeDeploy application was
created in.
If you want to be able to deploy this application from several different
branches (e.g. deploy `development` to your staging instances and `master` to
your production instances) you can configure these project-wide application
settings in the CircleCI UI at **Project Settings > AWS CodeDeploy**. The
main benefit is that you will have a simpler [circle.yml][] file.
You can also skip this step and configure everything in your [circle.yml][].
### Step 4: Configure deployment parameters
Configure your CodeDeploy deployment using the `codedeploy` block in
[circle.yml][]. At a minimum you need to tell CircleCI which application to
deploy and which deployment group the selected branch should be deployed to.
Any additional settings will override the project-wide configuration in the
project settings UI:
```yaml
deployment:
  staging:
    branch: development
    codedeploy:
      my-app:
        deployment_group: staging-instance-group
```
The benefit of project-wide application settings comes when you want to deploy
the same app to different deployment groups:
```yaml
deployment:
  staging:
    branch: development
    codedeploy:
      my-app:
        deployment_group: staging-instance-group
  production:
    branch: master
    codedeploy:
      my-app:
        deployment_group: production-instance-group
```
If you wanted to override the S3 location for the application revisions
built for your production deployments (**Note:** you must specify both the
bucket name and key pattern if you override `revision_location`):
```yaml
deployment:
  production:
    branch: master
    codedeploy:
      my-app:
        deployment_group: production-instances
        revision_location:
          revision_type: S3
          s3_location:
            bucket: production-bucket
            key_pattern: apps/my-app-master-{SHORT_COMMIT}-{BUILD_NUM}
```
If you haven't provided [project-wide settings](#step-3-optional-configure-packaging-and-revision-storage)
you need to provide all the information for your deployment in your
[circle.yml][]:
```yaml
deployment:
  staging:
    branch: development
    codedeploy:
      my-app:
        application_root: /
        revision_location:
          revision_type: S3
          s3_location:
            bucket: staging-bucket
            key_pattern: apps/my-app-{SHORT_COMMIT}-{BUILD_NUM}
        region: us-east-1
        deployment_group: staging-instances
        deployment_config: CodeDeployDefault.AllAtOnce
```
Breaking this down: there's one entry in the `codedeploy` block which is
named with your CodeDeploy application's name (in this example we're
deploying an application called `my-app`).
The sub-entries of `my-app` tell CircleCI where and how to deploy the `my-app`
application.
* `application_root` is the directory to package up into an application revision. It
is relative to your repo and must start with a `/`.
`/` means the repo root directory.
The entire contents of `application_root` will be packaged up into a zipfile and
uploaded to S3.
* `revision_location` tells CircleCI where to upload application revisions to.
* `revision_type` is where to store the revision - currently only S3 is supported
* `bucket` is the name of the bucket that should store your application
revision bundles.
* `key_pattern` is used to generate the S3 key. You can use [substitution variables][]
to generate unique keys for each build.
* `region` is the AWS region your application lives in
* `deployment_group` names the deployment group to deploy the new revision to.
* `deployment_config` [optional] names the deployment configuration. It can be
any of the three standard CodeDeploy configurations (`CodeDeployDefault.OneAtATime`,
`CodeDeployDefault.HalfAtATime`, or `CodeDeployDefault.AllAtOnce`) or,
if you want to use a custom configuration you've created, you can name it
here.
### Key Patterns
Rather than overwriting a single S3 key with each new revision CircleCI can
generate unique keys for application revisions using substitution variables.
The available variables are:
* `{BRANCH}`: the branch being built
* `{COMMIT}`: the full SHA1 of the commit being built
* `{SHORT_COMMIT}`: the first 7 characters of the commit SHA1
* `{BUILD_NUM}`: the build number
For a unique key you'll need to embed at least one of `{COMMIT}`,
`{SHORT_COMMIT}`, or `{BUILD_NUM}` in your key pattern.
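As an illustration (this is a sketch, not CircleCI's actual implementation), the substitution can be thought of as a simple string expansion:

```python
# Sketch of how substitution variables might expand into an S3 key.
# Illustrative only; not CircleCI's actual implementation.

def render_key(pattern, branch, commit, build_num):
    """Expand {BRANCH}, {COMMIT}, {SHORT_COMMIT} and {BUILD_NUM} in a key pattern."""
    return (pattern
            .replace("{BRANCH}", branch)
            .replace("{COMMIT}", commit)
            .replace("{SHORT_COMMIT}", commit[:7])
            .replace("{BUILD_NUM}", str(build_num)))

key = render_key(
    "apps/my-app-{SHORT_COMMIT}-{BUILD_NUM}",
    branch="master",
    commit="97dcc919f64f12b2e82b50ef4d197b495957cb75",
    build_num=42,
)
print(key)  # apps/my-app-97dcc91-42
```

Because the commit SHA and build number change on every green build, any pattern containing them yields a distinct key per revision.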
If you'd rather use a versioned bucket just use a fixed string for the key
pattern and we'll use the object's versioning info instead.
## Pre- and post-deployment steps
Unlike other deployment options you don't need to specify pre- or
post-deployment steps in your `circle.yml`.
CodeDeploy provides first class support for your application's lifecycle via
lifecycle scripts. As a result you can start/stop services, run database
migrations, install dependencies etc. across all your instances in a consistent
manner.
[substitution variables]: #key-patterns
[circle.yml]: /docs/configuration/#deployment
| 40.163009 | 116 | 0.729785 | eng_Latn | 0.995378 |
97dcc919f64f12b2e82b50ef4d197b495957cb75 | 6,925 | md | Markdown | msteams-platform/concepts/deploy-and-publish/appsource/prepare/app-manifest-checklist.md | isabella232/msteams-docs.es-ES | a5abd8a27d657bc325eb3f5e19e2ff0d1059f235 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | msteams-platform/concepts/deploy-and-publish/appsource/prepare/app-manifest-checklist.md | isabella232/msteams-docs.es-ES | a5abd8a27d657bc325eb3f5e19e2ff0d1059f235 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-02-23T19:09:10.000Z | 2021-02-23T19:09:10.000Z | msteams-platform/concepts/deploy-and-publish/appsource/prepare/app-manifest-checklist.md | isabella232/msteams-docs.es-ES | a5abd8a27d657bc325eb3f5e19e2ff0d1059f235 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: App manifest checklist
description: The app manifest checklist for publishing your Microsoft Teams app to AppSource
keywords: Microsoft Teams Office Store publishing checklist
ms.openlocfilehash: 6186daf264f04e04d6037ddfb7d9208994cc3c57
ms.sourcegitcommit: 44ac886c0ca34a16222d3991a61606f8483b8481
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 02/05/2020
ms.locfileid: "41783881"
---
# <a name="app-manifest-checklist"></a>App manifest checklist
>[!IMPORTANT]
>We are migrating management of Office solutions from the Seller Dashboard to Partner Center. For more information, see [Moving management of solutions from Seller Dashboard to Partner Center](https://developer.microsoft.com/office/blogs/moving-management-of-solutions-from-seller-dashboard-to-partner-center/) and read the [FAQ](https://docs.microsoft.com/office/dev/store/partner-center-faq).
Your app manifest must comply with the guidelines described below.
>[!Tip]
> Use App Studio to help build your app manifest. It validates most of the requirements below and shows any errors or warnings on the **Test and distribute** tab.
## <a name="tips"></a>Tips
* Don't use "Teams", "Microsoft", or "app" in your app name.
* The developerName in the manifest must be the same as the provider name defined in Partner Center.
* Make sure your app description, screenshots, text, and promotional images describe only the app and don't contain advertising, promotions, or copyrighted brand names.
* If your product requires an account on your service or another service, say so in the description and make sure there are links to sign up, sign in, and sign out.
* If your product requires additional purchases to function properly, say so in the description.
* Provide the required terms and privacy policy links in the manifest and in Partner Center (or the dashboard). Verify that the links resolve to the correct documentation, ideally Teams-specific. For bots, you must provide this same information in the submission section of the Bot Framework registration page.
* Make sure the manifest metadata exactly matches the Partner Center metadata (and, for bots, the Bot Framework registration). Note that the Partner Center entry can contain a richer, formatted description for use on the AppSource product page.
* Make sure the app title used in the manifest is an **exact match** of the app title specified in the Partner Center submission. *See* [Create effective listings in Microsoft AppSource and within Office: Use a consistent add-in name](https://docs.microsoft.com/office/dev/store/create-effective-office-store-listings#use-a-consistent-add-in-name).
## <a name="metadata-requirement"></a>Metadata requirements
The following metadata is required for your app.
|Data|Type|Size|Manifest|Partner Center|Description|
|---|---|---|---|---|---|
|App package|.zip|||✔|The actual app package for uploading or AppSource submission.|
|Logo - color|.png|192×192 pixels|`icon.color`||The icon to display in the product page listing in the Teams gallery. This is your full-color product logo.|
|Logo - outline|.png|32×32 pixels|`icon.outline`||The icon to display in Microsoft Teams, in the Microsoft Teams chat channel, and in other locations. This is the logo rendered as a white outline on a transparent background.|
|App logo|.png, .jpg, .jpeg, .gif|300×300 pixels||✔|The icon to display in AppSource. This is the full-color product logo and is a different file from the one used in the manifest for `icon.color`; it must be smaller than 512 KB.|
|Support link|URL|||✔|A link to support material for end users who might not have the app installed. A publicly available link, accessible without any sign-in (HTTPS).|
|Privacy link|URL||`developer.privacyUrl`|✔|A link to your privacy policy (HTTPS).|
|Video link|URL|||Optional|A link to a video about your app.|
|EULA|.doc, .pdf, etc.|||Optional|AppSource requires an end-user license agreement (EULA), which you can provide as an attachment. If you choose not to submit a EULA, one will be provided on your behalf.|
|Terms of service|URL||`developer.termsOfServiceUrl`||A link to your terms of service (HTTPS).|
|Test notes|Inline text or a public URL||||Detailed test notes on how to test your app step by step. Include two sets of login credentials for testing admin and non-admin scenarios.|
## <a name="localized-content"></a>Localized content
> [!NOTE]
> AppSource plans to support localized content for the following metadata. Currently, the app description will only display in English on AppSource, but it will display correctly in the Microsoft Teams client. See [App localization](~/concepts/build-and-test/apps-localization.md) for more information.
|Data|Type|Size|Manifest|Partner Center|Description|
|---|---|---|---|---|---|
|App name|String|30|`name.short`|✔|The name of your app as it should appear in the storefront and in product.|
|Long app name|String|100|`name.full`|✔|The name of your app as it should appear in the storefront and in product.|
|Short description|String|80|`description.short`|✔|A short description of your app.|
|Long description|String|4000|`description.full`|✔|A more detailed description of your app. In the manifest file, an accurate summary is adequate. In Partner Center, you can use a richer, formatted description for the AppSource product page.|
|Screenshots (1-5)|.png, .jpg, or .gif|1366w × 768h and smaller than 1024 KB||✔|At least one screenshot that shows your app experience. Used on the app details page.|
## <a name="submission-extras-for-bots"></a>Submission extras for bots
Bots in Microsoft Teams must be created with the Bot Framework. See [Create a bot](~/bots/how-to/create-a-bot-for-teams.md) for instructions. Use a 96×96 color icon for the bot icon in the Bot Framework.
| 104.924242 | 414 | 0.788159 | spa_Latn | 0.987674 |
97de0804011977d82ca8fb436a710b4be8396050 | 8,042 | md | Markdown | packages/dropdown/CHANGELOG.md | henlz/farmblocks | 216637482d005446cf421c168e39f0d36f934cea | [
"MIT"
] | null | null | null | packages/dropdown/CHANGELOG.md | henlz/farmblocks | 216637482d005446cf421c168e39f0d36f934cea | [
"MIT"
] | null | null | null | packages/dropdown/CHANGELOG.md | henlz/farmblocks | 216637482d005446cf421c168e39f0d36f934cea | [
"MIT"
] | null | null | null | # Change Log
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
<a name="0.8.2"></a>
## [0.8.2](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.8.1...@crave/farmblocks-dropdown@0.8.2) (2018-08-13)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.8.1"></a>
## [0.8.1](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.8.0...@crave/farmblocks-dropdown@0.8.1) (2018-08-01)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.8.0"></a>
# [0.8.0](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.7.1...@crave/farmblocks-dropdown@0.8.0) (2018-07-26)
### Features
* **Dropdown:** allow a scroll bar to appear when content overflows the menu with a set max-height ([278057e](https://github.com/CraveFood/farmblocks/commit/278057e))
<a name="0.7.1"></a>
## [0.7.1](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.7.0...@crave/farmblocks-dropdown@0.7.1) (2018-07-20)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.7.0"></a>
# [0.7.0](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.6...@crave/farmblocks-dropdown@0.7.0) (2018-07-05)
### Features
* **dropdown:** new property for dropdown items: footer ([31b15f7](https://github.com/CraveFood/farmblocks/commit/31b15f7))
<a name="0.6.6"></a>
## [0.6.6](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.5...@crave/farmblocks-dropdown@0.6.6) (2018-06-22)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.6.5"></a>
## [0.6.5](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.4...@crave/farmblocks-dropdown@0.6.5) (2018-06-18)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.6.4"></a>
## [0.6.4](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.3...@crave/farmblocks-dropdown@0.6.4) (2018-06-12)
### Bug Fixes
* **stories:** the second argument of storiesOf shouldnt be a string ([ea2bbee](https://github.com/CraveFood/farmblocks/commit/ea2bbee))
<a name="0.6.3"></a>
## [0.6.3](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.2...@crave/farmblocks-dropdown@0.6.3) (2018-06-08)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.6.2"></a>
## [0.6.2](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.0...@crave/farmblocks-dropdown@0.6.2) (2018-05-23)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.6.1"></a>
## [0.6.1](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.6.0...@crave/farmblocks-dropdown@0.6.1) (2018-05-22)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.6.0"></a>
# [0.6.0](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.11...@crave/farmblocks-dropdown@0.6.0) (2018-05-10)
### Features
* **Dropdown:** add innerRef property ([5a87168](https://github.com/CraveFood/farmblocks/commit/5a87168)), closes [#342](https://github.com/CraveFood/farmblocks/issues/342)
<a name="0.5.11"></a>
## [0.5.11](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.10...@crave/farmblocks-dropdown@0.5.11) (2018-04-11)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.10"></a>
## [0.5.10](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.9...@crave/farmblocks-dropdown@0.5.10) (2018-04-10)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.9"></a>
## [0.5.9](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.8...@crave/farmblocks-dropdown@0.5.9) (2018-04-05)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.8"></a>
## [0.5.8](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.7...@crave/farmblocks-dropdown@0.5.8) (2018-03-15)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.7"></a>
## [0.5.7](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.6...@crave/farmblocks-dropdown@0.5.7) (2018-03-14)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.6"></a>
## [0.5.6](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.5...@crave/farmblocks-dropdown@0.5.6) (2018-03-01)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.5"></a>
## [0.5.5](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.4...@crave/farmblocks-dropdown@0.5.5) (2018-03-01)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.4"></a>
## [0.5.4](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.3...@crave/farmblocks-dropdown@0.5.4) (2018-02-28)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.3"></a>
## [0.5.3](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.2...@crave/farmblocks-dropdown@0.5.3) (2018-02-15)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.2"></a>
## [0.5.2](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.1...@crave/farmblocks-dropdown@0.5.2) (2018-02-14)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.1"></a>
## [0.5.1](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.5.0...@crave/farmblocks-dropdown@0.5.1) (2018-02-09)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.5.0"></a>
# [0.5.0](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.4.1...@crave/farmblocks-dropdown@0.5.0) (2018-02-07)
### Features
* **dropdown:** export dropdown wrappers ([0469a73](https://github.com/CraveFood/farmblocks/commit/0469a73))
* **dropdown item:** add selected and highlighted styles to dropdonw item wrapper ([3add0cb](https://github.com/CraveFood/farmblocks/commit/3add0cb))
<a name="0.4.1"></a>
## [0.4.1](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.4.0...@crave/farmblocks-dropdown@0.4.1) (2018-01-29)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.4.0"></a>
# [0.4.0](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.3.0...@crave/farmblocks-dropdown@0.4.0) (2018-01-26)
### Features
* **sizes:** add property size to allow trigger buttons of the same sizes as button ([dc1331c](https://github.com/CraveFood/farmblocks/commit/dc1331c))
<a name="0.3.0"></a>
# [0.3.0](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.2.1...@crave/farmblocks-dropdown@0.3.0) (2018-01-18)
### Features
* **dropdown:** add focused outline on dropdown component ([40b945a](https://github.com/CraveFood/farmblocks/commit/40b945a))
<a name="0.2.1"></a>
## [0.2.1](https://github.com/CraveFood/farmblocks/compare/@crave/farmblocks-dropdown@0.2.0...@crave/farmblocks-dropdown@0.2.1) (2018-01-18)
**Note:** Version bump only for package @crave/farmblocks-dropdown
<a name="0.2.0"></a>
# 0.2.0 (2018-01-17)
### Bug Fixes
* **color:** fix dropdown items color ([435841b](https://github.com/CraveFood/farmblocks/commit/435841b))
* **css:** add whitespace no-wrap on dropdown ([6205083](https://github.com/CraveFood/farmblocks/commit/6205083))
### Features
* **dropdown:** add width property to dropdown to set custom menu width ([df99091](https://github.com/CraveFood/farmblocks/commit/df99091))
* **dropdown:** initial dropdown component ([b1b8848](https://github.com/CraveFood/farmblocks/commit/b1b8848))
| 30.233083 | 172 | 0.707287 | yue_Hant | 0.1757 |
97de5763ba85c66a937a8ce2a5c37caa9229fa53 | 1,592 | md | Markdown | sites/ava-site/docs/api/ckb/CKBOptions.zh.md | Mu-L/AVA | 329c8554382dfc44fd90634fa921ee95875c3ec9 | [
"MIT"
] | 1,126 | 2020-01-02T07:33:48.000Z | 2022-03-30T07:14:16.000Z | sites/ava-site/docs/api/ckb/CKBOptions.zh.md | Mu-L/AVA | 329c8554382dfc44fd90634fa921ee95875c3ec9 | [
"MIT"
] | 180 | 2020-01-06T16:14:38.000Z | 2022-03-30T11:33:20.000Z | sites/ava-site/docs/api/ckb/CKBOptions.zh.md | Mu-L/AVA | 329c8554382dfc44fd90634fa921ee95875c3ec9 | [
"MIT"
] | 115 | 2020-01-04T11:00:59.000Z | 2022-03-28T07:57:47.000Z | ---
title: CKBOptions
order: 2
---
`markdown:docs/common/style.md`
<div class="doc-md">
Gets all the possible properties that chart knowledge can contain.
```sign
CKBOptions(lang)
```
### Parameters
* **lang** * The language of the returned content.
* _Optional._
* `Type`: *Language* extends string
* `Default`: 'en-US'
* `Options`: 'en-US', 'zh-CN'
### Returns
*object* * containing the following properties:
#### `CKBOptions().family`
> An overview of *chart families*; similar chart types are grouped into one family.
Examples:
* LineCharts
* ColumnCharts
* BarCharts
* PieCharts
* AreaCharts
* ScatterCharts
* FunnelCharts
* HeatmapCharts
* RadarCharts
* Others
#### `CKBOptions().category`
> *Graphic categories*: graphics in data visualization can be divided into several broad categories by structure and nature.
Examples:
* Statistic
* Diagram
* Graph
* Map
#### `CKBOptions().purpose`
> *Analysis purposes*: charts classified by their analytical purpose; for example, a **pie chart** is better suited to describing proportions, and a **line chart** is better suited to describing trends.
Examples:
* Comparison
* Trend
* Distribution
* Rank
* Proportion
* Composition
#### `CKBOptions().coord`
> *Coordinate systems*.
Examples:
* NumberLine
* Cartesian2D
* SymmetricCartesian
* Cartesian3D
* Polar
* NodeLink
* Radar
#### `CKBOptions().shape`
> *Shapes*: charts classified by an intuitive perception of their visual shape.
Examples:
* Lines
* Bars
* Round
* Square
* Area
* Scatter
* Symmetric
#### `CKBOptions().channel`
> *Visual channels*: visual channels are visual-element variables that can be used to encode data, such as length, shape, and color. Chart types can also be classified by the visual channels they use.
Examples:
* Position
* Length
* Color
* Area
* Angle
* ArcLength
* Direction
* Size
#### `CKBOptions().lom`
> *Level of Measurement (LoM)*: a description of a field's characteristics, such as nominal, ordinal, numeric, or date-time.
Examples:
* Nominal
* Ordinal
* Interval
* Discrete
* Continuous
* Time
### Example
```js
import { CKBOptions } from '@antv/knowledge';
const options1 = CKBOptions();
const options2 = CKBOptions('zh-CN');
const allCategories = options1.category;
// ['Statistic', 'Diagram', 'Graph', 'Map']
```
</div>
| 11.536232 | 72 | 0.658291 | yue_Hant | 0.378231 |
97dedadc352a60f072b0bdcdd0d64ef47846c0aa | 137 | md | Markdown | README.md | gabriel-valle/minimizer | eb3e42ca5fa4667c712ad2228a6c8af3bdb73185 | [
"MIT"
] | null | null | null | README.md | gabriel-valle/minimizer | eb3e42ca5fa4667c712ad2228a6c8af3bdb73185 | [
"MIT"
] | null | null | null | README.md | gabriel-valle/minimizer | eb3e42ca5fa4667c712ad2228a6c8af3bdb73185 | [
"MIT"
] | null | null | null | # minimizer
Implementation of unconstrained minimization algorithms for scalar fields and 3D path visualization for functions with 2 inputs
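A minimal sketch of one such algorithm (fixed-step gradient descent on a scalar field of two variables, recording the path for later visualization). The objective and parameters here are illustrative examples, not this repo's actual API:

```python
# Illustrative fixed-step gradient descent for a 2-input scalar field.
# The objective function and step size are examples, not this repo's API.

def grad_descent(grad, x0, step=0.1, iters=200):
    """Follow the negative gradient from x0, recording the visited path."""
    path = [tuple(x0)]
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
        path.append(tuple(x))
    return x, path

# Minimize f(x, y) = x^2 + y^2, whose gradient is (2x, 2y).
xmin, path = grad_descent(lambda v: (2 * v[0], 2 * v[1]), [3.0, -4.0])
print(xmin)  # both coordinates approach 0
```

The recorded `path` is exactly the kind of data a 3D path visualization would plot over the surface of the function.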
| 45.666667 | 124 | 0.854015 | eng_Latn | 0.997243 |
97dff003c7e063a723f4ecdb9b6624cb0b219339 | 1,816 | md | Markdown | _docs/topic-09/09-exploring/07-page.md | ashley-rezvani/341-web-design-Spring2022 | 5d009bb506c1791768d1cd9bd55565eac1d6d57c | [
"MIT"
] | null | null | null | _docs/topic-09/09-exploring/07-page.md | ashley-rezvani/341-web-design-Spring2022 | 5d009bb506c1791768d1cd9bd55565eac1d6d57c | [
"MIT"
] | null | null | null | _docs/topic-09/09-exploring/07-page.md | ashley-rezvani/341-web-design-Spring2022 | 5d009bb506c1791768d1cd9bd55565eac1d6d57c | [
"MIT"
] | null | null | null | ---
title: Combining Colors
module: topic-09
permalink: /topic-09/combining-color/
---
<div class="divider-heading"></div>
When we start to mix the ratios of red, green, and blue, we come up with the rest of the colors of the spectrum. Some of the first colors we should consider are the complementary colors of red, green, and blue. To get the complementary color for red, we use full green and blue. This creates cyan.
- Name: cyan;
- RGB: rgb(0, 255, 255);
- Hex: #00ffff;
<div width="50%" height="20px"
style="background-color:#00ffff;color:#ff0000;padding:10px;font-size:1.25em;">
Red complements cyan.
</div>
<div width="50%" height="20px"
style="background-color:#ff0000;color:#00ffff;padding:10px;font-size:1.25em;">
Cyan complements red.
</div>
<br />
Likewise, to get the complementary colors for green and blue, which are magenta and yellow, respectively, we boost the values of the two other colors.
- Name: magenta;
- RGB: rgb(255, 0, 255);
- Hex: #ff00ff;
<div width="50%" height="20px"
style="background-color:#ff00ff;color:#00ff00;padding:10px;font-size:1.25em;">
Green complements magenta.
</div>
<div width="50%" height="20px"
style="background-color:#00ff00;color:#ff00ff;padding:10px;font-size:1.25em;">
Magenta complements green.
</div>
<br />
- Name: yellow;
- RGB: rgb(255, 255, 0);
- Hex: #ffff00;
<div width="50%" height="20px"
style="background-color:#ffff00;color:#0000ff;padding:10px;font-size:1.25em;">
Blue complements yellow.
</div>
<div width="50%" height="20px"
style="background-color:#0000ff;color:#ffff00;padding:10px;font-size:1.25em;">
Yellow complements blue.
</div>
<br />
Other colors are, of course, some combination of these values. The exact ratios depend on the amounts of red, green, and blue in each value.
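Since a complement simply mirrors each channel around the maximum value of 255, it can be computed mechanically. A small sketch (a hypothetical helper for illustration, not part of the course materials):

```python
# Compute the RGB complement of a color by inverting each channel.
# Hypothetical helper for illustration; not part of the course materials.

def complement(r, g, b):
    """Return the complementary color: each channel mirrored around 255."""
    return (255 - r, 255 - g, 255 - b)

def to_hex(rgb):
    """Format an (r, g, b) tuple as a lowercase hex color string."""
    return "#" + "".join(f"{c:02x}" for c in rgb)

print(to_hex(complement(255, 0, 0)))  # #00ffff  (red -> cyan)
print(to_hex(complement(0, 255, 0)))  # #ff00ff  (green -> magenta)
print(to_hex(complement(0, 0, 255)))  # #ffff00  (blue -> yellow)
```

Running it on the three primaries reproduces the cyan, magenta, and yellow pairings shown above.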
| 30.779661 | 297 | 0.706498 | eng_Latn | 0.913826 |
97e01655eee25297fa4c04f73f16edb32080fec0 | 66,723 | md | Markdown | repos/rabbitmq/remote/3.8.27.md | Mattlk13/repo-info | 734e8af562852b4d6503f484be845727b88a97ae | [
"Apache-2.0"
] | null | null | null | repos/rabbitmq/remote/3.8.27.md | Mattlk13/repo-info | 734e8af562852b4d6503f484be845727b88a97ae | [
"Apache-2.0"
] | 1 | 2020-11-05T19:56:17.000Z | 2020-11-12T13:09:29.000Z | repos/rabbitmq/remote/3.8.27.md | Mattlk13/repo-info | 734e8af562852b4d6503f484be845727b88a97ae | [
"Apache-2.0"
] | 1 | 2017-02-09T22:16:59.000Z | 2017-02-09T22:16:59.000Z | ## `rabbitmq:3.8.27`
```console
$ docker pull rabbitmq@sha256:723bc9acb06f70727a062b5deac8c11b6f645da15ba562036a001ea8de4df784
```
- Manifest MIME: `application/vnd.docker.distribution.manifest.list.v2+json`
- Platforms: 6
- linux; amd64
- linux; arm variant v7
- linux; arm64 variant v8
- linux; ppc64le
- linux; riscv64
- linux; s390x
### `rabbitmq:3.8.27` - linux; amd64
```console
$ docker pull rabbitmq@sha256:d3454b56aa889a4e6a483e58b6471795e562b86e67691698c5b54e446abb63b4
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **98.6 MB (98554303 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:6ca8b549025a6de874fd575bc66da21e9cd55df15d9caebe29b69d697d080417`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["rabbitmq-server"]`
```dockerfile
# Wed, 02 Feb 2022 02:14:45 GMT
ADD file:3ccf747d646089ed7fbb43c40c45dd43e86f0674115f856efada93c7e4a63624 in /
# Wed, 02 Feb 2022 02:14:46 GMT
CMD ["bash"]
# Wed, 02 Feb 2022 04:00:35 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends gosu ; rm -rf /var/lib/apt/lists/*; gosu nobody true
# Wed, 02 Feb 2022 04:00:35 GMT
ARG PGP_KEYSERVER=keyserver.ubuntu.com
# Wed, 02 Feb 2022 04:00:35 GMT
ENV OPENSSL_VERSION=1.1.1m
# Wed, 02 Feb 2022 04:00:36 GMT
ENV OPENSSL_SOURCE_SHA256=f89199be8b23ca45fc7cb9f1d8d3ee67312318286ad030f5316aca6462db6c96
# Wed, 02 Feb 2022 04:00:36 GMT
ENV OPENSSL_PGP_KEY_IDS=0x8657ABB260F056B1E5190839D9C4D26D0E604491 0x5B2545DAB21995F4088CEFAA36CEE4DEB00CFE33 0xED230BEC4D4F2518B9D7DF41F0DB4D21C1D35231 0xC1F33DD8CE1D4CC613AF14DA9195C48241FBF7DD 0x7953AC1FBC3DC8B3B292393ED5E9E43F7DF9EE8C 0xE5E52560DD91C556DDBDA5D02064C53641C25E5D
# Sat, 26 Feb 2022 01:57:55 GMT
ENV OTP_VERSION=24.2.2
# Sat, 26 Feb 2022 01:57:55 GMT
ENV OTP_SOURCE_SHA256=a87bcbdcdd1b99de7038030123b2d655d46d6e698a9143608618bdbec6ebbee7
# Sat, 26 Feb 2022 02:07:16 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends autoconf ca-certificates dpkg-dev gcc g++ gnupg libncurses5-dev make wget ; rm -rf /var/lib/apt/lists/*; OPENSSL_SOURCE_URL="https://www.openssl.org/source/openssl-$OPENSSL_VERSION.tar.gz"; OPENSSL_PATH="/usr/local/src/openssl-$OPENSSL_VERSION"; OPENSSL_CONFIG_DIR=/usr/local/etc/ssl; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz" "$OPENSSL_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $OPENSSL_PGP_KEY_IDS; do gpg --batch --keyserver "$PGP_KEYSERVER" --recv-keys "$key"; done; gpg --batch --verify "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_PATH.tar.gz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; echo "$OPENSSL_SOURCE_SHA256 *$OPENSSL_PATH.tar.gz" | sha256sum --check --strict -; mkdir -p "$OPENSSL_PATH"; tar --extract --file "$OPENSSL_PATH.tar.gz" --directory "$OPENSSL_PATH" --strip-components 1; cd "$OPENSSL_PATH"; debMultiarch="$(dpkg-architecture --query DEB_HOST_MULTIARCH)"; MACHINE="$(dpkg-architecture --query DEB_BUILD_GNU_CPU)" RELEASE="4.x.y-z" SYSTEM='Linux' BUILD='???' 
./config --openssldir="$OPENSSL_CONFIG_DIR" --libdir="lib/$debMultiarch" -Wl,-rpath=/usr/local/lib ; make -j "$(getconf _NPROCESSORS_ONLN)"; make install_sw install_ssldirs; cd ..; rm -rf "$OPENSSL_PATH"*; ldconfig; rmdir "$OPENSSL_CONFIG_DIR/certs" "$OPENSSL_CONFIG_DIR/private"; ln -sf /etc/ssl/certs /etc/ssl/private "$OPENSSL_CONFIG_DIR"; openssl version; OTP_SOURCE_URL="https://github.com/erlang/otp/releases/download/OTP-$OTP_VERSION/otp_src_$OTP_VERSION.tar.gz"; OTP_PATH="/usr/local/src/otp-$OTP_VERSION"; mkdir -p "$OTP_PATH"; wget --progress dot:giga --output-document "$OTP_PATH.tar.gz" "$OTP_SOURCE_URL"; echo "$OTP_SOURCE_SHA256 *$OTP_PATH.tar.gz" | sha256sum --check --strict -; tar --extract --file "$OTP_PATH.tar.gz" --directory "$OTP_PATH" --strip-components 1; cd "$OTP_PATH"; export ERL_TOP="$OTP_PATH"; ./otp_build autoconf; CFLAGS="$(dpkg-buildflags --get CFLAGS)"; export CFLAGS; export CFLAGS="$CFLAGS -Wl,-rpath=/usr/local/lib"; hostArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)"; buildArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; dpkgArch="$(dpkg --print-architecture)"; dpkgArch="${dpkgArch##*-}"; jitFlag=; case "$dpkgArch" in amd64) jitFlag='--enable-jit' ;; esac; ./configure --host="$hostArch" --build="$buildArch" --disable-dynamic-ssl-lib --disable-hipe --disable-sctp --disable-silent-rules --enable-clock-gettime --enable-hybrid-heap --enable-kernel-poll --enable-shared-zlib --enable-smp-support --enable-threads --with-microstate-accounting=extra --without-common_test --without-debugger --without-dialyzer --without-diameter --without-edoc --without-erl_docgen --without-et --without-eunit --without-ftp --without-hipe --without-jinterface --without-megaco --without-observer --without-odbc --without-reltool --without-ssh --without-tftp --without-wx $jitFlag ; make -j "$(getconf _NPROCESSORS_ONLN)" GEN_OPT_FLGS="-O2 -fno-strict-aliasing"; make install; cd ..; rm -rf "$OTP_PATH"* /usr/local/lib/erlang/lib/*/examples 
/usr/local/lib/erlang/lib/*/src ; apt-mark auto '.*' > /dev/null; [ -z "$savedAptMark" ] || apt-mark manual $savedAptMark; find /usr/local -type f -executable -exec ldd '{}' ';' | awk '/=>/ { print $(NF-1) }' | sort -u | xargs -r dpkg-query --search | cut -d: -f1 | sort -u | xargs -r apt-mark manual ; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; openssl version; erl -noshell -eval 'io:format("~p~n~n~p~n~n", [crypto:supports(), ssl:versions()]), init:stop().'
# Sat, 26 Feb 2022 02:07:16 GMT
ENV RABBITMQ_DATA_DIR=/var/lib/rabbitmq
# Sat, 26 Feb 2022 02:07:17 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; groupadd --gid 999 --system rabbitmq; useradd --uid 999 --system --home-dir "$RABBITMQ_DATA_DIR" --gid rabbitmq rabbitmq; mkdir -p "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chown -fR rabbitmq:rabbitmq "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chmod 777 "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; ln -sf "$RABBITMQ_DATA_DIR/.erlang.cookie" /root/.erlang.cookie
# Sat, 26 Feb 2022 02:17:52 GMT
ENV RABBITMQ_VERSION=3.8.27
# Sat, 26 Feb 2022 02:17:53 GMT
ENV RABBITMQ_PGP_KEY_ID=0x0A9AF2115F4687BD29803A206B73A36E6026DFCA
# Sat, 26 Feb 2022 02:17:53 GMT
ENV RABBITMQ_HOME=/opt/rabbitmq
# Sat, 26 Feb 2022 02:17:53 GMT
ENV PATH=/opt/rabbitmq/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin RABBITMQ_LOGS=-
# Sat, 26 Feb 2022 02:18:11 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends ca-certificates gnupg wget xz-utils ; rm -rf /var/lib/apt/lists/*; RABBITMQ_SOURCE_URL="https://github.com/rabbitmq/rabbitmq-server/releases/download/v$RABBITMQ_VERSION/rabbitmq-server-generic-unix-latest-toolchain-$RABBITMQ_VERSION.tar.xz"; RABBITMQ_PATH="/usr/local/src/rabbitmq-$RABBITMQ_VERSION"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz" "$RABBITMQ_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$RABBITMQ_PGP_KEY_ID"; gpg --batch --verify "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_PATH.tar.xz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$RABBITMQ_HOME"; tar --extract --file "$RABBITMQ_PATH.tar.xz" --directory "$RABBITMQ_HOME" --strip-components 1; rm -rf "$RABBITMQ_PATH"*; grep -qE '^SYS_PREFIX=\$\{RABBITMQ_HOME\}$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; sed -i 's/^SYS_PREFIX=.*$/SYS_PREFIX=/' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; grep -qE '^SYS_PREFIX=$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; chown -R rabbitmq:rabbitmq "$RABBITMQ_HOME"; apt-mark auto '.*' > /dev/null; apt-mark manual $savedAptMark; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; [ ! -e "$RABBITMQ_DATA_DIR/.erlang.cookie" ]; gosu rabbitmq rabbitmqctl help; gosu rabbitmq rabbitmqctl list_ciphers; gosu rabbitmq rabbitmq-plugins list; rm "$RABBITMQ_DATA_DIR/.erlang.cookie"
# Sat, 26 Feb 2022 02:18:13 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; gosu rabbitmq rabbitmq-plugins enable --offline rabbitmq_prometheus; echo 'management_agent.disable_metrics_collector = true' > /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf; chown rabbitmq:rabbitmq /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf
# Sat, 26 Feb 2022 02:18:14 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN ln -sf /opt/rabbitmq/plugins /plugins
# Sat, 26 Feb 2022 02:18:14 GMT
ENV HOME=/var/lib/rabbitmq
# Sat, 26 Feb 2022 02:18:15 GMT
VOLUME [/var/lib/rabbitmq]
# Sat, 26 Feb 2022 02:18:15 GMT
ENV LANG=C.UTF-8 LANGUAGE=C.UTF-8 LC_ALL=C.UTF-8
# Sat, 26 Feb 2022 02:18:15 GMT
COPY file:031403dd5d8e0fcf8749c29d6595776a74d7fb2d07fd51c2c2f9f9f22af0a504 in /usr/local/bin/
# Sat, 26 Feb 2022 02:18:15 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Sat, 26 Feb 2022 02:18:16 GMT
EXPOSE 15691 15692 25672 4369 5671 5672
# Sat, 26 Feb 2022 02:18:16 GMT
CMD ["rabbitmq-server"]
```
- Layers:
- `sha256:08c01a0ec47e82ebe2bec112f373d160983a6d1e9e66627f66a3322bc403221b`
Last Modified: Wed, 02 Feb 2022 02:16:20 GMT
Size: 28.6 MB (28564099 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:1ceb23964d6c1492664adf8d6cc85511d6c3e315160fad60f75f7b096362ced4`
Last Modified: Wed, 02 Feb 2022 04:11:09 GMT
Size: 870.0 KB (870031 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:3cc5b082947ae074b6c9b2cc42f146379cc92cde0fe1a46f78022e1550de65bb`
Last Modified: Sat, 26 Feb 2022 02:20:17 GMT
Size: 50.9 MB (50935361 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:92491b691ce168d0737ced7d1eeb3a019e9abb41e8bc87b257c9ec1f9685b305`
Last Modified: Sat, 26 Feb 2022 02:20:09 GMT
Size: 2.1 KB (2085 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:7c96e25e9f93bf32e1221b75655f54923fc19b71288e34d52310c3c24632619f`
Last Modified: Sat, 26 Feb 2022 02:21:34 GMT
Size: 18.2 MB (18177365 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:e4bbad4b46415e6d41910c0462092fabd5c78b720af74519ac668b25566cbe15`
Last Modified: Sat, 26 Feb 2022 02:21:32 GMT
Size: 273.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:434d3208c758cd0ed841abf2a3486e5e9a2ba7c218fdcd583b41b0d23c343fe4`
Last Modified: Sat, 26 Feb 2022 02:21:32 GMT
Size: 105.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:4e5752918fd00362546f27aa6155db258478c455e0ffe342e3043f3c4c468b60`
Last Modified: Sat, 26 Feb 2022 02:21:32 GMT
Size: 5.0 KB (4984 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `rabbitmq:3.8.27` - linux; arm variant v7
```console
$ docker pull rabbitmq@sha256:e553ebfd1c2dbd898d250960d578f7135c937bd16de03c0913514a2caa1968fd
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **84.8 MB (84806345 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:058924a26dfee99835d3cd6f114a89bac21834501a6b7b8810007365f25e2fd3`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["rabbitmq-server"]`
```dockerfile
# Wed, 02 Feb 2022 02:25:11 GMT
ADD file:0adc3f597b5ba8c31a9a4d67126166cf067749754e269fe2c3ed43f03457b53c in /
# Wed, 02 Feb 2022 02:25:12 GMT
CMD ["bash"]
# Wed, 02 Feb 2022 04:55:45 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends gosu ; rm -rf /var/lib/apt/lists/*; gosu nobody true
# Wed, 02 Feb 2022 04:55:46 GMT
ARG PGP_KEYSERVER=keyserver.ubuntu.com
# Wed, 02 Feb 2022 04:55:46 GMT
ENV OPENSSL_VERSION=1.1.1m
# Wed, 02 Feb 2022 04:55:46 GMT
ENV OPENSSL_SOURCE_SHA256=f89199be8b23ca45fc7cb9f1d8d3ee67312318286ad030f5316aca6462db6c96
# Wed, 02 Feb 2022 04:55:47 GMT
ENV OPENSSL_PGP_KEY_IDS=0x8657ABB260F056B1E5190839D9C4D26D0E604491 0x5B2545DAB21995F4088CEFAA36CEE4DEB00CFE33 0xED230BEC4D4F2518B9D7DF41F0DB4D21C1D35231 0xC1F33DD8CE1D4CC613AF14DA9195C48241FBF7DD 0x7953AC1FBC3DC8B3B292393ED5E9E43F7DF9EE8C 0xE5E52560DD91C556DDBDA5D02064C53641C25E5D
# Fri, 25 Feb 2022 19:08:59 GMT
ENV OTP_VERSION=24.2.2
# Fri, 25 Feb 2022 19:08:59 GMT
ENV OTP_SOURCE_SHA256=a87bcbdcdd1b99de7038030123b2d655d46d6e698a9143608618bdbec6ebbee7
# Fri, 25 Feb 2022 19:14:32 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends autoconf ca-certificates dpkg-dev gcc g++ gnupg libncurses5-dev make wget ; rm -rf /var/lib/apt/lists/*; OPENSSL_SOURCE_URL="https://www.openssl.org/source/openssl-$OPENSSL_VERSION.tar.gz"; OPENSSL_PATH="/usr/local/src/openssl-$OPENSSL_VERSION"; OPENSSL_CONFIG_DIR=/usr/local/etc/ssl; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz" "$OPENSSL_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $OPENSSL_PGP_KEY_IDS; do gpg --batch --keyserver "$PGP_KEYSERVER" --recv-keys "$key"; done; gpg --batch --verify "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_PATH.tar.gz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; echo "$OPENSSL_SOURCE_SHA256 *$OPENSSL_PATH.tar.gz" | sha256sum --check --strict -; mkdir -p "$OPENSSL_PATH"; tar --extract --file "$OPENSSL_PATH.tar.gz" --directory "$OPENSSL_PATH" --strip-components 1; cd "$OPENSSL_PATH"; debMultiarch="$(dpkg-architecture --query DEB_HOST_MULTIARCH)"; MACHINE="$(dpkg-architecture --query DEB_BUILD_GNU_CPU)" RELEASE="4.x.y-z" SYSTEM='Linux' BUILD='???' 
./config --openssldir="$OPENSSL_CONFIG_DIR" --libdir="lib/$debMultiarch" -Wl,-rpath=/usr/local/lib ; make -j "$(getconf _NPROCESSORS_ONLN)"; make install_sw install_ssldirs; cd ..; rm -rf "$OPENSSL_PATH"*; ldconfig; rmdir "$OPENSSL_CONFIG_DIR/certs" "$OPENSSL_CONFIG_DIR/private"; ln -sf /etc/ssl/certs /etc/ssl/private "$OPENSSL_CONFIG_DIR"; openssl version; OTP_SOURCE_URL="https://github.com/erlang/otp/releases/download/OTP-$OTP_VERSION/otp_src_$OTP_VERSION.tar.gz"; OTP_PATH="/usr/local/src/otp-$OTP_VERSION"; mkdir -p "$OTP_PATH"; wget --progress dot:giga --output-document "$OTP_PATH.tar.gz" "$OTP_SOURCE_URL"; echo "$OTP_SOURCE_SHA256 *$OTP_PATH.tar.gz" | sha256sum --check --strict -; tar --extract --file "$OTP_PATH.tar.gz" --directory "$OTP_PATH" --strip-components 1; cd "$OTP_PATH"; export ERL_TOP="$OTP_PATH"; ./otp_build autoconf; CFLAGS="$(dpkg-buildflags --get CFLAGS)"; export CFLAGS; export CFLAGS="$CFLAGS -Wl,-rpath=/usr/local/lib"; hostArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)"; buildArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; dpkgArch="$(dpkg --print-architecture)"; dpkgArch="${dpkgArch##*-}"; jitFlag=; case "$dpkgArch" in amd64) jitFlag='--enable-jit' ;; esac; ./configure --host="$hostArch" --build="$buildArch" --disable-dynamic-ssl-lib --disable-hipe --disable-sctp --disable-silent-rules --enable-clock-gettime --enable-hybrid-heap --enable-kernel-poll --enable-shared-zlib --enable-smp-support --enable-threads --with-microstate-accounting=extra --without-common_test --without-debugger --without-dialyzer --without-diameter --without-edoc --without-erl_docgen --without-et --without-eunit --without-ftp --without-hipe --without-jinterface --without-megaco --without-observer --without-odbc --without-reltool --without-ssh --without-tftp --without-wx $jitFlag ; make -j "$(getconf _NPROCESSORS_ONLN)" GEN_OPT_FLGS="-O2 -fno-strict-aliasing"; make install; cd ..; rm -rf "$OTP_PATH"* /usr/local/lib/erlang/lib/*/examples 
/usr/local/lib/erlang/lib/*/src ; apt-mark auto '.*' > /dev/null; [ -z "$savedAptMark" ] || apt-mark manual $savedAptMark; find /usr/local -type f -executable -exec ldd '{}' ';' | awk '/=>/ { print $(NF-1) }' | sort -u | xargs -r dpkg-query --search | cut -d: -f1 | sort -u | xargs -r apt-mark manual ; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; openssl version; erl -noshell -eval 'io:format("~p~n~n~p~n~n", [crypto:supports(), ssl:versions()]), init:stop().'
# Fri, 25 Feb 2022 19:14:33 GMT
ENV RABBITMQ_DATA_DIR=/var/lib/rabbitmq
# Fri, 25 Feb 2022 19:14:35 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; groupadd --gid 999 --system rabbitmq; useradd --uid 999 --system --home-dir "$RABBITMQ_DATA_DIR" --gid rabbitmq rabbitmq; mkdir -p "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chown -fR rabbitmq:rabbitmq "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chmod 777 "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; ln -sf "$RABBITMQ_DATA_DIR/.erlang.cookie" /root/.erlang.cookie
# Fri, 25 Feb 2022 19:22:18 GMT
ENV RABBITMQ_VERSION=3.8.27
# Fri, 25 Feb 2022 19:22:18 GMT
ENV RABBITMQ_PGP_KEY_ID=0x0A9AF2115F4687BD29803A206B73A36E6026DFCA
# Fri, 25 Feb 2022 19:22:19 GMT
ENV RABBITMQ_HOME=/opt/rabbitmq
# Fri, 25 Feb 2022 19:22:19 GMT
ENV PATH=/opt/rabbitmq/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin RABBITMQ_LOGS=-
# Fri, 25 Feb 2022 19:23:03 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends ca-certificates gnupg wget xz-utils ; rm -rf /var/lib/apt/lists/*; RABBITMQ_SOURCE_URL="https://github.com/rabbitmq/rabbitmq-server/releases/download/v$RABBITMQ_VERSION/rabbitmq-server-generic-unix-latest-toolchain-$RABBITMQ_VERSION.tar.xz"; RABBITMQ_PATH="/usr/local/src/rabbitmq-$RABBITMQ_VERSION"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz" "$RABBITMQ_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$RABBITMQ_PGP_KEY_ID"; gpg --batch --verify "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_PATH.tar.xz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$RABBITMQ_HOME"; tar --extract --file "$RABBITMQ_PATH.tar.xz" --directory "$RABBITMQ_HOME" --strip-components 1; rm -rf "$RABBITMQ_PATH"*; grep -qE '^SYS_PREFIX=\$\{RABBITMQ_HOME\}$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; sed -i 's/^SYS_PREFIX=.*$/SYS_PREFIX=/' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; grep -qE '^SYS_PREFIX=$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; chown -R rabbitmq:rabbitmq "$RABBITMQ_HOME"; apt-mark auto '.*' > /dev/null; apt-mark manual $savedAptMark; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; [ ! -e "$RABBITMQ_DATA_DIR/.erlang.cookie" ]; gosu rabbitmq rabbitmqctl help; gosu rabbitmq rabbitmqctl list_ciphers; gosu rabbitmq rabbitmq-plugins list; rm "$RABBITMQ_DATA_DIR/.erlang.cookie"
# Fri, 25 Feb 2022 19:23:08 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; gosu rabbitmq rabbitmq-plugins enable --offline rabbitmq_prometheus; echo 'management_agent.disable_metrics_collector = true' > /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf; chown rabbitmq:rabbitmq /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf
# Fri, 25 Feb 2022 19:23:10 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN ln -sf /opt/rabbitmq/plugins /plugins
# Fri, 25 Feb 2022 19:23:10 GMT
ENV HOME=/var/lib/rabbitmq
# Fri, 25 Feb 2022 19:23:11 GMT
VOLUME [/var/lib/rabbitmq]
# Fri, 25 Feb 2022 19:23:11 GMT
ENV LANG=C.UTF-8 LANGUAGE=C.UTF-8 LC_ALL=C.UTF-8
# Fri, 25 Feb 2022 19:23:12 GMT
COPY file:031403dd5d8e0fcf8749c29d6595776a74d7fb2d07fd51c2c2f9f9f22af0a504 in /usr/local/bin/
# Fri, 25 Feb 2022 19:23:12 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Fri, 25 Feb 2022 19:23:13 GMT
EXPOSE 15691 15692 25672 4369 5671 5672
# Fri, 25 Feb 2022 19:23:13 GMT
CMD ["rabbitmq-server"]
```
- Layers:
- `sha256:42bcffb5d2901aadaedc35f036cf725a537494a5869ae378ec427d313ff41fa6`
Last Modified: Wed, 02 Feb 2022 02:29:41 GMT
Size: 24.1 MB (24062751 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:033c1014bd696b5abd40ce52e8e3567613bef1ec88fe07f44109c0f6a2d9e0e0`
Last Modified: Wed, 02 Feb 2022 05:09:21 GMT
Size: 838.9 KB (838889 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:77a4c81da91aa6b1afe9f0418be5f70abf41ed18d6e30dddb834c369a36a6403`
Last Modified: Fri, 25 Feb 2022 19:27:15 GMT
Size: 41.7 MB (41718978 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:3d8f51af35532b9c65bd85fb3b08fed926a1184adcaef38da4a24c2e9ab1e801`
Last Modified: Fri, 25 Feb 2022 19:26:54 GMT
Size: 2.1 KB (2075 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:270ef190705ad07e3f36a26869b09527ec6e04a491367b8e17ecd7deafd08ad2`
Last Modified: Fri, 25 Feb 2022 19:29:20 GMT
Size: 18.2 MB (18178287 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:c8aba0ab3e94c1e5bb26e307c628f804a736894944f8385c8c4600975f31cfb2`
Last Modified: Fri, 25 Feb 2022 19:29:17 GMT
Size: 273.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:a44ff9b3ae3b88afdecfcbc54fe9a75271b3ff7d443b364e141b6473ef8260d7`
Last Modified: Fri, 25 Feb 2022 19:29:16 GMT
Size: 107.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:fe57809107ced1c573ee63ff5db7b1e452c91af4eab49ae2413c9ba8fb479017`
Last Modified: Fri, 25 Feb 2022 19:29:17 GMT
Size: 5.0 KB (4985 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `rabbitmq:3.8.27` - linux; arm64 variant v8
```console
$ docker pull rabbitmq@sha256:80366c166d661c5d7b3fb5e4a1a5161788657f86f793efb944a66174290d6b6e
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **88.8 MB (88750515 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:31691b12ac130c0703fc2390b90f33cbaa057b0b312e366bb95991695824898d`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["rabbitmq-server"]`
```dockerfile
# Wed, 02 Feb 2022 03:19:27 GMT
ADD file:3acc741be29b0b58e44d7302ab5ce65bf65ea1b35922be58a2cee9cb708d006a in /
# Wed, 02 Feb 2022 03:19:27 GMT
CMD ["bash"]
# Wed, 02 Feb 2022 06:08:54 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends gosu ; rm -rf /var/lib/apt/lists/*; gosu nobody true
# Wed, 02 Feb 2022 06:08:55 GMT
ARG PGP_KEYSERVER=keyserver.ubuntu.com
# Wed, 02 Feb 2022 06:08:56 GMT
ENV OPENSSL_VERSION=1.1.1m
# Wed, 02 Feb 2022 06:08:57 GMT
ENV OPENSSL_SOURCE_SHA256=f89199be8b23ca45fc7cb9f1d8d3ee67312318286ad030f5316aca6462db6c96
# Wed, 02 Feb 2022 06:08:58 GMT
ENV OPENSSL_PGP_KEY_IDS=0x8657ABB260F056B1E5190839D9C4D26D0E604491 0x5B2545DAB21995F4088CEFAA36CEE4DEB00CFE33 0xED230BEC4D4F2518B9D7DF41F0DB4D21C1D35231 0xC1F33DD8CE1D4CC613AF14DA9195C48241FBF7DD 0x7953AC1FBC3DC8B3B292393ED5E9E43F7DF9EE8C 0xE5E52560DD91C556DDBDA5D02064C53641C25E5D
# Sat, 26 Feb 2022 00:01:00 GMT
ENV OTP_VERSION=24.2.2
# Sat, 26 Feb 2022 00:01:01 GMT
ENV OTP_SOURCE_SHA256=a87bcbdcdd1b99de7038030123b2d655d46d6e698a9143608618bdbec6ebbee7
# Sat, 26 Feb 2022 00:07:14 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends autoconf ca-certificates dpkg-dev gcc g++ gnupg libncurses5-dev make wget ; rm -rf /var/lib/apt/lists/*; OPENSSL_SOURCE_URL="https://www.openssl.org/source/openssl-$OPENSSL_VERSION.tar.gz"; OPENSSL_PATH="/usr/local/src/openssl-$OPENSSL_VERSION"; OPENSSL_CONFIG_DIR=/usr/local/etc/ssl; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz" "$OPENSSL_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $OPENSSL_PGP_KEY_IDS; do gpg --batch --keyserver "$PGP_KEYSERVER" --recv-keys "$key"; done; gpg --batch --verify "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_PATH.tar.gz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; echo "$OPENSSL_SOURCE_SHA256 *$OPENSSL_PATH.tar.gz" | sha256sum --check --strict -; mkdir -p "$OPENSSL_PATH"; tar --extract --file "$OPENSSL_PATH.tar.gz" --directory "$OPENSSL_PATH" --strip-components 1; cd "$OPENSSL_PATH"; debMultiarch="$(dpkg-architecture --query DEB_HOST_MULTIARCH)"; MACHINE="$(dpkg-architecture --query DEB_BUILD_GNU_CPU)" RELEASE="4.x.y-z" SYSTEM='Linux' BUILD='???' 
./config --openssldir="$OPENSSL_CONFIG_DIR" --libdir="lib/$debMultiarch" -Wl,-rpath=/usr/local/lib ; make -j "$(getconf _NPROCESSORS_ONLN)"; make install_sw install_ssldirs; cd ..; rm -rf "$OPENSSL_PATH"*; ldconfig; rmdir "$OPENSSL_CONFIG_DIR/certs" "$OPENSSL_CONFIG_DIR/private"; ln -sf /etc/ssl/certs /etc/ssl/private "$OPENSSL_CONFIG_DIR"; openssl version; OTP_SOURCE_URL="https://github.com/erlang/otp/releases/download/OTP-$OTP_VERSION/otp_src_$OTP_VERSION.tar.gz"; OTP_PATH="/usr/local/src/otp-$OTP_VERSION"; mkdir -p "$OTP_PATH"; wget --progress dot:giga --output-document "$OTP_PATH.tar.gz" "$OTP_SOURCE_URL"; echo "$OTP_SOURCE_SHA256 *$OTP_PATH.tar.gz" | sha256sum --check --strict -; tar --extract --file "$OTP_PATH.tar.gz" --directory "$OTP_PATH" --strip-components 1; cd "$OTP_PATH"; export ERL_TOP="$OTP_PATH"; ./otp_build autoconf; CFLAGS="$(dpkg-buildflags --get CFLAGS)"; export CFLAGS; export CFLAGS="$CFLAGS -Wl,-rpath=/usr/local/lib"; hostArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)"; buildArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; dpkgArch="$(dpkg --print-architecture)"; dpkgArch="${dpkgArch##*-}"; jitFlag=; case "$dpkgArch" in amd64) jitFlag='--enable-jit' ;; esac; ./configure --host="$hostArch" --build="$buildArch" --disable-dynamic-ssl-lib --disable-hipe --disable-sctp --disable-silent-rules --enable-clock-gettime --enable-hybrid-heap --enable-kernel-poll --enable-shared-zlib --enable-smp-support --enable-threads --with-microstate-accounting=extra --without-common_test --without-debugger --without-dialyzer --without-diameter --without-edoc --without-erl_docgen --without-et --without-eunit --without-ftp --without-hipe --without-jinterface --without-megaco --without-observer --without-odbc --without-reltool --without-ssh --without-tftp --without-wx $jitFlag ; make -j "$(getconf _NPROCESSORS_ONLN)" GEN_OPT_FLGS="-O2 -fno-strict-aliasing"; make install; cd ..; rm -rf "$OTP_PATH"* /usr/local/lib/erlang/lib/*/examples 
/usr/local/lib/erlang/lib/*/src ; apt-mark auto '.*' > /dev/null; [ -z "$savedAptMark" ] || apt-mark manual $savedAptMark; find /usr/local -type f -executable -exec ldd '{}' ';' | awk '/=>/ { print $(NF-1) }' | sort -u | xargs -r dpkg-query --search | cut -d: -f1 | sort -u | xargs -r apt-mark manual ; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; openssl version; erl -noshell -eval 'io:format("~p~n~n~p~n~n", [crypto:supports(), ssl:versions()]), init:stop().'
# Sat, 26 Feb 2022 00:07:15 GMT
ENV RABBITMQ_DATA_DIR=/var/lib/rabbitmq
# Sat, 26 Feb 2022 00:07:15 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; groupadd --gid 999 --system rabbitmq; useradd --uid 999 --system --home-dir "$RABBITMQ_DATA_DIR" --gid rabbitmq rabbitmq; mkdir -p "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chown -fR rabbitmq:rabbitmq "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chmod 777 "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; ln -sf "$RABBITMQ_DATA_DIR/.erlang.cookie" /root/.erlang.cookie
# Sat, 26 Feb 2022 00:14:40 GMT
ENV RABBITMQ_VERSION=3.8.27
# Sat, 26 Feb 2022 00:14:41 GMT
ENV RABBITMQ_PGP_KEY_ID=0x0A9AF2115F4687BD29803A206B73A36E6026DFCA
# Sat, 26 Feb 2022 00:14:42 GMT
ENV RABBITMQ_HOME=/opt/rabbitmq
# Sat, 26 Feb 2022 00:14:43 GMT
ENV PATH=/opt/rabbitmq/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin RABBITMQ_LOGS=-
# Sat, 26 Feb 2022 00:15:03 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends ca-certificates gnupg wget xz-utils ; rm -rf /var/lib/apt/lists/*; RABBITMQ_SOURCE_URL="https://github.com/rabbitmq/rabbitmq-server/releases/download/v$RABBITMQ_VERSION/rabbitmq-server-generic-unix-latest-toolchain-$RABBITMQ_VERSION.tar.xz"; RABBITMQ_PATH="/usr/local/src/rabbitmq-$RABBITMQ_VERSION"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz" "$RABBITMQ_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$RABBITMQ_PGP_KEY_ID"; gpg --batch --verify "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_PATH.tar.xz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$RABBITMQ_HOME"; tar --extract --file "$RABBITMQ_PATH.tar.xz" --directory "$RABBITMQ_HOME" --strip-components 1; rm -rf "$RABBITMQ_PATH"*; grep -qE '^SYS_PREFIX=\$\{RABBITMQ_HOME\}$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; sed -i 's/^SYS_PREFIX=.*$/SYS_PREFIX=/' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; grep -qE '^SYS_PREFIX=$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; chown -R rabbitmq:rabbitmq "$RABBITMQ_HOME"; apt-mark auto '.*' > /dev/null; apt-mark manual $savedAptMark; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; [ ! -e "$RABBITMQ_DATA_DIR/.erlang.cookie" ]; gosu rabbitmq rabbitmqctl help; gosu rabbitmq rabbitmqctl list_ciphers; gosu rabbitmq rabbitmq-plugins list; rm "$RABBITMQ_DATA_DIR/.erlang.cookie"
# Sat, 26 Feb 2022 00:15:05 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; gosu rabbitmq rabbitmq-plugins enable --offline rabbitmq_prometheus; echo 'management_agent.disable_metrics_collector = true' > /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf; chown rabbitmq:rabbitmq /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf
# Sat, 26 Feb 2022 00:15:06 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN ln -sf /opt/rabbitmq/plugins /plugins
# Sat, 26 Feb 2022 00:15:07 GMT
ENV HOME=/var/lib/rabbitmq
# Sat, 26 Feb 2022 00:15:08 GMT
VOLUME [/var/lib/rabbitmq]
# Sat, 26 Feb 2022 00:15:09 GMT
ENV LANG=C.UTF-8 LANGUAGE=C.UTF-8 LC_ALL=C.UTF-8
# Sat, 26 Feb 2022 00:15:11 GMT
COPY file:031403dd5d8e0fcf8749c29d6595776a74d7fb2d07fd51c2c2f9f9f22af0a504 in /usr/local/bin/
# Sat, 26 Feb 2022 00:15:11 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Sat, 26 Feb 2022 00:15:12 GMT
EXPOSE 15691 15692 25672 4369 5671 5672
# Sat, 26 Feb 2022 00:15:13 GMT
CMD ["rabbitmq-server"]
```
- Layers:
- `sha256:bbf2fb66fa6e06dd46eb26f518f93171ee7c48be68aafb213aa7c2c12f4018ca`
Last Modified: Wed, 02 Feb 2022 03:21:24 GMT
Size: 27.2 MB (27169640 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:9b05d593b97c7f054e8d55aed21371efc41ccb3941159a4e961afca212ccf7c6`
Last Modified: Wed, 02 Feb 2022 06:14:56 GMT
Size: 699.0 KB (698954 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:5eef4d7e5e3043ccc6900757b7b4ba2c903e3a0de2ca15175a321ddc81afd095`
Last Modified: Sat, 26 Feb 2022 00:17:14 GMT
Size: 42.7 MB (42697054 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8cac96905f7cbf7949a4c1f6b1159545143733c044d48e7eee393c5025ccad75`
Last Modified: Sat, 26 Feb 2022 00:17:09 GMT
Size: 2.0 KB (1969 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:891cd859386f95a41c19ce75e86ef5a45ba3ccc6ea28b363564efa4208e93979`
Last Modified: Sat, 26 Feb 2022 00:18:38 GMT
Size: 18.2 MB (18177530 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:07dad87c9e189fee2fe8df7f83c8e28e9a6313a27e3ab52dee442a9895b9a5d5`
Last Modified: Sat, 26 Feb 2022 00:18:37 GMT
Size: 273.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:f2367c344597e31d9b1035286faf0d394398103ff2744be9d7181a281f85064f`
Last Modified: Sat, 26 Feb 2022 00:18:37 GMT
Size: 107.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:b4a868042a3faef6101f6868556da7661ab46469fce9e22aedaac96645a5b4a5`
Last Modified: Sat, 26 Feb 2022 00:18:37 GMT
Size: 5.0 KB (4988 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `rabbitmq:3.8.27` - linux; ppc64le
```console
$ docker pull rabbitmq@sha256:73b247a4dd7a1671bf49cf9de119b002a8401ae68120950747ab7d43cae3ee1d
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **96.6 MB (96566137 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:9baa537599d76cb780fb850009717ded20ec66e34d54b0eac3db9cb46fd26eeb`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["rabbitmq-server"]`
```dockerfile
# Wed, 02 Feb 2022 03:50:21 GMT
ADD file:e27da75ca1655de0ac82ef9879f868863388ea992e031aeace61195495bc21bc in /
# Wed, 02 Feb 2022 03:50:25 GMT
CMD ["bash"]
# Wed, 02 Feb 2022 07:04:49 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends gosu ; rm -rf /var/lib/apt/lists/*; gosu nobody true
# Wed, 02 Feb 2022 07:04:52 GMT
ARG PGP_KEYSERVER=keyserver.ubuntu.com
# Wed, 02 Feb 2022 07:04:55 GMT
ENV OPENSSL_VERSION=1.1.1m
# Wed, 02 Feb 2022 07:04:57 GMT
ENV OPENSSL_SOURCE_SHA256=f89199be8b23ca45fc7cb9f1d8d3ee67312318286ad030f5316aca6462db6c96
# Wed, 02 Feb 2022 07:05:00 GMT
ENV OPENSSL_PGP_KEY_IDS=0x8657ABB260F056B1E5190839D9C4D26D0E604491 0x5B2545DAB21995F4088CEFAA36CEE4DEB00CFE33 0xED230BEC4D4F2518B9D7DF41F0DB4D21C1D35231 0xC1F33DD8CE1D4CC613AF14DA9195C48241FBF7DD 0x7953AC1FBC3DC8B3B292393ED5E9E43F7DF9EE8C 0xE5E52560DD91C556DDBDA5D02064C53641C25E5D
# Sat, 26 Feb 2022 06:24:22 GMT
ENV OTP_VERSION=24.2.2
# Sat, 26 Feb 2022 06:24:28 GMT
ENV OTP_SOURCE_SHA256=a87bcbdcdd1b99de7038030123b2d655d46d6e698a9143608618bdbec6ebbee7
# Sat, 26 Feb 2022 06:38:10 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends autoconf ca-certificates dpkg-dev gcc g++ gnupg libncurses5-dev make wget ; rm -rf /var/lib/apt/lists/*; OPENSSL_SOURCE_URL="https://www.openssl.org/source/openssl-$OPENSSL_VERSION.tar.gz"; OPENSSL_PATH="/usr/local/src/openssl-$OPENSSL_VERSION"; OPENSSL_CONFIG_DIR=/usr/local/etc/ssl; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz" "$OPENSSL_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $OPENSSL_PGP_KEY_IDS; do gpg --batch --keyserver "$PGP_KEYSERVER" --recv-keys "$key"; done; gpg --batch --verify "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_PATH.tar.gz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; echo "$OPENSSL_SOURCE_SHA256 *$OPENSSL_PATH.tar.gz" | sha256sum --check --strict -; mkdir -p "$OPENSSL_PATH"; tar --extract --file "$OPENSSL_PATH.tar.gz" --directory "$OPENSSL_PATH" --strip-components 1; cd "$OPENSSL_PATH"; debMultiarch="$(dpkg-architecture --query DEB_HOST_MULTIARCH)"; MACHINE="$(dpkg-architecture --query DEB_BUILD_GNU_CPU)" RELEASE="4.x.y-z" SYSTEM='Linux' BUILD='???' 
./config --openssldir="$OPENSSL_CONFIG_DIR" --libdir="lib/$debMultiarch" -Wl,-rpath=/usr/local/lib ; make -j "$(getconf _NPROCESSORS_ONLN)"; make install_sw install_ssldirs; cd ..; rm -rf "$OPENSSL_PATH"*; ldconfig; rmdir "$OPENSSL_CONFIG_DIR/certs" "$OPENSSL_CONFIG_DIR/private"; ln -sf /etc/ssl/certs /etc/ssl/private "$OPENSSL_CONFIG_DIR"; openssl version; OTP_SOURCE_URL="https://github.com/erlang/otp/releases/download/OTP-$OTP_VERSION/otp_src_$OTP_VERSION.tar.gz"; OTP_PATH="/usr/local/src/otp-$OTP_VERSION"; mkdir -p "$OTP_PATH"; wget --progress dot:giga --output-document "$OTP_PATH.tar.gz" "$OTP_SOURCE_URL"; echo "$OTP_SOURCE_SHA256 *$OTP_PATH.tar.gz" | sha256sum --check --strict -; tar --extract --file "$OTP_PATH.tar.gz" --directory "$OTP_PATH" --strip-components 1; cd "$OTP_PATH"; export ERL_TOP="$OTP_PATH"; ./otp_build autoconf; CFLAGS="$(dpkg-buildflags --get CFLAGS)"; export CFLAGS; export CFLAGS="$CFLAGS -Wl,-rpath=/usr/local/lib"; hostArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)"; buildArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; dpkgArch="$(dpkg --print-architecture)"; dpkgArch="${dpkgArch##*-}"; jitFlag=; case "$dpkgArch" in amd64) jitFlag='--enable-jit' ;; esac; ./configure --host="$hostArch" --build="$buildArch" --disable-dynamic-ssl-lib --disable-hipe --disable-sctp --disable-silent-rules --enable-clock-gettime --enable-hybrid-heap --enable-kernel-poll --enable-shared-zlib --enable-smp-support --enable-threads --with-microstate-accounting=extra --without-common_test --without-debugger --without-dialyzer --without-diameter --without-edoc --without-erl_docgen --without-et --without-eunit --without-ftp --without-hipe --without-jinterface --without-megaco --without-observer --without-odbc --without-reltool --without-ssh --without-tftp --without-wx $jitFlag ; make -j "$(getconf _NPROCESSORS_ONLN)" GEN_OPT_FLGS="-O2 -fno-strict-aliasing"; make install; cd ..; rm -rf "$OTP_PATH"* /usr/local/lib/erlang/lib/*/examples 
/usr/local/lib/erlang/lib/*/src ; apt-mark auto '.*' > /dev/null; [ -z "$savedAptMark" ] || apt-mark manual $savedAptMark; find /usr/local -type f -executable -exec ldd '{}' ';' | awk '/=>/ { print $(NF-1) }' | sort -u | xargs -r dpkg-query --search | cut -d: -f1 | sort -u | xargs -r apt-mark manual ; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; openssl version; erl -noshell -eval 'io:format("~p~n~n~p~n~n", [crypto:supports(), ssl:versions()]), init:stop().'
# Sat, 26 Feb 2022 06:38:17 GMT
ENV RABBITMQ_DATA_DIR=/var/lib/rabbitmq
# Sat, 26 Feb 2022 06:38:27 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; groupadd --gid 999 --system rabbitmq; useradd --uid 999 --system --home-dir "$RABBITMQ_DATA_DIR" --gid rabbitmq rabbitmq; mkdir -p "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chown -fR rabbitmq:rabbitmq "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chmod 777 "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; ln -sf "$RABBITMQ_DATA_DIR/.erlang.cookie" /root/.erlang.cookie
# Sat, 26 Feb 2022 06:53:29 GMT
ENV RABBITMQ_VERSION=3.8.27
# Sat, 26 Feb 2022 06:53:31 GMT
ENV RABBITMQ_PGP_KEY_ID=0x0A9AF2115F4687BD29803A206B73A36E6026DFCA
# Sat, 26 Feb 2022 06:53:35 GMT
ENV RABBITMQ_HOME=/opt/rabbitmq
# Sat, 26 Feb 2022 06:53:39 GMT
ENV PATH=/opt/rabbitmq/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin RABBITMQ_LOGS=-
# Sat, 26 Feb 2022 06:56:22 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends ca-certificates gnupg wget xz-utils ; rm -rf /var/lib/apt/lists/*; RABBITMQ_SOURCE_URL="https://github.com/rabbitmq/rabbitmq-server/releases/download/v$RABBITMQ_VERSION/rabbitmq-server-generic-unix-latest-toolchain-$RABBITMQ_VERSION.tar.xz"; RABBITMQ_PATH="/usr/local/src/rabbitmq-$RABBITMQ_VERSION"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz" "$RABBITMQ_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$RABBITMQ_PGP_KEY_ID"; gpg --batch --verify "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_PATH.tar.xz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$RABBITMQ_HOME"; tar --extract --file "$RABBITMQ_PATH.tar.xz" --directory "$RABBITMQ_HOME" --strip-components 1; rm -rf "$RABBITMQ_PATH"*; grep -qE '^SYS_PREFIX=\$\{RABBITMQ_HOME\}$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; sed -i 's/^SYS_PREFIX=.*$/SYS_PREFIX=/' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; grep -qE '^SYS_PREFIX=$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; chown -R rabbitmq:rabbitmq "$RABBITMQ_HOME"; apt-mark auto '.*' > /dev/null; apt-mark manual $savedAptMark; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; [ ! -e "$RABBITMQ_DATA_DIR/.erlang.cookie" ]; gosu rabbitmq rabbitmqctl help; gosu rabbitmq rabbitmqctl list_ciphers; gosu rabbitmq rabbitmq-plugins list; rm "$RABBITMQ_DATA_DIR/.erlang.cookie"
# Sat, 26 Feb 2022 06:56:35 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; gosu rabbitmq rabbitmq-plugins enable --offline rabbitmq_prometheus; echo 'management_agent.disable_metrics_collector = true' > /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf; chown rabbitmq:rabbitmq /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf
# Sat, 26 Feb 2022 06:56:47 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN ln -sf /opt/rabbitmq/plugins /plugins
# Sat, 26 Feb 2022 06:56:53 GMT
ENV HOME=/var/lib/rabbitmq
# Sat, 26 Feb 2022 06:56:59 GMT
VOLUME [/var/lib/rabbitmq]
# Sat, 26 Feb 2022 06:57:05 GMT
ENV LANG=C.UTF-8 LANGUAGE=C.UTF-8 LC_ALL=C.UTF-8
# Sat, 26 Feb 2022 06:57:09 GMT
COPY file:031403dd5d8e0fcf8749c29d6595776a74d7fb2d07fd51c2c2f9f9f22af0a504 in /usr/local/bin/
# Sat, 26 Feb 2022 06:57:14 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Sat, 26 Feb 2022 06:57:21 GMT
EXPOSE 15691 15692 25672 4369 5671 5672
# Sat, 26 Feb 2022 06:57:26 GMT
CMD ["rabbitmq-server"]
```
- Layers:
- `sha256:e4ad98202983f0b602991305f807e9b8460bb3fdb617889c276ccbd4b92c69b4`
Last Modified: Wed, 02 Feb 2022 03:53:11 GMT
Size: 33.3 MB (33284717 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8aa1a04576dec54f59194a886b4a4790320158c8737ac8727812313ffc7c6b8f`
Last Modified: Wed, 02 Feb 2022 07:23:06 GMT
Size: 822.2 KB (822234 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:5802999a5598e471cb56dec3e2e7124fceee4cd05d4f6088e898193e3c3e35ea`
Last Modified: Sat, 26 Feb 2022 07:01:57 GMT
Size: 44.3 MB (44272510 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:d5f3e36a5b28d2fbf0ecb1495cb116b9e7a398c314775c190adbf189712dfc78`
Last Modified: Sat, 26 Feb 2022 07:01:50 GMT
Size: 2.1 KB (2086 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8e3edd678c396c189996b92c4e82af01a9eca1550b19835bdf1ca39c73be785d`
Last Modified: Sat, 26 Feb 2022 07:03:36 GMT
Size: 18.2 MB (18179226 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8cd2ebed9cb7ca4c6c8da8369ef284c5330ab80f4f48e1bb095907c6c82766dc`
Last Modified: Sat, 26 Feb 2022 07:03:34 GMT
Size: 273.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:4f2bf7f6d8b9dc84b6309f0996ac059cf336e1ccdd70762b653cb373d33615fe`
Last Modified: Sat, 26 Feb 2022 07:03:34 GMT
Size: 105.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:a1df5fdd08bcd85b7c489be07d1a1c1096a32f40ae2f62482512f4916fe2ee4d`
Last Modified: Sat, 26 Feb 2022 07:03:34 GMT
Size: 5.0 KB (4986 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `rabbitmq:3.8.27` - linux; riscv64
```console
$ docker pull rabbitmq@sha256:c0245d0494e277749d647eadd06dc5e0d59ac8f8ba7e65a18fd8e4f493c8bbc5
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **86.4 MB (86385748 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:09339274c986f925447f293f815eb5c719fe1fa9abc3fdb68d2510b7c4e3c320`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["rabbitmq-server"]`
```dockerfile
# Wed, 02 Feb 2022 02:15:02 GMT
ADD file:6e0c5699122e0452ab2b2424d34da73a6447915f9c47276d0841d180c21524a6 in /
# Wed, 02 Feb 2022 02:15:04 GMT
CMD ["bash"]
# Wed, 02 Feb 2022 04:48:20 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends gosu ; rm -rf /var/lib/apt/lists/*; gosu nobody true
# Wed, 02 Feb 2022 04:48:20 GMT
ARG PGP_KEYSERVER=keyserver.ubuntu.com
# Wed, 02 Feb 2022 04:48:21 GMT
ENV OPENSSL_VERSION=1.1.1m
# Wed, 02 Feb 2022 04:48:21 GMT
ENV OPENSSL_SOURCE_SHA256=f89199be8b23ca45fc7cb9f1d8d3ee67312318286ad030f5316aca6462db6c96
# Wed, 02 Feb 2022 04:48:22 GMT
ENV OPENSSL_PGP_KEY_IDS=0x8657ABB260F056B1E5190839D9C4D26D0E604491 0x5B2545DAB21995F4088CEFAA36CEE4DEB00CFE33 0xED230BEC4D4F2518B9D7DF41F0DB4D21C1D35231 0xC1F33DD8CE1D4CC613AF14DA9195C48241FBF7DD 0x7953AC1FBC3DC8B3B292393ED5E9E43F7DF9EE8C 0xE5E52560DD91C556DDBDA5D02064C53641C25E5D
# Fri, 25 Feb 2022 19:14:33 GMT
ENV OTP_VERSION=24.2.2
# Fri, 25 Feb 2022 19:14:33 GMT
ENV OTP_SOURCE_SHA256=a87bcbdcdd1b99de7038030123b2d655d46d6e698a9143608618bdbec6ebbee7
# Fri, 25 Feb 2022 19:40:52 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends autoconf ca-certificates dpkg-dev gcc g++ gnupg libncurses5-dev make wget ; rm -rf /var/lib/apt/lists/*; OPENSSL_SOURCE_URL="https://www.openssl.org/source/openssl-$OPENSSL_VERSION.tar.gz"; OPENSSL_PATH="/usr/local/src/openssl-$OPENSSL_VERSION"; OPENSSL_CONFIG_DIR=/usr/local/etc/ssl; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz" "$OPENSSL_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $OPENSSL_PGP_KEY_IDS; do gpg --batch --keyserver "$PGP_KEYSERVER" --recv-keys "$key"; done; gpg --batch --verify "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_PATH.tar.gz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; echo "$OPENSSL_SOURCE_SHA256 *$OPENSSL_PATH.tar.gz" | sha256sum --check --strict -; mkdir -p "$OPENSSL_PATH"; tar --extract --file "$OPENSSL_PATH.tar.gz" --directory "$OPENSSL_PATH" --strip-components 1; cd "$OPENSSL_PATH"; debMultiarch="$(dpkg-architecture --query DEB_HOST_MULTIARCH)"; MACHINE="$(dpkg-architecture --query DEB_BUILD_GNU_CPU)" RELEASE="4.x.y-z" SYSTEM='Linux' BUILD='???' 
./config --openssldir="$OPENSSL_CONFIG_DIR" --libdir="lib/$debMultiarch" -Wl,-rpath=/usr/local/lib ; make -j "$(getconf _NPROCESSORS_ONLN)"; make install_sw install_ssldirs; cd ..; rm -rf "$OPENSSL_PATH"*; ldconfig; rmdir "$OPENSSL_CONFIG_DIR/certs" "$OPENSSL_CONFIG_DIR/private"; ln -sf /etc/ssl/certs /etc/ssl/private "$OPENSSL_CONFIG_DIR"; openssl version; OTP_SOURCE_URL="https://github.com/erlang/otp/releases/download/OTP-$OTP_VERSION/otp_src_$OTP_VERSION.tar.gz"; OTP_PATH="/usr/local/src/otp-$OTP_VERSION"; mkdir -p "$OTP_PATH"; wget --progress dot:giga --output-document "$OTP_PATH.tar.gz" "$OTP_SOURCE_URL"; echo "$OTP_SOURCE_SHA256 *$OTP_PATH.tar.gz" | sha256sum --check --strict -; tar --extract --file "$OTP_PATH.tar.gz" --directory "$OTP_PATH" --strip-components 1; cd "$OTP_PATH"; export ERL_TOP="$OTP_PATH"; ./otp_build autoconf; CFLAGS="$(dpkg-buildflags --get CFLAGS)"; export CFLAGS; export CFLAGS="$CFLAGS -Wl,-rpath=/usr/local/lib"; hostArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)"; buildArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; dpkgArch="$(dpkg --print-architecture)"; dpkgArch="${dpkgArch##*-}"; jitFlag=; case "$dpkgArch" in amd64) jitFlag='--enable-jit' ;; esac; ./configure --host="$hostArch" --build="$buildArch" --disable-dynamic-ssl-lib --disable-hipe --disable-sctp --disable-silent-rules --enable-clock-gettime --enable-hybrid-heap --enable-kernel-poll --enable-shared-zlib --enable-smp-support --enable-threads --with-microstate-accounting=extra --without-common_test --without-debugger --without-dialyzer --without-diameter --without-edoc --without-erl_docgen --without-et --without-eunit --without-ftp --without-hipe --without-jinterface --without-megaco --without-observer --without-odbc --without-reltool --without-ssh --without-tftp --without-wx $jitFlag ; make -j "$(getconf _NPROCESSORS_ONLN)" GEN_OPT_FLGS="-O2 -fno-strict-aliasing"; make install; cd ..; rm -rf "$OTP_PATH"* /usr/local/lib/erlang/lib/*/examples 
/usr/local/lib/erlang/lib/*/src ; apt-mark auto '.*' > /dev/null; [ -z "$savedAptMark" ] || apt-mark manual $savedAptMark; find /usr/local -type f -executable -exec ldd '{}' ';' | awk '/=>/ { print $(NF-1) }' | sort -u | xargs -r dpkg-query --search | cut -d: -f1 | sort -u | xargs -r apt-mark manual ; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; openssl version; erl -noshell -eval 'io:format("~p~n~n~p~n~n", [crypto:supports(), ssl:versions()]), init:stop().'
# Fri, 25 Feb 2022 19:40:53 GMT
ENV RABBITMQ_DATA_DIR=/var/lib/rabbitmq
# Fri, 25 Feb 2022 19:40:55 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; groupadd --gid 999 --system rabbitmq; useradd --uid 999 --system --home-dir "$RABBITMQ_DATA_DIR" --gid rabbitmq rabbitmq; mkdir -p "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chown -fR rabbitmq:rabbitmq "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chmod 777 "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; ln -sf "$RABBITMQ_DATA_DIR/.erlang.cookie" /root/.erlang.cookie
# Fri, 25 Feb 2022 19:47:48 GMT
ENV RABBITMQ_VERSION=3.8.27
# Fri, 25 Feb 2022 19:47:48 GMT
ENV RABBITMQ_PGP_KEY_ID=0x0A9AF2115F4687BD29803A206B73A36E6026DFCA
# Fri, 25 Feb 2022 19:47:49 GMT
ENV RABBITMQ_HOME=/opt/rabbitmq
# Fri, 25 Feb 2022 19:47:49 GMT
ENV PATH=/opt/rabbitmq/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin RABBITMQ_LOGS=-
# Fri, 25 Feb 2022 19:49:25 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends ca-certificates gnupg wget xz-utils ; rm -rf /var/lib/apt/lists/*; RABBITMQ_SOURCE_URL="https://github.com/rabbitmq/rabbitmq-server/releases/download/v$RABBITMQ_VERSION/rabbitmq-server-generic-unix-latest-toolchain-$RABBITMQ_VERSION.tar.xz"; RABBITMQ_PATH="/usr/local/src/rabbitmq-$RABBITMQ_VERSION"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz" "$RABBITMQ_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$RABBITMQ_PGP_KEY_ID"; gpg --batch --verify "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_PATH.tar.xz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$RABBITMQ_HOME"; tar --extract --file "$RABBITMQ_PATH.tar.xz" --directory "$RABBITMQ_HOME" --strip-components 1; rm -rf "$RABBITMQ_PATH"*; grep -qE '^SYS_PREFIX=\$\{RABBITMQ_HOME\}$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; sed -i 's/^SYS_PREFIX=.*$/SYS_PREFIX=/' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; grep -qE '^SYS_PREFIX=$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; chown -R rabbitmq:rabbitmq "$RABBITMQ_HOME"; apt-mark auto '.*' > /dev/null; apt-mark manual $savedAptMark; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; [ ! -e "$RABBITMQ_DATA_DIR/.erlang.cookie" ]; gosu rabbitmq rabbitmqctl help; gosu rabbitmq rabbitmqctl list_ciphers; gosu rabbitmq rabbitmq-plugins list; rm "$RABBITMQ_DATA_DIR/.erlang.cookie"
# Fri, 25 Feb 2022 19:49:37 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; gosu rabbitmq rabbitmq-plugins enable --offline rabbitmq_prometheus; echo 'management_agent.disable_metrics_collector = true' > /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf; chown rabbitmq:rabbitmq /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf
# Fri, 25 Feb 2022 19:49:39 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN ln -sf /opt/rabbitmq/plugins /plugins
# Fri, 25 Feb 2022 19:49:40 GMT
ENV HOME=/var/lib/rabbitmq
# Fri, 25 Feb 2022 19:49:40 GMT
VOLUME [/var/lib/rabbitmq]
# Fri, 25 Feb 2022 19:49:41 GMT
ENV LANG=C.UTF-8 LANGUAGE=C.UTF-8 LC_ALL=C.UTF-8
# Fri, 25 Feb 2022 19:49:41 GMT
COPY file:031403dd5d8e0fcf8749c29d6595776a74d7fb2d07fd51c2c2f9f9f22af0a504 in /usr/local/bin/
# Fri, 25 Feb 2022 19:49:42 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Fri, 25 Feb 2022 19:49:42 GMT
EXPOSE 15691 15692 25672 4369 5671 5672
# Fri, 25 Feb 2022 19:49:43 GMT
CMD ["rabbitmq-server"]
```
- Layers:
- `sha256:a6994ca4576c569732b8a1d88605f325f0b9e4f0f51119178c871a821b9b7407`
Last Modified: Wed, 02 Feb 2022 02:33:31 GMT
Size: 24.2 MB (24224781 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:bdc803bbb53910af967adf994925146e5c36ddc09825a9c6d9961f6e249b051a`
Last Modified: Wed, 02 Feb 2022 05:36:37 GMT
Size: 931.3 KB (931337 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:7f89a9914fd2fe8c6413b80df41f95f44db1baf3823a77dbf8846b0b27b56954`
Last Modified: Fri, 25 Feb 2022 20:03:21 GMT
Size: 43.0 MB (43043746 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:f21b15823594aa9505b1713ab0c40c0cd1d66b956c1e27c52e9f71d3ffea6040`
Last Modified: Fri, 25 Feb 2022 20:02:31 GMT
Size: 2.0 KB (1991 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:7abbbbde3c0b5743e3bc2b3e430b8abd3e885bd5ea063c859514e4fe636ddd61`
Last Modified: Fri, 25 Feb 2022 20:07:55 GMT
Size: 18.2 MB (18178526 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:fb0fb5a5089c9469b1685219ccc3c4d5c7f174cbc376f7e87d096e195b0a2af0`
Last Modified: Fri, 25 Feb 2022 20:07:45 GMT
Size: 274.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:4712db03329d4f694bc61db1f31ff056f09c99ba9af0db694a5d9774c8d88d3d`
Last Modified: Fri, 25 Feb 2022 20:07:45 GMT
Size: 107.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:e7fb660764d623d226b9433f979f86fed16985ae3bd29720af691458e71eaa6e`
Last Modified: Fri, 25 Feb 2022 20:07:45 GMT
Size: 5.0 KB (4986 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
### `rabbitmq:3.8.27` - linux; s390x
```console
$ docker pull rabbitmq@sha256:b167cc3d15d6fdfcbe27d09e8a54b8230ea68fba80c5b28da8e68071b1655377
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **88.4 MB (88360102 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:b34a71da65a5b86de9dcf4f74566378d13c619abcc642684286a088609a76e4b`
- Entrypoint: `["docker-entrypoint.sh"]`
- Default Command: `["rabbitmq-server"]`
```dockerfile
# Wed, 02 Feb 2022 01:42:16 GMT
ADD file:f35a5307585c918b783420eea01f5947181ac58f8eeb855a8f73f38f1477700f in /
# Wed, 02 Feb 2022 01:42:17 GMT
CMD ["bash"]
# Wed, 02 Feb 2022 02:43:42 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends gosu ; rm -rf /var/lib/apt/lists/*; gosu nobody true
# Wed, 02 Feb 2022 02:43:42 GMT
ARG PGP_KEYSERVER=keyserver.ubuntu.com
# Wed, 02 Feb 2022 02:43:42 GMT
ENV OPENSSL_VERSION=1.1.1m
# Wed, 02 Feb 2022 02:43:42 GMT
ENV OPENSSL_SOURCE_SHA256=f89199be8b23ca45fc7cb9f1d8d3ee67312318286ad030f5316aca6462db6c96
# Wed, 02 Feb 2022 02:43:42 GMT
ENV OPENSSL_PGP_KEY_IDS=0x8657ABB260F056B1E5190839D9C4D26D0E604491 0x5B2545DAB21995F4088CEFAA36CEE4DEB00CFE33 0xED230BEC4D4F2518B9D7DF41F0DB4D21C1D35231 0xC1F33DD8CE1D4CC613AF14DA9195C48241FBF7DD 0x7953AC1FBC3DC8B3B292393ED5E9E43F7DF9EE8C 0xE5E52560DD91C556DDBDA5D02064C53641C25E5D
# Fri, 25 Feb 2022 22:57:20 GMT
ENV OTP_VERSION=24.2.2
# Fri, 25 Feb 2022 22:57:20 GMT
ENV OTP_SOURCE_SHA256=a87bcbdcdd1b99de7038030123b2d655d46d6e698a9143608618bdbec6ebbee7
# Fri, 25 Feb 2022 23:02:32 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends autoconf ca-certificates dpkg-dev gcc g++ gnupg libncurses5-dev make wget ; rm -rf /var/lib/apt/lists/*; OPENSSL_SOURCE_URL="https://www.openssl.org/source/openssl-$OPENSSL_VERSION.tar.gz"; OPENSSL_PATH="/usr/local/src/openssl-$OPENSSL_VERSION"; OPENSSL_CONFIG_DIR=/usr/local/etc/ssl; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$OPENSSL_PATH.tar.gz" "$OPENSSL_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $OPENSSL_PGP_KEY_IDS; do gpg --batch --keyserver "$PGP_KEYSERVER" --recv-keys "$key"; done; gpg --batch --verify "$OPENSSL_PATH.tar.gz.asc" "$OPENSSL_PATH.tar.gz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; echo "$OPENSSL_SOURCE_SHA256 *$OPENSSL_PATH.tar.gz" | sha256sum --check --strict -; mkdir -p "$OPENSSL_PATH"; tar --extract --file "$OPENSSL_PATH.tar.gz" --directory "$OPENSSL_PATH" --strip-components 1; cd "$OPENSSL_PATH"; debMultiarch="$(dpkg-architecture --query DEB_HOST_MULTIARCH)"; MACHINE="$(dpkg-architecture --query DEB_BUILD_GNU_CPU)" RELEASE="4.x.y-z" SYSTEM='Linux' BUILD='???' 
./config --openssldir="$OPENSSL_CONFIG_DIR" --libdir="lib/$debMultiarch" -Wl,-rpath=/usr/local/lib ; make -j "$(getconf _NPROCESSORS_ONLN)"; make install_sw install_ssldirs; cd ..; rm -rf "$OPENSSL_PATH"*; ldconfig; rmdir "$OPENSSL_CONFIG_DIR/certs" "$OPENSSL_CONFIG_DIR/private"; ln -sf /etc/ssl/certs /etc/ssl/private "$OPENSSL_CONFIG_DIR"; openssl version; OTP_SOURCE_URL="https://github.com/erlang/otp/releases/download/OTP-$OTP_VERSION/otp_src_$OTP_VERSION.tar.gz"; OTP_PATH="/usr/local/src/otp-$OTP_VERSION"; mkdir -p "$OTP_PATH"; wget --progress dot:giga --output-document "$OTP_PATH.tar.gz" "$OTP_SOURCE_URL"; echo "$OTP_SOURCE_SHA256 *$OTP_PATH.tar.gz" | sha256sum --check --strict -; tar --extract --file "$OTP_PATH.tar.gz" --directory "$OTP_PATH" --strip-components 1; cd "$OTP_PATH"; export ERL_TOP="$OTP_PATH"; ./otp_build autoconf; CFLAGS="$(dpkg-buildflags --get CFLAGS)"; export CFLAGS; export CFLAGS="$CFLAGS -Wl,-rpath=/usr/local/lib"; hostArch="$(dpkg-architecture --query DEB_HOST_GNU_TYPE)"; buildArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)"; dpkgArch="$(dpkg --print-architecture)"; dpkgArch="${dpkgArch##*-}"; jitFlag=; case "$dpkgArch" in amd64) jitFlag='--enable-jit' ;; esac; ./configure --host="$hostArch" --build="$buildArch" --disable-dynamic-ssl-lib --disable-hipe --disable-sctp --disable-silent-rules --enable-clock-gettime --enable-hybrid-heap --enable-kernel-poll --enable-shared-zlib --enable-smp-support --enable-threads --with-microstate-accounting=extra --without-common_test --without-debugger --without-dialyzer --without-diameter --without-edoc --without-erl_docgen --without-et --without-eunit --without-ftp --without-hipe --without-jinterface --without-megaco --without-observer --without-odbc --without-reltool --without-ssh --without-tftp --without-wx $jitFlag ; make -j "$(getconf _NPROCESSORS_ONLN)" GEN_OPT_FLGS="-O2 -fno-strict-aliasing"; make install; cd ..; rm -rf "$OTP_PATH"* /usr/local/lib/erlang/lib/*/examples 
/usr/local/lib/erlang/lib/*/src ; apt-mark auto '.*' > /dev/null; [ -z "$savedAptMark" ] || apt-mark manual $savedAptMark; find /usr/local -type f -executable -exec ldd '{}' ';' | awk '/=>/ { print $(NF-1) }' | sort -u | xargs -r dpkg-query --search | cut -d: -f1 | sort -u | xargs -r apt-mark manual ; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; openssl version; erl -noshell -eval 'io:format("~p~n~n~p~n~n", [crypto:supports(), ssl:versions()]), init:stop().'
# Fri, 25 Feb 2022 23:02:33 GMT
ENV RABBITMQ_DATA_DIR=/var/lib/rabbitmq
# Fri, 25 Feb 2022 23:02:33 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; groupadd --gid 999 --system rabbitmq; useradd --uid 999 --system --home-dir "$RABBITMQ_DATA_DIR" --gid rabbitmq rabbitmq; mkdir -p "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chown -fR rabbitmq:rabbitmq "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; chmod 777 "$RABBITMQ_DATA_DIR" /etc/rabbitmq /etc/rabbitmq/conf.d /tmp/rabbitmq-ssl /var/log/rabbitmq; ln -sf "$RABBITMQ_DATA_DIR/.erlang.cookie" /root/.erlang.cookie
# Fri, 25 Feb 2022 23:09:04 GMT
ENV RABBITMQ_VERSION=3.8.27
# Fri, 25 Feb 2022 23:09:04 GMT
ENV RABBITMQ_PGP_KEY_ID=0x0A9AF2115F4687BD29803A206B73A36E6026DFCA
# Fri, 25 Feb 2022 23:09:04 GMT
ENV RABBITMQ_HOME=/opt/rabbitmq
# Fri, 25 Feb 2022 23:09:05 GMT
ENV PATH=/opt/rabbitmq/sbin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin RABBITMQ_LOGS=-
# Fri, 25 Feb 2022 23:09:20 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; savedAptMark="$(apt-mark showmanual)"; apt-get update; apt-get install --yes --no-install-recommends ca-certificates gnupg wget xz-utils ; rm -rf /var/lib/apt/lists/*; RABBITMQ_SOURCE_URL="https://github.com/rabbitmq/rabbitmq-server/releases/download/v$RABBITMQ_VERSION/rabbitmq-server-generic-unix-latest-toolchain-$RABBITMQ_VERSION.tar.xz"; RABBITMQ_PATH="/usr/local/src/rabbitmq-$RABBITMQ_VERSION"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_SOURCE_URL.asc"; wget --progress dot:giga --output-document "$RABBITMQ_PATH.tar.xz" "$RABBITMQ_SOURCE_URL"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver hkps://keys.openpgp.org --recv-keys "$RABBITMQ_PGP_KEY_ID"; gpg --batch --verify "$RABBITMQ_PATH.tar.xz.asc" "$RABBITMQ_PATH.tar.xz"; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$RABBITMQ_HOME"; tar --extract --file "$RABBITMQ_PATH.tar.xz" --directory "$RABBITMQ_HOME" --strip-components 1; rm -rf "$RABBITMQ_PATH"*; grep -qE '^SYS_PREFIX=\$\{RABBITMQ_HOME\}$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; sed -i 's/^SYS_PREFIX=.*$/SYS_PREFIX=/' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; grep -qE '^SYS_PREFIX=$' "$RABBITMQ_HOME/sbin/rabbitmq-defaults"; chown -R rabbitmq:rabbitmq "$RABBITMQ_HOME"; apt-mark auto '.*' > /dev/null; apt-mark manual $savedAptMark; apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false; [ ! -e "$RABBITMQ_DATA_DIR/.erlang.cookie" ]; gosu rabbitmq rabbitmqctl help; gosu rabbitmq rabbitmqctl list_ciphers; gosu rabbitmq rabbitmq-plugins list; rm "$RABBITMQ_DATA_DIR/.erlang.cookie"
# Fri, 25 Feb 2022 23:09:21 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN set -eux; gosu rabbitmq rabbitmq-plugins enable --offline rabbitmq_prometheus; echo 'management_agent.disable_metrics_collector = true' > /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf; chown rabbitmq:rabbitmq /etc/rabbitmq/conf.d/management_agent.disable_metrics_collector.conf
# Fri, 25 Feb 2022 23:09:22 GMT
# ARGS: PGP_KEYSERVER=keyserver.ubuntu.com
RUN ln -sf /opt/rabbitmq/plugins /plugins
# Fri, 25 Feb 2022 23:09:22 GMT
ENV HOME=/var/lib/rabbitmq
# Fri, 25 Feb 2022 23:09:22 GMT
VOLUME [/var/lib/rabbitmq]
# Fri, 25 Feb 2022 23:09:22 GMT
ENV LANG=C.UTF-8 LANGUAGE=C.UTF-8 LC_ALL=C.UTF-8
# Fri, 25 Feb 2022 23:09:22 GMT
COPY file:031403dd5d8e0fcf8749c29d6595776a74d7fb2d07fd51c2c2f9f9f22af0a504 in /usr/local/bin/
# Fri, 25 Feb 2022 23:09:23 GMT
ENTRYPOINT ["docker-entrypoint.sh"]
# Fri, 25 Feb 2022 23:09:23 GMT
EXPOSE 15691 15692 25672 4369 5671 5672
# Fri, 25 Feb 2022 23:09:23 GMT
CMD ["rabbitmq-server"]
```
- Layers:
- `sha256:723da7fec7371c2b30effc8da0790968bef9dd221f5e34b5c8f7d2eff90fbd5e`
Last Modified: Wed, 02 Feb 2022 01:44:01 GMT
Size: 27.1 MB (27118737 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:6bb23ea25f0a80a28c6345f37ffbfecd59fe27867814e3405021e21798c63a7f`
Last Modified: Wed, 02 Feb 2022 02:51:53 GMT
Size: 861.5 KB (861515 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:a6db85ee88ff02e2014b91c71208b436f01e4f579148ec8c5bc2672375da6421`
Last Modified: Fri, 25 Feb 2022 23:11:13 GMT
Size: 42.2 MB (42195324 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:feb5c9f013c275d56572094a19452e92941c6acf8c8ceb78b17f43854d2f9b09`
Last Modified: Fri, 25 Feb 2022 23:11:08 GMT
Size: 2.1 KB (2087 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:8b1fadbd39326622efee0e6894b1786fd2a8a9999e346ceb95d4ec0cb7338232`
Last Modified: Fri, 25 Feb 2022 23:12:07 GMT
Size: 18.2 MB (18177071 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:336e72f18f67366d6fe84cb417022aae24bf1da0dc13a5ac184f428e9ab831ad`
Last Modified: Fri, 25 Feb 2022 23:12:06 GMT
Size: 273.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:ea249cea8c95157ef17a99a4eeb37c52966935acf09fd9b25f89c080f262da71`
Last Modified: Fri, 25 Feb 2022 23:12:06 GMT
Size: 107.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:16ce67caf6d45c05ea9e04218f3db9f9171c1fe70ce3592cc165c1c856bdee4a`
Last Modified: Fri, 25 Feb 2022 23:12:06 GMT
Size: 5.0 KB (4988 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
| 100.638009 | 3,881 | 0.741648 | yue_Hant | 0.329927 |
97e044ef1be743a8427d750e4b6d94a352280ae9 | 746 | md | Markdown | README.md | Muttley/rpg-campaign-backup | 384a54eeae36496f92b309307e9766b6309422f0 | ["MIT"] | null | null | null | README.md | Muttley/rpg-campaign-backup | 384a54eeae36496f92b309307e9766b6309422f0 | ["MIT"] | null | null | null | README.md | Muttley/rpg-campaign-backup | 384a54eeae36496f92b309307e9766b6309422f0 | ["MIT"] | null | null | null |
# Fantasy Grounds: Campaign Backup Tool
Tool for backing up folder based RPG campaigns from VTTs like Fantasy Grounds or Foundry VTT, etc.
## Usage
~~~
campaign-backup.pl
Tool for backing up folder based RPG campaign data
OPTIONS:
-b|backup_dir Base backup directory to use.
default: S:\Dropbox\Games\Role-Playing Games\.Campaigns\.backups
-c|config_file Location of JSON config file
default: conf\config.json
-s|stdout Log to STDOUT rather than to a file
-v|debug Provide more verbose output.
-vv|trace Provide even more verbose output.
-?|help|usage This usage information
USAGE:
campaign-backup.pl
campaign-backup.pl -v
campaign-backup.pl -usage
~~~
| 21.941176 | 98 | 0.689008 | eng_Latn | 0.876364 |
97e0f1b24ac913df7938248d9a19d688502dca53 | 4,108 | md | Markdown | README.md | enra64/jointly-1 | 002665e6728dc1dac6a59b7fd0f24456f15ceeba | ["MIT"] | null | null | null | README.md | enra64/jointly-1 | 002665e6728dc1dac6a59b7fd0f24456f15ceeba | ["MIT"] | null | null | null | README.md | enra64/jointly-1 | 002665e6728dc1dac6a59b7fd0f24456f15ceeba | ["MIT"] | null | null | null |
# Jointly: Signal Synchronizer
## The Syncing Process
To sync two sources with each other, they need a simultaneously recorded signal with a characteristic signature at two timepoints in common. This could be the magnitude of the accelerometer for example, if multiple devices are shaken together.
### Selecting common segments
The script can detect prominent shakes automatically with the `ShakeExtractor`. This is done by detecting the peaks above a certain `threshold`. They are then merged to sequences, if two peaks are not farther apart than a specified `distance` in milliseconds. Sequences with less than `min_length` peaks are filtered out. Sequences, that do not start or end in a `window` of seconds from start and end of the signal respectivley, are filtered in a second step. From these filtered sequences the sequence with the highest weight (mean + median of sequence) is selected for the corresponding segment.
### Calculation of the timeshift
To compensate the differences in the system time of different sources, the timeshift to synchronize the selected segments with each other is calculated. For the automatic computation of the timeshift between two signals the cross-correlation for each segment with the reference signal is calculated. The signals are shifted so that the correlation between the selected segments is maximized.
### Adjusting the frequency
Due to clock drift, i.e. the issue that a clock does not run at exactly the same frequency as a specified reference clock, signals that are in sync at one point in time desynchronize gradually. To compensate for this effect, a stretching factor is calculated that brings the difference between the timeshifts for the synchronization based on the first and second segment, respectively, to zero. After stretching the signal, the timeshift to align the signals has to be calculated again.
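As an illustration only (this is not jointly's actual API), the stretching factor can be derived from the two per-segment timeshifts and the time between the two synchronization points:

```python
def stretch_factor(t_first, t_second, shift_first, shift_second):
    """Stretching factor that makes both per-segment timeshifts agree.

    Illustrative sketch: t_first/t_second are the positions (in seconds) of
    the two synchronization segments, and shift_first/shift_second are the
    timeshifts found for each of them.
    """
    return 1 + (shift_second - shift_first) / (t_second - t_first)

# If the second segment, 100 s after the first, needs 0.5 s more shift,
# the signal has to be stretched by a factor of 1.005:
print(stretch_factor(0.0, 100.0, 2.0, 2.5))  # 1.005
```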
## Example
### Syncing data
The data has to be provided in pandas `DataFrame` with a `DateTimeIndex`.
```python
import jointly
sources = {
'Faros': {
'data': faros.data,
'ref_column': 'acc_mag',
},
'Empatica': {
'data': empatica.data,
'ref_column': 'acc_mag',
},
'Everion': {
'data': everion.data,
'ref_column': 'acc_mag',
}
}
ref_source_name = 'Empatica'
extractor = jointly.ShakeExtractor()
synchronizer = jointly.Synchronizer(sources, ref_source_name, extractor)
synced_data = synchronizer.get_synced_data()
```
### Saving data
To define the tables that should be saved, create a dictionary. Every key at the root level defines the name of the corresponding file. The dictionary at the second level defines, for each source, a list of columns that should be saved in this file. The `save_data()` method will also automatically save all data from all sources in a file named `TOTAL.csv`. This can be deactivated by adding the argument `save_total_table = False`.
```python
tables = {
'ACC': {
'Faros': ['Accelerometer_X', 'Accelerometer_Y', 'Accelerometer_Z'],
'Empatica': ['acc_x', 'acc_y', 'acc_z'],
'Everion': ['accx_data', 'accy_data', 'accz_data'],
},
'PPG': {
'Empatica': ['bvp'],
'Everion': ['blood_pulse_wave', 'led2_data', 'led3_data'],
},
'EDA': {
'Empatica': ['eda'],
'Everion': ['gsr_electrode'],
},
'ECG': {
'Faros': ['ECG'],
},
'TEMP': {
'Empatica': ['temp'],
'Everion': ['temperature_object'],
},
'HR': {
'Empatica': ['hr'],
'Everion': ['heart_rate', 'heart_rate_quality'],
},
'IBI': {
'Faros': ['HRV'],
'Empatica': ['ibi'],
'Everion': ['inter_pulse_interval', 'inter_pulse_interval_deviation'],
}
}
synchronizer.save_data(sync_dir_path, tables=tables)
```
## Logging
To activate logging, simply add the following lines to your code:
```python
from jointly.log import logger
logger.setLevel(10)
```
This will give you insight into the shake detection, the calculation of the timeshifts and the stretching factor, and output plots of the segments.
The answer: **John**.
```js run no-beautify
function f() {
alert(this.name);
}
f = f.bind( {name: "John"} ).bind( {name: "Pete"} );
f(); // John
```
The exotic [bound function](https://tc39.github.io/ecma262/#sec-bound-function-exotic-objects) object returned by `f.bind(...)` remembers the context (and arguments if provided) only at creation time.
A function cannot be re-bound.
#### [Microsoft.FactoryOrchestrator.Client](./Microsoft-FactoryOrchestrator-Client.md 'Microsoft.FactoryOrchestrator.Client')
### [Microsoft.FactoryOrchestrator.Client](./Microsoft-FactoryOrchestrator-Client.md 'Microsoft.FactoryOrchestrator.Client').[FactoryOrchestratorClient](./Microsoft-FactoryOrchestrator-Client-FactoryOrchestratorClient.md 'Microsoft.FactoryOrchestrator.Client.FactoryOrchestratorClient')
## FactoryOrchestratorClient.DeleteTaskList(System.Guid) Method
Asynchronously deletes a TaskList on the Service.
```csharp
public System.Threading.Tasks.Task DeleteTaskList(System.Guid listToDelete);
```
#### Parameters
<a name='Microsoft-FactoryOrchestrator-Client-FactoryOrchestratorClient-DeleteTaskList(System-Guid)-listToDelete'></a>
`listToDelete` [System.Guid](https://docs.microsoft.com/en-us/dotnet/api/System.Guid 'System.Guid')
The GUID of the TaskList to delete.
#### Returns
[System.Threading.Tasks.Task](https://docs.microsoft.com/en-us/dotnet/api/System.Threading.Tasks.Task 'System.Threading.Tasks.Task')
true if it was deleted successfully.
# Changelog
## [1.6.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.6.0)
Released on 2018-10-27.
#### Added
- Support for Xcode 10, iOS 12 and new devices
- Compatibility with Swift 4.2 (#11) via David Seek
## [1.5.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.5.0)
Released on 2018-01-11.
#### Updated
- iOS deployment target is now 9.3
## [1.4.4](https://github.com/PGSSoft/AutoMate/releases/tag/1.4.4)
Released on 2018-01-01.
#### Updated
- Fixed building for Xcode 9.2.
## [1.4.3](https://github.com/PGSSoft/AutoMate/releases/tag/1.4.3)
Released on 2017-11-14.
#### Added
- Compatibility with Xcode 9.1, Swift 4 and iOS 11
#### Updated
- Deployment target of AutoMate and AutoMateExample is now 10.3
## [1.4.2](https://github.com/PGSSoft/AutoMate/releases/tag/1.4.2)
Released on 2017-10-27.
#### Fixed
- Build fails on Xcode 9.0.1 (#8) via Jorge Ramirez
## [1.4.1](https://github.com/PGSSoft/AutoMate/releases/tag/1.4.1)
Released on 2017-10-20.
#### Fixed
- The current version of swiftlint is failing the build (#6) via Jorge Ramirez
## [1.4.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.4.0)
Released on 2017-07-03.
#### Added
- Check if application is running in UI test environment (with `AutoMate-AppBuddy`)
## [1.3.1](https://github.com/PGSSoft/AutoMate/releases/tag/1.3.1)
Released on 2017-04-11.
#### Added
- Smart coordinates `SmartXCUICoordinate`.
#### Updated
- Disabled Bitcode for Cocoapods.
- Adjusted swipe and tap methods to use smart coordinates.
## [1.3.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.3.0)
Released on 2017-04-10.
#### Added
- Added `element(withIdentifier:labels:labelComparisonOperator:)`
- Added `swipe(to:times:avoid:from:until:)`, `swipe(to:untilExist:times:avoid:from:)` and `swipe(to:untilVisible:times:avoid:from:)` to use with collection views.
- Improve launch options type safety
## [1.2.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.2.0)
Released on 2017-03-29.
#### Added
- More generic version of `wait(...)` methods: `wait(forFulfillmentOf predicate:...)`.
- `wait(forInvisibilityOf...)` method which is waiting for element to disappear.
- Compatibility with Xcode 8.3 and Swift 3.1.
## [1.1.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.1.0)
Released on 2017-03-15.
#### Added
- Page Objects: `BaseAppPageProtocol`, `BaseAppPage`, `ModalPage`, `PushedPage`, `HealthPermissionView`
- `IdentifiableByElement` protocol
- `Locator` protocol
- `AppUITestCase` as base `XCTestCase` template
- Disable UIView animations (with `AutoMate-AppBuddy`)
- Improve handling of system alerts (with `AutoMate-ModelGenie`)
- Most permissions alerts (like: `LocationWhenInUseAlert`, `CalendarAlert`, `PhotosAlert`) (with `AutoMate-Templates`)
- Managing events, reminders and contacts (with `AutoMate-AppBuddy`)
#### Updated
- Documentation
## [1.0](https://github.com/PGSSoft/AutoMate/releases/tag/1.0)
Released on 2016-09-23.
#### Added
- CoreData debug launch options.
- String localization debug options.
- Extensions for UI Testing.
#### Updated
- Improved documentation.
## [0.1](https://github.com/PGSSoft/AutoMate/releases/tag/0.1)
Released on 2016-08-05.
#### Added
- General utilities for configuring tested application.
- System keyboards launch argument.
- System locale launch argument.
- System languages launch argument.
97e2e07a46e211bbb4d9d366e1f29552a00af86d | 2,415 | md | Markdown | README.md | PacktPublishing/Building-User-Interface-Using-React-And-Flux-v- | f3ca0f8d5454659167e46fb3461ae8712d99ed7f | [
"MIT"
] | 1 | 2021-10-01T23:23:48.000Z | 2021-10-01T23:23:48.000Z | README.md | PacktPublishing/Building-User-Interface-Using-React-And-Flux-v- | f3ca0f8d5454659167e46fb3461ae8712d99ed7f | [
"MIT"
] | null | null | null | README.md | PacktPublishing/Building-User-Interface-Using-React-And-Flux-v- | f3ca0f8d5454659167e46fb3461ae8712d99ed7f | [
"MIT"
] | 1 | 2021-05-21T10:36:26.000Z | 2021-05-21T10:36:26.000Z | # Building User interface using React and Flux [Video]
This is the code repository for [Building User interface using React and Flux [Video]](https://www.packtpub.com/application-development/building-user-interface-using-react-and-flux-video?utm_source=github&utm_medium=repository&utm_campaign=9781788839655), published by [Packt](https://www.packtpub.com/?utm_source=github). It contains all the supporting project files necessary to work through the video course from start to finish.
## About the Video Course
Businesses today are evolving so rapidly that having their own infrastructure to support their expansion is not feasible. As a result, they have been resorting to the elasticity of the cloud to provide a platform to build and deploy their highly scalable applications. This video will be the one stop for you to learn all about building cloud-native architectures in Python. You’ll learn about interacting data services and building Web views with React, after which we will take a detailed look at application security and performance.
<H2>What You Will Learn</H2>
<DIV class=book-info-will-learn-text>
<UL>
<LI>Gain hands-on knowledge of how to migrate your application to different database services
<LI>Learn how to build a user interface using React
<LI>Create UIs to Scale with Flux to understand about Flux for scaling applications </LI></UL></DIV>
## Instructions and Navigation
### Assumed Knowledge
To fully benefit from the coverage included in this course, you will need:<br/>
This video is ideal for developers with a basic knowledge of Python who want to learn to build, test, and scale their Python-based applications. No prior experience of writing microservices in Python is required.
### Technical Requirements
This course has the following software requirements:<br/>
React and flux
## Related Products
* [Troubleshooting Vue.js [Video]](https://www.packtpub.com/application-development/troubleshooting-vuejs-video?utm_source=github&utm_medium=repository&utm_campaign=9781788993531)
* [Angular 7 New Features [Video]](https://www.packtpub.com/web-development/angular-7-new-features-video?utm_source=github&utm_medium=repository&utm_campaign=9781789619683)
* [Microservices Development on Azure with Java [Video]](https://www.packtpub.com/virtualization-and-cloud/microservices-development-azure-java-video?utm_source=github&utm_medium=repository&utm_campaign=9781789808858)
| 86.25 | 536 | 0.810352 | eng_Latn | 0.980733 |
97e36860594b83f0d3a17b5f957c20cf2619a3e1 | 748 | md | Markdown | codelib/globeadmin_view_object_id/README.md | smalos/nubuilder-snippets | 321bae7378dd5375dbe4dd230aebbac16f4a022e | [
"Unlicense"
] | 9 | 2020-06-22T14:27:09.000Z | 2021-03-17T10:14:07.000Z | codelib/globeadmin_view_object_id/README.md | smalos/nubuilder-code-snippets | 321bae7378dd5375dbe4dd230aebbac16f4a022e | [
"Unlicense"
] | 1 | 2020-05-12T12:43:34.000Z | 2020-05-14T00:20:37.000Z | codelib/globeadmin_view_object_id/README.md | smalos/nubuilder-code-snippets | 321bae7378dd5375dbe4dd230aebbac16f4a022e | [
"Unlicense"
] | 11 | 2020-05-08T02:27:01.000Z | 2021-05-05T03:08:38.000Z | ### Globeadmin Helper: View an Object ID at a glance
As an administrator, you sometimes want to know the ID of an object at a glance.
By adding this code under Setup -> Header, the ID of an object is displayed when you move the mouse over an object.
The tooltip text will only be shown when the globeadmin is logged in.
<p align="left">
<img src="screenshots/globeadmin_view_object_id.gif">
</p>
```javascript
function globeadminViewObjectId() {
if (window.global_access) {
$("*").each(function() {
var id = $(this).attr('id');
if (id !== undefined) {
$(this).attr('title', 'ID: ' + id);
}
});
}
}
function nuOnLoad() {
globeadminViewObjectId();
}
```
| 24.129032 | 116 | 0.613636 | eng_Latn | 0.978437 |
---
description: Returns the document context that corresponds to a given code context.
title: 'IDebugCodeContext2::GetDocumentContext | Microsoft Docs'
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- IDebugCodeContext2::GetDocumentContext
helpviewer_keywords:
- IDebugCodeContext2::GetDocumentContext
ms.assetid: d552cc92-963f-43c1-949f-ae6b63a427b8
author: acangialosi
ms.author: anthc
manager: jmartens
ms.workload:
- vssdk
dev_langs:
- CPP
- CSharp
ms.openlocfilehash: d99b67e76c8cc8719c77c88c8b93ca667c3a025a
ms.sourcegitcommit: 4b323a8a8bfd1a1a9e84f4b4ca88fa8da690f656
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 03/05/2021
ms.locfileid: "102164166"
---
# <a name="idebugcodecontext2getdocumentcontext"></a>IDebugCodeContext2::GetDocumentContext
Returns the document context that corresponds to a given code context. The document context represents a position in a source file that corresponds to the source code that produced this instruction.
## <a name="syntax"></a>Syntax
```cpp
HRESULT GetDocumentContext(
IDebugDocumentContext2** ppSrcCxt
);
```
```csharp
int GetDocumentContext(
out IDebugDocumentContext2 ppSrcCxt
);
```
## <a name="parameters"></a>Parameters
`ppSrcCxt`\
[out] Returns the [IDebugDocumentContext2](../../../extensibility/debugger/reference/idebugdocumentcontext2.md) object that corresponds to the code context. If `S_OK` is returned, this must not be `null`.
## <a name="return-value"></a>Return Value
Returns `S_OK` if successful; otherwise, returns an error code. A debug engine should return a failure code, for example `E_FAIL` with a `null` `out` context, if the code context has no associated source location.
## <a name="remarks"></a>Remarks
In general, a document context can be thought of as a position in a source file, while a code context is the position of a code instruction in the execution stream.
## <a name="see-also"></a>See also
- [IDebugCodeContext2](../../../extensibility/debugger/reference/idebugcodecontext2.md)
- [IDebugDocumentContext2](../../../extensibility/debugger/reference/idebugdocumentcontext2.md)
# SunshineConversationsClient::MessengerUpdate
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**display_name** | **String** | A human-friendly name used to identify the integration. `displayName` can be unset by changing it to `null`. | [optional]
**page_access_token** | **String** | A Facebook Page Access Token. | [optional]
## Code Sample
```ruby
require 'SunshineConversationsClient'
instance = SunshineConversationsClient::MessengerUpdate.new(display_name: 'My awesome integration',
                                                            page_access_token: 'your_access_token')
```
---
title: "Brand Babu 2018 Telugu HD Full Movie Download Brand Babu Telugu HD Movie Download"
date: "2020-04-01"
---
## Brand Babu 2018 Telugu HD Full Movie Download

**_Brand Babu 720p HD Single Part.mp4_**
**_Size: 1.18GB_**
**_[Download Server 1](https://openload.co/f/K-_7HTNcKFU/Brand_Babu_2018_Telugu_Proper_HDRip_-720p_-_x264_-_1.4GB.mkv)_**
**_[Download Server 2](https://openload.co/f/K-_7HTNcKFU/Brand_Babu_2018_Telugu_Proper_HDRip_-720p_-_x264_-_1.4GB.mkv)_**
---
path: "/docs/sessions/what-are-sessions"
title: What are Sessions?
---
# What are Sessions?
A session is server-side storage of information that will be persisted through a user's interaction with the application. When you first visit the server, you are provided with a cookie with a unique session id. The user presents this cookie with future requests so that the server can identify that the requests have come from the same user. Information associated with that session id can then be used to respond to the request. For example, you could persist the user's name, whether they are logged in, and their shopping basket inside a session. Kitura implements sessions using the [Kitura-Session](https://github.com/Kitura/Kitura-Session) library.
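As a framework-agnostic illustration (this is not Kitura code), a server-side session store is essentially a mapping from a random session id, which the client presents in a cookie, to per-user data:

```python
import secrets

class SessionStore:
    """Minimal in-memory session store (illustrative sketch only)."""

    def __init__(self):
        self._sessions = {}

    def create(self):
        # The random id is what the server hands back to the client in a cookie.
        sid = secrets.token_hex(16)
        self._sessions[sid] = {}
        return sid

    def get(self, sid):
        # Requests presenting a known id get their stored data back.
        return self._sessions.get(sid)

    def end(self, sid):
        # Ending the session (e.g. on logout) clears the stored data.
        self._sessions.pop(sid, None)

store = SessionStore()
sid = store.create()
store.get(sid)["basket"] = ["apples"]
print(store.get(sid))  # {'basket': ['apples']}
store.end(sid)
print(store.get(sid))  # None
```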
A session, unlike a database, is intended to be temporary. If it is not updated or if the session is ended, such as a user logging out, then it is cleared.
Sessions are not the same as authentication, although they are often used together. Authentication is used to verify a user's credentials to identify who they are. A session is used to check that the current request is from the same user as a previous request. You can learn more about authentication in our [What is Authentication?](../authentication/what-is-authentication) guide.
## Next steps
[Session with Codable routes](./codable-session): Learn how to persist the user's information between multiple requests on codable routes.
[Session with Raw routes](./raw-session): Learn how to persist the user's information between multiple requests on raw routes.
[中文文档](./README.md)
[英文文档](./README_EN.md)
Open source address: https://github.com/tp5er/laravel-admin-block
## Introduction
The price of each Walden block dog ranges from 100 to 15,000, and the dogs are divided into five kinds:
It’s between 100 and 300,
Yongdeng is between 301 and 900,
It’s between 901 and 2500,
It’s between 2501 and 6000,
Chengdeng is between 6001 and 15000.
Dog snatching starts every afternoon at 9 time points: 14:00, 15:00, 16:30, 17:00, 17:30, 19:30, 20:00, 20:30 and 21:00.
Profit point
The platform provides services such as pet dog trading information, player transaction information docking, transaction credit maintenance and other services, and charges transaction fees.
In the process of adopting a pet dog, everyone pays the price difference. As a rough estimate: if 2,000 people make an appointment every day and each recharges 30 yuan per day, that is 2,000 × 30 = 60,000 yuan per day, and the monthly income will be about 1.8 million yuan.
Player benefits:
1) If such a good project is shared with partners, the first-level promotion reward is 8%, the second level 3%, and the third level 5%. For example: if you have 100 directly referred team members and each adopts block pet dogs worth 5,000 yuan per day, the team income is 5,000 yuan × 3% × 100 = 15,000 yuan.
2) Daily output
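As a quick sanity check of the referral arithmetic above (rates and figures taken from the text; the function name is ours, for illustration only):

```python
def referral_reward(adoption_value_yuan, rate_percent, members):
    """Reward for one referral level: value per member x rate x member count."""
    return adoption_value_yuan * members * rate_percent / 100

# 100 directly referred members, each adopting 5,000 yuan per day, at the 3% rate:
print(referral_reward(5_000, 3, 100))  # 15000.0
```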
Recharge differential process:
Members need to transfer offline to the designated account number of the platform and note the good ID. after receiving the payment in the background, the background will recharge the differential for the members
Rules for members to buy and sell dogs: peer to peer transactions, no central account
For individual to individual point-to-point transactions, the platform has no capital precipitation and daily settlement, safe and fast point-to-point transactions and no fund pool. All transaction amount does not pass through the platform, point-to-point transaction. That is to say, when you buy a dog, you put the money into the other party’s account; when you sell a dog, the other party puts the money into your account.
## Software architecture
Laravel + laravel admin + MySQL fully open source
## Installation tutorial
~~~
git clone https://github.com/tp5er/laravel-admin-block.git your-path
cd your-path
composer install
php artisan admin:install
~~~
## Scheduled tasks
~~~
crontab -e
* * * * * php your-path/artisan schedule:run >> /dev/null 2>&1
~~~
## Task queue
~~~
nohup php your-path/artisan queue:work --tries=3 --sleep=3 &
~~~
## Git deployment tips
~~~
# set directory ownership and permissions
chown -R www:www your-path
chmod 775 -R your-path/public/ your-path/storage/
chmod 775 -R your-path/storage/app/
# tell git to ignore file permission (mode) changes
git config --add core.filemode false
~~~
## Cache-clearing commands
~~~
php artisan clear-compiled
php artisan cache:clear
php artisan config:clear
php artisan route:clear
~~~
## Instructions
Trading rules of dog snatching dog in Huadeng block
1. Make an appointment in advance! (if not, the differential will be returned).
2. The buyer has made an appointment, and the seller will be informed by SMS
3. Every time before the start, there will be a 90 second countdown! When reading seconds to 10, refresh the page (poke the “dog market” in the lower left corner), and start to click frequently when reading 5 seconds, until then! The page begins to show “yellow box flip.”.
4. If there are too many people snatching dogs in a moment, the system may be stuck in an instant. You can exit and wait for 2 minutes to re-enter the app, and then check the “adoption record” to see if there is any dog snatched.
Daily adoption time: 14:00, 15:00, 16:30, 17:00, 17:30, 19:30, 20:00, 20:30, 21:00
5. Members are not allowed to buy their own dogs
6. The system judges the probability of a new member’s first dog snatching
## Disclaimer
This project exists purely for learning Laravel and is not for commercial use.
Do not use this system commercially; I am not responsible if you do.
A non-commercial, pure technology learning project, self-researched. No technical support! No technical support!
# QQ Friend
[1751212020](http://wpa.qq.com/msgrd?v=3&uin=1751212020&site=qq&menu=yes)
title: Improving Cookie Consent
subtitle: Cliqz' new feature to make consent fairer
author: privacy team
type: article
publish: True
date: 2019-11-28
tags: blog, gdpr, consent
header_img: blog/autoconsent/cookie-blocker-prompt.png
+++
Since the GDPR came into force in May last year, the Cookie-Consent Popup has become a fixture of browsing the web. These popups are ostensibly there to allow you to choose whether you agree or disagree to your data being used for certain purposes on the site, but confusing UI design and tricks mean that many users are not able to select their desired consent settings. A recent [study](https://arxiv.org/pdf/1909.02638.pdf) showed that user fatigue with consent popups, and simple UI tricks are able to artificially inflate the opt-in rate. The study also showed that, when opt-out is the default, only 0.1% of users would consent to all data processing. This is in stark contrast to the over 90% opt-in rate that the [industry claims](https://www.thedrum.com/news/2018/07/31/over-90-users-consent-gdpr-requests-says-quantcast-after-enabling-1bn-them), and uses to justify that users are OK with tracking.
How can we restore balance to this situation, and allow users a fair choice about how their data is used? At Cliqz we have been developing a new feature to aim to address the difficulty of denying consent based around 3 core principles:
1. Opt-out and opt-in should both require a maximum of one click, i.e. the time cost should be the same no matter which choice is made.
2. The user should not have to decide individually for every site. Their default choice can be used to give consent after their initial decision.
3. Consent banners that only offer an 'OK' or 'Allow' option do not allow user choice. They are at best a distraction for the user, and at worst drive consent fatigue and encourage the bad practice of automatically clicking away message prompts. These should be hidden.
Unfortunately, implementing an automated consent choice in the browser is made challenging by the lack of adoption or adherence to browser standards. The [Do Not Track](https://www.w3.org/blog/2018/06/do-not-track-and-the-gdpr/) standard enables users to broadcast preferences around tracking, and for sites to communicate tracking status to the browser. Before that, the [P3P Project](https://www.w3.org/P3P/) attempted to standardise privacy practices and allow automated decision making around them. Both of these standards have been rejected by the tracking industry, who prefer to present consent on their terms. The industry have instead proposed and implemented the [Transparency and Consent Framework](https://iabeurope.eu/transparency-consent-framework/), which primarily focuses on communicating consent between vendors. It is a read-only API, so the browser can only read the consent status as set by the site, and not modify it. This means that consent can currently only be expressed by clicking through HTML forms.
<img class="img-responsive" src="../static/img/blog/autoconsent/cookie-blocker-before.gif" alt="Navigating a Cookie-Consent Popup manually" />
<p class="img-caption">Navigating a Cookie-Consent Popup manually.</p>
Luckily, the number of vendors offering consent solutions is limited, and browser extensions can simulate clicking through forms. Thus, [autoconsent](https://github.com/cliqz-oss/autoconsent) was born - a library of rules standardising the navigation of consent forms for the most popular sites and vendors. This library is able to:
* Detect the presence of supported Consent Management Providers on a page.
* Determine whether a popup or overlay is being shown on the page.
* Execute an opt-in (allow all purposes) or opt-out (reject all purposes).
* Where available, re-open the popup to allow modification of the settings.
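To make the approach concrete, here is a rough, hypothetical sketch of what such a rule library boils down to. The selector names and the `RULES` structure are invented for illustration and are not autoconsent's real rule format:

```python
# Hypothetical sketch of per-CMP consent rules (not autoconsent's actual format).
RULES = [
    {
        "name": "example-cmp",
        "detect": "#cookie-banner",             # selector proving the CMP is present
        "opt_out": ["#cookie-banner .reject"],  # elements to click, in order
        "opt_in": ["#cookie-banner .accept"],
    },
]

def find_rule(present_selectors):
    """Return the first rule whose detection selector occurs on the page."""
    for rule in RULES:
        if rule["detect"] in present_selectors:
            return rule
    return None

rule = find_rule({"#cookie-banner", "#main"})
print(rule["name"])     # example-cmp
print(rule["opt_out"])  # ['#cookie-banner .reject']
```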
In practice, this allows consent popups to be rapidly dismissed when loading a new site. The speed depends on the provider and how quickly their UI can be manipulated. In all cases, however, this is faster than a user could navigate the interface.
<img class="img-responsive" src="../static/img/blog/autoconsent/cookie-blocker-after.gif" alt="Automatic navigation of the Cookie-Consent Popup" />
<p class="img-caption">Automatic navigation of the Cookie-Consent Popup.</p>
For popups that are informational only, or force affirmative consent, we apply simple cosmetic rules. These are CSS rules that define elements in the page that should be hidden. As with the consent rules, we benefit from the defacto standardisation of tools for displaying of popups, such that a small number of rules can support the majority of popups shown by websites.
These elements combined mean that we now just have to ask the user once whether they want to opt-in or opt-out, then they will not be bothered by consent popups on the majority of sites they visit. At the same time, they will signal to these sites their approval or dissapproval of their data collection practices.
This signal of non-consent is important to encourage and incentivise a shift in data usage practices on the web. When sites realise they cannot just trick users into allowing invasive data collection, they will have a strong incentive to change the way they operate and respect users more.
The new Cliqz Cookie-Popup blocker is available in the latest version of the Cliqz browser. Get it at [cliqz.com](https://cliqz.com/download).
| 125.697674 | 1,028 | 0.796485 | eng_Latn | 0.999086 |
97e61131003c4fdc621ec43034e48ea09068db4d | 233 | md | Markdown | README.md | ALTAIRLAB/breathe2words | 389dc7fe2e3e24befe34a6f6be45c775d778f879 | [
"MIT"
] | null | null | null | README.md | ALTAIRLAB/breathe2words | 389dc7fe2e3e24befe34a6f6be45c775d778f879 | [
"MIT"
] | null | null | null | README.md | ALTAIRLAB/breathe2words | 389dc7fe2e3e24befe34a6f6be45c775d778f879 | [
"MIT"
] | null | null | null | # breathe2words
A project to convert breathing patterns to words and commands, which should help paralyzed people to socialize.
## Software
This project was originally designed under Processing 2, then it was moved to Processing 3.
| 38.833333 | 109 | 0.806867 | eng_Latn | 0.999676 |
97e74f5418bce4797d5d353181d437c3b52e2fc4 | 2,078 | md | Markdown | 2021/CVE-2021-37624.md | justinforbes/cve | 375c65312f55c34fc1a4858381315fe9431b0f16 | [
"MIT"
] | 2,340 | 2022-02-10T21:04:40.000Z | 2022-03-31T14:42:58.000Z | 2021/CVE-2021-37624.md | justinforbes/cve | 375c65312f55c34fc1a4858381315fe9431b0f16 | [
"MIT"
] | 19 | 2022-02-11T16:06:53.000Z | 2022-03-11T10:44:27.000Z | 2021/CVE-2021-37624.md | justinforbes/cve | 375c65312f55c34fc1a4858381315fe9431b0f16 | [
"MIT"
] | 280 | 2022-02-10T19:58:58.000Z | 2022-03-26T11:13:05.000Z | ### [CVE-2021-37624](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-37624)



### Description
FreeSWITCH is a Software Defined Telecom Stack enabling the digital transformation from proprietary telecom switches to a software implementation that runs on any commodity hardware. Prior to version 1.10.7, FreeSWITCH does not authenticate SIP MESSAGE requests, leading to spam and message spoofing. By default, SIP requests of the type MESSAGE (RFC 3428) are not authenticated in the affected versions of FreeSWITCH. MESSAGE requests are relayed to SIP user agents registered with the FreeSWITCH server without requiring any authentication. Although this behaviour can be changed by setting the `auth-messages` parameter to `true`, it is not the default setting. Abuse of this security issue allows attackers to send SIP MESSAGE messages to any SIP user agent that is registered with the server without requiring authentication. Additionally, since no authentication is required, chat messages can be spoofed to appear to come from trusted entities. Therefore, abuse can lead to spam and enable social engineering, phishing and similar attacks. This issue is patched in version 1.10.7. Maintainers recommend that this SIP message type is authenticated by default so that FreeSWITCH administrators do not need to be explicitly set the `auth-messages` parameter. When following such a recommendation, a new parameter can be introduced to explicitly disable authentication.
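As context for the mitigation described above, the `auth-messages` switch lives in the Sofia SIP profile configuration. A minimal sketch, assuming the stock file layout (adjust paths and profile names to your deployment):

```xml
<!-- conf/sip_profiles/internal.xml -->
<profile name="internal">
  <settings>
    <!-- Require authentication for SIP MESSAGE requests
         (not enabled by default before the 1.10.7 fix) -->
    <param name="auth-messages" value="true"/>
  </settings>
</profile>
```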
### POC
#### Reference
- http://packetstormsecurity.com/files/164628/FreeSWITCH-1.10.6-Missing-SIP-MESSAGE-Authentication.html
#### Github
- https://github.com/0xInfection/PewSWITCH
- https://github.com/ARPSyndicate/cvemon
- https://github.com/nomi-sec/PoC-in-GitHub
- https://github.com/taielab/awesome-hacking-lists
| 98.952381 | 1,372 | 0.803176 | eng_Latn | 0.962494 |
97e90de900a8d3339b0d01a0055fd5ae402f9b58 | 1,562 | md | Markdown | README.md | atrotter0/rpg-group-project | e21cc4aed8a972e6bd14404ee8b7afcc6f871889 | [
"MIT"
] | 4 | 2018-06-22T18:30:03.000Z | 2021-02-16T04:18:57.000Z | README.md | atrotter0/rpg-group-project | e21cc4aed8a972e6bd14404ee8b7afcc6f871889 | [
"MIT"
] | 26 | 2018-06-16T21:48:32.000Z | 2019-03-14T16:36:38.000Z | README.md | atrotter0/rpg-group-project | e21cc4aed8a972e6bd14404ee8b7afcc6f871889 | [
"MIT"
] | 4 | 2018-06-21T21:37:59.000Z | 2019-05-12T04:31:32.000Z | # Dungeon Escape
#### Epicodus Intro to Programming Week 5 Group Project, 06.14.18
#### By Abel Trotter, Kelli McCloskey, Elly Maimon, Thad Donaghue
## Description
**Dungeon Escape** is an RPG point-and-click adventure game built with JavaScript, jQuery, & Bootstrap! This project was built over the course of 4 days for the Epicodus Intro to Programming Week 5 Group Project.
**Key Features:**
* Character customization and leveling
* Save Game / Load Game options
* Random enemy loot drops
* Multiple enemy types
* Classic 8-bit RPG music and sound effects
* Turn-based battle system
* Inventory system
* Blood, sweat, and tears
## Personal Contributions
I had a hand in developing and designing most of the key features of the game and how these systems would work with one another.
My main contributions were the battle system, inventory, UI, and the save and load features.
## Setup/Contribution Requirements
1. Clone the repo
1. Checkout Develop
1. Make a new branch
1. Commit and push your changes
1. Create a PR
## Technologies Used
* HTML/CSS
* JavaScript
* Bootstrap 3.3.7
* jQuery 3.3.1
* localStorage
## Known Issues
* Need a way to use consumable items in inventory.
* Need a way to destroy items in inventory.
* Needs another pass at game balance.
## Links
* [Github Repo](https://github.com/atrotter0/rpg-group-project)
* [Github Pages](https://atrotter0.github.io/rpg-group-project)
## License
This software is licensed under the MIT license.
Copyright (c) 2018 **Abel Trotter, Kelli McCloskey, Elly Maimon, Thad Donaghue** | 26.931034 | 212 | 0.752881 | eng_Latn | 0.963311 |
97ec19dc3c4262ab4acbaaeb1c48af0ec15ce52d | 851 | md | Markdown | skills/B01IYYLSEU/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 232 | 2016-03-05T06:24:41.000Z | 2022-03-21T19:32:55.000Z | skills/B01IYYLSEU/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 5 | 2016-03-21T02:25:06.000Z | 2020-01-03T15:01:39.000Z | skills/B01IYYLSEU/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 52 | 2016-04-02T06:08:55.000Z | 2021-12-12T23:52:13.000Z | # [animal quiz](http://alexa.amazon.com/#skills/amzn1.ask.skill.b2539201-c290-491f-870f-1c845c7ab3e6)
 0
To use the animal quiz skill, try saying...
* *Alexa, start animal quiz*
* *Help*
* *Repeat*
This skill shares strange and little-known facts about animals.
***
### Skill Details
* **Invocation Name:** animal quiz
* **Category:** null
* **ID:** amzn1.ask.skill.b2539201-c290-491f-870f-1c845c7ab3e6
* **ASIN:** B01IYYLSEU
* **Author:** earlwackerly
* **Release Date:** July 27, 2016 @ 09:56:51
* **In-App Purchasing:** No
| 34.04 | 287 | 0.701528 | eng_Latn | 0.276756 |
97ed125f8701a4b10f93159195d5a4e31dc54ea6 | 93 | md | Markdown | slides/20-the-challenge/40-fine.md | nicojs/stryker-tsconf-presentation | 5d35ad76a7827e2462dc71206a8e2491e6ea7f53 | [
"Apache-2.0"
] | null | null | null | slides/20-the-challenge/40-fine.md | nicojs/stryker-tsconf-presentation | 5d35ad76a7827e2462dc71206a8e2491e6ea7f53 | [
"Apache-2.0"
] | null | null | null | slides/20-the-challenge/40-fine.md | nicojs/stryker-tsconf-presentation | 5d35ad76a7827e2462dc71206a8e2491e6ea7f53 | [
"Apache-2.0"
] | null | null | null | 
We need something better
<!-- .element class="fragment" --> | 18.6 | 34 | 0.612903 | eng_Latn | 0.664694 |
97eda3382163227bcf1b8ef3198975ee5572fa2d | 339 | md | Markdown | DistributedMail/README.md | JohnLau55/harmonyos-codelabs | 3e07a6e0e610f54484821170dd44e0170ef2df18 | [
"Apache-2.0"
] | null | null | null | DistributedMail/README.md | JohnLau55/harmonyos-codelabs | 3e07a6e0e610f54484821170dd44e0170ef2df18 | [
"Apache-2.0"
] | null | null | null | DistributedMail/README.md | JohnLau55/harmonyos-codelabs | 3e07a6e0e610f54484821170dd44e0170ef2df18 | [
"Apache-2.0"
] | null | null | null | # Distributed Mail Client
This article gives a simple demonstration of collaborative email editing between different devices. The editing task can be migrated across devices through the migration button, and pictures on another device can be pulled in through the attachment button.
## Licensing

Please see LICENSE for more info.
97ee59f9ba6935c1cc252e4114dd73cf7d9c4d1b | 1,444 | md | Markdown | includes/data-box-storage-account-size-limits.md | gliljas/azure-docs.sv-se-1 | 1efdf8ba0ddc3b4fb65903ae928979ac8872d66e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/data-box-storage-account-size-limits.md | gliljas/azure-docs.sv-se-1 | 1efdf8ba0ddc3b4fb65903ae928979ac8872d66e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/data-box-storage-account-size-limits.md | gliljas/azure-docs.sv-se-1 | 1efdf8ba0ddc3b4fb65903ae928979ac8872d66e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
author: alkohli
ms.service: databox
ms.subservice: heavy
ms.topic: include
ms.date: 06/18/2019
ms.author: alkohli
ms.openlocfilehash: 3e1f9c225a57e7d41f85c2a92dac989453057c51
ms.sourcegitcommit: 4499035f03e7a8fb40f5cff616eb01753b986278
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 05/03/2020
ms.locfileid: "82736985"
---
The following limits apply to the size of data copied to the storage account. Make sure that the data you transfer complies with these limits. For the latest information on these limits, see [Scalability and performance targets for Blob storage](../articles/storage/blobs/scalability-targets.md) and [Azure Files scalability and performance targets](../articles/storage/files/storage-files-scale-targets.md).

| Size of data copied into an Azure storage account | Default limit |
|---------------------------------------------------------------------|------------------------|
| Block blob and page blob | 2 PB for the US and Europe.<br>500 TB for all other regions, including the UK. <br> This includes data from all sources, including a Data Box device.|
| Azure Files | Maximum size of standard file shares 100 TiB*, 5 TB; premium file shares 100 TiB per share.<br> All folders under *StorageAccount_AzureFiles* must stay within this limit. |
| 68.761905 | 404 | 0.671745 | swe_Latn | 0.97195 |
97ef7595cdd1dd15d295742e05f5c249bbfe0573 | 1,844 | md | Markdown | docs/api.md | harlanmang/hooper | 3742b4c9bfc8bcb8db69894deb56213cfe8c1c77 | [
"MIT"
] | null | null | null | docs/api.md | harlanmang/hooper | 3742b4c9bfc8bcb8db69894deb56213cfe8c1c77 | [
"MIT"
] | null | null | null | docs/api.md | harlanmang/hooper | 3742b4c9bfc8bcb8db69894deb56213cfe8c1c77 | [
"MIT"
] | null | null | null | # API
## Props
|Prop |Default |Description|
|-----------------|-----|-----------|
|`itemsToShow`    |1    |count of items to show per view (can be a fraction).|
|`itemsToSlide`   |1    |count of items to slide when using the navigation buttons|
|`infiniteScroll` |false|enable infinite scrolling mode.|
|`centerMode` |false|enable center mode|
|`vertical` |false|enable vertical sliding mode|
|`rtl` |false|enable rtl mode|
|`mouseDrag` |true |toggle mouse dragging|
|`touchDrag` |true |toggle touch dragging|
|`wheelControl` |false|toggle mouse wheel sliding|
|`keysControl` |false|toggle keyboard control|
|`shortDrag` |true |enable any move to commit a slide|
|`autoPlay`       |false|enable auto sliding of the carousel|
|`playSpeed`      |3000 |interval between auto-play slides, in ms|
|`transition` |300 |sliding transition time in ms|
|`sync` |'' |sync two carousels to slide together|
|`settings` |{ } |an object to pass all settings|
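All of the props above can alternatively be passed in a single object via the `settings` prop. A sketch — the `hooperSettings` name and the chosen values are illustrative:

```vue
<template>
  <hooper :settings="hooperSettings">
    <slide>slide 1</slide>
    <slide>slide 2</slide>
  </hooper>
</template>

<script>
import { Hooper, Slide } from 'hooper';
import 'hooper/dist/hooper.css';

export default {
  components: { Hooper, Slide },
  data() {
    return {
      // same keys as the individual props listed above
      hooperSettings: {
        itemsToShow: 2,
        centerMode: true,
        autoPlay: true,
        playSpeed: 2000
      }
    };
  }
};
</script>
```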
## Slots
Hooper accept two different slots, default slots for slides items, `addons` slot for addons components.
### default slot
```vue
<hooper :vertical="true" style="height: 400px" :itemsToShow="1.5" :centerMode="true">
<slide>
slide 1
</slide>
<slide>
slide 2
</slide>
<slide>
slide 3
</slide>
<slide>
slide 4
</slide>
<slide>
slide 5
</slide>
<slide>
slide 6
</slide>
</hooper>
```
### addon slot
```vue
<hooper>
...
<hooper-navigation slot="hooper-addons"></hooper-navigation>
<hooper-progress slot="hooper-addons"></hooper-progress>
<hooper-pagination slot="hooper-addons"></hooper-pagination>
</hooper>
```
## Events
::: tip info
Coming soon
:::
## Methods
::: tip info
Coming soon
::: | 25.611111 | 104 | 0.610087 | eng_Latn | 0.63141 |
97ef94e62589dfdc3b95934d768d7f855a5d3134 | 1,104 | md | Markdown | README.md | hillebrandt/devblog | 8e8d7f3b95e91fd8cdb6115ecbafe2cbff2f7d95 | [
"MIT"
] | 21 | 2017-12-20T12:19:00.000Z | 2021-05-27T11:48:14.000Z | README.md | hillebrandt/devblog | 8e8d7f3b95e91fd8cdb6115ecbafe2cbff2f7d95 | [
"MIT"
] | 174 | 2017-08-18T07:51:09.000Z | 2022-02-09T12:13:05.000Z | README.md | hillebrandt/devblog | 8e8d7f3b95e91fd8cdb6115ecbafe2cbff2f7d95 | [
"MIT"
] | 132 | 2017-07-27T13:25:29.000Z | 2022-01-29T13:40:30.000Z | 
# adesso SE devblog
Dieses Repository beinhaltet Blog Posts für den [adesso Blog](https://blog.adesso.de/).
## Wie kann ich einen Blog Post veröffentlichen?
Eine Anleitung, wie ein Blog Post über dieses Repository veröffentlicht werden kann,
findest du [hier](examples/2021-08-23-blog-post-guide.md).
**WICHTIG: Achte bitte auf die Aktualisierung zur Angabe der Autoren**
Falls du einige Tipps zum Schreiben eines erfolgreichen Artikels haben möchtest, schau doch mal in unsere [Best Practices](https://github.com/adessoAG/devblog/blob/master/examples/best-practices.md).
## Aktuelle Preview der Blog Posts in diesem Repository
Eine Preview findet ihr [hier](https://adesso-devblog-pr-preview.netlify.com). Hierbei handelt
es sich aber nur um eine Preview, die genau die Blog Posts anzeigt, die in diesem Repository
enthalten sind. Sie werden regelmäßig in den adesso Blog integriert.
## Fragen?
Bei Fragen zu diesem Repository wende dich an [devblog@adesso.de](mailto:devblog@adesso.de).
| 50.181818 | 199 | 0.790761 | deu_Latn | 0.959074 |
97efe567c38dfd918ecd3738cee0a9c00bea6267 | 22 | md | Markdown | README.md | alyonakurtse/BOT2 | 4c0b0bc60cd4f806bf6223e777a24c6b9325ca55 | [
"MIT"
] | null | null | null | README.md | alyonakurtse/BOT2 | 4c0b0bc60cd4f806bf6223e777a24c6b9325ca55 | [
"MIT"
] | null | null | null | README.md | alyonakurtse/BOT2 | 4c0b0bc60cd4f806bf6223e777a24c6b9325ca55 | [
"MIT"
] | null | null | null | # BOT2
I add some text | 11 | 15 | 0.727273 | eng_Latn | 0.998831 |
97f224d38780bacaf4aeec9577f5a49322180bc2 | 519 | md | Markdown | _posts/2016-12-01-Thursday-1st-December.md | IT-WITH-ME/IT-WITH-ME.github.io | 02206647938a023a4ff5f514af5a01130d463224 | [
"MIT"
] | null | null | null | _posts/2016-12-01-Thursday-1st-December.md | IT-WITH-ME/IT-WITH-ME.github.io | 02206647938a023a4ff5f514af5a01130d463224 | [
"MIT"
] | null | null | null | _posts/2016-12-01-Thursday-1st-December.md | IT-WITH-ME/IT-WITH-ME.github.io | 02206647938a023a4ff5f514af5a01130d463224 | [
"MIT"
] | null | null | null | ---
title: Thursday 1st December
layout: post
author: nick.vyse
permalink: /thursday-1st-december/
source-id: 1Hsh3-1ZD1dH3A57qZdREpVO8TAPa7jYmj4GOLTb6Rvk
published: true
---
**Thursday 1st December**
Hi guys, today I managed to add a container to my blog which makes it look so much better so that's good also the same problem happened with the late update on my blog but it's working now. So that's all good also I changed the background and avatar to christmas images because CHRISTMAS!!!
Nick signing off, bye.
| 34.6 | 291 | 0.780347 | eng_Latn | 0.994847 |
97f27d6e08781fb18dbd67bc821f69ae4deab98e | 3,262 | md | Markdown | README.md | markuspoerschke/guzzle-oauth2-middleware | 25bad4609f969615a69da2530b783f231cf2b0c7 | [
"Apache-2.0"
] | 6 | 2019-07-10T09:23:39.000Z | 2021-04-04T20:05:13.000Z | README.md | markuspoerschke/guzzle-oauth2-middleware | 25bad4609f969615a69da2530b783f231cf2b0c7 | [
"Apache-2.0"
] | 11 | 2017-08-21T07:43:07.000Z | 2022-03-08T09:13:13.000Z | README.md | markuspoerschke/guzzle-oauth2-middleware | 25bad4609f969615a69da2530b783f231cf2b0c7 | [
"Apache-2.0"
] | 5 | 2018-03-21T22:23:28.000Z | 2022-03-08T09:13:39.000Z | Guzzle OAuth2 Middleware
=====
[](https://github.com/softonic/guzzle-oauth2-middleware/releases)
[](LICENSE.md)
[](https://travis-ci.org/softonic/guzzle-oauth2-middleware)
[](https://scrutinizer-ci.com/g/softonic/guzzle-oauth2-middleware/code-structure)
[](https://scrutinizer-ci.com/g/softonic/guzzle-oauth2-middleware)
[](https://packagist.org/packages/softonic/guzzle-oauth2-middleware)
[](http://isitmaintained.com/project/softonic/guzzle-oauth2-middleware "Average time to resolve an issue")
[](http://isitmaintained.com/project/softonic/guzzle-oauth2-middleware "Percentage of issues still open")
This package provides middleware for [guzzle](https://github.com/guzzle/guzzle/) for handling OAuth2 token negotiation and renewal on expiry transparently. It accepts PHP League's [OAuth 2.0 Clients](https://github.com/thephpleague/oauth2-client).
Installation
-------
To install, use composer:
```
composer require softonic/guzzle-oauth2-middleware
```
Usage
-------
``` php
<?php
$options = [
'clientId' => 'myclient',
'clientSecret' => 'mysecret'
];
// Any provider extending League\OAuth2\Client\Provider\AbstractProvider will do
$provider = new Softonic\OAuth2\Client\Provider\Softonic($options);
// Send OAuth2 parameters and use token_options for any other parameters your OAuth2 provider needs
$config = ['grant_type' => 'client_credentials', 'scope' => 'myscope', 'token_options' => ['audience' => 'test_audience']];
// Any implementation of PSR-6 Cache will do
$cache = new \Symfony\Component\Cache\Adapter\FilesystemAdapter();
$client = \Softonic\OAuth2\Guzzle\Middleware\ClientBuilder::build(
$provider,
$config,
$cache,
['base_uri' => 'https://foo.bar/']
);
$response = $client->request('POST', 'qux');
```
Testing
-------
`softonic/guzzle-oauth2-middleware` has a [PHPUnit](https://phpunit.de) test suite and a coding style compliance test suite using [PHP CS Fixer](http://cs.sensiolabs.org/).
To run the tests, run the following command from the project folder.
``` bash
$ docker-compose run test
```
To run interactively using [PsySH](http://psysh.org/):
``` bash
$ docker-compose run psysh
```
License
-------
The Apache 2.0 license. Please see [LICENSE](LICENSE) for more information.
[PSR-2]: http://www.php-fig.org/psr/psr-2/
[PSR-4]: http://www.php-fig.org/psr/psr-4/
| 41.820513 | 248 | 0.749847 | eng_Latn | 0.363133 |
97f2839c0a01638ea78ac4ad3f28012f19f55799 | 37 | md | Markdown | magura.md | flancia-coop/doc.anagora.org | 323aeac8be6440c32251e95e219937b06f62b6dd | [
"CC0-1.0"
] | null | null | null | magura.md | flancia-coop/doc.anagora.org | 323aeac8be6440c32251e95e219937b06f62b6dd | [
"CC0-1.0"
] | null | null | null | magura.md | flancia-coop/doc.anagora.org | 323aeac8be6440c32251e95e219937b06f62b6dd | [
"CC0-1.0"
] | null | null | null | - a [[sweet]].
- from [[romania]] | 18.5 | 22 | 0.459459 | eng_Latn | 0.978517 |
97f336ff5e5946ff52f8a32d883d18368acfc458 | 2,711 | md | Markdown | _vendor/github.com/chef/automate/components/docs-chef-io/content/automate/node_credentials.md | stelgote/chef-web-docs | 395f2bb89b3abe03ff87081e9d3d8d079925c2e6 | [
"CC-BY-3.0"
] | null | null | null | _vendor/github.com/chef/automate/components/docs-chef-io/content/automate/node_credentials.md | stelgote/chef-web-docs | 395f2bb89b3abe03ff87081e9d3d8d079925c2e6 | [
"CC-BY-3.0"
] | 11 | 2021-05-03T03:41:45.000Z | 2022-03-11T23:24:29.000Z | _vendor/github.com/chef/automate/components/docs-chef-io/content/automate/node_credentials.md | stelgote/chef-web-docs | 395f2bb89b3abe03ff87081e9d3d8d079925c2e6 | [
"CC-BY-3.0"
] | null | null | null | +++
title = "Node Credentials"
date = 2018-05-22T17:23:24-07:00
weight = 20
draft = false
gh_repo = "automate"
[menu]
[menu.automate]
title = "Node Credentials"
identifier = "automate/settings/node_credentials.md Node Credentials"
parent = "automate/settings"
weight = 50
+++
The Chef Automate Credentials page allows you to add, edit, and delete ``SSH``, ``WinRM``, and ``Sudo`` credentials for remote access to your nodes.
To manage your credentials, navigate to the _Node Credentials_ page from the **Settings** tab.

Adding SSH, WinRM, and Sudo credentials is the first step for using the Chef Automate Compliance Scanner. After adding credentials, you'll be able to add nodes and create scan jobs.
Depending on how you've set up your nodes, you may need to set up more than one key that uses the same SSH Private Key with different usernames. For example, AWS EC2 Amazon Linux nodes require the username ``ec2-user``, while AWS EC2 Ubuntu nodes require the username ``ubuntu`` or ``root``. The **Credentials** page enables saving two different sets of credentials, both using the same SSH Private Key and different user names. However, credentials with different content may also reuse identical key names; it may be advisable to reduce confusion by following a naming pattern that specifies the key name and platform, to distinguish between similar credentials.
{{< warning >}}
A credential name may be reused, even when it contains different usernames or keys.
{{< /warning >}}
## Add a Credential
Select _Add Credential_ and a dialog box appears as shown below. Select the _Credential Type_ drop box to choose the desired credential type.
### Add a SSH Credential

**SSH** requires a credential name, a user name, and either an SSH password **or** an SSH private key, but not both.
### Add a WinRM Credential

**WinRM** requires a credential name, a user name, and a WinRM password.
### Add a Sudo Credential

**Sudo** requires a credential name, a user name, and a password **or** sudo options, but not both.
Credentials will be visible in the _Node Credentials_ view after using the **Save Credential** button. If you are not redirected to the credentials list, then review the credential you are attempting to add.
## Manage Credentials
* Edit a credential by selecting the credential's name, which opens the credential's detail page.
* Delete a credential by selecting **Delete** from the menu at the end of the table row.
| 45.949153 | 657 | 0.758761 | eng_Latn | 0.995871 |
97f387ee41c156a4d5fcdfd28c2b438ae8393b08 | 969 | md | Markdown | README.md | CrumpLab/rstatsmethods | a757b8b81a799bc3ff2d241b19370c1101c7157d | [
"CC-BY-4.0"
] | null | null | null | README.md | CrumpLab/rstatsmethods | a757b8b81a799bc3ff2d241b19370c1101c7157d | [
"CC-BY-4.0"
] | 10 | 2021-08-17T20:40:04.000Z | 2021-12-08T08:56:38.000Z | README.md | CrumpLab/rstatsmethods | a757b8b81a799bc3ff2d241b19370c1101c7157d | [
"CC-BY-4.0"
] | 1 | 2021-11-08T16:58:51.000Z | 2021-11-08T16:58:51.000Z | # rstatsmethods <a href='https:/crumplab.github.io/rstatsmethods'><img src='man/figures/logo.png' align="right" height="120.5" /></a>
<!-- badges: start -->
[](https://www.tidyverse.org/lifecycle/#experimental)
<!-- badges: end -->
## PSYC 7765/66 Statistical Methods Application I
This is the course website for PSYC 7765/66: Statistical Methods Applications I (Fall 2021), offered through the Experimental Psychology Master’s program, Department of Psychology, Brooklyn College of CUNY.
## PSYC 7765/66 Statistical Methods Application II
This website also contains the labs for Statistical Methods Applications II, to be offered in Spring 2022
Instructor: Matthew Crump
[mcrump@brooklyn.cuny.edu](mcrump@brooklyn.cuny.edu)
[](https://creativecommons.org/licenses/by-sa/4.0/)
| 51 | 207 | 0.76161 | eng_Latn | 0.445573 |
97f3fde43f504f6f2d847b8f0557702d4b286e79 | 2,342 | md | Markdown | README.md | mirdaki/theforce | 3eda9870301ad2e1ccfc1458a8a9911fc87f2149 | [
"BSD-2-Clause",
"MIT"
] | 9 | 2021-10-11T06:26:57.000Z | 2022-03-08T20:14:26.000Z | README.md | mirdaki/theforce | 3eda9870301ad2e1ccfc1458a8a9911fc87f2149 | [
"BSD-2-Clause",
"MIT"
] | 14 | 2021-10-31T03:05:07.000Z | 2021-11-22T02:24:55.000Z | README.md | mirdaki/theforce | 3eda9870301ad2e1ccfc1458a8a9911fc87f2149 | [
"BSD-2-Clause",
"MIT"
] | 1 | 2021-11-07T04:13:30.000Z | 2021-11-07T04:13:30.000Z | # The Force
> The Force is a gateway to abilities many believe are unnatural...
The Force is a Star Wars inspired programming language. All keywords are made up of quotes from the movies and it is fully armed and operational!
```force
Do it!
The Sacred Jedi Texts! "Hello there\n"
May The Force be with you.
```
## Getting Started
To learn about using The Force, please look at the [introduction](docs/introduction.md). We also have some [examples](examples) of full programs you can use as reference.
### Installing
If you have [cargo](https://doc.rust-lang.org/cargo/):
```bash
cargo install theforce
```
Or download directly from our [releases](https://github.com/mirdaki/theforce/releases).
### Usage
Run a `.force` file:
```bash
theforce /path/to/file
```
### Developing
[Install Rust](https://www.rust-lang.org/tools/install). We also provide a [Dev Container](https://code.visualstudio.com/docs/remote/create-dev-container) if you would prefer to run it that way.
To run the examples:
```bash
cargo run examples/hello-there.force
```
To run with LLVM support (currently a WIP):
```bash
cargo run examples/hello-there.force --features llvm
```
## Built With
Thank you to all the projects that helped make this possible!
- [Rust](https://www.rust-lang.org/) for being an awesome language to build with
- [Pest](https://pest.rs/) used for defining and parsing the grammar
- [Create Your Own Programming Language with Rust](https://createlang.rs/) provided an excellent introduction into the Rust tools needed to build this
- [ArnoldC](https://lhartikk.github.io/ArnoldC/) for providing inspiration for the design
## Contributing
Please read [CONTRIBUTING.md](CONTRIBUTING.md) for how to contribute to the project.
## License
This project is dual-licensed under the MIT or Yoda License - see the [LICENSE.md](LICENSE.md) and [YODA-LICENSE.md](YODA-LICENSE.md) files for details.
The Force is in no way affiliated with or endorsed by Lucasfilm Limited or any of its subsidiaries, employees, or associates. All Star Wars quotes and references in this project are copyrighted to Lucasfilm Limited. This project intends to use these strictly within the terms of fair use under United States copyright laws.
<small>Disney please don't sue us.</small>
| 34.955224 | 324 | 0.733134 | eng_Latn | 0.993626 |
97f466af42f4bd5193c6f20377815ff369e2099a | 1,265 | md | Markdown | README.md | homayoun1990/djangomap | e1812d0cfc568e2274a6ad2199e43211fe0128b1 | [
"MIT"
] | 1 | 2020-04-10T16:26:52.000Z | 2020-04-10T16:26:52.000Z | README.md | homayoun1990/djangomap | e1812d0cfc568e2274a6ad2199e43211fe0128b1 | [
"MIT"
] | null | null | null | README.md | homayoun1990/djangomap | e1812d0cfc568e2274a6ad2199e43211fe0128b1 | [
"MIT"
] | null | null | null | ### DjangoMap: An interactive map using django web framework and openlayer js package!
Use Django to build a simple web application from scratch
### Clone the project:
Use this command to clone this project into your working directory: `git clone ...`
### Creating a Virtual Environment:
A virtual environment allows you to create an isolated environment for the dependencies of your current project. This will allow you to avoid conflicts between the same packages that have different versions.
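On the command line this is usually `python -m venv .venv` followed by `source .venv/bin/activate`. The same environment can also be created from Python itself with the standard-library `venv` module; the `.venv` path below is just a conventional choice:

```python
import venv

# Create an isolated environment in ./.venv.
# with_pip=False keeps this example dependency-free; drop it to get pip installed too.
venv.create(".venv", with_pip=False)
print("created .venv")
```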
### Installing Dependencies:
The first step after creating and activating a virtual environment is to install Django. In this section, you’ll be installing the prerequisites needed before you can bootstrap your project, such as Django.
```bash
pip install django
# or
pip install -r requirements.txt
```
### Building the Django Project and run the server:
We can use Django's `django-admin.py` tool to create the boilerplate code structure to get our project started. Change into the directory where you develop your applications.
### Adding Maps with OpenLayers:
OpenLayers is an open-source JavaScript library for displaying map data in web browsers as slippy maps. It provides an API for building rich web-based geographic applications similar to Google Maps and Bing Maps.
| 60.238095 | 211 | 0.8 | eng_Latn | 0.998037 |
97f546be76f6e13f49bdd77718a0e3aa0a1f08a0 | 497 | md | Markdown | add/metadata/System.Speech.Recognition/SemanticResultKey.meta.md | kcpr10/dotnet-api-docs | b73418e9a84245edde38474bdd600bf06d047f5e | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-06-16T22:24:36.000Z | 2020-06-16T22:24:36.000Z | add/metadata/System.Speech.Recognition/SemanticResultKey.meta.md | kcpr10/dotnet-api-docs | b73418e9a84245edde38474bdd600bf06d047f5e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | add/metadata/System.Speech.Recognition/SemanticResultKey.meta.md | kcpr10/dotnet-api-docs | b73418e9a84245edde38474bdd600bf06d047f5e | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-02T13:31:28.000Z | 2020-05-02T13:31:28.000Z | ---
uid: System.Speech.Recognition.SemanticResultKey
---
---
uid: System.Speech.Recognition.SemanticResultKey.#ctor(System.String,System.Speech.Recognition.GrammarBuilder[])
ms.author: "kbridge"
---
---
uid: System.Speech.Recognition.SemanticResultKey.#ctor(System.String,System.String[])
ms.author: "kbridge"
---
---
uid: System.Speech.Recognition.SemanticResultKey.#ctor
ms.author: "kbridge"
---
---
uid: System.Speech.Recognition.SemanticResultKey.ToGrammarBuilder
ms.author: "kbridge"
---
| 20.708333 | 112 | 0.756539 | yue_Hant | 0.708223 |
97f5c398ae2c9d27d91986b6e76a2f6b4de3d559 | 2,061 | md | Markdown | _posts/2020-10-23-assignment-2-prototyping.md | disyraf/EP1000_Project | 8d38f4803aedaf9d096a5dbe13425ea2b8ba935d | [
"MIT"
] | null | null | null | _posts/2020-10-23-assignment-2-prototyping.md | disyraf/EP1000_Project | 8d38f4803aedaf9d096a5dbe13425ea2b8ba935d | [
"MIT"
] | null | null | null | _posts/2020-10-23-assignment-2-prototyping.md | disyraf/EP1000_Project | 8d38f4803aedaf9d096a5dbe13425ea2b8ba935d | [
"MIT"
] | null | null | null | ---
layout: post
title: Prototyping
date: 2020-10-23 21:20:30 +0800
image: 02.jpg
tags: Assignment-2
---
<strong>Prototyping</strong> is an essential part of building anything from small personal projects to large scale projects led by big organisations. There are many tools and materials that can be used for prototyping. The most common material used is cardboard.
### Why is cardboard good for prototyping?
Cardboard is a great material for protyping because it is cheap if not free, environmentally friendly and easy to work with. Corrugated cardboard readily collapses along the corrugations when force is applied in a way where one or both ends of the corrugations are not supported. That means that it bends easily in one direction but is extremely resistant to bending in the other. Cardboard can be used in many ways due to its versatility.
### Cardboard prototype
To prove to you that cardboard is really useful, I made a prototype myself.
I made a simple handphone holder, made up entirely out of cardboard.
### Sketching
Sketching is the first step to prototyping. It converts whatever is in your brain into the real world. The sketch doesn't have to be pretty. It just has to look like the picture you had in your head. For me, the handphone holder looked like this in my head:

Now, I know that it is not the best looking sketch, but it's something that I can work with.
### Tools
There are many tools that can be used but I went with the simplest of options - siccors, pen knife and some hot glue. These tools are perfect for the job and they are easily attainable.
### The Final Piece
Now that the sketch is done. I start to build it it out of cardboard. Here are the step I took to do so:
1. Draw the pieces on the cardboard

2. Use the pen knife and siccors to cut them out

3. Assemble the pieces using hot glue

4. Admire your work

| 57.25 | 439 | 0.756914 | eng_Latn | 0.999771 |
97f6963e1dc3b9e89f836d9af1962b21445e7385 | 105 | md | Markdown | docs/cookbook.md | IGZfranciscogamiz/iris | b3a2519b723da942af5354b92746f7e7e79506c6 | [
"BSD-3-Clause"
] | 1 | 2019-01-15T11:16:56.000Z | 2019-01-15T11:16:56.000Z | docs/cookbook.md | IGZfranciscogamiz/iris | b3a2519b723da942af5354b92746f7e7e79506c6 | [
"BSD-3-Clause"
] | null | null | null | docs/cookbook.md | IGZfranciscogamiz/iris | b3a2519b723da942af5354b92746f7e7e79506c6 | [
"BSD-3-Clause"
] | null | null | null | [Iris homepage](https://github.com/iris-js/iris) | [Documentation table of contents](toc.md)
# Cookbook
| 26.25 | 92 | 0.733333 | yue_Hant | 0.285828 |
97f6a43d04468832893332a7c7658bef4cccb5f6 | 1,115 | md | Markdown | _posts/2017-07-13-outreachy-week7-and-8.md | jnanjekye/jnanjekye.github.io | 67a7bcafc707efeb487bae46e9a0816312f6125d | [
"Apache-2.0"
] | null | null | null | _posts/2017-07-13-outreachy-week7-and-8.md | jnanjekye/jnanjekye.github.io | 67a7bcafc707efeb487bae46e9a0816312f6125d | [
"Apache-2.0"
] | null | null | null | _posts/2017-07-13-outreachy-week7-and-8.md | jnanjekye/jnanjekye.github.io | 67a7bcafc707efeb487bae46e9a0816312f6125d | [
"Apache-2.0"
] | null | null | null | ---
layout: post
title: Outreachy With Ceph - Java Test Suite building up
comments: true
---
Since my last blog post, the [java suite](https://github.com/nanjekyejoannah/java_s3tests) has grown to more than 100
implemented tests. Let me get into some details on what has been implemented and what still needs to be.
## What has been implemented
This is a summary of the implemented tests; the [commit](https://github.com/nanjekyejoannah/java_s3tests/commits/master) history gives a more detailed picture.
+ Tests on basic object and bucket CRUD.
+ Tests on SSE and KMS.
+ Tests on object put and get with delimiters, prefixes, maxkeys, metadata and markers.
+ Tests on bucket and object put with the different headers.
## What Next
I will have to set up a CLI environment to run the tests, as I currently run them in Eclipse. I need to make use of Ant
to automate the test runs. This has been a battle, but I am hoping to nail it down completely.
I will be implementing tests on AWS auth v4. I will get advice and feedback on this, especially on what should be tested,
but I will first share my test plan with my mentors.
| 42.884615 | 153 | 0.775785 | eng_Latn | 0.999262 |
97f6d912aa8dbb0b4841cad618cd8809b7bb17a9 | 34,220 | md | Markdown | soryeongk/180723_crawling_dataframe.md | jiyoung-choi/TWL | 925ac475c88bdb82eb6235b62e8195edd9d5ec70 | [
"MIT"
] | 6 | 2018-07-06T03:40:27.000Z | 2018-07-30T06:10:51.000Z | soryeongk/180723_crawling_dataframe.md | jiyoung-choi/TWL | 925ac475c88bdb82eb6235b62e8195edd9d5ec70 | [
"MIT"
] | 72 | 2018-06-28T06:21:46.000Z | 2018-07-26T11:06:42.000Z | soryeongk/180723_crawling_dataframe.md | jiyoung-choi/TWL | 925ac475c88bdb82eb6235b62e8195edd9d5ec70 | [
"MIT"
] | 46 | 2018-06-28T06:08:07.000Z | 2018-09-06T07:56:55.000Z |
# Extracting the number, category, title, and content from the petition page
Extract the number, category, title, and participant count from the petition list at the bottom of the first page of the national petition site, and put them in a DataFrame.
```python
import pandas as pd
```
```python
from urllib import request
from bs4 import BeautifulSoup
```
```python
url = "https://www1.president.go.kr/petitions?page=1"
with request.urlopen(url) as f:
html = f.read().decode('utf-8')
bs = BeautifulSoup(html, 'html5lib')
```
## 1. Let's extract the numbers
```python
content_elements = bs.select('div.bl_no')
contents_no = [c.text for c in content_elements]
contents_no
```
['번호',
'번호 답변대기',
'번호 답변대기',
'번호 답변대기',
'번호 답변대기',
'번호 답변대기',
'번호 답변대기',
'번호 답변대기',
'번호',
'번호 243985',
'번호 243984',
'번호 243983',
'번호 243982',
'번호 243981',
'번호 243980',
'번호 243979',
'번호 243978',
'번호 243977',
'번호 243976',
'번호 243975',
'번호 243974',
'번호 243973',
'번호 243972',
'번호 243971']
The words "번호" (number) and "답변대기" (awaiting answer) aren't needed, so we have to say bye-bye to them.
The replace approach started from Jinyeong's question..
I didn't know which one you'd like, so I prepared them all, hehe.
Writing the calls one by one is tedious, and since my function removes only a single element, it gets called every time, so it might not be efficient..?
Still, it has the advantage of being reusable, hmm.
Shall we measure it with %time?
### \#1 Using a function I wrote
```python
def remove_e(data, element):
temp = [d.replace(element, '') if element in d else d for d in data]
result = [t for t in temp if t!='']
return result
```
```python
%time
result1 = remove_e(contents_no, '번호')
result1 = remove_e(result1, ' ')
result1 = remove_e(result1, '답변대기')
result1
```
CPU times: user 4 µs, sys: 2 µs, total: 6 µs
Wall time: 11 µs
['243985',
'243984',
'243983',
'243982',
'243981',
'243980',
'243979',
'243978',
'243977',
'243976',
'243975',
'243974',
'243973',
'243972',
'243971']
### \#2 Removing '번호' and spaces from entries that don't contain '답변대기'
```python
%time
result2 = [
t.replace('번호', '').replace(' ','')
for t in contents_no
if '답변대기' not in t
if '번호' in t
if t.replace('번호', '').replace(' ','') != ''
]
result2
```
CPU times: user 4 µs, sys: 1e+03 ns, total: 5 µs
Wall time: 11.2 µs
['243985',
'243984',
'243983',
'243982',
'243981',
'243980',
'243979',
'243978',
'243977',
'243976',
'243975',
'243974',
'243973',
'243972',
'243971']
### \#3 Removing '번호' and spaces from entries that don't contain '답변대기', take 2
```python
%time
result3 = []
for t in contents_no:
if '답변대기' not in t:
if '번호' in t:
t = t.replace('번호', '')
if t != '':
result3.append(t.replace(' ', ''))
result3
```
CPU times: user 3 µs, sys: 2 µs, total: 5 µs
Wall time: 11 µs
['243985',
'243984',
'243983',
'243982',
'243981',
'243980',
'243979',
'243978',
'243977',
'243976',
'243975',
'243974',
'243973',
'243972',
'243971']
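A side note on the timing: `%time` on a line by itself only times that empty statement, so the numbers above don't actually compare the three approaches. A minimal sketch of a fair comparison with the standard `timeit` module (the `remove_e` helper and the sample data are reproduced here so the snippet is self-contained):

```python
import timeit

def remove_e(data, element):
    # same helper as above: strip `element` from every string, drop empties
    temp = [d.replace(element, '') if element in d else d for d in data]
    return [t for t in temp if t != '']

sample = ['번호 243985', '번호 답변대기', '번호'] * 1000

def approach_1():  # chained remove_e calls
    r = remove_e(sample, '번호')
    r = remove_e(r, ' ')
    return remove_e(r, '답변대기')

def approach_2():  # single list comprehension
    return [t.replace('번호', '').replace(' ', '')
            for t in sample
            if '답변대기' not in t
            if '번호' in t
            if t.replace('번호', '').replace(' ', '') != '']

print('approach 1:', timeit.timeit(approach_1, number=100))
print('approach 2:', timeit.timeit(approach_2, number=100))
```

`timeit` calls each function many times and reports the total, which gives a much more reliable comparison than a single `%time` line.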
## 2. Let's extract the categories
```python
content_elements = bs.select('div.bl_category')
contents_categ = [c.text for c in content_elements]
contents_categ
```
['분류',
'기타',
'외교/통일/국방',
'기타',
'반려동물',
'외교/통일/국방',
'문화/예술/체육/언론',
'반려동물',
'분류',
'분류 미래',
'분류 안전/환경',
'분류 일자리',
'분류 보건복지',
'분류 일자리',
'분류 일자리',
'분류 정치개혁',
'분류 교통/건축/국토',
'분류 인권/성평등',
'분류 정치개혁',
'분류 인권/성평등',
'분류 성장동력',
'분류 기타',
'분류 육아/교육',
'분류 정치개혁']
Looking closely, the first 9 entries are something else, so let's ignore them!
I use the same method as for extracting the numbers,
though I'm not sure which is more efficient between the function and the chained replace(?).
```python
result_categ = remove_e(contents_categ[8:],'분류')
result_categ = remove_e(result_categ, ' ')
result_categ
```
['미래',
'안전/환경',
'일자리',
'보건복지',
'일자리',
'일자리',
'정치개혁',
'교통/건축/국토',
'인권/성평등',
'정치개혁',
'인권/성평등',
'성장동력',
'기타',
'육아/교육',
'정치개혁']
Eek.... there are duplicates....
We don't actually need to, but let's remove the duplicates anyway.
```python
final = []
# final = [e for e in result_categ if e not in final]
for e in result_categ:
if e not in final:
final.append(e)
# I'd like this shorter and more efficient.. so duplicates never appear in the first place..!
final
```
['기타',
'외교/통일/국방',
'반려동물',
'문화/예술/체육/언론',
'미래',
'안전/환경',
'일자리',
'보건복지',
'정치개혁',
'교통/건축/국토',
'인권/성평등',
'성장동력',
'육아/교육']
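For the shorter, "no duplicates from the start" version asked about above, `dict.fromkeys` is a common one-liner: dictionary keys are unique and (on Python 3.7+) keep insertion order. A sketch with a small made-up category list:

```python
result_categ = ['기타', '일자리', '기타', '정치개혁', '일자리']  # example input

# dict keys are unique and preserve insertion order, so this dedups in one pass
final = list(dict.fromkeys(result_categ))
print(final)  # → ['기타', '일자리', '정치개혁']
```

This avoids the O(n) `in` membership test on a growing list that the loop version performs.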
## 3. Let's extract the titles
```python
content_elements = bs.select('div.bl_subject')
contents_subj = [c.text for c in content_elements]
contents_subj
```
['제목',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 문재인 대통령님께 청원합니다.',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 제주도 불법 난민 신청 문제에 따른 난민법, 무사증 입국, 난민신청허가 폐지/개헌 청원합니다.',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 가해자들은 떳떳이 생활하고, 집단 성폭행 당한 피해자인 저희아이는 오히려 더 죄인같이 생활하고 있습니다. 미성년자 성폭행범 처벌을 더 강하하여 주세요.',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 개.고양이 식용종식 전동연(개를 가축에서 제외하라)',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 남편선교사가 안티폴로감옥에 있습니다.(필리핀)',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 디스패치 폐간을 요청합니다.',
'\n\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t\t\t\t제목 표창원 의원의 개,고양이 도살 금지 법안을 통과 시켜주세요!',
'제목',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 네이버 블로그는 짝퉁 유통의 숙주\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 비흡연자를위한 대책\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 롯데그룹 총수 신동빈 회장을 제발 불구속 재판받게 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 어린이집 유치원 실내온도 냉/난방 규제 온도를 마련 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 연간 3억실의 공실이 발생하는 숙박업의 현실을 반영 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 특수고용직 형태이지만, 상당수가 불법인 가맹본부를 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 노회찬 주검을 보면서 단언 합니다..\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 지하철 노약자석 없애주세요.\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 국가와 경찰은 일베에 할머니 나체 사진을 무단 유포 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 이재명,은수미 조사철저\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 국제선 계류장 항공기소음 및 항공기 매연부터 개선시 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 대한민국 증시 코스닥 코스피 이대로 괜찮아보이시나 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 서민죽이는 신용보증재단\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 교과과정에서의 학생들의 예체능 교육 확대에 대한 청 ...\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t제목 이재명지사 의혹 해소\n\t\t\t\t\t\t\t\t\t\t\t']
```python
result_subj = remove_e(contents_subj[8:],'\n')
result_subj = remove_e(result_subj, '\t')
result_subj = remove_e(result_subj, '제목')
result_subj = remove_e(result_subj, ' ')
result_subj
```
['네이버블로그는짝퉁유통의숙주',
'비흡연자를위한대책',
'롯데그룹총수신동빈회장을제발불구속재판받게...',
'어린이집유치원실내온도냉/난방규제온도를마련...',
'연간3억실의공실이발생하는숙박업의현실을반영...',
'특수고용직형태이지만,상당수가불법인가맹본부를...',
'노회찬주검을보면서단언합니다..',
'지하철노약자석없애주세요.',
'국가와경찰은일베에할머니나체사진을무단유포...',
'이재명,은수미조사철저',
'국제선계류장항공기소음및항공기매연부터개선시...',
'대한민국증시코스닥코스피이대로괜찮아보이시나...',
'서민죽이는신용보증재단',
'교과과정에서의학생들의예체능교육확대에대한청...',
'이재명지사의혹해소']
## 4. Let's extract the participant counts
```python
content_elements = bs.select('div.bl_agree')
contents_votes = [c.text for c in content_elements]
contents_votes
```
['참여인원',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 224,539명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 714,875명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 342,647명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 214,634명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 207,275명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 207,742명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 209,571명\n\t\t\t\t\t\t\t\t\t\t\t',
'참여인원',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 1명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 3명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 2명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 2명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 2명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 1명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 1명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 0명\n\t\t\t\t\t\t\t\t\t\t\t',
'\n\t\t\t\t\t\t\t\t\t\t\t\t참여인원 5명\n\t\t\t\t\t\t\t\t\t\t\t']
```python
result_votes = remove_e(contents_votes[8:],'\n')
result_votes = remove_e(result_votes, '\t')
result_votes = remove_e(result_votes, '참여인원')
result_votes = remove_e(result_votes, ' ')
result_votes = remove_e(result_votes, '명')
result_votes = remove_e(result_votes, ',') # anticipating conversion to numbers later,, heh
result_votes
```
['0', '0', '0', '1', '3', '2', '0', '2', '0', '2', '1', '1', '0', '0', '5']
Put it in a DataFrame to make it pretty!
d = {'col1': [1, 2], 'col2': [3, 4]}
>>> df = pd.DataFrame(data=d)
```python
d = {'번호':result1, '분류':result_categ, '제목':result_subj, '참여인원':result_votes}
df = pd.DataFrame(data=d)
df
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>번호</th>
<th>분류</th>
<th>제목</th>
<th>참여인원</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>243985</td>
<td>미래</td>
<td>네이버블로그는짝퉁유통의숙주</td>
<td>0</td>
</tr>
<tr>
<th>1</th>
<td>243984</td>
<td>안전/환경</td>
<td>비흡연자를위한대책</td>
<td>0</td>
</tr>
<tr>
<th>2</th>
<td>243983</td>
<td>일자리</td>
<td>롯데그룹총수신동빈회장을제발불구속재판받게...</td>
<td>0</td>
</tr>
<tr>
<th>3</th>
<td>243982</td>
<td>보건복지</td>
<td>어린이집유치원실내온도냉/난방규제온도를마련...</td>
<td>1</td>
</tr>
<tr>
<th>4</th>
<td>243981</td>
<td>일자리</td>
<td>연간3억실의공실이발생하는숙박업의현실을반영...</td>
<td>3</td>
</tr>
<tr>
<th>5</th>
<td>243980</td>
<td>일자리</td>
<td>특수고용직형태이지만,상당수가불법인가맹본부를...</td>
<td>2</td>
</tr>
<tr>
<th>6</th>
<td>243979</td>
<td>정치개혁</td>
<td>노회찬주검을보면서단언합니다..</td>
<td>0</td>
</tr>
<tr>
<th>7</th>
<td>243978</td>
<td>교통/건축/국토</td>
<td>지하철노약자석없애주세요.</td>
<td>2</td>
</tr>
<tr>
<th>8</th>
<td>243977</td>
<td>인권/성평등</td>
<td>국가와경찰은일베에할머니나체사진을무단유포...</td>
<td>0</td>
</tr>
<tr>
<th>9</th>
<td>243976</td>
<td>정치개혁</td>
<td>이재명,은수미조사철저</td>
<td>2</td>
</tr>
<tr>
<th>10</th>
<td>243975</td>
<td>인권/성평등</td>
<td>국제선계류장항공기소음및항공기매연부터개선시...</td>
<td>1</td>
</tr>
<tr>
<th>11</th>
<td>243974</td>
<td>성장동력</td>
<td>대한민국증시코스닥코스피이대로괜찮아보이시나...</td>
<td>1</td>
</tr>
<tr>
<th>12</th>
<td>243973</td>
<td>기타</td>
<td>서민죽이는신용보증재단</td>
<td>0</td>
</tr>
<tr>
<th>13</th>
<td>243972</td>
<td>육아/교육</td>
<td>교과과정에서의학생들의예체능교육확대에대한청...</td>
<td>0</td>
</tr>
<tr>
<th>14</th>
<td>243971</td>
<td>정치개혁</td>
<td>이재명지사의혹해소</td>
<td>5</td>
</tr>
</tbody>
</table>
</div>
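One thing worth noting: every column in this DataFrame is still a string, including the participant count. If the counts will be sorted or summed later, a sketch of the conversion (using a tiny stand-in for the real data, with the same column names):

```python
import pandas as pd

d = {'번호': ['243985', '243984'], '참여인원': ['0', '12']}  # small stand-in for the real df
df = pd.DataFrame(data=d)

# convert both numeric columns from str to int in one call
df = df.astype({'번호': int, '참여인원': int})
print(df['참여인원'].sum())  # → 12
```

With string columns, `sum()` would concatenate ('012') and sorting would be lexicographic, so the cast matters.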
Yay, pretty!
Come to think of it, it feels like I repeated the same work a lot..!
So I'm going to try bundling it into a function.
## Doing it all in one go! One shot, one kill!
```python
what_to_know = ['no', 'category', 'subject', 'agree'] # number, category, title, participant count respectively
address = 'div.bl_'+what_to_know[0]
what_to_remove = ['번호 ', '제목 ', '참여인원 ', '명', ',', '\n', '\t', '분류 ']
df_dict = {}
for i in range(len(what_to_know)):
address = 'div.bl_' + what_to_know[i]
contents = [c.text for c in bs.select(address)]
for e in what_to_remove:
contents = remove_e(contents, e)
df_dict[what_to_know[i]] = contents[9:]
```
```python
df = pd.DataFrame(data=df_dict)
df
```
<div>
<style scoped>
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>agree</th>
<th>category</th>
<th>no</th>
<th>subject</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>0</td>
<td>정치개혁</td>
<td>244029</td>
<td>정치인과 정치깡패?</td>
</tr>
<tr>
<th>1</th>
<td>0</td>
<td>정치개혁</td>
<td>244028</td>
<td>토착비리가 만연한 지자체의 범죄 이대로 묵과 할것인 ...</td>
</tr>
<tr>
<th>2</th>
<td>1</td>
<td>기타</td>
<td>244027</td>
<td>정치인들 얼마만큼 깨끗한지 전수조사 부탁합니다</td>
</tr>
<tr>
<th>3</th>
<td>1</td>
<td>경제민주화</td>
<td>244026</td>
<td>김재익 같으신 분을 뽑으세요</td>
</tr>
<tr>
<th>4</th>
<td>1</td>
<td>정치개혁</td>
<td>244025</td>
<td>대통령 문재인이 대통령직에서 물러나고 법의 심판을 ...</td>
</tr>
<tr>
<th>5</th>
<td>2</td>
<td>안전/환경</td>
<td>244024</td>
<td>지구온난화 대비</td>
</tr>
<tr>
<th>6</th>
<td>5</td>
<td>정치개혁</td>
<td>244023</td>
<td>노회찬 의원님 타살이 의심 할 대목이다.</td>
</tr>
<tr>
<th>7</th>
<td>16</td>
<td>정치개혁</td>
<td>244022</td>
<td>이재 상세히수사촉구 강력히 해야합니다</td>
</tr>
<tr>
<th>8</th>
<td>1</td>
<td>정치개혁</td>
<td>244021</td>
<td>이번기회에국회의원들 다조사하세요</td>
</tr>
<tr>
<th>9</th>
<td>0</td>
<td>행정</td>
<td>244020</td>
<td>배달대행업체의 세금계산서 의무 발행 요청드려요</td>
</tr>
<tr>
<th>10</th>
<td>1</td>
<td>경제민주화</td>
<td>244019</td>
<td>모든 공공분야는 국가에서 관리해야 합니다.~~^^</td>
</tr>
<tr>
<th>11</th>
<td>1</td>
<td>기타</td>
<td>244018</td>
<td>개인회생변제기간단축제도를전국에서시행해주시길부탁 ...</td>
</tr>
<tr>
<th>12</th>
<td>7</td>
<td>성장동력</td>
<td>244017</td>
<td>원자력을 없애면 나라가 망한다 청원제안</td>
</tr>
<tr>
<th>13</th>
<td>2</td>
<td>외교/통일/국방</td>
<td>244016</td>
<td>[노무현 전 대통령과 노회찬 의원의 죽음과 이박과 ...</td>
</tr>
<tr>
<th>14</th>
<td>3</td>
<td>정치개혁</td>
<td>244015</td>
<td>드루킹 수사 더 철저하게 수사할것을 청원합니다</td>
</tr>
</tbody>
</table>
</div>
# Extracting the contents from 10 pages!
The first page of the national petition list is at https://www1.president.go.kr/petitions?page=1.
The second page is at https://www1.president.go.kr/petitions?page=2.
Visit the first 10 pages in order, extract the number, category, title, and participant count, and put the results in a DataFrame.
```python
what_to_know = ['no', 'category', 'subject', 'agree'] # number, category, title, participant count
address = 'div.bl_'+what_to_know[0]
what_to_remove = ['번호 ', '제목 ', '참여인원 ', '명', ',', '\n', '\t', '분류 ']
# strings we don't need, collected in a list so they can be stripped out

df_dict = {}
count = 10 # how many pages do we want to fetch?
for page in range(1, count+1):  # renamed from `i` so the inner loop doesn't clobber it
    url = 'https://www1.president.go.kr/petitions?page='+str(page)
    # the 10 page URLs differ only in the final number
    with request.urlopen(url) as f:
        html = f.read().decode('utf-8')
        bs = BeautifulSoup(html, 'html5lib')
    
    for i in range(len(what_to_know)):
        address = 'div.bl_' + what_to_know[i]
        contents = [c.text for c in bs.select(address)]
        for e in what_to_remove:
            contents = remove_e(contents, e)
        df_dict[what_to_know[i]] = contents[9:]
    
    df = pd.DataFrame(data=df_dict)
    print(df)
    
    if page < count:
        print('\n')
```
agree category no subject
0 0 외교/통일/국방 244067 북한의 핵무기 폐기 실패했습니다 그 책임을 문재인 ...
1 0 정치개혁 244066 드루킹비밀조직들이 댓글조직들 불법사찰조직들 여론 ...
2 6 정치개혁 244065 이재 지사 사퇴
3 4 정치개혁 244064 이재 탄원
4 0 성장동력 244063 퇴근시 간판전원 끄기 운동합시다
5 1 경제민주화 244062 현 경제상황의 점검이 필요합니다. 52시간근로 최저 ...
6 7 정치개혁 244061 김경수 드루킹도 조사
7 2 인권/성평등 244060 문재인정부는 난민 문제 왜 넉놓고 있는건가?
8 0 정치개혁 244059 댓글조직들 불법사찰조직들 여론몰이조직들이 드루킹 ...
9 0 정치개혁 244058 국회원 특할비도 철저한 수사와 처벌을 하세요.
10 3 안전/환경 244057 촛불로 난민을 몰아냅시다
11 1 정치개혁 244056 네이트에 문대통령 비방하는 댓글이 조직적으로 운 ...
12 1 기타 244055 위안부 문제는 어떻게 되는건가요
13 22 기타 244054 이재은 사퇴하라
14 2 성장동력 244053 코스닥 활성화 정책 어떻게 된건가요?
agree category no subject
0 0 기타 244052 원전풀가동 안하냐...
1 2 정치개혁 244051 문재인 대통령님 제발 착한아이 콤플렉스로부터 벗어 ...
2 4 정치개혁 244050 드루킹댓글조작. 특검 팀의 특검법 위반 여부를. 조사 ...
3 1 경제민주화 244049 소상공인들의 순이익에 따른 최저임금 등급매기기
4 1 행정 244048 국민을 심판하는 판사올바른 판사는 국민이 뽑는 시 ...
5 49 정치개혁 244047 이재 은수미 사퇴
6 2 육아/교육 244046 경찰대학 입학 갑작스런 인원축소 부당합니다.
7 0 보건복지 244045 정신병자를 입원시켜주세요
8 0 농산어촌 244044 양식장 폐사하는 물고기 살려줍시다
9 5 행정 244043 고 노회찬의원 무궁화 대훈장 추서바랍니다
10 0 보건복지 244042 정신병자를 입원시켜주세요
11 14 정치개혁 244041 이재투하게 발켜주라
12 2 안전/환경 244040 건설근로자들 쉬게 해주세요
13 28 인권/성평등 244039 이재경기도지사에 대한 철저한 수사를 촉구합니다.
14 0 안전/환경 244038 기무사문건 작성자 청산
agree category no subject
0 5 정치개혁 244037 소상공인 채무 탕감 반대합니다.(성실히 갚은 사람들 ...
1 0 안전/환경 244036 건설근로자들 쉬게 해주세요
2 3 일자리 244035 영세 자영업자를 위한 최소한도의 보장인 노란우산공 ...
3 0 보건복지 244034 일베 박카스 할머니 나체 유포하고 조롱한 사건은 처 ...
4 3 일자리 244033 비정규직을개처럼 취급하는 IBK기업은행! 직접고용하 ...
5 50 정치개혁 244032 이재 경기지사 관련 엄중한 수사촉구!!!
6 5 정치개혁 244031 와 노회찬 투신하고 그알 이재 조폭하고 박전 기무 ...
7 30 정치개혁 244030 이재도지사 사퇴요구합니다
8 5 정치개혁 244029 정치인과 정치깡패?
9 1 정치개혁 244028 토착비리가 만연한 지자체의 범죄 이대로 묵과 할것인 ...
10 2 기타 244027 정치인들 얼마만큼 깨끗한지 전수조사 부탁합니다
11 1 경제민주화 244026 김재익 같으신 분을 뽑으세요
12 10 정치개혁 244025 대통령 문재인이 대통령직에서 물러나고 법의 심판을 ...
13 2 안전/환경 244024 지구온난화 대비
14 9 정치개혁 244023 노회찬 의원님 타살이 의심 할 대목이다.
agree category no subject
0 9 정치개혁 244023 노회찬 의원님 타살이 의심 할 대목이다.
1 60 정치개혁 244022 이재 상세히수사촉구 강력히 해야합니다
2 2 정치개혁 244021 이번기회에국회의원들 다조사하세요
3 0 행정 244020 배달대행업체의 세금계산서 의무 발행 요청드려요
4 1 경제민주화 244019 모든 공공분야는 국가에서 관리해야 합니다.~~^^
5 2 기타 244018 개인회생변제기간단축제도를전국에서시행해주시길부탁 ...
6 8 성장동력 244017 원자력을 없애면 나라가 망한다 청원제안
7 3 외교/통일/국방 244016 [노무현 전 대통령과 노회찬 의원의 죽음과 이박과 ...
8 5 정치개혁 244015 드루킹 수사 더 철저하게 수사할것을 청원합니다
9 1 기타 244014 국민청원을 없애주세요
10 1 정치개혁 244013 노회찬 대표님 문상
11 0 경제민주화 244012 현금영수증 제도를 활용하여 소상공인 활성화를 도모 ...
12 0 인권/성평등 244011 카카오톡의 영구정지 및 구매한 것에 대한 효력 무효 ...
13 0 일자리 244010 소상공인 죽이는 쇼셜앱 수수료율 제재 청원합니다.
14 1 일자리 244009 파격적공개공모일자리창출단기간확정프로적트
agree category no subject
0 48 정치개혁 244008 이재 정확한 수사하길
1 5 성장동력 244007 드루킹 일당은 더 있습니다.
2 3 경제민주화 244006 연대보증이 폐지됫다지만 아직도 고통받는 사람이 많 ...
3 1 기타 244005 법치주의를 우롱하는 이런짓
4 2 기타 244004 성인오락실
5 11 행정 244003 부천오정경찰서 실종팀의 무관심한 수사로 주말 폭염 ...
6 3 인권/성평등 244002 소방관 경찰관등 체력시험을 남녀모두 같게해주세요
7 8 기타 244001 문재인 그냥 사퇴하세요.
8 4 정치개혁 244000 노회찬 의원님 국립묘지로 보내 주세요
9 0 정치개혁 243999 우리나라에 보수비밀조직들에 비밀단체 비밀조직들이 ...
10 3 외교/통일/국방 243998 참전국가유공자수당을 최저생계비 이상으로 인상을 청 ...
11 32 정치개혁 243997 이재은수미 처벌하라
12 1 정치개혁 243996 드루킹 수사 국민들이 납득 할수 있는 결과로 마무리 ...
13 8 육아/교육 243995 여름방학 기간을 늘려주십시오
14 18 농산어촌 243994 세계 일류 고기능성 과일나무가 죽어가고 있습니다.
agree category no subject
0 9 정치개혁 243993 드루킹을 민주사회의 적폐로 엄벌에 처해주세요.
1 1 기타 243992 대한항공의 기업이름과 기업로고인 태극문양 사용 중 ...
2 0 기타 243991 검찰 인력 늘려서 수사의 질을 높여주세요
3 0 정치개혁 243990 익에 기대서 인터넷에서 범죄를 저지르고 있는 사람 ...
4 8 경제민주화 243989 빚 탕감 소각정책을 취소해주세요. 대국민 공감에 어 ...
5 129 정치개혁 243988 이제 은수미 조폭관련수사
6 74 정치개혁 243987 이재전 성남시장과 조폭연루설 확실한 조사를 바랍 ...
7 0 육아/교육 243986 살인적인 폭염에 적절한 조치 바랍니다.(폭염 경보시 ...
8 4 경제민주화 243985 2019 최저시급 인상에 파급 효과
9 14 교통/건축/국토 243984 건축법 개정을 간곡히 부탁드립니다.
10 56 정치개혁 243983 이재 도지사 조사를 위해 청원 링크 띄워요 한곳에 ...
11 1 미래 243982 네이버 블로그는 짝퉁 유통의 숙주
12 0 안전/환경 243981 비흡연자를위한 대책
13 2 일자리 243980 롯데그룹 총수 신동빈 회장을 제발 불구속 재판받게 ...
14 11 보건복지 243979 어린이집 유치원 실내온도 냉/난방 규제 온도를 마련 ...
agree category no subject
0 5 일자리 243978 연간 3억실의 공실이 발생하는 숙박업의 현실을 반영 ...
1 1 일자리 243977 특수고용직 형태이지만 상당수가 불법인 가맹본부를 ...
2 4 교통/건축/국토 243976 지하철 노약자석 없애주세요.
3 1691 인권/성평등 243975 국가와 경찰은 일베에 할머니 나체 사진을 무단 유포 ...
4 68 정치개혁 243974 이재은수미 조사철저
5 1 인권/성평등 243973 국제선 계류장 항공기소음 및 항공기 매연부터 개선시 ...
6 2 성장동력 243972 대한민국 증시 코스닥 코스피 이대로 괜찮아보이시나 ...
7 2 기타 243971 서민죽이는 신용보증재단
8 5 육아/교육 243970 교과과정에서의 학생들의 예체능 교육 확대에 대한 청 ...
9 35 정치개혁 243969 이재지사 의혹 해소
10 16 정치개혁 243968 이재 죽이기의 배후를 밝혀주십시요
11 0 경제민주화 243967 골목 상권 활성화 방안
12 8 인권/성평등 243966 여성만생각하는 여성들의 페미짓을 그만두게 처벌을 ...
13 6 행정 243965 조용하고 한적한곳에 경찰들 주차하고 쉬고있는모습 ...
14 9 정치개혁 243964 몰래 카메라 범죄 앞에 눈감은 정의의 여신 누구를 ...
agree category no subject
0 4 정치개혁 243963 특별사면 제도 자체를 없애야합니다...
1 2 일자리 243962 한국산업인력공단 국가기술자격시험 응시제도 개선(특 ...
2 1 보건복지 243961 미소금융
3 12 일자리 243960 용역에서 자회사로 글자만 바꾼 처우개선 IBK기업은 ...
4 8 미래 243959 오늘만 생각하는 정치 그만해주세요... 제발 부탁드립 ...
5 2 안전/환경 243958 차량2대이상세금관련
6 3 일자리 243957 Lg전자 불법도급 및 갑질
7 14 성장동력 243956 문재인 대통령 하야를 청원합니다.
8 1 외교/통일/국방 243955 한번만 도와주십시오 우리나라 군 세무조사 (방산비리 ...
9 1 미래 243954 네이버 블로그에서 짝퉁판매자들 단속좀 해주세요
10 9 인권/성평등 243953 여성가족부를 없에주세요
11 4 경제민주화 243952 영세 하청업을 보살펴 주세요
12 0 행정 243951 공무원들의 예산집행 제대로 이행되는지 감찰좀 ...
13 10 인권/성평등 243950 여성가족부 폐지
14 17 미래 243949 문재인대통령을 탄핵을 청원합니다.
agree category no subject
0 0 정치개혁 243948 알맹이 없는 강정 글거 부스럼
1 2 정치개혁 243947 Ibk기업은행 비정규직 실체
2 6 외교/통일/국방 243946 군대기피 목적으로 어린나이에 해외 나가서 영주권을 ...
3 8 성장동력 243945 탈원전 정책 지금이라도 인정하세요
4 2 경제민주화 243944 존경하는 새 시대의 새 대통령님 께 아룁니다.
5 1 농산어촌 243943 중력을 이용하는 농법으로 대한민국 농업인들에게 새 ...
6 4 일자리 243942 IBK국책은행 기업은행 진실혹은 거짓!!!
7 1 행정 243941 건축과/보육과/소방과/부서별 서로 다른 업무진행으로 ...
8 7 기타 243940 김상조 공정거래위원장님 하림지주 빨리 수사 종결해 ...
9 17 기타 243939 누진제폐기
10 2 기타 243938 특검 수사 제대로 하지 않는다면 두고 두고 원망만 쌓 ...
11 2 정치개혁 243937 조폭 동원해 급여 안주는 정치인이 4월혁 회장인 나 ...
12 5 보건복지 243936 공휴일 및 절을 연차로 다 공제해서 여름휴가로 쓸 ...
13 12 미래 243935 문재인대통령을 탄핵하며더불어 민주당을탄핵합니다
14 28 보건복지 243934 살인더위 피할 수 있게 한시적 전기 누진세 폐지해 주 ...
agree category no subject
0 1 성장동력 243933 소상공인 대출건 문의
1 12 정치개혁 243932 한국계외국인들 동족취급하는 적폐들 폐지해주세요
2 1 미래 243931 만나고싶습니다
3 1 보건복지 243930 국민건강보험금 산정문제를 제기합니다.
4 5 기타 243929 자영업자들 살려주세요
5 5 기타 243928 대통령님과 대화를 원합니다.
6 3 경제민주화 243927 주식 전일 종가를 다음 날 무조건 시초가로 출발하는 ...
7 1 육아/교육 243926 박근혜 치적 연금개혁을 인정하라.
8 35 인권/성평등 243925 난민 절대 반대
9 0 미래 243924 남극 북극 얼음이 녹아가는 현실에 대체할게없나요
10 5 보건복지 243923 대통령 각하께 담배값 인상을 청원드립니다.
11 142 정치개혁 243922 이재 은수미 청원 한곳으로 모아주세요. 꼭 없어져 ...
12 11 일자리 243921 "제조업 한다는 자부심 다 잃었다 이 땅서 더는 못버 ...
13 4 안전/환경 243920 응급실폭행범 엄중 처벌 바랍니다.
14 57 기타 243919 이재경기지사 관련
# Extracting the title and body of 100 random petitions!
As of July 17, 2018, petition numbers run from 1 to 238663.
However, some petitions in between have been deleted, so the actual number of petitions is less than 238663.
The URL of the body of petition no. 309510 is https://www1.president.go.kr/petitions/309510.
By changing only the number in the URL, you can navigate to the body of any petition.
Using this URL pattern, extract the title and body of 100 petitions chosen at random from all petitions, and put them in a DataFrame.
```python
import random
```
Can we find the actual number of petitions?
[] Put 100 random numbers in a list (no duplicates).
[] Pull the random numbers from the list one by one and substitute them into the URL.
[] Extract petitions one at a time this way and build a DataFrame.
For a start, the numbers go up to 244075, but some petitions have been deleted.
Each page shows up to 15 entries, so if (last number - first number) + 1 is less than 15, keep subtracting the shortfall from the total (244075).
For example, if last - first + 1 is 13, two petitions were deleted, so store 244075 - 2 in total and check the next page.
We'd have to check how many pages there are anyway.
Then let's just find the number of pages, multiply the number of pages excluding the last one by 15, and add the number of entries on the last page.
How do we know a page is the last one? It would have fewer than 15 entries, right?
Could the last page have exactly 15? Yes, we have to account for that too.
< Let's try this >
1. Fetch the most recent number from the first page and take the quotient when dividing by 15.
2. Assuming not too many petitions were deleted, start from the page numbered by that quotient.
3. If many petitions have been deleted, that page may not exist; in that case, start 10 pages earlier.
```python
with request.urlopen('https://www1.president.go.kr/petitions?page=1') as f:
html = f.read().decode('utf-8')
bs = BeautifulSoup(html, 'html5lib')
recent_num = [c.text for c in bs.select('div.bl_no')][9]
recent_num = int(recent_num.replace('번호 ',''))
recent_num
```
244115
```python
url_num = recent_num // 15
while True:
url = 'https://www1.president.go.kr/petitions?page='+str(url_num)
with request.urlopen(url) as f:
html = f.read().decode('utf-8')
bs = BeautifulSoup(html, 'html5lib')
plz_total = len([c.text for c in bs.select('div.bl_no')][9:])
if plz_total != 15:
total = 15 * (url_num-1) + plz_total
break
url_num += 1
total
```
244116
```python
url = 'https://www1.president.go.kr/petitions/50'
with request.urlopen(url) as f:
html = f.read().decode('utf-8')
bs = BeautifulSoup(html, 'html5lib')
contents = [c.text for c in bs.select('div.View_write')]
contents
```
['\n\t\t\t\t\t\t\t\t\t\t\t안녕하십니까? 대통령님 간단하게 제소개 부터 하겠습니다. 저는 치위생과 3학년 학생입니다. 다름이 아니라 너무 억울한 \n\t\t\t\t\t\t\t\t\t\t\t일이 생겨 이렇게 글을 적어봅니다. 저희 3학년들은 올해 아주 중요한 시험이 있었습니다.3년동안 힘들게 고생하여 이제곧 \n\t\t\t\t\t\t\t\t\t\t\t끝난다.조금만 힘을내자 라는 생각으로 견뎌내고 있었는데 갑자기 어제 과 단톡방의 이내용이 사실이냐면서 한장의 사진을 \n\t\t\t\t\t\t\t\t\t\t\t보내주었습니다. 그사진은 국시원에서 올린 공지글을 캡쳐한 사진이였습니다.국시원에서 국가고시 날짜가 변경 되었다. \n\t\t\t\t\t\t\t\t\t\t\t그러니 이글을 널리퍼트려줘라라는 의미를 가진 내용이였습니다. 그내용을 읽고 너무 황당하고 어이가 없어서 국시원쪽으로 전화를 해보니 자신들은 어쩔수없었다 나라에서 그날 공무원 시험을 봐야한다 하루의 2번 국시를 볼수없다 \n\t\t\t\t\t\t\t\t\t\t\t그래서 자신들과 보건복지부에서 다시 날짜를 정했는데 그날이 내년 18년 1월 5일이다 라는것입니다. \n\t\t\t\t\t\t\t\t\t\t\t그 말을 듣고 더욱 이해할수가 없었습니다. 먼저 그날 시험을 보기로한 국시생은 저희였는데 갑자기 아무런 말도 문자도 \n\t\t\t\t\t\t\t\t\t\t\t없고 그공지글도 그냥 통보였습니다. 국가에서보는 시험이 일반 초중고 수행평가도 아니고 갑자기 이런식으로 바뀐다는게 \n\t\t\t\t\t\t\t\t\t\t\t말이 됩니까? 고등학생들이 보는 모의고사,수능도 이런식으로 변경 하지않습니다. 통보로 날짜를 변경하는건 저희 \n\t\t\t\t\t\t\t\t\t\t\t에비 의료기사들을 무시하는걸로 보입니다. 저희학교 교수님들은 학생들에게 자부심을 가져라 치과위생사라는 직업을 아끼고 사랑하고 자부심을 가져도 되는 직업이니깐 당당 해져도 괜찮다고 말씀하셨습니다. 하지만 저는 그러지 못할꺼같습니다. 아마 저뿐만이아니라 마음 학생들도 자존심도 낮아지고 무시받는다고 생각했을것입니다. \n\t\t\t\t\t\t\t\t\t\t\t17년 12월16일날의 시험을 봐야하는 국시생들은 저희 예비 치과위생사들 입니다. 공무원 시험이 갑자기 잡혀서 봐야한다면 다른날로 정하는게 맞지않을까요? 먼저 그날의 시험을 보기로 한 학생들은 저희입니다 \n\t\t\t\t\t\t\t\t\t\t\t왜 갑자기 저희가 양보해야하고 피해를봐야합니까? 저희국시 앞으로 4달남은상태였고 지금 많은 학생들이 지쳐지만 \n\t\t\t\t\t\t\t\t\t\t\t12월만 참으면 된다 라는 생각으로 견딘 학생들에게는 1월에 본다는 통보는 너무 가혹합니다. \n\t\t\t\t\t\t\t\t\t\t\t제발 원래대로12월 16일날 국시를 볼수 있도록 도와주세요 제발 다시 정정 해주세요. \n\t\t\t\t\t\t\t\t\t\t\t부탁드립니다. \n\t\t\t\t\t\t\t\t\t\t\t꼭 끝까지 읽어보셨으면 좋겠습니다. \n\t\t\t\t\t\t\t\t\t\t']
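The body text comes back padded with runs of tabs and newlines. Rather than calling `replace` once per character, `str.split()` with no argument splits on any run of whitespace, so `' '.join(raw.split())` normalizes it in one step. A small sketch on a fragment of the output above:

```python
raw = '\n\t\t\t안녕하십니까?   대통령님 \n\t\t간단하게 제소개 부터 하겠습니다.\n'

clean = ' '.join(raw.split())  # collapse every run of whitespace to a single space
print(clean)  # → 안녕하십니까? 대통령님 간단하게 제소개 부터 하겠습니다.
```

This also trims the leading and trailing whitespace for free.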
```python
what_to_know = ['h3.petitionsView_title', 'div.View_write'] # selectors for the title and the body
address = 'div.bl_'+what_to_know[0]
what_to_remove = ['\n', '\t']
# strings we don't need, collected in a list so they can be stripped out

df_dict = {}
count = 100 # how many petitions do we want to fetch?
random_index = random.sample(range(1, total + 1), count) # 100 random numbers, no duplicates
print(random_index)
for i in range(count):
url = 'https://www1.president.go.kr/petitions/'+str(random_index[i])
with request.urlopen(url) as f:
html = f.read().decode('utf-8')
bs = BeautifulSoup(html, 'html5lib')
for n in range(len(what_to_know)):
address = what_to_know[n]
contents = [c.text for c in bs.select(address)]
for e in what_to_remove:
contents = remove_e(contents,e)
df_dict[what_to_know[n]] = contents
df = pd.DataFrame(data=df_dict)
print(df)
if i < count:
print('\n')
```
[227870, 160278, 35208, 21547, 36573, 88015, 7840, 219288, 151714, 42545, 170862, 86507, 190575, 68903, 239653, 42522, 5812, 235356, 177369, 228551, 222976, 150575, 136423, 215075, 2709, 223805, 4903, 20105, 23092, 199003, 66117, 105380, 21384, 139307, 45991, 89581, 42342, 180713, 25068, 237070, 168564, 171841, 191639, 188993, 122338, 80499, 214308, 25141, 83099, 147477, 99882, 211776, 37238, 218185, 174344, 65131, 85075, 52922, 9041, 76776, 183921, 200077, 148111, 199309, 151373, 41193, 139115, 233513, 139169, 140675, 80998, 13398, 226644, 18482, 175074, 141652, 46684, 234677, 54407, 53816, 39643, 68239, 36678, 210743, 221262, 131722, 82321, 140143, 109192, 55994, 5582, 9384, 181983, 149475, 139409, 153626, 146246, 239505, 60269, 107892]
---------------------------------------------------------------------------
HTTPError Traceback (most recent call last)
<ipython-input-294-35ec1f3dff9c> in <module>()
14 for i in range(count):
15 url = 'https://www1.president.go.kr/petitions/'+str(random_index[i])
---> 16 with request.urlopen(url) as f:
17 html = f.read().decode('utf-8')
18 bs = BeautifulSoup(html, 'html5lib')
/usr/lib/python3.6/urllib/request.py in urlopen(url, data, timeout, cafile, capath, cadefault, context)
221 else:
222 opener = _opener
--> 223 return opener.open(url, data, timeout)
224
225 def install_opener(opener):
/usr/lib/python3.6/urllib/request.py in open(self, fullurl, data, timeout)
530 for processor in self.process_response.get(protocol, []):
531 meth = getattr(processor, meth_name)
--> 532 response = meth(req, response)
533
534 return response
/usr/lib/python3.6/urllib/request.py in http_response(self, request, response)
640 if not (200 <= code < 300):
641 response = self.parent.error(
--> 642 'http', request, response, code, msg, hdrs)
643
644 return response
/usr/lib/python3.6/urllib/request.py in error(self, proto, *args)
568 if http_err:
569 args = (dict, 'default', 'http_error_default') + orig_args
--> 570 return self._call_chain(*args)
571
572 # XXX probably also want an abstract factory that knows when it makes
/usr/lib/python3.6/urllib/request.py in _call_chain(self, chain, kind, meth_name, *args)
502 for handler in handlers:
503 func = getattr(handler, meth_name)
--> 504 result = func(*args)
505 if result is not None:
506 return result
/usr/lib/python3.6/urllib/request.py in http_error_default(self, req, fp, code, msg, hdrs)
648 class HTTPDefaultErrorHandler(BaseHandler):
649 def http_error_default(self, req, fp, code, msg, hdrs):
--> 650 raise HTTPError(req.full_url, code, msg, hdrs, fp)
651
652 class HTTPRedirectHandler(BaseHandler):
HTTPError: HTTP Error 404: Not Found
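The traceback above is the deleted-petition case: a removed petition returns HTTP 404, which `urlopen` raises as an exception and kills the loop. A sketch of a fetch helper that skips deleted petitions instead of crashing (the function names and structure are my own, not from the notebook):

```python
from urllib import request
from urllib.error import HTTPError

def fetch_html(url):
    """Return the page HTML, or None if the petition was deleted (HTTP 404)."""
    try:
        with request.urlopen(url) as f:
            return f.read().decode('utf-8')
    except HTTPError as e:
        if e.code == 404:
            return None  # deleted petition: signal the caller to skip it
        raise            # any other HTTP error is unexpected, so re-raise

def collect(numbers, count):
    """Keep fetching until `count` petitions succeed, skipping 404s."""
    pages = {}
    for n in numbers:
        html = fetch_html('https://www1.president.go.kr/petitions/' + str(n))
        if html is not None:
            pages[n] = html
            if len(pages) == count:
                break
    return pages
```

With this, `random_index` can be drawn larger than needed (say 150 numbers) and `collect(random_index, 100)` stops once 100 live petitions have been gathered.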
| 28.708054 | 1,549 | 0.50941 | kor_Hang | 0.999366 |
97f6fb2037d9e4575a16ff5f76f8deb851508a21 | 598 | md | Markdown | README.md | elahw-zh25/douban-custom-lottery | 377ebbb8d5cc9abd424d981100a84fef5fe921b3 | [
"Apache-2.0"
] | 1 | 2021-11-27T07:48:20.000Z | 2021-11-27T07:48:20.000Z | README.md | elahw-zh25/douban-custom-lottery | 377ebbb8d5cc9abd424d981100a84fef5fe921b3 | [
"Apache-2.0"
] | null | null | null | README.md | elahw-zh25/douban-custom-lottery | 377ebbb8d5cc9abd424d981100a84fef5fe921b3 | [
"Apache-2.0"
] | 1 | 2021-06-02T17:29:46.000Z | 2021-06-02T17:29:46.000Z | # 豆瓣特殊机制手动开奖
用于帮助有特殊需求的小组抽奖机制,比如需要筛选掉一些主动放弃抽奖的用户。每抽选一位候选人,抽选者可以查看被抽中人的个人主页及本帖回复,以便人工筛选掉一些不符合抽奖规定的选手。
目前仅支持回复抽奖。一般的抽奖可以直接使用豆瓣抽奖小助手。
**申明:本代码只是用于帮助自动化一些复杂的人工流程,使用本代码进行其他不符合豆瓣社区规定的行为后果自负。**
使用方法:
```
python3 lottery.py TARGET_LINK NUMBER_OF_LOTTERY
```
The command to randomly draw 10 people from this group thread, https://www.douban.com/group/topic/228012134/ :
```
python3 lottery.py https://www.douban.com/group/topic/228012134/ 10
```
The output looks like this:

# Prerequisite
- python
- BeautifulSoup, requests
These can be installed with pip.
```
pip install bs4 requests
```
| 21.357143 | 87 | 0.765886 | yue_Hant | 0.883611 |
97f829c86949d3fbf206141836ef1d9fdd4e1537 | 1,533 | md | Markdown | README.md | mughees-Ilyas/Angular-test | c3dccb959d77cc08c0a240734f3a6642bdb63da7 | [
"MIT"
] | null | null | null | README.md | mughees-Ilyas/Angular-test | c3dccb959d77cc08c0a240734f3a6642bdb63da7 | [
"MIT"
] | 5 | 2021-09-02T02:23:18.000Z | 2022-03-02T06:22:06.000Z | README.md | mughees-Ilyas/Angular-test | c3dccb959d77cc08c0a240734f3a6642bdb63da7 | [
"MIT"
] | null | null | null | # Angular test app
Steps to run the project:
1) Clone the project.
2) Run npm install to install dependencies.
3) Run npm start.
## User credentials
For login, use:
username: test
password: test
or
username: admin
password: admin
Any newly created user will have authentication credentials as well. For a newly created user:
username: {first name}
password: {first name}
Since the backend is mocked, all newly created users will be deleted if you refresh the page. (However, the logged-in user's information is stored in local storage, so you will stay logged in; the new data just will not appear in the user list.)
## Routes
We have:
/login for login
/users for the user list view
/users/:id to view a specific user
/users/user/new to add a new user
## Project file setup
The pages folder has all the pages (login, user view, etc.)
The components folder has components that are either universal, such as the navbar, or reusable.
The _models folder has all the models needed for the project (currently only users).
The _helper folder has the auth guard that specifies which routes are protected, and an HTTP interceptor function to mock the backend.
The _Service folder has the project services (authentication and user service).
## Project explanation
We have a simple authentication model for login. Requests are intercepted by our mock backend, which is built using an HTTP interceptor.
After login there is a user view to list all users, a view to get details about a specific user, and a view to add a new user. All these views are bundled together in the user component.
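A loose sketch of how such an interceptor-style mock backend works (plain TypeScript; the endpoints and seed data below are illustrative assumptions, not the project's real ones): each request is pattern-matched and answered from in-memory data instead of the network.

```typescript
// Simplified stand-in for an HTTP interceptor that mocks the backend.
interface User { id: number; username: string; password: string }

// In-memory "database"; restarting the app (refreshing the page) resets it,
// which is why newly created users disappear, as noted above.
const users: User[] = [
  { id: 1, username: "test", password: "test" },
  { id: 2, username: "admin", password: "admin" },
];

type MockRequest = { url: string; method: string; body?: any };
type MockResponse = { status: number; body?: any };

function mockBackend(req: MockRequest): MockResponse {
  // Login endpoint: check credentials against the in-memory users.
  if (req.url.endsWith("/authenticate") && req.method === "POST") {
    const found = users.find(
      (u) => u.username === req.body.username && u.password === req.body.password
    );
    return found
      ? { status: 200, body: { id: found.id, username: found.username } }
      : { status: 401, body: { error: "Invalid credentials" } };
  }
  // User list endpoint: return users without their passwords.
  if (req.url.endsWith("/users") && req.method === "GET") {
    return { status: 200, body: users.map(({ password, ...rest }) => rest) };
  }
  return { status: 404 };
}
```

In the real app this logic would live inside an Angular `HttpInterceptor`, so components call the normal `HttpClient` API and never know the backend is fake.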
# apiRecipes
Building a recipe API for users. I will use technologies like Node and GraphQL.
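Since the README gives no code yet, here is a hypothetical first sketch of what the recipe API's GraphQL schema and resolvers might look like. Every name below is an assumption; the schema is shown as a plain SDL string rather than wired to a real GraphQL server.

```typescript
// Hypothetical GraphQL schema for a recipe API (SDL as a plain string).
const typeDefs = `
  type Recipe { id: ID! title: String! ingredients: [String!]! }
  type Query { recipes: [Recipe!]! recipe(id: ID!): Recipe }
`;

interface Recipe { id: string; title: string; ingredients: string[] }

// Illustrative in-memory data; a real API would back this with a database.
const recipes: Recipe[] = [
  { id: "1", title: "Pancakes", ingredients: ["flour", "eggs", "milk"] },
];

// Resolver functions in the shape graphql-js/Apollo-style servers expect.
const resolvers = {
  Query: {
    recipes: (): Recipe[] => recipes,
    recipe: (_parent: unknown, args: { id: string }): Recipe | undefined =>
      recipes.find((r) => r.id === args.id),
  },
};
```

Passing `typeDefs` and `resolvers` to a GraphQL server library would expose `recipes` and `recipe(id:)` queries over HTTP.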
---
title: Release notes - Azure Event Grid IoT Edge | Microsoft Docs
description: Azure Event Grid on IoT Edge release notes
author: banisadr
ms.author: babanisa
ms.reviewer: spelluru
ms.date: 01/09/2020
ms.topic: article
ms.service: event-grid
services: event-grid
ms.openlocfilehash: 18a4fb9a979841bbf6cd0090fc67a77327c61596
ms.sourcegitcommit: 5d6ce6dceaf883dbafeb44517ff3df5cd153f929
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 01/29/2020
ms.locfileid: "76849710"
---
# <a name="release-notes-azure-event-grid-on-iot-edge"></a>Release notes: Azure Event Grid on IoT Edge
## <a name="100-preview1"></a>1.0.0-preview1
Initial release of Azure Event Grid on IoT Edge. Included features:
* Topic creation
* Event subscription creation
* Advanced filters
* Output batching
* Retry policies
* Module-to-module publishing
* Publish to a WebHook as a destination
* Publish to IoT Edge Hub as a destination
* Publish to Azure Event Grid in the cloud as a destination
* Persisted metadata state
* Blob storage module integration
Tags: `1.0.0-preview1`
## <a name="100-preview2"></a>1.0.0-preview2
Preview 2 of Azure Event Grid on IoT Edge added:
* Configurable persistence of events to disk
* Topic metrics
* Event subscription metrics
* Publish to Event Hubs as a destination
* Publish to Service Bus queues as a destination
* Publish to Service Bus topics as a destination
* Publish to Storage queues as a destination
Tags: `1.0.0-preview2`, `1.0`, `latest`
---
id: 2018-toyota-c-hr-xle
title: 2018 Toyota C-HR XLE
description: Information about running Comma.ai Openpilot on the 2018 Toyota C-HR XLE
---
To chat about 2018 Toyota C-HR XLEs with the community, check out the [Comma Slack](https://slack.comma.ai).
## Configurations
The following configuration details have been gathered for the 2018 Toyota C-HR XLE.
# MXVideo
#### Introduction
An open-source Android player written in Kotlin and based on the JiaoZi player. It works out of the box; issues and pull requests are welcome.
> Related write-up on Jianshu (to be completed): https://www.jianshu.com/nb/50294642
Latest version: [![](https://jitpack.io/v/com.gitee.zhangmengxiong/MXVideo.svg)](https://jitpack.io/#com.gitee.zhangmengxiong/MXVideo)




#### Features
- Pluggable player kernel (including the open-source IJKPlayer, Google ExoPlayer, Alibaba Cloud player, and more)
- Singleton playback: only one video plays at a time
- Full-screen support with zero integration code
- Volume and screen-brightness adjustment
- Playback-state listener callbacks can be registered
- The player height can adapt automatically to the video height
- An aspect ratio can be set on the player; once set, the height is fixed
- Playback progress is saved and restored automatically (can be disabled)
- Supports loop playback, portrait mode in full screen, and the ability to disable fast-forward/rewind, disable full screen, and disable the cellular-data warning on non-WiFi networks
##### 1. Add MXVideo via a Gradle dependency
```groovy
dependencies {
implementation 'com.gitee.zhangmengxiong:MXVideo:x.x.x'
}
```
##### 2. Page integration
```xml
<com.mx.video.MXVideoStd
android:id="@+id/mxVideoStd"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
```
```kotlin
// On Activity/Fragment lifecycle changes, handle pausing/resuming when entering the background/foreground
override fun onStart() {
mxVideoStd.onStart()
super.onStart()
}
override fun onStop() {
mxVideoStd.onStop()
super.onStop()
}
```
##### 3. Start playback
```kotlin
// Set the poster (placeholder) image
Glide.with(this).load("http://www.xxx.com/xxx.png").into(mxVideoStd.getPosterImageView())

// By default, playback resumes from the last saved position
mxVideoStd.setSource(MXPlaySource(Uri.parse("https://aaa.bbb.com/xxx.mp4"), "标题1"))
mxVideoStd.startPlay()

// Play from the beginning
mxVideoStd.setSource(MXPlaySource(Uri.parse("https://aaa.bbb.com/xxx.mp4"), "标题1"), seekTo = 0)
mxVideoStd.startPlay()

// Start playback from the 10-second mark
mxVideoStd.setSource(MXPlaySource(Uri.parse("https://aaa.bbb.com/xxx.mp4"), "标题1"), seekTo = 10)
mxVideoStd.startPlay()
```
> Optional MXPlaySource parameters:
| Parameter | Description | Default |
| :----- | :--: | -------: |
| title | Title | "" |
| headerMap | HTTP request headers | null |
| changeOrientationWhenFullScreen | Whether to change the Activity orientation when entering full screen; if null, this is decided automatically from the video's width and height | null |
| isLooping | Whether to loop playback | false |
| enableSaveProgress | Whether to save and restore playback progress | true |
| isLiveSource | Whether the source is a live stream; for live streams no progress is shown and seeking and pausing are unavailable | false |
##### 4. Listen for playback progress
```kotlin
mxVideoStd.addOnVideoListener(object : MXVideoListener() {
    // Playback state changed
    override fun onStateChange(state: MXState) {
    }

    // Playback time changed
    override fun onPlayTicket(position: Int, duration: Int) {
    }
})
```
##### 5. Full-screen back handling + releasing resources
> MXVideo holds the currently playing MXVideoStd, so you can use its static methods to exit full screen, release resources, and so on.
>
> You can also call the view directly: mxVideoStd.isFullScreen(), mxVideoStd.gotoNormalScreen(), mxVideoStd.release(), etc.
```kotlin
override fun onBackPressed() {
if (MXVideo.isFullScreen()) {
MXVideo.gotoNormalScreen()
return
}
super.onBackPressed()
}
override fun onDestroy() {
MXVideo.releaseAll()
super.onDestroy()
}
```
### Feature reference
- Switching the player kernel
```kotlin
// Default MediaPlayer-based player, built into the library
com.mx.video.player.MXSystemPlayer

// Google ExoPlayer
com.mx.mxvideo_demo.player.MXExoPlayer

// IJKPlayer
com.mx.mxvideo_demo.player.MXIJKPlayer

// The kernel can be chosen when setting the play source; default = MXSystemPlayer
mxVideoStd.setSource(MXPlaySource(Uri.parse("xxx"), "xxx"), player = MXSystemPlayer::class.java)
```
- Video render rotation
```kotlin
// Default rotation = MXOrientation.DEGREE_0
mxVideoStd.setOrientation(MXOrientation.DEGREE_90)
```
- Video scale mode
```kotlin
// Force-fill the view's width and height: MXScale.FILL_PARENT
// Adapt to the video's size: MXScale.CENTER_CROP
// Default scale mode = MXScale.CENTER_CROP
mxVideoStd.setScaleType(MXScale.CENTER_CROP)
```
- MXVideoStd width/height constraints
> Add it in your layout XML; layout_width is usually match_parent and the height wrap_content
```xml
<com.mx.video.MXVideoStd
android:id="@+id/mxVideoStd"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
```
> Any aspect ratio can be set. If a ratio is set, the view's height must be android:layout_height="wrap_content", otherwise it has no effect.
>
> When the constraint is removed, the MXVideo height is adaptive, and the scale mode = MXScale.CENTER_CROP, the view height is filled automatically according to the video's width and height
```kotlin
// Set the MXVideoStd aspect ratio to 16:9
mxVideoStd.setDimensionRatio(16.0 / 9.0)
// Set the MXVideoStd aspect ratio to 4:3
mxVideoStd.setDimensionRatio(4.0 / 3.0)
// Remove the constraint
mxVideoStd.setDimensionRatio(0.0)
```
- Seek to a position
```kotlin
// Position in seconds; may be called after playback starts and before an error or completion
mxVideoStd.seekTo(55)
```
- Disable fast-forward/rewind
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().canSeekByUser = false
```
- Disable full screen
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().canFullScreen = false
```
- Hide the system time in the top-right corner of the view
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().canShowSystemTime = false
```
- Hide the battery indicator in the top-right corner of the view
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().canShowBatteryImg = false
```
- Disable the pre-playback data-usage warning on non-WiFi networks
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().showTipIfNotWifi = false
```
- Automatically exit full screen when playback completes
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().gotoNormalScreenWhenComplete = true
```
- Automatically exit full screen on playback error
```kotlin
// Set before playback; default = true
mxVideoStd.getConfig().gotoNormalScreenWhenError = true
```
- Automatically enter/exit full-screen mode based on the orientation sensor
```kotlin
// Set before playback; default = false
mxVideoStd.getConfig().autoRotateBySensor = true
```