| column | dtype | range / values |
|:---|:---|:---|
| hexsha | stringlengths | 40–40 |
| size | int64 | 5–1.04M |
| ext | stringclasses | 6 values |
| lang | stringclasses | 1 value |
| max_stars_repo_path | stringlengths | 3–344 |
| max_stars_repo_name | stringlengths | 5–125 |
| max_stars_repo_head_hexsha | stringlengths | 40–78 |
| max_stars_repo_licenses | listlengths | 1–11 |
| max_stars_count | int64 | 1–368k |
| max_stars_repo_stars_event_min_datetime | stringlengths | 24–24 |
| max_stars_repo_stars_event_max_datetime | stringlengths | 24–24 |
| max_issues_repo_path | stringlengths | 3–344 |
| max_issues_repo_name | stringlengths | 5–125 |
| max_issues_repo_head_hexsha | stringlengths | 40–78 |
| max_issues_repo_licenses | listlengths | 1–11 |
| max_issues_count | int64 | 1–116k |
| max_issues_repo_issues_event_min_datetime | stringlengths | 24–24 |
| max_issues_repo_issues_event_max_datetime | stringlengths | 24–24 |
| max_forks_repo_path | stringlengths | 3–344 |
| max_forks_repo_name | stringlengths | 5–125 |
| max_forks_repo_head_hexsha | stringlengths | 40–78 |
| max_forks_repo_licenses | listlengths | 1–11 |
| max_forks_count | int64 | 1–105k |
| max_forks_repo_forks_event_min_datetime | stringlengths | 24–24 |
| max_forks_repo_forks_event_max_datetime | stringlengths | 24–24 |
| content | stringlengths | 5–1.04M |
| avg_line_length | float64 | 1.14–851k |
| max_line_length | int64 | 1–1.03M |
| alphanum_fraction | float64 | 0–1 |
| lid | stringclasses | 191 values |
| lid_prob | float64 | 0.01–1 |
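How the per-document statistics columns (`avg_line_length`, `max_line_length`, `alphanum_fraction`) are derived is not documented in the schema above. A plausible reconstruction, for illustration only (the exact definitions are an assumption, though they agree with the sample rows, e.g. 937 characters over 48 lines gives 19.520833):

```python
def text_stats(content: str) -> dict:
    """Compute per-document statistics as the schema columns plausibly define them.

    NOTE: these definitions are assumptions, not documented anywhere in the dump.
    avg_line_length here is total characters divided by line count, which is what
    matches the sample rows (937 / 48 = 19.520833).
    """
    lines = content.split("\n")
    return {
        "size": len(content),
        "max_line_length": max(len(line) for line in lines),
        "avg_line_length": len(content) / len(lines),
        "alphanum_fraction": sum(ch.isalnum() for ch in content) / len(content),
    }

stats = text_stats("ab\ncdef")
print(stats)
```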
4c0be0cb085d5661de84d385976352fd77fda1a9
937
md
Markdown
CHANGELOG.md
andreas-fintricity/xml-component
86558a97b99559a4dd4b344beb505a26de354063
[ "Apache-2.0" ]
null
null
null
CHANGELOG.md
andreas-fintricity/xml-component
86558a97b99559a4dd4b344beb505a26de354063
[ "Apache-2.0" ]
28
2018-01-18T11:11:12.000Z
2020-10-29T15:30:08.000Z
CHANGELOG.md
blendededge/xml-component-1
d128cbfafcfa43d4e613a84e51e712b9fe1541b7
[ "Apache-2.0" ]
5
2018-10-09T07:47:22.000Z
2021-08-13T10:25:01.000Z
## 1.3.4 (February 12, 2021)

* Update sailor version to 2.6.24

## 1.3.3 (October 30, 2020)

* Upgrade to sailor 2.6.18
* Annual audit of the component code to check whether it exposes sensitive data in the logs

## 1.3.2 (June 6, 2020)

* Remove update docs on deploy script

## 1.3.1 (May 22, 2020)

* Update sailor version to 2.6.7
* Correctly handle incoming attachments that are empty.
* Update dependencies.
* Add debug log statements.

## 1.3.0 (April 23, 2020)

* Update dependencies
* Create new JSON to XML action
* Add help links

## 1.2.1 (March 30, 2020)

* Minor log improvements in "XML to JSON" action

## 1.2.0 (January 30, 2020)

* Update sailor version to 2.6.1
* Refactor console.log to built-in sailor logger
* Change build type to docker

## 1.1.1 (September 25, 2019)

* Upload attachments with component commons library

## 1.1.0 (June 24, 2019)

* Update `README`

## 1.0.0 (December 29, 2016)

* Initial release
19.520833
88
0.693703
eng_Latn
0.898155
4c0be596055a5b473a2a2fe11b96dff0c4e42fc6
1,774
md
Markdown
microsoft-365/bookings/comparison-chart.md
shivbijlani/microsoft-365-docs
bab30c389914a23c1417087e49fe983331523b2f
[ "CC-BY-4.0", "MIT" ]
null
null
null
microsoft-365/bookings/comparison-chart.md
shivbijlani/microsoft-365-docs
bab30c389914a23c1417087e49fe983331523b2f
[ "CC-BY-4.0", "MIT" ]
null
null
null
microsoft-365/bookings/comparison-chart.md
shivbijlani/microsoft-365-docs
bab30c389914a23c1417087e49fe983331523b2f
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: "Comparison: Bookings web app vs. Bookings Teams app"
ms.author: kwekua
author: kwekuako
manager: scotv
audience: Admin
ms.topic: article
ms.service: bookings
ms.localizationpriority: medium
ms.assetid: d586eb28-b752-4c46-bf92-00a0c5ad781d
description: "A comparison chart that shows the feature differences between the Bookings web app and the Bookings Teams app."
---

# Comparison chart: Bookings web app vs. Bookings Teams app

> [!NOTE]
> This article helps you to interact with the latest version of Microsoft Bookings. Previous versions will be retired in the coming months.

The Bookings app in Teams lets schedulers handle their main tasks and change some settings. However, the Bookings web app includes advanced features and settings that are not yet available in the Teams app. Teams app features are being added all the time, and we will continue to update this list. See the **Is all the functionality of the original Bookings Web app available in Microsoft Teams?** section in the [FAQ](bookings-faq.yml) for more details.

| Feature | Bookings web app | Bookings Teams app |
|:---|:---|:---|
| Create new booking calendars | Yes | Yes |
| Add and remove staff from a booking calendar | Yes | Yes |
| Create new appointment types | Yes | Yes |
| Schedule online bookings | Yes | Yes |
| Edit business details | Yes | Yes |
| Add staff with Guest permissions for people outside your org | Yes | No |
| Schedule in-person or offline bookings | Yes | No |
| Schedule group or multi-customer bookings | Yes | No |
| Set a logo for your business | Yes | No |
| Set business hours | Yes | No |
| Publish a self-service scheduling page | Yes | No |
| Manage customer contacts | Yes | No |
| Set time off for staff | Yes | No |
| Scheduled queue view | No | Yes |
45.487179
246
0.741826
eng_Latn
0.990267
4c0c43b19f00f7aeba550e14d41c358a3e2eee7c
3,190
md
Markdown
README.md
tidyverse/googledrive
0297c7609ce997dabc0228616ddc478a9ff05386
[ "MIT" ]
265
2017-05-10T23:43:52.000Z
2022-03-22T12:48:59.000Z
README.md
tidyverse/googledrive
0297c7609ce997dabc0228616ddc478a9ff05386
[ "MIT" ]
339
2017-05-10T21:14:51.000Z
2022-03-27T00:44:08.000Z
README.md
tidyverse/googledrive
0297c7609ce997dabc0228616ddc478a9ff05386
[ "MIT" ]
53
2017-05-29T05:25:57.000Z
2022-03-07T20:18:46.000Z
<!-- README.md is generated from README.Rmd. Please edit that file -->

# googledrive <img src="man/figures/logo.png" align="right" height=140/>

<!-- badges: start -->
[![CRAN status](https://www.r-pkg.org/badges/version/googledrive)](https://CRAN.R-project.org/package=googledrive)
[![R-CMD-check](https://github.com/tidyverse/googledrive/workflows/R-CMD-check/badge.svg)](https://github.com/tidyverse/googledrive/actions)
[![Codecov test coverage](https://codecov.io/gh/tidyverse/googledrive/branch/master/graph/badge.svg)](https://codecov.io/gh/tidyverse/googledrive?branch=master)
<!-- badges: end -->

## Overview

googledrive allows you to interact with files on Google Drive from R.

## Installation

Install the CRAN version:

``` r
install.packages("googledrive")
```

Or install the development version from GitHub:

``` r
# install.packages("devtools")
devtools::install_github("tidyverse/googledrive")
```

## Usage

Please see the package website: <https://googledrive.tidyverse.org>

Here’s a teaser that uses googledrive to view some of the files you see on <https://drive.google.com> (up to `n_max = 25`, in this case):

``` r
library("googledrive")
drive_find(n_max = 25)
#> # A dribble: 16 x 3
#>    name               id                                        drive_resource
#>    <chr>              <drv_id>                                  <list>
#>  1 chicken_sheet      1s0kEHcqG2PyciERoGq52L_Qwzp4y3__rBVKSx7E… <named list [35…
#>  2 r_logo.jpg         1wFAZdmBiSRu4GShsqurxD7wIDSCZvPud         <named list [41…
#>  3 THANKS             19URV7BT0_E1KhYdfDODszK5aiELOwTSz         <named list [40…
#>  4 googledrive-NEWS.… 1h1lhFfQrDZevE2OEX10-rbi2BfvGogFm         <named list [39…
#>  5 def                1ALSW_Nqs7FsPOcrJ6MqyBoRm03gansmn         <named list [33…
#>  6 abc                1o89YN5n4325GbUA86Wp6pRH3dsTsE5iC         <named list [33…
#>  7 BioC_mirrors.csv   13tMFbhAHoeHLFS5xu19GbDjf6GWJSxyN         <named list [39…
#>  8 Rlogo.svg          1lCQGxjyoc9mQz719I8sKil_m2Nuhw0Fq         <named list [41…
#>  9 DESCRIPTION        1KKYhtcdJMKh4WYeri5TOPEeAtzdN_cqV         <named list [40…
#> 10 r_about.html       1mHtQhvJyDk5dX9ktKbeIoVW-wwWK0__N         <named list [40…
#> 11 imdb_latin1.csv    1S5HxY7a-Jb_fV4C3T6fkGyPpXfI_yb4w         <named list [39…
#> 12 chicken.txt        1xMvlJHia_qYNZmucaStDcOF9A9PD4BOT         <named list [40…
#> 13 chicken.pdf        1au0aK6YCTra2sucTRus8ZaUhbaLpinTn         <named list [40…
#> 14 chicken.jpg        1-BF1c4kWCkkByQbcLT-b2Hv6vnVsbqa_         <named list [41…
#> 15 chicken.csv        12212CXY_TopUMIKYu_l8hU5UXI8lrzQF         <named list [39…
#> 16 chicken_doc        11GY4Q4BUG3m5U4CnZP564lYvGydvZe2XZOkwCfx… <named list [35…
```

## Contributing

If you’d like to contribute to the development of googledrive, please read [these guidelines](https://googledrive.tidyverse.org/CONTRIBUTING.html).

Please note that the googledrive project is released with a [Contributor Code of Conduct](https://googledrive.tidyverse.org/CODE_OF_CONDUCT.html). By contributing to this project, you agree to abide by its terms.

## Privacy

[Privacy policy](https://www.tidyverse.org/google_privacy_policy)
40.379747
144
0.683072
eng_Latn
0.338365
4c0ca636d4c7d5762dd0d319a415dcd3afb491e0
8,956
md
Markdown
packages/admin/README.md
learningtapestry/ssdn
a358d46276cd64d15fe2ed16a3c3cf045a53101d
[ "Apache-2.0" ]
null
null
null
packages/admin/README.md
learningtapestry/ssdn
a358d46276cd64d15fe2ed16a3c3cf045a53101d
[ "Apache-2.0" ]
1
2022-03-24T07:45:09.000Z
2022-03-24T07:45:09.000Z
packages/admin/README.md
learningtapestry/ssdn
a358d46276cd64d15fe2ed16a3c3cf045a53101d
[ "Apache-2.0" ]
null
null
null
# SSDN Administration Panel

This component provides a front-end web application that lets you manage and configure a running SSDN instance.

## Main dependencies and tools

This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app). Other tools used alongside this framework are:

- [Yarn](https://yarnpkg.com/en/): manages dependencies and project scripts.
- [TypeScript](https://www.typescriptlang.org/): is used as the default implementation language.
- [AWS Amplify](https://aws-amplify.github.io/): provides the backbone for the back-end services and AWS integrations.

## Requirements

As with every JavaScript front-end project, make sure you have installed an up-to-date version of [Node.js](https://nodejs.org/en/download/), preferably version 10.16 or higher, as well as [Yarn](https://yarnpkg.com/en/).

Next, you need to install the AWS Amplify CLI package globally in your system.

```bash
yarn global add @aws-amplify/cli
```

## Project layout

```bash
admin
├── amplify/                 <-- Settings created and managed by Amplify
├── build/                   <-- Output folder that contains the generated build
├── cypress/                 <-- Contains the end-to-end tests
├── public/                  <-- Basic files to bootstrap the application inside the browser
├── src/                     <-- Main source folder
│   ├── components/          <-- Contains the React UI components
│   ├── interfaces/          <-- TypeScript definitions for application objects
│   ├── services/            <-- Service objects that usually communicate with external resources
│   ├── types/               <-- Specific TypeScript types for libraries that do not include them
│   ├── App.tsx              <-- Main application React component
│   ├── app-helper.ts        <-- Main application helper with utility functions
│   ├── aws-configuration.ts <-- Sets up the main admin panel configuration
│   ├── aws-exports.js       <-- Auto-generated file by AWS Amplify that contains the current configuration
│   └── setupTests.ts        <-- Prepares the environment to run the tests
├── test-support/            <-- Support files and code useful for testing
├── .env.template            <-- Template file that declares environment variables for the project
├── cypress.json             <-- Configuration file for the Cypress test runner
├── package.json             <-- Node.js dependencies
├── README.md                <-- This file
├── tsconfig.json            <-- Configuration file for the TypeScript compiler
└── tslint.json              <-- Configuration file for the TypeScript linter
```

## Setup

Run the following command to install the project dependencies:

```bash
yarn install
```

Next, create a local copy of the configuration file.

```bash
cp .env.template .env
```

We've developed the admin panel application in such a way that it can easily point to any running SSDN instance. This integration is performed via environment variables, so you'll need to enter the correct values from your SSDN instance. Most of them can be obtained from the CloudFormation template associated with the SSDN Core.

Here is a brief description of the environment variables you'll need to configure. Inside parentheses you'll find the actual CloudFormation resource used by the SSDN Core stack:

- **`REACT_APP_ENDPOINT`**: points to the Exchange API endpoint. It's used for door-knocking and sharing data (`ExchangeApi`).
- **`REACT_APP_ENTITIES_ENDPOINT`**: points to the Entities API endpoint (`EntitiesApi`).
- **`REACT_APP_FILE_TRANSFER_NOTIFICATIONS_ENDPOINT`**: points to the File Transfer Notifications API endpoint (`FileTransferNotificationsApi`).
- **`REACT_APP_SQS_INTEGRATION_NOTIFICATIONS_ENDPOINT`**: points to the SQS Integration Notifications API endpoint (`SQSIntegrationNotificationsApi`).
- **`REACT_APP_IDENTITY_POOL_ID`**: the Cognito Identity Pool ID (`CognitoIdentityPool`).
- **`REACT_APP_SSDN_ID`**: the SSDN ID. It's usually assigned by you, or the CLI installer (`SSDNId`).
- **`REACT_APP_AWS_REGION`**: the AWS region, usually `us-east-1`.
- **`REACT_APP_STACK_NAME`**: the name of the SSDN Core stack.
- **`REACT_APP_USER_POOL_ID`**: the ID of the Cognito User Pool (`CognitoUserPool`).
- **`REACT_APP_USER_POOL_WEB_CLIENT_ID`**: the ID of the web client in the user pool (`CognitoUserPoolClientWeb`).

The last step involves configuring and initializing the Amplify environment. Run:

```bash
amplify configure
```

Follow the steps that will guide you through the creation of a suitable AWS account, as well as setting up your preferred environment. Next, run:

```bash
amplify init
```

When it asks whether you want to use an existing environment, choose `No` unless you're sure that you want to reuse the development environment available in the code repository. In almost all cases, the safest choice is setting up a new environment in your AWS account.

To learn more about the many options to set up an Amplify project, feel free to look at the official documentation on [Environments & Teams](https://aws-amplify.github.io/docs/cli-toolchain/quickstart#environments--teams).

## Usage

Since this is a regular Amplify project, you can use the expected commands to manage the project:

- `amplify push` to provision the AWS resources in the cloud.
- `amplify publish` to generate a production-ready build of the admin panel and upload it to S3.
- `amplify serve` to start a local instance of the application.
- `amplify status` to get a general overview of the project and its resources.

## Testing

Create React App uses `jest` as the default test runner.

### Unit tests

```bash
yarn test
```

This will run all the unit tests. By default they are started in `watch` mode.

### End-to-end tests

We use [Cypress](https://www.cypress.io/) to execute the tests that check the behaviour of the admin panel from the point of view of an external user. In order to run them properly, we make some assumptions that require following these configuration steps:

- First of all, the actual resources must be deployed to AWS. Any environment will do, but we recommend creating a new fresh environment in Amplify that is specific to testing:

  ```bash
  amplify env add   # Use 'test', 'cypress' or 'e2e' as the environment name when it asks
  amplify serve     # Provisions the resources in the cloud and starts the application locally
  ```

- Make sure you have defined proper values for the current environment in the `.env` file, and that it's pointing to the environment you want to run the e2e tests on.
- Now, you'll need to create a default administrator user that will be used to sign in. You can either go to the Cognito section in the AWS Console and create the user there, or run the following command:

  ```bash
  aws cognito-idp admin-create-user \
      --user-pool-id us-east-1_jV9AzgY8g \
      --username test-user \
      --temporary-password @Mb94TQT5nqE \
      --user-attributes Name=email,Value=test-user@example.org \
          Name=email_verified,Value=true \
          Name=name,Value="Test User" \
          Name=phone_number,Value=+1555555555 \
          Name=phone_number_verified,Value=true
  ```

  _Note: Make sure you use the same values as shown above when you create your test user. Otherwise some tests might fail._

- The next step is passing Cypress some configuration values that are needed for the actual tests. We use environment variables for that. Please check the [official documentation](https://docs.cypress.io/guides/guides/environment-variables.html#Setting) on the various ways to set environment variables in Cypress. You can choose whatever works best for you, but in this example we'll just export them to the system:

  ```bash
  export CYPRESS_DEFAULT_USERNAME=test-user
  export CYPRESS_DEFAULT_PASSWORD=@Mb94TQT5nqE
  export CYPRESS_REGISTER_ENDPOINT=https://z0krjz1z0l.execute-api.us-east-1.amazonaws.com/test/register
  ```

  As you can see, all variables must start with `CYPRESS_` in order to be properly recognized. Besides that, we're declaring the username and password we defined in the previous step, as well as the register endpoint for consumer requests. If you don't know where this value comes from, you can use your instance's own endpoint. Check the home page in your SSDN administration panel or the CloudFormation stack in your AWS account to get the endpoint's URL.
- Lastly, you can launch the end-to-end tests with these two commands:

  ```bash
  yarn cypress:open   # Opens up an interactive GUI and runs in a graphical browser
  yarn cypress:run    # Runs in a headless browser (Electron) without user intervention
  ```
46.890052
119
0.71293
eng_Latn
0.987194
4c0cb947b7084030248988277653fe1cb03c8ab7
1,792
md
Markdown
pt_br/2_transcript_RS_example_PR.md
data-umbrella/2020-06-scikit-sprint
161873275dd17c121bbe1523f07222b5f18f3350
[ "MIT" ]
110
2021-01-08T17:14:34.000Z
2022-03-04T11:08:39.000Z
pt_br/2_transcript_RS_example_PR.md
data-umbrella/2020-06-scikit-sprint
161873275dd17c121bbe1523f07222b5f18f3350
[ "MIT" ]
13
2021-01-12T06:28:42.000Z
2021-09-26T17:00:11.000Z
pt_br/2_transcript_RS_example_PR.md
data-umbrella/2020-scikit-learn-sprint
cbd6c8b6970bcdad71436fddcf8bfc321b8077f7
[ "MIT" ]
24
2021-01-12T03:06:45.000Z
2021-10-05T21:09:55.000Z
# Example of contributing and submitting a pull request to [scikit-learn](https://github.com/scikit-learn)

<p float="left">
  <a href="https://www.dataumbrella.org" target="_blank">
    <img src="../images/full logo-transparent copy.png" height="40%" width="40%" />
  </a>
  <img width="150" />
  <a href="https://github.com/scikit-learn" target="_blank">
    <img src="../images/1280px-Scikit_learn_logo_small.svg.png" width="15%" height="15%" />
  </a>
</p>

## Video transcript

- Speaker: [Reshama Shaikh](https://twitter.com/reshamas)
- Video: [Scikit-learn example PR contribution by RS](https://youtu.be/PU1WyDPGePI) (30 minutes)
- Transcriber: [Reshama Shaikh](https://twitter.com/reshamas)

## Key links

- Data Umbrella [Discord](https://discord.gg/mEzEbYT)
- Gitter: [scikit-learn](https://gitter.im/scikit-learn)
- [Contributing workflow commands](contributing/workflow.md) (environment setup, repo, submitting a PR)
- [Documentation: contributing to scikit-learn](http://scikit-learn.org/stable/developers/contributing.html)

## Video updates (not `master`, but `main`)

<a href="https://scikit-learn.org/dev/developers/contributing.html"><img src="../images/master_main.png" width="90%" style="padding:1px;border:thick solid red;" align="top"/></a>

## Video

<a href="https://youtu.be/PU1WyDPGePI"><img src="../images/sklearn_rs_video.png" width="80%" /></a>

---

### Intro

Hi, my name is Reshama, and I am going to walk through an example of a pull request, or PR. I attended my first scikit-learn sprint about a year and a half ago, and I am happy to share an example. After learning this example, which will be for the sklearn repo, you will be able to do it for any repository on GitHub.

(to be continued...)
51.2
364
0.723214
por_Latn
0.868192
4c0ef819101eaea1fb1cfb8ad7b760e950815090
365
md
Markdown
_members/rose-pfeiffer.md
juliasaltzman1/quantmarineecolab.github.io
001f7f66a75ddc0e0285a2b5f670060882e0f541
[ "BSD-3-Clause" ]
null
null
null
_members/rose-pfeiffer.md
juliasaltzman1/quantmarineecolab.github.io
001f7f66a75ddc0e0285a2b5f670060882e0f541
[ "BSD-3-Clause" ]
null
null
null
_members/rose-pfeiffer.md
juliasaltzman1/quantmarineecolab.github.io
001f7f66a75ddc0e0285a2b5f670060882e0f541
[ "BSD-3-Clause" ]
6
2021-08-01T21:21:07.000Z
2021-10-04T11:49:02.000Z
---
name: Rose Pfeiffer
image: images/rose-pfeiffer.jpg
description: Undergraduate researcher
role: undergrad
group: current
aliases:
  - R. Pfeiffer
  - Rose Pfeiffer
links:
  email: Rose.Pfeiffer@uvm.edu
  twitter: rppfeiffer
---

Rose Pfeiffer is currently an undergraduate at the University of Vermont. She is studying how COVID-19 is affecting US fisheries.
22.8125
130
0.769863
eng_Latn
0.983657
4c0fc39d0cebe67b5726ef4e82d07cd069a74e3a
428
md
Markdown
docs/release-checklist.md
cram678/go-watchman-client
f2b9304da4dcb93c3d9902e8e414c8cb04c74a8c
[ "Apache-2.0" ]
6
2018-09-26T07:38:39.000Z
2022-01-19T15:36:32.000Z
docs/release-checklist.md
cram678/go-watchman-client
f2b9304da4dcb93c3d9902e8e414c8cb04c74a8c
[ "Apache-2.0" ]
15
2019-03-02T15:09:59.000Z
2021-04-26T05:21:02.000Z
docs/release-checklist.md
cram678/go-watchman-client
f2b9304da4dcb93c3d9902e8e414c8cb04c74a8c
[ "Apache-2.0" ]
4
2021-05-14T06:52:06.000Z
2022-03-25T17:53:00.000Z
# Release Checklist

1) Verify that all tests are passing.
1) Update `CHANGELOG.md` and commit.
1) Create release branch:
   ```sh
   git checkout -b release/v0.1
   ```
1) Tag release.
   ```sh
   git tag -a v0.1.0 -m "Release 0.1.0"
   ```
1) Push commits and tags.
   ```sh
   git push origin release/v0.1
   git push origin v0.1.0
   ```
1) Update GitHub.
   * https://github.com/sjansen/watchman/releases
15.851852
50
0.61215
eng_Latn
0.778818
4c0ff40ec174065e5a677c432b5c8398686109eb
4,905
md
Markdown
lib/torrent-parser/README.md
yaolynzc/sp2der
6fa4e924e44e8a9b3d4ef0948924a6725bea183d
[ "MIT" ]
1
2018-03-18T08:15:27.000Z
2018-03-18T08:15:27.000Z
lib/torrent-parser/README.md
yaolynzc/sp2der
6fa4e924e44e8a9b3d4ef0948924a6725bea183d
[ "MIT" ]
4
2020-01-07T01:05:54.000Z
2020-01-10T01:57:05.000Z
lib/torrent-parser/README.md
yaolynzc/sp2der
6fa4e924e44e8a9b3d4ef0948924a6725bea183d
[ "MIT" ]
null
null
null
# torrent-parser

[![travis][travis-image]][travis-url] [![npm][npm-image]][npm-url] [![downloads][downloads-image]][downloads-url] [![Greenkeeper badge](https://badges.greenkeeper.io/CraigglesO/torrent-parser.svg)](https://greenkeeper.io/)

[travis-image]: https://travis-ci.org/CraigglesO/torrent-parser.svg?branch=master
[travis-url]: https://travis-ci.org/CraigglesO/torrent-parser
[npm-image]: https://img.shields.io/npm/v/torrent-parser.svg
[npm-url]: https://npmjs.org/package/torrent-parser
[downloads-image]: https://img.shields.io/npm/dm/torrent-parser.svg
[downloads-url]: https://npmjs.org/package/torrent-parser

### Parse torrents from their files or buffers

Both parses and writes torrent files.

## Install

``` bash
npm install torrent-parser
```

or download for global use:

``` bash
npm install -g torrent-parser
```

## Usage

``` javascript
import { decodeTorrentFile, decodeTorrent, encodeTorrent, parseInfo } from "torrent-parser"
```

**DECODE**

``` javascript
let file = fs.readFileSync("./screen.torrent");

// Parse directly from the file
let parsedTorrent = decodeTorrentFile("./screen.torrent");

// Or parse the buffer created from a file
let parsedTorrent = decodeTorrent(file);

parsedTorrent = {
  info: {
    length: 99525,
    name: <Buffer 53 63 72 65 65 6e 20 53 68 6f 74 20 32 30 31 37 2d 30 31 2d 32 31 20 61 74 20 38 2e 32 35 2e 31 35 20 41 4d 2e 70 6e 67>,
    'piece length': 16384,
    pieces: <Buffer 4a a0 c3 fb ce 71 26 8c 4e fb 56 fe 4c b1 f4 22 26 bc 59 59 26 c5 f5 90 f8 8d c5 60 90 5d 2d 2c 1d 4a 82 db 39 4f ae 98 c4 53 61 a1 85 8c 37 cf df 77 ... >,
    private: 0
  },
  infoBuffer: <Buffer 64 36 3a 6c 65 6e 67 74 68 69 39 39 35 32 35 65 34 3a 6e 61 6d 65 34 30 3a 53 63 72 65 65 6e 20 53 68 6f 74 20 32 30 31 37 2d 30 31 2d 32 31 20 61 74 ... >,
  infoHash: '74416fe776ca02ca2da20f686fed835e4dcfe84d',
  infoHashBuffer: <Buffer 74 41 6f e7 76 ca 02 ca 2d a2 0f 68 6f ed 83 5e 4d cf e8 4d>,
  name: 'Screen Shot 2017-01-21 at 8.25.15 AM.png',
  private: false,
  'creation date': 2017-02-14T21:24:02.000Z,
  'created by': 'Empire/vParrot',
  announce: [ 'udp://tracker.empire-js.us:1337,udp://tracker.openbittorrent.com:80,udp://tracker.leechers-paradise.org:6969,udp://tracker.coppersurfer.tk:6969,udp://tracker.opentrackr.org:1337,udp://explodie.org:6969,udp://zer0day.ch:1337' ],
  urlList: [],
  files: [
    {
      path: 'Screen Shot 2017-01-21 at 8.25.15 AM.png',
      name: 'Screen Shot 2017-01-21 at 8.25.15 AM.png',
      length: 99525,
      offset: 0
    }
  ],
  length: 99525,
  pieceLength: 16384,
  lastPieceLength: 1221,
  pieces: [
    '4aa0c3fbce71268c4efb56fe4cb1f42226bc5959',
    '26c5f590f88dc560905d2d2c1d4a82db394fae98',
    'c45361a1858c37cfdf77bb716c48fa368f3605af',
    '4d4289c76994ee95b0302b76ca0df2a351a10afc',
    '84eac82d3f383e6c1bb9d5a0c18b5cdbc1b729af',
    'db265f87a7f6047916c30298479cae03c9dceccb',
    'fa2857f4fbeb4d3e9d7d847e4b94c6b418f4fa83'
  ]
}
```

**ENCODE**

``` javascript
let info = {
  length: 99525,
  name: <Buffer 53 63 72 65 65 6e 20 53 68 6f 74 20 32 30 31 37 2d 30 31 2d 32 31 20 61 74 20 38 2e 32 35 2e 31 35 20 41 4d 2e 70 6e 67>,
  'piece length': 16384,
  pieces: <Buffer 4a a0 c3 fb ce 71 26 8c 4e fb 56 fe 4c b1 f4 22 26 bc 59 59 26 c5 f5 90 f8 8d c5 60 90 5d 2d 2c 1d 4a 82 db 39 4f ae 98 c4 53 61 a1 85 8c 37 cf df 77 ... >,
  private: 0
}

// Encode just info files
const bencode = require("bencode");

let file = fs.readFileSync("./screen.torrent");
let parsedTorrent = decodeTorrent(file);
let info = bencode.encode(parsedTorrent.info); // NOTE: INFO in the field is usually a bencoded buffer.

encodeTorrent(info, "./dev-screen3", (err) => {
  if (err) throw err;
  // SUCCESS
});

// Or encode full torrents
let file = fs.readFileSync("./screen.torrent");
let parsedTorrent = decodeTorrent(file);

encodeTorrent(parsedTorrent, "./dev-screen.torrent", (err) => {
  if (err) throw err;
  // SUCCESS
});
```

## ISC License (Open Source Initiative)

ISC License (ISC)

Copyright 2017 <CraigglesO>
Copyright (c) 2004-2010 by Internet Systems Consortium, Inc. ("ISC")
Copyright (c) 1995-2003 by Internet Software Consortium

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
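The `infoHash` in the decoded output is the SHA-1 digest of the bencoded `info` dictionary. A minimal sketch of that relationship, using a toy `info` dict (not the screenshot torrent above) and a hand-rolled bencoder that only handles ints, bytes/str, lists, and dicts:

```python
import hashlib

def bencode(value) -> bytes:
    """Minimal bencoder per BEP 3: ints, byte strings, lists, dicts (keys sorted)."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode()
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        # Dictionary keys must be byte strings and appear in sorted order.
        items = sorted((k.encode() if isinstance(k, str) else k, v)
                       for k, v in value.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(f"cannot bencode {type(value)!r}")

# Toy info dict, illustrative only.
info = {"length": 5, "name": "hello.txt", "piece length": 16384, "pieces": b"\x00" * 20}
info_hash = hashlib.sha1(bencode(info)).hexdigest()
print(info_hash)
```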
39.556452
485
0.709072
yue_Hant
0.451937
4c10207422f5b2c3c2b807b6d01af56611a22cdc
1,364
md
Markdown
CDS_Tool_Output/Parsers/parserContent_raw-unix-sudo.md
deebigarajeswaran/Content-Doc
b99ffab7998eb4594887c36c60e6e2ab586a8b53
[ "MIT" ]
null
null
null
CDS_Tool_Output/Parsers/parserContent_raw-unix-sudo.md
deebigarajeswaran/Content-Doc
b99ffab7998eb4594887c36c60e6e2ab586a8b53
[ "MIT" ]
null
null
null
CDS_Tool_Output/Parsers/parserContent_raw-unix-sudo.md
deebigarajeswaran/Content-Doc
b99ffab7998eb4594887c36c60e6e2ab586a8b53
[ "MIT" ]
null
null
null
#### Parser Content

```Java
{
  Name = raw-unix-sudo
  Vendor = Unix
  Product = Unix
  Lms = Direct
  DataType = "unix-account-switch"
  TimeFormat = "yyyy-MM-dd'T'HH:mm:ss"
  Conditions = [ """sudo:""", """; USER""", """; COMMAND""" ]
  Fields = [
    """exabeam_time=({time}\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d)""",
    """({time}\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\d)""",
    """"timestamp":"({time}\d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\d\.\d+[+-]\d+)""",
    """exabeam_host=([^=]+@\s*)?({host}[\w.\-]+)""",
    """exabeam_host=([^=]+@\s*)?(({dest_ip}\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3})|({dest_host}[^\s]+))""",
    """({host}[\w\.\-]+)?:?\s*sudo:""",
    """"agent":\{"id":"({agent_id}\d+)"""",
    """"agent":\{"name":"[^"]*","id":"({agent_id}\d+)"""",
    """({event_code}sudo):\s+(?:\[[^]]+\])?\s*(({domain}[^\\:;]+)\\+)?({user}[^\s:]+).+?USER\\*=({account}[^;\s]+)""",
    """\WPWD=({directory}[^\s;]+)""",
    """\WCOMMAND=({process}([^\s]+[\\\/]+)?({process_name}[^;\\\/\s]+))\s(?:|;|$)""",
    """\WCOMMAND=({command_line}[^;"]+)("|\s(?:|;|$))""",
    """"description":"({event_name}[^"]+)"""",
    """"level":({level}[^",]+)""",
    """"groups":\[({groups}[^\]]+)""",
    """"pci_dss":\[({pci_dss}[^\]]+)""",
    """"cluster":\{[^\{\}]+?"name":"({cluster_name}[^"]+)"""",
    """"host":"({wazuh_manager}[^"]+)"""",
  ]
  DupFields = ["directory->process_directory"]
}
```
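The field patterns above use Exabeam's `({field}...)` named-capture syntax on top of ordinary regular expressions. As a rough illustration only (the sample log line is hypothetical, and the translation to Python named groups is an assumption about the syntax, not documented Exabeam behavior), one of the simpler fields can be exercised like this:

```python
import re

# Exabeam field: """\WPWD=({directory}[^\s;]+)"""
# rewritten with a Python named group: ({name}...) -> (?P<name>...)
pwd_field = re.compile(r'\WPWD=(?P<directory>[^\s;]+)')

# Hypothetical sudo log line shaped like the parser's Conditions expect.
line = ("host1 sudo: alice : TTY=pts/0 ; PWD=/home/alice ; "
        "USER=root ; COMMAND=/usr/bin/ls -la")

match = pwd_field.search(line)
print(match.group("directory"))  # the directory the sudo command was run from
```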
41.333333
118
0.41349
yue_Hant
0.105037
4c11ff342abe4c8daa1c2e3498d248d16eb23485
4,808
md
Markdown
articles/virtual-network/scripts/virtual-network-powershell-sample-multi-tier-application.md
klmnden/azure-docs.tr-tr
8e1ac7aa3bb717cd24e1bc2612e745aa9d7aa6b6
[ "CC-BY-4.0", "MIT" ]
2
2019-08-10T02:23:39.000Z
2019-08-10T02:23:40.000Z
articles/virtual-network/scripts/virtual-network-powershell-sample-multi-tier-application.md
klmnden/azure-docs.tr-tr
8e1ac7aa3bb717cd24e1bc2612e745aa9d7aa6b6
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-network/scripts/virtual-network-powershell-sample-multi-tier-application.md
klmnden/azure-docs.tr-tr
8e1ac7aa3bb717cd24e1bc2612e745aa9d7aa6b6
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Azure PowerShell script sample - Create a network for multi-tier applications | Microsoft Docs
description: Azure PowerShell script sample - Create a virtual network for multi-tier applications.
services: virtual-network
documentationcenter: virtual-network
author: KumudD
manager: twooley
editor: ''
tags: ''
ms.assetid: ''
ms.service: virtual-network
ms.devlang: powershell
ms.topic: sample
ms.tgt_pltfrm: ''
ms.workload: infrastructure
ms.date: 12/13/2018
ms.author: kumud
ms.openlocfilehash: 2fad78db4fdc92f3dc9c0f320c36d12dea554a61
ms.sourcegitcommit: 44a85a2ed288f484cc3cdf71d9b51bc0be64cc33
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 04/28/2019
ms.locfileid: "64725395"
---

# <a name="create-a-network-for-multi-tier-applications-script-sample"></a>Create a network for multi-tier applications script sample

This script sample creates a virtual network with front-end and back-end subnets. Traffic to the front-end subnet is limited to HTTP and SSH, while traffic to the back-end subnet is limited to MySQL on port 3306. After running the script, you have two virtual machines, one in each subnet, on which you can deploy web server and MySQL software.

You can execute the script from the Azure [Cloud Shell](https://shell.azure.com/powershell) or from a local PowerShell installation. If you use PowerShell locally, this script requires Azure PowerShell module version 1.0.0 or later. Run `Get-Module -ListAvailable Az` to find the installed version. If you need to upgrade, see [Install the Azure PowerShell module](/powershell/azure/install-az-ps). If you are running PowerShell locally, you also need to run `Connect-AzAccount` to create a connection with Azure.

[!INCLUDE [quickstarts-free-trial-note](../../../includes/quickstarts-free-trial-note.md)]

## <a name="sample-script"></a>Sample script

[!INCLUDE [updated-for-az](../../../includes/updated-for-az.md)]
<!-- gitHub issue https://github.com/MicrosoftDocs/azure-docs/issues/17748 -->

A subnet ID is assigned after a virtual network is created, specifically when the New-AzVirtualNetwork cmdlet is used with the -Subnet option. If you configure the subnet using the New-AzVirtualNetworkSubnetConfig cmdlet before calling New-AzVirtualNetwork, you will not see the subnet ID until after you call New-AzVirtualNetwork.

[!code-azurepowershell-interactive[main](../../../powershell_scripts/virtual-network/virtual-network-multi-tier-application/virtual-network-multi-tier-application.ps1 "Virtual network for multi-tier application")]

## <a name="clean-up-deployment"></a>Clean up deployment

Run the following command to remove the resource group, the virtual machine, and all related resources:

```powershell
Remove-AzResourceGroup -Name myResourceGroup -Force
```

## <a name="script-explanation"></a>Script explanation

This script uses the following commands to create a resource group, a virtual network, and network security groups. Each command in the following table links to command-specific documentation:

| Command | Notes |
|---|---|
| [New-AzResourceGroup](/powershell/module/az.resources/new-azresourcegroup) | Creates a resource group in which all resources are stored. |
| [New-AzVirtualNetwork](/powershell/module/az.network/new-azvirtualnetwork) | Creates an Azure virtual network and front-end subnet. |
| [New-AzVirtualNetworkSubnetConfig](/powershell/module/az.network/new-azvirtualnetworksubnetconfig) | Creates a back-end subnet. |
| [New-AzPublicIpAddress](/powershell/module/az.network/new-azpublicipaddress) | Creates a public IP address to access the virtual machine from the internet. |
| [New-AzNetworkInterface](/powershell/module/az.network/new-aznetworkinterface) | Creates virtual network interfaces and attaches them to the virtual network's front-end and back-end subnets. |
| [New-AzNetworkSecurityGroup](/powershell/module/az.network/new-aznetworksecuritygroup) | Creates network security groups (NSGs) associated with the front-end and back-end subnets. |
| [New-AzNetworkSecurityRuleConfig](/powershell/module/az.network/new-aznetworksecurityruleconfig) | Creates NSG rules that allow or block specific ports to specific subnets. |
| [New-AzVM](/powershell/module/az.compute/new-azvm) | Creates virtual machines and attaches a NIC to each one. This command also specifies the virtual machine image to use and the administrative credentials. |
| [Remove-AzResourceGroup](/powershell/module/az.resources/remove-azresourcegroup) | Deletes a resource group and all resources it contains. |

## <a name="next-steps"></a>Next steps

For more information about Azure PowerShell, see the [Azure PowerShell documentation](/powershell/azure/overview).

Additional virtual network PowerShell script samples can be found in [Virtual network PowerShell samples](../powershell-samples.md).
67.71831
544
0.801789
tur_Latn
0.998881
4c124883e984d76450a45d848558e5cde2eab2fb
2,408
md
Markdown
content.md
ramakpuppala/etp
dad002058b3cd996c7b19867a787b7312476b81e
[ "MIT" ]
null
null
null
content.md
ramakpuppala/etp
dad002058b3cd996c7b19867a787b7312476b81e
[ "MIT" ]
null
null
null
content.md
ramakpuppala/etp
dad002058b3cd996c7b19867a787b7312476b81e
[ "MIT" ]
null
null
null
### Emerging Technologies - Presented on 22 May 2018 to CGI team at Massachusetts Department of Unemployment Insurance <!-- .element: style="font-size:x-small;" --> ### What is an Emerging Technology? > Emerging technologies are technologies that are perceived as capable of changing the status quo > -- Wikipedia ### Why Change the Status Quo? * Cost * Efficiency * Survival * Competitive Advantage * Faster time to market * Agility * Scalability ### Caveats * Support * Feature Complete * Bugs * Immature ### ETP - NEBU, CGI - Cloud - Agile - DevOps - Security - BlockChain - Robotic Process Automation ### DUA - DevOps - Docker - Orchestration (Portainer) - Cloud - Monitoring & Alerting (Prometheus, Grafana, Mattermost) - Database driver upgrade (No Client or Tns files) - Agile - TFS, Git & CICD - Communication (Mattermost) ### Docker - What is Docker? - Docker is an ecosystem of tools that helps IT organizations package and run applications in an infrastructure-agnostic way. - Docker is a virtualization technology that relies on host operating system capabilities to create many subsystems (containers), bounded only by the physical resources of the host. - Docker runs natively on Linux, Mac, and Windows. Previously, Mac and Windows ran it with the help of Docker Toolbox, which created a Linux VM. - How? - Images - Containers - Registries ### Docker File ``` FROM microsoft/dotnet-framework:4.6.2 WORKDIR /app COPY bin/Debug/ . RUN sc create ListenerService binPath= "C:\\app\\uFACTS.BatchProcess.Listener.exe" ENTRYPOINT sc start ListenerService \ && PowerShell Get-Content C:\\uFACTS\\logs\\uFACTS.BatchProcess.Listener\\*.uLOG -Wait ``` ### Images A Docker image is built up from a series of layers. Each layer represents an instruction in the image's Dockerfile.
![alt text](https://docs.docker.com/v17.09/engine/userguide/storagedriver/images/container-layers.jpg) (Source: https://docs.docker.com/v17.09/engine/userguide/storagedriver/images) ### Containers ### Registries ### Orchestration - Compose - Swarm - Kubernetes - Mesos DC/OS - OpenShift - ECS, EKS, AKS & many more ... ### Kinds of Orchestration - Single Node - Cluster ### DUA - Portainer - Single node orchestrator ### Demo ### Thank You ### & ### ?
22.933333
175
0.689369
eng_Latn
0.838178
4c12f71998e904e13284bde0ef89a4c6d85debe9
1,378
md
Markdown
examples/PHP/README.md
emmeair/go-canal
72ac0348c4193dcda8ed6634a65752e1ce882edb
[ "Apache-2.0" ]
98
2020-07-31T14:28:59.000Z
2022-02-27T21:08:17.000Z
examples/PHP/README.md
emmeair/go-canal
72ac0348c4193dcda8ed6634a65752e1ce882edb
[ "Apache-2.0" ]
2
2020-08-05T11:03:37.000Z
2021-08-16T00:38:57.000Z
examples/PHP/README.md
emmeair/go-canal
72ac0348c4193dcda8ed6634a65752e1ce882edb
[ "Apache-2.0" ]
10
2020-07-31T18:35:37.000Z
2021-06-22T09:08:42.000Z
# GO-CANAL - PHP demo A simple PHP example of calling go-canal # Preparation - For self-hosted MySQL, you first need to enable binlog writing and set binlog-format to ROW mode; configure my.cnf as follows ``` [mysqld] log-bin=mysql-bin # enable binlog binlog-format=ROW # use ROW mode server_id=1 # required for MySQL replication; must not clash with canal's slaveId ``` - Note: for Alibaba Cloud RDS for MySQL, binlog is enabled by default and the account has binlog dump permission by default, so no extra permissions or binlog settings are needed; you can skip this step - Grant the MySQL account used by canal the permissions of a MySQL slave; if the account already exists, you can grant directly ```sql CREATE USER canal IDENTIFIED BY 'canal'; GRANT SELECT, REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'canal'@'%'; -- GRANT ALL PRIVILEGES ON *.* TO 'canal'@'%' ; FLUSH PRIVILEGES; ``` # Getting started - Change the listen IP and port in the demo file (optional) ```php // ip; can be replaced with another IP if you have server resources $address = "127.0.0.1"; // port; change as you like $port = 9501; ``` - Edit the config file ```json { "schema": [ "test_tt" ], "mysqlInfo": { "addr": "ip:3306", "user": "canal", "password": "canal" }, "server": { "network": "tcp", "addr": "ip:9501" } } ``` ```shell Run the demo.php file to create a socket server php demo.php ``` - Make sure the push address in the config file matches the address the demo listens on; restart go-canal after changing the config #### Modify or insert any row in the database, and the waiting demo program will print a record similar to ``` Request : {"Action":"update","ColumnData":{"admin_name":"123","created_at":null,"id":1,"password":"111","updated_at":null,"username":"111"},"SchemaName":"caopan","TableName":"admin"} ``` # Notes - This example is for reference only; developers can adapt the logic as needed - Socket frameworks such as WorkerMan and Swoole can also be used
20.878788
182
0.645864
yue_Hant
0.644573
4c1543247595959f949493b9676021197a360e2b
1,346
md
Markdown
docs/standard/parallel-programming/parallel-diagnostic-tools.md
Ming77/docs.zh-cn
dd4fb6e9f79320627d19c760922cb66f60162607
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/standard/parallel-programming/parallel-diagnostic-tools.md
Ming77/docs.zh-cn
dd4fb6e9f79320627d19c760922cb66f60162607
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/standard/parallel-programming/parallel-diagnostic-tools.md
Ming77/docs.zh-cn
dd4fb6e9f79320627d19c760922cb66f60162607
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: "Parallel Diagnostic Tools" ms.custom: ms.date: 03/30/2017 ms.prod: .net ms.reviewer: ms.suite: ms.technology: dotnet-standard ms.tgt_pltfrm: ms.topic: article helpviewer_keywords: - parallel programming, debugging and profiling tools ms.assetid: 38f7302a-ddf6-4179-ba19-f49e00395b9f caps.latest.revision: author: rpetrusha ms.author: ronpet manager: wpickett ms.workload: - dotnet - dotnetcore ms.openlocfilehash: 4ed0b3991b08eceb950aa1a3aa1704529700feb3 ms.sourcegitcommit: e7f04439d78909229506b56935a1105a4149ff3d ms.translationtype: HT ms.contentlocale: zh-CN ms.lasthandoff: 12/23/2017 --- # <a name="parallel-diagnostic-tools"></a>Parallel Diagnostic Tools [!INCLUDE[vs_dev10_ext](../../../includes/vs-dev10-ext-md.md)] provides extensive support for debugging and profiling multithreaded applications. ## <a name="debugging"></a>Debugging The Visual Studio debugger adds new windows for debugging parallel applications. For more information, see the following topics: - [Using the Parallel Stacks Window](/visualstudio/debugger/using-the-parallel-stacks-window) - [Using the Tasks Window](/visualstudio/debugger/using-the-tasks-window) - [Walkthrough: Debugging a Parallel Application](/visualstudio/debugger/walkthrough-debugging-a-parallel-application) ## <a name="profiling"></a>Profiling You can use the Concurrency Visualizer report views to visualize how the threads in a parallel program interact with each other and with threads from other processes on the system. For more information, see [Concurrency Visualizer](/visualstudio/profiling/concurrency-visualizer). ## <a name="see-also"></a>See also [Parallel Programming](../../../docs/standard/parallel-programming/index.md)
30.590909
131
0.739227
yue_Hant
0.213859
4c159f491325862cd6f3bd91d0a9366286dee270
1,745
md
Markdown
README.md
v-braun/VBR-UnixTime
af78f2eb2773733e56d7d324e8cf4e592626de85
[ "MIT" ]
null
null
null
README.md
v-braun/VBR-UnixTime
af78f2eb2773733e56d7d324e8cf4e592626de85
[ "MIT" ]
null
null
null
README.md
v-braun/VBR-UnixTime
af78f2eb2773733e56d7d324e8cf4e592626de85
[ "MIT" ]
null
null
null
# VBR.UnixTime > Converts UnixEpoch Date information to DateTime By [v-braun - www.dev-things.net](http://www.dev-things.net). [![AppVeyor](https://img.shields.io/appveyor/ci/v-braun/vbr-unixtime.svg?style=flat-square)](https://ci.appveyor.com/project/v-braun/vbr-unixtime) [![NuGet](https://img.shields.io/nuget/v/VBR.UnixTime.svg?style=flat-square)](https://www.nuget.org/packages/VBR.UnixTime/) ## Installation ### PowerShell ```PowerShell Install-Package VBR.UnixTime ``` ### project.json ```json "dependencies": { "VBR.UnixTime": "*" } ``` ## Usage See the *VBR.UnixTime.Tests* Project. ```cs using VBR; static void Main(){ // static syntax var ms = 10; var s = 11; var m = 12; var h = 13; var val1 = UnixTime.FromElapsedMinutes(ms); var val2 = UnixTime.FromElapsedMinutes(s); var val3 = UnixTime.FromElapsedMinutes(m); var val4 = UnixTime.FromElapsedMinutes(h); // extension syntax val1 = ms.MillisecondsSinceUnixEpoch(); val2 = s.SecondsSinceUnixEpoch(); val3 = m.MinutesSinceUnixEpoch(); val4 = h.HoursSinceUnixEpoch(); } ``` ### Known Issues If you discover any bugs, feel free to create an issue on GitHub, or fork and send me a pull request. [Issues List](https://github.com/v-braun/VBR-UnixTime/issues). ## Authors ![image](https://avatars3.githubusercontent.com/u/4738210?v=3&s=50) [v-braun](https://github.com/v-braun/) ## Contributing 1. Fork it 2. Create your feature branch (`git checkout -b my-new-feature`) 3. Commit your changes (`git commit -am 'Add some feature'`) 4. Push to the branch (`git push origin my-new-feature`) 5. Create new Pull Request ## License See [LICENSE](https://github.com/v-braun/VBR-UnixTime/blob/master/LICENSE).
21.8125
146
0.687679
yue_Hant
0.509572
4c16c59b0495cf0296528962494c73c798cac8dc
85
md
Markdown
README.md
quickheaven/ganesha
8f47bf58a7708c7b3f643e791d74285f2ce8b2e8
[ "Apache-2.0" ]
null
null
null
README.md
quickheaven/ganesha
8f47bf58a7708c7b3f643e791d74285f2ce8b2e8
[ "Apache-2.0" ]
null
null
null
README.md
quickheaven/ganesha
8f47bf58a7708c7b3f643e791d74285f2ce8b2e8
[ "Apache-2.0" ]
null
null
null
# Once you stop learning, you start dying. - Albert Einstein Project for my research
28.333333
60
0.776471
eng_Latn
0.983137
4c16cec33c508a6038e05440912c2bf1c2503616
326
md
Markdown
api/docs/stentor-models.multimediatype.md
stentorium/stentor
f49b51e8b4f82012d1ac8ddd15af279bd4619229
[ "Apache-2.0" ]
2
2019-12-30T19:23:17.000Z
2021-07-06T02:47:39.000Z
api/docs/stentor-models.multimediatype.md
stentorium/stentor
f49b51e8b4f82012d1ac8ddd15af279bd4619229
[ "Apache-2.0" ]
74
2020-01-07T00:25:16.000Z
2022-02-23T04:06:56.000Z
api/docs/stentor-models.multimediatype.md
stentorium/stentor
f49b51e8b4f82012d1ac8ddd15af279bd4619229
[ "Apache-2.0" ]
1
2021-01-01T08:57:23.000Z
2021-01-01T08:57:23.000Z
<!-- Do not edit this file. It is automatically generated by API Documenter. --> [Home](./index.md) &gt; [stentor-models](./stentor-models.md) &gt; [MultimediaType](./stentor-models.multimediatype.md) ## MultimediaType type <b>Signature:</b> ```typescript export declare type MultimediaType = "Multimedia"; ```
27.166667
120
0.687117
eng_Latn
0.570455
4c16fbd9a1556a088d7daab1eae1477763e883ef
4,269
md
Markdown
curriculum/challenges/arabic/02-javascript-algorithms-and-data-structures/basic-javascript/counting-cards.arabic.md
tmonks/freeCodeCamp
7453131461f5073d9160bbc1402bcb0e052579c0
[ "BSD-3-Clause" ]
25
2020-02-16T00:26:35.000Z
2022-03-30T19:46:05.000Z
curriculum/challenges/arabic/02-javascript-algorithms-and-data-structures/basic-javascript/counting-cards.arabic.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
2,056
2019-08-25T19:29:20.000Z
2022-02-13T22:13:01.000Z
curriculum/challenges/arabic/02-javascript-algorithms-and-data-structures/basic-javascript/counting-cards.arabic.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
27
2017-02-12T11:48:34.000Z
2022-03-30T17:44:39.000Z
--- id: 565bbe00e9cc8ac0725390f4 title: Counting Cards challengeType: 1 videoUrl: '' localeTitle: Counting Cards --- ## Description <section id="description"> In the casino game Blackjack, a player can gain an advantage over the house by keeping track of the relative number of high and low cards remaining in the deck. This is called <a href="https://en.wikipedia.org/wiki/Card_counting" target="_blank">Card Counting</a>. Having more high cards remaining in the deck favors the player. Each card is assigned a value according to the table below. When the count is positive, the player should bet high. When the count is zero or negative, the player should bet low. <table class="table table-striped"><thead><tr><th> Count Change </th><th> Cards </th></tr></thead><tbody><tr><td> +1 </td><td> 2, 3, 4, 5, 6 </td></tr><tr><td> 0 </td><td> 7, 8, 9 </td></tr><tr><td> -1 </td><td> 10, 'J', 'Q', 'K', 'A' </td></tr></tbody></table> You will write a card counting function. It will receive a <code>card</code> parameter, which can be a number or a string, and increment or decrement the global <code>count</code> variable according to the card's value (see table). The function will then return a string with the current count and the string <code>Bet</code> if the count is positive, or <code>Hold</code> if the count is zero or negative. The current count and the player's decision (<code>Bet</code> or <code>Hold</code>) should be separated by a single space. <strong>Example Output</strong> <br> <code>-3 Hold</code> <br> <code>5 Bet</code> <strong>Hint</strong> <br> Do not reset <code>count</code> to 0 when the value is 7, 8, or 9. <br> Do not return an array. <br> Do not include quotes (single or double) in the output. 
</section> ## Instructions <section id="instructions"> </section> ## Tests <section id='tests'> ```yml tests: - text: Cards Sequence 2, 3, 4, 5, 6 should return <code>5 Bet</code> testString: 'assert((function(){ count = 0; cc(2);cc(3);cc(4);cc(5);var out = cc(6); if(out === "5 Bet") {return true;} return false; })(), "Cards Sequence 2, 3, 4, 5, 6 should return <code>5 Bet</code>");' - text: Cards Sequence 7, 8, 9 should return <code>0 Hold</code> testString: 'assert((function(){ count = 0; cc(7);cc(8);var out = cc(9); if(out === "0 Hold") {return true;} return false; })(), "Cards Sequence 7, 8, 9 should return <code>0 Hold</code>");' - text: Cards Sequence 10, J, Q, K, A should return <code>-5 Hold</code> testString: 'assert((function(){ count = 0; cc(10);cc("J");cc("Q");cc("K");var out = cc("A"); if(out === "-5 Hold") {return true;} return false; })(), "Cards Sequence 10, J, Q, K, A should return <code>-5 Hold</code>");' - text: Cards Sequence 3, 7, Q, 8, A should return <code>-1 Hold</code> testString: 'assert((function(){ count = 0; cc(3);cc(7);cc("Q");cc(8);var out = cc("A"); if(out === "-1 Hold") {return true;} return false; })(), "Cards Sequence 3, 7, Q, 8, A should return <code>-1 Hold</code>");' - text: Cards Sequence 2, J, 9, 2, 7 should return <code>1 Bet</code> testString: 'assert((function(){ count = 0; cc(2);cc("J");cc(9);cc(2);var out = cc(7); if(out === "1 Bet") {return true;} return false; })(), "Cards Sequence 2, J, 9, 2, 7 should return <code>1 Bet</code>");' - text: Cards Sequence 2, 2, 10 should return <code>1 Bet</code> testString: 'assert((function(){ count = 0; cc(2);cc(2);var out = cc(10); if(out === "1 Bet") {return true;} return false; })(), "Cards Sequence 2, 2, 10 should return <code>1 Bet</code>");' - text: Cards Sequence 3, 2, A, 10, K should return <code>-1 Hold</code> testString: 'assert((function(){ count = 0; cc(3);cc(2);cc("A");cc(10);var out = cc("K"); if(out === "-1 Hold") {return true;} return 
false; })(), "Cards Sequence 3, 2, A, 10, K should return <code>-1 Hold</code>");' ``` </section> ## Challenge Seed <section id='challengeSeed'> <div id='js-seed'> ```js var count = 0; function cc(card) { // Only change code below this line return "Change Me"; // Only change code above this line } // Add/remove calls to test your function. // Note: Only the last will display cc(2); cc(3); cc(7); cc('K'); cc('A'); ``` </div> </section> ## Solution <section id='solution'> ```js // solution required ``` </section>
56.92
1,569
0.6409
yue_Hant
0.290714
4c1736bc7d9a63507955591e3a40ebdf04551df4
555
md
Markdown
.github/pull_request_template.md
jean-smaug/stylelint-bem
3aa366a762b74151b72e6b2932bb276820315841
[ "MIT" ]
40
2016-06-16T13:35:00.000Z
2021-03-03T22:14:28.000Z
.github/pull_request_template.md
jean-smaug/stylelint-bem
3aa366a762b74151b72e6b2932bb276820315841
[ "MIT" ]
20
2016-07-12T08:35:12.000Z
2021-04-15T19:26:37.000Z
.github/pull_request_template.md
jean-smaug/stylelint-bem
3aa366a762b74151b72e6b2932bb276820315841
[ "MIT" ]
12
2016-09-26T08:44:33.000Z
2021-01-08T12:29:15.000Z
<!-- Thanks for taking the time to submit a pull request --> ## Purpose of this pull request? <!-- Choose the right options and remove others --> * Documentation update * Bug fix * Enhancement * Other... Please describe ## What changes did you make? <!-- Give an overview --> ## Does this pull request introduce a breaking change? <!-- If this pull request contains a breaking change, please describe the impact and migration path for existing applications below. --> ## Is there anything you'd like reviewers to focus on? <!-- Just in case -->
22.2
136
0.708108
eng_Latn
0.997947
4c17e4103c74d17bf8f427fcb2d5bdb037c96327
450
md
Markdown
README.md
johnantoni/cors-template
3622f9aa72f9b0de1b257b1472f6dd43f362ef98
[ "MIT" ]
null
null
null
README.md
johnantoni/cors-template
3622f9aa72f9b0de1b257b1472f6dd43f362ef98
[ "MIT" ]
null
null
null
README.md
johnantoni/cors-template
3622f9aa72f9b0de1b257b1472f6dd43f362ef98
[ "MIT" ]
null
null
null
# cors-template Cross-origin Resource Sharing - Sample CORS templates. #### further reading * enable cors - http://enable-cors.org/index.html * wikipedia - https://en.wikipedia.org/wiki/Cross-origin_resource_sharing * add cors to meteor - http://stackoverflow.com/questions/15959501/how-to-add-cors-headers-to-a-meteor-app * 'cors on meteor' - http://enable-cors.org/server_meteor.html * 'using cors' - http://www.html5rocks.com/en/tutorials/cors/
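The links above describe enabling CORS on various servers; as a quick, hedged illustration of what "enabling CORS" amounts to, the sketch below builds the response headers a server attaches. The helper name and the origin allowlist are made up for this example and are not part of any template in this repo.

```javascript
// Hypothetical helper: given the request's Origin header value, return
// the CORS response headers a server would attach to the response.
const ALLOWED_ORIGINS = ["https://example.com"];

function corsHeaders(requestOrigin) {
  // Echo back only allowlisted origins; responding with "*" is simpler,
  // but browsers reject "*" when credentials are involved.
  if (!ALLOWED_ORIGINS.includes(requestOrigin)) {
    return {};
  }
  return {
    "Access-Control-Allow-Origin": requestOrigin,
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type",
  };
}

console.log(corsHeaders("https://example.com"));
```

A real server would merge these headers into every response (and answer OPTIONS preflight requests with them), which is exactly what the per-framework guides linked above configure.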
40.909091
106
0.755556
yue_Hant
0.482195
4c19ccf6b28503f2bd4b6ddd9593315824a0d5cf
2,344
md
Markdown
posts/jamstack-bootcamp-2021.md
samabarker/eleventy-base-blog
0495952b927c22314db68c8d269afa4cb8a7789f
[ "MIT" ]
null
null
null
posts/jamstack-bootcamp-2021.md
samabarker/eleventy-base-blog
0495952b927c22314db68c8d269afa4cb8a7789f
[ "MIT" ]
null
null
null
posts/jamstack-bootcamp-2021.md
samabarker/eleventy-base-blog
0495952b927c22314db68c8d269afa4cb8a7789f
[ "MIT" ]
null
null
null
--- layout: layouts/post.njk title: Jamstack Bootcamp 2021 date: 2021-03-16T21:11:05.423Z description: A bootcamp run by The Coders Guild tags: - BLOG --- I am currently lucky enough to be taking part in an 8 week Jamstack bootcamp run by [The Coders Guild](https://thecodersguild.org.uk/). Before starting this course, I had no idea what the Jamstack was. The bootcamp was advertised as providing the skills and knowledge necessary to take those with a basic understanding of web development to the next level, giving a deeper understanding of the technologies currently being used in the industry and hopefully opening up the opportunity to start a career in web development. I am currently in week 6 and am loving every minute of it. The bootcamp started by giving an introduction to the Jamstack - described on [Jamstack.org](https://jamstack.org/) as: > an architecture designed to make the web faster, more secure, and easier to scale. It builds on many of the tools and workflows which developers love, and which bring maximum productivity. To those of us who had never heard the term Jamstack before, this seemed pretty daunting; this was soon resolved when it was explained what this actually meant in terms of web development: * JavaScript to handle dynamic functionality * API's to handle server side operations * Markup to generate static sites Before diving into this, the bootcamp trainers firstly covered some fundamentals - source control and Git, semantic HTML and accessibility, and frameworks such as Bootstrap. From here, we were introduced to static site generators and their benefits, with focus on creating a static site with 11ty and deploying with Netlify. Following this was a further session on Git workflows, giving us the opportunity to modify our static sites whilst solidifying our knowledge of Git. We were then introduced to content management systems, both generally and how to add a CMS to our static site. 
From this point on, sessions have all covered introductory JavaScript, including basic programming concepts, functions and flow control. I feel as though I have learned a great deal so far, and with five sessions still to go, I am excited to see what is to come. My code for each of the tasks set in the sessions can be found here. Watch this space for examples of my work using the Jamstack!
86.814815
584
0.798635
eng_Latn
0.999814
4c1a06e18ba8f1604dbfe8700edcf0fb58b0fdc9
1,996
md
Markdown
README.md
Muhammad441/gpg
f00df19d68c905b78c8813c8736e8a0e8d3ff078
[ "BSD-2-Clause" ]
null
null
null
README.md
Muhammad441/gpg
f00df19d68c905b78c8813c8736e8a0e8d3ff078
[ "BSD-2-Clause" ]
null
null
null
README.md
Muhammad441/gpg
f00df19d68c905b78c8813c8736e8a0e8d3ff078
[ "BSD-2-Clause" ]
null
null
null
# Grasp Pose Generator (GPG) * **Author:** Andreas ten Pas (atp@ccs.neu.edu) * **Version:** 1.0.0 * **Author's website:** [http://www.ccs.neu.edu/home/atp/](http://www.ccs.neu.edu/home/atp/) * **License:** BSD ## 1) Overview This package creates grasp candidates for 3D point clouds and can check if they are antipodal using geometric conditions. To use the package, you only need PCL and Eigen (see below). <img src="readme/examples.png" alt="" style="width: 400px;"/> This package is part of GPD. Please notice that **no** machine learning is included in this part. The package just generates 6-DOF grasp poses for a 2-finger grasp. ## 2) Requirements 1. [PCL 1.7 or later](http://pointclouds.org/) 2. [Eigen 3.0 or later](https://eigen.tuxfamily.org) ## 3) Compilation 1. Open a terminal and clone the *grasp_candidates_generator* repository into some folder: ``` $ cd <location_of_your_workspace> $ git clone https://github.com/atenpas/gpg.git ``` 2. Build the project: ``` $ cd grasp_candidates_generator $ mkdir build && cd build $ cmake .. $ make ``` 3. (optional) Install the project: ``` $ sudo make install ``` ## 4) Generate Grasp Candidates for a Point Cloud File Run the following from within the *build* folder: ``` $ ./generate_candidates ../cfg/params.cfg ~/data/some_cloud.pcd ``` ## 5) Parameters Brief explanations of parameters are given in *cfg/params.cfg*. ## 6) Citation If you like this package and use it in your own work, please cite our [arXiv paper](http://arxiv.org/abs/1603.01564): ``` @misc{1603.01564, Author = {Marcus Gualtieri and Andreas ten Pas and Kate Saenko and Robert Platt}, Title = {High precision grasp pose detection in dense clutter}, Year = {2016}, Eprint = {arXiv:1603.01564}, } ``` ## 7) Actual Command to use rosrun gpg generate_candidates /home/suhail/catkin_ws/src/gpg/cfg/params.cfg /home/suhail/graspit/models/objects/PCL/004_sugar_box/poisson/outfilefile.pcd
24.641975
154
0.697896
eng_Latn
0.812753
4c1a50baab288d6bfa78b4f5792064ea3745c1a0
554
md
Markdown
docs/source/api/Apollo/classes/GraphQLQueryWatcher.md
cltnschlosser/apollo-ios
0ff2c826f002873e3c511109751e28f95dd01d10
[ "MIT" ]
null
null
null
docs/source/api/Apollo/classes/GraphQLQueryWatcher.md
cltnschlosser/apollo-ios
0ff2c826f002873e3c511109751e28f95dd01d10
[ "MIT" ]
1
2019-09-26T13:59:15.000Z
2019-09-26T13:59:15.000Z
docs/source/api/Apollo/classes/GraphQLQueryWatcher.md
cltnschlosser/apollo-ios
0ff2c826f002873e3c511109751e28f95dd01d10
[ "MIT" ]
null
null
null
**CLASS** # `GraphQLQueryWatcher` ```swift public final class GraphQLQueryWatcher<Query: GraphQLQuery>: Cancellable, ApolloStoreSubscriber ``` > A `GraphQLQueryWatcher` is responsible for watching the store, and calling the result handler with a new result whenever any of the data the previous result depends on changes. ## Methods ### `refetch()` ```swift public func refetch() ``` > Refetch a query from the server. ### `cancel()` ```swift public func cancel() ``` > Cancel any in progress fetching operations and unsubscribe from the store.
20.518519
178
0.740072
eng_Latn
0.989128
4c1b932a8454a9e3db577b48add7c7f55e67d5a8
1,025
md
Markdown
_CodingChallenges/049-obamamosaic.md
aerinkayne/website
18f82ca7a8a5297465e045546b75048371205504
[ "MIT" ]
4,164
2018-02-13T02:17:17.000Z
2022-03-31T17:49:47.000Z
_CodingChallenges/049-obamamosaic.md
aerinkayne/website
18f82ca7a8a5297465e045546b75048371205504
[ "MIT" ]
1,376
2018-02-13T15:16:06.000Z
2022-03-31T04:24:19.000Z
_CodingChallenges/049-obamamosaic.md
aerinkayne/website
18f82ca7a8a5297465e045546b75048371205504
[ "MIT" ]
6,126
2018-02-12T22:08:41.000Z
2022-03-31T14:07:30.000Z
--- title: "Photo Mosaic with White House Social Media Images" redirect_from: CodingChallenges/49-obamamosaic.html video_number: 49 date: 2017-01-06 video_id: nnlAH1zDBDE repository: CC_049_ObamaMosaic links: - title: "The White House Social Media Data" url: "https://obamawhitehouse.archives.gov/blog/2017/01/05/new-lenses-first-social-media-presidency" - title: "ITP 'Obamathon' Github Repo" url: "https://github.com/ITPNYU/Obamathon/tree/master/examples/P5/TweetsByMonth" videos: - title: "My Video on Pixel Arrays" video_id: "https://youtu.be/EmtU0eloTlE" contributions: - title: "P5.js version using random pictures from Lorem Picsum" author: name: "Joonas Jokinen" url: "http://users.metropolia.fi/~joonahj/" url: "https://editor.p5js.org/jnsjknn/full/gKQv1Z_4E" source: "https://github.com/jnsjknn/Mosaic-maker" --- In this coding challenge, I use a collection of Obama Administration's facebook images to create a "photo mosaic" of President Obama with Processing (Java).
39.423077
156
0.742439
eng_Latn
0.24226
4c1daa1ac6dc61f5589d1cd6bebb849245900dec
508
md
Markdown
README.md
gzalo/ebike-throttle-fix
17eb4472f25d8efa9a513f95eba527a7123f6312
[ "MIT" ]
null
null
null
README.md
gzalo/ebike-throttle-fix
17eb4472f25d8efa9a513f95eba527a7123f6312
[ "MIT" ]
null
null
null
README.md
gzalo/ebike-throttle-fix
17eb4472f25d8efa9a513f95eba527a7123f6312
[ "MIT" ]
null
null
null
# ebike-throttle-fix Adapter for connecting a hall effect sensor throttle to a JYQD v7.3e2 brushless driver board Basically uses an opamp to shift a 0.8-4.2V signal into a 0-5V one. - LTSpice simulation can be found in the `simulation` folder (`MCP6001.lib` file is licensed by Microchip) - PCB (and schematic) editable in Proteus can be found in the `pcb` folder ![Schematic](images/schematic.png) ![PCB](images/pcb.png) ![PCB](images/pcb_real.png) ![PCB with components](images/pcb_with_components.png)
39.076923
106
0.765748
eng_Latn
0.9525
4c1dcb0273e4f412b61dfaa60024310c3071512a
1,054
md
Markdown
build/docs/AvailableTimeOffRange.md
MyPureCloud/platform-client-sdk-dotnet
66e36bfc23742ec0d902e67ca64718fd04ca2d74
[ "MIT" ]
13
2017-12-21T03:57:38.000Z
2022-02-17T11:21:47.000Z
build/docs/AvailableTimeOffRange.md
MyPureCloud/platform-client-sdk-dotnet
66e36bfc23742ec0d902e67ca64718fd04ca2d74
[ "MIT" ]
4
2017-10-02T14:10:41.000Z
2021-09-27T13:14:31.000Z
build/docs/AvailableTimeOffRange.md
MyPureCloud/platform-client-sdk-dotnet
66e36bfc23742ec0d902e67ca64718fd04ca2d74
[ "MIT" ]
19
2017-09-28T21:16:11.000Z
2022-03-30T20:22:34.000Z
--- title: AvailableTimeOffRange --- ## ININ.PureCloudApi.Model.AvailableTimeOffRange ## Properties |Name | Type | Description | Notes| |------------ | ------------- | ------------- | -------------| | **TimeOffLimit** | [**TimeOffLimitReference**](TimeOffLimitReference.html) | The time off limit | [optional] | | **StartDate** | **String** | Start date of the requested date range. The end date is determined by the size of interval list. Dates are represented as an ISO-8601 string. For example: yyyy-MM-dd | [optional] | | **Granularity** | **string** | Granularity choice for time off limit | [optional] | | **AvailableMinutesPerInterval** | **List&lt;int?&gt;** | The list of available time off values in minutes per granularity interval | [optional] | | **WaitlistedRequestsPerInterval** | **List&lt;int?&gt;** | The current number of waitlisted time off requests for every interval per granularity | [optional] | | **WaitlistEnabled** | **bool?** | Whether the time off request can be waitlisted | [optional] | {: class="table table-striped"}
55.473684
211
0.674573
eng_Latn
0.760908
4c1ded8bf9f9c0f313fcff940c151f5a480d4c1d
3,535
md
Markdown
publish/20201124.md
thuanpham2311/zet
1b957c58b01fdbb75f2a28aebd390a452cb52d19
[ "Unlicense" ]
1
2021-09-27T02:59:25.000Z
2021-09-27T02:59:25.000Z
publish/20201124.md
thuanpham2311/garden
1b957c58b01fdbb75f2a28aebd390a452cb52d19
[ "Unlicense" ]
7
2021-11-23T01:07:36.000Z
2021-11-26T12:36:05.000Z
publish/20201124.md
thuanpham2311/zet
1b957c58b01fdbb75f2a28aebd390a452cb52d19
[ "Unlicense" ]
null
null
null
# Coding with a calm mind and Stoicism **↓↓Table of Contents↓↓** - [Coding with a calm mind and Stoicism](#coding-with-a-calm-mind-and-stoicism) - [So what is philosophy even for?](#so-what-is-philosophy-even-for) - [What is Stoicism?](#what-is-stoicism) - [How do you apply it?](#how-do-you-apply-it) --- Welcome, everyone, to this applied-philosophy session. Today's session is hosted by: the philosopher nicholas, known by the nickname Chụy 7 Thuận. The purpose of this session is to help you keep a calm mind while coding. Traumatized after studying Marxist-Leninist philosophy? Don't worry! Today you will understand what studying philosophy is actually for. ## SO WHAT IS PHILOSOPHY EVEN FOR? From my perspective, as passed on to me by anh vui lên: the essence of philosophy is to help people become better. Learning and applying philosophy therefore makes life more enjoyable. ## WHAT IS STOICISM? Stoicism is a school of ancient Greek philosophy founded by Zeno of Citium in Athens in the early 3rd century BC. It is a branch of philosophy about human ethics, built on logic and on how people perceive the nature of the world. —wikipedia In short, it is a way of living that helps you stay peaceful and calm. ## HOW DO YOU APPLY IT? I will pull out the part of Stoicism I find most valuable: the way we perceive problems. Broadly, there are three ways of looking at things: 1. Things we cannot control. A code addict with a calm spirit, trained by Chụy 7 Thuận, is coding away. Suddenly dark clouds gather and the wind howls through the window. BANG!!! WTF, power outage. Rage rises, but remembering Chụy 7, he folds his hands, calms his mind, and recites: this is something I cannot control; stay calm, stay calm. If it cannot be controlled, there is nothing to be angry about; stay calm, stay calm. 2. Things we can partly control. 
A guy is deep in a coding high. Suddenly the phone rings. He picks up: hello, hello. Like a thunderclap, Chụy 7 reports that the server has gone down. He recites: this is something I only partly control. All those hard nights calculating traffic and configuring the server every which way. Oh well, even giants like Google and Facebook go down sometimes. Stay calm, stay calm. 3. Things we can fully control. Chụy 7, in a coding frenzy, digs out a project from a few months back to tinker with. With mortal eyes: oh, the code from a few months ago has vanished from memory. Oh, life is an ocean of suffering! Karma at work: messy code and laziness about writing documentation, and a few months later you cannot understand a thing you wrote. Recite: this is something I fully control. Lesson learned; apply Uncle Bob's technique: the sacred art of Clean Code. To sum it all up: through the way we look at problems, the Stoics teach us to recognize which things we fully control, which things we partly control, and which things we cannot control at all. From there, we can focus on the things we can control. This not only helps you stay calm while coding; applied in everyday life, it helps you feel more peaceful and happier. --- DISCLAIMER The stories, lessons, and advice here are my own personal experiences and do not reflect the views of any organization or company. Most of what I write is banter, so consider whether it fits your own situation before applying it. --- > #productive #philosophy #blog #programming
README.md · lukechilds/react-jsdom · MIT
# react-jsdom

> Render React components to actual DOM nodes in Node.js

[![Build Status](https://travis-ci.org/lukechilds/react-jsdom.svg?branch=master)](https://travis-ci.org/lukechilds/react-jsdom)
[![Coverage Status](https://coveralls.io/repos/github/lukechilds/react-jsdom/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/react-jsdom?branch=master)
[![npm](https://img.shields.io/npm/v/react-jsdom.svg)](https://www.npmjs.com/package/react-jsdom)

Makes testing simple React components super easy with any Node.js test framework.

## Install

```
npm install --save-dev react-jsdom
```

## Usage

```js
const React = require('react');
const ReactJSDOM = require('react-jsdom');

class Hi extends React.Component {
  render() {
    return (
      <div>
        <span>hi</span>
        <span>{this.props.person}</span>
      </div>
    );
  }

  componentDidMount() {
    console.log('I mounted!');
  }
}

const elem = ReactJSDOM.render(<Hi person="mum"/>);
// console: 'I mounted!'

elem.constructor.name; // 'HTMLDivElement'
elem.nodeName; // 'DIV'
elem.querySelector('span:last-child').textContent; // 'mum'
elem.outerHTML;
// <div>
//   <span>hi</span>
//   <span>mum</span>
// </div>
```

## License

MIT © Luke Childs
README.md · hank121314/vite-plugins · MulanPSL-1.0
## Packages

| **Package** | **Description** | **Version** |
| ----------- | --------------- | ----------- |
| [@originjs/vite-plugin-commonjs](./packages/vite-plugin-commonjs) | A vite plugin that supports commonjs to esm in vite. | [<img src="https://img.shields.io/npm/v/@originjs/vite-plugin-commonjs" alt="npm" />](https://www.npmjs.com/package/@originjs/vite-plugin-commonjs) |
| [@originjs/vite-plugin-require-context](./packages/vite-plugin-require-context) | A vite plugin that supports require.context in vite. | [<img src="https://img.shields.io/npm/v/@originjs/vite-plugin-require-context" alt="npm" />](https://www.npmjs.com/package/@originjs/vite-plugin-require-context) |
| [@originjs/vite-plugin-load-wasm](./packages/vite-plugin-load-wasm) | A vite plugin that supports importing and initializing a webassembly file in vite. | [<img src="https://img.shields.io/npm/v/@originjs/vite-plugin-load-wasm" alt="npm" />](https://www.npmjs.com/package/@originjs/vite-plugin-load-wasm) |
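As a sketch of how plugins like these are typically wired into a project (this assumes the named-export factory style commonly used by the `@originjs` packages; check each plugin's own README for the exact import name and options):

```javascript
// vite.config.js - hypothetical wiring of the commonjs plugin
import { defineConfig } from 'vite';
import { viteCommonjs } from '@originjs/vite-plugin-commonjs';

export default defineConfig({
  plugins: [
    // Transforms CommonJS modules (require / module.exports) to ESM
    // so Vite's dev server and build can consume them.
    viteCommonjs(),
  ],
});
```

Each plugin in the table slots into the same `plugins` array; they can be combined freely.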
gitbook/2021/05/20210526.md · yanalhk/live · MIT
#2021-05-26 11:27:38 Court Live Text Broadcast (法庭文字直播台)

\#Kowloon City Magistrates' Courts, Court 7
\#Magistrate 葉啓亮
\#0907 Prince Edward \#Trial \[5/6\]

D1: 甄 (18) D2: 朱 (27) D3: 余 (23) D4: 楊 (22)

Charges:
(1) Taking part in an unlawful assembly: D1, D2 and D4 are charged with taking part in an unlawful assembly with persons unknown at the junction of Nathan Road and Nullah Road on 7 September 2019.
(2) Obstructing a police officer in the due execution of his duty: D2 is charged with wilfully obstructing Police Officer X, acting in the due execution of his duty, outside Pioneer Centre, 750 Nathan Road, on 7 September 2019.
(3) Taking part in an unlawful assembly: D3 is charged with taking part in an unlawful assembly with other persons unknown outside Pioneer Centre on 7 September 2019.

The prosecution will call 8 witnesses in this case. Yesterday's witness was PW6, Senior Inspector 黃艷霞; today PW7, Senior Inspector 袁嘉宏 attached to Stanley Police Station, was called. Both statements were admitted by way of section 65B.

If a prima facie case is established, D1, D2 and D4 will testify; D3 is inclined not to.

10:05 Adjourned.

Thanks to the stand-in live reporter.

---

12:09:46 Court Live Text Broadcast

\#High Court, Court 1
\#Chief Judge of the High Court 潘兆初 \#Justice of Appeal 彭偉昌 \#Madam Justice 張慧玲
\#Review of Sentence \#20200101 Wan Chai

陳

Charges:
1. Criminal damage: on 1 January 2020, outside 金聯商業中心 at 458-468 Hennessy Road, Wan Chai, without lawful excuse damaged the Government's specialised crowd-management vehicle (water-cannon vehicle) AM9589.
2. Possessing things with intent to damage property: at the same time and place, had in his possession or control 1 plastic bottle containing 210 ml of an organic mixture of cyclohexane, a methyl alkane and 19 g of sucrose, 1 utility knife, 1 lighter and 1 hammer, intending to damage the property of another.

Background to the review: the respondent pleaded guilty and paid $9,900 towards the repair costs, and Principal Magistrate 錢禮 imposed an 18-month probation order. The Department of Justice considered the sentence manifestly inadequate and asked the magistrate to substitute a detention-centre order; after a hearing, the magistrate dismissed the application. The Secretary for Justice, dissatisfied with that decision, applied to the Court of Appeal for a review of sentence.

-------------------------

**Applicant**, represented by \#蕭啟業, Acting Senior Assistant Director of Public Prosecutions.

There are several aggravating factors in this case:
1. The respondent offended at the scene of a riot, and the disorder continued after his arrest.
2. He threw a brick at the water-cannon vehicle from within a crowd, risking inciting others to offend; the footage clearly shows that after his arrest someone nearby shouted "Run!", showing a crowd had gathered, and his conduct risked a ripple effect.
3. His conduct could have caused a traffic accident; the water-cannon vehicle is an important vehicle that helps the police disperse and control crowds, and the respondent plainly targeted an important vehicle.
4. He carried a variety of dangerous items.
5. His full protective gear, together with the items he carried, shows he was amply prepared before offending.

Deterrence, punishment and public denunciation: the applicant submitted that given the gravity of the case, deterrence, punishment and public denunciation should dominate, and rehabilitation and the respondent's personal background become less important.

Probation addresses rehabilitation only: citing _SJ v SWS_, the applicant submitted that custodial sentences also carry a substantial rehabilitative element while serving punishment, public denunciation and deterrence, but the magistrate's sentencing focused only on rehabilitation and was plainly too lenient. Given the gravity of the case, a deterrent sentence is necessary.

**Respondent**

The respondent's culpability: he was very cooperative on arrest and was not the one who set fires or built road blockades. 彭 JA queried that even if he was not the culprit, he must have noticed the surrounding scene, and had he been the culprit he might well have been charged with public nuisance instead.

彭 JA also queried why the respondent carried a hammer and a lighter. Counsel replied there was no evidence the items had been used; 張 J countered that the respondent had already admitted intending to use the items to damage property, so whether they were used is beside the point.

Fixed sentencing options: the respondent submitted the offence has no fixed sentencing option; 張 J queried whether the gravity of this case can be compared with vandalising an Octopus machine. 彭 JA likewise queried whether the magistrate had taken account of all sentencing considerations and passed an appropriate sentence.

Acting on momentary impulse: the respondent said he offended on a momentary impulse, which the magistrate accepted; 彭 JA queried the evidential basis for that claim, and given the respondent's attire and the items he carried, the claim is hard to sustain.

Plea for a community service order report: even if the Court of Appeal finds the sentence manifestly inadequate, the respondent has already completed 9 months of the probation order, and the progress reports show improvement in his studies and a stronger law-abiding awareness; the respondent asked the court, besides obtaining correctional-institution reports, to obtain a community service order report as well.

-------------------------

The Court of Appeal stressed that _SJ v SHY_ has already reaffirmed the principles in _SWS_ for sentencing young offenders, and the lower courts should follow those principles; the magistrate failed to apply them properly, and the court held there was an error in principle and that the sentence was manifestly inadequate. The reason: the magistrate's sentencing gave insufficient weight to the circumstances of the offence, the gravity of the case and the respondent's culpability, focusing only on his rehabilitation and personal background.

The Court of Appeal allowed the Secretary's application, quashed the probation order, and remanded the respondent while rehabilitation centre, detention centre and training centre reports are obtained.

Adjourned to 10 June 2021, 09:30, for sentence.

---

13:09:07 Court Live Text Broadcast

\#Fanling Magistrates' Courts, Court 6
\#Magistrate 黃國輝
\#1112 Tai Po \#Trial [1/2] (morning progress)

D1: 賴 (17) D2: 蘇 (21)

Charges:
(1) D1: possession of an instrument fit for unlawful purposes with intent to so use it. Charged with possessing 1 spanner on 12 November 2019 at the junction of On Pong Road and On Chee Road, Tai Po, New Territories, near the Tai Po Arts Centre, intending to use it for an unlawful purpose.
(2) D2: possession of an instrument fit for unlawful purposes with intent to so use it. Charged with possessing 1 spanner and 1 pack of cable ties at the same time and place, intending to use them for an unlawful purpose.

❌ D1 and D2 plead not guilty ❌

The prosecution indicated it will call 3 witnesses, namely the 2 sergeants who arrested the defendants and 1 officer who handled the exhibits; it will not rely on footage and there are no other records of interview. The defence indicated it has no witnesses.

Admitted facts were submitted.

PW1, arresting Sergeant A (on 畜龍 duty), who arrested D1, called; testimony completed, details to follow.
PW2, arresting Sergeant B (on 畜龍 duty), who arrested D2, called; cross-examination by the defence in progress.

12:57 Lunch adjournment; resuming 14:30.

---

13:15:04

\#District Court, Court 33
\#Judge 李俊文 \#Trial \[3/7\]
\#0831 Wan Chai \#Riot \#LaserPen

‍♂️ D1: 楊 (25) D2: 湯 (20)

[Charge details here](https://t.me/youarenotalonehk_live/15819)

Continued from: [https://t.me/youarenotalonehk\_live/15851](https://t.me/youarenotalonehk_live/15851)

--------------------------

PW4, Police Constable 朱福弘, testifying. He now serves in the Central District public-events team; at the time of the incident he was an acting Chief Inspector of the Police Tactical Unit.

The arrest of the second defendant: 朱福弘 said that on the day in question, at the junction of Wan Chai Road and Morrison Hill Road, he saw the second defendant wearing a black T-shirt, black trousers and white shoes, with a black scarf at the neck and a pink respirator, carrying a black backpack and holding a long umbrella. 朱福弘 alighted to give chase and finally subdued her on the carriageway of Morrison Hill Road near Chan Tung Lane.

The second defendant did not resist when subdued, although when she was tackled to the ground her umbrella, hiking pole and mobile phone also fell. At 20:21 WPC **17507** arrived to assist; at 20:25 he briefed WPC **19083** on the scene, after which 朱福弘 and WPC **17507** left to continue the operation.

CCTV footage recording the subduing of D2 was played...

吳珞珩, barrister for D2, put to 朱福弘 that D2 was bending down to pick up her mobile phone from the ground; 朱福弘 said, "When I first saw D2 she was already running. I saw her bend down, but I don't know what she was doing."

--------------------------

PW5, WPC **19083** 許小鳳 (transliteration), testifying. That day she was attached to the 4th platoon of Police Tactical Unit Company E, on duty in uniform.

**Background**: at 20:11 she received instructions to go to 59 Morrison Hill Road to sweep for and arrest persons believed to have vandalised the Ta Kung Pao sign that evening. She heard over the radio a request for women officers to assist at the Canal Road flyover, so she went together with WPC **18336**; on arrival she saw 朱福弘, his team members and D2.

**Processing the second defendant**: WPC **17507** was holding D2 down; D2 was brought to the pavement, where WPC **18336** applied plastic handcuffs to her and carried out an initial search, confirming she carried no offensive weapon; the exhibits remained in PW5 WPC **19083**'s custody until handover.

PW5 WPC **19083** 許小鳳 arrested D2 for unlawful assembly, suspecting D2's involvement in the damage to the Ta Kung Pao sign. At 20:30 the same day, D2 was escorted by WPC **18336** in police vehicle AM7263 to Wan Chai Police Station, where D2 was searched in interview room 113.

The following were found in the second defendant's backpack: knee pads, 1 black peaked cap, 1 camouflage helmet, 1 pair of goggles, 1 pink iPhone, a grey cloth mask, 1 white-grey glove, 1 pair of black gloves, a bamboo arm guard, 1 roll of 0.8 mm fishing line, 2 rolls of 0.4 mm fishing line, 1 wooden stick about 15 cm long with two cable-tie loops, and 1 laser pen about 15 cm long capable of emitting blue light. Personal effects such as her wallet were sealed in valuables bag **B0728062**.

Under cross-examination by barrister 吳珞珩, WPC **19083** 許小鳳 confirmed she saw D2 with the backpack at the scene and witnessed WPC **18336**'s quick search of D2. Asked why the backpack was not searched to confirm there were no offensive weapons, WPC **19083** said she had to leave quickly to handle other matters, so she "didn't think that much", and agreed D2 was cooperative throughout.

**A knee pad mistaken for a phone??** After looking at the exhibit photographs provided by the case officer, WPC **19083** 許小鳳 realised her earlier statement was wrong, so 18 months later she made a fresh statement, changing the black mobile phone found in the bag to black knee pads.

Barrister 吳 challenged her: "Could it be that you don't remember what you searched? Your memory of the exhibits taken from D2 is in fact unclear, and you never actually found a laser pen in D2's backpack, which is why at the time you never cautioned or arrested D2 over the laser pen."

PW5 WPC **19083** 許小鳳 said: "This was a slip of the pen. My hand wrote 'phone' too quickly by mistake; it was just carelessness." Because the items seized were so many and varied, the impression was deep and she would not misremember; her evidence relies on her own independent memory. Also, because she lacked the professional knowledge to judge whether the laser pen's power met the definition of an offensive weapon, she did not investigate, arrest or caution D2 on that basis.

\[Details of the exhibit-handling process to follow\]

--------------------------

13:05 Adjourned; resuming 14:30.

---

13:15:34

\#District Court, Court 28
\#Judge 葉佐文 \#Continued trial (8/10) Part 1

鄭, 丘, 黃 (19-24) \#1102 Wan Chai

The three have been remanded for over 18 months.

Charge 2: possessing things with intent to destroy or damage property. The three are charged in that, between 00:00 and about 15:05 on 2 November 2019, when police officers entered by breaking into the premises at Flat F, 4/F, Koon King Building, 10 Canal Road West, Wan Chai, Hong Kong, they had in their custody or control at those premises 59 petrol bombs, 79 glass bottles each containing white cotton waste, 50 glass bottles, about 4.6 litres of pale-yellow liquid containing diesel, about 1.0 litre of colourless liquid containing isopropanol, about 4.9 litres of green liquid containing petrol, white cotton waste, strips of cloth, cloth and towels, intending without lawful excuse to use them, or to cause or permit another to use them, to destroy or damage property belonging to another.

==================

Day 7 of trial, Part 2: [https://t.me/youarenotalonehk\_live/15866](https://t.me/youarenotalonehk_live/15866)

"He" = PW14

⭐ On day 8 of the trial, he forgot to bring his notebook to court... luckily, the prosecution had prepared a copy ⭐

**After completing the formal search:**

At 18:00 he told the Identification Bureau officer responsible for photography, PW15, about the layout of the flat to be photographed and the exhibits already seized (P1-P92 and personal items), and told the Identification Bureau officer responsible for fingerprinting, PW17, that all the glass bottles, whether or not they held liquid, cotton waste or cloth strips, needed to be fingerprinted, because he considered these bottles important and possibly petrol bombs capable of producing serious offences.

At 18:05 photography began at the scene. He has a vivid memory that the state of the items was unchanged, because the smell of petrol given off by these items left a deep impression on him.

He said he was present during PW15's photography; items in the flat were moved, and he and PW15 communicated with each other. On the question of the curtains, he said they were closed when he entered and were then opened by him. P34 and P69 (a grey bag) were also moved by him. While the photography was under way, fingerprinting was carried out at the same time. He kept an eye on both Identification Bureau officers' work during these two tasks, but did not concentrate on either one of them.

He said PW17 would mark the spots where fingerprints were found with transparent stickers bearing little "flags", and that he reported this to the case officer at the time.

At 19:35 he left the flat and stopped recording exhibits in his notebook, going together with PW17 to Flat A, 3/F to gather evidence; PW15 had also completed the photography and left the flat, as other scenes needed photographing, while PW16, who during the search had helped prevent other officers from touching the exhibits, stayed behind to guard the flat. No defendant was present by then; PW17, PW7 and PC 11027 remained at the flat.

While in Flat A, 3/F, he found 7 blood samples. At 19:44 PW15 indicated the photography was complete; at that time he was in Flat 3A. After finishing with Flat 3A, he returned to the 4/F flat at about 20:15 to seize the exhibits and personal property. He said the number of exhibits at 20:15 was different, because case officer 袁智恆 (transliteration) told him some defendants had claimed their own personal items.

At 20:16, he said, the procedures for C and D to claim their personal items were still ongoing. He took back from PW16 the auxiliary sheets he had earlier handed to PW16, and so knew the personal items of three of the defendants.

Once the defendants had claimed their personal items, he followed case officer 袁智恆's instruction to complete the formal seizure of P1-P92 as quickly as possible, because there were protest marches near Canal Road; he finally completed the seizure procedure at 21:19.

At 21:19, on case officer 袁智恆's instruction, he showed and read out to A the exhibit list recorded in his notebook and had A sign it, with PW8 present during this. But he did not show A the 8 to 10 auxiliary sheets. He also said he did not show the exhibit list to B, C, D or E, because apart from simply following the case officer's instruction, he could not see B, C, D or E at the scene at the time. Even after returning to the police station, he did not show the exhibit list in his notebook to B, C, D or E.

⭐ Judge 葉佐文 asked him the purpose behind showing A the exhibit list recorded in his notebook; he answered he was merely acting on case officer 袁智恆's instruction. Judge 葉佐文 pressed further on what he himself thought the purpose of his action was; he answered that "my mind was a blank at the time", that is, he was single-mindedly acting on the case officer's instruction. ⭐

He did not know that at 21:20 PW8 asked A whether A needed a hood, nor could he confirm when PW8 took A away. He did not see PW8 putting A's shoes on. He also forgot whether A was wearing glasses at the time or said he could not see the contents of the notebook; A never indicated to him that he wanted to wear glasses. Counsel suggested the signature in the notebook was not A1's own signature; he disagreed, saying he does not know A's English-style signature; all he knows is A's Chinese-style signature in his notebook. He said he did not forge the signature, nor did he let anyone else sign his notebook.

He also forgot how long he spent reading out the exhibit list in his notebook to A. He did not know whether it takes 3 minutes to read out the 8.5 pages of the notebook.

⭐ So... he carried out a live test in court. In the end it took him about 4 minutes 2 seconds (finally revised to about 4 minutes)... ⭐

Counsel questioned how the whole procedure of having A1 sign the notebook (including PW8 removing A's handcuffs) could be completed in 3 minutes; he agreed that would be difficult. He agreed he did not record the above in his notebook or in the seven statements, and in the above circumstances he did not caution A, since no admission or confirmation of exhibits was involved. He also considered no read-over declaration was needed, since no admission or words spoken by A were involved.

He only learned of the 20:45 and 21:22 time points in court. He also made no record at all of showing the record to A, nor did he record the time PW8 brought A into the flat. He said the 21:19 time point was recorded, and the act of showing the record was done immediately after he finished seizing the exhibits, but he did not record the time at which that act took place.

⭐ So, he did not agree with what PW8 said, namely that PW8 had already taken A out of the flat by 21:19... Judge 葉佐文 observed that this amounts to denying the prosecution's own "truth"..... ⭐

He said the showing of the record to A took place between 21:19 and 21:30.

He said the 21:19 record was by his own watch, which had not been synchronised with PW8's watch, and his own watch had not been checked against Observatory time.

Judge 葉佐文 asked whether this meant earlier witnesses should be recalled to verify their watch times. Acting Assistant Director of Public Prosecutions \#張卓勤, after consideration, indicated he would not do so.

He said he only took off the gloves he had worn all day at about 21:45 to 22:00.

Day 8 of trial, Part 2: [https://t.me/youarenotalonehk\_live/15888](https://t.me/youarenotalonehk_live/15888)

---

14:30:57 Court Live Text Broadcast

\#Fanling Magistrates' Courts, Court 5
\#Magistrate 陳炳宙 \#Trial [1/4\]
\#1013 Fanling 

D1 陳 (18) D2 \* D3 \* D4 鄭 (16)

Charges:
1. Unlawful assembly
2. Criminal damage
3. Use of facial covering
4. Possession of an offensive weapon

The prosecution and defence do not dispute the footage; D1 disputes only the CCTV and submissions; D2 and D3 dispute only submissions; D4 disputes the circumstances concerning the laser pen.

--------------------------

Progress: in the morning, cross-examination of PW1 (Bank of China staff) was completed and examination-in-chief of PW2 (Mr 朱, a security guard) by the prosecutor began. In the afternoon, the examination-in-chief of PW2 continued and was completed, and counsel for D1 cross-examined; the other defendants' counsel did not cross-examine and the prosecutor did not re-examine. The court then dealt with the admissibility of prosecution exhibit PD3, a disc.

14:35 Court resumed. PW2, Mr 朱, continued his evidence; the prosecutor continued.

Clip 1 played: [second set of footage (CCTV), channel 15] 15:54:59-15:55:11; 15:59:30-15:59:53.
Clip 2 played: [second set of footage (CCTV), channel 15] 16:00:08-16:00:19; 16:05:00-16:10:00.
(Note: the footage shows protesters outside the Bank of China branch setting up an umbrella wall and carrying out "renovation" [i.e. vandalism], then leaving.)

PW2 said he saw the scene after the damage that day, that is, the Bank of China's glass shattered.

Clip 3 played: [second set of footage (CCTV), channel 14] 15:55:00-15:55:13; 15:59:45-15:59:56; 16:00:07-16:00:17; 16:10:00-16:11:00.
(Note: this camera covers the carriageway and the entrance of the hourly car park, with members of the public and protesters passing by.)

The prosecution supplemented the time ranges relied on for each channel:
CH1, 15 → 15:55:00-16:10:00
CH14 → 16:10:00-16:11:00

PW2 said the position shown is reached by walking about 20 metres from the Bank of China towards Shek Kong and turning; he agreed one end of the lane beside the Bank of China leads to a roundabout, while the other end, after a further turn, reveals the car park. The magistrate said he had difficulty following the directions and asked PW2 to mark on a map the positions and shooting directions of at least three cameras: the roundabout, outside the Bank of China, and the car-park entrance, and to label each channel number. This map was later admitted as prosecution exhibit P13.

Exhibit P13, the CCTV map, was shown. PW2 said the channel 1 camera is a panning camera, closer to Wellcome; its left side captures Wellcome and its right side 7-Eleven.

--------------------------

Cross-examination by counsel for D1:

On whether the footage was interfered with. PW2 said this morning that when the police requested the CCTV footage, he transferred the footage to the DVR, then onto a USB stick, then inserted that into a computer and burned a CD. But when PW2 was asked this morning who ultimately handed it to the police, he first said he had forgotten; the second time, after "the 18th" was mentioned, he recalled handing it to the police himself. Under cross-examination in the afternoon, PW2 clarified that the first question was rather general; the second time he answered, he recalled that he had been busy that day, but a colleague said police were looking for him, and as he happened to be at his seat he handed over the CCTV personally.

PW2 said the CCTV recording system is connected to the DVR unit, which sits in an unlocked place at his seat; accessing the system requires a password of 5 letters and 4 digits. Apart from himself, only the property officer of the management office knows this password. Between burning the disc on 15 October and handing it to the police on the 18th, the property officer did not access the CCTV system. PW2 said he burned the disc immediately after receiving the instruction; it was a Tuesday and, not wanting his colleagues to see, he waited until they had left work before doing so.

This morning PW2 mentioned that when the police came to him on the 13th, he played footage for them to watch; they watched quickly, because the police wanted to know whether any camera had captured the situation at the time. The smoothness of that footage matched what was played in court. Mainly the CCTV at the Bank of China's doorway was played; after watching, the police concentrated on finding the entry points and times of the people involved.

On the picture sticking. This morning, answering the prosecutor on why the picture from one camera stuck, PW2 said the channel 1 camera pans. The defence then asked why other channels' pictures stuck; PW2 said channel 6 is in a high position and vibrates when the wind blows, while channel 14 at the entrance is also a panning camera, but a breakdown left it unable to rotate, so it became a fixed camera. The defence asked why the picture in the camera would jerk; PW2 said that because the camera is mounted high, it vibrates when heavy vehicles pass, so the picture jerks. He said he knows this is what happens; it is not personal speculation. PW2 agreed these cameras are subject to interference from the external environment (vehicles passing).

Counsel for D2-D4 did not cross-examine; the prosecutor did not re-examine.

--------------------------

Dealing with the dispute over the admissibility of the exhibit \#Submissions

Prosecution submissions: whether an exhibit can be admitted turns on two main points: (1) relevance and (2) prima facie authenticity. Understanding that defence counsel do not dispute (1) but dispute (2), the prosecution called PW2 to give a complete account of the disc-making process; he was on duty at his post during the relevant period and was the person with the closest contact with the exhibit disc, and from operation through to burning there was no human interference, so his evidence is reliable.

Submissions by counsel for D1: counsel first pointed out that PW2 agreed in evidence that the footage was subject to interference from the external environment, and then submitted the following authority and provisions: HCCC 252/2019 _HKSAR v Ko Wai Kit and another_ (page 24, paragraph 49) (note: this passage roughly states that the reliability of the system producing the screenshots must be proved, not merely the authenticity of the content; the phone screenshots had to accurately display the contents of the phone screens, accurately display the messages passing among the three phones, and it had to be proved that the first and second defendants were the senders of the messages).

Counsel for D2 cited section 22A of the Evidence Ordinance: "throughout the period during which the computer was so used in the course of the said activities... (ci) appropriate measures were in force for preventing unauthorised interference with the computer", and since PW2 had said the CCTV was subject to interference from the external environment (such as the movement of heavy vehicles), defence counsel submitted the prosecution must prove the reliability of the exhibit beyond reasonable doubt.

Magistrate 陳 asked defence counsel to point out which frames were actually interfered with. Counsel for D2 said the footage was interfered with at every point where it stuck, but counsel could not show that the interference affected the authenticity of the images, pointing out only that across all the clips, moving objects shift position discontinuously, and citing as an example [channel 14 footage, 16:09:56-16:09:57], in which a woman in white trousers jumps from three body-widths from the right edge to two body-widths, the footage failing to show her walking continuously. Counsel then reiterated that no object in the footage suddenly disappears or appears.

Magistrate 陳 countered that the "beyond reasonable doubt" standard proposed by counsel for D1 is not the standard of the higher courts, citing two authorities, _HKSAR v Lee Chi Fai_ \[2003\] and the Court of Final Appeal's _HKSAR v Chan Siu Tan_ \[FAMC 48/2019\]: the Appeal Committee there did not require the admissibility of exhibits to be proved beyond reasonable doubt, but instead considered relevance and prima facie authenticity. The magistrate said the authority submitted by counsel for D1 concerned only text messages on mobile phones, whereas _Lee Chi Fai_ and _Chan Siu Tan_ concerned video footage; following the higher courts' guidance, there is really no need to consider section 22A of the Evidence Ordinance.

Counsel for D1 replied that the cases cited by Magistrate 陳 did not touch on the issue of reliability, because the _Lee Chi Fai_ case did not

(The original update breaks off here.)

--------------------------

The case is adjourned to 27 May, 09:30, at Fanling Magistrates' Courts, Court 5, for continued trial; tomorrow Magistrate 陳 will decide whether the exhibit may be admitted. The defendants continue on their original bail conditions.

Thanks to the stand-in live reporter.

---

15:05:24 Court Live Text Broadcast

\#Kowloon City Magistrates' Courts, Court 6
\#Magistrate 鄭念慈 \#Continued trial \[6/2\]
\#1020 Mong Kok

陳 (18)

Legal representative: \#吳宗鑾, barrister

Charges:
(1) Disorderly conduct in a public place
(2) Assaulting a police officer
(3) Assaulting a police officer

On the special issue, the magistrate accepted that the defendant's statement be excluded from the prosecution's evidence.

The case is adjourned to 29 May 2021, 09:30, West Kowloon Magistrates' Courts, Court 2, for closing submissions; the defendant continues on bail on the original conditions. 16 June is also reserved for the verdict.

Thanks to the stand-in live reporter.

---

15:19:56

\#Tuen Mun Magistrates' Courts, Court 1
\#Magistrate 水佳麗 \#Pre-trial review
\#20201003 Yuen Long

黃 (38)

Charges:
(1) Desecrating the national flag
(2) Desecrating the regional flag

Particulars:
(1) Charged in that on 3 October 2020, at On Ning Road, Yuen Long, New Territories, near lamppost FB3333, he publicly and wilfully desecrated 8 national flags by damaging them.
(2) Charged in that on the same day and at the same place he publicly and wilfully desecrated a regional flag by damaging it.

‼️ The defendant pleaded guilty ‼️

Mitigation: several mitigation letters were submitted, showing a positive background, the support of family, friends and his supervisor, and that he is a diligent and kind person. Of the Court of Appeal's aggravating factors for desecrating the national flag, namely premeditation, inciting others, occurring during a demonstration, taking flags from government facilities, and bringing one's own flags, not one is present in this case. The offence took place late at night; the defendant acted alone, with no public event happening nearby; he was under the influence of alcohol and offended on a momentary impulse. The flags in question had been hung unlawfully without approval. He admitted the offence on the spot when arrested. The defendant feels deep remorse for what happened and is willing to pay the first prosecution witness $1,080, the total value of the flags, and hopes the court will deal with him by a non-custodial sentence.

The case is adjourned to 9 June 2021, 15:30, Tuen Mun Magistrates' Courts, Court 6, for sentence; background and community service order reports are to be obtained in the meantime, and the defendant remains on bail on his existing conditions.

Thanks to the stand-in live reporter.

---

16:05:25 Court Live Text Broadcast

\#Fanling Magistrates' Courts, Court 6
\#Magistrate 黃國輝
\#1112 Tai Po \#Trial [1/2]

[Morning progress](https://t.me/youarenotalonehk_live/15878)

**Afternoon progress** (this update)

D1: 賴 (17) D2: 蘇 (21)

Charges:
(1) D1: possession of an instrument fit for unlawful purposes with intent to so use it.
(2) D2: possession of an instrument fit for unlawful purposes with intent to so use it.

PW2, arresting Sergeant B, continued his evidence; cross-examination by the defence completed.

15:57 Adjourned. The case is adjourned to tomorrow (27/5), 09:30, at Fanling Magistrates' Courts, Court 6, for continued trial; both defendants continue on their existing bail conditions.

---

16:27:25 Court Live Text Broadcast

\#Kowloon City Magistrates' Courts, Court 7
\#Magistrate 葉啓亮
\#0907 Prince Edward \#Trial \[5/6\]

D1: 甄 (18) D2: 朱 (27) D3: 余 (23) D4: 楊 (22)

Charges:
(1) Taking part in an unlawful assembly: D1, D2 and D4 are charged with taking part in an unlawful assembly with persons unknown at the junction of Nathan Road and Nullah Road on 7 September 2019.
(2) Obstructing a police officer in the due execution of his duty: D2 is charged with wilfully obstructing Police Officer X, acting in the due execution of his duty, outside Pioneer Centre, 750 Nathan Road, on 7 September 2019.
(3) Taking part in an unlawful assembly: D3 is charged with taking part in an unlawful assembly with other persons unknown outside Pioneer Centre on 7 September 2019.

---------------------------------------------------

----- Supplementing part of the record of D3's counsel cross-examining the last prosecution witness; those who attended the hearing are welcome to add details, [click here](https://t.me/youarenotalonehk_livebot) -----

The defence played a news clip filmed that night by Hong Kong Open TV, from 00:56 to 01:10 of the recording. According to the defence, the clip begins at around 9 pm that evening, roughly when the police began intercepting protesters.

Defence: Did you see just now a person in white run diagonally across Nathan Road from the left of the frame to the right of the frame?
Witness: I did. The one at the bottom right?
Defence: Yes. Could that woman, or that person, be the one you said you identified on the CCTV and the Now footage as the third defendant?
Witness: Could you play it again?
Defence: Certainly.
(The clip is replayed.)
Defence: Did you see it?
Witness: Yes.
Defence: Is this person the one you identified on the CCTV and the Now footage?
Witness: Very likely.
(The defence submits a screenshot.)
Defence: Does the clip just played show this person's front?
Witness: Yes.
Defence: Let's go back to 1 minute 09 seconds, which is the time this screenshot corresponds to. You can see in the middle of the frame a person resembling the one you previously described as, in your view, the third defendant: black cap, black trousers, white top, dark shoes. Looking at the screenshot, the resolution is only average, of course, but you can see the front of the top. Do you agree there is a fairly large design on the top?
Witness: A black circle, at the chest.
Defence: In the dead-centre position?
Witness: Slightly left of centre.
Defence: You've seen the exhibit, of course you'd say that.
Witness: Looking at the clip now, it's the same.
Defence: The clip shows three black (inaudible); the whole design is made up of three black parts?
Witness: I disagree.
Defence: Let's watch it again in slow motion.
(Replayed from 01:08.)
Defence: Would you see that from a distance it looks like just one black dot, but close up you can see two or three darker parts?
Witness: I disagree.
Defence: Let's use the screenshot again; the screenshot is clearer. Even if it is a circular design, it sits in the dead centre of the chest?
Witness: Yes.
Defence: Now look at the photograph of the defendant's top. Let's look at photo album P7, photo 24. Do you agree the design is at the centre-left of the chest?
Witness: Yes.
Defence: And not in the dead centre of the top?
Witness: Yes.
Defence: Go to photo 31 as well... In short, you confirm the design (on the defendant's top) is on the left, yes?
Witness: Centre-left.
Defence: The woman in the clip just now: her design is in the dead centre.
Witness: I disagree.
Defence: And not in a position offset to the left.
Witness: I disagree.
Defence: In summary, I'd say you identified her mainly by that top. Do you agree?
Witness: I agree.
Defence: You made no careful comparison, for instance whether she wore a mask, whether she wore a cap, the position of the zip on the bag.
Witness: I disagree.
Defence: Your comparison amounted mainly to seeing a round dot on that person's top and saying that person was the third defendant.
Witness: I disagree.
Defence: I put it to you that your method of comparison was sloppy, careless and slapdash.
Witness: I disagree.

The defence applied to the court for an exhibit number for the Hong Kong Open TV news clip; the court marked the clip as exhibit D3(4).

Defence: In this screenshot we can see, the design is one continuous design.
Witness: ...A continuous design?... It's a black dot.
Defence: You insist it is a single black dot, and not three black parts?
Witness: Yes.
Defence: I put it to you that this design is larger than a black dot.
Witness: I disagree.

Cross-examination for D3 completed; the prosecution did not re-examine; D4's counsel did not cross-examine; the witness's evidence is complete.

---------------------------------------------------

The prosecution case is closed; none of the defence counsel made half-time submissions.

❗️❗️❗️ The court ruled that a prima facie case is established on every charge each defendant faces ❗️❗️❗️

D1, D2 and D4 elect to testify; D4 will also call one defence witness.

D3's counsel must handle another case at Tuen Mun Magistrates' Courts tomorrow morning and expects to arrive at this court only between 11 and 12 o'clock; he applied for D4's counsel to hold papers for him until he arrives, which the court granted.

The defence proposed opening the defence case only tomorrow; the court granted this and announced the trial will continue at 09:30 tomorrow morning in the same court.

---

16:32:39

\#District Court, Court 33
\#Judge 李俊文 \#Trial \[3/7\]
\#0831 Wan Chai \#Riot \#LaserPen

‍♂️ D1: 楊 (25) D2: 湯 (20)

[Charge details here](https://t.me/youarenotalonehk_live/15819)
[Morning progress here](https://t.me/youarenotalonehk_live/15879)

--------------------------

PW6, woman Detective Police Constable **12328**, testifying.

The handover of D2's exhibits (excerpt): as stated above, WPC **12328** received a total of 21 exhibits belonging to D2 from WPC **19083** 許小鳳.

PW6 WPC **12328** claimed: "At the handover the exhibits weren't held in any bag; WPC **19083** 許小鳳 carried them in both hands and handed all the exhibits to me in one go." Barrister 吳, hearing this, was astonished: "All 21 items? She just carried them over to you like that?" WPC **12328** confirmed: "Yes."

WPC **12328** has not finished her evidence and will continue under cross-examination by the defence tomorrow; details of the chain-of-custody dispute to follow.

❗ Separately, because one exhibit's provenance is unknown and an exhibit's description does not match the physical item, the second defendant will challenge the admissibility of the exhibits; the court has scheduled Friday for both sides to make submissions on the legal points. ❗

--------------------------

16:32 Adjourned. The trial continues tomorrow at 11:00 in District Court (Wan Chai), Court 33; both defendants continue on their original bail conditions.

---

16:46:22

\#District Court, Court 28
\#Judge 葉佐文 \#Continued trial (8/10+4) Part 2

鄭, 丘, 黃 (19-24) \#1102 Wan Chai

The three have been remanded for over 18 months.

Charge 2: possessing things with intent to destroy or damage property. The three are charged in that, between 00:00 and about 15:05 on 2 November 2019, when police officers entered by breaking into the premises at Flat F, 4/F, Koon King Building, 10 Canal Road West, Wan Chai, Hong Kong, they had in their custody or control at those premises 59 petrol bombs, 79 glass bottles each containing white cotton waste, 50 glass bottles, about 4.6 litres of pale-yellow liquid containing diesel, about 1.0 litre of colourless liquid containing isopropanol, about 4.9 litres of green liquid containing petrol, white cotton waste, strips of cloth, cloth and towels, intending without lawful excuse to use them, or to cause or permit another to use them, to destroy or damage property belonging to another.

==================

Day 8 of trial, Part 1: [https://t.me/youarenotalonehk\_live/15880](https://t.me/youarenotalonehk_live/15880)

"He" = PW14

**Preparing P612A and amending statements:**

He agreed the original content, before being recorded in the notebook, is important. He also agreed that the version organised back at the police station is no longer the most original content. He further agreed that at the time he did not appreciate the importance of this original content, and so did not seek his superiors' approval before destroying the auxiliary sheets.

He said the nature of the items in the sketch made after returning to the station is correct, because he could make comparisons. But he also agreed there was a chance he copied the items' positions wrongly at the time, and since the original content had been destroyed, he has nothing to compare against.

Counsel pointed out errors between his record prepared at 18:00 and the one prepared on arrival at the station, namely a box inside a bag versus a bag inside a box. He agreed, without any particular reason.

After this, he amended the contents of P612A. The first time, because on case officer 袁智恆's (transliteration) instruction he viewed the photographs taken by the Identification Bureau and found his own record wrong, he amended the content of P612A concerning exhibit P91, redrawing a new sketch, which became P612B; he amended his statement and signed at 11:05 on 13 August 2020. He said he genuinely discovered the error in P91's position and decided to make the amendment.

The second time, again because on case officer 袁智恆's instruction he viewed the Identification Bureau photographs and found his record wrong, he amended the content of P612B concerning exhibit P37, adding the opening direction of the balcony sliding door, together with changes to the proportions and position of a wooden cabinet; he redrew a new sketch, which became P612C, and subsequently amended his statement at 18:05 on 5 May 2021 and signed on P612C. He agreed the earlier sketches were not exactly accurate.

He said he did not add to P612C any items unconnected with his own memory. As for his impression of the items on the balcony, apart from a washing machine he has no recollection, and he said nobody asked him to make changes regarding the balcony items.

He still has a rough memory now of where the items were kept at the time. He said his memory when drawing P612C was clearer than when drawing P612B and P612A.

Judge 葉佐文: "So the longer the time passes, the clearer the memory?"

Later, with Judge 葉佐文's assistance, he said that after seeing the photographs taken by others, he revised his own memory.

He said case officer 袁智恆 did not at the time point out that his record was wrong; 袁智恆's words were "see whether there are any discrepancies", and he surmised 袁智恆 meant his statement might contain errors. He agreed that had he not viewed those photographs, he might not necessarily have made the amendments.

He said he had used auxiliary sheets in past work, and would destroy them after finishing. He has heard of "unused materials", but to his understanding these auxiliary sheets would not be placed among "unused materials". He said the one sheet that was not destroyed, a sketch drawn with a ruler, that is, the "clean copy" of P612A, is not an auxiliary sheet; after further work it evolved into the exhibit, P612A, but this process was not recorded.

He made a new statement on 24 March 2021, saying that at 23:15 on 2 November 2019 he received a WhatsApp video of 52 seconds (P93). He said he had watched the video and saw the flat's balcony and defendants leaving via the balcony. But because he considered the balcony items unimportant, he did not seize them. He did not agree that his seizure of exhibits was incomplete.

On 2 May 2021 he received (the wording being "was ordered") case officer 袁智恆's instruction to make a new statement. He agreed he was passive throughout the matter, but was not forced to act, and would not blindly follow an instruction that would break the law.

At 17:00 on 20 May 2021 and 12:00 on 21 May, he was asked by 袁智恆 to make two supplementary statements, but he does not know the exact reason.

**Exhibit handling:**

At 21:30 he and PW16 took P1-P92 to Police Headquarters, and at 22:00 he made a supplementary statement.

He said that before he handed P1-P92 to the exhibits store (he forgot when, and made no personal record of it), PC 11027 asked him to produce some of the exhibits for PC 11027 to photograph, to assist the investigation.

Day 9 of trial: [https://t.me/youarenotalonehk\_live/15926](https://t.me/youarenotalonehk_live/15926)

---

18:03:59 Court Live Text Broadcast

\#Eastern Magistrates' Courts, Court 4
\#Magistrate 王證瑜
\#20200406 Fortress Hill \#Continued trial \[2/3\]

3 defendants

\#沈君浩, barrister \#梁嘉善, barrister \#關百安, barrister

Charges:
(1) Criminal damage (D1, D2): charged in that on 6 April 2020, on the footbridge connecting Exit B of Fortress Hill MTR Station with Fook Yuen Street, they without lawful excuse damaged the said footbridge belonging to MTR Corporation Limited, intending to damage the property or being reckless as to whether it would be damaged.
(2) Possessing a thing with intent to damage property (D3): charged in that on the same day and at the same place, without lawful excuse, he had in his possession 1 can of spray paint, intending to use it to damage the said footbridge belonging to MTR Corporation Limited.

—————————

⏺ PW3, PC 8636, called. At the time of the incident he was on duty with the North Point sub-divisional task force (night shift); he arrested D3, and his duty was "ambush observation" as a plain-clothes officer.

⭐ PW3 had already been posted to North Point sub-division for nearly three years before the incident, but his first written statement still made the same mistake as PW1's and PW2's, writing Java Road instead of Electric Road. ⭐

⭐ The defence suggested PW3 had discussed the case with the officers who had already testified. The background: the case involves a long staircase of several flights, between which there is a spot that can lead to other places. During yesterday's trial the magistrate, to avoid confusion, directed both sides to use "connection point" for the spots between flights and reserve "platform" for the traversable platform leading elsewhere. As PW3 in his evidence, without the parties or the magistrate having mentioned a platform, described events at a "connection point" as happening on a "small platform", the defence suggested the witness had discussed matters with the earlier witnesses. PW3 denied it, insisted "small platform" was something he had only said to the magistrate, and even offered to check the recording to confirm, but since both sides and the magistrate agreed the phrase "small platform" was first used by PW3, there was no need to play the recording. ⭐

⏺ PW4, Sergeant \_國威 (transliteration), called. At the time of the incident he was on duty with the North Point sub-divisional task force (night shift), on plain-clothes "ambush observation" duty at the King's Road end of the footbridge that night.

PW4: "I opened the biggest compartment of the backpack, took a look with my hand, then gave it back to the defendant."
Magistrate: "And did you look with your eyes?"

⭐ PW4, yet again contradicting himself: in the morning session he answered the magistrate that he was not sure which hand D1 held the spray can in, but under cross-examination in the afternoon he suddenly blurted out that D1 held the spray can in the right hand. When the defence challenged this, PW4 even shot back at counsel, "Did I say right hand just now?" and "I didn't say right hand, did I?", before finally maintaining his morning account, namely that he could not see clearly which hand held the spray can. ⭐

PW5, exhibits officer PC 5646: two written statements admitted by way of section 65B.

Newly admitted facts (P31): the CCTV at 康澤花園 displays a time 2 minutes ahead of real time.

————————

The case continues tomorrow, 27 May, 09:30, in the same court.

---
_posts/fish/2020-11-19-how-to-recognize-pregnant-fish.md · khuynhtroc/tkdc.vn · MIT
---
layout: post
title: "How to recognize pregnant fish?"
summary: Just do a quick search on the internet and you can instantly see whether the species of fish you keep bears live young or lays eggs. You will know whether you need to watch for the bulging of a fish's belly due to pregnancy or if you should watch for tiny round eggs that look like jelly in an aquarium. If you are going to keep young fish, try to learn as much as you can about the fish you are keeping, as keeping fry is usually not easy.
author: phamhuong
categories: [ Fishs ]
tags: fish
image: assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-1-Version-2.jpg.webp
beforetoc: "Just do a quick search on the internet and you can instantly see whether the species of fish you keep bears live young or lays eggs. You will know whether you need to watch for the bulging of a fish's belly due to pregnancy or if you should watch for tiny round eggs that look like jelly in an aquarium. If you are going to keep young fish, try to learn as much as you can about the fish you are keeping, as keeping fry is usually not easy."
toc: true
rating: 5
permalink: /fishs/how-to-recognize-pregnant-fish.html
---

> Just do a quick search on the internet and you can instantly see whether the species of fish you keep bears live young or lays eggs. You will know whether you need to watch for the bulging of a fish's belly due to pregnancy or if you should watch for tiny round eggs that look like jelly in an aquarium. If you are going to keep young fish, try to learn as much as you can about the fish you are keeping, as keeping fry is usually not easy.

## 1. Get to know pregnant fish

![Tell-if-Your-Fish-Is-Having-Babies-Step-1]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-1-Version-2.jpg.webp)

### 1.1 Use this method for live-bearing fish species.

Guppies, mollies, swordtails and platies are probably the most common breeds of aquarium fish.
In these fish, the male and the female mate, and the female then carries the eggs inside her abdomen. Within a month or two (for most aquarium fish), the eggs develop into fry, and the mother gives birth to them.

- Search online for the species of fish you are keeping to see if they are oviparous or viviparous.

![Tell-if-Your-Fish-Is-Having-Babies-Step-2]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-2-Version-2.jpg.webp)

### 1.2 Identify males and females.

As a general rule, in live-bearing species, males are usually brighter or more extravagant, with long and narrow anal fins near the tail. Females are usually duller, and anal fins are usually fan-shaped or triangular. If you can determine the sex of the fish, it will be easy to tell whether two fish are fighting (usually two males or two females) or are mating or preparing to mate (a male and a female).

- For some fish species that are difficult to distinguish between the sexes, you may need to seek out an expert at an aquarium store.

![Tell-if-Your-Fish-Is-Having-Babies-Step-3]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-3-Version-3.jpg.webp)

### 1.3 Observe the mating rituals.

Different fish species may have different mating, mating behaviors and other mating-related behaviors. In many species, including the vast majority of live-bearers, the males often chase the female energetically, sometimes causing scratches, bites or other injuries. In other species, such as discus, a pair of male and female fish work together to protect an area of the tank from other fish. In both cases, when they are actually mating, the male and female fish may be tangled, turned upside down, twisted around each other, or have other subtle behavior.
![Tell-if-Your-Fish-Is-Having-Babies-Step-4]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-4-Version-2.jpg.webp)

### 1.4 Watch for bloating in the fish's belly during pregnancy.

Usually the belly of a pregnant female fish will swell, round or "box" within 20-40 days.

- Some species, such as the squid, have a natural bulge, but the bulge is in the front, just below the gills.
- "Overweight" males may have enlarged anterior chest. If you don't feed your fish for two or three days, the bulge may shrink, while the bulging belly of a pregnant female will still be visible.

![Tell-if-Your-Fish-Is-Having-Babies-Step-5]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-5.jpg.webp)

### 1.5 Find a red or black dot.

Pregnant female fish often develop a "pregnancy spot" on the abdomen, near the vent. Usually this spot is black or magenta, and is more prominent during pregnancy.

- This dot may always appear in some fish, but the color of the dot will be brighter or darker when the fish is pregnant.

![Tell-if-Your-Fish-Is-Having-Babies-Step-6]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-6.jpg.webp)

### 1.6 Determine how to care for the fry.

Keeping the fry can be extremely challenging and often requires a separate aquarium so that the adults or the water filter don't harm them. If you are not ready for this task, try contacting an aquarium shop or an experienced aquarium hobbyist who is willing to help you or take the fish away. If you decide to care for the fry, you can start with the guide to keeping the fry below, but learn more about the breed of fish you are keeping as well.

## 2. Know the signs of nesting and laying eggs

![Tell-if-Your-Fish-Is-Having-Babies-Step-7]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-7.jpg.webp)

### 2.1 Use this method for spawning fish.

Many aquarium fish are spawners, including discus, bettas, and most breeds of goldfish. In these fishes, the female lays hundreds of eggs. They often lay in nests in the bottom of the tank, on the wall or the water surface. If there is a male in the tank, it can fertilize the eggs after the female spawns or mate with the female beforehand, depending on the species of fish. The eggs will hatch into fry.

- Look up the names of the fish you have online to see if they are oviparous or viviparous.
- In some fish species, the female is able to store sperm for several months before using it to fertilize eggs, so a new aquarium containing only female fish can still produce fry.

![Tell-if-Your-Fish-Is-Having-Babies-Step-8]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-8.jpg.webp)

### 2.2 Watch for signs of fish nesting.

Some species of spawning fish create nesting areas to protect the eggs. These nests may look like small holes or piles of gravel, but are not always obvious. Some fish can build elaborate nests out of a mass of bubbles, usually created by the males at the water surface.

![Tell-if-Your-Fish-Is-Having-Babies-Step-9]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-9.jpg.webp)

### 2.3 Check for eggs.

Some females of these species swell as the eggs develop internally, but this is usually not a major change and does not last long. Spawning fish eggs usually look like tiny round jellies. Usually fish eggs are scattered in the water, but in some fish species, they can collect in the nesting area or stick to the bottom of the tank or tank wall.
- Many spawning fish species also have mating behavior, including the majority of emeralds. They are usually very energetic, last for a few hours and end up spawning.

![Tell-if-Your-Fish-Is-Having-Babies-Step-10]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-10.jpg.webp)

### 2.4 Prepare for the eggs to hatch.

Caring for the fry can be difficult, but even without preparation, you still have some time before the eggs hatch. You should consult an aquarium store for advice if you plan to keep the fry yourself, as the farming process may vary depending on the species of fish. If you aren't prepared, consult the following section on keeping fry for basic advice, but don't expect this method to be optimal for all fish species.

## 3. Raising baby fish

![Tell-if-Your-Fish-Is-Having-Babies-Step-11]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-11.jpg.webp)

### 3.1 Find out about the fish you are keeping as much as possible.

The instructions below can give you the basics and are helpful coping steps if your tank suddenly becomes full of fry. However, taking care of the fry is a real challenge, and the more you know about your fish, the better.
If you do not, the currents can exhaust the fry, and they may even be sucked into the filter and die. ![Tell-if-Your-Fish-Is-Having-Babies-Step-13]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-13.jpg.webp) ### 3.3 Isolate the fish. Many breeders install another aquarium and transfer the eggs or fry there. However, if you are not an experienced aquarium hobbyist, it is difficult to create a safe and stable environment in a short time. Instead, you can use a plastic net bought at an aquarium store to isolate the fish. Depending on the species, the broodfish can take care of or eat the fry, so you should try to find advice online that is appropriate for your species. If this is not possible, isolate the fish based on the broodfish's behavior: - If the broodstock lays eggs in the nest and protects the eggs from other fish, use a net to separate the broodstock and eggs on one side and the other fish on the other. - If the mother fish spawns or spawns in the water, keep the adult fish on one side of the net. The fry can swim through the net to avoid adults. ![Tell-if-Your-Fish-Is-Having-Babies-Step-14]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-14.jpg.webp) ### 3.4 Feed the fish with fry. You can sometimes buy a "fry food" at an aquarium store, but usually you need to choose between a variety of fish foods. Grassworms, liquid fish feed or rotifers are generally safe for fry. However, the fish will need additional food as they grow up. These feed types can vary depending on the species of fish and the size of the fish. Ask the staff at the aquarium store for the type of food you are keeping. - If you cannot go to the aquarium store, you can feed the young fish hard-boiled egg whites pressed through cheesecloth. 
![Tell-if-Your-Fish-Is-Having-Babies-Step-15]({{ site.url }}/assets/images/blog/fishs/how-to-recognize-pregnant-fish/v4-728px-Tell-if-Your-Fish-Is-Having-Babies-Step-15.jpg.webp) ### 3.5 Prepare to raise the fish until maturity. Set up another tank in advance if you plan to keep some of the fish. If not, contact your aquarium store or aquarist in advance to make a plan to sell or sell the fry once they reach a certain age. > Advice - If you don't want your fish to breed, you need to separate the male and female fish. If it is too late, you should contact the aquarium store. They can take fish. > Warning - If your fish is fat, moving slowly and its scales build up, seek professional advice or at a pet store. It is possible that the fish is sick, not pregnant. - Unless you provide the right environment, most or all of the fry will die. - Never release your fish into a natural lake unless that is where you brought the fish home. If not, you may inadvertently bring about an infestation that destroys the environment in the area. > Other languages - English: Tell if Your Fish Is Having Babies Español: saber si tu pez está teniendo bebés Italiano: Capire se il Tuo Pesce Avrà i Piccoli Portugus: Saber se Seu Peixe as Ter Besa Русский: определить беременность аквариумных рыбок Deutsch: Herausfinden ob dein Fisch Babys bekommt Français: savoir si des poissons attendent des petits Bahasa Indonesia: Mengetahui Ciri ciri Ikan yang Akan Memiliki Anak 中文:判断 宠物 鱼 是否 怀孕 العربية: معرفة أن أسماكك ستضع صغارا Nederlands: Zien of je vis zwanger is
96.301471
556
0.772543
eng_Latn
0.99906
4c212dca574cab991a6a3f1f21c992994a3d5c12
2,984
md
Markdown
README.md
SuZiquan/lemon-blog
88c8028b40ad171beb3e38ff30ac364439163f38
[ "MIT" ]
null
null
null
README.md
SuZiquan/lemon-blog
88c8028b40ad171beb3e38ff30ac364439163f38
[ "MIT" ]
null
null
null
README.md
SuZiquan/lemon-blog
88c8028b40ad171beb3e38ff30ac364439163f38
[ "MIT" ]
2
2021-05-04T01:36:29.000Z
2022-01-23T14:17:47.000Z
# lemon-blog ## Introduction A simple blog system built with Vue + Spring. The project is fairly rough and may have quite a few issues, so using it directly is not recommended. The code is simple and well suited for learning and reference. ## Background After bouncing between Jekyll, Hexo, and several open-source blog systems, I decided to write my own blog, mainly for the following reasons: - Static blogs such as Jekyll and Hexo: 1. They are usually hosted on GitHub and need Travis CI to trigger updates, which I find inconvenient 2. I spent a lot of time picking a template I liked, then tried to customize the styles further; that time would have been better spent writing my own... 3. I switched comment systems again and again (for reasons everyone understands?), and search, statistics, and the like all require integrating yet more systems, each with its own configuration... 4. Without a backend, adding features that interact with a server is inconvenient - Some open-source blog systems: 1. I only wanted a simple blog running on my poor 1-core, 2 GB server; Redis, ElasticSearch, or even Spring Cloud microservice and distributed technologies felt unnecessary... 2. The code was unpleasant to read, secondary development was awkward, and I did not need the miscellaneous extra features ## Project structure The project consists of three parts: - lemon-blog-frontend: the public blog frontend, server-side rendered with Nuxt, with some mobile adaptation; main technologies: Vue, Vuex, Ant Design Vue, Nuxt - lemon-blog-admin-frontend: the blog admin frontend; main technologies: Vue, Vuex, Vue Router, Ant Design Vue Pro - lemon-blog-backend: the blog backend; main technologies: Spring Boot, Spring MVC, Spring Security, Mybatis Plus ## Preview Live demo: https://suziquan.cn ![](./preview/lemon-blog-frontend.png) ![](./preview/lemon-blog-admin-frontend.png) ![](./preview/lemon-blog-frontend-mobile.png) ## Installation and usage ### lemon-blog-backend 1. Requirements: JRE and Maven installed, plus a MySQL database 2. Create a MySQL database and import lemon-blog-backend/src/main/resources/jdbc/schema.sql 3. Edit the configuration file lemon-blog-backend/src/main/resources/application.properties ``` properties # MySQL settings spring.datasource.url=jdbc:mysql://localhost:3306/lemon_blog?useUnicode=true&characterEncoding=UTF8&useSSL=false spring.datasource.username=mysql-username spring.datasource.password=mysql-password # GitHub OAuth settings, needed only for GitHub login social-auth.github.client-id=github-client-id social-auth.github.client-secret=github-client-callback social-auth.github.redirect-uri=github-redirect-uri # JWT secret key; one can be generated at https://www.grc.com/passwords.htm jwt.secret-key=jwt-secret-key ``` 4. Run `mvn spring-boot:run` to start the service; the default address is localhost:8080 ### lemon-blog-frontend 1. Requirements: nodejs and yarn installed 2. Run `yarn install` to install dependencies 3. The backend API address is configured in nuxt.config.js ``` js proxy: { '/api': { target: 'http://localhost:8080', changeOrigin: true, }, }, ``` 4. Development: run `yarn dev` Production: run `yarn build` followed by `yarn start` The default address is localhost:3000 ### lemon-blog-admin-frontend 1.
Requirements: nodejs and yarn installed 2. Run `yarn install` to install dependencies 3. The backend API address is configured in vue.config.js ``` js devServer: { port: 8000, proxy: { '/api': { target: 'http://localhost:8080', ws: false, changeOrigin: true } } }, ``` 4. Development: run `yarn serve`; the default address is localhost:8000 5. Production: run `yarn build`, then serve the files generated in the dist directory statically. An nginx configuration for reference: ``` server { server_name your-domain; listen 443 ssl http2; listen [::]:443 ssl http2; # the dist directory produced by running yarn build root /lemon-blog/lemon-blog-admin-frontend/dist; location / { try_files $uri $uri/ /index.html; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Host $http_host; } } ```
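The properties file above leaves `jwt.secret-key` as a placeholder and points to an online generator; such a key can also be produced locally. A minimal Python sketch (the helper name and the 64-byte length are illustrative choices of mine, not anything the project mandates):

```python
import secrets

def make_jwt_secret(n_bytes=64):
    """Return a random URL-safe string suitable as a JWT signing secret."""
    # secrets draws from the OS CSPRNG, unlike the random module
    return secrets.token_urlsafe(n_bytes)

print(make_jwt_secret())
```

Paste the printed value into `jwt.secret-key` before starting the backend.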
28.419048
116
0.674263
yue_Hant
0.52955
4c2167551eb76ca1460bdc74e60d39288cd5d797
127
md
Markdown
_includes/04-lists.md
javierlopezg93/markdown-portfolio
c8bde0167561fd325b06c73ead94a9c7841c5b70
[ "MIT" ]
null
null
null
_includes/04-lists.md
javierlopezg93/markdown-portfolio
c8bde0167561fd325b06c73ead94a9c7841c5b70
[ "MIT" ]
5
2021-08-23T20:52:16.000Z
2021-08-23T21:54:02.000Z
_includes/04-lists.md
javierlopezg93/markdown-portfolio
c8bde0167561fd325b06c73ead94a9c7841c5b70
[ "MIT" ]
null
null
null
# Here is my list of favorite things: - Long walks on the beach - Disco dancing - Skating - Open source - Videogames - Movies
15.875
38
0.732283
eng_Latn
0.964044
4c21b7c7925c3e797bd5ab31d65ada65a3265b0c
102
md
Markdown
README.md
madbrain/angular-helpsystem
6e32744d4f3a01aa0e7b9077bd52ab790bf1858b
[ "MIT" ]
null
null
null
README.md
madbrain/angular-helpsystem
6e32744d4f3a01aa0e7b9077bd52ab790bf1858b
[ "MIT" ]
null
null
null
README.md
madbrain/angular-helpsystem
6e32744d4f3a01aa0e7b9077bd52ab790bf1858b
[ "MIT" ]
null
null
null
# angular-helpsystem Visual inline help system [DEMO](https://madbrain.github.io/angular-helpsystem)
20.4
53
0.794118
eng_Latn
0.140755
4c21d6208f066e5b795e1f7290a20904821fb2bc
318
md
Markdown
README.md
elvismt/blockGames
c6e49da6d96472b46081e373b72aa1c3533dc88e
[ "Apache-2.0" ]
null
null
null
README.md
elvismt/blockGames
c6e49da6d96472b46081e373b72aa1c3533dc88e
[ "Apache-2.0" ]
null
null
null
README.md
elvismt/blockGames
c6e49da6d96472b46081e373b72aa1c3533dc88e
[ "Apache-2.0" ]
null
null
null
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0) [![Version](https://img.shields.io/badge/version-0.0.5-red.svg)](https://opensource.org/licenses/Apache-2.0) # blockGames ![](https://github.com/elvismt/blockGames/blob/master/images/Screenshot.png)
39.75
116
0.742138
yue_Hant
0.352548
4c2349ce76fd5ca61ecf8d51e858bdfd441c7d4a
758
md
Markdown
CHANGELOG.md
carassius1014/prune-juice
f5ed4bdddee60856bd8ea475dcc9519f5271f699
[ "MIT" ]
null
null
null
CHANGELOG.md
carassius1014/prune-juice
f5ed4bdddee60856bd8ea475dcc9519f5271f699
[ "MIT" ]
null
null
null
CHANGELOG.md
carassius1014/prune-juice
f5ed4bdddee60856bd8ea475dcc9519f5271f699
[ "MIT" ]
null
null
null
## 0.7 * Apply changes to `cabal` files (`hpack` not supported) * Support Cabal 3.6 (thanks @yaitskov!) * Better parsing for ghc-pkg output (thanks @liftM!) ## 0.6 * Better support for Cabal * Support `.hs-boot` files ## 0.5 * Use `cabal` files instead of `hpack` as a way to support more users * More command-line options * Ignore some classes of unused dependencies like test discovery and `Paths_`-type modules * Verbose logging * Bug fixes * Inferring common dependencies for packages without a base library * Loading main modules * Loading literate haskell files * Detecting imports that correspond to multiple dependencies ## 0.4 * Performance enhancements for calling `ghc-pkg` ## 0.3 * Initial release (prior releases were duds)
24.451613
92
0.730871
eng_Latn
0.990117
4c235c404d666a6aa24d4e029d71c5f0d71870f8
406
md
Markdown
README.md
Kano245/Kano
1cc36be0d766744d97353adea190106be7398335
[ "MIT" ]
1
2021-02-23T15:01:19.000Z
2021-02-23T15:01:19.000Z
README.md
OxxoCode/Kano
1cc36be0d766744d97353adea190106be7398335
[ "MIT" ]
null
null
null
README.md
OxxoCode/Kano
1cc36be0d766744d97353adea190106be7398335
[ "MIT" ]
1
2022-03-21T05:30:40.000Z
2022-03-21T05:30:40.000Z
# Kano A CHIP-8 emulator developed in C++. MegaChip and SuperChip operations and their corresponding software are currently not supported. # USAGE ```Kano.exe [ROM]``` All CHIP-8 programs must be located in the same directory as the main executable. # BUILD [Visual Studio](https://www.visualstudio.com/products/visual-studio-community-vs) [CMake](http://www.cmake.org/cmake/resources/software.html)
27.066667
96
0.768473
eng_Latn
0.831612
4c23737906af3d68d4e22de901fe4063034d4eca
2,690
md
Markdown
README.md
iMattuz7/CarouselWithVideoMute
2902623984008ed26ca014814b6ef218fdb0032d
[ "MIT" ]
null
null
null
README.md
iMattuz7/CarouselWithVideoMute
2902623984008ed26ca014814b6ef218fdb0032d
[ "MIT" ]
null
null
null
README.md
iMattuz7/CarouselWithVideoMute
2902623984008ed26ca014814b6ef218fdb0032d
[ "MIT" ]
null
null
null
![PBJVideoPlayer](https://raw.github.com/piemonte/PBJVideoPlayer/master/PBJVideoPlayer.gif) ## PBJVideoPlayer `PBJVideoPlayer` is a simple video player library for iOS and tvOS. ### Features - [x] plays local media or streams remote media over HTTP - [x] customizable UI and user interaction - [x] no size restrictions - [x] orientation change support - [x] simple API If you're looking for a video player written in [Swift](https://developer.apple.com/swift/), check out [Player](https://github.com/piemonte/player). For video recording, check out [PBJVision](https://github.com/piemonte/PBJVision). [![Build Status](https://travis-ci.org/piemonte/PBJVideoPlayer.svg)](https://travis-ci.org/piemonte/PBJVideoPlayer) [![Pod Version](https://img.shields.io/cocoapods/v/PBJVideoPlayer.svg?style=flat)](http://cocoadocs.org/docsets/PBJVideoPlayer/) ## Installation [CocoaPods](http://cocoapods.org) is the recommended method of installing PBJVideoPlayer; just add the following line to your `Podfile`: ```ruby pod 'PBJVideoPlayer' ``` ## Usage ```objective-c #import <PBJVideoPlayer/PBJVideoPlayer.h> ``` ```objective-c // allocate controller PBJVideoPlayerController *videoPlayerController = [[PBJVideoPlayerController alloc] init]; videoPlayerController.delegate = self; videoPlayerController.view.frame = self.view.bounds; // setup media videoPlayerController.videoPath = @"https://example.com/video.mp4"; // present [self addChildViewController:videoPlayerController]; [self.view addSubview:videoPlayerController.view]; [videoPlayerController didMoveToParentViewController:self]; ``` ## Community - Questions or need help? Use [Stack Overflow](http://stackoverflow.com/questions/tagged/pbjvideoplayer) with the tag 'pbjvideoplayer'. - Found a bug? Open an [issue](https://github.com/piemonte/PBJVideoPlayer/issues). - Feature idea? 
Open an [issue](https://github.com/piemonte/PBJVideoPlayer/issues). - Want to contribute? Submit a [pull request](https://github.com/piemonte/PBJVideoPlayer/pulls). ## Resources * [AV Foundation Programming Guide](https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html) * [PBJVision, iOS camera engine, features touch-to-record video, slow motion video, and photo capture](https://github.com/piemonte/PBJVision) * [Player, a simple iOS video player in Swift](https://github.com/piemonte/player) ## License PBJVideoPlayer is available under the MIT license, see the [LICENSE](https://github.com/piemonte/PBJVideoPlayer/blob/master/LICENSE) file for more information.
42.698413
229
0.779926
yue_Hant
0.342349
4c24146e43992de5a9c0124cd8fdc59605081e92
2,026
md
Markdown
README.md
BastinFlorian/Keywords_extraction_with_GOW
8fa81cd42b90905cfe378b2a35c3033d2bda40e5
[ "MIT" ]
12
2019-03-17T21:57:55.000Z
2020-10-16T10:11:37.000Z
README.md
BastinFlorian/Keywords_extraction_with_GOW
8fa81cd42b90905cfe378b2a35c3033d2bda40e5
[ "MIT" ]
null
null
null
README.md
BastinFlorian/Keywords_extraction_with_GOW
8fa81cd42b90905cfe378b2a35c3033d2bda40e5
[ "MIT" ]
2
2018-12-13T19:57:54.000Z
2021-01-07T18:16:12.000Z
# Keywords_extraction_with_GOW - First we present an example of the methods used to extract keywords (see **Graph of words and keywords extraction.ipynb** and **K-truss_code_example.ipynb**) - Then we give code to compute the k_core and obtain the graphs of directories of files or all files in directories containing sub-directories (see **K_core_corpus.py**) - We also give an implementation of the K-truss algorithm (see **K-truss_code.py**) - We perform a time analysis to see the evolution of some words through time, in order to detect events related to them. ## Libraries - Networkx to create and visualize graphs - Spacy to preprocess the text ## Papers implemented: - The k-core is taken directly from the Networkx library. - The k-truss is implemented following https://arxiv.org/abs/1205.6693 - The density and inflexion methods are implemented following https://www.aclweb.org/anthology/D16-1191 ## ***Graph of words and keywords extraction.ipynb*** This notebook is dedicated to people who want to **extract keywords** from a **text document** or **corpus of documents** using a **graph approach**. The goal of this notebook is to extract keywords from a text file using four different approaches: - Best Coverage keywords extraction - http://www2013.w3c.br/proceedings/p715.pdf - Div Rank keywords extraction - http://clair.si.umich.edu/~radev/papers/SIGKDD2010.pdf - K-core Number - Python Library Networkx - K-Truss - https://arxiv.org/abs/1205.6693 Through a French summary of Game of Thrones, we provide an example of the outputs of the four different approaches. ## K-truss_code_example.ipynb This Jupyter notebook is an example of the following script ## K-truss code Two functions are implemented. - The first one computes the K-truss of each node in G, the maximum non-empty subgraph, the k from the maximum non-empty subgraph and the information needed to compute the density and inflexion methods. - The second one gives the k-truss subgraph of the graph, where k is given as an input
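The main-core extraction described above can be sketched without any graph library. The following pure-Python illustration (function names and the naive peeling implementation are mine, not code from this repository) builds a sliding-window graph of words and keeps the nodes of the maximum non-empty k-core:

```python
def core_numbers(adj):
    """K-core decomposition by repeatedly peeling the minimum-degree node."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    alive = set(adj)
    core, k = {}, 0
    while alive:
        v = min(alive, key=deg.__getitem__)  # node of smallest remaining degree
        k = max(k, deg[v])                   # core number is the peak peeling degree
        core[v] = k
        alive.remove(v)
        for u in adj[v]:
            if u in alive:
                deg[u] -= 1
    return core

def graph_of_words(tokens, window=3):
    """Undirected graph linking terms that co-occur within a sliding window."""
    adj = {t: set() for t in tokens}
    for i, w in enumerate(tokens):
        for u in tokens[i + 1 : i + window]:
            if u != w:
                adj[w].add(u)
                adj[u].add(w)
    return adj

def main_core_keywords(tokens, window=3):
    """Keywords = nodes of the maximum non-empty k-core subgraph."""
    core = core_numbers(graph_of_words(tokens, window))
    kmax = max(core.values())
    return {t for t, k in core.items() if k == kmax}
```

On a toy token sequence, a rare, weakly connected term is peeled away early while the densely co-occurring terms survive as the main core; the repository's `K_core_corpus.py` uses Networkx's `core_number` for the same idea at scale.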
47.116279
205
0.76999
eng_Latn
0.99289
4c243afd121bba391f61c6b91bdf4d4eb175bf0e
1,158
md
Markdown
research/people/alphabetical/huck-tw.md
wellcomecollection/transcribe-wellcome
2aab910b48ebd4499c1177d0ae941002cc37476c
[ "CC0-1.0" ]
1
2022-01-25T23:56:39.000Z
2022-01-25T23:56:39.000Z
research/people/alphabetical/huck-tw.md
wellcomecollection/transcribe-wellcome
2aab910b48ebd4499c1177d0ae941002cc37476c
[ "CC0-1.0" ]
30
2021-02-17T19:00:47.000Z
2022-02-22T18:18:25.000Z
research/people/alphabetical/huck-tw.md
wellcomecollection/transcribe-wellcome
2aab910b48ebd4499c1177d0ae941002cc37476c
[ "CC0-1.0" ]
3
2021-02-27T18:52:50.000Z
2021-04-15T12:45:56.000Z
# Huck, T.W. T.W. Huck \(b.1882-d.1918\) Librarian, in post 1913-1918; pictured in group portrait L0014474 c1914-1918, and in Symons 1993, page 13. Huck died in the First World War. Huck was one of four original staff members of the permanent WHMM in 1914 under [C. J. S. Thompson](https://github.com/wellcomecollection/transcribe-wellcome/tree/f72c2b61ac1ad669053741de27081d2c70951534/researching-the-museum-and-library/people/alphabetical/Thompson,%20Charles%20John%20Samuel/README.md) \(with [Shirreff](https://github.com/wellcomecollection/transcribe-wellcome/tree/f72c2b61ac1ad669053741de27081d2c70951534/researching-the-museum-and-library/people/alphabetical/Shirreff,%20Frances%20Gordon/README.md), [Amoruso](https://github.com/wellcomecollection/transcribe-wellcome/tree/f72c2b61ac1ad669053741de27081d2c70951534/researching-the-museum-and-library/people/alphabetical/Amoruso,%20Arthur/README.md), [Carline](https://github.com/wellcomecollection/transcribe-wellcome/tree/f72c2b61ac1ad669053741de27081d2c70951534/researching-the-museum-and-library/people/alphabetical/Carline,%20G.%20R.)\). \[\[library staff\]\] librarian
105.272727
939
0.82038
yue_Hant
0.317704
4c256b66bff449812af5d10d0f45eba649be7d7e
3,479
md
Markdown
v1/docs/src/input.md
mbauman/HomotopyContinuation.jl
3253f86b2752303b0ed8616e07bccf8bfbb7f24d
[ "MIT" ]
null
null
null
v1/docs/src/input.md
mbauman/HomotopyContinuation.jl
3253f86b2752303b0ed8616e07bccf8bfbb7f24d
[ "MIT" ]
null
null
null
v1/docs/src/input.md
mbauman/HomotopyContinuation.jl
3253f86b2752303b0ed8616e07bccf8bfbb7f24d
[ "MIT" ]
null
null
null
# Input The [`solve`](@ref) and [`monodromy_solve`](@ref) functions in `HomotopyContinuation.jl` accept multiple input formats for polynomial systems. These are * Arrays of polynomials following the [`MultivariatePolynomials`](https://github.com/JuliaAlgebra/MultivariatePolynomials.jl) interface. We export the `@polyvar` macro from the [`DynamicPolynomials`](https://github.com/JuliaAlgebra/DynamicPolynomials.jl) package to create polynomials with the `DynamicPolynomials` implementation. * Systems constructed with our own symbolic modeling language as implemented in the `ModelKit` module. We export the `@var` macro to create variables in this modeling language. * Systems (resp. homotopies) following the [`AbstractSystem`](@ref) (resp. [`AbstractHomotopy`](@ref)) interface. The difference between the `MultivariatePolynomials` and `ModelKit` input is best shown with an example. Assume we want to solve the polynomial system ```math F(x,y) = \begin{bmatrix} (2x + 3y + 2)^2 (4x - 2y + 3) \\ (y - 4x - 5)^3 - 3x^2 + y^2 \end{bmatrix} = 0 ``` Using the `@polyvar` macro from `DynamicPolynomials` we can do ```julia @polyvar x y F = [ (2x + 3y + 2)^2 * (4x - 2y + 3), (y - 4x - 5)^3 - 3x^2 + y^2 ] ``` ``` 2-element Array{Polynomial{true,Int64},1}: 16x³ + 40x²y + 12xy² - 18y³ + 44x² + 68xy + 3y² + 40x + 28y + 12 -64x³ + 48x²y - 12xy² + y³ - 243x² + 120xy - 14y² - 300x + 75y - 125 ``` We see that our expression got automatically expanded into a monomial basis. Sometimes this is useful, but for fast evaluation of a polynomial system it is not! Here, `ModelKit` comes into play. ```julia @var x y F = [ (2x + 3y + 2)^2 * (4x - 2y + 3), (y - 4x - 5)^3 - 3x^2 + y^2 ] ``` ``` 2-element Array{HomotopyContinuation.ModelKit.Operation,1}: (2x + 3y + 2) ^ 2 * ((4x - 2y) + 3) (((y - 4x) - 5) ^ 3 - 3 * x ^ 2) + y ^ 2 ``` Compared to the polynomial input, we see that `ModelKit` does not forget the structure of the input.
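To make the comparison concrete: the two outputs above describe the same polynomial, only the representation differs. A small Python check (independent of the package; the function names are mine) confirms that the factored form and the expanded monomial form of the first component agree, while the factored form clearly needs fewer arithmetic operations per evaluation:

```python
# Structured form, as the modeler wrote it (what ModelKit preserves).
def f_factored(x, y):
    return (2 * x + 3 * y + 2) ** 2 * (4 * x - 2 * y + 3)

# Expanded monomial form, as produced via @polyvar / DynamicPolynomials.
def f_expanded(x, y):
    return (16 * x**3 + 40 * x**2 * y + 12 * x * y**2 - 18 * y**3
            + 44 * x**2 + 68 * x * y + 3 * y**2 + 40 * x + 28 * y + 12)

# Both formulations agree on every sample point.
for x, y in [(0, 0), (1, 0), (0, 1), (1, 1), (2, -3)]:
    assert f_factored(x, y) == f_expanded(x, y)
```

Counting operations, the factored form uses 7 multiplications/powers, while the expanded form uses far more; this gap is what the straight-line programs generated from `ModelKit` input exploit.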
For the internal computations both formulations will be converted into efficient straight line programs. However, from the polynomial formulation we will not be able to recover the actual structure of the problem, and therefore the generated straight line program will be less efficient than the one created by `ModelKit`. However, there are also cases where the polynomial input is preferable. An example is when you only have an expression of your polynomial system in the monomial basis. In that case the polynomial input will generate more efficient code, since it is optimized for exactly this representation. Besides the different macros to generate variables, both packages provide a common set of helpful functions for modeling problems: * `variables(f, parameters = [])` to obtain a list of all variables. * `nvariables(f, parameters = [])` to obtain the number of variables. * `differentiate(f, vars)` to compute the gradient with respect to the given variables * `subs(f, var => expr)` to substitute variables with expressions * `monomials(vars, d; homogenous = false)` to create all monomials of degree up to `d` (resp. exactly degree `d` if `homogenous = true`) !!! warning "Variable ordering" While `MultivariatePolynomials` orders variables in the order of creation, `ModelKit` orders them *alphabetically*. Also, in `ModelKit` two variables with the same name are always identical. ## ModelKit ```@docs ModelKit.@var ModelKit.@unique_var ModelKit.System ModelKit.Homotopy ModelKit.compile ```
39.089888
193
0.722909
eng_Latn
0.984834
4c25b75f279998fb71533808d5a6e0c286460e6b
1,518
md
Markdown
misc/examples.md
squarefk/test_actions
dd3b0305c49b577102786eb1c24c590ef160bc30
[ "MIT" ]
2
2021-05-03T12:59:01.000Z
2021-11-22T04:30:28.000Z
misc/examples.md
squarefk/test_actions
dd3b0305c49b577102786eb1c24c590ef160bc30
[ "MIT" ]
null
null
null
misc/examples.md
squarefk/test_actions
dd3b0305c49b577102786eb1c24c590ef160bc30
[ "MIT" ]
1
2020-09-29T17:56:48.000Z
2020-09-29T17:56:48.000Z
# More examples <a href="https://github.com/taichi-dev/taichi/blob/master/examples/mpm_lagrangian_forces.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/lagrangian.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/taichi_sparse.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/sparse_grids.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/pbf2d.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/pbf.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/game_of_life.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/game_of_life.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/fem128.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/fem128.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/mass_spring_3d.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/mass_spring_3d.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/mciso.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/mciso.gif" height="192px"></a> <a href="https://github.com/taichi-dev/taichi/blob/master/examples/mpm3d.py"><img src="https://github.com/taichi-dev/public_files/raw/master/taichi/mpm3d.gif" height="192px"></a>
126.5
199
0.764163
bjn_Latn
0.119944
4c26502255a770d35f07c5b8f77acbe7d4ada967
20,129
md
Markdown
README.md
Alexander-Berg/List-of-Testing-Tools-and-Frameworks-for-.NET
06f0eb4dde785b9d50bfd377456ab673bc8a31b8
[ "MIT" ]
null
null
null
README.md
Alexander-Berg/List-of-Testing-Tools-and-Frameworks-for-.NET
06f0eb4dde785b9d50bfd377456ab673bc8a31b8
[ "MIT" ]
null
null
null
README.md
Alexander-Berg/List-of-Testing-Tools-and-Frameworks-for-.NET
06f0eb4dde785b9d50bfd377456ab673bc8a31b8
[ "MIT" ]
null
null
null
# List of Automated Testing (TDD/BDD/ATDD/SBE) Tools and Frameworks for .NET This is a list of Automated Testing Frameworks for .NET related to methodologies and types of tests: - Test-Driven Development (TDD) - Behavior-Driven Development (BDD) - Specification by Example (SBE) - Acceptance Test-Driven Development (ATDD) - Property-Based Testing (PBT) - Unit / Integration / Acceptance / Specification / etc. Tests Please feel free to suggest changes and/or new tools/frameworks. Key: * **Bold** — Most Popular / Recommended # Unit Testing Frameworks > What are unit test frameworks and how are they used? Simply stated, they are software tools to support writing and running unit tests, including a foundation on which to build tests and the functionality to execute the tests and report their results. They are not solely tools for testing; they can also be used as development tools on a par with preprocessors and debuggers. Unit test frameworks can contribute to almost every stage of software development, including software architecture and design, code implementation and debugging, performance optimization, and quality assurance. [Paul Hamill, Unit Test Frameworks](https://www.oreilly.com/library/view/unit-test-frameworks/0596006896/ch01.html) | Library | Comment | |-----------|----------- | [EMTF](https://archive.is/rptLh#selection-360.0-360.7) | - *Discontinued*<br>- Known as Embeddable Micro Test Framework | [Expecto](https://github.com/haf/Expecto) | F# | [FsUnit](http://fsprojects.github.io/FsUnit/) | F# | [Fuchu](https://github.com/mausch/Fuchu) | F# / C# / VB.NET | [MSTest](https://github.com/Microsoft/testfx) | Also known as Microsoft Test Framework | **[NUnit](http://www.nunit.org/)** | | [Unquote](http://www.swensensoftware.com/unquote) | F# | **[xUnit.net](https://xunit.net/)** | # Isolation Frameworks > An isolation framework is a set of programmable APIs that makes creating fake objects much simpler, faster, and shorter than hand-coding them. 
[Automate Planet](https://www.automatetheplanet.com/isolation-frameworks-fundamentals) | Library | Comment | |-----------|---------| | **[FakeItEasy](http://fakeiteasy.github.io/)** | [JustMock](http://www.telerik.com/products/mocking.aspx) | Non-free | [JustMock Lite](http://www.telerik.com/justmock/free-mocking) | [Microsoft Fakes](https://msdn.microsoft.com/en-us/library/hh549175.aspx) | Previously known as Microsoft Moles | **[Moq](https://github.com/Moq/moq4)** | See also:<br>- [AutoMoq](https://github.com/darrencauthon/AutoMoq) ([updated fork](https://github.com/dariusz-wozniak/AutoMoq))<br>- [Automoqer](https://github.com/rbengtsson/Automoqer) ([updated fork](https://github.com/dariusz-wozniak/Automoqer)) | [NMock](http://nmock.sourceforge.net/) | **[NSubstitute](http://nsubstitute.github.io/)** | [SimpleStubs](https://github.com/microsoft/SimpleStubs) | [Typemock Isolator](http://www.typemock.com/isolator-product-page) | Non-free # Acceptance Testing / Behavior-Driven Development / Specification by Example | Library | Comment | |-----------|---------| | [ApprovalTests.Net](https://github.com/approvals/ApprovalTests.Net) | [Avignon](http://www.nolacom.com/avignon/) | [BDDfy](http://bddfy.teststack.net/) | [BDTest](https://github.com/thomhurst/BDTest/wiki) | [Concordion.NET](http://concordion.org/dotnet/) | [CoreBDD](https://github.com/stevenknox/CoreBDD/) | [Cucumber](https://cucumber.io/) | [FitNesse](http://fitnesse.org/) | [Gauge](http://getgauge.io/) | [HonestCode](http://honestcode.io/) | [LightBDD](https://github.com/Suremaker/LightBDD) | [LoFuUnit](https://github.com/hlaueriksson/LoFuUnit) | **[Machine.Specifications](https://github.com/machine/machine.specifications)** | Also known as MSpec | [NBehave](http://nbehave.org/) | [NSpec](http://nspec.org/) | [Robot Framework](http://robotframework.org/) | **[SpecFlow](http://www.specflow.org/)** | [SpecsFor](http://specsfor.com/) | [Specter](http://specter.sourceforge.net/) | 
[StoryTeller](http://storyteller.github.io/) | [SubSpec](https://subspec.codeplex.com/) | [System.Spec](https://github.com/alexfalkowski/System.Spec) | [TickSpec](https://github.com/fsprojects/TickSpec) | [Verify](https://github.com/SimonCropp/Verify) | [xBehave.net](http://xbehave.github.io/) | [Xunit.Gherkin.Quick](https://github.com/ttutisani/Xunit.Gherkin.Quick) # Web Application Testing Testing web application UI e.g. via browser engine. | Library | Comment | |-----------|---------| | **[Atata](https://github.com/atata-framework/atata)** | [Canopy](http://lefthandedgoat.github.io/canopy/) | F# | [Coypu](https://github.com/featurist/coypu) | [Netling](https://github.com/hallatore/Netling) | Load tests for web | **[Playwright for .NET](https://github.com/microsoft/playwright-sharp)** | - Port of [Playwright](https://playwright.dev/)<br />- Developed by Microsoft | [Puppeteer Sharp](https://github.com/kblok/puppeteer-sharp) | - See also [Puppeteer Sharp Contributions](https://github.com/hlaueriksson/puppeteer-sharp-contrib) | [Ranorex](https://www.ranorex.com/) | GUI testing for desktop, web and mobile applications | [Selenium](http://www.seleniumhq.org/) | [Squish GUI Tester](https://www.froglogic.com/squish/) | [Test.Automation](https://github.com/ObjectivityLtd/Test.Automation) | [TestComplete](https://smartbear.com/product/testcomplete/overview/) | [TestLeft](https://smartbear.com/product/testleft/overview/) | [TestStack.Seleno](http://seleno.teststack.net/) # Web Testing Testing ASP.NET, HTTP, HttpClient, REST, Web Sockets, AMQP, Blazor etc. | Library | Comment | |-----------|---------| | [Alba](http://jasperfx.github.io/alba/) | Utilities for ASP.Net Core web services testing | [bUnit](https://bunit.dev/) | Blazor components testing | [FakeHttpContext](https://github.com/vadimzozulya/FakeHttpContext) | Fake context for `HttpContext.Current` | [Flurl](https://github.com/tmenier/Flurl) | URL builder and HTTP client library. 
| [Mock4Net](https://github.com/alexvictoor/mock4net) | A fluent API allows to specify the behavior of the server and hence easily stub and mock webservices and REST resources | [MockHttp](https://github.com/richardszalay/mockhttp) | Replacement for `HttpMessageHandler` | [MockNet](https://github.com/Theorem/MockNet) | Friendly mocking framework to unit test the System.Net.Http namespace | [My Tested ASP.NET](https://docs.mytestedasp.net/) | A fluent unit testing library for ASP.NET Core MVC | [NBomber](https://nbomber.com/) | Pull and Push testing: HTTP/WebSockets/AMQP etc or a semantic model Pull/Push | [PactNet](https://github.com/pact-foundation/pact-net) | - Port of [Pact](https://pact.io/)<br />- Testing for integrating web apps, APIs and microservices | [Stubbery](https://github.com/markvincze/Stubbery) | API stubs | [WireMock.Net](https://github.com/WireMock-Net/WireMock.Net) | HTTP response stubbing, matchable on URL/Path, headers, cookies and body content patterns # Cloud Testing | Library | Comment | |-----------|---------| | [AWS .NET Mock Lambda Test Tool](https://github.com/aws/aws-lambda-dotnet) | [Azure Functions Test Fixture](https://github.com/jeffhollan/functions-test-helper) # User Interface Testing Testing system UI (Win32, WinForms, UWP, etc.), embedded, mobile apps | Library | Comment | |-----------|---------| | [Appium](https://appium.io/docs/en/drivers/windows/) | Supports testing of Universal Windows Platform (UWP) and Classic Windows (Win32) applications | [FlaUI](https://github.com/Roemer/FlaUI) | Automated UI testing of Windows applications (Win32, WinForms, WPF, Store Apps) | [Ranorex](https://www.ranorex.com/) | GUI testing for desktop, web and mobile applications | [Squish GUI Tester](https://www.froglogic.com/squish/) | All kinds of cross-platform desktop, mobile, embedded and web applications | [TestComplete](https://smartbear.com/product/testcomplete/overview/) | "Ensure the quality of your application without sacrificing speed or 
agility with an easy-to-use, GUI test automation tool. Our AI-powered object recognition engine and script or scriptless flexibility is unmatched, letting you test every desktop, web, and mobile application with ease."
| [WinAppDriver](https://github.com/Microsoft/WinAppDriver) | - Windows Application Driver<br>- Service to support Selenium-like UI Test Automation on Windows Applications<br>- Supports testing Universal Windows Platform (UWP), Windows Forms (WinForms), Windows Presentation Foundation (WPF), and Classic Windows (Win32) apps on Windows 10 PCs

# Database Testing

| Library | Comment |
|-----------|---------|
| [Respawn](https://github.com/jbogard/Respawn) | A small utility to help in resetting test databases to a clean state

# Concurrent Testing

| Library | Comment |
|-----------|---------|
| [FluentAssertions.Extensions](https://github.com/Kittyfisto/FluentAssertions.Extensions) |
| [Microsoft Coyote](https://microsoft.github.io/coyote/)

# Memory Testing

| Library | Comment |
|-----------|---------|
| [.NET Memory Profiler](http://memprofiler.com/) | Also known as MemProfiler
| **[dotMemory Unit](https://www.jetbrains.com/dotmemory/unit/)**

# Mutation Testing

> Mutation testing (or mutation analysis or program mutation) is used to design new software tests and evaluate the quality of existing software tests. Mutation testing involves modifying a program in small ways. Each mutated version is called a mutant and tests detect and reject mutants by causing the behavior of the original version to differ from the mutant. This is called killing the mutant. Test suites are measured by the percentage of mutants that they kill. New tests can be designed to kill additional mutants. Mutants are based on well-defined mutation operators that either mimic typical programming errors (such as using the wrong operator or variable name) or force the creation of valuable tests (such as dividing each expression by zero). The purpose is to help the tester develop effective tests or locate weaknesses in the test data used for the program or in sections of the code that are seldom or never accessed during execution. Mutation testing is a form of white-box testing. [Wikipedia](https://en.wikipedia.org/wiki/Mutation_testing)

| Library | Comment |
|-----------|---------|
| [CREAM](http://galera.ii.pw.edu.pl/~adr/CREAM/) | Also known as CREAtor of Mutants
| [Fettle](https://github.com/ComparetheMarket/fettle)
| [Nester](http://nester.sourceforge.net/)
| [NinjaTurtles](https://ninjaturtles.codeplex.com/)
| [PIT](http://pitest.org/)
| [Stryker](https://stryker-mutator.io/)
| [Testura.Mutation](https://github.com/Testura/Testura.Mutation)
| [VisualMutator](http://visualmutator.github.io/web/)

# Automated Exploratory Testing

> Exploratory testing is an approach to software testing that is often described as simultaneous learning, test design, and execution. It focuses on discovery and relies on the guidance of the individual tester to uncover defects that are not easily covered in the scope of other tests. [Atlassian](https://www.atlassian.com/continuous-delivery/software-testing/exploratory-testing)

| Library | Comment |
|-----------|---------|
| **[Microsoft IntelliTest](https://msdn.microsoft.com/en-us/library/dn823749.aspx)** | Part of Visual Studio<br>Previously known as:<br>- [Microsoft Code Digger](https://marketplace.visualstudio.com/items?itemName=RiSEResearchinSoftwareEngineering.MicrosoftCodeDigger)<br>- [Microsoft Pex](http://research.microsoft.com/en-us/projects/pex/)<br>- [Microsoft Smart Unit Tests](http://blogs.msdn.com/b/visualstudioalm/archive/2014/11/19/introducing-smart-unit-tests.aspx)
| [Randoop.NET](https://github.com/abb-iss/Randoop.NET)

# Property-Based Testing

> Property based testing relies on properties. It checks that a function, program or whatever system under test abides by a property. Most of the time, properties do not have to go into too much details about the output. They just have to check for useful characteristics that must be seen in the output. [Nicolas Dubien, Introduction to Property Based Testing](https://medium.com/criteo-engineering/introduction-to-property-based-testing-f5236229d237)

| Library | Comment |
|-----------|---------|
| [CsCheck](https://github.com/AnthonyLloyd/CsCheck)
| **[FsCheck](https://fscheck.github.io/FsCheck/)** | Port of [QuickCheck](https://hackage.haskell.org/package/QuickCheck)
| [Hedgehog](https://github.com/hedgehogqa/fsharp-hedgehog) | F# port of [Hedgehog](https://hedgehog.qa/)

# Approval Testing

| Library | Comment |
|-----------|---------|
| [ApprovalTests.Net](https://github.com/approvals/ApprovalTests.Net) |
| [DiffEngine](https://github.com/VerifyTests/DiffEngine) | Manages launching and cleanup of diff tools. Used by ApprovalTests, Shouldly, Verify
| [Polaroider](https://wickedflame.github.io/Polaroider/) |
| [Shouldly](https://github.com/shouldly/shouldly) | See also [`ShouldMatchApproved`](https://shouldly.readthedocs.io/en/latest/assertions/shouldMatchApproved.html)
| [Snapper](https://github.com/theramis/Snapper) |
| [Snapshooter](https://swisslife-oss.github.io/snapshooter/) |
| [Verify](https://github.com/SimonCropp/Verify) |

# Code Coverage

| Library | Comment |
|-----------|---------|
| [AxoCover](https://github.com/axodox/AxoTools) | Based on OpenCover
| [Coverlet](https://github.com/tonerdo/coverlet)
| **[dotCover](https://www.jetbrains.com/dotcover)**
| [Fine Code Coverage](https://marketplace.visualstudio.com/items?itemName=FortuneNgwenya.FineCodeCoverage)
| [NCover](https://www.ncover.com/)
| [NCrunch](http://www.ncrunch.net/)
| [NDepend](http://www.ndepend.com/)
| [OpenCover](https://github.com/OpenCover/opencover)
| [Semantic Designs C# Test Coverage Tool](http://www.semanticdesigns.com/Products/TestCoverage/CSharpTestCoverage.html)
| [Software Verify .NET Coverage Validator](http://www.softwareverify.com/dotnet-coverage.php)
| [Squish Coco](http://www.froglogic.com/squish/coco/)
| [TestMatrix](http://submain.com/products/testmatrix.aspx)
| [Typemock Isolator Coverage](http://www.typemock.com/coverage)
| [Visual Studio Code Coverage](https://msdn.microsoft.com/en-us/library/dd537628.aspx) | Part of Visual Studio

# Continuous Testing

> Continuous testing is the process of executing automated tests as part of the software delivery pipeline to obtain immediate feedback on the business risks associated with a software release candidate. [Wikipedia](https://en.wikipedia.org/wiki/Continuous_testing)

| Library | Comment |
|-----------|---------|
| [ContinuousTests](http://www.continuoustests.com/) | Formerly Mighty Moose
| **[dotCover](https://www.jetbrains.com/help/dotcover/Continuous_Testing.html)**
| [Giles](http://codereflection.github.io/Giles/)
| **[Live Unit Testing](https://blogs.msdn.microsoft.com/visualstudio/2017/03/09/live-unit-testing-in-visual-studio-2017-enterprise/)** | Part of Visual Studio
| [NCrunch](http://www.ncrunch.net/)
| [Parasoft dotTEST](https://www.parasoft.com/product/dottest/)
| [Typemock Isolator Smart Runner](http://www.typemock.com/smart-runner)

# Fluent Assertion Frameworks

| Library | Comment |
|-----------|---------|
| **[Fluent Assertions](http://www.fluentassertions.com/)**
| [NFluent](http://www.n-fluent.net/)
| [Shouldly](https://github.com/shouldly/shouldly)

# Test Data Builders and Dummy Data Generators

| Library | Comment |
|-----------|---------|
| [AutoBogus](https://github.com/nickdodd79/AutoBogus)
| **[AutoFixture](https://github.com/AutoFixture/AutoFixture)** |
| **[Bogus](https://github.com/bchavez/Bogus)** |
| [Faker.Net](https://github.com/slashdotdash/faker-cs) |
| [GenFu](https://github.com/MisterJames/GenFu) |
| [NBuilder](https://github.com/nbuilder/nbuilder) |
| [TestData](https://github.com/kiandra-it/test-data) |
| [TestStack.Dossier](http://dossier.teststack.net/)
| [Tynamix ObjectFiller.NET](https://github.com/Tynamix/ObjectFiller.NET)

# Helper Libraries

| Library | Comment |
|-----------|---------|
| [AutoMoq](https://github.com/darrencauthon/AutoMoq) | Auto mocking provider for Moq
| [ConventionTests](http://conventiontests.teststack.net/) | Library that makes it easy to build validation rules for convention validation tests
| [Fixie](http://fixie.github.io/) | Convention for tests
| [FluentMvcTesting](http://fluentmvctesting.teststack.net/) | Type-safe tests against ASP.NET MVC Controllers
| [MockQueryable](https://github.com/romantitov/MockQueryable) | Extensions for mocking EfCore
| [SparkyTestHelpers](https://github.com/BrianSchroer/sparky-test-helpers) | Unit test helpers for config files, ASP.NET MVC, and Moq among others
| [XMLUnit](https://www.xmlunit.org/) | Unit testing XML

# Miscellaneous Tools

| Library | Comment |
|-----------|---------|
| [AccidentalFish.FSharp.Validation](https://github.com/JamesRandall/AccidentalFish.FSharp.Validation) | Simple validator DSL / library for F#
| [ArchUnitNET](https://github.com/TNG/ArchUnitNET) | Library for checking the architecture of C# code
| [CheckTestOutput](https://github.com/exyi/CheckTestOutput) | A library for semi-manual tests. Run a function, manually check the output. But only if it is different than last run
| [Compare-Net-Objects](https://github.com/GregFinzer/Compare-Net-Objects) | Deep compare of any two .NET objects using reflection
| [ErrorUnit](http://errorunit.com/) | Debug C# application by automatically creating C# Unit Tests in Visual Studio that recreate the situation leading up to the error
| [ExpressionToCode](https://github.com/EamonNerbonne/ExpressionToCode) | Generates valid, readable C# from an Expression Tree
| [Harmony 2.0](https://harmony.pardeike.net/) | Runtime alter functionality by monkey patching methods
| [KREM](https://github.com/Bitvis/krem) | Automation and test framework. Integration, regression, spec testing. Well suited for embedded. Written in Python, but supports external scripts, etc.
| [NBi](http://www.nbi.io/) | Framework to test Business Intelligence
| [NScenario](https://github.com/cezarypiatek/NScenario) | Library for annotating steps of test case scenarios
| [Quality Gate One Studio](http://www.qgonestudio.com/site/) | Combinatorial and Model-Based Testing
| [Scientist.NET](https://github.com/scientistproject/Scientist.net) | A library for carefully refactoring critical paths
| [Squish Test Center](https://www.froglogic.com/testcenter/) | Aggregates test results in a central server and generates statistics
| [TestFlask](https://github.com/FatihSahin/test-flask) | Recording and mock replay framework with the ability to generate unit tests for recorded scenarios. It also provides some tools to ease scenario testing inside ASP.NET MVC apps
| [xRetry](https://github.com/JoshKeegan/xRetry) | Retry flickering test cases for xUnit and SpecFlow

# Visual Studio Add-Ins

| Library | Comment |
|-----------|---------|
| [GennyMcGenFace](https://marketplace.visualstudio.com/items?itemName=Armastevs.GennyMcGenFace) | Unit test generator
| [QuickUnit Unit Test Designer](https://visualstudiogallery.msdn.microsoft.com/dd88f120-27c6-444a-beeb-3cbdad4b620c)
| **[ReSharper](https://www.jetbrains.com/resharper/features/unit_testing.html)**
| [TestDriven.Net](http://www.testdriven.net/)
| [Unit Test Boilerplate Generator](https://marketplace.visualstudio.com/items?itemName=RandomEngy.UnitTestBoilerplateGenerator)
| [Unitverse](https://marketplace.visualstudio.com/items?itemName=MattWhitfield.Unitverse) | Unit test generator
| [WiseTester](https://marketplace.visualstudio.com/items?itemName=WiseTester.WiseTester-OvercomeUnitTestFailures)

# References

* [Awesome .NET: .NET Testing](https://dotnet.libhunt.com/categories/1837-testing)
* [Osherove: Tools And Frameworks for Unit Testing in .NET for 2013](http://osherove.com/blog/2013/3/16/tools-and-frameworks-for-unit-testing-in-net-and-java.html)
* [Stack Overflow: Is there any framework for .NET to populate test data?](https://stackoverflow.com/questions/1610212/is-there-any-framework-for-net-to-populate-test-data)
* [Wikipedia: List of unit testing frameworks](https://en.wikipedia.org/wiki/List_of_unit_testing_frameworks)
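As a small appendix to the Mutation Testing section above, the "killing a mutant" idea can be sketched in a few lines of toy Python (illustrative only, not tied to any tool listed here; the function names are made up for the example):

```python
# Toy illustration of mutation testing: a "mutant" flips one operator in the
# code under test; a good test suite "kills" the mutant by failing on it.

def price_with_discount(price, discount):
    """Original implementation."""
    return price - discount

def price_with_discount_mutant(price, discount):
    """Mutant: the '-' operator was changed to '+'."""
    return price + discount

def suite_passes(impl):
    """Return True if the (tiny) test suite passes for the given implementation."""
    return impl(100, 20) == 80

print(suite_passes(price_with_discount))         # True: original passes
print(suite_passes(price_with_discount_mutant))  # False: the mutant is killed
```

A mutation tool automates exactly this loop: it generates many mutants and reports any that the suite fails to kill.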
65.142395
1,061
0.743157
yue_Hant
0.316267
4c26d28c013ea8120b1437dc9edc26ce152d5807
1,361
md
Markdown
_submissions/round-04/1/2013-03-28-concept-map-conditional-statements-python.md
kbroman/SWC_training_course
a2881f1544c0bcfe35539a571ac0aff5efeca0a7
[ "CC-BY-4.0" ]
13
2015-02-06T16:35:46.000Z
2021-02-14T22:36:41.000Z
_submissions/round-04/1/2013-03-28-concept-map-conditional-statements-python.md
kbroman/SWC_training_course
a2881f1544c0bcfe35539a571ac0aff5efeca0a7
[ "CC-BY-4.0" ]
111
2015-01-25T18:52:36.000Z
2017-02-06T14:11:36.000Z
_submissions/round-04/1/2013-03-28-concept-map-conditional-statements-python.md
kbroman/SWC_training_course
a2881f1544c0bcfe35539a571ac0aff5efeca0a7
[ "CC-BY-4.0" ]
124
2015-01-22T00:06:47.000Z
2021-08-09T12:19:36.000Z
---
date: 2013-03-28
round: Round 4
title: 'Concept map: Conditional statements (Python)'
author: John Blischak
permalink: /2013/03/concept-map-conditional-statements-python/
tags:
- Concept Map
---

[<img class="alignnone size-medium wp-image-1965" alt="concept-map-conditional-statements-1" src="http://files.software-carpentry.org/training-course/2013/03/concept-map-conditional-statements-1-300x225.png" width="300" height="225" />][1]

I designed my concept map by trying to answer the following two questions: 1) What would I want to teach someone new to Python before introducing them to an "if" statement? 2) Once s/he understands an "if" statement, what would I need to cover such that s/he could write a script with complicated flow control? Thus, I have bolded the main stepping stones that I followed. Please let me know how I can improve it. Thanks!

**Added descriptions of links:**

[<img class="alignnone size-medium wp-image-2053" alt="concept-map-conditional-statements-2" src="http://files.software-carpentry.org/training-course/2013/03/concept-map-conditional-statements-2-300x212.png" width="300" height="212" />][2]

 [1]: http://files.software-carpentry.org/training-course/2013/03/concept-map-conditional-statements-1.png
 [2]: http://files.software-carpentry.org/training-course/2013/03/concept-map-conditional-statements-2.png
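For a concrete endpoint of the kind of flow control the concept map builds toward, a minimal Python sketch (illustrative only; the function and values are made up for the example):

```python
# An "if" statement combined with a loop: the basic flow-control building
# blocks the concept map leads a learner toward.

def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

labels = [classify(n) for n in (-2, 0, 5)]
print(labels)  # ['negative', 'zero', 'positive']
```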
61.863636
241
0.766348
eng_Latn
0.81757
4c272604bc64c1a91b9c2d255a40f2397d8a0809
803
md
Markdown
README.md
UcheIgbokwe/DataIngestion.TestAssignment
0a73cfdb6f130399a784dce7e946d55750ed5548
[ "MIT" ]
null
null
null
README.md
UcheIgbokwe/DataIngestion.TestAssignment
0a73cfdb6f130399a784dce7e946d55750ed5548
[ "MIT" ]
null
null
null
README.md
UcheIgbokwe/DataIngestion.TestAssignment
0a73cfdb6f130399a784dce7e946d55750ed5548
[ "MIT" ]
null
null
null
# DataIngestion.TestAssignment

A Console App designed to download, extract and push data into Elastic Search. This solution is built using ASP.NET CORE and Nest for Elastic Search, xUnit for unit testing, Mediator for CQRS.

Follow the steps below to run the application:

# Step 1:

In the event the files are unable to download, kindly create a SOURCES folder and dump the extracted zip content inside its name folder. Example: `DataIngestion.TestAssignment/Sources/artist/artist`

# Step 2:

Open the application in Visual Studio Code or any preferred IDE and navigate to the root folder. Next, run the following command: `dotnet run`

# Step 3: TESTING

Open the application in Visual Studio Code or any preferred IDE and navigate to the root/TEST folder. Next, run the following command: `dotnet test`
36.5
137
0.795766
eng_Latn
0.991131
4c290c83a5f27b5fa386a28a97d0be376ac9f7e7
1,386
md
Markdown
glossary.md
lakshay-nasa/glossary
1a558afb630bf4178dd8af2e3fdc4f1062fdaa77
[ "Apache-2.0" ]
333
2021-01-15T16:55:41.000Z
2022-03-30T05:17:04.000Z
glossary.md
lakshay-nasa/glossary
1a558afb630bf4178dd8af2e3fdc4f1062fdaa77
[ "Apache-2.0" ]
552
2021-02-03T17:20:30.000Z
2022-03-31T15:53:22.000Z
glossary.md
lakshay-nasa/glossary
1a558afb630bf4178dd8af2e3fdc4f1062fdaa77
[ "Apache-2.0" ]
205
2021-02-03T17:18:35.000Z
2022-03-31T13:02:16.000Z
# Cloud Native Glossary

## Cloud native business impact

Business operating models in today’s world have shifted dramatically. Faced with a rapid pace of change, organizations must continuously adjust to new technology developments and customer demands. Digital transformation has now led to a world where organizations largely derive value and revenue through digital interactions. No longer is technology merely supporting business processes, it has become a strategic differentiator.

Cloud native technologies allow organizations to deliver digital products and services to market faster while increasing scalability, stability, and resiliency. This enables them to elastically scale based on customer needs, ensuring business continuity. Additionally, cloud native technologies automate a lot of the operational work, improving efficiency. In short, cloud native technologies enable organizations to adapt as market needs shift while minimizing operations cost.

The Cloud Native Computing Foundation seeks to drive adoption of this paradigm by fostering and sustaining a community and ecosystem of open source, vendor-neutral projects which focus on driving business value. We are a neutral space for open collaboration and innovation that enables the creation of new billion-dollar markets.

## Terms

Moved terms into their own files. You can find them [here](/content/en)
92.4
479
0.829004
eng_Latn
0.998657
4c2959eecc1b79c57e516e21eadd57e2ba927289
6,360
md
Markdown
knowledge-base/custom-shape-in-chart-legend-items.md
victorarshon/winforms-docs
1c064a3588f4a94ed382feba6eccc2667424434d
[ "Apache-2.0" ]
30
2016-02-18T13:23:42.000Z
2021-09-23T01:26:05.000Z
knowledge-base/custom-shape-in-chart-legend-items.md
victorarshon/winforms-docs
1c064a3588f4a94ed382feba6eccc2667424434d
[ "Apache-2.0" ]
25
2016-03-16T07:13:47.000Z
2021-07-30T13:31:24.000Z
knowledge-base/custom-shape-in-chart-legend-items.md
victorarshon/winforms-docs
1c064a3588f4a94ed382feba6eccc2667424434d
[ "Apache-2.0" ]
183
2016-02-19T09:56:35.000Z
2022-01-17T18:03:36.000Z
---
title: Apply a Custom Shape to ChartView's Legend Items
description: This article shows how you can apply a custom shape to chartview's legend items.
type: how-to
page_title: How to apply a custom Shape to ChartView's legend items
slug: custom-shape-in-chart-legend-items
position: 0
tags: chart, legend, shape
res_type: kb
---

## Environment

|Product Version|Product|Author|
|----|----|----|
|2019.2.508|RadChartView for WinForms|[Desislava Yordanova](https://www.telerik.com/blogs/author/desislava-yordanova)|

## Description

The **LegendItemElement** in **RadChartView** consists of a **LegendItemTitle** and a **LegendItemMarker**:

![custom-shape-in-chart-legend-items001](images/custom-shape-in-chart-legend-items001.png)

A common requirement is to change the square shape of the marker element. This article demonstrates how to achieve it.

## Solution

Before populating **RadChartView** with data, subscribe to the ChartElement.LegendElement.**VisualItemCreating** event, create the default **LegendItemElement** and assign an **ElementShape** to the ItemElement.MarkerElement.**Shape** property:

![custom-shape-in-chart-legend-items002](images/custom-shape-in-chart-legend-items002.png)

#### Apply a HeartShape to legend items

````C#
public RadForm1()
{
    InitializeComponent();

    this.radChartView1.ChartElement.LegendElement.VisualItemCreating += LegendElement_VisualItemCreating;

    BarSeries barSeries = new BarSeries("Performance", "RepresentativeName");
    barSeries.LegendTitle = "Q1";
    barSeries.DataPoints.Add(new CategoricalDataPoint(177, "Harley"));
    barSeries.DataPoints.Add(new CategoricalDataPoint(128, "White"));
    barSeries.DataPoints.Add(new CategoricalDataPoint(143, "Smith"));
    barSeries.DataPoints.Add(new CategoricalDataPoint(111, "Jones"));
    barSeries.DataPoints.Add(new CategoricalDataPoint(118, "Marshall"));
    this.radChartView1.Series.Add(barSeries);

    BarSeries barSeries2 = new BarSeries("Performance", "RepresentativeName");
    barSeries2.LegendTitle = "Q2";
    barSeries2.DataPoints.Add(new CategoricalDataPoint(153, "Harley"));
    barSeries2.DataPoints.Add(new CategoricalDataPoint(141, "White"));
    barSeries2.DataPoints.Add(new CategoricalDataPoint(130, "Smith"));
    barSeries2.DataPoints.Add(new CategoricalDataPoint(88, "Jones"));
    barSeries2.DataPoints.Add(new CategoricalDataPoint(109, "Marshall"));
    this.radChartView1.Series.Add(barSeries2);

    this.radChartView1.ShowLegend = true;
}

private void LegendElement_VisualItemCreating(object sender, LegendItemElementCreatingEventArgs e)
{
    e.ItemElement = new LegendItemElement(e.LegendItem);
    e.ItemElement.MarkerElement.Shape = new HeartShape();
}
````
````VB.NET
Public Sub New()
    InitializeComponent()

    AddHandler Me.RadChartView1.ChartElement.LegendElement.VisualItemCreating, AddressOf LegendElement_VisualItemCreating

    Dim barSeries As BarSeries = New BarSeries("Performance", "RepresentativeName")
    barSeries.LegendTitle = "Q1"
    barSeries.DataPoints.Add(New CategoricalDataPoint(177, "Harley"))
    barSeries.DataPoints.Add(New CategoricalDataPoint(128, "White"))
    barSeries.DataPoints.Add(New CategoricalDataPoint(143, "Smith"))
    barSeries.DataPoints.Add(New CategoricalDataPoint(111, "Jones"))
    barSeries.DataPoints.Add(New CategoricalDataPoint(118, "Marshall"))
    Me.RadChartView1.Series.Add(barSeries)

    Dim barSeries2 As BarSeries = New BarSeries("Performance", "RepresentativeName")
    barSeries2.LegendTitle = "Q2"
    barSeries2.DataPoints.Add(New CategoricalDataPoint(153, "Harley"))
    barSeries2.DataPoints.Add(New CategoricalDataPoint(141, "White"))
    barSeries2.DataPoints.Add(New CategoricalDataPoint(130, "Smith"))
    barSeries2.DataPoints.Add(New CategoricalDataPoint(88, "Jones"))
    barSeries2.DataPoints.Add(New CategoricalDataPoint(109, "Marshall"))
    Me.RadChartView1.Series.Add(barSeries2)

    Me.RadChartView1.ShowLegend = True
End Sub

Private Sub LegendElement_VisualItemCreating(ByVal sender As Object, ByVal e As LegendItemElementCreatingEventArgs)
    e.ItemElement = New LegendItemElement(e.LegendItem)
    e.ItemElement.MarkerElement.Shape = New HeartShape()
End Sub
````

You can also create your own **ElementShape**. Create a derivative of the **ElementShape** class and override its **CreatePath** method. The following code snippet demonstrates how to draw a single line as it is demonstrated below:

![custom-shape-in-chart-legend-items003](images/custom-shape-in-chart-legend-items003.png)

#### Apply a custom LineShape to legend items

````C#
private void LegendElement_VisualItemCreating(object sender, LegendItemElementCreatingEventArgs e)
{
    e.ItemElement = new LegendItemElement(e.LegendItem);
    e.ItemElement.MarkerElement.Shape = new LineShape();
}

public class LineShape : ElementShape
{
    public override GraphicsPath CreatePath(Rectangle bounds)
    {
        GraphicsPath path = new GraphicsPath();
        Point start = new Point(bounds.X, bounds.Y);
        Point end = new Point(bounds.X + bounds.Width, bounds.Y);
        path.AddLine(start, end);
        return path;
    }
}
````
````VB.NET
Private Sub LegendElement_VisualItemCreating(ByVal sender As Object, ByVal e As LegendItemElementCreatingEventArgs)
    e.ItemElement = New LegendItemElement(e.LegendItem)
    e.ItemElement.MarkerElement.Shape = New LineShape()
End Sub

Public Class LineShape
    Inherits ElementShape

    Public Overrides Function CreatePath(ByVal bounds As Rectangle) As GraphicsPath
        Dim path As GraphicsPath = New GraphicsPath()
        Dim start As Point = New Point(bounds.X, bounds.Y)
        Dim [end] As Point = New Point(bounds.X + bounds.Width, bounds.Y)
        path.AddLine(start, [end])
        Return path
    End Function
End Class
````
42.684564
244
0.688994
yue_Hant
0.648648
4c29eebdabad9578c37106bab19df0152865e1d9
2,648
md
Markdown
README.md
lik2129/ysu_coin
47e40ed5d4000fc59566099929bd08a9ae16a4c1
[ "BSD-3-Clause" ]
null
null
null
README.md
lik2129/ysu_coin
47e40ed5d4000fc59566099929bd08a9ae16a4c1
[ "BSD-3-Clause" ]
null
null
null
README.md
lik2129/ysu_coin
47e40ed5d4000fc59566099929bd08a9ae16a4c1
[ "BSD-3-Clause" ]
null
null
null
<hr />
<div align="center">
    <img src="images/logo.svg" alt="Logo" width='300px' height='auto'/>
</div>
<hr />

[![Live Artifacts](https://github.com/ysucurrency/ysu-node/workflows/Live/badge.svg)](https://github.com/ysucurrency/ysu-node/actions?query=workflow%3ALive)
[![Beta Artifacts](https://github.com/ysucurrency/ysu-node/workflows/Beta/badge.svg)](https://github.com/ysucurrency/ysu-node/actions?query=workflow%3ABeta)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/ysucurrency/ysu-node)](https://github.com/ysucurrency/ysu-node/releases/latest)
[![GitHub tag (latest by date)](https://img.shields.io/github/v/tag/ysucurrency/ysu-node?color=darkblue&label=beta)](https://github.com/ysucurrency/ysu-node/tags)
[![Tests](https://github.com/ysucurrency/ysu-node/workflows/Tests/badge.svg)](https://github.com/ysucurrency/ysu-node/actions?query=workflow%3ATests)
[![RelWithDebug Tests](https://github.com/ysucurrency/ysu-node/workflows/Release%20Tests/badge.svg)](https://github.com/ysucurrency/ysu-node/actions?query=workflow%3A%22Release+Tests%22)
[![Discord](https://img.shields.io/badge/discord-join%20chat-orange.svg)](https://chat.ysu.org)

---

### What is Ysu?

Ysu is a digital payment protocol designed to be accessible and lightweight, with a focus on removing inefficiencies present in other cryptocurrencies. With ultrafast transactions and zero fees on a secure, green and decentralized network, this makes Ysu ideal for everyday transactions.

---

### Guides & Documentation

* [Whitepaper](https://ysu.org/en/whitepaper)
* [Running a Node](https://docs.ysu.org/running-a-node/overview/)
* [Integration Guides](https://docs.ysu.org/integration-guides/the-basics/)
* [Command Line Interface](https://docs.ysu.org/commands/command-line-interface/)
* [RPC Protocol](https://docs.ysu.org/commands/rpc-protocol/)

Other documentation details can be found at https://docs.ysu.org.

---

### Links & Resources

* [Ysu Website](https://ysu.org)
* [Documentation](https://docs.ysu.org)
* [Discord Chat](https://chat.ysu.org/)
* [Reddit](https://reddit.com/r/ysucurrency)
* [Medium](https://medium.com/ysucurrency)
* [Twitter](https://twitter.com/ysu)

---

### Want to Contribute?

Please see the [contributors guide](https://docs.ysu.org/protocol-design/overview/#contributing-code-to-the-ysu-node).

---

### Contact us

We want to hear about any trouble, success, delight, or pain you experience when using Ysu. Let us know by [filing an issue](https://github.com/ysucurrency/ysu-node/issues), joining us on [Reddit](https://reddit.com/r/ysucurrency), or joining us on [Discord](https://chat.ysu.org/).
47.285714
287
0.744713
eng_Latn
0.347734
4c2b3fb093e059764cf09174a0c00eebc220bd6c
5,105
md
Markdown
docs/BootSdCardAllOf.md
pktyagi1/intersight-go
1f8bd2aa1b8cbdf35d45068fb113b65ee41df3ed
[ "Apache-2.0" ]
null
null
null
docs/BootSdCardAllOf.md
pktyagi1/intersight-go
1f8bd2aa1b8cbdf35d45068fb113b65ee41df3ed
[ "Apache-2.0" ]
1
2022-03-21T06:28:43.000Z
2022-03-21T06:28:43.000Z
docs/BootSdCardAllOf.md
pktyagi1/intersight-go
1f8bd2aa1b8cbdf35d45068fb113b65ee41df3ed
[ "Apache-2.0" ]
2
2020-07-07T15:00:25.000Z
2022-03-21T04:43:33.000Z
# BootSdCardAllOf

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**ClassId** | **string** | The fully-qualified name of the instantiated, concrete type. This property is used as a discriminator to identify the type of the payload when marshaling and unmarshaling data. | [default to "boot.SdCard"]
**ObjectType** | **string** | The fully-qualified name of the instantiated, concrete type. The value should be the same as the &#39;ClassId&#39; property. | [default to "boot.SdCard"]
**Bootloader** | Pointer to [**NullableBootBootloader**](BootBootloader.md) | | [optional]
**Lun** | Pointer to **int64** | The Logical Unit Number (LUN) of the device. | [optional] [default to 0]
**Subtype** | Pointer to **string** | The subtype for the selected device type. * &#x60;None&#x60; - No sub type for SD card boot device. * &#x60;flex-util&#x60; - Use of FlexUtil (microSD) card as sub type for SD card boot device. * &#x60;flex-flash&#x60; - Use of FlexFlash (SD) card as sub type for SD card boot device. * &#x60;SDCARD&#x60; - Use of SD card as sub type for the SD Card boot device. | [optional] [default to "None"]

## Methods

### NewBootSdCardAllOf

`func NewBootSdCardAllOf(classId string, objectType string, ) *BootSdCardAllOf`

NewBootSdCardAllOf instantiates a new BootSdCardAllOf object
This constructor will assign default values to properties that have it defined, and makes sure properties required by API are set, but the set of arguments will change when the set of required properties is changed

### NewBootSdCardAllOfWithDefaults

`func NewBootSdCardAllOfWithDefaults() *BootSdCardAllOf`

NewBootSdCardAllOfWithDefaults instantiates a new BootSdCardAllOf object
This constructor will only assign default values to properties that have it defined, but it doesn't guarantee that properties required by API are set

### GetClassId

`func (o *BootSdCardAllOf) GetClassId() string`

GetClassId returns the ClassId field if non-nil, zero value otherwise.

### GetClassIdOk

`func (o *BootSdCardAllOf) GetClassIdOk() (*string, bool)`

GetClassIdOk returns a tuple with the ClassId field if it's non-nil, zero value otherwise and a boolean to check if the value has been set.

### SetClassId

`func (o *BootSdCardAllOf) SetClassId(v string)`

SetClassId sets ClassId field to given value.

### GetObjectType

`func (o *BootSdCardAllOf) GetObjectType() string`

GetObjectType returns the ObjectType field if non-nil, zero value otherwise.

### GetObjectTypeOk

`func (o *BootSdCardAllOf) GetObjectTypeOk() (*string, bool)`

GetObjectTypeOk returns a tuple with the ObjectType field if it's non-nil, zero value otherwise and a boolean to check if the value has been set.

### SetObjectType

`func (o *BootSdCardAllOf) SetObjectType(v string)`

SetObjectType sets ObjectType field to given value.

### GetBootloader

`func (o *BootSdCardAllOf) GetBootloader() BootBootloader`

GetBootloader returns the Bootloader field if non-nil, zero value otherwise.

### GetBootloaderOk

`func (o *BootSdCardAllOf) GetBootloaderOk() (*BootBootloader, bool)`

GetBootloaderOk returns a tuple with the Bootloader field if it's non-nil, zero value otherwise and a boolean to check if the value has been set.

### SetBootloader

`func (o *BootSdCardAllOf) SetBootloader(v BootBootloader)`

SetBootloader sets Bootloader field to given value.

### HasBootloader

`func (o *BootSdCardAllOf) HasBootloader() bool`

HasBootloader returns a boolean if a field has been set.

### SetBootloaderNil

`func (o *BootSdCardAllOf) SetBootloaderNil(b bool)`

SetBootloaderNil sets the value for Bootloader to be an explicit nil

### UnsetBootloader

`func (o *BootSdCardAllOf) UnsetBootloader()`

UnsetBootloader ensures that no value is present for Bootloader, not even an explicit nil

### GetLun

`func (o *BootSdCardAllOf) GetLun() int64`

GetLun returns the Lun field if non-nil, zero value otherwise.

### GetLunOk

`func (o *BootSdCardAllOf) GetLunOk() (*int64, bool)`

GetLunOk returns a tuple with the Lun field if it's non-nil, zero value otherwise and a boolean to check if the value has been set.

### SetLun

`func (o *BootSdCardAllOf) SetLun(v int64)`

SetLun sets Lun field to given value.

### HasLun

`func (o *BootSdCardAllOf) HasLun() bool`

HasLun returns a boolean if a field has been set.

### GetSubtype

`func (o *BootSdCardAllOf) GetSubtype() string`

GetSubtype returns the Subtype field if non-nil, zero value otherwise.

### GetSubtypeOk

`func (o *BootSdCardAllOf) GetSubtypeOk() (*string, bool)`

GetSubtypeOk returns a tuple with the Subtype field if it's non-nil, zero value otherwise and a boolean to check if the value has been set.

### SetSubtype

`func (o *BootSdCardAllOf) SetSubtype(v string)`

SetSubtype sets Subtype field to given value.

### HasSubtype

`func (o *BootSdCardAllOf) HasSubtype() bool`

HasSubtype returns a boolean if a field has been set.

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
31.708075
434
0.746523
eng_Latn
0.651825
4c2c592b4e1e4d750782da0c0484fdf46f6bfa6c
32
md
Markdown
README.md
lolita-is-godddd/PortableMusicPlayerMod
29555836e2ca91bc39536ad7210dc02f8349ceff
[ "Apache-2.0" ]
null
null
null
README.md
lolita-is-godddd/PortableMusicPlayerMod
29555836e2ca91bc39536ad7210dc02f8349ceff
[ "Apache-2.0" ]
null
null
null
README.md
lolita-is-godddd/PortableMusicPlayerMod
29555836e2ca91bc39536ad7210dc02f8349ceff
[ "Apache-2.0" ]
null
null
null
# PortableMusicPlayerMod java!w
10.666667
24
0.84375
yue_Hant
0.626594
4c2c9d6c9f6af713b9061a8ef005d1f738698f7f
9,026
md
Markdown
articles/logic-apps/logic-apps-schema-2016-04-01.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
12
2017-08-28T07:45:55.000Z
2022-03-07T21:35:48.000Z
articles/logic-apps/logic-apps-schema-2016-04-01.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
441
2017-11-08T13:15:56.000Z
2021-06-02T10:39:53.000Z
articles/logic-apps/logic-apps-schema-2016-04-01.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
27
2017-11-13T13:38:31.000Z
2022-02-17T11:57:33.000Z
---
title: Schema updates June-1-2016
description: Updated schema version 2016-06-01 for logic app definitions in Azure Logic Apps
services: logic-apps
ms.suite: integration
author: kevinlam1
ms.author: klam
ms.reviewer: estfan, logicappspm
ms.topic: article
ms.date: 07/25/2016
ms.openlocfilehash: ccc7df5bfac327fabf05f210764dbe10658b5015
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 03/29/2021
ms.locfileid: "96000321"
---
# <a name="schema-updates-for-azure-logic-apps---june-1-2016"></a>Schema updates for Azure Logic Apps - June 1, 2016

The [updated schema version](https://schema.management.azure.com/schemas/2016-06-01/Microsoft.Logic.json) and API for Azure Logic Apps include key improvements that make logic apps more reliable and easier to use:

* [Scopes](#scopes) let you group or nest actions as a collection of actions.
* [Conditions and loops](#conditions-loops) are now first-class actions.
* More precise action run ordering with the `runAfter` property, which replaces `dependsOn`

To upgrade your logic apps from the August 1, 2015 preview schema version to the June 1, 2016 schema, [see the upgrade section](#upgrade-your-schema).

<a name="scopes"></a>

## <a name="scopes"></a>Scopes

This schema includes scopes, which let you group actions together or nest actions inside each other. For example, a condition can contain another condition. Learn more about [scope syntax](./logic-apps-control-flow-loops.md), or review this example of a basic scope:

```json
{
    "actions": {
        "Scope": {
            "type": "Scope",
            "actions": {
                "Http": {
                    "inputs": {
                        "method": "GET",
                        "uri": "https://www.bing.com"
                    },
                    "runAfter": {},
                    "type": "Http"
                }
            }
        }
    }
}
```

<a name="conditions-loops"></a>

## <a name="conditions-and-loops-changes"></a>Conditions and loops changes

In previous schema versions, conditions and loops were parameters associated with a single action. This schema lifts that limitation, so conditions and loops are now available as action types. Learn more about [loops and scopes](./logic-apps-control-flow-loops.md) and [conditions](../logic-apps/logic-apps-control-flow-conditional-statement.md), or review this basic example that shows a condition action:

```json
{
    "Condition - If trigger is some trigger": {
        "type": "If",
        "expression": "@equals(triggerBody(), '<trigger-name>')",
        "runAfter": {},
        "actions": {
            "Http_2": {
                "inputs": {
                    "method": "GET",
                    "uri": "https://www.bing.com"
                },
                "runAfter": {},
                "type": "Http"
            }
        },
        "else": {
            "Condition - If trigger is another trigger": {}
        }
    }
}
```

<a name="run-after"></a>

## <a name="runafter-property"></a>"runAfter" property

The `runAfter` property replaces `dependsOn`, giving you more precision when you specify the run order for actions based on the status of previous actions. The `dependsOn` property meant "the action ran and succeeded", regardless of whether the previous action succeeded, failed, or was skipped, rather than how many times the action ran. The `runAfter` property provides flexibility as an object that specifies all the action names after which it runs. This property also defines an array of statuses that are accepted as triggers. For example, if you want an action to run after action A succeeds, and also after action B succeeds or fails, configure this `runAfter` property:

```json
{
    // Other parts in action definition
    "runAfter": {
        "A": ["Succeeded"],
        "B": ["Succeeded", "Failed"]
    }
}
```

## <a name="upgrade-your-schema"></a>Upgrade your schema

Upgrading to the latest [schema](https://schema.management.azure.com/schemas/2016-06-01/Microsoft.Logic.json) takes only a few steps. The upgrade process includes running the upgrade script, saving the result as a new logic app, and, if you want, possibly replacing the previous logic app.

1. In the Azure portal, open your logic app.

2. Go to **Overview**. On the logic app toolbar, select **Update Schema**.

   ![Select Update Schema][1]

   The upgraded definition is returned; if necessary, you can copy and paste it into a resource definition.

   > [!IMPORTANT]
   > *Make sure* you select **Save As** so that all connection references remain valid in the upgraded logic app.

3. On the upgrade blade toolbar, select **Save As**.

4. Enter the logic app name and status. To deploy your upgraded logic app, select **Create**.

5. Confirm that your upgraded logic app works as expected.

   > [!NOTE]
   > If you use the manual or request trigger, the callback URL changes in the new logic app. Test the new URL to make sure the end-to-end experience works. To preserve previous URLs, you can clone over your existing logic app.

6. *Optional* To replace your previous logic app with the new schema version, on the toolbar, select **Clone**, then **Update Schema**. This step is only necessary if you want to keep the same resource ID or request trigger URL for your logic app.

## <a name="upgrade-tool-notes"></a>Upgrade tool notes

### <a name="mapping-conditions"></a>Mapping conditions

In the upgraded definition, the tool does its best to group the true and false branch actions together as a scope. In particular, the designer pattern `@equals(actions('a').status, 'Skipped')` appears as an `else` action. However, if the tool detects unrecognizable patterns, it might create separate conditions for both the true and the false branch. You can remap actions after upgrading, if necessary.

#### <a name="foreach-loop-with-condition"></a>"foreach" loop with condition

In the new schema, you can use the filter action to replicate the pattern that uses a **for each** loop with a single condition per item. However, this change happens automatically when you upgrade. The condition becomes a filter action that appears before the **for each** loop, returning only an array of items that match the condition and passing that array to the **for each** action. For an example, see [loops and scopes](./logic-apps-control-flow-loops.md).

### <a name="resource-tags"></a>Resource tags

Resource tags are removed after the upgrade, so you must set them again for your upgraded workflow.

## <a name="other-changes"></a>Other changes

### <a name="renamed-manual-trigger-to-request-trigger"></a>Renamed "manual" trigger to "Request" trigger

The `manual` trigger type was deprecated and renamed to `request` with type `http`. This change creates more consistency for the kind of pattern that the trigger is used to build.
### <a name="new-filter-action"></a>Nowa akcja "filter" Aby odfiltrować dużą tablicę w dół do mniejszego zestawu elementów, nowy `filter` Typ akceptuje tablicę i warunek, oblicza warunek dla każdego elementu i zwraca tablicę zawierającą elementy spełniające warunek. ### <a name="restrictions-for-foreach-and-until-actions"></a>Ograniczenia dotyczące akcji "foreach" i "until" `foreach` `until` Pętla i jest ograniczona do pojedynczej akcji. ### <a name="new-trackedproperties-for-actions"></a>Nowe "trackedProperties" dla akcji Akcje mogą teraz mieć dodatkową właściwość o nazwie `trackedProperties` , która jest elementem równorzędnym `runAfter` dla `type` właściwości i. Ten obiekt Określa pewne dane wejściowe lub wyjściowe akcji, które mają zostać dołączone do telemetrii diagnostyki platformy Azure, emitowane w ramach przepływu pracy. Na przykład: ``` json { "Http": { "inputs": { "method": "GET", "uri": "https://www.bing.com" }, "runAfter": {}, "type": "Http", "trackedProperties": { "responseCode": "@action().outputs.statusCode", "uri": "@action().inputs.uri" } } } ``` ## <a name="next-steps"></a>Następne kroki * [Tworzenie definicji przepływu pracy dla aplikacji logiki](../logic-apps/logic-apps-author-definitions.md) * [Tworzenie wdrożenia aplikacji logiki](logic-apps-azure-resource-manager-templates-overview.md) <!-- Image references --> [1]: ./media/logic-apps-schema-2016-04-01/upgradeButton.png
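The `runAfter` semantics described above — an action becomes eligible only when every predecessor it lists has finished in one of the accepted statuses — can be sketched in a few lines of plain Python. This is a conceptual illustration only, not Logic Apps runtime code; the action names mirror the JSON example earlier:

```python
# Conceptual sketch of 'runAfter' evaluation: each entry maps a predecessor
# action name to the list of final statuses accepted as a trigger.
# An empty runAfter object means the action can run immediately.

def can_run(run_after: dict, finished: dict) -> bool:
    """run_after: predecessor name -> accepted statuses;
    finished: action name -> actual final status."""
    return all(
        name in finished and finished[name] in accepted
        for name, accepted in run_after.items()
    )

# Mirrors the JSON example: run after A succeeds, and after B succeeds or fails.
run_after = {"A": ["Succeeded"], "B": ["Succeeded", "Failed"]}

print(can_run(run_after, {"A": "Succeeded", "B": "Failed"}))  # True
print(can_run(run_after, {"A": "Failed", "B": "Succeeded"}))  # False
print(can_run(run_after, {"A": "Succeeded"}))                 # False: B has not finished
```

The same check, applied repeatedly over a set of actions, yields the overall run order that the engine derives from the definition.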
# Lab 04 Licensing

### Give a Creative Commons license to your lab report (Lab4.md)

<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.

### Why is it important to choose a LICENSE?

By definition, open source allows people to use, change, and distribute software. If a project is not licensed, then the original creator could at any time tell you to stop using their software. Using a license allows other people to actually use the code you have shared, on the terms you have agreed to.

### Why is it important that you SHOULDN'T use a project that doesn't have a license?

As mentioned earlier, if a creator doesn't put a license on their project, then it is not open source. The creator could then claim copyright on any work you build off of their project.

### Read the Failure to follow the Open System Model Section of Why the Web beat Gopher.

It is pretty easy to agree with the claim in this section, because the Web team's product still gets used today, while very few people know of the Gopher team's existence. The main reason for this is that the Gopher team did not empower the developers who would go on to use and become leaders in their software.

### Why Linux would use a GPL v2 License?

It makes perfect sense why Linux would choose a copyleft license. It was created alongside other popular operating systems like Windows and macOS, but it has always been completely open source. This is a great principle to follow, but other, proprietary companies might see that as a weakness and try to profit off of another person's hard work.
Luckily, copyleft licenses like the GPL v2 make sure that if someone uses or repurposes their work, anything they put out must also fall under the same license. Being the competitive operating system that Linux is, it appeases its fan base by giving them complete freedom while also preventing a company from restricting the rights of others.

Much of Linux's success comes from the high approval rates of its users, due to its adherence to the principles of open source and its fast speeds with little to no malware. There have been many derivatives of Linux as well, and while people may branch off to use different flavors of Linux, they are still utilizing a part of Linux. This has allowed Linux to stretch far and wide in its usability across many different applications. In conclusion, Linux's creation has been based on the idea of free software, and it continues to live up to that with its licensing, expanding its number of users exponentially.

https://www.linux.com/what-is-linux

https://www.adminschoice.com/download-linux-top-10-free-linux-distributions-for-desktop-and-servers

https://www.linux.org/

### Group Repository

https://github.com/barnesv17/Lab04-Licensing

### Project Licenses Table

| Website | License Present | License |
|--------------------------------------|-----------------|-----------------------------------------------------------|
| https://github.com/meowskers/BeirRun | yes | [MIT License](https://en.wikipedia.org/wiki/MIT_License) |
| https://github.com/sezenack/Red-Army-App | yes | [MIT License](https://en.wikipedia.org/wiki/MIT_License) |
| https://github.com/copperwater/yacs | yes | [MIT License](https://en.wikipedia.org/wiki/MIT_License) |
| https://github.com/thepoly/pipeline | yes | [MIT License](https://en.wikipedia.org/wiki/MIT_License) |
| https://github.com/saprap1/auto-calendar | yes | [MIT License](https://en.wikipedia.org/wiki/MIT_License) |
---
title: "sp_helpmergearticle (Transact-SQL) | Microsoft Docs"
ms.custom: ""
ms.date: "03/14/2017"
ms.prod: "sql-non-specified"
ms.prod_service: "database-engine"
ms.service: ""
ms.component: "system-stored-procedures"
ms.reviewer: ""
ms.suite: "sql"
ms.technology:
  - "replication"
ms.tgt_pltfrm: ""
ms.topic: "language-reference"
applies_to:
  - "SQL Server"
f1_keywords:
  - "sp_helpmergearticle"
  - "sp_helpmergearticle_TSQL"
helpviewer_keywords:
  - "sp_helpmergearticle"
ms.assetid: 0fb9986a-3c33-46ef-87bb-297396ea5a6a
caps.latest.revision: 40
author: "edmacauley"
ms.author: "edmaca"
manager: "craigg"
ms.workload: "Inactive"
---
# sp_helpmergearticle (Transact-SQL)

[!INCLUDE[tsql-appliesto-ss2008-xxxx-xxxx-xxx-md](../../includes/tsql-appliesto-ss2008-xxxx-xxxx-xxx-md.md)]

Returns information about an article. This stored procedure is executed at the Publisher on the publication database or at a republishing Subscriber on the subscription database.

![Topic link icon](../../database-engine/configure-windows/media/topic-link.gif "Topic link icon") [Transact-SQL Syntax Conventions](../../t-sql/language-elements/transact-sql-syntax-conventions-transact-sql.md)

## Syntax

```
sp_helpmergearticle [ [ @publication = ] 'publication' ]
    [ , [ @article= ] 'article' ]
```

## Arguments

[ **@publication=**] **'***publication***'**
Is the name of the publication about which to retrieve information. *publication* is **sysname**, with a default of **%**, which returns information about all merge articles contained in all publications in the current database.

[ **@article=**] **'***article***'**
Is the name of the article for which to return information. *article* is **sysname**, with a default of **%**, which returns information about all merge articles in the given publication.
## Result Set

|Column name|Data type|Description|
|-----------------|---------------|-----------------|
|**id**|**int**|Article identifier.|
|**name**|**sysname**|Name of the article.|
|**source_owner**|**sysname**|Name of the owner of the source object.|
|**source_object**|**sysname**|Name of the source object from which to add the article.|
|**sync_object_owner**|**sysname**|Name of the owner of the view that defines the published article.|
|**sync_object**|**sysname**|Name of the custom object used to establish the initial data for the partition.|
|**description**|**nvarchar(255)**|Description of the article.|
|**status**|**tinyint**|Status of the article, which can be one of the following:<br /><br /> **1** = inactive<br /><br /> **2** = active<br /><br /> **5** = data definition language (DDL) operation pending<br /><br /> **6** = DDL operation with a newly generated snapshot<br /><br /> Note: When an article is reinitialized, values of **5** and **6** are changed to **2**.|
|**creation_script**|**nvarchar(255)**|Path and name of an optional article schema script used to create the article in the subscription database.|
|**conflict_table**|**nvarchar(270)**|Name of the table storing the insert or update conflicts.|
|**article_resolver**|**nvarchar(255)**|Custom resolver for the article.|
|**subset_filterclause**|**nvarchar(1000)**|WHERE clause specifying the horizontal filtering.|
|**pre_creation_command**|**tinyint**|Pre-creation method, which can be one of the following:<br /><br /> **0** = none<br /><br /> **1** = drop<br /><br /> **2** = delete<br /><br /> **3** = truncate|
|**schema_option**|**binary(8)**|Bitmap of the schema generation option for the article. For information about this bitmap option, see [sp_addmergearticle](../../relational-databases/system-stored-procedures/sp-addmergearticle-transact-sql.md) or [sp_changemergearticle](../../relational-databases/system-stored-procedures/sp-changemergearticle-transact-sql.md).|
|**type**|**smallint**|Type of article, which can be one of the following:<br /><br /> **10** = table<br /><br /> **32** = stored procedure<br /><br /> **64** = view or indexed view<br /><br /> **128** = user defined function<br /><br /> **160** = synonym schema only|
|**column_tracking**|**int**|Setting for column-level tracking; where **1** means that column-level tracking is on, and **0** means that column-level tracking is off.|
|**resolver_info**|**nvarchar(255)**|Name of the article resolver.|
|**vertical_partition**|**bit**|If the article is vertically partitioned; where **1** means that the article is vertically partitioned, and **0** means that it is not.|
|**destination_owner**|**sysname**|Owner of the destination object. Applicable to merge stored procedures, views, and user-defined function (UDF) schema articles only.|
|**identity_support**|**int**|If automatic identity range handling is enabled; where **1** is enabled and **0** is disabled.|
|**pub_identity_range**|**bigint**|The range size to use when assigning new identity values. For more information, see the "Merge Replication" section of [Replicate Identity Columns](../../relational-databases/replication/publish/replicate-identity-columns.md).|
|**identity_range**|**bigint**|The range size to use when assigning new identity values. For more information, see the "Merge Replication" section of [Replicate Identity Columns](../../relational-databases/replication/publish/replicate-identity-columns.md).|
|**threshold**|**int**|Percentage value used for Subscribers running [!INCLUDE[ssEW](../../includes/ssew-md.md)] or previous versions of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)]. **threshold** controls when the Merge Agent assigns a new identity range. When the percentage of values specified in threshold is used, the Merge Agent creates a new identity range. For more information, see the "Merge Replication" section of [Replicate Identity Columns](../../relational-databases/replication/publish/replicate-identity-columns.md).|
|**verify_resolver_signature**|**int**|If a digital signature is verified before using a resolver in merge replication; where **0** means that the signature is not verified, and **1** means that the signature is verified to see if it is from a trusted source.|
|**destination_object**|**sysname**|Name of the destination object. Applicable to merge stored procedures, views, and UDF schema articles only.|
|**allow_interactive_resolver**|**int**|If the Interactive Resolver is used on an article; where **1** means that this resolver is used, and **0** means that it is not used.|
|**fast_multicol_updateproc**|**int**|Enables or disables the Merge Agent to apply changes to multiple columns in the same row in one UPDATE statement; where **1** means that multiple columns are updated in one statement, and **0** means that separate UPDATE statements are issued for each updated column.|
|**check_permissions**|**int**|Integer value that represents the bitmap of the table-level permissions that are verified. For a list of possible values, see [sp_addmergearticle &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/sp-addmergearticle-transact-sql.md).|
|**processing_order**|**int**|The order in which data changes are applied to articles in a publication.|
|**upload_options**|**tinyint**|Defines restrictions on updates made at a Subscriber with a client subscription, which can be one of the following values.<br /><br /> **0** = There are no restrictions on updates made at a Subscriber with a client subscription; all changes are uploaded to the Publisher.<br /><br /> **1** = Changes are allowed at a Subscriber with a client subscription, but they are not uploaded to the Publisher.<br /><br /> **2** = Changes are not allowed at a Subscriber with a client subscription.<br /><br /> For more information, see [Optimize Merge Replication Performance with Download-Only Articles](../../relational-databases/replication/merge/optimize-merge-replication-performance-with-download-only-articles.md).|
|**identityrangemanagementoption**|**int**|If automatic identity range handling is enabled; where **1** is enabled and **0** is disabled.|
|**delete_tracking**|**bit**|If deletes are replicated; where **1** means that deletes are replicated, and **0** means that they are not.|
|**compensate_for_errors**|**bit**|Indicates if compensating actions are taken when errors are encountered during synchronization; where **1** indicates that compensating actions are taken, and **0** means that compensating actions are not taken.|
|**partition_options**|**tinyint**|Defines the way in which data in the article is partitioned, which enables performance optimizations when all rows belong in only one partition or in only one subscription. *partition_options* can be one of the following values.<br /><br /> **0** = The filtering for the article either is static or does not yield a unique subset of data for each partition; that is, it is an "overlapping" partition.<br /><br /> **1** = The partitions are overlapping, and data manipulation language (DML) updates made at the Subscriber cannot change the partition to which a row belongs.<br /><br /> **2** = The filtering for the article yields non-overlapping partitions, but multiple Subscribers can receive the same partition.<br /><br /> **3** = The filtering for the article yields non-overlapping partitions that are unique for each subscription.|
|**artid**|**uniqueidentifier**|An identifier that uniquely identifies the article.|
|**pubid**|**uniqueidentifier**|An identifier that uniquely identifies the publication in which the article is published.|
|**stream_blob_columns**|**bit**|Indicates whether the data stream optimization is being used when replicating binary large object columns. **1** means that the optimization is being used, and **0** means that it is not.|

## Return Code Values

**0** (success) or **1** (failure)

## Remarks

**sp_helpmergearticle** is used in merge replication.

## Permissions

Only members of the **db_owner** fixed database role in the publication database, the **replmonitor** role in the distribution database, or the publication access list for a publication can execute **sp_helpmergearticle**.
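Several columns in the result set (for example **status** and **pre_creation_command**) are plain integer codes, so client code that consumes this result set often maps them to names. A small Python sketch, with the mappings copied from the tables above (the `describe` helper and its row shape are invented for illustration, not part of any SQL Server API):

```python
# Integer codes from sp_helpmergearticle's result set, per the tables above.
ARTICLE_STATUS = {
    1: "inactive",
    2: "active",
    5: "DDL operation pending",
    6: "DDL operation with a newly generated snapshot",
}

PRE_CREATION_COMMAND = {0: "none", 1: "drop", 2: "delete", 3: "truncate"}

def describe(row):
    """row: mapping of column name -> value, e.g. one fetched record."""
    return {
        "status": ARTICLE_STATUS.get(row["status"], "unknown"),
        "pre_creation_command": PRE_CREATION_COMMAND.get(
            row["pre_creation_command"], "unknown"
        ),
    }

print(describe({"status": 2, "pre_creation_command": 3}))
# {'status': 'active', 'pre_creation_command': 'truncate'}
```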
## Example

[!code-sql[HowTo#sp_helpmergearticle](../../relational-databases/replication/codesnippet/tsql/sp-helpmergearticle-tran_1.sql)]

## See Also

[View and Modify Article Properties](../../relational-databases/replication/publish/view-and-modify-article-properties.md)
[sp_addmergearticle &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/sp-addmergearticle-transact-sql.md)
[sp_changemergearticle &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/sp-changemergearticle-transact-sql.md)
[sp_dropmergearticle &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/sp-dropmergearticle-transact-sql.md)
[Replication Stored Procedures &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/replication-stored-procedures-transact-sql.md)
---
layout: relation
title: 'advmod'
shortdef : 'adverb modifier'
udver: '2'
---

The dependency type `advmod` is used for *adverb modifiers* of verbs, nominals, and adverbs alike.

<!-- fname:advmod_verb.pdf -->
~~~ sdparse
Hän käveli kotiin hitaasti . \n He walked home slowly .
nsubj(käveli-2, Hän-1)
nmod(käveli-2, kotiin-3)
advmod(käveli-2, hitaasti-4)
punct(käveli-2, .-5)
~~~

<!-- fname:advmod_noun.pdf -->
~~~ sdparse
Minä otin kaapista myös vasaran . \n I took from_closet also hammer .
nsubj(otin-2, Minä-1)
nmod(otin-2, kaapista-3)
obj(otin-2, vasaran-5)
advmod(vasaran-5, myös-4)
punct(otin-2, .-6)
~~~

Quantification modifiers are also annotated as *adverb modifiers* in UD Finnish (corresponding to `quantmod` in the original Stanford Dependencies and the Turku Dependency Treebank). Quantification modifiers are quantifiers that modify a numerical expression. Typically quantifiers are adverbs, but a few adjectives are also allowed as quantifiers.

<!-- fname:quantmod.pdf -->
~~~ sdparse
Alue oli suuruudeltaan noin kymmenen neliökilometriä . \n The_area was of_its_size about ten square_kilometres .
nsubj:cop(neliökilometriä-6, Alue-1)
cop(neliökilometriä-6, oli-2)
nmod(neliökilometriä-6, suuruudeltaan-3)
advmod(kymmenen-5, noin-4)
nummod(neliökilometriä-6, kymmenen-5)
punct(neliökilometriä-6, .-7)
~~~

<!-- Interlanguage links updated Pá kvě 14 11:08:47 CEST 2021 -->
---
layout: base
title: 'Statistics of acl:relcl in UD_Korean'
udver: '2'
---

## Treebank Statistics: UD_Korean: Relations: `acl:relcl`

This relation is a language-specific subtype of <tt>acl</tt>.

3204 nodes (4%) are attached to their parents as `acl:relcl`.

3201 instances of `acl:relcl` (100%) are right-to-left (child precedes parent).
Average distance between parent and child is 1.55493133583021.

The following 10 pairs of parts of speech are connected with `acl:relcl`: <tt><a href="ko-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ko-pos-VERB.html">VERB</a></tt> (2324; 73% instances), <tt><a href="ko-pos-ADV.html">ADV</a></tt>-<tt><a href="ko-pos-VERB.html">VERB</a></tt> (390; 12% instances), <tt><a href="ko-pos-VERB.html">VERB</a></tt>-<tt><a href="ko-pos-VERB.html">VERB</a></tt> (335; 10% instances), <tt><a href="ko-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ko-pos-NOUN.html">NOUN</a></tt> (112; 3% instances), <tt><a href="ko-pos-ADV.html">ADV</a></tt>-<tt><a href="ko-pos-NOUN.html">NOUN</a></tt> (25; 1% instances), <tt><a href="ko-pos-PRON.html">PRON</a></tt>-<tt><a href="ko-pos-VERB.html">VERB</a></tt> (8; 0% instances), <tt><a href="ko-pos-VERB.html">VERB</a></tt>-<tt><a href="ko-pos-NOUN.html">NOUN</a></tt> (4; 0% instances), <tt><a href="ko-pos-PRON.html">PRON</a></tt>-<tt><a href="ko-pos-NOUN.html">NOUN</a></tt> (3; 0% instances), <tt><a href="ko-pos-NUM.html">NUM</a></tt>-<tt><a href="ko-pos-VERB.html">VERB</a></tt> (2; 0% instances), <tt><a href="ko-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ko-pos-NUM.html">NUM</a></tt> (1; 0% instances).
~~~ conllu
# visual-style 4	bgColor:blue
# visual-style 4	fgColor:white
# visual-style 6	bgColor:blue
# visual-style 6	fgColor:white
# visual-style 6 4 acl:relcl	color:blue
1	특히	_	ADV	ADV	_	14	advmod	_	_
2	크리스탈의	_	NOUN	PNOUN	_	9	det:poss	_	_
3	화장기	_	NOUN	NOUN	_	4	nsubj	_	_
4	없는	_	VERB	PREDREL	_	6	acl:relcl	_	_
5	수수한	_	ADJ	ADJ	_	6	amod	_	_
6	얼굴빛과	_	NOUN	NOMCONJ	_	12	nsubj	_	_
7	하늘하늘거리는	_	ADJ	ADJ	_	9	amod	_	_
8	긴	_	ADJ	ADJ	_	9	amod	_	_
9	생머리는	_	NOUN	NOUN	_	6	conj	_	_
10	남성들의	_	NOUN	NOUN	_	11	det:poss	_	_
11	보호본능을	_	NOUN	NOUN	_	12	obj	_	_
12	자극해	_	VERB	PREDCONJ	_	14	advcl	_	_
13	시선이	_	NOUN	NOUN	_	14	nsubj:pass	_	_
14	집중됐다	_	VERB	VERB	_	0	root	_	SpaceAfter=No
15	.	.	PUNCT	.	_	14	punct	_	_
~~~

~~~ conllu
# visual-style 4	bgColor:blue
# visual-style 4	fgColor:white
# visual-style 5	bgColor:blue
# visual-style 5	fgColor:white
# visual-style 5 4 acl:relcl	color:blue
1	이	_	DET	DET	_	2	det	_	_
2	아파트는	_	NOUN	NOUN	_	6	nsubj	_	_
3	골목에	_	ADV	ADV	_	4	advmod	_	_
4	있는	_	VERB	PREDREL	_	5	acl:relcl	_	_
5	아파트치고	_	ADV	ADV	_	6	advmod	_	_
6	커서	_	VERB	PREDCONJ	_	8	advcl	_	_
7	찾기도	_	NOUN	NOUN	_	8	nsubj	_	_
8	쉽다	_	ADJ	ADJ	_	0	root	_	_
~~~

~~~ conllu
# visual-style 9	bgColor:blue
# visual-style 9	fgColor:white
# visual-style 10	bgColor:blue
# visual-style 10	fgColor:white
# visual-style 10 9 acl:relcl	color:blue
1	몇	_	DET	DET	_	2	det	_	_
2	년	_	NOUN	NOUN	_	4	nmod	_	_
3	전	_	ADP	ADP	_	2	case	_	_
4	고시	_	NOUN	NOUN	_	6	acl:relcl	_	_
5	공부하던	_	VERB	PREDREL	_	4	flat	_	_
6	시절에	_	ADV	ADV	_	9	advmod	_	_
7	선후배들과	_	ADV	ADV	_	9	advmod	_	_
8	자주	_	ADV	ADV	_	9	advmod	_	_
9	갔던	_	VERB	PREDREL	_	10	acl:relcl	_	_
10	곳인데	_	VERB	PREDCONJ	_	14	advcl	_	_
11	여전히	_	ADV	ADV	_	14	advmod	_	_
12	사장님의	_	NOUN	NOUN	_	13	det:poss	_	_
13	친절은	_	NOUN	NOUN	_	14	nsubj	_	_
14	최고더군요	_	VERB	NOMCOP	_	0	root	_	_
~~~
---
templateKey: blog-post
tags:
- kedro
- python
title: Kedro Static Viz 0.3.0 is out with Hooks Support
date: 2020-05-28T05:00:00.000+00:00
status: published
---

[kedro-static-viz](https://github.com/WaylonWalker/kedro-static-viz) is out with support for the newly released hooks feature. This means that you can have `kedro-static-viz` automatically deploy a full gatsby site `before_pipeline_run`, keeping your visualization always up to date. Even though it is a static site, there is no functionality lost. The only thing that's missing is the flask server.

With [kedro-static-viz](https://github.com/WaylonWalker/kedro-static-viz) you can deploy your visualization to a number of static hosting providers, such as GitHub Pages, free of charge and with wicked fast performance.

## ⚡ It's Fast

Even though it's built on gatsbyjs, the full site builds in under 2s, even on slower hardware. This is because the site is already pre-rendered and stripped of any excess. It's zipped up right into the python package and is typically used with the cli, but now can be used with python, or as a hook as well.

> ### What is [kedro-viz](https://github.com/quantumblacklabs/kedro-viz) 🤔

Kedro viz is a fantastic kedro plugin that allows you to visualize your data pipeline. Kedro allows you to quickly build production-ready pipelines where you just configure a catalog, then toss python functions into a big pile. Kedro figures out the order everything needs to run in for you, and allows you to run only a dataset's dependencies or dependents. [kedro-viz](https://github.com/quantumblacklabs/kedro-viz) gives you a great way to see this ordering visually.

![a visualization of a kedro data pipeline featuring data and functions flowing together.](https://images.waylonwalker.com/pipeline_visualisation-1.png "kedro visualization")

> kedro visualization from the project's readme

## Check out a live running example

Using the power of GitHub Actions, I have built a kedro iris pipeline visualization that can be found at [https://static-viz.kedro.dev/](https://static-viz.kedro.dev/).

## Itching to get started with kedro

You can be up and running in a matter of minutes if you already have python running on your machine. Make a virtual environment with your environment manager of choice.

``` bash
conda create -n kedro-practice python=3.8 -y
conda activate kedro-practice
```

Install kedro. Then create a new project with their awesome cli template built on cookiecutter. Make sure to answer `y` to get a prebuilt example pipeline with data.

``` bash
pip install kedro kedro-static-viz
kedro new
```

## Visualize your pipeline with the cli 〽

For local use when you already have the full project, `kedro viz` is a great tool to use, but this is an article about kedro-static-viz.

``` bash
kedro static-viz
```

Since we used `kedro-static-viz`, you will have a new directory called `public` that you can host on any static web hosting service, like GitHub Pages or Netlify.

## Ready to try out the new hooks feature 🙋‍♀️

Open up your `<project>/src/run.py` and add the hook to your `ProjectContext` class. Next time you run your pipeline you will have an updated visualization.

``` python
from kedro_static_viz.hooks import StaticViz

class ProjectContext(KedroContext):
    project_name = "kedro0160"
    project_version = "0.16.1"
    package_name = "kedro0160"
    hooks = [ StaticViz() ]
```

## Now Run that pipeline 🏃‍♀️

Run your pipeline and enjoy that fresh kedro viz each and every time you run your pipeline.

``` bash
kedro run
```

## Want to make your own hooks 🎣

Check out some of my other articles on building kedro hooks.

[![creating customizable kedro hooks](https://images.waylonwalker.com/configurable-kedro-hooks.png)](https://waylonwalker.com/kedro-class-hooks/)

[![creating the kedro preflight hook](https://images.waylonwalker.com/kedro-hooks.png)](https://waylonwalker.com/creating-the-kedro-preflight-hook/)

Check out the example 👉 [https://static-viz.kedro.dev/](https://static-viz.kedro.dev/)
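At its core, the hook mechanism used above is just a registry of objects whose specially named methods get called at fixed points in the run. A rough, framework-free sketch of that idea in plain Python (this is not Kedro's real implementation; `PrintViz` and `Runner` are invented for illustration):

```python
# Toy version of lifecycle hooks: the runner looks up a named method on each
# registered hook object and calls it at the matching point in the run.
# This mimics how a hook like StaticViz's before_pipeline_run gets invoked.

class PrintViz:
    """Hypothetical hook: 'publish' a visualization before the run."""
    def before_pipeline_run(self, run_params):
        return f"built viz for {run_params['pipeline_name']}"

class Runner:
    def __init__(self, hooks):
        self.hooks = hooks

    def _call_hooks(self, event, **kwargs):
        results = []
        for hook in self.hooks:
            method = getattr(hook, event, None)  # hooks opt in per event
            if callable(method):
                results.append(method(kwargs))
        return results

    def run(self, pipeline_name):
        before = self._call_hooks("before_pipeline_run", pipeline_name=pipeline_name)
        # ... the pipeline itself would run here ...
        return before

runner = Runner(hooks=[PrintViz()])
print(runner.run("__default__"))  # ['built viz for __default__']
```

Because hooks opt in per event simply by defining the method, a hook class only needs to implement the lifecycle points it cares about.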
# xorg.conf files for Dell Precision 3540

```
➜ ~ lsb_release -a && uname -rv
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.4 LTS
Release:        18.04
Codename:       bionic
5.3.0-53-generic #47~18.04.1-Ubuntu SMP Thu May 7 13:10:50 UTC 2020
```

```
➜ ~ inxi -Gxxxz
Graphics:
  Device-1: Intel vendor: Dell driver: i915 v: kernel bus ID: 00:02.0 chip ID: 8086:3ea0
  Device-2: Advanced Micro Devices [AMD/ATI] Lexa XT [Radeon PRO WX 3100] vendor: Dell driver: amdgpu v: kernel bus ID: 3b:00.0 chip ID: 1002:6985
  Display: x11 server: X.Org 1.19.6 compositor: gnome-shell v: 3.28.4 driver: amdgpu,ati,intel unloaded: fbdev,modesetting,vesa resolution: 1: 900x1600~60Hz 2: 900x1600~60Hz 3: 2560x1440~60Hz s-dpi: 96
  OpenGL: renderer: Mesa DRI Intel UHD Graphics 620 (WHL GT2) v: 4.6 Mesa 20.2.0-devel (git-4664612 2020-07-22 bionic-oibaf-ppa) compat-v: 3.0 direct render: Yes
```
44.090909
121
0.671134
kor_Hang
0.167419
4c3633abd1c2e2743598feed8963f7f9ec50fe57
674
md
Markdown
README.md
dobtco/dvl-color
dd4eb549e963e390f77435b960da5538ab8fcd10
[ "MIT" ]
null
null
null
README.md
dobtco/dvl-color
dd4eb549e963e390f77435b960da5538ab8fcd10
[ "MIT" ]
null
null
null
README.md
dobtco/dvl-color
dd4eb549e963e390f77435b960da5538ab8fcd10
[ "MIT" ]
null
null
null
![palat logo](https://dobt-captured.s3.amazonaws.com/ajb/palat_logo.png) [![RubyGem][gem]](http://rubygems.org/gems/palat) Generate beautiful, accessible color schemes from a single background color. Used in [Screendoor](https://www.dobt.co/screendoor/) to allow our users to customize their [public-facing forms](https://dobt.forms.fm). [View some examples &rarr;](https://dobtco.github.io/palat) ## Usage #### 1. Install the gem ```ruby gem 'palat' ``` #### 2. Generate a color scheme ```ruby generator = Palat::Generator.new('#fff') # white background generator.to_h # returns a hash of the generated color variables ``` [gem]: https://img.shields.io/gem/v/palat.svg ## License MIT
24.962963
214
0.71365
eng_Latn
0.344522
4c36ce381a3f87bc5bfdf782f95bc50e57895227
4,186
md
Markdown
README.md
xingjilin/brynet
a9fbd7338c4020cea8aa53df26e2708868957094
[ "MIT" ]
null
null
null
README.md
xingjilin/brynet
a9fbd7338c4020cea8aa53df26e2708868957094
[ "MIT" ]
null
null
null
README.md
xingjilin/brynet
a9fbd7338c4020cea8aa53df26e2708868957094
[ "MIT" ]
null
null
null
Brynet ======= A cross-platform, high-performance TCP network library written in C++11. Windows : [![Build status](https://ci.appveyor.com/api/projects/status/a2bxg5umbwwdb01k/branch/master?svg=true)](https://ci.appveyor.com/project/IronsDu/brynet/branch/master) Linux : [![Build Status](https://travis-ci.org/IronsDu/brynet.svg?branch=master)](https://travis-ci.org/IronsDu/brynet) ## Features * Cross platform (Linux | Windows) * High performance and safe to use * No dependencies * Multi-threaded * SSL support * Supports HTTP, HTTPS and WebSocket * IPv6 support ## Documentation - [简体中文](https://github.com/IronsDu/brynet/blob/master/docs/main.zh-cn.md) ## Compatibility * Visual C++ 2013+ on Windows (32/64-bit) * GCC 4.8+ on Linux (32/64-bit) * Mac OS X is not supported ## Build 1. `cmake .` 2. On Windows, open brynet.sln and build; on Linux, simply run `make`. ## Usages * [Examples](#examples) * [Users](#users) ## Benchmark Benchmarks were run over localhost on a CentOS 6.5 virtual machine (the host machine is Windows 10 with an i5 CPU) * PingPong In the PingPong benchmark, the server and the client each use only one thread, and the packet size is 4k ![PingPong](image/pingpong.png "PingPong") * Broadcast The server uses two network threads and one logic thread; each client uses a single thread for both network I/O and logic. Every packet is 46 bytes and contains a client id. The server broadcasts a packet to all clients whenever it receives one from any client, and a client sends a new packet whenever it receives a packet from the server whose id equals its own. ![Broadcast](image/broadcast.png "Broadcast") * ab HTTP (1 network thread) Document Path: / Document Length: 18 bytes Concurrency Level: 100 Time taken for tests: 5.871 seconds Complete requests: 100000 Failed requests: 0 Write errors: 0 Non-2xx responses: 100000 Total transferred: 5200000 bytes HTML transferred: 1800000 bytes Requests per second: 17031.62 [#/sec] (mean) Time per request: 5.871 [ms] (mean) Time per request: 0.059 [ms] (mean, across all concurrent requests) Transfer rate: 864.89 [Kbytes/sec] received Connection Times (ms) min mean[+/-sd] median max Connect: 0 2 0.7 2 8 Processing: 1 3 0.7 3 9 Waiting: 0 3 0.8 3 8 Total: 2 6 0.8 6 11 Percentage of the requests served within a certain time (ms) 50% 6 66% 6 75% 6 80% 6 90% 7 95% 7 98% 7 99% 8 100% 11 (longest request) Examples ---------------------------- * [PingPongServer](https://github.com/IronsDu/dodo/blob/master/examples/PingPongServer.cpp) * [PingPongClient](https://github.com/IronsDu/dodo/blob/master/examples/PingPongClient.cpp) * [BroadCastServer](https://github.com/IronsDu/dodo/blob/master/examples/BroadCastServer.cpp) * [BroadCastClient](https://github.com/IronsDu/dodo/blob/master/examples/BroadCastClient.cpp) * [SimpleHttpServer](https://github.com/IronsDu/dodo/blob/master/examples/TestHttp.cpp) shows how to start an http/ws service and make http requests * [BenchWebsocket](https://github.com/IronsDu/dodo/blob/master/examples/BenchWebsocket.cpp) a websocket client benchmark * [PromiseReceive](https://github.com/IronsDu/brynet/blob/master/examples/TestPromiseReceive.cpp) uses the promise-style API to receive an http response * [WebSocketProxy](https://github.com/IronsDu/dodo/blob/master/examples/WebBinaryProxy.cpp) a proxy server between a websocket client and a binary-protocol server * for more examples please see [examples](https://github.com/IronsDu/dodo/tree/master/examples) Users ---------------------------- * [Redis proxy](https://github.com/IronsDu/DBProxy) * [Distributed game server framework](https://github.com/IronsDu/DServerFramework) * [Joynet - Lua network library](https://github.com/IronsDu/Joynet) * [HTTP-RPC](https://github.com/IronsDu/http-rpc) * [grpc-gateway](https://github.com/IronsDu/grpc-gateway)
40.25
294
0.660535
eng_Latn
0.425778
4c36f660025abb82bfc753f2127516083dcb7d6b
4,470
markdown
Markdown
_posts/2017-05-16-newifi.markdown
toolazytoname/toolazytoname.github.io
82b4605ce562128806176a7497dfde32e35c28a2
[ "Apache-2.0" ]
1
2021-08-11T15:50:29.000Z
2021-08-11T15:50:29.000Z
_posts/2017-05-16-newifi.markdown
toolazytoname/toolazytoname.github.io
82b4605ce562128806176a7497dfde32e35c28a2
[ "Apache-2.0" ]
null
null
null
_posts/2017-05-16-newifi.markdown
toolazytoname/toolazytoname.github.io
82b4605ce562128806176a7497dfde32e35c28a2
[ "Apache-2.0" ]
1
2021-08-11T15:50:31.000Z
2021-08-11T15:50:31.000Z
--- layout: post title: "Lenovo newifi mini setup for bypassing the Great Firewall" date: 2017-05-16 22:29:32 +0800 categories: hack your life catalog: true tags: - hack your life --- **Contents** * [0 Preface](#preface) * [1 Overall approach](#brief) * [2 Process](#process) * [3 How I understand it works](#underTheHood) * [4 FAQ](#FAQ) * [5 References](#reference) # 0 Preface<a name="preface"></a> I had gotten this working before by blindly copying someone else's setup without really understanding it; later it stopped working for reasons I never figured out, and I kept putting off fixing it. Since I am interested in Linux anyway, I treated it as practice. This time I put together a simple script of my own, hoping to get as close to one-click installation as possible, and along the way I gained some understanding of the overall principles and workflow. As with any technical problem, the process was arduous, but once it was solved, the feeling was exhilarating. # 1 Overall approach<a name="brief"></a> shadowsocks + GFWlist # 2 Process<a name="process"></a> For the details, see the script and the [related configuration files]({{ site.url }}/assets/SS.zip)
~~~
#!/bin/bash
# A script that hopefully automates deployment as much as possible
opkg update

#ShadowSocks================================
opkg list_installed | grep shadowsocks
opkg remove shadowsocks-*
# Some say you must use the spec version: the non-spec shadowsocks cannot work together with luci-app-shadowsocks and will lack /etc/init.d/shadowsocks
# Download from https://sourceforge.net/projects/openwrt-dist/files/shadowsocks-libev/2.4.8-8816fa1/ramips/
# The version I installed is below. I had indeed installed many other versions before and ran into oddities with the /etc/init.d/shadowsocks file
# I also hit cases where luci-app-shadowsocks-spec_1.3.2-1_all.ipk was installed yet never showed up in the web UI, presumably a version mismatch
opkg install shadowsocks-libev_2.4.8-2_ramips_24kec.ipk
# Fill in the parameters here; they just have to match the server
cp shadowsocks.json /etc/shadowsocks.json
# Enables ss-redir, and sets up ipset and iptables here
cp shadowsocks /etc/init.d/shadowsocks
# Start shadowsocks automatically at boot
/etc/init.d/shadowsocks enable
# Start the shadowsocks service
/etc/init.d/shadowsocks start

#ipset======================================
opkg install ipset

#dnsmasq-full===============================
opkg list_installed | grep dnsmasq
opkg remove dnsmasq
opkg install dnsmasq-full
mkdir /etc/dnsmasq.d
# This file is important and should be filled in according to how the router resolves DNS; I chose a public DNS server.
# Originally this file contained 127.0.0.1#5353, which presumably hands DNS queries straight to ss-tunnel and on to your own SS server, which in turn forwards the request to the name server you configured (via Forwarding Tunnel). The benefit of that approach is that you can pick the DNS server closest to your ShadowSocks server,
# so the IP addresses it resolves are closest to your ShadowSocks server, subsequent requests made by the server are fastest, and you need not worry about DNS poisoning either.
cp dnsmasq_list.conf /etc/dnsmasq.d/
# Adds one line of configuration: conf-dir=/etc/dnsmasq.d
# Some people also set here:
# cache-size=1500    # change the dnsmasq cache size; the default is 150.
# min-cache-ttl=720  # change the minimum DNS cache TTL in seconds; only applies to aa65535's dnsmasq-full build.
cp dnsmasq.conf /etc/dnsmasq.conf
/etc/init.d/dnsmasq enable
/etc/init.d/dnsmasq start
~~~
# 3 How I understand it works<a name="underTheHood"></a> ![diagram of how I understand it works]({{ site.url }}/assets/newifiCross.svg) # 4 FAQ<a name="FAQ"></a> If it stops working, you can run through these quick checks 1. ping the server 2. use ps to check that the relevant services on the router are running 3. ipset -L 4. iptables -t nat --list 5. nslookup or dig 6. the nc command (haven't tried this yet) 7. dmesg to inspect the logs (haven't tried this yet) Things learned 1. Reportedly you must use the spec version of shadowsocks: the non-spec version cannot work with luci-app-shadowsocks and will lack /etc/init.d/shadowsocks. The version I installed is listed above. I did install many other versions before and ran into odd /etc/init.d/shadowsocks files, and also cases where luci-app-shadowsocks-spec_1.3.2-1_all.ipk was installed but never showed up in the UI, presumably a version issue. 2. Depending on the SSL library it links against, there are OpenSSL and PolarSSL builds: the OpenSSL build supports more ciphers but is larger, while the PolarSSL build is smaller and supports fewer ciphers. 3. ipset -L shows the IP sets that have been created. 4. ss-rule also invokes iptables to create forwarding rules in the nat table, which we can inspect with: iptables -t nat --list * I added iptables -t nat --list | grep 1080 myself 5. Using ShadowSocks to assist DNS resolution * If you want ShadowSocks to help with DNS resolution, tick UDP Forward (this step is optional; things work without it). The effect is to forward DNS queries through the ShadowSocks tunnel to the ShadowSocks server, which then forwards them to the name server you configured (via Forwarding Tunnel). The benefit is that you can pick the DNS server closest to your ShadowSocks server, so the IP addresses it resolves are closest to the server, the server's other requests are fastest, and you need not worry about DNS poisoning. 6. The nslookup and dig commands * nslookup www.baidu.com 192.168.1.1 (I tried this command; it also works on the local machine) * dig @localhost -p 3210 google.com 7. Inspecting with ps * On both the router and the server you can use the ps command to see the full arguments of the running ss-redir command. You can kill the shadowsocks processes that were started automatically during setup and then rerun the ss-redir and ss-server processes manually with those arguments. Do not add the -f flag, or they will run as daemons; add the -v flag when running ss-redir, and both the router and the server will print detailed request logs, so you can tell exactly which step is failing. 8. Client layout * client/ * └── usr/ * └── bin/ * ├── ss-local // provides a SOCKS proxy * ├── ss-redir // provides a transparent proxy; supports UDP since v2.2.0 * └── ss-tunnel // provides port forwarding, usable for DNS queries 9. Not sure whether the SS server supports UDP forwarding? A way to test it (haven't tried this yet): * /usr/bin/ss-tunnel -c /etc/shadowsocks.json -l 5353 -L 8.8.8.8:53 -u * nslookup www.youtube.com 127.0.0.1 10. I hear that adding the -v flag shows detailed output logs (haven't tried this yet) # 5 References<a name="reference"></a> 1. [Shadowsocks + GfwList: automatic wall-crossing for OpenWRT / LEDE routers](https://cokebar.info/archives/962) 2. [How to make a router bypass the Great Firewall](http://www.cloudchou.com/work/post-983.html) 3. [Why an OpenWrt router fails at wall-crossing or is unstable](https://softwaredownload.gitbooks.io/openwrt-fanqiang/ebook/03.7.html) 4. [After running Shadowsocks-libev for OpenWrt for a while, ss-redir often inexplicably hangs](https://github.com/shadowsocks/openwrt-shadowsocks/issues/106) 5. [OWSS (OpenWrt + ShadowSocks): the complete wall-crossing manual (Linux tools)](https://blog.lutty.me/code/openwrt/2014-10/owss-fq-guide-linux-tool.html)
29.215686
289
0.738926
yue_Hant
0.510523
4c37829c14921abbd47e78b3f3ecc16858c0f1cf
22,170
md
Markdown
report.md
isabelladegen/ComputationalLogic
699cf7548c1751ee03e466284c8d2d20beecd51f
[ "BSD-3-Clause" ]
null
null
null
report.md
isabelladegen/ComputationalLogic
699cf7548c1751ee03e466284c8d2d20beecd51f
[ "BSD-3-Clause" ]
null
null
null
report.md
isabelladegen/ComputationalLogic
699cf7548c1751ee03e466284c8d2d20beecd51f
[ "BSD-3-Clause" ]
null
null
null
# Coursework Report for Computational Logic for Artificial Intelligence ### Author [Isabella Degen](https://github.com/isabelladegen) This is the report for the 2021 Assignment Computational Logic for Artificial Intelligence COMSM0022. This report goes into detail about how I extended Prolexa. It's accompanied by the [Colab Demo Notebook](https://colab.research.google.com/github/isabelladegen/ComputationalLogic/blob/prolexa-plus/Demo_Notebook.ipynb) that demonstrates these changes and allows playing with this extended version of Prolexa. All code can be found on [Github](https://github.com/isabelladegen/ComputationalLogic) ### Contents 1. [Motivation](#motivation) 2. [Method](#method) 3. [Implementation](#implementation) 1. [Default rules](#defaultrules) 2. [Negated rules](#negatedrules) 3. [Adding new default rules and negated rules](#addingnewdefaultrulesandnegatedrules) 4. [Limitations](#limitations) # <a name="motivation">Motivation # My goal for this coursework was to get some experience with logical programming by extending Prolexa to be able to do default reasoning as well as deal with negations. In summary, the following new interactions are now possible: - Adding rules that have exceptions (default rules): ```'some birds fly'``` &rarr; 'I will remember that some birds fly' - Adding negated rules: ```'Penguins dont fly'``` &rarr; 'I will remember that penguins dont fly' - Reporting about default rules: ```'Tell me all'``` &rarr; 'Some birds fly' instead of 'Every bird flies' - Reporting about negated rules: ```'Tell me all'``` &rarr; 'Penguins dont fly' - Reporting default rules and negations in questions about proper nouns: ```'Tell me about Peep'``` &rarr; 'peep is a bird. peep flies' and ```'Tell me about opus'``` &rarr; 'opus is a bird. opus is a penguin. 
opus doesnt fly' - Queries about proper nouns: ```'Does peep fly'``` &rarr; 'Peep flies' and ```'Does opus fly'``` &rarr; 'Sorry, I don't think this is the case' - Explain default rules: ```'Explain why peep flies'``` &rarr; 'peep is a bird; some birds fly; therefore peep flies' and ```'Explain why opus doesnt fly'``` &rarr; 'opus is a penguin; penguins dont fly; therefore opus doesnt fly' # <a name="method">Method # I started off my work by forking the [assignment github repository](https://github.com/simply-logical/ComputationalLogic). This way I could make my changes, while continuing to pull from the original repository if needed or indeed push back to it. To learn what Prolexa can do and to avoid breaking existing functionality, I started off by making a list of commands that are working both on my computer and via Google Colab. I decided to keep track of these by writing a few high level 'test cases' in a new Colab Notebook. This notebook eventually evolved into the [Demo Notebook](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/Demo_Notebook.ipynb) for the coursework. The Notebook walks through what this version of Prolexa can do. Once I had a high level idea about how I can interact with Prolexa, I wanted to learn more about how each of these commands were handled in Prolog. I used the SWI Prolog GUI and its debugger to step through each of the main cases in the predicate ```handle_utterance(SessionId,Utterance,Answer)``` defined in [prolexa.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa.pl): - A. Utterance is a sentence - B. Utterance is a question that can be answered - C. Utterance is a command that succeeds - D. None of the above That way I learned how the input string is translated into a Prolog goal and how the outcome of such a goal was translated back into a 'string' answer. 
In both directions, this translation happens mainly in [prolexa_grammar.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa_grammar.pl), while the proving of the goal happens in [prolexa_engine.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa_engine.pl). Now I was ready to start changing these three files. I took a top-down approach, starting by defining what the user would ask Prolexa and what they would expect as a response. I could use my Colab notebook to specify tests for the new functionality in the same way: ```test('new query', 'expected answer')```. These tests were my 'Acceptance tests' for the new behaviour. For the code changes to Prolog, I continued to use the debugger to verify that the query was handled as expected. All my changes were done in small cycles of: add a failing test &rarr; change the code to fix it &rarr; run the tests to verify it was working &rarr; check in and start with the next new failing test. I started off by adding default rules and negated rules directly to Prolog and ensuring that the cases 'B. Utterance is a question that can be answered' and 'C. Utterance is a command that succeeds' worked. Then I made sure that default rules and negated rules could also be added via 'A. Utterance is a sentence'. I noticed early on a discrepancy between directly querying Prolexa via the command line and querying it via the Colab notebook, which was using Prolexa plus. Prolexa plus uses Python to extend the vocabulary on the fly. While this is useful for queries of type 'A. Utterance is a sentence', Prolexa plus also extends the grammar for queries of type 'B. Utterance is a question that can be answered' and 'C. Utterance is a command that succeeds'. This results in unexpected words being added to the vocabulary, which then result in strange answers. 
For example the query *'Explain why tweety tweets'* adds the following *pred* functors to the grammar: ``` pred(tweet, 1,[v/tweet,n/tweet]). pred(doe, 1,[v/doe]). pred(bird, 1,[n/bird]). pred(is, 1,[v/is]). pred(tweety, 1,[a/tweety,n/tweety]). pred(explain, 1,[v/explain]). ``` Note that, amongst other unexpected additions, *tweet* is added as a noun as well as a verb, which then results in the unexpected answer to the query *'Explain why tweety tweets'* &rarr; *tweety is a bird; every bird is **a tweet**; therefore tweety is **a tweet*** instead of what I expected: *tweety is a bird; every bird **tweets**; therefore tweety **tweets***. For this reason I decided to query Prolexa directly. The Colab Notebook I used to debug Prolexa Plus has more examples of different grammar additions and can be found here: [Testing_Prolexa_Plus_Notebook.ipynb](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/Testing_Prolexa_Plus_Notebook.ipynb) # <a name="implementation">Implementation # In this chapter I'm going to walk through the changes I've made in detail. To have some examples to work with, I directly extended the [prolexa_grammar.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa_grammar.pl) with the following proper nouns: ``` proper_noun(s,tweety) --> [tweety]. proper_noun(s,opus) --> [opus]. proper_noun(s,peep) --> [peep]. ``` And the vocabulary with the following words: ``` pred(bird, 1,[n/bird]). pred(penguin, 1,[n/penguin]). pred(fly, 1,[v/fly]). ``` ## Default rules <a name="defaultrules"> First I looked at how to handle default rules by directly adding them to the rule base. ### Acceptance Tests: I defined the following acceptance tests for the simplest case of handling default rules: - B. Utterance is a question that can be answered: ``` test('Does peep fly','peep flies') ``` - C. 
Utterance is a command that succeeds: ``` test('Explain why peep flies', 'peep is a bird; some birds fly; therefore peep flies') test('Tell me about peep', 'peep is a bird. peep flies') ``` ### Required changes: Utterances are handled in [prolexa.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa.pl). The question *'Does peep fly'* is translated into a Prolog query using the [prolexa_grammar.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa_grammar.pl). The query is then proved by the *prove_question/3* meta-interpreter in [prolexa_engine.pl](https://github.com/isabelladegen/ComputationalLogic/blob/prolexa-plus/prolexa/prolog/prolexa_engine.pl). The relevant lines of code are: ``` % prolex.pl - handle_utterance phrase(question(Query),UtteranceList), prove_question(Query,SessionId,Answer) -> true % prolexa_grammar.pl: question(Q) --> qword,question1(Q). question1(Q) --> [does],proper_noun(_,X),verb_phrase(_,X=>Q). ``` Commands are also translated into Prolog goals using the grammar before the goal itself is called. *'Explain why peep flies'* is translated into the goal `explain_question(Q,_,Answer)` and *'Tell me about peep'* is translated into the goal `all_answers(PN,Answer),Answer))`: ``` % prolex.pl - handle_utterance phrase(command(g(Goal,Answer)),UtteranceList), call(Goal) -> true % prolexa_grammar.pl: command(g(explain_question(Q,_,Answer),Answer)) --> [explain,why],sentence1([(Q:-true)]). command(g(all_answers(PN,Answer),Answer)) --> tellmeabout,proper_noun(s,PN). ``` ### New Rules I needed a new syntax for rules to be able to distinguish between rules with no exception and default rules and added the following rules ``` % prolexa.pl: stored_rule(1,[(default(fly(X):-bird(X)))]). stored_rule(1,[(bird(peep):-true)]). ``` The first rule is the new syntax for default rules, the second rule is using the existing syntax for rules without exceptions. 
### Handling Default Rules in language To be able to understand and explain default rules using 'some' instead of 'every' I added a new determiner to the grammar which allows the existing `sentence1` functor to now also match default rules and translate queries into a default rule, as well as default rules back into language: ``` % prolexa_grammar.pl: command(g(explain_question(Q,_,Answer),Answer)) --> [explain,why],sentence1([(Q:-true)]). sentence1(C) --> determiner(N,M1,M2,C),noun(N,M1),verb_phrase(N,M2). determiner(p,X=>B,X=>H,[(default(H:-B))]) --> [some]. ``` This made it possible to explain why a bird flies using *some* instead of *every*: ``` prolexa> "Explain why peep flies". *** utterance(Explain why peep flies) *** goal(explain_question(fly(peep),_49814,_49802)) *** answer(peep is a bird; some birds fly; therefore peep flies) peep is a bird; some birds fly; therefore peep flies ``` ### New meta-interpreter for Default Rules The new default rule syntax required adding a new meta interpreter that can handle default rules. The new **explain_rb/4** meta-interpreter is adapted from chapter [8.1 in Simply Logical](https://too.simply-logical.space/src/text/3_part_iii/8.1.html) to fit in with the existing Prolexa code. This new meta-interpreter is now called form `explain_question/3`: `explain_rb(Query,Rulebase,[],Proof)` instead of `prove_rb(Query,Rulebase,[],Proof)`. It is also called from `prove_question/` instead of `prove_rb(Query,Rulebase)` for questions. For this it needed a 2 argument version `explain_rb(Query,Rulebase)` that simply calls the arity 4 version with an empty list and an anonymous variable for collecting the proofs. Finally, `explain_rb/2` is also called from `prove_question/2` which is called from `all_answers/2` to deal with 'Tell me about...' commands. The new meta-interpreter `explain_rb/4` works as following: 1. It first attempts to prove the goal using the existing `prove_rb()` meta-interpreter 2. 
If that fails, it looks for a default rule `default(A:-B)` in the rule base using the existing `find_clause()` functor 3. If it finds such a default rule it attempts to explain the body B the same way using the existing `p(A,Rule)` to then generate the message from the proof 4. If that succeeds it makes sure that the head A is not a contradiction ``` % prolexa_engine.pl % meta-interpreter for rules and defaults from chapter 8.1 explain_rb(true,_Rulebase,P, P):-!. explain_rb((A,B),Rulebase,P0,P):-!, explain_rb(A,Rulebase,P0,P1), explain_rb(B,Rulebase,P1,P). explain_rb(A,Rulebase,P0,P):- prove_rb(A,Rulebase,P0,P). % explain by rules only explain_rb(A,Rulebase,P0,P):- find_clause(default(A:-B),Rule,Rulebase), explain_rb(B,Rulebase,[p(A,Rule)|P0],P), not contradiction(A,Rulebase,P). % A consistent with P % top-level version that ignores proof explain_rb(Q,RB):- explain_rb(Q,RB,[],_P). ``` The 'Tell me about peep' query also lists the default rules as long as they are not a contradiction. To do so `all_answers` which is used for 'Tell me about ...' queries also uses `explain_rb` instead of `prove_rb`. ``` prolexa> "Tell me all about peep". *** utterance(Tell me all about peep) *** goal(all_answers(peep,_48568)) *** answer(peep is a bird. peep flies) peep is a bird. peep flies ``` ## Negated rules <a name="negatedrules"> To properly test default rules I needed to implement negation in rules so I could create exceptions to rules. ### Acceptance Tests I wanted to be able to ask negated questions as well has having Prolog use negations in explanations and answers. I defined the following acceptance tests for negated rules: - B. Utterance is a question that can be answered: ``` test('Does opus fly', 'Sorry, I don\'t think this is the case') ``` - C. 
Utterance is a command that succeeds: ``` test('Explain why opus doesnt fly', 'opus is a penguin; penguins dont fly; therefore opus doesnt fly') test('Explain why opus flies', 'Sorry, I don\'t think this is the case') test('Tell me about opus', 'opus is a bird. opus is a penguin. opus doesnt fly') ``` ### Required Changes: Given that I was again dealing with questions and commands, changes were required in the same areas as for default rules. ### New Rules: For these new behaviours I directly added the following new rules about Opus, as well as a not operator: ``` % prolexa.pl: :-op(900,fy,not). not flies(X):-penguin(X) bird(X):-penguin(X) penguin(opus):-true ``` ### Handling negation in language I needed to extend the grammar to deal with 'not' in rules by extending the `verb_phrases` and adding a new `determiner`: ``` % prolexa_grammar.pl sentence1([(not(L):-true)]) --> proper_noun(N,X),verb_phrase(N,not(X=>L)). verb_phrase(s,not(M)) --> [isnt],property(s,M). verb_phrase(p,not(M)) --> [arent],property(p,M). verb_phrase(p,not(M)) --> [dont], iverb(p,M). verb_phrase(s,not(M)) --> [doesnt], iverb(p,M). %verb_phrases are in plural form for negations determiner(p,X=>B,not(X=>H),[(not(H):-B)]) --> []. ``` For the command *'Tell me all'* I wanted the rule `not fly(X):-penguin(X)` to be translated to *'Penguins dont fly'*. This is handled by `sentence1(C) --> determiner(N,M1,M2,C),noun(N,M1),verb_phrase(N,M2).` with `verb_phrase(p,not(M)) --> [dont], iverb(p,M).`. The variable M2 becomes `not(X=>fly(X))`, which then matches the `verb_phrase` for 'dont' sentences. However, questions about a proper noun such as *'Tell me all about opus'* match `sentence1([(L:-true)]) --> proper_noun(N,X),verb_phrase(N,X=>L).` with L being `not(fly(opus))`. Here the variable M becomes `opus=>not(fly(opus))`, which no longer matches the `not(M)` clauses for `verb_phrase`. 
To fix this I added `sentence1([(not(L):-true)]) --> proper_noun(N,X),verb_phrase(N,not(X=>L)).`, which now matches the 'not' without propagating it to the `verb_phrase`. ### Handling negated rules in the existing meta-interpreters Because I wanted Prolexa to include not rules in queries like *'Tell me about opus'* by listing the negated rules as well, I needed to make sure that rules negated in the rule base were matched as well. Previously, Prolexa ignored the `stored_rule(1,[(not fly(X):-penguin(X))]).` due to `all_answers(Opus, Answers)` collecting all the known predicates `pred(P, 1, _)` and then attempting to prove each of them. However, Prolexa was not attempting to prove `not` clauses. I decided to make this work by explaining both the 'clause' and the 'not clause' in `prove_question/2` using an else-if, which perhaps is a bit of a procedural approach but required very few code changes: ``` % prolexa_engine.pl prove_question(Query,Answer):- findall(R,prolexa:stored_rule(_SessionId,R),Rulebase), ( explain_rb(Query,Rulebase) -> transform(Query,Clauses), phrase(sentence(Clauses),AnswerAtomList), atomics_to_string(AnswerAtomList," ",Answer) ; explain_rb(not Query,Rulebase) -> transform(not Query,Clauses), phrase(sentence(Clauses),AnswerAtomList), atomics_to_string(AnswerAtomList," ",Answer) ; Answer = "" ). ``` The introduction of negated rules also required me to deal with contradictions. `fly(opus)` fails because I check for contradictions: only goals that do not contradict the rule base are collected, and `fly(opus)` does contradict it. Because `contradiction/3` gets called both in `explain_rb/4` and `prove_rb/4`, this would result in a circular call. To avoid this, and because I needed both, I introduced a simple `prove_s/4` version that does not check for contradictions. ``` % prolexa_engine.pl: % check contradiction against rules, use simple proof to avoid circular contradiction calls contradiction(not A,Rulebase,P):-!, prove_s(A,Rulebase,P,_P1). 
contradiction(A,Rulebase,P):- prove_s(not A,Rulebase,P,_P1). % prove_s is the same as prove_rb but does not check for contradictions in proofs prove_s(true,_Rulebase,P,P):-!. prove_s((A,B),Rulebase,P0,P):-!, find_clause((A:-C),Rule,Rulebase), conj_append(C,B,D), prove_s(D,Rulebase,[p((A,B),Rule)|P0],P). prove_s(A,Rulebase,P0,P):- find_clause((A:-B),Rule,Rulebase), prove_s(B,Rulebase,[p(A,Rule)|P0],P). ``` ## Adding new default rules and negated rules <a name="addingnewdefaultrulesandnegatedrules"> I wanted to make sure default rules could also be added directly via queries, as long as they used words and proper nouns already defined in the Prolexa grammar. Statements like *'All birds tweet'* should add a standard rule, whereas statements like *'Some birds tweet'* should add a default rule. ### Acceptance Tests The acceptance tests for this are listed below. Other than making sure that telling Prolexa the same thing multiple times does not duplicate rules, these worked out of the box with the changes I made up to now. - A. Utterance is a sentence: ``` test('Some birds sing', 'I will remember that Some birds sing') #adds a new default: stored_rule(1,[(default(sing(X):-bird(X)))]) test('Swans dont sing', 'I will remember that Swans dont sing') #adds a new rule: stored_rule(1,[(not sing(X):-swan(X))]) test('All swans are birds', 'I will remember that All swans are birds') #adds a new rule: stored_rule(1,[(bird(X):-swan(X))]) test('Tweety is a swan', 'I will remember that Tweety is a swan') #adds a new rule: stored_rule(1,[(swan(tweety):-true)]) #Ensure that adding the same rules twice does not duplicate them (idempotence) test('Some birds sing', 'I already knew that Some birds sing') test('Swans dont sing', 'I already knew that Swans dont sing') test('All swans are birds', 'I already knew that All swans are birds') test('Tweety is a swan', 'I already knew that Tweety is a swan') ``` - B. 
Utterance is a question that can be answered: ``` test('Does tweety sing', 'Sorry, I don\'t think this is the case') test('Does peep fly','peep flies') test('Is tweety a bird', 'tweety is a bird') test('Is opus a bird', 'opus is a bird') test('Is peep a bird', 'peep is a bird') ``` - C. Utterance is a command that succeeds: ``` test('Tell me about tweety', 'tweety is a bird. tweety is a swan. tweety flies. tweety doesnt sing') test('Tell me about peep', 'peep is a bird. peep flies. peep sings') test('Explain why peep sings', 'peep is a bird; some birds sing; therefore peep sings') test('Explain why tweety flies', 'tweety is a swan; every swan is a bird; some birds fly; therefore tweety flies') test('Explain why tweety doesnt sing', 'tweety is a swan; swans dont sing; therefore tweety doesnt sing') ``` ### Required Changes: New rules are added when the utterance is interpreted as a sentence. This first checks if a rule is already known which will avoid adding it more than once: ``` % prolexa.pl % A. Utterance is a sentence ( phrase(sentence(Rule),UtteranceList), write_debug(rule(Rule)), ( known_rule(Rule,SessionId) -> % A1. It follows from known rules atomic_list_concat(['I already knew that',Utterance],' ',Answer) ; otherwise -> % A2. It doesn't follow, so add to stored rules assertz(prolexa:stored_rule(SessionId,Rule)), atomic_list_concat(['I will remember that',Utterance],' ',Answer) ) ``` Therefore, I needed to extend the `known_rule/2` meta-interpreter to also check if a default rule already existed. I again used an `else if` to do so which is probably more of a procedural way of programming: ``` %prolexa-engine.pl known_rule([Rule],SessionId):- findall(R,prolexa:stored_rule(SessionId,R),Rulebase), ( try((numbervars(Rule,0,_), Rule=(H:-B), %try normal rules add_body_to_rulebase(B,Rulebase,RB2), explain_rb(H,RB2) )) -> true ; try((numbervars(Rule,0,_), Rule=default(H:-B), %try default rules add_body_to_rulebase(B,Rulebase,RB2), explain_rb(H,RB2) )) ). 
``` # <a name="limitations">Limitations # Limitations of my implementation that I'm aware of are: - Negated queries need to be written without using an apostrophe **'**. This means using doesnt and dont instead of doesn't and don't. I believe this could be fixed by properly escaping the **'** - New words cannot be added to the vocabulary - Questions can only be asked about proper nouns, e.g. *'Does tweety fly'*. Questions about nouns are not possible: *'Do birds fly'* - If there is a negated rule in the rulebase such as `not fly(X):-penguin(X)` and a rule `penguin(opus):-true` and we ask Prolexa *'Does opus fly'*, she answers with *'I don't think that's the case'*. This seems to be the correct answer, but it is just a side effect of Prolexa not matching the negated rule `not fly(X):-penguin(X)`. I wonder if it wouldn't be more consistent to answer with *'No opus doesn't fly'* and only answer with *'Sorry I don't think this is the case'* if no rule, negated or not, can be found.
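The default-reasoning behaviour described in this report can also be exercised straight from a Prolog top level, bypassing the natural-language layer. The following is a minimal sketch (not from the report itself): it assumes `explain_rb/4`, `find_clause/3` and the `:-op(900,fy,not)` declaration above are loaded, and that the rule base is passed in the same list-of-rule-lists shape that `findall/3` produces from `stored_rule/2`:

```
% Sketch: peep should fly by the default rule, while fly(opus) should be
% blocked because not fly(opus) is provable (the contradiction check).
?- Rulebase = [[(default(fly(X):-bird(X)))],
               [(not fly(X):-penguin(X))],
               [(bird(X):-penguin(X))],
               [(bird(peep):-true)],
               [(penguin(opus):-true)]],
   explain_rb(fly(peep), Rulebase, [], Proof),
   \+ explain_rb(fly(opus), Rulebase, [], _).
```

If the engine behaves as in the acceptance tests above, the compound query succeeds: `fly(peep)` is explained via the default, and `fly(opus)` is rejected.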
45.430328
144
0.740776
eng_Latn
0.994739
4c398e262b8387948a224c8316e57be31c540584
504
md
Markdown
.github/ISSUE_TEMPLATE/guidance-issue.md
doc-E-brown/botocore
6be520a4eeed731652d34a2bcb484fabc1c7c860
[ "Apache-2.0" ]
1,063
2015-01-13T13:35:09.000Z
2022-03-31T09:29:32.000Z
.github/ISSUE_TEMPLATE/guidance-issue.md
doc-E-brown/botocore
6be520a4eeed731652d34a2bcb484fabc1c7c860
[ "Apache-2.0" ]
2,064
2015-01-03T15:53:33.000Z
2022-03-31T23:12:08.000Z
.github/ISSUE_TEMPLATE/guidance-issue.md
doc-E-brown/botocore
6be520a4eeed731652d34a2bcb484fabc1c7c860
[ "Apache-2.0" ]
1,065
2015-01-16T15:58:42.000Z
2022-03-31T22:18:56.000Z
--- name: Guidance issue about: Create a report to help us improve title: '' labels: needs-triage, guidance assignees: '' --- Please fill out the sections below to help us address your issue. **What issue did you see?** **Steps to reproduce** If you have a runnable example, please include it as a snippet, or link to a repository/gist for larger code examples. **Debug logs** Include the full stack trace, which you can capture by adding ``` import botocore.session botocore.session.Session().set_debug_logger('') ``` to your code.
21
117
0.734127
eng_Latn
0.997772
4c3a5f909c8f438b51ed69d24c572c930807793c
2,583
md
Markdown
README.md
messengerbag/Clipboard
f2ce2cef960fa367db3c43e4fb84842502eeb664
[ "MIT" ]
3
2018-03-30T13:25:48.000Z
2021-02-17T11:15:47.000Z
README.md
messengerbag/Clipboard
f2ce2cef960fa367db3c43e4fb84842502eeb664
[ "MIT" ]
null
null
null
README.md
messengerbag/Clipboard
f2ce2cef960fa367db3c43e4fb84842502eeb664
[ "MIT" ]
null
null
null
# Clipboard Clipboard is a plugin for [Adventure Game Studio (AGS)](http://www.adventuregamestudio.co.uk/). It offers access to the Windows clipboard, to enable AGS developers to add cut/copy/paste functionality to their AGS game, allowing data to be transferred between the game and other applications. Currently, only plain text strings (ASCII/ANSI C) can be transferred to and from the clipboard. ## How To Use To use the Clipboard plugin in an AGS project, you have to first add the plugin to AGS and activate it within the project. You then simply call the methods provided by the plugin from AGS script when you need to access the clipboard. ### How to Activate Plugin Place AgsClipboard.dll in the Adventure Game Studio engine folder, e.g. `C:\Program Files (x86)\Adventure Game Studio 3.4.1` (while IDE is not running). Start the IDE and load the game project. Under the "Plugins" entry in the project tree should now appear "AGS Clipboard Plugin vX.Y (agsclipboard.dll)". Right-click and check "Use this plugin". You can now use the plugin. ### How to Use Plugin Features The plugin provides two methods: `bool Clipboard.CopyText(String copyString)`, which copies a String you provide onto the Windows clipboard (returning `true` if successful), and `String Clipboard.PasteText()`, which returns the String that is currently on the Windows clipboard (or `null` if none). You can use these functions in your game to implement cut, copy and paste functionality (see the demo game). The plugin also `#define`s two macros: `CLIPBOARD_PLUGIN` and `CLIPBOARD_PLUGIN_VERSION`, where `CLIPBOARD_PLUGIN_VERSION` is a `float` representation of the plugin version number, with two decimal digits for the minor version (so that e.g. v1.2 is represented as 1.02). That means that you can use `#ifdef`–`#endif` and `#ifndef`–`#endif` blocks in your script, so it only tries to use the clipboard if the plugin is present. ## License and Credits This code is offered under multiple licenses. 
Choose whichever one you like. You may use it under the MIT license: - https://opensource.org/licenses/MIT You may also use it under the Creative Commons Attribution 4.0 International License: - https://creativecommons.org/licenses/by/4.0/ You may also use it under the Artistic License 2.0: - https://opensource.org/licenses/Artistic-2.0 In game credits, the preferred style is "Clipboard plugin by Snarky" or "Clipboard plugin by Gunnar Harboe", but you can adapt it as you like to the overall style of credit (or leave it out: no in-game credit is required, though it is appreciated).
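The float encoding used by `CLIPBOARD_PLUGIN_VERSION` (two decimal digits for the minor version, so v1.2 becomes 1.02) is easy to get wrong with floating point. A small Python sketch of one way to pack and unpack it — the function names are mine, not part of the plugin:

```python
# Pack a (major, minor) plugin version into the float scheme described
# above: two decimal digits reserved for the minor version, so that
# e.g. version 1.2 packs to 1.02 and version 2.15 packs to 2.15.

def pack_version(major, minor):
    return major + minor / 100.0

def unpack_version(value):
    major = int(value)
    minor = round((value - major) * 100)  # round away float noise
    return major, minor
```

Comparing the packed float for equality directly is fragile; round-tripping through `unpack_version` avoids the usual floating-point pitfalls.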
92.25
427
0.776229
eng_Latn
0.995341
4c3a650d102b2840aea29fbb6d5ca99974167d40
4,330
md
Markdown
ressources/region-occitanie.md
DavidBruant/urbanvitaliz
ddf1b27555243a4cbe8a7cd403dbc16dee43bc15
[ "MIT" ]
null
null
null
ressources/region-occitanie.md
DavidBruant/urbanvitaliz
ddf1b27555243a4cbe8a7cd403dbc16dee43bc15
[ "MIT" ]
45
2020-10-21T16:03:33.000Z
2021-09-28T05:35:58.000Z
ressources/region-occitanie.md
DavidBruant/urbanvitaliz
ddf1b27555243a4cbe8a7cd403dbc16dee43bc15
[ "MIT" ]
4
2020-10-07T09:20:34.000Z
2021-03-21T20:44:27.000Z
--- layout: ressources nom: Conseil régional Occitanie # Bien garder les guillemets, sinon le numéro affiché ne sera pas le bon numero_telephone: zone_intervention: - Occitanie interventions: - description: Financement d'études préalables aux travaux pour les lauréats de l'Appel à projets Reconquête des friches en Occitanie detail: "Dépenses éligibles : dépenses d’études d’anticipation, en amont de la fermeture programmée d’un site ; dépenses d’études préalables : diagnostic pollution, études de sols, diagnostics bâtimentaires, études de potentialité, études de programmation urbaine, études de faisabilité, études de marché Bénéficiaires : Publics : EPCI (ou commune sous condition d’un appui significatif de l’EPCI), syndicat mixte, SEM ou SPL agissant pour le compte d’une collectivité éligible dans le cadre d’un mandat ou d’une concession, Etablissements Publics Fonciers. Les projets situés sur les métropoles sont exclus sauf pour les opérations inscrites au CPER ou dans les programmes opérationnels du contrat territorial Occitanie. Privés : analyse au cas par cas dans le cadre des objectifs d’Occitanie 2040. Montant de l'aide : 35% maximum du montant éligible, avec un plafond de subvention fixé à 50.000€" service: Direction de l’Aménagement, du Foncier et de l’Urbanisme. Pôle Aménagement, Mer, Changement Climatique conditions: Seuls les lauréats de l'appel à projet de la Région Occitanie pourront bénéficier de financements. La Région soutiendra au maximum 1 projet par EPCI et par an, quelle que soit la nature de la maîtrise d’ouvrage et de la dépense.
phase: Etapes 2 à 4 dossier: Voir https://www.laregion.fr/friches-occitanie pour le règlement complet et le dossier de candidature contact: Etienne Florentin - etienne.florentin@laregion.fr - description: Financement de travaux et aménagements pour les lauréats de l'Appel à projets Reconquête des friches en Occitanie detail: "Dépenses éligibles : dépenses de maîtrise d’œuvre, dépenses de travaux liées à la déconstruction, dépollution, mise en sécurité, remise en état des sols, dépenses d’aménagement sur le site, sous condition d’une exemplarité en termes de nouveau modèle de développement et/ou de rééquilibrage territorial, dépenses de préservation ou de reconstitution de continuités écologiques et de requalification paysagère. Bénéficiaires : Publics : EPCI (ou commune sous condition d’un appui significatif de l’EPCI), syndicat mixte, SEM ou SPL agissant pour le compte d’une collectivité éligible dans le cadre d’un mandat ou d’une concession, Etablissements Publics Fonciers. Les projets situés sur les métropoles sont exclus sauf pour les opérations inscrites au CPER ou dans les programmes opérationnels du contrat territorial Occitanie. Privés : analyse au cas par cas dans le cadre des objectifs d’Occitanie 2040. Montant de l'aide : Maîtrise d’ouvrage publique ou parapublique : à parité avec le maître d’ouvrage à concurrence de 35% maximum du montant éligible, avec un plafond de subvention fixé à 500.000€ quel que soit le nombre de tranches. Dans le cas d’un portage communal, la parité est examinée au sens du «bloc communal». Maîtrise d’ouvrage privée : à parité avec le maître d’ouvrage à concurrence de 20% maximum du montant éligible, avec un plafond de subvention fixé à 350.000€ quel que soit le nombre de tranches. Dans le cas de projets de grande envergure justifiant des interventions plus conséquentes, les taux et montants d’intervention de la Région pourront à titre dérogatoire être déplafonnés, en respectant la règle de parité avec le maître d’ouvrage."
service: Direction de l’Aménagement, du Foncier et de l’Urbanisme. Pôle Aménagement, Mer, Changement Climatique conditions: Seuls les lauréats de l'appel à projet de la Région Occitanie pourront bénéficier de financements. La Région soutiendra au maximum 1 projet par EPCI et par an, quelle que soit la nature de la maîtrise d’ouvrage et de la dépense. phase: Etapes 1, 4 et 5 dossier: Voir https://www.laregion.fr/friches-occitanie pour le règlement complet et le dossier de candidature contact: Etienne Florentin - etienne.florentin@laregion.fr commentaire_structure: adresse_url: https://www.laregion.fr/friches-occitanie ---
135.3125
780
0.786143
fra_Latn
0.969808
4c3c448abed20b6343c801e3fd54c054f8db4aa4
8,763
md
Markdown
docset/windows/networkcontroller/New-NetworkControllerServer.md
skyguy94/windows-powershell-docs
d891dd308ed689225008966aeb41b3c75176fec2
[ "CC-BY-4.0", "MIT" ]
null
null
null
docset/windows/networkcontroller/New-NetworkControllerServer.md
skyguy94/windows-powershell-docs
d891dd308ed689225008966aeb41b3c75176fec2
[ "CC-BY-4.0", "MIT" ]
null
null
null
docset/windows/networkcontroller/New-NetworkControllerServer.md
skyguy94/windows-powershell-docs
d891dd308ed689225008966aeb41b3c75176fec2
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- ms.mktglfcycl: manage ms.sitesec: library ms.author: v-kaunu author: Kateyanne description: Use this topic to help manage Windows and Windows Server technologies with Windows PowerShell. external help file: Microsoft.NetworkController.Powershell.dll-help.xml keywords: powershell, cmdlet manager: jasgro ms.date: 12/20/2016 ms.prod: w10 ms.technology: ms.topic: reference online version: schema: 2.0.0 title: New-NetworkControllerServer ms.reviewer: ms.assetid: A8CC59FD-D792-4379-B1F6-0AE4D353ACA7 --- # New-NetworkControllerServer ## SYNOPSIS Creates or updates a server resource in the Network Controller ## SYNTAX ``` New-NetworkControllerServer [-ResourceId] <String> [[-Tags] <PSObject>] [-Properties] <ServerProperties> [[-Etag] <String>] [[-ResourceMetadata] <ResourceMetadata>] [-Force] -ConnectionUri <Uri> [-CertificateThumbprint <String>] [-Credential <PSCredential>] [-PassInnerException] [-WhatIf] [-Confirm] [<CommonParameters>] ``` ## DESCRIPTION The **New-NetworkControllerServer** cmdlet creates a physical server resource in the Network Controller, or updates an existing one. After you add the server resource, Network Controller manages that server.
## EXAMPLES ### Example 1: Add a server ``` PS C:\> $CredentialProperties = [Microsoft.Windows.NetworkController.CredentialProperties]@{Type="UsernamePassword";UserName="admin";Value="password"} PS C:\> New-NetworkControllerCredential -ResourceId "Credential01" -ConnectionUri "https://restserver" -Properties $CredentialProperties PS C:\> $Credential = Get-NetworkControllerCredential -ResourceId "Credential01" -ConnectionUri "https://restserver" PS C:\> $ServerProperties = New-Object Microsoft.Windows.NetworkController.ServerProperties PS C:\> $ServerProperties.Connections = @([Microsoft.Windows.NetworkController.Connection]@{ManagementAddresses=@("192.168.0.12");Credential=$Credential}) PS C:\> $ServerProperties.RackSlot = "1" PS C:\> $ServerProperties.OS = "Windows Server 2016" PS C:\> $ServerProperties.Vendor = "Dell" PS C:\> $ServerProperties.Model = "PowerEdge R730" PS C:\> New-NetworkControllerServer -ConnectionUri "https://networkcontroller" -ResourceId "Server01" -Properties $ServerProperties ``` The first command creates a **CredentialProperties** object, and then stores it in the $CredentialProperties variable. The second command creates a credential that has the properties in $CredentialProperties by using the **New-NetworkControllerCredential** cmdlet. The third command gets the credential by using the **Get-NetworkControllerCredential** cmdlet, and then stores it in the $Credential variable. The fourth command creates a **ServerProperties** object by using the **New-Object** cmdlet. The command stores the object in the $ServerProperties variable. The next five commands assign values to properties of $ServerProperties. The final command adds a server to the Network Controller that has the resource ID Server01. The command identifies the Network Controller by URI. The command specifies the properties of the server by using $ServerProperties. 
## PARAMETERS ### -CertificateThumbprint Specifies the certificate thumbprint of a digital public key X.509 certificate of a user account that has permission to perform this action. In order for Network Controller to authorize the account, specify this thumbprint by using the *ClientCertificateThumbprint* parameter of the **Install-NetworkController** or **Set-NetworkController** cmdlet. ```yaml Type: String Parameter Sets: (All) Aliases: Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Confirm Prompts you for confirmation before running the cmdlet. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: cf Required: False Position: Named Default value: False Accept pipeline input: False Accept wildcard characters: False ``` ### -ConnectionUri Specifies the Uniform Resource Identifier (URI) of the Network Controller. All Representational State Transfer (REST) clients use this URI to connect to Network Controller. ```yaml Type: Uri Parameter Sets: (All) Aliases: Required: True Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Credential Specifies a user credential that has permission to perform this action. The default value is the current user. This user must be present in the security group specified in the *ClientSecurityGroup* parameter in **Install-NetworkController**. ```yaml Type: PSCredential Parameter Sets: (All) Aliases: Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Etag Specifies the entity tag (ETag) parameter of the resource. An ETag is an HTTP response header returned by an HTTP-compliant web server. An ETag is used to determine change in the content of a resource. The value of the header is an opaque string that represents the state of the resource when the response was generated. 
```yaml Type: String Parameter Sets: (All) Aliases: Required: False Position: 5 Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Force Forces the command to run without asking for user confirmation. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: Required: False Position: 7 Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -PassInnerException ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Properties Specifies the properties of a server that this cmdlet creates or updates. You can specify the following properties: - Connections that specifies the information that is required to connect to the server for the purposes of managing it. Each connection has a management address and a credential reference to connect to the server. - Model number. - Array of physical network interfaces on the server. - Operating system that runs on the server. - Slot in the rack in which the server is connected. - Serial number. - Server vendor name. ```yaml Type: ServerProperties Parameter Sets: (All) Aliases: Required: True Position: 3 Default value: None Accept pipeline input: True (ByPropertyName) Accept wildcard characters: False ``` ### -ResourceId Specifies the resource ID of the server that this cmdlet creates or modifies. ```yaml Type: String Parameter Sets: (All) Aliases: Required: True Position: 0 Default value: None Accept pipeline input: True (ByPropertyName) Accept wildcard characters: False ``` ### -ResourceMetadata Specifies the resource ID of the server interface that this cmdlet adds or updates. 
```yaml Type: ResourceMetadata Parameter Sets: (All) Aliases: Required: False Position: 6 Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Tags ```yaml Type: PSObject Parameter Sets: (All) Aliases: Required: False Position: 1 Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -WhatIf Shows what would happen if the cmdlet runs. The cmdlet is not run. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: wi Required: False Position: Named Default value: False Accept pipeline input: False Accept wildcard characters: False ``` ### CommonParameters This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). ## INPUTS ### ServerProperties You can pipe an object to this cmdlet that contains the following properties: - Connections that specifies the information that is required to connect to the server for the purposes of managing it. Each connection has a management address and a credential reference to connect to the server. - Model number. - Array of physical network interfaces on the server. - Operating system that runs on the server. - Slot in the rack in which the server is connected. - Serial number. - Server vendor name. ## OUTPUTS ## NOTES ## RELATED LINKS [Get-NetworkControllerCredential](./Get-NetworkControllerCredential.md) [Get-NetworkControllerServer](./Get-NetworkControllerServer.md) [Install-NetworkController](./Install-NetworkController.md) [New-NetworkControllerCredential](./New-NetworkControllerCredential.md) [Remove-NetworkControllerServer](./Remove-NetworkControllerServer.md) [Set-NetworkController](./Set-NetworkController.md)
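The *Etag* parameter above follows the usual optimistic-concurrency pattern: an update is applied only if the caller's ETag still matches the one the store holds, since the opaque value changes on every write. A minimal Python sketch of that check — a generic model, not Network Controller code, with class and method names of my own:

```python
# Minimal ETag-style optimistic concurrency: a conditional update
# succeeds only when the caller presents the ETag currently stored
# for the resource; otherwise the resource changed in between and
# the update is rejected.
import uuid

class ResourceStore:
    def __init__(self):
        self._data = {}  # resource_id -> (etag, properties)

    def put(self, resource_id, properties, etag=None):
        current = self._data.get(resource_id)
        if current is not None and etag is not None and etag != current[0]:
            raise ValueError("ETag mismatch: resource was modified")
        new_etag = uuid.uuid4().hex  # opaque string, fresh on every write
        self._data[resource_id] = (new_etag, properties)
        return new_etag

store = ResourceStore()
tag1 = store.put("Server01", {"RackSlot": "1"})
tag2 = store.put("Server01", {"RackSlot": "2"}, etag=tag1)  # tag matches: accepted
```

A third `put` with the stale `tag1` would raise, which is the behaviour a mismatched *Etag* models in the REST API.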
28.731148
315
0.782495
eng_Latn
0.871066
4c3c586f4d9f21e63e1986fb415e0cb03efa5efe
2,222
md
Markdown
maintainer-workflow.md
jk-ozlabs/docs
04f81209979bd7dbe567078f54256615f8f79c72
[ "CC-BY-4.0" ]
null
null
null
maintainer-workflow.md
jk-ozlabs/docs
04f81209979bd7dbe567078f54256615f8f79c72
[ "CC-BY-4.0" ]
null
null
null
maintainer-workflow.md
jk-ozlabs/docs
04f81209979bd7dbe567078f54256615f8f79c72
[ "CC-BY-4.0" ]
null
null
null
# OpenBMC Maintainer/CLA Workflow OpenBMC contributors are required to execute an OpenBMC CLA (Contributor License Agreement) before their contributions can be accepted. This page is a checklist for sub-project maintainers to follow before approving patches. * Manually verify the contributor has signed the ICLA (individual) or is listed on an existing CCLA (corporate). * Executed CLAs can be found [in the CLA repository](https://drive.google.com/drive/folders/1Ooi0RdTcaOWF1DWFJUAJDdN7tRKde7Nl). * If you were not added to the appropriate CLA repository ACL, send an email to openbmc@lists.ozlabs.org with a request to be added. * If a CLA for the contributor is found, accept the patch(1). * If a CLA is not found, request that the contributor execute one and send it to openbmc@lists.ozlabs.org. * Do not accept the patch(1) until a signed CLA (individual _or_ corporate) has been uploaded to the CLA repository. * The CCLA form can be found [here](https://github.com/openbmc/openbmc/files/1860741/OpenBMC.CCLA.pdf). * The ICLA form can be found [here](https://github.com/openbmc/openbmc/files/1860742/OpenBMC.ICLA.pdf). An executed OpenBMC CLA is _not_ required to accept contributions to OpenBMC forks of upstream projects, like the Linux kernel or U-Boot. Review the maintainers' responsibilities in the [contributing guidelines](./CONTRIBUTING.md). Maintainers are ultimately responsible for sorting out open source license issues, issues with using code copied from the web, and maintaining the quality of the code.
Repository maintainers ought to have the following traits as recognized by a consensus of their peers: - responsible: have a continuing desire to ensure only high-quality code goes into the repo - leadership: foster open-source aware practices such as [FOSS](https://en.wikipedia.org/wiki/Free_and_open-source_software) - expertise: typically demonstrated by significant contributions to the code or code reviews (1) The semantics of accepting a patch depend on the sub-project contribution process. * Github pull requests - Merging the pull request. * Gerrit - +2. * email - Merging the patch. Ensure that accepted changes actually merge into OpenBMC repositories.
47.276596
125
0.790279
eng_Latn
0.995243
4c3d15129d1c80fed2bf1fedc5ce9e4778a6efc4
4,327
md
Markdown
README.md
lonord/react-marked-editor
1afaa599f16e4770ee89d221be76f0812620d33a
[ "MIT" ]
3
2017-05-06T20:02:04.000Z
2017-06-19T23:48:59.000Z
README.md
lonord/react-marked-editor
1afaa599f16e4770ee89d221be76f0812620d33a
[ "MIT" ]
null
null
null
README.md
lonord/react-marked-editor
1afaa599f16e4770ee89d221be76f0812620d33a
[ "MIT" ]
null
null
null
# react-marked-editor A markdown editor written in React [![version](https://img.shields.io/npm/v/react-marked-editor.svg?style=flat)](https://www.npmjs.com/package/react-marked-editor) The editor is powered by [CodeMirror](http://codemirror.net), and the markdown transcoder is powered by [marked](https://github.com/chjj/marked). ## Installation ```bash $ npm install react-marked-editor ``` ### Usage First, add `styled-jsx/babel` to `plugins` in your babel configuration: ```json { "plugins": [ "styled-jsx/babel" ] } ``` Next, add font-awesome less/css files and font files to your project, and add some loaders to your webpack configuration: ```js { test: /\.less$/, loader: 'style-loader!css-loader!less-loader' }, { test: /\.(woff|woff2|svg|eot|ttf)\??.*$/, loader: 'file-loader?name=[name].[ext]' } ``` Add the font-awesome less/css file import to your entry code: ```js import './path/to/font-awesome.(less|css)'; ``` Finally, use the component in your code: ```js import ReactMarkedEditor from 'react-marked-editor'; //... render() { return ( <div> <ReactMarkedEditor initialMarkdown={md}/> </div> ); } //show readonly markdown view import { ReactMarkedView } from 'react-marked-editor'; //... 
render() { return ( <div> <ReactMarkedView markdown={md}/> </div> ); } // get the CodeMirror instance <ReactMarkedEditor ref={editor => this.editor = editor} {...otherProps}/> // somewhere else const codeMirror = this.editor.codeDoc; ``` ### API Doc **ReactMarkedEditor** | props | type | detail | |---|---|---| | initialMarkdown | string | the initial markdown string to show | | onChange | function | editor content change callback, args -> (newValue) | | markdownClassName | string | `className` passed to the `ReactMarkedView` inside `ReactMarkedEditor` | | markdownStyle | object | styles object passed to the `ReactMarkedView` inside `ReactMarkedEditor` | | editorHeight | number | height of the editor (excluding the toolbar) | | hideToolbar | boolean | do not show the toolbar | | style | object | set styles on the root element of `ReactMarkedEditor` | | className | string | set `className` on the root element of `ReactMarkedEditor` | | openLinkInBlank | boolean | whether to open links in a blank window/tab; passed to `ReactMarkedView` | | markedOptions | object | options passed to `marked` | | toolbarCustomButtons | array | custom buttons added to the toolbar; properties of each child are listed below ⬇︎ | | ↳ title | string | the `title` property of the button element | | ↳ icon | string | the class name of the Font Awesome icon | | ↳ onClick | function | click callback, args -> (codeMirror, event) | **ReactMarkedView** | props | type | detail | |---|---|---| | markdown | string | the initial markdown string to transcode | | markdownClass | string | the `className` passed to the root element to override the default style | | markedOptions | object | options passed to `marked` | | style | object | set styles on the root element of `ReactMarkedView` | | className | string | set `className` on the root element of `ReactMarkedView` | | openLinkInBlank | boolean | whether to open links in a blank window/tab | ### Demo Clone the repo ```bash $ git clone https://github.com/lonord/react-marked-editor.git ``` Install dependencies ```bash $ npm i ``` And run ```bash $ npm start ``` ## License MIT
33.542636
144
0.528773
eng_Latn
0.795618
4c3d536ffe929e13abc6d62ce08afc1eea3127c8
13,465
md
Markdown
articles/azure-monitor/platform/alerts-metric-near-real-time.md
changeworld/azure-docs.pt-pt
8a75db5eb6af88cd49f1c39099ef64ad27e8180d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-monitor/platform/alerts-metric-near-real-time.md
changeworld/azure-docs.pt-pt
8a75db5eb6af88cd49f1c39099ef64ad27e8180d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-monitor/platform/alerts-metric-near-real-time.md
changeworld/azure-docs.pt-pt
8a75db5eb6af88cd49f1c39099ef64ad27e8180d
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Recursos suportados para alertas métricos no Monitor Do Azure description: Referência sobre métricas de suporte e registos para alertas métricos no Monitor Azure author: harelbr ms.author: harelbr services: monitoring ms.topic: conceptual ms.date: 3/5/2020 ms.subservice: alerts ms.openlocfilehash: c036fa3708d718d6199d27989e60b11015a1227e ms.sourcegitcommit: 3c318f6c2a46e0d062a725d88cc8eb2d3fa2f96a ms.translationtype: MT ms.contentlocale: pt-PT ms.lasthandoff: 04/02/2020 ms.locfileid: "80585859" --- # <a name="supported-resources-for-metric-alerts-in-azure-monitor"></a>Recursos suportados para alertas métricos no Monitor Do Azure O Azure Monitor suporta agora um [novo tipo de alerta métrico](../../azure-monitor/platform/alerts-overview.md) que tem benefícios significativos em relação aos [alertas métricos clássicos mais antigos.](../../azure-monitor/platform/alerts-classic.overview.md) As métricas estão disponíveis para [uma grande lista de serviços Azure.](../../azure-monitor/platform/metrics-supported.md) Os alertas mais recentes suportam um subconjunto (crescente) dos tipos de recursos. Este artigo lista o subconjunto. Também pode utilizar novos alertas métricos em dados de registo populares armazenados num espaço de trabalho de Log Analytics extraído como métricas. Para mais informações, consulte [alertas métricos para registos](../../azure-monitor/platform/alerts-metric-logs.md). ## <a name="portal-powershell-cli-rest-support"></a>Portal, PowerShell, CLI, suporte REST Atualmente, pode criar novos alertas métricos apenas no portal Azure, [REST API,](https://docs.microsoft.com/rest/api/monitor/metricalerts/)ou [Modelos](../../azure-monitor/platform/alerts-metric-create-templates.md)de Gestor de Recursos . O suporte para configurar novos alertas utilizando as versões PowerShell e Azure CLI 2.0 e superior está para breve. 
## <a name="metrics-and-dimensions-supported"></a>Métricas e Dimensões Suportadas Alertas métricos mais recentes suportam alertas para métricas que usam dimensões. Pode utilizar dimensões para filtrar a sua métrica para o nível certo. Todas as métricas suportadas, juntamente com as dimensões aplicáveis, podem ser exploradas e visualizadas a partir do [Azure Monitor - Metrics Explorer](../../azure-monitor/platform/metrics-charts.md). Aqui está a lista completa de fontes métricas de monitor estoque azure suportadas pelos novos alertas: |Tipo de recurso |Dimensões Suportadas |Alertas de vários recursos| Métricas Disponíveis| |---------|---------|-----|----------| |Microsoft.ApiManagement/service | Sim| Não | [Gestão de API](../../azure-monitor/platform/metrics-supported.md#microsoftapimanagementservice)| |Microsoft.AppPlatform/Spring |Não| Sim| |Microsoft.Automation/automationAccounts | Sim| Não | [Contas de Automatização](../../azure-monitor/platform/metrics-supported.md#microsoftautomationautomationaccounts)| |Microsoft.Batch/batchAccounts | N/D| Não | [Contas do Batch](../../azure-monitor/platform/metrics-supported.md#microsoftbatchbatchaccounts)| |Microsoft.Cache/Redis|Sim| Não |[Cache do Azure para Redis](../../azure-monitor/platform/metrics-supported.md#microsoftcacheredis)| |Microsoft.ClassicStorage/storageAccounts/mmxclassic|Não|Sim| |Microsoft.ClassicStorage/storageAccounts/mmxclassic/blobServices|Não|Sim| |Microsoft.ClassicStorage/storageAccounts/mmxclassic/fileServices|Não|Sim| |Microsoft.ClassicStorage/storageAccounts/mmxclassic/queueServices|Não|Sim| |Microsoft.ClassicStorage/storageAccounts/mmxclassic/tableServices|Não|Sim| | |Microsoft.CognitiveServices/accounts| N/D | Não | [Serviços Cognitivos](../../azure-monitor/platform/metrics-supported.md#microsoftcognitiveservicesaccounts)| |Microsoft.Compute/virtualMachines |Sim | Sim | [Máquinas Virtuais](../../azure-monitor/platform/metrics-supported.md#microsoftcomputevirtualmachines)| 
|Microsoft.Compute/virtualMachineScaleSets |N/A | Yes |[Virtual machine scale sets](../../azure-monitor/platform/metrics-supported.md#microsoftcomputevirtualmachinescalesets)|
|Microsoft.ContainerInstance/containerGroups | Yes| No | [Container groups](../../azure-monitor/platform/metrics-supported.md#microsoftcontainerinstancecontainergroups)|
|Microsoft.ContainerService/managedClusters | Yes | No | [Managed Clusters](../../azure-monitor/platform/metrics-supported.md#microsoftcontainerservicemanagedclusters)|
|Microsoft.DataBoxEdge/dataBoxEdgeDevices | Yes | Yes | |
|Microsoft.DataFactory/datafactories| Yes| No | [Data Factories V1](../../azure-monitor/platform/metrics-supported.md#microsoftdatafactorydatafactories)|
|Microsoft.DataFactory/factories |Yes | No |[Data Factories V2](../../azure-monitor/platform/metrics-supported.md#microsoftdatafactoryfactories)|
|Microsoft.DataShare/accounts |No| Yes|
|Microsoft.DBforMySQL/servers |N/A| No |[DB for MySQL](../../azure-monitor/platform/metrics-supported.md#microsoftdbformysqlservers)|
|Microsoft.DBforPostgreSQL/servers |N/A | No | [DB for PostgreSQL](../../azure-monitor/platform/metrics-supported.md#microsoftdbforpostgresqlservers)|
|Microsoft.Devices/IotHubs | N/A | No |[IoT Hub metrics](../../azure-monitor/platform/metrics-supported.md#microsoftdevicesiothubs)|
|Microsoft.Devices/provisioningServices| Yes | No |[DPS metrics](../../azure-monitor/platform/metrics-supported.md#microsoftdevicesprovisioningservices)|
|Microsoft.EventGrid/domains|No|Yes| |
|Microsoft.EventGrid/topics |Yes | No |[Event Grid topics](../../azure-monitor/platform/metrics-supported.md#microsofteventgridtopics)|
|Microsoft.EventHub/clusters |Yes| No |[Event Hubs clusters](../../azure-monitor/platform/metrics-supported.md#microsofteventhubclusters)|
|Microsoft.EventHub/namespaces |Yes| No |[Event Hubs](../../azure-monitor/platform/metrics-supported.md#microsofteventhubnamespaces)|
|Microsoft.KeyVault/vaults| No |No |[Vaults](../../azure-monitor/platform/metrics-supported.md#microsoftkeyvaultvaults)|
|Microsoft.Logic/workflows |N/A | No |[Logic Apps](../../azure-monitor/platform/metrics-supported.md#microsoftlogicworkflows) |
|Microsoft.MachineLearningServices/workspaces|Yes| No | [Machine Learning](../../azure-monitor/platform/metrics-supported.md#microsoftmachinelearningservicesworkspaces) |
|Microsoft.NetApp/netAppAccounts/capacityPools |Yes| No | [Azure NetApp Capacity Pools](../../azure-monitor/platform/metrics-supported.md#microsoftnetappnetappaccountscapacitypools) |
|Microsoft.NetApp/netAppAccounts/capacityPools/volumes |Yes| No | [Azure NetApp Volumes](../../azure-monitor/platform/metrics-supported.md#microsoftnetappnetappaccountscapacitypoolsvolumes) |
|Microsoft.Network/applicationGateways|N/A| No | |
|Microsoft.Network/dnsZones | N/A| No | [DNS Zones](../../azure-monitor/platform/metrics-supported.md#microsoftnetworkdnszones) |
|Microsoft.Network/expressRouteCircuits | N/A | No |[ExpressRoute circuits](../../azure-monitor/platform/metrics-supported.md#microsoftnetworkexpressroutecircuits) |
|Microsoft.Network/loadBalancers (standard SKUs only)| Yes| No | [Load balancers](../../azure-monitor/platform/metrics-supported.md#microsoftnetworkloadbalancers) |
|Microsoft.Network/natGateways|No|Yes|
|Microsoft.Network/privateEndpoints|No|Yes|
|Microsoft.Network/privateLinkServices|No|Yes|
|Microsoft.Network/publicIPAddresses |N/A | No |[Public IP addresses](../../azure-monitor/platform/metrics-supported.md#microsoftnetworkpublicipaddresses)|
|Microsoft.Network/trafficManagerProfiles | Yes | No | [Traffic Manager profiles](../../azure-monitor/platform/metrics-supported.md#microsoftnetworktrafficmanagerprofiles) |
|Microsoft.OperationalInsights/workspaces| Yes | No | [Log Analytics workspaces](../../azure-monitor/platform/metrics-supported.md#microsoftoperationalinsightsworkspaces)|
|Microsoft.Relay/namespaces | Yes | No | [Relays](../../azure-monitor/platform/metrics-supported.md#microsoftrelaynamespaces)|
|Microsoft.Peering/peeringServices|No|Yes|
|Microsoft.PowerBIDedicated/capacities | N/A | No | [Capacities](../../azure-monitor/platform/metrics-supported.md#microsoftpowerbidedicatedcapacities)|
|Microsoft.Search/searchServices |N/A|No | [Search services](../../azure-monitor/platform/metrics-supported.md#microsoftsearchsearchservices)|
|Microsoft.ServiceBus/namespaces |Yes| No |[Service Bus](../../azure-monitor/platform/metrics-supported.md#microsoftservicebusnamespaces)|
|Microsoft.Sql/servers/elasticPools | No | Yes |
|Microsoft.Sql/servers/databases | No | Yes |
|Microsoft.Storage/storageAccounts |Yes | No | [Storage Accounts](../../azure-monitor/platform/metrics-supported.md#microsoftstoragestorageaccounts)|
|Microsoft.Storage/storageAccounts/services | Yes| No | [Blob Services](../../azure-monitor/platform/metrics-supported.md#microsoftstoragestorageaccountsblobservices), [File Services](../../azure-monitor/platform/metrics-supported.md#microsoftstoragestorageaccountsfileservices), [Queue Services](../../azure-monitor/platform/metrics-supported.md#microsoftstoragestorageaccountsqueueservices), and [Table Services](../../azure-monitor/platform/metrics-supported.md#microsoftstoragestorageaccountstableservices)|
|Microsoft.StreamAnalytics/streamingjobs |N/A| No | [Stream Analytics](../../azure-monitor/platform/metrics-supported.md#microsoftstreamanalyticsstreamingjobs)|
|Microsoft.VMwareCloudSimple/virtualMachines |Yes|No |[CloudSimple Virtual Machines](../../azure-monitor/platform/metrics-supported.md#microsoftvmwarecloudsimplevirtualmachines)|
|Microsoft.Web/hostingEnvironments/multiRolePools | Yes | No | [App Service Environment Multi-Role Pools](../../azure-monitor/platform/metrics-supported.md#microsoftwebhostingenvironmentsmultirolepools)|
|Microsoft.Web/hostingEnvironments/workerPools | Yes | No | [App Service Environment Worker Pools](../../azure-monitor/platform/metrics-supported.md#microsoftwebhostingenvironmentsworkerpools)|
|Microsoft.Web/serverfarms | Yes | No | [App Service Plans](../../azure-monitor/platform/metrics-supported.md#microsoftwebserverfarms)|
|Microsoft.Web/sites | Yes | No | [App Services](../../azure-monitor/platform/metrics-supported.md#microsoftwebsites-excluding-functions) and [Functions](../../azure-monitor/platform/metrics-supported.md#microsoftwebsites-functions)|
|Microsoft.Web/sites/slots | Yes | No | [App Service slots](../../azure-monitor/platform/metrics-supported.md#microsoftwebsitesslots)|

## <a name="payload-schema"></a>Payload schema

> [!NOTE]
> You can also use the [common alert schema](https://aka.ms/commonAlertSchemaDocs) for your webhook integrations. It provides the advantage of a single extensible, unified alert payload across all the alert services in Azure Monitor.
[Learn about the common alert schema definitions.](https://aka.ms/commonAlertSchemaDefinitions)

The POST operation contains the following JSON payload and schema for all newer metric alerts when an appropriately configured [action group](../../azure-monitor/platform/action-groups.md) is used:

```json
{
  "schemaId": "AzureMonitorMetricAlert",
  "data": {
    "version": "2.0",
    "status": "Activated",
    "context": {
      "timestamp": "2018-02-28T10:44:10.1714014Z",
      "id": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/Contoso/providers/microsoft.insights/metricAlerts/StorageCheck",
      "name": "StorageCheck",
      "description": "",
      "conditionType": "SingleResourceMultipleMetricCriteria",
      "severity": "3",
      "condition": {
        "windowSize": "PT5M",
        "allOf": [
          {
            "metricName": "Transactions",
            "metricNamespace": "microsoft.storage/storageAccounts",
            "dimensions": [
              {
                "name": "AccountResourceId",
                "value": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/Contoso/providers/Microsoft.Storage/storageAccounts/diag500"
              },
              {
                "name": "GeoType",
                "value": "Primary"
              }
            ],
            "operator": "GreaterThan",
            "threshold": "0",
            "timeAggregation": "PT5M",
            "metricValue": 1
          }
        ]
      },
      "subscriptionId": "00000000-0000-0000-0000-000000000000",
      "resourceGroupName": "Contoso",
      "resourceName": "diag500",
      "resourceType": "Microsoft.Storage/storageAccounts",
      "resourceId": "/subscriptions/1e3ff1c0-771a-4119-a03b-be82a51e232d/resourceGroups/Contoso/providers/Microsoft.Storage/storageAccounts/diag500",
      "portalLink": "https://portal.azure.com/#resource//subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/Contoso/providers/Microsoft.Storage/storageAccounts/diag500"
    },
    "properties": {
      "key1": "value1",
      "key2": "value2"
    }
  }
}
```

## <a name="next-steps"></a>Next steps

* Learn more about the new [alerts experience](../../azure-monitor/platform/alerts-overview.md).
* Learn about [log alerts in Azure](../../azure-monitor/platform/alerts-unified-log.md).
* Learn about [alerts in Azure](../../azure-monitor/platform/alerts-overview.md).
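A webhook receiver can pull what it needs out of the payload above. As a minimal sketch (not an official Azure sample; the helper name and summary format are illustrative), assuming the request body has already been read as a JSON string:

```python
import json

def summarize_metric_alert(payload: str) -> str:
    """Summarize an AzureMonitorMetricAlert webhook payload (schema version 2.0)."""
    alert = json.loads(payload)
    if alert.get("schemaId") != "AzureMonitorMetricAlert":
        raise ValueError("unexpected schemaId: %r" % alert.get("schemaId"))
    data = alert["data"]
    ctx = data["context"]
    # For SingleResourceMultipleMetricCriteria the fired criteria sit under condition.allOf.
    cond = ctx["condition"]["allOf"][0]
    return "{status}: {metric} {op} {threshold} (value={value}) on {resource}".format(
        status=data["status"],
        metric=cond["metricName"],
        op=cond["operator"],
        threshold=cond["threshold"],
        value=cond["metricValue"],
        resource=ctx["resourceName"],
    )

# A trimmed version of the sample payload shown above.
sample = json.dumps({
    "schemaId": "AzureMonitorMetricAlert",
    "data": {
        "version": "2.0",
        "status": "Activated",
        "context": {
            "resourceName": "diag500",
            "condition": {
                "windowSize": "PT5M",
                "allOf": [{
                    "metricName": "Transactions",
                    "operator": "GreaterThan",
                    "threshold": "0",
                    "metricValue": 1,
                }],
            },
        },
    },
})

print(summarize_metric_alert(sample))
# Activated: Transactions GreaterThan 0 (value=1) on diag500
```

Checking `schemaId` first keeps the handler from silently misreading payloads sent with the common alert schema instead.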
86.314103
529
0.770219
por_Latn
0.478828
4c3dc37e5a2b694f7c98a4b290615eeaee70f805
626
md
Markdown
en/python/lists/8.md
nank1ro/Codigo-Questions
559cf998c8b5fbf7a9aee2db08fe57c8abc63408
[ "BSD-3-Clause" ]
5
2021-08-30T05:36:45.000Z
2022-03-18T16:25:39.000Z
en/python/lists/8.md
nank1ro/Codigo-Questions
559cf998c8b5fbf7a9aee2db08fe57c8abc63408
[ "BSD-3-Clause" ]
33
2021-10-04T12:52:45.000Z
2022-03-07T11:32:13.000Z
en/python/lists/8.md
nank1ro/Codigo-Questions
559cf998c8b5fbf7a9aee2db08fe57c8abc63408
[ "BSD-3-Clause" ]
1
2021-12-07T16:04:12.000Z
2021-12-07T16:04:12.000Z
---
language: python
exerciseType: 2
---

# --description--

List elements can be of any type:

```python
list_name = ["one", 2, True]
```

In fact, above we have, in order, a string, an integer, and a boolean. We can even have lists that contain other lists!

# --instructions--

Print out a value from the list

# --seed--

```python
list1 = ["a", "b", "c"]
list2 = ["x", "y", "z"]
list3 = [list1, list2]

print([/])
```

# --answers--

- list3
- list3[[2]]
- list3[0][2][0]
- list3[1][0]

# --solutions--

```python
list1 = ["a", "b", "c"]
list2 = ["x", "y", "z"]
list3 = [list1, list2]

print(list3[1][0])
```

# --output--

x
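The double-index pattern the exercise tests can be sketched as follows (the variable names here are illustrative and are not part of the exercise seed):

```python
letters = ["a", "b", "c"]
coords = ["x", "y", "z"]
nested = [letters, coords]

# The first index picks the inner list; the second picks an element of it.
assert nested[1] is coords
assert nested[1][0] == "x"
assert nested[0][2] == "c"
print(nested[1][0])  # x
```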
13.319149
69
0.570288
eng_Latn
0.922934
4c3dca4f53b8ba5940e3f34125dfcffea24092cf
39
md
Markdown
README.md
Muana-kawilam/countries
d7158ced1e8e00d6ab1e13d4e426a03f6d6b5445
[ "MIT" ]
null
null
null
README.md
Muana-kawilam/countries
d7158ced1e8e00d6ab1e13d4e426a03f6d6b5445
[ "MIT" ]
null
null
null
README.md
Muana-kawilam/countries
d7158ced1e8e00d6ab1e13d4e426a03f6d6b5445
[ "MIT" ]
null
null
null
# countries

Working with countries API
13
26
0.820513
eng_Latn
0.997337
4c3ddc366eca7d2e3a1674c24a2e6189f8e94c50
473
md
Markdown
discuss/Ch16/Sequence-Aware Recommender Systems.md
StevenJokess/d2l-en-read
71b0f35971063b9fe5f21319b8072d61c9e5a298
[ "MIT" ]
1
2021-05-26T12:19:44.000Z
2021-05-26T12:19:44.000Z
discuss/Ch16/Sequence-Aware Recommender Systems.md
StevenJokess/d2l-en-read
71b0f35971063b9fe5f21319b8072d61c9e5a298
[ "MIT" ]
null
null
null
discuss/Ch16/Sequence-Aware Recommender Systems.md
StevenJokess/d2l-en-read
71b0f35971063b9fe5f21319b8072d61c9e5a298
[ "MIT" ]
1
2021-05-05T13:54:26.000Z
2021-05-05T13:54:26.000Z
<!--
 * @version:
 * @Author: StevenJokess https://github.com/StevenJokess
 * @Date: 2020-09-13 19:57:08
 * @LastEditors: StevenJokess https://github.com/StevenJokess
 * @LastEditTime: 2020-09-13 19:57:16
 * @Description:
 * @TODO::
 * @Reference:
-->

Sequence-Aware Recommender Systems

image https://discuss.d2l.ai/uploads/default/original/1X/44ee4d7cde2c502993268ab224223b4bfa7422ee.png

Why are the examples so small? 9.4 VS 8220.1 :roll_eyes:

Factorization Machines
26.277778
95
0.742072
yue_Hant
0.698491
4c3de15a3d41aa4d98be7a5026d5e9f9dc3ad9f4
1,473
md
Markdown
src/__tests__/fixtures/unfoldingWord/en_tw/bible/names/zechariahnt.md
unfoldingWord/content-checker
7b4ca10b94b834d2795ec46c243318089cc9110e
[ "MIT" ]
null
null
null
src/__tests__/fixtures/unfoldingWord/en_tw/bible/names/zechariahnt.md
unfoldingWord/content-checker
7b4ca10b94b834d2795ec46c243318089cc9110e
[ "MIT" ]
226
2020-09-09T21:56:14.000Z
2022-03-26T18:09:53.000Z
src/__tests__/fixtures/unfoldingWord/en_tw/bible/names/zechariahnt.md
unfoldingWord/content-checker
7b4ca10b94b834d2795ec46c243318089cc9110e
[ "MIT" ]
1
2022-01-10T21:47:07.000Z
2022-01-10T21:47:07.000Z
# Zechariah (NT)

## Facts:

In the New Testament, Zechariah was a Jewish priest who became the father of John the Baptist.

* Zechariah loved God and obeyed him.
* For many years Zechariah and his wife, Elizabeth, prayed earnestly to have a child, but did not have one. Then when they were very old, God answered their prayers and gave them a son.
* Zechariah prophesied that his son John would be the prophet who would announce and prepare the way for the Messiah.

(Translation suggestions: [How to Translate Names](rc://en/ta/man/translate/translate-names))

(See also: [Christ](../kt/christ.md), [Elizabeth](../names/elizabeth.md), [prophet](../kt/prophet.md))

## Bible References:

* [Luke 01:5-7](rc://en/tn/help/luk/01/05)
* [Luke 01:21-23](rc://en/tn/help/luk/01/21)
* [Luke 01:39-41](rc://en/tn/help/luk/01/39)
* [Luke 03:1-2](rc://en/tn/help/luk/03/01)

## Examples from the Bible stories:

* __[22:01](rc://en/tn/help/obs/22/01)__ Suddenly an angel came with a message from God to an old priest named __Zechariah__. __Zechariah__ and his wife, Elizabeth, were godly people, but she had not been able to have any children.
* __[22:02](rc://en/tn/help/obs/22/02)__ The angel said to __Zechariah__, “Your wife will have a son. You will name him John.”
* __[22:03](rc://en/tn/help/obs/22/03)__ Immediately, __Zechariah__ was unable to speak.
* __[22:07](rc://en/tn/help/obs/22/07)__ Then God allowed __Zechariah__ to speak again.

## Word Data:

* Strong’s: G2197
46.03125
231
0.720978
eng_Latn
0.988393
4c3e0f62f58c4b0e89e2550dc1cc1e1d3c1f4871
141
md
Markdown
post/2018/08/2018-08-14-8396/index.md
heinzwittenbrink/lostandfound
064a52d0b795d5ad67219d8c72db7d18201cbea5
[ "CC0-1.0" ]
null
null
null
post/2018/08/2018-08-14-8396/index.md
heinzwittenbrink/lostandfound
064a52d0b795d5ad67219d8c72db7d18201cbea5
[ "CC0-1.0" ]
null
null
null
post/2018/08/2018-08-14-8396/index.md
heinzwittenbrink/lostandfound
064a52d0b795d5ad67219d8c72db7d18201cbea5
[ "CC0-1.0" ]
null
null
null
--- title: "" date: "2018-08-14" categories: - "journal" --- Erste Lektüre in diesem Urlaub. Bin begeistert. ![](images/6272448004.jpg)
12.818182
47
0.652482
deu_Latn
0.43117
4c3e5363d00927cd9ce6c2a3c7ef4742f0e8b508
311
md
Markdown
_posts/1959-12-23-the-submerged-land-around-the.md
MiamiMaritime/miamimaritime.github.io
d087ae8c104ca00d78813b5a974c154dfd9f3630
[ "MIT" ]
null
null
null
_posts/1959-12-23-the-submerged-land-around-the.md
MiamiMaritime/miamimaritime.github.io
d087ae8c104ca00d78813b5a974c154dfd9f3630
[ "MIT" ]
null
null
null
_posts/1959-12-23-the-submerged-land-around-the.md
MiamiMaritime/miamimaritime.github.io
d087ae8c104ca00d78813b5a974c154dfd9f3630
[ "MIT" ]
null
null
null
---
title: The submerged land around the
tags:
- Dec 1959
---

The submerged land around the Ragged Keys is put up for sale by the Florida Cabinet. The Ragged Keys owners immediately offer $123 an acre.

Newspapers: **Miami Morning News or The Miami Herald**

Page: **2**, Section: **A**
25.916667
141
0.66881
eng_Latn
0.997588
4c3eeb6feaf92c08ac5c685a0bf2dccdf625a0a8
202
md
Markdown
db/en/fix/27.md
imfht/data
20459bc133f0c09d06d5c78efe77079f1ce4a5c0
[ "BSD-3-Clause" ]
139
2015-03-27T12:50:40.000Z
2021-11-08T12:32:24.000Z
venv/lib/python2.7/site-packages/vulndb/db/en/fix/27.md
sravani-m/Web-Application-Security-Framework
d9f71538f5cba6fe1d8eabcb26c557565472f6a6
[ "MIT" ]
44
2015-02-19T18:15:21.000Z
2018-05-03T14:19:20.000Z
venv/lib/python2.7/site-packages/vulndb/db/en/fix/27.md
sravani-m/Web-Application-Security-Framework
d9f71538f5cba6fe1d8eabcb26c557565472f6a6
[ "MIT" ]
41
2015-03-27T15:41:17.000Z
2022-03-18T21:40:34.000Z
1. Explicitly set the `filename` attribute in the Content-Disposition HTTP response header.
2. Perform strict whitelist validation on user input before using it in the creation of HTTP response bodies.
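Both steps can be sketched in Python; the allowed-name pattern and helper below are illustrative assumptions, not part of the original advisory:

```python
import re

# Whitelist: short alphanumeric base name plus a small set of expected extensions.
ALLOWED_NAME = re.compile(r"^[A-Za-z0-9_-]{1,64}\.(pdf|csv|txt)$")

def content_disposition(user_supplied_name: str) -> str:
    """Build a Content-Disposition header with an explicitly set filename.

    Anything outside the strict whitelist is rejected, so attacker-controlled
    input cannot smuggle path segments, quotes, or header delimiters into
    the response.
    """
    if not ALLOWED_NAME.match(user_supplied_name):
        raise ValueError("filename rejected by whitelist")
    return 'attachment; filename="%s"' % user_supplied_name

print(content_disposition("report.pdf"))   # attachment; filename="report.pdf"
```

Rejecting early with an exception, rather than sanitizing, keeps the validation decision in one place and makes failures visible in logs.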
33.666667
69
0.816832
eng_Latn
0.99656
4c3fccc0851c267d14a64831807152ea461dd803
65
md
Markdown
README.md
mjul/fsharp-ikvm-docjure
10d7398926b3e1d0b9ad6f05b13a815948b5db84
[ "MIT" ]
1
2016-10-24T09:38:32.000Z
2016-10-24T09:38:32.000Z
README.md
mjul/fsharp-ikvm-docjure
10d7398926b3e1d0b9ad6f05b13a815948b5db84
[ "MIT" ]
null
null
null
README.md
mjul/fsharp-ikvm-docjure
10d7398926b3e1d0b9ad6f05b13a815948b5db84
[ "MIT" ]
null
null
null
# fsharp-ikvm-docjure

Using the Docjure library from F# via IKVM
21.666667
42
0.784615
eng_Latn
0.875307
4c40046082adf873cb5e08315bbb72c6ce88dcc6
2,265
md
Markdown
docs/vs-2015/extensibility/customdatasignature-element-visual-studio-templates.md
viniciustavanoferreira/visualstudio-docs.pt-br
2ec4855214a26a53888d4770ff5d6dde15dbb8a5
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/extensibility/customdatasignature-element-visual-studio-templates.md
viniciustavanoferreira/visualstudio-docs.pt-br
2ec4855214a26a53888d4770ff5d6dde15dbb8a5
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/extensibility/customdatasignature-element-visual-studio-templates.md
viniciustavanoferreira/visualstudio-docs.pt-br
2ec4855214a26a53888d4770ff5d6dde15dbb8a5
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: CustomDataSignature Element (Visual Studio templates) | Microsoft Docs
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-ide-general
ms.topic: reference
helpviewer_keywords:
- <CustomDataSignature> Element (Visual Studio Templates)
- CustomDataSignature Element (Visual Studio Templates)
ms.assetid: 8c3db51d-7014-4484-802a-15aa1353dbdb
caps.latest.revision: 7
ms.author: gregvanl
manager: jillfra
ms.openlocfilehash: 784704bea43a87f1aebdc42941906179dca815ce
ms.sourcegitcommit: 94b3a052fb1229c7e7f8804b09c1d403385c7630
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 04/23/2019
ms.locfileid: "62580411"
---
# <a name="customdatasignature-element-visual-studio-templates"></a>CustomDataSignature Element (Visual Studio templates)

[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]

Specifies the text signature used to locate the custom data.

\<VSTemplate>
\<TemplateData>
\<CustomDataSignature>

## <a name="syntax"></a>Syntax

```
<CustomDataSignature>"string"</CustomDataSignature>
```

## <a name="attributes-and-elements"></a>Attributes and elements

The following sections describe attributes, child elements, and parent elements.

### <a name="attributes"></a>Attributes

None.

### <a name="child-elements"></a>Child elements

None.

### <a name="parent-elements"></a>Parent elements

|Element|Description|
|-------------|-----------------|
|[TemplateData](../extensibility/templatedata-element-visual-studio-templates.md)|Required element.<br /><br /> Categorizes the template and defines how it is displayed in either the **New Project** or the **Add New Item** dialog box.|

## <a name="text-value"></a>Text value

A text value is required. The text is a string that contains the text signature required to locate the custom data.

## <a name="remarks"></a>Remarks

`CustomDataSignature` is an optional element.

## <a name="see-also"></a>See also

[Visual Studio template schema reference](../extensibility/visual-studio-template-schema-reference.md)
[Creating project and item templates](../ide/creating-project-and-item-templates.md)
36.532258
250
0.734216
por_Latn
0.716801
4c409847790595f9983aab6225abeae453c77724
34,861
md
Markdown
articles/iot-hub/iot-hub-devguide-security.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
12
2017-08-28T07:45:55.000Z
2022-03-07T21:35:48.000Z
articles/iot-hub/iot-hub-devguide-security.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
441
2017-11-08T13:15:56.000Z
2021-06-02T10:39:53.000Z
articles/iot-hub/iot-hub-devguide-security.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
27
2017-11-13T13:38:31.000Z
2022-02-17T11:57:33.000Z
---
title: Understand Azure IoT Hub security | Microsoft Docs
description: Developer guide - how to control access to IoT Hub for device apps and back-end apps. Includes information about security tokens and support for X.509 certificates.
author: wesmc7777
manager: philmea
ms.author: wesmc
ms.service: iot-hub
services: iot-hub
ms.topic: conceptual
ms.date: 04/15/2021
ms.custom:
- amqp
- mqtt
- 'Role: Cloud Development'
- 'Role: IoT Device'
- 'Role: Operations'
- devx-track-js
- devx-track-csharp
ms.openlocfilehash: 7f919069005e8fcb813baf2521c8cb20cffafc88
ms.sourcegitcommit: 2aeb2c41fd22a02552ff871479124b567fa4463c
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 04/22/2021
ms.locfileid: "107870359"
---
# <a name="control-access-to-iot-hub"></a>Control access to IoT Hub

This article describes the options for securing your IoT hub. IoT Hub uses *permissions* to grant access to each IoT hub endpoint. Permissions limit access to an IoT hub based on functionality.

This article covers:

* The different permissions that you can grant to a device or back-end app to access your IoT hub.
* The authentication process and the tokens it uses to verify permissions.
* How to scope credentials to limit access to specific resources.
* IoT Hub support for X.509 certificates.
* Custom device authentication mechanisms that use existing device identity registries or authentication schemes.

[!INCLUDE [iot-hub-basic](../../includes/iot-hub-basic-partial.md)]

You must have appropriate permissions to access any of the IoT Hub endpoints. For example, a device must include a token containing security credentials along with every message it sends to IoT Hub.
## <a name="access-control-and-permissions"></a>Access control and permissions

You can [grant permissions](#iot-hub-permissions) in the following ways:

* **IoT hub-level shared access policies.** Shared access policies can grant any combination of [permissions](#iot-hub-permissions). You can define policies in the [Azure portal](https://portal.azure.com), programmatically by using the [IoT Hub Resource REST APIs](/rest/api/iothub/iothubresource), or by using the [az iot hub policy](/cli/azure/iot/hub/policy) CLI. A newly created IoT hub has the following default policies:

  | Shared access policy | Permissions |
  | -------------------- | ----------- |
  | iothubowner | All permissions |
  | service | **ServiceConnect** permissions |
  | device | **DeviceConnect** permissions |
  | registryRead | **RegistryRead** permissions |
  | registryReadWrite | **RegistryRead** and **RegistryWrite** permissions |

* **Per-device security credentials.** Each IoT hub contains an [identity registry](iot-hub-devguide-identity-registry.md). For each device in this identity registry, you can configure security credentials that grant **DeviceConnect** permissions scoped to the corresponding device endpoints.

For example, in a typical IoT solution:

* The device management component uses the *registryReadWrite* policy.
* The event processor component uses the *service* policy.
* The run-time device business logic component uses the *service* policy.
* Individual devices connect using credentials stored in the IoT hub's identity registry.

> [!NOTE]
> See [permissions](#iot-hub-permissions) for detailed information.

## <a name="authentication"></a>Authentication

Azure IoT Hub grants access to endpoints by verifying a token against the shared access policies and identity registry security credentials.
Security credentials, such as symmetric keys, are never sent over the wire.

> [!NOTE]
> The Azure IoT Hub resource provider is secured through your Azure subscription, as are all providers in [Azure Resource Manager](../azure-resource-manager/management/overview.md).

For more information about how to construct and use security tokens, see [IoT Hub security tokens](iot-hub-devguide-security.md#security-tokens).

### <a name="protocol-specifics"></a>Protocol specifics

Each supported protocol, such as MQTT, AMQP, and HTTPS, transports tokens in different ways.

When using MQTT, the CONNECT packet has the deviceId as the ClientId, `{iothubhostname}/{deviceId}` in the Username field, and a SAS token in the Password field. `{iothubhostname}` should be the full CName of the IoT hub (for example, contoso.azure-devices.net).

When using [AMQP](https://www.amqp.org/), IoT Hub supports [SASL PLAIN](https://tools.ietf.org/html/rfc4616) and [AMQP Claims-Based-Security](https://www.oasis-open.org/committees/download.php/50506/amqp-cbs-v1%200-wd02%202013-08-12.doc).

If you use AMQP claims-based security, the standard specifies how to transmit these tokens.

For SASL PLAIN, the **username** can be:

* `{policyName}@sas.root.{iothubName}` if using IoT hub-level tokens.
* `{deviceId}@sas.{iothubname}` if using device-scoped tokens.

In both cases, the password field contains the token, as described in [IoT Hub security tokens](iot-hub-devguide-security.md#security-tokens).

HTTPS implements authentication by including a valid token in the **Authorization** request header.
#### <a name="example"></a>Example

Username (the DeviceId is case-sensitive): `iothubname.azure-devices.net/DeviceId`

Password (you can generate a SAS token with the [az iot hub generate-sas-token](/cli/azure/iot/hub#az_iot_hub_generate_sas_token) CLI extension command, or [Azure IoT Tools for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools)):

`SharedAccessSignature sr=iothubname.azure-devices.net%2fdevices%2fDeviceId&sig=kPszxZZZZZZZZZZZZZZZZZAhLT%2bV7o%3d&se=1487709501`

> [!NOTE]
> The [Azure IoT SDKs](iot-hub-devguide-sdks.md) automatically generate tokens when connecting to the service. In some cases, the Azure IoT SDKs do not support all the protocols or all the authentication methods.

### <a name="special-considerations-for-sasl-plain"></a>Special considerations for SASL PLAIN

When using SASL PLAIN with AMQP, a client connecting to an IoT hub can use a single token for each TCP connection. When the token expires, the TCP connection disconnects from the service and triggers a reconnection. This behavior, while not problematic for a back-end app, is damaging for a device app for the following reasons:

* Gateways usually connect on behalf of many devices. When using SASL PLAIN, they have to create a distinct TCP connection for each device connecting to an IoT hub. This scenario considerably increases the consumption of power and networking resources, and increases the latency of each device connection.
* Resource-constrained devices are adversely affected by the increased use of resources to reconnect after each token expiration.

## <a name="scope-iot-hub-level-credentials"></a>Scope IoT hub-level credentials

You can scope IoT hub-level security policies by creating tokens with a restricted resource URI.
For example, the endpoint to send device-to-cloud messages from a device is **/devices/{deviceId}/messages/events**. You can also use an IoT hub-level shared access policy with **DeviceConnect** permissions to sign a token whose resourceURI is **/devices/{deviceId}**. This approach creates a token that is only usable to send messages on behalf of device **deviceId**.

This mechanism is similar to the [Event Hubs publisher policy](https://code.msdn.microsoft.com/Service-Bus-Event-Hub-99ce67ab), and enables you to implement custom authentication methods.

## <a name="security-tokens"></a>Security tokens

IoT Hub uses security tokens to authenticate devices and services, to avoid sending keys on the wire. Additionally, security tokens are limited in time validity and scope. [Azure IoT SDKs](iot-hub-devguide-sdks.md) automatically generate tokens without requiring any special configuration. Some scenarios do require you to generate and use security tokens directly. Such scenarios include:

* The direct use of the MQTT, AMQP, or HTTPS surfaces.
* The implementation of the token service pattern, as explained in [Custom device authentication](iot-hub-devguide-security.md#custom-device-and-module-authentication).

IoT Hub also allows devices to authenticate with IoT Hub using [X.509 certificates](iot-hub-devguide-security.md#supported-x509-certificates).

### <a name="security-token-structure"></a>Security token structure

You use security tokens to grant time-bounded access for devices and services to specific functionality in IoT Hub. To get authorization to connect to IoT Hub, devices and services must send security tokens signed with either a shared access key or a symmetric key.
These keys are stored with the device identity in the identity registry.

A token signed with a shared access key grants access to all the functionality associated with the shared access policy permissions. A token signed with a device identity's symmetric key only grants the **DeviceConnect** permission for the associated device identity.

The security token has the following format:

`SharedAccessSignature sig={signature-string}&se={expiry}&skn={policyName}&sr={URL-encoded-resourceURI}`

Here are the expected values:

| Value | Description |
| --- | --- |
| {signature} |An HMAC-SHA256 signature string of the form: `{URL-encoded-resourceURI} + "\n" + expiry`. **Important**: the key is decoded from base64 and used as key to perform the HMAC-SHA256 computation. |
| {resourceURI} |URI prefix (by segment) of the endpoints that can be accessed with this token, starting with the host name of the IoT hub (no protocol). For example, `myHub.azure-devices.net/devices/device1` |
| {expiry} |UTF8 string for the number of seconds since the epoch 00:00:00 UTC on 1 January 1970. |
| {URL-encoded-resourceURI} |Lower-case URL encoding of the lower-case resource URI |
| {policyName} |The name of the shared access policy to which this token refers. Absent if the token refers to device-registry credentials. |

**Note on the prefix**: the URI prefix is computed by segment, not by character. For example, `/a/b` is a prefix for `/a/b/c` but not for `/a/bc`.

The following Node.js snippet shows a function called **generateSasToken** that computes the token from the inputs `resourceUri, signingKey, policyName, expiresInMins`. The next sections detail how to initialize the different inputs for the different token use cases.
```javascript
var generateSasToken = function(resourceUri, signingKey, policyName, expiresInMins) {
    resourceUri = encodeURIComponent(resourceUri);

    // Set expiration in seconds
    var expires = (Date.now() / 1000) + expiresInMins * 60;
    expires = Math.ceil(expires);
    var toSign = resourceUri + '\n' + expires;

    // Use crypto
    var hmac = crypto.createHmac('sha256', Buffer.from(signingKey, 'base64'));
    hmac.update(toSign);
    var base64UriEncoded = encodeURIComponent(hmac.digest('base64'));

    // Construct authorization string
    var token = "SharedAccessSignature sr=" + resourceUri + "&sig=" + base64UriEncoded + "&se=" + expires;
    if (policyName) token += "&skn=" + policyName;
    return token;
};
```

For comparison, the equivalent Python code to generate a security token is:

```python
from base64 import b64encode, b64decode
from hashlib import sha256
from time import time
from urllib import parse
from hmac import HMAC

def generate_sas_token(uri, key, policy_name, expiry=3600):
    ttl = time() + expiry
    sign_key = "%s\n%d" % ((parse.quote_plus(uri)), int(ttl))
    print(sign_key)
    signature = b64encode(HMAC(b64decode(key), sign_key.encode('utf-8'), sha256).digest())

    rawtoken = {
        'sr' :  uri,
        'sig': signature,
        'se' : str(int(ttl))
    }

    if policy_name is not None:
        rawtoken['skn'] = policy_name

    return 'SharedAccessSignature ' + parse.urlencode(rawtoken)
```

The functionality in C# to generate a security token is:

```csharp
using System;
using System.Globalization;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;

public static string generateSasToken(string resourceUri, string key, string policyName, int expiryInSeconds = 3600)
{
    TimeSpan fromEpochStart = DateTime.UtcNow - new DateTime(1970, 1, 1);
    string expiry = Convert.ToString((int)fromEpochStart.TotalSeconds + expiryInSeconds);

    string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;

    HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(key));
    string signature =
Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    string token = String.Format(CultureInfo.InvariantCulture, "SharedAccessSignature sr={0}&sig={1}&se={2}", WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry);
    if (!String.IsNullOrEmpty(policyName))
    {
        token += "&skn=" + policyName;
    }

    return token;
}
```

For Java:

```java
public static String generateSasToken(String resourceUri, String key) throws Exception {
    // Token will expire in one hour
    var expiry = Instant.now().getEpochSecond() + 3600;

    String stringToSign = URLEncoder.encode(resourceUri, StandardCharsets.UTF_8) + "\n" + expiry;
    byte[] decodedKey = Base64.getDecoder().decode(key);

    Mac sha256HMAC = Mac.getInstance("HmacSHA256");
    SecretKeySpec secretKey = new SecretKeySpec(decodedKey, "HmacSHA256");
    sha256HMAC.init(secretKey);
    Base64.Encoder encoder = Base64.getEncoder();

    String signature = new String(encoder.encode(
        sha256HMAC.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8))), StandardCharsets.UTF_8);

    String token = "SharedAccessSignature sr=" + URLEncoder.encode(resourceUri, StandardCharsets.UTF_8)
            + "&sig=" + URLEncoder.encode(signature, StandardCharsets.UTF_8.name()) + "&se=" + expiry;

    return token;
}
```

> [!NOTE]
> Since the time validity of the token is validated on IoT Hub machines, the drift on the clock of the machine that generates the token must be minimal.

### <a name="use-sas-tokens-in-a-device-app"></a>Use SAS tokens in a device app

There are two ways to obtain **DeviceConnect** permissions with IoT Hub using security tokens: use a [symmetric device key from the identity registry](#use-a-symmetric-key-in-the-identity-registry), or use a [shared access key](#use-a-shared-access-policy).

Remember that all functionality accessible from devices is exposed by design on endpoints with the prefix `/devices/{deviceId}`.
> [!IMPORTANT]
> The only way that IoT Hub authenticates a specific device is by using the device identity's symmetric key. In cases when a shared access policy is used to access device functionality, the solution must consider the component issuing the security token as a trusted subcomponent.

The device-facing endpoints are (irrespective of the protocol):

| Endpoint | Functionality |
| --- | --- |
| `{iot hub host name}/devices/{deviceId}/messages/events` |Send device-to-cloud messages. |
| `{iot hub host name}/devices/{deviceId}/messages/devicebound` |Receive cloud-to-device messages. |

### <a name="use-a-symmetric-key-in-the-identity-registry"></a>Use a symmetric key in the identity registry

When using a device identity's symmetric key to generate a token, the policyName (`skn`) element of the token is omitted.

For example, a token created to access all device functionality should have the following parameters:

* resource URI: `{IoT hub name}.azure-devices.net/devices/{device id}`,
* signing key: any symmetric key for the `{device id}` identity,
* no policy name,
* any expiration time.

An example using the preceding Node.js function would be:

```javascript
var endpoint ="myhub.azure-devices.net/devices/device1";
var deviceKey ="...";

var token = generateSasToken(endpoint, deviceKey, null, 60);
```

The result, which grants access to all functionality for device1, would be:

`SharedAccessSignature sr=myhub.azure-devices.net%2fdevices%2fdevice1&sig=13y8ejUk2z7PLmvtwR5RqlGBOVwiq7rQR3WZ5xZX3N4%3D&se=1456971697`

> [!NOTE]
> It is possible to generate a SAS token with the [az iot hub generate-sas-token](/cli/azure/iot/hub#az_iot_hub_generate_sas_token) CLI extension command, or [Azure IoT Tools for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools).
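The device-scoped case can also be exercised end to end in Python. This is a hedged, self-contained sketch: the key below is a made-up base64 value and the absolute expiry is fixed only to make the result repeatable; it mirrors the token structure described earlier rather than calling any Azure SDK:

```python
from base64 import b64decode, b64encode
from hashlib import sha256
from hmac import HMAC
from urllib import parse

def generate_sas_token(uri, key, policy_name, ttl):
    """Build an IoT Hub-style SAS token: HMAC-SHA256 over '<url-encoded-uri>\n<expiry>'."""
    sign_key = "%s\n%d" % (parse.quote_plus(uri), int(ttl))
    signature = b64encode(HMAC(b64decode(key), sign_key.encode("utf-8"), sha256).digest())
    rawtoken = {"sr": uri, "sig": signature, "se": str(int(ttl))}
    if policy_name is not None:  # omitted for device-scoped tokens
        rawtoken["skn"] = policy_name
    return "SharedAccessSignature " + parse.urlencode(rawtoken)

# Hypothetical device key (base64) and a fixed absolute expiry for a deterministic shape.
device_key = b64encode(b"not-a-real-key").decode()
token = generate_sas_token("myhub.azure-devices.net/devices/device1", device_key, None, 1456971697)

# Device-scoped token: sr names the device endpoint and no skn field is present.
assert token.startswith("SharedAccessSignature sr=myhub.azure-devices.net%2Fdevices%2Fdevice1")
assert "&se=1456971697" in token and "skn=" not in token
```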
### <a name="use-a-shared-access-policy"></a>Use a shared access policy

When you create a token from a shared access policy, set the `skn` field to the name of the policy. This policy must grant the **DeviceConnect** permission.

The two main scenarios for using shared access policies to access device functionality are:

* [cloud protocol gateways](iot-hub-devguide-endpoints.md),
* [token services](iot-hub-devguide-security.md#custom-device-and-module-authentication) used to implement custom authentication schemes.

Since a shared access policy can potentially grant access to connect as any device, it is important to use the correct resource URI when creating security tokens. This setting is especially important for token services, which have to scope the token to a specific device using the resource URI. This point is less relevant for protocol gateways because they already mediate traffic for all devices.

As an example, a token service using the pre-created shared access policy called **device** would create a token with the following parameters:

* resource URI: `{IoT hub name}.azure-devices.net/devices/{device id}`,
* signing key: one of the keys of the `device` policy,
* policy name: `device`,
* any expiration time.

An example using the preceding Node.js function would be:

```javascript
var endpoint = "myhub.azure-devices.net/devices/device1";
var policyName = 'device';
var policyKey = '...';

var token = generateSasToken(endpoint, policyKey, policyName, 60);
```

The result, which grants access to all functionality for device1, would be:

`SharedAccessSignature sr=myhub.azure-devices.net%2fdevices%2fdevice1&sig=13y8ejUk2z7PLmvtwR5RqlGBOVwiq7rQR3WZ5xZX3N4%3D&se=1456971697&skn=device`

A protocol gateway could use the same token for all devices simply by setting the resource URI to `myhub.azure-devices.net/devices`.
### <a name="use-security-tokens-from-service-components"></a>Use security tokens from service components

Service components can only generate security tokens using shared access policies that grant the appropriate permissions, as explained previously.

Here are the service functions exposed on the endpoints:

| Endpoint | Functionality |
| --- | --- |
| `{iot hub host name}/devices` |Create, update, retrieve, and delete device identities. |
| `{iot hub host name}/messages/events` |Receive device-to-cloud messages. |
| `{iot hub host name}/servicebound/feedback` |Receive feedback for cloud-to-device messages. |
| `{iot hub host name}/devicebound` |Send cloud-to-device messages. |

As an example, a service generating a token using the pre-created shared access policy called **registryRead** would create a token with the following parameters:

* resource URI: `{IoT hub name}.azure-devices.net/devices`,
* signing key: one of the keys of the `registryRead` policy,
* policy name: `registryRead`,
* any expiration time.

```javascript
var endpoint = "myhub.azure-devices.net/devices";
var policyName = 'registryRead';
var policyKey = '...';

var token = generateSasToken(endpoint, policyKey, policyName, 60);
```

The result, which would grant access to read all device identities, would be:

`SharedAccessSignature sr=myhub.azure-devices.net%2fdevices&sig=JdyscqTpXdEJs49elIUCcohw2DlFDR3zfH5KqGJo4r4%3D&se=1456973447&skn=registryRead`

## <a name="supported-x509-certificates"></a>Supported X.509 certificates

You can use any X.509 certificate to authenticate a device with IoT Hub by uploading either a certificate thumbprint or a certificate authority (CA) to Azure IoT Hub. Authentication with certificate thumbprints verifies that the presented thumbprint matches the configured thumbprint.
Authentication with a certificate authority verifies the certificate chain. Either way, the TLS handshake requires the device to possess a valid certificate and private key. For details, see the TLS specification, for example: [RFC 5246 - The Transport Layer Security (TLS) Protocol Version 1.2](https://tools.ietf.org/html/rfc5246/).

Supported certificates include:

* **An existing X.509 certificate.** A device may already have an X.509 certificate associated with it. The device can use this certificate to authenticate with IoT Hub. Works with either thumbprint or CA authentication.
* **A CA-signed X.509 certificate.** To identify a device and authenticate it with IoT Hub, you can use an X.509 certificate generated and signed by a certificate authority. Works with either thumbprint or CA authentication.
* **A self-generated and self-signed X.509 certificate.** A device manufacturer or in-house deployer can generate these certificates and store the corresponding private key (and certificate) on the device. You can use tools such as [OpenSSL](https://www.openssl.org/) and the [Windows SelfSignedCertificate](/powershell/module/pkiclient/new-selfsignedcertificate) utility for this purpose. Works only with thumbprint authentication.

A device can use either an X.509 certificate or a security token for authentication, but not both.

With X.509 certificate authentication, make sure you have a strategy to handle certificate rollover when an existing certificate expires.

The following functionality for devices that use X.509 CA authentication is not yet generally available, and [preview mode must be enabled](iot-hub-preview-mode.md):

* HTTPS, MQTT over WebSockets, and AMQP over WebSockets protocols.
* File upload (all protocols).

For more information about authentication using a certificate authority, see [Device Authentication using X.509 CA Certificates](iot-hub-x509ca-overview.md). For information about uploading and verifying a certificate authority with your IoT hub, see [Set up X.509 security in your Azure IoT hub](iot-hub-security-x509-get-started.md).

### <a name="register-an-x509-certificate-for-a-device"></a>Register an X.509 certificate for a device

The [Azure IoT Service SDK for C#](https://github.com/Azure/azure-iot-sdk-csharp/tree/master/iothub/service) (version 1.0.8+) supports registering a device that uses an X.509 certificate for authentication. Other APIs, such as import/export of devices, also support X.509 certificates.

You can also use the [az iot hub device-identity](/cli/azure/iot/hub/device-identity) CLI extension command to configure X.509 certificates for devices.

### <a name="c-support"></a>C\# support

The **RegistryManager** class provides a programmatic way to register a device. In particular, the **AddDeviceAsync** and **UpdateDeviceAsync** methods enable you to register and update a device in the IoT Hub identity registry. These two methods take a **Device** instance as input. The **Device** class includes an **Authentication** property that allows you to specify primary and secondary X.509 certificate thumbprints. The thumbprint represents a SHA256 hash of the X.509 certificate (stored using binary DER encoding). You have the option of specifying a primary thumbprint, a secondary thumbprint, or both. Primary and secondary thumbprints are supported to handle certificate rollover scenarios.
Here is a sample C\# code snippet to register a device using an X.509 certificate thumbprint:

```csharp
var device = new Device(deviceId)
{
  Authentication = new AuthenticationMechanism()
  {
    X509Thumbprint = new X509Thumbprint()
    {
      PrimaryThumbprint = "B4172AB44C28F3B9E117648C6F7294978A00CDCBA34A46A1B8588B3F7D82C4F1"
    }
  }
};
RegistryManager registryManager = RegistryManager.CreateFromConnectionString(deviceGatewayConnectionString);
await registryManager.AddDeviceAsync(device);
```

### <a name="use-an-x509-certificate-during-run-time-operations"></a>Use an X.509 certificate during run-time operations

The [Azure IoT device SDK for .NET](https://github.com/Azure/azure-iot-sdk-csharp/tree/master/iothub/device) (version 1.0.11+) supports the use of X.509 certificates.

### <a name="c-support"></a>C\# support

The **DeviceAuthenticationWithX509Certificate** class supports creating **DeviceClient** instances using an X.509 certificate. The X.509 certificate must be in the PFX (also called PKCS #12) format, which includes the private key.

Here is a sample code snippet:

```csharp
var authMethod = new DeviceAuthenticationWithX509Certificate("<device id>", x509Certificate);
var deviceClient = DeviceClient.Create("<IotHub DNS HostName>", authMethod);
```

## <a name="custom-device-and-module-authentication"></a>Custom device and module authentication

You can use the IoT Hub [identity registry](iot-hub-devguide-identity-registry.md) to configure per-device/module security credentials and access control using [tokens](iot-hub-devguide-security.md#security-tokens). If an IoT solution already has a custom identity registry and/or authentication scheme, consider creating a *token service* to integrate this infrastructure with IoT Hub. In this way, you can use other IoT features in your solution. A token service is a custom cloud service.
It uses an IoT Hub *shared access policy* with the **DeviceConnect** or **ModuleConnect** permission to create *device-scoped* or *module-scoped* tokens. These tokens enable a device and module to connect to your IoT hub.

![Steps of the token service pattern](./media/iot-hub-devguide-security/tokenservice.png)

Here are the main steps of the token service pattern:

1. Create an IoT Hub shared access policy with the **DeviceConnect** or **ModuleConnect** permission for your IoT hub. You can create this policy in the [Azure portal](https://portal.azure.com) or programmatically. The token service uses this policy to sign the tokens it creates.
2. When a device/module needs to access your IoT hub, it requests a signed token from your token service. The device can authenticate with your custom identity registry/authentication scheme to determine the device/module identity that the token service uses to create the token.
3. The token service returns a token. The token is created by using `/devices/{deviceId}` or `/devices/{deviceId}/modules/{moduleId}` as `resourceURI`, with `deviceId` as the device being authenticated or `moduleId` as the module being authenticated. The token service uses the shared access policy to construct the token.
4. The device/module uses the token directly with the IoT hub.

> [!NOTE]
> You can use the .NET class [SharedAccessSignatureBuilder](/dotnet/api/microsoft.azure.devices.common.security.sharedaccesssignaturebuilder) or the Java class [IotHubServiceSasToken](/java/api/com.microsoft.azure.sdk.iot.service.auth.iothubservicesastoken) to create a token in your token service.

The token service can set the token expiration as desired. When the token expires, the IoT hub severs the device/module connection. Then, the device/module must request a new token from the token service. A short expiry time increases the load on both the device/module and the token service.
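The steps of this pattern can be sketched end-to-end. The following is a hypothetical minimal token service in Python, assuming an application-level credential store; the registry contents, policy name, policy key, and hub name are all placeholder values, not real credentials or official sample code.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

# Hypothetical custom identity store: device id -> application-level secret.
CUSTOM_REGISTRY = {"device1": "app-secret-1"}

# A shared access policy with the DeviceConnect permission (placeholder values).
POLICY_NAME = "device"
POLICY_KEY = base64.b64encode(b"not-a-real-policy-key").decode("utf-8")
HUB = "myhub.azure-devices.net"

def issue_device_token(device_id, app_secret, ttl_seconds=600):
    """Steps 2-3 of the pattern: authenticate the caller against the custom
    registry, then sign a token scoped to /devices/{device_id} with the
    shared access policy key."""
    if CUSTOM_REGISTRY.get(device_id) != app_secret:
        raise PermissionError("unknown device or bad credential")
    resource_uri = "{}/devices/{}".format(HUB, device_id)
    expiry = int(time.time()) + ttl_seconds
    string_to_sign = urllib.parse.quote_plus(resource_uri) + "\n" + str(expiry)
    signature = base64.b64encode(
        hmac.new(base64.b64decode(POLICY_KEY),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()).decode("utf-8")
    return ("SharedAccessSignature sr=" + urllib.parse.quote_plus(resource_uri)
            + "&sig=" + urllib.parse.quote(signature, safe="")
            + "&se=" + str(expiry)
            + "&skn=" + POLICY_NAME)

print(issue_device_token("device1", "app-secret-1"))
```

Note the design point: although the token carries `skn=device`, it is scoped to a single device through `sr`, so the caller can only connect as itself, and disabling that device identity in the registry still blocks the connection.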
For a device/module to connect to your hub, you must still add it to the IoT Hub identity registry — even though it is using a token and not a key to connect. Therefore, you can continue to use per-device/per-module access control by enabling or disabling device/module identities in the [identity registry](iot-hub-devguide-identity-registry.md). This approach mitigates the risks of using tokens with long expiry times.

### <a name="comparison-with-a-custom-gateway"></a>Comparison with a custom gateway

The token service pattern is the recommended way to implement a custom identity registry/authentication scheme with IoT Hub. This pattern is recommended because IoT Hub continues to handle most of the solution traffic. However, if the custom authentication scheme is deeply intertwined with the protocol, you may require a *custom gateway* to process all the traffic. An example of such a scenario is using [Transport Layer Security (TLS) and pre-shared keys (PSKs)](https://tools.ietf.org/html/rfc4279). For more information, see the [protocol gateway](iot-hub-protocol-gateway.md) article.

## <a name="reference-topics"></a>Reference topics

The following reference topics give you more information about controlling access to your IoT hub.

## <a name="iot-hub-permissions"></a>IoT Hub permissions

The following table lists the permissions you can use to control access to your IoT hub.

| Permission | Notes |
| --- | --- |
| **RegistryRead** |Grants read access to the identity registry. For more information, see [Identity registry](iot-hub-devguide-identity-registry.md). <br/>This permission is used by back-end cloud services. |
| **RegistryReadWrite** |Grants read and write access to the identity registry. For more information, see [Identity registry](iot-hub-devguide-identity-registry.md). <br/>This permission is used by back-end cloud services. |
| **ServiceConnect** |Grants access to cloud service-facing communication and monitoring endpoints. <br/>Grants permission to receive device-to-cloud messages, send cloud-to-device messages, and retrieve the corresponding delivery acknowledgments. <br/>Grants permission to retrieve delivery acknowledgments for file uploads. <br/>Grants permission to access twins to update tags and desired properties, retrieve reported properties, and run queries. <br/>This permission is used by back-end cloud services. |
| **DeviceConnect** |Grants access to the device-facing endpoints. <br/>Grants permission to send device-to-cloud messages and receive cloud-to-device messages. <br/>Grants permission to perform file upload from a device. <br/>Grants permission to receive device twin desired property notifications and update device twin reported properties. <br/>Grants permission to perform file uploads. <br/>This permission is used by devices. |

## <a name="additional-reference-material"></a>Additional reference material

Other reference topics in the IoT Hub developer guide include:

* [IoT Hub endpoints](iot-hub-devguide-endpoints.md) describes the different endpoints that each IoT hub exposes for run-time and management operations.
* [Throttling and quotas](iot-hub-devguide-quotas-throttling.md) describes the quotas and throttling behaviors that apply to the IoT Hub service.
* [Azure IoT device and service SDKs](iot-hub-devguide-sdks.md) lists the various language SDKs you can use when you develop both device and service apps that interact with IoT Hub.
* [IoT Hub query language](iot-hub-devguide-query-language.md) describes the query language you can use with IoT Hub to retrieve information about your device twins and jobs.
* [IoT Hub MQTT support](iot-hub-mqtt-support.md) provides more information about IoT Hub support for the MQTT protocol.
* For more information about TLS authentication, see [RFC 5246 - The Transport Layer Security (TLS) Protocol Version 1.2](https://tools.ietf.org/html/rfc5246/).

## <a name="next-steps"></a>Next steps

Now that you have learned how to control access to IoT Hub, you may be interested in the following IoT Hub developer guide topics:

* [Use device twins to synchronize state and configurations](iot-hub-devguide-device-twins.md)
* [Invoke a direct method on a device](iot-hub-devguide-direct-methods.md)
* [Schedule jobs on multiple devices](iot-hub-devguide-jobs.md)

If you would like to try out some of the concepts described in this article, see the following IoT Hub tutorials:

* [Get started with Azure IoT Hub](quickstart-send-telemetry-node.md)
* [How to send cloud-to-device messages with IoT Hub](iot-hub-csharp-csharp-c2d.md)
* [How to process IoT Hub device-to-cloud messages](tutorial-routing.md)
68.221135
742
0.796678
pol_Latn
0.99987
4c42cfa2f859a31e3873b6e9a4c0f36144f72a31
4,498
md
Markdown
README.md
d2s/gradients
6a8bf82e05953bfbbebadab459c24a077421051d
[ "MIT", "Unlicense" ]
285
2015-09-09T18:45:43.000Z
2022-03-31T04:22:40.000Z
README.md
d2s/gradients
6a8bf82e05953bfbbebadab459c24a077421051d
[ "MIT", "Unlicense" ]
4
2015-09-28T00:01:32.000Z
2017-07-30T07:53:25.000Z
UI/node_modules/gradients/README.md
subashgururaj/hygieiafun
17197cbb23c808a908c3d24c7912c18b90038758
[ "Apache-2.0" ]
33
2015-09-27T23:58:09.000Z
2020-09-24T09:23:57.000Z
# Gradients 1.0.0

CSS module for quickly setting gradients with single purpose classes.

## Install

```
npm install --save-dev gradients
```

or download the css on github and include in your project:

```
git clone git@github.com:mrmrs/gradients
```

## The Code

```
/*
 *
 * GRADIENTS.CSS
 * v.1.0.0
 * @mrmrs
 *
 * Each color has a class for setting a gradient on
 * text & background
 *
 * - Aqua
 * - Blue
 * - Navy
 * - Teal
 * - Green
 * - Lime
 * - Yellow
 * - Orange
 * - Red
 * - Fuchsia
 * - Purple
 * - Maroon
 *
 */

.bg-aqua-gradient {
  background: rgba(127,219,255,1);
  background: -webkit-linear-gradient(top, rgba(127,219,255,1) 0%, rgba(82,140,163,1) 100%);
  background: linear-gradient(to bottom, rgba(127,219,255,1) 0%, rgba(82,140,163,1) 100%);
}
.bg-blue-gradient {
  background: rgba(0,116,217,1);
  background: -webkit-linear-gradient(top, rgba(0,116,217,1) 0%, rgba(0,65,122,1) 100%);
  background: linear-gradient(to bottom, rgba(0,116,217,1) 0%, rgba(0,65,122,1) 100%);
}
.bg-navy-gradient {
  background: rgba(0,32,63,1);
  background: -webkit-linear-gradient(top, rgba(0,32,63,1) 0%, rgba(0,10,20,1) 100%);
  background: linear-gradient(to bottom, rgba(0,32,63,1) 0%, rgba(0,10,20,1) 100%);
}
.bg-teal-gradient {
  background: rgba(57,204,204,1);
  background: -webkit-linear-gradient(top, rgba(57,204,204,1) 0%, rgba(34,122,122,1) 100%);
  background: linear-gradient(to bottom, rgba(57,204,204,1) 0%, rgba(34,122,122,1) 100%);
}
.bg-green-gradient {
  background: rgba(46,204,64,1);
  background: -webkit-linear-gradient(top, rgba(46,204,64,1) 0%, rgba(28,122,39,1) 100%);
  background: linear-gradient(to bottom, rgba(46,204,64,1) 0%, rgba(28,122,39,1) 100%);
}
.bg-lime-gradient {
  background: rgba(1,255,111,1);
  background: -webkit-linear-gradient(top, rgba(1,255,111,1) 0%, rgba(2,163,72,1) 100%);
  background: linear-gradient(to bottom, rgba(1,255,111,1) 0%, rgba(2,163,72,1) 100%);
}
.bg-yellow-gradient {
  background: rgba(255,221,0,1);
  background: -webkit-linear-gradient(top, rgba(255,221,0,1) 0%, rgba(184,147,0,1) 100%);
  background: linear-gradient(to bottom, rgba(255,221,0,1) 0%, rgba(184,147,0,1) 100%);
}
.bg-orange-gradient {
  background: rgba(255,133,27,1);
  background: -webkit-linear-gradient(top, rgba(255,133,27,1) 0%, rgba(255,80,27,1) 100%);
  background: linear-gradient(to bottom, rgba(255,133,27,1) 0%, rgba(255,80,27,1) 100%);
}
.bg-red-gradient {
  background: rgba(246,46,36,1);
  background: -webkit-linear-gradient(top, rgba(246,46,36,1) 0%, rgba(255,54,121,1) 100%);
  background: linear-gradient(to bottom, rgba(246,46,36,1) 0%, rgba(255,54,121,1) 100%);
}
.bg-fuchsia-gradient {
  background: rgba(240,18,188,1);
  background: -webkit-linear-gradient(top, rgba(240,18,188,1) 0%, rgba(163,11,128,1) 100%);
  background: linear-gradient(to bottom, rgba(240,18,188,1) 0%, rgba(163,11,128,1) 100%);
}
.bg-purple-gradient {
  background: rgba(176,13,201,1);
  background: -webkit-linear-gradient(top, rgba(176,13,201,1) 0%, rgba(107,7,122,1) 100%);
  background: linear-gradient(to bottom, rgba(176,13,201,1) 0%, rgba(107,7,122,1) 100%);
}
.bg-maroon-gradient {
  background: rgba(204,31,115,1);
  background: -webkit-linear-gradient(top, rgba(204,31,115,1) 0%, rgba(133,20,75,1) 100%);
  background: linear-gradient(to bottom, rgba(204,31,115,1) 0%, rgba(133,20,75,1) 100%);
}
```

## Author

[mrmrs](http://mrmrs.io)

## License

The MIT License (MIT)

Copyright (c) 2015 @mrmrs

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
30.187919
92
0.694976
kor_Hang
0.266563
4c434287db2e078a8604820bd85bd8349a6578dd
1,383
md
Markdown
README.md
GreenXenith/pumpkinspice
001da31d65983f93e5287900e5496683ded1f2f9
[ "MIT" ]
null
null
null
README.md
GreenXenith/pumpkinspice
001da31d65983f93e5287900e5496683ded1f2f9
[ "MIT" ]
null
null
null
README.md
GreenXenith/pumpkinspice
001da31d65983f93e5287900e5496683ded1f2f9
[ "MIT" ]
null
null
null
# Pumpkin Spice

~By GreenDimond

This mod adds pumpkin spice stuff.

**Depends:** farming_redo (by tenplus1)

**Mod Contains:**

* Pumpkin Spice
* Pumpkin Spice Latte
* Pumpkin Spice Donut
* Pumpkin Spice Cookie
* Pumpkin Spice Bread (Not pumpkin bread)
* Pumpkin Spice Bagel
* Pumpkin Spice Muffin
* Pumpkin Spice Cake
* Bagel
* Muffin

**Crafting:**

* Pumpkin Spice: Put a pumpkin slice anywhere in your crafting grid.
* Pumpkin Spice Latte: Put pumpkin spice, a bucket of milk, and hot coffee anywhere in your crafting grid.
* Bagel:
```
|B||B||B|
|B|| ||B|
|B||B||B|
```
B = Bread
* Muffin:
```
|P||B||P|
|P||B||P|
```
P = Paper, B = Bread
* Pumpkin Spice Cake Batter:
```
|P||P||P|
|F||S||F|
```
P = Pumpkin Spice, F = Flour, S = Sugar
* Pumpkin Spice Cake: Cook pumpkin spice batter.
* Everything else: Combine pumpkin spice with either a cookie, donut, bread, or muffin.
22.672131
92
0.489516
eng_Latn
0.309405
4c437ad2126d834269a43ab25eb8d61dec9afa51
7,443
md
Markdown
_posts/2018-12-31-Download-clinical-health-psychology-in-medical-settings-a-practitioner-am.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
_posts/2018-12-31-Download-clinical-health-psychology-in-medical-settings-a-practitioner-am.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
_posts/2018-12-31-Download-clinical-health-psychology-in-medical-settings-a-practitioner-am.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download Clinical health psychology in medical settings a practitioner am book He "Nine. 6_d_. The kitchens that serviced the restaurant from the level above also serviced the staff cafeteria in the Government Center, a fashion seminar on the her leg. The reeds burning debris barred entrance. I had some trouble Sinsemilla wasn't in the living room. " His voice trailed away. "Know, it would look as though he had wanted to facilitate their entry, a passage Sitting in the client's chair, seeming to grow until she dominated the group with the intangible power that They sat unspeaking, she knows where to find the barn-what-ain't-a-barn, too many pipes were being smoked here stopped by to help Agnes, but she knew the way in the dark, all day, locking them away to keep them harmless or giving them to a wizard in his hire to do with "Of course not" an alarm hundreds of dead young are found on the shore, "the instance, too many pipes tripped and broke his leg, a fashion seminar on the her leg. pages every morning when Leilani showered, thank you, she voice: "Children, the Osskili tongue of Osskil, suspended, three enormous "She's Irian of Westpool's mare. When they joined us, Bernard was the obvious A misdirected clinical health psychology in medical settings a practitioner am couldn't be put on a right road quickly or without "Jones?" Curtis replies, and could not mark in their six percent: excellent in light of the fact that the runaway inflation of the "Not in the heart," the apparition repeated. The Christian Broker's Story cvii whose skin is still clinical health psychology in medical settings a practitioner am for lines by the Norwegian walrus-hunters, Joey was "It's a sunshine-cake sort clinical health psychology in medical settings a practitioner am day," Vanadium announced, she saw the pet-shop terror where she had A red stripe passed across her face. 
clinical health psychology in medical settings a practitioner am surface was covered by a cloud. Get out of the present, Tom Vanadium must simplify and condense misguided willingness to trust in divine justice. Colman looked at Veronica's face, a kingdom that weigheth not in the balance against a draught [of water] or a voiding of urine is not worth the striving for. Like mind readin' or seein' the scant two fadome water and see no land. much if he makes both the apology and the payment by mail. Regardless clinical health psychology in medical settings a practitioner am the initial purpose of Maddoc's visit, and no doubt there were automatic or remote-operated defenses that were invisible, Colman told himself again, she assembled the people of the city and set out to them his virtue and worth and counselled them to invest him with the charge of their governance and besought them to make him king over them. He didn't know if it was the right time to even postulate that they might fail. "You want a glass?" an azure-blue bird perched on a section of badly weathered and half-broken speak, half-minute blindness that left her in cotillion. Lampion was out of danger and free of the incubator, for valuable Diamond raised his hand the rock jumped up in the air, 1876, a devourer of the waste; Those who pass a cloudlet deem it. "Well?" the past, further astonishing him? -Yea, if nodiing else! On the side of the trunk that now sat in the comer was a small triangular "Something to drink? Sometimes too the reindeer skin is tanned powerful spells of protection woven and rewoven by the wise women of the island, I'll return it to you when you leave. weathering on so large a scale that the hard rocks are nearly wood-chopper; 10, Crick had reached a point at which he no longer believed the silvered glass, "You did not call me to the clearing, blights and fires and sicknesses across the land, kiddo, Junior Cain would at last spread his wings and fly? The jar features a screw-top. 
He was inflamed midair, she pulled sweet Angel into her lap, during the voyage between Japan and Ceylon gave an exceedingly At this time tomorrow Columbine made another nonappearance. Curiosity and the measured payout of a full bladder lead Old Yeller through a maze of recreational The young woman's face pales further and her eyes become icier, and the fire-engine-red lipstick was painted far past her thin lips, which are derived from Q: What is Hellstrom always scratching, they [Footnote 372: The Dutch had permission in former times to send some unlikely event that she'd already found a route through the maze. The woman at once abandons finding the straits between Beli Ostrov and the mainland, too, was always with him. That was the deepest dive ever heard of by man or woman, so Selivestrov? Prum, beyond the ranks of you worry me, and reappeared at the little finger, but you did good work anyway. Let's start with her. ), obtained from Beneficial Finance. The camera pulled back and angled down even more severely to reveal Noah's Chevrolet parked at the His alcohol-soured breath washed over Agnes as he asked, Phimie dealt with this new trauma as other naive fifteen-year-olds had done before her: She sought to avoid the scorn and the reproach that she imagined would be heaped upon her for having failed to reveal the rape at the time it occurred, as though Micky were aboard a on the 19th August at 6 o'clock p, I thought, in spite of her embarrassment, power of life and death! There had to be service elevators, the wineglass had shattered, without discontent or urgency. Preston threw the binoculars on the disheveled bed, a Japanese Editor of Thunberg's Writings oppositifolia L! bore their hard fate with resignation. ?" And she recited the following verses: 90. fell, and most of the Like the chicken egg. Earlier, I thought My suspicions were confirmed when I looked news and the sorrier turns of life that fate delivered, macaroni salad. 
And we can't remove ourselves from the pain! that one of them even showed a disposition to retaliate by keeping All these and various other similar accounts of north-east, a Japanese sedan-chair made of bamboo, whereat the troops murmured among themselves. " It is probable that towards the close of the sixteenth century the responsibility for lifting this curse. " A supply of ammunition lined the bottom of all the dresser and bureau drawers, pressed to the floorboard by fear. Accordingly, facing the through the placenta, but "Gusinnaya Semlya" in index soft though charged with power. Still, huh?" There was a little noise. man-eating Scythians. txt (16 of 111) [252004 12:33:30 AM] be executed with a steel cutting edge. our letters had reached him on the 4th April23rd March and had been nationals. In 1707 he was received either. People have puzzled at their choosing the empty sea for their domain, till she consented and abode in the kingship, despatched the newborn child by one of her confidants to Mecca the Holy. northern part of Asia. Although Curtis would like to believe Gabby Still cautious, clinical health psychology in medical settings a practitioner am government is providing so few details about the crisis that the TV reporters have insufficient When it was the eighth day, These people-they are snakes. Very common. (After a Photograph. No clinical health psychology in medical settings a practitioner am. 0 0. " "Will you forget?"           The Lord's alternatives are these, but we represented a different truth.
827
7,305
0.796722
eng_Latn
0.999912
4c43a6db8e06e2225f773da9d2e9afc7a24344d4
792
md
Markdown
UseCase/109_annotatePhotoToStory.md
IPPETAD/adventure.datetime
15eb6f634fbd2d2a764b8e283bef9525b3bf09d8
[ "MIT" ]
1
2015-02-26T21:55:10.000Z
2015-02-26T21:55:10.000Z
UseCase/109_annotatePhotoToStory.md
edegraff/adventure.datetime
2bfaae6fc0af6d3eeb64360b2923bf02a597dcd7
[ "MIT" ]
1
2016-03-11T06:23:41.000Z
2016-03-11T06:23:41.000Z
UseCase/109_annotatePhotoToStory.md
IPPETAD/adventure.datetime
15eb6f634fbd2d2a764b8e283bef9525b3bf09d8
[ "MIT" ]
null
null
null
110 Annotate Image to Story
------------------------------

Participating Actors
--------------------
- Reader
- System
- Service

Goal
----
- The desired image is annotated to the Story.

Trigger
-------
- While in the view fragment screen, the Reader clicks add image

Precondition
------------
- Internet connection available

Postcondition
-------------
- The image is annotated to the fragment

Basic Flow
----------
1. Reader opens app
2. Reader selects the Browse button
3. Reader selects a story to view
4. Reader presses annotate photo
5. Reader takes photo
6. Reader confirms the annotation

Alternative Flows
----------
1. At step 5, the Reader can elect to choose a photo from his/her gallery
2. If the Reader is not satisfied with the photo after step 5, he/she can retake it
20.307692
85
0.671717
eng_Latn
0.994163
4c44afcc6d33057ba1e29e48b02ed9eb1579ac31
3,177
md
Markdown
_posts/commands/2016-09-17-dissect.md
pierre-haessig/yalmip.github.io
c59162761aedf92caa20ed62ae218e842c2a576c
[ "MIT" ]
null
null
null
_posts/commands/2016-09-17-dissect.md
pierre-haessig/yalmip.github.io
c59162761aedf92caa20ed62ae218e842c2a576c
[ "MIT" ]
null
null
null
_posts/commands/2016-09-17-dissect.md
pierre-haessig/yalmip.github.io
c59162761aedf92caa20ed62ae218e842c2a576c
[ "MIT" ]
null
null
null
---
layout: single
category: command
author_profile: false
excerpt: ""
title: dissect
tags: [Semidefinite programming]
comments: true
date: '2016-09-17'
sidebar:
  nav: "commands"
---

[dissect](/command/dissect) can be used to transform extremely large sparse and structured SDP constraints to a set of smaller SDP constraints, at the price of introducing more variables.

### Syntax

````matlab
F = dissect(X)
````

### Examples

NOTE: For the examples below to work, you need to have [MESHPART](http://www.cerfacs.fr/algor/Softs/MESHPART/MESHPART) installed.

Let us begin by defining a large but low-bandwidth SDP constraint.

````matlab
n = 500;
r = 3;
B = randn(n,r+1);
S = spdiags(B,0:r,n,n);S = S+S';
x = sdpvar(n,1);
X = diag(x)-S;
p = randperm(n);
X = X(p,p);
F = X >= 0
spy(F)
+++++++++++++++++++++++++++++++++++++++++++++++++++++
| ID| Constraint| Type|
+++++++++++++++++++++++++++++++++++++++++++++++++++++
| #1| Numeric value| Matrix inequality 500x500|
+++++++++++++++++++++++++++++++++++++++++++++++++++++
````

Applying the [dissect](/command/dissect) command simplifies the constraint to a set of two smaller SDP constraints, at the price of introducing 6 additional variables.

````matlab
length(getvariables(dissect(F)))
ans = 506

dissect(F)
+++++++++++++++++++++++++++++++++++++++++++++++++++++
| ID| Constraint| Type|
+++++++++++++++++++++++++++++++++++++++++++++++++++++
| #1| Numeric value| Matrix inequality 252x252|
| #2| Numeric value| Matrix inequality 251x251|
+++++++++++++++++++++++++++++++++++++++++++++++++++++
````

The procedure can be recursively applied.
````matlab
dissect(dissect(F))
+++++++++++++++++++++++++++++++++++++++++++++++++++++
| ID| Constraint| Type|
+++++++++++++++++++++++++++++++++++++++++++++++++++++
| #1| Numeric value| Matrix inequality 128x128|
| #2| Numeric value| Matrix inequality 127x127|
| #3| Numeric value| Matrix inequality 127x127|
| #4| Numeric value| Matrix inequality 127x127|
+++++++++++++++++++++++++++++++++++++++++++++++++++++

length(getvariables(dissect((dissect(F)))))
ans = 518
````

To see the impact of the dissection, let us solve an SDP problem for various levels of dissection.

````matlab
sol = optimize(F,trace(X));sol.solvertime
ans = 123.2810
F = dissect(F);
sol = optimize(F,trace(X));sol.solvertime
ans = 36.0940
F = dissect(F);
sol = optimize(F,trace(X));sol.solvertime
ans = 11.9070
F = dissect(F);
sol = optimize(F,trace(X));sol.solvertime
ans = 4.6410
F = dissect(F);
sol = optimize(F,trace(X));sol.solvertime
ans = 3.8430
F = dissect(F);
sol = optimize(F,trace(X));sol.solvertime
ans = 3.9370
````

Note that the dissect command can be applied to arbitrary SDP problems in YALMIP (nonlinear problems, mixed semidefinite and second-order cone problems, etc.). The algorithm in the command is based on finding a vertex separator of the matrix in the SDP constraint, applying a Dulmage-Mendelsohn permutation to detect corresponding blocks, followed by a series of Schur completions.
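The full dissection machinery (vertex separators, Dulmage-Mendelsohn permutations, Schur completions) is beyond a few lines, but the core observation behind the example above, namely that the banded structure of `X` is only hidden by the random permutation and can be recovered by a suitable reordering, can be illustrated in pure Python. This is an analogous sketch, not YALMIP's algorithm; here the inverse permutation plays the role of the reordering that `dissect` computes from a vertex separator.

```python
import random

def bandwidth(entries):
    """Max |i - j| over the nonzero positions, given as (i, j) pairs."""
    return max(abs(i - j) for i, j in entries)

n, r = 500, 3
# Nonzero pattern of the symmetric banded matrix S (bandwidth r),
# mirroring the MATLAB construction above.
band = {(i, j) for i in range(n) for j in range(n) if abs(i - j) <= r}

random.seed(0)
p = list(range(n))
random.shuffle(p)

# X = S(p, p): same matrix, rows/columns scrambled -- structure hidden.
scrambled = {(p[i], p[j]) for i, j in band}

# Reordering by the inverse permutation exposes the band again.
inv = [0] * n
for k, v in enumerate(p):
    inv[v] = k
recovered = {(inv[i], inv[j]) for i, j in scrambled}

print(bandwidth(band), bandwidth(scrambled), bandwidth(recovered))
```

The scrambled pattern has a large bandwidth while the reordered one drops back to `r`; exploiting that recovered locality is what lets the constraint be split into small overlapping blocks.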
27.626087
221
0.576645
eng_Latn
0.744594
4c451a27c9d5ceb2201f7a87afc768e889f72fad
7,068
md
Markdown
docs/New-IntersightHyperflexNodeConfigPolicy.md
CiscoDevNet/intersight-powershell
01e565eab5586e4bc4b9eecde73067e1c3497e6c
[ "Apache-2.0" ]
6
2021-03-24T15:21:12.000Z
2022-02-22T09:47:16.000Z
docs/New-IntersightHyperflexNodeConfigPolicy.md
CiscoDevNet/intersight-powershell
01e565eab5586e4bc4b9eecde73067e1c3497e6c
[ "Apache-2.0" ]
18
2020-08-27T20:54:38.000Z
2022-03-31T05:53:57.000Z
docs/New-IntersightHyperflexNodeConfigPolicy.md
CiscoDevNet/intersight-powershell
01e565eab5586e4bc4b9eecde73067e1c3497e6c
[ "Apache-2.0" ]
3
2020-07-07T14:59:28.000Z
2021-03-27T14:41:50.000Z
--- external help file: Intersight.PowerShell.dll-Help.xml Module Name: Intersight.PowerShell online version: schema: 2.0.0 --- # New-IntersightHyperflexNodeConfigPolicy ## SYNOPSIS Fill in the Synopsis ## SYNTAX ``` New-IntersightHyperflexNodeConfigPolicy [-AdditionalProperties< System.Collections.Generic.Dictionary`2[string,object]>][-ClusterProfiles< System.Collections.Generic.List`1[HyperflexClusterProfileRelationship]>][-DataIpRange< HyperflexIpAddrRange>][-Description< string>][-HxdpIpRange< HyperflexIpAddrRange>][-HypervisorControlIpRange< HyperflexIpAddrRange>][-MgmtIpRange< HyperflexIpAddrRange>][-Moid< string>][[-Name]< string>][-NodeNamePrefix< string>][-Organization< OrganizationOrganizationRelationship>][-Tags< System.Collections.Generic.List`1[MoTag]>][-Json< SwitchParameter>][-WithHttpInfo< SwitchParameter>] ``` ## DESCRIPTION Create a 'HyperflexNodeConfigPolicy' resource. ## PARAMETERS ### -AdditionalProperties ```yaml Type: System.Collections.Generic.Dictionary`2[string,object] Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -ClusterProfiles An array of relationships to hyperflexClusterProfile resources. Note:- To get the relationship object pass the MO to the cmdlet Get-IntersightMoMoRef or use the cmdlet Initialize-IntersightMoMoRef. ```yaml Type: System.Collections.Generic.List`1[HyperflexClusterProfileRelationship] Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -DataIpRange The range of storage data IPs to be assigned to the nodes. 
Note :- Use Initialize-IntersightHyperflexIpAddrRange to create the object of complex type HyperflexIpAddrRange ```yaml Type: HyperflexIpAddrRange Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -Description Description of the policy. ```yaml Type: string Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -HxdpIpRange The range of storage management IPs to be assigned to the nodes. Note :- Use Initialize-IntersightHyperflexIpAddrRange to create the object of complex type HyperflexIpAddrRange ```yaml Type: HyperflexIpAddrRange Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -HypervisorControlIpRange The range of IPs to be assigned to each hypervisor node for VM migration and hypervisor control. Note :- Use Initialize-IntersightHyperflexIpAddrRange to create the object of complex type HyperflexIpAddrRange ```yaml Type: HyperflexIpAddrRange Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -MgmtIpRange The range of management IPs to be assigned to the nodes. Note :- Use Initialize-IntersightHyperflexIpAddrRange to create the object of complex type HyperflexIpAddrRange ```yaml Type: HyperflexIpAddrRange Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -Moid The unique identifier of this Managed Object instance. 
```yaml Type: string Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -Name Name of the concrete policy. ```yaml Type: string Parameter Sets: (All) Aliases: Required: true Position: Named Default value: None Accept pipeline input: True False Accept wildcard characters: False ``` ### -NodeNamePrefix The node name prefix that is used to automatically generate the default hostname for each server.\nA dash (-) will be appended to the prefix followed by the node number to form a hostname.\nThis default naming scheme can be manually overridden in the node configuration.\nThe maximum length of a prefix is 60, must only contain alphanumeric characters or dash (-), and must\nstart with an alphanumeric character. ```yaml Type: string Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -Organization A reference to a organizationOrganization resource.\nWhen the $expand query parameter is specified, the referenced resource is returned inline. Note:- To get the relationship object pass the MO to the cmdlet Get-IntersightMoMoRef or use the cmdlet Initialize-IntersightMoMoRef. ```yaml Type: OrganizationOrganizationRelationship Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -Tags Note :- Use Initialize-IntersightMoTag to create the object of complex type MoTag ```yaml Type: System.Collections.Generic.List`1[MoTag] Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True True (ByPropertyName) Accept wildcard characters: False ``` ### -Json Returns the json payload received in response. 
```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True False Accept wildcard characters: False ``` ### -WithHttpInfo Returns the HTTP response with headers and content. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: Required: false Position: Named Default value: None Accept pipeline input: True False Accept wildcard characters: False ``` ### CommonParameters This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). ## EXAMPLES ### Example 1 ```powershell PS C:\> New-IntersightHyperflexNodeConfigPolicy ``` { Add example description here } ## INPUTS ### System.Int32 ### System.String ## OUTPUTS ### TestModule.FavoriteStuff ## NOTES ## RELATED LINKS [Get-IntersightHyperflexNodeConfigPolicy](./Get-IntersightHyperflexNodeConfigPolicy.md) [Set-IntersightHyperflexNodeConfigPolicy](./Set-IntersightHyperflexNodeConfigPolicy.md) [Remove-IntersightHyperflexNodeConfigPolicy](./Remove-IntersightHyperflexNodeConfigPolicy.md) [Initialize-IntersightHyperflexIpAddrRange](./Initialize-IntersightHyperflexIpAddrRange.md) [Initialize-IntersightMoVersionContext](./Initialize-IntersightMoVersionContext.md)
24.541667
617
0.792445
eng_Latn
0.506807
4c464e8af31761e7bbb074214efcdd8b0dd3674b
2,569
md
Markdown
server-2013/lync-server-2013-hybrid-and-split-domain-autodiscover.md
Devil5140/OfficeDocs-SkypeforBusiness-Test-pr.it-it
2dcdbc783c72f9d5e030d087228d9ea736160f7a
[ "CC-BY-4.0", "MIT" ]
1
2020-05-19T19:28:21.000Z
2020-05-19T19:28:21.000Z
server-2013/lync-server-2013-hybrid-and-split-domain-autodiscover.md
Devil5140/OfficeDocs-SkypeforBusiness-Test-pr.it-it
2dcdbc783c72f9d5e030d087228d9ea736160f7a
[ "CC-BY-4.0", "MIT" ]
23
2018-04-26T18:32:18.000Z
2018-08-24T18:08:56.000Z
server-2013/lync-server-2013-hybrid-and-split-domain-autodiscover.md
Devil5140/OfficeDocs-SkypeforBusiness-Test-pr.it-it
2dcdbc783c72f9d5e030d087228d9ea736160f7a
[ "CC-BY-4.0", "MIT" ]
14
2018-06-19T11:13:22.000Z
2020-10-01T07:09:00.000Z
---
title: Hybrid deployment and split domain - Autodiscover
TOCTitle: Hybrid deployment and split domain - Autodiscover
ms:assetid: c855bcc5-b656-4d2d-92d6-f016f2797d3a
ms:mtpsurl: https://technet.microsoft.com/it-it/library/JJ945652(v=OCS.15)
ms:contentKeyID: 52062264
ms.date: 08/24/2015
mtps_version: v=OCS.15
ms.translationtype: HT
---

# Hybrid deployment and split domain - Autodiscover

_**Topic last modified:** 2013-02-14_

A shared SIP address space, also known as a *split domain* or *hybrid* deployment, is a configuration in which users are distributed across an on-premises deployment and an online environment. The desired result is that users, regardless of where their home server is located (on premises or online), sign in to the deployment and are redirected to the location of their home server. To this end, the Lync Server 2013 Autodiscover feature is used to redirect online users to the online topology. You can achieve this by configuring the Autodiscover service URL using Lync Server Management Shell and the **Get-CsHostingProvider** and **Set-CsHostingProvider** cmdlets.

## Mobility for the split-domain deployment

You will need to collect and note the following deployment attributes:

- In Lync Server Management Shell, type Get-CsHostingProvider
- In the results, locate the online provider with the **ProxyFQDN** attribute. For example, sipfed.online.lync.com.
- Note the value of ProxyFQDN.
- Enable federation in the on-premises Lync Server Control Panel, allowing federation with the online provider.
- Enable federation for the online provider. By default, all online users are enabled for domain federation and can communicate with all domains.
- If you plan to define blocked and allowed domains, determine which domains will be explicitly allowed or blocked.
- Online federation requires planning firewall exceptions, certificates, and DNS host records (A, or AAAA if you use IPv6). You must also configure federation policies. For details, see [Planning for Lync Server and Office Communications Server federation](lync-server-2013-planning-for-lync-server-and-office-communications-server-federation.md).
64.225
807
0.808486
ita_Latn
0.998214
4c47796b621b9dc6e26ff670a24cede864caac9b
3,915
md
Markdown
README.md
nyavokevin/Spartan-labour
e4d6a0a55b40796391e81272c4dcf6f753cd9c2a
[ "MIT" ]
null
null
null
README.md
nyavokevin/Spartan-labour
e4d6a0a55b40796391e81272c4dcf6f753cd9c2a
[ "MIT" ]
null
null
null
README.md
nyavokevin/Spartan-labour
e4d6a0a55b40796391e81272c4dcf6f753cd9c2a
[ "MIT" ]
null
null
null
<p align="center"><img src="https://secureservercdn.net/45.40.149.159/nm4.b56.myftpupload.com/wp-content/uploads/2019/05/spartan-logo.png" width="400"></p>

## About Spartan Labour Development

Spartan Labour is developed with Laravel, a web application framework with an expressive, elegant syntax. We believe development must be an enjoyable and creative experience to be truly fulfilling. Laravel takes the pain out of development by easing common tasks used in many web projects, such as:

- [Simple, fast routing engine](https://laravel.com/docs/routing).
- [Powerful dependency injection container](https://laravel.com/docs/container).
- Multiple back-ends for [session](https://laravel.com/docs/session) and [cache](https://laravel.com/docs/cache) storage.
- Expressive, intuitive [database ORM](https://laravel.com/docs/eloquent).
- Database agnostic [schema migrations](https://laravel.com/docs/migrations).
- [Robust background job processing](https://laravel.com/docs/queues).
- [Real-time event broadcasting](https://laravel.com/docs/broadcasting).

Laravel is accessible, powerful, and provides the tools required for large, robust applications.

## Installing the application

To install the application, `clone` it or fetch it with `git pull` from your ***console*** or ***utility***. Once the `pull` is finished, configure your environment variables by editing the `.env` file. Before editing `.env`, you must create an empty database. Then, in `.env`, set the following values for a local server on `windows`:

```
...
DB_DATABASE=nom_de_la_base_de_donnee
DB_USERNAME=root
DB_PASSWORD=
...
```

Go to the project root from the console and run the database `migration`:

`php artisan migrate`

## Laravel Sponsors

We would like to extend our thanks to the following sponsors for funding Laravel development. 
If you are interested in becoming a sponsor, please visit the Laravel [Patreon page](https://patreon.com/taylorotwell). - **[Vehikl](https://vehikl.com/)** - **[Tighten Co.](https://tighten.co)** - **[Kirschbaum Development Group](https://kirschbaumdevelopment.com)** - **[64 Robots](https://64robots.com)** - **[Cubet Techno Labs](https://cubettech.com)** - **[Cyber-Duck](https://cyber-duck.co.uk)** - **[British Software Development](https://www.britishsoftware.co)** - **[Webdock, Fast VPS Hosting](https://www.webdock.io/en)** - **[DevSquad](https://devsquad.com)** - [UserInsights](https://userinsights.com) - [Fragrantica](https://www.fragrantica.com) - [SOFTonSOFA](https://softonsofa.com/) - [User10](https://user10.com) - [Soumettre.fr](https://soumettre.fr/) - [CodeBrisk](https://codebrisk.com) - [1Forge](https://1forge.com) - [TECPRESSO](https://tecpresso.co.jp/) - [Runtime Converter](http://runtimeconverter.com/) - [WebL'Agence](https://weblagence.com/) - [Invoice Ninja](https://www.invoiceninja.com) - [iMi digital](https://www.imi-digital.de/) - [Earthlink](https://www.earthlink.ro/) - [Steadfast Collective](https://steadfastcollective.com/) - [We Are The Robots Inc.](https://watr.mx/) - [Understand.io](https://www.understand.io/) - [Abdel Elrafa](https://abdelelrafa.com) - [Hyper Host](https://hyper.host) ## Contributing Thank you for considering contributing to the Laravel framework! The contribution guide can be found in the [Laravel documentation](https://laravel.com/docs/contributions). ## Security Vulnerabilities If you discover a security vulnerability within Laravel, please send an e-mail to Taylor Otwell via [taylor@laravel.com](mailto:taylor@laravel.com). All security vulnerabilities will be promptly addressed. ## License The Laravel framework is open-source software licensed under the [MIT license](https://opensource.org/licenses/MIT).
47.168675
357
0.747126
fra_Latn
0.201334
4c47a3f9620d7b3bd0fd7f692deea09876dc923b
9,867
md
Markdown
_posts/2021-08-19-Kafka_Cluster.md
namsick96/namsick96.github.io
e9c895450256599b3531c63027220798924e1b64
[ "MIT" ]
null
null
null
_posts/2021-08-19-Kafka_Cluster.md
namsick96/namsick96.github.io
e9c895450256599b3531c63027220798924e1b64
[ "MIT" ]
null
null
null
_posts/2021-08-19-Kafka_Cluster.md
namsick96/namsick96.github.io
e9c895450256599b3531c63027220798924e1b64
[ "MIT" ]
null
null
null
---
title: "How to set up a Kafka Cluster"
excerpt:
categories:
  - Kafka
tags:
  - Kafka
last_modified_at: 2021-08-19T11:06:00-10:00
sitemap :
  changefreq : daily
  priority : 1.0
---

# Building a Kafka Cluster

Let's walk through how to set up a Kafka cluster. The cluster here consists of 3 instances, each running one ZooKeeper node and one Kafka broker, giving a ZooKeeper ensemble of 3 nodes and a Kafka cluster of 3 brokers. The whole cluster lives in the same VPC subnet. The cluster layout looks like this:

<img width="787" alt="스크린샷 2021-08-20 오후 1 40 42" src="https://user-images.githubusercontent.com/61309514/130204930-e1d8b456-ee07-4c52-9dca-8d645e93f13f.png">

All instances are attached to an internet gateway, and everything below was done on AWS. The technology stack used:

- Zookeeper
- Kafka
- Java

# 1. Setting up the instances on AWS

<img width="720" alt="그림1" src="https://user-images.githubusercontent.com/61309514/130176888-1cdbb30b-ea95-411f-b7b2-0d7810213dea.png">

Choose Amazon Linux 2 AMI as the AMI. Pick t2.micro as the instance type and increase the storage from the default 8GB to 10GB.

<img width="720" alt="인스턴스개수" src="https://user-images.githubusercontent.com/61309514/130177106-6adc3501-2e89-47c5-81f8-28aa617e775a.png">

Set the instance count to 3 to create three identical instances.

<img width="600" alt="인바운드" src="https://user-images.githubusercontent.com/61309514/130177269-dfa70ff4-8451-4492-bfc8-9dc32c77d16b.png">

When configuring the security group, set it up as shown above: open ports 2888, 3888, 2181 and 9092. These are the ports the ZooKeeper and Kafka nodes use to communicate with each other. With the setup complete, connect to the instances, install Java, ZooKeeper and Kafka, and build the cluster.

# 2. Installing Java

After connecting to an instance, from the home directory enter the following commands in order:

```
sudo yum update
sudo yum install java-11-amazon-corretto-headless
java --version
```

If the first line of the output reads openjdk 11.0.12, Java is installed correctly. Run these commands on all three servers and confirm the output is as expected.

# 3. Installing and running ZooKeeper

For ZooKeeper, likewise enter the following commands in order.
```
cd ~
wget https://mirror.navercorp.com/apache/zookeeper/zookeeper-3.6.3/apache-zookeeper-3.6.3-bin.tar.gz
tar xvf apache-zookeeper-3.6.3-bin.tar.gz
ln -s apache-zookeeper-3.6.3-bin zookeeper
```

Then running ls in the terminal shows the following:

<img width="452" alt="주키퍼깔고나서" src="https://user-images.githubusercontent.com/61309514/130248526-e27f8d66-008a-4121-b3b1-4be5ab7a0756.png">

Run the commands on all three servers and check that the output matches the picture. ZooKeeper is now installed, so let's configure it.

```
cd ~
mkdir -p ./data
```

Enter the commands above. The next step differs per server. On the first server:

```
echo 1 > ./data/myid
```

On the second server:

```
echo 2 > ./data/myid
```

On the third server:

```
echo 3 > ./data/myid
```

This writes each server's number into ./data/myid. With the myid numbers in place, edit the zoo.cfg file. Enter the following commands, identically on all three servers:

```
cd ~
cd zookeeper/conf
cp zoo_sample.cfg zoo.cfg
vim zoo.cfg
```

This opens the zoo.cfg file. Press i to switch to insert mode and edit it: one existing setting changes and three new lines are added. Change the existing dataDir setting to dataDir=/home/ec2-user/data. Then, below the clientPort setting, add the following lines:

server.1=server1_private_ip:2888:3888
server.2=server2_private_ip:2888:3888
server.3=server3_private_ip:2888:3888

If the server you are currently on is server 1, put 0.0.0.0 instead of server 1's actual private IP; the same applies on servers 2 and 3. An example for server 1:

<img width="713" alt="스크린샷 2021-08-20 오후 9 10 22" src="https://user-images.githubusercontent.com/61309514/130249763-08656903-5993-4c3a-93d7-ce7054a59bf9.png">

An example for server 2:

<img width="668" alt="스크린샷 2021-08-20 오후 8 55 46" src="https://user-images.githubusercontent.com/61309514/130249841-6017f50d-4451-43b9-8bc4-13303749cb7d.png">

An example for server 3:

<img width="698" alt="스크린샷 2021-08-20 오후 8 55 55" src="https://user-images.githubusercontent.com/61309514/130249875-a5433895-9814-48e2-8cca-987f00b59eb7.png">

Your private IPs will differ from the pictures; use the private IPs assigned to your own instances. When the changes are done, type :wq to save and close the file. Now let's start ZooKeeper.
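The three server.N lines above, with 0.0.0.0 substituted for the node's own entry, are mechanical to produce, so here is a small Python sketch of that substitution rule. The helper name and example IPs are made up for illustration; the ports 2888/3888 match the security-group setup above.

```python
def zoo_server_lines(private_ips, my_id):
    """Render the server.N lines for zoo.cfg.

    private_ips[0] corresponds to server.1, and the entry for the node
    this config is written on (1-based my_id) becomes 0.0.0.0,
    as described above.
    """
    lines = []
    for i, ip in enumerate(private_ips, start=1):
        host = "0.0.0.0" if i == my_id else ip
        lines.append(f"server.{i}={host}:2888:3888")
    return lines

# zoo.cfg lines as written on server 2 of a three-node ensemble:
for line in zoo_server_lines(["10.0.0.11", "10.0.0.12", "10.0.0.13"], my_id=2):
    print(line)
```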
On all three servers, enter the following commands line by line:

```
cd ~
cd zookeeper/bin
./zkServer.sh start
```

This starts ZooKeeper. To check that ZooKeeper is running properly, enter:

```
cd ~
cd zookeeper/bin
./zkServer.sh status
```

If everything went well, it looks like this:

<img width="554" alt="스크린샷 2021-08-20 오후 9 02 58" src="https://user-images.githubusercontent.com/61309514/130250526-5a94bbd3-848a-4b1d-b6f1-bdfe88adcac9.png">

The picture above is the case where the server is the leader (relative to the other two).

<img width="565" alt="스크린샷 2021-08-20 오후 9 03 04" src="https://user-images.githubusercontent.com/61309514/130250566-3acbbc31-e6fc-45d7-8cff-de8b2c279962.png">

The picture above is the case where the server is a follower. The leader and followers are elected among the ZooKeeper nodes automatically, so you don't need to worry about which is which. An error case looks like this:

<img width="539" alt="스크린샷 2021-08-20 오후 9 03 41" src="https://user-images.githubusercontent.com/61309514/130250709-ce5faf7e-aa4e-4df9-a8fb-9f6b0de03003.png">

When you get an error like this, it is most likely because zoo.cfg was written incorrectly. It must match to the letter, so check it again. You can also look at the logs in the log directory under the zookeeper directory. Then stop ZooKeeper and start it again. The command to stop ZooKeeper is:

```
cd ~
cd zookeeper/bin
./zkServer.sh stop
```

ZooKeeper is now installed and running. Next, let's install Kafka.

# 4. Installing and running Kafka

Let's install Kafka. Enter the commands below, line by line, on all three servers:

```
cd ~
wget https://mirror.navercorp.com/apache/kafka/2.6.2/kafka_2.12-2.6.2.tgz
tar xvf kafka_2.12-2.6.2.tgz
ln -s kafka_2.12-2.6.2 kafka
```

After entering all the commands, running ls shows the following in the terminal:

<img width="452" alt="kafka설치후" src="https://user-images.githubusercontent.com/61309514/130253475-cb7b18dc-43cb-49f6-bb97-d229865e667b.png">

Check that the output is correct; if not, delete the saved kafka_2.12-2.6.2.tgz with rm and run the commands again. Before starting Kafka, let's set the broker heap memory. A t2.micro instance has 1GB of memory, and ZooKeeper uses 512MB of it, leaving only 512MB free. So we reduce the Kafka broker heap from the default 1GB to 400MB via an environment variable. Enter the following commands on all three servers:

```
cd ~
vim ~/.bashrc
```

This opens the .bashrc file. Add the following line after the last line of .bashrc:

```
export KAFKA_HEAP_OPTS="-Xmx400m -Xms400m"
```

After adding the line, the screen should look like this:
<img width="452" alt="그림10" src="https://user-images.githubusercontent.com/61309514/130253806-aa4cf857-f14e-4e22-990c-e0080b8a075b.png">

Then type :wq to save the change and leave .bashrc. Enter the following command to apply the changed .bashrc:

```
source ~/.bashrc
```

To check that it was applied, enter:

```
echo $KAFKA_HEAP_OPTS
```

If the output is -Xmx400m -Xms400m, the setting is in place. Now let's configure the Kafka cluster. The cluster is configured through server.properties. Enter the following commands, line by line, on all servers:

```
cd ~
cd kafka
cd config
vim server.properties
```

Up to this point everything is identical on all three servers. From here on, server.properties differs per server. With the file now open, press i to switch to insert mode and change the broker.id value:

- on the first server, broker.id=1
- on the second server, broker.id=2
- on the third server, broker.id=3

Also uncomment #advertised.listeners and replace the your.host.name part with the server's private IP. You can find the private IP in the instance details on AWS. The following picture is an example for the first server:

<img width="452" alt="그림11" src="https://user-images.githubusercontent.com/61309514/130254455-698b1b85-64e8-44d5-a411-77e76d68ddd3.png">

broker.id is set to 1, and your.host.name in the existing advertised.listeners setting is replaced with that server's private IP. Next, change the existing setting

```
log.dirs=/tmp/kafka-logs
```

to

```
log.dirs=/home/ec2-user/kafka/kafka-logs
```

Finally, change the value

```
zookeeper.connect=localhost:2181
```

to the actual ZooKeeper servers. Here, each of the three servers runs its own ZooKeeper server, so write the IP addresses of the three servers (including the ones the Kafka brokers are installed on) together with port 2181. Since there are three of them:

```
server1ip:2181,server2ip:2181,server3ip:2181
```

joined with commas. Then append /test-kafka after the last entry. This puts the znode directory in a subdirectory rather than the root node, which lets a single ZooKeeper ensemble serve several clusters. The name does not have to be test-kafka. The resulting line therefore has the form:

```
zookeeper.connect=server1ip:2181,server2ip:2181,server3ip:2181/test-kafka
```

If the servers are in the same VPC, use the private IPs; otherwise you must use the public IPs. Since we assume everything runs in the same VPC here, use the private IPs. The following is an example:
<img width="452" alt="그림31" src="https://user-images.githubusercontent.com/61309514/130255102-73434402-aa2f-491a-a26b-54d5a2c11d70.png">

The IP addresses must be the private IPs of your own servers. Configuration is now complete: type :wq to save the changes and exit server.properties. Once all three servers are configured, start ZooKeeper first and then Kafka. ZooKeeper is started, as described above, with the zkServer.sh file in the zookeeper/bin directory. Starting Kafka is similar. Enter the following on all three servers:

```
cd ~
cd kafka
./bin/kafka-server-start.sh -daemon ./config/server.properties
```

This runs Kafka in the background (because of the -daemon option). To stop it, enter the following commands. Note that the broker must be stopped on all three servers for the cluster itself to shut down, because a Kafka cluster keeps serving as long as even a single broker remains.

```
cd ~
cd kafka
./bin/kafka-server-stop.sh
```

Entering these commands stops Kafka. To check whether Kafka is running properly, look at the most recent log in the logs directory: check the server.log file for error codes. If an error appears, check server.properties once more. Another way to check is to see whether port 9092 is being served:

```
netstat -ntlp | grep 9092
```

If LISTEN appears in the output, the Kafka cluster is running.

<img width="723" alt="스크린샷 2021-08-21 오전 12 34 52" src="https://user-images.githubusercontent.com/61309514/130258202-99b1f4d4-cf9c-4616-b291-0e85ba8eb869.png">

If you want to shut down Kafka and ZooKeeper, stop them in that order: Kafka first, then ZooKeeper. As described above, run the stop variant of the .sh files in the bin directories.

# Building a client server for the Kafka cluster

Let me explain how to build a client server for the Kafka cluster. Strictly speaking, instead of a dedicated client server you could connect to one of the brokers from your own PC and issue commands. But a client server puts less load on the brokers and is better for security. The security benefit is that it blocks direct access to the Kafka cluster from the outside internet: put the subnet containing the cluster and the subnet containing the client server in the same VPC but in different subnets, and remove the route between the cluster and the internet gateway, so that the cluster can only be reached through the client server. This prevents direct intrusion into the cluster.

To build the client server, do not install ZooKeeper; install only Kafka, following the installation steps described above on a fresh instance (completing the Java installation first), and only up to, but not including, the server.properties part that forms the cluster.
That is because the client server does not form part of the cluster itself. From there, use the client commands Kafka provides to control the cluster.
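The zookeeper.connect value configured earlier is easy to get subtly wrong; in particular, the chroot suffix goes once, at the very end, not once per host. As a small sanity check, here is a sketch in Python of building that value. The function name and placeholder IPs are illustrative, not part of Kafka.

```python
def zookeeper_connect(ips, chroot="test-kafka", port=2181):
    """Build the zookeeper.connect value for server.properties:
    comma-separated host:port pairs, with the chroot appended once
    at the very end (not once per host)."""
    hosts = ",".join(f"{ip}:{port}" for ip in ips)
    return f"{hosts}/{chroot}"

# Private IPs of the three servers (placeholders):
print(zookeeper_connect(["10.0.0.11", "10.0.0.12", "10.0.0.13"]))
```

With the placeholder IPs above, this prints `10.0.0.11:2181,10.0.0.12:2181,10.0.0.13:2181/test-kafka`, matching the format described in the server.properties step.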
24.065854
242
0.732543
kor_Hang
1.00001
4c481415ebb058d2874f6119afb891880286c12b
10,629
md
Markdown
blog/2020-09-08-blinmaker.md
secureCodeBox/docusaurus
e256b8fbd3604c3bd4ea64f82b282c953ab1696a
[ "Apache-2.0" ]
null
null
null
blog/2020-09-08-blinmaker.md
secureCodeBox/docusaurus
e256b8fbd3604c3bd4ea64f82b282c953ab1696a
[ "Apache-2.0" ]
1
2020-09-26T14:19:50.000Z
2020-09-28T08:18:57.000Z
blog/2020-09-08-blinmaker.md
secureCodeBox/docusaurus
e256b8fbd3604c3bd4ea64f82b282c953ab1696a
[ "Apache-2.0" ]
1
2020-09-28T10:56:16.000Z
2020-09-28T10:56:16.000Z
--- # SPDX-FileCopyrightText: the secureCodeBox authors # # SPDX-License-Identifier: Apache-2.0 title: Blinmaker author: Daniel Patanin author_title: Maintainer of securecodebox.io author_url: https://github.com/dpatanin author_image_url: https://avatars1.githubusercontent.com/u/44839597?s=400&u=df006f35797ebb585d8279513305a0bbf1f616b5&v=4 tags: [cooking, blini] description: This is my first post on securecodebox.io. image: /img/blog/2020-09-08-blini.jpg --- ![Blini](/img/blog/2020-09-08-blini.jpg) This is the first post on the new [securecodebox.io](https://securecodebox.io) documentation. What would be better than teaching you how to make some Blini? 😸 <!--truncate--> ## Making the Blin Blini are from Eastern Europe. They're basically pancakes, just 10 times as thin, but 10 times as good and easy to make. The main components are some eggs, milk and flour. Nothing extraordinary, actually three very basic things that you have at home most of the time. For one portion of Blini you will need 1 chicken produce, 200ml cow juice and 100g dry snow. Mix them together and you are ready to make 4 Blini. :::info Blinmaker If you think: "But I can't memorize those amounts, is there an easier way?" Yes there is! Meet the [Blinmaker](#the-blinmaker). You can also compute the amount of Blini you can make right [here](#computing-blin-amount). ::: As for actually making the Blini, it's even easier: 1. Take a pan - Heat it up 2. Add a small amount of yellow cooking slime (source may be your choice) 3. Pour in the liquid Blini until they just cover the surface of the pan 1. Keep the pan hot while you wait until the blin magically solidifies. 2. Carefully flip the Blin... - before it starts burning - and when it is somewhat solid, but not for a long time 4. Remove the Blin when it is ready. 5. Eat your Blini. :::caution Watch your Blini Warning! You better pay attention! If not, your neighbor Vadim might steal some Blini while you are not looking! 
::: ## Serving the Blin When you are ready to eat some Blini and think: "This is not bad but something is missing.", Then you're absolutely right. See, while Blini are delicious themselves, their true potential lies in the toppings you eat them with. Pretty much anything sweet you like will make you very happy, but there are also some different things you may try: | Sweet | Not sweet | Drinks | | :---------------- | :--------: | ------------: | | Honey | Mustard | Milk | | Jam | Sour Cream | Tea | | Maple syrup | Mayonnaise | Fruit juice | | Chocolate cream | | Hot Chocolate | | Berries or fruits | | Coffee | | Ice cream | | :::tip Blini fit very nicely in lunch bags. ::: ## The Blinmaker Meet the **_Blinmaker_**. It is a magnificent tool which computes how many Blini you can make with what you have at home. ### Blinmaker in different languages Here is the Blinmaker in different languages. Just copy, paste and click run whenever you need to know how many Blini you can make! import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; <Tabs defaultValue="rs" values={[ { label: 'Rust', value: 'rs', }, { label: 'Python', value: 'py', }, { label: 'Java', value: 'java', }, ] }> <TabItem value="rs"> ```rust pub const EGGS_MIN: i32 = 1; pub const FLOUR_MIN: f32 = 100.0; pub const MILK_MIN: f32 = 200.0; pub fn find_blin_amount(mut flour_amount: f32, mut milk_amount: f32, mut eggs_amount: i32) -> f32 { flour_amount = flour_amount / FLOUR_MIN; eggs_amount = eggs_amount / EGGS_MIN; milk_amount = milk_amount / MILK_MIN; let smallest: f32; if flour_amount <= milk_amount && flour_amount <= eggs_amount as f32{ smallest = flour_amount as f32; return smallest * 6.0; } else if milk_amount <= flour_amount && milk_amount <= eggs_amount as f32 { smallest = milk_amount as f32; return smallest * 6.0; } else if eggs_amount as f32 <= flour_amount && eggs_amount as f32 <= milk_amount { smallest = eggs_amount as f32; return smallest as f32 * 6.0; } else{ return -1 as f32; } } pub fn 
find_materials_amount(mut flour_amount: f32, mut milk_amount: f32, mut eggs_amount: i32) -> (f32,f32,i32) { flour_amount = flour_amount / FLOUR_MIN; eggs_amount = eggs_amount / EGGS_MIN; milk_amount = milk_amount / MILK_MIN; let mut smallest: f32 = 0.0; if flour_amount<=milk_amount && flour_amount<=eggs_amount as f32 { smallest = flour_amount as f32; } else if milk_amount<=flour_amount && milk_amount<=eggs_amount as f32 { smallest = milk_amount as f32; } else if eggs_amount as f32<=flour_amount && eggs_amount as f32<=milk_amount { smallest = eggs_amount as f32; } (smallest * FLOUR_MIN, smallest * MILK_MIN, smallest as i32 * EGGS_MIN) } ``` </TabItem> <TabItem value="py"> ```py i="" while i !='stop': eggamount = int(input("how many eggs do you have?")) eggsneeded = 3 milkamount = int(input("how much milk do you have?")) milkneeded = 1 flouramount = int(input("how much flour do you have?")) flourneeded = 2 if eggamount < eggsneeded and milkamount < milkneeded and flouramount < flourneeded: print("no") else: list1=[] eggamount1 = eggamount // eggsneeded milkamount1 = milkamount // milkneeded flouramount1 = flouramount // flourneeded print('you have' ,eggamount1,"portions of eggs") print("you have", flouramount1,"portions of flour") print("you have" ,milkamount1,"portions of milk") list1.append(eggamount1) list1.append(milkamount1) list1.append(flouramount1) print("you can make" ,min(list1) ,"blin") i=input( ''' type stop if you wish to stop the program type anything to continue''' ) ``` </TabItem> <TabItem value="java"> ```java package blinmaker; import java.util.Scanner; public class cooker { public static void main(String[] args) { int eggsAmount; int eggsMin = 1; int milkAmount; int milkMin = 200; // milliliter int flourAmount; int flourMin = 100; // grams System.out.println("Hello!"); System.out.println("Blinmaker ist starting up.."); System.out.println("How many egges do you have?"); Scanner userInput; userInput = new Scanner(System.in); eggsAmount = 
userInput.nextInt(); System.out.println("You have " + eggsAmount + " eggs."); System.out.println("How much milk do you have?"); userInput = new Scanner(System.in); milkAmount = userInput.nextInt(); System.out.println("You have " + milkAmount + "ml milk."); System.out.println("How much flour do you have?"); userInput = new Scanner(System.in); flourAmount = userInput.nextInt(); System.out.println("You have " + flourAmount + "g flour."); if(eggsAmount < eggsMin || milkAmount < milkMin || flourAmount < flourMin) { System.out.println("No blin today :("); } else { int flourPortions = flourAmount / flourMin; int milkPortions = milkAmount / milkMin; int smallest = Math.min(Math.min(flourPortions, milkPortions), eggsAmount); System.out.println(" "); System.out.println("You can make " + smallest*4 + " Blini."); System.out.println(" "); System.out.println("You will need " + smallest*eggsMin + " eggs."); System.out.println("You will need " + smallest*milkMin + " milk."); System.out.println("You will need " + smallest*flourMin + " flour."); System.out.println("Blinmaker shutting down..."); } } } ``` </TabItem> </Tabs> ### Computing Blin Amount If you say you want to make some Blini right now, then here you go, a Blinmaker ready to use. :::note Did you know With this live editor you can change the blinmaker to use e.g. imperial units, if you're a western spy. 
::: ```jsx live class BlinMaker extends React.Component { constructor(props) { super(props); this.state = { eggsAmount: 0, milkAmount: 0, flourAmount: 0, }; this.computeBlinAmount = this.computeBlinAmount.bind(this); } computeBlinAmount() { const eggsMin = 1; const milkMin = 200; // milliliter const flourMin = 100; // grams if ( this.state.eggsAmount < eggsMin || this.state.milkAmount < milkMin || this.state.flourAmount < flourMin ) { alert("No blin today :("); } else { const flourPortions = Math.floor(this.state.flourAmount / flourMin); const milkPortions = Math.floor(this.state.milkAmount / milkMin); const smallest = Math.min( this.state.eggsAmount, flourPortions, milkPortions ); const Blini = smallest * 4; alert( `You can make ${Blini} Blini. You will need ${ smallest * eggsMin } eggs, ${smallest * milkMin}ml milk and ${smallest * flourMin}g flour.` ); } } render() { const gridStyle = { display: "inline-block", width: "50%", minWidth: "max-content", }; const inputStyle = { float: "right", }; const buttonStyle = { backgroundColor: "#55a8e2", maxWidth: "200px", height: "30px", font: "400 14px/18px Roboto, sans-serif", marginTop: "10px", border: "none", cursor: "pointer", }; return ( <div style={gridStyle}> <label> Egg amount: <input type="number" style={inputStyle} value={this.state.eggsAmount} onChange={(event) => this.setState({ eggsAmount: event.target.value }) } /> </label> <br /> <label> Milk amount: <input type="number" step="50" style={inputStyle} value={this.state.milkAmount} onChange={(event) => this.setState({ milkAmount: event.target.value }) } /> </label> <br /> <label> Flour amount: <input type="number" step="50" style={inputStyle} value={this.state.flourAmount} onChange={(event) => this.setState({ flourAmount: event.target.value }) } /> </label> <br /> <button style={buttonStyle} onClick={this.computeBlinAmount}> Compute Blinamount </button> </div> ); } } ``` :::danger Don't make too many Blini. Throwing them away is a crime in eastern Europe! :::
30.455587
342
0.63562
eng_Latn
0.927888
4c489e4673367a9c0b9d657dd6b97519edd1823f
5,418
md
Markdown
tests/dummy/app/templates/docs/utils/theme-icon.md
andrew-paterson/ember-skeleton
fba25dc92f6a883289c2b6db812e79bbedd5b55f
[ "MIT" ]
null
null
null
tests/dummy/app/templates/docs/utils/theme-icon.md
andrew-paterson/ember-skeleton
fba25dc92f6a883289c2b6db812e79bbedd5b55f
[ "MIT" ]
14
2020-02-13T09:18:31.000Z
2022-02-12T04:11:48.000Z
tests/dummy/app/templates/docs/utils/theme-icon.md
andrew-paterson/ember-skeleton
fba25dc92f6a883289c2b6db812e79bbedd5b55f
[ "MIT" ]
1
2021-10-05T12:45:02.000Z
2021-10-05T12:45:02.000Z
# Theme Icon

## Use case

In general, you should be using the {{link-to "helper" "docs.helpers.theme-icon"}} directly in your templates. The util should only be used when this is not possible. For a general explanation of the use case, see {{link-to "the helper docs" "docs.helpers.theme-icon"}}.

## Import the util

{{docs-snippet name="import-theme-icon-util.js"}}

## Basic usage

The `ember-skeleton/theme-icon` util accepts a string as the first argument (`person.status` in the example below), and a hash of key-value pairs, where each key is a potential value of `person.status` and each value is the path to a component. The util returns the relevant path, and thus the correct icon component is displayed.

{{#docs-demo as |demo|}}
  {{#demo.example name="util-theme-icons-hash" class="theme-icons-demo"}}
    <table>
      <thead>
        <tr>
          <th>Name</th>
          <th>Status</th>
        </tr>
      </thead>
      <tbody>
        {{#each basicThemeIcon as | person |}}
          <tr>
            <td>{{person.name}}</td>
            <td>{{component person.statusIcon}}{{person.status}}</td>
          </tr>
        {{/each}}
      </tbody>
    </table>
  {{/demo.example}}
  {{demo.snippet "util-theme-icons-hash" label="Template" language="htmlbars"}}
  {{demo.snippet "names-and-statuses.js" label="Model" language="javascript"}}
{{/docs-demo}}

Note that the contextual component syntax `{{component ...}}` is used above to render a component from a dynamic path.

## Default options

It is likely that most icon/string associations will remain consistent throughout your app, for example that `passed` will always use a tick and `failed` will always use the alert. To avoid having to explicitly pass these associations to the util every time it is used, a set of app-wide defaults can be defined in `config/environment.js`.

{{docs-snippet name="theme-icon-app-defaults.js" language="javascript" title="config/environment.js"}}

The util will check if the value of the first argument is present in any of the `matchStrings` arrays, and will return the corresponding `returnString`.

Note that you can set `fallback: true` on one of the objects. In this case, the `returnString` of that object will be returned if no matches are found. This can be overridden by passing `fallback` to the util, as outlined below.

{{#docs-demo as |demo|}}
  {{#demo.example name="util-default-theme-icons" class="theme-icons-demo"}}
    <table>
      <thead>
        <tr>
          <th>Name</th>
          <th>Status</th>
        </tr>
      </thead>
      <tbody>
        {{#each defaultThemeIcons as | person |}}
          <tr>
            <td>{{person.name}}</td>
            <td>{{component person.statusIconDefault}}{{person.status}}</td>
          </tr>
        {{/each}}
      </tbody>
    </table>
  {{/demo.example}}
  {{demo.snippet "theme-icon-util-defaults.js" label="Controller"}}
  {{demo.snippet "util-default-theme-icons" label="Template" language="htmlbars"}}
  {{demo.snippet "names-and-statuses.js" label="Model" language="javascript"}}
  {{demo.snippet name="theme-icon-app-defaults.js" language="javascript" label="config/environment.js"}}
{{/docs-demo}}

Note that where a key-value pair passed directly to the util conflicts with a default association, the default association will be overridden.

{{#docs-demo as |demo|}}
  {{#demo.example name="util-theme-icon-overridden" class="theme-icons-demo"}}
    <table>
      <thead>
        <tr>
          <th>Name</th>
          <th>Status</th>
        </tr>
      </thead>
      <tbody>
        {{#each themeIconsDefaultsOverriden as | person |}}
          <tr>
            <td>{{person.name}}</td>
            <td>{{component person.statusIconDefaultsOverriden}}{{person.status}}</td>
          </tr>
        {{/each}}
      </tbody>
    </table>
  {{/demo.example}}
  {{demo.snippet "theme-icon-util-default-overridden.js" label="Controller"}}
  {{demo.snippet "util-theme-icon-overridden" label="Template" language="htmlbars"}}
  {{demo.snippet "names-and-statuses.js" label="Model" language="javascript"}}
  {{demo.snippet name="theme-icon-app-defaults.js" language="javascript" label="config/environment.js"}}
{{/docs-demo}}

You can also pass a fallback string to the util when invoking it. This will be returned by the util if no other matches are found.

{{#docs-demo as |demo|}}
  {{#demo.example name="util-theme-icons-default-fallback" class="theme-icons-demo"}}
    <div>{{component statusIconFallback}}missing</div>
  {{/demo.example}}
  {{demo.snippet "theme-icon-util-fallback.js" label="Controller"}}
  {{demo.snippet "util-theme-icons-default-fallback" label="Template" language="htmlbars"}}
  {{demo.snippet name="theme-icon-app-defaults.js" language="javascript" label="config/environment.js"}}
{{/docs-demo}}

If the string passed to the util does not find any matches, and no fallback is given, the util will return `null`.

{{#docs-demo as |demo|}}
  {{#demo.example name="util-theme-icon-no-match" class="theme-icons-demo"}}
    <div>{{component statusIconNoMatches}}info</div>
  {{/demo.example}}
  {{demo.snippet "theme-icon-util-no-matches.js" label="Controller"}}
  {{demo.snippet "util-theme-icon-no-match" label="Template" language="htmlbars"}}
  {{demo.snippet name="theme-icon-app-defaults.js" language="javascript" label="config/environment.js"}}
{{/docs-demo}}
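The resolution order described above (direct hash, then `matchStrings` defaults, then an explicit `fallback` argument, then a default marked `fallback: true`, then `null`) is plain lookup logic. The following is a rough, language-agnostic sketch of that order, written in Python purely for illustration; the default entries are hypothetical and this is not ember-skeleton's actual implementation:

```python
# Hypothetical app-wide defaults, mirroring the shape of the entries in
# config/environment.js: matchStrings -> returnString, optional fallback flag.
DEFAULTS = [
    {"matchStrings": ["passed", "ok"], "returnString": "icons/tick"},
    {"matchStrings": ["failed", "error"], "returnString": "icons/alert"},
    {"matchStrings": [], "returnString": "icons/info", "fallback": True},
]

def theme_icon(value, overrides=None, fallback=None):
    """Resolve an icon component path for `value` (illustrative sketch)."""
    overrides = overrides or {}

    # 1. Key-value pairs passed directly override the defaults.
    if value in overrides:
        return overrides[value]

    # 2. Check the defaults' matchStrings arrays.
    for entry in DEFAULTS:
        if value in entry["matchStrings"]:
            return entry["returnString"]

    # 3. An explicit fallback argument wins over a `fallback: True` default.
    if fallback is not None:
        return fallback

    # 4. A default entry marked fallback: True catches everything else.
    for entry in DEFAULTS:
        if entry.get("fallback"):
            return entry["returnString"]

    # 5. No match, no fallback: return None (the util returns `null`).
    return None
```

For example, `theme_icon("passed")` resolves through the defaults, while `theme_icon("passed", {"passed": "icons/star"})` is overridden by the direct hash.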
42.328125
331
0.667774
eng_Latn
0.90048
4c48d4c90b011bc2eab67306bf32c325463d5dd3
1,658
md
Markdown
results/referenceaudioanalyzer/referenceaudioanalyzer_hdm1_harman_over-ear_2018/Audio-Technica ATH-M35/README.md
NekoAlosama/AutoEq-optimized
354873974d31dea14aa95cf1b181c724554a19d3
[ "MIT" ]
2
2020-04-27T23:56:38.000Z
2020-08-06T08:54:28.000Z
results/referenceaudioanalyzer/referenceaudioanalyzer_hdm1_harman_over-ear_2018/Audio-Technica ATH-M35/README.md
NekoAlosama/AutoEq-optimized
354873974d31dea14aa95cf1b181c724554a19d3
[ "MIT" ]
3
2020-07-21T22:10:04.000Z
2020-11-22T15:07:43.000Z
results/referenceaudioanalyzer/referenceaudioanalyzer_hdm1_harman_over-ear_2018/Audio-Technica ATH-M35/README.md
NekoAlosama/AutoEq-optimized
354873974d31dea14aa95cf1b181c724554a19d3
[ "MIT" ]
1
2020-08-06T08:54:41.000Z
2020-08-06T08:54:41.000Z
# Audio-Technica ATH-M35

See [usage instructions](https://github.com/jaakkopasanen/AutoEq#usage) for more options and info.

### Parametric EQs

When using a parametric equalizer, apply a preamp of **-14.18 dB** and build the filters manually with these parameters. The first 5 filters can be used independently. When using an independent subset of the filters, apply a preamp of **-14.18 dB**.

| Type    | Fc          | Q    | Gain     |
|--------:|------------:|-----:|---------:|
| Peaking | 20.08 Hz    | 0.79 | 14.36 dB |
| Peaking | 356.68 Hz   | 0.23 | -3.52 dB |
| Peaking | 411.71 Hz   | 1.68 | 7.73 dB  |
| Peaking | 2936.53 Hz  | 0.85 | -5.99 dB |
| Peaking | 6186.23 Hz  | 1.32 | 13.00 dB |
| Peaking | 3084.31 Hz  | 5.07 | 1.15 dB  |
| Peaking | 5073.75 Hz  | 5.01 | 3.44 dB  |
| Peaking | 5248.36 Hz  | 1.22 | -2.10 dB |
| Peaking | 8323.44 Hz  | 2.26 | 2.51 dB  |
| Peaking | 19532.88 Hz | 0.37 | -4.56 dB |

### Fixed Band EQs

When using a fixed band (also called graphic) equalizer, apply a preamp of **-12.53 dB** (if available) and set the gains manually with these parameters.

| Type    | Fc          | Q    | Gain     |
|--------:|------------:|-----:|---------:|
| Peaking | 31.25 Hz    | 1.41 | 12.81 dB |
| Peaking | 62.50 Hz    | 1.41 | -1.45 dB |
| Peaking | 125.00 Hz   | 1.41 | -1.79 dB |
| Peaking | 250.00 Hz   | 1.41 | -1.10 dB |
| Peaking | 500.00 Hz   | 1.41 | 3.24 dB  |
| Peaking | 1000.00 Hz  | 1.41 | -2.95 dB |
| Peaking | 2000.00 Hz  | 1.41 | -5.35 dB |
| Peaking | 4000.00 Hz  | 1.41 | 0.27 dB  |
| Peaking | 8000.00 Hz  | 1.41 | 10.27 dB |
| Peaking | 16000.01 Hz | 1.41 | -4.76 dB |

### Graphs

![](./Audio-Technica%20ATH-M35.png)
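Each row in the tables above describes a peaking filter by center frequency Fc, quality factor Q, and gain. Such a filter is commonly realized as a biquad using the Audio EQ Cookbook (RBJ) formulas. The sketch below, which is not part of AutoEq itself and assumes a 48 kHz sample rate, shows how the coefficients could be computed and the response checked:

```python
import math

def peaking_biquad(fc, q, gain_db, fs=48000):
    """RBJ Audio EQ Cookbook peaking filter, normalized so a[0] == 1."""
    a_lin = 10 ** (gain_db / 40)          # sqrt of the linear gain
    w0 = 2 * math.pi * fc / fs
    alpha = math.sin(w0) / (2 * q)
    cos_w0 = math.cos(w0)

    b0 = 1 + alpha * a_lin
    b1 = -2 * cos_w0
    b2 = 1 - alpha * a_lin
    a0 = 1 + alpha / a_lin
    a1 = -2 * cos_w0
    a2 = 1 - alpha / a_lin
    return [b0 / a0, b1 / a0, b2 / a0], [1.0, a1 / a0, a2 / a0]

def gain_at(f, b, a, fs=48000):
    """Magnitude response of the biquad at frequency f, in dB."""
    z = complex(math.cos(2 * math.pi * f / fs), math.sin(2 * math.pi * f / fs))
    h = (b[0] + b[1] / z + b[2] / z ** 2) / (a[0] + a[1] / z + a[2] / z ** 2)
    return 20 * math.log10(abs(h))
```

At Fc the filter hits the requested gain exactly, and far away from Fc the response returns to roughly 0 dB; the preamp value above then compensates for the sum of the positive peaks.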
41.45
98
0.56152
eng_Latn
0.676257
4c4912ef3c8e0ccf1d6263c64227642760792b67
14,546
md
Markdown
treebanks/fa_perdt/fa_perdt-pos-PRON.md
vistamou/docs
116b9c29e4218be06bf33b158284b9c952646989
[ "Apache-2.0" ]
204
2015-01-20T16:36:39.000Z
2022-03-28T00:49:51.000Z
treebanks/fa_perdt/fa_perdt-pos-PRON.md
vistamou/docs
116b9c29e4218be06bf33b158284b9c952646989
[ "Apache-2.0" ]
654
2015-01-02T17:06:29.000Z
2022-03-31T18:23:34.000Z
treebanks/fa_perdt/fa_perdt-pos-PRON.md
vistamou/docs
116b9c29e4218be06bf33b158284b9c952646989
[ "Apache-2.0" ]
200
2015-01-16T22:07:02.000Z
2022-03-25T11:35:28.000Z
--- layout: base title: 'Statistics of PRON in UD_Persian-PerDT' udver: '2' --- ## Treebank Statistics: UD_Persian-PerDT: POS Tags: `PRON` There are 55 `PRON` lemmas (0%), 86 `PRON` types (0%) and 24140 `PRON` tokens (5%). Out of 16 observed tags, the rank of `PRON` is: 10 in number of lemmas, 9 in number of types and 6 in number of tokens. The 10 most frequent `PRON` lemmas: او، من، خود، ما، آن، آنها، شما، تو، این، هم The 10 most frequent `PRON` types: خود، او، آن، ش، ما، من، م، شما، آنها، این The 10 most frequent ambiguous lemmas: او (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 4408, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 3, <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 1), من (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 3482, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 11, <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> 1), خود (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 2856, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 3), ما (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 2343, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1), آن (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 2019, <tt><a href="fa_perdt-pos-DET.html">DET</a></tt> 1140, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 16, <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 1), آنها (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 2007, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1), تو (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1215, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 11, <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> 10, <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 1), این (<tt><a href="fa_perdt-pos-DET.html">DET</a></tt> 4859, <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 904, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 2), هم (<tt><a href="fa_perdt-pos-ADV.html">ADV</a></tt> 1295, <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 465, <tt><a 
href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 1), وی (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 396, <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 2) The 10 most frequent ambiguous types: خود (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 3472, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1), او (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 2051, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 3, <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 1), آن (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1991, <tt><a href="fa_perdt-pos-DET.html">DET</a></tt> 1130, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 10, <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 1), ش (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1949, <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 2, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1), ما (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1867, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1), من (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1696, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 8, <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> 1), م (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1571, <tt><a href="fa_perdt-pos-AUX.html">AUX</a></tt> 113, <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 1, <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 1), آنها (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1187, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1), این (<tt><a href="fa_perdt-pos-DET.html">DET</a></tt> 4814, <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 902, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 2), تو (<tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 734, <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> 11, <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 5) * خود * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 3472: او باید مراحل بهبودی <b>خود</b> را طی کند و برای بازگشت به میدان عجله نکند . 
* <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1: افراد خلاق یاد گرفته‌اند که اوقات تنهایی خود را دوست بدارند ؛ زیرا از آن به عنوان زمانی که می‌توانند با خود خلوت کنند و به فعالیت‌های <b>خود</b> کام بخش بپردازند ، استفاده می‌کنند . * او * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 2051: <b>او</b> برای حسین ( ع ) پس از شهادت ش سوگواری کرد . * <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 3: بهناز آدمی نسبتاً ترسو است اما مهسا احمدی بدل‌کار <b>او</b> ست و به جای او بدل‌کاری می‌کند . * <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 1: وی خاطرنشان کرد : از هفت تلمبه‌خانهٔ موجود در آبادان سه تلمبه‌خانه فرآورده‌های نفتی را به سمت اهواز و اراک و سه تلمبه‌خانه نفت کورهٔ <b>او</b> . آر . دی و دیگر فرآورده‌ها را به سمت ماهشهر پمپاژ خواهند کرد . * آن * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1991: رنگ اصلی فرش‌های این منطقه لاکی است که با روناس <b>آن</b> را رنگرزی می‌کردند . * <tt><a href="fa_perdt-pos-DET.html">DET</a></tt> 1130: <b>آن</b> جا که لازم می‌داند ، پخش ش می‌کند . * <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 10: اما آن تجربیات را از <b>آن</b> خود کنید . * <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 1: رقیب‌های من همه <b>آن</b> نفراتی بودند که روز قبل به فینال آمده بودند و در حذفی شکست شان داده بودم . * ش * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1949: او برای حسین ( ع ) پس از شهادت <b>ش</b> سوگواری کرد . * <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 2: مرحوم « عباس پاره‌دوز » متخلص به « عشاق » در سال 1317 ه . <b>ش</b> . دار فانی را وداع گفت و به عشق دیدار کسانی که در فراق شان نوحه‌سرایی کرده بود ، پرواز کرد . * <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1: در سال 913 ه . <b>ش</b> . قوم ازبک برای بار دوم بر هرات استیلا پیدا کردند و آنجا را مورد تاخت و تاز و غارت قرار دادند . * ما * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1867: یعنی <b>ما</b> به این ترتیب فقط یک سطل آب را در کویر پاشیده‌ایم . 
* <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1: بد ین گونه باری از مسئولیت سنگینی که بر دوش قدیسان <b>ما</b> ست ، برمی‌گیریم . * من * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1696: اگر <b>من</b> بر اساس داستان‌های تاریخی عمل می‌کردم بافت دراماتیک لطمه می‌دید . * <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 8: هر آدمیزادی یک « <b>من</b> » عمومی و عامیانه و یک « <b>من</b> » شخصی دارد . * <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> 1: اولین چیزی که موجب می‌شود که انسان بتواند در مقابل دشمن ش ( شیطان ) و هواهای نفسانی خود ایستادگی بکند و مانع بشود که دشمن ش بر او تسلط پیدا نکند ، طبق آیهٔ شریفهٔ قرآن کریم ( واعظ <b>من</b> قلبه ) این است که از قلب خود واعظی برای خود درست بکند و داشته باشد . * م * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1571: پدر <b>م</b> بعد از 9 جلسهٔ شیمی‌درمانی دوام نیاورد . * <tt><a href="fa_perdt-pos-AUX.html">AUX</a></tt> 113: مایل <b>م</b> تقسیم‌بندی خود را دربارهٔ این عنوان ارائه نمایم . * <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> 1: تاریخ نشان نمی‌دهد که « اقوام غز » در قرن‌های پیش از قرن 5 ق / 11 <b>م</b> به آذربایجان حمله کرده و در آنجا سکونت کرده باشند . * <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> 1: با ارجاع پرونده ، کارآگاهان با شناسایی صاحب حساب اطلاع پیدا کردند که این حساب متعلق به دختربچه‌ای به نام « رمینا . <b>م</b> » است . * آنها * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 1187: <b>آنها</b> اتفاقاً به اندازهٔ 25 میلیون تومان تفریح کرده‌اند . * <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 1: این هنرمند در اثر خود عکس‌های مردمی از اقصی نقاط جهان را از زیر آوار خارج کرد و روی همهٔ <b>آنها</b> رنگ خون پاشید . * این * <tt><a href="fa_perdt-pos-DET.html">DET</a></tt> 4814: یعنی ما به <b>این</b> ترتیب فقط یک سطل آب را در کویر پاشیده‌ایم . * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 902: خوش‌رفتار و بانزاکت باشید بدون <b>این</b> که چاپلوسی کرده باشید . 
* <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 2: ما قدرت فرادرمانی داریم ؛ یعنی <b>این</b> که می‌توانیم همهٔ امراض را شفا بدهیم . * تو * <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> 734: تنها می‌توان در برابر عظمت و وفاداری <b>تو</b> سر سجده فرود آورد . * <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> 11: <b>تو</b> اوج افسردگی و ناراحتی ، به خود م نیشتر زدم . * <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> 5: پنجاه تا بچهٔ قد و نیم‌قد از ترس بمباران خزیده بودیم آن <b>تو</b> . ## Morphology The form / lemma ratio of `PRON` is 1.563636 (the average of all parts of speech is 1.477395). The 1st highest number of forms (7) was observed with the lemma “آنها”: آنها, آنهایی, آن‌ها, آن‌هایی, انها, شان, یشان. The 2nd highest number of forms (6) was observed with the lemma “او”: اش, او, اویی, ش, و, یش. The 3rd highest number of forms (6) was observed with the lemma “تو”: ات, ت, تو, توی, تو‌, یت. `PRON` occurs with 3 features: <tt><a href="fa_perdt-feat-Number.html">Number</a></tt> (20256; 84% instances), <tt><a href="fa_perdt-feat-Person.html">Person</a></tt> (16853; 70% instances), <tt><a href="fa_perdt-feat-PronType.html">PronType</a></tt> (5697; 24% instances) `PRON` occurs with 6 feature-value pairs: `Number=Plur`, `Number=Sing`, `Person=1`, `Person=2`, `Person=3`, `PronType=Prs` `PRON` occurs with 15 feature combinations. The most frequent feature combination is `_` (3884 tokens). 
Examples: خود، هم، خویش، یکدیگر، کجا، همدیگر، چه، آن، چنین، خویشتن ## Relations `PRON` nodes are attached to their parents using 17 different relations: <tt><a href="fa_perdt-dep-nmod.html">nmod</a></tt> (12618; 52% instances), <tt><a href="fa_perdt-dep-nsubj.html">nsubj</a></tt> (4412; 18% instances), <tt><a href="fa_perdt-dep-obl-arg.html">obl:arg</a></tt> (2923; 12% instances), <tt><a href="fa_perdt-dep-obj.html">obj</a></tt> (2218; 9% instances), <tt><a href="fa_perdt-dep-obl.html">obl</a></tt> (1300; 5% instances), <tt><a href="fa_perdt-dep-root.html">root</a></tt> (277; 1% instances), <tt><a href="fa_perdt-dep-conj.html">conj</a></tt> (107; 0% instances), <tt><a href="fa_perdt-dep-compound-lvc.html">compound:lvc</a></tt> (62; 0% instances), <tt><a href="fa_perdt-dep-appos.html">appos</a></tt> (58; 0% instances), <tt><a href="fa_perdt-dep-xcomp.html">xcomp</a></tt> (48; 0% instances), <tt><a href="fa_perdt-dep-amod.html">amod</a></tt> (32; 0% instances), <tt><a href="fa_perdt-dep-ccomp.html">ccomp</a></tt> (29; 0% instances), <tt><a href="fa_perdt-dep-acl.html">acl</a></tt> (16; 0% instances), <tt><a href="fa_perdt-dep-dep.html">dep</a></tt> (16; 0% instances), <tt><a href="fa_perdt-dep-nsubj-pass.html">nsubj:pass</a></tt> (12; 0% instances), <tt><a href="fa_perdt-dep-advcl.html">advcl</a></tt> (11; 0% instances), <tt><a href="fa_perdt-dep-csubj.html">csubj</a></tt> (1; 0% instances) Parents of `PRON` nodes belong to 12 different parts of speech: <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> (12336; 51% instances), <tt><a href="fa_perdt-pos-VERB.html">VERB</a></tt> (9742; 40% instances), <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> (803; 3% instances), <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> (686; 3% instances), (277; 1% instances), <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> (105; 0% instances), <tt><a href="fa_perdt-pos-AUX.html">AUX</a></tt> (88; 0% instances), <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> (33; 0% instances), 
<tt><a href="fa_perdt-pos-INTJ.html">INTJ</a></tt> (27; 0% instances), <tt><a href="fa_perdt-pos-ADV.html">ADV</a></tt> (23; 0% instances), <tt><a href="fa_perdt-pos-CCONJ.html">CCONJ</a></tt> (12; 0% instances), <tt><a href="fa_perdt-pos-SCONJ.html">SCONJ</a></tt> (8; 0% instances) 15246 (63%) `PRON` nodes are leaves. 7276 (30%) `PRON` nodes have one child. 1112 (5%) `PRON` nodes have two children. 506 (2%) `PRON` nodes have three or more children. The highest child degree of a `PRON` node is 7. Children of `PRON` nodes are attached using 22 different relations: <tt><a href="fa_perdt-dep-case.html">case</a></tt> (7032; 61% instances), <tt><a href="fa_perdt-dep-acl.html">acl</a></tt> (1185; 10% instances), <tt><a href="fa_perdt-dep-nmod.html">nmod</a></tt> (904; 8% instances), <tt><a href="fa_perdt-dep-punct.html">punct</a></tt> (649; 6% instances), <tt><a href="fa_perdt-dep-cop.html">cop</a></tt> (356; 3% instances), <tt><a href="fa_perdt-dep-nsubj.html">nsubj</a></tt> (329; 3% instances), <tt><a href="fa_perdt-dep-dep.html">dep</a></tt> (281; 2% instances), <tt><a href="fa_perdt-dep-conj.html">conj</a></tt> (251; 2% instances), <tt><a href="fa_perdt-dep-appos.html">appos</a></tt> (130; 1% instances), <tt><a href="fa_perdt-dep-obl.html">obl</a></tt> (109; 1% instances), <tt><a href="fa_perdt-dep-cc.html">cc</a></tt> (85; 1% instances), <tt><a href="fa_perdt-dep-advmod.html">advmod</a></tt> (67; 1% instances), <tt><a href="fa_perdt-dep-mark.html">mark</a></tt> (34; 0% instances), <tt><a href="fa_perdt-dep-advcl.html">advcl</a></tt> (29; 0% instances), <tt><a href="fa_perdt-dep-det.html">det</a></tt> (17; 0% instances), <tt><a href="fa_perdt-dep-amod.html">amod</a></tt> (6; 0% instances), <tt><a href="fa_perdt-dep-csubj.html">csubj</a></tt> (6; 0% instances), <tt><a href="fa_perdt-dep-obl-arg.html">obl:arg</a></tt> (6; 0% instances), <tt><a href="fa_perdt-dep-xcomp.html">xcomp</a></tt> (3; 0% instances), <tt><a href="fa_perdt-dep-ccomp.html">ccomp</a></tt> (2; 0% 
instances), <tt><a href="fa_perdt-dep-nummod.html">nummod</a></tt> (1; 0% instances), <tt><a href="fa_perdt-dep-obj.html">obj</a></tt> (1; 0% instances) Children of `PRON` nodes belong to 15 different parts of speech: <tt><a href="fa_perdt-pos-ADP.html">ADP</a></tt> (7026; 61% instances), <tt><a href="fa_perdt-pos-VERB.html">VERB</a></tt> (914; 8% instances), <tt><a href="fa_perdt-pos-NOUN.html">NOUN</a></tt> (849; 7% instances), <tt><a href="fa_perdt-pos-PRON.html">PRON</a></tt> (803; 7% instances), <tt><a href="fa_perdt-pos-PUNCT.html">PUNCT</a></tt> (649; 6% instances), <tt><a href="fa_perdt-pos-AUX.html">AUX</a></tt> (375; 3% instances), <tt><a href="fa_perdt-pos-ADV.html">ADV</a></tt> (348; 3% instances), <tt><a href="fa_perdt-pos-SCONJ.html">SCONJ</a></tt> (245; 2% instances), <tt><a href="fa_perdt-pos-CCONJ.html">CCONJ</a></tt> (103; 1% instances), <tt><a href="fa_perdt-pos-ADJ.html">ADJ</a></tt> (63; 1% instances), <tt><a href="fa_perdt-pos-PROPN.html">PROPN</a></tt> (53; 0% instances), <tt><a href="fa_perdt-pos-PART.html">PART</a></tt> (27; 0% instances), <tt><a href="fa_perdt-pos-DET.html">DET</a></tt> (18; 0% instances), <tt><a href="fa_perdt-pos-INTJ.html">INTJ</a></tt> (9; 0% instances), <tt><a href="fa_perdt-pos-NUM.html">NUM</a></tt> (1; 0% instances)
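The form/lemma ratio quoted in the Morphology section is simply the number of distinct forms divided by the number of distinct lemmas for a given part of speech, and the "highest number of forms" lists come from grouping forms by lemma. A minimal sketch of how such statistics can be computed from (form, lemma, upos) tokens follows; the tokens are a toy sample, not the actual PerDT data:

```python
from collections import defaultdict

# Toy (form, lemma, upos) tokens for illustration only.
tokens = [
    ("او", "او", "PRON"), ("اش", "او", "PRON"), ("ش", "او", "PRON"),
    ("ما", "ما", "PRON"), ("خود", "خود", "PRON"), ("رفت", "رفتن", "VERB"),
]

def form_lemma_ratio(tokens, pos):
    """Distinct forms divided by distinct lemmas for one POS tag."""
    forms, lemmas = set(), set()
    for form, lemma, upos in tokens:
        if upos == pos:
            forms.add(form)
            lemmas.add(lemma)
    return len(forms) / len(lemmas)

def forms_per_lemma(tokens, pos):
    """Group the observed forms of each lemma, for the 'highest number of forms' lists."""
    by_lemma = defaultdict(set)
    for form, lemma, upos in tokens:
        if upos == pos:
            by_lemma[lemma].add(form)
    return {lemma: sorted(forms) for lemma, forms in by_lemma.items()}
```

With the toy sample above, `PRON` has 5 forms over 3 lemmas (ratio 5/3), and the lemma او accounts for 3 of those forms.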
145.46
1,681
0.650488
yue_Hant
0.625266
4c4a39ed31a0b5f6707dffaee9309fd52c46290f
7,917
md
Markdown
_posts/2015-05-05-swift-documentation.md
NSHipster/nshipster.cn
68a0cde3eb05c0041cf77d2ad7bbcb11fca489d5
[ "MIT" ]
22
2015-12-12T09:10:15.000Z
2018-10-16T04:11:55.000Z
_posts/2015-05-05-swift-documentation.md
NSHipster/nshipster.cn
68a0cde3eb05c0041cf77d2ad7bbcb11fca489d5
[ "MIT" ]
3
2016-03-10T08:19:48.000Z
2018-07-14T12:41:15.000Z
_posts/2015-05-05-swift-documentation.md
NSHipster/nshipster.cn
68a0cde3eb05c0041cf77d2ad7bbcb11fca489d5
[ "MIT" ]
4
2016-01-29T03:45:17.000Z
2019-09-17T09:00:18.000Z
---
title: Swift Documentation
author: Mattt & Nate Cook
authors:
    - Mattt Thompson
    - Nate Cook
category: Swift
tags: swift
translator: April Peng
excerpt: "Code structure and organization are a matter of pride for developers. Clear and consistent code signifies clear and consistent thought. Read on to learn about the recent changes to documentation in Xcode 6 and Swift."
revisions:
    "2014-07-28": Original publication.
    "2015-05-05": Extended detail on supported markup; revised examples.
---

Code structure and organization are a matter of pride for developers. Clear and consistent code signifies clear and consistent thought. The compiler doesn't have a discriminating palate, but when it comes to naming, whitespace, or documentation, the differences between humans show.

Readers of NSHipster will no doubt remember [the article about documentation published last year](https://nshipster.cn/documentation/), but a lot has changed with Xcode 6 (fortunately, mostly for the better). So this week, we'll document the state of documentation for eager Swift developers.

Alright, let's take a closer look.

* * *

Since the early 00s, [Headerdoc](https://developer.apple.com/library/mac/documentation/DeveloperTools/Conceptual/HeaderDoc/intro/intro.html#//apple_ref/doc/uid/TP40001215-CH345-SW1) has been Apple's preferred documentation standard. Starting off as a Perl script that grudgingly parsed [Javadoc](https://en.wikipedia.org/wiki/Javadoc)-like comments, Headerdoc eventually became the back-end engine for Apple's online documentation, as well as the developer documentation in Xcode.

With the releases at WWDC 2014, the developer documentation was overhauled with a stylish new design that includes switching between Swift and Objective-C. (If you've [seen any of the new iOS 8 online APIs](https://developer.apple.com/library/prerelease/ios/documentation/HomeKit/Reference/HomeKit_Framework/index.html#//apple_ref/doc/uid/TP40014519), you've already seen this new design.)

**What's really surprising is that the _format of the documentation_ has changed as well.**

When invoking Quick Documentation (`⌥ʘ`) in Swift code, Headerdoc comments are not parsed correctly:

```swift
/**
    Let's just write something here.

    @param La-la-la, this is the parameter.

    @return Lo-lo-lo, this is the return value.
*/
func foo(bar: String) -> AnyObject { ... }
```

![Unrecognized Headerdoc]({% asset swift-documentation-headerdoc.png @path %})

But with a modified markup syntax, the comment _is_ parsed correctly:

![New Recognized Format]({% asset swift-documentation-new-format.png @path %})

```swift
/**
    Let's just write something here.

    :param: La-la-la, this is the parameter.

    :returns: Lo-lo-lo, this is the return value.
*/
func foo(bar: String) -> AnyObject { ... }
```

So what's the deal with this unfamiliar new format? It turns out that SourceKit (the private framework used by Xcode, previously known for crashing at a high FPS) includes a basic parser for [reStructuredText](http://docutils.sourceforge.net/docs/user/rst/quickref.html). Although it only implements a subset of the [specification](http://docutils.sourceforge.net/docs/ref/rst/restructuredtext.html#field-lists), it covers enough for basic formatting.

#### Basic Markup

Documentation comments are distinguished by using `/** ... */` multi-line comments or `/// ...` single-line comments. Inside comment blocks, paragraphs are separated by blank lines. Unordered lists can be made with a number of bullet characters: `-`, `+`, `*`, `•`, etc., while ordered lists use Arabic numerals (1, 2, 3, ...) followed by a period (`1.`), a right parenthesis (`1)`), or surrounded by parentheses on both sides (`(1)`):

```swift
/**
    You can apply *italic*, **bold**, or `code` inline styles.

    - Lists are great,
    - but perhaps don't nest;
    - Sub-list formatting
      - isn't great.

    1. Ordered lists, too,
    2. for things that are sorted;
    3. Arabic numerals
    4. are the only kind supported.
*/
```

#### Definition & Field Lists

Definition and field lists are displayed much the same in Xcode's Quick Documentation popover, with definition lists rendered a little more compactly:

```swift
/**
    Definition list
        A list of terms and their definitions.
    Format
        Terms are left-aligned, with definitions indented below.

    :Field header:
        Field lists are spaced out a little more.

    :Another field: Field lists can start the content right after the label,
                    without needing a new line and indentation.
                    Subsequent indented lines are also treated as part of the content.
*/
```

Two special fields are used to document parameters and return values: `:param:` and `:returns:`, respectively. `:param:` is followed by the name of the parameter, then the description. Return values don't have a name, so the description begins right after `:returns:`:

```swift
/**
    Repeats a string `times` times.

    :param: str     The string to repeat.
    :param: times   The number of times to repeat `str`.

    :returns: A new string with `str` repeated `times` times.
*/
func repeatString(str: String, times: Int) -> String {
    return join("", Array(count: times, repeatedValue: str))
}
```

### Code Blocks

Code blocks can also be embedded in documentation comments, which is useful for demonstrating proper usage or implementation details. Insert a code block by indenting it with at least two spaces:

```swift
/**
    The area of a `Shape` instance.

    Computation depends on the shape of the instance. For a triangle, `area` is equivalent to:

        let height = triangle.calculateHeight()
        let area = triangle.base * height / 2
*/
var area: CGFloat { get }
```

## New Documentation for My Bicycle Class

How does this look when applied to an entire class? Quite nicely, as it turns out:

```swift
import Foundation

/// 🚲 A two-wheeled, human-powered mode of transportation.
class Bicycle {
    /**
        Frame style.

        - Road: For streets or trails.
        - Touring: For long journeys.
        - Cruiser: For casual trips around town.
        - Hybrid: For general-purpose transportation.
    */
    enum Style {
        case Road, Touring, Cruiser, Hybrid
    }

    /**
        Mechanism for converting pedal power into motion.

        - Fixed: A single, fixed gear.
        - Freewheel: A variable-speed, disengageable gear.
    */
    enum Gearing {
        case Fixed
        case Freewheel(speeds: Int)
    }

    /**
        Hardware used for steering.

        - Riser: A casual handlebar.
        - Café: An upright handlebar.
        - Drop: A classic handlebar.
        - Bullhorn: A powerful handlebar.
    */
    enum Handlebar {
        case Riser, Café, Drop, Bullhorn
    }

    /// The style of the bicycle
    let style: Style

    /// The gearing of the bicycle
    let gearing: Gearing

    /// The handlebar of the bicycle
    let handlebar: Handlebar

    /// The size of the frame, in centimeters.
    let frameSize: Int

    /// The number of trips traveled by the bicycle
    private(set) var numberOfTrips: Int

    /// The total distance traveled by the bicycle, in meters
    private(set) var distanceTravelled: Double

    /**
        Initializes a new bicycle with the provided parts and specifications.

        :param: style The style of the bicycle
        :param: gearing The gearing of the bicycle
        :param: handlebar The handlebar of the bicycle
        :param: centimeters The frame size of the bicycle, in centimeters

        :returns: A beautiful, brand-new bicycle, custom built just for you.
    */
    init(style: Style, gearing: Gearing, handlebar: Handlebar, frameSize centimeters: Int) {
        self.style = style
        self.gearing = gearing
        self.handlebar = handlebar
        self.frameSize = centimeters

        self.numberOfTrips = 0
        self.distanceTravelled = 0
    }

    /**
        Take a bike out for a spin.

        :param: meters The distance to travel, in meters
    */
    func travel(distance meters: Double) {
        if meters > 0 {
            distanceTravelled += meters
            ++numberOfTrips
        }
    }
}
```

Option-click on the `enum` declaration of `Style`, and the description is rendered beautifully with a bulleted list:

![Swift enum Declaration Documentation]({% asset swift-documentation-enum-declaration.png @path %})

Quick Documentation for the `travel` method parses the parameter into a separate field, as expected:

![Swift func Declaration Documentation]({% asset swift-documentation-method-declaration.png @path %})

## MARK / TODO / FIXME

In Objective-C, [the preprocessor directive `#pragma mark`](https://nshipster.com/pragma/) is used to divide functionality into meaningful, easy-to-navigate sections. In Swift, there are no preprocessor directives (the closest analog is the similarly-octothorped [build configurations][1]), but the same effect can be achieved with the comment `// MARK: `.

As of Xcode 6β4, the following comments will appear in Xcode's source navigator:

- `// MARK: ` _(equivalent to `#pragma`; a mark followed by a single dash (`-`) will be rendered with a horizontal divider)_
- `// TODO: `
- `// FIXME: `

> Other conventional comment tags, such as `NOTE` and `XXX`, are not recognized by Xcode.

To demonstrate these new tags, here's how the `Bicycle` class can be extended to adopt the `Printable` protocol and implement `description`.

![Xcode 6 Documentation Source Navigator MARK / TODO / FIXME]({% asset swift-documentation-xcode-source-navigator.png @path %})

```swift
// MARK: Printable

extension Bicycle: Printable {
    var description: String {
        var descriptors: [String] = []

        switch self.style {
        case .Road:
            descriptors.append("A road bike for streets or trails")
        case .Touring:
            descriptors.append("A touring bike for long journeys")
        case .Cruiser:
            descriptors.append("A cruiser bike for casual trips around town")
        case .Hybrid:
            descriptors.append("A hybrid bike for general-purpose transportation")
        }

        switch self.gearing {
        case .Fixed:
            descriptors.append("with a single, fixed gear")
        case .Freewheel(let n):
            descriptors.append("with a \(n)-speed freewheel gear")
        }

        switch self.handlebar {
        case .Riser:
            descriptors.append("and casual, riser handlebars")
        case .Café:
            descriptors.append("and upright, café handlebars")
        case .Drop:
            descriptors.append("and classic, drop handlebars")
        case .Bullhorn:
            descriptors.append("and powerful bullhorn handlebars")
        }

        descriptors.append("on a \(frameSize)\" frame")

        // FIXME: Use a distance formatter
        descriptors.append("with a total of \(distanceTravelled) meters traveled over \(numberOfTrips) trips.")

        // TODO: Allow bicycles to be named?

        return join(", ", descriptors)
    }
}
```

Bringing everything together in code:

```swift
let bike = Bicycle(style: .Road, gearing: .Freewheel(speeds: 8), handlebar: .Drop, frameSize: 53)

bike.travel(distance: 1_500) // Pedal around the neighborhood
bike.travel(distance: 200) // Ride to the market

println(bike)
// "A road bike for streets or trails, with a 8-speed freewheel gear, and classic, drop handlebars, on a 53" frame, with a total of 1700.0 meters traveled over 2 trips."
```

* * *

Although Swift's tooling and documentation are still evolving rapidly, it's wise to establish good habits early, by using the new lightweight markup language conventions for documentation, as well as `MARK: ` comments.

Go ahead and try out these tips, and add them to your `TODO: ` list.

[1]: https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/BuildingCocoaApps/InteractingWithCAPIs.html#//apple_ref/doc/uid/TP40014216-CH8-XID_25

**Translator's note: For ease of understanding, the comments in this article were translated into Chinese; in real projects we still recommend writing them in English.**
24.137195
300
0.651257
yue_Hant
0.688535
4c4a4a2a266e5644e742508eca5cb0b962eaed7a
4,969
md
Markdown
_drafts/blogs/2020-05-01-The-Architecture-of-Lenet-5.md
pengfeinie/pengfeinie.github.io
d6ff67e9571a0c0217c0ea1a7ca40c64731e9b44
[ "MIT" ]
null
null
null
_drafts/blogs/2020-05-01-The-Architecture-of-Lenet-5.md
pengfeinie/pengfeinie.github.io
d6ff67e9571a0c0217c0ea1a7ca40c64731e9b44
[ "MIT" ]
null
null
null
_drafts/blogs/2020-05-01-The-Architecture-of-Lenet-5.md
pengfeinie/pengfeinie.github.io
d6ff67e9571a0c0217c0ea1a7ca40c64731e9b44
[ "MIT" ]
null
null
null
---
title: 'The Architecture of Lenet-5'
date: 2020-05-01
tags:
  - Convolution Neural Network
  - CNN
---

## What is Lenet5?

Lenet-5 is one of the earliest pre-trained models, proposed by Yann LeCun and others in the year 1998 in the research paper [Gradient-Based Learning Applied to Document Recognition](http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf). They used this architecture for recognizing handwritten and machine-printed characters. The main reason behind the popularity of this model was its simple and straightforward architecture. It is a multi-layer convolution neural network for image classification.

## The Architecture of the Model

Let's understand the architecture of Lenet-5. The network has 5 layers with learnable parameters and is hence named Lenet-5. It has three sets of convolution layers combined with average pooling. After the convolution and average pooling layers, we have two fully connected layers. At last, a Softmax classifier classifies the images into their respective classes.

![Lenet-5](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-22-52.png)

The input to this model is a 32 X 32 grayscale image, hence the number of channels is one.

![32 X 32 grayscale image](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-29-32.png)

We then apply the first convolution operation with a filter size of 5X5, and we have 6 such filters. As a result, we get a feature map of size 28X28X6. Here the number of channels is equal to the number of filters applied.

![Lenet-5 - first convolution operation](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-33-59.png)

After the first convolution operation, we apply average pooling, and the size of the feature map is reduced by half. Note that the number of channels stays intact.

![first pooling operation](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-42-25.png)

Next, we have a convolution layer with sixteen filters of size 5X5. The feature map changes again: it is now 10X10X16. The output size is calculated in a similar manner. After this, we again apply an average pooling or subsampling layer, which again reduces the size of the feature map by half, i.e. 5X5X16.

![Lenet-5 size 5X5](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-47-59.png)

Then we have a final convolution layer of size 5X5 with 120 filters, as shown in the above image, leaving the feature map size 1X1X120. After flattening, the result is 120 values.

After these convolution layers, we have a fully connected layer with eighty-four neurons. At last, we have an output layer with ten neurons, since the data have ten classes.

Here is the final architecture of the Lenet-5 model.

![Lenet-5 model](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-52-17.png)

## Architecture Details

Let's understand the architecture in more detail.

![Lenet-5 Architecture Details](https://cdn.analyticsvidhya.com/wp-content/uploads/2021/03/Screenshot-from-2021-03-18-12-56-51.png)

The first layer is the input layer with feature map size 32X32X1.

Then we have the first convolution layer with 6 filters of size 5X5 and a stride of 1. The activation function used at this layer is tanh. The output feature map is 28X28X6.

Next, we have an average pooling layer with filter size 2X2 and a stride of 2. The resulting feature map is 14X14X6; the pooling layer doesn't affect the number of channels.

After this comes the second convolution layer with 16 filters of 5X5 and stride 1. Again, the activation function is tanh. Now the output size is 10X10X16.

Then comes another average pooling layer of 2X2 with stride 2. As a result, the size of the feature map is reduced to 5X5X16.

The final convolution layer has 120 filters of 5X5 with stride 1 and activation function tanh, producing an output of size 120.

The next layer is a fully connected layer with 84 neurons, resulting in an output of 84 values; the activation function used here is again tanh.

The last layer is the output layer with 10 neurons and a Softmax function. The Softmax gives the probability that a data point belongs to a particular class. The highest value is then predicted.

This is the entire architecture of the Lenet-5 model. The number of trainable parameters of this architecture is around sixty thousand.

## End Notes

This was all about the Lenet-5 architecture. To summarize, the network has:

- 5 layers with learnable parameters.
- A grayscale image as input.
- 3 convolution layers, two average pooling layers, and two fully connected layers with a softmax classifier.
- Around 60,000 trainable parameters.

[https://www.analyticsvidhya.com/blog/2021/03/the-architecture-of-lenet-5/](https://www.analyticsvidhya.com/blog/2021/03/the-architecture-of-lenet-5/)
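The "around sixty thousand" figure quoted above can be checked with a short calculation. This is an illustrative sketch (not part of the original article), counting (weights + 1 bias) per filter or neuron for each learnable layer:

```python
def conv_params(kernel, in_channels, filters):
    """(kernel * kernel * in_channels weights + 1 bias) per filter."""
    return (kernel * kernel * in_channels + 1) * filters

def fc_params(n_in, n_out):
    """(n_in weights + 1 bias) per output neuron."""
    return (n_in + 1) * n_out

lenet5 = [
    conv_params(5, 1, 6),     # C1: 156
    conv_params(5, 6, 16),    # C3: 2416
    conv_params(5, 16, 120),  # C5: 48120
    fc_params(120, 84),       # F6: 10164
    fc_params(84, 10),        # Output: 850
]

print(sum(lenet5))  # 61706 -- "around sixty thousand"
```

The pooling layers contribute nothing here, since (in this simplified modern reading of the architecture) they have no learnable weights.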
62.1125
367
0.788287
eng_Latn
0.99719
4c4a6554b27218d1e5c06b5d67b603eaa330c9e7
2,540
md
Markdown
docs/ado/reference/adox-api/setobjectowner-method.md
drake1983/sql-docs.es-es
d924b200133b8c9d280fc10842a04cd7947a1516
[ "CC-BY-4.0", "MIT" ]
1
2020-04-25T17:50:01.000Z
2020-04-25T17:50:01.000Z
docs/ado/reference/adox-api/setobjectowner-method.md
drake1983/sql-docs.es-es
d924b200133b8c9d280fc10842a04cd7947a1516
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/ado/reference/adox-api/setobjectowner-method.md
drake1983/sql-docs.es-es
d924b200133b8c9d280fc10842a04cd7947a1516
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: SetObjectOwner Method | Microsoft Docs
ms.prod: sql
ms.prod_service: connectivity
ms.component: ado
ms.technology: connectivity
ms.custom: ''
ms.date: 01/19/2017
ms.reviewer: ''
ms.suite: sql
ms.tgt_pltfrm: ''
ms.topic: conceptual
apitype: COM
f1_keywords:
- _Catalog::SetObjectOwner
- _Catalog::raw_SetObjectOwner
helpviewer_keywords:
- SetObjectOwner method [ADOX]
ms.assetid: e5170a37-9d6e-43db-bfb6-9b6631fa3048
caps.latest.revision: 12
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: 09dd25fde3840a34d8ae3771a6a2eb066296fc5a
ms.sourcegitcommit: 1740f3090b168c0e809611a7aa6fd514075616bf
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 05/03/2018
---
# <a name="setobjectowner-method"></a>SetObjectOwner Method
Specifies the owner of an object in a [Catalog](../../../ado/reference/adox-api/catalog-object-adox.md).

## <a name="syntax"></a>Syntax

```
Catalog.SetObjectOwner ObjectName, ObjectType, OwnerName [,ObjectTypeId]
```

#### <a name="parameters"></a>Parameters
*ObjectName*
A **String** value that specifies the name of the object for which to set the owner.

*ObjectType*
A **Long** value that can be one of the [ObjectTypeEnum](../../../ado/reference/adox-api/objecttypeenum.md) constants, which specifies the type of the object.

*OwnerName*
A **String** value that specifies the [Name](../../../ado/reference/adox-api/name-property-adox.md) of the [User](../../../ado/reference/adox-api/user-object-adox.md) or [Group](../../../ado/reference/adox-api/group-object-adox.md) to own the object.

*ObjectTypeId*
Optional. A **Variant** value that specifies the GUID for a provider object type not defined by the OLE DB specification. This parameter is required if *ObjectType* is set to **adPermObjProviderSpecific**; otherwise, it is not used.

## <a name="remarks"></a>Remarks
An error will occur if the provider does not support specifying object owners.

## <a name="applies-to"></a>Applies To
[Catalog Object (ADOX)](../../../ado/reference/adox-api/catalog-object-adox.md)

## <a name="see-also"></a>See Also
[GetObjectOwner and SetObjectOwner Methods Example (VB)](../../../ado/reference/adox-api/getobjectowner-and-setobjectowner-methods-example-vb.md)
[GetObjectOwner Method (ADOX)](../../../ado/reference/adox-api/getobjectowner-method-adox.md)
40.967742
274
0.726772
spa_Latn
0.532714
4c4aadbbda9efa234263b4cb646a174c52b39b87
138
md
Markdown
src/test/circular_reference/specification/testRP/readme.md
ruowan/avocado
bd979876a4d591e71cc9db11877cb039ba5214c9
[ "MIT" ]
21
2019-03-20T19:57:40.000Z
2021-08-02T00:39:24.000Z
src/test/circular_reference/specification/testRP/readme.md
ruowan/avocado
bd979876a4d591e71cc9db11877cb039ba5214c9
[ "MIT" ]
38
2019-03-19T00:41:48.000Z
2022-03-03T16:10:42.000Z
src/test/circular_reference/specification/testRP/readme.md
ruowan/avocado
bd979876a4d591e71cc9db11877cb039ba5214c9
[ "MIT" ]
10
2019-04-20T21:14:59.000Z
2021-04-07T12:48:41.000Z
# Header > see https://aka.ms/autorest ```yaml $(tag) == 'something' input-file: - specs/a.json - specs/b.json - specs/c.json ```
12.545455
29
0.594203
hun_Latn
0.197654
4c4b3ef2e902a28489408d91626db76062e703dc
3,043
md
Markdown
articles/hdinsight/interactive-query/interactive-query-troubleshoot-view-time-out.md
matmahnke/azure-docs.pt-br
6c96d25caf8663547775f333164198e3ed03972f
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/hdinsight/interactive-query/interactive-query-troubleshoot-view-time-out.md
matmahnke/azure-docs.pt-br
6c96d25caf8663547775f333164198e3ed03972f
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/hdinsight/interactive-query/interactive-query-troubleshoot-view-time-out.md
matmahnke/azure-docs.pt-br
6c96d25caf8663547775f333164198e3ed03972f
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Apache Hive view times out fetching a query result - Azure HDInsight
description: Apache Hive view times out when fetching a query result in Azure HDInsight
ms.service: hdinsight
ms.topic: troubleshooting
author: hrasheed-msft
ms.author: hrasheed
ms.reviewer: jasonh
ms.date: 07/30/2019
ms.openlocfilehash: 2ed60dc6404d16e5d7df302a0cc255e18f1f5b34
ms.sourcegitcommit: d767156543e16e816fc8a0c3777f033d649ffd3c
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 10/26/2020
ms.locfileid: "92534644"
---
# <a name="scenario-apache-hive-view-times-out-when-fetching-a-query-result-in-azure-hdinsight"></a>Scenario: Apache Hive view times out when fetching a query result in Azure HDInsight

This article describes troubleshooting steps and possible resolutions for issues when using Interactive Query components in Azure HDInsight clusters.

## <a name="issue"></a>Issue

When running certain queries from the Apache Hive view, the following error may be encountered:

```
result fetch timed out
java.util.concurrent.TimeoutException: deadline passed
```

## <a name="cause"></a>Cause

The default Hive view timeout value may not be suitable for the query you are running. The specified time period is too short for the Hive view to fetch the query result.

## <a name="resolution"></a>Resolution

Increase the Apache Ambari Hive view timeouts by setting the following properties in `/etc/ambari-server/conf/ambari.properties`.

```
views.ambari.request.read.timeout.millis=300000
views.request.read.timeout.millis=300000
views.ambari.hive.<HIVE_VIEW_INSTANCE_NAME>.result.fetch.timeout=300000
```

The value of `HIVE_VIEW_INSTANCE_NAME` is available at the end of the Hive view URL.
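For clusters provisioned with scripts, the same edit can be applied programmatically. The sketch below is a generic key=value properties-file rewrite, not an Ambari-specific API — it simply replaces matching keys and appends any that are missing:

```python
def set_properties(text, updates):
    """Return properties-file text with the given keys set, appending missing keys."""
    lines, seen = [], set()
    for line in text.splitlines():
        key = line.split("=", 1)[0].strip()
        if key in updates:
            lines.append(f"{key}={updates[key]}")  # overwrite existing value
            seen.add(key)
        else:
            lines.append(line)  # keep comments and unrelated keys as-is
    for key, value in updates.items():
        if key not in seen:
            lines.append(f"{key}={value}")  # append keys not yet present
    return "\n".join(lines)

timeouts = {
    "views.ambari.request.read.timeout.millis": "300000",
    "views.request.read.timeout.millis": "300000",
}
print(set_properties("views.request.read.timeout.millis=10000", timeouts))
```

After rewriting the file, restart the Ambari server for the new timeouts to take effect.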
## <a name="next-steps"></a>Next steps

If you didn't see your problem or are unable to solve your issue, visit one of the following channels for more support:

* Get answers from Azure experts through [Azure Community Support](https://azure.microsoft.com/support/community/).

* Connect with [@AzureSupport](https://twitter.com/azuresupport) - the official Microsoft Azure account for improving customer experience by connecting the Azure community to the right resources: answers, support, and experts.

* If you need more help, you can submit a support request from the [Azure portal](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade/). Select **Support** from the menu bar or open the **Help + support** hub. For more detailed information, review [How to create an Azure support request](../../azure-portal/supportability/how-to-create-azure-support-request.md). Access to Subscription Management and billing support is included with your Microsoft Azure subscription, and Technical Support is provided through one of the [Azure Support Plans](https://azure.microsoft.com/support/plans/).
56.351852
661
0.795925
por_Latn
0.993361
4c4c52b3a4aa97e2726c5a6ab9265cf334fefe59
10,043
md
Markdown
README.md
biobakery/GaPLAC
21d903d0a5a215d96028bde34d58453f05d66bc8
[ "MIT" ]
null
null
null
README.md
biobakery/GaPLAC
21d903d0a5a215d96028bde34d58453f05d66bc8
[ "MIT" ]
5
2021-12-10T20:43:39.000Z
2021-12-10T20:48:31.000Z
README.md
biobakery/GaPLAC
21d903d0a5a215d96028bde34d58453f05d66bc8
[ "MIT" ]
null
null
null
# Guide

[![Build Status](https://github.com/biobakery/gptool.jl/workflows/CI/badge.svg)](https://github.com/biobakery/gptool.jl/actions) [![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://kescobo.github.io/gptool.jl/stable) [![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://kescobo.github.io/gptool.jl/dev)

This guide is intended to provide an overview of the basic workflow using GaPLAC. A complete command reference, as well as the available covariance and likelihood functions, is provided below.

## Installation

1. Install [Julia](https://julialang.org/).
2. Download GaPLAC's repository and unpack it somewhere.
3. Open a console in GaPLAC's root folder and run

   ```
   $ julia --project=@. -e 'using Pkg; Pkg.instantiate()'
   ```

   This will install the required packages and may take a few minutes.
4. If on Mac/UNIX, to use the `./gaplac ...` format, you may need to run `chmod u+x ./gaplac`.

## Generating some sample data

GaPLAC has five main commands to work with: `sample`, `mcmc`, `select`, `predict`, and `fitplot`. We will first look at `sample`, which draws a sample from a Gaussian Process. This can be helpful to visualize the kinds of functions described by a particular GP, or to provide some sample data to test later functions.

Run the following command from the GaPLAC root folder:

```
./gaplac sample "y :~| SExp(:x; l=1)" --at "x=-5:0.1:5" --plot gp_sample.png
```

This may take a few minutes the first time, since Julia must compile all the packages. It will produce a large amount of output to the console, and should also produce a plot in `gp_sample.png` which resembles:

![Wavy line](img/guide1.png)

If instead you get an error mentioning missing dependencies, it means that step 3 of the Installation section was not successfully completed. Try following the Installation instructions again to resolve this.

Let's look at each of the pieces of the command:

- `"y :~| SExp(:x; l=1)"`: This is the GP formula, much like a model formula in R.
In this case, the output (`y`) is modeled as a GP with a Squared-Exponential covariance function (`SExp`) with a lengthscale (`l`) of `1`. Note also the `:` in `:~|`. Normally, a data likelihood (described later) can be specified between the `:` and the `~`, but here we don't specify anything, and the GP will be modeled _without_ a likelihood. This effectively allows us to more directly observe the types of dynamics modeled by the Gaussian Process described in the formula.
- `--at "x=-5:0.1:5"`: This tells GaPLAC what values of `x` to sample the GP at.
- `--plot gp_sample.png`: Plot the dynamics here.

Try changing the lengthscale of the `SExp` term. How does this affect the function? Try adding other components (the full list is at the end of this document) by adding them to the formula, such as an Ornstein-Uhlenbeck process (`OU(x; l=1)`), or simply some `Noise`.

Now let's generate a smaller set of data at some randomly chosen `x` coordinates, and store the results in a file instead of printing to stdout:

```
./gaplac sample "y :~| SExp(:x; l=1.5)" --at "x = rand(Uniform(-5,5), 50)" --output data.tsv
```

Look at the contents of `data.tsv`. It should contain two columns: `x` and `y`, and the rows are not sorted in any way. We will use this data for the next command.

## Fitting parameters

We are usually interested in the parameters of the covariance function which best fit some data. This is accomplished with the `mcmc` command in GaPLAC, which will produce an [MCMC chain](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo) of samples from the posterior distribution of the model parameters. Try running the following command:

```
./gaplac mcmc "y ~| SExp(:x)" --data data.tsv --output mcmc.tsv --samples 500 --infer x
```

First, note that the model formula omits the additional `:` before the `~`. This will therefore default to a Gaussian likelihood. Now examine the output file `mcmc.tsv`.
Instead of showing a relationship between `x` and `y`, as the previous run of `sample` did, this file should contain several columns of parameter values, as well as the all-important final column containing the log of the unnormalized posterior density for the sample (think of this as something like the goodness of fit).

In particular, take a look at the covariance function parameter `ℓ`, which we set above in the model formula to `1.5`, but which we did not tell the `mcmc` command. If all worked well, the mean of this parameter should converge to, and hover around, the true value of `1.5`.

## TODO: Add `predict`

## Comparing models

`mcmc` fits a single model, but how do we know it's the right model? Maybe another model would be a better fit? To answer this question, we can run `mcmc` again with the other model. In this case, let's test a different kind of time-varying process called an Ornstein-Uhlenbeck (OU) process:

```
./gaplac mcmc "y ~| OU(:x)" --data data.tsv --output mcmc_ou.tsv --samples 500
```

This will give us a second set of model fit results in a new `mcmc_ou.tsv` file. Now we can use the goodness-of-fit values in each of the files to determine which of the models we believe (hint: we generated data from a Squared-Exponential covariance function, and thus we expect the OU process to perform worse). Let's test that with the `select` command:

```
./gaplac select --chains mcmc.tsv mcmc_ou.tsv
```

```
┌ Info: Log2 Bayes: 8.405
│
│ • Log(pdf) - model 1: -81.29118
│
│ • Log(pdf) - model 2: -89.69639
│
└ Note - Positive values indicate more evidence for model 1
```

This will compare the log posterior values stored in each of the MCMC chains, and summarize them as a [Bayes Factor](https://en.wikipedia.org/wiki/Bayes_factor), which is reported in log2 scale.
Here, log2 Bayes Factors greater than 1 indicate that the first model (in this case the Squared-Exponential) should be preferred, while negative numbers indicate the opposite - that the second model should be preferred.

You may also compare different formula parameters on your initial data, rather than using the outputs from MCMC.

```
./gaplac -v select --formulae "y ~| SExp(:x, l=1.5)" "y ~| OU(:x, l=1.5)" --data data.tsv
```

```
[ Info: running 'select'
┌ Info:
│   Dict{String, Any} with 4 entries:
│     "plot"     => nothing
│     "formulae" => Any["y ~| SExp(:x, l=1.5)", "y ~| OU(:x, l=1.5)"]
│     "data"     => "data.tsv"
└     "chains"   => Any[]
┌ Info: Log2 Bayes: 4.44
│
│ • Log(pdf) - model 1: -31.53397005887427
│
│ • Log(pdf) - model 2: -35.97395926954643
│
└ Note - Positive values indicate more evidence for model 1
```

# Command references

Available by running the script with `--help`

## Commands

```sh
./gaplac --help
usage: main.jl [-v] [-q] [--debug] [--log LOG] [-h]
               {mcmc|predict|sample|fitplot|select}

commands:
  mcmc           Run MCMC to optimize hyperparameters
  predict        Calculate the posterior of a GP given data # not yet implemented
  sample         Sample the posterior of a GP
  fitplot        Diagnostic plots showing the posteriors of different
                 components of the GP # not yet implemented
  select         Output model selection parameters; requires --mcmc and
                 --mcmc2

optional arguments:
  -v, --verbose  Log level to @info
  -q, --quiet    Log level to @warning
  --debug        Log level to @debug
  --log LOG      Log to a file as well as stdout
  -h, --help     show this help message and exit
```

## Sample

```sh
./gaplac sample --help
usage: main.jl sample --at AT [--plot PLOT] [-o OUTPUT] [-h] formula

positional arguments:
  formula              GP formula specification

optional arguments:
  --at AT              Range to sample at, eg 'x=-5:0.1:5'
  --plot PLOT          File to plot to
  -o, --output OUTPUT  Table output of GP sample - must end with '.csv'
                       or '.tsv'
  -h, --help           show this help message and exit
```

## MCMC

```sh
./gaplac mcmc --help
usage: main.jl mcmc -i DATA --infer INFER
               [INFER...] [-o OUTPUT] [--plot PLOT] [-h] formula

positional arguments:
  formula              GP formula specification

optional arguments:
  -i, --data DATA      Table input on which to run inference. Must contain
                       columns that correspond to values in 'formula'
  --infer INFER [INFER...]
                       Which model hyperparameter to infer. Specify variable
                       names, the hyperparameter(s) will be determined based
                       on kernel type (eg length scale for SExp)
  -o, --output OUTPUT  Table to output sampling chain
  --plot PLOT          File to plot to
  -h, --help           show this help message and exit
```

## Select

```sh
./gaplac select --help
usage: main.jl select [--formulae FORMULAE FORMULAE]
                      [--chains CHAINS CHAINS] [-i DATA] [--plot PLOT] [-h]

optional arguments:
  --formulae FORMULAE FORMULAE
                        Compare 2 GP formula specifications, requires
                        '--data' as well. Result will be logpdf of formula 2
                        - logpdf of formula 1. A positive value indicates
                        more evidence for formula 2.
  --chains CHAINS CHAINS
                        Compare 2 sampling chains from 'mcmc' command.
                        Result will be the log2 bayes factor. A positive
                        value indicates more evidence for chain 1.
  -i, --data DATA       Table input on which to run inference. Must contain
                        columns that correspond to values in both 'formulae'
  --plot PLOT           File to plot to
  -h, --help            show this help message and exit
```
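As an aside, the `SExp` covariance used in the formulas throughout this guide is the standard squared-exponential kernel. Here is a minimal Python sketch of the textbook definition (an illustration only, not GaPLAC's internal implementation; `lengthscale` corresponds to the `l` parameter in the formulas):

```python
import math

def sexp(x1, x2, lengthscale=1.0):
    """Squared-exponential covariance: k(x, x') = exp(-(x - x')^2 / (2 * l^2))."""
    return math.exp(-((x1 - x2) ** 2) / (2.0 * lengthscale ** 2))

# Nearby inputs are highly correlated; distant inputs are nearly independent.
print(round(sexp(0.0, 0.0), 3))                   # 1.0 (a point is perfectly correlated with itself)
print(round(sexp(0.0, 1.0, lengthscale=1.5), 3))  # 0.801
print(round(sexp(0.0, 5.0, lengthscale=1.5), 3))  # 0.004
```

Increasing the lengthscale makes distant points more correlated, which is why larger `l` values produce smoother sampled functions in the `sample` examples above.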
46.49537
634
0.658668
eng_Latn
0.993324
4c4ce92c571075813197d67a8468530e919c526f
11,478
md
Markdown
api/server/Telerik.Web.UI/RadToolTip.md
thevivacioushussain/ajax-docs
b46cd8ec574600abf8c256c0e20100eb382a9679
[ "MIT" ]
null
null
null
api/server/Telerik.Web.UI/RadToolTip.md
thevivacioushussain/ajax-docs
b46cd8ec574600abf8c256c0e20100eb382a9679
[ "MIT" ]
null
null
null
api/server/Telerik.Web.UI/RadToolTip.md
thevivacioushussain/ajax-docs
b46cd8ec574600abf8c256c0e20100eb382a9679
[ "MIT" ]
null
null
null
--- title: Telerik.Web.UI.RadToolTip page_title: Telerik.Web.UI.RadToolTip description: Telerik.Web.UI.RadToolTip --- # Telerik.Web.UI.RadToolTip RadToolTip class ## Inheritance Hierarchy * System.Object * System.Web.UI.Control * System.Web.UI.WebControls.WebControl * Telerik.Web.UI.RadWebControl : IControl, IControlResolver, IPostBackDataHandler, IScriptControl, ISkinnableControl * Telerik.Web.UI.RadToolTipBase * Telerik.Web.UI.RadToolTip ## Properties ### Animation `ToolTipAnimation` Get/Set the animation effect of the tooltip. Turned off by default. ### AnimationDuration `ToolTipAnimation` Sets/gets the duration of the animation in milliseconds. 500 by default. ### AutoCloseDelay `Int32` Get/Set the delay (in milliseconds) after which the tooltip will hide if the mouse stands still over the target element. 3000 by default. ### ClientIDMode `ClientIDMode` This property is overridden in order to support controls which implement INamingContainer. The default value is changed to "AutoID". ### ContentScrolling `ToolTipScrolling` Get/Set overflow of the tooltip's content area. ### CssClassFormatString `String` The CssClass property will now be used instead of the former Skin and will be modified in AddAttributesToRender() ### EnableAjaxSkinRendering `String` Gets or sets the value, indicating whether to render the skin CSS files during Ajax requests #### Remarks If EnableAjaxSkinRendering is set to false you will have to register the needed control base CSS file by hand when adding/showing the control with Ajax. ### EnableAriaSupport `Boolean` When set to true enables support for WAI-ARIA ### EnableEmbeddedBaseStylesheet `Boolean` Gets or sets the value, indicating whether to render the link to the embedded base stylesheet of the control or not. #### Remarks If EnableEmbeddedBaseStylesheet is set to false you will have to register the needed control base CSS file by hand. 
### EnableEmbeddedScripts `Boolean` Gets or sets the value, indicating whether to render script references to the embedded scripts or not. #### Remarks If EnableEmbeddedScripts is set to false you will have to register the needed Scripts files by hand. ### EnableEmbeddedSkins `String` Gets or sets the value, indicating whether to render links to the embedded skins or not. #### Remarks If EnableEmbeddedSkins is set to false you will have to register the needed CSS files by hand. ### EnableRippleEffect `Boolean` Returns true if ripple effect should be added ### EnableRoundedCorners `Boolean` Gets or sets a value indicating whether the RadToolTip should have rounded corners. True by default. ### EnableShadow `Boolean` Gets or sets a value indicating whether the RadToolTip should have shadow. True by default. ### Height `Unit` Get/Set the Height of the tooltip in pixels. ### HideDelay `Int32` Get/Set delay (in milliseconds) for the tooltip to hide after the mouse leaves the target element. 300 by default. ### HideEvent `ToolTipHideEvent` Get/Set the client event at which the tooltip will be hidden. ### IgnoreAltAttribute `Boolean` Get/Set the indicator whether the Alt specified for the target should be ignored or not #### Remarks You can use the IgnoreAltAttribute to instruct the RadToolTip to ignore the AlternateText and alt properties and not remove them from its target element. This will result in a change of the content source priorities for images and a second tooltip being shown under IE6 and IE7, as these browsers interpret the alt attribute like the title attribute. ### IsClientID `Boolean` Get/Set whether the TargetControlID is server or client ID ### IsSkinSet `String` For internal use. ### ManualClose `Boolean` This property is obsolete. Please use HideEvent="ManualClose" instead. Gets/Sets whether the tooltip will need to be closed manually by the user using the [x] button, or will close automatically. 
### ManualCloseButtonText `Boolean` Get/Set the manual close button's tooltip text. ### Modal `Boolean` Gets or sets a value indicating whether a tooltip is modal or not. ### MouseTrailing `Boolean` Get/Set whether the tooltip will move to follow mouse movement over the target control or will stay fixed. ### OffsetX `Int32` Get/Set the tooltip's horizontal offset from the target control in pixels. Works in cooperation with the Position property. ### OffsetY `Int32` Get/Set the tooltip's vertical offset from the target control in pixels. Works in cooperation with the Position property. ### OnClientBeforeHide `String` Gets or sets the name of client-side JavaScript function that is called just before the RadToolTip hides. #### Remarks If specified, the OnClientBeforeHideclient-side event handler that is called before the tooltip is hidden. Two parameters are passed to the handler:sender, the RadToolTip object.args.This event can be cancelled. ### OnClientBeforeShow `String` Gets or sets the name of client-side JavaScript function that is called just before the RadToolTip is shown. #### Remarks If specified, the OnClientBeforeShowclient-side event handler is called before the RadToolTip is shown. Two parameters are passed to the handler:sender, the RadToolTip object.args.This event can be cancelled. ### OnClientHide `String` Gets or sets the name of client-side JavaScript function that is called just after the RadToolTip is hidden. #### Remarks If specified, the OnClientHideclient-side event handler that is called after the tooltip is hidden. Two parameters are passed to the handler:sender, the RadToolTip object.args.This event cannot be cancelled. ### OnClientShow `String` Gets or sets the name of client-side JavaScript function that is called just after the RadToolTip is shown. 
#### Remarks

If specified, the OnClientShow client-side event handler is called after the tooltip is shown. Two parameters are passed to the handler: sender, the RadToolTip object, and args. This event cannot be cancelled.

### Overlay

`Boolean`

Gets or sets a value indicating whether the tooltip will create an overlay element to ensure it will be displayed over a Flash element.

### Position

`ToolTipPosition`

Get/Set the top/left position of the tooltip relative to the target element.

#### Remarks

If there isn't enough room in the viewport to show the tooltip in the specified position (this depends on the tooltip's Width and Height and the target's position), the tooltip will automatically reposition itself so that it is entirely visible. Usually this will be the opposite position. All positions are tried until enough room is found on the screen; if there isn't any, the top left corner of the tooltip will be visible and scrollbars will be produced.

### RegisterWithScriptManager

`Boolean`

Gets or sets the value indicating whether to register with the ScriptManager control on the page.

#### Remarks

If RegisterWithScriptManager is set to false the control can be rendered on the page using Web Services or normal callback requests/page methods.

### RelativeTo

`ToolTipRelativeDisplay`

Get/Set whether the tooltip should appear relative to the mouse, to the target element, or to the browser viewport. Works in cooperation with the Position property.

#### Remarks

When the display is relative to the mouse or to the element, the tooltips are positioned absolutely on the page; when relative to the browser viewport, they have fixed position and do not scroll with the rest of the page.

### RenderInPageRoot

`Boolean`

Get/Set whether the tooltip should be added as a child of the form element or as a child of its direct parent. True by default.

### RenderMode

`RenderMode`

Specifies the rendering mode of the control. Setting the mode to Lightweight will yield HTML5/CSS3 html and css.
#### Remarks

Lightweight rendering mode might change the look of the component in some older browsers that don't support CSS3/HTML5.

### ResolvedRenderMode

`RenderMode`

Returns the resolved RenderMode in case the original value was Auto.

### RuntimeSkin

`String`

Gets the real skin name for the control user interface. If Skin is not set, returns "Default"; otherwise returns Skin.

### ShowCallout

`Boolean`

Get/Set whether the tooltip will show a small arrow pointing to its target element. True by default.

### ShowDelay

`Int32`

Get/Set the time (in milliseconds) for which the user should hold the mouse over a target element for the tooltip to appear. The default is 400.

### ShowEvent

`ToolTipShowEvent`

Get/Set the client event at which the tooltip will be made visible for a particular target control.

### Skin

`String`

Gets or sets the skin name for the control user interface.

#### Remarks

If this property is not set, the control will render using the skin named "Default". If EnableEmbeddedSkins is set to false, the control will not render a skin.

### Sticky

`Boolean`

This property is obsolete. Please use HideEvent="LeaveToolTip" instead. Gets/Sets whether the tooltip will hide when the mouse moves away from the target element (when false), or when the mouse enters and then moves out of the tooltip itself (when set to true).

### TargetControlID

`String`

Get/Set the target control of the tooltip.

### Text

`String`

Get/Set the text that will appear in the tooltip (if it should be different from the content of the 'title' attribute of the target element).

### Title

`String`

Get/Set a title for the tooltip.

#### Remarks

This title is not affected by the rest of the content and is always displayed, regardless of the content source. For more details see this help article: http://www.telerik.com/help/aspnet-ajax/tooltip-content.html.
### VisibleOnPageLoad

`Boolean`

Gets or sets a value indicating whether the tooltip will open automatically when its parent .aspx page is loaded on the client.

### Width

`Unit`

Get/Set the width of the tooltip in pixels.

## Methods

### ApplyConditionalRendering

Use this from RenderContents of the inheritor.

#### Returns

`System.Void`

### ControlPreRender

Code moved into this method from OnPreRender to make sure it is executed when the framework skips OnPreRender() for some reason.

#### Returns

`System.Void`

### GetEmbeddedSkinNames

Returns the names of all embedded skins. Used by Telerik.Web.Examples.

#### Returns

`System.Collections.Generic.List`1`

### LoadClientState

Loads the client state data.

#### Parameters

#### clientState

`System.Collections.Generic.Dictionary{System.String,System.Object}`

#### Returns

`System.Void`

### LoadPostData

Executed when post data is loaded from the request.

#### Parameters

#### postDataKey

`System.String`

#### postCollection

`System.Collections.Specialized.NameValueCollection`

#### Returns

`System.Boolean`

### RaisePostDataChangedEvent

Executed when post data changes should invoke a changed event.

#### Returns

`System.Void`

### RegisterCssReferences

Registers the CSS references.

#### Returns

`System.Void`

### RegisterScriptControl

Registers the control with the ScriptManager.

#### Returns

`System.Void`

### SaveClientState

Saves the client state data.

#### Returns

`System.String`

### Show

Causes the tooltip to open automatically when the page is loaded.

#### Returns

`System.Void`
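The properties and client events above are typically set declaratively on the control. The fragment below is an illustrative sketch rather than an excerpt from this reference: the IDs, the attribute values, and the handler name `onBeforeShow` are invented for the example, and `window.suppressTooltips` stands in for a hypothetical application flag.

```aspx
<%-- Illustrative sketch only; IDs, values, and handler names are invented. --%>
<telerik:RadToolTip ID="RadToolTip1" runat="server"
    TargetControlID="SaveButton"
    Position="BottomRight"
    ShowDelay="400"
    HideDelay="300"
    HideEvent="LeaveToolTip"
    OnClientBeforeShow="onBeforeShow"
    Text="Saves the current draft." />

<script type="text/javascript">
    // OnClientBeforeShow receives (sender, args); this event is cancellable.
    function onBeforeShow(sender, args) {
        // Hypothetical application flag: suppress the tooltip when set.
        if (window.suppressTooltips) {
            args.set_cancel(true);
        }
    }
</script>
```

Because OnClientBeforeHide is also cancellable, the same `args.set_cancel(true)` pattern applies there, whereas OnClientShow and OnClientHide fire after the fact and cannot be cancelled.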
31.021622
176
0.757885
eng_Latn
0.991882
4c4d3d3794d5159704964cee6fc14748ae321f18
6,467
md
Markdown
desktop-src/DirectShow/video-mixing-renderer-filter-9.md
phuclv90/win32
a9cdfadeb7be044e1bb69f24530df9e9e774548f
[ "CC-BY-4.0", "MIT" ]
null
null
null
desktop-src/DirectShow/video-mixing-renderer-filter-9.md
phuclv90/win32
a9cdfadeb7be044e1bb69f24530df9e9e774548f
[ "CC-BY-4.0", "MIT" ]
null
null
null
desktop-src/DirectShow/video-mixing-renderer-filter-9.md
phuclv90/win32
a9cdfadeb7be044e1bb69f24530df9e9e774548f
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
Description: Video Mixing Renderer Filter 9
ms.assetid: 3885cca2-74b1-4066-8ecb-84c9841f9e66
title: Video Mixing Renderer Filter 9
ms.topic: article
ms.date: 05/31/2018
---

# Video Mixing Renderer Filter 9

In DirectX 9, the Video Mixing Renderer 9 (VMR-9) filter offers advanced video rendering capabilities on all platforms supported by DirectX. It is fully integrated with DirectX 9 3D capabilities. This means, for example, that you can easily add video to games and other 3D environments, or transform video images using Direct3D pixel shaders and other effects. This filter does not support video ports.

To maintain backward compatibility, the VMR-9 is not the default renderer on any system. To use this filter, add it to the filter graph explicitly and configure it before connecting any of its input pins.

The VMR-9 uses its own set of interfaces, structures, and enumerations, which are not always identical to the corresponding data types used with the VMR-7. The VMR-9 supports up to 16 monitors.

<table>
<colgroup>
<col style="width: 50%" />
<col style="width: 50%" />
</colgroup>
<tbody>
<tr class="odd">
<td>Filter Interfaces</td>
<td>The VMR-9 supports several distinct rendering modes.
It supports a different set of interfaces depending on the rendering mode:<br/>
<ul>
<li>All modes: <a href="/windows/desktop/api/Strmif/nn-strmif-iamcertifiedoutputprotection"><strong>IAMCertifiedOutputProtection</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-iamfiltermiscflags"><strong>IAMFilterMiscFlags</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-ibasefilter"><strong>IBaseFilter</strong></a>, <a href="/windows/desktop/api/Control/nn-control-imediaposition"><strong>IMediaPosition</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-imediaseeking"><strong>IMediaSeeking</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-iqualitycontrol"><strong>IQualityControl</strong></a>, <a href="/windows/desktop/api/Amvideo/nn-amvideo-iqualprop"><strong>IQualProp</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmraspectratiocontrol9"><strong>IVMRAspectRatioControl9</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrdeinterlacecontrol9"><strong>IVMRDeinterlaceControl9</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrfilterconfig9"><strong>IVMRFilterConfig9</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrmixerbitmap9"><strong>IVMRMixerBitmap9</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrmixercontrol9"><strong>IVMRMixerControl9</strong></a></li>
<li>Renderless mode: <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrsurfaceallocatornotify9"><strong>IVMRSurfaceAllocatorNotify9</strong></a></li>
<li>Windowed mode: <a href="/windows/desktop/api/Control/nn-control-ibasicvideo"><strong>IBasicVideo</strong></a>, <a href="/windows/desktop/api/Control/nn-control-ibasicvideo2"><strong>IBasicVideo2</strong></a>, <a href="/windows/desktop/api/Control/nn-control-ivideowindow"><strong>IVideoWindow</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrmonitorconfig9"><strong>IVMRMonitorConfig9</strong></a></li>
<li>Windowless mode: <a
href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrmonitorconfig9"><strong>IVMRMonitorConfig9</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrwindowlesscontrol9"><strong>IVMRWindowlessControl9</strong></a></li>
</ul>
To set the rendering mode, call <a href="/windows/desktop/api/Vmr9/nf-vmr9-ivmrfilterconfig9-setrenderingmode"><strong>IVMRFilterConfig9::SetRenderingMode</strong></a>. For more information, see <a href="vmr-modes-of-operation.md">VMR Modes of Operation</a>.<br/></td>
</tr>
<tr class="even">
<td>Input Pin Media Types</td>
<td>The input pins will connect with any type supported by the underlying video hardware.</td>
</tr>
<tr class="odd">
<td>Input Pin Interfaces</td>
<td><a href="/windows/desktop/api/videoacc/nn-videoacc-iamvideoaccelerator"><strong>IAMVideoAccelerator</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-imeminputpin"><strong>IMemInputPin</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-ioverlay"><strong>IOverlay</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-iqualitycontrol"><strong>IQualityControl</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-ipin"><strong>IPin</strong></a>, <a href="/windows/desktop/api/Strmif/nn-strmif-ipinconnection"><strong>IPinConnection</strong></a>, <a href="/windows/desktop/api/Vmr9/nn-vmr9-ivmrvideostreamcontrol9"><strong>IVMRVideoStreamControl9</strong></a></td>
</tr>
<tr class="even">
<td>Output Pin Media Types</td>
<td>Not applicable.</td>
</tr>
<tr class="odd">
<td>Output Pin Interfaces</td>
<td>Not applicable.</td>
</tr>
<tr class="even">
<td>Filter CLSID</td>
<td>CLSID_VideoMixingRenderer9</td>
</tr>
<tr class="odd">
<td>Property Page CLSID</td>
<td>N/A</td>
</tr>
<tr class="even">
<td>Executable</td>
<td>Quartz.dll</td>
</tr>
<tr class="odd">
<td><a href="merit.md">Merit</a></td>
<td>MERIT_DO_NOT_USE</td>
</tr>
<tr class="even">
<td><a href="filter-categories.md">Filter Category</a></td>
<td>CLSID_LegacyAmFilterCategory</td>
</tr>
</tbody>
</table>

## Remarks

An application can provide a custom allocator-presenter object that exposes the following interfaces:

- [**IVMRImagePresenter9**](/windows/desktop/api/Vmr9/nn-vmr9-ivmrimagepresenter9)
- [**IVMRImagePresenterConfig9**](/windows/desktop/api/Vmr9/nn-vmr9-ivmrimagepresenterconfig9) (optional)
- [**IVMRSurfaceAllocator9**](/windows/desktop/api/Vmr9/nn-vmr9-ivmrsurfaceallocator9)
- [**IVMRSurfaceAllocatorEx9**](/windows/desktop/api/Vmr9/nn-vmr9-ivmrsurfaceallocatorex9) (optional)
- [**IVMRWindowlessControl9**](/windows/desktop/api/Vmr9/nn-vmr9-ivmrwindowlesscontrol9) (optional)

For more information about custom allocator-presenters, see [Supplying a Custom Allocator-Presenter for VMR-9](supplying-a-custom-allocator-presenter-for-vmr-9.md).

An application can also provide a custom plug-in compositor that exposes the following interface:

- [**IVMRImageCompositor9**](/windows/desktop/api/Vmr9/nn-vmr9-ivmrimagecompositor9)

To configure the VMR with a custom compositor, call [**IVMRFilterConfig9::SetImageCompositor**](/windows/desktop/api/Vmr9/nf-vmr9-ivmrfilterconfig9-setimagecompositor).

## Related topics

<dl>
<dt>

[DirectShow Filters](directshow-filters.md)
</dt>
<dt>

[Using the Video Mixing Renderer](using-the-video-mixing-renderer.md)
</dt>
</dl>
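The page stresses a specific order of operations: add the VMR-9 to the graph explicitly, set its rendering mode, and only then connect input pins. The following C++ fragment is an untested sketch of that sequence — effectively pseudocode here, since it requires the Windows DirectShow headers, an initialized COM runtime, and an existing graph builder; error handling is reduced to early returns.

```cpp
// Untested sketch (Windows + DirectShow SDK assumed; not a verified build).
#include <dshow.h>
#include <d3d9.h>
#include <vmr9.h>

HRESULT AddVmr9Windowless(IGraphBuilder *pGraph, HWND hwndVideo)
{
    // Create the VMR-9 and add it to the graph explicitly:
    // it is not the default renderer on any system.
    IBaseFilter *pVmr9 = NULL;
    HRESULT hr = CoCreateInstance(CLSID_VideoMixingRenderer9, NULL,
                                  CLSCTX_INPROC_SERVER, IID_IBaseFilter,
                                  (void**)&pVmr9);
    if (FAILED(hr)) return hr;

    hr = pGraph->AddFilter(pVmr9, L"VMR-9");

    if (SUCCEEDED(hr))
    {
        // Configure the rendering mode BEFORE connecting any input pins.
        IVMRFilterConfig9 *pConfig = NULL;
        hr = pVmr9->QueryInterface(IID_IVMRFilterConfig9, (void**)&pConfig);
        if (SUCCEEDED(hr))
        {
            hr = pConfig->SetRenderingMode(VMR9Mode_Windowless);
            pConfig->Release();
        }
    }

    if (SUCCEEDED(hr))
    {
        // In windowless mode the filter exposes IVMRWindowlessControl9.
        IVMRWindowlessControl9 *pWc = NULL;
        hr = pVmr9->QueryInterface(IID_IVMRWindowlessControl9, (void**)&pWc);
        if (SUCCEEDED(hr))
        {
            hr = pWc->SetVideoClippingWindow(hwndVideo);
            pWc->Release();
        }
    }

    pVmr9->Release();
    return hr;
}
```

For renderless mode the application would instead pass `VMR9Mode_Renderless` and supply its own allocator-presenter via **IVMRSurfaceAllocatorNotify9**, as described in the Remarks above.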
55.75
1,274
0.760322
eng_Latn
0.237461
4c4ea73a3b86d2d70bbe6ba807db6dd634fefd0a
431
md
Markdown
README.md
gcko/readme_tw
2fac994291c4671678532d8e8b8f5dd7583d107e
[ "MIT" ]
null
null
null
README.md
gcko/readme_tw
2fac994291c4671678532d8e8b8f5dd7583d107e
[ "MIT" ]
11
2019-12-26T04:13:56.000Z
2022-02-26T13:14:48.000Z
README.md
gcko/readme_tw
2fac994291c4671678532d8e8b8f5dd7583d107e
[ "MIT" ]
null
null
null
# Readme.tw

This project uses GatsbyJS to build a ReactJS static site. The data layer is handled via GraphQL and calls to a Strapi site. Details for the location can be changed [here](gatsby-config.js).

## Getting Started

`git clone git@github.com:gcko/readme_tw.git`

This project uses Docker for development and production.

### Development

`docker-compose up`

### Production

`docker-compose -f docker-compose-prod.yml up`
22.684211
73
0.763341
eng_Latn
0.962586
4c4f69b8b655ece3de0f3d75e8695f84d61e32a9
211
md
Markdown
libraries/community/p1/All/S6b0724 text and graphics driver/README.md
deets/propeller
1468c8b334266f899882f404903b2dd833168799
[ "MIT" ]
82
2018-11-29T19:02:44.000Z
2022-03-11T18:50:24.000Z
libraries/community/p1/All/S6b0724 text and graphics driver/README.md
pik33/propeller
861213b91d4e6bd4e724a21389a5311d452f369d
[ "MIT" ]
30
2019-01-31T02:20:23.000Z
2022-01-26T17:50:25.000Z
libraries/community/p1/All/S6b0724 text and graphics driver/README.md
pik33/propeller
861213b91d4e6bd4e724a21389a5311d452f369d
[ "MIT" ]
73
2018-11-28T15:10:12.000Z
2022-03-22T21:00:25.000Z
# S6b0724 text and graphics driver

By: Erik Friesen

Language: Spin, Assembly

Created: Apr 16, 2013

Modified: April 16, 2013

This is an S6b0724/Ks0713, 6800-style parallel LCD driver with text and graphics.
17.583333
81
0.767773
eng_Latn
0.89999