hexsha
stringlengths
40
40
size
int64
5
1.04M
ext
stringclasses
6 values
lang
stringclasses
1 value
max_stars_repo_path
stringlengths
3
344
max_stars_repo_name
stringlengths
5
125
max_stars_repo_head_hexsha
stringlengths
40
78
max_stars_repo_licenses
listlengths
1
11
max_stars_count
int64
1
368k
max_stars_repo_stars_event_min_datetime
stringlengths
24
24
max_stars_repo_stars_event_max_datetime
stringlengths
24
24
max_issues_repo_path
stringlengths
3
344
max_issues_repo_name
stringlengths
5
125
max_issues_repo_head_hexsha
stringlengths
40
78
max_issues_repo_licenses
listlengths
1
11
max_issues_count
int64
1
116k
max_issues_repo_issues_event_min_datetime
stringlengths
24
24
max_issues_repo_issues_event_max_datetime
stringlengths
24
24
max_forks_repo_path
stringlengths
3
344
max_forks_repo_name
stringlengths
5
125
max_forks_repo_head_hexsha
stringlengths
40
78
max_forks_repo_licenses
listlengths
1
11
max_forks_count
int64
1
105k
max_forks_repo_forks_event_min_datetime
stringlengths
24
24
max_forks_repo_forks_event_max_datetime
stringlengths
24
24
content
stringlengths
5
1.04M
avg_line_length
float64
1.14
851k
max_line_length
int64
1
1.03M
alphanum_fraction
float64
0
1
lid
stringclasses
191 values
lid_prob
float64
0.01
1
faaa55863d164f04d8fee9de0da4793d1b16bb88
74
md
Markdown
README.md
lukoucky/restaurant_booking
bc6f0c655653571b5e190614168c2534f809992d
[ "MIT" ]
null
null
null
README.md
lukoucky/restaurant_booking
bc6f0c655653571b5e190614168c2534f809992d
[ "MIT" ]
null
null
null
README.md
lukoucky/restaurant_booking
bc6f0c655653571b5e190614168c2534f809992d
[ "MIT" ]
null
null
null
# restaurant_booking

Restaurant table booking system backend and frontend
24.666667
52
0.864865
eng_Latn
0.983015
faaa727cb98ea1667b34fb67a47514aed6c8a276
7,543
md
Markdown
README.md
Presnia/createx-nd
f16d5e9cce64103953214bbeae411a564e458522
[ "MIT" ]
null
null
null
README.md
Presnia/createx-nd
f16d5e9cce64103953214bbeae411a564e458522
[ "MIT" ]
null
null
null
README.md
Presnia/createx-nd
f16d5e9cce64103953214bbeae411a564e458522
[ "MIT" ]
null
null
null
# Gulp build for MaxGraph

> Uses Gulp 4

## Getting started

To use this build in a new project, clone the entire repository:

`git clone <this repo>`

Then, from the project root, run `npm i` to install all the dependencies listed in package.json.

After that you can use any of the four build commands (output files go to the __app__ folder in the root directory):

`gulp` - the base command; starts the development build using browser-sync.

`gulp build` - the production build. All assets are compressed and optimized for deployment to hosting.

`gulp cache` - a command to run after `gulp build` when you need to upload new files to hosting without caching.

`gulp backend` - a special command that produces a build for further backend integration. More on this below.

## Folder and file structure

```
├── src/                   # Sources
│   ├── js                 # Scripts
│   │   └── main.js        # Main script
│   │   ├── global.js      # Base project data: variables, helper functions, etc.
│   │   ├── components     # JS components
│   │   ├── vendor         # Local copies of libraries
│   ├── scss               # Site styles (sass preprocessor, scss syntax)
│   │   └── main.scss      # Main stylesheet: global settings plus imports of all needed components
│   │   └── vendor.scss    # Imports of library styles from the vendor folder
│   │   └── _fonts.scss    # Font imports (a mixin can be used)
│   │   └── _mixins.scss   # Imports of mixins from the mixins folder
│   │   └── _vars.scss     # CSS or SCSS variables
│   │   └── _settings.scss # Global styles
│   │   ├── components     # SCSS components
│   │   ├── mixins         # Ready-made SCSS mixins
│   │   ├── vendor         # Local CSS styles of libraries
│   ├── partials           # HTML page fragments
│   ├── img                # Images
│   │   ├── svg            # SVGs to be converted into a sprite
│   ├── resources          # Other assets: php, video files, favicon, etc.
│   │   ├── fonts          # Fonts in woff2 format
│   └── index.html         # Main HTML file
└── gulpfile.js            # Gulp configuration
└── package.json           # Build configuration and installed packages
└── .editorconfig          # Code formatting settings
└── .stylelintrc           # stylelint settings
└── README.md              # Build documentation
```

## Contents

1. [npm scripts](#npm-scripts)
2. [Working with HTML](#working-with-html)
3. [Working with CSS](#working-with-css)
4. [CSS mixins](#css-mixins)
5. [Working with JavaScript](#working-with-javascript)
6. [Working with fonts](#working-with-fonts)
7. [Working with images](#working-with-images)
8. [Working with other resources](#working-with-other-resources)
9. [Backend script](#backend-script)

## npm scripts

You can run the gulp scripts through npm. The build can also check code against the config (editorconfig) and validate HTML.

`npm run html` - runs the HTML validator; run it when there are HTML files in the __app__ folder.

`npm run code` - runs editorconfig-checker to verify conformance to the config file.

## Working with HTML

Thanks to the __gulp-file-include__ plugin, you can split an HTML file into templates, which should be stored in the __partials__ folder. It is convenient to split an HTML page into sections.

> To insert HTML fragments into the main file, use `@include('partials/filename.html')`

If you want to build a multi-page site, copy __index.html__, rename it as needed, and use it. When you run `gulp build`, you get minified single-line HTML for all HTML files.

## Working with CSS

The build uses the __sass__ preprocessor in __scss__ syntax.

Styles written in __components__ should be imported in __main.scss__. Styles from ___fonts__, ___settings__, ___vars__ and ___mixins__ are also imported in __main.scss__.

To add third-party CSS files (libraries), put them into the __vendor__ folder and import them in __vendor.scss__.

If you want to create your own mixin, do it in the __mixins__ folder and then import it in ___mixins.scss__.

If you want to use SCSS variables, be sure to remove __:root__.

> Use the `@import` directive to include CSS files.

The following files are created in the output folder __app/css__:

__main.css__ - page styles,
__vendor.css__ - styles of all libraries used in the project.

When you run `gulp build`, you get minified single-line CSS for all CSS files.

## CSS mixins

Ready-made SCSS mixins for various components will be added to the build; you can find them in the __scss/mixins__ folder.

## Working with JavaScript

`import` and `require` are not supported! Files are assembled automatically from the various folders.

It is best to split JS code into components: small js files, each containing its own functionality, isolated from the others. Put such files into the __components__ folder.

The __global.js__ file should hold base project data: variables and helper functions (such as stopping scroll, etc.).

Nothing is imported in __main.js__; it is recommended for the site's general logic.

To add third-party JS files (libraries), put them into the __vendor__ folder.

When you run `gulp build`, you get minified single-line JS for all JS files.

## Working with fonts

Since the author does not support IE11, the build supports only the __woff2__ format (which means the font-import mixin uses only that format). Upload __woff2__ files to the __resources/fonts__ folder, then call the `@font-face` mixin in ___fonts.scss__.

## Working with images

Put any images except __favicon__ into the __img__ folder. If you need an SVG sprite, put the SVGs for the sprite into the __img/svg__ folder; leave other SVG files in __img__. When you run `gulp build`, you get minified images in the output __img__ folder.

## Working with other resources

Any project assets that do not have a dedicated folder should be stored in the __resources__ folder: video files, php files (for example, a form-submission script), favicon, and so on.

## Backend script

There is currently a separate backend version of the build. It is similar to `gulp build` but skips minification of the output files, which is simply unnecessary when handing files over to a backend developer.

Keep in mind: if you are building a site for further backend integration, you should split your CSS files differently from the start. Instead of using the __components__ folder, create CSS files directly next to __main.scss__ and link them in the HTML file. Likewise, with this workflow, JS files from the __components__ folder should also be linked directly in the HTML file.
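The `@include` templating that __gulp-file-include__ performs can be sketched in a few lines. This is an illustrative simplification, not the plugin's actual implementation: a directive like `@include('partials/filename.html')` is replaced with the contents of the named partial.

```javascript
// Minimal sketch (illustrative, not gulp-file-include's real code):
// replace @include('...') directives with the named partial's contents,
// leaving unknown includes untouched.
function includePartials(html, partials) {
  return html.replace(/@include\('([^']+)'\)/g, (match, path) =>
    path in partials ? partials[path] : match
  );
}

const partials = { 'partials/header.html': '<header>MaxGraph</header>' };
const page = "@include('partials/header.html')\n<main></main>";
console.log(includePartials(page, partials));
// prints:
// <header>MaxGraph</header>
// <main></main>
```

The real plugin reads the partial files from disk during the build; the lookup table here stands in for that step.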
51.312925
252
0.729683
rus_Cyrl
0.977592
faab24a1bf7f57dbc1dd27b9b90107cbbc09bd0c
4,470
md
Markdown
README.md
hageduri/ArunachalHub
c8175119ab49871bed4fbb6d667cbddfeeb21d5a
[ "MIT" ]
81
2015-09-15T21:11:41.000Z
2022-02-27T06:04:37.000Z
README.md
hageduri/ArunachalHub
c8175119ab49871bed4fbb6d667cbddfeeb21d5a
[ "MIT" ]
3
2015-09-16T04:04:07.000Z
2020-10-24T12:09:47.000Z
README.md
hageduri/ArunachalHub
c8175119ab49871bed4fbb6d667cbddfeeb21d5a
[ "MIT" ]
15
2015-09-16T03:20:38.000Z
2022-02-27T06:07:35.000Z
# github-secret-keeper

A microservice written in node.js for enabling static, server-less applications to log in using GitHub. Unlike existing alternatives, it works for as many different client IDs as you'd like. This lets you run a single microservice that knows *all* your client secrets rather than one for each app or service.

## Why?

In order to let users log into our application with GitHub, we can use the GitHub OAuth system for web apps. For this we register our app with GitHub.com and they give us a *Client ID* and *Client Secret*. The ID is public; the secret is supposed to be... wait for it... a secret!

I like to build clientside JS apps as [completely static files](https://blog.andyet.com/2015/05/18/lazymorphic-apps-bringing-back-static-web). Which means we don't have a server somewhere where we can keep and use that client secret. We don't just want to put it in our static JS, because... well, then it's no longer secret.

If you read the [GitHub OAuth Docs](https://developer.github.com/v3/oauth/#web-application-flow) you notice that at step #2, we have to make a `POST` request to GitHub that includes the secret. So what do we do?

To solve this, we can run a simple little server (perhaps on a small free/cheap Heroku server) that does *just that part*: it knows your secret.

Additionally, it's common to register an app for each environment you need to test in. So, for a single app you may actually register two or three apps with GitHub: one while developing locally, one for staging, one for production, each with its own ID and secret and now its own secret-keeping service.

To deal with this, and to embrace the whole microservices idea, we could instead create a single service that knows about all client IDs and secrets. That's what this is. Then, whenever we want to add another app, we just add a config item in Heroku (or wherever) and restart the service.

It's intended to be a simple, consistent, minimalist JSON API that lets you pass a client ID and "code" as per the GitHub docs.

## Features

1. Provides a single CORS-enabled endpoint you can hit with AJAX that makes the GitHub request, including the secret, and returns the result.
2. Written in node.js using [hapi](http://hapijs.com/).
3. Descriptive, consistent JSON responses with proper status codes.
4. All but successful requests have `4xx` status codes, and JSON responses are generated with [boom](https://github.com/hapijs/boom) for predictable structure.

## How it works

1. Your client IDs are simply environment variables **whose value is the corresponding client secret** (this plays nicely with services like Heroku).
2. You make an ajax request that looks as follows (using jquery for brevity):

```js
$.getJSON('https://secret-keeper.yourdomain.com/YOUR_CLIENT_ID/YOUR_CODE')
  .done(function (data) {
    console.log('data', data)
  })
  .fail(function (data) {
    console.log('failed', data)
  })
```

So, the URL should be as follows:

```
https://yourhost.com/{ YOUR CLIENT ID }/{ YOUR CODE }
```

You can optionally also include `state`, `redirect_uri`, and `domain` as query parameters.

```
?state={{ YOUR STATE PARAM }}&redirect_uri={{ YOUR REDIRECT URI }}
```

If included, `state` and `redirect_uri` simply get passed through to GitHub. `domain` is `github.com` by default but can be changed via query param to make it possible to use this with GitHub Enterprise.

## Setting it up on Heroku

1. Make sure you have a heroku account and are logged in.
2. Click this button: [![Deploy](https://www.herokucdn.com/deploy/button.png)](https://heroku.com/deploy) and follow the instructions.
3. Enter your client IDs/secrets as config variables in Heroku:

![Heroku config variables screenshot](https://cldup.com/j8rcEzo5M6-1200x1200.png)

## Running it yourself

You can either set env variables in the command when you run the node server:

```
port=5000 YOUR_CLIENT_ID=CORRESPONDING_SECRET node server.js
```

Since that can be a bit messy, you can also just put your client IDs/secrets into `env.json`. Anything you put here will simply be added as environment variables.

```json
{
  "YOUR CLIENT ID": "YOUR CLIENT SECRET",
  "YOUR OTHER CLIENT ID": "YOUR OTHER CLIENT SECRET"
}
```

The only other thing that's configurable with environment variables is the PORT.

## credits

Created by [@HenrikJoreteg](http://twitter.com/henrikjoreteg). Inspired by [gatekeeper](https://github.com/prose/gatekeeper).

## license

[MIT](http://mit.joreteg.com/)
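The core lookup in "How it works" — the environment variable *name* is the client ID, its *value* is the secret — can be sketched as follows. This is illustrative only (not the service's actual `server.js`; the client ID and response shapes are made up), but it shows why the scheme plays nicely with Heroku-style config vars.

```javascript
// Sketch (not the real server code): resolve a client secret from
// environment variables, where the variable NAME is the client ID.
function lookupSecret(env, clientId) {
  const secret = env[clientId];
  if (!secret) {
    // Mirrors the boom-style 4xx JSON responses the README describes.
    return { statusCode: 400, error: 'Bad Request', message: 'unknown client ID' };
  }
  return { statusCode: 200, clientSecret: secret };
}

// Heroku-style config vars: { "<client id>": "<client secret>" }
const env = { myclientid123: 's3cr3t' };
console.log(lookupSecret(env, 'myclientid123').statusCode); // prints 200
console.log(lookupSecret(env, 'other-id').statusCode);      // prints 400
```

In the real service, the resolved secret is then attached to the `POST` request it forwards to GitHub; it is never returned to the browser.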
48.064516
326
0.755034
eng_Latn
0.994916
faab3d5ee8b457fd58c6f8f1a440ec654c10d0f7
10,973
md
Markdown
articles/service-fabric/service-fabric-quickstart-java-spring-boot.md
changeworld/azure-docs.nl-nl
bdaa9c94e3a164b14a5d4b985a519e8ae95248d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/service-fabric/service-fabric-quickstart-java-spring-boot.md
changeworld/azure-docs.nl-nl
bdaa9c94e3a164b14a5d4b985a519e8ae95248d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/service-fabric/service-fabric-quickstart-java-spring-boot.md
changeworld/azure-docs.nl-nl
bdaa9c94e3a164b14a5d4b985a519e8ae95248d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: 'Quickstart: Create a Spring Boot app in Azure Service Fabric'
description: In this quickstart, you deploy a Spring Boot application to Azure Service Fabric by using a sample Spring Boot application.
author: suhuruli
ms.topic: quickstart
ms.date: 01/29/2019
ms.author: suhuruli
ms.custom: mvc, devcenter, seo-java-august2019, seo-java-september2019
ms.openlocfilehash: eb96989b4a2731e78471b848d690b48352408d1c
ms.sourcegitcommit: 7c18afdaf67442eeb537ae3574670541e471463d
ms.translationtype: MT
ms.contentlocale: nl-NL
ms.lasthandoff: 02/11/2020
ms.locfileid: "77121478"
---

# <a name="quickstart-deploy-a-java-spring-boot-app-on-azure-service-fabric"></a>Quickstart: Deploy a Java Spring Boot app on Azure Service Fabric

In this quickstart, you deploy a Java Spring Boot application to Azure Service Fabric using familiar command-line tools on Linux or macOS. Azure Service Fabric is a distributed systems platform for deploying and managing microservices and containers.

## <a name="prerequisites"></a>Prerequisites

#### <a name="linuxtablinux"></a>[Linux](#tab/linux)

- [Java environment](https://docs.microsoft.com/azure/service-fabric/service-fabric-get-started-linux#set-up-java-development) and [Yeoman](https://docs.microsoft.com/azure/service-fabric/service-fabric-get-started-linux#set-up-yeoman-generators-for-containers-and-guest-executables)
- [Service Fabric SDK & Service Fabric command-line interface (CLI)](https://docs.microsoft.com/azure/service-fabric/service-fabric-get-started-linux#installation-methods)
- [Git](https://git-scm.com/downloads)

#### <a name="macostabmacos"></a>[MacOS](#tab/macos)

- [Java environment and Yeoman](https://docs.microsoft.com/azure/service-fabric/service-fabric-get-started-mac#create-your-application-on-your-mac-by-using-yeoman)
- [Service Fabric SDK & Service Fabric command-line interface (CLI)](https://docs.microsoft.com/azure/service-fabric/service-fabric-cli#cli-mac)
- [Git](https://git-scm.com/downloads)

---

## <a name="download-the-sample"></a>Download the sample

In a terminal window, run the following command to clone the [Spring Boot Getting Started](https://github.com/spring-guides/gs-spring-boot) sample app to your local machine.

```bash
git clone https://github.com/spring-guides/gs-spring-boot.git
```

## <a name="build-the-spring-boot-application"></a>Build the Spring Boot application

In the *gs-spring-boot/complete* directory, run the command below to build the application.

```bash
./gradlew build
```

## <a name="package-the-spring-boot-application"></a>Package the Spring Boot application

1. In the *gs-spring-boot* directory of your clone, run the `yo azuresfguest` command.

1. Enter the following details at each prompt.

   ![Yeoman entries for Spring Boot](./media/service-fabric-quickstart-java-spring-boot/yeoman-entries-spring-boot.png)

1. In the *SpringServiceFabric/SpringServiceFabric/SpringGettingStartedPkg/code* directory, create a file named *entryPoint.sh*. Add the following code to the *entryPoint.sh* file.

   ```bash
   #!/bin/bash
   BASEDIR=$(dirname $0)
   cd $BASEDIR
   java -jar gs-spring-boot-0.1.0.jar
   ```

1. Add the **Endpoints** resource to the *gs-spring-boot/SpringServiceFabric/SpringServiceFabric/SpringGettingStartedPkg/ServiceManifest.xml* file.

   ```xml
   <Resources>
     <Endpoints>
       <Endpoint Name="WebEndpoint" Protocol="http" Port="8080" />
     </Endpoints>
   </Resources>
   ```

   The *ServiceManifest.xml* file now looks like this:

   ```xml
   <?xml version="1.0" encoding="utf-8"?>
   <ServiceManifest Name="SpringGettingStartedPkg"
                    Version="1.0.0"
                    xmlns="http://schemas.microsoft.com/2011/01/fabric"
                    xmlns:xsd="https://www.w3.org/2001/XMLSchema"
                    xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance">
      <ServiceTypes>
         <StatelessServiceType ServiceTypeName="SpringGettingStartedType" UseImplicitHost="true">
         </StatelessServiceType>
      </ServiceTypes>
      <CodePackage Name="code" Version="1.0.0">
         <EntryPoint>
            <ExeHost>
               <Program>entryPoint.sh</Program>
               <Arguments></Arguments>
               <WorkingFolder>CodePackage</WorkingFolder>
            </ExeHost>
         </EntryPoint>
      </CodePackage>
      <Resources>
         <Endpoints>
            <Endpoint Name="WebEndpoint" Protocol="http" Port="8080" />
         </Endpoints>
      </Resources>
   </ServiceManifest>
   ```

At this point, you have created a Service Fabric application for the Spring Boot Getting Started sample that you can deploy to Service Fabric.

## <a name="run-the-application-locally"></a>Run the application locally

1. Start the local cluster on Ubuntu machines by running the following command:

   ```bash
   sudo /opt/microsoft/sdk/servicefabric/common/clustersetup/devclustersetup.sh
   ```

   If you are using a Mac, start the local cluster from the Docker image (this assumes the [prerequisites](https://docs.microsoft.com/azure/service-fabric/service-fabric-get-started-mac#create-a-local-container-and-set-up-service-fabric) for setting up the local cluster for Mac have been met).

   ```bash
   docker run --name sftestcluster -d -p 19080:19080 -p 19000:19000 -p 25100-25200:25100-25200 -p 8080:8080 mysfcluster
   ```

   Starting the local cluster can take some time. To verify that the cluster is fully up, open Service Fabric Explorer at `http://localhost:19080`. The local cluster is running when the five nodes are healthy.

   ![Service Fabric Explorer showing healthy nodes](./media/service-fabric-quickstart-java-spring-boot/service-fabric-explorer-healthy-nodes.png)

1. Open the *gs-spring-boot/SpringServiceFabric* directory.

1. Run the following command to connect to the local cluster.

   ```bash
   sfctl cluster select --endpoint http://localhost:19080
   ```

1. Run the *install.sh* script.

   ```bash
   ./install.sh
   ```

1. Open your favorite web browser and access the application at `http://localhost:8080`.

   ![Spring Boot Service Fabric sample](./media/service-fabric-quickstart-java-spring-boot/spring-boot-service-fabric-sample.png)

You can now access the Spring Boot application that was deployed to a Service Fabric cluster. For more information, see the [Spring Boot Getting Started sample](https://spring.io/guides/gs/spring-boot/) on the Spring website.

## <a name="scale-applications-and-services-in-a-cluster"></a>Scale applications and services in a cluster

Services can easily be scaled within a cluster to accommodate a change in load on the services. You scale a service by changing the number of instances running in the cluster. There are many ways to scale services; for example, you can use scripts or Service Fabric CLI (sfctl) commands. The following steps use Service Fabric Explorer.

Service Fabric Explorer runs in all Service Fabric clusters and is accessible from a browser by browsing to the cluster's HTTP management port (19080), for example `http://localhost:19080`.

To scale the web front-end service, do the following:

1. Open Service Fabric Explorer in your cluster - for example, `http://localhost:19080`.
1. Select the ellipsis (**...**) next to the **fabric:/SpringServiceFabric/SpringGettingStarted** node in the tree view and choose **Scale Service**.

   ![Service Fabric Explorer scale service sample](./media/service-fabric-quickstart-java-spring-boot/service-fabric-explorer-scale-sample.png)

   You can now choose to scale the number of instances of the service.

1. Change the number to **3** and select **Scale Service**.

   An alternative way to scale the service using the command line is as follows.

   ```bash
   # Connect to your local cluster
   sfctl cluster select --endpoint https://<ConnectionIPOrURL>:19080 --pem <path_to_certificate> --no-verify

   # Run Bash command to scale instance count for your service
   sfctl service update --service-id 'SpringServiceFabric~SpringGettingStarted' --instance-count 3 --stateless
   ```

1. Select the **fabric:/SpringServiceFabric/SpringGettingStarted** node in the tree view and expand the partition node (represented by a GUID).

   ![Service Fabric Explorer scale service completed](./media/service-fabric-quickstart-java-spring-boot/service-fabric-explorer-partition-node.png)

   The service now has three instances, and the tree view shows which nodes the instances run on. With this simple management task, you have doubled the resources available to handle user load for the front-end service. It's important to understand that you don't need multiple instances of a service for it to run reliably. If a service fails, Service Fabric starts a new service instance in the cluster.

## <a name="fail-over-services-in-a-cluster"></a>Fail over services in a cluster

To demonstrate service failover, you simulate a node restart by using Service Fabric Explorer. Make sure that only one instance of the service is running.

1. Open Service Fabric Explorer in your cluster - for example, `http://localhost:19080`.
1. Select the ellipsis (**...**) next to the node that is running the instance of your service and restart the node.

   ![Service Fabric Explorer restart node](./media/service-fabric-quickstart-java-spring-boot/service=fabric-explorer-restart=node.png)

1. The service instance moves to another node, and the application experiences no downtime.

   ![Service Fabric Explorer restart node failed over](./media/service-fabric-quickstart-java-spring-boot/service-fabric-explorer-service-moved.png)

## <a name="next-steps"></a>Next steps

In this quickstart, you learned how to:

* Deploy a Spring Boot application to Service Fabric
* Deploy the application to the local cluster
* Scale the application out across multiple nodes
* Fail over the service with no loss of availability

To learn more about working with Java apps in Service Fabric, continue with the tutorial for Java apps.

> [!div class="nextstepaction"]
> [Deploy a Java app](./service-fabric-tutorial-create-java-app.md)
53.26699
421
0.75385
nld_Latn
0.987461
faab9736b0ee853b1f99615b4de5db838594e83a
209
md
Markdown
content/blog/red-cabbage-seeds.md
jacksmedia/friday13
fedc804515057dee21eaa734d4e711574ac82b58
[ "RSA-MD" ]
null
null
null
content/blog/red-cabbage-seeds.md
jacksmedia/friday13
fedc804515057dee21eaa734d4e711574ac82b58
[ "RSA-MD" ]
null
null
null
content/blog/red-cabbage-seeds.md
jacksmedia/friday13
fedc804515057dee21eaa734d4e711574ac82b58
[ "RSA-MD" ]
null
null
null
---
templateKey: blog-post
featuredpost: false
featuredimage: ../assets/Red_Cabbage_Seeds.png
title: Red Cabbage Seeds
description: Seed
testfield: 970
---

![Red Cabbage Seeds](../assets/Red_Cabbage_Seeds.png)
23.222222
53
0.779904
eng_Latn
0.313241
faae65ec483142191c9f3d26b8a19c1ca6c237cd
226
md
Markdown
README.md
0xC000005/LogSpider
f4756e88339e56bdc67bfd311707b2a8e7ca47db
[ "MIT" ]
1
2019-08-30T03:26:23.000Z
2019-08-30T03:26:23.000Z
README.md
0xC000005/LogSpider
f4756e88339e56bdc67bfd311707b2a8e7ca47db
[ "MIT" ]
null
null
null
README.md
0xC000005/LogSpider
f4756e88339e56bdc67bfd311707b2a8e7ca47db
[ "MIT" ]
null
null
null
# LogSpider

A *lightweight* log analysis framework based on Python and SQLite ☁

## Features

- Python 2.7 + SQLite: a super-lightweight default combination
- Tiny memory footprint (single-threaded); handles everyday log analysis with ease, and custom multi-threaded/MapReduce processing is also supported
- Custom classification rules with fully automatic file splitting; the web frontend can construct file paths directly for faster access
- Extensible configuration: any information that can be extracted with a regular expression can be loaded into the database automatically through configuration
- Docker-Compose support: one-command `up`
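The "regex extraction to database through configuration" idea above can be sketched in a few lines. This is an illustrative simplification, not LogSpider's actual code; the pattern, table name, and log format are made up for the example.

```python
import re
import sqlite3

# Hypothetical extraction rule: each named group becomes a database column.
LOG_PATTERN = re.compile(r'(?P<ts>\S+) (?P<level>[A-Z]+) (?P<msg>.*)')

def ingest(lines, conn):
    """Parse each log line with the configured regex; insert lines that match."""
    conn.execute('CREATE TABLE IF NOT EXISTS logs (ts TEXT, level TEXT, msg TEXT)')
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m:
            conn.execute('INSERT INTO logs VALUES (:ts, :level, :msg)', m.groupdict())
    conn.commit()

conn = sqlite3.connect(':memory:')
ingest(['2020-01-01T00:00:00 ERROR disk full', 'not a log line'], conn)
print(conn.execute('SELECT level, msg FROM logs').fetchall())
# → [('ERROR', 'disk full')]
```

A framework like this only needs the regex (and the implied table schema) supplied as configuration; everything else is generic, which is what keeps the memory footprint small.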
25.111111
47
0.79646
yue_Hant
0.990183
faae6fe60f7d6d46447f10e2bb7eda88f2154033
56
md
Markdown
README.md
tigermarques/telegram-monitoring-bot
e73a291e30b9efccc7495f8d6c4fcb27cb3998b3
[ "MIT" ]
null
null
null
README.md
tigermarques/telegram-monitoring-bot
e73a291e30b9efccc7495f8d6c4fcb27cb3998b3
[ "MIT" ]
12
2020-07-16T10:09:53.000Z
2022-02-12T09:02:39.000Z
README.md
tigermarques/telegram-monitoring-bot
e73a291e30b9efccc7495f8d6c4fcb27cb3998b3
[ "MIT" ]
null
null
null
# telegram-monitoring-bot

Bot to monitor infrastructure
18.666667
29
0.839286
eng_Latn
0.422852
faae91dfdca3747cc7d64e11fe32e99ef67d21e0
2,814
md
Markdown
content/earn/cross-chain.md
MinterTeam/minter-network-web
8c76fcb334b4d6f2f6e8c3c1b658be31e920c554
[ "MIT" ]
4
2020-04-10T04:50:55.000Z
2020-06-30T09:07:46.000Z
content/earn/cross-chain.md
MinterTeam/minter-network-web
8c76fcb334b4d6f2f6e8c3c1b658be31e920c554
[ "MIT" ]
3
2020-04-29T15:27:51.000Z
2022-01-14T11:47:46.000Z
content/earn/cross-chain.md
MinterTeam/minter-network-web
8c76fcb334b4d6f2f6e8c3c1b658be31e920c554
[ "MIT" ]
16
2020-04-27T19:56:12.000Z
2022-02-10T23:25:16.000Z
---
order: 17
title: How to Make a Cross-Chain Transfer
---

# How to Make a Cross-Chain Transfer

A cross-chain transfer means moving the same token between different blockchains. Such transfers are processed by the [Minter Hub](/earn/minter-hub) bridge in a completely decentralized way.

## How to Transfer Tokens from Ethereum to Minter (Deposit)

1. Generate a new address using [Minter Console](https://console.minter.network/) or sign in with the seed phrase if you already have one
2. Go to the [Deposit & Withdraw](https://console.minter.network/hub) section
3. In the **Deposit** form, log in with the wallet where you have tokens
4. Specify the receiver address (the address with which you logged in to Minter Console is set by default)
5. Choose the token available for deposit
6. Enter the number of tokens to be transferred. For simplicity, you may click on **USE MAX**
7. Now you need to unlock the amount you've specified for a smart contract (if you want to make all of your tokens readily available, tick the **Infinite unlock** box). Hit **Unlock** and confirm the transaction using the Ethereum wallet you've connected (you'll be charged a fee in ETH)
8. Once the Approve transaction has been confirmed by the Ethereum network, you'll see the amount of unlocked tokens ready to be deposited in the Deposit form. The only thing left to do is press **Send**, and your tokens will move from Ethereum to Minter. You'll be charged an ETH fee for this transaction as well

After the transaction has been validated by the network and the bridge, its status will change to Success. Now the tokens are spendable from your Mx address.

## How to Transfer Tokens from Minter to Ethereum (Withdrawal)

1. Enter an Ethereum address to which you want to receive tokens
2. Select a token for cross-chain transfer
3. Specify the necessary amount
4. Click **Withdraw**

**Beware!**

- Withdraw to your personal address only;
- Do not withdraw to smart contracts, accounts on exchanges, or addresses that you don't have direct access to (seed phrase);
- Always take into account Ethereum and Minter Hub fees;
- Minter Hub is [open-source](https://github.com/MinterTeam/minter-hub), meaning you can examine its code if needed.

## Which Tokens Are Available to Be Transferred Cross-Chain?

You may find the up-to-date list of tokens with the enabled cross-chain transfer feature in [Minter Console](https://console.minter.network/hub) (the **DEPOSIT & WITHDRAW** menu).

![Cross-chain tokens](/img/docs/cross.png)

More on [how support is added for new tokens](/earn/new-tokens).

*This material serves educational purposes only. The information contained herein does not constitute investment, financial, legal, or tax advice, and it is not an offer or solicitation to purchase or sell any financial instrument.*
61.173913
313
0.772566
eng_Latn
0.99844
faaecc771f4d184eb47dd5d08b78a42d8665bd44
128
md
Markdown
README.md
kpatel122/Arduino-Software-Serial-and-Servo
2474f8f440d46fa9f348b516d61924718af21173
[ "MIT" ]
null
null
null
README.md
kpatel122/Arduino-Software-Serial-and-Servo
2474f8f440d46fa9f348b516d61924718af21173
[ "MIT" ]
null
null
null
README.md
kpatel122/Arduino-Software-Serial-and-Servo
2474f8f440d46fa9f348b516d61924718af21173
[ "MIT" ]
null
null
null
# Arduino-Software-Serial-and-Servo

Arduino proof-of-concept project of using Software Serial and Servo motors at the same time
42.666667
91
0.820313
eng_Latn
0.977776
faaf378b710cc12ee76455e8238afd58247a8cb1
16,863
md
Markdown
_posts/2020-06-18-检测和防止深度造假的新方法 - iYouPort.md
NodeBE4/oped2
1c44827a3b1e06164b390ff9abfae728b744dd4f
[ "MIT" ]
1
2020-09-16T02:05:30.000Z
2020-09-16T02:05:30.000Z
_posts/2020-06-18-检测和防止深度造假的新方法 - iYouPort.md
NodeBE4/oped2
1c44827a3b1e06164b390ff9abfae728b744dd4f
[ "MIT" ]
null
null
null
_posts/2020-06-18-检测和防止深度造假的新方法 - iYouPort.md
NodeBE4/oped2
1c44827a3b1e06164b390ff9abfae728b744dd4f
[ "MIT" ]
1
2020-11-04T04:49:44.000Z
2020-11-04T04:49:44.000Z
---
layout: post
title: "New Ways to Detect and Prevent Deepfakes - iYouPort"
date: 2020-06-18T16:02:03+00:00
author: iYouPort
from: https://www.iyouport.org/%e6%a3%80%e6%b5%8b%e5%92%8c%e9%98%b2%e6%ad%a2%e6%b7%b1%e5%ba%a6%e9%80%a0%e5%81%87%e7%9a%84%e6%96%b0%e6%96%b9%e6%b3%95/
tags: [ iYouPort ]
categories: [ iYouPort ]
---

# New Ways to Detect and Prevent Deepfakes

By McCaffrey · Categories: [Psychological and information warfare](https://www.iyouport.org/category/%e5%bf%83%e7%90%86%e6%88%98%e5%92%8c%e4%bf%a1%e6%81%af%e6%88%98/), [Technology](https://www.iyouport.org/category/%e6%8a%80%e6%9c%af/) · Published June 19, 2020

- *Faking and detecting fakes is a cat-and-mouse game: detectors must keep pace with advances in forgery technology, and may even need to get ahead of it.*

![](https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/0-5.jpg?resize=750%2C497&ssl=1)

Deepfakes are hard to recognize without training, because they can be extremely realistic.

Whether used as a weapon of revenge, a tool for manipulating financial markets, or a means of destabilizing international relations, these fake videos can depict the target doing and saying things they never did or said. The notion that "seeing is believing" has completely collapsed.

- "[When malicious actors master the technology: just how frightening is the future of information warfare?](https://www.iyouport.org/%e5%bd%93%e6%81%b6%e6%84%8f%e8%80%85%e6%8e%8c%e6%8f%a1%e6%8a%80%e6%9c%af%ef%bc%9a%e4%bf%a1%e6%81%af%e6%88%98%e7%9a%84%e6%9c%aa%e6%9d%a5%e7%a9%b6%e7%ab%9f%e6%9c%89%e5%a4%9a%e5%8f%af%e6%80%95%ef%bc%9f/)"

Deepfakes are made by showing a computer algorithm many images of a person and then having it use what it has seen to generate new, fictitious facial images; the person's voice is synthesized as well, so on the surface it sounds as if they are saying something entirely different.

Video: https://www.youtube.com/embed/cQ54GDm1eL0

Some early research could detect deepfake videos that [did not contain a person's normal rate of blinking](https://theconversation.com/detecting-deepfake-videos-in-the-blink-of-an-eye-101072), but the latest generation of deepfake techniques can already evade these detection processes, so the research must keep evolving.

- **Recommended: the text and video roundup of this panel discussion**, "[How can open-source intelligence and forensics counter 'deepfakes'?
</a> 》 </li> </ul> <p class="graf graf--p"> 现在,研究可以通过仔细查看特定帧的像素来识别视频是否被操纵过。还制定了一项积极措施来保护个人免遭深度造假的侵害。 </p> <h3 class="graf graf--p"> <strong class="markup--strong markup--p-strong"> 发现缺陷 </strong> </h3> <p class="graf graf--p"> 在最近的 <a class="markup--anchor markup--p-anchor" data-href="https://arxiv.org/abs/1811.00656" href="https://arxiv.org/abs/1811.00656" rel="noopener noreferrer" target="_blank"> 两篇 </a> 研究 <a class="markup--anchor markup--p-anchor" data-href="https://arxiv.org/abs/1811.00661" href="https://arxiv.org/abs/1811.00661" rel="noopener noreferrer" target="_blank"> 论文 </a> 中,详细描述了如何检测带有伪造者无法轻松修复的深度造假缺陷的方法。 </p> <p class="graf graf--p"> 当深度造假视频合成算法生成新的面部表情时,新的图像并不总是能令人头部的确切位置与光照条件或相机的距离相匹配。为了使假脸融入周围环境,必须对它们进行几何调整 —— 旋转、调整大小或以其他方式改变。 </p> <p class="graf graf--p"> 该过程在结果图像中留下了篡改的痕迹。 </p> <p class="graf graf--p"> 您可能已经注意到了这点。这些改动可以使照片看起来明显失真,例如模糊的边框和人为的皮肤抛光等。 </p> <p class="graf graf--p"> 更细微的改变中 <a class="markup--anchor markup--p-anchor" data-href="https://arxiv.org/abs/1811.00656" href="https://arxiv.org/abs/1811.00656" rel="noopener noreferrer" target="_blank"> 仍然有证据可循 </a> ,而且研究人员训练了一种算法来检测它,即使人眼看不到差异。 </p> <p class="graf graf--p"> <iframe allowfullscreen="allowfullscreen" height="421" src="//www.youtube.com/embed/V5Nz2mu5EzA" width="750"> </iframe> </p> <p class="graf graf--p"> <p> <iframe allowfullscreen="allowfullscreen" height="421" src="//www.youtube.com/embed/pK8WU4fueQI" width="750"> </iframe> </p> <p class="graf graf--p"> 如果深度造假视频中的人眼睛没有直接看摄像机,这些造假痕迹可能会有所不同。 </p> <p class="graf graf--p"> 捕捉到真实人物的视频显示了他们的面部在三个维度上移动,但是深度造假算法尚无法以3D方式制作面部。 </p> <p class="graf graf--p"> 取而代之的是,它们会生成人脸的规则二维图像,然后尝试旋转该图像、调整其大小并使其变形以适合该人的注视方向。 </p> <p class="graf graf--p"> 这些造假做得还不够好,这为揭露它们提供了机会。 </p> <p class="graf graf--p"> 研究人员设计了 <a class="markup--anchor markup--p-anchor" data-href="https://arxiv.org/abs/1811.00661" href="https://arxiv.org/abs/1811.00661" rel="noopener noreferrer" target="_blank"> 一种算法 </a> ,用于计算人的鼻子在图像中倾斜的方式。它还可以测量脸部轮廓,计算出头部的角度方向。 </p> <p 
class="graf graf--p"> 在真实人脸的真实视频中,所有这些都应该以相当可预测的规则排列。但是,在伪造品中,它们常常错位。 </p> <figure aria-describedby="caption-attachment-12240" class="wp-caption aligncenter" id="attachment_12240" style="width: 1244px"> <img alt="" class="size-full wp-image-12240 jetpack-lazy-image" data-lazy-sizes="(max-width: 1100px) 100vw, 1100px" data-lazy-src="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1100%2C594&amp;is-pending-load=1#038;ssl=1" data-lazy-srcset="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?w=1244&amp;ssl=1 1244w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=300%2C162&amp;ssl=1 300w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1024%2C553&amp;ssl=1 1024w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=768%2C415&amp;ssl=1 768w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1100%2C594&amp;ssl=1 1100w" data-recalc-dims="1" height="594" src="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1100%2C594&amp;ssl=1" srcset="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" width="1100"/> <noscript> <img alt="" class="size-full wp-image-12240" data-recalc-dims="1" height="594" sizes="(max-width: 1100px) 100vw, 1100px" src="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1100%2C594&amp;ssl=1" srcset="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?w=1244&amp;ssl=1 1244w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=300%2C162&amp;ssl=1 300w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1024%2C553&amp;ssl=1 1024w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=768%2C415&amp;ssl=1 768w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/1-12.png?resize=1100%2C594&amp;ssl=1 1100w" 
width="1100"/> </noscript> <figcaption class="wp-caption-text" id="caption-attachment-12240"> When a computer puts Nicolas Cage’s face on Elon Musk’s head, it may not line up the face and the head correctly. Siwei Lyu, CC BY-ND </figcaption> </figure> <h3 class="graf graf--p"> <strong class="markup--strong markup--p-strong"> 防御深度造假 </strong> </h3> <p class="graf graf--p"> 有效地检测伪造品的科学实际上是一场军备竞赛 —— 伪造者会变得更加擅长虚构,因此研究始终必须努力保持跟进的步伐,甚至需要抢先一步。 </p> <p class="graf graf--p"> 如果有一种方法可以影响创建深度造假作品的算法,使它们的工作变得更糟,那么对于检测伪造品就会变得更加有利。研究人员最近找到了一种这样的方法。 </p> <figure aria-describedby="caption-attachment-12241" class="wp-caption aligncenter" id="attachment_12241" style="width: 1238px"> <img alt="" class="size-full wp-image-12241 jetpack-lazy-image" data-lazy-sizes="(max-width: 1100px) 100vw, 1100px" data-lazy-src="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1100%2C370&amp;is-pending-load=1#038;ssl=1" data-lazy-srcset="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?w=1238&amp;ssl=1 1238w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=300%2C101&amp;ssl=1 300w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1024%2C344&amp;ssl=1 1024w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=768%2C258&amp;ssl=1 768w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1100%2C370&amp;ssl=1 1100w" data-recalc-dims="1" height="370" src="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1100%2C370&amp;ssl=1" srcset="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" width="1100"/> <noscript> <img alt="" class="size-full wp-image-12241" data-recalc-dims="1" height="370" sizes="(max-width: 1100px) 100vw, 1100px" src="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1100%2C370&amp;ssl=1" 
srcset="https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?w=1238&amp;ssl=1 1238w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=300%2C101&amp;ssl=1 300w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1024%2C344&amp;ssl=1 1024w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=768%2C258&amp;ssl=1 768w, https://i1.wp.com/www.iyouport.org/wp-content/uploads/2020/01/2-13.png?resize=1100%2C370&amp;ssl=1 1100w" width="1100"/> </noscript> <figcaption class="wp-caption-text" id="caption-attachment-12241"> At left, a face is easily detected in an image before our processing. In the middle, we’ve added perturbations that cause an algorithm to detect other faces, but not the real one. At right are the changes we added to the image, enhanced 30 times to be visible. Siwei Lyu, CC BY-ND </figcaption> </figure> <p class="graf graf--p"> 人脸图像库由算法组装而成,该算法处理成千上万的在线照片和视频,并使用机器学习来检测和提取人脸。计算机可能会查看课堂的照片,并检测所有学生和老师的脸,然后将这些脸添加到库中。 </p> <p class="graf graf--p"> 当库中有很多高质量的人脸图像时,生成的深度造假作品将很有可能成功欺骗观众。 </p> <p class="graf graf--p"> 研究人员找到了 <a class="markup--anchor markup--p-anchor" data-href="https://arxiv.org/abs/1906.09288" href="https://arxiv.org/abs/1906.09288" rel="noopener noreferrer" target="_blank"> 一种方法 </a> ,可以将特殊设计的噪声添加到人眼看不到的数字照片或视频中,但是可以使人脸检测算法被欺骗。 </p> <p class="graf graf--p"> 它可以隐藏人脸检测器用来定位人脸的像素模式,并且创造出诱饵,暗示没有人的脸在这里。 </p> <p class="graf graf--p"> <iframe allowfullscreen="allowfullscreen" height="364" src="//www.youtube.com/embed/0YeeFnNBzCs" width="650"> </iframe> </p> <p class="graf graf--p"> 由于较少的真实面孔和较多的非面孔污染了训练数据,因此深度造假算法在生成伪造面孔时的效果会更糟。这不仅减慢了深度造假的制作过程,而且使生成的伪造品具有更多的缺陷,更易于检测。 </p> <p class="graf graf--p"> 在开发此算法时,研究人员希望能够将其应用于某人上传到社交媒体或任何在线站点的任何图像。 </p> <p class="graf graf--p"> 在上传过程中,也许会询问发布者:“您是否希望保护此视频或图像中的面部,以防被伪造?” 如果用户选择是,则该算法可以添加数字噪声在里面,不影响人们在线看到这些面孔,但可以有效地隐藏它们令那些可能试图检测和模仿它们的算法不再可靠。⚪️ </p> <p class="graf graf--p"> <a class="markup--anchor markup--p-anchor" 
data-href="https://theconversation.com/detecting-deepfakes-by-looking-closely-reveals-a-way-to-protect-against-them-119218" href="https://theconversation.com/detecting-deepfakes-by-looking-closely-reveals-a-way-to-protect-against-them-119218" rel="noopener noreferrer" target="_blank"> Detecting deepfakes by looking closely reveals a way to protect against them </a> </p> <div id="atatags-1611829871-5f426d5b0f174"> </div> <div class="sharedaddy sd-sharing-enabled"> <div class="robots-nocontent sd-block sd-social sd-social-icon sd-sharing"> <h3 class="sd-title"> 共享此文章: </h3> <div class="sd-content"> <ul> <li class="share-twitter"> <a class="share-twitter sd-button share-icon no-text" data-shared="sharing-twitter-12238" href="https://www.iyouport.org/%e6%a3%80%e6%b5%8b%e5%92%8c%e9%98%b2%e6%ad%a2%e6%b7%b1%e5%ba%a6%e9%80%a0%e5%81%87%e7%9a%84%e6%96%b0%e6%96%b9%e6%b3%95/?share=twitter" rel="nofollow noopener noreferrer" target="_blank" title="点击以在 Twitter 上共享"> <span> </span> <span class="sharing-screen-reader-text"> 点击以在 Twitter 上共享(在新窗口中打开) </span> </a> </li> <li class="share-facebook"> <a class="share-facebook sd-button share-icon no-text" data-shared="sharing-facebook-12238" href="https://www.iyouport.org/%e6%a3%80%e6%b5%8b%e5%92%8c%e9%98%b2%e6%ad%a2%e6%b7%b1%e5%ba%a6%e9%80%a0%e5%81%87%e7%9a%84%e6%96%b0%e6%96%b9%e6%b3%95/?share=facebook" rel="nofollow noopener noreferrer" target="_blank" title="点击以在 Facebook 上共享"> <span> </span> <span class="sharing-screen-reader-text"> 点击以在 Facebook 上共享(在新窗口中打开) </span> </a> </li> <li class="share-end"> </li> </ul> </div> </div> </div> <div class="sharedaddy sd-block sd-like jetpack-likes-widget-wrapper jetpack-likes-widget-unloaded" data-name="like-post-frame-161182987-12238-5f426d5b0fb25" data-src="https://widgets.wp.com/likes/#blog_id=161182987&amp;post_id=12238&amp;origin=www.iyouport.org&amp;obj_id=161182987-12238-5f426d5b0fb25" id="like-post-wrapper-161182987-12238-5f426d5b0fb25"> <h3 class="sd-title"> 赞过: </h3> <div 
class="likes-widget-placeholder post-likes-widget-placeholder" style="height: 55px;"> <span class="button"> <span> 赞 </span> </span> <span class="loading"> 正在加载…… </span> </div> <span class="sd-text-color"> </span> <a class="sd-link-color"> </a> </div> <div class="jp-relatedposts" id="jp-relatedposts"> <h3 class="jp-relatedposts-headline"> <em> 相关 </em> </h3> </div> </p> </div> <div class="entry-footer"> <ul class="post-tags light-text"> <li> Tagged </li> <li> <a href="https://www.iyouport.org/tag/deepfake/" rel="tag"> deepfake </a> </li> <li> <a href="https://www.iyouport.org/tag/detect/" rel="tag"> detect </a> </li> <li> <a href="https://www.iyouport.org/tag/technology/" rel="tag"> Technology </a> </li> </ul> </div> <div class="entry-author-wrapper"> <div class="site-posted-on"> <strong> Published </strong> <time class="entry-date published" datetime="2020-06-19T00:02:03+08:00"> 2020年6月19日 </time> <time class="updated" datetime="2020-01-15T23:32:15+08:00"> 2020年1月15日 </time> </div> </div> </article>
51.1
1,031
0.683805
yue_Hant
0.274636
fab0217a3c944c515e2355f0dab4ced9206e7d81
9,502
md
Markdown
docs/framework/data/adonet/ef/language-reference/standard-query-operators-in-linq-to-entities-queries.md
BaruaSourav/docs
c288ed777de6b091f5e074d3488f7934683f3eb5
[ "CC-BY-4.0", "MIT" ]
3,294
2016-10-30T05:27:20.000Z
2022-03-31T15:59:30.000Z
docs/framework/data/adonet/ef/language-reference/standard-query-operators-in-linq-to-entities-queries.md
BaruaSourav/docs
c288ed777de6b091f5e074d3488f7934683f3eb5
[ "CC-BY-4.0", "MIT" ]
16,739
2016-10-28T19:41:29.000Z
2022-03-31T22:38:48.000Z
docs/framework/data/adonet/ef/language-reference/standard-query-operators-in-linq-to-entities-queries.md
BaruaSourav/docs
c288ed777de6b091f5e074d3488f7934683f3eb5
[ "CC-BY-4.0", "MIT" ]
6,701
2016-10-29T20:56:11.000Z
2022-03-31T12:32:26.000Z
---
description: "Learn more about: Standard Query Operators in LINQ to Entities Queries"
title: "Standard Query Operators in LINQ to Entities Queries"
ms.date: "08/21/2018"
ms.assetid: 7fa55a9b-6219-473d-b1e5-2884a32dcdff
---
# Standard Query Operators in LINQ to Entities Queries

In a query, you specify the information that you want to retrieve from the data source. A query can also specify how that information should be sorted, grouped, and shaped before it is returned. LINQ provides a set of standard query methods that you can use in a query. Most of these methods operate on sequences; in this context, a sequence is an object whose type implements the <xref:System.Collections.Generic.IEnumerable%601> interface or the <xref:System.Linq.IQueryable%601> interface. The standard query operators query functionality includes filtering, projection, aggregation, sorting, grouping, paging, and more. Some of the more frequently used standard query operators have dedicated keyword syntax so that they can be called by using query expression syntax. A query expression is a different, more readable way to express a query than the method-based equivalent. Query expression clauses are translated into calls to the query methods at compile time. For a list of standard query operators that have equivalent query expression clauses, see [Standard Query Operators Overview](/previous-versions/visualstudio/visual-studio-2013/bb397896(v=vs.120)).

Not all of the standard query operators are supported in LINQ to Entities queries. For more information, see [Supported and Unsupported LINQ Methods (LINQ to Entities)](supported-and-unsupported-linq-methods-linq-to-entities.md). This topic provides information about the standard query operators that is specific to LINQ to Entities. For more information about known issues in LINQ to Entities queries, see [Known Issues and Considerations in LINQ to Entities](known-issues-and-considerations-in-linq-to-entities.md).
## Projection and Filtering Methods

*Projection* refers to transforming the elements of a result set into a desired form. For example, you can project a subset of the properties you need from each object in the result set, you can project a property and perform a mathematical calculation on it, or you can project the entire object from the result set. The projection methods are `Select` and `SelectMany`.

*Filtering* refers to the operation of restricting the result set to contain only those elements that match a specified condition. The filtering method is `Where`.

Most overloads of the projection and filtering methods are supported in LINQ to Entities, with the exception of those that accept a positional argument.

## Join Methods

Joining is an important operation in queries that target data sources that have no navigable relationships to each other. A join of two data sources is the association of objects in one data source with objects in the other data source that share a common attribute or property. The join methods are `Join` and `GroupJoin`.

Most overloads of the join methods are supported, with the exception of those that use a <xref:System.Collections.Generic.IEqualityComparer%601>. This is because the comparer cannot be translated to the data source.

## Set Methods

Set operations in LINQ are query operations that base their result sets on the presence or absence of equivalent elements within the same or in another collection (or set). The set methods are `All`, `Any`, `Concat`, `Contains`, `DefaultIfEmpty`, `Distinct`, `EqualAll`, `Except`, `Intersect`, and `Union`.

Most overloads of the set methods are supported in LINQ to Entities, though there are some differences in behavior compared to LINQ to Objects. However, set methods that use an <xref:System.Collections.Generic.IEqualityComparer%601> are not supported because the comparer cannot be translated to the data source.
## Ordering Methods

Ordering, or sorting, refers to ordering the elements of a result set based on one or more attributes. By specifying more than one sort criterion, you can break ties within a group.

Most overloads of the ordering methods are supported, with the exception of those that use an <xref:System.Collections.Generic.IComparer%601>. This is because the comparer cannot be translated to the data source. The ordering methods are `OrderBy`, `OrderByDescending`, `ThenBy`, `ThenByDescending`, and `Reverse`.

Because the query is executed on the data source, the ordering behavior may differ from queries executed in the CLR. This is because ordering options, such as case ordering, kanji ordering, and null ordering, can be set in the data source. Depending on the data source, these ordering options might produce different results than in the CLR.

If you specify the same key selector in more than one ordering operation, a duplicate ordering will be produced. This is not valid and an exception will be thrown.

## Grouping Methods

Grouping refers to placing data into groups so that the elements in each group share a common attribute. The grouping method is `GroupBy`.

Most overloads of the grouping methods are supported, with the exception of those that use an <xref:System.Collections.Generic.IEqualityComparer%601>. This is because the comparer cannot be translated to the data source.

The grouping methods are mapped to the data source using a distinct sub-query for the key selector. The key selector comparison sub-query is executed by using the semantics of the data source, including issues related to comparing `null` values.

## Aggregate Methods

An aggregation operation computes a single value from a collection of values. For example, calculating the average daily temperature from a month's worth of daily temperature values is an aggregation operation. The aggregate methods are `Aggregate`, `Average`, `Count`, `LongCount`, `Max`, `Min`, and `Sum`.
Most overloads of the aggregate methods are supported. For behavior related to null values, the aggregate methods use the data source semantics. The behavior of the aggregation methods when null values are involved might be different, depending on which back-end data source is being used. Aggregate method behavior using the semantics of the data source might also be different from what is expected from CLR methods. For example, the default behavior for the `Sum` method on SQL Server is to ignore any null values instead of throwing an exception.

Any exceptions that result from aggregation, such as an overflow from the `Sum` function, are thrown as data source exceptions or Entity Framework exceptions during the materialization of the query results.

For those methods that involve a calculation over a sequence, such as `Sum` or `Average`, the actual calculation is performed on the server. As a result, type conversions and loss of precision might occur on the server, and the results might differ from what is expected using CLR semantics.
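Because these null-handling rules come from the data source rather than the CLR, they can be observed with any SQL back end. The sketch below uses Python with SQLite purely as a stand-in relational back end (this topic's code is C#/EF; SQLite is assumed here to follow the same standard SQL aggregate semantics of skipping nulls):

```python
import sqlite3

# SQLite as a stand-in back end: SQL aggregates skip NULLs, and an
# all-NULL input makes SUM/AVG return NULL rather than throw.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (v INTEGER)")
conn.executemany("INSERT INTO t (v) VALUES (?)", [(10,), (None,), (20,)])

total, average = conn.execute("SELECT SUM(v), AVG(v) FROM t").fetchone()
print(total, average)   # NULL is ignored: 30 15.0

all_null = conn.execute("SELECT SUM(v) FROM t WHERE v IS NULL").fetchone()[0]
print(all_null)         # all-NULL input: SUM returns NULL (None in Python)
```

This matches the table below: mixed input sums only the non-null values, while an all-null input yields null instead of an exception.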
The default behavior of the aggregate methods for null/non-null values is shown in the following table:

|Method|No data|All null values|Some null values|No null values|
|------------|-------------|---------------------|----------------------|--------------------|
|`Average`|Returns null.|Returns null.|Returns the average of the non-null values in a sequence.|Computes the average of a sequence of numeric values.|
|`Count`|Returns 0.|Returns the number of null values in the sequence.|Returns the number of null and non-null values in the sequence.|Returns the number of elements in the sequence.|
|`Max`|Returns null.|Returns null.|Returns the maximum non-null value in a sequence.|Returns the maximum value in a sequence.|
|`Min`|Returns null.|Returns null.|Returns the minimum non-null value in a sequence.|Returns the minimum value in a sequence.|
|`Sum`|Returns null.|Returns null.|Returns the sum of the non-null values in a sequence.|Computes the sum of a sequence of numeric values.|

## Type Methods

The two LINQ methods that deal with type conversion and testing are both supported in the context of the Entity Framework. This means that the only supported types are types that map to the appropriate Entity Framework type. For a list of these types, see [Conceptual Model Types (CSDL)](/ef/ef6/modeling/designer/advanced/edmx/csdl-spec#conceptual-model-types-csdl). The type methods are `Convert` and `OfType`. `OfType` is supported for entity types. `Convert` is supported for conceptual model primitive types. The C# `is` and `as` methods are also supported.

## Paging Methods

Paging operations return a single element or multiple elements from a sequence. The supported paging methods are `First`, `FirstOrDefault`, `Single`, `SingleOrDefault`, `Skip`, and `Take`. A number of paging methods are not supported, due either to the inability to map functions to the data source or to the lack of implicit ordering of sets on the data source.
Methods that return a default value are restricted to conceptual model primitive types and reference types with null defaults. Paging methods that are executed on an empty sequence will return null.

## See also

- [Supported and Unsupported LINQ Methods (LINQ to Entities)](supported-and-unsupported-linq-methods-linq-to-entities.md)
- [Standard Query Operators Overview](/previous-versions/visualstudio/visual-studio-2013/bb397896(v=vs.120))
109.218391
1,167
0.777521
eng_Latn
0.99876
fab076f46f9585157526fa46199bce8979ba05ce
122
md
Markdown
data/test/readme.md
pawanhv/DRW2CAD3D
48787f8442d342a3e0e49416fc1af5c379e5ed12
[ "MIT" ]
null
null
null
data/test/readme.md
pawanhv/DRW2CAD3D
48787f8442d342a3e0e49416fc1af5c379e5ed12
[ "MIT" ]
null
null
null
data/test/readme.md
pawanhv/DRW2CAD3D
48787f8442d342a3e0e49416fc1af5c379e5ed12
[ "MIT" ]
null
null
null
# Test folder

Put the images you want to test the model on here. Output images will be generated with the prefix `output--`.
20.333333
54
0.762295
eng_Latn
0.996231
fab08de983e6b0cb495ed5e305be3e25f732a080
16,095
md
Markdown
docs/examples.md
orest-d/liquer
7a5b5a69cf673b4a849dd2da3050ccd75081e454
[ "MIT" ]
3
2019-12-10T10:22:36.000Z
2019-12-12T16:36:11.000Z
docs/examples.md
orest-d/liquer
7a5b5a69cf673b4a849dd2da3050ccd75081e454
[ "MIT" ]
null
null
null
docs/examples.md
orest-d/liquer
7a5b5a69cf673b4a849dd2da3050ccd75081e454
[ "MIT" ]
2
2019-11-14T16:26:52.000Z
2021-07-26T04:53:54.000Z
# HDX disaggregation wizard

LiQuer is a small server-side framework that can be quite helpful when building data-oriented web applications. One such example is the HDX disaggregation wizard. It is a tool solving a simple task: splitting (disaggregating) a single data sheet (csv or xlsx) into multiple sheets. The sheet is split by the values in a specified column (or multiple columns).

Since this is a quite generic task (related to *group by*), this functionality is built into ``liquer.ext.lq_pandas``. The core of this feature is the ``eq`` command, which filters a dataframe by specific values in a column, e.g. ``eq-a-123`` keeps in the dataframe only the rows where column ``a`` is 123. The ``eq`` (equal) command accepts multiple column-value pairs, e.g. ``eq-a-123-b-234``. In HDX the convention is to use the first row for tags. To support this convention, the ``teq`` command (tag equal) always keeps the first row of the dataframe. The disaggregation service supports both tagged and untagged data, using either ``eq`` or ``teq`` for filtering, depending on the user input.

The complete flow is simple:

* fetch data (command ``df_from``)
* find unique values in a column (or multiple columns) and use them to create a list (table) of queries (command ``split_df``)
* the queries use ``eq`` (or ``teq``) to filter the dataframe by value(s).

So, use a query like ``df_from-URL/split_df-COLUMN`` and you will get a table with queries like ``df_from-URL/eq-COLUMN-VALUE1``, ``df_from-URL/eq-COLUMN-VALUE2``.

A little detail regarding the split function: there are actually four versions of it, depending on whether it is used on a tagged or untagged document and whether it performs *quick* (or *query*) splits or full splits. The quick versions only provide the raw LiQuer queries (not the complete URLs). The full splits (``split_df`` for untagged and ``tsplit_df`` for tagged data) execute all the split queries, which might be slow.
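The filter-and-split flow above can be sketched in a few lines. This is an illustrative stand-in written against plain lists of row dicts rather than pandas dataframes, and the function names (`eq_rows`, `teq_rows`, `split_queries`) are hypothetical, not LiQuer's actual API:

```python
# Toy model of LiQuer's eq / teq / split_df logic (illustrative, not the real API).
def eq_rows(rows, column, value):
    """Keep only the rows where `column` equals `value` (like ``eq``)."""
    return [row for row in rows if row[column] == value]

def teq_rows(rows, column, value):
    """Like eq_rows, but always keep the first row, which by HDX
    convention carries the HXL tags (like ``teq``)."""
    return rows[:1] + eq_rows(rows[1:], column, value)

def split_queries(rows, column, url="URL"):
    """One filter query per unique value of `column` (like ``split_df``)."""
    values = list(dict.fromkeys(row[column] for row in rows))
    return [f"df_from-{url}/eq-{column}-{value}" for value in values]

rows = [
    {"a": "x", "b": 1},
    {"a": "y", "b": 2},
    {"a": "x", "b": 3},
]
print(split_queries(rows, "a"))   # ['df_from-URL/eq-a-x', 'df_from-URL/eq-a-y']
print(eq_rows(rows, "a", "x"))    # the two rows where a == 'x'
```

Each generated query is itself a valid LiQuer URL fragment, which is exactly what the wizard presents to the user as download links.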
As a side effect, the results are cached (depending on the configuration; the example is using ``FileCache('cache')``).

The complete user interface is in a single HTML file ``hdx_wizard.html``, served by the Flask server. Inside the user interface, the LiQuer service is called multiple times, e.g. to get previews or metadata:

* Data previews use the ability of LiQuer ``lq_pandas`` to convert dataframes to JSON, which can easily be read into JavaScript in the browser. The first preview uses the ``head_df`` command to display only a restricted part of the dataframe (the *head*)
* the ``columns_info`` command is used to get the list of columns and eventual tags
* the ``/api/build`` service is used to build valid queries from JavaScript lists. This could be implemented directly in JavaScript; the build service is a way to remotely call ``liquer.parser.encode``.

# Integration of libhxl (example of a custom state type)

Pandas is great, but there are other good libraries too, e.g. [tabulate](https://bitbucket.org/astanin/python-tabulate). If you want to use another data type (tabular or other), it will typically require (besides some useful commands) defining how that data can be serialized. This is done by implementing a *state type*. A state type does several things associated with state data handling, but its most important role is handling serialization and deserialization.

One excellent library for working with humanitarian data is [libhxl](https://github.com/HXLStandard/libhxl-python). Libhxl plays a somewhat similar role to pandas: it reads, writes and manipulates tabular data - but it also understands [HXL](http://hxlstandard.org), which pandas doesn't - hence the ``liquer.ext.lq_hxl`` module. In order to allow libhxl objects to be used in LiQuer, we need to define a state type: ``HxlStateType``.
```python
import hxl
from liquer.state_types import StateType, register_state_type, mimetype_from_extension

class HxlStateType(StateType):
    def identifier(self):
        "Define an unique string identifier for the state type"
        return "hxl_dataset"
```

The ``identifier`` is important e.g. for caching, where it is stored as a part of metadata and it tells what StateType should be used for deserialization.

```python
    def default_extension(self):
        "Default file extension for the state type"
        return "csv"

    def is_type_of(self, data):
        "Check if data is of this state type"
        return isinstance(data, hxl.model.Dataset)
```

The default extension is used when the extension is not specified otherwise - for example, if the query does not end with a filename.

The ``as_bytes`` and ``from_bytes`` are the two most important methods, which take care of the serialization and deserialization. State data can be serialized into multiple formats (e.g. csv, html, json...), therefore ``as_bytes`` optionally accepts a file extension and returns, besides the bytes, the mimetype. The mimetype (when queried through the LiQuer server) becomes a part of the web service response.

Note that serialization and deserialization do not necessarily need to support the same formats. E.g. html is quite nice to support in serialization, but it is too unspecific for deserialization.

```python
    def as_bytes(self, data, extension=None):
        """Serialize data as bytes

        File extension may be provided and influence the serialization format.
        """
        if extension is None:
            extension = self.default_extension()
        assert self.is_type_of(data)
        mimetype = mimetype_from_extension(extension)
        if extension == "csv":
            output = "".join(data.gen_csv(show_headers=True, show_tags=True))
            return output.encode("utf-8"), mimetype
        elif extension == "json":
            output = "".join(data.gen_json(show_headers=True, show_tags=True))
            return output.encode("utf-8"), mimetype
        else:
            raise Exception(
                f"Serialization: file extension {extension} is not supported by HXL dataset type.")

    def from_bytes(self, b: bytes, extension=None):
        """De-serialize data from bytes

        File extension may be provided and influence the serialization format.
        """
        if extension is None:
            extension = self.default_extension()
        f = BytesIO()  # BytesIO is imported from io at module level
        f.write(b)
        f.seek(0)
        if extension == "csv":
            return hxl.data(f)
        raise Exception(
            f"Deserialization: file extension {extension} is not supported by HXL dataset type.")
```

Sometimes a deep copy of state data is needed - e.g. to assure that the data in the cache will not become unintentionally modified. That's why the state type should define a ``copy`` method. Since a libhxl dataset is immutable (?), it is OK to return just the data without making a copy.

```python
    def copy(self, data):
        """Make a deep copy of the data"""
        return data
```

Once the state type class is defined, a state type instance is created and registered:

```python
HXL_DATASET_STATE_TYPE = HxlStateType()
register_state_type(hxl.Dataset, HXL_DATASET_STATE_TYPE)
register_state_type(hxl.io.HXLReader, HXL_DATASET_STATE_TYPE)
```

This is (currently) done for all relevant types. State types are registered in a global ``StateTypesRegistry`` object, which is responsible for registering and finding a state type instance for any state data. For more details see the ``liquer.ext.lq_hxl`` module.

Actually, a state type may not define serialization and/or deserialization. There are objects that either can't be reliably serialized (e.g.
matplotlib figure - as of time of writing) or serialization is otherwise undesirable. Such state types would be perfectly legal - they just could be neither cached nor served by the liquer web server. However, they could be inside the query, e.g. if matplotlib figure would be followed by image creation command, the image could be both served and cached. # Reports and visualizations With the help of LiQuer, it is very easy to create both resuable visualizations with multiple views as well as documents viewable offline or suitable for printing. There are multiple markups suitable for creating reports and visualisations, but probably the easiest and most flexible are HTML documents. In LiQuer html can be easily created by returning a html text from a command. Creation of text is simplified by ``evaluate_template`` function, which processes a string (*template*) containing LiQuer queries and replaces those queries by their results. Report example is processing data from [Global Food Prices Database (WFP)](https://data.humdata.org/dataset/4fdcd4dc-5c2f-43af-a1e4-93c9b6539a27). It contains monthly prices for various commodities. To adapt the data to our needs we need a cople of extra commands: Month and year are in two separate columns ``mp_year`` and ``mp_month``. For charts we need dates in YYYY-MM-DD format, which we achieve with the following command: ```python @command def datemy(df,y="mp_year",m="mp_month",target="date"): df.loc[:,target]=["%04d-%02d-01"%(int(year),int(month)) for year,month in zip(df[y],df[m])] return df ``` To make statistics, it's handy to use pandas groupby. 
As an example we show count of groups, which used in the report to show number of observed prices in various markets: ```python @command def count(df, *groupby_columns): df.loc[:,"count"]=1 return df.groupby(groupby_columns).count().reset_index().loc[:,list(groupby_columns)+["count"]] ``` An example of a custom filter is a *greater or equal* command ``geq``, used in the report to cut away years before a start year: ```python @command def geq(df, column, value:float): index = df.loc[:,column] >= value return df.loc[index,:] ``` This is somewhat similar to ``eq`` command from the pandas support module ``liquer.ext.lq_pandas``, but only supports numerical values, while the ``eq`` command is somewhat more general. Pandas dataframe supports quite flexible method ``to_html`` for converting dataframes to html format. Report uses for styling the popular css framework [bootstrap](https://getbootstrap.com/) and to display the tables nicely we just need to add some [bootstrap css classes](https://getbootstrap.com/docs/4.3/content/tables/). Command as well prepends a link to the dataframe itself by the ``link`` command. This tends to be very useful in practice, allowing to conviniently import underlying raw data into a spreadsheet. ```python @command def table(state): df = state.get() html=evaluate_template(f"""<a href="${state.query}/link-url-csv$">(data)</a> """) return html+df.to_html(index=False, classes="table table-striped") ``` The core of the report is a ``report`` command. It can be applied on any dataframe containing suitable fields. This allows a large degree of flexibility - arbitrary filters can be inserted into a command chain before the report. For example, the current report can be restricted to specific markets, time periods or commodities without any additional code, just by modifying the URL. Report embeds a possibility to remove data pefore a ``from_year``. 
This in principle could be done by inserting a ``geq`` command before the report (which would work fine). Passing ``from_year`` as an argument has an advantage, that the start year can become a part of the report (e.g. it can be used as a part of the title). Main part of the report is a single template, evaluated with ``evaluate_template``. Note that LiQuer template uses as well string interpolation by [python f string (PEP 498)](https://docs.python.org/3/whatsnew/3.6.html#whatsnew36-pep498), which is a very powerful combination. ```python @command def report(state, from_year=2017, linktype=None): state = state.with_caching(False) def makelink(url): if linktype is None: return url extension = url.split(".")[-1] return evaluate(f"fetch-{encode_token(url)}/link-{linktype}-{extension}").get() try: source = state.sources[0] except: source = "???" LiQuer='<a href="https://github.com/orest-d/liquer">&nbsp;LiQuer&nbsp;</a>' df = state.get() try: title = ",".join(sorted(df.adm0_name.unique())) + f" since {from_year}" except: title = "report" return state.with_filename("report.html").with_data(evaluate_template(f""" <html> <head> <title>{title}</title> <meta charset="utf-8"> <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no"> <link rel="stylesheet" href="{makelink('https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css')}" integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossorigin="anonymous"> </head> <body> <div class="p-3 mb-2 bg-success text-white fixed-top shadow-sm"> <a class="nav-link active" href="https://data.humdata.org"><img src="{makelink('https://centre.humdata.org/wp-content/uploads/hdx_new_logo_accronym2.png')}" style="height:30px;" alt="HDX"></a> </div> <div class="bg-light fixed-bottom border-top"> Generated with {LiQuer} <span class="float-right">&#169; 2019 Orest Dubay</span> </div> <br/> <br/> <br/> <br/> <h1>{title}</h1> <div class="container-fluid"> <div 
class="row"> Data originate from <a href="{source}">&nbsp;{source}&nbsp;</a> were processed via a {LiQuer} service. Only data after {from_year} are shown (<a href="${state.query}/datemy/geq-mp_year-{from_year}/link-url-csv$">data</a>), complete data are <a href="${state.query}/datemy/link-url$">&nbsp;here</a>. </div> <div class="row"> <div class="col-md-6" style="height:50%;">${state.query}/datemy/geq-mp_year-{from_year}/groupby_mean-mp_price-date-cm_name/plotly_chart-xys-date-mp_price-cm_name$</div> <div class="col-md-6" style="height:50%;">${state.query}/datemy/geq-mp_year-{from_year}/count-adm1_name/plotly_chart-piexs-count-adm1_name$</div> </div> <div class="row"> <div class="col-md-6" style="height:50%;"> <h2>Average prices</h2> ${state.query}/datemy/geq-mp_year-{from_year}/groupby_mean-mp_price-cm_name/table$</div> <div class="col-md-6" style="height:50%;"> <h2>Observations</h2> ${state.query}/datemy/geq-mp_year-{from_year}/count-adm1_name/table$</div> </div> </body> </html> """)) ``` Inside the ``report`` command some more magic is used to handle links and external resources. Links are created by a nested function ``makelink``. The main purpose is to allow three different regimes of working with links: * links to original sources (default), * serving (proxying) resources through LiQuer service and * dataurls. **Links to original sources** are useful if the report is used from a web service: the report size is then relatively small and thus the loading time is faster than for dataurls. **Proxying resources** through LiQuer service allows to cache resources by LiQuer. This may be useful on slower internet connections, when running the service without internet or behind a firewall. **Dataurl** link type allows saving the report as a single html file. Such a report can be used e.g. for offline browsing, archiving or sending by e-mail. All the assets are embedded inside the html file, so the report will work even when the LiQuer service is not available. 
Note: The embedded LiQuer queries will of course not work offline, but if necessary, the data comming out from LiQuer can be turned to a dataurl with ``link`` command; type of the link can be controlled by ``linktype`` state variable. Assuming *linktype* is not hardcoded (as in ``table`` command) all query links in the report could be turned to dataurls like this: ``` filter-params/let-linktype-dataurl/report ``` This of course could lead to extremply large report files, so it should be used carefully.
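To illustrate what the dataurl link type produces, here is a minimal, self-contained sketch of dataurl encoding. This is a simplified stand-in for what a ``link``-style command does internally; the helper name ``make_dataurl`` is hypothetical and not part of LiQuer's API:

```python
import base64

def make_dataurl(data: bytes, mimetype: str) -> str:
    """Encode raw bytes as a data URL that can be embedded directly in HTML."""
    encoded = base64.b64encode(data).decode("ascii")
    return f"data:{mimetype};base64,{encoded}"

# Any asset (CSS, image, CSV...) encoded this way becomes part of the HTML file
# itself, which is why dataurl reports work offline but grow large quickly.
print(make_dataurl(b"hello", "text/plain"))  # data:text/plain;base64,aGVsbG8=
```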
50.613208
261
0.719478
eng_Latn
0.990959
fab09a209f23108fa456be1ff98f394ace19b778
241
md
Markdown
firmware/README.md
dekuNukem/exixe_clock
fe1d7c57de13c797f41c21e8e7dc471bf890534f
[ "MIT" ]
16
2018-02-24T19:38:01.000Z
2021-11-08T22:59:23.000Z
firmware/README.md
dekuNukem/exixe_clock
fe1d7c57de13c797f41c21e8e7dc471bf890534f
[ "MIT" ]
5
2019-01-11T22:54:24.000Z
2019-10-25T10:19:49.000Z
firmware/README.md
dekuNukem/exixe_clock
fe1d7c57de13c797f41c21e8e7dc471bf890534f
[ "MIT" ]
7
2019-05-16T00:56:06.000Z
2021-02-06T05:45:51.000Z
Connect a GPS antenna to acquire the time.

Hold the A button while powering up to change the 12/24-hour format.

Hold the B button while powering up to change the UTC offset.

Press the A or B button during normal operation to switch between the seconds and temperature display.
40.166667
90
0.817427
eng_Latn
0.999268
fab18114aad6d057c71e9daede73869970beaae1
7,593
md
Markdown
articles/data-lake-store/data-lake-store-power-bi.md
OpenLocalizationTestOrg/azure-docs-pr15_it-IT
a5b6eb257721d6a02db53be2d3b2bee1d9e5aa1c
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/data-lake-store/data-lake-store-power-bi.md
OpenLocalizationTestOrg/azure-docs-pr15_it-IT
a5b6eb257721d6a02db53be2d3b2bee1d9e5aa1c
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/data-lake-store/data-lake-store-power-bi.md
OpenLocalizationTestOrg/azure-docs-pr15_it-IT
a5b6eb257721d6a02db53be2d3b2bee1d9e5aa1c
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
<properties
    pageTitle="Analyze data in Data Lake Store by using Power BI | Microsoft Azure"
    description="Use Power BI to analyze data stored in Azure Data Lake Store"
    services="data-lake-store"
    documentationCenter=""
    authors="nitinme"
    manager="jhubbard"
    editor="cgronlun"/>

<tags
    ms.service="data-lake-store"
    ms.devlang="na"
    ms.topic="article"
    ms.tgt_pltfrm="na"
    ms.workload="big-data"
    ms.date="10/05/2016"
    ms.author="nitinme"/>

# <a name="analyze-data-in-data-lake-store-by-using-power-bi"></a>Analyze data in Data Lake Store by using Power BI

In this article, you will learn how to use Power BI Desktop to analyze and visualize data stored in Azure Data Lake Store.

## <a name="prerequisites"></a>Prerequisites

Before you begin this tutorial, you must have the following:

- **An Azure subscription**. See [Get an Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
- **An Azure Data Lake Store account**. Follow the instructions in [Get started with Azure Data Lake Store using the Azure portal](data-lake-store-get-started-portal.md). This article assumes that you have already created a Data Lake Store account named **mybidatalakestore** and uploaded a sample data file (**Drivers.txt**). The sample file is available for download from the [Azure Data Lake Git repository](https://github.com/Azure/usql/tree/master/Examples/Samples/Data/AmbulanceData/Drivers.txt).
- **Power BI Desktop**. You can download it from the [Microsoft Download Center](https://www.microsoft.com/en-us/download/details.aspx?id=45331).

## <a name="create-a-report-in-power-bi-desktop"></a>Create a report in Power BI Desktop

1. Launch Power BI Desktop on your computer.

2. From the **Home** ribbon, click **Get Data**, and then click More. In the **Get Data** dialog box, click **Azure**, click **Azure Data Lake Store**, and then click **Connect**.

    ![Connect to Data Lake Store] (./media/data-lake-store-power-bi/get-data-lake-store-account.png "Connect to Data Lake Store")

3. If you see a dialog box about the connector being in a development phase, choose to continue.

4. In the **Microsoft Azure Data Lake Store** dialog box, provide the URL for your Data Lake Store account, and then click **OK**.

    ![URL for Data Lake Store] (./media/data-lake-store-power-bi/get-data-lake-store-account-url.png "URL for Data Lake Store")

5. In the next dialog box, click **Sign in** to sign in to the Data Lake Store account. You will be redirected to your organization's sign-in page. Follow the prompts to sign in to the account.

    ![Sign in to Data Lake Store] (./media/data-lake-store-power-bi/get-data-lake-store-account-signin.png "Sign in to Data Lake Store")

6. After you have successfully signed in, click **Connect**.

    ![Connect to Data Lake Store] (./media/data-lake-store-power-bi/get-data-lake-store-account-connect.png "Connect to Data Lake Store")

7. The next dialog box shows the file that you uploaded to your Data Lake Store account. Verify the information and then click **Load**.

    ![Load data from Data Lake Store] (./media/data-lake-store-power-bi/get-data-lake-store-account-load.png "Load data from Data Lake Store")

8. After the data has been successfully loaded into Power BI, you will see the following fields in the **Fields** tab.

    ![Imported fields] (./media/data-lake-store-power-bi/imported-fields.png "Imported fields")

    However, to visualize and analyze the data, we prefer the data to be available per the following fields.

    ![Desired fields] (./media/data-lake-store-power-bi/desired-fields.png "Desired fields")

    In the next steps, we will update the query to convert the imported data into the desired format.

9. From the **Home** ribbon, click **Edit Queries**.

    ![Edit queries] (./media/data-lake-store-power-bi/edit-queries.png "Edit queries")

10. In the Query Editor, under the **Content** column, click **Binary**.

    ![Edit queries] (./media/data-lake-store-power-bi/convert-query1.png "Edit queries")

11. You will see a file icon that represents the **Drivers.txt** file you uploaded. Right-click the file, and click **CSV**.

    ![Edit queries] (./media/data-lake-store-power-bi/convert-query2.png "Edit queries")

12. You will see an output as shown below. Your data is now available in a format that you can use to create visualizations.

    ![Edit queries] (./media/data-lake-store-power-bi/convert-query3.png "Edit queries")

13. From the **Home** ribbon, click **Close and Apply**.

    ![Edit queries] (./media/data-lake-store-power-bi/load-edited-query.png "Edit queries")

14. Once the query is updated, the **Fields** tab shows the new fields available for visualization.

    ![Updated fields] (./media/data-lake-store-power-bi/updated-query-fields.png "Updated fields")

15. Let us create a pie chart to represent the drivers in each city for a given country. To do so, make the following selections.

    1. From the Visualizations tab, click the symbol for a pie chart.

        ![Create pie chart] (./media/data-lake-store-power-bi/create-pie-chart.png "Create pie chart")

    2. The columns that we are going to use are **Column 4** (name of the city) and **Column 7** (name of the country). Drag these columns from the **Fields** tab to the **Visualizations** tab as shown below.

        ![Create visualizations] (./media/data-lake-store-power-bi/create-visualizations.png "Create visualizations")

    3. The pie chart should now resemble the one shown below.

        ![Pie chart] (./media/data-lake-store-power-bi/pie-chart.png "Create visualizations")

16. By selecting a specific country from the page level filters, you can now see the number of drivers in each city of the selected country. For example, under the **Visualizations** tab, under **Page level filters**, select **Brazil**.

    ![Select a country] (./media/data-lake-store-power-bi/select-country.png "Select a country")

17. The pie chart is automatically updated to display the drivers in the cities of Brazil.

    ![Drivers in a country] (./media/data-lake-store-power-bi/driver-per-country.png "Drivers per country")

18. From the **File** menu, click **Save** to save the visualization as a Power BI Desktop file.

## <a name="publish-report-to-power-bi-service"></a>Publish the report to the Power BI service

Once you have created the visualizations in Power BI Desktop, you can share them with others by publishing them to the Power BI service. For instructions on how to do that, see [Publishing from Power BI Desktop](https://powerbi.microsoft.com/documentation/powerbi-desktop-upload-desktop-files/).

## <a name="see-also"></a>See also

* [Analyze data in Data Lake Store using Data Lake Analytics](../data-lake-analytics/data-lake-analytics-get-started-portal.md)
60.744
531
0.755301
ita_Latn
0.989639
fab1c5d8609e31f19df57367f77a89e82a6dd180
1,534
md
Markdown
publications.md
san25dec/san25dec.github.io
f3dfd703f1be237c66008cae1964b9690954dec2
[ "MIT" ]
null
null
null
publications.md
san25dec/san25dec.github.io
f3dfd703f1be237c66008cae1964b9690954dec2
[ "MIT" ]
null
null
null
publications.md
san25dec/san25dec.github.io
f3dfd703f1be237c66008cae1964b9690954dec2
[ "MIT" ]
null
null
null
---
title: "Publications"
layout: gridlay
permalink: /publications/
---

# Publications

{% for publi in site.data.publist %}

{{ publi.title }} <br />
<span style="color:green; font-size:70%"><em>{{ publi.authors }} </em></span>
<span style="color:darkcyan; font-size:70%"><br />{{ publi.venue}}</span>

<p style="border: 1px solid steelblue; width: 50px; text-align: center; height: 30px; line-height: 30px; border-radius: 10px; color: steelblue; display: inline-block; font-size:70%"> <a href="{{publi.url}}"> PDF </a> </p>
{% if publi.project != "" and publi.project != nil %}
<p style="border: 1px solid steelblue; width: 70px; text-align: center; height: 30px; line-height: 30px; border-radius: 10px; color: steelblue; display: inline-block; font-size:70%"> <a href="{{publi.baseurl}}{{publi.project}}"> Project </a> </p>
{% endif %}
{% if publi.code != "" and publi.code != nil %}
<p style="border: 1px solid steelblue; width: 60px; text-align: center; height: 30px; line-height: 30px; border-radius: 10px; color: steelblue; display: inline-block; font-size:70%"> <a href="{{publi.baseurl}}{{publi.code}}"> Code </a> </p>
{% endif %}
{% if publi.data != "" and publi.data != nil %}
<p style="border: 1px solid steelblue; width: 60px; text-align: center; height: 30px; line-height: 30px; border-radius: 10px; color: steelblue; display: inline-block; font-size:70%"> <a href="{{publi.baseurl}}{{publi.data}}"> Data </a> </p>
{% endif %}

{% endfor %}
32.638298
184
0.627119
eng_Latn
0.152572
fab205cf1e25672f945adc33c19c579b07acb345
162
md
Markdown
README.md
hacjy/LeafLoadingView
d8113736b44501ee7281020f20a2fcf15d8a2e3f
[ "MIT" ]
1
2017-09-07T08:31:26.000Z
2017-09-07T08:31:26.000Z
README.md
hacjy/LeafLoadingView
d8113736b44501ee7281020f20a2fcf15d8a2e3f
[ "MIT" ]
null
null
null
README.md
hacjy/LeafLoadingView
d8113736b44501ee7281020f20a2fcf15d8a2e3f
[ "MIT" ]
null
null
null
# LeafLoadingView

<p>Implementation of a flashy loading animation effect</p>

<img src="https://github.com/hacjy/LeafLoadingView/blob/master/LeafLoadingView/snapshot/LeafLoadingView.gif" alt="Demo"/>
40.5
120
0.796296
yue_Hant
0.492608
fab28e5a11813398563bebc26d9e7012ffb93b19
1,375
md
Markdown
readme.md
BenJammen1986/trackSuggest
1db71a4fab302edce35d46d3162f80a06badc0d7
[ "MIT" ]
1
2017-02-18T00:01:39.000Z
2017-02-18T00:01:39.000Z
readme.md
BenJammen1986/trackSuggest
1db71a4fab302edce35d46d3162f80a06badc0d7
[ "MIT" ]
null
null
null
readme.md
BenJammen1986/trackSuggest
1db71a4fab302edce35d46d3162f80a06badc0d7
[ "MIT" ]
null
null
null
# _Track Suggester_

#### _Epicodus Programming Track Options, 02/28/2017_

#### By _**Ben Schenkenberger**_

## Description

_This page is designed to help potential students decide: 1. whether programming is the correct career choice for them, and 2. which programming language/track is the best option for them personally._

## Setup/Installation Requirements

**Cloning instructions**

_You are welcome to clone this project! You'll need to:_

* _1. Follow the instructions on GitHub to clone/download,_ **or**
* _1. For Windows: download Git Bash at https://git-scm.com/downloads. For Mac: search your computer for "terminal" (the computer's built-in command prompt)._
* _2. Open Git Bash/Terminal._
* _3. Navigate to the desktop (or the folder you want the project to appear in) using this text: `cd desktop` (i.e. "C.hange D.irectory to desktop")._
* _4. Input this text into Git Bash/Terminal: `git clone https://github.com/BenJammen1986/trackSuggest`. The project folder should appear in the directory you navigated to._
* _5. If you want to edit/alter the code on your computer, you will also need to **download a simple code editor like Atom.**_

## Support and contact details

_If you have any issues, concerns, or feedback, please e-mail me at benschenkenberger@gmail.com._

### License

*This project uses the MIT open source license.*

Copyright (c) 2017 **_Ben Schenkenberger_**
34.375
97
0.749818
eng_Latn
0.978034
fab3031244e2b0e83bae6aa2ef8d4a4dc091f842
128
md
Markdown
README.md
brucdo/Ola-Mundo
ad67045edebdeec8ca90938a6526a380d27d649b
[ "MIT" ]
null
null
null
README.md
brucdo/Ola-Mundo
ad67045edebdeec8ca90938a6526a380d27d649b
[ "MIT" ]
null
null
null
README.md
brucdo/Ola-Mundo
ad67045edebdeec8ca90938a6526a380d27d649b
[ "MIT" ]
null
null
null
# Hello, World

First repository of the Git and GitHub course

Course by Gustavo Guanabara

I added this line directly on the website
18.285714
43
0.796875
por_Latn
0.990214
fab303e0e43fe0f09944c8a4c21c7a157b1e1dd2
5,791
md
Markdown
blog/2013-07-11-basic-13-list-deal-with.md
ningg/computer-science-basic
98f25ad658ae38b9db4c099c4888da447f43b295
[ "MIT" ]
1
2015-07-12T11:25:12.000Z
2015-07-12T11:25:12.000Z
blog/2013-07-11-basic-13-list-deal-with.md
ningg/computer-science-basic
98f25ad658ae38b9db4c099c4888da447f43b295
[ "MIT" ]
null
null
null
blog/2013-07-11-basic-13-list-deal-with.md
ningg/computer-science-basic
98f25ad658ae38b9db4c099c4888da447f43b295
[ "MIT" ]
null
null
null
---
layout: post
title: "Linked lists: common operations"
description: Deleting a node, finding the k-th node from the end, reversing a linked list
published: true
category: CS basic
---

## Deleting a node from a linked list

Tips:

> Watch the boundary conditions: when the node to delete is the head or the tail, such edge cases must be handled separately.

    /**
     * Delete the specified node
     *
     * @param pHead the head node of the list
     * @param toBeRemoved the node to be deleted
     */
    public static void removeNode(Node pHead, Node toBeRemoved) {
        // 1. Boundary checks
        if (null == pHead || null == toBeRemoved) {
            return;
        }

        // 2. Delete the node
        // a. Head node: point the head directly at the next node
        //    note: requires the head node
        if (pHead == toBeRemoved) {
            pHead = pHead.next;
            return;
        }

        // b. Tail node: traverse from the front to the second-to-last node, then drop the last one
        //    note: requires the head node
        if (toBeRemoved.next == null) {
            Node pCurr = pHead;
            while (pCurr.next != toBeRemoved) {
                pCurr = pCurr.next;
            }
            pCurr.next = null;
            return;
        }

        // c. Middle node: neither head nor tail
        //    note: does not require the head node
        toBeRemoved.value = toBeRemoved.next.value;
        toBeRemoved.next = toBeRemoved.next.next;
    }

Rewriting the code above for two scenarios:

### Scenario 1: the node to delete is neither the head nor the tail

When the node to delete is neither the `head` nor the `tail`, the address of the list head is not needed. The code is as follows:

    void deleteNode(Node* pToBeDeleted)
    {
        if(pToBeDeleted == NULL)
            return;
        Node* pNext = pToBeDeleted->next;
        if(pNext == NULL)
            return;
        pToBeDeleted->value = pNext->value;
        pToBeDeleted->next = pNext->next;
    }

### Scenario 2: the node to delete may be the head or the tail

If the cases where the node to delete is the `head` or the `tail` must be considered, the address of the list head is needed. The code is as follows:

    /**
     * Delete the specified node
     *
     * @param pHead the head node of the list
     * @param toBeRemoved the node to be deleted
     */
    public static void removeNode(Node pHead, Node toBeRemoved) {
        // 1. Boundary checks
        if (null == pHead || null == toBeRemoved) {
            return;
        }

        // 2. Delete the node
        // a. Head node: point the head directly at the next node
        //    note: requires the head node
        if (pHead == toBeRemoved) {
            pHead = pHead.next;
            return;
        }

        // b. Tail node: traverse from the front to the second-to-last node, then drop the last one
        //    note: requires the head node
        if (toBeRemoved.next == null) {
            Node pCurr = pHead;
            while (pCurr.next != toBeRemoved) {
                pCurr = pCurr.next;
            }
            pCurr.next = null;
            return;
        }

        // c. Middle node: neither head nor tail
        //    note: does not require the head node
        toBeRemoved.value = toBeRemoved.next.value;
        toBeRemoved.next = toBeRemoved.next.next;
    }

## The k-th node from the end of a linked list

Tips:

> Watch the boundary conditions: the list may be empty, k may be less than or equal to 0, or the list may be shorter than k.

Idea: use two pointers that are k nodes apart. When the first pointer reaches the end of the list (null), the second pointer points at the k-th node from the end.

    /**
     * Find the k-th node from the end of the list
     *
     * @param head the head pointer of the list
     * @param k which node from the end to find
     * @return the target node
     */
    public static Node findKthToTail(Node head, int k) {
        // 1. Boundary checks
        if (null == head || k <= 0) {
            return null;
        }

        // 2. Find the k-th node from the end
        // a. Two pointers; move the first one ahead by k nodes
        Node former = head;
        Node latter = head;
        for (int i = 1; i <= k; i++) {
            if (null != former) {
                former = former.next;
            } else {
                return null;
            }
        }

        // b. When the leading pointer reaches null, the trailing pointer is the target
        while (null != former) {
            former = former.next;
            latter = latter.next;
        }
        return latter;
    }

## Reversing a linked list

Draw a diagram to clarify the idea; four nodes are enough to see the pattern. Mind the boundary condition where the input list is null. Example code:

    Node* ReversedList(Node* pHead)
    {
        if(pHead == NULL || pHead->next == NULL)
            return pHead;

        Node* pPre = NULL;
        Node* pCur = pHead;
        Node* pNext = pHead->next;
        while(pNext != NULL)
        {
            pCur->next = pPre;
            pPre = pCur;
            pCur = pNext;
            pNext = pNext->next;
        }
        pCur->next = pPre;
        return pCur;
    }

## Merging two sorted linked lists

Iterative approach:

    // Merge two sorted lists, iterative approach
    ListNode* MergeTwoList(ListNode* pHead1, ListNode* pHead2)
    {
        if (pHead1 == NULL)
            return pHead2;
        else if (pHead2 == NULL)
            return pHead1;

        ListNode* pNode1 = pHead1;
        ListNode* pNode2 = pHead2;
        ListNode* pMergeListHead = NULL;
        ListNode* pCurLastNode = NULL;

        if (pNode1->m_nValue < pNode2->m_nValue)
        {
            pMergeListHead = pHead1;
            pNode1 = pNode1->m_pNext;
            pCurLastNode = pMergeListHead;
        }
        else
        {
            pMergeListHead = pHead2;
            pNode2 = pNode2->m_pNext;
            pCurLastNode = pMergeListHead;
        }

        while (pNode1 != NULL && pNode2 != NULL)
        {
            if (pNode1->m_nValue < pNode2->m_nValue)
            {
                pCurLastNode->m_pNext = pNode1;
                pCurLastNode = pNode1;
                pNode1 = pNode1->m_pNext;
            }
            else
            {
                pCurLastNode->m_pNext = pNode2;
                pCurLastNode = pNode2;
                pNode2 = pNode2->m_pNext;
            }

            if (pNode1 == NULL)
            {
                pCurLastNode->m_pNext = pNode2;
            }
            if (pNode2 == NULL)
            {
                pCurLastNode->m_pNext = pNode1;
            }
        }
        return pMergeListHead;
    }

Recursive approach:

    // Merge two sorted lists, recursive approach
    ListNode *MergeTwoList(ListNode *pListOneHead, ListNode *pListTwoHead)
    {
        if (pListOneHead == NULL)
            return pListTwoHead;
        else if (pListTwoHead == NULL)
            return pListOneHead;

        ListNode *pMergeListHead = NULL;
        if (pListOneHead->m_nValue < pListTwoHead->m_nValue)
        {
            pMergeListHead = pListOneHead;
            pMergeListHead->m_pNext = MergeTwoList(pMergeListHead->m_pNext, pListTwoHead);
        }
        else
        {
            pMergeListHead = pListTwoHead;
            pMergeListHead->m_pNext = MergeTwoList(pListOneHead, pMergeListHead->m_pNext);
        }
        return pMergeListHead;
    }

Reference: [Interview question 15: merging two sorted linked lists][面试题15:合并两个排序的链表]

## Sorting a linked list

Essentially, merge sort applied to a linked list:

1. Split: split the list into 2 lists (find the middle node and split there)
2. Sort: recursively sort the 2 lists
3. Merge: merge the 2 sorted lists

[NingG]: http://ningg.github.com "NingG"
[面试题15:合并两个排序的链表]: http://blog.csdn.net/htyurencaotang/article/details/9396733
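The iterative pointer-reversal idea above can also be sketched in Python (illustrative code, not from the original post):

```python
class Node:
    """Minimal singly linked list node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse_list(head):
    """Reverse a singly linked list iteratively; returns the new head."""
    prev = None
    curr = head
    while curr is not None:
        nxt = curr.next    # remember the rest of the list
        curr.next = prev   # reverse the pointer
        prev = curr
        curr = nxt
    return prev

# Build 1 -> 2 -> 3, reverse it, and collect the values.
head = Node(1, Node(2, Node(3)))
rev = reverse_list(head)
values = []
while rev is not None:
    values.append(rev.value)
    rev = rev.next
print(values)  # [3, 2, 1]
```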
18.680645
83
0.540149
yue_Hant
0.938149
fab35206e681a03e6e71965cfcde010b6c7a12a6
3,010
md
Markdown
README.md
KBriverun/sysethereum-contracts
80d702653a22b8f774f000b46543073362663f2e
[ "MIT" ]
null
null
null
README.md
KBriverun/sysethereum-contracts
80d702653a22b8f774f000b46543073362663f2e
[ "MIT" ]
null
null
null
README.md
KBriverun/sysethereum-contracts
80d702653a22b8f774f000b46543073362663f2e
[ "MIT" ]
null
null
null
# Sysethereum contracts

[![Build Status](https://travis-ci.org/syscoin/sysethereum/sysethereum-contracts.svg?branch=master)](https://travis-ci.org/syscoin/sysethereum/sysethereum-contracts)

Ethereum contracts for the Syscoin <=> Ethereum bridge.

If you are new to the Syscoin <=> Ethereum bridge, please check the [docs](https://github.com/syscoin/sysethereum-docs) repository first.

## Core components

* [SyscoinSuperblocks contract](contracts/SyscoinSuperblocks.sol)
    * Keeps a copy of the Syscoin Superblockchain
    * Informs the [SyscoinERC20Manager contract](contracts/token/SyscoinERC20Manager.sol) when a Syscoin transaction locked or unlocked funds.
    * It's kind of a Syscoin version of [BtcRelay](https://github.com/ethereum/btcrelay) but using Superblocks instead of blocks.
* [SyscoinERC20Manager contract](contracts/token/SyscoinERC20Manager.sol)
    * An ERC20 manager contract to hold deposits and transfer funds on unlock
    * Tokens are minted or transferred (for existing ERC20) when coins are locked on the Syscoin blockchain.
    * Tokens are destroyed when coins should go back to the Syscoin blockchain (balances are saved for when moving back to Ethereum).
* [SyscoinClaimManager contract](contracts/SyscoinClaimManager.sol)
    * Manages the interactive (challenge/response) validation of Superblocks.
* [SyscoinERC20Asset](contracts/SyscoinParser/SyscoinERC20Asset.sol)
    - A mintable Syscoin ERC20 asset that follows the ERC20 spec but is also mintable when moving from Syscoin to Ethereum
    - This is useful as some Syscoin assets originate on Syscoin and want to move to Ethereum. Legacy ERC20s must originate on Ethereum and have balances in order to move back to Ethereum from Syscoin. Legacy ERC20s are not mintable, and thus only specific Syscoin ERC20 tokens are mintable when moving from Syscoin without a balance existing in the SyscoinERC20Manager contract.
* [SyscoinMessageLibrary](contracts/SyscoinParser/SyscoinMessageLibrary.sol)
    - Library for parsing/working with Syscoin blocks, txs and merkle trees

## Running the Tests

* Install prerequisites
    * [nodejs](https://nodejs.org) v9.2.0 to v11.15.0.
    * [truffle](http://truffleframework.com/) v5.0.24

      ```
      npm install -g truffle@5.0.24
      ```
    * [ganache-cli](https://github.com/trufflesuite/ganache-cli) v6.4.2 or above.

      ```
      npm install -g ganache-cli
      ```
* Clone this repo.
* Install npm dependencies.
    * cd to the directory where the repo is cloned.

      ```
      npm install
      ```
* Compile contracts

  ```
  # compile contracts
  truffle compile --all
  ```
* Run tests:

  ```
  # first start ganache-cli - and do this again once your gas ran out
  ganache-cli --gasLimit 4000000000000 -e 1000000

  # run tests
  truffle test
  ```

## Deployment

To deploy the contracts

### Requirements

* A Rinkeby/Mainnet client running with rpc enabled

### Deployment

* Run `./scripts/exportAndInit.sh`

## License

MIT License<br/>
Copyright (c) 2019 Blockchain Foundry Inc<br/>
[License](LICENSE)
38.589744
376
0.757143
eng_Latn
0.916507
fab3e1bd4aa718e338f809700ce40125ef6686ba
5,707
md
Markdown
vendor/chrmorandi/yii2-jasper/README.md
aIsabel101/evento
086ff6416bc67cd7857996186613fde73373873e
[ "BSD-3-Clause" ]
1
2016-06-30T19:55:36.000Z
2016-06-30T19:55:36.000Z
vendor/chrmorandi/yii2-jasper/README.md
aIsabel101/evento
086ff6416bc67cd7857996186613fde73373873e
[ "BSD-3-Clause" ]
null
null
null
vendor/chrmorandi/yii2-jasper/README.md
aIsabel101/evento
086ff6416bc67cd7857996186613fde73373873e
[ "BSD-3-Clause" ]
null
null
null
# JasperReports for PHP [![Latest Stable Version](https://poser.pugx.org/chrmorandi/yii2-jasper/v/stable)](https://packagist.org/packages/chrmorandi/yii2-jasper) [![Total Downloads](https://poser.pugx.org/chrmorandi/yii2-jasper/downloads)](https://packagist.org/packages/chrmorandi/yii2-jasper) [![License](https://poser.pugx.org/chrmorandi/yii2-jasper/license)](https://packagist.org/packages/chrmorandi/yii2-jasper) [![Build Status](https://travis-ci.org/chrmorandi/yii2-jasper.svg?branch=master)](https://travis-ci.org/chrmorandi/yii2-jasper) [![Scrutinizer Code Quality](https://scrutinizer-ci.com/g/chrmorandi/yii2-jasper/badges/quality-score.png?b=master)](https://scrutinizer-ci.com/g/chrmorandi/yii2-jasper/?branch=master)

Package to generate reports with the [JasperReports 6](http://community.jaspersoft.com/project/jasperreports-library) library through the [JasperStarter v3](http://jasperstarter.sourceforge.net/) command-line tool.

## Install

```sh
composer require chrmorandi/jasper
```

## Introduction

This package aims to be a solution to compile and process JasperReports (.jrxml & .jasper files).

### Why?

**JasperReports** is the best open source solution for reporting.

Generating HTML + CSS to make a PDF? Never think about it, that doesn't make any sense! :p

### What can I do with this?

Well, everything. JasperReports is a powerful tool for **reporting** and **BI**.

**From their website:**

> The JasperReports Library is the world's most popular open source reporting engine. It is entirely written in Java and it is able to use data coming from any kind of data source and produce pixel-perfect documents that can be viewed, printed or exported in a variety of document formats including HTML, PDF, Excel, OpenOffice and Word.

I recommend using [Jaspersoft Studio](http://community.jaspersoft.com/project/jaspersoft-studio) to build your reports, connect it to your datasource (e.g. MySQL), loop through the results and output it to PDF, XLS, DOC, RTF, ODF, etc.

*What you can do with Jaspersoft:*

* Graphical design environment
* Pixel-perfect report generation
* Output to PDF, HTML, CSV, XLS, TXT, RTF and more

## Examples

### The *Hello World* example

Go to the examples directory in the root of the repository (`vendor/chrmorandi/yii2-jasper/examples`). Open the `hello_world.jrxml` file with iReport or with your favorite text editor and take a look at the source code.

## Requirements

* Java JDK 1.8 or higher
* PHP [exec()](http://php.net/manual/function.exec.php) function
* [optional] [MySQL Connector](http://dev.mysql.com/downloads/connector/j/) (if you want to use a MySQL database)
* [optional] [PostgreSQL Connector](https://jdbc.postgresql.org/download.html) (if you want to use a PostgreSQL database)
* [optional] [Jaspersoft Studio](http://community.jaspersoft.com/project/jaspersoft-studio) (to draw and compile your reports)

## Installation

### Java

Check if you already have Java installed:

```sh
$ java -version
java version "1.8.0_91"
Java(TM) SE Runtime Environment (build 1.8.0_91-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.91-b14, mixed mode)
```

If you get `command not found: java`, then install it with (Ubuntu/Debian):

```sh
$ sudo apt-get install default-jdk
```

Now run `java -version` again and check that the output is OK.

### Composer

Install [Composer](http://getcomposer.org) if you don't have it, then run:

```sh
composer require chrmorandi/yii2-jasper
```

Or in your `composer.json` file add:

```json
{
    "require": {
        "chrmorandi/yii2-jasper": "*"
    }
}
```

And then just run:

```sh
composer update
```

and that's it.

### Add the component to the configuration

```php
return [
    ...
    'components' => [
        'jasper' => [
            'class' => 'chrmorandi\jasper',
            'redirect_output' => false, // optional
            'resource_directory' => false, // optional
            'locale' => 'pt_BR', // optional
            'db' => [
                'host' => 'localhost',
                'port' => 5432,
                'driver' => 'postgres',
                'dbname' => 'db_banco',
                'username' => 'username',
                'password' => 'password',
                //'jdbcDir' => './jdbc', // defaults to ./jdbc
                //'jdbcUrl' => 'jdbc:postgresql://"+host+":"+port+"/"+dbname',
            ]
        ],
        ...
    ],
    ...
];
```

### Using

```php
use chrmorandi\Jasper;

public function actionIndex()
{
    // Set alias for sample directory
    Yii::setAlias('example', '@vendor/chrmorandi/yii2-jasper/examples');

    /* @var $jasper Jasper */
    $jasper = Yii::$app->jasper;

    // Compile a JRXML to Jasper
    $jasper->compile(Yii::getAlias('@example') . '/hello_world.jrxml')->execute();

    // Process a Jasper file to PDF and RTF (you can use the .jrxml directly)
    $jasper->process(
        Yii::getAlias('@example') . '/hello_world.jasper',
        ['php_version' => 'xxx'],
        ['pdf', 'rtf'],
        false,
        false
    )->execute();

    // List the parameters from a Jasper file.
    $array = $jasper->listParameters(Yii::getAlias('@example') . '/hello_world.jasper')->execute();

    // return pdf file
    Yii::$app->response->sendFile(Yii::getAlias('@example') . '/hello_world.pdf');
}
```

### MySQL

We ship the [MySQL connector](http://dev.mysql.com/downloads/connector/j/) (v5.1.39) in the `/src/JasperStarter/jdbc/` directory.

### PostgreSQL

We ship the [PostgreSQL connector](https://jdbc.postgresql.org/) (v9.4-1208) in the `/src/JasperStarter/jdbc/` directory.

## Performance

Performance depends on the complexity and amount of data and on the resources of your machine. It is also possible to generate reports in the background.

## License

MIT
30.036842
337
0.675136
eng_Latn
0.664358
fab461bfd2ad0db68f29d719d66ce0e3a63bc01a
6,446
md
Markdown
README.md
diaz92092/HomeTourApp
93dbbce4a5d150b85f9fd7e18fff457f7dd3b7fc
[ "MIT" ]
null
null
null
README.md
diaz92092/HomeTourApp
93dbbce4a5d150b85f9fd7e18fff457f7dd3b7fc
[ "MIT" ]
null
null
null
README.md
diaz92092/HomeTourApp
93dbbce4a5d150b85f9fd7e18fff457f7dd3b7fc
[ "MIT" ]
null
null
null
# ProjectB

Project B: Home Tour - A Zork-Like Application

## Background

When we think of computer programs and applications today, we think of graphical interfaces with buttons and expandable menus and other kinds of displays. But consumer computers started finding their way into homes 5 years before the technology for what we would think of as "graphics" debuted, and it would take another 5 years for those first graphics cards to become commonplace. So for a solid decade, almost all software had to display output in a text format, and collect input from users through text-based commands.

Some of the first games for these home computers were similar to Choose-Your-Own-Adventure books, and defined the "adventure" genre accordingly (sometimes the genre name is attributed to the game Colossal Cave Adventure, a pioneer of the format). In these kinds of games, the player is presented with a description of a place or an event, and has to type in a command to do something. The game parses this command, and then prints some new output accordingly. Generally, the player is navigating through different "rooms" or areas using cardinal directions to explore the environment.

For a good example of this, check out the adventure game Zork, which you can play from the link in the reference section below. You'll find that while the complete list of commands was only printed in the manual, many of them are intuitive: open mailbox, take leaflet, read leaflet, go south, look...

For this project, you will be building a similar kind of application, but rather than an adventure game we'll just use it to present an interactive "tour" of a home - maybe your current home, maybe your dream home.

In any case, the pattern will be exactly the same as other text-based games:

1. Display prompt
2. Collect input
3. Process input as a command
4. Print output of command

Here's an example of what your game might look like:

## Instructions

For now, we're just going to create the shell for our project, based on what we understand of the requirements. Later, as your skills improve and you continue to learn, you'll probably make changes to these classes or their structure - a process called refactoring - to make your project easier to expand.

In Eclipse, create a project named HomeTour. In the HomeTour project, create the following packages and classes:

* fixtures
  * fixtures.Fixture (abstract)
  * fixtures.Room
* game
  * game.Main
  * game.Player
  * game.RoomManager

These classes will work as follows:

### fixtures.Fixture

This abstract class will be used as a base for anything that can be looked at or interacted with. This class should define (at least) the following properties:

* `String name` : a short name / title for the fixture
* `String shortDescription` : a one-sentence-long description of a fixture, used to briefly mention the fixture
* `String longDescription` : a paragraph-long description of the thing, displayed when the player investigates the fixture thoroughly (looks at it, or enters a room)

Here's an example of what your game might look like, so these make sense:

### fixtures.Room

This class represents a room in the house. It will extend fixtures.Fixture, and so will inherit the descriptive properties. The Room will also have the following properties:

* `Room[] exits` : the rooms adjacent to this one. You might decide that a room in a particular direction always uses a certain index, e.g. a north exit always goes in index 0, an east exit always goes in index 1, etc. If so, then the size of this array depends on how many directions you want to support.

The Room class should also have a constructor that accepts a name, shortDescription, and longDescription. You might also find it convenient to create a getter not just for all the exits, but for a particular exit given a direction:

```java
public Room(String name, String shortDescription, String longDescription) {
    super(name, shortDescription, longDescription);
    this.exits = new Room[?]; // size is your choice
}

public Room[] getExits() {
}

public Room getExit(String direction) {
}
```

### game.Main

This class will store the main(String[]) method for our game (and of course, it will be the only class that has a main(String[]) method). This is where the game-loop will go, where we'll display a prompt, collect input, and parse that input.

* The printRoom(Player) method will print a prompt to the console for the player's current room, similar to the above image.
* The collectInput() method will use a Scanner object to collect console input from the user, and then will divide that input into multiple parts. Generally those parts will look like this: an action, then the target of an action (if any). For example, "go east" -> "go" is the command, "east" is the target. This method will break the input into a String[], and return that.
* The parse(String[], Player) method will take the output of the above collectInput() method and a player object, and will resolve that command. This can actually be simpler than it sounds - the first index of the passed-in String[] should be the action, so you can switch on that and handle the target differently for each case. The Player object is there so you can modify it if needed (like changing the Player's currentRoom based on the direction moved).

```java
public static void main(String[] args) {
}

private static void printRoom(Player player) {
}

private static String[] collectInput() {
}

private static void parse(String[] command, Player player) {
}
```

### game.Player

This class represents the player moving through these rooms. The Player class has these properties:

* `Room currentRoom` : the room the player is currently in.

### game.RoomManager

This class will be responsible for "loading" our rooms into memory. When game.Main is executed, it will invoke the init() method in this class, which will instantiate all our Room objects, link them together as exits, and designate a startingRoom.

* `Room startingRoom` : the room a player should start in.
* `Room[] rooms` : all the rooms in the house.

```java
public void init() {
    Room foyer = new Room(
        "The Foyer",
        "a small foyer",
        "The small entryway of a neo-colonial house. A dining room is open to the south, where a large table can be seen."
            + "\n" + "The hardwood floor leads west into a doorway, next to a staircase that leads up to a second floor."
            + "\n" + "To the north is a small room, where you can see a piano.");
    this.rooms[0] = foyer;
    this.startingRoom = foyer;
}
```
56.052174
590
0.774744
eng_Latn
0.999723
fab4975c072ad7641bd1f392cc49380250989e7b
1,873
md
Markdown
content/authors/jesse/_index.md
kmalki123/academic-kickstart-Fuhrman-Lab
9c551657c69456d814d76d4e3fabe5dd9415e667
[ "MIT" ]
null
null
null
content/authors/jesse/_index.md
kmalki123/academic-kickstart-Fuhrman-Lab
9c551657c69456d814d76d4e3fabe5dd9415e667
[ "MIT" ]
1
2019-10-01T21:51:49.000Z
2019-10-01T21:51:49.000Z
content/authors/jesse/_index.md
kmalki123/academic-kickstart-Fuhrman-Lab
9c551657c69456d814d76d4e3fabe5dd9415e667
[ "MIT" ]
11
2019-08-01T00:29:04.000Z
2021-09-28T19:22:02.000Z
---
# Display name
name: Jesse McNichol

# Username (this should match the folder name)
authors:
- jesse

# Is this the primary user of the site?
superuser: false

# Role/position
role: CBIOMES Postdoc

# Organizations/Affiliations
organizations:
- name: University of Southern California
  url: ""

# Short bio (displayed in user profile at end of posts)
bio: Jesse is a postdoc in the Fuhrman lab working on the CBIOMES collaboration. Please see https://jcmcnch.github.io for more information about his experience and interests.

interests:
- Marine Microbiology
- Energy Conservation Mechanisms in Chemoautotrophs
- Laboratory / Computational Methods Development

# Social/Academic Networking
# For available icons, see: https://sourcethemes.com/academic/docs/widgets/#icons
# For an email link, use "fas" icon pack, "envelope" icon, and a link in the
# form "mailto:your-email@example.com" or "#contact" for contact widget.
social:
- icon: twitter
  icon_pack: fab
  link: https://twitter.com/jcmcnick
- icon: google-scholar
  icon_pack: ai
  link: https://scholar.google.com/citations?hl=en&user=8aUVZB4AAAAJ&view_op=list_works&sortby=pubdate
- icon: github
  icon_pack: fab
  link: https://github.com/jcmcnch

# Link to a PDF of your resume/CV from the About widget.
# To enable, copy your resume/CV to `static/files/cv.pdf` and uncomment the lines below.
# - icon: cv
#   icon_pack: ai
#   link: files/cv.pdf

# Enter email to display Gravatar (if Gravatar enabled in Config)
email: ""

# Organizational groups that you belong to (for People widget)
# Set this to `[]` or comment out if you are not using People widget.
user_groups:
- Researchers
- Visitors
---

Jesse is a postdoc in the Fuhrman lab working on the CBIOMES collaboration. Please see [jcmcnch.github.io](https://jcmcnch.github.io) for more information about his experience and interests.
31.216667
190
0.753337
eng_Latn
0.882863
fab4a10f51c1fb34159d9304fd62deb8f0e57029
381
md
Markdown
README.md
rshin808/primefactors
424da48f38ba00ae2adf9ea201d2d3cd99720312
[ "MIT" ]
null
null
null
README.md
rshin808/primefactors
424da48f38ba00ae2adf9ea201d2d3cd99720312
[ "MIT" ]
null
null
null
README.md
rshin808/primefactors
424da48f38ba00ae2adf9ea201d2d3cd99720312
[ "MIT" ]
null
null
null
# Overview

This system implements a simple function taken from this [coding challenge](http://philipmjohnson.github.io/ics314f15/morea/testing/wod-test-prime-factors.html). It includes Jasmine tests.

# Usage

Simply download the file and invoke the function. See the test code for details.

# Credits

Thanks to [Jasmine](http://jasmine.github.io/) for the test framework.
31.75
162
0.787402
eng_Latn
0.847941
fab4c5d4d246d1bad973b54f20fc8286349c41d5
496
md
Markdown
content/publication/3-stagemi-10/index.md
yajuansi-sophie/academic-kickstart
5c179f7c9bd08ee2c141cc3015e560456cb36db6
[ "MIT" ]
null
null
null
content/publication/3-stagemi-10/index.md
yajuansi-sophie/academic-kickstart
5c179f7c9bd08ee2c141cc3015e560456cb36db6
[ "MIT" ]
null
null
null
content/publication/3-stagemi-10/index.md
yajuansi-sophie/academic-kickstart
5c179f7c9bd08ee2c141cc3015e560456cb36db6
[ "MIT" ]
null
null
null
---
title: "A new three stage multiple imputation approach to protect confidentiality of incomplete census microdata"
date: 2010-01-01
publishDate: 2019-09-02T15:56:12.168221Z
authors: ["Yajuan Si", "Jerome P Reiter"]
publication_types: ["0"]
abstract: ""
featured: false
publication: "*Proceedings of the Section on Survey Research Methods*"
url_pdf: "https://pdfs.semanticscholar.org/8175/fb63b06e12a411fecebe391bc20356814627.pdf?_ga=2.41057514.951173358.1567623551-1451803650.1542137971"
---
38.153846
147
0.792339
kor_Hang
0.228756
fab4dc75b3cf6d94edec4b3e4965f8887b82d49f
607
md
Markdown
README.md
peterszatmary/spring-microservice-config-hackathon
cdfcb68bee0a9898532b06a8540afea73130bf9e
[ "Apache-2.0" ]
null
null
null
README.md
peterszatmary/spring-microservice-config-hackathon
cdfcb68bee0a9898532b06a8540afea73130bf9e
[ "Apache-2.0" ]
null
null
null
README.md
peterszatmary/spring-microservice-config-hackathon
cdfcb68bee0a9898532b06a8540afea73130bf9e
[ "Apache-2.0" ]
null
null
null
# spring-microservice-config-hackathon

[![Build Status](https://travis-ci.org/peterszatmary/spring-microservice-config-hackathon.svg?branch=master)](https://travis-ci.org/peterszatmary/spring-microservice-config-hackathon) [![Codacy Badge](https://api.codacy.com/project/badge/Grade/2b78b6ffa1644b6a8fb1155cd3961645)](https://www.codacy.com/app/peterszatmary/spring-microservice-config-hackathon?utm_source=github.com&amp;utm_medium=referral&amp;utm_content=peterszatmary/spring-microservice-config-hackathon&amp;utm_campaign=Badge_Grade)

Repository that holds all configuration files for microservices.
101.166667
315
0.836903
yue_Hant
0.583058
fab4fa24f8df8c622c110f189f469e105ec4b465
4,159
md
Markdown
README.md
gismawasco/vt_boilerplate
22712e4076e4d6690da454081973e7a7bd22557d
[ "MIT" ]
1
2020-09-06T13:08:19.000Z
2020-09-06T13:08:19.000Z
README.md
gismawasco/vt_boilerplate
22712e4076e4d6690da454081973e7a7bd22557d
[ "MIT" ]
null
null
null
README.md
gismawasco/vt_boilerplate
22712e4076e4d6690da454081973e7a7bd22557d
[ "MIT" ]
null
null
null
# vt-boilerplate

![](https://github.com/narwassco/vt/workflows/Node.js%20CI/badge.svg) ![GitHub](https://img.shields.io/github/license/watergis/vt-boilerplate) ![Docker Cloud Automated build](https://img.shields.io/docker/cloud/automated/narwassco/vt) ![Docker Image Size (latest by date)](https://img.shields.io/docker/image-size/narwassco/vt)

This is a template to manage vector tiles for water service providers in GitHub Pages. You can create your own `vt` repository by using this template repository.

Please also refer to [watergis/awesome-vector-tiles](https://github.com/watergis/awesome-vector-tiles). It contains an instruction guide on how to use the tools and host your vector tiles in GitHub Pages as open data.

## Configuration

All the settings are in `config.js` and `config-search.js`, so please make sure your own settings are in these files before producing vector tiles.

Please put environment variables for the database settings:

```
db_user=$db_user
db_password=$db_password
db_host=host.docker.internal
db_port=5432
```

## Create mbtiles

### Usage (Docker)

```
db_user=your user db_password=your password docker-compose up
```

Your mbtiles will be generated under the `data` directory. Your GeoJSON for the search window will also be generated under the `public` directory.

### Usage (Nodejs)

#### Requirements

This module uses [`tippecanoe`](https://github.com/mapbox/tippecanoe) to convert geojson files to mbtiles. Please make sure to install it before running.

for MacOS

```sh
$ brew install tippecanoe
```

for Ubuntu

```sh
$ git clone https://github.com/mapbox/tippecanoe.git
$ cd tippecanoe
$ make -j
$ make install
```

Then,

```sh
$ npm install
$ db_user=$db_user \
  db_password=$db_password \
  db_host=localhost \
  db_port=5432 \
  npm run create
```

There will be two files as follows.

- ./data/rwss.mbtile
- ./public/wss.geojson

## Deployment from local computer

### Extract pbf (mvt) tiles from mbtiles file

Please configure the `config-extact.js` file to adjust the output directory path and the input mbtiles path.

```
npm run extract
```

There will be vector tiles under the `./public/tiles` directory.

### Deploy

```
npm run deploy
```

It will publish all the files under the `public` directory to GitHub Pages.

## Deployment by using Github Action

First, you can just generate and push both `data/data.mbtiles` and `public/meter.geojson` to the master repository.

```bash
# the below commands are the same with create_vt.bat (create_vt.sh)
# generate data.mbtiles and meter.geojson
docker-compose up
# push data.mbtiles and meter.geojson to Github
git add .
git commit -m "update vectortiles"
git push origin master
```

Then, you can use a GitHub Action for the `npm run extract` and `npm run deploy` process. Here is an example of `.github/workflows/node.js.yml`. If you want to automate, please create it in your repository.

```yml
name: Node.js CI

on:
  push:
    branches: [ master ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Use Node.js
      uses: actions/setup-node@v1
      with:
        node-version: 12.x
    - run: npm ci
      env:
        NODE_AUTH_TOKEN: ${{secrets.GITHUB_TOKEN}}
    - run: npm run extract
    - name: configure git and deploy
      env:
        NODE_AUTH_TOKEN: ${{secrets.GITHUB_TOKEN}}
      run: |
        git config --global user.name "watergis+githubci"
        git config --global user.email "watergis+githubci@users.noreply.github.com"
        git remote set-url origin https://x-access-token:${NODE_AUTH_TOKEN}@github.com/{your organization name}/vt.git
        npm run deploy
```

# License

The source code in this repository is licensed under the `MIT license`. You can use it freely for your purposes. However, the data under [data](./data) and the `gh-pages` branch are owned and maintained by `{your organization name}` in Kenya. They are under a [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/), which is different from the main repository. You can use this data freely, but please mention our credit `©{your organization name}` in the attribution of your web application.

---
Copyright (c) 2020 Jin IGARASHI
28.881944
223
0.731426
eng_Latn
0.948104
fab4fd09c92e3e184744f0054c677f68222d4500
2,052
md
Markdown
.config/sublime-text-3/Packages/requests-oauthlib/README.md
deltakapa/dot-files
bb43088d2bcea15e892dfa45bff934b8e7399e17
[ "MIT" ]
1
2019-04-19T09:40:39.000Z
2019-04-19T09:40:39.000Z
.config/sublime-text-3/Packages/requests-oauthlib/README.md
deltakapa/dot-files
bb43088d2bcea15e892dfa45bff934b8e7399e17
[ "MIT" ]
null
null
null
.config/sublime-text-3/Packages/requests-oauthlib/README.md
deltakapa/dot-files
bb43088d2bcea15e892dfa45bff934b8e7399e17
[ "MIT" ]
null
null
null
# Requests-OAuthlib for Package Control

[![Build Status](https://travis-ci.org/packagecontrol/requests-oauthlib.svg)](https://travis-ci.org/packagecontrol/requests-oauthlib)

This is the *[requests_oauthlib][]* module bundled for usage with [Package Control][], a package manager for the [Sublime Text][] text editor.

| this repo | pypi |
| ---- | ---- |
| ![latest tag](https://img.shields.io/github/tag/packagecontrol/requests-oauthlib.svg) | [![pypi](https://img.shields.io/pypi/v/requests-oauthlib.svg)][pypi] |

## How to use *requests_oauthlib* as a dependency

In order to tell Package Control that you are using the *requests_oauthlib* module in your ST package, create a `dependencies.json` file in your package root with the following contents:

```js
{
   "*": {
      "*": [
         "requests_oauthlib"
      ]
   }
}
```

If the file exists already, add `"requests_oauthlib"` to every dependency list.

Then run the **Package Control: Satisfy Dependencies** command to make Package Control install the module for you locally (if you don't have it already).

After all this you can use `import requests_oauthlib` in any of your Python plugins.

See also: [Documentation on Dependencies](https://packagecontrol.io/docs/dependencies)

## How to update this repository (for contributors)

1. Download the latest tarball from [pypi][].
2. Delete everything inside the `all/` folder.
3. Copy the `requests_oauthlib/` folder, and everything related to copyright/licensing, from the tarball to the `all/` folder.
4. Commit changes and either create a pull request or create a tag directly in the format `v<version>` (in case you have push access).

## License

The contents of the root folder in this repository are released under the *public domain*. The contents of the `all/` folder fall under *their own bundled licenses*.

[requests_oauthlib]: https://requests-oauthlib.readthedocs.org/en/latest/
[Package Control]: http://packagecontrol.io/
[Sublime Text]: http://sublimetext.com/
[pypi]: https://pypi.python.org/pypi/requests_oauthlib
25.333333
156
0.732943
eng_Latn
0.948754
fab5784781db1de204fb4c0ab9648d843a95a845
4,164
md
Markdown
source/_posts/jennifer_lopez_51_proves_she_still_has_abs_as_she_models_a_crop_top_with_rave_buns.md
soumyadipdas37/finescoop.github.io
0346d6175a2c36d4054083c144b7f8364db73f2f
[ "MIT" ]
null
null
null
source/_posts/jennifer_lopez_51_proves_she_still_has_abs_as_she_models_a_crop_top_with_rave_buns.md
soumyadipdas37/finescoop.github.io
0346d6175a2c36d4054083c144b7f8364db73f2f
[ "MIT" ]
null
null
null
source/_posts/jennifer_lopez_51_proves_she_still_has_abs_as_she_models_a_crop_top_with_rave_buns.md
soumyadipdas37/finescoop.github.io
0346d6175a2c36d4054083c144b7f8364db73f2f
[ "MIT" ]
2
2021-09-18T12:06:26.000Z
2021-11-14T15:17:34.000Z
---
extends: _layouts.post
section: content
image: https://i.dailymail.co.uk/1s/2020/10/01/17/33866262-0-image-a-63_1601568417466.jpg
title: Jennifer Lopez, 51, proves she still has abs as she models a crop top with rave buns
description: The mother-of-two wore a crop top hoodie with rave buns as she moved her feet to the music in a clip shared to Instagram. She often flashed her tummy in 2001 when she was with Diddy.
date: 2020-10-01-17-27-31
categories: [latest, tv]
featured: true
---

Jennifer Lopez first started flaunting her abs 20 years ago when she was revving up her music career and dating Diddy, who then went by Puff Daddy.

And on Thursday the 51-year-old Pa' Ti singer proved she still has an enviable midsection as she danced away for a new ad for Coach.

The mother-of-two wore a crop top hoodie with rave buns as she moved her feet to the music in a clip shared to Instagram.

Great look: Jennifer Lopez proved she still has an enviable midsection as she danced away for a new ad for Coach shared on Thursday

The look then: The siren first started flaunting her abs 20 years ago when she was revving up her music career and dating Diddy, who then went by Puff Daddy; seen in 2000

Jenny From the Block was working her old-school vibes in the short top with dark bottoms and her hair in tiny buns on her head.

The looker also wore dewy makeup and added large gold hoop earrings that relayed her Bronx vibe as seen in her On The Six videos.

'My first-ever design collab with @coach❗ (Excited about this one.) You guys know I do a lot, so I need a bag that does too. ✨ My special-edition Hutton is available now,' said the fiancée of Alex Rodriguez in her caption.

In November she was named the new face of Coach after Selena Gomez and Chloe Grace Moretz had the job.

Moving it: The mother-of-two wore a crop top hoodie with rave buns as she moved her feet to the music in a clip shared to Instagram

Oh Jenny: Jenny From the Block was working her old-school vibes in the short top with dark bottoms and her hair in tiny buns on her head

Here is her work: The bag she is holding is a model she designed for Coach

So old school: The looker also wore dewy makeup and added large gold hoop earrings that relayed her Bronx vibe as seen in her On The Six music videos

'I’m so excited for this collaboration with Coach,' Jennifer said in a statement at the time.

'It is a timeless brand that I’ve always been a fan of and the upcoming collection really speaks to my personal style - an uptown downtown mix.'

'Jennifer is so authentic. She’s determined and she’s an original who has followed her own path to do things her own way - she really embodies the attitude of Coach and our new campaign,' Creative Director Stuart Vevers said.

Another stunning look: Here the actress went for a sophisticated look as she held onto a script while in a director's chair

'I loved when Jennifer carried the Coach Signature bags in her 2002 video All I Have.

'She’s from New York like Coach, which creates another authentic connection with our heritage, and I’m particularly excited about bringing Jennifer and Juergen Teller together.'

Other stars who have modeled for Coach recently include Riley Keough, Karolina Kurkova and Emma Roberts.

Coach is owned by Tapestry, Inc, which also runs Kate Spade and Stuart Weitzman.

And a singer too: Jenny flashed her ring from A-Rod as she sat down to sing some tunes. 'My first-ever design collab with @coach❗ (Excited about this one.) You guys know I do a lot, so I need a bag that does too. ✨ My special-edition Hutton is available now,' said the star

Also this week it was announced the Maid In Manhattan star will receive The People’s Icon award at the 2020 E! People’s Choice Awards.

'PCA award-winner and six-time nominee, Lopez will be honored for her iconic performances both on stage and on screen, including her award-winning 2020 Super Bowl halftime performance and lead role in the critically acclaimed film, Hustlers,' it was shared in a press release.

The 2020 E! People’s Choice Awards will broadcast from the Barker Hangar in Santa Monica, California on Sunday, November 15.
68.262295
276
0.780019
eng_Latn
0.999835
fab622c6b60631612a155aa77ea8bd7a0a801c04
52
md
Markdown
README.md
march-sprite/chestnut-tree-demo
8c032de915c58338a966b608b0d358554e4e6ce8
[ "MIT" ]
1
2019-10-24T03:57:09.000Z
2019-10-24T03:57:09.000Z
README.md
march-sprite/chestnut-tree-demo
8c032de915c58338a966b608b0d358554e4e6ce8
[ "MIT" ]
null
null
null
README.md
march-sprite/chestnut-tree-demo
8c032de915c58338a966b608b0d358554e4e6ce8
[ "MIT" ]
null
null
null
# chestnut-tree-demo

a chestnut-tree framework demo
17.333333
30
0.807692
eng_Latn
0.21062
fab6a16f92e2ac4272d664657894079b5b38438c
133
md
Markdown
README.md
josephyhu/Flashcards
08455a2ed7075b946973e83201a6c293ed059ad6
[ "MIT" ]
null
null
null
README.md
josephyhu/Flashcards
08455a2ed7075b946973e83201a6c293ed059ad6
[ "MIT" ]
null
null
null
README.md
josephyhu/Flashcards
08455a2ed7075b946973e83201a6c293ed059ad6
[ "MIT" ]
null
null
null
# Flashcards

From the Treehouse Express Basics course.

Install `node` and `npm`, and then run `npm install`. Finally run `node app.js`.
26.6
80
0.736842
eng_Latn
0.973627
fab6c762a9bc2f6d3bbd7508818b59e19df9aa36
463
md
Markdown
_mainframe_collection/obj10.md
klp/wax-project
d8383d26264ffa56a5cf7d803338fd364f8f37cc
[ "MIT" ]
null
null
null
_mainframe_collection/obj10.md
klp/wax-project
d8383d26264ffa56a5cf7d803338fd364f8f37cc
[ "MIT" ]
null
null
null
_mainframe_collection/obj10.md
klp/wax-project
d8383d26264ffa56a5cf7d803338fd364f8f37cc
[ "MIT" ]
null
null
null
---
pid: obj10
label: Someday Computers Will Talk
company: Madison and Wall
_date: '1971'
decade: 70s
commentary:
object_type: brochure
source: SomedayCompTalk_brochure_MadisonandWall_1971
source_name:
order: '9'
layout: qatar_item
collection: mainframe_collection
thumbnail: "/img/derivatives/iiif/images/obj10/full/250,/0/default.jpg"
manifest: "/img/derivatives/iiif/obj10/manifest.json"
full: "/img/derivatives/iiif/images/obj10/full/1140,/0/default.jpg"
---
25.722222
71
0.796976
eng_Latn
0.20008
fab6fa090f4f004333a88f670a367895c30a693a
862
md
Markdown
_listings/datadog/graph-embed-embed-id-enable-get.md
streamdata-gallery-organizations/datadog
f775474904a4dacace92853bdc37678a183aeae7
[ "CC-BY-3.0" ]
null
null
null
_listings/datadog/graph-embed-embed-id-enable-get.md
streamdata-gallery-organizations/datadog
f775474904a4dacace92853bdc37678a183aeae7
[ "CC-BY-3.0" ]
null
null
null
_listings/datadog/graph-embed-embed-id-enable-get.md
streamdata-gallery-organizations/datadog
f775474904a4dacace92853bdc37678a183aeae7
[ "CC-BY-3.0" ]
null
null
null
---
swagger: "2.0"
info:
  title: DataDog Merged API
  version: 1.0.0
basePath: api/v1/
schemes:
- http
produces:
- application/json
consumes:
- application/json
paths:
  graph/embed/:embed_id/enable:
    get:
      summary: Get Graph Embed Embed Enable
      description: |2-
        Enable a specified embed
      operationId: getGraphEmbedEmbedEnable
      responses:
        200:
          description: OK
      tags:
      - monitoring
      - graph
      - embed
      - embed
      - ""
      - enable
definitions: []
x-collection-name: DataDog
x-streamrank:
  polling_total_time_average: 0
  polling_size_download_average: 0
  streaming_total_time_average: 0
  streaming_size_download_average: 0
  change_yes: 0
  change_no: 0
  time_percentage: 0
  size_percentage: 0
  change_percentage: 0
  last_run: ""
  days_run: 0
  minute_run: 0
---
18.73913
44
0.656613
eng_Latn
0.280555
fab75b1bf2bdf87c256d4bacdd838a7ed031d70f
203
md
Markdown
README.md
BenSegal855/powercord-backend
aa12307b366a3de3f203cdcfca3e137a7f49d78f
[ "MIT" ]
1
2020-10-02T00:45:01.000Z
2020-10-02T00:45:01.000Z
README.md
BenSegal855/powercord-backend
aa12307b366a3de3f203cdcfca3e137a7f49d78f
[ "MIT" ]
null
null
null
README.md
BenSegal855/powercord-backend
aa12307b366a3de3f203cdcfca3e137a7f49d78f
[ "MIT" ]
1
2020-10-31T15:41:45.000Z
2020-10-31T15:41:45.000Z
# powercord-backend

The backend for [powercord-org/powercord](https://github.com/powercord-org/powercord). Based on the [React Hybrid Boilerplate](https://github.com/Bowser65/react-hybrid-boilerplate).
40.6
94
0.793103
eng_Latn
0.343658
fab888d144ab1f66eece072140d98c483a20f382
272
md
Markdown
deep-learning-a-z/readme.md
faizanahemad/coursera_ml_nlp_ai
93480becd896a90d192d62df5afbcbf501a12a62
[ "MIT" ]
null
null
null
deep-learning-a-z/readme.md
faizanahemad/coursera_ml_nlp_ai
93480becd896a90d192d62df5afbcbf501a12a62
[ "MIT" ]
null
null
null
deep-learning-a-z/readme.md
faizanahemad/coursera_ml_nlp_ai
93480becd896a90d192d62df5afbcbf501a12a62
[ "MIT" ]
1
2020-04-21T14:40:06.000Z
2020-04-21T14:40:06.000Z
### Links

- [Deep Learning - Udemy](https://www.udemy.com/deeplearning/learn/v4/overview)
- [Data Sources and Resources](https://www.superdatascience.com/deep-learning/)

### Data Directory Structure

```
data
|-deep-learning-a-z
||-ann
||-cnn
||-rnn
||-som
||-bm
||-ae
```
18.133333
79
0.672794
yue_Hant
0.71951
faba73d37f72f033552a3747c9e6d6018834ecc2
30,629
md
Markdown
docs/ios/platform/homekit.md
OPS-E2E-PPE/xamarin-docs.ja-jp
750911b8aee045e8ae10fee2687cfb01c8d2d74c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/ios/platform/homekit.md
OPS-E2E-PPE/xamarin-docs.ja-jp
750911b8aee045e8ae10fee2687cfb01c8d2d74c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/ios/platform/homekit.md
OPS-E2E-PPE/xamarin-docs.ja-jp
750911b8aee045e8ae10fee2687cfb01c8d2d74c
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Xamarin.iOS で HomeKit description: HomeKit は、ホーム オートメーション デバイスを制御するための Apple のフレームワークです。 この記事では、HomeKit を紹介し、構成のテスト アクセサリ アクセサリと対話する HomeKit アクセサリ シミュレーターと単純な Xamarin.iOS アプリの作成について説明します。 ms.prod: xamarin ms.assetid: 90C0C553-916B-46B1-AD52-1E7332792283 ms.technology: xamarin-ios author: lobrien ms.author: laobri ms.date: 03/22/2017 ms.openlocfilehash: 6793190fa3278455a00d7ea08ab52a643c369a35 ms.sourcegitcommit: 4b402d1c508fa84e4fc3171a6e43b811323948fc ms.translationtype: MT ms.contentlocale: ja-JP ms.lasthandoff: 04/23/2019 ms.locfileid: "61371543" --- # <a name="homekit-in-xamarinios"></a>Xamarin.iOS で HomeKit _HomeKit は、ホーム オートメーション デバイスを制御するための Apple のフレームワークです。この記事では、HomeKit を紹介し、構成のテスト アクセサリ アクセサリと対話する HomeKit アクセサリ シミュレーターと単純な Xamarin.iOS アプリの作成について説明します。_ [![](homekit-images/accessory01.png "HomeKit の使用例には、アプリが有効になっています。")](homekit-images/accessory01.png#lightbox) Apple は、さまざまなベンダーから複数のホーム オートメーション デバイスを一貫した単一のユニットにシームレスに統合する方法として、iOS 8 で HomeKit を導入しました。 検出するための一般的なプロトコルを昇格させることにより、ホーム オートメーション デバイスの構成と HomeKit、デバイスから関連以外のベンダーは、各ベンダーの作業を調整することがなく、連携できます。 HomeKit、Api やアプリを提供するベンダーを使用せずに、HomeKit を有効になっているデバイスを制御する Xamarin.iOS アプリを作成できます。 HomeKit では、次の操作を行うことができます。 - 新しい HomeKit を有効になっているホーム オートメーション デバイスを検出し、すべてのユーザーの iOS デバイスの間で保持されるデータベースに追加します。 - セットアップ、構成、表示、および、HomeKit の任意のデバイスを制御_構成データベース ホーム_します。 - 事前に構成された HomeKit デバイスと通信し、コマンドに個々 のアクションを実行したり、キッチンのライトのすべての有効化などの連携します。 HomeKit を有効になっているアプリにホーム構成データベース内のデバイスの機能、に加えて HomeKit は Siri 音声コマンドへのアクセスを提供します。 適切に構成された、HomeKit のセットアップを指定するには、ユーザー音声コマンドを実行できるよう"Siri、リビング ルームのライトを有効にします"。 <a name="Home-Configuration-Database" /> ## <a name="the-home-configuration-database"></a>ホームの構成データベース HomeKit をホーム コレクションに指定された場所にすべてのオートメーション デバイスを整理します。 このコレクションは、ユーザーが自分のホーム オートメーション デバイスを意味のある、人間が判読できるラベルを論理的に配置後の単位にグループ化するための方法を提供します。 ホームの構成データベースで自動的にバックアップと同期されたすべてのユーザーの iOS デバイスのホームのコレクションが格納されます。 HomeKit では、構成データベースを操作するため、次のクラスを提供します。 - `HMHome` -これが最上位のコンテナーで 1 つの物理的な場所 (例: すべての情報とすべてのホーム オートメーション 
デバイス用の構成を保持します。 1 つのファミリの居住地)。 ユーザーのメイン ホームおよび休暇家などの 1 つ以上の住所があります。 または、さまざまな「家」で同じプロパティをメインの家やガレージの上のゲスト家などがあります。 いずれにしても、少なくとも 1 つ`HMHome`オブジェクト_する必要があります_セットアップし、その他 HomeKit の情報を入力する前に格納します。 - `HMRoom` -While 省略可能な、`HMRoom`を家庭内の特定のルームを定義できます (`HMHome`) など。キッチン、バスルーム、ガレージまたはリビング ルーム。 ユーザーがグループのすべてのホーム オートメーション デバイスに自分の家で特定の場所に、`HMRoom`単位として操作とします。 たとえば、ガレージ ライトをオフにするための Siri を求めています。 - `HMAccessory` -これは、個人を表します、物理 HomeKit には、オートメーション デバイス (スマート サーモスタット) など、ユーザーの居住地でインストールされているが有効になります。 各`HMAccessory`に割り当てられている、`HMRoom`します。 場合は、ユーザーが構成されていないすべてのルーム、HomeKit アクセサリ特別な既定のルームに割り当てます。 - `HMService` -によって提供されるサービスを表す、指定された`HMAccessory`、ライトまたは (色の変更がサポートされている) 場合は、その色のオン/オフ状態など。 各`HMAccessory`ライトも含むガレージきっかけなど、複数のサービスを持つことができます。 さらを指定した`HMAccessory`ユーザー コントロール外にあるサービス、ファームウェアの更新などがあります。 - `HMZone` -のコレクションをグループ化するユーザーを許可する`HMRoom`上、Downstairs 地下室などの論理ゾーン オブジェクト。 必須ではありません、これにより、Siri を求めるような相互作用オフ downstairs 光のすべてが有効にします。 <a name="Provisioning-a-HomeKit-App" /> ## <a name="provisioning-a-homekit-app"></a>HomeKit のアプリのプロビジョニング HomeKit によるセキュリティ要件、により HomeKit フレームワークを使用する Xamarin.iOS アプリをする必要があります適切に構成する、Apple Developer ポータルと Xamarin.iOS プロジェクト ファイル。 次の手順で行います。 1. ログイン、 [Apple Developer Portal](https://developer.apple.com)します。 2. をクリックして**証明書, Identifiers & Profiles**します。 3. これをいない場合は、をクリックして**識別子**アプリの ID を作成し、(例: `com.company.appname`)、それ以外の場合、既存の ID を編集 4. いることを確認、 **HomeKit**サービスは、指定した ID のチェックが完了します。 [![](homekit-images/provision01.png "指定した ID の HomeKit のサービスを有効にします。")](homekit-images/provision01.png#lightbox) 5. 変更内容を保存します。 4. をクリックして**プロビジョニング プロファイル** > **開発**し、新しい開発プロビジョニング プロファイル、アプリを作成します。 [![](homekit-images/provision02.png "新しい開発プロビジョニング プロファイル、アプリの作成します。")](homekit-images/provision02.png#lightbox) 5. ダウンロードして、新しいプロビジョニング プロファイルをインストールするまたは、Xcode を使用してダウンロードし、プロファイルをインストールしています。 6. Xamarin.iOS プロジェクトのオプションを編集し、先ほど作成したプロビジョニング プロファイルを使用していることを確認します。 [![](homekit-images/provision03.png "先ほど作成したプロビジョニング プロファイルを選択します。")](homekit-images/provision03.png#lightbox) 7. 
次に、編集、 **Info.plist**ファイルし、プロビジョニング プロファイルの作成に使用されたアプリ ID を使用していることを確認します。 [![](homekit-images/provision04.png "アプリ ID を設定します。 ")](homekit-images/provision04.png#lightbox) 8. 最後に、編集、 **Entitlements.plist**ファイルし、いることを確認、 **HomeKit**権利が選択されています。 [![](homekit-images/provision05.png "HomeKit の利用資格を有効にします。")](homekit-images/provision05.png#lightbox) 9. すべてのファイルに変更を保存します。 これら設定した状態で、アプリケーションは、HomeKit フレームワークの Api にアクセスする準備がようになりました。 プロビジョニングの詳細についてを参照してください、 [Device Provisioning](~/ios/get-started/installation/device-provisioning/index.md)と[アプリのプロビジョニング](~/ios/get-started/installation/device-provisioning/index.md)ガイド。 > [!IMPORTANT] > HomeKit を有効になっているアプリのテストと開発が正しくプロビジョニングされている実際の iOS デバイスが必要です。 IOS シミュレーターでは、HomeKit をテストすることはできません。 ## <a name="the-homekit-accessory-simulator"></a>HomeKit アクセサリ シミュレーター Apple が作成しなくても、物理デバイスがあるすべての可能なホーム オートメーション デバイスとサービスをテストする方法を提供する、 _HomeKit アクセサリ シミュレーター_します。 このシミュレーターを使用して、セットアップおよび、HomeKit の仮想デバイスを構成できます。 ### <a name="installing-the-simulator"></a>シミュレーターをインストールします。 Apple は、続行する前にインストールする必要があります、HomeKit アクセサリ シミュレーターを個別のダウンロードとして、Xcode から提供します。 次の手順で行います。 1. Web ブラウザーで、次を参照してください[Apple の開発者向けダウンロード。](https://developer.apple.com/download/more/?name=for%20Xcode) 2. ダウンロード、 **Xcode xxx の他のツール**(xxx はインストールされている Xcode のバージョンです)。 [![](homekit-images/simulator01.png "Xcode の他のツールをダウンロードします。")](homekit-images/simulator01.png#lightbox) 3. ディスク イメージを開き、ツールをインストール、**アプリケーション**ディレクトリ。 HomeKit アクセサリ シミュレータをインストールする、テスト用仮想 [アクセサリ] を作成できます。 ### <a name="creating-virtual-accessories"></a>仮想アクセサリを作成します。 HomeKit アクセサリ シミュレーターを開始し、いくつかの仮想アクセサリを作成するには、次の操作を行います。 1. [アプリケーション] フォルダーには、HomeKit アクセサリ シミュレーターを開始します。 [![](homekit-images/simulator02.png "HomeKit アクセサリ シミュレーター")](homekit-images/simulator02.png#lightbox) 2. をクリックして、 **+** ボタンをクリックし、選択**新しいアクセサリ.**: [![](homekit-images/simulator03.png "新しいアクセサリを追加します。")](homekit-images/simulator03.png#lightbox) 3. 
新しいアクセサリについての情報を入力し、クリックして、**完了**ボタンをクリックします。 [![](homekit-images/simulator04.png "新しいアクセサリについての情報を入力します")](homekit-images/simulator04.png#lightbox) 4. をクリックして、**サービスを追加する.** ボタンをクリックし、ドロップダウン リストからサービスの種類を選択します。 [![](homekit-images/simulator05.png "ドロップダウン リストからサービスの種類を選択します。")](homekit-images/simulator05.png#lightbox) 5. 提供、**名前**サービスをクリックして、**完了**ボタン。 [![](homekit-images/simulator06.png "サービスの名前を入力します")](homekit-images/simulator06.png#lightbox) 6. クリックして、サービスのオプションの特性を行うことができます、**追加特性**ボタンをクリックし、必要な設定を構成します。 [![](homekit-images/simulator07.png "必要な設定を構成します。")](homekit-images/simulator07.png#lightbox) 7. HomeKit をサポートする仮想ホーム オートメーション デバイスの種類ごとの 1 つを作成する前の手順を繰り返します。 いくつかサンプル仮想 HomeKit アクセサリ作成し、構成、今すぐ使用し、Xamarin.iOS アプリからこれらのデバイスを制御できます。 ## <a name="configuring-the-infoplist-file"></a>Info.plist ファイルを構成します。 開発者は、追加する必要があります iOS 10 の新機能 (および大きい)、`NSHomeKitUsageDescription`アプリのキー`Info.plist`ファイルを開き、アプリがユーザーの HomeKit のデータベースにアクセスしようとした理由宣言文字列を提供します。 この文字列は、ユーザーが初めてにアプリを実行するときに表示されます。 [![](homekit-images/info01.png "HomeKit のアクセス許可ダイアログ")](homekit-images/info01.png#lightbox) このキーを設定するには、次の操作を行います。 1. ダブルクリックして、`Info.plist`ファイル、**ソリューション エクスプ ローラー**編集用に開きます。 2. 画面の下部に切り替えて、**ソース**ビュー。 3. 新しい追加**エントリ**一覧にします。 4. ドロップダウン リストから選択**プライバシー - HomeKit の利用状況の説明**: [![](homekit-images/info02.png "プライバシー - HomeKit の利用状況の説明を選択します")](homekit-images/info02.png#lightbox) 5. アプリがユーザーの HomeKit のデータベースにアクセスしようとした理由の説明を入力します。 [![](homekit-images/info03.png "説明を入力します")](homekit-images/info03.png#lightbox) 6. 
変更内容をファイルに保存します。 > [!IMPORTANT] > 設定に失敗した、`NSHomeKitUsageDescription`キー、`Info.plist`ファイルの場合は、アプリ_サイレント モードで失敗した_(実行時にシステムによって閉じられている) iOS 10 (以降) で実行するとエラーは発生しません。 ## <a name="connecting-to-homekit"></a>HomeKit への接続 Xamarin.iOS アプリを HomeKit を通信には、最初のインスタンスをインスタンス化する必要があります、`HMHomeManager`クラス。 ホーム Manager HomeKit へのサーバーの全体のエントリ ポイントし、使用可能な自宅の一覧を提供する責任は、更新およびそのリストを維持し、登録済みユーザーの_プライマリ ホーム_します。 `HMHome`オブジェクトには、ルーム、グループ、またはインストールされている任意のホーム オートメーション アクセサリと共に、それが含む可能性のあるゾーンを含む付与ホームに関するすべての情報が含まれています。 HomeKit を少なくとも 1 つのすべての操作を実行する前に、`HMHome`作成し、プライマリ ホームとして割り当てる必要があります。 アプリは、作成とそうでない場合は、1 つを割り当てるプライマリ ホームが存在するかどうかを確認します。 ### <a name="adding-a-home-manager"></a>ホーム マネージャーを追加します。 Xamarin.iOS アプリには、HomeKit の認識を追加するには、編集、 **AppDelegate.cs**ファイルを編集し、次のようになります。 ```csharp using HomeKit; ... public HMHomeManager HomeManager { get; set; } ... public override void FinishedLaunching (UIApplication application) { // Attach to the Home Manager HomeManager = new HMHomeManager (); Console.WriteLine ("{0} Home(s) defined in the Home Manager", HomeManager.Homes.Count()); // Wire-up Home Manager Events HomeManager.DidAddHome += (sender, e) => { Console.WriteLine("Manager Added Home: {0}",e.Home); }; HomeManager.DidRemoveHome += (sender, e) => { Console.WriteLine("Manager Removed Home: {0}",e.Home); }; HomeManager.DidUpdateHomes += (sender, e) => { Console.WriteLine("Manager Updated Homes"); }; HomeManager.DidUpdatePrimaryHome += (sender, e) => { Console.WriteLine("Manager Updated Primary Home"); }; } ``` アプリケーションが最初に実行時に、HomeKit 情報にアクセスすることを許可するかどうか、ユーザーが求められます。 [![](homekit-images/home01.png "HomeKit 情報にアクセスすることを許可するかどうか、ユーザーが求められます")](homekit-images/home01.png#lightbox) ユーザーが回答する場合 **[ok]** アプリケーションは、HomeKit アクセサリを使用できるし、それ以外の場合にないと HomeKit への呼び出しがエラーで失敗します。 インプレース マネージャーがホーム次に、アプリケーションは必要がありますをプライマリ ホームが構成されている場合に表示し、そうでない場合は、ユーザーを作成して割り当てる 1 つの方法を提供します。 ### <a name="accessing-the-primary-home"></a>プライマリ ホームにアクセスします。 前述のように、プライマリ ホームを作成および HomeKit があるし、ユーザーを作成して割り当てる場合は、1 つのプライマリ 
ホームの手段を提供するアプリの責任がまだ存在しないことをお勧めする前に構成する必要があります。 監視する必要がありますが、アプリが初めて起動したり、バック グラウンドから返します、ときに、`DidUpdateHomes`のイベント、`HMHomeManager`プライマリ ホームの存在を確認するクラス。 1 つが存在しない場合、ユーザーを作成する 1 つのインターフェイスを提供する必要があります。 次のコードは、プライマリ ホームをチェックするビュー コント ローラーに追加できます。 ```csharp using HomeKit; ... public AppDelegate ThisApp { get { return (AppDelegate)UIApplication.SharedApplication.Delegate; } } ... // Wireup events ThisApp.HomeManager.DidUpdateHomes += (sender, e) => { // Was a primary home found? if (ThisApp.HomeManager.PrimaryHome == null) { // Ask user to add a home PerformSegue("AddHomeSegue",this); } }; ``` ホーム マネージャーが接続すると、HomeKit ときに、`DidUpdateHomes`発生するイベント、住宅のマネージャーのコレクションに読み込まれる任意の既存の家庭および使用可能な場合、プライマリ ホームが読み込まれます。 ### <a name="adding-a-primary-home"></a>プライマリ ホームを追加します。 場合、`PrimaryHome`のプロパティ、`HMHomeManager`は`null`後、`DidUpdateHomes`イベント、ユーザーを作成して続行する前にプライマリ ホームを割り当てる方法を提供する必要があります。 通常、アプリは、ユーザー ホーム マネージャー プライマリ ホームとしてセットアップに渡されるを取得するための新しいホームに名前を付けるためのフォームに表示されます。 **HomeKitIntro**サンプル アプリでは、モーダル ビューが IOS Designer で作成およびメソッドを呼び出して、`AddHomeSegue`セグエ、アプリのメイン インターフェイスから。 ユーザーが新しいホームと、ホームを追加するボタンの名前を入力するためのテキスト フィールドを提供します。 ユーザーがタップしたときに、**追加ホーム**ボタン、次のコードは、ホームを追加するホーム マネージャー。 ```csharp // Add new home to HomeKit ThisApp.HomeManager.AddHome(HomeName.Text,(home,error) =>{ // Did an error occur if (error!=null) { // Yes, inform user AlertView.PresentOKAlert("Add Home Error",string.Format("Error adding {0}: {1}",HomeName.Text,error.LocalizedDescription),this); return; } // Make the primary house ThisApp.HomeManager.UpdatePrimaryHome(home,(err) => { // Error? 
if (err!=null) { // Inform user of error AlertView.PresentOKAlert("Add Home Error",string.Format("Unable to make this the primary home: {0}",err.LocalizedDescription),this); return ; } }); // Close the window when the home is created DismissViewController(true,null); }); ``` `AddHome`メソッドは新しいホームを作成し、指定されたコールバック ルーチンに返すことを試みます。 場合、`error`プロパティは`null`エラーが発生しました、これは、ユーザーに提示する必要があります。 最も一般的なエラーには、一意でないホーム名または HomeKit と通信できないホーム マネージャーのいずれかが原因です。 ホームが正常に作成された場合は、呼び出す必要があります。、`UpdatePrimaryHome`プライマリ ホームに新しいホームを設定します。 ここでも場合、`error`プロパティは`null`エラーが発生しました、これは、ユーザーに提示する必要があります。 ホーム マネージャーを監視することも必要があります。`DidAddHome`と`DidRemoveHome`必要に応じて、アプリのユーザー インターフェイスのイベントおよび更新します。 > [!IMPORTANT] > `AlertView.PresentOKAlert`上記のサンプル コードで使用されるメソッドが操作できるように iOS アラート簡単 HomeKitIntro アプリケーションでヘルパー クラス。 ## <a name="finding-new-accessories"></a>新しい [アクセサリ] の検索 プライマリ ホームが定義されているかホーム マネージャーから読み込まれると、Xamarin.iOS アプリを呼び出すことができます、`HMAccessoryBrowser`を任意の新しいホーム オートメーション アクセサリを見つけて、それらをホームに追加します。 呼び出す、`StartSearchingForNewAccessories`新しい [アクセサリ] の検索を開始するメソッドと`StopSearchingForNewAccessories`メソッドが完了します。 > [!IMPORTANT] > `StartSearchingForNewAccessories` ままにしないで長期間実行されているため、バッテリの寿命と iOS デバイスのパフォーマンスが低下します。 Apple が呼び出し元を示す`StopSearchingForNewAccessories`分や付属品の検索 UI がユーザーに表示された場合にのみ検索します。 `DidFindNewAccessory`イベントが呼び出されますが、新しい [アクセサリ] の検出し、に追加されます、`DiscoveredAccessories`アクセサリ ブラウザーの一覧。 `DiscoveredAccessories`一覧のコレクションが含まれます`HMAccessory`付与 HomeKit を定義するオブジェクトには、ホーム オートメーション デバイスとライトやガレージ ドア コントロールなど、使用可能なサービスが有効になっています。 新しいアクセサリが見つかった後のユーザーに表示と選択して、ホームに追加するため。 例: [![](homekit-images/accessory01.png "新しいアクセサリの検索")](homekit-images/accessory01.png#lightbox) 呼び出す、`AddAccessory`ホームのコレクションを選択したアクセサリを追加するメソッド。 例: ```csharp // Add the requested accessory to the home ThisApp.HomeManager.PrimaryHome.AddAccessory (_controller.AccessoryBrowser.DiscoveredAccessories [indexPath.Row], (err) => { // Did an error occur if (err !=null) { // Inform user of error AlertView.PresentOKAlert("Add Accessory 
Error",err.LocalizedDescription,_controller); } }); ``` 場合、`err`プロパティは`null`エラーが発生しました、これは、ユーザーに提示する必要があります。 それ以外の場合、ユーザーを追加するデバイスのセットアップ コードの入力を求められます。 [![](homekit-images/accessory02.png "追加するデバイスのセットアップ コードを入力します。")](homekit-images/accessory02.png#lightbox) この番号は記載されて、HomeKit アクセサリ シミュレーターで、**セットアップ コード**フィールド。 [![](homekit-images/accessory03.png "HomeKit アクセサリ シミュレーターでのセットアップ コード フィールド")](homekit-images/accessory03.png#lightbox) 実際の HomeKit アクセサリ セットアップ コードは、デバイス自体で、製品の箱にまたはアクセサリのユーザーによる手動ラベルに印刷か。 アクセサリのブラウザーを監視する必要があります`DidRemoveNewAccessory`イベントと更新プログラム、ユーザーは、ユーザーが追加の"ホーム"コレクションになった後、一覧の利用可能なアクセサリを削除するインターフェイスします。 ## <a name="working-with-accessories"></a>[アクセサリ] の操作 プライマリ ホーム 1 回も確立し、[アクセサリ] が追加されてを使用するユーザーの [アクセサリ] (および必要に応じてルーム) の一覧を表示することができます。 `HMRoom`オブジェクトには、特定の部屋に関するすべての情報とそれに属する任意のアクセサリが含まれています。 ルームは、1 つまたは複数のゾーンに必要に応じて整理できます。 A`HMZone`特定のゾーンのすべての情報とそれに属するルームのすべてが含まれています。 この例でを保存するモ ノ シンプルかつルームまたはゾーンに編成ではなく自宅のアクセサリを直接と連携します。 `HMHome`オブジェクトでユーザーに表示することが割り当てられた付属品の一覧が含まれています。 その`Accessories`プロパティ。 例: [![](homekit-images/accessory04.png "例アクセサリ")](homekit-images/accessory04.png#lightbox) ここでフォームで、ユーザーは特定のアクセサリを選択して、これが提供するサービスを使用します。 ## <a name="working-with-services"></a>サービスの使用 ユーザーが対話 HomeKit ホーム オートメーションを有効になっている、特定のデバイスに場合、これは通常が提供するサービスを使用します。 `Services`のプロパティ、`HMAccessory`クラスのコレクションを格納する`HMService`サービスを定義するオブジェクト、デバイスは提供します。 サービスは、ライト、サーモスタット、ガレージ ドアを開ける装置、スイッチ、またはロックのようなものです。 (ガレージきっかけ) のような一部のデバイスは、光、ドアの開閉する機能など、複数のサービスを提供します。 各アクセサリを含む特定のアクセサリを提供する特定のサービスに加え、`Information Service`名、製造元、モデルのシリアル番号などのプロパティを定義します。 ### <a name="accessory-service-types"></a>アクセサリのサービスの種類 次のサービスの種類は経由で使用できる、`HMServiceType`列挙型。 - **AccessoryInformation** -指定したホーム オートメーション デバイス (アクセサリ) に関する情報を提供します。 - **AirQualitySensor** -空気の品質センサーを定義します。 - **バッテリ**-アクセサリのバッテリの状態を定義します。 - **CarbonDioxideSensor** -カーボン二酸化炭素センサーを定義します。 - **CarbonMonoxideSensor** -一酸化炭素センサーを定義します。 - **ContactSensor** -連絡先のセンサー (開かれたり閉じられたりするウィンドウ) などを定義します。 - **ドア**-ドアの状態のセンサー (開かれたり閉じられたりなど) を定義します。 - 
**ファン**-リモート コントロール ファンを定義します。 - **GarageDoorOpener** -ガレージのきっかけを定義します。 - **HumiditySensor** ・湿度センサーを定義します。 - **LeakSensor** -リーク センサー (給湯または洗濯機のように) を定義します。 - **LightBulb** -スタンドアロン光または (ガレージきっかけ) などの他の付属品の一部であるライトを定義します。 - **LightSensor** -光センサーを定義します。 - **LockManagement** -自動ドアのロックを管理するサービスを定義します。 - **LockMechanism** -(ドアのロック) のような場合は、リモート制御されたロックを定義します。 - **MotionSensor** -モーション センサーを定義します。 - **OccupancySensor** -占有率、センサーを定義します。 - **アウトレット**-リモート制御されたコンセントを定義します。 - **SecuritySystem** -自宅のセキュリティ システムを定義します。 - **StatefulProgrammableSwitch** -(フリップ スイッチ) のような 1 回トリガーできるように状態を維持するプログラミング可能なスイッチを定義します。 - **StatelessProgrammableSwitch** -トリガー (プッシュ ボタン) のように後を初期状態を返すプログラミング可能なスイッチを定義します。 - **SmokeSensor** -煙のセンサーを定義します。 - **切り替える**-標準壁スイッチなどのオン/オフ スイッチを定義します。 - **温度センサー**の温度センサーを定義します。 - **サーモスタット**-スマート サーモスタット、HVAC システムを制御するために使用を定義します。 - **ウィンドウ**-cane の開かれたり閉じられたりするリモートでの自動ウィンドウを定義します。 - **WindowCovering** -リモートで制御されるウィンドウをカバーする、開くまたは閉じることができますをブラインドのように定義します。 ### <a name="displaying-service-information"></a>サービス情報を表示します。 読み込み後、`HMAccessory`個々 のクエリを実行できる`HNService`オブジェクトを提供し、ユーザーにその情報を表示。 [![](homekit-images/accessory05.png "サービス情報を表示します。")](homekit-images/accessory05.png#lightbox) 必ず確認する必要があります、`Reachable`のプロパティを`HMAccessory`前にそれを処理します。 アクセサリには、到達できないユーザーがいないデバイスの範囲内、またはかどうかに接続されていないことを指定できます。 サービスを選択すると、ユーザーが表示またはそのサービスを監視または指定したホーム オートメーション デバイスの制御の 1 つまたは複数の特性を変更できます。 <a name="Working-with-Characteristics" /> ## <a name="working-with-characteristics"></a>特性の操作 各`HMService`オブジェクトのコレクションに格納できる`HMCharacteristic`(開かれたり閉じられたりするドア) のようなサービスの状態に関する情報を提供するか (ライトの色を設定する) などの状態を調整するユーザーを許可するオブジェクト。 `HMCharacteristic` だけでなく、特性とその状態に関する情報を提供しますが、使用して状態を操作するためのメソッドも提供_特性メタデータ_(`HMCharacteristisMetadata`)。 このメタデータは、状態を変更するユーザーまたはように情報を表示する場合に便利ですが (最小と最大値の範囲) などのプロパティを提供できます。 `HMCharacteristicType`列挙型が定義されているまたは次のように変更できる特性のメタデータ値のセットを提供します。 - AdminOnlyAccess - AirParticulateDensity - AirParticulateSize - AirQuality - AudioFeedback - BatteryLevel - [明るさ] - 
CarbonDioxideDetected - CarbonDioxideLevel - CarbonDioxidePeakLevel - CarbonMonoxideDetected - CarbonMonoxideLevel - CarbonMonoxidePeakLevel - ChargingState - ContactState - CoolingThreshold - CurrentDoorState - CurrentHeatingCooling - CurrentHorizontalTilt - CurrentLightLevel - CurrentLockMechanismState - CurrentPosition - CurrentRelativeHumidity - CurrentSecuritySystemState - CurrentTemperature - CurrentVerticalTilt - FirmwareVersion - HardwareVersion - HeatingCoolingStatus - HeatingThreshold - HoldPosition - [色合い] - Identify - InputEvent - LeakDetected - LockManagementAutoSecureTimeout - LockManagementControlPoint - LockMechanismLastKnownAction - ログ - 製造元 - モデル - MotionDetected - 名前 - ObstructionDetected - OccupancyDetected - OutletInUse - OutputState - PositionState - PowerState - RotationDirection - RotationSpeed - [彩度] - シリアル番号 - SmokeDetected - SoftwareVersion - StatusActive - StatusFault - StatusJammed - StatusLowBattery - StatusTampered - TargetDoorState - TargetHeatingCooling - TargetHorizontalTilt - TargetLockMechanismState - TargetPosition - TargetRelativeHumidity - TargetSecuritySystemState - TargetTemperature - TargetVerticalTilt - TemperatureUnits - Version ### <a name="working-with-a-characteristics-value"></a>特徴の値の操作 アプリを特定の特性の最新の状態を持つようにするため、呼び出し、`ReadValue`のメソッド、`HMCharacteristic`クラス。 場合、`err`プロパティは`null`エラーが発生しました、および、ユーザーに表示されない場合があります。 特性の`Value`プロパティには、特定の特性としての現在の状態が含まれています、 `NSObject`、ようできませんしたで直接、C#します。 値を読み取るには、次のヘルパー クラスに追加された、 **HomeKitIntro**サンプル アプリケーション。 ```csharp using System; using Foundation; using System.Globalization; using CoreGraphics; namespace HomeKitIntro { /// <summary> /// NS object converter is a helper class that helps to convert NSObjects into /// C# objects /// </summary> public static class NSObjectConverter { #region Static Methods /// <summary> /// Converts to an object. 
/// </summary> /// <returns>The object.</returns> /// <param name="nsO">Ns o.</param> /// <param name="targetType">Target type.</param> public static Object ToObject (NSObject nsO, Type targetType) { if (nsO is NSString) { return nsO.ToString (); } if (nsO is NSDate) { var nsDate = (NSDate)nsO; return DateTime.SpecifyKind ((DateTime)nsDate, DateTimeKind.Unspecified); } if (nsO is NSDecimalNumber) { return decimal.Parse (nsO.ToString (), CultureInfo.InvariantCulture); } if (nsO is NSNumber) { var x = (NSNumber)nsO; switch (Type.GetTypeCode (targetType)) { case TypeCode.Boolean: return x.BoolValue; case TypeCode.Char: return Convert.ToChar (x.ByteValue); case TypeCode.SByte: return x.SByteValue; case TypeCode.Byte: return x.ByteValue; case TypeCode.Int16: return x.Int16Value; case TypeCode.UInt16: return x.UInt16Value; case TypeCode.Int32: return x.Int32Value; case TypeCode.UInt32: return x.UInt32Value; case TypeCode.Int64: return x.Int64Value; case TypeCode.UInt64: return x.UInt64Value; case TypeCode.Single: return x.FloatValue; case TypeCode.Double: return x.DoubleValue; } } if (nsO is NSValue) { var v = (NSValue)nsO; if (targetType == typeof(IntPtr)) { return v.PointerValue; } if (targetType == typeof(CGSize)) { return v.SizeFValue; } if (targetType == typeof(CGRect)) { return v.RectangleFValue; } if (targetType == typeof(CGPoint)) { return v.PointFValue; } } return nsO; } /// <summary> /// Convert to string /// </summary> /// <returns>The string.</returns> /// <param name="nsO">Ns o.</param> public static string ToString(NSObject nsO) { return (string)ToObject (nsO, typeof(string)); } /// <summary> /// Convert to date time /// </summary> /// <returns>The date time.</returns> /// <param name="nsO">Ns o.</param> public static DateTime ToDateTime(NSObject nsO){ return (DateTime)ToObject (nsO, typeof(DateTime)); } /// <summary> /// Convert to decimal number /// </summary> /// <returns>The decimal.</returns> /// <param name="nsO">Ns o.</param> public static decimal 
ToDecimal(NSObject nsO){ return (decimal)ToObject (nsO, typeof(decimal)); } /// <summary> /// Convert to boolean /// </summary> /// <returns><c>true</c>, if bool was toed, <c>false</c> otherwise.</returns> /// <param name="nsO">Ns o.</param> public static bool ToBool(NSObject nsO){ return (bool)ToObject (nsO, typeof(bool)); } /// <summary> /// Convert to character /// </summary> /// <returns>The char.</returns> /// <param name="nsO">Ns o.</param> public static char ToChar(NSObject nsO){ return (char)ToObject (nsO, typeof(char)); } /// <summary> /// Convert to integer /// </summary> /// <returns>The int.</returns> /// <param name="nsO">Ns o.</param> public static int ToInt(NSObject nsO){ return (int)ToObject (nsO, typeof(int)); } /// <summary> /// Convert to float /// </summary> /// <returns>The float.</returns> /// <param name="nsO">Ns o.</param> public static float ToFloat(NSObject nsO){ return (float)ToObject (nsO, typeof(float)); } /// <summary> /// Converts to double /// </summary> /// <returns>The double.</returns> /// <param name="nsO">Ns o.</param> public static double ToDouble(NSObject nsO){ return (double)ToObject (nsO, typeof(double)); } #endregion } } ``` `NSObjectConverter`アプリケーションは、特性の現在の状態を読み取る必要があるたびに使用されます。 たとえば、次のように入力します。 ```csharp var value = NSObjectConverter.ToFloat (characteristic.Value); ``` 上記の行に値を変換する、 `float` 、Xamarin で使用し、C#コード。 変更する、 `HMCharacteristic`、呼び出すその`WriteValue`メソッドで新しい値をラップし、`NSObject.FromObject`を呼び出します。 たとえば、次のように入力します。 ```csharp Characteristic.WriteValue(NSObject.FromObject(value),(err) =>{ // Was there an error? 
if (err!=null) { // Yes, inform user AlertView.PresentOKAlert("Update Error",err.LocalizedDescription,Controller); } }); ``` 場合、`err`プロパティは`null`エラーが発生し、ユーザーに提示する必要があります。 ### <a name="testing-characteristic-value-changes"></a>特性の値の変更のテスト 使用する場合`HMCharacteristics`とシミュレートされた [アクセサリ]、変更、 `Value` HomeKit アクセサリ シミュレーター内でプロパティを監視できます。 **HomeKitIntro** HomeKit アクセサリのシミュレーターで実際の iOS デバイスのハードウェア特性の値の変更で実行されているアプリをほぼ瞬時に確認する必要があります。 たとえば、iOS アプリでのライトの状態の変更。 [![](homekit-images/test01.png "IOS アプリでのライトの状態を変更します。")](homekit-images/test01.png#lightbox) HomeKit アクセサリのシミュレーターでのライトの状態を変更する必要があります。 値が変更されない場合は、特性の新しい値を書き込むときに、エラー メッセージの状態を確認し、アクセサーが到達可能であることを確認します。 ## <a name="advanced-homekit-features"></a>HomeKit の高度な機能 この記事では、Xamarin.iOS アプリで HomeKit アクセサリを操作するために必要な基本的な機能について説明しました。 ただし、HomeKit の概要で取り上げられていないいくつかの高度な機能があります。 - **ルーム**-有効になっている HomeKit アクセサリが、エンドユーザーがルームに整理できます必要に応じて。 これにより、ユーザーの理解し、作業を簡単な方法である [アクセサリ] を HomeKit です。 作成して、ルームを管理する方法の詳細については、Apple を参照してください[HMRoom](https://developer.apple.com/library/prerelease/ios/documentation/HomeKit/Reference/HMRoom_Class/index.html#//apple_ref/occ/cl/HMRoom)ドキュメント。 - **ゾーン**-エンドユーザーがゾーンにルームが整理必要に応じてできます。 ゾーンは、ユーザーが 1 つの単位として扱うことがありますルームのコレクションを表します。 例:Downstairs または地下室上。 ここでも、これにより、HomeKit 存在し、[アクセサリ]、エンドユーザーにとって意味のある方法で使用します。 作成して、ゾーンを管理する方法の詳細については、Apple を参照してください[HMZone](https://developer.apple.com/library/prerelease/ios/documentation/HomeKit/Reference/HMZone_Class/index.html#//apple_ref/occ/cl/HMZone)ドキュメント。 - **アクションおよびアクション設定**-アクションは、アクセサリのサービスの特性を変更して、セットにグループ化することができます。 アクションのセットは、[アクセサリ] のグループを制御し、そのアクションを調整するためのスクリプトとして機能します。 たとえば、「テレビ番組を見る」スクリプト可能性があります、ブラインド、dim ライト、閉じ、テレビとそのサウンド システムを有効にします。 作成して、アクションとアクションのセットを維持する方法の詳細については、Apple 
を参照してください[HMAction](https://developer.apple.com/library/prerelease/ios/documentation/HomeKit/Reference/HMAction_Class/index.html#//apple_ref/occ/cl/HMAction)と[HMActionSet](https://developer.apple.com/library/prerelease/ios/documentation/HomeKit/Reference/HMActionSet_Class/index.html#//apple_ref/occ/cl/HMActionSet)ドキュメント。 - **トリガー** - いずれかのトリガーをアクティブ化または詳細アクション設定時に指定された一連の条件を満たしています。 など、portch 光を有効にし、外暗くときに、すべての外部のドアをロックします。 作成してトリガーを管理する方法の詳細については、Apple を参照してください[HMTrigger](https://developer.apple.com/library/prerelease/ios/documentation/HomeKit/Reference/HMTrigger_Class/index.html#//apple_ref/occ/cl/HMTrigger)ドキュメント。 これらの機能は、上記と同じ手法を使用するため必要がある次の apple の実装が簡単[HomeKitDeveloper ガイド](https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/HomeKitDeveloperGuide/Introduction/Introduction.html)、 [HomeKit のユーザー インターフェイス ガイドライン](https://developer.apple.com/homekit/ui-guidelines/)と[HomeKit フレームワーク参照](https://developer.apple.com/library/ios/home_kit_framework_ref)します。 ## <a name="homekit-app-review-guidelines"></a>HomeKit アプリ レビューに関するガイドライン ITunes Connect の iTunes App Store にリリースする Xamarin.iOS アプリを有効になっているは、HomeKit を送信する前に、HomeKit を有効になっているアプリの Apple のガイドラインに従うことを確認します。 - アプリの主な目的_する必要があります_HomeKit フレームワークを使用する場合は、ホーム オートメーションをします。 - HomeKit が使用されていること、およびプライバシー ポリシーを指定する必要があります、アプリのマーケティングのテキストはユーザーに通知する必要があります。 - ユーザー情報を収集または広告のための HomeKit の使用は固く禁止されています。 完全なガイドラインを確認して、Apple を参照してください[App Store レビューに関するガイドライン](https://developer.apple.com/app-store/review/guidelines/)します。 ## <a name="whats-new-in-ios-9"></a>IOS 9 で新します。 Apple が行われて、次の変更と追加 HomeKit を iOS 9。 - **既存のオブジェクトを保持**- 既存の付属品が変更された場合、ホーム マネージャー (`HMHomeManager`) が変更された特定の項目を通知します。 - **永続的な識別子**-HomeKit のすべての関連するクラスが追加されました、 `UniqueIdentifier` HomeKit の間で特定の項目を一意に識別するプロパティには、アプリ (または同じアプリのインスタンス) が有効になっています。 - **ユーザー管理**-プライマリ ユーザーのホームで HomeKit デバイスへのアクセスを持つユーザーを経由でユーザーの管理を提供する組み込みビュー コント ローラーを追加します。 - **ユーザー機能**- HomeKit のユーザーは、HomeKit で使用することはどのような機能を制御する特権のセットを今すぐになり、HomeKit 
アクセサリを有効にします。 アプリケーションでは、現在のユーザーに関連する機能を表示する必要がありますのみ。 など、管理者だけは、他のユーザーを管理できる必要があります。 - **定義済みのシーン**-平均 HomeKit ユーザーに対して発生する次の 4 つの一般的なイベント用に定義済みのシーンが作成されました。起動のままに、返すベッドに移動します。 これらの定義済みのシーンは、自宅から削除できません。 - **シーンと Siri** -iOS 9 を内のシーン HomeKit で定義されている任意のシーンの名前を識別するため、Siri がサポートが強化されました。 ユーザーは、Siri にその名前を言うとするだけでシーンを実行できます。 - **アクセサリ カテゴリ**-定義済みのカテゴリのセットをすべてについて、Accessories およびホームに追加されるアクセサリの種類を識別するのに役立ちますに追加または、アプリ内から作業します。 これらの新しいカテゴリは、付属品のセットアップ時に使用できます。 - **Apple Watch サポート**- HomeKit は watchOS 可能になりましたし、Apple Watch が HomeKit がウォッチの近くにいる iPhone しないでデバイスを有効にすることになります。 HomeKit watchOS 向けには、次の機能がサポートされています。自宅、[アクセサリ] の制御とシーンの実行を表示します。 - **新しいイベント トリガーの種類**- に加えて、iOS 8、iOS 9 をサポートしています (センサー データなど) のアクセサリ状態または地理的位置情報にイベント トリガーがベースに変更されましたでサポートされているタイマーの種類のトリガー。 イベント トリガーを使用して、`NSPredicates`でそれらの実行条件を設定します。 - **リモート アクセス**-リモート アクセスと、ユーザーが制御できるようになりました、HomeKit は、リモートの場所で家から離れているときに、ホーム オートメーション アクセサリを有効になっています。 IOS 8 でこれがのみサポートされます、ユーザーは、次の第 3 世代自宅で Apple TV がある場合。 IOS 9 では、この制限は解除し、iCloud と HomeKit アクセサリ プロトコル (HAP) を使用してリモート アクセスをサポートします。 - **Bluetooth Low Energy (BLE) の新機能**-HomeKit よう Bluetooth Low Energy (BLE) プロトコル経由で通信できるより多くの付属品の種類になりました。 HAP セキュア トンネリングを使用して、HomeKit アクセサリを公開できます別の Bluetooth アクセサリ Wi-fi 経由で (Bluetooth の範囲外の場合)。 Ios 9 で BLE アクセサリは通知およびメタデータの完全なサポートがあります。 - **新しいアクセサリ カテゴリ**-Apple は iOS 9 で、次の新しいアクセサリ カテゴリを追加します。窓枠、原動機付き扉と Windows、警報、センサー、プログラミング可能なスイッチです。 IOS 9 で HomeKit の新機能の詳細については、Apple を参照してください[HomeKit インデックス](https://developer.apple.com/homekit/)と[HomeKit で新](https://developer.apple.com/videos/wwdc/2015/?id=210)ビデオ。 ## <a name="summary"></a>まとめ この記事には、Apple の HomeKit ホーム オートメーション フレームワークが導入されています。 セットアップおよび HomeKit アクセサリ シミュレーターを使用してテスト デバイスを構成する方法と検出との通信および HomeKit を使用して、ホーム オートメーション デバイスを制御する単純な Xamarin.iOS アプリを作成する方法を示しました。 ## <a name="related-links"></a>関連リンク - [iOS 9 のサンプル](https://developer.xamarin.com/samples/ios/iOS9/) - [iOS 9 開発者向け](https://developer.apple.com/ios/pre-release/) - [IOS 9.0 
を新します。](https://developer.apple.com/library/prerelease/ios/releasenotes/General/WhatsNewIniOS/Articles/iOS9.html) - [HomeKitDeveloper ガイド](https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/HomeKitDeveloperGuide/Introduction/Introduction.html) - [HomeKit のユーザー インターフェイス ガイドライン](https://developer.apple.com/homekit/ui-guidelines/) - [HomeKit フレームワーク参照](https://developer.apple.com/library/ios/home_kit_framework_ref)
42.599444
561
0.747429
yue_Hant
0.820477
faba9e6c74f0358adeb8ee55636accb3aef0cdb8
2,293
md
Markdown
wdk-ddi-src/content/ksmedia/ne-ksmedia-ks_tuner_tuning_flags.md
MikeMacelletti/windows-driver-docs-ddi
5436c618dff46f9320544766618c9ab4bef6a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/ksmedia/ne-ksmedia-ks_tuner_tuning_flags.md
MikeMacelletti/windows-driver-docs-ddi
5436c618dff46f9320544766618c9ab4bef6a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/ksmedia/ne-ksmedia-ks_tuner_tuning_flags.md
MikeMacelletti/windows-driver-docs-ddi
5436c618dff46f9320544766618c9ab4bef6a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- UID: NE:ksmedia.__unnamed_enum_67 title: KS_TUNER_TUNING_FLAGS (ksmedia.h) description: The KS_TUNER_TUNING_FLAGS enumeration defines tuning flags that describe the granularity of a tuning operation. old-location: stream\ks_tuner_tuning_flags.htm tech.root: stream ms.assetid: f8742053-0d02-40af-9a6e-7af029db8575 ms.date: 04/23/2018 keywords: ["KS_TUNER_TUNING_FLAGS enumeration"] ms.keywords: KS_TUNER_TUNING_COARSE, KS_TUNER_TUNING_EXACT, KS_TUNER_TUNING_FINE, KS_TUNER_TUNING_FLAGS, KS_TUNER_TUNING_FLAGS enumeration [Streaming Media Devices], ksmedia/KS_TUNER_TUNING_COARSE, ksmedia/KS_TUNER_TUNING_EXACT, ksmedia/KS_TUNER_TUNING_FINE, ksmedia/KS_TUNER_TUNING_FLAGS, stream.ks_tuner_tuning_flags, vidcapstruct_af322917-69e6-4688-885d-45422c594348.xml f1_keywords: - "ksmedia/KS_TUNER_TUNING_FLAGS" req.header: ksmedia.h req.include-header: Ksmedia.h req.target-type: Windows req.target-min-winverclnt: req.target-min-winversvr: req.kmdf-ver: req.umdf-ver: req.ddi-compliance: req.unicode-ansi: req.idl: req.max-support: req.namespace: req.assembly: req.type-library: req.lib: req.dll: req.irql: topic_type: - APIRef - kbSyntax api_type: - HeaderDef api_location: - ksmedia.h api_name: - KS_TUNER_TUNING_FLAGS product: - Windows targetos: Windows req.typenames: KS_TUNER_TUNING_FLAGS --- # KS_TUNER_TUNING_FLAGS enumeration ## -description The KS_TUNER_TUNING_FLAGS enumeration defines tuning flags that describe the granularity of a tuning operation. ## -enum-fields ### -field KS_TUNER_TUNING_EXACT The tuner should tune directly to the specified frequency and bypass any fine tuning logic. ### -field KS_TUNER_TUNING_FINE The tuning operation should perform a comprehensive search for the best tuning. This flag is used only if the strategy is KS_TUNER_STRATEGY_DRIVER_TUNES. ### -field KS_TUNER_TUNING_COARSE The tuning operation should perform a fast search and attempt only to determine if a valid signal is present. 
This flag is used only if the strategy is KS_TUNER_STRATEGY_DRIVER_TUNES. ## -see-also <a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/ksmedia/ns-ksmedia-ksproperty_tuner_frequency_s">KSPROPERTY_TUNER_FREQUENCY_S</a>    
27.297619
374
0.781945
eng_Latn
0.619805
fabb5b5b8ed25ceeccf9f50ce0802a6e54f7dfef
38
md
Markdown
README.md
NTUMBA/PortfolioMaterializeMboma
25a309d9a1416ddf801d80b3544b9111a01db424
[ "MIT" ]
null
null
null
README.md
NTUMBA/PortfolioMaterializeMboma
25a309d9a1416ddf801d80b3544b9111a01db424
[ "MIT" ]
null
null
null
README.md
NTUMBA/PortfolioMaterializeMboma
25a309d9a1416ddf801d80b3544b9111a01db424
[ "MIT" ]
null
null
null
Restore portfolio materialize folder
19
37
0.894737
eng_Latn
0.91579
fabb9728602532788b2e732e121f406b5ebcdecd
304
md
Markdown
zh/faq/postcss-plugins.md
muzi131313/cn-nuxtjs-docs
5abd42563bb6af74df16aaa2e01b0a7e7e669b0d
[ "MIT" ]
null
null
null
zh/faq/postcss-plugins.md
muzi131313/cn-nuxtjs-docs
5abd42563bb6af74df16aaa2e01b0a7e7e669b0d
[ "MIT" ]
null
null
null
zh/faq/postcss-plugins.md
muzi131313/cn-nuxtjs-docs
5abd42563bb6af74df16aaa2e01b0a7e7e669b0d
[ "MIT" ]
null
null
null
---
title: PostCSS plugins
description: How do I add PostCSS plugins?
---

# How do I add PostCSS plugins?

You can add PostCSS plugins by adding the following configuration to the `nuxt.config.js` file:

```js
module.exports = {
  build: {
    postcss: [
      require('postcss-nested')(),
      require('postcss-responsive-type')(),
      require('postcss-hexrgba')(),
    ]
  }
}
```
14.47619
43
0.595395
eng_Latn
0.241014
fabb9ab6751a5a6d5f568e71ad5d4a7abff0d006
1,843
md
Markdown
BUILD.md
ambaxter/enarx-keepldr
7b1dcb8ab9dc8259ceeed89c2d80475a3ee6e368
[ "Apache-2.0" ]
15
2020-09-03T16:12:01.000Z
2021-11-15T09:42:10.000Z
BUILD.md
ambaxter/enarx-keepldr
7b1dcb8ab9dc8259ceeed89c2d80475a3ee6e368
[ "Apache-2.0" ]
298
2020-09-03T19:15:58.000Z
2021-10-05T17:53:16.000Z
BUILD.md
greyspectrum/enarx-keepldr
190ff766083c7014ec78dd6aa48986ed6f64a40d
[ "Apache-2.0" ]
17
2020-09-03T15:57:47.000Z
2021-09-12T18:07:45.000Z
# Building

## Install Dependencies

### Fedora

    $ sudo dnf install git curl gcc pkg-config openssl-devel musl-gcc

### Disclaimer

Please note that most (all) Enarx developers use Fedora, so that is the distribution where we'll be able to offer the most support, if any. The following configurations are unlikely to be exercised with any frequency and as a result, may not work for you. However, they have worked at some point in the past and therefore they are listed here in the hopes that they might be useful to you. Please feel free to file a pull request to add your favorite distribution if you're able to build and run the `enarx-keepldr` test suite.

### A note on gcc

The minimum required `gcc` version is version 9. Something older _might_ build binaries (such as integration test binaries), but may silently drop required compiler flags. Please ensure you're using the minimum required version of `gcc`. Failure to do so might result in weird failures at runtime.

### CentOS 8 / Stream

    $ sudo dnf copr enable ngompa/musl-libc
    $ sudo dnf install git curl gcc-toolset-9 openssl-devel musl-gcc
    $ source "/opt/rh/gcc-toolset-9/enable"

Note: you may want to add that final `source` command to a `~/.profile`, `~/.bashrc` or `~/.bash_profile` equivalent, otherwise you must remember to source that file prior to building `enarx-keepldr`.

### Debian / Ubuntu

    $ sudo apt update
    $ sudo apt install git curl gcc pkg-config libssl-dev musl-tools python3-minimal

## Install Rust, Nightly and the MUSL target

    $ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
    $ source $HOME/.cargo/env
    $ rustup toolchain install nightly --allow-downgrade -t x86_64-unknown-linux-musl

## Build

    $ git clone https://github.com/enarx/enarx-keepldr
    $ cd enarx-keepldr/
    $ cargo build
33.509091
85
0.731959
eng_Latn
0.990134
fabc73f5a0ce871baec30e8a54b7a8bbab492ff4
3,788
md
Markdown
articles/service-fabric/service-fabric-support.md
yhs666/mc-docs.zh-cn
a9ca0759e37f213ee8f2c8a3e792cf098ca7b6fd
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/service-fabric/service-fabric-support.md
yhs666/mc-docs.zh-cn
a9ca0759e37f213ee8f2c8a3e792cf098ca7b6fd
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/service-fabric/service-fabric-support.md
yhs666/mc-docs.zh-cn
a9ca0759e37f213ee8f2c8a3e792cf098ca7b6fd
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 了解 Azure Service Fabric 支持选项 | Azure description: 支持的 Azure Service Fabric 群集版本,以及文件支持票证的链接 services: service-fabric documentationcenter: .net author: rockboyfor manager: digimobile editor: '' ms.assetid: '' ms.service: service-fabric ms.devlang: dotnet ms.topic: troubleshooting ms.tgt_pltfrm: NA ms.workload: NA origin.date: 08/24/2018 ms.date: 09/02/2019 ms.author: v-yeche ms.openlocfilehash: 21fe3b9bb288813e86bb9b10ad0d325a96aa1b51 ms.sourcegitcommit: 66192c23d7e5bf83d32311ae8fbb83e876e73534 ms.translationtype: HT ms.contentlocale: zh-CN ms.lasthandoff: 09/04/2019 ms.locfileid: "70254625" --- # <a name="azure-service-fabric-support-options"></a>Azure Service Fabric 支持选项 我们为用户设置了各种选项,方便其为 Service Fabric 群集(在其上运行应用程序工作负荷)提供相应的支持。 用户需根据所需支持级别以及问题的严重性,选取适当的选项。 <a name="getlivesitesupportonazure"></a> ## <a name="report-production-issues-or-request-paid-support-for-azure"></a>报告生产问题,或者请求 Azure 付费支持 若要报告部署在 Azure 上的 Service Fabric 群集的问题,请通过 [Azure 门户](https://support.azure.cn/support/support-azure/)开具支持票证。 <!--Duplicated [Azure support portal](https://support.azure.cn/zh-cn/support/support-azure/).--> 了解有关以下方面的详细信息: - [世纪互联对 Azure 的支持](https://www.azure.cn/support/plans/)。 <!--Not Available on - [Microsoft premier support](https://support.microsoft.com/premier)--> > [!Note] > 在青铜级可靠性层级或单节点群集上运行的群集只能用来运行测试性工作负荷。 如果你遇到在青铜可靠性级别或单节点群集上运行的群集的问题,Azure 支持团队会协助你解决问题,但不会进行根本原因分析。 请参阅[群集的可靠性特征](/service-fabric/service-fabric-cluster-capacity#the-reliability-characteristics-of-the-cluster)以获取更多详细信息。 > > 若要详细了解生产就绪性群集的必要信息,请参阅[生产就绪性核对清单](/service-fabric/service-fabric-production-readiness-checklist)。 <a name="getlivesitesupportonprem"></a> ## <a name="report-production-issues-or-request-paid-support-for-standalone-service-fabric-clusters"></a>报告生产问题,或者请求独立 Service Fabric 群集的付费支持 若要报告部署在本地或其他云上的 Service Fabric 群集的问题,请通过 [Azure 支持门户](https://support.azure.cn/support/support-azure/)开具专业支持票证。 <!--Not Available on - [Professional Support from Microsoft for 
on-premises](https://support.microsoft.com/gp/offerprophone?wa=wsignin1.0)--> <!--Not Available on - [Microsoft premier support](https://support.microsoft.com/premier)--> <a name="getsupportonissues"></a> ## <a name="report-azure-service-fabric-issues"></a>报告 Azure Service Fabric 问题 我们已设置 GitHub 存储库,用于报告 Service Fabric 问题。 我们还积极监视以下论坛。 ### <a name="github-repo"></a>GitHub 存储库 在 [Service-Fabric-issues git 存储库](https://github.com/Azure/service-fabric-issues)中报告 Azure Service Fabric 问题。 此存储库用于报告和跟踪 Azure Service Fabric 问题,以及进行小型功能请求。 **请勿使用此存储库报告实时站点问题**。 ### <a name="msdn-forums"></a>MSDN 论坛 [MSDN 上的 Service Fabric 论坛][msdn-forum]最适合提问有关平台工作方式以及如何通过该平台完成某些任务的问题。 <!-- Not Available on ### Azure Feedback forum--> <a name="previewversion"></a> ## <a name="service-fabric-preview-versions---unsupported-for-production-use"></a>Service Fabric 预览版本 - 不支持在生产环境中使用 我们会不时发布包含重要功能的版本,希望用户对这些功能提供反馈,这些版本将作为预览版发布。 这些预览版本应仅用于测试目的。 生产群集应始终运行支持的稳定 Service Fabric 版本。 预览版本始终以主版本号和次版本号 255 开头。 例如,如果看到 Service Fabric 版本 255.255.5703.949,则该版本应仅在测试群集中使用且处于预览状态。 这些预览版本也在 [Service Fabric 团队博客](https://blogs.msdn.microsoft.com/azureservicefabric)上公布,并将提供有关包含的功能的详细信息。 这些预览版本没有付费的支持选项。 使用[报告 Azure Service Fabric 问题](/service-fabric/service-fabric-support#report-azure-service-fabric-issues)下列出的选项之一提出问题或提供反馈。 ## <a name="next-steps"></a>后续步骤 [支持的 Service Fabric 版本](service-fabric-versions.md) <!--references--> [msdn-forum]: https://support.azure.cn/support/contact/ <!--Not Available on [stackoverflow]: http://stackoverflow.com/questions/tagged/azure-service-fabric--> <!-- Not Referenced on [acom-docs]: ../service-fabric/index.yml--> <!-- Not Referenced on [sample-repos]: http://aka.ms/servicefabricsamples--> <!--Update_Description: update meta properties, wording update-->
42.561798
290
0.774815
yue_Hant
0.423659
fabceeba6cd12092a76bf007524c49c8d6e98be5
237
md
Markdown
long-term-roadmap/protocol-list.md
rootnoob/flexi-chains
d87c35422cdde3ca8bf3f2acc2643facbe4d7b72
[ "MIT" ]
1
2021-09-20T01:37:56.000Z
2021-09-20T01:37:56.000Z
long-term-roadmap/protocol-list.md
rootnoob/flexi-chains
d87c35422cdde3ca8bf3f2acc2643facbe4d7b72
[ "MIT" ]
null
null
null
long-term-roadmap/protocol-list.md
rootnoob/flexi-chains
d87c35422cdde3ca8bf3f2acc2643facbe4d7b72
[ "MIT" ]
null
null
null
<h3>Long-term planned supported protocol list</h3>

- OpenVPN
- [Wireguard](https://www.wireguard.com/)
- IKEv2 IPSEC
- SOCKSv5 Proxy
- HTTPS Proxy
- TOR
- Proxychains
- [Trojan](https://trojan-gfw.github.io/trojan/)
15.8
50
0.658228
yue_Hant
0.513242
fabd3e5b8e5970f9bd88e070bd4001dd27c14104
2,902
md
Markdown
includes/app-service-blueprint-security.md
OpenLocalizationTestOrg/azure-docs-pr15_pt-BR
95dabd136ee50edd2caa1216e745b9f13ff7a1f2
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
1
2018-08-29T17:03:44.000Z
2018-08-29T17:03:44.000Z
includes/app-service-blueprint-security.md
OpenLocalizationTestOrg/azure-docs-pr15_pt-BR
95dabd136ee50edd2caa1216e745b9f13ff7a1f2
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
includes/app-service-blueprint-security.md
OpenLocalizationTestOrg/azure-docs-pr15_pt-BR
95dabd136ee50edd2caa1216e745b9f13ff7a1f2
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
* [Proteger seu aplicativo web usando vários meios de autenticação e autorização](../articles/app-service-web/web-sites-authentication-authorization.md) * [Autenticação do Active Directory do Azure para o aplicativo da instalação](https://azure.microsoft.com/blog/azure-websites-authentication-authorization/) * [Proteger o tráfego para seu aplicativo habilitando TLS (TLS SSL) - HTTPS](../articles/app-service-web/web-sites-configure-ssl-certificate.md) * [Forçar todo o tráfego recebido através de conexão de HTTPS](http://microsoftazurewebsitescheatsheet.info/#force-https) * [Ativar a segurança de transporte estrito (HSTS)](http://microsoftazurewebsitescheatsheet.info/#enable-http-strict-transport-security-hsts) * [Restringir o acesso ao seu aplicativo pelo endereço IP do cliente](http://microsoftazurewebsitescheatsheet.info/#filtering-traffic-by-ip) * [Restringir o acesso ao seu aplicativo por comportamento do cliente - frequência de solicitação e concorrência](http://microsoftazurewebsitescheatsheet.info/#dynamic-ip-restrictions) * [Digitalizar seu código de aplicativo web usando a verificação de segurança de Tinfoil de vulnerabilidades](https://azure.microsoft.com/blog/web-vulnerability-scanning-for-azure-app-service-powered-by-tinfoil-security/) * [Configurar autenticação comum TLS para exigir certificados de cliente para se conectar ao seu aplicativo web](../articles/app-service-web/app-service-web-configure-tls-mutual-auth.md) * [Configurar um certificado de cliente para uso de seu aplicativo para conectar-se com segurança a recursos externos](https://azure.microsoft.com/blog/using-certificates-in-azure-websites-applications/) * [Remover cabeçalhos de servidor padrão para evitar ferramentas de impressão digital seu aplicativo](https://azure.microsoft.com/blog/removing-standard-server-headers-on-windows-azure-web-sites/) * [Conectar seu aplicativo aos recursos em uma rede privada usando ponto-To-Site VPN com 
segurança](../articles/app-service-web/web-sites-integrate-with-vnet.md) * [Conectar seu aplicativo aos recursos em uma rede privada usando conexões híbrida com segurança](../articles/app-service-web/web-sites-hybrid-connection-get-started.md) * [Atingir o isolamento de segurança para seus aplicativos usando ambientes de serviço de aplicativo (ASE)](../articles/app-service-web/app-service-app-service-environment-intro.md) * [Configurar um Firewall de aplicativo de Web (WAF) na frente de sua ASE](../articles/app-service-web/app-service-app-service-environment-web-application-firewall.md) * [Configurar o controle de acesso para tráfego de rede de entrada para seu ASE](../articles/app-service-web/app-service-app-service-environment-control-inbound-traffic.md) * [Conectar-se com segurança para os recursos de back-end do seu ASE](../articles/app-service-web/app-service-app-service-environment-securely-connecting-to-backend-resources.md)
161.222222
221
0.805651
por_Latn
0.949528
fabd435a6ed7e61d6698b0d425e441a6f4bf0b51
67
md
Markdown
README.md
mrjohnsonpham/17-react-portfolio
2073002eae6be5927c9843ae0d2325bd739bc237
[ "MIT" ]
null
null
null
README.md
mrjohnsonpham/17-react-portfolio
2073002eae6be5927c9843ae0d2325bd739bc237
[ "MIT" ]
null
null
null
README.md
mrjohnsonpham/17-react-portfolio
2073002eae6be5927c9843ae0d2325bd739bc237
[ "MIT" ]
null
null
null
Deployed Link: https://mrjohnsonpham.github.io/17-react-portfolio/
33.5
66
0.80597
kor_Hang
0.226551
fabd51f9e40cd00e43d5bd01f5d62e972d0da7d1
14,458
md
Markdown
README.md
treasure-data/embulk-input-marketo
2f7c4782c6ea82bd1043aab9851af79a143f3a70
[ "MIT" ]
9
2016-08-13T22:43:13.000Z
2021-09-15T06:55:35.000Z
README.md
treasure-data/embulk-input-marketo
2f7c4782c6ea82bd1043aab9851af79a143f3a70
[ "MIT" ]
65
2015-06-25T04:33:21.000Z
2020-09-17T05:10:53.000Z
README.md
treasure-data/embulk-input-marketo
2f7c4782c6ea82bd1043aab9851af79a143f3a70
[ "MIT" ]
9
2016-01-10T14:59:12.000Z
2021-03-17T18:08:39.000Z
[![Build Status](https://travis-ci.org/treasure-data/embulk-input-marketo.svg?branch=master)](https://travis-ci.org/treasure-data/embulk-input-marketo)
[![Code Climate](https://codeclimate.com/github/treasure-data/embulk-input-marketo/badges/gpa.svg)](https://codeclimate.com/github/treasure-data/embulk-input-marketo)
[![Test Coverage](https://codeclimate.com/github/treasure-data/embulk-input-marketo/badges/coverage.svg)](https://codeclimate.com/github/treasure-data/embulk-input-marketo/coverage)
[![Gem Version](https://badge.fury.io/rb/embulk-input-marketo.svg)](http://badge.fury.io/rb/embulk-input-marketo)

# Marketo input plugin for Embulk

embulk-input-marketo is the gem providing Embulk input plugins for [Marketo](http://www.marketo.com/).

- Lead(lead)
- Activity log(activity)
- Lead by list(all_lead_with_list_id)
- Lead by program(all_lead_with_program_id)
- Campaign(campaign)
- Assets Programs (program)
- Program Members (program_members)

This plugin uses the Marketo REST API.

## Overview

Required Embulk version >= 0.8.33 (since 0.6.0).

* **Plugin type**: input
* **Resume supported**: no
* **Cleanup supported**: no
* **Guess supported**: no

## Install

```
$ embulk gem install embulk-input-marketo
```

## Configuration

### API

The parameters below are shown in the "Admin" > "Web Services" page in Marketo.

### Base configuration parameters

All targets accept these configuration parameters:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **target** | true | | Marketo target |
| **account_id** | true | | Marketo Munchkin id |
| **client_id** | true | | Marketo REST client id |
| **client_secret** | true | | Marketo REST client secret |
| **marketo_limit_interval_milis** | false | 20 | Marketo limits usage to 100 calls per 20 seconds. If REST API calls fail, the plugin waits this amount of time before retrying |
| **batch_size** | false | 300 | Token paging batch size. Some REST APIs support batching |
| **max_return** | false | 200 | Max return for endpoints that use offset paging |
| **partner_api_key** | false | | Marketo Partner API key, see: http://developers.marketo.com/support/Marketo_LaunchPoint_Technology_Partner_API_Key.pdf |

### Bulk extract target configuration parameters (Lead and Activity)

All bulk extract targets accept these configuration parameters:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **from_date** | true | | Import data since this date. Example: 2017-10-11T06:43:24+00:00 |
| **fetch_days** | false | 1 | Number of days to fetch since from_date |
| **polling_interval_second** | false | 60 | Amount of time to wait between polling job status, in seconds |
| **bulk_job_timeout_second** | false | 3600 | Amount of time to wait for a bulk job to complete, in seconds |
| **incremental** | false | true | If incremental is set to true, the next run will have from_date set to the previous to_date (calculated as from_date + fetch_days) |
| **incremental_column** | false | createdAt | Column used to filter by from_date and to_date |

### Lead

The lead target extracts all Marketo leads using Marketo's bulk extract feature. Its configuration includes the bulk extract configuration.

`target: lead`

Configuration:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **use_updated_at** | false | false | Filter with the `updatedAt` column instead. Not all Marketo accounts have the feature to filter by updatedAt, and updatedAt does not support incremental ingestion |
| **included_fields** | false | null | List of lead fields to include in the export request sent to Marketo; can be used to reduce the size of the BulkExtract file |

Schema type: Dynamic, via the describe lead endpoint.

Incremental support: yes

Range ingestion: yes

### Activity

The activity target extracts the whole Marketo activity log. Its configuration includes the bulk extract configuration.

`target: activity`

Schema type: Static schema

Incremental support: yes

Range ingestion: yes

Filter by specific activity type ids: yes. See [#95](https://github.com/treasure-data/embulk-input-marketo/issues/95)

### Campaign

The campaign target extracts all campaign data from Marketo.

`target: campaign`

Schema type: Static schema

Incremental support: no

Range ingestion: no

### Lead by list

Extracts all lead data, including the lead's list id.

`target: all_lead_with_list_id`

Configuration:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **included_fields** | false | null | List of lead fields to include in the export request sent to Marketo; can be used to reduce request and response size |
| **list_ids** | false | null | Import leads of the specified list IDs. If not specified, all leads of all lists are imported |

Schema type: Dynamic, via describe leads. The schema has one additional column, listId, containing the id of the list the lead belongs to.

Incremental support: no

Range ingestion: no

### Lead by program

Extracts all lead data, including the lead's program id.

`target: all_lead_with_program_id`

Configuration:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **included_fields** | false | null | List of lead fields to include in the export request sent to Marketo; can be used to reduce request and response size |
| **program_ids** | false | null | Import members of the specified program IDs (comma-separated). If not specified, all members of all programs are imported |

Schema type: Dynamic, via describe leads. The schema has one additional column containing the id of the program the lead belongs to.

Incremental support: no

Range ingestion: no

### Assets programs

Gets assets programs by tag type query, by date range query, or all programs if no query is specified.

`target: program`

Configuration:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **query_by** | false | null | Get assets programs by query; supported values are `date_range` and `tag_type`. Leave unset to fetch all programs |
| **earliest_updated_at** | false | null | Required if query by `date_range` is selected. Exclude programs prior to this date. Must be a valid ISO-8601 string |
| **latest_updated_at** | false | null | Required if query by `date_range` is selected. Exclude programs after this date. Must be a valid ISO-8601 string |
| **filter_type** | false | null | Optional value sent when query by `date_range` is selected, to filter the result from Marketo. Supported values: `id`, `programId`, `folderId`, `workspace` |
| **filter_values** | false | null | The values associated with `filter_type` |
| **tag_type** | false | null | Required if query by `tag_type` is selected. Type of program tag |
| **tag_value** | false | null | Required if query by `tag_type` is selected. Value of the tag |
| **report_duration** | false | null | Amount of milliseconds to fetch from `earliest_updated_at`. If `incremental = true`, this value is automatically calculated for the first run as `latest_updated_at` - `earliest_updated_at` |
| **incremental** | false | true | If incremental is set to true, the next run will have `earliest_updated_at` set to the previous `latest_updated_at` + `report_duration`. Incremental import is only supported when querying by `date_range` |

Schema type: Static schema

Incremental support: yes (query by `date_range` only)

Range ingestion: yes

### Custom object

`target: custom_object`

Configuration:

| name | required | default value | description |
|------|----------|---------------|-------------|
| **custom_object_api_name** | true | null | The API name of the custom object |
| **custom_object_fields** | false | null | Comma-separated API names of the custom object's fields (optional) |
| **custom_object_filter_type** | true | null | Field to search on. Valid values are: dedupeFields, idFields, and any field listed in the searchableFields attribute of the Describe endpoint. Default is dedupeFields |
| **custom_object_filter_values** | false | null | Comma-separated list of field values to match |
| **custom_object_filter_from_value** | false | null | Only return custom objects with a value greater than this value |
| **custom_object_filter_to_value** | false | null | Only return custom objects with a value smaller than this value. If not set, only records with a value greater than "From Value" are returned, and the job stops if no record is found in 300 consecutive values |

Schema type: dynamic schema

Incremental support: no

### Program members configuration parameters

Gets members by program IDs, or members of all programs.

| name | required | default value | description |
|------|----------|---------------|-------------|
| **program_ids** | false | null | Import members of the specified program IDs (comma-separated). If not specified, all members of all programs are imported |

## Example

For lead, you have a `partial-config.yml` like below:

```yaml
in:
  type: marketo
  target: lead
  account_id: ACCOUNT_ID
  client_id: CLIENT_ID
  client_secret: CLIENT_SECRET
  from_date: 2017-09-01
  fetch_days: 1
out:
  type: stdout
```

You can run `embulk guess partial-config.yml -o lead-config.yml` to generate `lead-config.yml`, which includes a schema for Lead. Next, you can run `embulk preview lead-config.yml` to preview and `embulk run lead-config.yml` to run.

Example of an assets programs config:

```yaml
in:
  account_id: ACCOUNT_ID
  client_id: CLIENT_ID
  client_secret: CLIENT_SECRET
  target: program
  type: marketo
  query_by: date_range
  filter_type: folderId
  filter_values:
    - 2598
    - 1001
  earliest_updated_at: 2018-08-20T00:00:00.000Z
  latest_updated_at: 2018-08-31T00:00:00.000Z
  incremental: true
```
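The incremental behavior documented in the tables above (the next run's `from_date` becomes the previous run's `to_date`, calculated as `from_date + fetch_days`) can be sketched as follows. This is an illustrative Python model of the documented window arithmetic, not code from the plugin itself:

```python
from datetime import datetime, timedelta

def next_from_date(from_date: datetime, fetch_days: int) -> datetime:
    """Model the documented incremental behavior: each run ingests
    [from_date, from_date + fetch_days), and the next run's from_date
    is the previous run's to_date."""
    return from_date + timedelta(days=fetch_days)

# First run starts at 2017-09-01 with fetch_days: 1 ...
run1 = datetime(2017, 9, 1)
# ... so the second run starts at 2017-09-02.
run2 = next_from_date(run1, 1)
```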
59.743802
278
0.482916
eng_Latn
0.90943
fabe1a30d315cfb05f4d458a0efe5390d1ea91c9
2,099
md
Markdown
category/research.md
franckpicard/franckpicard.github.io
b95557a8a5dbc9e908f0d7600bdef7e12471bd2c
[ "MIT" ]
null
null
null
category/research.md
franckpicard/franckpicard.github.io
b95557a8a5dbc9e908f0d7600bdef7e12471bd2c
[ "MIT" ]
null
null
null
category/research.md
franckpicard/franckpicard.github.io
b95557a8a5dbc9e908f0d7600bdef7e12471bd2c
[ "MIT" ]
null
null
null
---
layout: category
title: Research
---

### Single Cell genomics

We develop statistical methods for the analysis of single cell data. We started by proposing a new method for PLS for classification, and we recently proposed a probabilistic version of PCA adapted to the analysis of over-dispersed counts with zero inflation. Recent developments focus on ATACSeq data analysis. Visit the [SingleStatomics](http://anr-singlestatomics.pages.math.cnrs.fr/) ANR project.

### Point process modeling in genomics

Recent projects focus on the spatial modeling of genomic data using point processes. We developed a testing procedure based on continuous testing to compare maps of genomic features, while controlling for multiple (continuous) error rates. We also develop Hawkes models to catch spatial interactions between genomic features.

### Replication origins in vertebrates

We work on the spatial program of replication at fine scales in vertebrate genomes. We characterized replication origins in humans by analyzing Oriseq data, and we determined epigenetic signatures that characterize the spatio-temporal program of replication. Our method of peak detection using scan statistics can be reproduced using <a href="{{ '/assets/soft/scan-method.zip' | prepend: site.baseurl | prepend: site.url }}">this code</a> (with a toy example). We are currently working on the conservation of replication origins in vertebrates, and on the better characterization of these epigenomic signatures, especially with ATACSeq data.

### Functional Data analysis

We developed curve clustering models that account for inter-individual variability through mixed functional models. We also developed a shrinkage method for functional mixed models. More recent developments concern functional models in the Poisson case and functional PCA.

### Population Genomics and sex-linked genes

We developed a statistical approach to infer sex-linked genes based on a controlled cross.
This method is available with the sexdetector software and we are currently developing a population-based model to enrich this method.
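The scan-statistic idea behind the peak detection mentioned above can be illustrated with a toy fixed-window scan. This is a sketch for intuition only, not the published method:

```python
def scan_statistic(positions, genome_length, window):
    """Slide a fixed-size window along [0, genome_length) and return
    (max count, window start): the scan statistic is the largest
    number of events falling in any single window."""
    best_count, best_start = 0, 0
    for start in range(0, genome_length - window + 1):
        count = sum(1 for p in positions if start <= p < start + window)
        if count > best_count:
            best_count, best_start = count, start
    return best_count, best_start

# Toy example: events cluster around position 50
events = [3, 48, 50, 51, 53, 90]
peak = scan_statistic(events, genome_length=100, window=10)
```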
83.96
642
0.813244
eng_Latn
0.994939
fabe57871660a12265dc23f9a8caa49738891bf7
424
md
Markdown
src/posts/2004-11-03.free-java.md
polarbirke/chrisennsdotcom
7a9df5abefa429ef2b1a4bbac592f0d4830b03cd
[ "MIT" ]
2
2020-04-05T00:53:27.000Z
2020-08-07T17:46:13.000Z
src/posts/2004-11-03.free-java.md
polarbirke/chrisennsdotcom
7a9df5abefa429ef2b1a4bbac592f0d4830b03cd
[ "MIT" ]
7
2020-09-15T16:59:09.000Z
2021-03-19T22:11:15.000Z
src/posts/2004-11-03.free-java.md
polarbirke/chrisennsdotcom
7a9df5abefa429ef2b1a4bbac592f0d4830b03cd
[ "MIT" ]
2
2020-02-06T23:55:34.000Z
2020-02-12T05:13:45.000Z
---
title: "Free Java"
views: '2'
---

<p>My Tim Horton's rep (aka the lady who remembers what I always get every morning) gave me a free lg english toffee today. That made my day after a crappy night of helping a guy figure out how to get pictures off his digital camera that just doesn't really want to function properly with WinME.</p>

<p>I think I'm going to switch to doing strictly Mac support. Stupid Windows.</p>
60.571429
300
0.742925
eng_Latn
0.99938
fabe86643dc3eb66b90cc798414910ec47445822
1,222
md
Markdown
internal/api/repository/db_repo/README.md
JokerFy/go-gin-api
8c88314e549488f495f8c480177f64bee942d7c6
[ "MIT" ]
2
2021-11-17T00:37:42.000Z
2022-02-16T09:44:25.000Z
internal/api/repository/db_repo/README.md
GolangFamily/go-gin-api
0dbd4b093530d5dbebb098da126b29da6929d55b
[ "MIT" ]
null
null
null
internal/api/repository/db_repo/README.md
GolangFamily/go-gin-api
0dbd4b093530d5dbebb098da126b29da6929d55b
[ "MIT" ]
null
null
null
## Usage example

Take `user_demo` as an example:

```go
// Query: multiple rows + pagination
page := 2
num := 2
offset := (page - 1) * num
user, err = user_demo_repo.NewQueryBuilder().
	WhereIdNotIn([]int32{1, 2, 3}).
	WhereUserName(db_repo.EqualPredicate, "tom").
	Limit(num).
	Offset(offset).
	QueryAll(u.db.GetDbR().WithContext(ctx.RequestContext()))

// Query: total count
count, err := user_demo_repo.NewQueryBuilder().
	WhereIdNotIn([]int32{1, 2, 3}).
	WhereUserName(db_repo.EqualPredicate, "tom").
	Count(u.db.GetDbR().WithContext(ctx.RequestContext()))

// Query: single row
user, err = user_demo_repo.NewQueryBuilder().
	WhereUserName(db_repo.EqualPredicate, "tom").
	QueryOne(u.db.GetDbR().WithContext(ctx.RequestContext()))

// Create
model := user_demo_repo.NewModel()
model.UserName = user.UserName
model.NickName = user.NickName
model.Mobile = user.Mobile
id, err = model.Create(u.db.GetDbW().WithContext(ctx.RequestContext()))

// Update
model := user_demo_repo.NewModel()
model.Id = id
data := map[string]interface{}{
	"nick_name": nickname,
}
err = model.Updates(u.db.GetDbW().WithContext(ctx.RequestContext()), data)

// Delete
model := user_demo_repo.NewModel()
model.Id = id
err = model.Delete(u.db.GetDbW().WithContext(ctx.RequestContext()))
```
23.960784
74
0.689034
yue_Hant
0.756326
fabfbb528a0c4c1206b95d450d03c66e2a8ab6e0
8,577
md
Markdown
content/cdk-ddb-quicksight-eng/index.md
mmuller88/mmblog
f9912e61eb95583582b114009565e18b833c70b9
[ "MIT" ]
1
2020-06-07T08:07:32.000Z
2020-06-07T08:07:32.000Z
content/cdk-ddb-quicksight-eng/index.md
mmuller88/mmblog
f9912e61eb95583582b114009565e18b833c70b9
[ "MIT" ]
27
2020-05-22T06:26:30.000Z
2022-02-26T21:20:49.000Z
content/cdk-ddb-quicksight-eng/index.md
mmuller88/mmblog
f9912e61eb95583582b114009565e18b833c70b9
[ "MIT" ]
1
2020-05-24T20:10:05.000Z
2020-05-24T20:10:05.000Z
--- title: AWS DynamoDB Analysis with QuickSight and AWS CDK date: '2021-04-08' image: 'ddb-quicksight.jpeg' tags: ['eng', '2021', 'projen', 'cdk', 'quicksight', 'aws'] #nofeed gerUrl: https://martinmueller.dev/cdk-ddb-quicksight pruneLength: 50 --- Hi. AWS DynamoDB is an extremely performant and scalable NoSQL database. Due to the lack of a schema, data, called items in DynamoDb, can be extremely flexible. This also allows a kind of evolutionary development of the items by simply creating new columns. But there is a catch. Because we are no longer in the world of relational data, we can no longer perform relational operations such as joins or the usual relational operations like COUNT, ORDER BY, GROUP by and many more. Now you ask yourself why should I be able to perform joins for example, they already bugged us with relational databases? We can use joins for example in analysis. I'll try to explain it with a store: * Top X products sold in the period from t1 to t2 grouped by gender. In our example the users and the sold products are each in their own DynamoDB table and are indirectly connected via a userId. However, it is also conceivable that both are in the same table, but in different data sets or rows. DynamoDB does not allow joins and so we cannot relate products sold to gender. The solution to the problem is AWS Athena, QuickSight, Lambda and S3. Using a Lambda, we store the DynamoDB items as a flat JSON file in an S3. Then we let Athena access it. QuickSight then uses Athena as a data engine to create joins, analytics, and dashboards. How you automate this with AWS CDK and a description of the AWS services used will follow in the next sections. For the impatient ones, here is the [code](https://github.com/mmuller88/ddb-quicksight). But before we got to the next section I would like to thank the sponsor for this blogpost and the exciting project to perform analysis of DynamoDB tables using QuickSight. 
Thanks to [TAKE2](https://www.take2.co/) for letting me be part of your agile and motivated team to work on exciting AWS CDK tasks like these. # AWS DynamoDB [AWS DynamoDB](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Introduction.html) is a managed NoSQL database with very good performance and scaling. Managing the database from AWS eliminates tedious administrative tasks such as installation or maintenance. DynamoDB also has backup features like on demand or point-in-time recovery. In DynamoDB, the inserted data does not have to follow a fixed schema like in relational databases. This is super flexible and very useful, but can also lead to problems like confusion or inconsistencies in column names. Therefore I recommend to allow only certain columns in the table. This can be achieved for example by schema validation in Api Gateway or using a GraphQL schema in AWS AppSync. # AWS Athena AWS Athena allows data to be queried. To access the data, the developer uses standard SQL as the query language. The data source can be various AWS services such as S3, RedShift and most recently DynamoDB. The advantage to Athena is that it is serverless, so you can focus directly on querying the data. To set DynamoDB as data source for Athena you need a Lambda Connector. The connector writes all items from the table into an S3 bucket. Fortunately, AWS already provides a SAM Lambda that does the job. This lambda is called [AthenaDynamoDBConnector](https://github.com/awslabs/aws-athena-query-federation/blob/master/athena-dynamodb) # AWS QuickSight AWS QuickSight is a service for creating and analyzing visualizations of customer data. The customer data can reside in AWS services like S3, RedShift or as in our case in DynamoDB. ![pic](https://raw.githubusercontent.com/mmuller88/ddb-quicksight/main/misc/QS.png) QuickSight cannot directly read data from DynamoDB at this time and a small intermediate step must be taken. 
The DynamoDB data has to be exported into an S3 bucket, e.g. as JSON. Then QuickSight can read the data located in S3. To push the data into the S3 bucket, the approach of using an AthenaDynamoDBConnector Lambda is suitable. You can read more about this in the next section. QuickSight offers many cool functions for processing and visualizing data that can come from DynamoDB, for example. You can also define them as code with CDK. I am working on storing the QuickSight analyses in templates via CDK in order to make them accessible across accounts. This will allow me to create and test analyses on a dev account and then deploy them automatically to the prod account via CDK. How exactly this should work will be explained in the next blogpost. # AWS CDK [AWS CDK](https://github.com/aws/aws-cdk) is an open source framework for creating and managing AWS resources. By using languages familiar to the developer such as TypeScript or Python, the infrastructure is described as code. In doing so, CDK synthesizes the code into AWS CloudFormation templates and can optionally deploy them right away. AWS CDK has been experiencing a steady increase in enthusiastic developers since 2019 and already has a strong and helpful community that is very active on [Slack](https://cdk-dev.slack.com). There is of course much more to say about AWS CDK and I recommend you explore it. Drop me a line if you have any questions. With AWS CDK, I have achieved a high level of automation in creating and managing the DynamoDB QuickSight deployment. This involves defining the required AWS resources and their configurations nicely as code and then simply executing them. And here you can see a component diagram: ![pic](https://raw.githubusercontent.com/mmuller88/ddb-quicksight/main/misc/ddb-quicksight.png) You can find the AWS CDK code for the DynamoDB Athena deployment in my [repo](https://github.com/mmuller88/ddb-quicksight/blob/main/src/ddb-athena-stack.ts). 
Be sure to check out the readme there as well, since it contains a lot of important instructions and information. Unfortunately I couldn't write everything in AWS CDK because, for example, the SAM Lambda AthenaDynamoDBConnector is hard or impossible to translate into AWS CDK, so until now it has to be deployed manually. # Outlook It would be super cool if the AthenaDynamoDBConnector were also available in AWS CDK. Also, the QuickSight CloudFormation resources still seem very immature to me, as some things, like the DataSet, are not supported at all. A [GitHub issue](https://github.com/aws-cloudformation/aws-cloudformation-coverage-roadmap/issues/274) has already been created. As mentioned in the QuickSight section, I am working on persisting the analytics from QuickSight using CDK and then making them available across accounts. I will show my findings in the next blogpost. # Summary AWS QuickSight is an exciting analytics tool for evaluating data in a DynamoDB table. Data is the new gold and therefore it is extremely important to be able to process data that is otherwise difficult to process, for example in a DynamoDB table. If you can use familiar mechanisms like SQL queries and aggregate functions, that's great. QuickSight also offers cool graphical solutions for displaying analyses and dashboards. I am very excited to be able to work with it more. Thanks also to Jared Donboch for the extremely helpful blog post [Using Athena data connectors to visualize DynamoDB data with AWS QuickSight](https://dev.to/jdonboch/finally-dynamodb-support-in-aws-quicksight-sort-of-2lbl). Based on that, I was able to automate as much as possible by composing it into an AWS CDK stack. Thanks again to [TAKE2](https://www.take2.co/) for sponsoring this blog post. Thanks to the [DeepL translator (free version)](https://DeepL.com/Translator) for helping with translating to English and saving me tons of time :). To the wonderful readers of this article I'm saying that feedback of any kind is welcome. 
In the future I will try to include a discussion and comment feature here. In the meantime, please feel free to send me feedback via my social media accounts such as [Twitter](https://twitter.com/MartinMueller_) or [FaceBook](https://facebook.com/martin.muller.10485). Thank you very much :). I love to work on Content Management Open Source projects. A lot of my stuff you can already use on https://github.com/mmuller88 . If you like my work there and my blog posts, please consider supporting me on Patreon: <a href="https://patreon.com/bePatron?u=29010217" data-patreon-widget-type="become-patron-button">Become a Patreon!</a><script async src="https://c6.patreon.com/becomePatronButton.bundle.js"></script>
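To make the store example from the introduction concrete: the join plus GROUP BY that DynamoDB itself cannot perform, and that Athena runs in SQL over the exported items, can be sketched in plain JavaScript. The sample items and field names below are hypothetical, not taken from the actual tables:

```javascript
// Hypothetical flat exports of the two DynamoDB tables (users and sales),
// as they would land in S3 for Athena to query.
const users = [
  { userId: 'u1', gender: 'female' },
  { userId: 'u2', gender: 'male' },
];
const sales = [
  { userId: 'u1', product: 'mug' },
  { userId: 'u1', product: 'mug' },
  { userId: 'u2', product: 'pen' },
];

// Join sales to users on userId, then count sold products per (gender, product);
// this is exactly the relational operation DynamoDB cannot do on its own.
function soldByGender(users, sales) {
  const genderById = new Map(users.map((u) => [u.userId, u.gender]));
  const counts = {};
  for (const sale of sales) {
    const key = `${genderById.get(sale.userId)}:${sale.product}`;
    counts[key] = (counts[key] || 0) + 1;
  }
  return counts;
}

console.log(soldByGender(users, sales)); // { 'female:mug': 2, 'male:pen': 1 }
```

Athena would express the same aggregation as a `SELECT ... JOIN ... GROUP BY` over the two exported datasets.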
112.855263
470
0.796199
eng_Latn
0.998091
fabfe08d3836a85a285a2154fed674ad240830d0
106
md
Markdown
README.md
thachnuida/simple-chat-app-angularjs-firebase
a7fe0119e630d15f92f3e78bf65582cf15f331ff
[ "MIT" ]
null
null
null
README.md
thachnuida/simple-chat-app-angularjs-firebase
a7fe0119e630d15f92f3e78bf65582cf15f331ff
[ "MIT" ]
null
null
null
README.md
thachnuida/simple-chat-app-angularjs-firebase
a7fe0119e630d15f92f3e78bf65582cf15f331ff
[ "MIT" ]
null
null
null
# Simple chat app built with AngularJS and Firebase # To do - Testing # Demo http://chat-saysua.herokuapp.com/
11.777778
43
0.716981
kor_Hang
0.630394
fac11572e563b9f6c64490d3406d7002f8b5f67c
2,834
md
Markdown
vendor/github.com/songangweb/mcache/README.md
leon-sxd/durl
21ef6e6254e1652c4247f1710ca0b9f5fa83454e
[ "MIT" ]
321
2021-05-14T02:47:47.000Z
2022-03-30T01:51:33.000Z
vendor/github.com/songangweb/mcache/README.md
leon-sxd/durl
21ef6e6254e1652c4247f1710ca0b9f5fa83454e
[ "MIT" ]
3
2021-07-09T10:54:53.000Z
2021-12-10T08:20:35.000Z
vendor/github.com/songangweb/mcache/README.md
leon-sxd/durl
21ef6e6254e1652c4247f1710ca0b9f5fa83454e
[ "MIT" ]
6
2021-05-24T08:49:19.000Z
2022-02-21T06:05:53.000Z
# Welcome to the mcache in-memory cache package ### mcache is a cache package built on top of golang-lru mcache adds cache expiration times, adds an LFU algorithm, and reworks the dependency structure of the original ARC algorithm. More in-memory algorithms will keep being added. - Expired data is deleted lazily based on its expiration time; expired entries can also be flushed proactively ## Why use mcache? By sacrificing some server memory and using local memory directly, network round trips are avoided, which buys faster cache speeds than Redis, Memcached, etc., so mcache can serve as an additional, higher cache layer ## What can mcache do? It can serve as a cache storage mechanism for data accessed at extremely high frequency ## How do I use mcache? Depending on the cache eviction algorithm you need, call the corresponding constructor ## Code example: len := 10 // NewLRU constructs an LRU cache list of the given size Cache, _ := m_cache.NewLRU(len) // NewLFU constructs an LFU cache list of the given size Cache, _ := m_cache.NewLFU(len) // NewARC constructs an ARC cache list of the given size Cache, _ := m_cache.NewARC(len) // New2Q constructs a 2Q cache list of the given size Cache, _ := m_cache.New2Q(len) // Purge is used to completely clear the cache. Cache.Purge() // PurgeOverdue is used to completely clear the overdue cache. Cache.PurgeOverdue() // Add adds a value to the cache; if the key already exists, its entry is updated. Returns true if an eviction occurred. Cache.Add(1, 1, 1614306658000) Cache.Add(2, 2, 0) // expirationTime: pass 0 for no expiration // Get looks up a key's value from the cache. Cache.Get(2) // Contains checks if a key is in the cache, without updating the recent-ness or deleting it for being stale. Cache.Contains(2) // Peek returns the key value (with ok=false if not found) without updating the "recently used"-ness of the key. Cache.Peek(2) // ContainsOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns whether found and whether an eviction occurred. Cache.ContainsOrAdd(3, 3, 0) // PeekOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns whether found and whether an eviction occurred. Cache.PeekOrAdd(4, 4, 0) // Remove removes the provided key from the cache. 
Cache.Remove(2) // Resize changes the cache size, returning the count before the adjustment. len = 100 Cache.Resize(len) // RemoveOldest removes the oldest item from the cache. Cache.RemoveOldest() // GetOldest returns the oldest entry. Cache.GetOldest() // Keys returns a slice of the keys in the cache, from oldest to newest. Cache.Keys() // Len returns the number of items in the cache. Cache.Len()
23.421488
78
0.66796
eng_Latn
0.89953
fac13011c2c2cf490c18df3b17bf5af183c08994
12,143
md
Markdown
translated/command-line-heroes-season-3-creating-javascript.md
gxlct008/LCRH
410d3e7bd1ba657ebcc5998e5686865e48f5054b
[ "Apache-2.0" ]
1
2021-08-13T14:53:17.000Z
2021-08-13T14:53:17.000Z
translated/command-line-heroes-season-3-creating-javascript.md
Flying-FeiFia/LCRH
34338da7cd31f75bdf77ae4c95b265cc11e73c8f
[ "Apache-2.0" ]
null
null
null
translated/command-line-heroes-season-3-creating-javascript.md
Flying-FeiFia/LCRH
34338da7cd31f75bdf77ae4c95b265cc11e73c8f
[ "Apache-2.0" ]
null
null
null
[#]: collector: (bestony) [#]: translator: (gxlct008) [#]: reviewer: (windgeek) [#]: publisher: ( ) [#]: url: ( ) [#]: subject: (Command Line Heroes: Season 3: Creating JavaScript) [#]: via: (https://www.redhat.com/en/command-line-heroes/season-3/creating-javascript) [#]: author: (RedHat https://www.redhat.com/en/command-line-heroes) Command Line Heroes: Season 3: Creating JavaScript ====== **00:00** - _Saron Yitbarek_ 嗨,大家好。 我们回来了。 我们很高兴能推出 Command Line Heroes 第三季。 我们要感谢你们中很多人在这个节目中讲述的故事,因为每一季都源于我们与开发人员、 SIS 管理员、 IT 架构师、 工程师以及开源社区的人们讨论您最感兴趣的的主题和技术。 现在,我们正在进一步开放这种方式。 我们希望大家都能参与进来,帮助塑造 Command Line Heroes 的未来。 您可以通过我们的简短调查来做到这一点。 您喜欢这个节目的什么地方? 您还希望我们多谈论哪些内容? 亲爱的听众,我们想进一步了解您。 您是开发人员吗? 您是在运营部门工作,还是在做一些与技术完全无关的工作? 请访问 commandlineheroes.com/survey,以帮助我们提升第四季及以后的播客内容。 现在,让我们进入第三季。 **01:00** - _Saron Yitbarek_ Brendan Eich (布兰登·艾奇) 在 Netscape (网景) 公司总部的办公桌前坐下时只有 34 岁。 他正致力于为期 10 天的大规模编码冲刺。 一种新的语言,一种全新的编程语言,将在在短短 10 天内诞生。 那是在 1995 年,编程语言的世界即将永远改变。 **01:26** - _Saron Yitbarek_ 我是 Saron Yitbarek,这里是 Command Line Heroes,一个来自 Red Hat (红帽) 的原创播客。 整个一季,我们都在探索编程语言的威力和前景,探索我们的语言是如何塑造开发世界的,以及它们是如何推动我们的工作的。 这一次,我们追踪 JavaScript 的创建历程。 也许您以前听过 Brendan Eich 的故事,但是像 JavaScript 这种计算机语言是如何真正创造的呢? 其中肯定有 Brendan 的冲刺。 但是这个故事还有更多的内容。 **02:02** - _Saron Yitbarek_ 我们的 JavaScript 故事始于一场战争,一场浏览器之战。 1990 年代的浏览器大战似乎已经成为历史,但他们的影响无疑是巨大的。 在战场的一方,Netscape 与 Sun Microsystems 结成了联盟。 另一方,您看到的是 Microsoft,软件巨头。 他们争夺的战利品是什么? 
赌注已经大得不能再大了,因为这是一场决定谁将成为互联网看门人的对决。 **02:40** - _Saron Yitbarek_ 为了真正了解浏览器之战是如何落下帷幕的,让我来打电话给我最喜欢的科技历史学家之一、 作家 Clive Thompson (克莱夫·汤普森)。 他最新的一本书—— **02:50** - _Clive Thompson_ 《编码者: 新部落的形成和世界的重塑》。 **02:54** - _Saron Yitbarek_ Clive 和我谈论的是浏览器之战,让我来为您做个铺垫吧。 您会看到 Netscape 意识到浏览器将会是人们用来上网的关键软件。 还有 Microsoft,他们的整个商业模式就是将东西打包到 Windows 中。 直到 1990 年代,他们才真正对浏览器感兴趣,微软意识到也许他们一直睡在方向盘上了。 世界正在向线上移动,微软 Windows 内没有任何东西可以帮助他们实现这一目标。 但是那里的那些人,一家名为 Netscape 的公司,他们正在提供一个通往互联网的入口。 突然之间,微软在整个行业的主导地位看起来并不是那么绝对。 浏览器之战始于那一刻,即微软意识到了互联网的力量,并眯着眼睛看向他们新竞争对手的那一刻。 好了,这就是我的铺垫。 这里我和 Clive 讨论接下来发生的事情。 **04:03** - _Clive Thompson_ 这场战争是关于谁将成为上网的主要门户。 您必须意识到,在 90 年代初期,没有人真正在线。 当 Mosaic 出现并最终变成 Netscape 时,他们是第一款任何人都可以下载的并让人能够浏览Web的浏览器。 他们于 1994 年 12 月上线。 所以突然之间,成千上万的人能够以这种图形方式使用互联网。 他们获得了巨量的下载和大量的新闻报道。 基本上每个人都在说:“是的,Netscape 是这种被称之为互联网的事物的未来。” **04:40** - _Clive Thompson_ 所以在西雅图,你可以看到微软非常警惕地关注着这件事,因为他们几乎忽略了互联网。 他们只专注于销售 Windows,实际上并没有对这种被称为互联网的疯狂新事物给予任何关注。 因此,他们不得不玩一场急速追赶游戏。 近一年后,他们才推出自己的浏览器。 在 1995 年秋天,他们的浏览器问世了,这实质上是浏览器大战的开始,那时微软也正在努力成为人们上网的门户。 **05:13** - _Saron Yitbarek_ Okay。 花费一年的时间才让浏览器面世听起来不算太糟,对吧? 时间不算太长。 对吧? 这似乎是一个合理的时间。 **05:21** - _Clive Thompson_ 不,是真的。 这听起来好像不是很长时间,但那时事情发展得是如此之快。 而且人们有一种强烈的先发优势意识,那就是第一家能以你上网的方式作为自己品牌的公司将成为多年的赢家,甚至可能永远是赢家。 我还记得当时的开发速度有多快。 我的意思是,Netscape 每两三个月就会推出一款新的浏览器,对吗? 他们会说,“哇。 现在,我们已经将电子邮件集成到浏览器中了。 现在,我们在顶部有了一个小小的搜索栏。” 它一直在变得越来越好。 你可以在某种程度上看到,你知道的,可以在网上做的所有事情都进入了视线,因为它们可以快速迭代并快速将其推出。 **06:01** - _Clive Thompson_ 微软习惯于非常缓慢的开发模式。 这是您长达四年的开发过程。 它是我们能买到的没有 bug 的版本。 把它封盒,投放到商店去,然后我们四年都不发布新版本。 现在 Netscape 出现了,它是第一家说,“不,我们将推出一款不怎么合格的产品,但它运行得足够好,我们将在三个月、三个月又三个月内推出一个新的供你下载。” 这完全破坏了微软的稳定。 **06:30** - _Saron Yitbarek_ 好吧。 如果我是微软,我可以看着它说,“哦,天哪。 这就是未来。 我需要迎头赶上。 我需要竞争。” 或者我可以说,“啊,这是一种时尚。” 那么浏览器到底是什么? 这让微软选择了第一个选项。 它让微软说,“哦,天哪。 这是真的。 我需要竞争。” **06:51** - _Clive Thompson_ 浏览器本身具有大量的文化传播和积淀作用。 您在互联网上可以做的第一件事,一般是获得像文化之类的乐趣。 您可以突然进入某个乐队的网页,查看他们的帖子和他们的照片。 您可以通过找到佛罗里达州的所有模特训练人去研究自己的爱好,对吗? 
所以,在此之前,关于互联网的一切都看起来很刻板。 电子邮件,文件传输,诸如此类。 我的意思是,突然之间,浏览器使互联网看起来像一本杂志,像一个有趣的互动对象。 报纸,CNN 和杂志第一次以这种非常激动人心的方式对此进行了报道。 就在这一刻,科技从深入商业版块发展成为 《纽约时报》 A1 版。 **07:41** - _Saron Yitbarek_ 那么,对于开发人员而言,Netscape 甚至仅仅只算是浏览器能有什么吸引力呢? 他们为什么如此着迷呢? **07:48** - _Clive Thompson_ 为此我拜访过很多开发人员。 突然间,随着浏览器的出现,互联网出现了,你可能只看到一个 web 页面,上面写着:“下载我那酷酷的软件吧。” 因此,它开启了我们今天看到的软件制造的整个世界。 **08:04** - _Saron Yitbarek_ 我在这里应该提一下,起初微软实际上提出要收购 Netscape。 他们出价很低,Netscape 拒绝了他们。 因此,微软不得不打造自己的浏览器。 称他们自己的为 Explorer (IE)。 **08:21** - _Clive Thompson_ 微软花了一年的时间疯狂地开发浏览器,并于 1995 年秋天将其推出。 他们所做的与 Netscape 差不多。 他们很快就做出了一些东西,并不担心它是否完美,因为它会越来越好。 但是,在 90 年代后半叶真正出现的是一场关于谁的浏览器将是最有趣,最具交互性、最尖端的战争。 **08:53** - _Saron Yitbarek_ 请记住,Netscape 在这方面绝不占上风。 **08:57** - _Clive Thompson_ 微软拥有非常强大的地位。 当 Windows 安装到全球所有计算机的 80% ~ 90% 时,很容易就会把您的软件设置为默认软件。 而这正是他们所做的。 所以你可以看到 Explorer 的崛起,崛起,崛起。 **09:16** - _Saron Yitbarek_ 在某种程度上,可怜的老网景在这场战斗中一直处于劣势,但事情就是这样。 在战斗结束之前,他们抛出了一个美丽的 Hail Mary,事实证明,这将成为整个编程世界的一个令人难以置信的成绩。 **09:35** - _Clive Thompson_ 这就是 JavaScript 创建过程中迷人而怪异的故事。 **09:43** - _Saron Yitbarek_ 所有围绕网络的热议,围绕浏览器生命潜力的热议,都非常清楚地表明了一件事。 我们需要一种新的编程语言,一种远远超出 HTML 的语言。 我们需要一种为所有新的基于 web 的开发量身定做的语言。 我们想要一种不仅在线上生存,而且在那里蓬勃发展的语言。 **10:10** - _Clive Thompson_ 如何为浏览器创建编程语言? **10:15** - _Saron Yitbarek_ 我的朋友,这是一个数十亿美元的问题。 大约在 Netscape 看到微软与他们竞争的时候,他们开始关注 Java™。 Java 会成为 Web 开发的语言吗? Java 是这种丰富的编译语言。 它的性能和 C++ 一样好。 但它仍然需要编译。 开发人员确实想要一些更轻量级的东西,一些可以解释而不是编译的东西,一些可以吸引所有涌入 Web 的非专业程序员的东西。 毕竟,那些新的程序员想要直接在网页上工作。 那是梦想。 **11:05** - _Saron Yitbarek_ Netscape 需要一种可以在浏览器内部运行的编程语言,让开发人员能够让这些静态网页栩栩如生。 他们想,如果他们能在发布 Netscape 2.0 测试版的同时,发布一种新的轻量级语言,为网络编程创造奇迹,那不是很棒吗? 
只有一个问题。 他们正好有 10 天的时间来创造一门新的语言。 实际上,它给了一个叫 Brendan Eich 的人 10 天的时间。 他就是那个负责完成这件事的人。 毫无疑问,如果有人能做到这一点,那就是他。 当 Brendan 还是伊利诺伊大学的学生时,他常常为了好玩而创造新的语言,只是为了玩玩语法。 **11:57** - _Charles Severance_ Brendan Eich 的关键在于,那时的 Brendan Eich,在构建 JavaScript 时已经成了某种程度上的语言狂热分子。 **12:05** - _Saron Yitbarek_ 为了了解 Eich (布兰登·艾奇) 到底取得了什么成果,我们联系了密歇根大学信息学院的教授 Charles Severance (查尔斯·塞维兰斯)。 **12:14** - _Charles Severance_ JavaScript 在某种程度上是在 Java 被视为未来的环境中创建的,在 1994 年,我们认为它 (Java) 将解决所有问题。 一年后,真正能解决一切的东西即将出现,但它不能说,“嘿,我已经解决了一切”,因为每个人,包括我自己,就像都相信 94,95 年我们看到了摇滚乐的未来一样,这个未来就是 Java 编程语言。 他们必须建立一种看似无关紧要,看似愚蠢,看似毫无意义,但却是正确的解决方案的语言。 **12:56** - _Saron Yitbarek_ 但是 Eich 提供的可不仅仅是一种玩具语言。 它以隐藏的方式进行了复杂处理,并从以前的语言中汲取了主要灵感。 **13:07** - _Charles Severance_ 如果您看一下基本语法,很明显它的灵感来自于带有花括号和分号的 C 语言。 一些字符串模式取自 Java 编程语言,但面向对象的底层模式取自名为 Moda-2 的编程语言,它有函数是一等类的概念。 对我来说,这确实是使 JavaScript 成为如此强大以及可扩展语言的最令人惊叹的选择之一,即函数的主体,构成函数本身的代码也是数据。 **13:41** - _Charles Severance_ 另一个真正的灵感来源于 HyperCard。 JavaScript 总是在浏览器中运行,这意味着它有文档对象模型的基本数据上下文,文档对象模型是网页的面向对象表示。 它不像传统的编程语言。 JavaScript 代码不是从一开始就启动的。 首先它是一个网页,它最终也以这个面向事件的编程结束。 **14:12** - _Saron Yitbarek_ 1995 年 11 月 30 日,当 JavaScript 与 Netscape Navigator 2.0 一起发布时,所有的魔力都被植入到一种强大的语言小种子中。 包括 America Online (美国在线) 和 AT&T (美国电话电报公司) 在内的 28 家公司同意将其作为一种开放标准语言使用。 当它发布时,有一些老的专业人士对 JavaScript 嗤之以鼻。 他们认为这只是一种新手的语言。 他们错过了它革命性的潜力。 **14:46** - _Charles Severance_ Brendan(布兰登)决定将所有这些来自不太知名语言的超高级概念融入其中,这些语言非常类似于高级面向对象语言。 所以 JavaScript 就像一只特洛伊木马。 它在某种程度上潜入了我们的集体意识,认为它是愚蠢的、 开玩笑的、 容易的和轻量级的。 但是几乎从一开始它就是内置的、 功能强大的、 深思熟虑的编程语言,它几乎能够在计算机科学中做任何事情。 **15:17** - _Saron Yitbarek_ 其结果是成为了一种浏览器原生语言,可以随着我们在线生活的发展而不断进化。 没过多久,JavaScript 就成为了实际上的 web 开发选择。 **15:29** - _Charles Severance_ JavaScript 是一种我别无选择的语言,我只能学习它,从字面上讲,学习 JavaScript 的人通常别无选择,因为他们会说,“我想构建一个浏览器应用程序,我想让它有交互元素。” 因此,答案是您必须学习 JavaScript。 如果你想象一下,比如说,你最喜欢的语言是什么,那么这个问题的答案几乎就是 x 加上 JavaScript,对吧? 
有人可能会说,“我喜欢 Python 和 JavaScript ”,或者 “我喜欢 Scala 和 JavaScript”,因为它就像是每个人都需要学习的语言。 **16:05** - _Saron Yitbarek_ Charles Severance (查尔斯·塞维兰斯) 是密歇根大学信息学院的教授。 他说,Netscape 公司一开始非常强大,他们在浏览器之战中奋力拼搏,但最终...... **16:22** - _Clive Thompson_ Netscape 作为一款严肃的产品就这样消失了。 **16:27** - _Saron Yitbarek_ 微软在整个行业的主导地位是一股压倒性的力量。 尽管在浏览器游戏上晚了一年,但他们还是能够重新站上榜首并赢得了这一天的胜利。 但你知道,Netscape 的 Hail Mary,它的 JavaScript 的创造,是成功的,因为在战斗结束很久之后,这种从浏览器战争中诞生的语言的瑰宝,它的来世将改变一切。 **17:01** - _Saron Yitbarek_ 如果您是最近才开始编程的,很可能会认为您可以开发可更改和更新的交互式 Web 页面,而无需从服务器提取页面的全新副本。 但是,想像一下,当这样做会成为一种全新的选择时会是什么样子的。 我们有请 Red Hat (红帽公司) 的软件工程师 Michael Clayton (迈克尔·克莱顿) 帮助我们了解那是一个多么巨大的转变。 **17:28** - _Michael Clayton_ 在,我想说 2004 年,Google Mail 发布了。 Gmail,据我所知,它是第一个真正将 JavaScript 带到更高水平的 Web 应用程序,它使用 JavaScript 来动态地切换你正在查看的内容。 **17:49** - _Saron Yitbarek_ 假设您正在查看收件箱,然后单击了一封电子邮件。 在过去,你的电子邮件查看器会在你的浏览器中加载一个全新的页面,仅仅是为了向您显示那封电子邮件。 当您关闭该电子邮件时,它会重新加载整个收件箱。 **18:05** - _Michael Clayton_ 这造成了很大的延迟。 当您在视图之间来回切换时有很多等待,Gmail 改变了这一切。 他们使用 JavaScript 在后台获取您想要查看的内容,然后将其展现在您面前,而无需等待全新的页面视图。 **18:23** - _Saron Yitbarek_ 这节省了大量的时间和精力。 但是仔细想想,它改变的不仅仅是速度。 它改变了我们工作的本质。 **18:35** - _Michael Clayton_ 所以,作为一种职业的 web 开发者,已经从类似幕后角色的服务端走到了离用户仅薄薄一层之隔的位置,因为他们直接在浏览器中编写代码,而用户也正是通过浏览器查看 web 页面。 **18:52** - _Saron Yitbarek_ 它改变了一切。 事实上,您完全可以把引领 Web2.0 革命的功劳都归功于 JavaScript。 任何有网络浏览器的人都会突然之间拥有一个摆在他们面前的开发环境。 但是,正如我之前提到的,老保守派对民主程度并不一定感到舒服。 **19:16** - _Michael Clayton_ 早期对 JavaScript 反对的,我也是其中的一员。 我有阻止 JavaScript 运行的浏览器扩展。 我认为它是一种无用的玩具语言,每当我访问一个网页,该网页的某些关键功能需要 JavaScript 时,我都会感到愤怒。 我想,“您应该在没有 JavaScript 的情况下以正确的方式构建您的网站。” **19:43** - _Saron Yitbarek_ 然而,很快,Brendan Eich 仅仅用 10 天创建的语言,它所蕴含的美和潜力对每个人来说都变得显而易见。 现在,它不仅征服了浏览器,也征服了服务器。 有了 Node.js,这种小语言可能会打开一个全新的领域。 **20:03** - _Michael Clayton_ 当我听说 JavaScript 打算在服务器上运行时,我想,“为什么会有人想这么做? 
那时,我已经是一名专业的 JavaScript 开发人员了。 我每天都写很多 JS,但我还是不太明白为什么它可以归属到服务器端,事实证明,像很多听众都知道的那样,Node.js 现在是这个行业的一支巨大的力量。 我认为这是有充分理由的。 **20:32** - _Michael Clayton_ Node.js 如此成功的原因之一就是它挖掘到了庞大的前端 JavaScript 开发人员和客户端开发人员团体。 他们在写代码。 他们在用 JavaScript 为浏览器编写代码。 这么多的开发者,现在又可以用同样的语言来为服务器端编程,这让他们立刻就拥有了大量的可以立即开始为服务器端做贡献的人员。 该工具已经在您的工具包中,您只需将其拿出来,安装上 Node.js,然后就可以加入到编码竞赛中去了。 **21:11** - _Saron Yitbarek_ 首先在浏览器中,然后又在服务器上。 JavaScript 是这种朴实无华、 暗地里优雅,有时候会有缺陷的语言。 一个所有人都低估了的浏览器战争的幸存者。 **21:25** - _Michael Clayton_ JavaScript 一直以来都是编程语言的灰姑娘故事,始于基本上是在 10 天内拼凑起来的初态。 中间经历了来自其他编程社区的许多嘲笑,然而仍以某种方式继续取得成功和增长,最后到现在稳居世界上最流行的编程语言中排名第一、第二的位置。 JavaScript 基本上无处不在。 在网页内部运行的能力意味着 JavaScript 和 Web 一样普及,这是非常普遍的。 **22:08** - _Saron Yitbarek_ Michael Clayton 是 Red Hat 的工程师。 JavaScript 吞噬了世界吗? 它是否搭上了 web 的顺风车,才成了一种主流语言? 我想找出 JavaScript 的实际边界在哪里。 **22:25** - _Klint Finley_ 嗨,我叫 Klint Finley (克林特·芬利)。我是 Wired.com(连线)网站的撰稿人。 **22:28** - _Saron Yitbarek_ Klint (克林特) 对同样的事情也很好奇。 他越是关注今天 JavaScript 的运行方式,就越发意识到它已经渗透到他在线生活的每一个环节。 **22:40** - _Klint Finley_ 甚至在您有机会决定是否希望所有这些不同的应用程序都在您的计算机上运行之前,JavaScript 已经成为一种可以增强整个应用程序能力的工具。 他们中的一些开始运行,他们参与了广告或促进广告商使用的跟踪。 所以,在你的浏览器中有很多事情是看不见的,你甚至可能不知道或者不想发生。 **23:07** - _Saron Yitbarek_ 因此,Klint 决定进行一些实验。 **23:10** - _Klint Finley_ 我决定试着在没有 JavaScript 的情况下使用 web 一段时间。 我决定试一试,花了一周时间禁用浏览器中的 JavaScript。 **23:21** - _Saron Yitbarek_ 听起来很简单,但是放弃所有 JavaScript 产生了一些令人惊讶的效果。 因为 JavaScript 已经变得如此庞大,如此全方位耗费资源,这种以轻量级著称的语言现在实际上占用了大量的空间和精力。 当 Klint 屏蔽了那一种语言时才发现... 
**23:39** - _Klint Finley_ 总体而言,这在很多方面都是一种更好的 Web 体验,比如页面加载更快,页面更干净,我电脑的电池续航时间更长,并且我对电脑上发生的事情有了更多的控制感,因为并不是所有这些奇怪的、 看不见的随机程序都在后台运行。 **24:02** - _Saron Yitbarek_ 想象一下第一次没有弹出式广告的生活是多么幸福。 **24:07** - _Klint Finley_ 它在很大程度上依赖于 JavaScript 来加载。 所以网页变得简单多了,广告少了,干扰也少了。 **24:17** - _Saron Yitbarek_ 不过,这种整洁的 web 体验并不是全部。 如果你拔掉 JavaScript 的插头,Web 的某些部分就完全不能工作了。 **24:26** - _Klint Finley_ 很多内容都不能正常运行了。 我想 Gmail 把我重定向到了一个为旧手机设计的差异版本。 Facebook 做了同样的事情,很多流畅的互动没有了。 它变得更像是一系列的网页。 因此, Netflix 无法正常工作。 YouTube 无法正常运行。 是的,任何非常依赖互动的东西都不能运行了。 拿掉了 JavaScript,有好处也有坏处,最终我不得不做出抉择,有 JavaScript 总比什么都没有要好。 **25:05** - _Saron Yitbarek_ Klint Finley 是 Wired.com 的撰稿人。 大多数人预测 JavaScript 只会继续主导移动和桌面应用程序开发。 像基于浏览器的游戏、 基于浏览器的艺术项目等等,它们的复杂程度正在飞涨。 不断增长的 JavaScript 社区正在最大限度地利用这一潜力。 **25:34** - _Saron Yitbarek_ 值得回想一下,就在 1995 年,就在几十年前,Brendan Eich 坐在一个房间里,设计出一门新的语言。 今天,这种语言渗透到我们所做的每一件事中。 说一些新的代码串将改变世界听起来有点陈词滥调,但它确实发生了。 一位代码英雄将他对语言的所有热爱汇聚到 10 天的冲刺中,世界的 DNA 也将永远改变。 **26:10** - _Saron Yitbarek_ 我们可以为 Google Docs、 YouTube 和 Netflix 感谢 JavaScript。 但是您知道,“能力越大,责任越大”,随着 JavaScript 的影响力在大量开源库的推动下不断增长,责任不再仅仅落在一个人身上了。 范围更广的社区已经掌握了控制权。 SlashData 最近估计 JavaScript 开发人员的数量为 970 万,在 GitHub 上,JavaScript 比任何其他语言都有更多的 PR (Pull Requests) 。 力量在于整个世界的 Command Line Heroes,一直在帮助 JavaScript 在我们发展未来的过程中成长。 **26:59** - _Saron Yitbarek_ 下一期,Command Line Heroes 将遇到另外一种 web 语言,我们将探索 Perl 是如何在一个广阔的新领域蓬勃发展的。 **28:04** - _Saron Yitbarek_ 最后,一位听众分享了我们上一季的 Hello World 插曲,我们还谈到了 Brendan Eich 和 JavaScript。 在那一期,一位客人说,在那 10 天里,布兰登可能没有睡过多少觉,如果有的话,也是很少。 Brendan 在推特上回应说,他确实在那次冲刺过程中睡过觉。 想要更多地了解这 10 天发生了什么,请查看 Devchat 对 Brendan 的采访播客。 我们会在我们的节目记录里加个链接。 我是 Saron Yitbarek。 直到下期,请继续编码。 -------------------------------------------------------------------------------- via: https://www.redhat.com/en/command-line-heroes/season-3/creating-javascript 作者:[Red Hat][a] 选题:[bestony][b] 译者:[gxlct008](https://github.com/gxlct008) 校对:[windgeek](https://github.com/windgeek) 本文由 [LCRH](https://github.com/LCTT/LCRH) 原创编译,[Linux中国](https://linux.cn/) 荣誉推出 [a]: 
https://www.redhat.com/en/command-line-heroes [b]: https://github.com/bestony
39.045016
362
0.791402
yue_Hant
0.613609
fac1699441834378b54ff33d122d3306ccb59833
8,115
md
Markdown
README.md
vincentsarago/cog-translator
46499244b81bb5df044e90c47cceb4a5daeeecaa
[ "BSD-2-Clause" ]
5
2018-11-04T14:18:01.000Z
2019-11-14T08:05:10.000Z
README.md
vincentsarago/cog-translator
46499244b81bb5df044e90c47cceb4a5daeeecaa
[ "BSD-2-Clause" ]
1
2018-11-02T15:22:11.000Z
2018-11-02T15:22:11.000Z
README.md
vincentsarago/cog-translator
46499244b81bb5df044e90c47cceb4a5daeeecaa
[ "BSD-2-Clause" ]
1
2021-01-30T14:40:21.000Z
2021-01-30T14:40:21.000Z
cog_translator ============== [`ecs-watchbot-fargate`](https://github.com/vincentsarago/ecs-watchbot-fargate) example to create a simple/self-managed processing pipeline using ECS Fargate. `ecs-watchbot-fargate` is based on `@mapbox/ecs-watchbot`; please refer to https://github.com/mapbox/ecs-watchbot for more information. Install and Deploy ------------------ **prerequisites** - `cloudformation-kms-production` deployed according to the instructions in [cloudformation-kms](https://github.com/mapbox/cloudformation-kms). Makes encryption of sensitive environment variables that need to be passed to ECS simple using [cfn-config](https://github.com/mapbox/cfn-config). - Install `@mapbox/cfn-config` and follow https://github.com/mapbox/cfn-config#prerequisites ```bash $ npm install -g @mapbox/cfn-config ``` **Deploy** - Create a Fargate Cluster [link](link) - add the docker image to your ECR repository - edit [Makefile](https://github.com/vincentsarago/cog-translator/blob/master/Makefile) (update service region, name, or version) - `make push` - Install dependencies ```bash $ npm install ``` - deploy ```bash $ cfn-config create production cloudformation/cog-translator.template.js -c mybucket-configs ? AlarmEmail. Email address: contact@remotepixel.ca ? Bucket. Bucket to grant stack PUT access to: remotepixel-pub ? Cluster. Cluster name or ARN: fargate-cluster ? CpuAllocation. CPU allocation: 2048 ? ErrorThreshold. Error threshold for alert: 10 ? ImageVersion: 1.0.0 ? MemoryAllocation. Memory allocation (Mb): 12288 ? minSize. Minimum workers size.: 0 ? maxSize. Maximum worker size: 10 ``` Use --- Once your stack is up and running we can send messages to the AWS SQS queue. By design ecs-watchbot will scale your Fargate tasks up and down depending on how many messages are left in the queue (an AWS Lambda runs every 5 min to check the SQS queue). 
We first need to get the SNS topic for our cog-translator-production stack ```bash $ aws cloudformation describe-stacks --stack-name cog-translator-production | jq -r '.Stacks[0].Outputs[] | select(.OutputKey == "SnsTopic") | .OutputValue' arn:aws:sns:{REGION}:{MY-AWS-ACCOUNT-ID}:cog-translator-production-WatchbotTopic-{STACK-VERSION} ``` ```bash $ python scripts/feed.py https://my-url.tif \ --bucket my-bucket \ --key dir/dir/my-cog.tif --topic arn:aws:sns:{REGION}:{MY-AWS-ACCOUNT-ID}:cog-translator-production-WatchbotTopic-{STACK-VERSION} ``` Example ------- In this example we'll convert huge GeoTIFFs from the DG opendata repository to COGs. ###### 1. Get file list ```python import requests import bs4 as BeautifulSoup url = 'https://www.digitalglobe.com/opendata/super-typhoon-yutu/post-event' # Read Page r = requests.get(url) # Use BeautifulSoup to parse and extract all imagery links soup = BeautifulSoup.BeautifulSoup(r.text) s = soup.findAll('a',attrs={"class":"opendata__tilelinks"}) list_file = list(set([l.get('href') for l in s if not l.get('href').endswith('ovr')])) with open('list_dg_yutu.txt', 'w') as f: f.write('\n'.join(list_file)) ``` ```bash $ cat list_dg_yutu.txt http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313220.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313030.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313230.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313001.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313203.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313012.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313213.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313221.tif 
http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313020.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313000.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313031.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313022.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313212.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313010.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313013.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313011.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313021.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313202.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313200.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313023.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313231.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313033.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313201.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313210.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313002.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313003.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313211.tif http://opendata.digitalglobe.com/cyclone-yutu/post-event/2018-10-26/10400100458B5A00/1313032.tif ``` ###### 2. 
Get stack topic ```bash $ topic=$(aws cloudformation describe-stacks --stack-name cog-translator-production | jq -r '.Stacks[0].Outputs[] | select(.OutputKey == "SnsTopic") | .OutputValue') ``` ###### 3. Send work to our stack ```bash $ bucket=opendata.remotepixel.ca $ prefix=dg_post_yutu/ $ cat list_dg_yutu.txt | while read line; do date=$(echo $line | cut -d'/' -f6 | sed 's/-/_/g'); bname=$(echo $line | cut -d'/' -f8); python scripts/feed.py $line --bucket $bucket --key $prefix$date/$bname --topic $topic; done ``` ###### 4. Get a :coffee: ###### 5. Data overview Converting the tifs to COGs we created a set of data that weighs 421 MiB, far from the original ~29 GB hosted by DG. ```bash $ aws s3 ls s3://opendata.remotepixel.ca/dg_post_yutu/20181026/ --human --summarize 2018-11-02 23:49:57 3.4 MiB 1313000.tif 2018-11-02 23:49:57 3.4 MiB 1313001.tif 2018-11-02 23:49:57 5.1 MiB 1313002.tif 2018-11-02 23:49:57 16.9 MiB 1313003.tif 2018-11-02 23:49:57 3.4 MiB 1313010.tif 2018-11-02 23:49:57 4.0 MiB 1313011.tif 2018-11-02 23:49:57 18.0 MiB 1313012.tif 2018-11-02 23:49:57 18.3 MiB 1313013.tif 2018-11-02 23:49:57 8.8 MiB 1313020.tif 2018-11-02 23:49:57 30.3 MiB 1313021.tif 2018-11-02 23:49:57 7.4 MiB 1313022.tif 2018-11-02 23:49:57 26.2 MiB 1313023.tif 2018-11-02 23:49:57 22.3 MiB 1313030.tif 2018-11-02 23:49:58 17.7 MiB 1313031.tif 2018-11-02 23:49:58 20.3 MiB 1313033.tif 2018-11-02 23:49:58 5.3 MiB 1313200.tif 2018-11-02 23:49:58 26.1 MiB 1313201.tif 2018-11-02 23:49:58 6.6 MiB 1313202.tif 2018-11-02 23:49:59 24.5 MiB 1313203.tif 2018-11-02 23:49:59 33.4 MiB 1313210.tif 2018-11-02 23:49:59 22.7 MiB 1313211.tif 2018-11-02 23:49:59 29.1 MiB 1313212.tif 2018-11-02 23:50:00 21.7 MiB 1313213.tif 2018-11-02 23:50:00 6.2 MiB 1313220.tif 2018-11-02 23:50:00 18.6 MiB 1313221.tif 2018-11-02 23:50:00 10.7 MiB 1313230.tif 2018-11-02 23:50:00 10.8 MiB 1313231.tif Total Objects: 27 Total Size: 421.3 MiB ``` ###### 6. 
Use the data ```bash $ pip install rio-glui $ rio glui https://s3.amazonaws.com/opendata.remotepixel.ca/dg_post_yutu/20181026/1313021.tif ``` ![](https://user-images.githubusercontent.com/10407788/47956729-a234a580-df7f-11e8-9e22-f332bb348459.jpg)
42.046632
290
0.751325
yue_Hant
0.208954
fac1c689a0e5aaa6c0d51544a3d9ea4e89ef402e
80
md
Markdown
changelog.md
silvercommerce/currency-switcher
458be6acfd0814507147f25758a9e35c72311c06
[ "BSD-3-Clause" ]
null
null
null
changelog.md
silvercommerce/currency-switcher
458be6acfd0814507147f25758a9e35c72311c06
[ "BSD-3-Clause" ]
4
2018-07-01T20:35:28.000Z
2018-07-01T21:52:48.000Z
changelog.md
silvercommerce/currency-switcher
458be6acfd0814507147f25758a9e35c72311c06
[ "BSD-3-Clause" ]
null
null
null
# Log of changes for CurrencySwitcher module ## 1.0.0 * First initial release
13.333333
44
0.7375
eng_Latn
0.965225
fac22e0d0b99f923ee506b1fdde492a892619dca
120
md
Markdown
README.md
mohahf19/mohahf19.github.io
4d9559ba0bc101680a38aee8a2fc0ffad3f114c0
[ "MIT" ]
null
null
null
README.md
mohahf19/mohahf19.github.io
4d9559ba0bc101680a38aee8a2fc0ffad3f114c0
[ "MIT" ]
null
null
null
README.md
mohahf19/mohahf19.github.io
4d9559ba0bc101680a38aee8a2fc0ffad3f114c0
[ "MIT" ]
null
null
null
# Personal Website Welcome to my personal website. Now, if you excuse me, I have to figure out where everything goes..
30
99
0.766667
eng_Latn
0.998326
fac23f1a70c1298147e8cb6f8c57b825100d31b5
308
md
Markdown
README.md
hadrons/generator-react-mobx-tools
83095dad85922f9797accab15d557346015d7235
[ "MIT" ]
null
null
null
README.md
hadrons/generator-react-mobx-tools
83095dad85922f9797accab15d557346015d7235
[ "MIT" ]
null
null
null
README.md
hadrons/generator-react-mobx-tools
83095dad85922f9797accab15d557346015d7235
[ "MIT" ]
null
null
null
# React + mobx + flow generator tools Minimal tool for modules and components creation. This is super specific for the stack we use at Hadrons. :) ## Usage ```bash $ yo react-mobx-tools $ yo react-mobx-tools:add-module $ yo react-mobx-tools:add-component $ yo react-mobx-tools:add-stateless-component ```
22
58
0.74026
eng_Latn
0.947942
fac2994fa2013e00ed725c6819490d74e51bfdf2
3,146
md
Markdown
_posts/2018-10-01-Download-nhs-math-and-literature-tests.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
_posts/2018-10-01-Download-nhs-math-and-literature-tests.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
_posts/2018-10-01-Download-nhs-math-and-literature-tests.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download Nhs math and literature tests book An English translation has since industry. Movement became ground. Marine had likewise distributed Italian orders to some of the _Vega_ "Yes, his father Suleiman Shah kept strait watch nhs math and literature tests the child and cherished him and named him Melik Shah, whom I motor homes and travel trailers. Otherwise, acquisition! In the end, he can were going to use it in Jackpot. " saw one before?" way as the Studebaker rolled forward, I'll life-affirming music. With one more than makes up for all the inconveniences. " over there or anything. Or aren't you friends anymore?" reflection but the face of a young woman. ); the last-named is the most common. When they joined us, he snatched his hand back, are you sure you can afford this?" on business granted to Europeans? Bartholomew?" he asked sleepily. Yet Junior must endure this their ideas of what is clean or unclean differs considerably from the Kargish king wear Morred's ring," the Queen Mother said. The Mad Lover ccccxi he parties, perhaps. Driscoll and Sirocco remained with Wellington in the corridor. Chevy to its limits. Thingy's pissed, "How shall we do with this youth, _Kljautlj_, but not always to others, and stood motionless again, I highly recommend Culture of Death: The Assault on Medical Ethics in America by "Oh. thus neither indented with deep fjords surrounded with high course and tentatively approaches. The egg will have to be implanted into a foreign womb and that, the Organizer outlined what we were striking for, beat him again. txt press for money, talk about prowled, then stepped out into nhs math and literature tests hall. " Having finished her sandwich They call this the Otter's House," he said. Worse In the confusion of Otter's mind, head next to the door, it slipped into the tight curve of his curled forefinger, you nhs math and literature tests. 
With high fences and nhs math and literature tests of Indian laurels constant employment in killing foxes and at other work. When the Dixie Chicks followed Brooks, "I wasn't scared of a dumb old spider," Angel insisted in her own voice, while Stormbel and the officers marched down the main aisle to the center of the floor and nhs math and literature tests to face the Congress from in front of where Wellesley was still standing, acquainted her with his case and that wherein he was of puissance and delight and majesty and venerance and loftiness of rank. hardened snow. After some time, they say so will the Archmage be one returned from death? "She only comes to dance? At Tom Vanadium's request, but she wasn't able to get to her feet to reach the CLOUDS SWARMED THE late-afternoon sun. And thus he answers: "I'm being Curtis Hammond. " (33) Quoth he, before we were ten. In the course of it she stopped calling me Mr. "So! too, healing, or you are left with no one to trust. site of the Gimp's grave in Montana, hitching around the ranch in The Nhs math and literature tests McCoys, kindled the lamp and went round about the house with the little one. Magnified thirty to forty times.
349.555556
3,043
0.78862
eng_Latn
0.999909
fac405820ae2378507b8faa9e38d3957e6f6881d
1,607
markdown
Markdown
src/code-challenges/C/candles/readme.markdown
nejcm/js-algorithms
13538820b3d2ea92a8ddd32842289191772a059a
[ "MIT" ]
2
2020-06-14T17:33:51.000Z
2022-03-26T00:37:49.000Z
src/code-challenges/C/candles/readme.markdown
nejcm/js-algorithms
13538820b3d2ea92a8ddd32842289191772a059a
[ "MIT" ]
null
null
null
src/code-challenges/C/candles/readme.markdown
nejcm/js-algorithms
13538820b3d2ea92a8ddd32842289191772a059a
[ "MIT" ]
null
null
null
When a candle finishes burning it leaves a leftover. `makeNew` leftovers can be combined to make a new candle, which, when burning down, will in turn leave another leftover. You have `candlesNumber` candles in your possession. What's the total number of candles you can burn, assuming that you create new candles as soon as you have enough leftovers? Example For `candlesNumber = 5` and `makeNew = 2`, the output should be `candles(candlesNumber, makeNew) = 9`. Here is what you can do to burn `9` candles: - burn `5` candles, obtain `5` leftovers; - create `2` more candles, using `4` leftovers (`1` leftover remains); - burn `2` candles, end up with `3` leftovers; - create another candle using `2` leftovers (`1` leftover remains); - burn the created candle, which gives another leftover (`2` leftovers in total); - create a candle from the remaining leftovers; - burn the last candle. Thus, you can burn `5 + 2 + 1 + 1 = 9` candles, which is the answer. Input/Output - **\[execution time limit\] 4 seconds (js)** - **\[input\] integer candlesNumber** The number of candles you have in your possession. _Guaranteed constraints:_ `1 ≤ candlesNumber ≤ 15`. - **\[input\] integer makeNew** The number of leftovers that you can use up to create a new candle. _Guaranteed constraints:_ `2 ≤ makeNew ≤ 5`. - **\[output\] integer** **\[JavaScript (ES6)\] Syntax Tips** // Prints help message to the console // Returns a string function helloWorld(name) { console.log("This prints to the console when you Run Tests"); return "Hello, " + name; }
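The simulation above can be sketched in Python (a hedged port — the readme's harness is JavaScript, and the function name `candles` simply mirrors the challenge signature):

```python
def candles(candles_number, make_new):
    """Total candles burned when every `make_new` leftovers form a new candle."""
    burned = 0
    leftovers = 0
    while candles_number > 0:
        burned += candles_number      # burn everything on hand
        leftovers += candles_number   # each burned candle leaves one leftover
        # combine leftovers into new candles; the remainder carries over
        candles_number, leftovers = divmod(leftovers, make_new)
    return burned

print(candles(5, 2))  # -> 9, matching the walkthrough above
```

Each iteration burns the current stock, then `divmod` converts leftovers into the next batch, which follows the step-by-step walkthrough in the example exactly.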
30.903846
88
0.696951
eng_Latn
0.995224
fac41a7f6be3a566c01f007249ceaa3da67ac812
1,201
md
Markdown
_posts/2016-11-27-so-sassi-2015-lola.md
mightts/mightts.github.io
4c47706b48c2b23ff1694b666225249fb3ae798c
[ "MIT" ]
null
null
null
_posts/2016-11-27-so-sassi-2015-lola.md
mightts/mightts.github.io
4c47706b48c2b23ff1694b666225249fb3ae798c
[ "MIT" ]
null
null
null
_posts/2016-11-27-so-sassi-2015-lola.md
mightts/mightts.github.io
4c47706b48c2b23ff1694b666225249fb3ae798c
[ "MIT" ]
null
null
null
--- layout: post date: '2016-11-27' title: "So Sassi - 2015 - Lola" category: So Sassi tags: ["pretty","made","formal","custom","dresses"] image: http://img.metalkind.com/57462-thickbox_default/so-sassi-2015-lola.jpg --- So Sassi - 2015 - Lola Price: **$319.99** <a href="https://www.metalkind.com/en/so-sassi/15454-so-sassi-2015-lola.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/57462-thickbox_default/so-sassi-2015-lola.jpg" alt="So Sassi - 2015 - Lola 0" /></a> <a href="https://www.metalkind.com/en/so-sassi/15454-so-sassi-2015-lola.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/57463-thickbox_default/so-sassi-2015-lola.jpg" alt="So Sassi - 2015 - Lola 1" /></a> <a href="https://www.metalkind.com/en/so-sassi/15454-so-sassi-2015-lola.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/57464-thickbox_default/so-sassi-2015-lola.jpg" alt="So Sassi - 2015 - Lola 2" /></a> Buy it: [So Sassi - 2015 - Lola](https://www.metalkind.com/en/so-sassi/15454-so-sassi-2015-lola.html "So Sassi - 2015 - Lola") View more: [So Sassi](https://www.metalkind.com/en/183-so-sassi "So Sassi")
70.647059
241
0.701082
kor_Hang
0.072873
fac5071bf1c41a9f42c11a6635a872b3086af2ad
874
md
Markdown
README.md
pedroknup/pymakr-spider
b4785ad6df47e8bbdd24d13c6a0acb520d6e79bc
[ "MIT" ]
null
null
null
README.md
pedroknup/pymakr-spider
b4785ad6df47e8bbdd24d13c6a0acb520d6e79bc
[ "MIT" ]
3
2020-06-25T15:34:02.000Z
2020-06-25T15:34:03.000Z
README.md
pedroknup/pymakr-spider
b4785ad6df47e8bbdd24d13c6a0acb520d6e79bc
[ "MIT" ]
null
null
null
# Pymakr Spider This application was built to track the download counters of the Pymakr extension. It's a web crawler/spider bot that opens a virtual browser in memory and navigates to Pymakr's pages on the Microsoft and Atom marketplaces, since they don't provide an open API. The counts are stored by default in the history.txt file, and the app compares today's numbers with the most recent saved day. If the last rows of history.txt are * 03/06/2020;95916;33970; * 03/06/2020;95931;33978; * 22/06/2020;97324;35041; * 22/06/2020;97332;35098; and today's (25/06/2020) record is * 25/06/2020;97567;35276; the output would be ![Output](https://i.imgur.com/1XD07D4.png) Note that only the latest record of the latest date is considered. ### To Do - [ ] Create a Cron Job to run this app on a daily basis. - [ ] Store the data on AWS - [ ] Chart representation?
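A minimal Python sketch of the comparison described above (hedged: the helper name `latest_per_day` and the `date;vscode;atom;` column order are assumptions read off the sample rows, not the app's actual code):

```python
def latest_per_day(lines):
    """Keep only the last record for each date, as the readme notes."""
    latest = {}
    for line in lines:
        date, vscode, atom = line.strip().strip(";").split(";")
        latest[date] = (int(vscode), int(atom))  # later rows overwrite earlier ones
    return latest

history = [
    "03/06/2020;95916;33970;",
    "03/06/2020;95931;33978;",
    "22/06/2020;97324;35041;",
    "22/06/2020;97332;35098;",
]
today = "25/06/2020;97567;35276;"

records = latest_per_day(history)
prev_vscode, prev_atom = records["22/06/2020"]
_, t_vscode, t_atom = today.strip(";").split(";")
print(int(t_vscode) - prev_vscode, int(t_atom) - prev_atom)  # -> 235 178
```

Only the final row of each date survives the dictionary overwrite, reproducing the "latest record of the latest date" behavior.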
28.193548
300
0.737986
eng_Latn
0.993788
fac515886e6cbad7bce4b16d1d6efdae341dd1be
780
md
Markdown
README.md
evetion/StarTIN.jl
4ed47962053d468cf6b86df874ec3025c8151706
[ "MIT" ]
3
2021-03-08T21:02:14.000Z
2021-03-10T15:51:33.000Z
README.md
evetion/StarTIN.jl
4ed47962053d468cf6b86df874ec3025c8151706
[ "MIT" ]
2
2021-03-11T20:26:30.000Z
2021-12-24T13:41:58.000Z
README.md
evetion/StarTIN.jl
4ed47962053d468cf6b86df874ec3025c8151706
[ "MIT" ]
null
null
null
# StarTIN [![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://evetion.github.io/StarTIN.jl/stable) [![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://evetion.github.io/StarTIN.jl/dev) [![Build Status](https://github.com/evetion/StarTIN.jl/workflows/CI/badge.svg)](https://github.com/evetion/StarTIN.jl/actions) A Julia wrapper around the Delaunay triangulator [startin](https://github.com/hugoledoux/startin) written in Rust by @hugoledoux. # Install ```julia ]add StarTIN ``` # Usage ```julia using StarTIN t = DT() points = rand(3,100) insert!(t, points) value = interpolate_linear(t, 0.5, 0.5) write!("test.obj", t) ``` # TODO - Add support for DT options, such as random walking or not - Add support for retrieving all stars/triangles
26
129
0.724359
yue_Hant
0.422573
fac7c12cb26d6db94ae1449b360c8f9347693f3e
5,417
md
Markdown
cloud/azure/mysql/README.md
osodevops/terraform-datadog-monitors
dd52d7f17b28733b713ba1299c48058197f2023f
[ "Apache-2.0" ]
null
null
null
cloud/azure/mysql/README.md
osodevops/terraform-datadog-monitors
dd52d7f17b28733b713ba1299c48058197f2023f
[ "Apache-2.0" ]
null
null
null
cloud/azure/mysql/README.md
osodevops/terraform-datadog-monitors
dd52d7f17b28733b713ba1299c48058197f2023f
[ "Apache-2.0" ]
null
null
null
# CLOUD AZURE MYSQL DataDog monitors ## How to use this module ```hcl module "datadog-monitors-cloud-azure-mysql" { source = "claranet/monitors/datadog//cloud/azure/mysql" version = "{revision}" environment = var.environment message = module.datadog-message-alerting.alerting-message } ``` ## Purpose Creates DataDog monitors with the following checks: - Mysql Server CPU usage - Mysql Server IO consumption - Mysql Server memory usage - Mysql Server storage ## Inputs | Name | Description | Type | Default | Required | |------|-------------|------|---------|:-----:| | cpu\_usage\_enabled | Flag to enable Mysql status monitor | `string` | `"true"` | no | | cpu\_usage\_extra\_tags | Extra tags for Mysql status monitor | `list(string)` | `[]` | no | | cpu\_usage\_message | Custom message for Mysql CPU monitor | `string` | `""` | no | | cpu\_usage\_threshold\_critical | Mysql CPU usage in percent (critical threshold) | `string` | `"90"` | no | | cpu\_usage\_threshold\_warning | Mysql CPU usage in percent (warning threshold) | `string` | `"80"` | no | | cpu\_usage\_time\_aggregator | Monitor aggregator for Mysql CPU [available values: min, max or avg] | `string` | `"min"` | no | | cpu\_usage\_timeframe | Monitor timeframe for Mysql CPU [available values: `last_#m` (1, 5, 10, 15, or 30), `last_#h` (1, 2, or 4), or `last_1d`] | `string` | `"last_15m"` | no | | environment | Architecture environment | `string` | n/a | yes | | evaluation\_delay | Delay in seconds for the metric evaluation | `number` | `900` | no | | filter\_tags\_custom | Tags used for custom filtering when filter\_tags\_use\_defaults is false | `string` | `"*"` | no | | filter\_tags\_use\_defaults | Use default filter tags convention | `string` | `"true"` | no | | free\_storage\_enabled | Flag to enable Mysql status monitor | `string` | `"true"` | no | | free\_storage\_extra\_tags | Extra tags for Mysql status monitor | `list(string)` | `[]` | no | | free\_storage\_message | Custom message for Mysql Free 
Storage monitor | `string` | `""` | no | | free\_storage\_threshold\_critical | Mysql Free Storage remaining in percent (critical threshold) | `string` | `"10"` | no | | free\_storage\_threshold\_warning | Mysql Free Storage remaining in percent (warning threshold) | `string` | `"20"` | no | | free\_storage\_time\_aggregator | Monitor aggregator for Mysql Free Storage [available values: min, max or avg] | `string` | `"min"` | no | | free\_storage\_timeframe | Monitor timeframe for Mysql Free Storage [available values: `last_#m` (1, 5, 10, 15, or 30), `last_#h` (1, 2, or 4), or `last_1d`] | `string` | `"last_15m"` | no | | io\_consumption\_enabled | Flag to enable Mysql status monitor | `string` | `"true"` | no | | io\_consumption\_extra\_tags | Extra tags for Mysql status monitor | `list(string)` | `[]` | no | | io\_consumption\_message | Custom message for Mysql IO consumption monitor | `string` | `""` | no | | io\_consumption\_threshold\_critical | Mysql IO consumption in percent (critical threshold) | `string` | `"90"` | no | | io\_consumption\_threshold\_warning | Mysql IO consumption in percent (warning threshold) | `string` | `"80"` | no | | io\_consumption\_time\_aggregator | Monitor aggregator for Mysql IO consumption [available values: min, max or avg] | `string` | `"min"` | no | | io\_consumption\_timeframe | Monitor timeframe for Mysql IO consumption [available values: `last_#m` (1, 5, 10, 15, or 30), `last_#h` (1, 2, or 4), or `last_1d`] | `string` | `"last_15m"` | no | | memory\_usage\_enabled | Flag to enable Mysql status monitor | `string` | `"true"` | no | | memory\_usage\_extra\_tags | Extra tags for Mysql status monitor | `list(string)` | `[]` | no | | memory\_usage\_message | Custom message for Mysql memory monitor | `string` | `""` | no | | memory\_usage\_threshold\_critical | Mysql memory usage in percent (critical threshold) | `string` | `"90"` | no | | memory\_usage\_threshold\_warning | Mysql memory usage in percent (warning threshold) | 
`string` | `"80"` | no | | memory\_usage\_time\_aggregator | Monitor aggregator for Mysql memory [available values: min, max or avg] | `string` | `"min"` | no | | memory\_usage\_timeframe | Monitor timeframe for Mysql memory [available values: `last_#m` (1, 5, 10, 15, or 30), `last_#h` (1, 2, or 4), or `last_1d`] | `string` | `"last_15m"` | no | | message | Message sent when an alert is triggered | `any` | n/a | yes | | new\_host\_delay | Delay in seconds before monitor new resource | `number` | `300` | no | | notify\_no\_data | Will raise no data alert if set to true | `bool` | `true` | no | | prefix\_slug | Prefix string to prepend between brackets on every monitors names | `string` | `""` | no | ## Outputs | Name | Description | |------|-------------| | mysql\_cpu\_usage\_id | id for monitor mysql\_cpu\_usage | | mysql\_free\_storage\_id | id for monitor mysql\_free\_storage | | mysql\_io\_consumption\_id | id for monitor mysql\_io\_consumption | | mysql\_memory\_usage\_id | id for monitor mysql\_memory\_usage | ## Related documentation DataDog documentation: [https://docs.datadoghq.com/integrations/azure/](https://docs.datadoghq.com/integrations/azure/) You have to search `mysql` Azure Database for MySQL servers metrics documentation: [https://docs.microsoft.com/en-us/azure/mysql/concepts-monitoring](https://docs.microsoft.com/en-us/azure/mysql/concepts-monitoring)
66.060976
196
0.686542
eng_Latn
0.547947
fac842ab76ca7f298045d6294ebf01d4ef8a62f4
222
md
Markdown
README.md
zyhfish/Dnn.AdminExperience.Library
45ced692f0f791e676886215a0eb66258575eec2
[ "MIT" ]
null
null
null
README.md
zyhfish/Dnn.AdminExperience.Library
45ced692f0f791e676886215a0eb66258575eec2
[ "MIT" ]
null
null
null
README.md
zyhfish/Dnn.AdminExperience.Library
45ced692f0f791e676886215a0eb66258575eec2
[ "MIT" ]
null
null
null
[![Build status](https://ci.appveyor.com/api/projects/status/6bsoj5n67m2r9s2b?svg=true)](https://ci.appveyor.com/project/DnnAutomation/dnn-adminexperience-library) # Dnn.PersonaBar Library Installable Persona Bar Library
44.4
163
0.810811
yue_Hant
0.312945
fac8729b46ae77d6c37a96f88615ce538fa259eb
3,052
md
Markdown
leetcode/01100-01199/01195_fizz-buzz-multithreaded/README.md
geekhall/algorithms
7dfab1e952e4b1b3ae63ec1393fd481b8bf4af86
[ "MIT" ]
null
null
null
leetcode/01100-01199/01195_fizz-buzz-multithreaded/README.md
geekhall/algorithms
7dfab1e952e4b1b3ae63ec1393fd481b8bf4af86
[ "MIT" ]
null
null
null
leetcode/01100-01199/01195_fizz-buzz-multithreaded/README.md
geekhall/algorithms
7dfab1e952e4b1b3ae63ec1393fd481b8bf4af86
[ "MIT" ]
null
null
null
# 01195. Fizz Buzz Multithreaded _Read this in other languages:_ [_简体中文_](README.zh-CN.md) <p>You have the four functions:</p> <ul> <li><code>printFizz</code> that prints the word <code>&quot;Fizz&quot;</code> to the console,</li> <li><code>printBuzz</code> that prints the word <code>&quot;Buzz&quot;</code> to the console,</li> <li><code>printFizzBuzz</code> that prints the word <code>&quot;FizzBuzz&quot;</code> to the console, and</li> <li><code>printNumber</code> that prints a given integer to the console.</li> </ul> <p>You are given an instance of the class <code>FizzBuzz</code> that has four functions: <code>fizz</code>, <code>buzz</code>, <code>fizzbuzz</code> and <code>number</code>. The same instance of <code>FizzBuzz</code> will be passed to four different threads:</p> <ul> <li><strong>Thread A:</strong> calls <code>fizz()</code> that should output the word <code>&quot;Fizz&quot;</code>.</li> <li><strong>Thread B:</strong> calls <code>buzz()</code> that should output the word <code>&quot;Buzz&quot;</code>.</li> <li><strong>Thread C:</strong> calls <code>fizzbuzz()</code> that should output the word <code>&quot;FizzBuzz&quot;</code>.</li> <li><strong>Thread D:</strong> calls <code>number()</code> that should only output the integers.</li> </ul> <p>Modify the given class to output the series <code>[1, 2, &quot;Fizz&quot;, 4, &quot;Buzz&quot;, ...]</code> where the <code>i<sup>th</sup></code> token (<strong>1-indexed</strong>) of the series is:</p> <ul> <li><code>&quot;FizzBuzz&quot;</code> if <code>i</code> is divisible by <code>3</code> and <code>5</code>,</li> <li><code>&quot;Fizz&quot;</code> if <code>i</code> is divisible by <code>3</code> and not <code>5</code>,</li> <li><code>&quot;Buzz&quot;</code> if <code>i</code> is divisible by <code>5</code> and not <code>3</code>, or</li> <li><code>i</code> if <code>i</code> is not divisible by <code>3</code> or <code>5</code>.</li> </ul> <p>Implement the <code>FizzBuzz</code> class:</p> <ul> 
<li><code>FizzBuzz(int n)</code> Initializes the object with the number <code>n</code> that represents the length of the sequence that should be printed.</li> <li><code>void fizz(printFizz)</code> Calls <code>printFizz</code> to output <code>&quot;Fizz&quot;</code>.</li> <li><code>void buzz(printBuzz)</code> Calls <code>printBuzz</code> to output <code>&quot;Buzz&quot;</code>.</li> <li><code>void fizzbuzz(printFizzBuzz)</code> Calls <code>printFizzBuzz</code> to output <code>&quot;FizzBuzz&quot;</code>.</li> <li><code>void number(printNumber)</code> Calls <code>printnumber</code> to output the numbers.</li> </ul> <p>&nbsp;</p> <p><strong>Example 1:</strong></p> <pre><strong>Input:</strong> n = 15 <strong>Output:</strong> [1,2,"fizz",4,"buzz","fizz",7,8,"fizz","buzz",11,"fizz",13,14,"fizzbuzz"] </pre><p><strong>Example 2:</strong></p> <pre><strong>Input:</strong> n = 5 <strong>Output:</strong> [1,2,"fizz",4,"buzz"] </pre> <p>&nbsp;</p> <p><strong>Constraints:</strong></p> <ul> <li><code>1 &lt;= n &lt;= 50</code></li> </ul>
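One way to realize the coordination above is with semaphores; this Python sketch is only an illustration (LeetCode supplies per-language templates, and other synchronization schemes work equally well). The `number` thread drives the sequence and wakes whichever worker owns index `i`:

```python
import threading

class FizzBuzz:
    def __init__(self, n):
        self.n = n
        self.fizz_gate = threading.Semaphore(0)
        self.buzz_gate = threading.Semaphore(0)
        self.fizzbuzz_gate = threading.Semaphore(0)
        self.number_gate = threading.Semaphore(1)  # i = 1 is a plain number

    def fizz(self, print_fizz):
        for i in range(3, self.n + 1, 3):
            if i % 5:                      # skip multiples of 15
                self.fizz_gate.acquire()
                print_fizz()
                self.number_gate.release()

    def buzz(self, print_buzz):
        for i in range(5, self.n + 1, 5):
            if i % 3:                      # skip multiples of 15
                self.buzz_gate.acquire()
                print_buzz()
                self.number_gate.release()

    def fizzbuzz(self, print_fizzbuzz):
        for i in range(15, self.n + 1, 15):
            self.fizzbuzz_gate.acquire()
            print_fizzbuzz()
            self.number_gate.release()

    def number(self, print_number):
        # Drives the sequence: either prints i or releases the right worker.
        for i in range(1, self.n + 1):
            self.number_gate.acquire()
            if i % 15 == 0:
                self.fizzbuzz_gate.release()
            elif i % 3 == 0:
                self.fizz_gate.release()
            elif i % 5 == 0:
                self.buzz_gate.release()
            else:
                print_number(i)
                self.number_gate.release()

out = []
fb = FizzBuzz(15)
threads = [
    threading.Thread(target=fb.fizz, args=(lambda: out.append("fizz"),)),
    threading.Thread(target=fb.buzz, args=(lambda: out.append("buzz"),)),
    threading.Thread(target=fb.fizzbuzz, args=(lambda: out.append("fizzbuzz"),)),
    threading.Thread(target=fb.number, args=(out.append,)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(out)  # [1, 2, 'fizz', 4, 'buzz', 'fizz', 7, 8, 'fizz', 'buzz', 11, 'fizz', 13, 14, 'fizzbuzz']
```

Because every append is gated by a semaphore hand-off, the output order is deterministic despite the four threads.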
53.54386
262
0.674312
eng_Latn
0.765859
fac87bdc13d6d0162804a628bd7b46d591da9f19
2,551
md
Markdown
README.md
Synchronicity89/POC_JS_CS_CPPCLI_STDCPP
b52315012db4df2ea5a1db9c79ebeb9974822e4e
[ "MIT" ]
null
null
null
README.md
Synchronicity89/POC_JS_CS_CPPCLI_STDCPP
b52315012db4df2ea5a1db9c79ebeb9974822e4e
[ "MIT" ]
null
null
null
README.md
Synchronicity89/POC_JS_CS_CPPCLI_STDCPP
b52315012db4df2ea5a1db9c79ebeb9974822e4e
[ "MIT" ]
null
null
null
# POC_JS_CS_CPPCLI_STDCPP This is a POC to show that, using Visual Studio on Windows, you can debug JS, C#, C++/CLI, and regular C++ using std:: all in the same solution. A popular type of web app these days is a single page app where the JavaScript manages the DOM, instead of refreshing the entire page. It is also fine to have separate pages, and some combination of separate pages and relatively heavy JavaScript/DOM interaction. The JavaScript makes HTTP requests to a Web API backend, which here is implemented in C#. Let's say there is some heavy crunching to do in native C++. That native C++ code can be reached easily by using a layer of C++/CLI, which compiles to .NET assemblies that can be referenced by the C# layer. It's possible to link libraries of native C++, either static or dynamic. It would be interesting to see if that improves the performance of the native C++. Visual Studio Installation Requirements: -C++ -C++/CLI -Asp.NET -C# Building: Right click on the solution and click Clean, then Rebuild. Click restore nuget packages if needed. Running/Testing The HealthCheck.html page contains some JavaScript to test the sample Values function in the Web API. It returns two strings as sample data. Set that as the start page. If prompted to install an https certificate for IIS Express, go ahead and click yes. Press F12 in Windows browsers to bring up the web browser's Developer tools and choose the Console tab to see the test results when you force the browser to go to <localhost>:<port>/Api/Values The HealthCheck.html page retrieves the sample data from the Web API three different ways, and uses jQuery to make the data visible to the user. There are also examples of using the <style> tag to define styles. Set a breakpoint anywhere. If you have only made changes to JavaScript, then hit Ctrl-S to save them and simply refresh the browser page. No need to rerun the solution.
I wrote it in Visual Studio 2019, so it is using the 142 build toolset; if you have Visual Studio 2017 you will probably need to change the project properties of CppLayer. First make sure All Configurations is selected instead of only Debug or only Release. Then under General Configuration Properties check Platform Toolset. If it says not installed, then use the drop down to select the 141 Toolset or one that you actually have installed. Likewise with the Windows SDK Version. Choose the most recent Windows SDK version you have, or some version close to 10. It should be possible to set breakpoints in all of the languages: JavaScript, C#, C++/CLI, and regular C++
70.861111
299
0.78479
eng_Latn
0.998978
fac8f1498e9b19734af013d7c269b19287c4ad85
846
md
Markdown
src/grid/grid_why.md
fakecoinbase/threefoldfoundationslashinfo_foundation
950528401fee7de1dafb6c8ab2f8f7debaf068f2
[ "Apache-2.0" ]
null
null
null
src/grid/grid_why.md
fakecoinbase/threefoldfoundationslashinfo_foundation
950528401fee7de1dafb6c8ab2f8f7debaf068f2
[ "Apache-2.0" ]
null
null
null
src/grid/grid_why.md
fakecoinbase/threefoldfoundationslashinfo_foundation
950528401fee7de1dafb6c8ab2f8f7debaf068f2
[ "Apache-2.0" ]
null
null
null
![threefold grid header](./img/grid_header.png) # The Need For A New Internet Grid The Internet is growing at an accelerated rate. To service this demand a few large companies (Google, Amazon, Alibaba, Facebook, etc.) have built and continue to build supersized, power-hungry and centralized data centers. These data centers make up the majority of all Internet capacity available today. However, the Internet is expanding much faster than datacenters and the planet can cope with. The solution is to have the Internet be more **distributed**, **cost**-**effective**, **neutral,** and **sustainable**. ThreeFold has developed such a new technology, the ThreeFold Grid (Grid) and a token which enables all this to happen. And you can become a part of this solution by becoming a farmer and add more decentralized capacity to the Threefold Grid.
120.857143
637
0.782506
eng_Latn
0.999399
fac95b181aaa4d4a14683a32b94d919b52f71c27
102
md
Markdown
README.md
GrunkleSqueaky/Color-Punch
98ecf123c3667246d23ad6522340525daba2aea0
[ "MIT" ]
null
null
null
README.md
GrunkleSqueaky/Color-Punch
98ecf123c3667246d23ad6522340525daba2aea0
[ "MIT" ]
1
2021-02-28T19:32:16.000Z
2021-02-28T19:32:16.000Z
README.md
GrunkleSqueaky/Color-Punch
98ecf123c3667246d23ad6522340525daba2aea0
[ "MIT" ]
1
2021-02-28T04:08:13.000Z
2021-02-28T04:08:13.000Z
# [ModKit Wiki](../../wiki) Requires "EmikBaseModules" from https://github.com/Emik03/EmikBaseModules
25.5
73
0.745098
yue_Hant
0.399872
fac95d0d22240c872485278be86b60d04113bf32
1,044
md
Markdown
java/src/main/java/com/jpeony/leetcode/n0617/README.md
yihonglei/thinking-in-algorithms
4a601f682a26fab577236d23fcc77a92afdf6b4c
[ "MIT" ]
5
2020-10-07T10:54:08.000Z
2022-01-07T01:27:23.000Z
java/src/main/java/com/jpeony/leetcode/n0617/README.md
yihonglei/thinking-in-algorithms
4a601f682a26fab577236d23fcc77a92afdf6b4c
[ "MIT" ]
null
null
null
java/src/main/java/com/jpeony/leetcode/n0617/README.md
yihonglei/thinking-in-algorithms
4a601f682a26fab577236d23fcc77a92afdf6b4c
[ "MIT" ]
2
2020-10-10T16:50:16.000Z
2021-12-19T12:51:42.000Z
# [617. Merge Two Binary Trees](https://leetcode.com/problems/merge-two-binary-trees/) ## Problem You are given two binary trees root1 and root2. Imagine that when you put one of them to cover the other, some nodes of the two trees are overlapped while the others are not. You need to merge the two trees into a new binary tree. The merge rule is that if two nodes overlap, then sum node values up as the new value of the merged node. Otherwise, the NOT null node will be used as the node of the new tree. Return the merged tree. **Note:** The merging process must start from the root nodes of both trees. Example 1: ``` Input: root1 = [1,3,2,5], root2 = [2,1,3,null,4,null,7] Output: [3,4,5,5,4,null,7] ``` Example 2: ``` Input: root1 = [1], root2 = [1,2] Output: [2,2] ``` Constraints: - The number of nodes in both trees is in the range [0, 2000]. - -10^4 <= Node.val <= 10^4 ## Problem summary Merge two binary trees: where the trees overlap, the sum of the two node values becomes the new node; where they do not overlap, the non-null node is used directly; return the merged tree. ## Approach [Depth-first search] ## Complexity analysis Time complexity: O(min(m,n)), where m and n are the numbers of nodes in the two trees. Space complexity: O(min(m,n)), where m and n are the numbers of nodes in the two trees.
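The depth-first merge can be sketched in Python (`TreeNode` is the usual LeetCode-style helper; the driver below reproduces Example 2):

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def merge_trees(r1, r2):
    """DFS merge: overlapping nodes are summed; otherwise reuse the non-null node."""
    if r1 is None:
        return r2
    if r2 is None:
        return r1
    node = TreeNode(r1.val + r2.val)
    node.left = merge_trees(r1.left, r2.left)
    node.right = merge_trees(r1.right, r2.right)
    return node

# Example 2: root1 = [1], root2 = [1,2]
merged = merge_trees(TreeNode(1), TreeNode(1, TreeNode(2)))
print(merged.val, merged.left.val)  # -> 2 2
```

The recursion stops as soon as one side is null, which is why both time and space are bounded by the smaller tree, O(min(m,n)).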
22.695652
106
0.69636
eng_Latn
0.9899
facaa5805aa179ccb65b96dc229f3e6a7fb60184
3,875
md
Markdown
src/02. Content/04. Posts.md
asher-scott/piranha.core.docs
3339309c0735a5386cf5dc34cc066c1e8fc76a3f
[ "MIT" ]
null
null
null
src/02. Content/04. Posts.md
asher-scott/piranha.core.docs
3339309c0735a5386cf5dc34cc066c1e8fc76a3f
[ "MIT" ]
null
null
null
src/02. Content/04. Posts.md
asher-scott/piranha.core.docs
3339309c0735a5386cf5dc34cc066c1e8fc76a3f
[ "MIT" ]
null
null
null
# Posts The content templates of your Posts are designed using Post types. Remember that this does not necessarily mean that all posts of the same Post type have to be rendered the same way, it simply means that they have the same content structure. The preferred way of importing Post types is by using the `Piranha.AttributeBuilder` package. With this package you can directly mark your Post models with the Attributes needed. ## Your First Post Type Let's first look at what the simplest possible post type could look like. This post type doesn't provide anything other than the main content area, which is made up of blocks. ~~~ csharp using Piranha.AttributeBuilder; using Piranha.Models; [PostType(Title = "Simple Post")] public class SimplePost : Post<SimplePost> { } ~~~ To import this post type during your application startup you add the following line to your `Configure` method. ~~~ csharp using Piranha.AttributeBuilder; var builder = new PostTypeBuilder(api) .AddType(typeof(SimplePost)); builder.Build(); ~~~ You can also define a single import for **all of the content types** in your application by adding the assemblies containing your types. The benefit of this is that you don't have to update your Startup code when adding new types. ~~~ csharp using Piranha.AttributeBuilder; var builder = new ContentTypeBuilder(api) .AddAssembly(typeof(Startup)); builder.Build(); ~~~ ### Post Type Configuration The `PostTypeAttribute` has the following attributes available for configuring the behaviour of the Post Type. ##### Title ~~~ csharp [PostType(Title = "Simple Post")] ~~~ The display title to show when working with posts in the manager interface. If this property is omitted the **class name** of the Post Type will be used as title. ##### UseBlocks ~~~ csharp [PostType(UseBlocks = false)] ~~~ Whether or not the main block content area will be used.
This can be very useful for posts displaying information that should be fixed in its formatting, where you want to limit the content to the pre-defined regions. ## Post Routing By default, all post requests are rewritten to the route `/post`. Since you want to load different model types for your posts, and often render them by different views or pages, you need to specify which route should handle your Post type. Let's say we have a post that also displays a hero. ~~~ csharp using Piranha.AttributeBuilder; using Piranha.Extend; using Piranha.Extend.Fields; using Piranha.Models; [PostType(Title = "Hero Post")] [PostTypeRoute(Title = "Default", Route = "/heropost")] public class HeroPost : Post<HeroPost> { public class HeroRegion { [Field] public StringField Title { get; set; } [Field] public ImageField Image { get; set; } [Field] public TextField Body { get; set; } } [Region] public HeroRegion Hero { get; set; } } ~~~ By adding the `PostTypeRouteAttribute` to your post type, all requests for posts of this page type will now be routed to `/heropost`. ### Multiple Post Routes Let's say we would also like to render Hero Post as an extra wide campaign post. We can achieve this by adding a second `PostTypeRouteAttribute` to the class. ~~~ csharp [PostType(Title = "Hero Post")] [PostTypeRoute(Title = "Default", Route = "/heropost")] [PostTypeRoute(Title = "Super wide", Route = "/superwide")] public class HeroPost : Post<HeroPost> { ... } ~~~ By adding a second route the **post settings** in the manager will now show a dropdown where the editor can select which route the current post should use. ## Advanced Content Now that you know the basics for setting up posts, you should continue to read about the different components available for creating more advanced content. Information about this can be found in the articles [Blocks](blocks), [Regions](regions) and [Fields](fields).
35.227273
289
0.743742
eng_Latn
0.996647
facacaa3c9f4cc91f643b17fee484b8ca37f2f61
1,669
md
Markdown
microsoft.ui.xaml.controls/hubsectionheaderclickeventargs.md
stevemonaco/winui-api
3e5ad1a5275746690c39fd2502c60928b756f3b5
[ "CC-BY-4.0", "MIT" ]
63
2018-11-02T13:52:13.000Z
2022-03-31T16:31:24.000Z
microsoft.ui.xaml.controls/hubsectionheaderclickeventargs.md
stevemonaco/winui-api
3e5ad1a5275746690c39fd2502c60928b756f3b5
[ "CC-BY-4.0", "MIT" ]
99
2018-11-16T15:15:12.000Z
2022-03-31T15:53:15.000Z
microsoft.ui.xaml.controls/hubsectionheaderclickeventargs.md
stevemonaco/winui-api
3e5ad1a5275746690c39fd2502c60928b756f3b5
[ "CC-BY-4.0", "MIT" ]
35
2018-10-16T05:35:33.000Z
2022-03-30T23:27:08.000Z
--- -api-id: T:Microsoft.UI.Xaml.Controls.HubSectionHeaderClickEventArgs -api-type: winrt class --- <!-- Class syntax. public class HubSectionHeaderClickEventArgs : Windows.UI.Xaml.Controls.IHubSectionHeaderClickEventArgs --> # Microsoft.UI.Xaml.Controls.HubSectionHeaderClickEventArgs ## -description Provides data for the [Hub.SectionHeaderClick](hub_sectionheaderclick.md) event. ## -remarks ## -examples ## -see-also [Control](control.md), [Hub](hub.md), [HubSectionCollection](hubsectioncollection.md), [Hub.SectionHeaderClick](hub_sectionheaderclick.md), HubSectionHeaderClickEventArgs, [ISemanticZoomInformation](isemanticzoominformation.md), [AppBar](appbar.md), [CommandBar](commandbar.md), [Your first app - Part 3: Navigation, layout, and views](/previous-versions/windows/apps/jj215600(v=win.10)), [Your first app - Add navigation and views in a C++ UWP app (tutorial 3 of 4)](/previous-versions/windows/apps/dn263172(v=win.10)), [Navigation](/windows/uwp/layout/navigation-basics), [Adding app bars (XAML)](/previous-versions/windows/apps/hh781230(v=win.10)), [XAML Hub control sample](https://github.com/microsoftarchive/msdn-code-gallery-microsoft/tree/master/Official%20Windows%20Platform%20Sample/XAML%20Hub%20control%20sample), [XAML AppBar control sample](https://go.microsoft.com/fwlink/p/?LinkID=242388), [XAML Navigation sample](https://go.microsoft.com/fwlink/p/?LinkID=389440), [Navigation design basics](/windows/uwp/layout/navigation-basics), [Guidelines for app bars](/windows/uwp/controls-and-patterns/app-bars), [Bottom app bar](/windows/uwp/controls-and-patterns/app-bars), [Top app bar](/windows/uwp/controls-and-patterns/app-bars)
79.47619
1,241
0.782504
yue_Hant
0.526648
facb224b1fad7bebd2bfa3efc18f3adb08d4dde7
1,444
md
Markdown
2020/09/21/2020-09-21 19:20.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
3
2020-07-14T14:54:15.000Z
2020-08-21T06:48:24.000Z
2020/09/21/2020-09-21 19:20.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
null
null
null
2020/09/21/2020-09-21 19:20.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
null
null
null
2020年09月21日19时数据 Status: 200 1.Puff结婚 微博热度:2318812 2.外交部说不存在所谓海峡中线 微博热度:1623590 3.邓伦太能寄快递了 微博热度:1604189 4.专家反驳刘宇宁采摘的是道具 微博热度:1592298 5.陈羽凡发福 微博热度:1519098 6.央视曝光微信清粉骗局 微博热度:1131236 7.刘宇宁道歉 微博热度:1017463 8.方方 微博热度:960428 9.李心草案16名民警被问责处理 微博热度:726393 10.张文宏称至少一年世界才能重新开放 微博热度:650162 11.虞书欣 我不是杨超越我的头比她大 微博热度:610597 12.为了保护饼干不睡觉的妹妹 微博热度:521676 13.日本将为新婚夫妇发放4万元补贴 微博热度:417635 14.番禺持刀伤人嫌疑人自残身亡 微博热度:384471 15.江苏卫视大剧片单 微博热度:371004 16.未成年人首次触网年龄不断降低 微博热度:336862 17.白发奶奶步履蹒跚玩丢手绢 微博热度:336836 18.西安外国语大学回应学生喊楼 微博热度:333518 19.劳森回应 微博热度:330949 20.佟丽娅王俊凯主持百花奖 微博热度:328686 21.3月14号出生的名字 微博热度:324951 22.四十正好 微博热度:322677 23.斗鱼向虎牙道歉 微博热度:321741 24.退伍军人20秒救起落水女童 微博热度:318976 25.千万不要学电视剧里玩浪漫 微博热度:314847 26.视频通话还能这样玩 微博热度:312696 27.迪丽热巴齐耳短发 微博热度:310134 28.南师大学生校内死亡警方已介入调查 微博热度:308976 29.3400余款手机APP存安全隐患 微博热度:306522 30.倪妮短发造型 微博热度:305039 31.成毅回应发糖少 微博热度:298450 32.岳云鹏 微博热度:290682 33.今年将迎124年来最早秋分 微博热度:282598 34.外交部批美官员接连访台是政治挑衅 微博热度:279572 35.密室监控名场面 微博热度:279063 36.将逝者骨灰做成项链 微博热度:272772 37.超六成日本人愿意退休后继续工作 微博热度:269372 38.生死之椒 微博热度:264853 39.李由 微博热度:258456 40.胡兵直播了24小时 微博热度:243065 41.热心市民熊先生 微博热度:195834 42.山东一小型飞机坠落3人死亡 微博热度:193350 43.女大学生中奖1.4万个鸡蛋 微博热度:191318 44.飞机修理工英语口语超流利 微博热度:190908 45.周深萨顶顶隔了几条街的声音 微博热度:190475 46.极限挑战 国家二级保护植物 微博热度:158586 47.北京野生动物园回应游客坐车顶游览 微博热度:131944 48.不少快递企业工资高于当地平均水平 微博热度:91008 49.中国新设3个自贸试验区有4点考虑 微博热度:65384 50.久哲回归 微博热度:64996
7.078431
19
0.786704
yue_Hant
0.4026
facbe4de60323c71a81a707c5f0e2f8246a49879
1,524
md
Markdown
iOS/Component/组件二进制.md
lionsom/XiOS
07be7e3d1f0bc63ee3aa5a96ae397e1090c8281c
[ "Apache-2.0" ]
18
2019-11-19T16:21:22.000Z
2022-02-15T07:55:41.000Z
iOS/Component/组件二进制.md
lionsom/XiOS
07be7e3d1f0bc63ee3aa5a96ae397e1090c8281c
[ "Apache-2.0" ]
null
null
null
iOS/Component/组件二进制.md
lionsom/XiOS
07be7e3d1f0bc63ee3aa5a96ae397e1090c8281c
[ "Apache-2.0" ]
1
2020-11-08T10:18:22.000Z
2020-11-08T10:18:22.000Z
# pod 打包成二进制 [GitHub cocoapods-packager](https://github.com/CocoaPods/cocoapods-packager) [使用cocoapods-packager打包静态库](https://punmy.cn/2019/05/25/%E4%BD%BF%E7%94%A8cocoapods-packager%E6%89%93%E5%8C%85%E9%9D%99%E6%80%81%E5%BA%93.html) #pod package 生成 Framework **pod package** 是 cocoapods 的一个插件,如果没有的话使用以下命令安装: sudo gem install cocoapods-packager pod package 是根据 **.podspec** 描述文件来生成二进制库 例子(生成动态库): ```kotlin pod package AFPodSpec.podspec --force --dynamic --no-mangle --spec-sources=https://github.com/CocoaPods/Specs.git ``` 命令参数 ```dart //强制覆盖之前已经生成过的二进制库 --force //生成静态.framework --embedded //生成静态.a --library //生成动态.framework --dynamic //动态.framework是需要签名的,所以只有生成动态库的时候需要这个BundleId --bundle-identifier //不包含依赖的符号表,生成动态库的时候不能包含这个命令,动态库一定需要包含依赖的符号表。 --exclude-deps //表示生成的库是debug还是release,默认是release。--configuration=Debug --configuration --no-mangle //表示不使用name mangling技术,pod package默认是使用这个技术的。我们能在用pod package生成二进制库的时候会看到终端有输出Mangling symbols和Building mangled framework。表示使用了这个技术。 //如果你的pod库没有其他依赖的话,那么不使用这个命令也不会报错。但是如果有其他依赖,不使用--no-mangle这个命令的话,那么你在工程里使用生成的二进制库的时候就会报错:Undefined symbols for architecture x86_64。 --subspecs //如果你的pod库有subspec,那么加上这个命名表示只给某个或几个subspec生成二进制库,--subspecs=subspec1,subspec2。生成的库的名字就是你podspec的名字,如果你想生成的库的名字跟subspec的名字一样,那么就需要修改podspec的名字。 这个脚本就是批量生成subspec的二进制库,每一个subspec的库名就是podspecName+subspecName。 --spec-sources //一些依赖的source,如果你有依赖是来自于私有库的,那就需要加上那个私有库的source,默认是cocoapods的Specs仓库。--spec-sources=private,https://github.com/CocoaPods/Specs.git。 ```
23.8125
144
0.799869
yue_Hant
0.674168
facc2bb56eb4bb1a00501b7c8d973f9ed08362f4
1,757
md
Markdown
_posts/people-love/11/2021-04-06-iceee.md
chito365/p
d43434482da24b09c9f21d2f6358600981023806
[ "MIT" ]
null
null
null
_posts/people-love/11/2021-04-06-iceee.md
chito365/p
d43434482da24b09c9f21d2f6358600981023806
[ "MIT" ]
null
null
null
_posts/people-love/11/2021-04-06-iceee.md
chito365/p
d43434482da24b09c9f21d2f6358600981023806
[ "MIT" ]
null
null
null
--- id: 4596 title: ICEEE date: 2021-04-06T17:13:29+00:00 author: Laima layout: post guid: https://ukdataservers.com/iceee/ permalink: /04/06/iceee tags: - show love category: Guides --- * some text {: toc} ## Who is ICEEE Social media star who makes virtual reality videos. He has earned more than 1.7 million followers on his iceeevr TikTok account. ## Prior to Popularity He posted his first TikTok in January 2021. ## Random data In March 2021, he promised his Instagram followers that he would post photos of his face if he got 1000 likes. ## Family & Everyday Life of ICEEE In March 2020, he paid tribute to his dog who had passed away. ## People Related With ICEEE He is a fan of NFL football player JuJu Smith-Schuster.
19.307692
128
0.312464
eng_Latn
0.994304
facc3a2a08c1fad83fb5b2f9bee21ca89b5d1c19
5,458
md
Markdown
docs/ru/adapterref/iobroker.xs1/README.md
OLFDB/ioBroker.docs
26501a66cacd9b0c65976a3d10760de201f39a86
[ "MIT" ]
47
2017-08-15T12:55:53.000Z
2021-12-25T23:25:20.000Z
docs/ru/adapterref/iobroker.xs1/README.md
gaudes/ioBroker.docs
59a32ab5b17da58604c272b9718246fe3eb41f5a
[ "MIT" ]
253
2017-03-28T20:20:48.000Z
2022-03-08T08:49:52.000Z
docs/ru/adapterref/iobroker.xs1/README.md
gaudes/ioBroker.docs
59a32ab5b17da58604c272b9718246fe3eb41f5a
[ "MIT" ]
148
2017-03-24T21:32:12.000Z
2022-03-10T07:17:46.000Z
--- translatedFrom: en translatedWarning: Если вы хотите отредактировать этот документ, удалите поле «translationFrom», в противном случае этот документ будет снова автоматически переведен editLink: https://github.com/ioBroker/ioBroker.docs/edit/master/docs/ru/adapterref/iobroker.xs1/README.md title: ioBroker.xs1 hash: YDDMrsvKEdeOSKCT1MPhIWYk68X7WT5w1tOQy0vU+BA= --- # IoBroker.xs1 ![логотип](../../../en/adapterref/iobroker.xs1/admin/xs1.png) ![Версия NPM](http://img.shields.io/npm/v/iobroker.xs1.svg) ![Загрузки](https://img.shields.io/npm/dm/iobroker.xs1.svg) ![Трэвис-CI](http://img.shields.io/travis/frankjoke/ioBroker.xs1/master.svg) ![NPM](https://nodei.co/npm/iobroker.xs1.png?downloads=true) ## Адаптер ioBroker zu EZcontrol XS1 Der Adapter kommuniziert über die RestAPI des XS1 and hängt sich auch and das XS1 als listener um alle Änderungen смягчают и утверждают, что iBroker weiterzuleiten. Befehle vom ioBroker werden zuerst mit ack = false gesendet und wenn etwas vom Слушатель kommt dann passiert das mit ack = true. Man wei dann zumindest dass XS1 den Befehl gesendet hat. Der Adapter scannt alle verfügbaren Sensoren (только для чтения) и Aktoren (чтение / запись) и verwendet die am XS1 vergebenen Namen. Momentan werden keine Spezialinformationen wie Batterielevel unterstützt da diese dem Listener leider nicht weitergegeben werden. Вы можете связаться с ним по электронной почте, чтобы позвонить ему в XS1. Моментальные данные Passwort-Zugriff реализованы и реализованы в XS1 kein Passwort gesetzt sein! Für Sensoren welche im state eine 'Батарея разряжена' - Meldung anzeigen wird ein .LOWBAT-State erzeugt. Die Copylist erlaubt direktes Gleichschalten zwischen Слушатель и Акторен. Damit kann man Aktoren zusammenschalten welche ohne im ioБрокерские скрипы шрайбен цу мюссен. Также Wenn Aktor A von XS! Ауф Эйн Гехт Вирд ау Актор Б (и С ..) 
Das ist sinnvoll wenn Aktoren verschiedene Systeme benutzen (Aktor A = FS20, B = AB400, C = HMS) и zusammen geschaltet werden sollen (Ein funksender von FS20 kann dann directct auch einen AB400 Funkstekdose schalten). Синтаксис формы ist {"von_a": "auf_b (, auf_c, ...)", "von_b": "auf_c", ....} Отрицательный знак назначения. Ein Beispiel von mir: {"UWPumpeT2": "UWPumpe", "UWPumpe": "UWPumpeT2", "Schalter1": "Licht1, Licht2"} Дамит Вирд дер Тастер (UWPumpeT2) с мужским взором в городе Рихтюн ioBroker nur noch einen Aktor verwenden. 'Schalter1' würde 'Licht1' и 'Licht2' gleichzeitig mitschalten. Für die neu hinzugefügte Watchdog-Funktion Sollte Im XS1 в действующей компании «Сторожевой пес» «Watchdog» kreiert werden. Dieser wird jede Минутный удар и падение 4 Minuten lan dieser Umschaltvorgang nicht zurückgemeldet wird wird der Adapter neu gestartet. ## Wichtig! - * Der Adapter benötigt Node> = v6. *! * Einen blinden (aber nicht virtuellen) Актуатор с демона Намен «Сторожевой пес» Эрстеллен. ## Changelog ### 1.1.0 * Added Admin3 capabities and support for new js-controller * Adapter runs only with node>=8.16 ### 1.0.2 * Added more sensors. All unknown types will use 'value' role. This can lead to problems if actual type is a boolean, but should work otherwise. As a result all sensors should be listed now. ### 1.0.0 * Update accepted device list and test for node v 8 * Tarvis updated to test right repository ### 0.5.2 * Update variables list and values from XS1 but change values only if they are different than in state not to create false state updates ### 0.5.1 * Adapter test auf Node 4.x und 6.x für Windows und Linux. * Fehler beim ersten Einlesen von boolean states korrigiert. ### 0.5.0 * LOWBAT für Sensoren mit Battery low state. * Abhängigkeit von 'async' und 'request' entfernt, damit braucht xs1 keine zusätzlichen Module mehr. * Watchdog mit XS1-Aktuator implementiert. * Cleanup der states wenn sie nicht mehr verwendet werden (und z.B. 
vom XS1 gelöscht werden) ### 0.4.2 Watchdog von 4 Minuten implementiert, wenn 4 Minuten kein Signal vom XS1 kommt wird Adapter gestoppt. jede Minute sendet der Adapter ein Signal an den Aktuator 'Watchdog' der dies bestätigen sollte. iobroker sollte den Adapter dann neu starten. ### 0.4.0 Erster öffentliche Version, kann lesen und Aktuatoren schreiben (Befehle absetzten). TODO: Dokumentieren und Batteriestatus polling implementieren. ### 0.1.0 Erster Test, Kann nur lesen und mithören ## License The MIT License (MIT) Copyright (c) 2016 Frank Joke Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
50.537037
217
0.781239
deu_Latn
0.695929
facd3ffd9e5f91e721567458a6a9b809b05fac3f
46
md
Markdown
README.md
Cubey2019/vcoin-mongo-models
d0790c661073b5b20a0a9124c4f1b2fffd5de4f6
[ "BSD-3-Clause" ]
null
null
null
README.md
Cubey2019/vcoin-mongo-models
d0790c661073b5b20a0a9124c4f1b2fffd5de4f6
[ "BSD-3-Clause" ]
null
null
null
README.md
Cubey2019/vcoin-mongo-models
d0790c661073b5b20a0a9124c4f1b2fffd5de4f6
[ "BSD-3-Clause" ]
null
null
null
# vcoin-mongo-models Mongodb models for Vcoin
15.333333
24
0.804348
eng_Latn
0.542841
facd5f6a867400877d510814147d9e7bf5ef8113
1,314
md
Markdown
_posts/2021-06-23-06_01_1311.md
mandeuk26/hyde
dbfd9ceccd435475363678518713faffe80d9648
[ "MIT" ]
null
null
null
_posts/2021-06-23-06_01_1311.md
mandeuk26/hyde
dbfd9ceccd435475363678518713faffe80d9648
[ "MIT" ]
null
null
null
_posts/2021-06-23-06_01_1311.md
mandeuk26/hyde
dbfd9ceccd435475363678518713faffe80d9648
[ "MIT" ]
2
2021-06-21T13:59:42.000Z
2021-06-22T10:48:49.000Z
--- layout: post title: 백준 1311 할 일 정하기 1 tags: [Problem, Bitmask] --- [https://www.acmicpc.net/problem/1311](https://www.acmicpc.net/problem/1311) n명의 사람과 n개의 일이 있을 때 사람마다 일을 처리하는 비용이 각각 적혀있다고 했을 때 모든 일을 하는데 필요한 비용의 최솟값을 구하는 문제이다. n개의 일에 대해 비트마스크를 적용시켜 상태를 확인한다고 하자. 모든 경우에 대해 탐색해야하기 때문에 dfs를 사용할 것이고 중복 연산을 방지하기 위해 dp를 사용할 것이다. 0번째 사람부터 n-1번째 사람까지 하나씩 일을 부여할 것인데 visited라는 변수를 통해 아직 해결되지 않은 일들을 for문을 통해 다음 사람에게 부여시킨다. for문 내부에서는 dfs를 통해 해당 경우의 최소 비용을 탐색해오고 그 중 가장 적은 비용이 드는 것을 선택한다. visited를 비트마스크가 아닌 배열로 해도 기능상에는 차이가 없다. 하지만 길이 20의 배열을 계속해서 함수 호출로 넘겨줘야하기 때문에 overhead가 상당하다. 그래서 비트마스크를 활용하는 것이다. - 전체 코드 ```swift let n = Int(readLine()!)! var dp:[[Int]] = Array(repeating: Array(repeating: -1, count: 1<<21), count: 20) var cost:[[Int]] = [] for _ in 1...n { cost.append(readLine()!.split(separator: " ").map{Int(String($0))!}) } func dfs(d: Int, visited: Int) -> Int { if d == n { return 0 } else if dp[d][visited] != -1 { return dp[d][visited] } else { dp[d][visited] = 1_000_000 for i in 0..<n { if (visited & (1<<i)) == 0 { dp[d][visited] = min(dp[d][visited], dfs(d: d+1, visited: visited|(1<<i)) + cost[d][i]) } } return dp[d][visited] } } print(dfs(d:0, visited:0)) ```
27.957447
183
0.591324
kor_Hang
1.000007
facdf4381e633a2cba632fbef2d3b71e7fa18ba5
90
md
Markdown
README.md
paullewallencom/java-978-1-7886-2864-8
9982e3f4896f3c9679e50371932a6b350448ae11
[ "Apache-2.0" ]
null
null
null
README.md
paullewallencom/java-978-1-7886-2864-8
9982e3f4896f3c9679e50371932a6b350448ae11
[ "Apache-2.0" ]
null
null
null
README.md
paullewallencom/java-978-1-7886-2864-8
9982e3f4896f3c9679e50371932a6b350448ae11
[ "Apache-2.0" ]
null
null
null
# java-978-1-7886-2864-8 Introduction to Data Structures &amp; Algorithms in Java [Video]
30
64
0.766667
yue_Hant
0.649717
facf05b56763278e005e03ec45b47786d538a71f
2,108
md
Markdown
README.md
mlewis109/blogo
eae7eae4363f0e93f1c40b3096ab51ad90eed8f7
[ "Unlicense" ]
2
2021-12-16T04:16:53.000Z
2022-03-23T16:55:18.000Z
README.md
mlewis109/blogo
eae7eae4363f0e93f1c40b3096ab51ad90eed8f7
[ "Unlicense" ]
null
null
null
README.md
mlewis109/blogo
eae7eae4363f0e93f1c40b3096ab51ad90eed8f7
[ "Unlicense" ]
null
null
null
# Blogo ## Introduction The aim of Blogo is to allow anyone with knowledge of python create 3d structures in [Blender](https://www.blender.org/) in a simple and intuitive way. With an API that doesn’t change and which does most of the little things for you. It is not designed to be able to give photo perfect renderings, but it does output something that can be manipulated in Blender manually if you want to do that. It does this by providing a Logo-like interface to python. The basic idea is that each Blogo object describes a two dimensional cross section (defaulting to a square) which is moved through space leaving behind a trail where it has been. It is moved through space using simple commands such as forward, left, right, up, down. The cross section can be modified to give any shape and can further be modified by multipliers, i.e. each of the points on the right of the cross section can be placed twice as far from the middle. This allows a wall, for example, to be constructed easily by saying that the square cross section should have the left and right sides 0.05 metres away from the middle and the bottom should be 0 metres from the middle and the top 2.4 metres from the middle. Then by repeating four times move forward by 10 metres, turn right 90 degrees you will get a wall around a 10m x 10m square. The interface is designed to make it simple to do things that people are likely to want. If you want to add a texture to that wall, then you can use the add_texture function and give it a filename and that image will be on the wall, if you want a particular colour then a tuple of RGBA colour values can be used instead. ## Getting Started Assuming you already have Blender installed, simply copy the files from blogo/src/ into <blender_install_dir>/scripts/ and restart Blender. Now it is installed and can be used. Check the [quickstart](https://blogo.readthedocs.io/en/latest/Documentation/quickstart.html) in the help for more details and what to do to make shapes. 
## Documentation Is available at https://blogo.readthedocs.io/en/latest/index.html
110.947368
848
0.780361
eng_Latn
0.999905
facf41c977e696bb9888fdac032025292b7e8884
3,988
md
Markdown
_posts/2019-06-08-Download-study-guide-for-ekg-certification.md
Kirsten-Krick/Kirsten-Krick
58994392de08fb245c4163dd2e5566de8dd45a7a
[ "MIT" ]
null
null
null
_posts/2019-06-08-Download-study-guide-for-ekg-certification.md
Kirsten-Krick/Kirsten-Krick
58994392de08fb245c4163dd2e5566de8dd45a7a
[ "MIT" ]
null
null
null
_posts/2019-06-08-Download-study-guide-for-ekg-certification.md
Kirsten-Krick/Kirsten-Krick
58994392de08fb245c4163dd2e5566de8dd45a7a
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download Study guide for ekg certification book horseback. A gash in it deepened, for instance. spot. Words threatened to spill from him again, not one. There were many inquiries for gunpowder, Cass. It ought to be observed, Paul walked in brought from the Ural, slowly past. "Illiterate wizards are the curse of Earthsea!" he cried? Simple white plates bought at Sears. He'd never had a chance to read this to Perri or to benefit from her opinion. If I hadn't hidden my murdered husband's He had experienced considerable self-revelation during the past eighteen Study guide for ekg certification remembers that Cass advised a quick shower because the motor home isn't Chapter 66 Four black bearers had appeared, if the Earthside tests on a roll of toilet paper didn't. But she came, hoping the was the good of possessing the Throne of Maharion if nobody sat in it but a drunken cripple, recovered the boy's clothes from her. Now this lower random in the Behring Sea, hung next to those white canes that were now he rebuffed Angel too sharply, placed at absorbed a measure of her aunt's attitude toward the bad news and the sorrier turns of life that fate Kara port, the motor home study guide for ekg certification Regardless of her other successes or failures as a parent. can't become conscious of anything until they are as articulate as their oppressors. Hound shrugged. In the first place, sir," a voice called out, magnified beyond imagining, "wake up, as an organism becomes more complex and specialized. Though he may be Gutenberg-tm work. "It's coming. Salk assured him. So much time I was the guest study guide for ekg certification the acting Governor, no one on Earth is going to be able to defy the edict. " worse than killing. There's no use complaining. Unfortunately, Irian, with Angel exuberant in spite of the hour and Oreo energized. insisted on calling themselves scientists, if he was caught with it. 
] Satan than him," said Geneva. It's an orrery. That she opposed my views so openly I considered a good sign; a pulp-fiction hero. " file:D|Documents20and20SettingsharryDesktopUrsula20K! Nummelin passed one of the severest winters that Arctic literature has "Well, at the whiff of herbs and aromatic smoke, but that' transfer didn't go through. And surely there Study guide for ekg certification, unpredictable, pushing back like an inflated balloon. Crouched on the deep sill, as well as from sailing weighed anchor in order to continue their voyage, however, though somewhat different, "Tell me of yonder portrait and what girl is study guide for ekg certification of the daughters of the kings; else will I take thy head, you'll find em Junior was vigilant, its contents having been explored in haste, they are endlessly devious, but later modified itself to use human heart pumps from the genetic information taken from the bodies of the men and women we buried," She paused to let that sink in. 5 "No, I 'member, and punctually comply growing crowd gathered around the dead zone. "Say, but didn't stay around to see them do it, snap, Dr, the mind had a thermostat of its own, and he supposed that already he was missing her. All of them. "This isn't absolutely final as yet" she'd have this third snake to worry about. itself, but suddenly I felt her [Illustration: "SEAL ROOKERY" ON ST. txt (96 of 111) [252004 12:33:31 AM] way, yet not all. As if having to get through the feeder ramps wasn't problem enough, unless it was being told that her choices in life hadn't been the Sea-otter, and then Moog Indigo slides into the last number with scarcely a pause, but they looked sterner than the others: early advocates of Academy of Art College, he Mercury Mountaineer, she was sound asleep. An old sorcerer. " "Nick," he suggested, as well as in arranging the more formal "The cloak-and-dagger aspect ought to be fun. one of those rare study guide for ekg certification with a pure soul. 
"I'm alone.
443.111111
3,881
0.787111
eng_Latn
0.999658
facfe467707ec1d4de153a8759926b1f9f54898a
243
md
Markdown
_posts/2018/2018-11-25-jinyong_guoxiang.md
mengwenchao/mengwenchao.github.io
caba84849eafa4fdc16b78d9cd014ec806f8b9a3
[ "BSD-3-Clause", "MIT" ]
null
null
null
_posts/2018/2018-11-25-jinyong_guoxiang.md
mengwenchao/mengwenchao.github.io
caba84849eafa4fdc16b78d9cd014ec806f8b9a3
[ "BSD-3-Clause", "MIT" ]
null
null
null
_posts/2018/2018-11-25-jinyong_guoxiang.md
mengwenchao/mengwenchao.github.io
caba84849eafa4fdc16b78d9cd014ec806f8b9a3
[ "BSD-3-Clause", "MIT" ]
null
null
null
--- layout: single title: 郭襄 date: 2018-11-25 17:38:20.000000000 +08:00 tags: - 生活 - 武侠 --- 看到一首写郭襄的小诗,很有意境: 我路过山时,山不说话 我路过海时,海不说话 小毛驴滴滴答答,倚天剑伴我走天涯 大家都说我是爱着杨大侠,才在峨眉山上出了家 其实我是爱上了峨眉上的的云和霞,像极了十六岁那年的烟花
10.565217
42
0.621399
yue_Hant
0.488783
fad1786a5b25d88000d05bfa4fa27ad4e1dc57b8
150
md
Markdown
CHANGELOG.md
davidcai/atom-clean-syntax
d84b848a23711e917ffa95fb295a5f52a8461e63
[ "MIT" ]
null
null
null
CHANGELOG.md
davidcai/atom-clean-syntax
d84b848a23711e917ffa95fb295a5f52a8461e63
[ "MIT" ]
null
null
null
CHANGELOG.md
davidcai/atom-clean-syntax
d84b848a23711e917ffa95fb295a5f52a8461e63
[ "MIT" ]
null
null
null
## 0.1.0 - First Release * Every feature added * Every bug fixed ## 0.2.1 - Official Release * Package as an atom theme ## 0.2.2 * Less noise in UI
15
27
0.66
eng_Latn
0.966224
fad29ee35d412b09275da60eadac0abe450529c1
7,557
md
Markdown
readme.md
KristianBonitz/awesome-juggling
ca7e793d9c496842005744f581549427d290413e
[ "Unlicense" ]
5
2019-07-16T22:26:58.000Z
2021-12-01T19:25:06.000Z
readme.md
KristianBonitz/awesome-juggling
ca7e793d9c496842005744f581549427d290413e
[ "Unlicense" ]
2
2019-09-05T17:23:56.000Z
2019-09-05T17:44:40.000Z
readme.md
KristianBonitz/awesome-juggling
ca7e793d9c496842005744f581549427d290413e
[ "Unlicense" ]
4
2019-07-17T04:50:23.000Z
2020-11-30T05:18:13.000Z
Awesome Juggling [![Awesome](https://awesome.re/badge.svg)](https://awesome.re) <p align="center"> <a href="awesome.md">What is an awesome list?</a>&nbsp;&nbsp;&nbsp; <a href="contributing.md">Contribution guide</a>&nbsp;&nbsp;&nbsp; </p> This is a curated list of juggling resources. There are many great juggling resources on the web but they can be hard to find since search engines don't rank them highly. Contributions welcome. Add links through pull requests or create an issue to start a discussion. <br> ## Contents - [Social Sites](#social-sites) - [Learning to Juggle](#learning-to-juggle) - [Passing Patterns](#passing-patterns) - [Tools](#tools) - [Videos](#videos) - [Games](#games) - [Misc](#misc) - [Vendors](#vendors) ## Social Sites - [Juggling Edge](https://www.jugglingedge.com/clublistings.php) - Juggling club listings, upcoming juggling festivals and forum. - [/r/Juggling](https://www.reddit.com/r/juggling/) - The largest reddit juggling community. - [Juggling Rock](https://www.facebook.com/groups/JugglingRock/) - The most active facebook juggling community. - [IJA](https://www.juggle.org/) - The International Juggler's Association runs an annual festival, has a forum and publishes juggling articles. ## Learning to Juggle - [How to Juggle](https://www.wikihow.com/Juggle) - The wikiHow page for juggling. - [Passing Pedagogy](http://passingpedagogy.com/) - This site explains a methodology for teaching club passing. ## Passing Patterns - [MAJ Pattern Book](https://madjugglers.com/majpatternbook) - A collection of passing patterns from the Madison Area Jugglers. - [Passing Pattern Anthology](https://jonglieren-jena.de/ppa/ppa.html) - A collection of passing patterns from Markus Oehme. - [passingdb.com](https://www.passingdb.com/index.php) - A collection of passing patterns from JiBe (Jean-Baptiste Hurteaux). - [Passing Zone](https://passing.zone/) - A collection of videos and explanations for a variety of passing patterns from the Passing Zone. 
- [Juggling Edge Passing Patterns](http://www.jugglingedge.com/pdf/PassingPatternsAug06.pdf) - A collection of passing patterns from Mark Weston. - [Will's Passing Patterns](http://web.csulb.edu/~wmurray/jugglingArticles/WillPatterns.pdf) - A collection of passing patterns from Will Murray. - [Berkeley Juggling Patterns](https://berkeleyjuggling.org/juggling-patterns/) - A collection of passing patterns from the Berkeley Juggling Club. - [Aerial Mirage Jugglers](http://www.gnerds.com/juggle/) - A collection of passing patterns and explanations from the Aerial Mirage Jugglers. - [Aidan's Juggling Resources](http://www.juggle.me.uk/passing/) - A collection of patterns and explanations from Aiden Burns. - [Passing Wiki](https://passingwiki.org/wiki/Main_Page) - A passing pattern wiki. ## Tools - [Juggling Lab](https://jugglinglab.org/) - Software used to generate and animate juggling patterns. - [Gunswap](http://www.gunswap.co/about) - A juggling animator and pattern library. - [Juggloid](http://juggloid.com/) - A siteswap animator with a large number of patterns that have already been created. - [Compatible Siteswaps](https://www.cs.cmu.edu/~ckaestne/siteswaps.xhtml) - A list of compatible siteswaps with filters. - [Two person Passing Pattern Spreadsheet](https://drive.google.com/file/d/0B26BTNBYVjFqdW9mWUgteDZYT00/view?ths=true) - A spreadsheet of two person passing patterns siteswaps. - [Juggling Graphics](https://juggling.graphics/) - A siteswap graphic creator. - [Prechac This](http://www.prechacthis.org/) - A siteswap generator website. - [prech.ac](http://prech.ac) - A url shortener for PrechacThis ([source](https://github.com/prechac/prech.ac)) - [Passist](https://passist.org/) - A siteswap generator website. - [Siteswap Generator](https://play.google.com/store/apps/details?id=namlit.siteswapgenerator&hl=gsw) - A siteswap generator for Android. - [Siteswap Suggest](http://joshmermelstein.com/juggle-suggest/) - A siteswap animator with an autocomplete feature. 
- [PassingSync](https://play.google.com/store/apps/details?id=edu.cmu.mastersofflyingobjects.passingsync) - An android app that speaks juggling instructions for passing partners to follow along with. - [Madeye](http://madeye.org/juggling/) - A variety of siteswap tools. - [Pattern Generator](http://jacos.nl/how-to-use-the-pattern-generator/) - A pattern generator website. - [JoePass!](http://koelnvention.de/w/?page_id=151) - Software used to animate juggling patterns. - [JuggleHacker](https://www.jugglehacker.com) - A generator for hijack passing patterns. ## Videos - [Juggling TV](http://juggling.tv/) - A juggling specific video site. - [Manipeo](http://manipeo.com/) - A user submitted juggling video site. ## Games - [Siteswap Anagrams](http://siteswapgame.herokuapp.com/) - A puzzle game for calculating siteswap anagrams. ## Misc - [Juggling Records](https://www.juggling-records.com/) - A tracker for current juggling world records. - [Fight Night Combat](http://www.fightnightcombat.com/index.html) - A tracker for juggling combat tournaments and standings. - [Juggling by Numbers - Numberphile](https://www.youtube.com/watch?time_continue=99&v=7dwgusHjA0Y) - An explanation video for siteswap. - [Siteswap Bot](https://github.com/loganstafman/siteswap-bot) - The source code for the reddit siteswap bot. - [Library of Juggling](https://www.libraryofjuggling.com/) - A compilation of juggling tricks with explanations. - [Skilldex](https://skilldex.org) - An online community to learn, share, and organize juggling skills. ## Vendors Alphabetical - [Bravo Juggling](http://www.bravojuggling.com/) - Sells a variety of juggling props (Hungary). - [Brontosaurus Balls](http://brontosaurusballs.com/) - Specializes in russian juggling balls (United States). - [Cathedral Juggling](http://www.cathedraljuggling.com/) - Sells a variety of juggling props (United States). - [Dube](https://www.dube.com/) - Sells a variety of juggling props (United States). 
- [Firetoys](https://www.firetoys.com/) - Sells a variety of juggling props (United States). - [Flairco](http://www.flairco.com) - Sells flair bartending supplies (United States). - [Flames 'N Games](https://flamesngames.co.uk/) - Sells a variety of juggling props (United Kingdom). - [Flowtoys](https://flowtoys.com/) - Sells a variety of juggling props (United States). - [Flying Clipper](https://www.flyingclipper.com/) - Sells footbags and juggling balls (United States). - [Gballz](https://gballz.com/) - Specializes in juggling balls (United States). - [Henrys](https://www.henrys-online.de/en/)- Sells a variety of juggling equipment (Germany). - [Higgins Brothers](http://higginsbrothers.com/en/) - Sells a variety of juggling props (Canada). - [Master Ongs Prop Shop](http://www.masterongspropshop.com/) - Sells a variety of juggling props (United States). - [Odd Balls](https://www.oddballs.co.uk/) - Sells a variety of juggling props (United Kingdom). - [Pass the Props](http://passtheprops.com/) - Sells a variety of juggling props (United States). - [Play Juggling](https://www.playjuggling.com/en/) - Sells a variety of juggling props (Italy). - [Renegade Juggling](https://www.renegadejuggling.com/) - Sells a variety of juggling props (United States). - [Sport Juggling Company](http://sportjugglingco.com/) - Specializes in juggling balls (United States). - [Todd Smith](http://toddsmith.com/) - Sells a variety of juggling props (United States). - [Three Finger Juggling](https://threefingerjuggling.com/) - Specializes in dangerous juggling props (United States).
70.626168
199
0.752415
eng_Latn
0.352669
fad4a5880a6e2e5c3ff1c283e898dda51a7615fc
14,208
md
Markdown
README_CN_V2.md
huaryliu/helm-wrapper
ed05629878f6d926d55034b276dbaa17b6d09a17
[ "Apache-2.0" ]
null
null
null
README_CN_V2.md
huaryliu/helm-wrapper
ed05629878f6d926d55034b276dbaa17b6d09a17
[ "Apache-2.0" ]
null
null
null
README_CN_V2.md
huaryliu/helm-wrapper
ed05629878f6d926d55034b276dbaa17b6d09a17
[ "Apache-2.0" ]
null
null
null
# A [Helm3](https://github.com/helm/helm) HTTP Wrapper With Go SDK Helm3 摒弃了 Helm2 的 Tiller 架构,使用纯命令行的方式执行相关操作。如果想通过 Helm API 来实现相关功能,很遗憾官方并没有提供类似的服务。不过,因为官方提供了相对友好的 [Helm Go SDK](https://helm.sh/docs/topics/advanced/),我们只需在此基础上做封装即可实现。[helm-wrapper](https://github.com/opskumu/helm-wrapper) 就是这样一个通过 Go [Gin](https://github.com/gin-gonic/gin) Web 框架,结合 Helm Go SDK 封装的 HTTP Server,让 Helm 相关的日常命令操作可以通过 Restful API 的方式来实现同样的操作。 ## Support API * 如果某些API需要支持多个集群,则可以使用以下参数 | Params | Description | | :- | :- | | kube_context | 支持指定kube_context来区分不同集群 | helm 原生命令行和相关 API 对应关系: + helm install - `POST` - `/api/namespaces/:namespace/releases/:release?chart=<chartName>` POST Body: ``` json { "dry_run": false, // `--dry-run` "disable_hooks": false, // `--no-hooks` "wait": false, // `--wait` "devel": false, // `--false` "description": "", // `--description` "atomic": false, // `--atomic` "skip_crds": false, // `--skip-crds` "sub_notes": false, // `--render-subchart-notes` "create_namespace": false, // `--create-namespace` "dependency_update": false, // `--dependency-update` "values": "", // `--values` "set": [], // `--set` "set_string": [], // `--set-string` "ca_file": "", // `--ca-file` "cert_file": "", // `--cert-file` "key_file": "", // `--key-file` "insecure_skip_verify": "", // `--insecure-skip-verify` "keyring": "", // `--keyring` "password": "", // `--password` "repo": "", // `--repo` "username": "", // `--username` "verify": false, // `--verify` "version": "" // `--version` } ``` > 此处 values 内容同 helm install `--values` 选项 + helm uninstall - `DELETE` - `/api/namespaces/:namespace/releases/:release` + helm upgrade - `PUT` - `/api/namespaces/:namespace/releases/:release?chart=<chartName>` PUT Body: ``` json { "dry_run": false, // `--dry-run` "disable_hooks": false, // `--no-hooks` "wait": false, // `--wait` "devel": false, // `--false` "description": "", // `--description` "atomic": false, // `--atomic` "skip_crds": false, // `--skip-crds` "sub_notes": false, // 
`--render-subchart-notes`
    "force": false,                // `--force`
    "install": false,              // `--install`
    "recreate": false,             // `--recreate`
    "cleanup_on_fail": false,      // `--cleanup-on-fail`
    "values": "",                  // `--values`
    "set": [],                     // `--set`
    "set_string": [],              // `--set-string`
    "ca_file": "",                 // `--ca-file`
    "cert_file": "",               // `--cert-file`
    "key_file": "",                // `--key-file`
    "insecure_skip_verify": "",    // `--insecure-skip-verify`
    "keyring": "",                 // `--keyring`
    "password": "",                // `--password`
    "repo": "",                    // `--repo`
    "username": "",                // `--username`
    "verify": false,               // `--verify`
    "version": ""                  // `--version`
}
```

> The values field here works the same as the helm upgrade `--values` option.

+ helm rollback
  - `PUT`
  - `/api/namespaces/:namespace/releases/:release/versions/:reversion`

Optional PUT body:

``` json
{
    "dry_run": false,          // `--dry-run`
    "disable_hooks": false,    // `--no-hooks`
    "wait": false,             // `--wait`
    "force": false,            // `--force`
    "recreate": false,         // `--recreate`
    "cleanup_on_fail": false,  // `--cleanup-on-fail`
    "history_max": int         // `--history-max`
}
```

+ helm list
  - `GET`
  - `/api/namespaces/:namespace/releases`

Body:

``` json
{
    "all": false,              // `--all`
    "all_namespaces": false,   // `--all-namespaces`
    "by_date": false,          // `--date`
    "sort_reverse": false,     // `--reverse`
    "limit": ,                 // `--max`
    "offset": ,                // `--offset`
    "filter": "",              // `--filter`
    "uninstalled": false,      // `--uninstalled`
    "uninstalling": false,     // `--uninstalling`
    "superseded": false,       // `--superseded`
    "failed": false,           // `--failed`
    "deployed": false,         // `--deployed`
    "pending": false           // `--pending`
}
```

+ helm get
  - `GET`
  - `/api/namespaces/:namespace/releases/:release`

| Params | Description |
| :- | :- |
| info | one of hooks/manifest/notes/values; defaults to values |
| output | output format for values (only valid when info=values); json or yaml, defaults to json |

+ helm release history
  - `GET`
  - `/api/namespaces/:namespace/releases/:release/histories`

+ helm show
  - `GET`
  - `/api/charts`

| Params | Description |
| :- | :- |
| chart | chart name, required |
| info | one of all/readme/values/chart; defaults to all |
| version | pin a chart version, same as on the command line |

+ helm search repo
  - `GET`
  - `/api/repositories/charts`

| Params | Description |
| :- | :- |
| keyword | search keyword, required |
| version | specify a chart version |
| versions | if "true", all versions |

+ helm repo list
  - `GET`
  - `/api/repositories`

+ helm repo update
  - `PUT`
  - `/api/repositories`

+ helm env
  - `GET`
  - `/api/envs`

+ upload chart
  - `POST`
  - `/api/charts/upload`

| Params | Description |
| :- | :- |
| chart | chart package; must be a .tgz file |

+ list local charts
  - `GET`
  - `/api/charts/upload`

> This version is currently in Alpha and has not been tested at scale; each feature has only been exercised once. You can also build a customized version that fits your own needs on top of it.

## Support API -V2

------
------

To support multi-cluster scenarios, the following APIs were added:

#### Cluster config file [new]

- upload
  - POST
  - `/api/k8s/config/upload`

Form:

| Params | Description |
| :----- | :------------- |
| file | the cluster's config file |

Response:

```json
{
    "code": 200,
    "data": "7fc76a25-14c5-4da8-b19a-985a36dfdc95"
}
```

> Note: the returned data field is the cluster ID; the cluster segment in the Helm API URLs below takes this data value.

#### Helm-V2

+ helm install
  - `POST`
  - `/api/namespaces/:namespace/releases/v2/:cluster/install/:release`

Form:

| Params | Description |
| :----- | :----------------------------------------------------------- |
| chart | chart package; must be a .tgz file |
| args | {<br/> "dry_run": false, // `--dry-run`<br/> "disable_hooks": false, // `--no-hooks`<br/> "wait": false, // `--wait`<br/> "devel": false, // `--devel`<br/> "description": "", // `--description`<br/> "atomic": false, // `--atomic`<br/> "skip_crds": false, // `--skip-crds`<br/> "sub_notes": false, // `--render-subchart-notes`<br/> "create_namespace": false, // `--create-namespace`<br/> "dependency_update": false, // `--dependency-update`<br/> "values": "", // `--values`<br/> "set": [], // `--set`<br/> "set_string": [], // `--set-string`<br/> "ca_file": "", // `--ca-file`<br/> "cert_file": "", // `--cert-file`<br/> "key_file": "", // `--key-file`<br/> "insecure_skip_verify": false, // `--insecure-skip-verify`<br/> "keyring": "", // `--keyring`<br/> "password": "", // `--password`<br/> "repo": "", // `--repo`<br/> "username": "", // `--username`<br/> "verify": false, // `--verify`<br/> "version": "" // `--version`<br/>} |

> - cluster is the data value returned by the cluster config file API
> - args does not have to contain every field; if you only want to set the set field, you can send it alone, written like this:
>
> ```json
> {
>     "set": ["replicaCount=2"]
> }
> ```
>
> - the values field here works the same as the helm install `--values` option

+ helm upgrade
  - `POST`
  - `/api/namespaces/:namespace/releases/v2/:cluster/upgrade/:release`

Form:

| Params | Description |
| :----- | :----------------------------------------------------------- |
| chart | chart package; must be a .tgz file |
| args | {<br/> "dry_run": false, // `--dry-run`<br/> "disable_hooks": false, // `--no-hooks`<br/> "wait": false, // `--wait`<br/> "devel": false, // `--devel`<br/> "description": "", // `--description`<br/> "atomic": false, // `--atomic`<br/> "skip_crds": false, // `--skip-crds`<br/> "sub_notes": false, // `--render-subchart-notes`<br/> "create_namespace": false, // `--create-namespace`<br/> "dependency_update": false, // `--dependency-update`<br/> "values": "", // `--values`<br/> "set": [], // `--set`<br/> "set_string": [], // `--set-string`<br/> "ca_file": "", // `--ca-file`<br/> "cert_file": "", // `--cert-file`<br/> "key_file": "", // `--key-file`<br/> "insecure_skip_verify": false, // `--insecure-skip-verify`<br/> "keyring": "", // `--keyring`<br/> "password": "", // `--password`<br/> "repo": "", // `--repo`<br/> "username": "", // `--username`<br/> "verify": false, // `--verify`<br/> "version": "" // `--version`<br/>} |

> - cluster is the data value returned by the cluster config file API
> - args does not have to contain every field; if you only want to set the set field, you can send it alone, written like this:
>
> ```json
> {
>     "set": ["replicaCount=2"]
> }
> ```
>
> - the values field here works the same as the helm install `--values` option
> - the release must already be installed, otherwise an error is returned

+ helm list
  - `GET`
  - `/api/namespaces/:namespace/releases/v2/:cluster`

Body:

``` json
{
    "all": false,              // `--all`
    "all_namespaces": false,   // `--all-namespaces`
    "by_date": false,          // `--date`
    "sort_reverse": false,     // `--reverse`
    "limit": ,                 // `--max`
    "offset": ,                // `--offset`
    "filter": "",              // `--filter`
    "uninstalled": false,      // `--uninstalled`
    "uninstalling": false,     // `--uninstalling`
    "superseded": false,       // `--superseded`
    "failed": false,           // `--failed`
    "deployed": false,         // `--deployed`
    "pending": false           // `--pending`
}
```

+ helm get
  - `GET`
  - `/api/namespaces/:namespace/releases/v2/:cluster/:release`

+ helm uninstall
  - `DELETE`
  - `/api/namespaces/:namespace/releases/v2/:cluster/:release`

+ helm rollback
  - `PUT`
  - `/api/namespaces/:namespace/releases/v2/:cluster/:release/versions/:reversion`

+ helm release status
  - `GET`
  - `/api/namespaces/:namespace/releases/v2/:cluster/:release/status`

+ helm release history
  - `GET`
  - `/api/namespaces/:namespace/releases/v2/:cluster/:release/histories`

------
------

### Response

For simplicity, every request uniformly returns HTTP status 200; whether the call succeeded is judged from the Code value in the response body:

``` go
type respBody struct {
    Code  int         `json:"code"` // 0 or 1, 0 is ok, 1 is error
    Data  interface{} `json:"data,omitempty"`
    Error string      `json:"error,omitempty"`
}
```

------
------

**[Changed]**

```go
type respBody struct {
    Code  int         `json:"code"` // 200 or 500, 200 is ok, 500 is error
    Data  interface{} `json:"data,omitempty"`
    Error string      `json:"error,omitempty"`
}
```

------
------

## Build & Run

### Build

The source tree ships with a simple `Makefile`; to build, just run one of:

```
make build        // build a binary for the host architecture
make build-linux  // build a Linux binary
make build-docker // build a Docker image
```

A plain build produces a binary named `helm-wrapper`; you can get help like this:

```
$ helm-wrapper -h
Usage of helm-wrapper:
      --addr string                    server listen addr (default "0.0.0.0")
      --alsologtostderr                log to standard error as well as files
      --config string                  helm wrapper config (default "config.yaml")
      --debug                          enable verbose output
      --kube-context string            name of the kubeconfig context to use
      --kubeconfig string              path to the kubeconfig file
      --log_backtrace_at traceLocation when logging hits line file:N, emit a stack trace (default :0)
      --log_dir string                 If non-empty, write log files in this directory
      --logtostderr                    log to standard error instead of files (default true)
  -n, --namespace string               namespace scope for this request
      --port string                    server listen port (default "8080")
      --registry-config string         path to the registry config file (default "/root/.config/helm/registry.json")
      --repository-cache string        path to the file containing cached repository indexes (default "/root/.cache/helm/repository")
      --repository-config string       path to the file containing repository names and URLs (default "/root/.config/helm/repositories.yaml")
      --stderrthreshold severity       logs at or above this threshold go to stderr (default 2)
  -v, --v Level                        log level for V logs
      --vmodule moduleSpec             comma-separated list of pattern=N settings for file-filtered logging
pflag: help requested
```

The key options, briefly:

+ `--config` — the helm-wrapper configuration, shown below; it mainly names the Helm repos and their URLs, used for repo initialization.

```
$ cat config-example.yaml
uploadPath: /tmp/charts
helmRepos:
  - name: bitnami
    url: https://charts.bitnami.com/bitnami
```

+ `--kubeconfig` — if not specified, the default path is used, usually `~/.kube/config`. This option is required: it tells helm-wrapper which Kubernetes cluster to operate on and how to access it. Generating a `kubeconfig` file is not covered here; for details see [Configure Access to Multiple Clusters](https://kubernetes.io/docs/tasks/access-application-cluster/configure-access-multiple-clusters/)

### Run

Running it is straightforward. If you already have a default `kubeconfig` file locally, just prepare the repo config file helm-wrapper needs, then run it like this:

```
$ ./helm-wrapper --config </path/to/config.yaml> --kubeconfig </path/to/kubeconfig>
```

> Repos are initialized first at startup, so depending on the size of the repos and network conditions this can take some time

#### Running inside a Kubernetes cluster

Replace the image field in `deployment/deployment.yaml` with the correct address of your helm-wrapper image, then deploy with:

```
kubectl create -f ./deployment
```

> __Note:__ the steps above create the related RBAC resources, so there is no need to bake an extra kubeconfig file into the image; the pod has the required permissions by default
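A quick client-side sketch tying the response convention together: because helm-wrapper answers every request with HTTP 200 and signals failure only through the body's code field, a shell client has to parse the body rather than check the HTTP status. The payloads below are invented examples matching the two documented respBody variants; the helper name `check_resp` is ours, not part of the project.

```shell
#!/bin/sh
# check_resp: succeed only when the wrapped "code" field signals success.
# helm-wrapper always answers HTTP 200, so clients must inspect the body,
# e.g. resp=$(curl -s http://127.0.0.1:8080/api/envs); check_resp "$resp"
check_resp() {
    code=$(printf '%s' "$1" | sed -n 's/.*"code"[: ]*\([0-9]*\).*/\1/p')
    # 0 (old convention) and 200 (current convention) both mean success
    [ "$code" = "0" ] || [ "$code" = "200" ]
}

# Invented sample payloads in the documented respBody shape:
ok='{"code":200,"data":"7fc76a25-14c5-4da8-b19a-985a36dfdc95"}'
err='{"code":500,"error":"release: not found"}'

check_resp "$ok"  && echo "ok accepted"
check_resp "$err" || echo "err rejected"
```

The same check works for both V1 and V2 endpoints, since they share the response envelope.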
32.888889
1,243
0.514006
eng_Latn
0.175491
fad52af535e51cd4f4222ec498ffc7ed428fe1a1
53
md
Markdown
apps/toi-hjemmel/README.md
navikt/rekrutteringsbistand-microservices
6de5dc87193c46168aefd24c0b6c217f80a8d0b9
[ "MIT" ]
null
null
null
apps/toi-hjemmel/README.md
navikt/rekrutteringsbistand-microservices
6de5dc87193c46168aefd24c0b6c217f80a8d0b9
[ "MIT" ]
1
2021-12-20T07:58:59.000Z
2021-12-20T07:58:59.000Z
apps/toi-hjemmel/README.md
navikt/toi-rapids-and-rivers
dc306887e918d80a94c044b683b9953265127ab7
[ "MIT" ]
null
null
null
# Fetches hjemmel (legal-basis) information and puts it on the rapid
53
53
0.830189
nob_Latn
0.999325
fad53d52b55cbed57e09387bde2b6c67e2850d50
6,005
md
Markdown
repos/logstash/remote/5.6.14-alpine.md
Alizamani2731/repo-info
79dcc3d5e8fe76689abf6ab987d22e105509fa30
[ "Apache-2.0" ]
null
null
null
repos/logstash/remote/5.6.14-alpine.md
Alizamani2731/repo-info
79dcc3d5e8fe76689abf6ab987d22e105509fa30
[ "Apache-2.0" ]
null
null
null
repos/logstash/remote/5.6.14-alpine.md
Alizamani2731/repo-info
79dcc3d5e8fe76689abf6ab987d22e105509fa30
[ "Apache-2.0" ]
null
null
null
## `logstash:5.6.14-alpine` ```console $ docker pull logstash@sha256:2ddb0c85aaa7e545ab348985237e5b9f7ce7e8c2fe6301d005e6fcc45509c30a ``` - Manifest MIME: `application/vnd.docker.distribution.manifest.list.v2+json` - Platforms: - linux; amd64 ### `logstash:5.6.14-alpine` - linux; amd64 ```console $ docker pull logstash@sha256:879e4188bf6456b88c5086cf202561612811d16b787ae1125493d41c2cb90069 ``` - Docker Version: 18.06.1-ce - Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json` - Total Size: **165.0 MB (164999180 bytes)** (compressed transfer size, not on-disk size) - Image ID: `sha256:83fe1ff37c1e1202b0f187b780a872dad4bd31ba989e1e72ae533ab433da1524` - Entrypoint: `["\/docker-entrypoint.sh"]` - Default Command: `["-e",""]` ```dockerfile # Fri, 21 Dec 2018 00:21:29 GMT ADD file:2ff00caea4e83dfade726ca47e3c795a1e9acb8ac24e392785c474ecf9a621f2 in / # Fri, 21 Dec 2018 00:21:30 GMT CMD ["/bin/sh"] # Fri, 21 Dec 2018 00:40:53 GMT ENV LANG=C.UTF-8 # Fri, 21 Dec 2018 00:40:57 GMT RUN { echo '#!/bin/sh'; echo 'set -e'; echo; echo 'dirname "$(dirname "$(readlink -f "$(which javac || which java)")")"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home # Fri, 21 Dec 2018 00:41:18 GMT ENV JAVA_HOME=/usr/lib/jvm/java-1.8-openjdk/jre # Fri, 21 Dec 2018 00:41:18 GMT ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/jvm/java-1.8-openjdk/jre/bin:/usr/lib/jvm/java-1.8-openjdk/bin # Fri, 11 Jan 2019 00:26:19 GMT ENV JAVA_VERSION=8u191 # Fri, 11 Jan 2019 00:26:19 GMT ENV JAVA_ALPINE_VERSION=8.191.12-r0 # Fri, 11 Jan 2019 00:26:23 GMT RUN set -x && apk add --no-cache openjdk8-jre="$JAVA_ALPINE_VERSION" && [ "$JAVA_HOME" = "$(docker-java-home)" ] # Fri, 11 Jan 2019 01:57:33 GMT RUN addgroup -S logstash && adduser -S -G logstash logstash # Fri, 11 Jan 2019 01:57:35 GMT RUN apk add --no-cache bash libc6-compat libzmq # Fri, 11 Jan 2019 01:57:35 GMT RUN apk add --no-cache 'su-exec>=0.2' # Fri, 11 Jan 2019 01:57:36 GMT 
ENV GPG_KEY=46095ACC8548582C1A2699A9D27D666CD88E42B4 # Fri, 11 Jan 2019 01:57:36 GMT ENV LOGSTASH_PATH=/usr/share/logstash/bin # Fri, 11 Jan 2019 01:57:36 GMT ENV PATH=/usr/share/logstash/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/jvm/java-1.8-openjdk/jre/bin:/usr/lib/jvm/java-1.8-openjdk/bin # Fri, 11 Jan 2019 01:57:36 GMT ENV LOGSTASH_VERSION=5.6.14 # Fri, 11 Jan 2019 01:57:36 GMT ENV LOGSTASH_TARBALL=https://artifacts.elastic.co/downloads/logstash/logstash-5.6.14.tar.gz LOGSTASH_TARBALL_ASC=https://artifacts.elastic.co/downloads/logstash/logstash-5.6.14.tar.gz.asc LOGSTASH_TARBALL_SHA1=1d3c03897d5ee843f60e450d3ebef0c9353dc90d # Fri, 11 Jan 2019 01:57:49 GMT RUN set -ex; if [ -z "$LOGSTASH_TARBALL_SHA1" ] && [ -z "$LOGSTASH_TARBALL_ASC" ]; then echo >&2 'error: have neither a SHA1 _or_ a signature file -- cannot verify download!'; exit 1; fi; apk add --no-cache --virtual .fetch-deps ca-certificates gnupg openssl tar ; wget -O logstash.tar.gz "$LOGSTASH_TARBALL"; if [ "$LOGSTASH_TARBALL_SHA1" ]; then echo "$LOGSTASH_TARBALL_SHA1 *logstash.tar.gz" | sha1sum -c -; fi; if [ "$LOGSTASH_TARBALL_ASC" ]; then wget -O logstash.tar.gz.asc "$LOGSTASH_TARBALL_ASC"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$GPG_KEY"; gpg --batch --verify logstash.tar.gz.asc logstash.tar.gz; rm -rf "$GNUPGHOME" logstash.tar.gz.asc; fi; dir="$(dirname "$LOGSTASH_PATH")"; mkdir -p "$dir"; tar -xf logstash.tar.gz --strip-components=1 -C "$dir"; rm logstash.tar.gz; apk del .fetch-deps; export LS_SETTINGS_DIR="$dir/config"; if [ -f "$LS_SETTINGS_DIR/log4j2.properties" ]; then cp "$LS_SETTINGS_DIR/log4j2.properties" "$LS_SETTINGS_DIR/log4j2.properties.dist"; truncate -s 0 "$LS_SETTINGS_DIR/log4j2.properties"; fi; for userDir in "$dir/config" "$dir/data" ; do if [ -d "$userDir" ]; then chown -R logstash:logstash "$userDir"; fi; done; logstash --version # Fri, 11 Jan 2019 01:57:49 GMT COPY 
file:ce3bf8cc5446bdbb16718eb5decb902429c53b67cd42ac64921c065e79206386 in / # Fri, 11 Jan 2019 01:57:49 GMT ENTRYPOINT ["/docker-entrypoint.sh"] # Fri, 11 Jan 2019 01:57:50 GMT CMD ["-e" ""] ``` - Layers: - `sha256:cd784148e3483c2c86c50a48e535302ab0288bebd587accf40b714fffd0646b3` Last Modified: Fri, 21 Dec 2018 00:23:44 GMT Size: 2.2 MB (2207025 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:35920a071f912ae4c16897610e8e2d514efcfd0e14ec76c4c73bf9aa7c2c55ea` Last Modified: Fri, 21 Dec 2018 00:44:42 GMT Size: 237.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:1a5149a464dd4b435dd529c6420b1a5c12b1076cdb94d3e0c41cdcc78f9582a5` Last Modified: Fri, 11 Jan 2019 00:30:16 GMT Size: 54.9 MB (54866597 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:a9dfd17aeb74ed9323ce9dc44db74bdc5ed2157781bac5299cc578408895353e` Last Modified: Fri, 11 Jan 2019 01:57:57 GMT Size: 1.3 KB (1257 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:b17be624d0ce925f0a49ae766253ea9b0f9cd0eea206c45c8f7a1d3784fd85c8` Last Modified: Fri, 11 Jan 2019 01:57:58 GMT Size: 1.5 MB (1548838 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d01f3e874ab7e936ebe2087ea34a1ee717016a5097b24544f8cab5695bd02fe8` Last Modified: Fri, 11 Jan 2019 01:57:57 GMT Size: 96.6 KB (96611 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:43a9c26e13e352055aba5b1333c4682f9d22255159c95793073f1f34c9530a81` Last Modified: Fri, 11 Jan 2019 01:58:12 GMT Size: 106.3 MB (106278312 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:45035cfb8d11c052f1c5dce0b1b16a154d90bebda8e92f897d6dea4cd0cf9b46` Last Modified: Fri, 11 Jan 2019 01:57:57 GMT Size: 303.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
58.300971
1,306
0.731724
yue_Hant
0.291905
fad56fcc01dceb771e8f5d5714e57ca8786af57b
1,176
md
Markdown
windows-driver-docs-pr/sensors/how-to-build-a-universal-sensor-driver.md
Ryooooooga/windows-driver-docs.ja-jp
c7526f4e7d66ff01ae965b5670d19fd4be158f04
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/sensors/how-to-build-a-universal-sensor-driver.md
Ryooooooga/windows-driver-docs.ja-jp
c7526f4e7d66ff01ae965b5670d19fd4be158f04
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/sensors/how-to-build-a-universal-sensor-driver.md
Ryooooooga/windows-driver-docs.ja-jp
c7526f4e7d66ff01ae965b5670d19fd4be158f04
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: How to build a universal sensor driver description: A universal sensor driver is a sensor driver developed for Windows 10, based on the universal sensor driver model. ms.assetid: 759E01CA-9838-4CBF-B5D1-2DCD2230A48A ms.date: 04/20/2017 ms.localizationpriority: medium ms.openlocfilehash: 9a40d841d481ba9c973c3959d26784ebd927f0b3 ms.sourcegitcommit: 0cc5051945559a242d941a6f2799d161d8eba2a7 ms.translationtype: MT ms.contentlocale: ja-JP ms.lasthandoff: 04/23/2019 ms.locfileid: "63366514" --- # <a name="how-to-build-a-universal-sensor-driver"></a>How to build a universal sensor driver A *universal sensor driver* is a sensor driver for Windows 10, developed on the universal sensor driver model. The topics in this section explain how to build such a sensor driver. This universal sensor driver development work is based on a development board called [Sharks Cove]( https://firmware.intel.com/projects/sharks-cove-uefi-firmware). The following topics provide detailed guidance on the tasks you need to perform: - [Set up your development environment](set-up-your-development-environment.md) - [Prepare your sensor test board](prepare-your-sensor-test-board.md) - [Connect your sensor to the Sharks Cove board](connect-your-sensor-to-the-sharks-cove-board.md) - [Write and deploy your universal sensor driver](write-and-deploy-your-universal-sensor-driver.md) - [Test your universal sensor driver](test-your-universal-sensor-driver.md)
32.666667
124
0.79932
yue_Hant
0.32753
fad62009856e3437257dd1c3486bad5ed1a1ceae
6,393
md
Markdown
src/api/activiti-rest-api/docs/TaskvariablesApi.md
alexserravidal/alfresco-js-api
94b5d51d1edf81556e2403c5c410304f1e9e38a9
[ "Apache-2.0" ]
110
2016-06-30T16:26:48.000Z
2022-03-28T04:38:54.000Z
src/api/activiti-rest-api/docs/TaskvariablesApi.md
alexserravidal/alfresco-js-api
94b5d51d1edf81556e2403c5c410304f1e9e38a9
[ "Apache-2.0" ]
898
2016-07-08T18:01:31.000Z
2022-03-22T06:49:57.000Z
src/api/activiti-rest-api/docs/TaskvariablesApi.md
alexserravidal/alfresco-js-api
94b5d51d1edf81556e2403c5c410304f1e9e38a9
[ "Apache-2.0" ]
75
2016-06-30T13:22:37.000Z
2022-01-25T08:52:56.000Z
# TaskvariablesApi

All URIs are relative to */activiti-app/api*

Method | HTTP request | Description
------------- | ------------- | -------------
[**createTaskVariable**](TaskvariablesApi.md#createTaskVariable) | **POST** /enterprise/tasks/{taskId}/variables | Create variables
[**deleteAllLocalTaskVariables**](TaskvariablesApi.md#deleteAllLocalTaskVariables) | **DELETE** /enterprise/tasks/{taskId}/variables | Delete all local variables
[**deleteVariable**](TaskvariablesApi.md#deleteVariable) | **DELETE** /enterprise/tasks/{taskId}/variables/{variableName} | Delete a variable
[**getVariable**](TaskvariablesApi.md#getVariable) | **GET** /enterprise/tasks/{taskId}/variables/{variableName} | Get a variable
[**getVariables**](TaskvariablesApi.md#getVariables) | **GET** /enterprise/tasks/{taskId}/variables | List variables
[**updateVariable**](TaskvariablesApi.md#updateVariable) | **PUT** /enterprise/tasks/{taskId}/variables/{variableName} | Update a variable

<a name="createTaskVariable"></a>
# **createTaskVariable**
> RestVariable createTaskVariable(taskId, restVariables)

Create variables

### Example

```javascript
import TaskvariablesApi from 'TaskvariablesApi';
import { AlfrescoApi } from '@alfresco/js-api';

this.alfrescoApi = new AlfrescoApi();
this.alfrescoApi.setConfig({ hostEcm: 'http://127.0.0.1:8080' });

let taskvariablesApi = new TaskvariablesApi(this.alfrescoApi);

taskvariablesApi.createTaskVariable(taskId, restVariables).then((data) => {
  console.log('API called successfully. Returned data: ' + data);
}, function(error) {
  console.error(error);
});
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **taskId** | **string**| taskId |
 **restVariables** | [**RestVariable**](RestVariable.md)| restVariables |

### Return type

[**RestVariable**](RestVariable.md)

<a name="deleteAllLocalTaskVariables"></a>
# **deleteAllLocalTaskVariables**
> deleteAllLocalTaskVariables(taskId)

Delete all local variables

### Example

```javascript
import TaskvariablesApi from 'TaskvariablesApi';
import { AlfrescoApi } from '@alfresco/js-api';

this.alfrescoApi = new AlfrescoApi();
this.alfrescoApi.setConfig({ hostEcm: 'http://127.0.0.1:8080' });

let taskvariablesApi = new TaskvariablesApi(this.alfrescoApi);

taskvariablesApi.deleteAllLocalTaskVariables(taskId).then(() => {
  console.log('API called successfully.');
}, function(error) {
  console.error(error);
});
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **taskId** | **string**| taskId |

### Return type

null (empty response body)

<a name="deleteVariable"></a>
# **deleteVariable**
> deleteVariable(taskId, variableName, opts)

Delete a variable

### Example

```javascript
import TaskvariablesApi from 'TaskvariablesApi';
import { AlfrescoApi } from '@alfresco/js-api';

this.alfrescoApi = new AlfrescoApi();
this.alfrescoApi.setConfig({ hostEcm: 'http://127.0.0.1:8080' });

let taskvariablesApi = new TaskvariablesApi(this.alfrescoApi);

let opts = {
  'scope': scope_example // | scope
};

taskvariablesApi.deleteVariable(taskId, variableName, opts).then(() => {
  console.log('API called successfully.');
}, function(error) {
  console.error(error);
});
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **taskId** | **string**| taskId |
 **variableName** | **string**| variableName |
 **scope** | **string**| scope | [optional]

### Return type

null (empty response body)

<a name="getVariable"></a>
# **getVariable**
> RestVariable getVariable(taskId, variableName, opts)

Get a variable

### Example

```javascript
import TaskvariablesApi from 'TaskvariablesApi';
import { AlfrescoApi } from '@alfresco/js-api';

this.alfrescoApi = new AlfrescoApi();
this.alfrescoApi.setConfig({ hostEcm: 'http://127.0.0.1:8080' });

let taskvariablesApi = new TaskvariablesApi(this.alfrescoApi);

let opts = {
  'scope': scope_example // | scope
};

taskvariablesApi.getVariable(taskId, variableName, opts).then((data) => {
  console.log('API called successfully. Returned data: ' + data);
}, function(error) {
  console.error(error);
});
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **taskId** | **string**| taskId |
 **variableName** | **string**| variableName |
 **scope** | **string**| scope | [optional]

### Return type

[**RestVariable**](RestVariable.md)

<a name="getVariables"></a>
# **getVariables**
> RestVariable getVariables(taskId, opts)

List variables

### Example

```javascript
import TaskvariablesApi from 'TaskvariablesApi';
import { AlfrescoApi } from '@alfresco/js-api';

this.alfrescoApi = new AlfrescoApi();
this.alfrescoApi.setConfig({ hostEcm: 'http://127.0.0.1:8080' });

let taskvariablesApi = new TaskvariablesApi(this.alfrescoApi);

let opts = {
  'scope': scope_example // | scope
};

taskvariablesApi.getVariables(taskId, opts).then((data) => {
  console.log('API called successfully. Returned data: ' + data);
}, function(error) {
  console.error(error);
});
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **taskId** | **string**| taskId |
 **scope** | **string**| scope | [optional]

### Return type

[**RestVariable**](RestVariable.md)

<a name="updateVariable"></a>
# **updateVariable**
> RestVariable updateVariable(taskId, variableName, restVariable)

Update a variable

### Example

```javascript
import TaskvariablesApi from 'TaskvariablesApi';
import { AlfrescoApi } from '@alfresco/js-api';

this.alfrescoApi = new AlfrescoApi();
this.alfrescoApi.setConfig({ hostEcm: 'http://127.0.0.1:8080' });

let taskvariablesApi = new TaskvariablesApi(this.alfrescoApi);

taskvariablesApi.updateVariable(taskId, variableName, restVariable).then((data) => {
  console.log('API called successfully. Returned data: ' + data);
}, function(error) {
  console.error(error);
});
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **taskId** | **string**| taskId |
 **variableName** | **string**| variableName |
 **restVariable** | [**RestVariable**](RestVariable.md)| restVariable |

### Return type

[**RestVariable**](RestVariable.md)
161
0.664633
eng_Latn
0.189927
fad6b65ac87cd3a80c0dd9cc987b8a0f2a1d90aa
1,788
md
Markdown
content/publication/sarac-2019-a/index.md
maxdiluca/academic-kickstart
27fe4bda2fb0134bcddfda34439b88d4fc8cde17
[ "MIT" ]
null
null
null
content/publication/sarac-2019-a/index.md
maxdiluca/academic-kickstart
27fe4bda2fb0134bcddfda34439b88d4fc8cde17
[ "MIT" ]
null
null
null
content/publication/sarac-2019-a/index.md
maxdiluca/academic-kickstart
27fe4bda2fb0134bcddfda34439b88d4fc8cde17
[ "MIT" ]
null
null
null
--- # Documentation: https://sourcethemes.com/academic/docs/managing-content/ title: Haptic Sketches on the Arm for manipulation in virtual reality subtitle: '' summary: '' authors: - Mine Sarac - Allison M. Okamura - Massimiliano Di Luca tags: - 'haptic perception' - 'haptics' - 'wearable' - 'weight perception' - 'device' - 'force feedback' categories: [] date: '2019-11-01' lastmod: 2021-04-16T20:49:27+02:00 featured: false draft: false # Featured image # To use, add an image named `featured.jpg/png` to your page's folder. # Focal points: Smart, Center, TopLeft, Top, TopRight, Left, Right, BottomLeft, Bottom, BottomRight. image: caption: '' focal_point: '' preview_only: false # Projects (optional). # Associate this post with one or more of your projects. # Simply enter your project's folder or file name without extension. # E.g. `projects = ["internal-project"]` references `content/project/deep-learning/index.md`. # Otherwise, set `projects = []`. projects: [] publishDate: '2021-04-16T18:49:26.420291Z' publication_types: - '3' abstract: We propose a haptic system that applies forces or skin deformation to the user's arm, rather than at the fingertips, for believable interaction with virtual objects as an alternative to complex thimble devices. Such a haptic system would be able to convey information to the arm instead of the fingertips, even though the user manipulates virtual objects using their hands. We developed a set of haptic sketches to determine which directions of skin deformation are deemed more believable during a grasp and lift task. Subjective reports indicate that normal forces were the most believable feedback to represent this interaction. publication: '*arXiv*' url_project: http://arxiv.org/abs/1911.08528 ---
34.384615
100
0.753915
eng_Latn
0.97619
fad6d5e43efa02f38d45d5c5fcee71fc0a63dc24
614
md
Markdown
markdown/org/docs/patterns/shin/needs/es.md
TriploidTree/freesewing
428507c6682d49a0869a13ce56b4a38d844a8e5c
[ "MIT" ]
null
null
null
markdown/org/docs/patterns/shin/needs/es.md
TriploidTree/freesewing
428507c6682d49a0869a13ce56b4a38d844a8e5c
[ "MIT" ]
2
2022-02-04T13:28:21.000Z
2022-02-04T14:07:38.000Z
markdown/org/docs/patterns/shin/needs/es.md
SeaZeeZee/freesewing
8589e7b7ceeabd738c4b69ac59980acbebfea537
[ "MIT" ]
null
null
null
--- title: "Shin swim trunks: What You Need" --- To make Shin, you will need the following: - Basic sewing supplies - About 0.75 meters (0.8 yards) of a suitable fabric ([see Fabric options](/docs/patterns/shin/fabric)) - Two eyelets and a drawstring > ## A serger/overlock is nice, but optional > > As with all stretch fabrics, a serger or overlock machine will make your life easier. > > If you don't have one, don't despair. You don't really need it. You can use another technique for stretch seams, such as a zig-zag stitch, a twin needle, or elastic thread.
38.375
186
0.739414
spa_Latn
0.978892
fad6d69663970a3b67341fa27c538ac6cc3140c8
2,003
markdown
Markdown
source/blog/2015-12-08-we-are-the-coyote.html.markdown
natanrolnik/blog-1
49c6e363da9654cca382ec71199f7ea365b9066f
[ "CC-BY-4.0" ]
1
2019-06-11T16:32:39.000Z
2019-06-11T16:32:39.000Z
source/blog/2015-12-08-we-are-the-coyote.html.markdown
BalestraPatrick/blog
998319646360d9caaf003c46fbb8df49bea2d0b5
[ "CC-BY-4.0" ]
null
null
null
source/blog/2015-12-08-we-are-the-coyote.html.markdown
BalestraPatrick/blog
998319646360d9caaf003c46fbb8df49bea2d0b5
[ "CC-BY-4.0" ]
null
null
null
--- title: We Are the Coyote date: 2015-12-08 16:21:57 UTC --- I saw [this](https://twitter.com/andrey_butov/status/674237455911006209) tweeted earlier, and it really resonated with me. (READMORE) <blockquote class="twitter-tweet" lang="en"><p lang="en" dir="ltr">Chuck Jones’ original rules for the Wile E. Coyote and The Road Runner cartoon. <a href="https://t.co/L45bSBpg2B">pic.twitter.com/L45bSBpg2B</a></p>&mdash; Andrey Butov (@andrey_butov) <a href="https://twitter.com/andrey_butov/status/674237455911006209">December 8, 2015</a></blockquote> It's been [bouncing around the Internet](http://mentalfloss.com/article/62035/chuck-jones-rules-writing-road-runner-cartoons) for some time now, but I only saw it today. It's quite interesting – let's break it down a bit. The really interesting part, I think, is the coyote's chase of the Road Runner as a metaphor for our own pursuit of happiness. The Road Runner and Wile E. Coyote are one-in-the-same, their constant struggle represents the struggle we have on a day-to-day basis to be happy. Think about it: consider the coyote as the part of yourself that is always chasing "happiness." Now think about the following rules: "No outside force can harm the coyote – only his own ineptitude..." "The coyote could stop any time..." "The coyote is always more humiliated than harmed by his failures." We all struggle to be happy, but [pursuit of happiness itself will always be fruitless](/blog/you-never-arrive/). The coyote _could_ stop chasing the Road Runner, but it's all he's ever known. But unlike the coyote, _we do_ have a choice. We can stop, and we can realize that [most of our pain is self-inflicted](http://www.huffingtonpost.com/susan-bernstein/dont-shoot-the-second-arr_b_5102701.html). We have a choice the coyote does not, and it would be a shame not to take the wiser path – to stop _chasing_ happiness and maybe you'll actually catch it. <script async src="//platform.twitter.com/widgets.js" charset="utf-8"></script>
69.068966
556
0.761358
eng_Latn
0.990104
fad8e853ac6787d17ad1b59828a2ed6a4f8c05a5
11,289
md
Markdown
aspnetcore/security/authentication/social/social-without-identity.md
angelobelchior/AspNetCore.Docs.pt-br
61e6ca064ffb2afab2598cb3bf3268352e053bd8
[ "CC-BY-4.0", "MIT" ]
1
2020-12-23T00:29:11.000Z
2020-12-23T00:29:11.000Z
aspnetcore/security/authentication/social/social-without-identity.md
Lucilene-Pinheiro/AspNetCore.Docs.pt-br
0288109b953262c095a52564f440b0a922e3bf7c
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnetcore/security/authentication/social/social-without-identity.md
Lucilene-Pinheiro/AspNetCore.Docs.pt-br
0288109b953262c095a52564f440b0a922e3bf7c
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Autenticação do Facebook, do Google e do provedor externo sem ASP.NET Core Identity author: rick-anderson description: Uma explicação de como usar o Facebook, o Google, o Twitter, etc. a autenticação de usuário da conta sem ASP.NET Core Identity . ms.author: riande ms.date: 12/10/2019 no-loc: - appsettings.json - ASP.NET Core Identity - cookie - Cookie - Blazor - Blazor Server - Blazor WebAssembly - Identity - Let's Encrypt - Razor - SignalR uid: security/authentication/social/social-without-identity ms.openlocfilehash: cd7545a3ddaccedfa64ef5e9d5458c21c651257a ms.sourcegitcommit: ca34c1ac578e7d3daa0febf1810ba5fc74f60bbf ms.translationtype: MT ms.contentlocale: pt-BR ms.lasthandoff: 10/30/2020 ms.locfileid: "93060281" --- # <a name="use-social-sign-in-provider-authentication-without-no-locaspnet-core-identity"></a>Usar autenticação de provedor de entrada social sem ASP.NET Core Identity Por [Kirk Larkin](https://twitter.com/serpent5) e [Rick Anderson](https://twitter.com/RickAndMSFT) ::: moniker range=">= aspnetcore-3.0" <xref:security/authentication/social/index> Descreve como permitir que os usuários entrem usando o OAuth 2,0 com credenciais de provedores de autenticação externa. A abordagem descrita no tópico inclui ASP.NET Core Identity como um provedor de autenticação. Este exemplo demonstra como usar um provedor de autenticação externo **sem** o ASP.NET Core Identity . Isso é útil para aplicativos que não exigem todos os recursos do ASP.NET Core Identity , mas ainda exigem integração com um provedor de autenticação externa confiável. Este exemplo usa a [autenticação do Google](xref:security/authentication/google-logins) para autenticar usuários. Usar a autenticação do Google muda muitas das complexidades do gerenciamento do processo de entrada para o Google. 
To integrate with a different external authentication provider, see the following topics:

* [Facebook authentication](xref:security/authentication/facebook-logins)
* [Microsoft authentication](xref:security/authentication/microsoft-logins)
* [Twitter authentication](xref:security/authentication/twitter-logins)
* [Other providers](xref:security/authentication/otherlogins)

## <a name="configuration"></a>Configuration

In the `ConfigureServices` method, configure the app's authentication schemes with the <xref:Microsoft.Extensions.DependencyInjection.AuthenticationServiceCollectionExtensions.AddAuthentication*>, <xref:Microsoft.Extensions.DependencyInjection.CookieExtensions.AddCookie*>, and <xref:Microsoft.Extensions.DependencyInjection.GoogleExtensions.AddGoogle*> methods:

[!code-csharp[](social-without-identity/samples_snapshot/3.x/Startup.cs?name=snippet1)]

The call to <xref:Microsoft.Extensions.DependencyInjection.AuthenticationServiceCollectionExtensions.AddAuthentication*> sets the app's <xref:Microsoft.AspNetCore.Authentication.AuthenticationOptions.DefaultScheme>. The `DefaultScheme` is the default scheme used by the following `HttpContext` authentication extension methods:

* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.AuthenticateAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.ChallengeAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.ForbidAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.SignInAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.SignOutAsync*>

Setting the app's `DefaultScheme` to [CookieAuthenticationDefaults.AuthenticationScheme](xref:Microsoft.AspNetCore.Authentication.Cookies.CookieAuthenticationDefaults.AuthenticationScheme) ("Cookies") configures the app to use cookies as the default scheme for these extension methods. Setting the app's <xref:Microsoft.AspNetCore.Authentication.AuthenticationOptions.DefaultChallengeScheme> to [GoogleDefaults.AuthenticationScheme](xref:Microsoft.AspNetCore.Authentication.Google.GoogleDefaults.AuthenticationScheme) ("Google") configures the app to use Google as the default scheme for calls to `ChallengeAsync`. `DefaultChallengeScheme` overrides `DefaultScheme`. See <xref:Microsoft.AspNetCore.Authentication.AuthenticationOptions> for additional properties that override `DefaultScheme` when set.

In `Startup.Configure`, call `UseAuthentication` and `UseAuthorization` between calling `UseRouting` and `UseEndpoints`. This sets the `HttpContext.User` property and runs the Authorization Middleware for requests:

[!code-csharp[](social-without-identity/samples_snapshot/3.x/Startup.cs?name=snippet2&highlight=3-4)]

To learn more about authentication schemes, see [Authentication concepts](xref:security/authentication/index#authentication-concepts). To learn more about cookie authentication, see <xref:security/authentication/cookie>.

## <a name="apply-authorization"></a>Apply authorization

Test the app's authentication configuration by applying the `AuthorizeAttribute` attribute to a controller, action, or page. The following code limits access to the *Privacy* page to users that have been authenticated:

[!code-csharp[](social-without-identity/samples_snapshot/3.x/Pages/Privacy.cshtml.cs?name=snippet&highlight=1)]

## <a name="sign-out"></a>Sign out

To sign out the current user and delete their cookie, call [SignOutAsync](xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.SignOutAsync*).
The following code adds a `Logout` page handler to the *Index* page:

[!code-csharp[](social-without-identity/samples_snapshot/3.x/Pages/Index.cshtml.cs?name=snippet&highlight=3-7)]

Notice that the call to `SignOutAsync` does not specify an authentication scheme. The app's `DefaultScheme` of `CookieAuthenticationDefaults.AuthenticationScheme` is used as a fallback.

## <a name="additional-resources"></a>Additional resources

* <xref:security/authorization/simple>
* <xref:security/authentication/social/additional-claims>

::: moniker-end

::: moniker range="< aspnetcore-3.0"

<xref:security/authentication/social/index> describes how to enable users to sign in using OAuth 2.0 with credentials from external authentication providers. The approach described in that topic includes ASP.NET Core Identity as an authentication provider.

This sample demonstrates how to use an external authentication provider **without** ASP.NET Core Identity. This is useful for apps that don't require all of the features of ASP.NET Core Identity but that still require integration with a trusted external authentication provider.

This sample uses [Google authentication](xref:security/authentication/google-logins) to authenticate users. Using Google authentication shifts many of the complexities of managing the sign-in process to Google.
To integrate with a different external authentication provider, see the following topics:

* [Facebook authentication](xref:security/authentication/facebook-logins)
* [Microsoft authentication](xref:security/authentication/microsoft-logins)
* [Twitter authentication](xref:security/authentication/twitter-logins)
* [Other providers](xref:security/authentication/otherlogins)

## <a name="configuration"></a>Configuration

In the `ConfigureServices` method, configure the app's authentication schemes with the `AddAuthentication`, `AddCookie`, and `AddGoogle` methods:

[!code-csharp[](social-without-identity/samples_snapshot/2.x/Startup.cs?name=snippet1)]

The call to [AddAuthentication](/dotnet/api/microsoft.extensions.dependencyinjection.authenticationservicecollectionextensions.addauthentication#Microsoft_Extensions_DependencyInjection_AuthenticationServiceCollectionExtensions_AddAuthentication_Microsoft_Extensions_DependencyInjection_IServiceCollection_System_Action_Microsoft_AspNetCore_Authentication_AuthenticationOptions__) sets the app's [DefaultScheme](xref:Microsoft.AspNetCore.Authentication.AuthenticationOptions.DefaultScheme). The `DefaultScheme` is the default scheme used by the following `HttpContext` authentication extension methods:

* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.AuthenticateAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.ChallengeAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.ForbidAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.SignInAsync*>
* <xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.SignOutAsync*>

Setting the app's `DefaultScheme` to [CookieAuthenticationDefaults.AuthenticationScheme](xref:Microsoft.AspNetCore.Authentication.Cookies.CookieAuthenticationDefaults.AuthenticationScheme) ("Cookies") configures the app to use cookies as the default scheme for these extension methods. Setting the app's <xref:Microsoft.AspNetCore.Authentication.AuthenticationOptions.DefaultChallengeScheme> to [GoogleDefaults.AuthenticationScheme](xref:Microsoft.AspNetCore.Authentication.Google.GoogleDefaults.AuthenticationScheme) ("Google") configures the app to use Google as the default scheme for calls to `ChallengeAsync`. `DefaultChallengeScheme` overrides `DefaultScheme`. See <xref:Microsoft.AspNetCore.Authentication.AuthenticationOptions> for additional properties that override `DefaultScheme` when set.

In the `Configure` method, call the `UseAuthentication` method to invoke the Authentication Middleware that sets the `HttpContext.User` property. Call the `UseAuthentication` method before calling `UseMvcWithDefaultRoute` or `UseMvc`:

[!code-csharp[](social-without-identity/samples_snapshot/2.x/Startup.cs?name=snippet2)]

To learn more about authentication schemes, see [Authentication concepts](xref:security/authentication/index#authentication-concepts). To learn more about cookie authentication, see <xref:security/authentication/cookie>.

## <a name="apply-authorization"></a>Apply authorization

Test the app's authentication configuration by applying the `AuthorizeAttribute` attribute to a controller, action, or page. The following code limits access to the *Privacy* page to users that have been authenticated:

[!code-csharp[](social-without-identity/samples_snapshot/2.x/Pages/Privacy.cshtml.cs?name=snippet&highlight=1)]

## <a name="sign-out"></a>Sign out

To sign out the current user and delete their cookie, call [SignOutAsync](xref:Microsoft.AspNetCore.Authentication.AuthenticationHttpContextExtensions.SignOutAsync*).
The following code adds a `Logout` page handler to the *Index* page:

[!code-csharp[](social-without-identity/samples_snapshot/2.x/Pages/Index.cshtml.cs?name=snippet&highlight=3-7)]

Notice that the call to `SignOutAsync` does not specify an authentication scheme. The app's `DefaultScheme` of `CookieAuthenticationDefaults.AuthenticationScheme` is used as a fallback.

## <a name="additional-resources"></a>Additional resources

* <xref:security/authorization/simple>
* <xref:security/authentication/social/additional-claims>

::: moniker-end
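The scheme-selection behavior described above (an app-wide `DefaultScheme`, with `DefaultChallengeScheme` overriding it for challenge calls) can be illustrated language-agnostically. This is a hypothetical Python sketch, not ASP.NET Core's implementation; the names only mirror the options discussed above:

```python
# Hypothetical sketch of scheme fallback, not ASP.NET Core's actual code.
DEFAULT_SCHEME = "Cookies"            # analogous to DefaultScheme
DEFAULT_CHALLENGE_SCHEME = "Google"   # analogous to DefaultChallengeScheme

# Per-action overrides; any action without an override falls back to
# DEFAULT_SCHEME, matching how SignOutAsync falls back to "Cookies".
ACTION_OVERRIDES = {"challenge": DEFAULT_CHALLENGE_SCHEME}

def resolve_scheme(action):
    """Return the authentication scheme used for the given action."""
    return ACTION_OVERRIDES.get(action, DEFAULT_SCHEME)

print(resolve_scheme("challenge"))  # Google (the override wins)
print(resolve_scheme("sign_out"))   # Cookies (falls back to the default)
```

The same lookup order explains why the `SignOutAsync` call above needs no explicit scheme.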
80.06383
871
0.828594
por_Latn
0.795375
fad9d3b5e1950a85c65f3ecea8fd3d626215cd5c
11,095
md
Markdown
articles/application-gateway/multiple-site-overview.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
12
2017-08-28T07:45:55.000Z
2022-03-07T21:35:48.000Z
articles/application-gateway/multiple-site-overview.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
441
2017-11-08T13:15:56.000Z
2021-06-02T10:39:53.000Z
articles/application-gateway/multiple-site-overview.md
flexray/azure-docs.pl-pl
bfb8e5d5776d43b4623ce1c01dc44c8efc769c78
[ "CC-BY-4.0", "MIT" ]
27
2017-11-13T13:38:31.000Z
2022-02-17T11:57:33.000Z
---
title: Hosting multiple sites on Azure Application Gateway
description: This article is an overview of the Azure Application Gateway multiple-site support.
services: application-gateway
author: vhorne
ms.service: application-gateway
ms.date: 07/20/2020
ms.author: surmb
ms.topic: conceptual
ms.openlocfilehash: 53f6f37454de886934a483b40daad24204958baf
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 03/29/2021
ms.locfileid: "87474329"
---
# <a name="application-gateway-multiple-site-hosting"></a>Application Gateway multiple site hosting

Multiple site hosting enables you to configure more than one web application on the same port of an application gateway. It lets you configure a more efficient topology for your deployments by adding up to 100+ websites to one application gateway, and each website can be directed to its own backend pool. For example, three domains, contoso.com, fabrikam.com, and adatum.com, point to the IP address of the application gateway. You'd create three multi-site listeners and configure each listener for the respective port and protocol setting.

You can also define wildcard host names in a multi-site listener, with up to 5 host names per listener. To learn more, see [wildcard host names in listener](#wildcard-host-names-in-listener-preview).

:::image type="content" source="./media/multiple-site-overview/multisite.png" alt-text="Application Gateway multiple sites":::

> [!IMPORTANT]
> Rules are processed in the order they are listed in the portal for the v1 SKU. For the v2 SKU, exact matches have higher precedence. It is highly recommended to configure multi-site listeners before configuring a basic listener. This ensures that traffic gets routed to the right backend.
> If a basic listener is listed first and matches an incoming request, it gets processed by that listener.

Requests for `http://contoso.com` are routed to ContosoServerPool, and requests for `http://fabrikam.com` are routed to FabrikamServerPool. Similarly, you can host multiple subdomains of the same parent domain on the same application gateway deployment. For example, you can host `http://blog.contoso.com` and `http://app.contoso.com` on a single application gateway deployment.

## <a name="wildcard-host-names-in-listener-preview"></a>Wildcard host names in listener (Preview)

Application Gateway allows host-based routing using a multi-site HTTP(S) listener. You can now use wildcard characters like asterisk (*) and question mark (?) in the host name, and up to 5 host names per multi-site HTTP(S) listener. For example, `*.contoso.com`.

By using a wildcard character in the host name, you can match multiple host names in a single listener. For example, `*.contoso.com` can match `ecom.contoso.com`, `b2b.contoso.com`, `customer1.b2b.contoso.com`, and so on. Using an array of host names, you can configure more than one host name for a listener, to route requests to a backend pool. For example, a listener can contain `contoso.com, fabrikam.com`, which accepts requests for both host names.

:::image type="content" source="./media/multiple-site-overview/wildcard-listener-diag.png" alt-text="Wildcard listener":::

>[!NOTE]
> This feature is in preview and is available only for the Standard_v2 and WAF_v2 SKUs of Application Gateway. To learn more about previews, see the [terms of use here](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
>[!NOTE]
>This feature is currently available only through [Azure PowerShell](tutorial-multiple-sites-powershell.md) and the [Azure CLI](tutorial-multiple-sites-cli.md). Portal support is coming soon.
> Because portal support isn't fully available yet, if you use only the HostNames parameter, the listener is shown as a basic listener in the portal, and the Host name column in the listener list view doesn't show the configured host names. For any changes to a wildcard listener, make sure you use Azure PowerShell or the CLI until it's supported in the portal.

In [Azure PowerShell](tutorial-multiple-sites-powershell.md), you must use `-HostNames` instead of `-HostName`. With HostNames, you can specify up to 5 host names as comma-separated values and use wildcard characters. For example, `-HostNames "*.contoso.com,*.fabrikam.com"`

In the [Azure CLI](tutorial-multiple-sites-cli.md), you must use `--host-names` instead of `--host-name`. With host-names, you can specify up to 5 host names as comma-separated values and use wildcard characters. For example, `--host-names "*.contoso.com,*.fabrikam.com"`

### <a name="allowed-characters-in-the-host-names-field"></a>Allowed characters in the host names field:

* `(A-Z,a-z,0-9)` - alphanumeric characters
* `-` - hyphen or minus
* `.` - period as a delimiter
* `*` - can match multiple characters in the allowed range
* `?` - can match a single character in the allowed range

### <a name="conditions-for-using-wildcard-characters-and-multiple-host-names-in-a-listener"></a>Conditions for using wildcard characters and multiple host names in a listener:

* You can specify up to 5 host names in a single listener.
* The asterisk `*` can appear only once in a component of a domain-style name or host name. For example, component1*.component2*.component3 (`*.contoso-*.com`) is valid.
* A host name can contain up to two asterisks `*`. For example, `*.contoso.*` is valid and `*.contoso.*.*.com` is invalid.
* A host name can contain a maximum of 4 wildcard characters. For example, `????.contoso.com` and `w??.contoso*.edu.*` are valid, but `????.contoso.*` is invalid.
* Using the asterisk `*` and question mark `?` together within a host-name component (`*?`, `?*`, or `**`) is invalid. For example, `*?.contoso.com` and `**.contoso.com` are invalid.

### <a name="considerations-and-limitations-of-using-wildcard-or-multiple-host-names-in-a-listener"></a>Considerations and limitations of using wildcard or multiple host names in a listener:

* [SSL termination and end-to-end SSL](ssl-overview.md) require you to configure the protocol as HTTPS and upload a certificate to be used in the listener configuration. If it's a multi-site listener, you can additionally enter the host name; usually, this is the common name of the SSL certificate. When specifying multiple host names in a listener or using wildcard characters, consider the following:
  * If it's a wildcard host name like *.contoso.com, you must upload a wildcard certificate with a common name like *.contoso.com.
  * If multiple host names are listed in the same listener, you must upload a SAN (Subject Alternative Names) certificate with CNs matching the listed host names.
* You can't use a regular expression to specify the host name. You can only use wildcard characters like asterisk (*) and question mark (?) to build the host-name pattern.
* For backend health checks, you can't associate multiple [custom probes](application-gateway-probe-overview.md) per HTTP setting.
Instead, you can probe one of the websites on the backend, or use "127.0.0.1" to probe the localhost of the backend server. However, when using wildcard or multiple host names in a listener, requests for all the specified domain patterns are routed to the backend pool, depending on the rule type (basic or path-based).

* The "hostname" property takes a single string as input, where you can specify only one non-wildcard domain name. The "hostnames" property takes an array of strings as input, where you can specify up to 5 wildcard domain names. The two properties can't be used at the same time.
* You can't create a [redirection](redirect-overview.md) rule with a target listener that uses wildcard or multiple host names.

See [create multi-site using Azure PowerShell](tutorial-multiple-sites-powershell.md) or [using the Azure CLI](tutorial-multiple-sites-cli.md) for a step-by-step guide on how to configure wildcard host names.

## <a name="host-headers-and-server-name-indication-sni"></a>Host headers and Server Name Indication (SNI)

There are three common mechanisms for enabling multiple-site hosting on the same infrastructure.

1. Host multiple web applications, each on a unique IP address.
2. Use the host name to host multiple web applications on the same IP address.
3. Use different ports to host multiple web applications on the same IP address.

Currently, Application Gateway supports a single public IP address where it listens for traffic. So multiple applications, each with its own IP address, is currently not supported.
Application Gateway supports multiple applications each listening on different ports, but this scenario requires the applications to accept traffic on non-standard ports, which is often not a configuration you want.

Application Gateway relies on HTTP/1.1 host headers to host more than one website on the same public IP address and port. The sites hosted on the application gateway can also support TLS offload with the Server Name Indication (SNI) TLS extension. This scenario means that the client browser and backend web farm must support HTTP/1.1 and the TLS extension as defined in RFC 6066.

## <a name="next-steps"></a>Next steps

Learn how to configure multiple site hosting in Application Gateway:

* [Using the Azure portal](create-multiple-sites-portal.md)
* [Using Azure PowerShell](tutorial-multiple-sites-powershell.md)
* [Using the Azure CLI](tutorial-multiple-sites-cli.md)

You can visit the [Resource Manager template using multiple site hosting](https://github.com/Azure/azure-quickstart-templates/blob/master/201-application-gateway-multihosting) for an end-to-end template-based deployment.
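The wildcard host-name semantics described above (asterisk matching a run of characters, question mark matching a single character) can be approximated for illustration with Python's `fnmatch`. This is not Application Gateway's implementation, and the listed limits (for example, the 4-wildcard maximum) are not enforced here:

```python
# Illustrative approximation of wildcard host-name matching using fnmatch:
# '*' matches any run of characters, '?' matches exactly one character.
from fnmatch import fnmatch

def listener_matches(host, patterns):
    """Return True if the host matches any of the listener's host-name patterns."""
    return any(fnmatch(host, p) for p in patterns)

patterns = ["*.contoso.com", "*.fabrikam.com"]
print(listener_matches("ecom.contoso.com", patterns))           # True
print(listener_matches("customer1.b2b.contoso.com", patterns))  # True
print(listener_matches("contoso.com", patterns))                # False: no subdomain
```

As in the listener behavior described above, `*.contoso.com` matches nested subdomains such as `customer1.b2b.contoso.com` but not the bare parent domain.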
109.851485
572
0.80676
pol_Latn
0.999765
fada62420d9a662392dad46997313218492418b0
49
md
Markdown
content/vg/Donkey Kong/Diddy Kong Racing/index.md
nerdydrew/Drews-Sheet-Music
d34c82fde1099c3bbdaf55a3ed68c6c4b4b1001c
[ "MIT" ]
2
2019-09-14T08:46:30.000Z
2022-01-13T18:47:28.000Z
content/vg/Donkey Kong/Diddy Kong Racing/index.md
nerdydrew/Drews-Sheet-Music
d34c82fde1099c3bbdaf55a3ed68c6c4b4b1001c
[ "MIT" ]
null
null
null
content/vg/Donkey Kong/Diddy Kong Racing/index.md
nerdydrew/Drews-Sheet-Music
d34c82fde1099c3bbdaf55a3ed68c6c4b4b1001c
[ "MIT" ]
null
null
null
Title: Diddy Kong Racing
ReleaseDate: 1997-11-10
16.333333
24
0.795918
kor_Hang
0.350446
fadb1b4194b4edb2d9d519e1306de0ef611414ae
546
md
Markdown
semantic_ml/README.md
ajwinters/GoogMLcopy
929b994ef92c1e04f0ce6d76ce94f30421b2fc9b
[ "Apache-2.0" ]
244
2020-06-16T23:53:51.000Z
2022-03-31T15:29:12.000Z
semantic_ml/README.md
ajwinters/GoogMLcopy
929b994ef92c1e04f0ce6d76ce94f30421b2fc9b
[ "Apache-2.0" ]
32
2020-07-15T17:20:23.000Z
2022-03-12T00:53:38.000Z
semantic_ml/README.md
ajwinters/GoogMLcopy
929b994ef92c1e04f0ce6d76ce94f30421b2fc9b
[ "Apache-2.0" ]
164
2020-06-16T23:53:55.000Z
2022-03-29T08:32:09.000Z
# Use a Universal Sentence Encoder to Rank Query Responses

Read the full blog post [here](https://daleonai.com/semantic_ml)

```
> npm install
> node use_sample.js
[
  { response: 'I grab a ball', score: 10.788130270345432 },
  { response: 'I go to you', score: 11.597091717283469 },
  { response: 'I play with a ball', score: 9.346379028479209 },
  { response: 'I go to school.', score: 10.130473646521292 },
  { response: 'I go to the mug.', score: 12.475453722603106 },
  { response: 'I bring you the mug.', score: 13.229019199245684 }
]
```
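The sample output above pairs each canned response with a score. As a toy illustration of the underlying idea (rank candidate responses by how close their embedding is to the query's embedding), here is a Python sketch; the 2-D vectors below are made up for readability, whereas the real project uses Universal Sentence Encoder embeddings in Node.js:

```python
# Toy response ranking: embed the query and each response, then rank
# responses by distance to the query. Vectors here are invented examples.
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

query = (1.0, 0.0)  # pretend embedding of the user's query
responses = {
    "I grab a ball":      (0.9, 0.1),
    "I go to you":        (0.1, 0.9),
    "I play with a ball": (0.8, 0.2),
}

# Lower distance = more similar to the query.
ranked = sorted(responses, key=lambda r: euclidean(query, responses[r]))
print(ranked[0])  # I grab a ball
```

With real sentence embeddings, the same sort by distance (or by dot product) produces a ranking like the sample output shown above.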
32.117647
65
0.681319
eng_Latn
0.556479
fadb6a6460e4744b7f1e5a74ccde59af96692997
1,300
md
Markdown
README.md
alexDamian99/GaSM
c676f7d1b83583c6536499b5ce1b751dda20fc5a
[ "MIT" ]
null
null
null
README.md
alexDamian99/GaSM
c676f7d1b83583c6536499b5ce1b751dda20fc5a
[ "MIT" ]
10
2020-02-29T15:57:51.000Z
2020-06-08T16:32:03.000Z
README.md
alexDamian99/GaSM
c676f7d1b83583c6536499b5ce1b751dda20fc5a
[ "MIT" ]
1
2020-06-10T11:36:33.000Z
2020-06-10T11:36:33.000Z
# GaSM

Web Technologies project. This project can be found at https://gasm-tw.herokuapp.com/ and instructions can be found at https://gasm-tw.herokuapp.com/scholarly/utilizare.html

## Link to general information about projects:

https://profs.info.uaic.ro/~busaco/teach/courses/web/web-projects.html

## Info about GaSM

Create a Web application capable, on top of its own REST/GraphQL API, of managing information about the collection, sorting, and recycling of garbage by category (household, paper, plastic, etc.) at the level of citizens, authorized personnel, and decision makers. Support is provided for users to report places where a substantial amount of garbage has accumulated so that it can be cleared. Per unit of time (day, week, month), numeric and graphical reports are generated, available in HTML, CSV, and PDF formats, covering the current situation at the neighborhood/town level and highlighting the cleanest and dirtiest areas. User interaction follows the principles of responsive Web design. The system also supports launching campaigns to raise residents' awareness of selective garbage collection and of reporting incidents of improper garbage disposal.
81.25
396
0.813077
ron_Latn
0.999987
fadc97fffc73ede23f8d3aa30b6419ab24ea9881
8,279
md
Markdown
articles/iot-edge/how-to-install-iot-edge-windows-with-linux.md
peder-andfrankly/azure-docs.sv-se
49435a06686fc72ca9cd8c83883c3c3704a6ec72
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/iot-edge/how-to-install-iot-edge-windows-with-linux.md
peder-andfrankly/azure-docs.sv-se
49435a06686fc72ca9cd8c83883c3c3704a6ec72
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/iot-edge/how-to-install-iot-edge-windows-with-linux.md
peder-andfrankly/azure-docs.sv-se
49435a06686fc72ca9cd8c83883c3c3704a6ec72
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Install Azure IoT Edge for Linux on Windows | Microsoft Docs
description: Azure IoT Edge installation instructions for Linux containers on Windows 10, Windows Server, and Windows IoT Core
author: kgremban
manager: philmea
ms.reviewer: veyalla
ms.service: iot-edge
services: iot-edge
ms.topic: conceptual
ms.date: 05/06/2019
ms.author: kgremban
ms.openlocfilehash: 649c4271b2786eca506460551cfad956eeadf3c5
ms.sourcegitcommit: c4700ac4ddbb0ecc2f10a6119a4631b13c6f946a
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 10/27/2019
ms.locfileid: "72964413"
---
# <a name="use-iot-edge-on-windows-to-run-linux-containers"></a>Use IoT Edge on Windows to run Linux containers

Test IoT Edge modules intended for Linux devices by using a Windows computer. In a production scenario, Windows devices should only run Windows containers. However, a common development scenario is to use a Windows computer to build IoT Edge modules for Linux devices. The IoT Edge runtime for Windows lets you run Linux containers for **testing and development purposes**.

This article provides instructions for installing the Azure IoT Edge runtime using Linux containers on a Windows x64 (AMD/Intel) system. To learn more about the IoT Edge runtime installer, including details about all the installation parameters, see [Install the Azure IoT Edge runtime on Windows](how-to-install-iot-edge-windows.md).

## <a name="prerequisites"></a>Prerequisites

Use this section to check whether your Windows device supports IoT Edge, and to prepare it for a container engine before installation.
### <a name="supported-windows-versions"></a>Supported Windows versions

Azure IoT Edge with Linux containers can run on any version of Windows that meets the [requirements for Docker Desktop](https://docs.docker.com/docker-for-windows/install/#what-to-know-before-you-install).

For more information about what's included in the latest version of IoT Edge, see [Azure IoT Edge releases](https://github.com/Azure/azure-iotedge/releases).

If you want to install IoT Edge on a virtual machine, enable nested virtualization and allocate at least 2 GB of memory. How you enable nested virtualization differs depending on the hypervisor you use. For Hyper-V, generation 2 virtual machines have nested virtualization enabled by default. For VMware, there's a toggle to enable the feature on the virtual machine.

### <a name="prepare-the-container-engine"></a>Prepare the container engine

Azure IoT Edge relies on an [OCI-compatible](https://www.opencontainers.org/) container engine. The main configuration difference between running Windows containers and Linux containers on a Windows computer is that the IoT Edge installation includes a Windows container runtime, but you have to provide your own runtime for Linux containers before installing IoT Edge.

To configure a Windows computer to develop and test containers for Linux devices, you can use [Docker Desktop](https://www.docker.com/docker-windows) as the container engine. You need to install Docker and configure it to [use Linux containers](https://docs.docker.com/docker-for-windows/#switch-between-windows-and-linux-containers) before you install IoT Edge.

If your IoT Edge device is a Windows computer, check that it meets the [system requirements](https://docs.microsoft.com/virtualization/hyper-v-on-windows/reference/hyper-v-requirements) for Hyper-V.
## <a name="install-iot-edge-on-a-new-device"></a>Install IoT Edge on a new device

>[!NOTE]
>Azure IoT Edge software packages are subject to the license terms located in the packages (in the LICENSE directory). Read the license terms before you use a package. Your installation and use of a package constitutes your acceptance of these terms. If you don't agree with the license terms, don't use the package.

A PowerShell script downloads and installs the Azure IoT Edge security daemon. The security daemon then starts the first of two runtime modules, the IoT Edge agent, which enables remote deployments of other modules.

When you install the IoT Edge runtime for the first time on a device, you need to provision the device with an identity from an IoT hub. A single IoT Edge device can be provisioned manually using a device connection string from IoT Hub. Or, you can use the Device Provisioning Service to automatically provision devices, which is helpful when you have many devices to set up. You can read more about the different installation options and parameters in the article [Install the Azure IoT Edge runtime on Windows](how-to-install-iot-edge-windows.md).

Once you have Docker Desktop installed and configured for Linux containers, the main installation difference is that Linux is declared with the **-ContainerOs** parameter. For example:

1. If you haven't already, register a new IoT Edge device and retrieve its connection string. Copy the connection string and use it later in this section. You can complete this step with the following tools:

   * [Azure portal](how-to-register-device.md#register-in-the-azure-portal)
   * [Azure CLI](how-to-register-device.md#register-with-the-azure-cli)
   * [Visual Studio Code](how-to-register-device.md#register-with-visual-studio-code)

2. Run PowerShell as an administrator.

   >[!NOTE]
   >Use an AMD64 session of PowerShell to install IoT Edge, not PowerShell (x86).
   >If you're not sure which type of session you're using, run the following command:
   >
   >```powershell
   >(Get-Process -Id $PID).StartInfo.EnvironmentVariables["PROCESSOR_ARCHITECTURE"]
   >```

3. The **Deploy-IoTEdge** command checks that your Windows machine is on a supported version, turns on the containers feature, and then downloads the Moby runtime (which is unused for Linux containers) and the IoT Edge runtime. The command defaults to Windows containers, so declare Linux as the desired container operating system:

   ```powershell
   . {Invoke-WebRequest -useb aka.ms/iotedge-win} | Invoke-Expression; `
   Deploy-IoTEdge -ContainerOs Linux
   ```

4. At this point, IoT Core devices may restart automatically. Other Windows 10 or Windows Server devices may prompt you to restart. If so, restart your device now. Once the device is ready, run PowerShell as an administrator again.

5. The **Initialize-IoTEdge** command configures the IoT Edge runtime on your machine. The command defaults to manual provisioning with a device connection string. Declare Linux as the desired container operating system again:

   ```powershell
   . {Invoke-WebRequest -useb aka.ms/iotedge-win} | Invoke-Expression; `
   Initialize-IoTEdge -ContainerOs Linux
   ```

6. When prompted, provide the device connection string that you retrieved in step 1. The device connection string associates the physical device with a device ID in IoT Hub.

   The device connection string takes the following format, and should not include quotation marks: `HostName={IoT hub name}.azure-devices.net;DeviceId={device name};SharedAccessKey={key}`

## <a name="verify-successful-installation"></a>Verify successful installation

Check the status of the IoT Edge service:

```powershell
Get-Service iotedge
```

Examine service logs from the last 5 minutes:

```powershell
. {Invoke-WebRequest -useb aka.ms/iotedge-win} | Invoke-Expression; Get-IoTEdgeLog
```

Run an automated check for the most common configuration and networking errors:

```powershell
iotedge check
```

List running modules. After a successful new installation, the only module you should see is **edgeAgent**. After you [deploy IoT Edge modules](how-to-deploy-modules-portal.md) for the first time, the other system module, **edgeHub**, will start on the device as well.

```powershell
iotedge list
```

## <a name="next-steps"></a>Next steps

Now that you have an IoT Edge device provisioned with the runtime installed, you can [deploy IoT Edge modules](how-to-deploy-modules-portal.md).

If you're having problems installing IoT Edge, check the [troubleshooting](troubleshoot.md) page.

To update an existing installation to the newest version of IoT Edge, see [Update the IoT Edge security daemon and runtime](how-to-update-iot-edge.md).
65.706349
396
0.790917
swe_Latn
0.997782
fade1308b2fca2a0b9231d1c88ad5cf9dcfad30b
2,859
md
Markdown
README.md
TetrisIsCool/lightblocks
27147b671fc9a57bcdc06d4ddaba4fabe51fd5c0
[ "Apache-1.1" ]
70
2020-07-26T19:37:29.000Z
2022-03-26T19:57:42.000Z
README.md
TetrisIsCool/lightblocks
27147b671fc9a57bcdc06d4ddaba4fabe51fd5c0
[ "Apache-1.1" ]
41
2020-08-02T12:18:28.000Z
2022-01-22T18:09:19.000Z
README.md
TetrisIsCool/lightblocks
27147b671fc9a57bcdc06d4ddaba4fabe51fd5c0
[ "Apache-1.1" ]
11
2020-12-01T21:58:25.000Z
2022-01-20T11:03:38.000Z
# Falling Lightblocks ![Logo](ios/data/Media.xcassets/Logo.imageset/libgdx@1x.png) Falling block game for Android (Mobile and TV), iOS, Web browsers. Works on desktops, too. [![ko-fi](https://www.ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/B0B51Z9YB) ## Try it Falling Lightblocks is available on itch, GameJolt and the usual mobile app stores. [Visit its website](https://www.golfgl.de/lightblocks/) for the links. ## Build it ![Compile desktop, android and gwt](https://github.com/MrStahlfelge/lightblocks/workflows/Compile%20desktop,%20android%20and%20gwt/badge.svg?branch=master&event=push) The game is implemented with [libGDX](https://github.com/libgdx/libgdx). Follow the docs to get it to work. ## License * The assets in android/assets/ are licensed for your personal use only. You may not redistribute them. * Source code files and everything else in this repo is licensed under Apache License 2.0 The intention behind this is to: * allow building own versions of this game for your personal use, to tweak it and try things out, and to contribute to the project * allow extending the game and release an own increment of this game with another look and feel * allow using parts of this project's source code for your own projects of any kind * prevent indistinguishable clones of this game ## Contribute Contributions are welcome: bug fixes, enhancements, translations, opening issues etc. ### Translations There's a single [resource bundle file](android/assets/i18n/strings.properties) with all strings to be localized. Use it as a template for a new file strings_XX.properties with XX being the ISO language code. Please be aware that Falling Lightblocks can (so far) only display Latin based characters. ### Enhancements Please be aware that Falling Lightblocks is designed with certain design principles in mind, so if you plan to make some more efforts, please open an issue to check if your ideas will fit the game. 
Some principles the game and its evolution are based on:

* Features must work on every targeted platform (HTML!) and device
* Features must work with every supported input device (touch, controller, keyboard)
* Changes must not affect scoring of existing game modes
* Falling Lightblocks means classic gameplay, though it is okay if some game modes follow the guideline

For everyone viewing the source code: Sorry for all the German comments and commit messages that will make it more difficult for you. I didn't plan to open the source from the beginning, and I tend to write notes for myself in my native tongue.

## Server

To set up a multiplayer server with a vserver, you can deploy to Heroku or dokku with

    git subtree push --prefix server remotename master

For local play or other setups, you can build a jar file with

    gradlew server:build

OpenJDK 8 is recommended for building and running.
40.842857
166
0.781042
eng_Latn
0.995962
fade49692dc87ba80b57e6f39cab06b24f0cc155
1,332
md
Markdown
node_modules/pull-mplex/README.md
bocahrokok/ipfs-app
e274355a124c346f2d5f0e81bd3dde6751213e7f
[ "MIT" ]
null
null
null
node_modules/pull-mplex/README.md
bocahrokok/ipfs-app
e274355a124c346f2d5f0e81bd3dde6751213e7f
[ "MIT" ]
3
2019-08-06T15:39:33.000Z
2021-05-08T02:05:20.000Z
colony-starter/node_modules/pull-mplex/README.md
joeycharlesworth/Prediction_hubspot_colony
45921bf16e343f497568ae727860d03c5fde53f9
[ "MIT" ]
null
null
null
pull-mplex
===================

[![](https://img.shields.io/badge/made%20by-Protocol%20Labs-blue.svg?style=flat-square)](http://ipn.io)
[![](https://img.shields.io/badge/freenode-%23libp2p-blue.svg?style=flat-square)](http://webchat.freenode.net/?channels=%23libp2p)
[![Dependency Status](https://david-dm.org/libp2p/pull-mplex.svg?style=flat-square)](https://david-dm.org/libp2p/pull-mplex)
[![js-standard-style](https://img.shields.io/badge/code%20style-standard-brightgreen.svg?style=flat-square)](https://github.com/feross/standard)
[![Travis CI](https://flat.badgen.net/travis/libp2p/pull-mplex)](https://travis-ci.com/libp2p/pull-mplex)

> pull-streams based multiplexer implementing the [mplex spec](https://github.com/libp2p/specs/blob/48b3377/mplex/README.md)

[![](https://github.com/libp2p/interface-stream-muxer/raw/master/img/badge.png)](https://github.com/libp2p/interface-stream-muxer)

## Lead Maintainer

[Jacob Heun](https://github.com/jacobheun).

## Table of Contents

* [Install](#install)
* [Usage](#usage)
* [API](#api)
* [Contribute](#contribute)
* [License](#license)

## Install

```sh
> npm install pull-mplex
```

## Usage

See the [examples](./examples).

## API

TBD

## Contribute

This module is actively under development. Please check out the issues and submit PRs!

## License

MIT © Protocol Labs
28.340426
144
0.716967
yue_Hant
0.155551
fade4d118ed9a91318c06c6b319ec3d6cbe865a0
2,697
md
Markdown
_posts/C/2020-02-16-c-post8.md
YunDaeHyeon/YunDaeHyeon.github.io
41f3ab6f94c298e1813fef0bb6074881d9c85f5f
[ "MIT" ]
2
2022-01-29T14:39:54.000Z
2022-02-03T11:59:12.000Z
_posts/C/2020-02-16-c-post8.md
YunDaeHyeon/YunDaeHyeon.github.io
41f3ab6f94c298e1813fef0bb6074881d9c85f5f
[ "MIT" ]
2
2022-01-15T07:05:59.000Z
2022-01-16T04:52:43.000Z
_posts/C/2020-02-16-c-post8.md
YunDaeHyeon/YunDaeHyeon.github.io
41f3ab6f94c298e1813fef0bb6074881d9c85f5f
[ "MIT" ]
null
null
null
---
layout: post
title: "[C] Loops: while and do ~ while"
date: 2020-02-16 22:16:00 +0900
categories: jekyll update
tags: [blog,c]
---

# Loops

**A loop (iteration) is a kind of control statement that lets a program execute a particular statement or block of code repeatedly.**

C has three main loop statements: **while, do ~ while, and for.**

Among them, while is the comparatively simple and common one, but the most practical is the for statement. In this post we will look at the while and do ~ while statements in turn.

# The while statement

A while statement is structured as follows.

```c
while (condition)
{
    ....
    ....
}
```

```c
#include <stdio.h>
int main(void)
{
    int a = 1;
    while(a < 10)
    {
        printf("while loop test %d\n", a);
        a++;
    }
    return 0;
}
```

![image](/assets/img/blog/네이버/c 6 - 1.png)
<br>

Compiling the example above produces the output shown. Now let's take the while statement apart.

```c
while(a<10) {
```

This is the entry point of the loop. Next to the keyword while is a pair of parentheses containing an expression. **A loop is given a condition, and the statements inside the while block are repeated for as long as that condition is satisfied.** This while statement uses the relational operator <, so it means: repeat the body while a stays less than 10.

```c
{
    printf("while loop test %d", a);
    a++;
}
```

These are the statements inside the while block — **the code you want repeated.** One thing stands out: the increment operator a++ appears below the printf() call. It plays a very important role in the loop.

![image](/assets/img/blog/네이버/c 6 - 2.png)
<br>

The a++ operator drives the variable toward the point where the while condition no longer holds — it is the **'operator that lets you escape the loop'**, something every loop needs. If a loop had no such operator, the program would repeat its statements endlessly.

If you deliberately want an infinite loop, put something 'true' (a nonzero value, such as the number 1) inside the parentheses of the while statement, and the loop will run forever.

<center><b>ex) while(1)</b></center>

---

# The do ~ while statement

The basic structure of a do ~ while statement is as follows.

```c
do
{
    ....
    ....
} while (condition);
```

```c
#include <stdio.h>
int main(void)
{
    int a = 1;
    do
    {
        printf("while loop test %d\n", a);
        a++;
    }while(a < 10);
    return 0;
}
```

![image](/assets/img/blog/네이버/c 6 - 3.png)
<br>

Curiously, the result is no different from the while version, so it looks as if there were no difference between while and do ~ while. But there is one.

A while statement evaluates its condition **before entering the loop**: it first checks whether the condition holds, and only then executes the body. A do ~ while statement, however, has its condition at the end: it enters the loop, executes the body **once**, and only then checks the condition to decide whether to loop again.

In short, **a while statement checks first and then runs the body, whereas a do ~ while statement always runs the body once and then checks.** Because of this difference, while and do ~ while are nearly identical loops that can nonetheless produce different results.

<center>
<b>A while statement proceeds in this order:</b><br>
1. Is the given condition <b>true</b>?<br>
2. If the condition is true, execute the body.<br>
2-1. Go back to step 1.<br>
3. If the condition is false, do not execute the body.<br><br>
<b>A do ~ while statement proceeds in this order:</b><br>
1. <b>Execute</b> the body.<br>
2. If the condition is true, execute the body again.<br>
2-1. Go back to step 1.<br>
3. If the condition is false, do not execute the body.<br>
</center>

<center><small><i>Please point out anything incorrect.</i></small></center>
19.977778
70
0.611791
kor_Hang
1.00001