hexsha stringlengths 40 40 | size int64 5 1.04M | ext stringclasses 6 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 3 344 | max_stars_repo_name stringlengths 5 125 | max_stars_repo_head_hexsha stringlengths 40 78 | max_stars_repo_licenses listlengths 1 11 | max_stars_count int64 1 368k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 3 344 | max_issues_repo_name stringlengths 5 125 | max_issues_repo_head_hexsha stringlengths 40 78 | max_issues_repo_licenses listlengths 1 11 | max_issues_count int64 1 116k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 3 344 | max_forks_repo_name stringlengths 5 125 | max_forks_repo_head_hexsha stringlengths 40 78 | max_forks_repo_licenses listlengths 1 11 | max_forks_count int64 1 105k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 5 1.04M | avg_line_length float64 1.14 851k | max_line_length int64 1 1.03M | alphanum_fraction float64 0 1 | lid stringclasses 191 values | lid_prob float64 0.01 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
f90a5f0e98329342e1786475ae5a933a9ea5c2a1 | 2,560 | md | Markdown | sdk-api-src/content/d2d1_1/nf-d2d1_1-id2d1devicecontext-drawgdimetafile(id2d1gdimetafile_constd2d1_point_2f).md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/d2d1_1/nf-d2d1_1-id2d1devicecontext-drawgdimetafile(id2d1gdimetafile_constd2d1_point_2f).md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/d2d1_1/nf-d2d1_1-id2d1devicecontext-drawgdimetafile(id2d1gdimetafile_constd2d1_point_2f).md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:d2d1_1.ID2D1DeviceContext.DrawGdiMetafile(ID2D1GdiMetafile,constD2D1_POINT_2F)
title: ID2D1DeviceContext::DrawGdiMetafile (d2d1_1.h)
description: Draw a metafile to the device context.
helpviewer_keywords: ["DrawGdiMetafile","DrawGdiMetafile method [Direct2D]","DrawGdiMetafile method [Direct2D]","ID2D1DeviceContext interface","ID2D1DeviceContext interface [Direct2D]","DrawGdiMetafile method","ID2D1DeviceContext.DrawGdiMetafile","ID2D1DeviceContext::DrawGdiMetafile","ID2D1DeviceContext::DrawGdiMetafile(ID2D1GdiMetafile","D2D1_POINT_2F)","d2d1_1/ID2D1DeviceContext::DrawGdiMetafile","direct2d.id2d1devicecontext_drawgdimetafile"]
old-location: direct2d\id2d1devicecontext_drawgdimetafile.htm
tech.root: Direct2D
ms.assetid: d0746d7f-0779-46c0-8a02-c92e6851e371
ms.date: 12/05/2018
ms.keywords: DrawGdiMetafile, DrawGdiMetafile method [Direct2D], DrawGdiMetafile method [Direct2D],ID2D1DeviceContext interface, ID2D1DeviceContext interface [Direct2D],DrawGdiMetafile method, ID2D1DeviceContext.DrawGdiMetafile, ID2D1DeviceContext::DrawGdiMetafile, ID2D1DeviceContext::DrawGdiMetafile(ID2D1GdiMetafile,D2D1_POINT_2F), d2d1_1/ID2D1DeviceContext::DrawGdiMetafile, direct2d.id2d1devicecontext_drawgdimetafile
req.header: d2d1_1.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt: Windows 8 and Platform Update for Windows 7 [desktop apps \| UWP apps]
req.target-min-winversvr: Windows Server 2012 and Platform Update for Windows Server 2008 R2 [desktop apps \| UWP apps]
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll: D2d1.dll
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- ID2D1DeviceContext::DrawGdiMetafile
- d2d1_1/ID2D1DeviceContext::DrawGdiMetafile
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- COM
api_location:
- D2d1.dll
api_name:
- ID2D1DeviceContext.DrawGdiMetafile
---
# ID2D1DeviceContext::DrawGdiMetafile
## -description
Draw a metafile to the device context.
## -parameters
### -param gdiMetafile [in]
Type: <b><a href="/windows/desktop/api/d2d1_1/nn-d2d1_1-id2d1gdimetafile">ID2D1GdiMetafile</a>*</b>
The metafile to draw.
### -param targetOffset [in, optional]
Type: <b>const <a href="/windows/desktop/Direct2D/d2d1-point-2f">D2D1_POINT_2F</a>*</b>
The offset from the upper left corner of the render target.
## -see-also
<a href="/windows/desktop/api/d2d1_1/nn-d2d1_1-id2d1devicecontext">ID2D1DeviceContext</a> | 36.056338 | 448 | 0.802734 | yue_Hant | 0.779275 |
f90af274c219bf71889bfb6fa664d9225ad2956e | 3,974 | md | Markdown | articles/container-service/kubernetes/container-service-kubernetes-coscale.md | y32saji/azure-docs | cf971fe82e9ee70db9209bb196ddf36614d39d10 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-08-30T23:39:11.000Z | 2019-08-30T23:39:14.000Z | articles/container-service/kubernetes/container-service-kubernetes-coscale.md | y32saji/azure-docs | cf971fe82e9ee70db9209bb196ddf36614d39d10 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-06-12T00:05:28.000Z | 2019-07-09T09:39:55.000Z | articles/container-service/kubernetes/container-service-kubernetes-coscale.md | y32saji/azure-docs | cf971fe82e9ee70db9209bb196ddf36614d39d10 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-11-24T22:01:50.000Z | 2019-11-24T22:01:50.000Z | ---
title: (DEPRECATED) Monitor an Azure Kubernetes cluster with CoScale
description: Monitor a Kubernetes cluster in Azure Container Service using CoScale
services: container-service
author: fryckbos
manager: jeconnoc
ms.service: container-service
ms.topic: article
ms.date: 05/22/2017
ms.author: saudas
ms.custom: mvc
---
# (DEPRECATED) Monitor an Azure Container Service Kubernetes cluster with CoScale
[!INCLUDE [ACS deprecation](../../../includes/container-service-kubernetes-deprecation.md)]
In this article, we show you how to deploy the [CoScale](https://web.archive.org/web/20180317071550/https://www.coscale.com/) agent to monitor all nodes and containers in your Kubernetes cluster in Azure Container Service. You need an account with CoScale for this configuration.
## About CoScale
CoScale is a monitoring platform that gathers metrics and events from all containers in several orchestration platforms. CoScale offers full-stack monitoring for Kubernetes environments. It provides visualizations and analytics for all layers in the stack: the OS, Kubernetes, Docker, and applications running inside your containers. CoScale offers several built-in monitoring dashboards, and it has built-in anomaly detection to allow operators and developers to find infrastructure and application issues fast.

As shown in this article, you can install agents on a Kubernetes cluster to run CoScale as a SaaS solution. If you want to keep your data on-site, CoScale is also available for on-premises installation.
## Prerequisites
You first need to [create a CoScale account](https://web.archive.org/web/20170507123133/https://www.coscale.com/free-trial).
This walkthrough assumes that you have [created a Kubernetes cluster using Azure Container Service](container-service-kubernetes-walkthrough.md).
It also assumes that you have the `az` Azure CLI and `kubectl` tools installed.
You can test if you have the `az` tool installed by running:
```azurecli
az --version
```
If you don't have the `az` tool installed, there are instructions [here](/cli/azure/install-azure-cli).
You can test if you have the `kubectl` tool installed by running:
```bash
kubectl version
```
If you don't have `kubectl` installed, you can run:
```azurecli
az acs kubernetes install-cli
```
## Installing the CoScale agent with a DaemonSet
[DaemonSets](https://kubernetes.io/docs/concepts/workloads/controllers/daemonset/) are used by Kubernetes to run a single instance of a container on each host in the cluster.
They're perfect for running monitoring agents such as the CoScale agent.
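To make the mechanics concrete, here is a minimal sketch of what a monitoring-agent DaemonSet manifest generally looks like. This is illustrative only — the image name, token variable, and mounts below are hypothetical placeholders, not CoScale's actual manifest (the CoScale UI generates the real one for you):

```yaml
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: monitoring-agent          # hypothetical name
  namespace: monitoring
spec:
  selector:
    matchLabels:
      app: monitoring-agent
  template:
    metadata:
      labels:
        app: monitoring-agent
    spec:
      containers:
      - name: agent
        image: example/monitoring-agent:latest   # placeholder image
        env:
        - name: APP_TOKEN                        # placeholder credential
          value: "<your-app-token>"
        volumeMounts:
        - name: dockersock
          mountPath: /var/run/docker.sock        # lets the agent see host containers
      volumes:
      - name: dockersock
        hostPath:
          path: /var/run/docker.sock
```

Because it is a DaemonSet, the scheduler places exactly one agent pod on every node, which is why new cluster machines get monitored automatically.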
After you log in to CoScale, go to the [agent page](https://app.coscale.com/)
to install CoScale agents on your cluster using a DaemonSet. The CoScale UI provides guided configuration steps to create an agent and start monitoring your complete Kubernetes cluster.

To start the agent on the cluster, run the supplied command:

That's it! Once the agents are up and running, you should see data in the console in a few minutes. Visit
the [agent page](https://app.coscale.com/) to see a summary of your cluster, perform additional configuration steps, and see dashboards such as the **Kubernetes cluster overview**.

The CoScale agent is automatically deployed on new machines in the cluster. The agent updates automatically when a new version is released.
## Next steps
See the [CoScale documentation](https://web.archive.org/web/20180415164304/http://docs.coscale.com:80/) and [blog](https://web.archive.org/web/20170501021344/http://www.coscale.com:80/blog) for more information about CoScale monitoring solutions.
| 47.309524 | 512 | 0.790388 | eng_Latn | 0.956278 |
f90b31dd3246927ce4c7e9ed66f6f1278d4565c1 | 9,446 | md | Markdown | _pages/cv.md | BrianAvant-NOAA/bavant.github.io | 8a651a23da01691c237e71e75ff25d2eb0e4aed5 | [
"MIT"
] | null | null | null | _pages/cv.md | BrianAvant-NOAA/bavant.github.io | 8a651a23da01691c237e71e75ff25d2eb0e4aed5 | [
"MIT"
] | null | null | null | _pages/cv.md | BrianAvant-NOAA/bavant.github.io | 8a651a23da01691c237e71e75ff25d2eb0e4aed5 | [
"MIT"
] | null | null | null | ---
layout: archive
title: "CV"
permalink: /cv/
author_profile: true
redirect_from:
- /resume
---
{% include base_path %}
Education
======
* Geographic Information Science Certificate, Gainesville State College, 2009
* B.S. Applied Environmental Spatial Analysis, Gainesville State College, 2010
* M.S. Forest Resources, University of Georgia, 2013
Software Skills
======
R, Python, Linux, Bash, SQL, Git, Redmine, Agile, ArcGIS, QGIS, Docker, Nginx
Professional Experience
======
* Water Resource Engineer (Software Engineer), Lynker Technologies, National Water Center, June 2019-Present
* Develop NOAA’s National Flood Inundation Mapping (FIM) model (<a href="https://github.com/NOAA-OWP/cahaba" target="_blank" rel="noopener"><span style="color:blue">cahaba repo</span></a>)
* Collaborate in a distributed team environment using Github and Agile software development practices
* Optimize and scale open-source geospatial model pipelines on Linux - Python/Docker
* Develop custom solutions, tools, and workflows to debug, test, and evaluate predictive models
* Collaborate with external development teams
* Libraries: GDAL/OGR, Pandas/Geopandas, Rasterio, SQLite, Multiprocessing, Arcpy, Numpy
* Hydrologist (Software Developer), ORISE Fellowship - U.S. Environmental Protection Agency, Office of Research and Development, October 2016-June 2019
* Researched nanomaterial fate and transport in various aquatic ecosystems (see publications)
* Contributed to the development of Water-quality Analysis Simulation Program (WASP), version 8, Advanced Toxicant module including testing, verification of algorithms, technical guides and user’s manual, training workshops, and multiple peer-reviewed research papers
* Performed model sensitivity analysis and calibration of fate and transport models
* Developed hydrologic flow routing algorithms in Python for <a href="https://github.com/quanted/hms" target="_blank" rel="noopener"><span style="color:blue">Hydrologic Micro Services (HMS)</span></a>
* Developed an R Shiny web <a href="https://github.com/quanted/wq_screen" target="_blank" rel="noopener"><span style="color:blue">application</span></a> to screen water quality samples using criteria based on their location
* Hydrologist, U.S. Environmental Protection Agency, Office of Research and Development, February 2015-September 2016
* Tested source code of fate and transport model and verified parameter algorithms
* Built pipelines to acquire model input data from APIs and convert to organized data structures (R)
* Modeled fate and transport of sediments, metals, and emerging contaminants
* Developed open-source pipelines for extracting, evaluating, and visualizing model results (R)
* Mudlogger, Morco Geological Services Inc., November 2013-December 2014
* Collected and logged lithology samples for formation mapping and hydrocarbon analysis
* Analyzed chromatographs and logging/drilling software
* Prepared technical reports and corresponded with operations geologists
* Independent Contractor, Headwater Science, LLC, August-September 2013
* Stream habitat assessments; macroinvertebrate identification and analysis
* Electrofishing and seine net sampling
* Graduate Research Assistant, Dept. Forestry and Natural Resources & Savannah River National Lab (U.S. Dept. of Energy), University of Georgia, 2010-2013
* Developed methods for streamflow prediction in ungauged basins using parameter information from calibrated gauged basins with Hydrologic Simulation Program FORTRAN (HSPF)
* Estimated data and model uncertainty using parameter estimation (PEST) software
* Applied statistical metrics to quantify streamflow and watershed characteristic similarities
* GIS Specialist, U.S. Forest Service, University of Georgia, 2012
* Created GIS layers, maps, and technical reports following EPA’s DRASTIC protocol
* Developed and organized the Ground Water Vulnerability and Risk Assessment for U.S. Forest Service Southern Region lands
* Collaborated with a team to produce GIS products, procedure manuals, and reports
* Research Assistant, Institute for Environmental and Spatial Analysis & Georgia Power, Gainesville State College, 2009-2010
* Assisted in developing a decision support system (DSS) website using HTML and ArcGIS
* Created GIS layers displaying watershed descriptions and water quality data in an interactive framework
* Senior Project, Institute for Environmental and Spatial Analysis, Gainesville State College, 2009-2010
* Developed a land use map of Rocky Creek, GA to identify stream banks lacking riparian buffers
* Produced a canopy height model from LIDAR data to classify land use with object-based image analysis
Awards
======
* EPA Gold Medal for Scientific Excellence (Gold King Mine Report) - 2017
* EPA Scientific and Technological Achievement Award Nominee - 2021 (Pending Results)
Journal Publications
======
<ul>{% for post in site.publications %}
{% include archive-single-cv.html %}
{% endfor %}</ul>
Reports and Proceedings Papers
======
* Sitterson, J., C.D. Knightes, B. Avant. (2018). Flow Routing Techniques for Environmental Modeling. U.S. Environmental Protection Agency, Washington, DC, <a href="https://cfpub.epa.gov/si/si_public_record_Report.cfm?dirEntryId=342907&Lab=NERL" target="_blank" rel="noopener"><span style="color:blue">EPA/600/B-18/256</span></a>. <a href="https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=537222&Lab=NERL"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* Sitterson, J., C.D. Knightes, R. Parmar, K. Wolfe, M. Muche, B. Avant. (2017). An Overview of Rainfall-Runoff Model Types. U.S. Environmental Protection Agency, Washington, DC, <a href="https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=339328&Lab=NERL" target="_blank" rel="noopener"><span style="color:blue">EPA/600/R-17/482</span></a>. <a href="https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=533906&Lab=NERL"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* Sitterson, J., C.D. Knightes, R. Parmar, K. Wolfe, B. Avant, A. Ignatius, D. Smith. (2017). A Survey of Precipitation Data for Environmental Modeling. U.S. Environmental Protection Agency, Washington, DC, <a href="https://cfpub.epa.gov/si/si_public_record_report.cfm?Lab=NERL&dirEntryId=339606" target="_blank" rel="noopener"><span style="color:blue">EPA/600/R-17/441</span></a>. <a href="https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=534513&Lab=NERL"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* Ambrose, B., B. Avant, Y. Han, C.D. Knightes, T. Wool. (2017). Water Quality Assessment Simulation Program (WASP8): Upgrades to the Advanced Toxicant Module for Simulating Dissolved Chemicals, Nanomaterials, and Solids. U.S. Environmental Protection Agency, Washington, DC, <a href="https://cfpub.epa.gov/si/si_public_record_report.cfm?Lab=NERL&dirEntryId=338180" target="_blank" rel="noopener"><span style="color:blue">EPA/600/R-17/326</span></a>. <a href="https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=535418&Lab=NERL"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* Avant, B., Knightes, C., Bouchard, D., Chang, X., Henderson, M., Zepp, R. (2017). Modeling Engineered Nanomaterials (ENMs) Fate and Transport in Aquatic Ecosystems. <a href="http://gwri.gatech.edu/GWRC2017" target="_blank" rel="noopener"><span style="color:blue">Proceedings of the 2017 Georgia Water Resources Conference</span></a>. Athens, GA, April. <a href="http://gwri.gatech.edu/sites/default/files/files/docs/2017/avantknightesbouchardchanghendersonzeppgwrc2017.pdf"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* US EPA. (2017). Analysis of the Transport and Fate of Metals Released from the Gold King Mine in the Animas and San Juan Rivers. Office of Research and Development. Athens, GA. <a href="https://cfpub.epa.gov/si/si_public_record_report.cfm?Lab=NERL&dirEntryID=325950" target="_blank" rel="noopener"><span style="color:blue">EPA Report</span></a>. <a href="https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=530074&Lab=NERL"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* Avant, B., (2013). Comparison of Input Precipitation Sources in Streamflow Forecasting. Master’s Thesis. Daniel B. Warnell School of Forestry and Natural Resources. University of Georgia, United States. <a href="https://getd.libs.uga.edu/pdfs/avant_brian_k_201312_ms.pdf"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
* Avant, B., A. Ignatius, T. Rasmussen, A. Grundstein, T. Mote, J. Shepherd. (2011). Coupling Tritium Release Data with Remotely Sensed Precipitation Data to Assess Model Uncertainties. <a href="http://hdl.handle.net/1853/46461" target="_blank" rel="noopener"><span style="color:blue">Proceedings of the 2011 Georgia Water Resources Conference</span></a>. Athens, GA, April. <a href="https://smartech.gatech.edu/bitstream/handle/1853/46461/Poster7.01121Avant.pdf?sequence=1&isAllowed=y"><img src="/images/pdf.jpg" style="width: 20px; height: 20px; margin-left: 1px;"></a>
Teaching
======
<ul>{% for post in site.teaching %}
{% include archive-single-cv.html %}
{% endfor %}</ul>
| 86.66055 | 624 | 0.764027 | eng_Latn | 0.619073 |
f90baeb2300020059c425ba662939f483ba198b5 | 67 | md | Markdown | README.md | lxykyle01/WebNotes | acfa3eee3fc543dc64a1641ffe7f5f39e8c83bf5 | [
"MIT"
] | null | null | null | README.md | lxykyle01/WebNotes | acfa3eee3fc543dc64a1641ffe7f5f39e8c83bf5 | [
"MIT"
] | null | null | null | README.md | lxykyle01/WebNotes | acfa3eee3fc543dc64a1641ffe7f5f39e8c83bf5 | [
"MIT"
] | null | null | null | # WebNotes
### Here is My Website
### Record my Life and Knowledge
| 16.75 | 32 | 0.701493 | eng_Latn | 0.955116 |
f90bbfd098a63c44a1df4b8a952d04b3b8988dfc | 734 | md | Markdown | docs/error-messages/compiler-errors-2/compiler-error-c2726.md | psimn/cpp-docs.zh-cn | 0f8c59315e1753eb94b113dac7c38b3b70486ad7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/compiler-errors-2/compiler-error-c2726.md | psimn/cpp-docs.zh-cn | 0f8c59315e1753eb94b113dac7c38b3b70486ad7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/error-messages/compiler-errors-2/compiler-error-c2726.md | psimn/cpp-docs.zh-cn | 0f8c59315e1753eb94b113dac7c38b3b70486ad7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 编译器错误 C2726
ms.date: 11/04/2016
f1_keywords:
- C2726
helpviewer_keywords:
- C2726
ms.assetid: f0191bb7-c175-450b-bf09-a3213db96d09
ms.openlocfilehash: 36ddf54b3924ae48969fa810ba2883c51b2264b5
ms.sourcegitcommit: 16fa847794b60bf40c67d20f74751a67fccb602e
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 12/03/2019
ms.locfileid: "74757418"
---
# <a name="compiler-error-c2726"></a>编译器错误 C2726
“gcnew”只能用于创建具有托管或 WinRT 类型的对象
不能在垃圾回收堆上创建本机类型的实例。
下面的示例生成 C2726,并演示如何修复此错误:
```cpp
// C2726.cpp
// compile with: /clr
using namespace System;
class U {};
ref class V {};
value class W {};
int main() {
U* pU = gcnew U; // C2726
U* pU2 = new U; // OK
V^ p2 = gcnew V; // OK
W p3; // OK
}
```
| 18.35 | 60 | 0.708447 | yue_Hant | 0.263523 |
f90bcceb00dd1b25b1e43b8f533e9eae2ecd24c6 | 7,368 | md | Markdown | 8_transport_layer_security.md | AmanoTooko/embeddedappsec | edefc0e72262fd49cc0febb4b40aee1347409f66 | [
"CC0-1.0"
] | 40 | 2018-05-03T02:50:35.000Z | 2022-02-11T08:15:18.000Z | 8_transport_layer_security.md | AmanoTooko/embeddedappsec | edefc0e72262fd49cc0febb4b40aee1347409f66 | [
"CC0-1.0"
] | 24 | 2021-01-08T19:25:50.000Z | 2021-02-02T04:55:05.000Z | 8_transport_layer_security.md | AmanoTooko/embeddedappsec | edefc0e72262fd49cc0febb4b40aee1347409f66 | [
"CC0-1.0"
] | 12 | 2017-12-08T12:21:45.000Z | 2022-02-01T17:23:44.000Z | # Transport Layer Security
Ensure all methods of communication are utilizing industry standard encryption configurations for [TLS](https://www.securecoding.cert.org/confluence/display/c/API10-C.+APIs+should+have+security+options+enabled+by+default). The use of TLS ensures that all data remains confidential and untampered with while in transit. Utilize free certificate authority services such as [Let’s Encrypt](https://letsencrypt.org/) if the embedded device utilizes domain names.
[**Example**](http://fm4dd.com/openssl/certverify.htm) **of how to perform a basic certificate validation against a root certificate authority, using the OpenSSL library functions. :**
```c
#include <openssl/bio.h>
#include <openssl/err.h>
#include <openssl/pem.h>
#include <openssl/x509.h>
#include <openssl/x509_vfy.h>
int main() {
const char ca_bundlestr[] = "./ca-bundle.pem";
const char cert_filestr[] = "./cert-file.pem";
BIO *certbio = NULL;
BIO *outbio = NULL;
X509 *error_cert = NULL;
X509 *cert = NULL;
X509_NAME *certsubject = NULL;
X509_STORE *store = NULL;
X509_STORE_CTX *vrfy_ctx = NULL;
int ret;
/* ---------------------------------------------------------- *
* These function calls initialize openssl for correct work. *
* ---------------------------------------------------------- */
OpenSSL_add_all_algorithms();
ERR_load_BIO_strings();
ERR_load_crypto_strings();
/* ---------------------------------------------------------- *
* Create the Input/Output BIO's. *
* ---------------------------------------------------------- */
certbio = BIO_new(BIO_s_file());
outbio = BIO_new_fp(stdout, BIO_NOCLOSE);
/* ---------------------------------------------------------- *
* Initialize the global certificate validation store object. *
* ---------------------------------------------------------- */
if (!(store=X509_STORE_new()))
BIO_printf(outbio, "Error creating X509_STORE_CTX object\n");
/* ---------------------------------------------------------- *
* Create the context structure for the validation operation. *
* ---------------------------------------------------------- */
vrfy_ctx = X509_STORE_CTX_new();
/* ---------------------------------------------------------- *
* Load the certificate and cacert chain from file (PEM). *
* ---------------------------------------------------------- */
ret = BIO_read_filename(certbio, cert_filestr);
if (! (cert = PEM_read_bio_X509(certbio, NULL, 0, NULL))) {
BIO_printf(outbio, "Error loading cert into memory\n");
exit(-1);
}
ret = X509_STORE_load_locations(store, ca_bundlestr, NULL);
if (ret != 1)
BIO_printf(outbio, "Error loading CA cert or chain file\n");
/* ---------------------------------------------------------- *
* Initialize the ctx structure for a verification operation: *
* Set the trusted cert store, the unvalidated cert, and any *
* potential certs that could be needed (here we set it NULL) *
* ---------------------------------------------------------- */
X509_STORE_CTX_init(vrfy_ctx, store, cert, NULL);
/* ---------------------------------------------------------- *
* Check the complete cert chain can be build and validated. *
* Returns 1 on success, 0 on verification failures, and -1 *
* for trouble with the ctx object (i.e. missing certificate) *
* ---------------------------------------------------------- */
ret = X509_verify_cert(vrfy_ctx);
BIO_printf(outbio, "Verification return code: %d\n", ret);
if(ret == 0 || ret == 1)
BIO_printf(outbio, "Verification result text: %s\n",
X509_verify_cert_error_string(vrfy_ctx->error));
/* ---------------------------------------------------------- *
* The error handling below shows how to get failure details *
* from the offending certificate. *
* ---------------------------------------------------------- */
if(ret == 0) {
/* get the offending certificate causing the failure */
error_cert = X509_STORE_CTX_get_current_cert(vrfy_ctx);
certsubject = X509_NAME_new();
certsubject = X509_get_subject_name(error_cert);
BIO_printf(outbio, "Verification failed cert:\n");
X509_NAME_print_ex(outbio, certsubject, 0, XN_FLAG_MULTILINE);
BIO_printf(outbio, "\n");
}
/* ---------------------------------------------------------- *
* Free up all structures *
* ---------------------------------------------------------- */
X509_STORE_CTX_free(vrfy_ctx);
X509_STORE_free(store);
X509_free(cert);
BIO_free_all(certbio);
BIO_free_all(outbio);
exit(0);
}
```
**Considerations \(Disclaimer: The List below is non-exhaustive\):**
* Use the latest possible version of TLS for new products \(as of writing, this is TLS 1.2\)
* Consider implementing TLS two-way authentication for firmware that accepts TLS connections from a limited group of allowed clients.
* If possible, consider using mutual-authentication to authenticate both end-points.
* Validate the certificate public key, hostname, and [chain](http://fm4dd.com/openssl/certverify.htm).
* Ensure certificate and their chains use SHA256 for signing.
* Disable deprecated SSL and early TLS versions.
* Disable deprecated, NULL and weak cipher suites.
* Ensure private key and certificates are stored securely - e.g. Secure Environment or Trusted Execution Environment, or protected using strong cryptography.
* Keep certificates updated with up to date secure configurations.
* Ensure proper certificate update features are available upon expiration.
* Verify TLS configurations utilizing services such as [ssllabs.com](https://www.ssllabs.com), nmap using `--script ssl-enum-ciphers.nse`, TestSSLServer.jar, sslscan and sslyze.
**Other Example\(s\):**
To utilize TLS, there are other options besides OpenSSL. A non-exhaustive list is below.
Formerly PolarSSL, a list of projects using mbed TLS can be found at:
* [https://tls.mbed.org/kb/generic/projects-using-mbedtls](https://tls.mbed.org/kb/generic/projects-using-mbedtls)
* [https://tls.mbed.org/](https://tls.mbed.org/)
Examples of implementation can be found at
* [https://tls.mbed.org/kb/how-to/mbedtls-tutorial](https://tls.mbed.org/kb/how-to/mbedtls-tutorial)
Formerly CyaSSL, wolfSSL and a list of projects using wolfSSL can be found at:
* [https://www.wolfssl.com/wolfSSL/wolfssl-embedded-ssl-case-studies.html](https://www.wolfssl.com/wolfSSL/wolfssl-embedded-ssl-case-studies.html)
* [https://www.wolfssl.com/wolfSSL/Home.html](https://www.wolfssl.com/wolfSSL/Home.html)
Examples of implementation can be found at:
* [https://github.com/wolfSSL/wolfssl-examples](https://github.com/wolfSSL/wolfssl-examples)
## Additional References <a id="additional-references"></a>
* [https://letsencrypt.org/](https://letsencrypt.org/)
* [https://community.letsencrypt.org/t/certificate-for-embedded-device-without-a-domain-name/2372](https://community.letsencrypt.org/t/certificate-for-embedded-device-without-a-domain-name/2372)
* [http://fm4dd.com/openssl/](http://fm4dd.com/openssl/)
* [https://www.engadget.com/2015/04/19/wink-home-automation-hub-bricked/](https://www.engadget.com/2015/04/19/wink-home-automation-hub-bricked/)
| 48.156863 | 458 | 0.605592 | eng_Latn | 0.633202 |
f90bfc92503ffc2733284ed3f415ccc1566321fa | 200 | md | Markdown | wiki/translations/fr/Spreadsheet_Module.md | arslogavitabrevis/FreeCAD-documentation | 8166ce0fd906f0e15672cd1d52b3eb6cf9e17830 | [
"CC0-1.0"
] | null | null | null | wiki/translations/fr/Spreadsheet_Module.md | arslogavitabrevis/FreeCAD-documentation | 8166ce0fd906f0e15672cd1d52b3eb6cf9e17830 | [
"CC0-1.0"
] | null | null | null | wiki/translations/fr/Spreadsheet_Module.md | arslogavitabrevis/FreeCAD-documentation | 8166ce0fd906f0e15672cd1d52b3eb6cf9e17830 | [
"CC0-1.0"
] | null | null | null | # Spreadsheet Module/fr
1. REDIRECT [Spreadsheet Workbench/fr](Spreadsheet_Workbench/fr.md)
---
[documentation index](../README.md) > [Spreadsheet](Spreadsheet_Workbench.md) > Spreadsheet Module/fr
| 33.333333 | 101 | 0.77 | kor_Hang | 0.328579 |
f90d9adc365b9d2a070d3e2dbb31b1112cfbd3ed | 567 | md | Markdown | README-zh.md | yangjiang3973/vheel | 0a5b4c0aca9aac288ee940e4f0c2677155315f52 | [
"MIT"
] | 1 | 2022-02-13T06:57:43.000Z | 2022-02-13T06:57:43.000Z | README-zh.md | yangjiang3973/vheel | 0a5b4c0aca9aac288ee940e4f0c2677155315f52 | [
"MIT"
] | 1 | 2021-03-28T09:32:58.000Z | 2021-03-28T09:32:58.000Z | README-zh.md | yangjiang3973/vheel | 0a5b4c0aca9aac288ee940e4f0c2677155315f52 | [
"MIT"
] | null | null | null | # vheel
Shows you how to create a Vue3-like framework step by step
Learn the Vue 3 source code by progressively building a Vue3-like framework, one step at a time.
## Introduction
Many people are curious about how front-end frameworks such as Vue 3 are created, but reading the source code directly to understand the whole picture is very time-consuming.
So I want to take a forward, progressive approach to explaining how to build a framework similar to Vue 3.
## How to Use
Each branch contains the development progress completed so far, divided into stages. The main branch always holds the latest progress.
I recommend first cloning the whole repo locally, then starting from the first branch `01-setup-dev-env` and typing out the code as you follow along.
In `vheel/playground/main.js` I usually create some demos to show the new features developed in the current branch.
To run the demos, first
`npm install`
then
`npm run dev`
## Articles
For each branch there is a corresponding article explaining what was done and why.
The earliest versions are published on my WeChat official account: `奔三程序员Club`.
I suggest reading the articles while building the framework from scratch starting from the first branch, and consulting the finished branch whenever something is unclear.
| 15.75 | 64 | 0.781305 | yue_Hant | 0.411568 |
f90e05acdbba4fb4f3e40a8ca096675c765b0965 | 4,291 | md | Markdown | lennoxs30api/docs/datamodel.md | hufman/lennoxs30api | a84669dd7ff87579406eeeb158b2359b2e878b0d | [
"MIT"
] | 6 | 2021-06-05T02:50:21.000Z | 2021-11-28T23:08:41.000Z | lennoxs30api/docs/datamodel.md | hufman/lennoxs30api | a84669dd7ff87579406eeeb158b2359b2e878b0d | [
"MIT"
] | 12 | 2021-07-16T00:28:59.000Z | 2021-11-20T22:52:51.000Z | lennoxs30api/docs/datamodel.md | hufman/lennoxs30api | a84669dd7ff87579406eeeb158b2359b2e878b0d | [
"MIT"
] | 2 | 2021-10-30T04:44:48.000Z | 2021-10-30T14:43:04.000Z | The purpose of this document is to decsribe the data model of the Lenox S30 API
Home
(homeid, id, name)
Systems
System
(sysid)
(outdoor temperature)
(data) - there is a very large system object
Schedules - a list of the schedules that are configured for this system
Zones
(name) <Zone 1>
(id) <0>
(config) - capabilities of the zone - heating, cooling, dehumidification, min/max setpoints, current active schedule
Schedules (not sure what the difference is between these and the ones at the system level)
(sensors)
"sensors": [
{
"hum": 56.81336476924544,
"humStatus": "good",
"id": 0,
"tant": 69.55912714012544,
"tempStatus": "good",
"tsense": 69.5591815349143
                }
            ]
(status)
            "status": {
"allergenDefender": false,
"aux": false,
"balancePoint": "none",
"coolCoast": false,
"damper": 0,
"defrost": false,
"demand": 0,
"fan": false,
"heatCoast": false,
"humOperation": "off",
"humidity": 57,
"humidityStatus": "good",
"period": {
"csp": 75,
"cspC": 24,
"desp": 55,
"fanMode": "auto",
"hsp": 57,
"hspC": 14,
"humidityMode": "off",
"husp": 40,
"sp": 73,
"spC": 23,
"startTime": 0,
"systemMode": "cool"
                },
                ...
            }
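To illustrate how a client might consume the zone `status` payload above, here is a small, self-contained Python sketch. The dictionary below is abbreviated sample data shaped like the excerpt, and the helper name is my own for illustration — it is not part of the Lennox API:

```python
# Abbreviated sample shaped like the zone "status" excerpt above.
zone_status = {
    "humidity": 57,
    "humidityStatus": "good",
    "fan": False,
    "period": {
        "csp": 75,          # cooling setpoint (F)
        "hsp": 57,          # heating setpoint (F)
        "fanMode": "auto",
        "systemMode": "cool",
    },
}

def summarize_zone(status: dict) -> str:
    """Build a one-line summary of a zone's current operating state."""
    period = status.get("period", {})
    mode = period.get("systemMode", "unknown")
    csp = period.get("csp")
    hsp = period.get("hsp")
    hum = status.get("humidity")
    return f"mode={mode} cool_sp={csp}F heat_sp={hsp}F humidity={hum}%"

print(summarize_zone(zone_status))  # mode=cool cool_sp=75F heat_sp=57F humidity=57%
```

Using `dict.get` with defaults keeps the parser tolerant of truncated or partially populated status messages.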
Entity: Homes
The top level is a list of one or more homes.
Attributes:
homeid - identifier for the home, this is an integer, numbers I see are > 2,000,000
id - zero based index of the homes for this account
name - the user configured name of the home
address - physical address, latitude / longitude
Systems - a list of systems in the Home
When Obtained:
Information is retrieved as part of login
Entity: Systems
Represents an HVAC system within a specific home
Static Attributes:
sysId - a GUID identifying the system - this is the ID that is used to subscribe for information
Dynamic Attributes
presence - indicates if the system is online or offline
outdoor temperature
"AdditionalParameters": null,
"Data": {
"system": {
"publisher": {
"publisherName": "lcc"
},
"status": {
"outdoorTemperature": 67,
"outdoorTemperatureC": 19,
"outdoorTemperatureStatus": "good"
}
}
},
"MessageId": "637576290738517589|5fe01439cc69416b9c26f268dee1df0c",
"MessageType": "PropertyChange",
"SenderId": "0000000-0000-0000-0000-000000000002",
"TargetID": "mapp079372367644467046827006_myemail@email.com"
system configuration from dynamic
"system": {
"capacityPrognostics": {
"alertThreshold": 0,
"filterGain": 0,
"isValid": false,
"persistenceCountThreshold": 0,
"persistentThreshold": 0
}, .....
When Obtained:
Information is retrieved as part of login
Entity Zones
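As a sketch of how a client might consume the zone payloads above, the helper below flattens the nested zone `status` object into the values a climate integration typically needs. The field names come from the JSON samples in this document; the function itself is illustrative only and is not part of the lennoxs30api library:

```python
# Illustrative helper (not part of lennoxs30api): flatten the nested
# zone "status" object from the S30 data model into a simple dict.

def summarize_zone_status(status: dict) -> dict:
    """Extract setpoints, mode, and humidity from a zone status payload."""
    period = status.get("period", {})
    return {
        "temperature_setpoint": period.get("sp"),
        "cool_setpoint": period.get("csp"),
        "heat_setpoint": period.get("hsp"),
        "system_mode": period.get("systemMode"),
        "fan_mode": period.get("fanMode"),
        "humidity": status.get("humidity"),
        "humidity_status": status.get("humidityStatus"),
    }


# Sample payload trimmed from the zone "status" example in this document.
zone_status = {
    "humidity": 57,
    "humidityStatus": "good",
    "period": {
        "csp": 75, "hsp": 57, "sp": 73,
        "fanMode": "auto", "systemMode": "cool",
    },
}

print(summarize_zone_status(zone_status))
```

Using `dict.get` keeps the helper tolerant of partial payloads, which matters here because the samples above are excerpts of a much larger system object.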
| 36.058824 | 129 | 0.41855 | eng_Latn | 0.913985 |
f90e9e5a4948de8cad70a33c197b1c99839329c6 | 253 | md | Markdown | _posts/2019-04-12-patent-certificate.md | arielly/huxpro.github.io | e717f75fd85c6ab53496bb69e54e4455176ba904 | [
"Apache-2.0"
] | 1 | 2019-04-12T10:53:30.000Z | 2019-04-12T10:53:30.000Z | _posts/2019-04-12-patent-certificate.md | arielly/huxpro.github.io | e717f75fd85c6ab53496bb69e54e4455176ba904 | [
"Apache-2.0"
] | null | null | null | _posts/2019-04-12-patent-certificate.md | arielly/huxpro.github.io | e717f75fd85c6ab53496bb69e54e4455176ba904 | [
"Apache-2.0"
] | null | null | null | ---
layout: post
title: "Patent Certificate"
subtitle: ""
author: "arielly"
header-img: "img/post-bg-halting.jpg"
header-mask: 0.3
tags: [ patent ]
---
# Patent Certificate



| 14.055556 | 37 | 0.652174 | eng_Latn | 0.081533 |
f90f4f6f25e6333f5fde9e3aa63e2a9463bc910d | 61 | md | Markdown | README.md | real-xikram/Trail_MERN | 06cb12ef469cb12de6a06656671d845eb648923d | [
"MIT"
] | null | null | null | README.md | real-xikram/Trail_MERN | 06cb12ef469cb12de6a06656671d845eb648923d | [
"MIT"
] | null | null | null | README.md | real-xikram/Trail_MERN | 06cb12ef469cb12de6a06656671d845eb648923d | [
"MIT"
] | null | null | null | # Trail_MERN
In this repo we will learn about the MERN framework
| 20.333333 | 47 | 0.803279 | eng_Latn | 0.982166 |
f90f852404cfdf10c4898d48a17da42080f3d087 | 1,510 | md | Markdown | WindowsServerDocs/administration/windows-commands/using-the-copy-drivergroup-command.md | AnirbanPaul/windowsserverdocs | b7b12767c1261b17bdbf87cda49f341ccaf687bb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/administration/windows-commands/using-the-copy-drivergroup-command.md | AnirbanPaul/windowsserverdocs | b7b12767c1261b17bdbf87cda49f341ccaf687bb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/administration/windows-commands/using-the-copy-drivergroup-command.md | AnirbanPaul/windowsserverdocs | b7b12767c1261b17bdbf87cda49f341ccaf687bb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Using the copy-DriverGroup Command
description: "Windows Commands topic for **** - "
ms.custom: na
ms.prod: windows-server-threshold
ms.reviewer: na
ms.suite: na
ms.technology: manage-windows-commands
ms.tgt_pltfrm: na
ms.topic: article
ms.assetid: 0aaf6fa5-8b5b-4a1e-ae9b-8b5c6d89f571
author: coreyp-at-msft
ms.author: coreyp
manager: dongill
ms.date: 10/12/2016
---
# Using the copy-DriverGroup Command
> Applies To: Windows Server 2016, Windows Server 2012 R2, Windows Server 2012
Duplicates an existing driver group on the server including the filters, driver packages, and enabled/disabled status.
## Syntax
```
WDSUTIL /Copy-DriverGroup [/Server:<Server name>] /DriverGroup:<Source Group Name> /GroupName:<New Group Name>
```
## Parameters
|Parameter|Description|
|---------|-----------|
|[/Server:\<Server name>]|Specifies the name of the server. This can be the NetBIOS name or the FQDN. If no server name is specified, the local server is used.|
|/DriverGroup:\<Source Group Name>|Specifies the name of the source driver group.|
|/GroupName:\<New Group Name>|Specifies the name of the new driver group.|
## <a name="BKMK_examples"></a>Examples
To copy a driver group, type one of the following:
```
WDSUTIL /Copy-DriverGroup /Server:MyWdsServer /DriverGroup:PrinterDrivers /GroupName:X86PrinterDrivers
```
```
WDSUTIL /Copy-DriverGroup /DriverGroup:PrinterDrivers /GroupName:ColorPrinterDrivers
```
#### Additional references
[Command-Line Syntax Key](command-line-syntax-key.md) | 30.2 | 160 | 0.751656 | eng_Latn | 0.657882 |
f90f98a4ec16a10aa8da059f8dabc5f5e353cf22 | 17 | md | Markdown | README.md | aogn/test | d703fa6b1257baeac41306c6b78761ddfd692c57 | [
"Apache-2.0"
] | null | null | null | README.md | aogn/test | d703fa6b1257baeac41306c6b78761ddfd692c57 | [
"Apache-2.0"
] | null | null | null | README.md | aogn/test | d703fa6b1257baeac41306c6b78761ddfd692c57 | [
"Apache-2.0"
] | null | null | null | test text # test
| 8.5 | 16 | 0.705882 | eng_Latn | 0.770315 |
f910849308e0de3a756420fdd512379d1f926d5a | 85 | md | Markdown | README.md | emmmwinama/Blog | ddb6f653ec4c5afac6879da6107786350a027040 | [
"MIT"
] | null | null | null | README.md | emmmwinama/Blog | ddb6f653ec4c5afac6879da6107786350a027040 | [
"MIT"
] | null | null | null | README.md | emmmwinama/Blog | ddb6f653ec4c5afac6879da6107786350a027040 | [
"MIT"
] | null | null | null | # Blog
Personal blog developed in Java Spring Boot 2 mongodb reactjs and tailwindcss
| 28.333333 | 77 | 0.823529 | eng_Latn | 0.899164 |
f910f04e0993c86dc37e53e80eda0f4759640aea | 4,540 | md | Markdown | WindowsServerDocs/identity/ad-ds/get-started/virtual-dc/Support-for-using-Hyper-V-Replica-for-virtualized-domain-controllers.md | huache/windowsserverdocs.zh-cn | a39065b2fd3ef28c99a27b6c91372b058e4422f8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/identity/ad-ds/get-started/virtual-dc/Support-for-using-Hyper-V-Replica-for-virtualized-domain-controllers.md | huache/windowsserverdocs.zh-cn | a39065b2fd3ef28c99a27b6c91372b058e4422f8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/identity/ad-ds/get-started/virtual-dc/Support-for-using-Hyper-V-Replica-for-virtualized-domain-controllers.md | huache/windowsserverdocs.zh-cn | a39065b2fd3ef28c99a27b6c91372b058e4422f8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
ms.assetid: 45a65504-70b5-46ea-b2e0-db45263fabaa
title: 支持将 Hyper-V 副本用于虚拟化域控制器
author: iainfoulds
ms.author: iainfou
manager: daveba
ms.date: 05/31/2017
ms.topic: article
ms.openlocfilehash: 8c4d96bbf23e9f25a0ed38f6ca9d6e4c33cdb7b7
ms.sourcegitcommit: 1dc35d221eff7f079d9209d92f14fb630f955bca
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 08/26/2020
ms.locfileid: "88938187"
---
# <a name="support-for-using-hyper-v-replica-for-virtualized-domain-controllers"></a>支持将 Hyper-V 副本用于虚拟化域控制器
> 适用于:Windows Server 2016、Windows Server 2012 R2、Windows Server 2012
本主题介绍了对于使用 Hyper-V 副本复制作为域控制器 (DC) 运行的虚拟机 (VM) 的支持能力。 Hyper-V 副本是始于 Windows Server 2012 的 Hyper-V 新功能,它提供虚拟机级别的内置复制机制。
Hyper-V 副本通过 LAN 或 WAN 链接以异步方式将所选虚拟机从主 Hyper-V 主机复制到副本 Hyper-V 主机。 完成初始复制后,将按照管理员定义的时间间隔复制后续的更改。
故障转移可以是计划的或非计划的。 计划的故障转移由管理员在主虚拟机上启动,它将任何未复制的更改复制到副本虚拟机上,以防止丢失任何数据。 将在副本虚拟机上启动非计划的故障转移,以响应主虚拟机中的意外失败。 因为没有机会传输可能尚未复制的主虚拟机上的更改,所以可能会丢失数据。
有关 Hyper-V 副本的详细信息,请参阅 [Hyper-V 副本概述](/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/jj134172(v=ws.11))和[部署 Hyper-V 副本](/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/jj134207(v=ws.11))。
> [!NOTE]
> 仅可在 Windows Server Hyper-V(不是在 Windows 8 上运行的 Hyper-V 版本)上运行 Hyper-V 副本。
## <a name="windows-server-2012-or-newer-domain-controllers-required"></a>需要 Windows Server 2012 或更高版本的域控制器
Windows Server 2012 Hyper-v 引入了 VM-生成 id (VMGenID) 。 VMGenID 提供了一种方法,可在发生重大更改时,使虚拟机监控程序与来宾 OS 进行通信。 例如,虚拟机监控程序可以与已从快照还原(Hyper-V 快照还原技术,而不是备份还原)的虚拟化 DC 进行通信。 Windows Server 2012 和更高版本中的 AD DS 可感知 VMGenID VM 技术,并使用它来检测何时执行虚拟机监控程序操作(例如快照还原),从而允许它更好地保护自身。
> [!NOTE]
> 只有 Windows Server 2012 Dc 或更高版本上的 AD DS 提供从 VMGenID 生成的这些安全措施;运行所有以前版本的 Windows Server 的 Dc 受在使用不受支持的机制(例如快照还原)还原虚拟化 DC 时可能发生的问题。 有关这些安全措施和何时触发它们的详细信息,请参阅[虚拟化域控制器体系结构](./virtualized-domain-controller-architecture.md)。
当 Hyper-v 副本故障转移 (计划内或计划外的) 时,虚拟化的 DC 会检测到 VMGenID 重置,并触发上述安全功能。 然后,Active Directory 操作将如常地继续。 副本 VM 取代主 VM 运行。
> [!NOTE]
> 考虑到现在存在两个具有相同 DC 标识的实例,有可能主要实例和复制的实例都将运行。 尽管 Hyper-V 副本已准备控制机制以确保主 VM 和副本 VM 不会同时运行,但如果发生它们之间的链接在虚拟机复制之后失败的事件,它们仍可能会同时运行。 在这个不太可能发生的事件中,运行 Windows Server 2012 的虚拟化 DC 具有有助于保护 AD DS 的安全措施,而运行更早版本 Windows Server 的虚拟化 DC 没有这些安全措施。
使用 Hyper-V 副本时,请确保按照[在 Hyper-V 上运行虚拟域控制器](/previous-versions/windows/it-pro/windows-server-2008-R2-and-2008/dd363553(v=ws.10))的最佳实践进行操作。 例如,这部分讨论了关于在虚拟 SCSI 磁盘上存储 Active Directory 文件的建议,从而对数据持久性提供了更强有力的保证。
## <a name="supported-and-unsupported-scenarios"></a>支持的和不支持的方案
非计划的故障转移和测试故障转移仅支持运行 Windows Server 2012 或更高版本的 Vm。 即使对于计划的故障转移,也建议为虚拟化 DC 使用 Windows Server 2012 或更高版本,以便在管理员不小心同时启动主 VM 和复制 VM 的事件中降低风险。
计划的故障转移支持运行更早版本 Windows Server 的 VM,但因为有 USN 回滚的可能,所以未计划的故障转移不支持运行这类 VM。 有关 USN 回滚的详细信息,请参阅 [USN 和 USN 回滚](/previous-versions/windows/it-pro/windows-server-2008-R2-and-2008/dd363553(v=ws.10))。
> [!NOTE]
> 对于域或林没有任何功能级别的要求;仅对作为 VM(使用 Hyper-V 副本复制)运行的 DC 存在操作系统要求。 可以在包含其他物理或虚拟 DC 的林中部署 VM,这些 DC 运行较早版本的 Windows Server,并且不一定使用 Hyper-V 副本进行复制。
此支持声明基于在单个域林中执行的测试,但是也支持多域林配置。 对于这些测试,虚拟化域控制器 DC1 和 DC2 是同一站点中的 Active Directory 复制伙伴,它们托管于 Windows Server 2012 上运行 Hyper-V 的服务器。 运行 DC2 的 VM 来宾已启用 Hyper-V 副本。 由另一个在地理上相隔很远的数据中心托管副本服务器。 为了帮助说明如下所述的测试用例过程,将副本服务器上运行的 VM 称为 DC2-Rec(尽管在实践中,它将保留与原始 VM 相同的名称)。
### <a name="windows-server-2012"></a>Windows Server 2012
下表介绍了对运行 Windows Server 2012 的虚拟化 DC 的支持和测试用例。
| 计划内故障转移 | 非计划的故障转移 |
|--|--|
| 支持 | 支持 |
| 测试用例:<p>-DC1 和 DC2 正在运行 Windows Server 2012。<p>-DC2 已关闭,并在 DC2-Rec 上执行故障转移。故障转移可以是计划的或非计划的。<p>-在 DC2 开始后,它将检查其数据库中的 VMGenID 值是否与 Hyper-v 副本服务器保存的虚拟机驱动程序中的值相同。<p>-因此,DC2 记录触发了虚拟化安全措施;换句话说,它会重置其 InvocationID,丢弃其 RID 池,并在其假定为操作主机角色之前设置初始同步要求。 有关初始同步要求的详细信息,请参阅 。<p>-DC2 将 VMGenID 的新值保存在其数据库中,并在新 InvocationID 的上下文中提交任何后续更新。<p>-由于 InvocationID 重置,DC1 将聚合到 DC2-Rec 引入的所有 AD 更改上,即使它是及时回滚的,这意味着在故障转移后,在 DC2 记录上执行的任何 AD 更新将安全地聚合起来 | 对于计划的故障转移,测试用例相同,但存在下列例外:<p>-在发生故障转移事件之前,在 DC2 上收到但尚未由 AD 复制到复制伙伴的 AD 更新将丢失。<p>-在由 AD 向 DC1 复制的恢复点之后,在 DC2 上收到的 AD 更新将从 DC1 复制回 DC2-Rec。 |
### <a name="windows-server-2008-r2-and-earlier-versions"></a>Windows Server 2008 R2 及较早版本
下表介绍了对运行 Windows Server 2008 R2 及较早版本的虚拟化 DC 的支持。
| 计划内故障转移 | 非计划的故障转移 |
|--|--|
| 支持,但不建议这样做,因为运行这些版本的 Windows Server 的 DC 不支持 VMGenID 或使用关联的虚拟化安全措施。 这使它们面临 USN 回滚的风险。 有关详细信息,请参阅 [USN 和 USN 回滚](/previous-versions/windows/it-pro/windows-server-2008-R2-and-2008/dd363553(v=ws.10))。 | 不支持<p>**注意:** 如果 USN 回滚不是风险,例如林中的单个 DC (不推荐的配置) ,则将支持非计划的故障转移。 |
| 测试用例:<p>-DC1 和 DC2 正在运行 Windows Server 2008 R2。<p>-DC2 已关闭,并在 DC2-Rec 上执行计划的故障转移。在关闭完成之前,DC2 上的所有数据都将复制到 DC2 记录。<p>-在 DC2 开始后,它会使用与 DC2 相同的 invocationID 恢复与 DC1 的复制。 | 空值 |
| 62.191781 | 565 | 0.787225 | yue_Hant | 0.879586 |
f9114413352b0881c333f40e37f5fd59988d6713 | 558 | md | Markdown | ktor-ingshowcase/readme.md | fschutte/ING-API-Showcase | 81076776369b5df2daafd6e05c2ba00cd85c1ccb | [
"MIT"
] | 1 | 2020-09-12T17:41:51.000Z | 2020-09-12T17:41:51.000Z | ktor-ingshowcase/readme.md | fschutte/ING-API-Showcase | 81076776369b5df2daafd6e05c2ba00cd85c1ccb | [
"MIT"
] | 1 | 2019-06-08T17:40:42.000Z | 2019-08-02T21:38:45.000Z | ktor-ingshowcase/readme.md | fschutte/ING-API-Showcase | 81076776369b5df2daafd6e05c2ba00cd85c1ccb | [
"MIT"
] | 1 | 2019-11-10T18:18:40.000Z | 2019-11-10T18:18:40.000Z | # Ktor ING Showcase
Ktor is a Kotlin specific microservices framework. See https://ktor.io for more information.
This module was created with the help of Intellij IDE and has the following features enabled: Http Client Engine, Apache Client Engine, Json serialization, Logging.
## Intro
Simple demonstration of consuming the ING Showcase API, see https://developer.ing.com/api-marketplace/marketplace.
## Building and running
Simply run Application.kt from your IDO or to run your application from command line use:
```
mvn clean compile exec:java
```
| 32.823529 | 164 | 0.786738 | eng_Latn | 0.979397 |
f9129b620174524218d934bb1d251456569e8bc4 | 455 | md | Markdown | README.md | vamega/repo-proxy | 6d036c90a2d6d3b4151ff295da000297458dd60f | [
"MIT"
] | null | null | null | README.md | vamega/repo-proxy | 6d036c90a2d6d3b4151ff295da000297458dd60f | [
"MIT"
] | null | null | null | README.md | vamega/repo-proxy | 6d036c90a2d6d3b4151ff295da000297458dd60f | [
"MIT"
] | null | null | null | repo-proxy
==========
Your new web app is ready to go!
To run your app you'll need to:
1. Activate a python 3.5 or 3.6 environment
2. Install the required packages with `pip install -r requirements.txt`
3. Make sure the app's settings are configured correctly (see `settings.yml`). You can also
use environment variables to define sensitive settings, eg. DB connection variables
4. You can then run your app during development with `adev runserver .`
| 35 | 91 | 0.751648 | eng_Latn | 0.999003 |
f912b1fdaf38bde49fba2ec82e9667cbf50e8539 | 61 | md | Markdown | README.md | marciobastiani/marciobastiani.github.io | a33fdc3640ad9073d2d40e64362aa6768a1b48cc | [
"CC-BY-3.0"
] | null | null | null | README.md | marciobastiani/marciobastiani.github.io | a33fdc3640ad9073d2d40e64362aa6768a1b48cc | [
"CC-BY-3.0"
] | null | null | null | README.md | marciobastiani/marciobastiani.github.io | a33fdc3640ad9073d2d40e64362aa6768a1b48cc | [
"CC-BY-3.0"
] | null | null | null | # marciobastiani.github.io
My personal web site for showcase
| 20.333333 | 33 | 0.819672 | eng_Latn | 0.837857 |
f912c43a7cfd0d209a393b33c6065f0364ec118d | 14,116 | md | Markdown | articles/media-services/media-services-widevine-license-template-overview.md | Zwoch/azure-docs.de-de | c76b1dfefc2541ca6d661c9eaca428b42b34ebc8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/media-services/media-services-widevine-license-template-overview.md | Zwoch/azure-docs.de-de | c76b1dfefc2541ca6d661c9eaca428b42b34ebc8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/media-services/media-services-widevine-license-template-overview.md | Zwoch/azure-docs.de-de | c76b1dfefc2541ca6d661c9eaca428b42b34ebc8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "Widevine license template overview | Microsoft Do"
description: "This topic gives an overview of a Widevine license template, which is used to configure Widevine licenses."
author: juliako
manager: cfowler
editor:
services: media-services
documentationcenter:
ms.assetid: 0e6f1f05-7ed6-4ed6-82a0-0cc2182b075a
ms.service: media-services
ms.workload: media
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 06/29/2017
ms.author: juliako
ms.openlocfilehash: 85de5765975b0c55fafe9bb4c14a1c1f435a6d5c
ms.sourcegitcommit: 9292e15fc80cc9df3e62731bafdcb0bb98c256e1
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 01/10/2018
---
# <a name="widevine-license-template-overview"></a>Widevine license template overview
You can configure and request Widevine licenses with Azure Media Services. When a player tries to play your Widevine-protected content, a request to acquire a license is sent to the license delivery service. If the license service approves the request, it issues the license. The license is sent to the client and is used to decrypt and play the specified content.
A Widevine license request is formatted as a JSON message.
>[!NOTE]
> You can create an empty message with no values by simply using "{}". Then a license template with default settings is created. The defaults work in most cases. Microsoft-based license-delivery scenarios should always use the defaults. If you need to set the "provider" and "content_id" values, the provider must match the Widevine credentials.
{
“payload”:“<license challenge>”,
“content_id”: “<content id>”
“provider”: ”<provider>”
“allowed_track_types”:“<types>”,
“content_key_specs”:[
{
“track_type”:“<track type 1>”
},
{
“track_type”:“<track type 2>”
},
…
],
“policy_overrides”:{
“can_play”:<can play>,
“can persist”:<can persist>,
“can_renew”:<can renew>,
“rental_duration_seconds”:<rental duration>,
“playback_duration_seconds”:<playback duration>,
“license_duration_seconds”:<license duration>,
“renewal_recovery_duration_seconds”:<renewal recovery duration>,
“renewal_server_url”:”<renewal server url>”,
“renewal_delay_seconds”:<renewal delay>,
“renewal_retry_interval_seconds”:<renewal retry interval>,
“renew_with_usage”:<renew with usage>
}
}
## <a name="json-message"></a>JSON message
| Name | Value | Description |
| --- | --- | --- |
| payload |Base64-encoded string |The license request sent by a client. |
| content_id |Base64-encoded string |Identifier used to derive the key ID and content key for each content_key_specs.track_type. |
| provider |string |Used to look up content keys and policies. If Microsoft key delivery is used for Widevine license delivery, this parameter is ignored. |
| policy_name |string |Name of a previously registered policy. Optional. |
| allowed_track_types |enum |SD_ONLY or SD_HD. Controls which content keys are included in a license. |
| content_key_specs |Array of JSON structures, see the "Content key specs" section. |A finer-grained control on which content keys to return. For more information, see the "Content key specs" section. Only one of the allowed_track_types and content_key_specs values can be specified. |
| use_policy_overrides_exclusively |Boolean, true or false |Use policy attributes specified by policy_overrides, and omit all previously stored policy. |
| policy_overrides |JSON structure, see the "Policy overrides" section. |Policy settings for this license. In the event that this asset has a predefined policy, these specified values are used. |
| session_init |JSON structure, see the "Session initialization" section. |Optional data passed to the license. |
| parse_only |Boolean, true or false |The license request is parsed, but no license is issued. However, values from the license request are returned in the response. |
## <a name="content-key-specs"></a>Content key specs
If a pre-existing policy exists, there is no need to specify any of the values in the content key spec. The pre-existing policy associated with this content is used to determine the output protection, such as High-bandwidth Digital Content Protection (HDCP) and the Copy General Management System (CGMS). If a pre-existing policy isn't registered with the Widevine license server, the content provider can inject the values into the license request.
Each content_key_specs value must be specified for all tracks, regardless of the use_policy_overrides_exclusively option.
| Name | Value | Description |
| --- | --- | --- |
| content_key_specs. track_type |string |A track type name. If content_key_specs is specified in the license request, make sure to specify all track types explicitly. Failure to do so results in playback stopping after 10 seconds. |
| content_key_specs <br/> security_level |uint32 |Defines client robustness requirements for playback. <br/> - Software-based white-box cryptography is required. <br/> - Software cryptography and an obfuscated decoder are required. <br/> - The key material and cryptography operations must be performed within a hardware-backed trusted execution environment. <br/> - The cryptography and decoding of content must be performed within a hardware-backed trusted execution environment. <br/> - The cryptography, decoding, and all handling of the media (compressed and uncompressed) must be handled within a hardware-backed trusted execution environment. |
| content_key_specs <br/> required_output_protection.hdc |string, one of HDCP_NONE, HDCP_V1, HDCP_V2 |Indicates whether HDCP is required. |
| content_key_specs <br/>key |Base64-<br/>encoded string |Content key to use for this track. If specified, the track_type or key_id is required. The content provider can use this option to inject the content key for this track instead of letting the Widevine license server generate or look up a key. |
| content_key_specs.key_id |Base64-encoded binary, 16 bytes |Unique identifier for the key. |
## <a name="policy-overrides"></a>Policy overrides
| Name | Value | Description |
| --- | --- | --- |
| policy_overrides. can_play |Boolean, true or false |Indicates that playback of the content is allowed. Default is false. |
| policy_overrides. can_persist |Boolean, true or false |Indicates that the license can be persisted to nonvolatile storage for offline use. Default is false. |
| policy_overrides. can_renew |Boolean, true or false |Indicates that renewal of this license is allowed. If true, the duration of the license can be extended by heartbeat. Default is false. |
| policy_overrides. license_duration_seconds |int64 |Indicates the time window for this specific license. A value of 0 indicates that there is no limit to the duration. Default is 0. |
| policy_overrides. rental_duration_seconds |int64 |Indicates the time window while playback is permitted. A value of 0 indicates that there is no limit to the duration. Default is 0. |
| policy_overrides. playback_duration_seconds |int64 |The viewing window of time after playback starts within the license duration. A value of 0 indicates that there is no limit to the duration. Default is 0. |
| policy_overrides. renewal_server_url |string |All heartbeat (renewal) requests for this license are directed to the specified URL. This field is used only if can_renew is true. |
| policy_overrides. renewal_delay_seconds |int64 |The number of seconds after license_start_time before renewal is first attempted. This field is used only if can_renew is true. Default is 0. |
| policy_overrides. renewal_retry_interval_seconds |int64 |Specifies the delay in seconds between subsequent license renewal requests, in case of failure. This field is used only if can_renew is true. |
| policy_overrides. renewal_recovery_duration_seconds |int64 |The window of time in which playback can continue while renewal is attempted, yet unsuccessful due to back-end problems with the license server. A value of 0 indicates that there is no limit to the duration. This field is used only if can_renew is true. |
| policy_overrides. renew_with_usage |Boolean, true or false |Indicates that the license is sent for renewal when usage starts. This field is used only if can_renew is true. |
## <a name="session-initialization"></a>Session initialization
| Name | Value | Description |
| --- | --- | --- |
| provider_session_token |Base64-encoded string |This session token is passed back in the license and exists in subsequent renewals. The session token does not persist beyond sessions. |
| provider_client_token |Base64-encoded string |Client token to send back in the license response. If the license request contains a client token, this value is ignored. The client token persists beyond license sessions. |
| override_provider_client_token |Boolean, true or false |If false and the license request contains a client token, use the token from the request even if a client token was specified in this structure. If true, always use the token specified in this structure. |
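Putting the tables above together, a complete license message is plain JSON. The sketch below assembles and serializes one such message in plain Python; it is illustrative only (not tied to the Media Services SDK), and the field values are placeholders chosen to mirror the .NET example later in this topic:

```python
import json

# Illustrative only: build a Widevine license message using the fields
# described in the tables above. Values here are placeholders.
message = {
    "allowed_track_types": "SD_HD",
    "content_key_specs": [
        {
            "track_type": "SD",
            "security_level": 1,
            "required_output_protection": {"hdcp": "HDCP_NONE"},
        }
    ],
    "policy_overrides": {
        "can_play": True,
        "can_persist": True,
        "can_renew": False,
    },
}

serialized = json.dumps(message, indent=2)
print(serialized)
```

Note that only one of allowed_track_types and content_key_specs may be sent in a real request; both appear here solely to show their shapes side by side.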
## <a name="configure-your-widevine-licenses-by-using-net-types"></a>Configure your Widevine licenses by using .NET types
Media Services provides .NET APIs that you can use to configure your Widevine licenses.
### <a name="classes-as-defined-in-the-media-services-net-sdk"></a>Classes as defined in the Media Services .NET SDK
The following classes are the definitions of these types:
public class WidevineMessage
{
public WidevineMessage();
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public AllowedTrackTypes? allowed_track_types { get; set; }
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public ContentKeySpecs[] content_key_specs { get; set; }
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public object policy_overrides { get; set; }
}
[JsonConverter(typeof(StringEnumConverter))]
public enum AllowedTrackTypes
{
SD_ONLY = 0,
SD_HD = 1
}
public class ContentKeySpecs
{
public ContentKeySpecs();
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public string key_id { get; set; }
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public RequiredOutputProtection required_output_protection { get; set; }
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public int? security_level { get; set; }
[JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
public string track_type { get; set; }
}
public class RequiredOutputProtection
{
public RequiredOutputProtection();
public Hdcp hdcp { get; set; }
}
[JsonConverter(typeof(StringEnumConverter))]
public enum Hdcp
{
HDCP_NONE = 0,
HDCP_V1 = 1,
HDCP_V2 = 2
}
### <a name="example"></a>Example
The following example shows how to use .NET APIs to configure a simple Widevine license:
private static string ConfigureWidevineLicenseTemplate()
{
var template = new WidevineMessage
{
allowed_track_types = AllowedTrackTypes.SD_HD,
content_key_specs = new[]
{
new ContentKeySpecs
{
required_output_protection = new RequiredOutputProtection { hdcp = Hdcp.HDCP_NONE},
security_level = 1,
track_type = "SD"
}
},
policy_overrides = new
{
can_play = true,
can_persist = true,
can_renew = false
}
};
string configuration = JsonConvert.SerializeObject(template);
return configuration;
}
## <a name="media-services-learning-paths"></a>Media Services learning paths
[!INCLUDE [media-services-learning-paths-include](../../includes/media-services-learning-paths-include.md)]
## <a name="provide-feedback"></a>Provide feedback
[!INCLUDE [media-services-user-voice-include](../../includes/media-services-user-voice-include.md)]
## <a name="see-also"></a>See also
[Use dynamic common encryption with PlayReady and/or Widevine](media-services-protect-with-playready-widevine.md)
| 70.228856 | 754 | 0.758288 | deu_Latn | 0.972778 |
f912c476b68a694e6b2f32be140ec3c501fa948b | 620 | md | Markdown | specification/graphrbac/data-plane/readme.ruby.md | cthrash/azure-rest-api-specs | fd25e4af660446bdec0faa8e7e44e1f9ab3e9685 | [
"MIT"
] | 1,590 | 2015-08-04T16:49:39.000Z | 2022-03-31T23:52:35.000Z | specification/graphrbac/data-plane/readme.ruby.md | cthrash/azure-rest-api-specs | fd25e4af660446bdec0faa8e7e44e1f9ab3e9685 | [
"MIT"
] | 18,161 | 2015-08-04T21:50:56.000Z | 2022-03-31T23:59:18.000Z | specification/graphrbac/data-plane/readme.ruby.md | cthrash/azure-rest-api-specs | fd25e4af660446bdec0faa8e7e44e1f9ab3e9685 | [
"MIT"
] | 5,168 | 2015-08-04T18:43:48.000Z | 2022-03-31T20:28:55.000Z | ## Ruby
These settings apply only when `--ruby` is specified on the command line.
``` yaml
package-name: azure_graph_rbac
package-version: "0.16.0"
azure-arm: true
```
### Ruby multi-api
``` yaml $(ruby) && $(multiapi)
batch:
- tag: 1.6
```
### Tag: 1.6 and ruby
These settings apply only when `--tag=1.6 --ruby` is specified on the command line.
Please also specify `--ruby-sdks-folder=<path to the root directory of your azure-sdk-for-ruby clone>`.
``` yaml $(tag) == '1.6' && $(ruby)
namespace: "Azure::GraphRbac::V1_6"
output-folder: $(ruby-sdks-folder)/data/azure_graph_rbac/lib
title: GraphRbacClient
```
| 22.142857 | 103 | 0.680645 | eng_Latn | 0.964584 |
f9132c4204ccfc04d04dbfbee03be16ba6c03290 | 72 | md | Markdown | _o/dev/box/framework/dotphp/DelMaster/_source/xend/about/Doc/Summary.md | vae24co/bryao | d3420d384eaf98b475ee974f4462591f611a785c | [
"Apache-2.0"
] | null | null | null | _o/dev/box/framework/dotphp/DelMaster/_source/xend/about/Doc/Summary.md | vae24co/bryao | d3420d384eaf98b475ee974f4462591f611a785c | [
"Apache-2.0"
] | null | null | null | _o/dev/box/framework/dotphp/DelMaster/_source/xend/about/Doc/Summary.md | vae24co/bryao | d3420d384eaf98b475ee974f4462591f611a785c | [
"Apache-2.0"
] | null | null | null | # LAUNCHING THE APP
1. On launch, terminate session & redirect to login | 36 | 52 | 0.763889 | kor_Hang | 0.448231 |
f914555c6e954e13bf4ddb23ab965ede84ca3d3b | 176 | md | Markdown | .changeset/healthy-parents-happen.md | abdonrd/lion | c99a5dbb579011472e56040911f7947d430714e7 | [
"MIT"
] | null | null | null | .changeset/healthy-parents-happen.md | abdonrd/lion | c99a5dbb579011472e56040911f7947d430714e7 | [
"MIT"
] | null | null | null | .changeset/healthy-parents-happen.md | abdonrd/lion | c99a5dbb579011472e56040911f7947d430714e7 | [
"MIT"
] | null | null | null | ---
'@lion/core': patch
---
Allow SlotMixin to work with default slots using empty string as key (''). Ensure that we do not add slot attribute to the slottable in this case.
| 29.333333 | 146 | 0.721591 | eng_Latn | 0.998583 |
f914984e9c45663ed29aa02ef9402589ee006dfd | 9,483 | md | Markdown | articles/sentinel/watchlist-schemas.md | discentem/azure-docs | b1495f74a87004c34c5e8112e2b9f520ce94e290 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/sentinel/watchlist-schemas.md | discentem/azure-docs | b1495f74a87004c34c5e8112e2b9f520ce94e290 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-12-26T08:14:40.000Z | 2021-12-26T08:14:40.000Z | articles/sentinel/watchlist-schemas.md | discentem/azure-docs | b1495f74a87004c34c5e8112e2b9f520ce94e290 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Schemas for Microsoft Sentinel watchlist templates | Microsoft Docs
description: Learn about the schemas used in each built-in watchlist template in Microsoft Sentinel.
author: batamig
ms.author: bagol
ms.topic: reference
ms.custom: mvc, ignite-fall-2021
ms.date: 11/09/2021
---
# Microsoft Sentinel built-in watchlist template schemas (Public preview)
[!INCLUDE [Banner for top of topics](./includes/banner.md)]
This article details the schemas used in each built-in watchlist template provided by Microsoft Sentinel. For more information, see [Create a new watchlist using a template (Public preview)](watchlists.md#create-a-new-watchlist-using-a-template-public-preview).
> [!IMPORTANT]
> The Microsoft Sentinel watchlist templates are currently in PREVIEW. The [Azure Preview Supplemental Terms](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
>
## High Value Assets
The High Value Assets watchlist lists devices, resources, and other assets that have critical value in the organization, and includes the following fields:
| Field name | Format | Example | Mandatory/Optional |
| ---------- | ----------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------- | ------------------ |
| **Asset Type** | String | `Device`, `Azure resource`, `AWS resource`, `URL`, `SPO`, `File share`, `Other` | Mandatory |
| **Asset Id** | String, depending on asset type | `/subscriptions/d1d8779d-38d7-4f06-91db-9cbc8de0176f/resourceGroups/SOC-Purview/providers/Microsoft.Storage/storageAccounts/purviewadls` | Mandatory |
| **Asset Name** | String | `Microsoft.Storage/storageAccounts/purviewadls` | Optional |
| **Asset FQDN** | FQDN | `Finance-SRv.local.microsoft.com` | Mandatory |
| **IP Address** | IP | `1.1.1.1` | Optional |
| **Tags** | List | `["SAW user","Blue Ocean team"] ` | Optional |
| | | | |
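Before uploading a custom watchlist built against this schema, it can help to sanity-check the rows programmatically. The sketch below is a plain-Python illustration, not part of Microsoft Sentinel: the column names come from the table above, but the validation helper and sample data are assumptions for the example.

```python
import csv
import io

# Mandatory columns per the High Value Assets schema above.
MANDATORY = ("Asset Type", "Asset Id", "Asset FQDN")

def validate_rows(csv_text):
    """Return a list of (row_number, missing_fields) for invalid rows."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        missing = [f for f in MANDATORY if not (row.get(f) or "").strip()]
        if missing:
            problems.append((i, missing))
    return problems

sample = """Asset Type,Asset Id,Asset Name,Asset FQDN,IP Address,Tags
Device,dev-001,Finance server,Finance-SRv.local.microsoft.com,1.1.1.1,"[""SAW user""]"
URL,,Example site,,,
"""

# The second data row is missing its mandatory Asset Id and Asset FQDN.
print(validate_rows(sample))
```

The same pattern extends to the other watchlist templates by swapping in their mandatory columns.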
## VIP Users
The VIP Users watchlist lists user accounts of employees that have high impact value in the organization, and includes the following values:
| Field name | Format | Example | Mandatory/Optional |
| ------------------- | ------ | --------------------------------------------------- | ------------------ |
| **User Identifier** | UID | `52322ec8-6ebf-11eb-9439-0242ac130002` | Optional |
| **User AAD Object Id** | SID | `03fa4b4e-dc26-426f-87b7-98e0c9e2955e` | Optional |
| **User On-Prem Sid** | SID | `S-1-12-1-4141952679-1282074057-627758481-2916039507` | Optional |
| **User Principal Name** | UPN | `JeffL@seccxp.ninja` | Mandatory |
| **Tags** | List | `["SAW user","Blue Ocean team"]` | Optional |
| | | | |
## Network Mapping
The Network Mapping watchlist lists IP subnets and their respective organizational contexts, and includes the following fields:
| Field name | Format | Example | Mandatory/Optional |
| ---------- | ------------ | ---------------------------- | ------------------ |
| **IP Subnet** | Subnet range | `198.51.100.0/24 - 198….../22` | Mandatory |
| **Range Name** | String | `DMZ` | Optional |
| **Tags** | List | `["Example","Example"]` | Optional |
| | | | |
## Terminated Employees
The Terminated Employees watchlist lists user accounts of employees that have been, or are about to be, terminated, and includes the following fields:
| Field name | Format | Example | Mandatory/Optional |
| ------------------- | ------------------------------------------------------------------------------- | ------------------------------------ | ------------------ |
| **User Identifier** | UID | `52322ec8-6ebf-11eb-9439-0242ac130002` | Optional |
| **User AAD Object Id** | SID | `03fa4b4e-dc26-426f-87b7-98e0c9e2955e` | Optional |
| **User On-Prem Sid** | SID | `S-1-12-1-4141952679-1282074057-123` | Optional |
| **User Principal Name** | UPN | `JeffL@seccxp.ninja` | Mandatory |
| **UserState** | String <br><br>We recommend using either `Notified` or `Terminated` | `Terminated` | Mandatory |
| **Notification date** | Timestamp - day | `01.12.20` | Optional |
| **Termination date** | Timestamp - day | `01.01.21` | Mandatory |
| **Tags** | List | `["SAW user","Amba Wolfs team"]` | Optional |
| | | | |
## Identity Correlation
The Identity Correlation watchlist lists related user accounts that belong to the same person, and includes the following fields:
| Field name | Format | Example | Mandatory/Optional |
| -------------------------------- | ------- | --------------------------------------------------- | ------------------ |
| **User Identifier** | UID | `52322ec8-6ebf-11eb-9439-0242ac130002` | Optional |
| **User AAD Object Id** | SID | `03fa4b4e-dc26-426f-87b7-98e0c9e2955e` | Optional |
| **User On-Prem Sid** | SID | `S-1-12-1-4141952679-1282074057-627758481-2916039507` | Optional |
| **User Principal Name** | UPN | `JeffL@seccxp.ninja` | Mandatory |
| **Employee Id** | String | `8234123` | Optional |
| **Email** | Email | `JeffL@seccxp.ninja` | Optional |
| **Associated Privileged Account ID** | UID/SID | `S-1-12-1-4141952679-1282074057-627758481-2916039507` | Optional |
| **Associated Privileged Account** | UPN | `Admin@seccxp.ninja` | Optional |
| **Tags** | List | `["SAW user","Amba Wolfs team"]` | Optional |
| | | | |
## Service Accounts
The Service Accounts watchlist lists service accounts and their owners, and includes the following fields:
| Field name | Format | Example | Mandatory/Optional |
| ------------------------- | ------ | --------------------------------------------------- | ------------------ |
| **Service Identifier** | UID | `1111-112123-12312312-123123123` | Optional |
| **Service AAD Object Id** | SID | `11123-123123-123123-123123` | Optional |
| **Service On-Prem Sid** | SID | `S-1-12-1-3123123-123213123-12312312-2916039507` | Optional |
| **Service Principal Name** | UPN | `myserviceprin@contoso.com` | Mandatory |
| **Owner User Identifier** | UID | `52322ec8-6ebf-11eb-9439-0242ac130002` | Optional |
| **Owner User AAD Object Id** | SID | `03fa4b4e-dc26-426f-87b7-98e0c9e2955e` | Optional |
| **Owner User On-Prem Sid** | SID | `S-1-12-1-4141952679-1282074057-627758481-2916039507` | Optional |
| **Owner User Principal Name** | UPN | `JeffL@seccxp.ninja` | Mandatory |
| **Tags** | List | `["Automation Account","GitHub Account"]` | Optional |
| | | | |
## Next steps
For more information, see [Use Microsoft Sentinel watchlists](watchlists.md).
| 83.184211 | 324 | 0.429611 | eng_Latn | 0.622687 |
f9153edf751d045136afe821ce6348da734172ef | 30 | md | Markdown | README.md | roman-bytes/roman-bytes-dev | c0628a1c330fe0ff20ed9fd69eea8fd009230285 | [
"MIT"
] | null | null | null | README.md | roman-bytes/roman-bytes-dev | c0628a1c330fe0ff20ed9fd69eea8fd009230285 | [
"MIT"
] | 3 | 2021-09-21T17:23:37.000Z | 2022-02-27T12:24:40.000Z | README.md | roman-bytes/roman-bytes-dev | c0628a1c330fe0ff20ed9fd69eea8fd009230285 | [
"MIT"
] | null | null | null | # Roman Bytes
coming soon...
| 7.5 | 14 | 0.666667 | eng_Latn | 0.988006 |
f915548217a913e842ef1f2405d04ddcbeda3f4c | 2,100 | md | Markdown | OpenBullet2/Docs/lolicode/data_variable.md | Enigma422/OpenBullet2 | 7132ae93da54c129dc5e74bfb8068692ef562be8 | [
"MIT"
] | 565 | 2021-03-08T20:32:59.000Z | 2022-03-30T18:21:12.000Z | OpenBullet2/Docs/lolicode/data_variable.md | Enigma422/OpenBullet2 | 7132ae93da54c129dc5e74bfb8068692ef562be8 | [
"MIT"
] | 637 | 2021-03-08T21:29:58.000Z | 2022-03-31T16:59:04.000Z | OpenBullet2/Docs/lolicode/data_variable.md | Enigma422/OpenBullet2 | 7132ae93da54c129dc5e74bfb8068692ef562be8 | [
"MIT"
] | 268 | 2021-03-08T20:31:59.000Z | 2022-03-31T08:16:19.000Z | ### The data variable
This variable contains all data related to the current bot.
##### Useful properties
- `data.UseProxy` (`bool`) whether to use the proxy assigned to the bot
- `data.STATUS` (`string`) the current status of the bot
- `data.RAWSOURCE` (`byte[]`) the content of the last http response received
- `data.SOURCE` (`string`) same as above but as a string
- `data.ERROR` (`string`) contains the message of the last exception caught when using safe mode (in blocks that support it)
- `data.ADDRESS` (`string`) the absolute uri of the last http response (after redirection)
- `data.RESPONSECODE` (`int`) the status code of the last http response
- `data.COOKIES` (`Dictionary<string, string>`) the cookies sent or received so far (e.g. `data.COOKIES["PHPSESSID"]`)
- `data.HEADERS` (`Dictionary<string, string>`) the headers of the last http response (e.g. `data.HEADERS["Location"]`)
- `data.Objects` (`Dictionary<string, object>`) holds stateful objects for cross-block use (they will get disposed automatically at the end of the script)
- `data.MarkedForCapture` (`List<string>`) all the names of variables marked for capture
###### Line
- `data.Line.Data` (`string`) the whole (unsplit) data line assigned to the bot
- `data.Line.Retries` (`int`) the amount of times the data has been retried
###### Proxy
Note: `data.Proxy` is null if proxies are off, so always make a null check first
- `data.Proxy.Host` (`string`)
- `data.Proxy.Port` (`int`)
- `data.Proxy.Username` (`string`)
- `data.Proxy.Password` (`string`)
- `data.Proxy.Type` (`ProxyType`) can be `Http`/`Socks4`/`Socks5`/`Socks4a`
###### Logger
- `data.Logger.Enabled` (`bool`) enables or disables the logger (e.g. when there is too much data to print)
---
##### Useful methods
- `data.MarkForCapture(string varName)` adds the variable name to the `data.MarkedForCapture` list
- `data.Logger.Log(string message, string htmlColor, bool canViewAsHtml)` htmlColor must be e.g. `#fff` or `white`
- `data.Logger.Log(IEnumerable<string> enumerable, string htmlColor, bool canViewAsHtml)`
- `data.Logger.Clear()` clears the log | 60 | 154 | 0.720476 | eng_Latn | 0.909218 |
f91575bd664a97b753df26f36f8a322e2aeebd8c | 1,399 | md | Markdown | lessons/networks/private_test_network.md | vernondcole/learn-salt | b87e1396a29ee6ddb976288bbd0239a61cc9ce07 | [
"Apache-2.0"
] | 2 | 2017-11-02T22:54:48.000Z | 2018-05-04T08:11:09.000Z | lessons/networks/private_test_network.md | vernondcole/learn-salt | b87e1396a29ee6ddb976288bbd0239a61cc9ce07 | [
"Apache-2.0"
] | null | null | null | lessons/networks/private_test_network.md | vernondcole/learn-salt | b87e1396a29ee6ddb976288bbd0239a61cc9ce07 | [
"Apache-2.0"
] | 1 | 2017-12-12T21:40:00.000Z | 2017-12-12T21:40:00.000Z | ### How to set up a private test network
- Get a router of your very own.
The example files are set to operate on a small network at IP address 192.168.88.0/24.
A test network for this setup has been provided by an inexpensive MikroTik "hAP lite" router.
Almost any home router could probably be used, but the MikroTik operating software provides more
functionality for an experienced network technician. The router provides basic DHCP service,
including the possibility of reserving fixed addresses for the host, bevy master, and test bed computers.
- Locate a hard-wire feed from your host network.
In these days of wireless operation, we are often ignoring wire networks.
We really need a wire for this, so find out how to plug into your local Ethernet.
- Plug your private router's `Internet` or `WAN` port into your host network.
- Plug your workstation's wired port into a `LAN` port of your private router.
- Configure your router.
    * open the router's [web page at 192.168.88.1](http://192.168.88.1) for a MikroTik,
[or try 192.168.1.1](http://192.168.1.1) for most other small routers.
* alter your private network number so that it does not match the host network.
* reconnect to the router if you changed addresses.
* set up your private wireless SSID to a different name from your host's.
- Plug your lab computers into the private router (or wireless connect to the new SSID).
| 53.807692 | 105 | 0.765547 | eng_Latn | 0.999309 |
f915a51173349af25087bd266f836088b02c675b | 172 | md | Markdown | README.md | ExodusMovement/keccak | 506040b8ffeaf0b57df27774e024c297edcba6c9 | [
"MIT"
] | null | null | null | README.md | ExodusMovement/keccak | 506040b8ffeaf0b57df27774e024c297edcba6c9 | [
"MIT"
] | null | null | null | README.md | ExodusMovement/keccak | 506040b8ffeaf0b57df27774e024c297edcba6c9 | [
"MIT"
] | null | null | null | # keccak
<https://github.com/cryptocoinjs/keccak> fork without the native part.
## LICENSE
This library is free and open-source software released under the MIT license.
| 21.5 | 77 | 0.773256 | eng_Latn | 0.981848 |
f91622e2350b6b6a3a45e5afcb556bf792588620 | 3,351 | md | Markdown | _pages/publications.md | sohanseth/sohanseth.github.io.backup | 98158b0e7f1294f7ce5ab2cbcb9ee0d4e3420727 | [
"MIT"
] | 1 | 2021-03-05T10:16:07.000Z | 2021-03-05T10:16:07.000Z | _pages/publications.md | sohanseth/sohanseth.github.io.backup | 98158b0e7f1294f7ce5ab2cbcb9ee0d4e3420727 | [
"MIT"
] | null | null | null | _pages/publications.md | sohanseth/sohanseth.github.io.backup | 98158b0e7f1294f7ce5ab2cbcb9ee0d4e3420727 | [
"MIT"
] | null | null | null | ---
layout: archive
title: "Publications"
permalink: /publications/
author_profile: true
---
# 2020
- __Robust, reproducible clinical patterns in hospitalised patients with COVID-19__
_Jonathan Millar, Lucile Neyton, Sohan Seth, et al._
Under Review
[[medRxiv]](https://www.medrxiv.org/content/10.1101/2020.08.14.20168088v2)
- __Clinical characteristics of children and young people admitted to hospital with covid-19 in United Kingdom: prospective multicentre observational cohort study__
_Olivia Swann et al._
The BMJ
[[Paper]](https://www.bmj.com/content/370/bmj.m3249)
# 2019
- __scID: identification of equivalent transcriptional cell populations across single cell RNA-seq data using discriminant analysis__
_Katerina Boufea, Sohan Seth, Nizar N Batada_
iScience
[[Paper]](https://www.sciencedirect.com/science/article/pii/S2589004220300985)
- __Endoscopic sensing of distal lung physiology__
_Debaditya Choudhury et al._
Journal of Physics: Conference Series
[[Paper]](https://iopscience.iop.org/article/10.1088/1742-6596/1151/1/012009/pdf)
# 2018
- __Model Criticism in Latent Space__
_Sohan Seth, Iain Murray, Christopher K. I. Williams_
Bayesian Analysis
[[arXiv]](https://arxiv.org/abs/1711.04674) [[code]](https://github.com/sohanseth/mcls)
- __Estimating Bacterial and Cellular Load in FCFM Imaging__
_Sohan Seth, Ahsan K. Akram, Kevin Dhaliwal, Christopher K. I. Williams_
Journal of Imaging
[[paper]](https://www.mdpi.com/2313-433X/4/1/11) [[code]](https://github.com/sohanseth/bactload)
# 2017
- __Estimating Bacterial Load in FCFM Imaging__
_Sohan Seth, Ahsan K. Akram, Kevin Dhaliwal, Christopher K. I. Williams_
Annual Conference on Medical Image Understanding and Analysis
[[paper]](https://link.springer.com/chapter/10.1007/978-3-319-60964-5_79) [[code]](https://github.com/sohanseth/bactload)
- __Endoscopic Sensing of Alveolar pH__
_Debaditya Choudhury et al._
Biomedical Optics Express
[[paper]](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5231296/) [[code+data]](https://datashare.is.ed.ac.uk/handle/10283/2220)
# 2016
- __Assessing the Utility of Autofluorescence-based Pulmonary Optical Endomicroscopy to Predict the Malignant Potential of Solitary Pulmonary Nodules in Humans__
_Sohan Seth, Ahsan R Akram, Paul McCool, Jody Westerfeld, David Wilson, Stephen McLaughlin, Kevin Dhaliwal, Christopher K. I. Williams_
Nature Scientific Reports
[[paper]](https://www.ncbi.nlm.nih.gov/pubmed/27550539) [[code]](https://datashare.is.ed.ac.uk/handle/10283/869)
- __Archetypal Analysis for Nominal Observations__
_Sohan Seth, Manual J. A. Eugster_
IEEE transactions on Pattern Analysis and Machine Intelligence
[[paper]](https://ieeexplore.ieee.org/document/7214318/) [[code]](https://github.com/aalab/naa)
- __Modelling-Based Experiment Retrieval: A Case Study with Gene Expression Clustering__
_Paul Blomstedt, Ritabrata Dutta, Sohan Seth, Alvis Brazma, Samuel Kaski_
Bioinformatics
[[paper]](https://academic.oup.com/bioinformatics/article/32/9/1388/1744207)
- __Probabilistic Archetypal Analysis__
_Sohan Seth, Manuel J. A. Eugster_
Machine Learning
[[paper]](https://link.springer.com/article/10.1007/s10994-015-5498-8) [[code]](https://github.com/aalab/paa)
| 43.519481 | 166 | 0.748433 | kor_Hang | 0.317843 |
f916d5caab6941abb4665e67533a31dbd71116b1 | 2,060 | md | Markdown | README.md | sanj0/kopfkino | a0d782853e9cbf0dc02d0b22e92c29b8161162b3 | [
"Apache-2.0"
] | 3 | 2021-12-26T23:58:36.000Z | 2021-12-27T07:51:45.000Z | README.md | sanj0/kopfkino | a0d782853e9cbf0dc02d0b22e92c29b8161162b3 | [
"Apache-2.0"
] | 10 | 2021-10-03T15:03:46.000Z | 2022-01-17T20:33:05.000Z | README.md | sanj0/kopfkino | a0d782853e9cbf0dc02d0b22e92c29b8161162b3 | [
"Apache-2.0"
] | null | null | null | <div align="center">
<img src="https://github.com/sanj0/kopfkino/blob/main/logos/grayish/kopfkino_grayish_xs.png" alt="kopfkino" align="center">
</div>
# kopfkino
> _kopfkino_ (noun) processes, events that take place only or mainly in the imagination, in one's own power of imagination.
> (translated from https://www.duden.de/rechtschreibung/Kopfkino)
Kopfkino is a Java2D game library that benefits from 5+ years of development on
its predecessor, [Salty Engine](https://www.github.com/sanj0/salty-engine).
## Get started
For information on how to use the engnie, consider taking a look at the [github wiki](https://github.com/sanj0/kopfkino/wiki)
### 1. How to get Kopfkino Engine
Kopfkino engine is built using maven. After cloning the repository, simply build
the project using `mvn clean install`. Now include kopfkino in your own maven
project by appending the following dependency declaration to your pom.xml:
```xml
<dependencies>
<!--...-->
<dependency>
<groupId>io.github.sanj0</groupId>
<artifactId>kopfkino</artifactId>
<version>0.0.1-SNAPSHOT</version>
</dependency>
<!--...-->
</dependencies>
```
### 2. Start a simple kopfkino game
To get the bare minimum kopfkino running, consider the following code.
```java
import de.sanj0.kopfkino.Game;
import de.sanj0.kopfkino.scene.EmptyScene;
import java.awt.*;
public class Main {
public static void main(String[] args) {
        // initialise the game with a resolution of 1920x1080 and a nice name
// the latter is used for example in the window title
Game.init(1920, 1080, "hello, kopfkino!");
// Game is a singleton - Game.getInstance() retrieves the instance
// ... to for example set the background (frame clear) color
Game.getInstance().setBackgroundColor(Color.WHITE);
// start the game after a splash screen of 3 seconds
// into an empty scene,
// with 5 milliseconds between fixed ticks
// and 60 frames per second
Game.start(3000, new EmptyScene(), 5, 60);
}
}
```
| 34.333333 | 129 | 0.703883 | eng_Latn | 0.949963 |
f916f182b014d95c42372052879c7ae8f91063d1 | 504 | md | Markdown | _posts/2013-11-19-08aa916e.md | raidenii/raidenii.github.io | 89159035f7a4f9d13d52c7cf54f5a1c542ec0b10 | [
"MIT"
] | null | null | null | _posts/2013-11-19-08aa916e.md | raidenii/raidenii.github.io | 89159035f7a4f9d13d52c7cf54f5a1c542ec0b10 | [
"MIT"
] | null | null | null | _posts/2013-11-19-08aa916e.md | raidenii/raidenii.github.io | 89159035f7a4f9d13d52c7cf54f5a1c542ec0b10 | [
"MIT"
] | null | null | null | ---
layout: single
title: aMule + nginx reverse proxy
date: 2013-11-19 10:24:14.000000000 -05:00
tags:
- amule
- nginx
permalink: "/2013/11/1098"
---
Platform: Debian 7.2 + aMule 2.3.1 + nginx 1.2.1
```
location ^~ /amuleweb {
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_pass http://127.0.0.1:4711/;
proxy_redirect default;
}
```
In fact, aMule's web server is an incomplete HTTP/1.0 server (the HTTP headers come back empty when fetched with curl).
P.S.: aMule's memory leaks are quite severe; mldonkey is a better choice.
f9176be1a04395e99a50042aa1e33e93f17df59d | 1,785 | md | Markdown | README.md | CSC198/Autovis | 70dc335bcc6e4c15b37c8ce46df2ac0de0cbd0b1 | [
"MIT"
] | 1 | 2020-01-19T00:27:24.000Z | 2020-01-19T00:27:24.000Z | README.md | PS50-Spring2018/MasterRepo | 70dc335bcc6e4c15b37c8ce46df2ac0de0cbd0b1 | [
"MIT"
] | null | null | null | README.md | PS50-Spring2018/MasterRepo | 70dc335bcc6e4c15b37c8ce46df2ac0de0cbd0b1 | [
"MIT"
] | 4 | 2019-11-14T19:22:24.000Z | 2021-05-16T09:24:48.000Z |
# Project AutoVis: Automatic Visualization via Webcam
AutoVis is a tool for automating visualization of colored reactions.
## Usage
Two computers are needed, one for running the experiment (i.e. taking images using a built-in camera or an external webcam) and one for running the communication manager that detects the accumulation of image data and initiates the data analysis.
### Computer 1: For taking images on the webcam, execute:
`$ python MainExperiment.py`
The user will input the path to the Dropbox directory, reaction ID, duration of webcam, and image-taking frequency.
### Computer 2: For initiating the communication manager, execute:
`$ python MainAnalysis.py`
The user will input the path to the Dropbox directory and reaction ID.
## Installation
To install Autovis, the user needs to install the dependencies listed below.
- For `opencv-python`, we recommend installing via pip: `pip install opencv-python`. For installation on Windows, we recommend following the steps [here](https://opencv-python-tutroals.readthedocs.io/en/latest/py_tutorials/py_setup/py_table_of_contents_setup/py_table_of_contents_setup.html).
- For `numpy` and `matplotlib`, we recommend installing via pip: `python -m pip install --user numpy matplotlib`. For more detailed instructions, refer to this set of [instructions](https://www.scipy.org/install.html).
- For `seaborn`, we recommend installing via pip: `pip install seaborn`.
Coming soon: we will work on making Autovis pip-installable.
## Dependencies and Versions Used
- python 3.5
- opencv-python 4.1.0.25
- numpy 1.16.4
- matplotlib 2.2.2
- seaborn 0.8.1
## Key functionalities

## Authors
AutoVis was written by Physical Sciences 50 (Spring 2018) at Harvard University.
| 36.428571 | 292 | 0.77535 | eng_Latn | 0.963978 |
f9178b6f61b14f94c628cdb9e01cde83deb311ae | 28 | md | Markdown | README.md | FrankJWH/Nviron | a9b658572bd8e9a11bad21e18b2eb15193d55d21 | [
"Apache-2.0"
] | null | null | null | README.md | FrankJWH/Nviron | a9b658572bd8e9a11bad21e18b2eb15193d55d21 | [
"Apache-2.0"
] | null | null | null | README.md | FrankJWH/Nviron | a9b658572bd8e9a11bad21e18b2eb15193d55d21 | [
"Apache-2.0"
] | null | null | null | # Nviron
Nviron Game Engine
| 9.333333 | 18 | 0.785714 | vie_Latn | 0.289312 |
f917f3ef922cfe1492b2a9f3cbe4643a67eb45bb | 1,208 | md | Markdown | _posts/2011-01-31-warning-signals-indicators-vs-likelihood-continued.md | kubke/labnotebook | ebada896118f21a8d63e9b584e2d1424a5d3a6bd | [
"CC0-1.0"
] | 1 | 2019-06-27T11:33:45.000Z | 2019-06-27T11:33:45.000Z | _posts/2011-01-31-warning-signals-indicators-vs-likelihood-continued.md | kubke/labnotebook | ebada896118f21a8d63e9b584e2d1424a5d3a6bd | [
"CC0-1.0"
] | null | null | null | _posts/2011-01-31-warning-signals-indicators-vs-likelihood-continued.md | kubke/labnotebook | ebada896118f21a8d63e9b584e2d1424a5d3a6bd | [
"CC0-1.0"
] | null | null | null | ---
comments: true
date: 2011-01-31 16:49:22
layout: post
slug: warning-signals-indicators-vs-likelihood-continued
title: 'Warning signals: Indicators vs Likelihood continued'
redirects: [/wordpress/archives/898, /archives/898]
categories:
- ecology
tags:
- warning-signals
---
Continuing to explore examples
Some difficulty in the "snowfall" parallel computation engine, removing the ~/.sfCluster directory seems to reset the error "unable to unserialize nodes..." Also don't have good handling of variable names for tracking the boostrapping of parameters. For the moment have just added a toggle to suppress these to avoid dim mismatch in rownames command. Strange that this happens to the indicator_vs_likelihood code but not to the nearly-identical call to montecarlotest() in lin_bifur_models.R.
Various simulation/analyses runs from today:
[flickr-gallery mode="search" tags="warningsignals" min_upload_date="2011-01-31 00:00:00" max_upload_date="2011-02-01 8:59:59"]
trying still-harder-to-detect change (m=-.01)
Running right now on zero:
* at nice 15: indicator_vs_likelihood.R
* at nice 19: indicator_vs_likelihood.R (with more reps?)
* at 18: LTC.R
* at 17: LSN.R
| 30.974359 | 495 | 0.767384 | eng_Latn | 0.935286 |
f9183228bc72c08d3c0bb26dd6fb6f26e902d7ff | 203 | md | Markdown | content/lua/client/questreadbook.md | xackery/eqquestapi | e3bb4d58651c7c2bb1ced94deb59115946eed3c5 | [
"MIT"
] | null | null | null | content/lua/client/questreadbook.md | xackery/eqquestapi | e3bb4d58651c7c2bb1ced94deb59115946eed3c5 | [
"MIT"
] | 1 | 2020-09-08T17:21:08.000Z | 2020-09-08T17:21:08.000Z | content/lua/client/questreadbook.md | xackery/eqquestapi | e3bb4d58651c7c2bb1ced94deb59115946eed3c5 | [
"MIT"
] | 1 | 2020-08-29T00:49:26.000Z | 2020-08-29T00:49:26.000Z | ---
title: QuestReadBook
searchTitle: Lua Client QuestReadBook
weight: 1
hidden: true
menuTitle: QuestReadBook
---
## QuestReadBook
```lua
Client:QuestReadBook(const char *text, number type); -- void
``` | 18.454545 | 60 | 0.748768 | yue_Hant | 0.645234 |
f9183e8b5f742ad814e3c5d3b25ad155101bfd8a | 1,339 | md | Markdown | README.md | margaret85/iCanHazAdventure | 638ea500545784f74d411c06b60434c22bda0441 | [
"MIT"
] | 1 | 2020-05-23T20:09:53.000Z | 2020-05-23T20:09:53.000Z | README.md | codemargaret/iCanHazAdventure | 638ea500545784f74d411c06b60434c22bda0441 | [
"MIT"
] | null | null | null | README.md | codemargaret/iCanHazAdventure | 638ea500545784f74d411c06b60434c22bda0441 | [
"MIT"
] | null | null | null | # I Can Haz Adventure
#### _A choose-your-own-adventure story where you are a cat, 10.31.2017_
#### By _**Margaret Berry and Keegan Ruebling**_
## Description
_This project was generated with [Angular CLI](https://github.com/angular/angular-cli) version 1.0.0._
## Project Goals
* Practice generating an app using Angular CLI.
* Practice routing.
* Practice creating models and writing services.
## Setup/Installation Requirements
_Run the following commands in Terminal:_
1. `$ git clone` [this repository](https://github.com/codemargaret/iCanHazAdventure.git)
2. `$ cd iCanHazAdventure`
3. `$ npm install`
4. `$ bower install`
5. `$ ng serve`
6. _Navigate to localhost:4200_
## Known Bugs
_Character traits do not show on character page._
## Future Features
_Add more story pages._
## Support and contact details
_If you have issues, questions, ideas, or concerns, please contact [Margaret](codeberry1@gmail.com). Feel free to make a contribution to the code._
To get more help on the Angular CLI use `ng help` or go check out the [Angular CLI README](https://github.com/angular/angular-cli/blob/master/README.md).
## Technologies Used
* _JavaScript_
* _TypeScript_
* _Node_
* _Bower_
* _Angular CLI_
### License
*This software is licensed under the MIT license.*
Copyright (c) 2017 **_Margaret Berry and Keegan Ruebling_**
| 28.489362 | 153 | 0.751307 | eng_Latn | 0.919717 |
f9186cee13687148757ee1b8415dc02617ff24d4 | 177 | md | Markdown | features/Generators.md | yagihiro/action-cable-testing | f7a1f7bad18b65abc6403bed5451c4a583dbfb85 | [
"MIT"
] | 214 | 2017-10-24T07:54:36.000Z | 2022-03-22T23:56:34.000Z | features/Generators.md | yagihiro/action-cable-testing | f7a1f7bad18b65abc6403bed5451c4a583dbfb85 | [
"MIT"
] | 53 | 2017-11-03T07:55:08.000Z | 2021-03-25T15:12:13.000Z | features/Generators.md | yagihiro/action-cable-testing | f7a1f7bad18b65abc6403bed5451c4a583dbfb85 | [
"MIT"
] | 22 | 2017-10-24T13:32:50.000Z | 2021-06-16T19:14:28.000Z | This gem provides RSpec generators for channels specs. For example:
rails generate rspec:channel chat
will create a new spec file in `spec/channels/chat_channel_spec.rb`.
| 29.5 | 68 | 0.79096 | eng_Latn | 0.983871 |
f9190472f07c3cb815902e17ba3ac0c0ccf7b721 | 71 | md | Markdown | README.md | cmb69/ext-fiber | 88739c7b489a538988ee492e364bef659e22c66b | [
"PHP-3.01"
] | 1 | 2020-12-10T09:05:27.000Z | 2020-12-10T09:05:27.000Z | README.md | cmb69/ext-fiber | 88739c7b489a538988ee492e364bef659e22c66b | [
"PHP-3.01"
] | null | null | null | README.md | cmb69/ext-fiber | 88739c7b489a538988ee492e364bef659e22c66b | [
"PHP-3.01"
] | 1 | 2019-09-30T06:29:35.000Z | 2019-09-30T06:29:35.000Z | # Fiber Extension
Fiber implementation for PHP using native C fibers.
| 17.75 | 51 | 0.802817 | eng_Latn | 0.856668 |
f9195f0496a40100d22458e077d928776c0feb13 | 3,536 | md | Markdown | docs/ado/guide/remote-data-service/step-5-datacontrol-is-made-usable-rds-tutorial.md | Jteve-Sobs/sql-docs.de-de | 9843b0999bfa4b85e0254ae61e2e4ada1d231141 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/ado/guide/remote-data-service/step-5-datacontrol-is-made-usable-rds-tutorial.md | Jteve-Sobs/sql-docs.de-de | 9843b0999bfa4b85e0254ae61e2e4ada1d231141 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/ado/guide/remote-data-service/step-5-datacontrol-is-made-usable-rds-tutorial.md | Jteve-Sobs/sql-docs.de-de | 9843b0999bfa4b85e0254ae61e2e4ada1d231141 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
description: 'Step 5: DataControl is made usable (RDS tutorial)'
title: 'Step 5: DataControl is made usable (RDS tutorial) | Microsoft Docs'
ms.prod: sql
ms.prod_service: connectivity
ms.technology: connectivity
ms.custom: ''
ms.date: 11/09/2018
ms.reviewer: ''
ms.topic: conceptual
helpviewer_keywords:
- RDS tutorial [ADO], datacontrol made usable
ms.assetid: ed5c4a24-9804-4c85-817e-317652acb9b4
author: rothja
ms.author: jroth
ms.openlocfilehash: 18365d26c9b46fb651d68291dc5fa026f23e3bfb
ms.sourcegitcommit: e700497f962e4c2274df16d9e651059b42ff1a10
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 08/17/2020
ms.locfileid: "88451912"
---
# <a name="step-5-datacontrol-is-made-usable-rds-tutorial"></a>Step 5: DataControl is made usable (RDS tutorial)
The returned **Recordset** object is available for use. You can examine, navigate, or manipulate it like any other **Recordset**. What you can do with the **Recordset** depends on your environment. Visual Basic and Visual C++ have visual controls that can use a **Recordset** directly, or indirectly with the support of an enabling data control.
> [!IMPORTANT]
> Beginning with Windows 8 and Windows Server 2012, RDS server components are no longer included in the Windows operating system (see the Windows 8 and [Windows Server 2012 Compatibility Cookbook](https://www.microsoft.com/download/details.aspx?id=27416) for more details). RDS client components will be removed in a future version of Windows. Avoid using this feature in new development work, and plan to modify applications that currently use it. Applications that use RDS should migrate to [WCF Data Service](https://go.microsoft.com/fwlink/?LinkId=199565).
Wenn Sie z. b. eine Webseite in Microsoft Internet Explorer anzeigen, möchten Sie möglicherweise die Daten des **Recordset** -Objekts in einem visuellen Steuerelement anzeigen. Visuelle Steuerelemente auf einer Webseite können nicht direkt auf ein **Recordset** -Objekt zugreifen. Allerdings können Sie über das RDS auf das **Recordset** -Objekt zugreifen [. DataControl](../../../ado/reference/rds-api/datacontrol-object-rds.md). Das **RDS. DataControl** kann von einem visuellen Steuerelement verwendet werden, wenn die [SourceRecordset](../../../ado/reference/rds-api/recordset-sourcerecordset-properties-rds.md) -Eigenschaft auf das **Recordset** -Objekt festgelegt ist.
Für das visuelle Steuerelement Objekt muss sein **dataSrc** -Parameter auf RDS festgelegt sein **. DataControl**und die zugehörige **Datafld** -Eigenschaft sind auf ein **Recordset** -Objektfeld (Spalte) festgelegt.
Legen Sie in diesem Tutorial die **SourceRecordset** -Eigenschaft fest:
```vb
Sub RDSTutorial5()
Dim DS as New RDS.DataSpace
Dim RS as ADODB.Recordset
Dim DC as New RDS.DataControl
Dim DF as Object
Set DF = DS.CreateObject("RDSServer.DataFactory", "https://yourServer")
Set RS = DF.Query ("DSN=Pubs", "SELECT * FROM Authors")
DC.SourceRecordset = RS ' Visual controls can now bind to DC.
...
```
## <a name="see-also"></a>See also
 [Step 6: Changes are sent to the server (RDS tutorial)](../../../ado/guide/remote-data-service/step-6-changes-are-sent-to-the-server-rds-tutorial.md)
 [RDS tutorial (VBScript)](../../../ado/guide/remote-data-service/rds-tutorial-vbscript.md)
| 70.72 | 677 | 0.764706 | deu_Latn | 0.91741 |
f91976d3330508a47d5a275e6a810ff74758e52a | 2,446 | md | Markdown | docs/docs/glossary/Add-VSIoTAnalyticsDatasetIotEventsDestinationConfiguration.md | sheldonhull/VaporShell | e6a29672ce84b461e4f8d6058a52b83cbfaf3c5c | [
"Apache-2.0"
] | 35 | 2017-08-22T23:16:27.000Z | 2020-02-13T18:26:47.000Z | docs/docs/glossary/Add-VSIoTAnalyticsDatasetIotEventsDestinationConfiguration.md | sheldonhull/VaporShell | e6a29672ce84b461e4f8d6058a52b83cbfaf3c5c | [
"Apache-2.0"
] | 31 | 2017-08-29T03:27:32.000Z | 2020-03-04T22:02:20.000Z | docs/docs/glossary/Add-VSIoTAnalyticsDatasetIotEventsDestinationConfiguration.md | sheldonhull/VaporShell | e6a29672ce84b461e4f8d6058a52b83cbfaf3c5c | [
"Apache-2.0"
] | 6 | 2020-04-21T18:29:31.000Z | 2021-12-24T11:01:08.000Z | # Add-VSIoTAnalyticsDatasetIotEventsDestinationConfiguration
## SYNOPSIS
Adds an AWS::IoTAnalytics::Dataset.IotEventsDestinationConfiguration resource property to the template.
Configuration information for delivery of dataset contents to AWS IoT Events.
## SYNTAX
```
Add-VSIoTAnalyticsDatasetIotEventsDestinationConfiguration [-InputName] <Object> [-RoleArn] <Object>
[<CommonParameters>]
```
## DESCRIPTION
Adds an AWS::IoTAnalytics::Dataset.IotEventsDestinationConfiguration resource property to the template.
Configuration information for delivery of dataset contents to AWS IoT Events.
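A usage sketch (the parameter values below are placeholders, and this assumes the VaporShell module is imported):

```powershell
# Build the IotEventsDestinationConfiguration property object for a dataset.
$iotEventsDestination = Add-VSIoTAnalyticsDatasetIotEventsDestinationConfiguration `
    -InputName 'MyIotEventsInput' `
    -RoleArn 'arn:aws:iam::123456789012:role/MyIoTAnalyticsDeliveryRole'
```

The returned object can then be supplied wherever an `IotEventsDestinationConfiguration` value is expected while composing the `AWS::IoTAnalytics::Dataset` resource.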
## PARAMETERS
### -InputName
The name of the AWS IoT Events input to which dataset contents are delivered.
Documentation: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-iotanalytics-dataset-ioteventsdestinationconfiguration.html#cfn-iotanalytics-dataset-ioteventsdestinationconfiguration-inputname
PrimitiveType: String
UpdateType: Mutable
```yaml
Type: Object
Parameter Sets: (All)
Aliases:
Required: True
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -RoleArn
The ARN of the role that grants AWS IoT Analytics permission to deliver dataset contents to an AWS IoT Events input.
Documentation: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-iotanalytics-dataset-ioteventsdestinationconfiguration.html#cfn-iotanalytics-dataset-ioteventsdestinationconfiguration-rolearn
PrimitiveType: String
UpdateType: Mutable
```yaml
Type: Object
Parameter Sets: (All)
Aliases:
Required: True
Position: 2
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
### Vaporshell.Resource.IoTAnalytics.Dataset.IotEventsDestinationConfiguration
## NOTES
## RELATED LINKS
[http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-iotanalytics-dataset-ioteventsdestinationconfiguration.html](http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-iotanalytics-dataset-ioteventsdestinationconfiguration.html)
| 33.972222 | 315 | 0.822567 | yue_Hant | 0.777327 |
f919c0bae073d3a7752764773f7d4ea5ddd47bff | 3,365 | md | Markdown | README.md | lavsharmaa/tensorflow-asl-voice | 14844a19503b3bab8b5a955c5ab65ba4bc6eeec5 | [
"MIT"
] | null | null | null | README.md | lavsharmaa/tensorflow-asl-voice | 14844a19503b3bab8b5a955c5ab65ba4bc6eeec5 | [
"MIT"
] | null | null | null | README.md | lavsharmaa/tensorflow-asl-voice | 14844a19503b3bab8b5a955c5ab65ba4bc6eeec5 | [
"MIT"
] | null | null | null | <p align="center">
<h2 align="center">American Sign Language Detection Model with voice feedback using SSD_Mobilenet trained on Google Colab</h2>
</p>
<p align="center">
<img src="https://img.shields.io/badge/Python-3.8.5-lightgrey?style=for-the-badge" alt="repo language">
<img src="https://img.shields.io/badge/Java-Android-lightgrey?style=for-the-badge" alt="repo language">
`Sign_Language_detection_SSD_Mobilenet_Colab_TFLITE.ipynb`
- Model Used - ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8
- DL Framework Used - TensorFlow Version 2
- DataSet Used - [ASL By David Lee](https://app.roboflow.com/dataset/american-sign-language-letters-14kx4/)
- Platform - Google Colab Using GPU
Download the APK from [Google Drive](https://drive.google.com/file/d/1C2IDptaM8lSLHeJG_E4LT8Im0ZukaZFa/view)
- First of all, I would like to thank codePerfectPlus for such a wonderful blog and code. You can find the links below.
- This repo is just an extension of the original repo by codePerfectPlus.
- Link to the original [repo](https://github.com/codePerfectPlus/ASL)
- Link to the blog by codePerfectPlus on [Real-time sign language detection android application using TensorFlow lite](https://dev.to/codeperfectplus/real-time-sign-language-detection-android-application-using-tensorflow-lite-2d9g)
## Addition to the original [repo](https://github.com/codePerfectPlus/ASL)
- I have just implemented a voice feedback feature for the alphabet recognized by the app.
- In the bottom sheet you can find the Text-To-Speech button; in front of it, an ImageView with a music-note logo is implemented.
- When you perform a sign in front of the camera and click on that icon, it gives you voice feedback.
## Implementation method
- First, I logged the output from `DetectorActivity.java`:
```
Log.i("Recognitionss", String.valueOf(results.get(0).getTitle()));
// passing the detected alphabet to xml
DetectorActivity.super.toShowOutput.setText(String.valueOf(results.get(0).getTitle()));
```
- Passing the result to the xml file
```
// define this in CameraActivity.java for storing the result
protected TextView toShowOutput;
```
- Using onClickListener for the ```mp3``` audio to be played when a particular alphabet is encountered
CameraActivity.java
```
@Override
public void onClick(View v) {
// some code
// voice for the detected alphabet
else if (v.getId() == R.id.speak1) {
// this is the value we are showing
EditText mEdit = findViewById(R.id.editText);
String letter = mEdit.getText().toString();
if (letter.equalsIgnoreCase("A")) {
final MediaPlayer mp100 = MediaPlayer.create(this, R.raw.letter_a);
mp100.start();
}
else if (letter.equalsIgnoreCase("B")) {
final MediaPlayer mp100 = MediaPlayer.create(this, R.raw.letter_b);
mp100.start();
}
// similarly you can implement for the other alphabet
```
- Not the ideal way to implement this, but since `TTS` (`Text-To-Speech`) was causing me some issues, I implemented the feedback in a different manner. TTS would be the best approach; otherwise you need to define a sound for every alphabet or word inside your `raw` folder in Android.
- You can raise an issue if you face any problem with the code.
- Thanks to [David Lee](https://www.linkedin.com/in/daviddaeshinlee/) for amazing DataSet.
| 51.769231 | 296 | 0.745022 | eng_Latn | 0.838977 |
f91a7b34c7277dc1240223aeb4a8a555380c7aa2 | 8,051 | md | Markdown | articles/app-service/faq-deployment.md | Jontii/azure-docs.sv-se | d2551c12e17b442dc0b577205d034dcd6c73cff9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/app-service/faq-deployment.md | Jontii/azure-docs.sv-se | d2551c12e17b442dc0b577205d034dcd6c73cff9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/app-service/faq-deployment.md | Jontii/azure-docs.sv-se | d2551c12e17b442dc0b577205d034dcd6c73cff9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Deployment FAQs - Azure App Service | Microsoft Docs
description: Get answers to frequently asked questions about deployment for the Web Apps feature of Azure App Service.
author: genlin
manager: dcscontentpm
tags: top-support-issue
ms.assetid: 2fa5ee6b-51a6-4237-805f-518e6c57d11b
ms.topic: article
ms.date: 11/01/2018
ms.author: genli
ms.custom: seodec18
ms.openlocfilehash: 163a6940e50d1f8beacc23855fd1e6f9daad0085
ms.sourcegitcommit: 829d951d5c90442a38012daaf77e86046018e5b9
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 10/09/2020
ms.locfileid: "88080481"
---
# <a name="deployment-faqs-for-web-apps-in-azure"></a>Deployment FAQs for Web Apps in Azure
This article provides answers to frequently asked questions (FAQs) about deployment issues for the [Web Apps feature of Azure App Service](https://azure.microsoft.com/services/app-service/web/).
[!INCLUDE [support-disclaimer](../../includes/support-disclaimer.md)]
## <a name="i-am-just-getting-started-with-app-service-web-apps-how-do-i-publish-my-code"></a>I am just getting started with App Service web apps. How do I publish my code?
Here are some options for publishing your web application code:
* Deploy by using Visual Studio. If you have the Visual Studio solution, right-click the web application project, and then select **Publish**.
* Deploy by using an FTP client. In the Azure portal, download the publish profile for the web app that you want to deploy your code to. Then, upload the files to \site\wwwroot by using the FTP credentials from the same publish profile.
For more information, see [Deploy your app to App Service](deploy-local-git.md).
## <a name="i-see-an-error-message-when-i-try-to-deploy-from-visual-studio-how-do-i-resolve-this-error"></a>I see an error message when I try to deploy from Visual Studio. How do I resolve this error?
If you see the following message, you might be using an older version of the SDK: "Error during deployment for resource 'YourResourceName' in resource group 'YourResourceGroup': MissingRegistrationForLocation: The subscription is not registered for the resource type 'components' in the location 'Central US'. Please re-register for this provider in order to have access to this location."
To resolve the error, upgrade to the [latest SDK](https://azure.microsoft.com/downloads/). If you see this message and you already have the latest SDK, submit a support request.
## <a name="how-do-i-deploy-an-aspnet-application-from-visual-studio-to-app-service"></a>How do I deploy an ASP.NET application from Visual Studio to App Service?
<a id="deployasp"></a>
The tutorial [Create your first ASP.NET web app in Azure in five minutes](quickstart-dotnetcore.md) shows you how to deploy an ASP.NET web application to a web app in App Service by using Visual Studio.
## <a name="what-are-the-different-types-of-deployment-credentials"></a>What are the different types of deployment credentials?
App Service supports two types of credentials for local Git deployment and FTP/S deployment. For more information about how to configure deployment credentials, see [Configure deployment credentials for App Service](deploy-configure-credentials.md).
## <a name="what-is-the-file-or-directory-structure-of-my-app-service-web-app"></a>What is the file or directory structure of my App Service web app?
For information about the file structure of your App Service app, see [File structure on Azure](https://github.com/projectkudu/kudu/wiki/File-structure-on-azure).
## <a name="how-do-i-resolve-ftp-error-550---there-is-not-enough-space-on-the-disk-when-i-try-to-ftp-my-files"></a>How do I resolve "FTP Error 550 - There is not enough space on the disk" when I try to FTP my files?
If you see this message, you are most likely hitting a disk quota in the service plan for your web app. You might need to scale up to a higher service tier based on your disk space needs. For more information about pricing plans and resource limits, see [App Service pricing](https://azure.microsoft.com/pricing/details/app-service/).
## <a name="how-do-i-set-up-continuous-deployment-for-my-app-service-web-app"></a>How do I set up continuous deployment for my App Service web app?
You can set up continuous deployment from several resources, including Azure DevOps, OneDrive, GitHub, Bitbucket, Dropbox, and other Git repositories. These options are available in the portal. [Continuous deployment to App Service](deploy-continuous-deployment.md) is a helpful tutorial that explains how to set it up.
## <a name="how-do-i-troubleshoot-issues-with-continuous-deployment-from-github-and-bitbucket"></a>How do I troubleshoot issues with continuous deployment from GitHub and Bitbucket?
For help with investigating issues with continuous deployment from GitHub or Bitbucket, see [Investigating continuous deployment](https://github.com/projectkudu/kudu/wiki/Investigating-continuous-deployment).
## <a name="i-cant-ftp-to-my-site-and-publish-my-code-how-do-i-resolve-this-issue"></a>I can't FTP to my site and publish my code. How do I resolve this issue?
To resolve FTP issues:
1. Verify that you're entering the correct host name and credentials. For detailed information about the different types of credentials and how to use them, see [Deployment credentials](https://github.com/projectkudu/kudu/wiki/Deployment-credentials).
2. Verify that the FTP ports are not blocked by a firewall. The ports should have these settings:
   * FTP control connection port: 21
   * FTP data connection port: 989, 10001-10300
## <a name="how-do-i-publish-my-code-to-app-service"></a>How do I publish my code to App Service?
The Azure quickstart is designed to help you deploy your app by using the deployment stack and method of your choice. To use the Azure portal quickstart, go to your App Service app and, under **Deployment**, select **Quickstart**.
## <a name="why-does-my-app-sometimes-restart-after-deployment-to-app-service"></a>Why does my app sometimes restart after deployment to App Service?
To learn about the circumstances under which an application deployment can result in a restart, see [Deployment vs. runtime issues](https://github.com/projectkudu/kudu/wiki/Deployment-vs-runtime-issues#deployments-and-web-app-restarts). As described there, App Service deploys files to the wwwroot folder, and the app itself is never restarted directly.
## <a name="how-do-i-integrate-azure-devops-code-with-app-service"></a>How do I integrate Azure DevOps code with App Service?
You have two options for using continuous deployment with Azure DevOps:
* Use a Git project. Connect it through App Service by using Deployment Center.
* Use a Team Foundation Version Control (TFVC) project. Deploy by using the App Service build agent.
Continuous code deployment for both of these options depends on your existing developer workflows and check-in procedures. For more information, see these articles:
* [Implement continuous deployment of your app to an Azure website](https://www.visualstudio.com/docs/release/examples/azure/azure-web-apps-from-build-and-release-hubs)
* [Set up an Azure DevOps organization so it can deploy to a web app](https://github.com/projectkudu/kudu/wiki/Setting-up-a-VSTS-account-so-it-can-deploy-to-a-Web-App)
## <a name="how-do-i-use-ftp-or-ftps-to-deploy-my-app-to-app-service"></a>Hur gör jag för att använda FTP eller FTPS för att distribuera appen till App Service?
Information om hur du använder FTP eller FTPS för att distribuera webbappen till App Service finns i [distribuera din app för att app service med FTP/S](deploy-ftp.md).
| 83 | 370 | 0.794187 | swe_Latn | 0.994893 |
f91aef284da10ccb1e8c5e7489fdd69b4ee0a141 | 4,096 | md | Markdown | README.md | lockdownstudio/jsfour-idcard | c3070a365aa6c37dce6694b1d704659226e4fb41 | [
"OML"
] | 32 | 2018-10-17T04:56:05.000Z | 2021-11-08T02:21:14.000Z | README.md | lockdownstudio/jsfour-idcard | c3070a365aa6c37dce6694b1d704659226e4fb41 | [
"OML"
] | 51 | 2018-10-16T15:34:41.000Z | 2020-09-05T17:23:27.000Z | README.md | lockdownstudio/jsfour-idcard | c3070a365aa6c37dce6694b1d704659226e4fb41 | [
"OML"
] | 63 | 2018-10-16T14:38:37.000Z | 2022-03-29T17:37:51.000Z | # jsfour-idcard
This is an updated version of my <a href="https://github.com/jonassvensson4/jsfour-legitimation">jsfour-legitimation<a/>. It has and ID card, firearms license and a driver license
## LICENSE
Please don't sell or reupload this resource
## INSTALLATION
Drag and drop.
You also need to have <a href="https://github.com/ESX-Org/es_extended">es_extended</a> and <a href="https://github.com/ESX-Org/esx_license">esx_license</a> installed.
You need to add a couple rows of code depending on how you want to use the ID. Please check the **Usage** down below.
## SCREENSHOTS



## USAGE
Example on how to add a button-event since people don't want to learn:
https://pastebin.com/UPQRcAei
```lua
-- ### Event usages:
-- Look at your own ID-card
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(PlayerId()))
-- Show your ID-card to the closest person
local player, distance = ESX.Game.GetClosestPlayer()
if distance ~= -1 and distance <= 3.0 then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(player))
else
ESX.ShowNotification('No players nearby')
end
-- Look at your own driver license
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(PlayerId()), 'driver')
-- Show your driver license to the closest person
local player, distance = ESX.Game.GetClosestPlayer()
if distance ~= -1 and distance <= 3.0 then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(player), 'driver')
else
ESX.ShowNotification('No players nearby')
end
-- Look at your own firearms license
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(PlayerId()), 'weapon')
-- Show your firearms license to the closest person
local player, distance = ESX.Game.GetClosestPlayer()
if distance ~= -1 and distance <= 3.0 then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(player), 'weapon')
else
ESX.ShowNotification('No players nearby')
end
-- ### A menu (THIS IS AN EXAMPLE)
function openMenu()
ESX.UI.Menu.Open(
'default', GetCurrentResourceName(), 'id_card_menu',
{
title = 'ID menu',
elements = {
{label = 'Check your ID', value = 'checkID'},
{label = 'Show your ID', value = 'showID'},
{label = 'Check your driver license', value = 'checkDriver'},
{label = 'Show your driver license', value = 'showDriver'},
{label = 'Check your firearms license', value = 'checkFirearms'},
{label = 'Show your firearms license', value = 'showFirearms'},
}
},
function(data, menu)
local val = data.current.value
if val == 'checkID' then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(PlayerId()))
elseif val == 'checkDriver' then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(PlayerId()), 'driver')
elseif val == 'checkFirearms' then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(PlayerId()), 'weapon')
else
local player, distance = ESX.Game.GetClosestPlayer()
if distance ~= -1 and distance <= 3.0 then
if val == 'showID' then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(player))
elseif val == 'showDriver' then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(player), 'driver')
elseif val == 'showFirearms' then
TriggerServerEvent('jsfour-idcard:open', GetPlayerServerId(PlayerId()), GetPlayerServerId(player), 'weapon')
end
else
ESX.ShowNotification('No players nearby')
end
end
end,
function(data, menu)
menu.close()
end
)
end
```
PSD file: https://www.dropbox.com/sh/ho6xq5cmk6sxz6x/AAB3aPJOylL7EWrU6BFb45-0a?dl=0
| 36.247788 | 179 | 0.735596 | eng_Latn | 0.341937 |
f91b026e87205b0581add7a4f48068c43ca2f389 | 2,160 | md | Markdown | docs/visual-basic/language-reference/modifiers/narrowing.md | CharleyGui/docs.fr-fr | 2563c94abf0d041d775f700b552d1dbe199f03d5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/modifiers/narrowing.md | CharleyGui/docs.fr-fr | 2563c94abf0d041d775f700b552d1dbe199f03d5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/modifiers/narrowing.md | CharleyGui/docs.fr-fr | 2563c94abf0d041d775f700b552d1dbe199f03d5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Narrowing
ms.date: 07/20/2015
f1_keywords:
- vb.narrowing
helpviewer_keywords:
- conversions [Visual Basic], type
- type conversion [Visual Basic]
- conversions [Visual Basic], data type
- Narrowing keyword [Visual Basic]
- data type conversion [Visual Basic]
ms.assetid: a207ee91-aca4-4771-b4e2-713f029bf2bb
ms.openlocfilehash: 77515357ac9dc972992df09c471695aad13985c4
ms.sourcegitcommit: d2db216e46323f73b32ae312c9e4135258e5d68e
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 09/22/2020
ms.locfileid: "90867929"
---
# <a name="narrowing-visual-basic"></a>Narrowing (Visual Basic)
Indique qu’un opérateur de conversion ( `CType` ) convertit une classe ou une structure en un type qui peut ne pas pouvoir contenir certaines des valeurs possibles de la classe ou de la structure d’origine.
## <a name="converting-with-the-narrowing-keyword"></a>Conversion avec le mot clé restrictive
La procédure de conversion doit spécifier `Public Shared` en plus de `Narrowing` .
Les conversions restrictives ne réussissent pas toujours au moment de l’exécution et peuvent échouer ou entraîner une perte de données. Exemples : `Long` à `Integer` , `String` à `Date` et un type de base à un type dérivé. Cette dernière conversion est restrictive, car le type de base ne peut pas contenir tous les membres du type dérivé et, par conséquent, n’est pas une instance du type dérivé.
Si `Option Strict` est `On` , le code de consommation doit utiliser `CType` pour toutes les conversions restrictives.
Le `Narrowing` mot clé peut être utilisé dans ce contexte :
[Operator Statement](../statements/operator-statement.md)
## <a name="see-also"></a>Voir aussi
- [Operator Statement](../statements/operator-statement.md)
- [Widening](widening.md)
- [Widening and Narrowing Conversions](../../programming-guide/language-features/data-types/widening-and-narrowing-conversions.md)
- [Comment : définir un opérateur](../../programming-guide/language-features/procedures/how-to-define-an-operator.md)
- [CType Function](../functions/ctype-function.md)
- [Option Strict Statement](../statements/option-strict-statement.md)
| 49.090909 | 400 | 0.766204 | fra_Latn | 0.884889 |
f91b0d2cee57c6c083c8db32a0ba98347fdf3587 | 62 | md | Markdown | README.md | javiyt/cloudtuya-go | a464aa699d438b6c4017ed492a73e9d971e7237e | [
"Apache-2.0"
] | null | null | null | README.md | javiyt/cloudtuya-go | a464aa699d438b6c4017ed492a73e9d971e7237e | [
"Apache-2.0"
] | null | null | null | README.md | javiyt/cloudtuya-go | a464aa699d438b6c4017ed492a73e9d971e7237e | [
"Apache-2.0"
] | null | null | null | # cloudtuya-go
Port of the cloudtuya repository made using go
| 20.666667 | 46 | 0.806452 | eng_Latn | 0.989937 |
f91b6e521f9b43f8321b9b99cbd801770e54a539 | 2,439 | md | Markdown | content/blog/storyboard.md | davidalexandercurrie/itp-blog | 4d3661c56872fdcc83de6ca24cba8d722a5dc3ad | [
"MIT"
] | null | null | null | content/blog/storyboard.md | davidalexandercurrie/itp-blog | 4d3661c56872fdcc83de6ca24cba8d722a5dc3ad | [
"MIT"
] | null | null | null | content/blog/storyboard.md | davidalexandercurrie/itp-blog | 4d3661c56872fdcc83de6ca24cba8d722a5dc3ad | [
"MIT"
] | null | null | null | ---
path: animation-week-2
date: 2020-11-11T19:07:09.387Z
title: Storyboard
description: Animation Week 2
---
I am working with Duncan Figurski on animation assignment 2. This week we met to plan out our story elements and work process.

We came up with a plan to create a story based on a character that lives multiple lives throughout the film. We decided each _life_ would be allowed to be vastly different in both narrative and visual style.

Above is the character Duncan created in Illustrator that we will be basing much of the story around. Duncan designed the character in a way that allows side-angle versions of the character to merge together into a new version of it, which is what originally gave us the idea to base the story on a number of different lives the character has. We loosely decided the character would merge together in some way at the end of each scene and begin a new life in the next scene.
The work was divided up so that we each designed a scene that we will then fuse together. My scene depicts the character being created on a factory assembly line. The character is built by machines on a seemingly endless conveyor belt and eventually comes face to face with one of the machines, which the character then becomes joined with. All the machines in the factory then converge on the new merged character, dismantling it and themselves, and the scene dissipates into a cloud of smoke.
Below are the frames I created for the storyboard.












Below are both Duncan's scenes, and mine, in Gif format.


| 62.538462 | 496 | 0.768758 | eng_Latn | 0.992468 |
f91ba5b27d0139e2259d115f11297bcaead9e290 | 1,357 | md | Markdown | README.md | clayne/K2NN | 49ad31c8d2cdfe2fa6e54673905ebe604e12a8ae | [
"MIT"
] | 9 | 2016-10-04T14:14:02.000Z | 2022-02-08T11:04:12.000Z | README.md | clayne/K2NN | 49ad31c8d2cdfe2fa6e54673905ebe604e12a8ae | [
"MIT"
] | 10 | 2016-11-21T16:08:16.000Z | 2018-11-12T17:39:10.000Z | README.md | clayne/K2NN | 49ad31c8d2cdfe2fa6e54673905ebe604e12a8ae | [
"MIT"
] | 3 | 2016-10-04T14:14:05.000Z | 2020-08-01T00:23:55.000Z | Fastest CPU implementation of both a brute-force
and a custom Multi-Index Hash Table accelerator
system for matching 512-bit binary descriptors
in 2NN mode, i.e., a match is returned if the best
match between a query vector and a training vector
is more than a certain threshold number of bits
better than the second-best match.
Yes, that means the DIFFERENCE in popcounts is used
for thresholding, NOT the ratio. This is the CORRECT
approach for binary descriptors.
Both 8-bit and 16-bit MIH tables are supported.
I currently recommend 16-bit.
All functionality is contained in the files K2NN.h and twiddle_table.h.
'main.cpp' is simply a sample test harness with example usage and
performance testing.
Example initialization of the Matcher class:

    Matcher<false> m(tvecs, size, qvecs, size, threshold, max_twiddles);

Options:

- Brute-force complete (exact) match: `m.bruteMatch();`
- Single twiddle pass for a very fast partial match, with no false positives (i.e. if a match is returned, it's truly the best match): `m.fastApproxMatch();`
- Multi-index hash (MIH) complete (exact) match, with fall-back to brute force after max_twiddles passes: `m.exactMatch();`
- Match until complete or until 'n' passes elapse (partial): `m.approxMatchToNTwiddles(n);`
Afterward, the finalized matches are waiting
in the vector 'm.matches'.
| 33.925 | 104 | 0.764186 | eng_Latn | 0.996382 |
f91c36401b6b1d22111c18edba6fae03261d0f71 | 74 | md | Markdown | README.md | perlmonger42/go-lox | 46c87a8443f8ae48bf700a0a6fddfe9f1a9b8369 | [
"MIT"
] | null | null | null | README.md | perlmonger42/go-lox | 46c87a8443f8ae48bf700a0a6fddfe9f1a9b8369 | [
"MIT"
] | null | null | null | README.md | perlmonger42/go-lox | 46c87a8443f8ae48bf700a0a6fddfe9f1a9b8369 | [
"MIT"
] | null | null | null | # go-lox
Bob Nystrom's Lox (from Crafting Interpreters) implemented in Go
| 24.666667 | 64 | 0.783784 | eng_Latn | 0.893816 |
f91c452546d0b9e98bcde31c98e0c04bdb248073 | 279 | md | Markdown | src/test/resources/testDocuments/list/descriptionList.kts.md | Durun/lateko | b5809e171acf41a804b2fbdb4ddb21185bd93e8b | [
"MIT"
] | 1 | 2022-01-16T10:50:55.000Z | 2022-01-16T10:50:55.000Z | src/test/resources/testDocuments/list/descriptionList.kts.md | Durun/lateko | b5809e171acf41a804b2fbdb4ddb21185bd93e8b | [
"MIT"
] | 20 | 2020-03-04T13:10:08.000Z | 2020-04-10T13:57:02.000Z | src/test/resources/testDocuments/list/descriptionList.kts.md | Durun/lateko | b5809e171acf41a804b2fbdb4ddb21185bd93e8b | [
"MIT"
] | null | null | null | # Description List
<div id="chdescriptionlistexample"></div>
## Description list example
With linebreak
- titleA
descriptionA
- titleB
descriptionB
- titleC
descriptionC
And without linebreak
- titleA descriptionA
- titleB descriptionB
- titleC descriptionC
End.
| 10.730769 | 41 | 0.759857 | eng_Latn | 0.588846 |
f91cb395f50c88b7364b7469b96ec1f55d3b78fd | 977 | md | Markdown | README.md | diptangsu/waifu-pics | 854b15f31be1decabcae2ff248297c28e361498b | [
"MIT"
] | null | null | null | README.md | diptangsu/waifu-pics | 854b15f31be1decabcae2ff248297c28e361498b | [
"MIT"
] | null | null | null | README.md | diptangsu/waifu-pics | 854b15f31be1decabcae2ff248297c28e361498b | [
"MIT"
] | null | null | null | # Waifu pics
A simple command line app to open a waifu image/gif on the browser. This is a wrapper for the [waifu.pics](https://waifu.pics/docs) API.
## Installation
```
pip install waifu-pics
```
or, since you're probably using `python3`
```
pip3 install waifu-pics
```
## Usage
```
waifu help shows this message
waifu open a random sfw image on your browser
waifu [type] [category] open an image from a particular type and category
Valid types:
sfw, nsfw
Valid sfw categories:
bite, yeet, handhold, slap, hug, waifu, kiss, wink, bonk, smile, poke, highfive, shinobu, bully, wave, nom, smug, cuddle, cringe, glomp, dance, kill, neko, blush, pat, awoo, happy, megumin, cry, lick
Valid nsfw categories:
neko, trap, blowjob, waifu
```
## Example
```
waifu sfw hug
```
I don't know why I did this. I guess I was really bored and I found this weird API and I felt like coding.
Absolutely not proud of this lol.
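For reference, each `waifu [type] [category]` invocation maps onto a single endpoint of the waifu.pics API. A minimal sketch of that mapping (the `https://api.waifu.pics` base URL is an assumption taken from the public API docs, not something shown in this repo):

```python
import json
from urllib.request import urlopen

API_BASE = "https://api.waifu.pics"  # assumed public endpoint


def build_url(kind: str = "sfw", category: str = "waifu") -> str:
    """Build the API URL for the given type and category."""
    return f"{API_BASE}/{kind}/{category}"


def fetch_image_url(kind: str = "sfw", category: str = "waifu") -> str:
    """Query the API and return the image URL it responds with (needs network)."""
    with urlopen(build_url(kind, category)) as resp:
        return json.load(resp)["url"]


if __name__ == "__main__":
    print(build_url("sfw", "hug"))
```

The CLI then presumably opens the returned `url` in a browser, e.g. via `webbrowser.open`.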
| 23.261905 | 203 | 0.675537 | eng_Latn | 0.983018 |
f91dcaf19b2fc11fa8aea6ea4a03ab2b25164d94 | 510 | md | Markdown | docs/LegacyAddress.md | CiscoDevNet/java-msx-sdk | 38a722c209f0ea49c97fd7f11be14ccf9ff49c1b | [
"MIT"
] | 1 | 2022-01-11T08:52:38.000Z | 2022-01-11T08:52:38.000Z | docs/LegacyAddress.md | CiscoDevNet/java-msx-sdk | 38a722c209f0ea49c97fd7f11be14ccf9ff49c1b | [
"MIT"
] | 1 | 2021-11-23T19:30:40.000Z | 2021-11-23T19:30:40.000Z | docs/LegacyAddress.md | CiscoDevNet/java-msx-sdk | 38a722c209f0ea49c97fd7f11be14ccf9ff49c1b | [
"MIT"
] | null | null | null |
# LegacyAddress
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**name** | **String** | | [optional]
**displayName** | **String** | | [optional]
**company** | **String** | | [optional]
**address1** | **String** | | [optional]
**address2** | **String** | | [optional]
**city** | **String** | | [optional]
**state** | **String** | | [optional]
**country** | **String** | | [optional]
**postCode** | **String** | | [optional]
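As a rough illustration, a serialized `LegacyAddress` could look like the following (all field values here are invented for the example):

```json
{
  "name": "shipping",
  "displayName": "Shipping Address",
  "company": "Example Corp",
  "address1": "100 Main St",
  "address2": "Suite 4",
  "city": "San Jose",
  "state": "CA",
  "country": "US",
  "postCode": "95134"
}
```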
| 23.181818 | 60 | 0.460784 | yue_Hant | 0.3559 |
f91e3e04858c649335b90e0ee22d46d39049cc38 | 134 | md | Markdown | README.md | actions-cool/issue-bot | e5ed032c983fd573e1e5b849137c194524d5ceef | [
"MIT"
] | 1 | 2021-01-07T13:54:06.000Z | 2021-01-07T13:54:06.000Z | README.md | actions-cool/issue-bot | e5ed032c983fd573e1e5b849137c194524d5ceef | [
"MIT"
] | 2 | 2020-12-30T13:17:28.000Z | 2022-02-25T04:31:43.000Z | README.md | actions-cool/issue-bot | e5ed032c983fd573e1e5b849137c194524d5ceef | [
"MIT"
] | 1 | 2021-03-10T03:32:33.000Z | 2021-03-10T03:32:33.000Z | # Issue Chat
🤡 I'm not sure how far it can go. Maybe just entertainment.
See https://github.com/actions-cool/issue-chat/issues/1
| 22.333333 | 62 | 0.738806 | eng_Latn | 0.805575 |
f91e6741c02deb72a42e0b4db4fede70929de0f6 | 650 | md | Markdown | windows.applicationmodel/package_getapplistentriesasync_364802562.md | gbaychev/winrt-api | 25346cd51bc9d24c8c4371dc59768e039eaf02f1 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-01-20T07:56:10.000Z | 2021-11-06T17:07:12.000Z | windows.applicationmodel/package_getapplistentriesasync_364802562.md | gbaychev/winrt-api | 25346cd51bc9d24c8c4371dc59768e039eaf02f1 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows.applicationmodel/package_getapplistentriesasync_364802562.md | gbaychev/winrt-api | 25346cd51bc9d24c8c4371dc59768e039eaf02f1 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2018-01-20T02:36:26.000Z | 2018-01-20T02:36:26.000Z | ---
-api-id: M:Windows.ApplicationModel.Package.GetAppListEntriesAsync
-api-type: winrt method
---
<!-- Method syntax
public Windows.Foundation.IAsyncOperation<Windows.Foundation.Collections.IVectorView<Windows.ApplicationModel.Core.AppListEntry>> GetAppListEntriesAsync()
-->
# Windows.ApplicationModel.Package.GetAppListEntriesAsync
## -description
Enumerates the packaged apps on the device. Only apps included in the current package are returned.
## -returns
A list of packaged apps along with their display name, description, and logo.
## -remarks
## -examples
## -see-also
[AppListEntry](../windows.applicationmodel.core/applistentry.md)
Open your SQL server page:
1. Go to the [Azure portal](https://portal.azure.com).
2. Click **More services** > **SQL servers**:

3. Click the desired SQL server.
---
title: (ESP) Generate the Declaration 347 report
TOCTitle: (ESP) Generate the Declaration 347 report
ms:assetid: bf19f1c3-4523-4c8f-8061-2d7d0febe4c4
ms:mtpsurl: https://technet.microsoft.com/library/Hh242823(v=AX.60)
ms:contentKeyID: 36059255
author: Khairunj
ms.date: 04/18/2014
mtps_version: v=AX.60
f1_keywords:
- Spain
- Spanish
- report
audience: Application User
ms.search.region: Spain
---
# (ESP) Generate the Declaration 347 report
_**Applies To:** Microsoft Dynamics AX 2012 R3, Microsoft Dynamics AX 2012 R2, Microsoft Dynamics AX 2012 Feature Pack, Microsoft Dynamics AX 2012_
You can use the **Declaration 347** form to submit the Declaration 347 report to the tax authorities. When you generate the Declaration 347 report, a transaction line that has the new tax exempt number for the customer or vendor is displayed. All transaction entries are added to the report after both the tax exempt number and the customer number or vendor number are verified. The total amount is calculated after a tax exempt number and a customer number are grouped. For each customer or vendor, the tax exempt number is verified, and all transactions are included on one line of the Declaration 347 report.
1. Click **General ledger** \> **Periodic** \> **Report 347** \> **Declaration 347**.
2. Click **Generate** to open the **Declaration 347** form.
3. In the **Fiscal year** field, enter the fiscal year that the Declaration 347 report is printed for.
4. In the **Minimum amount** field, enter the minimum amount to report in the declaration.
5. In the **Minimum amount of payments in cash** field, enter the minimum amount of the cash payment to report in the declaration.
6. In the **Document number of the declaration** field, enter the document number of the Declaration 347 report.
7. Select the **Group only by tax exempt number** check box to ignore the changes for the tax exempt number, and to group all customer transactions and vendor transactions by tax exempt number in the declaration.
8. Click **OK** to generate the Declaration 347 report.
## See also
[(ESP) Generate declaration 347 (class form)](https://technet.microsoft.com/library/aa589594\(v=ax.60\))
[(ESP) Declaration 347 (form)](https://technet.microsoft.com/library/aa552566\(v=ax.60\))
Actile
========
#### A grid of ‘active tiles’, which scale to fit any window shape
- [Homepage](http://actile.richplastow.com/)
- [Documentation](http://actile.richplastow.com/#/doc/documentation)
- [Test](http://actile.richplastow.com/test/run-test.html)
- [Fork Actile on GitHub](https://github.com/richplastow/actile)
---
uid: System.Workflow.Activities.WorkflowServiceAttributesDynamicPropertyValidator
author: "Erikre"
ms.author: "erikre"
manager: "erikre"
---
---
uid: System.Workflow.Activities.WorkflowServiceAttributesDynamicPropertyValidator.#ctor
author: "Erikre"
ms.author: "erikre"
manager: "erikre"
---
---
uid: System.Workflow.Activities.WorkflowServiceAttributesDynamicPropertyValidator.Validate(System.Workflow.ComponentModel.Compiler.ValidationManager,System.Object)
author: "Erikre"
ms.author: "erikre"
manager: "erikre"
---
# Kotlin/Native #
_Kotlin/Native_ is an LLVM backend for the Kotlin compiler, a runtime
implementation, and a native code generation facility using the LLVM toolchain.
_Kotlin/Native_ is primarily designed to allow compilation for platforms where
virtual machines are not desirable or possible (such as iOS or embedded targets),
or where a developer wants to produce a reasonably sized, self-contained program
without the need to ship an additional execution runtime.
To get started with _Kotlin/Native_ take a look at the attached samples.
* `androidNativeActivity` - Android Native Activity rendering 3D graphics using OpenGLES
* `calculator` - iOS Swift application, using Kotlin/Native code compiled into the framework
* `csvparser` - simple CSV file parser and analyzer
* `gitchurn` - program interoperating with `libgit2` for GIT repository analysis
* `gtk` - GTK2 interoperability example
* `html5Canvas` - WebAssembly example
* `libcurl` - using of FTP/HTTP/HTTPS client library `libcurl`
* `nonBlockingEchoServer` - multi-client TCP/IP echo server using co-routines
* `objc` - AppKit Objective-C interoperability example for macOS
* `opengl` - OpenGL/GLUT teapot example
* `python_extension` - Python extension written in Kotlin/Native
* `socket` - TCP/IP echo server
* `tensorflow` - simple client for TensorFlow Machine Intelligence library
* `tetris` - Tetris game implementation (using SDL2 for rendering)
* `uikit` - UIKit Objective-C interoperability example for iOS
* `videoplayer` - SDL and FFMPEG-based video and audio player
* `win32` - trivial Win32 GUI application
* `workers` - example of using workers API
See `README.md` in each sample directory for more information and build instructions.
_Kotlin/Native_ can be used either as a standalone compiler toolchain or as a Gradle
plugin. See `GRADLE_PLUGIN.md` for more details on how to use this plugin.
Compile your programs like this:
export PATH=kotlin-native-<platform>-<version>/bin:$PATH
kotlinc hello.kt -o hello
For an optimized compilation use -opt:
kotlinc hello.kt -o hello -opt
To generate interoperability stubs, create a library definition file
(take a look at `samples/tetris/tetris.sdl`) and run the `cinterop` tool like this:
cinterop -def lib.def
See `INTEROP.md` for more information on how to use C libraries from _Kotlin/Native_.
See `RELEASE_NOTES.md` for information on supported platforms and current limitations.
# Reference Apps - THEO Simple OTT
The purpose of this app is to demonstrate how [THEOplayer] could be used in a "real" production-like
application.
For quick start, please proceed with the [Quick Start](#quick-start) guide.
For application architecture, please proceed with the [Application Architecture](#application-architecture) guide.
## Guides
The _**THEO Simple OTT**_ application shows various THEOplayer features used together. To learn more
about them, please check the following guides:
* [THEOplayer How To's - Full Screen Management]
* [THEOplayer How To's - Google Cast Integration]
* [THEOplayer How To's - Downloading Stream Content]
This app is an extension of the [THEO Basic Playback] application. For help with getting started with
THEOplayer or Android Studio, feel free to check the related guides:
* [THEO Knowledge Base - Android Studio Setup]
* [THEO Knowledge Base - Virtual and Physical Devices]
* [THEOplayer How To's - THEOplayer Android SDK Integration]
## Quick Start
1. Obtain THEOplayer Android SDK with **Caching** and **ExoPlayer** features enabled and unzip it.
   Please visit [Get Started with THEOplayer] to get the required THEOplayer Android SDK.
2. Copy **`theoplayer-android-[name]-[version]-minapi16-release.aar`** file from unzipped SDK into
application **[libs]** folder and rename it to **`theoplayer.aar`**.
   The project is configured to load the SDK under this name; to use a different name, change the
   `implementation ':theoplayer@aar'` dependency in the [app-level build.gradle] file accordingly.
Please check [THEOplayer How To's - THEOplayer Android SDK Integration] guide for more information
about integrating THEOplayer Android SDK.
3. Open _**THEO Simple OTT**_ application in Android Studio.
For more information about installing Android Studio please check
[THEO Knowledge Base - Android Studio Setup] guide.
   Android Studio should automatically synchronize and rebuild the project. If this does not happen, please
   select the **File > Sync Project with Gradle Files** menu item to do it manually. Please note that,
   in very rare cases, the project will need to be synchronized twice.
4. Select the **Run > Run 'app'** menu item to run the application on the device selected by default.
   To change the device, please select the **Run > Select Device...** menu item. For more information
   about working with Android devices, please check the [THEO Knowledge Base - Virtual and Physical Devices]
   guide.
## Application Architecture
The application presents a view with four tabs:
* **LIVE** - where live streams can be played
* **ON DEMAND** - where VoD streams can be played
* **OFFLINE** - where available VoD streams can be downloaded and played
* **SETTINGS** - where all downloaded streams can be removed and download preferences can be set

The streams presented on the tabs are defined in the [stream_sources.json] file stored in the application's raw
resources. They can be easily updated. Every tab that displays streams has its own section in this
JSON configuration. The stream sources JSON configuration should be structured as follows:
```
{
"live": StreamSource[],
"onDemand": StreamSource[],
"offline": StreamSource[]
}
```
where `StreamSource` has the following structure:
```
{
"title": "Stream Title",
"description": "Some Stream Description",
"image": "@drawable/streamImage",
"source": "hxxps://some.host.com/some-asset.m3u8"
}
```
Please note that `image` should reference an existing drawable resource.
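When hand-editing the file, it is easy to introduce a structural mistake. A small standalone check (a Python sketch, not part of the app; the function and key set simply mirror the schema shown above) can verify that each entry has the expected keys before you rebuild:

```python
import json

REQUIRED_KEYS = {"title", "description", "image", "source"}

def check_stream_sources(text):
    """Return a list of problems found in a stream_sources.json payload."""
    config = json.loads(text)
    problems = []
    for tab in ("live", "onDemand", "offline"):
        for i, entry in enumerate(config.get(tab, [])):
            missing = REQUIRED_KEYS - entry.keys()
            if missing:
                problems.append(f"{tab}[{i}] missing {sorted(missing)}")
    return problems

sample = '{"live": [{"title": "T", "description": "D", "image": "@drawable/x"}]}'
print(check_stream_sources(sample))
# → ["live[0] missing ['source']"]
```

Running such a check in CI keeps a malformed entry from silently producing an empty tab at runtime.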
## Streams/Content Rights:
The DRM streams used in this app (if any) are provided by our partner, [EZ DRM], who holds all
rights to the content. These streams are DRM protected and cannot be used for any other purpose.
## License
This project is licensed under the BSD 3 Clause License - see the [LICENSE] file for details.
[//]: # (Links and Guides reference)
[THEOplayer]: https://www.theoplayer.com/
[THEO Basic Playback]: ../Basic-Playback
[THEO Knowledge Base - Android Studio Setup]: ../Basic-Playback/guides/knowledgebase-android-studio-setup/README.md
[THEO Knowledge Base - Virtual and Physical Devices]: ../Basic-Playback/guides/knowledgebase-virtual-and-physical-devices/README.md
[THEOplayer How To's - THEOplayer Android SDK Integration]: ../Basic-Playback/guides/howto-theoplayer-android-sdk-integration/README.md
[THEOplayer How To's - Full Screen Management]: ../Full-Screen-Handling/guides/howto-full-screen-management/README.md
[THEOplayer How To's - Google Cast Integration]: ../Google-Cast/guides/howto-google-cast-integration/README.md
[THEOplayer How To's - Downloading Stream Content]: ../Offline-Playback/guides/howto-downloading-stream-content/README.md
[Get Started with THEOplayer]: https://www.theoplayer.com/licensing
[EZ DRM]: https://ezdrm.com/
[//]: # (Project files reference)
[LICENSE]: LICENSE
[libs]: app/libs
[app-level build.gradle]: app/build.gradle
[stream_sources.json]: app/src/main/res/raw/stream_sources.json
---
title: "MSBuild Reference | Microsoft Docs"
ms.custom:
ms.date: 11/04/2016
ms.reviewer:
ms.suite:
ms.technology: vs-ide-sdk
ms.tgt_pltfrm:
ms.topic: article
dev_langs:
- VB
- CSharp
- C++
- jsharp
helpviewer_keywords: MSBuild, reference
ms.assetid: 093395e1-70da-4f74-b34d-046c5e2b32e8
caps.latest.revision: "22"
author: kempb
ms.author: kempb
manager: ghogen
ms.openlocfilehash: 40b1fdb27bd2a256a1cff4b5a2066a3223939771
ms.sourcegitcommit: f40311056ea0b4677efcca74a285dbb0ce0e7974
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 10/31/2017
---
# <a name="msbuild-reference"></a>MSBuild Reference
[!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] is the build system for [!INCLUDE[vsprvs](../code-quality/includes/vsprvs_md.md)]. The links below lead to topics that contain reference information about [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)].
## <a name="in-this-section"></a>In this section
[MSBuild Project File Schema Reference](../msbuild/msbuild-project-file-schema-reference.md)
Describes the XML elements that make up the [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] file format.
[MSBuild Task Reference](../msbuild/msbuild-task-reference.md)
Describes some of the standard tasks that are included with [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)].
[MSBuild Conditions](../msbuild/msbuild-conditions.md)
Describes the conditions that are available in [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] files.
[MSBuild Conditional Constructs](../msbuild/msbuild-conditional-constructs.md)
Describes how to use the `Choose`, `When`, and `Otherwise` elements.
[MSBuild Reserved and Well-Known Properties](../msbuild/msbuild-reserved-and-well-known-properties.md)
Describes the reserved [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] properties.
[Common MSBuild Project Properties](../msbuild/common-msbuild-project-properties.md)
Describes project properties that are common to all project types, as well as properties that are often used by particular project types.
[Common MSBuild Project Items](../msbuild/common-msbuild-project-items.md)
Describes project items that are common to all project types, as well as items that are often used by particular project types.
[MSBuild Command-Line Reference](../msbuild/msbuild-command-line-reference.md)
Describes the arguments and switches that can be used with [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)].exe.
[.Targets Files](../msbuild/msbuild-dot-targets-files.md)
Describes the .Targets files that are included with [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)].
[MSBuild Well-Known Item Metadata](../msbuild/msbuild-well-known-item-metadata.md)
Lists the metadata that is created with every item.
[MSBuild Response Files](../msbuild/msbuild-response-files.md)
Describes the .rsp files that contain command-line switches.
[Additional Resources for MSBuild](../msbuild/additional-resources-for-msbuild.md)
Provides links to [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] websites and discussion groups.
[WPF MSBuild Reference](../msbuild/wpf-msbuild-reference.md)
Contains reference information about [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] targets and tasks for Windows Presentation Foundation (WPF).
[Special Characters to Escape](../msbuild/special-characters-to-escape.md)
Lists the characters that may need to be "escaped" in order to be interpreted correctly. An escape sequence is a series of characters that signifies that what follows is an alternative interpretation.
## <a name="related-sections"></a>Related sections
[MSBuild Overview](../msbuild/msbuild.md) Introduces [!INCLUDE[vstecmsbuild](../extensibility/internals/includes/vstecmsbuild_md.md)] and provides links to topics that explain how to use it to build projects.
<xref:Microsoft.Build.Conversion>
Contains reference information about the Conversion namespace.
<xref:Microsoft.Build.Evaluation>
Contains reference information about the Evaluation namespace.
<xref:Microsoft.Build.Execution>
Contains reference information about the Execution namespace.
<xref:Microsoft.Build.Framework>
Contains reference information about the Framework namespace.
<xref:Microsoft.Build.Logging>
Contains reference information about the Logging namespace.
<xref:Microsoft.Build.Tasks>
Contains reference information about the Tasks namespace.
<xref:Microsoft.Build.Utilities>
Contains reference information about the Utilities namespace.
# Genpei (源平)
[](https://github.com/suecharo/genpei/actions?query=workflow%3Apytest)
[](https://github.com/suecharo/genpei/actions?query=workflow%3Aflake8)
[](https://github.com/suecharo/genpei/actions?query=workflow%3Aisort)
[](https://github.com/suecharo/genpei/actions?query=workflow%3Amypy)
[](http://www.apache.org/licenses/LICENSE-2.0)
Genpei (源平) is a standard implementation conforming to the [Workflow Execution Service](https://github.com/ga4gh/workflow-execution-service-schemas) (WES) API definition established by the [Global Alliance for Genomics and Health](https://www.ga4gh.org) (GA4GH).
Following the microservice philosophy, it is a simple and highly extensible REST API server built with [Flask](https://a2c.bitbucket.io/flask/) and [cwltool](https://github.com/common-workflow-language/cwltool) around the [Common Workflow Language](https://www.commonwl.org).
It supports running and managing workflows written in the [Common Workflow Language](https://www.commonwl.org) (CWL).
## Install and Run
Python 3.6 or later is assumed.
```bash
$ pip3 install genpei
$ genpei
```
### Docker
Running with Docker is also supported.
Because Docker-in-Docker (DinD) is used inside cwltool, you must mount `docker.sock`, `/tmp`, and similar paths.
For details, see the [DockerHub - cwltool](https://hub.docker.com/r/commonworkflowlanguage/cwltool/) documentation.
```bash
# Start the containers
$ docker-compose up -d
# Check that the containers are up
$ docker-compose logs
```
## Usage
For the API specification, see [GitHub - GA4GH WES](https://github.com/ga4gh/workflow-execution-service-schemas) or [SwaggerUI - GA4GH WES](https://suecharo.github.io/genpei-swagger-ui/dist/).
As the simplest REST API request, here is an example of `GET /service-info`:
```json
GET /service-info
{
"auth_instructions_url": "https://github.com/suecharo/genpei",
"contact_info_url": "https://github.com/suecharo/genpei",
"default_workflow_engine_parameters": [],
"supported_filesystem_protocols": [
"http",
"https",
"file"
],
"supported_wes_versions": [
"1.0.0"
],
"system_state_counts": {},
"tags": {
"wes_name": "genpei"
},
"workflow_engine_versions": {
"cwltool": "3.0.20200324120055"
},
"workflow_type_versions": {
"CWL": {
"workflow_type_version": [
"v1.0",
"v1.1",
"v1.1.0-dev1",
"v1.2.0-dev1",
"v1.2.0-dev2"
]
}
}
}
```
You can change the host and port by specifying the startup arguments (`--host` and `--port`). The environment variables `GENPEI_HOST` and `GENPEI_PORT` are also available and correspond to these arguments.
```bash
genpei --help
usage: genpei [-h] [--host] [-p] [--debug] [-r] [--service-info]
An implementation of GA4GH Workflow Execution Service Standard as a microservice
optional arguments:
-h, --help show this help message and exit
--host Host address of Flask. (default: 127.0.0.1)
-p , --port Port of Flask. (default: 8080)
--debug Enable debug mode of Flask.
-r , --run-dir Specify the run dir. (default: ./run)
--service-info Specify `service-info.json`. The workflow_engine_versions, workflow_type_versions
and system_state_counts are overwritten in the application.
$ genpei --host 0.0.0.0 --port 5000
```
Genpei manages submitted workflows, workflow parameters, output files, and so on, on the file system. The directory that gathers all of these files is called the run dir; the default is `${PWD}/run`. The run dir location can be overridden with the startup argument `--run-dir` or the environment variable `GENPEI_RUN_DIR`.
The run dir is structured as shown below, with the files for each run placed under it. Initialization, as well as deletion of individual runs, can be done by physically removing files with `rm`.
```bash
$ tree run
.
├── 11
│ └── 11a23a68-a914-427a-80cd-9ad6f7cfd256
│ ├── cmd.txt
│ ├── end_time.txt
│ ├── exe
│ │ └── workflow_params.json
│ ├── exit_code.txt
│ ├── outputs
│ │ ├── ERR034597_1.small_fastqc.html
│ │ ├── ERR034597_1.small.fq.trimmed.1P.fq
│ │ ├── ERR034597_1.small.fq.trimmed.1U.fq
│ │ ├── ERR034597_1.small.fq.trimmed.2P.fq
│ │ ├── ERR034597_1.small.fq.trimmed.2U.fq
│ │ └── ERR034597_2.small_fastqc.html
│ ├── run.pid
│ ├── run_request.json
│ ├── start_time.txt
│ ├── state.txt
│ ├── stderr.log
│ └── stdout.log
├── 14
│ └── ...
├── 2d
│ └── ...
└── 6b
└── ...
```
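Judging from the layout above, each run's files appear to live under a two-character bucket taken from the start of its run ID. A minimal sketch of resolving a run's directory under that assumption (the function name is hypothetical, not part of Genpei's API):

```python
from pathlib import Path

def resolve_run_dir(base: str, run_id: str) -> Path:
    """Resolve a run's directory: runs are bucketed under the
    first two characters of their run ID, as in the tree above."""
    return Path(base) / run_id[:2] / run_id

print(resolve_run_dir("run", "11a23a68-a914-427a-80cd-9ad6f7cfd256").as_posix())
# → run/11/11a23a68-a914-427a-80cd-9ad6f7cfd256
```

This kind of bucketing keeps any single directory from accumulating thousands of entries as runs pile up.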
Executing `POST /runs` is rather complex. As an example using Python's [requests](https://requests.readthedocs.io/en/master/), see [GitHub - genpei/tests/post_runs_examples](https://github.com/suecharo/genpei/tree/master/tests/post_runs_examples).
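As a rough illustration of what those examples do, the sketch below only assembles the form fields of a WES `RunRequest`; the workflow URL and parameters are made-up placeholders, and the actual submission would be a `requests.post` as in the linked examples:

```python
import json

def build_run_request(workflow_url, workflow_params):
    """Assemble the form fields of a GA4GH WES 1.0.0 RunRequest."""
    return {
        "workflow_type": "CWL",
        "workflow_type_version": "v1.0",
        "workflow_url": workflow_url,
        "workflow_params": json.dumps(workflow_params),
    }

fields = build_run_request(
    "https://example.com/trimming_and_qc.cwl",  # hypothetical workflow URL
    {"fastq_1": {"class": "File", "path": "ERR034597_1.small.fq.gz"}},
)
print(sorted(fields))
# → ['workflow_params', 'workflow_type', 'workflow_type_version', 'workflow_url']
# Submitting would then be, with the requests package:
#   requests.post("http://127.0.0.1:8080/runs", data=fields)
```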
## Development
Start the development environment as follows:
```bash
$ docker-compose -f docker-compose.dev.yml up -d --build
$ docker-compose -f docker-compose.dev.yml exec app bash
```
[flake8](https://pypi.org/project/flake8/), [isort](https://github.com/timothycrosley/isort), and [mypy](http://mypy-lang.org) are used as lint tools.
Run each of them as follows:
```bash
$ bash ./tests/lint_and_style_check/flake8.sh
$ bash ./tests/lint_and_style_check/isort.sh
$ bash ./tests/lint_and_style_check/mypy.sh
```
[pytest](https://docs.pytest.org/en/latest/) is used as the test tool.
Run it as follows:
```bash
$ pytest .
```
## License
[Apache-2.0](https://www.apache.org/licenses/LICENSE-2.0). See the [LICENSE](https://github.com/suecharo/genpei/blob/master/LICENSE).
- The Developer and Maintainer: [@suecharo](https://github.com/suecharo)
- The Godfather of this library: [@inutano](https://github.com/inutano)
Subfield analysis
-----------------
This script is used to identify treatment differences within each
subfield, generate volcano plots, Venn diagrams, and tables for
subsequent GO analyses. The final multipanel figures for the manuscript
have been inserted just below the subheadings.
library(tidyverse)
library(cowplot) ## for some easy to use themes
library(DESeq2) ## for gene expression analysis
library(png)
library(grid)
library(scales)
library(apaTables) # for ANOVA tables
library(BiocParallel)
register(MulticoreParam(6))
## load functions
source("figureoptions.R")
source("functions_RNAseq.R")
## set output file for figures
knitr::opts_chunk$set(fig.path = '../figures/03_rnaseqSubfield/', cache = T)
Wrangle data
------------
# prep col data,
outliers <- c("146D-DG-3", "145A-CA3-2", "146B-DG-2", "146D-CA1-3", "148B-CA1-4")
colData <- read.csv("../data/00_colData.csv", header = T) %>%
filter(!RNAseqID %in% outliers)
colData$training <- factor(colData$training, levels = levelstraining)
colData$treatment <- factor(colData$treatment, levels = levelstreatment)
# remove outliers
    savecols <- as.character(colData$RNAseqID) # select the row names
savecols <- as.vector(savecols) # make it a vector
countData <- read.csv("../data/00_countData.csv",
header = T, check.names = F, row.names = 1) %>%
dplyr::select(one_of(savecols)) # select just the columns
head(countData)
## 143A-CA3-1 143A-DG-1 143B-CA1-1 143B-DG-1 143C-CA1-1 143D-CA1-3
## 0610007P14Rik 85 112 60 48 38 28
## 0610009B22Rik 24 34 21 10 19 0
## 0610009L18Rik 4 9 10 8 2 0
## 0610009O20Rik 85 185 44 72 76 25
## 0610010F05Rik 142 155 54 117 57 39
## 0610010K14Rik 24 74 14 21 23 21
## 143D-DG-3 144A-CA1-2 144A-CA3-2 144A-DG-2 144B-CA1-1 144B-CA3-1
## 0610007P14Rik 43 80 21 80 72 34
## 0610009B22Rik 1 30 9 9 14 9
## 0610009L18Rik 2 9 5 0 1 4
## 0610009O20Rik 38 95 20 82 49 24
## 0610010F05Rik 61 119 17 138 75 25
## 0610010K14Rik 12 32 10 20 41 11
## 144C-CA1-2 144C-CA3-2 144C-DG-2 144D-CA3-2 144D-DG-2 145A-CA1-2
## 0610007P14Rik 63 28 49 43 150 133
## 0610009B22Rik 15 24 16 13 23 36
## 0610009L18Rik 2 4 6 9 13 21
## 0610009O20Rik 89 48 85 46 151 96
## 0610010F05Rik 95 59 97 111 140 179
## 0610010K14Rik 38 14 24 17 62 34
## 145A-DG-2 145B-CA1-1 145B-DG-1 146A-CA1-2 146A-CA3-2 146A-DG-2
## 0610007P14Rik 41 51 21 29 87 23
## 0610009B22Rik 14 15 10 15 9 6
## 0610009L18Rik 1 3 0 8 9 6
## 0610009O20Rik 52 124 46 68 102 44
## 0610010F05Rik 52 30 41 69 83 45
## 0610010K14Rik 24 27 24 33 12 23
## 146B-CA1-2 146B-CA3-2 146C-CA1-4 146C-DG-4 146D-CA3-3 147C-CA1-3
## 0610007P14Rik 13 33 38 22 87 69
## 0610009B22Rik 6 43 7 6 23 17
## 0610009L18Rik 0 2 9 0 7 2
## 0610009O20Rik 16 46 31 10 83 58
## 0610010F05Rik 53 110 41 32 148 149
## 0610010K14Rik 11 11 7 5 10 46
## 147C-CA3-3 147C-DG-3 147D-CA3-1 147D-DG-1 148A-CA1-3 148A-CA3-3
## 0610007P14Rik 164 82 79 305 135 55
## 0610009B22Rik 30 39 41 105 59 24
## 0610009L18Rik 11 3 9 67 16 7
## 0610009O20Rik 191 145 88 377 162 65
## 0610010F05Rik 328 185 222 446 198 149
## 0610010K14Rik 31 27 0 173 60 40
## 148A-DG-3 148B-CA3-4 148B-DG-4
## 0610007P14Rik 104 122 16
## 0610009B22Rik 15 45 2
## 0610009L18Rik 11 11 1
## 0610009O20Rik 226 70 18
## 0610010F05Rik 176 177 23
## 0610010K14Rik 17 39 10
Get variance stabilized gene expression for each tissue
-------------------------------------------------------
# DEGs with looking at all four treatments individually
DGdds <- returnddstreatment("DG")
## [1] "DG"
## estimating size factors
## estimating dispersions
## gene-wise dispersion estimates: 6 workers
## mean-dispersion relationship
## final dispersion estimates, fitting model and testing: 6 workers
CA3dds <- returnddstreatment("CA3")
## [1] "CA3"
## estimating size factors
## estimating dispersions
## gene-wise dispersion estimates: 6 workers
## mean-dispersion relationship
## final dispersion estimates, fitting model and testing: 6 workers
CA1dds <- returnddstreatment("CA1")
## [1] "CA1"
## estimating size factors
## estimating dispersions
## gene-wise dispersion estimates: 6 workers
## mean-dispersion relationship
## final dispersion estimates, fitting model and testing: 6 workers
# DEGs with looking at all grouped trained and yoked
DGdds2 <- returnddstraining("DG")
## [1] "DG"
## estimating size factors
## estimating dispersions
## gene-wise dispersion estimates: 6 workers
## mean-dispersion relationship
## final dispersion estimates, fitting model and testing: 6 workers
## -- replacing outliers and refitting for 58 genes
## -- DESeq argument 'minReplicatesForReplace' = 7
## -- original counts are preserved in counts(dds)
## estimating dispersions
## fitting model and testing
CA3dds2 <- returnddstraining("CA3")
## [1] "CA3"
## estimating size factors
## estimating dispersions
## gene-wise dispersion estimates: 6 workers
## mean-dispersion relationship
## final dispersion estimates, fitting model and testing: 6 workers
CA1dds2 <- returnddstraining("CA1")
## [1] "CA1"
## estimating size factors
## estimating dispersions
## gene-wise dispersion estimates: 6 workers
## mean-dispersion relationship
## final dispersion estimates, fitting model and testing: 6 workers
## -- replacing outliers and refitting for 82 genes
## -- DESeq argument 'minReplicatesForReplace' = 7
## -- original counts are preserved in counts(dds)
## estimating dispersions
## fitting model and testing
savevsds(DGdds2, "../data/03_DG_vsdtraining.csv")
## 143A-DG-1 143B-DG-1 143D-DG-3 144A-DG-2 144C-DG-2 144D-DG-2
## 0610007P14Rik 7.153718 7.167228 7.569987 7.271891 7.178889 7.395072
## 0610009B22Rik 6.607383 6.495129 6.172178 6.378164 6.648238 6.509850
## 0610009L18Rik 6.268598 6.433471 6.282645 5.904182 6.362953 6.360970
## 145A-DG-2 145B-DG-1 146A-DG-2 146C-DG-4 147C-DG-3 147D-DG-1
## 0610007P14Rik 7.326240 6.920197 7.075780 7.654996 7.065253 7.242216
## 0610009B22Rik 6.756751 6.612735 6.514596 6.858188 6.715943 6.707290
## 0610009L18Rik 6.135124 5.904182 6.514596 5.904182 6.132069 6.548655
## 148A-DG-3 148B-DG-4
## 0610007P14Rik 7.247632 7.131677
## 0610009B22Rik 6.430112 6.349605
## 0610009L18Rik 6.355221 6.219767
savevsds(CA3dds2, "../data/03_CA3_vsdtraining.csv")
## 143A-CA3-1 144A-CA3-2 144B-CA3-1 144C-CA3-2 144D-CA3-2 146A-CA3-2
## 0610007P14Rik 7.159328 7.688310 7.347158 7.097481 6.973766 7.346091
## 0610009B22Rik 6.518149 7.068934 6.600198 7.003315 6.435071 6.284311
## 0610009L18Rik 6.064367 6.746786 6.320031 6.272773 6.320996 6.284311
## 146B-CA3-2 146D-CA3-3 147C-CA3-3 147D-CA3-1 148A-CA3-3 148B-CA3-4
## 0610007P14Rik 6.842126 7.230928 7.244900 6.929800 7.016913 7.377410
## 0610009B22Rik 6.988761 6.533834 6.410424 6.609981 6.600412 6.768893
## 0610009L18Rik 6.021975 6.184369 6.150573 6.155501 6.212184 6.259681
savevsds(CA1dds2, "../data/03_CA1_vsdtraining.csv")
## 143B-CA1-1 143C-CA1-1 143D-CA1-3 144A-CA1-2 144B-CA1-1 144C-CA1-2
## 0610007P14Rik 7.626135 7.205231 7.426464 7.430562 7.477774 7.258673
## 0610009B22Rik 7.063836 6.916634 6.195554 6.965957 6.775834 6.723175
## 0610009L18Rik 6.799392 6.431682 6.195554 6.621009 6.351612 6.389144
## 145A-CA1-2 145B-CA1-1 146A-CA1-2 146B-CA1-2 146C-CA1-4 147C-CA1-3
## 0610007P14Rik 7.492491 7.451376 7.197799 7.077403 7.494980 7.335370
## 0610009B22Rik 6.886631 6.891557 6.923209 6.799601 6.768535 6.772299
## 0610009L18Rik 6.725450 6.509225 6.729555 6.195554 6.844058 6.394538
## 148A-CA1-3
## 0610007P14Rik 7.411631
## 0610009B22Rik 7.012540
## 0610009L18Rik 6.625123
Results to compare with volcano plots
-------------------------------------
print("DG")
## [1] "DG"
res_summary_subfield(DGdds2, c("training", "trained", "yoked"))
## [1] "training" "trained" "yoked"
## [1] 214
##
## out of 17006 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 177, 1%
## LFC < 0 (down) : 37, 0.22%
## outliers [1] : 0, 0%
## low counts [2] : 6929, 41%
## (mean count < 13)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
res_summary_subfield(DGdds, c("treatment", "conflict.trained", "standard.trained"))
## [1] "treatment" "conflict.trained" "standard.trained"
## [1] 0
##
## out of 17011 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 0, 0%
## LFC < 0 (down) : 0, 0%
## outliers [1] : 20, 0.12%
## low counts [2] : 0, 0%
## (mean count < 0)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
res_summary_subfield(DGdds, c("treatment", "conflict.yoked", "standard.yoked"))
## [1] "treatment" "conflict.yoked" "standard.yoked"
## [1] 3
##
## out of 17011 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 3, 0.018%
## LFC < 0 (down) : 0, 0%
## outliers [1] : 20, 0.12%
## low counts [2] : 0, 0%
## (mean count < 0)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
print("CA3")
## [1] "CA3"
res_summary_subfield(CA3dds2, c("training", "trained", "yoked"))
## [1] "training" "trained" "yoked"
## [1] 0
##
## out of 16497 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 0, 0%
## LFC < 0 (down) : 0, 0%
## outliers [1] : 27, 0.16%
## low counts [2] : 5, 0.03%
## (mean count < 0)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
res_summary_subfield(CA3dds, c("treatment", "conflict.trained", "standard.trained"))
## [1] "treatment" "conflict.trained" "standard.trained"
## [1] 0
##
## out of 16502 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 0, 0%
## LFC < 0 (down) : 0, 0%
## outliers [1] : 11, 0.067%
## low counts [2] : 0, 0%
## (mean count < 0)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
res_summary_subfield(CA3dds, c("treatment", "conflict.yoked", "standard.yoked"))
## [1] "treatment" "conflict.yoked" "standard.yoked"
## [1] 2
##
## out of 16502 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 1, 0.0061%
## LFC < 0 (down) : 1, 0.0061%
## outliers [1] : 11, 0.067%
## low counts [2] : 0, 0%
## (mean count < 0)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
print("CA1")
## [1] "CA1"
res_summary_subfield(CA1dds2, c("training", "trained", "yoked"))
## [1] "training" "trained" "yoked"
## [1] 16
##
## out of 16846 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 1, 0.0059%
## LFC < 0 (down) : 15, 0.089%
## outliers [1] : 0, 0%
## low counts [2] : 2619, 16%
## (mean count < 1)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
res_summary_subfield(CA1dds, c("treatment", "conflict.trained", "standard.trained"))
## [1] "treatment" "conflict.trained" "standard.trained"
## [1] 0
##
## out of 16852 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 0, 0%
## LFC < 0 (down) : 0, 0%
## outliers [1] : 32, 0.19%
## low counts [2] : 0, 0%
## (mean count < 0)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
res_summary_subfield(CA1dds, c("treatment", "conflict.yoked", "standard.yoked"))
## [1] "treatment" "conflict.yoked" "standard.yoked"
## [1] 917
##
## out of 16852 with nonzero total read count
## adjusted p-value < 0.1
## LFC > 0 (up) : 545, 3.2%
## LFC < 0 (down) : 372, 2.2%
## outliers [1] : 32, 0.19%
## low counts [2] : 4892, 29%
## (mean count < 5)
## [1] see 'cooksCutoff' argument of ?results
## [2] see 'independentFiltering' argument of ?results
##
## NULL
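Each percentage in these `summary()` blocks is simply the corresponding count divided by the number of genes with a nonzero total read count. As a quick arithmetic check against the CA1 conflict.yoked vs. standard.yoked block above (Python used here as a neutral sketch):

```python
# Counts reported by DESeq2 for the CA1 conflict.yoked vs. standard.yoked contrast
nonzero_genes = 16852
up, down = 545, 372

def pct(n, total=nonzero_genes):
    # percentage of nonzero-count genes, one decimal place
    return round(n / total * 100, 1)

print(f"LFC > 0 (up)   : {up}, {pct(up)}%")     # 545, 3.2% -- matches the summary
print(f"LFC < 0 (down) : {down}, {pct(down)}%") # 372, 2.2% -- matches the summary
```

(DESeq2 itself prints roughly two significant digits, so very small fractions appear as, for example, 0.19% rather than 0.2%.)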
Volcano plots
-------------
# create data frames for making volcano plots
DGa <- calculateDEGs(DGdds, "DG", "treatment", "standard.trained", "standard.yoked")
DGb <- calculateDEGs(DGdds, "DG", "treatment", "conflict.trained", "conflict.yoked")
DGc <- calculateDEGs(DGdds, "DG", "treatment", "conflict.trained", "standard.trained")
DGd <- calculateDEGs(DGdds, "DG", "treatment", "conflict.yoked", "standard.yoked")
DGe <- calculateDEGs(DGdds2, "DG", "training", "trained", "yoked")
CA3a <- calculateDEGs(CA3dds, "CA3", "treatment", "standard.trained", "standard.yoked")
CA3b <- calculateDEGs(CA3dds, "CA3", "treatment", "conflict.trained", "conflict.yoked")
CA3c <- calculateDEGs(CA3dds, "CA3", "treatment", "conflict.trained", "standard.trained")
CA3d <- calculateDEGs(CA3dds, "CA3", "treatment", "conflict.yoked", "standard.yoked")
CA3e <- calculateDEGs(CA3dds2, "CA3", "training", "trained", "yoked")
CA1a <- calculateDEGs(CA1dds, "CA1", "treatment", "standard.trained", "standard.yoked")
CA1b <- calculateDEGs(CA1dds, "CA1", "treatment", "conflict.trained", "conflict.yoked")
CA1c <- calculateDEGs(CA1dds, "CA1", "treatment", "conflict.trained", "standard.trained")
CA1d <- calculateDEGs(CA1dds, "CA1", "treatment", "conflict.yoked", "standard.yoked")
CA1e <- calculateDEGs(CA1dds2, "CA1", "training", "trained", "yoked")
# save df with DEGs
allDEG <- rbind(DGa, DGb, DGc, DGd, DGe,
CA3a, CA3b, CA3c, CA3d, CA3e,
CA1a, CA1b, CA1c, CA1d, CA1e) %>%
dplyr::filter(direction != "NS") %>%
dplyr::mutate(lfc = round(lfc, 2),
padj = scientific(padj, digits = 3),
logpadj = round(logpadj, 2)) %>%
arrange(tissue, comparison, gene)
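Outside the tidyverse, the pipeline above is just filter, round/format, and sort. A plain-Python sketch of the same logic on toy rows (column names as in `allDEG`; `scales::scientific(padj, digits = 3)` is approximated with `"{:.2e}"` formatting):

```python
rows = [
    {"tissue": "DG",  "gene": "Acan",  "lfc": 1.9612,  "padj": 5.25e-10,
     "logpadj": 9.2798, "comparison": "yoked vs. trained", "direction": "trained"},
    {"tissue": "DG",  "gene": "Gm123", "lfc": 0.1234,  "padj": 5.00e-01,
     "logpadj": 0.3010, "comparison": "yoked vs. trained", "direction": "NS"},
    {"tissue": "CA1", "gene": "Chd8",  "lfc": -0.7712, "padj": 8.19e-02,
     "logpadj": 1.0868, "comparison": "conflict.yoked vs. conflict.trained",
     "direction": "conflict.yoked"},
]

kept = [dict(r,
             lfc=round(r["lfc"], 2),           # round(lfc, 2)
             padj=f'{r["padj"]:.2e}',          # ~ scientific(padj, digits = 3)
             logpadj=round(r["logpadj"], 2))   # round(logpadj, 2)
        for r in rows if r["direction"] != "NS"]   # dplyr::filter(direction != "NS")

kept.sort(key=lambda r: (r["tissue"], r["comparison"], r["gene"]))  # arrange(...)
for r in kept:
    print(r["tissue"], r["gene"], r["lfc"], r["padj"], r["logpadj"])
```

Note that sorting here is plain alphabetical; in R, `arrange` on a factor column would sort by factor level order instead.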
PCA analysis and bar plot functions
------------------------------------
a <- plotPCs(DGdds, "DG")
b <- plot.volcano(DGa, "\ns. trained vs s. yoked") + labs(y = "-log10(p-value)")
c <- plot.volcano(DGb, "\nc. trained vs c. yoked")
d <- plot.volcano(DGc, "\ns. trained vs c. trained")
e <- plot.volcano(DGd, "\ns. yoked vs. c. yoked")
f <- plot.volcano(DGe, "\nyoked vs. trained")
g <- plotPCs(CA3dds, "CA3")
h <- plot.volcano(CA3a, " ") + labs(y = "-log10(p-value)")
i <- plot.volcano(CA3b, " ")
j <- plot.volcano(CA3c, " ")
k <- plot.volcano(CA3d, " ")
l <- plot.volcano(CA3e, " ")
m <- plotPCs(CA1dds, "CA1")
n <- plot.volcano(CA1a, " ") + labs(y = "-log10(p-value)")
o <- plot.volcano(CA1b, " ")
p <- plot.volcano(CA1c, " ")
q <- plot.volcano(CA1d, " ")
r <- plot.volcano(CA1e, " ")
legend <- get_legend(a + theme(legend.position = "bottom",
legend.title = element_blank()) +
guides(color = guide_legend(nrow = 2)))
mainplot <- plot_grid(a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,
nrow = 3, rel_widths = c(1,1,0.8,0.8,0.8,0.8),
labels = c("(a)", "(b)", "", "", "", "(c)",
"", "", "", "", "", "",
"", "", "", "", "", ""),
label_size = 8)
## Warning in MASS::cov.trob(data[, vars]): Probable convergence failure
fig3 <- plot_grid(mainplot, legend, ncol = 1, rel_heights = c(1, 0.1))
fig3

Save files
----------
pdf(file="../figures/03_rnaseqSubfield/volcanos.pdf", width=6.69, height=6)
plot(fig3)
dev.off()
## quartz_off_screen
## 2
pdf(file="../figures/fig-3.pdf", width=6.69, height=6)
plot(fig3)
dev.off()
## quartz_off_screen
## 2
suppltable4 <- allDEG %>% filter(tissue == "DG" & comparison == "yoked vs. trained")
head(suppltable4)
## tissue gene lfc padj logpadj comparison direction
## 1 DG 1190002N15Rik 1.78 3.60e-04 3.44 yoked vs. trained trained
## 2 DG 2410002F23Rik -0.51 9.26e-02 1.03 yoked vs. trained yoked
## 3 DG A830010M20Rik 1.47 1.68e-07 6.78 yoked vs. trained trained
## 4 DG Abhd2 0.68 1.33e-02 1.88 yoked vs. trained trained
## 5 DG Acan 1.96 5.25e-10 9.28 yoked vs. trained trained
## 6 DG Adamts1 1.75 1.28e-02 1.89 yoked vs. trained trained
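`logpadj` is just the negative base-10 logarithm of the adjusted p-value (computed before `padj` is rounded for display), which is what the volcano plots put on the y-axis. A quick check against the rows above (Python as a neutral sketch):

```python
from math import log10

# (padj, logpadj) pairs as displayed in suppltable4 above
rows = [(3.60e-04, 3.44), (9.26e-02, 1.03), (1.68e-07, 6.78),
        (1.33e-02, 1.88), (5.25e-10, 9.28), (1.28e-02, 1.89)]

for padj, logpadj in rows:
    # padj is shown to only 3 significant digits, so allow a small tolerance
    assert abs(-log10(padj) - logpadj) < 0.01

print("logpadj == -log10(padj) for every displayed row")
```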
suppltable5 <- allDEG %>% filter(tissue != "DG" & comparison != "yoked vs. trained")
head(suppltable5)
## tissue gene lfc padj logpadj
## 1 CA1 Chd8 -0.77 8.19e-02 1.09
## 2 CA1 D430019H16Rik -0.64 8.19e-02 1.09
## 3 CA1 Fgfr1 -0.66 8.19e-02 1.09
## 4 CA1 Gm20390 2.84 1.06e-03 2.97
## 5 CA1 Itga10 -2.17 8.19e-02 1.09
## 6 CA1 Ppp1r10 -0.86 6.80e-02 1.17
## comparison direction
## 1 conflict.yoked vs. conflict.trained conflict.yoked
## 2 conflict.yoked vs. conflict.trained conflict.yoked
## 3 conflict.yoked vs. conflict.trained conflict.yoked
## 4 conflict.yoked vs. conflict.trained conflict.trained
## 5 conflict.yoked vs. conflict.trained conflict.yoked
## 6 conflict.yoked vs. conflict.trained conflict.yoked
write_csv(suppltable4, "../data/suppltable-4.csv")
write_csv(suppltable5, "../data/suppltable-5.csv")
citation("DESeq2")
##
## Love, M.I., Huber, W., Anders, S. Moderated estimation of fold change
## and dispersion for RNA-seq data with DESeq2 Genome Biology 15(12):550
## (2014)
##
## A BibTeX entry for LaTeX users is
##
## @Article{,
## title = {Moderated estimation of fold change and dispersion for RNA-seq data with DESeq2},
## author = {Michael I. Love and Wolfgang Huber and Simon Anders},
## year = {2014},
## journal = {Genome Biology},
## doi = {10.1186/s13059-014-0550-8},
## volume = {15},
## issue = {12},
## pages = {550},
## }
citation("png")
##
## To cite package 'png' in publications use:
##
## Simon Urbanek (2013). png: Read and write PNG images. R package
## version 0.1-7. https://CRAN.R-project.org/package=png
##
## A BibTeX entry for LaTeX users is
##
## @Manual{,
## title = {png: Read and write PNG images},
## author = {Simon Urbanek},
## year = {2013},
## note = {R package version 0.1-7},
## url = {https://CRAN.R-project.org/package=png},
## }
##
## ATTENTION: This citation information has been auto-generated from the
## package DESCRIPTION file and may need manual editing, see
## 'help("citation")'.
citation("grid")
##
## The 'grid' package is part of R. To cite R in publications use:
##
## R Core Team (2019). R: A language and environment for statistical
## computing. R Foundation for Statistical Computing, Vienna, Austria.
## URL https://www.R-project.org/.
##
## A BibTeX entry for LaTeX users is
##
## @Manual{,
## title = {R: A Language and Environment for Statistical Computing},
## author = {{R Core Team}},
## organization = {R Foundation for Statistical Computing},
## address = {Vienna, Austria},
## year = {2019},
## url = {https://www.R-project.org/},
## }
##
## We have invested a lot of time and effort in creating R, please cite it
## when using it for data analysis. See also 'citation("pkgname")' for
## citing R packages.
citation("BiocParallel")
##
## To cite package 'BiocParallel' in publications use:
##
## Martin Morgan, Valerie Obenchain, Michel Lang, Ryan Thompson and
## Nitesh Turaga (2019). BiocParallel: Bioconductor facilities for
## parallel evaluation. R package version 1.18.0.
## https://github.com/Bioconductor/BiocParallel
##
## A BibTeX entry for LaTeX users is
##
## @Manual{,
## title = {BiocParallel: Bioconductor facilities for parallel evaluation},
## author = {Martin Morgan and Valerie Obenchain and Michel Lang and Ryan Thompson and Nitesh Turaga},
## year = {2019},
## note = {R package version 1.18.0},
## url = {https://github.com/Bioconductor/BiocParallel},
## }
---
layout: post
title: I am relocating to the US.
categories: [personal]
---
------
**Subscribe to my [mailing list](https://mailchi.mp/4eb73720aafe/easyperf), support me on [Patreon](https://www.patreon.com/dendibakh) or by PayPal [donation](https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=TBM3NW8TKTT34¤cy_code=USD&source=url).**
------
I'm about to start a new chapter in my career and life. I'm relocating to the US in a few days. Right now I want to say `Thank you` to Poland and say `Hello` to the US.
For those who know me: don't worry, I'm not changing jobs. I'm still staying with Intel and will be doing basically the same things I do now. But there was a business need for the company to move my project from Poland to the US.
### Thank you Poland!
I spent 4 amazing years in Poland and lived in 2 cities: Wroclaw and Gdansk. Wroclaw has a very strong IT community and in general is a very beautiful, modern city. Gdansk is located close to the sea, so that's definitely an advantage!
Polish people are very polite and kind, and they especially like children. My younger daughter was born in Poland, so I know what I'm talking about :) . In general, I can say that my adaptation in Poland was easy. Also, Russian (my native language) is very similar to Polish, so I quickly started to understand what people on the street were talking about. I've been to many different countries, and I swear that for living I would prefer Poland to many other European countries. I hope to be in Poland again!
### It's needless to say that I learned a lot!
I spent my first 2.5 years at Nokia working on embedded SW for radio frequency modules. Although it was embedded SW, we used all the modern tools like git, cmake, gerrit and the latest C++ standard (C++14 at that time, gcc 5.2).
But I always loved writing fast code and benchmarking it. I learned some of the stuff related to that on my own, for example compiler optimizations and CPU architecture. Whether it was required or not, I tried to benchmark the code that I was writing.
So one opportunity came to me in this regard: a position on the compiler development team at Intel, which I later accepted. While at Intel I really mastered my skills in performance analysis and compiler development. There are a lot of very smart people there whom you can learn from.
{: .center-image-width-30 }
Special `thanks` goes to the [code::dive](http://codedive.pl/) team and the whole of Nokia. For me it was an amazing opportunity to meet world-class experts and have a chat with them over another glass of beer. :)
{: .center-image-width-30 }
{: .center-image-width-30 }
I was planning to give a talk this year and even submitted one, but my relocation broke those plans.
At one such code::dive I met the cool guys from Bochum who organize the [embo++](https://www.embo.io/) conference. Thank you, crew, for accepting [my talk](https://www.youtube.com/watch?v=Lxw3K37OP-w) and letting me give it.
### Speaking of my accomplishments
I would like to mention my blog (which you are reading now). I started it in 2016 and since then I have had `40'000` pageviews and `20'000` unique users, which I consider quite a good achievement. Every day I get around 30-40 pageviews and around 15 new users on my blog. All this gives me the energy to keep writing new articles.
Another thing I'm proud of is that I learned the Polish language. I don't feel uncomfortable speaking or writing Polish. I'm still very far from people who know 8-9 different languages, but I still consider this an advantage that shows diverse knowledge.
**I'm at the beginning of new journey and I'm very excited about that!**
**I can't wait to say `Hello!` to the US, but for now I keep saying `Thank you!` to Poland!**
| 76.647059 | 503 | 0.759785 | eng_Latn | 0.999611 |
f926ec5a4e2f8111440d76ab850fa61092b72861 | 6,995 | md | Markdown | WindowsServerDocs/administration/overview.md | mgreenegit/windowsserverdocs | d5d615fab8b670ec7ae0e83f98f82f0bf42d3c08 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/administration/overview.md | mgreenegit/windowsserverdocs | d5d615fab8b670ec7ae0e83f98f82f0bf42d3c08 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/administration/overview.md | mgreenegit/windowsserverdocs | d5d615fab8b670ec7ae0e83f98f82f0bf42d3c08 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Windows Server management overview
description: Learn about more about the different capabilities and solutions to manage Windows Server.
author: thomasmaurer
ms.author: thmaure
manager: rclaus
ms.reviewer: rclaus
ms.topic: overview
ms.date: 02/24/2022
---
# Windows Server management overview
Windows Server is the platform for building an infrastructure of connected applications, networks, and web services. As a Windows Server administrator, you've probably used many of the native Windows Server Microsoft Management Consoles (MMC) to keep the infrastructure secure and available. As the foundation of many on-premises, hybrid, and cloud native applications, the Windows Server teams have continued to invest in making the management and adminstration of your Windows Server instances easier by offering management tools like Azure Arc, Windows Admin Center, and System Center. These tools are designed to work together, and each have capabilities to meet you where you are in your server management needs.
:::image type="content" source="media/windows-server-management-overview.png" alt-text="Windows Server Management Overview":::
| Azure Arc | Windows Admin Center | System Center |
| --------------- | --------------- | --------------- |
| [Onboard server to Azure Arc](/azure/azure-arc/servers/learn/quick-enable-hybrid-vm) | [Download Windows Admin Center](https://www.microsoft.com/evalcenter/evaluate-windows-admin-center) | [Get System Center](https://www.microsoft.com/system-center)
## Cloud-based management using Azure Arc
Azure Arc-enabled servers enables you to manage your Windows and Linux physical servers and virtual machines hosted outside of Azure, on your corporate network, or other cloud provider. This management experience is designed to be consistent with how you manage native Azure virtual machines. This allows you to manage, govern, and secure your Windows Servers with services such as Azure Policy, Microsoft Defender for Cloud, Azure Monitor, Azure Update Management, and more.
| Details | Description |
| --------------- | --------------- |
| Scope | Limitless scale, Windows Server + Linux server |
| Interface | Web browser, REST APIs, command line tools |
| Disconnected mode | Limited support |
| Focus | At-scale governance, security, and monitoring |
### Example scenarios for Azure Arc
- You want to proactively monitor the OS and workloads running on the machine.
- Manage it using Automation runbooks or solutions like Update Management
- Secure your Windows Server using Microsoft Defender for Cloud.
- Govern your Windows Server machines using Azure Policy Guest Configuration
Learn more about Azure Arc-enabled server at [What is Azure Arc-enabled server](/azure/azure-arc/servers/overview).
## Deep Windows Server and cluster administration with Windows Admin Center
Windows Admin Center is a locally-deployed, browser-based server adminstration tool set that lets you manage your Windows Servers with no Azure or cloud dependency. Windows Admin Center gives you full control over all aspects of your server infrastructure and is particularly useful for managing servers on private networks that are not connected to the Internet.
Windows Admin Center is the modern evolution of "in-box" management tools, like Server Manager and MMC, and complements other management solutions.
| Details | Description |
| --------------- | --------------- |
| Scope | Single Windows Server or hyperconverged clusters |
| Interface | Web browser |
| Disconnected mode | Supported |
| Focus | General server management, role configuration and troubleshooting |
### Example scenarios for Windows Admin Center
- Manage your servers and clusters with modernized versions of familiar tools such as Server Manager.
- Integration with Azure helps you optionally connect your on-premises servers with relevant cloud services in combination with Azure Arc.
- Streamline management of Azure Stack HCI or Windows Server hyperconverged clusters. Use simplified workloads to create and manage VMs, Storage Spaces Direct volumes, Software-Defined Networking and more.
Learn more about Windows Admin Center at [What is Windows Admin Center](../manage/windows-admin-center/understand/what-is.md).
## Datacenter-scale management with System Center
System Center allows you to stay in control of your IT infrastructure across your environment and platforms. It allows you to simplify the deployment, configuration, management, and monitoring of your infrastructure and virtualized software-defined datacenter, while increasing agility and performance.
| Details | Description |
| --------------- | --------------- |
| Scope | Entire datacenter (multi-server management) |
| Interface | GUI and command line tools |
| Disconnected mode | Supported |
| Focus | Day to day operations |
### Example scenarios for System Center
- Operations Manager provides infrastructure monitoring that is flexible and cost-effective, helps ensure the predictable performance and availability of vital applications, and offers comprehensive monitoring for your datacenter and private cloud.
- Data Protection Manager is a robust enterprise backup and recovery system that contributes to your Business Continuity/Disaster Recovery (BCDR) strategy by facilitating the backup and recovery of enterprise data.
- Virtual Machine Manager enables you to configure, manage and transform traditional datacenters, and helping to provide a unified management experience across on-premises and service provider for your virtualization workloads.
Learn more about System Center at [System Center Documentation](/system-center).
## Local management tools
Windows Server includes a number of tools to help you understand your Windows Server environment, manage specific servers, fine-tune performance, troubleshooting, and eventually automate many management tasks.
| Details | Description |
| --------------- | --------------- |
| Scope | Single to few servers |
| Interface | GUI and command line |
| Disconnected mode | Supported |
| Focus | General server management, role configuration and troubleshooting |
### Example scenarios of local management tools
- Server Manager is a management console in Windows Server that helps IT professionals provision and manage both local and remote Windows-based servers.
- All supported versions of Windows and Windows Server have a set of Win32 console commands built in. You can use these to automate tasks by using scripts or scripting tools.
- As a scripting language, PowerShell is commonly used for automating the management of Windows Server and other systems.
- Using Sconfig to manage Windows Server Core.
Learn more about local Windows Server management tools such as:
- [Server Manager](server-manager/server-manager.md)
- [PowerShell](/powershell/scripting/overview)
- [Windows Commands](windows-commands/windows-commands.md)
- [Remote Server Administration Tools](../remote/remote-server-administration-tools.md)
| 66.619048 | 717 | 0.781558 | eng_Latn | 0.990316 |
f92777208314067a767c018014cd02ec6bd57139 | 2,392 | md | Markdown | ApplicationCases/CrazyChristmas/README.md | HMS-Core/hms-ml-demo | 2cc893fb9dc2c198e9a3893941240cb690004057 | [
"Apache-2.0"
] | 288 | 2020-05-22T02:09:54.000Z | 2022-03-31T01:42:55.000Z | ApplicationCases/CrazyChristmas/README.md | FStranieri/hms-ml-demo | c7cd308b6c25d3b981b4fc91436f32adb4c98649 | [
"Apache-2.0"
] | 33 | 2020-05-25T10:19:07.000Z | 2022-03-13T11:37:29.000Z | ApplicationCases/CrazyChristmas/README.md | FStranieri/hms-ml-demo | c7cd308b6c25d3b981b4fc91436f32adb4c98649 | [
"Apache-2.0"
] | 98 | 2020-05-22T02:24:36.000Z | 2022-03-28T11:50:02.000Z | # CrazyChristmas
[](https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-introduction-4)
English | [中文](https://github.com/HMS-Core/hms-ml-demo/blob/master/ApplicationCases/CrazyChristmas/README_ZH.md)
## Table of Contents
* [Introduction](#introduction)
* [Project directory structure](#project-directory-structure)
* [More Scenarios](#more-scenarios)
* [Procedure](#procedure)
* [Supported Environment](#supported-environments)
* [License](#license)
## Introduction
CrazyChristmas uses the hand key point recognition function of HUAWEI ML Kit to control the sled to move to catch falling goods.
Obtain the [DemoAPK](https://h5hosting-drcn.dbankcdn.cn/cch5/AIBussiness-MLKit/christmas/apk_release_christmas_game.apk) for experience.
This demo demonstrates how to use [HUAWEI ML Kit](https://developer.huawei.com/consumer/en/hms/huawei-mlkit) to quickly develop a red envelopes game app. The purpose is to help you experience the hand key point function and integrate HUAWEI ML Kit as soon as possible.
## Project directory structure
CrazyChristmas
|-- com.huawei.mlkit.sample
|-- Activity
|-- GoodsActivity // game page
## More Scenarios
With the hand key point recognition capability provided by HUAWEI ML Kit, you can not only develop CrazyChristmas applets, but also implement various functions, such as:
1. Add special effects to hands.
2. Hand action switching background.
## Procedure
- Preparations
- Add the Huawei Maven repository to the build.gradle file in the root directory of the project.
- Add build dependency on the SDK to the build.gradle file in the app directory.
- Add the hand key point detection model to the manifest.xml file of Android.
- Apply for the camera permission in the manifest.xml file of Android.
- Key steps of code development
- Submit a dynamic permission application.
- Create a hand keypoint analyzer.
- Create a LensEngine.
- Call the lensEngine.run(holder) method to perform hand keypoint recognition to move sled.
## Supported Environments
Devices with Android 4.4 or later are recommended.
## License
The face detection sample of HUAWEI ML Kit has obtained the [Apache 2.0 license](http://www.apache.org/licenses/LICENSE-2.0).
| 46 | 277 | 0.744983 | eng_Latn | 0.911292 |
f927b5c9a0b415eddf44a2a5d27a6db8ff1f2c9f | 9,355 | md | Markdown | articles/iot-pnp/howto-install-pnp-cli.md | span/azure-docs | ed6b7131eaf6f43978482d58ba78ce62a36982d6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/iot-pnp/howto-install-pnp-cli.md | span/azure-docs | ed6b7131eaf6f43978482d58ba78ce62a36982d6 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2017-04-21T17:57:59.000Z | 2017-04-21T17:58:30.000Z | articles/iot-pnp/howto-install-pnp-cli.md | span/azure-docs | ed6b7131eaf6f43978482d58ba78ce62a36982d6 | [
"CC-BY-4.0",
"MIT"
---
title: Use the Azure IoT extension for Azure CLI to interact with IoT Plug and Play Preview devices | Microsoft Docs
description: Install the Azure IoT extension for Azure CLI and use it to interact with the IoT Plug and Play devices connected to my IoT hub.
author: ChrisGMsft
ms.author: chrisgre
ms.date: 12/26/2019
ms.topic: conceptual
ms.service: iot-pnp
services: iot-pnp
ms.custom: mvc
# As a solution developer, I want to use the Azure IoT extension for the Azure CLI to interact with IoT Plug and Play devices connected to an IoT hub to test and verify their behavior.
---
# Install and use the Azure IoT extension for the Azure CLI
[The Azure CLI](https://docs.microsoft.com/cli/azure?view=azure-cli-latest) is an open-source, cross-platform command-line tool for managing Azure resources such as IoT Hub. The Azure CLI is available on Windows, Linux, and macOS, and is pre-installed in the [Azure Cloud Shell](https://shell.azure.com). The Azure CLI lets you manage Azure IoT Hub resources, Device Provisioning Service instances, and linked hubs without installing any extensions.
The Azure IoT extension for the Azure CLI is a command-line tool for interacting with and testing IoT Plug and Play Preview devices. You can use the extension to:
- Connect to a device.
- View the telemetry the device sends.
- Work with device properties.
- Call device commands.
This article shows you how to:
- Install and configure the Azure IoT extension for the Azure CLI.
- Use the extension to interact with and test your devices.
- Use the extension to manage interfaces in the model repository.
## Install Azure IoT extension for the Azure CLI
### Step 1 - Install the Azure CLI
Follow the [installation instructions](https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest) to set up the Azure CLI in your environment. To use all the commands below, your Azure CLI version must be 2.0.73 or above. Use `az --version` to validate.
### Step 2 - Install IoT extension
[The IoT extension readme](https://github.com/Azure/azure-iot-cli-extension) describes several ways to install the extension. The simplest way is to run `az extension add --name azure-iot`. After installation, you can use `az extension list` to validate the currently installed extensions or `az extension show --name azure-iot` to see details about the IoT extension. To remove the extension, you can use `az extension remove --name azure-iot`.
## Use Azure IoT extension for the Azure CLI
### Prerequisites
To sign in to your Azure subscription, run the following command:
```cmd/sh
az login
```
> [!NOTE]
> If you're using the Azure cloud shell, you're automatically signed in and don't need to run the previous command.
To use the Azure IoT extension for the Azure CLI, you need:
- An Azure IoT hub. There are many ways to add an IoT hub to your Azure subscription, such as [Create an IoT hub using the Azure CLI](../iot-hub/iot-hub-create-using-cli.md). You need the IoT hub's connection string to run the Azure IoT extension commands. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
- A device registered in your IoT hub. You can use the following Azure CLI command to register a device; be sure to replace the `{YourIoTHubName}` and `{YourDeviceID}` placeholders with your values:
```cmd/sh
az iot hub device-identity create --hub-name {YourIoTHubName} --device-id {YourDeviceID}
```
- Some commands need the connection string for a company model repository. A model repository for your company is created when you first [onboard to the Azure Certified for IoT portal](howto-onboard-portal.md). A third party might share their model repository connection string with you to give you access to their interfaces and models.
### Interact with a device
You can use the extension to view and interact with IoT Plug and Play devices that are connected to an IoT hub. The extension works with the digital twin that represents the IoT Plug and Play device.
#### List devices and interfaces
List all devices on an IoT Hub:
```cmd/sh
az iot hub device-identity list --hub-name {YourIoTHubName}
```
List all interfaces registered by an IoT Plug and Play device:
```cmd/sh
az iot dt list-interfaces --hub-name {YourIoTHubName} --device-id {YourDeviceID}
```
#### Properties
List all properties and property values for an interface on a device:
```cmd/sh
az iot dt list-properties --hub-name {YourIoTHubName} --device-id {YourDeviceID} --interface {YourInterfaceID} --source private --repo-login "{YourCompanyModelRepoConnectionString}"
```
Set the value of a read-write property:
```cmd/sh
az iot dt update-property --hub-name {YourIoTHubName} --device-id {YourDeviceID} --interface-payload {JSONPayload or FilePath}
```
An example payload file to set the **name** property on the **sensor** interface of a device to **Contoso** looks like the following:
```json
{
"sensor": {
"properties": {
"name": {
"desired": {
"value": "Contoso"
}
}
}
}
}
```
#### Commands
List all commands for an interface on a device:
```cmd/sh
az iot dt list-commands --hub-name {YourIoTHubName} --device-id {YourDeviceID} --interface {YourInterfaceID} --source private --repo-login {YourCompanyModelRepoConnectionString}
```
Without the `--repo-login` parameter, this command uses the public model repository.
Invoke a command:
```cmd/sh
az iot dt invoke-command --hub-name {YourIoTHubName} --device-id {YourDeviceID} --interface {YourInterfaceID} --cn {CommandName} --command-payload {CommandPayload or FilePath}
```
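For example, if the target interface defined a hypothetical `blink` command that accepts an interval in seconds, the command payload file might look like this (the command name and field here are illustrative, not part of any standard interface):

```json
{
  "interval": 5
}
```

You would then pass either the file path or the inline JSON as the `--command-payload` value.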
#### Digital twin events
Monitor all IoT Plug and Play digital twin events from a specific device and interface going to the **$Default** event hub consumer group:
```cmd/sh
az iot dt monitor-events --hub-name {YourIoTHubName} --device-id {YourDeviceID} --interface {YourInterfaceID}
```
Monitor all IoT Plug and Play digital twin events from a specific device and interface going to a specific consumer group:
```cmd/sh
az iot dt monitor-events --hub-name {YourIoTHubName} --device-id {YourDeviceID} --interface {YourInterfaceID} --consumer-group {YourConsumerGroup}
```
### Manage interfaces in a model repository
The following commands use the public IoT Plug and Play model repository. To use a company model repository, add the `--login` argument with your model repository connection string.
List interfaces in the public IoT Plug and Play model repository:
```cmd/sh
az iot pnp interface list
```
Show an interface in the public IoT Plug and Play model repository:
```cmd/sh
az iot pnp interface show --interface {YourInterfaceId}
```
Create an interface in your IoT Plug and Play company model repository:
```cmd/sh
az iot pnp interface create --definition {JSONPayload or FilePath} --login {YourCompanyModelRepoConnectionString}
```
You can't directly create an interface in the public model repository.
Update an interface in your IoT Plug and Play company model repository:
```cmd/sh
az iot pnp interface update --definition {JSONPayload or FilePath} --login {YourCompanyModelRepoConnectionString}
```
You can't directly update an interface in the public model repository.
Publish an interface from your IoT Plug and Play company model repository to the public model repository. This operation makes the interface immutable:
```cmd/sh
az iot pnp interface publish --interface {YourInterfaceID} --login {YourCompanyModelRepoConnectionString}
```
Only Microsoft partners can publish interfaces to the public model repository.
### Manage device capability models in a model repository
The following commands use the public IoT Plug and Play model repository. To use a company model repository, add the `--login` argument with your model repository connection string.
List device capability models in the IoT Plug and Play public model repository:
```cmd/sh
az iot pnp capability-model list
```
Show a device capability model in the IoT Plug and Play public model repository:
```cmd/sh
az iot pnp capability-model show --model {YourModelID}
```
Create a device capability model in an IoT Plug and Play company model repository:
```cmd/sh
az iot pnp capability-model create --definition {JSONPayload or FilePath} --login {YourCompanyModelRepoConnectionString}
```
You can't directly create a model in the public model repository.
Update a device capability model in the IoT Plug and Play company model repository:
```cmd/sh
az iot pnp capability-model update --definition {JSONPayload or FilePath} --login {YourCompanyModelRepoConnectionString}
```
You can't directly update a model in the public model repository.
Publish a device capability model from your IoT Plug and Play company model repository to the public model repository. This operation makes the model immutable:
```cmd/sh
az iot pnp capability-model publish --model {YourModelID} --login {YourCompanyModelRepoConnectionString}
```
Only Microsoft partners can publish models to the public model repository.
## Next steps
In this how-to article, you've learned how to install and use the Azure IoT extension for the Azure CLI to interact with your Plug and Play devices. A suggested next step is to learn how to [Manage models](./howto-manage-models.md).
# react-ak-css-spinners
react spinners library
# Checkup
Check arguments and throw an exception if they are wrong.
## Install
```
npm i chukup --save
```
## How to use?
```js
var check = require('checkup');
function someFn(arg1, arg2, arg3) {
    check({
        arg1: arg1,
        arg2: arg2,
        arg3: arg3
    });
}

function showName(name, callback) {
    check(arguments, ['name'])
        .check(arguments, ['callback'])
        .type('name', name, 'string')
        .type('callback', callback, 'function');

    console.log('everything is ok:', name);
}

function callCallback(callback) {
    check([callback], ['callback'])
        .type('callback', callback, 'function');

    callback();
}
```
## License
MIT
---
title: "New playbook details what it's like to work with 18F Delivery"
date: 2015-11-19
layout: post
authors:
- will
tags:
- how we work
- agency work
- transformation services
- acquisition services
excerpt: "If you or your agency have thought about working with 18F but are unsure of how we work with our partners, we have a new set of guidelines to help you out. The 18F Delivery Partnership Playbook is specifically targeted at federal offices interested in working with 18F to build digital services."
description: "If you or your agency have thought about working with 18F but are unsure of how we work with our partners, we have a new set of guidelines to help you out. The 18F Delivery Partnership Playbook is specifically targeted at federal offices interested in working with 18F to build digital services."
---
If you or your agency have thought about working with 18F but are unsure
of how we work with our partners, we have a new set of guidelines to
help you out. The [18F Delivery Partnership Playbook](https://pages.18f.gov/partnership-playbook/) is specifically
targeted at federal offices interested in working with 18F to build
digital services.
The playbook is based both on our project experience to date and common
questions that come up during our business development process. It lays
out the plays and techniques that we’ve found make for a successful
collaboration between us and our partners. We hope that naming our
aspirations can help better align expectations about how we’ll work
together. Many of the principles below build on specific plays in the
[U.S. Digital Services Playbook](https://playbook.cio.gov/), which some
18F staffers assisted in authoring.
The [eight plays in this playbook](https://pages.18f.gov/partnership-playbook/) are:
1. We build in the open.
2. We work with an empowered product owner.
3. We focus on understanding the problem first.
4. We work in an agile way.
5. We use user-centered research and design methods.
6. We may revisit the project at a high level if there is a major change in project goals.
7. We transfer projects back to your team for ongoing support.
8. We deploy projects using best practice back-end methods and technology.
Under each point, you’ll see explanatory information, specifics on what
the play means for our shared work, and how to know if you are ready to
execute the play.
As [our work](https://18f.gsa.gov/dashboard)
goes on, we will continue to iterate this playbook and other guides to
memorialize the lessons we learn. If you don’t feel prepared to work in
the ways described, [18F Consulting]({{ site.baseurl }}/consulting/)
may be able to help introduce some of the playbook’s ideas through tools
like
[agile](https://18f.gsa.gov/2015/02/11/a-story-of-an-agile-workshop/)
[workshops](https://18f.gsa.gov/2015/08/31/how-playing-with-legos-taught-executives-agile/).
Please let us know how the playbook can be improved by
[opening an issue](https://github.com/18F/partnership-playbook/issues). And if this
sounds like the way you like to work, get in touch at
[inquiries18f@gsa.gov](mailto:inquiries18f@gsa.gov).
---
layout: post
title: "Api Logging"
date: 2020-01-21 20:00:00 +0900
categories: Backend
tag: Spring
---
* content
{:toc}
Request/Response Logging Using AbstractRequestLoggingFilter
## ApiLoggingFilter.java
```
import com.rsupport.commons.core.util.StringUtils;
import lombok.extern.slf4j.Slf4j;
import lombok.val;
import org.springframework.http.MediaType;
import org.springframework.web.filter.AbstractRequestLoggingFilter;
import org.springframework.web.util.ContentCachingRequestWrapper;
import org.springframework.web.util.ContentCachingResponseWrapper;
import org.springframework.web.util.WebUtils;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

@Slf4j
public class ApiLoggingFilter extends AbstractRequestLoggingFilter {

  private static final List<MediaType> LOGGING_TYPE = Arrays.asList(
      MediaType.APPLICATION_FORM_URLENCODED,
      MediaType.APPLICATION_JSON,
      MediaType.APPLICATION_JSON_UTF8,
      MediaType.APPLICATION_XML,
      MediaType.valueOf("application/*+json"),
      MediaType.valueOf("application/*+xml")
  );

  @Override
  protected void beforeRequest(HttpServletRequest httpServletRequest, String s) {
    log.info("{}", s);
  }

  @Override
  protected void afterRequest(HttpServletRequest httpServletRequest, String s) {
    log.info("{}", s);
  }

  private void writeResponse(HttpServletResponse responseToCache) {
    String responseData = getResponseData(responseToCache);
    log.debug("response:{}", responseData);
  }

  @Override
  protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain) throws ServletException, IOException {
    boolean isFirstRequest = !this.isAsyncDispatch(request);
    Object requestToUse = request;
    if (this.isIncludePayload() && isFirstRequest && !(request instanceof ContentCachingRequestWrapper)) {
      requestToUse = new ContentCachingRequestWrapper(request, this.getMaxPayloadLength());
    }

    boolean shouldLog = this.shouldLog((HttpServletRequest) requestToUse);
    if (shouldLog && isFirstRequest) {
      this.beforeRequest((HttpServletRequest) requestToUse, this.getBeforeMessage((HttpServletRequest) requestToUse));
    }

    ContentCachingResponseWrapper responseToCache = new ContentCachingResponseWrapper(response);
    try {
      filterChain.doFilter((ServletRequest) requestToUse, responseToCache);
    } finally {
      if (shouldLog && !this.isAsyncStarted((HttpServletRequest) requestToUse)) {
        this.afterRequest((HttpServletRequest) requestToUse, this.getAfterMessage((HttpServletRequest) requestToUse));
        this.writeResponse(responseToCache);
      }
      responseToCache.copyBodyToResponse();
    }
  }

  @Override
  protected boolean shouldLog(HttpServletRequest request) {
    String contentType = request.getContentType();
    if (StringUtils.isBlank(contentType)) {
      return false;
    }

    val mediaType = MediaType.valueOf(request.getContentType());
    val enabledLoggingType = LOGGING_TYPE.stream().anyMatch(visibleType -> visibleType.includes(mediaType));
    return enabledLoggingType;
  }

  private static String getResponseData(final HttpServletResponse response) {
    String payload = null;
    ContentCachingResponseWrapper wrapper = WebUtils.getNativeResponse(response, ContentCachingResponseWrapper.class);
    if (wrapper != null) {
      byte[] buf = wrapper.getContentAsByteArray();
      if (buf.length > 0) {
        try {
          payload = new String(buf, 0, buf.length, wrapper.getCharacterEncoding());
        } catch (IOException e) {
          // ignored
        }
      }
    }
    return payload;
  }

  private String getBeforeMessage(HttpServletRequest request) {
    String prefix = String.format("before request[%s] [", request.getMethod());
    String suffix = "]";
    return this.createMessage(request, prefix, suffix);
  }

  private String getAfterMessage(HttpServletRequest request) {
    String prefix = String.format("request[%s] [", request.getMethod());
    String suffix = "]";
    return this.createMessage(request, prefix, suffix);
  }
}
```
## WebAppInitializer.java
```
@Slf4j
public class WebAppInitializer extends AbstractWebAppInitializer {

  @Override
  public void onStartup(ServletContext servletContext) throws ServletException {
    super.onStartup(servletContext);
    addListener(servletContext);
    addCharacterEncodingFilter(servletContext);
    addApiLoggingFilter(servletContext);
    ...

  private void addApiLoggingFilter(ServletContext servletContext) {
    ApiLoggingFilter apiLoggingFilter = new ApiLoggingFilter();
    apiLoggingFilter.setIncludePayload(true);
    apiLoggingFilter.setIncludeQueryString(true);
    apiLoggingFilter.setIncludeHeaders(true);
    apiLoggingFilter.setMaxPayloadLength(10000);
    FilterRegistration.Dynamic filter = servletContext.addFilter("apiLoggingFilter", apiLoggingFilter);
    filter.addMappingForUrlPatterns(null, false, "/*");
  }
```
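With the filter registered this way, a request with a JSON body produces log output roughly like the following. This is illustrative: the exact `uri=...;client=...;headers=...;payload=...` layout comes from Spring's `AbstractRequestLoggingFilter.createMessage` together with the include flags set in `addApiLoggingFilter`, and the URI and body shown here are made up. Note the payload only appears in the after-message, because the request body has not been read yet when the before-message is built:

```
before request[POST] [uri=/api/items;client=127.0.0.1;headers={Content-Type:"application/json"}]
request[POST] [uri=/api/items;client=127.0.0.1;headers={Content-Type:"application/json"};payload={"name":"test"}]
response:{"id":1,"name":"test"}
```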
we conventionally divided space into
private and public realms and we know
these legal distinctions very well
because we become experts at protecting
our private property and private space
but we're less attuned to the nuances of
the public what translates generic
public space into qualitative space and
this is something that our studio has
been working on for the past decade and
we're doing this through some case
studies a large chunk of our work has
been put into transforming this
neglected industrial ruin into a viable
post-industrial space that looks forward
and backwards at the same time and
another huge chunk of our work has gone
into making relevant a site that's grown
out of sync with its time we've been
working on democratizing Lincoln Center
for a public that doesn't usually have
300 dollars to spend on an opera ticket
so we've been eating drinking thinking
living public space for quite a long
time and it's taught us really one thing
and that is to truly make a good public
space you have to erase the distinctions
between architecture urbanism landscape
media design and so on it really goes
beyond distinction now we are moving on
to Washington DC and we're working on
another transformation and that is for
the existing Hirshhorn Museum
that's sited on the most revered public
space in America the National Mall the
mall is a symbol of American democracy
and what's fantastic is that this symbol
is not a thing it's not an image it's
not an artifact it's actually it's a
space and it's kind of just defined by
line of buildings on either side it's a
space where citizens can voice their
discontent and show their power it's a
place where pivotal moments in American
history have taken place and they're
inscribed in there forever
like the march on Washington for Jobs
and Freedom and the great speech that
Martin Luther King gave there the
Vietnam protests the commemoration of
all that died in the pandemic of AIDS
the March for women's reproductive
rights right up until almost the present
the mall is the greatest civic stage in
this country for descent and it's
synonymous with free speech even if
you're not sure what it is that you have
to say it may just be a place for civic
and miseration there is a huge
disconnect we believe between the
communicative and discursive space of
the mall and the museums that line it to
either side and that is that those
museums are usually passive they have
passive relationships between the museum
as the presenter and the audience as the
receiver of information and so you can
see dinosaurs and and and insects and
collections of locomotives and and all
of that but you're really not involved
you're being talked to when Richard
Kirsch aaalac took over as director of
the Hirshhorn in 2009 he was determined
to take advantage of the fact that this
museum was sited the most unique place
the seat of power in the US and while
art and politics are inherently and
implicitly
together always and all the time there
could be some very special relationship
that could be forged here in its
uniqueness the question is is it
possible ultimately for art's to insert
itself into the dialogue of national and
world affairs and could the museum be an
agent of cultural diplomacy there are
over a hundred and eighty embassies in
Washington DC there are over five
hundred think tanks there should be a
way of harnessing all of that
intellectual and global energy into and
somehow through the museum there should
be some kind of brain trust
so the Hirshhorn as we began to think
about
and as we evolved the mission with
Richard and his team it's really his
lifeblood
but beyond exhibiting contemporary art
the Hirshhorn will become a public forum
a place of discourse for issues
around arts culture politics and policy
it would have the global reach of the
World Economic Forum it would have the
interdisciplinarity of the TED
Conference it would have kind of the
informality of The Times Square and for
this new initiative the Hirshhorn would
have to expand or appropriate a site
a temporary deployable structure this is
it this is the Hirshhorn so 230 foot
diameter concrete donut design in the
early 70s by Gordon Bunshaft
it's hulking its silent its cloistered
its arrogant it's a design challenge
architects love to hate it a one
redeeming feature is its lifted up off
the ground and it's got this void and
it's God an empty core kind of in the
spirit and that facade very much
corporate and federal style and around
that space the ring is actually
galleries very very difficult to mount
shows in there when the Hirshhorn opened
it'll always huxtable the New York Times
critic had some choice words neo
penitentiary modern a maimed monument
and and maimed mall for a maimed
collection almost four decades later how
would this building expand for a new
progressive program where would it go
can't go in the mall there is no space
there it can't go in the courtyard it's
already taken up by landscape and by
sculptures oh there's always the hole
but how could it take the space of that
hole and not be buried in it and
invisibly how could it become iconic and
what language would it take the
Hirshhorn sits among the most monumental
institutions most are neoclassical heavy
and opaque made of stone or concrete and
question is well if one inhabits that
space what is the material
of the mall it has to be different from
the buildings there it has to be
something entirely different it has to
be air in our imagination it has to be
light it has to be ephemeral it has to
be formless and it has to be free
so this is the big idea
it's a giant air bag the expansion takes
the shape of its container and it oozes
out wherever it can, the top and sides, but
more poetically we like to think of the
structure as inhaling the Democratic air
of the mall bringing it into itself the
before and the after it was dubbed the
bubble by the press that was the lounge
it's basically one big volume of air
that just oozes out in every direction
the membrane is translucent it's made of
silicone coated glass fiber and it's
inflated twice a year for one month at a
time this is the view from the inside so
you might have been wondering how in the
world did we get this approved by the
federal government I mean it was it had
to be approved by actually two agencies
and and it was and one is there to
preserve the dignity and sanctity of
them all
I blush whenever I show this it is yours
to interpret but one thing I could say
is that it's a combination of iconoclasm
and adoration there was also some
creative interpretation involved the
Congressional Buildings Act of 1910
limits the height of buildings in DC to
130 feet except for spires towers domes
and minarets there's pretty much exempts
monuments of the church and state and
the bubble is 153 feet that's the
Pantheon next to it it's about 1.2
million cubic feet of compressed air and
so we argued it on the merits of being a
dome so there it is very stately among
all these stately buildings in the mall
and while this Hirshhorn is not
landmarked it's very very historically
sensitive and so we couldn't really
touch its services we couldn't leave any
traces behind
so we strained it from the edges and we
held it by cables it's a study of some
bondage techniques which are actually
very very important because it's hit by
wind all the time there's one permanent
steel ring at the top but it can't be
seen from any vantage point on the
mall there are also some restrictions about
how much it could be lit it glows from
within it's translucent but it can't be
more lit than the Capitol or some of the
monuments so it's down the hierarchy on
lighting so it comes to the side twice a
year it's taken off the delivery truck
its hoisted and then it's inflated with
this low pressure air and then it's
restrained with the cables and then it's
ballasted with water at the very bottom
this is a converse strange moment we
were asked by the bureaucracy at the
mall how much time would it take to
install and we said well the first
direction would take one week the very
first and they really connected with
that idea and and then it was really
easy all the way through but so we
didn't really have that many hurdles I
have to say with the with the government
and all the authorities but some of the
toughest hurdles have been the technical
ones this is the warp and weft this is a
point cloud there are extreme pressures
this is a very very unusual building in
that there's no gravity load but there's
load in every direction and I'm just
gonna zip through these slides and this
is the space in action so flexible
interior for discussions just like this
but in the round luminous and
reconfigurable could be used for
anything for performances films for
installations and the very first program
will be one of cultural dialogue and
diplomacy organized in partnership with
the Council on Foreign Relations form
and content are together here the bubble
is an anti-monument the ideals of
participatory democracy are represented
through suppleness rather than rigidity
art and politics occupy an ambiguous
site outside the museum walls but inside
of the museum's core blending itself
with the democratic air of the mall and
the bubble will inflate hopefully for
the first time at the end of 2013
thank you
# Precompiled default shaders for openFrameworks VK Renderer
The shaders in this folder are used internally by the VK renderer.
We provide shaders as GLSL source and as SPIR-V, and the source and
SPIR-V must be kept in sync.
To compile shaders we use glslc, which is the front-end for
standalone shaderc:
    glslc default.vert default.frag -c -mfmt=num
Note the '-mfmt=num' parameter - this stores spirv in text form
as a list of uint32_t literals.
We compile shaders into spir-v because it accelerates load times,
and also allows us to inline shader code as binary blobs into
applications:
    // Include shader source into binary
    static const std::vector<uint32_t> myShader {
    #include "myshadersource.spv"
    };
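To make the inlining concrete, here is a small self-contained sketch of the same pattern with the `#include` replaced by two hard-coded words. The values are illustrative stand-ins: a real module produced with `-mfmt=num` contains the full SPIR-V binary, which always begins with the magic number `0x07230203`.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Stand-in for '#include "myshadersource.spv"': with -mfmt=num, glslc
// writes the module as comma-separated uint32_t literals, so the generated
// file can be dropped straight into an initializer list like this one.
static const std::vector<uint32_t> myShader {
    0x07230203, // SPIR-V magic number
    0x00010000, // version word (SPIR-V 1.0), illustrative
};
```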
* [HTML5 Game Engines](https://github.com/salledhali/release-based-workflow)
| 13.5 | 76 | 0.740741 | zul_Latn | 0.194201 |
f929e31afa0fd57c34be0a4231ec2487fb6f18a8 | 5,084 | md | Markdown | articles/lab-services/devtest-lab-grant-user-permissions-to-specific-lab-policies.md | derekbekoe/azure-docs | e3342cf3e78dc903ad3cb122bada170b5dd6a9d9 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-07-18T11:30:47.000Z | 2019-07-18T11:30:47.000Z | articles/lab-services/devtest-lab-grant-user-permissions-to-specific-lab-policies.md | derekbekoe/azure-docs | e3342cf3e78dc903ad3cb122bada170b5dd6a9d9 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2018-10-09T03:19:23.000Z | 2018-10-09T03:19:23.000Z | articles/lab-services/devtest-lab-grant-user-permissions-to-specific-lab-policies.md | derekbekoe/azure-docs | e3342cf3e78dc903ad3cb122bada170b5dd6a9d9 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-06-20T19:42:01.000Z | 2021-06-20T19:42:01.000Z | ---
title: Grant user permissions to specific lab policies | Microsoft Docs
description: Learn how to grant user permissions to specific lab policies in DevTest Labs based on each user's needs
services: devtest-lab,virtual-machines,lab-services
documentationcenter: na
author: spelluru
manager: femila
editor: ''
ms.assetid: 5ca829f0-eb69-40a1-ae26-03a629db1d7e
ms.service: lab-services
ms.workload: na
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 04/17/2018
ms.author: spelluru
---
# Grant user permissions to specific lab policies
## Overview
This article illustrates how to use PowerShell to grant users permissions to a particular lab policy. That way, permissions can be applied based on each user's needs. For example, you might want to grant a particular user the ability to change the VM policy settings, but not the cost policies.
## Policies as resources
As discussed in the [Azure Role-based Access Control](../role-based-access-control/role-assignments-portal.md) article, RBAC enables fine-grained access management of resources for Azure. Using RBAC, you can segregate duties within your DevOps team and grant only the amount of access to users that they need to perform their jobs.
In DevTest Labs, a policy is a resource type that enables the RBAC action **Microsoft.DevTestLab/labs/policySets/policies/**. Each lab policy is a resource in the Policy resource type, and can be assigned as a scope to an RBAC role.
For example, in order to grant users read/write permission to the **Allowed VM Sizes** policy, you would create a custom role that works with the **Microsoft.DevTestLab/labs/policySets/policies/*** action, and then assign the appropriate users to this custom role in the scope of **Microsoft.DevTestLab/labs/policySets/policies/AllowedVmSizesInLab**.
To learn more about custom roles in RBAC, see the [Custom roles access control](../role-based-access-control/custom-roles.md).
## Creating a lab custom role using PowerShell
In order to get started, you’ll need to read the following article, which will explain how to install and configure the Azure PowerShell cmdlets: [https://azure.microsoft.com/blog/azps-1-0-pre](https://azure.microsoft.com/blog/azps-1-0-pre).
Once you’ve set up the Azure PowerShell cmdlets, you can perform the following tasks:
* List all the operations/actions for a resource provider
* List actions in a particular role:
* Create a custom role
The following PowerShell script illustrates examples of how to perform these tasks:
‘List all the operations/actions for a resource provider.
Get-AzureRmProviderOperation -OperationSearchString "Microsoft.DevTestLab/*"
‘List actions in a particular role.
(Get-AzureRmRoleDefinition "DevTest Labs User").Actions
‘Create custom role.
$policyRoleDef = (Get-AzureRmRoleDefinition "DevTest Labs User")
$policyRoleDef.Id = $null
$policyRoleDef.Name = "Policy Contributor"
$policyRoleDef.IsCustom = $true
$policyRoleDef.AssignableScopes.Clear()
$policyRoleDef.AssignableScopes.Add("/subscriptions/<SubscriptionID> ")
$policyRoleDef.Actions.Add("Microsoft.DevTestLab/labs/policySets/policies/*")
$policyRoleDef = (New-AzureRmRoleDefinition -Role $policyRoleDef)
## Assigning permissions to a user for a specific policy using custom roles
Once you’ve defined your custom roles, you can assign them to users. In order to assign a custom role to a user, you must first obtain the **ObjectId** representing that user. To do that, use the **Get-AzureRmADUser** cmdlet.
In the following example, the **ObjectId** of the *SomeUser* user is 05DEFF7B-0AC3-4ABF-B74D-6A72CD5BF3F3.
PS C:\>Get-AzureRmADUser -SearchString "SomeUser"
DisplayName Type ObjectId
----------- ---- --------
someuser@hotmail.com 05DEFF7B-0AC3-4ABF-B74D-6A72CD5BF3F3
Once you have the **ObjectId** for the user and a custom role name, you can assign that role to the user with the **New-AzureRmRoleAssignment** cmdlet:
PS C:\>New-AzureRmRoleAssignment -ObjectId 05DEFF7B-0AC3-4ABF-B74D-6A72CD5BF3F3 -RoleDefinitionName "Policy Contributor" -Scope /subscriptions/<SubscriptionID>/resourceGroups/<ResourceGroupName>/providers/Microsoft.DevTestLab/labs/<LabName>/policySets/default/policies/AllowedVmSizesInLab
In the previous example, the **AllowedVmSizesInLab** policy is used. You can use any of the following polices:
* MaxVmsAllowedPerUser
* MaxVmsAllowedPerLab
* AllowedVmSizesInLab
* LabVmsShutdown
[!INCLUDE [devtest-lab-try-it-out](../../includes/devtest-lab-try-it-out.md)]
## Next steps
Once you've granted user permissions to specific lab policies, here are some next steps to consider:
* [Secure access to a lab](devtest-lab-add-devtest-user.md)
* [Set lab policies](devtest-lab-set-lab-policy.md)
* [Create a lab template](devtest-lab-create-template.md)
* [Create custom artifacts for your VMs](devtest-lab-artifact-author.md)
* [Add a VM to a lab](devtest-lab-add-vm.md)
| 54.666667 | 350 | 0.755311 | eng_Latn | 0.929299 |
f92a61d9d5b6482ef98e410776a94f670f1267a3 | 21,004 | md | Markdown | sccm-ps/ConfigurationManager/New-CMApplicationDeployment.md | spinydelta/sccm-docs-powershell-ref | 3b3694e20bc24093693e3fdd4300614d3abac568 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sccm-ps/ConfigurationManager/New-CMApplicationDeployment.md | spinydelta/sccm-docs-powershell-ref | 3b3694e20bc24093693e3fdd4300614d3abac568 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sccm-ps/ConfigurationManager/New-CMApplicationDeployment.md | spinydelta/sccm-docs-powershell-ref | 3b3694e20bc24093693e3fdd4300614d3abac568 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
description: Create an application deployment.
external help file: AdminUI.PS.dll-Help.xml
Module Name: ConfigurationManager
ms.date: 05/24/2020
schema: 2.0.0
title: New-CMApplicationDeployment
---
# New-CMApplicationDeployment
## SYNOPSIS
Create an application deployment.
## SYNTAX
### SearchByValueMandatory (Default)
```
New-CMApplicationDeployment [-AllowRepairApp <Boolean>] [-ApprovalRequired <Boolean>]
[-AutoCloseExecutable <Boolean>] [-DeadlineDateTime <DateTime>] [-DeployAction <DeployActionType>]
[-DeployPurpose <DeployPurposeType>] [-DisableContentDependencyDetection] [-EnableMomAlert <Boolean>]
[-EnableSoftDeadline <Boolean>] [-FailParameterValue <Int32>] [-GenerateScomAlertOnFailure <Boolean>]
[-InputObject] <IResultObject> [-OverrideServiceWindow <Boolean>] [-PostponeDateTime <DateTime>]
[-PreDeploy <Boolean>] [-RebootOutsideServiceWindow <Boolean>] [-ReplaceToastNotificationWithDialog <Boolean>]
[-Simulation] [-SuccessParameterValue <Int32>] [-TimeBaseOn <TimeType>] [-UpdateSupersedence <Boolean>]
[-UserNotification <UserNotificationType>] [-DistributeCollectionName <String>] [-DistributeContent]
[-DistributionPointGroupName <String>] [-DistributionPointName <String>] [-AvailableDateTime <DateTime>]
[-Comment <String>] [-PersistOnWriteFilterDevice <Boolean>] [-SendWakeupPacket <Boolean>]
[-UseMeteredNetwork <Boolean>] [-Collection <IResultObject>] [-CollectionId <String>]
[-CollectionName <String>] [-DisableWildcardHandling] [-ForceWildcardHandling] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
### SearchByIdMandatory
```
New-CMApplicationDeployment [-AllowRepairApp <Boolean>] [-ApprovalRequired <Boolean>]
[-AutoCloseExecutable <Boolean>] [-DeadlineDateTime <DateTime>] [-DeployAction <DeployActionType>]
[-DeployPurpose <DeployPurposeType>] [-DisableContentDependencyDetection] [-EnableMomAlert <Boolean>]
[-EnableSoftDeadline <Boolean>] [-FailParameterValue <Int32>] [-GenerateScomAlertOnFailure <Boolean>]
[-Id] <Int32> [-OverrideServiceWindow <Boolean>] [-PostponeDateTime <DateTime>] [-PreDeploy <Boolean>]
[-RebootOutsideServiceWindow <Boolean>] [-ReplaceToastNotificationWithDialog <Boolean>] [-Simulation]
[-SuccessParameterValue <Int32>] [-TimeBaseOn <TimeType>] [-UpdateSupersedence <Boolean>]
[-UserNotification <UserNotificationType>] [-DistributeCollectionName <String>] [-DistributeContent]
[-DistributionPointGroupName <String>] [-DistributionPointName <String>] [-AvailableDateTime <DateTime>]
[-Comment <String>] [-PersistOnWriteFilterDevice <Boolean>] [-SendWakeupPacket <Boolean>]
[-UseMeteredNetwork <Boolean>] [-Collection <IResultObject>] [-CollectionId <String>]
[-CollectionName <String>] [-DisableWildcardHandling] [-ForceWildcardHandling] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
### SearchByNameMandatory
```
New-CMApplicationDeployment [-AllowRepairApp <Boolean>] [-ApprovalRequired <Boolean>]
[-AutoCloseExecutable <Boolean>] [-DeadlineDateTime <DateTime>] [-DeployAction <DeployActionType>]
[-DeployPurpose <DeployPurposeType>] [-DisableContentDependencyDetection] [-EnableMomAlert <Boolean>]
[-EnableSoftDeadline <Boolean>] [-FailParameterValue <Int32>] [-GenerateScomAlertOnFailure <Boolean>]
[-Name] <String> [-OverrideServiceWindow <Boolean>] [-PostponeDateTime <DateTime>] [-PreDeploy <Boolean>]
[-RebootOutsideServiceWindow <Boolean>] [-ReplaceToastNotificationWithDialog <Boolean>] [-Simulation]
[-SuccessParameterValue <Int32>] [-TimeBaseOn <TimeType>] [-UpdateSupersedence <Boolean>]
[-UserNotification <UserNotificationType>] [-DistributeCollectionName <String>] [-DistributeContent]
[-DistributionPointGroupName <String>] [-DistributionPointName <String>] [-AvailableDateTime <DateTime>]
[-Comment <String>] [-PersistOnWriteFilterDevice <Boolean>] [-SendWakeupPacket <Boolean>]
[-UseMeteredNetwork <Boolean>] [-Collection <IResultObject>] [-CollectionId <String>]
[-CollectionName <String>] [-DisableWildcardHandling] [-ForceWildcardHandling] [-WhatIf] [-Confirm]
[<CommonParameters>]
```
## DESCRIPTION
The **New-CMApplicationDeployment** cmdlet creates an application deployment. For more information, see [Deploy applications with Configuration Manager](/mem/configmgr/apps/deploy-use/deploy-applications).
> [!NOTE]
> Run Configuration Manager cmdlets from the Configuration Manager site drive, for example `PS XYZ:\>`. For more information, see [getting started](/powershell/sccm/overview).
## EXAMPLES
### Example 1: Install an application
This command creates a new deployment for **Visual Studio 2019** to the collection **Developers Workstation**. It installs the app, and is required. Both the available date and deadline are the same time in the past, so as soon as the client receives this policy, it installs the app.
```powershell
New-CMApplicationDeployment -Name "Visual Studio 2019" -AvailableDateTime '01/01/2020 00:00:00' -CollectionName 'Developers Workstation' -DeadlineDateTime '01/01/2020 00:00:00' -DeployAction Install -DeployPurpose Required
```
## PARAMETERS
### -AllowRepairApp
Applies to version 2002 and later. Use this parameter to configure the repair application option when creating a deployment for an application.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases: AllowUserRepairApplication
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ApprovalRequired
If you set this parameter to `$true`, an administrator must approve a request for this application on the device.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases: AppRequiresApproval
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -AvailableDateTime
Specify a **DateTime** object for when this deployment is _available_. To get this object, use the [Get-Date](/powershell/module/microsoft.powershell.utility/get-date) built-in cmdlet.
Use **DeadlineDateTime** to specify the deployment assignment, or _deadline_.
```yaml
Type: DateTime
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Collection
Specify a collection object to which the application is deployed. To get this object, use the [Get-CMCollection](Get-CMCollection.md) cmdlet.
```yaml
Type: IResultObject
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -CollectionId
Specify the ID of the collection to which this application is deployed. For example, `"SMS00004"`.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -CollectionName
Specify the name of the collection to which this application is deployed.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: True
```
### -Comment
Specify an optional comment for this deployment.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Confirm
Prompts you for confirmation before running the cmdlet.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: cf
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DeadlineDateTime
Specify a **DateTime** object for when this deployment is assigned, also known as the _deadline_. To get this object, use the [Get-Date](/powershell/module/microsoft.powershell.utility/get-date) built-in cmdlet.
Use **-AvailableDateTime** to specify when the deployment is _available_.
```yaml
Type: DateTime
Parameter Sets: (All)
Aliases: SupersedenceDeadlineDateTime
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DeployAction
Specify the deployment action, either to install or uninstall the application. If competing deployments target the same device, the **Install** action takes priority.
```yaml
Type: DeployActionType
Parameter Sets: (All)
Aliases:
Accepted values: Install, Uninstall
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DeployPurpose
Specify the deployment purpose:
- `Available`: The user sees the application in Software Center. They can install it on demand.
- `Required`: The client automatically installs the app according to the schedule that you set. If the application isn't hidden, a user can track its deployment status. They can also use Software Center to install the application before the deadline.
```yaml
Type: DeployPurposeType
Parameter Sets: (All)
Aliases:
Accepted values: Available, Required
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DisableContentDependencyDetection
Add this parameter to not automatically distribute content for dependent apps.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: DisableDetectAssociatedContentDependencies
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DisableWildcardHandling
This parameter treats wildcard characters as literal character values. You can't combine it with **ForceWildcardHandling**.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DistributeCollectionName
The site distributes content to the distribution points that are associated with this collection name.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DistributeContent
Add this parameter if you need to distribute the app content first.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DistributionPointGroupName
To distribute the application content, specify the name of a distribution point group.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DistributionPointName
To distribute the application content, specify the name of a distribution point.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -EnableMomAlert
Set this parameter to `$true` to enable System Center Operations Manager maintenance mode for this deployment.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -EnableSoftDeadline
Set this parameter to `$true` to enable delayed enforcement.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -FailParameterValue
Specifies the percentage of failed application installation that causes an alert.
Specify an integer from 1 through 100.
To enable this alert, set the **CreatAlertBaseOnPercentFailure** parameter to `$True`.
```yaml
Type: Int32
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ForceWildcardHandling
This parameter processes wildcard characters and may lead to unexpected behavior (not recommended). You can't combine it with **DisableWildcardHandling**.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -GenerateScomAlertOnFailure
Indicates whether to create an Operations Manager alert if a client fails to install the application.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases: RaiseMomAlertsOnFailure
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Id
Specify the ID of the application to deploy.
```yaml
Type: Int32
Parameter Sets: SearchByIdMandatory
Aliases: CIId, CI_ID, ApplicationId
Required: True
Position: 0
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -InputObject
Specify an application object to deploy. To get this object, use the [Get-CMApplication](Get-CMApplication.md) cmdlet.
```yaml
Type: IResultObject
Parameter Sets: SearchByValueMandatory
Aliases: Application
Required: True
Position: 0
Default value: None
Accept pipeline input: True (ByValue)
Accept wildcard characters: False
```
### -Name
Specify the name of the application to deploy.
```yaml
Type: String
Parameter Sets: SearchByNameMandatory
Aliases: LocalizedDisplayName, ApplicationName
Required: True
Position: 0
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -OverrideServiceWindow
Indicates whether the deployment takes place even if scheduled outside of a maintenance window.
A maintenance window is a specified period of time used for computer maintenance and updates.
If this value is `$True`, Configuration Manager deploys the application even if the scheduled time falls outside the maintenance window.
If this value is `$False`, Configuration Manager doesn't deploy the application outside the window. It waits until it can deploy in an available window.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PersistOnWriteFilterDevice
Indicates whether to enable write filters for embedded devices.
For a value of `$True`, the device commits changes during a maintenance window. This action requires a restart.
For a value of `$False`, the device saves changes in an overlay and commits them later.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PostponeDateTime
When you set **CreateAlertBaseOnPercentSuccess** to `$true`, use this parameter to specify a **DateTime** object. Configuration Manager creates a deployment alert when the threshold is lower than the **SuccessParameterValue** after this date.
To get this object, use the [Get-Date](/powershell/module/microsoft.powershell.utility/get-date) built-in cmdlet.
```yaml
Type: DateTime
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PreDeploy
Indicates whether to pre-deploy the application to the primary device of the user.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -RebootOutsideServiceWindow
Indicates whether a computer restarts outside a maintenance window.
A maintenance window is a specified period of time used for computer maintenance and updates.
If this value is `$True`, any required restart takes place without regard to maintenance windows.
If this value is `$False`, the computer doesn't restart outside a maintenance window.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ReplaceToastNotificationWithDialog
When required software is available on the client, set this parameter to `$true` to replace the default toast notifications with a dialog window. It's false by default. For more information, see [Replace toast notifications with dialog window](/mem/configmgr/apps/plan-design/plan-for-software-center#bkmk_impact).
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -SendWakeupPacket
Indicates whether to send a wake-up packet to computers before the deployment begins.
If this value is `$True`, Configuration Manager attempts to wake a computer from sleep.
If this value is `$False`, it doesn't wake computers from sleep.
For computers to wake, you must first configure Wake On LAN.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Simulation
Add this parameter to create a deployment simulation. For more information, see [Simulate application deployments with Configuration Manager](/mem/configmgr/apps/deploy-use/simulate-application-deployments).
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -SuccessParameterValue
Specifies the percentage of successful application installation that causes an alert.
Specify an integer from 0 through 99.
To enable this alert, set the **CreateAlertBaseOnPercentSuccess** parameter as `$True`.
```yaml
Type: Int32
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TimeBaseOn
Specifies which time zone to use:
- `LocalTime`: Use local time.
- `UTC`: Use Coordinated Universal Time (UTC).
```yaml
Type: TimeType
Parameter Sets: (All)
Aliases:
Accepted values: LocalTime, Utc
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -UpdateSupersedence
For an available deployment, use this parameter to specify the installation deadline to upgrade users or devices that have the superseded application installed. Use **DeadlineDateTime** to specify a specific time, otherwise it's as soon as possible after the **AvailableDateTime**.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -UseMeteredNetwork
Indicates whether to allow clients to download content over a metered internet connection after the deadline, which may incur extra expense.
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -UserNotification
Specifies the type of user notification.
- `DisplayAll`: Display in Software Center and show all notifications.
- `DisplaySoftwareCenterOnly`: Display in Software Center, and only show notifications of computer restarts.
- `HideAll`: Hide in Software Center and all notifications.
```yaml
Type: UserNotificationType
Parameter Sets: (All)
Aliases:
Accepted values: DisplayAll, DisplaySoftwareCenterOnly, HideAll
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -WhatIf
Shows what would happen if the cmdlet runs. The cmdlet doesn't run.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: wi
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -AutoCloseExecutable
{{ Fill AutoCloseExecutable Description }}
```yaml
Type: Boolean
Parameter Sets: (All)
Aliases: AutoCloseExeOnInstallBehavior
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
### Microsoft.ConfigurationManagement.ManagementProvider.IResultObject
## OUTPUTS
### System.Object
## NOTES
## RELATED LINKS
[Get-CMApplication](Get-CMApplication.md)
[Get-CMApplicationDeployment](Get-CMApplicationDeployment.md)
[Remove-CMApplicationDeployment](Remove-CMApplicationDeployment.md)
[Set-CMApplicationDeployment](Set-CMApplicationDeployment.md)
[Deploy applications with Configuration Manager](/mem/configmgr/apps/deploy-use/deploy-applications)
| 26.756688 | 315 | 0.782994 | eng_Latn | 0.758469 |
f92a935c4e9b885ce50d705fc83020b7195b9d6a | 3,290 | md | Markdown | docs/visual-basic/language-reference/statements/return-statement.md | olifantix/docs.de-de | a31a14cdc3967b64f434a2055f7de6bf1bb3cda8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/statements/return-statement.md | olifantix/docs.de-de | a31a14cdc3967b64f434a2055f7de6bf1bb3cda8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/visual-basic/language-reference/statements/return-statement.md | olifantix/docs.de-de | a31a14cdc3967b64f434a2055f7de6bf1bb3cda8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Return-Anweisung (Visual Basic)
ms.date: 07/20/2015
f1_keywords:
- vb.Return
helpviewer_keywords:
- Return statement [Visual Basic], syntax
- control flow [Visual Basic], returning control to expressions
- Return statement [Visual Basic]
- expressions [Visual Basic], returning control to
ms.assetid: ac86e7f0-5a67-42c3-9834-0e0381efa3ec
ms.openlocfilehash: 2f614045be1b91b9c747d961cdefd526ba1bab98
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 05/04/2018
---
# <a name="return-statement-visual-basic"></a>Return-Anweisung (Visual Basic)
Die Steuerung an den Code, aufgerufen zurückgegeben eine `Function`, `Sub`, `Get`, `Set`, oder `Operator` Prozedur.
## <a name="syntax"></a>Syntax
```
Return
-or-
Return expression
```
## <a name="part"></a>Segment
`expression`
Erforderlich einer `Function`, `Get`, oder `Operator` Prozedur. Der Ausdruck den Wert an den aufrufenden Code zurückgegeben werden.
## <a name="remarks"></a>Hinweise
In einer `Sub` oder `Set` Prozedur, die `Return` Anweisung entspricht einer `Exit Sub` oder `Exit Property` -Anweisung und `expression` muss nicht angegeben werden.
In einem `Function`, `Get`, oder `Operator` Prozedur, die `Return` -Anweisung muss enthalten `expression`, und `expression` muss in einen Datentyp, der in den Rückgabetyp der Prozedur konvertiert werden ausgewertet. In einem `Function` oder `Get` Prozedur, Sie haben auch die Alternative einen Ausdruck zuweisen, um den Namen der Prozedur, die als Rückgabewert dient, und klicken Sie dann Ausführen einer `Exit Function` oder `Exit Property` Anweisung. In einer `Operator` Verfahren verwenden Sie `Return``expression`.
Sie können beliebig viele einschließen `Return` Anweisungen nach Bedarf in der gleichen Prozedur.
> [!NOTE]
> Der Code in eine `Finally` Block ausgeführt wird, nachdem ein `Return` -Anweisung in einer `Try` oder `Catch` Block ist aufgetreten, aber vor, `Return` -Anweisung ausführt. Ein `Return` -Anweisung kann nicht eingefügt werden, einem `Finally` Block.
## <a name="example"></a>Beispiel
Im folgenden Beispiel wird die `Return` Anweisung mehrmals an den aufrufenden Code zurückgegeben werden soll, wenn die Prozedur keinen nichts weiter tun.
[!code-vb[VbVbalrStatements#53](../../../visual-basic/language-reference/error-messages/codesnippet/VisualBasic/return-statement_1.vb)]
## <a name="see-also"></a>Siehe auch
[Function-Anweisung](../../../visual-basic/language-reference/statements/function-statement.md)
[Sub-Anweisung](../../../visual-basic/language-reference/statements/sub-statement.md)
[Get-Anweisung](../../../visual-basic/language-reference/statements/get-statement.md)
[Set-Anweisung](../../../visual-basic/language-reference/statements/set-statement.md)
[Operator-Anweisung](../../../visual-basic/language-reference/statements/operator-statement.md)
[Property-Anweisung](../../../visual-basic/language-reference/statements/property-statement.md)
[Exit-Anweisung](../../../visual-basic/language-reference/statements/exit-statement.md)
[Try...Catch...Finally-Anweisung](../../../visual-basic/language-reference/statements/try-catch-finally-statement.md)
| 57.719298 | 521 | 0.742857 | deu_Latn | 0.877014 |
f92abcd29ab1d5bb17bb64fcfdc7599ccc890307 | 6,578 | md | Markdown | en/functions/operations/trigger/trigger-update.md | batFormat/docs | 2414d9c5fa3cb2b2f7601082fd378e21575439ce | [
"CC-BY-4.0"
] | 117 | 2018-12-29T10:20:17.000Z | 2022-03-30T12:30:13.000Z | en/functions/operations/trigger/trigger-update.md | batFormat/docs | 2414d9c5fa3cb2b2f7601082fd378e21575439ce | [
"CC-BY-4.0"
] | 205 | 2018-12-29T14:58:45.000Z | 2022-03-30T21:47:12.000Z | en/functions/operations/trigger/trigger-update.md | batFormat/docs | 2414d9c5fa3cb2b2f7601082fd378e21575439ce | [
"CC-BY-4.0"
] | 393 | 2018-12-26T16:53:47.000Z | 2022-03-31T17:33:48.000Z | # Updating a trigger
You can change the [name](#update-name) and [description](#update-description) of a trigger and [manage trigger labels](#manage-label).
{% include [trigger-list-note](../../../_includes/functions/trigger-list-note.md) %}
## Changing the name of a trigger {#update-name}
{% list tabs %}
- Management console
1. In the [management console]({{ link-console-main }}), go to the folder where the trigger is located.
1. Open **{{ sf-name }}**.
1. Go to the **Triggers** tab.
1. Select the trigger you want to update.
1. In the upper-right corner of the page, click **Edit**.
- CLI
{% include [cli-install](../../../_includes/cli-install.md) %}
{% include [default-catalogue](../../../_includes/default-catalogue.md) %}
To change the trigger name, run the command:
```
yc serverless trigger update <trigger name> --new-name <new trigger name>
```
Result:
```
id: dd0gj5tsj2pq9at8ja8i
folder_id: aoek49ghmknnpj1ll45e
created_at: "2019-08-28T12:26:25.675Z"
name: my-trigger
rule:
message_queue:
queue_id: yrn:yc:ymq:ru-central1:aoek49ghmknnpj1ll45e:my-mq
service_account_id: bfbqqeo6jkpls2tse5o6
batch_settings:
size: "10"
cutoff: 10s
invoke_function:
function_id: b09e5lu91ta21vdrrgma
function_tag: $latest
service_account_id: bfbqqeo6jkpls2tse5o6
status: ACTIVE
```
- API
You can change the trigger name using the [update](../../triggers/api-ref/Trigger/update.md) API method.
{% endlist %}
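
The API tab above points to the `update` method. As a rough sketch of what the equivalent REST call could look like — the host, path, and body field names here are assumptions drawn from the API reference, so verify them there before use — you can compose a PATCH request like this. The snippet only builds and prints the command so you can review it first:

```shell
# Compose (but do not send) a rename request for the update API method.
# Host, path, and body fields are assumptions -- check the API reference.
TRIGGER_ID="dd0gj5tsj2pq9at8ja8i"   # hypothetical trigger ID
BODY='{"updateMask":"name","name":"my-renamed-trigger"}'
CMD="curl -X PATCH -H \"Authorization: Bearer \$IAM_TOKEN\" -d '${BODY}' https://serverless-triggers.api.cloud.yandex.net/triggers/v1/triggers/${TRIGGER_ID}"
echo "${CMD}"   # prints the curl command for review
```

Drop the `echo` wrapper and supply a real IAM token to actually send the request.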
## Updating the description of a trigger {#update-description}
{% list tabs %}
- Management console
1. In the [management console]({{ link-console-main }}), go to the folder where the trigger is located.
1. Open **{{ sf-name }}**.
1. Go to the **Triggers** tab.
1. Select the trigger you want to update.
1. In the upper-right corner of the page, click **Edit**.
- CLI
{% include [cli-install](../../../_includes/cli-install.md) %}
{% include [default-catalogue](../../../_includes/default-catalogue.md) %}
To update the trigger description, run the command:
```
yc serverless trigger update <trigger name> --description "<trigger description>"
```
Result:
```
id: dd0gj5tsj2pq9at8ja8i
folder_id: aoek49ghmknnpj1ll45e
created_at: "2019-08-28T12:26:25.675Z"
name: my-trigger
description: My YMQ trigger.
rule:
message_queue:
queue_id: yrn:yc:ymq:ru-central1:aoek49ghmknnpj1ll45e:my-mq
service_account_id: bfbqqeo6jkpls2tse5o6
batch_settings:
size: "10"
cutoff: 10s
invoke_function:
function_id: b09e5lu91ta21vdrrgma
function_tag: $latest
service_account_id: bfbqqeo6jkpls2tse5o6
status: ACTIVE
```
- API
You can update the trigger description using the [update](../../triggers/api-ref/Trigger/update.md) API method.
{% endlist %}
## Managing trigger labels {#manage-label}
You can perform the following actions with trigger labels:
* [Add a label](#add-label)
* [Update a label](#update-label)
* [Delete a label](#remove-label)
### Adding a label {#add-label}
{% list tabs %}
- CLI
{% include [cli-install](../../../_includes/cli-install.md) %}
{% include [default-catalogue](../../../_includes/default-catalogue.md) %}
To add a label to a trigger, run the command:
```
yc serverless trigger add-labels <trigger name> --labels <key>=<value>
```
Result:
```
id: dd0gj5tsj2pq9at8ja8i
folder_id: aoek49ghmknnpj1ll45e
created_at: "2019-08-28T12:26:25.675Z"
name: my-trigger
description: My YMQ trigger.
labels:
version: beta
rule:
message_queue:
queue_id: yrn:yc:ymq:ru-central1:aoek49ghmknnpj1ll45e:my-mq
service_account_id: bfbqqeo6jkpls2tse5o6
batch_settings:
size: "10"
cutoff: 10s
invoke_function:
function_id: b09e5lu91ta21vdrrgma
function_tag: $latest
service_account_id: bfbqqeo6jkpls2tse5o6
status: ACTIVE
```
- API
You can add a trigger label using the [update](../../triggers/api-ref/Trigger/update.md) API method.
{% endlist %}
### Updating a label {#update-label}
{% list tabs %}
- CLI
{% include [cli-install](../../../_includes/cli-install.md) %}
{% include [default-catalogue](../../../_includes/default-catalogue.md) %}
To update a trigger label, run the command:
{% note warning %}
The existing set of labels is completely replaced by the set you pass in.
{% endnote %}
```
yc serverless trigger update <trigger name> --labels <key>=<value>
```
Result:
```
id: dd0gj5tsj2pq9at8ja8i
folder_id: aoek49ghmknnpj1ll45e
created_at: "2019-08-28T12:26:25.675Z"
name: my-trigger
description: My YMQ trigger.
labels:
new_labels: my-beta-trigger
rule:
message_queue:
queue_id: yrn:yc:ymq:ru-central1:aoek49ghmknnpj1ll45e:my-mq
service_account_id: bfbqqeo6jkpls2tse5o6
batch_settings:
size: "10"
cutoff: 10s
invoke_function:
function_id: b09e5lu91ta21vdrrgma
function_tag: $latest
service_account_id: bfbqqeo6jkpls2tse5o6
status: ACTIVE
```
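Because `update` replaces the entire label set, the simplest way to keep an existing label while adding a new one is to pass both again in a single comma-separated list. For example (the label keys and values here are hypothetical):

```
yc serverless trigger update <trigger name> --labels version=beta,owner=backend
```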
- API
You can update a trigger label using the [update](../../triggers/api-ref/Trigger/update.md) API method.
{% endlist %}
### Deleting a label {#remove-label}
{% list tabs %}
- CLI
{% include [cli-install](../../../_includes/cli-install.md) %}
{% include [default-catalogue](../../../_includes/default-catalogue.md) %}
To delete a trigger label, run the command:
```
yc serverless trigger remove-labels <trigger name> --labels <key>
```
Result:
```
id: dd0gj5tsj2pq9at8ja8i
folder_id: aoek49ghmknnpj1ll45e
created_at: "2019-08-28T12:26:25.675Z"
name: my-trigger
description: My YMQ trigger.
rule:
message_queue:
queue_id: yrn:yc:ymq:ru-central1:aoek49ghmknnpj1ll45e:my-mq
service_account_id: bfbqqeo6jkpls2tse5o6
batch_settings:
size: "10"
cutoff: 10s
invoke_function:
function_id: b09e5lu91ta21vdrrgma
function_tag: $latest
service_account_id: bfbqqeo6jkpls2tse5o6
status: ACTIVE
```
- API
You can delete a trigger label using the [update](../../triggers/api-ref/Trigger/update.md) API method.
{% endlist %}
| 25.496124 | 135 | 0.639252 | eng_Latn | 0.55094 |
f92ac64eca8277e3a8738f63ed8b45fcd1e310a8 | 5,438 | md | Markdown | docs/syntax/function-arguments.md | SCV/enso | 2801f58ba9bc70b7b82e5a57f4e8b1796b509477 | [
"Apache-2.0"
] | null | null | null | docs/syntax/function-arguments.md | SCV/enso | 2801f58ba9bc70b7b82e5a57f4e8b1796b509477 | [
"Apache-2.0"
] | null | null | null | docs/syntax/function-arguments.md | SCV/enso | 2801f58ba9bc70b7b82e5a57f4e8b1796b509477 | [
"Apache-2.0"
] | null | null | null | ---
layout: developer-doc
title: Function Arguments
category: syntax
tags: [syntax, functions]
order: 11
---
# Function Arguments
One of the biggest usability innovations of Enso is the set of argument types
that it supports. The combination of named and defaulted arguments with a
curried language creates a tool in which even complex APIs can be expressed
very clearly.
<!-- MarkdownTOC levels="2,3" autolink="true" -->
- [Positional Arguments](#positional-arguments)
- [Named Arguments](#named-arguments)
- [Defaulted Arguments](#defaulted-arguments)
- [Optional Arguments](#optional-arguments)
- [Splats Arguments \(Variadics\)](#splats-arguments-variadics)
- [Type Applications](#type-applications)
- [Underscore Arguments](#underscore-arguments)
<!-- /MarkdownTOC -->
## Positional Arguments
Much like most programming languages, functions in Enso can be called with their
arguments provided positionally. This is the simple case that everybody is
familiar with.
## Named Arguments
All arguments in Enso are defined with a name. Like all programming languages,
this is necessary for that argument to be used. However, what Enso allows is for
users to then _call_ those arguments by name.
- An argument is called by name using the syntax `(name = value)` (or one may
also take advantage of the operator precedence to write `name=value`).
- Named arguments are applied in the order they are given. This means that if
  you apply to an argument `foo` positionally and then later try to apply to it
  by name, this will fail due to the currying of functions.
- Named arguments _cannot_ be used while using operator syntax. This means that
an expression of the form `a + b` cannot apply arguments by name. However,
when calling the operator as a method (`a.+ b`), the call-by-name syntax may
indeed be used (`a.+ (that = b)`).
This is a great usability boon, as in complex APIs it can often be difficult to
remember the order of arguments.
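As a brief sketch of the call-by-name syntax described above (the function and its arguments are hypothetical):

```ruby
# A hypothetical function with two arguments.
connect host port = ...

# Positional and named applications:
c1 = connect "localhost" 8080
c2 = connect (host = "localhost") (port = 8080)
c3 = connect "localhost" (port = 8080)
```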
## Defaulted Arguments
Enso also allows users to define their functions with _defaults_ for the
function's arguments. This is very useful for complex APIs as it allows users to
experiment and iterate quickly by only providing the arguments that they want to
customise.
- An argument is defined with a default using the syntax `(name = default_val)`,
which, as above, accounts for precedence rules.
- Argument defaults are applied to the function if no argument value is provided
by position or name for that argument.
- Argument defaults are evaluated lazily if the function is lazy in that
argument.
- We provide a `...` operator which suspends application of the default
arguments for the purposes of currying.
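A short sketch of defaulted arguments and the `...` operator (the function and its arguments are hypothetical):

```ruby
fetch url (retries = 3) = ...

r1 = fetch "http://example.com"                # retries defaults to 3
r2 = fetch "http://example.com" (retries = 5)  # default overridden by name
f  = fetch "http://example.com" ...            # defaults suspended; `f` still awaits retries
```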
## Optional Arguments
There are certain cases where the type information for an argument may be able
to be inferred by the compiler. This is best explained by example. Consider the
implementation of a `read` function that reads text and outputs a value of a
particular type.
```ruby
read : Text -> t -> t
read text this = t.fromText text
```
You can use this function by explicitly providing the type information in either
of the following ways:
```ruby
val1 = read '5' Int
val2 = Int.read '5'
```
This, however, is often tedious, especially in contexts where this information
could be inferred by the compiler. We can re-write `read` as follows:
```ruby
read : Text -> (t=t) -> t
read text (this=this) = t.fromText text
```
This allows users either to provide the argument explicitly or to leave it out.
In the case where it is not provided, the compiler will attempt to infer it
from usage. If this is impossible, an error is raised.
Enso provides a syntactic sugar for the `t=t` syntax. The above code can be
written instead using `?`.
```ruby
read : Text -> t? -> t
read text this? = t.fromText text
```
## Splats Arguments (Variadics)
Enso provides users with the ability to define variadic functions, or _splats_
functions in our terminology. These are very useful for defining expressive APIs
and flexible code.
- These work for both positional and keyword arguments.
- They are defined using the syntax `name...`, where `name` is an arbitrary
argument name.
> The actionables for this section are:
>
> - Work out how (and if) this can interact with currying.
> - Do we even want this?
## Type Applications
There are sometimes cases where the user wants to explicitly refine the type of
an argument at the _call_ site of a function. This can be useful for debugging,
and for writing ad-hoc code. Much like the named-arguments in applications
above, Enso also provides a syntax for refining types at the application site.
- To refine an argument type by name at the application site, use the `:=`
operator (e.g. `arg_name := T`).
- This _will_ be type-checked by the compiler, and so `T` must be a valid
subtype for the type inferred for (or defined for) the function being called.
## Underscore Arguments
Enso provides the `_` argument as a quick way to create a lambda from a function
call. It obeys the following rules.
- Replacing any function argument with `_` will create a lambda that accepts an
argument and passes it in the place of the underscore. All other function
arguments are applied as normal.
- This works both by name and positionally.
- When a function is provided multiple `_` arguments, they are desugared left to
right as the arguments would be applied to the function definition, creating
nested lambdas.
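A compact sketch of these rules (the functions are hypothetical):

```ruby
add a b = a + b

inc  = add 1 _    # lambda that awaits an argument in place of `_`
both = add _ _    # multiple underscores desugar left to right into nested lambdas
```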
| 38.842857 | 80 | 0.762045 | eng_Latn | 0.999619 |
f92b306c1756f0a4ddc75e351fa448c47961e81e | 3,953 | md | Markdown | README.md | StarCross-Tech/redbpf | fa330d27908a4ac07703ce381a6b3f1d2894712d | [
"Apache-2.0",
"MIT"
] | 3 | 2021-04-04T18:40:31.000Z | 2021-04-24T16:43:41.000Z | README.md | rc-grey/redbpf | cd0bb80f71e144a80f8ee42436d5b8697b2d068e | [
"Apache-2.0",
"MIT"
] | null | null | null | README.md | rc-grey/redbpf | cd0bb80f71e144a80f8ee42436d5b8697b2d068e | [
"Apache-2.0",
"MIT"
] | 1 | 2022-02-28T04:11:35.000Z | 2022-02-28T04:11:35.000Z | RedBPF
======

[](https://circleci.com/gh/ingraind/redbpf)
[](https://app.element.io/#/room/!vCJcBZDeGUXaqSvPpL:rustch.at?via=rustch.at)
A Rust eBPF toolchain.
# Overview
The redbpf project is a collection of tools and libraries to build eBPF
programs using Rust. It includes:
- [redbpf](https://ingraind.org/api/redbpf/) - a user space library that can be
used to load eBPF programs
- [redbpf-probes](https://ingraind.org/api/redbpf_probes/) - an idiomatic Rust
API to write eBPF programs that can be loaded by the linux kernel
- [redbpf-macros](https://ingraind.org/api/redbpf_macros/) - companion crate to
`redbpf-probes` which provides convenient procedural macros useful when
writing eBPF programs
- [cargo-bpf](https://ingraind.org/api/cargo_bpf/) - a cargo subcommand for
creating, building and debugging eBPF programs
# Requirements
In order to use redbpf you need LLVM 11 and the headers for the kernel you want
to target.
## Linux kernel
The **minimum kernel version supported is 4.19**. Kernel headers are discovered
automatically, or you can use the `KERNEL_SOURCE` environment variable to point
to a specific location. Building against a linux source tree is supported as
long as you run `make prepare` first.
## Installing dependencies on Debian based distributions
On Debian, Ubuntu and derivatives you can install the dependencies running:
sudo apt-get -y install build-essential zlib1g-dev \
llvm-11-dev libclang-11-dev linux-headers-$(uname -r)
If your distribution doesn't have LLVM 11, you can add the [official LLVM
APT repository](https://apt.llvm.org) to your `sources.list`.
## Installing dependencies on RPM based distributions
First ensure that your distro includes LLVM 11:
yum info llvm-devel | grep Version
Version : 11.0.0
If you don't have version 11, you can get it from the Fedora 33 repository.
Then install the dependencies running:
yum install clang llvm-devel zlib-devel kernel-devel
# Getting started
The easiest way to get started is using `cargo-bpf`, see the
[documentation](https://ingraind.org/api/cargo_bpf/) for more info.
[redbpf-tools](https://github.com/redsift/redbpf/tree/master/redbpf-tools) is a
`cargo-bpf` generated crate that includes simple examples you can use to
understand how to structure your programs.
Finally the [ingraind project](https://github.com/redsift/ingraind)
includes more concrete examples of redbpf programs.
# Building from source
After cloning the repository run:
git submodule sync
git submodule update --init
Install the dependencies as documented above, then run `cargo build` as usual.
# License
This repository contains code from other software in the following
directories, licensed under their own particular licenses:
* `bpf-sys/libelf/*`: GPL2 + LGPL3
* `bpf-sys/bcc/*`: Apache2, public domain
* `include/bpf_helpers.h` LGPL2 + BSD-2
* `include/bpf_helper_defs.h`: LGPL2 + BSD-2
* `bpf-sys/libbpf`: LGPL2 + BSD-2
Where '+' means they are dual licensed.
RedBPF and its components, unless otherwise stated, are licensed under either of
* Apache License, Version 2.0, ([LICENSE-APACHE](LICENSE-APACHE) or
http://www.apache.org/licenses/LICENSE-2.0)
* MIT license ([LICENSE-MIT](LICENSE-MIT) or http://opensource.org/licenses/MIT)
at your option.
# Contribution
This project is for everyone. We ask that our users and contributors
take a few minutes to review our [code of conduct](https://github.com/ingraind/project/blob/main/CODE_OF_CONDUCT.md).
Unless you explicitly state otherwise, any contribution intentionally submitted
for inclusion in the work by you, as defined in the Apache-2.0 license, shall
be dual licensed as above, without any additional terms or conditions.
| 34.982301 | 157 | 0.765242 | eng_Latn | 0.97216 |
f92b3e5ebce523b5f6dded3706c90ff59c03fffc | 6,657 | md | Markdown | docs/porting/floating-point-migration-issues.md | TimTalerJr/cpp-docs.de-de | 474abea3949c6c2972936dfd31c0861fc89a8d9e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/porting/floating-point-migration-issues.md | TimTalerJr/cpp-docs.de-de | 474abea3949c6c2972936dfd31c0861fc89a8d9e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/porting/floating-point-migration-issues.md | TimTalerJr/cpp-docs.de-de | 474abea3949c6c2972936dfd31c0861fc89a8d9e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Floating-point migration issues
ms.date: 05/17/2017
ms.assetid: 36a1b552-2f2b-4919-bc9d-c17f42434954
ms.openlocfilehash: 0a84b764d395063f38cae299cff75437318b024e
ms.sourcegitcommit: 0cfc43f90a6cc8b97b24c42efcf5fb9c18762a42
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 11/05/2019
ms.locfileid: "73626981"
---
# <a name="floating-point-migration-issues"></a>Floating-point migration issues
When you upgrade your projects to a newer version of Visual Studio, you may sometimes find that the results of certain floating-point operations have changed. This usually happens for two reasons: code-generation changes that take better advantage of the available processor, and bug fixes or algorithm changes in the math functions of the C runtime library (CRT). In general, the new results are correct within the bounds set by the language standard. Read on to find out what has changed and, if it matters for your code, how to get the same results from your functions as before.
## <a name="new-math-functions-and-universal-crt-changes"></a>New math functions and Universal CRT changes
Most of the CRT math functions have been available in Visual Studio for years, but as of Visual Studio 2013 all of the functions required by ISO C99 are included. These functions are implemented to balance performance with correctness. Because producing the correctly rounded result in every case can be prohibitively expensive, these functions are designed to produce a close approximation of the correctly rounded result. In most cases the result produced is within +/-1 unit of least precision (*ulp*) of the correctly rounded result, though the error can occasionally be larger. If you previously used a different math library to obtain these functions, implementation differences may account for the change in your results.
When the math functions were moved into the Universal CRT in Visual Studio 2015, some new algorithms were adopted and several bugs were fixed in the implementations of the functions that were new in Visual Studio 2013. These changes can lead to observable differences in the results of floating-point computations that use these functions. The affected functions were erf, exp2, remainder, remquo, scalbln, and scalbn, along with their float and long double variants. Other changes in Visual Studio 2015 fixed problems in how the floating-point status word and exception state were reported by the functions _clear87, _clearfp, fegetenv, fesetenv, and feholdexcept.
## <a name="processor-differences-and-compiler-flags"></a>Processor differences and compiler flags
Many of the floating-point functions in the math library have different implementations for different CPU architectures. The 32-bit x86 CRT may use a different implementation than the 64-bit x64 CRT. In addition, some of the functions may have multiple implementations for a given CPU architecture. The most efficient implementation is selected dynamically at run time, depending on the instruction sets the CPU supports. In the 32-bit x86 CRT, for example, some functions have both an x87 and an SSE2 implementation. When running on a CPU that supports SSE2, the faster SSE2 implementation is used; on a CPU that does not support SSE2, the slower x87 implementation is used. You may notice this when migrating old code, because in Visual Studio 2012 the default for the x86 compiler's architecture option changed to [/arch:SSE2](../build/reference/arch-x86.md). Because different implementations of the math library functions use different CPU instructions and different algorithms to produce their results, the results may vary across CPUs. In most cases the results are within +/-1 ulp of the correctly rounded result, but the actual results can differ from one CPU to another.
Improvements to code-generation correctness in the various floating-point modes of Visual Studio can also affect the results of floating-point operations when old code is compared with new, even when the same compiler flags are used. Code generated by Visual Studio 2010 with [/fp:precise](../build/reference/fp-specify-floating-point-behavior.md) (the default) or `/fp:strict` might not have correctly propagated NaN values through expressions. As a result, some expressions that produced a numeric result with older compilers now correctly produce a NaN result. You may also see differences because the code optimizations enabled by `/fp:fast` now take advantage of more processor features. These optimizations can use fewer instructions, but they can affect the generated results because some previously visible intermediate operations are removed.
## <a name="how-to-get-identical-results"></a>How to get identical results
In most cases, the floating-point changes in the latest compilers and libraries result in faster or more accurate behavior, or both. You may even see better processor performance where SSE2 instructions replace x87 instructions. However, if you have code that must exactly replicate the floating-point behavior of older code, you may prefer to use the native multi-targeting features of Visual Studio and build the affected project with an older toolset. For more information, see [Use native multi-targeting in Visual Studio to build old projects](use-native-multi-targeting.md).
## <a name="see-also"></a>See also
[Upgrading projects from earlier versions of Visual C++](upgrading-projects-from-earlier-versions-of-visual-cpp.md)<br/>
[Overview of potential upgrade issues (Visual C++)](overview-of-potential-upgrade-issues-visual-cpp.md)<br/>
[Visual C++ change history 2003 - 2015](visual-cpp-change-history-2003-2015.md) | 184.916667 | 1,403 | 0.833559 | deu_Latn | 0.998466 |
f92c08ce80f65a164645a9e8fce2ee69b28ac9ba | 3,192 | md | Markdown | README.md | claudioclutter/mdma | 010665718ade824d8e7bddc981d90ee2e1e935fa | [
"MIT"
] | null | null | null | README.md | claudioclutter/mdma | 010665718ade824d8e7bddc981d90ee2e1e935fa | [
"MIT"
] | null | null | null | README.md | claudioclutter/mdma | 010665718ade824d8e7bddc981d90ee2e1e935fa | [
"MIT"
] | null | null | null | MDM Agent
=========
Mdma helps you deploy an iOS app to many devices at scheduled times with [SimpleMDM](https://www.simplemdm.com).
The **source code** is available on [GitHub](https://github.com/clutter/mdma).
[](https://codeclimate.com/github/clutter/mdma/maintainability)
[](https://travis-ci.com/clutter/mdma)
After [setting up the right groups on SimpleMDM](#setup), you can do the following:
<img width="1792" alt="buildnew" src="https://user-images.githubusercontent.com/32649767/36555945-599852ac-17b8-11e8-9913-4600b78c9a6f.png">
[1] Upload a new build for an iOS app and decide when it should be pushed to devices
<img width="1792" alt="home" src="https://user-images.githubusercontent.com/32649767/36555942-595203ba-17b8-11e8-91ef-89d9a1461a2c.png">
[2] Check the status of every pending, scheduled, and completed push
<img width="1792" alt="devices" src="https://user-images.githubusercontent.com/32649767/36555941-593adbc2-17b8-11e8-9a52-579f99a2e33d.png">
[3] Browse the list of all devices to check the current version of the app
How it works
============
The app is hosted on Heroku and relies on one job run by Heroku Scheduler every 10 minutes.
`bundle exec rake deploys:enqueue` checks whether any build needs to be deployed soon and adds it to the queue.
A separate job is run on Heroku Scheduler every day.
`bundle exec rake devices:fetch` fetches the list of devices with the latest app version.
How to contribute
=================
Whenever a new PR is opened, a new Review App is created on Heroku, where you can test your code.
The Review App uses a test app called Ugly Sweater with a test device group.
Test your features on the review app and make sure that Code Climate is happy, then merge.
How to configure
================
In order to use `mdma`, the following environment variables need to be set:
- `MDMA_APP_ID`: The SimpleMDM ID of the app to push
- `MDMA_APP_GROUP_ID`: The SimpleMDM ID of the app group that identifies the devices to push to
- `MDMA_APP_IDENTIFIER`: The unique identifier of the iOS app
- `RAILS_MASTER_KEY`: The key to decrypt the credentials stored in `config/credentials.yml.enc`
The following environment variables are optional:
- `GITHUB_PROJECT`: The GitHub "username/project" path to fetch release notes from
- `SLACK_CHANNEL`: The Slack channel to post notifications to (defaults to #deploys)
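On Heroku, these variables can be set with `heroku config:set`. For example (the app name and every value below are hypothetical placeholders):

```shell
heroku config:set \
  MDMA_APP_ID=12345 \
  MDMA_APP_GROUP_ID=67890 \
  MDMA_APP_IDENTIFIER=com.example.app \
  RAILS_MASTER_KEY="$(cat config/master.key)" \
  --app my-mdma-instance
```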
For completeness, these are the credentials stored in the app:
```yaml
aws:
access_key_id: "[Access key for an S3 Bucket to upload builds to]"
secret_access_key: "[Access secret for an S3 Bucket to upload builds to]"
simple_mdm:
key: "[SimpleMDM API key]"
google:
client_id: "[Google app Client ID to log into the app]"
client_secret: "[Google app Client Secret to log into the app]"
github:
username: "[Username of a GitHub account with read access to clutter/clutter-ios-wms]"
token: "[Personal access token for the account to read the releases for that app]"
slack:
token_url: "[Slack token URL to send build notifications to Slack]"
```
| 41.454545 | 157 | 0.755326 | eng_Latn | 0.946331 |
f92c8ff90206f8dc766100b3ec516f7c6439a472 | 89 | md | Markdown | README.md | LudiusMaximus/PlayedTimeMinimapClock | fc535dff87dcb30385b75bde00b824068a73037e | [
"MIT"
] | 1 | 2020-05-20T22:43:00.000Z | 2020-05-20T22:43:00.000Z | README.md | LudiusMaximus/PlayedTimeMinimapClock | fc535dff87dcb30385b75bde00b824068a73037e | [
"MIT"
] | null | null | null | README.md | LudiusMaximus/PlayedTimeMinimapClock | fc535dff87dcb30385b75bde00b824068a73037e | [
"MIT"
] | null | null | null | # PlayedTimeMinimapClock
WoW addon that adds Broker_PlayedTime to the minimap clock.
| 29.666667 | 63 | 0.842697 | eng_Latn | 0.663762 |
f92cf4310bef84172e3e4bc06e5126a541352a69 | 5,812 | md | Markdown | docs/vs-2015/ide/customizing-intellisense-for-requirejs.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/ide/customizing-intellisense-for-requirejs.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/ide/customizing-intellisense-for-requirejs.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Customizing IntelliSense for RequireJS | Microsoft Docs
ms.custom: ''
ms.date: 2018-06-30
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- vs-ide-general
ms.tgt_pltfrm: ''
ms.topic: article
ms.assetid: 2be07ef8-9c08-444b-a21a-22a4fe6386a3
caps.latest.revision: 6
author: gewarren
ms.author: gewarren
manager: ghogen
ms.openlocfilehash: fbc4d9b85a3eb8e0fe5f3a890a76bae4695912e4
ms.sourcegitcommit: 55f7ce2d5d2e458e35c45787f1935b237ee5c9f8
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 08/22/2018
ms.locfileid: "42628063"
---
# <a name="customizing-intellisense-for-requirejs"></a>Customizing IntelliSense for RequireJS
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
For the latest version of this topic, see the [Visual Studio 2017 documentation](https://docs.microsoft.com/en-us/visualstudio/).
Starting with Visual Studio 2013 Update 4, the popular RequireJS JavaScript file and module loader is supported. RequireJS makes it easy to define dependencies between code modules and to load modules dynamically only when they are needed. When you write JavaScript code that uses RequireJS, IntelliSense suggestions are provided for modules already referenced from a module definition or referenced through a `require()` call within the code.
By default, Visual Studio supports a very basic configuration for RequireJS, but it is common practice to configure your own custom configuration settings (that is, to define aliases for libraries). This topic describes the different ways you can customize Visual Studio to work with your project's unique setup.
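For example, a custom configuration that defines aliases for libraries typically looks like the following sketch (the paths and module names here are hypothetical):

```javascript
require.config({
    baseUrl: "js",
    paths: {
        "jquery": "lib/jquery-2.1.1.min",
        "knockout": "lib/knockout-3.2.0"
    }
});
```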
This topic describes how to:
- Customize RequireJS in ASP.NET projects
- Customize RequireJS in JSProj projects, which are used to build Apache Cordova apps, Windows Store apps, and LightSwitch HTML apps
## <a name="customize-requirejs-in-aspnet-projects"></a>Customize RequireJS in ASP.NET projects
RequireJS support is automatically enabled when a file named require.js is referenced from the current JavaScript file (for more information, see the section on determining the IntelliSense context in [JavaScript IntelliSense](../ide/javascript-intellisense.md)). In ASP.NET projects, require.js is typically referenced using a /// \<reference /> directive in the _references.js file.
### <a name="configure-the-data-main-attribute-in-an-aspnet-project"></a>Configure the data-main attribute in an ASP.NET project
To accurately simulate how the application will behave at run time, the JavaScript editor needs to know which file to load first when require.js is configured. This is normally configured in your application's HTML file by using the `data-main` attribute on the script element that references require.js, as shown below.
```html
<script src="js/require.js" data-main="js/app.js"></script>
```
In this example, the data-main script (js/app.js) is loaded immediately after require.js. The file that is loaded immediately is the best place to configure your use of RequireJS (by calling `require.config()`) first. To tell the JavaScript editor which file your application uses for `data-main`, add the `data-main` attribute to the /// \<reference /> directive that references require.js in your application. For example, you can use this directive:
```javascript
/// <reference path="js/require.js" data-main="js/app.js" />
```
### <a name="configure-the-application-start-page-in-an-aspnet-project"></a>Configure the application start page in an ASP.NET project
When your application runs, RequireJS assumes that relative file paths (for example, "..\\" paths) are relative to the HTML file that loaded the require.js library. When you write code in the Visual Studio editor for an ASP.NET project, this start page is unknown, and you must tell the editor which start page to use when relative file paths are resolved. To do this, add the `start-page` attribute to your /// \<reference /> directive.
```javascript
/// <reference path="js/require.js" data-main="js/app.js" start-page="/app/index.html" />
```
The `start-page` attribute specifies the URL of the page as it appears in the browser when the application runs.
## <a name="customize-requirejs-in-jsproj-projects"></a>Customize RequireJS in JSProj projects
JSProj projects (project files ending in the .jsproj extension) are used when building Apache Cordova apps, HTML-based Windows Store apps, or LightSwitch HTML apps. Unlike ASP.NET projects, these projects read the .js file references from the HTML files that exist in the project. As a result, when you edit JavaScript code in a JSProj project, RequireJS support is enabled if the JavaScript file you are currently editing is referenced by an HTML file that also references require.js.
The customization steps needed for ASP.NET projects are not required in a JSProj project. That is, the script file specified by the `data-main` attribute on the script tag that references require.js is loaded automatically to configure require.js, and the HTML file that references require.js also serves as the start page for the application.
## <a name="see-also"></a>See also
[JavaScript IntelliSense](../ide/javascript-intellisense.md)
| 78.540541 | 600 | 0.790606 | pol_Latn | 0.999835 |
f92d0857c7578fef9df754c30bd622deedf4a6d9 | 843 | md | Markdown | README.md | aerobounce/homebrew-ffmpeg-fdk-aac | 47ca4a6e63cfde556a9e19f3cdcbfb8b9fec59e1 | [
"MIT"
] | 2 | 2019-11-13T11:34:51.000Z | 2019-11-13T16:00:49.000Z | README.md | aerobounce/homebrew-ffmpeg-fdk-aac | 47ca4a6e63cfde556a9e19f3cdcbfb8b9fec59e1 | [
"MIT"
] | null | null | null | README.md | aerobounce/homebrew-ffmpeg-fdk-aac | 47ca4a6e63cfde556a9e19f3cdcbfb8b9fec59e1 | [
"MIT"
] | null | null | null | # homebrew-ffmpeg-fdk-aac
Default ffmpeg formula with fdk-aac.
# Install
```
brew install aerobounce/ffmpeg-fdk-aac/ffmpeg
```
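Once installed, select the FDK encoder with `-c:a libfdk_aac`. For example (file names are hypothetical; `-vbr 4` picks a high-quality variable-bitrate mode):

```
ffmpeg -i input.wav -c:a libfdk_aac -vbr 4 output.m4a
```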
# Changes from the original Formula
This formula adds these arguments:
```
--enable-libfdk-aac
--enable-nonfree
--disable-htmlpages
-fno-stack-check
```
`-fno-stack-check` is a bug workaround when building with newer toolchains on macOS: https://trac.ffmpeg.org/ticket/8073#comment:12
# Why
The homebrew-core ffmpeg formula removed all of its build options and is missing `libfdk-aac` – the "highest-quality AAC encoder available with ffmpeg".
> The Fraunhofer FDK AAC codec library. This is currently the highest-quality AAC encoder available with ffmpeg. Requires ffmpeg to be configured with --enable-libfdk-aac (and additionally --enable-nonfree if you're also using --enable-gpl).
https://trac.ffmpeg.org/wiki/Encode/AAC
| 30.107143 | 241 | 0.771056 | eng_Latn | 0.940404 |
f92d4e2f382438f38ff10d7baa1af7129ce60446 | 1,820 | md | Markdown | src/lt/2020-02/04/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 68 | 2016-10-30T23:17:56.000Z | 2022-03-27T11:58:16.000Z | src/lt/2020-02/04/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 367 | 2016-10-21T03:50:22.000Z | 2022-03-28T23:35:25.000Z | src/lt/2020-02/04/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 109 | 2016-08-02T14:32:13.000Z | 2022-03-31T10:18:41.000Z | ---
title: I. Traditions
date: 19/04/2020
---
Traditions in themselves are not bad. They give the recurring tasks of our daily lives a certain routine and structure. They can help us stay connected with our roots. So it is no surprise that traditions also play an important role in religion. Yet there are certain dangers associated with tradition as well.
`1. What does Mark 7:1-13 teach us about how Jesus, in His day, responded to some human traditions?`
The traditions Jesus confronted in the Jewish community had been carefully handed down from teacher to student. By the time of Jesus they had taken their place alongside Scripture. Traditions, however, tend to grow over time, accumulating more and more details and aspects that were not originally part of God's Word and His plan. These human traditions - even when cherished by the respected "elders" (see Mark 7:3, 5), that is, the religious leaders of the Jewish community - are not on a par with God's commandments (see Mark 7:8-9). They were human traditions, and in the end they led to making the "Word of God" void (Mark 7:13).
`2. Read 1 Corinthians 11:2 and 2 Thessalonians 3:6. How do we distinguish the Word of God from human tradition? Why is it important to tell them apart?`
The living Word of God cultivates in us an attitude of honor and faithfulness toward it. That faithfulness gives rise to a certain tradition. Our faithfulness, however, must always be to the living God, who has revealed His will in the written Word of God. Scripture therefore plays a unique role that stands above all human tradition. Scripture is above all traditions, even good ones. Traditions that arise from our experience with God and His Word must be continually tested by the standard of Holy Scripture.
`What things do we do as a church that could be described as "tradition"? Why is it always important to distinguish traditions from the teaching of Scripture? Bring your answer to class on Sabbath.`
# Notes
This section will contain notes about each of the topics listed above, as well as include any examples for each of them.
---
-api-id: M:Windows.ApplicationModel.UserDataAccounts.UserDataAccountStore.CreateAccountAsync(System.String,System.String)
-api-type: winrt method
---
<!-- Method syntax
public Windows.Foundation.IAsyncOperation<Windows.ApplicationModel.UserDataAccounts.UserDataAccount> CreateAccountAsync(System.String userDisplayName, System.String packageRelativeAppId)
-->
# Windows.ApplicationModel.UserDataAccounts.UserDataAccountStore.CreateAccountAsync
## -description
Asynchronously creates a user data account, specifying a displayable user name and a GUID that identifies the app in the Microsoft Store.
## -parameters
### -param userDisplayName
A string containing the user name that is suitable for display.
### -param packageRelativeAppId
The GUID that identifies the app in the Microsoft Store.
## -returns
Returns the newly created [UserDataAccount](userdataaccount.md).
## -remarks
## -examples
## -see-also
[CreateAccountAsync(String)](userdataaccountstore_createaccountasync_1955614316.md), [CreateAccountAsync(String, String)](userdataaccountstore_findaccountsasync_2001360321.md)
# ResetEvent
## What is a ResetEvent?
The reset event is somewhat based on the C# [AutoResetEvent](http://msdn.microsoft.com/en-us/library/system.threading.autoresetevent(v=vs.110).aspx) and [ManualResetEvent](http://msdn.microsoft.com/en-us/library/system.threading.manualresetevent.aspx) classes.
It is similar to a promise, except that it can be used multiple times.
When a function begins an activity that must complete before other functions proceed, it calls _reset_ to put the ResetEvent in the non-signaled state.
Functions that call _wait_ on the reset event will not execute immediately, awaiting the signal. When the running function completes the activity, it calls _set_ to signal that the waiting functions can proceed.
All waiting functions are executed until the event becomes non-signaled.
Once it has been signaled, a reset event remains signaled until it is manually reset using the _reset_ function. That is, calls to _wait_ execute immediately.
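To make these mechanics concrete, here is a minimal, self-contained sketch of the behavior described above (synchronous, without tokens, timeouts, or queue limits; it is not the node-async-locks implementation):

```javascript
// Minimal, illustrative sketch of the ResetEvent semantics.
function MiniResetEvent(isSignaled) {
    this.signaled = !!isSignaled;
    this.queue = [];
}

// Run the callback immediately if signaled, otherwise queue it.
MiniResetEvent.prototype.wait = function (callback) {
    if (this.signaled) {
        callback();
    } else {
        this.queue.push(callback);
    }
};

// Become signaled and flush every pending callback.
MiniResetEvent.prototype.set = function () {
    this.signaled = true;
    var pending = this.queue;
    this.queue = [];
    pending.forEach(function (cb) { cb(); });
};

// Become non-signaled again; later waits are queued.
MiniResetEvent.prototype.reset = function () {
    this.signaled = false;
};
```

The real ResetEvent layers tokens, timeouts, queue-size limits, and auto-reset on top of this core.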
### Basic Usage
```js
var ResetEvent = require('node-async-locks').ResetEvent;
```
```js
var resetEvent = new ResetEvent();
var x = 0;
resetEvent.wait(function(){
x+=1;
});
resetEvent.wait(function(){
console.log(x); //2
});
x++;
resetEvent.set();
```
### Helper Functions
ResetEvent uses several helper functions (on the **prototype**) which can be overridden to provide custom functionality.
#### ResetEvent#createToken(callback) -> token
A function that creates the token which will be used in this reset event.
The token has the following fields:
* **id** - A unique id for each token, must be comparable using === operator.
* **isCanceled** - A boolean representing the cancellation state of the token.
* **callback** - The callback to be called when the token is ready to execute.
* **elapsed** - [optional] A function which returns the elapsed time between the creation of the token and now.
* **start** - [optional] The start time of when this token was created.
* **resetEvent** - [optional] A reference to the reset event that created this token.
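For illustration, a custom `createToken` override could assemble a token with the fields listed above (a hypothetical sketch showing only the required fields plus `start`/`elapsed`):

```javascript
// Hypothetical createToken override: builds a token with the
// documented fields. Install it on ResetEvent.prototype to use it.
var nextTokenId = 1;

function createToken(callback) {
    var token = {
        id: nextTokenId++,      // unique, comparable with ===
        isCanceled: false,      // cancellation state
        callback: callback,     // invoked when the event is signaled
        start: Date.now()       // creation time, used by elapsed()
    };
    token.elapsed = function () {
        return Date.now() - token.start;
    };
    return token;
}
```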
#### ResetEvent#executeCallback(token)
A function which is used to execute the callback on the token.
The default implementation will execute the callback synchronously.
#### ResetEvent#reduceQueue(queue, options)
A function which is used to reduce the reset event queue size when a call to _wait_ is made.
If the options are changed programmatically after an instance was created, it is up to the user to call this function to adjust the queue size.
Override this function to create different queuing logic.
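For example, an override that enforces `options.maxQueueSize` by canceling and discarding the oldest waiters might look like this (a sketch of one possible strategy, not the library's built-in logic):

```javascript
// Sketch of a reduceQueue override: trims the queue down to
// options.maxQueueSize, canceling and discarding the oldest tokens.
function reduceQueue(queue, options) {
    while (queue.length > options.maxQueueSize) {
        var oldest = queue.shift();  // drop the oldest waiter
        oldest.isCanceled = true;    // mark it so it never executes
    }
    return queue;
}
```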
### ResetEvent API
The main API of the ResetEvent object instance.
_ResetEventInstance_ represents an instance created by calling ````new ResetEvent()````
#### ResetEvent#constructor(isSignaled, options) -> ResetEventInstance
Creates a new ResetEventInstance using the given signaled state and options.
If no options are provided the default options are used.
The default options defined as ````ResetEvent.defaultOptions```` :
```js
{
maxQueueSize: Infinity,
overflowStrategy: 'this',
autoResetCount: Infinity
}
```
Override any default option to have all future reset event instances created with the new defaults.
##### Supported Options
See AsyncLock [Supported Options](#supported-options) and:
* **autoResetCount** (number) [default Infinity] - The number of callbacks to call before the event is auto reset (becomes non-signaled).
#### ResetEventInstance#reset()
Marks the reset event as not signaled. All further calls to _wait_ will not execute immediately.
```js
var resetEvent = new ResetEvent(true);
resetEvent.wait(function(){
//This is executed
});
resetEvent.reset();
resetEvent.wait(function(){
//This is not executed
});
```
#### ResetEventInstance#set()
Marks the reset event as signaled and executes all pending callbacks. All further calls to _wait_ will execute immediately.
If the _autoResetCount_ option was passed, it will execute only the given number of callbacks (excluding canceled callbacks)
and then mark the event as non-signaled.
```js
var resetEvent = new ResetEvent(false);
var x;
resetEvent.wait(function(){
console.log(x); // 10
});
x = 10;
resetEvent.set();
resetEvent.wait(function(){
console.log(x); // 10
});
x = 20;
resetEvent.wait(function(){
console.log(x); // 20
});
```
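The auto-reset rule can be pictured as a drain step over the pending queue: run up to `autoResetCount` non-canceled callbacks, then decide whether the event stays signaled. A hypothetical helper (not part of the library API):

```javascript
// Sketch of set()'s auto-reset rule: execute up to `autoResetCount`
// non-canceled tokens, then report whether the event stays signaled.
function drainQueue(queue, autoResetCount) {
    var executed = 0;
    while (queue.length > 0 && executed < autoResetCount) {
        var token = queue.shift();
        if (token.isCanceled) {
            continue;               // canceled tokens do not count
        }
        token.callback(token);
        executed += 1;
    }
    // The event auto-resets once the callback budget is used up.
    return executed < autoResetCount;
}
```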
#### ResetEventInstance#wait(callback,[timeout]) -> token
Waits until the reset event becomes signaled then executes the callback function.
If the reset event is already signaled when wait is called, the callback is executed immediately.
The callback function signature is _callback(token)_, it will receive the token returned by the _wait_ function.
If _timeout_ is provided, the reset event will wait only the given number of milliseconds and then cancel the call.
If _timeout_ is not provided, it will wait indefinitely.
Returns a token which can be used to track the elapsed time.
```js
var resetEvent = new ResetEvent(false);
var x;
resetEvent.wait(function(){
console.log(x); // This is never called
},100);
x = 10;
setTimeout(function(){
resetEvent.set();
},1000);
resetEvent.wait(function(){
console.log(x); // 20
});
x = 20;
resetEvent.wait(function(){
console.log(x); // 20
});
```
#### ResetEventInstance#isSignaled() -> boolean
Returns true if the reset event is currently signaled and false otherwise.
```js
var resetEvent = new ResetEvent();
resetEvent.isSignaled(); //false;
resetEvent.set();
resetEvent.isSignaled(); //true;
resetEvent.reset();
resetEvent.isSignaled(); //false;
```
#### ResetEventInstance#queueSize() -> number
Returns the number of callbacks currently pending on the reset event.
Note that inside a callback, that callback is not considered pending.
```js
var resetEvent = new ResetEvent(false);
resetEvent.wait(function(){
console.log(resetEvent.queueSize()); // 0
});
console.log(resetEvent.queueSize()); // 1
resetEvent.set();
```
# A glimpse of eternity Very useful to awaken sinners, and to comfort saints. Profitable to be read in families, and given at funerals. By Abr. Caley.
## Caley, Abraham, d. 1672.
## General Summary
**Links**
[TCP catalogue](http://www.ota.ox.ac.uk/tcp/) •
[HTML](http://tei.it.ox.ac.uk/tcp/Texts-HTML/free/A79/A79165.html) •
[EPUB](http://tei.it.ox.ac.uk/tcp/Texts-EPUB/free/A79/A79165.epub) •
[Page images (Historical Texts)](https://historicaltexts.jisc.ac.uk/eebo-99897223e)
**Availability**
To the extent possible under law, the Text Creation Partnership has waived all copyright and related or neighboring rights to this keyboarded and encoded edition of the work described above, according to the terms of the CC0 1.0 Public Domain Dedication (http://creativecommons.org/publicdomain/zero/1.0/). This waiver does not extend to any page images or other supplementary files associated with this work, which may be protected by copyright or other license restrictions. Please go to https://www.textcreationpartnership.org/ for more information about the project.
**Major revisions**
1. __2013-04__ __TCP__ *Assigned for keying and markup*
1. __2013-06__ __SPi Global__ *Keyed and coded from ProQuest page images*
1. __2013-08__ __Lauren Proux__ *Sampled and proofread*
1. __2013-08__ __Lauren Proux__ *Text and markup reviewed and edited*
1. __2014-03__ __pfs__ *Batch review (QC) and XML conversion*
## Content Summary
##### Front
Fox's Time and End of Time. A GLIMPSE OF Eternity. Very Uſeful To Awaken Sinners, and to Comfort Saints. Profitable to be Read in
1. To the Reader.
1. The CONTENTS.
##### Body
1. A Glimpſe of Eternity.
_ The INTRODƲCTION.
_ CHAP. I. Of Eternal, Inviſible things, the firſt Argument from God.
_ CHAP. II. Of the Meritorious Cauſes, and the Nature of Happineſs and Puniſhment, and the Immortality of Man.
_ CHAP. III. Of Scripture-Proofs of Eternal Happineſs, Conſiſting in Sight, Love, Joy, Praiſe; with created Acceſſories: and Eternal Miſery, Expreſſed by Wrath, Worm, Fire, Priſon, Darkneſs, Burning, Torment.
_ CHAP. IV. Of the Sublimeneſs of Eternity, as Tranſcending all Expreſſion, Knowledge (of it ſelf, or meaſure) and all Imagination.
_ CHAP. V. Of the importance of Eternity, to the endleſneſs of it. Conſidering God will not, nothing elſe can, put an end to it.
_ CHAP. VI. Of Eternity without ſucceſſion, or without conſumption.
_ CHAP. VII. Of Eternal Happineſs and Miſery without intermiſſion; and without mixture in Heaven or Hell.
_ CHAP. VIII. Of Lamentation for thoſe at eaſe and careleſs of Eternity, from three ſeveral Aggravations: with Expoſtulations.
_ CHAP. IX. Of Caution to prevent miſtakes about the Adverſity of the Godly, and the Proſperity of the Wicked in this ſtate.
_ CHAP. X.• Exhortation to Reſtrain from Sin, and Redeem Time.
_ CHAP. XI. An Exhortation to look on Eternal things, by our Meditations, Expreſſions, Affections of Deſire, Hope, Love, Delight, and Endeavours.
_ CHAP. XII.•f looking to Eternal things as our end, enforced by eight ſeveral Arguments.
_ CHAP. XIII. Of Motives drawn from other things, other men, our ſelves, and the unſpeakable benefits of a proſpect of things Eternal.
_ CHAP. XIV. Of various other conſiderations to move us to make proviſion for Eternity.
_ CHAP. XV. Of Directions to help us in looking after Eternal Bleſſedneſs; with Anſwers to ſome Objections and Cautions.
**Types of content**
* Oh, Mr. Jourdain, there is **prose** in there!
There are 1218 **omitted** fragments!
@__reason__ (1218) : foreign (53), illegible (1165) • @__resp__ (1165) : #KEYERS (1165) • @__extent__ (1165) : 2 letters (56), 1 letter (1067), 1 word (39), 3 letters (2), 4 letters (1)
**Character listing**
|Text|string(s)|codepoint(s)|
|---|---|---|
|Latin-1 Supplement|¹²àëò|185 178 224 235 242|
|Latin Extended-A|ſ|383|
|Latin Extended-B|Ʋ|434|
|Combining Diacritical Marks|̄|772|
|General Punctuation|•|8226|
|Superscripts and Subscripts|⁶|8310|
|Geometric Shapes|◊▪|9674 9642|
|CJKSymbolsandPunctuation|〈〉|12296 12297|
## Tag Usage Summary
### Header Tag Usage
|No|element name|occ|attributes|
|---|---|---|---|
|1.|__author__|2||
|2.|__availability__|1||
|3.|__biblFull__|1||
|4.|__change__|5||
|5.|__date__|8| @__when__ (1) : 2014-11 (1)|
|6.|__edition__|2||
|7.|__editionStmt__|2||
|8.|__editorialDecl__|1||
|9.|__encodingDesc__|1||
|10.|__extent__|2||
|11.|__fileDesc__|1||
|12.|__idno__|6| @__type__ (6) : DLPS (1), STC (2), EEBO-CITATION (1), PROQUEST (1), VID (1)|
|13.|__keywords__|1| @__scheme__ (1) : http://authorities.loc.gov/ (1)|
|14.|__label__|5||
|15.|__langUsage__|1||
|16.|__language__|1| @__ident__ (1) : eng (1)|
|17.|__listPrefixDef__|1||
|18.|__note__|5||
|19.|__notesStmt__|2||
|20.|__p__|11||
|21.|__prefixDef__|2| @__ident__ (2) : tcp (1), char (1) • @__matchPattern__ (2) : ([0-9\-]+):([0-9IVX]+) (1), (.+) (1) • @__replacementPattern__ (2) : http://eebo.chadwyck.com/downloadtiff?vid=$1&page=$2 (1), https://raw.githubusercontent.com/textcreationpartnership/Texts/master/tcpchars.xml#$1 (1)|
|22.|__profileDesc__|1||
|23.|__projectDesc__|1||
|24.|__pubPlace__|2||
|25.|__publicationStmt__|2||
|26.|__publisher__|2||
|27.|__ref__|1| @__target__ (1) : http://www.textcreationpartnership.org/docs/. (1)|
|28.|__revisionDesc__|1||
|29.|__seriesStmt__|1||
|30.|__sourceDesc__|1||
|31.|__term__|3||
|32.|__textClass__|1||
|33.|__title__|3||
|34.|__titleStmt__|2||
### Text Tag Usage
|No|element name|occ|attributes|
|---|---|---|---|
|1.|__am__|1||
|2.|__bibl__|2||
|3.|__body__|1||
|4.|__closer__|1||
|5.|__date__|1||
|6.|__dateline__|1||
|7.|__desc__|1218||
|8.|__div__|21| @__type__ (21) : half_title (1), title_page (1), to_the_reader (1), table_of_contents (1), text (1), introduction (1), chapter (15) • @__n__ (15) : 1 (1), 2 (1), 3 (1), 4 (1), 5 (1), 6 (1), 7 (1), 8 (1), 9 (1), 10 (1), 11 (1), 12 (1), 13 (1), 14 (1), 15 (1)|
|9.|__epigraph__|1||
|10.|__ex__|1||
|11.|__expan__|1||
|12.|__front__|1||
|13.|__g__|1455| @__ref__ (1455) : char:EOLhyphen (1284), char:V (9), char:EOLunhyphen (129), char:punc (31), char:abque (1), char:cmbAbbrStroke (1)|
|14.|__gap__|1218| @__reason__ (1218) : foreign (53), illegible (1165) • @__resp__ (1165) : #KEYERS (1165) • @__extent__ (1165) : 2 letters (56), 1 letter (1067), 1 word (39), 3 letters (2), 4 letters (1)|
|15.|__head__|20||
|16.|__hi__|6871||
|17.|__item__|15||
|18.|__list__|1||
|19.|__note__|515| @__n__ (515) : * (24), b (2), c (2), d (1), e (1), f (1), † (4), a (1), 1 (4), 2 (4), 3 (4), 4 (4), 5 (2), 6 (2), (u) (19), (w) (14), (x) (14), (y) (14), (z) (14), (a) (20), (b) (16), (c) (25), (d) (22), (e) (20), 7 (1), (n) (13), (o) (18), (q) (16), (r) (14), (s) (16), (t) (13), (l) (19), (m) (17), (p) (16), (f) (15), (g) (15), [i] (4), [k] (2), [l] (4), [m] (4), (h) (15), (i) (15), (k) (19), [o] (3), [p] (3), [q] (2), [s] (2), [t] (2), [u] (1), [w] (2), [x] (3), [y] (2), [z] (3), [a] (2), [b] (1), [c] (1), [h] (3), [n] (4), (ſ) (2), [f] (2), [g] (1), [d] (1), [e] (1), [r] (1), (vv) (1), (b (1), f) (1) • @__place__ (515) : bottom (515)|
|20.|__p__|247| @__n__ (150) : 1 (39), 2 (35), 3 (23), 4 (17), 5 (7), 6 (6), 7 (5), 8 (4), 9 (3), 10 (2), 11 (2), 12 (2), 13 (2), 14 (2), 15 (1)|
|21.|__pb__|230| @__facs__ (230) : tcp:135134:1 (2), tcp:135134:2 (2), tcp:135134:3 (2), tcp:135134:4 (2), tcp:135134:5 (2), tcp:135134:6 (2), tcp:135134:7 (2), tcp:135134:8 (2), tcp:135134:9 (2), tcp:135134:10 (2), tcp:135134:11 (2), tcp:135134:12 (2), tcp:135134:13 (2), tcp:135134:14 (2), tcp:135134:15 (2), tcp:135134:16 (2), tcp:135134:17 (2), tcp:135134:18 (2), tcp:135134:19 (2), tcp:135134:20 (2), tcp:135134:21 (2), tcp:135134:22 (2), tcp:135134:23 (2), tcp:135134:24 (2), tcp:135134:25 (2), tcp:135134:26 (2), tcp:135134:27 (2), tcp:135134:28 (2), tcp:135134:29 (2), tcp:135134:30 (2), tcp:135134:31 (2), tcp:135134:32 (2), tcp:135134:33 (2), tcp:135134:34 (2), tcp:135134:35 (2), tcp:135134:36 (2), tcp:135134:37 (2), tcp:135134:38 (2), tcp:135134:39 (2), tcp:135134:40 (2), tcp:135134:41 (2), tcp:135134:42 (2), tcp:135134:43 (2), tcp:135134:44 (2), tcp:135134:45 (2), tcp:135134:46 (2), tcp:135134:47 (2), tcp:135134:48 (2), tcp:135134:49 (2), tcp:135134:50 (2), tcp:135134:51 (2), tcp:135134:52 (2), tcp:135134:53 (2), tcp:135134:54 (2), tcp:135134:55 (2), tcp:135134:56 (2), tcp:135134:57 (2), tcp:135134:58 (2), tcp:135134:59 (2), tcp:135134:60 (2), tcp:135134:61 (2), tcp:135134:62 (2), tcp:135134:63 (2), tcp:135134:64 (2), tcp:135134:65 (2), tcp:135134:66 (2), tcp:135134:67 (2), tcp:135134:68 (2), tcp:135134:69 (2), tcp:135134:70 (2), tcp:135134:71 (2), tcp:135134:72 (2), tcp:135134:73 (2), tcp:135134:74 (2), tcp:135134:75 (2), tcp:135134:76 (2), tcp:135134:77 (2), tcp:135134:78 (2), tcp:135134:79 (2), tcp:135134:80 (2), tcp:135134:81 (2), tcp:135134:82 (2), tcp:135134:83 (2), tcp:135134:84 (2), tcp:135134:85 (2), tcp:135134:86 (2), tcp:135134:87 (2), tcp:135134:88 (2), tcp:135134:89 (2), tcp:135134:90 (2), tcp:135134:91 (2), tcp:135134:92 (2), tcp:135134:93 (2), tcp:135134:94 (2), tcp:135134:95 (2), tcp:135134:96 (2), tcp:135134:97 (2), tcp:135134:98 (2), tcp:135134:99 (2), tcp:135134:100 (2), tcp:135134:101 (2), tcp:135134:102 (2), tcp:135134:103 (2), 
tcp:135134:104 (2), tcp:135134:105 (2), tcp:135134:106 (2), tcp:135134:107 (2), tcp:135134:108 (2), tcp:135134:109 (2), tcp:135134:110 (2), tcp:135134:111 (2), tcp:135134:112 (2), tcp:135134:113 (2), tcp:135134:114 (2), tcp:135134:115 (2) • @__n__ (222) : 1 (1), 2 (1), 3 (1), 4 (1), 5 (1), 6 (1), 7 (1), 8 (1), 9 (1), 10 (1), 11 (1), 12 (1), 13 (1), 14 (1), 15 (1), 16 (1), 17 (1), 18 (1), 19 (1), 20 (1), 21 (1), 22 (1), 23 (1), 24 (1), 25 (1), 26 (1), 27 (1), 28 (1), 29 (1), 30 (1), 31 (1), 32 (1), 33 (1), 34 (1), 35 (1), 36 (1), 37 (1), 38 (1), 39 (1), 40 (1), 41 (1), 42 (1), 43 (1), 45 (1), 46 (2), 47 (1), 48 (1), 49 (1), 50 (1), 51 (1), 52 (1), 53 (1), 54 (1), 55 (1), 56 (1), 57 (1), 58 (1), 59 (1), 60 (1), 61 (1), 62 (1), 63 (1), 64 (1), 65 (1), 66 (1), 67 (1), 68 (1), 69 (1), 70 (1), 71 (1), 72 (1), 73 (1), 74 (1), 75 (1), 76 (1), 77 (1), 78 (1), 79 (1), 80 (1), 81 (1), 82 (1), 83 (1), 84 (1), 85 (1), 86 (1), 87 (1), 88 (1), 89 (1), 90 (1), 91 (1), 92 (1), 93 (1), 94 (1), 95 (1), 96 (1), 97 (1), 98 (1), 99 (1), 100 (1), 101 (1), 102 (1), 103 (1), 104 (1), 105 (1), 106 (1), 107 (1), 108 (1), 109 (1), 110 (1), 111 (1), 112 (1), 113 (1), 114 (1), 115 (1), 116 (1), 117 (2), 118 (2), 119 (1), 120 (1), 121 (1), 122 (1), 123 (1), 124 (1), 125 (1), 126 (1), 127 (1), 128 (1), 129 (1), 130 (1), 131 (1), 132 (1), 133 (1), 134 (1), 135 (1), 136 (1), 137 (1), 138 (1), 139 (1), 140 (1), 141 (1), 142 (1), 143 (1), 144 (1), 145 (1), 146 (1), 147 (1), 148 (1), 149 (1), 150 (1), 151 (1), 152 (1), 153 (1), 154 (1), 155 (1), 156 (1), 157 (1), 158 (1), 159 (1), 160 (1), 161 (1), 162 (1), 163 (1), 164 (1), 165 (1), 166 (1), 167 (1), 168 (1), 169 (1), 170 (1), 171 (1), 172 (1), 173 (1), 174 (1), 175 (1), 176 (1), 177 (1), 178 (1), 179 (1), 180 (1), 181 (1), 182 (1), 183 (1), 184 (1), 185 (1), 186 (1), 187 (1), 188 (1), 189 (1), 190 (1), 191 (1), 192 (1), 193 (1), 194 (1), 195 (1), 196 (1), 197 (1), 198 (1), 199 (1), 200 (1), 201 (1), 202 (1), 203 (1), 204 (1), 205 (1), 206 (1), 207 
(1), 208 (1), 209 (1), 210 (1), 211 (1), 212 (1), 213 (1), 214 (1), 215 (1), 216 (1), 219 (1), 220 (1), 221 (1), 222 (1)|
|22.|__q__|4||
|23.|__signed__|1||
|24.|__trailer__|1||
---
layout: watch
title: TLP3 - 20/05/2020 - M20200520_074814_TLP_3T.jpg
date: 2020-05-20 07:48:14
permalink: /2020/05/20/watch/M20200520_074814_TLP_3
capture: TLP3/2020/202005/20200519/M20200520_074814_TLP_3T.jpg
---
## Version 5.18.0 (August 4, 2021)
- **Features**
- Changed when the close button and countdown timer are presented.
- Added InMobi, Fyber, Ogury, and Mintegral as supported networks.
- **Bug Fixes**
- Fixed a crash in `MPProgressOverlayView`.
- Fixed an issue where MRAID was not working correctly on iOS 15.
## Version 5.17.0 (May 11, 2021)
- **Features**
- Added support for SKAdNetwork 2.2 including view through attribution.
- Added support for iOS Simulators on Apple Silicon Macs.
- Bumped minimum Xcode version to Xcode 12.5.
- Renamed the fullscreen `willAppear` and `didAppear` callbacks to `willPresent` and `didPresent`.
- Updated OMSDK to 1.3.16.
- **Bug Fixes**
- Fixed an issue that prevented `SKStoreProductViewController` from being used for App Store and iTunes URLs in some cases.
- Fixed an issue that could prevent consent synchronization callbacks from being invoked.
- Fixed an issue that caused an end card image to center rather than fit when too big.
- Fixed an issue that could cause an ad with an invalid reward to not be treated as a rewarded ad.
## Version 5.16.2 (March 18, 2021)
- **Bug Fixes**
  - Address an issue with `radioAccessTechnologyString` on Xcode > 12.1.
  - Attempted to address an `NSInvalidArgumentException` for `NSLayoutConstraint` in the VAST player.
## Version 5.16.1 (February 19, 2021)
- **Bug Fixes**
- Fixed bug where users were not receiving rewards from Rewarded network ads using network adapters.
## Version 5.16.0 (February 16, 2021)
- **Features**
- Rewarded ads feature and API improvements.
- `MPRewardedVideo` has been renamed to `MPRewardedAds`. See the [API reference](https://developers.mopub.com/publishers/reference/ios/MoPub/) for more details.
- Removed support for the Native Video format.
- Added 5G cellular support.
- Deprecated `interstitialDidFailToLoadAd:`. Use `interstitialDidFailToLoadAd:withError:` instead.
- Addressed confusion in the naming and function of the `disappear` ad lifecycle callbacks.
- VAST creatives without file extensions will infer file extension from the MIME type.
- The MoPub SDK's module name has been renamed from `MoPub` to `MoPubSDK` due to a limitation with Swift's ability to resolve name collisions between a module and class.
- **Bug Fixes**
- Fixed bug where consent synchronization is fired twice at app launch.
- Fixed bug where the failure callback was not fired when a mediated adapter does not exist.
- Fixed bug where the `SKAdNetwork` time stamp was parsed as `integerValue` instead of `longLongValue`.
  - Fixed bugs related to unintentional initialization of the consent manager when the MoPub SDK has not been initialized.
## Version 5.15.0 (November 18, 2020)
- **Features**
- The MoPub iOS SDK now includes Swift 5.
- Updated countdown animation background color to black for better visibility.
- Enforce HTTPS for base URLs.
- Removed native video support.
- Add support for Snap Audience Network.
- **Bug Fixes**
- Fixed bug where app foregrounding was requesting a new banner ad instead of resuming the refresh timer.
- Fixed bug with animated GIFs in VAST end cards.
- Fixed bug with scheduled deallocation of HTML Viewability trackers.
- Fixed `SKStoreProductViewController` causing freezes on iOS 13.0 and 13.1 devices.
  - Fixed bug where attempting to instantiate a mediation adapter that does not exist at runtime would not fire the failure callback.
## Version 5.14.1 (October 5, 2020)
- **Bug Fixes**
- Fixed a bug where delegate methods `interstitialWillDisappear:`, `interstitialDidDisappear:`, `rewardedVideoAdWillDisappearForAdUnitID:`, and `rewardedVideoAdDidDisappearForAdUnitID:` did not fire.
## Version 5.14.0 (October 1, 2020)
- **Features**
- Add beta support for OMSDK version 1.3.4.
- iOS14 support for `SKAdNetwork`, `ATTrackingManagerAuthorizationStatus`, and location changes.
- Support Pangle as a certified mediation network.
- Remove Mintegral as a certified mediation network.
- Bump minimum Xcode version to Xcode 12.
- **Bug Fixes**
- Cleaned up the MoPub Xcode project to properly mark files as public or private.
- Fixed a bug where an ad may become frozen when `SKStoreProductViewController` is shown.
- Fixed a multithreaded crash in `MPVastModel`.
- Fixed a bug where videos slightly longer than 15 seconds were skippable. Videos with duration less than 16 seconds are considered unskippable.
## Version 5.13.1 (July 9, 2020)
- **Bug Fixes**
  - Fixed bug where mediated network rewards were given instead of the selected reward.
## Version 5.13.0 (June 15, 2020)
- **Features**
- Remove Moat and IAS measurement SDKs.
- Consolidate interstitials and rewarded ads into one container. Third party network adapters for these formats should now extend `MPFullscreenAdAdapter` and conform to `MPThirdPartyFullscreenAdAdapter`.
- Upped the minimum version to iOS 10.
- **Bug Fixes**
- Various bug fixes.
- Fixed multithreading crash in `MPTimer` due to null reference exception.
- Fixed bug where interstitial ads returning from `SKStoreProductViewController` are accidentally closed.
  - Fixed bug where VAST companion ad clickthrough trackers were fired even when no clickthrough URL was specified.
- Fixed bug where 302 redirects from https sources were not followed to the end of the redirect chain.
## Version 5.12.1 (April 16, 2020)
- **Bug Fixes**
- Fixed banner click trackers not firing for mediated networks that do not use MoPub's auto click tracking.
## Version 5.12.0 (April 6, 2020)
- **Features**
- Location setters for all formats are marked deprecated and will be removed in a future release.
- Added Mintegral as a supported network.
- A new field `appVersion` has been added to `MPImpressionData`.
- Update the Sample app to 64-bit architectures only.
- Added ad load history to the Saved Ads section in the Sample app.
- **Bug Fixes**
- Fixed potential multithreading crash in `MPAdServerURLBuilder`.
## Version 5.11.0 (February 4, 2020)
- **Features**
- Update GDPR logic to allow MoPub to reacquire consent for new vendors.
- Update our support for OpenRTB Native Ads to version 1.2 and add an optional `sponsored` text field for native ads.
- Removed deprecated custom event method `requestAdWithSize:customEventInfo:` in `MPBannerCustomEvent`, `requestInterstitialWithCustomEventInfo:` in `MPInterstitialCustomEvent`, `requestRewardedVideoWithCustomEventInfo:` in `MPRewardedVideoCustomEvent`, and `requestAdWithCustomEventInfo:` in `MPNativeCustomEvent`.
- **Bug Fixes**
- Fixed non-native SDK target compilation error.
- Fixed potential deadlock in `MPConsentManager` when scheduling `MPTimer`.
- Fixed potential crash in `MPTableViewAdPlacer` and `MPCollectionViewAdPlacer`.
## Version 5.10.0 (October 30, 2019)
- **Features**
- Added support for the Verizon native ad renderer.
- Deprecated base custom event `requestAd` calls without the `adMarkup` parameter.
- **Bug Fixes**
- Fixed non-native SDK target compilation error.
- Fixed potential crash in `MPTableViewAdPlacer` and `MPCollectionViewAdPlacer`.
- Removed extraneous `NSLog` statements for Release build configuration.
- Fixed VAST error code macro replacement in tracking URLs.
## Version 5.9.0 (September 16, 2019)
- **Features**
- Add iOS 13 support to both SDK and MoPub Sample app.
- Totally remove `UIWebView` implementation and comments in MoPub SDK and MoPub Sample app.
- Add multi-window support for MoPub Sample app in iPadOS 13. New window can be opened by Drag & Dropping an ad cell in the ad list.
- Remove support for `tel` and `sms` functions for MRAID ads.
- Add Dark Mode support for MoPub Sample app in iOS 13.
- Remove the Objective C sample app project.
- Adopt `XCFramework` and the new Xcode build system with fastlane script updates, and thus require Xcode 11 to build instead of Xcode 9.
- Remove deprecated VAST extension `MoPubViewabilityTracker`.
- Replace deprecated `MPMoviePlayerViewController` with `AVPlayerViewController`. This affects MRAID videos.
  - Replace deprecated `UIAlertView` with `UIAlertController`.
- **Bug Fixes**
- Update `MPRealTimeTimer` so that it can properly handle foreground notifications that aren't balanced with backgrounding notifications.
- Fix an assertion crash in GDPR Sync that only happens in debug builds.
- Present `SKStoreProductViewController` only in portrait mode, so that we can prevent a `SKStoreProductViewController` crash in landscape mode (as designed by Apple).
- Fix an infinite load ad bug that happens when the ad URL to retry is the same as the failed ad URL.
- Fix a bug where location information is not sent to Ad Server when location permission has been allowed, the app can collect PII, and no app-specified location is set.
## Version 5.8.0 (July 22, 2019)
- **Features**
- Minimum version of the MoPub SDK bumped to iOS 9.
- StoreKit Improvement: New Apple URL schemes for apps.apple.com, books.apple.com, and music.apple.com are now parsed for `SKStoreProductViewController`.
- StoreKit Improvement: Affiliate token and campaign token are now parsed for `SKStoreProductViewController`.
- Existing banner constants are deprecated in favor of new, configurable height-based constants. To use these, `MPAdView`'s frame must be set before an ad load is attempted.
- Updated `MPAdView`'s `initWithAdUnitId:size:`, `loadAd`, and `adViewDidLoadAd:` APIs by providing overloads `initWithAdUnitId:`, `loadAdWithMaxAdSize:`, and `adViewDidLoadAd:adSize:`, which move the requested ad size from initialization time to load time.
- `SFSafariViewController` is now exclusively used for in-app clickthrough destinations.
- Disallow the sending of empty ad unit IDs for consent.
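The size-at-load-time flow above can be sketched as follows (an illustrative fragment, not part of the release notes; the ad unit ID and max size are placeholders):

```objc
// Sketch: the requested ad size now moves to load time.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.banner = [[MPAdView alloc] initWithAdUnitId:@"<your-banner-ad-unit-id>"];
    self.banner.delegate = self;
    [self.view addSubview:self.banner];
    // Ask for a banner at most 50pt tall (320pt-wide slot assumed).
    [self.banner loadAdWithMaxAdSize:CGSizeMake(320.0, 50.0)];
}

// New delegate callback reporting the size the ad actually loaded at.
- (void)adViewDidLoadAd:(MPAdView *)view adSize:(CGSize)adSize {
    CGRect frame = view.frame;
    frame.size = adSize;
    view.frame = frame;
}
```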
- **Bug Fixes**
- iOS 13 fixes: Explicitly set `modalPresentationStyle` for all modals in the MoPubSDK to `UIModalPresentationFullScreen`, since iOS 13 beta 1 changed the default modal presentation behavior.
- Fixed occasional crash with `MPTimer` by ensuring it always runs on the main run loop.
- Fixed bug where banner and medium rectangle auto refresh timer was being fired even if the refresh interval was zero.
- Fixed bug where updated ad targeting parameters were not sent when banners were auto refreshing.
- Fixed a bug where the `UIButton+MPAdditions` category was impacting all `UIButton`s in the app. MoPub-specific `UIButton` customization is now contained in a subclass.
## Version 5.7.1 (June 3, 2019)
- **Features**
- Impression Level Revenue Data can now be received via a notification
- **Bug Fixes**
- Fixed occasional crash due to multithreading bug
## Version 5.7.0 (May 20, 2019)
- **Features**
- Impression Level Revenue Data: A data object that includes revenue information associated with each impression
- Verizon Ads SDK now supported as a mediated network
- Native ad renderer registration for `FacebookNativeCustomEvent` and `MillennialNativeCustomEvent` is removed from the SDK. Publishers must register renderers in their app.
- **Bug Fixes**
- Fixed bug where native video fires an impression when main image asset is missing
- Fixed MRAID off-screen compliance for resized ads on tablets
- Fixed crash in Canary App when tapping on the `+` on iPad
- Replaced deprecated usage of `openURL:` with `openURL:options:completionHandler:` for iOS10+
- Fixed bug where click trackers can fire more than once on HTML banners and HTML interstitials
- Fixed bug in Canary App where ad units that were read using the QR code reader were not being saved
- Fixed bug where GDPR consent dialog was allowed to be presented twice in a row
## Version 5.6.0 (March 18, 2019)
- **Features**
- Added `+` button to the Canary sample app allowing manual entry of custom ad units
- **Bug Fixes**
- MRAID orientation, expansion, and resizing edge case bug fixes
- MRAID expansion will no longer trigger a click tracking event
- MRAID logging no longer spams the device console
- Fixed position bug of the Rewarded Video countdown timer when rotating the device after the ad loads
## Version 5.5.0 (January 28, 2019)
- **Features**
- Advanced Bidding automatically initializes
- GDPR legitimate interest API now available; publishers may opt into allowing supported networks to collect user information on the basis of legitimate interest.
- We now distribute separate frameworks for simulator, device, and universal architectures
- **Bug Fixes**
- Fixed rewarded video state occasionally not being reset correctly upon load failure
- Tweaked MRAID `ready` event timing so that it's in-spec
- Canary test app improvements and bug fixes
## Version 5.4.1 (November 28, 2018)
- **Bug Fixes**
- Changed the MoPubSampleApp+Framework target to MoPubSampleApp in the Objective-C Sample App.
- Fixed crash when `MPTableViewAdPlacer` makes multiple ad requests within a short amount of time.
- Fixed bug with the internal state of rewarded video when the video fails to play.
## Version 5.4.0 (October 3, 2018)
- **Features**
- SDK distribution as a dynamic framework is now available.
- Local extras are now supported for all ad formats.
- **Bug Fixes**
- HTTP error codes now include the localized error description.
- Added missing mraid.js file protections when showing MRAID ads.
- Fixed native video crash.
- Fixed native ad timeout timer invalidation.
## Version 5.3.0 (August 15, 2018)
- **Features**
- Laying the foundation for platform optimization work that enables the SDK to receive multiple ad responses per ad request, reducing the number of round trips between the server and the client required to fill the requests.
## Version 5.2.0 (July 9, 2018)
- **Features**
- SDK initialization is required for ads to load.
- Added callback to the consent dialog when it is dismissed.
- **Bug Fixes**
- Synchronized access to shared `NSMutableDictionary` in `MPHTTPNetworkSession`.
- Video ads using Device orientation now appear aligned correctly on iPhone X.
## Version 5.1.0 (June 5, 2018)
- **Features**
- Updated `MPReachability` to be IPv6 compliant.
- Allow publishers to determine which users should be treated as GDPR compliant users through the new API `forceGDPRApplicable`.
- Alert a publisher (through logs) when they are trying to use the new GDPR consent flow without being whitelisted.
- Banner refresh will only occur after an impression.
## Version 5.0.0 (May 14, 2018)
- **Features**
- General Data Protection Regulation (GDPR) update to support a way for publishers to determine GDPR applicability and to obtain and manage consent from users in the European Economic Area, the United Kingdom, or Switzerland to serve personalized ads.
- New SDK initialization method to initialize consent management and rewarded video ad networks. Required for receiving personalized ads. In future versions of the SDK, initialization will be required to receive ads.
- Updated the networking stack to use `NSURLSession` in place of the deprecated `NSURLConnection`.
- Updated ad requests to use POST instead of GET.
- **Bug Fixes**
- Renamed the `/MoPubSDK/Native Ads/` folder to `/MoPubSDK/NativeAds/`.
- Removed the usage of deprecated `shouldAutorotateToInterfaceOrientation`.
## Version 4.20.1 (March 12, 2018)
- **Bug Fixes**
- Fixed compatibility issues with some fullscreen ads on iPhone X
## Version 4.20.0 (February 20, 2018)
- **Bug Fixes**
- Fixed ad expiration check for rewarded ad formats
- **Ad Network Mediation Updates**
- Network mediation adapters are now in a separate repository to enable an independent release cadence and faster updates to the adapters. Please find the new location [here](https://github.com/mopub/mopub-ios-mediation).
## Version 4.19.0 (December 11, 2017)
- **Bug Fixes**
- Ensure proper viewability initialization before ad content is loaded
- Fire appropriate error delegate when rewarded video ad view is not ready to be shown
- Resolve video playback sizing issue when the creative's `MoPubForceOrientation` is set to "Device"
- Resolve WKWebView sizing and alignment issues on iPhoneX
- **Ad Network Mediation Updates**
- Certified Facebook Audience Network 4.26.1
- Certified Flurry 8.1.0
- Added support for Millennial/AOL Rewarded Video adapters for 6.6.0
## Version 4.18.0 (November 1, 2017)
- **Features**
- iPhone X compatibility improvements, including moving the close button into the safe area.
- **Bug Fixes**
- Fixed a bug with unspecified rewarded video currencies.
- Fixed C99 compilation bug.
- **Ad Network Mediation Updates**
- AdColony 3.2.1
- AdMob 7.24.1
- AOL 6.6.0 (formerly Millennial)
- Chartboost 7.0
- Facebook Audience Network 4.26.0
- Tapjoy 11.11.0
- Unity Ads 2.1.1
- Vungle 5.3.0
## Version 4.17.0 (September 27, 2017)
- **Features**
- Rewarded videos can now optionally pass back custom data to the publisher's reward server.
- Updated the minimum iOS version of the SDK to iOS 8.
- Update Facebook adapter with non-whitespace clickable policy.
## Version 4.16.0 (August 23, 2017)
- **Features**
- Added viewability support for Integral Ad Science (IAS) and Moat, two of the leading independent viewability measurement providers
- To disable this feature, see note below on [Disabling Viewability Measurement](#disableViewability).
- New app launch rewarded video initialization method for mediated network SDKs
- **Bug Fixes**
- Fixed native video crash caused by empty VAST tracking event
- Prevent interstitials from firing clicks without user interaction
### <a name="disableViewability"></a>Disabling Viewability Measurement
There are a few options for opting out of viewability measurement:
##### Opting Out in a Manual Integration
Before dragging the MoPubSDK folder into your Xcode project, simply delete the “Moat” folder in MoPubSDK/Viewability/ to opt out of Moat, or the “Avid” folder to opt out of IAS. If you would like to opt out of both, delete both folders.
##### Opting Out in a CocoaPods Integration
Including `pod 'mopub-ios-sdk'` in your Podfile will include both IAS and Moat SDKs, as well as the MoPub SDK. In order to opt out:
- `pod 'mopub-ios-sdk/Avid'` will include the IAS SDK, but not the Moat SDK, as well as the MoPub SDK.
- `pod 'mopub-ios-sdk/Moat'` will include the Moat SDK, but not the IAS SDK, as well as the MoPub SDK.
- `pod 'mopub-ios-sdk/Core'` will only include the MoPub SDK, with viewability measurement totally disabled.
Make sure to run `pod update` once your Podfile is set up to your preferences.
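Put together, a Podfile opting into a single variant might look like this (a sketch; the target name and platform version are placeholders):

```ruby
# Podfile sketch — choose exactly one mopub-ios-sdk subspec.
platform :ios, '8.0'

target 'MyApp' do
  # pod 'mopub-ios-sdk'        # MoPub + IAS + Moat (default)
  # pod 'mopub-ios-sdk/Avid'   # MoPub + IAS only
  # pod 'mopub-ios-sdk/Moat'   # MoPub + Moat only
  pod 'mopub-ios-sdk/Core'     # MoPub only, viewability disabled
end
```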
##### Software Disable
If you would like to opt out of viewability measurement but do not want to modify the MoPub SDK, a function is provided for your convenience. As soon as possible after calling `[[MoPub sharedInstance] start]`, call `[[MoPub sharedInstance] disableViewability:(vendors)]`. In place of “(vendors)”, `MPViewabilityOptionIAS` will disable IAS but leave Moat enabled, `MPViewabilityOptionMoat` will disable Moat but leave IAS enabled, and `MPViewabilityOptionAll` will disable all viewability measurement.
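In code, the opt-out described above might look like this in the app delegate (a sketch using only the calls named in this section):

```objc
// Sketch: start the SDK, then immediately disable viewability vendors.
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [[MoPub sharedInstance] start];
    // MPViewabilityOptionIAS disables IAS only, MPViewabilityOptionMoat
    // disables Moat only, MPViewabilityOptionAll disables both.
    [[MoPub sharedInstance] disableViewability:MPViewabilityOptionAll];
    return YES;
}
```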
### Disclosure
MoPub v4.16 SDK integrates technology from our partners Integral Ad Science, Inc. (“IAS”) and Moat, Inc. (“Moat”) in order to support viewability measurement and other proprietary reporting that [IAS](https://integralads.com/capabilities/viewability/) and [Moat](https://moat.com/analytics) provide to their advertiser and publisher clients. You have the option to remove or disable this technology by following the opt-out instructions [above](#disableViewability).
If you do not remove or disable IAS's and/or Moat’s technology in accordance with these instructions, you agree that IAS's [privacy policy](https://integralads.com/privacy-policy/) and [license](https://integralads.com/sdk-license-agreement) and Moat’s [privacy policy](https://moat.com/privacy), [terms](https://moat.com/terms), and [license](https://moat.com/sdklicense.txt), respectively, apply to your integration of these partners' technologies into your application.
## Version 4.15.0 (June 19th, 2017)
- **Bug Fixes**
- Updated Facebook Audience Network banner and interstitial impression tracking
- Allow taps to pass through the gradient overlays for rewarded videos
## Version 4.14.0 (May 10th, 2017)
- **Features**
- For Rewarded ads, the client-side callback will now be invoked when using server-side rewarding.
- Non-mediated interstitial, rewarded, and native ad placer ads will now expire after 4 hours.
- **Bug Fixes**
- Fix old custom events that use the wrong native renderer.
- Replace usage of `typeof` with `__typeof__` for C99 and C11 compliance.
- Fix `CFBridgingRetain` casting bug.
- Native ad impression tracker will now fire while scrolling.
- Fix HTML click tracker to fire when using `window.location` and `window.open`.
## Version 4.13.1 (April 6th, 2017)
- **Bug Fixes**
- Fixed compile error in the MoPub Base SDK Excluding Native bundle.
## Version 4.13.0 (March 23rd, 2017)
- **Features**
- Added support for mediation of Google AdMob rewarded video demand (Google Mobile Ads SDK v7.19.0).
- Google AdMob native ads mediation is now generally available (Google Mobile Ads SDK v7.19.0).
- Updated the Tapjoy network mediation adapter to support Tapjoy SDK v11.10.0
- **Bug Fixes**
- Introduced additional preventative measures to improve creative quality.
## Version 4.12.0 (February 9th, 2017)
- **Features**
- Rewarded ad units now support rich media.
- Allow MoPub static native renderer to render Flurry native ads.
- Removed size limit for native ad main images.
- **Bug Fixes**
- Native video selection logic now filters by supported MIME types.
- Ad placer now supports section count.
- Fix `CFStringRef` variable initialization.
## Version 4.11.1 (November 28th, 2016)
- **App Transport Security Updates**
- Checks for `NSAllowsArbitraryLoadsInMedia` were changed to `NSAllowsArbitraryLoadsForMedia`, per updated Apple documentation
- Resolves issue in which explicitly using `NSAllowsArbitraryLoadsForMedia` or `NSAllowsArbitraryLoadsInWebContent` causes HTTP clickthroughs not to resolve on iOS 10.1 or higher
## Version 4.11.0 (November 10th, 2016)
- **The MoPub SDK now uses WKWebView to display ads when possible. Backwards compatibility for old OS versions is retained using UIWebView.**
- **Native video start tracker now fires immediately upon successful video playback.**
- **Bug fixes**
- Native ads and native video ads now correctly fire impression trackers while scrolling.
## Version 4.10.1 (October 28th, 2016)
- **Bug fixes**
- Fixed iOS 10 bug that causes `SKStoreProductViewController` to crash if the app does not list portrait as a supported orientation.
## Version 4.10.0 (October 18th, 2016)
- **Certified FAN 4.15.1**
- **Certified Chartboost 6.5.2**
- **Certified Yahoo 7.6.4**
- **Certified TapJoy support for 11.8.2**
- **Certified Millennial support for 6.3.1**
- **Certified Vungle 4.0.6**
- **Bug fixes**
- Added support for the CocoaPods use_frameworks! directive
## Version 4.9.1 (September 14th, 2016)
- **iOS 10 compatibility updates**
- Fixed an issue related to screen bounds calculation
- **Removed EventKit, EventKitUI frameworks and a few files related to EventKit and MRAID image downloader**
- Please completely remove the MoPub SDK from your project and re-integrate this version to ensure that files are properly removed from your project
## Version 4.9.0 (September 1st, 2016)
- **Modular SDK - publishers can download the base or base excluding native SDK package**
- **Removed the full SDK bundle**
- **iOS 10 compatibility updates**
- Rotating frequency capping ID for non-targeted ads under 'Limit Ad Tracking' setting
- Removed save picture and save to calendar MRAID functionality
- **Removed iAd and InMobi adapters**
- **Added Cocoapods module name: "MoPub"**
- **Bug fixes**
- Fixed an issue when multiple rewarded videos are loaded at the same time
## Version 4.8.0 (August 1st, 2016)
- **Renamed `MPError` enums to avoid a possible naming conflict with MediaPlayer framework errors**.
## Version 4.7.0 (June 2nd, 2016)
- **Rewarded video server-side currency rewarding (Beta)**.
## Version 4.6.0 (April 21st, 2016)
- **Certified Chartboost version 6.4.0**
- **Certified Tapjoy version 11.5.1**
- **Bug fixes**
- Fixed resource loading issues when using cocoapods and frameworks
## Version 4.5.1 (April 4th, 2016)
- **Bitcode support for MoPub Fabric Kit**
## Version 4.5 (March 24th, 2016)
- **Rewarded video support from the MoPub Marketplace (Beta)**
- **Bug fixes**
- The SDK now correctly sends matched modal presented/dismissed callbacks on clickthrough
## Version 4.4 (February 17th, 2016)
- **Enabled SSL ad requests by default**
- **Bug fixes**
- Fixed native video impression tracking
- Made closeable hot spot consistent across all full-screen creative types
## Version 4.3 (December 15th, 2015)
- **Minor SDK improvements**.
## Version 4.2 (November 30th, 2015)
- **Upgraded Facebook SDK support to 4.8.0**.
- Facebook Audience Network custom events for native and native video automatically display Facebook's AdChoices icon.
- **Added mediation support for Facebook video**.
- **Bug fixes**
- Added the `mp` prefix to common constants.
- Fixed minor issue with video resuming during background to foreground transitions.
- Fixed minor issue generating the 'mute' video status event.
## Version 4.1 (November 12th, 2015)
- **Added MoPub prefixes to constants and category methods**.
- **Certified Tapjoy 11.2.2**.
- **Certified Vungle 3.2.0**.
## Version 4.0 (October 6th, 2015)
- **Minimum supported iOS version is now 6.0**.
- **Updated native ad integration APIs**.
- **Improved native ad placer caching and request logic**.
- **Clicks are now automatically tracked for native ads that use the manual integration**.
- **Removed deprecated classes**.
- Removed legacy custom event classes deprecated in 1.10.
- Removed MPTableViewAdManager class deprecated in 3.0.
## Version 3.13 (September 17th, 2015)
- **Added iOS 9 support**.
## Version 3.12 (August 31st, 2015)
- **Added Privacy Information icon support for MoPub native ads**.
- **GA of rewarded video ad mediation**.
- Added mediation support for AdColony, Chartboost, Vungle, and Unity rewarded video ads.
## Version 3.11 (August 20th, 2015)
- **Updated Millennial Media custom events (Millennial Media SDK 6.0+ only)**.
## Version 3.10 (August 3rd, 2015)
- **Minor improvements**.
- **Bug fixes**.
- `didDismissInterstitial` is now called when the dismiss animation has completed.
## Version 3.9 (July 1st, 2015)
- **Added VAST 3.0 standard support for video ads**.
- **Improved video player UX**.
- **Improved error messages**.
- **Improved deep link handling**.
- **Bug fixes**.
- Fixed clickthrough behavior for MRAID ads that use iframes.
## Version 3.8 (June 1st, 2015)
- **Minor improvements**.
## Version 3.7 (April 30th, 2015)
- **Added iAd medium rectangle ad support**.
- **Certified Google AdMob SDK version 7.1.0**.
- **Certified Greystripe SDK version 4.4.0**.
- **Certified Vungle SDK version 3.0.13**.
- Added click callback support.
- **Bug fixes**.
- Addressed a race condition when receiving location updates after calling `-[MPGeolocationProvider disableLocationUpdates:]`.
## Version 3.6 (April 3rd, 2015)
- **Bug fixes**.
- Fixed crash caused by some MRAID ads attempting to set an orientation that the app doesn't support.
## Version 3.5 (March 10th, 2015)
- **Deprecated custom event class methods and constants for setting ad network parameters**.
- **Changed banner minimum refresh time interval to 10 seconds**.
- **Greystripe custom events now accept parameters configured using app.mopub.com**.
## Version 3.4 (January 30th, 2015)
- **Certified Facebook SDK Version 3.21.1**.
- **Bug fixes**.
- Fixed MRAID force orientation command for MRAID interstitials.
- Fixed interstitial bug where sound and video would continue to play after dismissal.
## Version 3.3 (December 8th, 2014)
- **MRAID 2.0 support**. The MoPub SDK is now compliant with the MRAID 2.0 specification to enable rich media ads in banners and interstitial ad units. Learn more about MRAID from the [IAB](http://www.iab.net/MRAID#MRAID). To minimize integration errors, please completely remove the existing MoPub SDK from your project and then integrate the latest version.
- **Automatic geolocation updates**. If your app already has location permissions, the MoPub SDK will automatically attempt to acquire location data for ad requests. Please use `locationUpdatesEnabled` in `MoPub.h` to opt out of this functionality. The MoPub SDK will never prompt the user for permission if location permissions are not currently granted.
- **Added support for AdColony SDK 2.4.12**.
- **Bug fixes**.
- Fixed displaying previously cached Chartboost interstitials.
- Fixed crash caused by refreshing Facebook banners after click.
- Fixed iAd interstitial dismissed callback on iOS 8
- Fixed HTML interstitial duplicate click trackers
## Version 3.2 (October 17th, 2014)
- **We have launched a new license as of version 3.2.0.** To view the full license, visit [http://www.mopub.com/legal/sdk-license-agreement/](http://www.mopub.com/legal/sdk-license-agreement/)
## Version 3.1 (October 9th, 2014)
- Updated native mediation framework to support Facebook Audience Network SDK 3.18.2
- If you're directly using `MPNativeAd`, you should implement the `MPNativeAdDelegate` protocol found in `MPNativeAdDelegate.h` and set the delegate property on your `MPNativeAd` instance.
- Added convenience methods to `MPTableViewAdPlacer` and `MPCollectionViewAdPlacer` that default to using server-controlled native ad positioning
- `+ (instancetype)placerWithTableView:viewController:defaultAdRenderingClass:(Class)defaultAdRenderingClass;`
- `+ (instancetype)placerWithCollectionView:viewController:defaultAdRenderingClass:(Class)defaultAdRenderingClass;`
- Fixed compiler error in `MPDiskLRUCache.m` if `OS_OBJECT_USE_OBJC` is false
## Version 3.0 (September 30th, 2014)
- **The MoPub SDK now uses Automatic Reference Counting**
- **Swift support:** to use the MoPub SDK in your Swift project, simply add `MoPubSDK/MoPub-Bridging-Header.h` to your project and ensure the Objective-C Bridging Header build setting (under Swift Compiler - Code Generation) has a path to the header.
- Updated Chartboost custom event (Chartboost SDK 5.0.1)
- Bug fixes
- mraid.js will reject mraid calls until the SDK signals it is ready
- banner ads will pause autorefresh when the app enters the background and resume autorefresh when the app enters the foreground
### IMPORTANT UPGRADE INSTRUCTIONS
As of version 3.0.0, the MoPub SDK uses Automatic Reference Counting. If you're upgrading from an earlier version (2.4.0 or earlier) that uses Manual Reference Counting, in order to minimize integration errors with the manual removal of the `-fno-objc-arc` compiler flag, our recommendation is to completely remove the existing MoPub SDK from your project and then integrate the latest version. Alternatively, you can manually remove the `-fno-objc-arc` compiler flag from all MoPub SDK files. If your project uses Manual Reference Counting, you must add the `-fobjc-arc` compiler flag to all MoPub SDK files.
## Version 2.4 (August 28th, 2014)
- **Simplified native ads integration**: integration instructions and documentation are available on the [GitHub wiki](https://github.com/mopub/mopub-ios-sdk/wiki/Native-Ads-Integration)
- Updated Vungle custom event (Vungle SDK 3.0.8)
- Optional method `- (void)interstitialDidReceiveTapEvent:` added to `MPInterstitialAdControllerDelegate`
- Hardened native ad custom events against invalid image URLs
## Version 2.3 (July 17th, 2014)
- MoPub base SDK is now 64-bit compatible (Please check mediated networks for 64-bit support)
- Certified support for InMobi 4.4.1, Greystripe/Conversant 4.3, and AdMob 6.9.3
- Additional measures to prevent autoloading deep-links without user interaction for banners
- Bug fixes
- A cached Millennial Media interstitial will be correctly loaded
- Fixed crash if the close button is quickly tapped after tapping an MRAID interstitial
## Version 2.2 (June 19th, 2014)
- **Native ads mediation**: integration instructions and documentation are available on the [GitHub wiki](https://github.com/mopub/mopub-ios-sdk/wiki/Integrating-Native-Third-Party-Ad-Networks). Facebook and InMobi native ads may be mediated using the MoPub SDK.
- **Native ads content filtering**: Added the ability to specify which native ad elements you want to receive from the MoPub Marketplace to optimize bandwidth use and download only required assets, via `MPNativeAdRequestTargeting.desiredAssets`. This feature only works for the six standard Marketplace assets, found in `MPNativeAdConstants.h`. Any additional elements added in direct sold ads will always be sent down in the extras.
- Added star rating information to the `MPNativeAd` object, via `MPNativeAd.starRating`. This method returns an `NSNumber` (double value) corresponding to an app's rating on a 5-star scale.
- Bug fixes
- Handle Millennial Media SDK's `MillennialMediaAdWillTerminateApplication` notification
- Ensured that banners never autorefresh until they have been loaded at least once
## Version 2.1 (May 15th, 2014)
- Improved user privacy protection
- Device identifiers are removed from logging output
- Improved user protection against auto-dialing ads
- Prompt user for confirmation when a `tel` URL is encountered
- Updated Millennial Media custom events (Millennial Media SDK 5.2+ only)
- Updated Vungle custom event (Vungle SDK 2.0+ only)
### Version 2.1.1 (May 22nd, 2014)
- Fixed Millennial Media SDK 5.2 banner custom event failover
## Version 2.0 (April 23rd, 2014)
- Added support for MoPub Native Ads. Please view the integration wiki [here](https://github.com/mopub/mopub-ios-sdk/wiki/Native-Ads-Integration).
- Updated the minimum required iOS version to iOS 5.0
- Removed `TouchJSON` dependency. `TouchJSON` files may be removed from your project.
## Version 1.17 (November 20, 2013)
- AdColony Custom Event
- Supports AdColony as a custom native ad network for interstitial videos. Note that V4VC (virtual currency reward) is currently not supported.
- Handle ISO Latin-1 site encoding in addition to UTF-8
- Bug fixes
### Version 1.17.3.0 (March 20th, 2014)
- Updated Chartboost custom event (Chartboost SDK 4.0+ only)
- Bug fixes
- Fixed iOS 7 bug where certain interstitial images may fail to load
### Version 1.17.2.0 (February 20th, 2014)
- Updated InMobi custom events (InMobi SDK 4.0.3+ only)
- Bug fixes
- MRAID viewable property now correctly updates on app background and resume
- MRAID command urls are no longer re-encoded for processing
### Version 1.17.1.0 (January 23rd, 2014)
- Sample app improvements
- Improved manual ad unit entry view
- Save manually entered ad unit ids
- Ability to enter keywords for each ad unit
- Bug fixes
- MRAID `isViewable` command now correctly returns a boolean value
## Version 1.16 (October 15, 2013)
- Creative Controls
- Creative Flagging
- **Important**: The `MPAdAlertGestureRecognizer` and `MPAdAlertManager` classes as well as `MessageUI.framework` must be added to your project to enable flagging functionality.
- Allows users to report certain inappropriate ad experiences directly to MoPub with a special gesture.
- User must swipe back and forth at least four times within the ad view to flag a creative.
- Swipes must cover more than ⅓ of the ad width and must be completely horizontal.
- Only works for direct sold, Marketplace, and server to server ad network ads.
- Blocked Popups
- JavaScript alert, confirm, and prompt dialogs are blocked.
- Blocked Auto-redirects
- Ads that automatically redirect users to another page without user interaction are automatically blocked.
- MoPub Video Pre-caching
- Video ads from the Marketplace will be pre-cached automatically and videos will not be shown until they can play without additional buffering.
- Simple Ads Demo Improvements
- 300x250 and 728x90 test spots added to the demo app.
- Vungle Custom Event
- Supports Vungle as a custom native ad network for interstitial videos.
- SKStoreProductViewController iOS 7 Orientation Crash Fix
- Fixes iOS 7 bug that causes `SKStoreProductViewController` to crash if the app does not list portrait as a supported orientation.
- Log more readable message in response to the "no ads available" server error.
- Updated `mraid.getVersion()` to return 2.0
### Version 1.16.0.1 (October 24, 2013)
- MRAID commands now properly handle encoded URLs.
## Version 1.14 (September 12, 2013)
- iOS 7 Gold Master support
- Verified compatibility with latest Millennial iOS SDK (5.1.1)
- Updated support for InMobi SDK version 4.0
- Bug fixes
#### Updates to InMobi Integrations
- **Important**: As of version 1.14.0.0, the InMobi custom events packaged with the MoPub SDK only support InMobi version 4.00 and up. Follow the instructions [here](http://www.inmobi.com/support/art/25856216/22465648/integrating-mopub-with-inmobi-ios-sdk-4-0/) to integrate InMobi version 4.00 and up. If you would like to continue to use a prior version of the InMobi SDK, do not update the custom event files and follow the instructions [here](http://developer.inmobi.com/wiki/index.php?title=MoPub_InMobi_iOS) to integrate.
### Version 1.14.1.0 (September 18, 2013)
- Fixed an issue causing certain interstitials to be incorrectly centered or sized
- Updated the SDK bundle to include the Millennial Media 5.1.1 SDK
## Version 1.13 (August 22, 2013)
- Added support for creating calendar events, storing pictures, and video playback via MRAID APIs
- Fixed a rendering issue with HTML interstitials on iOS 5
- Fixed crashes resulting from delegate callbacks being executed on deallocated objects
## Version 1.12 (April 25, 2013)
#### Updates to Third Party Integrations
- Third-party ad network integrations are **now implemented as custom events instead of adapters**.
> **Please remove any old adapters from your code and use the new custom events located in the `AdNetworkSupport` folder instead.**
- Added support for Millennial SDK 5.0
- Updated Chartboost integration to honor the location parameter (configurable via the server)
- Updated Custom Events API.
> **If you have implemented a Custom Event, please read the Custom Events [documentation](https://github.com/mopub/mopub-ios-sdk/wiki/Custom-Events) and update your code appropriately.**
#### Updates to the MoPub SDK
- The MoPub SDK now requires **iOS 4.3+**
- Removed all references to `[UIDevice uniqueIdentifier]`
- Added support for opening iTunes links in an `SKStoreProductViewController`
- Added [session tracking](https://github.com/mopub/mopub-ios-sdk/wiki/Conversion-Tracking#session-tracking)
- Added numerous data signals (wireless connectivity, location accuracy, bundle version, telephony information) to ad requests
- Added test coverage to MoPub SDK
#### Distribution and Documentation Updates
- Added .zip archive distribution options with bundled third party network SDKs. Learn more at the updated [wiki](https://github.com/mopub/mopub-ios-sdk/wiki/Getting-Started).
- Added appledoc style [Class Documentation](https://github.com/mopub/mopub-ios-sdk/tree/master/ClassDocumentation)
- Updated the MoPub Sample Application
### Version 1.12.5.0 (August 1, 2013)
- Updated to support Millennial SDK 5.1.0
- Fixed warnings resulting from duplicate category methods
- Fixed a crash occurring when an interstitial was tapped and dismissed immediately afterwards
### Version 1.12.4.0 (June 26, 2013)
- Fixed a memory leak when displaying MRAID ads
### Version 1.12.3.0 (June 18, 2013)
- Fixed inconsistency between ad request user agent and click-handling user agent
- Fixed crashes that occur when banners are deallocated in the process of displaying modal content
### Version 1.12.2.0 (June 7, 2013)
- Fixed issue causing expanded MRAID banner ads to obscure modal content
- Fixed issue in which impressions were not tracked properly for MRAID banner ads
- Added new API methods on `MPAdView` for managing ad refresh behavior (`-startAutomaticallyRefreshingContents` and `-stopAutomaticallyRefreshingContents`)
- Deprecated `ignoresAutorefresh` property on `MPAdView`
### Version 1.12.1.0 (May 13, 2013)
- Fixed issue causing banners from custom HTML networks to be improperly sized
- Updated the SDK bundle to include the Millennial Media 5.0.1 SDK
### Version 1.12.0.1 (April 26, 2013)
- Fixed some leaks reported by the static analyzer
## Version 1.11 (March 13, 2013)
- Fixed issue causing a crash for legacy custom event methods
- Fixed issue causing refresh timer to not be scheduled properly on connection errors
- Updated the sample Chartboost custom event to avoid improperly setting the Chartboost delegate to nil in `-dealloc`
## Version 1.10 (February 13, 2013)
- Introduced custom event classes
- Fixed issue causing metrics-recording URLs to be incorrect when certain ad sources fail
- Fixed issue causing interstitials to be sized incorrectly when the status bar changes state
- Fixed issue preventing loading indicator from being dismissed properly for HTML interstitials
- Fixed issue that allows the browser controller to continue loading after it has been dismissed
- Added 'testing' property on `MPAdView` and `MPInterstitialAdController`
- Increased accuracy of iAd impression tracking
## Version 1.9.0.0 (September 27, 2012)
- Added support for iOS 6 and the new iPhone 5 screen size
- Added support for the Facebook ads test program
- Added support for `ASIdentifierManager` (`UIDevice.identifierForAdvertising` replacement)
- Re-introduced UDID as a fall-back identifier on earlier iOS versions (with an opt-out mechanism)
- Fixed issues with redirecting certain native iOS URLs (e.g. itunes.apple.com) in the in-app browser
- Fixed an issue in which an interstitial might not dismiss properly when leaving an app via a click
- Updated the SimpleAdsDemo sample app for iOS 6
- Added clarity to certain console log entries
- Added some minor visual improvements to the click progress indicator
## Version 1.8.0.0
- Fixed a crash in `MPAdManager` due to uncanceled NSURLConnections
- Fixed an issue with mraid://open URL decoding
- Fixed an issue in which third-party interstitials could block the display of subsequent HTML interstitials
- Fixed an issue in which third-party interstitials could trigger lifecycle callbacks after expiration
- Added iOS 6 view controller auto-rotation methods to `MPInterstitialAdController`
- Added support for iOS 6 advertising identifier
- Removed references to `-[UIDevice uniqueIdentifier]` and OpenUDID
- Added runtime checks for `CALayer` and `UIActionSheet` selectors to prevent crashes on iOS 3.1
- Improved the Millennial interstitial adapter to handle all return values from `-checkForCachedAd`
## Version 1.7.0.0
- Improved click experience to avoid blank screens when loading pages with many redirects
- Fixed an issue in which `MPAdView` would implicitly change its 'hidden' property
- Fixed an issue in which the in-app browser failed to dismiss properly upon `-[UIApplication openURL:]`
- Fixed issues in which the `MRAID.isViewable` method would erroneously return true
- Fixed a divide-by-zero exception which occurred when presenting MRAID interstitials
## Version 1.6.0.0
- Added new API method for displaying an interstitial (`-showFromViewController:`)
- Added new delegate property on `MPInterstitialAdController`
- Deprecated old API method for displaying an interstitial (`-show:`)
- Deprecated parent property on `MPInterstitialAdController`
- Deprecated various callbacks in `MPInterstitialAdControllerDelegate`
## Version 1.5.0.0
- Added support for Millennial Media SDK 4.5.5
- Modified Millennial Media interstitial adapter to be more robust to ad display failures
## Version 1.4.0.0
- Reduced the amount of logging messages regarding autorefresh
- Modified JSON deserializer to avoid getting NSNull objects
- Fixed issue in which interstitials could appear blank upon repeated show: calls
- Removed call to deprecated `SKMutablePayment` class method
- Added APIs for enabling and disabling the in-app purchase transaction observer
- Fixed a memory leak in `MPInterstitialAdController`
- Added support for OpenUDID as an optional replacement for `UIDevice`'s `-uniqueIdentifier`
## Version 1.2.0.0
- Fixed a bug in which landscape interstitials appeared off-center on iOS 5.0+
- Fixed some static analyzer warnings in `MPAdManager` and `MPAdBrowserController`
- Fixed a memory leak in `MPAdConversionTracker`
- Changed '\*\*\*CLEAR\*\*\*' message to 'No ad available' for clarity
- Added support for Millennial Media leaderboard ads
- Changed behavior of `-setIgnoresAutorefresh:` to pause (rather than cancel) existing timers
- Added support for interstitial custom events
# Debate Map (Server)
Codebase for the Debate Map website's backend ([debatemap.app](https://debatemap.app)).
## Setup
> Continued from: https://github.com/debate-map/app#setup
### Environment variables
Copy the `.env.template` file in the repo root, rename the copy to `.env`, and fill in the necessary environment-variables. (The sections below will show which of those environment variables are needed, and how to supply them.)
### 1) Local server, base
1) Ensure [PostgreSQL](https://www.postgresql.org/) (v10+) is installed.
2) Ensure your PostgreSQL server [has logical decoding enabled](https://www.graphile.org/postgraphile/live-queries/#graphilesubscriptions-lds), by ensuring the following settings are set in `postgresql.conf` (and then restarting PostgreSQL):
```
wal_level = logical
max_wal_senders = 10
max_replication_slots = 10
```
(Note: you can determine where your `postgresql.conf` file is by running `psql template1 -c 'SHOW config_file'`)
3) Ensure the `wal2json` PostgreSQL plugin is installed: https://github.com/eulerto/wal2json#build-and-install
4) Create a Postgres database for this project, by running: `createdb debate-map`
5) Init `debate-map` db in PostgreSQL, by running `yarn start server.initDB`.
### 2) Local server, using docker
Note: The docker images produced directly will have the name `dm_server`.
1) Install Docker Desktop: https://docs.docker.com/desktop
2) Install the Docker "dive" tool (helps with inspecting image contents without starting a container): https://github.com/wagoodman/dive
2.1) In addition, make a shortcut to `\\wsl$\docker-desktop-data\version-pack-data\community\docker\overlay2`; this is the path you can open in Windows Explorer to view the raw files in the docker-built "layers". (ie. your project's output-files, as seen in the docker builds)
3) For direct docker builds, run `npm start server.dockerBuild`. (image-name: `dm_server`)
### 3) Local server, using docker + kubernetes (k3d/kind) + skaffold (helper)
Note: The docker images produced by skaffold will have the name `packages-server`.
1) Create your cluster in k3d or kind.
1.A.1) For k3d, install from here: https://k3d.io/#installation
1.A.2) Set up local kubernetes cluster, using k3d: `k3d cluster create main-1` (in the future, run `k3d cluster start main-1` to start the cluster, and `k3d cluster stop main-1` to stop)
1.B.1) For kind, install from here: https://kind.sigs.k8s.io/docs/user/quick-start
1.B.2) Set up local kubernetes cluster, using kind: `kind create cluster --name main-1` (in the future, use regular docker commands to start/stop the cluster nodes, eg. `kubectl cluster-info --context kind-main-1`)
2) Install Skaffold: https://skaffold.dev/docs/install
3) For docker->kubernetes build+rebuilds, run `npm start server.skaffoldDev`. (whenever you want a rebuild, wait for previous build to finish, then press enter in the terminal)
4) For docker->kubernetes builds, run `npm start server.skaffoldBuild`. (image-name: `packages-server`)
5) For docker->kubernetes build+run, run `npm start server.skaffoldRun`. (image-name: `packages-server`)
### 4) Remote server, using docker + kubernetes
Note: These instructions are for OVH-cloud's Public Cloud servers.
1) Create a Public Cloud project on OVH cloud. (in the US, us.ovhcloud.com is recommended for their in-country servers)
2) Follow the instructions here to setup a Kubernetes cluster: https://youtu.be/vZOj59Oer7U?t=586
2.1) In the "node pool" step, select "1". (Debate Map does not currently need more than one node)
2.2) In the "node type" step, select the cheapest option, Discovery d2-4. (~$12/mo)
3) TODO
## Editing + running
See here: <https://github.com/debate-map/app#editing--running>
## Database migrations
See here for overview: <https://github.com/Venryx/web-vcore/tree/master/Docs/DatabaseMigrations.md>
Actions:
* To create a new migration, make a copy of the latest migration in `Knex/Migrations`, rename it (incrementing the number), then clear the up/down functions.
# Negation
Rules for negation:
- *__Nek__ li __nek__ ŝi respondis.* – Neither he nor she answered.
- *Vi __nek__ sidas __nek__ rigardas.* – You neither sit nor look (you are neither sitting nor looking).
- *Mi __neniam__ diros al iu.* – I will never tell anyone. (*Mi neniam diros al neniu* is incorrect in Esperanto; that is, Esperanto does not allow double negation.)
# *Mem*
*Mem* means "self" or "own" (by myself, by yourself, by oneself, etc.), and is used to emphasize the noun or pronoun that precedes it. (Compare the reflexive *si*, lesson 4.)
- *Mi __mem__ faris tion.* – I did that myself.
# Until…
There are several ways to say "goodbye" in Esperanto, but the most common is
- *ĝis la revido* – literally "until the seeing-again" (compare the Portuguese "até logo", "até mais (ver)", "até breve", "até a vista", or the French "au revoir").
Alternatively, it can be said without the article:
- *ĝis revido*.
# The prefix *ek-*
indicates
1. the beginning of an action, or
2. a sudden, momentary action.
Examples:
- *__ek__paroli* – to start speaking
- *__ek__silenti* – to fall silent
- *__ek__sidi* – to sit down, to take a seat
- *__ek__ridi* – to burst out laughing
# The suffix *-aĵ*
means a "thing", a concrete object:
- *manĝ__aĵ__o* – food
- *trink__aĵ__o* – a drink
- *bel__aĵ__o* – something beautiful
- *send__aĵ__o* – something sent, a missive
- [ ] Flesh out the analysis page
Pie chart: today's check-ins / total members
Pie chart: today's leave requests / total members
Line chart: check-ins per month
Line chart: leave requests per month
Check-in details: leave status and leave counts; per-day check-in details
{
  totalCount: number;
  currentDayClockCount: number;
  currentDayLeaveCount: number;
  currentMonthClockRecords: clockInDays;
  currentMonthLeaveRecords: clockInDays;
}
---
title: "Microsoft's Mobile OpenJDK Distribution Preview"
description: "A step-by-step guide to configuring Microsoft's distribution of the OpenJDK for mobile development."
ms.prod: xamarin
ms.assetid: B5F8503D-F4D1-44CB-8B29-187D1E20C979
ms.technology: xamarin-android
author: vyedin
ms.author: vyedin
ms.date: 07/22/2018
ms.openlocfilehash: 2022337ebd65997c7b2492137193586278f2dffd
ms.sourcegitcommit: bf51592be39b2ae3d63d029be1d7745ee63b0ce1
ms.translationtype: HT
ms.contentlocale: ko-KR
ms.lasthandoff: 08/06/2018
ms.locfileid: "39573596"
---
# <a name="microsofts-mobile-openjdk-distribution-preview"></a>Microsoft's Mobile OpenJDK Distribution Preview
_This guide describes the steps for switching to a preview release of Microsoft's distribution of the OpenJDK. This distribution is intended for mobile development._

## <a name="overview"></a>Overview
Beginning with Visual Studio 15.9 and Visual Studio for Mac 7.7, Visual Studio Tools for Xamarin is moving from Oracle's JDK to a **lightweight version of the OpenJDK intended solely for Android development**.

The benefits of this move:
- You will always have an OpenJDK version that is suitable for Android development.
- Downloading JDK 9 or 10 will not affect your development environment.
- A significant reduction in download size and footprint.
- No more problems with third-party servers and installers.
If you would like to move to the improved experience sooner, builds of the Microsoft Mobile OpenJDK distribution are available for testing on both Windows and Mac. The installation process is described below, and you can revert to the Oracle JDK at any time.
## <a name="download"></a>Download
To get started, download the correct build for your system:
- **Mac** – https://dl.xamarin.com/OpenJDK/mac/microsoft-dist-openjdk-1.8.0.9.zip
- **Windows x86** – https://dl.xamarin.com/OpenJDK/win32/microsoft-dist-openjdk-1.8.0.9.zip
- **Windows x64** – https://dl.xamarin.com/OpenJDK/win64/microsoft-dist-openjdk-1.8.0.9.zip
## <a name="configure"></a>Configure
Unpack the archive to the correct location:
- **Mac** – **$HOME/Library/Developer/Xamarin/jdk/microsoft_dist_openjdk_1.8.0.9**
- **Windows** – **C:\\Program Files\\Android\\jdk\\microsoft_dist_openjdk_1.8.0.9**
> [!IMPORTANT]
> This example uses build 1.8.0.9, but the version you download may be newer.
Point your IDE at the new JDK:
- **Mac** – Click **Tools > SDK Manager > Locations** and change the **JDK (Java SDK) location** to the full path of the OpenJDK installation. In the following example, this path is set to **$HOME/Library/Developer/Xamarin/jdk/microsoft_dist_openjdk_1.8.0.9**.

- **Windows** – Click **Tools > Options > Xamarin > Android Settings** and change the **Java Development Kit location** to the full path of the OpenJDK installation. In the following example, this path is set to **C:\\Program Files\\Android\\jdk\\microsoft_dist_openjdk_1.8.0.9**.

## <a name="revert"></a>Revert
To revert to the Oracle JDK, change the Java SDK location to the previously used Oracle JDK path and rebuild your solution. On Mac, you can click **Reset to defaults** to revert to the Oracle JDK path.
If you run into problems with the Microsoft Mobile OpenJDK distribution, please report them with the feedback tool in your IDE so that they can be tracked and fixed quickly.
## <a name="known-issues--planned-fix-dates"></a>Known issues & planned fix dates
The `JAVA_HOME` environment variable may not be exported correctly to the SDK and Device Manager. As a workaround, you can set this environment variable to the OpenJDK location on your machine. This issue will be fixed in the 15.9 preview.
## <a name="summary"></a>Summary
In this article, you learned how to configure your IDE to use a preview release of Microsoft's Mobile OpenJDK distribution, which is planned for a stable release in the second half of 2018.
"CC-BY-4.0",
"MIT"
] | 1 | 2019-05-06T07:55:24.000Z | 2019-05-06T07:55:24.000Z | articles/dev-spaces/team-development-netcore.md | basma-msft/azure-docs | 112500699eac9fd5ca4d145c0938e10e7b9984fb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/dev-spaces/team-development-netcore.md | basma-msft/azure-docs | 112500699eac9fd5ca4d145c0938e10e7b9984fb | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-11-08T10:14:07.000Z | 2019-11-08T10:14:07.000Z | ---
title: "Team development with Azure Dev Spaces using .NET Core and VS Code"
titleSuffix: Azure Dev Spaces
services: azure-dev-spaces
ms.service: azure-dev-spaces
author: zr-msft
ms.author: zarhoads
ms.date: 07/09/2018
ms.topic: "tutorial"
description: "Rapid Kubernetes development with containers and microservices on Azure"
keywords: "Docker, Kubernetes, Azure, AKS, Azure Kubernetes Service, containers, Helm, service mesh, service mesh routing, kubectl, k8s "
---
[!INCLUDE [](../../includes/devspaces-team-development-1.md)]
### Make a code change
Go to the VS Code window for `mywebapi` and make a code edit to the `string Get(int id)` method in `Controllers/ValuesController.cs`, for example:
```csharp
[HttpGet("{id}")]
public string Get(int id)
{
return "mywebapi now says something new";
}
```
[!INCLUDE [](../../includes/devspaces-team-development-2.md)]
| 31.357143 | 146 | 0.735763 | eng_Latn | 0.722452 |
<!--* freshness: { owner: 'maringeo' reviewed: '2021-06-30' review_interval: '6 months' } *-->
# SavedModels from TF Hub in TensorFlow 2
The
[SavedModel format of TensorFlow 2](https://www.tensorflow.org/guide/saved_model)
is the recommended way to share pre-trained models and model pieces on
TensorFlow Hub. It replaces the older [TF1 Hub format](tf1_hub_module.md) and
comes with a new set of APIs.
This page explains how to reuse TF2 SavedModels in a TensorFlow 2 program with
the low-level `hub.load()` API and its `hub.KerasLayer` wrapper. (Typically,
`hub.KerasLayer` is combined with other `tf.keras.layers` to build a Keras model
or the `model_fn` of a TF2 Estimator.) These APIs can also load the legacy
models in TF1 Hub format, within limits, see the
[compatibility guide](model_compatibility.md).
Users of TensorFlow 1 can update to TF 1.15 and then use the same APIs.
Older versions of TF1 do not work.
## Using SavedModels from TF Hub
### Using a SavedModel in Keras
[Keras](https://www.tensorflow.org/guide/keras/) is TensorFlow's high-level API
for building deep learning models by composing Keras Layer objects.
The `tensorflow_hub` library provides the class `hub.KerasLayer` that gets
initialized with the URL (or filesystem path) of a SavedModel and then
provides the computation from the SavedModel, including its pre-trained
weights.
Here is an example of using a pre-trained text embedding:
```python
import tensorflow as tf
import tensorflow_hub as hub
hub_url = "https://tfhub.dev/google/nnlm-en-dim128/2"
embed = hub.KerasLayer(hub_url)
embeddings = embed(["A long sentence.", "single-word", "http://example.com"])
print(embeddings.shape, embeddings.dtype)
```
From this, a text classifier can be built in the usual Keras way:
```python
model = tf.keras.Sequential([
embed,
tf.keras.layers.Dense(16, activation="relu"),
tf.keras.layers.Dense(1, activation="sigmoid"),
])
```
The [Text classification
colab](https://colab.research.google.com/github/tensorflow/hub/blob/master/examples/colab/tf2_text_classification.ipynb)
is a complete example how to train and evaluate such a classifier.
The model weights in a `hub.KerasLayer` are set to non-trainable by default.
See the section on fine-tuning below for how to change that. Weights are
shared between all applications of the same layer object, as usual in Keras.
### Using a SavedModel in an Estimator
Users of TensorFlow's
[Estimator](https://www.tensorflow.org/tutorials/distribute/multi_worker_with_estimator)
API for distributed training can use SavedModels from TF Hub by
writing their `model_fn` in terms of `hub.KerasLayer` among other
`tf.keras.layers`.
### Behind the scenes: SavedModel downloading and caching
Using a SavedModel from TensorFlow Hub (or other HTTPS servers that implement
its [hosting](hosting.md) protocol) downloads and decompresses it to the local
filesystem if not already present. The environment variable `TFHUB_CACHE_DIR`
can be set to override the default temporary location for caching the downloaded
and uncompressed SavedModels. For details, see [Caching](caching.md).
### Using a SavedModel in low-level TensorFlow
The function `hub.load(handle)` downloads and decompresses a SavedModel
(unless `handle` is already a filesystem path) and then returns the result
of loading it with TensorFlow's built-in function `tf.saved_model.load()`.
Therefore, `hub.load()` can handle any valid SavedModel (unlike its
predecessor `hub.Module` for TF1).
#### Advanced topic: what to expect from the SavedModel after loading
Depending on the contents of the SavedModel, the result of
`obj = hub.load(...)` can be invoked in various ways (as explained in
much greater detail in TensorFlow's [SavedModel
Guide](https://www.tensorflow.org/guide/saved_model)):
* The serving signatures of the SavedModel (if any) are represented as a
dictionary of concrete functions and can be called like
`tensors_out = obj.signatures["serving_default"](**tensors_in)`,
with dictionaries of tensors keyed by the respective input and output
names and subject to the signature's shape and dtype constraints.
* The
[`@tf.function`](https://www.tensorflow.org/api_docs/python/tf/function)-decorated
methods of the saved object (if any) are restored as tf.function objects
that can be called by all combinations of Tensor and non-Tensor arguments
for which the tf.function had been
[traced](https://www.tensorflow.org/tutorials/customization/performance#tracing)
prior to saving. In particular, if there is an `obj.__call__` method
with suitable traces, `obj` itself can be called like a Python function.
A simple example could look like
`output_tensor = obj(input_tensor, training=False)`.
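As an illustrative sketch of both calling conventions (the toy `Multiplier` module below is an assumption for the example, not a real TF Hub model, and `tf.saved_model.load` stands in for `hub.load` so the snippet runs without a network connection):

```python
import tempfile

import tensorflow as tf


class Multiplier(tf.Module):
  """Toy stand-in for a shared SavedModel: multiplies its input by a variable."""

  def __init__(self):
    super().__init__()
    self.v = tf.Variable(3.0)

  @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
  def __call__(self, x):
    return x * self.v


export_dir = tempfile.mkdtemp()
module = Multiplier()
tf.saved_model.save(
    module, export_dir,
    signatures={"serving_default": module.__call__.get_concrete_function()})

# hub.load(handle) returns the same kind of object for a handle that
# resolves to this SavedModel.
obj = tf.saved_model.load(export_dir)

# 1) Through a serving signature: tensors go in and come out by name,
#    as a dictionary of concrete-function inputs/outputs.
result_by_name = obj.signatures["serving_default"](x=tf.constant(2.0))

# 2) Through the restored tf.function: obj itself is callable.
result_direct = obj(tf.constant(2.0))

print(sorted(result_by_name.keys()), float(result_direct))
```

The signature call returns a dictionary keyed by output name, while the direct call returns the tensor itself.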
This leaves enormous liberty in the interfaces that SavedModels can
implement. The [Reusable SavedModels interface](reusable_saved_models.md)
for `obj` establishes conventions such that client code, including adapters
like `hub.KerasLayer`, know how to use the SavedModel.
Some SavedModels may not follow that convention, especially whole models
not meant to be reused in larger models, and just provide serving signatures.
The trainable variables in a SavedModel are reloaded as trainable,
and `tf.GradientTape` will watch them by default. See the section on
fine-tuning below for some caveats, and consider avoiding this for starters.
Even if you want to fine-tune, you may want to see if `obj.trainable_variables`
advises to re-train only a subset of the originally trainable variables.
## Creating SavedModels for TF Hub
### Overview
SavedModel is TensorFlow's standard serialization format for trained models
or model pieces.
It stores the model's trained weights together with the exact TensorFlow
operations to perform its computation. It can be used independently from
the code that created it. In particular, it can be reused across different
high-level model-building APIs like Keras, because TensorFlow operations
are their common basic language.
### Saving from Keras
Starting with TensorFlow 2, `tf.keras.Model.save()` and
`tf.keras.models.save_model()` default to the SavedModel format (not HDF5).
The resulting SavedModels can be used with `hub.load()`,
`hub.KerasLayer` and similar adapters for other high-level APIs
as they become available.
To share a complete Keras Model, just save it with `include_optimizer=False`.
To share a piece of a Keras Model, make the piece a Model in itself and then
save that. You can either lay out the code like that from the start....
```python
piece_to_share = tf.keras.Model(...)
full_model = tf.keras.Sequential([piece_to_share, ...])
full_model.fit(...)
piece_to_share.save(...)
```
...or cut out the piece to share after the fact (if it aligns with the
layering of your full model):
```python
full_model = tf.keras.Model(...)
sharing_input = full_model.get_layer(...).get_output_at(0)
sharing_output = full_model.get_layer(...).get_output_at(0)
piece_to_share = tf.keras.Model(sharing_input, sharing_output)
piece_to_share.save(..., include_optimizer=False)
```
[TensorFlow Models](https://github.com/tensorflow/models) on GitHub
uses the former approach for BERT (see
[nlp/tools/export_tfhub_lib.py](https://github.com/tensorflow/models/blob/master/official/nlp/tools/export_tfhub_lib.py),
note the split between `core_model` for export and the `pretrainer` for
restoring the checkpoint)
and the latter approach for ResNet (see
[vision/image_classification/tfhub_export.py](https://github.com/tensorflow/models/blob/master/official/vision/image_classification/resnet/tfhub_export.py)).
### Saving from low-level TensorFlow
This requires good familiarity with TensorFlow's [SavedModel
Guide](https://www.tensorflow.org/guide/saved_model).
If you want to provide more than just a serving signature, you should
implement the [Reusable SavedModel interface](reusable_saved_models.md).
Conceptually, this looks like
```python
class MyMulModel(tf.train.Checkpoint):
def __init__(self, v_init):
super().__init__()
self.v = tf.Variable(v_init)
self.variables = [self.v]
self.trainable_variables = [self.v]
self.regularization_losses = [
tf.function(input_signature=[])(lambda: 0.001 * self.v**2),
]
@tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
def __call__(self, inputs):
return tf.multiply(inputs, self.v)
tf.saved_model.save(MyMulModel(2.0), "/tmp/my_mul")
layer = hub.KerasLayer("/tmp/my_mul")
print(layer([10., 20.])) # [20., 40.]
layer.trainable = True
print(layer.trainable_weights) # [2.]
print(layer.losses) # 0.004
```
## Fine-Tuning
Training the already-trained variables of an imported SavedModel together with
those of the model around it is called *fine-tuning* the SavedModel.
This can result in better quality, but often makes the training more
demanding (may take more time, depend more on the optimizer and its
hyperparameters, increase the risk of overfitting and require dataset
augmentation, esp. for CNNs). We advise SavedModel consumers to look into
fine-tuning only after having established a good training regime,
and only if the SavedModel publisher recommends it.
Fine-tuning changes the "continuous" model parameters that are trained.
It does not change hard-coded transformations, such as tokenizing text
input and mapping tokens to their corresponding entries in an embedding matrix.
### For SavedModel consumers
Creating a `hub.KerasLayer` like
```python
layer = hub.KerasLayer(..., trainable=True)
```
enables fine-tuning of the SavedModel loaded by the layer. It adds the
trainable weights and weight regularizers declared in the SavedModel
to the Keras model, and runs the SavedModel's computation in training
mode (think of dropout etc.).
The [image classification
colab](https://github.com/tensorflow/hub/blob/master/examples/colab/tf2_image_retraining.ipynb)
contains an end-to-end example with optional fine-tuning.
#### Re-exporting the fine-tuning result
Advanced users may want to save the results of fine-tuning back into
a SavedModel that can be used instead of the originally loaded one.
This can be done with code like
```python
loaded_obj = hub.load("https://tfhub.dev/...")
hub_layer = hub.KerasLayer(loaded_obj, trainable=True, ...)
model = keras.Sequential([..., hub_layer, ...])
model.compile(...)
model.fit(...)
export_module_dir = os.path.join(os.getcwd(), "finetuned_model_export")
tf.saved_model.save(loaded_obj, export_module_dir)
```
### For SavedModel creators
When creating a SavedModel for sharing on TensorFlow Hub,
think ahead if and how its consumers should fine-tune it,
and provide guidance in the documentation.
Saving from a Keras Model should make all the mechanics of fine-tuning work
(saving weight regularization losses, declaring trainable variables, tracing
`__call__` for both `training=True` and `training=False`, etc.)
Choose a model interface that plays well with gradient flow,
e.g., output logits instead of softmax probabilities or top-k predictions.
If the model uses dropout, batch normalization, or similar training techniques
that involve hyperparameters, set them to values that make sense across many
expected target problems and batch sizes. (As of this writing, saving from
Keras does not make it easy to let consumers adjust them.)
Weight regularizers on individual layers are saved (with their regularization
strength coefficients), but weight regularization from within the optimizer
(like `tf.keras.optimizers.Ftrl.l1_regularization_strength=...)`)
is lost. Advise consumers of your SavedModel accordingly.
| 40.688811 | 157 | 0.775973 | eng_Latn | 0.982723 |
# Changelog
## v1.0.0 - 2021-10-23
This release only bumps the version number. It doesn't have any code changes.
## v0.5.5
* New features
* Add `sign_digest/2` and documentation for how to connect to Google Cloud
Platform IoT Core. Thanks to Alex McLain for this feature.
## v0.5.4
* New features
* Support storing 16+ character serial numbers in the NervesKey
## v0.5.3
* New features
* Use the term signer certificate more consistently throughout
* Add `mix nerves_key.device` to generate device certificates as a debugging
and learning aid. See the README.md for details.
## v0.5.2
* New features
* Add `NervesKey.ssl_opts/1` helper function to simplify integration with
libraries using TLS like NervesHub and Tortoise.
## v0.5.1
* New features
* `nerves_key_pkcs11` is now included as a dependency since it is almost
always used and easily forgotten
* Add `NervesKey.device_info/1` to expose the ATECC508A/608A version that was
installed on the device
## v0.5.0
* New features
* Add `NervesKey.put_settings/2` and `NervesKey.get_settings/1` to support
storing and retrieving a small map on a NervesKey. This is useful for data
that travels with certificates and settings that don't change much.
## v0.4.0
* New features
* Add `NervesKey.detected?/1` to check whether a NervesKey is actually
installed.
* Bug fixes
* Clear out the entire auxiliary certificate slots to avoid any confusion for
whether the certificates are present.
## v0.3.2
* New features
* Add helper functions for detecting and clearing out auxiliary certificates
## v0.3.1
* New features
* Add helper for provisioning NervesKeys using a default serial number
## v0.3.0
* New features
* Support a auxiliary device certificate that can be updated after the
provisioning step. This supports use cases where the provisioning
certificate's private key isn't available or won't work.
* Add `provisioned?/1` to quickly check whether a device has been provisioned
## v0.2.0
* New features
* Support setting signer key expiration dates
* Add a convenience method for getting the manufacturing serial number
* Bug fixes
* Fixed configuration compatibility checking - Thanks to Peter Marks for this
fix.
## v0.1.2
* Bug fixes
* Lock the private key slot so that a genkey can't replace its contents
## v0.1.1
* Bug fixes
* Fix signature failure issues by encoding the raw public key before constructing
the subject_key_id and authority_key_id for calls to `NervesKey.signer_cert/1`
and `NervesKey.device_cert/1`
## v0.1.0
Initial release
| 26.545455 | 83 | 0.740868 | eng_Latn | 0.996878 |
# ear-training-app
An app for practicing ear-training intervals
Status:
Tags:
Links: [[Bias]]
___
# Overconfidence
## Principles
- Overestimating our personal capabilities
- Making decisions
## Downsides
- May encourage us to do things we are not qualified for
___
References:
### Create a run
To create a run of an existing job:
```
POST /api/v1/runs
request = {
"job_id": "JOB-ID",
"args": {
"PARAM": "VALUE",
...
},
"times": {
"schedule": "TIME"
}
}
```
If the schedule time is omitted, the run is started immediately. Args may be
omitted if empty.
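As an illustrative sketch (the `build_run_request` helper, job ID, and parameter names below are assumptions for the example, not part of Apsis), a client could assemble this request body as follows:

```python
import json


def build_run_request(job_id, args=None, schedule_time=None):
    """Build a POST /api/v1/runs body for an existing job (hypothetical helper)."""
    body = {"job_id": job_id}
    if args:
        # Args may be omitted if empty.
        body["args"] = dict(args)
    if schedule_time is not None:
        # Omitting the times block means the run starts immediately.
        body["times"] = {"schedule": schedule_time}
    return body


immediate = build_run_request("my-job")
scheduled = build_run_request("my-job", {"DATE": "2024-01-01"},
                              "2024-01-01T00:00:00Z")
print(json.dumps(immediate))
print(json.dumps(scheduled, sort_keys=True))
```

The first payload schedules nothing and runs at once; the second carries both args and a schedule time.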
To create an ad hoc job and a run of it:
```
POST /api/v1/runs
request = {
"job": {
...
},
"args": {
"PARAM": "VALUE",
...
},
"times": {
"schedule": "TIME"
}
}
```
The format of the job is the same as job files in the job repo. As above, the
run is run immediately if the schedule time is omitted. Args is usually not
required and may be omitted.
| 16.095238 | 78 | 0.58432 | eng_Latn | 0.995874 |