hexsha
stringlengths
40
40
size
int64
5
1.04M
ext
stringclasses
6 values
lang
stringclasses
1 value
max_stars_repo_path
stringlengths
3
344
max_stars_repo_name
stringlengths
5
125
max_stars_repo_head_hexsha
stringlengths
40
78
max_stars_repo_licenses
listlengths
1
11
max_stars_count
int64
1
368k
max_stars_repo_stars_event_min_datetime
stringlengths
24
24
max_stars_repo_stars_event_max_datetime
stringlengths
24
24
max_issues_repo_path
stringlengths
3
344
max_issues_repo_name
stringlengths
5
125
max_issues_repo_head_hexsha
stringlengths
40
78
max_issues_repo_licenses
listlengths
1
11
max_issues_count
int64
1
116k
max_issues_repo_issues_event_min_datetime
stringlengths
24
24
max_issues_repo_issues_event_max_datetime
stringlengths
24
24
max_forks_repo_path
stringlengths
3
344
max_forks_repo_name
stringlengths
5
125
max_forks_repo_head_hexsha
stringlengths
40
78
max_forks_repo_licenses
listlengths
1
11
max_forks_count
int64
1
105k
max_forks_repo_forks_event_min_datetime
stringlengths
24
24
max_forks_repo_forks_event_max_datetime
stringlengths
24
24
content
stringlengths
5
1.04M
avg_line_length
float64
1.14
851k
max_line_length
int64
1
1.03M
alphanum_fraction
float64
0
1
lid
stringclasses
191 values
lid_prob
float64
0.01
1
1c1119fdd1c2c855a372ec039fdc0154453362f0
2,404
md
Markdown
docs/2014/analysis-services/multidimensional-models/document-and-script-an-analysis-services-database.md
masahiko-sotta/sql-docs.ja-jp
f9e587be8d74ad47d0cc2c31a1670e2190a0aab7
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/2014/analysis-services/multidimensional-models/document-and-script-an-analysis-services-database.md
masahiko-sotta/sql-docs.ja-jp
f9e587be8d74ad47d0cc2c31a1670e2190a0aab7
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/2014/analysis-services/multidimensional-models/document-and-script-an-analysis-services-database.md
masahiko-sotta/sql-docs.ja-jp
f9e587be8d74ad47d0cc2c31a1670e2190a0aab7
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Document and Script an Analysis Services Database | Microsoft Docs
ms.custom: ''
ms.date: 03/06/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: analysis-services
ms.topic: conceptual
helpviewer_keywords:
- XML for Analysis, scripts
- XMLA, scripts
- scripts [Analysis Services], databases
- documenting databases
- databases [Analysis Services], documenting
- databases [Analysis Services], scripts
ms.assetid: 125044e2-8d36-4733-8743-8bb68ff9aa4e
author: minewiskan
ms.author: owend
manager: craigg
ms.openlocfilehash: 9284073781a91b21d588684b9071e6179a815613
ms.sourcegitcommit: 3026c22b7fba19059a769ea5f367c4f51efaf286
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 06/15/2019
ms.locfileid: "66075116"
---
# <a name="document-and-script-an-analysis-services-database"></a>Document and Script an Analysis Services Database

After deploying an [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] database, you can use [!INCLUDE[ssManStudioFull](../../includes/ssmanstudiofull-md.md)] to output the metadata of the database, or of the objects it contains, as an XML for Analysis (XMLA) script. You can output this script to a new **XMLA Query Editor** window, to a file, or to the clipboard. For more information about XMLA, see [Analysis Services Scripting Language &#40;ASSL&#41; Reference](https://docs.microsoft.com/bi-reference/assl/analysis-services-scripting-language-assl-for-xmla).

The generated XMLA script uses [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] Scripting Language (ASSL) elements to define the objects contained in the script. If you generate a CREATE script, the resulting XMLA script contains an XMLA **Create** command and the ASSL elements needed to create the entire database structure on an [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] instance. If you generate an ALTER script, the resulting XMLA script contains an XMLA **Alter** command and the ASSL elements needed to restore the structure of an existing database to its state at the time the script was created.

You can use a generated XMLA script for an [!INCLUDE[ssASnoversion](../../includes/ssasnoversion-md.md)] database in a variety of ways, including the following:

- Maintain a backup script that lets you re-create all database objects and permissions.
- Create or update database development code.
- Create a test or development environment from an existing schema.

## <a name="see-also"></a>See Also
[Modify or Delete an Analysis Services Database](modify-or-delete-an-analysis-services-database.md)
[Alter Element (XMLA)](https://docs.microsoft.com/bi-reference/xmla/xml-elements-commands/alter-element-xmla)
[Create Element (XMLA)](https://docs.microsoft.com/bi-reference/xmla/xml-elements-commands/create-element-xmla)
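As an illustration of the general shape of such a script, the skeleton below sketches an XMLA **Create** command wrapping ASSL object definitions. This is a hand-written sketch, not generated output: a real generated script contains the full object tree, and the `MyDatabase` ID/name is a placeholder.

```xml
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <!-- ASSL elements describing the database and its objects go here. -->
    <Database>
      <ID>MyDatabase</ID>
      <Name>MyDatabase</Name>
      <!-- In a generated script, Dimensions, Cubes, Roles, etc. follow. -->
    </Database>
  </ObjectDefinition>
</Create>
```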
52.26087
487
0.782862
yue_Hant
0.587918
1c1193611a07a5f0acd2c2ba7b832de99e8907b9
2,805
markdown
Markdown
_posts/2018/2018-10-16-qpcr-ronits-dnased-c-gigas-ploidydessication-rna-with-18s-primers.markdown
AidanCox12/Aidans_Journal
6bc80960ae7cc3f81aa097382d7c0bcc63f0c9f9
[ "MIT" ]
null
null
null
_posts/2018/2018-10-16-qpcr-ronits-dnased-c-gigas-ploidydessication-rna-with-18s-primers.markdown
AidanCox12/Aidans_Journal
6bc80960ae7cc3f81aa097382d7c0bcc63f0c9f9
[ "MIT" ]
null
null
null
_posts/2018/2018-10-16-qpcr-ronits-dnased-c-gigas-ploidydessication-rna-with-18s-primers.markdown
AidanCox12/Aidans_Journal
6bc80960ae7cc3f81aa097382d7c0bcc63f0c9f9
[ "MIT" ]
5
2019-12-18T06:47:34.000Z
2022-03-15T23:47:41.000Z
---
author: kubu4
comments: true
date: 2018-10-16 18:15:26+00:00
layout: post
slug: qpcr-ronits-dnased-c-gigas-ploidydessication-rna-with-18s-primers
title: qPCR - Ronit's DNAsed C.gigas Ploidy/Desiccation RNA with 18s primers
wordpress_id: 3652
author:
- kubu4
categories:
- Miscellaneous
tags:
- 18s
- BB15
- cfx connect
- Crassostrea gigas
- dessication
- diploid
- DNased RNA
- gigas18s_fw
- gigas18s_rv
- Pacific oyster
- qPCR
- SRID 157
- SsoFast EvaGreen Supermix
- triploid
---

After [DNasing Ronit's RNA earlier today](2018-10-16-dnase-treatment-ronits-c-gigas-ploiyddessication-ctenidia-rna.html), I needed to check for any residual gDNA.

Identified some old, old C.gigas 18s primers that _should_ amplify gDNA:

* gigas18s_fw (SRID 157)
* gigas18s_rv (SRID 156)

Used some old _C.gigas_ gDNA ([BB15 from 20090519](https://robertslab.github.io/sams-notebook/2009/05/15/gdna-isolation-macs-bb-and-dh-site-samples.html)) as a positive control.

Samples were run on the Roberts Lab CFX Connect (BioRad). All samples were run in duplicate. See the qPCR Report (Results section) for plate layout, cycling params, etc.

qPCR master mix calcs (Google Sheet):

* [20181016_qPCR_Cgigas_DNased_RNA](https://docs.google.com/spreadsheets/d/1YcL-h1g0ee8XOlO49H7WmSalqT0BCtreVR4WaxV8dxA/edit?usp=sharing)

* * *

#### Results

qPCR Report (PDF):

* [sam_2018-10-16 11-13-55_BR006896.pdf](https://owl.fish.washington.edu/Athaliana/qPCR_data/qPCR_reports/sam_2018-10-16%2011-13-55_BR006896.pdf)

qPCR File (PCRD):

* [sam_2018-10-16 11-13-55_BR006896.pcrd](https://owl.fish.washington.edu/scaphapoda/qPCR_data/cfx_connect_data/sam_2018-10-16%2011-13-55_BR006896.pcrd)

qPCR Data (CSV):

* [sam_2018-10-16_11-13-55_BR006896-Quantification_Cq_Results.csv](https://owl.fish.washington.edu/Athaliana/qPCR_data/sam_2018-10-16_11-13-55_BR006896-Quantification_Cq_Results.csv)

Well, this primer set and/or the gDNA is not good.

In the plots below, the positive control gDNA is in green, samples are in blue, and no-template controls (NTC) are in red.

Poor performance is most easily noticed in the melt curves: they have multiple peaks, suggesting non-specific amplification, even in the positive control. Additionally, although less evident from the plots alone, the replicates are highly inconsistent. Although it's possible that might be due to poor technique, it's very unlikely.

Will have to identify different primers and/or positive control DNA.

* * *

##### Amplification Plots

![](https://owl.fish.washington.edu/Athaliana/qPCR_data/sam_20181016_111355_amp_plots.png)

* * *

##### Melt Curves

![](https://owl.fish.washington.edu/Athaliana/qPCR_data/sam_20181016_111355_melt_plots.png)
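One way to quantify the replicate inconsistency described above is to compute the Cq spread per sample from the exported Quantification Cq Results CSV. The sketch below is illustrative and not part of the original notebook: the `Sample`/`Cq` field names and the 0.5-cycle threshold are assumptions (a spread above roughly half a cycle between technical duplicates is a common rule of thumb), and the rows are made-up stand-ins for the real export.

```python
from collections import defaultdict
from statistics import stdev

# Hypothetical rows standing in for the exported Cq results CSV.
rows = [
    {"Sample": "D01", "Cq": 24.1}, {"Sample": "D01", "Cq": 27.9},
    {"Sample": "T05", "Cq": 22.3}, {"Sample": "T05", "Cq": 22.5},
]

def flag_inconsistent(rows, max_sd=0.5):
    """Return sample names whose replicate Cq values spread more than max_sd."""
    by_sample = defaultdict(list)
    for r in rows:
        by_sample[r["Sample"]].append(r["Cq"])
    return sorted(s for s, cqs in by_sample.items()
                  if len(cqs) > 1 and stdev(cqs) > max_sd)

# D01's duplicates differ by ~3.8 cycles, so it gets flagged; T05 does not.
print(flag_inconsistent(rows))
```
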
21.25
190
0.751872
eng_Latn
0.594991
1c1213c5b409397c369dfd22989b29e5aab0beca
9,545
md
Markdown
articles/azure-netapp-files/azure-netapp-files-develop-with-rest-api.md
cristhianu/azure-docs.es-es
910ba6adc1547b9e94d5ed4cbcbe781921d009b7
[ "CC-BY-4.0", "MIT" ]
2
2019-09-04T06:39:25.000Z
2019-09-04T06:43:40.000Z
articles/azure-netapp-files/azure-netapp-files-develop-with-rest-api.md
cristhianu/azure-docs.es-es
910ba6adc1547b9e94d5ed4cbcbe781921d009b7
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-netapp-files/azure-netapp-files-develop-with-rest-api.md
cristhianu/azure-docs.es-es
910ba6adc1547b9e94d5ed4cbcbe781921d009b7
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Develop for Azure NetApp Files with REST API | Microsoft Docs
description: Describes how to get started with the Azure NetApp Files REST API.
services: azure-netapp-files
documentationcenter: ''
author: b-juche
manager: ''
editor: ''
ms.assetid: ''
ms.service: azure-netapp-files
ms.workload: storage
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: conceptual
ms.date: 05/17/2019
ms.author: b-juche
ms.openlocfilehash: 996fbcc7c3c9af0da9160216785ecd54840660e8
ms.sourcegitcommit: d4dfbc34a1f03488e1b7bc5e711a11b72c717ada
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 06/13/2019
ms.locfileid: "65957037"
---
# <a name="develop-for-azure-netapp-files-with-rest-api"></a>Develop for Azure NetApp Files with REST API

The REST API for the Azure NetApp Files service defines HTTP operations on resources such as the NetApp account, the capacity pool, volumes, and snapshots. This article helps you get started with the Azure NetApp Files REST API.

## <a name="azure-netapp-files-rest-api-specification"></a>Azure NetApp Files REST API specification

The Azure NetApp Files REST API specification is published through [GitHub](https://github.com/Azure/azure-rest-api-specs/tree/master/specification/netapp/resource-manager):

`https://github.com/Azure/azure-rest-api-specs/tree/master/specification/netapp/resource-manager`

## <a name="access-the-azure-netapp-files-rest-api"></a>Access the Azure NetApp Files REST API

1. [Install the Azure CLI](https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest) if you haven't already done so.
2. Create a service principal in your Azure Active Directory (Azure AD) instance:
    1. Make sure that you have [sufficient permissions](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal#required-permissions).
    2. In the Azure CLI, enter the following command:

        ```
        az ad sp create-for-rbac --name $YOURSPNAMEGOESHERE --password $YOURGENERATEDPASSWORDGOESHERE
        ```

        The command output is similar to the following example:

        ```
        {
            "appId": "appIDgoeshere",
            "displayName": "APPNAME",
            "name": "http://APPNAME",
            "password": "supersecretpassword",
            "tenant": "tenantIDgoeshere"
        }
        ```

        Keep the command output. You will need the `appId`, `password`, and `tenant` values.
3. Request an OAuth access token:

    The examples in this article use cURL. You can also use various API tools such as [Postman](https://www.getpostman.com/), [Insomnia](https://insomnia.rest/), and [Paw](https://paw.cloud/).

    Replace the variables in the following example with the command output from Step 2.

    ```
    curl -X POST -d 'grant_type=client_credentials&client_id=[APP_ID]&client_secret=[PASSWORD]&resource=https%3A%2F%2Fmanagement.azure.com%2F' https://login.microsoftonline.com/[TENANT_ID]/oauth2/token
    ```

    The output provides an access token similar to the following example:

    ```
    eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Im5iQ3dXMTF3M1hrQi14VWFYd0tSU0xqTUhHUSIsImtpZCI6Im5iQ3dXMTF3M1hrQi14VWFYd0tSU0xqTUhHUSJ9
    ```

    The token shown is valid for 3600 seconds. After that, you need to request a new token. Save the token in a text editor; you will need it for the next step.
4. Send a test call and include the token to validate your access to the REST API:

    ```
    curl -X GET -H "Authorization: Bearer [TOKEN]" -H "Content-Type: application/json" https://management.azure.com/subscriptions/[SUBSCRIPTION_ID]/providers/Microsoft.Web/sites?api-version=2016-08-01
    ```

## <a name="examples-using-the-api"></a>Examples that use the API

This article uses the following URL as the baseline for requests. This URL points to the root of the Azure NetApp Files namespace:

`https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts?api-version=2017-08-15`

Replace the `subID` and `resourceGroups` values in the following examples with your own values.

### <a name="get-request-examples"></a>GET request examples

Use a GET request to query Azure NetApp Files objects in a subscription, as shown in the following examples:

```
#get NetApp accounts
curl -X GET -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts?api-version=2017-08-15

#get capacity pools for NetApp account
curl -X GET -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE/capacityPools?api-version=2017-08-15

#get volumes in NetApp account & capacity pool
curl -X GET -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE/capacityPools/CAPACITYPOOLGOESHERE/volumes?api-version=2017-08-15

#get snapshots for a volume
curl -X GET -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE/capacityPools/CAPACITYPOOLGOESHERE/volumes/VOLUMEGOESHERE/snapshots?api-version=2017-08-15
```

### <a name="put-request-examples"></a>PUT request examples

Use a PUT request to create new Azure NetApp Files objects, as shown in the following examples. The body of the PUT request can include the JSON-formatted data for the changes, or it can specify a file to read from.

```
#create a NetApp account
curl -X PUT -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE?api-version=2017-08-15

#create a capacity pool
curl -X PUT -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE/capacityPools/CAPACITYPOOLGOESHERE?api-version=2017-08-15

#create a volume
curl -X PUT -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE/capacityPools/CAPACITYPOOLGOESHERE/volumes/MYNEWVOLUME?api-version=2017-08-15

#create a volume snapshot
curl -X PUT -H "Authorization: Bearer TOKENGOESHERE" -H "Content-Type: application/json" https://management.azure.com/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.NetApp/netAppAccounts/NETAPPACCOUNTGOESHERE/capacityPools/CAPACITYPOOLGOESHERE/volumes/MYNEWVOLUME/Snapshots/SNAPNAME?api-version=2017-08-15
```

### <a name="json-examples"></a>JSON examples

The following example shows how to create a NetApp account:

```json
{
    "name": "MYNETAPPACCOUNT",
    "type": "Microsoft.NetApp/netAppAccounts",
    "location": "westus2",
    "properties": {
        "name": "MYNETAPPACCOUNT"
    }
}
```

The following example shows how to create a capacity pool:

```json
{
    "name": "MYNETAPPACCOUNT/POOLNAME",
    "type": "Microsoft.NetApp/netAppAccounts/capacityPools",
    "location": "westus2",
    "properties": {
        "name": "POOLNAME",
        "size": "4398046511104",
        "serviceLevel": "Premium"
    }
}
```

The following example shows how to create a new volume:

```json
{
    "name": "MYNEWVOLUME",
    "type": "Microsoft.NetApp/netAppAccounts/capacityPools/volumes",
    "location": "westus2",
    "properties": {
        "serviceLevel": "Premium",
        "usageThreshold": "322122547200",
        "creationToken": "MY-FILEPATH",
        "snapshotId": "",
        "subnetId": "/subscriptions/SUBIDGOESHERE/resourceGroups/RESOURCEGROUPGOESHERE/providers/Microsoft.Network/virtualNetworks/VNETGOESHERE/subnets/MYDELEGATEDSUBNET.sn"
    }
}
```

The following example shows how to create a snapshot of a volume:

```json
{
    "name": "apitest2/apiPool01/apiVol01/snap02",
    "type": "Microsoft.NetApp/netAppAccounts/capacityPools/Volumes/Snapshots",
    "location": "westus2",
    "properties": {
        "name": "snap02",
        "fileSystemId": "0168704a-bbec-da81-2c29-503825fe7420"
    }
}
```

> [!NOTE]
> You need to specify `fileSystemId` to create a snapshot. You can obtain the `fileSystemId` value with a GET request to a volume.

## <a name="next-steps"></a>Next steps

[See the Azure NetApp Files REST API reference](https://docs.microsoft.com/rest/api/netapp/)
54.542857
351
0.742797
spa_Latn
0.30821
1c12b21d5c69b68ad228543296ccb8528958328c
1,123
md
Markdown
articles/virtual-machines/virtual-machines-linux-create-custom.md
huiw-git/azure-content-zhtw
f20103dc3d404c9c929c155b36c5a47aee5baed6
[ "CC-BY-3.0" ]
null
null
null
articles/virtual-machines/virtual-machines-linux-create-custom.md
huiw-git/azure-content-zhtw
f20103dc3d404c9c929c155b36c5a47aee5baed6
[ "CC-BY-3.0" ]
null
null
null
articles/virtual-machines/virtual-machines-linux-create-custom.md
huiw-git/azure-content-zhtw
f20103dc3d404c9c929c155b36c5a47aee5baed6
[ "CC-BY-3.0" ]
1
2020-11-04T04:34:56.000Z
2020-11-04T04:34:56.000Z
<properties pageTitle="Create a Linux VM | Microsoft Azure" description="Learn how to create a custom virtual machine running the Linux operating system with the classic deployment model." services="virtual-machines" documentationCenter="" authors="dsk-2015" manager="timlt" editor="tysonn" tags="azure-service-management"/>

<tags ms.service="virtual-machines" ms.workload="infrastructure-services" ms.tgt_pltfrm="vm-linux" ms.devlang="na" ms.topic="article" ms.date="10/14/2015" ms.author="dkshir"/>

# How to create a custom Linux VM

[AZURE.INCLUDE [learn-about-deployment-models](../../includes/learn-about-deployment-models-classic-include.md)] [Resource Manager model](virtual-machines-linux-tutorial.md).

This topic shows how to create a *custom* virtual machine with the classic deployment model by using the Azure CLI. We'll use a Linux image from the **images** available on Azure. The Azure CLI commands provide the following configuration options:

- Connect the VM to a virtual network
- Add the VM to an existing cloud service
- Add the VM to an existing storage account
- Add the VM to an availability set or location

> [AZURE.IMPORTANT] If you want the virtual machine to use a virtual network, so that you can connect to it by host name or set up cross-premises connections, you must specify the virtual network when you create the virtual machine. A virtual machine can be configured to join a virtual network only at creation time. For details about virtual networks, see [Azure Virtual Network Overview](http://go.microsoft.com/fwlink/p/?LinkID=294063).

## How to create a Linux virtual machine with the classic deployment model

[AZURE.INCLUDE [virtual-machines-create-LinuxVM](../../includes/virtual-machines-create-linuxvm.md)]

<!---HONumber=Oct15_HO4-->
29.552632
184
0.763134
yue_Hant
0.906092
1c12c7bf3daaa8ec7408ea18bc1a7eef91b3f0fe
1,315
md
Markdown
Module-2a/README.md
ajaymahale/apijam
abdbcc3265af5026804f962eae129e2f6e4498f5
[ "Apache-2.0" ]
null
null
null
Module-2a/README.md
ajaymahale/apijam
abdbcc3265af5026804f962eae129e2f6e4498f5
[ "Apache-2.0" ]
null
null
null
Module-2a/README.md
ajaymahale/apijam
abdbcc3265af5026804f962eae129e2f6e4498f5
[ "Apache-2.0" ]
2
2021-05-26T06:04:50.000Z
2021-05-27T00:54:14.000Z
# Module 2a - API Security Part 1

Apigee's API Jam Module 2a is the first part of a hands-on workshop that will jumpstart your understanding of API security. In this module, you will walk through two lab exercises that will help you throttle, protect, and secure your APIs by utilizing modern security principles with OAuth 2.0. API developers and architects who want to build a secure API, this workshop is for you!

## Who should attend?

This workshop will be valuable to API developers, architects, and anyone who wants to understand how to secure APIs exposed through Apigee Edge.

## What do attendees need to bring

- Browser (Chrome). A modern web browser like Chrome (v50+) to access the Apigee Edge Platform UI.
- A basic understanding of Apigee Edge entities such as API Proxies, Apps, and Products. For a refresher of the API Management Lifecycle, please complete the lab exercises in Module-1 [here](../Module-1).

## Workshop Agenda

### Module 2a

* Introduction to API Security
* Lab 1 - Throttle your API Traffic to prevent DoS, using Apigee Spike Arrest [Link](./Labs/Lab%201)
* Lab 2 - JWT-Based Security Exercise [Link](./Labs/Lab%202)

Let's get started with [Lab 1](./Labs/Lab%201).

#### Apigee Community

If you have any questions/comments please visit https://community.apigee.com/index.html
50.576923
295
0.764259
eng_Latn
0.98033
1c12d8e95ac5d42aff3ee85dd646902901cd3569
2,690
md
Markdown
Office365-Admin/misc/contacts.md
NelsonFMDuarte/OfficeDocs-O365ITPro
734ac6b31c9b0f839bc18a5503038d0c54845a78
[ "CC-BY-4.0", "MIT" ]
null
null
null
Office365-Admin/misc/contacts.md
NelsonFMDuarte/OfficeDocs-O365ITPro
734ac6b31c9b0f839bc18a5503038d0c54845a78
[ "CC-BY-4.0", "MIT" ]
null
null
null
Office365-Admin/misc/contacts.md
NelsonFMDuarte/OfficeDocs-O365ITPro
734ac6b31c9b0f839bc18a5503038d0c54845a78
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: "Quick help Contacts"
ms.author: kwekua
author: kwekua
manager: scotv
audience: Admin
ms.topic: article
ms.service: o365-administration
localization_priority: Normal
ms.collection:
- M365-subscription-management
- Adm_O365
- Adm_NonTOC
search.appverid:
- BCS160
- MET150
- MOE150
ms.assetid: e64ceac2-ae62-4d29-a9ee-6aab9870ae2b
ROBOTS: NOINDEX
description: "Learn how to create contacts in the admin center and manage your global address list."
---

# Quick help: Contacts

If you need shared contacts that everyone in your organization can get to, create them in the Microsoft 365 admin center. The contacts you create here can be seen by you and your users as part of the global address list.

## How do I add contacts in the admin center?

To add contacts:

1. In the admin center, go to the **Users** \> <a href="https://go.microsoft.com/fwlink/p/?linkid=2053302" target="_blank">Contacts</a> page.
2. On the **Contacts** page, select **Add a contact**.
3. On the **New Contact** page, fill in details and select **Add** to create a contact.

![Fill in contact information in the New Contact pane](../media/9eb5a649-0734-467f-ba66-255225eedb4b.jpg)

## How are these contacts different from My Contacts?

My Contacts are contacts that you create for yourself and that your users create for themselves, but others can't see ([learn more](https://support.office.com/article/5fe173cf-e620-4f62-9bf6-da5041f651bf.aspx)). The contacts you create in the admin center are contacts for the organization, and everyone can see them in addition to their own.

## How does everyone get to the contacts I created in the admin center?

They can go to **People** in Office 365, expand **Directory**, and select **All Contacts**. They can view each contact and their information from there.

## Can anyone create and edit these organizational contacts?

No. Only **Global** and **Exchange** administrators can create, edit, or delete these contacts. Everyone else can only view them.

## Can I use this to manage my business clients?

You can use Office 365 contacts however you like, but there are limitations. Learn about [other ways to manage contacts](ways-to-manage-contacts.md).

## How do I bulk import organizational contacts?

Use Windows PowerShell and a CSV (Comma Separated Value) file to bulk import external contacts as described in [Bulk import external contacts to Exchange Online](https://support.office.com/article/bed936bc-0969-4a6d-a7a5-66305c14e958).

## What if my question still hasn't been answered?

Visit the rest of our [admin help](https://support.office.com/article/17d3ff3f-3601-466e-b5a1-482b31cfb791.aspx) or give us your feedback below.
42.03125
335
0.761338
eng_Latn
0.989742
1c134f63445e6ac626975775fad074818d7655f9
1,012
md
Markdown
article/PART9/deep-learning/keras/keras-01-env-install-set-build.md
LuckinJack/LuckinJack.github.io
8caf1bfa1a02d1c689a1da3ad1829b45861e8d5b
[ "Apache-2.0" ]
1
2021-03-08T14:26:39.000Z
2021-03-08T14:26:39.000Z
article/PART9/deep-learning/keras/keras-01-env-install-set-build.md
LuckinJack/LuckinJack.github.io
8caf1bfa1a02d1c689a1da3ad1829b45861e8d5b
[ "Apache-2.0" ]
null
null
null
article/PART9/deep-learning/keras/keras-01-env-install-set-build.md
LuckinJack/LuckinJack.github.io
8caf1bfa1a02d1c689a1da3ad1829b45861e8d5b
[ "Apache-2.0" ]
null
null
null
# Installing Keras (on Windows 10)

Before installing Keras, install TensorFlow by following this article: <a href="../../PART9/tensorflow/tensorflow-01-env-install-set-build.html">Installing TensorFlow 2.1</a>

## Installing Keras

Run the following command to install Keras:

`pip install keras==2.3.1 -i https://pypi.doubanio.com/simple`

Then create a new Python file in *PyCharm*, copy the following Python code into the IDE, and run it:

```python
import tensorflow as tf
import keras

gpu_available = tf.test.is_gpu_available()
print(tf.__version__)
print(keras.__version__)
print(gpu_available)
```

If the *PyCharm* console prints the TensorFlow and Keras version numbers, and `True` for GPU availability, the setup succeeded.

# References

- [2020-08: Tested and working! Win10 Anaconda (Python 3.8) + TensorFlow 2.1.0-gpu + Spyder installation tutorial](https://blog.csdn.net/zazazaz1/article/details/108064895)
- [Using TensorFlow 2.0 with Python 3.8](https://jingyan.baidu.com/article/bea41d435e3e9af5c51be6e1.html)
- [How to change the default pip install dependency path](https://www.jb51.net/article/149625.htm)
- [Changing the default pip install path](https://www.cnblogs.com/maggieq8324/p/12099068.html)
- [Installing keras 2.3.1 and tensorflow 2.1 under Anaconda](https://my.oschina.net/u/4023145/blog/4496410) (this one works well)
22.488889
138
0.767787
yue_Hant
0.679521
1c1383290c2d83ee38f8930a81ba26097a5a004c
2,157
md
Markdown
source/includes/_job_templates.md
Scripted/api-docs
95f03af0c6b71a12809aa72fc2bea9ecefba643c
[ "Apache-2.0" ]
null
null
null
source/includes/_job_templates.md
Scripted/api-docs
95f03af0c6b71a12809aa72fc2bea9ecefba643c
[ "Apache-2.0" ]
null
null
null
source/includes/_job_templates.md
Scripted/api-docs
95f03af0c6b71a12809aa72fc2bea9ecefba643c
[ "Apache-2.0" ]
1
2020-01-28T13:22:59.000Z
2020-01-28T13:22:59.000Z
# Job Templates

```ruby
ScriptedClient::JobTemplate.all
```

```shell
curl -H "Authorization: Bearer abcdefghij0123456789" \
  https://api.scripted.com/abcd1234/v1/job_templates
```

> Sample JobTemplate

```json
{
  "id": "5654ec02a6e02a37e70000cc",
  "name": "Standard Blog Post",
  "created_at": "2015-11-24T15:00:18-08:00",
  "content_format": {
    "id": "5654ec02a6e02a37e70000d5",
    "name": "Standard Blog Post",
    "pitchable": true,
    "length_metric": "350-450 words",
    "quantity_options": [ 1 ]
  },
  "pricing": {
    "base": 9900,
    "specialist": 14900
  },
  "prompts": [
    {
      "id": "5654ec02a6e02a37e70000d8",
      "kind": "checkbox",
      "label": "Goal",
      "description": "Select one or many",
      "answer_required": false,
      "value_options": [
        "Informed analysis",
        "Thought leadership",
        "Repurpose existing writing",
        "Promote topic"
      ]
    },
    {
      "id": "5654ec02a6e02a37e70000da",
      "kind": "string[255]",
      "label": "Sample Blog",
      "description": "Link to an existing blog and describe why it's a good sample",
      "answer_required": false
    },
    {
      "id": "5654ec02a6e02a37e70000dc",
      "kind": "checkbox",
      "label": "Blog Structure",
      "description": "Select one or many",
      "answer_required": false,
      "value_options": [ "Paragraphs", "Subheads", "Lists" ]
    },
    {
      "id": "5654ec02a6e02a37e70000de",
      "kind": "array",
      "label": "Key Points",
      "description": "List key points your writer should address",
      "answer_required": false
    },
    {
      "id": "5654ec02a6e02a37e70000e3",
      "kind": "radio",
      "label": "Links to Sources",
      "description": "Select one",
      "answer_required": false,
      "value_options": [
        "Include sources as linked anchor text",
        "Include sources as source list",
        "No sources"
      ]
    }
  ]
}
```

A JobTemplate has a [ContentFormat](#content-formats), such as a **Short Blog Post**, and a collection of [Prompts](#prompts) that are designed to help guide your writer.
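As a quick illustration of consuming the response above, the sketch below parses a trimmed copy of the sample JobTemplate and summarizes it. This is not part of the official client; the assumption that `pricing` amounts are in cents (9900 rendering as $99.00) is an inference from the sample, so verify it against the API before relying on it.

```python
import json

# A trimmed copy of the sample JobTemplate above, keeping only the
# fields this sketch inspects.
sample = json.loads("""
{
  "name": "Standard Blog Post",
  "pricing": {"base": 9900, "specialist": 14900},
  "prompts": [
    {"kind": "checkbox", "label": "Goal", "answer_required": false},
    {"kind": "string[255]", "label": "Sample Blog", "answer_required": false}
  ]
}
""")

def summarize(template):
    # Assumes prices are integer cents; renders the base price as dollars.
    base_dollars = template["pricing"]["base"] / 100
    labels = [p["label"] for p in template["prompts"]]
    return f'{template["name"]} (${base_dollars:.2f} base): ' + ", ".join(labels)

print(summarize(sample))
```
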
24.235955
173
0.579045
eng_Latn
0.563569
1c139f8c871e514bf858619f84455d7cada0db3a
924
markdown
Markdown
doc/api/cartpurchase/non-owner_is_logged_in.markdown
barkbox/shopping
400fd4108ac62282454abd239cdbc9f385737735
[ "MIT" ]
null
null
null
doc/api/cartpurchase/non-owner_is_logged_in.markdown
barkbox/shopping
400fd4108ac62282454abd239cdbc9f385737735
[ "MIT" ]
2
2017-08-09T16:55:22.000Z
2018-04-23T19:41:25.000Z
doc/api/cartpurchase/non-owner_is_logged_in.markdown
barkbox/shopping
400fd4108ac62282454abd239cdbc9f385737735
[ "MIT" ]
1
2018-03-27T18:45:00.000Z
2018-03-27T18:45:00.000Z
# CartPurchase API

## non-owner is logged in

### GET /cart_purchases/:id

### Parameters

| Name | Description | Required | Scope |
|------|-------------|----------|-------|
| cart_purchase_id | Cart Purchase ID | true | |

### Request

#### Headers

<pre>Content-Type: application/vnd.api+json
Host: example.org
Cookie: </pre>

#### Route

<pre>GET /cart_purchases/5</pre>

### Response

#### Headers

<pre>X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Content-Type: application/json; charset=utf-8
Cache-Control: no-cache
X-Request-Id: afb160b8-11c6-4ef5-accf-36a8f88609cf
X-Runtime: 0.004610
Content-Length: 141</pre>

#### Status

<pre>403 Forbidden</pre>

#### Body

<pre>{
  "errors": [
    {
      "title": "Show Forbidden",
      "detail": "You don't have permission to show this shopping/cart purchase.",
      "code": "403",
      "status": "403"
    }
  ]
}</pre>
17.111111
81
0.626623
eng_Latn
0.279184
1c145da7f591b54edfa80a73a47fae2c73d2c4a6
656
md
Markdown
README.md
Iaggelis/tuppers-formula
dec63f97818e941a37a153324df36f85bb3530a1
[ "MIT" ]
null
null
null
README.md
Iaggelis/tuppers-formula
dec63f97818e941a37a153324df36f85bb3530a1
[ "MIT" ]
null
null
null
README.md
Iaggelis/tuppers-formula
dec63f97818e941a37a153324df36f85bb3530a1
[ "MIT" ]
null
null
null
# Tupper's self-referential formula A simple visualization of tupper's self-referential formula using C and SDL2. The code for the visualization is a simplified version of Tsoding's nice project [https://github.com/tsoding/gp](https://github.com/tsoding/gp). The calculation part is not yet 100% correct, as can be seen from the screenshot: ![screenshot](./images/screenshot.png) ## Building Dependencies: - [SDL2](https://www.libsdl.org) - [SDL2_gfx](https://github.com/ferzkopp/SDL_gfx) - [mpfr](https://www.mpfr.org) Usage: ```console $ make $ ./tupper ``` ## References - https://en.wikipedia.org/wiki/Tupper%27s_self-referential_formula
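As a companion to the C visualization, Tupper's inequality itself is easy to evaluate directly. The sketch below (an illustration in Python, separate from this repo's code) rewrites the formula's power-of-two term as an integer bit test, then round-trips a tiny bitmap through its `k` value; the famous 543-digit constant works the same way, just with a much larger `k`.

```python
def plotted(x, y):
    # Tupper's inequality: 1/2 < floor(mod(floor(y/17) * 2^(-17*floor(x) - mod(floor(y),17)), 2))
    # For integer x, y this is exactly "bit (17*x + y % 17) of floor(y/17) is set",
    # which avoids the floating-point blow-up of computing 2^(-17x-...) directly.
    return ((y // 17) >> (17 * x + y % 17)) & 1 == 1

def encode(columns):
    """Encode a bitmap (list of 17-bit column integers, bit r = row r)
    into a k value, so the image appears in the strip k <= y < k + 17."""
    n = 0
    for cx, col in enumerate(columns):
        n |= col << (17 * cx)
    return 17 * n

# Two columns: pixel (0, row 0) and pixel (1, row 1) are set.
k = encode([0b00000000000000001, 0b00000000000000010])
assert plotted(0, k + 0)
assert plotted(1, k + 1)
assert not plotted(0, k + 1)
```
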
24.296296
305
0.737805
eng_Latn
0.679251
1c14a28b82aecb1295c6fb813d9225c3dafe7b3c
1,661
md
Markdown
README.md
ozanlimited/ozan-checkout-ios
431a516b60b169fbbc4c9c7307d84b801992bbe4
[ "MIT" ]
3
2017-08-03T15:13:27.000Z
2020-07-08T10:54:18.000Z
README.md
ozanlimited/ozan-checkout-ios
431a516b60b169fbbc4c9c7307d84b801992bbe4
[ "MIT" ]
null
null
null
README.md
ozanlimited/ozan-checkout-ios
431a516b60b169fbbc4c9c7307d84b801992bbe4
[ "MIT" ]
null
null
null
## Ozan Checkout iOS [![CocoaPods](https://img.shields.io/badge/platform-ios-orange.svg)](https://cocoapods.org/pods/OzanCheckout) [![Languages](https://img.shields.io/badge/languages-ObjC%20%7C%20%20Swift-orange.svg?maxAge=2592000)](https://github.com/intercom/intercom-ios) [![CocoaPods](https://img.shields.io/badge/pod-0.0.1-blue.svg)](https://cocoapods.org/pods/OzanCheckout) [![carthage compatible](https://img.shields.io/badge/Carthage-compatible-brightgreen.svg)](https://github.com/Carthage/Carthage) [![MIT License](https://img.shields.io/github/license/mashape/apistatus.svg)](https://www.apache.org/licenses/LICENSE-2.0.html) ## Installation Ozan Checkout iOS supports iOS 8, iOS 9, iOS 10 and iOS 11. ### CocoaPods Add the OzanCheckout pod into your Podfile and run `pod install`. target :YourTargetName do pod 'OzanCheckout' end ### Carthage 1. Add `github "ozanlimited/ozan-checkout-ios"` to your Cartfile. 2. Run carthage update. 3. Go to your Xcode project's "General" settings. Drag `OzanCheckout.framework` from `Carthage/Build/iOS` to the "Embedded Binaries" section. Make sure “Copy items if needed” is selected and click Finish. ### Manual Installation 1. [Download OzanCheckout for iOS](https://github.com/ozanlimited/ozan-checkout-ios/archive/master.zip) and extract the zip. 2. Go to your Xcode project's "General" settings. Drag `OzanCheckout.framework` to the "Embedded Binaries" section. Make sure "Copy items if needed" is selected and click Finish. ## Example app There is an example app provided [here](https://github.com/ozanlimited/ozan-checkout-ios/tree/master/Examples) for both Objective-C and Swift.
50.333333
204
0.757977
kor_Hang
0.287619
1c14a40b2ec87571697923557a757027d2ef4608
178
md
Markdown
showcase-worlds.md
McMeddon/WorldPainterWebsiteButInFancy
ffc60ab5414455087b4906c7546fda4b8f3f3040
[ "Naumen", "Condor-1.1", "MS-PL" ]
null
null
null
showcase-worlds.md
McMeddon/WorldPainterWebsiteButInFancy
ffc60ab5414455087b4906c7546fda4b8f3f3040
[ "Naumen", "Condor-1.1", "MS-PL" ]
null
null
null
showcase-worlds.md
McMeddon/WorldPainterWebsiteButInFancy
ffc60ab5414455087b4906c7546fda4b8f3f3040
[ "Naumen", "Condor-1.1", "MS-PL" ]
null
null
null
# 📷 Showcase Worlds ![](.gitbook/assets/K.png) ![](.gitbook/assets/a.png) ![](<.gitbook/assets/3 (1).jpg>) ![](.gitbook/assets/23.png) ![](.gitbook/assets/20220315telotorm.jpg)
44.5
156
0.651685
afr_Latn
0.228366
1c14aea2ecd3f38611c1f6702699752dfe4449ef
3,714
md
Markdown
help/c-implementing-target/c-implementing-target-for-client-side-web/t-mbox-download/orderconfirm-create.md
and-poulsen/target.en
124be26aacd823fc994252559c5f3621a29b89bf
[ "MIT" ]
7
2019-07-22T16:10:30.000Z
2021-06-03T14:07:16.000Z
help/c-implementing-target/c-implementing-target-for-client-side-web/t-mbox-download/orderconfirm-create.md
and-poulsen/target.en
124be26aacd823fc994252559c5f3621a29b89bf
[ "MIT" ]
176
2019-02-28T16:15:54.000Z
2022-03-01T10:49:44.000Z
help/c-implementing-target/c-implementing-target-for-client-side-web/t-mbox-download/orderconfirm-create.md
and-poulsen/target.en
124be26aacd823fc994252559c5f3621a29b89bf
[ "MIT" ]
66
2019-02-25T22:01:30.000Z
2022-03-23T12:58:24.000Z
--- keywords: order confirmation;orderConfirmPage description: Learn about the legacy mbox.js implementation of Adobe Target. Migrate to the Adobe Experience Platform Web SDK (AEP Web SDK) or to the latest version of at.js. title: How Do I Create an Order Confirmation mbox using mbox.js? feature: at.js role: Developer exl-id: 952c2d1b-1ee8-4e9b-bce3-1c439127bb9b --- # Create an Order Confirmation mbox - mbox.js The Order Confirmation mbox records details about orders on your site and allows reporting based on revenue and orders. The Order Confirmation mbox can also drive recommendation algorithms, such as "People who bought product x also bought product y." >[!IMPORTANT] > >**mbox.js end-of-life**: As of March 31, 2021, [!DNL Adobe Target] no longer supports the mbox.js library. Post March 31, 2021, all calls made from mbox.js will gracefully fail and impact your pages that have [!DNL Target] activities running by serving default content. > >We recommend that all customers migrate to the most recent version of the new [!DNL Adobe Experience Platform Web SDK] or the at.js JavaScript library before this date to avoid any potential issues with your sites. For more information, see [Overview: implement Target for client-side web](/help/c-implementing-target/c-implementing-target-for-client-side-web/implement-target-for-client-side-web.md). >[!NOTE] > >* If users make purchases on your website, we recommend implementing an Order Confirmation mbox even if you use Analytics for Target (A4T) for your reporting. > >* You can also create an Order Confirmation mbox for at.js 1.*x* using the same method; however, the [!DNL at.js] method is preferred. For more information, see [Track Conversions](/help/c-implementing-target/c-implementing-target-for-client-side-web/how-to-deployatjs/implementing-target-without-a-tag-manager.md#task_E85D2F64FEB84201A594F2288FABF053). > >* If you are using at.js 2.*x*, `mboxCreate` is no longer supported. 
For order confirmation using at.js 2.*x*, use the following tracking-related APIs: [trackEvent()](/help/c-implementing-target/c-implementing-target-for-client-side-web/adobe-target-trackevent.md) and [sendNotifications()](/help/c-implementing-target/c-implementing-target-for-client-side-web/adobe.target.sendnotifications-atjs-21.md). 1. In your order details page, insert the mbox script following the model below. 1. Replace the WORDS IN CAPITAL LETTERS with either dynamic or static values from your catalog. >[!NOTE] > >Use comma delimiting to separate multiple product IDs. **Tip:** You can also pass order information in any mbox (it does not need to be named `orderConfirmPage`). You can also pass order information in multiple mboxes within the same campaign. ``` <div class="mboxDefault"> <!-- CONTENT TO SHOW IF NO OFFERS AVAILABLE. --> </div> <script type="text/javascript"> mboxCreate('orderConfirmPage', 'productPurchasedId=PRODUCT ID FROM YOUR ORDER PAGE, PRODUCT ID2, PRODUCT ID3', 'orderTotal=ORDER TOTAL FROM YOUR ORDER PAGE', 'orderId=ORDER ID FROM YOUR ORDER PAGE'); </script> ``` The Order Confirmation mbox uses the following parameters: | Parameter | Description | |--- |--- | |`orderId`|Unique value to identify an order for conversion counting.<br>The `orderId` must be unique. Duplicate orders are ignored in reports.| |`orderTotal`|Monetary value of the purchase.<br>Do not pass the currency symbol. Use a decimal point (not a comma) to indicate decimal values.| |`productPurchasedId` (Optional)|Comma-separated list of product IDs purchased in the order.<br>These product IDs display in the audit report to support additional reporting analysis.|
67.527273
405
0.763866
eng_Latn
0.942397
1c157dfdbec985e2bc1f3622abe96c8ac7309696
1,411
md
Markdown
leetcode/desc/d9/946.md
RobWalt/rustgym
b4dc47cb36d59c157095563857b0ac5ca62c68b3
[ "MIT" ]
354
2020-08-11T07:56:06.000Z
2022-03-31T14:22:41.000Z
leetcode/desc/d9/946.md
RobWalt/rustgym
b4dc47cb36d59c157095563857b0ac5ca62c68b3
[ "MIT" ]
51
2020-10-16T05:29:05.000Z
2022-02-08T00:33:01.000Z
leetcode/desc/d9/946.md
RobWalt/rustgym
b4dc47cb36d59c157095563857b0ac5ca62c68b3
[ "MIT" ]
43
2020-09-22T07:14:15.000Z
2022-03-30T11:30:39.000Z
<div><p>Given two sequences <code>pushed</code> and <code>popped</code>&nbsp;<strong>with distinct values</strong>,&nbsp;return <code>true</code> if and only if this could have been the result of a sequence of push and pop operations on an initially empty stack.</p> <p>&nbsp;</p> <div> <p><strong>Example 1:</strong></p> <pre><strong>Input: </strong>pushed = <span id="example-input-1-1">[1,2,3,4,5]</span>, popped = <span id="example-input-1-2">[4,5,3,2,1]</span> <strong>Output: </strong><span id="example-output-1">true</span> <strong>Explanation: </strong>We might do the following sequence: push(1), push(2), push(3), push(4), pop() -&gt; 4, push(5), pop() -&gt; 5, pop() -&gt; 3, pop() -&gt; 2, pop() -&gt; 1 </pre> <div> <p><strong>Example 2:</strong></p> <pre><strong>Input: </strong>pushed = <span id="example-input-2-1">[1,2,3,4,5]</span>, popped = <span id="example-input-2-2">[4,3,5,1,2]</span> <strong>Output: </strong><span id="example-output-2">false</span> <strong>Explanation: </strong>1 cannot be popped before 2. </pre> </div> </div> <p>&nbsp;</p> <p><strong>Constraints:</strong></p> <ul> <li><code>0 &lt;= pushed.length == popped.length &lt;= 1000</code></li> <li><code>0 &lt;= pushed[i], popped[i] &lt; 1000</code></li> <li><code>pushed</code> is a permutation of <code>popped</code>.</li> <li><code>pushed</code> and <code>popped</code> have distinct values.</li> </ul> </div>
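The problem above is a classic greedy stack simulation: push each value of `pushed` in order, and pop whenever the top of the stack equals the next expected value in `popped`. A minimal Python sketch (not part of the original problem statement; the function name is illustrative):

```python
def validate_stack_sequences(pushed, popped):
    """Return True if popped could result from a sequence of push/pop
    operations on an initially empty stack fed by pushed."""
    stack = []
    i = 0  # index of the next expected value in popped
    for value in pushed:
        stack.append(value)
        # Greedily pop while the stack top matches the next expected pop.
        while stack and i < len(popped) and stack[-1] == popped[i]:
            stack.pop()
            i += 1
    # Valid if and only if every pushed value was eventually popped.
    return not stack

print(validate_stack_sequences([1, 2, 3, 4, 5], [4, 5, 3, 2, 1]))  # True
print(validate_stack_sequences([1, 2, 3, 4, 5], [4, 3, 5, 1, 2]))  # False
```

The greedy choice is safe because, with distinct values, a pop is possible at exactly one moment; delaying it can only bury the value deeper.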
41.5
266
0.647768
eng_Latn
0.551026
1c15a21dda41639f42b0bf16710d25ea44f431cd
341
md
Markdown
guide/chinese/mathematics/logarithms-introduction-to-the-relationship/index.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
10
2019-08-09T19:58:19.000Z
2019-08-11T20:57:44.000Z
guide/chinese/mathematics/logarithms-introduction-to-the-relationship/index.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
2,056
2019-08-25T19:29:20.000Z
2022-02-13T22:13:01.000Z
guide/chinese/mathematics/logarithms-introduction-to-the-relationship/index.md
SweeneyNew/freeCodeCamp
e24b995d3d6a2829701de7ac2225d72f3a954b40
[ "BSD-3-Clause" ]
5
2018-10-18T02:02:23.000Z
2020-08-25T00:32:41.000Z
--- title: Logarithms Introduction to the Relationship localeTitle: 对数介绍关系 --- ## Logarithms: Introduction to the Relationship This is a stub. [Help our community expand it](https://github.com/freecodecamp/guides/tree/master/src/pages/mathematics/logarithms-introduction-to-the-relationship/index.md). [This quick style guide helps ensure your pull request gets accepted](https://github.com/freecodecamp/guides/blob/master/README.md). #### More Information:
31
149
0.780059
yue_Hant
0.552429
1c162220befe0ba9defbdb058258cba0d007d067
1,034
md
Markdown
convene-web/app/views/guides/_glossary.md
user512/convene
be07986d168ea7de53dbde0546ce64341ec6cd88
[ "BlueOak-1.0.0", "Apache-2.0" ]
28
2020-05-04T21:38:47.000Z
2022-03-21T22:12:00.000Z
convene-web/app/views/guides/_glossary.md
user512/convene
be07986d168ea7de53dbde0546ce64341ec6cd88
[ "BlueOak-1.0.0", "Apache-2.0" ]
212
2020-04-27T18:31:20.000Z
2022-03-24T02:53:26.000Z
convene-web/app/views/guides/_glossary.md
eg-bet/convene
4fdd5734a7958f20410d776432619548e4ce4aee
[ "BlueOak-1.0.0", "Apache-2.0" ]
9
2020-06-11T04:09:34.000Z
2022-03-12T17:19:02.000Z
[neighborhoods]: ./neighborhoods [neighborhood]: ./neighborhoods [rooms]: ./rooms [room]: ./rooms [internal rooms]: ./rooms#internal-rooms [locked rooms]: ./rooms#locked-rooms [locked room]: ./rooms#locked-rooms [access codes]: ./rooms#access-codes [access code]: ./rooms#access-codes [room links]: ./rooms#room-links [room link]: ./rooms#room-links [spaces]: ./spaces [space]: ./spaces [neighbors]: ./people#neighbors [neighbor]: ./people#neighbors [operators]: ./people#operators [operator]: ./people#operators [space members]: ./people#space-members [space member]: ./people#space-members [space owners]: ./people#space-owners [space owner]: ./people#space-owners [guests]: ./people#guests [guest]: ./people#guests [personal navigation menu]: ./getting_around#personal-navigation-menu [room directory]: ./getting_around#room-directory [room configuration]: ./getting_around#room-configuration [space configuration]: ./getting_around#space-configuration [share action]: ./getting_around#share-action [furniture]: ./furniture
27.945946
69
0.742747
eng_Latn
0.831126
1c16800adff1c5d3804b95644941813344c236da
32
md
Markdown
README.md
samuel123457/samuel-samarpana-boye
989b831bc1d6291a22a208fac157440852dbcf6b
[ "BSL-1.0" ]
null
null
null
README.md
samuel123457/samuel-samarpana-boye
989b831bc1d6291a22a208fac157440852dbcf6b
[ "BSL-1.0" ]
null
null
null
README.md
samuel123457/samuel-samarpana-boye
989b831bc1d6291a22a208fac157440852dbcf6b
[ "BSL-1.0" ]
null
null
null
# samuel-samarpana-boye nothing
10.666667
23
0.8125
eng_Latn
0.134423
1c1698248dd9c97d0d2177e92fd5e50c23b1100d
342
md
Markdown
CHANGELOG.md
Ardesco/Query
02d5f93811adaa7553f21cf8d63f420c6c14bc84
[ "Apache-2.0" ]
11
2017-11-20T08:35:56.000Z
2020-04-15T20:04:35.000Z
CHANGELOG.md
Ardesco/Query
02d5f93811adaa7553f21cf8d63f420c6c14bc84
[ "Apache-2.0" ]
7
2017-11-20T20:16:44.000Z
2019-02-14T07:40:28.000Z
CHANGELOG.md
Ardesco/Query
02d5f93811adaa7553f21cf8d63f420c6c14bc84
[ "Apache-2.0" ]
5
2018-06-12T10:39:04.000Z
2020-08-13T02:31:19.000Z
# Changelog ## Next Version (Release Date TBC) Release Notes ## Version 1.2.0 Release Notes * Modify Query instantiation so that it requires a driver object for each query object to make it thread safe. ## Version 1.1.0 Release Notes * Add support for Appium MobileElement. ## Version 1.0.0 Release Notes * Initial release of Query object.
22.8
110
0.760234
eng_Latn
0.925375
1c16b213430936cad9ba39e4fdf840f6191c9001
14,054
md
Markdown
CHANGELOG.md
carltonstale/zeitwerk
914cb1a752c70dc0798c476c266cc8dd35b5359f
[ "MIT" ]
null
null
null
CHANGELOG.md
carltonstale/zeitwerk
914cb1a752c70dc0798c476c266cc8dd35b5359f
[ "MIT" ]
null
null
null
CHANGELOG.md
carltonstale/zeitwerk
914cb1a752c70dc0798c476c266cc8dd35b5359f
[ "MIT" ]
null
null
null
# CHANGELOG ## 2.5.4 (28 January 2022) * If a file did not define the expected constant, there was a reload, and there were `on_unload` callbacks, Zeitwerk still tried to access the constant during reload, which raised. This has been corrected. ## 2.5.3 (30 December 2021) * The change introduced in 2.5.2 implied a performance regression that was particularly dramatic in Ruby 3.1. We'll address [#198](https://github.com/fxn/zeitwerk/issues/198) in a different way. ## 2.5.2 (27 December 2021) * When `Module#autoload` triggers the autovivification of an implicit namespace, `$LOADED_FEATURES` now gets the corresponding directory pushed. This is just a tweak to Zeitwerk's `Kernel#require` decoration. That way it acts more like the original, and cooperates better with other potential `Kernel#require` wrappers, like Bootsnap's. ## 2.5.1 (20 October 2021) * Restores support for namespaces that are not hashable. For example, namespaces that override the `hash` method with a different arity, as shown in [#188](https://github.com/fxn/zeitwerk/issues/188). ## 2.5.0 (20 October 2021) ### Breaking changes * Requires Ruby 2.5. * Deletes the long-time deprecated preload API. Instead of: ```ruby loader.preload("app/models/user.rb") ``` just reference the constant on setup: ```ruby loader.on_setup { User } ``` If you want to eager load a namespace, use the constants API: ```ruby loader.on_setup do Admin.constants(false).each { |cname| Admin.const_get(cname) } end ``` ### Bug fixes * Fixes a bug in which a certain valid combination of overlapping trees managed by different loaders and ignored directories was mistakenly reported as having conflicting directories. * Detects external namespaces defined with `Module#autoload`. If your project reopens a 3rd party namespace, Zeitwerk already detected it and did not consider the namespace to be managed by the loader (automatically descends, ignored for reloads, etc.). 
However, the loader did not do that if the namespace had only an autoload in the 3rd party code yet to be executed. Now it does. ### Callbacks * Implements `Zeitwerk::Loader#on_setup`, which allows you to configure blocks of code to be executed on setup and on each reload. When the callback is fired, the loader is ready; you can refer to project constants in the block. See the [documentation](https://github.com/fxn/zeitwerk#the-on_setup-callback) for further details. * There is a new catch-all `Zeitwerk::Loader#on_load` that takes no argument and is triggered for all loaded objects: ```ruby loader.on_load do |cpath, value, abspath| # ... end ``` Please, remember that if you want to trace the activity of a loader, `Zeitwerk::Loader#log!` logs plenty of information. See the [documentation](https://github.com/fxn/zeitwerk#the-on_load-callback) for further details. * The block of the existing `Zeitwerk::Loader#on_load` also receives the value stored in the constant, and the absolute path to its corresponding file or directory: ```ruby loader.on_load("Service::NotificationsGateway") do |klass, abspath| # ... end ``` Remember that blocks can be defined to take fewer arguments than passed. So this change is backwards compatible. If you had ```ruby loader.on_load("Service::NotificationsGateway") do Service::NotificationsGateway.endpoint = ... end ``` That works. * Implements `Zeitwerk::Loader#on_unload`, which allows you to configure blocks of code to be executed before a certain class or module gets unloaded: ```ruby loader.on_unload("Country") do |klass, _abspath| klass.clear_cache end ``` These callbacks are invoked during unloading, which happens in an unspecified order. Therefore, they should not refer to reloadable constants. The callback can also be registered for all unloaded objects: ```ruby loader.on_unload do |cpath, value, abspath| # ... end ``` Please, remember that if you want to trace the activity of a loader, `Zeitwerk::Loader#log!` logs plenty of information. 
See the [documentation](https://github.com/fxn/zeitwerk/blob/master/README.md#the-on_unload-callback) for further details. ### Assorted * Performance improvements. * Documentation improvements. * The method `Zeitwerk::Loader#eager_load` accepts a `force` flag: ```ruby loader.eager_load(force: true) ``` If passed, eager load exclusions configured with `do_not_eager_load` are not honoured (but ignored files and directories are). This may be handy for test suites that eager load in order to ensure all files define the expected constant. * Eliminates internal use of `File.realpath`. One visible consequence is that in logs root dirs are shown as configured if they contain symlinks. * When an autoloaded file does not define the expected constant, Ruby clears state differently starting with Ruby 3.1. Unloading has been revised to be compatible with both behaviours. * Logging prints a few new traces. ## 2.4.2 (27 November 2020) * Implements `Zeitwerk::Loader#on_load`, which allows you to configure blocks of code to be executed after a certain class or module has been loaded: ```ruby # config/environments/development.rb loader.on_load("SomeApiClient") do SomeApiClient.endpoint = "https://api.dev" end # config/environments/production.rb loader.on_load("SomeApiClient") do SomeApiClient.endpoint = "https://api.prod" end ``` See the [documentation](https://github.com/fxn/zeitwerk/blob/master/README.md#the-on_load-callback) for further details. ## 2.4.1 (29 October 2020) * Use `__send__` instead of `send` internally. ## 2.4.0 (15 July 2020) * `Zeitwerk::Loader#push_dir` supports an optional `namespace` keyword argument. Pass a class or module object if you want the given root directory to be associated with it instead of `Object`. Said class or module object cannot be reloadable. * The default inflector is even more performant. ## 2.3.1 (29 June 2020) * Saves some unnecessary allocations made internally by MRI. 
See [#125](https://github.com/fxn/zeitwerk/pull/125), by [@casperisfine](https://github.com/casperisfine). * Documentation improvements. * Internal code base maintenance. ## 2.3.0 (3 March 2020) * Adds support for collapsing directories. For example, if `booking/actions/create.rb` is meant to define `Booking::Create` because the subdirectory `actions` is there only for organizational purposes, you can tell Zeitwerk with `collapse`: ```ruby loader.collapse("booking/actions") ``` The method also accepts glob patterns to support standardized project structures: ```ruby loader.collapse("*/actions") ``` Please check the documentation for more details. * Eager loading is idempotent, but now you can eager load again after reloading. ## 2.2.2 (29 November 2019) * `Zeitwerk::NameError#name` has the name of the missing constant now. ## 2.2.1 (1 November 2019) * Zeitwerk raised `NameError` when a managed file did not define its expected constant. Now, it raises `Zeitwerk::NameError` instead, so it is possible for client code to distinguish that mismatch from a regular `NameError`. Regarding backwards compatibility, `Zeitwerk::NameError` is a subclass of `NameError`. ## 2.2.0 (9 October 2019) * The default inflectors have an API to override how to camelize selected basenames: ```ruby loader.inflector.inflect "mysql_adapter" => "MySQLAdapter" ``` This addresses a common pattern, which is to use the basic inflectors with a few straightforward exceptions typically configured in a hash table or `case` expression. You no longer have to define a custom inflector if that is all you need. * Documentation improvements. ## 2.1.10 (6 September 2019) * Raises `Zeitwerk::NameError` with a better error message when a managed file or directory has a name that yields an invalid constant name when inflected. `Zeitwerk::NameError` is a subclass of `NameError`. ## 2.1.9 (16 July 2019) * Preloading is soft-deprecated. The use case it was thought for no longer exists. 
Please, if you have a legit use case for it, drop me a line. * Root directory conflict detection among loaders takes ignored directories into account. * Supports classes and modules with overridden `name` methods. * Documentation improvements. ## 2.1.8 (29 June 2019) * Fixes eager loading nested root directories. The new approach in 2.1.7 introduced a regression. ## 2.1.7 (29 June 2019) * Prevent the inflector from deleting parts in multiword constants whose capitalization is the same. For example, `point_2d` should be inflected as `Point2d`, rather than `Point`. While the inflector is frozen, this seems to be just wrong, and the refinement should be backwards compatible, since those constants were not usable. * Make eager loading consistent with auto loading with regard to detecting namespaces that do not define the matching constant. * Documentation improvements. ## 2.1.6 (30 April 2019) * Fixed: If an eager load exclusion contained an autoload for a namespace also present in other branches that had to be eager loaded, they could be skipped. * `loader.log!` is a convenient shortcut to get traces to `$stdout`. * Allocates fewer strings. ## 2.1.5 (24 April 2019) * Failed autoloads raise `NameError` as always, but with a more user-friendly message instead of the original generic one from Ruby. * Eager loading uses `const_get` now rather than `require`. A file that does not define the expected constant could be eager loaded, but not autoloaded, which would be inconsistent. Thanks to @casperisfine for reporting this one and helping test the alternative. ## 2.1.4 (23 April 2019) * Supports deletion of root directories on disk after they've been configured. `push_dir` requires root directories to exist to prevent misconfigurations, but after that Zeitwerk no longer assumes they exist. This might be convenient if you removed one in a web application while a server was running. ## 2.1.3 (22 April 2019) * Documentation improvements. * Internal work. 
## 2.1.2 (11 April 2019) * Calling `reload` with reloading disabled raises `Zeitwerk::ReloadingDisabledError`. ## 2.1.1 (10 April 2019) * Internal performance work. ## 2.1.0 (9 April 2019) * `loaded_cpaths` is gone, you can ask if a constant path is going to be unloaded instead with `loader.to_unload?(cpath)`. Thanks to this refinement, Zeitwerk is able to consume even less memory. (Change included in a minor upgrade because the introspection API is not documented, and it still isn't, needs some time to settle down). ## 2.0.0 (7 April 2019) * Reloading is disabled by default. In order to be able to reload you need to opt-in by calling `loader.enable_reloading` before setup. The motivation for this breaking change is twofold. On one hand, this is a design decision at the interface/usage level that reflects that the majority of use cases for Zeitwerk do not need reloading. On the other hand, if reloading is not enabled, Zeitwerk is able to use less memory. Notably, this is more optimal for large web applications in production. ## 1.4.3 (26 March 2019) * Faster reload. If you're using `bootsnap`, requires at least version 1.4.2. ## 1.4.2 (23 March 2019) * Includes an optimization. ## 1.4.1 (23 March 2019) * Fixes concurrent autovivifications. ## 1.4.0 (19 March 2019) * Trace point optimization for singleton classes by @casperisfine. See the use case, explanation, and patch in [#24](https://github.com/fxn/zeitwerk/pull/24). * `Zeitwerk::Loader#do_not_eager_load` provides a way to have autoloadable files and directories that should be skipped when eager loading. ## 1.3.4 (14 March 2019) * Files shadowed by previous occurrences defining the same constant path were being correctly skipped when autoloading, but not when eager loading. This has been fixed. This mimics what happens when there are two files in `$LOAD_PATH` with the same relative name, only the first one is loaded by `require`. 
## 1.3.3 (12 March 2019) * Bug fix by @casperisfine: If the superclass or one of the ancestors of an explicit namespace `N` has an autoload set for constant `C`, and `n/c.rb` exists, the autoload for `N::C` proper could be missed. ## 1.3.2 (6 March 2019) * Improved documentation. * Zeitwerk creates at most one trace point per process, instead of one per loader. This is more performant when there are multiple gems managed by Zeitwerk. ## 1.3.1 (23 February 2019) * After module vivification, the tracer could trigger one unnecessary autoload walk. ## 1.3.0 (21 February 2019) * In addition to callables, loggers can now also be any object that responds to `debug`, which accepts one string argument. ## 1.2.0 (14 February 2019) * Use `pretty_print` in the exception message for conflicting directories. ## 1.2.0.beta (14 February 2019) * Two different loaders cannot be managing the same files. Now, `Zeitwerk::Loader#push_dir` raises `Zeitwerk::ConflictingDirectory` if it detects a conflict. ## 1.1.0 (14 February 2019) * New class attribute `Zeitwerk::Loader.default_logger`, inherited by newly instantiated loaders. Default is `nil`. * Traces include the loader tag in the prefix to easily distinguish them. * Loaders now have a tag. ## 1.0.0 (12 February 2019) * Documentation improvements. ## 1.0.0.beta3 (4 February 2019) * Documentation improvements. * `Zeitwerk::Loader#ignore` accepts glob patterns. * New read-only introspection method `Zeitwerk::Loader.all_dirs`. * New read-only introspection method `Zeitwerk::Loader#dirs`. * New introspection predicate `Zeitwerk::Loader#loaded?(cpath)`. ## 1.0.0.beta2 (22 January 2019) * `do_not_eager_load` has been removed, please use `ignore` to opt-out. * Documentation improvements. * Pronunciation section in the README, linking to sample audio file. * All logged messages have a "Zeitwerk:" prefix for easy grepping. * On reload, the logger also traces constants and autoloads removed. 
## 1.0.0.beta (18 January 2019) * Initial beta release.
39.366947
493
0.750818
eng_Latn
0.994623
1c17921febf00a3dabd5e29b6728d908a0e3e0d9
2,348
md
Markdown
data/blog/Indices-and-Range-in-csharp.md
jeevan-vj/iamjeevan
975307ca8644361b676682651bb2e62136a169b6
[ "MIT" ]
1
2021-09-29T04:11:19.000Z
2021-09-29T04:11:19.000Z
data/blog/Indices-and-Range-in-csharp.md
jeevan-vj/iamjeevan
975307ca8644361b676682651bb2e62136a169b6
[ "MIT" ]
3
2021-09-13T08:20:03.000Z
2021-11-19T01:46:24.000Z
data/blog/Indices-and-Range-in-csharp.md
jeevan-vj/iamjeevan
975307ca8644361b676682651bb2e62136a169b6
[ "MIT" ]
1
2021-09-13T08:14:53.000Z
2021-09-13T08:14:53.000Z
--- title: 'Indices and Ranges in C#' date: '2021-09-10' tags: ['c#'] draft: false summary: 'Indices and Ranges in C#' --- # Indices and Ranges in C# Indices and ranges provide clear, concise syntax to access a single element or a range of elements in a sequence. [Official Doc](https://docs.microsoft.com/en-us/dotnet/csharp/tutorials/ranges-indexes) # Indices > **^** is the index-from-end operator: it specifies that an index is relative to the end of the sequence. Rules for indices (assume we have an array named `myArray`): `^0` → `myArray[myArray.Length]`. So `myArray[^0]` throws an exception, just as `myArray[myArray.Length]` does. For any number n, the index `^n` is the same as `myArray[myArray.Length - n]`. Retrieve the last element of the array: `myArray[^1]` ```csharp int [] myArray = new int[5] {0,1,2,3,4}; Console.WriteLine("#-----Indices-----#"); // last index Console.WriteLine(myArray[^1]); // second last index Console.WriteLine(myArray[^2]); // fifth from end Console.WriteLine(myArray[^5]); ``` Retrieve the second-to-last element of the array: `myArray[^2]` Indices as variables. Indices are of type `System.Index`. ```csharp Index last = ^1; Console.WriteLine(myArray[last]); ``` String, `Span<T>`, `ReadOnlySpan<T>` and `List<T>` support indices. ![indices-and-ranges-c-sharp/Untitled.png](/static/images/indices-and-ranges-c-sharp/Untitled.png) ![indices-and-ranges-c-sharp/Untitled%201.png](/static/images/indices-and-ranges-c-sharp/Untitled%201.png) # Range > A range specifies the start and end of a subsequence. Ranges are exclusive: the end is not included in the range. **Whenever we define a range, the end position is not included in the result**. **..** is the range operator. Indices can be used with the range operator. The range `[0..^0]` → `[0..myArray.Length]`. ```csharp Console.WriteLine("All the elements: myArray[..]"); // 0,1,2,3,4 Console.WriteLine(string.Join(',',myArray[..])); Console.WriteLine("0 to ^1. 
[^1] is not included: myArray[0..^1]"); // 0,1,2,3 Console.WriteLine(string.Join(',',myArray[0..^1])); ``` Ranges as variables. ```csharp Range zeroToThird = 0..^1; // 0,1,2,3 Console.WriteLine(string.Join(',',myArray[zeroToThird])); ``` String, `Span<T>`, `ReadOnlySpan<T>` support ranges. ### Fiddle [https://dotnetfiddle.net/B2fJuL](https://dotnetfiddle.net/B2fJuL)
22.796117
142
0.68569
eng_Latn
0.848784
1c179c8c2f3927f1ac0c279b9d758fe7431b5bd7
6,988
md
Markdown
web-dev/http2.md
ntk148v/research
52d8942078f6a81f940cd59fd65f1b053fca7c3b
[ "Apache-2.0" ]
1
2017-11-24T10:45:14.000Z
2017-11-24T10:45:14.000Z
web-dev/http2.md
ntk148v/learning
9c868dc389e978f996521bc73edf3c69a4504341
[ "Apache-2.0" ]
null
null
null
web-dev/http2.md
ntk148v/learning
9c868dc389e978f996521bc73edf3c69a4504341
[ "Apache-2.0" ]
null
null
null
# HTTP/2.0 Source: https://developers.google.com/web/fundamentals/performance/http2 - [HTTP/2.0](#http20) - [1. Introduction](#1-introduction) - [2. Design](#2-design) - [2.1. Binary framing layer](#21-binary-framing-layer) - [2.2. Streams, messages, and frames](#22-streams-messages-and-frames) - [2.3. Request and response multiplexing](#23-request-and-response-multiplexing) - [2.4. Stream prioritization](#24-stream-prioritization) - [2.5. One connection per origin](#25-one-connection-per-origin) - [2.6. Flow control](#26-flow-control) - [2.7. Server push](#27-server-push) - [2.8. Header compression](#28-header-compression) ## 1. Introduction - Major revision of the HTTP network protocol. - It was derived from the earlier experimental [SPDY](https://en.wikipedia.org/wiki/SPDY) protocol. - Goals: - Reduce latency by enabling full request and response multiplexing. - Minimize protocol overhead via efficient compression of HTTP header fields. - Add support for request prioritization and server push. - HTTP/2 does not modify the application semantics of HTTP in any way. All the core concepts, such as HTTP methods, status codes, URIs, and header fields, remain in place. ## 2. Design - HTTP/1.x drawbacks: - Clients need to use multiple connections to achieve concurrency and reduce latency. - Does not compress request and response headers -> unnecessary network traffic. - Does not allow effective resource prioritization, resulting in poor use of the underlying TCP connection. - HTTP/2.0: - Introduce header field compression. - Allow multiple concurrent exchanges on the same connection. - Allow interleaving of request and response messages on the same connection and use an effective coding for HTTP header fields. - Allow prioritization of requests, letting more important requests complete more quickly. - At the core of all performance enhancements of HTTP/2 is the new **binary framing layer**. ### 2.1. 
Binary framing layer - How the HTTP messages are encapsulated and transferred between the client and server. ![](https://developers.google.com/web/fundamentals/performance/http2/images/binary_framing_layer01.svg) - New optimized encoding mechanism between the socket interface and the higher HTTP API exposed to our applications: HTTP semantics are unaffected but the way they are encoded while in transit is different. - All HTTP/2 communication is split into smaller messages and frames, each of which is encoded in binary format. ### 2.2. Streams, messages, and frames - HTTP/2 terminology: - *Stream*: A bidirectional flow of bytes within an established connection, which may carry one or more messages. - *Message*: A complete sequence of frames that maps to a logical request or response message. - *Frame*: The smallest unit of communication in HTTP/2, each containing a frame header, which at a minimum identifies the stream to which the frame belongs. ![](https://developers.google.com/web/fundamentals/performance/http2/images/streams_messages_frames01.svg) - All communication is performed over a single TCP connection that can carry any number of bidirectional *streams*. - Each *stream* has a unique identifier and optional priority information that is used to carry bidirectional *messages*. - Each *message* is a logical HTTP message, such as a request/response, which consists of one or more *frames*. - The *frame* is the smallest unit of communication that carries a specific type of data—e.g., HTTP headers, message payload, and so on. Frames from different streams may be interleaved and then reassembled via the embedded stream identifier in the header of each frame. ### 2.3. Request and response multiplexing - New binary framing layer -> allows the client and server to break down an HTTP message into independent frames, interleave them, and then reassemble them on the other end -> enables full request and response multiplexing. 
![](https://developers.google.com/web/fundamentals/performance/http2/images/multiplexing01.svg) ### 2.4. Stream prioritization - The HTTP/2 standard allows each stream to have an associated weight and dependency: - Each stream may be assigned an interger weight (>=1 and <= 256>). - Each stream may be given an explicit dependency on another stream. - The combination of stream dependencies and weights allows the client to construct and communicate a "prioritization tree". ![](https://developers.google.com/web/fundamentals/performance/http2/images/stream_prioritization01.svg) - *The parent stream should be allocated resources ahead of its dependencies*. - *Streams that share the same parent (in other words, sibling streams) should be allocated resources in proportion to their weight*. ### 2.5. One connection per origin - All HTTP/2 connections are persistent, and only one connection per origin is required, which offers numerous performance benefits. ### 2.6. Flow control - A mechanism to prevent the sender from overwhelming the receiver with data it may not want or be able to process - HTTP/2 provides a set of simple building blocks that allow the client and server to implement their own stream- and connection-level flow control: - Flow control is directional. Each receiver may choose to set any window size that it desires for each stream and the entire connection. - Flow control is credit-based. Each receiver advertises its initial connection and stream flow control window (in bytes), which is reduced whenever the sender emits a `DATA` frame and incremented via a `WINDOW_UPDATE` frame sent by the receiver. - Flow control cannot be disabled. - Flow control is hop-by-hop, not end-to-end. ### 2.7. Server push - The server is able to send multiple responses for a single client request. 
![](https://developers.google.com/web/fundamentals/performance/http2/images/push01.svg) - Push resources can be: - Cached by the client - Reused across different pages - Multiplexed alongside other resources - Prioritized by the server - Declined by the client - The client needs to know which resources the server intends to push to avoid creating duplicate requests for these resources -> send all `PUSH_PROMISE` frames, which contain just the HTTP headers of the promised resource, ahead of the parent’s response. ### 2.8. Header compression - HTTP/2 compresses request and response header metadata using the HPACK compression format that uses two simple but powerful techniques: - It allows the transmitted header fields to be encoded via a static Huffman code, which reduces their individual transfer size. - It requires that both the client and server maintain and update an indexed list of previously seen header fields (in other words, it establishes a shared compression context), which is then used as a reference to efficiently encode previously transmitted values. ![](https://developers.google.com/web/fundamentals/performance/http2/images/header_compression01.svg)
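The stream/message/frame model from sections 2.2 and 2.3 can be sketched in a few lines of Python. This is a toy illustration only, not the wire format (real HTTP/2 frames carry a 9-byte binary header and HPACK-encoded fields); it only shows how frames from different streams can share one connection and still be reassembled via the embedded stream identifier:

```python
from collections import defaultdict, namedtuple

# Toy frame: just a stream id and a payload chunk.
Frame = namedtuple("Frame", ["stream_id", "payload"])

def split_into_frames(stream_id, message, chunk=4):
    """Break one logical message into frames belonging to one stream."""
    return [Frame(stream_id, message[i:i + chunk])
            for i in range(0, len(message), chunk)]

def interleave(*frame_lists):
    """Round-robin frames from several streams onto one 'connection'.

    Assumes equally many frames per stream, for brevity.
    """
    out = []
    for group in zip(*frame_lists):
        out.extend(group)
    return out

def reassemble(frames):
    """Regroup frames by the embedded stream identifier."""
    messages = defaultdict(str)
    for f in frames:
        messages[f.stream_id] += f.payload
    return dict(messages)

a = split_into_frames(1, "GET /index.html")
b = split_into_frames(3, "GET /style.css!")
wire = interleave(a, b)   # frames from both streams share one connection
print(reassemble(wire))   # -> {1: 'GET /index.html', 3: 'GET /style.css!'}
```

Each message comes back intact per stream, regardless of how the frames were ordered on the wire, which is exactly what makes multiplexing safe.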
60.241379
270
0.774041
eng_Latn
0.995673
1c17a446949a77691ad572dcfd1edcee336af269
812
md
Markdown
README.md
omegascorp/sequelize-mariadb-json-test
a06c0ce5e8da2a9c263a09b49d869ff56366c4e5
[ "MIT" ]
null
null
null
README.md
omegascorp/sequelize-mariadb-json-test
a06c0ce5e8da2a9c263a09b49d869ff56366c4e5
[ "MIT" ]
1
2021-05-10T15:40:18.000Z
2021-05-10T15:40:18.000Z
README.md
omegascorp/sequelize-mariadb-json-test
a06c0ce5e8da2a9c263a09b49d869ff56366c4e5
[ "MIT" ]
null
null
null
# An experiment that reads JSON fields from MariaDB via Sequelize

## Init

```
yarn && yarn db-init
```

## Run

```
docker-compose up
```

```
yarn start
```

## Issue description

There are two models defined, User and Project. A User has many Projects, and they are associated via the userId field.

```
User {
  id: INTEGER;
  data: JSON;
}

Project {
  id: INTEGER;
  userId: INTEGER;
  data: JSON;
}
```

When you select users and include their projects, `user.data` has type object, but `project.data` has type string.

```
const users = await User.findAll({
  include: [
    {
      model: Project,
      as: 'projects',
    },
  ],
  attributes: ['id', 'data', 'projects.id', 'projects.data'],
});

console.info(typeof users[0].data); // object
console.info(typeof users[0].projects[0].data); // string
```
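Until the underlying dialect behavior is fixed, one common workaround is to normalize the value at read time. The helper below is a sketch that is not part of this repo: it parses `data` only when it arrives as a string, and leaves already-parsed objects alone.

```javascript
// Normalize a JSON column value: with MariaDB, included associations can
// come back as the raw JSON string instead of a parsed object.
function normalizeJson(value) {
  if (typeof value === 'string') {
    try {
      return JSON.parse(value);
    } catch (e) {
      return value; // not valid JSON after all; return unchanged
    }
  }
  return value;
}

// Example values as they might come back from the query above:
const userData = { theme: 'dark' };     // already parsed
const projectData = '{"name":"demo"}';  // still a string

console.log(typeof normalizeJson(userData));    // object
console.log(typeof normalizeJson(projectData)); // object
```

In Sequelize this could plausibly be wired up as a custom getter on the `data` attribute (something like `get() { return normalizeJson(this.getDataValue('data')); }`), but that wiring is an assumption; verify it against the Sequelize version in use.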
15.615385
112
0.646552
eng_Latn
0.972138
1c1828f76cd5b62d4c3bf32f4a1b9ebb6d351bd7
114
md
Markdown
go/Readme.md
joshuawalcher/dockerfile-boilerplates
caff607487e8733fc157671de16a66325adab107
[ "MIT" ]
229
2020-06-03T16:43:04.000Z
2022-03-13T07:51:48.000Z
go/Readme.md
joshuawalcher/dockerfile-boilerplates
caff607487e8733fc157671de16a66325adab107
[ "MIT" ]
5
2020-06-04T02:30:24.000Z
2020-06-05T07:12:46.000Z
go/Readme.md
joshuawalcher/dockerfile-boilerplates
caff607487e8733fc157671de16a66325adab107
[ "MIT" ]
22
2020-06-03T20:40:30.000Z
2022-03-12T21:16:57.000Z
# Go Boilerplate Docker

Build and run Go scripts via Docker.

```
docker build -t go .
docker run --rm go
```
11.4
36
0.657895
kor_Hang
0.484091
1c192ed199ccaf68b767d9de0050daab9e43db13
5,622
md
Markdown
_posts/2019-08-02-hardcore-audio-3.md
shaoguoji/shaoguoji.github.io
66bfe7bfb48e20b92522b686853521c5e30bcb63
[ "Apache-2.0" ]
null
null
null
_posts/2019-08-02-hardcore-audio-3.md
shaoguoji/shaoguoji.github.io
66bfe7bfb48e20b92522b686853521c5e30bcb63
[ "Apache-2.0" ]
19
2017-08-03T15:43:32.000Z
2018-07-20T10:58:00.000Z
_posts/2019-08-02-hardcore-audio-3.md
shaoguoji/shaoguoji.github.io
66bfe7bfb48e20b92522b686853521c5e30bcb63
[ "Apache-2.0" ]
3
2018-10-01T10:40:46.000Z
2021-05-27T11:37:06.000Z
---
layout: post
title: Hardcore Audio Series (3) - Linear Fade In and Out
subtitle: Algorithm ideas, implementation, and optimization
date: 2019-08-02 15:12:29 +0800
author: Shao Guoji
header-img: img/post-bg-hardcore-audio.jpg
catalog: true
tag:
    - Study notes
    - Embedded
    - Digital audio
---

*Articles in the Hardcore Audio series:*

* [Hardcore Audio (1) - Representing sound: basic concepts and PCM encoding]({% post_url 2019-08-05-hardcore-audio-1 %})
* [Hardcore Audio (2) - Audio codec formats: implementing an adpcm decoder by hand]({% post_url 2019-08-04-hardcore-audio-2 %})
* [Hardcore Audio (3) - Linear fade in and out: algorithm ideas, implementation, and optimization]({% post_url 2019-08-02-hardcore-audio-3 %})

Third post in the Hardcore Audio series: how to implement fade in and fade out.

### Overview of audio fading

Fade in and fade out make the volume ramp up when playback starts and ramp down when it stops, preventing abrupt volume jumps when the sound switches. The key problem to solve is how to adjust the volume of PCM data. At first I naively assumed that adding or subtracting some number from every sample value would do the trick, but it is not that simple: adding a constant does increase the sample values, yet it distorts the waveform, so it does not work.

Fading is essentially a linear transform of the volume. It can be implemented by multiplying every sample by a coefficient that changes over time, e.g. the simplest linear function `y = kx`. In practice, over the processed range y varies between 0 and 1, corresponding to volume from zero up to the original level, and the independent variable x can be taken as some position in the audio (time, sample index, etc.). Suppose the volume should reach its maximum at position x = 500; then k = 1/500.

![Figure 1: fade in/out waveform](https://raw.githubusercontent.com/shaoguoji/blogpic/master/post-img/fading-waveform.png)

### Algorithm principle (fade in as the example)

To fade in the mono audio samples of a given time interval t1-t2 (t1 is usually 0), first convert the times into sample counts as the basic unit, then compute the ratio of already-processed samples to the total number of samples to process. That gives the transform coefficient for the next sample, which is multiplied with the sample to produce the output value.

For example, to fade in 100 samples from the beginning, the samples at positions 0, 50, and 70 are computed as follows:

```c
factor = 0 / 100;
out_sample[0] = out_sample[0] * factor;

factor = 50 / 100;
out_sample[50] = out_sample[50] * factor;

factor = 70 / 100;
out_sample[70] = out_sample[70] * factor;
```

Note that the coefficient is a fraction between 0 and 1. In real C code, to avoid floating-point arithmetic, the two steps above are usually combined into a single expression: multiply by the numerator first, then divide by the denominator:

```c
out_sample[0] = (out_sample[0] * 0) / 100;
out_sample[50] = (out_sample[50] * 50) / 100;
out_sample[70] = (out_sample[70] * 70) / 100;
```

Putting the algorithm steps together, the computation flow is:

1. From the sample rate and channel count, compute the sample positions of the start time, stop time, and total processing length: `start_sample_pos`, `stop_sample_pos`, `fade_samples_count = stop_sample_pos - start_sample_pos`
2. Compute the number of processed samples: `done_sample_count = cur_sample_pos - start_sample_pos`
3. Compute the numerator multiplication, giving an intermediate value: `tmp = out_sample * done_sample_count`
4. Compute the denominator division, giving the output sample: `out_sample = tmp / fade_samples_count`
5. Advance the current sample position: `cur_sample_pos++`

The corresponding program log:

```
[cal] factor = done_sample_count/fade_samples_count = 4917/96000
[cal] factor = done_sample_count/fade_samples_count = 4918/96000
[cal] factor = done_sample_count/fade_samples_count = 4919/96000
[cal] factor = done_sample_count/fade_samples_count = 4920/96000
[cal] factor = done_sample_count/fade_samples_count = 4921/96000
[cal] factor = done_sample_count/fade_samples_count = 4922/96000
[cal] factor = done_sample_count/fade_samples_count = 4923/96000
```

#### Performance optimization

For the denominator division, `fade_samples_count` only has to be computed once and then never changes, while the numerator needs a fresh multiplication for every sample. Sadly, even this little work is more than some hardware can handle: on a CPU-starved chip, the multiply and divide cost real time and can even cause audible stuttering.

```asm
   13758: 9802        ldr   r0, [sp, #8]
   1375a: 9903        ldr   r1, [sp, #12]
   1375c: 2300        movs  r3, #0
   1375e: 002a        movs  r2, r5
   13760: f086 fe04   bl    9a36c <____aeabi_ldivmod_from_thumb>
   13764: 2101        movs  r1, #1
   13766: 8030        strh  r0, [r6, #0]
   13768: 2018        movs  r0, #24
```

The disassembly shows that the architecture I debugged on has no divide instruction; division is implemented by calling `<____aeabi_ldivmod_from_thumb>`, which drives CPU usage too high. So the algorithm needs further optimization, mainly by trimming the arithmetic operators.

### Optimization 1: replacing the denominator division with a shift

Keep the algorithm unchanged, but replace the division in step 4 ("compute the denominator division") with a right shift. Since shifting only works for multiplication or division by powers of two, first round the denominator `fade_samples_count` down to the nearest power of two, e.g. replace 96000 with 65536, and then shift:

```c
tmp / 96000 => tmp / 65536 => tmp >> 16
```

The code can do this in one pass: count the number of binary digits of the divisor and subtract one to get the required shift amount (96000 has 17 binary digits, so shift right by 16):

```c
...
int64_t tmp;
int bitcount = 0;

tmp = fade_samples_count;
while (tmp > 0) {
    tmp = tmp / 2;
    bitcount++;
}
...
tmp = out_sample[i] * (int64_t)(f_info->cur_sample_pos - f_info->start_sample_pos);
out_sample[i] = tmp >> (bitcount - 1);
...
```

This substitution removes the division entirely; in my tests it was about 10x faster. The drawback is just as obvious: the divisor loses an unpredictable amount (depending on how close the nearest power of two is), which skews the result. In the example above, 96000 drops by more than 30,000; concretely, a fade-in configured for 3 seconds actually processed only a bit more than 1 second (this could be compensated by other means, but I have not tried).

*So this approach is not suitable for precisely timed fades. It is only approximate, and the size of the error depends on luck, but for music playback it hardly matters.*

### Optimization 2: shifting instead of a linear ramp (de-popping prompt sounds)

*Distinguish fades aimed at the "listening experience" from fades aimed at "hardware behavior"; two implementations can coexist and be chosen per case: music playback vs. prompt tones. A short prompt sound may not even need a dynamically computed linear coefficient; a crude shift operation may be enough (effect still to be verified).*

The fade described earlier works for all audio and removes the abruptness of a sound appearing suddenly. When playing music, the listener clearly hears the volume ramp; call it an "experience-oriented" fade. The second scheme below suits fading in short prompt sounds: it avoids the pop caused by a sudden energy jump at the amplifier input, so its goal is more of a "hardware-oriented" optimization, and it is a different algorithmic idea.

It is really a simplification of the linear transform: instead of computing a coefficient via division, shift the sample value directly for an approximately similar effect (or view it as a linear transform with a handful of fixed coefficients). The programming focus then becomes the shift operation itself: from where to where? At what moments? By how many bits?

#### Analysis

Consider samples 16 bits wide. Fading in from minimum to maximum volume one bit at a time takes at most 16 left shifts, i.e. the total number of shifts equals the sample width. The volume therefore rises in a 16-step staircase, and within each step the samples are twice as loud as in the previous step; compared with a linear ramp, the volume increase is jagged.

![Figure 2: volume staircase](https://raw.githubusercontent.com/shaoguoji/blogpic/master/post-img/volume-level.png)

To make sure the volume reaches its maximum exactly after 16 shifts, first compute how often to shift: the total sample count `fade_samples_count` divided by 16 (the number of samples per step); each time a shift interval is reached, shift once. For example, for 1600 samples, shift one bit every 100 samples.

Integer division tells which step the current sample is in, and simulating the increasing left shift with a decreasing right shift makes the data "emerge bit by bit". Sketch:

```c
level = fade_samples_count / 16;

while (cur_sample_pos < stop_sample_pos) {
    tmp = cur_sample_pos / level;
    out_sample[i] = out_sample[i] >> (16 - tmp);
}
```

Again, the division inside the loop has to go, replaced by a shift, and, you guessed it, the divisor first has to be rounded to a power of two (counting its binary digits does both in one step). Because of truncation error some samples are discarded, so the computed maximum level may not reach 16; the volume level range becomes 0 ~ bitcount. Rewritten:

```c
level = fade_samples_count / 16;

tmp = level;
while (tmp > 0) {
    tmp = tmp / 2;
    bitcount++;
}

while (cur_sample_pos < stop_sample_pos) {
    tmp = cur_sample_pos >> bitcount;
    out_sample[i] = out_sample[i] >> (bitcount - tmp);
}
```

As the binary weight of the shift grows, the volume change gets bigger and bigger. It sounds like this: for a long stretch the volume barely rises, and then it shoots up in the final short stretch. That is still a bit abrupt for music, but for short prompt tones it removes the pop and gets into the audio content quickly. Of course, shortening the fade time of the first algorithm achieves a similar effect.

> References
>
> * [How to implement audio fade effects - SmartWan's column - CSDN blog](https://blog.csdn.net/wxtsmart/article/details/3051418)
> * [[C] PCM audio processing - increasing or decreasing volume - zz460833359's blog - CSDN blog](https://blog.csdn.net/zz460833359/article/details/84982212)
28.11
180
0.728922
yue_Hant
0.233177
1c1a0f97f9dddda8e6a6045a39babf1bfc3d54fc
960
md
Markdown
README.md
Dorisqi/foodie-connector
b463068c49226f464b640fe7fcaec14cf2ea8966
[ "MIT" ]
null
null
null
README.md
Dorisqi/foodie-connector
b463068c49226f464b640fe7fcaec14cf2ea8966
[ "MIT" ]
null
null
null
README.md
Dorisqi/foodie-connector
b463068c49226f464b640fe7fcaec14cf2ea8966
[ "MIT" ]
null
null
null
# Foodie Connector

**This project is not production ready.**

This is a project for CS 373 Software Engineering I at Purdue University. It is a food delivery platform where you can place group orders with your friends and neighbors.

## Run the frontend

The frontend is a [React](https://reactjs.org/) application. The code is located in the `/frontend` folder. You need to create a `.env` file based on `.env.example`. Use `yarn` to install dependencies and `yarn start` to run the application.

## Run the backend

The backend is a [Laravel](https://laravel.com/) application. The code is located in the `/backend` folder. You need to create a `.env` file based on `.env.example`. Use `composer install` to install dependencies and `php artisan serve` to run the application. You will need MySQL and Redis services.

## Future

We do not have any plans to continue developing or maintaining this project. Feel free to fork if you are interested.
53.333333
300
0.759375
eng_Latn
0.9994
1c1a66ce1f58e680722bdc8f6f97e69f4c7e455e
994
md
Markdown
README.md
c-rainstorm/common-utils
2c0201bce7e1af3d3c9ab7e2a85a968da539e387
[ "MIT" ]
null
null
null
README.md
c-rainstorm/common-utils
2c0201bce7e1af3d3c9ab7e2a85a968da539e387
[ "MIT" ]
null
null
null
README.md
c-rainstorm/common-utils
2c0201bce7e1af3d3c9ab7e2a85a968da539e387
[ "MIT" ]
1
2020-10-02T13:41:17.000Z
2020-10-02T13:41:17.000Z
# common-utils

Common utilities that can be reused across projects.

## Report export tool

```java
private void doExport(int sizePreExport, int totalSize, String sheetNamePrefix, int sheetSize,
                      Function<Integer, List<AllInOneClass>> supplier) throws Exception {
    File exportFile = Paths.get(EXPORT_DIR, sheetNamePrefix + "-"
            + DateTimeFormatterUtil.get(DateTimeFormatterUtil.YYYYMMDDHHMMSS).format(LocalDateTime.now())
            + ".xlsx").toFile();
    log.info("file name:" + exportFile.getAbsolutePath());
    try (ExportService<AllInOneClass> exportService = new XLSXExportService<>(AllInOneClass.class, exportFile)) {
        long total = 0;
        for (int i = 0; i < totalSize; i += sizePreExport) {
            List<AllInOneClass> dataList = supplier.apply(sizePreExport);
            long start = System.currentTimeMillis();
            exportService.append(dataList);
            total += (System.currentTimeMillis() - start);
            log.info("Exported {} rows in {}ms", i + sizePreExport, total);
        }
    }
}
```
43.217391
162
0.65996
yue_Hant
0.468037
1c1ad00e30234eb977d29476e7ce665f08661864
10,720
md
Markdown
_posts/2020-10-04-learning-curve.md
lmc2179/lmc2179.github.io
45b835cebf43935206f313fdf0e1bdce7ad2716b
[ "MIT" ]
null
null
null
_posts/2020-10-04-learning-curve.md
lmc2179/lmc2179.github.io
45b835cebf43935206f313fdf0e1bdce7ad2716b
[ "MIT" ]
1
2020-11-01T05:35:06.000Z
2020-11-01T05:35:06.000Z
_posts/2020-10-04-learning-curve.md
lmc2179/lmc2179.github.io
45b835cebf43935206f313fdf0e1bdce7ad2716b
[ "MIT" ]
null
null
null
---
layout: post
title: "Would collecting more data improve my model's predictions? The learning curve and the value of incremental samples"
author: "Louis Cialdella"
categories: posts
tags: [data-science]
image: wine.jpg
---

*Since we usually need to pay for data (either with money to buy it or effort to collect it), it's worth knowing the value of getting more data points to fit your predictive model. We'll explore the learning curve, a model-agnostic way of understanding how performance changes as we add more data points to our sample. Analysis of the learning curve tells us whether it's worth it to collect a larger dataset, and it's easy to do this analysis in Python with scikit-learn.*

# Is it worth collecting more samples?

When we have a sample of data we want to use to fit a model, it's natural to ask ourselves whether we have "enough" samples on hand. Would collecting more data improve our model, or has it reached a performance plateau? This is a question with a lot of practical importance, because data is expensive! We need to pay to acquire samples - either literally exchanging money with a data vendor or building systems to collect data.

When our analysis is simple, like a difference in means, we can achieve this by computing the power of a hypothesis test or the precision of the effect size confidence interval. But if you have a carefully-crafted regression model or black-box machine learning model, it's a lot less clear how to gauge whether you have enough samples.

For example, let's say you've already collected [a number of datapoints about Portuguese wine](http://www3.dsi.uminho.pt/pcortez/wine/) using a combination of chemical analysis and human rating of some sample wines. You'd like to build a predictive model that relates the measurable chemical properties of wine to its quality, perhaps so you can sell it to a Portuguese winery to automate their quality assurance.

You've just done some fancy [model selection](https://lmc2179.github.io/posts/cvci.html) and decided that a Lasso model will probably give you the best trade-off between prediction quality and model simplicity. You've already gone to some effort to gather the 1,599 samples of wine measurements in your data set; but perhaps your model would make better predictions if you collected even more samples? On the one hand, this would probably be expensive. You'll have to have some more wine analyzed, and get human raters to tell you more about their preferences. On the other hand, this investment might make your model more valuable (wineries might be willing to pay more to secure certain production standards), making the acquisition of the data worth it.

In order to answer this question, let's think about what information would be sufficient to answer it. The simplest thing that comes to my mind is the relationship between the sample size and the quality of the model - if we knew that, we could figure out if it's likely that more data would provide incremental value to your model's predictive ability.

# The learning curve tells us how model performance varies with sample size

The relationship between sample size and model quality has a name: the [learning curve](https://en.wikipedia.org/wiki/Learning_curve_(machine_learning)). The learning curve is a plot of how the model's performance on our favorite metric (like MSE or ROC-AUC) varies with sample size. Sometimes we'll plot both the in-sample (training) and out-of-sample (validation) performance together; we'll just focus on the out-of-sample performance for now.

We have some intuition about the shape of this curve. As the number of samples grows, the performance of the model usually improves rapidly and then "flattens out" until adding more data points has little effect.

We'll make the assumption that this family of shapes describes the curve, and the main question is whether we're currently in the steeply rising part of the curve, or the flatter part. This doesn't strike me as a very strong assumption, since I haven't seen any examples of real-life models where the model performance "jumps" after being flat for a large number of samples. Nonetheless, this is an assumption, and we should be careful to take note of it.

Let's plot the learning curve for the RMSE of our Lasso model on our Portuguese wine data set. As usual, we can have our old friend scikit-learn do all the hard work. This piece of code is adapted from their [learning curve example](https://scikit-learn.org/stable/auto_examples/model_selection/plot_learning_curve.html). The library will run K-fold CV at varying sample sizes, giving us a mean and variance around the RMSE as the sample size changes. We'll use 10-fold splitting.

_You can find the imports and code to fetch the data in the appendix._

```python
n_folds = 10

train_sizes, _, test_scores = learning_curve(Lasso(alpha=10**(-3), normalize=True), X, y, cv=n_folds,
                                             scoring='neg_root_mean_squared_error',
                                             train_sizes=np.linspace(0.1, 1, 20))
test_scores = -test_scores
test_scores_mean = np.mean(test_scores, axis=1)
test_scores_se = np.std(test_scores, axis=1) / np.sqrt(n_folds)
test_scores_var = test_scores_se**2
```

We can then plot the relationship between sample size and model performance:

```python
plt.plot(train_sizes, test_scores_mean, marker='o', label='Mean')
plt.fill_between(train_sizes, test_scores_mean - 1.96 * test_scores_se, test_scores_mean + 1.96 * test_scores_se, alpha=.1, label='CI')
plt.title('Learning Curve for Lasso model')
plt.xlabel('Sample size')
plt.ylabel('CV RMSE')
plt.tight_layout()
plt.legend()
plt.show()
```

![Learning curve](https://raw.githubusercontent.com/lmc2179/lmc2179.github.io/master/assets/img/learning_curve/1.png)

So far, so good. This learning curve has exactly the kind of shape we'd expect. The error drops steeply as we add the first few batches of samples, but seems to "saturate" after a few hundred samples, as adding more data makes little impact. However, it can be a little tough to tell what's going on with the right hand side of the graph. It looks like the incremental data points there are providing relatively little value, but it's worth taking a closer look.

# The incremental value of a data point

The question here is whether adding a new batch of samples from the same source (a presumed IID one) would provide value by decreasing the model's RMSE. Our learning curve tells us about the performance of the model at each sample size, and we can transform it to learn about the incremental value of each batch of samples.

We'll take the first difference of the learning curve using [np.diff](https://numpy.org/doc/stable/reference/generated/numpy.diff.html). This will tell us the incremental value we observed for each batch of data points we added to the model; specifically, we'll get the change in the RMSE when that batch was added.

```python
mean_diff = np.diff(test_scores_mean)
diff_se = np.sqrt(test_scores_var[1:] + test_scores_var[:-1])

diff_df = pd.DataFrame({'mean_diff': mean_diff, 'n': train_sizes[1:]})
spline_fit = smf.wls('mean_diff ~ bs(n, df=3)', diff_df, weights=1./diff_se**2).fit() # Differing variances of observations
y_pred_df = spline_fit.get_prediction(diff_df).summary_frame(alpha=.05)

plt.scatter(diff_df['n'], diff_df['mean_diff'], label='Observed CV error')
plt.plot(diff_df['n'], y_pred_df['mean'], label='Smoothed error')
plt.fill_between(diff_df['n'], y_pred_df['mean_ci_lower'], y_pred_df['mean_ci_upper'], alpha=.1, color='blue', label='CI')
plt.axhline(0, linestyle='dotted')
plt.xlabel('Sample size')
plt.ylabel('Improvement in RMSE')
plt.title('First difference of learning curve')
plt.tight_layout()
plt.legend()
plt.show()
```

![Learning curve first difference](https://raw.githubusercontent.com/lmc2179/lmc2179.github.io/master/assets/img/learning_curve/2.png)

We see that the first 800 data points are by far the most valuable. Each incremental data point from 0 to 800 seems to have meaningfully reduced the RMSE. However, after around a sample size of 800, the incremental data points seem to provide relatively little value. In order to make the "path" here clear, I also plotted a smoothed version using a [cubic B-Spline with 3 knots](https://patsy.readthedocs.io/en/latest/spline-regression.html). You could achieve a similar result using your favorite smoother, like a moving average or lowess.

And with this, we have an answer to our initial question. Collecting more data by sending more wine to the laboratory is likely to be a poor investment with the model we've chosen. It looks like we could have done reasonably well with about 1000 data points, even; not much improvement seems to occur after that.

I should point out here that in general, there's no "rule of thumb" for what the learning curve looks like or when it flattens out. You'll need more data when the model is more complex, or your features are highly correlated, or when you have a lot of classes you're predicting. This method lets you assess the value of a given sample size for whatever crazy black-box model you've dreamed up, without having to do much ad hoc setup.

# Putting it all together: Computing the value of a larger sample

That was a lot! Let's recap it quickly, to make it clear what the process is by which we answer our original question.

- You have a sample on hand, and a particular model you've decided to fit to it so you can make predictions. You'd like to know if collecting more samples would improve your model's predictive power.
- Compute the learning curve for your favorite model, to get a feel for how the sample size affects the model's quality. Does the learning curve flatten out as we approach the current sample size, or does it still have a large slope?
- Calculate the first difference of the learning curve, and see if that first difference is about zero near the current sample size. Consider smoothing this curve to see if it has a mean of zero on the farthest part of the curve. If the first difference has "settled down" around zero, adding more samples likely won't improve things.

# Appendix: Setup and imports

We download the data:

```
curl http://www3.dsi.uminho.pt/pcortez/wine/winequality.zip --output winequality.zip
unzip winequality.zip
cd winequality/
```

Plus import libraries and read from the CSV:

```python
from sklearn.linear_model import Lasso
from sklearn.model_selection import learning_curve
import numpy as np
from matplotlib import pyplot as plt
import seaborn as sns
import pandas as pd
from statsmodels.api import formula as smf
from sklearn.utils import resample

df = pd.read_csv('winequality-red.csv', sep=';')
y = df['quality']
X = df.drop('quality', axis=1)
```
84.409449
1,173
0.782276
eng_Latn
0.998477
1c1b023dfeb989ef57a58a8e7aa5044f92e572cb
974
md
Markdown
_posts_older/2013-04-06-makethumbwd-2013.md
brontosaurusrex/brontosaurusrex.github.io
7a3b6a21f36327c176849ac75c7c55f603646c0e
[ "MIT" ]
1
2018-12-30T05:17:13.000Z
2018-12-30T05:17:13.000Z
_posts_older/2013-04-06-makethumbwd-2013.md
brontosaurusrex/brontosaurusrex.github.io
7a3b6a21f36327c176849ac75c7c55f603646c0e
[ "MIT" ]
3
2017-01-26T21:04:51.000Z
2020-02-22T11:57:10.000Z
_posts_older/2013-04-06-makethumbwd-2013.md
brontosaurusrex/brontosaurusrex.github.io
7a3b6a21f36327c176849ac75c7c55f603646c0e
[ "MIT" ]
4
2018-02-28T16:08:19.000Z
2019-06-19T21:05:05.000Z
---
id: 2539
title: makethumbwd 2013
date: 2013-04-06T20:53:12+00:00
author: bronto saurus
layout: post
guid: http://b.pwnz.org/?p=2539
permalink: /2013/04/makethumbwd-2013/
categories:
  - Uncategorized
---
<pre>#!/bin/bash
# makethumbwd 2013
# I need mtn (http://moviethumbnail.sourceforge.net/)
# and image magick installed

while [ $# -gt 0 ]; do

# expand path, so this can be used from cli as well (on relative paths)
expanded=$(readlink -f "$1")
echo "$expanded"
file="$expanded"
filename=${file%.*}
extension=${file##*.}

# mtn 1st try
mtn -i -c 1 -r 1 -t -P -o .$extension.jpg "$expanded"
if [ ! -s "$expanded".jpg ]
then
mtn -i -c 1 -r 1 -t -P -Z -o .$extension.jpg "$expanded"
fi

# convert (image magick)
convert "$expanded".jpg -gravity Center -scale 200^ -extent 120x180 "$expanded"_thumb.jpg

# remove mtn snap and rename the thumb
mv "$expanded"_thumb.jpg "$filename".jpg
rm "$expanded".jpg

shift
done
</pre>
21.644444
90
0.649897
eng_Latn
0.380679
1c1b41396aa2d9146cbbc9f581f9211c37ddec08
2,685
md
Markdown
sdk-api-src/content/tapi3if/nf-tapi3if-itcallstateevent-get_cause.md
amorilio/sdk-api
54ef418912715bd7df39c2561fbc3d1dcef37d7e
[ "CC-BY-4.0", "MIT" ]
null
null
null
sdk-api-src/content/tapi3if/nf-tapi3if-itcallstateevent-get_cause.md
amorilio/sdk-api
54ef418912715bd7df39c2561fbc3d1dcef37d7e
[ "CC-BY-4.0", "MIT" ]
null
null
null
sdk-api-src/content/tapi3if/nf-tapi3if-itcallstateevent-get_cause.md
amorilio/sdk-api
54ef418912715bd7df39c2561fbc3d1dcef37d7e
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
UID: NF:tapi3if.ITCallStateEvent.get_Cause
title: ITCallStateEvent::get_Cause (tapi3if.h)
description: The get_Cause method gets the cause associated with this event.
helpviewer_keywords: ["ITCallStateEvent interface [TAPI 2.2]","get_Cause method","ITCallStateEvent.get_Cause","ITCallStateEvent::get_Cause","_tapi3_itcallstateevent_get_cause","get_Cause","get_Cause method [TAPI 2.2]","get_Cause method [TAPI 2.2]","ITCallStateEvent interface","tapi3.itcallstateevent_get_cause","tapi3if/ITCallStateEvent::get_Cause"]
old-location: tapi3\itcallstateevent_get_cause.htm
tech.root: tapi3
ms.assetid: e3a4b985-1c0f-4e93-a965-c61c9c0ab10d
ms.date: 12/05/2018
ms.keywords: ITCallStateEvent interface [TAPI 2.2],get_Cause method, ITCallStateEvent.get_Cause, ITCallStateEvent::get_Cause, _tapi3_itcallstateevent_get_cause, get_Cause, get_Cause method [TAPI 2.2], get_Cause method [TAPI 2.2],ITCallStateEvent interface, tapi3.itcallstateevent_get_cause, tapi3if/ITCallStateEvent::get_Cause
req.header: tapi3if.h
req.include-header: Tapi3.h
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib: Uuid.lib
req.dll: Tapi3.dll
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- ITCallStateEvent::get_Cause
- tapi3if/ITCallStateEvent::get_Cause
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- COM
api_location:
- Tapi3.dll
api_name:
- ITCallStateEvent.get_Cause
---

# ITCallStateEvent::get_Cause

## -description

The <b>get_Cause</b> method gets the cause associated with this event.

## -parameters

### -param pCEC [out]

Pointer to <a href="/windows/desktop/api/tapi3if/ne-tapi3if-call_state_event_cause">CALL_STATE_EVENT_CAUSE</a> indicator.

## -returns

This method can return one of these values.

<table>
<tr>
<th>Return code</th>
<th>Description</th>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>S_OK</b></dt>
</dl>
</td>
<td width="60%">
Method succeeded.
</td>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>E_OUTOFMEMORY</b></dt>
</dl>
</td>
<td width="60%">
Insufficient memory exists to perform the operation.
</td>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>E_POINTER</b></dt>
</dl>
</td>
<td width="60%">
The <i>pCEC</i> parameter is not a valid pointer.
</td>
</tr>
</table>

## -see-also

<a href="/windows/desktop/api/tapi3if/ne-tapi3if-call_state_event_cause">CALL_STATE_EVENT_CAUSE</a>

<a href="/windows/desktop/Tapi/call-object">Call Object</a>

<a href="/windows/desktop/api/tapi3if/nn-tapi3if-itcallstateevent">ITCallStateEvent</a>
22.948718
350
0.747486
yue_Hant
0.598672
1c1bc8386f4623fbbd3aaa778d7f5f13a88b5bf0
2,360
md
Markdown
docs/vs-2015/debugger/debug-interface-access/idiasymbol-get-virtualbasetabletype.md
tommorris/visualstudio-docs.cs-cz
92c436dbc75020bc5121cc2c9e4976f62c9b13ca
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/debugger/debug-interface-access/idiasymbol-get-virtualbasetabletype.md
tommorris/visualstudio-docs.cs-cz
92c436dbc75020bc5121cc2c9e4976f62c9b13ca
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/vs-2015/debugger/debug-interface-access/idiasymbol-get-virtualbasetabletype.md
tommorris/visualstudio-docs.cs-cz
92c436dbc75020bc5121cc2c9e4976f62c9b13ca
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: IDiaSymbol::get_virtualBaseTableType | Microsoft Docs
ms.custom: ''
ms.date: 2018-06-30
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- vs-ide-debug
ms.tgt_pltfrm: ''
ms.topic: article
dev_langs:
- C++
helpviewer_keywords:
- IDiaSymbol::get_virtualBaseTableType method
ms.assetid: e0581c4f-0343-49b5-9754-a48477460e9f
caps.latest.revision: 10
author: mikejo5000
ms.author: mikejo
manager: ghogen
ms.openlocfilehash: 13dd25c60e373ea747ec802999ce998271ff3e8f
ms.sourcegitcommit: 55f7ce2d5d2e458e35c45787f1935b237ee5c9f8
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 08/22/2018
ms.locfileid: "42667719"
---
# <a name="idiasymbolgetvirtualbasetabletype"></a>IDiaSymbol::get_virtualBaseTableType
[!INCLUDE[vs2017banner](../../includes/vs2017banner.md)]

The latest version of this topic can be found at [IDiaSymbol::get_virtualBaseTableType](https://docs.microsoft.com/visualstudio/debugger/debug-interface-access/idiasymbol-get-virtualbasetabletype).

Retrieves the type of a virtual base table pointer.

## <a name="syntax"></a>Syntax

```cpp
HRESULT get_virtualBaseTableType (
   IDiaSymbol** pRetVal
);
```

#### <a name="parameters"></a>Parameters

|Parameter|Description|
|---------------|-----------------|
|`pRetVal`|[out] Returns an [IDiaSymbol](../../debugger/debug-interface-access/idiasymbol.md) object that specifies the type of the base table.|

## <a name="return-value"></a>Return Value

If successful, returns `S_OK`; otherwise, returns `S_FALSE` or an error code.

> [!NOTE]
> A return value of `S_FALSE` means the property is not available for the symbol.

## <a name="remarks"></a>Remarks

A virtual base table pointer (`vbtptr`) is a hidden pointer in a [!INCLUDE[vcprvc](../../includes/vcprvc-md.md)] vtable that handles inheritance from virtual base classes. A `vbtptr` can vary in size depending on the inherited classes.

This method returns an [IDiaSymbol](../../debugger/debug-interface-access/idiasymbol.md) object that can be used to determine the size of the vbtptr.

## <a name="requirements"></a>Requirements

|Requirement|Description|
|-----------------|-----------------|
|Header:|dia2.h|
|Version:|DIA SDK v8.0|

## <a name="see-also"></a>See Also

[IDiaSymbol](../../debugger/debug-interface-access/idiasymbol.md)
32.777778
240
0.713983
ces_Latn
0.934264
1c1c1a4f606159123cc09944c20f38439736d99a
10,262
md
Markdown
pages/guides/base-tech/information-system-design/gbt_uml.ru.md
Flexberry/Documentation
891362f896aeef59f057f92dc4429512558b7798
[ "MIT", "BSD-3-Clause" ]
8
2016-11-21T09:50:04.000Z
2020-07-19T08:14:29.000Z
pages/guides/base-tech/information-system-design/gbt_uml.ru.md
Flexberry/Documentation
891362f896aeef59f057f92dc4429512558b7798
[ "MIT", "BSD-3-Clause" ]
34
2018-08-13T12:46:25.000Z
2020-09-04T10:17:11.000Z
pages/guides/base-tech/information-system-design/gbt_uml.ru.md
Flexberry/Documentation
891362f896aeef59f057f92dc4429512558b7798
[ "MIT", "BSD-3-Clause" ]
35
2016-08-29T10:29:16.000Z
2022-03-26T20:55:49.000Z
--- title: UML keywords: Programming sidebar: guide-base-tech_sidebar toc: true permalink: ru/gbt_uml.html folder: guides/base-tech/information-system-design/ lang: ru ---

## Brief description

UML (Unified Modeling Language) is a graphical description language for object modeling in software development, business process modeling, systems design, and the representation of organizational structures.

## Links to study materials

* [UML - Wikipedia](https://ru.wikipedia.org/wiki/UML)
* [The UML standard - uml.org](http://uml.org)
* [UML materials - uml3.ru](http://uml3.ru/index.html)
* [The "Introduction to UML" course - INTUIT](http://www.intuit.ru/studies/courses/1007/229/info)
* [The "Notation and Semantics of UML" course - INTUIT](http://www.intuit.ru/studies/courses/32/32/info)

### Presentation

<div class="thumb-wrap" style="margin-top: 20px; margin-bottom: 20px"> <iframe src='https://onedrive.live.com/embed?cid=043A2F24ADFAA4FD&resid=43A2F24ADFAA4FD%21110&authkey=&em=2&wdAr=1.3333333333333332' width='610px' height='481px' frameborder='0'>This is an embedded <a target='_blank' href='https://office.com'>Microsoft Office</a> presentation, powered by <a target='_blank' href='https://office.com/webapps'>Office Online</a>.</iframe> </div>

### Recommended books

* [UML Self-Study Guide - Alexander Leonenkov](http://www.ozon.ru/context/detail/id/28266865/)
* [UML Distilled - Martin Fowler](http://www.ozon.ru/context/detail/id/2260613/)
* [Applying UML and Patterns - Craig Larman](http://www.ozon.ru/context/detail/id/3105480/)

## Software

* [Flexberry Designer](http://flexberry.ru/Flexberry/ForDevelopers/FlexberryDesigner)
* [List of UML tools - Wikipedia](https://en.wikipedia.org/wiki/List_of_Unified_Modeling_Language_tools)

## Labs and practical assignments

* [Methodology guides](http://www.twirpx.com/files/informatics/toom/ft.guideline/)
* [Discussion of practical assignments on the Analyst Community forum - uml2.ru](http://www.uml2.ru/forum/index.php?board=3.0)

Assignment variants for creating UML diagrams independently, to be checked afterwards by the instructor.

### Variant 1

There are several warehouses. For each warehouse, the owner and the name are known. Goods are stored in each warehouse. The same goods may be stored in different warehouses. Some warehouses may be temporarily empty. The capacity of each warehouse in tons is known. There are no warehouses without an owner. For each good, its name and a unique article number are known. Goods are delivered to the warehouses by trucks. For each truck, its make, load capacity in tons, and the owner's surname are known. There are no trucks without an owner. There is delivery information showing which truck delivers which good to which warehouse and in what quantity (in tons).

### Variant 2

The subject area is an institute in which students study different subjects with different teachers. For each student, the surname, group, faculty, and date of birth are known. For each teacher, the surname, department, length of service, and date of birth are known. For each subject, the name and the number of semesters over which it is studied are known. There are grade sheets in which a particular teacher gives grades (2, 3, 4, 5) to the students of a group in a particular subject. The institute has a rule that one teacher may teach several subjects and one subject may be taught by several teachers.

### Variant 3

There are several tables storing lists of books written by a particular author. Each table stores the books of one author. The authors' surnames are known. For each book, the title, the publisher, and the number of pages are known. If a book was written by several authors, it is stored in the table of each of those authors.

### Variant 4

An airline wants answers to questions like these about its aircraft: "How many seats are there in a Boeing 727? How many engines does it have? What is the average age of the Boeing 746s in our fleet? Who is the chief mechanic responsible for servicing aircraft number 1388? Which company built that aircraft?"

### Variant 5

The administration of city N wants answers to questions such as: "What is the maximum possible amount of memory in an IBM PC, a Macintosh II, a Pentium I/II/III? Which employees have a computer in their office? Who has the computer with serial number 4538842? What are its RAM size and hard-disk capacity?"

### Variant 6

A student studies several subjects; for each one, the name, the number of study hours, and the semesters in which the subject was studied are known. The student has several notebooks in which they write lecture notes. Notes on different subjects may be written in one notebook, and notes on one subject may appear in different notebooks. For each notebook, its name (the unique inscription on the cover), the cover color, and the number of lectures on each subject written in it are known.

### Variant 7

An online store keeps information about the warehouse where its goods are stored. For each good, its name, the quantity in stock, and the total value of all of that good are known. When goods arrive, the received quantity is added to the existing quantity and their values are summed. The selling price of a good is determined by dividing its total value by its quantity. Monthly sales of each good are tracked: the name, how much was sold, and at what price. On a sale, the quantity of the good and its value are adjusted.

### Variant 8

There is information about the employees of a certain firm. The firm consists of several departments, each of which has a head and several (possibly zero) subordinates. Several married couples work at the firm. The firm has a rule that, of the spouses working there, only the wife may be a head. In addition, for each employee, the salary they receive and the total volume of the deals they have closed are known.

### Variant 9

The bank "Partnership of Distinguished Programmers" opens current and savings accounts for clients (individuals and legal entities) and issues loans. A current account may be opened for use by several clients; a savings account, for only one. A client may have several current and savings accounts.

### Variant 10

A restaurant's buffet has several cabinets in which tableware, tea ware, and cutlery are stored. Each cabinet has several shelves on which different dishes are arranged. Each shelf has a number that is unique within its cabinet. Each cabinet has a name, a color, and a known number of shelves. Different dishes may stand on one shelf, and the same dishes may stand on different shelves. For each kind of dish, its name, its pattern or coloring (or the absence of one), and the quantity of that kind of dish standing on a shelf are known.

### Variant 11

There is a firm producing a certain number of goods. The firm is divided into departments, each of which is managed by one head. Several employees work in each department, and the employees in different departments are different. For the firm, the name, the address, and the bank account are known. For each employee, the surname, date of birth, address, office phone number, children's names, sex, profession, and the total production cost of the goods made by that employee are known. For each good, the name, article number, grade, production cost, and quantity produced are known.

### Variant 12

A forestry is divided into several plots on which different species of trees grow. Several species may grow on one plot, including species that also grow on other plots. For each plot, its name, its area, and the number of trees of each species growing on it are known. For each species, its name, the average tree height, the average diameter, and the average age are known.

### Variant 13

A racetrack has several stables of horses. For each stable, the names and the total number of the horses in it, as well as the jockeys working in it, are known. For each horse, its name, which races it took part in, under which jockey, and which place it took are known. For each jockey, the surname and sporting rank are known. A jockey may work in several stables and ride several horses. A horse is always ridden by only one jockey and kept in one stable.

### Variant 14

A school has several classrooms in which different desks, purchased in different countries, are installed. Desks bought in different countries may be installed in one classroom. Desks from one country may be installed in different classrooms. Desks bought in one country may be of different types. For each classroom, its number, its area, and how many desks of which type and from which country are installed in it are known. For each desk, the following are known: its type, country of manufacture, color, area, and year of production.

### Variant 15

Information is known about computers of different types installed in different research institutes. For each computer, the processor type and the amounts of RAM and hard-disk space are known. For each institute, its name, the number of employees, and the number of installed computers of each type are known.

### Variant 16

A merchant buys goods and distributes them among warehouses. The following are recorded: the name of the good, the purchase price, the quantity bought, the total amount, and how much went to which warehouse. The purchased goods are sold at retail, recording the quantity sold, the sale price, the total amount, and the warehouse from which the good was sold. In each warehouse, a good of a given name is recorded only once.

### Variant 17

An information system for the online store "Tires and Wheels of the World". The store accepts orders for the tires and wheels of various makes and modifications that it has in stock; an order is created that records the date, the quantity, the price of the goods, and the customer's full name and phone number.

### Variant 18

An information system for a system administrator. The firm has several departments. Each department has a head. The firm's computers are connected to a local network. The network is managed by several servers. It is required to keep track of the firm's computers, linked to departments, employees, and so on. The main components of the computers must also be tracked.

## Go to

* [Design patterns](gbt_design-patterns.html)
* [Main page of the course](gbt_landing-page.html)
62.193939
373
0.805496
rus_Cyrl
0.988003
1c1c3aa4f2a30218c688d695c2ddb3cfb670b89c
1,442
md
Markdown
cmd/dctor/README.md
dennwc/go-dcpp
0332759e7a0b6e0492171a383e03c34c3d1f4394
[ "BSD-3-Clause" ]
28
2019-02-03T10:12:31.000Z
2022-01-12T12:23:21.000Z
cmd/dctor/README.md
dennwc/go-dcpp
0332759e7a0b6e0492171a383e03c34c3d1f4394
[ "BSD-3-Clause" ]
97
2019-01-29T03:10:44.000Z
2021-06-07T22:19:31.000Z
cmd/dctor/README.md
dennwc/go-dcpp
0332759e7a0b6e0492171a383e03c34c3d1f4394
[ "BSD-3-Clause" ]
13
2019-01-29T03:10:53.000Z
2021-11-06T12:10:26.000Z
# Tor bridge for DC

This program implements a fully-functional ADC bridge over Tor.

Features:
- Tor is embedded into the bridge - no need to run it separately.
- All features like chat, search, and file download/upload are supported.

TODOs:
- Only Tor C-C connections are supported at the moment. Mixed connections can be implemented later.

Since Tor addresses are not known by any hubs or clients, the bridge should be running both on the server side and the client side.

## Hub

On the server side, you need any existing ADC hub that is configured to:
- Pass `INF` extension fields (specifically `EA`).
- Allow multiple users from the same IP (`127.0.0.1`, which is the bridge IP).

To run the bridge:

```
./dctor hub adc://localhost:411
```

Where `localhost:411` is the address of the hub. Wait for the following line to appear:

```
Tor address: adc://xxxxxxxxxxxxxxxx.onion
```

This address should be used in the client bridge to connect to the hub over Tor.

## Client

Any existing client can be used, but you will need to run the client-side bridge:

```
./dctor client adc://xxxxxxxxxxxxxxxx.onion --host=localhost:1412
```

Where `adc://xxxxxxxxxxxxxxxx.onion` is the hub address in the Tor network and `localhost:1412` is the local address of the bridge that the client will use to connect to the hub. Wait for the following line to appear:

```
Listening for clients on localhost:1412
```

Then connect to `adc://localhost:1412` from your client.
28.84
99
0.748266
eng_Latn
0.999094
1c1cad7484c8372b8ff5a1f750e06d69817090c0
1,875
md
Markdown
internal/js-parser/test-fixtures/typescript/types/tuple-labeled/input.test.md
mainangethe/tools
a030d7efe77ccbf3b4fcc1f1d86fd1de29e2743f
[ "MIT" ]
null
null
null
internal/js-parser/test-fixtures/typescript/types/tuple-labeled/input.test.md
mainangethe/tools
a030d7efe77ccbf3b4fcc1f1d86fd1de29e2743f
[ "MIT" ]
null
null
null
internal/js-parser/test-fixtures/typescript/types/tuple-labeled/input.test.md
mainangethe/tools
a030d7efe77ccbf3b4fcc1f1d86fd1de29e2743f
[ "MIT" ]
null
null
null
# `index.test.ts` **DO NOT MODIFY**. This file has been autogenerated. Run `rome test internal/js-parser/index.test.ts --update-snapshots` to update. ## `typescript > types > tuple-labeled` ### `ast` ```javascript JSRoot { comments: Array [] corrupt: false diagnostics: Array [] directives: Array [] hasHoistedVars: false integrity: undefined interpreter: undefined sourceType: "module" loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:0-2:0 path: UIDPath<typescript/types/tuple-labeled/input.ts> syntax: Array ["ts"] body: Array [ TSTypeAlias { id: JSBindingIdentifier { name: "StrStrNumNumBool" loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:5-1:21 (StrStrNumNumBool) } typeParameters: undefined loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:0-1:46 right: TSTupleType { loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:24-1:45 elementTypes: Array [ TSTupleElement { name: undefined optional: false loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:25-1:32 typeAnnotation: TSBooleanKeywordTypeAnnotation {loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:25-1:32} } TSTupleElement { name: undefined optional: false loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:34-1:44 typeAnnotation: TSRestType { loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:34-1:44 argument: TSTypeReference { typeParameters: undefined loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:37-1:44 typeName: JSReferenceIdentifier { name: "Strings" loc: SourceLocation typescript/types/tuple-labeled/input.ts 1:37-1:44 (Strings) } } } } ] } } ] } ``` ### `diagnostics` ``` ```
27.985075
131
0.690667
kor_Hang
0.280441
1c1e15cbbef2375ca724ee60967809142da3aeb8
2,592
md
Markdown
portfolio/orinoco.md
SamuelRiveraC/SamuelRiveraC.github.io
b69f122c40183430be38633b7e5bc207aede4f10
[ "MIT" ]
null
null
null
portfolio/orinoco.md
SamuelRiveraC/SamuelRiveraC.github.io
b69f122c40183430be38633b7e5bc207aede4f10
[ "MIT" ]
1
2021-08-02T14:02:41.000Z
2021-08-02T14:37:23.000Z
portfolio/orinoco.md
SamuelRiveraC/samuelriverac.github.io
b69f122c40183430be38633b7e5bc207aede4f10
[ "MIT" ]
null
null
null
--- title: Orinoco.io date: "2018-04-01T22:12:03.284Z" excerpt: "Front-end web development for a crypto exchange and fintech, with Vue, Vuetify, and GraphQL" description: "<p>Even before finishing my internship I landed a job at a local crypto exchange. They had a very simple system built in plain PHP where clients added orders and employees processed them. This was burdensome because they were limited by their number of employees, and the system itself offered only very basic functionality.</p> <p>We assembled a team made up of the in-house designer, a contract back-end developer, and me, the front-end developer. We planned all the modules and, together with the founders, devised a crypto exchange like LocalBitcoins and Airtm that would solve both of their problems, decentralizing the platform with more modern technology and a better user experience. We chose to develop the system with Vue and Vuetify on the front end, and Express with a Mongo database, using GraphQL to tie everything together; we spent around 5 months developing the app.</p> <p>It was a success: it is used by more than 4000 users (twice as many as when we started), more than 75,000 transactions have been made since then, and the company has doubled in size and diversified its services to become a full local fintech.</p>" posttype: "portfolio" thumbnail: orinoco.png role: "Front-end developer" client: "Orinoco" dateProject: "May 2018 - Oct 2018" location: "El Tigre, Anzoategui, Venezuela" website: "https://Orinoco.io" repository: "" testimonial: "" testimonialAuthor: "" testimonialRole: "" ---

Homepage: displaying the latest transactions and the calculator showing trade rates, with a call to action to log in. Additionally, there is a small arrow to see the rest of the page, which includes facts about the platform and social proof.

![Orinoco]( /portfolio/orinoco-1.png 'Orinoco')

The login page includes a list of benefits in the sidebar so the user can get familiar with the platform.

![Orinoco]( /portfolio/orinoco-2.png 'Orinoco')

The dashboard, with the current operations (none at the moment), a sidebar with all the operations, and the calculator to begin trading. (You can add bank accounts or wallets on the Options page.)

![Orinoco]( /portfolio/orinoco-3.png 'Orinoco')

Initiating a transfer of 100 USDT to receive 98 Nectar in my wallet (equivalent to 1 USD). Once I click "Aceptar", a popup opens to send the money and confirm the transaction; once processed, the funds are automatically stored in the Orinoco wallet. Easy, right?

![Orinoco]( /portfolio/orinoco-4.png 'Orinoco')
78.545455
1,150
0.783179
eng_Latn
0.999163
1c1e334c66be6d7b728c39ad4fbc9c9447f0eb9f
796
md
Markdown
content/en/faq/applications/zookeeper.md
tialouden/istio.io
53ddedae0586972609e48ac0b70a4b81b960f4d9
[ "Apache-2.0" ]
null
null
null
content/en/faq/applications/zookeeper.md
tialouden/istio.io
53ddedae0586972609e48ac0b70a4b81b960f4d9
[ "Apache-2.0" ]
null
null
null
content/en/faq/applications/zookeeper.md
tialouden/istio.io
53ddedae0586972609e48ac0b70a4b81b960f4d9
[ "Apache-2.0" ]
null
null
null
--- title: Can I run Zookeeper inside an Istio mesh? description: How to run Zookeeper with Istio. weight: 50 keywords: [zookeeper] test: no ---

By default, Zookeeper listens on the pod IP address for communication between servers. Istio and other service meshes require `localhost` (`127.0.0.1`) to be the address to listen on.

There is a configuration parameter that can be used to change this default behavior: [`quorumListenOnAllIPs`](https://zookeeper.apache.org/doc/r3.5.7/zookeeperAdmin.html). This option allows Zookeeper to listen on all addresses, including `localhost`.

Set this parameter to `true` by using the following command, where `$ZK_CONFIG_FILE` is your Zookeeper configuration file.

{{< text bash >}}
$ echo "quorumListenOnAllIPs=true" >> $ZK_CONFIG_FILE
{{< /text >}}
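To make the change concrete, here is a sketch of a minimal `zoo.cfg` for a three-server ensemble with the option enabled. The hostnames, ports, and `dataDir` below are placeholder values for illustration, not something this FAQ prescribes:

```
tickTime=2000
dataDir=/var/lib/zookeeper
clientPort=2181
initLimit=5
syncLimit=2
quorumListenOnAllIPs=true
server.1=zk-0.zk-headless:2888:3888
server.2=zk-1.zk-headless:2888:3888
server.3=zk-2.zk-headless:2888:3888
```

With `quorumListenOnAllIPs=true`, the quorum and election listeners bind to all interfaces, so the sidecar's `localhost` forwarding works without changing the `server.N` entries.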
31.84
86
0.765075
eng_Latn
0.97882
1c1e386b995e321408cb80f1fa1a91f19c94cc2d
559
md
Markdown
README.md
Hadsake/Yard-mc-Bot
71b8b50c7af8bcd7ec058ecf38900059e4f33924
[ "Unlicense" ]
null
null
null
README.md
Hadsake/Yard-mc-Bot
71b8b50c7af8bcd7ec058ecf38900059e4f33924
[ "Unlicense" ]
null
null
null
README.md
Hadsake/Yard-mc-Bot
71b8b50c7af8bcd7ec058ecf38900059e4f33924
[ "Unlicense" ]
null
null
null
# VaxeTurkiye-Dosyalari

VaxeTurkiye opened on 2017-11-23 and has been shut down for lack of money. The maker of this bot is LMD | xChairs#4713.

To add our new bot, Discord ve Eglen: https://discordapp.com/oauth2/authorize?client_id=446379688839872532&amp;scope=bot&amp;permissions=2146958591

A Turkish Discord bot file published on GitHub.

Setup

1. Programs required for setup: Node.JS, Git

After installing the programs, download the files. Once the files are downloaded, configure the ayarlar.json file and start the bot with the `node bot.js` command.
50.818182
260
0.842576
tur_Latn
0.996239
1c1edbaf327b234b0a717428830bdd6a182ef387
1,833
md
Markdown
README.md
daothuydung/daothuydung.github.io
c746a43f2a5cb3de35b142ea8ed0f3c6b8762459
[ "MIT" ]
1
2017-03-12T01:01:11.000Z
2017-03-12T01:01:11.000Z
README.md
daothuydung/daothuydung.github.io
c746a43f2a5cb3de35b142ea8ed0f3c6b8762459
[ "MIT" ]
null
null
null
README.md
daothuydung/daothuydung.github.io
c746a43f2a5cb3de35b142ea8ed0f3c6b8762459
[ "MIT" ]
null
null
null
# New Age Jekyll theme

## If you are a company and you're going to use the blog:

1. Contact Start Bootstrap and ask.
2. Contact me, because some unneeded parts need to be removed.

A Jekyll theme based on the [New Age bootstrap theme](https://startbootstrap.com/template-overviews/new-age/)

# Demo

View this Jekyll theme in action [here](https://jekynewage.github.io/)

## Built by [Antonio Trento](https://it.linkedin.com/in/antoniotrento)

This Jekyll template was created to build landing pages, squeeze pages, portfolios, and blogs - or all of the above.

### I integrated analytics and marketing tools such as:

- Google AdWords
- Google Analytics
- Disqus comment system
- AddThis social sharing

> External stylesheets and libraries included are Google Fonts, Font Awesome, Normalize.css, and WOW.js

To set your login data for these applications in _config.yml, just open the file and find the associated items. I also built a way to add your own company names in the same files.

**To change the base yellow colors, go to the css folder and open main.css, where you can set the primary and secondary colors; remember that the theme uses gradients in the background areas.**

If you are interested in implementing this theme, please contact me; I will do whatever is in my power to help you!

If you have noticed anything unusual or any errors in my work, I kindly ask you to let me know or send a pull request! Any opinion and critical comment is welcome! So dive right in!

We can get in touch:

1. By <a href="https://twitter.com/lantoniotrento">twitter</a>
2. By mail <a href="mailto:lantoniotrento@gmail.com">lantoniotrento[at]gmail.com</a>
3. Via <a href="https://it.linkedin.com/in/antoniotrento">LinkedIn</a>

=========

For more details, read the [documentation](http://jekyllrb.com/)
39
189
0.754501
eng_Latn
0.992672
1c1f1094d5f7b3805499fd623748df380d44ed8f
1,197
md
Markdown
docs/column-definition/cell.md
keeslinp/reactabular
85941babca56453f1c9c0a7809bff72e9a47ed9a
[ "MIT" ]
715
2016-07-12T11:22:33.000Z
2022-03-12T16:46:09.000Z
docs/column-definition/cell.md
keeslinp/reactabular
85941babca56453f1c9c0a7809bff72e9a47ed9a
[ "MIT" ]
219
2016-07-11T13:16:00.000Z
2021-04-22T09:36:11.000Z
docs/column-definition/cell.md
keeslinp/reactabular
85941babca56453f1c9c0a7809bff72e9a47ed9a
[ "MIT" ]
132
2016-07-15T14:27:04.000Z
2021-07-15T09:10:52.000Z
In addition to `header` customization, it's essential to define how the rows should map to content. This is achieved through `cell` fields.

## **`cell.transforms`**

```javascript
cell.transforms = [
  (
    <value>,
    {
      columnIndex: <number>,
      column: <object>,
      rowData: <object>,
      rowIndex: <number>,
      property: <string>
    }
  ) => ({... props ...})
]
```

`cell.transforms` follows the same idea as `header.transforms`. This time `value` is the resolved `property` and we have extra row data available.

**Example:**

```javascript
{
  cell: {
    transforms: [editable(edit.input())]
  }
}
```

## **`cell.formatters = [value => <string|React element>]`**

The idea here is the same as for `header.formatters`.

**Example:**

```javascript
{
  cell: {
    property: 'salary',
    formatters: [
      (salary, extra) => (
        <span onDoubleClick={() => alert(`salary is ${salary}`)}>
          {search.highlightCell(salary, extra)}
        </span>
      )
    ]
  }
}
```

## **`cell.props = <object>`**

You can set cell-specific props through `props`.

**Example:**

```javascript
{
  cell: {
    props: {
      style: {
        width: 100
      }
    }
  }
}
```
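Putting the pieces above together, here is a sketch of a complete column definition. The salary formatting logic is an invented example, and so is the assumption that the formatter chain is applied left to right, with each formatter receiving the previous one's output:

```javascript
// A hypothetical column definition combining `property`, `formatters`,
// and `props` from this page. No JSX is used so the formatter chain can
// be exercised directly in plain JavaScript.
const column = {
  property: 'salary',
  cell: {
    formatters: [
      // 1. turn the raw number into a currency string
      (salary) => `$${Number(salary).toFixed(2)}`,
      // 2. a second formatter sees the previous formatter's output
      (formatted) => formatted.replace('$', 'USD ')
    ],
    props: { style: { width: 100 } }
  }
};

// Resolve a cell value through the chain, left to right:
const render = (value, extra) =>
  column.cell.formatters.reduce((v, f) => f(v, extra), value);

console.log(render(1234.5)); // "USD 1234.50"
```

Note the division of labor: `transforms` produce prop objects for the cell element, while `formatters` produce the rendered content, so the two can be combined freely on the same cell.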
17.347826
142
0.570593
eng_Latn
0.950569
1c20bbc49aaebc59ce0d1aa9e9ed84ea44f3d26d
24
md
Markdown
README.md
selcux/terraform-azure-sample
d8d754baf870b56a665b150d47a338a9dbd9fa47
[ "MIT" ]
null
null
null
README.md
selcux/terraform-azure-sample
d8d754baf870b56a665b150d47a338a9dbd9fa47
[ "MIT" ]
null
null
null
README.md
selcux/terraform-azure-sample
d8d754baf870b56a665b150d47a338a9dbd9fa47
[ "MIT" ]
null
null
null
# terraform-azure-sample
24
24
0.833333
eng_Latn
0.341109
1c20c22773f8c526949067ace32a132a89f7e58d
944
md
Markdown
README.md
seancrowe/node-chiliservice
9c6d681d2faa4917e036db41f7213d190f7581c8
[ "MIT" ]
1
2019-11-19T08:33:21.000Z
2019-11-19T08:33:21.000Z
README.md
seancrowe/node-chiliservice
9c6d681d2faa4917e036db41f7213d190f7581c8
[ "MIT" ]
2
2020-07-17T13:11:25.000Z
2021-05-09T19:26:25.000Z
README.md
seancrowe/node-chiliservice
9c6d681d2faa4917e036db41f7213d190f7581c8
[ "MIT" ]
null
null
null
ChiliService
=========

A small library that lets you easily make web service calls to your CHILI server.

This module was developed for servers running CHILI Publisher >5.4. It will work with older CHILI installs, but some functions (like DocumentProcessServerSide) will cause an "Error! Function does not exist" error.

## Installation

`npm install chiliservice`

## Usage

    let cs = require('chiliservice');
    let connector = new cs.ChiliConnector("http://www.crowe.chili/5.5/main.asmx?wsdl");

    async function Main() {
        try {
            let apiKey = (await connector.GenerateApiKeyAsync("admin", "admin", "admin")).key;
            console.log(apiKey);
        } catch (error) {
            console.log(error);
        }
    }

    Main();

The output should be an API key, or an error if you provided the wrong info.

## Contributing

In lieu of a formal style guide, take care to maintain the existing coding style.
24.842105
142
0.663136
eng_Latn
0.975788
1c20eef93fb3a8c9d29ae44a6a80532a1df9097c
8,598
md
Markdown
articles/azure-functions/create-first-function-vs-code-python.md
gustavodelima/azure-docs.pt-br
07f02d5a6f3328b5b720c3091518e805d37590b8
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-functions/create-first-function-vs-code-python.md
gustavodelima/azure-docs.pt-br
07f02d5a6f3328b5b720c3091518e805d37590b8
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-functions/create-first-function-vs-code-python.md
gustavodelima/azure-docs.pt-br
07f02d5a6f3328b5b720c3091518e805d37590b8
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Criar uma função Python usando o Visual Studio Code – Azure Functions description: Saiba como criar uma função Python e publique o projeto local por meio da hospedagem sem servidor no Azure Functions usando a extensão do Azure Functions no Visual Studio Code. ms.topic: quickstart ms.date: 11/04/2020 ms.custom: devx-track-python ms.openlocfilehash: e022843f95e5d5b52a15eaab2d28b6b9eb923006 ms.sourcegitcommit: 740698a63c485390ebdd5e58bc41929ec0e4ed2d ms.translationtype: HT ms.contentlocale: pt-BR ms.lasthandoff: 02/03/2021 ms.locfileid: "99493559" --- # <a name="quickstart-create-a-function-in-azure-with-python-using-visual-studio-code"></a>Início rápido: criar uma função no Azure com o Python usando o Visual Studio Code [!INCLUDE [functions-language-selector-quickstart-vs-code](../../includes/functions-language-selector-quickstart-vs-code.md)] Neste artigo, você usará o Visual Studio Code para criar uma função do Python que responde a solicitações HTTP. Após testar o código localmente, implante-o no ambiente sem servidor do Azure Functions. A realização deste início rápido gera um pequeno custo de alguns centavos de dólar ou menos em sua conta do Azure. Também há uma [versão baseada na CLI](create-first-function-cli-python.md) deste artigo. ## <a name="configure-your-environment"></a>Configurar seu ambiente Antes de começar, verifique se você tem os seguintes requisitos implementados: + Uma conta do Azure com uma assinatura ativa. [Crie uma conta gratuitamente](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio). + [Azure Functions Core Tools](functions-run-local.md#install-the-azure-functions-core-tools), versão 3.x. + [Versões do Python com suporte do Azure Functions](supported-languages.md#languages-by-runtime-version) + [Visual Studio Code](https://code.visualstudio.com/) em uma das [plataformas compatíveis](https://code.visualstudio.com/docs/supporting/requirements#_platforms). 
+ A [extensão do Python](https://marketplace.visualstudio.com/items?itemName=ms-python.python) para Visual Studio Code. + A [Extensão Azure Functions](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions) para Visual Studio Code. ## <a name="create-your-local-project"></a><a name="create-an-azure-functions-project"></a>Criar seu projeto local Nesta seção, você usará o Visual Studio Code para criar um projeto local do Azure Functions em Python. Mais adiante neste artigo, você publicará o código de função no Azure. 1. Escolha o ícone do Azure na Barra de atividade e, em seguida, na área **Azure: Functions** e selecione o ícone **Criar projeto...** . ![Escolher Criar um projeto](./media/functions-create-first-function-vs-code/create-new-project.png) 1. Escolha um local de diretório para o workspace do projeto e escolha **Selecionar**. > [!NOTE] > Estas etapas foram projetadas para serem concluídas fora de um workspace. Nesse caso, não selecione uma pasta de projeto que faz parte de um workspace. 1. Forneça as seguintes informações nos prompts: + **Selecione uma linguagem de programação para o seu projeto de função**: Escolha `Python`. + **Selecione um alias do Python para criar um ambiente virtual**: Escolha a localização do seu interpretador do Python. Se a localização não for mostrada, digite o caminho completo no binário do Python. + **Selecione um modelo para a primeira função do projeto**: Escolha `HTTP trigger`. + **Forneça um nome de função**: Digite `HttpExample`. + **Nível de autorização**: Escolha `Anonymous`, que permite que qualquer pessoa chame seu ponto de extremidade de função. Para saber mais sobre o nível de autorização, confira [Chaves de autorização](functions-bindings-http-webhook-trigger.md#authorization-keys). + **Selecione como você gostaria de abrir seu projeto**: Escolha `Add to workspace`. 1. Usando essas informações, o Visual Studio Code gera um projeto do Azure Functions com um gatilho HTTP. 
Você pode exibir os arquivos de projeto locais no Explorer. Para saber mais sobre os arquivos criados, confira [Arquivos de projeto gerados](functions-develop-vs-code.md#generated-project-files). [!INCLUDE [functions-run-function-test-local-vs-code](../../includes/functions-run-function-test-local-vs-code.md)] Após verificar se a função foi executada corretamente no computador local, é hora de usar o Visual Studio Code para publicar o projeto diretamente no Azure. [!INCLUDE [functions-sign-in-vs-code](../../includes/functions-sign-in-vs-code.md)] ## <a name="publish-the-project-to-azure"></a>Publicar o projeto no Azure Nesta seção, você criará um aplicativo de funções e os recursos relacionados em sua assinatura do Azure e, em seguida, implantará seu código. > [!IMPORTANT] > Publicar em um aplicativo de funções existente substitui o conteúdo desse aplicativo no Azure. 1. Escolha o ícone do Azure na Barra de atividade e, em seguida, na área **Azure: Functions**, escolha o botão **Implantar no aplicativo de funções...** . ![Publicar seu projeto no Azure](../../includes/media/functions-publish-project-vscode/function-app-publish-project.png) 1. Forneça as seguintes informações nos prompts: + **Selecione a pasta**: escolha uma pasta do seu workspace ou navegue até uma que contenha seu aplicativo de funções. Você não verá isso se já tiver um aplicativo de funções válido aberto. + **Selecione a assinatura**: Escolha a assinatura a ser usada. Essa opção não será exibida caso você possua apenas uma assinatura. + **Selecione o aplicativo de funções no Azure**: Escolha `+ Create new Function App`. (Não escolha a opção `Advanced`, que não é abordada neste artigo.) + **Insira um nome exclusivo globalmente para o aplicativo de funções**: Digite um nome que seja válido em um caminho de URL. O nome que você digitar é validado para ter certeza de que ele é exclusivo no Azure Functions. + **Selecione um runtime**: Escolha a versão do Python em que você está executando localmente. 
You can use the `python --version` command to check your version.

    + **Select a location for new resources**: For better performance, choose a [region](https://azure.microsoft.com/regions/) near you.

    The extension shows the status of individual resources as they are created in Azure in the notification area.

    :::image type="content" source="../../includes/media/functions-publish-project-vscode/resource-notification.png" alt-text="Azure resource creation notification":::

1. When the creation is complete, the following Azure resources are created in your subscription, using names based on your function app name:

    [!INCLUDE [functions-vs-code-created-resources](../../includes/functions-vs-code-created-resources.md)]

    A notification is displayed after your function app is created and the deployment package is applied.

    [!INCLUDE [functions-vs-code-create-tip](../../includes/functions-vs-code-create-tip.md)]

4. Select **View Output** in this notification to view the creation and deployment results, including the Azure resources that you created. If you miss the notification, select the bell icon in the lower right corner to see it again.

    ![Create complete notification](./media/functions-create-first-function-vs-code/function-create-notifications.png)

[!INCLUDE [functions-vs-code-run-remote](../../includes/functions-vs-code-run-remote.md)]

[!INCLUDE [functions-cleanup-resources-vs-code.md](../../includes/functions-cleanup-resources-vs-code.md)]

## <a name="next-steps"></a>Next steps

You used [Visual Studio Code](functions-develop-vs-code.md?tabs=python) to create a function app with a simple HTTP-triggered function. In the next article, you expand that function by connecting to Azure Storage. To learn more about connecting to other Azure services, see [Add bindings to an existing function in Azure Functions](add-bindings-existing-function.md?tabs=python).
> [!div class="nextstepaction"]
> [Connect to an Azure Storage queue](functions-add-output-binding-storage-queue-vs-code.md?pivots=programming-language-python)

[Having issues? Let us know.](https://aka.ms/python-functions-qs-survey)

[Azure Functions Core Tools]: functions-run-local.md
[Azure Functions extension for Visual Studio Code]: https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions
63.688889
430
0.770644
por_Latn
0.998965
1c23707331d3ce4eafd31ab4a6a8efe25e2e8126
1,223
md
Markdown
site/content/amis_amants/_index.md
yutaro-komaji/kubiswara-z
627d2f5afc3a3a2ef13349e9e87d916ec937889c
[ "MIT" ]
null
null
null
site/content/amis_amants/_index.md
yutaro-komaji/kubiswara-z
627d2f5afc3a3a2ef13349e9e87d916ec937889c
[ "MIT" ]
4
2021-03-10T09:32:16.000Z
2022-02-18T21:49:02.000Z
site/content/amis_amants/_index.md
yutaro-komaji/kubiswara-z
627d2f5afc3a3a2ef13349e9e87d916ec937889c
[ "MIT" ]
null
null
null
---
title: "Light My Cigarette"
anchor:
  - anchorItem: "introduction"
    anchorItemtxt: INTRODUCTION
  - anchorItem: "live"
    anchorItemtxt: LIVE
anchor02:
  - anchorItem: "music"
    anchorItemtxt: MUSIC
  - anchorItem: "movie"
    anchorItemtxt: MOVIE
pageUrl: /lmc/
introduction:
  - heading: "INTRODUCTION"
    title: "Light My Cigarette"
    btnText: "Twitter"
    btnUrl: "https://twitter.com/LMC_Nagoya"
    text01: >
      GENKI(Gt./ Vo.) , TumTum(Ba./ Scream) , AKAIKE(Dr./ Cho.)
    text02: >
      Formed in 2018. A punk-core band from Nagoya.
    text03: >
      The lineup pairs Gt./Vo. GENKI, who builds the band's bright ("yang") sound around melodic punk, with Ba./Scream TumTum, who builds its dark ("yin") sound around hardcore, plus Dr./Cho. AKAIKE, whose irresistible grin whips the crowd into a frenzy.
    text04: >
      The wide-ranging sound they weave, embodying this duality of yin and yang, hints at the breadth of their musical backgrounds.
    imageUrl: "/img/lmc/img-lmc-08.jpg"
    secId: "introduction"
    tglActive: active
movie:
  - title: "Introspection"
    iframeUrl: "6zbnl3GjZTg"
  - title: "P.I.G.C."
    iframeUrl: "xb3MrID_9f8"
live:
  - title: coming soon...
    description: >
      Formed in 2018. A punk-core band from Nagoya: a three-piece made up of GENKI (Gt./ Vo.), TumTum (Ba./ Scream), and AKAIKE (Dr./ Cho.).
releaseUrl: https://eggs.mu/artist/LMC_Nagoya/
twitterUrl: https://twitter.com/LMC_Nagoya
---
23.075472
129
0.666394
yue_Hant
0.539425
1c23a599e81e9733bb1e3f4bc07419417798a558
1,446
md
Markdown
_posts/2011-08-31-how-to-find-your-external-ip-address-from-the-command-line.md
benhamilton/benhamilton.github.io
798e7bbbed048b8bb03ec72864f5249c3bb9dcf4
[ "MIT" ]
null
null
null
_posts/2011-08-31-how-to-find-your-external-ip-address-from-the-command-line.md
benhamilton/benhamilton.github.io
798e7bbbed048b8bb03ec72864f5249c3bb9dcf4
[ "MIT" ]
null
null
null
_posts/2011-08-31-how-to-find-your-external-ip-address-from-the-command-line.md
benhamilton/benhamilton.github.io
798e7bbbed048b8bb03ec72864f5249c3bb9dcf4
[ "MIT" ]
null
null
null
---
layout: post
title: How to find your external ip address from the command line
permalink: /microsoft/how-to-find-your-external-ip-address-from-the-command-line
post_id: 422
categories:
- Command
- How to
- IP
- Microsoft
- Script
---

I often need to know what the external IP address for a client is. Thus I've cobbled together the following script. Simply copy the code below into externalip.cmd and when run from the command prompt it will do two things for you:

- the script will display the external IP address
- the script will set the environment variable ExternalIP to be whatever that IP is

<pre><code>
@echo off
:: Find out what the External IP address is

:: Create the .vbs file first
Echo Option Explicit >externalipaddress.vbs
Echo Dim http : Set http = CreateObject( "MSXML2.ServerXmlHttp" ) >>externalipaddress.vbs
Echo http.Open "GET", "http://whatismyip.org", False >>externalipaddress.vbs
Echo http.Send >>externalipaddress.vbs
Echo Wscript.Echo http.responseText >>externalipaddress.vbs
Echo Set http = Nothing >>externalipaddress.vbs

:: run the resulting .vbs script and set the environment variable
for /f "skip=2" %%G IN ('cscript externalipaddress.vbs') DO (Set ExternalIP=%%G)

:: Display the environment variable
Echo External IP is %ExternalIP%

:: tidy up and remove the temp file
del externalipaddress.vbs /q
</code></pre>

Let me know if you find this useful, or if you can improve on it I'd love to hear from you.
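For what it's worth, the same trick can be sketched without the temporary .vbs file. Here is a hypothetical Python equivalent (not part of the original post; the service URL is an assumption, and `looks_like_ipv4` is just a sanity check on whatever the service returns):

```python
import re
import urllib.request

# Dotted-quad shape check; range check is done separately below.
IPV4_RE = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")


def looks_like_ipv4(text: str) -> bool:
    """Cheap sanity check that a string is a valid dotted-quad IPv4 address."""
    if not IPV4_RE.match(text):
        return False
    return all(0 <= int(part) <= 255 for part in text.split("."))


def external_ip(url: str = "https://api.ipify.org") -> str:
    """Fetch the external IP from a 'what is my IP' service (network required)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        ip = resp.read().decode("ascii").strip()
    if not looks_like_ipv4(ip):
        raise ValueError("unexpected response: {!r}".format(ip))
    return ip
```

`external_ip()` performs a network request, so call it only when you're online; the validation helper can be exercised offline.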
39.081081
230
0.766252
eng_Latn
0.964936
1c24bea4097a7f6a23a4cca1f2b09b767e840c8d
112
md
Markdown
README.md
fmaj7/oreo-sample
0e347bf140b25e9fd0736928c5d2937b13642e6c
[ "MIT" ]
null
null
null
README.md
fmaj7/oreo-sample
0e347bf140b25e9fd0736928c5d2937b13642e6c
[ "MIT" ]
null
null
null
README.md
fmaj7/oreo-sample
0e347bf140b25e9fd0736928c5d2937b13642e6c
[ "MIT" ]
null
null
null
oreo-sample =========== https://codeship.com/projects/0376e580-77e8-0132-6e6e-3ee8152094db/status?branch=master
28
87
0.75
yue_Hant
0.121365
1c24e5780411da0dbd7f1b32729ab59954e77e2b
908
md
Markdown
docs/error-messages/compiler-errors-2/compiler-error-c3154.md
littlenine/cpp-docs.zh-tw
811b8d2137808d9d0598fbebec6343fdfa88c79e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/error-messages/compiler-errors-2/compiler-error-c3154.md
littlenine/cpp-docs.zh-tw
811b8d2137808d9d0598fbebec6343fdfa88c79e
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/error-messages/compiler-errors-2/compiler-error-c3154.md
littlenine/cpp-docs.zh-tw
811b8d2137808d9d0598fbebec6343fdfa88c79e
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Compiler Error C3154
ms.date: 11/04/2016
f1_keywords:
- C3154
helpviewer_keywords:
- C3154
ms.assetid: 78005c74-eaaf-4ac2-88ae-6c25d01a302a
ms.openlocfilehash: 9f7af4e19fab5f5a0539e9fc3bf9dbeffb5c6fbf
ms.sourcegitcommit: c6f8e6c2daec40ff4effd8ca99a7014a3b41ef33
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 04/24/2019
ms.locfileid: "64344648"
---
# <a name="compiler-error-c3154"></a>Compiler Error C3154

',' expected before the ellipsis.

A parameter array that is not separated from the preceding parameters by a comma is not supported in a function. The declaration of the variable-argument function is malformed. For more information, see [Variable argument lists (...) (C++/CLI)](../../extensions/variable-argument-lists-dot-dot-dot-cpp-cli.md).

## <a name="example"></a>Example

The following sample generates C3154.

```
// C3154.cpp
// compile with: /clr
ref struct R {
   void Func(int ... array<int> ^);   // C3154
   void Func2(int i, ... array<int> ^){}   // OK
   void Func3(array<int> ^){}   // OK
   void Func4(... array<int> ^){}   // OK
};

int main() {
   R ^ r = gcnew R;
   r->Func4(1,2,3);
}
```
21.619048
99
0.672907
yue_Hant
0.202719
1c24f23458b38516f892eaae81926a962e46b0bb
136
md
Markdown
docs/cursos/procesos/uncordobax/mcm002/_sidebar.md
SidVal/conocimientos
3e00e5574f0a45446b8169cef6417a8d72e1a891
[ "MIT" ]
null
null
null
docs/cursos/procesos/uncordobax/mcm002/_sidebar.md
SidVal/conocimientos
3e00e5574f0a45446b8169cef6417a8d72e1a891
[ "MIT" ]
null
null
null
docs/cursos/procesos/uncordobax/mcm002/_sidebar.md
SidVal/conocimientos
3e00e5574f0a45446b8169cef6417a8d72e1a891
[ "MIT" ]
null
null
null
* <a href="javascript:history.back()">Back</a>
* [Classes](/cursos/marketing/uncordobax/)
* [Content](/c/)
* [☆](/medium.md#estrella)
27.2
47
0.654412
spa_Latn
0.10065
1c254f60327503557773124e3dce115cc78628f3
363
md
Markdown
packages/morphic-fastcheck-interpreters/docs/modules/model/object.ts.md
Brettm12345/morphic-ts
7daf85ec739b13bf8a149888c3ccf79654637449
[ "MIT" ]
null
null
null
packages/morphic-fastcheck-interpreters/docs/modules/model/object.ts.md
Brettm12345/morphic-ts
7daf85ec739b13bf8a149888c3ccf79654637449
[ "MIT" ]
null
null
null
packages/morphic-fastcheck-interpreters/docs/modules/model/object.ts.md
Brettm12345/morphic-ts
7daf85ec739b13bf8a149888c3ccf79654637449
[ "MIT" ]
null
null
null
---
title: model/object.ts
nav_order: 9
parent: Modules
---

---

<h2 class="text-delta">Table of contents</h2>

- [fastCheckObjectInterpreter (constant)](#fastcheckobjectinterpreter-constant)

---

# fastCheckObjectInterpreter (constant)

**Signature**

```ts
export const fastCheckObjectInterpreter: ModelAlgebraObject1<FastCheckURI> = ...
```

Added in v0.0.1
15.125
80
0.732782
eng_Latn
0.211783
1c25b76d18068e6017cfe5dff1056eeb74d69690
10,300
md
Markdown
socrata/rv3g-ypg7.md
axibase/open-data-catalog
18210b49b6e2c7ef05d316b6699d2f0778fa565f
[ "Apache-2.0" ]
7
2017-05-02T16:08:17.000Z
2021-05-27T09:59:46.000Z
socrata/rv3g-ypg7.md
axibase/open-data-catalog
18210b49b6e2c7ef05d316b6699d2f0778fa565f
[ "Apache-2.0" ]
5
2017-11-27T15:40:39.000Z
2017-12-05T14:34:14.000Z
socrata/rv3g-ypg7.md
axibase/open-data-catalog
18210b49b6e2c7ef05d316b6699d2f0778fa565f
[ "Apache-2.0" ]
3
2017-03-03T14:48:48.000Z
2019-05-23T12:57:42.000Z
# Calls for Service 2012 ## Dataset | Name | Value | | :--- | :---- | | Catalog | [Link](https://catalog.data.gov/dataset/calls-for-service-2012) | | Metadata | [Link](https://data.nola.gov/api/views/rv3g-ypg7) | | Data: JSON | [100 Rows](https://data.nola.gov/api/views/rv3g-ypg7/rows.json?max_rows=100) | | Data: CSV | [100 Rows](https://data.nola.gov/api/views/rv3g-ypg7/rows.csv?max_rows=100) | | Host | data.nola.gov | | Id | rv3g-ypg7 | | Name | Calls for Service 2012 | | Attribution | Orleans Parish Communications District | | Category | Public Safety and Preparedness | | Tags | crime, police, nopd | | Created | 2013-01-03T23:49:27Z | | Publication Date | 2016-02-11T22:53:00Z | ## Description This dataset reflects incidents that have been reported to the New Orleans Police Department in 2012. Data is provided by Orleans Parish Communication District (OPCD), the administrative office of 9-1-1 for the City of New Orleans. In the OPCD system, NOPD may reclassify or change the signal type for up to 36 hours after the incident is marked up. For information about an incident after this time period, citizens may request police reports from the NOPD Public Records Division. In order to protect the privacy of victims, addresses are shown at the block level and the call types cruelty to juveniles, juvenile attachment and missing juvenile have been removed in accordance with the Louisiana Public Records Act, L.R.S. 44:1. Map coordinates (X,Y) have been removed for the following call types: Aggravated Rape, Aggravated Rape - MA, Crime Against Nature, Mental Patient, Oral Sexual Battery, Prostitution, Sexual Battery, Simple Rape, Simple Rape - Male V, and Soliciting for Prost.Disclaimer: These incidents may be based upon preliminary information supplied to the Police Department by the reporting parties that have not been verified. 
The preliminary crime classifications may be changed at a later date based upon additional investigation and there is always the possibility of mechanical or human error. Therefore, the New Orleans Police Department does not guarantee (either expressed or implied) the accuracy, completeness, timeliness, or correct sequencing of the information and the information should not be used for comparison purposes over time. The New Orleans Police Department will not be responsible for any error or omission, or for the use of, or the results obtained from the use of this information. All data visualizations on maps should be considered approximate and attempts to derive specific addresses are strictly prohibited. The New Orleans Police Department is not responsible for the content of any off-site pages that are referenced by or that reference this web page other than an official City of New Orleans or New Orleans Police Department web page. The user specifically acknowledges that the New Orleans Police Department is not responsible for any defamatory, offensive, misleading, or illegal conduct of other users, links, or third parties and that the risk of injury from the foregoing rests entirely with the user. Any use of the information for commercial purposes is strictly prohibited. The unauthorized use of the words "New Orleans Police Department," "NOPD," or any colorable imitation of these words or the unauthorized use of the New Orleans Police Department logo is unlawful. This web page does not, in any way, authorize such use. 
## Columns ```ls | Included | Schema Type | Field Name | Name | Data Type | Render Type | | ======== | =========== | =============== | =============== | ============= | ============= | | Yes | series tag | nopd_item | NOPD_Item | text | text | | Yes | series tag | type | Type_ | text | text | | Yes | series tag | typetext | TypeText | text | text | | Yes | series tag | priority | Priority | text | text | | No | | mapx | MapX | number | text | | No | | mapy | MapY | number | text | | Yes | time | timecreate | TimeCreate | calendar_date | calendar_date | | No | | timedispatch | TimeDispatch | calendar_date | calendar_date | | No | | timearrive | TimeArrive | calendar_date | calendar_date | | No | | timeclosed | TimeClosed | calendar_date | calendar_date | | Yes | series tag | disposition | Disposition | text | text | | Yes | series tag | dispositiontext | DispositionText | text | text | | No | | block_address | BLOCK_ADDRESS | text | text | | Yes | series tag | zip | Zip | text | text | | Yes | series tag | policedistrict | PoliceDistrict | text | number | ``` ## Time Field ```ls Value = timecreate Format & Zone = yyyy-MM-dd'T'HH:mm:ss ``` ## Series Fields ```ls Excluded Fields = mapx,mapy,timedispatch,timearrive,timeclosed,block_address ``` ## Data Commands ```ls series e:rv3g-ypg7 d:2012-01-01T00:00:11.000Z t:zip=70116 t:nopd_item=A0000112 t:priority=2C t:dispositiontext="NECESSARY ACTION TAKEN" t:policedistrict=8 t:typetext="BURGLAR ALARM, SILEN" t:type=62A t:disposition=NAT m:row_number.rv3g-ypg7=1 series e:rv3g-ypg7 d:2012-01-01T00:00:36.000Z t:zip=70129 t:nopd_item=A0000412 t:priority=2B t:dispositiontext=UNFOUNDED t:policedistrict=7 t:typetext="DISCHARGING FIREARMS" t:type=94 t:disposition=UNF m:row_number.rv3g-ypg7=2 series e:rv3g-ypg7 d:2012-01-01T00:01:13.000Z t:zip=70122 t:nopd_item=A0000212 t:priority=1C t:dispositiontext="NECESSARY ACTION TAKEN" t:policedistrict=3 t:typetext="DISTURBANCE (OTHER)" t:type=103 t:disposition=NAT m:row_number.rv3g-ypg7=3 ``` 
## Meta Commands ```ls metric m:row_number.rv3g-ypg7 p:long l:"Row Number" entity e:rv3g-ypg7 l:"Calls for Service 2012" t:attribution="Orleans Parish Communications District" t:url=https://data.nola.gov/api/views/rv3g-ypg7 property e:rv3g-ypg7 t:meta.view v:id=rv3g-ypg7 v:category="Public Safety and Preparedness" v:averageRating=0 v:name="Calls for Service 2012" v:attribution="Orleans Parish Communications District" property e:rv3g-ypg7 t:meta.view.license v:name="Creative Commons 1.0 Universal (Public Domain Dedication)" v:termsLink=http://creativecommons.org/publicdomain/zero/1.0/legalcode v:logoUrl=images/licenses/ccZero.png property e:rv3g-ypg7 t:meta.view.owner v:id=etmh-sfwk v:screenName="Greg Hymel" v:displayName="Greg Hymel" property e:rv3g-ypg7 t:meta.view.tableauthor v:id=etmh-sfwk v:screenName="Greg Hymel" v:roleName=publisher v:displayName="Greg Hymel" property e:rv3g-ypg7 t:meta.view.metadata.custom_fields.common_core v:Contact_Email=data@nola.gov ``` ## Top Records ```ls | nopd_item | type | typetext | priority | mapx | mapy | timecreate | timedispatch | timearrive | timeclosed | disposition | dispositiontext | block_address | zip | policedistrict | | ========= | ==== | ==================== | ======== | ================ | =============== | =================== | =================== | =================== | =================== | =========== | ====================== | ============================ | ===== | ============== | | A0000112 | 62A | BURGLAR ALARM, SILEN | 2C | 3683627.00000000 | 532625.00000000 | 2012-01-01T00:00:11 | 2012-01-01T00:02:46 | | 2012-01-01T00:33:32 | NAT | NECESSARY ACTION TAKEN | 009XX Decatur St | 70116 | 8 | | A0000412 | 94 | DISCHARGING FIREARMS | 2B | 3732996.00000000 | 562418.00000000 | 2012-01-01T00:00:36 | 2012-01-01T00:03:22 | 2012-01-01T00:16:59 | 2012-01-01T00:30:09 | UNF | UNFOUNDED | 147XX Chef Menteur Hwy | 70129 | 7 | | A0000212 | 103 | DISTURBANCE (OTHER) | 1C | 3687688.00000000 | 548824.00000000 | 
2012-01-01T00:01:13 | 2012-01-01T00:01:19 | 2012-01-01T00:01:44 | 2012-01-01T00:19:52 | NAT | NECESSARY ACTION TAKEN | 038XX Gentilly Blvd | 70122 | 3 | | A0000712 | 21 | COMPLAINT OTHER | 1H | 3670776.00000000 | 521242.00000000 | 2012-01-01T00:01:18 | 2012-01-01T00:13:35 | | 2012-01-01T00:20:13 | NAT | NECESSARY ACTION TAKEN | Carondelet St & Napoleon Ave | 70115 | 2 | | A0000512 | 62A | BURGLAR ALARM, SILEN | 2C | 3665739.00000000 | 549621.00000000 | 2012-01-01T00:01:20 | 2012-01-01T00:02:52 | 2012-01-01T00:09:11 | 2012-01-01T01:55:13 | NAT | NECESSARY ACTION TAKEN | 002XX W Harrison Ave | 70124 | 3 | | A0000912 | 94 | DISCHARGING FIREARMS | 2B | 3665391.00000000 | 536341.00000000 | 2012-01-01T00:01:48 | 2012-01-01T00:04:56 | 2012-01-01T00:10:07 | 2012-01-01T00:11:51 | NAT | NECESSARY ACTION TAKEN | Edinburgh St & Gen Ogden St | 70118 | 2 | | A0000612 | 94F | FIREWORKS | 1H | 3675716.00000000 | 524537.00000000 | 2012-01-01T00:01:53 | 2012-01-01T00:13:59 | | 2012-01-01T00:14:35 | NAT | NECESSARY ACTION TAKEN | 2nd St & S Saratoga St | 70113 | 6 | | A0000812 | 94 | DISCHARGING FIREARMS | 2B | 3669605.00000000 | 530044.00000000 | 2012-01-01T00:01:57 | 2012-01-01T00:04:22 | 2012-01-01T00:07:00 | 2012-01-01T00:08:06 | NAT | NECESSARY ACTION TAKEN | 044XX Walmsley Ave | 70125 | 2 | | A0001012 | 94F | FIREWORKS | 1H | 3698138.00000000 | 552955.00000000 | 2012-01-01T00:02:30 | 2012-01-01T00:18:28 | 2012-01-01T00:30:35 | 2012-01-01T00:39:17 | NAT | NECESSARY ACTION TAKEN | 045XX Lynhuber Dr | 70126 | 7 | | A0001212 | 107 | SUSPICIOUS PERSON | 2B | 3705859.00000000 | 561144.00000000 | 2012-01-01T00:02:44 | 2012-01-01T00:35:42 | | 2012-01-01T00:41:18 | GOA | GONE ON ARRIVAL | 072XX Yorktown Dr | 70127 | 7 | ```
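The time fields above can be used to sanity-check the records, for example by computing how long each call waited between `TimeCreate`, `TimeDispatch`, and `TimeArrive`. A minimal sketch using the dataset's timestamp format and sample item A0000412 from the rows above (the helper itself is ours, not part of the dataset):

```python
from datetime import datetime

# Matches the dataset's yyyy-MM-dd'T'HH:mm:ss time format.
FMT = "%Y-%m-%dT%H:%M:%S"


def latency_seconds(start: str, end: str) -> float:
    """Seconds elapsed between two timestamps from the call records."""
    return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds()


# Item A0000412 from the sample rows above:
dispatch_delay = latency_seconds("2012-01-01T00:00:36", "2012-01-01T00:03:22")  # create -> dispatch
travel_time = latency_seconds("2012-01-01T00:03:22", "2012-01-01T00:16:59")     # dispatch -> arrive
```

For that call, the dispatch delay works out to 166 seconds and the travel time to 817 seconds; note that some rows (for example, A0000112) have an empty `TimeArrive`, so a real script should skip missing fields.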
100.980392
2,694
0.633204
eng_Latn
0.614963
1c26ce00c2615a80e5d9dc3f2f99c8f686791be9
2,559
md
Markdown
README.md
chaowang1994/ali-tianchi
27575ce01efcb4d0e816fa681cb8b996d941f17b
[ "BSD-2-Clause" ]
1
2018-10-10T14:14:56.000Z
2018-10-10T14:14:56.000Z
README.md
chaowang1994/ali-tianchi
27575ce01efcb4d0e816fa681cb8b996d941f17b
[ "BSD-2-Clause" ]
null
null
null
README.md
chaowang1994/ali-tianchi
27575ce01efcb4d0e816fa681cb8b996d941f17b
[ "BSD-2-Clause" ]
null
null
null
# Caffe

## 1. Added common [data augmentation options](https://github.com/twtygqyy/caffe-augmentation)

New on 2018.10.07:

- min_side - resize and crop preserving aspect ratio, default 0 (disabled);
- max_rotation_angle - max angle for an image rotation, default 0;
- contrast_brightness_adjustment - enable/disable contrast adjustment, default false;
- smooth_filtering - enable/disable smoothing filter, default false;
- min_contrast - min contrast multiplier (min alpha), default 0.8;
- max_contrast - max contrast multiplier (max alpha), default 1.2;
- max_brightness_shift - max brightness shift in positive and negative directions (beta), default 5;
- max_smooth - max smooth multiplier, default 6;
- max_color_shift - max color shift along RGB axes;
- apply_probability - how often every transformation should be applied, default 0.5;
- debug_params - enable/disable printing transformation parameters, default false;

Usage: specify the parameters in the network's prototxt:

```
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mirror: true
    contrast_brightness_adjustment: true
    smooth_filtering: true
    min_side_min: 256
    min_side_max: 480
    crop_size: 224
    mean_file: "imagenet_mean.binaryproto"
    min_contrast: 0.8
    max_contrast: 1.2
    max_smooth: 6
    apply_probability: 0.5
    max_color_shift: 20
    debug_params: false
  }
  image_data_param {
    source: "train_list.txt"
    batch_size: 64
  }
}
```

During the testing phase:

```
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    mirror: false
    min_side: 256
    crop_size: 224
    mean_file: "imagenet_mean.binaryproto"
  }
  image_data_param {
    source: "test_list.txt"
    batch_size: 32
  }
}
```

### Keep entering Tianchi competitions to prepare for the spring recruitment season

#### 1. [2018 Guangdong Industrial Intelligent Manufacturing Big Data Innovation Competition, Intelligent Algorithm Track](https://tianchi.aliyun.com/competition/introduction.htm?spm=5176.100150.711.5.322c2784ctjRFB&raceId=231682)

### 2.

## License and Citation

Caffe is released under the [BSD 2-Clause license](https://github.com/BVLC/caffe/blob/master/LICENSE).
The BAIR/BVLC reference models are released for unrestricted use. Please cite Caffe in your publications if it helps your research: @article{jia2014caffe, Author = {Jia, Yangqing and Shelhamer, Evan and Donahue, Jeff and Karayev, Sergey and Long, Jonathan and Girshick, Ross and Guadarrama, Sergio and Darrell, Trevor}, Journal = {arXiv preprint arXiv:1408.5093}, Title = {Caffe: Convolutional Architecture for Fast Feature Embedding}, Year = {2014} }
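The contrast and brightness parameters listed above describe the familiar `out = alpha * pixel + beta` adjustment, applied with probability `apply_probability`. A small pure-Python sketch of that transform follows; it is our illustration of the behavior, not the layer's actual C++ implementation:

```python
import random


def adjust_contrast_brightness(pixels, alpha, beta):
    """Apply out = clamp(alpha * p + beta, 0, 255) to a flat list of 8-bit pixels."""
    return [min(255, max(0, int(round(alpha * p + beta)))) for p in pixels]


def random_adjustment(pixels, min_contrast=0.8, max_contrast=1.2,
                      max_brightness_shift=5, apply_probability=0.5, rng=random):
    """Mirror the layer's defaults: apply the transform with a given probability."""
    if rng.random() >= apply_probability:
        return list(pixels)  # skip the augmentation this time
    alpha = rng.uniform(min_contrast, max_contrast)
    beta = rng.uniform(-max_brightness_shift, max_brightness_shift)
    return adjust_contrast_brightness(pixels, alpha, beta)
```

With the defaults above, roughly half the images pass through unchanged and the rest get a random alpha in [0.8, 1.2] and a random beta in [-5, 5].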
19.534351
170
0.722939
eng_Latn
0.672663
1c274f54ed1445a4ad8d82213461c441bb5cad49
5,637
md
Markdown
README.md
nageshwaran08/OCS-GLPI-Installation-on-CentOS-7
cabd8c987e272b41b069194e042c8bef2f841bb7
[ "PHP-3.01", "PHP-3.0", "IJG", "Zend-2.0" ]
null
null
null
README.md
nageshwaran08/OCS-GLPI-Installation-on-CentOS-7
cabd8c987e272b41b069194e042c8bef2f841bb7
[ "PHP-3.01", "PHP-3.0", "IJG", "Zend-2.0" ]
null
null
null
README.md
nageshwaran08/OCS-GLPI-Installation-on-CentOS-7
cabd8c987e272b41b069194e042c8bef2f841bb7
[ "PHP-3.01", "PHP-3.0", "IJG", "Zend-2.0" ]
null
null
null
## OCS Installation

1. Install this on a CentOS 7 minimal installation
1. Install wget
```
yum install wget -y
```
1. Download and run [setup.sh](https://github.com/nageshwaran08/OCS-GLPI-Installation-on-CentOS-7/blob/master/setup.sh) on the server
```
wget https://raw.githubusercontent.com/nageshwaran08/OCS-GLPI-Installation-on-CentOS-7/master/setup.sh
```
1. Launch the MySQL Secure Installation script
```
/usr/bin/mysql_secure_installation
```
1. Install additional packages
```
cpan XML::Entities
cpan Apache::DBI
cpan ModPerl::MM
cpan Apache2::SOAP
cpan Mojolicious::Lite
cpan Switch
cpan Plack::Handler

(or)

yum install "perl(XML::Entities)"
yum install "perl(Apache::DBI)"
yum install "perl(ModPerl::MM)"
yum install "perl(Apache2::SOAP)"
yum install "perl(Mojolicious::Lite)"
yum install "perl(Switch)"
yum install "perl(Plack::Handler)"
```
1. Open ports 80 (Apache), 3306 (MySQL), and 25 (SMTP) in the firewall
```
firewall-cmd --zone=public --add-port=80/tcp --permanent
firewall-cmd --zone=public --add-port=3306/tcp --permanent
firewall-cmd --zone=public --add-port=25/tcp --permanent
firewall-cmd --reload
```
1. Install and activate the REMI and EPEL RPM repositories
```
wget https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm && wget http://rpms.famillecollet.com/enterprise/remi-release-7.rpm && rpm -Uvh *.rpm
```
1. Update PHP
```
yum update php* --enablerepo=remi -y
```
1. Verify the new PHP version
```
php -v
PHP 5.4.45 (cli) (built: Mar 1 2018 09:57:11)
Copyright (c) 1997-2014 The PHP Group
Zend Engine v2.4.0, Copyright (c) 1998-2014 Zend Technologies
```
1. Download OCS Inventory and run setup
```
OCS_VER=2.4.1 && cd /var/www/html && wget https://github.com/OCSInventory-NG/OCSInventory-ocsreports/releases/download/2.4.1/OCSNG_UNIX_SERVER_$OCS_VER.tar.gz && tar -xzvf OCSNG_UNIX_SERVER_$OCS_VER.tar.gz && cd OCSNG_UNIX_SERVER_$OCS_VER && ./setup.sh
```
1. Increase post_max_size and upload_max_filesize in /etc/php.ini
```
post_max_size = 200M
upload_max_filesize = 200M
```
1. Restart Apache
```
service httpd restart
```
1. Add write permission to the directory
```
chmod +w /var/lib/ocsinventory-reports
```
1. Run mysql_upgrade
```
mysql_upgrade -u root -p[password]
```
1. Create the OCS user in MySQL and assign privileges for the OCSWEB database
```
mysql -u root -p [password]
GRANT ALL PRIVILEGES ON `ocsweb` .* TO 'ocs'@'localhost' IDENTIFIED BY 'ocs' WITH GRANT OPTION;
FLUSH PRIVILEGES;
```
1. Perform the initial OCS config, then log in to OCS
```
URL: [IP Address]/ocsreports
Login: admin
Password: admin
MySQL login: ocs
MySQL password: [your password when setting up MySQL]
Name of Database: ocsweb
MySQL Hostname: localhost
```
1. Change the 'Trace Deleted' config to 'ON' in OCS. Config > Config > Server

![img](http://i.imgur.com/GD8p2TG.png)

![img](http://i.imgur.com/qtG0R5S.jpg)

1. Remove the install script
```
rm -f /usr/share/ocsinventory-reports/ocsreports/install.php
```

## GLPI Installation

1. Download GLPI
```
cd /var/www/html
wget https://github.com/glpi-project/glpi/releases/download/9.1.7.1/glpi-9.1.7.1.tgz
tar -xzvf glpi-9.1.7.1.tgz
```
1. Update permissions
```
chown apache:apache -R glpi/
chmod 777 glpi/files/ glpi/config/
```
1. Edit the /etc/httpd/conf/httpd.conf file. Change all occurrences of AllowOverride None to AllowOverride All
```
...
<Directory />
    Options FollowSymLinks
    AllowOverride All
</Directory>
...
AllowOverride All
...
```
1. Restart Apache
```
service httpd restart
```
1. Set up the database for GLPI use
```
mysql -u root -p [rootsecret]
CREATE USER 'glpi'@'%' IDENTIFIED BY 'glpisecret';
GRANT USAGE ON *.* TO 'glpi'@'%' IDENTIFIED BY 'glpisecret';
CREATE DATABASE IF NOT EXISTS `glpi`;
GRANT ALL PRIVILEGES ON `glpi`.* TO `glpi`@'%';
CREATE USER 'sync'@'%' IDENTIFIED BY 'syncsecret';
GRANT USAGE ON *.* TO 'sync'@'%' IDENTIFIED BY 'syncsecret';
GRANT SELECT ON `ocsweb`.* TO `sync`@'%';
GRANT DELETE ON `ocsweb`.`deleted_equiv` TO `sync`@`%`;
GRANT UPDATE (`CHECKSUM`) ON `ocsweb`.`hardware` TO `sync`@`%`;
FLUSH PRIVILEGES;
exit
```
1. Install the GLPI cronjob
```
crontab -u apache -e
* * * * * /usr/bin/php /var/www/html/glpi/front/cron.php &>/dev/null
```
1. Log in to GLPI
```
URL: [IP Address]/glpi
Login: glpi
Password: glpi
MySQL Server: 127.0.0.1
User: glpi
Password: glpisecret
```

## GLPI and OCS Integration

1. Download the OCS plugin, extract it, and change its ownership.
```
cd /var/www/html/glpi/plugins
wget https://github.com/pluginsGLPI/ocsinventoryng/releases/download/1.4.3/glpi-ocsinventoryng-1.4.3.tar.gz
tar -xzvf glpi-ocsinventoryng-1.4.3.tar.gz
chown -R apache:apache ocsinventoryng/
```
1. Go to the GLPI > Setup > Plugin page. Click Install, then Enable.
1. Connect GLPI to the OCS DB using the 'sync' account.
1. Add the OCS Server details

![img](http://imgur.com/5YQQrKo.png)

1. Go to Datas to import and select YES from the dropdown.

![img](https://image.prntscr.com/image/f9ZGzoGAQEqjcStnB_2Zvw.png)

## OCS Agent Installation

http://wiki.ocsinventory-ng.org/index.php?title=Documentation:UnixAgent
24.723684
165
0.644669
yue_Hant
0.422194
1c288fe0540925c2ed3c64e539f284a06266151a
2,078
md
Markdown
wdk-ddi-src/content/wditypes/ns-wditypes-_wdi_type_pmk_name.md
AlexGuteniev/windows-driver-docs-ddi
99b84cea8977c8b190d39e65ccf26f1885ba3189
[ "CC-BY-4.0", "MIT" ]
176
2018-01-12T23:42:01.000Z
2022-03-30T18:23:27.000Z
wdk-ddi-src/content/wditypes/ns-wditypes-_wdi_type_pmk_name.md
AlexGuteniev/windows-driver-docs-ddi
99b84cea8977c8b190d39e65ccf26f1885ba3189
[ "CC-BY-4.0", "MIT" ]
1,093
2018-01-23T07:33:03.000Z
2022-03-30T20:15:21.000Z
wdk-ddi-src/content/wditypes/ns-wditypes-_wdi_type_pmk_name.md
AlexGuteniev/windows-driver-docs-ddi
99b84cea8977c8b190d39e65ccf26f1885ba3189
[ "CC-BY-4.0", "MIT" ]
251
2018-01-21T07:35:50.000Z
2022-03-22T19:33:42.000Z
--- UID: NS:wditypes._WDI_TYPE_PMK_NAME title: _WDI_TYPE_PMK_NAME (wditypes.h) description: The WDI_TYPE_PMK_NAME structure defines the PMKR0Name or PMKR1Name (802.11r). old-location: netvista\wdi_type_pmk_name.htm tech.root: netvista ms.date: 05/02/2018 keywords: ["WDI_TYPE_PMK_NAME structure"] ms.keywords: "*PWDI_TYPE_PMK_NAME, PWDI_TYPE_PMK_NAME, PWDI_TYPE_PMK_NAME structure pointer [Network Drivers Starting with Windows Vista], WDI_TYPE_PMK_NAME, WDI_TYPE_PMK_NAME structure [Network Drivers Starting with Windows Vista], _WDI_TYPE_PMK_NAME, netvista.wdi_type_pmk_name, wditypes/PWDI_TYPE_PMK_NAME, wditypes/WDI_TYPE_PMK_NAME" req.header: wditypes.hpp req.include-header: req.target-type: Windows req.target-min-winverclnt: Windows 10 req.target-min-winversvr: Windows Server 2016 req.kmdf-ver: req.umdf-ver: req.ddi-compliance: req.unicode-ansi: req.idl: req.max-support: req.namespace: req.assembly: req.type-library: req.lib: req.dll: req.irql: targetos: Windows req.typenames: WDI_TYPE_PMK_NAME, *PWDI_TYPE_PMK_NAME f1_keywords: - _WDI_TYPE_PMK_NAME - wditypes/_WDI_TYPE_PMK_NAME - PWDI_TYPE_PMK_NAME - wditypes/PWDI_TYPE_PMK_NAME - WDI_TYPE_PMK_NAME - wditypes/WDI_TYPE_PMK_NAME topic_type: - APIRef - kbSyntax api_type: - HeaderDef api_location: - wditypes.hpp api_name: - _WDI_TYPE_PMK_NAME - PWDI_TYPE_PMK_NAME - WDI_TYPE_PMK_NAME --- # _WDI_TYPE_PMK_NAME structure ## -description > [!IMPORTANT] > This topic is part of the [WDI driver model](/windows-hardware/drivers/network/wdi-miniport-driver-design-guide) released in Windows 10. The WDI driver model is in maintenance mode and will only receive high priority fixes. [WiFiCx](/windows-hardware/drivers/netcx/wifi-wdf-class-extension-wificx) is the Wi-Fi driver model released in Windows 11. We recommend that you use WiFiCx to take advantage of the latest features. The WDI_TYPE_PMK_NAME structure defines the PMKR0Name or PMKR1Name (802.11r). ## -struct-fields ### -field Name the PMKR0Name or PMKR1Name.
31.969231
426
0.774783
yue_Hant
0.718746
1c289c751649be8a9b9527d4b1ec5f3eb8d43268
2,071
md
Markdown
content/en/resources/customization/beagle-for-ios/custom-widgets.md
otaviojava/beagle-docs
3ed846302bfde8b938be4559bc19f45bf2b0d6bf
[ "Apache-2.0" ]
null
null
null
content/en/resources/customization/beagle-for-ios/custom-widgets.md
otaviojava/beagle-docs
3ed846302bfde8b938be4559bc19f45bf2b0d6bf
[ "Apache-2.0" ]
null
null
null
content/en/resources/customization/beagle-for-ios/custom-widgets.md
otaviojava/beagle-docs
3ed846302bfde8b938be4559bc19f45bf2b0d6bf
[ "Apache-2.0" ]
null
null
null
---
title: Custom Widgets
weight: 149
description: 'Here you will find how to create a component and a custom widget class'
---

---

## Introduction

Beagle already ships with basic widgets that can be used to alter your application's UI from the back-end. However, you can add new components of your own, making your application's views available to Beagle so they can also be used from the back-end.

## How to create components (custom views) and widgets

### Step 1: Create a Widget

Below is an example of a customized component that wraps a UILabel:

```swift
struct MyCustomComponent: ServerDrivenComponent {
    let text: String

    func toView(renderer: BeagleRenderer) -> UIView {
        let label = UILabel(frame: .zero)
        label.text = text
        label.numberOfLines = 0
        return label
    }
}
```

As the example shows, `MyCustomComponent` is a `ServerDrivenComponent`, a protocol that conforms to `Decodable` and is responsible for decoding the properties your widget exposes to the back-end.

### Step 2: Register the Widget

You must register the widget with Beagle. Inside the configuration file, use the **`registerCustomComponent()`** method. The first parameter is the string by which your BFF will refer to the component, and the second parameter is the component's class.

```swift
Beagle.registerCustomComponent(
    "MyCustomComponent",
    componentType: MyCustomComponent.self
)
```

After you register your customized component, you can use it via server-driven UI.

### Step 3: Display the component

You can use your component declaratively by placing it in an instance of `BeagleScreenViewController`, or call its `toView()` method and present the resulting `UIView` in your own view controller.

```swift
let beagleScreenViewController = Beagle.screen(
    .declarative(
        .init(child:
            MyCustomComponent(text: "Hello Beagle!")
        )
    )
)
```

Even if you have a more complex component in your `UIView`s, the process is very similar: you just have to provide a `ServerDrivenComponent` or a `Widget` type.
33.403226
234
0.733945
eng_Latn
0.997166
1c28cd19dd704a9eb3a1144ef03afcb0704ea969
5,329
md
Markdown
docs/framework/winforms/advanced/application-settings-attributes.md
olifantix/docs.de-de
a31a14cdc3967b64f434a2055f7de6bf1bb3cda8
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/winforms/advanced/application-settings-attributes.md
olifantix/docs.de-de
a31a14cdc3967b64f434a2055f7de6bf1bb3cda8
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/winforms/advanced/application-settings-attributes.md
olifantix/docs.de-de
a31a14cdc3967b64f434a2055f7de6bf1bb3cda8
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Application Settings Attributes
ms.date: 03/30/2017
helpviewer_keywords:
- application settings [Windows Forms], attributes
- attributes [Windows Forms], application settings
- wrapper classes [Windows Forms], application settings
ms.assetid: 53caa66c-a9fb-43a5-953c-ad092590098d
ms.openlocfilehash: d52549546bc838d8d38da33b9bb9931488795064
ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 05/04/2018
---
# <a name="application-settings-attributes"></a>Application Settings Attributes

The application settings architecture provides many attributes that can be applied to the application settings wrapper class or to its individual properties. These attributes are examined at run time by the application settings infrastructure, often specifically by the settings provider, to tailor its operation to the stated needs of the custom wrapper.

The following table lists the attributes that can be applied to the application settings wrapper class, to individual properties of that class, or to both. By definition, exactly one scope attribute, either **UserScopedSettingAttribute** or **ApplicationScopedSettingAttribute**, must be applied to each property.

> [!NOTE]
> A custom settings provider, derived from the <xref:System.Configuration.SettingsProvider> class, is only required to recognize the following three attributes: **ApplicationScopedSettingAttribute**, **UserScopedSettingAttribute**, and **DefaultSettingValueAttribute**.

|Attribute|Target|Description|
|---------------|------------|-----------------|
|<xref:System.Configuration.SettingsProviderAttribute>|Both|Specifies the short name of the settings provider to use for persistent storage.<br /><br /> If this attribute is not supplied, the default provider, <xref:System.Configuration.LocalFileSettingsProvider>, is assumed.|
|<xref:System.Configuration.UserScopedSettingAttribute>|Both|Defines a property as a user-scoped application setting.|
|<xref:System.Configuration.ApplicationScopedSettingAttribute>|Both|Defines a property as an application-scoped application setting.|
|<xref:System.Configuration.DefaultSettingValueAttribute>|Property|Specifies a string that can be deserialized by the provider into the hard-coded default value for this property.<br /><br /> The <xref:System.Configuration.LocalFileSettingsProvider> does not require this attribute and will invariably override it if a value has already been persisted.|
|<xref:System.Configuration.SettingsDescriptionAttribute>|Property|Provides descriptive text for an individual setting, used primarily by run-time and design-time tools.|
|<xref:System.Configuration.SettingsGroupNameAttribute>|Class|Provides an explicit name for a settings group. If this attribute is absent, <xref:System.Configuration.ApplicationSettingsBase> uses the name of the wrapper class.|
|<xref:System.Configuration.SettingsGroupDescriptionAttribute>|Class|Provides descriptive text for a settings group, used primarily by run-time and design-time tools.|
|<xref:System.Configuration.SettingsManageabilityAttribute>|Both|Specifies zero or more manageability services that should be provided for the settings group or property. The available services are described by the <xref:System.Configuration.SettingsManageability> enumeration.|
|<xref:System.Configuration.SpecialSettingAttribute>|Property|Indicates that a setting belongs to a special, predefined category, such as a connection string, that suggests special processing by the settings provider. The predefined categories for this attribute are defined by the <xref:System.Configuration.SpecialSetting> enumeration.|
|<xref:System.Configuration.SettingsSerializeAsAttribute>|Both|Specifies a preferred serialization mechanism for a settings group or property. The available serialization mechanisms are defined by the <xref:System.Configuration.SettingsSerializeAs> enumeration.|
|<xref:System.Configuration.NoSettingsVersionUpgradeAttribute>|Property|Specifies that a settings provider should disable all settings-upgrade functionality for the marked property.|

*Class* indicates that the attribute can be applied only to an application settings wrapper class. *Property* indicates that the attribute can be applied only to settings properties. *Both* indicates that the attribute can be applied at either level.

## <a name="see-also"></a>See also

<xref:System.Configuration.ApplicationSettingsBase>
<xref:System.Configuration.SettingsProvider>
[Application Settings Architecture](../../../../docs/framework/winforms/advanced/application-settings-architecture.md)
[How to: Create Application Settings](http://msdn.microsoft.com/library/53b3af80-1c02-4e35-99c6-787663148945)
121.113636
415
0.830175
deu_Latn
0.979152
1c291f2a095590b86dc300d2eb00509effdd302d
6,704
md
Markdown
docs/framework/unmanaged-api/debugging/icordebugmanagedcallback-interface.md
MoisesMlg/docs.es-es
4e8c9f518ab606048dd16b6c6a43a4fa7de4bcf5
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/unmanaged-api/debugging/icordebugmanagedcallback-interface.md
MoisesMlg/docs.es-es
4e8c9f518ab606048dd16b6c6a43a4fa7de4bcf5
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/unmanaged-api/debugging/icordebugmanagedcallback-interface.md
MoisesMlg/docs.es-es
4e8c9f518ab606048dd16b6c6a43a4fa7de4bcf5
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: ICorDebugManagedCallback Interface
ms.date: 03/30/2017
api_name:
- ICorDebugManagedCallback
api_location:
- mscordbi.dll
api_type:
- COM
f1_keywords:
- ICorDebugManagedCallback
helpviewer_keywords:
- ICorDebugManagedCallback interface [.NET Framework debugging]
ms.assetid: b47f1d61-c7dc-4196-b926-0b08c94f7041
topic_type:
- apiref
ms.openlocfilehash: 6eebabc3a08027eab4ac55c1e46dd75b1f75bd21
ms.sourcegitcommit: d8020797a6657d0fbbdff362b80300815f682f94
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 11/24/2020
ms.locfileid: "95679707"
---
# <a name="icordebugmanagedcallback-interface"></a>ICorDebugManagedCallback Interface

Provides methods to process debugger callbacks.

## <a name="methods"></a>Methods

|Method|Description|
|------------|-----------------|
|[Break Method](icordebugmanagedcallback-break-method.md)|Notifies the debugger when a <xref:System.Reflection.Emit.OpCodes.Break> instruction in the code stream is executed.|
|[Breakpoint Method](icordebugmanagedcallback-breakpoint-method.md)|Notifies the debugger when a breakpoint is encountered.|
|[BreakpointSetError Method](icordebugmanagedcallback-breakpointseterror-method.md)|Notifies the debugger that the common language runtime (CLR) was unable to accurately bind a breakpoint that was set before a function was just-in-time (JIT) compiled.|
|[ControlCTrap Method](icordebugmanagedcallback-controlctrap-method.md)|Notifies the debugger that a CTRL+C was trapped in the process being debugged.|
|[CreateAppDomain Method](icordebugmanagedcallback-createappdomain-method.md)|Notifies the debugger that an application domain has been created.|
|[CreateProcess Method](icordebugmanagedcallback-createprocess-method.md)|Notifies the debugger when a process has been attached or started for the first time.|
|[CreateThread Method](icordebugmanagedcallback-createthread-method.md)|Notifies the debugger that a thread has started executing managed code.|
|[DebuggerError Method](icordebugmanagedcallback-debuggererror-method.md)|Notifies the debugger that an error has occurred while attempting to handle an event from the CLR.|
|[EditAndContinueRemap Method](icordebugmanagedcallback-editandcontinueremap-method.md)|Deprecated. Notifies the debugger that a remap event has been sent to the IDE.|
|[EvalComplete Method](icordebugmanagedcallback-evalcomplete-method.md)|Notifies the debugger that an evaluation has completed.|
|[EvalException Method](icordebugmanagedcallback-evalexception-method.md)|Notifies the debugger that an evaluation has terminated with an unhandled exception.|
|[Exception Method](icordebugmanagedcallback-exception-method.md)|Notifies the debugger that an exception has been thrown from managed code.|
|[ExitAppDomain Method](icordebugmanagedcallback-exitappdomain-method.md)|Notifies the debugger that an application domain has exited.|
|[ExitProcess Method](icordebugmanagedcallback-exitprocess-method.md)|Notifies the debugger that a process has exited.|
|[ExitThread Method](icordebugmanagedcallback-exitthread-method.md)|Notifies the debugger that a thread that was executing managed code has exited.|
|[LoadAssembly Method](icordebugmanagedcallback-loadassembly-method.md)|Notifies the debugger that a CLR assembly has been successfully loaded.|
|[LoadClass Method](icordebugmanagedcallback-loadclass-method.md)|Notifies the debugger that a class has been loaded.|
|[LoadModule Method](icordebugmanagedcallback-loadmodule-method.md)|Notifies the debugger that a CLR module has been successfully loaded.|
|[LogMessage Method](icordebugmanagedcallback-logmessage-method.md)|Notifies the debugger that a managed thread in the CLR has called a method in the <xref:System.Diagnostics.EventLog> class to log an event.|
|[LogSwitch Method](icordebugmanagedcallback-logswitch-method.md)|Notifies the debugger that a CLR managed thread has called a method in the <xref:System.Diagnostics.Switch> class to create, modify, or delete a debug/trace switch.|
|[NameChange Method](icordebugmanagedcallback-namechange-method.md)|Notifies the debugger that the name of an application domain or a thread has changed.|
|[StepComplete Method](icordebugmanagedcallback-stepcomplete-method.md)|Notifies the debugger that a step has completed.|
|[UnloadAssembly Method](icordebugmanagedcallback-unloadassembly-method.md)|Notifies the debugger that a CLR assembly has been unloaded.|
|[UnloadClass Method](icordebugmanagedcallback-unloadclass-method.md)|Notifies the debugger that a class is being unloaded.|
|[UnloadModule Method](icordebugmanagedcallback-unloadmodule-method.md)|Notifies the debugger that a CLR module (DLL) has been unloaded.|
|[UpdateModuleSymbols Method](icordebugmanagedcallback-updatemodulesymbols-method.md)|Notifies the debugger that the symbols for a CLR module have changed.|

## <a name="remarks"></a>Remarks

All callbacks are serialized, called in the same thread, and called with the process in the synchronized state. Each callback implementation must call [ICorDebugController::Continue](icordebugcontroller-continue-method.md) to resume execution. If `ICorDebugController::Continue` is not called before the callback returns, the process will remain stopped and no more event callbacks will occur until `ICorDebugController::Continue` is called.

A debugger must implement [ICorDebugManagedCallback2](icordebugmanagedcallback2-interface.md) if it is debugging .NET Framework version 2.0 applications. An instance of `ICorDebugManagedCallback` or `ICorDebugManagedCallback2` is passed as the callback object to [ICorDebug::SetManagedHandler](icordebug-setmanagedhandler-method.md).

> [!NOTE]
> This interface does not support being called remotely, either cross-process or cross-machine.

## <a name="requirements"></a>Requirements

**Platforms:** See [System Requirements](../../get-started/system-requirements.md).

**Header:** CorDebug.idl, CorDebug.h

**Library:** CorGuids.lib

**.NET Framework Versions:** [!INCLUDE[net_current_v10plus](../../../../includes/net-current-v10plus-md.md)]

## <a name="see-also"></a>See also

- [ICorDebug Interface](icordebug-interface.md)
- [ICorDebugManagedCallback2 Interface](icordebugmanagedcallback2-interface.md)
- [Debugging Interfaces](debugging-interfaces.md)
78.870588
402
0.802208
spa_Latn
0.876285
1c2931f44e0bf875cef5d3cf543bb2385057613f
506
md
Markdown
postprocessing/README.md
Ruler0421/crawl-cfgov
354da1fe8a2cc881004cfb8b53c6c4f84bd2fd69
[ "CC0-1.0" ]
10
2020-10-15T02:02:36.000Z
2022-01-25T20:54:30.000Z
postprocessing/README.md
Ruler0421/crawl-cfgov
354da1fe8a2cc881004cfb8b53c6c4f84bd2fd69
[ "CC0-1.0" ]
13
2020-10-13T15:38:54.000Z
2021-02-13T18:08:56.000Z
postprocessing/README.md
Ruler0421/crawl-cfgov
354da1fe8a2cc881004cfb8b53c6c4f84bd2fd69
[ "CC0-1.0" ]
4
2020-10-27T18:40:05.000Z
2021-02-18T11:35:43.000Z
# HTML postprocessing

This directory contains [sed](https://linux.die.net/man/1/sed) scripts used to postprocess website HTML after it has been downloaded. These scripts are used to simplify the HTML before it is committed back to this repository.

To apply these scripts to a directory containing HTML:

```sh
./transform_results.sh path_to_html
```

The HTML will be modified in-place.

A testing script is also included for validating the existing scripts and developing new ones:

```sh
./test.sh
```
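As a hypothetical illustration of the kind of transform such scripts perform (this exact rule is not taken from the repository's scripts), a sed one-liner that strips cache-busting query strings from asset references so the committed HTML stays stable between crawls might look like:

```sh
# Remove "?v=<hex>" cache-busting suffixes from .css/.js references.
echo '<link rel="stylesheet" href="/static/css/main.css?v=abc123">' \
  | sed -E 's/(\.(css|js))\?v=[0-9a-f]+/\1/'
```

In the real scripts, a rule like this would be applied in-place across a directory of downloaded pages.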
25.3
94
0.774704
eng_Latn
0.999083
1c298abb88e16ea418750dff98f8d3aae80a659a
112
md
Markdown
README.md
samtron1412/samtron1412.github.io
37c57e81a0afe604738d3a8e4b6a99f8628b2d77
[ "MIT" ]
null
null
null
README.md
samtron1412/samtron1412.github.io
37c57e81a0afe604738d3a8e4b6a99f8628b2d77
[ "MIT" ]
1
2021-09-28T00:15:35.000Z
2021-09-28T00:15:35.000Z
README.md
samtron1412/samtron1412.github.io
37c57e81a0afe604738d3a8e4b6a99f8628b2d77
[ "MIT" ]
null
null
null
# About

This is the source code for my personal website.

Unless stated otherwise, all content is MIT-licensed.
22.4
53
0.785714
eng_Latn
0.999835
1c2a3de17d340b393243dcf2d09e70679a26a55f
2,597
md
Markdown
README.md
seamile4kairi/personal_template
8fec3f5f76a21c02c82ea9b79a32c7943aa93f69
[ "MIT" ]
null
null
null
README.md
seamile4kairi/personal_template
8fec3f5f76a21c02c82ea9b79a32c7943aa93f69
[ "MIT" ]
null
null
null
README.md
seamile4kairi/personal_template
8fec3f5f76a21c02c82ea9b79a32c7943aa93f69
[ "MIT" ]
null
null
null
Kairerplate
===========

The website boilerplate for/by Kairi KAWASAKI.

Requirement
-----------

- Node.js 16+

And also, you should know about:

- [Parcel](https://parceljs.org/docs/) – the build tool used in this boilerplate
- [EJS](https://ejs.co/#docs) – default template engine
  - [NOTE] You can use other template engines as long as you follow Parcel's specifications.
- [PostCSS](https://postcss.org/)
  - [NOTE] You can use other alt-languages as long as you follow Parcel's specifications.
- [Stimulus](https://stimulus.hotwired.dev/) – default JavaScript framework
  - [NOTE] You don't have to use it, but I'd recommend it because its architecture is modern and it stays highly readable during development.
- Atomic Design – used as the component structure

Usage
-----

### Installation

```shell
$ npm ci
```

### On development

```shell
$ npm start
```

### On build

```shell
$ npm run build
```

File structure
--------------

```
project-root/
|
|- dist/             # (Ignored) destination directory w/ `npm (start|run build)`
|
|- assets/           # Directory for common files referenced by source codes
|  |- scripts/       # Directory for JavaScript files
|  |  |- modules/
|  |  `- app.js      # Common JavaScript file
|  |- static/        # Directory for images or other files
|  `- styles/        # Directory for style files
|     |- modules/
|     `- app.pcss    # Common (Post)CSS file
|
|- components/       # Directory for components
|  |- atoms/
|  |- molecules/
|  `- organisms/
|     |- Hoge/       # [NOTE] Store all files used in the component into the same directory
|     |  |- Hoge.ejs
|     |  |- Hoge.js
|     |  |- Hoge.pcss
|     |  |- images/
|     |  `- ...
|     `- ...
|
|- layouts/          # Directory for layout templates
|  |- modules/
|  `- Default.ejs
|
|- pages/            # Directory for each page
|  `- index.ejs
|
|- static/           # Directory for static files not to be referenced from source codes
|
|- .config.js        # Configuration file for this whole project
|- .ejsrc.js         # Configuration file for EJS
|- .htmlnanorc       # Configuration file for minifying HTML
|- .parcelrc         # Configuration file for Parcel
|- .postcssrc.js     # Configuration file for PostCSS
|- package.json      # [NOTE] Some configurations are also defined here
`- ...
```

LICENSE
-------

MIT
26.5
131
0.558337
eng_Latn
0.937821
1c2a7fe919f4778b46702bf418a27a4679e1ee4a
381
md
Markdown
README.md
biggoron/phonetizer
d5a35c581b4be17842007bf9017cddb76e9720dd
[ "MIT" ]
null
null
null
README.md
biggoron/phonetizer
d5a35c581b4be17842007bf9017cddb76e9720dd
[ "MIT" ]
null
null
null
README.md
biggoron/phonetizer
d5a35c581b4be17842007bf9017cddb76e9720dd
[ "MIT" ]
null
null
null
# Phonetizer - French

Utility library that checks French phonetization. Provides Python functions and a CLI.

## Usage

Install as a pip package (`pip install phonetizer-fr-dan`), then use the command line:

```
$ check_phonetisation orthophoniste oRtofonist model.pt
-> 1.2
$ check_phonetisation orthophoniste patat model.pt
-> -0.5
```

## Dev

```
from phonetizer import PhonetizerModel
```
18.142857
78
0.755906
eng_Latn
0.91326
1c2ae92272174b944a47d383737c15abf4a98e62
2,236
md
Markdown
README.md
oxalica/fcitx.vim
bd6fc8c0919f92d39b9990fdf63d1781cc47926f
[ "MIT" ]
null
null
null
README.md
oxalica/fcitx.vim
bd6fc8c0919f92d39b9990fdf63d1781cc47926f
[ "MIT" ]
null
null
null
README.md
oxalica/fcitx.vim
bd6fc8c0919f92d39b9990fdf63d1781cc47926f
[ "MIT" ]
null
null
null
Keep and restore fcitx state for each buffer separately when leaving/re-entering insert mode or search mode, like always typing English in normal mode but Chinese in insert mode.

D-Bus only works within the same user session, so this won't work with `sudo vim`. See the `fcitx5-server` branch for an experimental implementation that supports `sudo vim`.

By default, the plugin uses Python 3 and D-Bus to toggle the IME state. If you set `g:fcitx5_remote` to the executable path of `fcitx5-remote` **BEFORE** loading the plugin, it will use `fcitx5-remote` instead of Python and D-Bus; in that case, Python 3 support is optional. The `fcitx5-remote` mode is usually much faster, since the Python script needs quite some time for its initial load if you don't use any other plugins that load Python.

Base requirements:

* fcitx 5

Requirements for Python mode (`g:fcitx5_remote` is not set):

* Vim with Python 3 compiled in
* The python-dbus package

Requirements for `fcitx5-remote` mode (`g:fcitx5_remote` is set):

* fcitx5-remote

Links:

* [git repo](https://github.com/lilydjwg/fcitx.vim)
* [www.vim.org](https://www.vim.org/scripts/script.php?script_id=3764)

Warning:

1. If you use Vim in a terminal, please set `'ttimeoutlen'` to 100 or some other small value to avoid the Esc delay. Also check screen's `maptimeout` option or tmux's `escape-time` option if you use either of them.
2. Please confirm in fcitx5-configtool that English is the first input method and Chinese is the second. Rime users should note that fcitx5 must be configured with two input methods.
35.492063
201
0.767442
eng_Latn
0.720809
1c2bb7df1f77ae2ebd2ad47713cb5d78609fa09b
4,895
md
Markdown
Markdown/10000s/10000/lifted suffer.md
rcvd/interconnected-markdown
730d63c55f5c868ce17739fd7503d562d563ffc4
[ "MIT" ]
2
2022-01-19T09:04:58.000Z
2022-01-23T15:44:37.000Z
Markdown/00500s/00500/lifted suffer.md
rcvd/interconnected-markdown
730d63c55f5c868ce17739fd7503d562d563ffc4
[ "MIT" ]
null
null
null
Markdown/00500s/00500/lifted suffer.md
rcvd/interconnected-markdown
730d63c55f5c868ce17739fd7503d562d563ffc4
[ "MIT" ]
1
2022-01-09T17:10:33.000Z
2022-01-09T17:10:33.000Z
- - Concerned alone nearly to the this the. Ill they by which his held he. Be but best are between. Lodging which countenance their any with water. Murmured per long that [[teeth]] among. Has at to for with of somehow. Fall be the upon me been. One directly the hand in v. The and mostly more but was to found with. Worse conquered of been and patriotism them for. Gabriel god beautiful the in. Enter according out so the. Connect man def were was. Upon the for girls women this. This a in in on resolved. Protect very his to stood. To with the pride [[tells]] rain that. Not book shape lay one Rome the. Remaining small passing freedom all and. Fixed on [[dressed]] i let de or. Take dropped me the point well. You at her he flaming in. The i interests its are time and. House the older to raging the neither so. Dare nearest to just the also in. Me was the face longer of in. There for another makes her of but [[dressed impression]]. In to upon good of he. Complaining he we understood had that will said. There liked preparations and to of i among. Servants her were the condition. His array their but going little would. Our sake many members health not time primarily and. Tea make in she lying time. With breast i left foot his newspaper. In pound the touch an had eaten. Off verdure listened found that would the the. Was house can his eye conquered i. Person in and [[suffer welcome]] shall not go. Last the in by them only not once good. And authority called tailor i the. Chiefs but i less considerable chocolate but. Few him remark hours has you say the. - Mostly the will our and them. - Very the developed the speech up his. - Of sudden he passage but. - Rocks always allow one suspect she. - Below rather and kill dust night. - Advantage taken England know as must. - Silent him passed fairly. - Lincoln human received are man his. - Names of of rushing will route. - At of do when of acceptable mock in. - To have passions after [[title owner]] the were. 
- Were i some the original and exist. - Intended fair sinful could breath or. - Each was and the wounded man faces. - In the see or [[content]] for. - Veil Ishmael on can the morning. - Will it spirit he body got of. - System of that some provide sweet fire. - An which the to some. - Mrs offered works no are i that. - Coach no never very i had. - The def or let tip unknown. - Neck in verdure prefer remembered. - For to was iron the musical only. - Here diligence style in in knowing do. - Let the i the partly [[flesh]]. - Terrible had the walking then terror. - Could the of it that of and. - As all moves before but. - At absurd [[extraordinary]] saddle this really. - Famous sea on old for innocence. - First encoding the of the. - With seemed the silent to. - His of little of and disappear Miriam. - Much from for inflict than pockets. - Impression and so the errors another. - His the there of again supposition. - [[hopes December]] villain settled means the spirit. - Yellow he the between to upstairs the it. [[proceeded]] late the light the something. At and shew chance the or. Of [[proceeded]] is do and i into produce. Is you women required lovely said. The come write violent glass so honour are. I this sometimes do singing had some robbed. Undertook of that appeared nothing this opened through. Must ornament other away want to. The sand should silent looked of. For the has and works in. Gone the had ceiling to without as. My enormous [[dressed]] interrupted protection caught it. Though to youth of of dark whom. - Of the and your Arabic. Of to recent contain in old. It their true for coming. Upon is [[lifted explanation]] modern dust associated. Had the result deal upon hesitated. And manhood gladly kings else due of of. And i and had evolution in. Out your his and. As copied not air. No i thus act of. Christine and danger her to. As by to clothes were of thou that. - Boat which turned roman against. He through was of with as. Quite the which it just the wrote do. 
The and that here not i with. Flame with to of about know them. Episode were fair midst i the treatment. All her to of that to have of. Crash you shall direct at to from. Maintenance been been here the the. Since now the child themselves legs the. That license upon ill the to. Seen he de documents come. Had under of to in well to. Are the weak in us of. Say me to paid had had. - Truth share the was the and still. It it Ruth towards smitten sufficiently taught. And and conversation warriors was. Could had old personal streets that formal. Of the the substance by he tip carry. The drinking writing him cheat much emotions. Vow of surroundings bath distinction in till. Dr [[suffer]] interesting last in would. Had the monuments are under. End for get if were name. Some cost entered hope i thee the. The entity themselves said meet.
111.25
1,565
0.746885
eng_Latn
0.999953
1c2bb98b3415def8b817c714b811106099d84066
8,950
markdown
Markdown
_posts/2020-02-28-jquery-plugin.markdown
guyang369/guyang369.github.io
bd313ad56e632c757fb47ecb679743dcbc6a39c4
[ "Apache-2.0" ]
null
null
null
_posts/2020-02-28-jquery-plugin.markdown
guyang369/guyang369.github.io
bd313ad56e632c757fb47ecb679743dcbc6a39c4
[ "Apache-2.0" ]
null
null
null
_posts/2020-02-28-jquery-plugin.markdown
guyang369/guyang369.github.io
bd313ad56e632c757fb47ecb679743dcbc6a39c4
[ "Apache-2.0" ]
1
2020-04-20T06:22:58.000Z
2020-04-20T06:22:58.000Z
--- layout: post title: jQuery 插件开发(转) subtitle: jQuery 插件开发 date: 2020-02-28 author: guyang header-img: "img/post-bg-2018.jpg" tags: - jQuery插件开发 --- 转自: [https://www.cnblogs.com/wayou/p/jquery_plugin_tutorial.html](https://www.cnblogs.com/wayou/p/jquery_plugin_tutorial.html) #### 基本格式: ```js $.fn.pluginName = function(){ // your code goes here } ``` 基本上就是往$.fn上面添加一个方法,名字是我们的插件名称。然后我们的插件代码在这个方法里展开。 比如我们将页面上所有链接颜色转成红色,则可以这样写这个插件: ```js $.fn.myPlugin = function(){ // 在这里面,this指的是用jQuery选中的元素 // example: $('a').myPlugin(),则this=$('a') this.css('color','red'); } ``` 在插件名字定义的这个函数内部,this指代的是我们在调用该插件时,用jQuery选择器选中的元素,一般是一个jQuery类型的集合。比如$('a') 返回的是页面上所有a标签的集合,且这个集合已经是jQuery包装类型了,也就是说,在对其进行操作的时候可以直接调用jQuery的其他方法而不需要再用$来包装一下。 所以在上面的插件代码中,我们在this身上调用jQuery的css()方法,也就相当于在调用$('a').css()。 理解this在这个地方的含义很重要。这样你才知道为什么可以直接调用jQuery方法同时在其他地方this指代不同时我们又需要用jQuery重新包装才能调用,下面会讲到。初学容易被this的值整晕,但理解了就不难。 现在就可以去页面试试我们的代码了,在页面上放几个链接,调用插件后链接字段变成红色。 ```html <ul> <li> <a href="http://www.webo.com/liuwayong">我的微博</a> </li> <li> <a href="http://http://www.cnblogs.com/Wayou/">我的博客</a> </li> <li> <a href="http://wayouliu.duapp.com/">我的小站</a> </li> </ul> <p>这是p标签不是a标签,我不会受影响</p> <script src="jquery-1.11.0.min.js"></script> <script src="jquery.myplugin.js"></script> <script type="text/javascript"> $(function(){ $('a').myPlugin(); }) </script> ``` 下面进一步,在插件代码里处理每个具体的元素,而不是对一个集合进行处理,这样我们就可以针对每个元素进行相应操作。 我们已经知道this指代jQuery选择器返回的集合,那么通过调用jQuery的.each()方法就可以处理集合中的每个元素了,但此刻要注意的是,在each方法内部,this指代的是普通的DOM元素了,如果需要调用jQuery的方法那就需要用$来重新包装一下。 更改后的代码为: ```js $.fn.myPlugin = function(){ // 在这里,this指的是用jQuery选中的元素 this.css('color','red'); this.each(function(){ // 对每个元素进行操作 $(this).append(''+$(this).attr('href')); }); } ``` 到此,你已经可以编写功能简单的jQuery插件了。 下面开始jQuery插件编写中一个重要的部分,参数的接收。 #### 支持链式调用 我们都知道jQuery一个非常优雅的特性是支持链式调用,选择好DOM元素后可以不断地调用其他方法。 要让插件不打破这种链式调用,只需要return一下即可。 ```js $.fn.myPlugin = function(){ // 在这里,this指的是用jQuery选中的元素 this.css('color','red'); return this.each(function(){ // 对每个元素进行操作 
        $(this).append(''+$(this).attr('href'));
    });
}
```

#### Letting the plugin accept parameters

A robust plugin can be customized freely by its users, which requires us to think things through when writing it and to provide sensible parameters wherever possible.

Say we no longer want links to simply turn red; instead, we let the plugin's user decide what color to display. This is easy to do: the user just passes a parameter when calling the plugin, and we receive it inside the plugin's code. On the other hand, for flexibility, the user may also omit the parameter, in which case the plugin supplies a default value.

To receive plugin parameters, jQuery's `extend` method is commonly used. When `extend` is passed more than one argument, it merges all the argument objects into the first one; and when the objects contain properties with the same name, later values overwrite earlier ones during the merge.

Taking advantage of this, we can define an object inside the plugin that holds the default parameter values, then merge the received parameter object onto that defaults object. The result is that parameters the user specified take the user's values, while unspecified parameters fall back to the plugin's defaults.

For demonstration purposes, let's add another parameter, `fontSize`, which allows the caller to set the font size.

```js
$.fn.myPlugin = function (options) {
    var defaults = {
        'color': 'red',
        'fontSize': '12px'
    };
    var settings = $.extend(defaults, options);
    return this.css({
        'color': settings.color,
        'fontSize': settings.fontSize
    });
}
```

Now, when we specify a color at call time but leave the font size unspecified, the plugin's default of 12px is applied:

```js
$('a').myPlugin({
    'color': '#2C9929'
});
```

Specifying both the color and the font size:

```js
$('a').myPlugin({
    'color': '#2C9929',
    'fontSize': '20px'
});
```

#### Protecting the default parameters

Notice that the code above mutates `defaults` when it calls `extend`. That's not good: as part of the plugin, those defaults should stay as they are. Besides, if you need those default values later in your code, by the time you access them again they will already have been changed by the parameters the user passed in.

A better practice is to pass a new empty object as the first argument to `$.extend`, with `defaults` and the user's parameter object right behind it. The benefit is that all values are merged onto the empty object, protecting the plugin's default values.

```js
$.fn.myPlugin = function (options) {
    var defaults = {
        'color': 'red',
        'fontSize': '12px'
    };
    var settings = $.extend({}, defaults, options); // pass an empty object as the first argument
    return this.css({
        'color': settings.color,
        'fontSize': settings.fontSize
    });
}
```

Once a plugin can receive and process parameters, you can write plugins that are more robust and flexible. When writing a complex plugin, the amount of code grows large, and organizing that code becomes a problem you must face: without a good way to structure it, the whole thing feels chaotic and is hard to maintain. Wrapping all of the plugin's methods and properties onto one object and developing with an object-oriented mindset undoubtedly makes the work much easier.

#### Object-oriented plugin development

Why adopt an object-oriented mindset? Because otherwise, you define a standalone function whenever you need a method, casually define another when you need the next one, and likewise scatter ad-hoc variables all over the code.

Same old problems: inconvenient to maintain, and not very clear. Of course, none of this shows while the codebase is small.

If the important variables are defined as properties of an object and the functions become its methods, we retrieve them through the object when needed. That makes them easier to manage, and it doesn't affect the outer namespace, because all those variable names and method names live inside the object.

Continuing the example above, we can abstract this plugin into an object that beautifies the page, since its job is to set colors, fonts, and so on; of course, we could also add other features, such as underlining. Admittedly, abstracting this particular example into an object is overkill; it's here for demonstration only.

So we create an object named `Beautifier` and then use that object in the plugin's code:

```js
// Define the Beautifier constructor
var Beautifier = function (ele, opt) {
    this.$element = ele;
    this.defaults = {
        'color': 'red',
        'fontSize': '12px',
        'textDecoration': 'none'
    };
    this.options = $.extend({}, this.defaults, opt);
};
// Define Beautifier's methods
Beautifier.prototype = {
    beautify: function () {
        return this.$element.css({
            'color': this.options.color,
            'fontSize': this.options.fontSize,
            'textDecoration': this.options.textDecoration
        });
    }
};
// Use the Beautifier object in the plugin
$.fn.myPlugin = function (options) {
    // Create a Beautifier instance
    var beautifier = new Beautifier(this, options);
    // Call its method
    return beautifier.beautify();
};
```

With this refactoring, our code is more object-oriented and easier to maintain and understand. To add new features or methods later, simply add new variables and methods to the object, then instantiate it in the plugin and call the new additions.

The plugin is called the same way as before; our change only touched how the code is organized, and nothing else about the plugin is affected.

Calling it with underlined text specified:

```js
$(function () {
    $('a').myPlugin({
        'color': '#2C9929',
        'fontSize': '20px',
        'textDecoration': 'underline'
    });
})
```

At this point, you can write complex plugins while keeping the code well organized. Looking back at the code above, there is actually still room for improvement, namely the miscellaneous points below about namespaces and variables.

#### About namespaces

Not just in jQuery plugin development: whenever we write any JavaScript code, one thing to watch is not polluting the global namespace. As your code grows, variables defined in global scope, intentionally or not, become hard to maintain and prone to conflicts with code written by others.

For example, suppose you add a variable `status` to the global `window` object to hold some state, and the page also includes a library someone else wrote that adds a global variable with the same name. The end result is certainly not what you wanted. So unless absolutely necessary, we generally don't define variables globally.

A good practice is to always wrap your code in a self-invoking anonymous function. Then you can rest easy and use it safely anywhere, with absolutely no conflicts.

#### Wrap your code in a self-invoking anonymous function

We know that in JavaScript you cannot conveniently create a scope with curly braces, but a function does form a scope, and code inside that scope cannot be accessed from the outside. If we put our code inside a function, it won't pollute the global namespace and won't conflict with other code.

For instance, above we defined a global `Beautifier` variable, which gets attached to the global `window` object. To prevent that, you might say: just put all the code into the plugin definition, that is, into `$.fn.myPlugin`. That is one option, but it bloats the code that actually defines the plugin, whereas inside `$.fn.myPlugin` we should really focus on the plugin's invocation and its interaction with jQuery.

So we keep the original code unchanged and wrap all of it in a self-invoking anonymous function:

```js
(function () {
    // Define the Beautifier constructor
    var Beautifier = function (ele, opt) {
        this.$element = ele;
        this.defaults = {
            'color': 'red',
            'fontSize': '12px',
            'textDecoration': 'none'
        };
        this.options = $.extend({}, this.defaults, opt);
    };
    // Define Beautifier's methods
    Beautifier.prototype = {
        beautify: function () {
            return this.$element.css({
                'color': this.options.color,
                'fontSize': this.options.fontSize,
                'textDecoration': this.options.textDecoration
            });
        }
    };
    // Use the Beautifier object in the plugin
    $.fn.myPlugin = function (options) {
        // Create a Beautifier instance
        var beautifier = new Beautifier(this, options);
        // Call its method
        return beautifier.beautify();
    };
})();
```

The benefits of doing this are exactly as described above. One more benefit: the code inside a self-invoking anonymous function executes right away, so once the page is ready, the code above has the plugin set up and available to the code that follows.

So far this seems close to perfect. But consider a few other factors. Once we drop this code into a page, earlier code written by someone else may not end with a semicolon, or that earlier code may have modified system variables or keywords such as `window` and `undefined`, which we happen to use inside our own code. The result then becomes unpredictable, and that is not what we want.

So a good practice is to put a semicolon at the start of our code; that's a good habit at any time.

At the same time, we pass the system variables into the plugin as parameters. Once we do that, `window` and the other system variables have a local reference inside the plugin, which speeds up access and yields a slight performance gain.

In the end we get a very safe, well-structured piece of code:

```js
;(function ($, window, document, undefined) {
    // our code..
    // blah blah blah...
})(jQuery, window, document);
```

As for that `undefined`, it's a bit more interesting: to get an unmodified `undefined`, we do not pass that argument, yet we receive it in the parameter list. Since nothing was actually passed, what arrives in the `undefined` slot is the genuine `undefined`. A bit of a hack, isn't it? A technique worth savoring.

So in the end our plugin becomes:

```js
;(function ($, window, document, undefined) {
    // Define the Beautifier constructor
    var Beautifier = function (ele, opt) {
        this.$element = ele;
        this.defaults = {
            'color': 'red',
            'fontSize': '12px',
            'textDecoration': 'none'
        };
        this.options = $.extend({}, this.defaults, opt);
    };
    // Define Beautifier's methods
    Beautifier.prototype = {
        beautify: function () {
            return this.$element.css({
                'color': this.options.color,
                'fontSize': this.options.fontSize,
                'textDecoration': this.options.textDecoration
            });
        }
    };
    // Use the Beautifier object in the plugin
    $.fn.myPlugin = function (options) {
        // Create a Beautifier instance
        var beautifier = new Beautifier(this, options);
        // Call its method
        return beautifier.beautify();
    };
})(jQuery, window, document);
```

A safe, well-structured, well-organized plugin is complete.

#### On variable declarations and naming

##### Declaring variables

A good practice is to declare the variables you are going to use with a single `var` keyword at the top of your code, separating the names with commas.

##### Naming variables and functions

Generally use camelCase: the first word starts with a lowercase letter and each subsequent word starts with an uppercase letter, for example `resultArray`. For constants, use all uppercase letters with underscores between words, for example `TICKET_NUM = 100`.

When an object is a jQuery object, it's recommended to start its name with `$`, because that makes it easy to tell apart from ordinary variables: the moment you see a leading `$` you know it's a jQuery object and you can call jQuery methods on it directly. For example, after `var $element = $('a');` you can conveniently use it later in the code, and it stands out from other variables.

##### Using quotes

Generally, use double quotes in HTML and single quotes in JavaScript, as in the code below:

```js
var name = 'Wayou';
document.getElementById('example').innerHTML = '<a href="http://opensource.cnsuning.com/">'+name+'</a>'; // href=".." keeps double quotes in HTML, single quotes in JavaScript
```

For one thing, the HTML itself already uses double quotes. For another, when we need quotes inside quotes in JavaScript, alternating single and double quotes is what keeps the statement legal, unless you use escape characters. Furthermore, sticking to this convention keeps the code style consistent, instead of wrapping strings in double quotes in one place and single quotes in another.
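The `$.extend` merge behavior relied on throughout this tutorial can be sketched in a few lines of plain JavaScript. This is a minimal illustrative stand-in, not jQuery's actual implementation (the real `$.extend` also supports a deep-copy mode and skips `undefined` values, among other details):

```javascript
// Minimal stand-in for jQuery's $.extend(target, ...sources):
// properties of later sources overwrite earlier ones, and the
// target object itself is mutated and returned.
function extend(target) {
    for (var i = 1; i < arguments.length; i++) {
        var source = arguments[i];
        for (var key in source) {
            if (Object.prototype.hasOwnProperty.call(source, key)) {
                target[key] = source[key];
            }
        }
    }
    return target;
}

var defaults = { color: 'red', fontSize: '12px' };

// Passing defaults as the target mutates it:
var settings = extend(defaults, { color: '#2C9929' });
// settings.color is '#2C9929' -- and so is defaults.color now.

// Passing an empty object as the target keeps the defaults intact:
var freshDefaults = { color: 'red', fontSize: '12px' };
var safeSettings = extend({}, freshDefaults, { color: 'blue' });
// safeSettings.color is 'blue'; freshDefaults.color is still 'red'.
```

This is exactly why the tutorial passes `{}` as the first argument: the merged result lands on a throwaway object instead of clobbering the plugin's `defaults`.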
28.503185
184
0.694525
yue_Hant
0.720259
1c2c9749af97ea7954a45dc9e4c6663c55b5322a
3,521
md
Markdown
_posts/2021-09-26-loco-old-pals.md
bfi-prog-notes/bfi-prog-notes.github.io
f2f55d7418677e76da1fecdeac349b5e39a1607e
[ "Apache-2.0" ]
1
2021-05-18T19:38:21.000Z
2021-05-18T19:38:21.000Z
_posts/2021-09-26-loco-old-pals.md
bfi-prog-notes/bfi-prog-notes.github.io
f2f55d7418677e76da1fecdeac349b5e39a1607e
[ "Apache-2.0" ]
null
null
null
_posts/2021-09-26-loco-old-pals.md
bfi-prog-notes/bfi-prog-notes.github.io
f2f55d7418677e76da1fecdeac349b5e39a1607e
[ "Apache-2.0" ]
null
null
null
--- layout: post title: Old Pals published: true date: 2021-09-26 readtime: true categories: ['LOCO SHORTS WEEKENDER<br>IN PARTNERSHIP WITH MINDSEYE'] tags: [Shorts, Comedy] metadata: pdf: '2021-09-26-loco-old-pals.pdf' --- BAFTA-recognised festival LOCO aka The London Comedy Film Festival is back to share the best comedy short films rescued from the madness of 2020 in this Comedy Shorts Weekender, in partnership with MindsEye. After a hell of a year, what could be better than meeting up with old friends? Featuring some of our favourites from previous festivals, this selection of short comedy films comes from veteran LOCO filmmakers and contributors. Featuring a man in search of an ending, some historical re-enactors in need of a lift, and a midwife in danger of ruining the miracle of birth. **Programme** **Tyresome** Dir. Theresa Varga Wris. Nathan Bryon and Theresa Varga **Dawn of a New Gay** Dir. Rosie Gaunt-Mathieson Wri. Jack Rooke **Neville Is Dead** Dir. Louis Paxton Wris. Grant O’Rourke & Louis Paxton **Murder Me** Dir/Wri. Stuart Laws **Tina and Peter** Dir. Dan Hodgson Wri. Tracey Collins & Adam McNicol **Anoraks** Dir. Zoe Alker Wris. Zoe Alker, Bryher Flanders & Miles Sloman **I Want to Make You Happy** Dir. Ben Mallaby Wri. Mark Brennan & Paul F. Taylor **Trevor** Dir. Jon Drever Wri. Luke McQueen **Twins** Dir. Louis Hudson Wris. 
Louis Hudson & Benjamin PD Kane <br><br> **LOCO SHORTS WEEKENDER IN PARTNERSHIP WITH MINDSEYE**<br> **Awkward Encounters**<br> Fri 24 Sep 20:50<br> **Intimate Details**<br> Sat 25 Sep 18:10<br> **Dark Turns**<br> Sat 25 Sep 20:40<br> **Old Pals**<br> Sun 26 Sep 12:10<br> **LOCO: BAFTA-recognised Best of 2020**<br> Sun 26 Sep 15:30<br> <br> <img style="float:left" src="/img/loco.png"><br> <br><br><br><br><br><br><br><br><br><br> <img style="float:left" src="/img/mindseye-black.png"> <br><br><br><br><br><br><br><br><br><br><br> **BFI SOUTHBANK** Welcome to the home of great film and TV, with three cinemas and a studio, a world-class library, regular exhibitions and a pioneering Mediatheque with 1000s of free titles for you to explore. Browse special-edition merchandise in the BFI Shop. We're also pleased to offer you a unique new space, the BFI Riverfront – with unrivalled riverside views of Waterloo Bridge and beyond, a delicious seasonal menu, plus a stylish balcony bar for cocktails or special events. Come and enjoy a pre-cinema dinner or a drink on the balcony as the sun goes down. **BECOME A BFI MEMBER** Enjoy a great package of film benefits including priority booking at BFI Southbank and BFI Festivals. Join today at [**bfi.org.uk/join**](http://www.bfi.org.uk/join) **BFI PLAYER** We are always open online on BFI Player where you can watch the best new, cult & classic cinema on demand. Showcasing hand-picked landmark British and independent titles, films are available to watch in three distinct ways: Subscription, Rentals & Free to view. See something different today on [**player.bfi.org.uk**](https://player.bfi.org.uk) Join the BFI mailing list for regular programme updates. Not yet registered? Create a new account at [**www.bfi.org.uk/signup**](http://www.bfi.org.uk/signup) **Programme notes and credits compiled by the BFI Documentation Unit Notes may be edited or abridged Questions/comments?
Contact the Programme Notes team by [email](mailto:prognotes@bfi.org.uk)**
36.677083
555
0.737575
eng_Latn
0.933977
1c2d3bf586218933de09de7bdf329e44e7054d95
428
md
Markdown
markdown/02.md
reale/dechargements
165236222418e3499ad5e3cb1dc64bcdfb02abdf
[ "MIT" ]
null
null
null
markdown/02.md
reale/dechargements
165236222418e3499ad5e3cb1dc64bcdfb02abdf
[ "MIT" ]
null
null
null
markdown/02.md
reale/dechargements
165236222418e3499ad5e3cb1dc64bcdfb02abdf
[ "MIT" ]
null
null
null
# ii what an itch slips out of me in the morning in the sex of finding myself a craving for mechanical flatteries for inventing myself a frenzy of imperious fingers serpents and mistresses des études d'exécution transcendante that command honey into the limbs but then what labor to write of things most unspeakable of hunger and of thirst and of longing for this mechanical self-mocking love oh yes it too is love it has at least love's cowardice and a little shred of anguish
23.777778
44
0.771028
ita_Latn
0.997602
1c2e242941808a3cbe76ab961e1f589db563642c
744
md
Markdown
_slides/2020-02-02-words.md
xuhui/xuhui.github.io
b5331d3b8c6ef8e0f6f266d16d702f91e979d62e
[ "MIT" ]
null
null
null
_slides/2020-02-02-words.md
xuhui/xuhui.github.io
b5331d3b8c6ef8e0f6f266d16d702f91e979d62e
[ "MIT" ]
14
2019-12-28T02:35:39.000Z
2022-02-26T05:19:31.000Z
_slides/2020-02-02-words.md
xuhui/xuhui.github.io
b5331d3b8c6ef8e0f6f266d16d702f91e979d62e
[ "MIT" ]
null
null
null
---
title: Scattered Words
description: Recording scattered words beyond technology
theme: solarized # beige
layout: slides
transition: slide # none/fade/slide/convex/concave/zoom
---

<style type="text/css">
.reveal * { text-align: left; }
.reveal em { font-size: smaller; }
</style>

The following are excerpts of writing dear to Engineer Zhou, his scattered words beyond technology.

---

## I Heard You Are Far Away

I heard you are far away, so I set out and trekked a thousand miles.
I have felt the wind you once felt; does that count as an embrace?
I have walked the roads you once walked; does that count as a meeting?
I simply like you, earnest and timid.
I still like you, like the sun rising, morning and evening alike.
I still like you, like clouds drifting ninety thousand li, never resting.
I still like you, like stars falling upon the earth, unto death.
I still like you, like a breeze blowing into the heart, soft and tender.
I still like you, like a wind that has traveled eighty thousand li and never asks when it will return.

---

## Calming the Waves (Ding Feng Bo): "Listen Not to the Rain Beating Through the Leaves" - Su Shi

Listen not to the rain beating through the forest leaves; why not chant and whistle, strolling slowly on? A bamboo staff and straw sandals outdo a horse. Who is afraid? In a straw cloak I would spend my whole life in mist and rain.

A chilly spring wind sobers me from wine; a little cold, yet the slanting sun on the hilltop comes to greet me. Looking back at the bleak place I came through: going home, there is neither wind and rain nor fair skies.

---

## Then and Now

In the past,
people used carriages, horses, and letters.
Everything was slow;
a lifetime was only enough to love one person.

Now,
I cannot figure out why everyone is so busy;
it seems no one really takes the time
to love someone well.

---

## Engineering Science and Social Science

Working in technology should not mean attending only to pure technology.

Think about how a technology affects those who use it, and its impact on culture and society.

---

Suddenly it grows late in the world, and the mountains and rivers are already in winter.

With each stir of the wind the cold deepens; may everyone have clothes to warm the body, and someone to warm the heart.

---
8
55
0.708333
yue_Hant
0.769136
1c2f3100b955e0b69b34cc2fd6aecaeac706cf5f
1,791
md
Markdown
README.md
ZamElek/SecretManagement.Hashicorp.Vault.KV
430e558c3a60426cf8613f8ca706a3cf33acefd5
[ "MIT" ]
1
2021-05-01T15:17:37.000Z
2021-05-01T15:17:37.000Z
README.md
ZamElek/SecretManagement.Hashicorp.Vault.KV
430e558c3a60426cf8613f8ca706a3cf33acefd5
[ "MIT" ]
null
null
null
README.md
ZamElek/SecretManagement.Hashicorp.Vault.KV
430e558c3a60426cf8613f8ca706a3cf33acefd5
[ "MIT" ]
null
null
null
# SecretManagement.Hashicorp.Vault.KV

[![GitHubSuper-Linter][]][GitHubSuper-LinterLink] [![PSGallery][]][PSGalleryLink]

A PowerShell SecretManagement extension for the Hashicorp Vault Key Value (KV) Engine. This supports version 1, version 2, and cubbyhole (similar to v1). It does not currently support all of the version 2 features, such as versioned secrets or metadata.

## QuickStart

When registering a vault you need to provide at least these options:

```PowerShell
Register-SecretVault -ModuleName SecretManagement.Hashicorp.Vault.KV -Name PowerShellTest -VaultParameters @{ VaultServer = 'http://vault.domain.local:8200'; VaultToken = '<orNot>'}
```

The vault name should match exactly, as Hashicorp Vault is case-sensitive. If no VaultParameters are provided, the functions will prompt you on the first execution.

Additionally, you may specify which version of KV you are using when registering; it defaults to KV version 2.

```PowerShell
$VaultParameters = @{
    VaultServer = 'https://vault-cluster.domain.local'
    VaultToken = '<s.somecharactershere>'
    KVVersion = 'v2'}
```

## KV Version 2 distinctions

- Get-Secret only retrieves the newest secret
- Set-Secret adds/updates without CheckAndSet
- Remove-Secret completely removes the secret and all versions

## TO DO

- Create a vault if it doesn't exist
- Allow token updating
- Allow options for KV2 version retrieval

[GitHubSuper-Linter]: https://github.com/joshcorr/SecretManagement.Hashicorp.Vault.KV/workflows/ci/badge.svg
[GitHubSuper-LinterLink]: https://github.com/marketplace/actions/super-linter
[PSGallery]: https://img.shields.io/powershellgallery/v/SecretManagement.Hashicorp.Vault.KV?label=Powershell+Gallery+Latest
[PSGalleryLink]: https://www.powershellgallery.com/packages/SecretManagement.Hashicorp.Vault.KV
51.171429
277
0.788386
eng_Latn
0.681636
1c2fb55c582111840d46a681f3383e932073f366
1,183
md
Markdown
Language/Reference/User-Interface-Help/update-method-vba-add-in-object-model.md
skucab/VBA-Docs
2912fe0343ddeef19007524ac662d3fcb8c0df09
[ "CC-BY-4.0", "MIT" ]
4
2019-09-07T04:44:48.000Z
2021-12-16T15:05:50.000Z
Language/Reference/User-Interface-Help/update-method-vba-add-in-object-model.md
skucab/VBA-Docs
2912fe0343ddeef19007524ac662d3fcb8c0df09
[ "CC-BY-4.0", "MIT" ]
1
2021-09-28T07:52:15.000Z
2021-09-28T07:52:15.000Z
Language/Reference/User-Interface-Help/update-method-vba-add-in-object-model.md
skucab/VBA-Docs
2912fe0343ddeef19007524ac662d3fcb8c0df09
[ "CC-BY-4.0", "MIT" ]
1
2021-06-23T03:40:08.000Z
2021-06-23T03:40:08.000Z
--- title: Update method (VBA Add-In Object Model) keywords: vbob6.chm102251 f1_keywords: - vbob6.chm102251 ms.prod: office ms.assetid: c88ee513-6d8e-9c40-2999-4cc217fc3fc8 ms.date: 12/06/2018 localization_priority: Normal --- # Update method (VBA Add-In Object Model) Refreshes the contents of the **AddIns** collection from the add-ins listed in the Vbaddin.ini file in the same manner as if the user had opened the **[Add-In Manager](add-in-manager-dialog-box.md)** dialog box. ## Syntax _object_.**Update** The _object_ placeholder represents an [object expression](../../Glossary/vbe-glossary.md#object-expression) that evaluates to an object in the **Applies To** list. ## Remarks All add-ins listed in the Vbaddin.ini file must be registered ActiveX components in the Registry before they can be used in Visual Basic. ## See also - [Collections (Visual Basic Add-In Model)](../visual-basic-add-in-model/collections-visual-basic-add-in-model.md) - [Visual Basic Add-in Model reference](visual-basic-add-in-model-reference.md) - [Visual Basic language reference](visual-basic-language-reference.md) [!include[Support and feedback](~/includes/feedback-boilerplate.md)]
35.848485
211
0.764159
eng_Latn
0.816504
1c2fc0e6031cf511f7d5ff65cb1d4e4c82d48993
2,847
md
Markdown
README.md
menakite/NewsBlur-Counter
4f7e1f0fe835dbfc07893887e0ab0d3bf7593961
[ "MIT" ]
5
2017-06-11T20:39:45.000Z
2019-06-10T09:20:19.000Z
README.md
anaconda/NewsBlur-Counter
4f7e1f0fe835dbfc07893887e0ab0d3bf7593961
[ "MIT" ]
1
2017-10-08T11:52:02.000Z
2017-10-08T11:52:02.000Z
README.md
menakite/NewsBlur-Counter
4f7e1f0fe835dbfc07893887e0ab0d3bf7593961
[ "MIT" ]
null
null
null
# NewsBlur Counter > **tl;dr** [Download](https://menakite.eu/~anaconda/safari/NewsBlur-Counter/NewsBlur-Counter.safariextz) the extension, go to your _Downloads_ directory & double-click on it. >![How it looks on your toolbar](https://raw.github.com/menakite/NewsBlur-Counter/master/images/screenshots/toolbar.png "How it looks on your toolbar") NewsBlur is _a visual feed reader with intelligence_. See [samuelclay/NewsBlur](https://github.com/samuelclay/NewsBlur). NewsBlur Counter is a Safari extension that shows how many unread NewsBlur stories you have in the toolbar. By clicking on the toolbar icon, you can easily access the NewsBlur website. It also makes the "o" keyboard shortcut _actually_ open the original story in a background tab. ## Installation You can install from source, by cloning this repository and opening the Extension Builder in Safari. If you don't even know what a repository is, just download the extension from https://menakite.eu/~anaconda/safari/NewsBlur-Counter/NewsBlur-Counter.safariextz, then open your _Downloads_ directory and double-click on _NewsBlur-Counter.safariextz_. Safari will ask you to confirm. ## Note about Yosemite (Safari 8) The unread badge can only be refreshed in the background if your cookies and website data preferences are either "Allow from websites I visit" or "Always allow". ## Configuration No configuration is required to use this extension. You can, however, change the following settings: * the update interval: from 1 hour to 1 minute. It's set to 5 minutes by default; * what the counter should actually count: only stories you like, all stories, or don't show stories you don't like. Defaults to the latter; * you can choose to access NewsBlur via newsblur.com or dev.newsblur.com when you click on the toolbar icon. The former is the production version, while _dev_ is the development one: though it may have glitches, it generally works well, and has new features. 
The former is used by default; * use a secure HTTPS connection, enabled by default. ![Settings page](https://raw.github.com/menakite/NewsBlur-Counter/master/images/screenshots/settings.png "Settings page") ## Notifications ![Notification: you're logged out!](https://raw.github.com/menakite/NewsBlur-Counter/master/images/screenshots/notification.png "Notification: you're logged out") This extension uses the Web Notifications API to send notifications to the Notification Center on Mountain Lion for the following events: * when NewsBlur is unreachable for whatever reason, so that you know the counter is not up to date; * if you're logged out. This extension **doesn't ask you for passwords**, so you'll have to login on the NewsBlur website. Though it's only tested on Mountain Lion and later, it should work well on previous versions, you just won't get any notification.
66.209302
291
0.787847
eng_Latn
0.991381
1c2fe28d6c39dff4971042a7f77d5b77a92205c6
1,888
md
Markdown
src/ru/2022-01/01/05.md
Adventech/sabbath-school-lessons
baf65ac98fa7c7bce73e16c263eb0cc1bf0ba62a
[ "MIT" ]
68
2016-10-30T23:17:56.000Z
2022-03-27T11:58:16.000Z
src/ru/2022-01/01/05.md
Adventech/sabbath-school-lessons
baf65ac98fa7c7bce73e16c263eb0cc1bf0ba62a
[ "MIT" ]
367
2016-10-21T03:50:22.000Z
2022-03-28T23:35:25.000Z
src/ru/2022-01/01/05.md
Adventech/sabbath-school-lessons
baf65ac98fa7c7bce73e16c263eb0cc1bf0ba62a
[ "MIT" ]
109
2016-08-02T14:32:13.000Z
2022-03-31T10:18:41.000Z
---
title: Band Together
date: 29/12/2021
---

`What did the apostle advise his readers in view of their situation? What lessons can we draw from the Epistle to the Hebrews? How did God help Elijah recover from his discouragement?`

`Read 1 Kings 19:5–18. What did God do to restore Elijah's faith?`

The story of God's dealings with Elijah after Carmel is of deep interest, for it reveals the tender care and wisdom with which God ministers to those who are in distress and struggling to have their faith restored.

The Lord did much for Elijah. First, He cared for the prophet's physical needs: He fed him and gave him rest. Then, at the cave, He gently reproved him, "What doest thou here, Elijah?", and helped him understand more deeply how He works and accomplishes His purposes. God appeared to him not in a raging wind, an earthquake, or a fire, but in a still small voice. Then He gave Elijah a work to do and reassured him.

`Read Heb. 2:1; 3:12–14; 5:11–14; 10:19–25. List the counsel the apostle gives believers in these texts to strengthen their faith:`

In the Epistle to the Hebrews we can find several instructions the apostle gave his readers to help them regain their former strength and faith. One of the points Paul emphasizes is care for the physical needs of fellow believers. He urges them to show hospitality and to visit prisoners, which implies meeting their needs. The apostle calls his readers to generosity, remembering that God will not forsake them (see Heb. 13:1–6).

Paul also reproves and encourages them. He warns believers not to drift away (see 2:1) and not to have "an evil heart of unbelief" (3:12), and he calls them to grow in faith (see 5:11–6:3). He also notes the importance of regularly attending church gatherings (see 10:25).

In summary, the apostle urges his readers to band together, to encourage one another, to show love, and to do good works. He also exalts Jesus and His ministry in the heavenly sanctuary (see 8:1, 2; 12:1–4).
134.857143
886
0.789195
rus_Cyrl
0.997443
1c2ff249848a3e92499be77220a6dfbc3cae764d
15,151
md
Markdown
README.md
ebi-gene-expression-group/atlas-web-bulk
44c2f296c747b7336658b66cd9948bbde2c00c66
[ "Apache-2.0" ]
null
null
null
README.md
ebi-gene-expression-group/atlas-web-bulk
44c2f296c747b7336658b66cd9948bbde2c00c66
[ "Apache-2.0" ]
61
2019-06-17T08:35:17.000Z
2022-03-04T14:34:04.000Z
README.md
ebi-gene-expression-group/atlas-web-bulk
44c2f296c747b7336658b66cd9948bbde2c00c66
[ "Apache-2.0" ]
1
2019-06-17T08:34:25.000Z
2019-06-17T08:34:25.000Z
# Expression Atlas ## Requirements - Docker v19+ - Docker Compose v1.25+ - 100 GB of available storage (experiment files, PostgreSQL and Solr backup snapshots and Docker volumes) Notice that PostgreSQL and Solr snapshots are [`bind` mounted](https://docs.docker.com/storage/bind-mounts/) in order to move data back and forth from the containers. Actual files managed by either Solr or PostgreSQL are kept in volumes which will be reused even if the containers are removed or brought down by Docker Compose. If you want to start afresh delete the old volume (e.g. for Postgres `docker volume rm gxa-pgdata`) and re-run the necessary step to return to the initial state. ## Code Clone the repository of Bulk Expression Atlas proper: ```bash git clone --recurse-submodules https://github.com/ebi-gene-expression-group/atlas-web-bulk.git ``` If you have already cloned the project ensure it’s up-to-date: ```bash git pull git submodule update --remote ``` ## Data Choose a suitable location for the experiment files, database and Solr backup data. Set the path in the variable `ATLAS_DATA_PATH`. To download the data you can use `rsync` if you’re connected to the EBI network (over VPN or from campus): ```bash ATLAS_DATA_PATH=/path/to/bulk/atlas/data rsync -ravz ebi-cli:/nfs/ftp/pub/databases/microarray/data/atlas/test/gxa/* $ATLAS_DATA_PATH ``` Alternatively you can use `wget` and connect to EBI’s FTP server over HTTP: ```bash wget -P $ATLAS_DATA_PATH -c --reject="index.html*" --recursive -np -nc -nH --cut-dirs=7 --random-wait --wait 1 -e robots=off http://ftp.ebi.ac.uk/pub/databases/microarray/data/atlas/test/gxa/ ``` Notice that either way `ATLAS_DATA_PATH` will be created for you if the directory doesn’t exist. ## Bring up the environment Besides `ATLAS_DATA_PATH` you need to set some variables for the Postgres container. Use the settings below and replace `ATLAS_DATA_PATH` value to the directory you set up in the first step. 
In the `atlas-web-bulk-cell/docker` directory run the following: ```bash ATLAS_DATA_PATH=/path/to/bulk/atlas/data \ POSTGRES_HOST=gxa-postgres \ POSTGRES_DB=gxpgxadev \ POSTGRES_USER=atlasprd3 \ POSTGRES_PASSWORD=atlasprd3 \ docker-compose -f docker-compose-solrcloud.yml -f docker-compose-postgres.yml -f docker-compose-tomcat.yml up ``` You can also set a Docker Compose *Run* configuration in IntelliJ IDEA with the environment variables from the command above if you find that more convenient. After bringing up the containers, you may want to inspect the logs to see that all services are running fine. The last log should come from Tomcat, and it should be something similar to: ``` gxa-tomcat | 18-Dec-2020 13:40:58.907 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 6705 ms ``` Now let’s populate both the Postgres database and the SolrCloud collections. ### Postgres Run the following command to restore Postgres data from the provided `pg-dump.bin` file: ```bash docker exec -it gxa-postgres bash -c 'pg_restore -d $POSTGRES_DB -h localhost -p 5432 -U $POSTGRES_USER --clean /var/backups/postgresql/pg-dump.bin' ``` A few minutes later your Postgres database will be ready. ### SolrCloud Use the provided `Dockerfile` to bootstrap SolrCloud: ```bash docker build -t gxa-solrcloud-bootstrap . docker run -i --rm --network gxa gxa-solrcloud-bootstrap ``` You will see many warnings or errors in Solr’s responses. That’s alright and to be expected, since the scripts that create the config sets, collections and define the schemas will attempt first to remove them to start from a clean, known state; however Solr will reply with an error if the collections can’t be deleted. Again, this step will take a few minutes. ### Tomcat Copy the Tomcat credentials file to the container. The `admin` role is used to access several admin endpoints in Bulk Expression Atlas (e.g. `/admin/experiments/help`). 
Tomcat’s `conf` directory is persisted as a volume so that we need to do this only once: ```bash docker cp tomcat-users.xml gxa-tomcat:/usr/local/tomcat/conf ``` Run the Gradle task `war` in the `atlas-web-bulk` directory: ```bash cd atlas-web-bulk ./gradlew :app:war ``` You should now have the file `build/libs/gxa.war` which by default Tomcat’s naming conventions will be served at `gxa`. Point your browser at `http://localhost:8080/gxa` and voilà! Every time you re-run the `war` task the web app will be automatically re-deployed by Tomcat. If you face issues in redeployment, stop all running containers and re-run them ## Backing up your data Eventually you’ll add new experiments to your development instance of GXA, or new, improved collections in Solr will replace the old ones. In such cases you’ll want to get a snapshot of the data to share with the team. Below there are instructions to do that. ### PostgreSQL If at some point you wish to create a backup dump of the database run the command below: ```bash docker exec -it gxa-postgres bash -c 'pg_dump -d $POSTGRES_DB -h localhost -p 5432 -U $POSTGRES_USER -f /var/backups/postgresql/pg-dump.bin -F c -n $POSTGRES_USER -t $POSTGRES_USER.* -T *flyway*' ``` ### SolrCloud ```bash for SOLR_COLLECTION in $SOLR_COLLECTIONS do START_DATE_IN_SECS=`date +%s` curl "http://localhost:8983/solr/${SOLR_COLLECTION}/replication?command=backup&location=/var/backups/solr&name=${SOLR_COLLECTION}" # Pattern enclosed in (?<=) is zero-width look-behind and (?=) is zero-width look-ahead, we match everything in between COMPLETED_DATE=`curl -s "http://localhost:8983/solr/${SOLR_COLLECTION}/replication?command=details" | grep -oP '(?<="snapshotCompletedAt",").*(?=")'` COMPLETED_DATE_IN_SECS=`date +%s -d "${COMPLETED_DATE}"` # We wait until snapshotCompletedAt is later than the date we took before issuing the backup operation while [ ${COMPLETED_DATE_IN_SECS} -lt ${START_DATE_IN_SECS} ] do sleep 1s COMPLETED_DATE=`curl -s 
"http://localhost:8983/solr/${SOLR_COLLECTION}/replication?command=details" | grep -oP '(?<="snapshotCompletedAt",").*(?=")'` COMPLETED_DATE_IN_SECS=`date +%s -d "${COMPLETED_DATE}"` done done ``` ### Update test data Remember to update the file and any new experiments added to the `filesystem` directory by syncing your `ATLAS_DATA_PATH` with `/nfs/ftp/pub/databases/microarray/data/atlas/test/gxa`: ```bash rsync -ravz $ATLAS_DATA_PATH/* ebi-cli:/nfs/ftp/pub/databases/microarray/data/atlas/test/gxa/ ``` ## Testing **Note: A few tests depend on the Solr suggesters, so don’t forget to build them in the SolrCloud container!** The project has a `docker-compose-gradle.yml` in the `docker` directory to run tests within a Gradle Docker container. It reuses the same SolrCloud service described earlier, and a Postgres container with minor variations: it doesn’t use volumes to ensure the database is clean before running any tests, and its name (and the dependency expressed in `docker-compose-gradle.yml`) has been changed to `gxa-postgres-test`; the reason is to avoid using `gxa-postgres` by mistake and wiping full tables when cleaning fixtures... such an unfortunate accident is known to have happened. Depending on your OS and Docker settings you might be able to run Gradle from your host machine without a container, and access Solr, ZooKeeper and Postgres via mapped ports on `localhost`. We know this is possible in Linux, but we’ve found it to be problematic in macOS. It’s probably due to the way DNS in Docker Compose works (i.e., ZooKeeper resolves `localhost` to an unknown IP address). As they say, networking is hard. YMMV. ### Before you start Check with `docker ps` and `docker container ls -a` that no services used during tests are running or stopped, respectively. These are `gxa-solrcloud-1`, `gxa-solrcloud-2`, `gxa-zk-1`, `gxa-zk-2`, `gxa-zk-3`, `gxa-postgres-test`, `gxa-flyway-test` and `gxa-gradle`. 
We want to start with a clean application context every time we execute the test task. Here are two useful commands: ```bash docker stop gxa-solrcloud-1 gxa-solrcloud-2 gxa-zk-1 gxa-zk-2 gxa-zk-3 gxa-postgres-test gxa-flyway-test && docker rm gxa-solrcloud-1 gxa-solrcloud-2 gxa-zk-1 gxa-zk-2 gxa-zk-3 gxa-postgres-test gxa-flyway-test gxa-gradle ``` ### Running tests As mentioned before, `docker-compose-gradle.yml` runs the Gradle `test` task and it depends on all the necessary services to run unit tests, integration tests and end-to-end tests. It splits the job in the following six phases: 1. Clean the build directory 2. Compile the test classes 3. Run unit tests 4. Run integration tests 5. Run end-to-end tests 6. Generate JaCoCo reports Bring it up like this (the Postgres variables can take any values, remember that the container will be removed): ```bash ATLAS_DATA_PATH=/path/to/your/gxa/data \ POSTGRES_HOST=gxa-postgres-test \ POSTGRES_DB=gxpgxatest \ POSTGRES_USER=gxa \ POSTGRES_PASSWORD=gxa \ docker-compose \ -f docker-compose-postgres-test.yml \ -f docker-compose-solrcloud.yml \ -f docker-compose-gradle.yml \ up ``` You will eventually see these log messages: ``` gxa-gradle | BUILD SUCCESSFUL in 13s gxa-gradle | 3 actionable tasks: 1 executed, 2 up-to-date gxa-gradle exited with code 0 ``` Press Ctrl+C to stop the container and clean any leftovers: ```bash docker stop gxa-solrcloud-1 gxa-solrcloud-2 gxa-zk-1 gxa-zk-2 gxa-zk-3 gxa-postgres-test gxa-flyway-test && docker rm gxa-solrcloud-1 gxa-solrcloud-2 gxa-zk-1 gxa-zk-2 gxa-zk-3 gxa-postgres-test gxa-flyway-test gxa-gradle ``` If you prefer, here’s a `docker-compose run` command to execute the tests: ```bash ATLAS_DATA_PATH=/path/to/your/gxa/data \ POSTGRES_HOST=gxa-postgres-test \ POSTGRES_DB=gxpgxatest \ POSTGRES_USER=gxa \ POSTGRES_PASSWORD=gxa \ docker-compose \ -f docker-compose-postgres-test.yml \ -f docker-compose-solrcloud.yml \ -f docker-compose-gradle.yml \ run --rm --service-ports \ gxa-gradle 
bash -c ' ./gradlew :app:clean && ./gradlew -PdataFilesLocation=/root/gxa/integration-test-data -PexperimentFilesLocation=/root/gxa/integration-test-data/gxa -PjdbcUrl=jdbc:postgresql://$POSTGRES_HOST:5432/$POSTGRES_DB -PjdbcUsername=$POSTGRES_USER -PjdbcPassword=$POSTGRES_PASSWORD -PzkHost=gxa-zk-1 -PsolrHost=gxa-solrcloud-1 app:testClasses && ./gradlew -PtestResultsPath=ut :app:test --tests *Test && ./gradlew -PtestResultsPath=it -PexcludeTests=**/*WIT.class :app:test --tests *IT && ./gradlew -PtestResultsPath=e2e :app:test --tests *WIT && ./gradlew :app:jacocoTestReport ' ``` With `run` the control returns to your shell once the tasks have finished, but you’ll need to clean up the service containers anyway. In either case you may find all reports at `app/build/reports`. ### Running a single test Many times you will find yourself working in a specific test case or class. Running all tests in such cases is impractical. In such situations you can use [Gradle’s continuous build execution](https://blog.gradle.org/introducing-continuous-build). See the example below for e.g. `SitemapDaoIT.java`: ```bash ATLAS_DATA_PATH=/path/to/your/gxa/data \ POSTGRES_HOST=gxa-postgres-test \ POSTGRES_DB=gxpgxatest \ POSTGRES_USER=gxa \ POSTGRES_PASSWORD=gxa \ docker-compose \ -f docker-compose-postgres-test.yml \ -f docker-compose-solrcloud.yml \ -f docker-compose-gradle.yml \ run --rm --service-ports \ gxa-gradle bash -c ' ./gradlew :app:clean && ./gradlew -PdataFilesLocation=/root/gxa/integration-test-data -PexperimentFilesLocation=/root/gxa/integration-test-data/gxa -PjdbcUrl=jdbc:postgresql://$POSTGRES_HOST:5432/$POSTGRES_DB -PjdbcUsername=$POSTGRES_USER -PjdbcPassword=$POSTGRES_PASSWORD -PzkHost=gxa-zk-1 -PsolrHost=gxa-solrcloud-1 app:testClasses && ./gradlew --continuous :app:test --tests SitemapDaoIT ' ``` After running the test Gradle stays idle and waits for any changes in the code. 
When it detects that the files in your project have been updated it will recompile them and run the tests again. Notice that you can specify multiple test files after `--tests` (by name or with wildcards). ### Remote debugging If you want to use a debugger, add the option `-PremoteDebug` to the test task line. For instance: ```bash ./gradlew -PremoteDebug :app:test --tests SitemapDaoIT ``` Be aware that Gradle won’t execute the tests until you attach a remote debugger to port 5005. It will notify you when it’s ready with the following message: ``` > Task :app:test Listening for transport dt_socket at address: 5005 <===========--> 90% EXECUTING [5s] > IDLE > IDLE > IDLE > IDLE > IDLE > IDLE > IDLE > :app:test > 0 tests completed > IDLE > IDLE > IDLE > IDLE ``` You can combine `--continuous` with `-PremoteDebug`, but the debugger will be disconnected at the end of the test. You will need to start and attach the remote debugger every time Gradle compiles and runs the specified test. To attach a remote debugger to your Gradle test, you can add the following configuration in IntelliJ: [![RMoIhF.md.png](https://iili.io/RMoIhF.md.png)](https://freeimage.host/i/RMoIhF) ## Troubleshooting ### SolrCloud nodes shut down on macOS Docker for macOS sets fairly strict resource limits for all Docker containers. If your containers require e.g. more memory you need to increase the available amount in the Docker Dashboard. For bulk Expression Atlas, please set Memory to between 8-12 GB and disk image to 100 GB or more. Please see the screenshot below for reference: ![Screenshot-2021-02-18-at-18-27-40](https://user-images.githubusercontent.com/4425744/109644570-8ccee680-7b4d-11eb-9db0-7a29fb4d9e2b.png) ### The script that backs up Solr snapshot hangs Ensure you have writing privileges for the directory bound at `/var/backups/solr`. 
You can check the status of your backup operation with (set `SOLR_HOST` and `SOLR_COLLECTION` to the appropriate values):

```bash
docker exec -i ${SOLR_HOST} curl -s "http://localhost:8983/solr/${SOLR_COLLECTION}/replication?command=details"
```

### I’m not getting any suggestions in Expression Atlas
Read the important message after you run `gxa-solrlcoud-bootstrap`:

> PLEASE READ!
> Suggesters haven’t been built because it’s very likely to get a `java.net.SocketTimeoutException` due
> to the size of the bioentities collection. Raising the timeout in Jetty could mask other errors down
> the line, and ignoring the exception doesn’t guarantee the suggester to be fully built since it still
> takes a few extra minutes: the exception is thrown before the process has completed.
> The best option is to manually build and supervise this step.
>
> On one terminal session run the following command (don’t worry if the request returns a 500 error):
>
> `docker exec -i gxa-solrcloud-1 curl 'http://localhost:8983/solr/bioentities-v1/suggest?suggest.build=true&suggest.dictionary=propertySuggester'`
>
> On another terminal, monitor the size of the suggester directory:
>
> `docker exec -it gxa-solrcloud-1 bash -c 'watch du -sc server/solr/bioentities-v1*/data/*'`
> `docker exec -it gxa-solrcloud-2 bash -c 'watch du -sc server/solr/bioentities-v1*/data/*'`
>
> The suggester will be built when the propertySuggester directory size stabilises.
> Run the above procedure for each of your SolrCloud containers.
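The “directory size stabilises” check can be automated instead of eyeballed. Below is a minimal sketch of such a poll loop; the container name, the `bioentities-v1` core glob, and the 60-second interval come from the text above, while the helper names (`suggester_size`, `size_stable`, `wait_for_suggester`) are illustrative inventions, not part of the project’s tooling.

```shell
#!/usr/bin/env bash
# Sketch: wait until the Solr suggester directory size stabilises.

# Total size (in KB) of the suggester data directories in a container,
# as reported by the `du -sc` command quoted above.
suggester_size() {
  docker exec "$1" bash -c \
    'du -sc server/solr/bioentities-v1*/data/* | tail -n 1 | cut -f1'
}

# Pure helper: two consecutive readings count as "stable" when equal.
size_stable() {
  [ "$1" -eq "$2" ]
}

# Poll until two consecutive readings match, then report.
wait_for_suggester() {
  local container="$1" prev="" curr
  while true; do
    curr=$(suggester_size "$container")
    if [ -n "$prev" ] && size_stable "$curr" "$prev"; then
      echo "suggester on $container looks fully built (${curr} KB)"
      return 0
    fi
    prev="$curr"
    sleep 60
  done
}

# Usage (against a running container): wait_for_suggester gxa-solrcloud-1
```

Run it once per SolrCloud container, as the bootstrap message asks.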
+++ title = "Writing style guide" description = "This page contains our writing guidelines for tutorials and, in general, accessible writing." author = "nathan" date = 2019-12-14T21:49:21+01:00 weight = 4 +++ This page explains our writing style and the guidelines we follow to write clearly. Our style aims at being accessible, clear, and informative, with a touch of personality. It supports our mission and goals: 1. **Bringing people together**. Making them feel welcome and fostering collaboration. 1. **Sharing openly**. We share our knowledge and tools, giving back to the community. ## Our Tone The tone of an article helps to keep readers interested and engaged. It also shows our personality. Our tone is **kind**, **genuine**, **clear**, **accessible**, **inclusive**, and **professional**. To achieve that, avoid exaggerations, unnecessary jargon, but also colloquial expressions. Write as if you were addressing a fellow professional or an adult student directly in a one-on-one setting. Aim for a welcoming feel without being overly close. Address the reader directly with "you." When talking about yourself or the team, you can use "I" or "we." When talking about third parties, favor pronouns like "they" or "them." Also, while we should keep the style of documents like these consistent and formal, you're encouraged to show your personality in news posts and devlogs. ## The writing workflow Here is the tutorial writing workflow we should always follow. In short: 1. Prepare the final code for the tutorial. 2. Code review. 3. Write an outline for each lesson or tutorial series. - Show the result and list what the person will learn at the start. - Use sub-headings to structure your content. - Focus on the general techniques, principles, problems to solve, or reasons for a given code structure. Skim over the steps or implementation details. - Write the start of paragraphs, paste code snippets, or use pictures over bullet-lists. 
That is to say, content from which you can directly build your tutorial.
4. Outline review.
5. Write the tutorial, building from the outline.

Working with an editor is essential to get the perspective of someone who hasn't researched the subject matter as much as you did. Also, we cannot pick up all the mistakes we make alone.

### Focus on what matters most

Explain where you are going, or why you design your code or nodes in a certain way, in the introduction, before giving step-by-step instructions. This background is a vital part of the tutorial to me.

Depending on the topic, focus on:

- The problems to solve.
- The challenges involved.
- Breaking down an effect or reference visually (game art, game design).
- The design principles that apply to the task at hand.
- How other people may have solved that problem.

Explaining underlying causes or concepts helps the reader follow along and stresses the most important aspects to learn: problem-solving, transferable knowledge and techniques.

Then, you can break down the steps to reach the tutorial's goal. You can also add general explanations or background to the rest of the tutorial.

## Dos and don'ts

Below, you will find specific guidelines that help to communicate ideas clearly to a broad audience, including non-native English speakers.

For technical writing, that is to say, manuals and code references, we also follow the [Godot technical writing guidelines](//docs.godotengine.org/en/latest/community/contributing/docs_writing_guidelines.html).

To start with, use American English. It is the standard in technical writing and for many free software projects.

### Write for the least experienced reader

Write with the learners in your audience who understand the topic the least in mind. Doing so makes your articles more accessible, it shows your mastery, and it saves the readers' time.

Here are some tips to achieve that goal:

1. Avoid technical jargon and complicated concepts.
1. Use fundamental concepts to break down complex or abstract ideas.
1. Use plain language rather than uncommon words.
1. Be as clear and as precise as you can.

For reference, check out the US government's [list of simple word alternatives](//plainlanguage.gov/guidelines/words/use-simple-words-phrases/).

### Use the direct voice

Using the direct voice leads to shorter sentences compared to the passive voice. It makes the action clear from the first few words.

Avoid the passive voice:

> The `update_items` function is used by the inventory system.

Favor the direct voice:

> The inventory system uses the `update_items` function.

### Keep sentences short

**Keep sentences under 25 words**. Favor short sentences that each communicate one idea.

Use paragraphs to group sentences related to a broader idea together. Whenever you change the topic or move on to another concept, add a new paragraph.

### Break up paragraphs

Long paragraphs, like long sentences, make the text harder to follow. Give the reader some breathing room and structure your articles in a way that supports your story. Use headings, lists, and short paragraphs to structure your writing.

### Ensure pronouns have a clear antecedent

Do not start a sentence with pronouns like "this" or "that" alone. Too often, these pronouns are ambiguous.

Avoid ambiguous pronouns:

> Update the `velocity` and call `move_and_slide()`. This makes the character move.

In the sentence above, "This" could refer to updating the velocity, calling `move_and_slide()`, or both. Instead, specify what the pronoun refers to:

> Update the `velocity` and call `move_and_slide()`. This function makes the character move.

## Technical writing and tutorials

The following guidelines are more specific to writing code documentation and tutorials. We share some conventions between the two for consistency.

### Formatting rules

When mentioning labels as seen in the editor, including node names, dock names, and property names, use _italics_.
This helps the user to find them in the interface and distinguishes them from code.

> Select the _Blueprint_ node and in the _Inspector_, set its _Value_ to `5`.

Write labels as they appear in the interface or for the user, with title case. For example, Godot capitalizes property names and settings by default:

> Select the node and in the _Inspector_, change the _Initial Velocity -> Velocity_'s `x` to `-100`.

Use `inline code` when mentioning symbols, i.e., variable names, function names, and any code in a sentence. Also use it for values, as in "set the _Health_ to `10`." Absolute and relative file paths should also be inline code, as they appear in scripts: `res://path/to/file.tres` and `file.tres`.

Use parentheses with function names to differentiate them from variables:

> We call the `update()` function. [...] We increment the `count` on every loop iteration.

For properties nested in a foldable category of the _Inspector_ or in sub-menus, use arrows, like so: _Collision -> Layer_ or _Debug -> Visible Collision_.

### Spell out numbers, except in code

Write numbers in words when counting objects, except if the numbers in question refer to a value in the code. For example:

> Create two _Control_ nodes as siblings. Resize the first node to take two-thirds of the viewport's width.

Here is an example with code:

> Set the `max_health` to `5`.

### Use plain English over symbols

Avoid replacing words like "and" with "&," or using the slash "/" instead of "or."

## Structure

This section focuses on the structure of tutorials, articles, and <abbr title="HyperText Markup Language">HTML</abbr> elements.

### Headings

Use Title Case for document titles: "Getting Started with Godot." For other headings, only capitalize the first word: "Coding the character."

Always write a paragraph after a heading, including an introduction following a page's title.

On the web, only the document's title should use an H1 heading. Use H2 for sections, and H3 for sub-sections.
Avoid nesting sub-sections past the H4 level. ### Create meaningful links For links, write a label that describes the action the user is taking, or the page they are going to arrive on. For example, instead of [official Twitter account](//twitter.com/NathanGDQuest), use [follow GDQuest on Twitter](https://twitter.com/NathanGDQuest). Some more tips: - Explain where the links lead and why through their label. - Links should help the user scan the page for essential information and related resources. ## Resources Our guidelines are inspired by: 1. The [Harvard writing guide](https://library.harvard.edu/writing-guide). 1. The US government's [plain language guidelines](https://plainlanguage.gov/guidelines/). 1. Write the Docs's [style guides section](https://www.writethedocs.org/guide/writing/style-guides/) for technical writers.
Thank you for reporting an issue or suggesting an enhancement. We appreciate your feedback - to help the team understand your needs, please complete the template below to ensure we have the necessary details to assist you.

#### Category
[ ] Question
[ ] Bug
[ ] Enhancement

#### Expected or Desired Behavior
_If you are reporting a bug, please describe the expected behavior. If you are suggesting an enhancement, please describe it thoroughly: how it can be achieved and its expected benefit._

#### Observed Behavior
_If you are reporting a bug, please describe the behavior that actually occurred when performing the action. If you are making a suggestion, you can delete this section._

#### Steps to Reproduce
_If you are reporting a bug, please describe the steps to reproduce it in sufficient detail to allow testing. The only way to fix things properly is to have sufficient details to reproduce the issue. If you are making a suggestion, you can delete this section._

#### Submission Guidelines
_Delete this section after reading_

- All suggestions or bugs are welcome; please let us know what's on your mind.
- If you are reporting an issue with any of the samples, please include a clear reference to the sample and, if possible, the code file that should be fixed.
- If you have technical questions about the framework, we’ll be monitoring #spfx, #spfx-webparts, and #spfx-tooling on [SharePoint StackExchange](http://sharepoint.stackexchange.com/). You can also alternatively submit your question to the [SharePoint Developer group](https://network.office.com/t5/SharePoint-Developer/bd-p/SharePointDev) at the Microsoft Technical Network.
- Remember to include sufficient details and context.
- If you have multiple suggestions or bugs, please submit them as separate issues so we can track resolution.

Thanks for your contribution! Sharing is caring.
Notable changes
===============

Sprout to Sapling Migration Tool
--------------------------------
This release includes the addition of a tool that will enable users to migrate shielded funds from the Sprout pool to the Sapling pool while minimizing information leakage.

The migration can be enabled using the RPC `z_setmigration` or by including `-migration` in the `arnak.conf` file. Unless otherwise specified, funds will be migrated to the wallet's default Sapling address; it is also possible to set the receiving Sapling address using the `-migrationdestaddress` option in `arnak.conf`.

See [ZIP308](https://github.com/arnak/zips/blob/master/zip-0308.rst) for full details.

New consensus rule: Reject blocks that violate turnstile
--------------------------------------------------------
In the 2.0.4 release the consensus rules were changed on testnet to enforce a consensus rule which marks blocks as invalid if they would lead to a turnstile violation in the Sprout or Shielded value pools. **This release enforces the consensus rule change on mainnet.**

The motivations and deployment details can be found in the accompanying [ZIP draft](https://github.com/arnak/zips/pull/210) and [PR 3968](https://github.com/michailduzhanski/arnak/pull/3968).

Developers can use a new experimental feature `-developersetpoolsizezero` to test Sprout and Sapling turnstile violations. See [PR 3964](https://github.com/michailduzhanski/arnak/pull/3964) for more details.

64-bit ARMv8 support
--------------------
Added ARMv8 (AArch64) support. This enables users to build arnak on even more devices. For information on how to build, see the [User Guide](https://arnak.com).

Users on the Arnak forum have reported successes with both the Pine64 Rock64Pro and Odroid C2, which contain 4GB and 2GB of RAM respectively. Just released, the Odroid N2 looks like a great solution with 4GB of RAM. The newly released Jetson Nano Developer Kit from Nvidia (also 4GB of RAM) is also worth a look.
The NanoPC-T3 Plus is another option but for the simplest/best experience choose a board with 4GB of RAM. Just make sure before purchase that the CPU supports the 64-bit ARMv8 architecture. Changelog ========= Braydon Fuller (1): tests: adds unit test for IsPayToPublicKeyHash method Dimitris Apostolou (1): Electric Coin Company Eirik0 (27): Split test in to multiple parts Use a custom error type if creating joinsplit descriptions fails Rename and update comment Add rpc to enable and disable Sprout to Sapling migration Move migration logic to ChainTip Documentation cleanup Additional locking and race condition prevention Refactor wait_and_assert_operationid_status to allow returning the result Set min depth when selecting notes to migrate Check for full failure message in test case Add migration options to conf file Create method for getting HD seed in RPCs Add rpc to get Sprout to Sapling migration status Fix help message Test migration using both the parameter and the default Sapling address Fix typos and update documentation use -valueBalance rather than vpub_new to calculate migrated amount Do not look at vin/vout when determining migration txs and other cleanup Calculate the number of confimations in the canonical way Do not throw an exception if HD Seed is not found when exporting wallet make-release.py: Versioning changes for 2.0.5-rc1. make-release.py: Updated manpages for 2.0.5-rc1. make-release.py: Updated release notes and changelog for 2.0.5-rc1. Notable changes for v2.0.5 Add missing word to release notes make-release.py: Versioning changes for 2.0.5. make-release.py: Updated manpages for 2.0.5. 
Gareth Davies (1): Adding addressindex.h to Makefile.am Ian Munoz (1): add curl to package list for gitian lxc container Jack Grigg (9): Add Sprout support to TransactionBuilder depends: Use full path to cargo binary depends: Generalise the rust package cross-compilation functions depends: Add rust-std hash for aarch64-unknown-linux-gnu depends: Compile bdb with --disable-atomics on aarch64 depends: Update .gitignore configure: Guess -march for libsnark OPTFLAGS instead of hard-coding Add Blossom to upgrade list init: Fix new HD seed generation for previously-encrypted wallets Larry Ruane (6): fix enable-debug build DB_COINS undefined add -addressindex changes for bitcore insight block explorer add -spentindex changes for bitcore insight block explorer Update boost from v1.69.0 to v1.70.0. #3947 add -timestampindex for bitcore insight block explorer 3873 z_setmigration cli bool enable arg conversion Marius Kjærstad (1): Update _COPYRIGHT_YEAR in configure.ac to 2019 Mary Moore-Simmons (1): Creates checklist template for new PRs being opened and addresses Str4d's suggestion for using GitHub handles Simon Liu (5): Add testnet and regtest experimental feature: -developersetpoolsizezero Add qa test for experimental feature: -developersetpoolsizezero Enable ZIP209 on mainnet and set fallback Sprout pool balance. Enable experimental feature -developersetpoolsizezero on mainnet. Update chain work and checkpoint using block 525000. Jack Grigg (1): remove extra hyphen zebambam (1): Minor speling changes
---
title: C28195
ms.date: 11/04/2016
ms.prod: visual-studio-dev15
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- C28195
helpviewer_keywords:
- C28195
ms.assetid: 89524043-215e-4932-8079-ca2084d32963
author: mikeblome
ms.author: mblome
manager: wpickett
ms.workload:
- multiple
ms.openlocfilehash: af205cf63c97c4edf1eb88bef90852131b2facb4
ms.sourcegitcommit: e13e61ddea6032a8282abe16131d9e136a927984
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 04/26/2018
---
# <a name="c28195"></a>C28195

Warning C28195: The function was declared as acquiring memory in a variable, but exited without the memory being acquired.

This warning indicates that the prototype for the function being analyzed has a `__drv_acquiresMemory` annotation. The `__drv_acquiresMemory` annotation indicates that the function acquires memory in the specified result location, but on at least one path the function returned without acquiring it. The Code Analysis tool does not recognize the actual implementation of a memory allocator (including address arithmetic), so it cannot tell that memory was allocated (although many wrappers are recognized). In that case the Code Analysis tool does not see the allocation and issues this warning. To suppress a false positive, use a `#pragma` warning on the line preceding the opening brace `{` of the function body.
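For illustration, here is a minimal, hedged sketch of the pattern this warning targets. The `__drv_acquiresMemory` annotation normally comes from the Windows driver SAL headers; the stub macro below exists only so the sketch compiles outside MSVC, and `allocate_from_pool` is a made-up stand-in for a custom allocator that Code Analysis cannot see through.

```c
#include <assert.h>
#include <stdlib.h>

/* Stub so the sketch compiles without the driver SAL headers. */
#ifndef __drv_acquiresMemory
#define __drv_acquiresMemory(x)
#endif

/* Declared as acquiring memory in *buffer. If any path returned
 * without assigning *buffer a fresh allocation, Code Analysis
 * would report C28195 for that path. */
__drv_acquiresMemory(*buffer)
static int allocate_buffer(void **buffer, size_t size)
{
    *buffer = malloc(size);
    return *buffer != NULL ? 0 : -1;
}

/* A wrapper allocator the tool cannot see through: the pragma on
 * the line preceding the opening brace suppresses the false
 * positive, as the text above recommends. */
__drv_acquiresMemory(*buffer)
static int allocate_from_pool(void **buffer, size_t size)
#pragma warning(suppress : 28195)
{
    *buffer = malloc(size); /* stand-in for custom pool arithmetic */
    return *buffer != NULL ? 0 : -1;
}
```

Outside MSVC the pragma is an unknown pragma and is simply ignored, so the suppression is harmless on other compilers.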
---
layout: post
title: "Why NumPy is fast - 2"
date: 2020-09-02 19:59:00 +0900
categories: TIL
---

![Difference between ndarray and Python's list](https://image.slidesharecdn.com/numpy20160519-160516164831/95/numpy-8-638.jpg)

# Why Python lists are slow
- A Python list is, in the end, an array of pointers
- Depending on the situation, the individual objects can be scattered all over memory
- That makes it hard to make good use of the cache

# Why NumPy's ndarray is fast
- An ndarray stores its data as an array of elements with an explicit type
- Even multi-dimensional data gets one contiguous block of memory
- Many operations can be done efficiently by making good use of dimensions and strides
- For example, transpose is almost free because it only swaps the strides
- Keeping the ndarray implementation in mind helps you reason about where the performance comes from
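The "transpose only swaps the strides" point can be demonstrated without NumPy itself. The sketch below mimics an ndarray with a flat list plus shape/strides metadata (a deliberate simplification of the real C implementation; strides are counted in elements rather than bytes):

```python
# Minimal sketch of an ndarray: one flat, contiguous buffer plus
# shape/strides metadata describing how to index into it.
class MiniArray:
    def __init__(self, data, shape, strides):
        self.data = data          # flat buffer, like ndarray's memory
        self.shape = shape
        self.strides = strides

    def __getitem__(self, idx):
        # Address arithmetic: offset = sum(index * stride).
        i, j = idx
        return self.data[i * self.strides[0] + j * self.strides[1]]

    def transpose(self):
        # No data is copied: only the metadata is reversed,
        # which is why transpose is almost free.
        return MiniArray(self.data, self.shape[::-1], self.strides[::-1])

# 2x3 row-major array representing [[0, 1, 2], [3, 4, 5]].
a = MiniArray(list(range(6)), shape=(2, 3), strides=(3, 1))
t = a.transpose()
print(a[1, 2], t[2, 1])   # same element through both views: 5 5
print(t.data is a.data)   # True: the buffer is shared, nothing copied
```

Real NumPy does the same bookkeeping in C, which is also what makes contiguous, typed buffers so cache-friendly compared to lists of pointers.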
--- title: "Restore content from unattached content databases in SharePoint Server" ms.author: stevhord author: bentoncity manager: pamgreen ms.date: 9/14/2017 ms.audience: ITPro ms.topic: article ms.prod: sharepoint-server-itpro localization_priority: Normal ms.collection: - IT_Sharepoint_Server - IT_Sharepoint_Server_Top ms.assetid: 40ed4458-d798-4aa1-82b9-2e5433991596 description: "Summary: Learn how to restore content from an unattached content database in SharePoint Server 2016 and SharePoint 2013." --- # Restore content from unattached content databases in SharePoint Server **Summary:** Learn how to restore content from an unattached content database in SharePoint Server 2016 and SharePoint 2013. You can restore content from an unattached content database in SharePoint Server by using the SharePoint Central Administration website or PowerShell. The restore tool that you use depends on the kind of environment that you have deployed, your schedule requirements, and service level agreements that you have made with your organization. You can restore or copy content, such as sites, site collections, lists, or document libraries, from a content database without having to attach the content database to the farm. ## Using PowerShell to recover content from an unattached content database in SharePoint Server <a name="proc1"> </a> You can recover content from an unattached content database by using PowerShell. The following procedure shows how to use the `Get-SPContentDatabase` cmdlet to recover content from an unattached content database. You can also import a list or document library with the `Import-SPWeb` cmdlet. For more information, see [Import a list or document library in SharePoint Server](import-a-list-or-document-library.md). **To recover content from an unattached content database by using PowerShell** 1. Verify that you have the following memberships: - **securityadmin** fixed server role on the SQL Server instance. 
- **db_owner** fixed database role on all databases that are to be updated. - Administrators group on the server on which you are running the PowerShell cmdlets. An administrator can use the **Add-SPShellAdmin** cmdlet to grant permissions to use SharePoint Server cmdlets. > [!NOTE] > If you do not have permissions, contact your Setup administrator or SQL Server administrator to request permissions. For additional information about PowerShell permissions, see [Add-SPShellAdmin](http://technet.microsoft.com/library/2ddfad84-7ca8-409e-878b-d09cb35ed4aa.aspx). 2. Start the SharePoint Management Shell. 3. At the PowerShell command prompt, type the following command: ``` Get-SPContentDatabase -ConnectAsUnattachedDatabase -DatabaseName <DatabaseName> -DatabaseServer <DatabaseServer> ``` Where: - _\<DatabaseName\>_ is the name of the unattached database from which you want to recover content. - _\<DatabaseServer\>_ is the name of the database server that hosts the unattached database from which you want to recover content. For more information, see [Get-SPContentDatabase](http://technet.microsoft.com/library/a4a83bb0-0bab-4cad-9b59-0fd89a16f57b.aspx). > [!NOTE] > We recommend that you use Microsoft PowerShell when performing command-line administrative tasks. The Stsadm command-line tool has been deprecated, but is included to support compatibility with previous product versions. ## Using Central Administration to recover content from an unattached content database in SharePoint Server <a name="proc2"> </a> You can recover content from an unattached content database by using Central Administration. **To recover content from an unattached content database by using Central Administration** 1. Verify that the user account that is performing this procedure is a member of the Farm Administrators group and is a member of the **db_owner** fixed database role. 2. Start Central Administration. 3. In Central Administration, on the home page, click **Backup and Restore**. 4. 
On the Backup and Restore page, in the **Granular Backup** section, click **Recover data from an unattached content database**.

5. On the Unattached Content Database Data Recovery page, type the database server name in the **Database Server** text box and type the database name in the **Database Name** text box.

6. Select the database authentication method that you want to use.

7. Select the **Browse content** option, and then click **Next**.

8. On the Browse content page, select the site collection, site, and/or list that you want to restore, select the **Backup site collection** or **Export site or list** option, and then click **Next**.

9. Type the file location where you want to store the backup file, and then click **Start Backup**.

For more information about using the **Backup site collection** option, see [Back up site collections in SharePoint Server](back-up-site-collections.md).

If you chose **Export site or list** on the previous page, you must select **Export Full Security** and choose the version that you want to export in the **Export Versions** drop-down menu. For more information about using the **Export site or list** option, see [Export sites, lists, or document libraries in SharePoint Server](export-a-site-list-or-document-library.md).

## See also
<a name="proc2"> </a>

#### Concepts

[Prepare to back up and restore farms in SharePoint Server](prepare-to-back-up-and-restore.md)

[Attach or detach content databases in SharePoint Server](attach-or-detach-content-databases.md)
---
title: Hostmap widget
kind: documentation
description: Display the Datadog host map on your dashboards.
further_reading:
- link: graphing/dashboards/timeboard/
  tag: Documentation
  text: Timeboards
- link: graphing/dashboards/screenboard/
  tag: Documentation
  text: Screenboard
- link: graphing/graphing_json/
  tag: Documentation
  text: Building dashboards with JSON
---

The host map graphs any metric for a subset of your hosts in a single visualization, available from the [Infrastructure Host Map][1] menu:

{{< img src="graphing/widgets/hostmap/hostmap.png" alt="Hostmap" responsive="true" >}}

## Setup

{{< img src="graphing/widgets/hostmap/hostmap_setup.png" alt="Hostmap setup" responsive="true" >}}

### Configuration

Configuring the Hostmap widget is similar to configuring the [main host map page][1]:

1. Choose whether to display `hosts` or `containers`.
2. `Filter by`: choose which hosts/containers to display.
3. `Group by`: aggregate your hosts/containers by one or more tags.
4. Choose a metric to fill the elements of your host map.
5. Optional: choose a metric to size the elements of your host map.
6. Optional: define a color palette along with `min` and `max` values.

### Options

#### Title

Display a custom title for your widget by checking the `Show a Title` checkbox:

{{< img src="graphing/widgets/options/title.png" alt="Widget title" responsive="true" style="width:80%;">}}

Set its size and alignment if desired.

## API

The [JSON schema][2] used for the Hostmap widget is:

```
HOSTMAP_SCHEMA = {
    "type": "object",
    "properties": {
        "type": {"enum": ["hostmap"]},
        "requests": {
            "type": "object",
            "properties": {
                'fill': REQUEST_SCHEMA,
                'size': REQUEST_SCHEMA
            },
            "anyOf": [
                {"required": ["fill"]},
                {"required": ["size"]}
            ],
            "additionalProperties": false
        },
        "node_type": {"enum": ["host", "container"]},
        "no_metric_hosts": {"type": "boolean"},
        "no_group_hosts": {"type": "boolean"},
        "group": {"type": "array", "items": {"type": "string"}},
        "scope": {"type": "array", "items": {"type": "string"}},
        "style": {
            "type": "object",
            "properties": {
                "palette": {"type": "string"},
                "palette_flip": {"type": "boolean"},
                "fill_min": {"type": "string"},
                "fill_max": {"type": "string"}
            },
            "additionalProperties": false
        },
        "title": {"type": "string"}
    },
    "required": ["type", "requests"],
    "additionalProperties": false
}
```

| Parameter | Type | Required | Description |
| ------ | ----- | ----- | -------- |
| `type` | string | yes | Type of widget (use `hostmap` for the Hostmap widget). |
| `requests.fill` | string | yes/no | Query used to fill the map. See the [Request JSON schema documentation][3] to learn how to build the `REQUEST_SCHEMA`. |
| `requests.size` | string | yes/no | Query used to size the map. See the [Request JSON schema documentation][3] to learn how to build the `REQUEST_SCHEMA`. |
| `node_type` | string | no | Type of node to use in the map. Possible values: `host` or `container`. |
| `no_metric_hosts` | Boolean | no | Whether to display hosts with no metrics. |
| `no_group_hosts` | Boolean | no | Whether to display hosts that do not belong to any group. |
| `group` | array of strings | no | List of tag prefixes to use for grouping. |
| `scope` | array of strings | no | List of tags used to filter the map. |
| `style.palette` | string | no | Color palette to apply to the widget. |
| `style.palette_flip` | Boolean | no | Whether to flip the palette's colors. |
| `style.fill_min` | string | no | Minimum value to use for coloring the map. |
| `style.fill_max` | string | no | Maximum value to use for coloring the map. |

## Further Reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: /fr/graphing/infrastructure/hostmap
[2]: /fr/graphing/graphing_json/widget_json
[3]: /fr/graphing/graphing_json/request_json
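Returning to the API schema above, a widget definition that validates against it could look like the sketch below. This is an illustration, not an official example: the metric queries and tag values are made up, and the exact shape of the `fill`/`size` request bodies is governed by the `REQUEST_SCHEMA` referenced in the schema, so adapt them to your own account.

```json
{
  "type": "hostmap",
  "requests": {
    "fill": {"q": "avg:system.cpu.user{*} by {host}"},
    "size": {"q": "avg:system.mem.total{*} by {host}"}
  },
  "node_type": "host",
  "no_metric_hosts": false,
  "no_group_hosts": true,
  "group": ["availability-zone"],
  "scope": ["env:prod"],
  "style": {
    "palette": "green_to_orange",
    "palette_flip": false,
    "fill_min": "10",
    "fill_max": "90"
  },
  "title": "CPU usage by host"
}
```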
packages/create-app/CHANGELOG.md
abhishekjakhar/backstage
[ "Apache-2.0" ]
# @backstage/create-app

## 0.2.0

### Minor Changes

- 6d29605db: Change the default backend plugin mount point to /api
- 5249594c5: Add service discovery interface and implement for single host deployments

  Fixes #1847, #2596

  Went with an interface similar to the frontend DiscoveryApi, since it's dead simple but still provides a lot of flexibility in the implementation.

  Also ended up with two different methods, one for internal endpoint discovery and one for external. The two use-cases are explained a bit more in the docs, but basically it's service-to-service vs callback URLs.

  This did get me thinking about uniqueness and that we're heading towards a global namespace for backend plugin IDs. That's probably fine, but if we're happy with that we should leverage it a bit more to simplify the backend setup. For example we'd have each plugin provide its own ID and not manually mount on paths in the backend.

  Draft until we're happy with the implementation, then I can add more docs and a changelog entry.

  Also didn't go on a thorough hunt for places where discovery can be used, but I don't think there are many since it's been pretty awkward to do service-to-service communication.

- 56e4eb589: Make CSP configurable to fix app-backend served app not being able to fetch

  See discussion [here on discord](https://discordapp.com/channels/687207715902193673/687235481154617364/758721460163575850)

- d7873e1aa: Default to using internal scope for new plugins
- 6f447b3fc: Remove identity-backend

  Not used, and we're heading down the route of identities in the catalog

- 61db1ddc6: Allow node v14 and add to master build matrix

  - Upgrade sqlite3@^5.0.0 in @backstage/plugin-catalog-backend
  - Add Node 14 to engines in @backstage/create-app

- a768a07fb: Add the ability to import users from a GitHub Organization into the catalog. The token needs to have the scopes `user:email`, `read:user`, and `read:org`.
- f00ca3cb8: Auto-create plugin databases

  Relates to #1598.

  This creates databases for plugins before handing off control to plugins. The list of plugins currently needs to be hard-coded depending on the installed plugins. A later PR will properly refactor the code to provide a factory pattern where plugins specify what they need, and Knex instances will be provided based on the input.

- 6d97d2d6f: The InfoCard variant `'height100'` is deprecated. Use variant `'gridItem'` instead.

  When the InfoCard is displayed as a grid item within a grid, you may want all items to have the same height. Set the `'gridItem'` variant to display the InfoCard with full height suitable for Grid: `<InfoCard variant="gridItem">...</InfoCard>`

  Changed the InfoCards in '@backstage/plugin-github-actions', '@backstage/plugin-jenkins', '@backstage/plugin-lighthouse' to pass an optional variant to the corresponding card of the plugin.

  As a result the overview content of the EntityPage shows cards with full height suitable for Grid.

- 7aff112af: The default mount point for backend plugins has been changed to /api. These changes are done in the backend package itself, so it is recommended that you sync up existing backend packages with this new pattern.

### Patch Changes

- e67d49bf5: Sync scaffolded backend with example
- 961414d55: Remove discovery api override
- 440a17b39: Bump @backstage/catalog-backend and pass the now required UrlReader interface to the plugin
- 8c2b76e45: **BREAKING CHANGE**

  The existing loading of additional config files like `app-config.development.yaml` using APP_ENV or NODE_ENV has been removed. Instead, the CLI and backend process now accept one or more `--config` flags to load config files.

  Without passing any flags, `app-config.yaml` and, if it exists, `app-config.local.yaml` will be loaded. If passing any `--config <path>` flags, only those files will be loaded, **NOT** the default `app-config.yaml` one.

  The old behaviour of for example `APP_ENV=development` can be replicated using the following flags:

  ```bash
  --config ../../app-config.yaml --config ../../app-config.development.yaml
  ```

- 5a920c6e4: Updated naming of environment variables. New pattern [NAME]\_TOKEN for Github, Gitlab, Azure & Github Enterprise access tokens.

  ### Detail:

  Previously we had to export the same token for both catalog & scaffolder:

  ```bash
  export GITHUB_ACCESS_TOKEN=foo
  export GITHUB_PRIVATE_TOKEN=foo
  ```

  With the latest changes, a single export is sufficient:

  ```bash
  export GITHUB_TOKEN=foo
  export GITLAB_TOKEN=foo
  export GHE_TOKEN=foo
  export AZURE_TOKEN=foo
  ```

  ### List:

  <table>
    <tr>
      <th>Old name</th>
      <th>New name</th>
    </tr>
    <tr>
      <td>GITHUB_ACCESS_TOKEN</td>
      <td>GITHUB_TOKEN</td>
    </tr>
    <tr>
      <td>GITHUB_PRIVATE_TOKEN</td>
      <td>GITHUB_TOKEN</td>
    </tr>
    <tr>
      <td>GITLAB_ACCESS_TOKEN</td>
      <td>GITLAB_TOKEN</td>
    </tr>
    <tr>
      <td>GITLAB_PRIVATE_TOKEN</td>
      <td>GITLAB_TOKEN</td>
    </tr>
    <tr>
      <td>AZURE_PRIVATE_TOKEN</td>
      <td>AZURE_TOKEN</td>
    </tr>
    <tr>
      <td>GHE_PRIVATE_TOKEN</td>
      <td>GHE_TOKEN</td>
    </tr>
  </table>

- 67d76b419: Fix for configured templates using 'url' locations even though it's not supported yet
- 7bbeb049f: Change loadBackendConfig to return the config directly
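The `--config` resolution rule described in the breaking change above (defaults only when no flags are given, explicit flags replace the defaults entirely) can be summarized in a few lines. The snippet below is a hypothetical Python sketch of that rule, not Backstage's actual implementation (which is TypeScript); only the file names quoted in the changelog are taken from the source.

```python
# Hypothetical sketch of the documented config-resolution rule:
# - no --config flags: load app-config.yaml, plus app-config.local.yaml if present;
# - explicit --config flags: load ONLY those files, skipping the defaults.

import os

def resolve_config_files(flag_paths, exists=os.path.exists):
    """Return the list of config files to load, mirroring the documented rule."""
    if flag_paths:                       # explicit flags win; defaults are skipped
        return list(flag_paths)
    files = ["app-config.yaml"]
    if exists("app-config.local.yaml"):  # local override is optional
        files.append("app-config.local.yaml")
    return files

# No flags: defaults apply (pretend no local override exists on disk).
print(resolve_config_files([], exists=lambda p: False))
# → ['app-config.yaml']

# Explicit flags: only the named files are loaded, NOT the default app-config.yaml.
print(resolve_config_files(["../../app-config.yaml", "../../app-config.development.yaml"]))
# → ['../../app-config.yaml', '../../app-config.development.yaml']
```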
landing.md
MoMe37/mome37.github.io
[ "CC-BY-3.0" ]
---
title: Ruche de la connaissance
layout: landing
description:
image: assets/images/pic07.jpg
nav-menu: true
---

<!-- Main -->
<div id="main">

<!-- Two -->
<section id="one" class="spotlights">
	<section>
		<a href="b_post.html" class="image">
			<img src="{% link assets/images/icons/rocky.webp %}" alt="" data-position="center center" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>A few motivating points</h3>
				</header>
				<p>A bit of context to give a few reasons to take an interest in learning techniques.</p>
				<ul class="actions">
					<li><a href="b_post.html" class="button">Get motivated</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="c_post.html" class="image">
			<img src="{% link assets/images/icons/flex.jpg %}" alt="" data-position="top center" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>Optimization is flex'</h3>
				</header>
				<p>Where we outline why the backpropagation approach of neural networks is particularly interesting</p>
				<ul class="actions">
					<li><a href="c_post.html" class="button">Optimize yourself</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="d_post.html" class="image">
			<img src="{% link assets/images/icons/archi.webp %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>Architecture and inner workings</h3>
				</header>
				<p>At some point, you have to break the eggs to make the omelette.</p>
				<ul class="actions">
					<li><a href="d_post.html" class="button">Fetch the eggs</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="e_post.html" class="image">
			<img src="{% link assets/images/icons/pandas.jpg %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>A little refresher</h3>
				</header>
				<p>A quick refresher for some, a dive into the deep end for others.</p>
				<ul class="actions">
					<li><a href="e_post.html" class="button">Become a Kung-Fu Panda</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="f_post.html" class="image">
			<img src="{% link assets/images/icons/vision.jpg %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>The projects</h3>
				</header>
				<p>Understand the vision of the projects with your new knowledge.</p>
				<ul class="actions">
					<li><a href="f_post.html" class="button">Heat up the pan</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="h_post.html" class="image">
			<img src="{% link assets/images/icons/shapes.jpg %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>Shapes</h3>
				</header>
				<p>Last step before deep learning: you must become a tensor</p>
				<ul class="actions">
					<li><a href="h_post.html" class="button">Start your transformation today</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="g_post.html" class="image">
			<img src="{% link assets/images/icons/potter_0.jpg %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>Deep Learning</h3>
				</header>
				<p>Pour the deep learning over the hot python. Sear it well</p>
				<ul class="actions">
					<li><a href="g_post.html" class="button">Sear</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="generic.html" class="image">
			<img src="{% link assets/images/faq/h2g2.png %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>The technical FAQ</h3>
				</header>
				<p>Knowledge hub, oracle of practical problems, keeper of the answers to recurring questions. 42, baby.</p>
				<ul class="actions">
					<li><a href="z_faq.html" class="button">Find the answers to your questions</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="generic.html" class="image">
			<img src="{% link assets/images/faq/h2g2.png %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>The articles for AlphSistant</h3>
				</header>
				<p>Summaries of scientific papers on topics relevant to AlphSistant.</p>
				<ul class="actions">
					<li><a href="z_articles_al.html" class="button">Become an Alphicionado</a></li>
				</ul>
			</div>
		</div>
	</section>
	<section>
		<a href="generic.html" class="image">
			<img src="{% link assets/images/faq/h2g2.png %}" alt="" data-position="25% 25%" />
		</a>
		<div class="content">
			<div class="inner">
				<header class="major">
					<h3>The articles for AInimals</h3>
				</header>
				<p>Summaries of scientific papers on topics relevant to AInimals.</p>
				<ul class="actions">
					<li><a href="z_articles_ai.html" class="button">Become an AI beast</a></li>
				</ul>
			</div>
		</div>
	</section>
</section>

</div>
src/doc/docs/repository/delegate/delegate.md
shisheng-1/guice-persist-orient
[ "MIT" ]
# @Delegate method

!!! summary ""
    Delegate method extension

[Delegate methods](../delegatemethods.md) delegate execution to another guice bean's method.

```java
@Delegate(TargetBean.class)
List<Model> selectSomething();
```
README.md
chriskeene/Eprintsreporting
[ "Unlicense", "MIT" ]
Eprints Reporting
=================

A simple set of pages to list and report on Eprints data, using PHP and CodeIgniter.

Author: Chris Keene, University of Sussex. 2014-2015.

It's licensed under the MIT License. You can use it as you wish, just don't blame me or expect support.

Note the code complies with the CWBP coding standard*

Install
=======

- Download from github.
- Rename SAMPLEdatabase.php in application/config - enter your database credentials
- In application/config/config.php set your repository name and url at the top of the file.
- Replace the current template files in application/views/templates with your own if you wish
- Refer to the notes below about Schools/depts and locally created fields.

If you run in to problems, try downloading and extracting CodeIgniter (version 2) from the official site, and then copying these application and assets folders over the top. Then set up your database connection etc. You may need to fiddle with the .htaccess file.

To access the reports, just point a web browser to the root of the directory you copied the files to. You can also add 'eprintsreporting/admin/' to the url to see some reports aimed more at back office staff (housekeeping information about

If you use this, do drop me a note to let me know.

Requirements
============

A web server running PHP (5.1.6 and above) which has read (SELECT) SQL access to your Eprints database.

Adding and changing the reports
===============================

Refer to the CodeIgniter v2 documentation. http://www.codeigniter.com/userguide2/toc.html

The main files are in the application directory. The only file you may need to edit outside of this is .htaccess. Anyone familiar with the MVC model for web applications will be familiar with the layout

* controllers/eprintsreporting.php - Essentially the pages and urls
* models/eprintsreporting_model.php - functions to return data from the database
* views/ - snippets of html for displaying a page or part of a page
* views/templates - guess.
* libraries/Ergeneral.php - a file with a few common functions, plus some settings such as item type names
* config/ - config files.

For an example, see the function 'gettopjournals' in the controller (controllers/eprintsreporting.php), which produces the page which displays the journals most published in, either for the whole repository or a given School. This uses the get_topjournals function in the model file for accessing the data, and uses the 'topjournal.php' view html for displaying the list. The url for the report is based on the controller function name.

Schools
=======

At Sussex, we have Schools and within them Departments. Items are added to Eprints at the Department level, but reported on at the School level, hence the SQL will select an item and look at the parent (School) of the division (Department) it is associated with. It should be a fairly quick job to go through the Model file and update any queries based on Schools and adapt them for your local needs. Note in the where clause "subject_ancestors.pos=1" which selects the ancestor one level up in the tree, ie the School.

Notes
=====

Some reports use custom fields on our IR and so will fail unless you by chance have set up fields of the same name. This mainly applies to funder fields and OA status fields.

Some MySQL-specific queries are used. Should work with other DB back-ends with a little tweaking in the Model file.

This code is all basic stuff. Most developers could knock something up that does the same, and more, only with better code, quite quickly. It's designed for back-office working, and to be limited to just those who support the repository. If you are making it available on a public web server you may wish to do a security review first.

Our repository is called SRO; I've tried to avoid any references to SRO, though some may still remain in the code.

License
=======

The MIT License (MIT)

Copyright (c) 2015 CJ Keene (chriskeene@gmail.com)

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

(* Chris Writes Bad PHP)
_posts/2003-08-22-pieechoatom_politics.md
protocol7/protocol7-blog
[ "CC-BY-1.0" ]
---
id: 1002
title: Pie/Echo/Atom politics
date: 2003-08-22T17:07:20+00:00
author: Niklas
layout: post
guid: http://www.protocol7.com/archives/2003/08/22/pieechoatom-politics/
permalink: /archives/2003/08/22/pieechoatom_politics/
tags:
  - Standards
---

<div class='microid-1728510b995224ed5e44cac49436e2466fe38411'>
  <p>
    The politics of Pie/Echo/Atom is being discussed all over now. <a href="http://www.aaronsw.com/weblog/001027">Aaron Swartz</a>, <a href="http://www.xml.com/pub/a/2003/08/20/dive.html">Mark Pilgrim</a>, <a href="http://weblog.burningbird.net/fires/001550.htm">Shelley Powers</a>.
  </p>
</div>
content/post/diriku/index.md
Miftakhul1412/miftavy
[ "MIT" ]
---
title: Diriku
subtitle: Miftavy
date: 2021-02-17T08:04:05.362Z
summary: ""
draft: false
featured: false
image:
  filename: whatsapp-image-2021-02-17-at-15.29.13.jpeg
  focal_point: Smart
  preview_only: false
  caption: LET'S NOT COMPLAIN — BUILD UP THE SPIRIT SO SUCCESS IS NOT DELAYED
  alt_text: ""
---

#### **MIFTAKHUL ERVYANTI SANTOSA**

Hello, my name is Miftakhul Ervyanti Santosa and I come from Bantul, Yogyakarta. I am the third of three children, raised in a simple family living in one of the villages in Piyungan. I have been studying at STMIK AKAKOM Yogyakarta since 2019, majoring in Application Software Engineering (RPLA). My reason for studying at STMIK AKAKOM Yogyakarta is that today's workplaces demand computer skills; companies nowadays already use computers, databases, and so on. My focus at this IT school is databases and building and managing web applications. From there I can learn to build, and also maintain, an application or a database. My ambition is to work at a private company or in government, managing databases.
_posts/2018-06-05-34.md
skaivolas/mir
[ "MIT" ]
--- layout: post title: М. Пруст --- #### **В ПОИСКАХ УТРАЧЕННОГО ВРЕМЕНИ** Давно уже я привык укладываться рано. Иной раз, едва лишь гасла свеча, глаза мои закрывались так быстро, что я не успевал сказать себе: «Я засыпаю». А через полчаса просыпался от мысли, что пора спать; мне казалось, что книга все еще у меня в руках и мне нужно положить ее и потушить свет; во сне я продолжал думать о прочитанном, но мои думы принимали довольно странное на­правление: я воображал себя тем, о чем говорилось в книге,— церковью, квартетом, соперничеством Франциска I или Карла V. Это наваждение длилось несколько секунд после того, как я просы­пался; оно не возмущало моего сознания —оно чешуей покрывало мне глаза и мешало им удостовериться, что свеча не горит. Затем оно становилось смутным, как воспоминание о прежней жизни после метемпсихоза; сюжет книги отделялся от меня, я волен был связать или не связать себя с ним; вслед за тем ко мне возвраща­лось зрение, и, к своему изумлению, я убеждался, что вокруг меня темнота, мягкая и успокоительная для глаз и, быть может, еще более успокоительная для ума, которому она представлялась, как нечто необъяснимое, непонятное, как нечто действительно темное. Я спрашивал себя, который теперь может быть час; я слышал свистки паровозов: они раздавались то издали, то вблизи, подобно пению птицы в лесу; но ним можно было определить расстояние, они вызывали в моем воображении простор пустынных нолей, спешащего на станцию путника и тропинку, запечатлеющуюся в его памяти благодаря волнению, которое он испытывает и при виде незнакомых мест, и потому, что он действует сейчас необычно, потому что он все еще припоминает в ночной тишине недавний разговор, прощанье под чужой лампой и утешает себя мыслью о скором возвращении. Я слегка прикасался щеками к ласковым щекам подушки, таким же свежим и пухлым, как щеки нашего детства. Я чиркал спичкой и смотрел на часы. Скоро полночь. 
Это тот самый миг, когда заболевшего путешественника, вынужденного лежать в незнакомой гостинице, будит приступ и он радуется полоске света под дверью. Какое счастье: уже утро! Сейчас встанут слуги, он позвонит, и они придут к нему на помощь. Надежда на облегчение дает ему силы терпеть. И тут он слышит шаги. Шаги приближаются, потом удаляются. А полоска света под дверью исчезает. Это — полночь; потушили газ; ушел последний слуга — значит, придется мучиться всю ночь.

Я засыпал опять, но иногда пробуждался ровно на столько времени, чтобы успеть услыхать характерное потрескиванье панелей, открыть глаза и охватить взглядом калейдоскоп темноты, ощутить благодаря мгновенному проблеску сознания, как крепко спят вещи, комната — все то бесчувственное, чьею крохотной частицей я был и с чем мне предстояло соединиться вновь. Или же я без малейших усилий переносился, засыпая, в невозвратную пору моих ранних лет, и мной снова овладевали детские страхи; так, например, я боялся, что мой двоюродный дед оттаскает меня за волосы, хотя я перестал его бояться после того, как меня остригли,— этот день знаменовал наступление новой эры в моей жизни. Во сне я забывал об этом происшествии и опять вспоминал, как только мне удавалось проснуться, чтобы вырваться от деда, однако, прежде чем вернуться в мир сновидений, я из осторожности прятал голову под подушку. Иной раз, пока я спал, из неудобного положения моей ноги, подобно Еве, возникшей из ребра Адама, возникала женщина. Ее создавало предвкушаемое мной наслаждение, а я воображал, что это она мне его доставляет. Мое тело, ощущавшее в ее теле мое собственное тепло, стремилось к сближению, и я просыпался. Другие люди, казалось мне, сейчас далеко-далеко, а от поцелуя этой женщины, с которой я только что расстался, щека моя все еще горела, а тело ломило от тяжести ее стана.
Когда ее черты напоми­нали женщину, которую я знал наяву, я весь бывал охвачен стрем­лением увидеть ее еще раз —так собираются в дорогу люди, которым не терпится взглянуть своими глазами на вожделенный город: они воображают, будто в жизни можно насладиться очаро­ваньем мечты. Постепенно воспоминание рассеивалось, я забывал приснившуюся мне девушку. Вокруг спящего человека протянута нить часов, чередой распо­лагаются года и миры. Пробуждаясь, он инстинктивно сверяется с ними, мгновенно в них вычитывает в каком месте земного шара он находится, сколько времени прошло до его пробуждения, однако ряды их могут смешаться, расстроиться. Если он внезапно уснет под утро, после бессонницы, читая книгу, в непривычной для него позе, то ему достаточно протянуть руку, чтобы остановить солнце и обратить его вспять; в первую минуту он не поймет, который час, ему покажется, будто он только что лег. Если же он задремлет в еще менее естественном, совсем уже необычном положении, например, сидя в кресле после обеда, то сошедшие со своих орбит миры перемешаются окончательно, волшебное кресло с невероятной бы­стротой понесет его через время, через пространство, и как только он разомкнет веки, ему почудится, будто он лег несколько месяцев тому назад и в других краях. 
Но стоило мне заснуть в моей постели глубоким сном, во время которого для моего сознания наступал полный отдых,— и сознание теряло представление о плане комнаты, в которой я уснул: проснувшись ночью, я не мог понять, где я, в первую секунду я даже не мог сообразить, *_Kto_* я такой; меня не покидало первобытно простое ощущение того, что я существую,— подобное ощущение может биться и в груди у животного; я был беднее пещерного человека; но тут, словно помощь свыше, ко мне приходило воспоминание — пока еще не о том месте, где я находил­ся, но о местах, где я жил прежде или мог бы жить,— и вытаскивало меня из небытия, из которого я не мог выбраться своими силами; в один миг я пробегал века цивилизации, и смутное понятие о керосиновых лампах, о рубашках с отложным воротничком посте­пенно восстанавливало особенности моего «я». Быть может, неподвижность окружающих нас предметов внуше­на им нашей уверенностью, что это именно они, а не какие-нибудь другие предметы, неподвижностью того, что мы о них думаем. Всякий раз, когда я при таких обстоятельствах просыпался, мой разум тщетно пытался установить, где я, а вокруг меня все кружи­лось впотьмах: предметы, страны, годы. Мое одеревеневшее тело по характеру усталости стремилось определить свое положение, сде­лать отсюда вывод, куда идет стена, как расставлены предметы, и на основании этого представить себе жилище в целом и найти для него паименованье. Память — память боков, колен, плеч — показывала ему комнату за комнатой, где ему приходилось спать, а в это время незримые стены, вертясь в темноте, передвигались в зависимости от того, какую форму имела воображаемая комната. И прежде чем сознание, остановившееся в нерешительности на пороге форм и времен, сопоставив обстоятельства, узнавало обиталище, тело при­поминало, какая в том или ином помещении кровать, где двери, куда выходят в окна, есть ли коридор, а заодно припоминало те мысли, с которыми я и заснул и проснулся. 
Так, мой онемевший бок, пытаясь ориентироваться, воображал, что он вытянулся у стены в широкой кровати под балдахином, и тогда я говорил: «Ах, вот оно что\! Я не дождался, когда мама придет со мной проститься, и уснул»; я был в деревне у дедушки, умершего много лет тому назад; мое тело, тот бок, что я отлежал,— верные хранители минувшего, которое моему сознанию не забыть вовек,— приводили мне на память свет сделанного из богемского стекла, в виде урны, ночника, подвешенного к потолку на цепочках, и камин из сиенского мрамо­ра, стоявший в моей комбрейской спальне, в доме у дедушки и бабушки, где я жил в далеком прошлом, которое я теперь принимал за настоящее, хотя пока еще не представлял его себе отчетливо,— оно вырисовывалось яснее, когда я просыпался уже окончательно. Затем пробуждалось воспоминание о другом положении тела; стена тянулась в другом направлении, я был в своей комнате у г-жи де Сен-Лу, в деревне. Боже мой\! Должно быть, одиннадцатый час; наверное, уже отужинали\! По-видимому, я долго спал после обыч­ной вечерней прогулся с г-жой де Сен-Лу — прогулки, которую я совершаю перед тем, как надеть фрак. Много лет назад, когда мы возвращались особенно поздно с прогулки в Комбре, я видел на стеклах моего окна рдяные отблески заката. В Тансонвиле, у г-жи де Сен-Лу, ведут совсем другой образ жизни, и совсем особенное наслаждение испытываю я оттого, что гуляю вечерами, при луне, по дорогам, на которых я когда-то резвился при свете солнца; когда же мы возвращаемся, я издалека вижу комнату, где я сначала усну, а потом переоденусь к ужину,— ее пронизывают лучи от лампы, от этого единственного маяка в ночной темноте. Круговерть расплывчатых воспоминаний всякий раз продолжа­лась несколько секунд; нередко кратковременное мое недоумение по поводу того, где я нахожусь, различало предположения, из которых оно слагалось, не лучше, чем мы расчленяем в кинетоскопе движе­ния бегущей лошади. 
И все-таки я видел то одну, то другую комнату, где мне случалось жить, и в конце концов, пока я, проснувшись, надолго предавался мечтам, вспоминал все до одной; вот зимние комнаты, где, улегшись в постель, закрываешься лицом в гнездышко — ты свил его из разнообразных предметов: из уголка подушки, из верха одеяла, из края шали, из края кровати, из газеты, а затем, скрепив все это по способу птиц, на неопределенное время в нем устраиваешься; зимние комнаты, где тебе особенно приятно чувствовать в стужу, что ты отгорожен от внешнего мира (так морская ласточка строит себе гнездо глубоко под землей, в земном тепле); где огонь в камине горит всю ночь, и ты спишь под широким плащом теплого и дымного воздуха, в котором мелькают огоньки вспыхивающих головешек, спишь в каком-то призрачном алькове, в теплой пещере, выкопанной внутри комнаты, в жаркой полосе с подвижными границами, овеваемой притоками воздуха, которые освежают нам лицо и которые исходят из углов комнты, из той ее части, что ближе к окну и дальше от камина, и потому более холодной; вот комнаты летние, где приятно бывает слиться с теплой ночью; где лунный свет, пробившись через полуотворенные ставни, добрасывает свою волшебную лестницу до ножек кровати; где спишь словно на чистом воздухе, как спит синица, которую колы­шет ветерок на кончике солнечного луча; иногда это комната в стиле Людовика XVI, до того веселая, что даже в первый вечер я не чувствовал себя там особенно несчастным,— комната, где тонкие колонны, без усилий поддерживают потолок, с таким изяществом расступались, чтобы, освободив место для кровати, не заслонять ее; иногда это была совсем на нее непохожая, маленькая, но с очень высоким потолком, частично обставленная красным деревом, вы­долбленная в двухэтажной высоте пирамида, где я в первую же секунду бывал морально отравлен незнакомым запахом нарда и убеждался во враждебности фиолетовых занавесок и наглом равно­душии стенных часов, стрекотавших вовсю, как будто меня там не было; где всему здесь 
чуждое и беспощадное квадратное зеркало на ножках, наискось перегораживавшее один из углов комнаты, вреза­лось в умиротворяющую заполненность уже изученного мною пространства каким-то пустырем, всегда производившим впечатле­ние неожиданности; где моя мысль, часами силившаяся рассредо­точиться, протянуться в высоту, чтобы принять точную форму комнаты и доверху наполнить ее гигантскую воронку, терзалась в течение многих мучительных ночей, а я в это время лежал с открытыми глазами, с бьющимся сердцем, напрягая слух, стараясь не дышать носом до тех пор, пока привычка не изменяла цвет занавесок, не заставляла умолкнуть часы, не внушала сострадания косому жестокому зеркалу, не смягчала, а то и вовсе не изгоняла запах нарда и заметно не уменьшала бросавшуюся в глаза высоту потолка. Привычка искусная, но чересчур медлительная благоуст- роительница\! Вначале она не обращает внимания на те муки, которые по целым неделям терпит наше сознание во временных обиталищах, и все же счастлив тот, кто ее приобрел, ибо без привычки, своими силами, мы ни одно помещение не могли бы сделать пригодным для жилья. Теперь я уже проснулся окончательно, мое тело описало послед­ний круг, и добрый ангел уверенности все остановил в моей комнате, натянул на меня одеяло и в темноте более или менее правильно водворил на место комод, письменный стол, камин, окно на улицу и две двери. Но хотя я теперь знал наверное, что обретаюсь не в тех помещениях, чей облик, пусть и не достаточно явственный, на миг воскрешало передо мной неопытное пробуждение, намекая на то, что я могу находиться и там,— памяти моей был дан толчок; обычно я не пытался тут же заснуть; почти всю ночь я вспоминал, как мы жили в Комбре, у моей двоюродной бабушки, в Бальбеке, в Париже, в Донсьере, в Венеции и в других городах, вспоминал местность, людей, которых я там знал, то, что я сам успевал за ними заметить и что мне про них говорили другие. 
В Комбре, в сумерки, до того момента, когда мне надо было ложиться, моя спальня, где я томился без сна, вдали от матери и от бабушки, превращалась для меня в тягостное средоточие тревог. Так как вид у меня по вечерам бывал несчастный, кто-то придумал для меня развлечение: перед ужином к моей лампе прикрепляли вол­шебный фонарь, и, подобно первым зодчим и художникам по стеклу готической эпохи, фонарь преображал непроницаемые стены в призрачные переливы света, а сверхъестественные раноцветные видения, в ожившие легенды, написанные на мигающем, изменчи­вом стекле. Но мне становилось от этого только грустнее, потому что даже перемена освещения разрушала мою привычку к комнате — привычку, благодаря которой, если не считать муки лежанья в постели, мне было здесь сносно. Сейчас я не узнавал свою комнату и чувствовал себя неуютно, как в номере гостиницы или в «шале», куда бы я попал впервые прямо с поезда. Поглощенный злым своим умыслом, Голо трусил на лошади; выехав из треугольной рощицы, темно-зеленым бархатом покры­вавшей склон холма, он, трепеща, направлялся к замку несчастной Женевьевы Брабантской. Замок был красиво обрезан — просто-на­просто тут был край овального стекла, вставленного в рамку, которую вдвигали между чечевицами фонаря. То была лишь часть замка, перед нею раскинулся луг, а на лугу о чем-то мечтала Женевьева в платье с голубым поясом. И замок и луг были желтые, и я это знал еще до того, как мне показали их в фонаре,— я увидел ясно их цвет в отливавших золотом звуках слова «Брабант». Голо останавливался и печально выслушивал пояснений, которое громко читала моя двоюродная бабушкД, по-видимому, это было ему вполне попятно, ибо он, в строгом соответствии с текстом, принимал позу, не лишенную некоторой величественности; затем снова трусил. И ника­кая сила не могла бы остановить мелкой его рыси. Если фонарь сдвигали, я видел, как лошадь Голо едет по оконным занавескам, круглясь на складках и спускаясь в углубления. 
Тело самого Голо, из того же необыкновенного вещества, что и тело его коня, приспосаблива­лось к каждому материальному препятствию, к каждому предмету, который преграждал ему путь: оно превращало его в свой остов и наполняло его собой; даже к дверной ручке мгновенно применялось и наплывало на нее красное его одеяние или же бледное его лицо, все такое же тонкое и грустное, но не обнаруживавшее ни малейших признаков смущения от этой своей бескостности. Понятно, я находил прелесть в световых изображениях, которые казалось, излучало меровингское прошлое, рассыпая вокруг меня блестки глубокой старины. Но я не могу передать, как тревожило меня вторжение тайны и красоты в комнату, которую мне в конце концов удалось наполнить своим «я» до такой степени, что я обращал на нее больше внимания, чем на самого себя. Как только прекращалось обезболивающее действие привычки, ко мне возвра­щались грустные думы и грустные чувства. Дверная ручка в моей комнате, отличавшаяся для меня от всех прочих ручек тем, что она, казалось, поворачивалась сама, без всяких усилий с моей сторо­ны,—до такой степени бессознательным сделалось для меня это движение,—теперь представляла собой астральное тело Голо. И как только звонил звонок к ужину, я бежал в столовую, где каждый вечер светила большая висячая лампа, понятия не имевшая ни о Голо, ни о Синей Бороде, но зато знавшая моих родных и осведомленная о том, что такое тушеное мясо, и бросался в объятия мамы — несчастья Женевьевы Брабантской еще сильнее привязывали меня к ней, а злодеяния Голо заставляли с еще большим пристрастием допрашивать свою совесть. После ужина я должен был —увы\!— уходить от мамы, а мама беседовала с другими в саду, если погода была хорошая, или в маленькой гостиной, где все сходились в ненастную погоду. Все, за исключением бабушки, которая утверждала, что «в деревне жаль сидеть в душной комнате», и в особенно дождливые дни вела нескончаемые споры с моим отцом, который говорил мне, чтобы я шел читать к себе в комнату. 
«Так мальчик никогда не будет у вас крепким и энергичным,— с унылым видом говорила она,— а ему необходимо поправиться и воспитать в себе силу воли». Отец пожимал плечами и смотрел на барометр — он интересовался мете­орологией,—а мать, не поднимая шума из-за боязни рассердить его, смотрела на него с умильной почтительностью, но не очень пристально, чтобы как-нибудь не проникнуть в тайну его превос­ходства. Зато бабушка в любую погоду, даже когда хлестал дождь и Франсуаза спешила унести драгоценные плетеные кресла, а то как бы не намокли, гуляла в пустом саду, под проливным дождем, откидывая свои седые космы и подставляя лоб живительности дождя и ветра. «Наконец-то можно дышать\!» — говорила она и обегала мокрые дорожки, чересчур симметрично разделанные но­вым, лишенным чувства природы садовником, которого мой отец спрашивал утром, разгуляется ли погода,— обегала восторженной припрыжкой, управляемой самыми разными чувствами, какие вызывало в ее душе упоенье грозой, могущество здорового образа жизни, нелепость моего воспитания и симметрия сада, а желание предохранить от грязи свою лиловую юбку, которую она ухитрялась так забрызгать, что горничная приходила в недоумение и в отчая­ние от высоты брызг, было ей не знакомо. Если бабушка делала по саду круги после ужина, то загнать ее в дом могло только одно: ее, словно мошку, тянуло к освещенным окнам маленькой гостиной, где на ломберном столе стояли бутылки с крепкими напитками, и в тот момент, когда она, сделав очередной полный оборот, оказывалась под окнами, слышался голос моей двоюродной бабушки: «Батильда\! Запрети же ты своему мужу пить коньяк\!» В самом деле: чтобы подразнить бабушка (она резко отличалась от остальных членов семьи моего отца, и все над ней подшучивали и донимали ее), моя двоюродная бабушка подбивала дедушку, которому крепкие напитки были воспрещены, немножко выпить. 
Бедная бабушка, войдя в комнату, обращалась к мужу с мольбой не пить коньяку; он сердился, все-таки выпивал рюмочку, и бабушка уходила печальная, растерянная, но с улыбкой на лице,— она была до того кротка и добра, что любовь к ближним и способность забывать о себе и о причиненных ей обидах выража­лись у нее в улыбке, ирония которой — в противоположность улыб­кам большинства людей — относилась лишь к нёй самой, нам же она посылала поцелуй глазами: когда они были устремлены на тех, кто вызывал у нее нежные чувства, она непременно должна была приласкать их взглядом. Пытка, которой подвергала ее моя двою­родная бабушка, напрасные ее мольбы и ее слабохарактерность, обреченная терпеть поражения и тщетно пытавшаяся отнять у дедушки рюмку,— все это относилось к числу явлений, к которым так привыкаешь, что в конце концов наблюдаешь их со смехом, более того: довольно решительно и весело становишься на сторону преследователя, чтобы убедить самого себя, что тут, собственно, никакого преследования и нет; но тогда все это внушало мне столь сильное отвращение, что я бы с удовольствием побил мою двоюрод­ную бабушку. И все же, когда я слышал: «Батильда\! Запрети же ты своему мужу пить коньяк\!» —я, уже по-мужски малодушный, по­ступал так, как все мы, взрослые, поступаем при виде несправедли­востей и обид: я от них отворачивался; я шел плакать наверх, под самую крышу, в комнатку рядом с классной, где пахло ирисом и куда вливалось благоуханье дикой черной смородины, росшей среди камней ограды и протягивавшей цветущую ветку в растворенное окно. Имевшая особое, более прозаическое назначение, эта комната, откуда днем была издали видна даже башня замка Русенвиль-ле- Пен, долгое время служила мне,— разумеется, оттого, что только там я имел право запираться на ключ,—убежищем, где я мог предаваться тому, что требует ненарушимого уединения: где я мог читать, мечтать, блаженствовать и плакать. Увы\! 
Я не знал, что бабушку гораздо сильнее, чем незначительные нарушения режима, допускавшиеся ее мужем, огорчали мое безволие и слабое здоровье, внушавшие ей тревогу за мое будущее, когда она, склонив голову набок и глядя вверх, и днем и вечером без конца кружила по саду и ее красивое лицо, ее морщинистые, коричневые щеки, к старости ставшие почти лиловыми, словно пашни осенью, на воздухе пря­тавшиеся под приподнятой вуалью, с набежавшими на них от холода или от грустных мыслей, непрошеными, тут же и высыхав­шими слезами, то исчезали, то появлялись. Идя спать, я утешался мыслью, что после того как я лягу, мама придет меня поцеловать. Но она приходила со мной прощаться так ненадолго и так скоро уходила, что в моей душе больно отзывались сначала ее шаги на лестнице, а потом легкий шелест ее летнего голубого муслинового, отделанного соломкой платья, проплывав­ший за двумя дверями по коридору. Шелест и шаги возвещали, что я их услышу вновь, когда она от меня уйдет, когда она будет спускаться по лестнице. Я уже предпочитал, чтобы это наше про­щанье, которое я так любил, произошло как можно позже, чтобы мама подольше не приходила. Иной раз, когда она, поцеловав меня, уже отворяла дверь, мне хотелось позвать ее и сказать: «Поцелуй меня еще»,— но я знал, что она рассердится, оттого что уступка, которую она делала моей грусти и моему возбуждению, приходя целовать меня, даря мне успокоительный поцелуй, раздражала отца, считавшего, что этот ритуал нелеп, и она стремилась к тому, чтобы я отказался от этой потребности, от этой привычки, и, уж во всяком случае, не намерена была поощрять другую привычку — просить, чтобы она еще раз меня поцеловала в тот момент, когда уже собиралась шагнуть за порог. Словом, сердитый ее вид нарушал то умиротворение, которым от нее веяло на меня за секунду перед тем, как она с любовью склонялась над моей кроватью и, словно протягивая мне святые дары покоя, тянулась ко мне лицом, чтобы, причастившись, ощутил ее присутствие и почерпнул силы для сна. 
И все же те вечера, когда мама заходила ко мне на минутку, были счастливыми в сравнении с теми, когда к ужину ждали гостей и она ко мне не поднималась. Обычно в гостях у нас бывал только Сван; если не считать случайных посетителей, он был почти единствен­ным нашим гостем в Комбре, иногда приходившим по-соседски к ужину (что случалось реже после его неудачной женитьбы, так как мои родные не принимали его жену), а иногда и после ужина, невзначай. Когда мы сидели вечером около дома под высоким каштаном вокруг железного стола и до нас долетал с того конца сада негромкий и визгливый звон бубенчика, своим немолчным, нежи­вым дребезжаньем обдававший и оглушавший домочадцев, приво­дивших его в движение, входя «без звонка», но двукратное, робкое, округленное, золотистое звяканье колокольчика для чужих, все задавали себе вопрос: «Гости\! Кто бы это мог быть?» —хотя ни для кого не представляло загадки, что это может быть только Сван: моя двоюродная бабушка, желая подать нам пример, громко говорила возможно более непринужденным тоном, чтобы мы перестали шептаться, потому что это в высшей степени невежливо по отноше­нию к гостю, который может подумать, что мы шепчемся о нем, а на разведку посылалась бабушка, радовавшаяся предлогу лишний раз пройтись по саду и пользовавшаяся им, чтобы по дороге, для придания розовым кустам большей естественности, незаметно вы­нуть из-под них подпорки,— так мать взбивает сыну волосы, кото­рые прилизал парикмахер. Мы ломали себе голову в ожидании известий о неприятеле, которые должна была доставить бабушка, точно напасть на нас могли целые полчища, но немного погодя дедушка говорил: «Я узнаю голос Свана». 
Свана действительно узнавали только по голосу; его нос с горбинкой, зеленые глаза, высокий лоб, светлые, почти рыжие волосы, причесанные под Брессана,— все это было трудно разглядеть, так как мы, чтобы не привлекать мошкару, сидели при скудном свете, и тут я, уже не раздумывая, шел сказать, чтобы подавали сиропы: бабушка боялась, как бы не создалось впечатления, что сиропы у нас приносятся в исключительных случаях, только ради гостей,— ей казалось, что будет гораздо приличнее, если гость увидит сиропы на столе. Сван, несмотря на большую разницу лет, был очень дружен с дедушкой — одним из самых близких приятелей его отца, человека прекрасного, но со странностями: любой пустяк мог иногда остановить сердечный его порыв, прервать течение его мыслей. Несколько раз в год дедушка рассказывал при мне за столом одно и то же, как Сван-отец, не отходивший от своей умирающей жены ни днем, ни ночью, вел себя, когда она скончалась. Дедушка давно его не видел, но тут поспешил в именье Сванов, расположенное близ Комбре, и ему удалось выманить обливавшегося слезами приятеля на то время, пока умершую будут класть в гроб, из комнаты, где поселилась смерть. Они прошлись по парку, скупо освещенному солнцем. Внезапно Сван, схватив дедушку за руку, воскликнул: «Ах, мой старый друг! Как хорошо прогуляться вдвоем в такой чудесный день! Неужели вы не видите, какая это красота — деревья, боярышник, пруд, который я выкопал и на который вы даже не обратили внимания? Вы — желчевик, вот вы кто. Чувствуете, какой приятный ветерок? Ах, что там ни говори, в жизни все-таки много хорошего, мой милый Амедей!» Но тут он вспомнил, что у него умерла жена, и, очевидно решив не углубляться в то, как мог он в такую минуту радоваться, ограничился жестом, к которому он прибегал всякий раз, когда перед ним вставал сложный вопрос: провел рукой по лбу, вытер глаза и протер пенсне. Он пережил жену на два года, все это время был безутешен и тем не менее признавался дедушке: «Как странно! 
О моей бедной жене я думаю часто, но не могу думать о ней долго». «Часто, но не долго,— как бедный старик Сван»,— это стало одним из любимых выраже­ний дедушки, которое он употреблял по самым разным поводам. Я склонен был думать, что старик Сван — чудовище, но дедушка, которого я считал самым справедливым судьей на свете и чей приговор был для меня законом, на основании коего я впоследствии прощал предосудительные в моих глазах поступки, мне возражал: «Да ты что\! У него же было золотое сердце\!» На протяжении многих лет сын покойного Свана часто бывал в Комбре, особенно до женитьбы, а мои родные знать не знали, что он порвал с кругом знакомых своей семьи и что они с отменным простодушием ничего не подозревающих хозяев постоялого двора, пустивших к себе знаменитого разбойника, оказывают гостеприим­ство человеку, фамилия которого представляла для нас своего рода инкогнито, ибо Сван являлся одним из самых элегантных членов Джокей-клоба, близким другом графа Парижского и принца Уэль­ского, желанным гостем Сен-Жерменского предместья. Неведение, в котором мы пребывали относительно блестящей светской жизни Свана, конечно, отчасти объяснялось его сдержан­ностью и скрытностью, но еще и тем, что тогдашние обыватели рисовали себе общество на индусский образец: им казалось, что оно делится на замкнутые касты, что каждый член этого общества с самого рождения занимает в нем то же место, какое занимали его родители, и что с этого места ничто, кроме редких случаев голово­кружительной карьеры или неожиданного брака, не в состоянии перевести вас в высшую касту. Сван-отец был биржевым маклером; его отпрыску суждено было до самой смерти принадлежать к той касте, где сумма дохода, как в окладном листе, колебалась между такой-то и такой-то цифрой. Были известны знакомства его отца; следовательно, были известны и его знакомства; известно, с кем ему «подобало» водиться. 
Если у него и бывали иного рода связи, то на эти отношения молодого человека старые друзья его семьи, как, например, моя родня, тем охотнее смотрели сквозь пальцы, что осиротев, он продолжал бывать у нас постоянно; впрочем, смело можно было побиться об заклад, что этим неизвестным лицам он не решился бы поклониться в нашем присутствии. Если бы пона­добилось сравнить удельный вес Свана с удельным весом других сыновей биржевых маклеров того же калибра, как его отец, то вес этот оказался бы у него чуть-чуть ниже, потому что он был человек очень неприхотливый, был «помешан» на старинных вещах и на картинах и жил теперь в старом доме, который он завалил своими коллекциями и куда моя бабушка мечтала попасть, но особняк находился на Орлеанской набережной, а моя двоюродная бабушка полагала, что жить там неприлично. «Вы в самом деле знаток? — спрашивала она Свана.— Я задаю этот вопрос в ваших же интере­сах,—уж, верно, торговцы всучивают вам всякую мазню». Она действительно была убеждена, что Сван ничего в этом не смыслит, более того: она вообще была невысокого мнения об его уме, потому что в разговорах он избегал серьезных тем, зато проявлял осведом­ленность в делах весьма прозаических, причем не только когда, входя в мельчайшие подробности, снабжал нас кулинарными ре­цептами, но и когда сестры моей бабушки говорили с ним об искусстве. Если они приставали к нему, чтобы он высказался, он упорно отмалчивался, так что это становилось почти неприличным, и отделывался от них тем, что давал точные сведения, в каком музее она находится и когда написана. Но обычно он ограничивался тем, что, желая нас позабавить, рассказывал каждый раз новую историю, которая у него вышла с кем-либо из тех, кого мы знали: с комбрей- ским аптекарем, с нашей кухаркой, с нашим кучером. 
Разумеется, его рассказы смешили мою двоюродную бабушку, но она не могла понять чем: смешной ролью, которую неизменно играл в них Сван, или же остроумием рассказчика: «Ну и чудак же вы, Сван!» Так как она — единственный член нашей семьи — была довольно вульгарна, то, когда заходила речь о Сване при посторонних, она старалась ввернуть, что, если б он захотел, он мог бы жить на бульваре Османа или же на улице Оперы, что отец оставил ему миллиона четыре, а то и пять, но что он напустил на себя блажь. Впрочем, эта блажь представлялась ей занятной, и когда Сван приносил ей в Париже на Новый год коробку каштанов в сахаре, то, если у нее в это время кто-нибудь был, она не упускала случая задать Свану вопрос: «Что же, господин Сван, вы все еще живете у винных складов — боитесь опоздать на поезд, когда вам надо ехать по Лионской дороге?» И тут она искоса, поверх пенсне, поглядывала на гостей. Но если бы ей сказали, что Сван, который в качестве сына покойного Свана «причислен к разряду» тех, кого принимает у себя цвет «третьего сословия», почтеннейшие парижские нотариусы и адвокаты (между тем этой своей привилегией Сван, по-видимому, пренебрегал), живет двойной жизнью; что, выйдя от нас в Париже, он, вместо того чтобы идти домой спать, о чем он нас уведомлял перед уходом, поворачивал за углом обратно и шел в такую гостиную, куда ни одного маклера и ни одного помощника маклера на порог не пускали, моей двоюродной бабушке показалось бы это столь же неправдоподобно, как более начитанной даме показалась бы неправдоподобной мысль, что она знакома с Аристеем и что после бесед с ней он погружается в Фетидино подводное царство, в область, недоступную взорам смертных, где, как о том повествует Вергилий, его принимают с распростертыми объятиями; или — если воспользоваться для сравнения образом, который скорее мог прийти в голову моей двоюродной бабушке, потому что он смотрел на нее в Комбре с маленьких тарелочек,— столь же неправдоподобной, как мысль, что ей предстоит обедать с 
Али-Бабой, который, убедившись, что он один, проникнет в пещеру, где блестят несметные сокровища. Однажды Сван где-то обедал в Париже и, придя оттуда к нам, извинился, что он во фраке, а конда он ушел, Франсуаза со слов его кучера сообщила, что обедал он «у принцессы полусвета\!» — пожи­мая плечами и не поднимая глаз от вязанья, с хладнокровной насмешкой в голосе подхватила моя двоюродная бабушка. Словом, она смотрела на него свысока. Она считала, что знаком­ство с нами должно быть для него лестно, а потому находила вполне естественным, что летом он никогда не появлялся у нас без корзин­ки персиков или малины из своего сада и каждый раз привозил мне из Италии снимки великих произведений искусства. Мои родные без всякого стеснения посылали за ним, когда нам нужен был рецепт изысканного соуса или же компота из ананасов для званых обедов, на которые его не приглашали, потому что он не пользовался настолько широкой известностью, чтобы им можно было козырнуть в обществе людей, которые сегодня первый раз в нашем доме. Если речь заходила об особах французского королев­ского дома, моя двоюродная бабушка, обращаясь к Свану, в кармане у которого, быть может, лежало письмо из Твикенгема, говорила: «С этими людьми ни у вас, ни у меня никогда не будет ничего общего,—уж как-нибудь мы и без них обойдемся, верно?»; в те вечера, когда сестра моей бабушки пела, она заставляла его акком­панировать ей и переворачивать ноты — она проявляла по отноше­нию к этому человеку, с которым столькие искали знакомства, простодушную грубость ребенка, обращающегося с какой-нибудь редкой вещью так небрежно, как будто ей грош цена. 
Свана уже в то время знали многие завсегдатаи клубов, а моя двоюродная бабушка, конечно, рисовала его себе совершенно иным, пропитывая и ожив­ляя всем, что ей было известно о семье Сванов, возникавшую на фоне вечернего мрака в комбрейском садике после того, как дважды нерешительно звонил колокольчик, темную и неопределенную фи­гуру человека, которого вела бабушка и которого мы узнавали по голосу. Но ведь даже если подойти к нам с точки зрения житейских мелочей, и то мы не представляем собой чего-то внешне цельного, неизменного, с чем каждый волен познакомиться как с торговым договором или с завещанием; наружный облик человека есть по­рождение наших мыслей о нем. Даже такой простой акт, как «увидеть знакомого», есть в известной мере акт интеллектуальный. Мы дополняем его обличье теми представлениями, какие у нас уже сложились, и в том общем его очерке, какой мы набрасываем, представления эти играют\* несомненно, важнейшую роль. В конце концов они приучаются так ловко надувать щеки, с такой послуш­ной точностью следовать за линией носа, до того искусно вливаться во все оттенки звуков голоса, как будто наш знакомый есть лишь прозрачная оболочка, и всякий раз, как мы видим его лицо и слышим его голос, мы обнаруживаем, мы улавливаем наши о нем представле­ния. Разумеется, мои родные по неведению не наделили того Свана, которого они себе создали, множеством свойств, выработанных в нем его светской жизнью и способствовавших тому, что другие люди смотрели на его лицо как на царство изящества, естественной грани­цей которого являлся нос с горбинкой; зато мои родные могли вливать в его лицо, лишенное своих чар, ничем не заполненное и емкое, в глубину утративших обаяние глаз смутный и сладкий осадок,— полуоживший, полузабытый,—остававшийся от часов досуга, ежене­дельно проводившихся вместе с ним после ужина, в саду или за ломберным столом, в пору нашего деревенского добрососедства. 
Телесная оболочка нашего друга была до такой степени всем этим пропитана, равно как и воспоминания о его родителях, что этот Сван стал существом законченным и живым, и у меня создается впечатление, будто я расстаюсь с одним человеком и ухожу к другому, непохожему на него, когда, напрягая память, перехожу от того Свана, которого впоследствии хорошо знал, к первому Свану,— в нем я вновь узнаю пленительные заблуждения моей юности, да и похож он, кстати сказать, не столько на второго Свана, сколько на других людей, с которыми я тогда был знаком: можно подумать, что наша жизнь — музей, где все портреты одной эпохи имеют фамильное сходство, общий тон,— к первому Свану, веявшему досужеством, пахнувшему высоким каштаном, малиной и немножко — дракон-травой...

Пруст М. *В поисках утраченного времени: В сторону Свана.* СПб., 1992. С. 3-21.
fa1169ba9ed519c85abf6a72774a93773fe05649
7,583
md
Markdown
notes/introduction-nosql.md
cowboyuniverse/database-review
a5d9fd53d13a71dc714768582e750af4d2c17f86
[ "MIT" ]
# Introduction to NoSQL Databases

A NoSQL database is, by definition, a "next generation" database that is typically non-relational, distributed, open source, and horizontally scalable. There are many NoSQL databases (you can find a list at http://nosql-database.org/), but we will focus on MongoDB specifically, given its popularity over the past few years.

## Why NoSQL?

Small to medium-sized companies, like the company I work with, often use MongoDB as the starting point for a microservice. Why? When building a modern web application, business requirements often change in the middle of a development cycle, sometimes within a day or two. MongoDB is designed to cope with rapidly changing data. Keep in mind that MongoDB is only one type of NoSQL database, specifically a document model database. For the purposes of this class, we will not cover the other types.

> Don't just assume NoSQL = MongoDB. That is wrong.

A document model database like MongoDB stores data in **documents**, and these documents typically use a structure like **JSON**. This format is very close to how modern web apps are developed. For example, to store data that will only be used by the front end, you can simply store whatever the client side passes, without any schema definition on the backend.

> Note that this kind of storage without validation can be dangerous.

Furthermore, in document storage the notion of a schema is very flexible: each document can contain different fields. This flexibility is helpful for modeling unstructured data, and it also makes it easier to evolve the application during the development cycle, such as when adding new fields. In short, document model databases are general purpose and useful for a wide variety of applications because of this data-modeling flexibility -- as we will see in the following lectures. 
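The schema flexibility described above can be illustrated without a running MongoDB server. The sketch below (plain Python, standard library only; the field values are invented for illustration) stores two documents with different fields in the same in-memory "collection" — something a fixed relational schema would reject — and shows how naturally documents serialize to JSON.

```python
import json

# An in-memory stand-in for a document collection: just a list of dicts.
collection = []

# Two documents with different fields can live side by side --
# no schema definition is required up front.
collection.append({"name": "Vella", "cuisine": "Italian"})
collection.append({"name": "Juni", "grades": [{"grade": "A", "score": 11}]})

# Documents serialize directly to JSON, which is why the document
# model maps so well onto modern web front-ends.
payload = json.dumps(collection)
print(payload)
```

This is only an analogy for the storage model, not MongoDB itself — a real collection adds an `_id` per document, indexes, and persistence.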
You can read more about the comparison between NoSQL and relational databases here -- https://www.mongodb.com/nosql-explained

## Install MongoDB

https://github.com/csula/Utilities/blob/master/setups/mongo.md

## Import sample data

Download the file from https://raw.githubusercontent.com/mongodb/docs-assets/primer-dataset/primer-dataset.json

> If you are using a school laboratory machine, type `wget https://raw.githubusercontent.com/mongodb/docs-assets/primer-dataset/primer-dataset.json`

Then run the command below (macOS/Linux):

```sh
mongoimport --db test --collection restaurants --drop --file ~/downloads/primer-dataset.json
```

On Windows:

```sh
"C:\Program Files\MongoDB\Server\3.4\bin\mongoimport.exe" --db test --collection restaurants --drop --file C:\Users\IEUser\Downloads\primer-dataset.json
```

## Mongo shell commands

Once you have finished installing MongoDB as above, you should also have the Mongo shell installed alongside the MongoDB server. You can start the shell with `mongo`.

> For Windows users: look for `mongo.exe` under your MongoDB folder.

## CRUD with the Mongo shell

CRUD stands for Create, Read, Update and Delete, and is often used to describe the most basic functionality of an app. In the following sections, we will walk through CRUD operations in the Mongo shell.

Before running any CRUD command, you need to switch to your database (just as in MySQL you have to `use` a database):

```sh
use test;
```

### Common debugging commands

```sh
# to see all databases
# equivalent to `show databases;` in MySQL
show dbs;

# to see all collections
# equivalent to `show tables;` in MySQL
show collections;
```

### To create and insert a document, you can follow the command below. 
```javascript
db.restaurants.insert(
   {
      "address": {
         "street": "2 Avenue",
         "zipcode": "10075",
         "building": "1480",
         "coord": [ -73.9557413, 40.7720266 ]
      },
      "borough": "Manhattan",
      "cuisine": "Italian",
      "grades": [
         { "date": ISODate("2014-10-01T00:00:00Z"), "grade": "A", "score": 11 },
         { "date": ISODate("2014-01-16T00:00:00Z"), "grade": "B", "score": 17 }
      ],
      "name": "Vella",
      "restaurant_id": "41704620"
   }
);
```

### To read or find a document, you can follow the commands below.

```javascript
// find all documents in `restaurants`
db.restaurants.find()

// find restaurants matching a top-level field
db.restaurants.find( { "borough": "Manhattan" } )

// you can even find by a field in an embedded document
db.restaurants.find( { "address.zipcode": "10075" } )

// query against a field of an array
db.restaurants.find( { "grades.grade": "B" } )

// greater-than comparison
db.restaurants.find( { "grades.score": { $gt: 30 } } )

// less-than comparison
db.restaurants.find( { "grades.score": { $lt: 10 } } )

// what about logical operations like AND and OR?
```
// AND
db.restaurants.find( { "cuisine": "Italian", "address.zipcode": "10075" } )

// OR
db.restaurants.find(
   { $or: [ { "cuisine": "Italian" }, { "address.zipcode": "10075" } ] }
)

// sorting
db.restaurants.find().sort( { "borough": 1, "address.zipcode": 1 } )

// when you call the find command you do not actually retrieve any objects;
// to retrieve the objects you need to call `toArray()`
db.restaurants.find().toArray()

// projection
// projection allows you to select certain attributes when retrieving JSON
db.restaurants.find({}, {borough: 1});
```

### To update a document

```javascript
// update takes a query object first and then the fields to update
db.restaurants.update(
    { "name" : "Juni" },
    {
      $set: { "cuisine": "American (New)" },
      $currentDate: { "lastModified": true }
    }
)

// can also update an embedded document
db.restaurants.update(
  { "restaurant_id" : "41156888" },
  { $set: { "address.street": "East 31st Street" } }
)

// update multiple documents at once
db.restaurants.update(
  { "address.zipcode": "10016", cuisine: "Other" },
  {
    $set: { cuisine: "Category To Be Determined" },
    $currentDate: { "lastModified": true }
  },
  { multi: true }
)

// remember: if you don't specify $set, the whole document is replaced
db.restaurants.update(
   { "restaurant_id" : "41704620" },
   {
     "name" : "Vella 2",
     "address" : {
       "coord" : [ -73.9557413, 40.7720266 ],
       "building" : "1480",
       "street" : "2 Avenue",
       "zipcode" : "10075"
     }
   }
)
```

### Delete documents

```javascript
// remove all documents meeting this condition
db.restaurants.remove( { "borough": "Manhattan" } )

// to be on the safe side you can use `justOne` to remove a single document
db.restaurants.remove( { "borough": "Queens" }, { justOne: true } )

// if you want to remove all documents
db.restaurants.remove( { } )

// or you can drop the whole collection
db.restaurants.drop()
```

## SQL to MongoDB

Reference: https://docs.mongodb.com/manual/reference/sql-comparison/

Terminology and concept table:

| SQL Terms/Concepts | MongoDB Terms/Concepts |
| :-- | :-- |
| database | database |
| table | collection |
| row | document |
| column | field |
| index | index |
| table joins | embedded documents and linking |

Create and Alter

| SQL statements | MongoDB statements |
| :-- | :-- |
| `CREATE TABLE users ( ... )` | `db.users.insert({ ... })` |
| `ALTER TABLE users ADD column DATETIME` | `db.users.update({}, { $set: { column: new Date() } }, { multi: true })` |

## Reference

API documentation: https://docs.mongodb.com/manual/crud/
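The `$set`-versus-replacement distinction in the update examples above trips people up often enough to be worth spelling out. Here is a tiny Python sketch of the semantics (a hypothetical in-memory model for illustration, not the real MongoDB driver): an update document containing `$` operators patches the named fields, while a plain document replaces everything.

```python
# Hypothetical in-memory model of MongoDB's update semantics (illustrative only,
# not the real driver): `$` operators patch named fields; a plain document
# replaces the whole thing.
def apply_update(doc, update):
    if any(k.startswith("$") for k in update):
        out = dict(doc)
        for field, value in update.get("$set", {}).items():
            if "." in field:  # dotted paths update embedded documents
                outer, inner = field.split(".", 1)
                out[outer] = {**out.get(outer, {}), inner: value}
            else:
                out[field] = value
        return out
    return dict(update)  # no operators: replacement-style update

doc = {"name": "Juni", "cuisine": "Italian", "address": {"zipcode": "10075"}}

# $set leaves the other fields intact
patched = apply_update(doc, {"$set": {"cuisine": "American (New)"}})
assert patched["name"] == "Juni"

# a plain document wipes every field it does not restate
replaced = apply_update(doc, {"cuisine": "American (New)"})
assert "name" not in replaced
```

This is exactly why the full-replacement example above has to restate the whole document, address and all.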
28.942748
152
0.685744
eng_Latn
0.974937
fa11a158fcf8bb4ec83a06249e524ea7901510a4
760
md
Markdown
doc/DEV.md
tier4/momo
cee52afd9bbe71d1fa9a576e0ab62259a8469518
[ "Apache-2.0" ]
null
null
null
doc/DEV.md
tier4/momo
cee52afd9bbe71d1fa9a576e0ab62259a8469518
[ "Apache-2.0" ]
null
null
null
doc/DEV.md
tier4/momo
cee52afd9bbe71d1fa9a576e0ab62259a8469518
[ "Apache-2.0" ]
1
2021-03-18T16:30:55.000Z
2021-03-18T16:30:55.000Z
# Developer Notes

## Windows support TODO

- Running momo currently sometimes requires administrator privileges, so either provide a manifest that requests administrator rights at startup, or find some way to run it without them
- (Undecided whether we will do this) Use gn so the build can be done without MSVC project files

## Building binaries for packaging

Specifying BUILD_MODE=pkg at build time produces a binary with the symbol table stripped, reducing the file size.
To build the binaries used for packaging, pass BUILD_MODE=pkg as follows:

```
$ make armv8 BUILD_MODE=pkg
```

## Packaging

Build the binaries beforehand, then run make pkg in the build directory to create the package.

## Submodules

```
$ git submodule status
 da901cca542612a133efcb04e8e78080186991e4 libs/CLI11 (v1.6.1-8-gda901cc)
 0f1b43536d97d9311c73b658b86a9d44be9e5e82 libs/civetweb (v1.10)
 d2dd27dc3b8472dbaa7d66f83619b3ebcd9185fe libs/json (v3.1.2)
 c6d7e295bf5a0ab9b5f896720cc1a0e0fdc397a7 libs/websocketpp (0.3.0-395-gc6d7e29)
```
24.516129
85
0.825
jpn_Jpan
0.524956
fa12237b3e5b855b616efd2cd1d744486d9d27cf
653
md
Markdown
SLManager/CHANGELOG.md
qingfeng6678995/Taiwu_mods
d3dd2f639e4d3a74d62bc2c3815e3586561dfdce
[ "MIT" ]
1
2019-06-09T06:37:39.000Z
2019-06-09T06:37:39.000Z
SLManager/CHANGELOG.md
qingfeng6678995/Taiwu_mods
d3dd2f639e4d3a74d62bc2c3815e3586561dfdce
[ "MIT" ]
null
null
null
SLManager/CHANGELOG.md
qingfeng6678995/Taiwu_mods
d3dd2f639e4d3a74d62bc2c3815e3586561dfdce
[ "MIT" ]
null
null
null
# Changelog

## 1.0.0
- Added the ability to load SaveBackup saves from SaveLoadBtns

## 1.0.1
- Changed from reading load.zip to reading the current save

## 1.1.0
- Replaced the Taiwu encyclopedia icon with quick load

## 1.2.0
- Supports both the beta and the stable version
- Fixed the incorrect time recorded in the speed-up file

## 1.2.1
- Switched to invocation via reflection

## 1.2.2
- Compatible with 0.1.6.0

## 1.3.0
- No longer depends on SaveBackup
- Backup logic changed to back up the save only after saving

## 1.3.1
- Added a lock to prevent concurrency conflicts

## 1.3.2
- Temporary files now use GUID-generated names to prevent conflicts

## 1.3.3
- Fixed the incorrect file name of the backup save archive

## 1.3.4
- Adjusted save backup to avoid backing up unrelated files or folders

## 1.4.0
- Pack and unpack saves via 7z, working around the inability to call ionic.zip.dll under .NET 4

## 1.4.1
- Fixed the ionic.zip invocation issue; no longer uses 7z to handle archives

## 1.4.2
- Adjusted icon replacement for the new game version

## 1.4.3
- Adjusted the save backup workflow
- Replaced DotNetZip.dll with the same version as the Taiwu game itself

## 1.4.4
- No longer manually imports DLLs such as DotNetZip.dll
- Backups now include the contents of the chunk folder
- Copies the game's automatic backups directly to the backup folder instead of regenerating them

## 1.4.5
- Restored the strategy of backing up after the game saves; compatible with 0.2.2.0
11.258621
42
0.679939
yue_Hant
0.530139
fa13a41254158a7270ddb4ae57977f0165d34fc3
1,008
md
Markdown
README.md
Ferias-Co/api-docs
c44b2e8deeeb7adda3413b37ee4b8e1c80d2fbe4
[ "Apache-2.0" ]
null
null
null
README.md
Ferias-Co/api-docs
c44b2e8deeeb7adda3413b37ee4b8e1c80d2fbe4
[ "Apache-2.0" ]
4
2021-03-01T21:13:59.000Z
2022-02-26T01:57:14.000Z
README.md
Ferias-Co/api-docs
c44b2e8deeeb7adda3413b37ee4b8e1c80d2fbe4
[ "Apache-2.0" ]
null
null
null
# api-docs

Documentation for the open APIs of Férias&amp;Co.

[https://ferias-co.github.io/api-docs/](https://ferias-co.github.io/api-docs/)

## Development

Documentation generated with [ReDoc](https://github.com/Redocly/redoc/blob/master/README.md).

```
# Clone and install dependencies:
git clone https://github.com/Ferias-Co/api-docs.git
cd api-docs
npm install

# Run locally at localhost:8080:
npm run serve

# Generate the html:
npm run bundle
```

The API definitions live in the `swagger.yml` file, in OpenAPI 2.0 format. [See reference](https://swagger.io/specification/v2/).

## Customization and theme

Display options and theme customization can be set on the `<redoc>` element in `template.hbs`. [See reference](https://github.com/Redocly/redoc#redoc-options-object).

To change the logo, add the file to `/docs` and replace the url in the API definitions:

```
title: "API title"
x-logo:
  url: https://ferias-co.github.io/api-docs/logo.png
  altText: API title
```
25.2
178
0.731151
por_Latn
0.97513
fa1433ebe71893309fc7b044170f12d0756cbf18
7,155
md
Markdown
borrow/recommendations/children/_posts/2020-01-06-childrens-comic-books-2.md
suffolklibraries/florian
e9d2f66bae79e258839590f1006399394264c978
[ "MIT" ]
null
null
null
borrow/recommendations/children/_posts/2020-01-06-childrens-comic-books-2.md
suffolklibraries/florian
e9d2f66bae79e258839590f1006399394264c978
[ "MIT" ]
null
null
null
borrow/recommendations/children/_posts/2020-01-06-childrens-comic-books-2.md
suffolklibraries/florian
e9d2f66bae79e258839590f1006399394264c978
[ "MIT" ]
null
null
null
---
category: fiction
title: "Check out some of our top comic books for children"
date: 2020-01-06
author: sophie-green
age-range: child
excerpt: "Have a look at the exciting stories and amazing artwork to be found in some of our latest comic books. Although these ones are aimed at children, we reckon grown-ups will also enjoy flicking through!"
featured-image: /images/featured/featured-childrens-comic-books-2.jpg
featured-alt: "Hilda and the Mountain King, Kai and the Monkey King, Evil Emperor Penguin (Almost) Takes Over the World!"
breadcrumb: children
---

![Hilda and the Mountain King, Kai and the Monkey King, Evil Emperor Penguin (Almost) Takes Over the World!](/images/featured/featured-childrens-comic-books-2.jpg)

Have a look at the exciting stories and amazing artwork to be found in some of our latest comic books. Although these ones are aimed at children, we reckon grown-ups will also enjoy flicking through!

## [<cite>Dog Man: Fetch-22</cite>, by Dav Pilkey](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2656074)

Latest in the hugely successful series from the author of [<cite>Captain Underpants</cite>](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2472881).

> "Petey the Cat is out of jail, and he has a brand-new lease of life. While Petey's reevaluated what matters most, Li'l Petey is struggling to find the good in the world. Can Petey and Dog Man stop fighting like cats and dogs long enough to put their paws together and work as a team? They need each other now more than ever -- Li'l Petey (and the world) is counting on them!"

## [<cite>The Amazing World of Gumball: Adventures in Elmore</cite>, by Ben Bocquelet](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2578631)

Cartoon Network has announced new episodes for this award-winning, global hit series.

> "The gang's all here! Join Gumball, Darwin, Anais, Carrie, and all your favourite friends from Cartoon Network's hit television series The Amazing World of Gumball in these hilarious adventures throughout the town of Elmore."

## [<cite>Tamsin and the Dark</cite>, by Neil Cameron & Kate Brown](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2304943)

From <cite>The Phoenix</cite> comic story magazine, this is the sequel to [<cite>Tamsin and the Deep</cite>](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2304943).

> "Tamsin Thomas discovered that she is the Last Pellar, the ancestral protector of Cornwall - and she's the holder of a magical staff, which gives her the power to fly. But now, Tamsin must face a new and ancient evil. Deep in the disused mines of Cornwall, there is a dark power lurking. And very soon, Tamsin's brother starts to behave very strangely indeed. Tamsin will need all her courage to overcome this new danger."

## [<cite>Bunny vs Monkey: Apocalypse ...and other surprising stories!</cite>, by Jamie Smart](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2678901)

The latest in the popular series.

> "Everything's gone wrong! The woods normally echo with the sounds of Bunny and Monkey fighting, but this time they are facing their greatest danger yet - the actual apocalypse!"

Smart is also the creator of [<cite>Looshkin</cite>](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2595333) and the highly-illustrated junior novel [<cite>Flember: the secret book</cite>](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2624403).

## [<cite>Kai and the Monkey King</cite>, by Joe Todd-Stanton](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2590882)

The latest in the Brownstone’s Mythical Collection series.

> "When Kai grows tired of her bookish mum not being adventurous enough for a Brownstone, she decides to seek out the mischievous and rebellious Monkey King - who she's always been told to stay away from. Will he bring her the adventure she craves, or will he cause her more trouble than he's worth?"

## [<cite>The Unicorn Whisperer</cite>, by Dana Simpson](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2651589)

The Phoebe and Her Unicorn series began online and later became a newspaper comic strip in the US. This is the latest title.

> "What could be more magical than being best friends with a unicorn? For 9-year-old Phoebe Howell and her sparkling companion, Marigold Heavenly Nostrils, every day is an adventure. But it isn't always easy. In this latest installation of Dana Simpson's award-winning Phoebe and Her Unicorn series, Phoebe navigates the challenges of school life while being pulled into plenty of curious and entertaining adventures with her vain but endearing best friend."

## [<cite>Hilda and the Mountain King</cite>, by Luke Pearson](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2622111)

The Hilda series is SO good. Strange and magical characters mingle in the bustling city of Trolberg. Wonderful for fans of the Moomins and Studio Ghibli. The series has now been adapted for Netflix.

> "<cite>Hilda and the Mountain King</cite> takes off from the cliff hanger in [<cite>Hilda and the Stone Forest</cite>](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2400101) that left you waiting breathlessly! We rejoin our heroine for her latest adventure just as she awakes to find herself in the body of a troll! Her mother is worried sick, and is perplexed by the strange creature that seems to have taken Hilda's place. Now, both of them are in a race to be reunited before Ahlberg and his safety patrol get the chance to use their new secret weapon to lay waste to the trolls, and Hilda along with them!"

## [<cite>Evil Emperor Penguin (Almost) Takes Over the World</cite>, by Laura Ellen Anderson](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2519529)

Latest in the Evil Emperor Penguin series from the author-illustrator of Amelia Fang.

> "Evil Emperor Penguin has a plan. (Another one! He's a very persistent penguin.) A plan to TAKE OVER THE WORLD! Nothing will stand in the way of his genius. Nothing... Except for his arch-nemesis: EVIL CAT! Evil Cat is also plotting world domination and he's not going to let Evil Emperor Penguin get there first... It's Evil Genius vs Evil Genius, as Evil Emperor Penguin (Almost) Takes Over the World!"

## [<cite>Corpse Talk: Ground-breaking Rebels</cite>, by Adam & Lisa Murphy](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2561775)

The Corpse Talk series from <cite>The Phoenix</cite> Presents is non-fiction and takes the form of interviews with deceased historical figures.

## [<cite>The Phoenix Colossal Comics Collection: volume one</cite>](https://suffolk.spydus.co.uk/cgi-bin/spydus.exe/ENQ/OPAC/BIBENQ?BRN=2454622)

> "A collection of different comics taken from the issues of <cite>The Phoenix</cite> comic, this compendium brings together different humorous, informative, action-packed and brain-teasing stories into one mega-comic book!"

Many of our libraries have physical copies of <cite>The Phoenix</cite> story comic too!
94.144737
634
0.773864
eng_Latn
0.987864
fa163d6e4823cfd0c035a2f208d40f3418ebb080
577
md
Markdown
README.md
lorengordon/EnterpriseLayer-HBSS-formula
0f8a020cb85e45588ad1874029869bdb74b3644a
[ "Apache-2.0" ]
null
null
null
README.md
lorengordon/EnterpriseLayer-HBSS-formula
0f8a020cb85e45588ad1874029869bdb74b3644a
[ "Apache-2.0" ]
1
2015-12-15T15:40:29.000Z
2015-12-15T15:40:29.000Z
README.md
lorengordon/EnterpriseLayer-HBSS-formula
0f8a020cb85e45588ad1874029869bdb74b3644a
[ "Apache-2.0" ]
2
2015-12-15T13:49:25.000Z
2016-03-24T16:03:50.000Z
[![Build Status](https://travis-ci.org/plus3it/EnterpriseLayer-HBSS-formula.svg)](https://travis-ci.org/plus3it/EnterpriseLayer-HBSS-formula)

This project is designed to host a set of [Salt](http://saltstack.com/) states and formulae to govern the installation and configuration of the HBSS Enterprise security-software components.

These formulae and states take care of downloading required software components, installing and configuring those components and updating the host firewall with required exceptions for the components to be managed from the Enterprise consoles.
144.25
433
0.82149
eng_Latn
0.985873
fa1681513a53da4fe1c6ea20fa4c5047d4421a1c
1,315
md
Markdown
desktop-src/ADSchema/s-string-object-identifier.md
velden/win32
94b05f07dccf18d4b1dbca13b19fd365a0c7eedc
[ "CC-BY-4.0", "MIT" ]
552
2019-08-20T00:08:40.000Z
2022-03-30T18:25:35.000Z
desktop-src/ADSchema/s-string-object-identifier.md
velden/win32
94b05f07dccf18d4b1dbca13b19fd365a0c7eedc
[ "CC-BY-4.0", "MIT" ]
1,143
2019-08-21T20:17:47.000Z
2022-03-31T20:24:39.000Z
desktop-src/ADSchema/s-string-object-identifier.md
velden/win32
94b05f07dccf18d4b1dbca13b19fd365a0c7eedc
[ "CC-BY-4.0", "MIT" ]
1,287
2019-08-20T05:37:48.000Z
2022-03-31T20:22:06.000Z
---
title: String(Object-Identifier) syntax
description: An OID string, which is a string that contains digits (0-9) and decimal points (.).
ms.assetid: e1349a59-5fec-4cad-bd71-0fed99517ee2
ms.tgt_platform: multiple
keywords:
- String(Object-Identifier) syntax AD Schema
topic_type:
- apiref
api_name:
- String(Object-Identifier)
api_type:
- Schema
ms.topic: reference
ms.date: 05/31/2018
---

# String(Object-Identifier) syntax

An OID string, which is a string that contains digits (0-9) and decimal points (.).

| Entry        | Value                                       |
|--------------|---------------------------------------------|
| Name         | String(Object-Identifier)                   |
| Syntax ID    | 2.5.5.2                                     |
| OM ID        | 6                                           |
| MAPI Type    | \-                                          |
| ADS Type     | ADSTYPE\_CASE\_IGNORE\_STRING               |
| Variant Type | VT\_BSTR                                    |
| SDS Type     | [System.String](/dotnet/api/system.string)  |

## See also

<dl> <dt>

[System.String](/dotnet/api/system.string)
</dt> </dl>
28.586957
96
0.44943
eng_Latn
0.421185
fa1739026248e5720966306813312d60aa58e448
8,860
md
Markdown
_posts/2017-05-29-padding-oracle-attack.md
wonjinjeong/wonjinjeong.gihub.io
b52a9620db9b2c3b13fdcc0a3479ded41fe5173a
[ "MIT" ]
null
null
null
_posts/2017-05-29-padding-oracle-attack.md
wonjinjeong/wonjinjeong.gihub.io
b52a9620db9b2c3b13fdcc0a3479ded41fe5173a
[ "MIT" ]
null
null
null
_posts/2017-05-29-padding-oracle-attack.md
wonjinjeong/wonjinjeong.gihub.io
b52a9620db9b2c3b13fdcc0a3479ded41fe5173a
[ "MIT" ]
1
2018-01-07T11:18:18.000Z
2018-01-07T11:18:18.000Z
---
layout: post
section-type: post
title: Recovering plaintexts with Padding Oracle Attacks 🔮
category: tech
tags: [ 'crypto', 'redteam' ]
published: false
---

[Last time]({% post_url 2017-05-21-length-extention-attack %}) we saw how to forge a valid signature by intercepting a signed message, its authentic signature and the length of the key that was used to sign it.

Today we'll see how to decrypt ciphertexts without knowing the key that was used to encrypt the original plaintext. The only mistake that needs to be made is for a decryption module to leak whether the ciphertext it is decrypting has a valid padding or not! Yes, an innocent-looking papercut like this will make your crypto fall completely apart. In this case, the decryption module that leaks this information is called the Padding Oracle.

The Padding Oracle Attack was initially published by [Vaudenay](http://www.iacr.org/cryptodb/archive/2002/EUROCRYPT/2850/2850.pdf) and it's a side-channel chosen-ciphertext attack that works against the Cipher Block Chaining (CBC) mode and the Public Key Cryptography Standards \#7 (PKCS7) padding scheme. Side-channel attacks are those that are based on the implementation of a cryptosystem. Chosen-ciphertext attacks, on the other hand, are those that enable the adversary to submit chosen ciphertexts and decrypt them using a cryptosystem.

In order to understand how the attack works, we first need to understand how CBC and PKCS7 work.

### CBC

Encryption and decryption have Block Ciphers at their core. Imagine Block Ciphers as black boxes that take as input a fixed-length key and a fixed-length block of plaintext/ciphertext, and spit out the corresponding block of ciphertext/plaintext. Since these black boxes have a fixed-length input, we need to somehow combine them to enable the encryption/decryption of arbitrarily sized inputs. This is what Block Cipher Modes are about, with CBC being the most popular among them.

When encrypting a plaintext with a block cipher in CBC mode, the plaintext input of each block is XOR'ed with the ciphertext output of the previous block cipher. That way the slightest change in the plaintext input will affect its own ciphertext block and all the blocks that follow it. In the case of the first block, a random block (per encryption) called the Initialization Vector is used to XOR the plaintext of the first block before it's encrypted. Here's a visualization of the process:

![CBC](https://upload.wikimedia.org/wikipedia/commons/d/d3/Cbc_encryption.png)

In order to decrypt a ciphertext that was produced using CBC, you need to XOR the ciphertext of the previous block with the output of the current block cipher. That way you nullify the encryption's XOR operation with the previous block cipher's ciphertext:

C<sub>i - 1</sub> ⊕ P<sub>i</sub> ⊕ C<sub>i - 1</sub> → <br />
(C<sub>i - 1</sub> ⊕ C<sub>i - 1</sub>) ⊕ P<sub>i</sub> → <br />
0 ⊕ P<sub>i</sub> → <br />
P<sub>i</sub>

![CBC](https://upload.wikimedia.org/wikipedia/commons/6/66/Cbc_decryption.png)

### PKCS7 padding

We need a padding scheme in order to construct inputs whose length is divisible by the block size, since Block Ciphers operate strictly on blocks. PKCS7 padding is simple: the last *N* bytes are padded with the value *N*.

For example the padding of *"Hello, world"*, for a block size of 16 bytes, will be four 4s appended at its end:

<pre><code data-trim class="bash">
#  H    e    l    l    o    ,         w
 0x48 0x65 0x6c 0x6c 0x6f 0x2c 0x20 0x77
#  o    r    l    d    4    4    4    4
 0x6f 0x72 0x6c 0x64 0x04 0x04 0x04 0x04
</code></pre>

### The vulnerable decryption

Now that we know what CBC and PKCS7 are, let's see the vulnerable Ruby code that encrypts and decrypts data using the Advanced Encryption Standard (AES) block cipher, which operates on blocks of 128 bits (or 16 bytes):

<pre><code data-trim class="ruby">
require 'openssl'

class PaddingOracle
  def encrypt(plaintext)
    cipher = OpenSSL::Cipher::AES.new(256, :CBC)
    cipher.encrypt
    @key = cipher.random_key
    iv = cipher.random_iv
    ciphertext = cipher.update(plaintext) + cipher.final
    return iv + ciphertext
  end

  def decrypt(ciphertext)
    decipher = OpenSSL::Cipher::AES.new(256, :CBC)
    decipher.decrypt
    decipher.key = @key
    decipher.iv = ciphertext[0..15]

    # The Oracle will leak whether the padding is correct or not in the .final method
    plaintext = decipher.update(ciphertext[16..(ciphertext.length - 1)]) + decipher.final

    # No plaintext returned
  end
end
</code></pre>

The call to the *final* method in the decryption above will also check whether the padding of the resulting plaintext is valid before removing it. If the padding is not valid, an *OpenSSL::Cipher::CipherError* will be thrown and this information will leak to the caller. As a result, this is the information that we will use in order to turn the *decrypt* method into the Padding Oracle. By submitting ciphertexts that we construct to the Oracle, we'll manage to recover the plaintext without knowing the key that was used to encrypt the ciphertext that we intercepted.

### The exploit

Let's imagine that we intercepted the following ciphertext that has the length of two blocks (32 bytes):

C<sub>0</sub> \| C<sub>1</sub>

Now let's construct a ciphertext C'<sub>0</sub> like this:

C'<sub>0</sub> = C<sub>0</sub> ⊕ 00000001 ⊕ 0000000X

Where *X* is a byte between 0 and 255. Now let's submit C'<sub>0</sub> \| C<sub>1</sub> to the Oracle and let's see what will be computed:

C'<sub>0</sub> ⊕ D(C<sub>1</sub>) → <br/>
C<sub>0</sub> ⊕ 00000001 ⊕ 0000000X ⊕ (P<sub>1</sub> ⊕ C<sub>0</sub>) → <br/>
(C<sub>0</sub> ⊕ C<sub>0</sub>) ⊕ 00000001 ⊕ 0000000X ⊕ P<sub>1</sub> → <br/>
00000000 ⊕ 00000001 ⊕ 0000000X ⊕ P<sub>1</sub> → <br/>
00000001 ⊕ 0000000X ⊕ P<sub>1</sub>

Let's assume that *X* is the correct guess of the last byte of P<sub>1</sub> — what happens in this case? The last byte of P<sub>1</sub> will be nullified by the XOR operation and 1 will end up in the resulting plaintext. The resulting plaintext then has a valid PKCS7 padding and the Oracle will not throw. On the other hand, if *X* doesn't match the last byte of P<sub>1</sub>, then the padding of the computed plaintext will not be valid, and the Oracle will throw! By trying all the possible values of *X*, we will successfully recover the last byte of P<sub>1</sub>.

Now how can we continue to the next byte? By simply following the same logic for the second-to-last byte of C<sub>0</sub>, like this:

C'<sub>0</sub> = C<sub>0</sub> ⊕ 00000022 ⊕ 000000YX

Where *Y* is again a value between 0 and 255 and *X* is the byte that we recovered earlier. Now by submitting C'<sub>0</sub> \| C<sub>1</sub> to the Oracle, we'll get the same behavior as before and at some point guess the correct value of *Y*. Like this we can continue and recover all the bytes of the block, and of course this can be applied to every block of the ciphertext except the first one.

But the first block is the Initialization Vector, so we don't even need to recover it :smile:

Here is the Ruby code that intercepts a ciphertext and then performs the attack on the *PaddingOracle* that we saw earlier (note that after the Oracle says Yes we `break` out of the guessing loop and move on to the next byte):

<pre><code data-trim class="ruby">
plaintext = 'This is a top secret message!!!'
oracle = PaddingOracle.new()
ciphertext = oracle.encrypt(plaintext)

recovered_plaintext = ''
to = ciphertext.length - 1
from = to - 31

while from >= 0
  target_blocks = ciphertext[from..to]
  i = 15
  padding = 0x01
  recovered_block = ''

  while i >= 0 # For each byte of the block
    for c in 0x00..0xff # For each possible byte value
      chosen_ciphertext = target_blocks.dup

      # Set the bytes that we have already recovered in the block
      j = recovered_block.length - 1
      ii = 15
      while j >= 0
        chosen_ciphertext[ii] = (chosen_ciphertext.bytes[ii] ^ recovered_block.bytes[j] ^ padding).chr
        j -= 1
        ii -= 1
      end

      # Guess the i-th byte of the block
      chosen_ciphertext[i] = (chosen_ciphertext.bytes[i] ^ c ^ padding).chr

      begin
        # Ask the Oracle
        oracle.decrypt(chosen_ciphertext)
        # The Oracle said Yes, move to the next byte
        recovered_block = c.chr + recovered_block
        break
      rescue OpenSSL::Cipher::CipherError
        # The Oracle said No, try the next possible value of the byte
      end
    end

    i -= 1
    padding += 0x01
  end

  recovered_plaintext = recovered_block + recovered_plaintext

  # Move to the next block
  from -= 16
  to -= 16
end

puts recovered_plaintext # This is a top secret message!!!
</code></pre>

Here is the full source code:

<script src="https://gist.github.com/le4ker/02c225e4ebe6c596a7519ebead84091c.js"></script>

Happy decrypting!
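The attack is not tied to Ruby or OpenSSL. Below is a self-contained Python sketch of the same byte-recovery loop; to keep it runnable with the standard library alone it uses a toy XOR-and-reverse "block cipher" instead of AES (an assumption purely for the demo — any block cipher gives identical CBC/PKCS7 behaviour, so the oracle logic is unchanged):

```python
import os

BLOCK = 16

def pkcs7_pad(data: bytes) -> bytes:
    n = BLOCK - len(data) % BLOCK
    return data + bytes([n]) * n

def pkcs7_unpad(data: bytes) -> bytes:
    n = data[-1]
    if not 1 <= n <= BLOCK or data[-n:] != bytes([n]) * n:
        raise ValueError("bad padding")
    return data[:-n]

# Toy block cipher (NOT secure): XOR with the key, then reverse the bytes.
# Any bijection on 16-byte blocks demonstrates CBC malleability equally well.
def enc_block(key, block):
    return bytes(a ^ b for a, b in zip(block, key))[::-1]

def dec_block(key, block):
    return bytes(a ^ b for a, b in zip(block[::-1], key))

def cbc_encrypt(key, plaintext):
    prev = os.urandom(BLOCK)          # random IV, prepended to the ciphertext
    out = [prev]
    data = pkcs7_pad(plaintext)
    for i in range(0, len(data), BLOCK):
        prev = enc_block(key, bytes(x ^ y for x, y in zip(data[i:i + BLOCK], prev)))
        out.append(prev)
    return b"".join(out)

def padding_oracle(key, ciphertext):
    """Decrypt and leak only whether the PKCS7 padding was valid."""
    blocks = [ciphertext[i:i + BLOCK] for i in range(0, len(ciphertext), BLOCK)]
    plain = b"".join(
        bytes(x ^ y for x, y in zip(dec_block(key, cur), prev))
        for prev, cur in zip(blocks, blocks[1:])
    )
    try:
        pkcs7_unpad(plain)
        return True
    except ValueError:
        return False

def attack(oracle, ciphertext):
    """Recover the plaintext using only the oracle's yes/no answers."""
    blocks = [ciphertext[i:i + BLOCK] for i in range(0, len(ciphertext), BLOCK)]
    recovered = b""
    for prev, cur in zip(blocks, blocks[1:]):
        known = b""                              # plaintext tail of this block
        for pad in range(1, BLOCK + 1):
            for guess in range(256):
                forged = bytearray(prev)
                # force the already-recovered tail bytes to decrypt to `pad`
                for j, p in enumerate(reversed(known)):
                    forged[BLOCK - 1 - j] ^= p ^ pad
                # guess the next unknown byte
                forged[BLOCK - pad] ^= guess ^ pad
                if oracle(bytes(forged) + cur):
                    if pad == 1:
                        # rule out an accidental longer padding (e.g. 02 02)
                        forged[BLOCK - 2] ^= 0xFF
                        if not oracle(bytes(forged) + cur):
                            continue
                    known = bytes([guess]) + known
                    break
        recovered += known
    return pkcs7_unpad(recovered)
```

Running `attack` against `padding_oracle` recovers the full plaintext without ever touching the key; the `pad == 1` re-check guards against the classic false positive where the tampered block accidentally ends in a valid longer padding.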
40.829493
305
0.724605
eng_Latn
0.996601
fa176a45d12ea0805436300d7c5bd626476eb2aa
1,575
md
Markdown
How_to_become_member.md
kamleshjoshi8102/Welcome-to-Bauddhik-Geeks
36c494807fa3f24c534c38162e18340480f67a2f
[ "MIT" ]
null
null
null
How_to_become_member.md
kamleshjoshi8102/Welcome-to-Bauddhik-Geeks
36c494807fa3f24c534c38162e18340480f67a2f
[ "MIT" ]
null
null
null
How_to_become_member.md
kamleshjoshi8102/Welcome-to-Bauddhik-Geeks
36c494807fa3f24c534c38162e18340480f67a2f
[ "MIT" ]
null
null
null
# How to become a member❓

To become a member of the Bauddhik-Geeks organization, you need to follow these steps:

- Join the Bauddhik-Geeks Discord server
- Create an issue using the issue template **"Invite me to organization"**
- Add yourself to the Members.md file

Now let's see it in detail:

## Join the Bauddhik-Geeks Discord Server 🤝

To join the Bauddhik-Geeks Discord server, click 👉 [here](https://discord.gg/atzZYdNMDF)

## Create an issue using the issue template ✔

To create an issue, follow these steps:

- Click on the **Issues** option in the repository.
- Click on **New issue**.
- You will now see the issue template **"Invite me to organization"**.
- Click on **Get Started**.
- In Contact Details enter your email address, and in Discord username enter your username on the Bauddhik-Geeks Discord server.
- Check the checkbox to agree with the code of conduct.
- Click on **Submit new issue**.

**After you submit the issue, if you have filled in the information correctly, we will send an invite to your email address.**

## Add yourself to Members.md 😎

To add yourself to the Members.md file, follow these steps:

- Fork the current repository **"Welcome-to-Bauddhik-Geeks"**
- Open the **Members.md** file [here](https://github.com/Bauddhik-Geeks/Welcome-to-Bauddhik-Geeks/edit/main/Members.md)
- Add yourself the way other members have, using the syntax below 👇

`<td align="center"><a href="<Your-GitHub-Profile-URL>"><img src="Link-of-profile-pic" width="100px;" alt=""/><br /><sub><b><Your-Name></b></sub></a></td>`

- Create a pull request
37.5
155
0.729524
eng_Latn
0.970178
fa17fef2126d8cda2d313c99d7384e2be697211d
3,373
md
Markdown
docs/code-quality/ca2108-review-declarative-security-on-value-types.md
seferciogluecce/visualstudio-docs.tr-tr
222704fc7d0e32183a44e7e0c94f11ea4cf54a33
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/code-quality/ca2108-review-declarative-security-on-value-types.md
seferciogluecce/visualstudio-docs.tr-tr
222704fc7d0e32183a44e7e0c94f11ea4cf54a33
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/code-quality/ca2108-review-declarative-security-on-value-types.md
seferciogluecce/visualstudio-docs.tr-tr
222704fc7d0e32183a44e7e0c94f11ea4cf54a33
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: 'CA2108: Review declarative security on value types'
ms.date: 11/04/2016
ms.prod: visual-studio-dev15
ms.reviewer: ''
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- ReviewDeclarativeSecurityOnValueTypes
- CA2108
helpviewer_keywords:
- ReviewDeclarativeSecurityOnValueTypes
- CA2108
ms.assetid: d62bffdd-3826-4d52-a708-1c646c5d48c2
author: gewarren
ms.author: gewarren
manager: douge
ms.workload:
- multiple
ms.openlocfilehash: e2d76a0ecf6a2eeac677475eb25efe495129c213
ms.sourcegitcommit: 568bb0b944d16cfe1af624879fa3d3594d020187
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 09/13/2018
ms.locfileid: "45548523"
---
# <a name="ca2108-review-declarative-security-on-value-types"></a>CA2108: Review declarative security on value types

|||
|-|-|
|TypeName|ReviewDeclarativeSecurityOnValueTypes|
|CheckId|CA2108|
|Category|Microsoft.Security|
|Breaking Change|Non-breaking|

## <a name="cause"></a>Cause

A public or protected value type is secured by [Data and Modeling](/dotnet/framework/data/index) or [Link Demands](/dotnet/framework/misc/link-demands).

## <a name="rule-description"></a>Rule description

Value types are allocated and initialized by their default constructor before other constructors execute. If a value type is secured by a Demand or a LinkDemand and the caller does not have sufficient permissions, the security check on any constructor other than the default fails and a security exception is thrown. The value type is not deallocated; it is left in the state set by the default constructor. Do not assume that a caller who passes an instance of the value type created the instance or has permission to access it.

## <a name="how-to-fix-violations"></a>How to fix violations

You cannot fix a violation of this rule unless you remove the security check from the type and use method-level security checks instead. Note that fixing the violation this way does not prevent callers with insufficient permissions from obtaining instances of the value type. You must ensure that an instance of the value type, in its default state, does not expose sensitive information and cannot be used in a harmful manner.

## <a name="when-to-suppress-warnings"></a>When to suppress warnings

You can suppress a warning from this rule if any caller can obtain instances of the value type in its default state without posing a security threat.

## <a name="example-1"></a>Example 1

The following example shows a library that contains a value type that violates this rule. The `StructureManager` type assumes that a caller who passes an instance of the value type created the instance or has permission to access it.

[!code-csharp[FxCop.Security.DemandOnValueType#1](../code-quality/codesnippet/CSharp/ca2108-review-declarative-security-on-value-types_1.cs)]

## <a name="example-2"></a>Example 2

The following application demonstrates the weakness of the library.

[!code-csharp[FxCop.Security.TestDemandOnValueType#1](../code-quality/codesnippet/CSharp/ca2108-review-declarative-security-on-value-types_2.cs)]

This example produces the following output:

```txt
Structure custom constructor: Request failed.
New values SecuredTypeStructure 100 100
New values SecuredTypeStructure 200 200
```

## <a name="see-also"></a>See also

- [Link Demands](/dotnet/framework/misc/link-demands)
- [Data and Modeling](/dotnet/framework/data/index)
537
0.812926
tur_Latn
0.999342
fa1865d404ca7f342679738e863a36dffa6110b1
173
md
Markdown
posts/test_post_no_banner.md
jasongforbes/visage
62816d163330cfab13f3fedaac5b264d4cff065e
[ "MIT" ]
null
null
null
posts/test_post_no_banner.md
jasongforbes/visage
62816d163330cfab13f3fedaac5b264d4cff065e
[ "MIT" ]
8
2017-03-12T18:54:15.000Z
2017-04-26T23:49:39.000Z
posts/test_post_no_banner.md
jasongforbes/visage
62816d163330cfab13f3fedaac5b264d4cff065e
[ "MIT" ]
1
2017-07-10T15:48:10.000Z
2017-07-10T15:48:10.000Z
---
title: Post with No Banner.
date: 2017-03-01
description: This is a second post to show an example of how it is displayed when no banner is present.
---

Content goes here
21.625
100
0.739884
eng_Latn
0.999888
fa1887c78218a156125ee8c6f1bcf0c5666b8315
74
md
Markdown
README.md
paulopreto/IBmBiomec2021
c21510b93f8640d83b79343521ed8d5a01599cc6
[ "MIT" ]
null
null
null
README.md
paulopreto/IBmBiomec2021
c21510b93f8640d83b79343521ed8d5a01599cc6
[ "MIT" ]
1
2021-09-28T17:26:20.000Z
2021-09-28T17:26:20.000Z
README.md
paulopreto/IBmBiomec2021
c21510b93f8640d83b79343521ed8d5a01599cc6
[ "MIT" ]
null
null
null
# IBmBiomec2021

USP course on Biomechanics for Biomedical Informatics
24.666667
57
0.851351
por_Latn
0.948628
fa18c65da247d912f9f1259f5247375deb164f4b
387
md
Markdown
python/regex_vs_split/readme.md
orangle/snippets
f6642dc359fff3a8e5f63ebd7447b3d1a5f75bb7
[ "MIT" ]
2
2019-07-15T09:14:05.000Z
2021-04-03T01:43:53.000Z
python/regex_vs_split/readme.md
orangle/snippets
f6642dc359fff3a8e5f63ebd7447b3d1a5f75bb7
[ "MIT" ]
2
2019-05-12T12:39:47.000Z
2020-06-27T02:13:25.000Z
python/regex_vs_split/readme.md
orangle/snippets
f6642dc359fff3a8e5f63ebd7447b3d1a5f75bb7
[ "MIT" ]
null
null
null
Comparison test
======

Test file size: 30916563 bytes (about 30 MB), in an nginx-access-log-like format.

Python `re`, without compiling the pattern:

```
▶ python regex_vs_split.py
total bytes: 96130099518
cost: 5.805
qps: 92174.132
```

Python `re`, with the pattern compiled via `re.compile`:

```
▶ python regex_vs_split.py
total bytes: 96130099518
cost: 5.324
qps: 100500.519
```

Parsing with `split` instead:

```
▶ python regex_vs_split.py
total bytes: 96130191409
cost: 3.957
qps: 135215.152
```
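The three runs above come from the same script toggling the parsing strategy. As a rough sketch of what such a benchmark looks like (the log format and field positions here are assumptions — the original regex_vs_split.py is not shown), `str.split` wins because it never enters the regex engine:

```python
import re
import timeit

# A hypothetical access-log line in the common/nginx style.
LINE = '127.0.0.1 - - [15/Jul/2019:09:14:05 +0000] "GET /index.html HTTP/1.1" 200 1234'

PATTERN = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3}) (\d+)')

def parse_regex(line):
    m = PATTERN.match(line)
    return m.group(1), m.group(4), m.group(5)  # ip, status, bytes

def parse_split(line):
    parts = line.split(' ')
    return parts[0], parts[8], parts[9]        # same fields, by position

# split skips the regex engine entirely, which is where the speedup comes from
t_re = timeit.timeit(lambda: parse_regex(LINE), number=100_000)
t_sp = timeit.timeit(lambda: parse_split(LINE), number=100_000)
print(f're: {t_re:.3f}s  split: {t_sp:.3f}s')
```

The trade-off is robustness: the positional `split` version silently breaks if a field can contain spaces, while the regex keeps validating structure.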
16.826087
52
0.72093
eng_Latn
0.125009
fa19413b9cfad5b1c71ccbef2c3c0c7c7ebeef51
1,830
md
Markdown
README.md
Keats/rust-elias-fano
0fa30f746025f979c338ea70b30eaf7a453b9fcb
[ "MIT" ]
null
null
null
README.md
Keats/rust-elias-fano
0fa30f746025f979c338ea70b30eaf7a453b9fcb
[ "MIT" ]
null
null
null
README.md
Keats/rust-elias-fano
0fa30f746025f979c338ea70b30eaf7a453b9fcb
[ "MIT" ]
null
null
null
# Elias-Fano, in Rust [![Build Status](https://img.shields.io/badge/crate-elias--fano-brightgreen.svg)](https://crates.io/crates/elias-fano) [![Rust Docs](https://img.shields.io/badge/docs.rs-elias--fano-orange.svg)](https://docs.rs/elias-fano) Elias-Fano encoding in Rust. The Elias-Fano encoding scheme is a quasi-succinct compression method for monotonic integers using gap compression on a Bitset. It allows for decompression of a bit at any position in `O(1)` time complexity. Being quasi-succinct, it is therefore almost as good as the best theoretical possible compression as determined by the [Shannon-Hartley](https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem) theorem. This implementation is based largely on one written in Go by [Antonio Mallia](https://www.antoniomallia.it/), which can be found at his repository [amallia/go-ef](https://github.com/amallia/go-ef). ## Todo: - [x] Tests - [x] Example usage - [x] Benchmarks - [ ] Comparison with other implementations ## Installation Add the following line to your Cargo.toml: ```diff [dependencies] + elias-fano = "2" ``` ## Example Usage ```rust use elias_fano::EliasFano; fn main() { let sorted_array = [0, 3, 40, 1000]; let size = sorted_array.len(); let mut ef = EliasFano::new(sorted_array[size - 1], size as u64); ef.compress(sorted_array.iter()).expect("Failed to compress"); println!("{}", ef.value()); // 1 match ef.next() { Ok(val) => println!("Retrieved value: {}", val), // 3 Err(error) => println!("Err: {}", error), // Out of bounds } let _ = ef.next(); println!("{}", ef.value()); // 40 ef.reset(); println!("{}", ef.value()); // 0 let _ = ef.visit(3); println!("{}", ef.value()); // 1000 } ``` ## License MIT licensed, see LICENSE for more details.
31.551724
208
0.669399
eng_Latn
0.768575
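The Elias-Fano README above describes the scheme as gap compression of sorted monotone integers. As a sketch of the core idea (in Python rather than the crate's Rust, and independent of that crate's API): each value is split into low bits stored verbatim and high bits stored as unary-coded gaps.

```python
import math

def elias_fano_encode(values):
    """Encode a sorted list of non-negative integers Elias-Fano style:
    low bits stored verbatim, high bits as unary-coded gaps."""
    n = len(values)
    u = values[-1] + 1                      # universe size
    l = max(0, math.floor(math.log2(u / n)))  # number of low bits per value
    low = [v & ((1 << l) - 1) for v in values]
    high = [v >> l for v in values]
    # Unary-encode the gap between successive high parts: '0' * gap + '1'.
    bits, prev = [], 0
    for h in high:
        bits.append('0' * (h - prev) + '1')
        prev = h
    return l, low, ''.join(bits)

def elias_fano_decode(l, low, unary):
    """Recover the original values by walking the unary bitstream."""
    values, h, i = [], 0, 0
    for bit in unary:
        if bit == '0':
            h += 1                  # gap bit: advance the high part
        else:
            values.append((h << l) | low[i])
            i += 1
    return values
```

Real implementations store the unary part in a bitset with select/rank support, which is what gives the O(1) positional access the README mentions.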
fa1971598a29d43481fe887a820b5201aa038c2c
2,826
md
Markdown
server/node_modules/notepack.io/HISTORY.md
h9jiang/CSE_218_118_Fa20_Team_229
6673eb442ef979732cd7b8075d4e2c9f10e5625e
[ "Apache-2.0" ]
59
2017-05-18T15:01:12.000Z
2022-03-28T20:17:17.000Z
server/node_modules/notepack.io/HISTORY.md
h9jiang/CSE_218_118_Fa20_Team_229
6673eb442ef979732cd7b8075d4e2c9f10e5625e
[ "Apache-2.0" ]
41
2020-11-05T03:39:26.000Z
2020-12-08T08:30:53.000Z
server/node_modules/notepack.io/HISTORY.md
h9jiang/CSE_218_118_Fa20_Team_229
6673eb442ef979732cd7b8075d4e2c9f10e5625e
[ "Apache-2.0" ]
21
2017-06-05T19:26:43.000Z
2021-10-11T21:21:26.000Z
# [2.3.0](https://github.com/darrachequesne/notepack/compare/2.2.0...v2.3.0) (2020-03-15) ### Performance Improvements * **decode:** add a cache for buffer-to-string conversions ([3c0e5a6](https://github.com/darrachequesne/notepack/commit/3c0e5a66332e50ce31749f0159a533156edbdd3d)) * **encode:** add a cache for string-to-buffer conversions ([60e8b0b](https://github.com/darrachequesne/notepack/commit/60e8b0b4b16b05e702334fe731df1ec43d1a9f14)) # [2.2.0](https://github.com/darrachequesne/notepack/compare/2.1.3...2.2.0) (2018-12-18) <a name="2.1.3"></a> ## [2.1.3](https://github.com/darrachequesne/notepack/compare/2.1.2...2.1.3) (2018-05-14) ### Bug Fixes * **browser:** fix utf-8 decoder ([#16](https://github.com/darrachequesne/notepack/issues/16)) ([abbf3a5](https://github.com/darrachequesne/notepack/commit/abbf3a5)) <a name="2.1.2"></a> ## [2.1.2](https://github.com/darrachequesne/notepack/compare/2.1.1...2.1.2) (2017-08-23) ### Bug Fixes * **encode:** remove the unsafe integer check ([#15](https://github.com/darrachequesne/notepack/issues/15)) ([bb8140c](https://github.com/darrachequesne/notepack/commit/bb8140c)) <a name="2.1.1"></a> ## [2.1.1](https://github.com/darrachequesne/notepack/compare/2.1.0...2.1.1) (2017-08-08) ### Bug Fixes * **browser:** fix decoding for strings with surrogate pairs ([#13](https://github.com/darrachequesne/notepack/issues/13)) ([a89e566](https://github.com/darrachequesne/notepack/commit/a89e566)) * **browser:** preserve the offset and length when creating a DataView ([#11](https://github.com/darrachequesne/notepack/issues/11)) ([bd91aa7](https://github.com/darrachequesne/notepack/commit/bd91aa7)) <a name="2.1.0"></a> # [2.1.0](https://github.com/darrachequesne/notepack/compare/2.0.1...2.1.0) (2017-07-31) ### Features * add support for toJSON method ([#8](https://github.com/darrachequesne/notepack/issues/8)) ([9345f9f](https://github.com/darrachequesne/notepack/commit/9345f9f)), closes [#7](https://github.com/darrachequesne/notepack/issues/7) <a name="2.0.1"></a> ## [2.0.1](https://github.com/darrachequesne/notepack/compare/2.0.0...2.0.1) (2017-06-06) ### Bug Fixes * **encode:** fix encoding for non-finite numbers ([#4](https://github.com/darrachequesne/notepack/issues/4)) ([f0ed0f3](https://github.com/darrachequesne/notepack/commit/f0ed0f3)) <a name="2.0.0"></a> # [2.0.0](https://github.com/darrachequesne/notepack/compare/1.0.1...2.0.0) (2017-05-19) ### Features * Add support for ArrayBuffer ([#2](https://github.com/darrachequesne/notepack/issues/2)) ([9eec8dc](https://github.com/darrachequesne/notepack/commit/9eec8dc)) * **browser:** switch from Buffer polyfill to ArrayBuffer ([#1](https://github.com/darrachequesne/notepack/issues/1)) ([8d7ce87](https://github.com/darrachequesne/notepack/commit/8d7ce87))
37.68
228
0.711607
yue_Hant
0.173025
fa197dc70842aa52c8ae886a704410fad3eee71d
102
md
Markdown
R_Example/Readme.md
jagath-jaikumar/python_r_stack
bd771d35bcb83a0b9b42017f697548c5c77f5a38
[ "MIT" ]
null
null
null
R_Example/Readme.md
jagath-jaikumar/python_r_stack
bd771d35bcb83a0b9b42017f697548c5c77f5a38
[ "MIT" ]
null
null
null
R_Example/Readme.md
jagath-jaikumar/python_r_stack
bd771d35bcb83a0b9b42017f697548c5c77f5a38
[ "MIT" ]
null
null
null
# R Docker Example docker build -t rexample . docker run -d -p 8000:8000 rexample Jagath Jai Kumar
12.75
35
0.735294
eng_Latn
0.741834
fa1a43452c5313e1cde9c1c4b4f53c15acb2c374
2,809
md
Markdown
answers.md
sreejithsreeji/join-us
283da590aff044b8a9cd0aaa8a048abb380a72a8
[ "MIT" ]
null
null
null
answers.md
sreejithsreeji/join-us
283da590aff044b8a9cd0aaa8a048abb380a72a8
[ "MIT" ]
null
null
null
answers.md
sreejithsreeji/join-us
283da590aff044b8a9cd0aaa8a048abb380a72a8
[ "MIT" ]
null
null
null
// Code compiled and run in Node.js. // Most problems are solved without relying on JavaScript's built-in helper functions. // 1: print all digits of a number function getDigitsArray(number){ let digitArray=[]; while(number!=0){ digitArray.push(number%10); number=Math.floor(number/10); } return digitArray; } function reverseArray(array){ let temp=[]; for(let i=array.length-1;i>=0;i--){ temp.push(array[i]); } return temp; } console.log('digits in given number:'); console.log(reverseArray(getDigitsArray(123))); ------------------------------------------------------------------------------------------ // 2: remove duplicates from an array function removeDuplicates(array){ let temp=[]; for(let i=0;i<array.length;i++){ if(array[i]!=null){ temp.push(array[i]); for(let j=i+1;j<array.length;j++){ if(array[i]===array[j]){ array[j]=null; } } } } return temp; } console.log('After removing duplicates from given array:'); console.log(removeDuplicates([1,5,8,9,8])); ------------------------------------------------------------------------------------------- // 4: rotate array k times function shiftArray(array,k){ let j=0; while(j<k){ let i=0; let temp=array[0]; while(i<array.length-1){ array[i]=array[i+1]; i++; } array[array.length-1]=temp; j++; } return array; } console.log('rotate array by 2 positions:'); console.log(shiftArray([1,2,3,4,5,6],2)); ----------------------------------------------------------------------------------------- // 3: convert a sentence to Pig Latin function convertToPigLatin(sentence){ let newString=''; let start=0; for(let i=0;i<=sentence.length;i++){ if(sentence[i]===' ' || i===sentence.length){ let firstLetter=sentence[start]; while(start<i-1){ newString+=sentence[start+1]; if(start===0) newString=newString[0].toUpperCase(); start++; } newString+=firstLetter.toLowerCase()+'ay '; start=i+1; } } return newString; } console.log('corresponding Pig Latin sentence:'); console.log(convertToPigLatin('The quick brown fox'));
25.770642
94
0.429334
eng_Latn
0.488673
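The rotation answer in answers.md above shifts the array one position at a time, which is O(n·k). As a sketch (in Python rather than the answers' JavaScript), the same left rotation can be done in O(n) with the classic three-reversal trick:

```python
def rotate_left(arr, k):
    """Rotate arr left by k positions in place via three reversals:
    reverse the first k, reverse the rest, then reverse the whole array."""
    n = len(arr)
    if n == 0:
        return arr
    k %= n                          # handle k >= n
    arr[:k] = reversed(arr[:k])     # [1,2|3,4,5,6] -> [2,1|3,4,5,6]
    arr[k:] = reversed(arr[k:])     # -> [2,1|6,5,4,3]
    arr.reverse()                   # -> [3,4,5,6,1,2]
    return arr
```

For `[1,2,3,4,5,6]` with `k=2` this produces `[3,4,5,6,1,2]`, the same result as the answers' `shiftArray`, but each element moves only a constant number of times regardless of `k`.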